At a time when AI technology keeps expanding into new application scenarios, an AI diary app called Rosebud is making waves in mental health. Not only has it raised a $6 million seed round led by the top-tier venture firm Bessemer Venture Partners, it has also redefined the journaling experience with its unique “awareness engine”.
Have you ever considered that journaling may have changed for good? A traditional diary is a silent vessel: you pour your thoughts into it, and it accepts them without ever responding. But a new journaling experience is changing all that. It not only listens to you, it talks back, helping you discover the patterns of your life hidden between the lines.
Recently, an AI diary application called Rosebud received a $6 million seed round of financing, led by Bessemer Venture Partners, with participation from well-known investors such as Initialized Capital, 776, and Fuel Capital. This led me to think deeply about the question: what happens when AI agents are no longer just tools to complete tasks, but become guides for our inner worlds?
I found that what Rosebud is doing runs much deeper than it first appears. It is not simply using AI to analyze text or offer suggestions; it is redefining the way people talk to themselves. The significance of this shift may be more profound than we think. This is a very interesting niche. In my previous article, “Two ‘Special Pit’ Directions for AI Product Entrepreneurship,” I argued that the general-purpose AI assistant is a treacherous product direction, because current technology cannot yet deliver the across-the-board efficiency gains a comprehensive assistant promises. Rosebud, however, cleverly pivots from the outside to the inside, trading efficiency for introspection and self-awareness in service of personal growth. That pivot opens up a new direction and has, to some extent, achieved PMF, because supporting introspective reflection is far less technically demanding.
The AI agent’s inward turn
While everyone is talking about how AI agents can improve work efficiency and automate business processes, I realize we may be overlooking another huge potential of AI agents: turning inward. Most AI agents help us deal with the outside world, but their real power may lie in their ability to analyze information and spot patterns. When that capability turns inward, the AI agent becomes a powerful tool for self-reflection.
I find this turn particularly interesting. We live in an era of extreme extroversion, where all attention is pulled outward: social media, news feeds, notifications of every kind. But Rosebud founders Chrys Bader and Sean Dadashi realized five years ago that this attention economy was actually hurting us, keeping kids and adults alike glued to screens chasing dopamine. As technologists, they felt a responsibility to build something healthier and more nourishing.
This sense of responsibility brought them back to a simple but profound realization: most people lack high-quality guidance and companionship, not because they don’t need it, but because of cost, time, or social barriers. Traditional counseling can cost hundreds of dollars an hour, requires appointments, and is bound by geography. But what would change if an AI agent could be a personal mentor, available at any time, dedicated to helping you grow?
Rosebud’s user data gives the answer: since its launch in 2023, users have logged 500 million words and spent more than 30 million minutes reflecting. What’s more, 75% of users reported a significant improvement in their mental health after 30 days of consistent use. This isn’t a simple numbers game; it’s real life transformation. One user share struck me as particularly moving: “When I couldn’t trust my perception in a relationship, Rosebud’s memory of my past entries validated my feelings and helped me believe in myself again.” That is the voice of a domestic violence survivor.
I believe that this trend of AI agents turning inward represents an important direction in technological development. We are moving from “AI helps me do things” to “AI helps me understand myself”. At the heart of this shift is that AI is no longer just an external tool, but an internal partner.
What makes Rosebud unique
After diving deeper into Rosebud, I realized that its core innovation lies in a system known as the “Awareness Engine”. This system is based on Anthropic’s Claude 3.5 Sonnet large language model, but the key is not the underlying technology, but its unique long-term memory capabilities. I think this design is very clever because it solves a fundamental problem with traditional AI interaction: the lack of continuity and deep understanding.
Traditional journaling apps either provide static prompt questions or simple mood tracking. More often, once you finish writing, the app just passively stores your text and offers nothing in return. Rosebud completely flips this model. It remembers conversations from weeks or months ago, connects past experiences to current situations, and delivers insights in a way that reveals deeper patterns. Like a skilled coach or trusted mentor, Rosebud’s specialized memory system doesn’t just record your entries; it proactively surfaces your blind spots, identifies repetitive patterns in your behavior and relationships, and gently brings them to your attention when you’re ready to receive them.
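Rosebud hasn’t published the internals of its awareness engine, but the behavior described above maps onto a familiar pattern: retrieve relevant past entries, then hand them to the model as context. Below is a minimal sketch of that idea in Python, using the Anthropic SDK since the article names Claude 3.5 Sonnet; the entry store, the retrieval heuristic, and every name here are illustrative assumptions, not Rosebud’s actual implementation.

```python
import anthropic

# Hypothetical in-memory store of past journal entries (date, text).
# A real system would use a database plus embedding-based semantic search.
PAST_ENTRIES = [
    ("2024-03-02", "Argued with a colleague about project scope again."),
    ("2024-03-18", "Felt unheard in the team meeting today."),
    ("2024-04-05", "Another tense exchange with the same colleague."),
]

def retrieve_relevant(new_entry: str, k: int = 2) -> list[str]:
    """Naive keyword-overlap retrieval; stands in for semantic search."""
    new_words = set(new_entry.lower().split())
    scored = sorted(
        PAST_ENTRIES,
        key=lambda e: len(new_words & set(e[1].lower().split())),
        reverse=True,
    )
    return [f"{date}: {text}" for date, text in scored[:k]]

def reflect(new_entry: str) -> str:
    """Send today's entry plus retrieved history to the model."""
    context = "\n".join(retrieve_relevant(new_entry))
    client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the env
    msg = client.messages.create(
        model="claude-3-5-sonnet-20240620",
        max_tokens=400,
        system=(
            "You are a reflective journaling companion. Use the user's past "
            "entries to point out recurring patterns, gently and without judgment."
        ),
        messages=[{
            "role": "user",
            "content": f"Past entries:\n{context}\n\nToday's entry:\n{new_entry}",
        }],
    )
    return msg.content[0].text

print(reflect("Frustrated after another disagreement with my colleague."))
```

The interesting part is less the model call than what gets retrieved: long-term memory is as much a data-engineering problem (what to store, how to rank it) as a modeling one.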
I especially appreciate one detail of this system: it knows when to speak and when to stay silent. Timing is crucial, because psychological growth has its own rhythm, and intervening too early or too late can be counterproductive. Rosebud weighs your emotional state, current situation, and historical patterns, and chooses the most suitable moment to share its insights. For example, when you mention a similar conflict with a colleague for the third time, it may gently observe, “I noticed this is the third time this month you have mentioned communication issues with a colleague. Do you feel there is anything in common between these situations?” This approach avoids making people feel judged while still effectively prompting self-reflection.
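That “third mention” behavior suggests some notion of counting recurring themes and only speaking up past a readiness threshold. Purely as an illustration (the class, threshold, and names below are my invention, not Rosebud’s), such a heuristic might look like this:

```python
from collections import Counter

THEME_THRESHOLD = 3  # hypothetical: speak up on the third recurrence

class ThemeTracker:
    """Counts recurring themes across entries and decides when to surface one."""

    def __init__(self) -> None:
        self.counts: Counter[str] = Counter()
        self.surfaced: set[str] = set()

    def record(self, themes: list[str]) -> list[str]:
        """Record themes tagged in a new entry (e.g., by an LLM classifier)
        and return any insights whose moment has come."""
        insights = []
        for theme in themes:
            self.counts[theme] += 1
            if self.counts[theme] >= THEME_THRESHOLD and theme not in self.surfaced:
                self.surfaced.add(theme)  # say it once, not on every recurrence
                insights.append(
                    f"I've noticed '{theme}' has come up {self.counts[theme]} times "
                    "recently. Do you feel these situations have anything in common?"
                )
        return insights

tracker = ThemeTracker()
tracker.record(["colleague conflict"])
tracker.record(["colleague conflict", "poor sleep"])
print(tracker.record(["colleague conflict"]))  # third mention triggers the insight
```

A production system would presumably also gate on emotional state and context, as the paragraph above describes; a bare counter is only the skeleton of such a policy.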
What’s even more impressive is the system’s ability to make connections across time. It remembers not just facts but also emotions, relationship dynamics, and the trajectory of your personal growth. When you face a similar challenge months later, it can recall how you handled it before, what you learned from it, and which strategies worked best for you. This continuity is a core value of human mentors, and now AI can provide the same experience. I think this represents an important shift from “tool” to “partner”.
One thing I particularly appreciate is that Rosebud doesn’t try to be your friend or an object of emotional dependence. Founder Chrys Bader was explicit in an interview: “I’m actually against AI companionship, because human connection is very important.” Their philosophy is that Rosebud should help you build a relationship with yourself, not with the AI. This design philosophy is clever because it sidesteps a major pitfall of AI companion products: letting users escape from real human connection.
Instead, Rosebud’s goal is to help you gain the tools and resources to build healthy relationships in the real world. It helps you overcome challenges privately and gives you enough confidence to finally talk about them openly. This approach is more like a mirror that helps you reflect on yourself rather than being an object of your dependence.
From a technical implementation perspective, Rosebud integrates therapist-supported methodologies, drawing on techniques such as Cognitive Behavioral Therapy (CBT), Acceptance and Commitment Therapy (ACT), and Internal Family Systems (IFS). It has a library of journals created by mental health professionals that cover topics such as nervous system regulation and positive psychology. This means that users are not just getting AI responses, but professionally proven guidance methods.
Rosebud also takes privacy protection seriously. All diary data is encrypted, never shared with third parties, and never used to train AI models. They have zero data retention (ZDR) agreements with both OpenAI and Anthropic, ensuring that data is destroyed immediately after processing. For an application this personal, privacy protection is crucial.
Why now is the time for AI journaling to explode
I think the reason why AI-powered personal growth tools are exploding at this point in time is due to the confluence of several key factors. The first is the maturity of large language model technology. Today’s LLMs can not only understand complex emotional expressions but also engage in meaningful conversations, providing a technical foundation for AI journaling.
The second is surging demand for mental health support. During the pandemic, attention to mental health reached an unprecedented level. At the same time, the limitations of traditional mental health services became more pronounced: they are expensive, hard to access, and time-constrained, which keeps many people from getting timely help. AI-powered tools offer a viable way to make mental health support more accessible and affordable.
The third factor is a change in user behavior. Our generation has become accustomed to talking to AI, from Siri to ChatGPT, and acceptance of AI interaction has risen sharply. At the same time, people are starting to realize that excessive external stimulation can be harmful, and are beginning to seek a deeper connection with themselves.
From an investment perspective, the words of Maha Malik, an investor at Bessemer Venture Partners, are very telling: “Mental health support should not be limited by time, place, or privilege. Rosebud leads the way in combining AI with long-term memory, making it a trusted companion in your pocket – transforming self-reflection into growth and everyday thoughts into lasting transformations.”
Judging by Rosebud’s trajectory, this demand is real. Launched in July 2023, they started with a lean team and limited capital, using a scenario-driven development approach, and achieved profitability within 18 months, garnering over 100,000 user registrations. Now they have more than 8,000 paying customers and the monthly subscription fee is $12.99. This growth rate shows that there is a strong demand for such products in the market.
I’m particularly interested to see that many therapists and coaches are starting to recommend Rosebud to their clients as a support tool between sessions, which suggests that professionals also recognize the value of this AI-assisted approach. Even for those with weekly professional support, sustaining progress between sessions can be challenging. Rosebud is there to help when “life happens,” ensuring that the insights and breakthroughs of professional sessions are not lost between appointments but integrated into daily life and practice.
Founder’s deep thinking
In my research on Rosebud, I was particularly struck by a perspective from founder Chrys Bader that could redefine our relationship with AI. In the interview he pushed back against seeing AI as a pure task-delegation tool: “Delegating tasks is different from having thought partners. If you blur those boundaries, delegate critical thinking, and then accept AI output without question, for me that’s a risk for any business or anyone who operates on this information.”
This perspective made me rethink how we use AI day to day. Chrys shared a specific example: in the early days, their team would have the AI “create a marketing plan” and then use the generated content directly. But he found these AI-generated plans lacked the context a human writer would bring about the business and its goals, and read as generic. “If people operate purely on this output, it will dilute the quality of work over time, and if everyone does this, it will lead to what I call ‘great dilution’, where smart people sink to the level of the AI.”
I think this concept of “great dilution” is particularly profound. It points to a risk that we may not be fully aware of: when we rely too much on AI for decision-making, we may lose our ability to judge and learn. Because humans learn through trial and error, if we no longer experience the process of making decisions, we cannot update our “internal software”. It’s like outsourcing our cognitive abilities to an external system.
Instead, Chrys advocates using AI as a thought partner. “It can give you a different perspective and help you think about things you may not have thought of, but it still requires you to think, make decisions, and take responsibility for them.” He now uses Claude as a design partner: he uploads a lot of data, says “I’m thinking about how to design this,” and the AI renders the page as an artifact; then he gives feedback. The point is not to get a finished, word-for-word design, but to generate ideas at the information-architecture level.
The wisdom of this approach is that it preserves human agency and the learning process while leveraging AI’s generative capabilities. As Chrys puts it, “Creative people are not purely generative on their own; what they are really good at is gathering more perspectives, then choosing the best aspects of them and putting them together.” Now that his team members work this way, engineers who are not especially strong at design can, with Claude as a thought partner, produce design ideas an order of magnitude better than before.
I think this redefinition of AI roles may be one of the most important cognitive shifts of our time. It’s not about what AI can do, it’s about how we should collaborate with AI to truly benefit, not be weakened by it. This mindset is at the heart of Rosebud’s success – it is not intended to replace human self-reflection, but to enhance the process.
Bigger thinking about mental health technology
The success of Rosebud has made me think about a bigger question: where are the boundaries of AI in mental health? New research suggests AI can produce genuine therapeutic effects, and therapy and companionship are now the number one use case for AI. But this also raises ethical and operational issues.
Critics point to the risk of over-reliance, as well as the challenge of responding to users in crisis. Rosebud does not currently have a clinical officer, but it does work with therapists for feedback and is improving how situations involving suicidal ideation are handled. The founders acknowledge that “we recognize the ethical boundaries and limitations in addressing serious mental health conditions,” noting that Rosebud encourages professional support when necessary.
I think this humility matters. Rosebud clearly positions itself as a journaling app, not a substitute for therapy. Its value lies in complementing, not replacing, professional help. Many users find the platform works seamlessly alongside traditional therapy and coaching relationships; indeed, therapists and coaches increasingly recommend Rosebud to clients as a support tool between sessions.
Looking more broadly, I feel that tools like Rosebud represent an attempt to democratize mental health. Not everyone can afford or access high-quality mental health services, but almost everyone has a smartphone. If AI can provide even a baseline of support for self-awareness and growth, the benefit to society as a whole could be substantial.
As Rosebud co-founder Sean Dadashi puts it: “What if everyone had something there for them? Something that helps them become the best version of themselves? I think that’s the exciting thing about what AI enables. I have benefited greatly from mentorship at various times in my life, and I have suffered when I did not have it. I think what we’re seeing is a future where this can be offered to everyone in a way that was not possible before. You can have something that really thinks about you, really tries to understand you, and is aligned with who you want to be.”
Humanistic care in technical details
From a technical implementation perspective, I am particularly interested in Rosebud’s “awareness engine”. The core of this system is not just memory, but intelligent pattern recognition and well-timed insight delivery. It offers feedback when the user is ready to receive it, which requires a deep understanding of human psychology.
Chrys mentioned in the interview that the next-generation features they are working on will make Rosebud not just a journaling app, but a personal growth platform tailored to you. “Rosebud will be a place where you can not only journal every day, but there will also be customized programs built for you to help you learn not only about yourself, but also about all the models and philosophies in the world and how they apply to your life.”
What’s even more exciting is their long-term vision of a dynamically rendered app, in which the AI builds the interface itself, giving you the right tools at the right time. This degree of personalization is unprecedented and represents the immense potential of AI in the field of personal growth.
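Rosebud hasn’t described how an AI-rendered interface would be implemented, but a common way to realize this pattern is to have the model emit a declarative UI spec that the client then renders with native components. Here is a toy sketch of that idea, with an invented schema that is in no way Rosebud’s actual format:

```python
import json

# A spec an LLM might emit for today's check-in; the schema and all fields
# are hypothetical, invented purely for illustration.
ui_spec_json = """
{
  "screen": "evening_checkin",
  "components": [
    {"type": "prompt", "text": "You mentioned a stressful meeting. How do you feel now?"},
    {"type": "slider", "label": "Energy level", "min": 1, "max": 10},
    {"type": "exercise", "name": "box_breathing", "duration_min": 3}
  ]
}
"""

def render(spec: dict) -> None:
    """Toy text renderer; a real client would map components to native widgets."""
    print(f"== {spec['screen']} ==")
    for c in spec["components"]:
        if c["type"] == "prompt":
            print(f"> {c['text']}")
        elif c["type"] == "slider":
            print(f"[{c['label']}: {c['min']}..{c['max']}]")
        elif c["type"] == "exercise":
            print(f"({c['duration_min']} min exercise: {c['name']})")

render(json.loads(ui_spec_json))
```

The design choice that matters in this pattern is the declarative layer: the model proposes, but the client controls what can actually be rendered, which bounds how far the AI can reshape the interface.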
I feel that achieving this vision requires addressing several key challenges. The first is ensuring that the AI’s recommendations truly serve the user’s best interests rather than being skewed by algorithmic bias. The second is providing personalized guidance while preserving user autonomy and critical thinking. The third is handling users from different cultural backgrounds and value systems, ensuring the AI’s suggestions remain appropriate and helpful.
But I believe that with the advancement of technology and the accumulation of more practical application data, these challenges can be solved gradually. The key is to maintain a focus on human well-being rather than just pursuing technological advancement.
My outlook for the future
Looking at Rosebud’s trajectory and the trends it represents, I am excited about the future of AI in the field of personal growth. I think we are at a turning point where AI is starting to transform from an external tool to an internal partner. The implications of this shift may be more profound than it seems.
On a personal level, AI-powered self-reflection tools may help us better understand ourselves and discover behavior patterns and thinking habits that we may never be aware of. This increased self-awareness ultimately translates into healthier relationships, smarter life decisions, and higher life satisfaction.
At the societal level, if a large number of people have access to this support for personal growth, we may see a more conscious, compassionate, and collaborative world. As Rosebud’s vision says, “When millions of people have access to this level of personal growth support, we believe it will create a more intentional, compassionate, and collaborative world—one person at a time.”
Of course, realizing this future requires careful handling of some key issues. We need to ensure that AI enhances human connection rather than replacing it. We need to protect user privacy, ensuring that even the most private thoughts and feelings are never abused. And we need to keep AI’s suggestions from being taken as absolute truth, preserving human critical thinking and independent judgment.
I’m also thinking about the impact of this technology on the traditional mental health industry. While Rosebud has made it clear that it is not intended to replace therapists, as AI tools become more intelligent and effective, traditional mental health services may need to redefine their values and roles. I think the future model may be AI providing basic daily support, while human professionals focus on more complex and deeper problems.
From an investment and business perspective, Rosebud’s success is a testament to the huge potential of the mental health technology market. As society continues to pay more attention to mental health and AI technology continues to improve, we may see more innovative mental health AI products emerge. This market is still in its early stages and has a lot of room for development.
Ultimately, I believe that products like Rosebud represent an important direction in the evolution of technology: not just to make us do things faster and more efficiently, but to help us become better people. In an increasingly fast-paced, increasingly fragmented world, having an inner guide at our fingertips to help us pause and reflect, grow, and connect may be one of the technological innovations we need most.
When AI learns to listen to our hearts, we are also learning to listen to ourselves better. That may be one of the best things about the age of AI.