
A colleague introduced Lisa and me because we were both, in different ways, working to support women. On Zoom, we talked about our shared interest. Hers came from years of work in finance and from seeing the broken rung that holds women back. Mine came from building The Meraki Dignity Project and listening to women who are navigating health, money, work, and family all at once.
We finally found ourselves in the same place at the same time and carved out a rare weekday brunch in New York. It felt indulgent in the best way. Two busy women, both balancing work, family, and our own quiet questions about what comes next, sitting down without an agenda.
Within twenty minutes, the conversation shifted. We discovered that we both have family members with Alzheimer’s. We talked about how confusing and fragmented the information is, how much fear sits underneath the data, and how hard it is to know what any of it really means for your own life.
At one point I asked Lisa something deeply personal. Would you ask your doctor to test you for the genetic markers? It wasn’t really a medical question. It was an ethical one. What would you do with that knowledge? How would it shape your choices? Would it give you clarity or more anxiety?
We shared what we knew. What our doctors had said. What we had read. What we worried about. What we would or would not want to know. In that small, trusted space, we were doing something powerful. We weren’t just exchanging information. We were making sense of it together.
In a way, we had become our own small language model.
That might sound abstract, but it’s actually very simple. Two people. A bounded conversation. Trusted sources. Shared values. Lived experience. Judgment. Context. Everything that turns content into knowledge.
Most of today’s AI is very good at producing content. It can summarize studies, list risks, and generate answers at scale. But content alone doesn’t help you decide whether to take a genetic test, change a career, or invest in your health. Those decisions live at the intersection of facts, fear, responsibility, and what you value most.
Knowledge is different. Knowledge is curated. It has boundaries. It reflects what applies to you. It carries the weight of real lives and real tradeoffs.
This is why the way we use AI matters.
When AI is broad and unbounded, it flattens context. When it is purpose-built, constrained, and guided by human judgment, it can do something far more useful. It can help people connect information across the parts of their lives that actually interact. It can support reflection rather than replace it. It can feel less like a machine delivering answers and more like a trusted space where thinking is allowed to unfold.
That is the heart of The Meraki Dignity Project. Not faster answers. Clearer orientation. Not more content. Knowledge that helps women see their lives as they really are, and make choices that feel intentional rather than reactive.
What Lisa and I did over brunch is what we are trying to make possible at scale: grounded, human, context-rich ways of making sense of what matters most.
That is where dignity, design, and technology meet.
by Stephanie Georges
Founder and CEO
The Meraki Dignity Project