Hope as a KPI
How we incorporated the experience and needs of 20 stakeholders to build hope.
I walked into a conference room at the University of Oregon in Eugene feeling excited. Twenty years of expertise. Multiple organizations across Oregon and northern California. Everyone wanted to make a difference for kids.
People had ideas. Good ones. But the ideas weren't connecting. CSAW at the University of Oregon had deep research on what works—rigorous studies, evidence-based frameworks, years of academic credibility. Protect Our Children had practitioner knowledge from years in the field, the kind you only get from doing the work. Other groups brought their own hard-won lessons from communities across the region.
The problem wasn't expertise—it was that no one had figured out how to weave it together into something buildable. Everyone was pulling in roughly the same direction, but there was no shared artifact, no common language for what "next" looked like.
They needed a way to get unstuck.
The question that stops people
Here's what keeps mission-driven organizations stuck: How do you move from "we know a lot" to "here's what we're building"?
The constraints were familiar. Limited funding. People with different perspectives. Pressure to show results. They couldn't bet everything on one big initiative that might miss. But they couldn't keep operating in fragments either.
So what do you do?
You create a container for the conversation that hasn't happened yet.
What we actually did
We ran a multi-day design sprint—practitioners, researchers, and organizational leaders from across Oregon and northern California in the same room. Not a conference. Not a strategic planning session. A sprint: structured exercises designed to surface assumptions, generate possibilities, and force decisions.
The goal wasn't to design a product. It was to surface what actually mattered.
What emerged surprised me.
The group moved away from a one-size-fits-all solution. Different communities face different versions of the same problem. Rural Oregon doesn't look like Portland. Northern California has its own context. Any platform needed to flex accordingly. That part made sense.
"The sprint gave us something our usual meetings couldn't—a structure that made it safe to disagree. We'd been circling the same questions for years, but this process forced us to actually make choices. By the end of day two, we weren't just nodding along anymore. We were building something together."
— Dr. Jeff Todahl, Director, CSAW at the University of Oregon
But then someone named the real metric. Not engagement. Not downloads. Not even behavior change, at least not directly.
Hope.
The room got quiet for a moment. After all the mapping and sketching and debating, the through-line was this: could we make two decades of hard-won expertise available in a way that made practitioners feel like progress was possible? Could we help people in the prevention field believe that their work was moving the needle—even when the wins are hard to see?
That's not a metric you find in a requirements document.
We put the concepts in front of real users. Practitioners. People who would actually be living with whatever we built. Some ideas clicked immediately—the resource library, the AI-powered guidance, the community connections. Others needed rethinking. A few we let go entirely.
"Talking directly with practitioners changed everything. We thought we knew what they needed, but hearing them react to our concepts in real time—watching what excited them and what fell flat—that's when the project got real. One conversation in particular made us completely rethink the navigation. We wouldn't have caught that from a survey."
— Simone Piper, Senior Research Assistant, CSAW
What remained was stronger for the refining.
Then we prototyped. Using AI as a development partner, we moved from validated concept to working prototype in weeks—not months. This wasn't a clickable mockup. It was a functional demonstration people could explore, test, and react to. Real search. Real content. Real interactions.
The speed mattered. It meant we could iterate on real feedback without burning through the budget. When a practitioner told us the navigation was confusing, we could fix it the same week. When someone suggested a feature we hadn't considered, we could prototype it and test it before the next call.
The result: "The Guide"—a prototype for a living collection of tools, resources, AI-powered assistance, and connection points designed to help organizations build a culture of prevention in their communities.
What emerged
The prototype was just one outcome.
The alignment and clarity created through the sprint and validation process enabled the coalition to develop something they'd been reaching toward for years: a comprehensive 75-page Promotive Plan with 10 actionable factors for preventing child sexual abuse.
The Promotive Plan wasn't something we created for them—it emerged from the foundation the design work provided. The sprint forced decisions. The prototype proved concepts were viable. The validation work surfaced what practitioners actually needed. Together, these created the conditions for the coalition to finally articulate their full prevention strategy.
Now they had two things that didn't exist before: a proven concept (The Guide prototype) and a strategic framework (the Promotive Plan). More importantly, they had a clear relationship between them—The Guide would be the implementation platform for the Promotive Plan's 10 factors.
So what changed?
The most significant outcome isn't a feature list or even a document.
It's a fundable roadmap and organizational alignment that didn't exist before. Instead of seeking one large grant for one uncertain thing, the coalition can now pursue phased funding. They have a working prototype that demonstrates viability. They have a strategic plan that shows where they're headed. They have validation from real practitioners that the approach resonates.
Each funding phase delivers standalone value while advancing the larger vision. They're no longer hostage to a single funder's timeline.
The sprint and prototype work also sparked new conversations—with potential funders who now have something concrete to react to, with peer organizations who see possibilities for their own communities, with practitioners who now see a path where they previously saw obstacles.
Imaginations activated.
That's the shift.
Why it worked
Three factors turned ambiguity into momentum:
Process before product. The design sprint didn't assume we knew what to build. It created space for disagreement, synthesis, and unexpected insight. "Hope as a KPI" wouldn't have emerged from a requirements document. It emerged because we gave smart people a structure for discovering what they actually cared about.
Validation before commitment. Testing concepts with real users before heavy development meant we could set aside weak ideas early and double down on what resonated. We weren't guessing what practitioners needed—we were watching them react in real time.
AI-accelerated prototyping. Functional prototypes in weeks changed the economics of exploration. More ideas tested. More iterations completed. More learning per dollar spent. The old model was: plan exhaustively, build slowly, hope you got it right. The new model is: prototype fast, learn fast, adjust fast.
What's next
The Guide moves into full development soon. The prototype proved the concept. The Promotive Plan provides the strategic framework. The validation work confirmed practitioner need. Now it's about building the real thing—the platform that turns two decades of prevention expertise into something practitioners can actually use.
INSPIRED?
Let's Chat About How We Can Help
Got questions? Good. That's how every meaningful project starts. You don't need a detailed plan or a technical background to reach out—just curiosity and a willingness to explore. We'll meet you where you are, answer what we can, and be honest about what we don't know yet. The best partnerships begin with a simple hello. We're looking forward to yours.