I set out to design a Personal Finance Manager that felt less like a spreadsheet and more like a coach: fast to start, clear about what’s happening with your money, and flexible enough to fit real life. What follows is a short read on the journey: what we learned, what we built, and why it still mattered even when priorities shifted. This is a shortened version of the full PDF, so feel free to dive into the full case study below.

View Full Case Study


Overview & Problems

In fintech, seeing balances isn’t enough; people want confidence and control. Early research surfaced three consistent pain points:


  • Budgeting was painful: setup took too long and many quit.

  • Transactions lacked clarity: confusing labels eroded trust.

  • Savings felt rigid: fixed rules (e.g., round-up to $1) didn’t fit everyone.


Our goal: re-imagine budgeting, transactions, and savings so they’re simple, understandable, and adaptable.
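To make the savings pain point concrete: a fixed round-up rule always rounds a purchase to the next $1, while a customizable rule lets the user pick the increment. The sketch below is a hypothetical illustration of that idea, not the product’s actual implementation; the function name and increments are assumptions.

```python
from decimal import Decimal, ROUND_UP

def round_up_savings(amount: Decimal, increment: Decimal) -> Decimal:
    """Hypothetical sketch: spare change saved when a purchase `amount`
    is rounded up to the next multiple of a user-chosen `increment`."""
    if increment <= 0:
        raise ValueError("increment must be positive")
    # Round the amount up to the next multiple of the increment,
    # then the difference is what gets swept into savings.
    rounded = (amount / increment).to_integral_value(rounding=ROUND_UP) * increment
    return rounded - amount

# A $4.30 coffee with the fixed $1 rule saves $0.70,
# but a user who picks a $2 increment saves $1.70 per purchase.
print(round_up_savings(Decimal("4.30"), Decimal("1")))  # 0.70
print(round_up_savings(Decimal("4.30"), Decimal("2")))  # 1.70
```

Letting users choose the increment is exactly the “automation with choice” pattern that tested well: the rule still runs automatically, but the pace of saving stays in the user’s hands.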

My Role

As HCD Lead & Senior Digital Strategist, I drove the vision end-to-end, aligning business goals with customer needs, shaping strategy and journeys, guiding prototyping and testing, and partnering across engineering, research, compliance, and leadership to keep feasibility and experience in balance.

HCD Approach and Human Research

Discovery. We grounded the work in known insights, then dug into interviews, market scans, and competitor reviews. Four truths kept repeating: budgeting is hard, saving needs automation with choice, transactions must be clear, and people want actionable insights, not just numbers.

Prototyping & testing. Through iterative clickable prototypes (small cohorts of 8–12 participants), we saw:

  • Faster budget setup with a one-tap starter.

  • Higher savings adoption when round-ups were customizable.

  • More trust when labels were clearer and controls were explicit.

Competitive analysis. Many tools were clunky and one-size-fits-all. We aimed for personal, intuitive, and adaptable.

Success Measures & Results

We conducted multiple rounds of interactive prototype testing with small groups of 8–12 participants per session. While the product didn’t make it to production, these tests produced measurable results and valuable insights, showing that the experience met its goals and revealing patterns that can be reused with confidence.

  • Setup speed: ~2 minutes with the one-tap starter vs. 6–7 minutes before, well beyond the 40%+ reduction target.

  • Savings adoption: Customizable round-ups outperformed fixed rules; given the option, users chose control.

  • Clarity & ease: 90% of participants rated the experience “clear and easy,” well above the 70% benchmark.

  • Trust & control: Users actively used dismiss options and other controls; small choices built trust.

  • Confidence: Qualitative feedback showed users felt more in control of their money.

Even without production deployment, rigorous user testing surfaced what works and why, leaving patterns and learnings the team can build on.

Lessons Learned

We found that speed and control don’t have to be opposites. A one-tap budget gave users a quick start, while the ability to fine-tune later kept them engaged. Trust also proved to be something you design for: plain language, merchant logos, and small touches like “Why we ask” built confidence and improved adoption.

We also saw that people manage money differently: some want automation, others want control. By supporting both, satisfaction rose across the board. Finally, treating compliance as a partner instead of a checkpoint changed everything. Co-creating patterns early reduced friction and kept designs both user-friendly and compliant.

These lessons reinforced that small, thoughtful details often make the biggest difference in financial experiences.

Reflections

Although the product never made it to launch, the work was far from wasted. We uncovered where money management breaks down for real people and created solutions that actually worked in testing: one-tap budgets, customizable round-ups, and clearer labels. We also built a research and KPI playbook the organization could carry forward.


The biggest takeaway? Even when something doesn’t ship, the process leaves behind insights, tools, and patterns that shape the next opportunity. Good design and research don’t just deliver features; they build momentum for better solutions down the road.
