From messages that interrupt to moments that matter: redesigning in-app communication with AI

I designed an AI-guided in-app communication system, mapping journeys and creating UX rules that control when and how messages appear. The MVP improved conversion rates over older channels. The full solution was built with engineering, data science, product, and partner designers from a transversal engagement team.

The spark

I set out to prove that aligning UI behavior with AI decisioning would increase engagement and trust inside an app in a specific domain: less shouting, more context. My guiding principle: the right message, at the right time, in the right place, without blocking core tasks.

Where I sat

I worked within a central Relationship & Engagement group, a transversal team serving multiple product squads with omnichannel AI for communication and engagement. My remit was to connect AI decisioning with clear UX patterns so communication could feel helpful, not pushy.

Next Best Action (NBA)-style prioritization supported the work, but it wasn’t the focus. The focus was deciding when to be interruptive versus passive, and learning from user feedback to improve targeting over time.

My role

  • I led the UX strategy and research: desk research, a 41‑product benchmark (with 8 deep dives), and a hypothesis backlog tied to measurable outcomes.

  • I mapped end‑to‑end journeys across multiple products in a specific domain and systematized communication intentions (e.g., Basic & Available, Life Emergency, Future Thinking, Additional Product, Critical Pending Tasks, New Feature, Resuming an Incomplete Flow).

  • I defined experience rules (Interruptive vs. Passive): placement priority, overlays, progressive disclosure, closure, and feedback mechanics.

  • I shaped the MVP scope with engineering to validate architecture and UX with a single message type before scaling to “any message, any intention.”

Who participated (team effort for the full solution)

While I owned UX strategy and design, the complete solution came together with:

  • Engineering for architecture, instrumentation, and delivery;

  • Data Science for modeling and propensity signals powering the NBA engine;

  • Product for prioritization, alignment with OKRs, and rollout strategy;

  • Partner Designers embedded in product journeys to co‑create maps and resolve edge cases.

Method in action

  1. Research → Hypotheses: I translated insights like banner blindness and the need for transparent closure/feedback into six testable hypotheses.

  2. Journey‑first, not slot‑first: I avoided generic banner real estate and aligned patterns to the user’s moment and intention.

  3. Experience rules: I codified when to escalate from passive to interruptive (e.g., critical pending tasks, resuming an incomplete flow) and when to stay calm (exploration, generic audiences); see the sketch after this list.

  4. Feedback loops: I implemented closure and a “reason for closing” prompt so that user dismissals teach the model and increase transparency.
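
To make those rules concrete, here is a minimal TypeScript sketch of the passive-vs-interruptive decision. The intention names mirror the communication intentions mapped earlier; the types, fields, and the presentationMode function are hypothetical illustrations, not the production rules.

```typescript
// Hypothetical sketch of the passive-vs-interruptive decision.
// Intention names mirror the mapped communication intentions; the
// types, fields, and logic are illustrative, not the shipped rules.

type Intention =
  | "basic_available"
  | "life_emergency"
  | "future_thinking"
  | "additional_product"
  | "critical_pending_task"
  | "new_feature"
  | "resume_incomplete_flow";

type Mode = "passive" | "interruptive";

interface MessageContext {
  intention: Intention;
  userIsMidTask: boolean;     // never block a core task in progress
  audienceIsGeneric: boolean; // broad audiences stay calm
}

function presentationMode(ctx: MessageContext): Mode {
  // Guiding principle: never interrupt a core task.
  if (ctx.userIsMidTask) return "passive";

  // Generic audiences (e.g., broad exploration messages) stay calm.
  if (ctx.audienceIsGeneric) return "passive";

  // Escalate only for the urgent, user-specific intentions named
  // in the experience rules above.
  switch (ctx.intention) {
    case "critical_pending_task":
    case "resume_incomplete_flow":
      return "interruptive";
    default:
      return "passive";
  }
}
```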

Outcomes

Design

  • Interface: UI components for each kind of message (inline card, banner, modal, empty‑state nudge), with documented states, accessibility notes, and variants for passive and interruptive use.

  • Design System: guidelines for all product teams (naming, placement rules, tone and microcopy, do/don’t examples) plus a shared Figma library to speed adoption and keep patterns consistent across journeys.

  • Rules for the algorithm: a clear contract between UX and AI, combining a prioritization matrix by intention and urgency, eligibility and throttling rules, and a mapping of user feedback (close + reason) back into the system; a sketch of that contract follows this list.
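
As a rough illustration of that contract, here is a minimal TypeScript sketch. Every name, field, and reason value below is a hypothetical stand-in; the real contract was defined together with engineering and data science.

```typescript
// Hypothetical shape of the UX <-> AI contract described above.
// All names, fields, and reason values are illustrative stand-ins.

// One row of the prioritization matrix: intention x urgency.
interface PrioritizationEntry {
  intention: string;  // e.g., "critical_pending_task"
  urgency: 1 | 2 | 3; // 1 = highest placement priority
  allowedModes: ("passive" | "interruptive")[];
}

// Eligibility and throttling rules keep messages from shouting.
interface ThrottlingRules {
  maxInterruptivePerSession: number; // e.g., at most one modal per session
  cooldownHoursAfterClose: number;   // stay quiet after a dismissal
  suppressWhileMidTask: boolean;     // honors the "never block" principle
}

// Closure feedback flows back to the engine as a first-class signal.
interface CloseFeedbackEvent {
  messageId: string;
  intention: string;
  reason: "not_relevant" | "already_done" | "bad_timing" | "other";
  closedAt: string; // ISO-8601 timestamp
}
```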

Business

  • +10–30% improvement in click conversion, depending on the kind of communication (validated through controlled experiments).

  • +22% improvement in sales rate for products in the specific domain.

  • Higher consistency and delivery speed: teams ship new messages faster by assembling components and following shared rules instead of reinventing patterns.

  • Clearer measurement and governance: A/B templates and KPI definitions (click, conversion) make it easy to evaluate both interface and strategy.

Why it worked

  • I treated the interaction layer (UI, microcopy, controls) as part of the machine‑learning loop: user actions and feedback inform targeting, not just data signals. Making feedback on messages (close + reason) a first‑class element is still rare in the market and became a differentiator.

  • Operating from a central, transversal team kept me close to the product squads, making alignment part of the design process rather than an afterthought.

  • I delivered pragmatically: prove value with an MVP, harden the platform, then scale.