Customers didn’t experience a platform. They experienced disconnected tools.
Over the past decade, the company had grown through a combination of organic product development and strategic acquisitions. What started as a single conversational AI solution had evolved into a comprehensive enterprise suite: acquired products including CoreLayer, ConvoCapture, AgentAssist, CampaignHub, and AudienceIQ, alongside in-house built tools like DataStream, ModelForge, AnalyticsHub, InsightExplorer, BotStudio, ComplianceDesk, and SalesIQ.
However, this growth came at a cost. Each product was built by different teams, at different times, with different technology stacks and design philosophies. Acquired products maintained their original branding and user experience. The result was a fragmented ecosystem where customers had to navigate between completely separate applications to accomplish their goals.
I recognized this as both a significant business problem and a design opportunity. If we could unify these 12 products into a cohesive platform, we could dramatically improve customer experience, reduce support costs, increase cross-sell opportunities, and position the company as a true platform leader.
Hypothesis: If we could reorganize our products around user intent rather than product boundaries, we could create a unified experience that felt intuitive regardless of which features a customer used. Users don’t think in terms of ‘products’—they think in terms of tasks they want to accomplish.
Before proposing any solutions, I needed to fully understand the scope and impact of the fragmentation. I spent three weeks conducting discovery research, which included analyzing support tickets, interviewing customers, shadowing users, and auditing each product’s information architecture.
What I discovered was worse than expected. The fragmentation wasn’t just a UX inconvenience—it was actively preventing customers from getting value from our products. Many customers were only using 2–3 of our 12 products, not because they didn’t need the others, but because the effort required to learn and manage additional systems was too high.
Support ticket analysis revealed that 23% of all tickets were related to navigation confusion, permission issues across products, or questions about how features in different products related to each other.
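A ticket analysis like this can be sketched as a simple keyword tagger. The categories and keywords below are illustrative assumptions, not the actual taxonomy used in the study:

```python
from collections import Counter

# Hypothetical category keywords -- illustrative, not the real taxonomy.
CATEGORIES = {
    "navigation": ["can't find", "where is", "navigate", "menu"],
    "permissions": ["access denied", "permission", "role"],
    "cross-product": ["sync", "relates to", "other product"],
}

def categorize(ticket_text: str) -> str:
    """Tag a ticket with the first matching fragmentation category."""
    text = ticket_text.lower()
    for category, keywords in CATEGORIES.items():
        if any(kw in text for kw in keywords):
            return category
    return "other"

def fragmentation_share(tickets: list[str]) -> float:
    """Fraction of tickets tied to fragmentation-related categories."""
    counts = Counter(categorize(t) for t in tickets)
    fragmented = sum(v for k, v in counts.items() if k != "other")
    return fragmented / len(tickets) if tickets else 0.0
```

In practice a tagger like this only produces a first pass; the 23% figure would come from manually reviewing and refining the machine-tagged buckets.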
Given the complexity of unifying twelve products built over a decade—a mix of acquisitions and internal builds—I knew this project required a rigorous research foundation. I structured my research in three phases: Discovery (understanding the current state), Exploration (identifying possible solutions), and Validation (testing proposed structures).
The entire research phase took approximately 4 months, involving 35+ stakeholder interviews, 18 customer interviews, competitive analysis of 12 platforms, and validation testing with 24 users.
I began by mapping the internal landscape. I conducted 35 interviews with product managers, engineers, customer success managers, and sales teams across all twelve products—from acquired suites like AgentAssist, ConvoCapture, CampaignHub, and AudienceIQ to in-house tools like ModelForge, DataStream, AnalyticsHub, and SalesIQ.
Key Discovery: I created a comprehensive feature matrix that revealed 47 instances of duplicate or overlapping functionality across products. Multiple products had their own ‘dashboard builder’; AnalyticsHub, InsightExplorer, and AudienceIQ each maintained separate reporting modules; and analytics capabilities were scattered across AnalyticsHub, ConvoCapture, and AgentAssist with no shared terminology.
35 stakeholder interviews · 47 feature overlaps identified · Complete feature matrix created
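A feature matrix of this kind can be reduced to a small overlap-detection pass: inventory each product's features, then intersect every pair. The product and feature names below are hypothetical examples, not the actual inventory:

```python
from itertools import combinations

# Hypothetical feature inventory -- product and feature names are examples.
features_by_product = {
    "AnalyticsHub": {"dashboard builder", "scheduled reports", "funnel analysis"},
    "InsightExplorer": {"dashboard builder", "scheduled reports", "cohort explorer"},
    "AudienceIQ": {"dashboard builder", "segment builder"},
}

def find_overlaps(inventory: dict[str, set[str]]) -> dict[str, list[str]]:
    """Map each duplicated feature to every product that ships it."""
    overlaps: dict[str, list[str]] = {}
    for (p1, f1), (p2, f2) in combinations(inventory.items(), 2):
        for feature in f1 & f2:
            owners = overlaps.setdefault(feature, [])
            for product in (p1, p2):
                if product not in owners:
                    owners.append(product)
    return overlaps
```

Counting the entries in a mapping like this is what yields an "N instances of overlap" figure; here, 'dashboard builder' would surface as a three-way duplicate.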
I conducted deep-dive analysis of 12 enterprise platforms: Salesforce, HubSpot, ServiceNow, Zendesk, Adobe Experience Cloud, Microsoft Dynamics, SAP, Oracle, Workday, Atlassian, Pega, and Genesys.
Key Patterns: The most successful platforms organized navigation around user intent (what you want to do) rather than product boundaries (which tool you’re using). They used consistent patterns: primary navigation for functional areas, secondary navigation for modules, consistent settings placement.
12 platforms analyzed · 8 IA patterns documented · Best practices synthesized
I ran both open and closed card sorting exercises with 24 participants, followed by tree testing of the proposed structure.
Open Card Sort: Participants grouped 87 feature cards into categories. This revealed users naturally thought in terms of ‘viewing/analyzing,’ ‘building/creating,’ ‘managing/configuring,’ and ‘connecting/integrating’—closely aligned with my proposed four pillars.
Tree Testing: Initial testing showed 78% task success; after two iterations, this improved to 92%.
24 participants tested · 92% final task success · 3 iterations completed
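Open card sort results are commonly summarized with a co-occurrence matrix: for every pair of cards, how many participants grouped them together. A minimal sketch, using toy data rather than the study's actual 87 cards:

```python
from collections import Counter
from itertools import combinations

# Each participant's sort: group label -> cards. Toy data, not the study's.
sorts = [
    {"viewing": ["dashboards", "reports"], "building": ["bot flows"]},
    {"analyze": ["dashboards", "reports", "bot flows"]},
    {"insights": ["dashboards", "reports"], "create": ["bot flows"]},
]

def co_occurrence(participant_sorts: list[dict[str, list[str]]]) -> Counter:
    """Count how often each pair of cards lands in the same group."""
    pairs: Counter = Counter()
    for sort in participant_sorts:
        for group in sort.values():
            for a, b in combinations(sorted(group), 2):
                pairs[(a, b)] += 1
    return pairs
```

Pairs with high counts (here, 'dashboards' and 'reports' grouped by all three participants) suggest features that belong in the same pillar; clustering the full matrix is what surfaces category structures like "viewing/analyzing" versus "building/creating."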
Based on all research, I developed a framework organizing our product suite around four primary pillars:
Insights: “I want to understand what’s happening” — Analytics, dashboards, reports, measurement tools.
Applications: “I want to use tools to do my job” — Operational products for daily use.
Services: “I want to access platform capabilities” — Knowledge bases, AI models, data management.
Administration: “I want to configure and manage” — Settings, user management, integrations.
Intent-based organization · Scalable framework · Research-validated
With a validated framework, I needed to get buy-in from leadership and twelve product teams.
The Pitch Strategy: I led with the business problem (backed by data), demonstrated user pain (research quotes, journey maps), showed competitive pressure (benchmarking), and presented the solution as evolution—not replacement—of existing products.
I created detailed Figma mockups showing how each product would appear in the unified structure. The pitch was successful—leadership approved and product teams shifted from resistance to enthusiasm.
Executive approval secured · 12 product teams aligned · Implementation roadmap defined
The final architecture wasn’t arbitrary—every decision was grounded in research:
Why four pillars? Users naturally grouped features into 3–5 categories. Four provided enough separation while remaining few enough to be memorable.
Why organize by intent? Users think in tasks, not products. “See how my team is performing” not “open AnalyticsHub.”
Why maintain product identity? Complete dissolution would cause too much disruption. We preserved familiar modules within the new structure.
Why progressive disclosure? Exposing 200+ features at once would overwhelm users. A three-level hierarchy—Pillars → Modules → Features—kept the interface clean.
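The Pillars → Modules → Features hierarchy can be modeled as a simple nested structure, with the navigation rendering only one level at a time. The module and feature names below are illustrative assumptions; only the pillar name and intent come from the framework described above:

```python
from dataclasses import dataclass, field

@dataclass
class Module:
    name: str
    features: list[str] = field(default_factory=list)

@dataclass
class Pillar:
    name: str
    intent: str
    modules: list[Module] = field(default_factory=list)

# Illustrative slice of the hierarchy -- module/feature names are assumptions.
insights = Pillar(
    name="Insights",
    intent="I want to understand what's happening",
    modules=[
        Module("Dashboards", ["dashboard builder", "sharing"]),
        Module("Reports", ["scheduled reports", "exports"]),
    ],
)

def top_level(pillars: list[Pillar]) -> list[str]:
    """Progressive disclosure: the primary nav shows pillars only."""
    return [p.name for p in pillars]
```

Keeping the hierarchy as data rather than hard-coded navigation also makes it scalable: a newly acquired product slots in as a module under an existing pillar without restructuring the top level.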
With the information architecture validated and stakeholders aligned, I designed the final platform screens across all four pillars — translating the research-driven structure into a cohesive, production-ready interface.
What I’d Do Differently: Start user research earlier—should have run stakeholder and user interviews in parallel. Create a change management plan—users needed help transitioning mental models. Document trade-offs more explicitly for future team members inheriting the architecture.
This project demonstrated that the hardest design problems aren’t about pixels—they’re about people, systems, and strategy. By grounding every decision in research and bringing stakeholders along the journey, we turned twelve competing products into one coherent platform.