Spectrum Enterprise Web Portal
After the huge 2016 telecom merger between Time Warner Cable, Bright House Networks, and Charter Communications, Charter (now a Fortune 70 company) rolled all of its residential, small business, and enterprise accounts under the Spectrum brand. In 2019, I was brought on board to continue advancing UX design efforts for the Spectrum Enterprise web portal. Within two months, I knew we needed to start from scratch.
1. No design system
From popups to modals to partial-screen takeovers to slide-outs, it seemed every page had its own personality. As UI elements and logic models transformed over time, they were updated only for new features and never refactored site-wide. Even basic design foundations like card styles, typography, and color standards fell prey to the feature-factory mindset.
2. Not mobile friendly
Instead of conducting user testing to determine the needs of mobile users, or better yet designing everything to be mobile-compatible from the outset, business decisions were made to simply hide content and completely remove features so the web portal could 'fit' on a mobile screen.
3. Years of tech-debt
What started as an 'alpha' proof of concept was never rethought or graduated into a mature, stable code base. Years of development without product design guidance created conflicting dev stacks, zero consistency across components, a nightmare for the QA team, excruciatingly slow releases, and inevitably, painful performance lag for our users.
4. Unscalable for future roadmaps
Not only was the code fragmented and unscalable, but the overall design architecture inhibited our ability to add new services, expand features, and cater to even basic enterprise needs like defined user roles. People couldn't understand why our 1995 Camry wasn't performing like a 2020 Tesla.
5. No user feedback loop
Analytics were bolted onto pages after feature releases, which made it extremely difficult to track performance and user behaviors. In addition, stakeholders hadn't embraced the tactic of fortifying their roadmaps with user analytics; instead, they relied solely on institutional knowledge to push initiatives.
Back to the basics
Breaking down the current portal
I started by meticulously diagramming the current portal. Though I did my best to simplify the flow chart, it was clear that content was buried in multi-layered dead-ends and convoluted interaction patterns. I also worked with the backend teams to diagram how information got from our databases to the screen (spoiler: it's complicated).
After dissecting our portal structure and researching how other web apps handled complex data relationships and hierarchical services, it was time to hit the drawing board. The new strategy would rely on three components: the sidebar to locate information, the context bar to organize content, and the quick-actions menu to take action.
Defining our user testing strategy
During an ideation session, my team and I narrowed our user roles into five categories and identified the primary tasks that each of these roles accomplishes in the portal before writing a test script for each feature.
A/B testing the prototype
With the baseline concept fleshed out and our test script ready to go, I created an Axure prototype and worked with my research counterpart to launch a user test (via usertesting.com) with 24 participants: 12 existing customers familiar with the current portal and 12 IT professionals who were not Spectrum customers. Within each group of 12, half saw the current version first and the other half saw the new version first.
Over 2× faster at completing tasks (1m 19s vs 35s)
33% fewer clicks per task (51 clicks vs 34 clicks)
4 out of 5 said it was easier to use (20 out of 24)
4 out of 5 said it best met their needs (21 out of 24)
4 out of 5 preferred it overall (20 out of 24)
"The old navigation left me clicking trying to hunt and find information which took time and was frustrating."
Auditing existing components
One of the many challenges of working with large, established teams is pushing new initiatives without disregarding years of existing work. I created an Airtable database so the design and dev teams could analyze components, libraries, and effort levels to determine what code (if any) could be salvaged.
Though we couldn't identify a clear winner, we were able to salvage portions of the code, and the exercise helped us understand which framework to invest our efforts in and which frameworks to avoid.
With our initial concept validated and our code library established, it was time to tighten down the site architecture. I enlisted the help of our analytics team to determine common devices and viewport widths. To no one's surprise, desktop sizes were predominant among our demographic (network admins); however, we knew this could have been a byproduct of our lackluster mobile experience.
Understanding our viewports helped us establish sizing and breakpoint constraints for the header, collapsible sidebar, custom margins, and dynamic 12-column content area. In addition to detailed documentation, I coded an example on CodePen to better articulate the fluidity of the new container. I also created scaled examples of how our content would adapt to each device, paying careful attention to how content would adapt for smaller screens.
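The shell described above (fixed header, collapsible sidebar, fluid 12-column content area) can be sketched in a few lines of CSS. This is a minimal illustration in the spirit of the CodePen example, not the production code; the breakpoint and sizing values are assumptions, not Spectrum's actual tokens.

```css
/* Portal shell sketch: header row + sidebar column + fluid content.
   All pixel values below are illustrative assumptions. */
:root {
  --sidebar-width: 240px;
  --content-margin: 24px;
}

.portal {
  display: grid;
  grid-template-columns: var(--sidebar-width) 1fr;
  grid-template-rows: 64px 1fr; /* header row, then content */
  min-height: 100vh;
}

.content {
  display: grid;
  grid-template-columns: repeat(12, 1fr); /* dynamic 12-column area */
  gap: var(--content-margin);
  padding: var(--content-margin);
}

/* Below tablet widths, collapse the sidebar to an icon rail */
@media (max-width: 1024px) {
  :root { --sidebar-width: 56px; }
}

/* On phones, move the sidebar behind a menu toggle and tighten margins */
@media (max-width: 600px) {
  .portal { grid-template-columns: 1fr; }
  :root { --content-margin: 12px; }
}
```

Because the column tracks are fractional (`1fr`) rather than fixed, the content area stays fluid between breakpoints, which is the behavior the scaled device examples were meant to demonstrate.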
User testing for task completion
Equipped with the knowledge of our dev libraries, breakpoint strategy, and numerous conversations with our product team, it was time to understand how each service would fit within our new architecture. Unlike the first round of wires that focused on rapidly proving a concept, the second round embodied a nearly complete portal with numerous interaction points to test task completion on both desktop and mobile platforms.
What we learned after 6 wire revisions
The initial prototype we tested focused on a recurring table structure, since the majority of our portal consisted of tabular content. Though users agreed that the information was well organized and tasks were easy to complete, they disliked how repetitive each page was and had a hard time distinguishing differences between products and features. It was a great wake-up call that even though content is dense, its presentation doesn't have to be, so we went back to the drawing board to reassess how our content could be more engaging. After the sixth round of wires and a much simpler user flow, we were ready to polish the design.
With the wires tested and our strategy roadmap defined, it was time to execute our plan. The project's massive undertaking attracted high visibility across multiple teams, which meant it couldn't simply be buried in (yet another) Jira/Trello board. Instead, I created an analog project chart to surface bi-weekly deliverables, design phases, blockers, and overall project status.
Color, elevation, typography, and micro-interactions
Using Charter's existing design standards, I worked with the UI designers to develop an enterprise-specific standard complete with color codes, card elevations, border radii, typeface rules, and even micro-interaction animations.
New architecture meant new interaction patterns, new menus, and of course, new icons. With something as complex as enterprise telecom software, this exercise needed to be more than a simple design refresh. For this reason, we started with a baseline, uniform icon pack that compared candidates without any extra frills. In other words, we wanted to compare trucks vs helicopters, not F-150s vs F-250s.
With the results analyzed, it was time to increase the fidelity (I went with Google's 24pt Material icon grid) and conduct a second round of testing, this time with only two icons per category instead of four. In this test, we scrambled the icons and asked users to locate content within variations of our prototype.
Based on task completion time, click counts, and subjective feedback from our users, we identified the winners and finalized the icon pack.
Accessibility and content
With nearly everything in place, it was time for final accessibility and content reviews. I worked with our accessibility architect and accessibility tester to ensure our interaction patterns, color contrasts, button sizing, and typography rules met WCAG AA compliance before reworking titles, tooltips, descriptions, and other content to uphold Charter's tone and content guide standards.
The XD Prototype
Because the new portal design relied so heavily on interaction patterns, I needed a more robust tool for communicating fluid movements to our stakeholders, developers, and user testers, so I researched tools and selected XD for my final canvas. Though XD doesn't handle conditional logic as well as other tools (like Axure and Proto.io), it did a fantastic job of portraying the smooth, effortless design of our new portal. Though I can only show fragments of the finished work here, contact me to view the fully interactive prototype.
Product Design Lead James Lacey
UX Design James Lacey, Dan Cayo, Courtney Blank, Jessica Simon
UI Design James Lacey, Tara MacDaniels
UX Research James Lacey, Brian Azcuenaga
UX Copywriting Otis Garrison
Accessibility Architect Drew Cartwright
Accessibility Tester Fred Haner