Verizon Plan Recommender

The Plan Recommender is a tool that empowers customers to make informed plan changes based on their past usage behavior. Customers are alerted when there is another plan that would better match their needs and are reassured if they are already on the right plan. This tool is essential because it boosts plan satisfaction and loyalty and helps to build trust.

As the Lead Product Designer, I was responsible for creating and documenting the product strategy, conducting ideation workshops, planning and synthesizing research, collaborating on the roadmap, and creating all iterations of wireframes, prototypes, and high fidelity comps. I also worked closely with our engineers to ensure functional and visual quality assurance.

Company
Verizon

Year
2020

Team 
Lead PD (me!), PM, ENG (FE, BE, ML), Research

THE PROBLEM

Limited functionality to solve a complex problem.

Quick and easy discovery

Verizon's phone plans are complex, driving customers to rely on customer service to help them decide which plan to change to. The existing digital recommendation experience was limited and missed clear opportunities to assist customers in this process and drive revenue for Verizon. Our task was to discover exactly what those opportunities were to redefine the recommendation experience.

The existing Plan Recommender lacked crucial information and a clear CTA.

We gathered analytics and confirmed that the existing Plan Recommender was performing poorly. Success would mean customers using our recommendation tool, taking our recommendations, and completing plan changes in digital rather than assisted channels. We worked closely with our finance and business partners to set specific metrics to strive for:

  1. Increase selection of recommended plans by 8%
  2. Increase digital plan changes by 10%
  3. Increase step-up migration by 5%
  4. Increase NPS

THE DISCOVERY

Customers need to feel confident.

Insights and Opportunities

Before going into solution mode, we had to thoroughly understand the problem at hand. First, we conducted generative research in the form of one-on-one interviews with eight Verizon customers who had recently changed plans. Our goals were to understand how customers went about making their plan changes and their attitudes toward plan recommendations. We also wanted to know about their previous experiences with recommendations outside of Verizon and what they believed makes for a trustworthy recommendation.

What we found was that customers have a lot of anxiety about switching their plan because they consider it to be a big purchase decision and don't feel confident that they can pick the right one. One customer even compared buying a phone plan to buying a car because of the expense that's accrued over time. This is what leads them to rely on human interaction and call customer service. We also learned that customers want to feel like we have their best interest at heart and that we aren't just looking to squeeze more money out of them with upsells. And to make confident plan decisions, customers need a full understanding of the difference between plans, particularly the price and features.

After we heard from our customers, we needed to gain insight from the plan recommendation experts: Verizon customer service reps. Our goal was to get a sense of how they calculate and communicate a plan recommendation. Reps can access the same system-generated plan recommendations that surface for customers, but they consider them highly inaccurate for a given customer's profile, which partially explains why the tool was performing so poorly. Instead, reps do their own calculations to determine the best match based on factors like data usage, mobile hotspot usage, and network performance.

Customer learnings

  1. Users are cautious about switching their plans for fear of making a wrong decision, pushing them to call customer service.
  2. Trust erodes when recommendations are perceived as pushy upsells.
  3. Users need a thorough understanding of the differences between plans.

Rep learnings

  1. Many reps don’t use or trust current system recommendations because they are often inaccurate.
  2. Top factors for plan recommendations are price, data usage, mobile hotspot usage, and network performance.

Based on our findings, the team and I determined that to be successful, our solution would have to be:

  1. Personalized
  2. Contextual
  3. Transparent
  4. Accurate

THE APPROACH

How to make a robot feel human.

Once we knew what our users needed, it was time to move into solution mode. Our solution was defined by three steps:

Definition by research

What we recommend
We worked with engineers to build an AI-powered recommendation engine that would ensure recommendation accuracy. But the algorithm drew on data that wasn't all customer-facing. For example, we didn't want to use the customer's age, one of the top 20 features of the algorithm, to explain why they got a certain recommendation. We had to establish which customer data we could use to make sense of the algorithmic output.
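
To make this concrete, here is a minimal, hypothetical sketch of that separation; the feature names (other than age) are assumptions for illustration, not the production engine's actual inputs:

# Hypothetical sketch only: splitting the model's predictive features into the
# subset we could surface to customers as "reasons why" and the rest, which
# drive accuracy but are never shown. Names are illustrative, not the real inputs.

CUSTOMER_FACING = {"data_usage", "hotspot_usage", "network_experience", "plan_cost"}
INTERNAL_ONLY = {"age", "tenure_months", "household_size"}  # never used in explanations

def explainable_features(top_model_features: list[str]) -> list[str]:
    """Keep only the features we are willing to use when explaining a recommendation."""
    return [f for f in top_model_features if f in CUSTOMER_FACING]

# e.g. the model's top drivers for one customer's recommendation
print(explainable_features(["data_usage", "age", "hotspot_usage"]))
# -> ['data_usage', 'hotspot_usage']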

Why we recommend it
We discovered through our research and analysis of the AI algorithm that a handful of factors are both important for accuracy and meaningful to customers and reps. The eight categories of information that we could surface to customers are:

  1. Cost
  2. Network Experience
  3. Data Usage
  4. Hotspot Usage
  5. Devices
  6. Features
  7. Subscriptions
  8. Discounts

I worked with my product manager and backend engineers to create a matrix of data points that we could pull from a customer's profile to explain the AI-generated recommendation. We manually created 150 different rules from our eight "buckets". I then worked with a content strategist to translate each rule into concise, personalized, and customer-friendly content.
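
As a rough illustration, one "row" of that matrix could be expressed like the sketch below; the profile fields, thresholds, and copy are hypothetical stand-ins, not the actual 150 rules:

# Hypothetical sketch of the rule/content matrix: each rule ties a condition on
# the customer's profile to one of the eight buckets and a piece of
# customer-friendly copy. Field names and thresholds are illustrative only.

RULES = [
    {
        "bucket": "Data Usage",
        "condition": lambda p: p["avg_monthly_gb"] > p["plan_data_gb"],
        "copy": "You've been using more data than your current plan includes.",
    },
    {
        "bucket": "Hotspot Usage",
        "condition": lambda p: p["hotspot_gb"] > 0 and not p["plan_includes_hotspot"],
        "copy": "You use mobile hotspot, which isn't part of your current plan.",
    },
]

def reasons_for(profile: dict) -> list[str]:
    """Return the explanatory copy for every rule this customer's profile matches."""
    return [rule["copy"] for rule in RULES if rule["condition"](profile)]

# Example (hypothetical) profile that matches both rules above
profile = {"avg_monthly_gb": 18, "plan_data_gb": 10, "hotspot_gb": 2, "plan_includes_hotspot": False}
print(reasons_for(profile))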

The main challenge was to balance personalization with feasibility. The more rules, the more personalized we could get. However, more rules would also mean a greater operational burden, especially as Verizon's plans change every year. We would have to limit our recommendation rules and content to 150 different variations until we could move from manual to AI-generated content.

How we recommend it
The next step was figuring out how these rules translate into an interface that the customer sees.

A glimpse of the rule/content matrix

IDEATION

How we recommend it

Sketches, wireframes and prototypes

I conducted an ideation workshop with my cross-functional partners and our stakeholders to drive collaboration and an exchange of perspectives. To prepare, I audited companies known for great recommendation experiences, many of which came up in our customer interviews: Netflix, Stitch Fix, Spotify, and Sleep Number. I documented how, when, and where they displayed recommendations and distilled a set of best practices. I found, for example, that the most successful recommendation experiences explicitly communicate the reason for a recommendation (e.g., “Based on your account usage”) and support it with data visualizations where appropriate. I shared these findings during the workshop to help inspire ideas.

During the Crazy 8s exercise, I had everyone ideate on prompts based on our design principles (personalized, contextual, transparent, accurate). These were the concepts that earned the most votes:

Concept 1: Show & Tell
Customers are presented with explanatory copy and supportive data visualizations.

Concept 2: Feature Comparison
Customers are presented with a plan comparison chart.

Concept 3: Plan Match Score
Customers get a match score for their current plan and other plan options.

Concept 4: User Input
Customers select feature preferences to make recommendations even more accurate.

We tested these ideas with customers to learn which elements were most effective for understanding and trusting a plan recommendation. Rather than a single design emerging as the clear winner, a couple of ideas resonated with customers. Customers liked and expected a combination of explanatory copy and supportive data usage visualizations to determine whether the recommended plan best fit their needs (Concept 1). Recommendations felt more trustworthy when they showed substantial evidence and were personalized. Customers also said that the plan comparison chart would be vital in helping them make a plan change decision (Concept 2). Verizon's plans are loaded with features, and most customers admitted they don't have a good grasp of even their current plan's features.

As for the Plan Match Score concept (Concept 3), customers understood that a 100% match would be hard to attain but felt that any score less than 100% (even a 99%) would make them feel negatively about their current plan. Customers responded fairly well to the recommendation settings concept (Concept 4), but the team and I deemed it too heavy a lift for the value it would add.

Fleshing out the modal

Translating the designs

I then created wireframes for both 'change' and 'stay' recommendations that incorporated the different elements that performed well in our concept test. We decided that a modal would make the most sense for our solution so as not to disrupt the change plan flow. I incorporated tabs in the 'change' recommendation modal to avoid overwhelming the customer with too much information on one screen. The 'stay' modal was more straightforward as it didn't need comparison elements. We ran tests with our recommendation rules and found that most customers would receive either one or two "reasons why" a plan was recommended. This helped me understand how many rules I needed to account for in my designs for a single customer. From there, I fleshed out high fidelity comps for both desktop and mobile.
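
For context, a tiny sketch of that constraint, continuing the hypothetical rule setup above (the cap and pre-ranking are assumptions for illustration):

# Hypothetical sketch: because most customers match only one or two rules, the
# modal renders at most two "reasons why", assumed to be pre-ranked by relevance.

MAX_REASONS_SHOWN = 2

def reasons_to_render(ranked_reasons: list[str]) -> list[str]:
    """Cap the displayed reasons so the modal never overwhelms the customer."""
    return ranked_reasons[:MAX_REASONS_SHOWN]

print(reasons_to_render([
    "You've been using more data than your current plan includes.",
    "You use mobile hotspot, which isn't part of your current plan.",
    "Your streaming subscriptions aren't part of your current plan.",
]))  # -> only the first two reasons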

'Change' modal wire, details tab

'Change' modal wire, comparison tab

'Stay' modal wire

'Change' modal comp, details tab

'Change' modal comp, comparison tab

'Stay' modal comp

Revising the user flow

Mapping touchpoints and entry points

After we learned which design elements we wanted to move forward with, the next step was figuring out where our solution would live. I built a user flow of the current change plan experience and realized there were multiple places where the recommendation tool could be used. By increasing the number of touchpoints and the fluidity between them, we could make our recommendations more accessible and effective. Once the modals were in a good place, I designed the new recommendation touchpoints and entry points.

Existing touchpoints

New touchpoints

PMD touchpoint

Fork page touchpoint

Line selector touchpoint

Plan selector touchpoint

ITERATION

Modal improvements

Refining the designs through usability testing

We ran multiple rounds of usability testing on the CPC flow touchpoints as well as the new modal designs. We tested desktop and mobile screens, 'change' and 'stay' recommendations, and the nine most common recommendation rules that would surface. At a high level, we wanted to know whether the touchpoints were noticeable, whether 'stay' recommendations were valuable, and whether there were any issues with comprehension, perception, or task completion.

Customers responded well to the CPC touchpoints, so most of the feedback was around the modal design. Customers said the side-by-side feature and price comparison was the most useful element in helping them decide whether to take the recommendation (even more so than the 'reason why' language and visuals). Although the tabs were noticed, customers felt that "hiding" the feature comparison chart behind the second tab unnecessarily broke up the flow of the recommendation. They also wanted easy access to feature details to learn about features they were unfamiliar with.

When we recommended an individual unlimited plan to someone on a shared data plan (Verizon's two main plan categories), customers were confused by the cost implication of switching. For multi-line scenarios, customers were confused about which line a given recommendation was for and sometimes had to backtrack from the modal to figure it out. We also identified copy improvements in the header, subhead, and recommendation explanation, which customers found too vague.

One of our big findings was that 'stay' recommendations were valuable and helped to build trust. However, customers still wanted the option to change plans from the modal if they didn't agree with our recommendation. All of these findings helped inform the next iterations of the design.

Iteration 1, change

Iteration 2, change

Iteration 1 & 2, stay

We got a lot of positive feedback throughout our usability testing, which validated that we were headed in the right direction. These are two of my favorite quotes:

"I like that recommendations are personalized based on past usage. I feel like Verizon is trying to be honest with me and not just trying to take more money.”

— Nicole, customer

"I think it’s laid out well because it’s guiding me through a process and explaining to me why I may need this plan. I can’t see why I would need any more information at this time.”

— Joe, customer

DEVELOPMENT AND RELEASE

Building the product

Phased releases with design QA

Designs were released in multiple phases so as not to overwhelm our engineering partners. Each phase required multiple rounds of grooming, where I annotated designs and specified visual elements and functionality. For the first release, we built the new heading and 'why we recommend' sections (minus the graphs). For the second release, we built the compare features element. And for the third release, we built the graphs. This ended up being a great way to track the impact of individual updates; for example, once we added the compare features element, customers were 9% more likely to select the recommended plan.

I also helped QA each release to ensure the designs were implemented accurately before pushing live. Since then, we've been closely monitoring performance to inform future iterations.

THE RESULTS

Crushed it

Surpassing every target

In our latest analysis, we found that our product has surpassed our goals by a significant margin. Customers are choosing the recommended plan at higher rates, particularly for unlimited plans. They are self-serving through digital channels to make plan changes, causing calls to customer service to drop. And they are much more likely than before to step up to more expensive plans.

  1. Selection of recommended plans increased by 57% (target was 8%).
  2. Digital plan changes increased by 12% (target was 10%).
  3. Step-up migrations increased by 8% (target was 5%).

Additional research and analytics are underway to validate updates from our latest release and see where we may need to iterate further. This is a living, breathing project, so we'll keep a close eye on it to make sure it's working well. One thing we're looking to implement in the future is using AI to auto-generate recommendation content, minimizing operational effort and allowing even deeper personalization.
