An Evaluation of NEST

Client
Metricell
Skills
Research, UI & UX Design

Project Overview

As Metricell grew, it became apparent that information was missing about how users work with products created earlier in the life of the business. The first product to be evaluated was “NEST”, a customer support tool used by EE, Metricell’s biggest customer.

The project’s initial scope was to evaluate how different support agents use NEST, identify the most significant pain points in their user journey, and then ideate changes that improve the product’s user flow.

The Challenge/Problem

As stated in the introduction, NEST was created in Metricell’s earlier years. The product was designed around assumptions cobbled together from stakeholders who had worked in similar industries. This project was the first to truly understand the users and how they interact with the product.

Additionally, this research took place at the start of the pandemic. While I was able to adapt quickly to the change in working arrangements, it limited the research techniques available and altered the users’ natural working conditions.

The Solution

User Levels

Because Metricell is a B2B company, I could quickly identify how NEST is used from a business perspective. At the start of the project I hosted meetings with the Head of Customer Support at EE, which allowed me to identify how the support agents were grouped and the purpose of each grouping.

I identified three support levels at EE. Level 1 is responsible for first-line support; they use NEST either to identify known network issues or to log unknown issues and relay this back to the customer.

Level 2 is responsible for issues that Level 1 could not address, working with the engineering team to resolve them. They are expected to use NEST to log technical information about areas that have outages.

Level 3’s responsibility was to manage Levels 1 and 2. They had a more hands-off role with NEST, but would still need to use it when team numbers were low.

Testing Plan/Discussion Guide

Understanding this formed the basis of the research plan. My method was to host three sets of user sessions: five users from Level 1, five from Level 2, and three from Level 3. Each session would consist of a 30-minute interview and 30 minutes of active tasks on NEST.

The interview portion of the usability sessions aimed to build a picture of who is using NEST: how the tool fits into their work, what challenges they face, and how this differs from Level 1 to Level 3 agents. Understanding this from the agent’s viewpoint gave me insight into common issues. Comparing it against the active test sessions helped determine whether those issues stemmed from the software, a lack of training, or a missing feature.

The usability tests focused on two scenarios: a known network issue and an unknown one. In each scenario, the agent was expected to follow a specific process, hitting key points that help address the customer’s issue.

Both scenarios required the agent to run a network check at the location of the customer’s issue. If the issue was unknown, the agent had to record it based on the customer’s description, create a problem ticket, and link the customer to the issue so they would receive automated messages about the investigation. If the issue was known, the agent had to communicate it back to the customer and again link the customer to the network issue to provide the automated messages.

All of these points were captured in a discussion guide to help me keep track of the interviews and ensure every point needed to collect the data was covered.

Discussion Guide Screen Grabs

Running the Sessions

All sessions were conducted remotely, since most of the users are based in Wales and, as mentioned earlier, the research took place during the pandemic. Microsoft Teams was used to run and record the sessions; it also provided a transcript of each session that was used during the analysis.

Analysis

I performed a thematic analysis on the session transcripts, pulling out key points and assigning each a theme. This was done for both the interviews and the tasks.

Additionally, I marked each task as a success or failure and noted whether the agent followed the correct procedure. If they deviated, why, and did the deviation help resolve the customer’s issue? EE wanted lower-level agents to follow the process rigidly, while higher-level agents had more flexibility to deviate in order to resolve the customer’s issue.
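The coding described above can be sketched as a small data structure: each transcript excerpt is tagged with a theme and, for task excerpts, a pass/fail outcome. The themes mirror those reported later (navigation, process, data); the agent IDs and excerpts below are invented for illustration.

```python
from collections import Counter

# Hypothetical coded findings; in practice these would come from the
# Microsoft Teams transcripts after thematic analysis.
findings = [
    {"agent": "L1-03", "theme": "navigation", "task_passed": True,
     "note": "Jumped between site KPIs and elevation profile."},
    {"agent": "L2-01", "theme": "process", "task_passed": False,
     "note": "Did not consult the Geek Glossary for engineer notes."},
    {"agent": "L1-05", "theme": "navigation", "task_passed": True,
     "note": "Could not find elevation profile at first."},
]

# Tally which themes recur most, and the overall task pass rate.
theme_counts = Counter(f["theme"] for f in findings)
pass_rate = sum(f["task_passed"] for f in findings) / len(findings)

print(theme_counts.most_common(1))  # [('navigation', 2)]
print(round(pass_rate, 2))          # 0.67
```

Structuring the codes this way makes it easy to compare themes across agent levels and against the task outcomes.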

Results

I then took my findings and created a report for senior stakeholders. Within my findings, I discovered three major pain points in the NEST user flow.

Results breakdown and potential resolutions.
Screen Grabs from the Official Testing Report

Navigation

While all tested agents could find the various features needed to aid an investigation, they had to jump between multiple features to build an understanding of the problem. For example, one agent had to switch between site details, engineer notes, site KPIs and elevation profiles to understand a single issue.

We categorised this as a potential issue because, during a call with a customer, it increases handling time. Moreover, if a customer has been escalated from Level 1 to Level 2, they may already have spent a long time on the phone to the contact centre. The evidence collected from Rebecca Morgan shows that customers can become frustrated after spending a long time on a call without getting the results they wanted, which makes it harder for the agent to resolve the problem.

Additionally, while all tested agents were able to find the relevant features, a message that came up in multiple sessions was that newer Level 1 agents were unable to find certain features, the elevation profile being the common example. This is problematic because tickets are escalated to Level 2 that could potentially be solved at Level 1, creating additional work for agents and increasing handling time between the customer and the business.

Process

One possible concern highlighted by this audit is communication between frontline agents and engineers. Agents are supposed to use the "Geek Glossary", an Excel sheet containing common engineering terminology. However, no agent used this sheet to process the engineers' notes. This may explain why agents either struggled to process the notes or commented that new frontline agents would not be able to understand them. Using the "Geek Glossary" mid-call would also increase call duration between agent and customer.

Data

The last theme focuses on possible issues with data. All of these issues leave the agent with incorrect or insufficient data.

The first and most common complaint we collected is the dependency on examples. For data-speed complaints, the agent must ask the customer to run speed tests, which is problematic for two reasons. First, customers have to be called back at a later date to give them time to collect the results; this can frustrate customers who may not understand why resolution takes so long just because the agent cannot process the complaint without speed-test examples. Second, customers who are less tech-savvy have to go out of their way to learn how to run a speed test to diagnose their own problem.

New Features and Changes

Location Profile

To resolve the navigation problem, I proposed and later implemented a new feature known as the “location profile”. The idea is that a user can search a customer’s location and NEST opens a report with all the data needed to investigate an issue. This removes the need for agents to navigate multiple menus to find the issue affecting the customer.
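The aggregation behind a location profile can be sketched as follows. The categories (site details, engineer notes, site KPIs, elevation profile) come from the case study; the data sources, field names and sample values are all stand-ins for whatever backend NEST actually uses.

```python
from dataclasses import dataclass, field

# Hypothetical per-location data stores; in NEST these would be separate
# screens or services the agent previously had to visit one by one.
SITE_DETAILS = {"CF10": {"site_id": "S-041", "status": "degraded"}}
ENGINEER_NOTES = {"CF10": ["Backhaul fault logged 09:12."]}
SITE_KPIS = {"CF10": {"availability": 92.4}}
ELEVATION = {"CF10": "Hilly terrain; partial line-of-sight."}

@dataclass
class LocationProfile:
    """Everything an agent needs about one location in a single report."""
    location: str
    site_details: dict = field(default_factory=dict)
    engineer_notes: list = field(default_factory=list)
    kpis: dict = field(default_factory=dict)
    elevation: str = ""

def build_location_profile(location: str) -> LocationProfile:
    # One lookup replaces several separate menu navigations.
    return LocationProfile(
        location=location,
        site_details=SITE_DETAILS.get(location, {}),
        engineer_notes=ENGINEER_NOTES.get(location, []),
        kpis=SITE_KPIS.get(location, {}),
        elevation=ELEVATION.get(location, ""),
    )

profile = build_location_profile("CF10")
print(profile.site_details["status"])  # degraded
```

The design choice is simply to fan out one search to every relevant data source and return a single object for the UI to render.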

UI Design of Location Profile

Integrating Geek Glossary

As mentioned, users have to consult an Excel sheet alongside NEST to process notes from engineers. This is a cumbersome process in itself, and one that is not used as often as it should be. To make the process easier, I proposed integrating the Excel sheet into the tool itself: when an agent brings up notes from an engineer, NEST highlights any word that appears in the sheet, and the user simply clicks the word to bring up its definition.
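The core of that highlighting step can be sketched as a term-matching pass over the note text. The glossary entries below are invented; in production the dictionary would be loaded from the "Geek Glossary" Excel sheet itself (e.g. with a spreadsheet library), and the markers would become clickable elements in the UI.

```python
import re

# Hypothetical glossary entries; the real Geek Glossary sheet would be
# the source of truth here.
GLOSSARY = {
    "RSRP": "Reference Signal Received Power, a measure of signal strength.",
    "backhaul": "The link carrying traffic between a cell site and the core network.",
}

def highlight_terms(note: str, glossary: dict) -> str:
    """Wrap any glossary term found in an engineer's note with [brackets]
    so the UI can render it as a clickable link to its definition."""
    # Match longest terms first so multi-word terms win over substrings.
    pattern = re.compile(
        "|".join(re.escape(t) for t in sorted(glossary, key=len, reverse=True)),
        re.IGNORECASE,
    )
    return pattern.sub(lambda m: f"[{m.group(0)}]", note)

print(highlight_terms("Low RSRP reported; backhaul check pending.", GLOSSARY))
# -> Low [RSRP] reported; [backhaul] check pending.
```

Because the matching is case-insensitive and driven entirely by the sheet, updating the glossary automatically updates what NEST highlights.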

UI Designs of how Geek Glossary works in pop-ups

Remote Speed Test

Metricell provided other tools to EE, one of them being an API linked to the EE mobile app. To resolve the issue of agents not having enough data and having to ask customers to run speed tests, I proposed linking the NEST platform to this API. That would allow agents to schedule speed tests to run in the background, reducing the workload on the customer to provide the data and simplifying how agents collect the data they need to diagnose an issue.

UI Design of how agents would schedule a remote speed test