Visier

Generative A.I. Assistant

Company
Visier Inc.

Role
Senior User Experience Designer

Team Composition
2 User Experience Designers, 1 Director of Product, and a team of developers

Scope
User Research
Prototyping
Interaction Design
Visual Design
Conversational A.I. Design

 

Problem Space

Visier People Analytics is a deep and robust solution: it not only lets you visualize HR data in a wide variety of chart formats, it also leverages an analytic model that understands organizational structures and their implied relationships. For someone new to data analytics, or a people leader who doesn't work with people data on a daily basis, the options and even the fundamental functions can be overwhelming.

With the rise of A.I.-assisted workflows, Visier saw an opportunity to cut through often-confusing workflows and bring better insights to more people.

Project Goals

  1. Implement a cornerstone A.I.-focused feature

  2. The feature should cut through difficult workflows to surface useful insights


 

How might we leverage new patterns of generative A.I. and conversational design to help people unfamiliar with data analytics quickly find useful insights?

 

Open-ended Exploration

The explicit use of A.I. in Visier People was new, so the initial design strategy was to go wide: adopt a blue-sky mentality, learn more about the A.I. technologies currently available, and avoid being constrained by the platform's current technical limitations.

Three main ideas for project codename “Vee” bubbled to the top:

 

1. Proactive Messaging

This design exploration began as a previous project that pivoted into Vee. The main idea was to build a messaging feature that alerted users when certain trigger events or conditions were met. Those users would receive an email with insights on the event and what their next steps could be.

From Hair Ball to Out of My Hair

The elevator pitch for this feature would be Visier People acting as a background assistant, helping users take appropriate actions. All of this would happen without having to go into Visier People - rather, the notification would reach the end user where they usually are (Slack, email, etc.).

Below is an example of the core steps:

1. Receive a message from Vee - in Slack, email, or elsewhere - about an event that's important to you

 

2. Take a follow-up action that could help resolve the situation. This action would be auto-generated to help the user speed up their next steps.
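
To make the trigger idea concrete, below is a minimal TypeScript sketch of what a trigger definition and its resulting message might have looked like. The names, fields, and channels are purely illustrative assumptions, not Visier's actual model.

// Hypothetical sketch only - not Visier's actual trigger or notification model.
interface TriggerCondition {
  metric: string;                     // e.g. "resignation_rate"
  population: string;                 // e.g. "Engineering"
  comparison: "above" | "below";      // how the metric is compared
  threshold: number;                  // value that fires the trigger
  window: string;                     // e.g. "last_3_months"
}

interface ProactiveMessage {
  recipient: string;                  // the person who cares about this event
  channel: "slack" | "email";         // reach users where they usually are
  insightSummary: string;             // what happened and why it matters
  suggestedActions: string[];         // auto-generated next steps
}

// Each industry and company would need its own library of trigger
// definitions, which is part of why this became a significant undertaking.
const exampleTrigger: TriggerCondition = {
  metric: "resignation_rate",
  population: "Engineering",
  comparison: "above",
  threshold: 0.05,
  window: "last_3_months",
};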

 

Ultimately, we didn't follow through with this direction because:

  1. When it comes to people data, the triggering event conditions are many and complex, often specific to individual industries and companies. Creating that definition system would be a significant undertaking.

  2. The resulting actions are just as varied and complex. They would require context, approval, leadership alignment, etc. This function would also require a significant amount of work.

  3. The examples explored were often individual-focused - for example, an event happened to a person on your team. Visier People is capable of this analysis but is better suited to analyzing data at a larger population level.

 

2. Partner Application

This design direction looked at a partner application - either a distinct app downloaded onto your computer or a browser extension. The user would open the partner app, select a portion of their screen or browser to analyse, and Vee would present a data visualization along with a conversational UI where they could ask follow-up questions about the data.

 

This was another direction we didn't follow through with, because:

  1. This application would not only require dynamic data analysis but also automatic data upload into Visier People. Even if that were possible, there would need to be a step where the uploaded data is integrated into the data model so relationships and other structures can be enforced. All of this is currently outside of the platform's capability.

  2. Though less apparent, there is a data security issue as well. If automatic data upload and data model integration were possible, regulating whether that data is secure or even relevant to the company would be difficult.

 

3. Search Replacement/Augmentation

This design direction was the most practical one we explored. It looked at augmenting or replacing the current search experience with a conversational one. Rather than requiring the user to define an end outcome, as we do in the data-visualization-centric “Explore” room, we begin by having the user ask a people-centric question. Vee would then use Visier People to try to answer it. There are many parallels to Chat-GPT, which is a benefit since it leverages a new but rapidly familiar precedent.

This was the direction we decided to pursue, and we began refining it further.


Research

Environmental Scan
Once the direction was determined, we went on to see which experiences were similar and what we could learn from them. The other UX designer and I compiled a list of various chatbot and conversational experiences.

Some key learnings were:

  1. Chat-style interfaces (similar to WhatsApp) were the most prevalent way to present a conversational experience. This has been true since chatbots were introduced and remains true today.

  2. Supplementary actions can be included but quickly take focus away from the text answers.

  3. Being able to easily and inherently provide follow-up input gives the A.I. context, which results in more refined answers.


Approach

Asking Questions = Searching

Rather than trying to augment the search experience with an A.I. assistant, we opted to replace search entirely. The point of search is ultimately to get to an answer, so balancing a traditional search experience against a conversational UI wouldn't be necessary.

Answer First, Visualization Second

Visier is renowned for its data visualizations and believes strongly in their power to communicate complex ideas. For Vee, however, images and secondary actions can distract from the answer the user is looking for. Deprioritizing or even hiding some of this supplementary content is key to a simple, clear conversational interface.


Wireflows

For the first refinement pass, Vee fully replaces the Search function in Visier People. The search/input field is kept at the top of the page, which doesn't align with more common chat-interface layouts but allows for a smoother transition from a pure search experience to Vee.

After a user inputs a question, it appears below the input field to confirm that the system has received it. Vee's answer appears below that. Follow-up actions are also available below Vee's answer, either revealing a relevant data visualization (chart) or presenting traditional search results germane to the user's question.

As the user continues to ask questions, the thread model would use those questions to refine the initial query. If they pivoted to another line of questioning, a new thread would be established.
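
As a rough illustration of the thread model described above, here is a minimal TypeScript sketch; the structure and names are assumptions for explanation only, not the implemented data model.

// Hypothetical sketch of the thread model - names are illustrative only.
interface Turn {
  question: string;        // what the user asked
  answer?: string;         // Vee's generated response
}

interface Thread {
  turns: Turn[];           // each follow-up refines the same underlying query
}

// Follow-up questions are appended to the current thread so they can refine
// the initial query; a pivot to a new topic starts a fresh thread so the
// earlier context no longer applies.
function addQuestion(threads: Thread[], question: string, isPivot: boolean): Thread {
  if (isPivot || threads.length === 0) {
    const fresh: Thread = { turns: [{ question }] };
    threads.push(fresh);
    return fresh;
  }
  const current = threads[threads.length - 1];
  current.turns.push({ question });
  return current;
}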


Usability Testing

Setup

Once the mid-fidelity prototypes were complete, we performed formative usability tests with five Visier team members who best reflected external customers. The intention was to run a wider follow-up usability test once Vee reached the alpha stage.

The participants were presented with a scenario and then prompted with tasks to complete in the prototype. Think-aloud protocol and secondary interview questions were our main testing methods. The sessions were recorded and then documented in Dovetail for archiving and easier sharing.

Results

A number of interesting findings were surprisingly consistent amongst the participants:

  1. The participants didn't know to click the “Vee” logo to navigate to the page; they were looking for a search icon or something similar. If we had been using a pass/fail rubric, all participants would have failed.

  2. The placement of the input field at the top of the page didn't align with the expected layout and experience. Participants mentioned they were mentally referencing Facebook Messenger, WhatsApp, and Chat-GPT.

  3. Vee’s answer was formatted in a way that was difficult to understand and was misaligned with how participants expected “an A.I. to sound”.

  4. The follow-up actions were not aligned with participant expectations. They were hoping for additional context in the form of wider-view insights (such as Vee's answer compared to another company vertical or to a previous period). Also, the traditional search results didn't match their mental model of the experience, which resulted in more confusion than clarity.


Design Refinement

Following the usability tests, we made a number of improvements and passed minimum-viable-scope workflow states on to the developers. Outlined below are a few key changes to both the interface and the answer design.

 

Keep the door, change the room
A small change that made a big difference was using the search icon, rather than the “Vee” icon, as the entry point. Though there is a potential content mismatch, this change lets users discover a brand-new feature without digging around for it.

 

Started at the bottom, now we here
The input field was moved to the bottom of the page, which better aligns with emerging industry standards and our users' expectations. By upholding the Nielsen Norman Group heuristic of “Consistency and Standards”, we were able to accelerate learnability and position ourselves in a similar vein to the Chat-GPTs of the world.

 

Clarifying actions
Two changes were made to the follow-up actions:

  1. The wording for the first button was changed from “Would you like more context?” to “View as visualization” to better reflect what would actually happen when the button was pressed.

  2. The “show related search results” button was removed. There are a variety of other ways in the application to find a metric, dimension, analysis, chart, etc., which could render the search room redundant. Replacing search entirely met a lot of resistance, though, so Vee was placed behind a feature toggle (sketched below), letting each organization decide whether it is a good fit or trial it for a period of time with the ability to revert to the traditional search.
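
As a sketch of the toggle idea, assuming a simple per-tenant flag (the flag name and shape are hypothetical, not Visier's actual configuration):

// Hypothetical per-tenant toggle - illustrative only.
interface TenantSettings {
  veeReplacesSearch: boolean;   // true: Vee replaces the search room
}

// Organizations can trial Vee for a period and revert to the traditional
// search experience at any time by flipping the flag.
function resolveSearchExperience(settings: TenantSettings): "vee" | "traditional-search" {
  return settings.veeReplacesSearch ? "vee" : "traditional-search";
}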

 

More better words
Vee was producing correct answers, but they were difficult to understand for users unfamiliar with the data model and/or data analytics.

An updated answer template containing a number of basic rules was implemented to rearrange the presented information so it sounds more human and reads as a more digestible answer. Some of these rules were (a sketch of how the template might be filled follows the list):

  • Primary insight sentence structure:

    • “In/From {time frame} for {population scope}, the {subject} by {group by and filters} were:”
  • Secondary insight(s) sentence structure that communicates a change in value:

    • “For {group by and filters}, the {subject} {increased/decreased} by {value} in {time frame} to {value} in {time frame}. This was a {value} {increase/decrease}.”
  • Secondary insights are to be presented in an unordered list and in a lighter grey colour (so as not to take away from the primary answer)
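
To make these rules concrete, here is a minimal TypeScript sketch of how such a template might be filled; the interfaces and functions are hypothetical, mirroring the template wording above rather than the production implementation.

// Hypothetical sketch of filling the answer template - illustrative only.
interface PrimaryInsight {
  timeFrame: string;          // {time frame}, e.g. "Q1 2023"
  populationScope: string;    // {population scope}
  subject: string;            // {subject}, e.g. "resignation counts"
  groupByAndFilters: string;  // {group by and filters}
}

interface SecondaryInsight {
  groupByAndFilters: string;
  subject: string;
  direction: "increased" | "decreased";
  value1: string;             // first {value} in the template
  timeFrame1: string;         // first {time frame}
  value2: string;             // second {value}
  timeFrame2: string;         // second {time frame}
  changeValue: string;        // {value} describing the overall change
}

// Primary insight: "In/From {time frame} for {population scope}, the {subject}
// by {group by and filters} were:"
function renderPrimary(p: PrimaryInsight): string {
  return `In ${p.timeFrame} for ${p.populationScope}, the ${p.subject} by ${p.groupByAndFilters} were:`;
}

// Secondary insights follow the change-in-value template and are rendered as
// an unordered list in a lighter grey so they do not compete with the primary answer.
function renderSecondary(s: SecondaryInsight): string {
  const change = s.direction === "increased" ? "increase" : "decrease";
  return `For ${s.groupByAndFilters}, the ${s.subject} ${s.direction} by ${s.value1} in ${s.timeFrame1} to ${s.value2} in ${s.timeFrame2}. This was a ${s.changeValue} ${change}.`;
}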


Results

Vee has yet to be released in a post-alpha state, so results for this feature are inconclusive. The general sentiment from people who have used it is that Vee rapidly gets to a relevant and valuable answer without having to build a chart or review pages of analyses. What could take anywhere from 15 minutes to hours took only a few minutes.


Improving my process

I’ve taken away a number of valuable lessons while working on Vee:

  1. Applying new technology requires many “old” UI patterns to help bridge the gap.

  2. Communication and early stakeholder insight are paramount. Speaking with the devs earlier about how we might address the worry around replacing the Search experience would have made for a smoother process.

  3. I knew that rapid, lightweight usability testing is pivotal when implementing something completely new. This was heavily reinforced for me during this process.