THE PROBLEM
How should we design a chatbot that answers questions from prospective students applying to the Comm Lead program?
CONTRIBUTION
- Collaborated with project managers, researchers, and other designers; researched various chatbot products and presented each product's strengths and weaknesses to the team.
- Contributed to the research efforts and designed one main conversation flow.
- Presented the chatbot project at the annual department summit.
- Won the design award.
SKILLS PRACTICED
Product Research | Conversation Design | User Flows | Chatbot | Human-centered Design Thinking | Competitive Analysis | Card Sorting | Tree Testing | User Testing | Information Architecture | NLP Bot | Project Management
PREVIEW
Ask the bot when you have questions
The frequently asked questions (FAQ) navigation
The FAQs are organized into 5 key categories, allowing students to easily access the answers by selecting the relevant tab.
The chatting function
Users can type their questions into the chat to receive answers.
The chatbot backend - How I connected the conversation flows
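The case study shows the backend visually rather than in code. Purely as an illustration of the idea, here is a minimal TypeScript sketch of how the conversation flows could be connected: each step is a node that stores the bot's reply and the options leading to other nodes. All names (FlowNode, mainMenu, nextNode) are hypothetical, not the actual implementation.

```typescript
// Hypothetical sketch: each conversation step is a node with a reply and the
// options that connect it to other nodes; the main menu is simply the entry node.
interface FlowNode {
  id: string;
  reply: string;                                  // what the bot says at this step
  options: { label: string; next: string }[];     // buttons shown to the user
}

const flows: Record<string, FlowNode> = {
  mainMenu: {
    id: "mainMenu",
    reply: "Hi! What would you like to know about the Comm Lead program?",
    options: [
      { label: "Admissions", next: "admissions" },
      { label: "I can't find what I want", next: "contactAdvisor" },
    ],
  },
  admissions: {
    id: "admissions",
    reply: "Here is an overview of the admission process ...",
    options: [{ label: "Go back", next: "mainMenu" }],
  },
  // ...remaining nodes omitted
};

// Moving through the conversation is just following an edge in the graph.
function nextNode(current: FlowNode, choiceLabel: string): FlowNode {
  const choice = current.options.find((o) => o.label === choiceLabel);
  return choice ? flows[choice.next] : flows["mainMenu"];
}
```

Modeling the conversation as a small graph is what makes options like "go back to the main menu" just another edge rather than a special case.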
THE PROBLEM
Information transparency is a crucial aspect of the school admission process. Students struggle to find information because it is scattered across websites and emails.
Additionally, the high volume of emails from prospective students asking similar questions results in a tedious workload for the Academic Advisor.
To address these challenges, a chatbot solution can be designed to answer basic questions and guide prospective students to the information they require.
USER ANALYSIS
Understanding prospective students
We created two typical personas representing international and domestic students in the Comm Lead program.
The geographic information was gathered from Comm Lead program website traffic; the students' quotes and questions were drawn from the user survey.
Comm Lead website analysis (09/2021 - 02/2022)
Prospective students' FAQs
The User survey
STYLE GUIDE
Fostering team alignment
The visual design and content guide serves as a shared reference, establishing the preferred terminology and tone for the bot's responses.
PLANNING
In addition to website analysis, user research, and the creation of a style guide, the team completed the design of the FAQ conversation on the main menu in phase 1.
The bot demo in phase 1
USABILITY AUDITING
Aiming to launch in 3 months
Looking back at the user testing done at the end of phase 1, I observed the following comments:
“At the end, I select 'I can't find what I want' but I don't have any menu showed up, I guess that's the end of the conversation. Wish there's option to go back to main menu.”
“My only recommendation would be to think about adding a ‘back to main menu’ option on some of the prompts. When I wanted to check out a second option, I clicked ‘go back’ but it only took me one step back within the same category. I think it might be helpful to have both?”
“It was really great! One thing I thought might be improved was the timing of "Was this answer helpful?" shows up. While I was reading the answers from the chat bot, this "Was this answer helpful?" showed up and the answers I was reading moved up before I could finish reading them.”
“Make the bot answer what the full cost of the course is, not per credit - international students don’t know what the US system is when they are joining.”
Based on this feedback, several problems with the conversation flow were identified, including the inability to return to the main menu, disruptions caused by automatic messages, and a lack of understanding of international students' needs.
Additionally, users suggested improvements such as adding notifications for program activities in follow-up messages and enabling the chat function.
COMPARATIVE STUDY
Comparing 5 universities' chatbots
To create effective main menu navigation, I compiled a list of the most useful and common question types and compared it to the offerings of chatbots designed by other universities.
This allowed me to determine the most important information to include.
The chatbot at the New Jersey City University's website
The chatbot at the California State University, San Bernardino's website
Main takeaways:
Strengths: inclusive, career-oriented, and straightforward
The Comm Lead chatbot provides comprehensive answers for both international and domestic students. It sets itself apart by offering a wide range of career resources.
After the usability auditing and comparative study, I decided to overhaul these features:
1. Show the main menu tabs first
2. Enhance the interaction experience
3. Remove the initial question asking students' identity (international or domestic)
IMPROVEMENTS
1. The default chatbot status
A circular widget with a chatbot introduction tooltip is shown by default, minimizing disruption when the bot is not in use.
2. The welcome message appears when users click the widget
When users open the chat, they are greeted with a welcome message and the main menu, allowing for easy navigation to answers.
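The actual bot was configured in a chatbot builder rather than hand-coded. Purely as a rough sketch of the behavior described above (collapsed widget by default, welcome message and main menu on click), assuming hypothetical element IDs and helpers:

```typescript
// Illustrative sketch only: the real bot was configured in a chatbot platform.
// Default state: collapsed circular widget; on click, show the welcome message and main menu.

const widget = document.getElementById("chatbot-widget");     // hypothetical element IDs
const chatWindow = document.getElementById("chatbot-window");

// Append a bot message bubble to the chat window.
function postBotMessage(text: string): void {
  const bubble = document.createElement("div");
  bubble.className = "bot-message";
  bubble.textContent = text;
  chatWindow?.appendChild(bubble);
}

// Render the FAQ category tabs (labels simplified; only a few from the case study shown).
function showMainMenu(): void {
  postBotMessage("Choose a topic: Student life | Program updates | Visa information | ...");
}

widget?.addEventListener("click", () => {
  chatWindow?.classList.remove("hidden"); // expand the chat window
  postBotMessage("Hi! I'm the Comm Lead bot. How can I help?");
  showMainMenu();
});
```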
INFORMATION ARCHITECTURE
Did the bot perform well in phase 1?
In phase 1, user validation and performance evaluation were not conducted. To fill this gap, I conducted card sorting (10 participants) and tree testing (10 participants).
The FAQ tabs designed in phase 1
Participant overview for the card sorting
Similarity matrix
I facilitated an open card sorting exercise using 20 name-labelled cards, as this method is more effective at revealing users' mental models: the groups are named by the participants themselves.
I moderated 3 testing sessions and collected data from 7 unmoderated sessions. During the sorting process, I asked participants to verbalize their thoughts, reasoning, and frustrations.
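The similarity matrix shown above was generated by the card sorting tool; conceptually, it simply counts how many participants placed each pair of cards in the same group. A minimal sketch of that calculation, assuming a simplified data shape:

```typescript
// Each participant's card sort: card label -> the group name they created for it.
type SortResult = Record<string, string>;

// Count, for every pair of cards, how many participants placed them in the same group.
function similarityMatrix(results: SortResult[], cards: string[]): number[][] {
  const n = cards.length;
  const matrix = Array.from({ length: n }, () => new Array<number>(n).fill(0));
  for (const result of results) {
    for (let i = 0; i < n; i++) {
      for (let j = i + 1; j < n; j++) {
        if (result[cards[i]] === result[cards[j]]) {
          matrix[i][j]++;
          matrix[j][i]++;
        }
      }
    }
  }
  return matrix; // matrix[i][j] = number of participants who grouped cards i and j together
}
```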
The insights gained from this exercise:
Listen to the majority choices; set aside the edge cases
Ask follow-up questions
Unfamiliar terms should be avoided
Tree testing
Following the card sorting, I reorganized the main question groups for the chatbot and conducted tree testing to validate my adjustments. I designed separate testing tasks for international and domestic students, considering their differing needs.
Testing tasks for international students
Testing overview of the international student group
Outcome
In the tree testing, the new IA received an average success score of 63% in the international student group and 69% in the domestic student group.
The testing also allowed me to identify categories with a 100% success rate and categories with a 0% success rate, providing insight into the areas that need improvement.
Examples of specific success score data
A new IA in the chatbot that matches users' needs and mental models
The new structure in the chatbot
The combination of international and domestic student experiences has streamlined the process and saved time.
The main menus now include highly requested questions and answers related to student life, program updates, and visa information.
USER TESTING
Predicting failures before launch
Participants
Overall satisfaction rate
Willingness to use again
By introducing the chatbot to potential users and observing their direct responses, I was able to validate my design decisions and gather further insight into users' expectations, habits and suggestions.
Participant 1 09/01/2022
Participant 2 09/02/2022
Participant 3 09/02/2022
Participant 4 09/02/2022
PAIN POINTS AND SOLUTIONS
Pain point 1: Access to talk with real people
Users' preferred method of communication remains talking with real people.
Solution 1: Add contact information to the auto-reply
Triggered by keywords such as "bye" and "thank you", the auto-reply sends the advisor's email and the program's social media links to users.
Pain point 2: The instant replies are distracting
The lack of a delay between sending a question and receiving a reply interferes with the reading experience.
Solution 2: Set a delay time
A short delay before the bot responds mimics a human-like reply.
Pain point 3: A more engaging navigation experience
The navigation felt unengaging and tedious.
Solution 3: Multiple "back" routes
I designed alternative navigation methods in the bot, including presenting related questions to click on and letting users type "hi" or "back" to return to previous pages.
ITERATION
Access to talk with real people
The ending auto-reply is triggered when users end the conversation.
It sends the information needed to contact the program staff.
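The trigger itself was configured in the chatbot platform. As a rough sketch of the logic, with a placeholder email address and a hypothetical sendReply helper:

```typescript
// Keywords that signal the user is ending the conversation.
const closingKeywords = ["bye", "thank you", "thanks"];

function handleUserMessage(message: string): void {
  const normalized = message.trim().toLowerCase();
  if (closingKeywords.some((keyword) => normalized.includes(keyword))) {
    // The email address below is a placeholder, not the program's real contact.
    sendReply(
      "Thanks for chatting! You can reach the Academic Advisor at " +
        "advisor@example.edu, or find the program on social media."
    );
  }
}

// Hypothetical helper that posts a bot message to the chat window.
function sendReply(text: string): void {
  console.log("BOT:", text);
}
```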
Slower replying speed to mimic a human-like conversation
A 0.5-second delay has been implemented before the chatbot responds, creating a more personal interaction between the bot and the user.
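In code terms, the change is just a short pause before the bot's reply is posted. A minimal sketch, with a hypothetical postReply helper:

```typescript
// Post the bot's reply after a short pause so it feels less instantaneous.
const REPLY_DELAY_MS = 500; // the 0.5-second delay described above

async function replyWithDelay(text: string): Promise<void> {
  await new Promise((resolve) => setTimeout(resolve, REPLY_DELAY_MS));
  postReply(text);
}

// Hypothetical helper that appends a bot bubble to the chat window.
function postReply(text: string): void {
  console.log("BOT:", text);
}
```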
4 different ways to navigate back and forth in the chatbot (a rough sketch of the routing logic follows this list):
- Asking the user if they have further questions
- Displaying other related questions
- Allowing the user to type "hi" or "back" to navigate the chatbot on their own
- A "Go back" button
REFLECTION
Teamwork or independent work
In the first three months of the project, I worked with a team, which made planning and distributing the work efficient. However, working independently in phase 2 allowed quicker decision-making and more impactful results.
Gained user research skills
I honed my user research skills in this project, using card sorting and tree testing to validate the information architecture. I developed testing trees, assigned user tasks, conducted participant interviews, analyzed the results, and extracted findings.
Ability to translate UX knowledge to non-UX audience
Clear communication of the research processes and the reasoning behind them is essential when presenting results to clients. It helps them understand the value of the invested time. In phase two, I shared the research process and key findings in a clear and concise manner during weekly one-on-one meetings.