Improving the experience of the ViewPoint simulation platform
Summary
This was a semester-long project we did for the course 'Needs Assessment and Usability Evaluation'. The objective was to help a client comprehensively assess their product in order to improve its experience and accessibility.
ViewPoint is cloud-based software that supports custom-designed role-playing simulations. It allows instructors to create and run engaged learning experiences in both remote and in-person settings.
Here is the link to the platform: ViewPoint
Team
Our team comprised four members: two from computer science, one from sociology, and one from architecture.
Tools
Google Suite
Contribution
Interaction map, Semi-structured interviews, Heuristic evaluation, Comparative evaluation, Usability testing
What is Viewpoint?
ViewPoint is a cloud-based decision-making platform used for academic purposes. It supports structured learning activities in which students adopt the personas of others, mainly public figures, and engage in a designated activity from that perspective. It enables students to share, communicate, and register their decisions on the topic under debate. The primary goal of the product is to give participants a seamless collaboration and communication experience while offering instructors and facilitators a smooth experience for creating and moderating simulations.
Process diagram
Interaction map
We were granted access to the platform under two of its three user roles: students and instructors. Facilitators constitute a subset of the instructor role.
While exploring the platform, we pinpointed the critical user pathways. This enabled us to create a comprehensive end-to-end interaction map and to construct a clear mental model of navigation tailored to each user group listed below; a simplified sketch of the mapping follows the list.
Instructor
Creates simulations
Customizes pre-made simulations
Student
Participates in the simulations
Interacts with fellow participants
Accesses simulation materials
Facilitator
Moderates simulations
Interacts with students during the simulation
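To make the mapping concrete, here is a minimal sketch of how the role-to-pathway structure could be represented in code. All screen and action names are illustrative placeholders, not ViewPoint's actual routes.

```python
# A minimal sketch of the interaction map as a role-to-pathway mapping.
# All screen and action names below are illustrative placeholders, not
# ViewPoint's actual routes.

interaction_map = {
    "instructor": [
        ["dashboard", "create_simulation", "configure_roles", "publish"],
        ["dashboard", "browse_premade", "customize_simulation", "publish"],
    ],
    "student": [
        ["join_simulation", "view_materials"],
        ["join_simulation", "interact_with_participants", "register_decision"],
    ],
    "facilitator": [  # a subset of the instructor role
        ["dashboard", "moderate_simulation", "message_students"],
    ],
}

def pathways_for(role: str) -> list[list[str]]:
    """Return every end-to-end pathway mapped for the given user role."""
    return interaction_map.get(role, [])

for pathway in pathways_for("student"):
    print(" -> ".join(pathway))
```

Walking the map this way made it easy to check that every role could reach its critical tasks end to end.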
Stakeholder Interviews
This was the crux of our methodology for the initial evaluation of ViewPoint's user experience. We conducted user interviews and held an interpretation session afterwards to analyze the findings. This not only allowed us to understand the platform better but also helped us build a connection with our stakeholders.
Survey
After gathering qualitative data on ViewPoint's users through semi-structured interviews, we needed quantitative data to further understand the characteristics, attitudes, and behaviors of the target users. Before deploying the survey at scale, we conducted a pilot study to uncover errors, biases, and ambiguities in the questionnaire.
Here are examples of some of the changes we made based on the pilot study. We refined the survey to achieve a neutral tone by eliminating inherent biases, and we made it clearer and more specific so that each question was straightforward and precise.
Before
How easy is it to communicate with your peers? (Likert scale: Strongly disagree … strongly agree)
How would a chat option affect your experience in the simulation?
Were there any action items that needed to be completed before the simulation began?
After
It was ______ to communicate with my peers within the platform (Likert scale: Extremely hard … extremely easy)
How would a chat option (similar to Slack and Messenger) affect your experience in an in-person simulation with 20 or fewer participants?
Were there any action items on ViewPoint that needed to be completed before the simulation began?
Comparative Analysis
For this study, we evaluated eleven competing products drawn from direct, indirect, partial, parallel, and analogous competitors. Each product was compared against a list of criteria; a sketch of the resulting comparison matrix follows the criteria list below.
Criteria for comparison to competitors:
Whether they relate to teaching
Whether they relate to policy/decision-making
Communication
Private - DMs
Groups
Broadcasts
Accessibility
Different text sizes
Voice over
Collaboration
Voting app
Simulation tool
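The comparison reduced naturally to a feature matrix. Here is a minimal sketch of that structure; the product names and feature values are hypothetical placeholders, while the real study covered eleven products across the five competitor categories above.

```python
# A minimal sketch of the comparison matrix behind the competitive analysis.
# Product names and feature values are hypothetical placeholders.

CRITERIA = [
    "teaching", "policy_decision_making",
    "private_dms", "group_chat", "broadcasts",
    "text_sizes", "voice_over",
    "collaboration", "voting_app", "simulation_tool",
]

competitors = {
    "Product A": {"category": "direct", "teaching": True, "group_chat": True},
    "Product B": {"category": "analogous", "voting_app": True},
}

def coverage(product: str) -> float:
    """Fraction of criteria a product satisfies; missing keys count as False."""
    features = competitors[product]
    return sum(bool(features.get(c)) for c in CRITERIA) / len(CRITERIA)

for name, features in competitors.items():
    print(f"{name} ({features['category']}): {coverage(name):.0%} of criteria met")
```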
Heuristic evaluation
Through heuristic evaluation, we synthesized legitimate, well-reasoned issues and recommendations. Each usability issue was evaluated within context. For our system, these contexts fell under four themes, determined from the stakeholder's original research requirements (a sketch of how issues could be logged against these themes follows the list):
Creating a simulation: Any usability problem that has to do with creating a simulation. This was targeted at instructors as the audience and was one of the major areas of review requested by our client. This mainly included creation, duplication, and deletion of simulations.
Student communication: Usability problems that affected participants’ experience of the simulation. This accounted for features such as the newsfeed, role-playing, and messages.
Facilitator view: Usability problems affecting the facilitator's view. There was a lot of overlap between this category and the above two as the facilitator’s view is a combination/subset of instructors and participants’ views.
Miscellaneous: Any usability problem that did not fit the above three categories.
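Here is a minimal sketch of how findings could be logged and grouped by these themes. The example issues and the 1-4 severity scale are assumptions for illustration, not our actual data.

```python
# A minimal sketch of an issue log grouped by theme.
# Example issues and the severity scale are illustrative assumptions.
from collections import defaultdict
from dataclasses import dataclass

THEMES = ("creating a simulation", "student communication",
          "facilitator view", "miscellaneous")

@dataclass
class Issue:
    description: str
    theme: str     # one of THEMES
    severity: int  # assumed scale: 1 (cosmetic) to 4 (blocker)

issues = [
    Issue("No confirmation before deleting a simulation", "creating a simulation", 4),
    Issue("Newsfeed updates are easy to miss", "student communication", 2),
]

by_theme = defaultdict(list)
for issue in issues:
    by_theme[issue.theme].append(issue)

# Report the most severe problems first within each theme.
for theme in THEMES:
    for issue in sorted(by_theme[theme], key=lambda i: -i.severity):
        print(f"[{theme}] severity {issue.severity}: {issue.description}")
```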
Usability evaluation
Our analysis concluded with a usability evaluation, in which we created tasks for users to complete. The findings aligned with those from the interviews and the heuristic evaluation.
Limitations
Since ViewPoint targets a specific user base, namely instructors and students who participate in role-playing exercises, we were unable to find a large sample of instructors/facilitators, which would have made our research methods sounder.
To get more accurate results with the existing user group, we recommend that our client do the following:
Deploy the survey
Conduct usability testing
Repeat the usability test after modifying the website based on the current recommendations
Takeaway
Every user is different, and each will interpret your design differently.
Breaking the work down into achievable deadlines and continuing to iterate on the designs as they grow is the best approach.