Improving the experience of the ViewPoint simulation platform

Summary


This was a semester-long project for the course 'Needs Assessment and Usability Evaluation'. The objective was to provide a client with a comprehensive assessment of their product in order to improve its experience and accessibility.


ViewPoint is cloud-based software that supports custom-designed role-playing simulations. It allows instructors to create and run engaged learning experiences in both remote and in-person settings.


Here is the link to the platform: ViewPoint

Team 

Our team comprises four members: two from computer science, one from sociology, and one from architecture.


Tools

Google Suite


Contribution

Interaction mapping, structured interviewing, heuristic evaluation, comparative evaluation, usability testing

What is Viewpoint?

ViewPoint is a cloud-based decision-making platform used for academic purposes. It supports structured learning activities in which students adopt the personas of others, mainly public figures, and engage in a designated activity from that perspective. It enables students to share, communicate, and register their decisions on the topic under debate. The primary goal of the product is to give participants a seamless collaboration and communication experience while offering instructors and facilitators a smooth experience for creating and moderating simulations.


Process diagram

Interaction map

We were granted access to the platform for two of its three user roles: students and instructors. Facilitators constitute a subset of the instructor role.

While exploring the platform, we pinpointed the critical user pathways. This enabled us to create a comprehensive end-to-end interaction map and to construct a clear mental model of navigation for each user group.

Interaction maps for the instructor, student, and facilitator roles

Stakeholder Interviews


This was the crux of our initial evaluation of ViewPoint's user experience. We conducted user interviews and afterwards held an interpretation session to analyze the findings. This not only allowed us to understand the platform better but also helped us build a connection with our stakeholders.




Survey


After gathering qualitative data on ViewPoint's users through semi-structured interviews, we needed quantitative data to further understand the characteristics, attitudes, and behaviors of the target users. Before deploying the survey at scale, we conducted a pilot study to uncover errors, biases, and ambiguities in the questionnaire.


Here are examples of some of the changes we made based on the pilot study. We refined the survey to achieve a neutral tone by eliminating inherent biases, and we made it clearer and more specific to ensure it was straightforward and precise.

Before 



After

Comparative Analysis


For this study, we evaluated eleven competing products spanning direct, indirect, partial, parallel, and analogous competitors. Each product was compared against a common list of criteria.


Criteria for comparison with competitors:



Heuristic evaluation


Through heuristic evaluation, we synthesized well-grounded issues and recommendations. Each part of the system was evaluated in context. For our system, these contexts fell under four themes, determined from the stakeholder's original research requirements:





Usability evaluation


Our analysis ended with a usability evaluation, for which we created tasks for users to complete. The findings aligned with those from the interviews and the heuristic evaluation.

Limitations


Takeaway