Usability Test
The preliminary test, run on a medium-fidelity prototype, focused on assessing the application's basic usability and the functionality of the guided calendar. It confirmed that the user journey and core concepts were clear, but underscored the need to improve the product's distinctive features, in both journey and visual terms.
The final test aimed to scrutinize the visual design and evaluate the changes made to the guided calendar. Using the Maze platform, we distributed the test to the target audience: professionals in mid- to senior-level positions in the private sector. Around 35 responses were gathered over a 5-day period. However, problems with the screen flow and technical issues on the platform forced us to publish a second link, which introduced some confusion among users. These insights point to opportunities for refinement in both the design and the functionality of the learning platform.
Test Instructions:
SAPIENS is a gamified corporate training platform. The app hosts mandatory courses and suggests others based on the student's profile; it also offers a study calendar, a record of the student's skills and progression, and a networking space.
During a course, students can navigate through the platform and the learning path to monitor their progress.
Task 1
Complete the overdue task.
The first task asked users to complete a quiz from an ongoing course by navigating through the home screen and the course trail. A total of 37 users took this part of the test, and the preliminary results identified areas for improvement, particularly in streamlining navigation and the overall user experience.
Direct success: 37.8%
Mission unfinished: 43.2%
Misclick rate: 55.4%
Avg duration: 206.3s
Other relevant data for this task:
Over 40% of users deviated from the expected path;
The average time on each screen was 17.3 seconds;
Although there was a decrease in the misclick rate, some users still clicked on non-clickable areas, suggesting the need for layout adjustments.
The heat-map for this task revealed more focused interactions, with users exploring less and following a more defined path. These findings emphasize the importance of refining the user journey to enhance efficiency and minimize confusion.
Due to time constraints and technical issues, only 24 users took the second test. The direct success rate fell and the abandonment rate remained high; on the positive side, both the misclick rate and the average duration improved.
Direct success: 20.8%
Mission unfinished: 50%
Misclick rate: 36.7%
Avg duration: 121.8s
We also noticed that:
The average time on each screen was 8.4 seconds, a metric Maze rates as excellent.
Task 2
Create a study plan for the new course, with study days on Monday and Wednesday.
Unfortunately, due to technical issues, the platform failed to collect data from the 30 users who participated in the first round of the second test.
In the second round, 21 users participated. Users continued to struggle with the guided calendar, as the low task success rate of roughly 40% shows, which emphasizes the need to improve how this functionality is presented. Key findings from this round include:
Direct success: 38.1%
Mission unfinished: 33.3%
Misclick rate: 34.8%
Avg duration: 56.5s
On the other hand, this task earned the highest usability score of the entire test, a positive sign for the overall design.
Other relevant data for this task:
The average time on each screen was 5.5 seconds, a metric Maze rates as optimal.
The average misclick rate across screens was 11.2%, which is good but leaves room for improvement; it also flags the behavior of the buttons for future review.
The calendar screens are clear and easy to understand, as the accompanying image demonstrates.
Key Takeaways & Feedback
During the tests, we identified some key learnings. First, the home screen and pop-ups need adjustments, as some elements were not intuitive for users. Second, documenting the main errors and analyzing the heatmap data proved essential for understanding how well buttons and user journeys performed. Finally, the tests made clear how important it is to onboard users with precise instructions on how the application works; at the same time, we noted that using the onboarding to explain the solution may itself have skewed the results.
We received varied feedback from users during the tests. Many mentioned that the first version of the app's home screen contained too much information, which could be overwhelming. Some users also found the buttons in the pop-ups unintuitive, especially the close (x) button, which was not very visible. On the other hand, the application's design was widely praised as well-crafted and motivating. This feedback is valuable for improving the user experience and making the design more intuitive and enjoyable.
Check out the full reports below: