Usability Tests

The final usability test consisted of two tasks: "Complete the overdue task" and "Create a study plan for the new course, where the study days are Monday and Wednesday."

The first task required the user to find the ongoing course and complete the overdue exercise. The second task introduced the guided calendar, which the user would use to choose a study schedule for the new course they were about to start.

These tasks were also part of an earlier test run with the medium-fidelity version, whose goal was to evaluate the basic usability of the application and the functionality of the guided calendar. That initial test showed that the journey and the basic concept of the learning platform were clear to users. However, the product's distinctive features needed improvement in both the user journey and the visual design.

The final test had two main objectives: to analyze the product's visual design and to check whether the guided calendar's success rate had increased. We built the test around the two tasks in the final prototype, set it up on the Maze platform, and distributed it to the target audience: private-sector professionals in mid- to senior-level positions.

We shared the test link for 5 days and received around 35 responses. However, due to issues with the screen flow, instability in the platform, and users being unable to complete the second task, we shared a second link. Although the test was the same, this affected the results and caused some confusion among users.

Task 1 - Complete the overdue task

First test

The first task required the user to complete a quiz from one of the ongoing courses. To finish this part of the test, the user had to navigate through the home screen and the course trail. The initial results showed areas for improvement, especially in the user journey, as seen below:

Other relevant data for this task:

  • About 25% of users did not follow the expected path;

  • The average task duration increased by 7 seconds compared to the previous test, indicating a slightly more complex process;

  • The misclick rate decreased but was still present. Some users clicked on non-clickable areas, which suggests the layout needs adjustments.

The heatmap for this task showed a more focused interaction, with users exploring less and following a more defined path, as seen in the figures below:

Second test

Due to time constraints and technical issues, fewer users participated in the second test. We also observed a drop in the direct task completion rate and a high abandonment rate. On the positive side, the misclick rate and the average duration improved.

We also noticed that:

  • Only 20% of users followed the expected path.

  • The average time on each screen was 8 seconds, which Maze considered excellent.

  • In addition to the data Maze provided, we also received feedback about the app's menu screen. While it is aesthetically pleasing and user-friendly, it is packed with information, which makes it hard to find the key content. This is evident in the heatmap of the home screen below:

Task 2 - Create a study plan for the new course, with study days on Monday and Wednesday

First test

Due to technical issues, the platform did not provide data for the 30 users in this round.

Second test

The second round of testing again highlighted users' difficulties with the guided calendar. The main indicator was the direct task success rate of 40%, which points to the need to improve how this functionality is presented. The most relevant data is shown below:

On the other hand, the usability score was the highest of all the tasks we performed, which was a positive point for the overall test:

Other relevant data for the task:

  • The average time on each screen was 5.5 seconds, which Maze considers a successful result.

  • The overall misclick rate was 11.2%, which is good but can still be improved; this also flags the buttons for future review.

  • The calendar screens are clear and easy to understand, as the image demonstrates:

Key Takeaways

During the tests, we identified some key learnings. First, we noticed that the home screen and pop-ups need adjustments, as some elements were not intuitive for users. Second, documenting the main errors found and analyzing the heatmap data proved essential for understanding how well buttons and user journeys performed. Finally, the importance of clear onboarding, with precise instructions on how the application works, became evident. However, we noted that the onboarding used to explain the solutions may have negatively affected the results.

Received Feedback

We received a range of feedback from users during the tests. Many mentioned that the first version of the app's home screen contained too much information, which could be overwhelming. Some users also found the buttons in the pop-ups unintuitive, especially the close ("x") button, which was not very visible. On the other hand, the application's design was widely praised as well-crafted and motivating to use. This feedback is valuable for improving the user experience and ensuring the design is more intuitive and enjoyable.

Full reports:

Test 1

Test 2