Autodesk Screencast is a tool that allows users to record, edit, and watch interactive instructional videos. During a summer internship, I worked with two other interns to redesign Screencast and make it more accessible to people who are deaf or hard of hearing. Our design also made Screencast easier to moderate, improved its SEO, and made it a better fit for educational institutions.
Screencast is a valuable learning resource for Autodesk customers: it allows experts to record their screens and share their knowledge with others. However, Screencast is not accessible to customers who are deaf or hard of hearing. This prevents the tool from reaching those customers and makes it unsuitable for use in educational institutions. The tool needed to be redesigned to be more accessible and a better fit for educational settings.
To get familiar with the existing Screencast product, we started by conducting a heuristic evaluation and reviewing the research that the team had already documented. We also went through the usage data available through Google Analytics and the Autodesk Community Forums. This gave us valuable insights that hinted at potential usability challenges and informed our research going forward.
We created a stakeholder map to get insights into the following questions:
We needed to understand why Screencast was designed the way it was, which insights drove major decisions, and what technical limitations were in play. During this activity, we mapped out all of the project stakeholders, including their job titles and how they related to each other in the project. Then we conducted 1:1 interviews with every stakeholder on our list to get their perspectives on the product and how it was built.
Next, we conducted a competitive analysis along three dimensions: recorders, transcription services, and players with transcripts. This gave us a better understanding of some of the problems we might face, as well as how other solutions have tackled them. Accessibility and ease of use were among the top traits we looked for during our analysis. This activity exposed us to a diverse set of takes on how screencasting software should look and behave. Many features were repeated across the various products, but the interaction design decisions were quite varied.
To better understand our target population and user needs, we conducted interviews with Screencast users. The interviewees varied in their level of experience with the app and came from a range of geographic locations. We spoke with people who were deaf or hard of hearing as well as users who were not. This diverse set of interviewees helped us understand the different needs of our user base and guided our design.
To get value out of the data we collected, we synthesized it using an affinity map, following the LUMA Institute's Rose, Thorn, Bud exercise. This allowed us to better visualize the data, draw conclusions, and get a sense of direction for our next steps.
After getting a better understanding of our product, our customers, and possible transcription services, we brainstormed potential solutions. We worked together to construct user flows and high-level views for the various components of our solution.
We then split up and sketched out our ideas, which we used to drive further discussion and insights. We critiqued each other's work and suggested refinements. During this phase, I focused on the desktop video recorder and the video editing functionality.
We then created high-level prototypes using Framer. We picked that tool because of its ability to produce highly interactive prototypes, which we believed was necessary for effectively testing a video player concept: we wanted to truly test how someone learns from and interacts with a video. Below is a video of our prototype.
We then tested our prototypes with six customers, two of whom were deaf or hard of hearing. These tests gave us insight into what worked and what could be improved in our design, and led us to continue refining our prototype. The test sessions included the following steps:
Based on the feedback we received, we made changes to our prototypes and conducted a second round of usability testing to verify that the changes worked well.
In addition to the interactive prototypes, we produced specification documents as a final deliverable. Selected visuals from the documents are shown below. The second round of testing yielded good results and allowed us to put finishing touches on our design.
We improved the design of the video recorder and editor by simplifying the interactions and reducing unnecessary elements.
We redesigned Screencast to make it better and more useful to our customers. Here are the improvements over the old design: