I got to be a part of a super exciting project with the Serre Lab in Spring 2016. As the UX and front-end engineer for the BABAS video annotation tool, I got to wear multiple hats while staying as close to the user as possible. My job was essentially to understand the user, create designs for possible solutions, and implement them. In doing so, I got to experience a huge chunk of the product design and development process -- from user research all the way through implementation!
A large part of my job was identifying and prioritizing user pain points in the video annotation tool. The tool itself lets people annotate behavioral information, typically of animals, in order to provide training data for an automated computer vision analysis system. I've highlighted a few of the largest sub-projects from this research experience below!
Many users expressed a desire for keyboard shortcuts in the annotation tool. The irony was that a few of these did, in fact, already exist -- but there was no interface to surface them. Additionally, each video needed custom shortcuts for certain behaviors and actions (e.g. a bird video needs a "FLY" annotation, while a mouse video does not). I not only designed the interface for this feature, but also implemented it, learning a ton about Node.js, APIs, and back-end development along the way. I even took the next step and made these shortcuts editable, meaning that a user could customize which key they would like to assign to each action.
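The editable-shortcut idea can be sketched as a small key-to-action map with a rebind operation. This is a hypothetical illustration -- the class name, the action labels, and the rebinding behavior are my assumptions, not the actual BABAS code:

```javascript
// Hypothetical sketch of per-video, user-editable shortcuts.
// Defaults like { f: "FLY" } are illustrative only.
class ShortcutMap {
  constructor(defaults) {
    this.bindings = { ...defaults };
  }

  // Look up the action for a pressed key, or null if unbound.
  actionFor(key) {
    return this.bindings[key.toLowerCase()] ?? null;
  }

  // Reassign an action to a new key, removing its old binding first.
  rebind(action, newKey) {
    for (const key of Object.keys(this.bindings)) {
      if (this.bindings[key] === action) delete this.bindings[key];
    }
    this.bindings[newKey.toLowerCase()] = action;
  }
}

const shortcuts = new ShortcutMap({ f: "FLY", w: "WALK" });
shortcuts.rebind("FLY", "j");          // user prefers "j" for FLY
console.log(shortcuts.actionFor("j")); // "FLY"
console.log(shortcuts.actionFor("f")); // null -- old binding was removed
```

In a browser, a `keydown` listener would call `actionFor(event.key)` and trigger the matching annotation action; keeping one action per key avoids ambiguous bindings.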
The UI for this tool was made several years ago by someone with no previous design experience, so some elements were ambiguous and confusing. When I interviewed three annotators, all of them expressed confusion over these tools in the playback system:
I made a quick fix that has since helped many people skip multiple frames at a time and set the playback speed with a custom input number -- not just a drop-down menu of 0.5x, 1x, or 2x. This helped annotators who wanted to watch videos very quickly in order to review their annotations, saving them hours of time!
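A custom speed input needs light validation so a typo doesn't freeze or break playback. Here is a minimal sketch of that idea -- the function name, the clamping range, and the fallback value are my assumptions, not the tool's actual logic:

```javascript
// Hypothetical validator for a free-form playback-speed text input.
// The 0.25-16x range and the 1x fallback are assumptions for illustration.
function parsePlaybackSpeed(input, min = 0.25, max = 16) {
  const speed = Number.parseFloat(input);
  if (Number.isNaN(speed) || speed <= 0) return 1; // fall back to normal speed
  return Math.min(max, Math.max(min, speed));      // clamp to a safe range
}

console.log(parsePlaybackSpeed("8"));   // 8 -- far beyond the old 2x cap
console.log(parsePlaybackSpeed("0.1")); // 0.25 -- clamped to the minimum
console.log(parsePlaybackSpeed("abc")); // 1 -- invalid input falls back
```

In the browser, the result would feed something like `videoElement.playbackRate = parsePlaybackSpeed(inputField.value)`.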
The BABAS annotation tool is a project still in its early stages. One of my projects for this semester was to ideate future UI and propose designs that, while not achievable at the moment, could be what the tool looks like in the next five years.
I focused on the main page of the web app: the annotation tool page. I reimagined the timeline, where most of the users' actions happen. I interviewed current users about their pain points with the app, then prototyped sketches and listed ideas from there.
To better understand users' mental models of video-editing interfaces, I got more familiar with popular apps like Adobe Premiere and Movie Maker. A pattern I noticed was timelines that represent moments of time, or clips, visually, like so:
Some of my resulting sketches and ideas are as follows.
I also sought solutions for inefficiencies in user actions. The keyboard shortcuts I had implemented earlier partially remedied this, but there was more to be done. How could the visual arrangements be manipulated with as few clicks and input movements as possible? What were the biggest bottlenecks in annotation?
To address these questions, I made the following prototypes and even implemented one of them: draggable actions, draggable time adjustments between bars, and "piano" mode.
TO BE CONTINUED