Samsung AutoCache 3.0
Directing UX & UI,
Management of UI Dev Team
- Creation of web application
- Creation of VMware plugin
AutoCache is a product that increases the performance and lifespan of solid state drives through the way it manages hot and cold caching.
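The hot/cold idea can be sketched in a few lines. This is an illustrative example only: AutoCache's actual classification algorithm is proprietary, and the function name, data shape, and threshold below are hypothetical.

```javascript
// Hypothetical sketch of hot/cold classification: blocks accessed at or
// above `hotThreshold` times within a sampling window are promoted to the
// SSD cache ("hot"); the rest stay on the backing store ("cold"), which
// avoids wasting SSD writes on rarely used data.
function classifyBlocks(accessCounts, hotThreshold) {
  const hot = [];
  const cold = [];
  for (const [blockId, count] of Object.entries(accessCounts)) {
    (count >= hotThreshold ? hot : cold).push(blockId);
  }
  return { hot, cold };
}

// Example: block "a" was read 12 times, "b" twice, "c" 7 times.
const result = classifyBlocks({ a: 12, b: 2, c: 7 }, 5);
// result.hot → ["a", "c"], result.cold → ["b"]
```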
Deliverables I Produced:
- Product Journey Map
- Card Sorting Exercises for feature sets and grouping of data
- Process Flowcharts
- Content Inventories & Inventory Mind Map
- Content Models
- Mental Models
- User Needs & Requirements
- Interactive Prototypes
- Product Specifications
- Product Roadmap
- Scenario Maps
- Internal team surveys for knowledge and use of product
- System Architecture Maps
- UI Style Guide & UX Pattern Library for the UI development team
- Axure Interactive Prototypes for stakeholders
CHALLENGES
- Figuring out how to create an interface for a previously hardware-only product.
- A tight timeline for deliverables.
- Selecting a technology stack that was scalable and future-proof.
- Being understaffed and needing to build the UI Team.
- Evangelizing the value of UX to teams and executives.
- Performing User Experience Research and Human Factors Testing when:
- The product had not been beta tested with clients, so there was no feedback on the product's experience.
- There was no comparable product on the market, so performing a Competitor Analysis was a unique challenge.
- There was no field data on potential user types, so I had to investigate potential user types to create Personas.
- Understanding which data is valuable to which potential user type.
- Identifying, grouping, and sorting data, and determining how to effectively display it in the UI.
- Understanding how to prioritize and display valuable data for different user types and experience levels.
- Gaining stakeholder alignment and open conversation between different teams and technical leads.
- How to make the UI engaging and valuable to end users.
- Designing the plugin within the VMware framework: understanding and integrating with VMware's UI specifications while subtly improving upon them.
OVERCOMING THE CHALLENGES
To tackle the obstacles, I started multiple streams of work alongside my efforts to introduce design thinking and cross-functional team building.
Initially, there was no dedicated UI or UX department, and testing was performed by the Quality Team. I spent time with the teams to show them the benefits of user-centered design thinking and human factors testing. I expressed the value of good user experience by developing user process flows with use cases and visual journey experience maps so that our teams could develop empathy for a wide range of user types. After these sessions I captured the edge cases and experience pain points and shared them with the teams so we could improve these troubled areas. I set up internal focus groups and studies, and was ultimately able to find great talent and build a tight-knit UI team. A great win for our team was the approval to release a beta of the product to a select focus group of companies to monitor their experiences with and feedback on the product. The data gathered from the beta release was fed into the next iteration of the product and helped drive significant improvements.
Removing team silos
To make this a successful product, we had to harness the knowledge and input of every team close to the product. Many in the engineering and science professions tend to work in silos and remain within their respective groups. I designated end-to-end accountable leaders for each group and created a core team, inviting them to bring their ideas, opinions, & research and share them with the group. Many found solutions from other teams, some made new friends, and some held strong opinions. This was good: the teams were talking to each other.
Priorities and alignment
The product requirements from the executives were very high level, which formed the skeleton of the product roadmap. To flesh it out, we had to understand the limits and potential of the product's features from the different research teams. We worked together to understand the use cases and potential personas for the user types, which helped drive the focus of the product features and dispel opinion-based judgements. We ran workshops to gather, share, and groom data, using techniques such as card sorting, content modeling, and mental modeling.
Design, Usability Testing, & Research
Given the data, I was now able to form a clear picture of what to design and which technology stack could accommodate it. I decided to develop the backend with Java and C# and to use Angular.js, CSS, & HTML5 for the frontend, with Chart.js & D3.js for displaying dynamic interactive data. I designed the system to be modular, with widgets of useful content and data displayed in a clear and comprehensible way. After numerous sketches and paper prototypes, I drew up the final wireframes, from which I formed a UX pattern library with defined templates for the various screens and guidance on where to apply them. I prepared a UI style guide that included micro-interactions with animations for displaying interactive data and feedback. I then designed comprehensive prototypes for web and mobile in Axure using the UX patterns and UI style guide I had prepared.

I initiated testing with internal focus groups based on the personas to evaluate the prototypes against usability heuristics, such as Jakob Nielsen's top 10, as well as ease of comprehension of terms, ease of navigation, color accessibility, font sizing, and other factors. After much effort, I was able to initiate partnerships with select companies to perform product testing and provide feedback. This was extremely valuable and helped bring definition and validation to the hypothesized personas, resulting in definitive user group types. Based on these user types, I designed the UI to accommodate each group's specific set of needs.
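The modular-widget idea can be illustrated with a small sketch: each widget maps a raw metric series to a Chart.js-style configuration that the frontend renders. The widget name, metric fields, and series are hypothetical examples, not the shipped API.

```javascript
// Hypothetical widget: turn raw cache samples into a Chart.js line-chart
// config. Each sample carries a timestamp plus hit/miss counts; the widget
// derives a hit-rate percentage per point.
function cacheHitRateWidget(samples) {
  return {
    type: 'line',
    data: {
      labels: samples.map(s => s.time),
      datasets: [{
        label: 'Cache hit rate (%)',
        data: samples.map(s => Math.round(100 * s.hits / (s.hits + s.misses))),
      }],
    },
  };
}

// Example with two made-up sampling points:
const widget = cacheHitRateWidget([
  { time: '09:00', hits: 80, misses: 20 },
  { time: '10:00', hits: 95, misses: 5 },
]);
// widget.data.datasets[0].data → [80, 95]
```

Keeping widgets as pure data-to-config functions like this made them easy to test headlessly and to drop into any page template from the pattern library.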
After several iterations of improvements to the prototypes, UI style guide, and UX pattern library, the design was hardened and ready for production. I defined what I call versions A, B, & C of the product. Version A would have the minimum viable feature set to ship the product. Version B would have the essential features plus additional features. Version C would be fully featured, with all the bells and whistles, so to speak. I provided these outlines to all the production, testing, and management teams and mapped them to a product timeline with milestones. After it was approved, I began hiring our UI development team. Everyone had access to the full prototypes and understood how the product should look and function. We worked like an assembly line: I assigned the UX pattern library and UI style guide to half of the team, who focused on developing the assets and templates, while the other half used the workflows and architecture maps to develop the infrastructure and pages with the backend developers. This was all implemented in an agile process, and I requested the setup of Team Foundation Server for the team to use. We performed weekly sprints with Tuesday and Thursday stand-ups and tracked development progress and issues in Team Foundation Server.
Validation Testing & Key Performance Indicators
We ran several tests to verify that what was coded displayed and functioned correctly across different browsers and mobile devices. Afterwards we tested common mobile KPIs such as session length, app launch time, page and action load times, usability, and comprehension. I conducted a number of studies with focus groups internally and externally, and also used our partnership program with the select companies to test the product. After testing was complete, we prepared software documentation and a troubleshooting guide with known issues for both products. We mapped out future update releases and created an area for users to reach customer service and leave feedback. The code was then locked for this release and ready for launch.
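A load-time KPI roll-up of the kind described above can be sketched as follows. This is an illustrative example, not the production measurement pipeline; the function name and the nearest-rank percentile choice are assumptions.

```javascript
// Hypothetical KPI roll-up: summarize page/action load-time samples (ms)
// into a mean and a 95th percentile (nearest-rank method).
function loadTimeKpis(samplesMs) {
  const sorted = [...samplesMs].sort((a, b) => a - b);
  const mean = sorted.reduce((sum, t) => sum + t, 0) / sorted.length;
  // Nearest-rank p95: the value at rank ceil(0.95 * n).
  const p95 = sorted[Math.ceil(0.95 * sorted.length) - 1];
  return { mean, p95 };
}

// Example: four fast loads and one slow outlier.
const kpis = loadTimeKpis([120, 130, 110, 900, 140]);
// kpis.mean → 280, kpis.p95 → 900 (the outlier dominates the tail)
```

Tracking a tail percentile alongside the mean is what surfaces the occasional slow page load that an average alone would hide.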
Launch, Gathering Feedback, & Reception
I prepared a feedback site tied to the applications; users would be asked to provide feedback on the product after first use. We set up the product with both a 30-day trial and a full version, and I worked with our teams to set up a database for beta keys. The product launched with a waitlist of enterprise customers eager to try it. I planned a special early release for the participating companies to provide extensive feedback on the product, to further help us understand our user types and their needs, as well as to make improvements and capture feature requests. We successfully launched Version B of the product, and it was well received. With the hardware-only release the previous year, many users had not understood the value the product brought because they had no way to measure its success, nor any UI to view the effect or improvements the hardware was having on the system. Many of these same customers were provided a discount for the web client and VMware plugin, which supplied the UI and data to measure the hardware's performance. These customers were quite pleased, and many of them used the UI to generate reports documenting the return on investment to their management, a key point that had previously hurt our product when it lacked a formal UI. We were slated for a new release every year, and based on the technology stack I selected, the platform was scalable for a minimum of eight years.