Uniquely positioned by my previous experience with the UX, when Product and Design tasked me with reskinning the Task Listing, I took the opportunity to upgrade it as well.
No one besides me really knew how RequireJS worked in the application, and it was falling out of favor in the wider community as a module-loading solution, so it was time to migrate to Webpack.
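For the port itself, here is a minimal sketch of the kind of Webpack 1.x configuration such a migration uses, assuming coffee-loader handles the CoffeeScript compilation; the entry point, paths, and filenames are illustrative, not the app's actual ones:

```javascript
// webpack.config.js (a sketch only; paths and names here are hypothetical)
module.exports = {
  // Hypothetical entry point for the Task Listing bundle
  entry: './app/frontend/task_listing.coffee',
  output: {
    path: __dirname + '/public/assets',
    filename: 'task_listing.bundle.js'
  },
  module: {
    loaders: [
      // Compile CoffeeScript modules as they are required
      { test: /\.coffee$/, loader: 'coffee-loader' }
    ]
  },
  resolve: {
    // Allow require'ing .coffee files without spelling out the extension
    extensions: ['', '.js', '.coffee']
  }
};
```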
Here’s what the Task Listing looked like before
Here’s the mock from Design
In chronological order, here’s what I did
Upgraded DataTables from 1.9 to 1.10 (see the sketch after this list)
Refactored the JavaScript toward a more object-oriented paradigm
Applied the new skin
Ported the JavaScript written for RequireJS to CoffeeScript modules for Webpack
Deployed
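On the DataTables upgrade: 1.10 replaced 1.9's Hungarian-notation options with camelCase names and introduced a new API object, while remaining backwards-compatible with the old options. A before/after sketch, with an illustrative table selector and options:

```javascript
// DataTables 1.9 style: Hungarian-notation options, returns a jQuery object
$('#tasks').dataTable({
  "bPaginate": true,
  "aaSorting": [[1, "desc"]]
});

// DataTables 1.10 style: camelCase options, returns the new API object
var table = $('#tasks').DataTable({
  paging: true,
  order: [[1, "desc"]]
});
```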
…and here’s what I delivered
with modal
Results
Successfully advocated for adoption of Webpack to replace RequireJS and then ported the most-trafficked page, while achieving a near-pixel-perfect reskin.
Contributors, as they are called, are the 5M+ people around the world who do work on CrowdFlower’s platform. The application that enables them to do that work is one of the company’s most heavily trafficked as well as most complicated, blending a Rails backend with MooTools, jQuery, and RequireJS on the frontend.
The application’s UX
…had largely stayed the same for the last five years. In Q1/2014, we decided to enhance it by making it more interactive, both to engage our users more and to convey just how much work there is in our system.
Working with the Product Manager and an external Designer, we came up with the following high-resolution mock
Because the application is so heavily used, we knew we couldn’t merely throw the switch on a new design overnight, both from a community-management standpoint and an application-performance one. Instead, we chose a strategy that was a first at the company: using A/B testing to determine a design that would perform as well as, if not better than, the original.
Our key metric in that regard had to do with contributors’ performance after being exposed to the new UX, particularly the messaging around our forthcoming gamification and introduction of Levels. In the beginning, we did not have the infrastructure to determine the value of that metric, so we simply settled on ‘clicks’ as a (conversion) proxy to understand whether the new design was having an impact.
Infrastructure
Without an A/B testing framework in place, I needed to choose one. As the requirements were not yet concrete, I did some due diligence vetting several options, coming up with a review of A/B testing frameworks for Rails.
It became obvious that Vanity was best suited to our needs. (Since it didn’t yet have the ability to throttle the percentage of traffic receiving experiments, I augmented it with Flipper.)
Once that was in place, we could begin iterating on the design, knowing with confidence how we were impacting the user experience.
Server-side
We knew we wanted the experience to be snappy, but completely replacing it with a Rich Internet Application was far out of scope for the first month, particularly as there were infrastructure changes to be made to retrofit the stack with A/B testing. We decided to make progress iteratively over several sprints.
In our first test, we pitted the control (original) against a bare-bones implementation of the high-resolution mock as the new design.
original
The new version outperformed control (in terms of clicks) 21.3% vs 20.3% (at 95% confidence), so I continued to iterate on the implementation, coming up with the following
Calculating the overall satisfaction of other contributors with a task (denoted by the stars) proved too inefficient in this iteration; that variant wound up losing.
Client-side
On the assumption that a snappier experience would drive engagement, it was obvious that we needed more (and faster) interaction, and therefore an interactive client-side implementation.
As this was essentially a parallel product, leveraging only some of the infrastructure the server-side rendition was utilizing, I began to flesh out the following
Further refinement (and actual data) was necessary to get it looking more like the high-res mock (and like its server-side-rendered peer)
At this point, we implemented and integrated with our own homemade badging solution, beginning to display badges in the following iteration
I continued to iterate on the implementation, coming up with the following
Testing the impact of particular messaging was also of interest, so we added a Guiders variation as well. At this time we also leveraged Google Analytics Events on the Guider buttons to track how far the user got in our messaging.
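As a sketch of that instrumentation, assuming the guiders.js library and Google Analytics’ analytics.js; the guider ID, copy, and event names are illustrative:

```javascript
guiders.createGuider({
  id: 'levels-intro',            // hypothetical guider ID
  title: 'Introducing Levels',
  description: 'Do good work, earn badges, and level up.',
  buttons: [{
    name: 'Next',
    onclick: function () {
      // Record how far the user progressed through the messaging
      ga('send', 'event', 'Guiders', 'click', 'levels-intro');
      guiders.next();
    }
  }]
}).show();
```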
Letting the experiments run a few days with sufficient traffic, we found that the client-side-rendered version performed no worse than the server-side-rendered version (23.9% vs 22.9%) and that having guiders also did not perform significantly worse (23.1% vs 23.7%), so we decided to keep both.
By that time, the new version was outperforming control (the original design) 22.2% vs 20.7% (at 99% confidence), so we decided to roll out the new experience to 100% of contributors, doing some polishing (copy/styling) work before finally settling on the following
Results
Used A/B testing to upgrade the company’s most highly-trafficked page (5M+ views/month), increasing user engagement by 5% and saving $2K/month (in Bunchball costs) by rolling our own simple badging solution.
The CrowdFlower platform is consumed via a number of microtasking sites. Each site registers and maintains its own users, but to better track unique identities across the platform, we built a Single Page App in Ember.js that associates users from partner microtasking sites with one unique identifier in CrowdFlower.
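A minimal sketch of what that association looks like, assuming Ember 1.x with Ember Data; the model and attribute names are hypothetical, not the actual schema:

```javascript
App = Ember.Application.create();

// The single CrowdFlower-wide identity
App.Contributor = DS.Model.extend({
  email: DS.attr('string'),
  accounts: DS.hasMany('account', { async: true })
});

// A user record registered on a partner microtasking site
App.Account = DS.Model.extend({
  site: DS.attr('string'),        // the partner site
  externalId: DS.attr('string'),  // the partner's own user ID
  contributor: DS.belongsTo('contributor')
});
```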
Results
Implemented a CRUD tool for managing users in Ember.js, iterating in conjunction with the Product Manager as requirements changed.
Test Questions are used as the gold standard of quality in the CF platform, but they can be laborious to create, particularly for work that’s periodically repeated.
As no templatized solution existed, a team of three of us (me as F2E, a Product Manager, and a Backend Engineer) tackled creating an internal product to simplify the workflow.
The user flow was to create “Cases” of Test Questions that were sent to jobs as “Batches”; the composite idea of a “Mold” encompassed all Cases and Batches for a particular set of target jobs.
(“The Forge” was the product’s original name, derived from a time when “Test Questions” were known as “Gold.”)
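To make that nesting concrete, here is an illustrative sketch of the domain’s shape (plain objects only; not the actual schema, and all names and IDs are hypothetical):

```javascript
// A Mold is the composite of all Cases and Batches for a set of target jobs
var mold = {
  name: 'Image moderation gold',   // hypothetical
  targetJobIds: [101, 102],        // hypothetical job IDs
  cases: [
    { id: 1, testQuestions: [ /* reusable Test Questions */ ] }
  ],
  batches: [
    { caseId: 1, jobId: 101 }      // a Case sent to a job as a Batch
  ]
};
```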
One of the more challenging aspects of the project was testing the app. Selenium has always been a robust solution for testing even JS-heavy experiences, but given its heft, we used Poltergeist instead.
The product was to eventually be made available externally but never was.
The app is CrowdFlower’s most highly-trafficked. It also happens to be one of the company’s most technically complex, given its history.
Its architecture is that of a Rails app, wrapping a Gem that extracted business logic from the company’s legacy (original) Merb app. The Gem contains all logic around rendering, styling, and providing interactivity for CML, the basis of abstracting microtasks in the platform.
The app was built (before my time) in order to bring a richer, more interactive experience to those doing microtasking work. When the original architect departed only weeks after I joined the company, maintenance and feature implementation fell to me.
Results
Supported the site’s most highly-trafficked, revenue-generating UI (allowing for custom JS and CSS).