Competitions: Poll Everywhere’s newest poll type, furthering its ongoing mission to make presentations more engaging. This project stretched both how we build and how I deliver design.
Early research and designs for a new poll type, spanning new participant, presenter, and creator experiences. The release timeline was estimated at 6 months.
Lead designer and researcher. The implementation team consisted of 2 designers, 3 researchers, 1 product manager, and 6 engineers.
A frequent ask from customers. Customers want to make their presentations fun for their audiences, and one of the best ways to engage an audience is by holding a competition.
The idea of Competitions emerged. Participants can partake in the competition using their personal devices, suddenly becoming part of the presentation.
They loved the idea of running a competition, but the closest thing Poll Everywhere offered at the time was a segmented bar chart, which was complicated to set up and clunky to read.
"I find your bar charts with percentages confusing."
Leading the research team, I set out to uncover more pain points from our customers and analyze our competitors' approaches. After sending a survey out to ~1,000 people and facilitating 7 customer interviews, our team synthesized the data into 6 key insights to carry into our upcoming design brainstorm.
Design hosted a two-day event, inviting stakeholders and cross-functional team members to participate in a series of exercises centered on our research findings.
From the research insights and brainstorming, I identified 3 key areas to inform my design.
DESIGN THROUGH DELIVERY
In order to build within a reasonable timeframe, and stay as flexible as possible to account for learnings, I broke up the participant experience into 3 layers.
The first layer is the Core Flow, aka "viz sync."
"Viz" is Poll Everywhere terminology for "visual": what is shown on the projector, monitor, or TV screen. A bar chart or word cloud is a "viz."
Pressing buttons to fight another player would be the bare minimum of a fighting game.
This Core Flow is the bare minimum of what we need to have working to resemble a competition.
Simply put, we needed the right screens to show on both the phone and the big screen.
Waiting screen (when a poll is deactivated)
Competition start screen (static)
Answer input screen
Correct answer feedback screen (static)
Leaderboard/Rank (mid competition) screen (static)
Competition end screen (static)
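The Core Flow above is essentially a sequence of screen states. The actual implementation isn't described in this case study, but as an illustration only, the screen order could be sketched as a minimal state machine (all state names and transitions here are my own assumptions, mirroring the list above):

```python
# Hypothetical sketch of the Core Flow as a simple state machine.
# Screen names mirror the list above; transitions are assumptions,
# not the shipped implementation.
CORE_FLOW = {
    "waiting": "start",             # presenter activates the poll
    "start": "answer_input",        # competition begins
    "answer_input": "feedback",     # participant submits an answer
    "feedback": "leaderboard",      # correct answer is revealed
    "leaderboard": "answer_input",  # next question (mid-competition)
}

def next_screen(current: str, last_question: bool = False) -> str:
    """Return the screen that follows `current` in the Core Flow."""
    if current == "leaderboard" and last_question:
        return "end"  # competition end screen
    return CORE_FLOW.get(current, "waiting")
```

The key property this captures is that the phone and the big screen must advance through the same sequence in lockstep ("viz sync").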
In this layer we create a sense of anticipation and begin establishing a scoring baseline within the game.
Capcom's Street Fighter 5 establishes match configurations, such as the number of rounds, and gives a breakdown of the players' progress.
The concept of rounds would give the sense of pacing, and showing statistics lets the player know how they're doing.
The last layer focuses on participants feeling part of a live, interactive game. Making winners feel like winners, and creating replay value.
Blizzard's Overwatch in an early playable state (left) contrasted with the product at launch (right).
An unpolished game can feel unfair. The contrast in Overwatch is massive: the early build on the left works, but imagine playing something that merely "just works" versus something highly refined.
Here we introduce animations to make the competition feel polished.
With our current scoring system, ties are likely. We needed to come up with a way for the presenter to "break a tie" seamlessly in the experience.
Professor Geller is using Poll Everywhere in his presentation, and specifically adding a competition with a $100 cash prize.
Multiple people have the same score?
Nobody is answering anything correctly?
While this is a concern, ties are not necessarily a problem. The real problem right now is how a tie is visually displayed.
Design considered multiple scoring models that could reduce the amount of ties:
We decided that time-based scoring and a "bonus round" were the two models that would have the most impact while complementing our current system.
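The shipped scoring formula isn't detailed in this case study. Purely as an illustration of why time-based scoring reduces ties, a minimal model might award fixed points for correctness plus a bonus proportional to remaining time (all values here are assumptions):

```python
def score_answer(correct: bool, seconds_remaining: float,
                 base_points: int = 100, max_bonus: int = 100,
                 time_limit: float = 20.0) -> int:
    """Hypothetical time-based scoring: fixed points for a correct
    answer, plus a speed bonus proportional to time remaining.
    Point values and time limit are illustrative assumptions."""
    if not correct:
        return 0
    return base_points + int(max_bonus * (seconds_remaining / time_limit))
```

Because two participants rarely answer at exactly the same moment, the speed bonus spreads scores out, making exact ties far less likely than under a correct/incorrect model alone.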
Testing for the participant archetype has historically been (and continues to be) very limiting. Aside from ideally being able to simulate a presentation experience, the biggest constraint we often have to work with while testing is the fact that our team needs to activate the polls to get usable results. This makes remote testing with most online testing services a challenge.
That said, we’ve been creative in some of the ways we test. For this test, I built the testing experience around an online testing tool. Since timing is important to Competitions, I used a video recording of someone going through a prototype of the Competitions experience as a participant. I gave testers a scenario in which they played the role of a competitor, and had them respond to the on-screen prompts in the video.
Users understand their score faster when the score is contextual within the individual question versus contextual within the entire competition.
No one was very surprised to find themselves on the leaderboard with the score they had accumulated. The difference was when testers learned about their score: by the third question, testers in the global-scoring and any-question buckets had figured out how their score was being calculated.
The takeaway was that participants may not care about how the score is calculated, as long as the scoring feels fair.
Competitions need some time to rev up and build anticipation.
The goal here is to show activity and liveliness in order to create a sense of drama for the participants and to encourage their neighbors to join in.
Participants of a competition need to know how they are doing to stay engaged.
Question correctness and score is revealed on the participant's device at the same time as the results on the big screen.
I used animation to create a feeling of liveliness as the results transpire fluidly in front of the participant.
A key part of a competition is the leaderboard.
Participants need to be able to see how they compare with others. In order to get more people involved, while still being mindful of space, I included a Top 10 leaderboard.
I used animation to emphasize certain elements on the screen, with the biggest emphasis on the final rank.
I wanted those outside the Top 10 to still feel part of the competition and to have a way to see how they compared with the people around them. I created a scrolling element for those participants so their name and rank would appear without being pinned statically, alleviating any potential embarrassment.
Over 25,000 competitions have been created in less than a year.
To highlight one of those cases, Poll Everywhere’s Competitions was recently used at a national optometry conference in Orlando, where students went head-to-head in a competition to test their optometry knowledge.
Being an avid gamer myself, I loved being able to use my own experience to inform my design. That said, there were a number of challenges that pushed the boundaries of my process.
I chose this approach because at the start of the project, I owned the participant archetype. I knew that what the participant sees is largely dependent on what their presenter chooses to show them. So I needed a way to concept and validate while working in parallel with my teammate (who owned the presenter). This was the strongest part of my design process, enabling me to deliver the best possible experience within our own time and resource constraints.
Our product is near impossible to test consistently and quickly for the participant archetype because no simulation can truly recreate all the various possibilities within live environments. Our recorded simulations and playbacks were creative, but our true results only roll in once it’s out in the wild.
I have a huge soft spot for well-used animation that is appropriate for the context and not over-the-top. I was excited to use my animation skills to convey a feeling of liveliness. It let me think beyond screen states and in terms of "scene" states.
Competitions was a success and shipped earlier than scheduled. By the second half of the project, I had taken on both the participant and presenter archetypes, and my team executed well on strategy. I am proud of building something with a strong foundation and look forward to seeing future iterations.