
UX: The Forgotten Principle

  • Pilotly Insights
  • September 18, 2020
  • Thought Leadership

Nowadays, consumers are surrounded by beautiful products with pleasant, easy-to-use interfaces.

As more products are designed with User Experience (UX) in mind, the traditional survey research experience starts to look more and more antiquated.

This is concerning, since the first step of good research is gathering good data, and good data acquisition starts with a positive user experience. Survey research therefore needs to catch up with the times and reflect the user experience and design that respondents are accustomed to in the other products they use.

 

Several aspects of UX design can be applied to survey research; for the purposes of this article, we will focus on the following key areas and relate each to conducting audience research for testing video content:

  • Visual Appeal
  • Language Used
  • Functionality Tailored Specifically For Your Application

 

Visual Appeal

 

A survey experience needs to be visually engaging in order for the audience to provide good data. Results from a study conducted by Dillman et al. revealed a strong impact on data-quality indicators when visually appealing and visually deficient surveys were compared:

  • Open-ended responses to the visually deficient design were less likely to be complete, thorough, or positive in tone

  • Responses to scalar opinion questions were negatively skewed

  • Respondents were less committed to the survey, showing increased “satisficing” (a lack of effort in answering questions)

The research literature contains several other similar studies, so taking steps to enhance your survey experience by going beyond traditional white backgrounds with radio buttons and checkboxes can really help the data you collect.

 

Furthermore, cognitive psychology research supports limiting the amount of question text and the number of answer options a viewer is shown in order to improve response quality. Respondents’ working memory is a bottleneck: viewers can only process a limited amount of information in a given amount of time. When there is too much information to process, respondents cannot make the cognitive connections needed to answer questions appropriately.

 
By simplifying the information respondents interact with, we are much more likely to capture authentic feedback and gauge their true perceptions.
 

Language Used

 

Using familiar language when creating questions is also backed by cognitive psychology research. If the information contained in questions and answer options is conveyed in a clear, familiar form, the cognitive processes engaged by the respondent are more likely to be fast and accurate. We do not want respondents to get caught up in complexities such as industry slang or convoluted syntax. Hence, it is best to avoid typical “market research speak” unless there is a good reason to use it.

 

For example, in Hollywood, actors and actresses are commonly referred to as “talent”. When conducting audience research for a new show, a new researcher may be tempted to ask “How would you rate the talent in the show?” based on the language used by their Hollywood client. However, a general consumer could interpret this in different ways, wondering whether the question refers to the actors and actresses in the show or to whether the show’s creators are talented. A more appropriate question might be, “Overall, how would you rate the actors and actresses in the show?”

 

Application-Specific Functionality

 

Overall survey functionality should not be taken for granted when conducting online research. Different research projects may need specific viewing environments and feedback mechanisms to achieve their goals. For example, traditional survey platforms like Qualtrics, SurveyMonkey, or Confirmit are not built to help researchers understand how viewers respond to a TV show, yet some researchers “hack” together video surveys, usually by embedding a video link on a survey page and asking questions before or after the video. Without any in-video mechanism to measure viewer interest or engagement, these surveys rely on respondents to remember everything they liked and disliked about a video, because they cannot give any feedback until the exit survey.
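
To make that limitation concrete, here is a minimal sketch in TypeScript of the difference between what an exit-only video survey records and what scene-level measurement would require. The type and field names are invented for illustration and do not reflect any particular platform’s schema.

```typescript
// Hypothetical illustration only: an exit-only survey ties each answer
// to the video as a whole, so scene-level detail depends on memory...
interface ExitSurveyResponse {
  respondentId: string;
  videoId: string;
  questionId: string;
  answer: string; // relies entirely on the viewer's recollection of the video
}

// ...whereas scene-level measurement needs a timestamp on every signal,
// so feedback can be tied back to the moment that produced it.
interface InVideoReaction {
  respondentId: string;
  videoId: string;
  timestampSec: number; // position in the video when the reaction occurred
  reaction: 'like' | 'dislike' | 'tune-out';
}
```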

 

The brain uses cognitive heuristics to summarize information in ways that can reduce response accuracy over the longer recall periods inherent to testing videos like movies or TV shows. As a result, people taking surveys on traditional platforms are most likely to remember “peak”, salient, or the most recent moments before filling out a survey, introducing an inherent bias into research conducted on those platforms. Researchers should therefore aim to use platforms appropriate to the type of research being conducted.

 

Pilotly’s Focus on UX Leads to More Effective Video Research

 
It’s important for researchers to keep these three principles of UX design in mind when conducting audience research in order to minimize bias, protect data quality, and obtain the most accurate insights.
 

In fact, we built our research platform around these principles. At Pilotly, we build on well-established research principles, but we are not afraid to challenge the status quo. Our platform design is centered on User Experience (UX), and our interface is specifically built to help researchers understand video viewers’ responses and intent, allowing them to conduct audience research in ways that were not possible before.

 

We’ve taken steps to enhance the survey experience by eliminating white backgrounds and radio buttons and by ensuring that videos stream in high quality, providing a more colorful and sleek experience to the viewer. We also offer simulated environments beyond fullscreen video, including simulated Facebook and Snapchat feeds, enabling researchers to mimic an organic experience in the contexts where videos might be seen in real life. These design elements and enhanced functionality help capture more meaningful data points through a smooth, interactive viewing experience.

 

Since we integrate the survey experience with the video itself, Pilotly is the only platform that enables you to ask questions and assess perceptions before, during, and after a video. Viewers can provide continual feedback by liking, disliking, or indicating tune-out during particular scenes, and we dynamically ask questions based on this feedback. This type of measurement and questioning complements regular survey techniques and ultimately reduces the amount of bias in our results.
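
As an illustration only (the function and field names below are hypothetical and are not Pilotly’s actual API), dynamic questioning of this kind can be thought of as a small handler that reacts to a timestamped signal and decides whether to ask an in-context follow-up:

```typescript
// Hypothetical sketch of reaction-triggered questioning.
// Names are invented for illustration, not taken from Pilotly.

interface ReactionEvent {
  respondentId: string;
  timestampSec: number; // where in the video the reaction happened
  reaction: 'like' | 'dislike' | 'tune-out';
}

interface FollowUpQuestion {
  text: string;
  showAtSec: number; // surface the question at the moment that prompted it
}

// Decide whether a reaction warrants an immediate, in-context follow-up.
function followUpFor(event: ReactionEvent): FollowUpQuestion | null {
  if (event.reaction === 'dislike') {
    return {
      text: "You marked this scene as a dislike. What didn't work for you?",
      showAtSec: event.timestampSec,
    };
  }
  if (event.reaction === 'tune-out') {
    return {
      text: 'What made you want to stop watching here?',
      showAtSec: event.timestampSec,
    };
  }
  return null; // likes are simply logged; no interruption needed
}
```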

 

We have even applied UX research and design to our research dashboard. When a study is set up, researchers can easily create surveys using a combination of pre-made template questions from our database and their own custom questions. The platform enforces limits on question and answer lengths to ensure a smooth experience for respondents, and our consulting team is available to provide guidance on question wording to ensure appropriate lines of questioning are used.
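
A minimal sketch of that kind of guardrail might look like the following. The limits shown are placeholders assumed for this example; they are not Pilotly’s actual thresholds.

```typescript
// Hypothetical setup-time validation. The numeric limits are
// illustrative assumptions, not Pilotly's real configuration.

const MAX_QUESTION_CHARS = 140;
const MAX_OPTION_CHARS = 60;
const MAX_ANSWER_OPTIONS = 7; // keeps the choice set within working-memory limits

interface DraftQuestion {
  text: string;
  options: string[];
}

// Returns a list of problems so the researcher can fix them before launch.
function validateQuestion(q: DraftQuestion): string[] {
  const problems: string[] = [];
  if (q.text.length > MAX_QUESTION_CHARS) {
    problems.push(`Question text exceeds ${MAX_QUESTION_CHARS} characters.`);
  }
  if (q.options.length > MAX_ANSWER_OPTIONS) {
    problems.push(`More than ${MAX_ANSWER_OPTIONS} answer options.`);
  }
  for (const option of q.options) {
    if (option.length > MAX_OPTION_CHARS) {
      problems.push(`Answer option "${option}" exceeds ${MAX_OPTION_CHARS} characters.`);
    }
  }
  return problems;
}
```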

 

In conclusion, we do not take any part of the research or survey design process for granted. We built Pilotly to bring video research into the modern era, addressing several of the pain points in how studies are conducted today that skew research results. We recognize that a good user experience is a precursor to good data, and we built our user interface accordingly. With a sleek design, less jargon, and optimized processes, viewers are more comfortable and more willing to provide the feedback that researchers are truly seeking.

 
