The Conjoint method tries to find out which features of a product matter most when people decide to buy it. Researchers show participants different versions of a product, each with a different set of features, and ask them to choose between the versions. This helps researchers understand which features truly matter to buyers.
The method places participants in a realistic decision-making situation. Instead of asking general questions, it presents concrete options, just as in a real shopping situation. This makes the results more realistic and useful, and helps companies see what customers value most when they choose one product over another.
Researchers at Statista+ conducted a study to test something new. They used the Conjoint method but changed the design of the survey. Surveys usually use a plain table format. In this study, they compared that traditional format with a new one that looked like a modern online shop.
The shop-like design made participants feel as if they were shopping online. Statista+ wanted to see whether this design changed how people made their choices, and the results were surprising: even the design of a survey can influence people’s decisions.
An interesting trend is now growing in market research: researchers want data collection to mirror participants’ everyday lives as closely as possible. So they create survey situations that feel natural and lifelike, for example fake online shops that look like popular platforms such as Amazon. In these virtual shopping environments, participants make realistic choices: they pick products, check prices, and compare product features, much as they would in real life.
The idea behind this approach is simple:
When a task is close to real life, people act more naturally during the survey. This behavior makes them more engaged, and it also leads to better results. To make surveys feel more realistic, tools like conjoint analysis are used. These tools simulate real buying decisions. Participants choose between different products with different prices and features. This method helps researchers understand how the market might react to price changes or new product features.
But how useful is this method, really?
At Statista+, we wanted to know if using real-life-like situations in conjoint analysis really improves data quality. Do these situations make participants behave more naturally? Or do they confuse people instead of helping them? Another question was about how fun the study feels. Does a fun setup make people more motivated? Or does it reduce scientific quality and turn it into simple entertainment?
To find answers, we ran our own internal studies. We used choice-based conjoint analysis and applied it to headphones and hair dryers. We tested two styles: a traditional table format and a modern web-shop-style layout. To study how each one worked, we used a mix of personal interviews and an online survey.
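To give a feel for what a choice-based conjoint task looks like under the hood, here is a minimal sketch of how choice sets can be generated. The attribute names and levels are purely illustrative, not the actual attributes from the study described above.

```python
import itertools
import random

# Hypothetical attributes for a headphone study (illustrative only,
# not the attribute list used in the Statista+ study).
ATTRIBUTES = {
    "brand": ["A", "B", "C"],
    "price": ["49 EUR", "99 EUR", "149 EUR"],
    "battery": ["10 h", "20 h", "30 h"],
}

def build_choice_task(n_alternatives=3, rng=random):
    """Draw one choice set: several product profiles plus a 'none' option."""
    # All possible full-profile combinations of the attribute levels.
    profiles = list(itertools.product(*ATTRIBUTES.values()))
    # Sample a few distinct profiles for this screen.
    chosen = rng.sample(profiles, n_alternatives)
    task = [dict(zip(ATTRIBUTES, p)) for p in chosen]
    # A "none" alternative is common in choice-based conjoint.
    task.append({"none": "I would not buy any of these"})
    return task
```

Whether such a task is then rendered as a plain table or as a web-shop-style product page is exactly the design variable our study compared; the underlying choice data are the same.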
We held eight user experience tests. Each test lasted 60 minutes. Every participant completed both styles. We used the “think-aloud” method, where participants speak their thoughts while doing the task. This helped us understand how they made their decisions. After the test, we asked them specific questions. These questions helped us judge the layouts and better understand their shopping choices.
We also conducted an online survey with 1,190 participants. We used an A/B test where each person saw either the table design or the web-shop design. To make sure the results were fair, we studied each person’s shopping journey. We wanted to know who prefers online or offline shopping. This way, we could compare their answers fairly.
We made sure each group had a similar structure, balancing gender and age across four groups: online shoppers who saw the table design, online shoppers who saw the web-shop design, offline shoppers who saw the table design, and offline shoppers who saw the web-shop design. Online shoppers were people who usually buy or research products online; offline shoppers mainly shop in stores.
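The quota logic behind such a balanced A/B split can be sketched in a few lines. This is a simplified illustration of the idea, assuming each respondent is already classified as an online or offline shopper; the field names are made up, not Statista+’s actual variables.

```python
import random
from collections import Counter

def assign_layouts(respondents, rng=random):
    """Assign each (id, shopper_type) pair to a layout so that the table
    and web-shop layouts fill evenly within each shopper type."""
    counts = Counter()  # (shopper_type, layout) -> number assigned so far
    assignments = {}
    for rid, shopper_type in respondents:
        table_n = counts[(shopper_type, "table")]
        shop_n = counts[(shopper_type, "web-shop")]
        if table_n < shop_n:
            layout = "table"          # fill the under-represented cell
        elif shop_n < table_n:
            layout = "web-shop"
        else:
            layout = rng.choice(["table", "web-shop"])  # tie: randomize
        counts[(shopper_type, layout)] += 1
        assignments[rid] = layout
    return assignments
```

Balancing within shopper type is what makes the later comparison fair: any difference between layouts cannot be explained by one layout simply attracting more online shoppers.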
To find out whether the web-shop design gave any real benefits, we looked at three things. First, does it show the buying process in a more realistic way? Second, does it make the survey more fun and easier to understand? Third, does it improve the quality of the data?
Familiar Design Meets Logical Thinking
The web-shop design feels familiar to most people. Many participants liked it because it looks like a normal online store. This made the survey feel easier to begin and less intimidating. Because of this, people may have felt more willing to take part in the survey.
However, the study also showed some problems with this familiar style. People usually follow a logical process when deciding what to buy. They think about things like price, features, or design. They rank these things in order of importance. Then they start removing options that don’t meet their top needs.
The web-shop layout doesn't match this process well. It focuses on whole products instead of comparing features one by one. This makes it hard to see the structure behind each decision. In contrast, the classic table format shows this logic more clearly. Although it may look old-fashioned, it follows how people make choices. It helps participants rank features, especially when products are complex.
With the table layout, people can focus only on what matters. They are not distracted by fancy designs or images of the full product. This layout is similar to how most comparison websites work. These sites are popular because they make it easy to compare important product features.
This is why the table format is still the best choice when people make serious and well-thought-out decisions. The web-shop format may work better for quick and simple choices. It offers a nice environment when the decision is easy and doesn’t need deep thinking.
Entertaining Nature of the Research Designs
Some might think that fun doesn’t matter in a study. But fun plays a big role in keeping participants motivated. Table formats often seem plain and purely functional. In contrast, the web-shop format tries to make things more visually interesting and lively.
At first, some people found the web-shop layout more exciting. It reminded them of their regular online shopping experiences. This made the start of the survey feel pleasant. But this feeling didn’t last. As the survey continued, people got just as tired as they did in the table format. Their answers became shorter, and they started to lose interest.
Both formats felt equally tiring. In interviews, participants said that repeating the same tasks over and over was hard, no matter the design. Completing a long survey simply takes effort. Our participants were also new to online market research; we chose them deliberately to avoid bias from previous survey experience.
Overall, participants said they liked the survey. This was true no matter which format they used. Still, we noticed something interesting. In the web-shop design, people took more time to make decisions. Because of this, they gave fewer random answers. They also avoided choosing obviously bad products. This suggests the fun layout helped them stay more focused.
Are the Results Any Better?
Now comes the key question: Do these design changes give better results? Usually, it is hard to answer this. What do we mean by "better results"? In this case, the answer is easier than usual. The results from both formats were almost the same.
We split participants into four groups: online shoppers using the table design, online shoppers using the web-shop design, offline shoppers using the table design, and offline shoppers using the web-shop design.
All four groups cared about the same product features. The relative importance of the features barely changed across groups: the biggest difference was only 1.8 percentage points.
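For readers unfamiliar with how such importance figures arise: in conjoint analysis, an attribute’s importance is typically derived from the range of its estimated part-worth utilities, expressed as a share of the total range across all attributes. The numbers below are made-up values for illustration, not results from our study.

```python
# Hypothetical part-worth utilities for one respondent group.
partworths = {
    "price":   {"49 EUR": 0.8, "99 EUR": 0.1, "149 EUR": -0.9},
    "battery": {"10 h": -0.4, "20 h": 0.0, "30 h": 0.4},
    "brand":   {"A": 0.2, "B": 0.0, "C": -0.2},
}

def importances(pw):
    """Attribute importance (%) = utility range / sum of all ranges."""
    ranges = {a: max(u.values()) - min(u.values()) for a, u in pw.items()}
    total = sum(ranges.values())
    return {a: round(100 * r / total, 1) for a, r in ranges.items()}
```

Comparing these percentages between the four groups is how a “biggest difference of 1.8 percentage points” is measured.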
Some small changes appeared in how people rated individual features. These affected things like preference scores or price value. But these small changes didn’t follow a pattern. They were likely caused by personal taste within the small groups.
For example, online shoppers using the table format liked the "green design" of the hair dryer more than those who used the web-shop version. But offline shoppers didn’t show this difference at all.
Web-Shop Designs Don't Offer Real Benefits
The main goal of a conjoint study is to find the best combination of product features. These decisions are usually logical and careful. Our study shows that a web-shop layout does not help people make these decisions more naturally.
People often find traditional conjoint methods hard to use. We thought the web-shop layout might make it easier. But it actually made comparing products harder. That’s because it focused too much on the full product, instead of the individual parts.
Still, the web-shop layout brought some welcome changes. It broke the routine of typical surveys, and it helped reduce low-quality answers, such as rapid clicking or random and contradictory responses.
But this didn’t really change the final results. We removed low-quality answers in both formats. We replaced them with better-quality data. However, fewer bad answers could help save time and lower the cost of using large survey panels.
Realistic web-shop designs can be useful in point-of-sale studies. But for conjoint analysis, our advice is clear: Don't spend your time building fancy web-shop formats. Instead, invest your time in better data analysis and deeper simulations. These will give you results with real value.