Pre-Launch Product Feedback is Essential. It’s also Getting Harder to Obtain, Says Centercode User Testing Survey

Nearly 70% struggle to obtain feedback on pre-launch products because the testing process is inefficient; new automated testing tools can help

Nearly 70% of product testing managers find it difficult to get testers to participate in pre-launch user feedback programs, according to a new survey from Centercode. This makes it particularly challenging to keep up with the speed, frequency and quality of product launches that today's consumers expect.

The Delta Testing Industry Report 2022 from Centercode collected insights from leading brands to determine how testing teams structure user testing programs, manage time and resources, and measure success. The survey also identified which test activities and job roles managers find most challenging.


Difficulty Engaging with Testers to Receive Useful Feedback

More than 50% of companies surveyed release new products every few months and almost 80% release product updates at a similar interval. Yet, 69% of test managers running pre-launch user testing programs have trouble obtaining tester participation, while 44% find it hard to collect useful feedback. Combined, these factors can result in prioritizing the wrong fixes and enhancements, creating significant risk to product success and delays in go-to-market timing.

“The nature of today’s connected and rapidly evolving landscape requires that consumer products and updates be released faster than ever. This puts increased pressure on product teams to build continuously improved versions,” said Luke Freiler, CEO of Centercode. “However, when managers struggle to collect enough data or usable feedback from testers, it is difficult to know whether the right fixes and enhancements are being incorporated into the product.”


A Broken Testing Function

One reason user participation is low is that many test managers are using tools that are not designed for delta testing. In fact, most respondents (70%) use email to collect feedback from testers, followed by SurveyMonkey, Jira and Google Forms, none of which are built to support a fully functional or efficient user-testing program.

These tools require manual, time-intensive tasks for both test managers and test participants. The wrong tools and manual systems make it hard to recruit testers, communicate with them and demonstrate the value of tests, resulting in low tester engagement.

Additionally, when test managers are required to combine data from numerous siloed systems, comb through various forms to identify useful data and feedback, and analyze disparate data sets, it opens the door to mistakes in data interpretation, misunderstanding of feedback and results, and low-quality analysis.

Limited Time and Resources for Efficient User-Testing

Other limitations of current product testing programs are time and staffing. More than half (56%) of test managers feel they do not have enough time to effectively manage user tests and ensure participation and quality results. Only 19% are satisfied with the number of personnel available to support their organization's user-testing efforts.

Because of these limitations, more than half (57%) of survey participants indicated that many of their testing processes are reactive. There is simply no time to proactively plan and run user tests that prepare new products or upgrades for the market. Without proper testing, companies risk low consumer satisfaction scores, poor sales and damage to brand reputation.

Adoption of Automated Testing Tools is Growing

While many brands still hang on to outdated testing tools, fast-selling modern brands such as Fitbit, BOSE, Wyze, August and others are embracing automated delta testing solutions that reduce the staffing strain, support proactive testing programs, and solve problems associated with siloed feedback platforms or manual data collection.

“A great example of a brand that has used automated consumer testing and feedback to ensure exceptional product launches is iRobot,” said Freiler. “By continually launching products that have been consumer tested, that people love and continue to purchase, iRobot recently inked a deal with Amazon.”

Automated delta-testing programs help teams engage with testers, manage the test process end to end, reduce the time and cost of testing, and create accessible reports that aggregate feedback so test managers can easily incorporate the right feedback into the design and development process.

This ultimately empowers brands to bring products to market that meet customer expectations, driving increased sales and revenue.

