As November draws to a close, our team has been hard at work preparing for the release of the first version of our fully AI-automated thematic analysis for qualitative customer research. But don’t miss these eight other product updates that went live in your account this month 👀
As many of you already know, we’re heads down building a new version of AI-run thematic analysis, which will make it easier for you to quickly analyze the responses in your Wondering studies. We’re getting close to a first release, and you’ll get access to it over the next few weeks. In the run-up to this release, we’ve also shipped a heap of product updates to make it easier for you to get the insights you’re after. Here are the other updates you’ll now see in your Wondering account:

🖥️ Live Website Testing: Earlier this month, we shipped Live Website Testing! Live Website Tests let you show your participants any website, such as your landing pages or a competitor’s site. By combining Live Website Tests with other blocks in your study, you can then ask your participants questions to better understand how they experience those websites.

🤖 Live Website Testing support in the AI Study Builder: You can now generate Live Website Testing studies using the Wondering AI Study Builder.

👀 See all the responses for each block: We’ve added block-level response tables that show the responses to each question within a block, including answers to follow-up questions.

🔌 See Figma events in the participant transcript: To make it easier to understand how participants interacted with your Figma prototypes, you can now see Figma events in the transcripts for Prototype Test blocks.

💾 Export prototype test data in your CSV exports: You can now include prototype test data in your CSV exports, making it easier to dive deep into your results.

💖 A smoother participant experience: Sometimes it’s the smaller updates that are the most exciting. Based on feedback from participants on our panel, we’ve made improvements to the UI participants see when taking part in studies.

❤️ Usability improvements for researchers: We’ve made numerous small improvements to the navigation experience in the Wondering researcher app.
🐛 Platform-wide stability improvements and bug fixes: We’ve made numerous under-the-hood improvements to enhance the stability and reliability of the platform, ensuring a smoother experience all around.

Anything else you’d like to see in Wondering? Let me know!