Utilising AI to streamline the survey-building process

Data collection
AI
SaaS

Summary:

We introduced an AI-driven feature that automates survey creation by allowing users to upload pre-approved surveys. This enhancement streamlined the process, saving time and increasing user satisfaction while boosting engagement and accuracy on our platform.

My role:

Product Designer

Skills:

  • User Experience
  • Visual Design
  • User testing/research
  • Design systems
  • Ideation/workshops

Team:

  • Myself
  • 1 Product Manager
  • 1 Customer Success Manager
  • 2 Testers
  • 3 Front-end engineers
  • 1 Back-end engineer

Overview

We leveraged AI to transform the survey creation process, making it both fast and effortless. Users can now upload pre-defined surveys, and our platform takes over the build. This update not only saves valuable time but also reduces user frustration, significantly enhancing overall satisfaction and retention. Some key tasks were:

  • Conducting and leading user research in collaboration with the PM
  • Managing key stakeholders
  • Leading workshops on ideation and refinement
  • Developing UI elements, visual assets, and interaction designs

Problem

The existing survey creation process required users to manually recreate pre-approved surveys, resulting in inefficiencies and wasted time. This cumbersome process often led to frustration and reduced user retention, as users sought more streamlined solutions elsewhere. Our goal was to address these issues and improve user engagement with our platform.

Business Goals

  • Simplify the survey creation process to enhance user satisfaction
  • Reduce time spent on manual survey recreation
  • Increase user retention by providing a more efficient solution
  • Improve the accuracy of surveys generated on our platform
  • Drive higher engagement with our new AI feature

User Goals

  • Faster turnaround times for survey setup
  • Streamlined survey creation using pre-approved surveys
  • Reduced frustration with automated processes
  • Increased accuracy and relevance in generated surveys
  • Enhanced overall experience with the platform

The process of problem solving

After completing another project, I gathered feedback from 15 users about the sign-up process and their survey-building methods. I observed that many users had pre-set questions but still had to manually build each question, so I asked them about this issue.

Key Insights:

  • "You know, creating surveys can be quite a process. First, I have to draft the questions and get them approved by different departments to ensure they align with our standards and goals so often we already have a word doc to begin with that is pre-determined”.
  • ”I think thats were a lot of my time goes because once it’s built all you have to do is distribute it which is easy”
  • ”I find Creating surveys involves a lot of back-and-forth. Our questions are usually reviewed by various departments for approval, and then build it using SmartSurvey . It’s a bit of a hassle. If there was a way to upload a document or something then perhaps that would be easier - I think Qualtrics used to let me do that.”
  • "Deciding on the right question types can be tricky for us. We receive templates for surveys to create, but sometimes we're uncertain about which questions will work best."

Thought process

I shared my findings with my PM, who acknowledged the issue but didn’t prioritize it for the roadmap. We decided to address it as a hack-day project to explore it efficiently without disrupting the team’s work. I then organized an ideation session with stakeholders from customer success, sales, engineering, and my product manager.

Ideation session board

Problem Definition

The challenge lies in efficiently integrating pre-built surveys into our platform. Users face uncertainty in selecting optimal question types and encounter delays in obtaining necessary approvals from stakeholders. Our goal is to streamline this process, ensuring surveys are user-friendly, align with organisational standards, and meet user expectations seamlessly.

The ideation workshop

During the ideation session, I defined the problem and presented my findings to the team. We then generated ideas and refined HMW (How Might We) statements to ensure our solutions were focused on the user's needs.

Chosen path

We refined and prioritised our ideas, then used an impact/effort matrix to evaluate their value and required effort. We decided to use AI for several reasons:

  • It was scalable and consistent - AI can handle a high volume of surveys with reliable accuracy.
  • It was cost-efficient - this minimised resources and reduced build time, and we could integrate with our existing AI offering.
  • It addressed the users' pain points by reducing the time taken and suggesting appropriate question types.

Analytics

We had recently introduced an AI feature that generates question lists, providing an opportunity to automate the survey process. I analysed AI usage data, which showed strong user interest, supporting the development of a lightweight MVP and potentially avoiding a lengthy development cycle.

Thought process

At this point, I had customer feedback, internal buy-in, and analytics to support potential usage. Next, I aimed to explore different implementation flows and develop various approaches for the feature.

User requirements:

  • Implement a text box where users can copy and paste pre-built survey content.
  • Develop an AI system that analyzes the pasted text and generates corresponding survey questions and formats.
  • Enable the AI to recommend the most suitable question types (e.g., multiple-choice, open-ended) based on the content of the pasted text.
  • Include a feature that allows users to regenerate or modify questions if the initial AI-generated output is incorrect or suboptimal.
  • Implement a feedback mechanism where users can rate the accuracy of the AI-generated questions.
  • Provide a real-time preview of the generated survey questions as users input text.
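
To make these requirements concrete, here is a minimal TypeScript sketch of how the paste-to-survey step could be modelled. The endpoint path, type names, and shapes (`generateSurveyFromText`, `GeneratedQuestion`, and so on) are illustrative assumptions for this write-up, not our production implementation.

```typescript
// Hypothetical types and endpoint for the paste-to-survey flow.
type QuestionType = "multiple_choice" | "open_ended" | "rating" | "yes_no";

interface GeneratedQuestion {
  id: string;
  text: string;
  suggestedType: QuestionType; // the AI-recommended question type
  options?: string[];          // present for choice-based types
}

interface ImportResult {
  questions: GeneratedQuestion[];
  sourceText: string; // original pasted content, kept so questions can be regenerated
}

// Paste text in, structured survey questions out.
async function generateSurveyFromText(pasted: string): Promise<ImportResult> {
  const response = await fetch("/api/ai/survey-import", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ text: pasted }),
  });
  if (!response.ok) throw new Error(`Import failed: ${response.status}`);
  const questions: GeneratedQuestion[] = await response.json();
  return { questions, sourceText: pasted };
}
```

The real-time preview requirement could be served by the same call, debounced as the user types, with the returned questions rendered directly into the survey preview.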

To create an MVP, we decided to build onto the current AI prompt flow: this made the feature easier for users to discover, avoided interrupting squad work, and let us tap into a steady stream of users already on the product.

Having discussed the requirements, how the feature could sit within the current platform, and the rough flow, I could start on the initial designs.

Initial Designs

From this point I reached out to the users I had spoken to earlier for their initial thoughts on our proposed solution. Responses were broadly positive, although they gave us more to consider, which helped mitigate later issues:
  • "The import feature looks like it would be really efficient. Being able to paste questions and see them instantly in the survey preview seems like a huge time-saver."
  • "The design looks straightforward, but I wonder how well the AI handles more complex questions. If it struggles, users might need to spend extra time adjusting things manually."
  • "The design seems straightforward to me, and the tips provided on the side add some support. They seem like they could be useful and I like having pointers on how to get the most out of my questions"
  • "It’s great that the AI can format the questions, but I’d be concerned about it changing the intent. Maybe an option to preserve the original question type would be useful."

Thought process

Mapping out the flows, based on what currently existed and what we could add, helped clarify to stakeholders what we were aiming to achieve and visualise the work, so everyone was on board with the general scope and the solutions we had prioritised.

Refining the high-fidelity designs

We built onto the existing flow, integrating the feature into the onboarding steps as well as the initial survey creation step. This meant new users would encounter it on first entry, and it would also fit into the flow of existing users.

SmartSurvey AI implementation

We added a content switcher to toggle between prompt and import, with import allowing users to copy and paste text from their document creator or text editor of choice. This utilised the existing flow and kept the build to a minimum, while providing a lightweight way for users to bulk upload their survey questions.

We added upfront tips, since the clearer and better the quality of the pasted content, the more effective the generated output. We wanted to encourage users to be broad, and to stay open to reviewing the generated outcome.

Once a user has pasted their text, the platform generates a set of questions whose types correspond to our database and a pre-set prompt. We also added the option to regenerate questions: although accuracy was very high, we wanted a fail-safe so users felt assured they could simply change or experiment for better results.

Once a user is happy with their survey content, they proceed to further editing, which covers styling, additional logic, distribution, and results. A user can save time by uploading content first, then customise with advanced options if needed, or simply distribute.
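
The regenerate and feedback mechanics described above could look something like the following TypeScript sketch. The endpoint paths are hypothetical, and it reuses the GeneratedQuestion and ImportResult shapes from the earlier sketch.

```typescript
// Hypothetical endpoints for regeneration and accuracy feedback,
// reusing the GeneratedQuestion and ImportResult shapes sketched earlier.
async function regenerateQuestion(
  result: ImportResult,
  index: number
): Promise<GeneratedQuestion> {
  const response = await fetch("/api/ai/survey-import/regenerate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      text: result.sourceText,           // re-use the original pasted content
      question: result.questions[index], // the question the user wants redone
    }),
  });
  if (!response.ok) throw new Error(`Regenerate failed: ${response.status}`);
  return response.json();
}

// Users rate each generated question so accuracy can be monitored over time.
async function rateQuestion(questionId: string, accurate: boolean): Promise<void> {
  await fetch("/api/ai/survey-import/feedback", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ questionId, accurate }),
  });
}
```

Keeping the original pasted text alongside the generated questions is what makes per-question regeneration cheap: the platform can re-run generation with full context rather than asking the user to paste again.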

Next steps

At the end of the hack day, the project received positive feedback, but stakeholders were concerned about the AI's reliability and the limited question types. I scheduled a follow-up meeting to address these concerns, showcasing positive user feedback and a demo of the AI. We expanded the question types to 15 and planned further updates. We then released a beta version to feedback providers for direct input on the new designs.

Feedback from Beta users

Following the stakeholder feedback and successful demo, we iterated on the initial design, expanded the AI capabilities to cover additional question types, and improved its reliability. The refined feature was then scheduled into our development roadmap. We implemented a beta rollout plan, using feature flags to release the new feature to 25% of both business and enterprise users. This allowed us to collect targeted feedback and monitor performance, refining the feature based on real-world usage before a broader release.
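
As an illustration of how such a gate can work (our actual feature-flagging tooling isn't shown here), a deterministic 25% bucket over business and enterprise users might look like this in TypeScript:

```typescript
// Illustrative feature-flag gate for the 25% beta rollout.
interface User {
  id: string;
  plan: "free" | "business" | "enterprise";
}

// Deterministic bucketing: hash the user id into 0-99 so the same user
// stays in the same cohort across sessions.
function bucket(userId: string): number {
  let hash = 0;
  for (const char of userId) {
    hash = (hash * 31 + char.charCodeAt(0)) >>> 0;
  }
  return hash % 100;
}

function hasAiImportBeta(user: User): boolean {
  const eligiblePlan = user.plan === "business" || user.plan === "enterprise";
  return eligiblePlan && bucket(user.id) < 25; // 25% of eligible users
}
```

Deterministic bucketing matters for a beta: it keeps each user's experience stable between sessions while capping exposure at a fixed percentage, which is what let us monitor the cohort and gather the results below.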

  • 26% of users adopted the new AI feature within the first month
  • Users reported a 35% reduction in time spent creating surveys
  • Satisfaction scores for the survey creation process increased by 22%
  • User retention improved by 10%