
Survey Research

Let quantitative research drive your understanding of customer attitudes and behaviors.

Customer behaviors change over time, and so does the customer experience. Through customer survey research, we'll provide quantitative measurements of the changes in opinions, beliefs, and attitudes that affect your customers' behaviors. You'll be able to rediscover your customers, re-evaluate your approaches, and prioritize your strategies.

CX challenges we solve
  • Measure opinions, beliefs, and attitudes affecting behaviors
  • Track changing customer attitudes and behaviors over time
  • Rediscover customers and prioritize strategic approaches
GET IN TOUCH WITH US

HOW WE DO IT

  1. Initial stakeholder interviews and collaborative workshops will establish a thorough understanding of your research questions and of any research already conducted.

  2. Collaboratively, we will develop a survey questionnaire that produces baseline data, tests key hypotheses, uncovers trends, and validates concept ideas.

  3. We will host the online survey, collect the data, and conduct an in-depth analysis.

  4. We can conduct complementary research, such as focus groups, ethnographic field research, diary studies, and other techniques.

  5. We will visualize the survey research data in an engaging format that promotes clear understanding and buy-in throughout your organization.

Want to learn more?
Contact us today

WHAT YOU GET

Together, we will uncover quantifiable data about your customers' perceptions and attitudes towards your product and service experiences. You'll get:

  • Measurable data and insights on which to base practical decisions about product and service design.
  • A visualization that captures the survey research data and gives concrete insight into your customers: their needs and wants.
  • Deep customer knowledge and clarity on how to move forward with your product, service, and organization.
Our foundation
Experience Thinking perspective

Experience Thinking underpins every project we undertake. It recognizes users and stakeholders as critical contributors to the design cycle. The result is powerful insights and intuitive design solutions that meet real users' and customers' needs.

Have customer survey research questions?

Check out our Q&As. If you don't find the answer you're looking for, send us a message at contact@akendi.com.

What makes online customer surveys effective for quantitative research?

Online customer surveys excel at collecting quantitative data from large numbers of respondents efficiently and cost-effectively. They provide statistically analyzable data about customer attitudes, behaviors, and preferences across your target population. Surveys answer 'where' and 'how often' questions, helping prioritize areas of focus based on data rather than assumptions. They're particularly valuable for measuring trends over time and validating insights from qualitative research.

Tip: Use surveys to measure and validate patterns you've identified through qualitative research rather than as standalone exploration tools - they're most effective when you know what to measure.

How do customer surveys fit into the Experience Thinking framework?

In Experience Thinking, customer surveys measure attitudes and behaviors across brand, content, product, and service experiences. They quantify how customers perceive each experience area and can reveal connections between different experience elements. Surveys provide the scale and measurement needed to prioritize experience improvements based on actual customer data across your complete experience ecosystem.

Tip: Structure survey questions to capture perceptions across all four Experience Thinking areas (brand, content, product, service) to understand how they interconnect and influence overall customer experience.

What types of business questions can customer surveys answer effectively?

Customer surveys effectively answer questions about satisfaction levels, preference comparisons, behavior frequency, attitude measurement, and demographic patterns. They can measure brand perception, feature importance, usage patterns, loyalty indicators, and market segmentation. Surveys are ideal for tracking changes over time and comparing performance across different customer groups or touchpoints.

Tip: Frame survey objectives around specific business decisions you need to make - measuring satisfaction is less actionable than measuring which specific improvements would most increase likelihood to recommend.

What are the main limitations of customer survey research?

Survey limitations include self-reporting bias, inability to probe deeper into responses, potential for response fatigue, and dependence on predetermined question options. Surveys can't capture the 'why' behind responses, may miss important context, and are limited by the quality of question design. Response bias can occur when certain customer segments are more likely to participate than others.

Tip: Plan complementary research methods during survey design phase to address known limitations - combine surveys with interviews or behavioral data to get complete understanding.

How reliable are online survey results for business decision-making?

Online survey reliability depends on sample quality, question design, methodology rigor, and appropriate analysis. When conducted with proper statistical methods and representative samples, surveys provide reliable quantitative insights for business decisions. Reliability increases when surveys are used to validate hypotheses generated through other research methods rather than for pure exploration.

Tip: Assess survey reliability by evaluating the provider's methodology documentation, sample representativeness, and statistical analysis capabilities rather than just focusing on sample size.

When should customer surveys be used instead of other research methods?

Use customer surveys when you need quantitative validation, large sample sizes, statistical analysis, cost-effective data collection, or trend measurement over time. Surveys are ideal for confirmatory research, measuring known variables across populations, and prioritizing identified issues. They work best when you have specific hypotheses to test rather than broad exploratory questions.

Tip: Choose surveys when you need to answer 'how much' or 'how many' questions about known issues rather than 'why' or 'what' questions about unknown problems.

How do customer surveys complement other research methods?

Surveys complement qualitative research by quantifying insights, validating findings across larger populations, and providing statistical significance to observed patterns. They work well with focus groups, interviews, and usability testing by measuring the prevalence of issues identified qualitatively. Surveys can also validate personas, prioritize features, and track improvement over time.

Tip: Sequence research methods strategically - use qualitative research to identify what to measure, then use surveys to quantify and validate those insights across your customer base.

What's your approach to designing effective customer survey questions?

Effective survey design starts with clear objectives and specific business questions that need answers. We use closed questions for quantitative analysis, ensure response options are comprehensive and mutually exclusive, and avoid leading or biased language. Question flow is logical, with demographic questions at the end. We pre-test surveys to identify potential issues before full deployment.

Tip: Test your survey with a small group before full launch to identify confusing questions, technical issues, or missing response options - fixes after launch can compromise data quality.

How do you ensure survey questions avoid bias and leading responses?

Bias avoidance requires neutral question wording, balanced response options, randomized answer choices when appropriate, and careful attention to question order effects. We avoid loaded terms, don't suggest desired answers, and include 'don't know' options when appropriate. Question testing with diverse respondents helps identify potential bias before deployment.

Tip: Have people outside your organization review survey questions before launch - internal bias can be difficult to recognize, and fresh perspectives often identify leading language or missing options.
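
As a small illustration of randomized answer choices, the sketch below shuffles the substantive options per respondent while keeping anchored options such as 'Don't know' fixed at the end. The options shown are purely illustrative:

```python
import random

options = ["Price", "Quality", "Support", "Brand reputation"]
anchored = ["Don't know"]  # anchored options keep their position

def randomized_options(options, anchored, seed=None):
    """Return options in a random order, with anchored options appended last."""
    shuffled = options[:]
    random.Random(seed).shuffle(shuffled)
    return shuffled + anchored

print(randomized_options(options, anchored))
```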

What's the optimal length for customer experience surveys?

Survey length depends on audience, topic complexity, and incentives, but generally shorter is better for response rates and data quality. Most customer surveys should take 5-15 minutes to complete. Length is less important than relevance - respondents will complete longer surveys if questions feel valuable and pertinent to their experience. Mobile optimization is crucial for online surveys.

Tip: Test survey completion time with diverse respondents rather than relying on internal estimates - what feels quick to survey designers often feels longer to actual customers.

How do you handle sensitive or personal questions in customer surveys?

Sensitive questions require careful positioning, clear privacy statements, optional response formats, and anonymous data collection when possible. We place sensitive questions later in surveys after rapport is established, explain why information is needed, and provide opt-out options. Data privacy protection is essential for maintaining trust and response quality.

Tip: Consider whether sensitive questions are truly necessary for your research objectives - unnecessary personal questions can damage response rates and customer relationships without adding value.

What's your approach to survey question validation and testing?

Question validation includes cognitive testing with target respondents, pilot testing with small samples, and expert review of question logic and flow. We test for comprehension, response accuracy, completion time, and technical functionality. Validation helps identify confusing questions, missing response options, and potential bias before full deployment.

Tip: Include both survey experts and target customers in validation testing - experts catch methodological issues while customers identify comprehension and relevance problems.

How do you design survey flow and logic for better completion rates?

Survey flow design includes logical question progression, skip logic to avoid irrelevant questions, progress indicators, and mobile-friendly formatting. We start with engaging questions, group related topics together, and end with demographics. Clear instructions and intuitive navigation reduce abandonment rates and improve data quality.

Tip: Map out the complete survey experience from the respondent's perspective before building - consider their context, motivation, and potential frustration points throughout the process.
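
To make the skip-logic idea concrete, here is a minimal sketch that represents question routing as a data structure, so respondents never see irrelevant questions. Question IDs and wording are illustrative only:

```python
# Each question maps answers to the next question; '*' is a catch-all route.
flow = {
    "q1": {"text": "Did you contact support in the last 90 days?",
           "next": {"yes": "q2", "no": "q3"}},
    "q2": {"text": "How satisfied were you with the support you received?",
           "next": {"*": "q3"}},
    "q3": {"text": "How likely are you to recommend us?",
           "next": {"*": None}},  # None marks the end of the survey
}

def next_question(current, answer):
    routes = flow[current]["next"]
    return routes.get(answer, routes.get("*"))

print(next_question("q1", "no"))  # -> 'q3', skipping the support question
```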

What role does survey design play in data quality and analysis?

Survey design directly impacts data quality through question clarity, response option completeness, and logical flow. Well-designed surveys produce cleaner data that requires less post-collection processing. Design choices like question types, scales, and validation rules affect analytical possibilities and statistical validity of results.

Tip: Plan your analysis approach during survey design phase rather than afterward - knowing how you'll analyze data influences question types, response scales, and data collection requirements.

How do you determine appropriate sample sizes for customer surveys?

Sample size depends on population size, desired confidence level, margin of error, and expected response distribution. For customer satisfaction surveys, samples of 300-400 often provide adequate precision for most business decisions. Larger samples are needed for subgroup analysis or when measuring small differences. Statistical power analysis helps determine optimal sample sizes for specific research objectives.

Tip: Balance statistical requirements with practical considerations like budget and timeline - a smaller sample with high response quality often provides better insights than a large sample with poor engagement.
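
For a concrete sense of where figures like 300-400 come from, here is a minimal Python sketch of the standard sample-size formula for estimating a proportion. All inputs are illustrative defaults, not project-specific recommendations:

```python
import math

def sample_size(margin_of_error=0.05, confidence_z=1.96, proportion=0.5):
    """Required sample size to estimate a proportion within a given margin
    of error; proportion=0.5 is the most conservative assumption."""
    return math.ceil((confidence_z ** 2) * proportion * (1 - proportion)
                     / margin_of_error ** 2)

def finite_population_correction(n, population):
    """Reduce the required sample size when the total population is small."""
    return math.ceil(n / (1 + (n - 1) / population))

n = sample_size()  # ~385 at 95% confidence, +/-5% margin of error
print(n, finite_population_correction(n, population=2000))
```

At 95% confidence and a ±5% margin of error, the conservative 50/50 assumption yields roughly 385 respondents; the finite-population correction lowers that figure for small customer bases.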

What's your approach to customer survey sampling and recruitment?

Sampling strategy depends on research objectives and target population characteristics. We use probability sampling when statistical representation is important and convenience sampling for exploratory research. Recruitment methods include customer databases, online panels, social media, and website intercepts. Sample composition should reflect your target market demographics and behaviors.

Tip: Document your sampling approach and any limitations clearly - this helps stakeholders understand how broadly they can apply survey findings to their customer base.

How do you ensure survey samples are representative of our customer base?

Representative sampling requires understanding your customer demographics, using appropriate recruitment channels, and monitoring response patterns for bias. We compare respondent characteristics to known customer data and use quotas or weighting when needed. Post-collection analysis examines potential non-response bias and its impact on findings.

Tip: Compare survey respondent characteristics to your overall customer data early in data collection so you can adjust recruitment strategies if certain segments are underrepresented.
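
As a simple illustration of that comparison, the sketch below contrasts respondent shares with known customer shares; the segment names and figures are hypothetical:

```python
import pandas as pd

# Share of each segment in the survey sample vs. the known customer base
# (e.g. from a CRM export). All values are illustrative.
sample_shares = pd.Series({"enterprise": 0.52, "smb": 0.30, "consumer": 0.18})
customer_shares = pd.Series({"enterprise": 0.35, "smb": 0.40, "consumer": 0.25})

comparison = pd.DataFrame({"sample": sample_shares, "customers": customer_shares})
comparison["gap"] = comparison["sample"] - comparison["customers"]
print(comparison.sort_values("gap"))  # large gaps flag mis-represented segments
```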

What factors affect online survey response rates and data quality?

Response rates are influenced by survey length, topic relevance, incentives, timing, mobile optimization, and relationship with the organization. Data quality depends on question clarity, survey design, and respondent engagement. Clear privacy policies, professional presentation, and meaningful incentives improve both response rates and data quality.

Tip: Monitor response rates and completion patterns during data collection rather than waiting until the end - early patterns often reveal issues that can be addressed to improve overall data quality.

How do you handle non-response bias in customer surveys?

Non-response bias occurs when survey respondents differ systematically from non-respondents. We address this through multiple contact attempts, varied recruitment channels, incentive strategies, and post-collection analysis. When bias is detected, we use weighting techniques or acknowledge limitations in findings interpretation.

Tip: Collect basic information about non-respondents when possible (like demographics or customer type) so you can assess whether your survey sample might be biased in important ways.
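
A minimal sketch of the post-stratification weighting mentioned above, using hypothetical segment shares: each respondent is weighted by population share divided by sample share, so over-represented segments count less and under-represented segments count more.

```python
import pandas as pd

# Hypothetical respondents with a known 'segment' attribute.
respondents = pd.DataFrame(
    {"segment": ["enterprise"] * 52 + ["smb"] * 30 + ["consumer"] * 18}
)
population_shares = {"enterprise": 0.35, "smb": 0.40, "consumer": 0.25}

sample_shares = respondents["segment"].value_counts(normalize=True)
respondents["weight"] = respondents["segment"].map(
    lambda s: population_shares[s] / sample_shares[s]
)
print(respondents.groupby("segment")["weight"].first())
```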

What's your approach to segmentation analysis in customer surveys?

Segmentation analysis examines how survey responses vary across different customer groups based on demographics, behaviors, or attitudes. We design surveys to enable meaningful segmentation and ensure adequate sample sizes within each segment. Analysis identifies segment-specific insights that inform targeted strategies and personalized experiences.

Tip: Plan segmentation analysis during survey design to ensure adequate sample sizes within key segments - post-hoc segmentation often reveals insufficient data for reliable analysis of important customer groups.

How do you validate survey findings through data triangulation?

Data triangulation involves comparing survey findings with other data sources like customer analytics, sales data, support tickets, or qualitative research. This validation helps confirm survey accuracy, identify potential biases, and provide additional context for interpretation. Multiple data sources increase confidence in business decisions based on research findings.

Tip: Identify validation data sources before launching surveys so you can design questions that enable direct comparison with existing business metrics or other research findings.
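
As one small example of triangulation, per-customer survey scores can be correlated with an independent behavioral metric; the data below is hypothetical:

```python
import pandas as pd

# Hypothetical per-customer data joining a satisfaction rating (1-7 scale)
# with support ticket volume from a helpdesk export.
df = pd.DataFrame({
    "satisfaction": [7, 3, 6, 2, 5, 4, 7, 3],
    "tickets_90d":  [0, 5, 1, 6, 2, 4, 1, 3],
})

# A strong negative correlation corroborates the survey measure against
# an independent behavioral source.
print(df["satisfaction"].corr(df["tickets_90d"]).round(2))
```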

What survey platforms and technologies do you recommend?

Platform selection depends on survey complexity, integration requirements, analytics needs, and budget. We evaluate platforms based on question types, logic capabilities, mobile optimization, data security, and analysis tools. Professional platforms offer better reliability, security, and analytical capabilities than basic tools, especially for business-critical research.

Tip: Choose survey platforms based on your specific technical requirements rather than just cost - features like advanced logic, data integration, and security compliance become important for complex business surveys.

How do you ensure survey accessibility and mobile optimization?

Accessibility requires compliance with web accessibility standards, mobile-responsive design, clear navigation, and compatibility with assistive technologies. Mobile optimization includes touch-friendly interfaces, appropriate text sizing, and efficient data loading. Testing across devices and accessibility tools ensures broad participation and data quality.

Tip: Test surveys on multiple devices and screen sizes before launch - mobile experience often differs significantly from desktop testing, and many customers primarily use mobile devices.

What data security and privacy measures do you implement?

Data security includes encrypted data transmission, secure storage, access controls, and compliance with privacy regulations like GDPR or CCPA. We implement anonymization when possible, clear consent processes, and data retention policies. Privacy protection builds respondent trust and ensures legal compliance for business research.

Tip: Verify that survey providers meet your organization's data security requirements before starting projects - data breaches in research can damage customer relationships and create legal liability.

How do you handle survey distribution and invitation management?

Distribution management includes invitation timing, reminder scheduling, response tracking, and multi-channel deployment. We personalize invitations when appropriate, optimize send times, and manage reminder cadences to maximize response while avoiding respondent fatigue. Integration with customer databases enables targeted distribution and response tracking.

Tip: Plan distribution timing around your customers' schedules and preferences rather than internal convenience - response rates often vary significantly based on day of week, time of day, and seasonal factors.

What quality controls do you implement during data collection?

Quality controls include response validation, completion time monitoring, duplicate detection, and data integrity checks. We implement logic checks, required field validation, and anomaly detection to identify potential data quality issues. Real-time monitoring allows for immediate correction of technical problems or survey design issues.

Tip: Monitor survey performance daily during data collection rather than waiting until completion - early detection of quality issues allows for corrections that improve overall data reliability.
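
Two of the checks above, speeder detection and straight-lining detection, can be sketched in a few lines of pandas; the thresholds and data are illustrative, not fixed rules:

```python
import pandas as pd

# Hypothetical responses: completion time plus five 1-7 rating items.
df = pd.DataFrame({
    "seconds": [410, 95, 388, 505, 72],
    "q1": [5, 4, 6, 2, 4], "q2": [6, 4, 5, 3, 4],
    "q3": [4, 4, 6, 2, 4], "q4": [5, 4, 7, 3, 4], "q5": [6, 4, 5, 2, 4],
})
items = ["q1", "q2", "q3", "q4", "q5"]

# Speeders: completed far faster than the median respondent.
df["speeder"] = df["seconds"] < 0.4 * df["seconds"].median()
# Straight-lining: identical answers across all rating items.
df["straightliner"] = df[items].nunique(axis=1) == 1
print(df[df["speeder"] | df["straightliner"]])  # rows flagged for review
```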

How do you manage survey incentives and participant compensation?

Incentive management includes appropriate compensation levels, delivery mechanisms, tax compliance, and fraud prevention. Incentives should motivate participation without biasing responses. We handle incentive distribution, tracking, and reporting while ensuring compliance with relevant regulations and organizational policies.

Tip: Match incentive types and amounts to your target audience characteristics and survey length rather than using standard amounts - what motivates different customer segments can vary significantly.

What backup and contingency plans do you have for technical issues?

Contingency planning includes backup systems, alternative distribution methods, data recovery procedures, and communication protocols for technical failures. We maintain redundant systems, regular data backups, and clear escalation procedures. Technical support is available throughout data collection to address issues quickly.

Tip: Discuss technical support availability and response times during vendor selection - survey technical issues can significantly impact response rates if not resolved quickly.

What's your approach to customer survey data analysis and interpretation?

Analysis begins with data cleaning, validation, and descriptive statistics to understand response patterns. We use appropriate statistical techniques based on data types and research objectives, including cross-tabulation, regression analysis, and segmentation. Interpretation focuses on business implications and actionable insights rather than just statistical findings.

Tip: Ensure analysis plans are defined before data collection begins - this prevents cherry-picking significant results and ensures systematic examination of all research objectives.
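
For illustration, cross-tabulation, one of the techniques named above, takes a single line of pandas; the segments and categories here are hypothetical:

```python
import pandas as pd

# Hypothetical cleaned survey data.
df = pd.DataFrame({
    "segment": ["smb", "enterprise", "smb", "consumer", "enterprise", "smb"],
    "satisfaction": ["high", "low", "high", "medium", "high", "low"],
})

# How satisfaction distributes within each segment (row percentages).
xtab = pd.crosstab(df["segment"], df["satisfaction"], normalize="index")
print(xtab.round(2))
```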

How do you identify statistically significant patterns in survey data?

Statistical significance testing uses appropriate methods based on data types, sample sizes, and research questions. We apply confidence intervals, hypothesis testing, and effect size calculations to distinguish meaningful differences from random variation. Significance testing is combined with practical significance assessment to identify business-relevant findings.

Tip: Focus on practical significance alongside statistical significance - a statistically significant difference may not be meaningful for business decisions if the effect size is too small to matter operationally.
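
A minimal sketch of a two-proportion z-test, one common way to check whether a change between survey waves exceeds random variation; the counts are hypothetical:

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical example: 312 of 400 customers satisfied this quarter
# versus 281 of 400 last quarter.
count = [312, 281]   # satisfied respondents per wave
nobs = [400, 400]    # total respondents per wave

z_stat, p_value = proportions_ztest(count, nobs)
diff = count[0] / nobs[0] - count[1] / nobs[1]
print(f"difference = {diff:.1%}, z = {z_stat:.2f}, p = {p_value:.4f}")
# A small p-value suggests the change is unlikely to be noise; judge the
# 7.8-point difference for practical significance separately.
```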

What techniques do you use for customer segmentation analysis?

Segmentation analysis includes demographic analysis, behavioral clustering, attitudinal grouping, and statistical modeling techniques like factor analysis or cluster analysis. We identify meaningful customer groups based on survey responses and validate segments through additional analysis. Segmentation provides targeted insights for personalized strategies and experience design.

Tip: Validate statistical segments against business intuition and operational feasibility - segments that make statistical sense may not be practically useful if they can't be identified or reached through normal business operations.
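
As a sketch of the clustering step, here is a k-means example on standardized rating items using scikit-learn. The data is synthetic, and in practice the number of clusters is chosen by comparing candidate solutions rather than fixed in advance:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Synthetic attitudinal ratings (rows = respondents, columns = 1-7 items).
rng = np.random.default_rng(0)
ratings = rng.integers(1, 8, size=(200, 6)).astype(float)

# Standardize so no single item dominates the distance metric.
scaled = StandardScaler().fit_transform(ratings)

# Cluster respondents into candidate segments.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(scaled)
print(np.bincount(kmeans.labels_))  # respondents per segment
```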

How do you connect survey insights to business impact and ROI?

Business impact analysis connects survey findings to customer behaviors, financial metrics, and operational outcomes. We examine relationships between satisfaction scores and retention, preference ratings and purchase intent, or experience ratings and loyalty indicators. Impact assessment helps prioritize improvements based on potential business return.

Tip: Link survey questions to existing business metrics during design phase so you can demonstrate clear connections between customer attitudes and business outcomes in your analysis.

What's your process for identifying actionable insights from survey data?

Actionable insight identification involves analyzing data patterns, identifying improvement opportunities, and connecting findings to specific business decisions. We prioritize insights based on impact potential, implementation feasibility, and strategic alignment. Insights are presented with clear recommendations and implementation guidance.

Tip: Involve business stakeholders in insight interpretation sessions rather than just delivering reports - collaborative analysis often reveals practical applications and implementation considerations that pure data analysis might miss.

How do you handle conflicting or unexpected survey findings?

Conflicting findings require careful examination of methodology, sample composition, and question interpretation. We investigate potential causes including response bias, question ambiguity, or genuine market complexity. Unexpected findings often provide valuable insights and may indicate important changes in customer attitudes or market conditions.

Tip: Don't dismiss unexpected findings as methodological errors too quickly - they often reveal important shifts in customer attitudes or previously unknown market segments that can inform strategy.

What visualization and reporting methods do you use for survey results?

Visualization includes charts, graphs, dashboards, and infographics designed for different stakeholder needs. We create executive summaries, detailed analytical reports, and presentation materials tailored to specific audiences. Interactive dashboards enable ongoing exploration of findings and trend monitoring over time.

Tip: Design reports and visualizations for your specific stakeholders' needs and technical comfort levels - executives may need different formats than analysts or operational teams implementing changes.

How can customer surveys inform our overall customer experience strategy?

Customer surveys provide a quantitative foundation for experience strategy by measuring satisfaction, identifying pain points, prioritizing improvements, and tracking progress over time. Using Experience Thinking principles, surveys can measure perceptions across brand, content, product, and service experiences to inform holistic strategy decisions based on actual customer data.

Tip: Use surveys to validate experience strategy priorities identified through qualitative research rather than making strategic decisions based solely on survey data - combined insights provide stronger foundation for strategy.

What's your approach to using surveys for customer journey optimization?

Journey optimization surveys measure satisfaction, effort, and emotions at different touchpoints throughout the customer lifecycle. We identify journey friction points, measure handoff effectiveness, and track journey progression metrics. Survey insights inform journey redesign by quantifying which interactions most impact overall experience and customer behavior.

Tip: Design journey surveys to capture both touchpoint-specific feedback and overall journey evaluation - understanding how individual interactions affect complete journey perception helps prioritize improvement efforts.

How do customer surveys support product development and feature prioritization?

Product development surveys measure feature importance, usage patterns, satisfaction levels, and improvement priorities. We use techniques like MaxDiff analysis, conjoint analysis, and importance-performance mapping to prioritize features based on customer value rather than internal assumptions. Surveys validate product concepts and measure market demand before development investment.

Tip: Focus product surveys on customer value and outcomes rather than just feature preferences - understanding what customers are trying to accomplish provides better product development guidance than measuring feature popularity.
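
A minimal sketch of importance-performance mapping, one of the techniques named above: features rated high in importance but low in performance surface as priorities. Feature names and ratings are hypothetical:

```python
import pandas as pd

# Hypothetical mean ratings per feature (1-7 scales).
features = pd.DataFrame({
    "feature": ["search", "checkout", "support chat", "reporting"],
    "importance":  [6.4, 6.1, 4.2, 5.8],
    "performance": [4.1, 6.0, 5.5, 3.9],
}).set_index("feature")

# Quadrants relative to the mean of each axis: high importance with
# low performance marks the features to fix first.
imp_mid = features["importance"].mean()
perf_mid = features["performance"].mean()
features["priority"] = (features["importance"] > imp_mid) & \
                       (features["performance"] < perf_mid)
print(features.sort_values("priority", ascending=False))
```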

What role do surveys play in brand positioning and messaging strategy?

Brand surveys measure brand awareness, perception, associations, and positioning relative to competitors. We assess message effectiveness, brand personality perceptions, and emotional connections. Survey insights inform positioning strategy, message development, and communication priorities based on quantified customer perceptions and preferences.

Tip: Include competitive context in brand surveys to understand relative positioning rather than absolute performance - customers evaluate brands comparatively, and strategic decisions often depend on competitive differentiation.

How can surveys help identify and validate new market opportunities?

Market opportunity surveys measure unmet needs, willingness to pay, usage contexts, and purchase intent for potential new offerings. We identify underserved segments, quantify market demand, and assess competitive gaps. Survey insights help validate business cases for new products, services, or market entry strategies.

Tip: Design opportunity surveys to measure both need intensity and willingness to act - high need with low purchase intent may indicate pricing, positioning, or capability gaps that affect market viability.

What's your approach to competitive intelligence through customer surveys?

Competitive intelligence surveys measure customer perceptions of competitive alternatives, switching behavior, decision criteria, and satisfaction comparisons. We assess competitive strengths and weaknesses from customer perspectives, identify differentiation opportunities, and track competitive positioning changes over time.

Tip: Include customers of competitive solutions in your surveys, not just your own customers - understanding why people choose alternatives provides competitive insights you can't get from your existing customer base alone.

How do you use survey data for customer retention and loyalty strategy?

Retention surveys identify satisfaction drivers, loyalty predictors, churn risk factors, and improvement priorities that affect customer lifetime value. We measure Net Promoter Score, satisfaction correlations, and retention probability to inform retention strategies. Longitudinal surveys track how satisfaction and loyalty evolve over customer relationships.

Tip: Connect survey loyalty measures to actual customer behavior data when possible - this validates which survey metrics actually predict retention and helps focus improvement efforts on metrics that drive business results.
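
Net Promoter Score itself is a simple calculation, sketched below on hypothetical 0-10 'likelihood to recommend' responses: the share of promoters (9-10) minus the share of detractors (0-6), expressed in percentage points:

```python
import pandas as pd

scores = pd.Series([10, 9, 9, 8, 7, 6, 10, 3, 9, 8, 10, 5])  # hypothetical

promoters = (scores >= 9).mean()   # share scoring 9-10
detractors = (scores <= 6).mean()  # share scoring 0-6
nps = (promoters - detractors) * 100
print(f"NPS = {nps:.0f}")
```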

What's the typical timeline for customer survey research projects?

Survey project timelines typically require 4-8 weeks depending on complexity, sample size, and analysis requirements. This includes survey design and testing (1-2 weeks), data collection (1-3 weeks), and analysis and reporting (1-3 weeks). Complex surveys with advanced analysis or large samples may require longer timelines.

Tip: Build flexibility into survey timelines for potential response rate challenges or data quality issues that might require extended collection periods or methodology adjustments.

How do you structure pricing for customer survey research projects?

Pricing typically includes survey design, platform costs, sample recruitment, data collection management, analysis, and reporting. Costs vary based on sample size, survey complexity, analysis requirements, and incentive levels. We provide transparent pricing that separates different project components for clear understanding of investment allocation.

Tip: Compare survey pricing based on included services and deliverable quality rather than just per-response costs - comprehensive projects with skilled analysis provide better business value than basic data collection.

What deliverables should we expect from survey research projects?

Survey deliverables typically include comprehensive findings reports, executive summaries, data files, statistical analysis documentation, and presentation materials. Depending on project scope, you might also receive dashboard access, segmentation profiles, or trend analysis. All outputs are designed for practical business application.

Tip: Specify deliverable formats based on how different stakeholders will use the research - executives may need different presentations than analysts or operational teams implementing changes.

How involved should our team be in the survey research process?

Team involvement enhances survey quality and ensures business relevance. We recommend participation in objective setting, questionnaire review, target audience definition, and findings interpretation. Your business context combined with our research expertise produces more valuable insights and actionable recommendations.

Tip: Assign team members who understand both customer context and business implications rather than just senior stakeholders - hands-on involvement during design improves survey relevance and actionability.

What factors might affect survey project timelines or outcomes?

Timeline factors include response rate challenges, technical issues, sample recruitment difficulties, seasonal effects, and external events affecting customer availability. We monitor progress continuously and communicate potential delays early. Contingency planning helps address challenges without compromising research quality.

Tip: Consider customer lifecycle timing and business seasonality when scheduling surveys - response rates and answer quality can vary significantly based on when customers are contacted.

How do you handle project changes or additional analysis needs?

Project changes are managed through clear communication about implications for timeline, budget, and outcomes. We evaluate change requests against research objectives and provide options for accommodation. Some changes are better addressed through follow-up surveys rather than mid-project modifications that might compromise data quality.

Tip: Anticipate potential analysis needs during initial project planning rather than requesting additional analysis after data collection - some analytical approaches require specific survey design considerations.

What ongoing support do you provide after survey completion?

Post-project support includes findings clarification, stakeholder presentation assistance, implementation guidance, and consultation on follow-up research needs. We help translate insights into business decisions and provide guidance on tracking improvements over time. Support includes assistance with communicating results across your organization.

Tip: Plan for post-survey implementation support during initial project discussions - having research partners available during implementation helps ensure insights translate into effective business actions.

How can we help you?