Importance Scale Survey Template

Free Word download • Edit online • Save & share with Drive • Export to PDF

1 page · 20–30 min to fill · Difficulty: Standard · Signature required · Legal review recommended

At a glance

What it is
An Importance Scale Survey is a structured questionnaire that asks respondents to rate a defined set of attributes, features, or priorities on a numeric or descriptive scale — typically 1 to 5 or 1 to 10 — to identify what matters most to a target population. This free Word download gives you a professionally formatted, editable template you can adapt for employee feedback, customer research, product prioritization, or regulatory compliance assessments, and export as PDF for distribution or filing.
When you need it
Use it when you need quantifiable, defensible data on stakeholder priorities before making a business decision — product roadmap planning, compensation benchmarking, supplier evaluation, or compliance gap analysis. It is also appropriate when regulations or contractual obligations require documented evidence that stakeholder input was formally gathered and recorded.
What's inside
A clear purpose and instructions section, respondent identification and consent fields, a scaled rating matrix covering each attribute or feature under review, an open-ended comments section, a confidentiality and data use disclosure, and a respondent signature block confirming informed participation.

What is an Importance Scale Survey?

An Importance Scale Survey is a structured questionnaire that asks respondents to rate a defined set of attributes, features, or criteria on a numeric or descriptive scale — typically 1 (not important) to 5 (extremely important) — to generate ranked, quantifiable evidence of what a target population values most. Unlike open-ended feedback forms, an importance scale survey produces ordinal data that can be aggregated, segmented, and compared across respondent groups, making it a standard tool for product prioritization, employee benefits benchmarking, supplier evaluation, and regulatory compliance documentation. When properly structured with a consent disclosure, a defined data retention policy, and a signed respondent acknowledgment, it also functions as a legally defensible record of stakeholder input.

Why You Need This Document

Without a formally structured importance scale survey, priority decisions default to the loudest voice in the room — or to assumptions that go untested until they cost real money. A product team that ships features based on informal feedback rather than ranked importance data routinely builds for the vocal minority rather than the majority of customers. An HR department that redesigns compensation without measured employee importance data risks investing in benefits that do not drive retention. Beyond strategic value, organizations that collect personal data through surveys without documented consent, a stated retention period, and a clear data use disclosure are exposed to GDPR, PIPEDA, and CCPA enforcement risk — even for internal surveys distributed only to employees. This template gives you the structured format, consent language, rating matrix, and signature blocks needed to collect importance data that is both analytically reliable and legally defensible, without the cost of building a custom survey instrument from scratch.

Which variant fits your situation?

If your situation is… | Use this template
Measuring employee satisfaction and workplace priorities | Employee Satisfaction Survey
Gathering post-purchase or post-service customer feedback | Customer Satisfaction Survey
Evaluating supplier or vendor criteria by weighted importance | Supplier Evaluation Form
Rating product or service features for a new launch | Product Feedback Survey
Assessing compliance requirements by risk priority | Risk Assessment Questionnaire
Collecting ranked stakeholder input for a strategic planning process | Stakeholder Analysis Template
Measuring training needs by importance and urgency | Training Needs Assessment

Common mistakes to avoid

❌ Placing consent disclosure after the rating questions

Why it matters: Under GDPR, PIPEDA, and equivalent frameworks, consent must be obtained before data is collected — not after. Post-hoc consent is generally invalid and exposes the organization to regulatory penalties.

Fix: Move the consent and data use disclosure to the top of the survey, before any rating questions, and require an acknowledgment before the respondent proceeds.

❌ Using an ambiguous or unlabeled scale

Why it matters: A scale labeled only at endpoints (1 and 5, with nothing in between) is interpreted differently by each respondent, making aggregate importance rankings statistically unreliable.

Fix: Label every point on the scale with a plain-English descriptor. For a 5-point scale: 1 = Not Important, 2 = Slightly Important, 3 = Moderately Important, 4 = Very Important, 5 = Extremely Important.
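To make the labeling concrete, here is a minimal Python sketch (illustrative only, not part of the template) that defines a fully labeled 5-point scale and renders the label line exactly as it would appear on the form:

```python
# A fully labeled 5-point importance scale. The names here are
# illustrative, not part of the downloadable template.
IMPORTANCE_SCALE = {
    1: "Not Important",
    2: "Slightly Important",
    3: "Moderately Important",
    4: "Very Important",
    5: "Extremely Important",
}

def render_scale(scale: dict) -> str:
    """Render the scale as the single label line shown on the survey form."""
    return " | ".join(f"{point} = {label}" for point, label in sorted(scale.items()))

print(render_scale(IMPORTANCE_SCALE))
```

Keeping the scale in one place like this also guarantees every attribute row uses identical labels.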

❌ Collecting surplus personal data without justification

Why it matters: Requesting fields like home address, age, or personal email that are not needed for the analysis violates data-minimization principles under GDPR Article 5(1)(c) and PIPEDA Principle 4.4.

Fix: Before finalizing the respondent identification section, confirm that every data field collected is strictly necessary to answer the business question driving the survey.

❌ Omitting the administrator certification block

Why it matters: Without a certified chain of custody, survey results used in employment decisions, vendor selections, or regulatory filings can be challenged as unreliable or tampered with.

Fix: Include an administrator certification section and complete it before results are shared with any decision-maker. File the certified copy alongside the aggregated data.

❌ Framing attributes as double-barreled questions

Why it matters: An attribute like 'price and delivery speed' forces the respondent to rate two different things simultaneously — the resulting score cannot be meaningfully interpreted for either dimension.

Fix: Split every double-barreled attribute into two separate items. Review the final attribute list and flag any item containing the words 'and' or 'or' as a candidate for splitting.
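The flagging step above is easy to automate. A short Python sketch (an assumption about your workflow, not part of the template) that scans an attribute list for whole-word 'and'/'or':

```python
import re

def flag_double_barreled(attributes):
    """Return attributes containing 'and' or 'or' as whole words --
    candidates for splitting into two separate items."""
    pattern = re.compile(r"\b(and|or)\b", re.IGNORECASE)
    return [a for a in attributes if pattern.search(a)]

# Hypothetical attribute list for illustration.
candidates = flag_double_barreled([
    "Price",
    "Price and delivery speed",   # double-barreled: split into two items
    "Return policy or warranty",  # double-barreled
    "Brand trust",
])
print(candidates)
```

Flagged items still need human review — some, like "terms and conditions", are a single concept and should not be split.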

❌ No stated data retention or deletion policy

Why it matters: Indefinite retention of personal data collected via surveys is explicitly non-compliant under GDPR Article 5(1)(e), the UK Data Protection Act 2018, and PIPEDA — and creates unnecessary liability.

Fix: Set a specific retention period in the survey document itself and configure your data storage system to auto-delete or flag responses for review at that interval.
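As a sketch of the auto-flag step — assuming a 24-month retention period and a stored collection date per response, neither of which is prescribed by the template itself:

```python
from datetime import date, timedelta

RETENTION_DAYS = 730  # example: the 24-month period stated in the survey document

def past_retention(collected_on, today=None):
    """True when a response has exceeded the stated retention period
    and should be deleted or flagged for review."""
    today = today or date.today()
    return today - collected_on > timedelta(days=RETENTION_DAYS)

# A response collected ~25 months ago is flagged; one from last week is not.
print(past_retention(date(2023, 1, 15), today=date(2025, 3, 1)))  # True
print(past_retention(date(2025, 2, 20), today=date(2025, 3, 1)))  # False
```

In practice this check would run as a scheduled job against wherever responses are stored, with deletions logged for the audit trail.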

The 10 key clauses, explained

Survey Purpose and Scope

In plain language: States what the survey is measuring, who commissioned it, and how the results will be used — establishing the context for respondents and the legal basis for data collection.

Sample language
This survey is conducted by [ORGANIZATION NAME] to assess the relative importance of [SUBJECT MATTER] among [TARGET RESPONDENT GROUP]. Results will be used solely for [PURPOSE — e.g., product planning / HR benchmarking / vendor evaluation] and will not be disclosed to third parties without prior written consent.

Common mistake: Describing the purpose so broadly that respondents cannot assess what they are consenting to. Vague scope statements undermine informed consent and can expose the organization to data-protection challenges.

Respondent Identification

In plain language: Captures identifying information — name, role, department, or demographic category — to the extent needed to segment and analyze results, while specifying what is optional versus required.

Sample language
Full Name: [RESPONDENT NAME] (optional) | Role / Title: [TITLE] | Department / Group: [DEPARTMENT] | Date Completed: [DATE]

Common mistake: Collecting more identifying information than the analysis requires. Unnecessary personal data increases GDPR and PIPEDA compliance exposure and can deter honest responses.

Informed Consent and Data Use Disclosure

In plain language: Notifies respondents of their rights — including the right to withdraw, remain anonymous, or request deletion of their data — and obtains explicit acknowledgment before responses are recorded.

Sample language
By completing this survey, you acknowledge that: (a) your participation is voluntary; (b) your responses will be processed by [ORGANIZATION NAME] for [PURPOSE]; (c) data will be retained for [RETENTION PERIOD]; and (d) you may withdraw at any time by contacting [CONTACT DETAILS].

Common mistake: Burying the consent statement in small print at the end of the survey. Consent obtained after responses are recorded may not be valid under GDPR Article 7 or equivalent frameworks.

Importance Rating Matrix

In plain language: The core section listing each attribute, feature, or criterion to be rated, with a clearly labeled scale — e.g., 1 = Not Important, 5 = Extremely Important — applied consistently across all items.

Sample language
Please rate the importance of each item below using the scale: 1 = Not Important | 2 = Slightly Important | 3 = Moderately Important | 4 = Very Important | 5 = Extremely Important. [ATTRIBUTE 1]: 1 / 2 / 3 / 4 / 5. [ATTRIBUTE 2]: 1 / 2 / 3 / 4 / 5.

Common mistake: Using an even-numbered scale (e.g., 1–4) to force a directional response. This increases response bias and reduces data validity, particularly when genuine neutrality is a valid stakeholder position.

Instructions and Completion Guidelines

In plain language: Explains how to complete the rating matrix, how long the survey is expected to take, the deadline for submission, and the method of return or submission.

Sample language
Instructions: Rate each attribute using the 5-point scale above. Select only one rating per item. Estimated time: [X] minutes. Please return this completed survey to [SUBMISSION METHOD] by [DATE].

Common mistake: Omitting a deadline for submission. Without a deadline, response collection drags on indefinitely, making it difficult to aggregate results or meet decision-making timelines.

Open-Ended Comments Section

In plain language: Provides space for respondents to elaborate on their ratings, flag missing attributes, or provide qualitative context that the scale alone cannot capture.

Sample language
Please provide any additional comments on the items above, including attributes not listed that you consider important: [OPEN TEXT FIELD]

Common mistake: Making the comments field mandatory. Forced open-ended responses produce low-quality, perfunctory answers and reduce overall completion rates.

Confidentiality and Non-Disclosure Obligations

In plain language: Specifies whether responses are confidential or anonymous, who within the organization will see individual-level data, and any restrictions on respondents sharing the survey contents externally.

Sample language
Responses are treated as confidential and will be reviewed only by [AUTHORIZED RECIPIENTS — e.g., HR leadership / the project team]. Individual responses will not be attributed to respondents in any report. Survey contents are [CONFIDENTIAL / NOT FOR EXTERNAL DISTRIBUTION].

Common mistake: Claiming responses are anonymous when individual-level data is actually accessible to line managers. If management can see individual responses, the survey is confidential but not anonymous — conflating the two undermines trust and can invalidate results.

Data Retention and Deletion Policy

In plain language: States how long completed survey data will be retained, in what format, and the process for requesting deletion — a requirement under GDPR, PIPEDA, and similar frameworks.

Sample language
Completed survey data will be retained for [RETENTION PERIOD — e.g., 24 months] from the date of collection, after which it will be securely deleted. To request early deletion of your individual response, contact [DATA CONTROLLER CONTACT].

Common mistake: No stated retention period at all. Indefinite retention of personal data collected through surveys is non-compliant under GDPR Article 5(1)(e) and equivalent provisions in Canadian and UK law.

Respondent Signature and Date

In plain language: A signature block confirming the respondent has read the instructions and consent disclosure, completed the survey honestly, and authorizes the stated use of their responses.

Sample language
I confirm that I have read and understood the purpose of this survey and consent to the use of my responses as described above. Signature: [SIGNATURE] | Printed Name: [NAME] | Date: [DATE]

Common mistake: Treating the signature block as optional for internal surveys. Without a signed acknowledgment, the organization has no documented evidence of informed consent if a privacy complaint or employment dispute arises.

Administrator Certification

In plain language: A field for the survey administrator to certify that the survey was conducted in accordance with the stated purpose, that no responses were altered, and that data was securely collected.

Sample language
I certify that this survey was administered in accordance with [ORGANIZATION NAME]'s data collection policies and that responses have not been modified. Administrator: [NAME] | Title: [TITLE] | Date: [DATE] | Signature: [SIGNATURE]

Common mistake: Omitting administrator certification entirely. Without it, there is no chain-of-custody record for the data — which matters if survey results are used in employment decisions, vendor selections, or regulatory submissions.

How to fill it out

  1.

    Define the survey purpose and respondent group

    Enter a specific, one-sentence statement of what the survey is measuring and who is being asked to complete it. Avoid vague language like 'to improve our business' — be precise about the decision the data will inform.

    💡 A clearly stated purpose reduces the time respondents spend trying to interpret questions and measurably improves response quality.

  2.

    Select the rating scale and label each point

    Choose a 5-point or 10-point scale and write out a plain-English label for every point on the scale — not just the endpoints. Apply the same scale consistently to every attribute in the matrix.

    💡 Odd-numbered scales (5 or 7 points) outperform even-numbered ones for importance measurement because they allow genuine neutrality — a valid response in most business contexts.

  3.

    List all attributes in the rating matrix

    Enter each feature, criterion, or factor to be rated as a separate row. Keep attribute labels to 10 words or fewer — longer descriptions introduce ambiguity that inflates variance in responses.

    💡 Pilot the attribute list with two or three internal reviewers before distributing. Attributes that generate questions in the pilot will generate inconsistent ratings in the field.

  4.

    Complete the consent and data use disclosure

    Fill in the organization name, the specific purpose of data use, the retention period, and the contact details for data deletion requests. Make sure this section appears before the rating matrix, not after.

    💡 If the survey is distributed in the EU or to EU residents, confirm that the legal basis for processing (consent, legitimate interest, or contractual necessity) is explicitly stated.

  5.

    Set the submission deadline and method

    Enter a specific calendar date as the response deadline and the exact submission method — email to a named address, printed return to a named person, or an online form link.

    💡 A deadline 10–14 days from distribution is standard for internal surveys. Shorter windows reduce completions; longer windows delay the decisions the data is meant to inform.

  6.

    Add the respondent signature block

    Include the signature, printed name, and date fields at the end of the survey. For digital distribution, enable an e-signature field or require a typed-name acknowledgment of the consent terms.

    💡 For employment-related surveys, keep signed copies on file for the same retention period as the underlying HR records — not just the survey data retention period.

  7.

    Complete the administrator certification

    Once all responses are collected, have the survey administrator sign and date the certification block confirming the survey was conducted as intended and that no responses were altered.

    💡 File the completed administrator certification with the aggregated results — this is the audit trail that protects the organization if survey outcomes are challenged.

  8.

    Aggregate results and document methodology

    Calculate weighted averages for each attribute and document the methodology — number of respondents, response rate, and any demographic segmentation — in a summary memo attached to the survey file.

    💡 A response rate below 40% materially reduces the reliability of importance rankings. Document the rate and note any known sources of non-response bias before presenting results to decision-makers.
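The aggregation step can be sketched in a few lines of Python. This example uses a simple unweighted mean per attribute — weighting by respondent group, as the template's glossary describes, would extend the same structure — and the response data shown is hypothetical:

```python
from statistics import mean

# Hypothetical responses: each maps attribute -> rating on the 1-5 scale.
responses = [
    {"Price": 5, "Delivery speed": 3, "Return policy": 4},
    {"Price": 4, "Delivery speed": 2, "Return policy": 5},
    {"Price": 5, "Delivery speed": 4, "Return policy": 3},
]
invited = 6  # how many people the survey was distributed to

# Mean importance per attribute, ranked highest first.
attributes = list(responses[0])
ranked = sorted(
    ((attr, mean(r[attr] for r in responses)) for attr in attributes),
    key=lambda item: item[1],
    reverse=True,
)
response_rate = len(responses) / invited  # document this in the summary memo

for attr, score in ranked:
    print(f"{attr}: {score:.2f}")
print(f"Response rate: {response_rate:.0%}")
```

Documenting the response rate alongside the ranking makes the below-40% reliability caveat impossible to overlook when results reach decision-makers.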

Frequently asked questions

What is an importance scale survey?

An importance scale survey is a structured questionnaire that asks respondents to rate a set of attributes, features, or criteria on a numeric scale — typically 1 to 5 or 1 to 10 — to determine what matters most to a specific population. Businesses use importance scale surveys for product prioritization, employee engagement analysis, vendor evaluation, and compliance documentation. The resulting data produces ranked, quantifiable evidence of stakeholder priorities that can inform decisions and satisfy audit requirements.

When should I use an importance scale survey instead of a general feedback form?

Use an importance scale survey when you need to rank competing priorities rather than simply collect open-ended opinions. If you have a defined list of attributes — product features, benefit options, supplier criteria — and need to know which matter most to stakeholders, a scaled rating matrix produces actionable ranked data. A general feedback form is appropriate when you are still discovering what attributes matter; a scale survey is appropriate when you already know what to measure and need to quantify relative importance.

What rating scale should I use — 5-point or 10-point?

A 5-point scale is generally sufficient for most business importance surveys and produces cleaner, more consistent results because respondents can reliably distinguish five levels of importance. A 10-point scale is more appropriate when fine-grained discrimination between attributes is analytically important — for example, in conjoint analysis or advanced market research. Avoid even-numbered scales (4 or 6 points) for importance measurement because they force a directional response and suppress genuine neutral positions.

How do I ensure survey responses are GDPR-compliant?

To comply with GDPR when running an importance scale survey, confirm the following: the legal basis for processing personal data is documented (typically consent or legitimate interest); the consent disclosure appears before any data is collected; only data strictly necessary for the analysis is requested; the retention period is defined and enforced; respondents are informed of their right to access, correct, or delete their data; and the data controller is identified by name and contact details. For surveys distributed to EU residents from outside the EU, these requirements apply regardless of where the organization is based.

Can an importance scale survey be used as evidence in an employment dispute?

Yes, provided it was administered correctly. Survey results used in employment decisions — such as compensation benchmarking, benefit restructuring, or performance criterion weighting — can be challenged by employees if the survey was not conducted with documented consent, if the methodology was inconsistent, or if individual responses were accessible to direct managers when anonymity was promised. Maintaining signed consent records, an administrator certification, and a clear data-handling policy significantly strengthens the evidentiary value of the results.

What is the difference between an importance scale survey and a Likert scale survey?

A Likert scale measures agreement with a statement — typically on a scale from 'Strongly Disagree' to 'Strongly Agree.' An importance scale measures how much a respondent values a specific attribute — typically from 'Not Important' to 'Extremely Important.' Both are ordinal scales and share similar design principles, but they answer different questions. Use a Likert scale when you want to gauge attitudes or satisfaction; use an importance scale when you want to rank priorities for resource allocation or feature planning.

How many attributes should an importance scale survey include?

Between 8 and 20 attributes is the practical range for most business importance surveys. Fewer than 8 attributes rarely justifies the survey format over a direct conversation. More than 20 attributes produces respondent fatigue, which increases satisficing — selecting the same rating for all items without genuinely evaluating each — and reduces data reliability. If you have more than 20 candidate attributes, consider running a preliminary open-ended phase to narrow the list before distributing the scaled survey.

Who should administer an importance scale survey?

For internal HR or employee surveys, an HR director or a neutral third-party administrator reduces social desirability bias. For customer or market research surveys, a product manager, researcher, or external agency can administer without conflict of interest. For compliance or regulatory surveys, a compliance officer or external auditor should administer and certify the results. In all cases, the administrator should complete the certification block and be available to respond to respondent questions about data use.

How this compares to alternatives

vs Customer Satisfaction Survey

A customer satisfaction survey measures how satisfied respondents are with an experience that has already occurred — rating performance against a baseline. An importance scale survey measures what respondents value most before or independent of any transaction. Use a satisfaction survey to evaluate past performance; use an importance scale survey to prioritize future investment or changes.

vs Employee Satisfaction Survey

An employee satisfaction survey asks staff to rate how happy they are with current workplace conditions — compensation, culture, management. An importance scale survey asks staff to rate how much each workplace factor matters to them. The two complement each other: importance data tells you where to invest, and satisfaction data tells you where you are currently falling short.

vs Risk Assessment Questionnaire

A risk assessment questionnaire identifies and rates the severity and likelihood of specific risks to the organization. An importance scale survey rates stakeholder priorities for features, criteria, or attributes. Both produce ranked data, but a risk assessment is oriented toward threat quantification and mitigation planning, while an importance survey is oriented toward resource allocation and decision support.

vs Stakeholder Analysis Template

A stakeholder analysis maps who your stakeholders are, their level of influence, and their likely stance on a project. An importance scale survey quantifies what those stakeholders value most within a defined set of options. Use the stakeholder analysis to identify who to survey; use the importance scale survey to gather structured, ranked data from them.

Industry-specific considerations

Technology / SaaS

Product teams use importance scale surveys to rank backlog features by customer priority before each planning cycle, reducing roadmap disputes with data rather than opinion.

Human Resources

HR departments administer importance scale surveys to measure which compensation components, benefits, and workplace factors drive retention, informing annual rewards benchmarking.

Healthcare

Patient experience teams use importance scale surveys to prioritize service improvements and generate documented evidence of patient input for accreditation bodies and regulatory audits.

Retail / E-commerce

Category managers use importance scale surveys to rank purchasing criteria — price, delivery speed, return policy, and brand trust — before redesigning the checkout experience or loyalty program.

Professional Services

Consultants facilitate importance scale surveys with client stakeholder groups at the start of strategic engagements to surface conflicting priorities and build consensus on decision criteria.

Manufacturing

Procurement teams use importance scale surveys to weight supplier evaluation criteria — quality, lead time, cost, and compliance — before issuing RFPs, ensuring selection decisions are documented and defensible.

Jurisdictional notes

United States

No single federal law governs business-to-business importance surveys, but surveys collecting personal data from California residents are subject to the CCPA, which requires a privacy notice and opt-out rights. Employment-related surveys must comply with EEOC guidelines — avoid demographic questions that could constitute evidence of discriminatory intent. Sector-specific rules apply in healthcare (HIPAA) and financial services.

Canada

PIPEDA and provincial equivalents (including Quebec's Law 25) govern the collection of personal information through surveys. Quebec's Law 25, in force since September 2023, requires explicit consent, a stated purpose, and a designated privacy officer for any collection of personal information from Quebec residents. Federally regulated employers must also consider the Canada Labour Code when using survey results in employment decisions.

United Kingdom

The UK GDPR and Data Protection Act 2018 apply to any survey collecting personal data from UK residents, regardless of where the organization is based. Consent must be freely given, specific, and documented before data is collected. Employee surveys in the UK are also subject to guidance from the ICO on monitoring and data use in the employment context — results used in performance or compensation decisions require a documented lawful basis.

European Union

GDPR Article 6 requires a documented lawful basis for processing survey data. For most business surveys, this is either consent (Article 6(1)(a)) or legitimate interest (Article 6(1)(f)). Consent must be obtained before any data is collected and must be as easy to withdraw as to give. Surveys distributed to employees in France, Germany, or the Netherlands may also require works council consultation before distribution, particularly when results will inform HR decisions.

Template vs lawyer — what fits your deal?

Path | Best for | Cost | Time
Use the template | Internal product, HR, or market research surveys where personal data collection is minimal and results are used for planning only | Free | 30–60 minutes to customize and distribute
Template + legal review | Surveys collecting personal data from employees or customers, or where results will inform employment decisions or regulatory submissions | $200–$600 for a privacy counsel or HR legal review | 2–5 business days
Custom drafted | Regulated industries (healthcare, financial services), cross-border surveys subject to GDPR and PIPEDA simultaneously, or surveys used as formal evidence in legal or compliance proceedings | $800–$3,000+ | 1–3 weeks

Glossary

Importance Scale
A numeric or descriptive rating system — such as 1 (not important) to 5 (extremely important) — used to quantify how much a respondent values a given attribute.
Likert Scale
A symmetric rating scale, typically with 5 or 7 response options, classically measuring agreement with a statement ('Strongly Disagree' to 'Strongly Agree'); widely used to gauge attitudes in survey research.
Respondent Consent
A documented acknowledgment by the survey participant confirming they understand how their responses will be collected, stored, and used.
Attribute
A specific feature, criterion, or factor being rated in the survey — for example, 'price,' 'delivery speed,' or 'technical support quality.'
Weighted Average
A calculation that assigns different levels of significance to each rating, allowing higher-priority respondent groups to influence aggregate scores proportionally.
Data Controller
The organization or individual that determines the purposes and means of processing personal data collected through the survey, as defined under GDPR and similar privacy frameworks.
Anonymization
The process of removing or masking identifying information from survey responses so individual respondents cannot be identified from the data.
Response Bias
A systematic distortion in survey results caused by respondents answering in ways they believe are expected or socially desirable rather than accurately.
Informed Participation
The principle that respondents must understand the purpose of the survey, how data will be used, and any consequences of participation before completing it.
Rating Matrix
A table format in which rows list attributes and columns represent scale values, allowing respondents to rate multiple items consistently in a single view.
Ordinal Data
Survey data where responses indicate rank order — such as importance ratings — but the intervals between values are not necessarily equal.

Part of your Business Operating System

This document is one of 3,000+ business & legal templates included in Business in a Box.

  • Fill-in-the-blanks — ready in minutes
  • 100% customizable Word document
  • Compatible with all office suites
  • Export to PDF and share electronically

Create your document in 3 simple steps.

From template to signed document — all inside one Business Operating System.
1
Download or open template

Access 3,000+ business and legal templates for any business task, project or initiative.

2
Edit and fill in the blanks with AI

Customize your ready-made business document template and save it in the cloud.

3
Save, Share, Send, Sign

Share your files and folders with your team. Create a space of seamless collaboration.

Save time, save money, and create top-quality documents.

★★★★★

"Fantastic value! I'm not sure how I'd do without it. It's worth its weight in gold and paid back for itself many times."

Robert Whalley
Managing Director, Mall Farm Proprietary Limited
★★★★★

"I have been using Business in a Box for years. It has been the most useful source of templates I have encountered. I recommend it to anyone."

Dr Michael John Freestone
Business Owner
★★★★★

"It has been a life saver so many times I have lost count. Business in a Box has saved me so much time and as you know, time is money."

David G. Moore Jr.
Owner, Upstate Web

Run your business with a system — not scattered tools

Stop downloading documents. Start operating with clarity. Business in a Box gives you the Business Operating System used by over 250,000 companies worldwide to structure, run, and grow their business.

Free Forever Plan · No credit card required