Website Rating Template

Free Word download • Edit online • Save & share with Drive • Export to PDF

2 pages • 20–30 min to fill • Difficulty: Standard • Signature required • Legal review recommended

At a glance

What it is
A Website Rating is a structured legal document used to formally assess, score, and record the performance, usability, compliance, and content quality of a website against defined criteria. This free Word download provides an editable framework that organizations, agencies, and auditors can customize with their own rating scales, criteria weights, and findings — then export as PDF for delivery to clients, stakeholders, or regulatory bodies.
When you need it
Use it when a client engages you to audit or evaluate their website, when internal compliance teams must document digital accessibility or data protection standards, or when procuring a new website vendor and scoring competing proposals against objective benchmarks.
What's inside
Parties and scope of evaluation, rating criteria and scoring methodology, performance and technical assessment, content and usability review, legal and regulatory compliance findings, weighted scores and summary rating, remediation recommendations, and signatures confirming acceptance of the evaluation findings.

What is a Website Rating?

A Website Rating is a formal, signed document that scores a website against a defined set of criteria — covering technical performance, content quality, usability, and legal compliance — using a structured methodology and weighted scoring system. It records conditions at a specific point in time, assigns an overall composite score, and documents remediation recommendations for any deficiencies found. Unlike an informal internal audit, a website rating is executed by both the evaluating party and the client, creating a legally accepted record of findings that can support contract milestones, regulatory compliance evidence, or dispute resolution.

Why You Need This Document

Without a signed website rating, an agency or consultant delivering a web audit has no documented record that the client received, reviewed, and accepted the findings — leaving payment disputes, scope disagreements, and liability claims to be resolved on the basis of email threads alone. For organizations with regulatory obligations — GDPR cookie compliance, WCAG accessibility, HIPAA-aligned data handling — an undocumented evaluation provides no audit trail if a regulator or litigant demands evidence of due diligence. A properly executed website rating closes these gaps: it establishes agreed scope before work begins, records objective findings with a reproducible methodology, caps the evaluator's liability to the fee paid, and creates a baseline against which future assessments can be measured. This template gives you a structured starting point you can customize to your scoring criteria and deliver as a professional, defensible document.

Which variant fits your situation?

If your situation is… | Use this template
Auditing a client website for WCAG 2.1 accessibility compliance | Website Accessibility Audit Report
Evaluating a website as part of a digital due diligence process | Digital Due Diligence Checklist
Rating a website's SEO performance against defined benchmarks | SEO Audit Report Template
Scoring multiple vendor websites during a procurement process | Vendor Evaluation Form
Documenting website changes required after an audit | Website Project Scope of Work
Formal agreement governing ongoing website management and maintenance | Website Maintenance Agreement
Assessing a newly built website before launch sign-off | Website Launch Checklist

Common mistakes to avoid

❌ Undefined scope leading to scope creep disputes

Why it matters: Without an agreed list of pages and functions, clients routinely claim the audit was incomplete and withhold payment or demand re-evaluation at no charge.

Fix: Attach a signed Schedule B listing every in-scope URL before the evaluation begins, and include an explicit exclusion list for anything the client asked about but was not included.

❌ Rating scale with no rubric

Why it matters: A score of 3 out of 5 on 'content quality' is indefensible without a written description of what distinguishes each score level — the client will always argue their site deserved a higher mark.

Fix: Define each numeric level for every criterion in Schedule A before the evaluation starts, and have the client sign off on the rubric.

❌ Omitting a liability cap clause

Why it matters: An evaluator who misses a GDPR non-compliance issue could face a claim that the missed finding caused regulatory fines — which can reach 4% of global annual turnover under EU law.

Fix: Include an explicit liability cap limiting the evaluator's exposure to the fee paid for the rating, and add a disclaimer that the rating reflects conditions at a point in time only.

❌ Only the evaluator signs the document

Why it matters: A unilaterally signed report does not constitute acceptance of findings — without the client's signature, disputes about delivery, content, and acknowledgment cannot be resolved on the document alone.

Fix: Require the client's authorized representative to sign the acceptance block before the report is considered final and delivered.

❌ Conflating objective findings with subjective opinions in scored criteria

Why it matters: Mixing 'zero broken links found' with 'the homepage feels cluttered' in the same scored field undermines the document's credibility if it is disputed in a legal or contractual context.

Fix: Separate objective, measurable findings from qualitative observations in the document structure — score only objective criteria and include qualitative notes in a separate, non-scored commentary section.

❌ No date-of-evaluation statement for time-sensitive findings

Why it matters: A website can change significantly within days — without a clear evaluation date, a client can argue the findings are stale or never applied to their site.

Fix: State the exact evaluation date on the cover and in the acceptance clause, and include a note that findings are valid only as of that date and are not a guarantee of future compliance.

The 10 key clauses, explained

Parties, purpose, and date

In plain language: Identifies the evaluator and the website owner by their full legal names, states the purpose of the rating, and records the date the evaluation was commissioned.

Sample language
This Website Rating is entered into on [DATE] between [EVALUATOR LEGAL NAME] ('Evaluator') and [CLIENT LEGAL NAME] ('Client'). The purpose of this document is to provide a formal evaluation of the website located at [URL] ('Website') against the criteria set out herein.

Common mistake: Identifying the evaluator by trade name rather than registered legal entity — if a dispute arises over findings, the contracting party must match the entity that provided the service.

Scope of evaluation

In plain language: Defines exactly which pages, subsections, and functions of the website were included in the rating and lists any areas that were explicitly excluded.

Sample language
This evaluation covers the following pages and sections: [LIST]. The following areas are excluded from this rating: [EXCLUSIONS]. Any page or function not listed above is outside the scope of this document.

Common mistake: Leaving scope undefined so the client later claims the audit should have covered the entire site — including sections the evaluator never accessed or agreed to assess.

Rating criteria and methodology

In plain language: Sets out the specific criteria being scored, the rating scale used (e.g., 1–5 or 0–100), the weight assigned to each criterion, and the source standards referenced.

Sample language
Each criterion listed in Schedule A is rated on a scale of [1–5 / 0–100]. Criteria are weighted as set out in Schedule A. Evaluation methodology follows [STANDARD — e.g., WCAG 2.1 AA / Google Core Web Vitals / ISO 25010].

Common mistake: Using a rating scale without defining what each score level means — a score of 3 out of 5 is meaningless without a rubric describing what distinguishes a 3 from a 2 or a 4.

Technical performance assessment

In plain language: Documents scores for load speed, mobile responsiveness, uptime, security (HTTPS, SSL), and Core Web Vitals measurements taken at the time of evaluation.

Sample language
Largest Contentful Paint: [X]s (Target: <2.5s) — Score: [X/5]. Mobile responsiveness: [PASS/FAIL]. SSL certificate: [VALID/EXPIRED]. Uptime over 30 days: [X]%.

Common mistake: Recording a single snapshot measurement for load speed without noting the testing tool, geographic location, and device type — results vary significantly across these variables.

Content and usability review

In plain language: Records the evaluator's assessment of navigation clarity, content accuracy, readability, call-to-action effectiveness, and user experience against the agreed criteria.

Sample language
Navigation clarity: [X/5] — [FINDING]. Readability score (Flesch-Kincaid): [X]. CTAs present on key landing pages: [YES/NO]. Identified usability issues: [LIST OR 'NONE IDENTIFIED'].

Common mistake: Combining objective findings (broken links count) with subjective opinions (design is outdated) in the same scored criterion without distinguishing them — conflating the two undermines the document's credibility in a dispute.

Legal and regulatory compliance findings

In plain language: Documents whether the website meets applicable legal requirements including cookie consent, privacy policy, accessibility standards, and jurisdiction-specific disclosures.

Sample language
Cookie consent mechanism: [COMPLIANT/NON-COMPLIANT — DETAIL]. Privacy policy present and dated: [YES/NO]. WCAG 2.1 AA compliance level: [PASS/PARTIAL/FAIL]. Required legal disclosures for [JURISDICTION]: [PRESENT/ABSENT — LIST].

Common mistake: Checking for the presence of a privacy policy without verifying its content covers the data types the website actually collects — a policy that doesn't match real data practices is a compliance deficiency.

Summary weighted score and overall rating

In plain language: Calculates the composite score across all criteria using the agreed weights and assigns an overall rating category — for example, Excellent, Satisfactory, Needs Improvement, or Non-Compliant.

Sample language
Total weighted score: [X/100]. Overall rating: [RATING CATEGORY] as defined in Schedule A. This rating reflects the website's status as of [DATE] and is subject to change if material changes are made to the Website.

Common mistake: Presenting a final score without showing the calculation — if a client disputes the result, an undocumented score has no audit trail to defend.

Remediation recommendations

In plain language: Lists each deficiency found, the criterion it failed, the recommended corrective action, and a suggested priority level (critical, high, medium, or low).

Sample language
Issue: [DESCRIPTION]. Criterion failed: [NAME]. Recommended action: [ACTION]. Priority: [CRITICAL/HIGH/MEDIUM/LOW]. Recommended completion: within [X] days of this report.

Common mistake: Including remediation items in the findings without assigning priority — a client who sees 30 issues with no urgency signal will likely deprioritize all of them.

Evaluator independence and limitation of liability

In plain language: Confirms the evaluator has no financial interest in the outcome, that findings represent conditions at the time of evaluation only, and limits the evaluator's liability to the fee paid for the rating.

Sample language
Evaluator confirms it has no financial interest in any product, service, or vendor referenced herein. This rating reflects conditions as of [DATE]. Evaluator's total liability under this document shall not exceed the fee paid by Client for this evaluation: $[AMOUNT].

Common mistake: Omitting a liability cap entirely — without one, a client could argue that a missed compliance finding caused regulatory fines and seek damages far exceeding the evaluation fee.

Acceptance, signatures, and governing law

In plain language: Records that both parties have reviewed and accepted the findings, identifies the governing jurisdiction, and provides signature blocks for both evaluator and client.

Sample language
By signing below, Client acknowledges receipt and acceptance of this Website Rating as of [DATE]. This document is governed by the laws of [STATE/PROVINCE/COUNTRY]. Evaluator: [NAME/TITLE/DATE]. Client: [NAME/TITLE/DATE].

Common mistake: Having only the evaluator sign and treating the document as a unilateral report — without the client's signature, disputes about whether findings were communicated and accepted are much harder to resolve.

How to fill it out

1. Enter the parties' legal names and the website URL

   Insert the evaluator's and client's full registered legal names — not brand names or trading names — and the exact URL being evaluated. Record the date the evaluation was commissioned.

   💡 Confirm the URL is the live production site, not a staging environment, unless the rating is explicitly scoped to a pre-launch review.

2. Define and agree the scope before evaluation begins

   List every page, section, and function included in the rating, and explicitly name any exclusions. Attach this as Schedule B so both parties sign off on scope before work starts.

   💡 Screenshot the site map on the evaluation date and attach it to the document — sites change, and a dated site map establishes what was in scope at the time.

3. Populate Schedule A with rating criteria, weights, and rubric

   For each criterion, assign a weight (percentage of total score), define what each score level means in plain terms, and reference any external standard being applied (WCAG, Core Web Vitals, ISO).

   💡 Share the completed Schedule A with the client for approval before conducting the evaluation — agreed criteria eliminate most post-delivery disputes.

4. Conduct the technical performance assessment

   Run page speed and Core Web Vitals tests using Google PageSpeed Insights or Lighthouse, record HTTPS and SSL status, and check mobile responsiveness across at least two device sizes. Note the tool name, version, and test date for each metric.

   💡 Run speed tests three times and record the median result — single-run results are distorted by network variability.

5. Complete the content, usability, and compliance sections

   Work through each rated criterion systematically, recording objective findings (broken link count, missing alt text count) separately from qualitative assessments. For the compliance section, cross-reference the applicable jurisdiction's legal requirements.

   💡 Use a browser accessibility extension such as axe DevTools to generate a reproducible accessibility finding log you can attach as an appendix.

6. Calculate the weighted score and assign the overall rating

   Multiply each criterion score by its weight, sum the results, and compare to the rating categories defined in Schedule A. Show the full calculation in a table so the client can verify the arithmetic.

   💡 If the weighted score falls near the boundary between two rating categories, add a narrative note explaining the deciding factors — borderline scores without context generate the most disputes.

7. Write remediation recommendations with priorities and timelines

   For every deficiency, state the finding, the criterion it failed, the specific corrective action, and a priority tier (critical, high, medium, low). Assign a suggested completion window for each priority tier.

   💡 Limit critical items to genuine legal compliance failures or security vulnerabilities — overusing 'critical' dilutes the urgency and reduces the chance clients act on what actually matters.

8. Obtain signatures from both parties before delivery

   Send the completed document to the client's authorized representative for review. Do not finalize or deliver the report until both parties have signed the acceptance block and governing law is confirmed.

   💡 Use a timestamped e-signature tool to create an auditable record of exactly when each party accepted the findings.
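The weighted-score arithmetic in step 6 can be sketched in a few lines of Python. The criterion names, weights, and rating bands below are illustrative assumptions, not values defined by this template — substitute your own Schedule A values.

```python
# Illustrative weighted-score calculation for a website rating.
# Criterion names, weights, and rating bands are example placeholders.

def composite_score(scores: dict[str, float], weights: dict[str, float],
                    scale_max: float = 5.0) -> float:
    """Return a 0-100 composite from per-criterion scores and weights.

    Weights must sum to 1.0; each score is on a 0..scale_max scale.
    """
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum((scores[c] / scale_max) * weights[c] for c in weights) * 100

def rating_category(score: float) -> str:
    """Map a 0-100 composite score to an example rating band."""
    if score >= 85:
        return "Excellent"
    if score >= 70:
        return "Satisfactory"
    if score >= 50:
        return "Needs Improvement"
    return "Non-Compliant"

# Example Schedule A: four weighted categories scored on a 1-5 scale.
weights = {"technical": 0.30, "content": 0.25, "compliance": 0.30, "seo": 0.15}
scores = {"technical": 4, "content": 3, "compliance": 5, "seo": 3}

total = composite_score(scores, weights)
print(f"Total weighted score: {total:.1f}/100 -> {rating_category(total)}")
```

Showing this calculation in the report, one row per criterion, gives the client an audit trail they can verify by hand — which is exactly what step 6 asks for.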

Frequently asked questions

What is a website rating document?

A website rating document is a formal written evaluation that scores a website against defined criteria — covering technical performance, usability, content quality, and legal compliance — using a structured methodology and weighted scoring system. It is used by agencies, auditors, and consultants to deliver defensible, signed findings to clients or internal stakeholders, and serves as a record of the website's condition at a specific point in time.

When do I need a formal website rating rather than an informal review?

A formal, signed website rating is appropriate when findings will be used to support a contractual obligation — such as a web development agreement milestone sign-off — when compliance findings must be documented for regulatory purposes, when a third-party evaluator is being paid to deliver an objective assessment, or when procurement decisions will be based on the evaluation scores. Informal internal reviews are sufficient for routine monitoring, but any evaluation with legal or financial consequences should use a signed document.

Does a website rating need to be signed to be binding?

For the rating to function as a legally accepted document — confirming the client received and acknowledged the findings — signatures from both the evaluator and the client are generally recommended. Without a client signature, there is no documented evidence of acceptance, which complicates payment disputes, scope disagreements, and any subsequent liability claims. The evaluator's signature alone is sufficient only when the document is intended as a unilateral internal report with no contractual consequence.

What criteria should a website rating cover?

A complete website rating typically covers four categories: technical performance (load speed, uptime, mobile responsiveness, SSL), content and usability (navigation, readability, accessibility, CTA effectiveness), legal and regulatory compliance (GDPR cookie consent, privacy policy, WCAG accessibility level, required disclosures), and SEO fundamentals (meta tags, structured data, crawlability). The relative weight of each category should be agreed with the client before evaluation begins.

How is the overall website rating score calculated?

Each criterion is assigned a weight reflecting its relative importance, expressed as a percentage of the total score. The evaluator scores each criterion on an agreed scale (typically 1–5 or 0–100), multiplies each score by its weight, and sums the results to produce a composite score. The composite score is then mapped to a rating category — such as Excellent, Satisfactory, Needs Improvement, or Non-Compliant — as defined in the rating schedule. Showing this calculation in full is important for auditability.

What is the difference between a website rating and a website audit?

A website audit is a broad investigative process — it identifies issues, analyses root causes, and may be informal or internal. A website rating is a formal scored evaluation that produces a documented, defensible assessment with a composite score, signed acceptance, and remediation recommendations. Audits inform ratings: you conduct the audit work, then formalise the findings in a rating document when the output needs to be legally or contractually significant.

Do website ratings need to comply with specific laws?

The rating document itself is not regulated, but the compliance criteria it assesses are governed by law. WCAG accessibility requirements are mandated for public sector websites in the US (Section 508), UK, and EU, and are increasingly required for private sector sites under disability discrimination law. GDPR and ePrivacy rules apply to cookie consent and data handling assessments for any site accessible to EU users. Evaluators should clearly reference the applicable legal standard for each compliance criterion and note the jurisdiction in the governing law clause.

How often should a website be rated?

For websites with active legal compliance obligations — such as healthcare, financial services, or e-commerce sites handling personal data — an annual formal rating is a reasonable baseline, with an additional review any time the site undergoes a material redesign or change in data collection practices. For websites subject to ongoing service-level agreements, a quarterly or semi-annual rating aligned to contract milestones is common. A rating more than 12 months old should not be relied upon as evidence of current compliance.

How this compares to alternatives

vs Website Maintenance Agreement

A website maintenance agreement governs the ongoing obligations to keep a site updated, secure, and operational over time — it is a service contract. A website rating is a point-in-time evaluation document that scores the site's current condition. The rating often identifies what the maintenance agreement must address; the two documents are complementary rather than interchangeable.

vs Vendor Evaluation Form

A vendor evaluation form scores a supplier's overall business capabilities — financial stability, service quality, reliability, and pricing. A website rating focuses exclusively on the technical, content, and compliance attributes of a website. Use a vendor evaluation form to choose between web development agencies; use a website rating to assess the deliverable those agencies produce.

vs Service Agreement

A service agreement defines the scope, payment, and obligations for services to be performed — it is forward-looking and governs the working relationship. A website rating is a backward-looking assessment document that records what was found at a point in time. A service agreement may require a website rating as a defined deliverable, but the two serve entirely different legal functions.

vs Website Development Agreement

A website development agreement is the contract that governs the build of a website — timelines, payment, IP ownership, and acceptance criteria. A website rating is the document used to formally score whether the delivered site meets those acceptance criteria. The development agreement sets the standard; the website rating records whether the standard was met.

Industry-specific considerations

Digital marketing and web agencies

Agencies use website ratings as a formal deliverable within web audit, redesign, or ongoing retainer engagements, providing clients with a signed document that supports invoicing milestones and scope-of-work completion.

Financial services

Regulated financial websites must meet FCA, SEC, or equivalent disclosure requirements — a formal rating documents compliance status and provides evidence for regulatory examinations or internal audit trails.

Healthcare and medtech

Healthcare websites handling patient data must meet HIPAA security standards and WCAG accessibility requirements; a signed rating documents the compliance baseline and identifies remediation obligations before regulatory review.

E-commerce and retail

E-commerce sites require cookie consent, GDPR-compliant checkout data handling, and performance benchmarks that directly affect conversion — a periodic formal rating captures each of these and links deficiencies to revenue impact.

Jurisdictional notes

United States

Section 508 of the Rehabilitation Act requires federal agency websites to meet WCAG 2.1 AA accessibility standards. The ADA has been applied to private sector websites in numerous federal court decisions, creating de facto accessibility obligations for businesses serving the public. State privacy laws — including the CCPA in California — impose specific cookie consent and privacy policy disclosure requirements that a rating covering US-facing sites should assess.

Canada

PIPEDA (and provincial equivalents in Quebec, Alberta, and BC) governs personal data collection on Canadian websites, including consent requirements and privacy policy obligations. Quebec's Law 25 (Bill 64) introduced stricter consent, data localization, and breach notification requirements effective 2023. Federally regulated websites must meet the Standard on Web Accessibility. Website ratings for Quebec-based clients should verify that both English and French language requirements are met under the Charter of the French Language.

United Kingdom

The UK GDPR and the Privacy and Electronic Communications Regulations (PECR) govern cookie consent and data handling on UK-facing websites post-Brexit. Public sector websites must comply with the Public Sector Bodies Accessibility Regulations 2018, requiring WCAG 2.1 AA compliance and a published accessibility statement. The ICO has issued enforcement notices and fines for non-compliant cookie banners, making cookie consent a high-priority compliance criterion for UK website ratings.

European Union

GDPR applies to any website that processes personal data of EU residents, regardless of where the website operator is located — fines reach up to 4% of global annual turnover. The ePrivacy Directive requires informed, freely given cookie consent before non-essential cookies are set. The European Accessibility Act (EAA) extends mandatory WCAG compliance to private sector websites and apps from June 2025. Member state implementations vary, particularly for advertising technology and cross-border data transfers under Standard Contractual Clauses.

Template vs lawyer β€” what fits your deal?

Path | Best for | Cost | Time
Use the template | Agencies and consultants delivering standard website audits to SMB clients with no regulatory compliance dimension | Free | 30–60 minutes to complete per evaluation
Template + legal review | Evaluations that include GDPR, WCAG, or sector-specific compliance findings that could expose the evaluator to third-party liability | $300–$700 | 2–5 days
Custom drafted | Regulated industries, evaluations commissioned as evidence for litigation or regulatory proceedings, or enterprise-level contracts requiring bespoke indemnity and liability terms | $1,500–$4,000+ | 1–3 weeks

Glossary

Rating Criteria
The specific measurable standards against which a website is evaluated, such as page load time, mobile responsiveness, or WCAG compliance level.
Weighted Score
A numeric score assigned to each rating criterion multiplied by that criterion's relative importance factor to produce a composite overall rating.
WCAG
Web Content Accessibility Guidelines — international standards published by the W3C defining how to make web content accessible to people with disabilities.
GDPR
General Data Protection Regulation — EU law governing how websites collect, store, and process personal data of individuals in the European Union.
Core Web Vitals
Google's three user-experience metrics — Largest Contentful Paint, Interaction to Next Paint, and Cumulative Layout Shift — used to measure real-world page performance.
Remediation
The corrective actions identified in an evaluation that a website owner must take to address deficiencies found during the rating process.
Scope of Evaluation
The defined boundaries of the rating — which pages, sections, or functions were assessed and which were explicitly excluded.
Acceptance
A signed acknowledgment by the party commissioning the rating that they have received, reviewed, and formally accepted the evaluation findings.
Baseline Rating
A reference score established at the time of an initial evaluation against which future assessments or redesigns can be benchmarked.
Material Deficiency
A finding where a website fails to meet a mandatory criterion — such as a legal disclosure requirement or critical security standard — requiring immediate remediation.
Evaluator Independence
A clause confirming the party conducting the rating has no financial interest in the outcome and is not affiliated with the party whose website is being assessed.

Part of your Business Operating System

This document is one of 3,000+ business & legal templates included in Business in a Box.

  • Fill-in-the-blanks — ready in minutes
  • 100% customizable Word document
  • Compatible with all office suites
  • Export to PDF and share electronically

Create your document in 3 simple steps.

From template to signed document — all inside one Business Operating System.
1
Download or open template

Access 3,000+ business and legal templates for any business task, project, or initiative.

2
Edit and fill in the blanks with AI

Customize your ready-made business document template and save it in the cloud.

3
Save, Share, Send, Sign

Share your files and folders with your team. Create a space of seamless collaboration.

Save time, save money, and create top-quality documents.

★★★★★

"Fantastic value! I'm not sure how I'd do without it. It's worth its weight in gold and has paid for itself many times over."

Robert Whalley
Managing Director, Mall Farm Proprietary Limited

★★★★★

"I have been using Business in a Box for years. It has been the most useful source of templates I have encountered. I recommend it to anyone."

Dr Michael John Freestone
Business Owner

★★★★★

"It has been a life saver so many times I have lost count. Business in a Box has saved me so much time and, as you know, time is money."

David G. Moore Jr.
Owner, Upstate Web

Run your business with a system — not scattered tools

Stop downloading documents. Start operating with clarity. Business in a Box gives you the Business Operating System used by over 250,000 companies worldwide to structure, run, and grow their business.

Free Forever Plan · No credit card required