1
Enter the parties' legal names and the website URL
Insert the evaluator's and client's full registered legal names (not brand names or trading names) and the exact URL being evaluated. Record the date the evaluation was commissioned.
💡 Confirm the URL is the live production site, not a staging environment, unless the rating is explicitly scoped to a pre-launch review.
2
Define and agree the scope before evaluation begins
List every page, section, and function included in the rating, and explicitly name any exclusions. Attach this as Schedule B so both parties sign off on scope before work starts.
💡 Screenshot the site map on the evaluation date and attach it to the document; sites change, and a dated site map establishes what was in scope at the time.
3
Populate Schedule A with rating criteria, weights, and rubric
For each criterion, assign a weight (percentage of total score), define what each score level means in plain terms, and reference any external standard being applied (WCAG, Core Web Vitals, ISO).
💡 Share the completed Schedule A with the client for approval before conducting the evaluation; agreed criteria eliminate most post-delivery disputes.
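Keeping Schedule A as structured data makes the weights easy to verify before sign-off. A minimal sketch, with hypothetical criterion names, weights, and referenced standards (the real schedule would use whatever the parties agree):

```python
# Hypothetical Schedule A: each criterion gets a weight (percent of the
# total score) and an optional external standard it is assessed against.
schedule_a = {
    "performance":   {"weight": 30, "standard": "Core Web Vitals"},
    "accessibility": {"weight": 25, "standard": "WCAG 2.1 AA"},
    "content":       {"weight": 25, "standard": None},
    "usability":     {"weight": 20, "standard": None},
}

# Sanity check before circulating for approval: weights must cover
# exactly 100% of the total score.
total = sum(c["weight"] for c in schedule_a.values())
assert total == 100, f"weights must sum to 100, got {total}"
print(f"Schedule A valid: {len(schedule_a)} criteria, weights sum to {total}%")
```

A check like this catches a mistyped weight before the client signs off, rather than after delivery.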
4
Conduct the technical performance assessment
Run page speed and Core Web Vitals tests using Google PageSpeed Insights or Lighthouse, record HTTPS enforcement and TLS/SSL certificate validity, and check mobile responsiveness across at least two device sizes. Note the tool name, version, and test date for each metric.
💡 Run speed tests three times and record the median result; single-run results are distorted by network variability.
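The median-of-three rule is a one-liner once the readings are recorded. A sketch with hypothetical Largest Contentful Paint readings (the actual values would come from your three test runs):

```python
from statistics import median

# Hypothetical LCP readings (seconds) from three consecutive runs of
# the same test on the same page.
runs = [2.9, 2.4, 3.6]

# Report the median, not the mean: one slow run caused by network
# variability cannot drag the reported figure up or down.
reported = median(runs)
print(f"LCP to report: {reported:.1f} s")  # 2.9 s
```

The mean of the same readings would be about 2.97 s, so a single outlier run shifts it; the median ignores the outlier entirely.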
5
Complete the content, usability, and compliance sections
Work through each rated criterion systematically, recording objective findings (broken link count, missing alt text count) separately from qualitative assessments. For the compliance section, cross-reference the applicable jurisdiction's legal requirements.
💡 Use a browser accessibility extension such as axe DevTools to generate a reproducible accessibility finding log you can attach as an appendix.
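For objective counts like missing alt text, a short script makes the figure reproducible. A minimal sketch using Python's standard-library `html.parser`, run against a hypothetical page snippet (a real audit would feed it each fetched page's HTML):

```python
from html.parser import HTMLParser

class AltAuditor(HTMLParser):
    """Count <img> tags that lack a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.total = 0
        self.missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            self.total += 1
            # attrs is a list of (name, value) pairs; an absent or
            # empty alt both count as a finding here.
            if not dict(attrs).get("alt"):
                self.missing_alt += 1

# Hypothetical page fragment for illustration.
page = '<img src="a.png"><img src="b.png" alt="Logo"><img src="c.png" alt="">'
auditor = AltAuditor()
auditor.feed(page)
print(f"{auditor.missing_alt} of {auditor.total} images missing alt text")
```

Recording the script (or tool version) alongside the count lets the client rerun the check and get the same number.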
6
Calculate the weighted score and assign the overall rating
Multiply each criterion score by its weight, sum the results, and compare to the rating categories defined in Schedule A. Show the full calculation in a table so the client can verify the arithmetic.
💡 If the weighted score falls near the boundary between two rating categories, add a narrative note explaining the deciding factors; borderline scores without context generate the most disputes.
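The weighted calculation itself is straightforward. A worked sketch with hypothetical scores on a 0-5 scale and weights matching an agreed Schedule A:

```python
# Hypothetical criterion scores (0-5 scale) and their Schedule A
# weights, which must sum to 1.0.
criteria = {
    "performance":   {"weight": 0.30, "score": 4},
    "accessibility": {"weight": 0.25, "score": 3},
    "content":       {"weight": 0.25, "score": 5},
    "usability":     {"weight": 0.20, "score": 4},
}

# Multiply each score by its weight and sum:
# 0.30*4 + 0.25*3 + 0.25*5 + 0.20*4 = 1.20 + 0.75 + 1.25 + 0.80 = 4.00
weighted = sum(c["weight"] * c["score"] for c in criteria.values())
print(f"weighted score: {weighted:.2f} / 5")
```

Showing each row of this arithmetic in the report's table is what lets the client verify the result independently.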
7
Write remediation recommendations with priorities and timelines
For every deficiency, state the finding, the criterion it failed, the specific corrective action, and a priority tier (critical, high, medium, low). Assign a suggested completion window for each priority tier.
💡 Limit critical items to genuine legal compliance failures or security vulnerabilities; overusing 'critical' dilutes the urgency and reduces the chance clients act on what actually matters.
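The finding-plus-tier structure can also be kept as data so the remediation table sorts itself. A sketch with hypothetical tiers, completion windows, and sample findings (the real windows are whatever the parties agree):

```python
# Hypothetical priority tiers mapped to suggested completion windows.
windows = {
    "critical": "5 business days",
    "high":     "30 days",
    "medium":   "90 days",
    "low":      "next scheduled redesign",
}

# Each finding records what failed, which criterion it failed, and its tier.
findings = [
    {"finding": "images missing alt text",  "criterion": "accessibility", "priority": "high"},
    {"finding": "expired TLS certificate",  "criterion": "security",      "priority": "critical"},
]

# Sort by tier order (critical first) for the remediation section.
order = {tier: i for i, tier in enumerate(windows)}
for f in sorted(findings, key=lambda f: order[f["priority"]]):
    print(f'{f["priority"].upper()}: {f["finding"]} -> fix within {windows[f["priority"]]}')
```

Because the windows live in one mapping, tightening a tier's deadline updates every finding in that tier consistently.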
8
Obtain signatures from both parties before delivery
Send the completed document to the client's authorized representative for review. Do not finalize or deliver the report until both parties have signed the acceptance block and governing law is confirmed.
💡 Use a timestamped e-signature tool to create an auditable record of exactly when each party accepted the findings.