The Speed Camera Problem: Why Continuous Accessibility Monitoring Beats the Annual Audit
24 April 2026 - Martin Dempsey
There's a moment every driver knows. You're on a motorway, moving at a pace that your speedometer would rather you didn't look at, and then you see it - a fixed speed camera. The brake lights flash across three lanes. Everyone slows to exactly the right number. You pass the camera. And thirty seconds later, everyone is back to exactly where they were before.
The camera clocked compliance. The road remained dangerous.
This, with uncomfortable precision, is how many organisations approach digital accessibility.
The Annual Audit Trap
The standard model goes something like this. A digital product - a website, an app, a customer portal - is launched with good accessibility intentions. WCAG guidelines are referenced during build. Someone runs an automated scan. A few issues are fixed. The product goes live, a box is ticked, and accessibility is filed under "done."
Months pass. Content is updated. New features are shipped. Third-party components are integrated. A new campaign lands on the homepage. Each change is made by a team appraised on delivery speed, not accessibility implications. Nobody is watching. Nobody is measuring. The issues accumulate quietly, one sprint at a time.
Then, somewhere between twelve months and three years later, an audit is commissioned. Perhaps it's triggered by a regulatory requirement, a complaint, or a periodic governance review. The auditors arrive, do their work, and produce a report.
That report, more often than not, is far from positive. Hundreds of issues. Multiple WCAG failures across critical user journeys. Problems spanning every team that's touched the product - development, content, design, marketing. The remediation estimate, in both cost and time, is significant. The prioritisation conversation is painful. The work gets added to a backlog where it competes with every other commercial priority, and slowly, quietly, it loses.
The audit was the fixed speed camera. Everyone slowed down for it. Then drove away.
Why This Keeps Happening
The fixed camera model fails because it treats accessibility as a point-in-time state rather than an ongoing condition. It creates the illusion of compliance while providing no visibility of what happens in between. Between audits, accountability is blurred across business functions, and no one is responsible for the drift.
This is accessibility drift - the gradual erosion of standards that occurs whenever monitoring stops. It isn't caused by malice or indifference. It's caused by the absence of a system that makes accessibility visible, measurable, and owned.
When accountability is unclear and measurement is infrequent, the default human behaviour takes over: people optimise for what is being watched. If nothing is being watched, nothing is being optimised.
The scale of the problem is sobering. WebAIM’s 2025 analysis of one million website homepages found that 94.8% had at least one detectable WCAG failure - an average of 51 accessibility errors per page. Those figures reflect only what automated scanning can detect. The true picture, once manual and user testing are applied, is invariably worse. Meanwhile, the regulatory environment is tightening. The European Accessibility Act, which came into force across all EU member states in June 2025, now requires private sector organisations offering digital products and services to EU consumers to meet defined accessibility standards - with real consequences for non-compliance. For UK organisations trading into the EU, this applies regardless of where they are based. Accessibility drift is no longer just an ethical failure. It is an increasingly significant legal and commercial risk.
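To make "detectable" concrete: a toy version of one check that automated scanners reliably perform - flagging img elements with no alt attribute at all, a WCAG 1.1.1 failure - can be written with nothing but Python's standard library. This is an illustration of the category of check, not any particular scanner's implementation; the HTML snippet and function names are invented for the example.

```python
# Toy illustration of one automated accessibility check:
# find <img> tags with no alt attribute (WCAG 1.1.1).
# Note: alt="" is legitimate for decorative images, so only
# a *missing* attribute is flagged, not an empty one.
from html.parser import HTMLParser


class MissingAltChecker(HTMLParser):
    """Collect the src of every <img> that lacks an alt attribute."""

    def __init__(self):
        super().__init__()
        self.failures = []

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "img" and "alt" not in attr_map:
            self.failures.append(attr_map.get("src", "<no src>"))


def scan(html: str) -> list[str]:
    """Return the src values of images failing the check."""
    checker = MissingAltChecker()
    checker.feed(html)
    return checker.failures


page = '<img src="hero.png"><img src="logo.png" alt="Company logo">'
print(scan(page))  # only the image without any alt attribute is flagged
```

Checks like this are cheap to run on every page, every week - which is exactly why missing alt text dominates the WebAIM failure counts: it is one of the few issues machines can find unaided.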
Accessibility 2.0: The Average Speed Camera Approach
Average speed cameras work on a fundamentally different principle. They don't care what you're doing at a single point. They measure your behaviour across an entire stretch of road, continuously, with no gaps in which you can revert. There's nowhere to slow down for the camera and nowhere to speed up after it, because the camera is everywhere along that section, all of the time.
The result isn't just compliance at a checkpoint - it's a genuine change in behaviour across the whole journey. Drivers moderate their speed consistently because they know they are being measured consistently.
This is the model that digital accessibility needs, and it's the model that Access360, User Vision's managed accessibility service, is built around.
Access360 replaces the periodic audit with continuous automated monitoring - weekly scanning of your entire digital estate against WCAG 2.2 standards and other relevant international standards, detecting issues as they appear rather than months after the fact. When new content creates an accessibility failure, your team knows within days, not at the next annual review. Issues are tracked, assigned, and prioritised by severity and user impact, through intelligent dashboards that give every part of the organisation - IT, DevOps, marketing, content - a shared, single version of the truth about where they stand.
That shared visibility is the point. Average speed cameras work because every driver on that stretch of road is subject to the same measurement. Access360 works because every team contributing to your digital estate is accountable to the same data. There are no blind spots between audits. There is no gap in which issues can accumulate unnoticed. The monitoring is always on, and the accountability is always clear.
Research shows that automated monitoring tools can detect up to 57% of accessibility issues by volume - significantly more than the 20–30% figure commonly cited, which measures WCAG criteria coverage rather than total issue count (Deque Systems, 2021). The remaining issues require human expertise to identify.
Continuous monitoring acts as an early warning radar, highlighting where deeper, more targeted attention is needed.
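The early-warning effect comes from comparing consecutive scans rather than judging a single snapshot. A minimal sketch of that idea - not Access360's implementation, and with issue identifiers invented for illustration - is just a set difference between last week's results and this week's:

```python
# Sketch of drift detection: diff two consecutive scan results.
# Each issue is identified by a stable key, here "page:rule-id"
# (an invented format for illustration).

def new_issues(last_week: set[str], this_week: set[str]) -> set[str]:
    """Issues present now that were absent in the previous scan."""
    return this_week - last_week


def resolved_issues(last_week: set[str], this_week: set[str]) -> set[str]:
    """Issues from the previous scan that no longer appear."""
    return last_week - this_week


last = {"home:img-missing-alt", "checkout:low-contrast"}
this = {"home:img-missing-alt", "news:empty-link-text"}

print(sorted(new_issues(last, this)))       # introduced since last scan
print(sorted(resolved_issues(last, this)))  # fixed since last scan
```

Run weekly, this turns an audit finding ("there are 300 issues") into an actionable signal ("this week's campaign page introduced three new failures"), which is what makes ownership and accountability possible.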
Access360 combines this with periodic expert manual audits and disabled user testing — the human expertise that identifies the issues automated tools miss, and validates that fixes genuinely improve experience for real people, not just compliance scores.
The Culture Shift That Changes Everything
The most important outcome of the average speed camera isn't just that people drive within the limit. It's that, over time, it changes what drivers consider normal. Sustained, consistent measurement reshapes behaviour from the inside.
The same is true of continuous accessibility monitoring. When development teams see accessibility scores update weekly, accessibility stops feeling like an audit event and starts feeling like a quality metric - something they own, track, and take pride in. When content managers can see the impact of a single page change on their team's accessibility rating, they start making different decisions before they publish, not after.
This is Accessibility 2.0: not a box ticked at the point of launch and revisited reluctantly every few years, but an embedded, measured, accountable practice - part of how your organisation works, not something it worries about when the auditors arrive.
The fixed speed camera era of digital accessibility is over. The question is whether your organisation is ready to move to a stretch that's monitored all the way along.
Frequently Asked Questions
What is accessibility drift? Accessibility drift is the gradual erosion of WCAG compliance that occurs between audits. Every time content is updated, a new feature is shipped, or a third-party component is integrated without accessibility checks, small failures accumulate. By the time the next audit arrives, organisations often face a far larger remediation task than they expected — not because of deliberate neglect, but because no one was watching in between.
Is an annual accessibility audit enough to meet legal requirements? An annual audit demonstrates intent but cannot guarantee ongoing compliance. The European Accessibility Act (in force from June 2025) and the UK’s Public Sector Bodies Accessibility Regulations both require organisations to maintain accessible digital products — not simply to have passed a test at a single point in time. Given that digital products change continuously, a once-a-year review leaves months of unmonitored risk. Continuous monitoring is increasingly the only defensible approach.
How much of digital accessibility can automated tools detect? Research from Deque Systems, based on analysis of over 13,000 pages and nearly 300,000 issues, found that automated testing detected up to 57% of accessibility issues by total volume — considerably more than the 20–30% figure often cited, which measures how many WCAG criteria can be tested automatically rather than the proportion of actual issues found. The remaining issues — relating to things like meaningful alt text quality, keyboard interaction, and cognitive accessibility — require expert manual review and testing with disabled users. A robust accessibility programme combines both.
Access360 is User Vision's managed accessibility service, combining continuous automated monitoring with expert manual audits, disabled user testing, and capability building. To find out how it could work for your organisation, get in touch.