Just a difference of view? Progressive enhancement vs graceful degradation

With the deadline for replies to COI’s consultation on browser testing support looming tomorrow (Friday 17 October, folks! Get your views in now!) an A List Apart article on progressive enhancement caught my eye.

Ever since I’ve been working in web development, the key theme to ensuring browser compatibility has been graceful degradation: that means you can optimise your site for the latest whizz-bang browsers on the market, but you should also ensure that the site will still work – and look good – if used in an older browser.

So by all means include all the latest stuff – Flash, JavaScript, Ajax and the rest – but if the user views your site with a browser lacking Ajax then the web page should recognise this and substitute the Ajax part with older technology that will work. For example, instead of a review page automatically updating when you select a rating, the rating is submitted like a traditional form and the page reloads.
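
To make that concrete, here’s a minimal sketch of the idea (the field names and the /rate address are just placeholders, not from any real project): the form posts and reloads the page in any browser at all, and a little unobtrusive JavaScript upgrades it to an Ajax submission only where the browser can actually handle one.

    <!-- The baseline: a plain form that posts and reloads in any browser -->
    <form id="rating-form" action="/rate" method="post">
      <label for="rating">Your rating:</label>
      <select id="rating" name="rating">
        <option>1</option> <option>2</option> <option>3</option>
        <option>4</option> <option>5</option>
      </select>
      <input type="submit" value="Rate" />
    </form>

    <script type="text/javascript">
    // The enhancement: intercept the submit and send it via Ajax instead,
    // but only if the browser supports XMLHttpRequest in the first place
    var form = document.getElementById("rating-form");
    if (form && window.XMLHttpRequest) {
      form.onsubmit = function () {
        var xhr = new XMLHttpRequest();
        xhr.open("POST", form.action, true);
        xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
        xhr.send("rating=" + encodeURIComponent(form.rating.value));
        return false; // stop the full-page reload; without script the form just submits normally
      };
    }
    </script>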

Similarly, if your navigation utilises JavaScript to do nice drop-down things when you click on it and the user’s browser doesn’t support JavaScript, just make sure there is a static version to fill in for it. And so it goes, until you make sure that even a basic text-only reader like Lynx can still get around the site okay: it might be missing even colour and images by this stage, but it will work to the best abilities of that lowly browser.
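
By way of illustration only (the menu structure here is invented), the navigation can be an ordinary nested list that every browser and robot can follow, with the drop-down behaviour layered on top by script:

    <ul id="nav">
      <li><a href="/services/">Services</a>
        <ul>
          <li><a href="/services/web/">Web design</a></li>
          <li><a href="/services/print/">Print</a></li>
        </ul>
      </li>
    </ul>

    <script type="text/javascript">
    // Only when scripting is available do the sub-menus get hidden and
    // toggled on click; without JavaScript the full list simply stays visible
    var items = document.getElementById("nav").getElementsByTagName("li");
    for (var i = 0; i < items.length; i++) {
      var sub = items[i].getElementsByTagName("ul")[0];
      if (sub) {
        sub.style.display = "none";
        items[i].onclick = (function (menu) {
          return function () {
            menu.style.display = (menu.style.display == "none") ? "block" : "none";
          };
        })(sub);
      }
    }
    </script>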

(Going back as far as Lynx might sound bizarre, but it’s not: it’s a good approximation of how a significant number of screen readers for the visually impaired read the site; and also what search engine robots tend to see. If you can’t move around the site without JavaScript then the chances are that neither can the robots – and you can kiss goodbye to that prestigious Google search ranking you were after.)

So you can see why graceful degradation has been such an effective way of thinking about web development best practice. But it’s not without its flaws: building a website with so many levels of redundancy takes time and money, and testing the result once it’s released is hard because of the many different factors involved. And all that alternate coding can end up tripping over itself, leaving a stray line of code here and a stray line there, which when put together can make your coding look shambolic – and possibly even wreck the page. And the big problem is: graceful degradation is all about the browser. It’s about the technology leading the development, rather than the user needs or the project requirements.

That’s why the buzz phrase for web development, first used in 2003, is progressive enhancement – which focuses on building from the content out, which in turn is developed by referring to user needs.

The A List Apart article uses a brilliant metaphor of a peanut M&M to show how progressive enhancement works, which I’m not going to attempt to relay in detail for fear of describing it less well, but at the highest of high levels: instead of burrowing back through layer upon layer of jury-rigged patches, you concentrate on creating excellent content, then wrapping it in XHTML, then CSS, and so on – each layer done to the highest web standards. And in this way you end up with a perfect final result that you just know will work with any well-behaved browser around.
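
Very roughly, and with made-up file names purely for illustration, the layers stack up like this – the markup stands on its own, and the style and behaviour are linked in from outside so they can be peeled away without taking the content with them:

    <head>
      <title>Consultation on browser testing support</title>
      <!-- Presentation layer: purely additive; remove it and the content is untouched -->
      <link rel="stylesheet" type="text/css" href="site.css" media="screen" />
      <!-- Behaviour layer: linked unobtrusively rather than scattered inline -->
      <script type="text/javascript" src="enhancements.js"></script>
    </head>
    <body>
      <!-- Content layer: meaningful, valid XHTML that works entirely on its own -->
      <h1>Consultation on browser testing support</h1>
      <p>Responses are invited by email or through the online form.</p>
    </body>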

This has been the gist of a lot of the feedback to the COI consultation on user support: that we were misguided to even frame the consultation as being all about browsers, when it should have been about web standards and building from the ground up. And it’s a very good point, well taken, that I’m sure the COI team responsible for drafting the next version of the guidelines will take into account.

But the difference between graceful degradation and progressive enhancement is lessened somewhat if you add to the former the overriding requirement to be web standards compliant, and underline that accessibility can only be achieved first and foremost by coding to best practice standards. That’s what COI has always tried to advocate in its digital media projects, and I hope we’ve made a positive difference to the standards of websites in general and usability/accessibility in particular over the years.

If your top requirement is web standards, and you then add graceful degradation after that, does it really end up far away from being – to all intents and purposes – the same as progressive enhancement? Maybe I’m missing something, but I don’t think there’s much difference between the two in practice; you could argue that, all things being equal, progressive enhancement is the more elegant way of putting it, whereas graceful degradation plus web standards means you have to keep your eye on separate ends of the development process.

But there is certainly a difference in ethos: there’s no way I can argue against the principle that the user – and the content – come first, and that the technology should be an invisible facilitator. I say it time and again to clients every day. And so the basis for testing certainly shouldn’t be the technology; but we still need to know what technology we should check our sites on to ensure that the developers have delivered on web standards and progressive enhancement. That means there still needs to be a way of quantifying how we check the end product, and on what. And I’m afraid it doesn’t absolve you of supporting badly behaved browsers with special jury-rigging, too, if enough of your users are on IE6, a non-standards-compliant mess of a browser. You can’t argue that progressive enhancement is about centring on the user and then turn round and say that what browser the user is viewing the site with is irrelevant.
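
One common way of keeping that jury-rigging contained is Internet Explorer’s conditional comments, so the patches only ever reach the browser that needs them (the stylesheet names here are placeholders):

    <!-- The main stylesheet, written to web standards for well-behaved browsers -->
    <link rel="stylesheet" type="text/css" href="site.css" media="screen" />

    <!-- Seen only by IE6 and older: every other browser treats this as a comment -->
    <!--[if lte IE 6]>
      <link rel="stylesheet" type="text/css" href="ie6-fixes.css" media="screen" />
    <![endif]-->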

In the end, maybe the most important thing about progressive enhancement is that it’s shiny and new. And I don’t mean that in a remotely facetious or sneering way, either: we all get jaded, we all get lazy, and we all sink into bad habits from time to time. What we quite often need is a new perspective on things, a kick up the backside – and a shiny new toy to play with.


  1. Thanks for bringing the COI consultation to my notice just in the nick of time. I’ve sent in a reply, which I’ll probably post to one of my sites later. In short: the whole idea of “supported browsers” is an unacceptable market perversion that should be replaced by “supported standards”; there are obvious problems (including a feedback loop) in the Appendix A example; and the topics of power-saving and security in relation to “rich internet applications” aren’t mentioned at all. Nor is mobile phone access to mainstream websites, now I come to think about it, but I forgot to include that in my reply. Oh well.

  2. andrewlewin

    Glad the reminder reached you in time – COI really is trying to get as wide a range of responses as possible, so all are welcome. Interesting points on the feedback loop and the rest.

    Clearly the language has gone awry: the document was never intended to be about which browsers are supported, just which browsers need to be tested to ensure that the site has been coded to web standards, and thus that all users can access the site. It would have helped if some of the titles hadn’t used the former wording (and indeed so did my earlier link in this piece. Whoops.) It’s definitely not meant to imply that any browsers not on the list are completely unsupported, just that the sample of browsers gives assurance that all users and browsers can access the site.

    As I understand it, mobile is intentionally not covered and will be the subject of a different consultation paper in the future, BTW.

  3. I’m interested in the interplay between progressive enhancement and Power of Information: if you’ve published the core content in accessible form on your site, is it OK/encouraged to adopt web 2.0 platforms with shady accessibility if that’s where your market is?

    Specifically, should government shun Facebook and other AJAX-rich platforms?

  4. andrewlewin

    Interesting question – but there’s no reason why an AJAX-rich site can’t be made perfectly standards-compliant and accessible.

    We obviously can’t dictate what other private sector site developers can do, we can only ensure our own house is in order and that we lead by example.

    But I don’t think that boycotting sites with “shady accessibility” is the thing to do – we’d be cutting off our nose to spite our face, as it would stop our communications from reaching the audiences who use those sites at all.

  5. The phrase “mobile is intentionally not covered and will be the subject of a different consultation paper in the future” is both a good thing and a bad thing to me. It’s good because it doesn’t matter that it was missing from my response, but it’s bad because it makes me suspect that COI will be advocating poor-relation “mobile ghetto” sites instead of making the main websites fully accessible.

  6. andrewlewin

    Not at all. It’s so that mobile doesn’t get buried in with the rest and get second class treatment.

    Not everything can be tackled and addressed at the same time, or else nothing would ever get agreed and signed off – it would just go round and round and round.



