Category Archives: Accessibility

Taming Ello

As I have mentioned previously, I am using the new social network, Ello. New, and not without problems – the worst of which (for me) involve the user interface. Pale grey icons and text on a white background do not make for good readability, so I had a poke around in the page source to see if I could use some custom CSS to make it more readable.

Different browsers offer different ways to override the CSS provided by sites; in Google Chrome there is an extension called Stylish, which does this for me. (I believe it is also available for Firefox.)

For anyone wanting to try this out, here is the CSS I am using. It’s a bit rough and ready, but fixes colour contrast and scrolling issues that were breaking things for me.

#drawer, #peops {
  overflow-y: scroll;
}

.btn--ico {
  color: #000;
  font-weight: bold;
}

.svgicon {
  stroke: #000;
  stroke-width: 2px;
}

.postbar {
  color: #000;
}

Blue Hat for a Blue Day

Smiffy's Twitter avatar wearing a blue hat

Preamble

Today is the 30th November. This year (2009), the 30th November marks the third annual Blue Beanie Day, raising awareness of web standards and accessibility. (For those unsure as to what a beanie is – as was I – it's what in my family used to be called a woolly hat; toque in French.)

Firstly, let's get the "why a blue beanie (or woolly hat)" issue out of the way. The reason is simple: the online avatar of Jeffrey Zeldman, co-founder of the Web Standards Project and publisher of A List Apart, is seen wearing – you've guessed it – a blue beanie (or toque). Someone decided this would be a good way to make the avatars of web standards-supporting folk stand out. The rest is history.

When I posted a note on Twitter advising why my avatar had suddenly gained some blue stuff up top, it was pointed out to me that the link to the Web Standards Project page was "…less than exciting to casual visitor." (Thanks to @DrJaneLS.) Which was perfectly true. The page in question means everything to those of us already involved in web standards/accessibility, but very little to anyone else. Since it is the "anyone else" we are trying to reach, some further explanation is required, otherwise we are just wasting our time, preaching to the choir, as it were.

Web Standards, Web Accessibility in a Very Small Nutshell

At risk of upsetting my peers with an over-simplistic explanation, I will now endeavour to explain web standards/accessibility and why they are important to everyone.

Web Standards

The heart of the modern web page is HTML – HyperText Markup Language. It's the language that defines the content and semantics of a web page. The presentation of modern web pages – what they look like – is determined by CSS – Cascading Style Sheets – another language.

When writing in any human language, we need to observe rules of grammar, spelling, punctuation, semantic structure, etcetera – or we end up with an incomprehensible mess. Web standards is all about doing this with the languages used to create web pages.

If we don't stick to the rules when writing for the web, we may end up with a page that looks OK in one web browser (user agent, to use the technical term) but not in another. But the implications go further than this, which leads me to web accessibility.

Web Accessibility

Whilst web standards may be regarded by some as technical niceties, web accessibility has a far more human face. It is, at the end of the day, all about people.

How do people use the web? If we assume that everyone has a desktop computer with a big colour monitor, keyboard, mouse, fast Internet connection, and design our content around this, we run the risk of excluding people who:

  • Are blind or have low vision
  • Cannot/do not use a keyboard
  • Cannot/do not use a mouse (possibly in combination with not being able to use a keyboard)
  • Have a slow Internet connection (includes so-called "mobile broadband")
  • Are using a tiny screen, such as on a smart-phone

This list is far from exhaustive. For further reading, I would suggest Introduction to Web Accessibility from the Web Accessibility Initiative, the official body that creates the Web Content Accessibility Guidelines (and other related guidelines).

People who fall into the first three categories on my list have technology solutions available to them, but these technologies rely on a certain quality of content to be able to do their job. [Note: I have written software that attempts to make sense of web content. Trying to get it to work with non-standards compliant content is highly frustrating to say the least.] Following web standards will not necessarily make content accessible – more work is required on the part of the content creator – but it's a very good start.

Conclusion

If we ignore web standards and accessibility, we run the risk of excluding people from being able to make use of our content. Not only is this immoral (discriminatory), but it makes poor business sense (excluding potential customers) and – in many jurisdictions – is illegal.

Support web standards and accessibility. If you don't create content yourself, you may have influence/responsibility over it. Make that content standards-compliant and accessible and you will be able to go home feeling that you have been socially inclusive, gained customers, and won't be seeing the wrong end of a discrimination suit.

What You Can Do

  • If you are a content creator, learn to write valid, semantic, markup. Learn about web accessibility techniques and apply them.
  • If you are having web content created, specify that it must be standards-compliant and that accessibility should be a primary design consideration, not an add-on. (If your contractor says "it doesn't matter," find someone else.)
  • Everybody: find out how people with disabilities use the web (try searching for "screen reader video"). It can be quite a thought-provoking experience.
  • Spread the word!

TinyURL for this post: http://tinyurl.com/yf2aad9

What, no Nick Heyward? No. I don't even like the song – I was just looking for a snappy tagline.

Accessible Twitter, Accessible Tweets

Preamble

Twitter is one of a host of web-based social networking tools, excellent in concept but (in my opinion) less than perfect in implementation – especially when it comes to usability/accessibility. Whilst this article discusses Twitter and Twitter messages (updates, tweets), certain aspects apply equally to other contexts.

Accessibility Through Alternative Interfaces

Like many of the web applications that I encounter, it appears to me that little (if any) thought has gone into the accessibility and usability of the interface. I see little point in working through the issues and proposing possible solutions when this work has already been done and can be seen as Accessible Twitter. Accessible Twitter, the work of one Dennis Lembrée, is still at the alpha stage of development but is already everything that the Twitter web interface should be and more.

In a way, Twitter actually gains accessibility points through offering the API on which Accessible Twitter and numerous other alternative interfaces operate – if one looks at the Big Picture. (My more cynical side looks at the API as a cop-out on the part of Twitter: don’t like the interface? Here, go build your own.)

Could this be the modern-day “provide a text alternative” from WCAG 1.0? (Which was also a cop-out.)

However one feels about this, APIs for many web applications are available and may be used as a Force for Good (or at least to provide more usable/accessible interfaces).

What the Heck Did That Mean?

Twitter has one major limitation: 140 characters.  Historically, this was to allow messages to be sent by SMS; SMS messages can be up to 160 characters long (or 2 lines of an 80-column terminal, for those old enough to remember).  The 140-character Twitter limit comes from splitting those 160 characters into 20 characters of user name plus 140 characters of message.  I have always questioned the logic of this; if Twitter was designed to work via cellular telephones, why not use WAP, which has no such limits? (The sophistication of today’s cellular telephones – even the basic ones – offers a host of better ways to work than SMS.)

Here is not the place to debate this issue, nor is there any real point in doing so – Twitter has a 140 character limit: it is a given, we are stuck with it, end of story.

Brevity is the Soul of Wit

If this is so, oh, what a witty place Twitter must be!  (My attempt at humour in less than 140 characters.)  There are those who argue that the 140 character limit can make us better communicators.  This I dispute: if it takes me 20 seconds to type a message and then a further 5 minutes trying to re-phrase, remove punctuation, and (horrors!) abbreviate words to fit that message into the permissible 140 characters, I do not feel that this is effective use of my time.

Whilst many messages might fit into 140 characters without requiring any form of re-work or compression (“happy birthday!”, “dinner’s ready”, “war is peace”, “ignorance is strength”, “freedom is slavery”), there is much that does get shortened into an oft-incomprehensible form of Newspeak. I read Twitter messages from people that I know beyond the realms of Twitter and frequently find myself mystified by industry-specific abbreviations which leave me thinking “what the heck did that mean?”  Generally, I feel too embarrassed to ask.

Accessibility Implications

Consider this: however we access Twitter, it can be considered to be “on the web”.  Twitter messages are, therefore, web content.  If the various interfaces to Twitter are used to create web content, that makes them authoring tools.  Should these interfaces therefore be covered by the ATAG? That, however, is a tangent best left for another discussion.

Twitter messages are, nevertheless, web content.  By causing people to use degenerate (I can’t think of a better term) language (2 for ‘to’, 4 for ‘for’, u for ‘you’ – and all the others that make me twitch and occasionally froth at the mouth), the web content that is Twitter messages becomes anything but plain language – and thus becomes less accessible, especially to those for whom the language in question is not a first language, those with literacy issues, etcetera.

Answers?

I have no practical answers to this issue.  One can write a blog post with the required text and post a link with just a headline on Twitter.  This, however, is just too slow and inefficient.  A 200-word answer to a one-off question may not justify a blog post anyway.

Splitting a message into a series of Twitter messages is something that I have seen a few times but:

  • Twitter has very little support for message threading.
  • Messages may arrive with messages from others interspersed and thus lose sense due to the broken context.

Conclusion

This article seeks to raise awareness of the following:

  • Third-party applications may be used in place of Twitter’s less-than-perfect web interface.
  • The limitation in length of Twitter messages may create accessibility issues due to the use of abbreviations and degenerate language.

@smiffy

TinyURL for this article: http://tinyurl.com/cbxphx

Further Reading

Max Design CSS and Accessibility workshop series – May 2009

I have received a mailing from Russ Weakley (Twitter: @russmaxdesign) advising of the Max Design CSS and Accessibility workshop series being run May 2009:

Day 1: A practical guide to preparing WCAG 2 compliant websites

Roger Hudson

An all new accessibility workshop that focuses on providing practical advice to help developers and organisations comply with the Web Content Accessibility Guidelines, Version 2.0.

During the day, you will learn the importance of web accessibility, how to overcome common accessibility problems, how to reduce the risk of a discrimination complaint, and the techniques for complying with essential WCAG2 Success Criteria.

Day 2: Mastering CSS and XHTML – building elegant websites

Russ Weakley

Over a full day you will build detailed website layouts from the ground up – starting with flat graphic mockups and ending with clean markup and elegantly styled pages using XHTML/CSS.

The course will cover styling forms, writing efficient CSS, use of sprites, sliding doors, dealing with browser bugs, creating print CSS and styling for different screen sizes.

Dates

  • Sydney – Monday 18 May and Tuesday 19 May
  • Canberra – Thursday 21 May and Friday 22 May
  • Melbourne – Monday 25 May and Tuesday 26 May
  • Brisbane – Thursday 28 May and Friday 29 May

More information: http://maxdesign.com.au/workshop2009/

Web Accessibility Techniques workshop in Adelaide on 20 November 2008

Just passing on this communication from Vision Australia:

Vision Australia is running their popular Web Accessibility Techniques workshop in Adelaide on 20 November 2008.

This full-day workshop run by Vision Australia is targeted at web-development team leaders and corporate communications professionals, along with content authors, web programmers and designers, and web contract managers. A basic knowledge of HTML is helpful.

This workshop provides a thorough overview of accessibility issues and the techniques used to address them. It covers the World Wide Web Consortium’s Web Content Accessibility Guidelines and their implementation.

Course outline & registration details here.

Paralympics New Zealand – One for the Rogues’ Gallery

Paralympics New Zealand – a group with a disability focus – should, of all organisations, be aware of accessibility issues.

Their web site, however, is a classic example of how not to make a site accessible (yes, it was even produced with Microsoft FrontPage), and is joining my Accessibility Rogues’ Gallery with special dis-honours as an Organisation That Should Know Better.

I look forward to being able to write a follow-up to this post advising that all has been made good.

Dilbert Goes Inaccessible

It’s been nearly a month since I last looked at any online cartoons. Much to my horror, the Dilbert site has had an alleged ‘upgrade’; whilst some of the new features (user comments, enhanced searches) are a step in the right direction, the main comic is now displayed in some ghastly piece of Flash. Going back to a particular cartoon and then trying to read through to the current one is not easy – I can find no link or Flash control for ‘next’ or ‘previous’, so I have had to change the URI of the current cartoon in the location bar to the next one I want to view (thankfully it uses a consistent URI scheme).

Whilst it is very hard for a cartoon to be fully accessible without a text transcript that actually reflects the humour, this latest change to the Dilbert site is now less accessible even to those who could use it before. (There is one comment on one of the cartoons complaining that Flash content is blocked on the poster’s work network so they can no longer read the cartoon at work.)

Hopefully Scott Adams will heed the comments and get something more accessible and standards-compliant implemented. Until that time, dilbert.com has been awarded a place in my Accessibility Rogues’ Gallery.

Update: Slashdot has this to say on the matter.

Am I Too Slow?

Preamble

From the introduction to the Web Content Accessibility Guidelines, 1.0:

For those unfamiliar with accessibility issues pertaining to Web page design, consider that many users may be operating in contexts very different from your own … They may have … or a slow Internet connection.

Despite broadband Internet connections becoming more widespread, slow connections continue to be an issue with the growth of web access via mobile devices. Whilst we have no control over how fast a user's connection is, there are things that we can do to make life easier – and faster – for those with slow connections. Connection speeds, however, are not the only speed-limiting factors in the delivery of web content. This article describes some of the factors that can impact how quickly web content may be delivered and rendered, with suggestions as to how we can make improvements.

Time Trials

Whilst dusting off an old 28,800 bps modem and using it to connect to the Internet is one way to get a feel for the overall performance of a web site, it is not exactly practical – no more so than connecting via a cellphone (which is also very expensive).

The tool I tend to use for checking speed is an online service from WebSiteOptimization.com – the free Website Performance Tool and Web Page Speed Analysis. For those using the Web Developer Toolbar for Firefox, there is a shortcut to this service via Tools->View Speed Report.

Size Counts

Before I go into the more complex issues of dynamic sites and technical stuff about web servers, let's have a look at the issues that can affect simple, static web sites. (All the issues here apply to dynamic sites as well.)

Images and Other Media

Remember the days when it took several minutes for a large image to render on the screen? Just spare a thought for those who still have connections that slow.

What is the problem? Too many images? Too large images? Too many, too large images? (When I say images here, this applies equally to any other media that are loaded by default with the page.) The answer is really one of overall size. Look at the sizes of all the images that load with a page, add them together, then add the size of the page itself plus any linked stylesheets or scripts. The greater that total size, the longer the page will take to load and render.
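As a back-of-envelope illustration, that adding-up exercise can be sketched in a few lines of Python (the file names and sizes here are invented for the example):

```python
# A rough sketch (hypothetical asset sizes) of totalling a page's
# payload and estimating transfer time on a slow connection.

assets = {
    "index.html": 24_000,   # bytes
    "style.css": 8_000,
    "scripts.js": 15_000,
    "banner.jpg": 180_000,
    "photo1.jpg": 95_000,
}

total_bytes = sum(assets.values())

# A 28,800 bps modem moves roughly 3,600 bytes per second (ignoring
# protocol overhead, which makes things worse in practice).
seconds_at_28k = total_bytes / 3_600

print(f"Total payload: {total_bytes} bytes")
print(f"Approx. time at 28.8 kbps: {seconds_at_28k:.0f} seconds")
```

Even this modest page weighs in at well over a minute on a dial-up connection – most of it down to the two large images.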

Questions to ask yourself:

  • Do I really need all those images – are they essential for making the page look good, or do they just add clutter and distract the eye from the subject matter?
  • Do my images need to be that big? (Thumbnails may be linked to larger images for those who want to see all the gory detail and don't mind waiting.)
  • With JPEG images, how much can I increase the compression without noticeable loss of quality? (The answer is sometimes "quite a lot".)
  • Photographs: are these photographs well-composed, or could they benefit from cropping (and thus reducing the size)?

What's All That In Your Document <head></head>?

I would have to put up my hand to having created pages where the document <head></head> is larger than the <body></body> – generally due to the inclusion of large amounts of Dublin Core metadata (see my previous article, 'Metadata, Meta Tags, Meta What?'). There are lots of things that should be up there in the <head></head>, but there are some things that may be better placed elsewhere:

Styling
Unless your site has only one page, forget about having <style></style> in your <head></head>; use an external stylesheet and link to it. If your user agent (web browser) and the web server are both behaving properly, your external stylesheets should be requested from the server once and then cached somewhere on your local computer. If, however, you are duplicating that information in the <style></style> of every page, you are pulling that data down every time a page is loaded. Don't forget that this is in addition to the half-a-megabyte of banner image that you created at the wrong resolution and then scaled using CSS.
Scripts
Whilst there are some scripts that may only be required on one page, any that need to run on multiple pages should be stored externally. Once again, caching means the script file is fetched from the server once rather than on every page load.

I Don't GET It

When you ask your user agent to fetch you a web page to read (or look at the pictures), whilst you are saying "bring me that page with all the pictures on it", the user agent has to do far more work than you might expect. The web is all based on HTTP transactions. (And you thought that the http:// at the beginning of URIs was just there to be annoying.) Let's consider a hypothetical page with 2 CSSs, 4 images and a Google Analytics tracker. When you say "bring me that page with those 4 nice images that my friend told me about", the user agent has to go through all this:

  1. Contact the server and issue a GET request for the HTML page itself.
  2. Have a look at the HTML page when received and make a list of all the other GETs that it needs to do.
  3. GET a CSS.
  4. GET another CSS.
  5. GET a background image specified in one of the CSSs.
  6. GET image #1
  7. GET image #2
  8. GET image #3
  9. GET image #4
  10. GET urchintracker.js from Google, and wait five minutes for it to turn up
  11. Render the page.

In case you weren't counting, that was 9 HTTP transactions to bring you that one page. Although all the to-ing and fro-ing of an HTTP transaction doesn't (usually) take that long, each transaction does take a finite amount of time. If you can put all your CSS in one file (assuming it's all for the same media type or all media types), do so – that's one less HTTP transaction to slow things down.
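For fun, step 2 of that list can be played out in a few lines of Python – a toy scanner (the page below is invented to match the example) that lists the extra GETs a page will trigger:

```python
from html.parser import HTMLParser

# A toy version of "make a list of all the other GETs": scan a
# (hypothetical) HTML page for images, scripts and stylesheets.

class SubresourceLister(HTMLParser):
    def __init__(self):
        super().__init__()
        self.requests = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "src" in attrs:
            self.requests.append(attrs["src"])
        elif tag == "script" and "src" in attrs:
            self.requests.append(attrs["src"])
        elif tag == "link" and attrs.get("rel") == "stylesheet" and "href" in attrs:
            self.requests.append(attrs["href"])

page = """
<html><head>
<link rel="stylesheet" href="main.css">
<link rel="stylesheet" href="print.css">
<script src="urchintracker.js"></script>
</head><body>
<img src="one.jpg"><img src="two.jpg"><img src="three.jpg"><img src="four.jpg">
</body></html>
"""

lister = SubresourceLister()
lister.feed(page)
# 1 (the page itself) + the subresources found = total HTTP transactions,
# not counting any background images hidden away in the CSS.
print(1 + len(lister.requests), "HTTP transactions")
```

Real user agents do considerably more work than this, of course, but the principle – every subresource is another round trip – holds.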

The facetious comment about Google Analytics comes from bitter experience – I have, on many occasions, had to wait for pages to finish loading, just for some piece of JavaScript that tracks sea urchins. Not being unduly interested in sea urchins (or other people's ability to track site visitors), the Firefox Adblock extension saves me that HTTP transaction every time.

Update:

I am advised by a reader that a much faster Google Analytics script is now available – according to Google. I will believe this when I see it.

No, Not Here, Over There

Redirects can be really, really handy when writing web applications; just don't over-do them, as every one means an extra HTTP request.

Server Tips and Tricks

Squish!

As we all know, the Internet is a set of tubes. To get things to move through tubes faster, we can squash them up nice and small. I just saved this article, as far as I have written, to a file and looked at its size, which was 19399 bytes. I then squashed it up nice and small using a tool called gzip, after which it was 5971 bytes – that's less than a third of the original size. Text files – HTML, CSS, JavaScript – squash down really well. Image files are another case since many image file formats allow for compression. If you have compressed a JPEG image as much as you can, trying to squash it down yet further using gzip can – in some circumstances – make it bigger. Strange, but true.
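To see the effect for yourself, Python's standard gzip module gives a quick demonstration (the repeated sentence stands in for typical HTML text; the random bytes stand in for an already-compressed image):

```python
import gzip
import os

# Text squashes down dramatically; already-compressed data does not
# (and can even grow slightly, thanks to gzip's own overhead).

text = ("The quick brown fox jumps over the lazy dog. " * 400).encode()
squashed_text = gzip.compress(text)

already_compressed = os.urandom(len(text))  # stand-in for a JPEG
squashed_random = gzip.compress(already_compressed)

print(len(text), "->", len(squashed_text))                   # shrinks dramatically
print(len(already_compressed), "->", len(squashed_random))   # barely changes
```

This is why it pays to compress the text – HTML, CSS, JavaScript – and to leave the images to their own native compression.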

But how can you squash your files? This is something that can be set up either in your web server configuration or, if you are running a dynamic site, in the web application itself. Not every user agent can handle squashed files, so either the web server or our software has to look at a line of the HTTP request that says something like:

Accept-Encoding: compress, gzip

This means that we can squash our files using either the compress or the gzip format. Alternatively, we might see:

Accept-Encoding: compress;q=0.5, gzip;q=1.0

This means that both compress and gzip are acceptable, but gzip is preferred – it has the higher q-value. (Personally, I prefer a super-squasher called bzip2, but I haven't heard of it being supported by user agents.)
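As a rough sketch of what the server (or application) side of this negotiation looks like – a simplified parser, not a full implementation of the HTTP specification (real q-value handling has more edge cases, such as the * wildcard):

```python
# A minimal sketch of picking an encoding from an Accept-Encoding
# header with q-values. Per HTTP, a missing q defaults to 1.0 and
# q=0 means "never send me this".

def pick_encoding(header, supported=("gzip", "compress", "identity")):
    prefs = {}
    for part in header.split(","):
        bits = part.strip().split(";")
        coding = bits[0].strip()
        q = 1.0
        for param in bits[1:]:
            name, _, value = param.strip().partition("=")
            if name == "q":
                q = float(value)
        prefs[coding] = q
    # Choose the supported coding with the highest q-value.
    candidates = [(q, c) for c, q in prefs.items() if c in supported and q > 0]
    return max(candidates)[1] if candidates else None

print(pick_encoding("compress;q=0.5, gzip;q=1.0"))  # gzip
print(pick_encoding("compress, gzip"))              # both q=1.0; tie broken arbitrarily
```

In practice you would let the web server (Apache's mod_deflate, for instance) do this for you rather than writing it yourself.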

In every silver lining there is, however, a cloud; whilst our squashed up files may go through the tubes a lot quicker, there is computer overhead at both ends as the web server (or application software) needs to do the squashing before it sends the files off and the user agent has to un-squash it before it can be rendered. (Visions of trying to unpack an over-stuffed rucksack spring to mind.)

Tune Up

If you are running your own web server, you did read all the documentation didn't you? (Ha!) Assuming that it was so long ago that you have forgotten, try Googling for: Apache tuning spare-servers. (If you don't use Apache, substitute the name of your own web server software and strike out the spare-servers bit.) Getting your server configuration right can make a big difference in how quickly you can service incoming requests, especially when things get busy.

Hardware (Technical Stuff Alert)

If you are not only running your own web server, but are doing so on your own hardware, put as much RAM in it as it will take or you can afford. Use RAID, not just for data security, but for performance. Use fast discs with the fastest interfaces. Use multi-core CPUs. Build your LAMP components (assuming a LAMP environment) specifically for your processor architecture and with the appropriate optimisation flags set.

Even if you are running a virtual private server on someone else's hardware, you can generally pay a little extra to increase your RAM. Do it. The less the operating system has to swap, the sooner your web content gets to your customers, or your customers' customers.

Stop running SETI@Home on your web servers – it really doesn't help matters.

Mind Your Language (More Technical Stuff)

Web applications can slow things down too! Here are a few bullet-point tips for those who write and use web applications:

  • If you are new to programming, don't be satisfied that your programme works – make sure that it works efficiently. Take the time to really learn your language of choice – and that includes SQL and the features of whatever RDBMS you are using. PHP is so easy to code that it is easy to code badly. A bit like cars with automatic transmission – anyone can drive one through their neighbour's front window. If you do not have a programming background, try learning a "real" language like C – the discipline should do wonders for your PHP coding skills. I would recommend 'C All-in-One Desk Reference For Dummies' by Dan Gookin (the guy who wrote the original 'DOS for Dummies') as an ideal beginners' text. If you are able to learn from Kernighan & Ritchie, you must already be a programmer and need no further telling.
  • Don't run PHP as CGI – use the appropriate Apache module.
  • If you use Perl and your site is getting big/busy, start converting your code to run with mod_perl before everything starts to slow down. (For an example of a large site running on mod_perl: Slashdot.)
  • Use sub-selects in your SQL – try to keep down recursion (do a query, do something with it, do another query based on that) in PHP/Perl – it's inefficient. The fewer calls you make to the database – and the more you can get the database to do for every call (think stored procedures) – the faster things will run.
  • Consider having tables of cached content such as metadata, navigation structures, etc., that are updated when pages are changed. These often involve complex queries which can impact performance on busy/large sites, if run every time a page is requested. Caching the output of complex queries means that those queries are run only once when the page is created – simpler, faster queries are then used to deliver the content.
  • For content that is not changed often, consider caching it as static pages as these can be served much quicker than having to run a programme every time the page is requested. Reverse proxies can be useful here, too.
  • If you are going to be searching on a database field, make sure that it is indexed. MySQL's fulltext indexing is very powerful, and very fast.
  • When designing your database, make it so that fields that link to other tables are integers. You can't get any faster than integer comparisons. (Don't forget to index those fields too.)
  • If you really want blinding performance and can't just throw more hardware at it, consider moving to a compiled language like C.
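To illustrate the query-per-row problem and the indexing point above, here is a small sketch using SQLite for portability – the table and column names are invented for the example:

```python
import sqlite3

# "Recursion" (a query per row) versus letting the database do the
# work in a single call with a join.

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE posts (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT);
    CREATE INDEX idx_posts_author ON posts (author_id);  -- index the field we search on
    INSERT INTO authors VALUES (1, 'Smiffy'), (2, 'Zeldman');
    INSERT INTO posts VALUES (1, 1, 'Taming Ello'), (2, 2, 'Blue Beanie Day');
""")

# Slow pattern: one query, then another query for every row returned.
slow = []
for author_id, name in db.execute("SELECT id, name FROM authors ORDER BY id"):
    for (title,) in db.execute(
            "SELECT title FROM posts WHERE author_id = ?", (author_id,)):
        slow.append((name, title))

# Faster pattern: a single call with a join.
fast = db.execute("""
    SELECT a.name, p.title
    FROM authors a JOIN posts p ON p.author_id = a.id
    ORDER BY a.id
""").fetchall()

print(fast)  # same rows, one round trip instead of three
```

With two authors the difference is trivial; with ten thousand rows on a busy site, the query-per-row version becomes thousands of extra round trips to the database.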

Conclusion

Speed is an accessibility issue, and the things that slow down the delivery of web content are cumulative in effect. Every little thing that you can do to get your content to your audience faster is worth it – and may mean the difference between gaining a sale (or whatever) and having your prospective client get fed up with waiting and go elsewhere.

This article was written for the February 2008 edition of the newsletter of the Guild of Accessible Web Designers (GAWDS).

Matthew Smith asserts the right to be identified as the original author of this work.

Reading Difficulties – Not Getting the Points

A couple of days before I left England for the last time, on my way to Australia, I "did the rounds" of the local town, saying goodbye to various shopkeepers I had known.  In the bookshop, the proprietor insisted that I should buy a copy of "The Fatal Shore" by Robert Hughes.  I had actually seen a television programme related to the book and had been quite impressed – I fully expected to enjoy the book.  8 years later I have, despite a few attempts, not made it past the first few pages.  Something that I couldn’t quite identify really bugged me about that book.

Skipping ahead to the present, I have a stack of 3 books beside my chair in the living room.  They are: "The Selfish Gene" and "The God Delusion", both by Richard Dawkins.  The third (a little more esoteric) is "Dæmonologie" by the man I think of as "Loony King Jimmy Stuart", more formally known as King James I of England.  (Anyone wondering at the disrespectful epithet should read the book and draw their own conclusions.)

I can rarely read a non-fiction text in one sitting (unlike sci-fi and fantasy) but am getting on quite happily with "The God Delusion" and "Dæmonologie".  "The Selfish Gene", however, is giving me some trouble – I really struggle to read it.

Why do I find two books by the same author so different to read?  I don’t think that Dawkins’ writing style is that much different between the two books.  The language of "Dæmonologie" (especially the spelling) should certainly make it a more challenging read than something written just under 400 years later.

Yesterday, I looked at all 3 books and realised where my problem lies.  "Dæmonologie" is printed in a font which I guess to be about 13 point. (A point, for those unaware, is a printers’ measure – 1 point being equal to 1/72 of an inch.)  "The God Delusion" appears to be about 12 point, with a line spacing of 1.5.  "The Selfish Gene" and – now I look at it – "The Fatal Shore" are both printed at about 10 points.  I do believe that my problem is simply that the offending books are printed too small.

As I suffer from diabetes and have a high risk of glaucoma, I have thorough eye checks every 6 months.  I know for a fact that my vision is "perfect", despite having one eye stronger than the other.  I have spectacles for when my eyes are tired – generally from sitting too long in front of a computer.  Why, then, do I have trouble reading books with small font sizes – even when wearing my spectacles? Am I looking at an issue of inaccessible print content?  At least when viewing a web page with a font that is too small, I can always (and frequently do) increase the size to something that I find comfortable – not something that I can do with a paper-and-ink book.

Now that I appear to have identified my problem, I would like to conclude this article with a solution. Unfortunately, I do not have one. I will certainly bring this issue up the next time I see the optometrist and will check any books that I might buy in a bookshop before purchase. This helps me little, though, as the majority of the books that I purchase come from the quickest and cheapest source – Amazon.com.

Beyond ALT Text – The Nielsen Norman report for free

Get your copy of the Nielsen Norman Group report Beyond ALT Text: Making the Web Easy to Use for Users With Disabilities – for free. (At least, it is free as I write this – the normal price is 124 USD.)

I have yet to read the report myself, but have been advised by reliable sources that it is an excellent piece of work.

Get them while they’re ‘ot, they’re luvverly!