High Accessibility Is Effective Search Engine Optimization
Issue № 207

Many web designers view search-engine optimization (SEO) as a “dirty trick,” and with good reason: search engine optimizers often pollute search engine results with spam, making it harder to find relevant information when searching. But in fact, there is more than one type of search-engine optimization. In common usage, “black-hat” SEO seeks to achieve high rankings in search engines by any means possible, whereas “white-hat” SEO seeks to code web pages in a way that is friendly to search engines.

In Using XHTML/CSS for an Effective SEO Campaign, Brandon Olejniczak explains that many web design best practices overlap with those of white-hat SEO. The reason is simple: such practices as separating style from content, minimizing obtrusive JavaScript, and streamlining code allow search engines to more easily spider, index, and rank web pages.

Two years later, I am going to take Brandon’s conclusions a step further. I have been a search engine optimizer for several years, but only recently have I become infatuated with web accessibility. After reading for weeks and painstakingly editing my personal website to comply with most of the W3C Web Content Accessibility Guidelines, I have come to a startling revelation: high accessibility overlaps heavily with effective white-hat SEO.

Accessibility for all users, even search engines

On further reflection, this overlap makes sense. The goal of accessibility is to make web content accessible to as many people as possible, including those who experience that content under technical, physical, or other constraints. It may be useful to think of search engines as users with substantial constraints: they can’t read text in images, can’t interpret JavaScript or applets, and can’t “view” many other kinds of multimedia content. These are the types of problems that accessibility is supposed to solve in the first place.

Walking through a few checkpoints

Now that I’ve discussed the theory of why high accessibility overlaps with effective SEO, I will show how it does so. To do this, I am going to touch upon each Priority 1 checkpoint in the W3C Web Content Accessibility Guidelines that affects search-engine optimization.

1.1 Provide a text equivalent for every non-text element (e.g., via “alt”, “longdesc”, or in element content)…

Not only are search engines unable to understand image and movie files, they also cannot interpret any textual content that is based on vision (such as ASCII art). alt and longdesc attributes will, therefore, help them understand the subject of any such content.

Search engines are likewise “deaf” to audio files. Again, providing textual descriptions of these files allows search engines to better interpret and rank content that they cannot “hear.”
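As a minimal sketch (the file names and wording below are hypothetical), a text equivalent for an image might look like this:

```html
<!-- alt gives a short text equivalent; longdesc points to a
     fuller description for complex images (both hypothetical) -->
<img src="photos/lighthouse.jpg"
     alt="Photo of the Cape Hatteras lighthouse at sunset"
     longdesc="descriptions/lighthouse.html" />
```

A screen reader speaks the alt text aloud; a search engine indexes it the same way, which is exactly the overlap this checkpoint illustrates.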

1.2 Provide redundant text links for each active region of a server-side image map.

Text links are very important to search engines, since anchor text often succinctly labels the content of a link’s target page. In fact, many search engine optimizers consider anchor text to be the single most important factor in modern search algorithms. If a website uses an image map rather than a text-based menu as the primary navigational method, a redundant text-only menu elsewhere on the page will give search engines additional information about the content of each target page.
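For example, a page that relies on a server-side image map for navigation might repeat the same destinations as plain text links; the URLs below are hypothetical:

```html
<!-- Primary navigation as a server-side image map -->
<a href="/cgi-bin/navmap"><img src="images/nav.gif" ismap="ismap"
   alt="Navigation: Home, Products, Contact" /></a>

<!-- Redundant text links that spiders (and all users) can follow -->
<p>
  <a href="/">Home</a> |
  <a href="/products/">Products</a> |
  <a href="/contact/">Contact</a>
</p>
```

The anchor text in the redundant links (“Products,” “Contact”) is what gives the search engine its label for each target page.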

4.1 Clearly identify changes in the natural language of a document’s text and any text equivalents (e.g., captions).

Major search engines maintain country and language-specific indexes. Specifying the language of a document (or of text within a document) helps search engines decide in which index(es) to place it.
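In (X)HTML this is done with the lang attribute, both on the root element and on any passage that switches language; a sketch:

```html
<html lang="en">
  <!-- ... -->
  <p>The French phrase <span lang="fr">joie de vivre</span>
     has no exact English equivalent.</p>
</html>
```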

6.3 Ensure that pages are usable when scripts, applets, or other programmatic objects are turned off or not supported […]

Some users choose to disable JavaScript and applets in their browser’s preferences, while other users’ browsers do not support these technologies at all. Likewise, search engines’ “browsers” do not read scripts; therefore a webpage’s usability should not be crippled when scripts are not supported. Otherwise, search engines may not even index the page, let alone rank it well.
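One simple safeguard is to pair any script-driven content with an ordinary HTML fallback, for instance via the noscript element; the menu and file names below are hypothetical:

```html
<!-- Script-generated menu for capable browsers -->
<script type="text/javascript" src="scripts/menu.js"></script>

<!-- Plain-text fallback that search engines and script-less
     browsers can still read and follow -->
<noscript>
  <p>
    <a href="/">Home</a> |
    <a href="/products/">Products</a> |
    <a href="/contact/">Contact</a>
  </p>
</noscript>
```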

14.1 Use the clearest and simplest language appropriate for a site’s content.

It is a bit less obvious how this particular checkpoint aids SEO. But if a website contains the “clearest and simplest language appropriate for the site’s content,” it is probably using those keywords with which potential searchers will be most familiar. Searchers tend to use succinct queries containing familiar language. Thus, to receive maximum traffic from search engines, it is best that a website contain the same words which the site’s audience will use when searching.

The benefits do not end with Priority 1—many of the Priority 2 and 3 Checkpoints are important for SEO purposes, too. For instance, Checkpoints 6.2 and 6.5 refer to the accessibility of dynamic content. In fact, making dynamic content search engine-friendly is one of the most daunting tasks a search engine optimizer faces when working on an ecommerce or database-driven site. Following the W3C’s recommendations can help to avoid any indexing or ranking problems related to using dynamic content.
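One widely used tactic for dynamic sites (a common approach, not one the article itself prescribes) is to rewrite query-string URLs into static-looking paths, so spiders crawl clean, stable addresses. A minimal Apache mod_rewrite sketch, assuming a hypothetical product.php script:

```apacheconf
# Serve /products/42 from the dynamic script /product.php?id=42,
# so spiders see a clean URL instead of a query string.
RewriteEngine On
RewriteRule ^products/([0-9]+)$ /product.php?id=$1 [L]
```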

From the horse’s mouth

If you doubt any of the above, perhaps a visit to Google’s Webmaster Guidelines could convince you that Google rewards high accessibility. This page specifically mentions best practices which will help Google “find, index, and rank your site.”

Design and Content Guidelines:

  • Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.
  • Offer a site map to your users with links that point to the important parts of your site. If the site map is larger than 100 or so links, you may want to break the site map into separate pages.
  • Create a useful, information-rich site, and write pages that clearly and accurately describe your content.
  • Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.
  • Try to use text instead of images to display important names, content, or links. The Google crawler doesn’t recognize text contained in images.
  • Make sure that your title and alt tags are descriptive and accurate. […]

Technical Guidelines:

  • Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site.

Note that each of Google’s guidelines actually correlates with a W3C Web Content Accessibility Guideline. (Oddly enough, the word “accessibility” does not actually appear in Google’s Webmaster Guidelines. Perhaps they are afraid of scaring off some webmasters with technical jargon? In any case, it is clear that Google is lobbying for high accessibility.)

SEO: just another feather in accessibility’s cap

The checkpoints I highlighted above are just a few of the many ways that high accessibility will help optimize a website for search engines—many of the other checkpoints in the W3C Web Content Accessibility Guidelines are helpful to SEO, as well. Of course, to most web designers, the goal of accessibility is (and should be) to make sites accessible to all people, independent of their platform or any disabilities they have. But if accessibility gets a website more traffic from Google, even better!

The good news is that a web designer who follows best practices for accessibility is already practicing solid white-hat SEO. Search engines need not scare anyone. When in doubt, design your site to be accessible to blind and deaf users as well as those who view websites via text-only browsers, and SEO will fall into place.

71 Reader Comments

  1. I’d like to index my page with the Google sitemap, but using SSIs for the head and first part of the body of all my pages prevents me from creating a new document title for each page, resulting in a repetitive index. I think there’s got to be a better templating method than using SSIs in this way. Please friends, enlighten me.

  2. Here is one page:


    Title – Page 1

    Page 1

    Lots of stuff in here.

    Here is another page:


    Title – Page 2

    Page 2

    Lots more stuff in here.

    Different titles. Multiple (two in this case) includes in the same page.

    Kludgy? Maybe.

    Does it work? You bet.

  3. By meeting the accessibility guidelines, you not only provide disabled people with access to your site, you can also provide keyword-rich alt text that can be indexed by search engines. This can be especially beneficial in image searches. If you are in the travel game, you will probably want to make sure you are using alt attributes.

  4. What Hagans has said is unfailingly true. Accessibility is king in the SEO rules, as pointed out by the Google chaps themselves. But it does not really succeed as a single honest tool. I have seen spammed pages get higher rank in Google search, and really accessible pages suffer blackout swarms. Be accessible first, and then a little mischievous: that is the real motto in SEO, as far as success goes.

  5. I agree with your viewpoints. Indeed, there is a considerably sized body of practitioners of SEO who see search engines as just another visitor to a site, and try to make the site as accessible to those visitors as to any other who would come to the pages.
    They often see the white hat/black hat dichotomy mentioned above as a false dilemma. The focus of their work is not primarily to rank the highest for certain terms in search engines, but rather to help site owners fulfill the business objectives of their sites. Indeed, ranking well for a few terms among the many possibilities does not guarantee more sales.
    A successful Internet marketing campaign may drive organic search results to pages, but it also may involve the use of paid advertising on search engines and other pages, building high quality web pages to engage and persuade, addressing technical issues that may keep search engines from crawling and indexing those sites, setting up analytics programs to enable site owners to measure their successes, and making sites accessible and usable.
    Still, you really did present a great working SEO criterion.

  6. Now the company I work for here in India (WDC) can be
    found on the web if we use search words that describe our normal line of activities. But here you can’t work with new ideas; you just have to follow the route with a single “yes” all the time. Still, my efforts do make a lot of difference as an SEO optimizer. Even our main competitors are frightened at the rate of our acceleration.
    I have always been a believer in high accessibility. Now you can even find us on the front page of Google if you search for our exact name. I would love to be a Flash programmer soon, but I still think my SEO efforts are extraordinary. Maybe our vice president would take it very seriously if it ended up on his laptop.

  7. Interesting how this article passes the test of time. IMHO, so does table-based HTML. I have always had great success with table-based HTML 4.01, whether it validated or not. I guess since I started designing way back in the ’90s (a dog’s age in internet time), I am inclined to prefer older code that has been tweaked over time. I have never felt that standards made any difference in rankings. I will say that clean code is always the best code, and that cleaning a site can affect rankings, so one aspect here could be accurate.

    I suppose this is all beauty in the eye of the beholder.
    🙂

  8. My company is in the process of reconstructing its navigation-generation tool, and unfortunately they developed a tool that generates the navigation as DHTML using bloated JavaScript calls. If I could have been in the design phase, I would have tried to convince them to do something like Suckerfish (manipulating an unordered list with CSS). Now that I’m stuck with this, I’m wondering if there is something I can do in parallel with this menu to offer something SEO-friendly and perhaps even accessible too. I was thinking maybe to have a page not only render the regular nav using JavaScript, but also simultaneously render the same menu in a hidden DIV as a list for search engines to spider. This scares me because I don’t want to get the infamous BMW ban for having content hidden from human eyes. Does anyone have any suggestions?

  9. Ron, don’t go down the hidden DIV route; you are likely to get the site banned by one of your competitors reporting hidden text.

    Your best bet would be to redesign using suckerfish and kill the javascript altogether.

  10. It is true, as I have read many such articles, that certain people regard search engine optimization as a dirty trick. In a way they are not far off the mark, as white hat as it may be, all sorts of tricks and procedures are needed to successfully compete in search engine space. But it is not our fault. We inherited the system from the search engines, not the other way round, and it is these systems that we fiddle around with to find the necessary information about the algorithms before manipulating them. What’s so bad about that?

  11. I was disappointed to hear during a Google webinar event on 22 Oct:

    ‘Webmaster Chat – “Tips and Treats”‘

    that Google places no ranking benefit on a page that is well marked up versus one that is not. They made the point that great content put up by someone who did not know how to semantically markup that content should not be penalised.

    For me, this answer given by Google feels credible but at the same time I am disappointed to hear that one of the many benefits of web standards, that it improves SE ranking, is perhaps not true.

    I will always build to web standards for all the many good reasons, but perhaps I need to temper my enthusiasm for that place in the Venn diagram of life where web standards and SEO meet?

  12. Regarding the recent Google webinar and the comment above by Alan, that Google places no ranking benefit on a page that is well marked up versus one that is not: I think John and Matt were talking about “strict markup,” not markup in general. I am sure the information indexed has to be parsed based on markup, and when done properly there would be a benefit? Now I am confused…

  13. I (and I think Mike above) would be delighted if a venerable ALA staffer would give their view on this (#65 and #66). Thanks in advance for any additional comments on this. Cheers, -Alan

  14. This article is a great way to show the parallel between good practices and direct benefits for your business.
    I’ve found it really difficult to sell “web standards” to customers. Almost nobody seems to care. The proof of this is that a very high percentage of sites are not standards compliant, and this is not because of a lack of resources, as I point out in a post on Web Standards and Fortune 500 Companies:
    http://www.aggiorno.com/blog/post/Benchmarking-DOCTYPE-validation-in-Fortune-500s-Web-sites.aspx

    On the other hand, if you point to the benefits of web standards, like accessibility, SEO, reduction in maintenance costs, a better chance of displaying correctly on mobile browsers, etc., you do get some traction in the conversation.

    In any case, I believe Google does care about well written code and I summarize a number of examples in this post:

    http://www.aggiorno.com/blog/post/Web-Standards-and-Search-Engine-Optimization-(SEO)-Does-Google-care-about-the-quality-of-your-markup.aspx

  15. It’s not just personal experience, either. I was introduced to one of the black-hat SEO (for Google specifically) guides last month, where various people measured the effect of certain Google-defeating tricks over time, and the enduring and verified techniques were virtually all compatible with best-practice accessible, semantic site code and content design. Nearly all the others faded in usefulness over time, or even became actively penalised by Googlebot, but good, clear and simple code still has my sites 2nd and 4th on an ego-search at the time of writing.
