Design Choices Can Cripple a Website
Issue № 207


I admit, it’s a provocative headline. But it’s true.


However compelling the message, however great the copy, however strong the sales argument… the way a page is designed will have a dramatic impact on conversion rates, for better or for worse.

Before I go any further, I want you to look at three versions of the same offer page:

I know, they won’t win any design awards. They weren’t intended to. But they are functional and familiar. A reader going to any one of these pages will be able to quickly figure out what the message is, and what they are being asked to do.

Version A is the original.

Version B follows the same basic layout, but we made some minor copy changes.

In version C, we changed from a one-column format to two-column format. We wanted to test the impact of bringing more of the page content onto the first screen.

Be honest with yourself and decide now whether B or C beat A, and by what percentage

Don’t scroll down and look for the answer. You’re a designer, an expert in web design. So put your money where your credentials are and write down some figures now.

Write down a percentage by which B did better or worse than A. And a percentage by which C did better or worse than A.

The design choices you make have a profound impact on results

I imagine you have some way of measuring the success of your site. Maybe it’s about sales. Maybe it’s based on readership. But one way or another, your site has a purpose.

But I don’t think most designers truly understand the effect their design choices can have on achieving that purpose.

And yes, I’m sure you do some usability testing. And that likely gives you some broad, if sometimes confusing, insights into what’s working and what isn’t.

But do you test different page designs?

By testing, I don’t mean asking a few folks around the office; I mean doing a live test that demonstrates—with hard figures—what site visitors actually do.

Testing like that is a beautiful thing. There is no space for fancy arguments. An expert’s credentials and opinions mean squat. When you serve alternative versions, one after the other, and measure reader actions, you get the real deal. You get what is.

Do you do that? It’s a scary thing.

But if you are serious about achieving your site’s purpose, and if testing can show you which version of a page does best, then where is the argument not to test?
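To make the mechanics concrete: a live split test can be as simple as hashing each visitor’s ID into a bucket, so every visitor consistently sees one variant while you record what they do. A minimal sketch in Python (the function and variant names are illustrative, not from the article):

```python
import hashlib

def assign_variant(visitor_id: str, variants=("A", "B", "C")) -> str:
    """Deterministically map a visitor to one page variant.

    Hashing the visitor ID (rather than picking randomly on every
    request) means a returning visitor always sees the same page
    for the duration of the test.
    """
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# The same visitor always lands in the same bucket:
assert assign_variant("visitor-42") == assign_variant("visitor-42")
```

Conversions logged against each bucket then give the per-variant sales counts that a test like this reports.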

Here’s how design choice can make a difference

Here are just a few of the design elements we have found can make a significant difference to the performance of a web page:

  • The position and color of the primary call to action
  • Position on the page of testimonials, if used
  • Whether linked elements are in text or as images
  • The amount of “white space” on a page, giving the content space to “breathe”
  • The position and prominence of the main heading
  • The number of columns used on the page
  • The number of visual elements competing for attention
  • The age, sex and appearance of someone in a photo

OK… now for the results of the test.

A/B/C Split Test

                        Page A    Page B    Page C
  Percent of traffic      34%       33%       33%
  New sales               244       282       114
  Change                  N/A    +15.57%   -53.28%

Version B, with the minor copy changes, resulted in a 15.57% increase in sales—that represents a big revenue jump for a site with high sales volumes.

Version C, in which we changed the regular, one-column format into a two-column format, resulted in 53.28% fewer sales.

That’s an astonishing reduction in sales and revenues, resulting from a design change that was intended to improve the performance of the page.
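The percentage changes in the table follow directly from the sales counts; as a quick check (Python, using the figures above):

```python
def lift(baseline_sales: int, variant_sales: int) -> float:
    """Percentage change in sales relative to the baseline page."""
    return (variant_sales - baseline_sales) / baseline_sales * 100

print(round(lift(244, 282), 2))  # B vs A: 15.57
print(round(lift(244, 114), 2))  # C vs A: -53.28
```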

Now, just pause for a moment and think of all the design choices you have made over the last year, and the reasons why you made them. And think about the huge impact those choices might have had on the performance of the sites you worked on.

Some concluding thoughts…

The figures from this test are shocking. But they are not exceptional. Design changes really do have a huge impact on conversion rates.

Here are a few things to consider:

If you have some pages on a site which are critical to its overall success, instigate a program of A/B split testing. You cannot afford to guess; you have to know.
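Raw counts can mislead when traffic is thin, so before declaring a winner it is worth checking that the difference is statistically significant. One common approach (not described in the article) is a two-proportion z-test; here is a sketch using only the Python standard library, with hypothetical visitor counts, since the article does not report them:

```python
import math

def two_proportion_p_value(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for a difference between two conversion rates."""
    rate_a, rate_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (rate_b - rate_a) / se
    # Normal-distribution tail probability via the error function.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# 244 vs. 282 sales, on a hypothetical 10,000 visitors per variant:
p = two_proportion_p_value(244, 10_000, 282, 10_000)
print(f"p = {p:.3f}")
```

With these made-up traffic figures the B-vs-A difference would fall short of the conventional p < 0.05 threshold, which is exactly why running the numbers matters before acting on a result.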

Be aware that however strong the copy and text on a page, its performance is very much dependent on the way in which it is presented. In other words, design choices can enhance or diminish the power of the words.

Talk with your writers. Ask them how they think the message would best be presented. Then test some different versions. A good writer should have some strong instincts when it comes to the layout of the text.

One way or another, it’s important to accept that none of us—neither designers nor writers—know what the “best” page design or copy is until we test.

In a business environment where marketers demand an accountable performance from every web page, it’s time to put aside the assumed expertise of design and copy “gurus.”

The way forward is to test, and let our readers show us which designs work best, and which copy works best.

While this may be uncomfortable for some, the end result is that we will become much better web designers and writers.

74 Reader Comments

  1. But the Sample Report should have been on the left and the form on the right. Bah! Scrap that idea; there is too much on screen! Try putting the form on another page?

  2. I liked the article, but it was not scientific: no real control, plus the data sample was too small. I think the best layout would have been a combination of B and C, with the sales “copy” (“Click Here”) prominently displayed. In all three cases the yellow button at the bottom of the screen did little to shape first impressions. The call to action is the key element in version B that likely did the work.

  3. Leaving aside the question of the tastefulness or otherwise of the web page in question, is this article really on the level?

    Out of curiosity I googled my way to nationalalertregistry.com, entered the zip code 32225, and got… Version A. And just in case nationalalertregistry.com were still doing user testing à la comment #37, I tried accessing the same nationalalertregistry.com page via two other browsers. Same result.

    So what’s going on here?

  4. I still don’t understand why you don’t write this kind of article using figures from a hotel booking website, or anything more usual and less… subject to questions.

    Anyway:
    B/ Is ugly. No space. Not readable. Maybe there is a good idea in it, but the design itself is a mess, I think. Of course it improves conversion, because of the “Click here” button. You could have the same result by keeping the A design and simply adding this link at the top.

    C/ Is not very good from a design point of view, but the idea is OK. It would call for a redesign. You can’t really draw any conclusion from the figures you give if you don’t know more (duration of the test, target list, new users / recurring users…).

    My conclusion from reading this article, as with many other comments, is that it is all in the title and not much more ;-( but it can be interesting for people who are not aware of that (I think ALA readers are…).

  5. Subject matter aside, am I the only one who thinks that all 3 of these designs are amateur and hideous?

    Does the fact that one terrible design tested better than two other terrible designs make this a valuable study of web design (especially when each version used completely different verbiage)?

    I could go into my garage and bottle 3 different homemade wines. I could then go down to the vacant lot beside the local liquor store and conduct a taste test with some ‘users.’ One of my wines would score higher than the other two. What would this prove? Obviously it proves that the wrong wine label design can cripple a wine.
    ^To be perfectly analogous with the subject matter I should probably note that these particular wines are being used prepping the user for a ‘Bum Fight’ which we are taping at the same location.^

    Naturally these would be some valuable findings, and I would submit them to Wine Spectator magazine, who, being well regarded by vineyards and spectators alike would surely put the story on the cover with a provocative headline.

    Come on, ALA, show some editorial control.

  6. Despite what the percentages say, I really do prefer page C. Page A is also better than B, since B really makes me sick when I try to read it: the line-height and white space in general are definitely too poor. This is the most important thing to notice, in my opinion, more than the two- or one-column layout. In fact I think that the best design might be a redesigned C version. Two changes might be:

    1) Please take away that damn FREE word; I’m sure people associate it with SPAM. I do. You do. We do. They do. So it’s a stupid bug that corrupts the design of the page.

    2) The two-column layout is OK, but why should the columns be put this way? As the natural flow of the reader tries to find the sample first, just put it on the left. I think it’s quite obvious, isn’t it?

    Of course many other changes are to be made, but these are the really important ones, in my opinion.

    PS: When I see a “click here” link, the very first thing I do is laugh. Guess why.

  7. Do you think the fact that page C mentions the cost straight away, whereas the other two don’t mention cost until you have been sold on what you get in the report, has an effect?

    I think the space and look of page C make it easier to read than the other two, but I would stick to selling the report first and having the form underneath. I also think the headline on page C makes the purpose of the page clearer. What about keeping the same text as B but laying it out with the spacing from C?

    I think this is interesting, but I’m a bit put off by your scare-mongering to make money. I just hope you don’t make mistakes and send people to the wrong addresses, as has happened in this country (UK).

  8. bq. Does fact that one terrible design tested better than two other terrible designs make this a valuable study of web design (especially when each version used completely different verbiage)?

    This is _not_ about what factors make a good design. It is not about whether we, as authors, prefer one design to another.

    It _is_ about how we find which designs work best for any given site (remembering that different sites have different demographics, so require different approaches).

    The aim behind this article (as far as I can tell) was not to say “B is a wonderful design”, nor was it to promote vigilante-ism or to disgust readers by the choice of material. It was to show a *real example of A/B testing* and how the results might differ significantly from what we expect.

    We could now discard the A and C designs, and refine B into B1 and B2, and see which of those performs best, and use that as the basis to improve the design further. Don’t think that the game is over just because we have completed Level 1.

  9. I can’t see having 3 different page designs, active at different times of the year (during which the market can shift significantly), as yielding results worthy of legitimate note.

    We all want to analyze how visitors navigate our websites, and there are (somewhat limited) tools available to do so to an extent. But I feel that any site which doesn’t bombard the user with data, and instead keeps navigation and information minimal, will have visitors who might not always make a purchase, but when they do, may both do so more quickly and return more often, due to an ease-of-use factor.

  10. The article makes a good general point that design matters in drawing a user in/pushing them away, but comparing the three designs, as mentioned before, isn’t really fair.

    The other question I’d ask is whether it’s actually fair to compare the performance of the three when we don’t know what controls were put on the tests. The three designs presumably weren’t all live at once, so who’s to say one design wasn’t just subject to a slow day/week of sales?

    What other marketing was going on while each design was up on the site that may have affected results?

    The designs change too much and (barring more info from the author) seem to have too few checks over the environment in which they were competing to make any real conclusions on the merits of each.

  11. bq. The other question I’d ask is whether it’s actually fair to compare the performance of the three when we don’t know what controls were put on the tests. The three designs presumably weren’t all live at once, so who’s to say one design wasn’t just subject to a slow day/week of sales? What other marketing was going on while each design was up on the site that may have affected results?

    The usual way to carry out A/B testing *is* to have both versions live at the same time. Roughly half the users will be served one version of the page, and half the other. That _should_ mitigate the sort of other factors you mention.

  12. I think B would be even better if the layout were more like A. Needs more whitespace. Layout C is just unnecessarily complex. The process of deciding on a small purchase is sequential: Read what it is, find out how much, purchase. But two columns forces the user to juggle all three things as soon as the page loads. You’re asking to fill out a form to buy something before it’s even clear what you’re selling.

    However, given some of the (apparently) misinformed comments, I think all three designs have a more serious flaw: it’s not clear what the product is. First, let’s make it clear that I know nothing about this company except what’s in the screenshots, and I don’t care whether they exist or not. But if I understand the copy right (esp. the “Background” section seen on the first design), this “report” is a compilation of publicly available information that law enforcement agencies must provide to anyone who asks, in accordance with child protection laws in the US. Some states do a decent job presenting the information online by themselves. Others do a bad job, thus creating a market for third-party sites like this.

    Those may or may not be good or effective laws (take it up with your member of congress) but the result is certainly not some random “creepy guy” blacklist compiled by fly-by-night spammers as some have implied. Calling the author/designer unethical seems both uninformed and unfair.

  13. This article really missed the mark by not trying to analyze why B was slightly better. As others mentioned here, it probably had something to do with the testimonials being more available and the huge “Click Here” steering the user past the jumble of content. Though personally I think the “click here” is just a cop-out for a bad design with no clear path for the eye to follow. Sure, C is confusing, but so are the others; the first two are just better because at least there is a simple downward direction to follow.

    I agree with the article that real testing is great, but really… how often can you get a client to pay for that? This article would have served better if it had not only tested but analyzed the whys, for at least then we could walk away with a bit more understanding to use the next time a client won’t pay for design user testing.

  14. This is sound advice for ANY business in ANY industry. Live testing is the only way to know what really works, and everyone should be open to the idea that release #1 may not be (and probably isn’t) the most effective version.

    For the record, I used live testing in a phone card distribution business, and was able to come up with a protocol that allowed sales to increase linearly (to keep up with cash flow) in an almost unbounded fashion. The results after 18 months of this gameplan? 130% increase in sales, 400% jump in profits, 81% jump in per-unit profit.

    When I read something like this, it just serves to drive home a point that I was lucky enough to have firsthand experience with. I hope everyone who reads this sees the value here.

  15. I understand the point the author is trying to make, but the methodology is appalling and acts to counter the useful point that I think is lurking in the article.

    To make a test meaningful you can only alter one variable. How can you tell what caused the difference in sales if both copy and layout were changed? As an earlier commenter suggested, perhaps it was the copy that had the real effect, not the layout as the author implies.

  16. The major difference in the results came from copy changes, not design. The author even stated as much in the opening: “Version B follows the same basic layout, but we made some minor copy changes.”

  17. To my eyes all 3 options looked confusing… I didn’t know where to look or what the different areas did.
    Hire a professional web designer to actually design the site rather than tinkering with slight variations… THEN see the results.

  18. It’s very difficult sometimes to determine what kind of layout to use for a particular site.

  19. When I clicked the Marketing Experiments’ Journal link in our author’s profile it took me to a lovely white-background page with one sentence on it… “No website is configured for this URL.”

    Why is that happening…?

  20. The author’s condescending remark, “I know they won’t win any design awards. But they are functional and familiar”, betrays the offensive implication that designers simply decorate websites with pretty graphics, with no understanding of what they are doing.

    The article is not “provocative” because it suggests something radical or novel; it is provocative because of its crass arrogance and ignorance.

    Surely in 2006 we’re not still debating the value of design! The purpose of design (whether in print or on the web) is to aid the communication of ideas and information, and to create desire.

    I would be amazed if an IA, information designer, or graphic designer had been anywhere near the example pages that were shown. All three were terrible – it’s a wonder the site made any sales at all!

    I am all for live testing (under rigorous conditions)… but does the author of this article (which, I have to say, has no place on this illustrious site) seriously believe that as designers we are just poking around in the dark hoping for the best?!

  21. First: back to the basic purpose of the article: improving website design. Still pertinent and still needing to be examined… even in this lofty year of 2006! In fact, what seems to be missing from the comments on the comparison test is a realization of the difference in effect between A, B, and C.
    Too many of the comments are concerned with details. Without a doubt, the best version was B. Why? Structurally it was ergonomically easier on the eyes and did not “yank” the head down the way version A did, which, to a tired neck, is a real complaint. A further minor frustration is that the core of the information, which immediately carries the visual weight, seems to be sinking into the bottom of the screen.
    Test it yourself. Where are your eyes immediately drawn before you revert to searching the site? Right down at the bottom of the screen. Now you understand the effect of the neck being “yanked”.

    In version B, the bulk that the eye first seeks is better positioned, toward the upper left, and is more effortless to view overall.

  22. Forgive me if this was mentioned already, but I think B was most successful simply because of the “want names pictures and address” link near the top. Who wouldn’t avoid mucking through such a bad layout if they could click on a link and get what they really wanted? The rest of the design choices, in my opinion, are irrelevant.

