3 Case Studies that make the UX argument

March 26th, 2014 by Bruno Lucas


We often get asked for case studies of Website Funnel Optimisation.

A funnel is a way of visualising how users move through the steps of a website and where they drop off. As this data is often a competitive advantage, companies are not keen on publishing it.
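To make the funnel idea concrete, here is a minimal sketch (in TypeScript, with made-up step names and visitor counts) of the calculation behind a funnel: how many visitors reach each step, the step-to-step drop-off, and the overall conversion rate.

```typescript
// Hypothetical checkout funnel: step names and counts are illustrative only.
interface FunnelStep {
  name: string;
  visitors: number;
}

const funnel: FunnelStep[] = [
  { name: "Product page", visitors: 10000 },
  { name: "Basket", visitors: 4200 },
  { name: "Registration form", visitors: 1300 },
  { name: "Payment", visitors: 900 },
  { name: "Confirmation", visitors: 850 },
];

// Step-to-step conversion and drop-off
for (let i = 1; i < funnel.length; i++) {
  const prev = funnel[i - 1];
  const curr = funnel[i];
  const rate = (curr.visitors / prev.visitors) * 100;
  console.log(
    `${prev.name} -> ${curr.name}: ${rate.toFixed(1)}% continue, ` +
      `${(100 - rate).toFixed(1)}% drop off`
  );
}

// Overall conversion from the first step to the last
const overall = (funnel[funnel.length - 1].visitors / funnel[0].visitors) * 100;
console.log(`Overall conversion: ${overall.toFixed(1)}%`);
```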

As can be seen in these studies, by simplifying and reducing the steps, these websites were able to massively increase their conversion rates (and sales). These are the case studies that we are aware of. If you know of any more, please add them in the comments.

HSBC case study

This is an interesting case study on how HSBC Hong Kong went from 8 to 10 inquiries per month to 800 per month by cutting the number of form fields from 14 down to 3. See pages 28 to 30.

Source:
Bank 2.0: How Customer Behaviour and Technology Will Change the Future of Financial Services

The 300 million dollar button

The “Three Hundred Million Dollar Button” is a great example of a form that was super simple (it asked only for an email address and a password), yet customers still did not buy. Usability testing helped identify why.

The form was meant to help repeat customers complete a purchase on the site. First-time users were very resistant to registering on the site, as they felt they would be giving away information to be used for marketing purposes. Even repeat users, who had previously registered, had problems recalling their login information and either used the password recovery link or resorted to multiple registrations.

And the solution?

They took away the Register button. In its place, they put a Continue button with a simple message: “You do not need to create an account to make purchases on our site. Simply click Continue to proceed to checkout. To make your future purchases even faster, you can create an account during checkout.”

The results: the number of customers purchasing went up by 45%. The extra purchases brought in an extra $15 million in the first month. Over the first year, the site saw an additional $300,000,000.

Source: https://www.uie.com/articles/three_hund_million_button/

Expedia increased revenue by $12m

In a similar case, Expedia used analytics to find out why their customers were not completing a purchase, even after clicking the buy button and entering their billing information.

By analysing what was going wrong, they worked out that one data field – “Company” – was confusing customers and causing transactions to fail. Expedia simply deleted the confusing field and saw an immediate jump in successful bookings.

Source: http://www.lexology.com/library/detail.aspx?g=f23222dc-cc63-4ffb-bc7a-775bbf1303c4 and here http://www.singlehop.com/blog/infographic-tips-for-increasing-contact-form-conversions/

These small details matter, and they can range from the choice of words to the way buttons are laid out. With a solid research methodology it is easier to pinpoint where these changes need to happen. With remote usability tests and surveys we are able to get the viewpoint of the user.

In the past we have used our remote usability testing tool with Travelport for their new product launch. The project involved travel agents in 5 different countries and helped provide insights into the usability and user experience of the new product.

Photo credits: Je.T – Flickr.

Events at Webnographer in Lisbon

January 16th, 2014 by James


Going forward, nearly every week there will be a meetup at Webnographer’s office in Lisbon. Ideas are created and spread through people meeting. Today we are lending our office for a JavaScript event called require(lx); next week we have the restarted Lisbon UX Cocktail Hour; and the week after, a Hacker News meetup on technology and entrepreneurship. We also have plans for an event on Growth Hacking (technology meets marketing).

London, San Francisco, Paris, Boston and Berlin have hundreds of meetups happening every week. Not everybody knows everybody, and a themed event can help gel connections. They also act as a port of call for visitors to a city.

 

Today, January 16th: require(lx)

A talk on the JavaScript framework Hapi.js by core developers of the software, Wyatt Lyon Preul and Ben Acker. RSVP here.

 

Thursday, January 23rd: Lisbon UX Cocktail Hour.

Google’s Tomer Sharon talking on start-ups and UX research. RSVP here.

 

Wednesday, January 29th: HNLisbon.

Talks on technology and entrepreneurship to be announced. Join the group here.

 

Growth Hacking

[Exact date still to be confirmed]

Ricardo Nunes, Head of Social Media at Mindshare Lisbon, on Adaptive Marketing.

 

I have to thank João Pinto Jerónimo, Bruno Barreto, and David Dias, the organisers of require(lx), for making it so easy.

 

Some pictures from previous events we organised…

 

Why Santa uses Webnographer

December 17th, 2013 by James

Santa using Webnographer

Santa’s challenge is that he wants to deliver the right present to the right person, but he is constrained by being far, far away from his customers. What the children in Lapland want is very different from what the children in Brazil or Ireland want. Only by using Remote Usability Testing can he reach the far corners of the world. Santa’s challenges are similar to those of most website Product Managers.

His customers have a broad variety of needs. Some children want the latest video game, others the latest Lego kit, and some want that patchwork doll. So Santa needs to sample thousands of children to be able to find the requirements of millions of children. Only by using a service like Webnographer can you easily research hundreds of users instead of ten.

Up to the last moment, new toys are being released and launch dates are being changed. Only by using a service like Webnographer can the testing be moved forward or back easily. There is no participant recruitment cost to reschedule.

Santa tried A/B testing last year, but then 50% of the children got the wrong present. He has realised that analytics is like accounting: it only tells you after the fact. With Webnographer you can test before.

To add Webnographer to your New Year’s wish list contact Sabrina or James at Webnographer.

Illustration by Bruno Miguel Fernandes Maltez

Discovering WHY from numbers

November 7th, 2013 by Sabrina

A couple of weeks ago James and I gave a talk at the UXPA in London. The theme of the October event was “The power of quantitative data”. The title of our talk was “Discovering WHY from numbers”.

Conventional UX thinking is that, in a UX test, numbers can only tell you what happened and cannot explain the ‘why’ of a UX issue. In our talk, James and I showed how one can discover ‘why’ an issue is happening, and how one can go beyond just knowing simple completion rates.

 

@ElviaVasc created a great sketchnote of our talk, which you can see below. And there are many great pictures of the event on the UXPA Pinterest board.

23 Remote Usability Methods

October 31st, 2013 by James

It is easy to think that there are only one or two remote usability methods out there: moderated and un-moderated. But there are many more methods than just a couple.

Last year, before the Denver Lean UX conference, Sabrina and I came up with a list of the different remote research methods. In total we came up with 23 different methods. Some of them we use at Webnographer, and others we know that other people use. I am sure we are missing one or two; please post any we are missing in the comments.

This post is the first of a series, giving a top-level overview of all the online user research methods. Over the next few weeks, we will be posting detailed articles on some of the more popular methods, explaining in detail how and when to use each one.

The 23 methods fall into 2 main categories: synchronous, where the moderator or researcher interacts with the respondent; and asynchronous, where the researcher does not interact with the respondent directly.

The methods listed with a (w) are ones that we either use or have used at Webnographer.

 

Synchronous remote methods

 

Moderated Test

What

The participant and the researcher meet via web/video conferencing. The participant is asked to think aloud while the researcher notes his/her observations.

Why

Provides a remote version of a lab test and enables a researcher to test users anywhere in the world.

 

Online Ethnography

What

This method researches communities facilitated through online spaces such as YouTube, Twitter, or Pinterest. The research is carried out by participating in those online spaces.

Why

Online Ethnography helps increase empathy and reduce context collapse. By partaking in the experience the designer understands the participant’s perspective.

 

Telephone Survey

What

Participants are called and asked questions on the phone.

Why

A quick way of gathering responses from users. It can be useful for getting feedback from users who will not take part in other forms of survey.

 

Remote Eye Tracking

What

This method provides information about where the participant is looking on the page. It identifies what attracts their attention and what passes unseen.

Why

It shows which areas of the webpage users look at first and which areas they ignore.

 

Asynchronous remote methods

 

True Intent

What

Users are intercepted when they arrive on the website. They are asked the reason for their visit before being redirected back to the site to complete the task using a remote usability testing tool. While they complete the task, their interaction with the site is recorded.

Why

Records real user behaviour and identifies their real goals. Identifies why people are visiting and whether they succeed or not. Helps to prioritise and quantify issues.
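As a rough illustration of the intercept half of this method, here is a browser-side sketch in TypeScript. The sampling rate and the /api/intent endpoint are assumptions made for the example; the recording of the subsequent interaction would come from the remote usability testing tool itself.

```typescript
// Minimal true-intent intercept sketch. /api/intent is a hypothetical endpoint
// that stores the stated reason for the visit.
const INTERCEPT_RATE = 0.1; // ask roughly 1 in 10 arriving visitors

function interceptVisitor(): void {
  if (sessionStorage.getItem("intentAsked") || Math.random() > INTERCEPT_RATE) {
    return; // already asked this session, or not sampled
  }
  sessionStorage.setItem("intentAsked", "yes");

  const reason = window.prompt(
    "Before you start: what is the main reason for your visit today?"
  );
  if (reason) {
    // Fire and forget; the visitor then carries on with their real task,
    // which a remote testing tool would record separately.
    void fetch("/api/intent", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ reason, url: location.href, ts: Date.now() }),
    });
  }
}

interceptVisitor();
```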

 

Set Task

What

Participants are set a task on a website to find a piece of information. While they navigate through the site their interactions are recorded with a remote usability testing tool.

Why

Evaluates if users can complete a task successfully, how they navigate the site, and the amount of time spent. It provides insights into where and why customers are having problems.

 

Race Test

What

Participants are asked to perform a task on the website within a given time.

Why

Tests how people will perform under stress. Often behaviour is different in stressful situations.

 

Click to Finish Test

What

Users are asked to complete a task on a single page. Once participants click on the page, the task is finished. The test can be done with static mock-ups or real sites.

Why

Provides a fast way of gathering early feedback. This method evaluates whether users can find a piece of information within a page. Metrics include success ratings, time on task, and satisfaction ratings.

 

5 Second Test

What

Participants are shown a webpage for five seconds. They are then asked to write down everything they remember about the page.

Why

Shows what users remember about the webpage, and evaluates what items are the most prominent. This method uses recall memory.
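The timing logic is simple enough to sketch. The example below assumes a test page with a #stimulus element (a screenshot of the page under test) and a #recall element containing the free-recall question; both element ids are made up for the illustration.

```typescript
// Minimal 5 second test sketch: show the stimulus briefly, then ask for recall.
const EXPOSURE_MS = 5000;

function runFiveSecondTest(): void {
  const stimulus = document.getElementById("stimulus");
  const recall = document.getElementById("recall");
  if (!stimulus || !recall) return;

  recall.hidden = true;    // hide the recall question to start with
  stimulus.hidden = false; // show the page being tested

  setTimeout(() => {
    stimulus.hidden = true; // hide the page after five seconds
    recall.hidden = false;  // then ask what the participant remembers
  }, EXPOSURE_MS);
}

runFiveSecondTest();
```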

 

Recognition Test

What

Participants are shown a webpage. The page then “disappears” and participants are shown a list of items. The participants must identify which items were on the webpage and which items were not.

Why

Shows what users remember about the webpage, and evaluates what items are the most prominent. This method uses recognition memory.

 

Critical Incident Report

What

Participants are asked to give feedback via email, postal mail, or a web form about their experience after visiting a site.

Why

A fast and relatively inexpensive method of collecting data about users’ experiences.

 

A vs B

What

This method is used to test alternative designs or versions for the same webpage. It can use a web stats package or a remote usability testing tool like Webnographer.

Why

With a large enough sample size, this method identifies which design route works best.
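For illustration, here is a minimal sketch of the assignment step only: each visitor is randomly placed into variant A or B, and the choice is persisted in a cookie so the same visitor keeps seeing the same design. The cookie name and the data attribute are assumptions, not any particular tool's API; collecting and analysing the results is the separate part of the method.

```typescript
// Minimal A/B assignment sketch with a stable, cookie-based allocation.
type Variant = "A" | "B";

function getVariant(): Variant {
  const match = document.cookie.match(/(?:^|; )abVariant=([AB])/);
  if (match) {
    return match[1] as Variant; // returning visitor keeps the same design
  }
  const variant: Variant = Math.random() < 0.5 ? "A" : "B";
  document.cookie = `abVariant=${variant}; path=/; max-age=${60 * 60 * 24 * 30}`;
  return variant;
}

const variant = getVariant();
document.body.dataset.variant = variant; // CSS or templates can switch on this
```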

 

Web Analytics

What

Web analytics collects your website’s visitor log data and allows the analysis and reporting of that data for the purposes of understanding and optimising your website.

Why

Discover where your visitors are going, what pages they are visiting, and how long they are staying. This method does not identify why users are visiting.

 

Open Card Sort

What

Participants are given a set of cards. They need to come up with the appropriate categories and group similar cards.

Why

A card sort can help navigation design and information architecture. This can also be an effective way of determining which labels and wording works best.

 

Closed Card Sort

What

Participants are given a set of cards and categories. They need to group similar cards under each appropriate category.

Why

A card sort can help navigation design and information architecture. This can also be an effective way of determining which labels and wording works best.

 

Click Sort

What

Participants are given a set of cards and categories. They need to click on the appropriate category for each card.

Why

This is similar to a card sort but, due to its faster interaction, users can complete far more items than in a card sort.

 

Tree Test

What

Participants are asked to perform a task. They are shown a list of items, such as links, and they need to select the one item they think is the most effective to complete the task.

Why

A tree test is useful for testing menu structures.

 

Diary Study

What

Participants are asked to keep a log/diary of their activities over a period of time. The study can have a focus/theme. In this case, participants need to track everything related to a given topic.

Why

Helps to understand how users’ experience changes over time. It can help with capturing the broader experience, especially with a service that has multiple touch points.

 

Participant Review

What

Participants are sent a link to a website and give their design suggestions via instant messaging.

Why

Co-design goes remote. This allows people with diverse backgrounds to contribute to the design.

 

Video and Send

What

Users film their experience and then email the video back to the researchers.

Why

Video reduces context collapse. It makes the designer feel closer to the research participant.

 

Web Survey

What

Participants respond to a survey online. They can either be intercepted on a website or emailed an invite to take part in the research.

Why

It is one of the most cost-effective ways of gathering a large selection of user feedback.

 

Longitudinal Survey

What

The study is conducted over a period of time with the same participants. The aim is to observe the changes that occur between the different sessions.

Why

It is useful for seeing how users’ experience changes over time.

 

Delayed Survey

What

Users are intercepted by a pop-up window when they arrive on the website. They are asked for their email address and about their goal. The following day, the user is emailed a follow-up survey asking whether their visit was successful.

Why

A delayed survey captures the whole customer experience.
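The follow-up step can be sketched in a few lines. The example below assumes the on-site pop-up has already stored each visitor's email address and stated goal; sendFollowUpEmail is a placeholder for whichever mail or survey service is actually used.

```typescript
// Minimal sketch of the next-day follow-up in a delayed survey.
interface Intercept {
  email: string;
  goal: string;
  interceptedAt: number; // ms since epoch, set by the on-site pop-up
  followedUp: boolean;
}

const intercepts: Intercept[] = []; // filled in by the intercept pop-up
const ONE_DAY_MS = 24 * 60 * 60 * 1000;

async function sendFollowUpEmail(email: string, goal: string): Promise<void> {
  // Placeholder: swap in a real mail or survey service here.
  console.log(`Emailing ${email}: "Yesterday you wanted to ${goal}. Did you succeed?"`);
}

// Run periodically (e.g. hourly) to email everyone intercepted at least a day ago.
async function sendDueFollowUps(now = Date.now()): Promise<void> {
  for (const item of intercepts) {
    if (!item.followedUp && now - item.interceptedAt >= ONE_DAY_MS) {
      await sendFollowUpEmail(item.email, item.goal);
      item.followedUp = true;
    }
  }
}
```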

 

Exit Survey

What

Real users are intercepted by a pop-up window and asked if they want to take part in the research; if yes, a new window opens as a pop-under. When they end the session and close the browser, the survey window reappears and they are asked a couple of questions.

Why

Captures users’ experience after their interaction with the site.

 

This post is the first of a series, giving a top-level overview of all the online user research methods. Over the next few weeks, we will be posting a detailed article on each method, explaining in detail how and when to use it.

 

Feel free to post in the comments if you feel that we are missing a remote research method.