In the digital world, where business results matter more than ever, user experience is one of the key factors of success.
Today is a day I’ve been looking forward to for 3 years. I am going to present to you the latest innovation of Netway: the ability to travel into the mind of users in order to gain a better understanding of what’s going on.
I want to illustrate this by sharing part of the analysis we did on the Facebook site for one of our customers.
But, before getting into the subject, allow me to explain why it is necessary to understand what goes on in a user’s brain when you design a screen.
The answer is quite simple: users are exposed to more and more competing screens, and it becomes ever more difficult to differentiate your screen from the competitors’ screens, …
So, let’s go!
The brain works in sequences of activities that last milliseconds.
Thanks to eye-tracking, we can see where the brain decides to go to collect information, because we can observe the eyes. However, it is impossible to know whether the person is already familiar with the content on the screen, whether the screen incites users to click, or whether the user has memorised the screen, …
Now, all these problems belong to the past! Imagine looking through your customers’ eyes, and now imagine travelling inside their brain.
In plain English: we can see which zones of the brain are activated when a user is performing a task.
If we can identify the activated zones of the brain, we also know the answer to the following questions:
- Does the screen have too many elements?
- Which parts of the screen are analysed the most by the brain?
- Do users recognize the visuals used?
- Do the call-to-action elements incite action?
- Do users understand the content?
And if we do that, we can objectively measure the user experience.
Let’s get back to our Facebook example…
I will deliberately make my explanation a bit more accessible so the greatest possible number of our community members can benefit from it.
Let’s go for it …
In the case of Facebook we see the right visual cortex has a higher level of activation. Since each hemisphere processes the opposite half of the visual field, this indicates the visual elements on the left side of the interface generate more brain activity than the right-side elements.
But what does this mean? Let’s take a closer look at the activation flux in the visual cortex.
We see the visual cortex is not very activated along the pathway from zone V1 to the Brodmann 7 zone (in blue). This pathway handles visual attention: the position in space, orientation and size of the graphical objects. Low activation here means the site has a visual organisation that requires little attentional effort.
But if we look at the pathway that goes from V1 to V4 (in orange), we see it has a higher level of activation. This means users non-consciously identify and recognize the visual elements on the screen.
The fusiform gyrus is the zone that will make us recognize faces and well-known things.
We can conclude Facebook has an easily understandable and efficient visual organisation. People recognize the visual elements of the screen, and in particular the faces.
We now have interesting data on the visual elements of the Facebook homepage. However, during a visit, many cerebral systems will be active in parallel. These activities constitute sequences of milliseconds and involve:
• the visual system
• the semantic system
• the motor system
Let’s now analyse the content understanding of the site.
The Brodmann 44 zone is involved in retrieving information from our semantic memory. When a surfer looks at the elements on the screen, this system activates a network of knowledge about a given word or object.
The information retrieved from long-term memory during a Facebook site visit activates the semantic network. People know what they see, and that activates a set of linked information (I know this person, it is a friend of…, …).
We see the Brodmann 45 zone is not activated. If this had been the case, it would have meant the recovered information didn’t activate strong associations. That would mean the content is not very well known or not very often used by our brain.
In short, the content on Facebook is simple and does not require a considerable cognitive effort.
Let’s now check whether the call-to-action elements generate a lot of reaction.
There is a special zone in our brain, the Brodmann 6 zone, which is activated when a surfer thinks about clicking on something. This zone of the premotor cortex plans the movement of the hand and the fingers (before actually moving). By analysing this zone we know whether a call-to-action makes people want to click on something before they actually do so.
In the case of Facebook, the interface gives moderate results.
Facebook has a simple visual organisation; users immediately recognize the graphical elements; the cognitive efforts are low and surfers understand the content.
We have analysed many more zones, such as the hippocampus, which allows us to know whether a screen has been recognized, or the reward system, which allows us to check whether people are happy, and so on.
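To summarise the zones discussed so far, here is a minimal sketch in Python of how one could map each brain region to the UX question it helps answer. All names, thresholds and activation values are illustrative assumptions, not Netway’s actual tooling or data:

```python
# Illustrative mapping of the brain regions discussed above to the UX
# question each one helps answer. Region labels, the 0-1 activation
# scale and the threshold are hypothetical.

REGION_TO_UX_QUESTION = {
    "V1-BA7 pathway": "How much visual attention does the layout demand?",
    "V1-V4 pathway": "Are the visual elements identified and recognized?",
    "fusiform gyrus": "Are faces and familiar objects recognized?",
    "BA44": "Is content retrieved easily from semantic memory?",
    "BA45": "Does the content lack strong associations (unfamiliar)?",
    "BA6 (premotor cortex)": "Do call-to-action elements make users want to click?",
    "hippocampus": "Has the screen been seen and memorised before?",
    "reward system": "Does the experience make users happy?",
}

def interpret(activations: dict, threshold: float = 0.5) -> list:
    """Return the UX questions flagged by regions whose (hypothetical,
    normalised 0-1) activation level exceeds the threshold."""
    return [REGION_TO_UX_QUESTION[region]
            for region, level in activations.items()
            if region in REGION_TO_UX_QUESTION and level > threshold]

# Example with made-up activation levels:
print(interpret({"V1-V4 pathway": 0.8, "BA45": 0.1}))
# → ['Are the visual elements identified and recognized?']
```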
So how do you put this information into practice?
In our case:
- We build screens based on the required behaviour.
- Once the screens have been designed, we check whether they activate the target zones, at the activation levels determined beforehand in the form of a hypothesis.
- We cross-reference these data with the eye-tracking data that have been gathered, in order to obtain a view that is as objective and reliable as possible.
- If needed (read: in 99% of the cases), we correct the screen in order to generate the expected behaviour in more than 80% of the cases.
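The four-step loop above can be sketched as follows. This is a toy illustration only: the function names, data shapes and tolerance are assumptions, not Netway’s actual process or software:

```python
# Toy sketch of the design / measure / cross-check / correct loop
# described above. All data structures and thresholds are assumptions.

def meets_hypothesis(measured: dict, expected: dict,
                     tolerance: float = 0.2) -> bool:
    """Check that each targeted zone's measured activation is within
    `tolerance` of the level hypothesised at design time."""
    return all(abs(measured.get(zone, 0.0) - level) <= tolerance
               for zone, level in expected.items())

def iterate_design(expected_zones, measure_screen, correct_screen,
                   screen, target_rate=0.8, max_rounds=5):
    """Measure the screen, then correct and re-measure until it both
    matches the activation hypothesis and produces the expected
    behaviour in more than `target_rate` of sessions (or we give up)."""
    for _ in range(max_rounds):
        activations, behaviour_rate = measure_screen(screen)
        if meets_hypothesis(activations, expected_zones) and behaviour_rate > target_rate:
            return screen  # hypothesis confirmed, behaviour target met
        screen = correct_screen(screen, activations)
    return screen
```

The 80% figure from the last step appears here as `target_rate`; in practice the measurement step would combine the brain-activation data with the eye-tracking data, as the article describes.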