Translating digital body language


If you look at yourself in a “digital mirror,” chances are your image will be shockingly different from what you see in real life.

Whether you’re a digital native or a digital immigrant, you’re constantly establishing a footprint based on engagements with websites, digital devices and RFID feeds.

What makes this more complex is that there is often an offline human imprint driving much of our digital body language. And in most cases, there’s not a one-to-one correlation between the two. (As you look at cross-cultural aspects of digital body language, the implications become even more profound.)

So why do the subtleties of digital body language matter to enterprise technology professionals? Because, despite the prevalence of e-commerce and social media, customers continue to live in two worlds. The experiential aspects of the two are converging, not diverging.

The challenge that technologists and marketers share is finding the sweet spot between online and offline personas. For example, what will it take to leverage the physical body language of a customer shopping in a department store and engage that same person in a digital shopping experience? Conversely, is there a way to translate digital body language in a physical environment?

Whether we know it or not, we’re now constantly being monitored by digital beacons. Retail establishments use them to track customer movements. On trade show floors, heat-map technology tracks where attendees cluster and what is drawing them to that location. This data now forms a mosaic of potential instantaneous jump-off points for online engagement.

For example, Allrecipes, the digital food brand, is installing beacons in 58 supermarkets in Ohio. These are designed to trigger the Allrecipes Dinner Spinner app. Shoppers receive customized meal and recipe recommendations based on their physical movements. Thus, physical body language triggers online engagement, which in turn triggers digital body language related to food and cooking preferences that can be directed back to engagement in the aisle.

While not as sexy as beacon technology, most digital body language is related to patterns of online engagement. In retail, for instance, online browsing patterns point to the propensity to buy a good or service. Unfortunately, this engagement “picture” can often resemble a Jackson Pollock painting, unless there is a predetermined strategy for translating body language into insight.

Many marketers or user experience experts operate under the illusion that the way a person navigates a website corresponds to “body language.” This is only half true. The other, more important half (typically overlooked) is the content that drives online body language.

This really isn’t revolutionary. After all, content drives human body language and interaction in face-to-face situations. If a speaker is interesting, the audience leans forward. If the content is boring, they yawn. If it’s intimidating, they move back.

In the online world, the process requires rigorous testing, including a content-mapping exercise that makes meaning of digital body language. For example, recommending follow-up content that successfully results in a series of “leaning forward” engagements can produce valuable insights into propensity to buy.

On the other hand, watching users push back on certain types of content is not necessarily a bad thing. It allows businesses to determine affinity for certain kinds of content. Think about how Pandora and Spotify use a “thumbs down” to help target music delivery to the listener.

Whether it’s hitting the delete key or watching a 45-minute YouTube video, every online interaction has meaningful language, as long as you can understand and track the emotional and business indices that stimulated it.
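To make the idea concrete, here is a minimal sketch of how interaction events might be translated into an engagement score. The event names, weights, and threshold are all hypothetical placeholders, not taken from any specific product or analytics platform; a real system would learn these from data.

```python
# Hypothetical signal weights: positive values approximate
# "leaning forward," negative values approximate "pushing back."
SIGNAL_WEIGHTS = {
    "watched_long_video": 3.0,
    "clicked_recommendation": 2.0,
    "thumbs_up": 1.5,
    "page_view": 0.5,
    "deleted_email": -1.0,
    "thumbs_down": -1.5,  # still useful: it reveals content affinity
}

def engagement_score(events):
    """Sum the weighted signals in a user's interaction log."""
    return sum(SIGNAL_WEIGHTS.get(event, 0.0) for event in events)

def leaning_forward(events, threshold=2.0):
    """Crude propensity flag: did the session trend positive?"""
    return engagement_score(events) >= threshold

session = ["page_view", "clicked_recommendation", "watched_long_video"]
print(engagement_score(session))  # 5.5
print(leaning_forward(session))   # True
```

Even a toy scheme like this illustrates the point of the paragraph above: a “thumbs down” or a deleted email is not noise to be discarded but a negative-weight signal that sharpens the picture of affinity.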

