Data gravity defined and equated

This is part 2 of a series of blog posts exploring data gravity, a topic introduced in the most recent DXC TechTalk. Previous post: 1

So, what did I learn in this short hour listening to the June DXC TechTalk on cloud (rather than watching the latest from “MasterChef”)? I learnt what data gravity is. Techopedia defines it as follows:

Data gravity is an analogy of the nature of data and its ability to attract additional applications and services. The Law of Gravity states that the attraction between objects is directly proportional to their weight (or mass).

In other words, the more data you have, the more strongly that data attracts additional applications and services.

IMHO, we could take this one step further: the more data there is, the more it attracts not only additional applications and services, but also people, technology, money, risk, theories, ideas, innovation, processes and security. Simply put, the more data there is, the more likely it is to attract other things to it.

If we think about that in more detail, some of the world’s most successful companies right now have the largest data gravity in the world. Google, Facebook, LinkedIn and Instagram are all data companies; without data, their business models would not work. For them, data gravity is crucial, so they need to be very aware of its effects on their business.

Dan Hushon, DXC Technology’s CTO, kindly provided me with more material on data gravity, a phrase first developed in 2012, and shared a copy of the blog post he wrote that same year.

He also provided the DXC TechTalk audience with another great link: datagravity.org. Who knew there was an entire website dedicated to the term data gravity?

Dave McCrory defines data gravity in the figure below; more on his equation can be found at datagravity.org:

[Figure: Dave McCrory’s data gravity equation. Source: Dave McCrory]
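
The figure carries McCrory’s actual formulation (see datagravity.org for the details), so here is only an illustrative sketch of its general shape, using my own placeholder symbols rather than his published equation: data mass and application mass attract each other, bandwidth strengthens the pull, and latency weakens it.

\[
G_{\text{data}} \;\propto\; \frac{M_{\text{data}} \times M_{\text{app}} \times B}{L}
\]

Here M_data is the “mass” of the data, M_app the “mass” of the applications and services using it, B the available network bandwidth and L the network latency; all four are illustrative placeholders. The key point, which comes up again in the comments below, is that latency sits in the denominator, so higher latency means weaker data gravity.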

There are other factors that need to be considered in the data gravity equation in order for it to hold true in our environment. In my next post, I shall propose a new equation. Meanwhile, Dave McCrory tells me he is updating his equation, so watch datagravity.org.

Feel free to leave a comment below, or engage me in a conversation on Twitter at @GeoSuperGirl and use #DXCTechTalk. Let’s get the community conversation started around #DataGravity.


Sarah James was ANZ lead for Authentic Leadership in DXC and an advocate for DXC’s Women in Leadership and STEM. Prior to leaving DXC in September 2017, Sarah founded the Empowering Future Leaders blog and was its primary author. With over 15 years of experience in IT, Sarah specialises in spatial information, with integration work on projects as diverse as mapping volcanoes in Hawaii and delivering high-tech police vehicles.

RELATED LINKS

Data gravity: The things you never knew you never knew

Data gravity expanded and relied upon

Transforming to a digital enterprise

Comments

  1. I’m likely going to revamp DataGravity.org in several ways, including changing the equation dramatically. To speak to your other comments around Data Gravity, this isn’t actually the case. What you are describing is called Data Agglomeration; it has to do with the economics side of Data and the effects that having such an ecosystem brings, based on having Data as a resource.

    • geosupergirl says:

      Wonderful, thank you very much for sharing with us, Dave; I can’t wait to see the updated equation. Will read up further on data agglomeration.

    • Dave – have you made progress changing the equation? It seemed to me that it had a critical flaw – placing Latency in the denominator. This placement means that, holding everything else constant, an increase in Latency leads to a decrease in the Data Gravity quotient. That can’t be right.

      • Actually, an increase in latency would mean a decrease in Data Gravity. One of the factors of attraction is the potential to decrease latency and/or increase bandwidth. Otherwise, you could have infinite latency and no change in attraction. Infinite latency would mean the data is inaccessible! Does that make sense?

  2. We now live in a world where information is available to all of us. It is there to be used; it will depend on each of us how we want to handle it. I had to read the equation twice to be able to understand it, haha.

  3. Hi,
    It’s an interesting view on the data. It brings me to my question: if we are saying that accumulated data attracts more applications, usage and access… are we here talking about data accumulated anywhere in the organisation within disparate business applications, data warehouses and data marts, or are we defining data gravity assuming the data is collected from all sources and made available for consumption?

    Insight into this will be very helpful.

    Thanks

Trackbacks

  1. […] Data gravity defined and equated […]

  2. […] Data has gravity. Large volumes of data tend to accumulate in specific places and resist moving freely throughout the enterprise. For very large volumes, much of the data generated can become either unavailable to most or impractical to use for anything other than local decisions at the edge of the enterprise. […]
