You've probably heard a lot about Big Data in recent months, largely because the technology – in the shape of ultra-fast processors and data interfacing systems – has come of age, meaning that companies can now harness the power that Big Data brings to the table.

There has been, however, a lot of confusion about what Big Data is, and how it helps the average hard-pressed company professional.

At its most basic, Big Data is an umbrella term for any pro-active use of available data for the purposes of improving services and customer satisfaction.

In this context, there has been a major focus on data warehousing and data mining for better analytics on a company's customers, as well as their product or service consumption.

The underlying premise here is that the data required for analysis is available to the company concerned, and is in a format that is easily accessible. The data should also, of course, be reliable enough to support analytics.

But wait, there's more: Big Data generates information for what analysts call BI – short for Business Intelligence – which, unlike the raw materials used in manufacturing processes, can be used and re-used again and again.

BI is now a must-have feature of modern management. A 2011 IBM survey found that 83% of chief information officers now view BI as their top priority for enhancing competitiveness.

Until just a few years ago, businesses tended to limit, or even block, the data they supplied to people outside their day-to-day environment, preferring to keep information inside. But the arrival of Big Data – and the raw BI it generates – allows companies to do the reverse: share their inside data with customers and see what they do with it.

This is, in essence, how the more efficient businesses communicate with their customers on social networking sites and services such as Facebook and LinkedIn.

As a result, many organizations are finding that a high percentage of BI now resides outside the structured environment, meaning that businesses have to change the methodology by which they get data, which can pose significant technical challenges.

Assuming these technical challenges can be overcome, and the underlying Big Data supporting the organization's information resource is reliable, we can start to crunch the available information.

For most applications, historical data meets the reliability criterion, but there are technical limitations caused by the fact that a lot of data is being exchanged across networks at lightning speeds, with service lifetimes often reduced to the time it takes to download an app.

And here is where it gets interesting, as our observations suggest that the optimum level of customer satisfaction occurs at the "moments of truth" where the customer interacts with the service – a concept made famous by Jan Carlzon of Scandinavian Airlines.

When it comes to communication networks, these moments of truth occur in real time and at very high speed. Put simply, this means that, while a great deal of effort can be expended on analyzing – and understanding – a customer's service consumption history, the real measure of customer satisfaction is how well the service provider can satisfy customer needs at the moment of truth.

Let's think about what this means for the underlying IT system. While the concept of Big Data is relatively easy to understand, the term itself is likely to send shivers down the spine of the IT professional, for the simple reason that moving large volumes of data in real time means that one or more technology bottlenecks will be encountered.

These bottlenecks differ between organizations, but the central requirement is that real-time analysis of customer service usage be available to management, so that they can assemble the KPIs (Key Performance Indicators) on which modern business planning now thrives.

Questions that need to be answered include: Did the customer get the service they wanted, and was it provided satisfactorily? Were there any delays or resends? Were there any issues with congestion that prevented the customer from getting the service when they needed it, as fast as they needed it?

The only way to collect and analyze this information is to complete the process in real time as the moment of truth unfolds.
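To make this concrete, here is a minimal sketch of how such indicators might be accumulated as each interaction streams in. It is purely illustrative: the event schema and field names (latency, resends, a congestion flag) are assumptions for the example, not any particular vendor's API.

```python
from dataclasses import dataclass
from collections import defaultdict

# Hypothetical record of one customer/service interaction captured off the wire.
@dataclass
class UsageEvent:
    service: str        # e.g. "app_download"
    latency_ms: float   # how long the customer waited for the service
    resends: int        # retransmissions observed during the interaction
    congested: bool     # whether the network path showed congestion

# Running totals per service, updated as each moment of truth unfolds.
@dataclass
class ServiceKPIs:
    events: int = 0
    total_latency_ms: float = 0.0
    total_resends: int = 0
    congested_events: int = 0

    def observe(self, e: UsageEvent) -> None:
        self.events += 1
        self.total_latency_ms += e.latency_ms
        self.total_resends += e.resends
        self.congested_events += int(e.congested)

    @property
    def avg_latency_ms(self) -> float:
        return self.total_latency_ms / self.events if self.events else 0.0

kpis = defaultdict(ServiceKPIs)

def on_event(e: UsageEvent) -> None:
    """Fold one interaction into the per-service KPIs in real time."""
    kpis[e.service].observe(e)

# Example: two app downloads, one of them delayed by congestion.
on_event(UsageEvent("app_download", latency_ms=180.0, resends=0, congested=False))
on_event(UsageEvent("app_download", latency_ms=950.0, resends=2, congested=True))

k = kpis["app_download"]
print(f"avg latency {k.avg_latency_ms:.0f} ms, "
      f"resends {k.total_resends}, congested {k.congested_events}/{k.events}")
```

The design point is simply that each record is folded into running totals the instant it is captured, so the KPIs are always current rather than waiting for an overnight batch run.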

The bottom line here is that capturing this information on customer service usage and network performance is the crucial front-end to understanding if the service delivery is living up to expectations.

It's important to understand here that this information is not only useful for understanding the current situation, but can also be used to enhance the historical information that KPI projections are often based on.

By historical information, we mean data on which services customers are using, as well as when and for how long, allowing pro-active service providers to change their service offering to better suit customers' behavior.
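As a rough illustration of what that historical view might look like, the sketch below aggregates an invented usage log – the record format here is hypothetical – by service and hour of day, to show when each service is actually consumed:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical historical usage log: (service, start time, duration in seconds).
history = [
    ("video_stream", datetime(2013, 5, 6, 20, 15), 1800),
    ("video_stream", datetime(2013, 5, 6, 21, 5), 2400),
    ("app_download", datetime(2013, 5, 7, 8, 30), 45),
]

# Total seconds of use per (service, hour of day): a simple baseline
# showing when each service is actually consumed.
usage_by_hour = defaultdict(int)
for service, start, duration_s in history:
    usage_by_hour[(service, start.hour)] += duration_s

# Busiest hour for each service, as a hint for tailoring the offering.
busiest = {}
for (service, hour), total in usage_by_hour.items():
    if service not in busiest or total > usage_by_hour[(service, busiest[service])]:
        busiest[service] = hour

for service, hour in sorted(busiest.items()):
    print(f"{service}: heaviest use around {hour:02d}:00")
```

A provider could use baselines like these to, say, reposition a service that is only ever consumed at peak hours, or to sanity-check KPI projections against real consumption patterns.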

From a technology perspective, this is the back end we traditionally understand as supporting Big Data, but the essential front end is real-time data collection on those crucial "moments of truth", which – in the end – determine customer satisfaction.

Dan Joe Barry is vice president of marketing at Napatech in Andover, Mass.
