There’s value in organisations being able to analyse social media information and compile profiles to better target their customers. But creating, documenting, and retrieving vast amounts of data is one thing. Understanding it is an entirely different matter.
Context is Key
Measuring ‘likes’ or searching for keywords and phrases is pretty straightforward – a basic form of sentiment analysis. You might be tempted to derive a marketing strategy directly from this.
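As a minimal sketch, keyword-based sentiment scoring can be as simple as counting positive and negative words. The word lists and scoring rule below are illustrative assumptions, not a production lexicon:

```python
# Naive keyword-based sentiment scoring: count positive vs negative
# words in a post. The word lists here are illustrative assumptions.
POSITIVE = {"love", "great", "excellent", "happy", "recommend"}
NEGATIVE = {"hate", "awful", "terrible", "disappointed", "refund"}

def sentiment_score(text: str) -> int:
    """Return positive minus negative keyword hits; >0 suggests a positive post."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(sentiment_score("I love this brand great service"))   # 2
print(sentiment_score("Terrible product, I want a refund"))  # -2
```

This also illustrates the limitation discussed here: a keyword counter has no notion of context, so sarcasm, impulse buys, or images and video are invisible to it.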
But there’ll always be examples of impulse buys, or snap decisions in the heat of the moment. And data samples may include information that’s not so easy to quantify – like pictures or videos.
In fact, the majority of actions will be based on the context surrounding them. Brand A might cost less, but B offers greater satisfaction. The sports car looks great, but what about the kids? And so on.
If a data analysis tool can’t provide further context around the solutions it offers, it’s, at best, an expensive waste of time.
Investments in data analytics can be useless – even harmful – unless employees can incorporate that data into complex decisions. Meeting this challenge requires an understanding of human behaviour which is often lacking – and not only in IT departments.
On a par with business intelligence is the need for operational intelligence: the ability to see and know everything that’s happening in your IT environment, at any moment. A tall order, considering the levels of scale and complexity involved.
But armed with this knowledge, IT teams can better collaborate, fix problems, and provide support for product launches, application rollouts, migrations, upgrades, and other initiatives.
Operational intelligence may be obtained from four primary data sources:
1. Machine data: such as log files, SNMP and WMI. Data from sensors (e.g. on wearable devices) also applies.
2. Code-level instrumentation: which traditional application performance management (APM) is based on.
3. Service checks: which provide insights on whether applications are up or down, and how well they’re performing.
4. Wire data: the data-in-motion, describing all communications between systems.
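To make the first source concrete, here is a minimal sketch of turning raw machine data – a log line – into a structured record. The log format and field names are assumptions for illustration, not a standard parser:

```python
import re

# Illustrative timestamped log-line format (an assumption, not a standard).
LOG_PATTERN = re.compile(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) "
    r"(?P<level>[A-Z]+) "
    r"(?P<message>.*)"
)

def parse_log_line(line):
    """Turn one raw log line into a structured record, or None if unparseable."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

record = parse_log_line("2024-05-01 12:30:05 ERROR disk quota exceeded on /dev/sda1")
print(record)
# {'timestamp': '2024-05-01 12:30:05', 'level': 'ERROR',
#  'message': 'disk quota exceeded on /dev/sda1'}
```

Records in this shape can then be indexed, searched, and correlated alongside the other three sources.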
Of these, wire data has the greatest potential for transforming operational intelligence.
Wire data is the record of everything that’s happening in IT, in real time. It provides an in-depth view into the performance, availability, and security of your environment – including issues you might otherwise be unaware of.
Wire data is unstructured and high-velocity – typically flowing at 10Gbps in data centres, and faster still in cloud environments. Powerful packet-processing capabilities are required just to keep up.
But, with the right tools in place, it can assist with:
1. Detecting Application and Infrastructure Performance Issues: Based on communications over the wire.
2. Big Data Analysis: You can extract specific pieces of wire data and feed them into analysis platforms such as MongoDB or Splunk.
3. Spotting Data Theft: You can easily identify when data is being stolen from your back-end databases – a particularly vulnerable place. Using wire data, you can spot when queries are being made by unknown or untrusted sources.
4. Parsing Data: Using Big Data analysis tools, you can mine that data for business intelligence purposes.
5. Generating Meaningful Reports: These enable you to analyse what’s happening with the data you collect.
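The data-theft use case above can be sketched in a few lines: flag database queries observed on the wire whose source isn’t on an allow-list. The record shapes, field names, and IP addresses below are assumptions for illustration:

```python
# Sketch: flag queries seen on the wire from sources outside an allow-list.
# Record shape and addresses are illustrative assumptions.
TRUSTED_SOURCES = {"10.0.0.5", "10.0.0.6"}  # known application servers

def flag_untrusted_queries(wire_records):
    """Return SQL query records whose client IP is not in the trusted set."""
    return [
        r for r in wire_records
        if r["protocol"] == "sql" and r["client_ip"] not in TRUSTED_SOURCES
    ]

observed = [
    {"client_ip": "10.0.0.5",   "protocol": "sql", "query": "SELECT * FROM orders"},
    {"client_ip": "203.0.113.9", "protocol": "sql", "query": "SELECT * FROM customers"},
]
for alert in flag_untrusted_queries(observed):
    print("ALERT: query from untrusted source", alert["client_ip"])
```

Because wire data records every communication between systems, this kind of check catches queries that never appear in application logs.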
Visualisation helps put data into context and bring business cases to life, through the creation of visual models that represent what’s happening to and with your data.
Most organisations are now moving toward models with dashboards of information that let you zoom in or out. These help you understand what happened – or didn’t happen – based on the actions you took: a hindsight analysis.
To look into the future, visualisation models need to be more dynamic.
In the Real World…
Let’s take utilities (power, gas supply, etc.) as an example. Most have archaic records and inaccurate information, with no idea where all of their underground assets are located. That makes it hard for them to deal with service interruptions that can occur when a power line is accidentally cut, or a water main bursts.
In the USA, the Las Vegas city government has taken advantage of smart data to develop a living model of its utilities network. VTN Consulting helped the city aggregate data from various sources into a single real-time 3D model using Autodesk technology. The model is being used to visualise the location and performance of critical assets above and below ground.
Companies in the health sector are reporting impressive results as they use Big Data analytics to increase efficiency, improve patient outcomes and provide greater personal care. These efforts are largely fuelled by the US government’s push for more meaningful use of electronic health record (EHR) systems.
How to Make Data Meaningful?
Work backwards, and ask a few fundamental questions:
· What business processes or decisions do you want to improve? (Make sure to get management involved at this level.)
· How will these decisions improve the business?
· What are you trying to maximise?
· What are the most meaningful elements used to measure progress toward those goals?
· What types of analysis do you need to perform to expose the data, explore “what if” scenarios and work through alternatives to optimise your operations?
· What types of data do you need to collect in order to feed the above analysis and decision-making?