
How Big Data is Changing the Business World

By Kerry Butters

 It’s very easy to assume that any new term that emerges in the tech world will just be a buzzword, with no real significance to anything of use. For the past couple of years the term ‘Big Data’ has often been labelled as such, but with plenty of corporate enterprises now on board with big data analytics, it’s time for those two words to shed any connotations of inflated bombast and for the rest of us to start taking note. Big data is changing the world of business, and isn’t going anywhere anytime soon.

Indeed, big data is getting bigger. That is, of course, its nature, but the point is that the value of the information buried within it is becoming increasingly accessible to the business world. It is changing the shape of internet marketing on a global scale, and below are the key areas in which big data is already making an impact on our lives today (even if we’re not yet fully aware of it).

 

Expanding Customer Intelligence

The internet can be a faceless entity, a factor that is perhaps most noticeable in the world of ecommerce. Customer relations can all but disappear in a company with thousands upon thousands (even millions) of online shoppers, and indeed, picking out exactly who the most valuable customers are in such a vast congregation has been an almost impossible task.

But big data is changing all that. How? Well, big data doesn’t just look at one source of data in analysing customer trends and preferences, as has been the case in the past. Instead it utilises all the information that can possibly be gathered from a company’s social media data, browser logs, sensor data and text analytics to get a fuller picture of its customers – the primary purpose being to produce predictive models.

The sheer wealth of data being collated and analysed from innumerable sources is now being exploited so that companies can predict – often with staggering accuracy – all sorts of information about their existing and potential customers, and they can then focus their marketing efforts in an extremely individualised manner.

For example, as Bernard Marr reports, Walmart can predict what products will sell, car insurance companies can determine how well their clients actually drive, and telecom companies can predict customer churn. Indeed, by looking at customers’ spending habits, retailers like Target have been able to accurately predict when one of their customers is expecting a baby – it even managed to do it before the baby’s father did (Mail Online).
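As a rough illustration of the kind of predictive model described above, the sketch below combines signals that might come from several sources into a single churn-risk score. The feature names, weights and threshold are entirely invented for the example, not taken from any real system:

```python
# Toy churn predictor: combines signals from several (hypothetical)
# data sources into a single churn-risk score between 0 and 1.
def churn_score(customer):
    # Weighted sum of made-up features; the weights are illustrative only.
    score = 0.0
    score += 0.4 * min(customer["support_tickets"] / 10, 1.0)            # friction
    score += 0.3 * (1.0 - min(customer["logins_per_month"] / 30, 1.0))   # disengagement
    score += 0.3 * min(customer["months_since_upgrade"] / 24, 1.0)       # staleness
    return round(score, 2)

def likely_to_churn(customer, threshold=0.5):
    return churn_score(customer) >= threshold

at_risk = {"support_tickets": 8, "logins_per_month": 2, "months_since_upgrade": 20}
happy = {"support_tickets": 0, "logins_per_month": 25, "months_since_upgrade": 3}
```

A real system would learn such weights from historical data rather than hard-code them, but the principle – many sources feeding one prediction – is the same.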

 

Optimizing Business Processes

One of the most innovative and useful ways that big data is now being used in business is in the optimization of stock, based on predictions generated from web search trends, social media and even weather forecasts. With more accurate stock forecasting, companies are cutting down on waste and the need to flog off excess merchandise at reduced rates (and therefore reduced profits).

What’s more, the supply chain can be alerted in advance as to what will be needed, where and when. Indeed, even the delivery route can be optimised using big data. Radio frequency identification and geographic positioning are used to track delivery vehicles and their goods, and then, by using live traffic data, the routes are optimised for the speediest and safest delivery.
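Route optimisation of this kind can be sketched as a shortest-path search over a road network whose travel times have been adjusted using live traffic data. The network and timings below are hypothetical:

```python
import heapq

def best_route(graph, start, end):
    """Dijkstra's shortest path over travel times. `graph` maps a node to a
    list of (neighbour, minutes) pairs, with minutes already adjusted for
    current traffic conditions."""
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == end:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for neighbour, minutes in graph.get(node, []):
            if neighbour not in seen:
                heapq.heappush(queue, (cost + minutes, neighbour, path + [neighbour]))
    return None

# Hypothetical road network; times in minutes, inflated on congested links.
roads = {
    "depot":      [("motorway", 10), ("highstreet", 5)],
    "motorway":   [("customer", 8)],
    "highstreet": [("customer", 20)],   # live traffic data says: congested
}
```

With the high street congested, the search routes the vehicle via the motorway even though the high street is the shorter hop out of the depot.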

 

Financial Trading

Big data is finding a lot of use in High Frequency Trading (HFT). The development of algorithmic trading is being guided by the growing ability to analyse massive amounts of market data generated both internally and externally. The time taken to assess the risks of any trading decision has shrunk to microseconds, as algorithms instantaneously analyse market trends and movements and return clean, clear information that is used to predict what is going to happen next. Indeed, the majority of equity trading now takes place via algorithms. These increasingly take into account big data signals from news websites and social networks so that buy and sell decisions can be made in the blink of an eye, with risk significantly reduced.
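One simple family of trading signals in this vein is the moving-average crossover. The toy version below (window sizes are arbitrary, and real HFT systems are far more sophisticated) compares a short and a long moving average over recent prices:

```python
def crossover_signal(prices, short=3, long=5):
    """Return 'buy', 'sell' or 'hold' by comparing a short-window moving
    average against a long-window one over the most recent prices."""
    if len(prices) < long:
        return "hold"                       # not enough history yet
    short_ma = sum(prices[-short:]) / short
    long_ma = sum(prices[-long:]) / long
    if short_ma > long_ma:
        return "buy"                        # recent prices trending above the baseline
    if short_ma < long_ma:
        return "sell"                       # recent prices trending below the baseline
    return "hold"
```

Feeding in a rising price series produces a buy signal, a falling one a sell; the point is only to show how a stream of market data reduces to a near-instant decision.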

The big data revolution is upon us and in full swing. It is changing the way companies target their customers and market their products. Almost any pop-up ad that appears on your desktop whilst browsing at home or at work will have been tailored for you using big data analytics. But big data is also being used within the day-to-day operations of a business, optimising every step of the process, from production to packaging to purchase. This, along with the high stakes attainable at the sharp end of financial trading, is what is going to produce the biggest changes in the business world. As processes, supply chains and production lines are increasingly optimised, business capital will grow, making those high-end deals more lucrative than ever.

Google, Big Data, and a New Cloud Service

By Kerry Butters

With Big Data projects, the challenge is to clean and filter the huge amounts of information involved. It takes a lot of work to get to the point where business value can be extracted.

Over the coming year, Google will focus on releasing cloud tools and services that ease development tasks, while helping companies monitor their Big Data operations. At its I/O developer conference in June, the company unveiled a number of new products to achieve this.

 

Platform

Google Cloud Platform lets developers build, test and deploy applications on Google’s infrastructure, drawing on computing, storage and application services for web, mobile or backend solutions. The platform is a set of modular cloud-based services allowing you to create anything from simple websites to complex applications.

 

Dataflow

The tech giant has introduced a cloud computing service called Google Cloud Dataflow – billed as a way of more easily moving, processing, and analysing vast amounts of digital information. According to Urs Hölzle (who oversaw the creation of Google’s global network of data centres), it’s designed to help companies deal with petabytes of data – as in, millions of gigabytes.

Dataflow is based on Google’s FlumeJava data-pipeline tool and its MillWheel stream-processing system, and is seen as the company’s answer to Amazon’s Elastic MapReduce and Kinesis, all in one package.

Batch processing is a way of crunching data already collected, while stream processing involves analysing data in near real-time as it comes off the Net. Many organisations need both types of analysis, and Cloud Dataflow puts them under one umbrella.

Designed to be relatively simple, Dataflow handles very large datasets and complex workflows. All jobs use the same code, and Dataflow automatically optimises pipelines and manages the infrastructure.
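This is not the actual Dataflow API, but the idea that one pipeline can serve both batch and streaming work can be sketched with Python generators, which consume a finite list or an unbounded stream with exactly the same code:

```python
def pipeline(records, *transforms):
    """Apply a chain of transforms lazily, so the same pipeline code can
    consume a finite batch (a list) or an unbounded stream (a generator)."""
    for transform in transforms:
        records = transform(records)
    return records

def parse(lines):
    # Normalise each raw record as it flows through.
    for line in lines:
        yield line.strip().lower()

def keep_errors(events):
    # Filter step: pass through only records mentioning an error.
    for event in events:
        if "error" in event:
            yield event

batch = ["OK\n", "Error: disk full\n", "ok\n", "ERROR: timeout\n"]
errors = list(pipeline(batch, parse, keep_errors))
```

Because every stage is lazy, swapping `batch` for a generator reading records off a socket would need no change to the transforms – which is the unifying trick Cloud Dataflow offers at data-centre scale.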

A live demo at Google I/O involved streaming World Cup data against historical information, to spot anomalies. The system could be set to automatically take actions when something was detected.

 

Compute Engine and App Engine

Google sees cloud computing as a potentially enormous market, one to rival online advertising (its primary revenue source).

With Google Compute Engine (the company’s “infrastructure-as-a-service” cloud) and Google App Engine, it now offers cloud services allowing companies and independent developers to build and run large software applications. Google also revealed a number of support services.

 

Cloud Monitoring

Google Cloud Monitoring is designed to help find and fix unusual behaviour across an application stack. Based on technology from Google’s recent acquisition of Stackdriver, Cloud Monitoring provides metrics, dashboards and alerts for Cloud Platform. It comes with over a dozen popular open source apps, including Apache, Nginx, MongoDB, MySQL, Tomcat, IIS, Redis, and Elasticsearch.

 

Cloud Trace

To help isolate the root cause of performance bottlenecks, Cloud Trace analyses the time spent by your application on request processing. You can also compare performance between various releases of your application using latency distributions.

 

Cloud Debugger

Cloud Debugger can be used to identify problems in production applications, without affecting their performance. It gives a full stack trace, and snapshots of all local variables for any watchpoint you set in your code – while your application runs undisturbed.

 

Cloud Save

Google Cloud Save provides a simple API for saving, retrieving, and synchronising user data to the cloud and across devices, without needing to code up the backend. Data is saved in Google Cloud Datastore, making it accessible from Google App Engine or Google Compute Engine via the existing Datastore API.

Cloud Save is currently in private beta, but will be available for general use “soon”.

 

Android Studio

Tooling has been added to Android Studio, simplifying the process of adding an App Engine backend to mobile apps. There are now three built-in App Engine backend module templates: Java Servlet, Java Endpoints, and an App Engine backend with Google Cloud Messaging.

 

BigQuery

With Big Data analysis, timing is everything. As Greg DeMichillie, director of product management for Google’s cloud team put it, “Knowing there was a trend isn’t helpful if you find out a week later.” What’s required is data analysis in real time – or as close to real time as you can get.

BigQuery is a way of almost instantly asking questions of massive datasets. You can bulk load data by using a job, or stream records individually.

Queries can execute asynchronously in the background, and be polled for status. Using the Google Cloud Console, you can access a history of your jobs and queries with the rest of your Cloud Platform resources.

Queries are written in BigQuery’s SQL dialect, and the API supports synchronous and asynchronous query methods. Both are handled by a job, but the “synchronous” option exposes a timeout value and waits until the job has finished before returning.
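The submit-and-poll pattern behind asynchronous queries can be sketched as follows; `FakeQueryJob` is a stand-in invented for the example, not the real BigQuery client:

```python
class FakeQueryJob:
    """Stand-in for an asynchronous query job: reports RUNNING for a few
    polls, then DONE with a result."""
    def __init__(self, sql, polls_until_done=3):
        self.sql = sql
        self._ticks = polls_until_done

    def status(self):
        if self._ticks > 0:
            self._ticks -= 1
            return "RUNNING"
        return "DONE"

    def result(self):
        return [("row", 1)]

def wait_for(job, max_polls=10):
    """Synchronous wrapper: poll an async job until DONE, or give up."""
    for _ in range(max_polls):
        if job.status() == "DONE":
            return job.result()
    raise TimeoutError("job did not finish in time")

rows = wait_for(FakeQueryJob("SELECT COUNT(*) FROM logs"))
```

A “synchronous” query method is essentially this wrapper with a timeout, built on top of the same job machinery.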

There are separate interfaces for administration and developers. Access at both project and dataset levels can be controlled via the Google APIs Console. 

The first 100 GB of data processed each month is free. Monthly billing will vary, but the BigQuery website has a Pricing Calculator to help you get a sense of what an application running on Google Cloud Platform could cost.

Google looks to position itself as the cloud provider most dedicated to making developers’ lives easy. As with Big Data, it’s automating much of the process – and exposing some of its in-house technologies along the way.

Data is Meaningless Without Analysis

By Kerry Butters

There’s value in organisations being able to analyse social media information and compile profiles to better target their customers. But creating, documenting, and retrieving vast amounts of data is one thing. Understanding it is an entirely different matter.

Context is Key

Measuring ‘likes’ or searching for keywords and phrases is pretty straightforward – the basis of ‘sentiment analysis’. You might be tempted to develop a marketing strategy derived directly from this.
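A minimal keyword-based sentiment score of the kind described might look like this (the word lists are illustrative only; real tools use far richer models):

```python
# Tiny illustrative word lists; a production system would use a trained model.
POSITIVE = {"love", "great", "excellent", "happy"}
NEGATIVE = {"hate", "awful", "broken", "disappointed"}

def sentiment(text):
    """Naive keyword sentiment: +1 per positive word, -1 per negative word."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"
```

The simplicity is exactly the problem the article goes on to describe: such a score captures words, not the context behind them.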

But there’ll always be examples of impulse buys, or snap decisions in the heat of the moment. And data samples may include information that’s not so easy to quantify – like pictures or videos.

In fact, the majority of actions will be based on the context surrounding them. Brand A might cost less, but B offers greater satisfaction. The sports car looks great, but what about the kids? And so on.

If a data analysis tool can’t provide further context around the solutions it offers, it’s at best an expensive waste of time.

Informed Decision-Making

Investments in data analytics can be useless – even harmful – unless employees can incorporate that data into complex decisions. Meeting this challenge requires an understanding of human behaviour which is often lacking – and not only in IT departments.

Operational Intelligence

On a par with business intelligence is the need for operational intelligence: the ability to see and know everything that’s happening in your IT environment, at any moment. A tall order, considering the levels of scale and complexity involved.

But armed with this knowledge, IT teams can better collaborate, fix problems, and provide support for product launches, application rollouts, migrations, upgrades, and other initiatives.

Operational intelligence may be obtained from four primary data sources:

1.  Machine data: such as log files, SNMP and WMI. Data from sensors (e.g. on wearable devices) also applies.

2.  Code-level instrumentation: which traditional application performance management (APM) is based on.

3.  Service checks: which provide insights on whether applications are up or down, and how well they’re performing.

4.  Wire data: the data-in-motion, describing all communications between systems.

Of these, wire data has the greatest potential for transforming intelligence.

Get Wired!

Wire data is the record of everything that’s happening in IT, in real time. It provides an in-depth view into the performance, availability, and security of your environment – including issues you might otherwise be unaware of.

Wire data is unstructured, and also high-velocity; generally at 10Gbps in data centres, and faster still in cloud environments. Powerful packet processing capabilities are required, just to keep up.

But, with the right tools in place, it can assist with:

1. Detecting Application and Infrastructure Performance Issues: Based on communications over the wire.

2. Big Data Analysis: You can extract specific pieces of wire data and feed it into analysis platforms such as MongoDB or Splunk.

3. Spotting Data Theft: You can easily identify when data is being stolen from your back-end databases – a particularly vulnerable place. Using wire data, you can spot when queries are being made by unknown or untrusted sources.

4. Parsing Data: Using Big Data analysis tools, you can mine that data for business intelligence purposes. 

5. Generating Meaningful Reports: Which will enable you to analyse what’s happening, with the data you collect.
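Point 3 above – flagging database queries from unknown sources – can be sketched as a simple allowlist check over hypothetical wire-data records (the IP addresses and record shape are invented for the example):

```python
# App servers allowed to query the back-end database (hypothetical addresses).
TRUSTED_SOURCES = {"10.0.0.5", "10.0.0.6"}

def suspicious_queries(wire_records):
    """Flag query records whose source IP is not on the allowlist."""
    return [r for r in wire_records if r["source_ip"] not in TRUSTED_SOURCES]

records = [
    {"source_ip": "10.0.0.5", "query": "SELECT * FROM orders"},
    {"source_ip": "198.51.100.7", "query": "SELECT * FROM customers"},  # unknown host
]
alerts = suspicious_queries(records)
```

Real wire-data platforms reconstruct these records from packets at line rate; the detection logic on top, though, is often no more exotic than this.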

Visualisation

Visualisation helps put data into context and bring business cases to life, through the creation of visual models that represent what’s happening to and with your data.

Most organisations are now moving toward dashboard-style models, where you can zoom in and out of the information. These help you understand what did or did not happen as a result of the actions you took – a hindsight analysis.

To look into the future, visualisation models need to be more dynamic. 

In the Real World…

Let’s take utilities (power, gas supply etc.) as an example. Most have archaic records and inaccurate information, with no idea where all of their underground assets are located. That makes it hard for them to deal with service interruptions that can occur when a power line is accidentally cut, or a water main bursts.

In the USA, the Las Vegas city government has taken advantage of smart data to develop a living model of its utilities network. VTN Consulting helped the city aggregate data from various sources into a single real-time 3D model using Autodesk technology. The model is being used to visualise the location and performance of critical assets above and below ground.

Companies in the health sector are reporting impressive results as they use Big Data analytics to increase efficiency, improve patient outcomes and provide greater personal care. These are largely fuelled by the US government’s push for more meaningful use of electronic health record (EHR) systems.

How to Make Data Meaningful?

Work backwards, and ask a few fundamental questions:

·   What business processes or decisions do you want to improve? (Make sure to get management involved at this level.)

·   How will these decisions improve the business?

·   What are you trying to maximise?

·   What are the most meaningful elements used to measure progress toward those goals?

·   What types of analysis do you need to perform to expose the data, explore “what if” scenarios and work through alternatives to optimise your operations?

·   What types of data do you need to collect in order to feed the above analysis and decision-making?

Obituary: Big Data

By Kerry Butters

Donald Feinberg, VP and analyst at Gartner’s Intelligence and Information Group, recently said that Big Data will die within the next couple of years, thanks largely to the confusion which surrounds the term.

Once upon a time, databases were relatively small; tiny by today’s standards. Businesses had records of their customers’ accounts, built up manually over time, originally with pen and paper and later with microprocessors. Bigger companies started to have whole floors dedicated to data processing departments, ensuring that purchase orders and invoices were all matched and accurate, and accountants knew who had paid and who owed money, what had been bought and what had been cancelled.

With cloud computing and processing technology getting so small that you could practically map out the life cycle of a grain of rice, data started to be recorded and collected at ever faster rates, and in much greater volumes. Processors in cars and other equipment meant that a whole boatload of parameters could be constantly measured.

More and more measurables

Social media sites, ecommerce sites and other communal online gatherings meant that individuals could be adding to the pile of data already stored about them as they filled in forms and registered for things online. Photos, likes, friends, birthdays, political leanings, sexual orientation, marriage status, hobbies and interests…the list of measurables became endless.

Marketers cottoned on that they could find out even more about people and their activities by giving a little entertainment in return for information.

Data was evolving and its new buzz-name was emerging. This thing was big and needed a grand, although quite unoriginal, title. ‘Big Data’ was born, and every smart salesperson, IT geek and technical consultant was throwing the phrase around in conversations.

What to do with Big Data?

Some people had ‘Big Data’ but didn’t know what to do with it. Others had bigger ‘Big Data’ than everyone else (so there) and knew exactly what they wanted to do with it. Certain eager beavers didn’t care if it was called Data, Big Data or naught and one spaghetti; they just made sure that the IT infrastructure for their organisation could handle any amount of information that was going through their servers and conduits.

Others blissfully got on with running their businesses, hiring the services of IT support companies and other professionals who would make sure their systems didn’t crash and their printers worked when they turned them on. Some smart people were figuring out how to condense meaningful understanding from all the data they were gathering.

Changing technologies

This was Big Data’s heyday, and a time when it actually meant something. Technology was changing and much of the change related to the amount of data that was ‘out there’, how it was being managed, how quickly it could be processed and moved around, and the mind-blowing variety of variables that could be and were being measured. Surely this data was going to be extremely useful for managing situations, for making the most of the trends, for future-proofing organisations by learning lessons from the past.

In fact, the enormity of the situation meant it was almost beyond definition. From a business perspective, many aspects of how information stockpiles were growing could be a threat or an opportunity. It all depended on how businesses reacted to the changes and the trends, and embraced what was happening. There would be winners. There would be losers. Who would win and who would lose was down to how they played the Big Data lottery.

The death of Big Data?

Yes, they were halcyon days for ‘Big Data’, so what changed? How did Big Data eventually get sick, and then subsequently die? What went wrong? In a nutshell, it was beaten by its own success. It was made redundant by its own arrival.

Big Data could be seen as a hurricane on its way to a land that has never experienced such a phenomenon before but is going to imminently – where people are forewarned well in advance that there is a disturbance in the weather, troubled times ahead, that the time is coming to batten down the hatches and seek cover before the incredible winds, rain and destruction arrive. The time for talking about it, for describing it, defining it and giving explanation is before the storm hits. Once the storm hits, everybody is too busy making sure they come out on the other side in one piece.

A major change in how information was being generated and gathered was occurring, one that required attention and action on a number of fronts. Just as the village needed the hurricane defined, described and explained in order to be convinced it was coming and to prepare for its high winds, so ‘Big Data’ was the catch-all term coined to capture the revolution in data processing.

Information control

That job has been done. Mentioning Big Data is now pointless. The consequences of its arrival are already here in the flesh. Businesses are finding right now that they either have control of the information in their possession or they don’t. They are either benefitting from the intelligent analysis of the information that counts or they have been focusing on the least fruitful facts and figures – or maybe not mining from their data banks effectively at all.

Big Data was a hype word. It was a necessary one to galvanise people into action, to facilitate communication and to sum up a range of phenomena that needed to be acknowledged – like the ‘swinging 60s’. But now that the realities have hit home, the people who are really in the know are looking for the next change they need to predict. Big Data is rapidly becoming a ghost that is only mentioned by people who are not so savvy, or who are desperate to show understanding in order to sell a product or service.

Thanks for everything you did for us, Big Data. Good bye and God bless. May you rest in peace.

Image: Gerd Leonard

What’s your Data Governance Plan?

By Kerry Butters

Big data is something that businesses are embracing in increasing numbers, but not every business is prepared for the changes that accompany big data adoption. A recent survey suggests that as many as 44% of businesses aren’t ready to implement data governance plans, and 22% of the firms without a data policy say they have no plans to implement one.

These findings were released in a data governance survey from Rand Secure Data, which is a division of Rand Worldwide. The findings suggest that businesses simply aren’t prepared for the legacy of big data and it’s becoming apparent that many businesses are happy with the benefits of big data gathering, but are equally happy to ignore the dangers.

Businesses are aware of what needs to be done to safeguard their data, but many seem loath to act or even address the problem. It seems that until there are consequences, many businesses simply won’t acknowledge or do the things that need to be done.

Here’s a quick list of things that businesses should be doing:

·         An enterprise-wide process for managing data archiving

·         Backup of system files and company servers

·         Promoting e-discovery and incorporating it better

Many respondents in the survey said that their companies had yet to adopt any of the above. Whilst not every company is a culprit for poor business planning, a surprising number are. This is of course an area for concern and businesses need to develop good data governance plans and better prepare for the increase in big data usage.

There are a number of consequences to neglecting good data governance plans. According to the survey, the next two years are important and if companies don’t adopt better policies before that time expires then those companies could lose data, lose control of the tracking and gathering of data, and even risk potential lawsuits due to bad data governance.

The survey doesn’t just bring bad news however – there’s some hope in the sub regions of data management. Over 98% of respondents said that their company has some form of backup program for its data, and 95% stated that their organisations back up all of their data on a consistent basis.

The role of e-discovery

E-discovery, however, is another area of concern for businesses, and the danger of legal action due to mismanaged data stockpiles is high. According to the survey, over a third of respondents felt that the company they worked for would be unable to find and produce data when it was needed, and the same participants also said that their organisation wouldn’t be able to prove the veracity of its data in the event of legal action.

These are areas that need addressing and a business shouldn’t feel that by ignoring the potential dangers of big data usage the consequences are negated. The potential for error obviously increases the longer that good data governance plans aren’t implemented. It’s not only forward thinking, but completely necessary and businesses shouldn’t shirk their obligations in favour of easy rewards.

Predictive coding

A lot of businesses are avoiding implementing predictive coding – even when it can save time and greatly increase productivity. Companies need to realise that the effort saved by skipping it now is outweighed by the time and hassle saved in future by early implementation. Machine-learning e-discovery technology can almost eliminate human input, automatically determining how and where documents will be classified. This type of time-saving technology will also help make the network and its data much more readily searchable, and it’ll increase the working output of employees.
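A crude stand-in for predictive coding’s document classification is keyword scoring; real systems use trained machine-learning models, but the sketch below (with invented categories and keywords) shows the shape of the idea:

```python
# Hypothetical document categories and their keyword sets.
CATEGORIES = {
    "contract": {"agreement", "party", "term", "clause"},
    "invoice":  {"invoice", "amount", "due", "payment"},
}

def classify(document):
    """Score a document against each category's keyword set and return the
    best match, or 'unclassified' when nothing hits."""
    words = set(document.lower().split())
    scores = {name: len(words & keywords) for name, keywords in CATEGORIES.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unclassified"
```

A trained model would replace the hand-written keyword sets with learned weights, but either way the payoff is the same: documents are filed automatically and become searchable by category.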

Although e-discovery software is easy to obtain and implement, predictive coding is used by only 14% of those surveyed. Perhaps more telling is the fact that 33% said that they had never encountered e-discovery or predictive coding before. This suggests that e-discovery is an area that needs exploring, and businesses need to be better educated on what it is and how they can use it to their advantage.

In terms of what businesses can do to become better prepared for big data through good governance plans, the actions required are pretty simple. Executives need to participate and help to devise data governance policy that benefits and safeguards the business. The involvement of executives and company-wide policies means that a business is three times less likely to lose its data or run the risk of a data audit failure.

The survey has four recommendations for businesses looking to adopt data governance plans.

1.      Organisations need to adopt a formal data governance plan or reassess the current plan. No policy will ever be faultless and of course there will always be exceptions but corporate entities need to work on and develop protective measures to ensure that the business gets the valuable data it needs safely.

2.      Use your organisation and solicit as much input as you can from your employees. They’re working with the data on a daily basis and they’re likely to have a good idea of what needs to change and develop.

3.      Make sure that any data governance policies are in keeping with your organisation’s legal requirements. There are many different types of data, each with different retention rates and you need to know how to meet those requirements.

4.      Be on the lookout for new technology that your business could utilise to its benefit. Big data is growing ever larger, and data governance should be reliable, scalable and, of course, efficient. New technologies will arrive that can make it easier for your data governance goals to be met.

Many businesses are not doing what’s required of them when it comes to data governance. Businesses need to realise the dangers of not adopting new policies and understand that scrimping now will not save them in the future.

Is Big Data Essential For Business?

By Kerry Butters
Image by Domo

Data is more a part of our lives now than it has ever been. It’s woven into every sector of the global economy, and the harnessing of that data by businesses and individuals alike is becoming the norm.

Big Data is what we get when the data sets collected become too large and complex to analyse using standard methods. This data comes from all sorts of sources, including web browsers, social media and consumer information. By sifting through all of this information business managers are able to make much more informed decisions and therefore move their company forward with confidence.

The Benefits of Big Data

Future ready – The internet of things is a hot topic in 2014. As products begin to broadcast data to one another to improve their efficiency, the amount of data available to manufacturers and suppliers is going to grow even greater. Investing in an infrastructure that can handle and analyse Big Data now will put your business in an excellent position for the future. Even if you’re not concerned with the internet of things, Big Data is listed by Gartner as one of the technology trends you can’t afford to ignore.

Customer Insights – Near the top of a good business’s objectives is to react and respond to the needs and desires of its customers and clients. Big Data allows your business to make accurate predictions about the near-future needs of those who use your business regularly, and the business of your competitors. This can give you the edge that you need to both excel in your chosen field and compete with other businesses.

New Business Models – Some industries have seen Big Data create completely new business models. Algorithmic trading allows businesses to analyse tremendous amounts of market data every minute, providing them with information on where the real opportunities are. This kind of speed would have been unheard of a decade ago. Retail companies are using Big Data to change their purchasing behaviours, making purchases based on fact rather than speculation.

Better In-House Operations – Big Data may be the final link for companies looking to achieve maximum operational efficiency. By constantly assessing the efficiency of their workers and work processes, managers will be able to highlight the points where they’re losing money. Insurance companies are already speculating how they can use Big Data analytics to speed the processing of claims and spot potentially fraudulent claims that need investigating.

Improved Sustainability – The Guardian recently reported that Big Data could see businesses able to take much greater steps towards becoming sustainable and nature friendly by using the same information that improves their in-house operations.

Harness Your Staff’s Potential – Data analysis doesn’t have to stop at your customers; businesses that turn their analysis inwards and review their own staff have reaped the benefits too. By analysing the work patterns and talents of their employees, managers can make sure that staff with niche skill-sets aren’t going to waste, and introduce multi-discipline roles to improve efficiency.

The Challenges of Big Data

Security – Obviously, if you’re handling tremendous amounts of customer and client information then you need to be absolutely sure that that information is well protected from anyone who might want to misuse it. If you’re going to invest in Big Data, we advise you also invest in beefing up your security. Fortunately, Big Data security is set to be one of the big talking points of 2014, so there will be plenty of opportunities to make improvements across the board.

Accumulation – Data doesn’t just turn up on your desk; you have to invest in accumulation systems to help you collate all the information that’s out there and turn it into useful, business-driving data. These can be expensive, but they are an important part of any business’s Big Data strategy.

Analysis – Big Data needs high-performance analysis because there’s just so much of it. Datamation wrote an article last year about the amount of data we create every minute, and the numbers are staggering. According to their figures from June, every minute the human race wrote over 204 million emails, queried Google 2 million times, watched 48 hours’ worth of YouTube videos and produced over 100,000 tweets. That’s a huge amount of information to process and eke useful data from, but it’s almost certainly worth it.

Datamation also noted that over $272,000 was spent on e-commerce every minute, and like the other numbers this is likely to have risen further in 2014. People are spending more money online because it’s often cheaper, more convenient and more varied. It also means more data, and that data is becoming more accessible every year. For a business, passing on all that information means passing on a huge number of potential new customers once you’ve isolated their desires using Big Data.
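To put those per-minute figures in context, here’s a quick back-of-the-envelope scale-up using only the numbers quoted above (and assuming, purely for illustration, that the rates hold steady all year):

```python
# Rough scale-up of the per-minute figures quoted from Datamation.
MINUTES_PER_DAY = 60 * 24

emails_per_minute = 204_000_000
ecommerce_spend_per_minute = 272_000  # US dollars

emails_per_day = emails_per_minute * MINUTES_PER_DAY
ecommerce_spend_per_year = ecommerce_spend_per_minute * MINUTES_PER_DAY * 365

print(f"Emails per day: {emails_per_day:,}")                        # ~294 billion
print(f"E-commerce spend per year: ${ecommerce_spend_per_year:,}")  # ~$143 billion
```

Even a steady-state extrapolation lands in the hundreds of billions, which is exactly why the analysis stage demands serious horsepower.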

It’s Time for Big Data

Big Data analytics is looming over the business sector like never before, and 2014 is the year we’ll see businesses really kick their analysis into gear. The target for many will be real-time analysis of data as it’s collected, reacting to that information accordingly. Any big company unable to do the same may find itself falling behind, while any SMEs getting into Big Data early could well see themselves rocketing to the front of the pack.

Reluctance to Hire Staff Holding Back Big Data

Kerry Butters

Big Data is big news and many companies are embracing it whole-heartedly. It can provide an organisation with a good idea of who its customers are and what they want from the services provided. It has promoted the idea of ‘value exchange’ and given companies a better means of communicating with and understanding their customers.

So how can Big Data be utilised within marketing campaigns?

Infogroup Targeting Solutions recently published a study that revealed that the investment from business into Big Data marketing would increase hugely in 2014. This is due to new initiatives that allow a business to fully embrace the massive amount of data that its customers are relaying. However, the study also found that although the benefits of Big Data are clear, there are many companies that are not making the correct plans for data-related job positions.

According to David McRae, president of Infogroup Targeting Solutions:

“The survey findings also indicate that marketers are moving from the information-gathering stage to the analytics phase of Big Data adoption. But a downturn in hiring could stall Big Data implementation, as the need for human capital is greatest during the analysis and action stages.”

It seems that marketers already have the information they need and are now looking for the best application of that data through analytics tools. The problem facing most businesses is the implementation of Big Data policies as many are failing to hire the correct staff within this burgeoning sector.

Employ more staff

It’s all well and good collecting Big Data but it’s in the collating, analysis, and action stages that businesses are struggling. A good tactic for businesses looking towards the future would be to employ more staff to handle this huge influx of data. This will allow them to accurately discover the best way to apply the new information provided by customers.

Many participants in the study cited a number of reasons for not adopting Big Data practices at work. The main concerns mentioned ranged from limited budgets to fragmented systems; half of the marketers surveyed, however, were enthusiastic about the role of Big Data in the marketing industry.

Big Data is big business

The report found that, for a second year in a row, more than 60% of companies expect their Big Data marketing budgets to increase. The majority of marketing departments, however, don’t plan on adding any new employees. This means there won’t be enough hands to handle the influx of data in 2014, which marks the key problem with Big Data implementation.

There are simply not enough business policies in place, be they the employment of new staff or simply increased budgets, to allow Big Data to make the splash that everyone is expecting. Interestingly enough, many companies stated last year that they were planning to hire for Big Data positions. This hasn’t materialised, and it’s frustrating to see an industry so poised for growth being hampered by short-sighted, profit-driven businesses.

McRae further argued that businesses need a huge influx of employees to fully grasp the Big Data potential:

“Big data is meaningless without manpower. While it’s exciting that most companies are making bigger investments in Big Data, marketers should not forget that it takes people to make sense of the information. Hiring before reaching the analytics stage enables companies to become data-led and act on the data.”

The advice is clear: employ more staff before reaching the analytics stage. Back in 2013 most marketers, around 70%, said that they expected data-related spending to increase in the year ahead. The spending spree will continue with close to 62% of marketers predicting that their Big Data budgets will increase.

That’s a decline of 8 percentage points and, although it’s not huge, it reflects the lack of investment in this sector. This number should have risen sharply since 2013, not dipped, and the decline may be an indication that fewer marketers are budgeting for data solutions.

Why does Big Data matter?

Consumer expectations have never been higher. The mobile app market and the customer-first attitude have fuelled the customer’s desire to be heard and recognised. Marketers, and indeed many businesses, are coming under heavy pressure to support customer-satisfaction initiatives whilst, at the same time, finding a return on investment for marketing spend.

For many marketers the answer lies in gaining a more complete view of the customer. Big Data provides this possibility, as it lets marketers slice their consumer base into individual segments; this enables the marketer to better understand, predict and shape customer buying behaviour.
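As a toy illustration of what slicing a consumer base into segments can look like, here’s a minimal Python sketch; the customer records and the segment thresholds are entirely made up:

```python
from collections import defaultdict

# Hypothetical customer records: (customer_id, total_spend, visits_per_month)
customers = [
    ("c1", 1200.0, 14),
    ("c2", 80.0, 2),
    ("c3", 640.0, 9),
    ("c4", 30.0, 1),
]

def segment(spend, visits):
    """Assign a customer to a simple behavioural segment (invented rules)."""
    if spend >= 500 and visits >= 8:
        return "high-value regular"
    if visits >= 8:
        return "frequent browser"
    return "occasional shopper"

# Group customers by segment so each group can be targeted differently.
segments = defaultdict(list)
for cid, spend, visits in customers:
    segments[segment(spend, visits)].append(cid)

print(dict(segments))
```

Real segmentation draws on far richer signals (social data, browser logs, sensor data), but the principle is the same: group customers by behaviour, then tailor the marketing to each group.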

Marketers see Big Data, coupled with sophisticated mining and analytics tools, as the key to unlocking those consumer insights. The application of Big Data within marketing lies in finding the right range of tools, and of course employees, to put that data to use: correlating ads with sales, measuring audiences and predicting customer behaviour.

Big Data has big potential and it lies squarely in the hands of businesses to implement successful policies to take advantage of this ever-expanding market. A business should consider employing staff to specifically deal with the influx of data that the new value exchange model provides. If businesses fail to employ more staff all of the Big Data that has been collected is a waste and ultimately meaningless.

Whilst there still exists something of a skills gap when it comes to finding the right staff to analyse Big Data, companies should be pulling out all the stops to ensure that they land the best candidates.

Big Data – is it possible to define it?

Simon Randall

This is a big question and one which, once fully considered, has massive implications for any business. Every day, businesses are amassing an increasing amount of data and the scope of what can be measured is also expanding at an incredible rate. While businesses are still getting to grips with how to use this data meaningfully, some are struggling to manage it effectively.

Sometimes that means databases becoming corrupt at an increasing rate as they grow larger, or becoming more difficult to store effectively in-house. Massive databases may place too much demand on infrastructure when processed at speed, or be so diverse in nature that it is difficult to know where to start when organising them into a usable format.

What is Big Data?

Once upon a time, a business would store essential information such as client names and invoice details, order history and accounts records. This information would be structured into a usable format and, with the dawn of the computer age, tied up with software making it easy to access. Looking back to that era, data was gathered conscientiously and with a definite purpose in mind. The bigger the business, the bigger the databases required to store its prized information.

Digital evolution has changed the data landscape forever. Where data was once input into a system one unit at a time by an operator or by an individual filling in a questionnaire, data collection and its transfer is now a more automated affair. Modern day communications being what they are means that information relating to somebody’s Facebook usage, while accessing the site from their mobile phone in Indonesia, is usable in the US in the blink of an eye.

It’s not only the speed at which data is transferred, and how it’s transferred, that’s changing; the spirit in which data is gathered has become incidental, almost accidental. That is, much of what people do with their credit cards or view using their browsers leaves a data trail that can be detected and processed retrospectively. It’s this discovery of what’s already out there that makes Big Data such a commercial wild card, and also what makes it so exciting an opportunity.

Types of data

The type of data being collected and the sheer amount of information that’s out there is absolutely mind-boggling. Most companies in the US have at least 100 terabytes of data stored – that’s 100,000 gigabytes if you’re wondering – according to a useful infographic (below) published by IBM. The same document highlights the fact that out of a world population of 7 billion, some 6 billion of us are using a mobile phone. Imagine the humungous amount of information being measured through our phones. What’s particularly interesting in this infographic is how data is categorised according to four criteria: volume, velocity, variety and veracity.

Volume

There’s no particular agreed volume of data needed for it to qualify as Big Data, so it’s a bit of a case of how long is a piece of string. However, if you’re looking at a sizeable amount that’s becoming hard to manage, then it’s safe to say you’ve entered the realm of Big Data. It’s now possible to mine a large array of information from data trails, and where specific data is being gathered deliberately, the processes have become so efficient through better real-time technology that it’s very easy for a business to find it has a tricky amount of facts and figures on its hands very quickly.

Velocity

For data to be useful, it needs to be processed. How quickly the data is gathered in a meaningful way, and how quickly it can be analysed and used effectively, is a reflection of its velocity. As shown by the earlier infographic, it is projected that there will be 18.9 billion network connections by 2016; that’s almost 2.5 connections per person on earth, and they will be creating a whole new universe of high velocity, streaming data that will be analysed on the fly. There’s a great illustration of how velocity interacts with volume and variety in this WhatIs article.
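To give a flavour of what analysing high-velocity data ‘on the fly’ can mean in practice, here’s a minimal sketch: a running aggregate that updates as each value streams past, without ever storing the full stream. The values are invented stand-ins for live events:

```python
class RunningStats:
    """Incrementally track count and mean of a stream without storing it."""

    def __init__(self):
        self.count = 0
        self.mean = 0.0

    def update(self, value):
        # Incremental (Welford-style) mean update: memory use stays constant
        # no matter how many events stream through.
        self.count += 1
        self.mean += (value - self.mean) / self.count

stats = RunningStats()
for event_value in [10.0, 20.0, 30.0]:  # stand-in for a live data stream
    stats.update(event_value)

print(stats.count, stats.mean)  # 3 20.0
```

This is the essence of high-velocity processing: the answer is always up to date, and nothing needs to be replayed from storage.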

Variety

The variety of data that exists is changing all the time as technology changes. Twenty years ago, the idea that 30 billion pieces of content would be shared on Facebook each month, or that 400 million tweets would be generated every single day, was unthinkable. Data is now being collected from hi-tech medical monitoring equipment, computers, mobile phones and more. Just as today’s variables were unimaginable yesterday, it’s highly likely that there are many others beyond tomorrow’s horizon that are unthinkable today. Big Data is going to get much bigger and even more complex, but the rewards for managing it effectively are going to be exciting for everyone.

Veracity

From the Latin root ‘veritas’, meaning ‘truth’, this wonderful word refers to how dependable, or how certain, the gathered data is. Since records began, there have been those who would leave a space blank rather than double-check, or lazily record an approximate value, or sometimes an inaccurate one. Whether through human error or mechanical failure (as would be the case if a particular key on a keyboard were faulty), mistakes have always been made and will continue to be a reality for some years yet.

The problem with Big Data in relation to veracity is that mistakes tend to be greatly amplified as the amount of data grows. Also, just as an arrow that’s off course by half a degree finishes up further and further from its target the greater the distance it travels, so poorly measured or recorded data can have a snowball effect in the long term, especially as data size increases.

What do Volume, Velocity, Variety and Veracity mean for business?

The scale of information gathering and processing now available to businesses means lucrative opportunities for understanding their target markets, current trends, projections, spending habits and ways to become more efficient; the potential is enormous. The four main categories of Big Data discussed in this article, the 4 V’s, each present different challenges to businesses in regard to how they can be practically handled.

Effects on IT infrastructure

As the volume of information being gathered continues to increase, perhaps not exponentially but certainly dramatically, businesses are going to need servers that can handle the extra load, and remote data back-up will also need to be airtight. After all, the potential to lose data grows with the volume stored: a single day’s data loss tomorrow will be many times what it was yesterday. The size of data being transferred will also have knock-on effects on IT infrastructure, with cabling needing to be of a suitable spec to handle the larger volume, for example.

Even smarter software development

The experts are getting better at measuring things and converting that information into noughts and ones. Light, sound, humidity and occupancy are all measurable, and these represent just the tip of the iceberg from the world of building management. As we become able to collect an increasing range of variables from the world around us, we will need to store them in an intelligent way for easier processing. They will need separate databases, and sophisticated software packages that can move them, shake them, make sense of them and create the commercial honey we all want.

Robust data gathering and recording

As more data is collected automatically, veracity should improve as software becomes more sophisticated and data transfer more reliable, but the stakes rise too. The slightest flaw could mean a black hole for every hundred-thousandth unit of data, for example, and when dealing with a mammoth amount of information that’s going to be put through heaps of processing, the long-term corruption potential could be catastrophic.

Whether or not human beings fill in the fields of CRM packages or online forms depends on many factors. Are they being asked too much by a company they have never done business with before? Is the process too tedious? In regard to CRM software, has it been embraced by the workforce who are using it or is it seen as a way of snooping on them? Have they been trained to use it correctly or is it just another unwelcome task that has landed on their lap, another thing to slow them down in an atmosphere where productivity is constantly monitored? After all, if the team who are using it don’t embrace it, how can the data being entered into the system be trusted?

Unintended consequences

Other factors that can affect veracity are the unintended consequences of, for example, sales people trying to find loopholes in the system or cutting corners. Perhaps they are creating a duplicate account for a client that has been barred from further business by the accounts department, or maybe an operator has figured out they can get through the system quicker by putting any old number into a particular field.

These accuracy issues can only be dealt with using a thorough approach: software fail-safes, intelligent design, and effective communication and training for the staff using the software.

So is it possible to define Big Data?

Every aspect of Big Data is changing: the amount of data is growing, more is expected from the data collected, and the type of data being processed is evolving. What this means is that trying to define Big Data is like trying to grab hold of blancmange. Any definition that can be applied is likely to become outdated fairly quickly, because the boundaries of this new and wonderful entity are in a state of flux.

Big Data is bringing a whole new world of opportunity to everybody. For the everyday person it means added convenience because the organisations that offer everyday services are getting better at giving people what they want. From the perspective of big business, it means getting it right more often, stock that sells, projections that are accurate, efficiencies and increased profits.

The wise will bear in mind that with every opportunity, there is usually a threat and that is certainly true in the case of Big Data. If businesses don’t get to grips with how they handle Big Data accurately, efficiently and effectively, not only will they miss out on growth potential, they will be vulnerable to competitors who are in command of their Big Data.

The real question

For a business to be truly future-proofed, it needs to be prepared for change and ready to adapt to a new world. In the case of Big Data, this means being ready to take advantage of whatever new variables become measurable – and these new possibilities may well be beyond what we can currently see. The question then becomes this:

If businesses really want to reap the benefits that Big Data brings, should they even define it in the first place, or is it better to keep their eyes peeled for the next big change?

Big Data: What’s all the fuss about?

Simon Randall

If you read the technology news, or even just put searches into Google, you surely can’t have failed to notice the term big data being bandied around very frequently. Whilst it’s easy to dismiss it out of hand as the latest buzzword in the technology industry, which does like its jargon, let’s face it, big data is something that could prove very valuable to your business.

So without further ado, let’s have a look at what big data actually is.

Big data is a collection of data sets that are too big and complex for the usual database management tools or data processing applications to manage.

In order to be able to use this data for various business purposes, the enterprise has to find a way to carry out key tasks such as:

  • Capture

  • Curation

  • Search

  • Transfer

  • Analysis

  • Visualisation

Successfully carrying this out means that companies can then use the data to pick up trends, determine the quality of research and even prevent disease, amongst many other things. Data can be collected from a variety of sources, both inside and outside of the company. Product sales, financial information, online and offline interaction channels – all of these can make up a part of a data set.
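Those key tasks can be pictured as a simple pipeline. The sketch below is purely schematic – the stage functions are placeholders rather than any real framework – but it shows how capture, curation, analysis and visualisation hand data along:

```python
# Schematic Big Data pipeline; every stage here is a toy placeholder.

def capture():
    """Stand-in for data capture: raw, messy records arrive."""
    return ["  Sale: widget ", "Sale: gadget", "  sale: widget"]

def curate(records):
    """Curation: clean and normalise the raw records."""
    return [r.strip().lower() for r in records]

def analyse(records):
    """Analysis: count occurrences of each distinct record."""
    counts = {}
    for r in records:
        counts[r] = counts.get(r, 0) + 1
    return counts

def visualise(counts):
    """Visualisation: render counts as a crude text bar chart."""
    return "\n".join(f"{k}: {'#' * v}" for k, v in sorted(counts.items()))

result = analyse(curate(capture()))
print(visualise(result))
```

Search and transfer would slot in between these stages in a real system; the point is simply that value emerges only after the data has passed through every step.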

Structured and unstructured

It’s important to understand that data sets can be made up of data that’s both structured and unstructured, with the latter coming from sources that are difficult to organise, such as social media posts. Multi-structured data comes from a variety of sources, uses many different formats, and can be characterised as involving some form of human interaction with a machine.

This means that it can be made up of text, images, web logs, forms or transactional information. Basically, it covers any data that comes about thanks to the interactions between people and machines, whether web-based or not.
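The structured/unstructured distinction is easiest to see side by side. In this small Python sketch, a structured record parses straight into named fields, while an unstructured social post (an invented example) has to be interpreted – here with a crude keyword check, though real text analytics is far more sophisticated:

```python
import csv
import io

# Structured: a CSV row maps directly onto named fields.
structured = io.StringIO("customer,product,amount\nalice,widget,19.99\n")
row = next(csv.DictReader(structured))

# Unstructured: free text from a social post needs interpretation.
post = "Just bought a widget and I absolutely love it!"
mentions_product = "widget" in post.lower()
sentiment_positive = any(w in post.lower() for w in ("love", "great", "brilliant"))

print(row["product"], row["amount"])
print(mentions_product, sentiment_positive)
```

Most of the effort in Big Data projects goes into that second case: turning free-form human output into something as queryable as the first.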

It has plenty of uses and, with the ever-growing amounts of information used on social and web communities, it’s something that marketers are taking advantage of, and this trend is only likely to continue.

Understand your customers

The key to using big data successfully for marketing purposes is that it allows you to understand how your customers think. This means that you can then engage with them at a time that is relevant to them, and with information on the products and services that they really want.

With big data, you can examine and act upon everything that you know about your customers, and this can be a substantial amount of information that allows for more accurate targeting. If you look at a company like Amazon and consider the kind of emails that you receive from them, they’re always targeted to you specifically, with suggestions that they think you may act on.

Likewise, when browsing Amazon’s site, suggested products will appear at the bottom of the page as you surf. This is what big data can do: using technology to personalise the customer experience in a way that significantly increases the chance that the customer will buy.
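A ‘suggested products’ feature of this kind can be sketched with simple co-occurrence counting over past orders. This is a toy illustration with made-up baskets, not Amazon’s actual method:

```python
from collections import Counter
from itertools import combinations

# Hypothetical past orders (baskets of product names).
orders = [
    {"kettle", "toaster"},
    {"kettle", "toaster", "mug"},
    {"kettle", "mug"},
    {"toaster", "bread bin"},
]

# Count how often each pair of products was bought together.
pair_counts = Counter()
for basket in orders:
    for a, b in combinations(sorted(basket), 2):
        pair_counts[(a, b)] += 1

def suggest(product):
    """Products most often bought alongside the given one."""
    scores = Counter()
    for (a, b), n in pair_counts.items():
        if a == product:
            scores[b] += n
        elif b == product:
            scores[a] += n
    return [p for p, _ in scores.most_common()]

print(suggest("kettle"))
```

Production recommenders weigh far more signals (browsing history, ratings, recency), but bought-together counting is the intuition behind those suggestion boxes.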

Maximise operating margins

According to the McKinsey Global Institute, “a retailer using big data to the full could increase its operating margin by more than 60 percent”, and harnessing it in the public sector has a huge amount of potential. Health care systems could benefit from vast cost savings and could “create more than $300 billion in value every year”. That’s certainly not a sum to be sniffed at, and for the US it could reduce the amount spent on health care by a whopping 8%.

Governments in Europe could also make huge savings, which McKinsey say could total as much as €100 billion. Big data has numerous uses and there’s no doubt at all that it will also aid research in a way that’s never been seen before. Of course, there are also the scare stories and conspiracies, and there will always be some sensationalist media story about the dangers (such as the NSA and how it uses data), but that’s a natural side-effect of progress.

More research

It’s no surprise, either, that IBM is one of the pioneering technology companies tackling big data and providing software to handle it. Now, the company has also launched its new Accelerated Discovery Lab, which aims to find connections among massive data sets that can in turn be used for analytics.

“There’s a set of data challenges and a lot of different expertise you need to tackle those problems,” Laura Haas, the lab’s director of technology and operations, told VentureBeat in a recent interview.

“Everyone is talking about Big Data and analytics. We talk to our researchers, but it’s still really hard to get the right kind of insight out of the data.”

The new lab is located at the IBM Almaden Research Center in San Jose, California, and already one researcher has used big data to decipher someone’s personality just by studying 200 tweets.

Big data isn’t for every company, if only because small businesses may struggle to find the time and money necessary to utilise it. But there’s little doubt as to its value, in pretty much every industry across the board.
