Organisations turn to Big Data technology for two reasons – hype and necessity, according to Gartner analyst Svetlana Sicular. In the latter case, organisations have identified business ideas where traditional technology just doesn’t cut it anymore. The hype comes from the media, she says, and this has created worry among business leaders that they are lagging behind in getting their Big Data initiatives off the ground.
“In fact, the majority of enterprises, including those touted by the media as successful adopters of big data technologies, are only exploring a few tactical use cases,” Sicular says. “Enterprises that are currently evaluating the big data space are not late to the game, but rather, are still ahead of the curve.”
On the other hand, the “paradigm shift” to Big Data is expected to happen fast – Gartner predicts that the term ‘Big Data’ will only last another couple of years.
“Then it will just be ‘data’,” says Ian Bertram, managing vice president, analytics, BI and performance management research, Gartner Asia-Pacific. “It will be the norm.”
There is still confusion around what Big Data solutions actually are. Where do you draw the line between business intelligence and Big Data? Although Big Data is characterised by the volume, velocity and variety of information assets, it’s really the new processing styles that are important, and the enhanced decision-making it will hopefully bring, says Bertram. For many, Big Data is “continuing to build on your current BI and analytics investments”.
“So for many organisations, Big Data isn’t necessarily anything new,” he says.
Rick Dry, information management brand lead at IBM NZ, asks: “Is [Big Data] a traditional data warehouse where you extract data from the production systems and build a view of your customers or products? A lot of our customers have data warehouses and the question in my mind is, at what point does it become Big Data?”
This lack of clarity is fuelling the question of whether Big Data is hype, says Craig Stires, IDC Asia-Pacific director for Big Data and analytics research.
“In the Asia-Pacific region, the maturity of organisations around data management and analytics is highest in Australia and New Zealand,” he says. “This is also being reflected in Big Data technology adoption rates. Although the largest spend on these technologies is happening in China, the sophistication of application is highest in Australia and New Zealand.”
Financial services is the biggest user of Big Data, Stires says, with telecommunications, insurance, energy and utilities companies also leading the way.
He says this is mainly because of the intense competition among banks to acquire and retain customers. This “competitive need” will continue to drive banks to invest in technologies that help reduce churn. Historically, banks have also been highly mature in analysing business data, adds Stires. Closely following the financial services sector is the telco industry, which is driven by the same ongoing need to reduce customer churn, he says.
“Telco is interesting, because they also hold the most significant amount of information about their customers – who they call, for how long, when, who calls them, where they move through the day. This is extremely high-value information, and telcos are seriously looking into investments on how to capitalise on this,” Stires says.
Big Data challenges
There is a fine line between taking advantage of detailed customer information and invading customers’ privacy, and this is something you need to take into consideration if you want to leverage Big Data technology, says Nick Cater, business analytics and performance management territory manager at IBM NZ.
“Certainly, the technology exists to be able to link customer records with call records, with Twitter and Facebook accounts and start creating potentially a very comprehensive view of each customer,” he says. “But there are privacy aspects to it.”
Many organisations use data that they already have for customer analytics, and then use “potentially more controversial sources”, like Twitter and Facebook, for trend analysis and information on general uptake of messaging and products, he says. However, organisations are increasingly looking at how they can target individuals and specific demographic groups.
While the technology is capable of it, most organisations avoid drilling down to the individual level in order to steer clear of privacy concerns, he says. If you are looking at diving deeper into your customer information for analysis, he recommends having policies in place to make sure you don’t cross the boundaries where most people would start to have privacy concerns.
According to both IDC’s Stires and Gartner’s Bertram, finding skilled staff is one of the biggest hurdles to uptake of Big Data solutions in New Zealand, where the majority of businesses fall into the small-to-medium sector.
“The biggest barrier for SMBs is going to be skills,” Stires says. “It’s going to be this way, until services companies start offering the market outsourced domain knowledge. There are a lot of barriers to services companies getting this right, and doing it in a way that integrates the business data from the SMBs.”
The more specialised the business is, the more difficult it will be for an external party to interpret the data and put it into a context that can be tied back to business activities, he says.
“There is already a significant shortage of skills for traditional data analytics, [and] the large enterprises will always absorb the vast majority of that talent.”
Other hurdles, such as cost and tools, will become less relevant in the near future, he says.
“Costs of hardware will continue to drop, and the ability to tap into cloud services for data storage and compute will further drop the barriers to adoption for SMBs. It’s all about the skills gap.”
Bertram says new skills and capabilities are required to look at data in new ways – and these skills are often lacking in internal business analysts and financial analysts.
New Zealand a hotbed?
However, Rob Wickham, head of Exadata and strategic solutions at Oracle Australia and New Zealand, has a different view, describing New Zealand as a “hotbed for data science”.
One of the fundamental technologies behind Big Data is the R programming language, which was created by Ross Ihaka and Robert Gentleman at the University of Auckland. Today, the open source language is maintained by the R Development Core Team.
“As a result, there are a lot of high-calibre data science, maths and statistics students coming out of the University of Auckland, and other universities in New Zealand as well,” says Wickham.
In New Zealand, the agricultural sector is among those leading the way for Big Data systems, according to Wickham.
“Farming and agriculture are beginning to use advanced techniques, where they are embedding sensors and RFID tags to monitor, for example, water levels, fertilisation distribution, livestock movement and crop rotation. There are some really interesting advanced farming techniques [being deployed].”
New government services
A common misconception is that Big Data involves enormous volumes of data, says Oracle’s Wickham.
“Whilst the name is ‘Big Data’, I think a lot of the benefits that come from it don’t require massive volumes of data,” he says. “In the end, from our standpoint, Big Data really means taking a data-driven view to your decision-making processes.”
Somewhat misleadingly, a lot of the focus has been on the volume aspect, when there are so many other opportunities for organisations to experiment with, says Gartner’s Bertram.
Bertram gives the example of a new local government service that came to life during the recent snowstorms in New York. The 1400 snowploughs keeping the roads clear were each fitted with GPS systems. Information from these was mashed with information from other sensors, for example measuring how much salt was being distributed on the roads. People were able to access an app that showed them when their street had been ploughed and where the ploughs were currently.
“This was a new service to the constituency in New York. It gave people a rough idea of when they could get out of their house or get to their house,” he says. “It’s not massive amounts of data. But it’s new data, variety and complexity.”
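The core of a service like this is a simple reduction over the plough GPS feed: for each street, keep the most recent plough pass. A minimal sketch of that step in Python – the plough IDs, street names and timestamps are invented for illustration, not taken from the New York system:

```python
from datetime import datetime

# Hypothetical GPS pings from snowploughs: (plough_id, street, timestamp).
# The real New York service fed pings like these from GPS units on its
# 1400 ploughs into an app showing when each street was last cleared.
pings = [
    ("plough-7", "Elm St", datetime(2013, 2, 9, 6, 15)),
    ("plough-7", "Oak Ave", datetime(2013, 2, 9, 6, 40)),
    ("plough-3", "Elm St", datetime(2013, 2, 9, 8, 5)),
]

def last_ploughed(pings):
    """Reduce raw pings to the most recent plough pass per street."""
    latest = {}
    for _, street, ts in pings:
        if street not in latest or ts > latest[street]:
            latest[street] = ts
    return latest

status = last_ploughed(pings)
print(status["Elm St"])  # Elm St keeps the later of its two passes
```

The same shape of computation applies to the salt-distribution sensors Bertram mentions: small, fresh, varied data rather than huge volumes.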
Bertram adds that with the open government phenomenon happening across the world, we are likely to see more services like this one.
“Government is one of the big spenders on Big Data technology right now,” says IDC’s Stires. “Defence departments have been investing in complex analytics for many years. We see projects in homeland security, immigration and checkpoints, and other surveillance initiatives.”
Video analytics is another component of Big Data solutions. One of IBM’s customers, a large customs agency, deals with thousands of ships coming into port every day, says IBM’s Rick Dry. To decide which ships to investigate, the agency uses technology that analyses video of each vessel.
“They can basically see how low a boat is sitting in the water, match that to the ship’s manifest and see where it should be sitting in water, and then determine whether they should be looking at that boat,” Dry says. IBM is seeing uptake of tools for analysing data “in flight”, like the example above, he says – not so much in New Zealand yet, but globally. The approach is often used to analyse what is happening on the internet in near real-time.
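Once the video system has estimated how low a hull sits, the manifest check Dry describes reduces to comparing the observed draft with the draft the declared cargo implies. A minimal sketch, assuming a simple linear draft model – all figures, names and the tolerance are invented for illustration, not IBM’s implementation:

```python
def flag_for_inspection(observed_draft_m, manifest_cargo_t,
                        baseline_draft_m, tonnes_per_metre,
                        tolerance_m=0.5):
    """Compare the draft estimated from video with the draft the
    manifest implies; flag the ship if they disagree too much.

    The linear model (baseline draft plus cargo tonnage divided by
    tonnes-per-metre of immersion) and all parameters are hypothetical.
    """
    expected_draft = baseline_draft_m + manifest_cargo_t / tonnes_per_metre
    return abs(observed_draft_m - expected_draft) > tolerance_m

# A ship declaring 12,000 t of cargo but riding over a metre lower
# than that declaration implies would be flagged for a closer look.
print(flag_for_inspection(9.2, 12_000, 5.0, 4_000))
```

The point of doing this “in flight” is that the flag is raised while the ship is still approaching port, rather than after the data has been landed in a warehouse.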