Big data - big hype or the next big thing?

'Big data' is the latest buzzword in the industry but what does it mean, how can it offer a competitive advantage and what's happening locally?

The upsurge of networked devices and applications means that more data is being collected than ever before. This has led to an explosion of large-scale data sets that is changing business and science around the world.

Organisations may not know exactly what to use the information for - but they are keen to hold on to it, and this is creating a huge demand for storage.

Christchurch-based ARC Innovations provides metering and field services for electricity retailers and data services to electricity distribution companies. In addition, it is looking at providing services to end-consumers.

General manager of technology Michael Peterson says the company, which has been in business since 2005, is now experiencing “exploding volumes of data”.

The industry has changed radically since the days when someone would physically go out and read meters, perhaps once every two months, collect one data value and return that to the energy retailer.

“Today, our meters are recording a read every 30 minutes. And not just kilowatt-hours – they are import/export capable as well, meaning that they can read the electricity you are pumping back into the grid [as well as the electricity] you are using.”

The meters also return information about power quality, harmonics on the electricity lines, and voltage sags and swells, Peterson says. And they are logging event data, such as power outages, and sending notifications to the back-office system.

“So it’s gone from this one-value read every two months to dozens and dozens of different data elements and information that is collected during the day and returned to you.”
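To picture the shift Peterson describes, a single half-hourly read is no longer one number but a bundle of data elements. The sketch below is a hypothetical Java record, with field names that are assumptions for illustration rather than ARC's actual schema, showing the kind of values one smart-meter read might carry.

```java
import java.time.Instant;

// Hypothetical shape of one half-hourly interval read; the field names are
// illustrative assumptions, not ARC's actual data model.
public record IntervalRead(
        String meterId,
        Instant intervalEnd,            // end of the 30-minute interval
        double importKwh,               // energy drawn from the grid
        double exportKwh,               // energy pumped back into the grid
        double averageVoltage,          // power-quality measurement
        double totalHarmonicDistortion, // harmonics on the line
        boolean voltageSagDetected,
        boolean voltageSwellDetected,
        boolean outageEventLogged       // event data sent to the back office
) {}
```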

In addition, the smart meters have two-way communications, he says. ARC has deployed radio frequency mesh network infrastructure, running its own communications network. “We are effectively like a telco,” he explains.

However, the smart meters bring back so much more information than a mobile phone would to a telco, he says.

“The more meters there are out there and the longer they have been there, the more data you are collecting,” he says.

“Previously, we didn’t have the ability to collect this data. Smart metering and smart grid technologies are revolutionising the industry.”

“Even if you are doing just simple things, like queries on the data that we are getting back from the meters [every half an hour], just using our standard Oracle relational database won’t cut the mustard any more. Queries will just time out,” he says.
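Some rough arithmetic illustrates why a conventional relational setup struggles at this granularity. Assuming, purely for illustration, a fleet of 100,000 meters (the article gives no figure for ARC's deployment), half-hourly interval reads alone amount to roughly 1.75 billion rows a year, before power-quality and event data are counted:

```java
// Rough, illustrative arithmetic only; the meter count is an assumption,
// not a figure from ARC.
public class ReadVolumeEstimate {
    public static void main(String[] args) {
        long meters = 100_000L;   // assumed fleet size
        long readsPerDay = 48L;   // one read every 30 minutes
        long daysPerYear = 365L;

        long rowsPerYear = meters * readsPerDay * daysPerYear;
        System.out.printf("Interval reads per year: %,d%n", rowsPerYear);
        // ~1.75 billion rows per year from interval reads alone.
    }
}
```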

The company has started looking at other ways to deal with large amounts of data, and Peterson says ‘big data’ has the potential to “transform the way that the industry operates.”

To be able to manage this explosion of data, and, even more importantly, extract valuable insights from it, ARC is working with Oracle on improving its existing data warehousing technology. But the company is also looking at IBM and SAP technology, he says.

Peterson and his team have also investigated Hadoop. “But I don’t think we are at that stage yet. That sort of technology works where the data is too big for a database and we are not at that level yet. We may never get there as the relational database vendors are looking at ways to make their solutions able to cope with the data influx.”

About a year ago the company completed its first in-house proof-of-concept project.

“We hand-built everything based on the Oracle toolset we had,” says Peterson. “So a lot of the solution is built in Java. We actually developed all the data extraction and translation capabilities and created our own analytics engine.”

In hindsight, he says it would have been better to partner with a vendor and leverage the capabilities out of the box.

“Our solution proved to be quite heavy on maintenance because everything was hand-done. It was an interesting proof-of-concept but we’d rather initially align ourselves with a vendor in this space.”

“Companies like Oracle, SAP and IBM are spending billions on developing [big data] capabilities and it just makes more sense to me to align with their efforts and look at customising if required.”

So what benefits does Peterson hope the big data journey will deliver?

He says one of the company’s biggest challenges is being able to correlate information.

Another is geospatial analysis. A big data solution would make it easier to understand the complex data coming in, and the relationships within it. ARC is looking at developing new services that could be sold back to the industry.

“If we invest in data analytics capabilities, we could profile [an electricity] retailer’s consumer base; look at the actual load usage, and we would be able to potentially sell this information directly to consumers or via the retailer,” he says.

ARC would also be able to match consumption profiles to all the available tariffs in the market and inform consumers which retailer they should go to and what tariff they should be on, based on their actual consumption pattern, he says.

“That is easy enough to do when you are looking at one household at a time, but what we are talking about here is how do you do that over a period of time, taking in seasonal variations and looking at hundreds of thousands of households at once. How do you model that? How do you support those types of queries?”
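A minimal sketch of the per-household calculation Peterson describes might look like the following Java snippet: given a year of half-hourly reads, cost the household against each available tariff and recommend the cheapest. The tariff structure (a daily charge plus peak and off-peak rates) and all names here are illustrative assumptions; the challenge he raises is running this across hundreds of thousands of households and multiple seasons at once.

```java
import java.time.LocalDateTime;
import java.util.Comparator;
import java.util.List;

// Simplified tariff-matching sketch; the tariff model and names are
// illustrative assumptions, not an actual industry schema.
public class TariffMatcher {

    record Read(LocalDateTime intervalEnd, double kwh) {}

    record Tariff(String retailer, String name,
                  double dailyChargeDollars,
                  double peakRateDollarsPerKwh,     // assumed peak window 07:00-21:00
                  double offPeakRateDollarsPerKwh) {

        double annualCost(List<Read> reads, int days) {
            double energyCost = reads.stream()
                    .mapToDouble(r -> isPeak(r.intervalEnd())
                            ? r.kwh() * peakRateDollarsPerKwh
                            : r.kwh() * offPeakRateDollarsPerKwh)
                    .sum();
            return energyCost + dailyChargeDollars * days;
        }

        private static boolean isPeak(LocalDateTime t) {
            int h = t.getHour();
            return h >= 7 && h < 21;
        }
    }

    /** Returns the cheapest tariff for one household's year of half-hourly reads. */
    static Tariff cheapestTariff(List<Read> yearOfReads, List<Tariff> tariffs) {
        return tariffs.stream()
                .min(Comparator.comparingDouble((Tariff t) -> t.annualCost(yearOfReads, 365)))
                .orElseThrow();
    }
}
```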

He sums up the biggest challenge as: “Coming to grips with all the external pieces of data”.

In addition to ARC’s own data, energy distributors have data coming in from probes. There is also hourly weather and rainfall data. Understanding rainfall is very important, he explains, because irrigators in rural New Zealand consume huge amounts of power. And as ARC is based in Christchurch, it is affected by the earthquake recovery process: the company shares information with the Canterbury Earthquake Recovery Authority (CERA) about the electricity disconnections and redirections it is doing.
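As a hedged illustration of the kind of correlation Peterson points to, one could line up daily rainfall against daily consumption for an irrigation-heavy area and compute a simple Pearson correlation. The data shapes and the choice of method below are assumptions for the sketch, not ARC's actual analytics.

```java
import java.time.LocalDate;
import java.util.Map;

// Illustrative only: correlating daily rainfall with daily consumption for
// one network area. Data shapes and the use of Pearson correlation are
// assumptions, not ARC's actual approach.
public class RainfallCorrelation {

    /** Pearson correlation between rainfall (mm) and consumption (kWh) on matching days. */
    static double correlate(Map<LocalDate, Double> rainfallMm,
                            Map<LocalDate, Double> consumptionKwh) {
        var days = rainfallMm.keySet().stream()
                .filter(consumptionKwh::containsKey)
                .toList();
        int n = days.size();
        if (n < 2) throw new IllegalArgumentException("Not enough overlapping days");

        double meanRain = days.stream().mapToDouble(rainfallMm::get).average().orElse(0);
        double meanLoad = days.stream().mapToDouble(consumptionKwh::get).average().orElse(0);

        double cov = 0, varRain = 0, varLoad = 0;
        for (LocalDate d : days) {
            double dr = rainfallMm.get(d) - meanRain;
            double dl = consumptionKwh.get(d) - meanLoad;
            cov += dr * dl;
            varRain += dr * dr;
            varLoad += dl * dl;
        }
        return cov / Math.sqrt(varRain * varLoad);
    }
}
```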

Collecting, summarising and analysing all that information efficiently requires a good underlying toolset, but it also requires new skills within the company, Peterson says.

ARC is looking at creating completely new roles, such as a data architect, to be able to extract value from all the unstructured data coming in. Its database expertise also has to improve.

Storing all this data is also a challenge. “We need to annually allocate budget for our storage solutions. We’ve also gone into virtualisation,” he says.

Extreme volume, velocity and variety

While ‘big data’ is quite a phenomenon at the moment, it has been building up over time – it hasn’t just suddenly dawned on us, says Eric Thoo, principal research analyst at Gartner.

Information is reaching new levels of extreme volume, velocity and variety and that is driving new challenges, he says.

“Generally speaking, big data is just the beginning of ‘extreme level of information’ challenges.”

Traditionally, data arrived at one place and became information – now, data is on the move, he says. Data is needed at the point it is created or when a process has just occurred – someone just keyed something into a handheld device, clicked a website, sent an SMS, used a kiosk or made a purchase in a store.

In the Asia-Pacific region, companies are still in aggressive growth mode and want to transform their businesses, says Thoo.

They are recognising there are more opportunities out there if they have insight into their customers’ behaviour and the products they are using. Many of the organisations Gartner is talking to don’t yet know how they want to approach that, but there is interest in using big data.

In Australia and New Zealand, most companies that are interested in big data applications are just on the starting blocks, he says, assessing the competencies and tools needed.

One challenge is knowing when the information is available, says Thoo.

“Organisations want to get closer to the time when information becomes available,” he says.

Keeping and delivering the information to the relevant receiver are other challenges.

“Traditionally, we have technologies and tools that address the aspects of getting, keeping and delivering information but when it’s happening at extreme levels of volume, velocity and variety, some of these tools may not be sufficient,” Thoo says.

New tools and technologies will be needed, in particular technologies that can connect to unstructured data sources or to metering tools that generate continuous streams of data. Such sources are common in industries such as utilities, astronomy and scientific research.

“Providers of data integration tools are beginning to focus on how their tools can be applied to different sources,” says Thoo.

There will also be demand within organisations for new analytic expertise, in order to interpret context, he says.

“It’s not just reporting any more. It’s accessing and using information when it occurs in a very dynamic way and embedding that into the business processes as well.”

What types of companies would benefit from investing in big data? Thoo says it’s not just banks and utility companies – it could be marketing companies, telecommunications companies and retail companies that have online interaction with consumers.

The driver to embark on big data initiatives is stronger for organisations that have moved to empowering their customers and whose consumers use services through multiple channels, he says. Any company that deals with customers via multiple channels, or that needs faster ways of collecting data, processing orders or servicing customers, is likely to benefit from a big data approach.

Gartner has predicted that through to 2015, more than 85 percent of Fortune 500 organisations will fail to effectively exploit big data for competitive advantage. This is largely because the leaders of enterprises are not fully aware of the opportunities, says Thoo.

“There is a level of education needed to get the concept right. A lot of leaders may understand their business, but they need to understand that business is going to evolve around the economics of data, not just the business process, products and services.”

Organisations need to look at data in a whole new way, he says. They have to come together in a more holistic way, making sure that their variety of data can be used more synergistically, and that mindset will take time to develop.

“But that is necessary,” he says. “I think it’s good in a way that organisations are awakened to focus on the right competencies and the tools to use information synergistically, rather than leaping on to big data as a hype. [Because] that would mean going out and buying new technology and that would just aggravate the problem more.”

Technology is not the ultimate solution – it’s part of a larger framework to enable better information management, he adds.

Looking for competitive edge

Both IBM and Oracle in Australia report that companies across the Asia-Pacific region are interested in how they can use big data to change the way they are making decisions in their business.

“There is a particular focus on understanding the content and knowledge within social networks, such as Facebook, Twitter and domain-specific forums, and how this information can be used to understand customer sentiment and product acceptance,” says Christopher Bunn, AP information integration sales manager at IBM Software Group.

Many businesses today have typical structured data warehouse environments, and a lot of them are looking for that next competitive edge, says Stuart Long, CTO of systems sales consulting, Oracle Asia-Pacific.

However, the sheer volume and velocity of information bring traditional approaches to processing data to their knees, says IBM’s Bunn.

“Variety of the data is also a major challenge of big data and not one easily solved without the right technology developed for the task,” he says. “Information needs to be stored and analysed in its native format, be that audio, video, text, call data records, smart meter data, posts to Twitter and social networks, for instance.”

Some of the companies IBM is working with in Australia are in the telecommunications and utilities verticals, he says.

“They are analysing network data in real-time to provide better customer services, improved customer retention and experience. A number are also using big data and smartphones to create new revenue streams.”

Among the systems already deployed are those belonging to government and security agencies that analyse vast amounts of information for homeland security, he adds.

Oracle’s Long is currently dealing with power companies, retail distribution companies, telcos, and education and security companies, some with a presence in New Zealand, that are looking to apply the same processes on both sides of the Tasman.

According to Bunn, a number of Australian companies are at the start of their big data journey.

“There are a number that have established social media departments to monitor and understand what is being said – this for many is the start of the journey. As the volume, velocity and variety of the information increases, smart technologies are required to automate and process this knowledge and will drive significant investment in technology to capitalise on the new knowledge.”

He says IBM’s big data solutions, including InfoSphere BigInsights and InfoSphere Streams, are available in New Zealand as well.

The real-time nature of big data is the interesting part, says Long. Working out what to store and how to store it in real-time environments is difficult.

“From an Oracle perspective we approach it around a data model – what is the data you want to capture and what is the most efficient way of capturing that data?”

You don’t want to capture data that has no economic value, he says.

“The amounts of data can start to grow rapidly, bringing cost with it.”

Today, companies have the opportunity to set up their own big data environment at a relatively low cost, using the likes of Amazon, he says.

Oracle is bringing to market its big data appliance, built on open-source technologies, which helps capture, store and analyse information, he says. The appliance uses Oracle technology, including its NoSQL database, and has direct integration with Hadoop, he says.

Many businesses are focusing their next wave of investment on big data, says Long. It’s often about keeping existing customers – analysing what your customers are using and when they are likely to make a decision to either stay or change provider.

There will “definitely” be a need for new skill sets within companies around how to use this data, he says. Many companies are used to making decisions after post-process analysis. How do you turn them into an organisation that is comfortable making real-time decisions, he asks.

“That often goes against some of the business models that traditional customers have built up.”
