INSIGHT: In-memory processing - the true driver for advanced analytics applications

Geoff Beynon, General Manager, SAS Institute New Zealand, believes businesses everywhere are embracing in-memory analytics.

Businesses everywhere are embracing in-memory analytics for the competitive, productivity and bottom line benefits the approach can deliver.

And just as technology advances in the 1960s ‘space race’ made it possible to put a man on the moon, today’s technology innovations put advanced in-memory analytics within reach of even quite modest-sized organisations.

In addition to miniaturisation and decreasing cost, says Gartner, the growing use of in-memory analytics is driven by “digital businesses’ avidity for scale and performance” and by “users’ desire for deeper and timelier analysis.”

Enthusiastic adoption should be no surprise. With in-memory analytics, all the data to be used is suspended in memory instead of needing to be repeatedly drawn from and restored to disk.

This not only speeds access to the data needed for applications, it also means data can be shared by multiple users across different applications, concurrently and securely.
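The benefit of holding data in memory can be sketched in a few lines of code. The example below is a hypothetical illustration, not a real in-memory analytics product: a small dataset is parsed from its raw form once, after which any number of queries run against the in-memory structure without touching the source again.

```python
import csv
import io

# Hypothetical illustration: a tiny "sales" dataset held entirely in memory.
# Once loaded, multiple analyses reuse the same in-memory structure instead
# of re-reading and re-parsing the source for every query.
RAW = "region,amount\nnorth,120\nsouth,80\nnorth,60\n"

def load_into_memory(raw_csv):
    """Parse the dataset once; subsequent queries touch RAM only."""
    return [{"region": row["region"], "amount": int(row["amount"])}
            for row in csv.DictReader(io.StringIO(raw_csv))]

def total_by_region(rows, region):
    """One of many queries that can run concurrently over the same rows."""
    return sum(r["amount"] for r in rows if r["region"] == region)

rows = load_into_memory(RAW)             # one load...
north = total_by_region(rows, "north")   # ...then many fast queries
south = total_by_region(rows, "south")
print(north, south)  # 180 80
```

The same `rows` list can serve many callers at once, which is the essence of the sharing benefit described above, albeit without the security and scale features of a real platform.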

All of which adds up to better enabled decision making across the enterprise.

Speed is not the only advantage of in-memory analytics over conventional disk-based processing.

Greater scale is also a factor, enabling deeper and more granular insights. In addition, continuing hardware advances make the running of in-memory analytics perfectly feasible on today’s low-cost commodity platforms.

But these opportunities can only be exploited to full advantage if decision makers have the right business processes and people to support them.

This is a challenge in every sector but especially so for asset-intensive industries such as utilities, transport, telecoms and manufacturing where close collaboration between data analysts, IT and operations managers is particularly important.

Getting started with in-memory analytics

The first consideration is the skill set required to get the best out of an in-memory analytics investment.

Yes, the much-talked-about ‘data scientist’ role is a strong plus, but the experience and talent to configure and manage a big data infrastructure are also very important.

Competence with the latest machine learning techniques, along with experience of a Hadoop-based data infrastructure, is another ‘must’.

A major part of the effort in developing any analytics project goes into data integration, including preparing the data before creating models and building scoring codes into operational systems.
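The pipeline described above can be sketched in miniature. All names and the trivial “model” below are hypothetical stand-ins chosen for illustration, not any vendor’s actual method: raw records are integrated and cleaned, a model is fitted, and the scoring step is packaged as a small function of the kind that gets built into an operational system.

```python
# Hypothetical sketch of the effort described above: data preparation,
# then model creation, then scoring code for an operational system.

def prepare(records):
    """Data integration/preparation: drop incomplete rows, coerce types."""
    return [float(r["value"]) for r in records if r.get("value") is not None]

def fit_threshold(values):
    """Stand-in for model training: flag anything above the mean."""
    return sum(values) / len(values)

def make_scorer(threshold):
    """The 'scoring code' an operational system would embed and call."""
    return lambda value: 1 if value > threshold else 0

raw = [{"value": "10"}, {"value": None}, {"value": "30"}]
clean = prepare(raw)                        # incomplete row dropped
score = make_scorer(fit_threshold(clean))   # threshold is the mean, 20.0
print(score(25), score(5))  # 1 0
```

In a real project each of these three steps is far larger than the modelling itself, which is the point the paragraph above makes.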

The IT department should be involved at the outset when deciding to adopt in-memory analytics. This is because the evaluation and planning process must precisely determine how the endeavour will meet the need for a flexible and scalable analytics platform.

In-memory analytics can give users valuable self-service capabilities, thereby lessening dependence on IT to create, maintain and manage aggregates, indexes and reports.

But in designing for self-service, IT must guard against creating a dedicated silo. In-memory analytics should be part of the overall information architecture rather than a stand-alone strategy.

Later, as more diverse data types and greater volumes are included, such as free-form text, event streams, and sensor, log and social media data, data integration and discovery capabilities will assume even greater importance.

Adopting in-memory analytics to deliver deeper and faster insights to aid decision making is not a simple exercise.

But with cross enterprise agreement on detailed requirements; close cooperation between IT, analysts and decision makers; and the right skills and strong leadership, the effort will be well rewarded.

By Geoff Beynon - General Manager, SAS Institute New Zealand
