Since business intelligence (BI) systems evolved around data warehouses, businesses have been able to analyse and act upon increasing amounts of the information held in their databases. However, today's trends towards Big Data and real-time analytics are making traditional BI systems look like dinosaurs: yesterday's data is too old to be relevant, causing businesses to miss out on opportunities when faced with increasingly agile competitors. In-memory computing is the technology making this new world a reality, able to handle both traditional and unstructured data. But what changes need to be made, and can legacy BI systems be integrated into a Big Data world?
How we got here
The story of BI is really one of the scale, accuracy (whether descriptive or predictive), ease and scope at which we try to analyse data. Before BI systems arrived, enterprise applications had their own databases which contained all the information they could access to perform their functions, and it was possible to derive useful insights about the business from them. For example, a pre-BI customer database would contain information that could be used to segment the customer base and drive marketing, sales and R&D investment.
However, these systems were siloed: they were not linked to other systems and could not share databases, so although each held a wealth of data, its value was limited. The problem worsened as enterprises acquired more IT systems tied to different departments, such as CRM, ERP, HR, finance and customer services. Mergers and acquisitions left them with multiple systems fulfilling the same functional requirements, each holding only a subset of the organisation's total data for that function. At this point the analytical potential actually decreased, because the complexity of getting a single, accurate view of the data grew with each new system.
It was into this world that the data warehouse concept of BI arrived: a single repository for all of the organisation's data, where that data could be organised, analysed and made useful to the organisation. The challenge facing organisations today stems from the way in which this vision was implemented and the tools used.
Extract, transform and load (ETL) tools were developed to move the data from the business systems to the data warehouse, including making the data readable and keeping it up to date; business process orchestration systems can also connect data to warehouses in this way. Data mining engines would then analyse the data in the warehouse, and reporting tools were linked to the data mining to provide easily understood output.
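To make the ETL pattern concrete, here is a minimal sketch in Python using SQLite. It is illustrative only: the table and column names (`customers`, `dim_customer`) and the clean-up rules are invented for this example, not taken from any particular product.

```python
import sqlite3

def run_etl(source_db: str, warehouse_db: str) -> int:
    """Extract customer rows from an operational database, transform
    them into a consistent shape, and load them into the warehouse.
    Returns the number of rows loaded."""
    src = sqlite3.connect(source_db)
    dst = sqlite3.connect(warehouse_db)
    dst.execute("CREATE TABLE IF NOT EXISTS dim_customer "
                "(id INTEGER PRIMARY KEY, name TEXT)")
    # Extract: pull the raw rows from the line-of-business system
    rows = src.execute("SELECT id, name FROM customers").fetchall()
    # Transform: trim whitespace and normalise casing so data from
    # different source systems arrives in one consistent format
    cleaned = [(cid, name.strip().title()) for cid, name in rows]
    # Load: idempotent insert, so the nightly batch can be re-run safely
    dst.executemany("INSERT OR REPLACE INTO dim_customer VALUES (?, ?)",
                    cleaned)
    dst.commit()
    src.close()
    dst.close()
    return len(cleaned)
```

The key point the article makes is visible even at this scale: the warehouse only sees the data after a scheduled batch run, which is exactly the latency that real-time analytics challenges.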
These tools have been able to provide businesses with both accurate and complete historical data and a degree of predictability through extrapolation from past trends. However, Big Data is gaining mainstream acceptance, and this completely changes the way we have to use BI.
Why is Big Data different?
Big Data is actually a misleading term: the name conjures images of databases that are simply larger and more complex, but in reality Big Data refers to a very different type of data, unstructured data, which cannot be mapped to a traditional relational database. Big Data is characterised by the "Four Vs": volume, velocity, variety and value.
Volume refers to the fact that we generate a huge amount of this data, and are generating ever-increasing amounts. For example, smartphones contain an array of sensors, such as GPS, which produce data that can be queried for use in analytics. As the number, complexity and exploitation of smartphones increase (more of them, producing more data, with users who know how to use them), the volume of data produced will increase as well.
Velocity means that the data changes rapidly: rather than traditional BI data about customer orders, which can be handled in batches, we are looking at smartphone location data that could be outdated within minutes or even seconds if, for example, the objective is to send a specific offer to a customer on the high street.
Variety refers to the many types of data and sources, from databases to audio and video objects (to which we can attach context and which become part of analytics) and increasing amounts of unstructured mobile and social data.
Value is exactly what it says: the better we get at analysing Big Data, the more value we can extract from it.
What this means is that we are shifting away from a model where the data warehouse is the “single source of truth” for an organisation towards a more decentralised view where databases are enriched by real-time and non-relational information. There’s just one problem: how can we make our existing BI tools work with this new world?
Existing data warehousing and analytics tools are designed to run preset or ad-hoc queries against vast databases, but predictive analytics and real-time data require different tools, and our existing tools would need to be re-programmed if they are to cope with this.
How can we accommodate Big Data?
All of this leads to the key question: do Big Data and its associated In-Memory technology mean that traditional BI tools are on the way out? Any attempt to answer this just leads to more questions: should we extend existing enterprise integration to the new tools? Will In-Memory computing replace ETL and batch processing? Will the improvements on ETL, such as process-based integration, continue to lead the way in this new world? Or do we need a smart platform that can take all these elements and fit them together?
I don’t want to try to answer these questions here, as that’s a topic for another day; instead, I’d like to show the difference the new world can bring and let you start thinking about what this could do in your business.
Imagine a high street retailer like House of Fraser: in the traditional BI model they would want to accumulate your transaction history, and to do so would offer you a loyalty card. Scanning this every time you make a purchase would let them track what you have bought, and put this data into a warehouse or cube where it could be carefully sliced to provide insight into which promotions you could be offered. The problem is that this approach is reactive, extrapolates from past activity and does not offer you a very personalised experience… and consumers are increasingly bored with loyalty cards.
In an In-Memory, Big Data world, the picture is very different. The retailer has no need to persuade you to take a loyalty card, because they can track your purchases by your credit card number. They know what that card is used to buy, and can track differences between shopping trips. For example, if you shop weekly and bought a belt on your last trip, the system could intelligently offer you a promotion you might be interested in: not another belt (you already have one), but perhaps an offer on the socks that people who also bought that belt chose. To create and personalise that offer, it could bring in data about people near you, similar demographics, and what people who bought the same item said about it on social media. Who has endorsed the brand? Has it been featured in any news or magazines? Not only is the amount of data that can be brought in growing, but it has to encompass both structured and unstructured data, and the kind of information that would previously have been too difficult to collect or analyse is now exactly what will persuade the customer to buy.
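The "people who also bought that belt chose these socks" logic can be sketched as a simple in-memory co-purchase count. This is a toy illustration with invented customers and items; a real system would run the same idea against millions of purchase records held in memory alongside social and demographic signals.

```python
from collections import Counter

# In-memory purchase history: customer -> set of items bought.
# All names and data below are invented for this illustration.
purchases = {
    "alice": {"belt", "socks", "shirt"},
    "bob":   {"belt", "socks"},
    "carol": {"belt", "hat"},
    "dave":  {"shirt", "jeans"},
}

def recommend(customer, history):
    """Suggest the item most often bought by customers who share at
    least one purchase with this one, excluding items they already own."""
    owned = history[customer]
    counts = Counter()
    for other, items in history.items():
        if other != customer and owned & items:
            counts.update(items - owned)  # candidate offers
    return counts.most_common(1)[0][0] if counts else None
```

Because the whole history sits in memory, the lookup is instant; there is no overnight batch between the purchase and the offer.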
This personalised offering is made possible in real time without any cubes of data or warehouses, without any traditional BI, just by holding your purchases, and every other customer's, in memory. Real-time also means that offers don't have to wait until you're at a checkout (by which point you have psychologically shifted from "browsing" mode to "complete the purchase, get home" mode, and are less receptive): by combining it with the hand-held scanners that are increasingly popular in supermarkets, you can be presented with a personalised offer as you pick up an item or as you walk past an aisle. If you've ever got your shopping home and then remembered something else you needed, you can imagine how useful a quick reminder about some of your frequently purchased items might be.
David Akka is the Managing Director of Magic Software (UK) Ltd and has worked with Magic Software for over 17 years.
Visit his blog at: http://www.davidakka.com/
This article was originally published on David Akka's blog.