Big Data, Big Change?
Unless you have been living in a cave for the past year you will be well aware of the hype around “big data”. As usual the hype is being driven by the software vendors, with grand promises of reaching the holy grail of innovation, that other overused buzzword and subject of a hundred articles in Harvard Business Review and the like.
But how do you actually use all this information and the available tools effectively? I just write a cheque for more analytics licences, right? Surely I can just buy it as a service from the cloud?
A recent article by Thomas Davenport in MIT Sloan Management Review highlights that organizations that capitalize on big data stand apart from traditional data analysis environments in three key ways:
- They pay attention to data flows as opposed to stocks.
- They rely on data scientists and product and process developers rather than data analysts.
- They are moving analytics away from the IT function and into core business, operational and production functions.
I couldn’t agree more, so here’s my two cents to extend this perspective.
I’ll explain it through the lens of applying risk management to a business, as this aligns with several of my current clients’ recent transformation experiences and is a model many people are familiar with. I’ll also draw on experience in the field of intelligence, and its relationship to risk and uncertainty, to explain how you really should expect your “business intelligence” to work better for you.
You see, I think we can learn from the world of spooks, who have made an art of managing risk and uncertainty through information-related processes, and expect more from the people who keep talking about dashboards and cubes, let alone the vendors desperate to sell you more licences for analytics packages. But it is quid pro quo: the rest of the business has to participate, and this in itself is probably the single biggest challenge in making real transformative use of big data in your organisation.
So let’s talk about the information story at the heart of the big data challenge: its dimensions, and the ways it should enable a transformation in your business.
Many of us in the IT profession are familiar with risk management as a concept, particularly in Australia and New Zealand, where the joint standard was particularly mature. A few years back the A/NZ standard (AS/NZS 4360:2004) was adopted by the International Organization for Standardization (ISO), and as it was so good to start with, little was changed along the way to it becoming ISO 31000:2009.
Two things of note did change, however. The first was a refined definition of risk:
“Chance of something happening that will have an impact on objectives” AS/NZS 4360:2004
“Effect of uncertainty on objectives” ISO 31000:2009
Second, they introduced eleven principles to guide the application of the standard:
- Risk management creates and protects value
- Risk management is an integral part of all organisational processes
- Risk management is part of decision making
- Risk management explicitly addresses uncertainty
- Risk management is systematic, structured and timely
- Risk management is based on the best available information
- Risk management is dynamic, iterative and responsive to change
- Risk management facilitates continual improvement of the organisation
- Risk management takes human and cultural factors into account
- Risk management is transparent and inclusive
- Risk management is tailored
So what, you say?
Turn these principles around and ask yourself how your organisation is making use of all the information it has, and is exposed to through its many customer and supplier interactions, to: continuously improve? Drive decision making? Create value?
How are you making the use of and adapting to information insights an integral part of your organisation?
What about the information you are using for decision making? Does it represent a complete picture, and is it the best available information? How are you managing the uncertainty of achieving your outcomes?
Aren’t all organisations in the business of trying to achieve their objectives? Wouldn’t a better understanding of the likelihood of meeting those outcomes help you run your business? I’d think so!
Next, let’s consider what uncertainty is and how you manage it. Again the risk standard provides guidance here, defining uncertainty as “the state, even partial, of deficiency of information related to, understanding or knowledge of, an event, its consequence, or likelihood”.
The events, of course, are wide-ranging: the sale of a product, the impact of a service in meeting a client’s needs, the behaviour of a client, or for that matter the activities of a member of staff, say a sales rep.
So how do we go about understanding these events? Where does the information come from? Where do you start?
In his article Davenport suggests moving analytics away from the IT function. This is certainly a good start. Next you need to think about how you will use the information that becomes available, the insights it needs to deliver, and the level of uncertainty in the decision making the information will support.
Typically the insights fall into three perspectives: tactical, operational and strategic. Each perspective requires a different process, reflecting the time frame of the decision being made and the relative uncertainty you’d expect. For example, long-range strategic decision making or planning needs a process like scenario planning to consider the likely potential outcomes, how you would respond, and how to weigh the potential impacts. But I digress!
For the purpose of this discussion let’s stay relatively short term focused for now and turn to the world of intelligence for a conceptual model to help frame our thinking.
A typical intelligence process is fairly linear (although iterative in practice) and looks like the one below. It mirrors many of the business intelligence processes I’ve seen in action at our clients:
You start by understanding the information customer’s needs, plan the collection of data or information, collect it, process it, and transform it into a usable form, where it emerges as intelligence.
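The linearity of that cycle can be made concrete with a minimal sketch. The stage names follow the cycle above, but the data sources and field contents are purely illustrative assumptions, not from any real system:

```python
# Each stage of the classic cycle is a function applied strictly in order:
# needs -> plan -> collect -> process -> analyse (then disseminate).
# The sources and record formats below are illustrative only.

def plan(needs: str) -> dict:
    # Decide what to collect and from where (hypothetical sources).
    return {"needs": needs, "sources": ["crm", "web_logs"]}

def collect(collection_plan: dict) -> list:
    # Gather raw data or information from each planned source.
    return [f"record from {source}" for source in collection_plan["sources"]]

def process(raw_records: list) -> list:
    # Transform the raw material into a usable, consistent form.
    return [record.strip().lower() for record in raw_records]

def analyse(clean_records: list) -> str:
    # Analysis is where the material emerges as intelligence.
    return f"intelligence product based on {len(clean_records)} records"

def cycle(needs: str) -> str:
    # The pipeline runs once, end to end, for a given set of needs.
    return analyse(process(collect(plan(needs))))
```

The point of the sketch is what it lacks: there is no feedback path, so nothing learned during analysis changes what is collected next.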
The problem, however, is that information is continuously hitting the organisation in the manner the article I referenced points out, requiring you to think about flows as well as stocks. The linear process also doesn’t recognise just how much things change, and can result in analysis, be that a product from an analyst or the content of a dashboard, being out of date or misleading once produced. That’s also assuming you can develop the BI/dashboard components in time for them to be relevant.
So how do you respond to the need to be dynamic, iterative and responsive to change? The trick is that it’s not the tools; it’s the way you structure your organisation around them.
Think of a customer engagement where the customer falls into a target segment for which you have a deficiency of information, limiting your ability to serve them accurately. How do you change the nature of the interaction with that customer to collect the information you need? How do you change the configuration of the product so it better serves their needs? How do you better involve the channels available to participate in the information collection, be that a salesperson or a website?
The solution may lie in an emerging intelligence practice called target-centric intelligence. It’s described very well by Robert M. Clark in his 2003 book Intelligence Analysis: A Target-Centric Approach (CQ Press, ISBN 1-56802-830-X). Visually it looks like this:
Taking this approach supports a more iterative style of gathering information and helps guide the collection activities, so that you adapt and continuously improve, particularly in response to feedback from those closest to the target, which may be a customer, a product or a service.
Applying this conceptual model, you use your analysis team both to assist with identifying the needs and to translate the returned information for the internal customer. At the same time you focus on the interactions with clients to continuously evolve your understanding of the information gaps and, more importantly, to identify new opportunities to collect information from the target. Understanding those gaps, however, requires a higher degree of interaction with both your customer-facing resources and your product and service specialists.
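The target-centric loop can also be sketched in a few lines: everyone works off one shared model of the target, and each pass identifies the remaining information gaps, which in turn drive the next round of collection. The required fields and the gap rule here are illustrative assumptions, not Clark’s formal model:

```python
# Target-centric sketch: a shared model of the target (a customer, product
# or service), where uncertainty is simply "required fields we don't yet know".
# Field names and the collection stand-in are illustrative assumptions.

REQUIRED_FIELDS = {"segment", "needs", "channel_preference"}

def information_gaps(target_model: dict) -> set:
    # The gaps are whatever required knowledge the shared model still lacks.
    return REQUIRED_FIELDS - set(target_model)

def collect_from_interaction(gap: str) -> tuple:
    # Stand-in for a real interaction (a sales call, a website prompt, etc.)
    # shaped specifically to fill the identified gap.
    return gap, f"observed {gap}"

def refine(target_model: dict, max_rounds: int = 5) -> dict:
    # Iterate: find gaps, collect against one, update the shared model,
    # and stop once the model is complete enough to act on.
    for _ in range(max_rounds):
        gaps = information_gaps(target_model)
        if not gaps:
            break
        field, value = collect_from_interaction(sorted(gaps)[0])
        target_model[field] = value
    return target_model
```

The contrast with the linear cycle is the feedback: what is collected next is decided by what the current model of the target still lacks, not by a plan fixed up front.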
Obviously this is pretty hard to implement with a typical in-house BI team focused on reporting and analytics. Furthermore it threatens the domain of the expert SAS user (we’ve all met them, right?) who likes to while away the hours digging for gold.
This is why, as Davenport points out, engaging data scientists and product and process developers, rather than data analysts, alongside the core business, operational and production functions is so important.
With the advent of big data the challenge is made easier, but don’t underestimate the level of change required to use the insights effectively in your organisation.
Ben Copsey is a Managing Consultant with Optimation. You can follow Ben on Twitter at @benjamincopsey.