The Business Intelligence (BI) world is moving into the second era of data warehousing in order to meet new requirements from business users. By 2012, business units will control at least 40% of the total budget for BI.

This is according to Rick van der Lans, international BI expert from R20/Consultancy, who gave an overview of global trends in the BI arena during the keynote address at the ITWeb BI Summit, at The Forum, in Bryanston.

“New technologies create new opportunities. With BI, the technology and the requirements are there. We need to change our architectures. The whole concept of a chain of databases is at the end of its era. We are entering a new era of data virtualisation,” said Van der Lans.

Data virtualisation is a software layer between reports and data stores. According to Van der Lans, data virtualisation enables IT to decouple applications from storage structures. He said IT can redirect data marts and build a much simpler architecture.
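The decoupling Van der Lans describes can be sketched in a few lines. This is a hypothetical illustration, not any vendor's product; all class, store and table names are invented. Reports query a logical table name, and the virtualisation layer maps it to whichever physical store currently holds the data, so IT can redirect a data mart without touching the reports:

```python
# Illustrative sketch only: a virtualisation layer that maps logical
# table names to physical stores, so reports never see the storage layout.

class VirtualLayer:
    """Maps logical table names to (store, physical table) pairs."""

    def __init__(self):
        self._mapping = {}

    def register(self, logical, store, physical):
        self._mapping[logical] = (store, physical)

    def redirect(self, logical, store, physical):
        # IT repoints the logical table to a new store;
        # reports keep using the same logical name.
        self._mapping[logical] = (store, physical)

    def query(self, logical):
        store, physical = self._mapping[logical]
        return store[physical]

# Two physical stores, standing in for a data mart and a warehouse
mart = {"sales_2011": [("widgets", 120), ("gadgets", 85)]}
warehouse = {"fact_sales": [("widgets", 120), ("gadgets", 85)]}

layer = VirtualLayer()
layer.register("sales", mart, "sales_2011")
report_before = layer.query("sales")

# Retire the mart: redirect the logical table to the warehouse
layer.redirect("sales", warehouse, "fact_sales")
report_after = layer.query("sales")

print(report_before == report_after)  # the report is unaffected
```

The point of the sketch is that the report's query (`layer.query("sales")`) is identical before and after the redirect, which is what lets IT simplify the architecture behind the scenes.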

“All organisations use various tools from various vendors and have metadata specifications that are replicated all over. As soon as an organisation uses data virtualisation servers, all the tools use one specification. This means that data virtualisation enables a BI architecture to manage metadata centrally.

“Data virtualisation can bridge the gap between replicated data stores and what business users need in their BI environments.”

He added: “The data warehousing system that IT has been building for the last 15 years does not have the architecture that can cope with new requirements of managing huge volumes of data in order to make fast business decisions.”

According to Aberdeen Research, there are problems with current data warehouse platforms: 45% of respondents were unhappy with the query performance of their data servers. Other challenges include inadequate data load speeds and the high cost of scaling. Said Van der Lans: “Business pressures demand a more flexible approach to BI delivery, yet 42% of enterprises report that making timely decisions is becoming more difficult.

“The average time needed to add a new data source was 8.4 weeks in 2009, 7.4 weeks in 2010, and 7.8 weeks in 2011. Around 33% of respondents say they need more than three months to deploy a new data source. Developing a complex report or dashboard took an average of seven weeks in 2011.”

According to Van der Lans, business users demand more information and they need it faster than ever before. Added to this challenge is the growing volume of data, which is creating more complex data warehouses, and IT staff face growing backlogs of information requests. “We don’t have the flexibility anymore in current architectures to give users what they need.”

FASTER, BETTER

Barry Devlin, founder and principal of 9sight Consulting, concurred. He said the biggest dilemma facing organisations today is how to deliver BI quicker and better.

“Data consistency and quality was the original business driver for data warehousing. In the past, there were multiple disparate and inconsistent data sources that were never designed to work together, and had incomplete and low-quality data. The goal was to achieve a single version of the truth.”

However, Devlin said there isn’t a single version of the truth where functional business focus differs. “The changing world drives evolving measures; new ways of doing business drive changes in data needs.

“Business wants a sufficiently correct answer soon enough to affect the outcome of decision-making. There needs to be a balance between quicker and better BI.”

Devlin pointed out that advanced BI environments demand simpler, clear operational sources and an integrated design environment.

“If I can bring in the business people who understand the data, then I get something that’s driven by their knowledge of the data sources. Shared metadata becomes easier to manage.”

He explained that organisations must start reorganising their data warehouse design and build teams to benefit from the business’s knowledge of the data it has acquired.

“Increasingly, data is being distributed. We are going to have multiple databases and have to have the ability to virtualise access to those multiple stores. BI tools offer virtualised access to local data, data in marts, data warehouse data, and to non-relational data.”

Devlin added: “What I see coming next is that we will still have an enterprise data warehouse, but it will be smaller than today’s; analytic stores, the enterprise data warehouse and other stores will exist in parallel, with data virtualisation being the means to access data across them.”
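Devlin’s picture of parallel stores unified by virtualisation can be sketched as a tiny federation layer. This is illustrative only; the table and store names are invented, and SQLite in-memory databases stand in for the enterprise warehouse and an analytic store:

```python
# Illustrative federation sketch: one query fans out across parallel
# stores and merges the results, hiding which store holds which data.
import sqlite3

def make_store(rows):
    """Create an in-memory store with a simple sales table."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
    db.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    return db

# Parallel stores: an enterprise warehouse and an analytic store
edw = make_store([("north", 100), ("south", 50)])
analytic = make_store([("east", 75)])

def federated_total():
    # The caller sees one answer; the layer fans out over both stores.
    return sum(
        db.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
        for db in (edw, analytic)
    )

print(federated_total())  # 225
```

The caller never names a store; adding a third store would only change the tuple the layer iterates over, which is the flexibility Devlin attributes to virtualised access.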

According to Devlin, collaborative working and peer-based evaluation is critical to making BI more agile. He noted that self-service BI has a large element of hype in it because business users don’t know enough about self-service BI, and haven’t used the tools to their full advantage. He said self-service BI usage must be built on top of IT-provisioned quality.

“Three tips for more effective BI delivery are to empower users to innovate; give them the power to play with data fast; and have agile access and collaboration among peers.

“Focus IT on core data quality and performance, information and model re-use, and agile development. Bridge the business and IT gap using social networking tools, scoring, ranking and peer evaluation.”


SPEEDY GROWTH

John Callan, senior director for global product marketing at QlikTech, says BI is growing rapidly at a 7% compound annual rate, and is increasingly moving towards a self-service approach.

“BI is typically in the top five priority list, if not the top priority for CIOs worldwide. Organisations are using data as the raw material to assist them in the decision-making process that requires them to be agile.”

Callan indicated there is a trend towards BI acting as a support platform. “It’s about trying to make sense of all this data, where people are more empowered, mobile and have different expectations on the software they’re using.

“The consumerisation of BI doesn’t reduce the importance of IT in an organisation, but rather empowers IT to focus on a business’ core competences.

“Business users are bringing their iPhones and Android devices into the workplace and expect to be as productive in the consumer world as they would be in the business world. They want their analytics and reports regardless of where they are,” explained Callan. He pointed out that self-service BI is not new in the BI world; however, it has fallen short of expectations because it is difficult to manage. He said the need for self-service BI has not gone away; it is actually growing.

“Business discovery is inherently self-service BI. Self-service is worrying for IT, because governance and data access are put at risk. These are real concerns for IT.

“I think the role of IT is changing in BI in terms of business discovery. There’s a shift away from going with a big bang purchase approach of BI, where, for a variety of reasons, departments are looking to be more agile, and the purchasing cycle is more focused on a departmental level.”

While data is a key component of the decision-making process, it’s not the only component, cautioned Callan. He said context and people are just as important in a BI process. “Social business discovery brings people into the application itself; it’s about having adaptations within the application that are asynchronous, where people can collaborate within the application itself.”

According to Callan, business users are increasingly using unstructured data that resides outside their firewalls, such as data from Facebook, Twitter and Google. He pointed out that it’s important for organisations to be able to merge this data with multiple sources.


BIG DATA

This large store of unstructured data brings value to an enterprise. Bill Hoggarth, MD of Dataways, said big data is the new source of productivity, growth and competitive advantage.

Hoggarth said that while Forrester and other analysts tout big data as one of the main BI challenges in terms of complexity, volume and variation, businesses must have a need to do something valuable with that data.

“It doesn’t matter how quickly big data is changing or how many formats it comes in; if business cannot derive value from that data, then there is no point in capturing and analysing this information.”

He explained that big data is unpredictable and too complex for traditional database systems to process and manage: “The question is not how much big data is out there, but how much of it is useful.”

According to Hoggarth, the BI vendor market is seeing increasing financial value, with the market for big data processing rising to $46 billion. Hoggarth outlined two competing principles: “The first fundamental principle is that better information leads to better business decisions, but the competing business principle is that you can’t manage what you can’t measure. It’s better to be vaguely right than precisely wrong when it comes to information.”

He added: “Until big data came along, we weren’t able to answer the question as to why a company’s most profitable customers were leaving. A company needs to know who the most profitable customers are and what its competitors are doing. Then it has to take a query and search across different data sources.

“By focusing on the value of the big problem of big data, and on new interfaces, big data will be the new source of productivity, growth and competitive advantage.”

Hoggarth noted that the mobile device has become a catalyst for focusing on the problem of big data. He said there are more such devices in SA than there are BI user licences. In addition, he noted that user expectations for BI are being set by Google, in terms of speed of information access, and Apple, in terms of usability and design.

Hoggarth gave an example of how state police in the US are using hi-tech mobile forensic devices to draw in cellphone data from users in under two minutes during roadblocks. The technology is used to conduct warrantless searches without phone owners knowing that their data is being extracted by authorities.


BEWARE THE BAD DATA

While data can be an organisation’s greatest asset, it can also go awry, explained executive director Jane Thomson. Up to 80% of large BI projects that fail do so because of bad data.

“Business is not listening to their IT departments when IT asks for help. In addition, most enterprise organisations have complex systems and don’t have a single unified data structure design or integrated architecture.”

Thomson stated that more than 75% of enterprises do not have effective master data management strategies.

“People need to make hard decisions about data, and now need to call the shots in terms of how data is managed. New technologies make it easier to deal with large volumes of data. It’s imperative that we have linkages that allow us to find the relationships of data coming from different sources,” added Thomson.

One of the biggest BI challenges is around the issue of data ownership. Thomson indicated that organisations with complex data systems must first define a data and BI strategy, have a recognised portfolio, and secure management’s support.

Thomson said it makes sense for BI and information management to work together to deliver results and to fix the data one logical chunk at a time.

“Expensive applications alone do not deliver value. Today we have tools to sort out data issues. Messed up old data plus new technology equals expensive old messed up data,” Thomson noted.

She added that BI challenges are both a data and a people problem. This is often evident in businesses that grow too rapidly without putting the data management foundations in place. “It’s all about building a proof case. Find a business champion rather than a technologist; someone who understands systems at a business level.”