The term Web 2.0 means lots of things to lots of people. According to Wikipedia, "Web 2.0, a phrase coined by O'Reilly Media in 2004, refers to a supposed second generation of Internet-based services, such as social networking sites, wikis, communication tools, and folksonomies, that emphasize online collaboration and sharing among users."
Interestingly, BI 2.0 builds on both SOA and Web 2.0. So what might BI 2.0 look like?
There is plenty of good content in the various BI 2.0 posts, white papers and blogs, along with criticism of the term itself…
Some people say BI 2.0 is people-centric business intelligence. If so, is it all about people being able to create and share? Is it about people being able to talk to their data and have it talk back?
Despite the criticism, I believe the goal of BI 2.0 should be to cut the time between when an event occurs and when an action is taken, in order to improve business performance. The longer you take to respond to new data, the less value there is in your response.
BI tools today focus on the presentation of data. But BI is not just extracting data that is hours or days old and publishing it in reports. Users often complain that the information arrives too late to be really useful, and simply delivering more reports faster doesn't solve that problem. Customers expect instant results and don't want to wait for answers.
BI 2.0 will come about through a blending of consumer-oriented information mashup technologies with extranet-oriented traditional BI solutions.
Charles Nicholls has laid out some good ideas in a recent article.
BI 2.0 is driven by this need for intelligent processes and has the following characteristics:
- Event driven. Automated processes are driven by events; therefore, it is implicit that in order to create smarter processes, businesses need to be able to analyze and interpret events. This means analyzing data, event by event, either in parallel with the business process or as an implicit process step.
- Real time. This is essential in an event-driven world. Without it, it is hard to build in BI capabilities as a process step and nearly impossible to automate actions. By comparison, batch processes are informational – they report on the effectiveness of a process but cannot be part of the process itself unless time is not critical. Any application that involves trading, dynamic pricing, demand sensing, security, risk, fraud, replenishment or any form of interaction with a customer is a time-critical process and requires real-time processing.
- Automate analysis. In order to automate day-to-day operational decision-making, organizations need to be able to do more than simply present data on a dashboard or in a report. The challenge is turning real-time data into something actionable. In short, businesses need to be able to automatically interpret data, dynamically, in real time. What this means in practice is the ability to compare each individual event with what would normally be expected based on past or predicted future performance. BI 2.0 products, therefore, must understand what normal looks like at both individual and aggregate levels and be able to compare individual events to this automatically.
- Forward looking. Understanding the impact of any given event on an organization needs to be forward looking. For example, questions such as “Will my shipment arrive on time?” and “Is the system going to break today?” require forward-looking interpretations. This capability adds immediate value to operations teams that have a rolling, forward-looking perspective of what their performance is likely to be at the end of the day, week or month.
- Process oriented. To be embedded within a process in order to make the process inherently smarter requires that BI 2.0 products be process-oriented. This doesn’t mean that the process has been modeled with a business process management tool. Actions can be optimized based on the outcome of a particular process, but the process itself may or may not be explicitly defined.
- Scalable. Scalability is naturally a cornerstone of BI 2.0 because it is based on event-driven architectures. This is critical because event streams can be unpredictable and occur in very high volumes. For example, a retailer may want to build a demand-sensing application to track the sales of every top-selling item for every store. The retailer may have 30,000 unique items being sold in 1,000 stores, creating 30 million store/item combinations that need tracking, and may be selling 10 million items per day. Dealing with this scale is run of the mill for BI 2.0. In fact, this scalability itself enables new classes of applications that would never have been possible using traditional BI applications.
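To make these characteristics concrete, here is a minimal sketch (my own illustration, not from any particular BI 2.0 product) of the event-driven, per-key "compare each event to what normal looks like" idea from the retail example above. It keeps a running baseline per store/item combination using Welford's online algorithm, so memory stays proportional to the number of keys rather than the number of events, and each incoming event is judged against its own history in real time. The names `handle_event` and `EventBaseline` are hypothetical.

```python
import math
from collections import defaultdict


class EventBaseline:
    """Running mean/variance for one store/item key (Welford's online
    algorithm), so 'normal' can be judged without storing event history."""

    def __init__(self):
        self.n = 0        # events seen so far
        self.mean = 0.0   # running mean
        self.m2 = 0.0     # running sum of squared deviations

    def update(self, x):
        # Fold one event into the baseline in O(1) time and space.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def is_anomalous(self, x, threshold=3.0):
        # Flag events more than `threshold` standard deviations from normal.
        if self.n < 10:  # too little history to say what normal is
            return False
        std = math.sqrt(self.m2 / (self.n - 1))
        return std > 0 and abs(x - self.mean) / std > threshold


# One baseline per store/item combination, created lazily on first event.
baselines = defaultdict(EventBaseline)


def handle_event(store, item, units_sold):
    """Called once per event, in arrival order: compare the event to the
    per-key baseline first, then fold it in. Returns True if anomalous."""
    key = (store, item)
    alert = baselines[key].is_anomalous(units_sold)
    baselines[key].update(units_sold)
    return alert
```

A real deployment would shard the key space across machines and add decay so the baseline tracks recent behavior, but the shape is the same: analysis happens per event, inline with the process, rather than in an after-the-fact batch report.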
Neil Raden says in his article on BI 2.0: "Rest assured, the current era of BI is coming to an end and will be succeeded by a BI 2.0 era that promises simplicity, universal access, real-time insight, collaboration, operational intelligence, connected services and a level of information abstraction that supports far greater agility and speed of analysis. The motivation for this 'version upgrade' for BI is the need to move analytical intelligence into operations and to shrink the gap between analysis and action."
But I think nobody has yet had the last word on what BI 2.0 exactly means…