I’ll be delivering a keynote presentation in Sydney, Australia, on Oct 18 at the Mastering Business Intelligence with SAP conference. I’ll also be doing a roadshow around the country with our local partner First Point Global, who really understand the business of IAM. The Australian market is very forward-looking these days, and I’ve been impressed with the vision behind the projects we’ve been involved in. If you’re in Australia, come by the conference or send me an email if you would like to meet.
Here’s the abstract in full:
BI is Dead. Long Live BI. The Future of Business Intelligence in the Cloud
Will cloud computing really change IT? Despite all of the attention that cloud computing commands, this deceptively simple question has been largely overlooked. The promise of shifting capex dollars to lower opex is certainly compelling, and the overnight success of some of the large Software-as-a-Service (SaaS) vendors, such as Salesforce.com, is undeniably impressive. But once the hype dies down, what will be the real impact of cloud computing on mission-critical applications such as BI?
Cloud will transform BI, much as it is currently transforming CRM. Cloud isn’t only about a cheaper new delivery model; when done right, cloud also radically changes how applications are composed and where data can reside. These changes are driven partly by necessity (acknowledging the realities of latency, privacy, and compliance) but also by opportunity and by rapidly evolving best practices that show us how to build applications better and deliver them faster. BI must change to succeed in the cloud, and cloud is an irresistible forcing function that will make this change inevitable. If your career is centered on BI, you need to be ready for this revolution.
Cloud computing will fundamentally shift many traditional software models from the licensed, shrink-wrapped, in-house approach toward a SaaS model, simply because the cloud can provide elastic, cheap, on-demand access to both computational power and storage.
There are also exciting ideas to ponder. Given that one of the arguably best approaches to ETL is to cache the raw/intermediate data in memory (e.g. with memcached) for rapid processing, yet that speed is constrained by physical memory (typically 8–64 GB of RAM), would it be better to write code to run on a seemingly limitless virtual machine, or to write parallelizable code that exploits multiple cores?
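The second option can be sketched in a few lines. This is a minimal illustration, not anything from the post itself: the chunk size, worker count, and doubling transform are all illustrative assumptions, standing in for a real CPU-bound transform step over RAM-sized chunks of data.

```python
# A minimal sketch of the "parallelizable code" option: fan a CPU-bound
# transform step out across worker processes with multiprocessing.Pool.
# The chunk size, worker count, and doubling transform are illustrative
# assumptions, not anything prescribed above.
from multiprocessing import Pool


def transform(chunk):
    """The CPU-bound 'T' step of ETL, applied to one in-memory chunk."""
    return [value * 2 for value in chunk]


def parallel_etl(records, chunk_size=4, workers=2):
    """Split records into RAM-sized chunks and transform them in parallel."""
    chunks = [records[i:i + chunk_size]
              for i in range(0, len(records), chunk_size)]
    with Pool(processes=workers) as pool:
        transformed = pool.map(transform, chunks)
    # 'L' step: flatten the per-chunk results back into one stream.
    return [value for chunk in transformed for value in chunk]


if __name__ == "__main__":
    print(parallel_etl(list(range(8))))  # [0, 2, 4, 6, 8, 10, 12, 14]
```

The design choice here mirrors the question in the text: rather than hoping for one machine with limitless memory, the data is cut into chunks that each fit comfortably in RAM, and the cores do the rest.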
BI has been touted for too long as a technology solution, much as industrial machines were in the mid-1800s, when it was claimed they would satisfy all of mankind’s desires. Look what happened.
BI, DW, and EDW are all technology solutions. As BI initiatives continue to fail at an alarming rate (50% globally, according to Gartner), and given the BI Valuenomics finding that “98% of BI implementations are successful in week 1, but only 50% remain successful by week 10,” could it be true that VMware and clouds are simply technical distractions from a problem that lies outside the technology stack?
“Without business in business intelligence, BI is dead” (Gartner, 2011) is as close as we get to the core of the issue.
My contention is that the true solution lies not in the cloud but in creating a solid foundation based on proven scientific methodologies; the rest could simply be hype. Once this is accomplished, costs go down, quality goes up, and many technology alternatives can be postponed.
According to empirical calculations, we globally wasted over US$4.6 billion in 2009 alone. This is computed from two Gartner findings: the global spend on BI in 2009 was US$9.3 billion, and 50% of BI initiatives fail, so roughly half of that spend (about US$4.65 billion) was wasted.
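The back-of-the-envelope arithmetic behind that figure can be made explicit. The variable names here are illustrative; the two input figures are the Gartner numbers cited above.

```python
# Back-of-the-envelope waste estimate from the two Gartner figures above.
bi_spend_2009_usd_bn = 9.3   # Gartner: global BI spend in 2009 (US$ billions)
failure_rate = 0.5           # Gartner: share of BI initiatives that fail

wasted_usd_bn = bi_spend_2009_usd_bn * failure_rate
print(f"Estimated waste: US${wasted_usd_bn:.2f} billion")
```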
We have a missing link in the whole BI process, and that is business.