Effort and BI – Is the Focus Shifting?
Neil Randall, a consultant in our London office, shares his thoughts on how technology is shifting the focus in the way data warehouse and BI solutions are being delivered.
Platforms such as Microsoft Azure and Amazon’s AWS may well be shifting the balance of where we put our time and effort when delivering Data Warehousing and Business Intelligence (BI) solutions.
First, what is the point of building a BI solution? Have we ended up in a place where the automatic answer is “well, that’s easy! It’s a Data Warehouse, an ETL (Extract, Transform, Load) process and a reporting tool”, and we pop off, run a few RFIs and select a tool to cover each of those three areas? Or is it that we simply want to deliver data and reports to our end users as fast as we can, at the best value possible?
BI may well be in the middle of a huge shift in mindset. The old approach, using traditional vendors, delivered solutions – but often at vast expense and over long timescales. Purchasing was mainly CAPEX: up-front, fixed-term licensing costs and infrastructure expenditure. Cost of failure? Well, the old phrase “too big to fail” comes to mind!
With the new cloud platforms and tools, this no longer needs to be the case. We can spin up cloud-based tools in a matter of minutes, with no upfront costs, and pay as we go. We can build POCs at low price points with little risk. Don’t like it? Just switch it off and stop paying. The cost of failure is rapidly approaching zero, which is great news for those of us who love to experiment.
Considering the three areas mentioned above (ETL, database and reporting tools), the cloud vendors now offer both ETL (in practice, more often ELT) and the database at a fraction of the cost of traditional vendors. The front-end reporting tool vendors are slowly catching up, moving to subscription models, although few have been brave enough to implement a true “pay as you go” model.
So how do these new possibilities impact where we put our effort? As mentioned, ELT (Extract, Load, Transform) is a cheap, fast and viable method of transforming our data into information within a platform such as Redshift or Snowflake: pay-as-you-go, big-data-scale databases so keenly priced they would make certain traditional competitors weep. The speed of development with these tools is impressive; you could build a data pipeline and have it up and running within days. Cost of failure? Negligible.
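To make the ELT idea concrete, here is a minimal sketch in Python. The pattern is: land the raw data untouched in a staging table (Extract and Load), then transform it into reporting shape using the database’s own SQL engine rather than a separate ETL server. Note the assumptions: sqlite3 stands in for a cloud warehouse such as Redshift or Snowflake, and the table and column names (`raw_sales`, `sales_by_region`) are purely illustrative.

```python
import sqlite3

# A local stand-in for a cloud warehouse; in practice this would be
# a connection to Redshift, Snowflake or similar.
conn = sqlite3.connect(":memory:")

# Extract + Load: land the raw rows as-is in a staging table.
conn.execute("CREATE TABLE raw_sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_sales VALUES (?, ?)",
    [("North", 120.0), ("North", 80.0), ("South", 50.0)],
)

# Transform: reshape inside the database engine itself, using plain
# SQL, so the heavy lifting happens where the data already lives.
conn.execute(
    """
    CREATE TABLE sales_by_region AS
    SELECT region, SUM(amount) AS total
    FROM raw_sales
    GROUP BY region
    """
)

for row in conn.execute(
    "SELECT region, total FROM sales_by_region ORDER BY region"
):
    print(row)
# Prints ('North', 200.0) then ('South', 50.0)
```

The design point is that the transform step is just SQL run in the warehouse, so it scales with the engine and can be rewritten or thrown away cheaply, which is exactly where the low cost of failure comes from.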
So I’m getting to the point, honest…
If we can get our data pipeline up and running fast and at low cost, we can switch our BI efforts away from managing infrastructure, databases and complex back-end modelling, and into the real value-add activities, where our data consumers (end users) start seeing the data and building insight. With a database engine that scales up and down in near real time depending on your usage, you also gain performance increases that don’t require weeks of a DBA’s time tuning indexes and adding SQL hints, a host of network activity, or procurement paperwork tying you into years of spend.
So we now have more choices than ever in defining our BI solutions and delivering data to our users and insight to our companies. We can choose to do it fast, with little cost of failure, and concentrate our efforts on the analysis phases, or we can carry on spending a small fortune building and maintaining monolithic solutions which tie us in for years at a time. Want a world where you can treat your BI solution like any other development: press a button and spin up a development or test environment based on the latest state of the data and its structure? No problem!
It’s a brave new world out there, and as someone who likes to see results fast and to be able to experiment, it’s an exciting time to be in BI. Shifting the core of our effort towards analysis and away from providing the “back-end stuff” is key to the new approach.