Effectively Delivering Data
by Andrew Bilsdon, Delivery Lead – Altis Sydney
If you are involved in a data project, or the delivery of data as a service, this post is pitched at you.
The recommendations are based on my 15 years of experience in data, information and analytics and are intentionally pragmatic.
Self-service has arrived
Unlike the ‘paperless office’ that was once promised but never really arrived, self-service is here. Tools like Power BI, Tableau, Qlik, etc. have now made it simple for end-users to create their own content without necessarily having to engage IT.
What was once within our domain to deliver, guided by visualisation best practices, now sits with the end-user; the genie is truly out of the bottle.
So, we need to shift our thinking. Instead of building the reports and dashboards, let’s make sure we now take a consultative approach by:
- Sharing presentation options with users
- Providing technical guidance
- Facilitating communication between areas in the business to encourage collaboration and reuse of datasets and therefore insights
If they want to use a 3D pie chart, and we have explained the constraints, so be it!!
Given that end-users are now empowered to present data in their preferred manner, we need to embrace our new role of trusted data provider so that we remain relevant.
Make the data intuitive
The number one priority is that the data we deliver is intuitive, using business concepts, names and hierarchies. If we need to provide comprehensive supporting documentation to explain what the data means, we have failed.
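As a tiny illustration of the difference (all column names here are hypothetical, not from any particular system), compare a raw warehouse extract with a business-friendly presentation of the same data:

```python
import pandas as pd

# Hypothetical raw extract: system-oriented names that force users back to IT
raw = pd.DataFrame({
    "cust_id": [101, 102],
    "txn_amt_lcy": [250.0, 90.0],
    "eff_dt": ["2024-01-15", "2024-01-16"],
})

# The same data, presented using business concepts and names
business_friendly = raw.rename(columns={
    "cust_id": "Customer ID",
    "txn_amt_lcy": "Transaction Amount (Local Currency)",
    "eff_dt": "Effective Date",
})

print(list(business_friendly.columns))
```

No explanatory data dictionary is needed for the second version; the names carry the meaning themselves.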
Unless the business (and therefore the data) is slow-moving (e.g. annual membership renewals), intra-day delivery has to be the default frequency for data delivery.
The days of the overnight batch ETL process are long gone; however, don’t fall into the trap of thinking that ‘real-time’ is therefore the answer. Real-time can add a whole new set of problems that the customer most likely won’t thank you for.
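The middle ground is usually a frequent incremental load. A minimal sketch, assuming a watermark-based approach (the row shape, field names and schedule are illustrative, not a prescription):

```python
from datetime import datetime, timezone

# Hypothetical source rows, each stamped with when they last changed
source_rows = [
    {"order_id": 1, "updated_at": datetime(2024, 5, 1, 8, 0, tzinfo=timezone.utc)},
    {"order_id": 2, "updated_at": datetime(2024, 5, 1, 11, 30, tzinfo=timezone.utc)},
    {"order_id": 3, "updated_at": datetime(2024, 5, 1, 13, 15, tzinfo=timezone.utc)},
]

def incremental_load(rows, watermark):
    """Pull only rows changed since the last run, then advance the watermark.

    Running this every hour (or every few minutes) gives intra-day freshness
    without the operational complexity of a true streaming pipeline.
    """
    new_rows = [r for r in rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in new_rows), default=watermark)
    return new_rows, new_watermark

# A mid-morning run picks up only what has changed since 10:00
watermark = datetime(2024, 5, 1, 10, 0, tzinfo=timezone.utc)
batch, watermark = incremental_load(source_rows, watermark)
print(len(batch))  # 2 rows changed since the last run
```

The design choice is the point: the cadence can be tightened or relaxed per dataset, which real-time streaming does not let you do cheaply.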
Architect for value, not excellence
Just because we can create a multi-layer, streaming, AI-driven solution doesn’t mean that we should. I have been on projects where we spent months building layers (e.g. data vaults) that delivered only modest tangible business value.
The time to business value needs to be the number one goal of any data project. You can bet that the longer you take to deliver meaningful data that enables the user, the more ‘shadow’ solutions will spring up in the shape of Excel, Tableau, Power BI, etc.
Remember that shadow solutions typically do not exist because the user has gone feral and chosen ungoverned, architecturally suspect options; rather, it is typically because we (i.e. IT) have left them without a better option. They have their day job, and we are not supporting them in doing it.
Yes, I know that scope needs to be managed, but that is no excuse to deliver solutions that are slow to accommodate change.
I have seen many solutions built against requirements that are out of date within 6 months of the project completing. Now that end-users can self-serve, it means that their ability to ask new questions has also been enabled. We need to realise that the data we deliver will need to rapidly evolve and grow to keep pace with the questions being asked.
There is nothing more frustrating for a user than to see all the opportunities we have delivered, only to find the solution is too rigid to accommodate a new question that isn’t natively supported.
DON’T OVER GOVERN!!!
This is my favourite, hence the upper case. I have seen so many projects struggle to get ‘buy-in’ because they have:
- Not created a:
  - Simple process that allows users to request access.
  - Flexible security model that allows for the level of control necessary to grant access.
- Taken the traditional approach that stewardship and ownership will solve governance issues, without providing the owners and stewards with the:
  - Training in how to make decisions on access to data
  - Time in their busy schedules to fulfil their roles
  - Budget to address issues, or make changes, when necessary
The inevitable consequence of excessive governance is:
- The proliferation of shadow solutions so that users can do their jobs.
- Grumbling and noise from users that IT consistently fails to make the data available.
This doesn’t mean that a free-for-all is the alternative; it means finding the right level of governance.
Share the quality
Don’t pretend that the data is perfect; being open about the issues is a powerful position to take (see Don’t wait for your data to be right …).
Assuming the issues arise at the point of data capture/entry, it also means that the conversation changes from an IT-based one to a business-based one.
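One practical way to be open about quality is to publish simple, automated quality metrics alongside the dataset itself. A sketch, assuming a customer extract (the fields and checks are illustrative):

```python
import pandas as pd

# Illustrative customer extract with deliberately imperfect data
customers = pd.DataFrame({
    "customer_id": [1, 2, 2, 4],
    "email": ["a@example.com", None, "b@example.com", "c@example.com"],
})

def quality_summary(df, key):
    """Return headline quality metrics to publish next to the data itself."""
    return {
        "rows": len(df),
        "duplicate_keys": int(df.duplicated(subset=[key]).sum()),
        "missing_email": int(df["email"].isna().sum()),
    }

summary = quality_summary(customers, "customer_id")
print(summary)  # {'rows': 4, 'duplicate_keys': 1, 'missing_email': 1}
```

Published this way, a duplicate key or missing email becomes a conversation with the business area that captures the data, not a complaint about the warehouse.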
Find an advocate
With the arrival of true self-service, there has been a proliferation of the power user. These are the users who used to wrangle Excel or even write simple SQL statements; now that this overhead has been removed, there are more users with an appetite to consume the data we prepare.
If you can find the right person, you will make their life so much easier, and in return, they will become a reference. It is so much easier to stand up the next project when you have a business user who is prepared to sing your praises.
If we want to remain relevant, we need to firmly embrace the role of providing data services where flexibility, convenience, quality and timeliness are the goals by which we measure ourselves.