Noise cancelling headphones remove ambient sound, allowing the wearer to ignore distractions and indulge in an optimised listening experience.
In our digital society, we are constantly filtering noise to find the music, friends, cafes, meetings and other information that matter most.
Government open data initiatives come with their share of noise. The public sector would do well to put on the headphones and focus on the most important, most useful or ‘highest value’ data. A simple data value assessment method is needed.
Government organisations collect and manage large volumes of data on a daily basis in the development of public policy, programme management and service delivery. The Australian government’s open data initiative requires that data collected and held by the public sector be made available publicly. This creates much needed transparency, helps improve citizen engagement and provides the foundations to support new and innovative products and services (think new apps). This has the potential to make a significant economic impact; in Australia, it has been estimated at up to $25b annually.
It’s no surprise then, that local councils, state and federal government organisations have established open data platforms and are busy publishing their data assets (as a start, refer to over 23,000 discoverable data sets on https://data.gov.au and spatially-enabled data on https://nationalmap.gov.au).
This trend isn’t going away any time soon. Rather, it has support from the top-down, with the release of the Australian Government policy on public data by the Prime Minister on 7 December, 2015.
Australia is not alone in this. Open data has global momentum, with initiatives from governments in the UK, US, Canada, NZ and Europe, and it aligns strongly with the Smart City agendas being implemented by local governments.
However, it’s not as easy as it sounds. Making data publicly available is driving government organisations to ask some basic questions such as ‘what data do we even have?’, ‘what should we open first?’, ‘how do we go about publishing our data?’ and ‘how will we protect the sensitive stuff?’.
The challenges include:
- Identifying the data held in the organisation. Few large organisations have a current, complete and readily accessible view of the data they hold across their portfolios, divisions and branches. This creates a need for (usually resource-intensive) data discovery and ‘data cataloguing’ to establish an agreed view of what data assets currently exist.
- Lack of consistency. The way that data is defined, created, maintained, published and used varies depending on whose idea it was at the time. This means that effort is needed to establish and implement the right data standards to drive consistency, connectedness and repeatability.
- Scarcity of ICT infrastructure and data skills. The right tools and technology are not always readily available. The recent Data Skills & Capability Report highlighted the need for investment in a workforce that has the skills and capability to effectively manage and use data, alongside an uplift in ICT infrastructure.
- Legislative (and cultural) constraints. Ensuring that sensitive data is protected can get pretty complicated due to legislative and contractual constraints, not to mention a tendency for risk aversion. A focus on cultural change and willingness to unpack and challenge legislative barriers is one part, but there is also a need for getting comfortable with not knowing what the data might be used for.
These challenges, topped with resource and funding constraints (e.g. everyone already has a day job), drive the need for a prioritised approach to making government data public. For government organisations seeking to meet their obligations to publish to open platforms, this means focusing on publishing the most valuable data first to maximise the return on public funds.
Data value assessments, in an open data context, support efficient resource allocation to publish the data assets that are most valuable. This also helps high value data to be readily shared within an organisation, even before it has hit an open data platform. Further, let’s not forget that the outcome of a data value assessment can be leveraged to inform the prioritisation of other strategic data and information management investments (such as data quality improvement).
The initial challenge is to determine how data can be valued. There are a number of ways to assess data value, several of which have been explored and utilised by public sector organisations. Ways that data value can be assessed include:
- Economic – The bottom-line financial value of the data asset, which looks to describe the widespread impact of the data and its potential consumption. Economic value can be estimated as the benefit of consuming the data, minus the cost to obtain and maintain it.
- Commercial – The monetised view of data, which assesses the income that can be generated by selling, renting or bartering with this data (e.g. how much is a consumer willing to pay for access to this data asset).
- Relative Business Value – Value scoring based on ranking the relative value of one data asset against another, across dimensions such as alignment with organisational objectives and data asset usage.
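To make the economic framing concrete, a back-of-the-envelope calculation might look like the sketch below. All figures are hypothetical assumptions for illustration, not real estimates.

```python
# Hypothetical economic value calculation for a single data asset.
# Every dollar figure here is an illustrative assumption.

annual_benefit_of_consumption = 120_000  # e.g. time and cost saved by consumers of the data
annual_cost_to_maintain = 45_000         # e.g. curation, hosting and publication effort

# Economic value = benefit of consuming the data, minus the cost to obtain and maintain it.
economic_value = annual_benefit_of_consumption - annual_cost_to_maintain
print(economic_value)  # 75000
```

In practice, the hard part is estimating the benefit side of the subtraction, which is why simpler relative methods are often a better starting point.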
Regardless of which value assessment approach is taken, a more holistic view of the organisation’s data assets is needed as the starting point. There is also a reasonableness test in that trying to log and assess all XLS spreadsheets is not doing anyone any favours! We can see how the idea of ‘data value’ can get complicated or ‘noisy’ very quickly. But there is always a way forward.
The right approach to putting a value on data should be based on the organisation’s maturity and the business ‘use cases’ for how information about the value of a data asset will be used. For example, an organisation intending to sell a part of its business, and the data that goes with it, would choose a Commercial data value assessment approach. Alternatively, an organisation looking to understand the full impact of publishing a data set (say, to an open data platform at no cost) might opt for an Economic assessment to fully capture the benefits realised through publication.
A government department in the early days of its data management journey, but seeking to meet its obligations to publish data to open platforms, should start simple. This means leveraging the Relative Business Value approach.
In contrast to an economic or commercial data value assessment, the Relative Business Value approach does not attempt to directly map data assets to a dollar figure. The benefit of this method is that it is fast and simple to execute and well suited to an organisation that is at an early stage in its data management journey.
Once a view (or catalogue) of the organisation’s data assets has been established, the Relative Business Value approach can be as straightforward as mapping data assets to two dimensions:
- Strategic relevance: the extent to which the data asset supports the organisation’s objectives.
- Usage: the extent to which the data asset is accessible and used. This could focus on current or potential use, either within or outside the organisation.
These two dimensions can readily be mapped on an axis to help visualise the highest value data assets.
The data assets with the greatest importance to organisational objectives and greatest usage will usually become the data assets that are considered ‘highest value’ and strong candidates for publication to open data platforms. (This also makes them priority candidates for data quality assessment and improvement).
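The two-dimension scoring and ranking described above can be sketched in a few lines of code. The asset names and 1–5 scores below are hypothetical examples, and a simple sum is just one way to combine the dimensions.

```python
# Minimal sketch of a Relative Business Value assessment.
# Asset names and their 1-5 scores are hypothetical, not real catalogue data.

catalogued_assets = {
    # asset name: (strategic relevance, usage), each scored 1-5
    "road_network": (5, 4),
    "park_locations": (3, 5),
    "internal_hr_records": (2, 1),
    "public_transport_timetables": (5, 5),
}

def relative_value(scores):
    """Combine the two dimensions; a plain sum keeps the method transparent."""
    strategic_relevance, usage = scores
    return strategic_relevance + usage

# Rank assets from highest to lowest relative business value.
ranked = sorted(
    catalogued_assets,
    key=lambda asset: relative_value(catalogued_assets[asset]),
    reverse=True,
)
print(ranked[0])  # the strongest candidate for publication
```

The assets at the top of the ranking are the ones that would sit in the high-relevance, high-usage corner of the two-axis visualisation.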
Undertaking a data value assessment also creates a good opportunity to test out the role of data owners and data stewards. For example, who has an informed view of which data assets are more valuable than others? It follows that this type of initiative can highlight the right candidates for data ownership roles, critical for an organisation seeking to strengthen its data governance capabilities.
The Relative Business Value approach for assessing the value of data assets is far from perfect (e.g. it measures value based on internal perceptions, not industry or public appetite) but it’s a simple place to start and an approach that delivers a clear output – that is – an agreed view of which data assets are most valuable to the organisation.
In the same way that noise cancelling headphones enable the listener to focus on the intended richness and variation in music, in an open data context, a simple data value assessment can move past the noise and put the focus on the data that matters most. Getting the most valuable, ‘priority’ data published to open platforms brings our nation a step closer to realising the immense economic and social gains of data sharing.