Big Data has quickly moved ‘out of the lab’ and into operation across businesses all around Australia. Insights are driving the creation of very real solutions that deliver competitive advantage.

Now that extremely large data sets can be exploited effectively, we have seen organisations dive head first into their Big Data. That doesn’t mean it always goes smoothly: organisations often struggle to unlock the potential of their Big Data and translate it into meaningful business value.

Because we are now dealing with such large volumes of data, we have timely access to answers for questions we might not even have asked before. Making the most of Big Data therefore calls for an exploratory mindset, where we approach our data analysis with flexibility and collaboration. Conversely, organisations that hold on to long-lead-time implementation styles for their Big Data solutions can easily pass by opportunities to operationalise high-value insights quickly and effectively. The natural response is to fuse agile techniques into our delivery approach, uncovering insights beyond our original expectations and remaining responsive to advancing technologies.

In this blog, we share one example from our experience of using an Agile approach to maximise the value that can be obtained from Big Data within a single project for an individual business unit.

An Agile Approach for Big Data Delivery

The business we were embedded in needed to respond promptly to increasing workloads and to gain end-to-end visibility of customer sentiment and the operations behind it.

There are many ways a Big Data solution can be implemented, but in this instance, business insights were discovered in a matter of weeks by initially setting up a small-scale technology stack and showcasing use cases through prototype dashboards.

Having demonstrated early on that insights from Big Data can provide tangible business value, we then leveraged the flexible and scalable nature of Big Data technologies to easily expand the scope of the solution, building on value cases that had already been proven.

This is where we found that an Agile delivery framework, scalable Big Data solutions and highly iterative business engagement can work together to maximise opportunities for insight-driven growth. Below are some of the specific learnings the engagement team took away from this approach:

Fusing an Agile Mindset with Big Data Delivery to Enable Insight Driven Growth
  1. Be ready to emulate a lean start-up

Taking advantage of the low-cost entry point of Big Data technologies

Big Data technologies are renowned for being cost-effective, efficient and easily scalable.

This means low-cost Big Data platforms can be implemented swiftly. When we want to validate the value accessible in Big Data, the low barrier to entry lets us quickly stand up an initial build of a Big Data technology stack. This initial build can then be used as a tool to derive insights in line with high-impact, low-effort use cases.

Once the value of these use cases has been proven, we can expand the scope of the solution to support operational growth.

This stands in stark contrast to organisations adopting a ‘Big Bang’ delivery model, where the full breadth of the solution is implemented all at once. In many ways, Big Data technologies enable a break from waterfall-style delivery in favour of iterative drops of value, allowing organisations to showcase insights, and the value they bring, upfront.
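To make this concrete, the sketch below shows roughly what a first high-impact, low-effort build can look like: a single PySpark job, runnable on a laptop in local mode, that turns a raw extract of customer interactions into a dashboard-ready aggregate. PySpark, the file name and the column names are illustrative assumptions for the sketch, not the specific stack used in this engagement.

```python
# A minimal "lean start-up" first build: one local Spark job that produces a
# prototype dashboard feed. PySpark, the input file and all column names are
# hypothetical; swap in whatever stack and schema your organisation actually has.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .master("local[*]")          # runs on a laptop; point at a cluster later
    .appName("sentiment-prototype")
    .getOrCreate()
)

# Hypothetical extract of customer interactions with a pre-scored sentiment column
interactions = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("customer_interactions.csv")
)

# High-impact, low-effort use case: interaction volume and sentiment by channel per week
summary = (
    interactions
    .withColumn("week", F.date_trunc("week", F.col("interaction_date")))
    .groupBy("week", "channel")
    .agg(
        F.count("*").alias("interaction_volume"),
        F.avg("sentiment_score").alias("avg_sentiment"),
    )
    .orderBy("week", "channel")
)

# Feed a prototype dashboard from a flat file before investing in anything heavier
summary.coalesce(1).write.mode("overwrite").csv("prototype_dashboard_feed", header=True)
```

Once a couple of use cases built this way have proven their value, the same logic can be pointed at a cluster and a proper ingestion pipeline without being rewritten.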

  2. Embody business and data ownership

Working side-by-side with the business so solutions are tailored to deliver true value

Big Data introduces new technologies and insights to the business, and with new things, people get excited! If not managed closely, though, the speed of new discoveries in Big Data can intimidate or surprise some people, creating resistance to acting on these insights.

By developing business and technology team partnerships, organisations can proactively manage any perceptions and co-create the solution with a shared sense of ownership.

To reflect this in our way of working, we can involve users not just during requirements discovery but throughout the build of the use case. Open discussions about how data scientists have translated business requirements into data design help the team respond flexibly to new discoveries, share knowledge between business and data teams, and ultimately exceed expectations.

  3. Discover value-adding use cases

Delivering value quickly in a fluid, evolving environment based on data discoveries 

When dealing with Big Data, it is rare to be able to lock down a complete set of requirements, and realise optimal business benefits from them, before needing to dive into the vast amounts of data.

Because of its sheer scale, business users may need time to interrogate the data and discover the full potential of a use case before they can identify the right features. As they uncover new opportunities in their Big Data, requirements will evolve, and the development of these use cases needs to evolve with them.

This is where adopting Agile ways of working, such as an iterative, customer-centric mindset, helps us deliver features that fit business needs and drive organisational growth beyond expectations. And don’t forget to showcase and celebrate outputs so the value achieved is made visible!
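As a rough illustration of that discovery loop, the sketch below shows the kind of interactive interrogation a data scientist and business user might run together in a notebook, where each answer prompts the next, unplanned question. The columns and values (channel, product_line, sentiment_score) are hypothetical and continue the assumptions from the earlier sketch.

```python
# Exploratory interrogation in a notebook or spark-shell session, continuing the
# hypothetical customer_interactions extract from the earlier sketch.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.master("local[*]").appName("use-case-discovery").getOrCreate()
interactions = spark.read.csv("customer_interactions.csv", header=True, inferSchema=True)

# First question: which channels drive the most negative sentiment?
(interactions
    .filter(F.col("sentiment_score") < 0)
    .groupBy("channel")
    .count()
    .orderBy(F.desc("count"))
    .show())

# A finding here (say, most complaints arrive through one channel) prompts a
# follow-up question nobody wrote into the original requirements.
(interactions
    .filter(F.col("channel") == "call_centre")    # hypothetical value
    .groupBy("product_line")
    .agg(F.avg("sentiment_score").alias("avg_sentiment"))
    .orderBy("avg_sentiment")
    .show())
```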

  4. Learn rapidly in focused sprints

Utilising Subject Matter Experts (SMEs) when breaking down workflow into short sprints

By breaking the workflow down into manageable features across the value enablement journey, we are able to apply the right skill set at the right time in delivery.

To address new project discoveries, SMEs can be engaged in focused sprints, allowing the team to respond quickly to the unique challenges uncovered.

The delivery team can ensure each component of the solution is functioning, draw on SMEs who can promptly address challenges at specific points in time, and maintain overall confidence in the capability of the Big Data solution.

These learnings are a good example of how open source Big Data technologies, implemented with a lightweight Agile delivery mindset, can be used to discover and deliver value early, whilst building towards a truly scalable solution in a project context.

However, we’ve really just scratched the surface. Big Data solutions often enable a new degree of agility at the enterprise level, providing delivery and migration paths that were previously unavailable with older technologies.