Data Analytics at Data Agility.

At Data Agility, we’re not sure when data was first gathered, analysed and decisions made from it, but it is probably fair to say that it was a long time ago.

Data Agility has been in the business of data analytics since we were established in 2003. Early on, we undertook a major piece of work for the Australian Taxation Office (ATO) on the impact of a change in service standard for their contact centres. They had undertaken some initial analysis which resulted in a ‘gulp’ moment. We then dived deep into their voluminous data and, working closely with them, arrived at a solution that avoided the ‘gulp’.


What we see.

Somewhat akin to the ATO’s experience, one of the challenges that many organisations and people face today is the volume and variety of data that is available.

Working out what to use and what not to use can be enormously time consuming, and there is often an uncomfortable feeling that in doing nothing, you might be missing the boat.

Another challenge that we see at Data Agility is that for all the mass of data that exists, what you want may not exist, or you may not have access to it even if it does exist.

It is very hard to make good decisions without reliable data. As Ted Friedman at Gartner put it, ‘a data lake without data quality is just a data swamp’.

Data Analytics

How do you approach the topic?

How do you approach data analytics?  Dr Peter Cox, a theoretical particle physicist, says you must be very clear about the question you want answered.

Dr Cox works with the data produced by the Large Hadron Collider (LHC) at CERN.  He says he has no chance of achieving anything if he isn’t clear on what he is trying to do with the data in the first place.

The same applies to just about every data analytics challenge. Be clear on the question you want to ask and the problem you are trying to solve, and you’ll be half way there.

Sustainability Victoria (SV) was tasked with modelling 30 years into the future to determine the impact that significant population growth would have on the State’s waste infrastructure.  Waste comes in a variety of types, including recyclables, household and commercial landfill, and green waste (vegetation).

They needed to understand, in granular detail, the current waste management capacity and how much waste was being generated in each category.  Working in partnership with SV, we then built a projection model that included non-waste data such as population, rainfall and commodity prices.  Each of these elements has a significant impact on the production of waste.

Today, SV is in the happy position of being able to accurately model the production of waste 30 years into the future. The model also enables scenario modelling, and it is now a key component of the State’s waste management plan.
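As a loose illustration only (this is not SV’s actual model, and all figures and rates below are invented for the example), a projection of this kind can be sketched as a per-capita waste rate applied to a compounding population forecast, with scenario modelling done by varying the growth assumption:

```python
# Illustrative sketch only -- not Sustainability Victoria's actual model.
# All inputs (population, per-capita rate, growth rates) are invented.

def project_waste(population, per_capita_tonnes, growth_rate, years):
    """Project annual waste tonnage for each year, assuming the population
    compounds at a fixed growth rate and the per-capita rate stays flat."""
    projections = []
    for year in range(years):
        pop = population * (1 + growth_rate) ** year
        projections.append(pop * per_capita_tonnes)
    return projections

# Scenario modelling: vary the growth assumption and compare outcomes.
baseline = project_waste(6_500_000, 0.45, 0.015, 30)
high_growth = project_waste(6_500_000, 0.45, 0.025, 30)
print(f"Year 30 baseline:    {baseline[-1]:,.0f} tonnes")
print(f"Year 30 high growth: {high_growth[-1]:,.0f} tonnes")
```

A real model would layer in the non-waste inputs mentioned above (rainfall, commodity prices) and separate rates per waste category, but the scenario-comparison shape stays the same.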

Technology, Tools and Expertise

For long term planning.

The SV model enables long-term planning, but sometimes there is a need to do things very quickly. There is a range of technologies available today that enable good-quality data analytics in real time.

The solutions are often available from cloud-based service providers that allow organisations to start small and scale up as capabilities are built.

Data Agility concluded a program with EPA Victoria to build an Azure-based data analytics platform.  The platform is a component of a sophisticated solution that enables EPA to report significant environmental information swiftly and accurately.

One of its key capabilities is publishing air quality data in real time. One day in November 2019 was perhaps its first big test.  For most Victorians, 21 November 2019 was an intense day, with temperatures over 40°C and northerly winds persistently exceeding 40 kilometres per hour and gusting much higher.

All the while, a network of internet-of-things sensors across Victoria was gathering particle, carbon monoxide, nitrogen dioxide and ozone data, which the platform analysed and reported via EPA’s website. For many Victorians, the air quality went from good to very poor in the space of just two hours. By mid-afternoon it was hazardous, the most severe rating.
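The reporting step in a pipeline like this boils down to mapping raw sensor readings onto the rating bands the public sees. As a minimal sketch (the thresholds below are invented for illustration and are not EPA Victoria’s actual AirWatch categories):

```python
# Illustrative sketch only -- the band thresholds are invented for this
# example and do not reflect EPA Victoria's actual AirWatch categories.

def air_quality_band(pm25_ug_m3):
    """Map a PM2.5 reading (micrograms per cubic metre) to a rating band."""
    if pm25_ug_m3 < 25:
        return "Good"
    elif pm25_ug_m3 < 50:
        return "Fair"
    elif pm25_ug_m3 < 100:
        return "Poor"
    elif pm25_ug_m3 < 300:
        return "Very poor"
    else:
        return "Hazardous"

# A run of readings over two hours, deteriorating rapidly:
readings = [12, 30, 85, 250, 410]
print([air_quality_band(r) for r in readings])
```

The value of the real-time platform is that this mapping happens continuously as sensor data arrives, rather than in a slow batch publication cycle.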

Air quality data has been recorded in Victoria for decades, but the process of publishing it has always been slow. Historically, Victoria’s air quality has been extremely good, but at a time of increasing climate change this new asset is a valuable addition to our everyday decision making.

It is a nice example of how data is gathered, analysed and made available. 

It also picks out another contemporary theme of the new era: self-service.  All you have to do is check AirWatch and decide whether it is wise to dig a hole, cycle home or send the kids to sports training.



Where to from here.

In some ways, not much has changed.  You still need to know the question you want to ask/answer.  

You still need reliable data and you need to gather, analyse and publish it.  But as we have seen in the example of the excellent work of EPA Victoria and SV, if you can do all those things successfully you can deliver real benefits very quickly.


