Hi, I’m Roger Jones and I’m the Chief Technology Officer of Auckland Transport. We cover everything to do with transport in Auckland, apart from the state highways. We run all the urban roads and all the rural roads, which are substantial in Auckland, including the maintenance and operation of everything on the roads.
One of the first things we recognized quite early was that we were running multiple systems. For instance, with CCTV we had five control rooms for five different operations. By automating that, so that the operators would be presented with the things that mattered to the public and to the transport agency from a safety perspective and from an operational perspective, we could get huge efficiencies and economies of scale. So that was one of the key drivers for bringing them all together, running the analytics across that, and then making that data available to other emergency services.
So we looked at it from two perspectives: the acquisition of real-time data and sensor data from across the city, and then the analytics. What we’ve done is separate out the historical data warehouse, which used to collect a lot of real-time data from the buses and public transport, and migrate that into a high-speed database using the Microsoft platform on top of an HPE appliance. And then we’ve used the Vertica platform to start taking the analytics that we need in real time and process that. Now, we’re talking about a petabyte of data coming off the CCTV alone. So it’s a significant amount of data being processed in real time that we’re using the likes of Vertica for.
So we use IDOL as the front-end processing for all our CCTV analytics. That’s the core engine, and it’s also driving incident-management alerting to the operators. Once that’s processed, we pass a different set of data into Vertica, which is available for real-time querying by the planners, the people who are managing incidents. So we’ve separated the two: the real-time management of things that are happening right now, and the analysis that sits just behind that, the “What happened five minutes ago? What’s going to happen in the next ten minutes?” Being able to use engines like IDOL enables us to multitask a camera into pedestrian counting, cycle counting, who went through red lights, the number of vehicles, and it provides us with real factual information that we didn’t have before. IDOL does the front-end analytics, and it’s very good at doing that, but then it passes the data on to Vertica.
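The pipeline described above, per-camera detections flowing from a front-end analytics engine into a store that planners can query over short time windows, can be sketched roughly as follows. This is a minimal illustration only, not Auckland Transport’s actual system: the event records, field names, and the `counts_in_window` helper are all hypothetical stand-ins for the IDOL-to-Vertica flow, using plain Python in place of a real database.

```python
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical detection events, standing in for what a front-end CCTV
# analytics engine (such as IDOL) might emit for each camera before the
# records are loaded into a columnar store (such as Vertica).
EVENTS = [
    {"camera": "CAM-01", "type": "pedestrian", "ts": datetime(2016, 5, 1, 8, 1)},
    {"camera": "CAM-01", "type": "cyclist",    "ts": datetime(2016, 5, 1, 8, 2)},
    {"camera": "CAM-01", "type": "red_light",  "ts": datetime(2016, 5, 1, 8, 3)},
    {"camera": "CAM-02", "type": "pedestrian", "ts": datetime(2016, 5, 1, 8, 4)},
    {"camera": "CAM-01", "type": "pedestrian", "ts": datetime(2016, 5, 1, 8, 9)},
]

def counts_in_window(events, end, minutes=5):
    """Answer the planners' 'what happened in the last N minutes?' question:
    tally event types seen in the (end - minutes, end] window."""
    start = end - timedelta(minutes=minutes)
    return Counter(e["type"] for e in events if start < e["ts"] <= end)

# "What happened in the five minutes up to 08:05?"
window = counts_in_window(EVENTS, datetime(2016, 5, 1, 8, 5))
print(window)
```

In a real deployment this aggregation would be a SQL query over the event table rather than an in-memory tally, but the shape of the question, counts of typed detections per camera over a sliding time window, is the same.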
What we can now do through efficient design is reduce the number of cameras and multitask them. So we no longer have a camera that just counts cars. It can read a number plate. It can count pedestrians. It can activate the traffic lights for the cyclists.
So, yes, we can fundamentally change the way traffic lights work in a city.
### END ###
Sponsored by HPE