Queen Elizabeth Olympic Park uses artificial intelligence (AI) to create real-time insights that help improve the flow of visitors around its sites, transportation systems and retail facilities.
The London Legacy Development Corporation (LLDC), which manages the park, has plugged 32 of its CCTV cameras into an artificial intelligence platform from computer vision specialist Fyma over the past two years. Emma Frost, Director of Innovation at LLDC, tells diginomica how the technology offers new insights into how the park and its facilities are used.
Frost describes the park, which covers 560 acres, as “a really interesting space.” She says it has continued to evolve rapidly since the London Olympics in 2012. Although many sporting venues remain, the park also provides access to a range of commercial, cultural and educational facilities, as well as the nearby Westfield Stratford City shopping center.
These sites and facilities are connected by a network of roads, trails and public transport hubs. Frost says AI has enabled her innovation team to turn the park’s CCTV cameras into smart devices that deliver new data-driven insights. The technology helps track the movement of people and objects in the area, which helps LLDC make important decisions about how the space is used, improved and developed. Frost explains:
Having a real-time data feed on the movement of people and the different modes of transport in the area is very important for us, because we need to know at all times how many people are in and around the park and how they are using it, because it changes so quickly.
LLDC also chose the Fyma AI platform, which has been adopted by businesses and local governments in other European cities, because of its strong data ethics principles, Frost says. The platform is trained never to recognize or process human faces, for example.
Fyma blurs human faces in the images used to train its AI system, so the algorithms — and the data science teams that process the information — never see human faces. Camera feed data is automatically deleted once it passes through AI analysis. Frost explains:
There is no gray area there; the data is never actually captured. And as a public organization, it was very important for us – as we start to enter the world of AI and understand different technological applications for a better urban future – that we really pay attention to the ethics of data, data standards and data protection from the start of any of our trials and initiatives.
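The privacy model described above — blur faces before anything is stored or analyzed, then discard the raw footage once the aggregate numbers have been extracted — can be sketched in a few lines. This is an illustrative toy, not Fyma’s actual pipeline or API; all the names and structures here are assumptions for the sake of the example.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    pixels: list         # stand-in for raw image data (rows of pixel values)
    face_regions: list   # bounding boxes a face detector would return

def blur_faces(frame: Frame) -> Frame:
    # A real system would apply a Gaussian blur to each region;
    # here we simply overwrite the pixels inside each bounding box.
    for (x0, y0, x1, y1) in frame.face_regions:
        for y in range(y0, y1):
            for x in range(x0, x1):
                frame.pixels[y][x] = 0   # anonymized
    return frame

def analyse_and_discard(frame: Frame, detections: list) -> dict:
    """Return per-class counts, then drop the frame so no imagery is retained."""
    counts = {}
    for label in detections:
        counts[label] = counts.get(label, 0) + 1
    del frame   # raw footage is deleted once analysis has run
    return counts

frame = blur_faces(Frame(pixels=[[255] * 4 for _ in range(4)],
                         face_regions=[(0, 0, 2, 2)]))
counts = analyse_and_discard(frame, ["person", "person", "bicycle"])
print(counts)   # {'person': 2, 'bicycle': 1}
```

The point of the ordering is that the analytics layer only ever sees anonymized frames, and what survives past the analysis step is counts, not imagery.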
Analyzing the use of alternative transport
More than 43 million objects have been detected in the vicinity of the park to date. Digging deeper into the data yields more detail on specific areas, such as transportation. Take the example of Waterden Road, which leads to the Westfield Stratford City shopping center. Since the start of the project, Fyma has detected more than nine million people, approximately 1.6 million buses, more than 532,000 bicycles and more than 100,000 electric scooters there. Frost says:
We have seen a massive influx of new modes of transportation. So, for example, we were the first place to test electric scooters. They are not legal on major roads. But because we are a private estate, in 2018 we were the first place in the UK to test and run a trial for electric scooters.
The team is also using the AI system to analyze how people use electric scooters. They examine key issues, such as the routes they take and potential points of conflict with pedestrians and other means of transport. Frost adds:
Using the AI platform with our existing camera network means we get a real-time readout of all this information. We actively had to train the AI platform to be able to detect the shape of the e-scooter. And that was a big part of that trial to make sure we could get high-accuracy readings on detecting those scooters.
Information from the artificial intelligence system, which also looks at trends such as bike use, bus stop popularity and the flow of people through the park, will help LLDC make the area more accessible, user-friendly and sustainable in the years to come. Fyma’s system was initially implemented in a six-month trial last year. Now that the technology has proven itself, Frost wants to do more. She says:
We will now extend this trial. We’re going to expand beyond the 32 cameras, which looked mostly around the edges of the park, and cover most of the interior of the park, with a few key priority areas. So it’s really an extension to look at more detailed areas between sites and road networks, so we can analyze the next level of detail.
A good example of where they want to go next is to use AI to inform LLDC’s retail strategy. Frost explains:
It’s about understanding in great detail the movement of people walking and where dwell times are the longest, and how those levels change over weekends, whether there are seasonal patterns or whether or not a West Ham United match takes place at the London Stadium.
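Dwell-time analysis of the kind Frost describes can be derived from anonymized track data. The sketch below is a hypothetical illustration (the record format and zone names are assumptions, not Fyma’s output): each observation is a track ID, a zone, and a timestamp in seconds, and we sum the time each track spends in each zone to rank zones by average dwell.

```python
from collections import defaultdict

# Hypothetical anonymized observations: (track_id, zone, timestamp_seconds)
observations = [
    ("t1", "retail_row", 0), ("t1", "retail_row", 120), ("t1", "canal_path", 150),
    ("t2", "retail_row", 10), ("t2", "retail_row", 70),
    ("t3", "canal_path", 0), ("t3", "canal_path", 30),
]

def dwell_by_zone(obs):
    # Record the first and last time each track was seen in each zone.
    spans = {}
    for track, zone, ts in obs:
        key = (track, zone)
        first, last = spans.get(key, (ts, ts))
        spans[key] = (min(first, ts), max(last, ts))
    # Average the per-track dwell durations within each zone.
    totals = defaultdict(list)
    for (track, zone), (first, last) in spans.items():
        totals[zone].append(last - first)
    return {zone: sum(d) / len(d) for zone, d in totals.items()}

avg = dwell_by_zone(observations)
print(max(avg, key=avg.get))   # prints the zone with the longest average dwell
```

Slicing the same data by day of week or by event calendar (weekends, match days) would surface the seasonal and event-driven patterns the quote mentions.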
LLDC and its partners could use this data to create real-time operational benefits, such as changing hours of operation to reflect higher visitor numbers. More generally, she says, Fyma’s application to the park shows how any organization looking to use AI should be aware of important cultural considerations. Frost says:
With that comes all the questions of the unknown and ethics, risk appetite and uncertainty – and sometimes, just a lack of understanding. So there is a big cultural and educational element that must accompany the introduction of any technology project – and data projects, in particular.
LLDC uses Fyma’s data science capability to help understand the information that is captured. Longer term, Frost says LLDC will need to think carefully about how it partners with other organizations to access data talent. The organization may even need to develop expertise internally. What’s crucial to recognise, says Frost, is that the data-driven work she’s now undertaking is part of a much larger digital innovation strategy:
The park was built with incredible digital connectivity. Now we’re really trying to figure out what we need to put in place in terms of learning, culture, and managing that infrastructure.