How F1 harnessed AWS cloud computing to drive competition

Oracle Red Bull Racing driver Max Verstappen performs a victory donut on track, surrounded by tyre smoke, as fireworks go off in the night sky above.
(Image credit: Getty Images)

How do you make Formula One (F1) more exciting? Few people will spend two hours on a given Sunday watching cars file around a track in procession, least of all the millions of new fans drawn in by deftly edited documentaries and highlight clips, which show only the tensest moments of races and skip the routine running in between.

Ahead of the 2022 season, the F1 authorities decided to change the rules, turning to cloud computing services to inject a more competitive atmosphere into races.

F1 enforced modifications to the design of the cars that would make it easier to overtake, to boost the overall excitement of races. Its engineers and experts had a good idea about what modifications would result in more overtaking but needed evidence to prove it. 

Physical testing is wildly expensive and terrible for the environment; wind tunnels are limited in size and scope, especially when you’re measuring overtaking, which necessarily requires two cars. Instead, F1 turned to its partners at cloud computing giant AWS to see if software simulations could provide the answers.

Racing at the edge

F1 teams each design their own cars but must do so within the very tight confines of the F1 regulations. These stipulate how big certain parts must be, or put constraints on the design of different components. In order to meet these requirements while still achieving a competitive edge, teams have to collect and analyze big data.

Data has often been described as the new oil, and this is true even in a sport like F1. Rob Smedley, former Ferrari race engineer and now an F1 consultant working alongside AWS, compares contemporary F1 with when he joined in the 1990s, joking “we still had abacuses”.

“When I first came in, there were probably 20 sensors on the car,” says Smedley. “Fast forward not many years after – in fact, above five years – and there’s probably something like 100 sensors on the car and like 1,000 or 1,500 data channels.

“If you look at today’s cars, in test mode, they might have 500 sensors on the car, to just keep extracting more and more data.”

Smedley says his generation of engineers pushed the use of data and edge computing sensors to “drive better decisions” for two reasons: “one, to design fundamentally faster racing cars and then, two, to optimize those racing cars for any given track or condition”.

“We started to use much heavier deployment of data analytics, we were very early adopters of neural networks and subsequently machine learning. Given the complexity and how wide the problems are, it’s just kind of gone on from there,” he says.

So, with no shortage of data to fall back on, F1 bosses turned to the cloud to help them rewrite the rulebook.

Looking to the cloud for aerodynamics

When it comes to testing F1 cars there are several options, but none of them are cheap. Dr Neil Ashton, principal computational fluid dynamics architect at AWS, says the “old-school way of doing it” is the physical test, where you drive a car around a circuit lots of times to collect data.

“Formula 1 recognized that one, that was costing a lot of money and two, it’s not really environmentally friendly to be doing that sort of testing. So, physical testing from a Formula 1 point of view has long been reduced.”

Some alternatives don’t impact sustainability initiatives as much, such as wind tunnels, which blow air over cars to test the aerodynamics of new designs. Ashton says NASA had one facility where you could fit a whole aircraft inside the wind tunnel, but “it requires the power of a small city”. Although the F1 tunnels aren’t that power-hungry, they’re still expensive to construct and limited in space, which isn’t ideal when you’re trying to measure the impact of changes on overtaking.

So, with both traditional testing methods proving expensive and ill-suited to design testing, the new designs had to be simulated in software via digital twins of the cars. But why go for the cloud instead of a dedicated supercomputer? “They [F1] did have access to computing power through an on-premise, more classical ‘supercomputer’, but that had a fixed capacity, which meant that simulations were taking about 60 hours,” says Ashton.

With a supercomputer taking up to three days to run a simulation, and only having the capacity to run one simulation at a time, “it really did limit their capacity to go through many, many different designs,” says Ashton. 

“Whereas the whole premise of cloud computing is you get access to this virtually unlimited capacity, where the limitation is really on the individual to produce the ideas rather than the compute.”

Running the new car designs through a simulation on AWS’s cloud infrastructure instead of the supercomputer reduced the time of each run from 60 hours down to 12. And the cloud could handle multiple runs simultaneously. “That’s the difference between getting something done on the same day versus two or three days,” says Ashton. “You go into the office, you run something, you get it back by the end of the day.”

The F1 engineering team originally planned to run 20-30 simulations a week but was able to boost that to 80-90 tests with AWS’s cloud infrastructure. With a limited window in which to rewrite the rulebook, the cloud made a much bigger set of changes possible.  
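
As a rough illustration of how those numbers combine, here is a back-of-envelope sketch in Python using the figures quoted above. Treating the fixed-capacity machine as running one 60-hour job at a time follows Ashton’s description; the figure of six concurrent 12-hour cloud runs is an assumption chosen simply to land near the 80-90 simulations a week the team reported.

```python
# Back-of-envelope throughput comparison using the figures quoted in the
# article. The concurrency assumptions are illustrative, not confirmed by F1.

HOURS_PER_WEEK = 7 * 24

def weekly_runs(run_hours: float, concurrent_runs: int) -> float:
    """Rough number of simulations completed per week."""
    return (HOURS_PER_WEEK / run_hours) * concurrent_runs

# Fixed-capacity supercomputer: ~60 hours per run, one run at a time.
on_prem = weekly_runs(run_hours=60, concurrent_runs=1)

# Cloud: ~12 hours per run; six concurrent runs (assumed) gets close to
# the 80-90 simulations a week reported above.
cloud = weekly_runs(run_hours=12, concurrent_runs=6)

print(f"Fixed-capacity machine: ~{on_prem:.0f} runs/week")
print(f"Cloud, six concurrent runs: ~{cloud:.0f} runs/week")
```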

Scalable compute for an ever-changing sport

Running simulations on AWS might be cheaper and faster than using a wind tunnel, but is it as accurate? Neil Ashton claims it’s very close. “It depends very much on the specific simulation that’s being done,” he says. 

“It can be within a few percent, so 1 or 2% difference. But it could be worse than that, or better than that, depending on the specific technique that is used.”

However, the more computing power that’s thrown at the problem, the better the results. “Essentially, there’s a linear relationship between the accuracy of the simulation and the compute power required,” Ashton adds. 

“That’s why so many companies are moving to the cloud, because they see that if they want to get their simulations to be more accurate, they need more compute. If you want to close that gap from 2% to 1% to 0.5%, it is increasingly going to be about more compute. That’s the direction of travel across many different industries.”
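
Taken at face value, one simple reading of that relationship is that the compute required grows in proportion to the reduction in error, so halving the error roughly doubles the compute. The short sketch below works through the 2% to 1% to 0.5% example on that assumption; the scaling rule and the baseline are purely illustrative, not a model published by AWS or F1.

```python
# Illustrative only: one reading of a "linear" accuracy/compute relationship,
# where compute scales in proportion to the reduction in simulation error.
# The baseline is an arbitrary run assumed to achieve roughly 2% error.

def relative_compute(target_error_pct: float, baseline_error_pct: float = 2.0) -> float:
    """Compute needed relative to the ~2%-error baseline run (assumed scaling)."""
    return baseline_error_pct / target_error_pct

for err in (2.0, 1.0, 0.5):
    print(f"target error {err:.1f}% -> ~{relative_compute(err):.0f}x baseline compute")
```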

F1 drivers Lewis Hamilton, Max Verstappen, and George Russell stand on the podium in front of AWS branding at the Formula 1 Gran Premio de España 2023.

(Image credit: Getty Images)

Aside from adding multiple cars to the scenarios, there are other things you can simulate with software such as the impact of crosswinds, temperature differences, or adverse weather conditions. Simulations also let the testers model the fluid and structural motion of the car together, “so you can really start to get into the physics,” says Ashton. 

Finally, once all the simulations are run and results are collected, “that mathematical model can essentially go inside your racing game”. That is, the car simulators F1 drivers train with. 

“When they give that test driver the steering wheel, that person will have pretty close to a feel of what the car would be like so that when a team says ‘we have a new part that we’d like to test,’ they will actually test it in the simulator first to see if it gives an advantage. If it gives an advantage in the simulator, then they may go and physically build it and send it to the track.”

Pairing hardware and software for live results

The goal of the 2022 rule changes was to create a car with a much smaller ‘wake’, reducing the turbulence experienced by chasing cars. To that end, wheels were fitted with wake control devices and the rear wing was redesigned to expel air above the car following behind. All of this was simulated over six months, generating 550 million individual data points, before the rules were finalized and teams could get on with designing their cars.

Rules were tweaked again at the start of the 2023 season, and will continue to be for years to come as F1 looks to get the formula right. Mercedes driver George Russell remarked recently: “The sport took a really good turn for the better when these new cars were introduced [in 2022], but we need to take it to the next step now.”

Neil Ashton thinks that’s possible with ever-more cloud computing power coming online with each passing year. “If you could do more and more compute, you could design the cars more efficiently,” he says. “That would create even more exciting cars, that would create even more exciting racing. I think we’re in an inflection point now where it could really do something great for the sport.”

When it comes to practices and race day, a whole industry kicks into play. Huge trucks arrive at each location, forming temporary HQs for each team known collectively as the paddock.

Tucked discreetly behind the paddock is F1’s Event Technical Centre (ETC), a suite of temporary rooms packed with monitors, workstations, and serious-looking technicians. In terms of data, this is ground zero for each race. Every byte of information from the dozens of sensors on each car is tracked here, alongside the multiple 4K video streams of the cars themselves that eventually appear on our screens.

While the coverage you see on TV is topped and tailed by the hundreds of broadcasters who beam races around the world, they all rely on a single source: F1 itself. It has total control of the broadcast, acting as a TV production company as well as the organizer of the event. F1 also fully controls the technical services, from the timing that decides who wins and who loses to the bottomless pit of telemetry (rev counts, steering angle, G-forces) produced by internet of things (IoT) sensors on the cars. Viewers eventually see these stats on their screens.
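
As a concrete (and entirely hypothetical) illustration, a single telemetry sample of the kind described above might be represented as a structured record like the one below before it reaches the graphics pipeline. The field names, units, and JSON layout are assumptions made for the sake of the example, not F1’s actual channel format.

```python
# Hypothetical sketch of one car telemetry sample (revs, steering angle,
# G-forces, as mentioned above). Field names and units are assumed.
import json
from dataclasses import dataclass, asdict

@dataclass
class TelemetrySample:
    car_number: int
    lap: int
    timestamp_ms: int        # milliseconds since session start (assumed)
    engine_rpm: int
    steering_angle_deg: float
    lateral_g: float
    longitudinal_g: float

sample = TelemetrySample(
    car_number=63, lap=12, timestamp_ms=2_345_678,
    engine_rpm=11_500, steering_angle_deg=-14.2,
    lateral_g=3.8, longitudinal_g=-1.1,
)

# Serialise for downstream consumers such as the on-screen graphics.
print(json.dumps(asdict(sample)))
```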

This process is managed by a comparatively small team, more akin to a small to medium-sized business (SMB) with a few hundred employees than the global megabrand one might picture when presented with the name ‘Formula 1’.

“We’re not a massive organization,” says Pete Samara, director of innovation and digital technology at Formula 1, speaking on the practice day before this year’s Silverstone race. “We’re an SME effectively, and efficiency is really important.”

At its most basic, that means buying hardware that will keep working (“that we don’t need to switch on and off”) and a system that allows staff to order whatever they need to do their job. “We’ve got engineers, editors, broadcast graphics operators, computer-aided design (CAD) designers that are 3D-forming this building. So all types of user profiles.”

For the past few years, F1 has chosen Lenovo as its hardware partner, and that partnership extends to its data center. Or, strictly speaking, two data centers. The control center that pops up at each race is merely a satellite compared to F1’s Media and Technology Centre in Biggin Hill.

F1 driver Sergio Perez of Oracle Red Bull Racing sits with his team in the pit, looking at data on laptops.

(Image credit: Getty Images)

Processing all the data captured at each event – around 500TB of media in the course of a 90-minute race – necessitates more compute power than the modest hardware contained in the ETC can provide. For this, the data is beamed back to F1’s Biggin Hill base, where it is processed in real time and transmitted to broadcasters.
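
A quick back-of-envelope calculation gives a sense of what that volume implies. Assuming decimal terabytes and an even spread of data across the 90 minutes (both simplifications), 500TB works out at a sustained rate of roughly 740Gbit/s:

```python
# Back-of-envelope: sustained data rate implied by ~500TB over a 90-minute
# race, assuming decimal terabytes and an even spread (simplifications).

volume_bits = 500e12 * 8      # ~500TB expressed in bits
duration_s = 90 * 60          # 90 minutes in seconds

rate_gbps = volume_bits / duration_s / 1e9
print(f"Average rate: ~{rate_gbps:.0f} Gbit/s")   # roughly 740 Gbit/s
```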

F1 also has big plans to enhance the quality of the data it provides to broadcasters in the coming seasons. Rob Smedley says he’s spent some time with the football Premier League broadcasters, watching how they cover live matches. “You’ll have your data scientists in the background saying ‘right, here are some really interesting stats’, and then you’ll get the pundits, who are actually engaging directly off-camera with the data science team… and they feed that directly to the audience.”

“We need to get to that [in F1],” says Smedley. “There’s technology we’re trying to build with AWS at the minute which will eventually get us to that. It will be more virtual, it will be more artificial intelligence (AI) driven, because you’ve got a hundred different broadcasters with a hundred different commentators, all maybe at that point telling a hundred disparate stories. So in order to get in sync, we’ve got to use the data, but we’ve also got to use technology.”

This content originally appeared on ITPro's sibling magazine PC Pro. For more information and to subscribe, please visit PC Pro's subscription site.

Barry Collins

Barry Collins is an experienced IT journalist who specialises in Windows, Mac, broadband and more. He's a former editor of PC Pro magazine, and has contributed to many national newspapers, magazines and websites in a career that has spanned over 20 years. You may have seen Barry as a tech pundit on television and radio, including BBC Newsnight, the Chris Evans Show and ITN News at Ten.