MIT researchers hail "liquid" algorithm breakthrough

A set of differential equations at the base of an algorithm could lead to a more adaptable type of machine learning

Researchers at MIT say they have developed a flexible algorithm that can change its underlying equations to continuously adapt to new inputs of data.

The "liquid" algorithm is said to be a new type of neural network that learns during tasks rather than just in its initial training phase.

It's hoped this new approach could revolutionise technology that relies on decision-making where data changes over time or environments are unpredictable, such as medical diagnosis or autonomous driving.

The research will be presented at the AAAI Conference, an artificial intelligence event taking place in Vancouver, Canada, in February.

"This is a way forward for the future of robot control, natural language processing, video processing - any form of time series data processing," says Ramin Hasani, the study's lead author. "The potential is really significant."

Most neural networks exhibit fixed behaviour after training and typically adjust poorly to changes in incoming data streams. For example, the 2018 crash of an Uber autonomous vehicle that killed pedestrian Elaine Herzberg, considered the first fatality involving the technology, was reportedly caused by the system failing to recognise a pedestrian who was walking alongside a bicycle.

The neural network designed by Hasani has the potential to avoid these issues by using a set of differential equations as the base of its algorithm, creating a more fluid type of machine learning. The idea is inspired by the microscopic nematode Caenorhabditis elegans (C. elegans), which has only 302 neurons in its nervous system yet, Hasani said, can still "generate unexpectedly complex dynamics".

Similarly, Hasani and his team used equations that allow the parameters of their neural network to change over time. These are essentially a nested set of differential equations that change the representation of each neuron, producing a small number of highly "expressive" ones, according to Hasani.
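To make the idea concrete, below is a minimal sketch in Python of a liquid time-constant style update, assuming the ODE form described in Hasani's published work, in which a small learned function modulates each neuron's effective time constant as new inputs arrive. It is an illustration under those assumptions, not the team's implementation, and all names, sizes and constants are hypothetical.

# Minimal, illustrative sketch of a "liquid" (liquid time-constant) neuron
# update; not the MIT team's code. Assumes the ODE form
#   dx/dt = -(1/tau + f(x, u)) * x + f(x, u) * A
# where f is a small learned network whose output reshapes each neuron's
# effective time constant as inputs arrive. All names and sizes are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
N_NEURONS, N_INPUTS = 8, 3

tau = rng.uniform(0.5, 2.0, N_NEURONS)                  # base time constants
A = rng.standard_normal(N_NEURONS)                       # per-neuron bias term
W_x = 0.1 * rng.standard_normal((N_NEURONS, N_NEURONS))  # recurrent weights
W_u = 0.1 * rng.standard_normal((N_NEURONS, N_INPUTS))   # input weights
b = np.zeros(N_NEURONS)

def f(x, u):
    """Bounded nonlinearity whose output modulates the time constants."""
    return 1.0 / (1.0 + np.exp(-(W_x @ x + W_u @ u + b)))

def ltc_step(x, u, dt=0.05):
    """One fused semi-implicit Euler step of the liquid ODE."""
    fx = f(x, u)
    return (x + dt * fx * A) / (1.0 + dt * (1.0 / tau + fx))

# Toy input stream: the hidden state keeps adapting because the input
# changes the dynamics themselves, not just the state, at every step.
x = np.zeros(N_NEURONS)
for t in range(100):
    u = np.sin(0.1 * t) * np.ones(N_INPUTS)
    x = ltc_step(x, u)
print(np.round(x, 3))

In a conventional recurrent network only the state changes between steps; in this sketch the input also reshapes the dynamics through f, which is the property the researchers describe as allowing the network to keep adapting after its initial training.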

"We have a provably more expressive neural network that is inspired by nature," Hasani said. "But this is just the beginning of the process. The obvious question is how do you extend this? We think this kind of network could be a key element of future intelligence systems."
