Researchers from the University of Liège (Belgium) recently developed a new artificial neuron inspired by the different operating modes of human neurons. Called the bistable recurrent cell (BRC), this new neuron allows recurrent networks to learn temporal relationships spanning over a thousand discrete time steps, where classical methods fail after only about one hundred. These results are published in the journal PLOS ONE.
The surge of interest in artificial intelligence (AI) in recent years has led to the development of extremely powerful machine learning techniques. Time series, for example – any data series with a time component, such as stock prices, weather records, or electroencephalograms – are extremely common and of great interest because of their wide range of applications. Time series analysis is a type of task for which machine learning techniques are particularly well suited, allowing future events to be predicted from past ones. Given the diversity of potential applications, it is no surprise that processing such data with AI algorithms has become very popular in recent years.
A particular type of artificial neural network, the recurrent neural network (RNN), has been developed in recent years with a memory that lets the network retain information over time in order to correctly process a time series. Whenever new data arrive, the network updates its memory to retain this new information. Despite these developments, such networks remain difficult to train and their memory span is limited. "One can imagine the example of a network that receives new information every day," explains Nicolas Vecoven, doctoral student in the Systems and Modeling laboratory of the University of Liège and first author of the study, "but by the fiftieth day, we realize that the information from the first day has already been forgotten."
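To make this forgetting concrete, here is a minimal sketch (not the authors' code) of a vanilla recurrent update: the hidden state is the network's memory, and it is partially overwritten at every step, which is why information from early steps fades. All names and dimensions below are illustrative.

```python
import numpy as np

def rnn_step(h_prev, x, W_h, W_x, b):
    """One update of a vanilla recurrent cell: the new hidden state (the
    network's memory) mixes the previous state with the incoming input."""
    return np.tanh(W_h @ h_prev + W_x @ x + b)

# Toy sizes, chosen arbitrarily for illustration.
rng = np.random.default_rng(0)
hidden_size, input_size = 8, 4
W_h = rng.normal(scale=0.5, size=(hidden_size, hidden_size))
W_x = rng.normal(scale=0.5, size=(hidden_size, input_size))
b = np.zeros(hidden_size)

h = np.zeros(hidden_size)
for day in range(100):                 # feed 100 "days" of new data
    x = rng.normal(size=input_size)
    h = rnn_step(h, x, W_h, W_x, b)    # the memory is partly overwritten at every step
```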
"Human neurons, however, are capable of retaining information over an almost infinite period of time thanks to a bistability mechanism. This mechanism allows a neuron to stabilize in one of two different states, depending on the history of the electric currents it has received, and to remain there indefinitely. In other words, thanks to this mechanism, human neurons can retain one bit (a binary value) of information for an unlimited time," Nicolas Vecoven explains.

Building on this bistability, Nicolas Vecoven and his colleagues Damien Ernst (a specialist in AI) and Guillaume Drion (a specialist in neuroscience) from ULiège built a new artificial neuron with this same mechanism and integrated it into recurrent artificial networks. Called a bistable recurrent cell (BRC), this new artificial neuron has allowed recurrent networks to learn temporal relationships of more than 1,000 time steps, where classical methods failed after only about 100 time steps. These important and promising results have been published in the journal PLOS ONE. The three researchers are pursuing their work in this field and are developing further techniques to improve the memory of RNNs by promoting the emergence of equilibrium points within them.
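The bistability principle itself can be illustrated in a few lines of code. The sketch below is not the BRC architecture from the paper; it only shows how a self-feedback gain greater than one gives a neuron two stable resting states, so that where it settles depends on its past inputs – the property the BRC builds into a recurrent cell.

```python
import numpy as np

def settle(h0, a, steps=50):
    """Iterate the self-feedback map h -> tanh(a * h).
    With gain a > 1 the map has two stable fixed points, so the state the
    neuron settles into depends on its history; with a < 1 it always decays
    back to zero (monostable)."""
    h = h0
    for _ in range(steps):
        h = np.tanh(a * h)
    return h

# A small positive or negative "kick" earlier in time decides which of the
# two stable states a bistable neuron (a = 1.5) latches onto:
print(settle(+0.1, a=1.5))   # settles near +0.86
print(settle(-0.1, a=1.5))   # settles near -0.86
print(settle(+0.1, a=0.8))   # monostable neuron: the trace decays toward 0
```

Roughly speaking, the trick in a trainable bistable cell is to let the network itself adjust a feedback gain of this kind from the data, so that individual neurons can decide when to latch onto a value and hold it.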
Story source:
Materials provided by the University of Liège. Note: Content may be edited for style and length.