Computers are almost too good at remembering everything we teach them. Normally, that's a feature; you wouldn't want the systems that maintain your medical or financial records to start dropping random 1s and 0s (OK, well, maybe the one that tracks your credit card debt, but other than that). However, these systems generally don't discriminate between information sources: every bit of data is treated with the same rigor. And as the amount of available information grows, AI systems must spend more and more of their limited computing resources managing it. Facebook researchers hope to help future AIs handle that load by teaching them how to forget.
It's called Expire-Span, and it's designed to help neural networks more efficiently sort and store the information most relevant to their assigned tasks. Expire-Span works by first predicting which information will be most useful to the network in a given context, then assigning an expiration date to that data. The more important the information, the further out the expiration date, Angela Fan and Sainbayar Sukhbaatar, research scientists at FAIR, explained in a blog post. Neural networks can thus retain relevant information longer while continually freeing memory by "forgetting" irrelevant data points. Whenever new data is added, the system not only evaluates its relative importance, but also re-evaluates the importance of the existing data points related to it. This also helps the AI learn to use its available memory more efficiently, which leads to better scalability.
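The idea of importance-weighted expiration dates can be illustrated with a toy sketch. This is not FAIR's actual implementation; the class, the linear importance-to-lifetime rule, and all names here are invented for illustration:

```python
import heapq

class ExpiringMemory:
    """Toy memory mimicking the idea described above: each new item gets
    an importance-based expiration time, and expired items are dropped
    as time advances. Illustrative only, not Facebook's code."""

    def __init__(self):
        self.time = 0
        self.heap = []  # min-heap of (expiry_time, item)

    def add(self, item, importance):
        # Hypothetical rule: more important items are kept longer.
        expiry = self.time + int(10 * importance)
        heapq.heappush(self.heap, (expiry, item))

    def step(self):
        self.time += 1
        # "Forget" anything whose expiration date has passed.
        while self.heap and self.heap[0][0] <= self.time:
            heapq.heappop(self.heap)

    def contents(self):
        return [item for _, item in self.heap]

mem = ExpiringMemory()
mem.add("crucial fact", importance=1.0)  # expires at t=10
mem.add("trivia", importance=0.1)        # expires at t=1
for _ in range(5):
    mem.step()
print(mem.contents())  # the low-importance item has been forgotten
```

The real system, of course, operates on a neural network's hidden states rather than labeled strings, and learns the importance scores instead of receiving them as inputs.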
Forgetting, at least for AIs, is surprisingly difficult. Like the 1s and 0s that make up its code, an AI system either remembers a piece of information or it doesn't, and optimizing for an all-or-nothing binary system like that is hard. Previous attempts to work around this difficulty involved compressing less useful data so that it took up less memory space, but those efforts fell short because the compression process produces "fuzzy versions" of the information, according to Fan and Sukhbaatar.
"Expire-Span calculates the information's expiration value for each hidden state, each time a new piece of information is presented, and determines how long that information is preserved as a memory," they explained. "This gradual degradation of certain information is essential to retaining important information without blurring it. And the learning mechanism allows the model to adjust the span size as needed: Expire-Span calculates a prediction based on context learned from the data and influenced by its surrounding memories."
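The per-hidden-state calculation the researchers describe can be sketched numerically. Roughly, each hidden state h gets a predicted span e = L * sigmoid(w @ h + b) bounded by a maximum span L, and the retention mask ramps linearly from 1 to 0 over R steps so the forgetting stays differentiable. The parameter values below are random stand-ins for the learned weights, and the dimensions are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: d-dim hidden states, max span L, ramp length R.
d, L, R = 4, 16, 4
w, b = rng.normal(size=d), 0.0  # learned in the real model; random here

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def expire_span(h):
    """Predict how many steps hidden state h should stay in memory."""
    return L * sigmoid(w @ h + b)

def mask(e, age):
    """Soft retention mask: 1 while age < e, then ramping linearly
    down to 0 over R steps instead of cutting off abruptly."""
    return np.clip((e - age) / R + 1.0, 0.0, 1.0)

h = rng.normal(size=d)
e = expire_span(h)
for age in [0, int(e), int(e) + R]:
    print(f"age={age:2d}  mask={mask(e, age):.2f}")
```

The linear ramp is the key design choice: a hard 0/1 cutoff would give the network no gradient signal to learn from, while the ramp lets training adjust each span smoothly.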
Though the work is still in its early stages, the team is already looking ahead: "As the next step in our research toward more human-like AI systems, we're studying how to incorporate different types of memories into neural networks," the researchers wrote. In the future, they hope to develop an approximation even closer to human memory, one capable of learning new information much faster than current technology allows.