Geoffrey Hinton’s Forward-Forward Algorithm Charts a New Path for Neural Networks

2022-12-08 by Synced

There is increasing interest in whether the biological brain follows backpropagation or, as Hinton asks, whether it has some other way of getting the gradients needed to adjust the weights on its connections. In this regard, Hinton proposes the FF [Forward-Forward] algorithm as an alternative to backpropagation for neural network learning.

It aims to replace the forward and backward passes of backpropagation with two forward passes: a positive pass that operates on real data and adjusts weights “to improve the goodness in every hidden layer,” and a negative pass that operates on externally supplied or model-generated “negative data” and adjusts weights to decrease the goodness.

https://syncedreview.com/2022/12/08/geoffrey-hintons-forward-forward-algorithm-charts-a-new-path-for-neural-networks/

The Forward-Forward Algorithm: Some Preliminary Investigations, by Geoffrey Hinton
https://www.cs.toronto.edu/~hinton/FFA13.pdf

Using the Forward-Forward Algorithm for Image Classification
Includes sample Python code using the Keras Python library.
https://keras.io/examples/vision/forwardforward/
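For readers who want the core update rule without the full Keras example, here is a minimal sketch of a single FF layer in PyTorch. It follows the paper’s goodness measure (the sum of squared activities) and a logistic objective against a threshold; the threshold, learning rate, and layer sizes are illustrative assumptions, not values from the paper.

```python
import torch
import torch.nn.functional as F

class FFLayer(torch.nn.Module):
    """One Forward-Forward layer trained with a purely local objective."""

    def __init__(self, d_in, d_out, theta=2.0, lr=0.03):  # illustrative values
        super().__init__()
        self.linear = torch.nn.Linear(d_in, d_out)
        self.theta = theta  # goodness threshold
        self.opt = torch.optim.Adam(self.parameters(), lr=lr)

    def forward(self, x):
        # Normalize so only the direction of the input carries information;
        # otherwise the next layer could read goodness off the vector length.
        x = x / (x.norm(dim=1, keepdim=True) + 1e-8)
        return torch.relu(self.linear(x))

    def train_step(self, x_pos, x_neg):
        h_pos, h_neg = self.forward(x_pos), self.forward(x_neg)
        # Goodness = sum of squared activities in the layer.
        g_pos = h_pos.pow(2).sum(dim=1)
        g_neg = h_neg.pow(2).sum(dim=1)
        # Push positive goodness above theta and negative goodness below it.
        loss = F.softplus(torch.cat([self.theta - g_pos,
                                     g_neg - self.theta])).mean()
        self.opt.zero_grad(); loss.backward(); self.opt.step()
        # Detach outputs: no gradient flows between layers, so each layer
        # learns greedily from the two forward passes alone.
        return h_pos.detach(), h_neg.detach(), loss.item()
```

Stacking several such layers and feeding each layer’s detached, normalized outputs to the next reproduces the paper’s layer-wise training, with no backward pass through the network.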

Training Computer Vision Models on Random Noise Instead of Real Images

2021-12-09 Martin Anderson

Researchers from the MIT Computer Science & Artificial Intelligence Laboratory (CSAIL) have experimented with using random noise images to train computer vision models, and have found that instead of producing garbage, the method is surprisingly effective.

https://www.unite.ai/training-computer-vision-models-on-random-noise-instead-of-real-images/
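The CSAIL paper explores several procedural noise generators. As a rough illustration of the idea (not the authors’ exact generators), the sketch below produces spectrally shaped Gaussian noise images that could stand in for photographs during self-supervised pretraining:

```python
import numpy as np

def make_noise_image(size=64, rng=None):
    """One structured noise image: white noise shaped by a random
    1/f**alpha amplitude envelope (an illustrative stand-in for the
    procedural noise processes studied in the paper)."""
    rng = rng or np.random.default_rng()
    alpha = rng.uniform(0.5, 2.5)            # random spectral slope
    noise = rng.standard_normal((size, size))
    f = np.fft.fftfreq(size)
    radius = np.sqrt(f[:, None] ** 2 + f[None, :] ** 2)
    radius[0, 0] = 1.0                        # avoid divide-by-zero at DC
    spectrum = np.fft.fft2(noise) / radius ** alpha
    img = np.real(np.fft.ifft2(spectrum))
    # Rescale to [0, 1] so the result can stand in for a pixel image.
    return (img - img.min()) / (img.max() - img.min() + 1e-8)

# A stand-in "dataset" of noise images for pretraining a feature extractor.
dataset = np.stack([make_noise_image() for _ in range(256)])
```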

Memristor Breakthrough: First Single Device To Act Like a Neuron

2020-09-01 Samuel K. Moore

It combines resistance, capacitance, and what’s called a Mott memristor all in the same device. Memristors are devices that hold a memory, in the form of resistance, of the current that has flowed through them. Mott memristors have an added ability in that they can also reflect a temperature-driven change in resistance. Materials in a Mott transition go between insulating and conducting according to their temperature. It’s a property seen since the 1960s, but only recently explored in nanoscale devices.

The transition happens in a nanoscale sliver of niobium oxide in the memristor. Here, when a DC voltage is applied, the NbO2 heats up slightly, causing it to transition from insulating to conducting. Once that switch happens, the charge built up in the capacitance pours through. Then the device cools just enough to trigger the transition back to insulating. The result is a spike of current that resembles a neuron’s action potential.

https://spectrum.ieee.org/nanoclast/semiconductors/devices/memristor-first-single-device-to-act-like-a-neuron

Also: https://www.nature.com/articles/s41586-020-2735-5
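
To make the charge-heat-discharge-cool cycle concrete, here is a dimensionless toy simulation of such a relaxation oscillator. All constants are illustrative and are not fitted to the device reported in the Nature paper:

```python
# Toy model: a capacitor charges through a load while leakage current
# Joule-heats the NbO2 element; crossing the Mott transition makes it
# conducting, the stored charge dumps through it as a current spike,
# the element cools, turns insulating again, and the cycle repeats.
V_DC = 1.0                      # applied DC drive
R_INS, R_COND = 10.0, 0.01      # element resistance: insulating vs. conducting
T_UP, T_DOWN = 0.5, 0.2         # transition thresholds (hysteretic)
HEAT = 10.0                     # Joule-heating gain
dt, steps = 0.005, 10_000

v, temp, conducting = 0.0, 0.0, False
trace = []
for _ in range(steps):
    r = R_COND if conducting else R_INS
    i = v / r                            # current through the Mott element
    v += dt * (V_DC - v - i)             # charge from source, discharge via element
    temp += dt * (HEAT * i * v - temp)   # Joule heating minus Newton cooling
    if temp > T_UP:
        conducting = True                # insulator -> metal
    elif temp < T_DOWN:
        conducting = False               # metal -> insulator
    trace.append(i)
# `trace` now shows periodic current spikes resembling action potentials.
```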

Artificial brains may need sleep too

2020-06-08 James Riordon

States that resemble sleep-like cycles in simulated neural networks quell the instability that comes with uninterrupted self-learning in artificial analogs of brains.

Watkins and her research team found that the network simulations became unstable after continuous periods of unsupervised learning. When they exposed the networks to states that are analogous to the waves that living brains experience during sleep, stability was restored. “It was as though we were giving the neural networks the equivalent of a good night’s rest,” said Watkins.

https://www.lanl.gov/discover/news-release-archive/2020/June/0608-artificial-brains.php
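In code terms, the remedy amounts to periodically driving the network with noise whose statistics resemble slow-wave sleep while learning continues. The sketch below uses a generic Hebbian update as a stand-in (the LANL team worked with spiking networks; the update rule, noise scale, and sleep schedule here are all assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def hebbian_step(W, x, lr=0.01):
    """One unsupervised Hebbian update with mild weight decay (sketch)."""
    y = np.tanh(W @ x)
    W += lr * (np.outer(y, x) - 0.001 * W)
    return W

W = rng.standard_normal((32, 64)) * 0.1
data = rng.random((1000, 64))          # stand-in "waking" inputs

for step, x in enumerate(data):
    W = hebbian_step(W, x)             # ordinary self-learning on data
    if step % 100 == 99:
        # "Sleep" phase: drive the network with Gaussian noise, analogous
        # to the slow-wave-like states described in the article.
        for _ in range(20):
            W = hebbian_step(W, rng.standard_normal(64) * 0.5)
```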

Spiking Neural Networks

2020-02-17 Martijn van Wezel

SNNs are bio-inspired neural networks that differ from conventional neural networks: where conventional networks communicate with numbers, SNNs communicate through spikes. … Receiving multiple spikes in a short period can stimulate the neuron to fire. However, if the gaps between spikes are too big, the neuron loses interest and goes to sleep again.

… one major benefit of Spiking Neural Networks is power consumption. A ‘normal’ neural network runs on big GPUs or CPUs that draw hundreds of watts of power; an SNN of the same network size uses just a few nanowatts.

https://martijnvwezel.com/blogs/spiking_neural_networks/
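The behavior described above, in which closely spaced spikes accumulate while widely spaced ones leak away, is the classic leaky integrate-and-fire neuron. A minimal sketch, with illustrative time constants and weights:

```python
def lif(spike_times, sim_steps=100, dt=1.0, tau=10.0,
        weight=0.6, threshold=1.0):
    """Leaky integrate-and-fire neuron; returns output spike times."""
    v, out, inputs = 0.0, [], set(spike_times)
    for t in range(sim_steps):
        v += dt * (-v / tau)   # membrane potential leaks toward rest
        if t in inputs:
            v += weight        # each input spike bumps the potential
        if v >= threshold:
            out.append(t)      # fire...
            v = 0.0            # ...and reset
    return out

print(lif([10, 12, 14]))  # closely spaced spikes -> the neuron fires
print(lif([10, 40, 70]))  # widely spaced spikes leak away -> no output
```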

Machine Learning Takes On Antibiotic Resistance

2020-03-09 Katherine Harmon Courage

In the February 20 issue of Cell, one team of scientists announced that they — and a powerful deep learning algorithm — had found a totally new antibiotic, one with an unconventional mechanism of action that allows it to fight infections that are resistant to multiple drugs. The compound was hiding in plain sight (as a possible diabetes treatment) because humans didn’t know what to look for. …

Collins, Barzilay and their team trained their network to look for any compound that would inhibit the growth of the bacterium Escherichia coli. They did so by presenting the system with a database of more than 2,300 chemical compounds that had known molecular structures and were classified as “hits” or “non-hits” on tests of their ability to inhibit the growth of E. coli. From that data, the neural net learned what atom arrangements and bond structures were common to the molecules that counted as hits. …

The researchers … also trained the algorithm to predict the toxicity of compounds and to weed out candidate molecules on that basis. …

They then turned the trained network loose on the Drug Repurposing Hub, a library of more than 6,000 compounds that are already being vetted for use in humans for a wide variety of conditions.

https://www.quantamagazine.org/machine-learning-takes-on-antibiotic-resistance-20200309/
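
As a rough stand-in for this pipeline: the study itself used a directed message-passing neural network (released as Chemprop), so the sketch below substitutes Morgan fingerprints and a random forest, with placeholder SMILES strings and labels rather than the paper’s data.

```python
import numpy as np
from rdkit import Chem
from rdkit.Chem import AllChem
from sklearn.ensemble import RandomForestClassifier

def featurize(smiles, n_bits=2048):
    """Morgan (circular) fingerprint as a bit vector."""
    mol = Chem.MolFromSmiles(smiles)
    fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=n_bits)
    return np.array(list(fp))

# Placeholder training set: molecules labeled as growth-inhibition
# hits (1) or non-hits (0). The real screen used ~2,300 compounds.
train_smiles = ["CCO", "c1ccccc1O", "CC(=O)Nc1ccc(O)cc1", "CCN(CC)CC"]
train_labels = [0, 0, 1, 0]

X = np.stack([featurize(s) for s in train_smiles])
model = RandomForestClassifier(n_estimators=200).fit(X, train_labels)

# Rank a screening library (e.g. the Drug Repurposing Hub) by predicted
# probability of inhibiting E. coli growth; placeholder molecules again.
library = ["O=C(O)c1ccccc1", "CC(C)Cc1ccc(cc1)C(C)C(=O)O"]
scores = model.predict_proba(np.stack([featurize(s) for s in library]))[:, 1]
```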