2020-06-08 Chris Lee
Hydrogen, in low concentrations in a confined space, burns with a branching network instead of with a wide flame.
https://arstechnica.com/science/2020/06/fractal-flaming-hydrogen-wiggles-through-tiny-gaps/
2020-05-29 Blender Animation Studio
Fueled by caffeine, a young woman runs through the bittersweet memories of her past relationship. Get the production files, assets, tutorials and exclusive making-of videos by joining Blender Cloud at https://cloud.blender.org/p/coffee-run
https://www.youtube.com/watch?v=PVGeM40dABA
Discovered from https://www.blendernation.com/2020/05/29/coffee-run-blender-open-movie/
2019-12-16 Elena Renken
Scientists are beginning to understand one of the ways in which sleep may benefit the health of the brain: by organizing the flow of fluids that can wash away harmful build-ups of proteins and wastes around neurons.
https://www.quantamagazine.org/sleeping-brain-waves-draw-a-healthy-bath-for-neurons-20191216/
“The trails that we see are actually casts, corresponding to the base of the sauropods’ feet,” Moreau explains. “It’s as if you were looking at the tracks left by the dinosaurs from below. What happened is that infiltrated water eroded the rock beneath the sedimentary layer which contains the footprints.”
https://scienceblog.com/516341/dinosaur-footprints-on-a-cave-ceiling/
2020-04-30 Kim Martineau
They showed that a deep neural network could perform with only one-tenth the number of connections if the right subnetwork was found early in training.
Train the model, prune its weakest connections, retrain the model at its fast, early training rate, and repeat, until the model is as tiny as you want.
https://news.mit.edu/2020/foolproof-way-shrink-deep-learning-models-0430
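The train / prune / rewind-and-retrain loop described above can be sketched on a toy problem. This is a minimal, hedged illustration in NumPy, not the paper's actual method (which operates on deep networks); the model here is plain linear regression, and all names, rates, and pruning fractions are illustrative assumptions.

```python
import numpy as np

# Toy setup: a 20-weight linear "network" where only 3 weights matter,
# so most connections can be pruned without hurting the fit.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))
true_w = np.zeros(20)
true_w[:3] = [2.0, -3.0, 1.5]
y = X @ true_w + 0.01 * rng.normal(size=200)

def train(w, mask, steps=300, lr=0.05):
    """Gradient-descent training; pruned (masked) weights stay frozen at 0."""
    for _ in range(steps):
        grad = X.T @ (X @ (w * mask) - y) / len(y)
        w = w - lr * grad * mask
    return w

w_init = rng.normal(size=20) * 0.1   # the "early" weights we rewind to
mask = np.ones(20)
w = w_init.copy()

for _ in range(4):
    w = train(w, mask)                               # 1. train the model
    alive = np.flatnonzero(mask)
    k = max(1, len(alive) // 4)                      # 2. prune the weakest 25%
    cut = alive[np.argsort(np.abs(w[alive]))[:k]]    #    (smallest magnitude)
    mask[cut] = 0.0
    w = w_init * mask                                # 3. rewind survivors to
                                                     #    their early values
w = train(w, mask)                                   # final retrain, sparse

sparsity = 1 - mask.mean()
final_loss = np.mean((X @ (w * mask) - y) ** 2)
```

Because the large-magnitude weights are the ones carrying the signal, the surviving subnetwork (here about a third of the original connections) fits nearly as well as the dense model.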
2020-02-17 Martijn van Wezel
SNNs are bio-inspired neural networks that differ from conventional neural networks in how they communicate: conventional networks pass numbers, while SNNs communicate through spikes. … Multiple spikes arriving within a short period can stimulate the neuron to fire. However, if the gaps between spikes are too big, the neuron loses interest and goes back to sleep.
… one major benefit of spiking neural networks is power consumption. A ‘normal’ neural network runs on big GPUs or CPUs that draw hundreds of watts of power. For the same network size, an SNN uses just a few nanowatts.
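The fire-or-lose-interest behavior described above is the leaky integrate-and-fire model; a minimal sketch, with illustrative decay, weight, and threshold values not taken from the article:

```python
def lif(spike_times, t_max=100, decay=0.9, weight=0.4, threshold=1.0):
    """Leaky integrate-and-fire neuron over discrete time steps.

    Incoming spikes bump the membrane potential; between spikes the
    potential leaks away ("loses interest"). The neuron emits a spike
    only when enough inputs arrive close together.
    """
    v, out, spikes = 0.0, [], set(spike_times)
    for t in range(t_max):
        v *= decay              # leak: potential decays each step
        if t in spikes:
            v += weight         # each incoming spike raises the potential
        if v >= threshold:
            out.append(t)       # fire an output spike...
            v = 0.0             # ...and reset
    return out

burst = lif([10, 11, 12])       # three spikes in quick succession -> fires
sparse = lif([10, 40, 70])      # spread-out spikes leak away -> never fires
```

The event-driven nature of this computation — nothing happens unless a spike arrives — is where the power savings of neuromorphic hardware come from.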
2020-03-20
Alcohol-based disinfectants are also effective, but soap is a highly efficient way of killing the virus when it’s on your skin
… Health authorities have been giving us two messages: once you have the virus there are no drugs that can kill it or help you get rid of it. But also, wash your hands to stop the virus spreading. This seems odd. You can’t, even for a million dollars, get a drug for the coronavirus – but your grandmother’s bar of soap kills the virus.
So why does soap work so well on SARS-CoV-2, the coronavirus and indeed most viruses? The short story: because the virus is a self-assembled nanoparticle in which the weakest link is the lipid (fatty) bilayer. Soap dissolves the fat membrane and the virus falls apart like a house of cards and dies – or rather, we should say it becomes inactive as viruses aren’t really alive.
2008-10 Paul Graham
… investors tend to be less willing to invest in bad times. They shouldn’t be. Everyone knows you’re supposed to buy when times are bad and sell when times are good.
2020-03-09 Katherine Harmon Courage
In the February 20 issue of Cell, one team of scientists announced that they — and a powerful deep learning algorithm — had found a totally new antibiotic, one with an unconventional mechanism of action that allows it to fight infections that are resistant to multiple drugs. The compound was hiding in plain sight (as a possible diabetes treatment) because humans didn’t know what to look for. …
Collins, Barzilay and their team trained their network to look for any compound that would inhibit the growth of the bacterium Escherichia coli. They did so by presenting the system with a database of more than 2,300 chemical compounds that had known molecular structures and were classified as “hits” or “non-hits” on tests of their ability to inhibit the growth of E. coli. From that data, the neural net learned what atom arrangements and bond structures were common to the molecules that counted as hits. …
The researchers … also trained the algorithm to predict the toxicity of compounds and to weed out candidate molecules on that basis. …
They then turned the trained network loose on the Drug Repurposing Hub, a library of more than 6,000 compounds that are already being vetted for use in humans for a wide variety of conditions.
https://www.quantamagazine.org/machine-learning-takes-on-antibiotic-resistance-20200309/
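The screening pipeline above — learn from labeled hits, then rank an unseen library — can be sketched in miniature. This is a hedged stand-in: the features and labels below are synthetic, and the model is simple logistic regression rather than the directed message-passing neural network the Collins–Barzilay team used.

```python
import numpy as np

# Synthetic stand-in for molecular fingerprints: a hidden feature pattern
# determines which compounds are growth-inhibiting "hits".
rng = np.random.default_rng(1)
n_feat = 32
hit_pattern = rng.normal(size=n_feat)

def make_library(n):
    X = rng.normal(size=(n, n_feat))
    y = (X @ hit_pattern > 1.0).astype(float)   # hit / non-hit labels
    return X, y

# "Known" screened compounds (sized like the ~2,300-compound training set)
X_train, y_train = make_library(2300)

# Train a logistic-regression scorer by gradient descent.
w = np.zeros(n_feat)
for _ in range(500):
    p = 1 / (1 + np.exp(-(X_train @ w)))
    w -= 0.1 * X_train.T @ (p - y_train) / len(y_train)

# Turn the trained scorer loose on a new "repurposing" library and
# surface the top-scoring candidates for follow-up.
X_new, y_new = make_library(6000)
scores = 1 / (1 + np.exp(-(X_new @ w)))
top = np.argsort(scores)[::-1][:50]

hit_rate_top = y_new[top].mean()   # enrichment among the top picks
base_rate = y_new.mean()
```

The payoff is enrichment: the hit rate among the model's top-ranked compounds far exceeds the library's base rate, which is what makes experimentally testing only the top candidates worthwhile.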
2020-02-04 Jennifer Ouellette
Study reveals that polymers of varying strand lengths are the key ingredient.
The article includes the recipe for blowing gigantic bubbles.