H+ Weekly - Issue #358
This week - small Chinchilla AI outperforms GPT-3 and other giant models; what’s next for protein-folding AIs; automating Spot; earbuds that watch your brain; and more!
MORE THAN A HUMAN
This Startup Wants to Get in Your Ears and Watch Your Brain
Born from Alphabet's “moonshot” division, NextSense aims to sell earbuds that can collect heaps of neural data—and uncover the mysteries of gray matter.
A New AI Trend: Chinchilla (70B) Greatly Outperforms GPT-3 (175B) and Gopher (280B)
There is a trend among AI companies to build bigger and bigger AI systems with tens or even hundreds of billions of parameters. Researchers at DeepMind took a closer look at this trend and concluded that it is possible to trade model size for more and better training data. To test this idea, they created Chinchilla - an AI model much smaller than current state-of-the-art systems but similar in performance. “The conclusion is clear: Current large language models are ‘significantly undertrained,’ which is a consequence of blindly following the scaling hypothesis — making models larger isn’t the only way toward improved performance,” writes Alberto Romero in this article.
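The paper's headline result reduces to a simple rule of thumb: a compute-optimal model should see roughly 20 training tokens per parameter (the exact ratio comes from DeepMind's fitted scaling laws, so treat this as an illustrative approximation rather than the paper's full method). A minimal sketch:

```python
# Rough sketch of the Chinchilla compute-optimal rule of thumb:
# roughly 20 training tokens per model parameter. This is an
# approximation of DeepMind's fitted scaling laws, not their method.

TOKENS_PER_PARAM = 20  # approximate ratio reported for Chinchilla

def compute_optimal_tokens(n_params: float) -> float:
    """Estimate training tokens needed for a compute-optimal model."""
    return TOKENS_PER_PARAM * n_params

for name, params in [("Chinchilla", 70e9), ("GPT-3", 175e9), ("Gopher", 280e9)]:
    tokens = compute_optimal_tokens(params)
    print(f"{name}: {params / 1e9:.0f}B params -> ~{tokens / 1e12:.1f}T tokens")
```

By this estimate, Chinchilla's 70B parameters call for about 1.4 trillion training tokens - which is what it was trained on - while GPT-3 and Gopher would need far more data than they actually saw, which is why the paper calls them "significantly undertrained."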
What's next for AlphaFold and the AI protein-folding revolution
AlphaFold and RoseTTAFold, two AI systems that solved the protein-folding problem, have made a huge impact on the scientific community. This article explains the impact of the AI revolution in biology and how scientists are using these tools to advance our knowledge of proteins.
How do I Pursue a Career in Brain-Based AI?
If you have ever wondered where to start your journey in brain-based AI, then check this post out. It offers basic guidance on where to begin and which skills to acquire to start making artificial brains.
► Automating Boston Dynamics Spot Robot - Computerphile (14:41)
Researchers from the Oxford Robotics Institute show, at a high level, the tools and techniques they used to allow Spot to move autonomously from point A to point B while mapping and learning about its environment.
At Amazon Robotics, simulation gains traction
In this post, researchers from Amazon Robotics describe how they use virtual environments to train their robots and why they still incorporate real-life tests - even the best virtual test cannot fully recreate the messiness of real-world physics.
This issue was brought to you by our awesome patrons Eric, Andrew, dux and Tom! You too can support the newsletter on Patreon.
Follow H+ Weekly!