H+ Weekly - Issue #361
This week - Meta releases their own large language model; updates on Boston Dynamics' Spot; a robot that can taste; UK's first autonomous bus starts road tests; and more!
ARTIFICIAL INTELLIGENCE
Democratizing access to large-scale language models with OPT-175B
Meta joined companies such as OpenAI and DeepMind in publishing a large-scale language model. Their model has 175 billion parameters, putting it on par with GPT-3. The source code and a small-scale pretrained model have been made open source and are available on GitHub.
How Language-Generation AIs Could Transform Science
Large language models (LLMs) are being applied everywhere, including in science. This interview highlights how they can be used to summarize and aggregate scientific papers, but also warns about the problems LLMs can cause, like oversimplifying complex topics or generating completely fake papers.
Time to get social: Tracking animals with deep learning
Here is an example of a machine learning application you might not have thought of - a deep-learning-based pose estimation system designed to track animals in real time without any intrusive markers. Named DeepLabCut, the system can track multiple animals at once and keeps tracking even when an animal hides and reappears.
ROBOTICS
► What’s New in Spot | Boston Dynamics (10:41)
Engineers from Boston Dynamics present the latest upgrades to Spot, which include new sensors, a new tablet and charger, as well as new additions to the Spot ecosystem - a mesh radio kit and the Spot CORE I/O, a small computer with 5G connectivity designed specifically for Spot. Oh, and they also published another video of Spot dancing.
Swarming drones autonomously navigate a dense forest (and chase a human)
Researchers at Zhejiang University in Hangzhou showed off a swarm of 10 small drones navigating a dense, unfamiliar forest on their own. There is no central computer controlling them - the drones communicate with each other to plan their flight paths so they don't hit any trees or each other. As a demonstration, the researchers showed the swarm following a human through the forest. If you are interested in seeing them in action, check out the video included in the paper.
R2-D-Chew: robot chef imitates human eating process to create tastier food
Thanks to researchers from Cambridge, robots have unlocked a new sense - taste. They have created a culinary robot that can taste a dish's saltiness and its ingredients at different stages of chewing. “If robots are to be used for certain aspects of food preparation, it’s important that they are able to ‘taste’ what they’re cooking,” said Grzegorz Sochacki, one of the researchers, from Cambridge’s Department of Engineering.
The UK’s First Autonomous Passenger Bus Started Road Tests
A self-driving bus has started tests in Scotland. The bus follows a 22km route between a park & ride lot and a train/tram interchange near Edinburgh, a route that includes crossing a 2.5km suspension bridge over a river.
BIOTECHNOLOGY
Scientists Discover Method to Break Down Plastic in Days, Not Centuries
By using machine learning to find the right mutations, scientists created a fast-acting enzyme that breaks down the building blocks of polyethylene terephthalate (PET) - a common plastic - in a matter of days, not centuries.
This issue was brought to you by our awesome patrons Eric, Andrew, dux and Tom! You too can support the newsletter on Patreon.
Thank you for subscribing,
Conrad Gray (@conradthegray)
If you have any questions or suggestions, just reply to this email or tweet at @hplusweekly. I'd love to hear what you think about H+ Weekly.
Follow H+ Weekly!