Google's SayTap allows robot dog to understand vague prompts
We have seen robot dogs perform some impressive acrobatics. They can lift heavy things, run alongside humans, work on dangerous construction sites, and even overshadow the showstopper at a Paris fashion show. One YouTuber even entered their robot dog in a dog show for real canines.
And now Google really wants you to have a robot dog. Researchers at its AI arm, DeepMind, have proposed a large language model (LLM) prompt design called SayTap, which uses 'foot contact patterns' to achieve diverse locomotion patterns in a quadrupedal robot. A foot contact pattern is the sequence and manner in which a four-legged agent places its feet on the ground while moving.
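One way to picture a foot contact pattern is as a binary matrix: one row per foot, one column per control time step, where 1 means the foot is on the ground. The sketch below builds a simple trot, in which diagonal foot pairs alternate contact. The encoding and function names here are illustrative, not SayTap's actual representation.

```python
import numpy as np

def trot_pattern(steps_per_phase=5, phases=4):
    """Build a simple trot: diagonal foot pairs alternate ground contact.

    Returns a (4, T) binary matrix with rows ordered
    front-left, front-right, rear-left, rear-right.
    """
    on = np.ones(steps_per_phase, dtype=int)
    off = np.zeros(steps_per_phase, dtype=int)
    # One diagonal pair is down while the other is up, then they swap.
    phase_a = np.concatenate([on, off] * (phases // 2))   # FL and RR
    phase_b = np.concatenate([off, on] * (phases // 2))   # FR and RL
    return np.stack([phase_a, phase_b, phase_b, phase_a])

pattern = trot_pattern()
print(pattern.shape)  # (4, 20)
```

At every time step exactly two feet are in contact, which is the defining property of a trot; other gaits (pace, bound, pronk) correspond to different row patterns in the same matrix.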
Paving the way for intelligent helper robots?
In effect, the researchers have made communication between a human and a robot dog much simpler: the robot becomes more adept at inferring what a human means. The quadrupedal robot was able to follow both direct and vague human commands.
It’s not uncommon for robot dogs to understand basic instructions and perform them. Unitree Robotics launched Unitree Go2, a robot dog that is GPT-enabled and can jump around, shake hands, and cheer as and when instructed. What’s interesting about Google DeepMind’s LLM is that it enables the robot dog to understand unclear and vague commands.
Direct instructions ranged from 'Lift front right leg' to 'Pace backward slowly,' whereas unstructured, vague commands were along the lines of 'Go catch that squirrel on the tree,' 'Act as if you have a limping rear left leg,' or 'Act as if the ground is very hot.'
When it was given the prompt 'Back off! Don't hurt that squirrel,' the robot dog backed away in quick movements. When it was given the prompt 'Good news, we are going to a picnic this weekend!' it jubilantly jumped around in reaction, like a real dog does upon hearing a favorite word like 'park' or 'outside.'
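The translation from a command like these into a contact pattern is done by the LLM itself. The sketch below shows one hedged way that step could look: a prompt asks the model for a per-foot pattern string, which is then parsed into the matrix form the controller consumes. The prompt wording, response format, and function names are hypothetical, not SayTap's actual prompt design.

```python
# Hypothetical prompt: ask the LLM for four lines of 0s and 1s, one per foot.
PROMPT_TEMPLATE = """You control a quadruped robot via foot contact patterns.
Reply with four lines (FL, FR, RL, RR) of 0s and 1s; 1 means the foot
touches the ground at that time step.
Command: {command}
Pattern:"""

def parse_pattern(llm_output: str):
    """Parse four lines of 0/1 characters into per-foot contact lists."""
    rows = [line.strip() for line in llm_output.strip().splitlines()]
    assert len(rows) == 4, "expected one line per foot"
    return [[int(c) for c in row] for row in rows]

# A response an LLM might plausibly return for "Trot forward slowly":
fake_response = "1100110011\n0011001100\n0011001100\n1100110011"
pattern = parse_pattern(fake_response)
print(len(pattern), len(pattern[0]))  # 4 10
```

The appeal of this interface is that the LLM only has to produce a short, structured pattern; everything about joint torques and balance stays inside the trained controller.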
The developers have uploaded videos showing the A1 robot dog following these commands and performing the corresponding actions.
The robot dog reacted with implied feelings
“The locomotion controller is used to complete the main task (e.g., following specified velocities) and to place the robot’s feet on the ground at the specified time, such that the realized foot contact patterns are as close to the desired contact patterns as possible,” said the researchers in a blog.
They further explained how the locomotion controller works. “...the locomotion controller takes the desired foot contact pattern at each time step as its input in addition to the robot’s proprioceptive sensory data (e.g., joint positions and velocities) and task-related inputs (e.g., user-specified velocity commands).”
The locomotion controller is a deep neural network trained through trial and error.
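Based on the researchers' description, the controller's per-step input combines the desired foot contacts, the robot's proprioceptive readings, and the user's velocity command. The sketch below assembles such an observation vector; the names and dimensions are illustrative assumptions, not the actual system's.

```python
import numpy as np

def build_observation(desired_contacts, joint_pos, joint_vel, vel_cmd):
    """Concatenate the controller's inputs into one flat observation vector.

    desired_contacts: 4 binary values, one per foot, for this time step
    joint_pos/joint_vel: proprioceptive data (12 joints assumed here)
    vel_cmd: task input, e.g. (vx, vy, yaw_rate) from the user
    """
    return np.concatenate([
        np.asarray(desired_contacts, dtype=float),
        np.asarray(joint_pos, dtype=float),
        np.asarray(joint_vel, dtype=float),
        np.asarray(vel_cmd, dtype=float),
    ])

obs = build_observation([1, 0, 0, 1], np.zeros(12), np.zeros(12),
                        [0.5, 0.0, 0.0])
print(obs.shape)  # (31,)
```

A policy network trained by trial and error (reinforcement learning) would map this observation to joint targets at every control step, rewarded for matching the desired contacts while tracking the commanded velocity.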
“The core ideas of our approach include introducing desired foot contact patterns as a new interface between human commands in natural language and the locomotion controller,” said the researchers.
The researchers believe that more work is needed to augment the LLM's interpretations, such as invoking more (implied) feelings in a robot dog setup.
“Simple and effective interaction between human and quadrupedal robots paves the way towards creating intelligent and capable helper robots, forging a future where technology enhances our lives in ways beyond our imagination,” concluded the researchers.