Teaching Robots to Listen and Think Harder
Read more: https://www.physicalintelligence.company/research/hirobot
Can we get our robots to "think" the same way, with a little "voice" that tells them what to do when they are presented with a complex task? We developed a system that we call the Hierarchical Interactive Robot (Hi Robot), which incorporates vision-language-action (VLA) models, such as π0, into a two-level inference process. π0 serves as the instinctual, reactive "System 1" that can perform well-practiced tasks, while a high-level semantic vision-language model (VLM) plays the role of "System 2," reasoning through complex tasks and language interactions by "talking to itself." This System 2 high-level policy quite literally emulates that little voice, telling the robot how to break complex tasks into intermediate steps.
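The two-level loop described above can be sketched in a few lines. This is a minimal illustration, not the real Hi Robot implementation: all class names, the hard-coded plan, and the string "actions" are hypothetical stand-ins for the actual VLM planner and the π0 policy.

```python
# Illustrative sketch of a two-level (System 2 / System 1) inference loop.
# Every name here is a placeholder, not the real Hi Robot API.

from dataclasses import dataclass, field


@dataclass
class HighLevelVLM:
    """System 2: decomposes a complex instruction into language subgoals."""
    # A hard-coded plan stands in for a real vision-language model.
    plans: dict = field(default_factory=lambda: {
        "make a sandwich": ["pick up bread", "add cheese", "close sandwich"],
    })

    def next_subgoal(self, task: str, step: int):
        subgoals = self.plans.get(task, [])
        return subgoals[step] if step < len(subgoals) else None


@dataclass
class LowLevelPolicy:
    """System 1: a reactive policy (e.g. a VLA such as pi0) executing one subgoal."""
    def act(self, observation: dict, subgoal: str) -> str:
        # A real policy would emit motor commands; we return a log string.
        return f"executing: {subgoal}"


def run(task: str, vlm: HighLevelVLM, policy: LowLevelPolicy) -> list:
    """System 2 proposes subgoals; System 1 executes each one in turn."""
    log, step = [], 0
    while (subgoal := vlm.next_subgoal(task, step)) is not None:
        observation = {"image": None}  # placeholder camera frame
        log.append(policy.act(observation, subgoal))
        step += 1
    return log


if __name__ == "__main__":
    for action in run("make a sandwich", HighLevelVLM(), LowLevelPolicy()):
        print(action)
```

The key design point this sketch reflects is the interface between the levels: System 2 communicates with System 1 purely through short language subgoals, so the reactive policy never needs to reason about the full task.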