MIT debuts a large language model-inspired method for teaching robots new skills | TechCrunch

"In the language domain, the data are all just sentences," says Lirui Wang, the new paper's lead author. "In robotics, given all the heterogeneity in the data, if you want to pretrain in a similar manner, we need a different architecture."
"Our dream is to have a universal robot brain that you could download and use for your robot without any training at all," CMU associate professor David Held said of the research. "While we are just in the early stages, we are going to keep pushing hard and hope scaling leads to a breakthrough in robotic policies, like it did with large language models."
Read at TechCrunch