Download a PDF of the paper titled Language Models Meet World Models: Embodied Experiences Enhance Language Models, by Jiannan Xiang and 6 other authors

Download PDF

Abstract: While large language models (LMs) have shown remarkable capabilities across numerous tasks, they often struggle with simple reasoning and planning in physical environments, such as understanding object permanence or planning household activities. The limitation arises from the fact that LMs are trained only on written text and miss essential embodied knowledge and skills. In this paper, we propose a new paradigm of enhancing LMs by finetuning them with world models, to gain diverse embodied knowledge while retaining their general capabilities. Our approach deploys an embodied agent in a world model, particularly a simulator of the physical world (VirtualHome), and acquires a diverse set of embodied experiences through both goal-oriented planning and random exploration. These experiences are then used to finetune LMs to teach diverse abilities of reasoning and acting in the physical world, e.g., planning and completing goals, object permanence and tracking, etc. Moreover, it is desirable to preserve the generality of LMs during finetuning, which facilitates generalizing the embodied knowledge across tasks rather than being tied to specific simulations. We thus further introduce the classical elastic weight consolidation (EWC) for selective weight updates, combined with low-rank adapters (LoRA) for training efficiency.
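The two finetuning ingredients named in the abstract can be sketched in a few lines. Below is a minimal, illustrative Python sketch (not the paper's code; all names such as `ewc_penalty`, `lora_delta`, `fisher`, and `lam` are assumptions): EWC adds a quadratic penalty that discourages parameters important to the pretraining objective from drifting, and LoRA expresses the weight update as a product of two low-rank matrices.

```python
def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    """EWC regularizer: (lam / 2) * sum_i F_i * (theta_i - theta*_i)^2.

    theta      : current parameter values
    theta_star : parameter values after pretraining (to be preserved)
    fisher     : diagonal Fisher information estimate per parameter;
                 large F_i => parameter i matters for the old behavior
    lam        : strength of the consolidation term
    """
    return 0.5 * lam * sum(
        f * (t - ts) ** 2 for t, ts, f in zip(theta, theta_star, fisher)
    )


def lora_delta(B, A, alpha=1.0):
    """LoRA weight update: delta W = alpha * B @ A.

    B is d x r and A is r x k with small rank r, so only
    r * (d + k) values are trained instead of d * k.
    """
    r = len(A)
    return [
        [alpha * sum(B[i][j] * A[j][k] for j in range(r))
         for k in range(len(A[0]))]
        for i in range(len(B))
    ]


# Parameters with high Fisher information are penalized more for moving:
penalty = ewc_penalty(theta=[1.0, 2.0], theta_star=[0.0, 2.0],
                      fisher=[4.0, 0.1], lam=2.0)

# A rank-1 update to a 2x2 weight matrix:
delta = lora_delta(B=[[1.0], [0.0]], A=[[2.0, 3.0]])
```

In the paper's setting, the EWC term is added to the finetuning loss so the model absorbs embodied experiences without overwriting its general language ability, while LoRA keeps the number of trainable parameters small.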