Paper
4 September 2024 EILLM: efficient imitate with large language models
Minghao Qiu, Hankz Hankui Zhuo
Proceedings Volume 13259, International Conference on Automation Control, Algorithm, and Intelligent Bionics (ACAIB 2024); 1325947 (2024) https://doi.org/10.1117/12.3039320
Event: Fourth International Conference on Automation Control, Algorithm, and Intelligent Bionics (ICAIB 2024), 2024, Yinchuan, China
Abstract
Rapid evolution is underway in the fields of natural language processing and robotics. Our research integrates Large Language Models (LLMs) with imitation learning, aiming to enhance the efficiency of action generation in robotics. The proposed model, Efficient Imitate with Large Language Models (EILLM), draws on the broad knowledge encoded in LLMs, mitigating a long-standing limitation of imitation learning approaches, which often falter when expert data are limited. EILLM employs LLMs in three roles: task understanding, state-action pair analysis, and Monte Carlo search tree pruning, thereby raising both the accuracy of action generation and the efficiency of the search process. Furthermore, our experiments demonstrate that EILLM outperforms baseline models, corroborating its efficacy in achieving sample-efficient robot motion by integrating LLMs with imitation learning.
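To make the pruning role concrete, the following is a minimal, hypothetical sketch (not the authors' implementation) of how an LLM-derived prior could cut the branching factor of a Monte Carlo search tree before node expansion. All names here are illustrative: `llm_action_scores` is a toy numeric stand-in for what would, in an EILLM-style system, be a language-model query over a textual task description and candidate actions.

```python
import math

def llm_action_scores(state, actions):
    """Hypothetical stand-in for an LLM prior. A real system would query a
    language model here; this toy heuristic just favors actions close to
    the current (numeric) state."""
    return {a: 1.0 / (1 + abs(state - a)) for a in actions}

def prune_actions(state, actions, keep_ratio=0.5):
    """Drop low-scoring actions before expansion, shrinking the branching
    factor of the search tree (the 'pruning' role of the LLM)."""
    scores = llm_action_scores(state, actions)
    ranked = sorted(actions, key=lambda a: scores[a], reverse=True)
    return ranked[:max(1, int(len(ranked) * keep_ratio))]

class Node:
    """A Monte Carlo search-tree node that expands only over the
    LLM-pruned action set."""
    def __init__(self, state, actions):
        self.state = state
        self.visits = 0
        self.value = 0.0
        self.untried = prune_actions(state, actions)  # pruned expansion set
        self.children = {}

    def ucb1_child(self, c=1.4):
        # Standard UCB1 selection over already-expanded children.
        return max(
            self.children.values(),
            key=lambda n: n.value / n.visits
            + c * math.sqrt(math.log(self.visits) / n.visits),
        )
```

Under this sketch, a root node over ten candidate actions would only ever expand the five the prior ranks highest, so rollout budget is spent on actions the language model considers plausible for the task.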
(2024) Published by SPIE. Downloading of the abstract is permitted for personal use only.
Minghao Qiu and Hankz Hankui Zhuo "EILLM: efficient imitate with large language models", Proc. SPIE 13259, International Conference on Automation Control, Algorithm, and Intelligent Bionics (ACAIB 2024), 1325947 (4 September 2024); https://doi.org/10.1117/12.3039320
KEYWORDS
Machine learning, Data modeling, Motion models, Statistical modeling, Performance modeling, Monte Carlo methods, Evolutionary algorithms