

Poster

Gorilla: Large Language Model Connected with Massive APIs

Shishir G Patil · Tianjun Zhang · Xin Wang · Joseph Gonzalez

East Exhibit Hall A-C #4925
[ Project Page ]
Wed 11 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

Large Language Models (LLMs) have seen an impressive wave of advances, with models now excelling in a variety of tasks, such as mathematical reasoning and program synthesis. However, their potential to effectively use tools via API calls remains unfulfilled. This is a challenging task even for today's state-of-the-art LLMs such as GPT-4, largely due to their unawareness of what APIs are available and how to use them in a frequently updated tool set. We develop Gorilla, a finetuned LLaMA model that surpasses the performance of GPT-4 on writing API calls. Trained with the novel Retriever Aware Training (RAT), when combined with a document retriever, Gorilla demonstrates a strong capability to adapt to test-time document changes, allowing flexible user updates or version changes. It also substantially mitigates the issue of hallucination, commonly encountered when prompting LLMs directly. To evaluate the model's ability, we introduce APIBench, a comprehensive dataset consisting of HuggingFace, TorchHub, and TensorHub APIs. The successful integration of the retrieval system with Gorilla demonstrates the potential for LLMs to use tools more accurately, keep up with frequently updated documentation, and consequently increase the reliability and applicability of their outputs. Gorilla's code, model, data, and demo are available at: https://gorilla.cs.berkeley.edu
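The abstract describes pairing the model with a document retriever so that generation is conditioned on up-to-date API documentation at test time. The sketch below illustrates that retrieve-then-prompt pattern in Python; the toy lexical retriever, the APIDoc structure, and the prompt format are illustrative assumptions for exposition, not Gorilla's actual implementation (see the linked repository for that).

# Minimal sketch of retriever-aware prompting: fetch the API documentation
# most relevant to the user instruction, then prepend it to the prompt so
# the model can adapt to the current version of the API at generation time.
# All names here (APIDoc, retrieve, build_prompt) are hypothetical.
from dataclasses import dataclass

@dataclass
class APIDoc:
    name: str
    documentation: str

def retrieve(query: str, corpus: list[APIDoc], k: int = 1) -> list[APIDoc]:
    """Toy lexical retriever: rank API docs by word overlap with the query."""
    query_words = set(query.lower().split())
    def score(doc: APIDoc) -> int:
        return len(query_words & set(doc.documentation.lower().split()))
    return sorted(corpus, key=score, reverse=True)[:k]

def build_prompt(instruction: str, corpus: list[APIDoc]) -> str:
    """Prepend the retrieved documentation to the instruction, so the model
    grounds its API call in the retrieved text rather than its memory."""
    docs = retrieve(instruction, corpus)
    context = "\n".join(f"### {d.name}\n{d.documentation}" for d in docs)
    return f"{context}\n\nInstruction: {instruction}\nAPI call:"

For example, build_prompt("classify the sentiment of a sentence", corpus) would surface the most relevant API entry before generation, which is what lets a retriever-aware model track documentation updates without retraining.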
