LLMExecutor SDK
The LLMExecutor SDK provides access to the large language model's Chat interface. You preset the Prompt and Params variables in the configuration, then fill in the Params values through the SDK to complete a dialogue. Besides the built-in Prompt, the SDK also supports invoking the large language model interface natively, for special scenarios the preset Prompt cannot cover.
Usage
To use the LLMExecutor SDK in a Babel application, first import the LLMExecutor SDK dependency in the Element:
import { myLLMExecutor } from "#elements";
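As a sketch of the preset-Prompt workflow: the snippet below fills in a Params value and awaits the reply. The `topic` parameter name is an assumption for illustration, and a stub stands in for the real `myLLMExecutor` Element so the snippet is self-contained; the actual `execute` signature may differ.

```typescript
// Stub standing in for the myLLMExecutor imported from "#elements".
// Its execute() shape is an assumption inferred from this document:
// it receives the Params values and resolves to the model's reply string.
const myLLMExecutor = {
  async execute(params: Record<string, string>): Promise<string> {
    // The real SDK would render the preset Prompt with these Params
    // and call the Chat interface; the stub just echoes them.
    return `reply for topic=${params.topic}`;
  },
};

async function main(): Promise<void> {
  // Fill in the Params declared in the configuration, e.g. a {{topic}} slot.
  const reply = await myLLMExecutor.execute({ topic: "unit testing" });
  console.log(reply);
}

main();
```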
APIs
- execute: Fills in the Prompt's Params values and returns the model's reply as a string.
- executeStream: Fills in the Prompt's Params values and returns the reply as a string stream.
- complete: Invokes the large model interface natively.
- completeStream: Invokes the large model interface natively and returns results as a stream.
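The native-invocation pair can be sketched as follows. The `ChatMessage` role/content shape is an assumption (the actual type expected by `complete`/`completeStream` may differ), and a stub executor stands in for the Element so the sketch runs on its own.

```typescript
// Hypothetical message shape for native invocation; not confirmed by the SDK.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Stub mirroring the two native APIs so the sketch is self-contained.
const myLLMExecutor = {
  async complete(messages: ChatMessage[]): Promise<string> {
    return `answered ${messages.length} message(s)`;
  },
  async *completeStream(messages: ChatMessage[]): AsyncGenerator<string> {
    // A real stream would yield model tokens; the stub yields fixed chunks.
    for (const chunk of ["Hello", ", ", "world"]) {
      yield chunk;
    }
  },
};

async function main(): Promise<void> {
  const messages: ChatMessage[] = [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "Say hello." },
  ];

  // complete: one-shot call, resolves to the full reply.
  console.log(await myLLMExecutor.complete(messages));

  // completeStream: consume chunks incrementally with for-await-of.
  let streamed = "";
  for await (const chunk of myLLMExecutor.completeStream(messages)) {
    streamed += chunk;
  }
  console.log(streamed);
}

main();
```

The streaming variant is the natural fit for chat UIs, where partial output is rendered as chunks arrive instead of waiting for the full reply.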