Language Model
Last updated
The Model Block accepts a prompt string and returns a language model completion as its result.
Parameters:
Prompt: The prompt string to send to the configured model
Config: Options for which LLM to use and its temperature
The final Model Block in your callable must be named OUTPUT_STREAM to stream the chat results in your copilot.
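The behavior described above can be sketched in code. This is an illustrative stand-in, not the platform's actual API: the names `ModelConfig` and `model_block`, the default model identifier, and the stubbed completion are all assumptions made for the example; only the OUTPUT_STREAM naming convention comes from the documentation.

```python
from dataclasses import dataclass

@dataclass
class ModelConfig:
    # Hypothetical config fields mirroring the Config parameter:
    # which model to call and the sampling temperature.
    model: str = "gpt-4"       # assumed model identifier, for illustration only
    temperature: float = 0.7   # higher values yield more varied completions

def model_block(prompt: str, config: ModelConfig) -> str:
    """Stand-in for a Model Block: takes a prompt string plus a config
    and returns a completion. A stub string is returned here instead of
    calling a real LLM."""
    if not prompt:
        raise ValueError("Prompt must be a non-empty string")
    return f"[{config.model} @ T={config.temperature}] completion for: {prompt}"

# The final Model Block in a callable is named OUTPUT_STREAM so the
# copilot can stream its result to the chat.
OUTPUT_STREAM = model_block("Summarize today's meeting notes", ModelConfig())
```

The key takeaway is the data flow: prompt in, configured model parameters applied, completion out, with the last block's name (OUTPUT_STREAM) determining what gets streamed to the user.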
Using Advanced Model Blocks: