```typescript
// Prompt and get a response
const response = await chat.prompt('Write a short story about a cat');

// or use streaming
const streamEvents = await chat.prompt('Write a short story about a cat').stream();

streamEvents.on(TLLMEvent.Content, (event) => {
    console.log(event);
});

streamEvents.on(TLLMEvent.End, () => {
    console.log('Stream ended');
});
```
Example
By default, the SDK reads API keys from a vault file. The vault is configured when you initialize your project with the "sre" command-line tool.
```typescript
// Below are different ways to invoke an LLM without passing the API key

// Using the model ID
const llm = LLM.OpenAI('gpt-4o');
const response = await llm.prompt('Write a short story about a cat');

// Using the model params
const llm = LLM.OpenAI({ model: 'gpt-4o' });
const response = await llm.prompt('Write a short story about a cat');

// Using the model ID with custom settings
const llm = LLM.OpenAI('gpt-4o', { temperature: 0.5, maxTokens: 50 });
const response = await llm.prompt('Write a short story about a cat');

// Using the model params with custom settings
const llm = LLM.OpenAI({ model: 'gpt-4o', temperature: 0.5, maxTokens: 50 });
const response = await llm.prompt('Write a short story about a cat');
```
Example
If you don't want to use the vault file, or want to use a specific API key, you can pass the API key explicitly.
```typescript
// Using the model params with an API key
const llm = LLM.OpenAI({
    model: 'gpt-4o',
    apiKey: 'sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx',
    temperature: 0.5,
    maxTokens: 50,
});
const response = await llm.prompt('Write a short story about a cat');

// Using the model ID with an API key
const llm = LLM.OpenAI('gpt-4o', {
    apiKey: 'sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx',
    temperature: 0.5,
    maxTokens: 50,
});
const response = await llm.prompt('Write a short story about a cat');
```
Create standalone LLM Provider instances; these can be used without agents.
Example: Different providers are available
See below for all available providers.
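The sketch below is illustrative only: LLM.OpenAI is the accessor used throughout this page, while the other accessor names and model IDs are assumptions; check the SDK's provider list for the exact set it exposes. Every provider follows the same call shape.

```typescript
import { LLM } from '@smythos/sdk'; // assumed package name

// LLM.OpenAI is the accessor documented on this page; LLM.Anthropic and LLM.GoogleAI
// are assumed names, shown only to illustrate that each provider is invoked the same way.
const openai = LLM.OpenAI('gpt-4o');
const anthropic = LLM.Anthropic('claude-3-5-sonnet-latest'); // assumed accessor and model ID
const google = LLM.GoogleAI('gemini-1.5-pro'); // assumed accessor and model ID

const response = await openai.prompt('Write a short story about a cat');
```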
Example: Prompting an LLM
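A minimal prompting sketch, reusing the LLM.OpenAI accessor and the prompt() call shown elsewhere on this page.

```typescript
const llm = LLM.OpenAI('gpt-4o');

// prompt() resolves with the full completion once the model has finished
const response = await llm.prompt('Write a short story about a cat');
console.log(response);
```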
Example: Streaming response
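A streaming sketch based on the chat streaming code at the top of this page; it assumes prompt(...).stream() and the TLLMEvent events behave the same way on a provider instance as they do on a chat.

```typescript
const llm = LLM.OpenAI('gpt-4o');

// Assumption: prompt(...).stream() works on a provider instance the same way it does on a chat
const streamEvents = await llm.prompt('Write a short story about a cat').stream();

streamEvents.on(TLLMEvent.Content, (event) => {
    // emitted for each chunk of content as it arrives
    console.log(event);
});

streamEvents.on(TLLMEvent.End, () => {
    console.log('Stream ended');
});
```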
Example: Chat with an LLM
The difference between direct prompting and chatting is that chatting persists the conversation across calls.
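A chat sketch: the chat object used at the top of this page has to be created from a provider instance first. The llm.chat() factory below is an assumption; the prompt() calls mirror the chat example above.

```typescript
const llm = LLM.OpenAI('gpt-4o');

// Assumption: the provider instance exposes a chat() factory that keeps conversation state
const chat = llm.chat();

// Because the conversation persists, the second prompt can build on the first answer
const story = await chat.prompt('Write a short story about a cat');
const summary = await chat.prompt('Now summarize that story in one sentence');
```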