Optional api
Optional behavior: The behavior of the model; this will typically be used as the LLM system message.
Optional dimensions: The dimensions parameter for text embedding models.
Optional frequency: The frequency penalty of the model.
Optional input: The maximum input tokens that the model should accept (context window size).
Optional max: The maximum number of tokens to think.
Optional max: The maximum number of tokens to generate.
Optional model
Optional output: The maximum output tokens that the model should generate.
Optional presence: The presence penalty of the model.
Optional provider
Optional stop: The stop sequences of the model.
Optional temperature: The temperature of the model.
Optional top: The top K of the model.
Optional top: The top P of the model.
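Taken together, these fields describe a single model-settings object in which every property is optional. The sketch below is a minimal TypeScript shape inferred from the descriptions above; the interface name (ModelSettings) and the spelled-out property names (for example frequencyPenalty, maxTokens, topK, topP) are assumptions, since the source only shows truncated names, and the library's actual definitions may differ.

```ts
// Hypothetical shape inferred from the property descriptions above;
// the real interface and exact field names may differ.
interface ModelSettings {
  api?: unknown;              // assumed field; its description is not given in the source
  behavior?: string;          // typically used as the LLM system message
  dimensions?: number;        // dimensions parameter for text embedding models
  frequencyPenalty?: number;  // frequency penalty of the model
  maxInputTokens?: number;    // maximum input tokens the model should accept (context window size)
  maxThinkingTokens?: number; // maximum number of tokens to think
  maxTokens?: number;         // maximum number of tokens to generate
  model?: string;             // assumed field; its description is not given in the source
  maxOutputTokens?: number;   // maximum output tokens the model should generate
  presencePenalty?: number;   // presence penalty of the model
  provider?: string;          // assumed field; its description is not given in the source
  stopSequences?: string[];   // stop sequences of the model
  temperature?: number;       // temperature of the model
  topK?: number;              // top K of the model
  topP?: number;              // top P of the model
}

// Usage example: because every field is optional, only the relevant ones need to be set.
const settings: ModelSettings = {
  behavior: "You are a concise technical assistant.",
  temperature: 0.2,
  maxTokens: 1024,
  stopSequences: ["\n\n"],
};
```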