Perform LLM text completion inference.
POST {{baseUrl}}/api/v2/cortex/inference:complete
Performs LLM text completion inference, similar to the snowflake.cortex.Complete function.
Request Body
{"model"=>"<string>", "messages"=>[{"content"=>"<string>"}], "stream"=>false, "temperature"=>0, "top_p"=>1, "max_output_tokens"=>4096}
HEADERS
| Key | Datatype | Required | Description |
|---|---|---|---|
| Content-Type | string | | |
| Accept | string | | |
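
The following is a minimal sketch of calling this endpoint with Python's `requests` library. The base URL, token, and header values are placeholders (assumptions), not values defined on this page; substitute whatever your deployment and authentication setup use.

```python
# Minimal sketch: POST the request body shown above to the endpoint.
# BASE_URL and TOKEN are placeholders, not values documented here.
import requests

BASE_URL = "https://<account_identifier>.snowflakecomputing.com"  # placeholder
TOKEN = "<auth-token>"  # placeholder; use whichever auth scheme your account requires

payload = {
    "model": "<string>",                    # model identifier
    "messages": [{"content": "<string>"}],  # prompt message(s)
    "stream": False,                        # non-streaming request
    "temperature": 0,
    "top_p": 1,
    "max_output_tokens": 4096,
}

resp = requests.post(
    f"{BASE_URL}/api/v2/cortex/inference:complete",
    json=payload,  # also sets Content-Type: application/json
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Accept": "application/json",
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json())
```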
RESPONSES
status: OK
{"choices":[{"message":{"content":"\u003cstring\u003e"}},{"message":{"content":"\u003cstring\u003e"}}],"usage":{"prompt_tokens":"\u003cinteger\u003e","completion_tokens":"\u003cinteger\u003e","total_tokens":"\u003cinteger\u003e"}}