interface Choice {
    finish_reason: "length" | "stop" | "tool_calls" | "content_filter" | "function_call";
    index: number;
    logprobs: null | ChatCompletion.Choice.Logprobs;
    message: ChatCompletionMessage;
}

Properties

finish_reason: "length" | "stop" | "tool_calls" | "content_filter" | "function_call"

The reason the model stopped generating tokens. This will be stop if the model hit a natural stop point or a provided stop sequence, length if the maximum number of tokens specified in the request was reached, content_filter if content was omitted due to a flag from our content filters, tool_calls if the model called a tool, or function_call (deprecated) if the model called a function.
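
A minimal sketch of branching on finish_reason after a request, assuming the official openai npm package with OPENAI_API_KEY set in the environment; the model name and prompt are placeholders.

import OpenAI from "openai";

const client = new OpenAI();

const completion = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Summarize the plot of Hamlet." }],
  max_tokens: 64,
});

const choice = completion.choices[0];

switch (choice.finish_reason) {
  case "stop":
    // Natural stop point or a provided stop sequence was hit.
    break;
  case "length":
    // max_tokens was reached; the reply may be truncated.
    break;
  case "tool_calls":
    // The model requested one or more tool calls.
    break;
  case "content_filter":
    // Content was omitted due to a content filter flag.
    break;
  case "function_call":
    // Deprecated: the model called a function via the legacy functions API.
    break;
}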

index: number

The index of the choice in the list of choices.

logprobs: null | ChatCompletion.Choice.Logprobs

Log probability information for the choice.
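
A sketch of reading per-token log probabilities from a choice, assuming the request set logprobs: true (otherwise this field is null) and that the ChatCompletion type is importable from the package's chat completions module.

import type { ChatCompletion } from "openai/resources/chat/completions";

// Logs each token and its log probability for one choice.
// choice.logprobs is null unless logprobs: true was set on the request.
function logTokenProbabilities(choice: ChatCompletion.Choice): void {
  for (const tokenInfo of choice.logprobs?.content ?? []) {
    console.log(tokenInfo.token, tokenInfo.logprob);
  }
}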

message: ChatCompletionMessage

A chat completion message generated by the model.
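
A sketch of consuming the message from a choice: the reply may carry text content, tool calls, or both, so the hypothetical handleMessage helper below checks tool_calls and then the (possibly null) content.

import type { ChatCompletionMessage } from "openai/resources/chat/completions";

// Prints either the requested tool calls or the text reply of the message.
function handleMessage(message: ChatCompletionMessage): void {
  for (const toolCall of message.tool_calls ?? []) {
    if (toolCall.type === "function") {
      console.log(toolCall.function.name, toolCall.function.arguments);
    }
  }
  if (message.content !== null) {
    console.log(message.content);
  }
}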
