Authorizations
Body
Model used for chat completion.
JSON schema format used to validate the output data.
Document to be analyzed.
Documents to be analyzed (preferred over document).
Resolution of the image sent to the LLM.
Temperature for sampling. If not provided, the default temperature for the model will be used. Default: 0
The effort level for the model to reason about the input data. If not provided, the default reasoning effort for the model will be used. Available options: minimal, low, medium, high
Number of consensus models to use for extraction. If greater than 1, the temperature cannot be 0 (see the example request after this parameter list).
If true, the extraction will be streamed to the user over the active WebSocket connection.
Seed for the random number generator. If not provided, a random seed will be generated. Default: null
If true, the extraction will be stored in the database.
If true, the extraction will be validated against the schema.
Available options: before_handle_extraction, within_extraction_parse_or_stream, after_handle_extraction, within_process_document_stream_generator
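The sketch below shows how the body parameters above might be combined into a single request. It is a rough illustration, not a verbatim call: the endpoint URL, the header layout, and the field names used here (model, json_schema, documents, image_resolution_dpi, temperature, reasoning_effort, n_consensus, stream, seed, store) are assumptions inferred from the descriptions above rather than names confirmed by this reference.

```python
import requests

# Assumed endpoint and authentication layout -- replace with the values
# from the Authorizations section of the actual reference.
API_URL = "https://api.example.com/v1/extractions"
API_KEY = "sk-..."

payload = {
    # Model used for chat completion.
    "model": "gpt-4o-mini",
    # JSON schema used to validate the output data.
    "json_schema": {
        "type": "object",
        "properties": {
            "invoice_number": {"type": "string"},
            "total": {"type": "number"},
        },
        "required": ["invoice_number", "total"],
    },
    # Documents to be analyzed (preferred over the singular "document").
    # The per-document shape shown here is an assumption.
    "documents": [{"url": "https://example.com/invoice.pdf"}],
    # Resolution of the image sent to the LLM (assumed field name).
    "image_resolution_dpi": 150,
    # Temperature must be non-zero because n_consensus is greater than 1.
    "temperature": 0.2,
    # Reasoning effort: minimal, low, medium, or high.
    "reasoning_effort": "medium",
    # Number of consensus models to use for extraction.
    "n_consensus": 3,
    # Stream the extraction over the active WebSocket connection.
    "stream": False,
    # Seed for the random number generator (null lets the API generate one).
    "seed": None,
    # Store the extraction in the database.
    "store": True,
}

response = requests.post(
    API_URL,
    json=payload,
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=60,
)
response.raise_for_status()
print(response.json())
```

Note the constraint called out above: because n_consensus is greater than 1, the temperature in this sketch is set to a non-zero value.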
Response
Successful Response
"chat.completion"
Available options: auto, default, flex, scale, priority
Object defining the uncertainties of the fields extracted when using consensus. Follows the same structure as the extraction object (see the example response sketched after these fields).
Timestamp of the request.
Timestamp of the first token of the document. If non-streaming, set to last_token_at.
Timestamp of the last token of the document.
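For readability, here is a hedged sketch of how the response fields above might fit together. Only the "chat.completion" object type, the service tier options, and last_token_at are named in this reference; extraction, uncertainties, requested_at, and first_token_at are hypothetical names used purely for illustration, and all values are made up.

```python
# Illustrative response shape; field names marked "assumed" are not
# given in the reference above.
example_response = {
    "object": "chat.completion",               # fixed response object type
    "service_tier": "auto",                    # auto | default | flex | scale | priority
    # Assumed name: the data extracted from the document, matching the
    # submitted JSON schema.
    "extraction": {
        "invoice_number": "INV-001",
        "total": 123.45,
    },
    # Assumed name: per-field uncertainties when using consensus; follows
    # the same structure as the extraction object.
    "uncertainties": {
        "invoice_number": 0.02,
        "total": 0.10,
    },
    "requested_at": "2024-01-01T00:00:00Z",    # assumed name: timestamp of the request
    "first_token_at": "2024-01-01T00:00:05Z",  # assumed name: equals last_token_at when not streaming
    "last_token_at": "2024-01-01T00:00:05Z",   # timestamp of the last token of the document
}
```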