Messages
messages
Methods
Send a structured list of input messages with text and/or image content, and the model will generate the next message in the conversation.
The Messages API can be used for either single queries or stateless multi-turn conversations.
Learn more about the Messages API in our user guide
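Example (a minimal sketch using the official Python SDK; the model name and max_tokens value are illustrative, not prescriptive):

import anthropic

# The client reads ANTHROPIC_API_KEY from your environment by default.
client = anthropic.Anthropic()

message = client.messages.create(
    model="claude-3-5-sonnet-20241022",  # illustrative model name; substitute the model you want
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Hello, Claude"},
    ],
)
print(message.content)  # a list of content blocks, e.g. one text block with the reply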
Optional header to specify the beta version(s) you want to use.
To use multiple betas, pass a comma-separated list like beta1,beta2, or specify the header multiple times, once for each beta.
The version of the Anthropic API you want to use.
Read more about versioning and our version history here.
Your unique API key for authentication.
This key is required in the header of all API requests to authenticate your account and access Anthropic's services. Get your API key through the Console. Each key is scoped to a Workspace.
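Example (a sketch of how these headers appear on a raw HTTP request, using the requests library; the beta names shown are placeholders that only illustrate the format):

import os
import requests

response = requests.post(
    "https://api.anthropic.com/v1/messages",
    headers={
        "x-api-key": os.environ["ANTHROPIC_API_KEY"],  # your unique API key
        "anthropic-version": "2023-06-01",             # required version header
        "anthropic-beta": "beta1,beta2",               # optional; comma-separated beta names (placeholders)
        "content-type": "application/json",
    },
    json={
        "model": "claude-3-5-sonnet-20241022",  # illustrative model name
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": "Hello"}],
    },
)
print(response.json())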
Unique object identifier.
The format and length of IDs may change over time.
Content generated by the model.
This is an array of content blocks, each of which has a type that determines its shape.
Example:
[{"type": "text", "text": "Hi, I'm Claude."}]
If the request input messages ended with an assistant turn, then the response content will continue directly from that last turn. You can use this to constrain the model's output.
For example, if the input messages were:
[
{"role": "user", "content": "What's the Greek name for Sun? (A) Sol (B) Helios (C) Sun"},
{"role": "assistant", "content": "The best answer is ("}
]
Then the response content might be:
[{"type": "text", "text": "B)"}]
The model that handled the request.
Conversational role of the generated message.
This will always be "assistant".
The reason that we stopped.
This may be one of the following values:
"end_turn": the model reached a natural stopping point
"max_tokens": we exceeded the requested max_tokens or the model's maximum
"stop_sequence": one of your provided custom stop_sequences was generated
"tool_use": the model invoked one or more tools
In non-streaming mode this value is always non-null. In streaming mode, it is null in the message_start event and non-null otherwise.
Which custom stop sequence was generated, if any.
This value will be a non-null string if one of your custom stop sequences was generated.
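Example (a sketch of inspecting stop_reason and stop_sequence with the Python SDK; the stop sequence and model name are illustrative):

import anthropic

client = anthropic.Anthropic()
message = client.messages.create(
    model="claude-3-5-sonnet-20241022",  # illustrative model name
    max_tokens=256,
    stop_sequences=["###"],  # custom stop sequence (illustrative)
    messages=[{"role": "user", "content": "List three colors, then write ###"}],
)
if message.stop_reason == "stop_sequence":
    print("Stopped on:", message.stop_sequence)  # "###"
elif message.stop_reason == "max_tokens":
    print("Output was cut off at the requested max_tokens")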
Object type.
For Messages, this is always "message".
Billing and rate-limit usage.
Anthropic's API bills and rate-limits by token counts, as tokens represent the underlying cost to our systems.
Under the hood, the API transforms requests into a format suitable for the model. The model's output then goes through a parsing stage before becoming an API response. As a result, the token counts in usage will not match one-to-one with the exact visible content of an API request or response.
For example, output_tokens will be non-zero, even for an empty string response from Claude.
The total number of input tokens in a request is the sum of input_tokens, cache_creation_input_tokens, and cache_read_input_tokens.
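Example (a small helper computing that total from a response's usage object; the cache fields may be absent or None when prompt caching is not in use):

def total_input_tokens(usage):
    # usage is the usage object from a Messages API response
    return (
        usage.input_tokens
        + (usage.cache_creation_input_tokens or 0)
        + (usage.cache_read_input_tokens or 0)
    )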
Count the number of tokens in a Message.
The Token Count API can be used to count the number of tokens in a Message, including tools, images, and documents, without creating it.
Learn more about token counting in our user guide
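Example (a sketch with the Python SDK; count_tokens requires a recent SDK version, and the model name is illustrative):

import anthropic

client = anthropic.Anthropic()
count = client.messages.count_tokens(
    model="claude-3-5-sonnet-20241022",  # illustrative model name
    messages=[{"role": "user", "content": "How many tokens is this?"}],
)
print(count.input_tokens)  # no Message is created; only the count is returned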
Batches
messages.batches
Methods
Send a batch of Message creation requests.
The Message Batches API can be used to process multiple Messages API requests at once. Once a Message Batch is created, it begins processing immediately. Batches can take up to 24 hours to complete.
Learn more about the Message Batches API in our user guide
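Example (a sketch with the Python SDK; in older SDK versions these methods live under client.beta.messages.batches, and the custom_id and model name are placeholders):

import anthropic

client = anthropic.Anthropic()
batch = client.messages.batches.create(
    requests=[
        {
            "custom_id": "request-1",  # placeholder; use your own identifier to match results later
            "params": {
                "model": "claude-3-5-sonnet-20241022",  # illustrative model name
                "max_tokens": 1024,
                "messages": [{"role": "user", "content": "Hello from the batch"}],
            },
        },
    ],
)
print(batch.id, batch.processing_status)  # processing begins immediately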
List all Message Batches within a Workspace. Most recently created batches are returned first.
Learn more about the Message Batches API in our user guide
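Example (a sketch; the Python SDK paginates automatically when you iterate over the list):

import anthropic

client = anthropic.Anthropic()
for b in client.messages.batches.list(limit=20):
    print(b.id, b.processing_status, b.created_at)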
This endpoint is idempotent and can be used to poll for Message Batch completion. To access the results of a Message Batch, make a request to the results_url field in the response.
Learn more about the Message Batches API in our user guide
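Example (a sketch of a polling loop with the Python SDK; the batch ID and polling interval are placeholders to tune for your workload):

import time
import anthropic

client = anthropic.Anthropic()
batch_id = "msgbatch_..."  # placeholder for the ID returned when the batch was created
batch = client.messages.batches.retrieve(batch_id)
while batch.processing_status != "ended":
    time.sleep(60)  # the endpoint is idempotent, so repeated polling is safe
    batch = client.messages.batches.retrieve(batch_id)
print(batch.results_url)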
Delete a Message Batch.
Message Batches can only be deleted once they've finished processing. If you'd like to delete an in-progress batch, you must first cancel it.
Learn more about the Message Batches API in our user guide
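Example (a sketch with the Python SDK; the batch ID is a placeholder):

import anthropic

client = anthropic.Anthropic()
# Only a batch that has finished processing can be deleted; cancel an in-progress batch first.
client.messages.batches.delete("msgbatch_...")  # placeholder batch ID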
Batches may be canceled any time before processing ends. Once cancellation is initiated, the batch enters a canceling state, at which time the system may complete any in-progress, non-interruptible requests before finalizing cancellation.
The number of canceled requests is specified in request_counts. To determine which requests were canceled, check the individual results within the batch. Note that cancellation may not result in any canceled requests if they were non-interruptible.
Learn more about the Message Batches API in our user guide
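Example (a sketch with the Python SDK; the batch ID is a placeholder):

import anthropic

client = anthropic.Anthropic()
canceled = client.messages.batches.cancel("msgbatch_...")  # placeholder batch ID
print(canceled.processing_status)  # "canceling" until in-flight requests are finalized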
Streams the results of a Message Batch as a .jsonl file.
Each line in the file is a JSON object containing the result of a single request in the Message Batch. Results are not guaranteed to be in the same order as requests. Use the custom_id field to match results to requests.
Learn more about the Message Batches API in our user guide
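Example (a sketch of reading results with the Python SDK; treat the exact attribute shapes as assumptions to verify against your SDK version, and the batch ID is a placeholder):

import anthropic

client = anthropic.Anthropic()
for entry in client.messages.batches.results("msgbatch_..."):  # placeholder batch ID
    if entry.result.type == "succeeded":
        print(entry.custom_id, entry.result.message.content)
    else:
        print(entry.custom_id, entry.result.type)  # e.g. "errored", "canceled", or "expired"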