Anthropic
Adds instrumentation for the Anthropic SDK.
For meta-framework applications that run across multiple runtimes, you need to manually wrap your Anthropic client instance with instrumentAnthropicAiClient. See the Browser-Side Usage section for instructions.
Import name: Sentry.instrumentAnthropicAiClient
The instrumentAnthropicAiClient helper instruments the @anthropic-ai/sdk API: it wraps Anthropic SDK calls to capture spans and records LLM interactions with configurable input/output recording. You need to manually wrap your Anthropic client instance with this helper. See the example below:
```javascript
import Anthropic from "@anthropic-ai/sdk";

const anthropic = new Anthropic({
  apiKey: "your-api-key", // Warning: the API key will be exposed in the browser!
});

const client = Sentry.instrumentAnthropicAiClient(anthropic, {
  recordInputs: true,
  recordOutputs: true,
});

// Use the wrapped client instead of the original anthropic instance
const response = await client.messages.create({
  model: "claude-3-5-sonnet-20241022",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello!" }],
});
```
To customize what data is captured (such as inputs and outputs), see the Options in the Configuration section.
The following options control what data is captured from Anthropic SDK calls:
recordInputs
Type: boolean (optional)
Records inputs to Anthropic SDK calls (such as prompts and messages).
Defaults to true if sendDefaultPii is true.
recordOutputs
Type: boolean (optional)
Records outputs from Anthropic SDK calls (such as generated text and responses).
Defaults to true if sendDefaultPii is true.
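Because both options fall back to the sendDefaultPii setting, you only need to set them when you want to diverge from it. A minimal sketch (the DSN placeholder and the anthropic instance are assumed from the examples in this page):

```javascript
// With sendDefaultPii enabled, recordInputs and recordOutputs both
// default to true; each can still be overridden per wrapped client.
Sentry.init({
  dsn: "____PUBLIC_DSN____",
  sendDefaultPii: true,
  tracesSampleRate: 1.0,
});

const client = Sentry.instrumentAnthropicAiClient(anthropic, {
  recordInputs: false, // explicit opt-out: prompts are not captured
  // recordOutputs is left at its default (true, via sendDefaultPii)
});
```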
Usage
Using the anthropicAIIntegration integration:
```javascript
Sentry.init({
  dsn: "____PUBLIC_DSN____",
  // Tracing must be enabled for agent monitoring to work
  tracesSampleRate: 1.0,
  integrations: [
    Sentry.anthropicAIIntegration({
      // your options here
    }),
  ],
});
```
Using the instrumentAnthropicAiClient helper:
```javascript
const client = Sentry.instrumentAnthropicAiClient(anthropic, {
  // your options here
});
```
By default, tracing support is added to the following Anthropic SDK calls:
- messages.create() - Create messages with Claude models
- messages.stream() - Stream messages with Claude models
- messages.countTokens() - Count tokens for messages
- models.get() - Get model information
- completions.create() - Create completions (legacy)
- models.retrieve() - Retrieve model details
- beta.messages.create() - Beta messages API
Streaming and non-streaming requests are automatically detected and handled appropriately.
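Streamed calls go through the same wrapped client with no extra configuration. A minimal sketch of a streaming request, assuming the wrapped client from the example above (the event handler and finalMessage() follow the @anthropic-ai/sdk MessageStream API):

```javascript
// The SDK detects that this is a streaming call; instrumentation is
// expected to keep the span open until the stream completes.
const stream = client.messages.stream({
  model: "claude-3-5-sonnet-20241022",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello!" }],
});

stream.on("text", (text) => {
  process.stdout.write(text); // incremental text deltas as they arrive
});

// Resolves with the assembled message once the stream ends
const finalMessage = await stream.finalMessage();
```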
Supported Versions
@anthropic-ai/sdk: >=0.19.2 <1.0.0
Our documentation is open source and available on GitHub. Your contributions are welcome, whether fixing a typo (drat!) or suggesting an update ("yeah, this would be better").