Anthropic's Claude models are now generally available through Vertex AI.
The Vertex API for accessing Claude is nearly identical to the Messages API and supports all of the same options, with two key differences:
- In Vertex, `model` is not passed in the request body. Instead, it is specified in the Google Cloud endpoint URL.
- In Vertex, `anthropic_version` is passed in the request body (rather than as a header), and must be set to the value `vertex-2023-10-16`.
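To make these two differences concrete, here is a minimal sketch of how a raw Vertex request is assembled in Python. The project ID and location values are placeholders, and the URL pattern mirrors the curl example later in this guide:

```python
# Placeholder values -- substitute your own project and location.
model_id = "claude-opus-4-6"
location = "global"
project_id = "MY_PROJECT_ID"

# 1) The model ID is part of the endpoint URL, not the request body.
url = (
    f"https://{location}-aiplatform.googleapis.com/v1"
    f"/projects/{project_id}/locations/{location}"
    f"/publishers/anthropic/models/{model_id}:streamRawPredict"
)

# 2) anthropic_version goes in the request body (not a header),
#    and there is no "model" key in the body.
body = {
    "anthropic_version": "vertex-2023-10-16",
    "max_tokens": 100,
    "messages": [{"role": "user", "content": "Hey Claude!"}],
}
```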
Vertex is also supported by Anthropic's official client SDKs. This guide walks you through making a request to Claude on Vertex AI using one of them.
Note that this guide assumes you already have a GCP project that is able to use Vertex AI. See using the Claude 3 models from Anthropic for more information on the setup required, as well as a full walkthrough.
First, install Anthropic's client SDK for your language of choice.
```bash
# Python
pip install -U google-cloud-aiplatform "anthropic[vertex]"
```

```bash
# TypeScript
npm install @anthropic-ai/vertex-sdk
```

```bash
# C#
dotnet add package Anthropic.Vertex
```

```bash
# Go
go get github.com/anthropics/anthropic-sdk-go
```

```groovy
// Gradle
implementation("com.anthropic:anthropic-java-vertex:2.20.0")
```

```xml
<!-- Maven -->
<dependency>
  <groupId>com.anthropic</groupId>
  <artifactId>anthropic-java-vertex</artifactId>
  <version>2.20.0</version>
</dependency>
```

```java
import com.anthropic.client.AnthropicClient;
import com.anthropic.client.okhttp.AnthropicOkHttpClient;
import com.anthropic.vertex.backends.VertexBackend;
import com.anthropic.models.messages.MessageCreateParams;
import com.anthropic.models.messages.Message;
import com.anthropic.models.messages.Model;

public class BasicMessage {
    public static void main(String[] args) {
        AnthropicClient client = AnthropicOkHttpClient.builder()
                .backend(VertexBackend.fromEnv())
                .build();

        MessageCreateParams params = MessageCreateParams.builder()
                .model(Model.CLAUDE_OPUS_4_6)
                .maxTokens(1024L)
                .addUserMessage("What is the capital of France?")
                .build();

        Message response = client.messages().create(params);
        response.content().stream()
                .flatMap(block -> block.text().stream())
                .forEach(textBlock -> System.out.println(textBlock.text()));
    }
}
```

Note that Anthropic model availability varies by region. Search for "Claude" in the Vertex AI Model Garden or go to Use Claude 3 for the latest information.
| Model | Vertex AI API model ID |
|---|---|
| Claude Opus 4.6 | claude-opus-4-6 |
| Claude Sonnet 4.6 | claude-sonnet-4-6 |
| Claude Sonnet 4.5 | claude-sonnet-4-5@20250929 |
| Claude Sonnet 4 | claude-sonnet-4@20250514 |
| Claude Sonnet 3.7 | claude-3-7-sonnet@20250219 |
| Claude Opus 4.5 | claude-opus-4-5@20251101 |
| Claude Opus 4.1 | claude-opus-4-1@20250805 |
| Claude Opus 4 | claude-opus-4@20250514 |
| Claude Haiku 4.5 | claude-haiku-4-5@20251001 |
| Claude Haiku 3.5 | claude-3-5-haiku@20241022 |
| Claude Haiku 3 | claude-3-haiku@20240307 |
Before running requests, you may need to run `gcloud auth application-default login` to authenticate with GCP.
The following examples show how to generate text from Claude on Vertex AI:
```bash
MODEL_ID=claude-opus-4-6
LOCATION=global
PROJECT_ID=MY_PROJECT_ID

curl \
  -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  "https://${LOCATION}-aiplatform.googleapis.com/v1/projects/${PROJECT_ID}/locations/${LOCATION}/publishers/anthropic/models/${MODEL_ID}:streamRawPredict" \
  -d '{
    "anthropic_version": "vertex-2023-10-16",
    "messages": [{
      "role": "user",
      "content": "Hey Claude!"
    }],
    "max_tokens": 100
  }'
```

The ant CLI does not yet support Vertex AI.

```python
from anthropic import AnthropicVertex

project_id = "MY_PROJECT_ID"
region = "global"

client = AnthropicVertex(project_id=project_id, region=region)

message = client.messages.create(
    model="claude-opus-4-6",
    max_tokens=100,
    messages=[
        {
            "role": "user",
            "content": "Hey Claude!",
        }
    ],
)
print(message)
```

```typescript
import { AnthropicVertex } from "@anthropic-ai/vertex-sdk";

const projectId = "MY_PROJECT_ID";
const region = "global";

// Goes through the standard `google-auth-library` flow.
const client = new AnthropicVertex({
  projectId,
  region,
});

async function main() {
  const result = await client.messages.create({
    model: "claude-opus-4-6",
    max_tokens: 100,
    messages: [
      {
        role: "user",
        content: "Hey Claude!",
      },
    ],
  });
  console.log(JSON.stringify(result, null, 2));
}
main();
```

```csharp
using Anthropic;
using Anthropic.Models.Messages;
using Anthropic.Vertex;

var projectId = "MY_PROJECT_ID";
var region = "global";

var client = new AnthropicClient
{
    Backend = new VertexBackend(projectId, region)
};

var parameters = new MessageCreateParams
{
    Model = Model.ClaudeOpus4_6,
    MaxTokens = 100,
    Messages = [new() { Role = Role.User, Content = "Hey Claude!" }]
};

var message = await client.Messages.Create(parameters);
Console.WriteLine(message);
```

```go
package main

import (
	"context"
	"fmt"

	"github.com/anthropics/anthropic-sdk-go"
	"github.com/anthropics/anthropic-sdk-go/vertex"
)

func main() {
	// Uses default Google Cloud credentials
	client := anthropic.NewClient(
		vertex.WithGoogleAuth(context.Background(), "global", "MY_PROJECT_ID"),
	)

	message, err := client.Messages.New(context.Background(), anthropic.MessageNewParams{
		Model:     "claude-opus-4-6",
		MaxTokens: 100,
		Messages: []anthropic.MessageParam{
			anthropic.NewUserMessage(anthropic.NewTextBlock("Hey Claude!")),
		},
	})
	if err != nil {
		panic(err)
	}
	fmt.Printf("%+v\n", message)
}
```

```java
import com.anthropic.client.AnthropicClient;
import com.anthropic.client.okhttp.AnthropicOkHttpClient;
import com.anthropic.models.messages.Message;
import com.anthropic.models.messages.MessageCreateParams;
import com.anthropic.vertex.backends.VertexBackend;

public class VertexExample {
    public static void main(String[] args) {
        // Uses default Google Cloud credentials
        AnthropicClient client = AnthropicOkHttpClient.builder()
                .backend(VertexBackend.fromEnv())
                .build();

        Message message = client
                .messages()
                .create(
                        MessageCreateParams.builder()
                                .model("claude-opus-4-6")
                                .maxTokens(100)
                                .addUserMessage("Hey Claude!")
                                .build()
                );
        System.out.println(message);
    }
}
```

```php
<?php
use Anthropic\Vertex;

$client = Vertex\Client::fromEnvironment(
    location: 'global',
    projectId: 'MY_PROJECT_ID',
);

$message = $client->messages->create(
    maxTokens: 100,
    messages: [
        ['role' => 'user', 'content' => 'Hey Claude!']
    ],
    model: 'claude-opus-4-6',
);

echo $message->content[0]->text;
```

```ruby
require "anthropic"

client = Anthropic::VertexClient.new(
  region: "global",
  project_id: "MY_PROJECT_ID"
)

message = client.messages.create(
  model: "claude-opus-4-6",
  max_tokens: 100,
  messages: [{role: "user", content: "Hey Claude!"}]
)

puts message.content.first.text
```

See the client SDKs and the official Vertex AI docs for more details.
Claude is also available through Amazon Bedrock and Microsoft Foundry.
Vertex provides a request-response logging service that lets you log the prompts and completions associated with your usage.
Anthropic recommends logging your activity on at least a 30-day rolling basis so you can understand it and investigate any potential misuse.
Turning on this service does not give Google or Anthropic any access to your content.

For all currently supported features on Vertex AI, see the API features overview.
Claude Opus 4.6 and Claude Sonnet 4.6 have a 1M-token context window on Vertex AI. Other Claude models, including Sonnet 4.5 and Sonnet 4, have a 200k-token context window.
Vertex AI limits request payloads to 30 MB. When sending large documents or many images, you may reach this limit before the token limit.
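Because byte size and token count diverge when requests include images or documents, a pre-flight check on the serialized body size can catch this limit early. A minimal sketch (the request shape here is illustrative, not a required format):

```python
import json

MAX_PAYLOAD_BYTES = 30 * 1024 * 1024  # Vertex AI's 30 MB request payload limit

def payload_size_bytes(request_body: dict) -> int:
    """Return the size of the JSON-serialized request body in bytes."""
    return len(json.dumps(request_body).encode("utf-8"))

body = {
    "anthropic_version": "vertex-2023-10-16",
    "max_tokens": 100,
    "messages": [{"role": "user", "content": "Hey Claude!"}],
}

if payload_size_bytes(body) > MAX_PAYLOAD_BYTES:
    raise ValueError("Request payload exceeds Vertex AI's 30 MB limit")
```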
Google Vertex AI offers three endpoint types:
- Global endpoints: Dynamic routing for maximum availability
- Multi-region endpoints: Dynamic routing within a geographic area (for example, the United States or the European Union) for data residency with high availability
- Regional endpoints: Guaranteed data routing through specific geographic regions
Regional and multi-region endpoints include a 10% pricing premium over global endpoints.
This applies to Claude Sonnet 4.5 and later models only. Older models (Claude Sonnet 4, Opus 4, and earlier) maintain their existing pricing structures.

Global endpoints (recommended):
- Provide maximum availability and uptime
- Dynamically route requests to regions with available capacity
- No pricing premium
- Best for applications where data residency is flexible
- Support only pay-as-you-go traffic (provisioned throughput requires regional endpoints)
Multi-region endpoints:
- Dynamically route requests across regions within a geographic area (currently `us` and `eu`)
- Useful when you need data residency within a broad geography but want higher availability than a single region
- 10% pricing premium over global endpoints
- Support only pay-as-you-go traffic (provisioned throughput requires regional endpoints)
Regional endpoints:
- Route traffic through specific geographic regions
- Required for single-region data residency, strict compliance mandates, or provisioned throughput
- Support both pay-as-you-go and provisioned throughput
- 10% pricing premium reflects infrastructure costs for dedicated regional capacity
Using global endpoints (recommended):
Set the `region` parameter to `"global"` when initializing the client:
```python
from anthropic import AnthropicVertex

project_id = "MY_PROJECT_ID"
region = "global"

client = AnthropicVertex(project_id=project_id, region=region)

message = client.messages.create(
    model="claude-opus-4-6",
    max_tokens=100,
    messages=[
        {
            "role": "user",
            "content": "Hey Claude!",
        }
    ],
)
print(message)
```

```typescript
import { AnthropicVertex } from "@anthropic-ai/vertex-sdk";

const projectId = "MY_PROJECT_ID";
const region = "global";

const client = new AnthropicVertex({
  projectId,
  region,
});

const result = await client.messages.create({
  model: "claude-opus-4-6",
  max_tokens: 100,
  messages: [
    {
      role: "user",
      content: "Hey Claude!",
    },
  ],
});
```

```csharp
using Anthropic;
using Anthropic.Models.Messages;
using Anthropic.Vertex;

var projectId = "MY_PROJECT_ID";
var region = "global";

var client = new AnthropicClient
{
    Backend = new VertexBackend(projectId, region)
};

var parameters = new MessageCreateParams
{
    Model = Model.ClaudeOpus4_6,
    MaxTokens = 100,
    Messages = [new() { Role = Role.User, Content = "Hey Claude!" }]
};

var message = await client.Messages.Create(parameters);
Console.WriteLine(message);
```

```go
package main

import (
	"context"

	"github.com/anthropics/anthropic-sdk-go"
	"github.com/anthropics/anthropic-sdk-go/vertex"
)

func main() {
	// Uses default Google Cloud credentials
	client := anthropic.NewClient(
		vertex.WithGoogleAuth(context.Background(), "global", "MY_PROJECT_ID"),
	)

	message, _ := client.Messages.New(context.Background(), anthropic.MessageNewParams{
		Model:     "claude-opus-4-6",
		MaxTokens: 100,
		Messages: []anthropic.MessageParam{
			anthropic.NewUserMessage(anthropic.NewTextBlock("Hey Claude!")),
		},
	})
	_ = message
}
```

```java
import com.anthropic.client.AnthropicClient;
import com.anthropic.client.okhttp.AnthropicOkHttpClient;
import com.anthropic.models.messages.MessageCreateParams;
import com.anthropic.vertex.backends.VertexBackend;

// Uses default Google Cloud credentials
AnthropicClient client = AnthropicOkHttpClient.builder()
        .backend(VertexBackend.fromEnv())
        .build();

var message = client
        .messages()
        .create(
                MessageCreateParams.builder()
                        .model("claude-opus-4-6")
                        .maxTokens(100)
                        .addUserMessage("Hey Claude!")
                        .build()
        );
```

```php
<?php
use Anthropic\Vertex;

$client = Vertex\Client::fromEnvironment(
    location: 'global',
    projectId: 'MY_PROJECT_ID',
);

$message = $client->messages->create(
    maxTokens: 100,
    messages: [
        ['role' => 'user', 'content' => 'Hey Claude!']
    ],
    model: 'claude-opus-4-6',
);

echo $message->content[0]->text;
```

```ruby
require "anthropic"

client = Anthropic::VertexClient.new(
  region: "global",
  project_id: "MY_PROJECT_ID"
)

message = client.messages.create(
  model: "claude-opus-4-6",
  max_tokens: 100,
  messages: [{role: "user", content: "Hey Claude!"}]
)

puts message.content.first.text
```

Using multi-region endpoints:
Set the `region` parameter to a multi-region identifier: `"us"` for the United States or `"eu"` for the European Union. The SDK routes requests to the corresponding multi-region endpoint (`https://aiplatform.us.rep.googleapis.com` or `https://aiplatform.eu.rep.googleapis.com`), which dynamically balances traffic across regions within that geography.
```python
from anthropic import AnthropicVertex

project_id = "MY_PROJECT_ID"
region = "us"  # Multi-region identifier: "us" or "eu"

client = AnthropicVertex(project_id=project_id, region=region)

message = client.messages.create(
    model="claude-opus-4-6",
    max_tokens=100,
    messages=[
        {
            "role": "user",
            "content": "Hey Claude!",
        }
    ],
)
print(message)
```

```typescript
import { AnthropicVertex } from "@anthropic-ai/vertex-sdk";

const projectId = "MY_PROJECT_ID";
const region = "us"; // Multi-region identifier: "us" or "eu"

const client = new AnthropicVertex({
  projectId,
  region,
});

const result = await client.messages.create({
  model: "claude-opus-4-6",
  max_tokens: 100,
  messages: [
    {
      role: "user",
      content: "Hey Claude!",
    },
  ],
});
```

```csharp
using Anthropic;
using Anthropic.Models.Messages;
using Anthropic.Vertex;

var projectId = "MY_PROJECT_ID";
var region = "us"; // Multi-region identifier: "us" or "eu"

var client = new AnthropicClient
{
    Backend = new VertexBackend(projectId, region)
};

var parameters = new MessageCreateParams
{
    Model = Model.ClaudeOpus4_6,
    MaxTokens = 100,
    Messages = [new() { Role = Role.User, Content = "Hey Claude!" }]
};

var message = await client.Messages.Create(parameters);
Console.WriteLine(message);
```

```go
package main

import (
	"context"

	"github.com/anthropics/anthropic-sdk-go"
	"github.com/anthropics/anthropic-sdk-go/vertex"
)

func main() {
	// Multi-region identifier: "us" or "eu"
	client := anthropic.NewClient(
		vertex.WithGoogleAuth(context.Background(), "us", "MY_PROJECT_ID"),
	)

	message, _ := client.Messages.New(context.Background(), anthropic.MessageNewParams{
		Model:     "claude-opus-4-6",
		MaxTokens: 100,
		Messages: []anthropic.MessageParam{
			anthropic.NewUserMessage(anthropic.NewTextBlock("Hey Claude!")),
		},
	})
	_ = message
}
```

```java
import com.anthropic.client.AnthropicClient;
import com.anthropic.client.okhttp.AnthropicOkHttpClient;
import com.anthropic.models.messages.MessageCreateParams;
import com.anthropic.vertex.backends.VertexBackend;

// Multi-region identifier: "us" or "eu"
AnthropicClient client = AnthropicOkHttpClient.builder()
        .backend(
                VertexBackend.builder()
                        .region("us")
                        .project("MY_PROJECT_ID")
                        .build()
        )
        .build();

var message = client
        .messages()
        .create(
                MessageCreateParams.builder()
                        .model("claude-opus-4-6")
                        .maxTokens(100)
                        .addUserMessage("Hey Claude!")
                        .build()
        );
```

```php
<?php
use Anthropic\Vertex;

$client = Vertex\Client::fromEnvironment(
    location: 'us', // Multi-region identifier: "us" or "eu"
    projectId: 'MY_PROJECT_ID',
);

$message = $client->messages->create(
    maxTokens: 100,
    messages: [
        ['role' => 'user', 'content' => 'Hey Claude!']
    ],
    model: 'claude-opus-4-6',
);

echo $message->content[0]->text;
```

```ruby
require "anthropic"

client = Anthropic::VertexClient.new(
  region: "us", # Multi-region identifier: "us" or "eu"
  project_id: "MY_PROJECT_ID"
)

message = client.messages.create(
  model: "claude-opus-4-6",
  max_tokens: 100,
  messages: [{role: "user", content: "Hey Claude!"}]
)

puts message.content.first.text
```

Using regional endpoints:
Set the `region` parameter to a specific region, such as `"us-east1"` or `"europe-west1"`:
```python
from anthropic import AnthropicVertex

project_id = "MY_PROJECT_ID"
region = "us-east1"  # Specify a specific region

client = AnthropicVertex(project_id=project_id, region=region)

message = client.messages.create(
    model="claude-opus-4-6",
    max_tokens=100,
    messages=[
        {
            "role": "user",
            "content": "Hey Claude!",
        }
    ],
)
print(message)
```

```typescript
import { AnthropicVertex } from "@anthropic-ai/vertex-sdk";

const projectId = "MY_PROJECT_ID";
const region = "us-east1"; // Specify a specific region

const client = new AnthropicVertex({
  projectId,
  region,
});

const result = await client.messages.create({
  model: "claude-opus-4-6",
  max_tokens: 100,
  messages: [
    {
      role: "user",
      content: "Hey Claude!",
    },
  ],
});
```

```csharp
using Anthropic;
using Anthropic.Models.Messages;
using Anthropic.Vertex;

var projectId = "MY_PROJECT_ID";
var region = "us-east1"; // Specify a specific region

AnthropicClient client = new()
{
    Backend = new VertexBackend(projectId, region)
};

var parameters = new MessageCreateParams
{
    Model = Model.ClaudeOpus4_6,
    MaxTokens = 100,
    Messages = [new() { Role = Role.User, Content = "Hey Claude!" }]
};

var message = await client.Messages.Create(parameters);
Console.WriteLine(message);
```

```go
package main

import (
	"context"

	"github.com/anthropics/anthropic-sdk-go"
	"github.com/anthropics/anthropic-sdk-go/vertex"
)

func main() {
	// Specify a specific region
	client := anthropic.NewClient(
		vertex.WithGoogleAuth(context.Background(), "us-east1", "MY_PROJECT_ID"),
	)

	message, _ := client.Messages.New(context.Background(), anthropic.MessageNewParams{
		Model:     "claude-opus-4-6",
		MaxTokens: 100,
		Messages: []anthropic.MessageParam{
			anthropic.NewUserMessage(anthropic.NewTextBlock("Hey Claude!")),
		},
	})
	_ = message
}
```

```java
import com.anthropic.client.AnthropicClient;
import com.anthropic.client.okhttp.AnthropicOkHttpClient;
import com.anthropic.models.messages.MessageCreateParams;
import com.anthropic.vertex.backends.VertexBackend;

// Uses default Google Cloud credentials with a specific region
AnthropicClient client = AnthropicOkHttpClient.builder()
        .backend(
                VertexBackend.builder()
                        .region("us-east1") // Specify a specific region
                        .project("MY_PROJECT_ID")
                        .build()
        )
        .build();

var message = client
        .messages()
        .create(
                MessageCreateParams.builder()
                        .model("claude-opus-4-6")
                        .maxTokens(100)
                        .addUserMessage("Hey Claude!")
                        .build()
        );
```

```php
<?php
use Anthropic\Vertex;

$client = Vertex\Client::fromEnvironment(
    location: 'us-east1', // Specify a specific region
    projectId: 'MY_PROJECT_ID',
);

$message = $client->messages->create(
    maxTokens: 100,
    messages: [
        ['role' => 'user', 'content' => 'Hey Claude!']
    ],
    model: 'claude-opus-4-6',
);

echo $message->content[0]->text;
```

```ruby
require "anthropic"

client = Anthropic::VertexClient.new(
  region: "us-east1", # Specify a specific region
  project_id: "MY_PROJECT_ID"
)

message = client.messages.create(
  model: "claude-opus-4-6",
  max_tokens: 100,
  messages: [{role: "user", content: "Hey Claude!"}]
)

puts message.content.first.text
```

- Google Vertex AI pricing: cloud.google.com/vertex-ai/generative-ai/pricing
- Claude models documentation: Claude on Vertex AI
- Google blog post: Global endpoint for Claude models
- Anthropic pricing details: Pricing documentation