Amazon Bedrock

This page provides real-time availability and performance metrics for Amazon Bedrock, a fully managed service for building generative AI applications with foundation models.


Dashboards & Latency Matrices

These pre-built availability and performance dashboards and latency matrices provide quick access to comprehensive historical and real-time analytics derived from our continuous cloud service testing. These reports offer insights into service availability patterns, performance trends, and latency characteristics across multiple cloud providers and regions. Each report includes time-bound analysis spanning from 1 day to 1 year, with customizable views, bookmarking, and URL-based sharing for collaborative analysis.

Dashboards

Detailed summaries of performance and availability, featuring time series graphs, latency distribution charts, and summary metrics for cloud service control plane operations, data plane operations, and network performance.

Latency Matrices

Summarized statistical analysis providing median, mean, and percentile performance metrics in a matrix format, offering comparative views of latency and availability across services, regions, and time periods.


us-east-1

This section provides control plane and data plane availability and performance metrics for Amazon Bedrock in the us-east-1 region. Networking operations are not currently covered in this region.

us-east-1 Tests Covered

Control Plane, Data Plane

Control Plane / us-east-1

Control plane operations test the responsiveness and reliability of cloud service APIs for creating, modifying, and deleting resources.

Operations Covered

create-inference-claude-3-haiku

Creates an application inference profile for Claude 3 Haiku model using bedrock create-inference-profile. Measures time from creation request until profile ARN is returned.

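For context, this kind of probe can be approximated with boto3, the Python counterpart of the bedrock create-inference-profile CLI command. The sketch below is illustrative only: the profile name and the Claude 3 Haiku foundation-model ARN are placeholder assumptions, not the values used by the actual tests.

import time
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")

# Placeholder ARN for the Claude 3 Haiku foundation model (assumed model version).
model_arn = ("arn:aws:bedrock:us-east-1::foundation-model/"
             "anthropic.claude-3-haiku-20240307-v1:0")

start = time.monotonic()
response = bedrock.create_inference_profile(
    inferenceProfileName="bedrock-status-probe",  # hypothetical profile name
    modelSource={"copyFrom": model_arn},
)
elapsed_ms = (time.monotonic() - start) * 1000

# The measurement ends once the profile ARN is returned.
print(response["inferenceProfileArn"], f"{elapsed_ms:.0f} ms")

The Nova Lite, Nova Micro, and Nova Pro create probes listed below follow the same pattern with a different foundation-model ARN.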

create-inference-nova-lite

Creates an application inference profile for Nova Lite model using bedrock create-inference-profile. Measures time from creation request until profile ARN is returned.


create-inference-nova-micro

Creates an application inference profile for Nova Micro model using bedrock create-inference-profile. Measures time from creation request until profile ARN is returned.


create-inference-nova-pro

Creates an application inference profile for Nova Pro model using bedrock create-inference-profile. Measures time from creation request until profile ARN is returned.


delete-inference-claude-3-haiku

Deletes a Claude 3 Haiku application inference profile using bedrock delete-inference-profile. Measures time from deletion request until profile is removed.

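A matching boto3 sketch for the deletion probes; the identifier is a placeholder standing in for whatever ARN the corresponding create-inference-profile step returned.

import time
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")

# Placeholder: in the real probe this would be the ARN from the create step.
profile_arn = "<application-inference-profile-arn>"

start = time.monotonic()
bedrock.delete_inference_profile(inferenceProfileIdentifier=profile_arn)
elapsed_ms = (time.monotonic() - start) * 1000

# Timed from the deletion request until the API call returns.
print(f"deleted in {elapsed_ms:.0f} ms")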

delete-inference-nova-lite

Deletes a Nova Lite application inference profile using bedrock delete-inference-profile. Measures time from deletion request until profile is removed.


delete-inference-nova-micro

Deletes a Nova Micro application inference profile using bedrock delete-inference-profile. Measures time from deletion request until profile is removed.


delete-inference-nova-pro

Deletes a Nova Pro application inference profile using bedrock delete-inference-profile. Measures time from deletion request until profile is removed.


read-inference-claude-3-haiku

Retrieves Claude 3 Haiku inference profile details using bedrock get-inference-profile. Measures API response time to fetch profile configuration.

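The read probes map to GetInferenceProfile. A minimal boto3 sketch, again with a placeholder identifier, looks like this.

import time
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")

profile_arn = "<application-inference-profile-arn>"  # placeholder identifier

start = time.monotonic()
profile = bedrock.get_inference_profile(inferenceProfileIdentifier=profile_arn)
elapsed_ms = (time.monotonic() - start) * 1000

# Response time to fetch the profile configuration.
print(profile["inferenceProfileName"], f"{elapsed_ms:.0f} ms")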

read-inference-nova-lite

Retrieves Nova Lite inference profile details using bedrock get-inference-profile. Measures API response time to fetch profile configuration.


read-inference-nova-micro

Retrieves Nova Micro inference profile details using bedrock get-inference-profile. Measures API response time to fetch profile configuration.


read-inference-nova-pro

Retrieves Nova Pro inference profile details using bedrock get-inference-profile. Measures API response time to fetch profile configuration.


update-inference-claude-3-haiku

Updates Claude 3 Haiku inference profile tags using bedrock tag-resource. Measures time to apply tag changes to the profile.

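The update probes rely on Bedrock's tagging API rather than a dedicated profile-update call. A hedged boto3 equivalent of bedrock tag-resource, with a placeholder ARN and an example tag, might look like the following.

import time
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")

profile_arn = "<application-inference-profile-arn>"  # placeholder identifier

start = time.monotonic()
bedrock.tag_resource(
    resourceARN=profile_arn,
    tags=[{"key": "last-probe", "value": "example"}],  # illustrative tag only
)
elapsed_ms = (time.monotonic() - start) * 1000

# Time to apply the tag change to the profile.
print(f"tags applied in {elapsed_ms:.0f} ms")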

update-inference-nova-lite

Updates Nova Lite inference profile tags using bedrock tag-resource. Measures time to apply tag changes to the profile.


update-inference-nova-micro

Updates Nova Micro inference profile tags using bedrock tag-resource. Measures time to apply tag changes to the profile.


update-inference-nova-pro

Updates Nova Pro inference profile tags using bedrock tag-resource. Measures time to apply tag changes to the profile.


Data Plane / us-east-1

Data plane operations measure the performance and availability of actual service functionality like storage operations, compute tasks, and database queries.

Operations Covered

converse-code-claude-3-haiku

Tests Claude 3 Haiku model code analysis using bedrock-runtime converse, presenting a valid Python Fibonacci function and asking the model to confirm whether it works correctly by responding "Yes" or "No". Measures inference response time from API call to response receipt.

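To illustrate the shape of these converse probes, here is a boto3 sketch of the code-analysis check against Claude 3 Haiku. The exact prompt wording, model ID, and token limit are assumptions; the fact, math, and reasoning probes below follow the same pattern with a different prompt and expected answer.

import time
import boto3

runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

# Assumed prompt: a valid Fibonacci function plus a Yes/No question about it.
prompt = (
    "def fib(n):\n"
    "    return n if n < 2 else fib(n - 1) + fib(n - 2)\n"
    "\n"
    "Does this Python function correctly compute Fibonacci numbers? "
    "Answer only Yes or No."
)

start = time.monotonic()
response = runtime.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # assumed model ID
    messages=[{"role": "user", "content": [{"text": prompt}]}],
    inferenceConfig={"maxTokens": 10},
)
elapsed_ms = (time.monotonic() - start) * 1000

answer = response["output"]["message"]["content"][0]["text"]
# Latency is measured from the API call to receipt of the response.
print(answer.strip(), f"{elapsed_ms:.0f} ms")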

converse-code-nova-lite

Tests Nova Lite model code analysis using bedrock-runtime converse, presenting a valid Python Fibonacci function and asking the model to confirm whether it works correctly by responding "Yes" or "No". Measures inference response time from API call to response receipt.


converse-code-nova-micro

Tests Nova Micro model code analysis using bedrock-runtime converse, presenting a valid Python Fibonacci function and asking the model to confirm whether it works correctly by responding "Yes" or "No". Measures inference response time from API call to response receipt.


converse-code-nova-pro

Tests Nova Pro model code analysis using bedrock-runtime converse, presenting a valid Python Fibonacci function and asking the model to confirm whether it works correctly by responding "Yes" or "No". Measures inference response time from API call to response receipt.


converse-fact-claude-3-haiku

Tests Claude 3 Haiku model factual knowledge using bedrock-runtime converse with the prompt "Who was the 16th president of the United States?", expecting a "Lincoln" response. Measures inference response time from API call to response receipt.


converse-fact-nova-lite

Tests Nova Lite model factual knowledge using bedrock-runtime converse with the prompt "Who was the 16th president of the United States?", expecting a "Lincoln" response. Measures inference response time from API call to response receipt.


converse-fact-nova-micro

Tests Nova Micro model factual knowledge using bedrock-runtime converse with the prompt "Who was the 16th president of the United States?", expecting a "Lincoln" response. Measures inference response time from API call to response receipt.


converse-fact-nova-pro

Tests Nova Pro model factual knowledge using bedrock-runtime converse with the prompt "Who was the 16th president of the United States?", expecting a "Lincoln" response. Measures inference response time from API call to response receipt.


converse-math-claude-3-haiku

Tests Claude 3 Haiku model mathematical reasoning using bedrock-runtime converse with the prompt "What is the square root of 144?", expecting a "12" response. Measures inference response time from API call to response receipt.


converse-math-nova-lite

Tests Nova Lite model mathematical reasoning using bedrock-runtime converse with the prompt "What is the square root of 144?", expecting a "12" response. Measures inference response time from API call to response receipt.


converse-math-nova-micro

Tests Nova Micro model mathematical reasoning using bedrock-runtime converse with the prompt "What is the square root of 144?", expecting a "12" response. Measures inference response time from API call to response receipt.


converse-math-nova-pro

Tests Nova Pro model mathematical reasoning using bedrock-runtime converse with the prompt "What is the square root of 144?", expecting a "12" response. Measures inference response time from API call to response receipt.


converse-reasoning-claude-3-haiku

Tests Claude 3 Haiku model logical reasoning using bedrock-runtime converse with the prompt "If a fridge is unplugged, will the food inside stay cold forever?", expecting a "No" response. Measures inference response time from API call to response receipt.


converse-reasoning-nova-lite

Tests Nova Lite model logical reasoning using bedrock-runtime converse with the prompt "If a fridge is unplugged, will the food inside stay cold forever?", expecting a "No" response. Measures inference response time from API call to response receipt.


converse-reasoning-nova-micro

Tests Nova Micro model logical reasoning using bedrock-runtime converse with the prompt "If a fridge is unplugged, will the food inside stay cold forever?", expecting a "No" response. Measures inference response time from API call to response receipt.


converse-reasoning-nova-pro

Tests Nova Pro model logical reasoning using bedrock-runtime converse with the prompt "If a fridge is unplugged, will the food inside stay cold forever?", expecting a "No" response. Measures inference response time from API call to response receipt.
