🧠 All Things AI
Intermediate

AWS CLI for Bedrock

The AWS CLI is useful for quick experiments, scripting, and debugging Bedrock without writing Python. All Bedrock operations available in boto3 are also available in the CLI. The two service namespaces mirror the boto3 clients: aws bedrock for the control plane (model management, batch jobs, guardrails) and aws bedrock-runtime for inference.

Setup

# Install / update AWS CLI v2
# macOS: brew install awscli
# Linux: see docs.aws.amazon.com/cli

aws configure
# AWS Access Key ID: ...
# AWS Secret Access Key: ...
# Default region name: us-east-1
# Default output format: json
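Before running any Bedrock commands, it helps to confirm the CLI can authenticate and that a region resolves. A minimal sketch (the check_aws helper name is mine, not an AWS convention):

```shell
# Sanity check: print the account ID and the region the CLI will use.
check_aws() {
  aws sts get-caller-identity --query Account --output text \
    && aws configure get region
}

# check_aws
```

If either command fails, fix credentials or region before debugging Bedrock-specific errors.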

List Foundation Models

# List all available foundation models
aws bedrock list-foundation-models --region us-east-1

# Filter by provider
aws bedrock list-foundation-models \
  --by-provider anthropic \
  --region us-east-1 \
  --query "modelSummaries[].{id:modelId,name:modelName}"

# Check model details (context length, supported features)
aws bedrock get-foundation-model \
  --model-identifier anthropic.claude-sonnet-4-5 \
  --region us-east-1
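For scripting, --output text flattens the JMESPath result into plain whitespace-separated values that shell loops can consume directly. A sketch (list_model_ids is a hypothetical helper name):

```shell
# Emit just the Anthropic model IDs, one whitespace-separated list.
list_model_ids() {
  aws bedrock list-foundation-models \
    --by-provider anthropic \
    --region us-east-1 \
    --query "modelSummaries[].modelId" \
    --output text
}

# for id in $(list_model_ids); do echo "$id"; done
```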

Converse (Chat)

aws bedrock-runtime converse \
  --model-id anthropic.claude-sonnet-4-5 \
  --region us-east-1 \
  --messages '[{"role":"user","content":[{"text":"What is RAG in one sentence?"}]}]' \
  --inference-config '{"maxTokens":256}' \
  --query "output.message.content[0].text" \
  --output text
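Converse also accepts a --system block, which is handy for pinning behavior across scripted calls. A sketch wrapping the call in a reusable function (the ask helper is hypothetical, and its naive quoting assumes the prompt contains no double quotes):

```shell
# Reusable Converse wrapper with a fixed system prompt.
ask() {
  aws bedrock-runtime converse \
    --model-id anthropic.claude-sonnet-4-5 \
    --region us-east-1 \
    --system '[{"text":"Answer in one sentence."}]' \
    --messages "[{\"role\":\"user\",\"content\":[{\"text\":\"$1\"}]}]" \
    --inference-config '{"maxTokens":256}' \
    --query "output.message.content[0].text" \
    --output text
}

# ask "What is a vector database?"
```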

InvokeModel

# Write the request body to a temp file (avoids shell escaping issues)
cat > /tmp/body.json << 'EOF'
{
  "anthropic_version": "bedrock-2023-05-31",
  "max_tokens": 256,
  "messages": [{"role": "user", "content": "Name three AWS services"}]
}
EOF

# --cli-binary-format raw-in-base64-out lets CLI v2 accept raw JSON from
# file:// (by default it expects the blob to be base64-encoded)
aws bedrock-runtime invoke-model \
  --model-id anthropic.claude-sonnet-4-5 \
  --body file:///tmp/body.json \
  --cli-binary-format raw-in-base64-out \
  --content-type application/json \
  --region us-east-1 \
  /tmp/response.json

# Pretty-print the response
cat /tmp/response.json | python3 -c "
import json, sys
r = json.load(sys.stdin)
print(r['content'][0]['text'])
"

Batch Inference Jobs

# Create a batch job
aws bedrock create-model-invocation-job \
  --job-name my-batch-job \
  --model-id anthropic.claude-haiku-3-5 \
  --input-data-config '{"s3InputDataConfig":{"s3Uri":"s3://my-bucket/input.jsonl","s3InputFormat":"JSONL"}}' \
  --output-data-config '{"s3OutputDataConfig":{"s3Uri":"s3://my-bucket/output/"}}' \
  --role-arn arn:aws:iam::ACCOUNT:role/BedrockBatchRole \
  --region us-east-1

# Check job status
aws bedrock get-model-invocation-job \
  --job-identifier JOB_ARN \
  --region us-east-1 \
  --query "status"

# List recent jobs
aws bedrock list-model-invocation-jobs \
  --region us-east-1 \
  --query "invocationJobSummaries[].{name:jobName,status:status}"
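Batch jobs are asynchronous, so scripts typically poll until a terminal state. A sketch (wait_for_job is a hypothetical helper; the set of terminal statuses shown is an assumption, so adjust it to the statuses you actually observe):

```shell
# Poll a batch job until it leaves the in-progress states.
wait_for_job() {
  while true; do
    status=$(aws bedrock get-model-invocation-job \
      --job-identifier "$1" \
      --region us-east-1 \
      --query status --output text)
    echo "status: $status"
    case "$status" in
      Completed|Failed|Stopped|PartiallyCompleted|Expired) return ;;
    esac
    sleep 30
  done
}

# wait_for_job "$JOB_ARN"
```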

Guardrails

# List guardrails
aws bedrock list-guardrails --region us-east-1

# Get guardrail details
aws bedrock get-guardrail \
  --guardrail-identifier GUARDRAIL_ID \
  --guardrail-version DRAFT \
  --region us-east-1
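A guardrail can be applied directly on a Converse call via --guardrail-config. A sketch (ask_guarded is a hypothetical helper; GUARDRAIL_ID is a placeholder as above, and the quoting assumes a simple alphanumeric ID):

```shell
# Run a Converse call with a guardrail attached (DRAFT version).
ask_guarded() {
  aws bedrock-runtime converse \
    --model-id anthropic.claude-sonnet-4-5 \
    --region us-east-1 \
    --guardrail-config "{\"guardrailIdentifier\":\"$1\",\"guardrailVersion\":\"DRAFT\"}" \
    --messages '[{"role":"user","content":[{"text":"Tell me about your refund policy"}]}]' \
    --query "output.message.content[0].text" \
    --output text
}

# ask_guarded GUARDRAIL_ID
```

If the guardrail intervenes, the returned text is the guardrail's configured blocked-message rather than a model completion.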

Model Access

# List models whose lifecycle status is ACTIVE. Note: ACTIVE means the model
# is generally available, not that your account has been granted access;
# access is managed in the Bedrock console under "Model access"
aws bedrock list-foundation-models \
  --region us-east-1 \
  --query "modelSummaries[?modelLifecycle.status=='ACTIVE'].modelId"

CLI Tips

  • Use --output json (the default) for the full response; combine --output text with --query to print only the field you want
  • --query uses JMESPath syntax; learning the basics pays off quickly for filtering
  • Write request bodies to temp files and reference them with the file:// prefix to avoid shell-quoting nightmares with JSON
  • Always pass --region explicitly; don't rely on the default region in scripts that may run in other environments

Checklist: Do You Understand This?

  • What is the difference between aws bedrock and aws bedrock-runtime?
  • How would you list all Claude models available in us-east-1?
  • Can you write a CLI command to send a user message via Converse and print only the response text?
  • How do you check the status of a batch inference job?