
How to Use Amazon Bedrock Models with Open WebUI

2025-08-27

Hello everyone!

Today, I'll explain how to use Amazon Bedrock models in the popular chat application Open WebUI.

What is Open WebUI?

Open WebUI is a comprehensive application that provides a user interface for LLM chat, and one of the rare projects to have earned over 100k GitHub stars at the time of writing.

Architecture

Open WebUI supports model providers with OpenAI-compatible API endpoints. In this article, we'll use this feature to call Amazon Bedrock from Open WebUI.

Using LiteLLM

LiteLLM has many features, but one of its most significant capabilities is converting various model providers' APIs into an OpenAI-compatible API.

Using this feature, you can convert Amazon Bedrock's API into an OpenAI-compatible API without any coding.
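To make this concrete, here is a rough sketch of what a request looks like once the LiteLLM proxy from this article is running on port 4000 (the model name matches the configuration defined later in this article; no API key is set in this setup, so no Authorization header is needed):

bash
# Call Amazon Bedrock through LiteLLM's OpenAI-compatible endpoint
curl -s http://localhost:4000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "Amazon Nova Lite",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'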

Using Open WebUI

Open WebUI can externalize session management, database management, vector stores, etc., but it can also run as a single instance. For details, please refer to the documentation.

For this minimal example, we'll run it as a single instance.
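For reference, running Open WebUI alone as a single instance is a one-liner (this mirrors the quick-start command in the Open WebUI README; in this article we'll instead launch it together with LiteLLM via docker-compose):

bash
# Standalone Open WebUI; chat data is persisted in a named volume
docker run -d \
  -p 3000:8080 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main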

Connecting Open WebUI and LiteLLM

If the endpoint exposes an OpenAI-compatible API, Open WebUI can start calling models as soon as you point it at the URL.

Preparation

Confirming Model Access

Check that the models you want to use are enabled. For the Tokyo region (ap-northeast-1), you can do this from the model access page in the Amazon Bedrock console.
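If you prefer the command line, you can also list the models available to your account in the Tokyo region with the AWS CLI (this assumes AWS CLI v2 with working credentials):

bash
# List foundation model IDs available in ap-northeast-1
aws bedrock list-foundation-models \
  --region ap-northeast-1 \
  --query "modelSummaries[].modelId" \
  --output table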

Launching with docker-compose

To simplify configuration, we'll use docker-compose. Make sure the Docker daemon is running and docker-compose is installed.
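A quick way to confirm both prerequisites:

bash
# Fails if the Docker daemon is not running
docker info > /dev/null && echo "Docker daemon is running"
docker-compose --version   # or: docker compose version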

Creating the LiteLLM Configuration File

Create a configuration file for LiteLLM. You can specify which models to use here. For details, please see the documentation.

Here's an example that specifies model IDs for the Tokyo region endpoint:

yaml
# litellm-config.yaml
model_list:
  - model_name: Amazon Nova Pro
    litellm_params:
      model: bedrock/apac.amazon.nova-pro-v1:0

  - model_name: Amazon Nova Lite
    litellm_params:
      model: bedrock/apac.amazon.nova-lite-v1:0

  - model_name: Amazon Nova Micro
    litellm_params:
      model: bedrock/apac.amazon.nova-micro-v1:0

  - model_name: Anthropic Claude 3.5 Sonnet
    litellm_params:
      model: bedrock/apac.anthropic.claude-3-5-sonnet-20240620-v1:0

  - model_name: Anthropic Claude 3.5 Sonnet v2
    litellm_params:
      model: bedrock/apac.anthropic.claude-3-5-sonnet-20241022-v2:0

  - model_name: Anthropic Claude 3.7 Sonnet
    litellm_params:
      model: bedrock/apac.anthropic.claude-3-7-sonnet-20250219-v1:0

  - model_name: Anthropic Claude Sonnet 4
    litellm_params:
      model: bedrock/apac.anthropic.claude-sonnet-4-20250514-v1:0

docker-compose Configuration

To call the Amazon Bedrock API from within LiteLLM, we need to pass authentication information to the LiteLLM container. In this case, we'll configure and pass an SSO profile.
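If you haven't created the SSO profile yet, you can set it up and fetch fresh credentials with the AWS CLI; the profile name examples here matches what the compose file below expects:

bash
# One-time interactive setup of the SSO profile
aws configure sso --profile examples

# Fetch/refresh SSO credentials before starting the containers
aws sso login --profile examples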

yaml
# docker-compose.yml
version: "3.9"
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    ports:
      - "8080:8080"
    environment:
      WEBUI_AUTH: "False" # Run in single-user mode
      OPENAI_API_BASE_URL: "http://litellm:4000" # Specify the model provider's API root (can be changed from GUI)
    volumes:
      - open-webui:/app/backend/data

  litellm:
    image: ghcr.io/berriai/litellm:main-latest
    container_name: litellm
    ports:
      - "4000:4000"
    environment:
      # We'll use an SSO profile here
      AWS_PROFILE: examples
      AWS_REGION: ap-northeast-1
    volumes:
      - ./litellm-config.yaml:/app/litellm-config.yaml:ro
      - ~/.aws:/root/.aws:rw
    command: ["--config", "/app/litellm-config.yaml"]

volumes:
  open-webui:

Launch

Launch with the following command:

bash
docker-compose up -d

Startup will take some time. Keep trying http://localhost:8080 until the page loads.
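Instead of refreshing the browser, you can also watch the logs or poll the health endpoints (the paths below are what the two projects expose at the time of writing; adjust them if your versions differ):

bash
# Follow the startup logs of both containers
docker-compose logs -f

# Poll until the services respond
curl -s http://localhost:4000/health/liveliness   # LiteLLM
curl -s http://localhost:8080/health              # Open WebUI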

Once it's up and running and you can access the GUI, verify that the model list is displayed in the upper left.

Let's try asking it something. Check if you get a response.
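If no response comes back, the LiteLLM logs usually reveal whether the problem is the Bedrock call itself or the credentials; an expired SSO session is a common cause:

bash
# Inspect LiteLLM's handling of the Bedrock request
docker-compose logs litellm

# If the SSO session has expired, log in again and retry
aws sso login --profile examples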

To shut down, use the following command:

bash
docker-compose down
Ikuma Yamashita
Cloud engineer. Works on infrastructure professionally, but is often found spending his private time on systems programming, with a preference for Rust. Enjoys illustration as a hobby.