
How to use Amazon Bedrock models with Open WebUI

2025-08-27

Greetings, everyone.

This time, we'll use Amazon Bedrock as the model provider for the extremely popular chat application Open WebUI.

What is Open WebUI?

Open WebUI is a comprehensive application that provides a user interface for LLM chat, and one of the rare projects to have earned over 100k GitHub Stars at the time of writing.

Architecture

Open WebUI supports model providers that expose OpenAI-compatible API endpoints. This time, we'll use this feature to call Amazon Bedrock from Open WebUI.

Using LiteLLM

LiteLLM has many features, but one of its major capabilities is converting various model provider APIs to OpenAI-compatible APIs.

By using this feature, Amazon Bedrock's API can be exposed as an OpenAI-compatible API without writing any code.
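
For example, once the whole stack described below is running, any OpenAI-style client can reach Bedrock through the proxy. Here's a minimal sketch with curl, assuming no LiteLLM master key is configured and using one of the model names defined later in litellm-config.yaml:

bash
curl http://localhost:4000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "Amazon Nova Lite",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'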

Using Open WebUI

Open WebUI can offload session management, its database, vector stores, and so on to external services, but it can also run as a single self-contained instance. For details, please refer to the documentation.

This time, we'll run it as a single instance as a minimal example.

Connecting Open WebUI and LiteLLM

As long as the endpoint is an OpenAI-compatible API, Open WebUI can start calling models immediately just by being pointed at its URL.

Preparation

Checking Model Access

Verify that access has been granted for the models you want to use. For the Tokyo region, you can check via this link.
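
A rough check is also possible from the AWS CLI. Note this lists the foundation models offered in the region; whether access has actually been granted is still confirmed in the console:

bash
aws bedrock list-foundation-models \
  --region ap-northeast-1 \
  --query "modelSummaries[].modelId" \
  --output table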

Launching with docker-compose

To simplify configuration, we'll use docker-compose. Please verify that the Docker daemon is running and that docker-compose is installed.
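
A quick sanity check:

bash
docker info               # errors out if the daemon is not running
docker-compose --version  # or: docker compose version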

Creating LiteLLM Configuration File

Create a configuration file for LiteLLM. This is where you specify which models to use. For details, please see the documentation.

The following is an example specifying the APAC cross-region inference profile IDs (the apac. prefix), which we'll call from the Tokyo region.

yaml
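# litellm-config.yaml (mounted into the container as /app/litellm-config.yaml by docker-compose below)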
model_list:
  - model_name: Amazon Nova Pro
    litellm_params:
      model: bedrock/apac.amazon.nova-pro-v1:0

  - model_name: Amazon Nova Lite
    litellm_params:
      model: bedrock/apac.amazon.nova-lite-v1:0

  - model_name: Amazon Nova Micro
    litellm_params:
      model: bedrock/apac.amazon.nova-micro-v1:0

  - model_name: Anthropic Claude 3.5 Sonnet
    litellm_params:
      model: bedrock/apac.anthropic.claude-3-5-sonnet-20240620-v1:0

  - model_name: Anthropic Claude 3.5 Sonnet v2
    litellm_params:
      model: bedrock/apac.anthropic.claude-3-5-sonnet-20241022-v2:0

  - model_name: Anthropic Claude 3.7 Sonnet
    litellm_params:
      model: bedrock/apac.anthropic.claude-3-7-sonnet-20250219-v1:0

  - model_name: Anthropic Claude Sonnet 4
    litellm_params:
      model: bedrock/apac.anthropic.claude-sonnet-4-20250514-v1:0

docker-compose Configuration

Since LiteLLM is what actually calls the Amazon Bedrock API, we need to pass AWS credentials to the LiteLLM container in some way. This time, we'll configure and pass them via an SSO profile.
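
If you haven't set the profile up yet, it can be created and logged into with the AWS CLI (the profile name examples below matches the AWS_PROFILE value in the compose file):

bash
aws configure sso --profile examples   # one-time profile setup
aws sso login --profile examples       # run before starting the stack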

yaml
# docker-compose.yml
version: "3.9"
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    ports:
      - "8080:8080"
    environment:
      WEBUI_AUTH: "False" # Run in single-user mode
      OPENAI_API_BASE_URL: "http://litellm:4000" # Specify the model provider API base (can be changed from the GUI)
    volumes:
      - open-webui:/app/backend/data

  litellm:
    image: ghcr.io/berriai/litellm:main-latest
    container_name: litellm
    ports:
      - "4000:4000"
    environment:
      # We'll use an SSO profile here
      AWS_PROFILE: examples
      AWS_REGION: ap-northeast-1
    volumes:
      - ./litellm-config.yaml:/app/litellm-config.yaml:ro
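      # Mount AWS config so the SSO profile is visible in the container (rw so the token cache can be updated)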
      - ~/.aws:/root/.aws:rw
    command: ["--config", "/app/litellm-config.yaml"]

volumes:
  open-webui:

Launch

Launch with the following command:

bash
docker-compose up -d

Startup takes a little while. Try accessing http://localhost:8080 a few times until it responds.
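
If it doesn't come up, the container logs show the startup progress:

bash
docker-compose logs -f open-webui litellm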

Once it has started and you can access the GUI, please confirm that the model list is displayed in the upper left.
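
You can also cross-check from the LiteLLM side that the configured models were loaded (again assuming no master key is set):

bash
curl http://localhost:4000/v1/models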

Let's try sending it a message and verify that a response comes back.

Shut down with the following command:

bash
docker-compose down