one-united

A lightweight API gateway for large language models (LLMs) built on Cloudflare Workers, designed to simplify interactions with multiple LLM providers by exposing a single, unified OpenAI-compatible endpoint.

Overview

  • 🦀 Language & Tools: Built with worker-rs and deployed using Wrangler.
  • Key Features:
    • 🤖 Easily deploy your own LLM API gateway.
    • ☁️ Benefit from Cloudflare's infrastructure.
    • 🔄 One unified endpoint for multiple LLM providers with a latency-based load-balancing strategy.
    • 🔑 OpenAI-compatible API.
  • 🚧 TODO:
    • Provide more customizable load balancing configuration.
    • Intuitive front-end configuration management interface.

Deployment

Before deploying, ensure that you have the following installed:

  • Node.js and npm (the deployment commands below use npx wrangler)
  • The Rust toolchain (the worker is written in Rust)

1. Clone and Set Up the Repository

Clone the repository and copy the example Wrangler configuration:

git clone https://github.com/one-united/one-united.git
cd one-united
cp wrangler.example.toml wrangler.toml

2. Create a KV Namespace

The worker uses a Cloudflare KV namespace to store its configuration. Create a KV namespace with:

npx wrangler kv:namespace create config

After running the command above, copy the provided kv_namespaces section and paste it into your wrangler.toml file. It should appear similar to:

[[kv_namespaces]]
binding = "config"
id = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"

3. Deploy the Worker

Run the following command to deploy the worker:

npx wrangler deploy

Once deployed, your worker will be available at the URL:

https://<YOUR_WORKER>.<YOUR_SUBDOMAIN>.workers.dev

You can verify its status through Cloudflare’s dashboard.

4. Set Your API Secret (Optional but Recommended)

To secure your endpoint from unauthorized use, configure your API secret:

npx wrangler secret put ONE_API_KEY

Usage

After deployment, upload a configuration that defines your LLM providers and routing rules: edit the config.example.json file with your actual provider details and API keys, then send it in a POST request to the /config endpoint.

mv config.example.json config.json && vim config.json
curl -X POST https://<YOUR_WORKER>.<YOUR_SUBDOMAIN>.workers.dev/config \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $ONE_API_KEY" \
  -d @config.json

To check if the configuration has been successfully applied, use:

curl -H "Authorization: Bearer $ONE_API_KEY" -s https://<YOUR_WORKER>.<YOUR_SUBDOMAIN>.workers.dev/config

Once configured, you can send chat completion requests through the unified endpoint. For example:

curl https://<YOUR_WORKER>.<YOUR_SUBDOMAIN>.workers.dev/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $ONE_API_KEY" \
  -d '{
     "model": "gpt-4o",
     "messages": [{"role": "user", "content": "Say this is a test!"}],
     "temperature": 0.7
   }'
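Because the endpoint is OpenAI-compatible, the same request can also be made programmatically. The following is a minimal Python sketch using only the standard library; the worker URL and API key are placeholders you must replace with your own deployment details:

```python
import json
import urllib.request

def build_payload(model, messages, temperature=0.7):
    """Assemble an OpenAI-style chat completion request body."""
    return {"model": model, "messages": messages, "temperature": temperature}

def chat_completion(base_url, api_key, payload):
    """POST the payload to the gateway's /v1/chat/completions route."""
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

payload = build_payload(
    "gpt-4o",
    [{"role": "user", "content": "Say this is a test!"}],
)
# Uncomment and fill in your worker URL and ONE_API_KEY to send the request:
# reply = chat_completion("https://<YOUR_WORKER>.<YOUR_SUBDOMAIN>.workers.dev",
#                         "<ONE_API_KEY>", payload)
# print(reply["choices"][0]["message"]["content"])
```

The same pattern works with any OpenAI-compatible client library by pointing its base URL at your worker.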

For more details on how to use the API and customize your requests, please refer to the OpenAI API documentation.
