-
Hello. I want to authenticate users and pass data from the middleware to a runnable. A FastAPI application can usually use middleware to store data in request.state (https://www.starlette.io/requests/#other-state).
Does anyone have any ideas? Thank you.
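For reference, here is a minimal sketch of that request.state pattern on its own; the x-user header and /whoami route are just illustrative:

from fastapi import FastAPI, Request

app = FastAPI()


@app.middleware("http")
async def attach_user(request: Request, call_next):
    # Anything stored on request.state is visible downstream for this request.
    request.state.user = request.headers.get("x-user", "anonymous")
    return await call_next(request)


@app.get("/whoami")
async def whoami(request: Request) -> dict:
    return {"user": request.state.user}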
-
Two options at the moment:
1. Specifying a per-request modifier function. It gives you access to the raw request object and the config. You can then populate the 'configurable' field in the config with appropriate information. This works well in conjunction with configurable runnables: https://github.com/langchain-ai/langserve/blob/main/examples/configurable_chain/server.py (see the sketch below)
2. If you want to do something else, you can use the underlying APIHandler; it'll require typing out a bit more code, but gives you more flexibility: https://github.com/langchain-ai/langserve/blob/main/examples/api_handler_examples/server.py
If you can give me a bit more information about what you want to pass to the runnables, I could see if there's additional functionality we could add to better handle your use case.
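A rough sketch of option 1, using the per_req_config_modifier hook of add_routes to copy a request header into the 'configurable' section of the config; the x-user-id header, the /whoami path, and the user_id key are illustrative, and the exact keys your runnable reads are up to you:

from typing import Any, Dict

from fastapi import FastAPI, Request
from langchain_core.runnables import RunnableConfig, RunnableLambda
from langserve import add_routes

app = FastAPI()


def add_user_id(config: Dict[str, Any], request: Request) -> Dict[str, Any]:
    # Copy a request header into the "configurable" section of the config.
    config.setdefault("configurable", {})["user_id"] = request.headers.get("x-user-id")
    return config


def who_am_i(_: Any, config: RunnableConfig) -> str:
    # The runnable reads the value back from the per-request config.
    user_id = config.get("configurable", {}).get("user_id")
    return f"user_id={user_id}"


add_routes(
    app,
    RunnableLambda(who_am_i),
    path="/whoami",
    per_req_config_modifier=add_user_id,
)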
-
Thank you! My question was solved; I wrote the example code below.

from typing import Any, Dict, Optional

import jwt
from fastapi import FastAPI, Request, Response
from langchain_core.runnables import RunnableConfig, RunnableLambda
from langserve import add_routes
from langserve.schema import CustomUserType
from pydantic import BaseModel
from starlette.middleware.base import RequestResponseEndpoint

app = FastAPI()


class Credentials(BaseModel):
    sub: str
    name: str
    iat: int

    @classmethod
    def from_jwt(cls, token: str) -> "Credentials":
        return cls(**jwt.decode(token, key="secret", algorithms=["HS256"]))


@app.middleware("http")
async def authorize(
    request: Request, call_next: RequestResponseEndpoint
) -> Response:
    # Parse the Authorization header and stash the decoded credentials on
    # request.state; on any failure, fall through as an unauthenticated request.
    authorization_header: Optional[str] = request.headers.get("authorization")
    if authorization_header is None:
        return await call_next(request)
    bearer_token: Optional[str]
    try:
        bearer_token = authorization_header.split(" ")[1]
    except Exception:
        return await call_next(request)
    credentials: Optional[Credentials]
    try:
        credentials = Credentials.from_jwt(bearer_token)
    except Exception:
        return await call_next(request)
    request.state.credentials = credentials
    return await call_next(request)


class CustomUserTypeWithOptionalToken(CustomUserType):
    token: Optional[str] = None


class Greet(CustomUserType):
    name: str


class GreetWithOptionalToken(Greet, CustomUserTypeWithOptionalToken):
    ...


def greet(input: Greet, config: RunnableConfig) -> str:
    # Credentials are forwarded from the middleware via config["metadata"].
    credentials: Optional[Credentials] = None
    if "metadata" in config and "credentials" in config["metadata"]:
        credentials = config["metadata"]["credentials"]
    if credentials is None:
        return f"Hello, {input.name}!"
    return f"Hello, {input.name}! (Authorized as {credentials.name}))"


def credentials_config_modifier(
    config: Dict[str, Any], request: Request
) -> Dict[str, Any]:
    credentials: Optional[Credentials] = None
    if hasattr(request.state, "credentials"):
        credentials = request.state.credentials
    # Fall back to a token passed in the request body (relies on Starlette's
    # cached parsed body, a private attribute).
    token = CustomUserTypeWithOptionalToken(**request._json["input"])
    if token.token is not None:
        credentials = Credentials.from_jwt(token.token)
    if credentials is None:
        return config
    if "metadata" not in config:
        config["metadata"] = {}
    config["metadata"]["credentials"] = credentials
    return config


add_routes(
    app,
    RunnableLambda(greet),
    path="/greet",
    input_type=GreetWithOptionalToken,
    per_req_config_modifier=credentials_config_modifier,
)

if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="localhost", port=8000)

Execution result:

$ curl 'http://127.0.0.1:8000/greet/invoke' \
  -X POST \
  -H 'Content-Type: application/json' \
  -H 'Authorization: Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG4gRG9lIiwiaWF0IjoxNTE2MjM5MDIyfQ.XbPfbIHMI6arZ3Y922BhjWgQzWXcXNrz0ogtVhfEd2o' \
  -d '{"input":{"name":"John"},"config":{}}'
{"output":"Hello, John! (Authorized as John Doe))","callback_events":[],"metadata":{"run_id":"7acf4957-4bb6-420c-83bf-1abecf1d729f"}}
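If you want to mint your own test token for the curl call above, here is a short sketch with PyJWT (the same jwt library the server imports), using the same "secret" key, HS256 algorithm, and claims as Credentials.from_jwt expects:

import jwt

# Claims mirror the Credentials model (sub, name, iat); values match the test token.
token = jwt.encode(
    {"sub": "1234567890", "name": "John Doe", "iat": 1516239022},
    key="secret",
    algorithm="HS256",
)
print(token)  # Paste into the Authorization: Bearer header of the curl call.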