Cheese Chatter is an LLM app ready to integrate with a Telegram bot. The application is built with LangChain and LangGraph and deployed on a local server using the LangGraph Platform.
The project structure is designed for high scalability, reusability, and customization of Graphs and Subgraphs. The code is organized around StateHandler classes, which are responsible for modifying the GraphState through Runnables, Commands, and Subgraphs.
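For orientation, here is a minimal, self-contained sketch of that pattern. It is illustrative only: the `GraphState` keys, the `AnswerHandler` class, and the node names are assumptions, not the project's actual entities (those live under `src/cheese/entity/`).

```python
# Minimal sketch of a state-handler-style node: a callable object that updates
# the GraphState and routes the run with a Command (requires a recent langgraph).
# All names here are illustrative, not the project's actual classes.
from typing_extensions import TypedDict

from langgraph.graph import END, START, StateGraph
from langgraph.types import Command


class GraphState(TypedDict):
    question: str
    answer: str


class AnswerHandler:
    """Hypothetical state handler: writes an answer into the state and ends the run."""

    def __call__(self, state: GraphState) -> Command:
        return Command(
            update={"answer": f"Cheese fact about: {state['question']}"},
            goto=END,
        )


builder = StateGraph(GraphState)
builder.add_node("respond", AnswerHandler())
builder.add_edge(START, "respond")
graph = builder.compile()

print(graph.invoke({"question": "Comté", "answer": ""}))
```

The real project wires nodes, conditional edges, and subgraphs through the entities in `src/cheese/entity/` and the graph definition in `src/cheese/config/config_graph.py`.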
The Graph architecture captured on each run is shown in `artifacts/cheese_graph.png`.

Requirements:

- Python 3.12.5
- Ollama 4.0 or higher (free), or an Azure OpenAI deployment (paid)
- pip (Python package manager)
To install the project:

- Clone the repository.

- Create a virtual environment:

  ```bash
  python3 -m venv .venv
  ```

- Activate the virtual environment:

  - On Windows:

    ```
    .venv\Scripts\activate
    ```

  - On macOS and Linux:

    ```bash
    source .venv/bin/activate
    ```

- Install the project:

  ```bash
  python3 -m pip install -e .
  ```
To run the project:

- Set up the LLM services. Choose one of the following options:

  - Start the Ollama service:

    ```bash
    ollama run llama3.1
    ```

  - Configure all your model variables in your `.env`:

    ```bash
    cp .env.example .env
    ```

- Run the LangGraph Platform app (a sketch of calling the running server follows this list):

  ```bash
  make run-app
  ```

- Compile the graph manually: see `./runnable.ipynb` for more information.
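Once the LangGraph Platform app is up, you can talk to it over its HTTP API with the LangGraph SDK (`pip install langgraph-sdk`). The snippet below is a minimal sketch only: the server URL, the registered graph name (`cheese`), and the input keys are assumptions; check the Makefile and the graph registration for the actual values.

```python
# Minimal sketch of calling the locally running LangGraph Platform app.
# The URL, graph name, and input keys below are assumptions.
from langgraph_sdk import get_sync_client

client = get_sync_client(url="http://localhost:2024")  # default port of `langgraph dev`

for chunk in client.runs.stream(
    None,       # None = stateless (threadless) run
    "cheese",   # registered graph/assistant name -- an assumption
    input={"question": "Which cheese melts best for fondue?"},
    stream_mode="updates",
):
    print(chunk.event, chunk.data)
```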
Project structure:

```
cheese-chatter/
├── main.py                         # Main file to run the project
├── app.py                          # Main file to assemble the app
├── runnable.ipynb                  # Notebook for debugging and interacting with the project
├── requirements.txt                # Project dependency list
├── .env                            # Environment variables for configuration
├── README.md                       # Project documentation
├── src/
│   └── cheese/
│       ├── components/
│       │   ├── nodes/
│       │   ├── edges/
│       │   │   ├── evaluators/     # Contains StateEvaluator for conditional edges
│       │   │   └── conditionals/   # Contains ConditionalEdge
│       │   ├── tools/              # Contains BaseTool
│       │   └── runnables/          # Contains executable invoke files
│       ├── utils/
│       │   ├── common.py
│       │   ├── logger.py
│       │   └── type_vars.py
│       ├── config/
│       │   ├── config_graph.py     # Contains the definition of graph nodes and edges
│       │   ├── subgraphs/          # Contains subgraphs to be implemented as nodes
│       │   └── runnables/          # Contains prompts and LLM configuration
│       ├── managers/               # Contains manager classes
│       ├── services/               # Contains services
│       ├── entity/
│       │   ├── models/             # Contains structural models
│       │   ├── graph_layout.py     # Initializes the Graph Layout with a Config Graph dataclass
│       │   ├── runnable_builder.py # Builder for LangChain Runnables
│       │   ├── statehandler.py     # Contains main entities for GraphState handlers
│       │   ├── node.py             # Contains main entities related to nodes
│       │   └── edge.py             # Contains main entities related to edges
│       └── constants/
│           └── __init__.py         # Contains project constants
├── config/
│   └── config.yaml                 # Main configuration file
├── research/                       # Directory for experimentation scripts and notebooks
├── tests/                          # Directory for testing modules
│   ├── integration_test/
│   └── unit_test/
├── artifacts/                      # Directory for artifacts
│   ├── cheese_graph.png            # Image of the application's main architecture
│   └── models/                     # Directory for models generated in research
└── logs/                           # Directory for project logs
```
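To illustrate the split between `config_graph.py` (graph definition) and `graph_layout.py` (graph assembly), here is a minimal sketch of the idea. The class, field, and node names are assumptions for illustration, not the project's actual definitions; see those modules for the real wiring.

```python
# Minimal sketch: a config dataclass describes nodes and edges, and a layout
# helper assembles the StateGraph from it. Names are illustrative only.
from dataclasses import dataclass, field
from typing import Callable

from typing_extensions import TypedDict
from langgraph.graph import END, START, StateGraph


class GraphState(TypedDict):
    question: str
    answer: str


@dataclass
class ConfigGraph:
    """Hypothetical stand-in for the project's Config Graph dataclass."""
    nodes: dict[str, Callable] = field(default_factory=dict)
    edges: list[tuple[str, str]] = field(default_factory=list)


def build_layout(config: ConfigGraph) -> StateGraph:
    """Assemble a StateGraph from the declarative config."""
    builder = StateGraph(GraphState)
    for name, handler in config.nodes.items():
        builder.add_node(name, handler)
    for source, target in config.edges:
        builder.add_edge(source, target)
    return builder


config = ConfigGraph(
    nodes={"respond": lambda state: {"answer": f"About cheese: {state['question']}"}},
    edges=[(START, "respond"), ("respond", END)],
)
graph = build_layout(config).compile()

print(graph.invoke({"question": "How is Gruyère aged?", "answer": ""}))
```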