AdalFlow is a PyTorch-like library for building and auto-optimizing any LM workflow, from chatbots and RAG to agents.
- **Say goodbye to manual prompting**: AdalFlow provides a unified auto-differentiative framework for both zero-shot optimization and few-shot prompt optimization. Our research, LLM-AutoDiff and Learn-to-Reason Few-shot In-Context Learning, achieves the highest accuracy among auto-prompt optimization libraries.
- **Switch your LLM app to any model via a config**: AdalFlow provides model-agnostic building blocks for LLM task pipelines, ranging from RAG and Agents to classical NLP tasks (see the sketch below).
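The snippet below is a minimal sketch of that model-agnostic design (the client classes, model names, and template are illustrative assumptions; see the documentation for the exact API): the same `Generator` pipeline is pointed at a different provider by changing only its config entry.

```python
import adalflow as adal
from adalflow.components.model_client import OpenAIClient, GroqAPIClient

# One task pipeline, written once against the model-agnostic Generator.
template = r"""<SYS> You are a helpful assistant. </SYS>
User: {{input_str}}
You:"""

# Switching providers is a config change, not a code change.
# (Model names here are illustrative.)
configs = {
    "openai": {"model_client": OpenAIClient(), "model_kwargs": {"model": "gpt-4o-mini"}},
    "groq": {"model_client": GroqAPIClient(), "model_kwargs": {"model": "llama3-8b-8192"}},
}

generator = adal.Generator(template=template, **configs["openai"])
output = generator(prompt_kwargs={"input_str": "What is AdalFlow?"})
print(output)
```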
View Documentation
Install AdalFlow with pip:
```bash
pip install adalflow
```
View Quickstart: learn the end-to-end AdalFlow experience in 15 minutes.
[Jan 2025] Auto-Differentiating Any LLM Workflow: A Farewell to Manual Prompting
- LLM applications as auto-differentiation graphs (see the sketch below)
- Token-efficient, with better performance than DSPy
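As a rough illustration of treating an LLM application as an auto-differentiation graph (a simplified sketch following the documented `Parameter`/`Generator` pattern; the prompt text, client, and model settings below are assumptions), the system prompt becomes a trainable node that the optimizer can rewrite from textual feedback:

```python
import adalflow as adal
from adalflow.components.model_client import OpenAIClient

# The system prompt is declared as a trainable Parameter, i.e. a node in
# the auto-differentiation graph, rather than a hard-coded string.
system_prompt = adal.Parameter(
    data="You are a concise assistant. Answer in one sentence.",  # illustrative
    role_desc="system prompt that guides the assistant's behavior",
    requires_opt=True,  # mark this node as optimizable
)

# The Generator wires the trainable parameter into its template; during
# training, textual "gradients" (LLM feedback) flow back to this node and
# the optimizer proposes improved prompt text.
qa = adal.Generator(
    model_client=OpenAIClient(),
    model_kwargs={"model": "gpt-4o-mini"},  # illustrative model name
    template=r"<SYS>{{system_prompt}}</SYS> User: {{input_str}} You:",
    prompt_kwargs={"system_prompt": system_prompt},
)
```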
We work closely with the VITA Group at the University of Texas at Austin, led by Dr. Atlas Wang, with Dr. Junyuan Hong providing valuable support in driving project initiatives.
For collaboration, contact Li Yin.
Full AdalFlow documentation is available at adalflow.sylph.ai.
AdalFlow is named in honor of Ada Lovelace, the pioneering female mathematician who first recognized that machines could go beyond mere calculations. As a team led by a female founder, we aim to inspire more women to pursue careers in AI.
AdalFlow is a community-driven project, and we welcome everyone to join us in building the future of LLM applications.
Join our Discord community to ask questions, share your projects, and get updates on AdalFlow.
To contribute, please read our Contributor Guide.
Many existing works greatly inspired the AdalFlow library! Here is a non-exhaustive list:
- 📚 PyTorch for design philosophy and the design pattern of `Component`, `Parameter`, and `Sequential`.
- 📚 Micrograd: a tiny autograd engine for our auto-differentiative architecture.
- 📚 Text-Grad for the Textual Gradient Descent text optimizer.
- 📚 DSPy for inspiring the `__{input/output}__fields` in our `DataClass` and the bootstrap few-shot optimizer.
- 📚 OPRO for adding past text instructions along with their accuracy in the text optimizer.
- 📚 PyTorch Lightning for the `AdalComponent` and `Trainer`.