Finetune Llama 3.3, DeepSeek-R1 & Reasoning LLMs 2x faster with 70% less memory! 🦥
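For context, here is a minimal sketch of what a QLoRA finetune in the Unsloth style typically looks like. The `FastLanguageModel` API names follow the project's README, and the `trl.SFTTrainer` arguments match older trl releases (signatures vary across versions); the model id and dataset path are placeholders, so verify everything against the current docs before running.

```python
# Minimal QLoRA finetuning sketch in the Unsloth style (names assumed from the
# project's README; verify against the current docs).
from datasets import load_dataset
from transformers import TrainingArguments
from trl import SFTTrainer
from unsloth import FastLanguageModel

# Load a 4-bit quantized base model plus its tokenizer (model id is a placeholder).
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/llama-3-8b-bnb-4bit",
    max_seq_length=2048,
    load_in_4bit=True,
)

# Attach LoRA adapters so only a small fraction of the weights is trained.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)

# Hypothetical dataset: any split with a plain "text" column works here.
dataset = load_dataset("json", data_files="train.jsonl", split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=2048,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        max_steps=60,
        learning_rate=2e-4,
        fp16=True,
        output_dir="outputs",
    ),
)
trainer.train()
```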
Welcome to the Llama Cookbook! This is your go-to guide for building with Llama: getting started with inference, fine-tuning, and RAG. We also show you how to solve end-to-end problems with the Llama model family and how to use it on various provider services.
Efficient Triton Kernels for LLM Training
H2O LLM Studio - a framework and no-code GUI for fine-tuning LLMs. Documentation: https://docs.h2o.ai/h2o-llmstudio/
A fast library for AutoML and tuning. Join our Discord: https://discord.gg/Cppx2vSPVP.
Interact with your SQL database: natural language to SQL using LLMs
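A generic sketch of the natural-language-to-SQL pattern the line above describes (not this repository's actual API): pull the table schema from SQLite, prompt an LLM for a query, and execute the result. The database file, model name, and question are hypothetical.

```python
# Generic NL-to-SQL sketch: schema -> LLM prompt -> generated SQL -> execution.
import sqlite3
from openai import OpenAI  # any chat-completion client would do; key read from env

conn = sqlite3.connect("example.db")  # hypothetical database
schema = "\n".join(
    row[0]
    for row in conn.execute("SELECT sql FROM sqlite_master WHERE type = 'table'")
    if row[0]
)

question = "How many orders were placed last month?"
prompt = (
    f"Given this SQLite schema:\n{schema}\n\n"
    f"Write a single SQL query answering: {question}\n"
    "Return only the SQL."
)

client = OpenAI()
sql = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": prompt}],
).choices[0].message.content.strip()

# Run the generated query and show the rows it returns.
print(conn.execute(sql).fetchall())
```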
A PyTorch Library for Meta-learning Research
Build, customize, and control your own LLMs. From data pre-processing to fine-tuning, xTuring provides an easy way to personalize open-source LLMs. Join our Discord community: https://discord.gg/TgHXuSJEk6
Curated tutorials and resources for Large Language Models, Text2SQL, Text2DSL, Text2API, Text2Vis, and more.
🎯 Task-oriented embedding tuning for BERT, CLIP, etc.
The easiest and laziest way to build multi-agent LLM applications.
Mastering Applied AI, One Concept at a Time
Toolkit for fine-tuning, ablating and unit-testing open-source LLMs.
Web UI for using XTTS and for finetuning it
A high-quality, stable OpenAI API proxy for enterprises and developers. Supports ChatGPT API calls and the OpenAI API, including gpt-4 and gpt-3.5. No OpenAI key, no OpenAI account, and no US-dollar bank card required; just call it directly. Stable and easy to use! 智增增
Finetuning large language models for GDScript generation.
[IJCAI 2023 survey track] A curated list of resources for chemical pre-trained models
Open-source project for data preparation for LLM application builders
Guide: Finetune GPT2-XL (1.5 billion parameters) and GPT-NEO (2.7B) on a single GPU with Hugging Face Transformers using DeepSpeed
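A rough sketch of the single-GPU recipe that guide covers, assuming the Hugging Face `Trainer` with a DeepSpeed ZeRO-2 CPU-offload config; the dataset and hyperparameters here are illustrative, not the guide's exact settings.

```python
# Sketch of single-GPU finetuning of GPT2-XL with Hugging Face Transformers and
# DeepSpeed (illustrative settings, not the guide's). Launch with: deepspeed train.py
import json
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# ZeRO stage 2 with optimizer-state offload to CPU is what lets the
# 1.5B-parameter model fit a single GPU's memory budget.
ds_config = {
    "zero_optimization": {
        "stage": 2,
        "offload_optimizer": {"device": "cpu"},
    },
    "fp16": {"enabled": True},
    "train_micro_batch_size_per_gpu": 1,
    "gradient_accumulation_steps": 8,
}
with open("ds_config.json", "w") as f:
    json.dump(ds_config, f)

tokenizer = AutoTokenizer.from_pretrained("gpt2-xl")
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained("gpt2-xl")

# Example corpus; any text dataset tokenized the same way would do.
dataset = load_dataset("wikitext", "wikitext-2-raw-v1", split="train")
dataset = dataset.map(
    lambda x: tokenizer(x["text"], truncation=True, max_length=512),
    batched=True,
    remove_columns=["text"],
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="gpt2-xl-finetuned",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        fp16=True,
        deepspeed="ds_config.json",  # batch sizes must match the DS config above
    ),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```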