Commit

Revert "Merge branch 'main' into transformers_future"
This reverts commit 50e63e5, reversing
changes made to 7f32580.
yafshar committed Feb 24, 2025
1 parent 50e63e5 commit 09298b6
Showing 4 changed files with 8 additions and 10 deletions.
10 changes: 5 additions & 5 deletions examples/stable-diffusion/text_to_image_generation.py
@@ -508,6 +508,9 @@ def main():
**kwargs,
)

if args.lora_id:
pipeline.load_lora_weights(args.lora_id)

elif sd3:
# SD3 pipelines
if controlnet:
@@ -526,7 +529,6 @@ def main():
args.model_name_or_path,
**kwargs,
)

elif flux:
# Flux pipelines
if controlnet:
@@ -557,6 +559,8 @@ def main():
controlnet=controlnet,
**kwargs,
)
if args.lora_id:
pipeline.load_lora_weights(args.lora_id)

elif inpainting:
# SD Inpainting pipeline
@@ -600,10 +604,6 @@ def main():
**kwargs,
)

# Load LoRA weights if provided
if args.lora_id:
pipeline.load_lora_weights(args.lora_id)

# Setup logging
logging.basicConfig(
format="%(asctime)s - %(levelname)s - %(name)s - %(message)s",
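
Taken together, the hunks above restore branch-local LoRA loading: `load_lora_weights` is called inside the first pipeline branch (labeled `sdxl` below as an assumption; the diff only shows that it precedes `elif sd3:`) and inside the Flux branch, while the centralized "# Load LoRA weights if provided" block after the whole if/elif chain is removed. A minimal sketch of the resulting control flow, with hypothetical pipeline factories standing in for the real pipeline classes used by the script:

```python
# Sketch only: mirrors the if/elif structure shown in the diff, not the real script.
def build_pipeline(args, sdxl, sd3, flux, factories, **kwargs):
    if sdxl:
        pipeline = factories["sdxl"](args.model_name_or_path, **kwargs)
        # LoRA weights are loaded inside this branch ...
        if args.lora_id:
            pipeline.load_lora_weights(args.lora_id)
    elif sd3:
        # SD3 pipelines: no LoRA loading here after the revert.
        pipeline = factories["sd3"](args.model_name_or_path, **kwargs)
    elif flux:
        # Flux pipelines: ... and inside this branch, instead of once for every
        # pipeline after the if/elif chain (the centralized block removed in the
        # last hunk above).
        pipeline = factories["flux"](args.model_name_or_path, **kwargs)
        if args.lora_id:
            pipeline.load_lora_weights(args.lora_id)
    else:
        raise ValueError("unsupported pipeline type in this sketch")
    return pipeline
```
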
5 changes: 2 additions & 3 deletions examples/text-generation/README.md
@@ -210,8 +210,7 @@ python run_generation.py \
--dataset_name JulesBelveze/tldr_news \
--column_name content \
--bf16 \
--sdp_on_bf16 \
--trust_remote_code
--sdp_on_bf16
```
> The prompt length is limited to 16 tokens. Prompts longer than this will be truncated.
@@ -519,7 +518,7 @@ python run_generation.py \
### Saving FP8 Checkpoints in Hugging Face format
After quantizing the model, we can save it to a local path.

> [!NOTE]
> [!NOTE]
> Before executing the command below, please refer to the [Running with FP8](#running-with-fp8) section to measure the model quantization statistics.
Here is an example of how to quantize and save the Llama3.1-70B model on two cards:
2 changes: 1 addition & 1 deletion examples/text-generation/run_generation.py
@@ -659,7 +659,7 @@ def rounder(x):

assert not args.simulate_dyn_prompt, "Both dataset_name and simulate_dyn_prompt are set"

raw_dataset = load_dataset(args.dataset_name, trust_remote_code=args.trust_remote_code)
raw_dataset = load_dataset(args.dataset_name)
if "test" in raw_dataset:
split = "test"
elif "validation" in raw_dataset:
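
This hunk simply drops the `trust_remote_code` argument from the `load_dataset` call. For context, a minimal sketch of how the reverted call sits in the surrounding split selection; `load_prompt_dataset` is a hypothetical helper, and the final `train` fallback is an assumption, since the hunk is truncated after the `validation` check:

```python
from datasets import load_dataset

def load_prompt_dataset(dataset_name):
    # Reverted call: no trust_remote_code forwarded to the datasets library.
    raw_dataset = load_dataset(dataset_name)
    # Split preference follows the diff context: test first, then validation.
    if "test" in raw_dataset:
        split = "test"
    elif "validation" in raw_dataset:
        split = "validation"
    else:
        split = "train"  # assumption: the diff is truncated at this point
    return raw_dataset[split]
```
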
1 change: 0 additions & 1 deletion examples/text-to-speech/requirements.txt
@@ -1,3 +1,2 @@
datasets
soundfile
sentencepiece
