
RuntimeError: expected scalar type BFloat16 but found Float #15

Open

wheeinin opened this issue Jan 6, 2025 · 1 comment

wheeinin commented Jan 6, 2025

Hello! I encountered a problem when fine-tuning on my own dataset, and I really don't know how to solve it.
Traceback (most recent call last):
  File "lora_infer.py", line 204, in <module>
    generation_output = model.generate(
  File "/root/autodl-tmp/prollama/ProLLaMA-main/peft/peft_model.py", line 581, in generate
    outputs = self.base_model.generate(**kwargs)
  File "/root/miniconda3/envs/q/lib/python3.8/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/root/miniconda3/envs/q/lib/python3.8/site-packages/transformers/generation/utils.py", line 1719, in generate
    return self.sample(
  File "/root/miniconda3/envs/q/lib/python3.8/site-packages/transformers/generation/utils.py", line 2801, in sample
    outputs = self(
  File "/root/miniconda3/envs/q/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/root/miniconda3/envs/q/lib/python3.8/site-packages/accelerate/hooks.py", line 166, in new_forward
    output = module._old_forward(*args, **kwargs)
  File "/root/miniconda3/envs/q/lib/python3.8/site-packages/transformers/models/llama/modeling_llama.py", line 1034, in forward
    outputs = self.model(
  File "/root/miniconda3/envs/q/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/root/miniconda3/envs/q/lib/python3.8/site-packages/transformers/models/llama/modeling_llama.py", line 922, in forward
    layer_outputs = decoder_layer(
  File "/root/miniconda3/envs/q/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/root/miniconda3/envs/q/lib/python3.8/site-packages/transformers/models/llama/modeling_llama.py", line 672, in forward
    hidden_states, self_attn_weights, present_key_value = self.self_attn(
  File "/root/miniconda3/envs/q/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/root/miniconda3/envs/q/lib/python3.8/site-packages/transformers/models/llama/modeling_llama.py", line 366, in forward
    query_states = self.q_proj(hidden_states)
  File "/root/miniconda3/envs/q/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/root/autodl-tmp/prollama/ProLLaMA-main/peft/tuners/lora.py", line 375, in forward
    result += self.lora_B(self.lora_A(self.lora_dropout(x))) * self.scaling
  File "/root/miniconda3/envs/q/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/root/miniconda3/envs/q/lib/python3.8/site-packages/torch/nn/modules/linear.py", line 114, in forward
    return F.linear(input, self.weight, self.bias)
RuntimeError: expected scalar type BFloat16 but found Float

Can someone provide guidance on how to proceed? Thanks in advance!
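
For reference, the error comes from F.linear inside the bundled LoRA layer receiving tensors of mixed dtypes (BFloat16 weights alongside Float activations, or the reverse). A minimal sketch of one common workaround, assuming hypothetical model and adapter paths rather than the repo's actual lora_infer.py, is to load the base model in bfloat16 and cast the attached adapter to the same dtype before calling generate():

```python
# Minimal sketch, assuming hypothetical paths; not the repo's lora_infer.py.
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer
from peft import PeftModel

base = LlamaForCausalLM.from_pretrained(
    "path/to/ProLLaMA_base",          # hypothetical base model path
    torch_dtype=torch.bfloat16,       # keep base weights in bfloat16
    device_map="auto",
)
tokenizer = LlamaTokenizer.from_pretrained("path/to/ProLLaMA_base")

model = PeftModel.from_pretrained(base, "path/to/lora_adapter")  # hypothetical adapter path
model = model.to(torch.bfloat16)      # cast LoRA A/B matrices to match the base dtype
model.eval()

inputs = tokenizer("example prompt", return_tensors="pt").to(base.device)
with torch.no_grad():
    generation_output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(generation_output[0], skip_special_tokens=True))
```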

@Lyu6PosHao
Member

I guess this is caused by the src/peft code being too old. I have updated the code; details can be found in README.md.

Please let me know if the error still occurs with the new code.

Best regards
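
For context on why updating helps: newer PEFT LoRA layers cast the incoming activations to the LoRA weights' dtype before the low-rank matmuls and cast the update back afterwards, which avoids exactly this kind of mismatch. A rough, illustrative sketch with simplified names (not the upstream implementation):

```python
# Illustrative sketch of the dtype handling newer PEFT LoRA layers apply;
# class and attribute names are simplified, not the upstream code.
import torch
import torch.nn as nn

class LoraLinear(nn.Module):
    def __init__(self, base: nn.Linear, r: int = 8, scaling: float = 2.0):
        super().__init__()
        self.base = base                                    # frozen base projection (e.g. q_proj)
        self.lora_A = nn.Linear(base.in_features, r, bias=False)
        self.lora_B = nn.Linear(r, base.out_features, bias=False)
        self.scaling = scaling

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        result = self.base(x)
        prev_dtype = x.dtype
        x = x.to(self.lora_A.weight.dtype)                  # align activations with LoRA weights
        update = self.lora_B(self.lora_A(x)) * self.scaling
        return result + update.to(prev_dtype)               # cast the LoRA update back

# Base layer in bfloat16, LoRA weights in float32: forward runs without a dtype error.
layer = LoraLinear(nn.Linear(16, 16).to(torch.bfloat16))
out = layer(torch.randn(2, 16, dtype=torch.bfloat16))
```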
