
Int8 model inference problem #528

Open
zhaowujin opened this issue Mar 19, 2021 · 1 comment
zhaowujin commented Mar 19, 2021

I converted the model to int8 successfully, but I get an error during inference.

Error as follows:

```
  self.model_trt.load_state_dict(torch.load(model_path))
File "/usr/local/lib64/python3.6/site-packages/torch/nn/modules/module.py", line 832, in load_state_dict
  load(self)
File "/usr/local/lib64/python3.6/site-packages/torch/nn/modules/module.py", line 827, in load
  state_dict, prefix, local_metadata, True, missing_keys, unexpected_keys, error_msgs)
File "/usr/local/lib64/python3.6/site-packages/torch2trt-0.1.0-py3.6-linux-x86_64.egg/torch2trt/torch2trt.py", line 443, in _load_from_state_dict
  self.context = self.engine.create_execution_context()
AttributeError: 'NoneType' object has no attribute 'create_execution_context'
```

Model loading code:

```python
self.model_trt = TRTModule()
self.model_trt.load_state_dict(torch.load(model_path))
```

Conversion code:

```python
self.model_trt = torch2trt(self.model, [data_set[0]], int8_mode=True, max_batch_size=100, int8_calib_dataset=data_set)
torch.save(self.model_trt.state_dict(), '10_trt_int8_1000.pth')
```
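The `AttributeError` on the last traceback line means `self.engine` is `None`: deserializing the saved engine inside `_load_from_state_dict` failed, which commonly happens when the TensorRT version or GPU at load time differs from the one used when the engine was serialized. A minimal sketch of a defensive check that fails early with a clearer message — the `"engine"` state-dict key and the helper name are assumptions for illustration, not torch2trt's API:

```python
def check_engine_bytes(state_dict):
    """Fail early with a clear message if the serialized TensorRT
    engine is missing or empty, instead of hitting a later
    AttributeError on a None engine."""
    engine_bytes = state_dict.get("engine")  # key name is an assumption
    if not engine_bytes:
        raise RuntimeError(
            "Serialized TensorRT engine missing or empty; "
            "re-run torch2trt on this machine/TensorRT version."
        )
    return engine_bytes

# Simulate a state dict whose engine failed to serialize.
try:
    check_engine_bytes({"engine": None})
except RuntimeError as exc:
    print("caught:", exc)
```

If the check fires, re-exporting the engine with the same TensorRT version that will run inference is usually the fix, since serialized engines are not portable across versions.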

wang-TJ-20 commented

@zhaowujin Hello, have you solved this problem?
