Upgrade to onnxruntime 0.19 #753
base: main
Conversation
- RUN python3 -m pip install --extra-index-url https://download.pytorch.org/whl/cu118 \
+ RUN python3 -m pip install \
+     --extra-index-url https://download.pytorch.org/whl/cu118 \
+     --extra-index-url https://aiinfra.pkgs.visualstudio.com/PublicPackages/_packaging/onnxruntime-cuda-11/pypi/simple/ \
What is this guy doing? Changing the default CUDA version for onnxruntime? So onnxruntime installs CUDA?
The default is now to install the onnxruntime that's compatible with CUDA 12; this forces it to install the one built for CUDA 11 instead. (It used to be the reverse; you had to choose the channel for CUDA 12 and the default was 11)
@@ -264,7 +264,7 @@
  ONNXRUNTIME_EXECUTION_PROVIDERS = os.getenv(
      "ONNXRUNTIME_EXECUTION_PROVIDERS",
-     "[CUDAExecutionProvider,OpenVINOExecutionProvider,CPUExecutionProvider]",
+     "[CUDAExecutionProvider,OpenVINOExecutionProvider,CoreMLExecutionProvider,CPUExecutionProvider]",
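For context, a minimal sketch (not the project's actual parsing code) of how a bracketed provider string like the default above could be split into the list that onnxruntime expects:

```python
import os

# Hypothetical sketch: parse the bracketed, comma-separated provider string
# from the environment variable into a Python list. The default value here
# mirrors the one in the diff above.
raw = os.getenv(
    "ONNXRUNTIME_EXECUTION_PROVIDERS",
    "[CUDAExecutionProvider,OpenVINOExecutionProvider,CoreMLExecutionProvider,CPUExecutionProvider]",
)
providers = [p.strip() for p in raw.strip("[]").split(",") if p.strip()]
print(providers)
```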
Now everyone not on a Mac is gonna get stupid warnings about not having CoreMLExecutionProvider :)
Yeah, I was thinking we should probably update this default to be what onnxruntime.get_available_providers() reports, but I was going to do that in a different PR.
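The follow-up suggested here could look something like this sketch (an assumption about a future change, not code from this PR), with a CPU fallback in case onnxruntime isn't importable:

```python
# Sketch only: derive the provider default from whatever the installed
# onnxruntime actually supports, instead of hard-coding the list.
try:
    import onnxruntime
    providers = onnxruntime.get_available_providers()
except ImportError:
    # Fallback when onnxruntime isn't installed; CPU is always available
    # in any real onnxruntime build.
    providers = ["CPUExecutionProvider"]

default = "[" + ",".join(providers) + "]"
print(default)
```

This would keep the default in sync with the installed wheel and avoid warnings about providers that can never load on the current platform.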
Description
This updates our dependencies to the latest onnxruntime and updates our Docker containers to match.
Benefits:
Type of change
How has this change been tested? Please provide a testcase or example of how you tested the change.
Have run the tests locally. About to build all the containers via our GitHub Actions.
Any specific deployment considerations
No.
Docs