Issues: microsoft/onnxruntime

Issues list

[Feature Request] Add Wasm Relaxed SIMD support and integer dot product instructions for ONNX Runtime Web (labels: feature request, platform:web)
#22533 opened Oct 22, 2024 by jing-bao
DistilBERT model inference failure using ONNX Runtime QNNExecutionProvider on Snapdragon® X Elite NPU (labels: ep:QNN, model:transformer)
#22532 opened Oct 22, 2024 by sean830314
[Mobile] Some files missing after build and install (labels: contributions welcome, feature request, platform:mobile)
#22531 opened Oct 22, 2024 by WangHHY19931001
[Build] Discrepancies in ONNX Runtime Inference Results on RISC-V (labels: contributions welcome)
#22530 opened Oct 22, 2024 by sarmentow
[Performance] Difference in the ONNX model loading times in C# vs Python (labels: api:CSharp, performance)
#22528 opened Oct 21, 2024 by BhSinghal
[Build] No onnxruntime package for Python 3.13 (labels: feature request)
#22523 opened Oct 21, 2024 by oussama-gourari
[Mobile] Memory crash after repeated inference with dynamic shape input (labels: api:Java, platform:mobile)
#22520 opened Oct 21, 2024 by laurenspriem
[Build] compilation error: invalid instruction mnemonic 'vcvtneeph2ps' (labels: build, contributions welcome)
#22519 opened Oct 21, 2024 by saiden89
[Mobile] QNN SetupBackend fails due to 32-bit library loading in 64-bit environment (labels: ep:QNN, platform:mobile)
#22518 opened Oct 21, 2024 by w11m
Crash when using DirectML to accelerate onnxruntime inference (labels: ep:DML)
#22514 opened Oct 20, 2024 by yunhaolsh
onnxruntime optimizer fails
#22512 opened Oct 19, 2024 by xadupre
[Performance] C++ API: destroy the execution provider when the Ort::Session is destroyed (labels: ep:QNN, performance)
#22511 opened Oct 19, 2024 by kristoftunner
[Mobile] iOS - ZipMap output cannot be read (labels: platform:mobile)
#22505 opened Oct 18, 2024 by sanjaymk908
[Web] Custom wasm model location path for inference (labels: api:Javascript, platform:web)
#22504 opened Oct 18, 2024 by sca1235
[Documentation] Document TensorRT engine encryption DLL specification (labels: documentation, ep:TensorRT)
#22496 opened Oct 18, 2024 by BengtGustafsson
[WebNN EP] Deprecate MLTensorUsage in favor of boolean flags (labels: ep:WebNN)
#22495 opened Oct 18, 2024 by huningxin
[Web] Can't create a session (labels: model:transformer, platform:web)
#22484 opened Oct 17, 2024 by djannot
onnxruntime.dll may fail to run in sandboxed processes on Windows (labels: platform:windows)
#22475 opened Oct 17, 2024 by SteveBeckerMSFT
[Feature Request] FP16 support for MatMul and GEMM on CPU execution provider (labels: feature request)
#22467 opened Oct 16, 2024 by cjm715
[Documentation] Running genai-directml-quantized models with metacommands disabled (labels: documentation, ep:DML, quantization)
#22466 opened Oct 16, 2024 by jakubedzior
[Training] Error building gradient graph for BERT models for on-device training (labels: contributions welcome, training)
#22465 opened Oct 16, 2024 by riccardopinosio