
YOLOv5 train sample #13510

Open · wants to merge 4 commits into master

Conversation

@wangxc2006 commented Feb 13, 2025

🛠️ PR Summary

Made with ❤️ by Ultralytics Actions

🌟 Summary

This PR introduces support for TPU-specific model compilation and execution, enhancing training performance on specialized hardware. 🚀

📊 Key Changes

  • Added TPU integration: Includes TPU-specific modules and tools such as tpu_mlir_jit, enabling model compilation and execution on TPU hardware.
  • Optimized graph transformation: Introduced new graph conversion utilities (fx2mlir) for converting PyTorch models into TPU-compatible formats.
  • Enhanced model compilation: Uses PyTorch's aot_autograd machinery to enable ahead-of-time (AOT) module export and joint graph compilation for TPU acceleration.
  • Updated training script: Integrated the TPU backend into the main training loop via torch.compile (a minimal sketch of this pattern follows this list).
  • Additional argument options: Added command-line arguments for finer control over TPU compilation and debugging settings.
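
For context on the torch.compile integration described above, here is a minimal, hedged sketch of how a custom TPU backend is typically wired into torch.compile via aot_autograd. This is not the code from this PR: the fx2mlir/tpu_mlir_jit calls appear only as commented placeholders for the PR's utilities, and the compiler falls back to eager execution so the snippet stays runnable without TPU hardware.

```python
# Hypothetical sketch (assumed names; not the PR's actual implementation).
import torch
from torch._functorch.aot_autograd import aot_module_simplified


def tpu_backend(gm: torch.fx.GraphModule, example_inputs):
    """A torch.compile backend stub illustrating the aot_autograd pattern."""

    def fw_compiler(fx_gm, inputs):
        # Placeholder: a real TPU backend would lower the FX graph to MLIR and
        # return a callable that executes on the TPU, e.g. (assumed API):
        #   compiled = tpu_mlir_jit.compile(fx2mlir.convert(fx_gm, inputs))
        # Returning the eager forward keeps this sketch runnable anywhere.
        return fx_gm.forward

    # aot_autograd traces forward/backward graphs ahead of time and hands each
    # traced graph to the supplied compiler.
    return aot_module_simplified(gm, example_inputs, fw_compiler=fw_compiler)


model = torch.nn.Linear(8, 4)
compiled_model = torch.compile(model, backend=tpu_backend)
print(compiled_model(torch.randn(2, 8)).shape)  # torch.Size([2, 4])
```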

🎯 Purpose & Impact

  • Performance Boost: Enables accelerated training on TPU hardware by leveraging advanced compilation and optimization techniques. ⚡
  • TPU Support: Expands hardware compatibility for users deploying YOLOv5 on TPU platforms, making it more versatile. 🌐
  • Debugging and Flexibility: Additional CLI options enhance user control and debugging capabilities for TPU-related workflows. 🛠️
  • Seamless Integration: Maintains compatibility with the existing training pipeline while adding TPU-specific enhancements. 🔄

github-actions bot (Contributor) commented Feb 13, 2025


Thank you for your submission; we really appreciate it. Like many open-source projects, we ask that all committers sign our Contributor License Agreement before we can accept the contribution. You can sign the CLA by posting a Pull Request comment in the format below.


I have read the CLA Document and I sign the CLA


1 out of 3 committers have signed the CLA.
✅ [UltralyticsAssistant](https://github.com/UltralyticsAssistant)
❌ @zwysophon
❌ @wangxc2006
You can retrigger this bot by commenting recheck in this Pull Request. Posted by the CLA Assistant Lite bot.

@UltralyticsAssistant added the dependencies, detect, and enhancement labels on Feb 13, 2025
@UltralyticsAssistant (Member) commented:

👋 Hello @wangxc2006, thank you for submitting an ultralytics/yolov5 🚀 PR! To ensure a seamless integration of your work, please review the following checklist:

  • ✅ Define a Purpose: Clearly explain the purpose of your fix or feature in your PR description, and link to any relevant issues. Your work on TPU-specific model compilation and execution is an exciting enhancement! Please ensure your summary outlines all the key goals, such as performance boosts and hardware compatibility.
  • ✅ Synchronize with Source: Confirm your PR is synchronized with the ultralytics/yolov5 main branch. If it's behind, update it by clicking the 'Update branch' button or by running git pull and git merge main locally.
  • ✅ Ensure CI Checks Pass: Verify all Ultralytics Continuous Integration (CI) checks are passing. If any checks fail, please address them.
  • ✅ Update Documentation: Your changes introduce substantial new functionality related to TPU integration. Please update the relevant documentation so users have clear guidance on how to use these features and their expected benefits.
  • ✅ Add Tests: If applicable, please include TPU-specific unit and integration test cases for your changes. Make sure all tests pass to confirm the feature's robustness.
  • ✅ Sign the CLA: If you haven't yet signed our Contributor License Agreement, please do so by writing "I have read the CLA Document and I sign the CLA" in a new message on this PR.
  • ✅ Minimize Changes: While the comprehensive additions for TPU support are valuable, ensure the PR includes only the changes necessary for the feature and avoids unrelated updates.

To support further review:

  1. Could you provide a minimum reproducible example (MRE) that demonstrates TPU-specific training using your new features? This will assist in validating the functionality and performance of your implementation. πŸš€
  2. For clarity, consider including demonstrations (e.g., TPU-support CLI examples and before/after benchmarks) in your documentation or PR description.

For more details, please check out our Contributing Guide.

This is an automated response to guide the PR process 😊. An Ultralytics engineer will review this in more detail shortly. Thank you for contributing to Ultralytics! 🚀
