This repository has been archived by the owner on Aug 11, 2023. It is now read-only.

model assignment on multiple tpus #83

Open
hitch22 opened this issue May 7, 2021 · 2 comments
Labels
❤️ Happy feedback 📌 ToDo

Comments


hitch22 commented May 7, 2021

Howdy,

I was wondering if there is a way to create multiple tflite interpreters so that I can assign and run multiple Edge TPU models accordingly
(as outlined in: https://coral.ai/docs/edgetpu/multiple-edgetpu/#using-the-tensorflow-lite-python-api).

Could you please let me know if there is a way to run multiple models on different TPUs?

Thanks in advance
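
For reference, the approach on the linked Coral page boils down to one Edge TPU delegate per device, selected with the `device` option of `load_delegate`. Below is a minimal sketch using the plain tflite_runtime API; the model paths and the `":0"` / `":1"` device strings are placeholders assuming two Edge TPUs are attached:

```python
import tflite_runtime.interpreter as tflite

EDGETPU_LIB = "libedgetpu.so.1"  # "edgetpu.dll" on Windows, "libedgetpu.1.dylib" on macOS

# One interpreter per Edge TPU: the "device" option pins each delegate to a device
# (":N" is the N-th enumerated Edge TPU; "usb:N" / "pci:N" select by interface type).
interpreter_a = tflite.Interpreter(
    model_path="model_a_edgetpu.tflite",
    experimental_delegates=[tflite.load_delegate(EDGETPU_LIB, {"device": ":0"})],
)
interpreter_b = tflite.Interpreter(
    model_path="model_b_edgetpu.tflite",
    experimental_delegates=[tflite.load_delegate(EDGETPU_LIB, {"device": ":1"})],
)

interpreter_a.allocate_tensors()
interpreter_b.allocate_tensors()
```

Each interpreter then runs its own model on its own device independently of the other.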

Owner

hhk7734 commented May 7, 2021

🤩 Oh... I just learned about this.
I don't have a USB-type Edge TPU.
It's a feature I'd like to test. When I'm done with what I'm working on now, I'll buy one and test it.

hhk7734 added the 📌 ToDo and ❤️ Happy feedback labels May 7, 2021
Author

hitch22 commented May 10, 2021

I see. It appears that the "pci:" device option will let you choose among the TPUs if you have the PCIe version.
I wonder if that will let me switch between my M.2 TPUs, which supposedly have two Edge TPUs per module.
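
A quick sketch of how that check might look, assuming the pycoral package is installed and the dual Edge TPU M.2 module enumerates as two separate PCIe devices (whether both actually show up depends on the host slot/adapter):

```python
import tflite_runtime.interpreter as tflite
from pycoral.utils.edgetpu import list_edge_tpus  # assumes pycoral is installed

# See which Edge TPUs the runtime actually enumerates on this host.
for i, tpu in enumerate(list_edge_tpus()):
    print(i, tpu)  # e.g. {'type': 'pci', 'path': '/dev/apex_0'}

# If two PCIe devices show up, each delegate can be pinned to one of them.
delegate_0 = tflite.load_delegate("libedgetpu.so.1", {"device": "pci:0"})
delegate_1 = tflite.load_delegate("libedgetpu.so.1", {"device": "pci:1"})
```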

