
Endpoint fails to execute on Runpod with the docker image provided (AttributeError: 'NoneType' object has no attribute 'tobytes') #3

Open
marten42 opened this issue Jun 29, 2024 · 5 comments

@marten42

I followed the how-to, pulled the current Docker image from "depositame/rvc_runpod_serverless/general", and deployed it from my RunPod console. Deployment ran smoothly on RunPod.

When creating a request for the endpoint on RunPod with an adjusted input_json (the audio_url and model_name given in the how-to no longer exist), I run into a Python error, which is visible in the RunPod logs. As input parameters I used an MP3 file (a small 3-minute song) and another model from "sail-rvc".
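
For reference, my request looks roughly like this (a minimal sketch; the endpoint ID, API key, and URLs are placeholders, and only the audio_url and model_name fields come from the how-to's input_json; everything else here is an assumption):

```python
import requests

# Placeholders; not from the how-to.
ENDPOINT_ID = "<endpoint-id>"
API_KEY = "<runpod-api-key>"

payload = {
    "input": {
        "audio_url": "https://example.com/song.mp3",  # small 3-minute MP3
        "model_name": "sail-rvc/<model>",             # replacement for the removed example model
    }
}

# Standard RunPod serverless synchronous endpoint.
resp = requests.post(
    f"https://api.runpod.ai/v2/{ENDPOINT_ID}/runsync",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=600,
)
print(resp.json())
```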

Here is the error I am getting:
AttributeError: 'NoneType' object has no attribute 'tobytes'
temp_dir = Path(dir) / self.hash_bytes(data.tobytes())

More context from the log/trace (oldest entries first; all from pod ocmtv88uo73xas):

2024-06-28 19:18:22.713 [info] INFO | 68a8cde3-c7ec-4ba3-80e3-3a8cf59b1c44-e1 | Started
2024-06-28 19:18:22.713 [info] Downloading config.json: 100%|██████████| 130/130 [00:00<00:00, 688kB/s]
2024-06-28 19:18:22.713 [info] Downloading model.pth: 100%|██████████| 55.2M/55.2M [00:00<00:00, 59.9MB/s]
2024-06-28 19:18:25.221 [info] Downloading model.index: 100%|██████████| 373M/373M [00:05<00:00, 74.5MB/s]
2024-06-28 19:18:26.457 [info] Traceback (most recent call last):
  File "/opt/conda/lib/python3.10/site-packages/gradio/routes.py", line 439, in run_predict
    output = await app.get_blocks().process_api(
  File "/opt/conda/lib/python3.10/site-packages/gradio/blocks.py", line 1387, in process_api
    data = self.postprocess_data(fn_index, result["prediction"], state)
  File "/opt/conda/lib/python3.10/site-packages/gradio/blocks.py", line 1321, in postprocess_data
    prediction_value = block.postprocess(prediction_value)
    file_path = self.audio_to_temp_file(
  File "/opt/conda/lib/python3.10/site-packages/gradio/components/base.py", line 321, in audio_to_temp_file
    temp_dir = Path(dir) / self.hash_bytes(data.tobytes())
AttributeError: 'NoneType' object has no attribute 'tobytes'
2024-06-28 19:18:26.466 [error] ERROR | 68a8cde3-c7ec-4ba3-80e3-3a8cf59b1c44-e1 | Captured Handler Exception
2024-06-28 19:18:26.466 [error] ERROR | {
    "error_type": "<class 'ValueError'>",
    "error_message": "None",
    "error_traceback": "Traceback (most recent call last):\n File "/opt/conda/lib/python3.10/site-packages/runpod/serverless/modules/job.py", line 85, in run_job\n job_output = handler(job)\n File "/rvc_serverless/main.py", line 283, in handler\n return self.infer(request)\n File "/rvc_serverless/main.py", line 229, in infer\n result = client.predict(\n File "/opt/conda/lib/python3.10/site-packages/gradio_client/client.py", line 285, in predict\n return self.submit(*args, api_name=api_name, fn_index=fn_index).result()\n File "/opt/conda/lib/python3.10/site-packages/gradio_client/client.py", line 982, in result\n raise self.future._exception # type: ignore\n File "/opt/conda/lib/python3.10/concurrent/futures/thread.py", line 58, in run\n result = self.fn(*self.args, **self.kwargs)\n File "/opt/conda/lib/python3.10/site-packages/gradio_client/client.py", line 645, in _inner\n predictions = _predict(*data)\n File "/opt/conda/lib/python3.10/site-packages/gradio_client/client.py", line 676, in _predict\n raise ValueError(result["error"])\nValueError: None\n",
    "host_name": "ocmtv88uo73xas-64411128",
    "pod_id": "ocmtv88uo73xas"
}
2024-06-28 19:18:26.620 [info] INFO | 68a8cde3-c7ec-4ba3-80e3-3a8cf59b1c44-e1 | Finished
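
If I read the trace correctly, the Gradio app returned None for the audio output, so Gradio's postprocessing tries to hash an array that is not there. A minimal illustration of just that failing pattern (not the actual Gradio code; the same operation on a None value):

```python
import numpy as np

# Gradio hashes the output audio's raw bytes to build a temp-file path,
# roughly: temp_dir = Path(dir) / hash_bytes(data.tobytes())
data = np.zeros(4, dtype=np.float32)
print(data.tobytes())  # fine: an ndarray supports .tobytes()

data = None  # what postprocess apparently received instead of audio
try:
    data.tobytes()
except AttributeError as e:
    print(e)  # 'NoneType' object has no attribute 'tobytes'
```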

@chavinlo
Owner

chavinlo commented Jul 3, 2024

Made some changes to the requirements; please try building the Docker image again.

@Hassanahmed669

sail-rvc/example is not available anymore, so can you guide us to any existing model? Also, how do we get the model, where do we place it in the Docker image, and what should the corresponding request look like?

@chavinlo
Owner

chavinlo commented Jul 6, 2024

> sail-rvc/example is not available anymore, so can you guide us to any existing model? Also, how do we get the model, where do we place it in the Docker image, and what should the corresponding request look like?

You can use any model that's on https://huggingface.co/sail-rvc; just copy the Hugging Face path (e.g. "sail-rvc/Donald_Trump__RVC_v2_").
As for where to place the model, you just need to include it in the HTTP request.
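
Judging from the download lines in the log above, the worker pulls the model from Hugging Face at request time, so you don't pre-place files in the image; they land in huggingface_hub's default cache. A sketch of that resolution (assuming the image doesn't override the cache location; model.pth is one of the files shown downloading in the log):

```python
from huggingface_hub import hf_hub_download

# Assumes the stock huggingface_hub cache (HF_HOME / HF_HUB_CACHE not overridden).
path = hf_hub_download(
    repo_id="sail-rvc/Donald_Trump__RVC_v2_",
    filename="model.pth",
)
print(path)  # typically under /root/.cache/huggingface/hub/ inside the container
```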

@Hassanahmed669

OK, but what would be the exact location? Where can I find the models in my container? If I want to put models directly into my Docker image, where should I place them? Not only for "sail-rvc": if I need a different model, what is the default location in the Docker image/container?

@Hassanahmed669

Can you please answer? Thanks.
