Possible memory leak executing inference multiple times #1939
Comments
Any update about this?
Please run it for 10 minutes and post the result.
I ran it for 10 minutes and the memory stabilized, sorry. But on another machine I get: D:\a\sherpa-onnx\sherpa-onnx\sherpa-onnx\csrc\session.cc:GetSessionOptionsImpl:176 Please compile with -DSHERPA_ONNX_ENABLE_GPU=ON. Available providers: AzureExecutionProvider, CPUExecutionProvider, . Fallback to cpu! And I have CUDA installed:
Please follow our doc and search sherpa-onnx's issues about running sherpa-onnx with a GPU.
@insanebytes Please see #1954
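Regarding the GPU fallback above: the error message indicates the binary was compiled without CUDA support, so ONNX Runtime only exposes the Azure and CPU execution providers regardless of whether CUDA is installed. A minimal build sketch enabling the flag the error message names (directory layout and `make` invocation are assumptions; the sherpa-onnx docs are the authoritative reference):

```shell
# Build sherpa-onnx from source with the GPU provider enabled.
# Assumes a compatible CUDA toolkit and cuDNN are already installed.
git clone https://github.com/k2-fsa/sherpa-onnx
cd sherpa-onnx
mkdir build && cd build
cmake -DSHERPA_ONNX_ENABLE_GPU=ON ..
make -j4
```

After rebuilding, the provider list printed by session.cc should include CUDAExecutionProvider instead of falling back to CPU.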
Hello, when I run inference with this code, I see that memory usage keeps increasing without being freed.
```csharp
using System;
using System.IO;
using NAudio.Wave;   // WasapiOut, WaveFormat, RawSourceWaveStream
using SherpaOnnx;

var cwd = Directory.GetCurrentDirectory();
string modelDirPath = Path.Join(cwd, "Assets", "Voice");
string modelPath = Path.Join(modelDirPath, "vits-vctk.int8.onnx");

var modelConfigVits = new OfflineTtsVitsModelConfig();
modelConfigVits.Model = modelPath;
modelConfigVits.Lexicon = Path.Join(modelDirPath, "lexicon.txt");
modelConfigVits.Tokens = Path.Join(modelDirPath, "tokens.txt");

var modelConfig = new OfflineTtsModelConfig();
modelConfig.Vits = modelConfigVits;
modelConfig.Provider = "cuda";

var config = new OfflineTtsConfig();
config.Model = modelConfig;

var offlineTts = new SherpaOnnx.OfflineTts(config);

var audioDevice = new WasapiOut();
var waveFormat = WaveFormat.CreateIeeeFloatWaveFormat(offlineTts.SampleRate, 1);
var waveFileStream = new MemoryStream(waveFormat.ConvertLatencyToByteSize(30 * 1000)); // pre-allocate 30 seconds
var rawSourceWaveStream = new RawSourceWaveStream(waveFileStream, waveFormat);
audioDevice.Init(rawSourceWaveStream);

while (true)
{
    var input = Console.ReadLine();
    // Inference on `input` happens here (elided in the original snippet);
    // the resulting audio struct is disposed after playback.
}
```
On first run, before any input: 274 MB RAM
After executing "hello world how are you": 341 MB RAM
Second execution, same phrase: 342 MB RAM
Third execution, same phrase: 343 MB RAM
I am disposing the result struct.
Is there something more to dispose, or is that extra 1 MB per inference a memory leak?
execution screenshot:
Thank you
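For reference, a sketch of the disposal pattern being described, assuming the sherpa-onnx C# binding's `Generate(text, speed, speakerId)` method returning a disposable generated-audio object (names per the binding's examples; the `GC.GetTotalMemory` reporting is an added illustration, not part of the original code):

```csharp
while (true)
{
    var input = Console.ReadLine();
    if (string.IsNullOrEmpty(input)) continue;

    // Generate(...) allocates native (unmanaged) memory for the samples,
    // so the result must be disposed explicitly each iteration.
    using (var audio = offlineTts.Generate(input, 1.0f, 0))
    {
        Console.WriteLine($"Generated {audio.Samples.Length} samples");
    } // Dispose() releases the native buffer here.

    // A small residual growth per call, as measured in Task Manager, can
    // also come from the .NET runtime not returning freed pages to the OS
    // immediately; forcing a collection helps distinguish that from a leak.
    GC.Collect();
    GC.WaitForPendingFinalizers();
    Console.WriteLine($"Managed heap: {GC.GetTotalMemory(true)} bytes");
}
```

If the managed-heap figure stays flat while the process working set creeps up and later stabilizes (as in the 10-minute run above), that pattern points to allocator/runtime caching rather than a true leak.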