Training routine #2

Closed
relyativist opened this issue Apr 19, 2022 · 10 comments
Labels
enhancement New feature or request

Comments

@relyativist

Hey,
I am trying transfer learning on my data with the provided weights. Do you have an updated training routine script for this? Do you know if it is possible to convert the Theano weights to PyTorch?
Thanks, VY

@ravnoor
Contributor

ravnoor commented Apr 21, 2022

Hey @relyativist,

Thanks for your interest in our work!

I've added a relatively complete end-to-end demo of transfer learning at app/examples/transfer.ipynb. Try to train and fine-tune with your own data, and see if you can get it to perform reasonably well. I've added annotations, wherever necessary, to get you up and running.

I've illustrated a single combination of layer freezing (or unfreezing) and model choice (since there's a CNN-1 and a CNN-2), so you can experiment with different combinations and see what works best for you.
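
For reference, a minimal sketch of what that freezing step can look like in Keras; the `models` list, layer counts, and optimizer settings here are assumptions for illustration, not the exact code in transfer.ipynb:

```python
# Illustrative sketch of freezing layers for fine-tuning in Keras.
# The `models` list (models[0] = CNN-1, models[1] = CNN-2), the number of
# trainable layers, and the optimizer settings are assumptions.
from keras.optimizers import Adam

def freeze_all_but_last(model, n_trainable=2):
    """Freeze every layer except the last `n_trainable` weighted layers."""
    for layer in model.layers:
        layer.trainable = False
    weighted = [l for l in model.layers if l.count_params() > 0]
    for layer in weighted[-n_trainable:]:
        layer.trainable = True
    # Recompile so the new trainable flags take effect before fit()
    model.compile(optimizer=Adam(lr=1e-4),
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# e.g. fine-tune only the deepest layers of CNN-2 while CNN-1 stays frozen:
# models[1] = freeze_all_but_last(models[1], n_trainable=2)
```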

If there are any questions or issues getting the notebook to run, please post here.

Best,
Ravnoor

@ravnoor
Contributor

ravnoor commented Apr 21, 2022

From my brief research, it looks like it isn't possible to convert Theano weights to PyTorch, either directly or indirectly via keras2onnx.

I might port the code over to pytorch eventually, but that's probably more than six months away. I'll open an issue to serve as a reminder.

@relyativist
Author

@ravnoor Thanks for the complete response and support. In your example inference.py (in the tree-directory markdown) you mention stereotaxic native MRIs, but I couldn't find whether the labels should also be in T1 native space or in MNI space. I didn't find a corresponding transformation for the labels in NoelImageProcessing. Thanks, VY

@ravnoor
Contributor

ravnoor commented Apr 26, 2022

The labels are expected to be in the same stereotaxic space as the T1- and T2-weighted images, in this case, MNI152.

You're right, the labels aren't transformed to the template space yet. Thanks for pointing that out! I'll fix that next week.
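
In the meantime, if you need to bring the labels into MNI152 yourself, a hedged sketch with ANTsPy could look like this (the file names and the pre-computed native-to-MNI transform are assumptions; NoelImageProcessing may handle this step differently):

```python
# Illustrative only: apply an existing native-to-MNI transform to a label map,
# using nearest-neighbour interpolation so label values stay integer-valued.
import ants

mni = ants.image_read("mni152_t1_1mm.nii.gz")             # assumed template file
labels = ants.image_read("sub-01_labels_native.nii.gz")   # assumed label file
labels_mni = ants.apply_transforms(
    fixed=mni,
    moving=labels,
    transformlist=["sub-01_native_to_mni_affine.mat"],    # assumed transform
    interpolator="nearestNeighbor",
)
ants.image_write(labels_mni, "sub-01_labels_mni.nii.gz")
```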

@relyativist
Author

relyativist commented Apr 30, 2022

Hi @ravnoor

I successfully fine-tuned the model with fixed layers on model[1] and saved the learned model with frozen layers in model[0], thanks for the guide. However, I have an issue with inference on the new model in the last cell of the example notebook: it runs for a very long time. What does the progress of predict_stochastic actually indicate? It has been running far too long for one subject. Thank you.

@ravnoor
Contributor

ravnoor commented May 3, 2022

predict_stochastic is a helper function that loops over mini-batches of data during DropoutMC inference to make it less GPU-memory intensive.

It shouldn't take too long. How much RAM are you currently working with? Are you using a GPU for compute?
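
To give a sense of what it's doing, here is an illustrative sketch (not the repo's actual implementation) of mini-batched MC-dropout inference, where stochastic_forward is assumed to be a forward pass with dropout kept active:

```python
# Illustrative sketch: repeated stochastic forward passes (dropout ON) over
# mini-batches, aggregated into a mean probability and a variance (uncertainty)
# estimate per sample. Names and arguments are assumptions, not the repo's code.
import numpy as np

def predict_stochastic_sketch(stochastic_forward, X, n_passes=20, batch_size=4096):
    """stochastic_forward(batch) -> per-voxel probabilities with dropout active."""
    preds = np.zeros((n_passes, X.shape[0]))
    for t in range(n_passes):
        for start in range(0, X.shape[0], batch_size):
            batch = X[start:start + batch_size]
            preds[t, start:start + batch.shape[0]] = stochastic_forward(batch).ravel()
    return preds.mean(axis=0), preds.var(axis=0)  # mean probability, MC-dropout variance
```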

@relyativist
Author

@ravnoor Yes, I am using a GPU with 12 GB of RAM, and it utilises about 9 GB with batch_size=4096. It still hadn't finished after 12 hours for one subject.

@ravnoor
Contributor

ravnoor commented May 5, 2022

How about the system RAM?

Does the inference exit with or without an error?

Also, make sure your inputs (T1 and T2 images) are correctly co-registered and skull-stripped (gh-6). If they are not, the number of input patches to the network will be overwhelmingly large. This might partially explain why the network takes so long.
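
As a quick (hypothetical) sanity check before inference, you can compare the two volumes' grids and see how much of each volume is non-zero; the file names below are placeholders:

```python
# Illustrative check that the T1 and T2 inputs share a voxel grid and look
# skull-stripped before running inference; file names are hypothetical.
import nibabel as nib
import numpy as np

t1 = nib.load("sub-01_t1_brain_mni.nii.gz")
t2 = nib.load("sub-01_t2_brain_mni.nii.gz")

assert t1.shape == t2.shape, "T1/T2 are not on the same voxel grid"
frac_nonzero = (np.asarray(t1.dataobj) > 0).mean()
print(f"non-zero fraction of T1: {frac_nonzero:.2f}")
# A skull-stripped brain in MNI space typically occupies a small fraction of
# the volume; a much larger fraction suggests skull/background voxels remain,
# which inflates the number of patches the network must classify.
```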

@relyativist
Author

relyativist commented May 6, 2022

@ravnoor Thank you for the tip about memory allocation. I resolved the issue by tuning the batch size, and the inference finished successfully. I have additional questions regarding the results. I experimented with performance set to true while testing the model, and I get additional output files noel_deepFCD_dropoutMC_FCD_42_1_out_morph_labels.nii.gz and noel_deepFCD_dropoutMC_out_CNN.nii.gz; what do these files contain? Also, I couldn't find where the Dice evaluation is printed.
An additional question regarding the file FCD_42_1_noel_deepFCD_dropoutMC_prob_var_1.nii.gz: from the reference this is the probability map of the FCD lesion ROI, but the maximum value of this map is 0.24. What does this value actually mean?
Sorry if the answers to these have already been posted somewhere.

@ravnoor
Contributor

ravnoor commented May 10, 2022

I'm glad it worked out for you!

The *out_morph_labels.nii.gz file is the product of a series of morphological operations following thresholding (probability- and size-based) described here. The purpose is to generate a binary output, compare it with the ground truth (if available), and quantify the number of false positives.
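
A rough sketch of that kind of post-processing (the thresholds and input file name below are assumptions for illustration, not the repo's exact values):

```python
# Illustrative post-processing: probability threshold, size-based filtering of
# connected components, then light morphology to produce a binary label map.
import nibabel as nib
import numpy as np
from scipy import ndimage

prob = nib.load("noel_deepFCD_dropoutMC_prob_mean_1.nii.gz")   # assumed input
p = np.asarray(prob.dataobj)

mask = p > 0.5                                   # probability threshold (assumed)
labels, n = ndimage.label(mask)                  # connected components
sizes = ndimage.sum(mask, labels, range(1, n + 1))
keep = np.isin(labels, np.nonzero(sizes >= 100)[0] + 1)   # size threshold (assumed)
keep = ndimage.binary_closing(keep)              # smooth the surviving clusters
nib.save(nib.Nifti1Image(keep.astype(np.uint8), prob.affine),
         "out_morph_labels.nii.gz")
```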

Looks like *out_CNN.nii.gz no longer exists in the most recent commit. If you let me know the commit (SHA) of the version you're using, I can look into it.

*prob_var_{0,1}.nii.gz is the uncertainty (or variance) image resulting from 20 or 50 forward passes through the network. Thus, the range (0,0.3) is reasonable. *prob_mean_{0,1}.nii.gz is the probability image with the range (0,1).
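
As a side note on the 0.24 maximum: since each voxel's probability lies in [0, 1], its variance across forward passes (numpy's default population variance) can never exceed 0.25, so values near 0.24 correspond to near-maximal disagreement between passes. A tiny illustration:

```python
# Why the variance (uncertainty) map stays small: probabilities bounded in
# [0, 1] have variance at most 0.25 (np.var, default ddof=0), reached when
# half of the forward passes predict 0 and the other half predict 1.
import numpy as np

passes = np.array([0.0] * 10 + [1.0] * 10)  # maximally disagreeing forward passes
print(passes.mean(), passes.var())          # -> 0.5 0.25
```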

@ravnoor ravnoor added the enhancement New feature or request label May 14, 2022
@ravnoor ravnoor closed this as completed May 25, 2022