I am trying to train the Faster RCNN network. I have used a pretrained model via the load_from config variable. As the paper outlines, we need to train the RPN first, freeze the RPN network head, and train the FCN classifier on top.
Is there an easy way to freeze components, so that after I am done training the RPN, I can freeze that module and only train the FCN classifier on top?
My initial guess was that I would need to inherit from the original FasterRCNN detector and extend/modify the bits needed (loss/head, etc.). Is there an easier way to do this? Via configs, maybe?

Replies: 2 comments · 3 replies
-
I also have this question. My guess is to add a "freeze" function in the RPN and freeze it by modifying the configs, but I don't know whether that is correct.
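A minimal sketch of that "freeze function" idea in plain PyTorch. freeze_module is a hypothetical helper, and model.rpn_head assumes an mmdetection-style two-stage detector, so check the attribute name on your model:

    import torch.nn as nn

    def freeze_module(module: nn.Module) -> None:
        # Stop gradients from flowing into this module's weights
        for param in module.parameters():
            param.requires_grad = False
        # Switch to eval mode so normalization-layer running stats stop updating
        module.eval()

    # After the RPN training stage, e.g.:
    # freeze_module(model.rpn_head)

One caveat: a later model.train() call flips submodules back into train mode, so re-apply eval() on the frozen part before each epoch if fixed BatchNorm statistics matter to you.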
-
@yribeiro @Kegard you can freeze the backbone using:

    backbone = model.backbone
    for param in backbone.parameters():
        param.requires_grad = False
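Extending that pattern to what the question actually asks, here is a sketch that freezes the RPN and then rebuilds the optimizer over only the still-trainable parameters. It uses torchvision's Faster R-CNN purely to be runnable (there the RPN lives at model.rpn; in an mmdetection two-stage detector it would be model.rpn_head), and the SGD hyperparameters are placeholders:

    import torch
    from torchvision.models.detection import fasterrcnn_resnet50_fpn

    model = fasterrcnn_resnet50_fpn(weights="DEFAULT")

    # Freeze the RPN the same way as the backbone above
    for param in model.rpn.parameters():
        param.requires_grad = False
    model.rpn.eval()  # keep any normalization-layer statistics fixed

    # Pass only the parameters that still require gradients to the optimizer
    optimizer = torch.optim.SGD(
        (p for p in model.parameters() if p.requires_grad),
        lr=0.005, momentum=0.9,
    )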