Hi @kaifishr,
Thanks for your implementation. I'm trying to reimplement LRP on ResNet50, but its backbone contains BatchNorm2d layers. I'm new to Python and don't know how to implement RelevancePropagationBatchNorm2d in lrp_layers.py. Could you give me some ideas? Thanks a lot.
As BatchNorm2d consists of two consecutive affine linear transformations (normalization, then scale-and-shift), I would try weighting the relevance scores by the weight parameters of the batch normalization layer learned during training.
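To sketch that idea: in evaluation mode the two affine steps of batch norm collapse into a single per-channel affine map y = w·x + b, with w = gamma / sqrt(running_var + eps) and b = beta − w·running_mean, so the standard LRP z-rule can be applied elementwise. The class below is a hypothetical illustration, not code from this repository; the `forward(a, r)` signature (activations in, relevance out) is an assumption about how it would plug into lrp_layers.py.

```python
import torch
from torch import nn


class RelevancePropagationBatchNorm2d(nn.Module):
    """Hypothetical LRP layer for BatchNorm2d (a sketch, not the repo's API).

    In eval mode batch norm is an elementwise affine map per channel:
        y = w * x + b,  with  w = gamma / sqrt(running_var + eps)
                        and   b = beta - w * running_mean.
    The z-rule then redistributes relevance as R_x = (x * w / (x * w + b)) * R_y.
    """

    def __init__(self, layer: nn.BatchNorm2d, eps: float = 1e-9) -> None:
        super().__init__()
        self.layer = layer
        self.eps = eps  # stabilizer to avoid division by zero

    @torch.no_grad()
    def forward(self, a: torch.Tensor, r: torch.Tensor) -> torch.Tensor:
        bn = self.layer
        std = torch.sqrt(bn.running_var + bn.eps)
        # Fold the two affine steps into one per-channel weight and bias.
        w = (bn.weight / std).view(1, -1, 1, 1)
        b = (bn.bias - bn.running_mean * bn.weight / std).view(1, -1, 1, 1)
        z = a * w + b
        # Sign-preserving stabilizer before dividing.
        z = z + self.eps * torch.where(z >= 0, 1.0, -1.0)
        return (a * w / z) * r
```

Note that the bias term b absorbs part of the relevance, so the sum of relevance is not exactly conserved across this layer; a common alternative is to simply pass relevance through batch norm unchanged (treat it as identity), which is also worth trying.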
Hello, I also ran into problems when propagating relevance through a BatchNorm1d layer. I'm not a math expert, but I urgently need this method to evaluate my FCN model on a data-driven fault diagnosis task. Could you add RelevancePropagationBatchNorm1d/2d to lrp_layers.py, or explain in more detail how to calculate this?
Thanks!
Best regards.