From 0385ef7471d028cb512cc466b68221c8a5c24198 Mon Sep 17 00:00:00 2001
From: Thinam Tamang <60099698+ThinamXx@users.noreply.github.com>
Date: Tue, 21 Sep 2021 17:47:26 +0545
Subject: [PATCH] Updated

---
 17. CNN Interpretation/README.md | 67 ++++++++++++++++++++++++++++++++++++
 1 file changed, 67 insertions(+)

diff --git a/17. CNN Interpretation/README.md b/17. CNN Interpretation/README.md
index 1310c61..d58d4ee 100644
--- a/17. CNN Interpretation/README.md
+++ b/17. CNN Interpretation/README.md
@@ -1,2 +1,69 @@
 # **Fastai : CNN Interpretation with CAM**
 
+The [**CNN Interpretation**](https://github.com/ThinamXx/Fastai/blob/main/17.%20CNN%20Interpretation/CNN%20Interpretation.ipynb) notebook presents the implementation of **Class Activation Maps** for model interpretation. Class activation maps give insight into why a model predicted a certain result by highlighting the areas of an image that were most responsible for that prediction.
+
+**Note:**
+- 📑[**CNN Interpretation with CAM**](https://nbviewer.jupyter.org/github/ThinamXx/Fastai/blob/main/17.%20CNN%20Interpretation/CNN%20Interpretation.ipynb)
+
+**Class Activation Map**
+- The Class Activation Map uses the output of the last convolutional layer, which sits just before the average pooling layer, together with the predictions to give a heatmap visualization of the model's decision. I have presented the implementation of Defining Hook Function and Decoding Images using Fastai and PyTorch here in the snapshot; a short code sketch of the idea is included at the end of this README.
+
+![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20271.PNG)
+
+**Hook Class**
+- I have presented the implementation of Defining Hook Function, Activations, Gradients and Heatmap Visualization using Fastai and PyTorch here in the snapshots; a Grad-CAM style sketch is also included at the end of this README.
+
+![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20272a.PNG)
+![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20272b.PNG)
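+
+**Code Sketch : Class Activation Map**
+- The snippet below is a minimal sketch of the hook-based CAM computation described above rather than the exact notebook code; the trained Fastai `learn` object (a `cnn_learner` whose model is `Sequential(body, head)`) and the preprocessed input batch `x` are assumed names used only for illustration.
+
+```python
+import torch
+
+class Hook:
+    "Store the output of a module during the forward pass."
+    def __init__(self, module):
+        self.handle = module.register_forward_hook(self.hook_func)
+    def hook_func(self, module, inputs, output):
+        self.stored = output.detach().clone()
+    def remove(self):
+        self.handle.remove()
+
+# Hook the convolutional body and run a forward pass.
+hook = Hook(learn.model[0])
+with torch.no_grad():
+    preds = learn.model.eval()(x)
+act = hook.stored[0]  # activations of shape [channels, h, w]
+
+# Weight the activation maps by the weights of the final linear layer to get one
+# heatmap per class (classic CAM); assumes the body's channel count matches that layer's inputs.
+cam_map = torch.einsum('ck,kij->cij', learn.model[1][-1].weight, act)
+hook.remove()
+```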
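+
+**Code Sketch : Gradients & Heatmap (Grad-CAM)**
+- The sketch below extends the same hook idea to gradients, in the spirit of the Hook class described above; it reuses the `Hook` class from the previous sketch, and the `learn`, `x` and target class index `cls` names are again illustrative assumptions rather than the notebook's exact code.
+
+```python
+class HookBwd:
+    "Store the gradient flowing back through a module."
+    def __init__(self, module):
+        self.handle = module.register_full_backward_hook(self.hook_func)
+    def hook_func(self, module, grad_input, grad_output):
+        self.stored = grad_output[0].detach().clone()
+    def remove(self):
+        self.handle.remove()
+
+# Capture activations and the gradient of the target class score (index cls).
+hookg, hook = HookBwd(learn.model[0]), Hook(learn.model[0])
+output = learn.model.eval()(x)
+output[0, cls].backward()
+act, grad = hook.stored[0], hookg.stored[0]
+
+# Weight each activation map by the mean of its gradient, then sum over channels.
+w = grad.mean(dim=[1, 2], keepdim=True)  # shape [channels, 1, 1]
+gradcam_map = (w * act).sum(0)           # heatmap of shape [h, w]
+hook.remove(); hookg.remove()
+```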