CBAM-keras-master

Category: Artificial Intelligence / Neural Networks / Deep Learning
Development tool: Python
File size: 1274KB
Downloads: 35
Upload date: 2019-06-25 10:23:42
Uploader: wxylily
Description: Implements an attention mechanism for images in deep learning, strengthening the network's learning ability and improving its accuracy and generalization.
(It can improve the classification accuracy of deep learning networks, and can be easily embedded into other networks such as DenseNet, ResNet, etc.)

File list:
版\.idea\CBAM-keras-master.iml (447, 2019-04-10)
版\.idea\misc.xml (304, 2019-04-10)
版\.idea\modules.xml (286, 2019-04-10)
版\.idea\workspace.xml (36391, 2019-04-24)
版\figures\exp4.png (292712, 2018-09-14)
版\figures\exp5.png (93652, 2018-09-14)
版\figures\overview.png (74331, 2018-09-14)
版\figures\resnetv1结构.doc (73216, 2019-04-16)
版\figures\submodule.png (82477, 2018-09-14)
版\LICENSE (1067, 2018-09-14)
版\main.py (10803, 2019-04-23)
版\models\.ipynb_checkpoints\attention_module-checkpoint.py (3980, 2018-09-14)
版\models\.ipynb_checkpoints\densenet-checkpoint.py (20707, 2018-09-14)
版\models\.ipynb_checkpoints\inception_resnet_v2-checkpoint.py (15283, 2018-09-14)
版\models\.ipynb_checkpoints\inception_v3-checkpoint.py (14519, 2018-09-14)
版\models\.ipynb_checkpoints\mobilenets-checkpoint.py (24499, 2018-09-14)
版\models\.ipynb_checkpoints\resnet_v1-checkpoint.py (5213, 2018-09-14)
版\models\.ipynb_checkpoints\resnet_v2-checkpoint.py (6167, 2018-09-14)
版\models\.ipynb_checkpoints\resnext-checkpoint.py (21686, 2018-09-14)
版\models\attention_module.py (5375, 2019-04-20)
版\models\densenet.py (20644, 2019-04-10)
版\models\inception_resnet_v2.py (15265, 2019-04-10)
版\models\inception_v3.py (14182, 2019-04-10)
版\models\mobilenets.py (24489, 2019-04-10)
版\models\resnetv1_model.png (806492, 2019-04-15)
版\models\resnet_v1.py (7251, 2019-04-17)
版\models\resnet_v2.py (6149, 2018-09-14)
版\models\resnext.py (21676, 2019-04-10)
版\models\small_cnn.py (3140, 2019-04-23)
版\models\test_output.py (2859, 2019-04-12)
版\models\__pycache__\attention_module.cpython-35.pyc (4013, 2019-04-23)
版\models\__pycache__\attention_module.cpython-36.pyc (3264, 2018-09-14)
版\models\__pycache__\densenet.cpython-35.pyc (15659, 2019-04-10)
版\models\__pycache__\densenet.cpython-36.pyc (14609, 2018-09-14)
版\models\__pycache__\inception_resnet_v2.cpython-35.pyc (11944, 2019-04-10)
版\models\__pycache__\inception_resnet_v2.cpython-36.pyc (11129, 2018-09-14)
版\models\__pycache__\inception_v3.cpython-35.pyc (9776, 2019-04-10)
版\models\__pycache__\inception_v3.cpython-36.pyc (8806, 2018-09-14)
版\models\__pycache__\mobilenets.cpython-35.pyc (20271, 2019-04-10)
... ...

# CBAM-Keras

This is a Keras implementation of ["CBAM: Convolutional Block Attention Module"](https://arxiv.org/pdf/1807.06521). This repository also includes an implementation of ["Squeeze-and-Excitation Networks"](https://arxiv.org/pdf/1709.01507), so that you can train and compare the base CNN model, the base model with a CBAM block, and the base model with an SE block.

## CBAM: Convolutional Block Attention Module

**CBAM** proposes an architectural unit, the *"Convolutional Block Attention Module" (CBAM)* block, that improves representation power by using an attention mechanism: focusing on important features and suppressing unnecessary ones. This research can be considered a descendant and an improvement of ["Squeeze-and-Excitation Networks"](https://arxiv.org/pdf/1709.01507).

### Diagram of a CBAM_block
### Diagram of each attention sub-module
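To make the two sub-modules concrete, here is a minimal sketch of channel attention, spatial attention, and their composition, written against `tensorflow.keras`. It is illustrative only: the repository's actual implementation lives in `models/attention_module.py`, and the function names and the default `ratio=8` here are this sketch's assumptions, not the repository's code.

```python
import tensorflow as tf
from tensorflow.keras import layers

def channel_attention(inputs, ratio=8):
    # Channel sub-module: a shared two-layer MLP scores avg- and max-pooled
    # channel descriptors; their sum is squashed into per-channel weights.
    channels = inputs.shape[-1]
    dense_1 = layers.Dense(channels // ratio, activation='relu')
    dense_2 = layers.Dense(channels)
    avg_out = dense_2(dense_1(layers.GlobalAveragePooling2D()(inputs)))
    max_out = dense_2(dense_1(layers.GlobalMaxPooling2D()(inputs)))
    scale = layers.Activation('sigmoid')(layers.Add()([avg_out, max_out]))
    scale = layers.Reshape((1, 1, channels))(scale)
    return layers.Multiply()([inputs, scale])

def spatial_attention(inputs):
    # Spatial sub-module: average and max over the channel axis, then a
    # 7x7 convolution produces a per-position weight map.
    avg_map = layers.Lambda(lambda t: tf.reduce_mean(t, axis=-1, keepdims=True))(inputs)
    max_map = layers.Lambda(lambda t: tf.reduce_max(t, axis=-1, keepdims=True))(inputs)
    stacked = layers.Concatenate(axis=-1)([avg_map, max_map])
    scale = layers.Conv2D(1, kernel_size=7, padding='same',
                          activation='sigmoid')(stacked)
    return layers.Multiply()([inputs, scale])

def cbam_block(inputs, ratio=8):
    # CBAM applies channel attention first, then spatial attention.
    return spatial_attention(channel_attention(inputs, ratio))
```

A call such as `y = cbam_block(x)` would be inserted after a convolution block, which mirrors how the model builders in this repository accept an `attention_module` argument.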
### Classification results on ImageNet-1K
## Prerequisites

- Python 3.x
- Keras

## Prepare Dataset

This repository uses the [*Cifar10*](https://www.cs.toronto.edu/~kriz/cifar.html) dataset. When you run the training script, the dataset is downloaded automatically. (Note that you **cannot run the Inception-series models** with Cifar10, since the smallest input size the Inception-series models accept is 139 while Cifar10 images are 32. Use the Inception-series models with another dataset.)

## CBAM_block and SE_block Supported Models

You can train and test the base CNN model, the base model with a CBAM block, and the base model with an SE block. You can run any model in the list below with a **CBAM_block** or an **SE_block** added.

- Inception V3 + CBAM / + SE
- Inception-ResNet-v2 + CBAM / + SE
- ResNet_v1 + CBAM / + SE (ResNet20, ResNet32, ResNet44, ResNet56, ResNet110, ResNet1***, ResNet1001)
- ResNet_v2 + CBAM / + SE (ResNet20, ResNet56, ResNet110, ResNet1***, ResNet1001)
- ResNeXt + CBAM / + SE
- MobileNet + CBAM / + SE
- DenseNet + CBAM / + SE (DenseNet121, DenseNet161, DenseNet169, DenseNet201, DenseNet2***)

### Change *Reduction ratio*

To change the *reduction ratio*, set `ratio` in the `se_block` and `cbam_block` methods in `models/attention_module.py`.

## Train a Model

You can simply train a model with `main.py`.

1. Set the model you want to train.
   - e.g. `model = resnet_v1.resnet_v1(input_shape=input_shape, depth=depth, attention_module=attention_module)`
2. Set the `attention_module` parameter.
   - e.g. `attention_module = 'cbam_block'`
3. Set other parameters such as *batch_size*, *epochs*, *data_augmentation*, and so on.
4. Run the `main.py` file.
   - e.g. `python main.py`

An end-to-end sketch combining these steps is given at the end of this README.

## Related Works

- Blog: [CBAM: Convolutional Block Attention Module](https://kobiso.github.io//research/research-CBAM/)
- Repository: [CBAM-TensorFlow](https://github.com/kobiso/CBAM-tensorflow)
- Repository: [CBAM-TensorFlow-Slim](https://github.com/kobiso/CBAM-tensorflow-slim)
- Repository: [SENet-TensorFlow-Slim](https://github.com/kobiso/SENet-tensorflow-slim)

## Reference

- Paper: [CBAM: Convolutional Block Attention Module](https://arxiv.org/pdf/1807.06521)
- Paper: [Squeeze-and-Excitation Networks](https://arxiv.org/pdf/1709.01507)
- Repository: [Keras: Cifar10 ResNet example](https://github.com/keras-team/keras/blob/master/examples/cifar10_resnet.py)
- Repository: [keras-squeeze-excite-network](https://github.com/titu1994/keras-squeeze-excite-network)

## Author

Byung Soo Ko / kobiso62@gmail.com
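As promised in the "Train a Model" section, here is a minimal end-to-end sketch. The `resnet_v1.resnet_v1(...)` call follows the example in step 1; `depth=20` (ResNet20), the optimizer, *batch_size*, and *epochs* are placeholder choices; and `tensorflow.keras` stands in for the standalone Keras this repository was written against.

```python
from tensorflow import keras

from models import resnet_v1  # repository module; run from the repository root

# Cifar10 is downloaded automatically on first use.
(x_train, y_train), (x_test, y_test) = keras.datasets.cifar10.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0
y_train = keras.utils.to_categorical(y_train, 10)
y_test = keras.utils.to_categorical(y_test, 10)

# Steps 1-2: pick the model and the attention module.
attention_module = 'cbam_block'  # or 'se_block', or None for the base model
model = resnet_v1.resnet_v1(input_shape=x_train.shape[1:], depth=20,
                            attention_module=attention_module)

# Steps 3-4: set the remaining hyper-parameters, then train.
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
model.fit(x_train, y_train,
          batch_size=32,
          epochs=10,
          validation_data=(x_test, y_test))
```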
