Detailed Analysis of the BAM and CBAM Attention Mechanisms (with Code)

2021-07-19 19:32:54 小小谢先生

• Paper ①: BAM: Bottleneck Attention Module
• Paper ②: CBAM: Convolutional Block Attention Module

Bottleneck Attention Module (BAM)

BAM architecture

$F' = F + F \otimes M(F)$, where $\otimes$ denotes element-wise multiplication.

$M(F) = \sigma(M_c(F) + M_s(F))$

Channel attention branch

$M_c(F) = BN(\mathrm{MLP}(\mathrm{AvgPool}(F))) = BN(W_1(W_0\,\mathrm{AvgPool}(F) + b_0) + b_1)$
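The channel branch above can be sketched in PyTorch as follows. This is a minimal illustration, not the reference implementation; the reduction ratio `reduction=16` follows the paper's default, and the class name is my own:

```python
import torch
import torch.nn as nn

class BAMChannelAttention(nn.Module):
    """BAM channel branch: Mc(F) = BN(W1(W0·AvgPool(F) + b0) + b1).

    `reduction` (r=16, the paper's default) controls the MLP bottleneck width.
    """
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.avg_pool = nn.AdaptiveAvgPool2d(1)
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),  # W0, b0
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),  # W1, b1
        )
        self.bn = nn.BatchNorm1d(channels)               # the BN(...) in the formula

    def forward(self, x):
        b, c, _, _ = x.shape
        y = self.avg_pool(x).view(b, c)   # global average pooling -> (B, C)
        y = self.bn(self.mlp(y))          # BN(MLP(AvgPool(F)))
        return y.view(b, c, 1, 1)         # broadcastable over H x W
```

Note that, unlike CBAM below, there is no sigmoid here: the sigmoid is applied once, after the channel and spatial branches are summed.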

Spatial attention branch

$M_s(F) = BN(f_3^{1\times 1}(f_2^{3\times 3}(f_1^{3\times 3}(f_0^{1\times 1}(F)))))$
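A matching sketch of the spatial branch, plus the combination step $F' = F + F \otimes \sigma(M_c(F) + M_s(F))$. The dilation value `dilation=4` is the paper's default; the names `BAMSpatialAttention` and `bam_refine` are my own:

```python
import torch
import torch.nn as nn

class BAMSpatialAttention(nn.Module):
    """BAM spatial branch: 1x1 reduce -> two dilated 3x3 convs -> 1x1 to one map."""
    def __init__(self, channels, reduction=16, dilation=4):
        super().__init__()
        mid = channels // reduction
        self.body = nn.Sequential(
            nn.Conv2d(channels, mid, 1),                                   # f0: 1x1
            nn.ReLU(inplace=True),
            nn.Conv2d(mid, mid, 3, padding=dilation, dilation=dilation),   # f1: 3x3
            nn.ReLU(inplace=True),
            nn.Conv2d(mid, mid, 3, padding=dilation, dilation=dilation),   # f2: 3x3
            nn.ReLU(inplace=True),
            nn.Conv2d(mid, 1, 1),                                          # f3: 1x1 -> (B,1,H,W)
        )
        self.bn = nn.BatchNorm2d(1)

    def forward(self, x):
        return self.bn(self.body(x))      # broadcastable over channels

def bam_refine(x, channel_att, spatial_att):
    """F' = F + F * sigmoid(Mc(F) + Ms(F)); the two maps broadcast to (B,C,H,W)."""
    m = torch.sigmoid(channel_att + spatial_att)
    return x + x * m
```

The (B, C, 1, 1) channel map and the (B, 1, H, W) spatial map broadcast against each other when summed, which is why neither branch needs to materialize a full (B, C, H, W) tensor.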

Convolutional Block Attention Module (CBAM)

CBAM architecture

Channel attention module

$M_c(F) = \sigma(\mathrm{MLP}(\mathrm{AvgPool}(F)) + \mathrm{MLP}(\mathrm{MaxPool}(F))) = \sigma(W_1(W_0(F_{avg}^{c})) + W_1(W_0(F_{max}^{c})))$
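The key difference from BAM's channel branch is the second, max-pooled descriptor passed through the *same* shared MLP. A minimal sketch, again assuming the paper's reduction ratio r=16 and using a class name of my own:

```python
import torch
import torch.nn as nn

class CBAMChannelAttention(nn.Module):
    """CBAM channel module: shared MLP on avg- and max-pooled descriptors, summed, then sigmoid."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        # One MLP, applied to both descriptors (weights W0, W1 are shared)
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),  # W0
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),  # W1
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))   # MLP(AvgPool(F))
        mx = self.mlp(x.amax(dim=(2, 3)))    # MLP(MaxPool(F))
        return torch.sigmoid(avg + mx).view(b, c, 1, 1)
```

Unlike BAM, each CBAM module applies its own sigmoid, because the channel and spatial maps are applied sequentially rather than summed.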

Spatial attention module

$M_s(F) = \sigma(f^{7\times 7}([\mathrm{AvgPool}(F); \mathrm{MaxPool}(F)])) = \sigma(f^{7\times 7}([F_{avg}^{s}; F_{max}^{s}]))$
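Here the pooling is along the channel axis, producing two single-channel maps that are concatenated and mixed by one 7×7 convolution. A sketch under the same assumptions (kernel size 7 is the paper's default; the class name is mine):

```python
import torch
import torch.nn as nn

class CBAMSpatialAttention(nn.Module):
    """CBAM spatial module: concat channel-wise avg & max maps, 7x7 conv, sigmoid."""
    def __init__(self, kernel_size=7):
        super().__init__()
        # 2 input channels: one avg map, one max map; padding keeps H x W unchanged
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)    # F_avg^s: (B, 1, H, W)
        mx = x.amax(dim=1, keepdim=True)     # F_max^s: (B, 1, H, W)
        return torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
```

In a full CBAM block the two modules run in sequence: first `y = x * channel_att(x)`, then `y = y * spatial_att(y)`, with the result feeding the next layer of the backbone.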

Code walkthrough and open-source links

https://github.com/Jongchan/attention-module

https://blog.51cto.com/u_15242250/2870175