A Deep Learning Case Study: ResNet50 + SE-Net
This article is an internal article of the 🔗365天深度学习训练营 (365-day deep learning training camp).
Original author: K同学啊
I. Revisiting the ResNet Model
ResNet (Residual Network) is a deep convolutional neural network architecture proposed in 2015 by Kaiming He and his collaborators at Microsoft Research. Its core innovation is the introduction of "residual connections" (also known as "skip connections"), which make it possible to train extremely deep models effectively and overcome the vanishing-gradient and exploding-gradient problems that traditional deep networks run into as the number of layers grows.
1. The Core Idea of ResNet
The core idea of ResNet is to learn a residual instead of the target mapping directly. Suppose the mapping we want to learn is H(x). Rather than fitting H(x) itself, ResNet learns the residual function F(x) = H(x) − x, turning the learning objective into F(x).
This makes the network far easier to optimize: even when the desired mapping is the identity (H(x) = x), the block only needs to drive the residual to zero (F(x) = 0). The residual connection also lets information and gradients flow directly from shallower to deeper layers, greatly easing the difficulty of training very deep networks.
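Written out for a block with weights {Wi}, this is the standard two-layer residual formulation from the original paper:

y = F(x, {Wi}) + x,  with F(x, {Wi}) = W2 · ReLU(W1 · x)

so the shortcut simply adds the input x back onto the learned residual F before the final activation.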
2. The Building Blocks of ResNet
The core component of ResNet is the residual block. A residual block consists of one or more convolutional layers plus a shortcut connection that routes the input around those layers and adds it directly to their output (a minimal sketch follows the list below). A typical ResNet block can be summarized as follows:
Convolutional layers: a residual block usually contains two or three convolutional layers, most commonly with 3x3 kernels. Each convolution is typically followed by an activation function (such as ReLU) to introduce non-linearity.
Shortcut connection: the input is added directly to the output through a shortcut that bypasses the intermediate convolutions, which simplifies the path along which information travels through a deep network.
ReLU activation: a ReLU activation is usually applied after the convolutions (and again after the addition) to strengthen the network's non-linear representational power.
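As an illustrative sketch only, and not the three-layer bottleneck block implemented later in this post, a two-layer "basic" residual block can be written in Keras as follows; the input shape and filter count are arbitrary:

from keras.layers import Input, Conv2D, BatchNormalization, Activation, Add
from keras.models import Model

def basic_residual_block(x, filters):
    """A minimal two-layer residual block: out = ReLU(F(x) + x)."""
    shortcut = x                                    # identity shortcut
    y = Conv2D(filters, (3, 3), padding='same')(x)  # first 3x3 convolution
    y = BatchNormalization()(y)
    y = Activation('relu')(y)
    y = Conv2D(filters, (3, 3), padding='same')(y)  # second 3x3 convolution
    y = BatchNormalization()(y)
    y = Add()([y, shortcut])                        # residual connection
    return Activation('relu')(y)

# The block preserves the spatial size and channel count
inp = Input(shape=(56, 56, 64))
out = basic_residual_block(inp, 64)
print(Model(inp, out).output_shape)  # (None, 56, 56, 64)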
II. The Channel Attention Mechanism: SE-Net
SE-Net (Squeeze-and-Excitation Networks), proposed by Hu et al. in 2018, is a channel attention mechanism for convolutional neural networks (CNNs). By adaptively re-weighting channels, it delivers a clear performance gain on tasks such as image classification. The core idea of the SE module is to learn an importance weight for every channel through a "Squeeze" and an "Excitation" operation and then re-weight the feature maps accordingly, so the model concentrates on the key features while suppressing irrelevant ones.
The SE module can be embedded into most standard CNN architectures (e.g. ResNet, Inception) to strengthen them with channel-wise attention. Its basic principle is to capture global information through a global pooling step and use it to dynamically rescale the feature map of every channel. The module consists of two operations, squeeze and excitation, followed by a rescaling (recalibration) step:
1) Squeeze
In the squeeze step, the Squeeze-and-Excitation (SE) module aggregates each channel's feature map with global average pooling (GAP) to obtain a global descriptor. Let the input feature map be X with height H, width W and C channels. Global average pooling averages each channel's H×W feature map, extracting a single scalar per channel.
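Concretely, the squeeze step produces one descriptor zc per channel:

zc = (1 / (H × W)) · Σ_{i=1..H} Σ_{j=1..W} xc(i, j)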
2) Excitation
The excitation step learns the per-channel importance weights with fully connected layers. It takes the global descriptor z = [z1, z2, …, zC] as input and passes it through two fully connected (FC) layers to produce the channel weights s = [s1, s2, …, sC]. The procedure is as follows:
First, the first fully connected layer takes z as input and produces a hidden representation through an activation function (e.g. ReLU).
Next, the second fully connected layer applies a Sigmoid activation to output the per-channel weights, which guarantees that every coefficient sc lies between 0 and 1.
Finally, the Sigmoid output is the channel-attention weight vector s, whose entries score the importance of each channel.
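In formula form, with δ denoting ReLU, σ the Sigmoid and r the reduction ratio:

s = σ(W2 · δ(W1 · z)),  with W1 ∈ R^{(C/r)×C} and W2 ∈ R^{C×(C/r)}

The bottleneck of width C/r keeps the module lightweight; r = 16 is the default used in the code below.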
3) Recalibration
Finally, the SE module recalibrates the original feature map by weighting each of its channels with the learned weights s. Let the input feature map be X; after the SE module, the output Y of channel c can be written as

Yc = sc · xc

where xc is the feature map of channel c in the input and sc is the weight of channel c. Through this recalibration step the SE module assigns every channel its own weight, strengthening attention on important channels and suppressing the influence of irrelevant ones.
III. ResNet50 + SE-Net: Boosting Performance
The code below builds a Keras ResNet50 in which squeeze_excite_block is applied to the output of the third convolution of every residual block (both the identity block and the conv block) before the shortcut is added back.
from keras import layers
from keras.layers import Input, Activation, BatchNormalization, Flatten, Dropout, Reshape
from keras.layers import Dense, Conv2D, MaxPooling2D, ZeroPadding2D, AveragePooling2D, GlobalAveragePooling2D
from keras.layers import Multiply
from keras.models import Model
import tensorflow as tf
def squeeze_excite_block(input_tensor, ratio=16):
    '''
    Squeeze-and-Excitation block
    :param input_tensor: input tensor
    :param ratio: reduction ratio controlling the width of the hidden excitation layer; small values such as 16 are typical
    :return: the channel-reweighted tensor
    '''
    channel_axis = -1  # channels-last layout: the channel axis is the last dimension
    channels = input_tensor.shape[channel_axis]  # number of channels
    # Squeeze: global average pooling
    x = GlobalAveragePooling2D()(input_tensor)
    x = Reshape((1, 1, channels))(x)
    # Excitation: two fully connected layers produce the channel weights
    x = Dense(channels // ratio, activation='relu', kernel_initializer='he_normal', use_bias=False)(x)
    x = Dense(channels, activation='sigmoid', kernel_initializer='he_normal', use_bias=False)(x)
    # Multiply the learned weights back onto the input tensor
    x = Multiply()([input_tensor, x])
    return x
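As a quick, purely illustrative sanity check (the (56, 56, 64) input shape is arbitrary), the block keeps the tensor shape unchanged and only rescales its channels; it reuses the Input and Model imports above:

probe = Input(shape=(56, 56, 64))
se_out = squeeze_excite_block(probe)
print(Model(probe, se_out).output_shape)  # (None, 56, 56, 64)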
def identity_block(input_tensor, kernel_size, filters, stage, block):
    filters1, filters2, filters3 = filters
    name_base = str(stage) + block + '_identity_block_'
    # First convolution (1x1)
    x = Conv2D(filters1, (1, 1), name=name_base + 'conv1')(input_tensor)
    x = BatchNormalization(name=name_base + 'bn1')(x)
    x = Activation('relu', name=name_base + 'relu1')(x)
    # Second convolution (3x3)
    x = Conv2D(filters2, kernel_size, padding='same', name=name_base + 'conv2')(x)
    x = BatchNormalization(name=name_base + 'bn2')(x)
    x = Activation('relu', name=name_base + 'relu2')(x)
    # Third convolution (1x1)
    x = Conv2D(filters3, (1, 1), name=name_base + 'conv3')(x)
    x = BatchNormalization(name=name_base + 'bn3')(x)
    # SE-Net channel attention
    x = squeeze_excite_block(x)
    # Residual connection
    x = layers.add([x, input_tensor], name=name_base + 'add')
    x = Activation('relu', name=name_base + 'relu4')(x)
    return x
def conv_block(input_tensor, kernel_size, filters, stage, block, strides=(2, 2)):
    filters1, filters2, filters3 = filters
    res_name_base = str(stage) + block + '_conv_block_res_'
    name_base = str(stage) + block + '_conv_block_'
    # Main path: 1x1 -> 3x3 -> 1x1 convolutions
    x = Conv2D(filters1, (1, 1), strides=strides, name=name_base + 'conv1')(input_tensor)
    x = BatchNormalization(name=name_base + 'bn1')(x)
    x = Activation('relu', name=name_base + 'relu1')(x)
    x = Conv2D(filters2, kernel_size, padding='same', name=name_base + 'conv2')(x)
    x = BatchNormalization(name=name_base + 'bn2')(x)
    x = Activation('relu', name=name_base + 'relu2')(x)
    x = Conv2D(filters3, (1, 1), name=name_base + 'conv3')(x)
    x = BatchNormalization(name=name_base + 'bn3')(x)
    # Projection shortcut: a strided 1x1 convolution matches the shape of the main path
    shortcut = Conv2D(filters3, (1, 1), strides=strides, name=res_name_base + 'conv')(input_tensor)
    shortcut = BatchNormalization(name=res_name_base + 'bn')(shortcut)
    # SE-Net channel attention
    x = squeeze_excite_block(x)
    # Residual addition
    x = layers.add([x, shortcut], name=name_base + 'add')
    x = Activation('relu', name=name_base + 'relu4')(x)
    return x
def ResNet50(input_shape=[224,224,3], classes=4):
    img_input = Input(shape=input_shape)
    x = ZeroPadding2D((3, 3))(img_input)
    # Stage 1: 7x7 convolution + max pooling
    x = Conv2D(64, (7, 7), strides=(2, 2), name='conv1')(x)
    x = BatchNormalization(name='bn_conv1')(x)
    x = Activation('relu')(x)
    x = MaxPooling2D((3, 3), strides=(2, 2))(x)
    # Stage 2
    x = conv_block(x, 3, [64, 64, 256], stage=2, block='a', strides=(1, 1))
    x = identity_block(x, 3, [64, 64, 256], stage=2, block='b')
    x = identity_block(x, 3, [64, 64, 256], stage=2, block='c')
    # Stage 3
    x = conv_block(x, 3, [128, 128, 512], stage=3, block='a')
    x = identity_block(x, 3, [128, 128, 512], stage=3, block='b')
    x = identity_block(x, 3, [128, 128, 512], stage=3, block='c')
    x = identity_block(x, 3, [128, 128, 512], stage=3, block='d')
    # Stage 4
    x = conv_block(x, 3, [256, 256, 1024], stage=4, block='a')
    x = identity_block(x, 3, [256, 256, 1024], stage=4, block='b')
    x = identity_block(x, 3, [256, 256, 1024], stage=4, block='c')
    x = identity_block(x, 3, [256, 256, 1024], stage=4, block='d')
    x = identity_block(x, 3, [256, 256, 1024], stage=4, block='e')
    x = identity_block(x, 3, [256, 256, 1024], stage=4, block='f')
    # Stage 5
    x = conv_block(x, 3, [512, 512, 2048], stage=5, block='a')
    x = identity_block(x, 3, [512, 512, 2048], stage=5, block='b')
    x = identity_block(x, 3, [512, 512, 2048], stage=5, block='c')
    # Classification head
    x = AveragePooling2D((7, 7), name='avg_pool')(x)
    x = Flatten()(x)
    x = Dropout(0.5)(x)
    x = Dense(classes, activation='softmax', name='fc2')(x)
    model = Model(img_input, x, name='resnet50')
    return model
model = ResNet50()
model.summary()
Model: "resnet50" __________________________________________________________________________________________________ Layer (type) Output Shape Param # Connected to ================================================================================================== input_1 (InputLayer) [(None, 224, 224, 3 0 [] )] zero_padding2d (ZeroPadding2D) (None, 230, 230, 3) 0 ['input_1[0][0]'] conv1 (Conv2D) (None, 112, 112, 64 9472 ['zero_padding2d[0][0]'] ) bn_conv1 (BatchNormalization) (None, 112, 112, 64 256 ['conv1[0][0]'] ) activation (Activation) (None, 112, 112, 64 0 ['bn_conv1[0][0]'] ) max_pooling2d (MaxPooling2D) (None, 55, 55, 64) 0 ['activation[0][0]'] 2a_conv_block_conv1 (Conv2D) (None, 55, 55, 64) 4160 ['max_pooling2d[0][0]'] 2a_conv_block_bn1 (BatchNormal (None, 55, 55, 64) 256 ['2a_conv_block_conv1[0][0]'] ization) 2a_conv_block_relu1 (Activatio (None, 55, 55, 64) 0 ['2a_conv_block_bn1[0][0]'] n) 2a_conv_block_conv2 (Conv2D) (None, 55, 55, 64) 36928 ['2a_conv_block_relu1[0][0]'] 2a_conv_block_bn2 (BatchNormal (None, 55, 55, 64) 256 ['2a_conv_block_conv2[0][0]'] ization) 2a_conv_block_relu2 (Activatio (None, 55, 55, 64) 0 ['2a_conv_block_bn2[0][0]'] n) 2a_conv_block_conv3 (Conv2D) (None, 55, 55, 256) 16640 ['2a_conv_block_relu2[0][0]'] 2a_conv_block_bn3 (BatchNormal (None, 55, 55, 256) 1024 ['2a_conv_block_conv3[0][0]'] ization) global_average_pooling2d (Glob (None, 256) 0 ['2a_conv_block_bn3[0][0]'] alAveragePooling2D) reshape (Reshape) (None, 1, 1, 256) 0 ['global_average_pooling2d[0][0]' ] dense (Dense) (None, 1, 1, 16) 4096 ['reshape[0][0]'] dense_1 (Dense) (None, 1, 1, 256) 4096 ['dense[0][0]'] 2a_conv_block_res_conv (Conv2D (None, 55, 55, 256) 16640 ['max_pooling2d[0][0]'] ) multiply (Multiply) (None, 55, 55, 256) 0 ['2a_conv_block_bn3[0][0]', 'dense_1[0][0]'] 2a_conv_block_res_bn (BatchNor (None, 55, 55, 256) 1024 ['2a_conv_block_res_conv[0][0]'] malization) 2a_conv_block_add (Add) (None, 55, 55, 256) 0 ['multiply[0][0]', '2a_conv_block_res_bn[0][0]'] 2a_conv_block_relu4 (Activatio (None, 55, 55, 256) 0 ['2a_conv_block_add[0][0]'] n) 2b_identity_block_conv1 (Conv2 (None, 55, 55, 64) 16448 ['2a_conv_block_relu4[0][0]'] D) 2b_identity_block_bn1 (BatchNo (None, 55, 55, 64) 256 ['2b_identity_block_conv1[0][0]'] rmalization) 2b_identity_block_relu1 (Activ (None, 55, 55, 64) 0 ['2b_identity_block_bn1[0][0]'] ation) 2b_identity_block_conv2 (Conv2 (None, 55, 55, 64) 36928 ['2b_identity_block_relu1[0][0]'] D) 2b_identity_block_bn2 (BatchNo (None, 55, 55, 64) 256 ['2b_identity_block_conv2[0][0]'] rmalization) 2b_identity_block_relu2 (Activ (None, 55, 55, 64) 0 ['2b_identity_block_bn2[0][0]'] ation) 2b_identity_block_conv3 (Conv2 (None, 55, 55, 256) 16640 ['2b_identity_block_relu2[0][0]'] D) 2b_identity_block_bn3 (BatchNo (None, 55, 55, 256) 1024 ['2b_identity_block_conv3[0][0]'] rmalization) global_average_pooling2d_1 (Gl (None, 256) 0 ['2b_identity_block_bn3[0][0]'] obalAveragePooling2D) reshape_1 (Reshape) (None, 1, 1, 256) 0 ['global_average_pooling2d_1[0][0 ]'] dense_2 (Dense) (None, 1, 1, 16) 4096 ['reshape_1[0][0]'] dense_3 (Dense) (None, 1, 1, 256) 4096 ['dense_2[0][0]'] multiply_1 (Multiply) (None, 55, 55, 256) 0 ['2b_identity_block_bn3[0][0]', 'dense_3[0][0]'] 2b_identity_block_add (Add) (None, 55, 55, 256) 0 ['multiply_1[0][0]', '2a_conv_block_relu4[0][0]'] 2b_identity_block_relu4 (Activ (None, 55, 55, 256) 0 ['2b_identity_block_add[0][0]'] ation) 2c_identity_block_conv1 (Conv2 (None, 55, 55, 64) 16448 ['2b_identity_block_relu4[0][0]'] D) 2c_identity_block_bn1 (BatchNo (None, 55, 
55, 64) 256 ['2c_identity_block_conv1[0][0]'] rmalization) 2c_identity_block_relu1 (Activ (None, 55, 55, 64) 0 ['2c_identity_block_bn1[0][0]'] ation) 2c_identity_block_conv2 (Conv2 (None, 55, 55, 64) 36928 ['2c_identity_block_relu1[0][0]'] D) 2c_identity_block_bn2 (BatchNo (None, 55, 55, 64) 256 ['2c_identity_block_conv2[0][0]'] rmalization) 2c_identity_block_relu2 (Activ (None, 55, 55, 64) 0 ['2c_identity_block_bn2[0][0]'] ation) 2c_identity_block_conv3 (Conv2 (None, 55, 55, 256) 16640 ['2c_identity_block_relu2[0][0]'] D) 2c_identity_block_bn3 (BatchNo (None, 55, 55, 256) 1024 ['2c_identity_block_conv3[0][0]'] rmalization) global_average_pooling2d_2 (Gl (None, 256) 0 ['2c_identity_block_bn3[0][0]'] obalAveragePooling2D) reshape_2 (Reshape) (None, 1, 1, 256) 0 ['global_average_pooling2d_2[0][0 ]'] dense_4 (Dense) (None, 1, 1, 16) 4096 ['reshape_2[0][0]'] dense_5 (Dense) (None, 1, 1, 256) 4096 ['dense_4[0][0]'] multiply_2 (Multiply) (None, 55, 55, 256) 0 ['2c_identity_block_bn3[0][0]', 'dense_5[0][0]'] 2c_identity_block_add (Add) (None, 55, 55, 256) 0 ['multiply_2[0][0]', '2b_identity_block_relu4[0][0]'] 2c_identity_block_relu4 (Activ (None, 55, 55, 256) 0 ['2c_identity_block_add[0][0]'] ation) 3a_conv_block_conv1 (Conv2D) (None, 28, 28, 128) 32896 ['2c_identity_block_relu4[0][0]'] 3a_conv_block_bn1 (BatchNormal (None, 28, 28, 128) 512 ['3a_conv_block_conv1[0][0]'] ization) 3a_conv_block_relu1 (Activatio (None, 28, 28, 128) 0 ['3a_conv_block_bn1[0][0]'] n) 3a_conv_block_conv2 (Conv2D) (None, 28, 28, 128) 147584 ['3a_conv_block_relu1[0][0]'] 3a_conv_block_bn2 (BatchNormal (None, 28, 28, 128) 512 ['3a_conv_block_conv2[0][0]'] ization) 3a_conv_block_relu2 (Activatio (None, 28, 28, 128) 0 ['3a_conv_block_bn2[0][0]'] n) 3a_conv_block_conv3 (Conv2D) (None, 28, 28, 512) 66048 ['3a_conv_block_relu2[0][0]'] 3a_conv_block_bn3 (BatchNormal (None, 28, 28, 512) 2048 ['3a_conv_block_conv3[0][0]'] ization) global_average_pooling2d_3 (Gl (None, 512) 0 ['3a_conv_block_bn3[0][0]'] obalAveragePooling2D) reshape_3 (Reshape) (None, 1, 1, 512) 0 ['global_average_pooling2d_3[0][0 ]'] dense_6 (Dense) (None, 1, 1, 32) 16384 ['reshape_3[0][0]'] dense_7 (Dense) (None, 1, 1, 512) 16384 ['dense_6[0][0]'] 3a_conv_block_res_conv (Conv2D (None, 28, 28, 512) 131584 ['2c_identity_block_relu4[0][0]'] ) multiply_3 (Multiply) (None, 28, 28, 512) 0 ['3a_conv_block_bn3[0][0]', 'dense_7[0][0]'] 3a_conv_block_res_bn (BatchNor (None, 28, 28, 512) 2048 ['3a_conv_block_res_conv[0][0]'] malization) 3a_conv_block_add (Add) (None, 28, 28, 512) 0 ['multiply_3[0][0]', '3a_conv_block_res_bn[0][0]'] 3a_conv_block_relu4 (Activatio (None, 28, 28, 512) 0 ['3a_conv_block_add[0][0]'] n) 3b_identity_block_conv1 (Conv2 (None, 28, 28, 128) 65664 ['3a_conv_block_relu4[0][0]'] D) 3b_identity_block_bn1 (BatchNo (None, 28, 28, 128) 512 ['3b_identity_block_conv1[0][0]'] rmalization) 3b_identity_block_relu1 (Activ (None, 28, 28, 128) 0 ['3b_identity_block_bn1[0][0]'] ation) 3b_identity_block_conv2 (Conv2 (None, 28, 28, 128) 147584 ['3b_identity_block_relu1[0][0]'] D) 3b_identity_block_bn2 (BatchNo (None, 28, 28, 128) 512 ['3b_identity_block_conv2[0][0]'] rmalization) 3b_identity_block_relu2 (Activ (None, 28, 28, 128) 0 ['3b_identity_block_bn2[0][0]'] ation) 3b_identity_block_conv3 (Conv2 (None, 28, 28, 512) 66048 ['3b_identity_block_relu2[0][0]'] D) 3b_identity_block_bn3 (BatchNo (None, 28, 28, 512) 2048 ['3b_identity_block_conv3[0][0]'] rmalization) global_average_pooling2d_4 (Gl (None, 512) 0 ['3b_identity_block_bn3[0][0]'] obalAveragePooling2D) 
reshape_4 (Reshape) (None, 1, 1, 512) 0 ['global_average_pooling2d_4[0][0 ]'] dense_8 (Dense) (None, 1, 1, 32) 16384 ['reshape_4[0][0]'] dense_9 (Dense) (None, 1, 1, 512) 16384 ['dense_8[0][0]'] multiply_4 (Multiply) (None, 28, 28, 512) 0 ['3b_identity_block_bn3[0][0]', 'dense_9[0][0]'] 3b_identity_block_add (Add) (None, 28, 28, 512) 0 ['multiply_4[0][0]', '3a_conv_block_relu4[0][0]'] 3b_identity_block_relu4 (Activ (None, 28, 28, 512) 0 ['3b_identity_block_add[0][0]'] ation) 3c_identity_block_conv1 (Conv2 (None, 28, 28, 128) 65664 ['3b_identity_block_relu4[0][0]'] D) 3c_identity_block_bn1 (BatchNo (None, 28, 28, 128) 512 ['3c_identity_block_conv1[0][0]'] rmalization) 3c_identity_block_relu1 (Activ (None, 28, 28, 128) 0 ['3c_identity_block_bn1[0][0]'] ation) 3c_identity_block_conv2 (Conv2 (None, 28, 28, 128) 147584 ['3c_identity_block_relu1[0][0]'] D) 3c_identity_block_bn2 (BatchNo (None, 28, 28, 128) 512 ['3c_identity_block_conv2[0][0]'] rmalization) 3c_identity_block_relu2 (Activ (None, 28, 28, 128) 0 ['3c_identity_block_bn2[0][0]'] ation) 3c_identity_block_conv3 (Conv2 (None, 28, 28, 512) 66048 ['3c_identity_block_relu2[0][0]'] D) 3c_identity_block_bn3 (BatchNo (None, 28, 28, 512) 2048 ['3c_identity_block_conv3[0][0]'] rmalization) global_average_pooling2d_5 (Gl (None, 512) 0 ['3c_identity_block_bn3[0][0]'] obalAveragePooling2D) reshape_5 (Reshape) (None, 1, 1, 512) 0 ['global_average_pooling2d_5[0][0 ]'] dense_10 (Dense) (None, 1, 1, 32) 16384 ['reshape_5[0][0]'] dense_11 (Dense) (None, 1, 1, 512) 16384 ['dense_10[0][0]'] multiply_5 (Multiply) (None, 28, 28, 512) 0 ['3c_identity_block_bn3[0][0]', 'dense_11[0][0]'] 3c_identity_block_add (Add) (None, 28, 28, 512) 0 ['multiply_5[0][0]', '3b_identity_block_relu4[0][0]'] 3c_identity_block_relu4 (Activ (None, 28, 28, 512) 0 ['3c_identity_block_add[0][0]'] ation) 3d_identity_block_conv1 (Conv2 (None, 28, 28, 128) 65664 ['3c_identity_block_relu4[0][0]'] D) 3d_identity_block_bn1 (BatchNo (None, 28, 28, 128) 512 ['3d_identity_block_conv1[0][0]'] rmalization) 3d_identity_block_relu1 (Activ (None, 28, 28, 128) 0 ['3d_identity_block_bn1[0][0]'] ation) 3d_identity_block_conv2 (Conv2 (None, 28, 28, 128) 147584 ['3d_identity_block_relu1[0][0]'] D) 3d_identity_block_bn2 (BatchNo (None, 28, 28, 128) 512 ['3d_identity_block_conv2[0][0]'] rmalization) 3d_identity_block_relu2 (Activ (None, 28, 28, 128) 0 ['3d_identity_block_bn2[0][0]'] ation) 3d_identity_block_conv3 (Conv2 (None, 28, 28, 512) 66048 ['3d_identity_block_relu2[0][0]'] D) 3d_identity_block_bn3 (BatchNo (None, 28, 28, 512) 2048 ['3d_identity_block_conv3[0][0]'] rmalization) global_average_pooling2d_6 (Gl (None, 512) 0 ['3d_identity_block_bn3[0][0]'] obalAveragePooling2D) reshape_6 (Reshape) (None, 1, 1, 512) 0 ['global_average_pooling2d_6[0][0 ]'] dense_12 (Dense) (None, 1, 1, 32) 16384 ['reshape_6[0][0]'] dense_13 (Dense) (None, 1, 1, 512) 16384 ['dense_12[0][0]'] multiply_6 (Multiply) (None, 28, 28, 512) 0 ['3d_identity_block_bn3[0][0]', 'dense_13[0][0]'] 3d_identity_block_add (Add) (None, 28, 28, 512) 0 ['multiply_6[0][0]', '3c_identity_block_relu4[0][0]'] 3d_identity_block_relu4 (Activ (None, 28, 28, 512) 0 ['3d_identity_block_add[0][0]'] ation) 4a_conv_block_conv1 (Conv2D) (None, 14, 14, 256) 131328 ['3d_identity_block_relu4[0][0]'] 4a_conv_block_bn1 (BatchNormal (None, 14, 14, 256) 1024 ['4a_conv_block_conv1[0][0]'] ization) 4a_conv_block_relu1 (Activatio (None, 14, 14, 256) 0 ['4a_conv_block_bn1[0][0]'] n) 4a_conv_block_conv2 (Conv2D) (None, 14, 14, 256) 590080 
['4a_conv_block_relu1[0][0]'] 4a_conv_block_bn2 (BatchNormal (None, 14, 14, 256) 1024 ['4a_conv_block_conv2[0][0]'] ization) 4a_conv_block_relu2 (Activatio (None, 14, 14, 256) 0 ['4a_conv_block_bn2[0][0]'] n) 4a_conv_block_conv3 (Conv2D) (None, 14, 14, 1024 263168 ['4a_conv_block_relu2[0][0]'] ) 4a_conv_block_bn3 (BatchNormal (None, 14, 14, 1024 4096 ['4a_conv_block_conv3[0][0]'] ization) ) global_average_pooling2d_7 (Gl (None, 1024) 0 ['4a_conv_block_bn3[0][0]'] obalAveragePooling2D) reshape_7 (Reshape) (None, 1, 1, 1024) 0 ['global_average_pooling2d_7[0][0 ]'] dense_14 (Dense) (None, 1, 1, 64) 65536 ['reshape_7[0][0]'] dense_15 (Dense) (None, 1, 1, 1024) 65536 ['dense_14[0][0]'] 4a_conv_block_res_conv (Conv2D (None, 14, 14, 1024 525312 ['3d_identity_block_relu4[0][0]'] ) ) multiply_7 (Multiply) (None, 14, 14, 1024 0 ['4a_conv_block_bn3[0][0]', ) 'dense_15[0][0]'] 4a_conv_block_res_bn (BatchNor (None, 14, 14, 1024 4096 ['4a_conv_block_res_conv[0][0]'] malization) ) 4a_conv_block_add (Add) (None, 14, 14, 1024 0 ['multiply_7[0][0]', ) '4a_conv_block_res_bn[0][0]'] 4a_conv_block_relu4 (Activatio (None, 14, 14, 1024 0 ['4a_conv_block_add[0][0]'] n) ) 4b_identity_block_conv1 (Conv2 (None, 14, 14, 256) 262400 ['4a_conv_block_relu4[0][0]'] D) 4b_identity_block_bn1 (BatchNo (None, 14, 14, 256) 1024 ['4b_identity_block_conv1[0][0]'] rmalization) 4b_identity_block_relu1 (Activ (None, 14, 14, 256) 0 ['4b_identity_block_bn1[0][0]'] ation) 4b_identity_block_conv2 (Conv2 (None, 14, 14, 256) 590080 ['4b_identity_block_relu1[0][0]'] D) 4b_identity_block_bn2 (BatchNo (None, 14, 14, 256) 1024 ['4b_identity_block_conv2[0][0]'] rmalization) 4b_identity_block_relu2 (Activ (None, 14, 14, 256) 0 ['4b_identity_block_bn2[0][0]'] ation) 4b_identity_block_conv3 (Conv2 (None, 14, 14, 1024 263168 ['4b_identity_block_relu2[0][0]'] D) ) 4b_identity_block_bn3 (BatchNo (None, 14, 14, 1024 4096 ['4b_identity_block_conv3[0][0]'] rmalization) ) global_average_pooling2d_8 (Gl (None, 1024) 0 ['4b_identity_block_bn3[0][0]'] obalAveragePooling2D) reshape_8 (Reshape) (None, 1, 1, 1024) 0 ['global_average_pooling2d_8[0][0 ]'] dense_16 (Dense) (None, 1, 1, 64) 65536 ['reshape_8[0][0]'] dense_17 (Dense) (None, 1, 1, 1024) 65536 ['dense_16[0][0]'] multiply_8 (Multiply) (None, 14, 14, 1024 0 ['4b_identity_block_bn3[0][0]', ) 'dense_17[0][0]'] 4b_identity_block_add (Add) (None, 14, 14, 1024 0 ['multiply_8[0][0]', ) '4a_conv_block_relu4[0][0]'] 4b_identity_block_relu4 (Activ (None, 14, 14, 1024 0 ['4b_identity_block_add[0][0]'] ation) ) 4c_identity_block_conv1 (Conv2 (None, 14, 14, 256) 262400 ['4b_identity_block_relu4[0][0]'] D) 4c_identity_block_bn1 (BatchNo (None, 14, 14, 256) 1024 ['4c_identity_block_conv1[0][0]'] rmalization) 4c_identity_block_relu1 (Activ (None, 14, 14, 256) 0 ['4c_identity_block_bn1[0][0]'] ation) 4c_identity_block_conv2 (Conv2 (None, 14, 14, 256) 590080 ['4c_identity_block_relu1[0][0]'] D) 4c_identity_block_bn2 (BatchNo (None, 14, 14, 256) 1024 ['4c_identity_block_conv2[0][0]'] rmalization) 4c_identity_block_relu2 (Activ (None, 14, 14, 256) 0 ['4c_identity_block_bn2[0][0]'] ation) 4c_identity_block_conv3 (Conv2 (None, 14, 14, 1024 263168 ['4c_identity_block_relu2[0][0]'] D) ) 4c_identity_block_bn3 (BatchNo (None, 14, 14, 1024 4096 ['4c_identity_block_conv3[0][0]'] rmalization) ) global_average_pooling2d_9 (Gl (None, 1024) 0 ['4c_identity_block_bn3[0][0]'] obalAveragePooling2D) reshape_9 (Reshape) (None, 1, 1, 1024) 0 ['global_average_pooling2d_9[0][0 ]'] dense_18 (Dense) (None, 1, 1, 64) 65536 
['reshape_9[0][0]'] dense_19 (Dense) (None, 1, 1, 1024) 65536 ['dense_18[0][0]'] multiply_9 (Multiply) (None, 14, 14, 1024 0 ['4c_identity_block_bn3[0][0]', ) 'dense_19[0][0]'] 4c_identity_block_add (Add) (None, 14, 14, 1024 0 ['multiply_9[0][0]', ) '4b_identity_block_relu4[0][0]'] 4c_identity_block_relu4 (Activ (None, 14, 14, 1024 0 ['4c_identity_block_add[0][0]'] ation) ) 4d_identity_block_conv1 (Conv2 (None, 14, 14, 256) 262400 ['4c_identity_block_relu4[0][0]'] D) 4d_identity_block_bn1 (BatchNo (None, 14, 14, 256) 1024 ['4d_identity_block_conv1[0][0]'] rmalization) 4d_identity_block_relu1 (Activ (None, 14, 14, 256) 0 ['4d_identity_block_bn1[0][0]'] ation) 4d_identity_block_conv2 (Conv2 (None, 14, 14, 256) 590080 ['4d_identity_block_relu1[0][0]'] D) 4d_identity_block_bn2 (BatchNo (None, 14, 14, 256) 1024 ['4d_identity_block_conv2[0][0]'] rmalization) 4d_identity_block_relu2 (Activ (None, 14, 14, 256) 0 ['4d_identity_block_bn2[0][0]'] ation) 4d_identity_block_conv3 (Conv2 (None, 14, 14, 1024 263168 ['4d_identity_block_relu2[0][0]'] D) ) 4d_identity_block_bn3 (BatchNo (None, 14, 14, 1024 4096 ['4d_identity_block_conv3[0][0]'] rmalization) ) global_average_pooling2d_10 (G (None, 1024) 0 ['4d_identity_block_bn3[0][0]'] lobalAveragePooling2D) reshape_10 (Reshape) (None, 1, 1, 1024) 0 ['global_average_pooling2d_10[0][ 0]'] dense_20 (Dense) (None, 1, 1, 64) 65536 ['reshape_10[0][0]'] dense_21 (Dense) (None, 1, 1, 1024) 65536 ['dense_20[0][0]'] multiply_10 (Multiply) (None, 14, 14, 1024 0 ['4d_identity_block_bn3[0][0]', ) 'dense_21[0][0]'] 4d_identity_block_add (Add) (None, 14, 14, 1024 0 ['multiply_10[0][0]', ) '4c_identity_block_relu4[0][0]'] 4d_identity_block_relu4 (Activ (None, 14, 14, 1024 0 ['4d_identity_block_add[0][0]'] ation) ) 4e_identity_block_conv1 (Conv2 (None, 14, 14, 256) 262400 ['4d_identity_block_relu4[0][0]'] D) 4e_identity_block_bn1 (BatchNo (None, 14, 14, 256) 1024 ['4e_identity_block_conv1[0][0]'] rmalization) 4e_identity_block_relu1 (Activ (None, 14, 14, 256) 0 ['4e_identity_block_bn1[0][0]'] ation) 4e_identity_block_conv2 (Conv2 (None, 14, 14, 256) 590080 ['4e_identity_block_relu1[0][0]'] D) 4e_identity_block_bn2 (BatchNo (None, 14, 14, 256) 1024 ['4e_identity_block_conv2[0][0]'] rmalization) 4e_identity_block_relu2 (Activ (None, 14, 14, 256) 0 ['4e_identity_block_bn2[0][0]'] ation) 4e_identity_block_conv3 (Conv2 (None, 14, 14, 1024 263168 ['4e_identity_block_relu2[0][0]'] D) ) 4e_identity_block_bn3 (BatchNo (None, 14, 14, 1024 4096 ['4e_identity_block_conv3[0][0]'] rmalization) ) global_average_pooling2d_11 (G (None, 1024) 0 ['4e_identity_block_bn3[0][0]'] lobalAveragePooling2D) reshape_11 (Reshape) (None, 1, 1, 1024) 0 ['global_average_pooling2d_11[0][ 0]'] dense_22 (Dense) (None, 1, 1, 64) 65536 ['reshape_11[0][0]'] dense_23 (Dense) (None, 1, 1, 1024) 65536 ['dense_22[0][0]'] multiply_11 (Multiply) (None, 14, 14, 1024 0 ['4e_identity_block_bn3[0][0]', ) 'dense_23[0][0]'] 4e_identity_block_add (Add) (None, 14, 14, 1024 0 ['multiply_11[0][0]', ) '4d_identity_block_relu4[0][0]'] 4e_identity_block_relu4 (Activ (None, 14, 14, 1024 0 ['4e_identity_block_add[0][0]'] ation) ) 4f_identity_block_conv1 (Conv2 (None, 14, 14, 256) 262400 ['4e_identity_block_relu4[0][0]'] D) 4f_identity_block_bn1 (BatchNo (None, 14, 14, 256) 1024 ['4f_identity_block_conv1[0][0]'] rmalization) 4f_identity_block_relu1 (Activ (None, 14, 14, 256) 0 ['4f_identity_block_bn1[0][0]'] ation) 4f_identity_block_conv2 (Conv2 (None, 14, 14, 256) 590080 ['4f_identity_block_relu1[0][0]'] D) 4f_identity_block_bn2 
(BatchNo (None, 14, 14, 256) 1024 ['4f_identity_block_conv2[0][0]'] rmalization) 4f_identity_block_relu2 (Activ (None, 14, 14, 256) 0 ['4f_identity_block_bn2[0][0]'] ation) 4f_identity_block_conv3 (Conv2 (None, 14, 14, 1024 263168 ['4f_identity_block_relu2[0][0]'] D) ) 4f_identity_block_bn3 (BatchNo (None, 14, 14, 1024 4096 ['4f_identity_block_conv3[0][0]'] rmalization) ) global_average_pooling2d_12 (G (None, 1024) 0 ['4f_identity_block_bn3[0][0]'] lobalAveragePooling2D) reshape_12 (Reshape) (None, 1, 1, 1024) 0 ['global_average_pooling2d_12[0][ 0]'] dense_24 (Dense) (None, 1, 1, 64) 65536 ['reshape_12[0][0]'] dense_25 (Dense) (None, 1, 1, 1024) 65536 ['dense_24[0][0]'] multiply_12 (Multiply) (None, 14, 14, 1024 0 ['4f_identity_block_bn3[0][0]', ) 'dense_25[0][0]'] 4f_identity_block_add (Add) (None, 14, 14, 1024 0 ['multiply_12[0][0]', ) '4e_identity_block_relu4[0][0]'] 4f_identity_block_relu4 (Activ (None, 14, 14, 1024 0 ['4f_identity_block_add[0][0]'] ation) ) 5a_conv_block_conv1 (Conv2D) (None, 7, 7, 512) 524800 ['4f_identity_block_relu4[0][0]'] 5a_conv_block_bn1 (BatchNormal (None, 7, 7, 512) 2048 ['5a_conv_block_conv1[0][0]'] ization) 5a_conv_block_relu1 (Activatio (None, 7, 7, 512) 0 ['5a_conv_block_bn1[0][0]'] n) 5a_conv_block_conv2 (Conv2D) (None, 7, 7, 512) 2359808 ['5a_conv_block_relu1[0][0]'] 5a_conv_block_bn2 (BatchNormal (None, 7, 7, 512) 2048 ['5a_conv_block_conv2[0][0]'] ization) 5a_conv_block_relu2 (Activatio (None, 7, 7, 512) 0 ['5a_conv_block_bn2[0][0]'] n) 5a_conv_block_conv3 (Conv2D) (None, 7, 7, 2048) 1050624 ['5a_conv_block_relu2[0][0]'] 5a_conv_block_bn3 (BatchNormal (None, 7, 7, 2048) 8192 ['5a_conv_block_conv3[0][0]'] ization) global_average_pooling2d_13 (G (None, 2048) 0 ['5a_conv_block_bn3[0][0]'] lobalAveragePooling2D) reshape_13 (Reshape) (None, 1, 1, 2048) 0 ['global_average_pooling2d_13[0][ 0]'] dense_26 (Dense) (None, 1, 1, 128) 262144 ['reshape_13[0][0]'] dense_27 (Dense) (None, 1, 1, 2048) 262144 ['dense_26[0][0]'] 5a_conv_block_res_conv (Conv2D (None, 7, 7, 2048) 2099200 ['4f_identity_block_relu4[0][0]'] ) multiply_13 (Multiply) (None, 7, 7, 2048) 0 ['5a_conv_block_bn3[0][0]', 'dense_27[0][0]'] 5a_conv_block_res_bn (BatchNor (None, 7, 7, 2048) 8192 ['5a_conv_block_res_conv[0][0]'] malization) 5a_conv_block_add (Add) (None, 7, 7, 2048) 0 ['multiply_13[0][0]', '5a_conv_block_res_bn[0][0]'] 5a_conv_block_relu4 (Activatio (None, 7, 7, 2048) 0 ['5a_conv_block_add[0][0]'] n) 5b_identity_block_conv1 (Conv2 (None, 7, 7, 512) 1049088 ['5a_conv_block_relu4[0][0]'] D) 5b_identity_block_bn1 (BatchNo (None, 7, 7, 512) 2048 ['5b_identity_block_conv1[0][0]'] rmalization) 5b_identity_block_relu1 (Activ (None, 7, 7, 512) 0 ['5b_identity_block_bn1[0][0]'] ation) 5b_identity_block_conv2 (Conv2 (None, 7, 7, 512) 2359808 ['5b_identity_block_relu1[0][0]'] D) 5b_identity_block_bn2 (BatchNo (None, 7, 7, 512) 2048 ['5b_identity_block_conv2[0][0]'] rmalization) 5b_identity_block_relu2 (Activ (None, 7, 7, 512) 0 ['5b_identity_block_bn2[0][0]'] ation) 5b_identity_block_conv3 (Conv2 (None, 7, 7, 2048) 1050624 ['5b_identity_block_relu2[0][0]'] D) 5b_identity_block_bn3 (BatchNo (None, 7, 7, 2048) 8192 ['5b_identity_block_conv3[0][0]'] rmalization) global_average_pooling2d_14 (G (None, 2048) 0 ['5b_identity_block_bn3[0][0]'] lobalAveragePooling2D) reshape_14 (Reshape) (None, 1, 1, 2048) 0 ['global_average_pooling2d_14[0][ 0]'] dense_28 (Dense) (None, 1, 1, 128) 262144 ['reshape_14[0][0]'] dense_29 (Dense) (None, 1, 1, 2048) 262144 ['dense_28[0][0]'] multiply_14 (Multiply) (None, 7, 7, 
2048) 0 ['5b_identity_block_bn3[0][0]', 'dense_29[0][0]'] 5b_identity_block_add (Add) (None, 7, 7, 2048) 0 ['multiply_14[0][0]', '5a_conv_block_relu4[0][0]'] 5b_identity_block_relu4 (Activ (None, 7, 7, 2048) 0 ['5b_identity_block_add[0][0]'] ation) 5c_identity_block_conv1 (Conv2 (None, 7, 7, 512) 1049088 ['5b_identity_block_relu4[0][0]'] D) 5c_identity_block_bn1 (BatchNo (None, 7, 7, 512) 2048 ['5c_identity_block_conv1[0][0]'] rmalization) 5c_identity_block_relu1 (Activ (None, 7, 7, 512) 0 ['5c_identity_block_bn1[0][0]'] ation) 5c_identity_block_conv2 (Conv2 (None, 7, 7, 512) 2359808 ['5c_identity_block_relu1[0][0]'] D) 5c_identity_block_bn2 (BatchNo (None, 7, 7, 512) 2048 ['5c_identity_block_conv2[0][0]'] rmalization) 5c_identity_block_relu2 (Activ (None, 7, 7, 512) 0 ['5c_identity_block_bn2[0][0]'] ation) 5c_identity_block_conv3 (Conv2 (None, 7, 7, 2048) 1050624 ['5c_identity_block_relu2[0][0]'] D) 5c_identity_block_bn3 (BatchNo (None, 7, 7, 2048) 8192 ['5c_identity_block_conv3[0][0]'] rmalization) global_average_pooling2d_15 (G (None, 2048) 0 ['5c_identity_block_bn3[0][0]'] lobalAveragePooling2D) reshape_15 (Reshape) (None, 1, 1, 2048) 0 ['global_average_pooling2d_15[0][ 0]'] dense_30 (Dense) (None, 1, 1, 128) 262144 ['reshape_15[0][0]'] dense_31 (Dense) (None, 1, 1, 2048) 262144 ['dense_30[0][0]'] multiply_15 (Multiply) (None, 7, 7, 2048) 0 ['5c_identity_block_bn3[0][0]', 'dense_31[0][0]'] 5c_identity_block_add (Add) (None, 7, 7, 2048) 0 ['multiply_15[0][0]', '5b_identity_block_relu4[0][0]'] 5c_identity_block_relu4 (Activ (None, 7, 7, 2048) 0 ['5c_identity_block_add[0][0]'] ation) avg_pool (AveragePooling2D) (None, 1, 1, 2048) 0 ['5c_identity_block_relu4[0][0]'] flatten (Flatten) (None, 2048) 0 ['avg_pool[0][0]'] dropout (Dropout) (None, 2048) 0 ['flatten[0][0]'] fc2 (Dense) (None, 4) 8196 ['dropout[0][0]'] ================================================================================================== Total params: 26,110,852 Trainable params: 26,057,732 Non-trainable params: 53,120 _______________________________________________________________________________________
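The training call below uses train_ds and val_ds, which are built in the data-loading part of the training-camp series and are not shown in this post. A minimal sketch of how they might be created is given here; the directory path, image size, batch size and split are assumptions, not the original setup:

import tensorflow as tf

data_dir = "./data"  # hypothetical dataset directory with one sub-folder per class
train_ds = tf.keras.utils.image_dataset_from_directory(
    data_dir, validation_split=0.2, subset="training", seed=123,
    image_size=(224, 224), batch_size=8)
val_ds = tf.keras.utils.image_dataset_from_directory(
    data_dir, validation_split=0.2, subset="validation", seed=123,
    image_size=(224, 224), batch_size=8)

The default label_mode='int' matches the sparse_categorical_crossentropy loss used below.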
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

epochs = 30
history = model.fit(
    train_ds,
    validation_data=val_ds,
    epochs=epochs,
)
Epoch 1/30  57/57 [==============================] - 120s 2s/step - loss: 1.6071 - accuracy: 0.5022 - val_loss: 1.6805 - val_accuracy: 0.3628
Epoch 2/30  57/57 [==============================] - 101s 2s/step - loss: 1.1572 - accuracy: 0.5973 - val_loss: 2.9043 - val_accuracy: 0.3628
Epoch 3/30  57/57 [==============================] - 103s 2s/step - loss: 1.0106 - accuracy: 0.6438 - val_loss: 3.5882 - val_accuracy: 0.2655
Epoch 4/30  57/57 [==============================] - 104s 2s/step - loss: 0.6960 - accuracy: 0.7345 - val_loss: 3.8824 - val_accuracy: 0.2920
Epoch 5/30  57/57 [==============================] - 103s 2s/step - loss: 0.6722 - accuracy: 0.8009 - val_loss: 1.9140 - val_accuracy: 0.3805
Epoch 6/30  57/57 [==============================] - 102s 2s/step - loss: 0.5720 - accuracy: 0.8009 - val_loss: 1.3526 - val_accuracy: 0.5133
Epoch 7/30  57/57 [==============================] - 103s 2s/step - loss: 0.5234 - accuracy: 0.8252 - val_loss: 1.6950 - val_accuracy: 0.6195
Epoch 8/30  57/57 [==============================] - 103s 2s/step - loss: 0.5409 - accuracy: 0.8186 - val_loss: 1.0905 - val_accuracy: 0.5752
Epoch 9/30  57/57 [==============================] - 102s 2s/step - loss: 0.4960 - accuracy: 0.8230 - val_loss: 1.0269 - val_accuracy: 0.5664
Epoch 10/30 57/57 [==============================] - 106s 2s/step - loss: 0.3521 - accuracy: 0.8761 - val_loss: 0.7942 - val_accuracy: 0.7965
Epoch 11/30 57/57 [==============================] - 103s 2s/step - loss: 0.2162 - accuracy: 0.9204 - val_loss: 0.9084 - val_accuracy: 0.7168
Epoch 12/30 57/57 [==============================] - 103s 2s/step - loss: 0.3159 - accuracy: 0.9093 - val_loss: 1.8489 - val_accuracy: 0.7611
Epoch 13/30 57/57 [==============================] - 102s 2s/step - loss: 0.2727 - accuracy: 0.8960 - val_loss: 2.2825 - val_accuracy: 0.7345
Epoch 14/30 57/57 [==============================] - 104s 2s/step - loss: 0.1902 - accuracy: 0.9381 - val_loss: 1.1050 - val_accuracy: 0.7257
Epoch 15/30 57/57 [==============================] - 105s 2s/step - loss: 0.2085 - accuracy: 0.9292 - val_loss: 0.3290 - val_accuracy: 0.9027
Epoch 16/30 57/57 [==============================] - 105s 2s/step - loss: 0.1697 - accuracy: 0.9358 - val_loss: 1.0470 - val_accuracy: 0.7965
Epoch 17/30 57/57 [==============================] - 106s 2s/step - loss: 0.1955 - accuracy: 0.9381 - val_loss: 9.2690 - val_accuracy: 0.3540
Epoch 18/30 57/57 [==============================] - 103s 2s/step - loss: 0.3337 - accuracy: 0.8960 - val_loss: 1.6920 - val_accuracy: 0.7699
Epoch 19/30 57/57 [==============================] - 103s 2s/step - loss: 0.1869 - accuracy: 0.9292 - val_loss: 9.7153 - val_accuracy: 0.3628
Epoch 20/30 57/57 [==============================] - 103s 2s/step - loss: 0.2506 - accuracy: 0.9049 - val_loss: 0.9142 - val_accuracy: 0.7876
Epoch 21/30 57/57 [==============================] - 107s 2s/step - loss: 0.1941 - accuracy: 0.9358 - val_loss: 0.7740 - val_accuracy: 0.8142
Epoch 22/30 57/57 [==============================] - 103s 2s/step - loss: 0.0971 - accuracy: 0.9690 - val_loss: 0.5248 - val_accuracy: 0.8230
Epoch 23/30 57/57 [==============================] - 104s 2s/step - loss: 0.0549 - accuracy: 0.9845 - val_loss: 2.8425 - val_accuracy: 0.6637
Epoch 24/30 57/57 [==============================] - 107s 2s/step - loss: 0.0251 - accuracy: 0.9934 - val_loss: 0.1835 - val_accuracy: 0.9292
Epoch 25/30 57/57 [==============================] - 110s 2s/step - loss: 0.0131 - accuracy: 1.0000 - val_loss: 0.2147 - val_accuracy: 0.9469
Epoch 26/30 57/57 [==============================] - 103s 2s/step - loss: 0.0043 - accuracy: 1.0000 - val_loss: 0.1484 - val_accuracy: 0.9469
Epoch 27/30 57/57 [==============================] - 106s 2s/step - loss: 0.0029 - accuracy: 1.0000 - val_loss: 0.1338 - val_accuracy: 0.9558
Epoch 28/30 57/57 [==============================] - 107s 2s/step - loss: 0.0013 - accuracy: 1.0000 - val_loss: 0.1428 - val_accuracy: 0.9469
Epoch 29/30 57/57 [==============================] - 108s 2s/step - loss: 6.1017e-04 - accuracy: 1.0000 - val_loss: 0.1412 - val_accuracy: 0.9469
Epoch 30/30 57/57 [==============================] - 111s 2s/step - loss: 5.0776e-04 - accuracy: 1.0000 - val_loss: 0.1457 - val_accuracy: 0.9381
import matplotlib.pyplot as plt

# Number of epochs actually trained
actual_epochs = len(history.history['accuracy'])
acc = history.history['accuracy']
val_acc = history.history['val_accuracy']
loss = history.history['loss']
val_loss = history.history['val_loss']
epochs_range = range(actual_epochs)
plt.figure(figsize=(12, 4))
# Accuracy curves
plt.subplot(1, 2, 1)
plt.plot(epochs_range, acc, label='Training Accuracy')
plt.plot(epochs_range, val_acc, label='Validation Accuracy')
plt.legend(loc='lower right')
plt.title('Training and Validation Accuracy')
# Loss curves
plt.subplot(1, 2, 2)
plt.plot(epochs_range, loss, label='Training Loss')
plt.plot(epochs_range, val_loss, label='Validation Loss')
plt.legend(loc='upper right')
plt.title('Training and Validation Loss')
plt.show()
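A small convenience snippet (not in the original post) reads the best validation accuracy out of the same history object; with the log above it reports roughly 0.9558 at epoch 27:

# Best validation accuracy and the epoch at which it occurred
best_epoch = val_acc.index(max(val_acc)) + 1
print(f"Best val_accuracy: {max(val_acc):.4f} (epoch {best_epoch})")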