This morning I came across a WeChat public-account post featuring a COVID-19 detection demo, so I started reading through its source code. Original link: https://mp.weixin.qq.com/s?__biz=MzUzODkxNzQzMw==&mid=2247484133&idx=1&sn=a4fdd0a6f3a884e94d973f4ca308ab0f&chksm=fad12db3cda6a4a56f3eb5225cea2077bdc9fb11c707434ea0cd80da9906169db187da0ee317&mpshare=1&scene=23&srcid=&sharer_sharetime=1584490235032&sharer_shareid=60d973e8e96477776dbe7e0700964da0#rd
[Model splicing in Keras] The post contains a passage on model loading and network setup that I didn't quite understand; the code is as follows:
[image: code screenshot from the original article]
Why are there both a baseModel and a headModel? I looked up some related material, and this blog post is the one that made it click for me: https://www.cnblogs.com/ZhengPeng7/p/9904771.html
The gist: when training a larger network, you often want to load a pretrained model, but if you also want to add layers to its architecture, problems can arise — hence this style of Keras code.
Take a single-output regression task as an example:
# Appending at the end:
from keras.applications import InceptionV3
from keras.layers import GlobalAveragePooling2D, Dense
from keras.models import Model

base_model = InceptionV3(weights='imagenet', include_top=False)
x = base_model.output
x = GlobalAveragePooling2D()(x)
x = Dense(1, activation='relu')(x)
model = Model(inputs=base_model.input, outputs=x)
model.summary()
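When the base model carries pretrained ImageNet weights, its layers are typically frozen first so that only the newly appended head trains. A minimal, self-contained sketch of that step (here `weights=None` is used only to avoid downloading the ImageNet weights; in real use you would pass `weights='imagenet'`):

```python
from tensorflow.keras.applications import InceptionV3
from tensorflow.keras.layers import GlobalAveragePooling2D, Dense
from tensorflow.keras.models import Model

# Same base + head pattern as above (weights=None only to skip the download).
base_model = InceptionV3(weights=None, include_top=False, input_shape=(150, 150, 3))
x = GlobalAveragePooling2D()(base_model.output)
x = Dense(1, activation='relu')(x)
model = Model(inputs=base_model.input, outputs=x)

# Freeze the pretrained body; only the newly added head remains trainable.
for layer in base_model.layers:
    layer.trainable = False

model.compile(optimizer='adam', loss='mse')
print(model.output_shape)  # (None, 1)
```

After the head converges you can unfreeze some top blocks of the base and fine-tune with a small learning rate.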
# Adding at both the beginning and the end:
# Prepend 1x1 conv layers to reduce 4 channels to 3, then feed into InceptionV3
from keras.applications import InceptionV3
from keras.layers import Input, Conv2D, GlobalAveragePooling2D, Dense
from keras.models import Model

def head_model(input_shape=(150, 150, 4)):
    input_tensor = Input(input_shape)
    x = Conv2D(128, (1, 1), activation='relu')(input_tensor)
    x = Conv2D(3, (1, 1), activation='relu')(x)
    model = Model(inputs=input_tensor, outputs=x, name='head')
    return model

head_model = head_model()
body_model = InceptionV3(weights='imagenet', include_top=False)
base_model = Model(head_model.input, body_model(head_model.output))
x = base_model.output
x = GlobalAveragePooling2D()(x)
x = Dense(1, activation='relu')(x)
model = Model(inputs=base_model.inputs, outputs=x, name='net')
model.summary()
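To sanity-check the splice, you can run a dummy 4-channel batch through the head alone and confirm that the 1x1 convolutions reduce it to 3 channels while leaving the spatial size untouched (a sketch; the random input is just a stand-in for real data):

```python
import numpy as np
from tensorflow.keras.layers import Input, Conv2D
from tensorflow.keras.models import Model

def head_model(input_shape=(150, 150, 4)):
    input_tensor = Input(input_shape)
    x = Conv2D(128, (1, 1), activation='relu')(input_tensor)
    x = Conv2D(3, (1, 1), activation='relu')(x)
    return Model(inputs=input_tensor, outputs=x, name='head')

head = head_model()
dummy = np.random.rand(2, 150, 150, 4).astype('float32')  # batch of 4-channel images
out = head.predict(dummy)
print(out.shape)  # (2, 150, 150, 3) -- channels reduced 4 -> 3, spatial dims unchanged
```

Because the head's output now has 3 channels, it is a valid input for the ImageNet-pretrained InceptionV3 body.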
# Merging two input streams at the end:
from keras.applications import InceptionV3
from keras.layers import Input, Flatten, Dense, Activation, concatenate
from keras.models import Model

base_model = InceptionV3(weights='imagenet', include_top=False, input_shape=(150, 150, 3))
flat = Flatten()(base_model.output)
input_K = Input((100,))  # the second input
K_flow = Activation(activation='linear')(input_K)
x = concatenate([flat, K_flow])  # merge the two streams
x = Dense(1024, activation='relu')(x)
x = Dense(512, activation='relu')(x)
x = Dense(1, activation='relu')(x)
# The data generator should also yield batches in the form ([x_0, x_1], y).
model = Model(inputs=[*base_model.inputs, input_K], outputs=x)
model.summary()
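The comment above notes that the data generator has to yield batches in the form ([x_0, x_1], y). A minimal sketch of such a generator with random stand-in data (the shapes match the 150x150x3 image input and the 100-dimensional input_K used above):

```python
import numpy as np

def two_input_generator(batch_size=8):
    """Yield ([image_batch, K_batch], y_batch) forever, as Keras fit expects."""
    while True:
        x_img = np.random.rand(batch_size, 150, 150, 3).astype('float32')  # image stream
        x_K = np.random.rand(batch_size, 100).astype('float32')            # input_K stream
        y = np.random.rand(batch_size, 1).astype('float32')                # regression target
        yield [x_img, x_K], y

(x_img, x_K), y = next(two_input_generator())
print(x_img.shape, x_K.shape, y.shape)  # (8, 150, 150, 3) (8, 100) (8, 1)
```

In practice you would replace the random arrays with batches read from your dataset and pass the generator to model.fit.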