【Neural Networks | Stitching multiple models together with Keras】When training models, you often need to join several models into one. The most common ways to do this are the following:
The code is quoted from the original author 默盒, with thanks.
1. Appending at the end:
from tensorflow.keras.applications import InceptionV3
from tensorflow.keras.layers import Dense, GlobalAveragePooling2D
from tensorflow.keras.models import Model

base_model = InceptionV3(weights='imagenet', include_top=False)
x = base_model.output
x = GlobalAveragePooling2D()(x)  # pool the InceptionV3 feature maps
x = Dense(1, activation='relu')(x)  # new prediction head
model = Model(inputs=base_model.input, outputs=x)
model.summary()
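Since the InceptionV3 weights are loaded from 'imagenet', a common follow-up (not covered in the original post) is to freeze the backbone and train only the new head. A minimal sketch, assuming an MSE regression objective and the Adam optimizer:

# Sketch, not from the original post: freeze the pretrained backbone so only
# the newly added Dense head is trained, then compile the stitched model.
for layer in base_model.layers:
    layer.trainable = False

model.compile(optimizer='adam', loss='mse')  # optimizer/loss are illustrative assumptions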
2. Adding at both the beginning and the end:
from tensorflow.keras.applications import InceptionV3
from tensorflow.keras.layers import Conv2D, Dense, GlobalAveragePooling2D, Input
from tensorflow.keras.models import Model

# Prepend 1x1 conv layers that reduce 4 channels to 3 before feeding InceptionV3
def head_model(input_shape=(150, 150, 4)):
    input_tensor = Input(input_shape)
    x = Conv2D(128, (1, 1), activation='relu')(input_tensor)
    x = Conv2D(3, (1, 1), activation='relu')(x)
    model = Model(inputs=input_tensor, outputs=x, name='head')
    return model

head_model = head_model()
body_model = InceptionV3(weights='imagenet', include_top=False)
base_model = Model(head_model.input, body_model(head_model.output))
x = base_model.output
x = GlobalAveragePooling2D()(x)
x = Dense(1, activation='relu')(x)
model = Model(inputs=base_model.inputs, outputs=x, name='net')
model.summary()
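To verify that the stitched network really accepts 4-channel input, a quick sanity check (my own addition, not in the original post) is to push a random batch through it:

import numpy as np

# Random 4-channel batch just to check the input/output shapes of the stitched model
dummy = np.random.rand(2, 150, 150, 4).astype('float32')
print(model.predict(dummy).shape)  # expected: (2, 1)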
3. Merging two input data streams at the end:
from tensorflow.keras.applications import InceptionV3
from tensorflow.keras.layers import Activation, Dense, Flatten, Input, concatenate
from tensorflow.keras.models import Model

base_model = InceptionV3(weights='imagenet', include_top=False, input_shape=(150, 150, 3))
flat = Flatten()(base_model.output)
input_K = Input((100, ))  # another input (e.g. extra non-image features)
K_flow = Activation(activation='linear')(input_K)
x = concatenate([flat, K_flow])  # merge the two streams
x = Dense(1024, activation='relu')(x)
x = Dense(512, activation='relu')(x)
x = Dense(1, activation='relu')(x)
model = Model(inputs=[*base_model.inputs, input_K], outputs=x)  # the data generator should also yield batches in the form ([x_0, x_1], y)
model.summary()
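The comment above notes that the data generator should yield batches in the form ([x_0, x_1], y). A minimal sketch of such a generator, using random placeholder data and an assumed batch size of 32:

import numpy as np

def two_input_generator(batch_size=32):
    # Placeholder generator: x_img feeds the image branch, x_k feeds input_K
    while True:
        x_img = np.random.rand(batch_size, 150, 150, 3).astype('float32')
        x_k = np.random.rand(batch_size, 100).astype('float32')
        y = np.random.rand(batch_size, 1).astype('float32')
        yield [x_img, x_k], y

model.compile(optimizer='adam', loss='mse')  # illustrative settings
model.fit(two_input_generator(), steps_per_epoch=10, epochs=1)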