
ResNet34_keras dropout

Posted by 全栈程序员站长 on 2022-11-08

backbone

ResNet34 network structure diagram:

The network is built in four stages, and the blue arrows mark the layers that get merged into the U-Net. Note that the forward encoder pass performs five downsamplings in total, including the initial strided 7×7 convolution, so the decoder needs five upsampling steps, yet there are only four skip connections between encoder and decoder, as shown in the figure below, using a 224×224 input as an example:
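To make the bookkeeping concrete, here is a short sketch (plain Python, added for illustration; not from the original post) that walks through the five 2× halvings:

# Five 2x downsamplings in the encoder: the initial 7x7 stride-2 conv,
# the 3x3 stride-2 max-pool, and the stride-2 conv block that opens each
# of the last three stages.
size = 224
for name in ('conv0 7x7/2', 'pooling0 3x3/2', 'stage2 conv block /2',
             'stage3 conv block /2', 'stage4 conv block /2'):
    size //= 2
    print('{:>20}: {}x{}'.format(name, size, size))
# -> 112, 56, 28, 14, 7; the decoder upsamples five times, but only four
#    of these resolutions have a matching encoder skip connection.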

Building ResNet34 in Keras

Building the convolution blocks

There are two forms:

A: a plain identity shortcut. B: a dashed shortcut that adjusts the feature-map dimensions with a 1×1 convolution.

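The blocks below rely on three helpers (get_conv_params, get_bn_params, handle_block_names) that the post never shows; they live in the qubvel repos listed in the references. The following is a minimal sketch of what they plausibly look like, together with the Keras imports the code needs; the exact default values are an assumption, not the repos' verbatim code:

from keras import backend as K
from keras.layers import (Input, Conv2D, BatchNormalization, Activation,
                          ZeroPadding2D, MaxPooling2D, GlobalAveragePooling2D,
                          Dense, Add)
from keras.models import Model

def get_conv_params(**params):
    # Assumed defaults: bias-free convs with 'valid' padding, since the blocks
    # apply an explicit ZeroPadding2D before every 3x3 conv.
    defaults = {'kernel_initializer': 'he_uniform', 'use_bias': False,
                'padding': 'valid'}
    defaults.update(params)
    return defaults

def get_bn_params(**params):
    # Assumed defaults for channels-last batch normalization.
    defaults = {'axis': 3, 'momentum': 0.99, 'epsilon': 1e-5,
                'center': True, 'scale': True}
    defaults.update(params)
    return defaults

def handle_block_names(stage, block):
    # Builds unique layer-name prefixes such as 'stage1_unit2_conv'.
    base = 'stage{}_unit{}_'.format(stage + 1, block + 1)
    return base + 'conv', base + 'bn', base + 'relu', base + 'sc'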
def basic_identity_block(filters, stage, block):
    """The identity block has no conv layer on its shortcut.

    # Arguments
        filters: integer, the number of filters of the 3x3 conv layers on the main path
        stage: integer, current stage label, used for generating layer names
        block: integer, current block label, used for generating layer names

    # Returns
        A function that applies the block to an input tensor.
    """
    def layer(input_tensor):
        conv_params = get_conv_params()
        bn_params = get_bn_params()
        conv_name, bn_name, relu_name, sc_name = handle_block_names(stage, block)

        # pre-activation residual unit: BN -> ReLU -> pad -> 3x3 conv, twice
        x = BatchNormalization(name=bn_name + '1', **bn_params)(input_tensor)
        x = Activation('relu', name=relu_name + '1')(x)
        x = ZeroPadding2D(padding=(1, 1))(x)
        x = Conv2D(filters, (3, 3), name=conv_name + '1', **conv_params)(x)

        x = BatchNormalization(name=bn_name + '2', **bn_params)(x)
        x = Activation('relu', name=relu_name + '2')(x)
        x = ZeroPadding2D(padding=(1, 1))(x)
        x = Conv2D(filters, (3, 3), name=conv_name + '2', **conv_params)(x)

        # form A: add the unchanged input back in
        x = Add()([x, input_tensor])
        return x

    return layer


def basic_conv_block(filters, stage, block, strides=(2, 2)):
    """The conv block has a 1x1 conv layer on its shortcut to match dimensions.

    # Arguments
        filters: integer, the number of filters of the 3x3 conv layers on the main path
        stage: integer, current stage label, used for generating layer names
        block: integer, current block label, used for generating layer names
        strides: strides of the first main-path conv and of the shortcut conv

    # Returns
        A function that applies the block to an input tensor.
    """
    def layer(input_tensor):
        conv_params = get_conv_params()
        bn_params = get_bn_params()
        conv_name, bn_name, relu_name, sc_name = handle_block_names(stage, block)

        x = BatchNormalization(name=bn_name + '1', **bn_params)(input_tensor)
        x = Activation('relu', name=relu_name + '1')(x)
        shortcut = x

        x = ZeroPadding2D(padding=(1, 1))(x)
        x = Conv2D(filters, (3, 3), strides=strides, name=conv_name + '1', **conv_params)(x)

        x = BatchNormalization(name=bn_name + '2', **bn_params)(x)
        x = Activation('relu', name=relu_name + '2')(x)
        x = ZeroPadding2D(padding=(1, 1))(x)
        x = Conv2D(filters, (3, 3), name=conv_name + '2', **conv_params)(x)

        # form B: a strided 1x1 conv adjusts the shortcut's shape before the add
        shortcut = Conv2D(filters, (1, 1), name=sc_name, strides=strides, **conv_params)(shortcut)
        x = Add()([x, shortcut])
        return x

    return layer
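As a usage sketch with hypothetical shapes, mirroring the loop in build_resnet below, stage 1 of ResNet34 stacks one stride-2 conv block and three identity blocks at 128 filters:

# Stage 1 of ResNet34 (rep=4, filters=128) applied to a hypothetical
# 56x56x64 feature map coming out of stage 0.
feat = Input(shape=(56, 56, 64))
x = basic_conv_block(128, stage=1, block=0, strides=(2, 2))(feat)  # -> 28x28x128
for b in range(1, 4):
    x = basic_identity_block(128, stage=1, block=b)(x)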
Building the ResNet34 network

The network structure is as shown in the diagram above.

def build_resnet(repetitions=(2, 2, 2, 2),
                 include_top=True,
                 input_tensor=None,
                 input_shape=None,
                 classes=1000,
                 block_type='usual'):
    # Determine the proper input shape. _obtain_input_shape and
    # get_source_inputs ship with Keras (their import path varies by version).
    input_shape = _obtain_input_shape(input_shape,
                                      default_size=224,
                                      min_size=197,
                                      data_format='channels_last',
                                      require_flatten=include_top)

    if input_tensor is None:
        img_input = Input(shape=input_shape, name='data')
    else:
        if not K.is_keras_tensor(input_tensor):
            img_input = Input(tensor=input_tensor, shape=input_shape)
        else:
            img_input = input_tensor

    # get parameters for model layers
    no_scale_bn_params = get_bn_params(scale=False)
    bn_params = get_bn_params()
    conv_params = get_conv_params()
    init_filters = 64

    if block_type == 'basic':
        conv_block = basic_conv_block
        identity_block = basic_identity_block
    else:
        # bottleneck variants (usual_conv_block / usual_identity_block) are
        # defined elsewhere in the same repo and not shown in this post
        conv_block = usual_conv_block
        identity_block = usual_identity_block

    # resnet bottom
    x = BatchNormalization(name='bn_data', **no_scale_bn_params)(img_input)
    x = ZeroPadding2D(padding=(3, 3))(x)
    x = Conv2D(init_filters, (7, 7), strides=(2, 2), name='conv0', **conv_params)(x)
    x = BatchNormalization(name='bn0', **bn_params)(x)
    x = Activation('relu', name='relu0')(x)
    x = ZeroPadding2D(padding=(1, 1))(x)
    x = MaxPooling2D((3, 3), strides=(2, 2), padding='valid', name='pooling0')(x)

    # resnet body; for ResNet34, repetitions = (3, 4, 6, 3)
    for stage, rep in enumerate(repetitions):
        for block in range(rep):
            filters = init_filters * (2 ** stage)
            if block == 0 and stage == 0:
                # first block of first stage: no strides, we just max-pooled
                # x = conv_block(filters, stage, block, strides=(1, 1))(x)
                x = identity_block(filters, stage, block)(x)
            elif block == 0:
                # first block of each later stage downsamples by 2
                x = conv_block(filters, stage, block, strides=(2, 2))(x)
            else:
                x = identity_block(filters, stage, block)(x)

    x = BatchNormalization(name='bn1', **bn_params)(x)
    x = Activation('relu', name='relu1')(x)

    # resnet top
    if include_top:
        x = GlobalAveragePooling2D(name='pool1')(x)
        x = Dense(classes, name='fc1')(x)
        x = Activation('softmax', name='softmax')(x)

    # Ensure that the model takes into account any potential predecessors
    # of `input_tensor`.
    if input_tensor is not None:
        inputs = get_source_inputs(input_tensor)
    else:
        inputs = img_input

    # Create model.
    model = Model(inputs, x)
    return model


def ResNet34(input_shape, input_tensor=None, weights=None, classes=1000, include_top=True):
    model = build_resnet(input_tensor=input_tensor,
                         input_shape=input_shape,
                         repetitions=(3, 4, 6, 3),
                         classes=classes,
                         include_top=include_top,
                         block_type='basic')
    model.name = 'resnet34'
    if weights:
        load_model_weights(weights_collection, model, weights, classes, include_top)
    return model
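A quick usage sketch (a hypothetical call, with weights left at None since the weights_collection is not shown here): build the encoder without its classification head, the form the U-Net below consumes:

# For a 224x224 RGB input the final feature map is 7x7x512.
backbone = ResNet34(input_shape=(224, 224, 3), include_top=False)
backbone.summary()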

Decoder

def build_unet(backbone, classes, skip_connection_layers,
               decoder_filters=(256, 128, 64, 32, 16),
               upsample_rates=(2, 2, 2, 2, 2),
               n_upsample_blocks=5,
               block_type='upsampling',
               activation='sigmoid',
               use_batchnorm=True):
    inp = backbone.input
    x = backbone.output

    if block_type == 'transpose':
        up_block = Transpose2D_block
    else:
        up_block = Upsample2D_block

    # convert layer names to indices; for the ResNet34 backbone this gives
    # skip_connection_idx = [128, 73, 36, 5]
    skip_connection_idx = [get_layer_number(backbone, l) if isinstance(l, str) else l
                           for l in skip_connection_layers]

    for i in range(n_upsample_blocks):
        # check if there is a skip connection for this upsampling step
        # (only the first four of the five steps have one)
        skip_connection = None
        if i < len(skip_connection_idx):
            # e.g. <keras.layers.core.Activation object at 0x00000164CC562A20>
            skip_connection = backbone.layers[skip_connection_idx[i]].output

        upsample_rate = to_tuple(upsample_rates[i])
        x = up_block(decoder_filters[i], i, upsample_rate=upsample_rate,
                     skip=skip_connection, use_batchnorm=use_batchnorm)(x)

    x = Conv2D(classes, (3, 3), padding='same', name='final_conv')(x)
    x = Activation(activation, name=activation)(x)

    model = Model(inp, x)
    return model
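Upsample2D_block (and the Transpose2D_block variant), to_tuple, and get_layer_number are likewise defined in the segmentation_models repo. Below is a simplified sketch of the upsampling block, assuming the usual upsample, concatenate, convolve pattern rather than the repo's exact code, plus minimal versions of the two utilities:

from keras.layers import UpSampling2D, Concatenate

def Upsample2D_block(filters, stage, kernel_size=(3, 3), upsample_rate=(2, 2),
                     use_batchnorm=False, skip=None):
    def layer(input_tensor):
        x = UpSampling2D(size=upsample_rate,
                         name='decoder_stage{}_upsample'.format(stage))(input_tensor)
        if skip is not None:
            # merge with the encoder feature map of matching resolution
            x = Concatenate(name='decoder_stage{}_concat'.format(stage))([x, skip])
        for i in (1, 2):
            x = Conv2D(filters, kernel_size, padding='same',
                       name='decoder_stage{}_conv{}'.format(stage, i))(x)
            if use_batchnorm:
                x = BatchNormalization(name='decoder_stage{}_bn{}'.format(stage, i))(x)
            x = Activation('relu', name='decoder_stage{}_relu{}'.format(stage, i))(x)
        return x
    return layer

def to_tuple(x):
    # 2 -> (2, 2); tuples pass through unchanged
    return x if isinstance(x, tuple) else (x, x)

def get_layer_number(model, layer_name):
    # index of a named layer inside model.layers
    for i, l in enumerate(model.layers):
        if l.name == layer_name:
            return i
    raise ValueError('layer {} not found in model'.format(layer_name))

Putting the pieces together (the skip indices are the [128, 73, 36, 5] values noted in the comment inside build_unet):

backbone = ResNet34(input_shape=(224, 224, 3), include_top=False)
model = build_unet(backbone, classes=1,
                   skip_connection_layers=(128, 73, 36, 5))
model.compile(optimizer='adam', loss='binary_crossentropy')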

References:

  1. https://www.kaggle.com/meaninglesslives/unet-resnet34-in-keras
  2. https://github.com/qubvel/segmentation_models/blob/master/segmentation_models/unet/model.py

Originally published by 全栈程序员栈长 at https://javaforall.cn/185407.html.