Development environment: Jupyter Notebook / PyCharm

# Extract the effective lung region
import SimpleITK
from scipy import ndimage as ndi
from skimage.segmentation import clear_border
from skimage.measure import label, regionprops
from skimage.morphology import disk, dilation, binary_erosion, binary_closing
from skimage.filters import roberts, sobel
import cv2

def get_pixels_hu_by_simpleitk(dicom_dir):
    '''Read all the DICOM files in a directory and extract the pixel values (
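The snippet above cuts off inside `get_pixels_hu_by_simpleitk`. A core step such a function typically performs is converting raw DICOM pixel values to Hounsfield units via the RescaleSlope/RescaleIntercept tags. The sketch below shows only that conversion in NumPy; the helper name and the default slope/intercept are assumptions, not the post's actual code:

```python
import numpy as np

def pixels_to_hu(pixel_array, slope=1.0, intercept=-1024.0):
    # Hypothetical helper: apply the DICOM RescaleSlope and RescaleIntercept
    # tags to map raw stored values to Hounsfield units (HU).
    # The defaults here are illustrative only.
    return (pixel_array.astype(np.float64) * slope + intercept).astype(np.int16)

raw = np.array([[0, 1024], [2048, 4095]])
print(pixels_to_hu(raw))  # [[-1024  0] [ 1024  3071]]
```

In practice the slope and intercept would be read from the DICOM metadata rather than hard-coded.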

Development environment: Jupyter Notebook

# -*- coding: utf-8 -*-
# Train the image-segmentation network (U-Net)
import csv
import glob
import random
import cv2
import numpy
import os
from typing import List, Tuple
from keras.optimizers import SGD
from keras.layers import Input, Convolution2D, MaxPooling2D, UpSampling2D, merge, BatchNormalization, SpatialDropout2D
from keras.models import Model
from keras import backend as K
from keras.callbacks import
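U-Net segmentation training scripts like this one commonly use the Dice coefficient as a metric or (negated) loss. As a reference, here is a NumPy sketch of the Dice computation; the post's Keras version would use `K.sum`/`K.flatten` instead, and the `smooth` value is an assumption:

```python
import numpy as np

def dice_coefficient(y_true, y_pred, smooth=1.0):
    # Dice = 2*|A ∩ B| / (|A| + |B|), with a smoothing term to avoid
    # division by zero on empty masks (smooth=1.0 is an assumed value).
    y_true_f = y_true.astype(np.float64).ravel()
    y_pred_f = y_pred.astype(np.float64).ravel()
    intersection = np.sum(y_true_f * y_pred_f)
    return (2.0 * intersection + smooth) / (np.sum(y_true_f) + np.sum(y_pred_f) + smooth)

a = np.array([[1, 1], [0, 0]])
print(dice_coefficient(a, a))  # identical masks -> 1.0
```

For training, the loss is typically defined as `1 - dice` (or `-dice`) so that maximizing overlap minimizes the loss.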

In this article, let's look at how to do gradient checking when you write a custom layer. Lao Wu still strongly advocates gradient checking. Since many earlier articles already cover the background, the repeated material will not be explained again; please refer to the previous articles.

public class CustomLayerExample {
    static {  // static initializer block
        // Double precision for the gradient checks. See
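The Java snippet switches on double precision because gradient checks compare analytic gradients against finite differences, which need high numerical accuracy. The idea itself is framework-independent; here is a minimal NumPy sketch of a central-difference gradient check (not the DL4J API, just the underlying technique):

```python
import numpy as np

def numerical_gradient(f, x, eps=1e-6):
    # Central finite differences: (f(x+eps) - f(x-eps)) / (2*eps), per coordinate.
    grad = np.zeros_like(x, dtype=np.float64)
    for i in range(x.size):
        x_plus = x.copy();  x_plus.flat[i] += eps
        x_minus = x.copy(); x_minus.flat[i] -= eps
        grad.flat[i] = (f(x_plus) - f(x_minus)) / (2 * eps)
    return grad

# Check the analytic gradient of f(x) = sum(x**2), which is 2x.
x = np.array([1.0, -2.0, 3.0])
analytic = 2 * x
numeric = numerical_gradient(lambda v: np.sum(v ** 2), x)
print(np.max(np.abs(analytic - numeric)))  # tiny, limited by floating-point error
```

In double precision the discrepancy is far below any reasonable tolerance; in single precision the same check can fail spuriously, which is why DL4J forces doubles for it.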

"From simple to complex, then to simple!"
Preface: skipping the nonsense, let's get straight to the text.
After a period of study, I now have a preliminary understanding of the basic principles of RNNs and of how to implement them. Below are three different ways to implement an RNN, for reference.
The principles behind RNNs are easy to find on the Internet, so I won't repeat them here; I couldn't explain them better anyway. I'd first recommend an RNN tutorial that explains things very well: after reading its four posts, you should be able to implement a basic RNN yourself.
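Before the three implementations, it helps to keep the core recurrence in mind: a vanilla RNN updates its hidden state as h_t = tanh(Wxh·x_t + Whh·h_{t-1} + b). A minimal NumPy sketch of that forward pass (the shapes and initialization below are illustrative assumptions, not any specific implementation from the post):

```python
import numpy as np

def rnn_forward(xs, Wxh, Whh, bh, h0):
    # Vanilla RNN forward pass: h_t = tanh(Wxh @ x_t + Whh @ h_{t-1} + bh).
    h = h0
    hs = []
    for x in xs:
        h = np.tanh(Wxh @ x + Whh @ h + bh)
        hs.append(h)
    return hs

rng = np.random.default_rng(0)
hidden_size, input_size, seq_len = 4, 3, 5
Wxh = rng.standard_normal((hidden_size, input_size)) * 0.1
Whh = rng.standard_normal((hidden_size, hidden_size)) * 0.1
bh = np.zeros(hidden_size)
xs = [rng.standard_normal(input_size) for _ in range(seq_len)]

hs = rnn_forward(xs, Wxh, Whh, bh, np.zeros(hidden_size))
print(len(hs), hs[-1].shape)  # one hidden state per timestep
```

Each of the three implementations below ultimately computes this same recurrence, just at different levels of abstraction.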

I have always had a "special interest" in multi-label classification, because I always felt I didn't fully understand it. Recently I read many blog posts and suddenly things clicked a little, so I am writing down my current understanding. So far, the multi-label classification tasks I have seen fall into the following two situations (if there are any errors, please contact me and correct me):
Each sample corresponds to multiple labels, whose values are not 0, i.

Keras, as a deep learning library, is very well suited to beginners. When building a neural network, it ships with many commonly used objective functions, optimization methods, and so on, which basically covers a novice's needs. However, it also lets users define their own objective function. Below is one of the simplest ways to do so.
To implement a custom objective function, it is natural to first look at how the built-in objective functions in Keras are defined.
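Keras objective functions share one shape: a function of `(y_true, y_pred)` that returns a per-sample loss; the built-in MSE, for instance, is essentially `K.mean(K.square(y_pred - y_true), axis=-1)`. A NumPy mimic of that signature, to make the convention concrete (this is a sketch of the pattern, not Keras's source verbatim):

```python
import numpy as np

def mean_squared_error(y_true, y_pred):
    # Same signature as a Keras loss: (y_true, y_pred) -> per-sample loss.
    # Averaging over the last axis yields one loss value per sample.
    return np.mean(np.square(y_pred - y_true), axis=-1)

y_true = np.array([[0.0, 1.0], [1.0, 1.0]])
y_pred = np.array([[0.0, 0.0], [1.0, 1.0]])
print(mean_squared_error(y_true, y_pred))  # [0.5 0. ]
```

A custom objective is just another function with this same `(y_true, y_pred)` signature, passed to `model.compile(loss=...)`, with the arithmetic written in backend (`K.*`) ops so it works on tensors.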

For the first scenario, the categories to be predicted carry different weights. The background of this custom loss function: (MSE is generally used for regression, but the choice depends on the actual situation.)
We now want to build a regression model to estimate the sales volume of a certain commodity. We know that the commodity costs 1 yuan to produce and sells for 10 yuan.
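The economics above make the loss asymmetric: under-predicting by one unit forgoes 9 yuan of profit (price minus cost), while over-predicting by one unit wastes only the 1-yuan cost. A NumPy sketch of such an asymmetric loss (the exact functional form is my illustration of the idea, not necessarily the post's formula):

```python
import numpy as np

COST, PRICE = 1.0, 10.0
PROFIT = PRICE - COST  # 9 yuan of profit lost per unit under-predicted

def asymmetric_loss(y_true, y_pred):
    # Penalize under-prediction (lost profit) 9x more heavily than
    # over-prediction (wasted production cost).
    diff = y_pred - y_true
    return np.where(diff < 0, -diff * PROFIT, diff * COST)

print(asymmetric_loss(np.array([10.0]), np.array([8.0])))   # under by 2 -> 18.0
print(asymmetric_loss(np.array([10.0]), np.array([12.0])))  # over by 2 -> 2.0
```

Plain MSE would score both errors identically, which is exactly why a custom loss is warranted here.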

Without further ado, here is the code. It is commented; if anything is unclear, you can ask the blogger.

# -*- coding: utf-8 -*-
import keras
from keras.models import Sequential
from keras.layers import Dense
import numpy as np
import matplotlib.pyplot as plt
import time

# Input training data; Keras accepts NumPy arrays
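The fragment stops right at the training-data comment. Demos of this shape usually synthesize noisy data from a known function so the fitted weights can be checked by eye; a plausible sketch (the function, noise level, and sizes are all assumptions):

```python
import numpy as np

# Toy training data for a Sequential/Dense regression demo.
# Keras accepts NumPy arrays directly; the ground truth 0.5*x + 2
# and the noise scale below are illustrative assumptions.
np.random.seed(1337)
x_train = np.linspace(-1, 1, 200)
y_train = 0.5 * x_train + 2 + np.random.normal(0, 0.05, x_train.shape)
print(x_train.shape, y_train.shape)
```

After fitting, the learned weight and bias should land near 0.5 and 2, which makes the example easy to sanity-check.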

Concept Clarification:
Traditional single-label classification (in China this is also translated as 单标签, though I personally think it should be translated as a noun) learns from a sample set in which each sample belongs to exactly one label l drawn from a mutually exclusive label set L, with |L| > 1.
In multi-label classification, each sample belongs to a subset of the label set L.
Multi-label:
Historically, multi-label classification originated from, and was driven by, text classification and medical analysis.
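The "subset of L" definition is usually encoded as a multi-hot vector: one binary slot per label in L, with several slots allowed to be on at once. A small sketch (the label names are made up for illustration):

```python
import numpy as np

labels = ["sports", "politics", "tech"]  # the label set L (names are illustrative)

def multi_hot(sample_labels):
    # One binary entry per label in L; unlike one-hot encoding,
    # a sample may switch on several entries at once.
    vec = np.zeros(len(labels), dtype=np.int64)
    for name in sample_labels:
        vec[labels.index(name)] = 1
    return vec

print(multi_hot(["sports", "tech"]))  # [1 0 1]
```

Single-label classification is the special case where exactly one entry is 1, which is why softmax fits that case while multi-label models typically use an independent sigmoid per label.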

The main reason for writing this blog is to summarize some of the problems we often run into in deep learning without knowing how to solve them. I plan to turn this into a series. To help everyone take fewer detours: if there are any mistakes in this post, please point them out in the comments below. 1. In deep learning, one headache is how to tune