QA Official

Keras Getting Started - Fine-tuning a Pre-trained Network Model 2019-03-31
In deep learning we often reuse well-trained models such as AlexNet, GoogLeNet, VGGNet, and ResNet. How can we fine-tune these pre-trained models to improve accuracy? In this blog we use the already-trained VGG16 model to help with our classification task: we want to classify objects such as cats and dogs, and VGGNet is trained on ImageNet, which already contains these two objects.
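A minimal sketch of the fine-tuning approach described above, using `tensorflow.keras` (the original post targets standalone Keras, and the 150x150 input size and dense head here are illustrative assumptions). `weights=None` keeps the sketch runnable offline; in practice you would pass `weights='imagenet'` to reuse the pre-trained filters.

```python
# Fine-tuning sketch: freeze the VGG16 convolutional base and train only
# a small new classifier head for a 2-class (cats vs. dogs) task.
from tensorflow.keras.applications import VGG16
from tensorflow.keras import layers, models

# weights=None so the sketch runs offline; use weights='imagenet' in practice.
base = VGG16(weights=None, include_top=False, input_shape=(150, 150, 3))
base.trainable = False  # freeze the pre-trained convolutional base

model = models.Sequential([
    base,
    layers.Flatten(),
    layers.Dense(256, activation='relu'),
    layers.Dense(1, activation='sigmoid'),  # binary output: cat vs. dog
])
model.compile(optimizer='rmsprop', loss='binary_crossentropy',
              metrics=['accuracy'])
```

Freezing the base means only the new dense layers are updated at first; a common second step is to unfreeze the top convolutional block and continue training with a very low learning rate.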

Keras ImageDataGenerator usage 2019-03-31
ImageDataGenerator image generator:

keras.preprocessing.image.ImageDataGenerator(featurewise_center=False, samplewise_center=False, featurewise_std_normalization=False, samplewise_std_normalization=False, zca_whitening=False, zca_epsilon=1e-6, rotation_range=0., width_shift_range=0., height_shift_range=0., shear_range=0., zoom_range=0., channel_shift_range=0., fill_mode='nearest', cval=0., horizontal_flip=False, vertical_flip=False, rescale=None, preprocessing_function=None, data_format=K.image_data_format())

Parameter description:
featurewise_center: boolean; set the mean of the input dataset to 0, computed feature-wise.
samplewise_center: boolean; set the mean of each input sample to 0.
featurewise_std_normalization: boolean; divide inputs by the standard deviation of the dataset, feature-wise.
samplewise_std_normalization: boolean; divide each input sample by its own standard deviation.
zca_whitening: boolean; apply ZCA whitening to the input data.
zca_epsilon: epsilon used for ZCA whitening; default 1e-6.
rotation_range: integer; degree range for random rotations during augmentation.
width_shift_range: float; fraction of the total width, the range for random horizontal shifts during augmentation.
height_shift_range: float; fraction of the total height, the range for random vertical shifts during augmentation.
shear_range: float; shear intensity (shear transformation angle in the counter-clockwise direction).
zoom_range: float or list [lower, upper]; range for random zoom. A float is equivalent to [lower, upper] = [1 - zoom_range, 1 + zoom_range].
channel_shift_range: float; range for random channel shifts.
fill_mode: one of 'constant', 'nearest', 'reflect' or 'wrap'; points outside the boundaries are filled according to this mode when a transformation is applied.

model.fit and model.fit_generator in keras 2019-03-31
fit(self, x=None, y=None, batch_size=None, epochs=1, verbose=1, callbacks=None, validation_split=0.0, validation_data=None, shuffle=True, class_weight=None, sample_weight=None, initial_epoch=0, steps_per_epoch=None, validation_steps=None) This function trains the model. Parameters: x: input data. If the model has only one input, x is a numpy array; if the model has multiple inputs, x should be a list of numpy arrays, one per input.
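A minimal runnable sketch of `model.fit()` on random data, showing the main parameters described above (`x`, `y`, `batch_size`, `epochs`, `validation_split`); the tiny dense model and random arrays are placeholders, not from the original post.

```python
# model.fit() on random data: x and y are numpy arrays for a
# single-input model, and validation_split holds out 20% for validation.
import numpy as np
from tensorflow.keras import layers, models

x = np.random.rand(100, 8).astype('float32')       # 100 samples, 8 features
y = np.random.randint(0, 2, size=(100, 1))         # binary labels

model = models.Sequential([
    layers.Dense(16, activation='relu', input_shape=(8,)),
    layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy',
              metrics=['accuracy'])

history = model.fit(x, y, batch_size=16, epochs=2,
                    validation_split=0.2, verbose=0)
```

`fit` returns a `History` object whose `history` dict holds one loss/metric value per epoch; `fit_generator` works the same way but pulls batches from a Python generator instead of in-memory arrays.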

multi-class loss function-cross entropy cross entropy 2019-03-31
Cross entropy is commonly used as the loss function for multi-classification problems. For a detailed explanation of softmax and cross entropy, please refer to the blog post of the AI road blogger. Web address:
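A short NumPy illustration of the idea: softmax turns a logit vector into a probability distribution, and cross entropy is the negative log of the probability assigned to the true class (the example logits are made up for illustration).

```python
# Multi-class cross entropy in plain NumPy.
import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z))   # shift by the max for numerical stability
    return e / e.sum()

def cross_entropy(probs, true_class):
    # -log of the probability the model assigns to the correct class
    return -np.log(probs[true_class])

logits = np.array([2.0, 1.0, 0.1])
probs = softmax(logits)
loss = cross_entropy(probs, true_class=0)
```

The loss is small when the model puts high probability on the true class and grows without bound as that probability approaches zero.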

multi-input ImageDataGenerator picture generator in keras 2019-03-31
Referring to the Keras website and overloading Keras's own ImageDataGenerator, multiple images can be input at the same time. This code implements the Triplet Loss of TripletNet in Keras. Environment: keras 2.1.2, tensorflow 1.4.0, python 3.6, win7. from keras import backend as K import numpy as
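A hedged sketch of the underlying idea: a multi-input generator yields a list of arrays per batch, one per model input, which is what a multi-input Keras model such as a TripletNet expects. Plain NumPy arrays stand in for the wrapped ImageDataGenerator streams here; the shapes and batch size are illustrative.

```python
# Generator yielding (anchor, positive, negative) batches for a
# three-input model. Triplet loss ignores y, so dummy zeros are yielded.
import numpy as np

def triplet_generator(anchors, positives, negatives, batch_size=4):
    n = len(anchors)
    while True:
        idx = np.random.choice(n, size=batch_size, replace=False)
        # A multi-input Keras model takes a list of arrays, one per input.
        yield [anchors[idx], positives[idx], negatives[idx]], np.zeros(batch_size)

anchors = np.random.rand(10, 32, 32, 3)
positives = np.random.rand(10, 32, 32, 3)
negatives = np.random.rand(10, 32, 32, 3)
batch_x, batch_y = next(triplet_generator(anchors, positives, negatives))
```

In the real post the three streams come from overloaded ImageDataGenerator flows, so each branch also gets random augmentation; the list-of-arrays batch shape is the same.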

softmax layer in caffe 2019-03-31
The last layer of the LeNet implementation in Caffe is the softmax layer, which outputs the classification results. A brief introduction to softmax regression follows. First of all, in Caffe the softmax layer outputs the probability of the original input on each classification label. For example, in the LeNet network the output has 10 classes (the digits 0-9), so the softmax layer outputs a vector of 10 elements, each element representing the probability of the input belonging to that class.
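The 10-element probability vector can be sketched in NumPy (the logit values below are made up for illustration): softmax maps the final layer's 10 scores to 10 probabilities that sum to 1, and the predicted digit is the index of the largest one.

```python
# Softmax over 10 logits, as in LeNet's final layer: one probability per
# digit class 0-9, summing to 1.
import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z))   # subtract the max for numerical stability
    return e / e.sum()

logits = np.array([1.2, -0.5, 0.3, 2.1, 0.0, -1.0, 0.7, 1.5, -0.2, 0.4])
probs = softmax(logits)
predicted_class = int(np.argmax(probs))  # index of the largest probability
```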

TensorFlow implements MNIST and captcha recognition, and cross entropy functions in TensorFlow (distinguishing multi-class and multi-target) 2019-03-31
MNIST digit recognition: data preparation, simple linear transformation + softmax, convolutional neural network (CNN). Captcha recognition: data processing, simple linear transformation + softmax, CNN. Cross entropy functions in TensorFlow: tf.nn.sigmoid_cross_entropy_with_logits, tf.nn.softmax_cross_entropy_with_logits, tf.nn.sparse_softmax_cross_entropy_with_logits, tf.nn.weighted_cross_entropy_with_logits. The TensorFlow website has detailed tutorials on downloading the MNIST data and on linear + softmax and CNN recognition, so this will not be repeated here.
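The multi-class vs. multi-target distinction can be illustrated with NumPy versions of the first two losses above (these are hand-rolled stand-ins mirroring the math of the TensorFlow ops, not the ops themselves): the sigmoid form scores each class independently, so a label vector may have several 1s (multi-target, e.g. a multi-character captcha treated per position); the softmax form assumes the labels form a single distribution with exactly one true class.

```python
# NumPy counterparts of tf.nn.sigmoid_cross_entropy_with_logits and
# tf.nn.softmax_cross_entropy_with_logits, showing the multi-target
# vs. multi-class distinction.
import numpy as np

def sigmoid_cross_entropy_with_logits(labels, logits):
    # Element-wise, numerically stable form:
    # max(x, 0) - x*z + log(1 + exp(-|x|))
    return (np.maximum(logits, 0) - logits * labels
            + np.log1p(np.exp(-np.abs(logits))))

def softmax_cross_entropy_with_logits(labels, logits):
    # One scalar loss over the whole label distribution.
    shifted = logits - logits.max()
    log_probs = shifted - np.log(np.exp(shifted).sum())
    return -(labels * log_probs).sum()

logits = np.array([2.0, -1.0, 0.5])
multi_label = np.array([1.0, 0.0, 1.0])   # several targets may be "on"
one_hot = np.array([1.0, 0.0, 0.0])       # exactly one true class

per_class_loss = sigmoid_cross_entropy_with_logits(multi_label, logits)
single_loss = softmax_cross_entropy_with_logits(one_hot, logits)
```

Note the output shapes: the sigmoid version returns one loss per class, while the softmax version collapses to a single scalar, which is exactly why only the former fits multi-target problems.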

Use of inception_v3 and ImageDataGenerator in keras 2019-03-31
I have been using Keras for image classification recently. As a first test of the waters, I used existing modules to build the simplest possible model. This model mainly uses two modules: ImageDataGenerator, which preprocesses the image data, and inception_v3, which is the API for the Inception network. I have to say the interface Keras provides is really powerful. 1. Functions and parameters of ImageDataGenerator: the Chinese official website of this
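A sketch of the simple model described above using `tensorflow.keras` (the input size, head layers, and random stand-in data are illustrative assumptions, and `weights=None` keeps the sketch offline; use `weights='imagenet'` in practice): ImageDataGenerator preprocesses and augments batches, and InceptionV3 serves as the feature extractor.

```python
# ImageDataGenerator for preprocessing + InceptionV3 as the base model.
import numpy as np
from tensorflow.keras.applications import InceptionV3
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras import layers, models

# Preprocess: rescale pixel values and add random horizontal flips.
datagen = ImageDataGenerator(rescale=1. / 255, horizontal_flip=True)
images = np.random.rand(8, 139, 139, 3)            # stand-in image data
labels = np.eye(2)[np.random.randint(0, 2, 8)]     # one-hot, 2 classes
flow = datagen.flow(images, labels, batch_size=4)

# InceptionV3 base (input must be at least 75x75) plus a small head.
base = InceptionV3(weights=None, include_top=False,
                   input_shape=(139, 139, 3))
model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(2, activation='softmax'),
])
model.compile(optimizer='adam', loss='categorical_crossentropy')

batch_x, batch_y = next(flow)   # one preprocessed batch of 4 images
```

Training then just wires the two together, e.g. `model.fit(flow, steps_per_epoch=2, epochs=1)`.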

CentOS7 Basic Command Summary 2019-03-30
Relevant download: CentOS7. 1. Introduction: a binary edition of Red Hat. 2. Tools required: CentOS7 CD image file, everything (full) edition; SecureCRT remote login terminal software; WinSCP Linux/Windows transfer tool. 3. Common operation commands: ifconfig — view the IP address; init 3 — switch to command-line mode; init 5 — switch to the graphical interface; pwd — view the current directory

Keras Some Basic Concepts 2019-03-30
Symbolic computation: Keras' underlying library is Theano or TensorFlow; these are also called Keras' back-ends. Whether Theano or TensorFlow, both are symbolic libraries. Symbolism can be summarized roughly as follows: symbolic computation first defines various variables, then establishes a "computation graph" that specifies the computational relationships among the variables. The built computation graph then needs to be compiled to determine its internal details.
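The define-then-run idea can be illustrated with a toy hand-rolled graph (this is a deliberately tiny stand-in, not Theano or TensorFlow code): building the expression records the relationships between variables without computing anything, and evaluation later supplies concrete values, analogous to compiling and running a Keras model.

```python
# Toy symbolic computation: building the graph does no arithmetic;
# eval() walks the graph once concrete inputs are supplied.
class Var:
    def __init__(self, name):
        self.name = name
    def __add__(self, other):
        return Node('+', self, other)
    def __mul__(self, other):
        return Node('*', self, other)
    def eval(self, env):
        return env[self.name]           # look up the concrete value

class Node:
    def __init__(self, op, left, right):
        self.op, self.left, self.right = op, left, right
    __add__ = Var.__add__
    __mul__ = Var.__mul__
    def eval(self, env):
        a, b = self.left.eval(env), self.right.eval(env)
        return a + b if self.op == '+' else a * b

# 1. Define variables and the computation graph (no computation yet).
x, y = Var('x'), Var('y')
graph = x * y + x

# 2. "Run" the graph with concrete inputs.
result = graph.eval({'x': 3, 'y': 4})   # 3*4 + 3 = 15
```

In Theano/TensorFlow 1.x the same separation lets the library optimize and place the whole graph before any numbers flow through it.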