QA Official

Keras Use vgg16 to Identify Cats and Dogs 2019-04-29
Import modules:

import os
import numpy as np
import tensorflow as tf
import random
import seaborn as sns
import matplotlib.pyplot as plt
from keras.models import Sequential, Model
from keras.layers import Dense, Dropout, Activation, Flatten, Input
from keras.layers.convolutional import Conv2D, MaxPooling2D
from keras.optimizers import RMSprop, Adam, SGD
from keras.preprocessing import image
from keras.preprocessing.image import ImageDataGenerator
from keras.applications.vgg16 import VGG16, preprocess_input
from sklearn.model_selection import train_test_split

Read the picture function from the target ...
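The teaser cuts off before the model itself; below is a minimal sketch of the usual VGG16 transfer-learning setup for cats vs. dogs. The modern tensorflow.keras import paths, the 150x150 input size, the classifier head, and weights=None (to skip the ImageNet download; the article would use weights='imagenet') are all assumptions, not the article's exact code.

```python
import numpy as np
from tensorflow.keras.applications.vgg16 import VGG16
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Dense, Flatten

# Load the VGG16 convolutional base without its ImageNet classifier head.
# weights=None avoids a download here; use weights='imagenet' in practice.
base = VGG16(weights=None, include_top=False, input_shape=(150, 150, 3))
base.trainable = False  # freeze the pre-trained convolutional layers

# Attach a small binary head: one sigmoid output for cat vs. dog.
x = Flatten()(base.output)
x = Dense(256, activation='relu')(x)
out = Dense(1, activation='sigmoid')(x)
model = Model(base.input, out)
model.compile(optimizer='rmsprop', loss='binary_crossentropy',
              metrics=['accuracy'])

# One dummy image, just to show the prediction shape.
pred = model.predict(np.random.random((1, 150, 150, 3)), verbose=0)
print(pred.shape)  # (1, 1): one cat/dog probability per image
```

With weights='imagenet', the frozen base acts as a fixed feature extractor and only the two Dense layers are trained on the small cats-and-dogs set.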

LSTM Text Classification Example 2019-04-29
The data to be classified are text documents that have already been word-segmented; each line represents one article. The segmentation is fairly rough and no stop-word filtering has been applied, so results should improve noticeably once stop words are filtered out.

1. Load data

# -*- coding: utf-8 -*-
import sys
reload(sys)
sys.setdefaultencoding('utf-8')
def loadData(fileN ...
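The excerpt's loader is truncated (and written for Python 2); below is a minimal sketch of the kind of LSTM text classifier such a post typically builds, using modern tensorflow.keras imports. The vocabulary size, sequence length, class count, and layer sizes are all assumed for illustration.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

vocab_size, max_len, num_classes = 5000, 100, 4  # assumed sizes

model = Sequential([
    Embedding(vocab_size, 64),            # word id -> 64-d vector
    LSTM(128),                            # encode the whole word sequence
    Dense(num_classes, activation='softmax'),  # one probability per class
])
model.compile(optimizer='adam', loss='categorical_crossentropy',
              metrics=['accuracy'])

# One fake segmented article: a sequence of max_len word ids.
x = np.random.randint(0, vocab_size, size=(1, max_len))
probs = model.predict(x, verbose=0)
print(probs.shape)  # (1, 4): a probability for each of the 4 classes
```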

Softmax function and Cross Entropy 2019-04-29
Softmax function: background and definition. In the binary classification setting of logistic regression, the sigmoid function maps the input Wx+b into the (0, 1) interval, giving the probability of belonging to a class. To generalize this to multi-class problems, the softmax function normalizes the output values into probabilities. Assume that before the softmax function the model already outputs C values, where C is the number of categories to predict; the model can be, for example, a fully connected network whose output a has C entries, i. ...
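The definition above can be sketched directly in NumPy: exponentiate the C raw scores and divide by their sum, so the outputs lie in (0, 1) and sum to 1.

```python
import numpy as np

def softmax(a):
    """Normalize C raw scores a into a probability distribution over C classes."""
    # Subtracting the max is a standard numerical-stability trick;
    # it cancels out in the ratio and does not change the result.
    e = np.exp(a - np.max(a))
    return e / e.sum()

scores = np.array([2.0, 1.0, 0.1])  # e.g. the output a of a fully connected layer
probs = softmax(scores)
print(probs)        # three values in (0, 1), largest score -> largest probability
print(probs.sum())  # 1.0
```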

SqueezeNet cat and dog identification 2019-04-29
caffe training data preparation. Create a data directory to store training data and a test_data directory to store test data; under data and test_data, create one sub-folder per category to hold that category's images. For example, with the two categories cat and dog, create the directories cat and dog:

/home/data/cat
/home/data/dog
/home//test_data/cat
/home//test_data/dog

Convert the data into caffe's training format. Training data:

/home//caffe/build/tools/convert_imageset --resize_height=227 --resize_width=227 --shuffle experiments/animal1/train_val/trainval/ /home/train_val/trainval/train.txt /home/train_val/lmdb/train_lmdb

Validation data:

/home/caffe/build/tools/convert_imageset --resize_height=227 --resize_width=227 --shuffle experiments/animal1/train_val/trainval/ /home/train_val/trainval/val ...

keras Classify Cat and Dog Data Sets (2) 2019-04-29
Series directory: keras Classification of Cat and Dog Data Sets (1); keras Classify Cat and Dog Data Sets (2); keras Classify Cat and Dog Data Sets (3). DataSet: cats_and_dogs_dataset. A common and highly effective approach for learning from small image datasets is to use a pre-trained network. A pre-trained network is simply a saved network that was previously trained on a large dataset, usually for a large-scale image-classification task. If this original dataset is ...

keras cnn and lstm tests 2019-04-29
1. cnn test

import numpy as np
import keras
from keras.models import Sequential
from keras.layers import Dense, Dropout, Flatten
from keras.layers import Conv2D, MaxPooling2D
from keras.optimizers import SGD
# create fake data
x_train = np.random.random((100, 100, 100, 3))
y_train = keras.utils.to_categorical(np.random.randint(10, size=(100, 1)), num_classes=10)
x_test = np.random.random((20, 100, 100, 3))
y_test = keras.utils.to_categorical(np.random.randint(10, size=(20, 1)), num_classes=10)
model = Sequential()
# input: 100x1 ...
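The snippet breaks off at the input comment; below is a hedged completion loosely following the well-known Keras VGG-like-convnet example (the layers after the truncation point are assumptions, and modern tensorflow.keras imports are used).

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout, Flatten, Conv2D, MaxPooling2D

# Fake data as in the excerpt: 100 RGB images of 100x100, 10 classes.
x_train = np.random.random((100, 100, 100, 3))
y_train = keras.utils.to_categorical(
    np.random.randint(10, size=(100, 1)), num_classes=10)

# Input: 100x100 images with 3 channels -> a small conv net (assumed layers).
model = Sequential([
    Conv2D(32, (3, 3), activation='relu', input_shape=(100, 100, 3)),
    MaxPooling2D(pool_size=(2, 2)),
    Flatten(),
    Dropout(0.5),
    Dense(10, activation='softmax'),
])
model.compile(loss='categorical_crossentropy', optimizer='sgd')
model.fit(x_train, y_train, batch_size=32, epochs=1, verbose=0)

pred = model.predict(x_train[:2], verbose=0)
print(pred.shape)  # (2, 10): one probability per class for each image
```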

tensorflow to Realize Cat and Dog Classification (LeNet-5) 2019-04-29
This article follows an online video tutorial. It is a binary classification problem, recorded in three parts. 1. Data processing, i.e. labeling: cat: 0, dog: 1

#%%
import tensorflow as tf
import numpy as np
import os
# img_width = 208
img_height = 208
#%% get images and generate labels
train_dir = 'G:/tensorflow/cats_vs_dogs/data/train/'
def get_files(file_dir):
    '''
    args:
        file_dir: file directory
    Returns:
        ist ...
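A minimal sketch of what a get_files labeling helper like the one above typically does, assuming Kaggle-style file names (cat.0.jpg, dog.0.jpg); the article's exact return format is truncated, so the two-list return here is an assumption.

```python
import os
import tempfile

def get_files(file_dir):
    """Scan file_dir and label each image by filename prefix: cat -> 0, dog -> 1."""
    image_list, label_list = [], []
    for name in os.listdir(file_dir):
        image_list.append(os.path.join(file_dir, name))
        label_list.append(0 if name.startswith('cat') else 1)
    return image_list, label_list

# Tiny demo on a temporary directory (the article points at a real train/ folder).
demo = tempfile.mkdtemp()
for name in ('cat.0.jpg', 'cat.1.jpg', 'dog.0.jpg'):
    open(os.path.join(demo, name), 'w').close()
imgs, labels = get_files(demo)
print(sorted(labels))  # [0, 0, 1]
```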

Cross Entropy Calculation for Multi-classification Problems 2019-04-29
Cross entropy for multi-class problems. In multi-class classification, the loss function is the cross-entropy loss. For a sample point (x, y), y is the true label; in a multi-class problem its value can only come from the label set. Assume there are K label values, and the probability that the i-th sample is predicted as the k-th label value is p_{i,k}, i.e. p_{i,k} = Pr(t_{i,k} = ...
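The averaged cross-entropy loss described above, as a small NumPy sketch: with one-hot targets t_{i,k} and predicted probabilities p_{i,k}, only the log-probability of each sample's true label contributes.

```python
import numpy as np

def cross_entropy(t, p, eps=1e-12):
    """Average cross-entropy for one-hot targets t and predicted probabilities p.

    t, p: arrays of shape (n_samples, K); t[i, k] is 1 iff sample i has
    label k, and p[i, k] is the predicted probability p_{i,k}.
    """
    p = np.clip(p, eps, 1.0)                    # avoid log(0)
    return -np.sum(t * np.log(p)) / t.shape[0]  # average over samples

# Two samples, K = 3 labels.
t = np.array([[1, 0, 0],
              [0, 1, 0]])
p = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.8, 0.1]])
print(cross_entropy(t, p))  # -(log 0.7 + log 0.8) / 2 ≈ 0.290
```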

Emotional Polarity: Knowledge of Chinese Emotional Classification 2019-04-29
1. Text Classification

1) What is text classification? Text classification is the process of assigning a given text, according to its characteristics (content or attributes), to one or more categories under a predefined classification system.

2) What are the concrete steps of text classification? (1) construct the classification category system; (2) obtain texts with category labels; (3) select text features and compute their weights; (4) select and train a classifier; (5) apply the classifier.

3) Category system. General text classification is based on text content and automatically sorts texts into categories such as politics, economy, military, and sports.

Evaluation of Classification Problems (Two-classification & Multi-classification) 2019-04-29
Recall, precision, and the F-measure. For a binary classification problem, samples can be divided into the following categories according to their true class and the classifier's predicted class: True Positive (TP): the true class is positive and the predicted class is positive. False Positive (FP): the true class is negative and the predicted class is positive. False Negative (FN): the true class is positive and the predicted class is negative.
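These three counts map directly to precision = TP/(TP+FP), recall = TP/(TP+FN), and F1 = 2PR/(P+R); a small self-contained sketch (the helper name and label encoding are illustrative, with 1 as the positive class):

```python
def binary_metrics(y_true, y_pred):
    """Compute precision, recall, and F1 from binary labels (1 = positive)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

y_true = [1, 1, 0, 0, 1]
y_pred = [1, 0, 0, 1, 1]
# TP = 2, FP = 1, FN = 1 -> precision = recall = F1 = 2/3
print(binary_metrics(y_true, y_pred))
```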