QA Official

Text Emotion Classification 2019-04-29
Movie Text Emotion Classification. GitHub address | Kaggle address. This task classifies the sentiment of movie-review texts into positive and negative reviews, so it is a binary classification problem. We can choose common models such as naive Bayes and logistic regression. One of the challenges here is vectorizing the text content, so we first try a vectorization method based on TF-IDF, and ...
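The TF-IDF + classic-model approach the teaser describes can be sketched as follows; this is a minimal illustration with made-up toy reviews, not the post's Kaggle data, and logistic regression is just one of the candidate models it mentions:

```python
# Hedged sketch: TF-IDF vectorization feeding a logistic-regression
# binary sentiment classifier. The reviews below are illustrative only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

reviews = ["a wonderful, moving film",
           "great acting and a great story",
           "boring plot and terrible acting",
           "a dull, terrible movie"]
labels = [1, 1, 0, 0]  # 1 = positive review, 0 = negative review

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(reviews, labels)
print(clf.predict(["great story"])[0])
```

Swapping `LogisticRegression` for `MultinomialNB` gives the naive-Bayes variant with the same pipeline.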

Use Keras to Classify Cats and Dogs 2019-04-29
This post introduces an image classification problem: given an input image, predict its class. The method is to train a convolutional neural network with the Keras library on a data set of thousands of cat and dog images. Dataset download: (link placeholder in the original). Download test_set and training_set, including 10000 pictures. The training_set contains two subfolders, cats and dogs, each with 8000 pictures of the corresponding category.
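The summary does not give the post's exact architecture; a minimal Keras CNN for this binary cat/dog task might look like the sketch below, where the 64x64 input size and layer widths are assumptions:

```python
# Minimal binary cat/dog CNN in Keras (sketch; input size and layer
# widths are assumptions, not the post's exact model).
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(64, 64, 3)),          # 64x64 RGB images
    layers.Conv2D(32, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(64, activation='relu'),
    layers.Dense(1, activation='sigmoid'),    # single logit: cat vs dog
])
model.compile(optimizer='adam',
              loss='binary_crossentropy',
              metrics=['accuracy'])
model.summary()
```

In practice the images from the cats/dogs subfolders would be fed in with an image-loading utility such as `image_dataset_from_directory`.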

Use Word Vector+lstm for Emotional Analysis 2019-04-29
This data set comes from GitHub, and I am very grateful to the original author for collecting it. It consists of Jingdong (JD.com) shopping reviews, split into two texts of positive and negative sentiment: 947 positive samples and 2142 negative samples. All of the words are used for word-vector training. The word vectors are trained with gensim, which is very convenient.

[ keras ] summary of cat-dog war 2019-04-29
This blog mainly follows "Building Image Classification Models for Small Datasets" from the official Keras documentation, and records the problems encountered while studying it along with some methods I explored myself. The general idea of the official document is: first use a VGG-16 network with the fully connected layers removed to extract bottleneck features from the data set, then design a few fully connected layers and train them, and finally fine-tune the last few convolutional layers.
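The bottleneck-feature step can be sketched as below: the headless VGG-16 acts as a fixed feature extractor, and a small dense head is trained on its outputs. The 150x150 input size follows the Keras tutorial; `weights=None` is used here only to avoid a download, where the post would use `weights='imagenet'`:

```python
# Sketch of bottleneck-feature extraction with a headless VGG16.
# weights=None avoids a download; the tutorial uses weights='imagenet'.
import numpy as np
from tensorflow.keras.applications import VGG16
from tensorflow.keras import layers, models

base = VGG16(weights=None, include_top=False, input_shape=(150, 150, 3))

# Small fully connected head trained on the bottleneck features.
top = models.Sequential([
    layers.Input(shape=base.output_shape[1:]),
    layers.Flatten(),
    layers.Dense(256, activation='relu'),
    layers.Dropout(0.5),
    layers.Dense(1, activation='sigmoid'),
])

# Run images through the frozen base once to get bottleneck features.
features = base.predict(np.zeros((1, 150, 150, 3)), verbose=0)
print(features.shape)
```

After the head is trained, the last convolutional block of `base` would be unfrozen for the fine-tuning stage.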

kaggle Project Actual Combat-Cat and Dog Classification Detection 2019-04-29
Main reference: Deep Learning: Classic Caffe Models Explained in Detail with Practice. Kaggle dataset download link: (see post). Dataset description: it consists of two parts. The training set's image files are named cat.X.jpg and dog.X.jpg; this is a binary classification problem that requires a 0/1 label according to the cat/dog category, i.e. cat -> 0, dog -> 1. The test set, with files named X.jpg, is used to verify the recognition accuracy of
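Since the training labels are encoded in the filenames (cat.X.jpg -> 0, dog.X.jpg -> 1), a small helper like the following, a sketch implied by the description rather than the post's own code, recovers them:

```python
# Recover the 0/1 label from a training filename of the form
# cat.X.jpg / dog.X.jpg, as described above.
def label_from_filename(name):
    prefix = name.split(".")[0]        # "cat" or "dog"
    return {"cat": 0, "dog": 1}[prefix]

print(label_from_filename("cat.123.jpg"))  # 0
print(label_from_filename("dog.7.jpg"))    # 1
```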

keras Migration Learning, Fine-tuning, model's predict Function Definition 2019-04-29
Click here: cat-and-dog vs. Keras instance.

    from tensorflow.keras.layers import Dense, GlobalAveragePooling2D
    from tensorflow.keras.models import Model

    def add_new_last_layer(base_model, nb_classes):
        """Add a new last layer to the convnet.

        Args:
            base_model: keras model excluding top
            nb_classes: number of classes
        Returns:
            new keras model with last layer
        """
        x = base_model.output
        x = GlobalAveragePooling2D()(x)
        x = Dense(FC_SIZE, activation='relu')(x)  # FC_SIZE: head width, defined elsewhere
        predictions = Dense(nb_classes, activation='softmax')(x)
        model = Model(inputs=base_model.input, outputs=predictions)
        return model

Load the pre-trained model as the front-end network and fine-tune it on your own data.

keras-transfer learning fine-tuning 2019-04-29
This program demonstrates fine-tuning a pre-trained model on a new data set: we freeze the convolutional layers and only adjust the fully connected layers. First train a convolutional network on the MNIST digits [0...4]; then reuse that network to classify the digits [5...9], freezing the convolutional layers and fine-tuning the fully connected ones.
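The freeze-then-recompile step is the core of this recipe. A sketch, with an assumed small MNIST-style model rather than the post's exact one:

```python
# Sketch of freezing convolutional layers before fine-tuning.
# The architecture is an assumed stand-in for the post's MNIST model.
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, (3, 3), activation='relu', name='conv1'),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(128, activation='relu'),
    layers.Dense(5, activation='softmax'),  # 5 classes: digits 5..9
])

# Freeze the convolutional feature extractor; only dense layers train.
for layer in model.layers:
    if isinstance(layer, layers.Conv2D):
        layer.trainable = False

# Recompiling is required for the trainable flags to take effect.
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')
print(model.get_layer('conv1').trainable)
```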

Using ResNet for Kaggle cat-and-dog image recognition, reaching 98% accuracy in seconds 2019-04-29
1. Data introduction: this data set comes from Kaggle, with 12,500 cats and 12,500 dogs. Here is a brief outline of the overall idea. 1.1 Train a small network directly on the pictures (as a baseline), i.e. the ordinary CNN method; 1.2 then use the latest pre-trained ResNet and other methods for training. 2. Data augmentation and CNN: in order to make the best use of our limited
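The pre-trained-ResNet idea can be sketched as a frozen backbone plus a small binary head; `weights=None` here is only to avoid a download (in practice it would be `weights='imagenet'`), and the 224x224 input size is ResNet's conventional default, not stated in the teaser:

```python
# Sketch: ResNet50 as a frozen backbone with a binary cat/dog head.
# weights=None avoids a download; in practice use weights='imagenet'.
from tensorflow.keras.applications import ResNet50
from tensorflow.keras import layers, models

base = ResNet50(weights=None, include_top=False, pooling='avg',
                input_shape=(224, 224, 3))
base.trainable = False  # freeze the pre-trained features

model = models.Sequential([
    base,
    layers.Dense(1, activation='sigmoid'),  # cat vs dog
])
model.compile(optimizer='adam', loss='binary_crossentropy')
print(model.output_shape)
```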

word2vec-(1) nltk implements simple word cutting, sentiment analysis, and text similarity (TF-IDF) 2019-04-29
NLTK: from nltk.corpus import brown. (1) brown.categories() lists the document categories in this corpus; (2) len(brown.sents()); (3) len(brown.words()). Tokenization: nltk.word_tokenize(sentence). Jieba segmentation has three word-cutting modes (the Chinese example sentences were garbled in translation, so a placeholder `sentence` is used here):

    import jieba
    seg_list = jieba.cut(sentence, cut_all=True)   # full mode
    seg_list = jieba.cut(sentence, cut_all=False)  # exact mode
    print("Full Mode:", "/".join(seg_list))
    seg_list = jieba.cut_for_search(sentence)      # search-engine mode

CGLIB Introduction and Principle 2019-04-28
CGLIB Introduction and Principle (partly excerpted from the web). 1. What is CGLIB? CGLIB is a powerful, high-performance code generation package. It provides proxies for classes that do not implement interfaces, and is a good supplement to the JDK's dynamic proxies. You can usually use Java's dynamic proxy to create a proxy, but CGLIB is a good choice when the class you want to proxy does not implement an interface, or when you need better performance.