QA Official

Faster-RCNN (Keras): Generating Bounding Boxes for Dog Identification

https://qaofficial.com/post/2019/05/06/23413-faster-rcnnkeras-realize-the-generation-of-dog-identification-labeling-box.html 2019-05-06
Preface: The article on Zhihu already describes in detail how to use a trained model to generate annotation boxes, so this post just keeps some usage notes. I found source code written by Yann Henon on GitHub, which I call the initial version; it includes train.py. I used this version for my initial study of Faster-RCNN, and the pet-dog identification here is also based on that project. However, it is very

Introduction and Understanding of WEKA and Mulan

https://qaofficial.com/post/2019/05/06/25011-introduction-and-understanding-of-weka-and-mulan.html 2019-05-06
http://blog.csdn.net/u011274209/article/details/51996753 Weka: Weka's classifiers all live in packages whose names start with weka.classifiers, grouped into sub-packages by function; see their methods for details. Weka's core classes live in the weka.core package. Weka represents a dataset as an Instances object, and each individual record in it is an Instance (with and without the 's', which is easy to remember).

Java implementation of KNN classification algorithm

https://qaofficial.com/post/2019/05/06/25031-java-implementation-of-knn-classification-algorithm.html 2019-05-06
Nearest Neighbor Classification Algorithm. The idea of the KNN algorithm, in short: given a training set with known data and labels, take a test sample as input, compare its features with the corresponding features of the training samples, find the K training samples most similar to it, then the category
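The idea above can be sketched in a few lines of Python (a minimal illustration of my own, not the code from the linked post; `knn_predict` and its toy data are invented names):

```python
from collections import Counter
import math

def knn_predict(train, labels, query, k=3):
    """Classify `query` by majority vote among its k nearest training points.

    `train` is a list of feature tuples, `labels` the matching class labels.
    """
    # Euclidean distance from the query to every training sample
    dists = sorted(
        (math.dist(point, query), label) for point, label in zip(train, labels)
    )
    # Majority vote over the k closest samples
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]
```

For example, with two well-separated clusters labelled "a" and "b", a query near the first cluster is voted into class "a".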

Multi-label Classification

https://qaofficial.com/post/2019/05/06/23408-multi-label-classification.html 2019-05-06
Multi-label Classification. Related articles:
- Does caffe support multiple labels per sample?
- Training a multi-label classification/regression model using caffe
- Fine-tuning a single-label image classification model using caffe
- Multi-label detection with GoogLeNet based on caffe: https://github.com/Numeria-Jun/multi-labels-googLeNet-caffe
- Multi-label training based on Inception v3
- Generating HDF5 files for multi-label training (caffe HDF5 layer data > 2 GB)
- import caffe hdf5 data layer data generation
- caffe learning notes (11): HDF5Data-type dataset generation for multi-task learning

Non-local Neural Networks

https://qaofficial.com/post/2019/05/06/24979-non-local-neural-networks.html 2019-05-06

The Simple Application of Keras (Three-Classification Problem), Adapted from cifar10vgg16

https://qaofficial.com/post/2019/05/06/23434-the-simple-application-ofkeras-three-classification-problem-adapted-from-cifar10vgg16.html 2019-05-06
This is a three-class problem adapted from cifar10:

from __future__ import print_function  # a way to use new-style features in older versions of Python
import keras
from keras.preprocessing.image import ImageDataGenerator
from keras.preprocessing import image
from keras.models import Sequential
from keras.models import model_from_json
from keras.layers import Dense, Dropout, Activation, Flatten
from keras.layers import Conv2D, MaxPooling2D, BatchNormalization
from keras
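The key difference from the 10-class cifar10 original is the head of the network: three output units with a softmax activation, trained with categorical cross-entropy on one-hot labels. A standard-library sketch of those three ingredients (my own illustration; the function names are mine, not from the post):

```python
import math

def one_hot(label, num_classes=3):
    """Encode an integer class label as a one-hot vector."""
    v = [0.0] * num_classes
    v[label] = 1.0
    return v

def softmax(logits):
    """Convert raw scores into a probability distribution over the classes."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def categorical_crossentropy(y_true, y_pred, eps=1e-12):
    """The loss Keras uses for multi-class problems with one-hot targets."""
    return -sum(t * math.log(max(p, eps)) for t, p in zip(y_true, y_pred))
```

A more confident correct prediction yields a lower loss, which is what drives training toward the right class.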

Types of Neural Networks

https://qaofficial.com/post/2019/05/06/25036-types-of-neural-networks.html 2019-05-06
KNN DNN SVM DL BP DBN RBF CNN RNN ANN. Overview: this article introduces the commonly used neural networks, their main uses, and the advantages and limitations of each. 1. BP neural network. BP (Back Propagation) is a neural network learning algorithm. A BP network is a layered network composed of an input layer, an intermediate (hidden) layer, and an output layer, where the intermediate layer can be expanded into multiple layers.
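As a concrete illustration of the input/hidden/output structure and the backward propagation of error described above, here is a minimal BP network in plain Python (a sketch of my own, trained on XOR; not code from the article):

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_bp(steps=2000, lr=0.5, seed=1):
    """Tiny BP network (2 inputs -> 2 hidden units -> 1 output, sigmoid units)
    trained on XOR with per-sample gradient descent.

    Returns (error_before_training, error_after_training).
    """
    rnd = random.Random(seed)
    # weight rows are [w_x1, w_x2, bias]
    w1 = [[rnd.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
    w2 = [rnd.uniform(-1, 1) for _ in range(3)]
    data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

    def forward(x1, x2):
        h = [sigmoid(w[0] * x1 + w[1] * x2 + w[2]) for w in w1]
        y = sigmoid(w2[0] * h[0] + w2[1] * h[1] + w2[2])
        return h, y

    def total_error():
        return sum((forward(x1, x2)[1] - t) ** 2 for (x1, x2), t in data)

    before = total_error()
    for _ in range(steps):
        for (x1, x2), t in data:
            h, y = forward(x1, x2)
            # output-layer delta, then propagate the error back to the hidden layer
            dy = (y - t) * y * (1 - y)
            dh = [dy * w2[j] * h[j] * (1 - h[j]) for j in range(2)]
            # gradient-descent weight updates
            for j in range(2):
                w2[j] -= lr * dy * h[j]
                w1[j][0] -= lr * dh[j] * x1
                w1[j][1] -= lr * dh[j] * x2
                w1[j][2] -= lr * dh[j]
            w2[2] -= lr * dy
    return before, total_error()
```

The squared error after training should be lower than before, showing the back-propagated gradients doing their job.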

[Reprint] Keras Custom Complex Loss Function

https://qaofficial.com/post/2019/05/06/23430-reprint-keras-custom-complex-loss-function.html 2019-05-06
Keras is a building-block deep learning framework that makes it easy and intuitive to assemble common deep learning models. Before TensorFlow appeared, Keras, running on Theano as its backend, was just about the most popular deep learning framework of its time. Keras now supports four backends at once: Theano, TensorFlow, CNTK, and MXNet (the first three are officially supported; MXNet has not yet been integrated

[Competition Sharing] Kaggle-Toxicomment [Keras Multiple Binary Classification, High Quality Comment Corpus, Use of Pre-trained Word Vector]

https://qaofficial.com/post/2019/05/06/24952-competition-sharing-kaggle-toxicomment-keras-multiple-binary-classification-high-quality-comment-corpus-use-of-pre-trained-word-vector.html 2019-05-06
Summary: I have recently been following the Kaggle Toxic Comment competition (https://www.kaggle.com/c/jigsaw-toxic-comment-classification-challenge). The goal is to judge whether a written comment is toxic; toxic comments are further divided into six categories: ['toxic', 'severe_toxic', 'obscene', 'threat', 'insult', 'identity_hate']. This post mainly shares the new tricks I learned: Keras can actually perform several binary classifications at the same time. The Bi-LSTM baseline [0.051] in fact runs six binary classifications simultaneously; I didn't know this was possible before!
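The trick of "multiple binary classifications at once" boils down to giving the network six sigmoid outputs and summing six binary cross-entropy losses, instead of a single softmax (in Keras this would be a Dense(6, activation='sigmoid') head with a binary_crossentropy loss). A standard-library sketch of that output stage, with invented function names:

```python
import math

LABELS = ["toxic", "severe_toxic", "obscene", "threat", "insult", "identity_hate"]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def multilabel_predict(logits, threshold=0.5):
    """Six independent binary decisions: one sigmoid per label.

    Unlike softmax, the probabilities need not sum to 1, so a comment can be
    e.g. both 'toxic' and 'insult' at the same time.
    """
    probs = [sigmoid(z) for z in logits]
    return [label for label, p in zip(LABELS, probs) if p >= threshold]

def binary_crossentropy(targets, logits, eps=1e-12):
    """Sum of the six per-label binary cross-entropy losses."""
    loss = 0.0
    for t, z in zip(targets, logits):
        p = min(max(sigmoid(z), eps), 1 - eps)
        loss += -(t * math.log(p) + (1 - t) * math.log(1 - p))
    return loss
```

Each label is scored on its own, so one example can turn several labels on at once.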

Macro-average and Micro-average in Multi-label Classification Performance Evaluation

https://qaofficial.com/post/2019/05/06/25005-macro-average-and-micro-average-of-multi-label-classification-performance-evaluation.html 2019-05-06
Generally, we use accuracy to evaluate classifier performance. In the multi-class setting: accuracy = (number of samples classified correctly) / (number of samples classified). That looks reasonable, but it can hide a serious problem. For example, an opaque bag contains 1,000 mobile phones: 600 iPhone 6, 300 Galaxy S6, 50 Huawei Mate 7, and 50 MX4 (of course, this information is unknown to the classifier.
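The phone-bag example can be made concrete: a classifier that always answers "iphone6" scores 60% accuracy, yet its macro-averaged recall, which weights all four classes equally, is only 25% (for single-label multi-class problems the micro-averaged recall coincides with plain accuracy). A small sketch, with names of my own choosing:

```python
def accuracy(y_true, y_pred):
    """Fraction of samples classified correctly (= micro-averaged recall here)."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def macro_recall(y_true, y_pred):
    """Average the per-class recalls, weighting every class equally."""
    classes = set(y_true)
    recalls = []
    for c in classes:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        actual = sum(1 for t in y_true if t == c)
        recalls.append(tp / actual)
    return sum(recalls) / len(classes)

# The bag of 1,000 phones: 600 iphone6, 300 galaxy_s6, 50 mate7, 50 mx4.
y_true = ["iphone6"] * 600 + ["galaxy_s6"] * 300 + ["mate7"] * 50 + ["mx4"] * 50
# A lazy classifier that always answers "iphone6":
y_pred = ["iphone6"] * 1000
```

The lazy classifier looks acceptable under accuracy but terrible under the macro average, which is exactly the imbalance problem the post is pointing at.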