Added:
1. Guide To Multi-Class Multi-Label Classification With Neural Networks In Python
2. multi-label classification
3. train a multi-label classification/regression model using caffe
4. keras: Solve Multi-label Classification Problem
5. keras: multi-label neural network
Preface: This article records the Keras-related parts used in my project. Since the project involves both multi-class and multi-label classification, and there are already many articles on multi-class classification networks, let's focus on building the network for the multi-label part. Afterwards, if time permits, I will also cover cross-validation and how to handle multi-label metrics inside the per-epoch callback functions.

Evaluation Metrics for Multi-label Classification. Performance evaluation in a multi-label learning system differs from that in classical single-label learning problems. The classical metrics used in single-label problems include Accuracy, Precision, Recall and F-measure. In multi-label learning, evaluation is more complicated. Consider a test set S = {(x1, Y1), (x2, Y2), ..., (xp, Yp)}, where each xi is an instance and Yi is its set of true labels.
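As a concrete illustration of multi-label evaluation over such a test set, here is a minimal pure-Python sketch of two commonly used metrics, Hamming loss and subset (exact-match) accuracy. The toy label sets are invented for demonstration.

```python
# Toy sketch of two common multi-label metrics. Each element of true_y / pred_y
# is the SET of labels for one sample (x_i, Y_i) from the test set S.

def hamming_loss(true_sets, pred_sets, n_labels):
    """Fraction of label slots predicted incorrectly, averaged over samples."""
    total = 0.0
    for t, p in zip(true_sets, pred_sets):
        total += len(t.symmetric_difference(p)) / n_labels
    return total / len(true_sets)

def subset_accuracy(true_sets, pred_sets):
    """Fraction of samples whose predicted label set matches exactly."""
    return sum(t == p for t, p in zip(true_sets, pred_sets)) / len(true_sets)

true_y = [{0, 2}, {1}, {0, 1, 2}]
pred_y = [{0, 2}, {1, 2}, {0, 1}]

print(hamming_loss(true_y, pred_y, n_labels=3))   # -> 0.2222...
print(subset_accuracy(true_y, pred_y))            # -> 0.3333...
```

Note that subset accuracy is very strict (one wrong label makes the whole sample wrong), which is why multi-label work usually reports several metrics together.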

Summary: 1. First, a question: how do you do multi-class classification with LR (Logistic Regression)? 2. The idea of training multiple binary classifiers: 2.1 One-vs-One (OvO); 2.2 One-vs-Rest (OvR); 2.3 Many-vs-Many (MvM). 3. Since all of the above methods actually train multiple classifiers, is there a more direct way to turn LR into a multi-class classifier?
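The "more direct way" the summary alludes to is softmax (multinomial) regression: compute one score per class and normalize them into a probability distribution, instead of training several binary LR models. A minimal stdlib-only sketch of the softmax step (the scores here are made up for illustration):

```python
import math

def softmax(scores):
    """Turn one raw score per class into a probability distribution,
    the core of multinomial logistic regression."""
    m = max(scores)                         # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical scores w_k . x for three classes; probabilities sum to 1
probs = softmax([2.0, 1.0, 0.1])
print(probs)
```

The predicted class is simply the argmax of the resulting probabilities, so a single model handles all classes at once.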

Appendix B: Machine Learning Project Checklist, translated from "Hands-On Machine Learning with Scikit-Learn and TensorFlow". The translation has been adjusted to the translator's habits; if anything is inappropriate, please point it out.
This checklist shows how to carry out your own machine learning projects. There are eight steps in total:
1. Frame the problem you want to solve.
2. Get the data needed to solve the problem.
3. Explore the data to gain a clear understanding of it.
4. Preprocess the data so it feeds better into machine learning algorithms.
5. Explore different models and find the best one.
6. Fine-tune your model parameters and combine models into a better solution.
7. Present your results.
8. Launch, monitor and maintain your system.

1. Frame the problem and look at the big picture. Define your objectives in business terms. How will your solution be used?

Logistic regression templates I compiled myself, shared as learning notes. The data set is the australian data set, with 14 independent variables Xi and one dependent variable Y.

1. Splitting the training and test sets 70/30:

australian <- read.csv("australian.csv", as.is = T, sep = ",", header = TRUE)
# read the number of rows
N <- length(australian$Y)
# rows where ind = 1 occur with probability 0.7
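For readers following along in Python rather than R, the same randomized 70/30 split can be sketched with the standard library alone. This mirrors the indicator-with-probability-0.7 trick from the R snippet; the function name and seed are illustrative, not part of the original template.

```python
import random

def split_70_30(rows, seed=42):
    """Assign each row to train with probability 0.7, else to test,
    mirroring the ind-with-prob-0.7 indicator in the R snippet."""
    rng = random.Random(seed)               # fixed seed for reproducibility
    train, test = [], []
    for row in rows:
        (train if rng.random() < 0.7 else test).append(row)
    return train, test

rows = list(range(100))                     # stand-in for the australian rows
train, test = split_70_30(rows)
print(len(train), len(test))
```

With a fixed seed the split is reproducible, which matters when comparing models later.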

In the third article, the author gives an example of using a Keras neural network to fit a linear function. In this article, the author gives an example of using a Keras neural network to recognize the 60,000 handwritten digits 0 to 9 in the MNIST library. The code follows the code from the self-study video, but with one improvement: the binary of one of the test pictures has been
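A step that every Keras MNIST example needs is one-hot encoding the integer digit labels before training a multi-class network. As a stdlib-only sketch of what `keras.utils.to_categorical` does (the function name here is my own):

```python
def to_one_hot(labels, num_classes=10):
    """One-hot encode integer class labels, as keras.utils.to_categorical does:
    digit 5 becomes [0,0,0,0,0,1,0,0,0,0]."""
    out = []
    for y in labels:
        row = [0.0] * num_classes
        row[y] = 1.0
        out.append(row)
    return out

print(to_one_hot([5, 0, 3], num_classes=10)[0])
```

The one-hot targets pair with a softmax output layer and categorical cross-entropy loss in the multi-class setting.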

Preface: Let's look at the difference between multi-label and multi-class classification. To avoid misunderstanding it myself, I went looking for the official documentation. The following translations are from scikit-learn.org and Wikipedia.
As is customary, reposted from:
Multiclass and multilabel algorithms
Multi-label classification
Multiclass classification
scikit-learn introduction. Multi-class classification: the classification task has more than two classes; for example, classifying a pile of fruit pictures, where each picture may be an orange, an apple, a pear, and so on, and each picture is assigned exactly one class.
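The distinction is easiest to see in the target representation: multi-class y is one integer or string per sample, while multi-label y is a binary indicator matrix with one column per class. A small sketch of building that matrix by hand (what scikit-learn's `MultiLabelBinarizer` produces); the fruit labels are illustrative:

```python
def binarize_multilabel(label_sets, classes):
    """Binary indicator matrix: one row per sample, one column per class
    (the multi-label y format that sklearn's MultiLabelBinarizer produces)."""
    index = {c: i for i, c in enumerate(classes)}
    matrix = []
    for labels in label_sets:
        row = [0] * len(classes)
        for lab in labels:
            row[index[lab]] = 1
        matrix.append(row)
    return matrix

# Multi-class: each picture gets exactly one label.
y_multiclass = ["apple", "orange", "pear"]
# Multi-label: each picture can carry several labels at once, or none.
y_multilabel = [{"apple"}, {"orange", "pear"}, set()]
print(binarize_multilabel(y_multilabel, ["apple", "orange", "pear"]))
```

A multi-label network then uses one sigmoid output per column, rather than one softmax over all classes.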

GitHub has two versions of multi-task training:
1. https://github.com/miraclewkf/multi-task-MXNet
2. the examples from MXNet
The first may be slower because its data iterator is Image-based; the second example is on MNIST and requires modifying your own data iterator. The main record here is multi-task training based on the ImageRecordIter iterator.

1. Data preparation. You need to generate *.lst files, which look as follows: index task
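A multi-task *.lst file of the kind described above can be generated with a few lines of Python. The exact column layout sketched here (index, one label per task, then the image path, tab-separated) is my assumption based on MXNet's im2rec conventions; check it against your ImageRecordIter configuration.

```python
def make_multitask_lst(samples):
    """Build *.lst lines as: index \t task1_label \t task2_label \t image_path.
    Column layout is an ASSUMPTION based on MXNet im2rec conventions."""
    lines = []
    for idx, (task1, task2, path) in enumerate(samples):
        lines.append("\t".join([str(idx), str(task1), str(task2), path]))
    return "\n".join(lines)

# Hypothetical samples: (label for task 1, label for task 2, image path)
samples = [(0, 1, "img/0001.jpg"), (2, 0, "img/0002.jpg")]
print(make_multitask_lst(samples))
```

The resulting text would be written to a file and fed to im2rec to pack the record file the iterator reads.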

Paper Link:
https://www.semanticscholar.org/paper/Deep-Learning-for-Extreme-Multi-label-Text-Liu-Chang/1a0365567850837931d04126714ae6e2cbfc6270
Purpose of the paper:
To solve extreme multi-label text classification, where the total number of labels is generally more than 1,000 (and can be far more).
Model Structure:
This model is based on CNN-Kim (i.e., the ordinary TextCNN) and makes two improvements.
On the one hand, it replaces Kim's max-pooling over the entire text sequence with dynamic max-pooling over the output of the text convolution layer.
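To make the difference concrete: dynamic max-pooling splits the convolutional feature sequence into p equal chunks and keeps one max per chunk, so coarse positional information survives, whereas global max-pooling keeps a single value per filter. A minimal sketch (chunk sizes assume the sequence length divides evenly by p):

```python
def dynamic_max_pool(features, p):
    """Split a 1-D feature sequence into p equal chunks and take the max of
    each, instead of one global max over the whole sequence (as in CNN-Kim).
    Assumes len(features) is divisible by p for simplicity."""
    chunk = len(features) // p
    return [max(features[i * chunk:(i + 1) * chunk]) for i in range(p)]

feats = [0.1, 0.9, 0.2, 0.4, 0.8, 0.3]     # conv output for one filter
print(dynamic_max_pool(feats, p=3))          # -> [0.9, 0.4, 0.8]
print(max(feats))                            # global max-pooling keeps only 0.9
```

With p chunks the pooled representation is p times larger per filter, which helps when many labels depend on where a pattern occurs in the text.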