When SVM is used for image annotation experiments, some words in the vocabulary have very few example images, sometimes fewer than the feature dimension. In this case, neither logistic regression nor a non-linear SVM can produce a classifier with good performance. It is analogous to a system of equations with fewer equations than unknowns: the system is underdetermined and has no unique solution. This is the small-sample problem.
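A minimal sketch of the situation described above (the data here are synthetic, invented purely for illustration): with fewer training samples than feature dimensions, a linear classifier can fit the training set perfectly, which says nothing about how it generalizes.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy data: 5 samples with 50-dimensional features -- fewer samples than
# feature dimensions, the underdetermined regime described above.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(5, 50))
y_train = np.array([0, 0, 1, 1, 1])

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Training accuracy is (near-)perfect because an underdetermined system
# admits many zero-error solutions; this is memorization, not learning.
print(clf.score(X_train, y_train))
```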
Traditional Machine Learning Methods
The general steps of a classification problem can be divided into feature extraction, model construction, algorithm optimization, cross-validation, etc. For text, how to extract features is a very important and challenging problem: what are the characteristics of a text, and how can they be quantified into a mathematical representation?
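The steps above can be wired together in a few lines; this is a hedged sketch (the tiny corpus and the choice of TF-IDF plus logistic regression are invented for illustration, not taken from the original text):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Toy sentiment corpus (1 = positive, 0 = negative).
texts = ["good movie", "great film", "bad movie",
         "terrible film", "wonderful acting", "awful plot"]
labels = [1, 1, 0, 0, 1, 0]

# Feature extraction (TF-IDF) + model construction (logistic regression),
# evaluated with 3-fold cross-validation.
pipe = make_pipeline(TfidfVectorizer(), LogisticRegression())
scores = cross_val_score(pipe, texts, labels, cv=3)
print(scores.mean())
```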
However, a TF-IDF (term frequency–inverse document frequency) representation of a document only takes word-frequency information into account; it ignores the contextual structure of the words and the topic information they imply.
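This limitation is easy to demonstrate: TF-IDF is a bag-of-words weighting, so two sentences containing the same words in a different order receive identical vectors (the example sentences here are invented for illustration):

```python
from sklearn.feature_extraction.text import TfidfVectorizer

# Same words, opposite meaning: word order and context are discarded.
vec = TfidfVectorizer()
X = vec.fit_transform(["the dog bit the man", "the man bit the dog"])

# Number of positions where the two TF-IDF rows differ:
print((X[0] != X[1]).nnz)  # 0 -- the vectors are identical
```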
Basic Ideas for Solving Problems with Dynamic Programming 1. If the problem asks for an optimal value (usually a maximum or minimum), the problem can be decomposed into several subproblems, and those subproblems share overlapping smaller subproblems, then dynamic programming can be considered. Before applying dynamic programming, analyze whether the large problem can be decomposed into small problems, and whether each small problem after decomposition has an optimal solution.
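The idea above can be sketched with the classic coin-change problem (an invented example, not from the original text): the minimum number of coins summing to an amount is an optimal value that decomposes into overlapping subproblems for smaller amounts.

```python
def min_coins(coins, amount):
    """Minimum number of coins (with repetition) summing to amount, or -1."""
    INF = float("inf")
    # dp[a] = optimal (minimum) coin count for amount a; dp[0] = 0.
    dp = [0] + [INF] * amount
    for a in range(1, amount + 1):
        for c in coins:
            # Optimal substructure: an optimal solution for a extends an
            # optimal solution for the smaller, overlapping subproblem a - c.
            if c <= a and dp[a - c] + 1 < dp[a]:
                dp[a] = dp[a - c] + 1
    return dp[amount] if dp[amount] != INF else -1

print(min_coins([1, 2, 5], 11))  # 5 + 5 + 1 -> 3
print(min_coins([2], 3))         # impossible -> -1
```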
Recently, I have been studying deep learning. To consolidate what I have learned, I plan to start with popular tasks such as text classification and image classification, and write blog posts as a record, to encourage myself and fellow learners. This article introduces image classification (using the MNIST data set) with the Keras framework. The following sections introduce some background knowledge and the concrete steps.
1. Introduction to Data Sets and Frameworks Used
Installation of the Deep Learning Framework Keras
Keras is a framework, written in Python, that wraps the underlying deep learning frameworks TensorFlow or Theano. If you plan to use Keras, you must first install TensorFlow or Theano.
Sequential Model: implement a simple AND-gate neural network. To help readers understand how to handle multi-class problems, this code also treats the two-class problem as a multi-class problem.

#coding:utf-8
from keras.models import Sequential
from keras.layers import Dense, Activation
import numpy as np

model = Sequential()
model.add(Dense(input_dim=2, units=2))
model.add(Activation('sigmoid'))
model.compile(loss='mean_squared_error', optimizer='sgd', metrics=['accuracy'])

# AND-gate inputs with one-hot labels ([1,0] = output 0, [0,1] = output 1);
# the label and test matrices were garbled in the source and are reconstructed here.
trainDa = np.mat([[1,1],[1,0],[0,1],[0,0]])
trainBl = np.mat([[0,1],[1,0],[1,0],[1,0]])
testDa = np.mat([[1,1],[0,0]])

model.fit(trainDa, trainBl, epochs=1000, verbose=0)
print(model.predict(testDa))
Convolutional neural networks have achieved good results in sentiment analysis. Compared with earlier shallow machine-learning methods such as Naive Bayes and SVM, they perform better, especially when the data set is large, and a CNN does not require us to extract features manually. Shallow ML requires text feature extraction, text feature representation, normalization, and then text classification. Text feature extraction can be divided into four steps: (1) segmenting all
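A hedged sketch of a text CNN in the spirit described above (the layer sizes and hyperparameters are illustrative assumptions, not from the original text): the embedding and convolution layers learn features directly from raw token ids, replacing the manual feature-extraction pipeline.

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Embedding, Conv1D, GlobalMaxPooling1D, Dense

vocab_size, seq_len = 5000, 100  # illustrative values

model = Sequential()
model.add(Embedding(vocab_size, 64))          # token ids -> dense vectors
model.add(Conv1D(128, 5, activation="relu"))  # n-gram-like learned filters
model.add(GlobalMaxPooling1D())               # strongest response per filter
model.add(Dense(1, activation="sigmoid"))     # binary sentiment output
model.compile(loss="binary_crossentropy", optimizer="adam",
              metrics=["accuracy"])

# Run two dummy sequences through the untrained model to check shapes.
dummy = np.random.randint(0, vocab_size, size=(2, seq_len))
print(model.predict(dummy, verbose=0).shape)  # (2, 1)
```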
Before the code is executed, all the imported modules must be installed first.

print(__doc__)
# Modified for documentation by Jaques Grobler
# License: BSD 3 clause
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import ListedColormap
# sklearn.cross_validation was removed in newer scikit-learn; use model_selection
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.datasets import make_moons, make_circles, make_classification
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier
from sklearn.naive_bayes import GaussianNB  # truncated in the source; GaussianNB is imported in the upstream scikit-learn classifier-comparison example
Caffe (Convolutional Architecture for Fast Feature Embedding) is a clear and efficient deep learning framework. It is open source, its core language is C++, and it supports command-line, Python, and MATLAB interfaces. It can run on either the CPU or the GPU. Its license is BSD 2-Clause.
Deep learning is popular mainly because it can learn useful features from data autonomously.