QA Official

Softmax regression C++ implementation 2019-05-02
Experimental Environment: Visual Studio 2013. Data: the dataset contains the 26 capital letters, with 20,000 samples of 16 dimensions each. Experimental Purpose: classify the character samples in the dataset. Experimental Code: 1. Define a LogisticRegression class; header file LogisticRegression.h: #include <iostream> #include <math.h> #include <algorithm> #include <functional> #include <string> #include <cassert> #include <vector> using namespace std; class LogisticRegression { public: LogisticRegression(int inputSize, int k,
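The article's implementation is in C++, but the core of softmax regression is language-independent. As a minimal sketch (in Python/NumPy, with randomly initialized weights W and bias b standing in for trained parameters), prediction over 16-dimensional letter samples into 26 classes looks like:

```python
import numpy as np

def softmax(z):
    # shift by the row max for numerical stability before exponentiating
    z = z - np.max(z, axis=1, keepdims=True)
    e = np.exp(z)
    return e / np.sum(e, axis=1, keepdims=True)

def predict(X, W, b):
    # X: (n_samples, 16) letter features, W: (16, 26) weights, b: (26,) biases
    return np.argmax(softmax(X @ W + b), axis=1)

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 16))   # 5 made-up samples, 16 features each
W = rng.normal(size=(16, 26))  # hypothetical weights for the 26 letter classes
b = np.zeros(26)
labels = predict(X, W, b)      # one class index (0..25) per sample
```

Each row of the softmax output is a probability distribution over the 26 letters; argmax picks the most likely class.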

TensorFlow Actual Combat (1)-Definition of cross entropy 2019-05-02
For multi-class problems, cross-entropy is usually used as the loss function. Cross entropy was originally a concept from information theory; it is derived from information entropy and has since been applied in many fields, including communication, error-correcting codes, game theory, and machine learning. For the relationship between cross entropy and information entropy, see: Machine Learning Foundations (6) - Cross-Entropy Cost Function (cross-entropy error). When
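As a concrete illustration of the definition H(p, q) = -Σᵢ pᵢ log(qᵢ), here is a minimal NumPy sketch (the distributions p and q below are made-up examples, not from the article):

```python
import numpy as np

def cross_entropy(p, q, eps=1e-12):
    # H(p, q) = -sum_i p_i * log(q_i); clip q to avoid log(0)
    q = np.clip(q, eps, 1.0)
    return -np.sum(p * np.log(q))

p = np.array([0.0, 1.0, 0.0])  # true distribution (one-hot label)
q = np.array([0.1, 0.8, 0.1])  # predicted distribution
loss = cross_entropy(p, q)     # = -log(0.8), about 0.223
```

With a one-hot true distribution, the cross entropy reduces to the negative log-probability assigned to the correct class, which is why it is such a natural multi-class loss.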

TensorFlow-RNN Recurrent Neural Network Example 2: Sentiment analysis 2019-05-02
TensorFlow-RNN sentiment analysis. I previously wrote a sentiment analysis using fully connected neural networks. Now, let's use TensorFlow to build an RNN for sentiment analysis of text, with complete code and a detailed walkthrough of the training data. Step 1: Data Processing import numpy as np # read the data with open('reviews.txt', 'r') as f: reviews = with open('labels.txt', 'r') as f: labels = # each \n
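The data-processing step the excerpt begins can be sketched end to end on a hypothetical tiny corpus (the strings below stand in for the contents of reviews.txt and labels.txt; the real files hold one review or label per line, hence the `\n` splitting):

```python
from collections import Counter

# hypothetical stand-ins for reviews.txt / labels.txt contents
reviews_text = "great movie\nawful plot\ngreat acting"
labels_text = "positive\nnegative\npositive"

reviews = reviews_text.split('\n')  # one review per line
labels = labels_text.split('\n')

# build a word -> int vocabulary (index 0 is reserved for padding)
counts = Counter(word for r in reviews for word in r.split())
vocab = {w: i for i, (w, _) in enumerate(counts.most_common(), start=1)}

# encode each review as a list of word ids, and labels as 0/1
encoded = [[vocab[w] for w in r.split()] for r in reviews]
y = [1 if l == 'positive' else 0 for l in labels]
```

These integer sequences (padded to a fixed length) are what get fed into the RNN's embedding layer.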

Tensorflow loss function and Custom Loss Function (3) 2019-05-02
I have divided the introduction to TensorFlow's loss functions into three articles; this one covers TensorFlow's custom loss functions. (1) TensorFlow's four built-in loss functions (2) Other loss functions (3) Custom loss functions. The custom loss function closes out the chapters on loss functions. Learning to define your own loss function is very helpful for improving accuracy in classification, segmentation, and other problems; at the same time, exploring new loss functions can also make you
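A common example of a custom loss (it appears in several TensorFlow tutorials) is an asymmetric cost that penalizes under-prediction and over-prediction differently. A NumPy sketch with made-up cost weights, mirroring the `tf.where` pattern such tutorials use:

```python
import numpy as np

def asymmetric_loss(y_true, y_pred, under_cost=10.0, over_cost=1.0):
    # hypothetical costs: under-predicting is 10x worse than over-predicting
    diff = y_true - y_pred
    return np.mean(np.where(diff > 0, under_cost * diff, -over_cost * diff))

y_true = np.array([10.0, 20.0])
y_pred = np.array([8.0, 21.0])   # under by 2, over by 1
loss = asymmetric_loss(y_true, y_pred)  # (10*2 + 1*1) / 2 = 10.5
```

The same structure written with `tf.where` and `tf.greater` is differentiable, so it can drive training directly.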

Tensorflow loss function and Custom Loss Function (I) 2019-05-02
I have divided the introduction to TensorFlow's loss functions into three articles; this one covers TensorFlow's four built-in loss functions. (1) TensorFlow's four built-in loss functions (2) Other loss functions (3) Custom loss functions. A loss function quantifies the difference between the results (predicted values) output by the classifier and the results (labels) we expect, and it is just as important as the structure of the classifier itself. Many scholars devote themselves to improving the loss function in order to optimize the results of the classifier.
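As a sketch of what two common built-ins compute, here are plain-NumPy equivalents of mean squared error and the numerically stable sigmoid cross-entropy (the formula matches the one documented for TensorFlow's `sigmoid_cross_entropy_with_logits`, but this is a stand-alone re-implementation, not the library code):

```python
import numpy as np

def mse(y_true, y_pred):
    # mean squared error between labels and predictions
    return np.mean((y_true - y_pred) ** 2)

def sigmoid_cross_entropy_with_logits(labels, logits):
    # stable form: max(x, 0) - x*z + log(1 + exp(-|x|)),
    # with x = logits, z = labels; avoids overflow in exp()
    x, z = logits, labels
    return np.maximum(x, 0) - x * z + np.log1p(np.exp(-np.abs(x)))
```

For a logit of 0 (predicted probability 0.5) and label 1, the sigmoid cross-entropy is log(2), as expected.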

keras custom loss loss function, sample weighted sum metric on loss 2019-05-02
First, distinguish the two concepts: 1. The loss is the optimization objective of the whole network; it participates in the optimization and is used to update the weights W. 2. A metric is only an "indicator" used to evaluate the performance of the network, such as accuracy; it gives an intuitive view of how well the algorithm works and does not participate in the optimization process.
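The distinction can be made concrete in a few lines of NumPy: only the loss is differentiated to update the weight, while the metric is merely computed and reported (toy data and a single-weight linear model, purely for illustration):

```python
import numpy as np

def loss(w, X, y):             # mean squared error: drives the weight update
    return np.mean((X @ w - y) ** 2)

def metric_accuracy(w, X, y):  # evaluation only: never differentiated
    return np.mean((X @ w > 0.5) == (y > 0.5))

X = np.array([[1.0], [2.0]])
y = np.array([1.0, 0.0])
w = np.zeros(1)

grad = 2 * X.T @ (X @ w - y) / len(y)  # d(loss)/dw, by hand
w -= 0.1 * grad                        # the update uses the loss gradient only
acc = metric_accuracy(w, X, y)         # the metric is just reported
```

In Keras terms, `loss` is what `compile(loss=...)` backpropagates through, while `compile(metrics=[...])` entries behave like `metric_accuracy` here.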

logistic Regression Detailed Explanation (II): Detailed Explanation of cost function 2019-05-02
Supervised learning. Machine learning is divided into supervised learning, unsupervised learning, semi-supervised learning, and reinforcement learning. Logistic regression is a typical example of supervised learning. Since it is supervised learning, the training set can naturally be expressed as follows: \{(x^1,y^1),(x^2,y^2),\cdots,(x^m,y^m)\}. For these m training samples, each sample has n-dimensional features. With an added offset x_0, each sample contains n+1-dimensional features: x = [x_0,x_1,x_2,\cdots,x_n]^T, where x ∈ R^{n+1}, x_0 = 1, and y ∈ \{0,1\}.
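The cost function this article series goes on to detail, J(θ) = -(1/m) Σ [ y log h(x) + (1-y) log(1-h(x)) ] with h(x) = sigmoid(θᵀx), can be sketched in NumPy (toy data; note the x_0 = 1 bias column, matching the n+1-dimensional features above):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    # J(theta) = -(1/m) * sum[ y*log(h) + (1-y)*log(1-h) ], h = sigmoid(X theta)
    h = sigmoid(X @ theta)
    return -np.mean(y * np.log(h) + (1 - y) * np.log(1 - h))

# two toy samples; first column is the bias feature x_0 = 1
X = np.array([[1.0, 2.0], [1.0, -2.0]])
y = np.array([1.0, 0.0])
theta = np.zeros(2)
J = cost(theta, X, y)  # with theta = 0, h = 0.5 everywhere, so J = log(2)
```

With all-zero parameters the model predicts 0.5 for every sample, giving the characteristic starting cost of log 2 ≈ 0.693.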

python Reads json Files and Other Processes 2019-05-02
3. pandas: reading json files. from pandas.io.json import json_normalize import pandas as pd import json import time # read in the data data_str = open('AgriculturalDisease_train_annotations.json').read() # ———————————————————— test json_normalize ————————
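A minimal sketch of what json_normalize does: it flattens a list of JSON records into a DataFrame. The records below are hypothetical stand-ins for the annotations file; `pd.json_normalize` assumes pandas >= 1.0 (older versions import it from `pandas.io.json` instead):

```python
import json
import pandas as pd

# hypothetical records standing in for the annotations JSON
data_str = '[{"image_id": "a.jpg", "disease_class": 3}, {"image_id": "b.jpg", "disease_class": 7}]'
records = json.loads(data_str)

df = pd.json_normalize(records)  # one column per key, one row per record
```

For nested records, json_normalize also expands inner dicts into dotted column names, which is its main advantage over plain `pd.DataFrame(records)`.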

(keras)-custom objective function (loss function) in keras 2019-05-02
While working on a Kaggle project, I saw someone design a U-Net and use a custom IoU as the loss function. Only then did I realize that I could design the loss function myself... In order to implement my own objective function, it naturally occurred to me to first look at how the objective function is defined in Keras and check the source code, found in /usr/local/lib/python3.5/dist-packages/Keras (my system is ubuntu16.
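A sketch of the Keras-style custom objective the excerpt describes: any function with the signature f(y_true, y_pred) can serve as a loss. Below is a soft-IoU loss written with NumPy standing in for the backend ops (a hypothetical sketch, not the code from the article):

```python
import numpy as np

def iou_loss(y_true, y_pred, smooth=1e-6):
    # soft IoU over flattened masks: intersection / union, with a
    # smoothing term to avoid division by zero on empty masks
    intersection = np.sum(y_true * y_pred)
    union = np.sum(y_true) + np.sum(y_pred) - intersection
    return 1.0 - (intersection + smooth) / (union + smooth)

mask_true = np.array([1.0, 1.0, 0.0, 0.0])
mask_pred = np.array([1.0, 0.0, 0.0, 0.0])
loss = iou_loss(mask_true, mask_pred)  # IoU = 1/2, so loss is about 0.5
```

Replacing `np.sum` with the backend's sum op (e.g. `K.sum` in Keras) makes the same function usable directly as `model.compile(loss=iou_loss, ...)`.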

3 minutes to highlight Softmax 2019-05-02
Personal website: Red Stone's Road to Machine Learning. CSDN blog: Red Stone's column. Zhihu: Red Stone. Weibo: RedstoneWill. GitHub: RedstoneWill. WeChat official account: AI Youdao (ID: Redstone Will). 1. What is Softmax? Softmax is widely used in machine learning and deep learning, especially when dealing with multi-classification (C > 2), where the final output unit of the classifier needs a Softmax function for numerical processing. The definition of the Softmax function is as follows:
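The definition the excerpt is leading up to is the standard one, softmaxᵢ(z) = exp(zᵢ) / Σⱼ exp(zⱼ). A minimal NumPy version, with the usual max-shift for numerical stability (the scores below are made-up):

```python
import numpy as np

def softmax(z):
    # subtracting the max leaves the result unchanged but prevents overflow
    e = np.exp(z - np.max(z))
    return e / e.sum()

scores = np.array([1.0, 2.0, 3.0])
probs = softmax(scores)  # non-negative, sums to 1, preserves the ranking
```

The output is a valid probability distribution over the C classes, with the largest score mapped to the largest probability.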