QA Official

CNN and LSTM for Binary Classification of DNA-Binding Proteins (Python + Keras Implementation)

https://qaofficial.com/post/2019/05/02/24367-cnn-and-lstm-realize-two-classifications-of-dna-binding-proteins-python-keras-realization.html 2019-05-02
CNN and LSTM for Binary Classification of DNA-Binding Proteins (Python + Keras Implementation). Main contents: word to vector, binding-protein sequence modification, word embedding, CNN1D implementation, LSTM implementation.

from __future__ import print_function
import numpy as np
import h5py
from keras.models import model_from_json
np.random.seed(1337)  # for reproducibility
from keras.preprocessing import sequence
from keras.models import Sequential
from keras.layers.core import Dense, Dropout, Activation
from keras.
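The "word to vector" step the excerpt lists (turning a protein sequence into integer word indices that an embedding layer can consume) can be sketched in plain Python. The 3-mer window size and the on-the-fly vocabulary are illustrative assumptions, not taken from the post:

```python
def seq_to_indices(seq, k=3, vocab=None):
    """Slide a window of size k over a protein sequence and assign each
    distinct k-mer an integer index (building the vocabulary as we go).
    The resulting index list is the kind of input a Keras Embedding
    layer expects. Index 0 is reserved for padding."""
    if vocab is None:
        vocab = {}
    indices = []
    for i in range(len(seq) - k + 1):
        kmer = seq[i:i + k]
        if kmer not in vocab:
            vocab[kmer] = len(vocab) + 1  # 0 reserved for padding
        indices.append(vocab[kmer])
    return indices, vocab
```

The same vocabulary dict would be reused across all training sequences so identical k-mers map to the same index.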

Java Data Structure (3)-Collection Summary and Thread Safety of Collections

https://qaofficial.com/post/2019/05/02/24480-java-data-structure-3-collection-summary-and-thread-safety-of-collections.html 2019-05-02
Vector and Hashtable are thread-safe collection classes. However, these two classes date from very early versions of Java and should rarely be used now. Set: a collection with no duplicates. Three concrete Set types are available: HashSet, a hash-table-based set whose elements must implement the hashCode() method; LinkedHashSet, which returns elements in insertion order when iterating over the set; TreeSet, a set based on a tree

Keras Custom Loss Function Used in Scene Classification

https://qaofficial.com/post/2019/05/02/24481-keras-custom-loss-function-used-in-scene-classification.html 2019-05-02
Keras Custom Loss Function Used in Scene Classification. In image scene classification we need to customize the loss function, and there are many pitfalls along the way. Keras's built-in loss functions all live in the losses.py file. (The following assumes a classification task.)

#losses.py
# y_true is the classification label, y_pred is the predicted value (th
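The excerpt is truncated before the loss itself, but the general shape of a Keras custom loss (a function of `y_true` and `y_pred` returning a per-sample loss) can be illustrated with the underlying math in plain NumPy. The class weights here are hypothetical, and in Keras the same formula would use backend tensor ops (`K.log`, `K.sum`) instead of NumPy:

```python
import numpy as np

def weighted_categorical_crossentropy(y_true, y_pred, class_weights, eps=1e-7):
    """NumPy sketch of a weighted categorical cross-entropy.
    y_true: one-hot labels, shape (n, classes)
    y_pred: predicted probabilities, shape (n, classes)
    class_weights: per-class weight vector, shape (classes,)
    Returns the per-sample loss, shape (n,)."""
    y_pred = np.clip(y_pred, eps, 1.0 - eps)  # avoid log(0)
    return -np.sum(class_weights * y_true * np.log(y_pred), axis=-1)
```

A Keras version would take only `(y_true, y_pred)` and close over the weights, since `model.compile(loss=...)` expects a two-argument function.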

Keras Learning Note 02-Common Network Layer

https://qaofficial.com/post/2019/05/02/24399-keras-learning-note-02-common-network-layer.html 2019-05-02
When building a neural network structure, we add network layers to the network. The following are some common network layers and their usage. 1. Common Layers. Common layers correspond to the core module; a series of common network layers are defined inside core, including fully connected and activation layers. 1. Dense layer. Dense Layer: fully connected layer.

keras.layers.core.Dense(output_dim, init='glorot_uniform', activation='linear', weights=None, W_regularizer=None, b_regularizer=None, activity_regularizer=None, W_constraint=None, b_constraint=None, bias=True, input_dim=None)

output_dim: an integer greater than 0, representing the output dimension of the layer.
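What a Dense layer computes is just an affine map followed by an optional activation; a minimal NumPy sketch of that forward pass (not the Keras implementation itself):

```python
import numpy as np

def dense_forward(x, W, b, activation=None):
    """Forward pass of a fully connected (Dense) layer: y = act(x @ W + b).
    x: (batch, input_dim), W: (input_dim, output_dim), b: (output_dim,).
    activation=None corresponds to the 'linear' default in the
    signature above: no nonlinearity is applied."""
    y = x @ W + b
    if activation == "relu":
        y = np.maximum(y, 0.0)
    elif activation == "sigmoid":
        y = 1.0 / (1.0 + np.exp(-y))
    return y
```

`output_dim` in the old Keras 1.x signature is the number of columns of `W` here.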

Keras callback function

https://qaofficial.com/post/2019/05/02/24530-keras-callback-function.html 2019-05-02
When training a model, many things cannot be predicted in advance. In particular, you don't know how many epochs are needed to reach the best validation loss. A simple common approach is to train for enough epochs that the model has already begun to overfit, use that first run to determine the right number of epochs, and then start a new training run from scratch with this optimal number of epochs. Of course, this method
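The bookkeeping a callback such as Keras's EarlyStopping does for this can be sketched in pure Python; the `patience` parameter and the helper name are illustrative assumptions:

```python
def best_epoch(val_losses, patience=3):
    """Given a run of validation losses (one per epoch), return
    (best_epoch_index, stop_epoch_index): the epoch with the lowest
    loss so far, and the epoch at which an early-stopping rule would
    halt, i.e. once the loss has not improved for `patience` epochs."""
    best, best_i = float("inf"), 0
    for i, loss in enumerate(val_losses):
        if loss < best:
            best, best_i = loss, i
        elif i - best_i >= patience:
            return best_i, i  # stopped early
    return best_i, len(val_losses) - 1  # ran to completion
```

In Keras this logic is handled by passing `EarlyStopping` in the `callbacks` list of `model.fit`, rather than post-processing the history by hand.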

Keras custom loss [centerloss for fine-grained classification]

https://qaofficial.com/post/2019/05/02/24519-keras-custom-loss-centerloss-for-fine-grained-classification.html 2019-05-02
Keras is a building-block deep learning framework that can be used to easily and intuitively build common deep learning models. Before TensorFlow came out, Keras was almost the most popular deep learning framework of its time, with Theano as the backend. Keras now supports four backends: Theano, TensorFlow, CNTK, and MXNet (the first three are officially supported; MXNet has not yet been integrated

Keras (2): Using Keras to build a neural network to classify MNIST handwritten digits and qualitatively analyze the influence of various hyperparameters

https://qaofficial.com/post/2019/05/02/24387-keras2-using-keras-to-build-a-neural-network-to-classify-mnist-handwritten-fonts-and-qualitatively-analyze-the-influence-of-various-hyperparameter.html 2019-05-02
Experimental Report
0. BaseNet (3 layers, sigmoid, 784-386-10)
1. Hidden_Net (784-112-10 and 784-543-10)
2. reluActivation_Net
3. DeepNet (4 and 5 layers)
4. DeepNet (4 and 5 layers; increased number of training epochs)
5. DeepNet (5 layers; Dropout)
7. DeepNet (5 layers; Dropout + relu)
8. AutoEncoder_Net (5 layers; AutoEncoder)
9. Conclusion

Abstract: The data set of this experiment is MNIST digits.

RNN(LSTM) is used for classification

https://qaofficial.com/post/2019/05/02/24364-rnnlstm-is-used-for-classification.html 2019-05-02
import tensorflow as tf
import sys
import random
from sklearn.cross_validation import train_test_split
from sklearn.cross_validation import StratifiedKFold  # StratifiedKFold
import matplotlib.pyplot as plt
from sklearn.metrics import roc_curve, auc
from scipy import interp
# from tensorflow.contrib.keras.python.keras.layers import BatchNormalization
%matplotlib inline

# hyperparameters
lr = 0.001
training_iters = 10000
batch_size = 200  # 3200/200 = 16 used for training, 800/20
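The recurrence such an RNN classifier runs at each time step can be sketched as a single LSTM cell step in NumPy (illustrative only, not the post's TensorFlow graph; gate ordering i, f, o, g is an assumption):

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step.
    x: input at this step, shape (input_dim,)
    h, c: previous hidden and cell state, shape (hidden,)
    W: (input_dim, 4*hidden), U: (hidden, 4*hidden), b: (4*hidden,)
    Gates are packed in the order: input, forget, output, candidate."""
    hid = h.shape[0]
    z = x @ W + h @ U + b
    i = 1.0 / (1.0 + np.exp(-z[:hid]))          # input gate
    f = 1.0 / (1.0 + np.exp(-z[hid:2 * hid]))   # forget gate
    o = 1.0 / (1.0 + np.exp(-z[2 * hid:3 * hid]))  # output gate
    g = np.tanh(z[3 * hid:])                    # candidate cell state
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new
```

For classification, the final `h` is typically fed through a dense softmax layer, which is what the truncated TensorFlow code above goes on to build.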

Softmax regression C++ implementation

https://qaofficial.com/post/2019/05/02/24454-softmax-regression-c-implementation.html 2019-05-02
Experimental Environment: Visual Studio 2013. Data: the data come from http://archive.ics.uci.edu/ml/datasets/optical+recognition+of+handwritten+digits and contain 26 capital letters; there are 20,000 samples, each with 16 dimensions. Experimental Purpose: complete the classification of the character samples in the data set. Experimental Code: 1. Define a LogisticRegression class: header file LogisticRegression.h

#include <iostream>
#include <math.h>
#include <algorithm>
#include <functional>
#include <string>
#include <cassert>
#include <vector>
using namespace std;

class LogisticRegression {
public:
    LogisticRegression(int inputSize, int k,
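The C++ listing is truncated, but the core computation of softmax regression (a stable softmax over class scores, then an argmax prediction) can be sketched briefly in Python; the function names and toy weights are illustrative, not from the post:

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis:
    subtract the max before exponentiating so exp never overflows."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def predict(X, W, b):
    """Softmax regression prediction: the class with the highest
    probability. X: (n, d), W: (d, k), b: (k,)."""
    return softmax(X @ W + b).argmax(axis=-1)
```

Training would fit `W` and `b` by minimizing the cross-entropy between the softmax output and the one-hot labels, which is what the class's omitted methods implement.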

TensorFlow in Practice (1): Definition of Cross Entropy

https://qaofficial.com/post/2019/05/02/24567-tensorflow-actual-combat-1-definition-of-cross-entropy.html 2019-05-02
For multi-class problems, cross entropy is usually used as the loss function. Cross entropy was originally a concept in information theory: it derives from information entropy and is now used in many fields, including communication, error-correcting codes, game theory, and machine learning. For the relationship between cross entropy and information entropy, see Machine Learning Foundations (6), on the cross-entropy cost function: http://blog.csdn.net/lanchunhui/article/details/50970625. When
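The definition the post builds toward, H(p, q) = -Σ p(x) log q(x), is easy to state directly in NumPy:

```python
import numpy as np

def cross_entropy(p, q, eps=1e-12):
    """Cross entropy H(p, q) = -sum_x p(x) * log(q(x)).
    p: true distribution, q: predicted distribution (both sum to 1).
    Equals the information entropy H(p) exactly when q == p, and is
    strictly larger for any q != p."""
    q = np.clip(q, eps, 1.0)  # guard against log(0)
    return -np.sum(p * np.log(q))
```

In a classifier, `p` is the one-hot label and `q` the softmax output, so the loss reduces to `-log q(correct class)`.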