QA Official

Keras Learning (3) - Classification 2019-05-03
This article mainly introduces using Keras to build a neural network that classifies handwritten digits. Code:

    import numpy as np
    from keras.datasets import mnist
    from keras.utils import np_utils
    from keras.models import Sequential
    from keras.layers import Dense, Activation
    from keras.optimizers import RMSprop

    # make the random numbers generated across runs the same
    np.random.seed(1337)

    # download the dataset
    # X shape (60000 28x28), y shape (10000)
    (X_train, y_train), (X_test, y_test) = mnist.load_data()
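The snippet above stops at loading the data; a minimal sketch of how such a classifier is typically completed, using the tf.keras API rather than the older standalone keras imports (the hidden-layer size and the random stand-in data are illustrative assumptions, not the article's exact values):

```python
import numpy as np
from tensorflow import keras

# Illustrative stand-ins for the flattened MNIST arrays loaded above.
x_train = np.random.rand(64, 784).astype("float32")
y_train = keras.utils.to_categorical(np.random.randint(0, 10, 64), 10)

# Two dense layers: 784 pixels -> 32 hidden units -> 10 class scores.
model = keras.Sequential([
    keras.layers.Dense(32, activation="relu", input_shape=(784,)),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer=keras.optimizers.RMSprop(learning_rate=0.001),
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1, batch_size=32, verbose=0)
print(model.output_shape)  # (None, 10)
```

With the real MNIST arrays, the images would first be reshaped to `(60000, 784)` and scaled to `[0, 1]` before being passed to `fit`.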

Keras Learning III: Implementing a cifar10 Image Classification Model with a CNN 2019-05-03
Keras Learning III: Implementing a cifar10 Image Classification Model with a CNN. 1. Introduction to Convolutional Neural Networks. A convolutional neural network, like a fully connected neural network, is formed by connecting multiple layers. The difference is that a CNN generally consists of convolution layers and pooling layers connected alternately, which extract high-level features from the input data and reduce its dimensionality. Finally, the extracted features are classified by a fully connected network.
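The alternation of convolution and pooling the excerpt describes can be sketched with tf.keras (the filter counts are illustrative assumptions, not the article's exact model):

```python
from tensorflow import keras
from tensorflow.keras import layers

# Conv layers extract features; each pooling layer halves the spatial
# size, reducing the dimensionality of the data as described above.
model = keras.Sequential([
    layers.Conv2D(32, (3, 3), padding="same", activation="relu",
                  input_shape=(32, 32, 3)),   # CIFAR-10 images are 32x32x3
    layers.MaxPooling2D((2, 2)),              # 32x32 -> 16x16
    layers.Conv2D(64, (3, 3), padding="same", activation="relu"),
    layers.MaxPooling2D((2, 2)),              # 16x16 -> 8x8
    layers.Flatten(),                         # hand features to the classifier
    layers.Dense(10, activation="softmax"),   # 10 CIFAR-10 classes
])
print(model.output_shape)  # (None, 10)
```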

TensorFlow Source Code Analysis: Learning Notes (3) 2019-05-03 """A simple exercise in calling the Inception V3 architecture model, with summaries displayed in TensorBoard. This example shows how to take an Inception V3 architecture

keras Classification and Regression Code 2019-05-03
keras CNN classification:

    # -*- coding: utf-8 -*-
    import numpy as np
    from keras.datasets import mnist
    from keras.models import Sequential
    from keras.layers import Dense, Dropout, Activation, Flatten
    from keras.layers.convolutional import Conv2D
    from keras.layers.pooling import MaxPooling2D
    from keras.layers import Embedding, LSTM
    from keras.utils import np_utils

    def load_mnist():
        (X_train, y_train), (X_test, y_test) = mnist.load_data()
        # reshape to (samples, 28, 28, 1) and scale pixels to [0, 1]
        X_train = X_train.reshape(X_train.shape[0], 28, 28, 1)
        X_test = X_test.reshape(X_test.shape[0], 28, 28, 1)
        X_train = X_train.astype('float32')
        X_test = X_test.astype('float32')
        X_train /= 255
        X_test /= 255
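The excerpt imports np_utils but is cut off before the labels are encoded; a minimal numpy sketch of what `np_utils.to_categorical` does to integer labels (the helper name `to_one_hot` is hypothetical):

```python
import numpy as np

def to_one_hot(y, num_classes):
    """Hypothetical stand-in for np_utils.to_categorical: turns
    integer class labels into one-hot row vectors."""
    out = np.zeros((len(y), num_classes), dtype="float32")
    out[np.arange(len(y)), y] = 1.0
    return out

labels = np.array([3, 0, 9])
one_hot = to_one_hot(labels, 10)
print(one_hot[0])  # a row of zeros with a 1.0 at index 3
```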

python keras (an excellent neural network framework) and its examples 2019-05-03
Let me tell you how hard it is to install this Theano-based Keras. I could not manage it under Windows, so I installed a dual-boot system myself. That was the beginning of the powerful Linux system for me; no wonder large companies use it for development — whoever uses it knows! First, an introduction to the framework: we all know about deep neural networks, and Python has the Theano framework for writing neural

tensorflow (6) Training and Classifying Your Own Pictures (CNN, Super-detailed Beginner Edition) 2019-05-03
Caffe (Convolutional Architecture for Fast Feature Embedding) had always been my tool for image work before. Given TensorFlow's simple environment configuration and excellent overall performance, I planned to move to TensorFlow. The first step in learning the framework is to run the MNIST sample program in the documentation (see the official TensorFlow documentation for details). However, MNIST is already pre-processed data, and the concrete data-processing steps are not covered.
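The data-processing steps the excerpt says the tutorials skip can be sketched in numpy; the arrays below are random stand-ins for decoded image files (reading and resizing the actual files, e.g. with PIL, is assumed to happen elsewhere):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for images already decoded and resized to 32x32 RGB.
images = rng.integers(0, 256, size=(6, 32, 32, 3), dtype=np.uint8)
labels = np.array([0, 1, 0, 2, 1, 2])  # e.g. derived from folder names

# Scale pixel values from [0, 255] to [0, 1] as float32.
x = images.astype("float32") / 255.0

# Shuffle images and labels together so training batches are mixed.
perm = rng.permutation(len(x))
x, y = x[perm], labels[perm]
print(x.shape, x.dtype)  # (6, 32, 32, 3) float32
```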

tensorflow reported an error: setting an array element with a sequence. 2019-05-03
Recently, many people have messaged me saying they ran into this problem. I have actually hit it twice before, but since I had worked through it and solved it each time, I did not think it was a very common problem. So I am writing this post to record it for future reference. First of all, this problem usually occurs when reading data — that is, the error is raised when the data prepared in Python is passed to a placeholder.
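A minimal reproduction of the error outside TensorFlow: numpy, which is used to pack the Python data fed to a placeholder, cannot turn a ragged nested list into a rectangular array:

```python
import numpy as np

# Rows of different lengths cannot form a rectangular float array,
# which is the usual cause of this error message.
ragged = [[1, 2, 3], [4, 5]]
try:
    np.array(ragged, dtype=np.float32)
    failed = False
except ValueError:
    failed = True  # "setting an array element with a sequence."

# Padding every row to the same length fixes it.
padded = [[1, 2, 3], [4, 5, 0]]
arr = np.array(padded, dtype=np.float32)
print(failed, arr.shape)  # True (2, 3)
```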

tensorflow Routine for Classifying Using an LSTM 2019-05-03

    import tensorflow as tf
    import sys
    from tensorflow.examples.tutorials.mnist import input_data

    # this is data
    mnist = input_data.read_data_sets('MNIST_data', one_hot=True)

    # hyperparameters
    lr = 0.001
    training_iters = 100000
    batch_size = 128
    n_inputs = 28         # MNIST data input (img shape: 28*28)
    n_steps = 28          # time steps
    n_hidden_units = 128  # neurons in hidden layer
    n_classes = 10        # MNIST classes (0-9 digits)

    # tf Graph input
    x = tf.placeholder(tf.float32, [None, n_steps, n_inputs])
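How the hyperparameters above fit together can be sketched with a plain-numpy recurrent step — a simple tanh RNN with random weights rather than a full LSTM, illustrating the `(batch, n_steps, n_inputs)` layout, not the routine's actual graph:

```python
import numpy as np

rng = np.random.default_rng(1337)
batch_size, n_steps, n_inputs, n_hidden = 4, 28, 28, 128

# Each MNIST image is fed as 28 time steps of 28-pixel rows.
x = rng.random((batch_size, n_steps, n_inputs)).astype("float32")
W_in = rng.standard_normal((n_inputs, n_hidden)) * 0.1
W_h = rng.standard_normal((n_hidden, n_hidden)) * 0.1

# The hidden state carries information from earlier rows to later ones.
h = np.zeros((batch_size, n_hidden))
for t in range(n_steps):
    h = np.tanh(x[:, t, :] @ W_in + h @ W_h)

print(h.shape)  # (4, 128): one final hidden state per image
```

In the real routine this final state is projected through a dense layer to `n_classes` logits.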

"Tensor Flow Actual Combat" Learning 5-Bidirectional LSTM Classifier 2019-05-03
Bi-RNN splits the ordinary RNN into two directions, one positive and the other related to historical data.A reverse, related future data, so that for the same time, the input historical data and future data can be used.RNN in both directions has its own state, and there is no direct connection between them, except that the last two outputs are connected to the output node of Bi-RNN together.Backward propagation of sequences
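The two independent directions and the concatenated output can be sketched in numpy (random weights and a simple tanh cell — an illustration of the structure the excerpt describes, not the book's code):

```python
import numpy as np

rng = np.random.default_rng(0)
batch, n_steps, n_in, n_hidden = 2, 5, 3, 4
x = rng.random((batch, n_steps, n_in))

def run_rnn(seq, W_in, W_h):
    """One direction: a simple tanh RNN with its own state."""
    h = np.zeros((seq.shape[0], W_h.shape[0]))
    for t in range(seq.shape[1]):
        h = np.tanh(seq[:, t, :] @ W_in + h @ W_h)
    return h

# Each direction has its own, unshared weights and state; the backward
# direction simply reads the sequence reversed in time.
fwd = run_rnn(x, rng.standard_normal((n_in, n_hidden)),
              rng.standard_normal((n_hidden, n_hidden)))
bwd = run_rnn(x[:, ::-1, :], rng.standard_normal((n_in, n_hidden)),
              rng.standard_normal((n_hidden, n_hidden)))

# Only at the output are the two directions joined, by concatenation.
out = np.concatenate([fwd, bwd], axis=1)
print(out.shape)  # (2, 8)
```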

A reduced version of inception Convolutional Neural Network is used for classification of mnist data sets 2019-05-03
Only one inception module is used in this network. After 20 epochs, the accuracy reaches 99.25%. Anyone who wants the test data can contact me. The code mainly references:

    from __future__ import print_function
    import numpy as np
    np.random.seed(1337)  # for reproducibility
    from keras.models import Model
    from keras.models import Sequential
    from keras import layers
    from keras.layers import Dense, Dropout, Activation, Flatten
    from keras.layers import Conv2D, MaxPooling2D, Input
    from keras.layers import BatchNormalization, AveragePooling2D, GlobalAveragePooling2D
    from keras.
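A single inception module of the kind the post describes can be sketched with the Keras functional API (branch widths and the pooling head are illustrative assumptions, not the post's exact configuration):

```python
from tensorflow import keras
from tensorflow.keras import layers

inp = keras.Input(shape=(28, 28, 1))  # one MNIST image

# Parallel branches with different receptive fields...
b1 = layers.Conv2D(16, (1, 1), padding="same", activation="relu")(inp)
b2 = layers.Conv2D(16, (3, 3), padding="same", activation="relu")(inp)
b3 = layers.Conv2D(16, (5, 5), padding="same", activation="relu")(inp)
b4 = layers.MaxPooling2D((3, 3), strides=1, padding="same")(inp)
b4 = layers.Conv2D(16, (1, 1), padding="same", activation="relu")(b4)

# ...are concatenated along the channel axis: 4 x 16 = 64 channels.
merged = layers.concatenate([b1, b2, b3, b4])

x = layers.GlobalAveragePooling2D()(merged)
out = layers.Dense(10, activation="softmax")(x)  # 10 digit classes
model = keras.Model(inp, out)
print(model.output_shape)  # (None, 10)
```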