QA Official

My Opinion: K-means and PCA

https://qaofficial.com/post/2019/04/25/68789-my-opinion-kmeans-and-pca.html 2019-04-25
pca code: # -*- coding: utf-8 -*- import numpy as np def pca(data): # data: m*n data = np.transpose(data) # transpose to features x samples mean_data = data - np.average(
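
The post's snippet is cut off above; as a rough illustration of the same approach (transpose to features x samples, centre each feature, then eigendecompose the covariance matrix), here is a minimal NumPy sketch. The function signature and the parameter k are assumptions for illustration, not the original post's code.

    import numpy as np

    def pca(data, k=2):
        # data: m x n (samples x features); k: number of components to keep
        X = np.transpose(data)                     # features x samples, as in the excerpt
        X = X - np.mean(X, axis=1, keepdims=True)  # centre each feature
        cov = np.cov(X)                            # n x n covariance matrix
        eigvals, eigvecs = np.linalg.eigh(cov)     # eigh: the covariance matrix is symmetric
        order = np.argsort(eigvals)[::-1]          # sort by descending variance
        components = eigvecs[:, order[:k]]         # n x k projection matrix
        return np.transpose(components.T @ X)      # m x k projected samples

    reduced = pca(np.random.randn(100, 5), k=2)    # example: shape (100, 2)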

On Understanding Principal Component Analysis (PCA) Algorithm

https://qaofficial.com/post/2019/04/25/68777-on-understanding-principal-component-analysis-pca-algorithm.html 2019-04-25
I studied the PCA algorithm for a while some time ago but never wrote it up. Recently a project plans to use PCA again, so I am striking while the iron is hot to consolidate my knowledge of it. This article is only meant to start a discussion: it is not authoritative, should not be taken entirely at face value, and reflects only my own experience.

Storing a TensorFlow ckpt model as npy

https://qaofficial.com/post/2019/04/25/68703-stores-tensorflowamp#x27s-ckpt-model-as-npy.html 2019-04-25
#coding=gbk import numpy as np import tensorflow as tf from tensorflow.python import pywrap_tensorflow checkpoint_path='model.ckpt-5000' # your ckpt path reader=pywrap_tensorflow.NewCheckpointReader(checkpoint_path) var_to_shape_map=reader.get_variable_to_shape_map() alexnet={} alexnet_layer = ['conv1','conv2','conv3','conv4','conv5','fc6','fc7','fc8'] add_info = ['weights','biases'] alexnet={'conv1':[[],[]],'conv2':[[],[]],'conv3':[[],[]],'conv4':[[],[]],'conv5':[[],[]],'fc6':[[],[]],'fc7':[[],[]],'fc8':[[],[]]} for key in var_to_shape_map: #print ("tensor_name",key) str_name = key # the model was optimized with Adam, so the generated ckpt contains variables with an "Adam" suffix
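
The excerpt stops inside the loop; below is a hedged sketch of how such a conversion usually continues: read every tensor from the checkpoint, skip the Adam optimizer's slot variables, collect the rest into a dict, and dump it with np.save. The flat dictionary layout and the output file name are assumptions, not the original post's code.

    import numpy as np
    from tensorflow.python import pywrap_tensorflow

    checkpoint_path = 'model.ckpt-5000'            # your ckpt path
    reader = pywrap_tensorflow.NewCheckpointReader(checkpoint_path)
    var_to_shape_map = reader.get_variable_to_shape_map()

    weights = {}
    for key in var_to_shape_map:
        if 'Adam' in key:                          # skip the Adam optimizer's slot variables
            continue
        weights[key] = reader.get_tensor(key)      # numpy array for this variable

    np.save('alexnet.npy', weights)                # reload with np.load('alexnet.npy', allow_pickle=True)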

UVA 12168 - Cat vs. Dog (Bipartite Matching + Maximum Independent Set)

https://qaofficial.com/post/2019/04/25/68678-uva-12168-cat-vs.-dog-bipartite-matching-maximum-independent-set.html 2019-04-25
UVA 12168 - Cat vs. Dog (problem link). Problem: given some cat lovers and some dog lovers, each person has a favourite cat (or dog) and a hated dog (or cat); find an arrangement that satisfies as many people as possible. Idea: build a conflict edge between every cat lover and dog lover whose choices clash, then take the maximum independent set of this bipartite graph, which equals the number of people minus the maximum matching. code: #include <cstdio> #include <cstring> #include <vector> using namespace std; const int N = 505; int t, c, d, n; struct People { int a, b; People() {} People(int a, int b) { this->a = a; this->b = b; } } cat[N], dog[N]; int cn, dn; vector<int> g[N]; char A[105], B[105]; int match[N], vis[N]; bool dfs(int u) { for (int i = 0; i < g[u].
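
For readers who just want the idea rather than the truncated C++, here is a hedged Python sketch of the same approach: connect every pair of conflicting voters, compute a maximum bipartite matching with augmenting paths, and report people minus matching as the maximum independent set. The function name and input format are made up for illustration.

    def max_satisfied(cat_lovers, dog_lovers):
        # each voter is a (loved, hated) pair of animal ids, e.g. ('C1', 'D3')
        graph = [[j for j, (l2, h2) in enumerate(dog_lovers) if l1 == h2 or h1 == l2]
                 for (l1, h1) in cat_lovers]       # conflict edges per cat lover
        match = [-1] * len(dog_lovers)             # dog lover -> matched cat lover

        def augment(u, seen):
            for v in graph[u]:
                if v not in seen:
                    seen.add(v)
                    if match[v] == -1 or augment(match[v], seen):
                        match[v] = u
                        return True
            return False

        matching = sum(augment(u, set()) for u in range(len(cat_lovers)))
        return len(cat_lovers) + len(dog_lovers) - matching

    # tiny example: one cat, one dog, two conflicting voters -> only 1 can be satisfied
    print(max_satisfied([('C1', 'D1')], [('D1', 'C1')]))   # prints 1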

BN layer and scale layer in Caffe

https://qaofficial.com/post/2019/04/25/68649-bn-layer-and-scale-layer-in-caffe.html 2019-04-25
From: https://zhidao.baidu.com/question/621624946902864092.html Why must the BN layer be used together with a scale layer in Caffe? To answer this question you first have to understand what batch normalization does. It actually does two things: 1) normalize the input, x_norm = (x - u)/std, where u and std are accumulated
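
To make the split concrete, here is a small NumPy illustration (not Caffe code) of the two steps the answer describes: the BatchNorm layer only performs the normalization x_norm = (x - u)/std (computed from the batch here for simplicity; Caffe accumulates running statistics), while the Scale layer applies the learned per-channel y = gamma * x_norm + beta.

    import numpy as np

    def batch_norm(x, eps=1e-5):
        # x: (N, C) activations; u and std are per-channel statistics
        u = x.mean(axis=0)
        std = np.sqrt(x.var(axis=0) + eps)
        return (x - u) / std                 # roughly what the BatchNorm layer outputs

    def scale(x_norm, gamma, beta):
        return gamma * x_norm + beta         # what the Scale layer adds on top

    x = np.random.randn(8, 3) * 4 + 10       # toy batch: 8 samples x 3 channels
    y = scale(batch_norm(x), gamma=np.ones(3), beta=np.zeros(3))
    print(y.mean(axis=0), y.std(axis=0))     # per-channel mean ~0, std ~1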

relationship between ChannelPipeline and ChannelHandler in Netty

https://qaofficial.com/post/2019/04/25/68717-relationship-between-channelpipeline-and-channelhandler-in-netty.html 2019-04-25
ChannelPipeline provides a container for the chain of ChannelHandlers and defines an API for propagating inbound and outbound events along the chain. When a Channel is created, it is automatically assigned its own ChannelPipeline. Installing ChannelHandlers into the ChannelPipeline: an implementation of ChannelInitializer is registered with ServerBootstrap (or Bootstrap for clients); when ChannelInitializer.initChannel() is called, the ChannelInitializer installs a set of custom ChannelHandlers into the ChannelPipeline and then removes itself from the ChannelPipeline.

sentiment analysis using Python's SnowNLP module

https://qaofficial.com/post/2019/04/25/68823-sentiment-analysis-using-pythonamp#x27s-snownlp-module.html 2019-04-25
SnowNLP is a class library written in Python that makes it convenient to process Chinese text. It was written under the inspiration of TextBlob: since most natural language processing libraries target English, the author wrote a library tailored to Chinese. Unlike TextBlob it does not use NLTK; all algorithms are implemented in the library itself, and it ships with some pre-trained dictionaries. An example of computing sentiment scores follows:
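
The example itself is cut off in this excerpt; below is a minimal usage sketch of the kind of sentiment call the post refers to (the sample sentences are my own). SnowNLP's sentiments score lies in [0, 1], with values near 1 read as positive and values near 0 as negative.

    from snownlp import SnowNLP

    s = SnowNLP(u'这个产品真的很好用')        # "this product is really easy to use"
    print(s.sentiments)                       # score near 1 -> positive

    t = SnowNLP(u'质量太差了，非常失望')      # "terrible quality, very disappointing"
    print(t.sentiments)                       # score near 0 -> negative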

SQL upper() function

https://qaofficial.com/post/2019/04/25/68647-sqlpupper-function.html 2019-04-25
SQL upper() function example code tutorial: returns the string str with all characters converted to uppercase according to the current character-set mapping. Explanation: upper(field name) converts the contents of the field to uppercase before returning it (note: the field's data type must be a string type); upper(string), e.g. upper('aabbcc'), returns 'AABBCC'. This function is seldom needed in SQL Server and is used mostly in case-sensitive Oracle.
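
As a quick way to check the behaviour described above, the following uses SQLite through Python's standard library (the post itself is about SQL Server and Oracle, so this is only an illustration of the same function).

    import sqlite3

    conn = sqlite3.connect(':memory:')
    print(conn.execute("SELECT upper('aabbcc')").fetchone()[0])   # -> AABBCC
    conn.close()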

there are four labels, namely label1, label2, ..., label4: how to get the caption of the corresponding label from an integer 1-4

https://qaofficial.com/post/2019/04/25/68835-there-are-four-labels-namely-label1-label2-...-label4-how-to-take-the-caption-of-the-corresponding-label-according-to-an-integer-of-1-4.html 2019-04-25
there are four labels, namely label1, label2, ..., label4: how to get the caption of the corresponding label from an integer 1-4.  Method 1: traverse all label controls on the owner. This traversal stores the controls in a TList in the order in which they were placed on the owner; controls placed first are stored first. procedure TForm1.Button1Click(Sender: TObject); var i: Integer; begin for I := 0 to Form1.

Ajax Post Request Solution to Chinese Scrambling Problem

https://qaofficial.com/post/2019/04/24/68557-ajax-post-request-solution-to-chinese-scrambling-problem.html 2019-04-24
This article introduces solutions to the problem of garbled Chinese characters in Ajax POST requests. If all of your web applications use UTF-8 encoding, this problem does not arise; it mainly affects web applications whose pages must use a non-UTF-8 encoding such as GBK or GB2312 (often for historical reasons). Generally, there are three solutions to this problem: