Deep Learning 13: Understanding Generative Adversarial Network

Proposed in 2014, the Generative Adversarial Network (GAN) now has many variants. You might not be surprised that the relevant papers read more like statistics research: when a new model is proposed, it is typically evaluated on some fundamental probability distributions, from which more general applications follow. Continue reading “Deep Learning 13: Understanding Generative Adversarial Network”
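
As a quick preview, the original GAN of Goodfellow et al. (2014) frames training as a minimax game between a generator $G$ and a discriminator $D$ (the standard formulation, not specific to this post's notation):

$$
\min_G \max_D \; \mathbb{E}_{x \sim p_{\text{data}}}\left[\log D(x)\right] + \mathbb{E}_{z \sim p_z}\left[\log\left(1 - D(G(z))\right)\right]
$$

For a fixed $G$, the optimal discriminator is $D^*(x) = p_{\text{data}}(x)/(p_{\text{data}}(x) + p_g(x))$, and substituting it back shows the generator minimizes (up to constants) the Jensen-Shannon divergence between $p_{\text{data}}$ and $p_g$. This is exactly why the analysis reads like statistics: it is a comparison of probability distributions.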

Two sample problem (2): kernel function, feature space and reproducing kernel map

Find Two sample problem (1) here.

We will take a look at the RKHS (Reproducing Kernel Hilbert Space) in this post. You might think of it as a purely statistical term, but it is remarkable for its wide range of applications. You will need to refresh your memory of some linear algebra. We start with some basic terms and definitions. Continue reading “Two sample problem (2): kernel function, feature space and reproducing kernel map”
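
To fix notation before the post proper, here are the two standard facts it builds on (textbook definitions, stated as a preview): a kernel $k$ is an inner product of feature maps, and in an RKHS $\mathcal{H}$, function evaluation is an inner product with the kernel section:

$$
k(x, y) = \langle \varphi(x), \varphi(y) \rangle_{\mathcal{H}}, \qquad f(x) = \langle f, k(\cdot, x) \rangle_{\mathcal{H}} \ \text{ for all } f \in \mathcal{H}.
$$

The reproducing kernel map is $\varphi(x) = k(\cdot, x)$, so in particular $k(x, y) = \langle k(\cdot, x), k(\cdot, y) \rangle_{\mathcal{H}}$.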

NLP 05: From Word2vec to Doc2vec: a simple example with Gensim

Introduction

First introduced by Mikolov et al. [1] in 2013, word2vec learns distributed representations of words (word embeddings) with a shallow neural network. It is based on the distributional hypothesis: words that occur in similar contexts (with similar neighboring words) tend to have similar meanings. There are two models: CBOW (continuous bag of words), which uses a bag of context words to predict a target word, and skip-gram, which uses one word to predict its neighbors. For more, although not highly recommended, have a look at the TensorFlow tutorial here. Continue reading “NLP 05: From Word2vec to Doc2vec: a simple example with Gensim”
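
As a taste of what the post covers, here is a minimal word2vec sketch with Gensim. This assumes Gensim 4.x (where the embedding-size argument is vector_size; older releases called it size), and the toy corpus is made up purely for illustration:

```python
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens (made up for illustration).
sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "log"],
]

# sg=0 trains CBOW (a bag of context words predicts the target word);
# sg=1 would train skip-gram (one word predicts its neighbors).
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=0)

vector = model.wv["cat"]             # the 50-dimensional embedding for "cat"
print(model.wv.most_similar("cat"))  # nearest words by cosine similarity
```

Doc2vec follows the same pattern via gensim.models.doc2vec.Doc2Vec with TaggedDocument inputs, which is where the full post goes next.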