Logistic Regression is very popular in Machine Learning; it is used to make predictions by squashing a linear score through the sigmoid function. (Its outputs are not exact probabilities, just the model's estimated values in (0, 1).)
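The prediction step can be sketched in a few lines of Java. This is a minimal illustration, not a trained model: the weights, bias, and input below are made-up example numbers.

```java
// Minimal sketch of logistic regression prediction.
// The weights and inputs here are hypothetical, purely for illustration.
public class LogisticSketch {
    // Sigmoid squashes any real score into (0, 1).
    public static double sigmoid(double z) {
        return 1.0 / (1.0 + Math.exp(-z));
    }

    // Predicted value for features x under weights w and bias b.
    public static double predict(double[] w, double b, double[] x) {
        double z = b;
        for (int i = 0; i < w.length; i++) z += w[i] * x[i];
        return sigmoid(z);
    }

    public static void main(String[] args) {
        double[] w = {1.5, -0.8};   // assumed example weights
        double[] x = {2.0, 1.0};    // assumed example features
        System.out.println(predict(w, 0.1, x));
    }
}
```

A score of exactly 0 maps to 0.5, which is why 0.5 is the usual decision threshold for binary classification.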
Author Archives: Irene
TinkerpopGraph 01: Easy API usage, examples
Parallel Gibbs Sampling and Neural Networks
Parallelism over variables (vertices): a general, huge, undirected graph where each vertex is a variable (parallel sampling over a high-dimensional state).
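One standard way to parallelize over vertices is to color the graph and resample each color class concurrently, since vertices of the same color share no edge. Here is a tiny single-machine sketch of that scheduling idea; the graph, the 2-coloring, and the `resample` placeholder are all hypothetical.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;

// Sketch of "parallel over non-adjacent variables" scheduling
// (the chromatic-Gibbs idea). Everything here is a toy example.
public class ParallelSweepSketch {
    // Placeholder for drawing from a vertex's conditional distribution;
    // deterministic here so the demo is reproducible.
    public static double resample(int v) { return v * 0.5; }

    // One sweep: each color class holds mutually non-adjacent vertices,
    // so all of its vertices can be resampled concurrently.
    public static double[] demoSweep() {
        // Tiny 4-cycle 0-1-2-3-0; the 2-coloring {0,2} / {1,3} has no
        // edge inside either class.
        Map<Integer, List<Integer>> colorClasses = Map.of(
                0, List.of(0, 2),
                1, List.of(1, 3));
        double[] state = new double[4];
        for (List<Integer> cls : colorClasses.values()) {
            cls.parallelStream().forEach(v -> state[v] = resample(v));
        }
        return state;
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(demoSweep()));
        // prints [0.0, 0.5, 1.0, 1.5]
    }
}
```

The key point is that writes within one color class touch distinct, non-neighboring vertices, so the parallel stream introduces no race on a neighbor's value.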
Gibbs Sampling: about Parallelization
About BN: a Belief Network, or directed acyclic graphical model (DAG). When the BN is huge, there are two options: exact inference (variable elimination) or stochastic inference (MCMC).
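To make the MCMC option concrete, here is a tiny Gibbs sampler on an invented three-node BN, A → C ← B, with C = 1 observed and all probability numbers chosen only for illustration. It estimates P(A=1 | C=1) by alternating the two full conditionals.

```java
import java.util.Random;

// Toy Gibbs sampler on the BN  A -> C <- B  with C = 1 observed.
// Structure and numbers are invented; exact answer is 0.25/0.3 = 0.8333.
public class BnGibbsSketch {
    static final Random RNG = new Random(42);

    // P(C=1 | A, B): 0.9 when both parents are 1, else 0.1.
    public static double pC1(int a, int b) {
        return (a == 1 && b == 1) ? 0.9 : 0.1;
    }

    // Sample A from P(A | B=other, C=1) proportional to P(A) * P(C=1 | A, B);
    // priors are P(A=1) = P(B=1) = 0.5, so B's conditional is symmetric.
    public static int sampleGiven(int other) {
        double w1 = 0.5 * pC1(1, other);
        double w0 = 0.5 * pC1(0, other);
        return RNG.nextDouble() < w1 / (w1 + w0) ? 1 : 0;
    }

    // Estimate P(A=1 | C=1) by alternating the two full conditionals.
    public static double estimate(int iters) {
        int a = 0, b = 0, hits = 0;
        for (int t = 0; t < iters; t++) {
            a = sampleGiven(b);   // A | B, C=1
            b = sampleGiven(a);   // B | A, C=1 (symmetric by design)
            hits += a;
        }
        return (double) hits / iters;
    }

    public static void main(String[] args) {
        System.out.println(estimate(100_000)); // should be near 0.833
    }
}
```

The appeal is that each step only touches a variable's Markov blanket, which is what makes the method scale to BNs far too large for variable elimination.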
Gibbs Sampling: an easy Java Version on TinkerPop3
Introduction
Loopy BP: an easy implementation on Pregel Model
Pregel: message passing. Focus on the overall process, not on each vertex's computation. Steps [1]: (this part was referenced from a blog)
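The superstep loop can be sketched in plain Java using the classic "propagate the maximum value" example: each superstep, every vertex folds its incoming messages into its state and messages its neighbors only if the state changed. This is a single-threaded toy with a made-up graph, not a real Pregel runtime.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Single-threaded sketch of the Pregel superstep loop
// ("propagate the maximum value"). Graph layout is invented.
public class PregelSketch {
    public static int[] maxPropagation(int[] init, int[][] neighbors) {
        int n = init.length;
        int[] values = Arrays.copyOf(init, n);
        List<List<Integer>> inbox = new ArrayList<>();
        for (int i = 0; i < n; i++) inbox.add(new ArrayList<>());
        int superstep = 0;
        boolean anyMessages = true;
        while (superstep == 0 || anyMessages) {
            anyMessages = false;
            List<List<Integer>> outbox = new ArrayList<>();
            for (int i = 0; i < n; i++) outbox.add(new ArrayList<>());
            for (int v = 0; v < n; v++) {
                // compute(): absorb messages, then speak up if changed
                // (every vertex speaks on superstep 0, as in Pregel).
                int before = values[v];
                for (int m : inbox.get(v)) values[v] = Math.max(values[v], m);
                if (superstep == 0 || values[v] != before) {
                    for (int u : neighbors[v]) {
                        outbox.get(u).add(values[v]);
                        anyMessages = true;
                    }
                }
            }
            inbox = outbox;   // synchronous barrier between supersteps
            superstep++;
        }
        return values;
    }

    public static void main(String[] args) {
        int[][] nbrs = {{1}, {0, 2}, {1, 3}, {2}};  // path 0-1-2-3
        int[] result = maxPropagation(new int[]{3, 6, 2, 1}, nbrs);
        System.out.println(Arrays.toString(result));
        // every vertex converges to the global max, 6
    }
}
```

Note how the loop never looks inside `compute()` beyond the message contract; that separation of the process from each vertex's computation is exactly the point made above.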
MCMC: Gibbs Sampling
In Importance Sampling, all the samples are independent. But in MCMC, the samples are dependent: each draw is conditioned on the previous state of the chain.
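That dependence is easy to see in code. Below is a minimal random-walk Metropolis chain (a simple MCMC method) targeting an unnormalized standard normal, exp(-x²/2); the proposal is explicitly built from the current state `x`, whereas an importance sampler would draw every proposal i.i.d. from a fixed distribution. Step size and iteration counts are illustrative.

```java
import java.util.Random;

// Random-walk Metropolis sketch: each sample depends on the previous
// one, unlike the i.i.d. draws of Importance Sampling. Target is an
// unnormalized N(0,1) density, so E[X^2] should come out near 1.
public class McmcSketch {
    public static double target(double x) { return Math.exp(-x * x / 2); }

    public static double estimateSecondMoment(int iters, long seed) {
        Random rng = new Random(seed);
        double x = 0.0, sum = 0.0;
        for (int t = 0; t < iters; t++) {
            double proposal = x + rng.nextGaussian();   // depends on x!
            // Accept with probability min(1, target ratio); the
            // comparison below handles the min(1, .) implicitly.
            if (rng.nextDouble() < target(proposal) / target(x)) {
                x = proposal;                           // accept
            }                                           // else keep x
            sum += x * x;
        }
        return sum / iters;
    }

    public static void main(String[] args) {
        System.out.println(estimateSecondMoment(200_000, 7));
    }
}
```

Because consecutive states are correlated, the effective sample size of an MCMC run is smaller than the raw iteration count, which is the practical cost of that dependence.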
Sampling Methods
This post contains very basic notes on a few sampling methods. As I am going to implement a Java version of Gibbs Sampling, I went through some materials on the internet and kept a learning journal here. I will learn more about Gibbs Sampling in the coming days, and I will focus on how it interacts with graphs.
