LectureBank: A Dataset for NLP Education and Prerequisite Chain Learning

Introduction

In this blog post, we introduce our AAAI 2019 accepted paper "What Should I Learn First: Introducing LectureBank for NLP Education and Prerequisite Chain Learning." Our LectureBank dataset contains 1,352 English lecture files collected from university courses, mainly in the Natural Language Processing (NLP) field. In addition, each file is manually classified according to an existing …

TensorFlow 08: save and restore a subset of variables

TensorFlow provides save and restore functions that let us save and re-use model parameters. If you have a trained VGG model, for example, it is helpful to restore the first few layers and then apply them in your own network. This raises a question: how do we restore only a subset of variables?
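To give the flavor of the trick, here is a minimal sketch assuming the TensorFlow 1.x API (the variable names and checkpoint path are made up for illustration). Passing an explicit var_list to tf.train.Saver makes the saver restore only those variables, while everything else keeps its freshly initialized value.

```python
import tensorflow as tf

# Two variables in the current graph; names are illustrative.
w1 = tf.get_variable("conv1/weights", shape=[3, 3, 3, 64])
w2 = tf.get_variable("fc/weights", shape=[256, 10])

# A Saver built with an explicit var_list only touches those variables.
restore_saver = tf.train.Saver(var_list=[w1])

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Restores conv1/weights from the (hypothetical) checkpoint;
    # fc/weights keeps its freshly initialized value.
    restore_saver.restore(sess, "/tmp/vgg_model.ckpt")
```

The checkpoint must contain a variable with a matching name; tf.train.Saver also accepts a dict mapping checkpoint names to in-graph variables when the names differ.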

To copy or not, that is the question: copying mechanism

In our daily life, we often repeat something mentioned earlier in a dialogue, such as the names of people or organizations: "Hi, my name is Pikachu." "Hi, Pikachu, …" There is a high probability that a word like "Pikachu" will not be in the vocabulary extracted from the training data. So the paper "Incorporating Copying Mechanism in Sequence-to-Sequence Learning" (Gu et al., ACL 2016) …
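The gist can be shown in a few lines. In CopyNet, generation scores over the fixed vocabulary and copy scores over the source positions are normalized in one joint softmax, so probability mass can flow to out-of-vocabulary source words. Below is a toy NumPy sketch of that idea, not the paper's full model; all names and numbers are illustrative.

```python
import numpy as np

def copy_mixture(gen_scores, copy_scores, vocab, source_tokens):
    """Toy illustration of mixing generation and copy scores.

    gen_scores:  unnormalized scores over the fixed vocabulary
    copy_scores: unnormalized scores over the source positions
    Both score sets are normalized in one joint softmax, so an
    out-of-vocabulary source word can still receive probability.
    """
    scores = np.concatenate([gen_scores, copy_scores])
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()

    # Accumulate probability per surface word; a word reachable by
    # both the generate and copy paths gets the sum of the two.
    word_probs = {}
    for w, p in zip(vocab, probs[: len(vocab)]):
        word_probs[w] = word_probs.get(w, 0.0) + p
    for w, p in zip(source_tokens, probs[len(vocab):]):
        word_probs[w] = word_probs.get(w, 0.0) + p
    return word_probs

# "Pikachu" is OOV but still receives probability via the copy path.
print(copy_mixture(np.array([1.0, 0.5]), np.array([2.0]),
                   vocab=["hi", "name"], source_tokens=["Pikachu"]))
```

Summing the generate and copy probabilities for a word that appears in both the vocabulary and the source matches the paper's scoring of the two modes.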

Deep Learning 16: Understanding Capsule Nets

This post contains my learning notes from Prof Hung-Yi Lee's lecture; the PDF can be found here (pages 40-52). I have read a few articles on the topic, and I found this one is a must-read: it is simple, and you can easily understand what is going on. I would say it is a good starting point for further reading. Paper …
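For readers who want to see the mechanics, here is a minimal NumPy sketch of the routing-by-agreement loop from the capsule paper (Sabour et al., 2017); the shapes and the iteration count are arbitrary choices for illustration.

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # Capsule non-linearity: keeps the direction, shrinks the norm into [0, 1).
    sq_norm = np.sum(s ** 2, axis=axis, keepdims=True)
    return (sq_norm / (1.0 + sq_norm)) * s / np.sqrt(sq_norm + eps)

def dynamic_routing(u_hat, n_iter=3):
    """Routing-by-agreement over prediction vectors u_hat
    of shape (n_in, n_out, dim_out)."""
    n_in, n_out, _ = u_hat.shape
    b = np.zeros((n_in, n_out))  # routing logits
    for _ in range(n_iter):
        # Coupling coefficients: softmax over output capsules per input.
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)
        s = (c[..., None] * u_hat).sum(axis=0)   # weighted sum of predictions
        v = squash(s)                            # output capsule vectors
        b += (u_hat * v[None]).sum(axis=-1)      # agreement update
    return v

v = dynamic_routing(np.random.randn(8, 4, 16))
print(v.shape)  # (4, 16): 4 output capsules of dimension 16
```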