LectureBank: A Dataset for NLP Education and Prerequisite Chain Learning

Introduction

In this blog post, we introduce our paper accepted at AAAI 2019, “What Should I Learn First: Introducing LectureBank for NLP Education and Prerequisite Chain Learning.”
Our LectureBank dataset contains 1,352 English lecture files collected from university courses, mainly in the field of Natural Language Processing (NLP). In addition, each file is manually classified according to an existing taxonomy. Together with the dataset, we include 208 topics with manually labeled prerequisite relations. The dataset will be useful for educational purposes such as lecture preparation and organization, as well as for applications such as reading list generation. Additionally, we experiment with neural graph-based networks and non-neural classifiers to learn these prerequisite relations from our dataset.
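
As a rough illustration of the pairwise prerequisite-classification setup (not the exact features or models used in the paper), here is a minimal Python sketch: each labeled pair of topics is turned into a feature vector and fed to a simple classifier. The topic names, vectors, and labels below are made up for the example.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy data (hypothetical): label = 1 means topic_a is a prerequisite of topic_b.
rng = np.random.RandomState(0)
topic_vecs = {t: rng.rand(50) for t in ["n-grams", "language models", "neural LMs"]}
pairs = [("n-grams", "language models", 1),
         ("language models", "neural LMs", 1),
         ("neural LMs", "n-grams", 0)]

# Concatenate the two topic vectors as features for a pairwise classifier.
X = np.array([np.concatenate([topic_vecs[a], topic_vecs[b]]) for a, b, _ in pairs])
y = np.array([label for _, _, label in pairs])

clf = LogisticRegression().fit(X, y)
print(clf.predict(X))  # predicted prerequisite direction for each topic pair
```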


TensorFlow 08: save and restore a subset of variables

TensorFlow provides save and restore functions that let us save and re-use model parameters. If you have a trained VGG model, for example, it can be useful to restore only the first few layers and apply them in your own network. This raises a question: how do we restore a subset of the parameters? You can always check the official TensorFlow documentation here. In this post, I will take some code from the documentation and add some practical points.
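
Here is a minimal sketch of the idea, assuming TensorFlow 1.x (the layer names and the checkpoint path are made up for illustration): build the graph, collect only the variables you want to load, and pass them to the Saver as var_list.

```python
import tensorflow as tf

# Hypothetical graph: a conv layer we want to restore and an fc layer we do not.
with tf.variable_scope("conv1"):
    w1 = tf.get_variable("weights", shape=[3, 3, 3, 64])
with tf.variable_scope("fc"):
    w2 = tf.get_variable("weights", shape=[64, 10])

# Keep only the variables whose names start with "conv1".
vars_to_restore = [v for v in tf.global_variables()
                   if v.name.startswith("conv1")]

# A Saver built with var_list only saves/restores those variables.
saver = tf.train.Saver(var_list=vars_to_restore)

with tf.Session() as sess:
    # Initialize everything first, then overwrite the subset from the checkpoint.
    sess.run(tf.global_variables_initializer())
    saver.restore(sess, "/tmp/pretrained_model.ckpt")  # hypothetical path
```

One practical point: run the initializer before calling restore, so that variables not covered by the checkpoint still receive values.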
