MLM, masked language modeling, is an important task for training a BERT model. In the original BERT paper, BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, it is one of the main tasks used to pre-train BERT. So if you have your own corpus, it is possible to train MLM on any pre-trained… Continue reading “Deep Learning 19: Training MLM on any pre-trained BERT models”
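The core of MLM is the masking procedure described in the BERT paper: roughly 15% of token positions are selected; of those, 80% are replaced with [MASK], 10% with a random token, and 10% left unchanged, and the model is trained to recover the originals. Below is a minimal sketch of that procedure; the function name, token list, and toy vocabulary are illustrative, and in practice a library utility (e.g. a Hugging Face data collator for language modeling) would handle this.

```python
import random

def mask_tokens(tokens, vocab, mask_token="[MASK]", mask_prob=0.15, seed=0):
    """BERT-style MLM masking sketch.

    Selects ~mask_prob of the positions. Of the selected positions:
    80% become mask_token, 10% become a random vocab token, 10% stay
    unchanged. Returns (masked_tokens, labels), where labels[i] holds
    the original token at selected positions and None elsewhere, so
    the loss is computed only on selected positions.
    """
    rng = random.Random(seed)
    masked = list(tokens)
    labels = [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok          # predict the original token here
            r = rng.random()
            if r < 0.8:
                masked[i] = mask_token       # 80%: replace with [MASK]
            elif r < 0.9:
                masked[i] = rng.choice(vocab)  # 10%: random token
            # else: 10%: keep the original token unchanged
    return masked, labels
```

With a tokenized corpus, each training example is a (masked input, labels) pair, and only the labeled positions contribute to the cross-entropy loss.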
Check out my class talk slides about Graph Neural Networks and their applications in NLP! Covered materials: Semi-Supervised Classification with Graph Convolutional Networks; Variational Graph Auto-Encoders; Graph Attention Networks; Graph Convolutional Networks for Text Classification (AAAI 2019); Heterogeneous Graph Neural Networks for Extractive Document Summarization (ACL 2020); A Graph-based Coarse-to-fine Method for Unsupervised Bilingual Lexicon… Continue reading “Slideshare (11): Graph Neural Networks and Applications in NLP”
Please check my notes for the Spring 2020 semester. Topics covered: Text Generation, BERT understanding, Knowledge Distillation, NLP Applications, EHR + Translation.
To bridge the gap between Paddle Serving and the PaddlePaddle framework, we have released a new Paddle Serving feature on GitHub: Model As A Service (MAAS). With this new service, once a PaddlePaddle model is trained, users can obtain the corresponding inference model at the same time, making it possible to deploy… Continue reading “Paddle Serving: model-as-a-service! Triggered by a single command line, deployment finishes in 10 minutes”
Please check my updated notes on Transfer Learning for NLP tasks: Irene_TransferLearning_2020Spring. A brief outline: Transfer Learning with word embeddings; pre-BERT times; BERT and its variants; understanding and reducing BERT; Transfer Learning in the real world. Also, see my A Brief Introduction on Transfer Learning notes from Spring 2019.
The PDF contains notes on several papers, some of which only get 1-2 slide pages. As before, red text marks my own thoughts and insights. I hope these notes help readers better understand the new concepts and get inspired. Click to view: GNN_notes_Fall2019.pdf
Finally, my notes are online now: NeurIPS_notes_Irene
More resources! We have almost doubled the number of manually collected resources since our previous release, now totaling over 13,000.
During this summer, I did a project on cross-lingual NLP tasks. Recently I reworked my notes and organized them into a better format, and I would like to share some of them with readers who might be interested in this topic. Cross_lingual_NLP (PDF). Papers covered: A Robust Abstractive System for Cross-Lingual Summarization; MASS: … Continue reading “Slideshare (6): Cross-lingual Paper reading notes”
A quick introduction to Graph Neural Networks: Graph Neural Networks List of papers covered: Semi-Supervised Classification with Graph Convolutional Networks Graph Attention Networks Variational Graph Auto-Encoders Keep It Simple: Graph Autoencoders Without Graph Convolutional Networks