Codes & Datasets

We have publicly released the source code and datasets developed for our research, in the hope that they will facilitate further AI research and applications.

Released Source Codes

NO  | Code                   | Paper                                                                                   | Conference     | Year
01. | BERTweet               | BERTweet: A pre-trained language model for English Tweets                              | EMNLP          | 2020
02. | PhoBERT                | PhoBERT: Pre-trained language models for Vietnamese                                    | EMNLP Findings | 2020
03. | DSW                    | Distributional Sliced-Wasserstein distance                                             | ICLR           | 2020
04. | PC3-pytorch            | Predictive Coding for Locally-Linear Control                                           | ICML           | 2020
05. | PCC-pytorch            | Prediction, Consistency, Curvature: Representation Learning for Locally-Linear Control | ICLR           | 2019
06. | plasticbag-faster-rcnn | No paper yet                                                                            |                | 2019
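
As an illustration of how the released pre-trained language models (entries 01 and 02 above) are typically consumed, here is a minimal sketch using the Hugging Face transformers library. The model identifiers vinai/bertweet-base and vinai/phobert-base are assumptions based on the public model hub; please check each repository's README for the exact names and the recommended tweet/text preprocessing.

```python
# Minimal usage sketch for the released pre-trained language models.
# Assumption: the models are published on the Hugging Face Hub as
# "vinai/bertweet-base" and "vinai/phobert-base"; see each repository's
# README for the exact identifiers and recommended preprocessing.
import torch
from transformers import AutoModel, AutoTokenizer

# BERTweet: a language model pre-trained on English Tweets.
bertweet_tokenizer = AutoTokenizer.from_pretrained("vinai/bertweet-base")
bertweet = AutoModel.from_pretrained("vinai/bertweet-base")

# PhoBERT: pre-trained language models for Vietnamese.
phobert_tokenizer = AutoTokenizer.from_pretrained("vinai/phobert-base")
phobert = AutoModel.from_pretrained("vinai/phobert-base")

# Encode an example sentence and use the last hidden states as
# contextual embeddings for downstream tasks.
inputs = bertweet_tokenizer("This is just an example tweet", return_tensors="pt")
with torch.no_grad():
    features = bertweet(**inputs).last_hidden_state  # shape: (1, seq_len, 768)
print(features.shape)
```

The same pattern (tokenizer plus AutoModel) applies to both models; only the identifier and the input language differ.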