Publications
This page features a selection of my publications. For a complete list, please refer to my Google Scholar profile, Semantic Scholar profile, the Computer Science Bibliography, or the ACL Anthology.
Published in 2023
With the wide application of natural language processing and the rapid progress of machine learning algorithms, most notably deep learning, NLP algorithms and research tasks have been evolving rapidly in recent years. Since 2003, the authors have offered natural language processing courses for undergraduate, master's, and doctoral students at the School of Computer Science, Fudan University. This book distills those years of teaching and research, aiming to give readers a more systematic and comprehensive understanding of natural language processing.
Recommended citation: Qi Zhang, Tao Gui, Xuanjing Huang: 自然语言处理导论 (Introduction to Natural Language Processing), 2023 https://intro-nlp.github.io/
Published in Proceedings of the 29th International Conference on Computational Linguistics, 2022
In this paper, we propose a multi-format transfer learning model with variational information bottleneck for event argument extraction (EAE) in new datasets.
Recommended citation: Jie Zhou, Qi Zhang, Qin Chen, Liang He, Xuanjing Huang: A Multi-Format Transfer Learning Model for Event Argument Extraction via Variational Information Bottleneck. COLING 2022: 1990-2000 http://xuanjing-huang.github.io/files/mft.pdf
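As a rough illustration of the variational information bottleneck term (a minimal PyTorch sketch, not the paper's implementation; the Gaussian posterior parameters `mu`/`logvar` and the weight `beta` are assumed names):

```python
import torch
import torch.nn.functional as F
from torch.distributions import Normal, kl_divergence

def vib_loss(mu, logvar, logits, labels, beta=1e-3):
    """Task loss plus a KL compression term: the variational information
    bottleneck keeps only label-relevant information in z ~ q(z|x)."""
    q = Normal(mu, torch.exp(0.5 * logvar))                 # posterior q(z|x)
    p = Normal(torch.zeros_like(mu), torch.ones_like(mu))   # prior p(z) = N(0, I)
    kl = kl_divergence(q, p).sum(dim=-1).mean()             # compression term
    task = F.cross_entropy(logits, labels)                  # prediction term
    return task + beta * kl
```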
Published in Findings of the Association for Computational Linguistics: ACL-IJCNLP, 2021
The paper proposes a framework that keeps the original parameters of the pre-trained model fixed and supports the development of versatile knowledge-infused models.
Recommended citation: Ruize Wang, Duyu Tang, Nan Duan, Zhongyu Wei, Xuanjing Huang, Jianshu Ji, Guihong Cao, Daxin Jiang, Ming Zhou: K-Adapter: Infusing Knowledge into Pre-Trained Models with Adapters. ACL/IJCNLP (Findings) 2021: 1405-1418 http://xuanjing-huang.github.io/files/K-Adapter.pdf
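The adapter idea behind K-Adapter can be sketched as a small bottleneck module trained while the backbone stays frozen (an illustrative sketch with assumed dimensions; the paper's adapters additionally contain transformer layers):

```python
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: the pre-trained model stays frozen;
    only these small modules are trained for each kind of knowledge."""
    def __init__(self, hidden=768, bottleneck=64):
        super().__init__()
        self.down = nn.Linear(hidden, bottleneck)
        self.up = nn.Linear(bottleneck, hidden)
        self.act = nn.GELU()

    def forward(self, h):
        return h + self.up(self.act(self.down(h)))  # residual connection

# Hypothetical usage: freeze the backbone, train only the adapters.
# for p in pretrained_model.parameters(): p.requires_grad = False
```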
Published in SCIENCE CHINA Technological Sciences (SCTS), 2020
In this survey, we provide a comprehensive review of pre-trained models (PTMs) for NLP.
Recommended citation: Xipeng Qiu, Tianxiang Sun, Yige Xu, Yunfan Shao, Ning Dai, Xuanjing Huang: Pre-trained Models for Natural Language Processing: A Survey. SCIENCE CHINA Technological Sciences (SCTS) 63(10): 1872–1897 (2020) http://xuanjing-huang.github.io/files/PTM.pdf
Published in Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, 2020
This paper presents a paradigm shift in how we build neural extractive summarization systems: instead of extracting and scoring sentences individually, it formulates extractive summarization as a semantic text-matching problem.
Recommended citation: Ming Zhong, Pengfei Liu, Yiran Chen, Danqing Wang, Xipeng Qiu, Xuanjing Huang: Extractive Summarization as Text Matching. ACL 2020: 6197-6208 http://xuanjing-huang.github.io/files/ext.pdf
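The matching view can be sketched as choosing the candidate summary whose embedding is closest to the document's in a shared semantic space (a minimal sketch; the encoders producing `doc_emb` and `cand_embs` are assumed to exist, e.g. a Siamese BERT as in the paper):

```python
import torch
import torch.nn.functional as F

def best_candidate(doc_emb, cand_embs):
    """Pick the candidate summary closest to the document in the
    semantic space: the core of summarization-as-matching.
    doc_emb: (d,) document vector; cand_embs: (N, d) candidate vectors."""
    scores = F.cosine_similarity(cand_embs, doc_emb.unsqueeze(0), dim=-1)
    return torch.argmax(scores).item()
```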
Published in Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, 2020
In this work, we propose a simple but effective method for incorporating the word lexicon into character representations.
Recommended citation: Ruotian Ma, Minlong Peng, Qi Zhang, Zhongyu Wei, Xuanjing Huang: Simplify the Usage of Lexicon in Chinese NER. ACL 2020: 5951-5960 http://xuanjing-huang.github.io/files/Simplify.pdf
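A highly simplified sketch of the idea (the method, often called SoftLexicon, groups matched words into position-based B/M/E/S sets and weights them by frequency; here a single mean-pooled set stands in for all of that):

```python
import torch

def augment_char(char_emb, matched_word_embs):
    """Fuse lexicon information into a character representation by
    pooling the embeddings of the lexicon words covering this character
    and concatenating the result onto the character embedding."""
    if matched_word_embs:
        lex = torch.stack(matched_word_embs).mean(dim=0)
    else:
        lex = torch.zeros_like(char_emb)  # no lexicon match for this character
    return torch.cat([char_emb, lex], dim=-1)
```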
Published in Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, 2020
In this paper, we propose FLAT: Flat-LAttice Transformer for Chinese named entity recognition (NER), which converts the lattice structure into a flat structure consisting of spans.
Recommended citation: Xiaonan Li, Hang Yan, Xipeng Qiu, Xuanjing Huang: FLAT: Chinese NER Using Flat-Lattice Transformer. ACL 2020: 6836-6842 http://xuanjing-huang.github.io/files/FLAT.pdf
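The flattening step itself is easy to sketch: every character becomes a span whose head equals its tail, and every lexicon match becomes a span covering its head and tail positions (illustrative code only; the transformer's relative-position encoding over these spans is omitted):

```python
def flatten_lattice(chars, matched_words):
    """Convert a character lattice into a flat sequence of spans.
    Each token keeps a (head, tail) position pair, so characters and
    lexicon words can attend to one another in a single transformer.
    `matched_words` maps (start, end) -> word string (assumed format)."""
    spans = [(c, i, i) for i, c in enumerate(chars)]             # chars: head == tail
    spans += [(w, s, e) for (s, e), w in matched_words.items()]  # lexicon words
    return spans

# e.g. flatten_lattice(list("重庆人和药店"),
#                      {(0, 1): "重庆", (2, 3): "人和", (4, 5): "药店"})
```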
Published in Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019
In this work, we introduce a lexicon-based graph neural network with global semantics for Chinese NER.
Recommended citation: Tao Gui, Yicheng Zou, Qi Zhang, Minlong Peng, Jinlan Fu, Zhongyu Wei, Xuanjing Huang: A Lexicon-Based Graph Neural Network for Chinese NER. EMNLP/IJCNLP (1) 2019: 1040-1050 http://xuanjing-huang.github.io/files/ALB.pdf
Published in The Eighteenth China National Conference on Computational Linguistics, 2019
In this paper, we conduct exhaustive experiments to investigate different fine-tuning methods of BERT on text classification tasks.
Recommended citation: Chi Sun, Xipeng Qiu, Yige Xu, Xuanjing Huang: How to Fine-Tune BERT for Text Classification? CCL 2019: 194-206 http://xuanjing-huang.github.io/files/bert-ft.pdf
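One strategy the paper examines, layer-wise decreasing learning rates, can be sketched with HuggingFace Transformers (a minimal sketch; the model name and hyperparameter values are placeholders, and the embedding/pooler parameter groups are omitted for brevity):

```python
import torch
from transformers import BertForSequenceClassification

model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)

# Lower layers get smaller learning rates than upper layers.
base_lr, decay = 2e-5, 0.95
layers = model.bert.encoder.layer
groups = []
for i, layer in enumerate(layers):
    lr = base_lr * (decay ** (len(layers) - 1 - i))
    groups.append({"params": layer.parameters(), "lr": lr})
groups.append({"params": model.classifier.parameters(), "lr": base_lr})
optimizer = torch.optim.AdamW(groups, lr=base_lr)
```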
Published in Proceedings of the 27th International Conference on Computational Linguistics, 2018
We propose a novel lexicon-based supervised attention model (LBSA) for generating sentiment-informative representations.
Recommended citation: Yicheng Zou, Tao Gui, Qi Zhang, Xuanjing Huang: A Lexicon-Based Supervised Attention Model for Neural Sentiment Analysis. COLING 2018: 868-877 http://xuanjing-huang.github.io/files/nsa.pdf
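The supervision idea can be sketched as an auxiliary loss that pulls the attention distribution toward tokens found in a sentiment lexicon (an assumed formulation for illustration; the paper's actual supervision signal may differ in detail):

```python
import torch
import torch.nn.functional as F

def attention_supervision_loss(attn_weights, lexicon_mask):
    """Encourage attention mass on lexicon sentiment words by matching
    the attention distribution to a lexicon-derived target distribution.
    `lexicon_mask`: 1.0 where a token appears in the sentiment lexicon."""
    target = lexicon_mask / lexicon_mask.sum(dim=-1, keepdim=True).clamp(min=1.0)
    return F.kl_div(attn_weights.clamp(min=1e-9).log(), target,
                    reduction="batchmean")
```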
Published in Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics, 2017
In this paper, we propose adversarial multi-criteria learning for CWS by integrating shared knowledge from multiple heterogeneous segmentation criteria.
Recommended citation: Xinchi Chen, Zhan Shi, Xipeng Qiu, Xuanjing Huang: Adversarial Multi-Criteria Learning for Chinese Word Segmentation. ACL (1) 2017: 1193-1203 http://xuanjing-huang.github.io/files/cws.pdf
Published in Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics, 2017
The paper proposes an adversarial multi-task learning framework that prevents the shared and private latent feature spaces from interfering with each other.
Recommended citation: Pengfei Liu, Xipeng Qiu, Xuanjing Huang: Adversarial Multi-task Learning for Text Classification. ACL (1) 2017: 1-10 http://xuanjing-huang.github.io/files/AMT.pdf
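The adversarial ingredient can be sketched with a gradient-reversal layer: a discriminator tries to identify the source task from shared features, while the reversed gradients push the shared encoder toward task-invariant features (an illustrative mechanism sketch, not the paper's exact training procedure):

```python
import torch

class GradReverse(torch.autograd.Function):
    """Gradient reversal: identity on the forward pass, negated gradient
    on the backward pass. Training a task discriminator through this
    layer drives the shared encoder toward task-invariant features."""
    @staticmethod
    def forward(ctx, x, lambd=1.0):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None

def grad_reverse(x, lambd=1.0):
    return GradReverse.apply(x, lambd)
```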
Published in Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence, 2016
We propose three different mechanisms of sharing information to model text with task-specific and shared layers.
Recommended citation: Pengfei Liu, Xipeng Qiu, Xuanjing Huang: Recurrent Neural Network for Text Classification with Multi-Task Learning. IJCAI 2016: 2873-2879 http://xuanjing-huang.github.io/files/RNN.pdf
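One of the three sharing mechanisms, a shared recurrent layer alongside task-specific ones, can be sketched as follows (a minimal PyTorch sketch with assumed sizes; the paper's architectures differ in detail):

```python
import torch
import torch.nn as nn

class SharedLayerTextRNN(nn.Module):
    """Sketch of a shared-layer scheme: a shared LSTM captures
    task-invariant features, a private LSTM per task captures
    task-specific ones; their final states are concatenated."""
    def __init__(self, vocab, emb=128, hid=128, n_tasks=2, n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.shared = nn.LSTM(emb, hid, batch_first=True)
        self.private = nn.ModuleList(
            [nn.LSTM(emb, hid, batch_first=True) for _ in range(n_tasks)])
        self.heads = nn.ModuleList(
            [nn.Linear(2 * hid, n_classes) for _ in range(n_tasks)])

    def forward(self, x, task):
        e = self.embed(x)
        _, (hs, _) = self.shared(e)          # task-invariant final state
        _, (hp, _) = self.private[task](e)   # task-specific final state
        return self.heads[task](torch.cat([hs[-1], hp[-1]], dim=-1))
```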