Most Cited Articles

Open access

ISSN: 2666-6510

Graph neural networks: A review of methods and applications

Pre-trained models: Past, present and future

A survey of transformers

PTR: Prompt Tuning with Rules for Text Classification

Data augmentation approaches in natural language processing: A survey

Advances and challenges in conversational recommender systems: A survey

Lawformer: A pre-trained language model for Chinese legal long documents

Neural machine translation: A review of methods, resources, and tools

A comprehensive survey of entity alignment for knowledge graphs

CPM: A large-scale generative Chinese Pre-trained language model

Neural, symbolic and neural-symbolic reasoning on knowledge graphs

Deep learning for fake news detection: A comprehensive survey

CPM-2: Large-scale cost-effective pre-trained language models

Sarcasm detection using news headlines dataset

A survey on heterogeneous information network based recommender systems: Concepts, methods, applications and resources

WuDaoCorpora: A super large-scale Chinese corpora for pre-training language models

CokeBERT: Contextual knowledge selection and embedding towards enhanced pre-trained language models

Extracting Events and Their Relations from Texts: A Survey on Recent Research Progress and Challenges

A comprehensive review on resolving ambiguities in natural language processing

Survey: Transformer based video-language pre-training

Learning towards conversational AI: A survey

Network representation learning: A macro and micro view

User behavior modeling for Web search evaluation

Discrete and continuous representations and processing in deep learning: Looking forward

Know what you don't need: Single-Shot Meta-Pruning for attention heads
