Zhangyin Feng
Harbin Institute of Technology (CN)
Research Areas
Topic Modeling, Natural Language Processing Techniques, Multimodal Machine Learning Applications, Text Readability and Simplification, Software Engineering Research
Most-Cited Works
- CodeBERT: A Pre-Trained Model for Programming and Natural Languages (2020), 2,312 citations
- A Survey on Hallucination in Large Language Models: Principles, Taxonomy, Challenges, and Open Questions (2024), 1,154 citations
- GraphCodeBERT: Pre-training Code Representations with Data Flow (2021)
- Improving Low Resource Named Entity Recognition using Cross-lingual Knowledge Transfer (2018), 59 citations
- LogicalFactChecker: Leveraging Logical Operations for Fact Checking with Graph Module Network (2020), 50 citations
- Retrieval-Generation Synergy Augmented Large Language Models (2024), 31 citations
- “Is Whole Word Masking Always Better for Chinese BERT?”: Probing on Chinese Grammatical Error Correction (2022), 12 citations