Leveraging Unlabeled Corpus for Arabic Dialect Identification
Authors: Mohammed Abdelmajeed, Jiangbin Zheng, Ahmed Murtadha, Youcef Nafa, Mohammed Abaker, Muhammad Pervez Akhter. 《Computers, Materials & Continua》, 2025, Issue 5, pp. 3471-3491 (21 pages)
Arabic Dialect Identification (DID) is a task in Natural Language Processing (NLP) that involves determining the dialect of a given piece of Arabic text. State-of-the-art solutions for DID are built on deep neural networks that learn sentence representations in response to a given dialect. Despite their effectiveness, the performance of these solutions relies heavily on the amount of labeled examples, which are labor-intensive to attain and may not be readily available in real-world scenarios. To alleviate the burden of labeling data, this paper introduces a novel solution that leverages unlabeled corpora to boost performance on the DID task. Specifically, we design an architecture that enables learning the shared information between labeled and unlabeled texts through a gradient reversal layer. The key idea is to penalize the model for learning source-dataset-specific features and thus enable it to capture common knowledge regardless of the label. Finally, we evaluate the proposed solution on benchmark datasets for DID. Our extensive experiments show that it performs significantly better, especially with sparse labeled data. By comparing our approach with existing Pre-trained Language Models (PLMs), we achieve a new state-of-the-art performance in the DID field. The code will be available on GitHub upon the paper's acceptance.
Keywords: Arabic dialect identification; natural language processing; bidirectional encoder representations from transformers; pre-trained language models; gradient reversal layer
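
The paper's code is not yet released, so the abstract's central mechanism can only be illustrated generically. Below is a minimal PyTorch sketch of a gradient reversal layer attached to a binary labeled-vs-unlabeled discriminator, in the spirit of domain-adversarial training: the forward pass is the identity, while the backward pass flips (and scales) the gradient so the shared encoder is pushed toward features that do not reveal which corpus a sentence came from. The class names, the hidden size of 768 (typical of BERT-style encoders), and the discriminator architecture are illustrative assumptions, not the authors' implementation.

```python
import torch
from torch import nn


class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; reverses and scales the gradient in the backward pass."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Flip the gradient sign so the upstream encoder is trained to
        # *confuse* the discriminator, encouraging corpus-agnostic features.
        return grad_output.neg() * ctx.lambd, None


class DomainDiscriminator(nn.Module):
    """Predicts whether a sentence representation comes from the labeled or the unlabeled corpus."""

    def __init__(self, hidden_size: int = 768, lambd: float = 1.0):
        super().__init__()
        self.lambd = lambd  # strength of the reversed gradient (assumed hyperparameter)
        self.classifier = nn.Sequential(
            nn.Linear(hidden_size, 256),
            nn.ReLU(),
            nn.Linear(256, 2),  # two "domains": labeled vs. unlabeled
        )

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        reversed_features = GradReverse.apply(features, self.lambd)
        return self.classifier(reversed_features)


# Usage sketch: encoder outputs feed both the dialect classifier (labeled data only)
# and this discriminator (labeled + unlabeled data); both losses are summed, and the
# reversed gradient penalizes source-dataset-specific features in the encoder.
if __name__ == "__main__":
    features = torch.randn(8, 768)  # e.g., [CLS] embeddings from a shared encoder
    logits = DomainDiscriminator()(features)
    print(logits.shape)  # torch.Size([8, 2])
```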