Journal Articles
9 articles found
1. Optimizing Airline Review Sentiment Analysis: A Comparative Analysis of LLaMA and BERT Models through Fine-Tuning and Few-Shot Learning
Authors: Konstantinos I. Roumeliotis, Nikolaos D. Tselikas, Dimitrios K. Nasiopoulos. Computers, Materials & Continua, 2025, No. 2, pp. 2769-2792 (24 pages)
In the rapidly evolving landscape of natural language processing (NLP) and sentiment analysis, improving the accuracy and efficiency of sentiment classification models is crucial. This paper investigates the performance of two advanced models, the Large Language Model (LLM) LLaMA and the NLP model BERT, in the context of airline review sentiment analysis. Through fine-tuning, domain adaptation, and the application of few-shot learning, the study addresses the subtleties of sentiment expressions in airline-related text data. Employing predictive modeling and comparative analysis, the research evaluates the effectiveness of Large Language Model Meta AI (LLaMA) and Bidirectional Encoder Representations from Transformers (BERT) in capturing sentiment intricacies. Fine-tuning, including domain adaptation, enhances the models' performance in sentiment classification tasks. Additionally, the study explores the potential of few-shot learning to improve model generalization using minimal annotated data for targeted sentiment analysis. By conducting experiments on a diverse airline review dataset, the research quantifies the impact of fine-tuning, domain adaptation, and few-shot learning on model performance, providing valuable insights for industries aiming to predict recommendations and enhance customer satisfaction through a deeper understanding of sentiment in user-generated content (UGC). This research contributes to refining sentiment analysis models, ultimately fostering improved customer satisfaction in the airline industry.
Keywords: sentiment classification; review sentiment analysis; user-generated content; domain adaptation; customer satisfaction; LLaMA model; BERT model; airline reviews; LLM classification; fine-tuning
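The few-shot learning the abstract describes amounts to supplying a handful of labeled examples in the model's input at inference time. A minimal sketch of prompt construction follows; the review texts and label set are hypothetical, not the paper's actual data:

```python
# Minimal sketch of a few-shot prompt for airline review sentiment
# classification. Examples and labels are invented for illustration;
# the paper's actual prompts are not given in the abstract.

FEW_SHOT_EXAMPLES = [
    ("The crew was friendly and boarding was quick.", "positive"),
    ("Flight delayed three hours with no updates.", "negative"),
]

def build_prompt(review: str) -> str:
    """Assemble a few-shot prompt: labeled examples followed by the query."""
    lines = ["Classify the sentiment of each airline review as positive or negative."]
    for text, label in FEW_SHOT_EXAMPLES:
        lines.append(f"Review: {text}\nSentiment: {label}")
    lines.append(f"Review: {review}\nSentiment:")
    return "\n\n".join(lines)

prompt = build_prompt("Seats were cramped and the food was cold.")
```

The prompt ends with an open `Sentiment:` slot, so the model's next tokens serve directly as the predicted label.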
2. Enhancing Multi-Class Cyberbullying Classification with Hybrid Feature Extraction and Transformer-Based Models
Authors: Suliman Mohamed Fati, Mohammed A. Mahdi, Mohamed A. G. Hazber, Shahanawaj Ahamad, Sawsan A. Saad, Mohammed Gamal Ragab, Mohammed Al-Shalabi. Computer Modeling in Engineering & Sciences, 2025, No. 5, pp. 2109-2131 (23 pages)
Cyberbullying on social media poses significant psychological risks, yet most detection systems oversimplify the task by focusing on binary classification, ignoring nuanced categories like passive-aggressive remarks or indirect slurs. To address this gap, we propose a hybrid framework combining Term Frequency-Inverse Document Frequency (TF-IDF), word-to-vector (Word2Vec), and Bidirectional Encoder Representations from Transformers (BERT) based models for multi-class cyberbullying detection. Our approach integrates TF-IDF for lexical specificity and Word2Vec for semantic relationships, fused with BERT's contextual embeddings to capture syntactic and semantic complexities. We evaluate the framework on a publicly available dataset of 47,000 annotated social media posts across five cyberbullying categories: age, ethnicity, gender, religion, and indirect aggression. Among the BERT variants tested, BERT Base Uncased achieved the highest performance with 93% accuracy (±1% standard deviation across 5-fold cross-validation) and an average AUC of 0.96, outperforming standalone TF-IDF (78%) and Word2Vec (82%) models. Notably, it achieved near-perfect AUC scores (0.99) for age- and ethnicity-based bullying. A comparative analysis with state-of-the-art benchmarks, including Generative Pre-trained Transformer 2 (GPT-2) and Text-to-Text Transfer Transformer (T5) models, highlights BERT's superiority in handling ambiguous language. This work advances cyberbullying detection by demonstrating how hybrid feature extraction and transformer models improve multi-class classification, offering a scalable solution for moderating nuanced harmful content.
Keywords: cyberbullying classification; multi-class classification; BERT models; machine learning; TF-IDF; Word2Vec; social media analysis; transformer models
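The hybrid feature idea in the abstract is to fuse a sparse lexical representation with a dense embedding before classification. A minimal sketch under stated assumptions (toy TF-IDF computed from scratch, a placeholder list standing in for a contextual embedding):

```python
# Sketch of hybrid feature fusion: concatenate a TF-IDF-style lexical
# vector with a dense embedding vector. The dense vector here is a toy
# placeholder, not an actual BERT output.
import math

def tf_idf(docs):
    """Compute simple TF-IDF vectors over a shared vocabulary."""
    vocab = sorted({w for d in docs for w in d.split()})
    n = len(docs)
    df = {w: sum(w in d.split() for d in docs) for w in vocab}
    vectors = []
    for d in docs:
        words = d.split()
        vec = [(words.count(w) / len(words)) * math.log(n / df[w]) for w in vocab]
        vectors.append(vec)
    return vectors, vocab

def fuse(lexical_vec, dense_vec):
    """Fuse lexical and dense features by concatenation."""
    return list(lexical_vec) + list(dense_vec)

docs = ["you are so dumb", "have a nice day", "nobody likes you"]
tfidf_vecs, vocab = tf_idf(docs)
dense = [0.1, -0.3, 0.7]           # stand-in for a contextual embedding
joint = fuse(tfidf_vecs[0], dense)
```

The concatenated `joint` vector would then feed a downstream classifier; the paper's actual fusion with BERT embeddings is higher-dimensional but follows the same shape.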
3. Ontology-Based BERT Model for Automated Information Extraction from Geological Hazard Reports (Cited by 5)
Authors: Kai Ma, Miao Tian, Yongjian Tan, Qinjun Qiu, Zhong Xie, Rong Huang. Journal of Earth Science (SCIE, CAS, CSCD), 2023, No. 5, pp. 1390-1405 (16 pages)
Geological knowledge can provide support for knowledge discovery, knowledge inference and mineralization predictions over geological big data. Entity identification and relationship extraction from geological description text are the key links for constructing knowledge graphs. Given the lack of publicly annotated datasets in the geology domain, this paper illustrates the construction process of geological entity datasets, defines the types of entities and interconceptual relationships by using the geological entity concept system, and completes the construction of the geological corpus. To address the shortcomings of existing language models (such as Word2Vec and GloVe), which cannot handle polysemous words and fuse context poorly, we propose a geological named entity recognition and relationship extraction model built on the Bidirectional Encoder Representations from Transformers (BERT) pretrained language model. To effectively represent the text features, we construct a BERT-bidirectional gated recurrent unit (BiGRU)-conditional random field (CRF) architecture to extract the named entities and a BERT-BiGRU-Attention architecture to extract the entity relations. The results show that the F1-score of the BERT-BiGRU-CRF named entity recognition model is 0.91 and the F1-score of the BERT-BiGRU-Attention relationship extraction model is 0.84, which are significant performance improvements when compared to classic language models (e.g., Word2Vec and Embeddings from Language Models (ELMo)).
Keywords: ontology; BERT model; named entity recognition; relation extraction; knowledge graph
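A tagger such as the BERT-BiGRU-CRF model described here typically emits BIO tags per token, which a final step decodes into entity spans. A minimal sketch of that decoding step; the tokens and tag set below are invented for illustration:

```python
# Sketch of decoding BIO tags (the usual output of a CRF tagging layer)
# into entity spans. Tags and tokens are illustrative, not the paper's.

def bio_to_spans(tokens, tags):
    """Collect (entity_type, text) spans from BIO-tagged tokens."""
    spans, current = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current:
                spans.append(current)
            current = (tag[2:], [token])
        elif tag.startswith("I-") and current and current[0] == tag[2:]:
            current[1].append(token)
        else:
            if current:
                spans.append(current)
            current = None
    if current:
        spans.append(current)
    return [(etype, " ".join(words)) for etype, words in spans]

tokens = ["Landslide", "near", "Three", "Gorges", "Reservoir"]
tags = ["B-HAZ", "O", "B-LOC", "I-LOC", "I-LOC"]
spans = bio_to_spans(tokens, tags)
# spans == [("HAZ", "Landslide"), ("LOC", "Three Gorges Reservoir")]
```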
4. End-to-End Chinese Entity Recognition Based on BERT-BiLSTM-ATT-CRF (Cited by 3)
Authors: LI Daiyi, TU Yaofeng, ZHOU Xiangsheng, ZHANG Yangming, MA Zongmin. ZTE Communications, 2022, No. S01, pp. 27-35 (9 pages)
Traditional named entity recognition methods need professional domain knowledge and a large amount of human participation to extract features, while Chinese named entity recognition methods based on neural network models suffer from overly singular vector representations in the process of character vector representation. To solve these problems, we propose a Chinese named entity recognition method based on the BERT-BiLSTM-ATT-CRF model. Firstly, we use the bidirectional encoder representations from transformers (BERT) pre-trained language model to obtain the semantic vector of each word according to its context; secondly, the word vectors trained by BERT are input into a bidirectional long short-term memory network embedded with an attention mechanism (BiLSTM-ATT) to capture the most important semantic information in the sentence; finally, a conditional random field (CRF) is used to learn the dependence between adjacent tags to obtain the globally optimal sentence-level tag sequence. The experimental results show that the proposed model achieves state-of-the-art performance on both the Microsoft Research Asia (MSRA) corpus and the People's Daily corpus, with F1 values of 94.77% and 95.97%, respectively.
Keywords: named entity recognition (NER); feature extraction; BERT model; BiLSTM; attention mechanism; CRF
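The F1 values reported above are span-level scores: a prediction counts only if both the span boundaries and the entity type match the gold annotation. A minimal sketch of that metric, with made-up gold and predicted spans:

```python
# Sketch of span-level F1, the metric typically reported for NER on the
# MSRA and People's Daily corpora. Spans are (start, end, type) tuples;
# the example spans are invented.

def span_f1(gold, pred):
    """Compute F1 over sets of entity spans (exact boundary + type match)."""
    gold, pred = set(gold), set(pred)
    tp = len(gold & pred)
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

gold = {(0, 2, "PER"), (5, 7, "LOC"), (9, 10, "ORG")}
pred = {(0, 2, "PER"), (5, 7, "LOC"), (11, 12, "ORG")}
f1 = span_f1(gold, pred)   # 2 of 3 correct on both sides -> F1 = 2/3
```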
5. Classification of Acupuncture Points Based on the BERT Model
Authors: Xi Zhong, Yangli Jia, Dekui Li, Xiangliang Zhang. Journal of Data Analysis and Information Processing, 2021, No. 3, pp. 123-135 (13 pages)
In this paper, we explore the multi-classification problem of acupuncture acupoints based on the BERT model, i.e., we try to recommend the best main acupuncture point for treating a disease by classifying and predicting the main acupuncture point for the disease, and further explore its acupuncture point grouping to provide the medical practitioner with the optimal solution for treating the disease and improving clinical decision-making ability. The Bert-Chinese-Acupoint model was constructed by retraining on the basis of the BERT model; semantic features concerning acupuncture points were added to the acupuncture point corpus in the fine-tuning process, and the model was compared with machine learning methods. The results show that the Bert-Chinese-Acupoint model proposed in this paper achieves a 3% improvement in accuracy compared to the best performing model among the machine learning approaches.
Keywords: BERT model; machine learning; classification; model comparison
6. Bidirectional Recurrent Nets for ECG Signal Compression
Authors: Eman AL-Saidi, Khalil El Hindi. Journal of Computer Science Research, 2022, No. 4, pp. 15-25 (11 pages)
Electrocardiogram (ECG) is a commonly used tool in the biological diagnosis of heart diseases. ECG allows the representation of the electrical signals that cause heart muscles to contract and relax. Recently, accurate deep learning methods have been developed to overcome manual diagnosis in terms of time and effort. However, most current automatic medical diagnosis uses long electrocardiogram (ECG) signals to inspect different types of heart arrhythmia. Therefore, ECG signal files tend to require large storage and may cause significant overhead when exchanged over a computer network. This raises the need for effective compression methods for ECG signals. In this work, the authors investigate using the BERT (Bidirectional Encoder Representations from Transformers) model, a bidirectional neural network originally designed for natural language. The authors evaluate the model with respect to its compression ratio and information preservation, measuring information preservation in terms of the accuracy of a convolutional neural network in classifying the decompressed signal. The results show that the method can achieve up to 83% saving in storage. Also, the classification accuracy of the decompressed signals is around 92.41%. Furthermore, the method enables the user to balance the compression ratio against the required accuracy of the CNN classifiers.
Keywords: BERT model; convolutional neural networks (CNN); data compression; deep learning; ECG diagnosis
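The "83% saving in storage" figure is the standard space-saving measure: one minus the ratio of compressed to original size. A quick sketch with invented file sizes:

```python
# Sketch of the storage-saving computation behind the abstract's 83%
# figure: saving = 1 - compressed_size / original_size.
# The byte counts below are invented for illustration.

def storage_saving(original_bytes: int, compressed_bytes: int) -> float:
    """Fraction of storage saved by compression."""
    return 1.0 - compressed_bytes / original_bytes

saving = storage_saving(100_000, 17_000)   # -> 0.83, i.e. 83% saved
```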
7. Analysis of Patents Related to COVID-19 Based on a Patent Clustering Model in Specific Fields
Authors: Fu Nan, Li Qian, Yuan Hongmei. Asian Journal of Social Pharmacy, 2024, No. 4, pp. 371-382 (12 pages)
Objective: To improve the efficiency of patent clustering related to COVID-19 through a topic extraction algorithm and the BERT model, and to help researchers understand patent applications for the novel coronavirus. Methods: The weights of the topic vector and the BERT model vector were adjusted by a cross-entropy loss algorithm to obtain a joint vector. Then, the k-means++ algorithm was used for patent clustering after dimension reduction. Results and Conclusion: The model was applied to patents for coronavirus drugs, and five clustering topics were generated. Through comparison, it is shown that the clustering results of this model are more centralized and the differentiation between clusters is significant. The five clusters generated are visually analyzed to reveal the development status of patents for coronavirus drugs.
Keywords: coronavirus; patent clustering; patent analysis; BERT model
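The joint vector described in Methods is a weighted combination of a topic vector and a document embedding. A minimal sketch with a fixed weight standing in for the cross-entropy-tuned one; the vectors are toy values:

```python
# Sketch of the joint-vector idea: a weighted element-wise combination
# of a topic vector and a BERT-style document vector. The weight alpha
# is a fixed stand-in; the paper tunes it via a cross-entropy loss.

def joint_vector(topic_vec, bert_vec, alpha=0.4):
    """Weighted element-wise fusion of two equal-length vectors."""
    assert len(topic_vec) == len(bert_vec)
    return [alpha * t + (1 - alpha) * b for t, b in zip(topic_vec, bert_vec)]

v = joint_vector([1.0, 0.0], [0.0, 1.0], alpha=0.4)
```

The fused vectors would then go through dimension reduction and k-means++ clustering, as the abstract outlines.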
8. Semantic and secure search over encrypted outsourcing cloud based on BERT (Cited by 2)
Authors: Zhangjie FU, Yan WANG, Xingming SUN, Xiaosong ZHANG. Frontiers of Computer Science (SCIE, EI, CSCD), 2022, No. 2, pp. 152-159 (8 pages)
Searchable encryption provides an effective way to ensure data security and privacy in cloud storage. Users can retrieve encrypted data in the cloud while protecting their own data security and privacy. However, most current content-based retrieval schemes do not contain enough semantic information about the article and cannot fully reflect the semantic information of the text. In this paper, we propose two secure and semantic retrieval schemes based on BERT (bidirectional encoder representations from transformers), named SSRB-1 and SSRB-2. By training the documents with BERT, the keyword vector is generated to contain more semantic information about the documents, which improves the accuracy of retrieval and makes the retrieval results more consistent with the user's intention. Finally, through testing on real datasets, it is shown that both of our solutions are feasible and effective.
Keywords: cloud computing; semantic search; BERT model; searchable encryption
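Semantic retrieval schemes of this kind ultimately rank documents by similarity between a query embedding and document embeddings. A minimal sketch of cosine-similarity ranking; the 2-dimensional vectors are toy values, not BERT outputs, and the encryption layer is out of scope here:

```python
# Sketch of ranking documents by cosine similarity between a query
# vector and document vectors - the plaintext retrieval step such
# schemes build on. Vectors here are toy 2-d values, not BERT outputs.
import math

def cosine(a, b):
    """Cosine similarity of two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def rank(query_vec, doc_vecs):
    """Return document indices sorted by descending similarity."""
    sims = [(cosine(query_vec, d), i) for i, d in enumerate(doc_vecs)]
    return [i for _, i in sorted(sims, reverse=True)]

docs = [[0.9, 0.1], [0.1, 0.9], [0.7, 0.7]]
order = rank([1.0, 0.0], docs)   # -> [0, 2, 1]
```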
9. Automatic Generation of Graduation Thesis Comments Based on Multilevel Analysis
Authors: Yiwen Zhu, Zhaoyi Li, Yanzi Li, Yanqing Wang. 国际计算机前沿大会会议论文集 (ICFC conference proceedings), 2022, No. 1, pp. 67-79 (13 pages)
In the evaluation of graduation theses, teachers' evaluation criteria are inconsistent, subjective, and not completely reasonable or fair. This paper proposes using the BERT model to analyze existing graduation papers in colleges and universities and to quantitatively evaluate students' graduation projects according to the given relevant parameters. The purpose of this method is to use standards to make comprehensive, systematic and accurate evaluations and to avoid the high repetition and similarity caused by the large volume of teachers' comments. This can not only effectively improve the efficiency of graduation design evaluation but also improve the fairness of evaluation. Changing thesis review from purely manual operation to machine review combined with manual operation reduces manpower consumption and makes the review more objective and fair than traditional subjective review.
Keywords: multilevel analysis; automatic generation of review comments; BERT model