Journal Articles
2 articles found
An Optimized Customer Churn Prediction Approach Based on Regularized Bidirectional Long Short-Term Memory Model
Author: Adel Saad Assiri. Computers, Materials & Continua, 2026, Issue 1, pp. 1783-1803 (21 pages)
Customer churn is the rate at which customers discontinue doing business with a company over a given time period. It is an essential measure for businesses to monitor, as high churn rates often indicate underlying issues with services, products, or customer experience, resulting in considerable income loss. Prediction of customer churn is a crucial task aimed at retaining customers and maintaining revenue growth. Traditional machine learning (ML) models often struggle to capture complex temporal dependencies in client behavior data. To address this, an optimized deep learning (DL) approach using a Regularized Bidirectional Long Short-Term Memory (RBiLSTM) model is proposed to mitigate overfitting and reduce generalization error. The model integrates dropout, L2 regularization, and early stopping to enhance predictive accuracy while preventing over-reliance on specific patterns. Moreover, this study investigates the effect of optimization techniques on boosting the training efficiency of the developed model. Experimental results on a recent public customer churn dataset demonstrate that the trained model outperforms traditional ML models and other DL models, such as Long Short-Term Memory (LSTM) and Deep Neural Network (DNN), in churn prediction performance and stability. The proposed approach achieves 96.1% accuracy, compared with LSTM and DNN, which attain 94.5% and 94.1% accuracy, respectively. These results confirm that the proposed approach can serve as a valuable tool for businesses to proactively identify at-risk consumers and implement targeted retention strategies.
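The regularization recipe the abstract names (dropout, L2 regularization, early stopping) is framework-agnostic. The following minimal pure-Python sketch illustrates the early-stopping and L2-penalty logic only; all function names, weights, and thresholds are illustrative assumptions, not the paper's code:

```python
# Minimal sketch of two regularizers named in the abstract.
# Placeholder logic for illustration; not the paper's implementation.

def l2_penalty(weights, lam=1e-4):
    """L2 regularization term added to the training loss."""
    return lam * sum(w * w for w in weights)

def train_with_early_stopping(val_losses, patience=3):
    """Stop once validation loss has not improved for `patience` epochs.
    Returns the epoch index of the best (lowest) validation loss."""
    best_epoch, best_loss, waited = 0, float("inf"), 0
    for epoch, loss in enumerate(val_losses):
        if loss < best_loss:
            best_epoch, best_loss, waited = epoch, loss, 0
        else:
            waited += 1
            if waited >= patience:
                break  # a real trainer would restore weights from best_epoch
    return best_epoch

# Example: losses improve, then plateau; training halts after 3 bad epochs.
losses = [0.9, 0.7, 0.6, 0.61, 0.62, 0.63, 0.64]
print(train_with_early_stopping(losses))  # → 2
```

In a typical setup the L2 term is added to the loss each step, while the early-stopping check runs once per epoch on held-out validation data.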
Keywords: customer churn prediction, deep learning, RBiLSTM, dropout, baseline models
DeBERTa-GRU: Sentiment Analysis for Large Language Model
Authors: Adel Assiri, Abdu Gumaei, Faisal Mehmood, Touqeer Abbas, Sami Ullah. Computers, Materials & Continua (SCIE, EI), 2024, Issue 6, pp. 4219-4236 (18 pages)
Modern technological advancements have made social media an essential component of daily life. Social media allow individuals to share thoughts, emotions, and ideas. Sentiment analysis serves to evaluate whether the sentiment of a text is positive, negative, neutral, or some other personal emotion, in order to understand its sentiment context. Sentiment analysis is essential in business and society because it impacts strategic decision-making. Sentiment analysis involves challenges due to lexical variation, unlabeled datasets, and long-distance correlations in text. Execution time increases with the sequential processing of sequence models, whereas computation times for Transformer models are reduced by parallel processing. This study uses a hybrid deep learning strategy to combine the strengths of Transformer and sequence models while mitigating their limitations. In particular, the proposed model integrates Decoding-enhanced BERT with disentangled attention (DeBERTa) and the Gated Recurrent Unit (GRU) for sentiment analysis. Using the Decoding-enhanced BERT technique, words are mapped into a compact, semantic word-embedding space, and the Gated Recurrent Unit model can correctly capture long-distance contextual semantics. The proposed hybrid model achieves an F1-score of 97% on the Twitter Large Language Model (LLM) dataset, which is much higher than the performance of newer techniques.
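The GRU component the abstract describes captures sequential context through gated state updates. As a minimal illustration of that mechanism (not the paper's model, which operates on DeBERTa embeddings), here is a single scalar GRU cell step in pure Python; all weight values are illustrative, not trained parameters:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_step(x, h, w):
    """One scalar GRU update (Cho et al. formulation).
    x: current input, h: previous hidden state, w: dict of scalar weights."""
    z = sigmoid(w["Wz"] * x + w["Uz"] * h + w["bz"])                # update gate
    r = sigmoid(w["Wr"] * x + w["Ur"] * h + w["br"])                # reset gate
    h_tilde = math.tanh(w["Wh"] * x + w["Uh"] * (r * h) + w["bh"])  # candidate state
    return (1.0 - z) * h + z * h_tilde                              # interpolate old/new

# Illustrative weights, not trained values.
w = {"Wz": 0.5, "Uz": 0.5, "bz": 0.0,
     "Wr": 0.5, "Ur": 0.5, "br": 0.0,
     "Wh": 1.0, "Uh": 1.0, "bh": 0.0}

h = 0.0
for x in [1.0, -0.5, 0.3]:  # a tiny "sequence" of embedding-like scalars
    h = gru_step(x, h, w)
print(-1.0 < h < 1.0)  # → True: tanh keeps the hidden state bounded
```

In the hybrid architecture described above, the inputs at each step would be DeBERTa embedding vectors rather than scalars, and the update would be vectorized, but the gating logic is the same.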
Keywords: DeBERTa, GRU, Naive Bayes, LSTM, sentiment analysis, large language model