Journal Articles
315 articles found
1. Computerized Detection of Limbal Stem Cell Deficiency from Digital Cornea Images
Authors: Hanan A. Hosni Mahmoud, Doaa S. Khafga, Amal H. Alharbi. Computer Systems Science & Engineering (SCIE, EI), 2022, No. 2, pp. 805-821 (17 pages).
Limbal Stem Cell Deficiency (LSCD) is an eye disease that can cause corneal opacity and vascularization. In its advanced stage it can lead to a degree of visual impairment. It involves a change in the semispherical shape of the cornea to a downward-drooping shape. LSCD is hard to diagnose at early stages. The color and texture of the cornea surface can provide significant information about a cornea affected by LSCD. Parameters such as shape and texture are crucial for differentiating a normal cornea from an LSCD cornea. Although several medical approaches exist, most of them require complicated procedures and medical devices. Therefore, in this paper, we pursued the development of an LSCD detection technique (LDT) utilizing image processing methods. Early diagnosis of LSCD is crucial for physicians to arrange effective treatment. In the proposed technique, we developed a method for LSCD detection utilizing frontal eye images. A dataset of 280 frontal and lateral eye images of LSCD and normal patients was used in this research. First, the cornea region of both frontal and lateral images is segmented, and the geometric features are extracted through an automated active contour model and a spline curve, while the texture features are extracted using a feature selection algorithm. The experimental results showed that the combined geometric and texture features achieve an accuracy of 95.95%, sensitivity of 97.91%, and specificity of 94.05% with a random forest classifier (n = 40). As a result, this research developed a limbal stem cell deficiency detection system utilizing feature fusion with image processing techniques for frontal and lateral digital eye images.
Keywords: feature extraction; corneal opacity; geometric features; computerized detection; image processing
2. Multi-Step Clustering of Smart Meters Time Series: Application to Demand Flexibility Characterization of SME Customers
Authors: Santiago Bañales, Raquel Dormido, Natividad Duro. Computer Modeling in Engineering & Sciences (SCIE, EI), 2025, No. 1, pp. 869-907 (39 pages).
Customer segmentation according to load-shape profiles using smart meter data is an increasingly important application, vital to the planning and operation of energy systems and to enabling citizens' participation in the energy transition. This study proposes an innovative multi-step clustering procedure to segment customers based on load-shape patterns at the daily and intra-daily time horizons. Smart meter data are split into daily and hourly normalized time series to assess monthly, weekly, daily, and hourly seasonality patterns separately. The dimensionality reduction implicit in the splitting allows a direct approach to clustering raw daily energy time series data. The intraday clustering procedure sequentially identifies representative hourly day-unit profiles for each customer and for the entire population. For the first time, a step-function approach is applied to reduce time series dimensionality. Customer attributes embedded in surveys are employed to build external clustering validation metrics using Cramer's V correlation factors and to identify statistically significant determinants of load shape in energy usage. In addition, a time series feature engineering approach is used to extract 16 relevant demand flexibility indicators that characterize customers and corresponding clusters along four different axes: available Energy (E), Temporal patterns (T), Consistency (C), and Variability (V). The methodology is implemented on a real-world electricity consumption dataset of 325 Small and Medium-sized Enterprise (SME) customers, identifying 4 daily and 6 hourly easy-to-interpret, well-defined clusters. The application of the methodology includes selecting key parameters via grid search and a thorough comparison of clustering distances and methods to ensure the robustness of the results. Further research can test the scalability of the methodology to larger datasets from various customer segments (households and large commercial) and locations with different weather and socioeconomic conditions.
Keywords: electric load clustering; load profiling; smart meters; machine learning; data mining; demand flexibility; demand response
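The external cluster validation described in this abstract uses Cramer's V between cluster labels and surveyed customer attributes. As an illustration only (not the authors' code), Cramer's V can be computed from a contingency table in a few lines:

```python
from math import sqrt

def cramers_v(table):
    """Cramer's V association between two categorical variables.

    `table` is a contingency table (list of rows of counts);
    V = sqrt(chi2 / (n * (k - 1))) with k = min(#rows, #cols).
    """
    n = sum(sum(row) for row in table)
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / n   # expected count under independence
            chi2 += (obs - exp) ** 2 / exp
    k = min(len(row_tot), len(col_tot))
    return sqrt(chi2 / (n * (k - 1)))

# Perfect association between cluster label and a binary attribute -> V = 1
print(round(cramers_v([[20, 0], [0, 30]]), 3))    # 1.0
# No association -> V = 0
print(round(cramers_v([[10, 10], [10, 10]]), 3))  # 0.0
```

V ranges from 0 (independent) to 1 (perfectly associated), which makes it convenient for ranking which survey attributes best explain cluster membership.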
3. Harnessing Machine Learning for Superior Prediction of Uniaxial Compressive Strength in Reinforced Soilcrete
Authors: Ala'a R. Al-Shamasneh, Faten Khalid Karim, Arsalan Mahmoodzadeh. Computers, Materials & Continua, 2025, No. 7, pp. 281-303 (23 pages).
Soilcrete is a composite material of soil and cement that is highly valued in the construction industry. Accurate measurement of its mechanical properties is essential, but laboratory testing methods are expensive, time-consuming, and include inaccuracies. Machine learning (ML) algorithms provide a more efficient alternative for this purpose, so after assessment with a statistical extraction method, ML algorithms including a back-propagation neural network (BPNN), K-nearest neighbor (KNN), radial basis function (RBF), feed-forward neural networks (FFNN), and support vector regression (SVR) were proposed in this study for predicting the uniaxial compressive strength (UCS) of soilcrete. The developed models were optimized using gradient descent (GD) throughout the analysis (direct optimization for the neural networks and indirect optimization of the hyperparameters for the other models). After laboratory analysis, data pre-processing, and data-processing analysis, a database of 600 soilcrete specimens was gathered, covering two different soil types (clay and limestone) and metakaolin as a mineral additive. 80% of the database was used for the training set and 20% for testing, considering eight input parameters: metakaolin content, soil type, superplasticizer content, water-to-binder ratio, shrinkage, binder, density, and ultrasonic velocity. The analysis showed that most algorithms performed well in the prediction, with BPNN, KNN, and RBF having higher accuracy than the others (R^2 = 0.95, 0.95, and 0.92, respectively). Based on this evaluation, all models show an acceptable accuracy rate in prediction (RMSE: BPNN = 0.11, FFNN = 0.24, KNN = 0.05, SVR = 0.06, RBF = 0.05; MAD: BPNN = 0.006, FFNN = 0.012, KNN = 0.008, SVR = 0.006, RBF = 0.009). The ML importance-ranking sensitivity analysis indicated that all input parameters influence the UCS of soilcrete, especially the water-to-binder ratio and density, which have the most impact.
Keywords: soilcrete; laboratory analysis; uniaxial compressive strength; machine learning; sensitivity analysis
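The RMSE and MAD figures quoted above are simple error statistics. A minimal sketch, assuming MAD here denotes the mean absolute deviation of the prediction errors (the abstract does not define it):

```python
from math import sqrt

def rmse(y_true, y_pred):
    """Root mean square error: penalises large errors quadratically."""
    return sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def mad(y_true, y_pred):
    """Mean absolute deviation of prediction errors (assumed definition)."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical measured vs. predicted UCS values, for illustration only
y_true = [1.0, 2.0, 3.0, 4.0]
y_pred = [1.1, 1.9, 3.2, 3.8]
print(round(rmse(y_true, y_pred), 4))  # 0.1581
print(round(mad(y_true, y_pred), 4))   # 0.15
```

RMSE is always at least as large as MAD on the same errors, which is why tables like the one above often report both: their gap hints at whether a model makes a few large mistakes or many small ones.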
4. A Novel Malware Detection Framework for Internet of Things Applications
Authors: Muhammad Adil, Mona M. Jamjoom, Zahid Ullah. Computers, Materials & Continua, 2025, No. 9, pp. 4363-4380 (18 pages).
In today's digital world, the Internet of Things (IoT) plays an important role in both local and global economies due to its widespread adoption in different applications. This technology has the potential to offer several advantages over conventional technologies in the near future. However, its potential growth also attracts attention from hackers, which introduces new challenges for the research community, ranging from hardware and software security to user privacy and authentication. Therefore, we focus on a particular security concern: malware detection. The literature presents many countermeasures, but inconsistent results on identical datasets and algorithms raise concerns about model biases, training quality, and complexity. This highlights the need for an adaptive, real-time learning framework that can effectively mitigate malware threats in IoT applications. To address these challenges, (i) we propose an intelligent framework based on Two-Step Deep Reinforcement Learning (TwStDRL) that is capable of learning and adapting in real time to counter malware threats in IoT applications. This framework uses exploration and exploitation during both the training and testing phases by storing results in a replay memory. The stored knowledge allows the model to effectively navigate the environment and maximize cumulative rewards. (ii) To demonstrate the superiority of the TwStDRL framework, we implement and evaluate several machine learning algorithms for comparative analysis, including Support Vector Machines (SVM), Multi-Layer Perceptron, Random Forests, and k-means clustering. The selection of these algorithms is driven by the inconsistent results reported in the literature, which cast doubt on their robustness and reliability in real-world IoT deployments. (iii) Finally, we provide a comprehensive evaluation to justify why the TwStDRL framework outperforms them in mitigating security threats. During analysis, we noted that our proposed TwStDRL scheme achieves an average performance of 99.45% across accuracy, precision, recall, and F1-score, an absolute improvement of roughly 3% over existing malware-detection models.
Keywords: IoT applications; security; malware detection; advanced machine learning algorithms; data privacy challenges
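The TwStDRL framework is described only at a high level, so its architecture cannot be reproduced here. The following toy tabular Q-learning sketch merely illustrates the one mechanism the abstract does name: learning from transitions stored in a replay memory. The states, actions, and rewards below are invented for illustration.

```python
import random

def q_from_replay(memory, n_states, n_actions,
                  alpha=0.5, gamma=0.9, epochs=60, batch=4, seed=0):
    """Tabular Q-learning driven by samples from a replay memory.

    `memory` holds (state, action, reward, next_state) tuples; each epoch
    replays a random batch of stored transitions.
    """
    rng = random.Random(seed)
    q = [[0.0] * n_actions for _ in range(n_states)]
    for _ in range(epochs):
        for s, a, r, s2 in rng.sample(memory, min(batch, len(memory))):
            target = r + gamma * max(q[s2])      # bootstrap from next state
            q[s][a] += alpha * (target - q[s][a])
    return q

# Toy environment: in state 0, action 1 ("flag as malware") earns reward 1;
# everything else earns nothing.
memory = [(0, 1, 1.0, 1), (0, 0, 0.0, 1), (1, 0, 0.0, 1), (1, 1, 0.0, 1)]
q = q_from_replay(memory, n_states=2, n_actions=2)
print(q[0][1] > q[0][0])  # True: the rewarded action dominates
```

Replaying stored transitions lets the learner reuse each observed outcome many times, which is the property the abstract credits for effective navigation of the environment.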
5. Automated Gleason Grading of Prostate Cancer from Low-Resolution Histopathology Images Using an Ensemble Network of CNN and Transformer Models
Authors: Md Shakhawat Hossain, Md Sahilur Rahman, Munim Ahmed, Anowar Hussen, Zahid Ullah, Mona Jamjoom. Computers, Materials & Continua, 2025, No. 8, pp. 3193-3215 (23 pages).
One in every eight men in the US is diagnosed with prostate cancer, making it the most common cancer in men. Gleason grading is one of the most essential diagnostic and prognostic factors for planning the treatment of prostate cancer patients. Traditionally, urological pathologists perform the grading by scoring the morphological pattern, known as the Gleason pattern, in histopathology images. However, this manual grading is highly subjective, suffers from intra- and inter-pathologist variability, and lacks reproducibility. An automated grading system could be more efficient, with no subjectivity and higher accuracy and reproducibility. Automated methods presented previously failed to achieve sufficient accuracy, lacked reproducibility, and depended on high-resolution images such as 40x. This paper proposes an automated Gleason grading method, ProGENET, to accurately predict the grade using low-resolution images such as 10x. The method first divides the patient's histopathology whole slide image (WSI) into patches. It then detects artifacts and tissue-less regions and predicts the patch-wise grade using an ensemble network of CNN and transformer models. The proposed method adopts the International Society of Urological Pathology (ISUP) grading system and achieved 90.8% accuracy in classifying patches into healthy tissue and Gleason grades 1 through 5 using 10x WSI, outperforming the state-of-the-art accuracy by 27%. Finally, the patient's grade is determined by combining the patch-wise results. The method was also demonstrated for 4-class grading and binary classification of prostate cancer, achieving 93.0% and 99.6% accuracy, respectively. Reproducibility was over 90%. Since the proposed method determines grades with higher accuracy and reproducibility using low-resolution images, it is more reliable and effective than existing methods and can potentially improve subsequent therapy decisions.
Keywords: Gleason grading; prostate cancer; whole slide image; ensemble learning; digital pathology
6. Advances in Machine Learning for Explainable Intrusion Detection Using Imbalance Datasets in Cybersecurity with Harris Hawks Optimization
Authors: Amjad Rehman, Tanzila Saba, Mona M. Jamjoom, Shaha Al-Otaibi, Muhammad I. Khan. Computers, Materials & Continua, 2026, No. 1, pp. 1804-1818 (15 pages).
Modern intrusion detection systems (MIDS) face persistent challenges in coping with the rapid evolution of cyber threats, high-volume network traffic, and imbalanced datasets. Traditional models often lack the robustness and explainability required to detect novel and sophisticated attacks effectively. This study introduces an advanced, explainable machine learning framework for multi-class IDS using the KDD99 and IDS datasets, which reflect real-world network behavior through a blend of normal and diverse attack classes. The methodology begins with sophisticated data preprocessing, incorporating both RobustScaler and QuantileTransformer to address outliers and skewed feature distributions, ensuring standardized, model-ready inputs. Critical dimensionality reduction is achieved via the Harris Hawks Optimization (HHO) algorithm, a nature-inspired metaheuristic modeled on hawks' hunting strategies. HHO efficiently identifies the most informative features by optimizing a fitness function based on classification performance. Following feature selection, SMOTE is applied to the training data to resolve class imbalance by synthetically augmenting underrepresented attack types. A stacked architecture is then employed, combining the strengths of XGBoost, SVM, and RF as base learners. This layered approach improves prediction robustness and generalization by balancing bias and variance across diverse classifiers. The model was evaluated using standard classification metrics: precision, recall, F1-score, and overall accuracy. The best overall performance was an accuracy of 99.44% on UNSW-NB15, demonstrating the model's effectiveness. After balancing, the model demonstrated a clear improvement in detecting attacks. We tested the model on four datasets to show the effectiveness of the proposed approach and performed an ablation study to check the effect of each parameter. The proposed model is also computationally efficient. To support transparency and trust in decision-making, explainable AI (XAI) techniques are incorporated that provide both global and local insight into feature contributions and offer intuitive visualizations for individual predictions. This makes the framework suitable for practical deployment in cybersecurity environments that demand both precision and accountability.
Keywords: intrusion detection; XAI; machine learning; ensemble method; cybersecurity; imbalanced data
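SMOTE, applied above to the training data, rebalances classes by interpolating synthetic samples between a minority-class point and one of its nearest neighbours. A minimal pure-Python sketch of that core idea (illustrative only; real pipelines would typically use a library implementation such as imbalanced-learn):

```python
import random

def smote_like(minority, n_new, k=2, seed=0):
    """Generate synthetic minority samples by interpolating between a point
    and one of its k nearest neighbours (the core idea behind SMOTE)."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        # k nearest neighbours of x by squared Euclidean distance (excluding x)
        neigh = sorted((p for p in minority if p is not x),
                       key=lambda p: sum((a - b) ** 2 for a, b in zip(x, p)))[:k]
        nb = rng.choice(neigh)
        gap = rng.random()  # random point on the segment between x and nb
        synthetic.append(tuple(a + gap * (b - a) for a, b in zip(x, nb)))
    return synthetic

# Three hypothetical minority-class attack samples in a 2-D feature space
minority = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
new_pts = smote_like(minority, n_new=4)
print(len(new_pts))  # 4
```

Because each synthetic point lies on a segment between two real minority samples, the augmented set stays inside the region the minority class already occupies rather than introducing arbitrary noise.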
7. Quantitative Modelling of Multiphase Lithospheric Stretching and Deep Thermal History of Some Tertiary Rift Basins in Eastern China (cited 21 times)
Authors: Lin Changsong, Zhang Yanmei, Li Sitian, Liu Jingyan, Tong Zhigang, Ding Xiaozhong, Li Xichen. Acta Geologica Sinica (English Edition) (SCIE, CAS, CSCD), 2002, No. 3, pp. 324-330 (7 pages).
The stretching process of some Tertiary rift basins in eastern China is characterized by multiphase rifting. A multiple instantaneous uniform stretching model is proposed in this paper to simulate the formation of the basins, as the rifting process cannot be accurately described by a simple (one-episode) stretching model. The study shows that the multiphase stretching model, combined with the back-stripping technique, can be used to reconstruct the subsidence history and the stretching process of the lithosphere, and to evaluate the depth to the top of the asthenosphere and the deep thermal evolution of the basins. The calculated results obtained by applying the quantitative model to the episodic rifting process of the Tertiary Qiongdongnan and Yinggehai basins in the South China Sea are in agreement with geophysical data and geological observations. This provides a new method for quantitative evaluation of the geodynamic process of multiphase rifting occurring during the Tertiary in eastern China.
Keywords: multiphase rifting; quantitative model; Tertiary basins; eastern China
8. Machine Learning Empowered Security Management and Quality of Service Provision in SDN-NFV Environment (cited 8 times)
Authors: Shumaila Shahzadi, Fahad Ahmad, Asma Basharat, Madallah Alruwaili, Saad Alanazi, Mamoona Humayun, Muhammad Rizwan, Shahid Naseem. Computers, Materials & Continua (SCIE, EI), 2021, No. 3, pp. 2723-2749 (27 pages).
With the rising demand for data access, network service providers face the challenge of growing capital and operating costs while at the same time enhancing network capacity and meeting the increased demand for access. To increase the efficacy of the Software Defined Network (SDN) and Network Function Virtualization (NFV) framework, we need to eradicate network security configuration errors that may create vulnerabilities, affect overall efficiency, reduce network performance, and increase maintenance cost. Existing frameworks lack security, and computer systems face abnormalities, which prompts the need for recognition and mitigation methods that proactively keep the system in an operational state. The fundamental concept behind SDN-NFV is the shift from specific resource execution to a programming-based structure. This research is about the combination of SDN and NFV for rational decision making to control and monitor traffic in the virtualized environment. The combination is often seen as an extra burden in terms of resource usage in a heterogeneous network environment, but it also provides a solution for critical problems, especially regarding massive network traffic issues. Attacks have been expanding step by step; therefore, it is hard to recognize and protect against them by conventional methods. To overcome these issues, there must be an autonomous system to recognize and characterize any abnormal conduct in the network traffic. Four types of assaults, namely HTTP Flood, UDP Flood, Smurf Flood, and SiDDoS Flood, are considered in the identified dataset to optimize the stability of the SDN-NFV environment and security management, through several machine learning based characterization techniques: Support Vector Machine (SVM), K-Nearest Neighbors (KNN), Logistic Regression (LR), and Isolation Forest (IF). Python is used for simulation purposes, including several valuable utilities such as the mine package and the open-source Python ML libraries Scikit-learn, NumPy, SciPy, and Matplotlib. Several flood assaults and Structured Query Language (SQL) injection anomalies are validated and effectively identified through the anticipated procedure. The classification results are promising and show that overall accuracy lies between 87% and 95% for the SVM, LR, KNN, and IF classifiers in the scrutiny of traffic, whether the network traffic is normal or anomalous in the SDN-NFV environment.
Keywords: software defined network; network function virtualization; machine learning; support vector machine; K-nearest neighbors; logistic regression; isolation forest; anomaly detection; attacks
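Several entries on this page report precision, recall, and F1-score alongside accuracy; on traffic where normal flows dominate, accuracy alone can mislead. A small illustrative computation of these metrics from labelled predictions (toy data, not the paper's):

```python
def prf1(y_true, y_pred, positive="anomalous"):
    """Precision, recall and F1 for one positive class."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp)          # of flagged flows, how many were real
    recall = tp / (tp + fn)             # of real anomalies, how many were caught
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Invented ground truth vs. classifier output for six flows
y_true = ["anomalous", "anomalous", "normal", "normal", "anomalous", "normal"]
y_pred = ["anomalous", "normal", "normal", "anomalous", "anomalous", "normal"]
p, r, f = prf1(y_true, y_pred)
print(round(p, 3), round(r, 3), round(f, 3))  # 0.667 0.667 0.667
```

F1 is the harmonic mean of precision and recall, so it collapses toward zero whenever either quantity does, unlike accuracy, which a do-nothing classifier can keep high on imbalanced traffic.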
9. Depositional architecture of the late Ordovician drowned carbonate platform margin and its responses to sea-level fluctuation in the northern slope of the Tazhong region, Tarim Basin (cited 9 times)
Authors: Yang Xiaofa, Lin Changsong, Yang Haijun, Han Jianfa, Liu Jingyan, Zhang Yanmei, Peng Li, Jing Bing, Tong Jianyu, Wang Haiping, Li Huanpu. Petroleum Science (SCIE, CAS, CSCD), 2010, No. 3, pp. 323-336 (14 pages).
The Tazhong Uplift of the late Ordovician is a drowned rimmed carbonate platform. The carbonate rock of the late Ordovician Lianglitage Formation in the northern slope of the Tazhong region is one of the significant petroliferous intervals. Based on petrofacies, depositional cycles, natural gamma-ray spectrometry, and carbon/oxygen isotope data from the Lianglitage Formation, one 2nd-order, three 3rd-order, and several 4th-order sequences have been recognized, and the late Ordovician relative sea-level fluctuation curve has been established. The sequences O3 1-1 and O3 1-2 on the platform are composed of highstand and transgressive systems tracts but lack the lowstand systems tract. The sequence O3 1-3 is a drowning sequence. The sequence O3 1-1 overlapped the eroded slope and pinched out to the northwest and landward. The highstand systems tract in the sequence O3 1-2 consists of low-angle sigmoid and high-angle shingled progradation configurations. Major sedimentary facies of the Lianglitage Formation include reef and shoal in the platform margin and lagoon, which can be subdivided into coral-sponge-stromatoporoid reef complex, sand shoal, lime mud mound, and intershoal sea. Reefs, sand shoals, and their complexes are potential reservoir facies. The reefs and sand shoals in the sequence O3 1-1 developed in the upper part of its highstand systems tract. In the sequence O3 1-2, the highstand systems tract with an internal prograding configuration is a response to the lateral shifting of the reef and sand shoal complex. The transgressive systems tract, in particular the sand shoals, developed widely on the slope of the platform margin and interior. The reefs in the sequence O3 1-3 migrated towards high positions and formed retrograding reefs in the western platform and low relief in the platform interior. Basinward lateral migration of the reefs and pure carbonate rock both characterize the highstand systems tract and show that the rise of the relative sea level was very slow. The shingled prograding stacking pattern of the 4th-order sequences and horizontally growing reefs represent the late stage of the highstand systems tract and imply a relative sea-level stillstand. Reefs migrating towards high land and impure carbonate rock both indicate the transgressive systems tract and suggest that the relative sea level rose fast. Erosional truncation and epidiagenetic karstification represent a falling relative sea level. The relative sea-level fluctuation and antecedent palaeotopography control the development and distribution of reef complexes and unconformity karst zones. Currently, the composite zone of epidiagenetic karstic intervals and high-energy complexes of reefs and sand shoals with prograding configuration is an important oil and gas reservoir in the northern slope of the Tazhong carbonate platform.
Keywords: Tarim Basin; late Ordovician; carbonate platform; depositional architecture; sea-level fluctuation
10. Data Analytics for the Identification of Fake Reviews Using Supervised Learning (cited 8 times)
Authors: Saleh Nagi Alsubari, Sachin N. Deshmukh, Ahmed Abdullah Alqarni, Nizar Alsharif, Theyazn H. H. Aldhyani, Fawaz Waselallah Alsaade, Osamah I. Khalaf. Computers, Materials & Continua (SCIE, EI), 2022, No. 2, pp. 3189-3204 (16 pages).
Fake reviews, also known as deceptive opinions, are used to mislead people and have gained more importance recently. This is due to the rapid increase in online marketing transactions, such as selling and purchasing. E-commerce provides a facility for customers to post reviews and comments about a product or service once purchased. New customers usually go through the posted reviews or comments on the website before making a purchase decision. However, the current challenge is how new individuals can distinguish truthful reviews from fake ones, which later deceive customers, inflict losses, and tarnish the reputation of companies. The present paper attempts to develop an intelligent system that can detect fake reviews on e-commerce platforms using n-grams of the review text and sentiment scores given by the reviewer. The proposed methodology used a standard fake hotel review dataset for experimentation, data preprocessing methods, and a term frequency-inverse document frequency (TF-IDF) approach for extracting features and their representation. For detection and classification, n-grams of review texts were input into the constructed models to be classified as fake or truthful. The experiments were carried out using four different supervised machine-learning techniques, trained and tested on a dataset collected from the TripAdvisor website. The classification results of these experiments showed that naïve Bayes (NB), support vector machine (SVM), adaptive boosting (AB), and random forest (RF) achieved 88%, 93%, 94%, and 95%, respectively, based on testing accuracy and the F1-score. The obtained results were compared with existing works that used the same dataset, and the proposed methods outperformed the comparable methods in terms of accuracy.
Keywords: e-commerce; fake reviews detection; methodologies; machine learning; hotel reviews
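The TF-IDF-over-n-grams representation described above can be sketched in a few lines of pure Python. This illustrates the general technique, not the authors' implementation; the smoothed idf variant used here is an assumption:

```python
from math import log

def ngrams(text, n=2):
    """Word n-grams of a lowercased, whitespace-tokenised text."""
    tokens = text.lower().split()
    return [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def tfidf(docs, n=2):
    """Tiny TF-IDF over word n-grams; idf is smoothed (sklearn-style)."""
    grams = [ngrams(d, n) for d in docs]
    df = {}                                   # document frequency per n-gram
    for doc in grams:
        for g in set(doc):
            df[g] = df.get(g, 0) + 1
    N = len(docs)
    vectors = []
    for doc in grams:
        vec = {}
        for g in set(doc):
            tf = doc.count(g) / len(doc)
            idf = log((1 + N) / (1 + df[g])) + 1.0
            vec[g] = tf * idf
        vectors.append(vec)
    return vectors

# Two invented reviews; unigrams for brevity
docs = ["great hotel great stay", "terrible hotel awful stay"]
vecs = tfidf(docs, n=1)
print(vecs[0]["great"] > vecs[0]["hotel"])  # True
```

"great" outweighs "hotel" in the first vector because it is frequent there and absent from the second review, while "hotel" appears in both documents and is discounted by the idf term; the same weighting applied to bigrams gives the n-gram features the paper feeds to its classifiers.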
11. Evaluating the Efficiency of CBAM-Resnet Using Malaysian Sign Language (cited 5 times)
Authors: Rehman Ullah Khan, Woei Sheng Wong, Insaf Ullah, Fahad Algarni, Muhammad Inam Ul Haq, Mohamad Hardyman bin Barawi, Muhammad Asghar Khan. Computers, Materials & Continua (SCIE, EI), 2022, No. 5, pp. 2755-2772 (18 pages).
The deaf-mute population constantly feels helpless when others do not understand them and vice versa. To fill this gap, this study implements a CNN-based neural network with a Convolutional Block Attention Module (CBAM) to recognise Malaysian Sign Language (MSL) in videos. This study created 2071 videos for 19 dynamic signs. Two different experiments were conducted for dynamic signs using CBAM-3DResNet, implementing the 'Within Blocks' and 'Before Classifier' methods. Various metrics, such as accuracy, loss, precision, recall, F1-score, confusion matrix, and training time, were recorded to evaluate the models' efficiency. Results showed that CBAM-ResNet models performed well in video recognition tasks, with recognition rates of over 90% with little variation. CBAM-ResNet 'Before Classifier' is more efficient than the 'Within Blocks' CBAM-ResNet models. All experimental results indicated the efficiency of CBAM-ResNet 'Before Classifier' in recognising Malaysian Sign Language and its worth for future research.
Keywords: CBAM-ResNet; Malaysian sign language; within blocks; before classifier; efficiency evaluation
12. Suggestion Mining from Opinionated Text of Big Social Media Data (cited 6 times)
Authors: Youseef Alotaibi, Muhammad Noman Malik, Huma Hayat Khan, Anab Batool, Saif ul Islam, Abdulmajeed Alsufyani, Saleh Alghamdi. Computers, Materials & Continua (SCIE, EI), 2021, No. 9, pp. 3323-3338 (16 pages).
Social media data are rapidly increasing and constitute a source of user opinions and tips on a wide range of products and services. The increasing availability of such big data on biased reviews and blogs creates challenges for customers and businesses in reviewing all content in their decision-making process. To overcome this challenge, extracting suggestions from opinionated text is a possible solution. In this study, the characteristics of suggestions are analyzed and a suggestion mining extraction process is presented for classifying suggestive sentences from online customers' reviews. A classification using a word-embedding approach is applied via the XGBoost classifier. The two datasets used in this experiment relate to online hotel reviews and Microsoft Windows App Studio discussion reviews. F1, precision, recall, and accuracy scores are calculated. The results demonstrated that the XGBoost classifier outperforms the others, with an accuracy of more than 80%. Moreover, the results revealed that suggestion keywords and phrases are the predominant features for suggestion extraction. Thus, this study contributes to knowledge and practice by comparing feature extraction classifiers and identifying XGBoost as a better suggestion mining process for identifying online reviews.
Keywords: suggestion mining; word embedding; naïve Bayes; random forest; XGBoost; dataset
13. Machine Learning Enabled Early Detection of Breast Cancer by Structural Analysis of Mammograms (cited 4 times)
Authors: Mavra Mehmood, Ember Ayub, Fahad Ahmad, Madallah Alruwaili, Ziyad A. Alrowaili, Saad Alanazi, Mamoona Humayun, Muhammad Rizwan, Shahid Naseem, Tahir Alyas. Computers, Materials & Continua (SCIE, EI), 2021, No. 4, pp. 641-657 (17 pages).
Clinical image processing plays a significant role in healthcare systems and is currently a widely used methodology. In carcinogenic diseases, time is crucial; thus, accurate analysis of an image can help treat disease at an early stage. Ductal carcinoma in situ (DCIS) and lobular carcinoma in situ (LCIS) are common types of malignancies that affect both women and men. The number of cases of DCIS and LCIS has increased every year since 2002, while it still takes a considerable amount of time to recommend a controlling technique. Image processing is a powerful technique for analyzing preprocessed images to retrieve useful information through remarkable processing operations. In this paper, we used a dataset from the Mammographic Image Analysis Society and MATLAB 2019b software from MathWorks to simulate and extract our results. In this proposed study, mammograms are primarily used to diagnose, more precisely, the tumor component of the breast. The detection of DCIS and LCIS on breast mammograms is done by preprocessing the images using contrast-limited adaptive histogram equalization. The tumor portions of the resulting images are then isolated by a segmentation process, such as threshold detection. Furthermore, morphological operations, such as erosion and dilation, are applied to the images; then gray-level co-occurrence matrix texture features, Haralick texture features, and shape features are extracted from the regions of interest. For classification purposes, a support vector machine (SVM) classifier is used to categorize normal and abnormal patterns. Finally, the adaptive neuro-fuzzy inference system (ANFIS) is deployed to remove the fuzziness due to overlapping features of patterns within the images, and the exact categorization of prior patterns is gained through the SVM. Early detection of DCIS and LCIS can save lives and help physicians and surgeons to diagnose and treat these diseases. Substantial results are obtained through a cubic support vector machine (CSVM), showing 98.95% and 98.01% accuracies for normal and abnormal mammograms, respectively. Through ANFIS, promising mean square error (MSE) results of 0.01866, 0.18397, and 0.19640 were obtained for DCIS and LCIS differentiation during the training, testing, and checking phases.
Keywords: image processing, tumor segmentation, dilation, erosion, machine learning classification, support vector machine, adaptive neuro-fuzzy inference system
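The feature-extraction pipeline this abstract describes (CLAHE preprocessing, threshold segmentation, morphological cleanup, then GLCM/Haralick texture features feeding an SVM) centers on the gray-level co-occurrence matrix. A minimal NumPy sketch of that step is given below; the quantization level, the single (1, 0) offset, and the toy region of interest are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def glcm(img, levels=4, dx=1, dy=0):
    """Gray-level co-occurrence matrix for one pixel offset (dx, dy)."""
    # Quantize the image into `levels` gray levels.
    q = (img.astype(float) / (img.max() + 1e-9) * (levels - 1)).round().astype(int)
    m = np.zeros((levels, levels))
    h, w = q.shape
    for y in range(h - dy):
        for x in range(w - dx):
            m[q[y, x], q[y + dy, x + dx]] += 1
    return m / m.sum()  # normalize to joint probabilities

def haralick_contrast(p):
    """Haralick contrast feature: sum over (i - j)^2 * p(i, j)."""
    i, j = np.indices(p.shape)
    return float(((i - j) ** 2 * p).sum())

# Toy 'region of interest' after thresholding: a checkerboard-like patch.
roi = np.array([[0, 0, 255, 255],
                [0, 0, 255, 255],
                [255, 255, 0, 0],
                [255, 255, 0, 0]], dtype=np.uint8)
p = glcm(roi, levels=2)
print(haralick_contrast(p))
```

Production systems (e.g., MATLAB's `graycomatrix`/`graycoprops` or scikit-image's `graycomatrix`) aggregate several offsets and compute further features such as energy, homogeneity, and correlation before the SVM stage.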
14. Prediction of flyrock induced by mine blasting using a novel kernel-based extreme learning machine (Cited by: 4)
Authors: Mehdi Jamei, Mahdi Hasanipanah, Masoud Karbasi, Iman Ahmadianfar, Somaye Taherifar. Journal of Rock Mechanics and Geotechnical Engineering (SCIE, CSCD), 2021(6): 1438-1451 (14 pages)
Blasting is a common method of breaking rock in surface mines. Although fragmentation with proper size is the main purpose, other undesirable effects such as flyrock are inevitable. This study is carried out to evaluate the capability of a novel kernel-based extreme learning machine algorithm, called kernel extreme learning machine (KELM), by which the flyrock distance (FRD) is predicted. Furthermore, three other data-driven models, including local weighted linear regression (LWLR), response surface methodology (RSM) and boosted regression tree (BRT), are also developed to validate the main model. A database gathered from three quarry sites in Malaysia is employed to construct the proposed models, using 73 sets of spacing, burden, stemming length and powder factor data as inputs and FRD as target. Afterwards, the validity of the models is evaluated by comparing the corresponding values of some statistical metrics and validation tools. Finally, the results verify that the proposed KELM model, on account of the highest correlation coefficient (R) and lowest root mean square error (RMSE), is more computationally efficient, leading to better predictive capability compared to the LWLR, RSM and BRT models for all data sets.
Keywords: blasting, flyrock distance, kernel extreme learning machine (KELM), local weighted linear regression (LWLR), response surface methodology (RSM)
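In its standard form, a KELM replaces the random hidden layer of an extreme learning machine with a kernel matrix and solves a single regularized linear system for the output weights. That core computation can be sketched as follows; the RBF kernel choice, the hyperparameters, and the synthetic blasting-design data (spacing, burden, stemming length, powder factor) are assumptions for illustration:

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel between two sets of row vectors."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class KELM:
    """Kernel extreme learning machine for regression (minimal sketch)."""
    def __init__(self, C=1.0, gamma=1.0):
        self.C, self.gamma = C, gamma

    def fit(self, X, y):
        self.X = X
        K = rbf_kernel(X, X, self.gamma)
        # Output weights beta solve the regularized system (I/C + K) beta = y.
        self.beta = np.linalg.solve(np.eye(len(X)) / self.C + K, y)
        return self

    def predict(self, Xnew):
        return rbf_kernel(Xnew, self.X, self.gamma) @ self.beta

# Toy usage: four blasting-design inputs -> flyrock distance (invented numbers).
X = np.array([[3.0, 2.5, 2.0, 0.6],
              [3.5, 3.0, 2.2, 0.7],
              [2.8, 2.4, 1.9, 0.5],
              [4.0, 3.2, 2.5, 0.8]])
y = np.array([120.0, 160.0, 100.0, 190.0])
model = KELM(C=100.0, gamma=0.5).fit(X, y)
pred = model.predict(X)
```

The single linear solve is what makes KELM computationally cheap compared with iteratively trained regressors, which is consistent with the efficiency claim in the abstract.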
15. Prediction of COVID-19 Cases Using Machine Learning for Effective Public Health Management (Cited by: 3)
Authors: Fahad Ahmad, Saleh N. Almuayqil, Mamoona Humayun, Shahid Naseem, Wasim Ahmad Khan, Kashaf Junaid. Computers, Materials & Continua (SCIE, EI), 2021(3): 2265-2282 (18 pages)
COVID-19 is a pandemic that has affected nearly every country in the world. At present, sustainable development in the area of public health is considered vital to securing a promising and prosperous future for humans. However, widespread diseases, such as COVID-19, create numerous challenges to this goal, and some of those challenges are not yet defined. In this study, a Shallow Single-Layer Perceptron Neural Network (SSLPNN) and Gaussian Process Regression (GPR) model were used for the classification and prediction of confirmed COVID-19 cases in five geographically distributed regions of Asia with diverse settings and environmental conditions: namely, China, South Korea, Japan, Saudi Arabia, and Pakistan. Significant environmental and non-environmental features were taken as the input dataset, and confirmed COVID-19 cases were taken as the output dataset. A correlation analysis was done to identify patterns in the cases related to fluctuations in the associated variables. The results of this study established that the population and air quality index of a region had a statistically significant influence on the cases. However, age and the human development index had a negative influence on the cases. The proposed SSLPNN-based classification model performed well when predicting the classes of confirmed cases. During training, the binary classification model was highly accurate, with a Root Mean Square Error (RMSE) of 0.91. Likewise, the results of the regression analysis using the GPR technique with a Matern 5/2 kernel were highly accurate (RMSE = 0.95239) when predicting the number of confirmed COVID-19 cases in an area. Dynamic management has occupied a core place in studies on the sustainable development of public health, but it depends on proactive strategies based on statistically verified approaches, like Artificial Intelligence (AI). In this study, an SSLPNN model has been trained to fit public-health-associated data into an appropriate class, allowing GPR to predict the number of confirmed COVID-19 cases in an area based on the given values of selected parameters. Therefore, this tool can help authorities in different ecological settings effectively manage COVID-19.
Keywords: public health, sustainable development, artificial intelligence, SARS-CoV-2, shallow single-layer perceptron neural network, binary classification, Gaussian process regression
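The abstract reports regression with a Gaussian process using a Matern 5/2 kernel. The posterior-mean computation at the heart of GPR can be sketched as follows; the length-scale, the noise level, and the one-dimensional toy data are assumptions for illustration, not the study's settings:

```python
import numpy as np

def matern52(A, B, ell=1.0, sigma2=1.0):
    """Matern 5/2 covariance between two sets of row vectors."""
    r = np.sqrt(((A[:, None, :] - B[None, :, :]) ** 2).sum(-1))
    s = np.sqrt(5.0) * r / ell
    return sigma2 * (1.0 + s + s ** 2 / 3.0) * np.exp(-s)

def gpr_mean(Xtrain, y, Xtest, noise=1e-4, ell=1.0):
    """Posterior mean of a zero-mean GP: K* (K + noise I)^-1 y."""
    K = matern52(Xtrain, Xtrain, ell) + noise * np.eye(len(Xtrain))
    Ks = matern52(Xtest, Xtrain, ell)
    return Ks @ np.linalg.solve(K, y)

# Toy 1-D regression: near-noiseless samples of a smooth trend.
X = np.linspace(0, 4, 9).reshape(-1, 1)
y = np.sin(X[:, 0])
mu = gpr_mean(X, y, X, noise=1e-6)
```

With a very small noise term, the posterior mean nearly interpolates the training targets; in practice the noise and length-scale are fitted by maximizing the marginal likelihood, as toolboxes such as MATLAB's `fitrgp` do.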
16. Effective use of FibroTest to generate decision trees in hepatitis C (Cited by: 2)
Authors: Dana Lau-Corona, Luís Alberto Pineda, Héctor Hugo Avilés, Gabriela Gutiérrez-Reyes, Blanca Eugenia Farfan-Labonne, Rafael Núñez-Nateras, Alan Bonder, Rosalinda Martínez-García, Clara Corona-Lau, Marco Antonio Olivera-Martínez, Maria Concepción Gutiérrez-Ruiz, Guillermo Robles-Díaz, David Kershenobich. World Journal of Gastroenterology (SCIE, CAS, CSCD), 2009(21): 2617-2622 (6 pages)
AIM: To assess the usefulness of FibroTest to forecast scores by constructing decision trees in patients with chronic hepatitis C. METHODS: We used the C4.5 classification algorithm to construct decision trees with data from 261 patients with chronic hepatitis C without a liver biopsy. The FibroTest attributes of age, gender, bilirubin, apolipoprotein, haptoglobin, α2 macroglobulin, and γ-glutamyl transpeptidase were used as predictors, and the FibroTest score as the target. For testing, a 10-fold cross validation was used. RESULTS: The overall classification error was 14.9% (accuracy 85.1%). FibroTest cases with true scores of F0 and F4 were classified with very high accuracy (18/20 for F0, 9/9 for F0-1 and 92/96 for F4), and the largest confusion centered on F3. The algorithm produced a set of compound rules out of the ten classification trees, which was used to classify the 261 patients. The rules for the classification of patients in F0 and F4 were effective in more than 75% of the cases in which they were tested. CONCLUSION: The recognition of clinical subgroups should help to enhance our ability to assess differences in fibrosis scores in clinical studies and improve our understanding of fibrosis progression.
Keywords: hepatitis C, FibroTest, decision trees, C4.5 algorithm, non-invasive biomarkers
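C4.5 grows each tree by choosing, at every node, the attribute split that most reduces label entropy (C4.5 proper normalizes this into a gain ratio; plain information gain is shown here for brevity). A self-contained sketch with a hypothetical binary attribute that perfectly separates F0 from F4 scores:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, attr):
    """Information gain of splitting (rows, labels) on attribute index attr."""
    base = entropy(labels)
    splits = {}
    for row, lab in zip(rows, labels):
        splits.setdefault(row[attr], []).append(lab)
    rem = sum(len(sub) / len(labels) * entropy(sub) for sub in splits.values())
    return base - rem

# Hypothetical example: one binary attribute separating F0 from F4 patients.
rows = [("low",), ("low",), ("high",), ("high",)]
labels = ["F0", "F0", "F4", "F4"]
print(info_gain(rows, labels, 0))  # -> 1.0 (a perfect split removes all uncertainty)
```

Repeating this choice recursively, then pruning, yields the decision trees the study cross-validated; the compound rules it reports are the root-to-leaf paths of those trees.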
17. Stabilizing Energy Consumption in Unequal Clusters of Wireless Sensor Networks (Cited by: 2)
Authors: Nithya Rekha Sivakumar. Computers, Materials & Continua (SCIE, EI), 2020(7): 81-96 (16 pages)
In the past few decades, Energy Efficiency (EE) has been a significant challenge in Wireless Sensor Networks (WSNs). A WSN requires reduced transmission delay and higher throughput with high-quality services, and much attention must further be paid to energy consumption to improve the network lifetime. Clustering-based routing algorithms are considered an effective way to collect and transmit data. The Cluster Head (CH) plays an essential role in network connectivity and performs data transmission and data aggregation, so its energy consumption is higher than that of non-CH nodes. Conventional clustering approaches attempt to form clusters of the same size. Moreover, owing to random node distribution, clusters with equal numbers of nodes are not an obvious way to reduce energy consumption. To resolve this issue, this paper provides a novel Balanced-Imbalanced Cluster Algorithm (B-IBCA) with a Stabilized Boltzmann Approach (SBA) that attempts to balance the energy dissipation across uneven clusters in WSNs. B-IBCA utilizes stabilizing logic to maintain the consistency of energy consumption among sensor nodes. To handle the changing topological characteristics of sensor nodes, this stability-based Boltzmann estimation algorithm allocates a proper radius among the sensor nodes. The simulation shows that the proposed B-IBCA outperforms other approaches in terms of energy efficiency, lifetime, network stability, average residual energy and so on.
Keywords: WSN, stability, cluster head, node balancing, average residual energy
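The abstract does not spell out the Stabilized Boltzmann radius-allocation rule, so as a hedged stand-in, here is the classic competition-radius formula used by earlier unequal-clustering protocols (EEUC-style), which captures the general idea the paper builds on: clusters shrink as a node gets closer to the base station, so near nodes retain energy for relaying traffic. The constants `c` and `r_max` are assumptions:

```python
def competition_radius(d_to_bs, d_min, d_max, r_max=50.0, c=0.5):
    """Unequal-cluster competition radius: nodes nearer the base station
    form smaller clusters, reserving energy for inter-cluster relaying."""
    frac = (d_max - d_to_bs) / (d_max - d_min)
    return (1.0 - c * frac) * r_max

# Nodes at 40 m and 160 m from the base station, field span 20-200 m.
near = competition_radius(40.0, 20.0, 200.0)
far = competition_radius(160.0, 20.0, 200.0)
```

With these assumed constants, the near node gets a radius of about 27.8 m and the far node about 44.4 m, so cluster sizes grow with distance from the sink, which is the load-balancing effect the paper targets.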
18. Entropy-Based Approach to Detect DDoS Attacks on Software Defined Networking Controller (Cited by: 2)
Authors: Mohammad Aladaileh, Mohammed Anbar, Iznan H. Hasbullah, Yousef K. Sanjalawe, Yung-Wey Chong. Computers, Materials & Continua (SCIE, EI), 2021(10): 373-391 (19 pages)
Software-Defined Networking (SDN) technology improves network management over existing technology via centralized network control. SDN provides a perfect platform for researchers to solve traditional networks' outstanding issues. However, despite the advantages of centralized control, concern about its security is rising. The more traditional networks switch to SDN technology, the more attractive it becomes to malicious actors, especially the controller, because it is the network's brain. A Distributed Denial of Service (DDoS) attack on the controller could cripple the entire network. For that reason, researchers are always looking for ways to detect DDoS attacks against the controller with higher accuracy and a lower false-positive rate. This paper proposes an entropy-based approach to detect low-rate and high-rate DDoS attacks against the SDN controller, regardless of the number of attackers or targets. The proposed approach generalizes the Rényi joint entropy for analyzing the network traffic flow to detect DDoS attack traffic flows of varying rates. Using two packet header features and the generalized Rényi joint entropy, the proposed approach achieved a better detection rate than the EDDSC approach, which uses Shannon entropy metrics.
Keywords: software-defined networking, DDoS attack, distributed denial of service, Rényi joint entropy
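The detection principle is that a DDoS flood concentrates traffic on a few header values, which lowers the entropy of the observed distribution. A minimal single-feature sketch of order-2 Rényi entropy follows; note the paper uses a generalized Rényi joint entropy over two header features, and the per-window counts here are invented for illustration:

```python
import math
from collections import Counter

def renyi_entropy(counts, alpha=2.0):
    """Renyi entropy (bits) of order alpha for a frequency distribution."""
    n = sum(counts)
    probs = [c / n for c in counts]
    if alpha == 1.0:  # limit case: Shannon entropy
        return -sum(p * math.log2(p) for p in probs if p > 0)
    return math.log2(sum(p ** alpha for p in probs)) / (1.0 - alpha)

# Destination-IP counts in one traffic window: benign traffic is spread out,
# while under attack one victim address dominates.
benign = Counter({"10.0.0.%d" % i: 10 for i in range(8)})
attack = Counter({"10.0.0.1": 930, **{"10.0.0.%d" % i: 10 for i in range(2, 9)}})

h_benign = renyi_entropy(list(benign.values()))
h_attack = renyi_entropy(list(attack.values()))
# A drop in entropy below a tuned threshold flags a possible DDoS window.
```

Here the uniform window yields the maximum entropy (3 bits for 8 equally likely destinations), while the attack window collapses to roughly 0.2 bits; the controller-side detector compares each window's entropy against a threshold.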
19. Security Requirement Management for Cloud-Assisted and Internet of Things-Enabled Smart City (Cited by: 2)
Authors: Muhammad Usman Tariq, Muhammad Babar, Mian Ahmad Jan, Akmal Saeed Khattak, Mohammad Dahman Alshehri, Abid Yahya. Computers, Materials & Continua (SCIE, EI), 2021(4): 625-639 (15 pages)
The world is rapidly changing with the advance of information technology. The expansion of the Internet of Things (IoT) is a huge step in the development of the smart city. The IoT consists of connected devices that transfer information. The IoT architecture permits on-demand services to a public pool of resources. Cloud computing plays a vital role in developing IoT-enabled smart applications. The integration of cloud computing enhances the offering of distributed resources in the smart city. Improper management of the security requirements of cloud-assisted IoT systems can bring about risks to availability, security, performance, confidentiality, and privacy. The key reason for cloud- and IoT-enabled smart city application failure is improper security practices at the early stages of development. This article proposes a framework to collect security requirements during the initial development phase of cloud-assisted IoT-enabled smart city applications. Its three-layered architecture includes privacy-preserved stakeholder analysis (PPSA), security requirement modeling and validation (SRMV), and secure cloud-assistance (SCA). A case study highlights the applicability and effectiveness of the proposed framework. A hybrid survey enables the identification and evaluation of significant challenges.
Keywords: security, privacy, smart city, Internet of Things, cloud computing
20. Quest for the best endoscopic imaging modality for computer-assisted colonic polyp staging (Cited by: 2)
Authors: Georg Wimmer, Michael Gadermayr, Gernot Wolkersdorfer, Roland Kwitt, Toru Tamaki, Jens Tischendorf, Michael Hafner, Shigeto Yoshida, Shinji Tanaka, Dorit Merhof, Andreas Uhl. World Journal of Gastroenterology (SCIE, CAS), 2019(10): 1197-1209 (13 pages)
BACKGROUND: It was shown in previous studies that high definition endoscopy, high magnification endoscopy and image enhancement technologies, such as chromoendoscopy and digital chromoendoscopy [narrow-band imaging (NBI), i-Scan], facilitate the detection and classification of colonic polyps during endoscopic sessions. However, there are no comprehensive studies so far that analyze which endoscopic imaging modalities facilitate the automated classification of colonic polyps. In this work, we investigate the impact of endoscopic imaging modalities on the results of computer-assisted diagnosis systems for colonic polyp staging. AIM: To assess which endoscopic imaging modalities are best suited for the computer-assisted staging of colonic polyps. METHODS: In our experiments, we apply twelve state-of-the-art feature extraction methods for the classification of colonic polyps to five endoscopic image databases of colonic lesions. For this purpose, we employ a specifically designed experimental setup to avoid biases in the outcomes caused by differing numbers of images per image database. The image databases were obtained using different imaging modalities. Two databases were obtained by high-definition endoscopy in combination with i-Scan technology (one with chromoendoscopy and one without chromoendoscopy). Three databases were obtained by high-magnification endoscopy (two databases using narrow-band imaging and one using chromoendoscopy). The lesions are categorized into non-neoplastic and neoplastic according to the histological diagnosis. RESULTS: Which imaging modalities achieve high results is generally feature-dependent. For the high-definition image databases, we achieved overall classification rates of up to 79.2% with chromoendoscopy and 88.9% without chromoendoscopy. In the case of the database obtained by high-magnification chromoendoscopy, the classification rates were up to 81.4%. For the combination of high-magnification endoscopy with NBI, results of up to 97.4% for one database and up to 84% for the other were achieved. Non-neoplastic lesions were, in general, classified more accurately than neoplastic lesions. It was shown that the image recording conditions highly affect the performance of automated diagnosis systems and partly contribute a stronger effect on the staging results than the imaging modality used. CONCLUSION: Chromoendoscopy has a negative impact on the results of the methods. NBI is better suited than chromoendoscopy. High-definition and high-magnification endoscopy are equally suited.
Keywords: endoscopy, colonic polyps, automated diagnosis system, narrow-band imaging, chromoendoscopy, imaging modalities, image enhancement technologies