Honeycombing Lung (HCL) is a chronic lung condition marked by advanced fibrosis, resulting in enlarged air spaces with thick fibrotic walls, which are visible on Computed Tomography (CT) scans. Differentiating between normal lung tissue, honeycombing lungs, and Ground Glass Opacity (GGO) in CT images is often challenging for radiologists and may lead to misinterpretations. Although earlier studies have proposed models to detect and classify HCL, many faced limitations such as high computational demands, lower accuracy, and difficulty distinguishing between HCL and GGO. CT images are highly effective for lung classification due to their high resolution, 3D visualization, and sensitivity to tissue density variations. This study introduces Honeycombing Lungs Network (HCL Net), a novel classification algorithm inspired by ResNet50V2 and enhanced to overcome the shortcomings of previous approaches. HCL Net incorporates additional residual blocks, refined preprocessing techniques, and selective parameter tuning to improve classification performance. The dataset, sourced from the University Malaya Medical Centre (UMMC) and verified by expert radiologists, consists of CT images of normal, honeycombing, and GGO lungs. Experimental evaluations across five assessments demonstrated that HCL Net achieved an outstanding classification accuracy of approximately 99.97%. It also recorded strong performance in other metrics, achieving 93% precision, 100% sensitivity, 89% specificity, and an AUC-ROC score of 97%. Comparative analysis with baseline feature engineering methods confirmed the superior efficacy of HCL Net. The model significantly reduces misclassification, particularly between honeycombing and GGO lungs, enhancing diagnostic precision and reliability in lung image analysis.
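The residual blocks that HCL Net extends follow the standard identity-shortcut pattern of ResNet-family models, y = relu(F(x) + x). Below is a minimal numpy sketch of one such block, illustrative only: the weights and sizes are hypothetical and this is not the authors' architecture.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def residual_block(x, w1, w2):
    """Identity-shortcut residual block: y = relu(w2 @ relu(w1 @ x) + x)."""
    out = relu(w1 @ x)    # first weighted transform + activation
    out = w2 @ out        # second weighted transform
    return relu(out + x)  # the shortcut adds the input back before the final activation

rng = np.random.default_rng(0)
x = rng.standard_normal(4)
w1 = 0.1 * rng.standard_normal((4, 4))
w2 = 0.1 * rng.standard_normal((4, 4))
y = residual_block(x, w1, w2)
```

Note that with all-zero weights the block reduces to relu(x), i.e. the identity path; this is the property that lets deep stacks of such blocks remain trainable.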
This review examines human vulnerabilities in cybersecurity within Microfinance Institutions (MFIs), analyzing their impact on organizational resilience. Focusing on social engineering, inadequate security training, and weak internal protocols, the study identifies key vulnerabilities exacerbating cyber threats to MFIs. A literature review using databases such as IEEE Xplore and Google Scholar focused on studies from 2019 to 2023 addressing human factors in cybersecurity specific to MFIs. Analysis of 57 studies reveals that phishing and insider threats are predominant, with a 20% annual increase in phishing attempts. Employee susceptibility to these attacks is heightened by insufficient training, with entry-level employees showing the highest vulnerability rates. Further, only 35% of MFIs offer regular cybersecurity training, significantly impacting incident reduction. This paper recommends enhanced training frequency, robust internal controls, and a cybersecurity-aware culture to mitigate human-induced cyber risks in MFIs.
Pervasive IoT applications enable us to perceive, analyze, control, and optimize traditional physical systems. Recently, security breaches in many IoT applications have indicated that IoT applications may put the physical systems at risk. Severe resource constraints and insufficient security design are two major causes of many security problems in IoT applications. As an extension of the cloud, emerging edge computing with its rich resources provides a new venue to design and deploy novel security solutions for IoT applications. Although there are some research efforts in this area, edge-based security designs for IoT applications are still in their infancy. This paper aims to present a comprehensive survey of existing IoT security solutions at the edge layer and to inspire more edge-based IoT security designs. We first present an edge-centric IoT architecture. Then, we extensively review edge-based IoT security research efforts in the context of security architecture designs, firewalls, intrusion detection systems, authentication and authorization protocols, and privacy-preserving mechanisms. Finally, we offer our insights into future research directions and open research issues.
AIM: To identify demographic, clinical, metabolomic, and lifestyle-related predictors of relapse in adult ulcerative colitis (UC) patients.
METHODS: In this prospective pilot study, UC patients in clinical remission were recruited and followed up at 12 mo to assess whether a clinical relapse had occurred. At baseline, information on demographic and clinical parameters was collected. Serum and urine samples were collected for metabolomic assays using combined direct infusion/liquid chromatography tandem mass spectrometry and nuclear magnetic resonance spectroscopy. Stool samples were also collected to measure fecal calprotectin (FCP). Dietary assessment was performed using a validated self-administered food frequency questionnaire.
RESULTS: Twenty patients were included (mean age: 42.7 ± 14.8 years; females: 55%). Seven patients (35%) experienced a clinical relapse during the follow-up period. While 6 patients (66.7%) with normal body weight developed a clinical relapse, only 1 overweight/obese UC patient (9.1%) relapsed during the follow-up (P = 0.02). At baseline, poultry intake was significantly higher in patients who remained in remission during follow-up (0.9 oz vs 0.2 oz, P = 0.002). Five patients (71.4%) with FCP > 150 μg/g and 2 patients (15.4%) with normal FCP (≤ 150 μg/g) at baseline relapsed during the follow-up (P = 0.02). Interestingly, baseline urinary and serum metabolomic profiles of UC patients with or without clinical relapse within 12 mo differed significantly. The metabolites most responsible for this discrimination were trans-aconitate, cystine, and acetamide in urine, and 3-hydroxybutyrate, acetoacetate, and acetone in serum.
CONCLUSION: A combination of baseline dietary intake, fecal calprotectin, and metabolomic factors is associated with the risk of UC clinical relapse within 12 mo.
Neurocognitive deficits are frequently observed in patients with schizophrenia and major depressive disorder (MDD). The relations between cognitive features may be represented by neurocognitive graphs based on cognitive features, modeled as Gaussian Markov random fields. However, it is unclear whether it is possible to differentiate between phenotypic patterns associated with the differential diagnosis of schizophrenia and depression using this neurocognitive graph approach. In this study, we enrolled 215 first-episode patients with schizophrenia (FES), 125 with MDD, and 237 demographically matched healthy controls (HCs). The cognitive performance of all participants was evaluated using a battery of neurocognitive tests. The graphical LASSO model was trained with a one-vs-one scenario to learn the conditional independence structure of the neurocognitive features of each group. Participants in the holdout dataset were classified into the group with the highest likelihood. A partial correlation matrix was derived from the graphical model to further explore the neurocognitive graph for each group. The classification approach identified the diagnostic class of individuals with an average accuracy of 73.41% for FES vs HC, 67.07% for MDD vs HC, and 59.48% for FES vs MDD. Both the neurocognitive graphs for FES and MDD had more connections and higher node centrality than those for HC. The neurocognitive graph for FES was less sparse and had more connections than that for MDD. Thus, neurocognitive graphs based on cognitive features are promising for describing endophenotypes that may discriminate schizophrenia from depression.
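The partial correlation matrix mentioned above is obtained from the graphical model's precision (inverse covariance) matrix by the standard transformation ρᵢⱼ = −Pᵢⱼ/√(PᵢᵢPⱼⱼ); a zero precision entry thus corresponds to a missing edge in the graph. A sketch with a toy precision matrix (values hypothetical, not from the study):

```python
import numpy as np

def precision_to_partial_corr(P):
    """Convert a precision (inverse covariance) matrix to partial correlations:
    rho_ij = -P_ij / sqrt(P_ii * P_jj), with ones on the diagonal."""
    d = np.sqrt(np.diag(P))
    R = -P / np.outer(d, d)
    np.fill_diagonal(R, 1.0)
    return R

# Toy 3-node precision matrix (hypothetical, symmetric positive definite);
# the zero entry P[0, 2] encodes a missing edge between nodes 0 and 2.
P = np.array([[ 2.0, -0.8,  0.0],
              [-0.8,  2.5, -0.5],
              [ 0.0, -0.5,  1.5]])
R = precision_to_partial_corr(P)
```

Edges of each group's neurocognitive graph correspond to the nonzero off-diagonal entries of R.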
Time-sensitive networks (TSNs) support not only traditional best-effort communications but also deterministic communications, which send each packet at a deterministic time so that the data transmissions of networked control systems can be precisely scheduled to guarantee hard real-time constraints. No-wait scheduling is suitable for such TSNs and generates the schedules of deterministic communications with minimal network resources, so that all of the remaining resources can be used to improve the throughput of best-effort communications. However, due to inappropriate message fragmentation, the real-time performance of no-wait scheduling algorithms is reduced. Therefore, in this paper, joint algorithms of message fragmentation and no-wait scheduling are proposed. First, a specification for the joint problem based on optimization modulo theories is proposed so that off-the-shelf solvers can be used to find optimal solutions. Second, to improve the scalability of our algorithm, the worst-case delay of messages is analyzed, and then, based on the analysis, a heuristic algorithm is proposed to construct low-delay schedules. Finally, we conduct extensive test cases to evaluate our proposed algorithms. The evaluation results indicate that, compared to existing algorithms, the proposed joint algorithm improves schedulability by up to 50%.
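No-wait scheduling as described above sends each deterministic frame at a fixed offset with no queuing in between. A deliberately simplified single-link sketch (greedy, longest-frame-first; the paper's joint fragmentation and OMT-based formulation is far richer than this):

```python
def schedule_no_wait(frames, cycle):
    """Greedily assign start offsets on one link so transmissions never overlap
    within the cycle. frames: list of (name, duration) in time slots.
    Returns {name: offset}, or None if the frames do not fit in the cycle."""
    offsets, t = {}, 0
    for name, duration in sorted(frames, key=lambda f: -f[1]):  # longest first
        if t + duration > cycle:
            return None          # infeasible: the cycle is over-subscribed
        offsets[name] = t        # frame transmits back-to-back starting here
        t += duration
    return offsets

frames = [("ctrl", 2), ("sensor", 1), ("video", 4)]
schedule = schedule_no_wait(frames, cycle=10)
```

The unused tail of the cycle (here slots 7-9) is exactly the resource left over for best-effort traffic.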
Genetic improvement for drought stress tolerance in rice involves the quantitative nature of the trait, which reflects the additive effects of several genetic loci throughout the genome. Yield components and related traits under stressed and well-watered conditions were assayed in mapping populations derived from crosses of Azucena×IR64 and Azucena×Bala. To find the candidate rice genes underlying Quantitative Trait Loci (QTLs) in these populations, we conducted in silico analysis of a candidate region flanked by the genetic markers RM212 and RM319 on chromosome 1, proximal to the semi-dwarf (sd1) locus. A total of 175 annotated genes were identified in this region. These included 48 genes annotated by functional homology to known genes, 23 pseudogenes, 24 ab initio predicted genes supported by an alignment match to an EST (expressed sequence tag) of unknown function, and 80 hypothetical genes predicted solely by ab initio means. Among these, 16 candidate genes could potentially be involved in the drought stress response.
In this paper, a novel algorithm for aerosol optical depth (AOD) retrieval with a 1 km spatial resolution over land is presented, using the Advanced Along Track Scanning Radiometer (AATSR) dual-view capability at 0.55, 0.66 and 0.87 μm in combination with the Bi-directional Reflectance Distribution Function (BRDF) model, a product of the Moderate Resolution Imaging Spectroradiometer (MODIS). The BRDF characteristics of the land surface, i.e. prior input parameters for this algorithm, are computed by extracting the geometrical information from AATSR and reducing the kernels from the MODIS BRDF/Albedo Model Parameters Product. Finally, AOD with a 1 km resolution at 0.55, 0.66 and 0.87 μm for the forward and nadir views of AATSR can be simultaneously obtained. Extensive validations of AOD derived from AATSR during the period from August 2005 to July 2006 in Beijing and its surrounding area, against in-situ AErosol RObotic NETwork (AERONET) measurements, were performed. The AOD difference between the retrievals from the forward and nadir views of AATSR was less than 5.72%, 1.9% and 13.7% at the three wavelengths, respectively. Meanwhile, the AATSR retrievals using the synergic algorithm developed in this paper were found to be more favorable than those obtained by assuming a Lambertian surface: the coefficient of determination between AATSR-derived AOD and AERONET-measured AOD decreased by 15.5% and 18.5% when a Lambertian surface was assumed instead of the synergic algorithm. This further suggests that the synergic algorithm can potentially be used in climate change and air quality monitoring.
The world is rapidly changing with the advance of information technology. The expansion of the Internet of Things (IoT) is a huge step in the development of the smart city. The IoT consists of connected devices that transfer information. The IoT architecture permits on-demand services to a public pool of resources. Cloud computing plays a vital role in developing IoT-enabled smart applications. The integration of cloud computing enhances the offering of distributed resources in the smart city. Improper management of the security requirements of cloud-assisted IoT systems can bring about risks to availability, security, performance, confidentiality, and privacy. The key reason for cloud- and IoT-enabled smart city application failure is improper security practices at the early stages of development. This article proposes a framework to collect security requirements during the initial development phase of cloud-assisted IoT-enabled smart city applications. Its three-layered architecture includes privacy-preserved stakeholder analysis (PPSA), security requirement modeling and validation (SRMV), and secure cloud-assistance (SCA). A case study highlights the applicability and effectiveness of the proposed framework. A hybrid survey enables the identification and evaluation of significant challenges.
Big Data applications are pervading more and more aspects of our life, encompassing commercial and scientific uses at increasing rates as we move towards exascale analytics. Examples of Big Data applications include storing and accessing user data in commercial clouds, mining of social data, and analysis of large-scale simulations and experiments such as the Large Hadron Collider. An increasing number of such data-intensive applications and services rely on clouds in order to process and manage the enormous amounts of data required for continuous operation. It can be difficult to decide which of the many options for cloud processing is suitable for a given application; the aim of this paper is therefore to provide an interested user with an overview of the most important concepts of cloud computing as it relates to the processing of Big Data.
The goal of this manuscript is to present a research finding, based on a study conducted to identify, examine, and validate Social Media (SM) socio-technical information security factors, in line with usable-security principles. The study followed literature search techniques as well as theoretical and empirical methods of factor validation. The literature search strategy included Boolean keyword searches and citation guides, using mainly Web of Science databases. As guided by the study objectives, 9 SM socio-technical factors were identified, verified, and validated, following both theoretical and empirical validation processes. A theoretical validity test was conducted on 45 Likert-scale items, involving 10 subject experts. From the experts' score ratings, the Content Validity Index (CVI) was calculated to determine the degree to which the identified factors exhibit appropriate items for the construct being measured; 7 factors attained an adequate level of validity index. For the reliability test, 32 respondents and 45 Likert-scale items were used, and Cronbach's alpha coefficients (α-values) were generated using SPSS; 8 factors attained an adequate level of reliability. Overall, the validated factors include: 1) usability (visibility, learnability, and satisfaction); 2) education and training (help and documentation); 3) SM technology development (error handling and revocability); and 4) information security (security, privacy, and expressiveness). The confirmed factors add knowledge by providing a theoretical basis for rationalizing information security requirements on SM usage.
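Cronbach's alpha, used above for the reliability test, can be computed directly from an item-score matrix; a sketch on hypothetical Likert responses (the study itself used SPSS, and these numbers are not its data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of each respondent's total
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical Likert responses: rows = respondents, columns = scale items
scores = [[4, 5, 4], [3, 4, 3], [5, 5, 4], [2, 3, 2], [4, 4, 5]]
alpha = cronbach_alpha(scores)
```

Values of alpha at or above roughly 0.7 are conventionally read as adequate internal consistency.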
To address the problems that data in the user rating matrix are missing and that the importance of implicit trust between users is ignored when the TrustSVD model is used to fill the matrix, this paper proposes a recommendation algorithm based on TrustSVD++ and XGBoost. First, explicit trust and implicit trust are introduced into the SVD++ model to construct the TrustSVD++ model. Second, considering that the interaction matrix contains a large amount of data after filling, which may lead to a rather complex calculation process, the K-means algorithm is introduced to cluster and extract user and item features at the same time. Then, in order to improve the accuracy of rating prediction for target users, an XGBoost model is proposed to train on the user and item features. Finally, the algorithm is verified on the MovieLens-1M and MovieLens-100k datasets. Experiments show that, compared with the SVD++ model and the recommendation algorithm without XGBoost model training, the proposed algorithm reduces the RMSE value by 2.9% and the MAE value by 3%.
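The reported gains are in RMSE and MAE; for reference, here are both metrics on hypothetical rating predictions (the values are illustrative, not from the experiments):

```python
def rmse(y_true, y_pred):
    """Root mean squared error over paired rating lists."""
    return (sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)) ** 0.5

def mae(y_true, y_pred):
    """Mean absolute error over paired rating lists."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical ratings on a 1-5 scale
actual    = [4.0, 3.0, 5.0, 2.0]
predicted = [3.5, 3.0, 4.0, 2.5]
```

RMSE penalizes large individual errors more heavily than MAE, which is why both are typically reported together.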
Urban public transport plays a critical role in stimulating the economic development of any nation, since most revenues come from cities, and the majority of city dwellers in any country use public transport. The evaluation of public transport service quality provides valuable feedback to commuter operators, to ensure continuous improvement of the level of service, and to the government, to take appropriate measures for enhancing the quality of public transport service. This paper analyses and evaluates the service quality of Road Public Transport (RPT) (i.e. minibuses and buses) and Urban Rail Transport (URT) in Dar es Salaam City, Tanzania. Since service quality and its attributes are intangible and vague, a fuzzy evaluation model is developed and applied. The formulated model is composed of the Fuzzy Entropy Method (FEM) and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS). The overall evaluation procedure is as follows: initially, an intensive literature search and experts' opinions are employed to establish criteria for evaluating the service quality of public transport in Dar es Salaam City. The developed FEM is then used to obtain criteria weights. Lastly, the formulated TOPSIS is used to provide an overall ranking of urban public transport service quality. The overall evaluation shows that urban rail transport outperforms road public transport in terms of service quality. Nevertheless, the urban rail transport service in Dar es Salaam City is currently not well developed, as it is provided on very limited routes. Thus, the Tanzanian government, the rail transport operator (Tanzania Railway Limited, TRL), and the agency responsible for the provision of rail infrastructure (Reli Assets Holding Company, RAHCO) are advised to design and employ Public-Private-Partnership (PPP) schemes, i.e. concession contracts, to invest more in rolling stock, locomotives, and rail wagons, so that the rail transport service becomes available on many of the routes currently served by road public transport, bringing fair competition between the two operators.
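The entropy-weighting and TOPSIS pipeline described above can be sketched compactly: entropy weighting rewards criteria that discriminate between alternatives, and TOPSIS ranks each alternative by its relative closeness to the ideal solution. A toy two-alternative example (scores hypothetical, all criteria treated as benefits, and none of this is the paper's fuzzified formulation):

```python
import numpy as np

def entropy_weights(X):
    """Entropy weighting: criteria whose scores vary more across alternatives
    carry more information and receive larger weights. X: (alternatives, criteria) > 0."""
    P = X / X.sum(axis=0)                    # column-normalize each criterion
    k = 1.0 / np.log(X.shape[0])
    E = -k * (P * np.log(P)).sum(axis=0)     # entropy per criterion
    d = 1.0 - E                              # degree of divergence
    return d / d.sum()

def topsis(X, w):
    """Relative closeness of each alternative to the ideal solution (benefit criteria)."""
    R = X / np.sqrt((X ** 2).sum(axis=0))    # vector normalization
    V = R * w                                # weighted normalized matrix
    ideal, anti = V.max(axis=0), V.min(axis=0)
    d_pos = np.sqrt(((V - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((V - anti) ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)           # higher = closer to ideal

# Hypothetical service-quality scores: rows = {road transport, rail transport}
X = np.array([[6.0, 5.0, 7.0],
              [8.0, 7.0, 6.0]])
w = entropy_weights(X)
closeness = topsis(X, w)
```

With these hypothetical scores the rail alternative attains the higher closeness coefficient, because the entropy weights concentrate on the two criteria where it dominates.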
In recent years, various efforts have been devoted to advancing university education through artificial intelligence (AI). To this end, this paper introduces KCUBE, a novel framework centered on knowledge graphs (KGs) designed to enhance student advising and career planning in university courses. Through KCUBE, we can improve university education in the AI era by leveraging the expressiveness, operability, and interpretability of KGs. We detail a bottom-up approach for KG construction, empowering professors to develop subject-specific KGs, augmented by tools like ChatGPT, which has demonstrated promising accuracy and coverage. Based on these KGs, KCUBE supports KG reasoning for applications such as automated teaching plan generation with dynamic editing capabilities. Furthermore, KCUBE offers advanced KG manipulation through 2D and 3D visualization platforms, such as virtual reality (VR) for immersive exploration of academic subjects and potential career paths. A comparative study on collaborative learning highlights the benefits of VR and KG-enhanced environments in promoting student engagement, participation, and collaborative decision-making.
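Teaching-plan generation over a prerequisite knowledge graph amounts to reasoning over its edges; one standard linearization is a topological sort, which orders concepts so every prerequisite is taught first. A toy sketch with hypothetical course concepts (not KCUBE's actual graph or algorithm):

```python
from graphlib import TopologicalSorter

# Hypothetical course KG: concept -> set of prerequisite concepts (its predecessors)
kg = {
    "variables": set(),
    "loops": {"variables"},
    "functions": {"variables"},
    "recursion": {"functions"},
    "dynamic programming": {"recursion", "loops"},
}

# A valid teaching plan is any order that respects every prerequisite edge
plan = list(TopologicalSorter(kg).static_order())
```

Dynamic editing of the plan then reduces to adding or removing edges and re-running the sort.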
Malaysia, as one of the highest producers of palm oil globally and one of its largest exporters, has huge potential to use palm oil waste to generate electricity, since an abundance of waste is produced during the palm oil extraction process. In this paper, we first examine and compare the use of palm oil waste as biomass for electricity generation in different countries with reference to Malaysia. Some rural areas with limited accessibility, like those in Sabah and Sarawak, require a cheap and reliable source of electricity, and palm oil waste has the potential to be that source. Therefore, this research examines the cost-effectiveness of electricity generated from palm oil waste compared with standalone diesel electric generation in Marudi, Sarawak, Malaysia. This research aims to investigate the potential for electricity generation using palm oil waste and the feasibility of implementing the technology in rural areas. To implement and analyze the feasibility, a case study has been carried out in a rural area of Sarawak, Malaysia. The findings show the electricity cost calculations for small towns like Long Lama, Long Miri, and Long Atip, with ten nearby schools, and suggest that using empty fruit bunches (EFB) from palm oil waste is cheaper and reduces greenhouse gas emissions. The study also points out the need for further research on power systems, such as energy storage and microgrids, to better understand the future of power systems. By collecting data through questionnaires and surveys, an analysis has been carried out to determine the approximate cost and quantity of palm oil waste needed to generate cheaper renewable energy. We conclude that electricity generation from palm oil waste is cost-effective and beneficial. The infrastructure can be a microgrid connected to the main grid.
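At its simplest, the cost comparison at the heart of such a case study reduces to fuel cost divided by electrical yield for each option; a sketch with purely illustrative figures (none of these numbers are taken from the study):

```python
# Hypothetical per-kWh generation cost comparison (all figures illustrative)
diesel_fuel_cost_per_l = 1.2   # USD per litre of diesel
diesel_kwh_per_l = 3.5         # electrical output per litre of diesel
efb_cost_per_kg = 0.02         # USD per kg of empty fruit bunches (EFB)
efb_kwh_per_kg = 0.35          # electrical output per kg of EFB

diesel_cost_per_kwh = diesel_fuel_cost_per_l / diesel_kwh_per_l
efb_cost_per_kwh = efb_cost_per_kg / efb_kwh_per_kg
```

A full feasibility analysis would of course also amortize capital, transport, and maintenance costs over the plant lifetime.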
Addressing classification and prediction challenges, tree ensemble models have gained significant importance. Boosting ensemble techniques are commonly employed for forecasting Type-II diabetes mellitus. Light Gradient Boosting Machine (LightGBM) is a widely used algorithm known for its leaf-wise growth strategy, loss reduction, and enhanced training precision. However, LightGBM is prone to overfitting. In contrast, CatBoost utilizes balanced base predictors known as decision tables, which mitigate overfitting risks and significantly improve testing time efficiency. CatBoost's algorithm structure counteracts gradient boosting biases and incorporates an overfitting detector to stop training early. This study focuses on developing a hybrid model that combines LightGBM and CatBoost to minimize overfitting and improve accuracy by reducing variance. Bayesian hyperparameter optimization is used to find the best hyperparameters for the underlying learners. By fine-tuning the regularization parameter values, the hybrid model effectively reduces variance (overfitting). Comparative evaluation against the LightGBM, CatBoost, XGBoost, Decision Tree, Random Forest, AdaBoost, and GBM algorithms demonstrates that the hybrid model achieves the best F1-score (99.37%), recall (99.25%), and accuracy (99.37%). Consequently, the proposed framework holds promise for early diabetes prediction in the healthcare industry and exhibits potential applicability to other datasets sharing similarities with diabetes.
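The reported precision, recall, and F1-score all derive from confusion-matrix counts; for reference, their standard definitions on hypothetical counts (not the study's results):

```python
def precision_recall_f1(tp, fp, fn):
    """Precision, recall, and F1 from confusion-matrix counts:
    precision = TP/(TP+FP), recall = TP/(TP+FN), F1 = their harmonic mean."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Hypothetical counts for a diabetes classifier
p, r, f1 = precision_recall_f1(tp=99, fp=1, fn=1)
```

F1 is the metric of choice here because accuracy alone can be misleading on class-imbalanced medical data.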
Low Earth Orbit (LEO) satellites have gained significant attention for their low-latency communication and computing capabilities, but face challenges due to high mobility and limited resources. Existing studies integrate edge computing with LEO satellite networks to optimize task offloading; however, they often overlook the impact of frequent topology changes, unstable transmission links, and intermittent satellite visibility, leading to task execution failures and increased latency. To address these issues, this paper proposes a dynamic integrated space-ground computing framework that optimizes task offloading under LEO satellite mobility constraints. We design an adaptive task migration strategy through inter-satellite links for when target satellites become inaccessible. To enhance data transmission reliability, we introduce a communication stability constraint based on the transmission bit error rate (BER). Additionally, we develop a genetic algorithm (GA)-based task scheduling method that dynamically allocates computing resources while minimizing latency and energy consumption. Our approach jointly considers satellite computing capacity, link stability, and task execution reliability to achieve efficient task offloading. Experimental results demonstrate that the proposed method significantly improves task execution success rates, reduces system overhead, and enhances overall computational efficiency in LEO satellite networks.
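A GA-based scheduler of this kind can be sketched in miniature: chromosomes encode task-to-satellite assignments and fitness is the objective to minimize. A toy version with tournament selection, one-point crossover, and point mutation (all parameters hypothetical, and the fitness here is summed latency only, whereas the paper's objective also covers energy and reliability):

```python
import random

def ga_assign(costs, pop_size=30, generations=60, seed=1):
    """Toy genetic algorithm for task offloading: assign each task t to a
    satellite s so that the total latency costs[t][s] is minimized."""
    random.seed(seed)
    n_tasks, n_sats = len(costs), len(costs[0])

    def fitness(a):
        return sum(costs[t][a[t]] for t in range(n_tasks))

    pop = [[random.randrange(n_sats) for _ in range(n_tasks)] for _ in range(pop_size)]
    best = min(pop, key=fitness)
    for _ in range(generations):
        nxt = []
        for _ in range(pop_size):
            p1 = min(random.sample(pop, 3), key=fitness)   # tournament selection
            p2 = min(random.sample(pop, 3), key=fitness)
            cut = random.randrange(1, n_tasks)             # one-point crossover
            child = p1[:cut] + p2[cut:]
            if random.random() < 0.2:                      # point mutation
                child[random.randrange(n_tasks)] = random.randrange(n_sats)
            nxt.append(child)
        pop = nxt
        best = min(pop + [best], key=fitness)              # elitism: never lose the best
    return best, fitness(best)

# Hypothetical latency costs: rows = tasks, columns = candidate satellites
costs = [[4, 1, 7], [2, 6, 3], [5, 2, 8], [1, 9, 4]]
assignment, latency = ga_assign(costs)
```

Visibility and BER constraints would enter this sketch as penalty terms or infeasibility checks inside the fitness function.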
Point of interest (POI) recommendation analyses user preferences through historical check-in data. However, existing POI recommendation methods often overlook the influence of weather information and face the challenge of sparse historical data for individual users. To address these issues, this paper proposes a new paradigm, namely the temporal-weather-aware transition pattern for POI recommendation (TWTransNet). This paradigm is designed to capture user transition patterns under different times and weather conditions. Additionally, we introduce the construction of a user-POI interaction graph to alleviate the problem of sparse historical data for individual users. Furthermore, when predicting user interests by aggregating graph information, some POIs may not be suitable for visitation under current weather conditions. To account for this, we propose an attention mechanism to filter POI neighbours when aggregating information from the graph, considering the impact of weather and time. Empirical results on two real-world datasets demonstrate the superior performance of our proposed method, showing a substantial improvement of 6.91%-23.31% in terms of prediction accuracy.
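The neighbour-filtering attention described above can be sketched as a softmax over context-relevance scores: neighbours that fit the current time/weather context poorly get low weight in the aggregation. A toy numpy version (the embeddings and context vector are random stand-ins, not TWTransNet's learned parameters):

```python
import numpy as np

def softmax(z):
    z = z - z.max()              # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def attend_neighbours(neighbours, context):
    """Attention over POI neighbours: score each neighbour embedding against a
    context vector (standing in for time + weather), softmax the scores, and
    return the attention-weighted sum of neighbour embeddings."""
    scores = neighbours @ context   # one relevance score per neighbour
    weights = softmax(scores)       # low weight = filtered out of the aggregation
    return weights @ neighbours, weights

rng = np.random.default_rng(7)
neighbours = rng.standard_normal((5, 8))  # 5 neighbouring POIs, 8-dim embeddings
context = rng.standard_normal(8)          # hypothetical time/weather context vector
agg, weights = attend_neighbours(neighbours, context)
```

In a learned model the scores would come from trainable projections of the embeddings and context rather than a raw dot product.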
When all the data involved in an indefinite quadratic program change simultaneously, we first show the local Lipschitz continuity of the KKT set of the quadratic programming problem, and then establish the local Lipschitz continuity of the KKT solution set. Finally, a similar conclusion is obtained for the corresponding optimal value function.
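For reference, the standard notion of local Lipschitz continuity for a set-valued solution map, stated here generically rather than as the paper's theorem:

```latex
% S(w) denotes the KKT (or solution) set of the quadratic program with data w;
% \mathbb{B} is the closed unit ball and B(\bar{w}, \delta) a data neighbourhood.
\[
  S \text{ is locally Lipschitz at } \bar{w}
  \iff \exists\, L > 0,\ \delta > 0 \ \text{such that}\
  S(w_1) \subseteq S(w_2) + L \,\lVert w_1 - w_2 \rVert\, \mathbb{B}
  \quad \text{for all } w_1, w_2 \in B(\bar{w}, \delta).
\]
```

Intuitively, a perturbation of the problem data can move the KKT set by at most a fixed multiple of the perturbation's size, uniformly near the nominal data.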
In the Internet of Things (IoT) system, relay communication is widely used to solve the problem of energy loss in long-distance transmission and improve transmission efficiency. In Body Sensor Network (BSN) systems, biosensors communicate with receiving devices through relay nodes to improve their limited energy efficiency. When the relay node fails, the biosensor can communicate directly with the receiving device by releasing more transmitting power. However, if the remaining battery power of the biosensor is insufficient to enable it to communicate directly with the receiving device, the biosensor will be isolated by the system. Therefore, a new combinatorial analysis method is proposed to analyze the influence of random isolation time (RIT) on system reliability, and the competition relationship between biosensor isolation and propagation failure is considered. This approach inherits the advantages of common combinatorial algorithms and provides a new way to effectively address the impact of RIT on system reliability in IoT systems, which are affected by competing failures. Finally, the method is applied to the BSN system, and the effect of RIT on the system reliability is analyzed in detail.
Abstract: This review examines human vulnerabilities in cybersecurity within Microfinance Institutions, analyzing their impact on organizational resilience. Focusing on social engineering, inadequate security training, and weak internal protocols, the study identifies key vulnerabilities exacerbating cyber threats to MFIs. A literature review using databases like IEEE Xplore and Google Scholar focused on studies from 2019 to 2023 addressing human factors in cybersecurity specific to MFIs. Analysis of 57 studies reveals that phishing and insider threats are predominant, with a 20% annual increase in phishing attempts. Employee susceptibility to these attacks is heightened by insufficient training, with entry-level employees showing the highest vulnerability rates. Further, only 35% of MFIs offer regular cybersecurity training, significantly impacting incident reduction. This paper recommends enhanced training frequency, robust internal controls, and a cybersecurity-aware culture to mitigate human-induced cyber risks in MFIs.
Funding: This research has been supported by the National Science Foundation (under grant #1723596) and the National Security Agency (under grant #H98230-17-1-0355).
Abstract: Pervasive IoT applications enable us to perceive, analyze, control, and optimize the traditional physical systems. Recently, security breaches in many IoT applications have indicated that IoT applications may put the physical systems at risk. Severe resource constraints and insufficient security design are two major causes of many security problems in IoT applications. As an extension of the cloud, the emerging edge computing with rich resources provides us a new venue to design and deploy novel security solutions for IoT applications. Although there are some research efforts in this area, edge-based security designs for IoT applications are still in their infancy. This paper aims to present a comprehensive survey of existing IoT security solutions at the edge layer as well as to inspire more edge-based IoT security designs. We first present an edge-centric IoT architecture. Then, we extensively review the edge-based IoT security research efforts in the context of security architecture designs, firewalls, intrusion detection systems, authentication and authorization protocols, and privacy-preserving mechanisms. Finally, we offer our insight into future research directions and open research issues.
Funding: Supported by Alberta Innovates-Bio Solutions and a graduate studentship from Alberta Innovates-Health Solutions (to Keshteli AH).
Abstract: AIM: To identify demographic, clinical, metabolomic, and lifestyle-related predictors of relapse in adult ulcerative colitis (UC) patients. METHODS: In this prospective pilot study, UC patients in clinical remission were recruited and followed up for 12 mo to assess whether a clinical relapse occurred. At baseline, information on demographic and clinical parameters was collected. Serum and urine samples were collected for metabolomic assays using combined direct infusion/liquid chromatography tandem mass spectrometry and nuclear magnetic resonance spectroscopy. Stool samples were also collected to measure fecal calprotectin (FCP). Dietary assessment was performed using a validated self-administered food frequency questionnaire. RESULTS: Twenty patients were included (mean age: 42.7 ± 14.8 years, females: 55%). Seven patients (35%) experienced a clinical relapse during the follow-up period. While 6 patients (66.7%) with normal body weight developed a clinical relapse, only 1 overweight/obese UC patient (9.1%) relapsed during the follow-up (P = 0.02). At baseline, poultry intake was significantly higher in patients who remained in remission during follow-up (0.9 oz vs 0.2 oz, P = 0.002). Five patients (71.4%) with FCP > 150 μg/g and 2 patients (15.4%) with normal FCP (≤ 150 μg/g) at baseline relapsed during the follow-up (P = 0.02). Interestingly, baseline urinary and serum metabolomic profiles of UC patients with or without clinical relapse within 12 mo differed significantly. The metabolites most responsible for this discrimination were trans-aconitate, cystine, and acetamide in urine, and 3-hydroxybutyrate, acetoacetate, and acetone in serum. CONCLUSION: A combination of baseline dietary intake, fecal calprotectin, and metabolomic factors is associated with the risk of UC clinical relapse within 12 mo.
Funding: Funded by National Natural Science Foundation of China Key Projects (81130024, 91332205, and 81630030); the National Key Technology R&D Program of the Ministry of Science and Technology of China (2016YFC0904300); the National Natural Science Foundation of China/Research Grants Council of Hong Kong Joint Research Scheme (8141101084); the Natural Science Foundation of China (8157051859); the Sichuan Science & Technology Department (2015JY0173); the Canadian Institutes of Health Research; Alberta Innovates: Centre for Machine Learning; and the Canadian Depression Research & Intervention Network.
Abstract: Neurocognitive deficits are frequently observed in patients with schizophrenia and major depressive disorder (MDD). The relations between cognitive features may be represented by neurocognitive graphs based on cognitive features, modeled as Gaussian Markov random fields. However, it is unclear whether it is possible to differentiate between phenotypic patterns associated with the differential diagnosis of schizophrenia and depression using this neurocognitive graph approach. In this study, we enrolled 215 first-episode patients with schizophrenia (FES), 125 with MDD, and 237 demographically matched healthy controls (HCs). The cognitive performance of all participants was evaluated using a battery of neurocognitive tests. The graphical LASSO model was trained in a one-vs-one scenario to learn the conditionally independent structure of the neurocognitive features of each group. Participants in the holdout dataset were classified into the group with the highest likelihood. A partial correlation matrix was derived from the graphical model to further explore the neurocognitive graph of each group. The classification approach identified the diagnostic class of individuals with an average accuracy of 73.41% for FES vs HC, 67.07% for MDD vs HC, and 59.48% for FES vs MDD. Both the FES and MDD neurocognitive graphs had more connections and higher node centrality than the HC graph. The neurocognitive graph for FES was less sparse and had more connections than that for MDD. Thus, neurocognitive graphs based on cognitive features are promising for describing endophenotypes that may discriminate schizophrenia from depression.
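The partial correlation matrix mentioned above is obtained from the precision (inverse covariance) matrix Θ that the graphical LASSO estimates, via ρ_ij = -Θ_ij / sqrt(Θ_ii Θ_jj); a zero entry in Θ means conditional independence of the two features. A small stdlib sketch with a hand-made precision matrix (not the study's data):

```python
import math

def precision_to_partial_corr(theta):
    """Convert a precision matrix (list of row lists) into the
    partial-correlation matrix used to read off the neurocognitive graph."""
    n = len(theta)
    return [[1.0 if i == j
             else -theta[i][j] / math.sqrt(theta[i][i] * theta[j][j])
             for j in range(n)]
            for i in range(n)]

# Hypothetical 3-feature precision matrix; the zero at (0, 2) encodes
# conditional independence of features 0 and 2 (no graph edge).
theta = [[ 2.0, -0.8,  0.0],
         [-0.8,  2.5, -0.5],
         [ 0.0, -0.5,  1.5]]
pcorr = precision_to_partial_corr(theta)
```

Edges of the group's neurocognitive graph are the nonzero off-diagonal entries of `pcorr`.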
Funding: Partially supported by the National Key Research and Development Program of China (2018YFB1700200); the National Natural Science Foundation of China (61972389, 61903356, 61803368, U1908212); the Youth Innovation Promotion Association of the Chinese Academy of Sciences; the National Science and Technology Major Project (2017ZX02101007-004); the Liaoning Provincial Natural Science Foundation of China (2020-MS-034, 2019-YQ-09); and the China Postdoctoral Science Foundation (2019M661156).
Abstract: Time-sensitive networks (TSNs) support not only traditional best-effort communications but also deterministic communications, which send each packet at a deterministic time so that the data transmissions of networked control systems can be precisely scheduled to guarantee hard real-time constraints. No-wait scheduling is suitable for such TSNs and generates the schedules of deterministic communications with minimal network resources, so that all of the remaining resources can be used to improve the throughput of best-effort communications. However, due to inappropriate message fragmentation, the real-time performance of no-wait scheduling algorithms is reduced. Therefore, in this paper, joint algorithms for message fragmentation and no-wait scheduling are proposed. First, a specification of the joint problem based on optimization modulo theories is proposed so that off-the-shelf solvers can be used to find optimal solutions. Second, to improve the scalability of our approach, the worst-case delay of messages is analyzed, and based on this analysis, a heuristic algorithm is proposed to construct low-delay schedules. Finally, we conduct extensive experiments to evaluate the proposed algorithms. The evaluation results indicate that, compared to existing algorithms, the proposed joint algorithm improves schedulability by up to 50%.
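The paper's method combines OMT solving with a delay-analysis-guided heuristic; the toy sketch below only illustrates the core no-wait idea on a single link: transmissions are placed back-to-back with no queuing gaps, here in earliest-deadline-first order (the ordering rule and message data are assumptions for illustration, not the paper's algorithm):

```python
def no_wait_schedule(messages):
    """Greedy no-wait placement on one link: transmit messages back-to-back
    (no gaps) in earliest-deadline-first order.
    messages: list of (name, duration, deadline) tuples.
    Returns (schedule, feasible) with schedule entries (name, start, end)."""
    schedule, t, feasible = [], 0, True
    for name, duration, deadline in sorted(messages, key=lambda m: m[2]):
        schedule.append((name, t, t + duration))
        t += duration                      # no-wait: next frame starts immediately
        feasible = feasible and t <= deadline
    return schedule, feasible

# Hypothetical messages: ("a", 2 time units, deadline 10), ("b", 1, 3).
sched, ok = no_wait_schedule([("a", 2, 10), ("b", 1, 3)])
```

Because frames are packed without gaps, the schedule consumes the minimum link time; fragmentation (splitting a long message) would change the durations fed into this placement.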
Funding: Project supported partly by the Rockefeller Foundation thesis dissertation training grant and the National Hi-Tech Research and Development Program (863) of China.
Abstract: Genetic improvement of drought stress tolerance in rice involves the quantitative nature of the trait, which reflects the additive effects of several genetic loci throughout the genome. Yield components and related traits under stressed and well-watered conditions were assayed in mapping populations derived from crosses of Azucena×IR64 and Azucena×Bala. To find the candidate rice genes underlying Quantitative Trait Loci (QTL) in these populations, we conducted an in silico analysis of a candidate region flanked by the genetic markers RM212 and RM319 on chromosome 1, proximal to the semi-dwarf (sd1) locus. A total of 175 annotated genes were identified in this region. These included 48 genes annotated by functional homology to known genes, 23 pseudogenes, 24 ab initio predicted genes supported by an alignment match to an Expressed Sequence Tag (EST) of unknown function, and 80 hypothetical genes predicted solely by ab initio means. Among these, 16 candidate genes could potentially be involved in the drought stress response.
Funding: An output from the research project entitled "Study on the National AOD Retrieval System based on MODIS Data", supported by the Special Funds for Basic Research of the Chinese Academy of Meteorological Sciences (CAMS), Chinese Meteorological Administration (CMA) (2007Y001); "Multi-scale Aerosol Optical Thickness Quantitative Retrieval from Remotely Sensing Data at Urban Area" (40671142); projects funded by the National Natural Science Foundation of China (Grant Nos. 40871173, 40601068); the Innovation Fund of the State Key Laboratory of Remote Sensing Sciences, Institute of Remote Sensing Applications, Chinese Academy of Sciences (Grant Nos. 07S00502CX, 03Q0033049); and "Aerosol over China and Their Climate Effect", supported by the National Basic Research Program of China (2006CB403701).
Abstract: In this paper, a novel algorithm for aerosol optical depth (AOD) retrieval with a 1 km spatial resolution over land is presented, using the Advanced Along Track Scanning Radiometer (AATSR) dual-view capability at 0.55, 0.66, and 0.87 μm in combination with the Bi-directional Reflectance Distribution Function (BRDF) model, a product of the Moderate Resolution Imaging Spectroradiometer (MODIS). The BRDF characteristics of the land surface, i.e., the prior input parameters for this algorithm, are computed by extracting the geometrical information from AATSR and reducing the kernels from the MODIS BRDF/Albedo Model Parameters Product. Finally, AOD with a 1 km resolution at 0.55, 0.66, and 0.87 μm for the forward and nadir views of AATSR can be obtained simultaneously. Extensive validations of AOD derived from AATSR during the period from August 2005 to July 2006 in Beijing and its surrounding area, against in-situ AErosol RObotic NETwork (AERONET) measurements, were performed. The AOD difference between the retrievals from the forward and nadir views of AATSR was less than 5.72%, 1.9%, and 13.7%, respectively. Meanwhile, it was found that the AATSR retrievals using the synergic algorithm developed in this paper are more favorable than those obtained by assuming a Lambertian surface: the coefficient of determination between AATSR-derived AOD and AERONET-measured AOD was 15.5% and 18.5% lower when a Lambertian surface was assumed than with the synergic algorithm. This further suggests that the synergic algorithm can potentially be used in climate change and air quality monitoring.
Funding: Taif University Researchers Supporting Project No. (TURSP-2020/126), Taif University, Taif, Saudi Arabia.
Abstract: The world is rapidly changing with the advance of information technology. The expansion of the Internet of Things (IoT) is a huge step in the development of the smart city. The IoT consists of connected devices that transfer information. The IoT architecture permits on-demand services to a public pool of resources. Cloud computing plays a vital role in developing IoT-enabled smart applications. The integration of cloud computing enhances the offering of distributed resources in the smart city. Improper management of the security requirements of cloud-assisted IoT systems can bring about risks to availability, security, performance, confidentiality, and privacy. The key reason for cloud- and IoT-enabled smart city application failure is improper security practices at the early stages of development. This article proposes a framework to collect security requirements during the initial development phase of cloud-assisted IoT-enabled smart city applications. Its three-layered architecture includes privacy-preserved stakeholder analysis (PPSA), security requirement modeling and validation (SRMV), and secure cloud-assistance (SCA). A case study highlights the applicability and effectiveness of the proposed framework. A hybrid survey enables the identification and evaluation of significant challenges.
Abstract: Big Data applications are pervading more and more aspects of our life, encompassing commercial and scientific uses at increasing rates as we move towards exascale analytics. Examples of Big Data applications include storing and accessing user data in commercial clouds, mining of social data, and analysis of large-scale simulations and experiments such as the Large Hadron Collider. An increasing number of such data-intensive applications and services rely on clouds in order to process and manage the enormous amounts of data required for continuous operation. It can be difficult to decide which of the many options for cloud processing is suitable for a given application; the aim of this paper is therefore to provide an interested user with an overview of the most important concepts of cloud computing as it relates to the processing of Big Data.
Abstract: The goal of this manuscript is to present research findings, based on a study conducted to identify, examine, and validate Social Media (SM) socio-technical information security factors, in line with usable-security principles. The study followed literature search techniques as well as theoretical and empirical methods of factor validation. The literature search strategy included Boolean keyword searches and citation guides, using mainly Web of Science databases. As guided by the study objectives, 9 SM socio-technical factors were identified, verified, and validated. Both theoretical and empirical validation processes were followed. Thus, a theoretical validity test was conducted on 45 Likert-scale items, involving 10 subject experts. From the experts' score ratings, the Content Validity Index (CVI) was calculated to determine the degree to which the identified factors exhibit appropriate items for the construct being measured, and 7 factors attained an adequate level of validity index. For the reliability test, 32 respondents and 45 Likert-scale items were used, and Cronbach's alpha coefficients (α-values) were generated using SPSS. Subsequently, 8 factors attained an adequate level of reliability. Overall, the validated factors include: 1) usability: visibility, learnability, and satisfaction; 2) education and training: help and documentation; 3) SM technology development: error handling and revocability; 4) information security: security, privacy, and expressiveness. In this case, the confirmed factors add knowledge by providing a theoretical basis for rationalizing information security requirements on SM usage.
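The Cronbach's alpha reliability coefficient used above is simple to compute directly from item scores via the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A stdlib sketch with toy score data (not the study's 45-item instrument):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale.
    items: one inner list of scores per item, all over the same respondents."""
    k = len(items)                      # number of items
    n = len(items[0])                   # number of respondents

    def var(xs):                        # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Two perfectly correlated toy items over three respondents.
alpha = cronbach_alpha([[1, 2, 3], [2, 4, 6]])
```

Values of alpha near 1 indicate high internal consistency; a common (rule-of-thumb) adequacy threshold is around 0.7.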
Funding: Guangdong Science and Technology University Young Projects (GKY-2023KYQNK-1 and GKY-2023KYQNK-10); Guangdong Provincial Key Discipline Research Capacity Improvement Project (2022ZDJS147).
Abstract: To address the problems that data in the user rating matrix are missing and that the importance of implicit trust between users is ignored when the TrustSVD model is used to fill the matrix, this paper proposes a recommendation algorithm based on TrustSVD++ and XGBoost. First, explicit trust and implicit trust are introduced into the SVD++ model to construct the TrustSVD++ model. Second, considering that the filled interaction matrix contains a large amount of data, which may make the calculation process rather complex, the K-means algorithm is introduced to cluster and extract user and item features simultaneously. Then, to improve the accuracy of rating prediction for target users, an XGBoost model is used to train the user and item features. Finally, the approach is verified on the MovieLens-1M and MovieLens-100k datasets. Experiments show that, compared with the SVD++ model and the recommendation algorithm without XGBoost training, the proposed algorithm reduces the RMSE value by 2.9% and the MAE value by 3%.
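The RMSE and MAE figures reported above are the standard rating-prediction error metrics; a minimal stdlib sketch of both (the ratings below are illustrative, not MovieLens data):

```python
import math

def rmse(actual, predicted):
    """Root mean squared error: penalizes large rating errors quadratically."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted))
                     / len(actual))

def mae(actual, predicted):
    """Mean absolute error: average magnitude of rating errors."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

# Hypothetical true vs predicted ratings for three user-item pairs.
true_r = [3.0, 4.0, 5.0]
pred_r = [3.0, 4.0, 4.0]
e_rmse, e_mae = rmse(true_r, pred_r), mae(true_r, pred_r)
```

Lower is better for both; RMSE is more sensitive than MAE to occasional large mispredictions, so reporting the pair gives a fuller picture.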
Abstract: Urban public transport plays a critical role in stimulating the economic development of any nation, since most revenues come from cities and the majority of city dwellers use public transport. The evaluation of public transport service quality provides valuable feedback to commuter operators, to ensure continuous improvement of the level of service, and to the government, to take appropriate measures for enhancing the quality of public transport service. This paper analyses and evaluates the service quality of Road Public Transport (RPT) (i.e., minibuses and buses) and Urban Rail Transport (URT) in Dar es Salaam City, Tanzania. Since service quality and its attributes are intangible and vague, a fuzzy evaluation model is developed and applied. The formulated model combines the Fuzzy Entropy Method (FEM) and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS). The overall evaluation procedure is as follows: initially, an intensive literature search and experts' opinions are employed to establish criteria for evaluating the service quality of public transport in Dar es Salaam City. The developed FEM is then used to obtain criteria weights. Lastly, the formulated TOPSIS is used to provide an overall ranking of urban public transport service quality. The overall evaluation shows that urban rail transport outperforms road public transport in terms of service quality. Nevertheless, the urban rail transport service in Dar es Salaam City is currently not well developed, as it is provided on very limited routes. Thus, the Tanzania government, the rail transport operator (Tanzania Railway Limited, TRL), and the agency responsible for the provision of rail infrastructure (Reli Assets Holding Company, RAHCO) are advised to design and employ Public-Private-Partnership (PPP) schemes, i.e., concession contracts, to invest more in rolling stock, locomotives, and rail wagons, so that the rail transport service becomes available on many of the routes served by road public transport, bringing fair competition between the two operators.
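The entropy-weighting-plus-TOPSIS pipeline described above can be sketched in a few lines for crisp (non-fuzzy) benefit criteria; the fuzzification step of the paper's FEM is omitted, and the two-alternative score matrix below is purely hypothetical:

```python
import math

def entropy_weights(matrix):
    """Entropy method: criteria whose scores vary more across alternatives
    get larger weights. Rows = alternatives, columns = benefit criteria."""
    m, n = len(matrix), len(matrix[0])
    k = 1.0 / math.log(m)
    diversity = []
    for j in range(n):
        col = [row[j] for row in matrix]
        s = sum(col)
        p = [x / s for x in col]
        e = -k * sum(pi * math.log(pi) for pi in p if pi > 0)  # entropy in [0, 1]
        diversity.append(1 - e)
    total = sum(diversity)
    return [d / total for d in diversity]

def topsis(matrix, weights):
    """Closeness of each alternative to the ideal solution (benefit criteria)."""
    m, n = len(matrix), len(matrix[0])
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    ideal = [max(v[i][j] for i in range(m)) for j in range(n)]
    anti = [min(v[i][j] for i in range(m)) for j in range(n)]
    scores = []
    for i in range(m):
        d_pos = math.sqrt(sum((v[i][j] - ideal[j]) ** 2 for j in range(n)))
        d_neg = math.sqrt(sum((v[i][j] - anti[j]) ** 2 for j in range(n)))
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Hypothetical scores: two alternatives (e.g. rail vs road) on two criteria.
score_matrix = [[0.9, 0.8],
                [0.5, 0.4]]
w = entropy_weights(score_matrix)
closeness = topsis(score_matrix, w)
```

The alternative with the highest closeness score ranks first; here the first row dominates on both criteria, so it attains the maximum closeness of 1.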
Abstract: In recent years, various efforts have been devoted to advancing university education through artificial intelligence (AI). To this end, this paper introduces KCUBE, a novel framework centered on knowledge graphs (KGs) designed to enhance student advising and career planning in university courses. Owing to KCUBE, we can improve university education in the AI era by leveraging the expressiveness, operability, and interpretability of KGs. We detail a bottom-up approach to KG construction, empowering professors to develop subject-specific KGs, augmented by tools like ChatGPT, which has demonstrated promising accuracy and coverage. Based on KGs, KCUBE supports KG reasoning for applications such as automated teaching plan generation with dynamic editing capabilities. Furthermore, KCUBE offers advanced KG manipulation through 2D and 3D visualization platforms, such as virtual reality (VR) for immersive exploration of academic subjects and potential career paths. A comparative study on collaborative learning highlights the benefits of VR and KG-enhanced environments in promoting student engagement, participation, and collaborative decision-making.
Abstract: Malaysia, as one of the highest producers of palm oil globally and one of its largest exporters, has huge potential to use palm oil waste to generate electricity, since an abundance of waste is produced during the palm oil extraction process. In this paper, we first examine and compare the use of palm oil waste as biomass for electricity generation in different countries, with reference to Malaysia. Some rural areas with limited accessibility, like those in Sabah and Sarawak, require a cheap and reliable source of electricity, and palm oil waste has the potential to be that source. Therefore, this research examines the cost-effectiveness of electricity generated from palm oil waste compared with standalone diesel generation in Marudi, Sarawak, Malaysia. It aims to investigate the potential for electricity generation using palm oil waste and the feasibility of implementing the technology in rural areas. To analyze this feasibility, a case study was carried out in a rural area of Sarawak, Malaysia. The findings, based on electricity cost calculations for small towns like Long Lama, Long Miri, and Long Atip, together with ten nearby schools, suggest that using empty fruit bunch (EFB) waste from palm oil is cheaper and reduces greenhouse gas emissions. The study also points out the need for further research on power systems, such as energy storage and microgrids, to better understand their future. By collecting data through questionnaires and surveys, an analysis was carried out to determine the approximate cost and quantity of palm oil waste needed to generate cheaper renewable energy. We conclude that electricity generation from palm oil waste is cost-effective and beneficial, and that the infrastructure can be a microgrid connected to the main grid.
Abstract: Addressing classification and prediction challenges, tree ensemble models have gained significant importance. Boosting ensemble techniques are commonly employed for forecasting Type-II diabetes mellitus. Light Gradient Boosting Machine (LightGBM) is a widely used algorithm known for its leaf-wise growth strategy, loss reduction, and enhanced training precision. However, LightGBM is prone to overfitting. In contrast, CatBoost utilizes balanced base predictors known as decision tables, which mitigate overfitting risks and significantly improve testing-time efficiency. CatBoost's algorithm structure counteracts gradient boosting biases and incorporates an overfitting detector to stop training early. This study focuses on developing a hybrid model that combines LightGBM and CatBoost to minimize overfitting and improve accuracy by reducing variance. Bayesian hyperparameter optimization is used to find the best hyperparameters for the underlying learners. By fine-tuning the regularization parameter values, the hybrid model effectively reduces variance (overfitting). Comparative evaluation against the LightGBM, CatBoost, XGBoost, Decision Tree, Random Forest, AdaBoost, and GBM algorithms demonstrates that the hybrid model achieves the best F1-score (99.37%), recall (99.25%), and accuracy (99.37%). Consequently, the proposed framework holds promise for early diabetes prediction in the healthcare industry and exhibits potential applicability to other datasets sharing similarities with diabetes.
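The actual hybrid trains LightGBM and CatBoost with Bayesian hyperparameter optimization; the stdlib sketch below only illustrates the variance-reducing combination step, blending two base learners' predicted probabilities (all probabilities and the 0.5 weight are hypothetical placeholders for the trained models' outputs):

```python
def blend_probabilities(p_a, p_b, w=0.5):
    """Weighted average of two base learners' predicted probabilities;
    w weights the first model. Averaging reduces variance when the two
    models' errors are not perfectly correlated."""
    return [w * a + (1.0 - w) * b for a, b in zip(p_a, p_b)]

def predict_labels(probs, threshold=0.5):
    """Threshold blended probabilities into class labels."""
    return [1 if p >= threshold else 0 for p in probs]

# Hypothetical per-patient probabilities from two base models.
model_a_p = [0.90, 0.20, 0.60, 0.05]
model_b_p = [0.70, 0.40, 0.80, 0.15]
blended = blend_probabilities(model_a_p, model_b_p)
labels = predict_labels(blended)
```

In practice `w` would itself be tuned on a validation split alongside each learner's regularization parameters.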
Funding: Supported by Guangdong Basic and Applied Basic Research Project (No. 2025A1515012874); Foundation of Yunnan Key Laboratory of Service Computing (No. YNSC24115); Research Project of Pazhou Lab for Excellent Young Scholars (No. PZL2021KF0024); Guangdong Undergraduate Teaching Quality and Teaching Reform Project; University Research Project of Guangzhou Education Bureau (No. 2024312189); Guangzhou Basic and Applied Basic Research Project (No. SL2024A03J00397); National Natural Science Foundation of China (No. 62272113); and Guangzhou Basic Research Program (No. 2024A03J0398).
Abstract: Low Earth Orbit (LEO) satellites have gained significant attention for their low-latency communication and computing capabilities, but face challenges due to high mobility and limited resources. Existing studies integrate edge computing with LEO satellite networks to optimize task offloading; however, they often overlook the impact of frequent topology changes, unstable transmission links, and intermittent satellite visibility, leading to task execution failures and increased latency. To address these issues, this paper proposes a dynamic integrated space-ground computing framework that optimizes task offloading under LEO satellite mobility constraints. We design an adaptive task migration strategy through inter-satellite links for cases where target satellites become inaccessible. To enhance data transmission reliability, we introduce a communication stability constraint based on the transmission bit error rate (BER). Additionally, we develop a genetic algorithm (GA)-based task scheduling method that dynamically allocates computing resources while minimizing latency and energy consumption. Our approach jointly considers satellite computing capacity, link stability, and task execution reliability to achieve efficient task offloading. Experimental results demonstrate that the proposed method significantly improves task execution success rates, reduces system overhead, and enhances overall computational efficiency in LEO satellite networks.
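The paper's GA jointly optimizes latency and energy under link-stability constraints; the toy stdlib sketch below shows only the basic GA mechanics (chromosome = task-to-satellite assignment, elitist selection, one-point crossover, random mutation) for a latency-only cost matrix, with all numbers hypothetical:

```python
import random

def ga_assign(costs, pop_size=30, generations=60, mut_rate=0.1, seed=0):
    """Tiny elitist GA for task offloading.
    costs[t][s] = latency of running task t on satellite s.
    A chromosome assigns each task to one satellite; lower total cost is fitter."""
    rng = random.Random(seed)
    n_tasks, n_sats = len(costs), len(costs[0])

    def total_cost(chrom):
        return sum(costs[t][s] for t, s in enumerate(chrom))

    # Random initial population of assignments.
    pop = [[rng.randrange(n_sats) for _ in range(n_tasks)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=total_cost)
        survivors = pop[: pop_size // 2]          # elitism: keep the best half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_tasks) if n_tasks > 1 else 0
            child = a[:cut] + b[cut:]              # one-point crossover
            if rng.random() < mut_rate:            # random reassignment mutation
                child[rng.randrange(n_tasks)] = rng.randrange(n_sats)
            children.append(child)
        pop = survivors + children
    return min(pop, key=total_cost)

# Hypothetical latency matrix: 3 tasks x 3 satellites.
latency = [[5, 1, 9],
           [2, 8, 3],
           [4, 4, 1]]
best = ga_assign(latency)
```

A real scheduler would extend `total_cost` with energy terms and reject chromosomes violating the BER-based stability constraint.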
Funding: Supported by the Stable Support Project of Shenzhen (20231120161634002); Shenzhen Science and Technology Programme (JCYJ20240813141417023); Natural Science Foundation of Guangdong Province of China (2025A1515010233); Guangdong Provincial Department of Education (2024KTSCX060); Tencent "Rhinoceros Birds" Scientific Research Foundation for Young Teachers of Shenzhen University; Open Project of the State Key Laboratory for Novel Software Technology of Nanjing University (KFKT2025B22); Hong Kong RGC General Research Fund (Nos. 152211/23E and 15216424/24E); PolyU Internal Fund (Nos. P0043932, P0048988); and the NVIDIA AI Technology Centre.
Abstract: Point of interest (POI) recommendation analyses user preferences through historical check-in data. However, existing POI recommendation methods often overlook the influence of weather information and face the challenge of sparse historical data for individual users. To address these issues, this paper proposes a new paradigm, namely the temporal-weather-aware transition pattern for POI recommendation (TWTransNet). This paradigm is designed to capture user transition patterns under different times and weather conditions. Additionally, we introduce the construction of a user-POI interaction graph to alleviate the problem of sparse historical data for individual users. Furthermore, when predicting user interests by aggregating graph information, some POIs may not be suitable for visiting under the current weather conditions. To account for this, we propose an attention mechanism that filters POI neighbours when aggregating information from the graph, considering the impact of weather and time. Empirical results on two real-world datasets demonstrate the superior performance of the proposed method, showing a substantial improvement of 6.91%-23.31% in prediction accuracy.
Funding: Supported by the National Natural Science Foundation of China (10571141, 70971109, 71371152); the Talents Fund of Xi'an Polytechnic University (BS1320); and the Mathematics Discipline Development Fund of Xi'an Polytechnic University (107090701).
Abstract: When all the involved data of an indefinite quadratic program change simultaneously, we first show the local Lipschitz continuity of the KKT set of the quadratic programming problem, and then establish the local Lipschitz continuity of the KKT solution set. Finally, a similar conclusion is obtained for the corresponding optimal value function.
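The abstract does not spell out the KKT system whose solution set is studied; for a standard-form quadratic program (an assumed formulation for illustration), minimize (1/2)x^T Qx + c^T x subject to Ax <= b, the KKT conditions read:

```latex
\begin{aligned}
Q x + c + A^{\top}\lambda &= 0, \\
A x &\le b, \qquad \lambda \ge 0, \\
\lambda_i \,(A x - b)_i &= 0, \quad i = 1,\dots,m,
\end{aligned}
```

where the "involved data" perturbed simultaneously are the tuple (Q, c, A, b), and the KKT set collects the pairs (x, λ) satisfying this system; for an indefinite Q, stationary points need not be global minimizers, which is what makes the stability analysis nontrivial.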
Funding: Supported by the National Natural Science Foundation of China (NSFC) (Grant No. 62172058) and the Hunan Provincial Natural Science Foundation of China (Grant Nos. 2022JJ10052, 2022JJ30624).
Abstract: In the Internet of Things (IoT), relay communication is widely used to solve the problem of energy loss in long-distance transmission and to improve transmission efficiency. In Body Sensor Network (BSN) systems, biosensors communicate with receiving devices through relay nodes to improve their limited energy efficiency. When a relay node fails, the biosensor can communicate directly with the receiving device by releasing more transmitting power. However, if the remaining battery power of the biosensor is insufficient for direct communication with the receiving device, the biosensor will be isolated by the system. Therefore, a new combinatorial analysis method is proposed to analyze the influence of random isolation time (RIT) on system reliability, considering the competition between biosensor isolation and propagation failure. This approach inherits the advantages of common combinatorial algorithms and provides a new way to effectively address the impact of RIT on the reliability of IoT systems affected by competing failures. Finally, the method is applied to a BSN system, and the effect of RIT on system reliability is analyzed in detail.