The data production elements are driving profound transformations in the real economy across production objects, methods, and tools, generating significant economic effects such as industrial structure upgrading. This paper aims to reveal the impact mechanism of data elements on the “three transformations” (high-end, intelligent, and green) in the manufacturing sector, theoretically elucidating the intrinsic mechanisms by which data elements influence these transformations. The study finds that data elements significantly enhance the high-end, intelligent, and green levels of China's manufacturing industry. In terms of the pathways of impact, data elements primarily influence the development of high-tech industries and overall green technological innovation, thereby affecting the high-end, intelligent, and green transformation of the industry.
Semantic communication (SemCom) aims to achieve high-fidelity information delivery at low communication cost by guaranteeing only semantic accuracy. Nevertheless, semantic communication still suffers from unexpected channel volatility, so developing a re-transmission mechanism (e.g., hybrid automatic repeat request [HARQ]) becomes indispensable. In that regard, instead of discarding previously transmitted information, incremental knowledge-based HARQ (IK-HARQ) is deemed a more effective mechanism that can fully utilize the information semantics. However, considering the possible existence of semantic ambiguity in image transmission, a simple bit-level cyclic redundancy check (CRC) might compromise the performance of IK-HARQ. Therefore, there emerges a strong incentive to revolutionize the CRC mechanism, thus more effectively reaping the benefits of both SemCom and HARQ. In this paper, built on top of Swin Transformer-based joint source-channel coding (JSCC) and IK-HARQ, we propose a semantic image transmission framework, SC-TDA-HARQ. In particular, different from the conventional CRC, we introduce a topological data analysis (TDA)-based error detection method, which digs out the inner topological and geometric information of images, to capture semantic information and determine the necessity of re-transmission. Extensive numerical results validate the effectiveness and efficiency of the proposed SC-TDA-HARQ framework, especially under limited bandwidth conditions, and manifest the superiority of the TDA-based error detection method in image transmission.
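The paper's TDA-based check relies on richer topological features than can be reproduced here, but the core idea — comparing a topological summary of the decoded image against one computed at the transmitter, instead of a bit-level CRC — can be sketched with the simplest such summary: the number of connected components (Betti-0). This is a toy stand-in, not the paper's actual detector, and all names below are illustrative:

```python
def betti0(img, thresh=0.5):
    """Count connected components (Betti-0) of a thresholded 2-D image."""
    h, w = len(img), len(img[0])
    mask = [[v > thresh for v in row] for row in img]
    seen = [[False] * w for _ in range(h)]
    comps = 0
    for i in range(h):
        for j in range(w):
            if mask[i][j] and not seen[i][j]:
                comps += 1
                stack = [(i, j)]          # flood fill one component
                seen[i][j] = True
                while stack:
                    y, x = stack.pop()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
    return comps

def needs_retransmission(sent, received, tol=0):
    """Trigger HARQ re-transmission when the topological summaries disagree."""
    return abs(betti0(sent) - betti0(received)) > tol
```

A semantic check of this kind tolerates bit flips that leave the image's structure intact, which is exactly what a bit-level CRC cannot do.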
Cancer deaths and new cases worldwide are projected to rise by 47% by 2040, with transitioning countries experiencing an even higher increase of up to 95%. Tumor severity is profoundly influenced by the timing, accuracy, and stage of diagnosis, which directly impacts clinical decision-making. Various biological entities, including genes, proteins, mRNAs, miRNAs, and metabolites, contribute to cancer development. The emergence of multi-omics technologies has transformed cancer research by revealing molecular alterations across multiple biological layers. This integrative approach supports the notion that cancer is fundamentally driven by such alterations, enabling the discovery of molecular signatures for precision oncology. This review explores the role of AI-driven multi-omics analyses in cancer medicine, emphasizing their potential to identify novel biomarkers and therapeutic targets, enhance understanding of tumor biology, and address integration challenges in clinical workflows. Network biology analyses identified ERBB2, KRAS, and TP53 as top hub genes in lung cancer based on Maximal Clique Centrality (MCC) scores. In contrast, TP53, ERBB2, ESR1, MYC, and BRCA1 emerged as central regulators in breast cancer, linked to cell proliferation, hormonal signaling, and genomic stability. The review also discusses how specific Artificial Intelligence (AI) algorithms can streamline the integration of heterogeneous datasets, facilitate the interpretation of the tumor microenvironment, and support data-driven clinical strategies.
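Maximal Clique Centrality, used above to rank hub genes, scores each node by summing (|C| − 1)! over the maximal cliques C that contain it. A minimal sketch on a toy interaction graph, using Bron–Kerbosch without pivoting (fine for small graphs, not production-scale networks; the node names are invented for illustration):

```python
from math import factorial

def maximal_cliques(adj):
    """Enumerate maximal cliques of a graph given as {node: set_of_neighbors}."""
    cliques = []
    def bk(r, p, x):
        if not p and not x:
            cliques.append(r)
            return
        for v in list(p):
            bk(r | {v}, p & adj[v], x & adj[v])
            p = p - {v}
            x = x | {v}
    bk(set(), set(adj), set())
    return cliques

def mcc(adj):
    """MCC(v) = sum of (|C| - 1)! over maximal cliques C containing v."""
    scores = {v: 0 for v in adj}
    for c in maximal_cliques(adj):
        for v in c:
            scores[v] += factorial(len(c) - 1)
    return scores
```

On a triangle A-B-C with a pendant node D attached to A, the maximal cliques are {A, B, C} and {A, D}, so A scores 2! + 1! = 3 and comes out as the top hub.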
BACKGROUND Hepatocellular carcinoma (HCC) remains a significant public health concern in South Korea even though incidence rates are declining. While medical travel for cancer treatment is common, its patterns and influencing factors for patients with HCC are unknown. AIM To assess medical travel patterns and determinants, and their policy implications, among patients with newly diagnosed HCC in South Korea. METHODS This retrospective cohort study used the National Health Insurance Service database to identify patients with newly diagnosed HCC from 2013 to 2021. Medical travel was defined as receiving initial treatment outside one's residential region. Patient characteristics and regional trends were analyzed, and factors influencing medical travel were identified using logistic regression analysis. RESULTS Among 64,808 patients, 52.4% received treatment in the capital. This proportion increased to 67.4% when including the surrounding metropolitan area. Medical travel was significantly more common among younger and wealthier patients. Patients with a greater comorbidity burden or liver cirrhosis were less likely to travel. While geographic distance influenced travel patterns, high-volume academic centers in the capital attracted patients nationwide regardless of proximity. CONCLUSION This nationwide study highlighted the centralization of HCC care in the capital. This observation indicates that regional cancer hubs should be strengthened and promoted for equitable healthcare access.
The widespread use of rechargeable batteries in portable devices, electric vehicles, and energy storage systems has underscored the importance of accurately predicting their lifetimes. However, data scarcity often limits the accuracy of prediction models, a problem exacerbated by incomplete data caused by issues such as sensor failures. To address these challenges, we propose a novel approach that accommodates data insufficiency by extracting additional information from incomplete data samples, which are usually discarded in existing studies. To fully unleash the predictive power of incomplete data, we investigate the Multiple Imputation by Chained Equations (MICE) method, which diversifies the training data by exploring potential data patterns. The experimental results demonstrate that the proposed method significantly outperforms the baselines in most of the considered scenarios, reducing the prediction root mean square error (RMSE) by up to 18.9%. Furthermore, we observe that incorporating incomplete data improves the explainability of the prediction model by facilitating feature selection.
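The MICE procedure cycles through incomplete columns, repeatedly re-imputing each one from regressions on the others until the imputations stabilize. A two-column, pure-Python sketch of that chained-equations loop (mean initialization plus ordinary least squares; real MICE adds stochastic draws and more flexible models, and this is not the authors' implementation):

```python
def simple_linreg(xs, ys):
    """Ordinary least squares fit y = intercept + slope * x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    var = sum((x - mx) ** 2 for x in xs)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = cov / var if var else 0.0
    return my - slope * mx, slope

def mice_two_cols(col_a, col_b, iters=10):
    """Chained-equations imputation for two numeric columns (None = missing)."""
    a, b = list(col_a), list(col_b)
    miss_a = [i for i, v in enumerate(a) if v is None]
    miss_b = [i for i, v in enumerate(b) if v is None]
    for col, miss in ((a, miss_a), (b, miss_b)):
        obs = [v for v in col if v is not None]
        mean = sum(obs) / len(obs)
        for i in miss:
            col[i] = mean  # initialize missing entries with the column mean
    for _ in range(iters):
        # re-impute a from b, then b from a, using current values
        c0, c1 = simple_linreg(b, a)
        for i in miss_a:
            a[i] = c0 + c1 * b[i]
        c0, c1 = simple_linreg(a, b)
        for i in miss_b:
            b[i] = c0 + c1 * a[i]
    return a, b
```

On perfectly correlated columns (b = 2a) the loop converges geometrically to the value a regression would predict, which is why retaining incomplete samples can recover signal instead of discarding it.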
Face recognition has emerged as one of the most prominent applications of image analysis and understanding, gaining considerable attention in recent years. This growing interest is driven by two key factors: its extensive applications in law enforcement and the commercial domain, and the rapid advancement of practical technologies. Despite significant advancements, modern recognition algorithms still struggle in real-world conditions such as varying lighting, occlusion, and diverse facial poses. In such scenarios, human perception remains well above the capabilities of present technology. Using a systematic mapping study, this paper presents an in-depth review of face detection and face recognition algorithms, surveying advancements made between 2015 and 2024. We analyze key methodologies, highlighting their strengths and limitations in the application context. Additionally, we examine the datasets used for face detection and recognition, focusing on task-specific applications, size, diversity, and complexity. By analyzing these algorithms and datasets, this survey serves as a valuable resource for researchers, identifying research gaps in the field of face detection and recognition and outlining potential directions for future research.
Electric Vehicle Charging Systems (EVCS) are increasingly vulnerable to cybersecurity threats as they integrate deeply into smart grids and Internet of Things (IoT) environments, raising significant security challenges. Most existing research primarily emphasizes network-level anomaly detection, leaving critical vulnerabilities at the host level underexplored. This study introduces a novel forensic analysis framework leveraging host-level data, including system logs, kernel events, and Hardware Performance Counters (HPC), to detect and analyze sophisticated cyberattacks such as cryptojacking, Denial-of-Service (DoS), and reconnaissance activities targeting EVCS. Using comprehensive forensic analysis and machine learning models, the proposed framework significantly outperforms existing methods, achieving an accuracy of 98.81%. The findings offer insights into distinct behavioral signatures associated with specific cyber threats, enabling improved cybersecurity strategies and actionable recommendations for robust EVCS infrastructure protection.
Accurate capacity and State of Charge (SOC) estimation are crucial for ensuring the safety and longevity of lithium-ion batteries in electric vehicles. This study examines ten machine learning architectures, including Deep Belief Network (DBN), Bidirectional Recurrent Neural Network (BiDirRNN), Gated Recurrent Unit (GRU), and others, using the NASA B0005 dataset of 591,458 instances. Results indicate that the DBN excels in capacity estimation, achieving orders-of-magnitude lower error values and explaining over 99.97% of the predicted variable's variance. When computational efficiency is paramount, the Deep Neural Network (DNN) offers a strong alternative, delivering near-competitive accuracy with significantly reduced prediction times. The GRU achieves the best overall performance for SOC estimation, attaining an R² of 0.9999, while the BiDirRNN provides a marginally lower error at a slightly higher computational cost. In contrast, Convolutional Neural Networks (CNN) and Radial Basis Function Networks (RBFN) exhibit relatively high error rates, making them less viable for real-world battery management. Analyses of error distributions reveal that the top-performing models cluster most predictions within tight bounds, limiting the risk of overcharging or deep discharging. These findings highlight the trade-off between accuracy and computational overhead, offering valuable guidance for battery management system (BMS) designers seeking optimal performance under constrained resources. Future work may further explore advanced data augmentation and domain adaptation techniques to enhance these models' robustness in diverse operating conditions.
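The RMSE and R² figures quoted for these models follow the standard definitions, which a tiny helper pair makes concrete (generic formulas, not the study's evaluation code):

```python
from math import sqrt

def rmse(y_true, y_pred):
    """Root mean square error between targets and predictions."""
    return sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def r2(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot
```

An R² of 0.9999, as reported for the GRU, means the residual sum of squares is 0.01% of the target variance.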
Sinkhole formation poses a significant geohazard in karst regions, where unpredictable subsurface erosion often necessitates costly grouting for stabilization. Accurate estimation of grout volume remains a persistent challenge due to spatial variability, site-specific conditions, and the limitations of traditional empirical methods. This study introduces a novel machine learning-based regression model for grout volume prediction that integrates cone penetration test (CPT)-derived Sinkhole Resistance Ratio (SRR) values, spatial correlations between CPTs and grouting points (GPs), and field-recorded grout volumes from six sinkhole sites in Florida. Three data transformation methods, the Proximal Allocation Method (PAM), the Equitable Distribution Method (EDM), and the Threshold-based Equitable Distribution Method (TEDM), were applied to distribute grout influence across CPTs, with TEDM demonstrating superior predictive performance. Synthetic data augmentation using spline methodology further improved model robustness. A high-degree polynomial regression model, optimized with ridge regularization, achieved high accuracy (R² = 0.95; PEV = 0.94) and significantly outperformed existing linear and logarithmic models. Results confirm that lower SRR values correlate with higher grout demand, and the proposed model reliably captures these nonlinear relationships. This research advances sinkhole remediation practice by providing a data-driven, accurate, and generalizable framework for grout volume estimation, enabling more efficient resource allocation and improved project outcomes.
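The study's model is fitted on proprietary CPT data, but the underlying technique — least squares on polynomial features with a ridge penalty on the weights — can be sketched generically. This is gradient descent on the ridge objective; the data, degree, and hyperparameters below are illustrative, not the paper's:

```python
def poly_features(x, degree):
    return [x ** d for d in range(degree + 1)]

def predict(w, x):
    return sum(wi * x ** d for d, wi in enumerate(w))

def mse(w, xs, ys):
    return sum((predict(w, x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def fit_ridge_poly(xs, ys, degree=2, lam=0.1, lr=0.01, epochs=2000):
    """Gradient descent on mean squared error + lam * ||w||^2."""
    w = [0.0] * (degree + 1)
    n = len(xs)
    for _ in range(epochs):
        grad = [2 * lam * wj for wj in w]  # gradient of the ridge penalty
        for x, y in zip(xs, ys):
            err = predict(w, x) - y
            for j, pj in enumerate(poly_features(x, degree)):
                grad[j] += 2 * err * pj / n
        w = [wj - lr * gj for wj, gj in zip(w, grad)]
    return w
```

The penalty shrinks the coefficients of a high-degree polynomial, which is what keeps such a model from overfitting sparse, noisy grout-volume records.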
In this paper, we establish and study a single-species logistic model with impulsive age-selective harvesting. First, we prove the ultimate boundedness of the solutions of the system. Then, we obtain conditions for the asymptotic stability of the trivial solution and of the positive periodic solution. Finally, numerical simulations are presented to validate our results. Our results show that age-selective harvesting is more conducive to sustainable population survival than non-age-selective harvesting.
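The flavor of such an impulsive-harvesting model is easy to see numerically: logistic growth between harvest instants, with the population cut by a fixed fraction at each impulse. A forward-Euler sketch (parameter values are illustrative, not taken from the paper, and the impulse here is not age-structured):

```python
def simulate_logistic_impulsive(r=1.0, K=100.0, x0=10.0, harvest_frac=0.3,
                                period=5.0, t_end=50.0, dt=0.01):
    """Euler integration of x' = r x (1 - x/K), with x -> (1 - h) x every `period`."""
    x, t = x0, 0.0
    next_harvest = period
    history = [(t, x)]
    while t < t_end:
        x += dt * r * x * (1 - x / K)  # logistic growth step
        t += dt
        if t >= next_harvest:          # impulsive harvest event
            x *= (1 - harvest_frac)
            next_harvest += period
        history.append((t, x))
    return history
```

With these parameters the population settles into a sawtooth periodic orbit: growth toward the carrying capacity K punctuated by 30% removals, remaining bounded and bounded away from extinction, which mirrors the boundedness and periodic-solution results stated above.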
Network attacks have become a critical issue in the internet security domain. Detection methodologies based on artificial intelligence have attracted attention; however, recent studies have struggled to adapt to changing attack patterns and complex network environments. In addition, it is difficult to logically explain detection results produced by artificial intelligence. We propose a method for classifying network attacks using graph models that explains its detection results. First, we reconstruct the network packet data into a graph structure. We then use a graph model to predict network attacks via edge classification. To explain the prediction results, we observe numerical changes under random masking and calculate the importance of neighbors, allowing us to extract significant subgraphs. Our experiments on six public datasets demonstrate superior performance, with an average F1-score of 0.960 and accuracy of 0.964, outperforming traditional machine learning and other graph models. The visual representation of the extracted subgraphs highlights the neighboring nodes that have the greatest impact on the results, thereby explaining the detections. In conclusion, this study demonstrates that graph-based models are suitable for network attack detection in complex environments, and that the importance of graph neighbors can be calculated to efficiently analyze the results. This approach can contribute to real-world network security analyses and provides a new direction for the field.
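The neighbor-importance idea — mask one neighbor at a time and measure how much the prediction moves — can be illustrated with a toy surrogate score standing in for the trained graph model. The scoring function and graph below are invented for illustration only:

```python
def node_score(features, adj, v):
    """Toy surrogate model: a node's score is its own feature plus its neighbors' mean."""
    neigh = adj[v]
    if not neigh:
        return features[v]
    return features[v] + sum(features[u] for u in neigh) / len(neigh)

def neighbor_importance(features, adj, v):
    """Importance of each neighbor = |score change| when that neighbor is masked."""
    base = node_score(features, adj, v)
    imp = {}
    for u in adj[v]:
        masked = dict(features, **{u: 0.0})  # zero out (mask) neighbor u
        imp[u] = abs(base - node_score(masked, adj, v))
    return imp
```

Ranking neighbors by this drop and keeping the top ones is the basic recipe for extracting the explanatory subgraph the paper visualizes.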
Background There is insufficient evidence to provide recommendations for leisure-time physical activity among workers across various occupational physical activity levels. This study aimed to assess the association of leisure-time physical activity with cardiovascular and all-cause mortality across occupational physical activity levels. Methods This study utilized individual participant data from 21 cohort studies, comprising both published and unpublished data. Eligibility criteria included individual-level data on leisure-time and occupational physical activity (categorized as sedentary, low, moderate, and high) along with data on all-cause and/or cardiovascular mortality. A 2-stage individual participant data meta-analysis was conducted, with separate analysis of each study using Cox proportional hazards models (Stage 1). These results were combined using random-effects models (Stage 2). Results Higher leisure-time physical activity levels were associated with lower all-cause and cardiovascular mortality risk across most occupational physical activity levels, for both males and females. Among males with sedentary work, high compared to sedentary leisure-time physical activity was associated with lower all-cause (hazard ratio (HR) = 0.77, 95% confidence interval (95% CI): 0.70-0.85) and cardiovascular mortality (HR = 0.76, 95% CI: 0.66-0.87) risk. Among males with high levels of occupational physical activity, high compared to sedentary leisure-time physical activity was associated with lower all-cause (HR = 0.84, 95% CI: 0.74-0.97) and cardiovascular mortality (HR = 0.79, 95% CI: 0.60-1.04) risk, while HRs for low and moderate levels of leisure-time physical activity ranged between 0.87 and 0.97 and were not statistically significant. Among females, most effects were similar but more imprecise, especially at the higher occupational physical activity levels. Conclusion Higher levels of leisure-time physical activity were generally associated with lower mortality risks. However, results for workers with moderate and high occupational physical activity levels, especially women, were more imprecise. Our findings suggest that workers may benefit from engaging in high levels of leisure-time physical activity, irrespective of their level of occupational physical activity.
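Stage 2's random-effects pooling is conventionally done with the DerSimonian–Laird estimator: per-study log hazard ratios are combined with inverse-variance weights inflated by the between-study variance τ². A minimal sketch of that generic method (not necessarily the authors' exact implementation; the inputs below are invented):

```python
from math import exp, log

def dl_pool(hrs, ses):
    """DerSimonian-Laird random-effects pooling of hazard ratios.

    hrs: per-study hazard ratios; ses: standard errors of the log HRs.
    Returns (pooled hazard ratio, between-study variance tau^2).
    """
    y = [log(h) for h in hrs]
    w = [1 / s ** 2 for s in ses]          # fixed-effect inverse-variance weights
    sw = sum(w)
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sw
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))  # Cochran's Q
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (len(y) - 1)) / c) if c > 0 else 0.0
    wstar = [1 / (s ** 2 + tau2) for s in ses]  # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(wstar, y)) / sum(wstar)
    return exp(pooled), tau2
```

When the studies agree exactly, Q falls below its degrees of freedom, τ² is truncated to zero, and the estimator reduces to the fixed-effect pooled HR.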
Alzheimer's disease (AD) is the most common form of dementia, affecting over 50 million people worldwide. This figure is projected to nearly double every 20 years, reaching 82 million by 2030 and 152 million by 2050 (Alzheimer's Disease International). The apolipoprotein ε4 (APOE4) allele is the strongest genetic risk factor for late-onset AD (after age 65 years). Apolipoprotein E, a lipid transporter, exists in three variants: ε2, ε3, and ε4. APOE ε2 (APOE2) is protective against AD, APOE ε3 (APOE3) is neutral, while APOE4 significantly increases the risk. Individuals with one copy of APOE4 have a 4-fold greater risk of developing AD, and those with two copies face an 8-fold risk compared to non-carriers. Even in cognitively normal individuals, APOE4 carriers exhibit brain metabolic and vascular deficits decades before amyloid-beta (Aβ) plaques and neurofibrillary tau tangles, the hallmark pathologies of AD, emerge (Reiman et al., 2001, 2005; Thambisetty et al., 2010). Notably, studies have demonstrated reduced glucose uptake, or hypometabolism, in AD-vulnerable brain regions of asymptomatic middle-aged APOE4 carriers, long before clinical symptoms arise (Reiman et al., 2001, 2005).
The advent of sixth-generation (6G) networks introduces unprecedented challenges in achieving seamless connectivity, ultra-low latency, and efficient resource management in highly dynamic environments. Although fifth-generation (5G) networks transformed mobile broadband and machine-type communications at massive scales, their scaling, interference management, and latency remain limitations in dense, high-mobility settings. To overcome these limitations, artificial intelligence (AI) and unmanned aerial vehicles (UAVs) have emerged as potential solutions for developing versatile, dynamic, and energy-efficient communication systems. This study proposes an AI-based UAV architecture that utilizes cooperative reinforcement learning (CoRL) to manage an autonomous network. The UAVs collaborate through shared local observations and real-time state exchanges to optimize user connectivity, movement directions, power allocation, and resource distribution. Unlike conventional centralized or autonomous methods, CoRL involves joint state sharing and conflict-sensitive reward shaping, which ensures fair coverage, less interference, and enhanced adaptability in dynamic urban environments. Simulations conducted in smart city scenarios with 10 UAVs and 50 ground users demonstrate that the proposed CoRL-based UAV system increases user coverage by up to 10%, achieves convergence 40% faster, and reduces latency and energy consumption by 30% compared with centralized and decentralized baselines. Furthermore, the distributed nature of the algorithm ensures scalability and flexibility, making it well-suited for future large-scale 6G deployments. The results highlight that AI-enabled UAV systems enhance connectivity, support ultra-reliable low-latency communications (URLLC), and improve 6G network efficiency. Future work will extend the framework with adaptive modulation, beamforming-aware positioning, and real-world testbed deployment.
The generation of synthetic trajectories has become essential in various fields for analyzing complex movement patterns. However, the use of real-world trajectory data poses significant privacy risks, such as location re-identification and correlation attacks. To address these challenges, privacy-preserving trajectory generation methods are critical for applications relying on sensitive location data. This paper introduces DPIL-Traj, an advanced framework designed to generate synthetic trajectories while achieving a superior balance between data utility and privacy preservation. First, the framework incorporates differential privacy clustering, which anonymizes trajectory data by adding differentially private noise, ensuring the protection of sensitive user information. Second, imitation learning is used to replicate decision-making behaviors observed in real-world trajectories; by learning from expert trajectories, this component generates synthetic data that closely mimics real-world decision-making processes while optimizing the quality of the generated trajectories. Finally, Markov-based trajectory generation is employed to capture and maintain the inherent temporal dynamics of movement patterns. Extensive experiments conducted on the GeoLife trajectory dataset show that DPIL-Traj improves utility by an average of 19.85% and privacy by an average of 12.51% compared with state-of-the-art approaches. Ablation studies further reveal that DP clustering effectively safeguards privacy, imitation learning enhances utility under noise, and the Markov module strengthens temporal coherence.
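The Markov module's job — preserving temporal dynamics — comes down to estimating location-to-location transition probabilities from real trajectories and sampling new ones from them. A first-order sketch (the toy locations are invented; DPIL-Traj additionally applies DP noise and imitation-learned policies on top of this):

```python
import random

def build_transitions(trajectories):
    """Estimate first-order transition probabilities from observed trajectories."""
    counts = {}
    for traj in trajectories:
        for a, b in zip(traj, traj[1:]):
            counts.setdefault(a, {})
            counts[a][b] = counts[a].get(b, 0) + 1
    return {s: {t: c / sum(nxt.values()) for t, c in nxt.items()}
            for s, nxt in counts.items()}

def generate(transitions, start, length, rng):
    """Sample a trajectory of at most `length` states from the chain."""
    traj = [start]
    while len(traj) < length and traj[-1] in transitions:
        states, probs = zip(*transitions[traj[-1]].items())
        traj.append(rng.choices(states, weights=probs)[0])
    return traj
```

Because every generated step follows an observed transition, the synthetic trajectories inherit the temporal structure of the data without replaying any individual's actual path.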
It is important for modern hospital management to strengthen medical humanistic care and build a harmonious doctor-patient relationship. Innovative applications of big data resources on patient experience in modern hospital management enable hospitals to realize real-time supervision, dynamic management, and scientific decision-making based on patients' experiences. This helps transform hospital management from an administrator's perspective to a patient's perspective, and from experience-driven to data-driven. Technological innovations in hospital management based on patient experience data can assist the optimization and continuous improvement of healthcare quality, and therefore help increase patient satisfaction with medical services.
Anomaly detection in high-dimensional data is a critical research issue with serious implications for real-world problems. Many issues in this field remain unsolved, and several modern anomaly detection methods struggle to maintain adequate accuracy due to the highly descriptive nature of big data. This phenomenon, referred to as the "curse of dimensionality," affects traditional techniques in terms of both accuracy and performance. Thus, this research proposes a hybrid model based on a Deep Autoencoder Neural Network (DANN) with five layers to reduce the difference between the input and output. The proposed model was applied to a real-world gas turbine (GT) dataset that contains 87620 columns and 56 rows. During the experiments, two issues were investigated and solved to enhance the results. The first is the dataset's class imbalance, which was solved using the SMOTE technique. The second is poor performance, which can be addressed using an optimization algorithm. Several optimization algorithms were investigated and tested, including stochastic gradient descent (SGD), RMSprop, Adam, and Adamax; the Adamax algorithm showed the best results when employed to train the DANN model. The experimental results show that the proposed model can detect anomalies by efficiently reducing the high dimensionality of the dataset, with an accuracy of 99.40%, an F1-score of 0.9649, an Area Under the Curve (AUC) of 0.9649, and a minimal loss function during hybrid model training.
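SMOTE, used above to balance the classes, creates synthetic minority samples by interpolating between a minority point and one of its nearest minority neighbors. A compact sketch of that interpolation step (toy 2-D points; real SMOTE operates on the dataset's full feature vectors):

```python
import random

def smote(minority, n_new, rng):
    """Generate synthetic minority samples by interpolating toward a near neighbor."""
    def dist2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    out = []
    for _ in range(n_new):
        p = rng.choice(minority)
        # rank the other minority points by distance to p
        neighbors = sorted((q for q in minority if q is not p),
                           key=lambda q: dist2(p, q))
        q = rng.choice(neighbors[:3])  # pick one of the 3 nearest neighbors
        t = rng.random()               # interpolation factor in [0, 1)
        out.append(tuple(a + t * (b - a) for a, b in zip(p, q)))
    return out
```

Each synthetic point lies on the segment between two real minority samples, so the oversampled class occupies the same region of feature space rather than being mere duplicates.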
The advent of healthcare information management systems (HIMSs) continues to produce large volumes of healthcare data for patient care and compliance and regulatory requirements at a global scale. Analysis of this big data allows for boundless potential outcomes for discovering knowledge. Big data analytics (BDA) in healthcare can, for instance, help determine causes of diseases, generate effective diagnoses, enhance QoS guarantees by increasing the efficiency of healthcare delivery and the effectiveness and viability of treatments, generate accurate predictions of readmissions, enhance clinical care, and pinpoint opportunities for cost savings. However, BDA implementations in any domain are generally complicated and resource-intensive, with a high failure rate and no roadmap or success strategies to guide practitioners. In this paper, we present a comprehensive roadmap to derive insights from BDA in the healthcare (patient care) domain, based on the results of a systematic literature review. We initially determine big data characteristics for healthcare and then review BDA applications to healthcare in academic research, focusing particularly on NoSQL databases. We also identify the limitations and challenges of these applications and justify the potential of NoSQL databases to address these challenges and further enhance BDA healthcare research. We then propose and describe a state-of-the-art BDA architecture called Med-BDA for the healthcare domain, which solves all current BDA challenges and is based on the latest zeta big data paradigm. We also present success strategies to ensure the working of Med-BDA, along with outlining the major benefits of BDA applications to healthcare. Finally, we compare our work with other related literature reviews across twelve hallmark features to justify the novelty and importance of our work. The aforementioned contributions are collectively unique and clearly present a roadmap for clinical administrators, practitioners, and professionals to successfully implement BDA initiatives in their organizations.
Traditional approaches to developing 3D geological models employ a mix of quantitative and qualitative scientific techniques, which do not fully quantify uncertainty in the constructed models and fail to optimally weight geological field observations against constraints from geophysical data. Here, using the Bayesian Obsidian software package, we develop a methodology to fuse lithostratigraphic field observations with aeromagnetic and gravity data to build a 3D model of a small (13.5 km × 13.5 km) region of the Gascoyne Province, Western Australia. Our approach is validated by comparing 3D model results to independently-constrained geological maps and cross-sections produced by the Geological Survey of Western Australia. By fusing geological field data with aeromagnetic and gravity surveys, we show that 89% of the modelled region has >95% certainty for a particular geological unit for the given model and data. The boundaries between geological units are characterized by narrow regions with <95% certainty, which are typically 400-1000 m wide at the Earth's surface and 500-2000 m wide at depth. Beyond ~4 km depth, the model requires geophysical survey data with longer wavelengths (e.g., active seismic) to constrain the deeper subsurface. Although Obsidian was originally built for sedimentary basin problems, it is reasonably applicable to deformed terranes such as the Gascoyne Province. Ultimately, modification of the Bayesian engine to incorporate structural data will aid in developing more robust 3D models. Nevertheless, our results show that surface geological observations fused with geophysical survey data can yield reasonable 3D geological models with narrow uncertainty regions at the surface and in the shallow subsurface, which will be especially valuable for mineral exploration and the development of 3D geological models under cover.
With the rapid advancement of cloud computing, cloud storage services have developed rapidly. One issue that has attracted particular attention in such remote storage services is that cloud storage servers cannot be fully trusted to reliably save and maintain data, which greatly affects users' confidence in purchasing and consuming cloud storage services. Traditional data integrity auditing techniques for cloud data storage are centralized, which entails huge security risks due to single points of failure and vulnerabilities of central auditing servers. Blockchain technology offers a new approach to this problem, and many researchers have endeavored to employ blockchain for data integrity auditing. Based on a search of relevant papers, we found that the existing literature lacks a thorough survey of blockchain-based integrity auditing for cloud data. In this paper, we present an in-depth survey of cloud data integrity auditing based on blockchain. First, we cover essential background on integrity auditing for cloud data and blockchain techniques. Then, we propose a series of requirements for evaluating existing Blockchain-based Data Integrity Auditing (BDIA) schemes. Furthermore, we provide a comprehensive review of existing BDIA schemes and evaluate them against our proposed criteria. Finally, based on our review and analysis, we explore open issues and suggest research directions worthy of further effort.
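A building block common to many of the BDIA schemes surveyed is committing to the stored data with a Merkle tree, so an auditor can recompute the root over the blocks and compare it with the value recorded on-chain. A minimal sketch (SHA-256 with duplicate-last-node padding; individual schemes differ in tree shape and challenge protocol, so this is illustrative only):

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(blocks):
    """Compute the Merkle root of a list of data blocks."""
    level = [h(b) for b in blocks]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])  # duplicate the last node on odd-sized levels
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def audit(blocks, expected_root):
    """Pass iff the blocks still hash to the committed root."""
    return merkle_root(blocks) == expected_root
```

Any tampering with, or loss of, a single block changes the recomputed root, so a mismatch against the immutable on-chain commitment is cryptographic evidence that the server failed to maintain the data.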
Abstract: Data production elements are driving profound transformations in the real economy across production objects, methods, and tools, generating significant economic effects such as industrial structure upgrading. This paper reveals the impact mechanism of data elements on the "three transformations" (high-end, intelligent, and green) in the manufacturing sector, theoretically elucidating the intrinsic mechanisms by which data elements influence these transformations. The study finds that data elements significantly enhance the high-end, intelligent, and green levels of China's manufacturing industry. In terms of impact pathways, data elements primarily influence the development of high-tech industries and overall green technological innovation, thereby affecting the high-end, intelligent, and green transformation of the industry.
Funding: Supported in part by the National Key Research and Development Program of China under Grant 2024YFE0200600; in part by the National Natural Science Foundation of China under Grant 62071425; in part by the Zhejiang Key Research and Development Plan under Grant 2022C01093; in part by the Zhejiang Provincial Natural Science Foundation of China under Grant LR23F010005; in part by the National Key Laboratory of Wireless Communications Foundation under Grant 2023KP01601; and in part by the Big Data and Intelligent Computing Key Lab of CQUPT under Grant BDIC-2023-B-001.
Abstract: Semantic communication (SemCom) aims to achieve high-fidelity information delivery at low communication cost by guaranteeing only semantic accuracy. Nevertheless, semantic communication still suffers from unexpected channel volatility, so developing a re-transmission mechanism (e.g., hybrid automatic repeat request [HARQ]) becomes indispensable. In that regard, instead of discarding previously transmitted information, incremental knowledge-based HARQ (IK-HARQ) is deemed a more effective mechanism that can fully utilize the information semantics. However, considering the possible existence of semantic ambiguity in image transmission, a simple bit-level cyclic redundancy check (CRC) might compromise the performance of IK-HARQ. Therefore, there is a strong incentive to rethink the CRC mechanism so as to more effectively reap the benefits of both SemCom and HARQ. In this paper, built on top of Swin Transformer-based joint source-channel coding (JSCC) and IK-HARQ, we propose a semantic image transmission framework, SC-TDA-HARQ. In particular, unlike the conventional CRC, we introduce a topological data analysis (TDA)-based error detection method, which extracts the inner topological and geometric information of images to capture semantic information and determine the necessity of re-transmission. Extensive numerical results validate the effectiveness and efficiency of the proposed SC-TDA-HARQ framework, especially under limited bandwidth, and demonstrate the superiority of the TDA-based error detection method in image transmission.
Funding: Funded by the KAU Endowment (WAQF) at King Abdulaziz University, Jeddah, Saudi Arabia.
Abstract: Cancer deaths and new cases worldwide are projected to rise by 47% by 2040, with transitioning countries experiencing an even higher increase of up to 95%. Tumor severity is profoundly influenced by the timing, accuracy, and stage of diagnosis, which directly impacts clinical decision-making. Various biological entities, including genes, proteins, mRNAs, miRNAs, and metabolites, contribute to cancer development. The emergence of multi-omics technologies has transformed cancer research by revealing molecular alterations across multiple biological layers. This integrative approach supports the notion that cancer is fundamentally driven by such alterations, enabling the discovery of molecular signatures for precision oncology. This review explores the role of AI-driven multi-omics analyses in cancer medicine, emphasizing their potential to identify novel biomarkers and therapeutic targets, enhance understanding of tumor biology, and address integration challenges in clinical workflows. Network biology analyses identified ERBB2, KRAS, and TP53 as top hub genes in lung cancer based on Maximal Clique Centrality (MCC) scores. In contrast, TP53, ERBB2, ESR1, MYC, and BRCA1 emerged as central regulators in breast cancer, linked to cell proliferation, hormonal signaling, and genomic stability. The review also discusses how specific Artificial Intelligence (AI) algorithms can streamline the integration of heterogeneous datasets, facilitate interpretation of the tumor microenvironment, and support data-driven clinical strategies.
Funding: Supported by the Dong-A University Research Fund, No. 20230598.
Abstract: BACKGROUND: Hepatocellular carcinoma (HCC) remains a significant public health concern in South Korea even though incidence rates are declining. While medical travel for cancer treatment is common, its patterns and influencing factors for patients with HCC are unknown. AIM: To assess medical travel patterns and determinants, and their policy implications, among patients with newly diagnosed HCC in South Korea. METHODS: This retrospective cohort study used the National Health Insurance Service database to identify patients with newly diagnosed HCC from 2013 to 2021. Medical travel was defined as receiving initial treatment outside one's residential region. Patient characteristics and regional trends were analyzed, and factors influencing medical travel were identified using logistic regression analysis. RESULTS: Among 64,808 patients, 52.4% received treatment in the capital. This proportion increased to 67.4% when including the surrounding metropolitan area. Medical travel was significantly more common among younger and wealthier patients. Patients with a greater comorbidity burden or liver cirrhosis were less likely to travel. While geographic distance influenced travel patterns, high-volume academic centers in the capital attracted patients nationwide regardless of proximity. CONCLUSION: This nationwide study highlighted the centralization of HCC care in the capital. This observation indicates that regional cancer hubs should be strengthened and promoted for equitable healthcare access.
Abstract: The widespread use of rechargeable batteries in portable devices, electric vehicles, and energy storage systems has underscored the importance of accurately predicting their lifetimes. However, data scarcity often limits the accuracy of prediction models, a problem exacerbated by data incompleteness caused by issues such as sensor failures. To address these challenges, we propose a novel approach that compensates for data insufficiency by extracting additional information from incomplete data samples, which are usually discarded in existing studies. To fully unleash the predictive power of incomplete data, we investigate the Multiple Imputation by Chained Equations (MICE) method, which diversifies the training data by exploring potential data patterns. The experimental results demonstrate that the proposed method significantly outperforms the baselines in most of the considered scenarios, reducing the prediction root mean square error (RMSE) by up to 18.9%. Furthermore, we observe that including incomplete data improves the explainability of the prediction model by facilitating feature selection.
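To make the chained-equations idea concrete, here is a minimal sketch of MICE-style imputation, not the paper's implementation: each column with gaps is repeatedly regressed on the other columns and its missing entries are refilled with the predictions. The synthetic "battery" features and the sensor-dropout pattern are assumptions for illustration.

```python
import numpy as np

def mice_impute(X, n_iters=10):
    """Minimal chained-equations imputation: each column with missing
    values is repeatedly regressed (least squares) on all other columns,
    and its missing entries are replaced by the regression predictions."""
    X = X.astype(float).copy()
    miss = np.isnan(X)
    col_means = np.nanmean(X, axis=0)
    for j in range(X.shape[1]):          # initialize gaps with column means
        X[miss[:, j], j] = col_means[j]
    for _ in range(n_iters):
        for j in range(X.shape[1]):
            if not miss[:, j].any():
                continue
            others = np.delete(X, j, axis=1)
            A = np.column_stack([others, np.ones(len(X))])  # add intercept
            obs = ~miss[:, j]
            coef, *_ = np.linalg.lstsq(A[obs], X[obs, j], rcond=None)
            X[miss[:, j], j] = A[miss[:, j]] @ coef
    return X

# Synthetic battery-like features with a linear dependency and gaps.
rng = np.random.default_rng(0)
Z = rng.normal(size=(200, 2))
data = np.column_stack([Z[:, 0], Z[:, 1], 2.0 * Z[:, 0] - Z[:, 1]])
holes = data.copy()
holes[rng.choice(200, 30, replace=False), 2] = np.nan  # simulated sensor dropouts
filled = mice_impute(holes)
```

Production MICE additionally draws imputations stochastically (e.g. via `sklearn.impute.IterativeImputer` with `sample_posterior=True`) to produce multiple diversified datasets, which is the property the abstract exploits for training-data diversification.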
Abstract: Face recognition has emerged as one of the most prominent applications of image analysis and understanding, gaining considerable attention in recent years. This growing interest is driven by two key factors: its extensive applications in law enforcement and the commercial domain, and the rapid advancement of practical technologies. Despite significant advancements, modern recognition algorithms still struggle in real-world conditions such as varying lighting, occlusion, and diverse facial poses. In such scenarios, human perception remains well above the capabilities of present technology. Using a systematic mapping study, this paper presents an in-depth review of face detection and face recognition algorithms, providing a detailed survey of advancements made between 2015 and 2024. We analyze key methodologies, highlighting their strengths and limitations in the application context. Additionally, we examine various face detection and recognition datasets, focusing on task-specific applications, size, diversity, and complexity. By analyzing these algorithms and datasets, this survey serves as a valuable resource for researchers, identifying research gaps in the field of face detection and recognition and outlining potential directions for future research.
Abstract: Electric Vehicle Charging Systems (EVCS) are increasingly vulnerable to cybersecurity threats as they integrate deeply into smart grids and Internet of Things (IoT) environments, raising significant security challenges. Most existing research primarily emphasizes network-level anomaly detection, leaving critical vulnerabilities at the host level underexplored. This study introduces a novel forensic analysis framework leveraging host-level data, including system logs, kernel events, and Hardware Performance Counters (HPC), to detect and analyze sophisticated cyberattacks such as cryptojacking, Denial-of-Service (DoS), and reconnaissance activities targeting EVCS. Using comprehensive forensic analysis and machine learning models, the proposed framework significantly outperforms existing methods, achieving an accuracy of 98.81%. The findings offer insights into distinct behavioral signatures associated with specific cyber threats, enabling improved cybersecurity strategies and actionable recommendations for robust EVCS infrastructure protection.
Abstract: Accurate capacity and State of Charge (SOC) estimation are crucial for ensuring the safety and longevity of lithium-ion batteries in electric vehicles. This study examines ten machine learning architectures, including Deep Belief Network (DBN), Bidirectional Recurrent Neural Network (BiDirRNN), Gated Recurrent Unit (GRU), and others, using the NASA B0005 dataset of 591,458 instances. Results indicate that the DBN excels in capacity estimation, achieving orders-of-magnitude lower error values and explaining over 99.97% of the predicted variable's variance. When computational efficiency is paramount, the Deep Neural Network (DNN) offers a strong alternative, delivering near-competitive accuracy with significantly reduced prediction times. The GRU achieves the best overall performance for SOC estimation, attaining an R^(2) of 0.9999, while the BiDirRNN provides a marginally lower error at a slightly higher computational speed. In contrast, Convolutional Neural Networks (CNN) and Radial Basis Function Networks (RBFN) exhibit relatively high error rates, making them less viable for real-world battery management. Analyses of error distributions reveal that the top-performing models cluster most predictions within tight bounds, limiting the risk of overcharging or deep discharging. These findings highlight the trade-off between accuracy and computational overhead, offering valuable guidance for battery management system (BMS) designers seeking optimal performance under constrained resources. Future work may further explore advanced data augmentation and domain adaptation techniques to enhance these models' robustness in diverse operating conditions.
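For readers unfamiliar with why gated recurrences suit SOC estimation from measurement sequences, the following is a bare numpy sketch of a single GRU cell's forward update, with randomly initialized weights rather than trained ones; the input dimension (e.g. voltage, current, temperature) is an assumption for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, params):
    """One GRU update: the update gate z decides how much of the previous
    hidden state to keep versus overwrite with the candidate state, and
    the reset gate r controls how much history enters the candidate."""
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(x @ Wz + h @ Uz)            # update gate
    r = sigmoid(x @ Wr + h @ Ur)            # reset gate
    h_cand = np.tanh(x @ Wh + (r * h) @ Uh)
    return (1 - z) * h + z * h_cand

rng = np.random.default_rng(1)
n_in, n_hid = 3, 8                          # e.g. voltage, current, temperature
params = tuple(rng.normal(scale=0.3, size=s)
               for s in [(n_in, n_hid), (n_hid, n_hid)] * 3)
seq = rng.normal(size=(20, n_in))           # one measurement sequence
h = np.zeros(n_hid)
for x in seq:                               # unroll over the time series
    h = gru_step(x, h, params)
```

The final hidden state `h` summarizes the whole sequence; in a real SOC estimator it would feed a trained output layer mapping to the state of charge.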
Funding: Supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. 2022R1C1C1005409); by the Korea Agency for Infrastructure Technology Advancement (KAIA) grant funded by the Ministry of Land, Infrastructure and Transport (Grant RS-2023-00251002); by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (Grant No. RS-2025-00516147); by the NSF PREM program (DMR-2122178); and by the Institute of Advanced Manufacturing (IAM) at the University of Texas at Rio Grande Valley (UTRGV).
Abstract: Sinkhole formation poses a significant geohazard in karst regions, where unpredictable subsurface erosion often necessitates costly grouting for stabilization. Accurate estimation of grout volume remains a persistent challenge due to spatial variability, site-specific conditions, and the limitations of traditional empirical methods. This study introduces a novel machine learning-based regression model for grout volume prediction that integrates cone penetration test (CPT)-derived Sinkhole Resistance Ratio (SRR) values, spatial correlations between CPTs and grouting points (GPs), and field-recorded grout volumes from six sinkhole sites in Florida. Three data transformation methods, the Proximal Allocation Method (PAM), the Equitable Distribution Method (EDM), and the Threshold-based Equitable Distribution Method (TEDM), were applied to distribute grout influence across CPTs, with TEDM demonstrating superior predictive performance. Synthetic data augmentation using spline methodology further improved model robustness. A high-degree polynomial regression model, optimized with ridge regularization, achieved high accuracy (R^(2) = 0.95; PEV = 0.94) and significantly outperformed existing linear and logarithmic models. Results confirm that lower SRR values correlate with higher grout demand, and the proposed model reliably captures these nonlinear relationships. This research advances sinkhole remediation practice by providing a data-driven, accurate, and generalizable framework for grout volume estimation, enabling more efficient resource allocation and improved project outcomes.
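The core modelling ingredient named above, polynomial regression with ridge regularization, can be sketched in closed form; the synthetic SRR-to-volume relationship below is an assumption for illustration (lower SRR giving higher grout demand, as the abstract reports), not the paper's data or fitted model.

```python
import numpy as np

def poly_ridge_fit(x, y, degree, lam):
    """Fit a degree-d polynomial with ridge (L2) regularization via the
    normal equations: w = (X^T X + lam*I)^(-1) X^T y."""
    X = np.vander(x, degree + 1, increasing=True)  # columns 1, x, x^2, ...
    I = np.eye(degree + 1)
    I[0, 0] = 0.0                  # leave the intercept unpenalized
    return np.linalg.solve(X.T @ X + lam * I, X.T @ y)

def poly_predict(w, x):
    return np.vander(x, len(w), increasing=True) @ w

# Hypothetical SRR -> grout-volume data: lower SRR, higher grout demand.
rng = np.random.default_rng(2)
srr = rng.uniform(0.1, 1.0, size=80)
vol = 5.0 - 4.0 * srr + 1.5 * srr**2 + rng.normal(scale=0.05, size=80)
w = poly_ridge_fit(srr, vol, degree=3, lam=1e-3)
pred = poly_predict(w, srr)
r2 = 1 - np.sum((vol - pred) ** 2) / np.sum((vol - vol.mean()) ** 2)
```

The ridge penalty `lam` is what keeps a high-degree polynomial from oscillating between training points, which matters when the augmented (spline-interpolated) samples densify only parts of the SRR range.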
Funding: Supported by the National Natural Science Foundation of China (12261018) and the Universities Key Laboratory of Mathematical Modeling and Data Mining in Guizhou Province (2023013).
Abstract: In this paper, we establish and study a single-species logistic model with impulsive age-selective harvesting. First, we prove the ultimate boundedness of the solutions of the system. Then, we obtain conditions for the asymptotic stability of the trivial solution and of the positive periodic solution. Finally, numerical simulations are presented to validate our results. Our results show that age-selective harvesting is more conducive to sustainable population survival than non-age-selective harvesting.
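A minimal numerical sketch of the impulsive-harvesting idea, under assumed parameter values (the paper's actual model is age-structured; here a single logistic equation with a periodic proportional harvest stands in to show the boundedness and periodic behaviour the abstract describes):

```python
import numpy as np

def simulate_impulsive_logistic(r=1.0, K=10.0, E=0.4, T=2.0,
                                x0=1.0, t_end=100.0, dt=0.001):
    """Logistic growth x' = r*x*(1 - x/K) with an impulsive harvest
    x -> (1 - E)*x applied every T time units (forward Euler)."""
    n_steps = int(t_end / dt)
    steps_per_pulse = int(T / dt)
    x = x0
    traj = np.empty(n_steps)
    for i in range(n_steps):
        x += dt * r * x * (1 - x / K)       # continuous logistic growth
        if (i + 1) % steps_per_pulse == 0:
            x *= (1 - E)                    # harvest a fraction E at each impulse
        traj[i] = x
    return traj

traj = simulate_impulsive_logistic()
tail = traj[-2000:]   # the last pulse period, after transients have decayed
```

With these assumed parameters the population neither dies out nor exceeds the carrying capacity K, settling into a sawtooth-like periodic solution between pulses, which is the kind of positive periodic solution whose stability the paper analyses.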
Funding: Supported by the MSIT (Ministry of Science and ICT), Republic of Korea, under the ICAN (ICT Challenge and Advanced Network of HRD) support program (IITP-2025-RS-2023-00259497), supervised by the IITP (Institute for Information & Communications Technology Planning & Evaluation); by the Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Republic of Korea government (MSIT) (No. IITP-2025-RS-2023-00254129, Graduate School of Metaverse Convergence (Sungkyunkwan University)); and by the Basic Science Research Program of the National Research Foundation (NRF) funded by the Republic of Korea government (MSIT) (No. RS-2024-00346737).
Abstract: Network attacks have become a critical issue in internet security. Detection methodologies based on artificial intelligence have attracted attention; however, recent studies have struggled to adapt to changing attack patterns and complex network environments. In addition, it is difficult to logically explain detection results produced by artificial intelligence. We propose a method for classifying network attacks using graph models that can explain its detection results. First, we reconstruct network packet data into a graph structure. We then use a graph model to predict network attacks via edge classification. To explain the prediction results, we randomly mask neighbors, observe the resulting numerical changes, and calculate each neighbor's importance, allowing us to extract significant subgraphs. Our experiments on six public datasets demonstrate superior performance, with an average F1-score of 0.960 and accuracy of 0.964, outperforming traditional machine learning and other graph models. The visual representation of the extracted subgraphs highlights the neighboring nodes with the greatest impact on the results, thus explaining the detection. In conclusion, this study demonstrates that graph-based models are suitable for network attack detection in complex environments, and that the importance of graph neighbors can be calculated to efficiently analyze the results. This approach can contribute to real-world network security analyses and provides a new direction for the field.
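The neighbor-masking explanation step can be illustrated with a toy edge scorer; everything here (the linear scorer, the tiny graph, the feature values) is a hypothetical stand-in for the paper's trained graph model, showing only the mask-and-measure pattern.

```python
import numpy as np

def edge_score(u, v, feats, adj, w, masked=()):
    """Toy edge scorer (stand-in for a trained graph model): each endpoint
    is embedded as its own feature vector plus the mean of its unmasked
    neighbors' features, and the two embeddings are scored linearly."""
    def embed(node):
        nbrs = [n for n in adj[node] if n not in masked]
        agg = (np.mean([feats[n] for n in nbrs], axis=0)
               if nbrs else np.zeros(len(w) // 2))
        return feats[node] + agg
    return float(np.concatenate([embed(u), embed(v)]) @ w)

def neighbor_importance(u, v, feats, adj, w):
    """Mask each neighbor in turn; importance = |change in edge score|."""
    base = edge_score(u, v, feats, adj, w)
    return {n: abs(base - edge_score(u, v, feats, adj, w, masked=(n,)))
            for n in (set(adj[u]) | set(adj[v])) - {u, v}}

# Tiny graph: classify edge (0, 1); node 2 carries a large "attack-like" feature.
feats = {0: np.array([0.1, 0.0]), 1: np.array([0.0, 0.1]),
         2: np.array([5.0, 5.0]), 3: np.array([0.2, 0.1])}
adj = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1]}
w = np.ones(4)
imp = neighbor_importance(0, 1, feats, adj, w)
```

Masking node 2 changes the edge score far more than masking node 3, so node 2 would be kept in the explanatory subgraph, mirroring how the paper extracts the neighbors with the greatest impact on a prediction.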
Funding: The Trøndelag Health Study (HUNT) is a collaboration between the HUNT Research Centre (Faculty of Medicine and Health Sciences, Norwegian University of Science and Technology), Trøndelag County Council, Central Norway Regional Health Authority, and the Norwegian Institute of Public Health. The coordination of the European Prospective Investigation into Cancer and Nutrition - Spain study (EPIC) is financially supported by the International Agency for Research on Cancer (IARC) and by the Department of Epidemiology and Biostatistics, School of Public Health, Imperial College London, which has additional infrastructure support provided by the NIHR Imperial Biomedical Research Centre (BRC); supported by the Health Research Fund (FIS) - Instituto de Salud Carlos III (ISCIII), the Regional Governments of Andalucía, Asturias, Basque Country, Murcia and Navarra, and the Catalan Institute of Oncology - ICO (Spain); funded by The Netherlands Organisation for Health Research and Development ZonMw (Grant No. 531-00141-3). Funding for the SHIP study has been provided by the Federal Ministry for Education and Research (BMBF; identification codes 01 ZZ96030, 01 ZZ0103, and 01 ZZ0701); support from the Swedish Research Council (2018-02527 and 2019-00193); financed by the Helmholtz Zentrum München - German Research Center for Environmental Health, which is funded by the German Federal Ministry of Education and Research (BMBF) and by the State of Bavaria.
Abstract: Background: There is insufficient evidence to provide recommendations for leisure-time physical activity among workers across various occupational physical activity levels. This study aimed to assess the association of leisure-time physical activity with cardiovascular and all-cause mortality across occupational physical activity levels. Methods: This study utilized individual participant data from 21 cohort studies, comprising both published and unpublished data. Eligibility criteria included individual-level data on leisure-time and occupational physical activity (categorized as sedentary, low, moderate, and high) along with data on all-cause and/or cardiovascular mortality. A 2-stage individual participant data meta-analysis was conducted, with separate analysis of each study using Cox proportional hazards models (Stage 1). These results were combined using random-effects models (Stage 2). Results: Higher leisure-time physical activity levels were associated with lower all-cause and cardiovascular mortality risk across most occupational physical activity levels, for both males and females. Among males with sedentary work, high compared to sedentary leisure-time physical activity was associated with lower all-cause (hazard ratio (HR) = 0.77, 95% confidence interval (95% CI): 0.70-0.85) and cardiovascular mortality (HR = 0.76, 95% CI: 0.66-0.87) risk. Among males with high levels of occupational physical activity, high compared to sedentary leisure-time physical activity was associated with lower all-cause (HR = 0.84, 95% CI: 0.74-0.97) and cardiovascular mortality (HR = 0.79, 95% CI: 0.60-1.04) risk, while HRs for low and moderate levels of leisure-time physical activity ranged between 0.87 and 0.97 and were not statistically significant. Among females, most effects were similar but more imprecise, especially at the higher occupational physical activity levels. Conclusion: Higher levels of leisure-time physical activity were generally associated with lower mortality risks. However, results for workers with moderate and high occupational physical activity levels, especially women, were more imprecise. Our findings suggest that workers may benefit from engaging in high levels of leisure-time physical activity, irrespective of their level of occupational physical activity.
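The Stage-2 step, pooling per-study hazard ratios with a random-effects model, can be sketched with the standard DerSimonian-Laird estimator; the per-cohort log(HR)s and standard errors below are hypothetical inputs, not the study's data.

```python
import math

def pool_random_effects(log_hrs, ses):
    """Stage-2 pooling: DerSimonian-Laird random-effects combination of
    per-study log hazard ratios and their standard errors."""
    k = len(log_hrs)
    w = [1.0 / se**2 for se in ses]                     # fixed-effect weights
    fixed = sum(wi * b for wi, b in zip(w, log_hrs)) / sum(w)
    q = sum(wi * (b - fixed) ** 2 for wi, b in zip(w, log_hrs))
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                  # between-study variance
    w_re = [1.0 / (se**2 + tau2) for se in ses]         # random-effects weights
    pooled = sum(wi * b for wi, b in zip(w_re, log_hrs)) / sum(w_re)
    return pooled, math.sqrt(1.0 / sum(w_re))

# Hypothetical per-cohort log(HR)s for high vs. sedentary leisure-time activity.
log_hrs = [math.log(hr) for hr in (0.77, 0.84, 0.80, 0.92, 0.71)]
ses = [0.05, 0.07, 0.06, 0.10, 0.08]
pooled, se = pool_random_effects(log_hrs, ses)
hr = math.exp(pooled)
ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
```

The between-study variance tau² is what makes the pooled interval wider than a fixed-effect combination when the 21 cohorts disagree, which is why sparse strata (e.g. women with high occupational activity) come out imprecise.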
Funding: Supported by the National Institute on Aging (NIH-NIA) R01AG054459 (to ALL).
Abstract: Alzheimer's disease (AD) is the most common form of dementia, affecting over 50 million people worldwide. This figure is projected to nearly double every 20 years, reaching 82 million by 2030 and 152 million by 2050 (Alzheimer's Disease International). The apolipoprotein ε4 (APOE4) allele is the strongest genetic risk factor for late-onset AD (after age 65 years). Apolipoprotein E, a lipid transporter, exists in three variants: ε2, ε3, and ε4. APOE ε2 (APOE2) is protective against AD, APOE ε3 (APOE3) is neutral, while APOE4 significantly increases the risk. Individuals with one copy of APOE4 have a 4-fold greater risk of developing AD, and those with two copies face an 8-fold risk compared to non-carriers. Even in cognitively normal individuals, APOE4 carriers exhibit brain metabolic and vascular deficits decades before amyloid-beta (Aβ) plaques and neurofibrillary tau tangles emerge, the hallmark pathologies of AD (Reiman et al., 2001, 2005; Thambisetty et al., 2010). Notably, studies have demonstrated reduced glucose uptake, or hypometabolism, in AD-vulnerable brain regions in asymptomatic middle-aged APOE4 carriers, long before clinical symptoms arise (Reiman et al., 2001, 2005).
Funding: Supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (RS-2025-00559546) and by the IITP (Institute of Information & Communications Technology Planning & Evaluation)-ITRC (Information Technology Research Center) grant funded by the Korea government (Ministry of Science and ICT) (IITP-2025-RS-2023-00259004).
Abstract: The advent of sixth-generation (6G) networks introduces unprecedented challenges in achieving seamless connectivity, ultra-low latency, and efficient resource management in highly dynamic environments. Although fifth-generation (5G) networks transformed mobile broadband and machine-type communications at massive scales, their limitations in scaling, interference management, and latency persist in dense, high-mobility settings. To overcome these limitations, artificial intelligence (AI) and unmanned aerial vehicles (UAVs) have emerged as potential means of developing versatile, dynamic, and energy-efficient communication systems. This study proposes an AI-based UAV architecture that utilizes cooperative reinforcement learning (CoRL) to manage an autonomous network. The UAVs collaborate by sharing local observations and real-time state exchanges to optimize user connectivity, movement directions, power allocation, and resource distribution. Unlike conventional centralized or autonomous methods, CoRL involves joint state sharing and conflict-sensitive reward shaping, which ensures fair coverage, less interference, and enhanced adaptability in a dynamic urban environment. Simulations conducted in smart city scenarios with 10 UAVs and 50 ground users demonstrate that the proposed CoRL-based UAV system increases user coverage by up to 10%, achieves convergence 40% faster, and reduces latency and energy consumption by 30% compared with centralized and decentralized baselines. Furthermore, the distributed nature of the algorithm ensures scalability and flexibility, making it well suited for future large-scale 6G deployments. The results highlight that AI-enabled UAV systems enhance connectivity, support ultra-reliable low-latency communications (URLLC), and improve 6G network efficiency. Future work will extend the framework with adaptive modulation, beamforming-aware positioning, and real-world testbed deployment.
基金supported by the Natural Science Foundation of Fujian Province of China(2025J01380)National Natural Science Foundation of China(No.62471139)+3 种基金the Major Health Research Project of Fujian Province(2021ZD01001)Fujian Provincial Units Special Funds for Education and Research(2022639)Fujian University of Technology Research Start-up Fund(GY-S24002)Fujian Research and Training Grants for Young and Middle-aged Leaders in Healthcare(GY-H-24179).
Abstract: The generation of synthetic trajectories has become essential in various fields for analyzing complex movement patterns. However, the use of real-world trajectory data poses significant privacy risks, such as location re-identification and correlation attacks. To address these challenges, privacy-preserving trajectory generation methods are critical for applications relying on sensitive location data. This paper introduces DPIL-Traj, an advanced framework designed to generate synthetic trajectories while achieving a superior balance between data utility and privacy preservation. First, the framework incorporates differential privacy clustering, which anonymizes trajectory data by applying differential privacy techniques that add noise, ensuring the protection of sensitive user information. Second, imitation learning is used to replicate decision-making behaviors observed in real-world trajectories. By learning from expert trajectories, this component generates synthetic data that closely mimics real-world decision-making processes while optimizing the quality of the generated trajectories. Finally, Markov-based trajectory generation is employed to capture and maintain the inherent temporal dynamics of movement patterns. Extensive experiments conducted on the GeoLife trajectory dataset show that DPIL-Traj improves utility performance by an average of 19.85% and privacy performance by an average of 12.51% compared to state-of-the-art approaches. Ablation studies further reveal that DP clustering effectively safeguards privacy, imitation learning enhances utility under noise, and the Markov module strengthens temporal coherence.
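The Markov-based generation module mentioned above can be sketched minimally: estimate a first-order transition table from observed discrete trajectories and walk it to emit synthetic ones. The toy location alphabet is an assumption for illustration (real systems would use grid-cell or region IDs, and DPIL-Traj adds DP noise and imitation learning on top).

```python
import random
from collections import defaultdict

def fit_markov(trajectories):
    """Estimate a first-order transition table from observed trajectories
    (each a list of discrete locations, e.g. grid-cell IDs)."""
    counts = defaultdict(lambda: defaultdict(int))
    for traj in trajectories:
        for a, b in zip(traj, traj[1:]):
            counts[a][b] += 1
    return {a: {b: c / sum(nxt.values()) for b, c in nxt.items()}
            for a, nxt in counts.items()}

def sample_trajectory(trans, start, length, rng):
    """Generate a synthetic trajectory by walking the transition table."""
    traj, state = [start], start
    for _ in range(length - 1):
        if state not in trans:        # dead end: no observed successor
            break
        nxt, probs = zip(*trans[state].items())
        state = rng.choices(nxt, weights=probs)[0]
        traj.append(state)
    return traj

rng = random.Random(3)
real = [["A", "B", "C", "B", "A"],
        ["A", "B", "A", "B", "C"],
        ["B", "C", "B", "A", "B"]]
trans = fit_markov(real)
synthetic = sample_trajectory(trans, "A", 10, rng)
```

Because the synthetic walk only ever uses aggregate transition frequencies, no individual trajectory is reproduced verbatim, which is the temporal-coherence-preserving property the ablation study attributes to the Markov module.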
Abstract: It is important for modern hospital management to strengthen medical humanistic care and build a harmonious doctor-patient relationship. Innovative applications of big data on patient experience in modern hospital management enable real-time supervision, dynamic management, and scientific decision-making based on patients' experiences. They help hospital management shift from an administrator's perspective to a patient's perspective, and from experience-driven to data-driven. Technological innovations in hospital management based on patient experience data can support the optimization and continuous improvement of healthcare quality, and therefore help increase patient satisfaction with medical services.
Funding: This research was fully supported by Universiti Teknologi PETRONAS, under the Yayasan Universiti Teknologi PETRONAS (YUTP) Fundamental Research Grant Scheme (YUTP-015LC0-123).
Abstract: Anomaly detection in high-dimensional data is a critical research issue with serious implications for real-world problems. Many issues in this field remain unsolved, and several modern anomaly detection methods struggle to maintain adequate accuracy due to the highly descriptive nature of big data. This phenomenon is referred to as the "curse of dimensionality," which affects traditional techniques in terms of both accuracy and performance. Thus, this research proposes a hybrid model based on a Deep Autoencoder Neural Network (DANN) with five layers to reduce the difference between the input and output. The proposed model was applied to a real-world gas turbine (GT) dataset that contains 87620 columns and 56 rows. During the experiments, two issues were investigated and solved to enhance the results. The first is dataset class imbalance, which was solved using the SMOTE technique. The second is poor performance, which can be addressed using an optimization algorithm. Several optimization algorithms were investigated and tested, including stochastic gradient descent (SGD), RMSprop, Adam, and Adamax; the Adamax optimization algorithm showed the best results when employed to train the DANN model. The experimental results show that the proposed model can detect anomalies by efficiently reducing the high dimensionality of the dataset, with an accuracy of 99.40%, an F1-score of 0.9649, an Area Under the Curve (AUC) of 0.9649, and a minimal loss function during hybrid model training.
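The autoencoder principle, compress, reconstruct, and flag samples with large reconstruction error, can be shown with a linear stand-in: PCA acts as a tied-weight linear autoencoder. This is an illustrative sketch, not the paper's five-layer DANN, and the synthetic low-rank "sensor" data is an assumption.

```python
import numpy as np

def fit_linear_autoencoder(X, k):
    """PCA as a linear autoencoder: encode into the top-k principal
    directions and decode back; reconstruction error becomes the
    anomaly score (a linear stand-in for a deep autoencoder)."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    W = Vt[:k].T                      # tied encoder/decoder weights
    return mu, W

def anomaly_scores(X, mu, W):
    Z = (X - mu) @ W                  # encode (compress)
    X_hat = Z @ W.T + mu              # decode (reconstruct)
    return np.sum((X - X_hat) ** 2, axis=1)

# Normal operation: 50-d readings that actually live on a 3-d subspace.
rng = np.random.default_rng(4)
basis = rng.normal(size=(3, 50))
X_train = rng.normal(size=(300, 3)) @ basis \
        + rng.normal(scale=0.01, size=(300, 50))
mu, W = fit_linear_autoencoder(X_train, k=3)

X_test = rng.normal(size=(20, 3)) @ basis
X_test[7] += rng.normal(scale=2.0, size=50)   # inject one anomaly
scores = anomaly_scores(X_test, mu, W)
```

Normal samples reconstruct almost perfectly because they lie in the learned subspace, while the injected anomaly leaves a large residual; a deep autoencoder generalizes this to nonlinear manifolds, and SMOTE or an optimizer such as Adamax only changes how the encoder is trained, not this scoring logic.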
Funding: Supported by two research grants provided by the Karachi Institute of Economics and Technology (KIET) and the Big Data Analytics Laboratory at the Institute of Business Administration (IBA), Karachi.
Abstract: The advent of healthcare information management systems (HIMSs) continues to produce large volumes of healthcare data for patient care, compliance, and regulatory requirements at a global scale. Analysis of this big data allows for boundless potential outcomes for discovering knowledge. Big data analytics (BDA) in healthcare can, for instance, help determine causes of diseases, generate effective diagnoses, enhance QoS guarantees by increasing the efficiency of healthcare delivery and the effectiveness and viability of treatments, generate accurate predictions of readmissions, enhance clinical care, and pinpoint opportunities for cost savings. However, BDA implementations in any domain are generally complicated and resource-intensive, with a high failure rate and no roadmap or success strategies to guide practitioners. In this paper, we present a comprehensive roadmap to derive insights from BDA in the healthcare (patient care) domain, based on the results of a systematic literature review. We initially determine big data characteristics for healthcare and then review BDA applications to healthcare in academic research, focusing particularly on NoSQL databases. We also identify the limitations and challenges of these applications and justify the potential of NoSQL databases to address these challenges and further enhance BDA healthcare research. We then propose and describe a state-of-the-art BDA architecture called Med-BDA for the healthcare domain, which solves all current BDA challenges and is based on the latest zeta big data paradigm. We also present success strategies to ensure the working of Med-BDA, along with outlining the major benefits of BDA applications to healthcare. Finally, we compare our work with other related literature reviews across twelve hallmark features to justify the novelty and importance of our work. The aforementioned contributions of our work are collectively unique and clearly present a roadmap for clinical administrators, practitioners, and professionals to successfully implement BDA initiatives in their organizations.
Funding: Funded by the Science and Industry Endowment Fund as part of The Distal Footprints of Giant Ore Systems: UNCOVER Australia Project (RP04-063) - Capricorn Distal Footprints.
Abstract: Traditional approaches to developing 3D geological models employ a mix of quantitative and qualitative scientific techniques, which do not fully quantify uncertainty in the constructed models and fail to optimally weight geological field observations against constraints from geophysical data. Here, using the Bayesian Obsidian software package, we develop a methodology to fuse lithostratigraphic field observations with aeromagnetic and gravity data to build a 3D model of a small (13.5 km × 13.5 km) region of the Gascoyne Province, Western Australia. Our approach is validated by comparing the 3D model results to independently constrained geological maps and cross-sections produced by the Geological Survey of Western Australia. By fusing geological field data with aeromagnetic and gravity surveys, we show that 89% of the modelled region has >95% certainty for a particular geological unit for the given model and data. The boundaries between geological units are characterized by narrow regions of <95% certainty, which are typically 400-1000 m wide at the Earth's surface and 500-2000 m wide at depth. Beyond ~4 km depth, the model requires geophysical survey data with longer wavelengths (e.g., active seismic) to constrain the deeper subsurface. Although Obsidian was originally built for sedimentary basin problems, it is reasonably applicable to deformed terranes such as the Gascoyne Province. Ultimately, modification of the Bayesian engine to incorporate structural data will aid in developing more robust 3D models. Nevertheless, our results show that surface geological observations fused with geophysical survey data can yield reasonable 3D geological models with narrow uncertainty regions at the surface and shallow subsurface, which will be especially valuable for mineral exploration and the development of 3D geological models under cover.
Funding: This work was supported in part by the National Natural Science Foundation of China under Grant 62072351; in part by the Academy of Finland under Grant 308087, Grant 335262, Grant 345072, and Grant 350464; in part by the Open Project of Zhejiang Lab under Grant 2021PD0AB01; and in part by the 111 Project under Grant B16037.
Abstract: With the rapid advancement of cloud computing, cloud storage services have developed rapidly. One issue that has attracted particular attention in such remote storage services is that cloud storage servers cannot be fully trusted to reliably save and maintain data, which greatly affects users' confidence in purchasing and consuming cloud storage services. Traditional data integrity auditing techniques for cloud data storage are centralized, and thus face huge security risks due to single points of failure and vulnerabilities of central auditing servers. Blockchain technology offers a new approach to this problem, and many researchers have endeavored to employ blockchain for data integrity auditing. Based on a search of relevant papers, we found that the existing literature lacks a thorough survey of blockchain-based integrity auditing for cloud data. In this paper, we present an in-depth survey of cloud data integrity auditing based on blockchain. First, we cover essential background on integrity auditing for cloud data and blockchain techniques. Then, we propose a series of requirements for evaluating existing Blockchain-based Data Integrity Auditing (BDIA) schemes. Furthermore, we provide a comprehensive review of existing BDIA schemes and evaluate them against our proposed criteria. Finally, based on our review and analysis, we explore open issues and suggest research directions worthy of further effort in the future.
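To ground the integrity-auditing idea, here is a heavily simplified challenge-response sketch: the owner precomputes per-block hash tags, and an auditor spot-checks randomly sampled blocks against them. Real BDIA schemes replace plain hashes with homomorphic authenticators and record tags/challenges on-chain to remove the trusted central auditor; this toy shows only the core check.

```python
import hashlib
import random

def make_tags(data, block_size=64):
    """Owner side: split the file into fixed-size blocks and store one
    SHA-256 tag per block (a toy stand-in for homomorphic authenticators)."""
    blocks = [data[i:i + block_size] for i in range(0, len(data), block_size)]
    return [hashlib.sha256(b).hexdigest() for b in blocks], len(blocks)

def audit(storage, tags, n_blocks, n_challenges, rng, block_size=64):
    """Auditor side: challenge a random sample of block indices and
    verify each returned block against its stored tag."""
    for i in rng.sample(range(n_blocks), n_challenges):
        block = storage[i * block_size:(i + 1) * block_size]
        if hashlib.sha256(block).hexdigest() != tags[i]:
            return False                      # corruption detected
    return True

rng = random.Random(5)
data = bytes(rng.getrandbits(8) for _ in range(64 * 32))   # a 32-block "file"
tags, n = make_tags(data)

ok_intact = audit(data, tags, n, n_challenges=n, rng=rng)
tampered = data[:100] + b"X" + data[101:]     # server silently flips one byte
ok_tampered = audit(tampered, tags, n, n_challenges=n, rng=rng)
```

Sampling fewer than all blocks trades detection probability for bandwidth, which is exactly the efficiency knob that probabilistic auditing schemes tune; here every block is challenged, so a single corrupted byte is detected deterministically.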