To address the severe challenges of PM2.5 and ozone co-control during the "14th Five-Year Plan" period and to enhance the precision and intelligence of air environment governance, it is imperative to build an efficient comprehensive management platform for regional air quality. This paper takes the practice of Zibo City, Shandong Province as an example to systematically analyze the top-level design, technical implementation, and innovative application of a comprehensive regional air quality management platform integrating perception monitoring, data fusion, early-warning analysis, source apportionment, collaborative dispatching, and evaluation assessment. Through the construction of a "sky-air-ground" integrated three-dimensional monitoring network, the platform fuses multi-source heterogeneous environmental data and employs big data, cloud computing, artificial intelligence, and numerical models such as CALPUFF and CMAQ to achieve comprehensive perception, precise prediction, intelligent source tracing, and closed-loop management of air pollution. The platform establishes a full-process closed-loop management mechanism of "data, early warning, disposition, evaluation", achieving a fundamental transformation of environmental supervision from passive response to active anticipation and from experience-based judgment to data-driven decision-making. Application results show that the platform significantly improves the scientific decision-making ability and collaborative execution efficiency of air pollution governance in Zibo City, providing a replicable and scalable solution for similar industrial cities seeking continuous improvement of air quality.
Objective expertise evaluation of individuals, as a prerequisite stage for team formation, has been a long-term desideratum in large software development companies. With the rapid advancement of machine learning methods, and with reliable data already stored in project management tools' datasets, automating this evaluation process becomes a natural step forward. In this context, our approach focuses on quantifying software developer expertise using metadata from task-tracking systems. For this, we mathematically formalize two categories of expertise: technology-specific expertise, which denotes the skills required for a particular technology, and general expertise, which encapsulates overall knowledge of the software industry. Afterward, we automatically classify the zones of expertise associated with each task a developer has worked on, using Bidirectional Encoder Representations from Transformers (BERT)-like transformers to handle the unique characteristics of project tool datasets effectively. Finally, our method evaluates the proficiency of each software specialist across already completed projects from both technology-specific and general perspectives. The method was experimentally validated, yielding promising results.
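The abstract formalizes two expertise categories without reproducing the formulas. As a purely illustrative sketch (the field names, weights, and aggregation rule below are hypothetical assumptions, not the authors' model), technology-specific expertise can be read as a per-technology aggregate over a developer's classified tasks, and general expertise as the aggregate across all technologies:

```python
from collections import defaultdict

def expertise_scores(tasks):
    """Aggregate per-technology and general expertise from task metadata.

    Each task is a dict with a 'developer', a 'technology' label (as a
    BERT-style task classifier might produce), and a 'weight' standing in
    for task size or difficulty. All field names are illustrative.
    """
    tech = defaultdict(float)     # (developer, technology) -> score
    general = defaultdict(float)  # developer -> overall score
    for t in tasks:
        tech[(t["developer"], t["technology"])] += t["weight"]
        general[t["developer"]] += t["weight"]
    return dict(tech), dict(general)

tasks = [
    {"developer": "ana", "technology": "java", "weight": 3.0},
    {"developer": "ana", "technology": "sql",  "weight": 1.0},
    {"developer": "bob", "technology": "java", "weight": 2.0},
]
tech, general = expertise_scores(tasks)
print(tech[("ana", "java")], general["ana"])  # 3.0 4.0
```

In the paper's pipeline, the technology label for each task would come from the BERT-like classifier rather than being hand-assigned as here.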
The deployment of the Internet of Things (IoT) with smart sensors has facilitated the emergence of fog computing as an important technology for delivering services to smart environments such as campuses, smart cities, and smart transportation systems. Fog computing tackles a range of challenges, including processing, storage, bandwidth, latency, and reliability, by distributing secure information locally through end nodes. Consisting of endpoints, fog nodes, and back-end cloud infrastructure, it provides advanced capabilities beyond traditional cloud computing. In smart environments, particularly within smart city transportation systems, the abundance of devices and nodes poses significant challenges related to power consumption and system reliability. To address the challenges of latency, energy consumption, and fault tolerance in these environments, this paper proposes a latency-aware, fault-tolerant framework for resource scheduling and data management, referred to as the FORD framework, for smart cities in fog environments. The framework is designed to meet the demands of time-sensitive applications, such as those in smart transportation systems. It incorporates latency-aware resource scheduling to optimize task execution, leveraging resources from both the fog and cloud tiers. Through simulation-based executions, tasks are allocated to the nearest available nodes with minimum latency. In the event of execution failure, a fault-tolerant mechanism ensures the successful completion of tasks. Upon successful execution, data is stored in the cloud data center, ensuring data integrity and reliability within the smart city ecosystem.
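The scheduling policy the abstract describes (nearest available node, minimum latency, retry on failure) can be sketched roughly as follows; the node attributes and retry limit are illustrative assumptions, not the FORD framework's actual interface:

```python
def schedule(task_name, nodes, max_retries=2):
    """Pick the lowest-latency available node for a task; if execution on a
    node fails (simulated here by a 'healthy' flag), fall back to the
    next-best node, up to max_retries, as a simple fault-tolerance step."""
    candidates = sorted((n for n in nodes if n["available"]),
                        key=lambda n: n["latency_ms"])
    for attempt, node in enumerate(candidates[:max_retries + 1]):
        if node["healthy"]:
            return node["name"], attempt
    return None, max_retries  # escalate, e.g. to the cloud tier

nodes = [
    {"name": "fog-a", "latency_ms": 5,  "available": True, "healthy": False},
    {"name": "fog-b", "latency_ms": 9,  "available": True, "healthy": True},
    {"name": "cloud", "latency_ms": 40, "available": True, "healthy": True},
]
print(schedule("sensor-batch", nodes))  # ('fog-b', 1): nearest node failed once
```

The attempt counter makes the fallback visible: the nearest fog node fails, so the task completes on the second-nearest node without ever reaching the cloud.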
On October 18, 2017, the 19th National Congress Report called for the implementation of the Healthy China Strategy. The development of biomedical data plays a pivotal role in advancing this strategy. Since the 18th National Congress of the Communist Party of China, China has vigorously promoted the integration and implementation of the Healthy China and Digital China strategies. The National Health Commission has prioritized the development of health and medical big data, issuing policies to promote standardized applications and foster innovation in "Internet + Healthcare". Biomedical data has significantly contributed to precision medicine, personalized health management, drug development, disease diagnosis, public health monitoring, and epidemic prediction capabilities.
We propose a Cross-Chain Mapping Blockchain (CCMB) for scalable data management in massive Internet of Things (IoT) networks. Specifically, CCMB aims to improve the scalability of securely storing, tracing, and transmitting IoT behavior and reputation data based on our proposed cross-mapped Behavior Chain (BChain) and Reputation Chain (RChain). To improve off-chain IoT data storage scalability, we show that our lightweight CCMB architecture efficiently utilizes available fog-cloud resources. The scalability of on-chain IoT data tracing is enhanced using our Mapping Smart Contract (MSC) and cross-chain mapping design to perform rapid Reputation-to-Behavior (R2B) traceability queries between BChain and RChain blocks. To maximize off-chain to on-chain throughput, we optimize the CCMB block settings and producers based on a general Poisson Point Process (PPP) network model. The constrained optimization problem is formulated as a Markov Decision Process (MDP) and solved using a dual-network Deep Reinforcement Learning (DRL) algorithm. Simulation results validate CCMB's scalability advantages in storage, traceability, and throughput. In specific massive IoT scenarios, CCMB can reduce the storage footprint by 50% and traceability query time by 90%, while improving system throughput by 55% compared to existing benchmarks.
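The R2B traceability idea, resolving a reputation record directly to the behavior blocks it was derived from instead of scanning the whole behavior chain, can be illustrated with a toy mapping table. All identifiers and payloads below are hypothetical, and the real MSC is an on-chain smart contract rather than a Python dict:

```python
# Toy chains: each block is id -> payload. The mapping table stands in for
# the paper's Mapping Smart Contract, linking each reputation block to the
# behavior blocks it was derived from, so an R2B query is a direct lookup.
bchain = {"b1": "device-42 sent 10 readings", "b2": "device-42 sent 8 readings"}
rchain = {"r1": "device-42 reputation 0.93"}
r2b_map = {"r1": ["b1", "b2"]}  # hypothetical cross-chain mapping

def trace_r2b(rep_block_id):
    """Resolve a reputation block to its source behavior records directly,
    without scanning the whole behavior chain."""
    return [bchain[bid] for bid in r2b_map.get(rep_block_id, [])]

print(trace_r2b("r1"))  # both behavior records behind reputation block r1
```

The point of the design is that the query cost depends on the mapping entry, not on the length of BChain, which is where the reported traceability speedup comes from.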
In the context of the rapid development of digital education, the security of educational data has become an increasing concern. This paper explores strategies for the classification and grading of educational data, and constructs a higher-education data security management and control model centered on the integration of medical and educational data. By implementing a multi-dimensional strategy of dynamic classification, real-time authorization, and secure execution based on educational data security levels, dynamic access control is applied to effectively enhance the security and controllability of educational data, providing a secure foundation for data sharing and openness.
With the comprehensive development of modern information technology, big data technology has been integrated into various industries and has become a pillar technology supporting industrial upgrading and transformation. In enterprise human resource management, big data technology also has broad application space and important application value. To gain higher market competitiveness and comprehensively improve the quality and efficiency of human resource management, enterprises need to rely on big data technology for comprehensive reform and optimization, thereby building an efficient, fair, open, and scientific human resource management model. This paper analyzes the problems and changes of enterprise human resource management in the era of big data, and then puts forward effective strategies for enterprise human resource management in this era.
In the era of big data, data has gradually become an important enterprise asset, and the application of big data technology has become key to optimizing enterprise marketing management. For enterprises, proactively embracing this trend, relying on big data technology to process and analyze data effectively, innovating decision-making methods and operating models, and achieving efficient marketing and fine-grained management is an important path to improved market competitiveness. Therefore, the author first analyzes the empowering role of big data technology in enterprise marketing management, then discusses the difficulties faced by enterprise marketing management in the era of big data, and finally puts forward targeted improvement strategies, aiming to provide a reference for enterprises innovating their marketing management models.
With the gradual acceleration of information construction in colleges and universities, the digital campus and smart campus have gradually become important means for scientific campus management. They have been applied to teaching, scientific research, student management, and other fields, improving the quality and efficiency of management. This paper studies an intelligent educational administration management system based on data mining technology. First, it introduces the application process of data mining technology and builds an intelligent educational administration management system upon it. It then optimizes the application of the Apriori algorithm in educational administration management through transaction compression and frequent sampling. Compared with the traditional Apriori algorithm, the optimized algorithm has a shorter execution time under the same minimum support.
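Transaction compression, one of the two optimizations named above, prunes transactions that can no longer contribute to longer frequent itemsets, so each later pass scans less data. A minimal self-contained sketch (not the paper's implementation) on a toy course-enrollment dataset:

```python
def apriori(transactions, min_support):
    """Apriori with transaction compression: after each pass, drop items no
    longer frequent, then drop transactions too short to support a k-itemset."""
    db = [frozenset(t) for t in transactions]
    # Pass 1: frequent single items.
    counts = {}
    for t in db:
        for item in t:
            counts[item] = counts.get(item, 0) + 1
    frequent = {frozenset([i]) for i, c in counts.items() if c >= min_support}
    result = set(frequent)
    k = 2
    while frequent:
        # Transaction compression: a transaction can only support a
        # k-itemset if it still contains at least k frequent items.
        items = set().union(*frequent)
        db = [t & items for t in db]
        db = [t for t in db if len(t) >= k]
        candidates = {a | b for a in frequent for b in frequent if len(a | b) == k}
        support = {c: sum(1 for t in db if c <= t) for c in candidates}
        frequent = {c for c, n in support.items() if n >= min_support}
        result |= frequent
        k += 1
    return result

enrollments = [["math", "stats"], ["math", "stats", "cs"],
               ["math", "cs"], ["stats", "cs"]]
res = apriori(enrollments, min_support=2)
print(sorted(map(sorted, res)))  # all 1- and 2-itemsets are frequent
```

On this dataset every single course and every pair is frequent at support 2, but the triple is not, so the third pass compresses the database down to a single transaction before counting.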
With the development of the current era, China's Internet and computer technologies are constantly being updated and improved. With the application of information technology, great changes have taken place in the financial management of schools. At present, the scope of school financial management work is constantly expanding. To meet the needs of school management, it is necessary to raise the level of financial management and strengthen its informatization, so as to modernize school financial management and improve its efficiency. Big data and information technologies enable the rapid collection and preprocessing of information, achieve in-depth mining of data value, and assist staff in quickly identifying problems in their work, further improving work efficiency. On this basis, this paper conducts an in-depth analysis and study of the informatization construction of financial management in colleges and universities supported by big data.
In the era of big data, the financial industry is undergoing profound changes. By integrating multiple data sources such as transaction records, customer interactions, market trends, and regulatory requirements, big data technology has significantly improved the decision-making efficiency, customer insight, and risk management capabilities of financial institutions. The financial industry has become a pioneer in the application of big data technology, which is widely used in scenarios such as fraud detection, risk management, customer service optimization, and smart transactions. However, financial data security management also faces many challenges, including data breaches, privacy protection, compliance requirements, the complexity of emerging technologies, and the balance between data access and security. This article explores the major challenges of financial data security management, coping strategies, and the evolution of the regulatory environment, and looks ahead to future trends, highlighting the important role of artificial intelligence and machine learning in financial data security.
This article focuses on the remote diagnosis and analysis of rail vehicle status based on data from the Train Control and Management System (TCMS). It first expounds the importance of train diagnostic analysis and designs a unified TCMS data frame transmission format. Subsequently, a remote data transmission link using 4G signals and the associated data processing methods are introduced. The advantages of remote diagnosis are analyzed, and common methods such as correlation analysis, fault diagnosis, and fault prediction are explained in detail. Challenges such as data security and the balance between diagnostic accuracy and real-time performance are then discussed, along with development prospects in technological innovation, algorithm optimization, and application promotion. This research provides ideas for remote analysis and diagnosis based on TCMS data, contributing to the safe and efficient operation of rail vehicles.
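Of the methods listed, correlation analysis is the simplest to illustrate: given two TCMS channels sampled at the same instants, the Pearson coefficient flags channels that move together, which can hint at a shared fault cause. A minimal sketch, with channel names invented purely for illustration:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length signals,
    e.g. two TCMS channels logged over the same time window."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Two channels that move together should score near +1.
speed = [10.0, 20.0, 30.0, 40.0]       # hypothetical speed channel
vibration = [0.2, 0.4, 0.6, 0.8]       # hypothetical vibration channel
print(round(pearson(speed, vibration), 6))  # ~1.0 (perfectly correlated)
```

In a real deployment the inputs would be resampled and cleaned series from the TCMS data frames, and a correlation threshold would be chosen per channel pair.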
In this paper, the application of agricultural big data in agricultural economic management is explored in depth, and its potential for promoting profit growth and innovation is analyzed. However, challenges persist in data collection and integration, the limitations of analytical technologies, talent development and team building, and policy support. Effective application strategies are proposed, including data-driven precision agriculture practices, the construction of data integration and management platforms, data security and privacy protection strategies, and long-term planning and development strategies for agricultural big data, to maximize its impact on agricultural economic management. Future advancements require collaborative efforts in technological innovation, talent cultivation, and policy support to realize the extensive application of agricultural big data in agricultural economic management and ensure sustainable industrial development.
In the rapidly evolving technological landscape, state-owned enterprises (SOEs) encounter significant challenges in sustaining their competitiveness through efficient R&D management. Integrated Product Development (IPD), with its emphasis on cross-functional teamwork, concurrent engineering, and data-driven decision-making, has been widely recognized for enhancing R&D efficiency and product quality. However, the unique characteristics of SOEs pose challenges to the effective implementation of IPD. The advancement of big data and artificial intelligence technologies offers new opportunities for optimizing IPD R&D management through data-driven decision-making models. This paper constructs and validates a data-driven decision-making model tailored to the IPD R&D management of SOEs. By integrating data mining, machine learning, and other advanced analytical techniques, the model serves as a scientific and efficient decision-making tool. It aids SOEs in optimizing R&D resource allocation, shortening product development cycles, reducing R&D costs, and improving product quality and innovation. Moreover, this study contributes to a deeper theoretical understanding of the value of data-driven decision-making in the context of IPD.
In the present study, data mining and network pharmacology were used to explore the principles and mechanisms of traditional Chinese medicine (TCM) in treating acute appendicitis, with the goal of providing a scientific basis for clinical treatment and further research on this disease. First, we searched the National Patent Database for Chinese herbal compound prescriptions used to treat acute appendicitis. We then applied frequency analysis, nature, flavor, and meridian-tropism analysis, association rule analysis, and hierarchical cluster analysis to identify the patterns of TCM treatment for acute appendicitis, selecting key combinations of Chinese medicines. Next, we screened the main active components of these key medicines based on quality markers. Using databases such as SwissTargetPrediction, SymMap, ETCM, and STRING, we analyzed the pharmacological mechanisms of the key medicines in treating acute appendicitis. Key active components and targets were further verified through molecular docking. We identified a total of 129 patents involving 316 Chinese medicines, 24 of which were frequently used. The results indicated that most Chinese herbs used for acute appendicitis were heat-clearing drugs, blood-activating and stasis-removing drugs, and purging drugs. The primary active ingredients of the rhubarb-Cortex Moutan-Flos Lonicerae combination for treating acute appendicitis included emodin, paeonol, physcion, chlorogenic acid, chrysophanol, rhein, and aloe-emodin. These ingredients targeted key proteins such as ALB, TP53, BCL2, STAT3, IL-6, and TNF, and were involved in cellular responses to lipopolysaccharides, cell composition, and various cytokine-mediated biological processes. They also interacted with signaling pathways such as AGE-RAGE, TNF, IL-17, and FoxO. Based on patent data, this study analyzed medication patterns in the treatment of acute appendicitis, discussed the possible mechanisms of key TCM combinations, and provided a scientific basis and new perspectives for the diagnosis and treatment of the disease.
In the new era, the impact of emerging productive forces has permeated every sector of industry. As the core production factor of these forces, data plays a pivotal role in industrial transformation and social development. Consequently, many domestic universities have introduced majors or courses related to big data. Among these, the Big Data Management and Applications major stands out for its interdisciplinary approach and emphasis on practical skills. However, as an emerging field, it has not yet accumulated a robust foundation in teaching theory and practice. Current instruction faces issues such as unclear training objectives, inconsistent teaching methods and course content, insufficient integration of practical components, and a shortage of qualified faculty, all of which hinder both the development of the major and the overall quality of education. Taking the statistics course within the Big Data Management and Applications major as an example, this paper examines the challenges faced by statistics education in the context of emerging productive forces and proposes corresponding improvement measures. By introducing innovative teaching concepts and strategies, the teaching system for professional courses is optimized, and authentic classroom scenarios are recreated through illustrative examples. Questionnaire surveys and statistical analyses of data collected before and after the teaching reforms indicate that the curriculum changes effectively enhance instructional outcomes, promote the development of the major, and improve the quality of talent cultivation.
Bone tumors (BTs), including osteosarcoma, Ewing sarcoma, and chondrosarcoma, are rare but biologically complex malignancies characterized by pronounced heterogeneity in anatomical location, histological subtype, and molecular alterations. Recent advances in artificial intelligence (AI), particularly deep learning, have enabled the integration of diverse clinical data modalities to support diagnosis, treatment planning, and prognostication in bone oncology. This review provides a comprehensive synthesis of AI-driven multimodal fusion strategies that incorporate radiological imaging, digital pathology, multi-omics profiling, and electronic health records. We conducted a structured review of peer-reviewed literature published between 2015 and early 2025, focusing on the development, validation, and clinical applicability of AI models for BT diagnosis, subtyping, treatment response prediction, and recurrence monitoring. Although multimodal models have demonstrated advantages over unimodal approaches, especially in handling missing data and improving generalizability, most remain constrained by single-center study designs, small sample sizes, and limited prospective or external validation. Persistent technical and translational challenges include semantic misalignment across modalities, incomplete datasets, limited model interpretability, and regulatory and infrastructural barriers to clinical integration. To address these limitations, we highlight emerging directions such as contrastive representation learning, generative data augmentation, transformer-based fusion architectures, and privacy-preserving federated learning. We also discuss the evolving role of foundation models and workflow-integrated AI agents in enhancing scalability and clinical usability. In summary, multimodal AI represents a promising paradigm for advancing precision care in BTs. Realizing its full clinical potential will require methodologically rigorous, biologically informed, and system-level approaches that bridge algorithmic innovation with real-world healthcare delivery.
Earthquakes are highly destructive spatio-temporal phenomena whose analysis is essential for disaster preparedness and risk mitigation. Modern seismological research produces vast volumes of heterogeneous data from seismic networks, satellite observations, and geospatial repositories, creating the need for scalable infrastructures capable of integrating and analyzing such data to support intelligent decision-making. Data warehousing technologies provide a robust foundation for this purpose; however, existing earthquake-oriented data warehouses remain limited, often relying on simplified schemas, domain-specific analytics, or cataloguing efforts. This paper presents the design and implementation of a spatio-temporal data warehouse for seismic activity. The framework integrates spatial and temporal dimensions in a unified schema and introduces a novel array-based approach for managing many-to-many relationships between facts and dimensions without intermediate bridge tables. A comparative evaluation against a conventional bridge-table schema demonstrates that the array-based design improves fact-centric query performance, while the bridge-table schema remains advantageous for dimension-centric queries. To reconcile these trade-offs, a hybrid schema is proposed that retains both representations, ensuring balanced efficiency across heterogeneous workloads. The proposed framework demonstrates how spatio-temporal data warehousing can address schema complexity, improve query performance, and support multidimensional visualization. In doing so, it provides a foundation for integrating seismic analysis into broader big-data-driven intelligent decision systems for disaster resilience, risk mitigation, and emergency management.
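The array-based versus bridge-table trade-off can be sketched with the standard-library sqlite3 module. Since SQLite has no native array type, the array column is simulated as JSON text, and all table and column names below are invented for illustration; the paper's actual schema is not reproduced here:

```python
import json
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Dimension, fact, and bridge tables. The fact row additionally carries its
# related dimension keys inline as a JSON array (the "array-based" design).
cur.execute("CREATE TABLE region (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE quake (id INTEGER PRIMARY KEY, mag REAL, region_ids TEXT)")
cur.execute("CREATE TABLE quake_region (quake_id INTEGER, region_id INTEGER)")

cur.execute("INSERT INTO region VALUES (1, 'coast'), (2, 'valley')")
cur.execute("INSERT INTO quake VALUES (10, 6.1, ?)", (json.dumps([1, 2]),))
cur.execute("INSERT INTO quake_region VALUES (10, 1), (10, 2)")

# Fact-centric query, array-based design: a single-row fetch, no join.
row = cur.execute("SELECT region_ids FROM quake WHERE id = 10").fetchone()
array_ids = json.loads(row[0])

# The same question through the bridge table requires a join.
bridge_names = sorted(name for (name,) in cur.execute(
    "SELECT r.name FROM quake_region b JOIN region r ON r.id = b.region_id "
    "WHERE b.quake_id = 10"))
print(array_ids, bridge_names)  # [1, 2] ['coast', 'valley']
```

The fact-centric lookup touches one row in the array-based design, while a dimension-centric question ("which quakes hit the coast?") is cheaper through the bridge table, which is the trade-off motivating the paper's hybrid schema.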
Purpose - Interface management is the process of managing the communications, responsibilities, and coordination of project parties, phases, or physical entities that depend on one another. It is a crucial part of managing any construction project, but it is particularly important for high-speed railway projects, which often have several contractual parties and stakeholders, very long project timelines, and huge upfront cost outlays. This paper discusses how various project interfaces were managed during the design and construction of the civil engineering infrastructure for the High Speed Two (HS2) project in the United Kingdom. Design/methodology/approach - The paper uses the case study methodology. Key interfaces on the HS2 project are grouped into various categories, and the paper discusses how they were managed within the Area North Integrated Project Team (IPT) of the HS2 project, made up of contractor Balfour Beatty VINCI (BBV), the Mott MacDonald SYSTRA Design Joint Venture (DJV), and client HS2 Ltd. Three case studies drawn from across the IPT are used, each highlighting different interfaces and how they were managed. Findings - The paper shows how innovative technical designs and modern methods of construction were used to address some of the unique challenges of designing a brand-new railway in the United Kingdom. Addressing the contrasting and often competing requirements of different stakeholders, coupled with the challenging physical constraints of the very limited land available for the project and the use of a rarely used Act of Parliament in the delivery of the project, required a different approach to interface management. Collaboration and proactive stakeholder engagement are necessary for successful interface management on megaprojects. The authors posit that adopting an integrated approach to engineering and construction management is an essential ingredient for the successful delivery of high-speed railway projects. Originality/value - With many high-speed railway projects around the world coming up in the next few years, understanding the context and challenges of each country will help engineering and design managers adopt appropriate approaches for their projects. The lessons learned on the HS2 project are also transferable to other mega infrastructure projects with complex project interfaces.
Journal Introduction: "International Journal of Plant Engineering and Management" is supervised by the Ministry of Industry and Information Technology of the People's Republic of China and organized by Northwestern Polytechnical University. It is an English-language academic quarterly publicly distributed at home and abroad. Plant engineering and management is a comprehensive interdisciplinary subject mainly reporting academic research on the application technology of equipment and industry management. The journal covers both engineering and management, giving priority to engineering, and both mechanics and electrics, giving priority to mechanics. Its contents involve technology, economy, management, and related areas.
Funding: Supported by the project "Romanian Hub for Artificial Intelligence - HRIA", Smart Growth, Digitization and Financial Instruments Program, 2021-2027, MySMIS No. 334906.
Abstract: Objective expertise evaluation of individuals, as a prerequisite stage for team formation, has long been a desideratum in large software development companies. With rapid advances in machine learning methods, and reliable data already stored in project management tools' datasets, automating this evaluation process becomes a natural step forward. In this context, our approach focuses on quantifying software developer expertise using metadata from task-tracking systems. To this end, we mathematically formalize two categories of expertise: technology-specific expertise, which denotes the skills required for a particular technology, and general expertise, which encapsulates overall knowledge of the software industry. We then automatically classify the zones of expertise associated with each task a developer has worked on, using Bidirectional Encoder Representations from Transformers (BERT)-like models to handle the unique characteristics of project tool datasets effectively. Finally, our method evaluates the proficiency of each software specialist across already completed projects from both technology-specific and general perspectives. The method was experimentally validated, yielding promising results.
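The two expertise measures described above can be sketched as simple aggregations over classified task records. The data shape (a task carrying a developer id, a classified technology zone, and an effort weight) and the summation scheme are illustrative assumptions, not the paper's exact formalization.

```python
from collections import defaultdict

def expertise_scores(tasks):
    """Aggregate per-developer expertise from classified task records.

    Each task is a dict holding the developer id, the technology zone the
    task was classified into (e.g. by a BERT-like classifier), and an
    effort weight such as story points (all hypothetical fields).  Returns,
    per developer, a map of technology-specific scores plus one general
    score covering all zones.
    """
    tech = defaultdict(lambda: defaultdict(float))   # dev -> zone -> score
    for t in tasks:
        tech[t["dev"]][t["zone"]] += t["effort"]
    return {
        dev: {"technology_specific": dict(zones),
              "general": sum(zones.values())}
        for dev, zones in tech.items()
    }

tasks = [
    {"dev": "alice", "zone": "java", "effort": 5.0},
    {"dev": "alice", "zone": "sql",  "effort": 3.0},
    {"dev": "bob",   "zone": "java", "effort": 2.0},
]
scores = expertise_scores(tasks)
print(scores["alice"]["general"])                      # 8.0
print(scores["alice"]["technology_specific"]["java"])  # 5.0
```

A real pipeline would weight tasks by difficulty or recency rather than raw effort; the plain sum keeps the sketch minimal.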
Funding: Supported by the Deanship of Scientific Research and Graduate Studies at King Khalid University under research grant number R.G.P.2/93/45.
Abstract: The deployment of the Internet of Things (IoT) with smart sensors has facilitated the emergence of fog computing as an important technology for delivering services to smart environments such as campuses, smart cities, and smart transportation systems. Fog computing tackles a range of challenges, including processing, storage, bandwidth, latency, and reliability, by locally distributing secure information through end nodes. Consisting of endpoints, fog nodes, and back-end cloud infrastructure, it provides advanced capabilities beyond traditional cloud computing. In smart environments, particularly within smart city transportation systems, the abundance of devices and nodes poses significant challenges related to power consumption and system reliability. To address the challenges of latency, energy consumption, and fault tolerance in these environments, this paper proposes a latency-aware, fault-tolerant framework for resource scheduling and data management, referred to as the FORD framework, for smart cities in fog environments. The framework is designed to meet the demands of time-sensitive applications, such as those in smart transportation systems. It incorporates latency-aware resource scheduling to optimize task execution in smart city environments, leveraging resources from both fog and cloud environments. Through simulation-based executions, tasks are allocated to the nearest available nodes with minimum latency. In the event of execution failure, a fault-tolerant mechanism is employed to ensure the successful completion of tasks. Upon successful execution, data is efficiently stored in the cloud data center, ensuring data integrity and reliability within the smart city ecosystem.
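The allocation policy described above (nearest available node first, fall back on failure) can be sketched in a few lines. Node names, latency figures, and the retry limit are illustrative assumptions, not parameters of the FORD framework itself.

```python
def schedule(task, nodes, max_retries=2):
    """Latency-aware, fault-tolerant allocation in the spirit of FORD:
    try available nodes in order of increasing latency; on execution
    failure, fall back to the next-nearest node, up to a retry limit."""
    tried = []
    for node in sorted(nodes, key=lambda n: n["latency_ms"]):
        if not node["available"]:
            continue
        tried.append(node["name"])
        if node["execute"](task):      # task completed on this node
            return node["name"], tried
        if len(tried) > max_retries:   # give up after the retry budget
            break
    return None, tried

nodes = [
    {"name": "fog-1", "latency_ms": 5,  "available": True,
     "execute": lambda t: False},      # simulated execution failure
    {"name": "fog-2", "latency_ms": 12, "available": True,
     "execute": lambda t: True},
    {"name": "cloud", "latency_ms": 80, "available": True,
     "execute": lambda t: True},
]
chosen, attempts = schedule({"id": 42}, nodes)
print(chosen, attempts)   # fog-2 ['fog-1', 'fog-2']
```

The cloud node acts as the last-resort executor, matching the fog-first, cloud-fallback ordering the abstract describes.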
Abstract: On October 18, 2017, the report of the 19th National Congress called for the implementation of the Healthy China Strategy. The development of biomedical data plays a pivotal role in advancing this strategy. Since the 18th National Congress of the Communist Party of China, China has vigorously promoted the integration and implementation of the Healthy China and Digital China strategies. The National Health Commission has prioritized the development of health and medical big data, issuing policies to promote standardized applications and foster innovation in "Internet + Healthcare". Biomedical data has significantly contributed to precision medicine, personalized health management, drug development, disease diagnosis, public health monitoring, and epidemic prediction capabilities.
Funding: Supported in part by the National Key Research and Development Program of China under Grant 2023YFB3106900, the National Natural Science Foundation of China under Grant 62171113, and the China Scholarship Council under Grant 202406080100.
Abstract: We propose a Cross-Chain Mapping Blockchain (CCMB) for scalable data management in massive Internet of Things (IoT) networks. Specifically, CCMB aims to improve the scalability of securely storing, tracing, and transmitting IoT behavior and reputation data based on our proposed cross-mapped Behavior Chain (BChain) and Reputation Chain (RChain). To improve off-chain IoT data storage scalability, we show that our lightweight CCMB architecture efficiently utilizes available fog-cloud resources. The scalability of on-chain IoT data tracing is enhanced using our Mapping Smart Contract (MSC) and cross-chain mapping design to perform rapid Reputation-to-Behavior (R2B) traceability queries between BChain and RChain blocks. To maximize off-chain to on-chain throughput, we optimize the CCMB block settings and producers based on a general Poisson Point Process (PPP) network model. The constrained optimization problem is formulated as a Markov Decision Process (MDP) and solved using a dual-network Deep Reinforcement Learning (DRL) algorithm. Simulation results validate CCMB's scalability advantages in storage, traceability, and throughput. In specific massive IoT scenarios, CCMB can reduce the storage footprint by 50% and traceability query time by 90%, while improving system throughput by 55% compared to existing benchmarks.
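The core of the R2B traceability idea can be illustrated with a toy mapping contract: an index keyed by reputation-block id that returns the behavior-block ids it was derived from, so a query is a direct lookup rather than a scan of the whole BChain. The class, method names, and block ids below are hypothetical stand-ins, not the paper's MSC interface.

```python
class MappingContract:
    """Toy stand-in for the Mapping Smart Contract (MSC): it maintains a
    reputation-block -> behavior-block index so a Reputation-to-Behavior
    (R2B) query becomes a constant-time lookup instead of a BChain scan."""

    def __init__(self):
        self.r2b = {}  # RChain block id -> list of BChain block ids

    def record(self, rchain_block, bchain_blocks):
        """Register which behavior blocks a reputation block maps to."""
        self.r2b.setdefault(rchain_block, []).extend(bchain_blocks)

    def trace(self, rchain_block):
        """R2B traceability query: which behavior blocks back this reputation?"""
        return self.r2b.get(rchain_block, [])

msc = MappingContract()
msc.record("R#7", ["B#101", "B#102"])  # reputation update from two behavior records
msc.record("R#7", ["B#119"])
print(msc.trace("R#7"))   # ['B#101', 'B#102', 'B#119']
```

This only captures the indexing structure; the paper's contributions around block production, the PPP model, and the DRL-based optimization sit on top of it.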
Funding: Supported by the 2023 Basic Public Welfare Research Project of the Wenzhou Science and Technology Bureau "Research on Multi-Source Data Classification and Grading Standards and Intelligent Algorithms for Higher Education Institutions" (Project No. G2023094), the Major Humanities and Social Sciences Research Projects in Zhejiang Higher Education Institutions (Grant No. 2024QN061), and the 2023 Basic Public Welfare Research Project of Wenzhou (No. S2023014).
Abstract: In the context of the rapid development of digital education, the security of educational data has become an increasing concern. This paper explores strategies for the classification and grading of educational data and constructs a data security management and control model for higher education centered on the integration of medical and educational data. By implementing a multi-dimensional strategy of dynamic classification, real-time authorization, and secure execution based on educational data security levels, dynamic access control is applied to effectively enhance the security and controllability of educational data, providing a secure foundation for data sharing and openness.
Abstract: With the comprehensive development of modern information technology, big data technology has been integrated into various industries and has become a pillar technology supporting industrial upgrading and transformation. In enterprise human resource management, big data technology likewise has broad application space and important application value. To gain higher market competitiveness and comprehensively improve the quality and efficiency of human resource management, enterprises need to rely on big data technology for comprehensive reform and optimization, thereby building an efficient, fair, open, and scientific human resource management model. This paper analyzes the problems and changes of enterprise human resource management in the era of big data, and then puts forward effective strategies for enterprise human resource management in this era.
Abstract: In the era of big data, data has gradually become an important asset of enterprises, and the application of big data technology has become the key to optimizing enterprise marketing management models. For enterprises, taking the initiative to meet the development trend of the times, relying on big data technology to effectively process and analyze data, innovating decision-making methods and operation models, and achieving efficient marketing and fine-grained management are important ways to improve market competitiveness. Therefore, the author first analyzes the empowering role of big data technology in enterprise marketing management, then discusses the difficulties enterprise marketing management faces in the era of big data, and finally puts forward targeted improvement strategies, aiming to provide a reference for enterprises to innovate and reform their marketing management models.
Abstract: With the gradual acceleration of information construction in colleges and universities, the digital campus and smart campus have become important means for scientific campus management. They have been applied to teaching, scientific research, student management, and other fields, improving the quality and efficiency of management. This paper mainly studies an intelligent educational administration management system based on data mining technology. First, the paper introduces the application process of data mining technology and builds an intelligent educational administration management system on that basis. Then, the paper optimizes the application of the Apriori algorithm in educational administration management through transaction compression and frequent sampling. Compared with the traditional Apriori algorithm, the optimized Apriori algorithm has a shorter execution time under the same minimum support.
Funding: 2023 Guangdong Provincial Educational Science Planning Project (Higher Education Special Program) "Research on the Talent Cultivation Path of Vocational Undergraduate Programs in Finance, Economics and Trade Based on Digital Transformation" (2023GXJK653).
Abstract: With the development of the current era, China's Internet and computer technologies are constantly being updated and improved. Under the application of information technology, great changes have taken place in the financial management work of schools, and its content is constantly increasing. To meet the needs of school management, it is necessary to improve the level of financial management and strengthen its informatization construction, so as to modernize school financial management and improve its efficiency. Big data and information technology enable rapid collection and preprocessing of information, allow in-depth mining of data value, and assist staff in quickly identifying problems in their work, further improving work efficiency. On this basis, this paper conducts an in-depth analysis of the informatization construction of financial management in colleges and universities supported by big data, for reference.
Funding: Exploration and Practice of the Application of Blockchain Technology to the Cultivation of Compound Talents under the Background of the Free Trade Port (HKJG2023-18).
Abstract: In the era of big data, the financial industry is undergoing profound changes. By integrating multiple data sources such as transaction records, customer interactions, market trends, and regulatory requirements, big data technology has significantly improved the decision-making efficiency, customer insight, and risk management capabilities of financial institutions. The financial industry has become a pioneer in the application of big data technology, which is widely used in scenarios such as fraud detection, risk management, customer service optimization, and smart transactions. However, financial data security management also faces many challenges, including data breaches, privacy protection, compliance requirements, the complexity of emerging technologies, and the balance between data access and security. This article explores the major challenges of financial data security management, coping strategies, and the evolution of the regulatory environment, and looks ahead to future trends, highlighting the important role of artificial intelligence and machine learning in financial data security.
Abstract: This article focuses on the remote diagnosis and analysis of rail vehicle status based on data from the Train Control and Management System (TCMS). It first expounds on the importance of train diagnostic analysis and designs a unified TCMS data frame transmission format. Subsequently, a remote data transmission link using 4G signals and the associated data processing methods are introduced. The advantages of remote diagnosis are analyzed, and common methods such as correlation analysis, fault diagnosis, and fault prediction are explained in detail. Challenges such as data security and the balance between diagnostic accuracy and real-time performance are then discussed, along with development prospects in technological innovation, algorithm optimization, and application promotion. This research provides ideas for remote analysis and diagnosis based on TCMS data, contributing to the safe and efficient operation of rail vehicles.
Funding: Supported by Research and Application of Soil Collection Software and a Soil Ecological Big Data Platform in Guangxi Woodland (GUILINKEYAN [2022ZC] 44) and Construction of a Soil Information Database and Visualization System for Artificial Forests in Central Guangxi (2023GXZCLK62).
Abstract: In this paper, the application of agricultural big data in agricultural economic management is explored in depth, and its potential for promoting profit growth and innovation is analyzed. However, applying agricultural big data still faces challenges in data collection and integration, the limitations of analytical technologies, talent development and team building, and policy support. Effective application strategies are proposed, including data-driven precision agriculture practices, the construction of data integration and management platforms, data security and privacy protection strategies, and long-term planning and development strategies for agricultural big data, to maximize its impact on agricultural economic management. Future advancement requires collaborative efforts in technological innovation, talent cultivation, and policy support to realize the extensive application of agricultural big data in agricultural economic management and ensure sustainable industrial development.
Abstract: In the rapidly evolving technological landscape, state-owned enterprises (SOEs) encounter significant challenges in sustaining their competitiveness through efficient R&D management. Integrated Product Development (IPD), with its emphasis on cross-functional teamwork, concurrent engineering, and data-driven decision-making, has been widely recognized for enhancing R&D efficiency and product quality. However, the unique characteristics of SOEs pose challenges to the effective implementation of IPD. The advancement of big data and artificial intelligence technologies offers new opportunities for optimizing IPD R&D management through data-driven decision-making models. This paper constructs and validates a data-driven decision-making model tailored to the IPD R&D management of SOEs. By integrating data mining, machine learning, and other advanced analytical techniques, the model serves as a scientific and efficient decision-making tool. It aids SOEs in optimizing R&D resource allocation, shortening product development cycles, reducing R&D costs, and improving product quality and innovation. Moreover, this study contributes to a deeper theoretical understanding of the value of data-driven decision-making in the context of IPD.
Funding: Henan Province Special Research Project of Traditional Chinese Medicine (Grant No. 2022ZY1090).
Abstract: In the present study, data mining and network pharmacology were utilized to explore the principles and mechanisms of traditional Chinese medicine (TCM) in treating acute appendicitis, with the goal of providing a scientific basis for clinical treatment and further research on this disease. First, we searched the National Patent Database for Chinese herbal compound prescriptions used to treat acute appendicitis. We then applied frequency analysis, analysis of medicinal properties, flavors, and meridian tropism, association rule analysis, and hierarchical cluster analysis to identify the patterns of TCM treatment for acute appendicitis, selecting key combinations of Chinese medicines. Next, we screened the main active components of these key herbs based on quality markers. Using databases such as SwissTargetPrediction, SymMap, ETCM, and STRING, we analyzed the pharmacological mechanisms of these key herbs in treating acute appendicitis. Key active components and targets were further verified through molecular docking. We identified a total of 129 patents involving 316 Chinese medicines, 24 of which were frequently used. The results indicated that most Chinese herbs used for acute appendicitis were heat-clearing drugs, blood-activating and stasis-removing drugs, and purging drugs. The primary active ingredients of the Rhubarb, Cortex Moutan, and Flos Lonicerae combination for treating acute appendicitis included Emodin, Paeonol, Physcion, Chlorogenic acid, Chrysophanol, Rhein acid, and Aloe-emodin. These ingredients targeted key proteins such as ALB, TP53, BCL2, STAT3, IL-6, and TNF, and were involved in cellular responses to lipopolysaccharides, cell composition, and various cytokine-mediated biological processes. They also interacted with signaling pathways such as AGE-RAGE, TNF, IL-17, and FoxO. Based on patent data, this study analyzed medication patterns in the treatment of acute appendicitis, discussed the possible mechanisms of key TCM combinations, and provided a scientific basis and new perspectives for the diagnosis and treatment of the disease.
Abstract: In the new era, the impact of emerging productive forces has permeated every sector of industry. As the core production factor of these forces, data plays a pivotal role in industrial transformation and social development. Consequently, many domestic universities have introduced majors or courses related to big data. Among these, the Big Data Management and Applications major stands out for its interdisciplinary approach and emphasis on practical skills. However, as an emerging field, it has not yet accumulated a robust foundation in teaching theory and practice. Current instructional practices face issues such as unclear training objectives, inconsistent teaching methods and course content, insufficient integration of practical components, and a shortage of qualified faculty, factors that hinder both the development of the major and the overall quality of education. Taking the statistics course within the Big Data Management and Applications major as an example, this paper examines the challenges faced by statistics education in the context of emerging productive forces and proposes corresponding improvement measures. By introducing innovative teaching concepts and strategies, the teaching system for professional courses is optimized, and authentic classroom scenarios are recreated through illustrative examples. Questionnaire surveys and statistical analyses of data collected before and after the teaching reforms indicate that the curriculum changes effectively enhance instructional outcomes, promote the development of the major, and improve the quality of talent cultivation.
Funding: Supported by the National Natural Science Foundation of China (Grant No. 82172524) and the Natural Science Foundation of Hubei Province (Grant No. 2025AFB240).
Abstract: Bone tumors (BTs), including osteosarcoma, Ewing sarcoma, and chondrosarcoma, are rare but biologically complex malignancies characterized by pronounced heterogeneity in anatomical location, histological subtype, and molecular alterations. Recent advances in artificial intelligence (AI), particularly deep learning, have enabled the integration of diverse clinical data modalities to support diagnosis, treatment planning, and prognostication in bone oncology. This review provides a comprehensive synthesis of AI-driven multimodal fusion strategies that incorporate radiological imaging, digital pathology, multi-omics profiling, and electronic health records. We conducted a structured review of peer-reviewed literature published between 2015 and early 2025, focusing on the development, validation, and clinical applicability of AI models for BT diagnosis, subtyping, treatment response prediction, and recurrence monitoring. Although multimodal models have demonstrated advantages over unimodal approaches, especially in handling missing data and improving generalizability, most remain constrained by single-center study designs, small sample sizes, and limited prospective or external validation. Persistent technical and translational challenges include semantic misalignment across modalities, incomplete datasets, limited model interpretability, and regulatory and infrastructural barriers to clinical integration. To address these limitations, we highlight emerging directions such as contrastive representation learning, generative data augmentation, transformer-based fusion architectures, and privacy-preserving federated learning. We also discuss the evolving role of foundation models and workflow-integrated AI agents in enhancing scalability and clinical usability. In summary, multimodal AI represents a promising paradigm for advancing precision care in BTs. Realizing its full clinical potential will require methodologically rigorous, biologically informed, and system-level approaches that bridge algorithmic innovation with real-world healthcare delivery.
Abstract: Earthquakes are highly destructive spatio-temporal phenomena whose analysis is essential for disaster preparedness and risk mitigation. Modern seismological research produces vast volumes of heterogeneous data from seismic networks, satellite observations, and geospatial repositories, creating the need for scalable infrastructures capable of integrating and analyzing such data to support intelligent decision-making. Data warehousing technologies provide a robust foundation for this purpose; however, existing earthquake-oriented data warehouses remain limited, often relying on simplified schemas, domain-specific analytics, or cataloguing efforts. This paper presents the design and implementation of a spatio-temporal data warehouse for seismic activity. The framework integrates spatial and temporal dimensions in a unified schema and introduces a novel array-based approach for managing many-to-many relationships between facts and dimensions without intermediate bridge tables. A comparative evaluation against a conventional bridge-table schema demonstrates that the array-based design improves fact-centric query performance, while the bridge-table schema remains advantageous for dimension-centric queries. To reconcile these trade-offs, a hybrid schema is proposed that retains both representations, ensuring balanced efficiency across heterogeneous workloads. The proposed framework demonstrates how spatio-temporal data warehousing can address schema complexity, improve query performance, and support multidimensional visualization. In doing so, it provides a foundation for integrating seismic analysis into broader big data-driven intelligent decision systems for disaster resilience, risk mitigation, and emergency management.
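The trade-off between the array-based and bridge-table designs can be illustrated with toy in-memory structures. The earthquake ids, region keys, and field names below are hypothetical; the point is only the query shape, not the paper's actual warehouse schema.

```python
# Array-based design: each fact row embeds its dimension keys directly.
facts_array = {
    "eq1": {"magnitude": 6.1, "regions": ["R1", "R2"]},
    "eq2": {"magnitude": 5.4, "regions": ["R2"]},
}

# Bridge-table design: facts and a separate (fact, dimension) link table.
facts_bridge = {"eq1": {"magnitude": 6.1}, "eq2": {"magnitude": 5.4}}
bridge = [("eq1", "R1"), ("eq1", "R2"), ("eq2", "R2")]

# Fact-centric query ("which regions did eq1 affect?"):
# a direct lookup in the array design, a scan of the bridge otherwise.
regions_a = facts_array["eq1"]["regions"]
regions_b = [r for f, r in bridge if f == "eq1"]

# Dimension-centric query ("which quakes affected R2?"):
# the array design must inspect every fact row's embedded list,
# while the bridge rows can be filtered (or indexed) on the dimension key.
quakes_a = [f for f, row in facts_array.items() if "R2" in row["regions"]]
quakes_b = [f for f, r in bridge if r == "R2"]

print(regions_a, quakes_b)   # ['R1', 'R2'] ['eq1', 'eq2']
```

Both designs answer both queries, but each is cheap in one direction and scan-heavy in the other, which is exactly the asymmetry the proposed hybrid schema is meant to balance.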
Abstract: Purpose: Interface management is the process of managing communications, responsibilities, and coordination of project parties, phases, or physical entities that are dependent on one another. Interface management is a crucial part of managing any construction project, but it is particularly important for high-speed railway projects, which often have several contractual parties and stakeholders, very long project timelines, and huge upfront cost outlays. This paper discusses how various project interfaces were managed during the design and construction of the civil engineering infrastructure for the High Speed Two (HS2) project in the United Kingdom. Design/methodology/approach: The paper uses the case study methodology. Key interfaces on the HS2 project are grouped into categories, and the paper discusses how they were managed within the Area North Integrated Project Team (IPT) of the HS2 project, made up of contractor Balfour Beatty VINCI (BBV), the Mott MacDonald SYSTRA Design Joint Venture (DJV), and client HS2 Ltd. Three case studies drawn from across the IPT are used, each highlighting different interfaces and how they were managed. Findings: The paper shows how innovative technical designs and modern methods of construction were used to address some of the unique challenges of designing a brand-new railway in the United Kingdom. Addressing the contrasting and often competing requirements of different stakeholders, coupled with the challenging physical constraints of the very limited land available for the project and the use of a rarely used Act of Parliament in the delivery of the project, required a different approach to interface management. Collaboration and proactive stakeholder engagement are necessary for successful interface management on megaprojects. The authors posit that adopting an integrated approach to engineering and construction management is an essential ingredient for the successful delivery of high-speed railway projects. Originality/value: With many high-speed railway projects around the world coming up in the next few years, understanding the context and challenges of each country will help engineering and design managers adopt appropriate approaches for their projects. The lessons learned on the HS2 project are also transferable to other mega infrastructure projects with complex project interfaces.