As a new type of production factor in healthcare, healthcare data elements have been rapidly integrated into various health production processes, such as clinical assistance, health management, biological testing, and operation and supervision [1,2]. Healthcare data elements include biological and clinical data related to disease, environmental health data associated with life, and operational and healthcare management data related to healthcare activities (Figure 1). Activities such as the construction of a data value assessment system, the development of a data circulation and sharing platform, and the authorization of data compliance and operation products support the strong growth momentum of the market for healthcare data elements in China [3].
The deployment of the Internet of Things (IoT) with smart sensors has facilitated the emergence of fog computing as an important technology for delivering services to smart environments such as campuses, smart cities, and smart transportation systems. Fog computing tackles a range of challenges, including processing, storage, bandwidth, latency, and reliability, by locally distributing secure information through end nodes. Consisting of endpoints, fog nodes, and back-end cloud infrastructure, it provides advanced capabilities beyond traditional cloud computing. In smart environments, particularly within smart city transportation systems, the abundance of devices and nodes poses significant challenges related to power consumption and system reliability. To address the challenges of latency, energy consumption, and fault tolerance in these environments, this paper proposes a latency-aware, fault-tolerant framework for resource scheduling and data management, referred to as the FORD framework, for smart cities in fog environments. This framework is designed to meet the demands of time-sensitive applications, such as those in smart transportation systems. The FORD framework incorporates latency-aware resource scheduling to optimize task execution in smart city environments, leveraging resources from both fog and cloud environments. Through simulation-based executions, tasks are allocated to the nearest available nodes with minimum latency. In the event of execution failure, a fault-tolerant mechanism is employed to ensure the successful completion of tasks. Upon successful execution, data is efficiently stored in the cloud data center, ensuring data integrity and reliability within the smart city ecosystem.
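The scheduling idea the abstract describes (assign each task to the nearest available node with minimum latency, and fall back to another node on failure) can be sketched as follows. The node records, latency figures, and function names are illustrative assumptions, not part of the published FORD framework:

```python
# Illustrative sketch of latency-aware, fault-tolerant task placement.
# Node names and latencies are hypothetical; the real FORD framework
# draws on fog and cloud resources discovered at run time.

def schedule_task(task, nodes):
    """Return the name of the available node with the lowest latency."""
    candidates = [n for n in nodes if n["available"]]
    if not candidates:
        return None  # no node can take the task
    return min(candidates, key=lambda n: n["latency_ms"])["name"]

def execute_with_fallback(task, nodes, run):
    """Try nodes in order of increasing latency; retry on the next node
    whenever `run` reports a failure (the fault-tolerant mechanism)."""
    for node in sorted((n for n in nodes if n["available"]),
                       key=lambda n: n["latency_ms"]):
        if run(task, node["name"]):
            return node["name"]
    return None  # every node failed

# Example topology: two fog nodes and one distant cloud node.
nodes = [
    {"name": "fog-1", "latency_ms": 5, "available": True},
    {"name": "fog-2", "latency_ms": 12, "available": True},
    {"name": "cloud", "latency_ms": 80, "available": True},
]
```

With this topology, a task normally lands on `fog-1`; if execution there fails, the fallback retries on `fog-2` before resorting to the cloud.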
Standardized datasets are foundational to healthcare informatization, enhancing data quality and unleashing the value of data elements. Using bibliometrics and content analysis, this study examines China's healthcare dataset standards from 2011 to 2025. It analyzes their evolution across types, applications, institutions, and themes, highlighting key achievements including substantial growth in quantity, optimized typology, expansion into innovative application scenarios such as health decision support, and broadened institutional involvement. The study also identifies critical challenges, including imbalanced development, insufficient quality control, and a lack of essential metadata (such as authoritative data element mappings and privacy annotations) that hampers the delivery of intelligent services. To address these challenges, the study proposes a multi-faceted strategy focused on optimizing the standard system's architecture, enhancing quality and implementation, and advancing both data governance (through authoritative tracing and privacy protection) and intelligent service provision. These strategies aim to promote the application of dataset standards, thereby fostering and securing the development of new productive forces in healthcare.
With the rise of data-intensive research, data literacy has become a critical capability for improving scientific data quality and achieving artificial intelligence (AI) readiness. In the biomedical domain, data are characterized by high complexity and privacy sensitivity, calling for robust and systematic data management skills. This paper reviews current trends in scientific data governance and the evolving policy landscape, highlighting persistent challenges such as inconsistent standards, semantic misalignment, and limited awareness of compliance. These issues are largely rooted in the lack of structured training and practical support for researchers. In response, this study builds on existing data literacy frameworks and integrates the specific demands of biomedical research to propose a comprehensive, lifecycle-oriented data literacy competency model with an emphasis on ethics and regulatory awareness. Furthermore, it outlines a tiered training strategy tailored to different research stages (undergraduate, graduate, and professional), offering theoretical foundations and practical pathways for universities and research institutions to advance data literacy education.
We propose a Cross-Chain Mapping Blockchain (CCMB) for scalable data management in massive Internet of Things (IoT) networks. Specifically, CCMB aims to improve the scalability of securely storing, tracing, and transmitting IoT behavior and reputation data based on our proposed cross-mapped Behavior Chain (BChain) and Reputation Chain (RChain). To improve off-chain IoT data storage scalability, we show that our lightweight CCMB architecture efficiently utilizes available fog-cloud resources. The scalability of on-chain IoT data tracing is enhanced using our Mapping Smart Contract (MSC) and cross-chain mapping design to perform rapid Reputation-to-Behavior (R2B) traceability queries between BChain and RChain blocks. To maximize off-chain to on-chain throughput, we optimize the CCMB block settings and producers based on a general Poisson Point Process (PPP) network model. The constrained optimization problem is formulated as a Markov Decision Process (MDP) and solved using a dual-network Deep Reinforcement Learning (DRL) algorithm. Simulation results validate CCMB's scalability advantages in storage, traceability, and throughput. In specific massive IoT scenarios, CCMB can reduce the storage footprint by 50% and traceability query time by 90%, while improving system throughput by 55% compared to existing benchmarks.
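The cross-chain mapping that makes R2B traceability fast can be illustrated with a toy model: a mapping table keyed by reputation-block ID points directly at the corresponding behavior block, so a trace is a single lookup rather than a chain scan. The `Chain` class, record fields, and `r2b_index` below are hypothetical stand-ins for the paper's BChain, RChain, and Mapping Smart Contract:

```python
# Toy model of cross-chain R2B traceability; all structures here are
# hypothetical simplifications of the CCMB design.

class Chain:
    def __init__(self):
        self.blocks = {}  # block_id -> payload

    def add(self, block_id, payload):
        self.blocks[block_id] = payload

bchain = Chain()   # behavior data
rchain = Chain()   # reputation data
r2b_index = {}     # reputation block id -> behavior block id

def record(device, behavior, reputation, b_id, r_id):
    """Store a behavior/reputation pair and register the cross-chain link."""
    bchain.add(b_id, {"device": device, "behavior": behavior})
    rchain.add(r_id, {"device": device, "reputation": reputation})
    r2b_index[r_id] = b_id

def r2b_query(r_id):
    """Trace a reputation record to its behavior record in one lookup,
    instead of scanning the behavior chain block by block."""
    return bchain.blocks[r2b_index[r_id]]
```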
The National Population Health Data Center (NPHDC) is one of China's 20 national-level science data centers, jointly designated by the Ministry of Science and Technology and the Ministry of Finance. Operated by the Chinese Academy of Medical Sciences under the oversight of the National Health Commission, NPHDC adheres to national regulations, including the Scientific Data Management Measures and the National Science and Technology Infrastructure Service Platform Management Measures, and is committed to collecting, integrating, managing, and sharing biomedical and health data through an open-access platform, fostering open sharing and engaging in international cooperation.
In the context of the rapid development of digital education, the security of educational data has become an increasing concern. This paper explores strategies for the classification and grading of educational data and constructs a higher-education data security management and control model centered on the integration of medical and educational data. By implementing a multi-dimensional strategy of dynamic classification, real-time authorization, and secure execution based on educational data security levels, dynamic access control is applied to effectively enhance the security and controllability of educational data, providing a secure foundation for data sharing and openness.
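A minimal sketch of level-based dynamic access control of the kind described above, assuming a four-tier grading (the level names, clearance ordering, and function names are illustrative assumptions, not the paper's actual classification-and-grading scheme):

```python
# Sketch of level-based dynamic access control with dynamic
# reclassification. Levels and ordering are illustrative assumptions.

LEVELS = {"public": 0, "internal": 1, "sensitive": 2, "restricted": 3}

def authorize(clearance, data_level):
    """Grant access only when the requester's clearance meets or
    exceeds the security level of the data."""
    return LEVELS[clearance] >= LEVELS[data_level]

def reclassify(record, new_level):
    """Dynamic classification: a record's level may change over time,
    and later requests are checked against the new level."""
    record["level"] = new_level
    return record
```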
This article introduces the methodologies and instrumentation for data measurement and propagation at the Back-n white neutron facility of the China Spallation Neutron Source. The Back-n facility employs backscattering techniques to generate a broad spectrum of white neutrons. Equipped with advanced detectors such as the light particle detector array and the fission ionization chamber detector, the facility achieves high-precision data acquisition through a general-purpose electronics system. Data were managed and stored in a hierarchical system supported by the National High Energy Physics Science Data Center, ensuring long-term preservation and efficient access. The data from the Back-n experiments contribute significantly to nuclear physics, reactor design, astrophysics, and medical physics, enhancing the understanding of nuclear processes and supporting interdisciplinary research.
Based on the definition and connotation of product data management (PDM), the factors that ensure implementation success are analyzed. The definition, analysis, design, build-and-test, and post-production phases of PDM implementation are described. The implementation is divided into ten processes, which span these phases, and the relationships between phases and processes are illustrated. Finally, a workflow is proposed to guide implementation at a fixed price.
The CifNet network multi-well data management system was developed for the 100 Mbps or 1000 Mbps local network environments used in the Chinese oil industry. The core techniques of the CifNet system include: (1) establishing a high-efficiency, low-cost network multi-well data management architecture based on the General Logging Curve Theory and the Cif data format; (2) implementing efficient access and transmission of multi-well data in a client/server local network based on the TCP/IP protocol; (3) ensuring the safety of multi-well data during storage, access, and application based on Unix operating system security. With the CifNet system, a researcher in the office or at home can access the curves of any borehole in any working area of any oilfield. The application prospects of the CifNet system are also discussed.
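The Cif format itself is not reproduced here, but the idea of transmitting a logging curve over a client/server connection can be sketched as a binary packing/unpacking pair. The layout (a 4-byte big-endian sample count followed by float64 depth/value pairs) is an illustrative assumption, not the actual Cif specification:

```python
import struct

# Illustrative packing of a logging curve for network transmission.
# The assumed wire layout: !I sample count, then !dd per sample.

def pack_curve(samples):
    """samples: list of (depth, value) pairs -> bytes."""
    buf = struct.pack("!I", len(samples))
    for depth, value in samples:
        buf += struct.pack("!dd", depth, value)
    return buf

def unpack_curve(buf):
    """bytes -> list of (depth, value) tuples."""
    (n,) = struct.unpack_from("!I", buf, 0)
    return [struct.unpack_from("!dd", buf, 4 + 16 * i) for i in range(n)]
```

A round trip through `pack_curve`/`unpack_curve` preserves the samples exactly, which is the property a curve-transmission layer needs.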
The mining industry faces a number of challenges that promote the adoption of new technologies. Big data, driven by the accelerating progress of information and communication technology, is one of the promising technologies that can reshape the entire mining landscape. Despite numerous attempts to apply big data in the mining industry, fundamental problems of big data, especially big data management (BDM), persist. This paper aims to fill the gap by presenting the basics of BDM. It provides a brief introduction to big data and BDM and discusses the challenges encountered by the mining industry to indicate the necessity of implementing big data. It also summarizes data sources in the mining industry and presents the potential benefits of big data to the mining industry. This work also envisions a future in which a global database project is established and big data is used together with other technologies (e.g., automation), supported by government policies and following international standards. Finally, the paper outlines the precautions for the utilization of BDM in the mining industry.
The wealth of user data acts as a fuel for network intelligence toward the sixth-generation wireless networks (6G). Due to data heterogeneity and dynamics, decentralized data management (DM) is desirable for achieving transparent data operations across network domains, and blockchain can be a promising solution. However, the increasing data volume and stringent data privacy-preservation requirements in 6G pose a significant technical challenge in balancing transparency, efficiency, and privacy requirements in decentralized blockchain-based DM. In this paper, we investigate blockchain solutions to address this challenge. First, we explore the consensus protocols and scalability mechanisms in blockchains and discuss the roles of DM stakeholders in blockchain architectures. Second, we investigate the authentication and authorization requirements for DM stakeholders. Third, we categorize DM privacy requirements and study blockchain-based mechanisms for collaborative data processing. Subsequently, we present research issues and potential solutions for blockchain-based DM toward 6G from these three perspectives. Finally, we conclude this paper and discuss future research directions.
In this paper we present the MEMPHIS middleware framework for the integration of CAD geometries and assemblies with derived Virtual Reality (VR) models and their specific metadata and attributes. The goal of this work is to connect real-time VR applications, especially for the Design Review, with the enterprise software storing and managing CAD models (Product Data Management, PDM). The preparation of VR models requires expert knowledge and is time-consuming; it includes the selection of required CAD data, tessellation, healing of unwanted gaps, and the application of materials, textures, and special surface and light effects. During the Design Review process, decisions are made concerning the choice of materials and surface forms. While materials can be switched directly on the VR model, modifications of part geometries must be made on the CAD model. Our system synchronizes modifications of the original CAD geometries and of the attributes relevant for realistic rendering using the PLM Services standard. Thus, repeated work for VR preparation can be avoided.
The basic framework and design ideas of a J2EE-based Product Data Management (PDM) system are presented. This paper adopts object-oriented technology to realize the database design and builds the information model of the PDM system. The key technologies for integrating the PDM and CAD systems are discussed, the heterogeneous interface characteristics between the CAD and PDM systems are analyzed, and finally, the integration mode of the PDM and CAD systems is given. Using these technologies, the integration of the PDM and CAD systems is realized and the consistency of data between them is maintained. Finally, the Product Data Management system is developed and has been tested on the development process of a hydraulic generator; it runs stably and safely.
Connected and autonomous vehicles (CAVs) are seeing their dawn at this moment. They provide numerous benefits to vehicle owners, manufacturers, vehicle service providers, insurance companies, etc. These vehicles generate a large amount of data, which makes privacy and security a major challenge to their success. The complicated machine-led mechanics of connected and autonomous vehicles increase the risks of privacy invasion and cybersecurity violations for their users by making them more susceptible to data exploitation and more vulnerable to cyber-attacks than any of their predecessors. This could damage the public standing of CAVs at this early stage of their development, hinder their adoption and expanded use, and complicate the economic models for their future operations. On the other hand, congestion remains a bottleneck for traffic management and planning. This research paper presents a blockchain-based framework that protects the privacy of vehicle owners and provides data security by storing vehicular data on the blockchain, which is further used for congestion detection and mitigation. Numerous devices placed along the road communicate with passing cars and collect their data. The collected data are compiled periodically to find the average travel time of vehicles and the traffic density on a particular road segment. These statistics are stored in a memory pool alongside data from other devices. After a predetermined amount of time, the memory pool is mined, and the data are uploaded to the blockchain in blocks that store traffic statistics. The information is then used in two ways. First, the blockchain's final block provides real-time traffic data, triggering an intelligent traffic signal system to reduce congestion. Second, the data stored on the blockchain provide historical, statistical data that facilitate the analysis of traffic conditions according to past behavior.
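The per-segment statistics described above (average travel time and vehicle density over a collection window, feeding a congestion trigger) can be sketched as follows. The threshold values are illustrative assumptions, not figures from the paper:

```python
# Sketch of per-segment traffic statistics and a congestion trigger.
# Thresholds are illustrative assumptions.

def segment_stats(travel_times_s, vehicle_count, segment_km):
    """Return (average travel time in seconds, vehicles per km)."""
    avg_time = sum(travel_times_s) / len(travel_times_s)
    density = vehicle_count / segment_km
    return avg_time, density

def is_congested(avg_time_s, density, time_limit_s=120.0, density_limit=50.0):
    """Flag the segment when either statistic exceeds its threshold,
    which would trigger the intelligent traffic signal system."""
    return avg_time_s > time_limit_s or density > density_limit
```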
Artificial intelligence (AI) relies on data and algorithms. State-of-the-art (SOTA) AI algorithms have been developed to improve the performance of AI-oriented systems. However, model-centric approaches are limited by the absence of high-quality data. Data-centric AI is an emerging approach to solving machine learning (ML) problems. It is a collection of data manipulation techniques that allow ML practitioners to systematically improve the quality of the data used in an ML pipeline. However, data-centric AI approaches are not well documented, and researchers have conducted various experiments without a clear set of guidelines. This survey highlights six major data-centric AI aspects that researchers already use, intentionally or unintentionally, to improve the quality of AI systems: big data quality assessment, data preprocessing, transfer learning, semi-supervised learning, machine learning operations (MLOps), and the effect of adding more data. In addition, it highlights recent data-centric techniques adopted by ML practitioners. We address how adding data might harm datasets and how HoloClean can be used to restore and clean them. Finally, we discuss the causes of technical debt in AI; technical debt builds up when software design and implementation decisions run into, or outright collide with, business goals and timelines. This survey lays the groundwork for future data-centric AI discussions by summarizing various data-centric approaches.
Cross-border data transmission in the biomedical area is on the rise, which brings potential risks and management challenges for data security, biosafety, and national security. Focusing on cross-border data security assessment and risk management, many countries have successively issued relevant laws, regulations, and assessment guidelines. This study aims to provide an index system model and a management application reference for the risk assessment of cross-border data movement. From the perspective of a single organization, the relevant risk assessment standards of several countries are integrated to guide the identification and determination of risk factors. Then, a risk assessment index system for cross-border data flow is constructed. A case study of risk assessment in 358 biomedical organizations is carried out, and suggestions for data management are offered. This study is conducive to improving security monitoring and early warning of cross-border data flow, thereby realizing the safe and orderly global flow of biomedical data.
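A weighted-index risk score of the kind such an index system produces can be sketched as follows. The indicator names, weights, and level cut-offs are illustrative assumptions, not the study's actual index system:

```python
# Sketch of a weighted risk-assessment index: each indicator gets a
# normalized score in [0, 1] and a weight; overall risk is the weighted
# sum. All names, weights, and cut-offs here are illustrative.

def risk_score(scores, weights):
    """scores, weights: dicts keyed by indicator; weights sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(scores[k] * weights[k] for k in weights)

def risk_level(score):
    """Map a numeric score to a coarse risk grade."""
    if score >= 0.7:
        return "high"
    if score >= 0.4:
        return "medium"
    return "low"
```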
Data integrity is a critical component of data lifecycle management, and its importance increases even more in a complex and dynamic landscape. Actions such as unauthorized access, unauthorized modification, data manipulation, audit tampering, data backdating, data falsification, phishing, and spoofing are no longer restricted to rogue individuals; they are also prevalent in organized entities and states. Therefore, data security requires strong data integrity measures and the associated technical controls to be in place. Without a properly customized framework, organizations are exposed to high risks of financial, reputational, and revenue losses, bankruptcies, and legal penalties, which we discuss throughout this paper. We also explore some improvised and innovative techniques in product development to better tackle the challenges and requirements of data security and integrity.
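One standard technical control behind such integrity measures is a cryptographic digest recorded when data is written and re-checked when it is read, so unauthorized modification, falsification, or backdating of the content becomes detectable. A minimal sketch using SHA-256 (the function names are illustrative, not from the paper):

```python
import hashlib

# Integrity control sketch: record a SHA-256 digest at write time and
# verify it at read time; any change to the bytes changes the digest.

def fingerprint(data: bytes) -> str:
    """Return the hex SHA-256 digest of the data."""
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, recorded_digest: str) -> bool:
    """True only if the data still matches the recorded digest."""
    return fingerprint(data) == recorded_digest
```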
The Internet of Medical Things (IoMT) comprises online devices that sense and transmit medical data from users to physicians within a given time interval. In recent years, IoMT has grown rapidly in the medical field to provide healthcare services without requiring physical presence. With the use of sensors, IoMT applications are used in healthcare management. In such applications, one of the most important factors is data security, given that transmission over the network may expose the data to intrusion. For data security in IoMT systems, blockchain is used because its block structure supports secure data storage. In this study, a Blockchain-assisted Secure Data Management Framework (BSDMF) and a Proof of Activity (PoA) protocol with a malicious code detection algorithm are used for data security in the healthcare system. The main aim is to enhance data security over the network. By removing malicious nodes from the block, the PoA protocol can provide high security for medical data in the blockchain. Comparison with existing systems shows that the proposed simulation with the BSDMF and malicious code detection algorithm achieves a higher accuracy ratio, precision ratio, security, and efficiency, and a shorter response time for blockchain-enabled healthcare systems.
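The idea of removing malicious nodes before a block is finalized can be sketched as follows; the detector, node fields, and function names are hypothetical illustrations, not the paper's BSDMF implementation:

```python
# Sketch of pruning flagged nodes from a block-producer set. The
# trivial detector and node fields are illustrative assumptions.

def detect_malicious(node):
    """Stand-in detector: any invalid signature flags the node."""
    return node.get("invalid_signatures", 0) > 0

def prune_producers(producers):
    """Keep only trusted nodes, so a flagged node cannot finalize blocks."""
    return [n for n in producers if not detect_malicious(n)]
```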
A product data management (PDM) system for a manufacturing enterprise must ensure that the proper product data can be communicated to the right people at the right time. This paper describes a system analysis paradigm for data analysis in PDM development. Three aspects of the paradigm, i.e., function, structure, and behavior, are represented. The use of the paradigm explains why so many kinds of objects are necessary in a commercial database matrix and which models are available for developing a PDM application. In addition, numerous models are derived from the analysis of the product data system paradigm to model product data and PDM database definitions.
Funding for the article on healthcare data elements: supported by the National Natural Science Foundation of China (Grants 72474022, 71974011, 72174022, 71972012, 71874009) and the "BIT Think Tank" Promotion Plan of the Science and Technology Innovation Program of the Beijing Institute of Technology (Grants 2024CX14017, 2023CX13029).
Funding for the FORD framework article: supported by the Deanship of Scientific Research and Graduate Studies at King Khalid University under research grant number R.G.P.2/93/45.
Funding for the CCMB article: supported in part by the National Key Research and Development Program of China under Grant 2023YFB3106900, the National Natural Science Foundation of China under Grant 62171113, and the China Scholarship Council under Grant 202406080100.
Abstract: The National Population Health Data Center (NPHDC) is one of China's 20 national-level science data centers, jointly designated by the Ministry of Science and Technology and the Ministry of Finance. Operated by the Chinese Academy of Medical Sciences under the oversight of the National Health Commission, NPHDC adheres to national regulations, including the Scientific Data Management Measures and the National Science and Technology Infrastructure Service Platform Management Measures, and is committed to collecting, integrating, managing, and sharing biomedical and health data through an open-access platform, fostering open sharing and engaging in international cooperation.
Funding: Supported by the 2023 Basic Public Welfare Research Project of the Wenzhou Science and Technology Bureau, "Research on Multi-Source Data Classification and Grading Standards and Intelligent Algorithms for Higher Education Institutions" (Project No. G2023094); the Major Humanities and Social Sciences Research Projects in Zhejiang Higher Education Institutions (Grant No. 2024QN061); and the 2023 Basic Public Welfare Research Project of Wenzhou (No. S2023014).
Abstract: In the context of the rapid development of digital education, the security of educational data has become an increasing concern. This paper explores strategies for the classification and grading of educational data and constructs a higher-education data security management and control model centered on the integration of medical and educational data. By implementing a multi-dimensional strategy of dynamic classification, real-time authorization, and secure execution based on educational data security levels, dynamic access control is applied to effectively enhance the security and controllability of educational data, providing a secure foundation for data sharing and openness.
Funding: Supported by the National Key Research and Development Plan (No. 2023YFA1606602).
Abstract: This article introduces the methodologies and instrumentation for data measurement and propagation at the Back-n white neutron facility of the China Spallation Neutron Source. The Back-n facility employs backscattering techniques to generate a broad spectrum of white neutrons. Equipped with advanced detectors such as the light particle detector array and the fission ionization chamber detector, the facility achieves high-precision data acquisition through a general-purpose electronics system. Data are managed and stored in a hierarchical system supported by the National High Energy Physics Science Data Center, ensuring long-term preservation and efficient access. The data from the Back-n experiments contribute significantly to nuclear physics, reactor design, astrophysics, and medical physics, enhancing the understanding of nuclear processes and supporting interdisciplinary research.
Abstract: On the basis of the definition and connotation of PDM (product data management), the factors that ensure implementation success are analyzed. The definition, analysis, design, build-and-test, and post-production phases of PDM implementation are described. The implementation is divided into ten processes, which span the above phases, and the relationships between phases and processes are illustrated. Finally, a workflow is proposed to guide implementation at a fixed price.
Abstract: The CifNet network multi-well data management system is developed for the 100 Mb/s or 1000 Mb/s local network environments used in the Chinese oil industry. The kernel techniques of the CifNet system include: (1) establishing a high-efficiency, low-cost network multi-well data management architecture based on the General Logging Curve Theory and the Cif data format; (2) implementing efficient access and transmission of multi-well data in a client/server local network based on the TCP/IP protocol; and (3) ensuring the safety of multi-well data in storage, access, and application based on Unix operating system security. Using the CifNet system, a researcher in the office or at home can access the curves of any borehole in any working area of any oilfield. The application prospects of the CifNet system are also discussed.
Abstract: The mining industry faces a number of challenges that promote the adoption of new technologies. Big data, driven by the accelerating progress of information and communication technology, is one of the promising technologies that can reshape the entire mining landscape. Despite numerous attempts to apply big data in the mining industry, fundamental problems of big data, especially big data management (BDM), persist. This paper aims to fill the gap by presenting the basics of BDM. It provides a brief introduction to big data and BDM, and it discusses the challenges encountered by the mining industry to indicate the necessity of implementing big data. It also summarizes data sources in the mining industry and presents the potential benefits of big data to the mining industry. This work also envisions a future in which a global database project is established and big data is used together with other technologies (e.g., automation), supported by government policies and following international standards. This paper also outlines the precautions for the utilization of BDM in the mining industry.
Funding: Supported by research grants from Huawei Technologies Canada and from the Natural Sciences and Engineering Research Council (NSERC) of Canada.
Abstract: The wealth of user data acts as a fuel for network intelligence toward the sixth-generation wireless networks (6G). Due to data heterogeneity and dynamics, decentralized data management (DM) is desirable for achieving transparent data operations across network domains, and blockchain can be a promising solution. However, the increasing data volume and stringent data privacy-preservation requirements in 6G bring significant technical challenges to balancing transparency, efficiency, and privacy requirements in decentralized blockchain-based DM. In this paper, we investigate blockchain solutions to address these challenges. First, we explore the consensus protocols and scalability mechanisms in blockchains and discuss the roles of DM stakeholders in blockchain architectures. Second, we investigate the authentication and authorization requirements for DM stakeholders. Third, we categorize DM privacy requirements and study blockchain-based mechanisms for collaborative data processing. Subsequently, we present research issues and potential solutions for blockchain-based DM toward 6G from these three perspectives. Finally, we conclude this paper and discuss future research directions.
Funding: Project supported by the Korean Ministry of Information and Communication (MIC).
Abstract: In this paper we present the MEMPHIS middleware framework for integrating CAD geometries and assemblies with derived Virtual Reality (VR) models and their specific metadata and attributes. The goal of this work is to connect real-time VR applications, especially for design review, with the enterprise software that stores and manages CAD models (Product Data Management, PDM). The preparation of VR models requires expert knowledge and is time consuming; it includes selecting the required CAD data, tessellation, healing of unwanted gaps, applying materials and textures, and adding special surface and light effects. During the design review process, decisions are made concerning the choice of materials and surface forms. While materials can be switched directly on the VR model, modifications of part geometries must be made on the CAD model. Our system synchronizes modifications of the original CAD geometries and of the attributes relevant for realistic rendering using the PLM Services standard. Thus, repeated work for VR preparation can be avoided.
Funding: Sponsored by the Scientific Technology Development Project of Heilongjiang (Grant No. WH05A01) and the Scientific Research Foundation of Harbin Institute of Technology (Grant No. HIT.MD2003.21).
Abstract: The basic framework and design ideas of a J2EE-based Product Data Management (PDM) system are presented. This paper adopts object-oriented technology to realize the database design and builds the information model of the PDM system. The key technologies for integrating the PDM and CAD systems are discussed, the characteristics of the heterogeneous interfaces between the CAD and PDM systems are analyzed, and finally, the integration mode of the PDM and CAD systems is given. Using these technologies, the integration of the PDM and CAD systems is realized and data consistency between the two systems is maintained. Finally, the PDM system is developed and tested on the development process of a hydraulic generator; it runs stably and safely.
Funding: Funded by the Deanship of Scientific Research at King Khalid University, Kingdom of Saudi Arabia, for the Large Group Research Project under grant number RGP2/249/44.
Abstract: Connected and autonomous vehicles (CAVs) are seeing their dawn at this moment. They provide numerous benefits to vehicle owners, manufacturers, vehicle service providers, insurance companies, etc. These vehicles generate a large amount of data, which makes privacy and security a major challenge to their success. The complicated machine-led mechanics of CAVs increase the risks of privacy invasion and cybersecurity violations for their users by making them more susceptible to data exploitation and more vulnerable to cyber-attacks than any of their predecessors. This could have a negative impact on how well-liked CAVs are with the general public, give them a poor name at this early stage of their development, put obstacles in the way of their adoption and expanded use, and complicate the economic models for their future operations. On the other hand, congestion is still a bottleneck for traffic management and planning. This research paper presents a blockchain-based framework that protects the privacy of vehicle owners and provides data security by storing vehicular data on the blockchain, which is then used for congestion detection and mitigation. Numerous devices placed along the road communicate with passing cars and collect their data. The collected data are compiled periodically to find the average travel time of vehicles and the traffic density on a particular road segment. Furthermore, these data are stored in the memory pool, where other devices also store their data. After a predetermined amount of time, the memory pool is mined, and the data are uploaded to the blockchain in the form of blocks that store traffic statistics. The information is then used in two ways. First, the blockchain's final block provides real-time traffic data, triggering an intelligent traffic signal system to reduce congestion. Second, the data stored on the blockchain provide historical, statistical data that facilitate the analysis of traffic conditions according to past behavior.
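The periodic reduction of roadside observations into per-segment statistics can be sketched as follows. This is a minimal illustration of the aggregation step only, assuming hypothetical field names and segment lengths; the abstract does not specify the actual record format.

```python
from collections import defaultdict

# Hypothetical road segments and their lengths in km (assumed values).
SEGMENT_LENGTH_KM = {"seg-A": 2.0, "seg-B": 1.0}

def aggregate(observations):
    """Reduce raw roadside observations to per-segment traffic statistics:
    average travel time, and vehicles observed per km in this interval."""
    times = defaultdict(list)
    for obs in observations:
        times[obs["segment"]].append(obs["travel_time_s"])
    stats = {}
    for seg, ts in times.items():
        stats[seg] = {
            "avg_travel_time_s": sum(ts) / len(ts),
            "density_veh_per_km": len(ts) / SEGMENT_LENGTH_KM[seg],
        }
    return stats

# One memory-pool interval's worth of (hypothetical) observations.
pool = [
    {"vehicle": "v1", "segment": "seg-A", "travel_time_s": 120},
    {"vehicle": "v2", "segment": "seg-A", "travel_time_s": 180},
    {"vehicle": "v3", "segment": "seg-B", "travel_time_s": 60},
]
print(aggregate(pool))
```

In the framework described above, a summary of this kind would be what gets sealed into each mined block, so the chain stores compact statistics rather than raw per-vehicle traces.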
Abstract: Artificial intelligence (AI) relies on data and algorithms. State-of-the-art (SOTA) AI algorithms have been developed to improve the performance of AI-oriented structures. However, model-centric approaches are limited by the absence of high-quality data. Data-centric AI is an emerging approach to solving machine learning (ML) problems. It is a collection of various data manipulation techniques that allow ML practitioners to systematically improve the quality of the data used in an ML pipeline. However, data-centric AI approaches are not well documented, and researchers have conducted various experiments without a clear set of guidelines. This survey highlights six major data-centric AI aspects that researchers are already using, intentionally or unintentionally, to improve the quality of AI systems: big data quality assessment, data preprocessing, transfer learning, semi-supervised learning, machine learning operations (MLOps), and the effect of adding more data. In addition, it highlights recent data-centric techniques adopted by ML practitioners. We address how adding data might harm datasets and how HoloClean can be used to restore and clean them. Finally, we discuss the causes of technical debt in AI; technical debt builds up when software design and implementation decisions run into, or outright collide with, business goals and timelines. This survey lays the groundwork for future data-centric AI discussions by summarizing various data-centric approaches.
Funding: Supported by the National Natural Science Foundation of China (Grant No. 71901169) and the Shaanxi Province Innovative Talents Promotion Plan-Youth Science and Technology Nova Project (Grant No. 2022KJXX-50).
Abstract: Cross-border data transmission in the biomedical area is on the rise, which brings potential risks and management challenges to data security, biosafety, and national security. Focusing on cross-border data security assessment and risk management, many countries have successively issued relevant laws, regulations, and assessment guidelines. This study aims to provide an index system model and a management application reference for the risk assessment of cross-border data movement. From the perspective of a single organization, the relevant risk assessment standards of several countries are integrated to guide the identification and determination of risk factors. Then, a risk assessment index system for cross-border data flow is constructed. A case study of risk assessment in 358 biomedical organizations is carried out, and suggestions for data management are offered. This study is conducive to improving security monitoring and early warning of cross-border data flows, thereby realizing the safe and orderly global flow of biomedical data.
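An index system of this kind is typically scored for a single organization as a weighted sum over its indicators. The sketch below shows only that generic scoring pattern; the indicator names, weights, and scores are invented for illustration and are not the paper's actual model.

```python
def composite_risk(scores, weights):
    """Weighted sum of indicator scores (each in [0, 1]); weights sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(scores[k] * weights[k] for k in weights)

# Hypothetical top-level indicators for cross-border data flow risk.
weights = {"data_sensitivity": 0.4, "recipient_country": 0.3,
           "transfer_safeguards": 0.2, "compliance_maturity": 0.1}
scores = {"data_sensitivity": 0.8, "recipient_country": 0.5,
          "transfer_safeguards": 0.3, "compliance_maturity": 0.6}

risk = composite_risk(scores, weights)
print(round(risk, 2))  # 0.59
```

In practice the weights would be derived from the integrated national standards the study draws on, and the resulting score compared against thresholds to trigger early-warning actions.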
Abstract: Data integrity is a critical component of data lifecycle management, and its importance increases even more in a complex and dynamic landscape. Actions such as unauthorized access, unauthorized modification, data manipulation, audit tampering, data backdating, data falsification, phishing, and spoofing are no longer restricted to rogue individuals but are in fact also prevalent in organizations and even states. Therefore, data security requires strong data integrity measures and associated technical controls. Without a properly customized framework in place, organizations are exposed to a high risk of financial, reputational, and revenue losses, bankruptcy, and legal penalties, which we discuss throughout this paper. We also explore some improvised and innovative techniques in product development to better tackle the challenges and requirements of data security and integrity.
Funding: Taif University Researchers Supporting Project Number (TURSP-2020/98), Taif University, Taif, Saudi Arabia.
Abstract: The Internet of Medical Things (IoMT) comprises online devices that sense and transmit medical data from users to physicians within a time interval. In recent years, IoMT has grown rapidly in the medical field to provide healthcare services without physical attendance. With the use of sensors, IoMT applications are used in healthcare management. In such applications, one of the most important factors is data security, given that transmission over the network may be subject to intrusion. For data security in IoMT systems, blockchain is used due to its numerous blocks for secure data storage. In this study, a Blockchain-assisted Secure Data Management Framework (BSDMF) and a Proof of Activity (PoA) protocol using a malicious code detection algorithm are used in the proposed data security scheme for the healthcare system. The main aim is to enhance data security over the networks. By replacing malicious nodes in the block, the PoA protocol can provide high security for medical data in the blockchain. Comparison with existing systems shows that the proposed simulation with the BSD malicious code detection algorithm achieves a higher accuracy ratio, precision ratio, security, and efficiency, and a lower response time for blockchain-enabled healthcare systems.
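The general idea of screening out malicious nodes before data is sealed into a block can be sketched as below. The actual BSDMF/PoA design and malicious code detection algorithm are not specified in the abstract, so the detector here is a hypothetical blacklist stand-in and all names are illustrative.

```python
import hashlib
import json

def is_malicious(node_id, blacklist):
    # Stand-in for a real malicious-code detection algorithm.
    return node_id in blacklist

def commit_block(prev_hash, submissions, blacklist):
    """Drop records from flagged nodes, then seal the rest into a block
    whose hash commits to both the payload and the previous block."""
    clean = [s for s in submissions if not is_malicious(s["node"], blacklist)]
    payload = json.dumps({"prev": prev_hash, "data": clean}, sort_keys=True)
    return {"prev": prev_hash, "data": clean,
            "hash": hashlib.sha256(payload.encode()).hexdigest()}

block = commit_block(
    "0" * 64,
    [{"node": "sensor-1", "reading": 72},
     {"node": "sensor-2", "reading": 999}],  # sensor-2 is flagged below
    blacklist={"sensor-2"},
)
print([r["node"] for r in block["data"]])  # ['sensor-1']
```

Replacing a flagged node's contribution before the block is hashed is what keeps tampered medical data from ever becoming part of the immutable chain.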
Abstract: A product data management (PDM) system for a manufacturing enterprise must ensure that the proper product data can be communicated to the right people at the right time. This paper describes a system analysis paradigm for data analysis in PDM development. Three aspects of the paradigm, i.e., function, structure, and behavior, are represented. The use of the paradigm explains why so many kinds of objects are necessary in a commercial database matrix and what models are available for developing a PDM application. In addition, many models for product data and PDM database definitions are derived from the analysis of the product data system paradigm.