This paper investigates the problem of data scarcity in spectrum prediction. Cognitive radio equipment may frequently switch its target frequency as the electromagnetic environment changes, and a previously trained prediction model often cannot maintain good performance when only a small amount of historical data is available for the new target frequency. Moreover, cognitive radio equipment usually implements dynamic spectrum access in real time, which means the time available to recollect data for the new frequency band and retrain the model is very limited. To address these issues, we develop a cross-band data augmentation framework for spectrum prediction that leverages recent advances in generative adversarial networks (GANs) and deep transfer learning. First, through a similarity measurement, we pre-train a GAN model using the historical data of the frequency band that is most similar to the target band. Then, after augmenting the small amount of target data with the pre-trained GAN, a temporal-spectral residual network is further trained using deep transfer learning on the generated high-similarity data. Finally, experimental results demonstrate the effectiveness of the proposed framework.
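The similarity-based band selection step described above can be sketched as follows. The abstract does not specify the similarity measure, so Pearson correlation between band histories is used here purely as an illustrative stand-in; function and band names are hypothetical.

```python
import numpy as np

def most_similar_band(target_history, candidate_histories):
    """Pick the candidate band whose history correlates best with the target.

    The framework pre-trains its GAN on the most similar band; Pearson
    correlation is an assumed similarity measure for illustration only.
    """
    scores = {}
    for band, history in candidate_histories.items():
        n = min(len(target_history), len(history))
        scores[band] = np.corrcoef(target_history[:n], history[:n])[0, 1]
    return max(scores, key=scores.get)

# Toy example: band "b1" tracks the target closely, "b2" is unrelated noise.
rng = np.random.default_rng(0)
target = np.sin(np.linspace(0, 8, 200)) + 0.1 * rng.standard_normal(200)
candidates = {
    "b1": np.sin(np.linspace(0, 8, 200)) + 0.1 * rng.standard_normal(200),
    "b2": rng.standard_normal(200),
}
print(most_similar_band(target, candidates))  # "b1"
```

The selected band's history would then feed GAN pre-training, with the small target dataset used for fine-tuning.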
With the explosive growth of available data, there is an urgent need for continuous data mining that substantially reduces manual interaction. A novel model for data mining in an evolving environment is proposed. First, valid mining task schedules are generated; then autonomous local mining is executed periodically; finally, previous results are merged and refined. The framework based on this model creates a communication mechanism that incorporates domain knowledge into the continuous process through an ontology service. The local and merge mining are transparent to the end user and to heterogeneous data sources via the ontology. Experiments suggest that the framework is useful in guiding the continuous mining process.
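The periodic local-mine-then-merge loop described above can be sketched minimally as follows; `mine` and `merge` are placeholder operators standing in for the paper's actual mining and refinement steps, which are not specified here.

```python
from collections import Counter

def mine(batch):
    """Local mining step: count item frequencies in one arriving batch."""
    return Counter(batch)

def merge(previous, local):
    """Merge-and-refine step: combine the new local result with prior results."""
    return previous + local

# Batches arrive periodically; results accumulate without reprocessing old data.
results = Counter()
for batch in (["a", "b", "a"], ["b", "c"], ["a", "c"]):
    results = merge(results, mine(batch))

print(results.most_common(1))  # [('a', 3)]
```

The point of the sketch is the control flow: each period only mines its own batch, and the merge keeps the global result current.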
The tremendous growth of cloud computing environments requires a new architecture for security services. Cloud computing is the utilization of many servers/data centers or cloud data storages (CDSs) housed in different locations and interconnected by high-speed networks. CDS, like any other emerging technology, is experiencing growing pains: it is immature, fragmented, and lacks standardization. Although security issues are delaying its fast adoption, cloud computing is an unstoppable force, and security mechanisms are needed to ensure its secure adoption. In this paper, a comprehensive security framework based on a Multi-Agent System (MAS) architecture for CDS is proposed to facilitate confidentiality, correctness assurance, availability, and integrity of users' data in the cloud. The security framework consists of two main layers: an agent layer and a CDS layer. The proposed MAS architecture includes five main types of agents: Cloud Service Provider Agent (CSPA), Cloud Data Confidentiality Agent (CDConA), Cloud Data Correctness Agent (CDCorA), Cloud Data Availability Agent (CDAA), and Cloud Data Integrity Agent (CDIA). To verify the proposed framework, a pilot study is conducted using a questionnaire survey, and Rasch methodology is used to analyze the pilot data. Item reliability is found to be poor, and a few respondents and items are identified as misfits with distorted measurements. As a result, some problematic questions are revised and some predictably easy questions are excluded from the questionnaire. A prototype of the system is implemented in Java. To simulate the agents, Oracle database packages and triggers are used to implement agent functions, and Oracle jobs are utilized to create the agents.
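The agent-layer division of labor described above can be illustrated with a minimal sketch. The agent names follow the paper's taxonomy, but the dispatch logic and handler behavior are invented here for illustration; the actual prototype uses Java and Oracle jobs.

```python
# Illustrative sketch only: each agent type handles one security concern,
# and the CSPA acts as the entry point that routes requests to workers.
class Agent:
    def handle(self, request): ...

class CloudDataConfidentialityAgent(Agent):   # CDConA
    def handle(self, request):
        return f"encrypt:{request}"

class CloudDataIntegrityAgent(Agent):         # CDIA
    def handle(self, request):
        return f"checksum:{request}"

class CloudServiceProviderAgent(Agent):       # CSPA: routes work to other agents
    def __init__(self):
        self.workers = {"store": CloudDataConfidentialityAgent(),
                        "verify": CloudDataIntegrityAgent()}

    def handle(self, request):
        op, _, payload = request.partition(":")
        return self.workers[op].handle(payload)

cspa = CloudServiceProviderAgent()
print(cspa.handle("store:user-record"))   # encrypt:user-record
```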
Cloud accounting builds on traditional financial work processes in the context of big data and represents an inevitable trend in the future development of corporate accounting. Its emergence and rapid development will have a fundamental impact on corporate environmental information disclosure. In the big data era of information sharing, companies will reach a new understanding of the emergence, balancing, and final consideration of social responsibility, and will change their overall decision-making and information disclosure methods. "Knowing" and "doing" will be combined on the basis of rational judgment, so that corporate environmental information disclosure better meets overall social development requirements. Against the background of big data, this article starts with the influencing factors, footholds, and path choices of disclosure, describes the evolution of corporate environmental information disclosure, and provides reference suggestions for enterprises to disclose environmental information truthfully and fulfill their social responsibilities.
Point of Care (PoC) devices and systems can be categorized into three broad classes (CAT 1, CAT 2, and CAT 3) based on the context of operation and usage. In this paper, the categories are defined to address certain usage models of the PoC device. PoC devices used for PoC testing and diagnostic applications are defined as CAT 1 devices; PoC devices used for patient monitoring are defined as CAT 2 devices (PoCM); and PoC devices used for interfacing with other devices are defined as CAT 3 devices (PoCI). The PoCI devices provide an interface gateway for collecting and aggregating data from other medical devices. In all categories, data security is an important aspect. This paper presents a security framework concept applicable to all classes of PoC operation. It outlines the concepts and security framework for preventing unauthorized access to data, unintended data flow, and data tampering during communication between system entities, the user, and the PoC system. The security framework includes secure layering of the basic PoC system architecture and protection of PoC devices in the context of the application and the network. The framework is developed with a threat model of the PoC system taken into account. A proposal for a low-level protocol is discussed; this protocol is independent of communication technologies and is elaborated with respect to providing security. An algorithm that can be used to overcome the threat challenges is shown using the elements of the protocol. The paper further discusses the vulnerability scanning process for the PoC system's interconnected network. The paper also presents a four-step authentication and authorization process for securing the PoC system. Finally, the paper concludes with a machine-to-machine (M2M) security viewpoint and discusses the key stakeholders within an actual deployment of the PoC system and its security challenges.
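The four-step authentication and authorization process is not detailed in the abstract; the sketch below shows one generic way such an exchange could look (enrolment, challenge, response, authorization), using an HMAC challenge-response. Device IDs, keys, and roles are all hypothetical, and this is not the paper's actual protocol.

```python
import hashlib
import hmac
import secrets

SHARED_KEYS = {"poc-device-01": b"pre-provisioned-secret"}    # step 1: enrolment
ROLES = {"poc-device-01": {"read_vitals"}}                    # permitted actions

def issue_challenge():
    return secrets.token_bytes(16)                            # step 2: challenge

def device_response(key, challenge):
    return hmac.new(key, challenge, hashlib.sha256).digest()  # step 3: response

def authorize(device_id, challenge, response, action):
    key = SHARED_KEYS.get(device_id)
    if key is None or not hmac.compare_digest(
            device_response(key, challenge), response):
        return False                                          # authentication failed
    return action in ROLES.get(device_id, set())              # step 4: authorization

ch = issue_challenge()
resp = device_response(SHARED_KEYS["poc-device-01"], ch)
print(authorize("poc-device-01", ch, resp, "read_vitals"))  # True
```

Note the constant-time comparison (`hmac.compare_digest`), a standard precaution against timing attacks in any challenge-response scheme.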
Digital educational content is gaining importance as an incubator of pedagogical methodologies in formal and informal online educational settings. Its educational efficiency is directly dependent on its quality; however, educational content is more than information and data. This paper presents a new data quality framework for assessing digital educational content used for teaching in distance learning environments. The model relies on the ISO 25000 series quality standard and, besides providing mechanisms for multi-faceted quality assessment, it also supports organizations that design, create, manage, and use educational content with quality tools (expressed as quality metrics and measurement methods) to provide a more efficient distance education experience. The model describes the quality characteristics of the educational material content using data and software quality characteristics.
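A framework of this kind typically aggregates per-characteristic metric values into a single quality score. The sketch below uses a weighted mean as an illustrative aggregation; the characteristic names and weights are hypothetical, and the paper's framework defines its own metrics and measurement methods.

```python
def quality_score(metrics, weights):
    """Aggregate normalized (0..1) metric values into one weighted score.

    Weighted-mean aggregation is an assumed, illustrative choice.
    """
    total_w = sum(weights.values())
    return sum(metrics[c] * w for c, w in weights.items()) / total_w

# Hypothetical assessment of one piece of educational content.
content = {"accuracy": 0.9, "completeness": 0.7, "currentness": 0.8}
weights = {"accuracy": 0.5, "completeness": 0.3, "currentness": 0.2}
print(round(quality_score(content, weights), 2))  # 0.82
```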
Antarctic data management is a research focus that the international Antarctic organizations, e.g., the Antarctic Treaty Consultative Meeting (ATCM), the Scientific Committee on Antarctic Research (SCAR), and the Council of Managers of National Antarctic Programmes (COMNAP), have been paying close attention to and actively promoting. Through the joint effort of international Antarctic organizations and the member countries concerned in recent years, the Antarctic Data Directory System (ADDS) has been established as the most important basic programme for the development of the international Antarctic data management system. At present, the Joint Committee on Antarctic Data Management (JCADM) is responsible for organizing and coordinating international Antarctic data management and implementing the ADDS project. This paper introduces the background of Antarctic data management in time sequence and the structure of the international framework, noting that it is necessary to develop the ADDS first of all. The ADDS mainly consists of two principal parts: the National Antarctic Data Centers (NADCs) of all the party members and the Antarctic Main Directory (AMD). The best available technology for creating the ADDS is to make full use of the International Directory Network (IDN) and adopt its Directory Interchange Format (DIF). In light of the above requirements, and combined with China's specific situation, the contents and the technical and administrative methods of Chinese Antarctic data management are discussed to promote the related work.
The standards system for cultural heritage digitalization aims to build a clear and logically rigorous framework to guide the development and revision of relevant standards. This system enhances the scientific, systematic, and practical aspects of cultural heritage digitalization. This paper comprehensively analyzes the current status and needs of cultural heritage digitalization and standardization. It further examines the methods used to construct the standards system. Through comparative analysis, it establishes a lifecycle-based framework for cultural heritage. This framework accounts for the unique characteristics of cultural heritage and systematically integrates key processes such as the collection, processing, storage, transmission, and utilization of data. The standards system is divided into six sections: general, data, information, knowledge, intelligence, and application. Based on current digitalization efforts, this paper proposes key standardization directions for each section. This framework ensures the integrity and consistency of data throughout the digitalization process. It also supports the application of intelligent technologies in cultural heritage conservation, contributing to the sustainable preservation and utilization of cultural heritage data.
An idea is presented for the development of a data processing and analysis system for ICF experiments based on an object-oriented framework. The design and preliminary implementation of the data processing and analysis framework, built on the ROOT system, have been completed. Software for unfolding soft X-ray spectra has been developed to test the functions of this framework.
The importance of the project selection phase in any Six Sigma initiative cannot be emphasized enough: the success of the initiative depends on successful project selection. Recently, Data Envelopment Analysis (DEA) has been proposed as a Six Sigma project selection tool. However, a number of different DEA formulations exist, and the choice of formulation may affect the selection process and the winning project. This work initially applies nine different DEA formulations to several case studies and concludes that different formulations select different winning projects. To overcome this problem, a Multi-DEA Unified Scoring Framework is proposed. The framework is applied to several case studies and proves successful in selecting the Six Sigma project with the best performance. It also filters out projects with "selective" excellent performance, i.e., projects that perform excellently in some DEA formulations and poorly in others, and it succeeds in selecting stable projects: those that perform well in the majority of the DEA formulations, even if they have not been selected as the winning project by any single formulation.
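The unified-scoring idea can be sketched as follows. The efficiency scores below are hypothetical (real scores come from solving each DEA formulation's linear program), and rank averaging is an illustrative stand-in for the framework's actual scoring method, which the abstract does not specify.

```python
import numpy as np

# Hypothetical efficiency scores of four candidate projects under three
# DEA formulations (rows: formulations, columns: projects).
scores = np.array([
    [1.00, 0.85, 0.90, 0.70],
    [0.80, 1.00, 0.95, 0.60],
    [0.90, 0.88, 1.00, 0.65],
])

# Unified score: average rank across formulations, so a project that is
# consistently strong beats one with "selective" excellence.
ranks = scores.argsort(axis=1).argsort(axis=1)  # 0 = worst within a formulation
unified = ranks.mean(axis=0)
best = int(unified.argmax())
print(best)  # 2
```

Here project 2 is never dominant across the board, yet its consistently high ranks make it the stable choice, which is exactly the behavior the framework aims for.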
Based on the Real-World Evidence Framework published by the UK National Institute for Health and Care Excellence (NICE), this article discusses the framework's methodological innovations and their implications for reimbursement access decisions in China's medical insurance system. As global health insurance governance moves toward value-driven purchasing, real-world evidence (RWE) has become a key tool for compensating for the limitations of randomized controlled trials and for supporting precise purchasing by insurers, making its scientific generation and standardized application critically important. The NICE framework rests on three pillars, transparency, data suitability, and methodological rigor, and systematically builds an RWE application system: it advocates target trial emulation to strengthen the causal inference power of observational studies, and it integrates qualitative evidence to comprehensively assess patient experience and clinical implementation pathways. In practice, the framework has proved effective in evaluations of cancer treatments, digital therapeutics, and rare-disease therapies. China is in an exploratory phase of real-world value assessment for medical insurance; accordingly, a localized RWE application system suited to national conditions can be built along three dimensions: consolidating the data governance foundation, standardizing methodological norms, and clarifying implementation pathways.
Funding: the National Natural Science Foundation of China (No. 61303094), the Program of Science and Technology Commission of Shanghai Municipality (No. 16511102400), and the Innovation Program of Shanghai Municipal Education Commission (No. 14YZ024).
Funding: This work was supported by the Science and Technology Innovation 2030 Key Project of "New Generation Artificial Intelligence" of China under Grant 2018AAA0102303, the Natural Science Foundation for Distinguished Young Scholars of Jiangsu Province (No. BK20190030), and the National Natural Science Foundation of China (Nos. 61631020, 61871398, 61931011, and U20B2038).
Funding: Supported by the National Natural Science Foundation of China (60173058, 70372024).
Funding: Supported by the Philosophy and Social Science Research Project of Daqing City (Grant No. DSGB2017112) and the Postgraduate Innovation Research Project of Heilongjiang Bayi Agricultural University (Grant No. YJSCX2017Y79).
Funding: Supported by "The Palace Museum Talent Program". The Palace Museum Talent Program is supported by The Hong Kong Jockey Club and exclusively sponsored by the Institute of Philanthropy.
Funding: This project was supported by the National High-Tech Research and Development Plan (863-804-3).