With the accelerating aging process of China's population, the demand for community elderly care services has shown diversified and personalized characteristics. However, problems such as insufficient total care service resources, uneven distribution, and prominent supply-demand contradictions have seriously affected service quality. Big data technology, with core advantages including data collection, analysis and mining, and accurate prediction, provides a new solution for the allocation of community elderly care service resources. This paper systematically studies the application value of big data technology in the allocation of community elderly care service resources from three aspects: resource allocation efficiency, service accuracy, and management intelligence. Combined with practical needs, it proposes optimal allocation strategies such as building a big data analysis platform and accurately grasping the elderly's care needs, striving to provide operable path references for the construction of community elderly care service systems, promoting the early realization of the elderly care service goal of "adequate support and proper care for the elderly", and boosting the high-quality development of China's elderly care service industry.
The exponential growth of Internet of Things (IoT) devices, autonomous systems, and digital services is generating massive volumes of big data, projected to exceed 291 zettabytes by 2027. Conventional cloud computing, despite its high processing and storage capacity, suffers from increased network latency, network congestion, and high operational costs, making it unsuitable for latency-sensitive applications. Edge computing addresses these issues by processing data near the source but faces scalability challenges and an elevated Total Cost of Ownership (TCO). Hybrid solutions, such as fog computing, cloudlets, and Mobile Edge Computing (MEC), attempt to balance cost and performance; however, they still struggle with limited resource sharing and high deployment expenses. This paper proposes Public Edge as a Service (PEaaS), a novel paradigm that utilizes idle resources contributed by universities, enterprises, cellular operators, and individuals under a collaborative service model. By decentralizing computation and enabling multi-tenant resource sharing, PEaaS reduces reliance on centralized cloud infrastructure, minimizes communication costs, and enhances scalability. The proposed framework is evaluated using EdgeCloudSim under varying workloads, for key metrics such as latency, communication cost, server utilization, and task failure rate. Results reveal that while the cloud's task failure rate rises sharply to 12.3% at 2000 devices, PEaaS maintains a low rate of 2.5%, closely matching edge computing. Furthermore, communication costs remain 25% lower than the cloud's, and latency remains below 0.3, even under peak load. These findings demonstrate that PEaaS achieves near-edge performance with reduced costs and enhanced scalability, offering a sustainable and economically viable solution for next-generation computing environments.
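The failure-rate and cost comparisons above reduce to simple ratios. A minimal Python sketch; the device and failure counts are illustrative, chosen only to reproduce the percentages quoted in the abstract:

```python
def task_failure_rate(failed: int, submitted: int) -> float:
    """Percentage of offloaded tasks that failed."""
    return 0.0 if submitted == 0 else 100.0 * failed / submitted

def relative_cost_saving(peaas_cost: float, cloud_cost: float) -> float:
    """How much cheaper PEaaS communication is than the cloud, in percent."""
    return 100.0 * (cloud_cost - peaas_cost) / cloud_cost

# Illustrative counts at the 2000-device load point.
cloud_rate = task_failure_rate(246, 2000)   # matches the reported 12.3%
peaas_rate = task_failure_rate(50, 2000)    # matches the reported 2.5%
```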
In the rapidly evolving landscape of digital health, the integration of data analytics and Internet health services has become a pivotal area of exploration. To meet keen social needs, Prof. Shan Liu (Xi'an Jiaotong University) and Prof. Xing Zhang (Wuhan Textile University) have published the timely book Data-driven Internet Health Platform Service Value Co-creation with China Science Press. The book focuses on the provision of medical and health services from doctors to patients through Internet health platforms, where the service value is co-created by three parties.
As Internet of Things (IoT) technologies continue to evolve at an unprecedented pace, intelligent big data control and information systems have become critical enablers for organizational digital transformation, facilitating data-driven decision making, fostering innovation ecosystems, and maintaining operational stability. In this study, we propose an advanced deployment algorithm for Service Function Chaining (SFC) that leverages an enhanced Practical Byzantine Fault Tolerance (PBFT) mechanism. The main goal is to tackle the issues of security and resource efficiency in SFC implementation across diverse network settings. By integrating blockchain technology and Deep Reinforcement Learning (DRL), our algorithm not only optimizes resource utilization and quality of service but also ensures robust security during SFC deployment. Specifically, the enhanced PBFT consensus mechanism (VRPBFT) significantly reduces consensus latency and improves Byzantine node detection through the introduction of a Verifiable Random Function (VRF) and a node reputation grading model. Experimental results demonstrate that, compared to traditional PBFT, the proposed VRPBFT algorithm reduces consensus latency by approximately 30% and decreases the proportion of Byzantine nodes by 40% after 100 rounds of consensus. Furthermore, the DRL-based SFC deployment algorithm (SDRL) exhibits rapid convergence during training, with improvements in long-term average revenue, request acceptance rate, and revenue/cost ratio of 17%, 14.49%, and 20.35%, respectively, over existing algorithms. Additionally, the CPU resource utilization of the SDRL algorithm reaches up to 42%, which is 27.96% higher than that of other algorithms. These findings indicate that the proposed algorithm substantially enhances resource utilization efficiency, service quality, and security in SFC deployment.
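The VRF-plus-reputation idea behind primary selection can be illustrated with a hash-based stand-in. Note the hedges: a real VRF is an asymmetric-key primitive that also emits a publicly verifiable proof, and the node names, secrets, and multiplicative reputation weighting below are illustrative assumptions, not the paper's exact VRPBFT scheme:

```python
import hashlib

def vrf_output(secret: bytes, round_no: int) -> float:
    """Hash-based stand-in for a VRF output in [0, 1). A real VRF also
    yields a proof that any node can check against the drawer's public key."""
    digest = hashlib.sha256(secret + round_no.to_bytes(8, "big")).digest()
    return int.from_bytes(digest[:8], "big") / 2**64

def select_primary(nodes: dict, round_no: int) -> str:
    """Pick the round's primary: each node's VRF draw is scaled by its
    reputation grade, so well-behaved nodes win the role more often and
    a node graded 0 (e.g. one detected as Byzantine) can never win."""
    best_name, best_score = None, -1.0
    for name, (secret, reputation) in sorted(nodes.items()):
        score = vrf_output(secret, round_no) * reputation
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```

Because the draw is deterministic per (secret, round), every honest node can recompute and verify the same winner for a given round.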
This article introduces the Push-to-talk over Cellular (PoC) service, a new mobile value-added service based on the IP Multimedia Subsystem (IMS). Its implementation scheme is discussed and its Session Initiation Protocol (SIP) signaling exchange flow is described. Target user groups are predicted based on an analysis of the strengths and weaknesses of the PoC service. The article argues that the PoC system could undoubtedly be used as a platform for new services such as multimedia messaging, instant messaging, presence, and picture sending and receiving. Just like the short message service, the PoC service will help terminal vendors, equipment vendors, content providers, and operators set up a win-for-all industrial value chain.
JCOMM has a strategy to establish a network of WMO-IOC Centres for Marine-meteorological and Oceanographic Climate Data (CMOCs) under the new Marine Climate Data System (MCDS) in 2012, to improve the quality and timeliness of the marine-meteorological and oceanographic data, metadata, and products available to end users. China's candidate centre, CMOC China, was approved to run on a trial basis after the 4th Meeting of the Joint IOC/WMO Technical Commission for Oceanography and Marine Meteorology (JCOMM). This article outlines the development plans of CMOC China for the next few years through a brief introduction to its critical marine data, products and service system, and its cooperation projects around the world.
To achieve the Sustainable Development Goals (SDGs), high-quality data are needed to inform the formulation of policies and investment decisions, to monitor progress towards the SDGs, and to evaluate the impacts of policies. However, the data landscape is changing. With emerging big data and cloud-based services, there are new opportunities for data collection, influencing both official data collection processes and the operation of the programmes they monitor. This paper uses cases and examples to explore the potential of crowdsourcing and public earth observation (EO) data products for monitoring and tracking the SDGs. It suggests that cloud-based services that integrate crowdsourcing and public EO data products provide cost-effective solutions for monitoring and tracking the SDGs, particularly for low-income countries. The paper also discusses the challenges of using cloud services and big data for SDG monitoring. Validation and quality control of public EO data are very important; otherwise, users will be unable to assess the quality of the data or use them with confidence.
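The validation point can be made concrete: before crowdsourced observations are used, they can be screened against an authoritative reference product at the same sites and only accepted if agreement is high enough. A minimal sketch; the tolerance and the values in the comments are illustrative assumptions:

```python
def agreement_rate(crowd, reference, tolerance):
    """Share of crowdsourced values that fall within `tolerance` of the
    reference product at the same sites: a crude plausibility check to
    run before trusting a crowdsourced layer for SDG monitoring."""
    if not crowd:
        return 0.0
    ok = sum(1 for c, r in zip(crowd, reference) if abs(c - r) <= tolerance)
    return ok / len(crowd)
```

A campaign could, for instance, require `agreement_rate(...) >= 0.9` on a held-out validation subset before ingesting the full crowdsourced dataset.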
Cyberattacks are difficult to prevent because the targeted companies and organizations often rely on new and fundamentally insecure cloud-based technologies, such as the Internet of Things. With increasing industry adoption and migration of traditional computing services to the cloud, one of the main challenges in cybersecurity is to provide mechanisms to secure these technologies. This work proposes a Data Security Framework for cloud computing services (CCS) that evaluates and improves CCS data security from a software engineering perspective, applying engineering methods and techniques to assess the levels of security within the cloud computing paradigm. The framework is developed by means of a methodology based on a heuristic theory that incorporates knowledge generated by existing works as well as the experience of their implementation. The paper presents the design details of the framework, which consists of three stages: identification of data security requirements, management of data security risks, and evaluation of data security performance in CCS.
An ocean state monitor and analysis radar (OSMAR), developed by Wuhan University in China, has been mounted at six stations along the coasts of the East China Sea (ECS) to measure velocities (currents, waves and winds) at the sea surface. Radar-observed surface current is taken as an example to illustrate the operational high-frequency (HF) radar observing and data service platform (OP), presenting an operational flow from data observing, transmitting, processing, and visualizing to end-user service. Three layers (systems), namely the radar observing system (ROS), the data service system (DSS), and the visualization service system (VSS), as well as the data flow within the platform, are introduced. Surface velocities observed at the stations are synthesized at the radar data receiving and preprocessing center of the ROS and transmitted to the DSS, where data processing and quality control (QC) are conducted. Users can browse the processed data on the portal of the DSS and access the data files. The VSS aims to better present the data products by displaying the information on a visual globe. By utilizing the OP, the surface currents in the East China Sea are monitored, and their hourly and seasonal variability is investigated.
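The DSS's quality-control step can be illustrated with two standard screens for an hourly surface-current series: a range test and a spike test. The thresholds and flag values below are illustrative assumptions, not the platform's actual QC configuration:

```python
def qc_flags(speeds, max_speed=3.0, max_jump=1.0):
    """Flag hourly surface-current speeds (m/s): 1 = pass, 4 = fail.
    A value fails the range test if it is negative or exceeds max_speed,
    and fails the spike test if it jumps more than max_jump from the
    previous value that passed."""
    flags, last_good = [], None
    for v in speeds:
        bad = v < 0 or v > max_speed
        if not bad and last_good is not None and abs(v - last_good) > max_jump:
            bad = True
        flags.append(4 if bad else 1)
        if not bad:
            last_good = v
    return flags
```

Only values flagged 1 would then flow on to the DSS portal and the VSS globe display.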
Background: Given the importance of customers as the most valuable assets of organizations, customer retention seems to be an essential, basic requirement for any organization. Banks are no exception to this rule. The competitive atmosphere within which electronic banking services are provided by different banks increases the necessity of customer retention. Methods: Built on existing information technologies that allow one to collect data from organizations' databases, data mining is a powerful tool for the extraction of knowledge from huge amounts of data. In this research, the decision tree technique was applied to build a model incorporating this knowledge. Results: The results represent the characteristics of churned customers. Conclusions: Bank managers can identify future churners using the results of the decision tree. They should prepare retention strategies for customers whose characteristics increasingly resemble those of churners.
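The core of the decision tree technique mentioned above is choosing, at each node, the split that best separates churners from non-churners. A minimal information-gain split chooser in pure Python, on a toy one-feature dataset (a real churn model would use many customer attributes and recurse on each branch):

```python
from math import log2

def entropy(labels):
    """Shannon entropy of a list of 0/1 churn labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 0.0 if p in (0.0, 1.0) else -(p * log2(p) + (1 - p) * log2(1 - p))

def best_split(rows, labels):
    """Return (feature_index, threshold, gain) with the highest
    information gain; this is the criterion a decision-tree learner
    applies recursively to grow the churn model."""
    base, best = entropy(labels), (None, None, 0.0)
    for f in range(len(rows[0])):
        for t in sorted({r[f] for r in rows}):
            left = [y for r, y in zip(rows, labels) if r[f] <= t]
            right = [y for r, y in zip(rows, labels) if r[f] > t]
            if not left or not right:
                continue
            w = len(left) / len(labels)
            gain = base - w * entropy(left) - (1 - w) * entropy(right)
            if gain > best[2]:
                best = (f, t, gain)
    return best
```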
With the increase in different sensors, applications, and customers, data providers and users demand a new geospatial data service model that supports low cost and high flexibility and provides a comprehensive service. Based on such requirements and demands, the 21AT TripleSat constellation terminal and data delivery and management system has been developed by a Beijing-based high-tech enterprise, Twenty First Century Aerospace Technology Co., Ltd. (21AT). The company is the first commercial Earth observation satellite operator and service provider in China. This new geospatial data service model allows the user to directly access multi-source satellite data, manage data orders, and carry out automatic massive data production and delivery. The solution also implements safe and hierarchical user management, statistical data analysis, and automatic information reports. In addition, a mobile application is available for users to easily access system functions. This new geospatial solution has already been successfully applied and installed at many customer sites in China, and is now available globally for international clients interested in fast geospatial solutions, enabling the success of customers' operational services. Besides providing TripleSat Constellation images, the multi-source data access system also allows users to access other satellite data sources, based on customized agreements. This paper describes and discusses this new geospatial data service model.
This paper proposes a method of data-flow testing for Web services composition. Firstly, to facilitate data-flow analysis and constraint collection, the existing model representation of the Business Process Execution Language (BPEL) is modified alongside the analysis of data dependency, and an exact representation of dead path elimination (DPE) is proposed, which overcomes the difficulties DPE brings to data-flow analysis. Then, def-use information based on data-flow rules is collected by parsing BPEL and Web Services Description Language (WSDL) documents, and the def-use annotated control-flow graph is created. Based on this model, data-flow anomalies that indicate potential errors can be discovered by traversing the paths of the graph, and the all-du-paths used in dynamic data-flow testing for Web services composition are automatically generated; testers can then design test cases according to the constraints collected for each selected path.
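The def-use machinery can be sketched on a single path of such a graph: a def-use pair links a definition of a variable to a use it reaches with no intervening redefinition. The node names and def/use sets below are illustrative, not taken from any BPEL process:

```python
def du_pairs_on_path(path):
    """Collect def-use pairs along one execution path.

    `path` is a list of (node, defs, uses) triples from a def-use
    annotated control-flow graph. A definition of variable v at node d
    pairs with a use of v at node u when no node between them redefines
    v; uses at a node are read before that node's own defs take effect."""
    live = {}       # variable -> node holding its most recent definition
    pairs = set()
    for node, defs, uses in path:
        for v in uses:
            if v in live:
                pairs.add((v, live[v], node))
        for v in defs:
            live[v] = node
    return pairs
```

Enumerating such pairs over all paths is what yields the all-du-paths coverage targets the abstract refers to.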
In this paper, we present a set of best practices for workflow design and implementation for numerical weather prediction models and meteorological data services, which have been in operation at the China Meteorological Administration (CMA) for years and have proven effective in reliably managing the complexities of large-scale meteorology-related workflows. Based on previous work on the platforms, we argue that a minimum set of guidelines, including workflow scheme, module design, implementation standards, and maintenance considerations during the whole establishment of the platform, is highly recommended, serving to reduce the need for future maintenance and adjustment. A significant gain in performance can be achieved through workflow-based projects. We believe that a good workflow system plays an important role in weather forecast services, providing a useful tool for monitoring the whole process, fixing errors, repairing a workflow, or redesigning an equivalent workflow pattern with new components.
As the volume of healthcare and medical data from diverse sources increases, real-world data sharing and collaboration scenarios face certain challenges, including the risk of privacy leakage, difficulty in data fusion, low reliability of data storage, and low effectiveness of data sharing. To guarantee the service quality of data collaboration, this paper presents a privacy-preserving Healthcare and Medical Data Collaboration Service System combining blockchain with federated learning, termed FL-HMChain. The system is composed of three layers: data extraction and storage, data management, and data application. Focusing on healthcare and medical data, a healthcare and medical blockchain is constructed to realize data storage, transfer, processing, and access with security, timeliness, reliability, and integrity. An improved master-node selection consensus mechanism is presented to detect and prevent dishonest behavior, ensuring the overall reliability and trustworthiness of the collaborative model training process. Furthermore, healthcare and medical data collaboration services in real-world scenarios are discussed and developed. To further validate the performance of FL-HMChain, a Convolutional Neural Network-based Federated Learning (FL-CNN-HMChain) model is investigated for medical image identification. This model achieves better performance than the baseline Convolutional Neural Network (CNN), with average improvements of 4.7% in Area Under Curve (AUC) and 7% in Accuracy (ACC). In addition, the probability of privacy leakage can be effectively reduced by the blockchain-based parameter transfer mechanism in federated learning between local and global models.
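The federated-learning side of such a system can be illustrated by its aggregation step: each hospital trains locally and only model parameters are averaged into the global model, weighted by how much data each site holds. This is a minimal sketch of standard FedAvg-style averaging, not the paper's exact protocol:

```python
def fed_avg(local_weights, sample_counts):
    """Federated averaging: the global parameter vector is the
    sample-size-weighted mean of the clients' local parameter vectors.
    Only parameters cross site boundaries, never raw medical records."""
    total = sum(sample_counts)
    dim = len(local_weights[0])
    return [
        sum(w[i] * n for w, n in zip(local_weights, sample_counts)) / total
        for i in range(dim)
    ]
```

In FL-HMChain these averaged parameters would additionally travel over the blockchain-based transfer mechanism rather than a plain channel.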
The aim of this work was to determine the spatial distribution of activity in the forest in the area of the Forest Promotional Complex "Sudety Zachodnie" using mobile phone data. The study identified the sites with the highest (hot spot) and lowest (cold spot) use. Habitat, stand, demographic, topographic, and spatial factors affecting the distribution of activity were also analyzed. Two approaches were applied in our research: global and local Moran's coefficients, and a machine learning technique, Boosted Regression Trees. The results show that 11,503,320 visits to forest areas were recorded in the "Sudety Zachodnie" in 2019. The most popular season for activities was winter, and the least popular was spring. Using global and local Moran's I coefficients, three small hot clusters of activity and one large cold cluster were identified. Locations with high values and similar neighbours (hot spots) were the most often visited forest areas, averaging almost 200,000 visits over 2019. Significantly fewer visits were recorded in cold spots, where the average number of visits was about 4,500. The value of global Moran's I was 0.54, indicating significant positive spatial autocorrelation. Boosted Regression Trees modeling of forest visits, using tree stand, habitat, and spatial factors, accurately explained 76% of randomly selected input data. The variables with the greatest effect on the distribution of activities were the density of hiking and biking trails and the diversity of topography. The methodology presented in this article allows the delineation of Cultural Ecosystem Services hot spots in forest areas based on mobile phone data. It also allows the identification of factors that may influence the distribution of visits in forests. Such data are important for managing forest areas and adapting forest management to the needs of society while maintaining ecosystem stability.
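Global Moran's I, the clustering statistic reported above (0.54), has a compact closed form. A minimal pure-Python implementation, checked on a toy four-site chain where like values sit next to like (the sites and weights matrix are illustrative):

```python
def morans_i(values, weights):
    """Global Moran's I: values[i] is observed at site i, weights[i][j]
    is the spatial weight between sites i and j (zero on the diagonal).
    Positive I means similar values cluster in space."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    s0 = sum(sum(row) for row in weights)          # total weight
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / s0) * (num / den)
```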
The New Austrian Tunneling Method (NATM) has been widely used in the construction of mountain tunnels, urban metro lines, underground storage tanks, underground power houses, mining roadways, and so on. The variation patterns of advance geological prediction data, stress-strain data of supporting structures, and deformation data of the surrounding rock are vitally important in assessing the rationality and reliability of construction schemes, and provide essential information to ensure the safety and scheduling of tunnel construction. However, as the quantity of these data increases significantly, the uncertainty and discreteness of the mass data make it extremely difficult to produce a reasonable construction scheme; they also reduce the forecast accuracy of accidents and dangerous situations, creating huge challenges in tunnel construction safety. In order to solve this problem, a novel data service system is proposed that uses data-association technology and the NATM, with the support of a big data environment. This system can integrate data resources from distributed monitoring sensors during the construction process, and then identify associations and build relations among data resources under the same construction conditions. These data associations and relations are then stored in a data pool. With the development and supplementation of the data pool, similar relations can then be used under similar conditions, in order to provide data references for construction schematic designs and resource allocation. The proposed data service system also provides valuable guidance for the construction of similar projects.
Big Data applications are pervading more and more aspects of our life, encompassing commercial and scientific uses at increasing rates as we move towards exascale analytics. Examples of Big Data applications include storing and accessing user data in commercial clouds, mining of social data, and analysis of large-scale simulations and experiments such as the Large Hadron Collider. An increasing number of such data-intensive applications and services rely on clouds in order to process and manage the enormous amounts of data required for continuous operation. It can be difficult to decide which of the many options for cloud processing is suitable for a given application; the aim of this paper is therefore to provide an interested user with an overview of the most important concepts of cloud computing as it relates to the processing of Big Data.
Funding: Funded by the National Key Research and Development Program of China (Grant No. 2016YFA0600304) and the Strategic Priority Research Program of the Chinese Academy of Sciences (Grant No. XDA19030201).
Abstract: To achieve the Sustainable Development Goals (SDGs), high-quality data are needed to inform the formulation of policies and investment decisions, to monitor progress towards the SDGs, and to evaluate the impacts of policies. However, the data landscape is changing. With emerging big data and cloud-based services, there are new opportunities for data collection, influencing both official data collection processes and the operation of the programmes they monitor. This paper uses cases and examples to explore the potential of crowdsourcing and public earth observation (EO) data products for monitoring and tracking the SDGs. It suggests that cloud-based services that integrate crowdsourcing and public EO data products provide cost-effective solutions for monitoring and tracking the SDGs, particularly for low-income countries. The paper also discusses the challenges of using cloud services and big data for SDG monitoring. Validation and quality control of public EO data are especially important; otherwise, users will be unable to assess the quality of the data or use it with confidence.
Abstract: Cyberattacks are difficult to prevent because the targeted companies and organizations often rely on new and fundamentally insecure cloud-based technologies, such as the Internet of Things. With increasing industry adoption and migration of traditional computing services to the cloud, one of the main challenges in cybersecurity is to provide mechanisms to secure these technologies. This work proposes a Data Security Framework for cloud computing services (CCS) that evaluates and improves CCS data security from a software engineering perspective, assessing the levels of security within the cloud computing paradigm using engineering methods and techniques applied to CCS. The framework is developed by means of a methodology based on a heuristic theory that incorporates knowledge generated by existing works as well as the experience of their implementation. The paper presents the design details of the framework, which consists of three stages: identification of data security requirements, management of data security risks, and evaluation of data security performance in CCS.
Funding: The National Natural Science Foundation of China under contract No. 41206012.
Abstract: An ocean state monitor and analysis radar (OSMAR), developed by Wuhan University in China, has been mounted at six stations along the coasts of the East China Sea (ECS) to measure velocities (currents, waves, and winds) at the sea surface. Radar-observed surface current is taken as an example to illustrate the operational high-frequency (HF) radar observing and data service platform (OP), presenting an operational flow from data observation and transmission through processing and visualization to end-user service. Three layers (systems), the radar observing system (ROS), the data service system (DSS), and the visualization service system (VSS), as well as the data flow within the platform, are introduced. Surface velocities observed at the stations are synthesized at the radar data receiving and preprocessing center of the ROS and transmitted to the DSS, where data processing and quality control (QC) are conducted. Users can browse the processed data on the DSS portal and download the data files. The VSS presents the data products more effectively by displaying the information on a visual globe. Using the OP, the surface currents in the East China Sea are monitored, and their hourly and seasonal variability is investigated.
Abstract: Background: Given the importance of customers as the most valuable assets of organizations, customer retention is an essential, basic requirement for any organization. Banks are no exception to this rule. The competitive atmosphere within which electronic banking services are provided by different banks increases the necessity of customer retention. Methods: Building on existing information technologies that allow one to collect data from organizations' databases, data mining provides a powerful tool for extracting knowledge from huge amounts of data. In this research, the decision tree technique was applied to build a model incorporating this knowledge. Results: The results characterize the customers who churned. Conclusions: Bank managers can use the decision tree to identify future churners, and should devise retention strategies for customers whose profiles increasingly resemble those of churned customers.
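The core of decision tree induction is choosing, at each node, the feature whose split most reduces label entropy. A minimal sketch of that information-gain criterion on a toy churn dataset follows; the feature names and records are invented for illustration and are not the bank data used in the study:

```python
import math

def entropy(labels):
    """Shannon entropy of a list of binary labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def information_gain(rows, labels, feature):
    """Entropy reduction from splitting on a boolean feature."""
    yes = [l for r, l in zip(rows, labels) if r[feature]]
    no = [l for r, l in zip(rows, labels) if not r[feature]]
    weighted = (len(yes) * entropy(yes) + len(no) * entropy(no)) / len(labels)
    return entropy(labels) - weighted

# Toy customer records (hypothetical features): did the customer churn?
customers = [
    {"low_activity": True,  "has_complaint": True},
    {"low_activity": True,  "has_complaint": False},
    {"low_activity": True,  "has_complaint": False},
    {"low_activity": False, "has_complaint": True},
    {"low_activity": False, "has_complaint": False},
    {"low_activity": False, "has_complaint": False},
]
churned = [1, 1, 1, 0, 0, 0]  # here low_activity predicts churn perfectly

best = max(["low_activity", "has_complaint"],
           key=lambda f: information_gain(customers, churned, f))
print(best)  # -> low_activity
```

A full tree simply applies this selection recursively to each split's subsets, which is why the resulting model reads off directly as the "characteristics of churned customers" the abstract mentions.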
Funding: Supported by the project of the Beijing Municipal Science and Technology Commission, Science and Technology Innovation Base Cultivating and Developing Engineering [grant number Z161100005016069], and the National High Technology Research and Development Program [grant number 2013AA12A303].
Abstract: With the increase in different sensors, applications, and customers, data providers and users demand a new geospatial data service model that supports low cost and high dexterity and provides a comprehensive service. Based on these requirements, the 21AT TripleSat constellation terminal and data delivery and management system has been developed by a Beijing-based high-tech enterprise, Twenty First Century Aerospace Technology Co., Ltd. (21AT), the first commercial Earth observation satellite operator and service provider in China. This new geospatial data service model allows the user to directly access multi-source satellite data, manage data orders, and carry out automatic massive data production and delivery. The solution also implements safe, hierarchical user management, statistical data analysis, and automatic information reports. In addition, a mobile application is available for users to easily access system functions. The solution has already been successfully applied and installed at many customer sites in China and is now available globally for international clients interested in fast geospatial solutions, enabling the success of customers' operational services. Besides providing TripleSat constellation images, the multi-source data access system also allows users to access other satellite data sources under customized agreements. This paper describes and discusses this new geospatial data service model.
Funding: The National Natural Science Foundation of China (60425206, 60503033), the National Basic Research Program of China (973 Program, 2002CB312000), and the Opening Foundation of the State Key Laboratory of Software Engineering, Wuhan University.
Abstract: This paper proposes a method of data-flow testing for Web services composition. First, to facilitate data-flow analysis and constraint collection, the existing model representation of the Business Process Execution Language (BPEL) is modified in line with an analysis of data dependency, and an exact representation of dead path elimination (DPE) is proposed, which overcomes the difficulties DPE brings to data-flow analysis. Then, def-use information based on data-flow rules is collected by parsing BPEL and Web Services Description Language (WSDL) documents, and a def-use annotated control flow graph is created. Based on this model, data-flow anomalies that indicate potential errors can be discovered by traversing the paths of the graph, and the all-du-paths used in dynamic data-flow testing for Web services composition are generated automatically; testers can then design test cases according to the constraints collected for each selected path.
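The kind of anomaly detection the paper performs on paths of the def-use graph can be sketched for a single execution path. This simplified version scans a flat sequence of def/use events rather than a real BPEL control flow graph, and the anomaly names are the classic textbook ones, not necessarily the paper's terminology:

```python
def find_anomalies(ops):
    """Flag classic data-flow anomalies along one execution path,
    given (action, variable) pairs where "def" assigns a variable
    and "use" reads it: a use before any def, and a def that is
    redefined before ever being used."""
    defined, used_since_def, anomalies = set(), set(), []
    for i, (action, var) in enumerate(ops):
        if action == "use":
            if var not in defined:
                anomalies.append((i, var, "use-before-def"))
            used_since_def.add(var)
        elif action == "def":
            if var in defined and var not in used_since_def:
                anomalies.append((i, var, "def-def-without-use"))
            defined.add(var)
            used_since_def.discard(var)
    return anomalies

path = [("def", "x"), ("def", "x"), ("use", "y"), ("use", "x")]
print(find_anomalies(path))
# -> [(1, 'x', 'def-def-without-use'), (2, 'y', 'use-before-def')]
```

In the full method, each path enumerated from the def-use annotated control flow graph would be checked this way, with DPE determining which paths are actually executable.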
Abstract: In this paper, we present a set of best practices for workflow design and implementation for numerical weather prediction models and meteorological data services, which have been in operation at the China Meteorological Administration (CMA) for years and have proven effective in reliably managing the complexities of large-scale meteorology-related workflows. Based on previous work on these platforms, we argue that a minimum set of guidelines, covering the workflow scheme, module design, implementation standards, and maintenance considerations throughout the establishment of a platform, is highly recommended, serving to reduce the need for future maintenance and adjustment. A significant gain in performance can be achieved through workflow-based projects. We believe that a good workflow system plays an important role in the weather forecast service, providing a useful tool for monitoring the whole process, fixing errors, repairing a workflow, or redesigning an equivalent workflow pattern with new components.
Funding: We are thankful for the funding support from the Science and Technology Projects of the National Archives Administration of China (Grant Number 2022-R-031) and the Fundamental Research Funds for the Central Universities, Central China Normal University (Grant Number CCNU24CG014).
Abstract: As the volume of healthcare and medical data from diverse sources increases, real-world data sharing and collaboration face challenges including the risk of privacy leakage, difficulty in data fusion, low reliability of data storage, and low effectiveness of data sharing. To guarantee the service quality of data collaboration, this paper presents a privacy-preserving Healthcare and Medical Data Collaboration Service System combining blockchain with federated learning, termed FL-HMChain. The system is composed of three layers: data extraction and storage, data management, and data application. Focusing on healthcare and medical data, a healthcare and medical blockchain is constructed to realize data storage, transfer, processing, and access with security, real-time performance, reliability, and integrity. An improved master-node selection consensus mechanism is presented to detect and prevent dishonest behavior, ensuring the overall reliability and trustworthiness of the collaborative model training process. Furthermore, healthcare and medical data collaboration services in real-world scenarios are discussed and developed. To further validate the performance of FL-HMChain, a Convolutional Neural Network-based Federated Learning (FL-CNN-HMChain) model is investigated for medical image identification. The model outperforms the baseline Convolutional Neural Network (CNN), with average improvements of 4.7% in Area Under Curve (AUC) and 7% in Accuracy (ACC). Moreover, the probability of privacy leakage is effectively reduced by the blockchain-based parameter transfer mechanism between local and global models in federated learning.
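The parameter transfer at the heart of federated learning can be illustrated with the standard FedAvg aggregation rule: clients share only model parameters, and the server averages them weighted by each client's sample count. This is a generic sketch of FedAvg with invented numbers, not FL-HMChain's actual aggregation code:

```python
def federated_average(client_weights, client_sizes):
    """Aggregate client model parameters into a global model by
    sample-count-weighted averaging (the FedAvg rule). Each client's
    parameters are a flat list of floats here for simplicity."""
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    return [
        sum(w[i] * s for w, s in zip(client_weights, client_sizes)) / total
        for i in range(n_params)
    ]

# Two hospitals train locally and share only parameters, never raw records.
hospital_a = [0.2, 0.8, -0.5]   # trained on 100 images
hospital_b = [0.6, 0.4, -0.1]   # trained on 300 images
global_model = federated_average([hospital_a, hospital_b], [100, 300])
print(global_model)  # weighted toward hospital_b's larger dataset
```

In a blockchain-backed system like the one described, these parameter updates would be the payloads recorded and verified on-chain, so the raw medical images never leave each institution.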
Funding: Funded by the National Science Centre, Poland under the OPUS call in the Weave programme (project No. 2021/43/I/HS4/01451) and by the Ministry of Education and Science (901503).
Abstract: The aim of this work was to determine the spatial distribution of activity in the forest across the Forest Promotional Complex "Sudety Zachodnie" using mobile phone data. The study identified the sites with the highest (hot spot) and lowest (cold spot) use. Habitat, stand, demographic, topographic, and spatial factors affecting the distribution of activity were also analyzed. Two approaches were applied in our research: global and local Moran's coefficients, and a machine learning technique, Boosted Regression Trees. The results show that 11,503,320 visits to forest areas were recorded in the "Sudety Zachodnie" in 2019. The most popular season for activities was winter, and the least popular was spring. Using global and local Moran's I coefficients, three small hot clusters of activity and one large cold cluster were identified. Locations with high values and similar neighbours (hot spots) were the most frequently visited forest areas, averaging almost 200,000 visits over 2019. Significantly fewer visits were recorded in cold spots, which averaged about 4,500 visits. The global Moran's I value of 0.54 indicated significant positive spatial autocorrelation. Boosted Regression Trees modeling of forest visits, using tree-stand, habitat, and spatial factors, accurately explained 76% of randomly selected input data. The variables with the greatest effect on the distribution of activities were the density of hiking and biking trails and the diversity of the topography. The methodology presented in this article allows the delineation of Cultural Ecosystem Services hot spots in forest areas based on mobile phone data. It also allows the identification of factors that may influence the distribution of visits in forests. Such data are important for managing forest areas and adapting forest management to the needs of society while maintaining ecosystem stability.
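The global Moran's I statistic reported above (0.54) measures whether similar visit counts cluster in space: values near +1 indicate clustering, near 0 a random pattern, and negative values dispersion. A minimal sketch of the computation on an invented one-dimensional transect of forest cells (the visit counts and the rook-contiguity weights are illustrative assumptions, not the study's data):

```python
def global_morans_i(values, weights):
    """Global Moran's I for spatial autocorrelation, where
    weights[i][j] is the spatial weight between locations i and j:
    I = (n / W) * sum_ij w_ij (x_i - mean)(x_j - mean) / sum_i (x_i - mean)^2
    """
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    w_sum = sum(weights[i][j] for i in range(n) for j in range(n))
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / w_sum) * (num / den)

# Six forest cells along a transect; neighbours share an edge (rook contiguity).
visits = [10, 9, 8, 1, 2, 1]  # busy cells sit next to busy cells
w = [[1 if abs(i - j) == 1 else 0 for j in range(6)] for i in range(6)]
print(round(global_morans_i(visits, w), 3))  # -> 0.581, positive autocorrelation
```

Local Moran's I applies the same cross-product idea per location, which is what lets the study label individual cells as hot spots or cold spots rather than summarizing the whole area with one number.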
Abstract: The New Austrian Tunneling Method (NATM) has been widely used in the construction of mountain tunnels, urban metro lines, underground storage tanks, underground power houses, mining roadways, and so on. The variation patterns of advance geological prediction data, stress-strain data of supporting structures, and deformation data of the surrounding rock are vitally important in assessing the rationality and reliability of construction schemes, and provide essential information to ensure the safety and scheduling of tunnel construction. However, as the quantity of these data increases significantly, the uncertainty and discreteness of the mass data make it extremely difficult to produce a reasonable construction scheme; they also reduce the forecast accuracy of accidents and dangerous situations, creating huge challenges for tunnel construction safety. To solve this problem, a novel data service system is proposed that uses data-association technology and the NATM, with the support of a big data environment. This system can integrate data resources from distributed monitoring sensors during the construction process, and then identify associations and build relations among data resources under the same construction conditions. These data associations and relations are then stored in a data pool. As the data pool develops and is supplemented, similar relations can be used under similar conditions to provide data references for construction schematic designs and resource allocation. The proposed data service system also provides valuable guidance for the construction of similar projects.
Abstract: Big Data applications are pervading more and more aspects of our life, encompassing commercial and scientific uses at increasing rates as we move towards exascale analytics. Examples of Big Data applications include storing and accessing user data in commercial clouds, mining social data, and analyzing large-scale simulations and experiments such as those of the Large Hadron Collider. An increasing number of such data-intensive applications and services rely on clouds to process and manage the enormous amounts of data required for continuous operation. It can be difficult to decide which of the many options for cloud processing is suitable for a given application; the aim of this paper is therefore to give an interested user an overview of the most important concepts of cloud computing as they relate to the processing of Big Data.