Well logging technology has accumulated a large amount of historical data through four generations of technological development, which forms the basis of well logging big data and digital assets. However, the value of these data has not been well stored, managed and mined. The development of cloud computing technology provides a rare opportunity for a logging big data private cloud. The traditional petrophysical evaluation and interpretation model has encountered great challenges when facing new evaluation objects, and research on solutions that integrate distributed storage, processing and learning of logging big data within such a private cloud has not yet been carried out. This work establishes a distributed logging big data private cloud platform centered on a unified learning model, which achieves distributed storage and processing of logging big data and facilitates the learning of novel knowledge patterns via a unified logging learning model that integrates physical simulation and data models in a large function space, thus addressing the geo-engineering evaluation problem of geothermal fields. Following the research idea of "logging big data cloud platform - unified logging learning model - large function space - knowledge learning & discovery - application", the theoretical foundation of the unified learning model, the cloud platform architecture, data storage and learning algorithms, computing power allocation and platform monitoring, platform stability, and data security are analyzed. The designed logging big data cloud platform realizes parallel distributed storage and processing of data and learning algorithms. The feasibility of constructing a well logging big data cloud platform based on a unified learning model of physics and data is analyzed in terms of the structure, ecology, management and security of the cloud platform. The case study shows that the logging big data cloud platform has clear technical advantages over traditional logging evaluation methods in terms of knowledge discovery, sharing of data, software and results, accuracy, speed and handling of complexity.
The development of machine learning and deep learning algorithms, as well as the improvement of hardware computing power, provides a rare opportunity for a logging big data private cloud. With the deepening of exploration and development and the requirements of low-carbon development, the focus of the oil and gas industry is gradually shifting to deep sea, deep earth and renewable energy sources such as geothermal energy. The traditional petrophysical evaluation and interpretation model has encountered great challenges when facing these new evaluation objects. This work establishes a distributed logging big data private cloud platform built around a unified learning model, which realizes distributed storage and processing of logging big data and enables the learning of brand-new knowledge patterns from multi-attribute data in a large function space within a unified logging learning model that integrates expert knowledge and data models, so as to solve the geo-engineering evaluation problem of geothermal fields. Following the research idea of "logging big data cloud platform - unified logging learning model - large function space - knowledge learning & discovery - application", the theoretical foundation of the unified learning model, the cloud platform architecture, data storage and learning algorithms, computing power allocation and platform monitoring, platform stability, and data security are analyzed. The designed logging big data cloud platform realizes parallel distributed storage and processing of data and learning algorithms. New knowledge for geothermal evaluation is discovered in a large function space and applied to geo-engineering evaluation of geothermal fields. The examples show good application, under the remote logging module of the cloud platform, in the selection of logging series in geothermal fields, quality control of logging data, identification of complex lithology, evaluation of reservoir fluids, checking of associated helium, evaluation of cementing quality, evaluation of well-side fractures, and evaluation of geothermal water recharge. The first and second cementing interfaces of cemented wells in geothermal fields were evaluated, as well as the development of well-side distal fractures and fracture extension orientation. Because communicating well-side fractures form a good fluid pathway, with a large flow rate and long flow paths in the thermal reservoir fissure system, the results support the design of the geothermal water recharge program.
Anomaly detection is an important task for maintaining the performance of cloud data centers. Traditional anomaly detection primarily examines individual Virtual Machine (VM) behavior, neglecting the impact of interactions among multiple VMs on Key Performance Indicator (KPI) data, e.g., memory utilization. Furthermore, the non-stationarity, high complexity, and uncertain periodicity of VM KPI data also complicate deep learning-based anomaly detection. To address these challenges, this paper proposes MCBiWGAN-GTN, a multi-channel semi-supervised time series anomaly detection algorithm based on the Bidirectional Wasserstein Generative Adversarial Network with Graph-Time Network (BiWGAN-GTN) and the Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (CEEMDAN). (a) The BiWGAN-GTN algorithm is proposed to extract spatiotemporal information from the data. (b) The loss function of BiWGAN-GTN is redesigned to solve the problem of abnormal data intruding into the training process. (c) MCBiWGAN-GTN reduces data complexity by using CEEMDAN for time series decomposition and trains BiWGAN-GTN on the different components. (d) To adapt the proposed algorithm to the entire cloud data center, a cloud data center anomaly detection framework based on Swarm Learning (SL) is designed. Evaluation on a real-world cloud data center dataset shows that MCBiWGAN-GTN outperforms the baseline, with an F1-score of 0.96, an accuracy of 0.935, a precision of 0.954, a recall of 0.967, and an FPR of 0.203. The experiments also verify the stability of MCBiWGAN-GTN, the impact of parameter configurations, and the effectiveness of the proposed SL framework.
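As a rough illustration of the decomposition stage described above, the sketch below uses the PyEMD package (EMD-signal) to split a KPI series into CEEMDAN components; the BiWGAN-GTN detectors themselves are not reproduced, and `train_per_component` / `build_detector` are placeholders for whatever per-component model would be trained.

```python
# A minimal sketch of the MCBiWGAN-GTN decomposition step, assuming PyEMD
# (pip install EMD-signal) is available: CEEMDAN splits a KPI series into
# lower-complexity components, and a separate detector would be trained on each.
import numpy as np
from PyEMD import CEEMDAN

def decompose_kpi(series: np.ndarray) -> np.ndarray:
    """Decompose a 1-D KPI series into intrinsic mode functions plus residue."""
    ceemdan = CEEMDAN()
    return ceemdan(series)            # shape: (n_components, len(series))

def train_per_component(imfs, build_detector):
    """Hypothetical loop: fit one detector (e.g. a BiWGAN-GTN) per component."""
    return [build_detector().fit(imf.reshape(-1, 1)) for imf in imfs]

if __name__ == "__main__":
    t = np.linspace(0, 8 * np.pi, 1024)
    kpi = np.sin(t) + 0.3 * np.sin(7 * t) + 0.05 * np.random.randn(t.size)
    components = decompose_kpi(kpi)
    print("CEEMDAN produced", components.shape[0], "components")
```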
Airborne LiDAR (Light Detection and Ranging) is an evolving active remote sensing technology that can acquire large-area topographic data and quickly generate DEM (Digital Elevation Model) products. Combined with image data, this technology can further enrich and extract spatial geographic information. In practice, however, due to the limited operating range of airborne LiDAR and the large area of a survey task, it is necessary to register and stitch the point clouds of adjacent flight strips; gross errors must be eliminated and the systematic errors in the data effectively reduced. This paper therefore studies point cloud registration methods in urban building areas, aiming to improve the accuracy and processing efficiency of airborne LiDAR data. An improved post-ICP (Iterative Closest Point) point cloud registration method is proposed to achieve accurate registration and efficient stitching of point clouds, providing potential technical support for applications in related fields.
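For orientation, the following sketch shows classical point-to-point ICP between two strips using Open3D; it illustrates only the baseline ICP step, not the paper's improved post-ICP refinements, and the file names and voxel size are placeholders.

```python
# A hedged sketch of strip-to-strip registration with standard point-to-point
# ICP via Open3D; the paper's own improvements are not reproduced here.
import numpy as np
import open3d as o3d

def register_strips(source_path: str, target_path: str, voxel: float = 0.5):
    source = o3d.io.read_point_cloud(source_path)
    target = o3d.io.read_point_cloud(target_path)
    # Downsample to speed up correspondence search on large LiDAR strips.
    src = source.voxel_down_sample(voxel)
    tgt = target.voxel_down_sample(voxel)
    result = o3d.pipelines.registration.registration_icp(
        src, tgt, 2.0 * voxel, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation, result.fitness

if __name__ == "__main__":
    T, fitness = register_strips("strip_a.ply", "strip_b.ply")
    print("estimated transform:\n", T, "\nfitness:", fitness)
```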
Model reconstruction from points scanned on existing physical objects is important in a variety of situations, such as reverse engineering of mechanical products, computer vision, and recovery of biological shapes from two-dimensional contours. With the development of measuring equipment, point clouds containing more details of the object can be obtained conveniently. On the other hand, the large number of sampled points creates difficulties for model reconstruction methods. This paper first presents an algorithm to automatically reduce the number of points under a given tolerance. A triangle mesh surface is then reconstructed from the simplified data set with the marching cubes algorithm. For various reasons, the reconstructed mesh usually contains unwanted holes. An approach is proposed to create new, optimally shaped triangles that cover the unexpected holes in the triangle mesh. After hole filling, a watertight triangle mesh can be directly output in STL format, which is widely used in rapid prototype manufacturing. Practical examples are included to demonstrate the method.
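To make the point-reduction idea concrete, here is a simple grid-based stand-in: points are bucketed into cubic cells whose edge equals the tolerance and one centroid is kept per cell. This is not the paper's own reduction criterion, and the marching-cubes and hole-filling stages are not reproduced.

```python
# A hedged, tolerance-driven point reduction sketch (grid decimation), not the
# paper's exact algorithm.
import numpy as np

def reduce_points(points: np.ndarray, tolerance: float) -> np.ndarray:
    """points: (N, 3) array; returns one centroid per occupied cubic cell."""
    cells = np.floor(points / tolerance).astype(np.int64)
    _, inverse = np.unique(cells, axis=0, return_inverse=True)
    inverse = inverse.reshape(-1)
    n_cells = inverse.max() + 1
    sums = np.zeros((n_cells, 3))
    counts = np.zeros(n_cells)
    np.add.at(sums, inverse, points)   # accumulate point coordinates per cell
    np.add.at(counts, inverse, 1.0)
    return sums / counts[:, None]

if __name__ == "__main__":
    pts = np.random.rand(100_000, 3)            # stand-in for a dense scan
    slim = reduce_points(pts, tolerance=0.05)
    print(len(pts), "->", len(slim), "points")
```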
Efficient data management is a key issue for large-scale distributed environments such as the data cloud, and it can be addressed by replicating the data. Data replication reduces service time and access delay, increases availability, and optimizes the load distribution in the system. It is worth mentioning, however, that replication increases resource and energy use, because copies of the data must be stored. We propose a replication manager that decreases resource cost, energy cost, and delay in the system while increasing its availability. To reach this aim, the suggested replication manager, called the locality replication manager (LRM), uses two key algorithms that exploit the physical adjacency of blocks. In addition, a set of simulations is reported to show that LRM can be a suitable option for distributed systems, as it uses less energy and fewer resources, optimizes the load distribution, and offers higher availability and lower delay.
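The toy sketch below illustrates the locality idea in the abstract: when placing a replica of a block, prefer a node that already stores physically adjacent blocks. The data structures and scoring rule are illustrative assumptions, not LRM's actual algorithms.

```python
# A hedged toy of adjacency-aware replica placement in the spirit of LRM.
from collections import defaultdict

def place_replica(block_id, node_blocks, capacity):
    """node_blocks: {node: set(block_ids)}; capacity: {node: free slots}."""
    def score(node):
        adjacent = {block_id - 1, block_id + 1}
        return len(adjacent & node_blocks[node])     # favour physical adjacency
    candidates = [n for n, free in capacity.items() if free > 0]
    best = max(candidates, key=lambda n: (score(n), capacity[n]))
    node_blocks[best].add(block_id)
    capacity[best] -= 1
    return best

if __name__ == "__main__":
    nodes = defaultdict(set, {"n1": {4, 5}, "n2": {9}, "n3": set()})
    free = {"n1": 3, "n2": 3, "n3": 3}
    print("replica of block 6 placed on", place_replica(6, nodes, free))  # n1
```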
A new object-oriented method has been developed for the extraction of Mars rocks from Mars rover data. It is based on a combination of Mars rover imagery and 3D point cloud data. First, Navcam or Pancam images taken by the Mars rovers are segmented into homogeneous objects with a mean-shift algorithm. Then, the objects in the segmented images are classified into small rock candidates, rock shadows, and large objects. Rock shadows and large objects are considered as the regions within which large rocks may exist. In these regions, large rock candidates are extracted through ground-plane fitting with the 3D point cloud data. Small and large rock candidates are combined and postprocessed to obtain the final rock extraction results. The shape properties of the rocks (angularity, circularity, width, height, and width-height ratio) have been calculated for subsequent geological studies.
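The ground-plane fitting step could look like the following hedged sketch, which uses RANSAC plane segmentation in Open3D and flags points standing above the fitted plane as potential large-rock surfaces; the file name and thresholds are placeholders, and the image segmentation and classification stages are not reproduced.

```python
# A hedged sketch of ground-plane fitting on a rover point cloud with Open3D.
import numpy as np
import open3d as o3d

def rock_candidates(cloud_path: str, height_thresh: float = 0.10):
    pcd = o3d.io.read_point_cloud(cloud_path)
    plane, _inliers = pcd.segment_plane(distance_threshold=0.02,
                                        ransac_n=3, num_iterations=1000)
    a, b, c, d = plane                        # plane: a*x + b*y + c*z + d = 0
    pts = np.asarray(pcd.points)
    height = (pts @ np.array([a, b, c]) + d) / np.linalg.norm([a, b, c])
    return pts[np.abs(height) > height_thresh]   # points off the ground plane

if __name__ == "__main__":
    rocks = rock_candidates("navcam_scene.ply")
    print(rocks.shape[0], "points flagged as potential large-rock surfaces")
```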
Advanced cloud computing technology provides cost savings and service flexibility for users. With the explosion of multimedia data, more and more data owners outsource their personal multimedia data to the cloud, and some computationally expensive tasks are also undertaken by cloud servers. However, the outsourced multimedia data and its applications may reveal the data owner's private information, because the data owners lose control of their data. Recently, these concerns have aroused new research interest in privacy-preserving reversible data hiding over outsourced multimedia data. In this paper, two reversible data hiding schemes are proposed for encrypted image data in cloud computing: reversible data hiding by homomorphic encryption and reversible data hiding in the encrypted domain. In the former, the additional bits are extracted after decryption; in the latter, they are extracted before decryption. A combined scheme is also designed. This paper thus proposes a privacy-preserving outsourcing scheme for reversible data hiding over encrypted image data in cloud computing, which not only ensures multimedia data security without relying on the trustworthiness of cloud servers, but also guarantees that reversible data hiding can be performed over encrypted images at different stages. Theoretical analysis confirms the correctness of the proposed encryption model and justifies the security of the proposed scheme. The computational cost of the proposed scheme is acceptable and adjusts to different security levels.
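The sketch below illustrates the additive homomorphism that schemes of this kind typically rely on: the cloud can modify an encrypted sample without seeing it, and the change is recoverable after decryption. It uses the python-paillier package (`phe`) as an assumed stand-in cipher and is not the paper's full embedding scheme.

```python
# A hedged illustration of embedding in the encrypted domain via Paillier's
# additive homomorphism (pip install phe); not the paper's actual construction.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)

pixel = 137                    # original image sample held by the data owner
mark = 1                       # one additional bit the cloud wants to embed

enc_pixel = public_key.encrypt(pixel)   # owner encrypts before outsourcing
enc_marked = enc_pixel + mark           # cloud embeds in the encrypted domain

recovered = private_key.decrypt(enc_marked)
assert recovered == pixel + mark        # receiver can extract and restore
print("decrypted marked value:", recovered)
```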
Due to the increasing number of cloud applications, the amount of data in the cloud is growing faster than ever before. The nature of cloud computing requires cloud data processing systems that can handle huge volumes of data with high performance. However, most current cloud storage systems adopt a hash-like approach to retrieving data that only supports simple keyword-based queries and lacks richer forms of information search. A scalable and efficient indexing scheme is therefore clearly required. In this paper, we present the SLC-index, a novel, scalable skip list-based index for cloud data processing. The SLC-index offers a two-layered architecture that extends the indexing scope and facilitates better throughput. Dynamic load balancing for the SLC-index is achieved by online migration of index nodes between servers. Furthermore, the system is flexible thanks to dynamic addition and removal of servers. The SLC-index is efficient for both point and range queries. Experimental results show the efficiency of the SLC-index and its usefulness as an alternative approach for cloud-suitable data structures.
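For readers unfamiliar with the underlying structure, the compact sketch below shows a single-machine skip list supporting insert and point lookup; the SLC-index's two-layer distribution, node migration and load balancing are not reproduced.

```python
# A hedged, minimal skip list: probabilistic multi-level linked lists giving
# O(log n) expected lookups and ordered traversal for range queries.
import random

MAX_LEVEL = 16
P = 0.5

class Node:
    def __init__(self, key, value, level):
        self.key, self.value = key, value
        self.forward = [None] * level

class SkipList:
    def __init__(self):
        self.head = Node(None, None, MAX_LEVEL)
        self.level = 1

    def _random_level(self):
        lvl = 1
        while random.random() < P and lvl < MAX_LEVEL:
            lvl += 1
        return lvl

    def insert(self, key, value):
        update = [self.head] * MAX_LEVEL
        node = self.head
        for i in range(self.level - 1, -1, -1):
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = Node(key, value, lvl)
        for i in range(lvl):
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new

    def search(self, key):
        node = self.head
        for i in range(self.level - 1, -1, -1):
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
        node = node.forward[0]
        return node.value if node and node.key == key else None

if __name__ == "__main__":
    sl = SkipList()
    for k in [30, 10, 50, 20, 40]:
        sl.insert(k, f"record-{k}")
    print(sl.search(20), sl.search(35))   # record-20 None
```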
To reduce energy consumption in cloud data centres, in this paper we propose two algorithms: the Energy-aware Scheduling algorithm using Workload-aware Consolidation Technique (ESWCT) and the Energy-aware Live Migration algorithm using Workload-aware Consolidation Technique (ELMWCT). As opposed to traditional energy-aware scheduling algorithms, which often focus on only one resource dimension, the two algorithms are based on the fact that multiple resources (such as CPU, memory and network bandwidth) are shared by users concurrently in cloud data centres and that heterogeneous workloads have different resource consumption characteristics. Both algorithms investigate the problem of consolidating heterogeneous workloads. They try to execute all Virtual Machines (VMs) on the minimum number of Physical Machines (PMs) and then power off unused physical servers to reduce power consumption. Simulation results show that both algorithms efficiently utilise the resources in cloud data centres and achieve well-balanced utilization across the multidimensional resources, demonstrating their promising energy-saving capability.
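As a hedged toy of the consolidation idea, the sketch below packs VMs described by multi-dimensional demands onto as few PMs as possible using a simple first-fit-decreasing heuristic, so idle PMs can be powered off; the exact placement and migration policies of ESWCT/ELMWCT are not reproduced.

```python
# A hedged first-fit-decreasing consolidation sketch over CPU/memory/bandwidth.
from typing import Dict, List

def consolidate(vms: List[Dict[str, float]], pm_capacity: Dict[str, float]):
    """Return a list of PMs, each holding the list of VM demand dicts placed on it."""
    pms: List[Dict[str, float]] = []             # remaining capacity per PM
    placement: List[List[Dict[str, float]]] = []
    # Place the "largest" VMs first (sum of normalised demands).
    order = sorted(vms, key=lambda v: sum(v[r] / pm_capacity[r] for r in v),
                   reverse=True)
    for vm in order:
        for cap, hosted in zip(pms, placement):
            if all(cap[r] >= vm[r] for r in vm):  # fits on this PM
                for r in vm:
                    cap[r] -= vm[r]
                hosted.append(vm)
                break
        else:                                     # no PM fits: open a new one
            pms.append({r: pm_capacity[r] - vm[r] for r in pm_capacity})
            placement.append([vm])
    return placement

if __name__ == "__main__":
    capacity = {"cpu": 16.0, "mem": 64.0, "net": 10.0}
    demands = [{"cpu": 4, "mem": 8, "net": 1}, {"cpu": 8, "mem": 32, "net": 2},
               {"cpu": 2, "mem": 16, "net": 5}, {"cpu": 6, "mem": 8, "net": 3}]
    print("PMs needed:", len(consolidate(demands, capacity)))
```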
Storage auditing and client-side deduplication techniques have been proposed to assure data integrity and improve storage efficiency, respectively. Recently, a few schemes have started to consider these two different aspects together. However, these schemes either support only plaintext data files or have been proved insecure. In this paper, we propose a public auditing scheme for cloud storage systems in which deduplication of encrypted data and data integrity checking can be achieved within the same framework. The cloud server can correctly check the ownership for new owners, and the auditor can correctly check the integrity of deduplicated data. Our scheme supports deduplication of encrypted data by using proxy re-encryption and also achieves deduplication of data tags by aggregating the tags from different owners. The analysis and experimental results show that our scheme is provably secure and efficient.
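To show why deduplication over encrypted data is possible at all, the sketch below uses the classic convergent-encryption trick (key derived from the content, so identical files yield identical ciphertexts) together with a server-side duplicate check. This is a deliberately simplified stand-in; the paper's own construction uses proxy re-encryption and aggregated tags, which are not reproduced.

```python
# A hedged sketch of deduplicable encryption: content-derived key + keystream
# cipher, then server-side dedup by ciphertext hash. Toy crypto, not the paper's scheme.
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def convergent_encrypt(data: bytes) -> tuple[bytes, bytes]:
    key = hashlib.sha256(data).digest()            # content-derived key
    ct = bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))
    return ct, key

server_store = {}

def upload(ciphertext: bytes) -> bool:
    """Server-side dedup check: store only previously unseen ciphertexts."""
    tag = hashlib.sha256(ciphertext).hexdigest()
    if tag in server_store:
        return False                               # duplicate, nothing stored
    server_store[tag] = ciphertext
    return True

if __name__ == "__main__":
    c1, _ = convergent_encrypt(b"same report contents")
    c2, _ = convergent_encrypt(b"same report contents")
    print(upload(c1), upload(c2))                  # True False -> deduplicated
```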
Many organizations apply cloud computing to store and effectively process data for various applications, but data uploaded to the cloud by users has weak security due to unreliable verification of data integrity. In this research, an effective authentication model based on an enhanced Merkle hash tree is proposed for the multi-owner cloud to increase the security of cloud data. In the Merkle hash tree, the leaf nodes carry hash tags and each non-leaf node stores the hash information of its children, allowing large data to be protected; the tree provides efficient mapping of the data and, thanks to its structure, easily identifies changes made to the data. The developed model supports privacy-preserving public auditing to provide a secure cloud storage system. The data owners upload the data to the cloud and edit it using a private key. The enhanced Merkle hash tree method stores the data on the cloud server and splits it into batches. The data files requested by the data owner are audited by a third-party auditor, and the multi-owner authentication method is applied during the modification process to authenticate the user. The results show that the proposed method reduces the encryption and decryption time for cloud data storage by 2-167 ms compared with the existing Advanced Encryption Standard and Blowfish.
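The minimal sketch below shows the basic Merkle hash tree computation underlying such auditing: leaves hash the data blocks, internal nodes hash the concatenation of their children, and the root serves as a compact integrity tag. Batching, key handling and the multi-owner protocol are not reproduced.

```python
# A hedged sketch of Merkle root computation over data batches.
import hashlib

def _h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(blocks: list[bytes]) -> bytes:
    level = [_h(b) for b in blocks]
    if not level:
        return _h(b"")
    while len(level) > 1:
        if len(level) % 2:                 # duplicate last node on odd levels
            level.append(level[-1])
        level = [_h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

if __name__ == "__main__":
    batches = [b"block-0", b"block-1", b"block-2"]
    root_before = merkle_root(batches)
    batches[1] = b"block-1 (tampered)"
    print("change detected:", merkle_root(batches) != root_before)   # True
```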
This study concerns a Ka-band solid-state transmitter cloud radar, made in China, which can operate in three different work modes, with different pulse widths and coherent and incoherent integration numbers, to meet the requirements for cloud remote sensing over the Tibetan Plateau. Specifically, the design of the three operational modes of the radar (i.e., boundary mode M1, cirrus mode M2, and precipitation mode M3) is introduced. Also, a cloud radar data merging algorithm for the three modes is proposed. Using one month's continuous measurements during summertime at Naqu on the Tibetan Plateau, we analyzed the consistency between the cloud radar measurements of the three modes. The number of occurrences of radar detections of hydrometeors and the percentage contributions of the different modes' data to the merged data were estimated. The performance of the merging algorithm was evaluated. The results indicated that the minimum detectable reflectivity for each mode was consistent with theoretical results. Merged data provided measurements with a minimum reflectivity of -35 dBZ at the height of 5 km, and obtained information above the height of 0.2 km. Measurements of radial velocity by the three operational modes agreed very well, and systematic errors in measurements of reflectivity were less than 2 dB. However, large discrepancies existed in the measurements of the linear depolarization ratio taken from the different operational modes. The percentage of radar detections of hydrometeors in mid- and high-level clouds increased by 60% through application of pulse compression techniques. In conclusion, the merged data are appropriate for cloud and precipitation studies over the Tibetan Plateau.
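As a back-of-the-envelope check of the quoted sensitivity, the snippet below applies the standard range-squared dependence of minimum detectable reflectivity (20·log10(r/r0) dB), anchored to the -35 dBZ at 5 km figure for the merged data; actual mode-dependent sensitivities also depend on pulse width and integration, which this toy calculation ignores.

```python
# A hedged range-scaling of minimum detectable reflectivity (Z_min ~ r^2).
import math

def min_detectable_dbz(range_km: float, ref_dbz: float = -35.0, ref_km: float = 5.0) -> float:
    return ref_dbz + 20.0 * math.log10(range_km / ref_km)

if __name__ == "__main__":
    for r in (1.0, 5.0, 10.0):
        print(f"{r:4.1f} km -> {min_detectable_dbz(r):6.1f} dBZ")
```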
With the development of Internet technology and human computing, the computing environment has changed dramatically over the last three decades. Cloud computing has emerged as a paradigm of Internet computing in which dynamic, scalable and often virtualized resources are provided as services. With virtualization technology, cloud computing offers diverse services (such as virtual computing, virtual storage, virtual bandwidth, etc.) to the public through a multi-tenancy mode. Although users enjoy the super-computing capabilities and mass storage supplied by cloud computing, cloud security remains a hot-spot problem, which is in essence one of trust management between data owners and storage service providers. In this paper, we propose a data coloring method based on cloud watermarking to recognize and ensure mutual reputations. The experimental results show that the robustness of the reverse cloud generator can guarantee users' embedded social reputation identifications. Hence, our work provides a reference solution to the critical problem of cloud security.
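For context, the sketch below implements the standard forward normal cloud generator from the cloud-model literature on which cloud watermarking builds: given the digital characteristics (Ex, En, He) it produces cloud drops whose spread encodes the three parameters, and a reverse generator would recover them. The authors' specific coloring and embedding steps are not reproduced.

```python
# A hedged sketch of the forward normal cloud generator (Ex, En, He -> drops).
import numpy as np

def forward_cloud(ex: float, en: float, he: float, n: int = 1000, seed: int = 0):
    rng = np.random.default_rng(seed)
    en_i = rng.normal(en, he, n)                     # per-drop entropy sample
    x = rng.normal(ex, np.abs(en_i))                 # cloud drop positions
    mu = np.exp(-(x - ex) ** 2 / (2.0 * en_i ** 2))  # certainty degree of each drop
    return x, mu

if __name__ == "__main__":
    drops, certainty = forward_cloud(ex=0.75, en=0.05, he=0.01)
    print("mean drop:", round(float(drops.mean()), 3),
          "mean certainty:", round(float(certainty.mean()), 3))
```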
Assimilation configurations have significant impacts on analysis results and subsequent forecasts. A squall line system that occurred on 23 April 2007 over southern China was used to investigate the impacts of the data assimilation frequency of radar data on analyses and forecasts. A three-dimensional variational system was used to assimilate radial velocity data, and a cloud analysis system was used for reflectivity assimilation with a 2-h assimilation window covering the initial stage of the squall line. Two operators of radar reflectivity for cloud analyses corresponding to single- and double-moment schemes were used. In this study, we examined the sensitivity of assimilation frequency using 10-, 20-, 30-, and 60-min assimilation intervals. The results showed that analysis fields were not consistent with model dynamics and microphysics in general; thus, model states, including dynamic and microphysical variables, required approximately 20 min to reach a new balance after data assimilation in all experiments. Moreover, a 20-min data assimilation interval generally produced better forecasts for both single- and double-moment schemes in terms of equitable threat and bias scores. We conclude that a higher data assimilation frequency can produce a more intense cold pool and rear inflow jets but does not necessarily lead to a better forecast.
Big Data applications are pervading more and more aspects of our life, encompassing commercial and scientific uses at increasing rates as we move towards exascale analytics. Examples of Big Data applications include storing and accessing user data in commercial clouds, mining of social data, and analysis of large-scale simulations and experiments such as the Large Hadron Collider. An increasing number of such data-intensive applications and services are relying on clouds in order to process and manage the enormous amounts of data required for continuous operation. It can be difficult to decide which of the many options for cloud processing is suitable for a given application; the aim of this paper is therefore to provide an interested user with an overview of the most important concepts of cloud computing as it relates to processing of Big Data.
How to effectively reduce the energy consumption of large-scale data centers is a key issue in cloud computing. This paper presents a novel low-power task scheduling algorithm (L3SA) for large-scale cloud data centers. A winner tree is introduced in which the data nodes form the leaf nodes of the tree, and the final winner is selected with the aim of reducing energy consumption. The complexity of large-scale cloud data centers is fully considered, and a task comparison coefficient is defined to make the task scheduling strategy more reasonable. Experiments and performance analysis show that the proposed algorithm can effectively improve node utilization and reduce the overall power consumption of the cloud data center.
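A hedged toy of the winner-tree selection step is sketched below: data nodes sit at the leaves with a placeholder energy/utilisation cost, internal nodes propagate the cheaper child upward, and the "final winner" for the next task is read off the root. The task comparison coefficient and the full scheduling policy are not reproduced.

```python
# A hedged winner (tournament) tree selection sketch for L3SA-style scheduling.
def build_winner_tree(costs):
    """costs: list of (node_id, cost). Returns the tree as a list of levels."""
    level = list(costs)
    tree = [level]
    while len(level) > 1:
        nxt = []
        for i in range(0, len(level), 2):
            pair = level[i:i + 2]
            nxt.append(min(pair, key=lambda nc: nc[1]))   # cheaper child wins
        level = nxt
        tree.append(level)
    return tree

def winner(tree):
    return tree[-1][0]          # (node_id, cost) at the root

if __name__ == "__main__":
    leaf_costs = [("node-a", 0.82), ("node-b", 0.35), ("node-c", 0.61), ("node-d", 0.47)]
    print("task goes to", winner(build_winner_tree(leaf_costs)))   # node-b
```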
The tremendous growth of cloud computing environments requires a new architecture for security services. Cloud computing is the utilization of many servers/data centers or cloud data storages (CDSs) housed in many different locations and interconnected by high-speed networks. CDS, like any other emerging technology, is experiencing growing pains: it is immature, it is fragmented, and it lacks standardization. Although security issues are delaying its fast adoption, cloud computing is an unstoppable force, and we need to provide security mechanisms to ensure its secure adoption. In this paper, a comprehensive security framework based on a Multi-Agent System (MAS) architecture for CDS is proposed to facilitate confidentiality, correctness assurance, availability and integrity of users' data in the cloud. Our security framework consists of two main layers: an agent layer and a CDS layer. The proposed MAS architecture includes five main types of agents: Cloud Service Provider Agent (CSPA), Cloud Data Confidentiality Agent (CDConA), Cloud Data Correctness Agent (CDCorA), Cloud Data Availability Agent (CDAA) and Cloud Data Integrity Agent (CDIA). To verify the proposed security framework, a pilot study was conducted using a questionnaire survey, and Rasch methodology was used to analyze the pilot data. Item reliability was found to be poor, and a few respondents and items were identified as misfits with distorted measurements. As a result, some problematic questions were revised and some predictably easy questions were excluded from the questionnaire. A prototype of the system is implemented in Java. To simulate the agents, Oracle database packages and triggers are used to implement agent functions, and Oracle jobs are utilized to create agents.
Tunnel deformation monitoring is a crucial task for evaluating tunnel stability during the metro operation period. Terrestrial Laser Scanning (TLS), an innovative technique, can collect high-density and high-accuracy point cloud data in a few minutes, which makes it promising for tunnel deformation monitoring. Here, an efficient method for extracting tunnel cross-sections and performing convergence analysis using dense TLS point cloud data is proposed. First, the tunnel orientation is determined using principal component analysis (PCA) in the Euclidean plane. Two control points are introduced to detect and remove unsuitable points by using point cloud division, and the ground points are then removed by defining an elevation band of 0.5 m. Next, a z-score method is introduced to detect and remove outliers. Because the standard shape of a tunnel cross-section is round, circle fitting is implemented using the least-squares method. Afterward, the convergence analysis is performed at angles of 0°, 30° and 150°. The feasibility of the proposed approach is tested on a TLS point cloud of a Nanjing subway tunnel acquired with a FARO X330 laser scanner. The results indicate that the proposed methodology achieves an overall accuracy of 1.34 mm, which is in agreement with measurements acquired by a total station instrument. The proposed methodology provides new insights and references for the application of TLS in tunnel deformation monitoring and can also be extended to other engineering applications.
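The least-squares circle fit can be written as a small linear problem, sketched below in the algebraic (Kasa) formulation: solve for the centre and radius, after which radial residuals give the convergence values at the chosen angles. The PCA slicing, point filtering and z-score outlier removal steps are not reproduced, and the synthetic cross-section in the example is a stand-in for real scan data.

```python
# A hedged least-squares (Kasa) circle fit for a tunnel cross-section.
import numpy as np

def fit_circle(x: np.ndarray, y: np.ndarray):
    """Solve [2x 2y 1][a b c]^T = x^2 + y^2 for centre (a, b) and radius."""
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x ** 2 + y ** 2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    radius = np.sqrt(c + a ** 2 + b ** 2)
    return a, b, radius

if __name__ == "__main__":
    theta = np.linspace(0, 2 * np.pi, 360, endpoint=False)
    # Synthetic cross-section: radius 2.75 m, centre (1.0, -0.5), 1 mm noise.
    x = 1.0 + 2.75 * np.cos(theta) + np.random.normal(0, 0.001, theta.size)
    y = -0.5 + 2.75 * np.sin(theta) + np.random.normal(0, 0.001, theta.size)
    cx, cy, r = fit_circle(x, y)
    print(f"centre=({cx:.3f}, {cy:.3f}) m, radius={r:.3f} m")
```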
Cloud computing is a set of Information Technology services offered to users over the web on a rental basis. Such services enable organizations to scale their in-house infrastructure up or down. Generally, cloud services are provided by a third-party supplier who owns the infrastructure. Cloud computing has many advantages, such as flexibility, efficiency, scalability, integration, and capital reduction. Moreover, it provides an advanced virtual space for organizations to deploy their applications or run their operations. Despite the possible benefits of cloud computing services, organizations are reluctant to invest in cloud computing, mainly due to security concerns. Security is one of the main challenges that hinder the growth of cloud computing. At the same time, service providers strive to reduce the risks over the clouds and increase their reliability in order to build mutual trust between themselves and cloud customers. Various security issues and challenges are discussed in this research, and possible opportunities are stated.