Well logging technology has accumulated a large amount of historical data through four generations of technological development, which forms the basis of well logging big data and digital assets. However, the value of these data has not been well stored, managed and mined. The development of cloud computing technology provides a rare opportunity for a logging big data private cloud. The traditional petrophysical evaluation and interpretation model has encountered great challenges in the face of new evaluation objects, and research on integrating distributed storage, processing and learning functions in a logging big data private cloud has not yet been carried out. This work establishes a distributed logging big data private cloud platform centered on a unified learning model, which achieves the distributed storage and processing of logging big data and facilitates the learning of novel knowledge patterns via a unified logging learning model integrating physical simulation and data models in a large-scale function space, thus resolving the geo-engineering evaluation problem of geothermal fields. Based on the research idea of "logging big data cloud platform - unified logging learning model - large function space - knowledge learning & discovery - application", the theoretical foundation of the unified learning model, the cloud platform architecture, data storage and learning algorithms, computing power allocation and platform monitoring, platform stability, and data security are analyzed. The designed logging big data cloud platform realizes parallel distributed storage and processing of data and learning algorithms. The feasibility of constructing a well logging big data cloud platform based on a unified learning model of physics and data is analyzed in terms of the structure, ecology, management and security of the cloud platform. The case study shows that the logging big data cloud platform has obvious technical advantages over traditional logging evaluation methods in terms of knowledge discovery methods, data, software and results sharing, accuracy, speed and complexity.
Airborne LiDAR (Light Detection and Ranging) is an evolving active remote sensing technology that can acquire large-area topographic data and quickly generate DEM (Digital Elevation Model) products. Combined with image data, it can further enrich and extract spatial geographic information. In practice, however, due to the limited operating range of airborne LiDAR and the large area of a survey task, the point clouds of adjacent flight strips must be registered and stitched, so as to eliminate gross errors and effectively reduce the systematic errors in the data. This paper therefore studies point cloud registration methods in urban building areas, aiming to improve the accuracy and processing efficiency of airborne LiDAR data. An improved post-ICP (Iterative Closest Point) point cloud registration method is proposed to achieve accurate registration and efficient stitching of point clouds, providing potential technical support for applications in related fields.
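The abstract does not specify the internals of the improved post-ICP method, but the classical point-to-point ICP loop it builds on is standard. Below is a minimal, hedged sketch in Python (NumPy/SciPy only); the iteration limit, tolerance and array shapes are illustrative assumptions, not values from the paper:

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(source, target, max_iter=50, tol=1e-6):
    """Minimal point-to-point ICP: align `source` (Nx3) onto `target` (Mx3)."""
    src = source.copy()
    tree = cKDTree(target)                    # nearest-neighbour index over the target
    R_total, t_total = np.eye(3), np.zeros(3)
    prev_err = np.inf
    for _ in range(max_iter):
        dists, idx = tree.query(src)          # closest target point for each source point
        matched = target[idx]
        # Optimal rigid transform for these correspondences (Kabsch, via SVD).
        c_src, c_tgt = src.mean(axis=0), matched.mean(axis=0)
        H = (src - c_src).T @ (matched - c_tgt)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
        R = Vt.T @ D @ U.T
        t = c_tgt - R @ c_src
        src = src @ R.T + t                   # apply the increment
        R_total, t_total = R @ R_total, R @ t_total + t
        err = dists.mean()
        if abs(prev_err - err) < tol:         # stop when the mean error stabilizes
            break
        prev_err = err
    return R_total, t_total, src
```

In the strip-stitching scenario above, `source` and `target` would be overlapping subsets of adjacent flight strips, typically after a coarse initial alignment from the GNSS/IMU trajectory.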
Anomaly detection is an important task for maintaining the performance of a cloud data center. Traditional anomaly detection primarily examines individual Virtual Machine (VM) behavior, neglecting the impact of interactions among multiple VMs on Key Performance Indicator (KPI) data, e.g., memory utilization. Furthermore, the non-stationarity, high complexity, and uncertain periodicity of KPI data in VMs also bring difficulties to deep learning-based anomaly detection tasks. To address these challenges, this paper proposes MCBiWGAN-GTN, a multi-channel semi-supervised time series anomaly detection algorithm based on the Bidirectional Wasserstein Generative Adversarial Network with Graph-Time Network (BiWGAN-GTN) and the Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (CEEMDAN). (a) The BiWGAN-GTN algorithm is proposed to extract spatiotemporal information from data. (b) The loss function of BiWGAN-GTN is redesigned to solve the abnormal data intrusion problem during the training process. (c) MCBiWGAN-GTN reduces data complexity through CEEMDAN time series decomposition and uses BiWGAN-GTN to train the different components. (d) To adapt the proposed algorithm to an entire cloud data center, an anomaly detection framework based on Swarm Learning (SL) is designed. Evaluation results on a real-world cloud data center dataset show that MCBiWGAN-GTN outperforms the baseline, with an F1-score of 0.96, an accuracy of 0.935, a precision of 0.954, a recall of 0.967, and an FPR of 0.203. The experiments also verify the stability of MCBiWGAN-GTN, the impact of parameter configurations, and the effectiveness of the proposed SL framework.
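As a hedged illustration of the CEEMDAN decomposition step only (not of BiWGAN-GTN itself), the sketch below uses the open-source PyEMD package (distributed on PyPI as `EMD-signal`); the KPI series is synthetic and the `trials` setting is an assumption:

```python
import numpy as np
from PyEMD import CEEMDAN   # from the "EMD-signal" pip package

# Synthetic stand-in for a non-stationary KPI series (e.g., memory utilization).
t = np.linspace(0, 1, 512)
kpi = (np.sin(2 * np.pi * 8 * t)
       + 0.5 * np.sin(2 * np.pi * 50 * t)
       + 0.2 * np.random.randn(t.size))

ceemdan = CEEMDAN(trials=50)   # number of noise-assisted realizations (assumed setting)
imfs = ceemdan(kpi)            # rows are intrinsic mode functions, high-frequency first

print(imfs.shape)              # (n_imfs, 512): each row is one simpler component
# In MCBiWGAN-GTN, each component (plus the residue) would be modeled by a
# separately trained BiWGAN-GTN and the per-component scores fused.
```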
With cloud computing, large chunks of data can be handled at a small cost. However, there are reservations regarding the security and privacy of stored cloud data. To solve these issues and enhance cloud computing security, this research provides a Three-Layered Security Access model (TLSA) combining an intrusion detection mechanism, an access control mechanism, and a data encryption system. The TLSA underlines the need for the protection of sensitive data. The proposed approach starts with Layer 1 data encryption using the Advanced Encryption Standard (AES). For data transfer and storage, this encryption guarantees the data's authenticity and secrecy. Specifically, the solution employs the AES algorithm to secure essential data before storing it in the Cloud, minimizing unauthorized access. Role-based access control (RBAC) implements the second level, ensuring that only specific personnel access certain data and resources. In RBAC, each user is assigned a specific role and permission, so only permitted users can access the corresponding data stored in the Cloud. This layer provides granular filtering of data access, reducing the risk that unintended data is exposed in the process. Layer 3 deals with intrusion detection systems (IDS), which detect and quickly respond to malicious actions and intrusion attempts. The proposed TLSA security model for e-commerce thus includes conventional levels of security, such as encryption and access control, and incorporates an intrusion detection system. This method offers an integrated solution for the most typical security issues of cloud computing, including data secrecy, access control, and threats. An extensive performance test was carried out to confirm the efficiency of the proposed three-tier security method. Comparisons have been made with state-of-the-art techniques, including DES, RSA, and DUAL-RSA, taking into account Accuracy, QILV, F-Measure, Sensitivity, MSE, PSNR, SSIM, computation time, encryption time, and decryption time. The proposed TLSA method provides an accuracy of 89.23%, an F-Measure of 0.876, and an SSIM of 0.8564 at a computation time of 5.7 s. A comparison with existing methods shows the better performance of the proposed method, confirming its enhanced ability to address security issues in cloud computing.
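The abstract names AES for Layer 1 but not the mode of operation or key size. The following minimal sketch assumes AES-256 in GCM mode via the Python `cryptography` package, which also yields the authenticity property mentioned above:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_for_cloud(plaintext: bytes, key: bytes) -> bytes:
    """AES-GCM gives confidentiality plus an authenticity tag in one pass."""
    nonce = os.urandom(12)                       # 96-bit nonce, unique per message
    ct = AESGCM(key).encrypt(nonce, plaintext, None)
    return nonce + ct                            # store the nonce alongside the ciphertext

def decrypt_from_cloud(blob: bytes, key: bytes) -> bytes:
    nonce, ct = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ct, None)  # raises InvalidTag if tampered with

key = AESGCM.generate_key(bit_length=256)        # key management is out of scope here
blob = encrypt_for_cloud(b"order #1001: card ending 4242", key)
assert decrypt_from_cloud(blob, key) == b"order #1001: card ending 4242"
```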
The development of machine learning and deep learning algorithms, as well as the improvement of hardware computing power, provides a rare opportunity for a logging big data private cloud. With the deepening of exploration and development and the requirements of low-carbon development, the focus of the oil and gas industry is gradually shifting to the exploration and development of deep sea, deep earth and renewable energy sources such as geothermal energy. The traditional petrophysical evaluation and interpretation model has encountered great challenges in the face of new evaluation objects. This work establishes a distributed logging big data private cloud platform with a unified learning model as the key, which realizes the distributed storage and processing of logging big data and enables the learning of brand-new knowledge patterns from multi-attribute data in a large function space within a unified logging learning model integrating expert knowledge and data models, so as to solve the problem of geo-engineering evaluation of geothermal fields. Based on the research idea of "logging big data cloud platform - unified logging learning model - large function space - knowledge learning & discovery - application", the theoretical foundation of the unified learning model, the cloud platform architecture, data storage and learning algorithms, computing power allocation and platform monitoring, platform stability, and data security are analyzed. The designed logging big data cloud platform realizes parallel distributed storage and processing of data and learning algorithms. New knowledge of geothermal evaluation is found in a large function space and applied to the geo-engineering evaluation of geothermal fields. The examples show good application of the cloud platform's remote logging module in the selection of logging series in geothermal fields, quality control of logging data, identification of complex lithology in geothermal fields, evaluation of reservoir fluids, checking of associated helium, evaluation of cementing quality, evaluation of well-side fractures, and evaluation of geothermal water recharge. The first and second cementing interfaces of cemented wells in geothermal fields were evaluated, as well as the development and extension orientation of well-side distal fractures. Because well-side fracture communication forms a good fluid pathway, and the thermal storage fissure system has a large flow rate and a long flow diameter, these results support the design of the geothermal water recharge program.
Model reconstruction from points scanned on existing physical objects is very important in a variety of situations, such as reverse engineering for mechanical products, computer vision, and the recovery of biological shapes from two-dimensional contours. With the development of measuring equipment, point clouds that contain more details of the object can be obtained conveniently. On the other hand, the large quantity of sampled points brings difficulties to model reconstruction methods. This paper first presents an algorithm to automatically reduce the number of cloud points under a given tolerance. A triangle mesh surface is then reconstructed from the simplified data set by the marching cubes algorithm. For various reasons, the reconstructed mesh usually contains unwanted holes. An approach is proposed to create new triangles with optimized shapes that cover the unexpected holes in the triangle mesh. After hole filling, a watertight triangle mesh can be directly output in STL format, which is widely used in rapid prototype manufacturing. Practical examples are included to demonstrate the method.
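The paper's tolerance-driven point reduction algorithm is not detailed in the abstract; a common baseline with the same intent is voxel-grid decimation, where the voxel edge length acts as the tolerance. A minimal NumPy sketch (illustrative, not the authors' algorithm):

```python
import numpy as np

def decimate(points: np.ndarray, tol: float) -> np.ndarray:
    """Reduce a point cloud to roughly one point per `tol`-sized voxel.

    `points` is an (N, 3) array; the centroid of each occupied voxel is kept,
    so no surviving point lies farther than ~tol*sqrt(3)/2 from an input point.
    """
    keys = np.floor(points / tol).astype(np.int64)            # voxel index per point
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = inverse.ravel()
    n_voxels = inverse.max() + 1
    counts = np.bincount(inverse, minlength=n_voxels)
    sums = np.zeros((n_voxels, 3))
    for dim in range(3):                                      # per-voxel centroids
        sums[:, dim] = np.bincount(inverse, weights=points[:, dim], minlength=n_voxels)
    return sums / counts[:, None]

cloud = np.random.rand(100_000, 3)       # stand-in for scanned data
print(decimate(cloud, tol=0.05).shape)   # far fewer points, bounded geometric error
```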
A new object-oriented method has been developed for the extraction of Mars rocks from Mars rover data. It is based on a combination of Mars rover imagery and 3D point cloud data. First, Navcam or Pancam images taken by the Mars rovers are segmented into homogeneous objects with a mean-shift algorithm. Then, the objects in the segmented images are classified into small rock candidates, rock shadows, and large objects. Rock shadows and large objects are considered as the regions within which large rocks may exist. In these regions, large rock candidates are extracted through ground-plane fitting with the 3D point cloud data. Small and large rock candidates are combined and postprocessed to obtain the final rock extraction results. The shape properties of the rocks (angularity, circularity, width, height, and width-height ratio) have been calculated for subsequent geological studies.
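As a rough sketch of the ground-plane fitting step (without the outlier handling a real pipeline would need), the following fits a least-squares plane to the point cloud and keeps points protruding above it as large rock candidates; all data and thresholds are synthetic assumptions:

```python
import numpy as np

def fit_ground_plane(pts: np.ndarray):
    """Least-squares plane z = a*x + b*y + c through (N, 3) points."""
    A = np.c_[pts[:, 0], pts[:, 1], np.ones(len(pts))]
    coeffs, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    return coeffs                                    # (a, b, c)

def rock_candidates(pts: np.ndarray, min_height: float = 0.05):
    """Points protruding more than `min_height` above the fitted plane."""
    a, b, c = fit_ground_plane(pts)
    residual = pts[:, 2] - (a * pts[:, 0] + b * pts[:, 1] + c)
    return pts[residual > min_height]

ground = np.random.rand(5000, 3) * [10, 10, 0.02]    # nearly flat terrain patch
rock = np.random.rand(200, 3) * [0.5, 0.5, 0.4] + [4, 4, 0]
print(len(rock_candidates(np.vstack([ground, rock]))))
```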
Advanced cloud computing technology provides cost savings and flexibility of services for users. With the explosion of multimedia data, more and more data owners outsource their personal multimedia data to the cloud. In the meantime, some computationally expensive tasks are also undertaken by cloud servers. However, the outsourced multimedia data and its applications may reveal the data owner's private information, because the data owners lose control of their data. Recently, this concern has aroused new research interest in privacy-preserving reversible data hiding over outsourced multimedia data. In this paper, two reversible data hiding schemes are proposed for encrypted image data in cloud computing: reversible data hiding by homomorphic encryption and reversible data hiding in the encrypted domain. In the former, the additional bits are extracted after decryption; in the latter, they are extracted before decryption. A combined scheme is also designed. This paper proposes a privacy-preserving outsourcing scheme for reversible data hiding over encrypted image data in cloud computing, which not only ensures multimedia data security without relying on the trustworthiness of cloud servers, but also guarantees that reversible data hiding can be operated over encrypted images at the different stages. Theoretical analysis confirms the correctness of the proposed encryption model and justifies the security of the proposed scheme. The computation cost of the proposed scheme is acceptable and adjusts to different security levels.
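The abstract does not name the homomorphic cryptosystem used. As a hedged illustration of the additive homomorphic property that such reversible data hiding schemes exploit, here is the Paillier scheme via the `phe` (python-paillier) package; the pixel value and embedding rule are toy assumptions:

```python
from phe import paillier   # pip install phe; additively homomorphic Paillier scheme

public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)

pixel = 142                          # a cover-image sample, kept secret from the cloud
enc_pixel = public_key.encrypt(pixel)

# The cloud can embed a payload bit by arithmetic on ciphertexts alone:
enc_marked = enc_pixel + 1           # e.g., +1 encodes bit "1" in this toy rule

# Only the key holder sees plaintexts, and the mark is removable because the
# embedding operation is known and invertible (the "reversible" property).
assert private_key.decrypt(enc_marked) == pixel + 1
assert private_key.decrypt(enc_marked - 1) == pixel
```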
Due to the increasing number of cloud applications, the amount of data in the cloud is growing faster than ever before. The nature of cloud computing requires cloud data processing systems that can handle huge volumes of data with high performance. However, most current cloud storage systems adopt a hash-like approach to retrieving data that only supports simple keyword-based enquiries and lacks other forms of information search. Therefore, a scalable and efficient indexing scheme is clearly required. In this paper, we present the SLC-index, a novel, scalable skip list-based index for cloud data processing. The SLC-index offers a two-layered architecture for extending the indexing scope and facilitating better throughput. Dynamic load-balancing for the SLC-index is achieved by online migration of index nodes between servers. Furthermore, the system is flexible, supporting dynamic addition and removal of servers. The SLC-index is efficient for both point and range queries. Experimental results show the efficiency of the SLC-index and its usefulness as an alternative approach for cloud-suitable data structures.
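For readers unfamiliar with the underlying structure, here is a toy single-machine skip list in Python. The SLC-index's actual contribution — the two-layered, distributed, load-balanced variant — is not reproduced here, and all names are illustrative:

```python
import random

class Node:
    __slots__ = ("key", "value", "forward")
    def __init__(self, key, value, level):
        self.key, self.value = key, value
        self.forward = [None] * level              # one next-pointer per level

class SkipList:
    """Toy in-memory skip list; SLC-index layers and shards this idea across servers."""
    MAX_LEVEL, P = 16, 0.5

    def __init__(self):
        self.head = Node(None, None, self.MAX_LEVEL)
        self.level = 1

    def _random_level(self):
        lvl = 1
        while random.random() < self.P and lvl < self.MAX_LEVEL:
            lvl += 1
        return lvl

    def search(self, key):
        node = self.head
        for i in reversed(range(self.level)):      # descend from the top level
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
        node = node.forward[0]
        return node.value if node and node.key == key else None

    def insert(self, key, value):
        update, node = [self.head] * self.MAX_LEVEL, self.head
        for i in reversed(range(self.level)):
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node                       # rightmost node before `key`, per level
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = Node(key, value, lvl)
        for i in range(lvl):                       # splice the new node in
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new

sl = SkipList()
for k in [30, 10, 20]:
    sl.insert(k, f"row-{k}")
print(sl.search(20))    # row-20; a range query would walk forward[0] from here
```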
To reduce energy consumption in cloud data centres, this paper proposes two algorithms: the Energy-aware Scheduling algorithm using Workload-aware Consolidation Technique (ESWCT) and the Energy-aware Live Migration algorithm using Workload-aware Consolidation Technique (ELMWCT). As opposed to traditional energy-aware scheduling algorithms, which often focus on only a one-dimensional resource, the two algorithms are based on the fact that multiple resources (such as CPU, memory and network bandwidth) are shared by users concurrently in cloud data centres and that heterogeneous workloads have different resource consumption characteristics. Both algorithms investigate the problem of consolidating heterogeneous workloads: they try to execute all Virtual Machines (VMs) with the minimum number of Physical Machines (PMs), and then power off unused physical servers to reduce power consumption. Simulation results show that both algorithms efficiently utilise the resources in cloud data centres and balance the utilization of multidimensional resources well, demonstrating their promising energy-saving capability.
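The ESWCT/ELMWCT heuristics are not spelled out in the abstract. A hedged sketch of the general idea — multi-dimensional vector bin packing of VM demands onto as few PMs as possible, here via first-fit decreasing — could look like this, with all capacities and demands assumed:

```python
import numpy as np

def consolidate(vm_demands, pm_capacity):
    """First-fit-decreasing on multi-dimensional (CPU, mem, net) demands.

    `vm_demands`: list of per-VM demand vectors; `pm_capacity`: capacity vector.
    Returns a list of PMs, each holding a list of VM indices. PMs that are
    never opened stay powered off, which is where the energy saving comes from.
    """
    # Pack the VMs with the largest dominant resource share first.
    order = np.argsort([-np.max(np.asarray(d) / pm_capacity) for d in vm_demands])
    pms, loads = [], []
    for i in order:
        d = np.asarray(vm_demands[i], dtype=float)
        for j, load in enumerate(loads):
            if np.all(load + d <= pm_capacity):   # fits in an already-open PM
                loads[j] = load + d
                pms[j].append(int(i))
                break
        else:                                     # otherwise open a new PM
            loads.append(d.copy())
            pms.append([int(i)])
    return pms

vms = [(2, 4, 1), (1, 8, 2), (4, 2, 1), (1, 1, 1)]  # (CPU, GiB, Gbps) per VM
print(consolidate(vms, np.array([8, 16, 10])))      # VM index groups per PM
```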
Security is an essential part of the cloud environment. To ensure the security of data communicated to and from the cloud server, a significant parameter called trust was introduced. Trust-based security plays a vital role in ensuring that communication between cloud users and service providers remains unadulterated and authentic. In most cloud-based data distribution environments, emphasis is placed on accepting trusted client users' requests, but the cloud servers' integrity is seldom verified. This paper designs a trust-based access control model based on user and server characteristics in a multi-cloud environment to address this issue. The proposed methodology consists of data encryption using the Cyclic Shift Transposition Algorithm and a trust-based access control method. In this framework, trust values are assigned to cloud users using direct trust degrees. The direct trust degree is estimated from the following metrics: success and failure rate of interactions, service satisfaction index, and dishonesty level. In addition, trust values are assigned to cloud servers based on server load, service rejection rate, and service access delay. The role-based access control policy of each user is modified based on their trust level. If a server fails to meet the minimum trust level, another suitable server is selected. The proposed system is found to outperform other existing systems in a multi-cloud environment.
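The exact aggregation of these trust metrics is not given in the abstract; a minimal sketch assuming a simple weighted linear combination (weights and threshold are placeholder assumptions, not the paper's formulas) might be:

```python
def user_trust(success, failure, satisfaction, dishonesty, w=(0.4, 0.3, 0.3)):
    """Direct trust degree from the user-side metrics (weights assumed).

    success/failure: interaction counts; satisfaction, dishonesty: values in [0, 1].
    """
    interactions = success + failure
    success_rate = success / interactions if interactions else 0.0
    return w[0] * success_rate + w[1] * satisfaction + w[2] * (1.0 - dishonesty)

def server_trust(load, rejection_rate, access_delay, max_delay, w=(0.4, 0.3, 0.3)):
    """Server-side trust: lower load, fewer rejections and less delay mean more trust."""
    return (w[0] * (1.0 - load)
            + w[1] * (1.0 - rejection_rate)
            + w[2] * (1.0 - min(access_delay / max_delay, 1.0)))

MIN_TRUST = 0.6                       # threshold assumed for illustration
if server_trust(0.9, 0.1, 120, 500) < MIN_TRUST:
    print("select another server")    # the fallback the abstract describes
```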
The dissociation between data management and data ownership makes it difficult to protect data security and privacy in cloud storage systems, and traditional encryption technologies are not suitable for data protection there. A novel multi-authority proxy re-encryption mechanism based on ciphertext-policy attribute-based encryption (MPRE-CPABE) is proposed for cloud storage systems. MPRE-CPABE requires the data owner to split each file into two blocks, one big and one small. The small block is used as the private key to encrypt the big one, and the encrypted big block is then uploaded to the cloud storage system. Even if the uploaded big block of a file is stolen, illegal users cannot easily obtain the complete information of the file. Ciphertext-policy attribute-based encryption (CPABE) is often criticized for its heavy overhead and security issues when distributing keys or revoking users' access rights. MPRE-CPABE applies CPABE to the multi-authority cloud storage system and solves these issues. A weighted access structure (WAS) is proposed to support a variety of fine-grained threshold access control policies in multi-authority environments and to reduce the computational cost of key distribution. Meanwhile, MPRE-CPABE uses proxy re-encryption to reduce the computational cost of access revocation. Experiments are implemented on the Ubuntu and CloudSim platforms. Experimental results show that MPRE-CPABE can greatly reduce the computational cost of generating key components and of revoking users' access rights. MPRE-CPABE is also proved secure under the decisional bilinear Diffie-Hellman (DBDH) security model.
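MPRE-CPABE's full construction (CP-ABE, weighted access structures, proxy re-encryption) is beyond a short example, but the file-splitting idea — the small block keying the big block — can be mimicked with ordinary primitives. The sketch below stands in SHA-256 key derivation and AES-GCM for the scheme's actual machinery, purely as an analogy:

```python
import hashlib
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def split_and_encrypt(data: bytes, small_len: int = 32):
    """Split the file; the small block keys the big block (analogy to MPRE-CPABE).

    Only the encrypted big block goes to the cloud; the small block stays under
    the owner's (attribute-policy-protected) control, so a stolen big block
    alone reveals nothing useful.
    """
    small, big = data[:small_len], data[small_len:]
    key = hashlib.sha256(small).digest()          # derive an AES-256 key from the small block
    nonce = os.urandom(12)
    uploaded = nonce + AESGCM(key).encrypt(nonce, big, None)
    return small, uploaded

def recover(small: bytes, uploaded: bytes) -> bytes:
    key = hashlib.sha256(small).digest()
    big = AESGCM(key).decrypt(uploaded[:12], uploaded[12:], None)
    return small + big

small, blob = split_and_encrypt(b"contract text " * 100)
assert recover(small, blob) == b"contract text " * 100
```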
Intellectualization has become a new trend in the telecom industry, driven by intelligent technology including cloud computing, big data, and the Internet of Things. To satisfy the service demand of intelligent logistics, this paper designs an intelligent logistics platform containing the main applications, such as e-commerce, self-service transceivers, big data analysis, path location and distribution optimization. The intelligent logistics service platform is built on cloud computing to collect, store and handle multi-source heterogeneous mass data from sensors, RFID electronic tags, vehicle terminals and apps, so that open-access cloud services including distribution, positioning, navigation, scheduling and other data services can be provided for logistics distribution applications. The architecture of the intelligent logistics cloud platform, containing a software layer (SaaS), a platform layer (PaaS) and infrastructure (IaaS), is then constructed around the core technologies of high-concurrency processing, heterogeneous terminal data access, encapsulation and data mining. The intelligent logistics cloud platform can thus be implemented in a service mode to accelerate the construction of a symbiotic, win-win logistics ecological system and the benign development of the ICT industry amid the trend of intellectualization in China.
Many organizations apply cloud computing to store and effectively process data for various applications. However, data uploaded to the cloud by users has limited security due to the unreliable verification process for data integrity. In this research, an enhanced Merkle hash tree method for an effective authentication model is proposed in the multi-owner cloud to increase the security of cloud data. The Merkle hash tree labels the leaf nodes with hash tags, and each non-leaf node contains a table of the hash information of its children, to encrypt large data. The Merkle hash tree provides efficient mapping of data and, owing to its proper structure, easily identifies changes made to the data. The developed model supports privacy-preserving public auditing to provide a secure cloud storage system. Data owners upload data to the cloud and edit it using their private keys. The enhanced Merkle hash tree method stores the data in the cloud server and splits it into batches. The data files requested by the data owner are audited by a third-party auditor, and the multi-owner authentication method is applied during the modification process to authenticate the user. The results show that the proposed method reduces the encryption and decryption time for cloud data storage by 2-167 ms compared with the existing Advanced Encryption Standard and Blowfish.
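The enhancement in this paper (batch splitting and multi-owner authentication) is not reproduced below, but the core structure it builds on can be sketched in a few lines; SHA-256 and the duplicate-last-node rule for odd levels are common conventions assumed here:

```python
import hashlib

def _h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Root hash over data batches; any edit to any batch changes the root."""
    level = [_h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate the last node on odd levels
            level.append(level[-1])
        level = [_h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

batches = [b"batch-0", b"batch-1", b"batch-2", b"batch-3"]
root = merkle_root(batches)

batches[2] = b"batch-2 tampered"           # the auditor recomputes and compares
assert merkle_root(batches) != root        # the integrity violation is detected
```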
With the rapid development of reality capture methods, such as laser scanning and oblique photogrammetry, point cloud data have become the third most important data source, after vector maps and imagery. Point cloud data also play an increasingly important role in scientific research and engineering in the fields of Earth science, spatial cognition, and smart cities. However, how to acquire high-quality three-dimensional (3D) geospatial information from point clouds has become a scientific frontier, for which there is an urgent demand in the fields of surveying and mapping, as well as geoscience applications. To address these challenges, point cloud intelligence came into being. This paper summarizes the state-of-the-art of point cloud intelligence with regard to acquisition equipment, intelligent processing, scientific research, and engineering applications. For this purpose, we refer to a recent project on the hybrid georeferencing of images and LiDAR data for high-quality point cloud collection, as well as a current benchmark for the semantic segmentation of high-resolution 3D point clouds. These projects were conducted at the Institute for Photogrammetry of the University of Stuttgart, which was initially headed by the late Prof. Ackermann. Finally, the development prospects of point cloud intelligence are summarized.
This study concerns a Ka-band solid-state transmitter cloud radar, made in China, which can operate in three different work modes, with different pulse widths and coherent and incoherent integration numbers, to meet the requirements of cloud remote sensing over the Tibetan Plateau. Specifically, the design of the three operational modes of the radar (boundary mode M1, cirrus mode M2, and precipitation mode M3) is introduced, and a cloud radar data merging algorithm for the three modes is proposed. Using one month of continuous summertime measurements at Naqu on the Tibetan Plateau, we analyzed the consistency between the cloud radar measurements of the three modes. The number of occurrences of radar detections of hydrometeors and the percentage contributions of the different modes' data to the merged data were estimated, and the performance of the merging algorithm was evaluated. The results indicate that the minimum detectable reflectivity for each mode was consistent with theoretical results. The merged data provided measurements with a minimum reflectivity of -35 dBZ at a height of 5 km and captured information above a height of 0.2 km. Measurements of radial velocity by the three operational modes agreed very well, and systematic errors in measurements of reflectivity were less than 2 dB. However, large discrepancies existed in the measurements of the linear depolarization ratio taken in the different operational modes. The percentage of radar detections of hydrometeors in mid- and high-level clouds increased by 60% through the application of pulse compression techniques. In conclusion, the merged data are appropriate for cloud and precipitation studies over the Tibetan Plateau.
Tunnel deformation monitoring is a crucial task for evaluating tunnel stability during the metro operation period. As an innovative technique, Terrestrial Laser Scanning (TLS) can collect high-density and high-accuracy point cloud data in a few minutes, which offers promising applications in tunnel deformation monitoring. Here, an efficient method for extracting tunnel cross-sections and performing convergence analysis using dense TLS point cloud data is proposed. First, the tunnel orientation is determined using principal component analysis (PCA) in the Euclidean plane. Two control points are introduced to detect and remove unsuitable points by point cloud division, and the ground points are then removed by defining an elevation band of 0.5 m. Next, a z-score method is introduced to detect and remove outliers. Because the standard shape of a tunnel cross-section is circular, circle fitting is implemented using the least-squares method. Afterward, convergence analysis is performed at angles of 0°, 30° and 150°. The feasibility of the proposed approach is tested on a TLS point cloud of a Nanjing subway tunnel acquired with a FARO X330 laser scanner. The results indicate that the proposed methodology achieves an overall accuracy of 1.34 mm, which agrees with measurements acquired by a total station instrument. The proposed methodology provides new insights and references for the application of TLS to tunnel deformation monitoring and can also be extended to other engineering applications.
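As a hedged sketch of two of the steps above — least-squares circle fitting and z-score outlier filtering — the following uses the algebraic (Kasa) formulation; the section points, noise level and radius are synthetic assumptions, not the Nanjing data:

```python
import numpy as np

def fit_circle(x, y):
    """Algebraic (Kasa) least-squares circle fit: solve 2ax + 2by + c = x^2 + y^2."""
    A = np.c_[2 * x, 2 * y, np.ones_like(x)]
    rhs = x**2 + y**2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return (a, b), np.sqrt(c + a**2 + b**2)        # center, radius

def zscore_mask(values, z_max=2.5):
    """Boolean mask keeping values whose z-score magnitude is below z_max."""
    z = (values - values.mean()) / values.std()
    return np.abs(z) < z_max

# Synthetic cross-section: radius 2.75 m around (10, 3), with small scan noise.
theta = np.linspace(0, 2 * np.pi, 400)
x = 10.0 + 2.75 * np.cos(theta) + 0.002 * np.random.randn(theta.size)
y = 3.0 + 2.75 * np.sin(theta) + 0.002 * np.random.randn(theta.size)

center, radius = fit_circle(x, y)
mask = zscore_mask(np.hypot(x - center[0], y - center[1]))  # radial residual filter
center, radius = fit_circle(x[mask], y[mask])               # refit without outliers
print(center, radius)   # convergence analysis compares fitted radii across survey epochs
```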
Storage auditing and client-side deduplication techniques have been proposed to assure data integrity and improve storage efficiency, respectively. Recently, a few schemes have started to consider these two different aspects together. However, these schemes either only support plaintext data files or have been proved insecure. In this paper, we propose a public auditing scheme for cloud storage systems in which deduplication of encrypted data and data integrity checking can be achieved within the same framework. The cloud server can correctly check the ownership for new owners, and the auditor can correctly check the integrity of deduplicated data. Our scheme supports deduplication of encrypted data by using the method of proxy re-encryption and also achieves deduplication of data tags by aggregating the tags from different owners. The analysis and experimental results show that our scheme is provably secure and efficient.
Cloud computing technology is changing the development and usage patterns of IT infrastructure and applications. Virtualized and distributed systems, together with unified management and scheduling, have greatly improved computing and storage; management has become easier, and OAM costs have been significantly reduced. Cloud desktop technology is developing rapidly. With this technology, users can flexibly and dynamically use virtual machine resources, companies' efficiency in using and allocating resources is greatly improved, and information security is ensured. In most existing virtual cloud desktop solutions, however, computing and storage are bound together, and data is stored as image files. This limits the flexibility and expandability of such systems and is insufficient for meeting customers' requirements in different scenarios.