Journal Articles
10,554 articles found
1. Research on Airborne Point Cloud Data Registration Using Urban Buildings as an Example
Authors: Yajun Fan, Yujun Shi, Chengjie Su, Kai Wang. Journal of World Architecture, 2025, Issue 4, pp. 35-42 (8 pages)
Airborne LiDAR (Light Detection and Ranging) is an evolving high-tech active remote sensing technology that can acquire large-area topographic data and quickly generate DEM (Digital Elevation Model) products. Combined with image data, this technology can further enrich and extract spatial geographic information. In practice, however, because of the limited operating range of airborne LiDAR and the large extent of survey tasks, point clouds of adjacent flight strips must be registered and stitched, and gross errors must be eliminated so that the systematic errors in the data are effectively reduced. This paper therefore studies point cloud registration methods in urban building areas, aiming to improve the accuracy and processing efficiency of airborne LiDAR data. An improved post-ICP (Iterative Closest Point) point cloud registration method is proposed to achieve accurate registration and efficient stitching of point clouds, providing potential technical support for applications in related fields.
Keywords: Airborne LiDAR; point cloud registration; point cloud data processing; systematic error
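The registration described above builds on the classic ICP loop: alternately find closest-point correspondences and estimate a rigid transform. The sketch below is a minimal, generic point-to-point ICP in Python, not the paper's improved post-ICP method; the function name, array shapes, and convergence tolerance are illustrative assumptions.

```python
# Minimal point-to-point ICP sketch for two roughly aligned strips (Nx3 arrays).
import numpy as np
from scipy.spatial import cKDTree

def icp(source, target, iters=30, tol=1e-6):
    src = source.copy()
    tree = cKDTree(target)
    prev_err = np.inf
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(iters):
        dist, idx = tree.query(src)            # closest-point correspondences
        matched = target[idx]
        mu_s, mu_t = src.mean(0), matched.mean(0)
        H = (src - mu_s).T @ (matched - mu_t)  # cross-covariance
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:               # avoid reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_t - R @ mu_s
        src = src @ R.T + t                    # apply the incremental transform
        R_total, t_total = R @ R_total, R @ t_total + t
        err = dist.mean()
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return R_total, t_total
```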
2. PH-shape: an adaptive persistent homology-based approach for building outline extraction from ALS point cloud data
Authors: Gefei Kong, Hongchao Fan. Geo-Spatial Information Science, CSCD, 2024, Issue 4, pp. 1107-1117 (11 pages)
Building outline extraction from segmented point clouds is a critical step of building footprint generation. Existing methods for this task are often based on the convex hull and the α-shape algorithm; there are also methods using grids and Delaunay triangulation. The common challenge of these methods is the determination of proper parameters. While deep learning-based methods have shown promise in reducing the impact of and dependence on parameter selection, their reliance on datasets with ground truth information limits their generalization. In this study, a novel unsupervised approach, called PH-shape, is proposed to address this challenge. Persistent Homology (PH) and the Fourier descriptor are introduced into the task of building outline extraction. PH, from the theory of topological data analysis, supports the automatic and adaptive determination of a proper buffer radius, thus enabling parameter-adaptive extraction of building outlines through buffering and "inverse" buffering. Quantitative and qualitative experiment results on two datasets with different point densities demonstrate the effectiveness of the proposed approach for various building types, interior boundaries, and density variation within the point cloud of a single building. The PH-supported parameter adaptivity helps the proposed approach overcome the challenge of parameter determination and data variation and achieve reliable extraction of building outlines.
Keywords: building outline extraction; point cloud data; persistent homology; boundary tracing
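For context on the parameter sensitivity the abstract highlights, the sketch below shows a minimal 2D α-shape boundary extraction, the family of baseline methods PH-shape is compared against; the fixed `alpha` value is exactly the hand-tuned parameter that PH-shape aims to determine adaptively. This is a generic illustration, not the PH-shape algorithm.

```python
# Minimal 2D alpha-shape boundary sketch: keep Delaunay triangles with small
# circumradius and report edges that belong to exactly one kept triangle.
import numpy as np
from scipy.spatial import Delaunay

def alpha_shape_edges(points, alpha):
    """Return boundary edges (index pairs) of the alpha-shape of 2D points (Nx2)."""
    tri = Delaunay(points)
    edges = {}
    for ia, ib, ic in tri.simplices:
        pa, pb, pc = points[ia], points[ib], points[ic]
        a = np.linalg.norm(pb - pc)
        b = np.linalg.norm(pa - pc)
        c = np.linalg.norm(pa - pb)
        s = (a + b + c) / 2.0
        area = max(s * (s - a) * (s - b) * (s - c), 1e-12) ** 0.5
        circum_r = a * b * c / (4.0 * area)
        if circum_r < 1.0 / alpha:                 # triangle passes the alpha test
            for e in [(ia, ib), (ib, ic), (ic, ia)]:
                e = tuple(sorted(e))
                edges[e] = edges.get(e, 0) + 1
    return [e for e, count in edges.items() if count == 1]
```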
3. Analysis of Secured Cloud Data Storage Model for Information
Authors: Emmanuel Nwabueze Ekwonwune, Udo Chukwuebuka Chigozie, Duroha Austin Ekekwe, Georgina Chekwube Nwankwo. Journal of Software Engineering and Applications, 2024, Issue 5, pp. 297-320 (24 pages)
This paper was motivated by the existing problems of cloud data storage at Imo State University, Nigeria, where outsourced data led to data loss and misuse of customer information by unauthorized users or hackers, leaving customer/client data visible and unprotected. This also exposed clients/customers to enormous risk due to defective equipment, bugs, faulty servers, and suspicious actions. The aim of this paper, therefore, is to analyze a secure model using Unicode Transformation Format (UTF) Base64 algorithms for storing data securely in the cloud. Object-Oriented Hypermedia Analysis and Design Methodology (OOHADM) was adopted. Python was used to develop the security model; role-based access control (RBAC) and multi-factor authentication (MFA) algorithms were integrated to enhance security in the information system developed with HTML5, JavaScript, Cascading Style Sheets (CSS) version 3, and PHP 7. The paper also discusses concepts such as the development of cloud computing, characteristics of cloud computing, cloud deployment models, and cloud service models. The results showed that the proposed enhanced security model for corporate-platform information systems handles multiple authorization and authentication threats: a single login page directs all login requests from the different modules to one Single Sign-On Server (SSOS), which then redirects authenticated users to their requested resources/modules, leveraging geo-location integration for physical location validation. The newly developed system addresses the shortcomings of the existing systems and reduces the time and resources incurred in using them.
Keywords: cloud data; information model; data storage; cloud computing; security system; data encryption
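The model above encodes data with UTF-8 and Base64 before cloud storage. A minimal sketch of that encode/decode step is shown below; the `store_record`/`load_record` helpers and the in-memory store are illustrative placeholders, not the paper's Python implementation.

```python
# Minimal UTF-8 + Base64 encode/decode sketch around a stand-in storage backend.
import base64

_store = {}  # stand-in for the cloud storage backend

def store_record(key: str, text: str) -> None:
    encoded = base64.b64encode(text.encode("utf-8")).decode("ascii")
    _store[key] = encoded

def load_record(key: str) -> str:
    return base64.b64decode(_store[key]).decode("utf-8")

store_record("client-001", "customer profile data")
assert load_record("client-001") == "customer profile data"
```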
4. Model Reconstruction from Cloud Data for Rapid Prototype Manufacturing (Cited by 1)
Authors: 张丽艳, 周儒荣, 周来水. Transactions of Nanjing University of Aeronautics and Astronautics, EI, 2001, Issue 2, pp. 170-175 (6 pages)
Model reconstruction from points scanned on existing physical objects is important in a variety of situations, such as reverse engineering for mechanical products, computer vision, and recovery of biological shapes from two-dimensional contours. With the development of measuring equipment, cloud points that contain more details of the object can be obtained conveniently. On the other hand, the large quantity of sampled points brings difficulties to model reconstruction methods. This paper first presents an algorithm to automatically reduce the number of cloud points under a given tolerance. A triangle mesh surface is then reconstructed from the simplified data set with the marching cubes algorithm. For various reasons, the reconstructed mesh usually contains unwanted holes. An approach is proposed to create new triangles with optimized shapes to cover the unexpected holes in the triangle mesh. After hole filling, a watertight triangle mesh can be output directly in STL format, which is widely used in rapid prototype manufacturing. Practical examples are included to demonstrate the method.
Keywords: reverse engineering; model reconstruction; cloud data; data filtering; hole filling
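The first step above reduces the number of scanned points under a given tolerance. The paper's specific reduction algorithm is not reproduced here; the sketch below shows a generic voxel-grid decimation in which the cell size plays the role of the tolerance, with the cell size and centroid replacement as assumptions.

```python
# Minimal voxel-grid decimation sketch: replace all points in one tol-sized cube
# by their centroid, shrinking the cloud while bounding the geometric error.
import numpy as np

def decimate(points, tol):
    """points: Nx3 array; returns the reduced point set."""
    keys = np.floor(points / tol).astype(np.int64)
    cells = {}
    for key, p in zip(map(tuple, keys), points):
        cells.setdefault(key, []).append(p)
    return np.array([np.mean(ps, axis=0) for ps in cells.values()])
```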
5. Automated Rock Detection and Shape Analysis from Mars Rover Imagery and 3D Point Cloud Data (Cited by 10)
Authors: 邸凯昌, 岳宗玉, 刘召芹, 王树良. Journal of Earth Science, SCIE, CAS, CSCD, 2013, Issue 1, pp. 125-135 (11 pages)
A new object-oriented method has been developed for the extraction of Mars rocks from Mars rover data. It is based on a combination of Mars rover imagery and 3D point cloud data. First, Navcam or Pancam images taken by the Mars rovers are segmented into homogeneous objects with a mean-shift algorithm. Then, the objects in the segmented images are classified into small rock candidates, rock shadows, and large objects. Rock shadows and large objects are considered as the regions within which large rocks may exist. In these regions, large rock candidates are extracted through ground-plane fitting with the 3D point cloud data. Small and large rock candidates are combined and post-processed to obtain the final rock extraction results. The shape properties of the rocks (angularity, circularity, width, height, and width-height ratio) have been calculated for subsequent geological studies.
Keywords: Mars rover; rock extraction; rover image; 3D point cloud data
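The large-rock step above relies on ground-plane fitting in the 3D point cloud. A minimal sketch of that idea is given below: fit a plane z = ax + by + c by least squares and flag points standing well above it as rock candidates; the height threshold and the plain (non-robust) fit are assumptions for illustration, not the authors' procedure.

```python
# Minimal ground-plane fitting sketch over an Nx3 point cloud.
import numpy as np

def rock_candidates(points, height_thresh=0.10):
    """Return a boolean mask of points lying above the fitted ground plane."""
    A = np.c_[points[:, 0], points[:, 1], np.ones(len(points))]
    coeffs, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)  # plane [a, b, c]
    residuals = points[:, 2] - A @ coeffs                      # height above plane
    return residuals > height_thresh
```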
6. SLC-index: A scalable skip list-based index for cloud data processing (Cited by 2)
Authors: HE Jing, YAO Shao-wen, CAI Li, ZHOU Wei. Journal of Central South University, SCIE, EI, CAS, CSCD, 2018, Issue 10, pp. 2438-2450 (13 pages)
Due to the increasing number of cloud applications, the amount of data in the cloud is growing faster than ever before. The nature of cloud computing requires cloud data processing systems that can handle huge volumes of data with high performance. However, most current cloud storage systems adopt a hash-like approach to retrieving data that supports only simple keyword-based queries and lacks other forms of information search. A scalable and efficient indexing scheme is therefore clearly required. In this paper, we present the SLC-index, a novel, scalable skip list-based index for cloud data processing. The SLC-index offers a two-layered architecture for extending indexing scope and facilitating better throughput. Dynamic load balancing for the SLC-index is achieved by online migration of index nodes between servers. Furthermore, it is a flexible system owing to its dynamic addition and removal of servers. The SLC-index is efficient for both point and range queries. Experimental results show the efficiency of the SLC-index and its usefulness as an alternative approach for cloud-suitable data structures.
Keywords: cloud computing; distributed index; cloud data processing; skip list
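The SLC-index is built on skip lists. As background, the sketch below is a minimal single-node skip list supporting insert and search; the two-layer distributed architecture, load balancing, and range-query support described in the abstract are not reproduced, and the class layout is an illustrative assumption.

```python
# Minimal skip list: probabilistic multi-level linked list with O(log n) expected
# insert and search.
import random

class _Node:
    def __init__(self, key, value, level):
        self.key, self.value = key, value
        self.forward = [None] * (level + 1)

class SkipList:
    MAX_LEVEL, P = 16, 0.5

    def __init__(self):
        self.head = _Node(None, None, self.MAX_LEVEL)
        self.level = 0

    def _random_level(self):
        lvl = 0
        while random.random() < self.P and lvl < self.MAX_LEVEL:
            lvl += 1
        return lvl

    def insert(self, key, value):
        update = [self.head] * (self.MAX_LEVEL + 1)
        node = self.head
        for i in range(self.level, -1, -1):      # record the rightmost node per level
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = _Node(key, value, lvl)
        for i in range(lvl + 1):                 # splice the new node into each level
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new

    def search(self, key):
        node = self.head
        for i in range(self.level, -1, -1):
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
        node = node.forward[0]
        return node.value if node and node.key == key else None
```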
7. Energy-Aware Scheduling Scheme Using Workload-Aware Consolidation Technique in Cloud Data Centres (Cited by 2)
Authors: 黎红友, 王江勇, 彭舰, 王俊峰, 刘唐. China Communications, SCIE, CSCD, 2013, Issue 12, pp. 114-124 (11 pages)
To reduce energy consumption in cloud data centres, this paper proposes two algorithms: the Energy-aware Scheduling algorithm using Workload-aware Consolidation Technique (ESWCT) and the Energy-aware Live Migration algorithm using Workload-aware Consolidation Technique (ELMWCT). As opposed to traditional energy-aware scheduling algorithms, which often focus on only a one-dimensional resource, the two algorithms are based on the fact that multiple resources (such as CPU, memory, and network bandwidth) are shared by users concurrently in cloud data centres and that heterogeneous workloads have different resource consumption characteristics. Both algorithms investigate the problem of consolidating heterogeneous workloads. They try to execute all Virtual Machines (VMs) with the minimum number of Physical Machines (PMs), and then power off unused physical servers to reduce power consumption. Simulation results show that both algorithms efficiently utilise the resources in cloud data centres and achieve well-balanced utilization of the multidimensional resources, demonstrating their promising energy-saving capability.
Keywords: energy-aware scheduling; heterogeneous workloads; workload-aware consolidation; cloud data centres
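Consolidating heterogeneous workloads onto the fewest PMs, as described above, is essentially multi-dimensional bin packing. The sketch below is a simple first-fit-decreasing placement over (CPU, memory, bandwidth) demands; it is a generic illustration of the idea, not the ESWCT/ELMWCT algorithms.

```python
# Minimal first-fit-decreasing consolidation over multi-dimensional resources.
def consolidate(vms, pm_capacity):
    """vms: list of (cpu, mem, bw) demands; pm_capacity: (cpu, mem, bw) per PM.
    Returns a list of PMs, each a list of VM indices placed on it."""
    pms, loads = [], []
    # place the largest VMs first (sorted by total normalised demand)
    order = sorted(range(len(vms)),
                   key=lambda i: -sum(d / c for d, c in zip(vms[i], pm_capacity)))
    for i in order:
        demand = vms[i]
        for pm_id, load in enumerate(loads):
            if all(l + d <= c for l, d, c in zip(load, demand, pm_capacity)):
                loads[pm_id] = tuple(l + d for l, d in zip(load, demand))
                pms[pm_id].append(i)
                break
        else:                                   # no existing PM fits: open a new one
            pms.append([i])
            loads.append(tuple(demand))
    return pms
```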
8. Cloud Data Encryption and Authentication Based on Enhanced Merkle Hash Tree Method (Cited by 1)
Authors: J. Stanly Jayaprakash, Kishore Balasubramanian, Rossilawati Sulaiman, Mohammad Kamrul Hasan, B. D. Parameshachari, Celestine Iwendi. Computers, Materials & Continua, SCIE, EI, 2022, Issue 7, pp. 519-534 (16 pages)
Many organizations apply cloud computing to store and effectively process data for various applications. Data uploaded to the cloud by users has limited security because of the unreliable verification process for data integrity. In this research, an effective authentication model based on an enhanced Merkle hash tree method is proposed in the multi-owner cloud to increase the security of cloud data. In the Merkle hash tree, leaf nodes carry a hash tag and each non-leaf node contains the table of hash information of its children, which is used to encrypt large data. The Merkle hash tree provides efficient mapping of data and, thanks to its proper structure, easily identifies changes made to the data. The developed model supports privacy-preserving public auditing to provide a secure cloud storage system. Data owners upload data to the cloud and edit the data using the private key. The enhanced Merkle hash tree method stores the data in the cloud server and splits it into batches. The data files requested by the data owner are audited by a third-party auditor, and the multi-owner authentication method is applied during the modification process to authenticate the user. The results show that the proposed method reduces the encryption and decryption time for cloud data storage by 2–167 ms compared with the existing Advanced Encryption Standard and Blowfish.
Keywords: cloud computing; cloud data storage; cloud service provider; Merkle hash tree; multi-owner authentication; third-party auditor
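The scheme above is built on a Merkle hash tree: leaves hash the data blocks and each parent hashes its children, so any change in a block changes the root. The sketch below computes a root digest in this standard way; batching, public auditing, and the multi-owner extensions in the paper are not shown, and SHA-256 is an assumed hash choice.

```python
# Minimal Merkle root computation over a list of data blocks.
import hashlib

def _h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(blocks):
    """blocks: list of bytes objects; returns the root digest."""
    level = [_h(b) for b in blocks]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate the last node on odd levels
            level.append(level[-1])
        level = [_h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

root = merkle_root([b"block-1", b"block-2", b"block-3"])
```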
9. Methodology for Extraction of Tunnel Cross-Sections Using Dense Point Cloud Data (Cited by 4)
Authors: Yueqian SHEN, Jinguo WANG, Jinhu WANG, Wei DUAN, Vagner G. FERREIRA. Journal of Geodesy and Geoinformation Science, 2021, Issue 2, pp. 56-71 (16 pages)
Tunnel deformation monitoring is a crucial task for evaluating tunnel stability during the metro operation period. Terrestrial Laser Scanning (TLS) is an innovative technique that can collect high-density and high-accuracy point cloud data in a few minutes, which makes it promising for tunnel deformation monitoring. Here, an efficient method for extracting tunnel cross-sections and performing convergence analysis using dense TLS point cloud data is proposed. First, the tunnel orientation is determined using principal component analysis (PCA) in the Euclidean plane. Two control points are introduced to detect and remove unsuitable points using point cloud division, and the ground points are then removed by defining an elevation band of 0.5 m. Next, a z-score method is introduced to detect and remove outliers. Because the standard shape of a tunnel cross-section is round, circle fitting is implemented using the least-squares method. Afterwards, the convergence analysis is carried out at angles of 0°, 30°, and 150°. The feasibility of the proposed approach is tested on a TLS point cloud of a Nanjing subway tunnel acquired with a FARO X330 laser scanner. The results indicate that the proposed methodology achieves an overall accuracy of 1.34 mm, in agreement with measurements acquired by a total station instrument. The proposed methodology provides new insights and references for the application of TLS in tunnel deformation monitoring and can also be extended to other engineering applications.
Keywords: cross-section; control point; convergence analysis; z-score method; terrestrial laser scanning; dense point cloud data
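Two of the steps named above, z-score outlier removal and least-squares circle fitting of a cross-section, can be sketched as below; the z-score cutoff of 3 and the algebraic (Kåsa-style) circle fit are illustrative assumptions rather than the exact formulation in the paper.

```python
# Minimal z-score filtering and least-squares circle fitting for 2D section points.
import numpy as np

def remove_outliers(xy, cutoff=3.0):
    """Drop section points whose radial distance from the centroid is a z-score outlier."""
    r = np.linalg.norm(xy - xy.mean(axis=0), axis=1)
    z = (r - r.mean()) / r.std()
    return xy[np.abs(z) < cutoff]

def fit_circle(xy):
    """Algebraic least-squares circle fit; xy is an Nx2 array. Returns (cx, cy, radius)."""
    x, y = xy[:, 0], xy[:, 1]
    A = np.c_[2 * x, 2 * y, np.ones(len(xy))]
    b = x**2 + y**2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    return cx, cy, np.sqrt(c + cx**2 + cy**2)
```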
10. Workload-aware request routing in cloud data center using software-defined networking
Authors: Haitao Yuan, Jing Bi, Bohu Li. Journal of Systems Engineering and Electronics, SCIE, EI, CSCD, 2015, Issue 1, pp. 151-160 (10 pages)
Large application latency brings revenue loss to cloud infrastructure providers in the cloud data center. The existing controllers of the software-defined networking architecture can fetch and process traffic information in the network, so they can optimize only the network latency of applications. However, the serving latency of applications is also an important factor in the user experience delivered for arriving requests. Unintelligent request routing causes large serving latency if arriving requests are allocated to overloaded virtual machines. To deal with the request routing problem, this paper proposes a workload-aware software-defined networking controller architecture. Request routing algorithms are then proposed to minimize the total round trip time for every type of request by considering both the congestion in the network and the workload in the virtual machines (VMs). The paper finally evaluates the proposed algorithms in a simulated prototype. The simulation results show that the proposed methodology is efficient compared with existing approaches.
Keywords: cloud data center (CDC); software-defined networking; request routing; resource allocation; network latency optimization
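The routing goal above is to minimize round trip time by accounting for both network congestion and VM workload. The sketch below is a toy greedy version of that decision: estimate each candidate VM's round trip time as network latency plus a queue-based serving latency and pick the smallest; the latency model and field names are assumptions, not the paper's algorithms.

```python
# Toy workload-aware routing: choose the VM with the lowest estimated RTT.
def route_request(vms):
    """vms: list of dicts with 'net_latency' (s), 'queue_len', 'service_rate' (req/s).
    Returns the index of the VM with the lowest estimated round trip time."""
    def estimated_rtt(vm):
        serving = (vm["queue_len"] + 1) / vm["service_rate"]   # simple queue model
        return vm["net_latency"] + serving
    return min(range(len(vms)), key=lambda i: estimated_rtt(vms[i]))

best = route_request([
    {"net_latency": 0.02, "queue_len": 5, "service_rate": 100.0},
    {"net_latency": 0.05, "queue_len": 1, "service_rate": 100.0},
])
```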
11. Dynamic Automated Infrastructure for Efficient Cloud Data Centre
Authors: R. Dhaya, R. Kanthavel. Computers, Materials & Continua, SCIE, EI, 2022, Issue 4, pp. 1625-1639 (15 pages)
We propose a dynamic automated infrastructure model for the cloud data centre, aimed at efficient service provision for an enormous number of users. Data center and cloud computing technologies are currently receiving major research and development attention from companies, governments, and academic and other research institutions. The difficult task is to enable the infrastructure to make information available to application-driven services and to support business-smart decisions. Remaining challenges include the provision of dynamic infrastructure for applications and information anywhere. Furthermore, developing technologies to handle private cloud computing infrastructure and operations in a completely automated and secure way has been critical. As a result, the focus of this article is on service and infrastructure life cycle management. We also show how cloud users interact with the cloud, how they request services from the cloud, how they select cloud strategies to deliver the desired service, and how they analyze their cloud consumption.
Keywords: dynamic; automated infrastructure model; cloud data centre; security; privacy; energy efficient
12. Indoor Space Modeling and Parametric Component Construction Based on 3D Laser Point Cloud Data
Authors: Ruzhe Wang, Xin Li, Xin Meng. Journal of World Architecture, 2023, Issue 5, pp. 37-45 (9 pages)
To enhance modeling efficiency and accuracy, we utilized 3D laser point cloud data for indoor space modeling. Point cloud data was obtained with a 3D laser scanner and optimized with Autodesk Recap and Revit software to extract geometric information about the indoor environment. Furthermore, we proposed a method for constructing indoor elements based on parametric components. The research outcomes of this paper offer new methods and tools for indoor space modeling and design. The approach of indoor space modeling based on 3D laser point cloud data and parametric component construction can enhance modeling efficiency and accuracy, providing architects, interior designers, and decorators with a better working platform and design reference.
Keywords: 3D laser scanning technology; indoor space point cloud data; Building Information Modeling (BIM)
13. Low-power task scheduling algorithm for large-scale cloud data centers (Cited by 3)
Authors: Xiaolong Xu, Jiaxing Wu, Geng Yang, Ruchuan Wang. Journal of Systems Engineering and Electronics, SCIE, EI, CSCD, 2013, Issue 5, pp. 870-878 (9 pages)
How to effectively reduce the energy consumption of large-scale data centers is a key issue in cloud computing. This paper presents a novel low-power task scheduling algorithm (L3SA) for large-scale cloud data centers. A winner tree is introduced with the data nodes as its leaf nodes, and the final winner is selected with the purpose of reducing energy consumption. The complexity of large-scale cloud data centers is fully considered, and a task comparison coefficient is defined to make the task scheduling strategy more reasonable. Experiments and performance analysis show that the proposed algorithm can effectively improve node utilization and reduce the overall power consumption of the cloud data center.
Keywords: cloud computing; data center; task scheduling; energy consumption
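The selection structure mentioned above is a winner (tournament) tree: leaves hold the candidate data nodes and each internal node records the winning child, so the root gives the selected node. The sketch below builds such a tree over per-node scores; the scores stand in for the paper's task comparison coefficient, whose definition is not reproduced here.

```python
# Minimal winner (tournament) tree: the leaf with the lowest score wins.
class WinnerTree:
    def __init__(self, scores):
        self.n = len(scores)
        self.scores = list(scores)
        self.tree = [0] * (2 * self.n)          # tree[1] holds the overall winner index
        for i in range(self.n):
            self.tree[self.n + i] = i           # leaves store data-node indices
        for pos in range(self.n - 1, 0, -1):    # play matches bottom-up
            left, right = self.tree[2 * pos], self.tree[2 * pos + 1]
            self.tree[pos] = left if self.scores[left] <= self.scores[right] else right

    def winner(self):
        return self.tree[1]

tree = WinnerTree([0.7, 0.2, 0.9, 0.4])          # lowest score wins
assert tree.winner() == 1
```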
14. Towards a Comprehensive Security Framework of Cloud Data Storage Based on Multi Agent System Architecture (Cited by 3)
Authors: Amir Mohamed Talib, Rodziah Atan, Rusli Abdullah, Masrah Azrifah Azmi Murad. Journal of Information Security, 2012, Issue 4, pp. 295-306 (12 pages)
The tremendous growth of cloud computing environments requires new architectures for security services. Cloud computing is the utilization of many servers/data centers or cloud data storages (CDSs) housed in many different locations and interconnected by high-speed networks. CDS, like any other emerging technology, is experiencing growing pains: it is immature, fragmented, and lacks standardization. Although security issues are delaying its fast adoption, cloud computing is an unstoppable force, and security mechanisms are needed to ensure its secure adoption. In this paper, a comprehensive security framework based on a Multi-Agent System (MAS) architecture for CDS is proposed to facilitate the confidentiality, correctness assurance, availability, and integrity of users' data in the cloud. The security framework consists of two main layers: an agent layer and a CDS layer. The proposed MAS architecture includes five main types of agents: Cloud Service Provider Agent (CSPA), Cloud Data Confidentiality Agent (CDConA), Cloud Data Correctness Agent (CDCorA), Cloud Data Availability Agent (CDAA), and Cloud Data Integrity Agent (CDIA). To verify the proposed security framework, a pilot study is conducted using a questionnaire survey, and the Rasch methodology is used to analyze the pilot data. Item reliability is found to be poor, and a few respondents and items are identified as misfits with distorted measurements; as a result, some problematic questions are revised and some predictably easy questions are excluded from the questionnaire. A prototype of the system is implemented in Java. To simulate the agents, Oracle database packages and triggers are used to implement agent functions, and Oracle jobs are utilized to create agents.
Keywords: cloud computing; multi-agent system; cloud data storage; security framework; cloud service provider
15. Ensuring Security, Confidentiality and Fine-Grained Data Access Control of Cloud Data Storage Implementation Environment (Cited by 1)
Authors: Amir Mohamed Talib. Journal of Information Security, 2015, Issue 2, pp. 118-130 (13 pages)
With the development of cloud computing, mutual understandability among distributed data access controls has become an important issue in the security field of cloud computing. To ensure the security, confidentiality, and fine-grained data access control of the Cloud Data Storage (CDS) environment, we propose a Multi-Agent System (MAS) architecture. This architecture consists of two agents: the Cloud Service Provider Agent (CSPA) and the Cloud Data Confidentiality Agent (CDConA). CSPA provides a graphical interface to the cloud user that facilitates access to the services offered by the system. CDConA provides each cloud user with the definition and enforcement of an expressive and flexible access structure as a logic formula over cloud data file attributes. This new access control is named Formula-Based Cloud Data Access Control (FCDAC). The proposed FCDAC based on the MAS architecture consists of four layers (an interface layer, an existing access control layer, the proposed FCDAC layer, and a CDS layer) and four types of entities (the Cloud Service Provider (CSP), cloud users, a knowledge base, and confidentiality policy roles). FCDAC is an access policy determined by our MAS architecture, not by the CSPs. A prototype of the proposed FCDAC scheme is implemented using the Java Agent Development Framework Security (JADE-S). The results for the practical scenario defined formally in this paper show the Round Trip Time (RTT) for an agent to travel in the system, measured by the time required for an agent to travel around different numbers of cloud users before and after implementing FCDAC.
Keywords: cloud computing; cloud data storage; cloud service provider; formula-based cloud data access control; multi-agent system; secure Java Agent Development Framework
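FCDAC, as described above, expresses an access structure as a logic formula over cloud data file attributes. The sketch below shows the core idea, evaluating a nested AND/OR/NOT formula against the attributes a requester holds; the tuple-based policy encoding and attribute strings are illustrative assumptions, not the FCDAC representation used in the paper.

```python
# Minimal formula-based access check: grant access only if the policy evaluates true.
def evaluate(policy, attributes):
    """policy: attribute name, or ('and', p1, ...), ('or', p1, ...), ('not', p);
    attributes: set of attribute names held by the requester."""
    if isinstance(policy, str):
        return policy in attributes
    op, *operands = policy
    if op == "and":
        return all(evaluate(p, attributes) for p in operands)
    if op == "or":
        return any(evaluate(p, attributes) for p in operands)
    if op == "not":
        return not evaluate(operands[0], attributes)
    raise ValueError(f"unknown operator: {op}")

policy = ("and", "department:finance", ("or", "role:auditor", "role:manager"))
assert evaluate(policy, {"department:finance", "role:manager"})
```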
16. Improved Research on Fuzzy Search over Encrypted Cloud Data Based on Keywords
Authors: Ping Zhang, Jianzhong Wang. Journal of Computer and Communications, 2015, Issue 9, pp. 90-98 (9 pages)
This paper improves a keyword-based search strategy over encrypted cloud data and presents a method that uses different strategies on the client and the server to improve search efficiency. The client uses Chinese and English to construct keyword synonyms, establishes fuzzy-syllable word and synonym sets for the keywords, and implements the fuzzy search strategy over encrypted cloud data based on keywords. By analyzing the user's query request, the server provides keywords for the user to choose from, and topic words and secondary words are picked out. The system matches topic words with historical queries in time order, and the result of the new query is then obtained directly. The simulation analysis shows that the fuzzy search strategy can make better use of historical results while protecting privacy, realizing efficient data search, saving search time, and improving search efficiency.
Keywords: cloud data; fuzzy search; keywords; synonyms; searchable encryption
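The client-side step above expands a query keyword with its synonym set before searching the encrypted index. The sketch below illustrates that expansion with hashed search tokens; the synonym table, SHA-256 trapdoors, and index layout are toy assumptions, not the paper's searchable-encryption construction.

```python
# Toy synonym-expanded keyword search over a hashed keyword index.
import hashlib

SYNONYMS = {"car": {"car", "automobile", "vehicle"}}   # illustrative synonym table

def trapdoors(keyword):
    """Return hashed search tokens for a keyword and its synonyms."""
    terms = SYNONYMS.get(keyword, {keyword})
    return {hashlib.sha256(t.encode("utf-8")).hexdigest() for t in terms}

def search(index, keyword):
    """index: dict mapping hashed keyword -> list of document ids."""
    hits = set()
    for token in trapdoors(keyword):
        hits.update(index.get(token, []))
    return hits
```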
17. Subject Oriented Autonomic Cloud Data Center Networks Model
Authors: Hang Qin, Li Zhu. Journal of Data Analysis and Information Processing, 2017, Issue 3, pp. 87-95 (9 pages)
This paper investigates autonomic cloud data center networks, a solution to the increasingly complex computing environment that addresses management and cost issues to meet users' growing demands. Virtualized cloud networking is intended to provide a wealth of rich online applications, including self-configuration, self-healing, self-optimization, and self-protection. In addition, we draw on intelligent subjects and multi-agent systems with respect to the system model, strategy, and autonomic cloud computing, involving independent computing system development and implementation. Then, combining the architecture with the autonomous unit, we propose the MCDN (Model of Autonomic Cloud Data Center Networks). This model can define the intelligent state, elaborate the composition structure, and cover the complete life cycle. Finally, the proposed public infrastructure can be provided with the autonomous unit in the supported interaction model.
Keywords: autonomic cloud computing; autonomous unit; data center; self-configuration; service description
18. Cloud data security with deep maxout assisted data sanitization and restoration process
Authors: Shrikant D. Dhamdhere, M. Sivakkumar, V. Subramanian. High-Confidence Computing, 2025, Issue 1, pp. 68-81 (14 pages)
The potential of cloud computing, an emerging concept for minimizing the costs associated with computing, has recently drawn the interest of many researchers. Rapid advancements in cloud computing techniques have led to the arrival of cloud services, but data security is a challenging issue for modern civilization. The main issues with cloud computing are cloud security and effective cloud distribution over the network. Increasing data privacy with encryption methods is a leading approach that has progressed considerably in recent times; in this respect, sanitization is likewise a process for preserving data confidentiality. The goal of this work is to present a deep learning-assisted data sanitization procedure for data security. The proposed data sanitization process involves the following steps: data preprocessing, optimal key generation, deep learning-assisted key fine-tuning, and the Kronecker product. Data preprocessing considers the original data as well as the extracted statistical features. Key generation is the subsequent process, for which a self-adaptive Namib beetle optimization (SANBO) algorithm is developed in this research. Among the generated keys, appropriate keys are fine-tuned by the improved Deep Maxout classifier. The Kronecker product is then applied in the sanitization process. Reversing the sanitization procedure yields the original data during the data restoration phase. The study notes that the suggested data sanitization technique secures cloud data against malicious attacks. The proposed work is also analyzed in terms of restoration effectiveness and key sensitivity.
Keywords: adopted data sanitization; cloud data security; restoration; improved deep maxout; optimal key generation
19. Design of a Private Cloud Platform for Distributed Logging Big Data Based on a Unified Learning Model of Physics and Data
Authors: Cheng Xi, Fu Haicheng, Tursyngazy Mahabbat. Applied Geophysics, 2025, Issue 2, pp. 499-510, 560 (13 pages)
Well logging technology has accumulated a large amount of historical data through four generations of technological development, which forms the basis of well logging big data and digital assets. However, the value of these data has not been well stored, managed, and mined. The development of cloud computing technology provides a rare opportunity for a logging big data private cloud. The traditional petrophysical evaluation and interpretation model has encountered great challenges when faced with new evaluation objects, and research on integrating distributed storage, processing, and learning functions for logging big data in a private cloud has not yet been carried out. The goal is to establish a distributed logging big data private cloud platform centered on a unified learning model, which achieves distributed storage and processing of logging big data and facilitates the learning of novel knowledge patterns via a unified logging learning model that integrates physical simulation and data models in a large-scale function space, thereby addressing the geo-engineering evaluation problem of geothermal fields. Based on the research idea of "logging big data cloud platform - unified logging learning model - large function space - knowledge learning and discovery - application", the theoretical foundation of the unified learning model, the cloud platform architecture, data storage and learning algorithms, computing power allocation and platform monitoring, platform stability, and data security are analyzed. The designed logging big data cloud platform realizes parallel distributed storage and processing of data and learning algorithms. The feasibility of constructing a well logging big data cloud platform based on a unified learning model of physics and data is analyzed in terms of the structure, ecology, management, and security of the cloud platform. The case study shows that the logging big data cloud platform has obvious technical advantages over traditional logging evaluation methods in terms of knowledge discovery methods, sharing of data, software, and results, accuracy, speed, and complexity.
Keywords: unified logging learning model; logging big data; private cloud; machine learning
20. Security Risks in Cross-Border Data Law Enforcement under the US CLOUD Act and China's Response (Cited by 2)
Authors: 廖明月, 王佳宜, 杨映雪. 图书馆论坛 (Library Tribune), PKU Core Journal, 2025, Issue 1, pp. 128-137 (10 pages)
Data is an important national strategic resource in the era of the digital economy, and cross-border data flows are a key part of it. As criminal activity becomes increasingly digital and cross-border, cross-border data law enforcement has become an important means of combating cybercrime, but data transferred abroad for law-enforcement purposes has a significant impact on the country where the data is stored. Relying on a "data freedom" discourse, the United States has used the CLOUD Act to introduce a data jurisdiction model centered on the "data controller standard" and, through network service providers, to exercise "long-arm jurisdiction", which greatly increases the national data security risks arising from the passive outflow and retrieval of China's data. China should take data sovereignty as the legal basis for cross-border data flows and explore institutional tools for controlling data security risks, including adapting the "data controller" standard model on the basis of national sovereignty, improving the security rules and review mechanisms for cross-border data retrieval, and using blocking statutes to restrict unilateral cross-border data law enforcement, so as to effectively counterbalance the relevant US "long-arm jurisdiction".
Keywords: cross-border data law enforcement; security risks; data sovereignty; CLOUD Act