[Objective] In response to the issue of insufficient integrity in hourly routine meteorological element data files, this paper aims to improve the availability and reliability of data files and provide high-quality data file support for meteorological forecasting and services. [Method] An efficient and accurate method for data file quality control and fusion processing is developed. By locating the missing measurement times, data are extracted from the "AWZ.db" database and the minute routine meteorological element data file, and merged into the hourly routine meteorological element data file. [Result] Data processing efficiency and accuracy are significantly improved, and the problem of incomplete hourly routine meteorological element data files is solved. The paper also emphasizes the importance of ensuring the accuracy of the source files and of carefully checking and verifying the fusion results, and proposes strategies to improve data quality. [Conclusion] This method provides convenience for observation personnel, effectively improves the integrity and accuracy of data files, and is expected to provide more reliable data support for meteorological forecasting and services.
Funding: the Fifth Batch of Innovation Teams of Wuzhou Meteorological Bureau "Wuzhou Innovation Team for Enhancing the Comprehensive Meteorological Observation Ability through Digitization and Intelligence"; Wuzhou Science and Technology Planning Project (202402122, 202402119).
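The core of the fusion step — find the hours missing from the hourly file, pull the matching on-the-hour records from the station database, and merge them back — can be sketched as follows. This is a minimal illustration that assumes "AWZ.db" is a SQLite database; the table and column names (`minute_obs`, `obs_time`, and the element columns) are hypothetical, since the abstract does not document the actual schema.

```python
import sqlite3
from datetime import timedelta

def find_missing_hours(hourly_records, start, end):
    """Return the hourly timestamps absent from the hourly data file."""
    present = {r["obs_time"] for r in hourly_records}
    missing, t = [], start
    while t <= end:
        if t not in present:
            missing.append(t)
        t += timedelta(hours=1)
    return missing

def fill_from_database(db_path, missing_hours):
    """Pull the on-the-hour minute records from AWZ.db for each missing hour.

    Table and column names are illustrative placeholders; the real
    AWZ.db schema is not documented in the abstract.
    """
    conn = sqlite3.connect(db_path)
    filled = {}
    for hour in missing_hours:
        row = conn.execute(
            "SELECT temperature, pressure, humidity FROM minute_obs "
            "WHERE obs_time = ?",
            (hour.strftime("%Y-%m-%d %H:00"),),
        ).fetchone()
        if row is not None:      # verify before merging, as the paper stresses
            filled[hour] = row
    conn.close()
    return filled
```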
Metadata prefetching and data placement play a critical role in enhancing access performance for file systems operating over wide-area networks. However, developing effective strategies for metadata prefetching in environments with concurrent workloads, and for data placement across distributed networks, remains a significant challenge. This study introduces novel and efficient methodologies for metadata prefetching and data placement, leveraging fine-grained control of prefetching strategies and variable-sized data fragment writing to optimize the I/O bandwidth of distributed file systems. The proposed metadata prefetching technique employs dynamic workload analysis to identify dominant workload patterns and adaptively refines prefetching policies, thereby boosting metadata access efficiency under concurrent scenarios. Meanwhile, the data placement strategy improves write performance by storing data fragments locally within the nearest data center and transmitting only the fragment location metadata to the remote data center hosting the original file. Experimental evaluations using real-world system traces demonstrate that the proposed approaches reduce metadata access times by up to 33.5% and application data access times by 17.19% compared to state-of-the-art techniques.
Funding: funded by the National Natural Science Foundation of China under Grant No. 62362019 and the Hainan Provincial Natural Science Foundation of China under Grant No. 624RC482.
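The adaptive half of the prefetching scheme — watch the recent metadata access stream, detect when one pattern dominates, and switch the prefetch policy accordingly — might look like the toy sketch below. The pattern classifier, window size, and policy names are placeholders for illustration, not the paper's actual algorithm.

```python
from collections import Counter, deque

class AdaptivePrefetcher:
    """Toy adaptive prefetcher: classify recent metadata accesses and
    switch policy when one pattern dominates the sliding window."""

    def __init__(self, window=1024, threshold=0.6):
        self.history = deque(maxlen=window)   # recent pattern tags
        self.threshold = threshold            # dominance cutoff
        self.policy = "none"

    def record(self, prev_inode, inode):
        # Crude pattern tagging: consecutive inodes suggest a directory
        # scan; anything else is treated as random access.
        self.history.append("sequential" if inode == prev_inode + 1 else "random")
        dominant, count = Counter(self.history).most_common(1)[0]
        if count / len(self.history) >= self.threshold:
            # Prefetch sibling entries aggressively for scans; back off
            # entirely when the workload looks random.
            self.policy = "prefetch-siblings" if dominant == "sequential" else "none"
```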
The consultative committee for space data systems (CCSDS) file delivery protocol (CFDP) recommendation for reliable transmission gives no detailed transmission procedure or delay calculation for the prompted negative acknowledgment (NAK) and asynchronous NAK models. CFDP is designed to provide data and storage management, store and forward, custody transfer, and reliable end-to-end delivery over deep-space links characterized by huge latency, intermittent connectivity, asymmetric bandwidth, and high bit error rate (BER). Four reliable transmission models are analyzed, and an expected file-delivery time is calculated for different transmission rates, numbers and sizes of packet data units, BERs, frequencies of external events, etc. By comparing the four CFDP models, the BER requirement for typical deep-space missions is obtained, and rules for choosing CFDP models under different uplink state information are given, which provides references for protocol model selection, utilization, and modification.
Funding: supported by the National Natural Science Foundation of China (60672089, 60772075).
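Under simple assumptions the expected file-delivery time can be estimated directly: with independent bit errors, a PDU of b bits survives the link with probability (1 − BER)^b, so the mean number of transmission attempts per PDU is geometric, 1/(1 − BER)^b. The sketch below computes a rough delivery time this way; it charges one round-trip of NAK/repair overhead per extra attempt and ignores timer and event details, so it is an illustrative simplification rather than any of the paper's four models.

```python
def expected_delivery_time(file_bytes, pdu_bytes, rate_bps, ber, rtt_s):
    """Rough expected file-delivery time under a simplified NAK model.

    Assumes independent bit errors, a geometric number of attempts per
    PDU, and one RTT of NAK/repair overhead per extra round. This is an
    illustrative simplification, not the paper's exact calculation.
    """
    n_pdus = -(-file_bytes // pdu_bytes)        # ceiling division
    p_ok = (1.0 - ber) ** (pdu_bytes * 8)       # PDU survives the link
    tx_per_pdu = 1.0 / p_ok                     # mean attempts (geometric)
    send_time = n_pdus * tx_per_pdu * pdu_bytes * 8 / rate_bps
    extra_rounds = tx_per_pdu - 1.0             # repair rounds beyond the first
    return send_time + extra_rounds * rtt_s

# Example: 1 MB file, 1 KB PDUs, 1 Mbit/s link, BER 1e-6, 20-minute RTT
print(expected_delivery_time(1_000_000, 1024, 1_000_000, 1e-6, 1200.0))
```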
Time-domain airborne electromagnetic (AEM) data are frequently subject to interference from various types of noise, which can reduce the data quality and affect data inversion and interpretation. Traditional denoising methods primarily deal with data directly, without analyzing the data in detail; thus, the results are not always satisfactory. In this paper, we propose a method based on dictionary learning for EM data denoising. This method uses dictionary learning to perform feature analysis and to extract and reconstruct the true signal. In the process of dictionary learning, the random noise is filtered out as residuals. To verify the effectiveness of this dictionary learning approach for denoising, we use a fixed overcomplete discrete cosine transform (ODCT) dictionary algorithm, the method-of-optimal-directions (MOD) dictionary learning algorithm, and the K-singular value decomposition (K-SVD) dictionary learning algorithm to denoise decay curves at single points and to denoise profile data for different time channels in time-domain AEM. The results show obvious differences among the three dictionaries for denoising AEM data, with the K-SVD dictionary achieving the best performance.
Funding: financially supported by the Strategic Priority Research Program of the Chinese Academy of Sciences (No. XDA14020102), the National Natural Science Foundation of China (Nos. 41774125, 41530320 and 41804098), and the Key National Research Project of China (Nos. 2016YFC0303100, 2017YFC0601900).
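Of the three dictionaries compared, the fixed ODCT baseline is compact enough to sketch: build an overcomplete DCT dictionary, sparse-code the noisy decay curve over it, and keep the reconstruction while the residual is discarded as noise. The atom count and sparsity level below are illustrative; in the learned variants (MOD, K-SVD), the dictionary D would be trained on the data instead of fixed.

```python
import numpy as np
from sklearn.linear_model import orthogonal_mp

def odct_dictionary(n, k):
    """Overcomplete DCT dictionary: n-sample atoms, k > n atoms."""
    D = np.cos(np.pi / k * np.outer(np.arange(n), np.arange(k)))
    D[:, 1:] -= D[:, 1:].mean(axis=0)        # remove DC from non-constant atoms
    return D / np.linalg.norm(D, axis=0)     # unit-norm columns

def denoise_decay_curve(y, n_atoms=256, n_nonzero=6):
    """Sparse-code a noisy decay curve over a fixed ODCT dictionary;
    whatever the sparse model cannot represent is left behind as noise."""
    D = odct_dictionary(len(y), n_atoms)
    coef = orthogonal_mp(D, y, n_nonzero_coefs=n_nonzero)
    return D @ coef                          # reconstructed (denoised) signal
```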
In this paper, we analyze the complexity and entropy of different data compression algorithms: LZW, Huffman, fixed-length code (FLC), and Huffman after using fixed-length code (HFLC). We test those algorithms on files of different sizes and conclude that LZW is the best across all compression scales we tested, especially on large files, followed by Huffman, HFLC, and FLC, respectively. Data compression remains an important research topic with many applications. We therefore suggest continuing research in this field, trying to combine two techniques to reach a better one, or using another source mapping (Hamming), such as embedding a linear array into a hypercube, together with good techniques like Huffman.
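The entropy yardstick behind such comparisons is easy to compute: the zeroth-order byte entropy lower-bounds any symbol-by-symbol coder, while a fixed-length code always spends ceil(log2 |alphabet|) bits per symbol regardless of the distribution, which is why FLC trails Huffman. A short sketch (assuming non-empty input):

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Zeroth-order byte entropy in bits/symbol: the bound that
    symbol-wise coders such as Huffman are measured against."""
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

def flc_bits_per_symbol(data: bytes) -> int:
    """Fixed-length code cost: ceil(log2(alphabet size)) bits/symbol."""
    k = len(set(data))
    return math.ceil(math.log2(k)) if k > 1 else 1

sample = b"abracadabra abracadabra"
print(shannon_entropy(sample), flc_bits_per_symbol(sample))
```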
In this paper, we present a distributed multi-level cache system based on cloud storage, which addresses the low access efficiency of small spatio-temporal data files in the information service system of a Smart City. Taking the classification attribute of small spatio-temporal data files in the Smart City as the basis for cache content selection, the cache system adopts different cache pool management strategies at different levels of the cache. Experimental results from a prototype system indicate that the multi-level cache presented in this paper effectively increases the access bandwidth of small spatio-temporal files in the Smart City and greatly improves the service quality of multiple concurrent accesses in the system.
Funding: supported by the Natural Science Foundation of Hubei Province (2012FFC034, 2014CFC1100).
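A per-level management strategy of the kind described — different pools, different policies — can be illustrated with a two-level toy: a small recency-managed level backed by a larger FIFO level. The sizes and policies below are placeholders, not the paper's actual strategies, which select content by the files' classification attributes.

```python
from collections import OrderedDict

class TwoLevelCache:
    """Toy multi-level cache: a small LRU level backed by a larger FIFO
    level, mimicking per-pool management strategies."""

    def __init__(self, l1_size=128, l2_size=4096):
        self.l1, self.l2 = OrderedDict(), OrderedDict()
        self.l1_size, self.l2_size = l1_size, l2_size

    def get(self, key):
        if key in self.l1:
            self.l1.move_to_end(key)              # LRU refresh
            return self.l1[key]
        if key in self.l2:                        # hit in L2: promote to L1
            val = self.l2.pop(key)
            self.put(key, val)
            return val
        return None

    def put(self, key, val):
        self.l2.pop(key, None)                    # avoid duplicates across levels
        self.l1[key] = val
        self.l1.move_to_end(key)
        if len(self.l1) > self.l1_size:
            old_key, old_val = self.l1.popitem(last=False)   # evict LRU victim
            self.l2[old_key] = old_val                       # demote into L2
            if len(self.l2) > self.l2_size:
                self.l2.popitem(last=False)                  # FIFO eviction
```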
Data layout in a file system is the organization of data stored in external storage. The data layout has a huge impact on the performance of storage systems. We survey three main kinds of data layout in traditional file systems: the in-place-update file system, the log-structured file system, and the copy-on-write file system. Each file system has its own strengths and weaknesses under different circumstances. We also include a recent usage of persistent layout in a file system that combines both flash memory and byte-addressable non-volatile memory. With this survey, we conclude that persistent data layout in file systems may evolve dramatically in the era of emerging non-volatile memory.
Funding: supported by ZTE Industry-Academia-Research Cooperation Funds.
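The essential difference between the surveyed layouts fits in a few lines: an in-place-update file system overwrites data at its existing location, while log-structured and copy-on-write designs always write new data to fresh space and then repoint the mapping. A minimal sketch over a dict-backed "disk" (purely illustrative, not from the survey):

```python
class BlockStore:
    """Contrast of in-place update vs. copy-on-write on a toy block store."""

    def __init__(self):
        self.blocks = {}        # physical block id -> data
        self.map = {}           # logical block id -> physical block id
        self.next_free = 0

    def _alloc(self):
        self.next_free += 1
        return self.next_free - 1

    def write_in_place(self, lbid, data):
        # In-place update: overwrite the existing physical location.
        if lbid not in self.map:
            self.map[lbid] = self._alloc()
        self.blocks[self.map[lbid]] = data

    def write_cow(self, lbid, data):
        # Copy-on-write / log-structured style: write a fresh block, then
        # atomically repoint the mapping; the old block becomes garbage
        # to be reclaimed later.
        pbid = self._alloc()
        self.blocks[pbid] = data
        self.map[lbid] = pbid
```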
Integrating heterogeneous data sources is a precondition for sharing data across an enterprise. Highly efficient data updating can both save system expenses and offer real-time data, and rapid data modification in the pre-processing area of the data warehouse is a hot research issue. An extract-transform-load (ETL) design is proposed based on a new data algorithm called Diff-Match, which is developed by utilizing mode matching and data-filtering technology. It can accelerate data renewal, filter the heterogeneous data, and seek out differing sets of data. Its efficiency has been proved by its successful application in an enterprise of electric apparatus groups.
Funding: supported by the National Natural Science Foundation of China (No. 50475117) and the Tianjin Natural Science Foundation (No. 06YFJMJC03700).
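The abstract does not spell out Diff-Match itself, but the generic hash-compare idea behind such incremental ETL updating — fingerprint each record, then load only the inserted, updated, and deleted sets — can be sketched as follows.

```python
import hashlib

def row_hash(row: dict) -> str:
    """Order-independent fingerprint of one record."""
    return hashlib.md5(repr(sorted(row.items())).encode()).hexdigest()

def diff_rows(source_rows, warehouse_rows, key="id"):
    """Compare source and warehouse snapshots by per-row hash so that
    only the differences need to be loaded. Illustrative only; the
    actual Diff-Match algorithm is not detailed in the abstract."""
    src = {r[key]: row_hash(r) for r in source_rows}
    dst = {r[key]: row_hash(r) for r in warehouse_rows}
    inserted = src.keys() - dst.keys()
    deleted = dst.keys() - src.keys()
    updated = {k for k in src.keys() & dst.keys() if src[k] != dst[k]}
    return inserted, updated, deleted
```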
The critical nature of satellite network traffic provides a challenging environment in which to detect intrusions. The intrusion detection method presented aims to raise an alert whenever satellite network signals begin to exhibit anomalous patterns, as determined by a Euclidean distance metric. In line with anomaly-based intrusion detection systems, the method relies heavily on building a model of "normal" through the creation of a signal dictionary using windowing and k-means clustering. The results for three signals from our case study are discussed to highlight the benefits and drawbacks of the method presented. Our preliminary results demonstrate that the clustering technique used has great potential for intrusion detection for non-periodic satellite network signals.
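The pipeline described — window the signal, cluster windows of known-normal traffic into a dictionary of centroids, then alert when a new window's Euclidean distance to its nearest centroid grows too large — maps directly onto standard library pieces. Window size, cluster count, and the alert threshold below are illustrative placeholders.

```python
import numpy as np
from sklearn.cluster import KMeans

def make_windows(signal, width=64, step=16):
    """Slice a 1-D signal into overlapping fixed-width windows."""
    return np.array([signal[i:i + width]
                     for i in range(0, len(signal) - width + 1, step)])

def train_dictionary(normal_signal, k=8, width=64, step=16):
    """Cluster windows of known-normal telemetry into k centroid 'words'."""
    return KMeans(n_clusters=k, n_init=10).fit(make_windows(normal_signal, width, step))

def detect(signal, model, width=64, step=16, threshold=5.0):
    """Flag windows whose Euclidean distance to the nearest centroid
    exceeds the threshold (a placeholder value here)."""
    dists = model.transform(make_windows(signal, width, step)).min(axis=1)
    return np.where(dists > threshold)[0]    # indices of anomalous windows
```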
Microseismic monitoring is an important technology in unconventional oil and gas reservoir exploration, with wide application in hydraulic fracturing fracture monitoring, CO₂ storage, and other areas. However, microseismic signals are weak in energy and easily contaminated by noise, and their low signal-to-noise ratio often leads to poor results in subsequent processing. Denoising of microseismic data is therefore a crucial processing step, and its quality has a decisive influence on the accuracy of subsequent source location and the reliability of source mechanism inversion. This paper proposes a Monte Carlo non-negative dictionary learning (MCNDL) method for microseismic denoising. Monte Carlo patch sampling obtains, in a small amount of time, an initial dictionary containing relatively many effective signal features; during the dictionary update, a non-negativity constraint guarantees the sparsity of the data transform and shrinks the solution space, thereby reducing computational cost and improving denoising accuracy. The method is tested on synthetic and field microseismic data and compared with band-pass (BP) filtering, FK filtering, and the K-SVD method, demonstrating better denoising performance and higher efficiency on microseismic data.
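The two ingredients of the method — Monte Carlo sampling of training patches and a non-negativity constraint during dictionary learning — can be approximated with library components, as in the sketch below. This is only an illustration of the idea: it trains on signal envelopes so that both dictionary and codes can be non-negative, uses scikit-learn's generic coordinate-descent learner rather than the paper's update rules, and all sizes are placeholders.

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning

def sample_patches(trace, width=64, n_patches=200, seed=0):
    """Monte Carlo step: draw random windows from the record instead of
    an exhaustive sliding decomposition, so the initial training set is
    built quickly while still covering the effective signal."""
    rng = np.random.default_rng(seed)
    starts = rng.integers(0, len(trace) - width, size=n_patches)
    return np.array([trace[s:s + width] for s in starts])

def denoise_mcndl_sketch(trace, width=64, n_atoms=32):
    """Illustrative stand-in for MCNDL using scikit-learn: random patch
    sampling plus non-negativity-constrained dictionary learning. The
    envelope (absolute value) is used so the non-negative model applies;
    the paper's own update rules are not reproduced here."""
    patches = sample_patches(np.abs(trace), width)
    dl = DictionaryLearning(n_components=n_atoms,
                            fit_algorithm="cd",
                            transform_algorithm="lasso_cd",
                            positive_code=True,       # sparse codes >= 0
                            positive_dict=True,       # atoms >= 0
                            max_iter=50)
    codes = dl.fit_transform(patches)
    return codes @ dl.components_    # reconstructed (denoised) patches
```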