With the increase in the quantity and scale of Static Random-Access Memory Field Programmable Gate Arrays (SRAM-based FPGAs) for aerospace applications, the volume of FPGA configuration bit files that must be stored has increased dramatically. The use of compression techniques for these bitstream files is emerging as a key strategy to alleviate the burden on storage resources. Due to the severe resource constraints of space-based electronics and the unique application environment, the simplicity, efficiency and robustness of the decompression circuitry is also a key design consideration. Through comparative analysis of current bitstream file compression technologies, this research suggests that the Lempel-Ziv-Oberhumer (LZO) compression algorithm is more suitable for satellite applications. This paper also delves into the compression process and format of the LZO compression algorithm, as well as the inherent characteristics of configuration bitstream files. We propose an improved algorithm based on LZO for bitstream file compression, which optimises the compression process by refining the format and reducing the offset. Furthermore, a low-cost, robust decompression hardware architecture is proposed based on this method. Experimental results show that the compression speed of the improved LZO algorithm is increased by 3%, the decompression hardware cost is reduced by approximately 60%, and the compression ratio is slightly reduced by 0.47%.
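The offset reduction described above can be illustrated with a toy LZ77-style codec. This is not the LZO wire format (which is more involved); the sketch only shows how a bounded back-reference offset, `max_offset`, trades match reach for a smaller offset field in each token:

```python
def lz_compress(data: bytes, max_offset: int = 255, min_match: int = 3) -> list:
    """Toy LZ77-style compressor: emits literals or (offset, length) pairs.
    A smaller max_offset shrinks the back-reference field, which is the
    spirit of the offset reduction above (the token format here is
    illustrative, not the paper's)."""
    i, out = 0, []
    while i < len(data):
        best_len, best_off = 0, 0
        for j in range(max(0, i - max_offset), i):
            k = 0
            while i + k < len(data) and data[j + k] == data[i + k]:
                k += 1
            if k > best_len:
                best_len, best_off = k, i - j
        if best_len >= min_match:
            out.append(("match", best_off, best_len))
            i += best_len
        else:
            out.append(("lit", data[i]))
            i += 1
    return out

def lz_decompress(tokens) -> bytes:
    """Replays literals and copies; overlapping matches are copied
    byte by byte, as in real LZ decoders."""
    buf = bytearray()
    for t in tokens:
        if t[0] == "lit":
            buf.append(t[1])
        else:
            _, off, length = t
            for _ in range(length):
                buf.append(buf[-off])
    return bytes(buf)
```

A hardware decompressor for such a scheme only needs a small window buffer sized by `max_offset`, which is where the decompression-cost saving comes from.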
[Objective] In response to the issue of insufficient integrity in hourly routine meteorological element data files, this paper aims to improve the availability and reliability of data files and provide high-quality data file support for meteorological forecasting and services. [Method] An efficient and accurate method for data file quality control and fusion processing is developed. By locating the missing measurement times, data are extracted from the "AWZ.db" database and the minute routine meteorological element data file, and merged into the hourly routine meteorological element data file. [Result] Data processing efficiency and accuracy are significantly improved, and the problem of incomplete hourly routine meteorological element data files is solved. The paper also emphasizes the importance of ensuring the accuracy of the files used and of carefully checking and verifying the fusion results, and proposes strategies to improve data quality. [Conclusion] This method provides convenience for observation personnel and effectively improves the integrity and accuracy of data files. In the future, it is expected to provide more reliable data support for meteorological forecasting and services.
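A minimal sketch of the fusion step, under the assumption of hypothetical dictionary-shaped records keyed by timestamp (the real AWZ.db and data-file layouts are fixed formats not reproduced here): missing hours are located and filled from minute-level data.

```python
def fill_missing_hours(hourly: dict, minute: dict, hours: list) -> dict:
    """For each expected hour absent from the hourly records, take the
    reading at minute 00 of that hour from the minute-level records.
    Key formats are illustrative assumptions."""
    merged = dict(hourly)
    for h in hours:
        if h not in merged:
            key = h + ":00"  # e.g. "2024-05-01 08" -> "2024-05-01 08:00"
            if key in minute:
                merged[h] = minute[key]
    return merged
```

In practice each filled value would still pass the quality-control checks the paper stresses before being written back to the hourly file.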
In distributed storage systems, file access efficiency has an important impact on the real-time nature of information forensics. As a popular approach to improving file access efficiency, a prefetching model fetches data before it is needed according to the file access pattern, which reduces I/O waiting time and increases system concurrency. However, a prefetching model needs to mine the degree of association between files to ensure prefetching accuracy. With massive numbers of small files, the sheer volume of files poses a challenge to the efficiency and accuracy of relevance mining. In this paper, we propose a massive-file prefetching model based on an LSTM neural network with a cache transaction strategy to improve file access efficiency. Firstly, we propose a file clustering algorithm based on temporal and spatial locality to reduce computational complexity. Secondly, we define cache transactions according to file occurrence in the cache, instead of time-offset-distance-based methods, to extract file block features accurately. Lastly, we propose a file access prediction algorithm based on an LSTM neural network that predicts the files most likely to be accessed. Experiments show that, compared with the traditional LRU and plain grouping methods, the proposed model notably increases the cache hit rate and effectively reduces I/O wait time.
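As a rough illustration of access prediction, the following uses a simple successor-frequency model as a stand-in for the paper's LSTM (the cache-transaction feature extraction is not reproduced): it learns which file tends to follow which, and the most probable successor is the prefetch candidate.

```python
from collections import defaultdict, Counter

class SuccessorPredictor:
    """Predicts the next file from observed access pairs -- a frequency
    model standing in for the LSTM described above."""
    def __init__(self):
        self.next_counts = defaultdict(Counter)
        self.last = None

    def observe(self, f):
        # Record the (previous file -> current file) transition.
        if self.last is not None:
            self.next_counts[self.last][f] += 1
        self.last = f

    def predict(self, f):
        # Most frequent successor of f, or None if f is unseen.
        counts = self.next_counts.get(f)
        if not counts:
            return None
        return counts.most_common(1)[0][0]
```

A prefetcher would call `predict` on each access and load the returned file into cache ahead of the request.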
In order to improve the performance of peer-to-peer file sharing systems under mobile distributed environments, a novel always-optimally-coordinated (AOC) criterion and a corresponding candidate selection algorithm are proposed in this paper. Compared with the traditional min-hops criterion, the new approach introduces a fuzzy knowledge combination theory to investigate several important factors that influence file transfer success rate and efficiency. Whereas min-hops-based protocols only ask the nearest candidate peer for desired files, the selection algorithm based on AOC comprehensively considers users' preferences and network requirements with flexible balancing rules. Furthermore, its advantage is also expressed in its independence from specific resource discovery protocols, allowing for scalability. The simulation results show that when using the AOC-based peer selection algorithm, system performance is much better than with the min-hops scheme: the file transfer success rate is improved by more than 50% and transfer time is reduced by at least 20%.
In this paper, we analyze the complexity and entropy of different data compression algorithms: LZW, Huffman, fixed-length code (FLC), and Huffman after using fixed-length code (HFLC). We test these algorithms on files of different sizes and conclude that LZW is the best across all compression scales we tested, especially on large files, followed by Huffman, HFLC, and FLC, respectively. Data compression is still an important research topic with many applications. Therefore, we suggest continuing research in this field, trying to combine two techniques to reach a better one, or using another source mapping (Hamming), such as embedding a linear array into a hypercube, together with proven techniques such as Huffman coding.
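The entropy-vs-Huffman comparison underlying such an analysis can be sketched as follows; Shannon entropy lower-bounds the average Huffman code length, which is the yardstick by which the schemes above are compared:

```python
import heapq
from collections import Counter
from math import log2

def entropy(data: bytes) -> float:
    """Shannon entropy in bits per symbol."""
    n = len(data)
    return -sum((c / n) * log2(c / n) for c in Counter(data).values())

def huffman_lengths(data: bytes) -> dict:
    """Code length per symbol from a standard Huffman tree.
    Heap entries carry a tie-break counter so dicts never compare."""
    freq = Counter(data)
    if len(freq) == 1:
        return {next(iter(freq)): 1}
    heap = [(f, i, {s: 0}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**c1, **c2}.items()}
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

def avg_code_length(data: bytes) -> float:
    """Expected Huffman bits per symbol; always >= entropy(data)."""
    lens = huffman_lengths(data)
    freq = Counter(data)
    n = len(data)
    return sum(freq[s] * lens[s] / n for s in freq)
```

For `b"aaabbc"` this gives code lengths {a: 1, b: 2, c: 2} and an average of 1.5 bits/symbol against an entropy of about 1.46 bits/symbol.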
In order to improve the management strategy for personnel files in colleges and universities, simplify the complex process of file management, and improve file management security and the preservation of file content, this paper elaborates on the application of Artificial Intelligence (AI) technology in university personnel file management through theoretical analysis based on an understanding of AI technology.
To better understand different users' accessing intentions, a novel clustering and supervising method based on accessing paths is presented. This method divides users' interest space to express the distribution of users' interests, and directly instructs the construction process of web page indexing for improved performance.
The fast-growing market of mobile device adoption and cloud computing has led to the exploitation of mobile devices utilizing cloud services. One major challenge facing the usage of mobile devices in the cloud environment is mobile synchronization to the cloud, e.g., synchronizing contacts, text messages, images, and videos. Owing to the expected high volume of traffic and the high time complexity required for synchronization, an appropriate synchronization algorithm needs to be developed. Synchronizing compressed files otherwise requires uploading the whole file, even when no changes were made or when the file was only partially changed; delta synchronization addresses this. In the present study, we propose an algorithm, based on delta synchronization, to solve the problem of synchronizing compressed files under various forms of modification (e.g., not modified, partially modified, or completely modified). To measure the efficiency of our proposed algorithm, we compared it to the Dropbox application algorithm. The results demonstrate that our algorithm outperformed the regular Dropbox synchronization mechanism by reducing the synchronization time, cost, and traffic load between clients and the cloud service provider.
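A minimal sketch of block-level delta synchronization (fixed-offset blocks and hash comparison only, with no rolling checksum; this is neither the authors' algorithm nor Dropbox's): the client compares block hashes against the server's signatures and uploads only the blocks that differ.

```python
import hashlib

BLOCK = 4  # tiny block size for illustration; real systems use KB-scale blocks

def signatures(data: bytes) -> list:
    """Hash each fixed-size block of the server-side copy."""
    return [hashlib.sha256(data[i:i + BLOCK]).hexdigest()
            for i in range(0, len(data), BLOCK)]

def delta(new: bytes, old_sigs: list) -> list:
    """Emit ('keep', idx) for unchanged blocks and ('data', bytes)
    for changed or new blocks -- only the latter travel over the wire."""
    ops = []
    for idx, i in enumerate(range(0, len(new), BLOCK)):
        block = new[i:i + BLOCK]
        h = hashlib.sha256(block).hexdigest()
        if idx < len(old_sigs) and old_sigs[idx] == h:
            ops.append(("keep", idx))
        else:
            ops.append(("data", block))
    return ops
```

With compressed archives, the catch motivating the paper is that a small logical edit can change most compressed blocks, which is why the modification form (none / partial / complete) matters.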
In order to improve the performance of wireless distributed peer-to-peer (P2P) file sharing systems, a general system architecture and a novel peer selection model based on fuzzy cognitive maps (FCM) are proposed in this paper. The new model provides an effective approach to choosing an optimal peer from several resource discovery results for the best file transfer. Compared with the traditional min-hops scheme, which uses hop count as the only selection criterion, the proposed model uses FCM to investigate the complex relationships among various relevant factors in wireless environments and gives an overall evaluation score for each candidate. It also has strong scalability, being independent of specific P2P resource discovery protocols. Furthermore, a complete implementation is explained in concrete modules. The simulation results show that the proposed model is effective and feasible compared with the min-hops scheme, with the transfer success rate increased by at least 20% and transfer time improved by as much as 34%.
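The idea of scoring a candidate peer over several factors can be sketched as a single-step weighted sum, which is a simplification of the FCM's iterated concept activations; the factor names and weights below are illustrative, not taken from the paper.

```python
def peer_score(metrics: dict, weights: dict) -> float:
    """Overall evaluation of one candidate peer from normalized
    factor values in [0, 1]. A flat weighted sum standing in for
    the FCM inference described above."""
    return sum(weights[k] * metrics.get(k, 0.0) for k in weights)

def select_peer(candidates: dict, weights: dict) -> str:
    """Pick the candidate with the highest overall score --
    unlike min-hops, proximity is just one factor among several."""
    return max(candidates, key=lambda p: peer_score(candidates[p], weights))
```

A nearer but less stable peer can thus lose to a slightly farther one with better bandwidth and stability, which is the behaviour the min-hops criterion cannot express.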
Aim: The aim of this study was to investigate the shaping ability of thermomechanically treated files manufactured by twisting (Twisted Files, TF) and compare it to a conventional rotary system (K3, Sybron Endo, Orange, CA) in S-shaped canals, including the formation of ledges, zipping, elbows, outer widening, danger zones, perforation, and file deformation. Materials & Methods: Forty S-shaped canals in resin blocks were randomly divided into 2 groups of 20 each. Pre-instrumentation images of the canals were taken with a digital camera and superimposed on images taken after preparation with the TF and K3 systems to apical sizes of 25/06 and 30/06. Canal aberrations were measured from the superimposed images at five levels using AutoCAD. The Fisher exact test and Mann-Whitney test were used for data analysis. Results: The incidence of zipping, elbow formation, and apical transportation was significantly lower in the TF group (P = 0.04). Generally, the incidence of aberrations increased when the apical size increased to 30/06, regardless of the file system. Significant file deformation was evident in the TF group after a single use (P < 0.001). Conclusion: Under the conditions of this study, TF manufactured by the new technique performed better than the K3 system when used up to size 25/06 in simulated S-shaped canals. Clinical significance: The flexibility of thermomechanically treated files is beneficial in canals with multiple curvatures; however, attention should be paid to the instrument taper and the final apical size of the preparation.
In public health emergencies, the collection of archival information is very important and can provide a reference for early-warning and processing mechanisms. At the present stage, archival users' unlimited demand for resources and the convenience of obtaining resources have become the two driving forces promoting the "transformation" and "growth" of archives. In public health emergencies, the archive collection and service mode transforms: social media has become an indispensable platform for users to exchange, share, and transmit information, which requires archives to change their modes of acquisition and storage. Archival users require more interactive, targeted, and cutting-edge forms of access, and archival services are developing toward diversified functions and connotations. Archival information resource sharing is an important link in this development trend. This paper analyzes the collection methods of archive departments in public health emergencies and then puts forward corresponding measures for archive departments to fulfill their functions, such as flexibly meeting archive access needs, strengthening the development of information resources, collecting relevant archives thoroughly, and publicizing archive work in combination with current hot topics. The paper discusses the completeness of archival data collection, the means of archival management, the scientific classification of archival data, and the ways of collecting archival data.
The increase in issue width and instruction window size in modern processors demands an increase in the size of register files, as well as an increase in the number of ports. Bigger register files imply an increase in the power consumed by these units as well as longer access delays. Models that assist in estimating the size of the register file and its timing early in the design cycle are critical to the time budget allocated to a processor design and to its performance. In this work, we discuss a Radial Basis Function (RBF) Artificial Neural Network (ANN) model for the prediction of time and area for standard-cell register files designed using optimized Synopsys DesignWare components and a UMC 130 nm library. The ANN model predictions were compared against experimental results (obtained using detailed simulation) and a nonlinear regression-based model; the ANN model is very accurate and outperformed the nonlinear model on several statistical parameters. Using the trained ANN model, a parametric study was carried out on the effect of the number of words in the file (D), the number of bits per word (W), and the total number of read and write ports (P) on the latency and area of standard-cell register files.
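The flavour of an RBF model mapping design parameters to a predicted quantity can be sketched with exact Gaussian-RBF interpolation in pure Python. The paper trains an RBF ANN on simulation data; this shows only the kernel machinery, on made-up sample points:

```python
from math import exp

def rbf_fit(X, y, gamma=1.0):
    """Exact Gaussian-RBF interpolation: solve K w = y, where
    K[i][j] = exp(-gamma * ||x_i - x_j||^2). Pure-Python sketch;
    a trained RBF ANN would instead fit fewer centers to noisy data."""
    n = len(X)
    K = [[exp(-gamma * sum((a - b) ** 2 for a, b in zip(X[i], X[j])))
          for j in range(n)] for i in range(n)]
    # Gaussian elimination with partial pivoting on [K | y]
    A = [row[:] + [y[i]] for i, row in enumerate(K)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n + 1):
                A[r][c] -= f * A[col][c]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):
        w[r] = (A[r][n] - sum(A[r][c] * w[c] for c in range(r + 1, n))) / A[r][r]
    return w

def rbf_predict(X, w, x, gamma=1.0):
    """Weighted sum of Gaussian kernels centered on the training points."""
    return sum(wi * exp(-gamma * sum((a - b) ** 2 for a, b in zip(xi, x)))
               for wi, xi in zip(w, X))
```

In the paper's setting, each training point would be a (D, W, P) triple with a simulated latency or area as its target value.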
Two-time Golden Reel Award winner Will Files is a Hollywood sound designer and re-recording mixer, described by Hollywood reporters as one of the industry's "most outstanding and talented artists under 35." He has worked on many Hollywood blockbusters and box-office hits, including Marvel's latest Fantastic Four and Dawn of the Planet of the Apes.
Hospital personnel file management covers gathering hospital personnel files, classifying and filing them, and making them available for subsequent access and use. Personnel file management is a basic part of hospital personnel management and plays a key role in the operation of the whole hospital. Hospital personnel files record each employee's personal information, including personal experience, political outlook, professional ability, and moral character. These files contain a large amount of information, are very numerous, cover a very wide range, are highly specialized, and involve a relatively complex collection process. Especially amid the continuous and deepening reform of the medical and health system, hospital personnel files carry more and more information, and traditional file management methods can no longer meet the needs of hospital development in the new era for managing and using human resources files. We must therefore strengthen the use and management of personnel file information so that it can better promote the development of hospitals.
Funding: Supported in part by the National Key Laboratory of Science and Technology on Space Microwave (Grant Nos. HTKJ2022KL504009 and HTKJ2022KL5040010).
Funding: The Fifth Batch of Innovation Teams of Wuzhou Meteorological Bureau, "Wuzhou Innovation Team for Enhancing the Comprehensive Meteorological Observation Ability through Digitization and Intelligence"; Wuzhou Science and Technology Planning Project (202402122, 202402119).
Funding: This work is supported by the Fundamental Research Funds for the Central Universities (Grant No. HIT.NSRIF.201714), the Weihai Science and Technology Development Program (2016DXGJMS15), and the Key Research and Development Program of Shandong Province (2017GGX90103).
Funding: Supported by the National Natural Science Foundation of China (No. 60672124) and the National High Technology Research and Development Program of China (No. 2007AA01Z221).
Funding: Sponsored by the National Natural Science Foundation of China (Grant Nos. 60672124 and 60832009) and the Hi-Tech Research and Development Program of China (National 863 Program) (Grant No. 2007AA01Z221).