Weather forecasts from numerical weather prediction models play a central role in solar energy forecasting, where a cascade of physics-based models is used in a model chain approach to convert forecasts of solar irradiance to solar power production. Ensemble simulations from such weather models aim to quantify uncertainty in the future development of the weather, and can be used to propagate this uncertainty through the model chain to generate probabilistic solar energy predictions. However, ensemble prediction systems are known to exhibit systematic errors, and thus require post-processing to obtain accurate and reliable probabilistic forecasts. The overarching aim of our study is to systematically evaluate different strategies to apply post-processing in model chain approaches with a specific focus on solar energy: not applying any post-processing at all; post-processing only the irradiance predictions before the conversion; post-processing only the solar power predictions obtained from the model chain; or applying post-processing in both steps. In a case study based on a benchmark dataset for the Jacumba solar plant in the U.S., we develop statistical and machine learning methods for post-processing ensemble predictions of global horizontal irradiance (GHI) and solar power generation. Further, we propose a neural-network-based model for direct solar power forecasting that bypasses the model chain. Our results indicate that post-processing substantially improves the solar power generation forecasts, in particular when post-processing is applied to the power predictions. The machine learning methods for post-processing slightly outperform the statistical methods, and the direct forecasting approach performs comparably to the post-processing strategies.
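To make the statistical post-processing step concrete, below is a minimal sketch of ensemble model output statistics (EMOS), one standard technique in this family: a Gaussian predictive distribution whose mean and variance are affine functions of the ensemble mean and variance, fitted by minimizing the closed-form CRPS. The data here are synthetic stand-ins, not the Jacumba benchmark, and the model is a generic illustration rather than the methods developed in the paper.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def gaussian_crps(params, ens_mean, ens_var, obs):
    # EMOS: mu = a + b * ens_mean, sigma^2 = c + d * ens_var (kept positive)
    a, b, c, d = params
    mu = a + b * ens_mean
    sigma = np.sqrt(np.abs(c) + np.abs(d) * ens_var + 1e-9)
    z = (obs - mu) / sigma
    # closed-form CRPS of a Gaussian predictive distribution
    crps = sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))
    return crps.mean()

# synthetic stand-in data: 500 forecast cases, 20 ensemble members
rng = np.random.default_rng(0)
ens = rng.gamma(2.0, 150.0, size=(500, 20))
obs = ens.mean(axis=1) + rng.normal(0.0, 40.0, size=500)

res = minimize(gaussian_crps, x0=[0.0, 1.0, 1.0, 1.0],
               args=(ens.mean(axis=1), ens.var(axis=1), obs))
print("fitted EMOS coefficients (a, b, c, d):", res.x)
```

The same recipe applies at either stage of the model chain: fit on GHI ensembles before the conversion, or on the power ensembles after it.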
Laser additive manufacturing (LAM) of titanium (Ti) alloys has emerged as a transformative technology with vast potential across multiple industries. To recap the state of the art, Ti alloys processed by two essential LAM techniques (i.e., laser powder bed fusion and laser-directed energy deposition) are reviewed, covering the aspects of processes, materials and post-processing. The impacts of process parameters and strategies for optimizing parameters are elucidated. Various types of Ti alloys processed by LAM, including α-Ti, (α+β)-Ti, and β-Ti alloys, are overviewed in terms of microstructures and benchmarking properties. Furthermore, the post-processing methods for improving the performance of LAM-processed Ti alloys, including conventional and novel heat treatment, hot isostatic pressing, and surface processing (e.g., ultrasonic and laser shot peening), are systematically reviewed and discussed. The review summarizes the process windows, properties, and performance envelopes and benchmarks the research achievements in LAM of Ti alloys. Outlooks on further trends in LAM of Ti alloys are also highlighted at the end of the review. This comprehensive review could serve as a valuable resource for researchers and practitioners, promoting further advancements in LAM-built Ti alloys and their applications.
With the increase in the quantity and scale of Static Random-Access Memory Field Programmable Gate Arrays (SRAM-based FPGAs) for aerospace applications, the volume of FPGA configuration bit files that must be stored has increased dramatically. The use of compression techniques for these bitstream files is emerging as a key strategy to alleviate the burden on storage resources. Due to the severe resource constraints of space-based electronics and the unique application environment, the simplicity, efficiency and robustness of the decompression circuitry is also a key design consideration. Through comparative analysis of current bitstream file compression technologies, this research suggests that the Lempel Ziv Oberhumer (LZO) compression algorithm is more suitable for satellite applications. This paper also delves into the compression process and format of the LZO compression algorithm, as well as the inherent characteristics of configuration bitstream files. We propose an improved algorithm based on LZO for bitstream file compression, which optimises the compression process by refining the format and reducing the offset. Furthermore, a low-cost, robust decompression hardware architecture is proposed based on this method. Experimental results show that the compression speed of the improved LZO algorithm is increased by 3%, the decompression hardware cost is reduced by approximately 60%, and the compression ratio is slightly reduced by 0.47%.
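LZO belongs to the LZ77 family: the output is a stream of literals and (offset, length) back-references into a sliding window, so capping the maximum offset shrinks the token encoding. The sketch below is a generic LZ77-style compressor illustrating that trade-off; it is not the real LZO token format nor the paper's improved variant.

```python
# Minimal LZ77-style sketch: with a window of 2**12 bytes, every
# back-reference fits in 12 offset bits + 4 length bits (2 bytes),
# which is the kind of saving a reduced-offset format exploits.
def compress(data: bytes, window: int = 1 << 12, min_match: int = 3,
             max_match: int = 18):
    i, tokens = 0, []
    while i < len(data):
        best_len, best_off = 0, 0
        for j in range(max(0, i - window), i):   # naive match search
            k = 0
            while (k < max_match and i + k < len(data)
                   and data[j + k] == data[i + k]):
                k += 1
            if k > best_len:
                best_len, best_off = k, i - j
        if best_len >= min_match:
            tokens.append(("match", best_off, best_len))  # compact token
            i += best_len
        else:
            tokens.append(("literal", data[i]))           # raw byte
            i += 1
    return tokens

print(compress(b"abcabcabcabcXYZ")[:5])
```

Configuration bitstreams contain long repeated frames at short distances, which is why a small-offset scheme can keep the compression ratio while simplifying the decompressor.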
The healthcare sector involves many steps to ensure efficient care for patients, such as appointment scheduling, consultation plans, online follow-up, and more. However, existing healthcare mechanisms are unable to facilitate a large number of patients, as these systems are centralized and hence vulnerable to various issues, including single points of failure, performance bottlenecks, and substantial monetary costs. Furthermore, these mechanisms are unable to provide an efficient means of protecting data against unauthorized access. To address these issues, this study proposes a blockchain-based authentication mechanism that authenticates all healthcare stakeholders based on their credentials. Furthermore, we also utilize the capabilities of the InterPlanetary File System (IPFS) to store the Electronic Health Record (EHR) in a distributed way. This IPFS platform addresses not only the issue of high data storage costs on blockchain but also the issue of a single point of failure in the traditional centralized data storage model. The simulation results demonstrate that our model outperforms the benchmark schemes and provides an efficient mechanism for managing healthcare sector operations. The results show that it takes approximately 3.5 s for the smart contract to authenticate a node and provide it with the decryption key, which is ultimately used to access the data. The simulation results also show that our proposed model outperforms existing solutions in terms of execution time and scalability: our smart contract executes around 9000 transactions in just 6.5 s, while benchmark schemes require approximately 7 s for the same number of transactions.
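The division of labour described here — bulky records off-chain in content-addressed storage, only their hashes on-chain — can be illustrated with a toy sketch. The FakeIPFS class and the ledger list below are stand-ins of our own, not the paper's implementation nor the real IPFS or smart-contract APIs.

```python
import hashlib, json, time

class FakeIPFS:
    """Toy content-addressed store standing in for a real IPFS node."""
    def __init__(self):
        self._blocks = {}
    def add(self, data: bytes) -> str:
        cid = hashlib.sha256(data).hexdigest()  # stand-in for a real CID
        self._blocks[cid] = data
        return cid
    def cat(self, cid: str) -> bytes:
        return self._blocks[cid]

ipfs = FakeIPFS()
ledger = []  # stand-in for on-chain records: only hashes and metadata go here

ehr = json.dumps({"patient": "p-001", "notes": "hypothetical record"}).encode()
cid = ipfs.add(ehr)                        # bulky EHR lives off-chain
ledger.append({"cid": cid, "owner": "p-001", "ts": time.time()})
assert ipfs.cat(ledger[-1]["cid"]) == ehr  # hash on the ledger verifies integrity
```

Because the on-chain record is just a fixed-size hash, storage cost no longer grows with record size, and any tampering with the off-chain copy breaks the hash check.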
[Objective] In response to the issue of insufficient integrity in hourly routine meteorological element data files, this paper aims to improve the availability and reliability of data files, and to provide high-quality data file support for meteorological forecasting and services. [Method] An efficient and accurate method for data file quality control and fusion processing is developed. By locating the missing measurement times, data are extracted from the "AWZ.db" database and the minute routine meteorological element data file, and merged into the hourly routine meteorological element data file. [Result] Data processing efficiency and accuracy are significantly improved, and the problem of incomplete hourly routine meteorological element data files is solved. At the same time, the paper emphasizes the importance of ensuring the accuracy of the files used and of carefully checking and verifying the fusion results, and proposes strategies to improve data quality. [Conclusion] This method provides convenience for observation personnel and effectively improves the integrity and accuracy of data files. In the future, it is expected to provide more reliable data support for meteorological forecasting and services.
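A minimal sketch of the gap-filling step: locate a missing hour, pull the matching record from a SQLite database, and return it for merging into the hourly file. The table name and columns below are hypothetical — the actual "AWZ.db" schema is not described in the abstract.

```python
import sqlite3

# Hypothetical schema: table "minute_obs(obs_time TEXT, temp REAL, rh REAL)".
# For a missing hour, pull the on-the-hour minute record (HH:00) and hand it
# back for merging into the hourly data file.
def fill_missing_hour(db_path: str, missing_hour: str) -> dict:
    con = sqlite3.connect(db_path)
    row = con.execute(
        "SELECT obs_time, temp, rh FROM minute_obs WHERE obs_time = ?",
        (missing_hour + ":00",),
    ).fetchone()
    con.close()
    if row is None:
        raise ValueError(f"no minute record covers {missing_hour}")
    return {"obs_time": row[0], "temp": row[1], "rh": row[2]}

# usage: record = fill_missing_hour("AWZ.db", "2024-06-01 14")
```

In line with the paper's emphasis on verification, a real pipeline would cross-check the merged value against neighbouring hours before accepting it.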
At present, the polymerase chain reaction (PCR) amplification-based file retrieval method is the most commonly used and effective means of DNA file retrieval. The number of orthogonal primers limits the number of files that can be accurately accessed, which in turn affects the storage density of a single oligo pool in digital DNA storage. In this paper, a multi-mode DNA sequence design method based on PCR file retrieval in a single oligonucleotide pool is proposed for high-capacity DNA data storage. Firstly, by analyzing the maximum number of orthogonal primers at each predicted primer length, it was found that the relationship between primer length and the maximum available primer number does not increase linearly, and the maximum number of orthogonal primers is on the order of 10^4. Next, this paper analyzes the maximum address space capacity of DNA sequences with different types of primer binding sites for file mapping. In the case where the capacity of the primer library is R (where R is even), the number of address spaces that can be mapped by the single-primer DNA sequence design scheme proposed in this paper is four times that of the previous one, and the two-level primer DNA sequence design scheme can reach [R/2·(R/2−1)]^2 times. Finally, a multi-mode DNA sequence generation method is designed based on the number of files to be stored in the oligonucleotide pool, in order to meet the requirements of random retrieval of target files in an oligonucleotide pool with large-scale file numbers. The performance of the primers generated by the orthogonal primer library generator proposed in this paper is verified, and the average Gibbs free energy of the most stable heterodimer formed between the generated orthogonal primers is −1 kcal·mol^(−1) (1 kcal = 4.184 kJ). At the same time, by selectively PCR-amplifying the DNA sequences of the two-level primer binding sites for random access, the target sequence can be accurately read with a minimum of 10^3 reads, when the primer binding site sequences at different positions are mutually different. This paper provides a pipeline for orthogonal primer library generation and multi-mode mapping schemes between files and primers, which can help achieve precise random access to files in large-scale DNA oligo pools.
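The address-space arithmetic quoted above is easy to reproduce. The baseline of one file per fixed forward/reverse primer pair is our assumption for illustration; the two factors (4× and [R/2·(R/2−1)]²×) are taken directly from the abstract.

```python
# Address-space counts for a primer library of even size R, split into
# R/2 forward and R/2 reverse primers. Baseline mapping (assumed here):
# one file per fixed forward/reverse primer pair.
R = 20_000                                   # ~10^4 orthogonal primers
half = R // 2
baseline = half                              # assumed conventional capacity
single_primer = 4 * baseline                 # "four times that of the previous one"
two_level_factor = (half * (half - 1)) ** 2  # multiplier quoted in the abstract

print(f"single-primer scheme: {single_primer:,} files")
print(f"two-level scheme: {baseline * two_level_factor:.3e} files")
```

The quadratic growth of the two-level factor is the point: combining two ordered primer-pair choices per sequence turns a library of ~10^4 primers into an address space many orders of magnitude larger.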
Images and videos play an increasingly vital role in daily life and are widely utilized as key evidentiary sources in judicial investigations and forensic analysis. Simultaneously, advancements in image and video processing technologies have facilitated the widespread availability of powerful editing tools, such as Deepfakes, enabling anyone to easily create manipulated or fake visual content, which poses an enormous threat to social security and public trust. To verify the authenticity and integrity of images and videos, numerous approaches have been proposed; these are primarily based on content analysis, and their effectiveness is susceptible to interference from various image or video post-processing operations. Recent research has highlighted the potential of file container analysis as a promising forensic approach that offers efficient and interpretable results. However, there is still a lack of review articles on this kind of approach. In order to fill this gap, we present a comprehensive review of file-container-based image and video forensics in this paper. Specifically, we categorize the existing methods into two distinct stages, qualitative analysis and quantitative analysis. In addition, an overall framework is proposed to organize the existing approaches. Then, the advantages and disadvantages of the schemes used across different forensic tasks are provided. Finally, we outline the trends in this research area, aiming to provide valuable insights and technical guidance for future research.
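A small example of the kind of container evidence these methods rely on: the order, types and sizes of the top-level boxes in an ISO-BMFF (MP4/MOV) file, which typically differ between camera originals and files re-saved by editing software. The parser below is a generic illustration of reading that structure, not a method from the reviewed literature.

```python
import struct

def top_level_boxes(path):
    """List top-level ISO-BMFF (MP4/MOV) boxes in file order.
    The sequence of box types (e.g. ftyp, moov, mdat) and their sizes
    form a container fingerprint that survives even when pixel-level
    traces have been destroyed by recompression."""
    boxes = []
    with open(path, "rb") as f:
        while True:
            header = f.read(8)
            if len(header) < 8:
                break
            size, btype = struct.unpack(">I4s", header)
            name = btype.decode("latin1")
            if size == 1:                    # 64-bit extended size follows
                size = struct.unpack(">Q", f.read(8))[0]
                f.seek(size - 16, 1)
            elif size == 0:                  # box runs to end of file
                boxes.append((name, "to-EOF"))
                break
            else:
                f.seek(size - 8, 1)          # skip box payload
            boxes.append((name, size))
    return boxes

# usage: print(top_level_boxes("clip.mp4"))
```

Qualitative analysis compares such box sequences against known device profiles; quantitative analysis turns them into features for statistical classification.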
In the present computational fluid dynamics (CFD) community, post-processing is regarded as a procedure to view parameter distributions, detect characteristic structures and reveal the physical mechanisms of fluid flow based on computational or experimental results. Field plots by contours, iso-surfaces, streamlines, vectors and others are traditional post-processing techniques. However, the shock wave, an important and critical flow structure in many aerodynamic problems, can hardly be detected or distinguished in a direct way using these traditional methods, due to possible confusion with other similar discontinuous flow structures such as slip lines, contact discontinuities, etc. Therefore, methods for automatic detection of shock waves in post-processing are of great importance for both academic research and engineering applications. In this paper, the current status of methodologies developed for shock wave detection and their implementation in post-processing platforms is reviewed, together with discussion of the advantages and limitations of the existing methods and proposals for further studies of shock wave detection. We also develop advanced post-processing software with improved shock detection.
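For illustration, one widely used family of detectors flags cells where the Mach number component along the local pressure gradient reaches unity (the normal-Mach-number criterion), which distinguishes shocks from slip lines and contact discontinuities since only shocks carry a pressure jump. The sketch below applies it to 2D fields on a uniform grid; it is a generic example, not the software developed in the paper.

```python
import numpy as np

def shock_sensor(u, v, p, a, dx, dy):
    """Flag cells where the Mach number component along the pressure
    gradient reaches 1 (normal-Mach-number shock criterion).
    u, v: velocity components; p: pressure; a: local sound speed.
    Arrays are indexed [y, x] on a uniform grid with spacings dy, dx."""
    dpdy, dpdx = np.gradient(p, dy, dx)
    gmag = np.hypot(dpdx, dpdy) + 1e-30          # avoid division by zero
    mach_normal = (u * dpdx + v * dpdy) / (a * gmag)
    return mach_normal >= 1.0
```

Across a slip line the pressure is continuous, so the pressure gradient (and hence the sensor response) vanishes there — exactly the discrimination the traditional contour plot cannot make automatically.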
Quantum random number generators adopting single-photon detection have been restricted due to the non-negligible dead time of avalanche photodiodes (APDs). We propose a new approach based on an APD array to improve the generation rate of random numbers significantly. This method compares the detectors' responses to consecutive optical pulses and generates the random sequence. We implement a demonstration experiment to show its simplicity, compactness and scalability. The generated numbers are proved to be unbiased, post-processing free, ready to use, and their randomness is verified by using the National Institute of Standards and Technology (NIST) statistical test suite. The random bit generation efficiency is as high as 32.8% and the potential generation rate adopting the 32×32 APD array is up to tens of Gbit/s.
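The comparison step lends itself to a von-Neumann-style extraction: for each detector, take two consecutive pulses and keep a bit only when exactly one of them produced a click. The mapping below ((click, no-click) → 1, (no-click, click) → 0) and the toy click statistics are our assumptions for illustration, not necessarily the paper's exact scheme; the appeal of such pairwise comparison is that the output is unbiased without further post-processing.

```python
import numpy as np

rng = np.random.default_rng(1)
# clicks[i, t]: whether detector i fired on optical pulse t.
# Toy data with a 12% click probability; a real setup reads the APD array.
clicks = rng.random((1024, 10_000)) < 0.12

first, second = clicks[:, 0::2], clicks[:, 1::2]  # pair consecutive pulses
keep = first != second                            # discard 00 and 11 pairs
bits = first[keep].astype(np.uint8)               # (1,0) -> 1, (0,1) -> 0
efficiency = bits.size / clicks.size
print(f"bits generated: {bits.size}, efficiency: {efficiency:.1%}")
```

As long as the two pulses in a pair are identically distributed, the 10 and 01 outcomes are equiprobable, so the kept bits are unbiased even if individual detectors differ in efficiency.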
In order to carry out numerical simulation using geologic structural data obtained from Landmark (a seismic interpretation system), underground geological structures are abstracted into mechanical models which can reflect actual situations and facilitate their computation and analysis. Given the importance of model building, further processing methods for traditional seismic interpretation results from Landmark should be studied, so that the processed results can be directly used in numerical simulation computations. Through this data conversion procedure, Landmark and FLAC (a widely used stress analysis code) are seamlessly connected. Thus, the format conversion between the two systems and the pre- and post-processing in simulation computation are realized. A practical application indicates that this method has many advantages, such as simple operation, high accuracy of the element subdivision and high speed, which may definitely satisfy the actual needs of floor grid cutting.
The travel time data collection method is used to assist congestion management. The use of traditional sensors (e.g. inductive loops, AVI sensors) or more recent Bluetooth sensors installed on major roads for collecting data is not sufficient because of their limited coverage and the expensive costs of installation and maintenance. Application of the Global Positioning System (GPS) to travel time and delay data collection is proven to be efficient in terms of accuracy, level of detail of the data, and required data-collection manpower. While data collection automation is improved by the GPS technique, human errors can easily find their way into the post-processing phase, and therefore data post-processing remains a challenge, especially in the case of big projects with large amounts of data. This paper introduces a stand-alone post-processing tool called GPS Calculator, which provides an easy-to-use environment to carry out data post-processing. This is a Visual Basic application that processes the data files obtained in the field and integrates them into Geographic Information Systems (GIS) for analysis and representation. The results show that this tool obtains similar results to the currently used data post-processing method, reduces the post-processing effort, and also eliminates the need for a second person during the data collection.
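The core computations in this kind of GPS post-processing are simple: great-circle distances between fixes, elapsed travel time, and stopped delay. The sketch below derives these three quantities from a track of (time, lat, lon) fixes; the stop-speed threshold is an assumed parameter for illustration, not a documented setting of GPS Calculator.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    R = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = p2 - p1, math.radians(lon2 - lon1)
    h = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(h))

def summarize(track, stop_speed_ms=2.0):
    """track: list of (t_seconds, lat, lon) fixes in time order.
    Returns (distance_m, travel_time_s, stopped_delay_s), where delay is
    the time spent below the stop-speed threshold."""
    dist = delay = 0.0
    for (t0, la0, lo0), (t1, la1, lo1) in zip(track, track[1:]):
        d = haversine_m(la0, lo0, la1, lo1)
        dt = t1 - t0
        dist += d
        if dt > 0 and d / dt < stop_speed_ms:
            delay += dt
    return dist, track[-1][0] - track[0][0], delay
```

Automating these per-segment summaries is precisely where a dedicated tool removes the manual spreadsheet work that invites human error.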
When castings become complicated and the demands on the precision of numerical simulation become higher, the numerical data of casting simulation become massive. On a general personal computer, these massive numerical data may exceed the capacity of available memory, resulting in failure of rendering. Based on the out-of-core technique, this paper proposes a method to effectively utilize external storage and reduce memory usage dramatically, so as to solve the problem of insufficient memory for massive data rendering on general personal computers. Based on this method, a new post-processor is developed. It can illustrate the filling and solidification processes of casting, as well as thermal stress. The new post-processor also provides fast interaction with simulation results. Theoretical analysis as well as several practical examples prove that the memory usage and loading time of the post-processor do not depend on the size of the relevant files, but only on the proportion of cells on the surface. Meanwhile, rendering and picking values with the mouse are fast enough that the demands of real-time interaction are satisfied.
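The essence of the out-of-core idea can be shown with a memory-mapped array: the result file stays on disk, and only the pages actually touched — for example, the surface cells that rendering needs — are paged into memory. The file name, dtype and surface-cell index below are hypothetical stand-ins, not the paper's file format.

```python
import numpy as np

n_cells = 10_000_000                        # toy size; real cases are larger
# Stand-in for a simulation dump: create the file once, then map it read-only.
np.memmap("results.bin", dtype=np.float32, mode="w+", shape=(n_cells,)).flush()

temp = np.memmap("results.bin", dtype=np.float32, mode="r", shape=(n_cells,))
surface_ids = np.arange(0, n_cells, 1000)   # hypothetical surface-cell index
surface_temp = temp[surface_ids]            # only these pages are read from disk
print(surface_temp.shape, surface_temp.dtype)
```

Because only surface cells are fetched, the resident memory scales with the surface proportion rather than the file size — consistent with the behaviour the paper reports for its post-processor.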
To improve the ability to detect underwater targets in strong wideband interference environments, an efficient method of line spectrum extraction is proposed, which fully utilizes the feature of the target spectrum that high-intensity, stable line spectra are superimposed on a wide continuous spectrum. This method modifies the traditional beamforming algorithm by calculating and fusing the beamforming results over multiple frequency bands and azimuth intervals, providing an excellent way to extract the line spectrum when the interference and the target are not in the same azimuth interval. The statistical efficiency of the estimated azimuth variance and the corresponding power of the line spectrum band depend on the line spectra ratio (LSR) of the line spectrum. The laws governing how the output signal-to-noise ratio (SNR) changes with the LSR, the input SNR, the integration time and the filtering bandwidth of the different algorithms yield a selection principle for the critical LSR. On this basis, the detection gains of wideband energy integration and the narrowband line spectrum algorithm are theoretically analyzed. The simulated detection gain demonstrates a good match with the theoretical model. The application conditions of all methods are verified by receiver operating characteristic (ROC) curves and experimental data from Qiandao Lake. In fact, combining the two methods for target detection reduces the missed detection rate. The proposed two-dimensional post-processing method, with a Kalman filter in the time dimension and a background equalization algorithm in the azimuth dimension, makes use of the strong correlation between adjacent frames, and can further remove background fluctuation and improve the display effect.
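One simple way to realize the line-versus-continuum separation that the LSR quantifies is to estimate the broadband background with a running median and flag bins that stand well above it. This sketch is a generic illustration of that idea, not the paper's multi-band fusion algorithm; the kernel width and threshold are assumed parameters.

```python
import numpy as np
from scipy.signal import medfilt

def extract_lines(psd, kernel=31, lsr_thresh=3.0):
    """Flag line-spectrum bins: a running median estimates the continuous
    background, and bins exceeding it by lsr_thresh are kept as lines."""
    background = medfilt(psd, kernel_size=kernel)
    lsr = psd / np.maximum(background, 1e-30)
    return np.where(lsr > lsr_thresh)[0], lsr

# toy spectrum: smooth continuum plus two stable line components
f = np.linspace(0.0, 1.0, 1024)
psd = 1.0 / (1.0 + 5.0 * f)          # continuous spectrum
psd[100] += 3.0
psd[400] += 2.0                      # line components
print(extract_lines(psd)[0])         # -> [100 400]
```

In the full method, such per-band line estimates would then be fused across frequency bands and azimuth intervals before the Kalman/background-equalization post-processing.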
This paper proposes improvements to a low bit rate parametric audio coder with a sinusoidal model as its kernel. Firstly, we propose a new method to effectively order and select the perceptually most important sinusoids: the sinusoid which contributes most to the reduction of the overall noise-to-mask ratio (NMR) is chosen. Combined with our improved parametric psychoacoustic model and advanced peak riddling techniques, the number of sinusoids required can be greatly reduced and the coding efficiency greatly enhanced. A lightweight version is also given to reduce the amount of computation with only a small sacrifice in performance. Secondly, we propose two enhancement techniques for sinusoid synthesis: bandwidth enhancement and line enhancement. With little overhead, the effective bandwidth can be extended by one more octave; the timbre tends to sound much brighter, thicker and more pleasant.
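Greedy selection of this kind is essentially matching pursuit over spectral peaks. The sketch below picks, at each step, the FFT bin whose removal takes out the most residual energy; a perceptual coder would rank candidates by NMR reduction instead, which is the refinement the paper proposes.

```python
import numpy as np

def select_sinusoids(x, n_partials=10):
    """Greedy (matching-pursuit style) sinusoid selection: repeatedly pick
    the spectral peak removing the most residual energy. A perceptual coder
    would weight each candidate by its NMR reduction rather than raw energy."""
    n = len(x)
    resid = x.astype(float).copy()
    t = np.arange(n)
    picked = []
    for _ in range(n_partials):
        spec = np.fft.rfft(resid)
        k = np.argmax(np.abs(spec[1:])) + 1            # strongest bin, skip DC
        amp, phase = 2 * np.abs(spec[k]) / n, np.angle(spec[k])
        resid -= amp * np.cos(2 * np.pi * k * t / n + phase)
        picked.append((k, amp, phase))
    return picked, resid

# usage: partials, residual = select_sinusoids(signal_frame, n_partials=20)
```

Ordering by contribution means the bit budget can be cut off after any number of partials while always keeping the most valuable ones — the property that drives the coding-efficiency gain.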
Low contrast of Magnetic Resonance (MR) images limits the visibility of subtle structures and adversely affects the outcome of both subjective and automated diagnosis. State-of-the-art contrast boosting techniques intolerably alter the inherent features of MR images. Drastic changes in brightness features induced by post-processing are not appreciated in medical imaging, as the grey level values have certain diagnostic meanings. To overcome these issues, this paper proposes an algorithm that enhances the contrast of MR images while preserving the underlying features as well. This method, termed Power-law and Logarithmic Modification-based Histogram Equalization (PLMHE), partitions the histogram of the image into two sub-histograms after a power-law transformation and a log compression. After a modification intended to improve the dispersion of the sub-histograms and subsequent normalization, cumulative histograms are computed. Enhanced grey level values are computed from the resultant cumulative histograms. The performance of the PLMHE algorithm is compared with traditional histogram equalization based algorithms, and it has been observed from the results that PLMHE can boost image contrast without causing dynamic range compression, a significant change in mean brightness, or contrast overshoot.
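A simplified sketch of the pipeline described above: power-law transform, log compression, split the histogram at the median, equalize each sub-histogram separately, then recombine. The split point, gamma value and the exact "dispersion modification" step are our assumptions, not the authors' specification; median-split equalization is used here as the generic brightness-preserving device.

```python
import numpy as np

def plmhe_like(img, gamma=0.8):
    """Simplified PLMHE-style enhancement (assumed parameters)."""
    x = img.astype(float) / img.max()
    x = np.log1p(x ** gamma)                    # power-law then log compression
    x = (255 * x / x.max()).astype(np.uint8)
    med = int(np.median(x))
    out = x.copy()
    for lo, hi in ((0, med), (med + 1, 255)):   # two sub-histograms
        mask = (x >= lo) & (x <= hi)
        if not mask.any() or hi <= lo:
            continue
        hist, _ = np.histogram(x[mask], bins=hi - lo + 1, range=(lo, hi + 1))
        cdf = hist.cumsum() / hist.sum()        # cumulative sub-histogram
        out[mask] = (lo + np.round(cdf[x[mask] - lo] * (hi - lo))).astype(np.uint8)
    return out

# usage: enhanced = plmhe_like(mr_slice_uint8)
```

Equalizing the two halves independently keeps each pixel on its own side of the median, which is why this family of methods limits mean-brightness shift compared with global histogram equalization.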
In the analysis of high-rise buildings, traditional displacement-based plane elements are often used to obtain the in-plane internal forces of shear walls by stress integration. Limited by the singularity problem produced by wall holes and the loss of precision induced by using the differential method to derive strains, displacement-based elements cannot always provide accuracy sufficient for design. In this paper, a hybrid post-processing procedure based on the Hellinger-Reissner variational principle is used to improve the stress precision of two quadrilateral plane elements. In order to find the best stress field, three different forms are assumed for the displacement-based plane elements with drilling DOF. Numerical results show that, by using the proposed method, the accuracy of the stress solutions of these two displacement-based plane elements can be improved.
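For reference, a standard two-field statement of the Hellinger-Reissner principle from which such hybrid stress-recovery procedures start (sign conventions vary between texts; the element-level assumed-stress forms are specific to the paper). Here S is the compliance tensor, ε(u) the symmetric gradient of the displacement, b the body force and t̄ the prescribed traction on Γ_t:

```latex
\Pi_{HR}(\boldsymbol{\sigma},\mathbf{u}) =
  \int_{\Omega}\left[-\tfrac{1}{2}\,\boldsymbol{\sigma}:\mathbf{S}:\boldsymbol{\sigma}
  + \boldsymbol{\sigma}:\boldsymbol{\varepsilon}(\mathbf{u})\right] d\Omega
  - \int_{\Omega}\mathbf{b}\cdot\mathbf{u}\, d\Omega
  - \int_{\Gamma_t}\bar{\mathbf{t}}\cdot\mathbf{u}\, d\Gamma
```

Stationarity with respect to σ enforces ε(u) = S:σ weakly, which is what lets an assumed stress field be fitted, element by element, to the already-computed displacement solution instead of differentiating the shape functions directly.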