In real industrial scenarios, equipment cannot be operated in a faulty state for a long time, resulting in a very limited number of available fault samples; data augmentation with generative adversarial networks has therefore found wide application for such small-sample data. However, the generative adversarial networks currently applied in industrial processes impose no realistic physical constraints on data generation, so the generated data lack physical consistency. To address this problem, this paper proposes a physical consistency-based WGAN, designs a loss function containing physical constraints for industrial processes, and validates the effectiveness of the method on a common dataset in the field of industrial process fault diagnosis. The experimental results show that the proposed method not only makes the generated data consistent with the physical constraints of the industrial process but also achieves better fault diagnosis performance than existing GAN-based methods.
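The core idea of such a physics-constrained generator loss can be sketched as a standard WGAN generator term plus a weighted penalty on the residual of a process constraint. The sketch below is illustrative only, assuming a toy conservation constraint (two channels of each generated sample must sum to a constant); the paper's actual constraints and weighting are not specified here.

```python
import numpy as np

def physics_penalty(batch, total=1.0):
    # Toy constraint: channels 0 and 1 of each generated sample must
    # sum to a conserved quantity (a stand-in for a real mass/energy
    # balance of the industrial process). Mean squared residual.
    residual = batch[:, 0] + batch[:, 1] - total
    return np.mean(residual ** 2)

def generator_loss(critic_scores, batch, lam=10.0):
    # WGAN generator term (maximise the critic score on fake data)
    # plus the physics-consistency penalty weighted by lam.
    return -np.mean(critic_scores) + lam * physics_penalty(batch)

rng = np.random.default_rng(0)
fake = rng.normal(size=(64, 4))       # pretend generator output
scores = rng.normal(size=64)          # pretend critic scores
print(round(float(generator_loss(scores, fake)), 4))
```

A batch that satisfies the constraint exactly reduces the loss to the plain WGAN term, so the penalty only steers generation toward physically consistent samples.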
Data processing of small samples is an important and valuable research problem in electronic equipment testing. Because it is difficult and complex to determine the probability distribution of small samples, traditional probability theory is hard to apply for processing the samples and assessing the degree of uncertainty. Using grey relational theory and norm theory, this article proposes the grey distance information approach, which is based on the grey distance information quantity of a sample and the average grey distance information quantity of the samples. The definitions of these two quantities, with their characteristics and algorithms, are introduced. The correlative problems, including the algorithm for the estimated value, the standard deviation, and the acceptance and rejection criteria for the samples and estimated results, are also addressed. Moreover, the information whitening ratio is introduced to select the weight algorithm and to compare different samples. Several examples demonstrate the application of the proposed approach and show that it is feasible and effective while making no demand on the probability distribution of the small samples.
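The grey distance quantities above build on grey relational ideas. As background, the classic grey relational grade (Deng's formulation, with the conventional distinguishing coefficient 0.5) can be computed as below; this is a standard textbook measure, not the paper's own grey distance information quantity.

```python
import numpy as np

def grey_relational_grade(reference, candidate, rho=0.5):
    # Deng's grey relational coefficient between a reference sequence
    # and a candidate sequence; rho is the distinguishing coefficient
    # (0.5 by convention). The grade is the mean coefficient.
    delta = np.abs(np.asarray(reference, float) - np.asarray(candidate, float))
    d_min, d_max = delta.min(), delta.max()
    coeff = (d_min + rho * d_max) / (delta + rho * d_max)
    return coeff.mean()

ref = [2.0, 2.1, 1.9, 2.0]           # small measured sample
print(round(grey_relational_grade(ref, [2.0, 2.0, 2.0, 2.0]), 3))  # → 0.667
```

A grade near 1 indicates the candidate tracks the reference closely; no distributional assumption on the data is needed, which is the attraction for small samples.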
The corrosion rate is a crucial factor that impacts the longevity of materials in different applications. After undergoing friction stir processing (FSP), the refined grain structure leads to a notable decrease in corrosion rate. However, a better understanding of the correlation between the FSP process parameters and the corrosion rate is still lacking. The current study used machine learning to establish the relationship between the corrosion rate and the FSP process parameters (rotational speed, traverse speed, and shoulder diameter) for WE43 alloy. The Taguchi L27 design of experiments was used for the experimental analysis. In addition, synthetic data was generated using particle swarm optimization for virtual sample generation (VSG). The application of VSG increased the prediction accuracy of the machine learning models. A sensitivity analysis was performed using Shapley Additive Explanations to determine the key factors affecting the corrosion rate; the shoulder diameter had a significant impact in comparison to the traverse speed. A graphical user interface (GUI) has been created to predict the corrosion rate from the identified factors. This study focuses on the WE43 alloy, but its findings can also be used to predict the corrosion rate of other magnesium alloys.
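The virtual-sample-generation step densifies a small experimental dataset with synthetic points. The paper uses particle swarm optimization for this; the sketch below substitutes a much simpler stand-in, convex interpolation between random pairs of real samples, just to illustrate the idea. The parameter values and corrosion rates are made up.

```python
import numpy as np

def virtual_samples(X, y, n_new, rng=None):
    # Illustrative virtual sample generation: draw random pairs of
    # real samples and interpolate between them with a random convex
    # weight t in [0, 1]. (The paper itself uses PSO-based VSG.)
    rng = np.random.default_rng(rng)
    X, y = np.asarray(X, float), np.asarray(y, float)
    i = rng.integers(0, len(X), n_new)
    j = rng.integers(0, len(X), n_new)
    t = rng.random((n_new, 1))
    return X[i] + t * (X[j] - X[i]), y[i] + t[:, 0] * (y[j] - y[i])

X = [[1200, 25, 12], [1400, 50, 15], [1600, 75, 18]]  # rpm, mm/min, mm (hypothetical)
y = [0.42, 0.31, 0.27]                                # corrosion rate (hypothetical)
Xv, yv = virtual_samples(X, y, 50, rng=0)
print(Xv.shape, yv.shape)
```

Because each virtual target is a convex combination of two real targets, the synthetic labels stay inside the observed range, a safe default when the goal is only to stabilise model training on a tiny design-of-experiments dataset.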
The Tracy-Widom distribution was first discovered in the study of the largest eigenvalues of high-dimensional Gaussian unitary ensembles (GUE), and since then it has appeared in a number of apparently distinct research fields. It is believed that the Tracy-Widom distribution has a universal feature like the classic normal distribution. The Airy2 process is defined through finite-dimensional distributions with the Tracy-Widom distribution as its marginal distribution. In this introductory survey, we briefly review some basic notions, intuitive background, and fundamental properties concerning the Tracy-Widom distribution and the Airy2 process. For ease of reading, the paper starts with some simple and well-known facts about normal distributions, Gaussian processes, and their sample path properties.
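The GUE origin of the distribution is easy to see numerically: sampling GUE matrices and centring/scaling the largest eigenvalue as (lambda_max - 2*sqrt(n)) * n^(1/6) produces draws whose empirical mean sits near the Tracy-Widom (beta=2) mean of about -1.77. This is a small Monte Carlo illustration, not part of the survey itself.

```python
import numpy as np

def gue_scaled_max_eig(n, rng):
    # Sample one n-by-n GUE matrix (off-diagonal variance 1) and
    # return the largest eigenvalue centred at 2*sqrt(n) and scaled
    # by n**(1/6), the Tracy-Widom fluctuation scale.
    X = rng.normal(size=(n, n))
    Y = rng.normal(size=(n, n))
    A = (X + 1j * Y) / np.sqrt(2)
    H = (A + A.conj().T) / np.sqrt(2)   # Hermitian GUE matrix
    lam = np.linalg.eigvalsh(H).max()
    return (lam - 2 * np.sqrt(n)) * n ** (1 / 6)

rng = np.random.default_rng(1)
samples = [gue_scaled_max_eig(100, rng) for _ in range(200)]
print(round(float(np.mean(samples)), 2))  # should land near -1.77
```

Even at n = 100 the fluctuations already follow the asymptotic law quite well, which is part of what makes the distribution feel "universal" in practice.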
During environment testing, the estimation of random vibration signals (RVS) is an important technique for airborne platform safety and reliability. However, the available methods, including the extreme value envelope method (EVEM), the statistical tolerance method (STM), and the improved statistical tolerance method (ISTM), require large samples and a typical probability distribution. Moreover, the frequency-varying characteristic of RVS is usually not taken into account. The gray bootstrap method (GBM) is proposed to solve the problem of estimating frequency-varying RVS with small samples. Firstly, the estimation indexes are obtained, including the estimated interval, the estimated uncertainty, the estimated value, the estimated error, and the estimated reliability. In addition, GBM is applied to estimating single-flight testing data of a certain aircraft. Finally, in order to evaluate the estimation performance, GBM is compared with the bootstrap method (BM) and the gray method (GM) in testing analysis. The results show that GBM is superior for estimating dynamic signals with small samples, and the estimated reliability is proved to be 100% at the given confidence level.
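The gray bootstrap idea combines two ingredients: a GM(1,1) grey model that extrapolates a short series without distributional assumptions, and bootstrap resampling that turns the scatter of forecasts into an estimated interval. The sketch below is a hedged reconstruction of that combination under simplifying assumptions (sorted resamples, percentile intervals, made-up vibration values), not the paper's exact algorithm.

```python
import numpy as np

def gm11_next(x0):
    # GM(1,1) grey model: fit dx1/dt + a*x1 = b on the accumulated
    # series x1 = cumsum(x0) and forecast the next original value.
    x0 = np.asarray(x0, float)
    x1 = np.cumsum(x0)
    z = 0.5 * (x1[1:] + x1[:-1])          # background values
    B = np.column_stack([-z, np.ones_like(z)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    if abs(a) < 1e-9:                     # degenerate (constant) series
        return float(x0.mean())
    k = len(x0)
    return (x0[0] - b / a) * np.exp(-a * k) * (1 - np.exp(a))

def grey_bootstrap_interval(x0, n_boot=500, alpha=0.05, rng=None):
    # Resample the small sample with replacement, fit GM(1,1) to each
    # sorted resample, and take percentiles of the forecasts as the
    # estimated interval.
    rng = np.random.default_rng(rng)
    preds = [gm11_next(np.sort(rng.choice(x0, size=len(x0), replace=True)))
             for _ in range(n_boot)]
    return np.percentile(preds, [100 * alpha / 2, 100 * (1 - alpha / 2)])

vib = [2.1, 2.4, 2.2, 2.6, 2.5]           # hypothetical small vibration sample
lo, hi = grey_bootstrap_interval(vib, rng=0)
print(round(lo, 3), round(hi, 3))
```

The appeal for small samples is visible in the structure: neither GM(1,1) nor the bootstrap needs the underlying probability distribution of the data.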
Virtual testability demonstration testing has many advantages, such as low cost, high efficiency, low risk, and few restrictions. It brings new requirements to fault sample generation. A fault sample simulation approach for virtual testability demonstration testing based on stochastic process theory is proposed. First, the similarities and differences in fault sample generation between physical and virtual testability demonstration tests are discussed. Second, it is pointed out that the fault occurrence process subject to perfect repair is a renewal process. Third, the interarrival time distribution function of the next fault event is given, and the steps and flowcharts of fault sample generation are introduced. The number of faults and their occurrence times are obtained by statistical simulation. Finally, experiments are carried out on a stable tracking platform. Because a variety of life distributions and maintenance modes are considered and some assumptions are removed, the sample size and structure of the simulated fault samples are closer to actual results and more reasonable. The proposed method can effectively guide fault injection in virtual testability demonstration testing.
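Under perfect repair, fault times form a renewal process: interarrival times are i.i.d. draws from a life distribution, accumulated until the test horizon is reached. A minimal simulation of that step, assuming a Weibull life distribution with hypothetical parameters, looks like this:

```python
import numpy as np

def simulate_fault_times(shape, scale, horizon, rng):
    # Renewal process under perfect repair: successive interarrival
    # times are i.i.d. Weibull draws (one of the life distributions
    # such an approach allows); accumulate until the demonstration
    # test horizon is exceeded, returning the fault occurrence times.
    times, t = [], 0.0
    while True:
        t += scale * rng.weibull(shape)
        if t > horizon:
            return times
        times.append(t)

rng = np.random.default_rng(2)
faults = simulate_fault_times(shape=1.5, scale=200.0, horizon=1000.0, rng=rng)
print(len(faults))   # number of simulated faults within the horizon
```

Repeating the simulation many times yields the distribution of both the number of faults and their occurrence times, which is exactly what the fault injection schedule needs.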
This paper presents an efficient technique for processing 3D meshed surfaces via spherical wavelets. More specifically, an input 3D mesh is first transformed into a spherical vector signal by a fast, low-distortion spherical parameterization approach based on symmetry analysis of 3D meshes. This signal is then sampled on the sphere with the help of an adaptive sampling scheme. Finally, the sampled signal is transformed into the wavelet domain via the spherical wavelet transform, where many 3D mesh processing operations can be implemented, such as smoothing, enhancement, and compression. Our main contribution lies in incorporating a fast low-distortion spherical parameterization approach and an adaptive sampling scheme into the framework for processing 3D meshed surfaces by spherical wavelets, which can handle surfaces with complex shapes. A number of experimental examples demonstrate that our algorithm is robust and efficient.
This fully digital beam position measurement instrument is designed for beam position monitoring and machine research at the Shanghai Synchrotron Radiation Facility. The signals received from the four position-sensitive detectors are narrow pulses with a repetition rate up to 499.654 MHz and a pulse width of around 100 ps, and their dynamic range can vary over more than 40 dB in machine research. By employing the under-sampling technique based on high-speed, high-resolution A/D conversion, the entire processing procedure is performed by digital signal processing algorithms integrated in a single Field Programmable Gate Array. The system functions well in laboratory and commissioning tests, demonstrating a position resolution (at the turn-by-turn rate of 694 kHz) better than 7 μm over the input amplitude range of -40 dBm to 10 dBm, which is well beyond the requirement.
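Under-sampling works because a narrowband signal at a frequency above Nyquist folds down to a predictable alias that the digital chain can process. The helper below computes that folded frequency; the RF value is the 499.654 MHz bunch rate from the text, while the ADC clock shown is a hypothetical value chosen only for illustration.

```python
def alias_frequency(f_signal, f_sample):
    # Under-sampling folds a narrowband signal at f_signal down to
    # |f_signal - k * f_sample| for the nearest integer k; this is
    # the frequency actually seen after digitisation.
    k = round(f_signal / f_sample)
    return abs(f_signal - k * f_sample)

f_rf = 499.654e6    # SSRF bunch repetition rate (from the text)
f_adc = 117.28e6    # hypothetical ADC sampling clock
print(alias_frequency(f_rf, f_adc) / 1e6, "MHz")
```

As long as the aliased frequency stays inside the first Nyquist zone of the ADC and the analog front end band-limits the pulses, the FPGA can recover amplitude (and hence position) without sampling at the full RF rate.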
Within the framework of feasibility studies for a reversible, deep geological repository of high- and intermediate-level long-lived radioactive waste (HLW, IL-LLW), the French National Radioactive Waste Management Agency (Andra) is investigating the Callovo-Oxfordian (COx) formation near Bure (in the northeast of France) as a potential host rock for the repository. The hydro-mechanical (HM) behaviour is an important issue in designing and optimising components of the disposal such as the shaft, ramp, drifts, and waste package disposal facilities. Over the past 20 years, a large number of laboratory experiments have been carried out to characterise and understand the HM behaviour of COx claystones. At the beginning, samples came from deep boreholes drilled at the ground surface with oil-base mud. From 2000 onwards, with the launch of the construction of the Meuse/Haute-Marne Underground Research Laboratory (MHM URL), most samples have been extracted from a large number of air-drilled boreholes in the URL. In parallel, various constitutive models have been developed for modelling, and the thermo-hydro-mechanical (THM) behaviour of the COx claystones has been investigated under different repository conditions. Core samples are subjected to a complex HM loading path before testing, due to drilling, conditioning, and preparation. Various effects on the characteristics of the claystones are highlighted and discussed, and procedures for core extraction and packaging as well as a systematic sample preparation protocol are proposed in order to minimise the uncertainties in test results. The representativeness of the test results is also addressed with regard to the in situ rock mass.
For a sampled-data control system with nonuniform sampling, the sampling interval sequence, which is continuously distributed in a given interval, is described as a multiple independent and identically distributed (i.i.d.) process. With this process, the closed-loop system is transformed into an asynchronous dynamical impulsive model with input delays. Sufficient conditions for closed-loop mean-square exponential stability are presented in terms of linear matrix inequalities (LMIs), in which the relation between the nonuniform sampling and the mean-square exponential stability of the closed-loop system is explicitly established. Based on the stability conditions, a controller design method is given, which is further formulated as a convex optimization problem with LMI constraints. Numerical examples and experimental results are given to show the effectiveness and advantages of the theoretical results.
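The setting is easy to reproduce on a toy scale. The simulation below (not the paper's LMI machinery) holds a state-feedback input constant between sampling instants whose lengths are i.i.d. uniform draws, using the exact zero-order-hold map for a scalar plant; with the gains chosen here every interval map is contracting, so the state decays despite the nonuniform sampling. All numerical values are illustrative assumptions.

```python
import math
import random

def simulate(a, b, K, tau_min, tau_max, steps, rng):
    # Scalar plant dx/dt = a*x + b*u with zero-order-hold feedback
    # u = -K*x, updated at random intervals tau ~ U[tau_min, tau_max]
    # (the i.i.d. interval model of the abstract). Over one interval
    # the exact closed-loop map is
    #   x+ = (e^{a*tau} - K*b*(e^{a*tau} - 1)/a) * x.
    x = 1.0
    for _ in range(steps):
        tau = rng.uniform(tau_min, tau_max)
        phi = math.exp(a * tau)
        x = (phi - K * b * (phi - 1.0) / a) * x
    return x

rng = random.Random(7)
x_final = simulate(a=1.0, b=1.0, K=2.0, tau_min=0.05, tau_max=0.5,
                   steps=100, rng=rng)
print(abs(x_final) < 1e-2)  # each interval map has gain |2 - e^tau| < 1
```

With a = b = 1 and K = 2 the per-interval gain is 2 - e^tau, which stays below 1 in magnitude for every tau up to ln 3, so this particular design is stable for the whole sampling interval range; the LMI conditions in the paper generalise exactly this kind of guarantee to multivariable systems.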
This paper contributes to the structural reliability problem by presenting a novel approach that enables identification of stochastic oscillatory processes as a critical input for given mechanical models. The identification development follows a transparent image processing paradigm, completely independent of state-of-the-art structural dynamics, aiming to deliver a simple and general-purpose method. Validation of the proposed importance sampling strategy is based on multi-scale clusters of realizations of digitally generated non-stationary stochastic processes. Good agreement with reference pure Monte Carlo results indicates significant potential for reducing the computational cost of first-passage probability estimation, an important feature in fields such as probabilistic seismic design and risk assessment.
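The computational saving over pure Monte Carlo comes from the importance sampling principle: sample from a proposal that makes the rare event common, then reweight by the likelihood ratio. A scalar toy version (estimating a Gaussian tail probability, far simpler than the paper's oscillatory-process setting) shows the mechanics:

```python
import numpy as np

def tail_prob_is(level, shift, n, rng):
    # Importance sampling estimate of P(X > level) for X ~ N(0, 1):
    # draw from the shifted proposal N(shift, 1), where the event is
    # frequent, and reweight each sample by the likelihood ratio
    # phi(x) / phi(x - shift) = exp(-0.5*x^2 + 0.5*(x - shift)^2).
    x = rng.normal(loc=shift, size=n)
    w = np.exp(-0.5 * x**2 + 0.5 * (x - shift) ** 2)
    return np.mean((x > level) * w)

rng = np.random.default_rng(3)
est = tail_prob_is(level=4.0, shift=4.0, n=20000, rng=rng)
print(f"{est:.2e}")  # close to the exact value 3.17e-05
```

A crude Monte Carlo estimate of the same probability would need on the order of millions of samples to see even a handful of exceedances; the shifted proposal reaches percent-level accuracy with 20,000. The same logic, applied to realizations of critical input processes, underlies the first-passage estimates in the paper.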
Sample entropy can reflect changes in the level of new information in a signal sequence as well as the amount of that new information. Using sample entropy as the feature for speech classification, this paper first extracts the sample entropy of the mixed signal, then calculates the mean and variance of each signal's sample entropy, and finally applies K-means clustering for recognition. The simulation results show that the recognition rate can reach 89.2% based on sample entropy.
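The feature itself is standard: SampEn(m, r) is the negative log of the conditional probability that subsequences matching for m points (Chebyshev distance at most r, self-matches excluded) also match for m + 1 points. A compact implementation, with the usual defaults m = 2 and r = 0.2 times the signal's standard deviation:

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    # SampEn(m, r): -ln(A / B), where B counts template pairs of
    # length m within Chebyshev distance r (self-matches excluded)
    # and A counts the same for length m + 1.
    x = np.asarray(x, float)
    if r is None:
        r = 0.2 * x.std()
    def count(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        c = 0
        for i in range(len(templates)):
            d = np.abs(templates - templates[i]).max(axis=1)
            c += int(np.sum(d <= r)) - 1   # drop the self-match
        return c
    B, A = count(m), count(m + 1)
    return -np.log(A / B)

rng = np.random.default_rng(4)
regular = np.sin(np.linspace(0, 20 * np.pi, 500))
noisy = rng.normal(size=500)
print(sample_entropy(regular) < sample_entropy(noisy))  # regular signal scores lower
```

A predictable signal yields low sample entropy and an irregular one yields high sample entropy, which is exactly the separation the K-means step then exploits.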
ADSP-TS101 is a high-performance DSP with good parallel processing properties and high speed. According to the real-time processing requirements of underwater acoustic communication algorithms, a real-time parallel processing system with multi-channel synchronous sampling, composed of multiple ADSP-TS101s, is designed and implemented. For the hardware design, field programmable gate array (FPGA) logic control is adopted for the multi-channel synchronous sampling module, and a cluster/data-flow associated pin connection mode is adopted for the multiprocessor parallel processing configuration. The software is optimized through two kinds of communication: broadcast writing over the shared bus and point-to-point transfer over the link ports. Through whole-system installation, connection debugging, and experiments in a lake, the results show that the real-time parallel processing system has good stability and real-time processing capability and meets the technical design requirements for real-time processing.
Bilingual lexicon induction focuses on learning word translation pairs, also known as bitexts, from monolingual corpora by establishing a mapping between the source and target embedding spaces. Despite recent advancements, bilingual lexicon induction is limited to inducing bitexts consisting of individual words, lacking the ability to handle semantics-rich phrases. To bridge this gap and support downstream cross-lingual tasks, it is practical to develop a method for bilingual phrase induction that extracts bilingual phrase pairs from monolingual corpora without relying on cross-lingual knowledge. In this paper, the authors propose a novel phrase embedding training method based on the skip-gram structure. Specifically, a local hard negative sampling strategy is introduced that uses negative samples of central tokens in sliding windows to enhance phrase embedding learning. The proposed method achieves competitive or superior performance compared with baseline approaches, with exceptional results recorded for distant languages. Additionally, we develop a phrase representation learning method that leverages multilingual pre-trained language models. These mPLM-based representations can be combined with the above-mentioned static phrase embeddings to further improve the accuracy of the bilingual phrase induction task. We manually construct a dataset of bilingual phrase pairs and integrate it with MUSE to facilitate the bilingual phrase induction task.
We propose a novel all-optical sampling method using nonlinear polarization rotation in a semiconductor optical amplifier. A rate-equation model capable of describing the all-optical sampling mechanism is presented. Based on this model, we investigate the optimized operating parameters of the proposed system by simulating the output intensity of the probe light as a function of the input polarization angle, the phase induced by the polarization controller, and the orientation of the polarization beam splitter. The simulated results show that a good linear slope and a large linear dynamic range can be obtained, which is suitable for all-optical sampling. The operating power of the pump light can be less than 1 mW. The presented all-optical sampling method can potentially operate at sampling rates up to hundreds of GS/s and needs only low optical power.
A novel data stream partitioning method is proposed to resolve problems of range-aggregation continuous queries over parallel streams for the power industry. The first step of this method is to sample the data in parallel, implemented as an extended reservoir-sampling algorithm. A skip factor based on the change ratio of data values is introduced to describe the distribution characteristics of the data values adaptively. The second step is to partition the fluxes of the data streams evenly, implemented with two alternative equal-depth histogram generation algorithms that fit different cases: one for incremental maintenance based on heuristics and the other for periodic updates to generate an approximate partition vector. Experimental results on actual data prove that the method is efficient, practical, and suitable for processing time-varying data streams.
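The base scheme that the extended algorithm builds on is classic reservoir sampling (Algorithm R), which maintains a uniform fixed-size sample of a stream of unknown length in one pass. The paper's skip-factor extension is not reproduced here; this shows only the underlying algorithm:

```python
import random

def reservoir_sample(stream, k, rng):
    # Algorithm R: keep the first k items, then replace a random slot
    # with probability k/i for the i-th item, so every item of the
    # stream ends up in the reservoir with equal probability k/n.
    reservoir = []
    for i, item in enumerate(stream, start=1):
        if i <= k:
            reservoir.append(item)
        else:
            j = rng.randrange(i)
            if j < k:
                reservoir[j] = item
    return reservoir

rng = random.Random(5)
sample = reservoir_sample(range(10000), 10, rng)
print(len(sample))  # always exactly 10, drawn uniformly from the stream
```

Because the sample stays uniform at every point in the stream, downstream structures such as equal-depth histograms can be rebuilt from the reservoir at any time without rescanning the stream.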
Objective: Rhei Radix et Rhizoma has five types of products, namely raw rhubarb (RR), wine rhubarb (WR), vinegar rhubarb (VR), cooked rhubarb (CR), and rhubarb charcoal (RC). However, Rhei Radix et Rhizoma is easily contaminated with fungi and mycotoxins if not harvested or processed properly. Here, we intend to analyze how microbiome assemblies and co-occurrence patterns are influenced by sampling locations and processing methods. Methods: High-throughput sequencing of the internal transcribed spacer 2 (ITS2) region was carried out to study the diversities (α- and β-diversity), composition (dominant taxa and potential biomarkers), and network complexity of surface fungi on RR, WR, VR, CR, and RC collected from Gansu and Sichuan provinces, China. Results: The phyla Ascomycota and Basidiomycota; the genera Kazachstania, Malassezia, and Asterotremella; and the species Kazachstania exigua, Asterotremella pseudolonga, and Malassezia restricta were the dominant fungi and exhibited differences between the two provinces and among the five processed products. The α-diversity and network complexity were strongly dependent on the processing methods: Chao 1, the Shannon index, and network complexity and connectivity were highest in the CR group. The α-diversity and network complexity were also influenced by sampling location: Chao 1 and network complexity and connectivity were highest in Gansu Province. Conclusion: The assembly and network of the surface microbiome on Rhei Radix et Rhizoma were shaped by processing methods and sampling locations. This paper offers a comprehensive understanding of these microorganisms, which can provide early warning for potential mycotoxins and ensure the safety of drugs and consumers.
China's continental deposition basins are characterized by complex geological structures and various reservoir lithologies; therefore, high-precision exploration methods are needed. High-density spatial sampling is a new technology to increase the accuracy of seismic exploration. We briefly discuss point source and receiver technology, analyze the high-density spatial sampling in situ method, introduce the symmetric sampling principles presented by Gijs J. O. Vermeer, and discuss high-density spatial sampling technology from the point of view of wave field continuity. We emphasize the analysis of high-density spatial sampling characteristics, including the advantages of high-density first breaks for investigating near-surface structure and improving static correction precision, and the use of dense receiver spacing at short offsets to increase the effective coverage at shallow depth and the accuracy of reflection imaging. Coherent noise is not aliased, so noise analysis precision and suppression improve as a result. High-density spatial sampling enhances wave field continuity and the accuracy of various mathematical transforms, which benefits wave field separation. Finally, we point out that the difficult part of high-density spatial sampling technology is the data processing: more research needs to be done on methods for analyzing and processing huge amounts of seismic data.
The reservoir volumetric approach represents a widely accepted but flawed method of petroleum play resource calculation. In this paper, we propose a combination of techniques that can improve the applicability and quality of the resource estimation. These techniques include: 1) the use of the Multivariate Discovery Process (MDP) model to derive unbiased distribution parameters of reservoir volumetric variables and to reveal correlations among the variables; 2) the use of the Geo-anchored method to estimate simultaneously the number of oil and gas pools in the same play; and 3) the cross-validation of assessment results from different methods. These techniques are illustrated with an example of crude oil and natural gas resource assessment of the Sverdrup Basin, Canadian Archipelago. The example shows that when direct volumetric measurements of the untested prospects are not available, the MDP model can help derive unbiased estimates of the distribution parameters by using information from the discovered oil and gas accumulations. It also shows that estimating the number of oil and gas accumulations and the associated size ranges from a discovery process model can provide an alternative and efficient approach when inadequate geological data hinder the estimation. Cross-examination of assessment results derived using different methods allows one to focus on and analyze the causes of the major differences, thus providing a more reliable assessment outcome.
Many factors have been identified as having the ability to affect the sensitivity of rapid antigen detection (RAD) tests for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). This study aimed to identify the impact of sample processing on the sensitivity of RAD tests. We explored the effect of different inactivation methods, viral transport medium (VTM) solutions, and sample preservation on the sensitivity of four RAD kits using two SARS-CoV-2 strains. Compared with non-inactivation, heat inactivation significantly impacted the sensitivity of most RAD kits, whereas β-propiolactone inactivation had only a minor effect. Some of the VTM solutions (VTM2, MANTACC) had a significant influence on the sensitivity of the RAD kits, especially for low viral-load samples. The detection values of the RAD kits decreased slightly with extended preservation time and an increasing number of freeze-thaw cycles, while most remained within the detection range. Our results show that selecting appropriate inactivation methods and VTM solutions is necessary during reagent development, performance evaluation, and clinical application.
Funding (Tracy-Widom survey): the National Natural Science Foundation of China (11731012, 11871425) and the Fundamental Research Funds for the Central Universities (2020XZZX002-03).
Funding (random vibration signal estimation): the Aviation Science Foundation of China (No. 20100251006) and the Technological Foundation Project (No. J132012C001).
Funding: National Natural Science Foundation of China (51105369)
Abstract: Virtual testability demonstration testing has many advantages, such as low cost, high efficiency, low risk and few restrictions, but it brings new requirements to fault sample generation. A fault sample simulation approach for virtual testability demonstration tests based on stochastic process theory is proposed. First, the similarities and differences of fault sample generation between physical and virtual testability demonstration tests are discussed. Second, it is pointed out that the fault occurrence process under perfect repair is a renewal process. Third, the interarrival time distribution function of the next fault event is given, and the steps and flowcharts of fault sample generation are introduced. The number of faults and their occurrence times are obtained by statistical simulation. Finally, experiments are carried out on a stable tracking platform. Because a variety of life distributions and maintenance modes are considered and some assumptions are removed, the sample size and structure of the simulated fault samples are closer to actual results and more reasonable. The proposed method can effectively guide fault injection in virtual testability demonstration tests.
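The renewal-process step above can be sketched directly: under perfect repair, successive fault interarrival times are i.i.d. draws from the life distribution, so fault occurrence times come from accumulating such draws until the test horizon is reached. A minimal sketch, with a hypothetical Weibull life distribution standing in for whatever distribution the testability model supplies:

```python
import random

def simulate_fault_times(horizon, draw_interarrival, rng):
    """Renewal-process fault generation: under perfect repair each repair
    restores the item to as-good-as-new, so successive interarrival times
    are i.i.d. draws from the life distribution."""
    t, times = 0.0, []
    while True:
        t += draw_interarrival(rng)
        if t > horizon:
            return times  # fault occurrence times within the test horizon
        times.append(t)

rng = random.Random(42)
# Hypothetical Weibull life distribution: scale 100 h, shape 1.5
# (random.weibullvariate takes the scale first, then the shape).
times = simulate_fault_times(1000.0, lambda r: r.weibullvariate(100.0, 1.5), rng)
print(len(times) > 0, times == sorted(times))
```

Repeating the simulation many times yields the statistical distribution of the number of faults and their timing, which is what the fault injection schedule is built from.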
Funding: Supported by the National Natural Science Foundation of China (No. 61173102), the NSFC Guangdong Joint Fund (No. U0935004), the Fundamental Research Funds for the Central Universities (No. DUT11SX08), the Opening Foundation of the Key Laboratory of Symbolic Computation and Knowledge Engineering of the Ministry of Education of China (No. 93K172012K02), and the Doctor Research Start-up Fund of Northeast Dianli University (No. BSJXM-200912)
Abstract: This paper presents an efficient technique for processing 3D meshed surfaces via spherical wavelets. More specifically, an input 3D mesh is first transformed into a spherical vector signal by a fast, low-distortion spherical parameterization approach based on symmetry analysis of 3D meshes. This signal is then sampled on the sphere with the help of an adaptive sampling scheme. Finally, the sampled signal is transformed into the wavelet domain via the spherical wavelet transform, where many 3D mesh processing operations can be implemented, such as smoothing, enhancement, and compression. Our main contribution lies in incorporating a fast, low-distortion spherical parameterization approach and an adaptive sampling scheme into the framework for processing 3D meshed surfaces with spherical wavelets, which can handle surfaces with complex shapes. A number of experimental examples demonstrate that our algorithm is robust and efficient.
Funding: Supported by the Knowledge Innovation Program of the Chinese Academy of Sciences (KJCX2-YW-N27), the National Natural Science Foundation of China (10875119), and the 100 Talents Program of the Chinese Academy of Sciences
Abstract: This fully digital beam position measurement instrument is designed for beam position monitoring and machine research at the Shanghai Synchrotron Radiation Facility. The signals received from the four position-sensitive detectors are narrow pulses with a repetition rate up to 499.654 MHz and a pulse width of around 100 ps, and their dynamic range can vary over more than 40 dB in machine research. By employing an under-sampling technique based on high-speed, high-resolution A/D conversion, the entire processing procedure is performed by digital signal processing algorithms integrated in a single Field Programmable Gate Array. The system functions well in laboratory and commissioning tests, demonstrating a position resolution (at the turn-by-turn rate of 694 kHz) better than 7 μm over the input amplitude range of -40 dBm to 10 dBm, which is well beyond the requirement.
Abstract: Within the framework of feasibility studies for a reversible, deep geological repository of high- and intermediate-level long-lived radioactive waste (HLW, IL-LLW), the French National Radioactive Waste Management Agency (Andra) is investigating the Callovo-Oxfordian (COx) formation near Bure (in the northeast of France) as a potential host rock for the repository. The hydro-mechanical (HM) behaviour is an important issue in designing and optimising components of the disposal such as the shaft, ramp, drifts, and waste package disposal facilities. Over the past 20 years, a large number of laboratory experiments have been carried out to characterise and understand the HM behaviour of COx claystones. At the beginning, samples came from deep boreholes drilled at the ground surface with oil-based mud. From 2000 onwards, with the launch of the construction of the Meuse/Haute-Marne Underground Research Laboratory (MHM URL), most samples have been extracted from a large number of air-drilled boreholes in the URL. In parallel, various constitutive models have been developed for modelling, and the thermo-hydro-mechanical (THM) behaviour of the COx claystones has been investigated under different repository conditions. Core samples are subjected to a complex HM loading path before testing, due to drilling, conditioning and preparation. Various effects on the characteristics of the claystones are highlighted and discussed, and procedures for core extraction and packaging as well as a systematic sample preparation protocol are proposed in order to minimise the uncertainties in test results. The representativeness of the test results is also addressed with regard to the in situ rock mass.
Funding: supported by the National Natural Science Foundation of China (Nos. 61104105, U0735003 and 60974047) and the Natural Science Foundation of Guangdong Province of China (No. 9451009001002702)
Abstract: For a sampled-data control system with nonuniform sampling, the sampling interval sequence, which is continuously distributed in a given interval, is described as an independent and identically distributed (i.i.d.) process. With this process, the closed-loop system is transformed into an asynchronous dynamical impulsive model with input delays. Sufficient conditions for closed-loop mean-square exponential stability are presented in terms of linear matrix inequalities (LMIs), in which the relation between the nonuniform sampling and the mean-square exponential stability of the closed-loop system is explicitly established. Based on the stability conditions, a controller design method is given, which is further formulated as a convex optimization problem with LMI constraints. Numerical examples and experimental results are given to show the effectiveness and advantages of the theoretical results.
Abstract: This paper contributes to the structural reliability problem by presenting a novel approach that enables identification of stochastic oscillatory processes as a critical input for given mechanical models. The identification follows a transparent image-processing paradigm, completely independent of state-of-the-art structural dynamics, aiming to deliver a simple, general-purpose method. Validation of the proposed importance sampling strategy is based on multi-scale clusters of realizations of digitally generated non-stationary stochastic processes. Good agreement with reference pure Monte Carlo results indicates significant potential for reducing the computational cost of estimating first-passage probabilities, an important feature in fields such as probabilistic seismic design and risk assessment generally.
Abstract: Sample entropy can reflect both changes in the level of new information in a signal sequence and the amount of that new information. Using sample entropy as the feature for speech classification, the paper first extracts the sample entropy of the mixed signal, then calculates the mean and variance of each signal's sample entropy, and finally applies K-means clustering for recognition. The simulation results show that the recognition rate can be increased to 89.2% based on sample entropy.
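For readers unfamiliar with the feature, sample entropy SampEn(m, r) is the negative logarithm of the conditional probability that two subsequences that are close for m points remain close for m + 1 points; lower values mean a more regular signal. Below is a minimal pure-Python sketch (one of several common variants of the template counting; the tolerance default r = 0.2·σ is a conventional choice, not taken from the paper):

```python
import math
import random
import statistics

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r): negative log of the conditional probability that two
    subsequences matching for m points (Chebyshev distance <= r) still
    match for m + 1 points. Self-matches are excluded; lower = more regular."""
    if r is None:
        r = 0.2 * statistics.pstdev(x)  # conventional tolerance choice

    def match_pairs(length):
        templates = [x[i:i + length] for i in range(len(x) - length + 1)]
        hits = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    hits += 1
        return hits

    b, a = match_pairs(m), match_pairs(m + 1)
    return -math.log(a / b) if a and b else float("inf")

rng = random.Random(0)
regular = [math.sin(i / 3.0) for i in range(120)]     # highly regular signal
noisy = [rng.uniform(-1.0, 1.0) for _ in range(120)]  # white noise
# A regular signal produces little new information, so its sample entropy
# should come out lower than that of the noise.
```

The pairwise template comparison is O(n²); for long speech frames, practical implementations vectorise or window the computation.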
Funding: Sponsored by the National Natural Science Foundation of China (60572098)
Abstract: The ADSP-TS101 is a high-performance DSP with good parallel processing capability and high speed. According to the real-time processing requirements of underwater acoustic communication algorithms, a real-time parallel processing system with multi-channel synchronous sampling, composed of multiple ADSP-TS101s, is designed and implemented. In the hardware design, field programmable gate array (FPGA) logic control is adopted for the multi-channel synchronous sampling module, and the cluster/data-flow associated pin connection mode is adopted for the multiprocessor parallel processing configuration. The software is optimized through two kinds of communication: broadcast writes over the shared bus and point-to-point transfers over the link ports. Through whole-system installation, connective debugging, and experiments in a lake, the results show that the real-time parallel processing system has good stability and real-time processing capability and meets the technical design requirements for real-time processing.
Funding: National Key Research and Development Program of China, Grant/Award Number: 2023YFC3305003; National Natural Science Foundation of China, Grant/Award Number: 62376076.
Abstract: Bilingual lexicon induction focuses on learning word translation pairs, also known as bitexts, from monolingual corpora by establishing a mapping between the source and target embedding spaces. Despite recent advancements, bilingual lexicon induction is limited to inducing bitexts consisting of individual words, lacking the ability to handle semantics-rich phrases. To bridge this gap and support downstream cross-lingual tasks, it is practical to develop a method for bilingual phrase induction that extracts bilingual phrase pairs from monolingual corpora without relying on cross-lingual knowledge. In this paper, we propose a novel phrase embedding training method based on the skip-gram structure. Specifically, we introduce a local hard negative sampling strategy that utilises negative samples of central tokens in sliding windows to enhance phrase embedding learning. The proposed method achieves competitive or superior performance compared to baseline approaches, with exceptional results for distant languages. Additionally, we develop a phrase representation learning method that leverages multilingual pre-trained language models (mPLMs). These mPLM-based representations can be combined with the static phrase embeddings above to further improve the accuracy of the bilingual phrase induction task. We manually construct a dataset of bilingual phrase pairs and integrate it with MUSE to facilitate the bilingual phrase induction task.
Abstract: We propose a novel all-optical sampling method using nonlinear polarization rotation in a semiconductor optical amplifier. A rate-equation model capable of describing the all-optical sampling mechanism is presented. Based on this model, we investigate the optimized operating parameters of the proposed system by simulating the output intensity of the probe light as a function of the input polarization angle, the phase induced by the polarization controller, and the orientation of the polarization beam splitter. The simulated results show that a good linear slope and a large linear dynamic range can be obtained, which is suitable for all-optical sampling. The operating power of the pump light can be less than 1 mW. The presented all-optical sampling method can potentially operate at sampling rates up to hundreds of GS/s and requires low optical power.
Funding: The High Technology Research Plan of Jiangsu Province (No. BG2004034) and the Foundation of the Graduate Creative Program of Jiangsu Province (No. xm04-36).
Abstract: A novel data stream partitioning method is proposed to resolve problems of range-aggregation continuous queries over parallel streams for the power industry. The first step of this method is to sample the data in parallel, implemented as an extended reservoir-sampling algorithm. A skip factor based on the change ratio of data values is introduced to describe the distribution characteristics of the data values adaptively. The second step is to partition the fluxes of the data streams evenly, implemented with two alternative equal-depth histogram generating algorithms that fit different cases: one for incremental maintenance based on heuristics and the other for periodic updates to generate an approximate partition vector. Experimental results on actual data prove that the method is efficient, practical and suitable for processing time-varying data streams.
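The "extended reservoir-sampling algorithm" itself is not specified in the abstract, but the classic single-pass reservoir sampling it extends (Vitter's Algorithm R) is worth recalling: it maintains a uniform random sample of size k over a stream of unknown length in O(k) memory.

```python
import random

def reservoir_sample(stream, k, rng):
    """Vitter's Algorithm R: maintain a uniform random sample of size k
    from a stream of unknown length in one pass and O(k) memory."""
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)          # fill the reservoir first
        else:
            j = rng.randrange(i + 1)        # uniform index in [0, i]
            if j < k:
                reservoir[j] = item         # replace with probability k/(i+1)
    return reservoir

rng = random.Random(7)
sample = reservoir_sample(range(10_000), 100, rng)
print(len(sample))  # always exactly k = 100
```

Each stream element ends up in the reservoir with equal probability k/n, which is why the technique is a natural building block for sampling over unbounded parallel streams.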
Funding: supported by grants from the National Key R&D Plan (Nos. 2022YFC3501801, 2022YFC3501802 and 2022YFC3501804), the Fundamental Research Funds for the Central Public Welfare Research Institutes (Nos. ZXKT21037, ZZ15-YQ044, ZXKT22050 and ZXKT22001), the Scientific Research Project of the Hainan Academician Innovation Platform (Nos. YSPTZX202137 and SQ2021PTZ0052), and the Scientific and Technological Innovation Project of the China Academy of Chinese Medical Sciences (No. CI2023E002-Y-58).
Abstract: Objective: Rhei Radix et Rhizoma has five types of products, namely, raw rhubarb (RR), wine rhubarb (WR), vinegar rhubarb (VR), cooked rhubarb (CR), and rhubarb charcoal (RC). However, Rhei Radix et Rhizoma is easily contaminated with fungi and mycotoxins if not harvested or processed properly. Here, we analyze how microbiome assemblies and co-occurrence patterns are influenced by sampling locations and processing methods. Methods: High-throughput sequencing of the internal transcribed spacer 2 (ITS2) region was carried out to study the diversity (α- and β-diversity), composition (dominant taxa and potential biomarkers), and network complexity of surface fungi on RR, WR, VR, CR, and RC collected from Gansu and Sichuan provinces, China. Results: The phyla Ascomycota and Basidiomycota; the genera Kazachstania, Malassezia, and Asterotremella; and the species Kazachstania exigua, Asterotremella pseudolonga, and Malassezia restricta were the dominant fungi and exhibited differences between the two provinces and among the five processed products. The α-diversity and network complexity depended strongly on the processing methods: Chao1, the Shannon index, and network complexity and connectivity were highest in the CR group. They were also influenced by sampling location: Chao1 and network complexity and connectivity were highest in Gansu Province. Conclusion: The assembly and network of the surface microbiome on Rhei Radix et Rhizoma were shaped by processing methods and sampling locations. This paper offers a comprehensive understanding of these microorganisms, which can provide early warning of potential mycotoxins and ensure the safety of drugs and consumers.
Abstract: China's continental deposition basins are characterized by complex geological structures and various reservoir lithologies, so high-precision exploration methods are needed. High density spatial sampling is a new technology for increasing the accuracy of seismic exploration. We briefly discuss point source and receiver technology, analyze the high density spatial sampling in situ method, introduce the symmetric sampling principles presented by Gijs J. O. Vermeer, and discuss high density spatial sampling technology from the point of view of wave field continuity. We emphasize analysis of the characteristics of high density spatial sampling, including the advantages of high-density first breaks for investigating near-surface structure and improving static correction precision, the use of dense receiver spacing at short offsets to increase the effective coverage at shallow depths, and the accuracy of reflection imaging. Coherent noise is not aliased, so noise analysis precision and suppression improve as a result. High density spatial sampling enhances wave field continuity and the accuracy of various mathematical transforms, which benefits wave field separation. Finally, we point out that the difficult part of high density spatial sampling technology is the data processing: more research is needed on methods for analyzing and processing the huge amounts of seismic data.
Abstract: The reservoir volumetric approach represents a widely accepted but flawed method of petroleum play resource calculation. In this paper, we propose a combination of techniques that can improve the applicability and quality of resource estimation. These techniques include: 1) using the Multivariate Discovery Process model (MDP) to derive unbiased distribution parameters of reservoir volumetric variables and to reveal correlations among the variables; 2) using the Geo-anchored method to simultaneously estimate the number of oil and gas pools in the same play; and 3) cross-validating assessment results from different methods. These techniques are illustrated with an example of crude oil and natural gas resource assessment of the Sverdrup Basin, Canadian Archipelago. The example shows that when direct volumetric measurements of the untested prospects are not available, the MDP model can help derive unbiased estimates of the distribution parameters by using information from the discovered oil and gas accumulations. It also shows that estimating the number of oil and gas accumulations and their associated size ranges from a discovery process model provides an alternative and efficient approach when inadequate geological data hinder estimation. Cross-examination of assessment results derived using different methods allows one to focus on and analyze the causes of the major differences, thus providing a more reliable assessment outcome.
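The flavour of the volumetric Monte Carlo aggregation step can be sketched as follows. All numbers are hypothetical and unrelated to the Sverdrup Basin assessment: pool sizes are drawn from a log-normal distribution, summed per trial, and the sorted trial totals yield the usual P90/P50/P10 aggregate estimates (P90 = value exceeded with roughly 90% probability).

```python
import random

def play_resource_mc(n_pools, mu, sigma, n_trials, seed=3):
    """Aggregate-resource Monte Carlo: draw each pool size from a
    log-normal distribution, sum per trial, and read percentiles
    off the sorted trial totals."""
    rng = random.Random(seed)
    totals = sorted(
        sum(rng.lognormvariate(mu, sigma) for _ in range(n_pools))
        for _ in range(n_trials)
    )
    return {
        "p90": totals[int(0.10 * n_trials)],  # exceeded with ~90% probability
        "p50": totals[int(0.50 * n_trials)],
        "p10": totals[int(0.90 * n_trials)],  # exceeded with ~10% probability
    }

est = play_resource_mc(n_pools=12, mu=2.0, sigma=1.0, n_trials=5000)
print(est["p90"] < est["p50"] < est["p10"])
```

The methods in the abstract improve on this basic scheme by supplying unbiased distribution parameters (MDP), an estimate of how many pools actually exist (Geo-anchored), and cross-validation across methods.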
Funding: supported by China's National Science and Technology Major Project (2018ZX10102001) and the Non-profit Central Research Institute Fund of the Chinese Academy of Medical Sciences (2019PT310029, 2020PT310004).
Abstract: Many factors have been identified as affecting the sensitivity of rapid antigen detection (RAD) tests for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). This study aimed to identify the impact of sample processing on the sensitivity of RAD tests. We explored the effect of different inactivation methods, viral transport media (VTM) solutions, and sample preservation on the sensitivity of four RAD kits based on two SARS-CoV-2 strains. Compared with non-inactivation, heat inactivation significantly impacted the sensitivity of most RAD kits, whereas β-propiolactone inactivation had only a minor effect. Some of the VTM solutions (VTM2, MANTACC) had a significant influence on the sensitivity of the RAD kits, especially for low viral-load samples. The detection values of the RAD kits decreased slightly as preservation time was extended and freeze–thaw cycles increased, although most remained within the detection range. Our results show that selecting appropriate inactivation methods and VTM solutions is necessary during reagent development, performance evaluation, and clinical application.