Funding: Supported by the National Natural Science Foundation of China (61076019, 61106018), the Aeronautical Science Foundation of China (20115552031), the China Postdoctoral Science Foundation (20100481134), the Jiangsu Province Key Technology R&D Program (BE2010003), the Nanjing University of Aeronautics and Astronautics Research Funding (NS2010115), and the Nanjing University of Aeronautics and Astronautics Initial Funding for Talented Faculty (1004-YAH10027).
Abstract: Test data compression and test resource partitioning (TRP) are essential for reducing the amount of test data in system-on-chip testing. A novel variable-to-variable-length compression code, the advanced frequency-directed run-length (AFDR) code, is designed. Different from frequency-directed run-length (FDR) codes, AFDR encodes both 0-runs and 1-runs and assigns the same codewords to runs of equal length. It also modifies the codewords for 00 and 11 to improve compression performance. Experimental results for ISCAS 89 benchmark circuits show that AFDR codes achieve a higher compression ratio than FDR and other compression codes.
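To make the variable-to-variable-length idea concrete, here is a minimal sketch that splits a test stimulus into alternating 0- and 1-runs and encodes each run with a toy (prefix, tail) group code in which group k covers run lengths 2^k through 2^(k+1)-1. The grouping and codewords are illustrative assumptions, not the published AFDR table (in particular, AFDR's special handling of 00 and 11 is omitted); what it shares with AFDR is that 0-runs and 1-runs of equal length receive the same codeword.

```python
# Toy variable-to-variable-length run coder (illustrative, not the AFDR table).
from itertools import groupby

def runs(bits: str):
    """Yield (symbol, length) for maximal runs: '0001100' -> ('0',3), ('1',2), ('0',2)."""
    for sym, grp in groupby(bits):
        yield sym, len(list(grp))

def encode_run(length: int) -> str:
    """Group k holds lengths 2**k .. 2**(k+1)-1; codeword = k ones, a 0, then a
    k-bit offset within the group. Runs of either symbol share the same codeword."""
    k = length.bit_length() - 1
    if k == 0:
        return "0"                      # run of length 1
    offset = length - (1 << k)
    return "1" * k + "0" + format(offset, f"0{k}b")

stimulus = "0000000011000000000000111"
code = "".join(encode_run(n) for _, n in runs(stimulus))
print(len(stimulus), "->", len(code), "bits:", code)   # 25 -> 20 bits
```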
Abstract: This paper presents a new test data compression/decompression method for SoC testing, called hybrid run-length codes. The method fully analyzes the factors that influence the test parameters: compression ratio, test application time, and area overhead. To improve the compression ratio, the new method is based on variable-to-variable run-length codes, and a novel algorithm is proposed to reorder the test vectors and fill the unspecified bits in the pre-processing step. With a novel on-chip decoder, the hybrid run-length codes achieve low test application time and low area overhead. Finally, an experimental comparison on ISCAS 89 benchmark circuits validates the proposed method.
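As a minimal sketch of the pre-processing idea above, the snippet below fills each unspecified bit ('X') in a test cube by copying the previous specified bit, so every don't-care extends the run it sits in before run-length coding. This greedy rule is an assumption for illustration; the paper's reordering and filling algorithm is more elaborate.

```python
def fill_xs(cube: str) -> str:
    """Replace each 'X' with the most recent specified bit (leading Xs become '0'),
    lengthening runs ahead of run-length coding."""
    out, last = [], "0"
    for b in cube:
        if b == "X":
            out.append(last)
        else:
            out.append(b)
            last = b
    return "".join(out)

print(fill_xs("XX01XXX10XX"))   # -> '00011111000': a few long runs instead of many short ones
```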
Funding: Project supported by the Natural Science Foundation of China (No. 50378041) and the Specialized Research Fund for the Doctoral Program of Higher Education (No. 20030487016), China.
Abstract: A new structural damage identification method using limited static test displacements, based on grey system theory, is proposed in this paper. The grey relation coefficient of displacement curvature is defined and used to locate damage in the structure, and an iterative estimation scheme for solving nonlinear optimization programming problems based on the quadratic programming technique is used to identify the damage magnitude. A numerical example of a cantilever beam with single or multiple damage is used to examine the capability of the proposed grey-theory-based method to localize and identify damage. The factors of measurement noise and incomplete test data are also discussed. The numerical results show that damage in the structure can be localized correctly using the grey relation coefficient of displacement curvature, and the damage magnitude can be identified with a high degree of accuracy, regardless of the number of measured displacement nodes. The proposed method requires only limited static test data, which is easily available in practice, and has wide applications in structural damage detection.
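A minimal numerical sketch of the localization step described above, assuming the standard grey relational coefficient with distinguishing coefficient rho = 0.5 and second-difference curvature; the paper's exact localization rule and the quadratic-programming magnitude step are not reproduced.

```python
import numpy as np

def curvature(u: np.ndarray, h: float) -> np.ndarray:
    """Second central difference of displacements sampled at spacing h."""
    return (u[2:] - 2.0 * u[1:-1] + u[:-2]) / h**2

def grey_relation(ref: np.ndarray, cmp_: np.ndarray, rho: float = 0.5) -> np.ndarray:
    """Standard grey relational coefficient between two curvature profiles."""
    delta = np.abs(ref - cmp_)
    return (delta.min() + rho * delta.max()) / (delta + rho * delta.max())

# Toy cantilever: a local stiffness loss shows up as a kink in the deflection line.
x = np.linspace(0.0, 1.0, 11)
u_intact = x**2 * (3.0 - x) / 6.0              # tip-loaded cantilever shape
u_damaged = u_intact.copy()
u_damaged[6:] += 0.002 * (x[6:] - x[6])        # extra rotation beyond the damaged section
xi = grey_relation(curvature(u_intact, 0.1), curvature(u_damaged, 0.1))
print(int(np.argmin(xi)) + 1)                  # weakest relation -> damaged node (prints 6)
```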
Funding: Support of the United States Department of Energy (DE-FE0026825, UCFER: University Coalition for Fossil Energy Research).
Abstract: We developed an inversion technique to determine in situ stresses for elliptical boreholes of arbitrary trajectory. In this approach, borehole geometry, drilling-induced fracture information, and other available leak-off test data were used to construct a mathematical model, which was in turn applied to finding the inverse of an overdetermined system of equations. The method has been demonstrated by a case study in the Appalachian Basin, USA. The calculated horizontal stresses are in reasonable agreement with the reported regional stress study of the area, although there are no field measurement data from the studied well for direct calibration. The results also indicate that a 2% axis difference in the elliptical borehole geometry can cause a 5% difference in the minimum horizontal stress calculation and a 10% difference in the maximum horizontal stress calculation.
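A minimal sketch of the inversion step only: each fracture or leak-off observation contributes one linear equation in the unknown horizontal stresses, and the overdetermined system is solved in the least-squares sense. The coefficient rows and pressures below are placeholders; in the real method they come from the elliptical-borehole stress model.

```python
import numpy as np

# Unknowns s = [S_Hmax, S_hmin]; one row per observation (placeholder coefficients).
A = np.array([[0.8, 1.9],
              [1.1, 1.6],
              [0.9, 1.8],
              [1.0, 1.7]])
b = np.array([62.0, 60.5, 61.8, 61.0])      # e.g. fracture-initiation pressures, MPa

s, residual, rank, _ = np.linalg.lstsq(A, b, rcond=None)
print(s.round(2))                            # least-squares estimate of [S_Hmax, S_hmin]
```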
Funding: The research was funded by Universiti Teknologi Malaysia (UTM) and the Malaysian Ministry of Higher Education (MOHE) under the Industry-International Incentive Grant Scheme (IIIGS) (Vote Numbers: Q.J130000.3651.02M67 and Q.J130000.3051.01M86), and the Academic Fellowship Scheme (SLAM).
Abstract: Testing is an integral part of software development. Current fast-paced system development has rendered traditional testing techniques obsolete; therefore, automated testing techniques are needed to keep pace with such development speed. Model-based testing (MBT) is a technique that uses system models to generate and execute test cases automatically. It was identified that test data generation (TDG) in many existing model-based test case generation (MB-TCG) approaches is still manual. Automatic and effective TDG can further reduce testing cost while detecting more faults. This study proposes an automated TDG approach in MB-TCG using the extended finite state machine (EFSM) model. The proposed approach integrates MBT with combinatorial testing. The information available in an EFSM model and the boundary value analysis strategy are used to automate the domain input classifications, which were done manually in the existing approach. The results showed that the proposed approach detected 6.62 percent more faults than conventional MB-TCG but at the same time generated 43 more tests. The proposed approach effectively detects faults, but further treatment of the generated tests, such as test case prioritization, should be applied to increase the effectiveness and efficiency of testing.
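A minimal sketch of automated boundary value analysis as used above: for a numeric EFSM input domain, emit the values just below, on, and just above each guard boundary, plus the domain limits. The two-sided rule is a common BVA convention assumed here, not necessarily the paper's exact classification.

```python
def boundary_values(lo: int, hi: int, boundaries: list) -> list:
    """Values at and adjacent to each guard boundary, clipped to the domain [lo, hi]."""
    vals = set()
    for b in list(boundaries) + [lo, hi]:
        for v in (b - 1, b, b + 1):
            if lo <= v <= hi:
                vals.add(v)
    return sorted(vals)

# e.g. an EFSM transition guarded by balance >= 500 over the domain [0, 10000]
print(boundary_values(0, 10000, [500]))   # -> [0, 1, 499, 500, 501, 9999, 10000]
```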
Abstract: By analyzing some existing test data generation methods, a new automated test data generation approach is presented. The linear predicate functions on a given path are used directly to construct a linear constraint system for the input variables. Only when a predicate function is nonlinear does its linear arithmetic representation need to be computed. If all predicate functions on the given path are linear, either the desired test data or a guarantee that the path is infeasible can be obtained from the solution of the constraint system. Otherwise, iterative refinement of the input is required to obtain the desired test data. Theoretical analysis and test results show that the approach is simple, effective, and computationally cheap. The scheme can also be used to generate path-based test data for programs with arrays and loops.
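A minimal sketch of the linear case, assuming SciPy is available: the branch predicates on the path are collected as A x <= b and fed to an LP solver with a zero objective. A feasible point is usable test data, while an infeasible LP certifies an infeasible path.

```python
import numpy as np
from scipy.optimize import linprog

# Path conditions on inputs (x, y):  x + y <= 10,   x >= 2,   y - x <= 1
A = np.array([[ 1.0, 1.0],
              [-1.0, 0.0],
              [-1.0, 1.0]])
b = np.array([10.0, -2.0, 1.0])

res = linprog(c=[0.0, 0.0], A_ub=A, b_ub=b,
              bounds=[(None, None), (None, None)])  # zero objective: feasibility only
if res.success:
    print("test data:", res.x)
else:
    print("path is infeasible")
```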
Abstract: The automatic generation of test data is a key step in realizing automated testing. Most automated testing tools for unit testing only provide test case execution drivers and cannot generate test data that meets coverage requirements. This paper presents an improved whale genetic algorithm for generating the test data required for unit-testing MC/DC coverage. The proposed algorithm introduces an elite retention strategy to prevent the genetic algorithm from falling into iterative degradation. At the same time, the mutation threshold of the whale algorithm is introduced to balance the global exploration and local search capabilities of the genetic algorithm. The threshold is dynamically adjusted according to the diversity and evolution stage of the current population, which positively guides the evolution of the population. Finally, an improved crossover strategy is proposed to accelerate the convergence of the algorithm. The improved whale genetic algorithm is compared with the genetic algorithm, the whale algorithm, and particle swarm optimization on two benchmark programs. The results show that the proposed algorithm generates test data faster than the comparison methods and provides better coverage with fewer evaluations, giving it a clear advantage in test data generation.
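A minimal sketch of two ingredients the abstract names, elite retention and a diversity-driven mutation threshold, on a toy bit-string problem; the whale-style position update, the MC/DC fitness function, and the improved crossover are stubbed out as assumptions.

```python
import random

def diversity(pop):
    """Mean pairwise Hamming distance, normalised to [0, 1]."""
    n, L = len(pop), len(pop[0])
    total = sum(sum(a != b for a, b in zip(p, q)) for p in pop for q in pop)
    return total / (n * n * L)

def step(pop, fitness, elite_k=2):
    ranked = sorted(pop, key=fitness, reverse=True)
    elites = ranked[:elite_k]                      # elite retention: best survive unchanged
    p_mut = 0.02 + 0.2 * (1.0 - diversity(pop))    # low diversity -> mutate more
    children = []
    while len(children) < len(pop) - elite_k:
        a, b = random.sample(ranked[: len(pop) // 2], 2)
        cut = random.randrange(1, len(a))
        child = a[:cut] + b[cut:]                  # one-point crossover
        children.append([g ^ (random.random() < p_mut) for g in child])
    return elites + children

pop = [[random.randint(0, 1) for _ in range(16)] for _ in range(20)]
for _ in range(60):
    pop = step(pop, fitness=sum)                   # toy objective: maximise number of ones
print(max(map(sum, pop)))                          # usually 16 (the optimum)
```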
Funding: Project supported by the National Natural Science Foundation of China (No. 50378041) and the Specialized Research Fund for Doctoral Programs of Higher Education (No. 20030487016).
Abstract: Many multi-story or high-rise buildings consisting of a number of identical stories can be treated as periodic spring-mass systems. General expressions for the natural frequencies, mode shapes, and the slopes and curvatures of mode shapes of the periodic spring-mass system are derived in this paper using periodic structure theory. The sensitivities of these mode parameters with respect to structural damage, which do not depend on the physical parameters of the original structure, are obtained. Based on the sensitivity analysis of these mode parameters, a two-stage method is proposed to localize and quantify damage in multi-story or high-rise buildings. The slopes and curvatures of mode shapes, which are highly sensitive to local damage, are used to localize the damage. Subsequently, the limited measured natural frequencies, which are more accurate than the other mode parameters, are used to quantify the extent of damage at the potential damage locations. The experimental results for a 3-story test building demonstrate that single or multiple damage, whether slight or severe, can be correctly localized using only the slope or curvature of the mode shape in one of the lower modes, in which the change of natural frequency is the largest, and can be accurately quantified by the limited measured natural frequencies even with noise pollution.
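A minimal sketch, assuming a uniform N-story shear building idealized as a fixed-base spring-mass chain: the natural frequencies and mode shapes come from the eigenproblem K*phi = omega^2*M*phi, and the mode-shape curvature used for localization is approximated by second differences. With equal story masses, M^-1 K stays symmetric, so a symmetric eigensolver applies.

```python
import numpy as np

N, m, ks = 5, 1.0e5, 2.0e8                 # stories, story mass (kg), story stiffness (N/m)
M = m * np.eye(N)
K = ks * (2.0 * np.eye(N) - np.eye(N, k=1) - np.eye(N, k=-1))
K[-1, -1] = ks                             # free top story

w2, phi = np.linalg.eigh(np.linalg.solve(M, K))   # M^-1 K is symmetric here (M = m*I)
freqs_hz = np.sqrt(w2) / (2.0 * np.pi)
curv = np.diff(phi[:, 0], n=2)             # second differences: curvature of mode 1
print(freqs_hz.round(2))                   # natural frequencies, lowest first
```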
Abstract: A separation method is proposed to design and improve shock absorbers according to the characteristics of each force, and the method is validated by rig tests. The force measured during a rig test is the resultant of the damping force, the rebound force produced by the compressed air, and the friction force. The different characteristics of the damping force, air rebound force, and friction force can be exploited to separate each force from the others. A mass-produced air-filled shock absorber is adopted for the validation. Static tests are used to obtain the displacement-force curves, and these data are used as the input of the separation calculation. The tests are then carried out again to obtain the force data without the air rebound force, and this force is compared with the data derived from the former tests by the separation method. The result shows that this method can separate the damping force and the air elastic force.
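A minimal numerical sketch of one common separation idea, assumed here rather than taken from the paper: over a closed displacement-force loop, components that flip sign with velocity (damping and friction) cancel when the compression and rebound strokes are averaged at equal displacement, leaving the air-spring force; the half-difference isolates the velocity-dependent part.

```python
import numpy as np

x = np.linspace(-0.05, 0.05, 201)            # displacement grid (m)
f_spring = 2.0e4 * x + 900.0                 # synthetic air-spring force (N)
f_comp = f_spring + 350.0                    # compression stroke: + (damping + friction)
f_reb  = f_spring - 350.0                    # rebound stroke: velocity sign flips

f_air = 0.5 * (f_comp + f_reb)               # recovered elastic (air) component
f_vel = 0.5 * (f_comp - f_reb)               # recovered damping + friction component
print(np.allclose(f_air, f_spring), float(f_vel.mean()))   # True 350.0
```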
Abstract: Many search-based algorithms have been successfully applied in several software engineering activities. Genetic algorithms (GAs), which imitate the theory of natural selection and evolution, are the most used by scholars to solve software testing problems. The harmony search algorithm (HSA), one of the most recent search algorithms, imitates the behavior of a musician finding the best harmony. Scholars have assessed the similarities and differences between genetic algorithms and the harmony search algorithm in diverse research domains, but no work compares their performance in the test data generation process, which is a critical task in software validation. This paper studies the similarities and differences between genetic algorithms and the harmony search algorithm based on the ability and speed of finding the required test data. The current research performs an empirical comparison of the HSA and the GAs, and the significance of the results is then estimated using the t-test. The study investigates the efficiency of the harmony search algorithm and the genetic algorithms according to (1) the time performance, (2) the significance of the generated test data, and (3) the adequacy of the generated test data to satisfy a given testing criterion. The results showed that the harmony search algorithm is significantly faster than the genetic algorithms, because the t-test showed that the p-value of the time values is 0.026 < α (where α is the significance level of 0.05 at the 95% confidence level). In contrast, there is no significant difference between the two algorithms in generating adequate test data, because the t-test showed that the p-value of the fitness values is 0.25 > α.
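A minimal sketch of the significance test applied to per-run results, assuming SciPy and an independent two-sample t-test with alpha = 0.05 as in the abstract; the run-time numbers are fabricated for illustration.

```python
from scipy import stats

hsa_times = [1.2, 1.4, 1.1, 1.3, 1.2, 1.5, 1.1, 1.3]   # seconds per run (fabricated)
ga_times  = [1.9, 2.1, 1.7, 2.4, 2.0, 1.8, 2.2, 2.3]

t_stat, p = stats.ttest_ind(hsa_times, ga_times)
alpha = 0.05
print(f"p = {p:.4f}:", "significant difference" if p < alpha else "no significant difference")
```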
Funding: Supported by the National Natural Science Foundation of China (No. 41106152), the National Science and Technology Support Program of China (No. 2013BAD13B01), the National High Technology Research and Development Program of China (863 Program) (No. 2013AA09A505), the International Science & Technology Cooperation Program of China (No. 2011DFA22260), the National High Technology Industrialization Project (No. [2012]2083), and the Marine Public Projects of China (Nos. 201105032, 201305032, 201105002-07).
Abstract: This paper introduces the background, aim, experimental design, configuration, and data processing for an airborne test flight of the HY-2 microwave scatterometer (HSCAT). The aim was to evaluate HSCAT performance and the data processing algorithm developed for the HSCAT before launch. There were three test flights of the scatterometer, on January 15, 18, and 22, 2010, over the South China Sea near Lingshui, Hainan. The test flights successfully generated simultaneous datasets of airborne scatterometer normalized radar cross section (NRCS), ASCAT winds, and ship-measured winds, which were used to analyze HSCAT performance. The azimuthal dependence of the NRCS relative to the wind direction was nearly cos(2φ), with NRCS minima at crosswind directions and maxima near upwind and downwind. The NRCS also showed a small difference between the upwind and downwind directions, with upwind cross sections generally larger than those downwind. The dependence of the airborne scatterometer NRCS on wind direction and speed showed favorable consistency with the NASA scatterometer geophysical model function (NSCAT GMF), indicating satisfactory HSCAT performance.
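A minimal sketch of the azimuthal analysis, assuming the standard two-harmonic form sigma0(phi) = A0 + A1 cos(phi) + A2 cos(2 phi): the A2 term gives the cos(2 phi) shape with crosswind minima, and A1 captures the small upwind-downwind asymmetry reported above. The NRCS samples are synthetic.

```python
import numpy as np

phi = np.deg2rad(np.arange(0.0, 360.0, 15.0))            # azimuth relative to wind
rng = np.random.default_rng(0)
sigma0 = -18.0 + 1.0 * np.cos(phi) + 4.0 * np.cos(2 * phi) + rng.normal(0, 0.3, phi.size)

X = np.column_stack([np.ones_like(phi), np.cos(phi), np.cos(2 * phi)])
A, *_ = np.linalg.lstsq(X, sigma0, rcond=None)
print(A.round(2))   # ~[-18, 1, 4]: mean level, up/downwind asymmetry, cos(2*phi) amplitude
```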
Abstract: It is now recognized that many geomaterials have nonlinear failure envelopes. This nonlinearity is most marked at lower stress levels, the failure envelope being of quasi-parabolic shape. It is not easy to calibrate these nonlinear failure envelopes from triaxial test data; currently, only the power-type failure envelope has an established formal procedure for its determination from triaxial test data. In this paper, a simplified procedure is developed for constructing four different types of nonlinear envelopes. These are of invaluable assistance in the evaluation of true factors of safety in slope stability problems and in the correct computation of lateral earth pressure and bearing capacity. The use of linear Mohr-Coulomb failure envelopes instead leads to an overestimation of the factors of safety and other geotechnical quantities.
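A minimal calibration sketch for the power-type envelope the text mentions, assuming SciPy and the simple form tau = A * sigma_n^b (some formulations add a tensile intercept); the triaxial-derived data points are fabricated.

```python
import numpy as np
from scipy.optimize import curve_fit

def power_envelope(sigma_n, A, b):
    """Power-type failure envelope: shear strength vs normal stress."""
    return A * sigma_n**b

sigma_n = np.array([25.0, 50.0, 100.0, 200.0, 400.0])   # normal stress (kPa)
tau     = np.array([28.5, 46.0, 75.5, 123.0, 199.0])    # shear strength (kPa)

(A, b), _ = curve_fit(power_envelope, sigma_n, tau, p0=(1.0, 0.8))
print(round(A, 2), round(b, 3))   # b < 1 gives the quasi-parabolic bend at low stress
```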
Abstract: Systems-on-a-chip with intellectual property cores need a large volume of test data, which requires long testing times and large test data memory. New techniques are therefore needed to optimize the test data volume, decrease the testing time, and overcome the ATE memory limitation for SOC designs. This paper presents a new test data compression method for intellectual property core-based systems-on-chip. The proposed method is based on new split-data variable length (SDV) codes that are designed using split-options along with identification bits in a string of test data. The paper analyses the reduction of test data volume, testing time, run time, and the size of memory required in the ATE, as well as the improvement of the compression ratio. Experimental results for ISCAS 85 and ISCAS 89 benchmark circuits show that SDV codes outperform other compression methods, with the best compression ratio for test data compression. The decompression architecture for SDV codes is also presented for decoding the compressed bits. The proposed scheme shows that SDV codes adapt to variations in the input test data stream.
Abstract: To solve emerging complex optimization problems, multi-objective optimization algorithms are needed. By introducing a surrogate model for approximate fitness calculation, the multi-objective firefly algorithm with surrogate model (MOFA-SM) is proposed in this paper. First, the population was initialized according to chaotic mapping. Second, the external archive was constructed based on preference sorting, with a lightweight clustering pruning strategy. In the process of evolution, elite solutions selected from the archive were used to guide the movement in the search for optimal solutions. Simulation results show that the proposed algorithm achieves better performance in terms of convergence iterations and stability.
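A minimal sketch of chaotic initialization, assuming the logistic map at r = 4 (a common choice; whether MOFA-SM uses this particular map is not stated in the abstract): chaos variables are iterated and rescaled to the search box.

```python
import numpy as np

def logistic_init(pop_size: int, dim: int, lo: float, hi: float):
    """Spread an initial population over [lo, hi]^dim with the logistic map z <- 4z(1-z)."""
    z = np.linspace(0.1, 0.8, dim)          # distinct chaos seeds per dimension
    pop = np.empty((pop_size, dim))
    for i in range(pop_size):
        z = 4.0 * z * (1.0 - z)             # logistic map iteration
        pop[i] = lo + (hi - lo) * z         # rescale chaos variable to the search box
    return pop

print(logistic_init(5, 3, -2.0, 2.0).round(3))
```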
Abstract: Driven by the requirements of weapon test and identification tasks, an experimental data acquisition and processing space station suited to a variety of extreme natural environments, such as alpine, plateau, mountain, jungle, desert, island, and reef settings, has been studied both theoretically and in practice. The station is a dome-shaped structure built from scale-shaped modules and basalt-fiber-reinforced composite materials, providing thermal insulation, ventilation, and a continuous power supply. It supports real-time monitoring, recovery, and transmission of test data, and meets the basic working and living needs of test personnel.
Funding: This project is supported by the National Natural Science Foundation of China (No. 50405009).
Abstract: In order to realize visualization of a three-dimensional data field (TDDF) in instruments, two methods of TDDF visualization and the usual manner of fast graphics and image processing are analyzed. How to use OpenGL techniques and the characteristics of the analyzed data to construct a TDDF, along with the approaches to realistic rendering and interactive processing, is described. The intermediate geometric elements and a related realistic model are then constructed by means of the first algorithm, and models for attaching the third dimension in a three-dimensional data field are presented. An example of TDDF realization for machine measurement is provided. Analysis of the resulting graphics indicates that the three-dimensional graphics built by the developed method feature good realism, fast processing, and strong interactivity.
Funding: Funded by the CAS "Light of West China" Program (grant Nos. 2021-XBQNXZ-030 and 2021-XBQNXZ-005), the Xinjiang Key Laboratory of Radio Astrophysics (grant No. 2023D04064), and the National Key R&D Program of China (grant No. 2024YFA1611503).
Abstract: Accurate estimation of Zenith Tropospheric Delay (ZTD) is essential for mitigating atmospheric effects in radio astronomical observations and improving the retrieval of precipitable water vapor (PWV). In this study, we first analyze the periodic characteristics of ZTD at the NanShan Radio Telescope site using the Fourier transform, revealing its dominant seasonal variations, and then investigate the correlation between ZTD and local meteorological parameters to better understand atmospheric influences on tropospheric delay. Based on these analyses, we propose a hybrid deep learning Gated Recurrent Unit-Long Short-Term Memory (GRU-LSTM) model, incorporating meteorological parameters as external inputs to enhance ZTD forecasting accuracy. Experimental results demonstrate that the proposed approach achieves a root mean squared error of 7.97 mm and a correlation coefficient R of 96%, significantly outperforming traditional empirical models and standalone deep learning architectures. These findings indicate that the model effectively captures both short-term dynamics and long-term dependencies in ZTD variations. The improved ZTD predictions not only help reduce atmospheric errors in radio astronomical observations but also provide a more reliable basis for PWV retrieval and forecasting. This study highlights the potential of deep learning in tropospheric delay modeling, offering advancements in both atmospheric science and geodetic applications.
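A minimal sketch of a GRU-to-LSTM stack for next-step ZTD prediction with meteorological covariates, written in PyTorch; the window length, layer sizes, and feature set ([ZTD, pressure, temperature, humidity]) are assumptions, not the paper's reported architecture.

```python
import torch
import torch.nn as nn

class GRULSTM(nn.Module):
    def __init__(self, n_features=4, hidden=64):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden, batch_first=True)
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)        # next-step ZTD (mm)

    def forward(self, x):                       # x: (batch, window, features)
        h, _ = self.gru(x)
        h, _ = self.lstm(h)
        return self.head(h[:, -1])              # last time step -> prediction

model = GRULSTM()
x = torch.randn(8, 24, 4)                       # 8 windows of 24 epochs, 4 features each
print(model(x).shape)                           # -> torch.Size([8, 1])
```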
Abstract: Quantitatively correcting the unconfined compressive strength for sample disturbance is an important research topic in ocean engineering and geotechnical engineering practice. In this study, specimens of undisturbed natural marine clay obtained from the same depth at the same site were deliberately disturbed to different levels. The specimens with different extents of sample disturbance were then trimmed for both oedometer tests and unconfined compression tests. The degree of sample disturbance SD is obtained from the oedometer test data, and the relationship between the unconfined compressive strength q_u and SD is studied to investigate the effect of sample disturbance on q_u. It is found that the value of q_u decreases linearly with increasing SD. A simple method of correcting q_u for sample disturbance is then proposed, and its validity is verified through analysis of existing published data.
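A minimal sketch of the correction idea implied by the linear q_u-SD relationship: fit a straight line to paired measurements and read off the intercept at SD = 0 as the undisturbed strength. The data points are fabricated.

```python
import numpy as np

SD  = np.array([0.05, 0.12, 0.21, 0.33, 0.41])    # degree of sample disturbance (fabricated)
q_u = np.array([58.0, 52.5, 46.1, 37.9, 32.0])    # measured unconfined strength (kPa)

slope, intercept = np.polyfit(SD, q_u, 1)         # q_u = intercept + slope * SD
print(f"corrected q_u at SD = 0: {intercept:.1f} kPa (slope {slope:.1f} kPa per unit SD)")
```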
Funding: Support from the Deanship of Scientific Research, University of Hail, Saudi Arabia, through project Ref. (RG-191315).
Abstract: Software testing has been attracting a lot of attention for effective software development. In model-driven approaches, the Unified Modelling Language (UML) is a conceptual modelling approach for capturing obligations and other features of the system; specialized tools interpret these models into other software artifacts such as code, test data, and documentation. The generation of test cases permits the appropriate test data to be determined that have the aptitude to ascertain the requirements. This paper focuses on optimizing the test data obtained from UML activity and state chart diagrams by using a Basic Genetic Algorithm (BGA). For generating the test cases, both diagrams were converted into their corresponding intermediate graphical forms, namely the Activity Diagram Graph (ADG) and the State Chart Diagram Graph (SCDG). Both graphs were then joined to create a single graph known as the Activity State Chart Diagram Graph (ASCDG). Next, the ASCDG was optimized using the BGA to generate the test data. A case study involving a withdrawal from an automated teller machine (ATM) of a bank was employed to demonstrate the approach. The approach successfully identified defects in various ATM functions such as messaging and operation.
Funding: National Natural Science Foundation of China (No. 41171318); National Key Technology Support Program (Nos. 2012BAH32B03, 2012BAH33B05); the Remote Sensing Investigation and Assessment Project for Decade-Change of the National Ecological Environment (2000–2010).
Abstract: To understand the variations in vegetation and their correlation with climate factors in the upper catchments of the Yellow River, China, Normalized Difference Vegetation Index (NDVI) time series data from 2000 to 2010 were collected from the MOD13Q1 product. The coefficient of variation, Theil–Sen median trend analysis, and the Mann–Kendall test were combined to investigate the volatility and trend characteristics of the vegetation, and climate data sets were then used to analyze the correlation between variations in vegetation and climate change. In terms of temporal variation, the vegetation in the study area improved slightly from 2000 to 2010, although volatility was larger in 2000–2005 than in 2006–2010. In terms of spatial variation, vegetation that is relatively stable and has a significantly increasing trend accounts for the largest part of the study area; its spatial distribution is highly correlated with altitude, which ranges from about 2000 to 3000 m in this area. Highly fluctuating vegetation and vegetation showing a significantly decreasing trend were mostly distributed around the reservoirs and in the reaches of the river with hydropower developments. Vegetation with a relatively stable but significantly decreasing trend and vegetation with a highly fluctuating but significantly increasing trend are widely dispersed. With respect to the response of vegetation to climate change, about 20–30% of the vegetation has a significant correlation with climatic factors, and the correlations in most areas are positive: regions with precipitation as the key influencing factor account for more than 10% of the area; regions with temperature as the key influencing factor account for less than 10%; and regions with both precipitation and temperature as key influencing factors account for about 5% of the total area. More than 70% of the vegetation has an insignificant correlation with climatic factors.
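A minimal sketch of the two trend tools named above, assuming a yearly NDVI series and no tie correction in the Mann-Kendall variance (a simplification): the Theil-Sen estimator is the median of all pairwise slopes, and the Mann-Kendall Z is formed from the sign statistic S.

```python
import itertools
import math
import numpy as np

ndvi = np.array([0.41, 0.43, 0.42, 0.45, 0.44, 0.47, 0.46, 0.48, 0.50, 0.49, 0.51])
years = np.arange(2000, 2011)

pairs = list(itertools.combinations(range(ndvi.size), 2))
sen = float(np.median([(ndvi[j] - ndvi[i]) / (years[j] - years[i]) for i, j in pairs]))

S = sum(np.sign(ndvi[j] - ndvi[i]) for i, j in pairs)      # Mann-Kendall S statistic
n = ndvi.size
var_S = n * (n - 1) * (2 * n + 5) / 18.0                   # variance of S, no ties
Z = (S - np.sign(S)) / math.sqrt(var_S)                    # continuity-corrected Z score
print(f"Sen slope = {sen:.4f} NDVI/yr, Mann-Kendall Z = {Z:.2f}")  # |Z| > 1.96: significant at 5%
```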