Abstract: In this paper, a phase-randomized surrogate data method is proposed to identify whether data obtained in dynamic analysis are random or chaotic. The computed results validate the phase-randomized method as useful, since it increases the accuracy of the results, and they show that the threshold values of random time series and of nonlinear chaotic time series differ markedly.
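To make the procedure concrete, here is a minimal numpy sketch of standard phase randomization: Fourier-transform the series, randomize the phases while preserving the amplitude spectrum, and invert; a nonlinearity statistic computed on the data and on an ensemble of surrogates then discriminates chaos from linear noise. The statistic and parameters below are illustrative assumptions, not the paper's exact choices.

```python
import numpy as np

def phase_randomized_surrogate(x, rng):
    """Surrogate with the same power spectrum as x but random phases,
    i.e., a realization of the linear stochastic null hypothesis."""
    n = len(x)
    spectrum = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(spectrum))
    phases[0] = 0.0                  # keep the mean (DC bin) real
    if n % 2 == 0:
        phases[-1] = 0.0             # keep the Nyquist bin real
    return np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n)

def rev_stat(x):
    """Time-reversal asymmetry, a simple nonlinearity statistic."""
    return np.mean((x[1:] - x[:-1]) ** 3)

# Chaotic test series: the fully developed logistic map.
x = np.empty(1024)
x[0] = 0.4
for i in range(1, len(x)):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])

rng = np.random.default_rng(0)
surrogate_stats = [rev_stat(phase_randomized_surrogate(x, rng)) for _ in range(99)]
# If rev_stat(x) falls outside the surrogate distribution, the linear
# (random) null is rejected and nonlinear/chaotic structure is indicated.
print(rev_stat(x), np.quantile(surrogate_stats, [0.025, 0.975]))
```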
Funding: National Natural Science Foundation of China (41674068); Seismic Youth Funding of GEC (YFGEC2016001)
Abstract: As a relatively new method for processing non-stationary signals with high time-frequency resolution, the S transform can be used to analyze the time-frequency characteristics of seismic signals. It has the following characteristics: a time-frequency resolution that scales with signal frequency, a reversible inverse transform, and a basic wavelet that need not satisfy the admissibility condition. Combining S-transform time-frequency filtering with a thresholding method, we propose S-transform threshold filtering and apply it to airgun seismic records from temporary stations in the "Yangtze Program" (the Anhui experiment). Compared with bandpass filtering, S-transform threshold filtering improves the signal-to-noise ratio (SNR) of seismic waves and provides effective help for first-arrival picking and accurate travel times. The first-arrival seismic phase can be traced farther and more continuously, and the Pm seismic phase in the subsequent zone is also highlighted.
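The abstract does not spell out the thresholding rule; a minimal numpy sketch of the discrete S transform (Stockwell's frequency-domain formulation) with a simple hard threshold, exploiting the fact that summing the S transform over time recovers the Fourier spectrum, might look like this. The threshold value is an assumption, and the O(N^2) matrix limits the sketch to short records.

```python
import numpy as np

def s_transform(x):
    """Discrete S transform: row n holds the voice for frequency bin n."""
    N = len(x)
    H = np.fft.fft(x)
    m = np.fft.fftfreq(N) * N                # signed FFT bin indices
    S = np.zeros((N, N), dtype=complex)
    S[0, :] = x.mean()                       # zero-frequency voice
    for n in range(1, N):
        gauss = np.exp(-2.0 * np.pi ** 2 * m ** 2 / n ** 2)  # Gaussian window
        S[n, :] = np.fft.ifft(np.roll(H, -n) * gauss)        # shifted spectrum
    return S

def s_threshold_filter(x, thresh):
    """Zero out low-amplitude time-frequency coefficients, then invert."""
    S = s_transform(x)
    S[np.abs(S) < thresh] = 0.0
    H_filt = S.sum(axis=1)     # time sum of each voice recovers the spectrum
    return np.fft.ifft(H_filt).real
```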
Abstract: With the simultaneous rise of energy costs and demand for cloud computing, efficient control of data centers becomes crucial. In the data center control problem, one needs to plan at every time step how many servers to switch on or off in order to meet stochastic job arrivals while trying to minimize electricity consumption. This problem becomes particularly challenging when servers can be of various types and jobs from different classes can only be served by certain types of server, as is often the case in real data centers. We model this problem as a robust Markov decision process (i.e., the transition function is not assumed to be known precisely). We give sufficient conditions (which seem to be reasonable and satisfied in practice) guaranteeing that an optimal threshold policy exists. This property can then be exploited in the design of an efficient solution method, which we provide. Finally, we present experimental results demonstrating the practicability of our approach and compare it with a previous related approach based on model predictive control.
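As an illustration of what a threshold policy means here, the following sketch simulates a single job class with an assumed step-function policy: the number of active servers is a nondecreasing step function of the backlog. The dynamics, rates, and threshold levels are all illustrative assumptions, not the paper's robust-MDP model.

```python
import random

def threshold_policy(backlog, thresholds):
    """thresholds: ascending (backlog_level, servers_on) pairs. Returns the
    number of servers to keep on: a step function of the backlog, which is
    the policy structure the paper proves optimal."""
    servers_on = 0
    for level, n_on in thresholds:
        if backlog >= level:
            servers_on = n_on
    return servers_on

def simulate(steps=1000, seed=0):
    rng = random.Random(seed)
    thresholds = [(0, 1), (5, 3), (15, 6), (40, 10)]   # assumed step levels
    backlog, energy = 0, 0
    for _ in range(steps):
        backlog += sum(1 for _ in range(6) if rng.random() < 0.5)  # ~3 jobs/step
        on = threshold_policy(backlog, thresholds)
        backlog = max(0, backlog - on)   # each active server clears one job
        energy += on                     # unit electricity per server-step
    return backlog, energy

print(simulate())
```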
Funding: National Natural Science Foundation of China (Nos. 40830742 and 40901007)
Abstract: Debris flows are the type of natural disaster most closely associated with human activities; they are widely distributed and frequently triggered. Rainfall is an important component of debris flows and the most active factor in their occurrence, and it determines the temporal and spatial distribution characteristics of the hazards. A reasonable rainfall threshold is essential to ensuring the accuracy of debris flow pre-warning, and such a threshold is important for studying the mechanisms of debris flow formation, predicting the characteristics of future activity, and designing prevention and engineering control measures. Most mountainous areas have little data on rainfall and hazards, especially in debris flow forming regions, so neither the traditional demonstration method nor the frequency calculation method can satisfy debris flow pre-warning requirements. This study characterizes the pre-warning region, including its rainfall, hydrologic, and topographic conditions. An analogous area with abundant data and the same conditions as the pre-warning region was selected, and the rainfall threshold was calculated by proxy. This method resolves the problem of debris flow pre-warning in areas lacking data and provides a new approach for debris flow pre-warning in mountainous areas.
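The abstract does not give the transfer formula; as a purely hypothetical illustration of "calculating the threshold by proxy", the sketch below assumes a standard intensity-duration power-law threshold I = a * D^b fitted in the analogous region and rescales its coefficient by the ratio of mean annual precipitation (MAP) between the two regions. Both the form of the transfer and every number here are assumptions.

```python
# Hypothetical threshold transfer by analogy; not the paper's formula.
def id_threshold(duration_h, a, b):
    """Critical rainfall intensity (mm/h) for a storm of given duration,
    using the common power-law intensity-duration form I = a * D**b."""
    return a * duration_h ** b

def transfer_coefficient(a_analog, map_target, map_analog):
    """Scale the analogous region's coefficient to the data-poor target
    region by the ratio of mean annual precipitation (assumed heuristic)."""
    return a_analog * (map_target / map_analog)

a_target = transfer_coefficient(a_analog=14.8, map_target=900.0, map_analog=1100.0)
for d in (1, 6, 24):
    print(d, "h:", round(id_threshold(d, a_target, b=-0.39), 2), "mm/h")
```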
Funding: National High Technology Research and Development Program of China (863 Program) (2007AA0825)
Abstract: A covert transmission scheme for massive data based on the Shamir threshold scheme is proposed in this paper. The method applies the Shamir threshold scheme to divide the data, uses information-hiding technology to conceal the resulting shadows, and realizes covert transmission of massive data by transmitting the stego-covers. Analysis shows that, compared with the natural division method, this scheme not only improves transmission time-efficiency but also enhances security.
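A minimal sketch of the data-division step, assuming textbook (k, n) Shamir secret sharing applied byte-wise over GF(257); the information-hiding step that embeds each shadow in a stego-cover is outside the sketch, and chunked rather than byte-wise division would be used for truly massive data.

```python
import random

P = 257  # prime just above the byte range

def split(data: bytes, k: int, n: int, seed=None):
    """Divide data into n shadows; any k of them reconstruct it."""
    rng = random.Random(seed)
    shadows = [[] for _ in range(n)]
    for byte in data:
        coeffs = [byte] + [rng.randrange(P) for _ in range(k - 1)]
        for x in range(1, n + 1):
            y = 0
            for c in reversed(coeffs):       # Horner evaluation mod P
                y = (y * x + c) % P
            shadows[x - 1].append(y)
    return [(i + 1, s) for i, s in enumerate(shadows)]

def combine(shares, k):
    """Recover the data from any k shadows via Lagrange interpolation at 0."""
    pts = shares[:k]
    out = []
    for i in range(len(pts[0][1])):
        secret = 0
        for j, (xj, ys) in enumerate(pts):
            num, den = 1, 1
            for m, (xm, _) in enumerate(pts):
                if m != j:
                    num = (num * -xm) % P
                    den = (den * (xj - xm)) % P
            secret = (secret + ys[i] * num * pow(den, P - 2, P)) % P
        out.append(secret)
    return bytes(out)

shares = split(b"covert payload", k=3, n=5, seed=1)
assert combine(shares, 3) == b"covert payload"
```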
Abstract: In gene prediction, Fisher discriminant analysis (FDA) is used to separate protein-coding regions (exons) from non-coding regions (introns). Usually the positive and negative data sets are of the same size when enough data are available, but when the data are insufficient or unbalanced, the threshold used in FDA may strongly influence prediction results. This paper presents a study on the selection of that threshold. The feature vector of each exon/intron sequence is computed using the Z-curve method with 69 variables. The experimental results suggest that the sizes of the data sets, their standard deviations, and the threshold are the three key elements to take into consideration to improve prediction results.
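A minimal sketch of FDA with a tunable decision threshold, with random vectors standing in for the 69-dimensional Z-curve features; the size/spread-weighted threshold is one assumed heuristic for unbalanced sets, not the paper's selection rule.

```python
import numpy as np

def fisher_direction(X_pos, X_neg):
    """Projection vector w = Sw^{-1} (mu_pos - mu_neg)."""
    mu_p, mu_n = X_pos.mean(axis=0), X_neg.mean(axis=0)
    Sw = np.cov(X_pos, rowvar=False) * (len(X_pos) - 1) \
       + np.cov(X_neg, rowvar=False) * (len(X_neg) - 1)
    return np.linalg.solve(Sw, mu_p - mu_n)

rng = np.random.default_rng(0)
X_pos = rng.normal(0.5, 1.0, (300, 69))   # stand-in for exon Z-curve features
X_neg = rng.normal(0.0, 1.0, (120, 69))   # smaller, unbalanced negative set
w = fisher_direction(X_pos, X_neg)

# Midpoint threshold vs. one assumed heuristic that weights each class
# mean by the other class's spread, shifting the cut on unbalanced data.
s_p, s_n = X_pos @ w, X_neg @ w
t_mid = 0.5 * (s_p.mean() + s_n.mean())
t_weighted = (s_p.mean() * s_n.std() + s_n.mean() * s_p.std()) \
           / (s_p.std() + s_n.std())
is_exon = (s_n > t_weighted)              # ideally mostly False
print(t_mid, t_weighted, is_exon.mean())
```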
Abstract: A new classification algorithm for web mining is proposed on the basis of general classification algorithms for data mining, in order to implement personalized information services. A tree-building method that detects class thresholds is used to construct the decision tree according to the concept of user expectation, so as to find classification rules in different layers. Compared with the traditional C4.5 algorithm, the overfitting tendency of C4.5 is reduced, so the classification results are not only considerably more accurate but also statistically meaningful.
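The abstract leaves the algorithm's details open; one assumed reading of "detecting class thresholds" is an ID3/C4.5-style splitter that declares a leaf as soon as the dominant class proportion reaches a user-expectation threshold, curbing the overfitting of fully grown trees. A sketch under that assumption:

```python
from collections import Counter
import math

def entropy(labels):
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def build_tree(rows, labels, attrs, threshold=0.9):
    """rows: list of attribute dicts; stop splitting once the dominant
    class reaches the (assumed) user-expectation class threshold."""
    top_class, top_count = Counter(labels).most_common(1)[0]
    if not attrs or top_count / len(labels) >= threshold:
        return top_class                     # class threshold reached: leaf
    def gain(a):                             # information gain, as in ID3/C4.5
        rem = 0.0
        for v in set(r[a] for r in rows):
            sub = [l for r, l in zip(rows, labels) if r[a] == v]
            rem += len(sub) / len(labels) * entropy(sub)
        return entropy(labels) - rem
    best = max(attrs, key=gain)
    tree = {}
    for v in set(r[best] for r in rows):
        idx = [i for i, r in enumerate(rows) if r[best] == v]
        tree[(best, v)] = build_tree([rows[i] for i in idx],
                                     [labels[i] for i in idx],
                                     [a for a in attrs if a != best],
                                     threshold)
    return tree
```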
Funding: National Basic Research Program of China (973 Program) (2005CD312904)
Abstract: In order to solve the problem of workflow data consistency in a distributed environment, an invalidation strategy based on a timely-updated record list is put forward. The strategy improves on the classical invalidation strategy with a record-list updating method and a recovery mechanism for update messages. When the request cycle of a replica is too long, the record-list updating method pauses the sending of update messages to it; when such a long-cycle replica is requested again, the recovery mechanism resumes the update messages. This strategy not only ensures the consistency of the workflow data but also reduces unnecessary network traffic. Theoretical comparison with common strategies shows that its unnecessary network traffic is lower and more stable, and the simulation results validate this conclusion.
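A small sketch of the record-list idea as described: the data owner records each replica's last request time, marks replicas whose request cycle exceeds a limit as dormant (pausing their update messages), and sends a recovery update when a dormant replica requests the data again. Class and parameter names are assumptions.

```python
import time

class RecordList:
    def __init__(self, cycle_limit=300.0):
        self.cycle_limit = cycle_limit   # assumed long-cycle cutoff, seconds
        self.last_request = {}           # replica id -> last request time
        self.dormant = set()

    def on_request(self, replica, now=None):
        """Record a replica's request; report if a recovery update is due."""
        now = time.time() if now is None else now
        recovered = replica in self.dormant
        self.dormant.discard(replica)
        self.last_request[replica] = now
        return recovered                 # True: send the recovery update first

    def update_targets(self, now=None):
        """Replicas that should still receive update messages."""
        now = time.time() if now is None else now
        for r, t in self.last_request.items():
            if now - t > self.cycle_limit:
                self.dormant.add(r)      # pause updates to long-cycle replicas
        return [r for r in self.last_request if r not in self.dormant]
```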
Funding: National Natural Science Foundation of China (Nos. 50908148 and 50925829); Research Project of the Ministry of Housing and Urban-Rural Development of China (Nos. 2009-K4-23, 2010-11-33); National Key Technologies R&D Program of China (No. 2006BAJ02B04)
Abstract: The probability distributions of the critical threshold chloride concentration Ccr, the chloride diffusion coefficient D, and the surface chloride concentration Cs are determined from collected natural exposure data, and the probability of reinforcement depassivation in concrete is estimated using Monte-Carlo simulation. A sensitivity analysis of the mean values of Ccr, Cs, and D on the depassivation probability of the reinforcement shows that Ccr has the greatest effect, Cs a smaller one, and D the lowest. Finally, the effect of the stress state of the concrete on the reinforcement depassivation probability is analyzed; its influence becomes apparent as exposure time increases.
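A Monte-Carlo sketch of the depassivation-probability estimate, assuming the standard Fick's-second-law chloride profile C(x, t) = Cs * (1 - erf(x / (2 * sqrt(D * t)))) and counting the fraction of samples in which the concentration at the rebar reaches Ccr; the distribution families and parameter values are placeholders, not the paper's fitted natural-exposure data.

```python
import numpy as np
from scipy.special import erf

def depassivation_probability(t_years, cover_mm=50.0, n=100_000, seed=0):
    """Fraction of Monte-Carlo samples whose chloride concentration at the
    rebar depth reaches the critical threshold after t_years of exposure."""
    rng = np.random.default_rng(seed)
    Ccr = rng.lognormal(np.log(0.9), 0.3, n)    # % binder, assumed distribution
    Cs  = rng.lognormal(np.log(3.5), 0.4, n)    # % binder, assumed distribution
    D   = rng.lognormal(np.log(50.0), 0.5, n)   # mm^2/year, assumed distribution
    C = Cs * (1.0 - erf(cover_mm / (2.0 * np.sqrt(D * t_years))))
    return np.mean(C >= Ccr)

for t in (10, 25, 50, 100):
    print(t, "years:", depassivation_probability(t))
```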
Abstract: Temporal activity patterns in animals emerge from complex interactions between choices made by organisms in response to biotic interactions and challenges posed by external factors. A temporal activity pattern is an inherently continuous process, even though it is recorded as a time series: the discreteness of the data set reflects data-acquisition limitations rather than a truly discrete underlying phenomenon. Curves are therefore a natural representation for high-frequency data. Here, we model temporal activity data fully as curves, integrating wavelets and functional data analysis, which allows hypotheses to be tested on curves rather than on scalar- or vector-valued data. Temporal activity data were obtained experimentally for males and females of a small-bodied marsupial and modelled with wavelets under both independent and identically distributed errors and dependent errors. The null hypothesis of no difference between the male and female temporal activity pattern curves was tested with functional analysis of variance (FANOVA). The null hypothesis was rejected, and we discuss the differences between the male and female activity curves in terms of ecological and life-history attributes of the reference species. We also performed numerical analyses that shed light on the regularity properties of the wavelet bases used and on the thresholding parameters.
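A condensed sketch of the pipeline's two computational steps, assuming PyWavelets for the wavelet representation: soft-threshold each activity series with the universal threshold, then compare the two groups of curves with a permutation version of a pointwise-F FANOVA statistic. The wavelet family, threshold rule, and global statistic are assumptions standing in for the paper's exact choices.

```python
import numpy as np
import pywt

def denoise(series, wavelet="db4", level=4):
    """Wavelet soft-thresholding with the universal threshold."""
    coeffs = pywt.wavedec(series, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # noise estimate
    thresh = sigma * np.sqrt(2.0 * np.log(len(series)))  # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                            for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(series)]

def fanova_stat(curves_a, curves_b):
    """Integrated pointwise F statistic between two groups of curves
    (rows = individuals, columns = time points)."""
    all_c = np.vstack([curves_a, curves_b])
    grand = all_c.mean(axis=0)
    between = len(curves_a) * (curves_a.mean(0) - grand) ** 2 \
            + len(curves_b) * (curves_b.mean(0) - grand) ** 2
    within = ((curves_a - curves_a.mean(0)) ** 2).sum(0) \
           + ((curves_b - curves_b.mean(0)) ** 2).sum(0)
    return np.sum(between / (within / (len(all_c) - 2)))

def permutation_p(curves_a, curves_b, n_perm=999, seed=0):
    """Permutation p-value for the FANOVA null of equal mean curves."""
    rng = np.random.default_rng(seed)
    obs = fanova_stat(curves_a, curves_b)
    pooled = np.vstack([curves_a, curves_b])
    hits = 0
    for _ in range(n_perm):
        idx = rng.permutation(len(pooled))
        pa, pb = pooled[idx[: len(curves_a)]], pooled[idx[len(curves_a):]]
        hits += fanova_stat(pa, pb) >= obs
    return (hits + 1) / (n_perm + 1)
```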