Estimation of random errors, which are due to shot noise of photomultiplier tube (PMT) or avalanche photodiode (APD) detectors, is essential in lidar observation. Because the incident electrons follow a Poisson distribution, the standard deviation of the signal is proportional to the square root of its mean value. Based on this relationship, a noise scale factor (NSF) is introduced into the estimation, which requires only a single data sample. This method avoids the distortion caused by atmospheric fluctuations during the calculation of random errors. The results show that the method is feasible and reliable.
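The square-root relationship behind the NSF method can be sketched in a few lines. The function below is an editorial illustration, not the paper's implementation: its name is invented, and it assumes the NSF has already been calibrated so that the recorded signal S relates to the underlying photon count N by S = NSF × N, which gives std(S) = sqrt(NSF × S).

```python
import math

def random_error_estimate(signal, nsf):
    """Estimate the shot-noise standard deviation from a single sample.

    For Poisson photon counts N, std(N) = sqrt(mean(N)); if the recorded
    signal is S = NSF * N, then std(S) = NSF * sqrt(N) = sqrt(NSF * S).
    `nsf` (noise scale factor) is assumed to be calibrated beforehand.
    """
    return math.sqrt(nsf * signal)

# With NSF = 1 the signal is a raw photon count and the estimate
# reduces to the familiar sqrt(N) shot-noise rule.
print(random_error_estimate(400.0, 1.0))  # -> 20.0
```

This is why a single data sample suffices: no ensemble of profiles (and hence no exposure to atmospheric fluctuation between shots) is needed to estimate the random error.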
The effects of statistically dependent random errors on the sidelobe level are analyzed for a linear array. It is shown that the random errors cause a rise in the sidelobe level. Simple formulas are also obtained for the case of independent random errors.
Chaos theory has taught us that a system which has both nonlinearity and random input will most likely produce irregular data. If random errors are irregular data, then the random error process will give rise to nonlinearity (Kantz and Schreiber (1997)). Tsai (1986) introduced a composite test for autocorrelation and heteroscedasticity in linear models with AR(1) errors. Liu (2003) introduced a composite test for correlation and heteroscedasticity in nonlinear models with DBL(p, 0, 1) errors. Therefore, important problems in regression modeling are the detection of bilinearity, correlation, and heteroscedasticity. In this article, the authors discuss the more general case of nonlinear models with DBL(p, q, 1) random errors using the score test. Several statistics for testing bilinearity, correlation, and heteroscedasticity are obtained and expressed in simple matrix formulas. The results for regression models with linear errors are extended to those with bilinear errors. A simulation study is carried out to investigate the powers of the test statistics. All results of this article extend and develop those of Tsai (1986), Wei, et al (1995), and Liu, et al (2003).
To achieve better observation of the sea surface, a new generation of wide-swath interferometric altimeter satellites has been proposed. Before satellite launch, it is particularly important to study data processing methods and carry out a detailed error analysis for ocean satellites, because these directly determine the ultimate ability of the satellite to capture ocean information. For this purpose, ocean eddies are considered as a specific case of ocean signals, since they cause significant changes in sea surface elevation and are suitable for theoretical simulation of the sea surface and systematic simulation of the altimeter. We analyzed the impacts of random error and baseline error on the sea surface and ocean signals, and proposed a combined strategy of low-pass filtering, empirical orthogonal function (EOF) decomposition, and linear fitting to remove the errors. Through this strategy, sea surface anomalies caused by errors were considerably reduced, and the capability of the satellite to capture ocean information was enhanced. Notably, we found that the baseline error in sea surface height data was likely to cause inaccuracy in eddy boundary detection, as well as false eddy detection. These abnormalities could be prevented by using "clean" sea surface height data after error removal.
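The low-pass step of that error-removal strategy can be illustrated with the simplest possible filter. The moving average below is an editorial sketch under the assumption of a 1-D along-track sea surface height profile; the paper does not specify this particular filter, and the EOF and linear-fitting steps are omitted.

```python
def moving_average(signal, window):
    """Centered moving-average low-pass filter on a 1-D sequence.

    `window` is the (odd) filter length in samples; near the edges the
    window is truncated so the output has the same length as the input.
    This suppresses short-wavelength (e.g. random-error) components
    while preserving the large-scale signal.
    """
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out
```

A constant profile passes through unchanged, while an isolated spike is spread out and attenuated, which is the qualitative behavior wanted from the low-pass stage.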
Based on probability and statistics, a design method for precision cam profiles concerning the influence of random processing errors is advanced. By combining design with processing, it can be predicted at the design stage whether a cam profile can be successfully processed, so the cam can be designed by balancing economy and reliability. In addition, a fuzzy inference method based on Bayes' formula is advanced to evaluate the processing feasibility of the designed precision cam profile, and it requires only a few samples.
Species evolution is essentially a random process of interaction between biological populations and their environments. As a result, some physical parameters in evolution models are subject to statistical fluctuations. In this work, two important parameters in the Eigen model, the fitness and the mutation rate, are treated simultaneously as Gaussian-distributed random variables to examine the behavior of the error threshold. Numerical simulation results show that the error threshold in the fully random model appears as a crossover region instead of a phase transition point, and as the fluctuation strength increases the crossover region becomes smoother and smoother. Furthermore, it is shown that the randomization of the mutation rate plays a dominant role in changing the error threshold in the fully random model, which is consistent with existing experimental data. The implication of the threshold change due to randomization for antiviral strategies is discussed.
A radiotherapy treatment margin formula has been analytically derived for the case in which the standard deviation (SD) of systematic positioning errors Σ is relatively small compared with the SD of random positioning errors σ. The margin formula for 0 ≤ Σ ≤ σ was calculated by linearly interpolating the two boundaries at Σ = 0 and Σ = σ, assuming that the van Herk margin approximation k1Σ + k2σ is valid at Σ = σ. It was shown that the margin for 0 ≤ Σ ≤ σ may be approximated by k1σ + k2Σ, leading to the more general form k1 max(Σ, σ) + k2 min(Σ, σ), a piecewise linear approximation for any values of Σ and σ.
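The closing formula lends itself to a direct implementation. In the sketch below, the default coefficients follow the common van Herk recipe (k1 = 2.5, k2 = 0.7, as in the 2.5Σ + 0.7σ margin); these defaults are an assumption of this sketch, not values quoted from the abstract.

```python
def treatment_margin(sigma_sys, sigma_rand, k1=2.5, k2=0.7):
    """Piecewise-linear margin M = k1*max(S, s) + k2*min(S, s).

    `sigma_sys` (Sigma) is the SD of systematic positioning errors and
    `sigma_rand` (sigma) the SD of random positioning errors. Because
    max/min are used, the formula is symmetric in its two arguments and
    reduces to k1*Sigma + k2*sigma when Sigma >= sigma.
    """
    return k1 * max(sigma_sys, sigma_rand) + k2 * min(sigma_sys, sigma_rand)
```

For example, with Σ = 2 mm and σ = 3 mm the margin is 2.5 × 3 + 0.7 × 2 = 8.9 mm, and swapping the two SDs gives the same value, as the max/min form requires.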
Throughout this paper, let (Ω, ι, μ) be a probability space, D the collection of all left-continuous distribution functions, D^+ = {F ∈ D | F(0) = 0}, L(Ω) the collection of all random variables which are a.s. finite on Ω, and L^+ = {ξ ∈ L(Ω) | ξ ≥ 0 a.s.}. For random metric (normed) spaces, see [1] or [2]. Theorem 1. Let (M, d) be a complete metric space, f: M → M a contraction mapping with contraction coefficient α ∈ [0, 1), and L(Ω, m) the collection of all M-valued random variables…
For the distribution of the mean error under independent but not identically distributed conditions, an approximating distribution whose precision reaches O is obtained.
This study investigates the effects of systematic errors in phase inversions on the success rate and number of iterations of the optimized quantum random-walk search algorithm. Using the geometric description of this algorithm, a model of the algorithm with phase errors is established, and the relationship between the success rate of the algorithm, the database size, the number of iterations, and the phase error is determined. For a given database size, we obtain both the maximum success rate of the algorithm and the required number of iterations when phase errors are present. Analyses and numerical simulations show that the optimized quantum random-walk search algorithm is more robust against phase errors than Grover's algorithm.
This paper proves that a good linear block encoder is in fact a good local-random sequence generator. Furthermore, this statement reveals the deep relationship between error-correcting coding theory and modern cryptography.
The accuracy of landslide susceptibility prediction (LSP) mainly depends on the precision of the landslide spatial position. However, the spatial position error of landslide surveys is inevitable, resulting in considerable uncertainties in LSP modeling. To overcome this drawback, this study explores the influence of positional errors of the landslide spatial position on LSP uncertainties, and then innovatively proposes a semi-supervised machine learning model to reduce the landslide spatial position error. This paper collected 16 environmental factors and 337 landslides with accurate spatial positions, taking Shangyou County of China as an example. The 30–110 m error-based multilayer perceptron (MLP) and random forest (RF) models for LSP are established by randomly offsetting the original landslides by 30, 50, 70, 90 and 110 m. The LSP uncertainties are analyzed through the LSP accuracy and distribution characteristics. Finally, a semi-supervised model is proposed to relieve the LSP uncertainties. Results show that: (1) the LSP accuracies of error-based RF/MLP models decrease as the landslide position errors increase, and are lower than those of models based on the original data; (2) the 70 m error-based models can still reflect the overall distribution characteristics of the landslide susceptibility indices, so original landslides with certain position errors are acceptable for LSP; (3) the semi-supervised machine learning model can efficiently reduce the landslide position errors and thus improve the LSP accuracies.
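The 30–110 m perturbation step can be sketched as a fixed-radius offset in a uniformly random direction. The function below is an editorial illustration (the paper does not publish its offsetting code), and it assumes coordinates in a metric projection such as UTM so that the offset distance is in meters.

```python
import math
import random

def offset_position(x, y, radius_m):
    """Displace a landslide point by `radius_m` meters in a random
    direction, mimicking the 30/50/70/90/110 m positional perturbations
    used to build the error-based LSP models.
    """
    theta = random.uniform(0.0, 2.0 * math.pi)  # random bearing
    return x + radius_m * math.cos(theta), y + radius_m * math.sin(theta)
```

Every perturbed point lies exactly `radius_m` from the original, so repeating this over all 337 landslides yields one error-based training set per radius.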
This paper presents a new random weighting estimation method for dynamic navigation positioning. This method adopts the concept of random weighting estimation to estimate the covariance matrices of system state noise and observation noise in order to control the disturbances of singular observations and kinematic model errors. It satisfies the practical requirements of the residual vector and innovation vector to fully utilize observation information, thus weakening the disturbing effect of the kinematic model error and observation model error on the state parameter estimation. Theories and algorithms of random weighting estimation are established for estimating the covariance matrices of observation residual vectors and innovation vectors. This random weighting estimation method provides an effective solution for improving the positioning accuracy in dynamic navigation. Experimental results show that, compared with Kalman filtering, extended Kalman filtering, and adaptive windowing filtering, the proposed method can adaptively determine the covariance matrices of the observation error and state error, effectively resist the disturbances caused by system error and observation error, and significantly improve the positioning accuracy for dynamic navigation.
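The core weighting idea can be shown in one dimension. The sketch below draws Dirichlet(1, …, 1) weights by normalizing standard exponentials and averages the weighted sums of squared residuals over many rounds; this scalar version is an editorial assumption used only to illustrate random weighting, whereas the paper applies it to full covariance matrices of residual and innovation vectors.

```python
import random

def random_weighting_variance(residuals, rounds=500):
    """Random-weighting estimate of the second moment of scalar residuals.

    Each round draws weights (w_1, ..., w_n) ~ Dirichlet(1, ..., 1) by
    normalizing i.i.d. Exp(1) variates, forms sum_i w_i * r_i^2, and the
    final estimate averages these weighted sums over all rounds.
    """
    n = len(residuals)
    total = 0.0
    for _ in range(rounds):
        e = [random.expovariate(1.0) for _ in range(n)]
        s = sum(e)
        w = [ei / s for ei in e]  # weights are nonnegative and sum to 1
        total += sum(wi * ri * ri for wi, ri in zip(w, residuals))
    return total / rounds
```

Since E[w_i] = 1/n, the estimator is unbiased for the plain average of squared residuals, while the random weights supply the variability used to assess the estimate.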
For the product degradation process with random effect (RE), measurement error (ME), and nonlinearity in step-stress accelerated degradation testing (SSADT), a nonlinear Wiener-based degradation model with RE and ME is built. An analytical approximation to the probability density function (PDF) of the product's lifetime is derived in closed form. The process and data of SSADT are analyzed to obtain the relation model of the observed data under each accelerated stress. The likelihood function for the population-based observed data is constructed, and the population-based model parameters and the prior values of their random coefficients are estimated. According to newly observed data of the target product in SSADT, an analytical approximation to the PDF of its residual lifetime (RL) is derived in accordance with its individual degradation characteristics. A parameter updating method based on Bayesian inference is applied to obtain the posterior value of the random coefficient of the RL model. A numerical example by simulation is analyzed to verify the accuracy and advantage of the proposed model.
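A degradation path of this general kind can be simulated in a few lines. The sketch below uses an illustrative nonlinear Wiener form X(t) = drift·t^power + diffusion·B(t) observed as Y(t) = X(t) + ε; the time-power nonlinearity, and treating `drift` as the unit-specific random effect drawn outside the function, are assumptions of this sketch rather than the paper's exact SSADT model.

```python
import random

def simulate_degradation(t_grid, drift, diffusion, me_sd, power=1.0):
    """Simulate one nonlinear-Wiener degradation path with measurement error.

    `t_grid` is an increasing list of observation times, `drift` the
    unit-specific drift (the random effect, drawn by the caller),
    `diffusion` the Wiener diffusion coefficient, and `me_sd` the SD of
    the additive Gaussian measurement error.
    """
    path, b, prev_t = [], 0.0, 0.0
    for t in t_grid:
        b += random.gauss(0.0, (t - prev_t) ** 0.5)  # Brownian increment
        x = drift * t ** power + diffusion * b        # latent degradation
        path.append(x + random.gauss(0.0, me_sd))     # observed value
        prev_t = t
    return path
```

Setting the diffusion and measurement-error SD to zero recovers the deterministic trend, which is a convenient sanity check on the simulator.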
Numerical modeling is an important tool for studying and predicting the transport of oil spills. However, the accuracy of numerical models is not always good enough to provide reliable information for oil spill transport, so it is necessary to analyze and identify the major error sources of the models. A case study was conducted to analyze the error sources of a three-dimensional oil spill model that was used operationally for oil spill forecasting in the National Marine Environmental Forecasting Center (NMEFC), the State Oceanic Administration, China. On June 4, 2011, oil spilled from the seabed into seawater in the Penglai 19-3 region, the largest offshore oil field of China, and polluted an area of thousands of square kilometers in the Bohai Sea. Satellite remote sensing images were collected to locate the oil slicks. By performing a series of model sensitivity experiments with different wind and current forcings and comparing the model results with the satellite images, it was identified that the major errors of the long-term simulation of oil spill transport came from the wind fields and the wind-induced surface currents. An inverse model was developed to estimate the temporal variability of emission intensity at the oil spill source, which revealed the importance of accuracy in the oil spill source emission time function.
Today, Linear Mixed Models (LMMs) are fitted, mostly, by assuming that random effects and errors have Gaussian distributions, therefore using Maximum Likelihood (ML) or REML estimation. However, for many data sets, that double assumption is unlikely to hold, particularly for the random effects, a crucial component in which assessment of magnitude is key in such modeling. Alternative fitting methods not relying on that assumption (such as ANOVA methods and Rao's MINQUE) apply, quite often, only to the very constrained class of variance components models. In this paper, a new computationally feasible estimation methodology is designed, first for the widely used class of 2-level (or longitudinal) LMMs, with the only assumption (beyond the usual basic ones) that residual errors are uncorrelated and homoscedastic, and with no distributional assumption imposed on the random effects. A major asset of this new approach is that it yields nonnegative variance estimates and covariance matrix estimates which are symmetric and, at least, positive semi-definite. Furthermore, it is shown that when the LMM is, indeed, Gaussian, this new methodology differs from ML just through a slight variation in the denominator of the residual variance estimate.
The new methodology actually generalizes to LMMs a well-known nonparametric fitting procedure for standard Linear Models. Finally, the methodology is also extended to ANOVA LMMs, generalizing an old method by Henderson for ML estimation in such models under normality.
The quality of the radiation dose depends upon the gamma count rate of the radionuclide used. Any reduction of error in the count rate is reflected in a reduction of error in the activity, and consequently in the quality of the dose. All efforts so far have been directed only at minimizing the random errors in the count rate by repetition. In the absence of a probability distribution for the systematic errors, we propose to minimize these errors by estimating their upper and lower limits by the technique of determinant inequalities developed by us. Using the algorithm we have developed, based on the technique of determinant inequalities and the concept of maximization of mutual information (MI), we show how to process the covariance matrix element by element to minimize the correlated systematic errors in the count rate of 113mIn. The element-wise processing of the covariance matrix is so unique in our technique that it gives experimentalists enough maneuverability to mitigate the different factors causing systematic errors in the count rate, and consequently the activity, of 113mIn.
Funding: Supported by the Strategic Priority Research Program of the Chinese Academy of Sciences (Grant No. XDB05040300) and the National Natural Science Foundation of China (Grant No. 41205119).
Funding: Supported by the National Key R&D Program of China (No. 2016YFC1401008); the Key R&D Program of Shandong Province, China (No. 2019GHY112055); the National Natural Science Foundation of China (Nos. U2006211, 42090044, 41606200, 41776183, 41906157); the Major Scientific and Technological Innovation Projects in Shandong Province (No. 2019JZZY010102); the Strategic Priority Research Program of the Chinese Academy of Sciences (Nos. XDA19060101, XDB42000000); the Key Project of Center for Ocean Mega-Science, Chinese Academy of Sciences (No. COMS2019R02); the CAS (Chinese Academy of Sciences) 100-Talent Program (No. Y9KY04101L); the Marine S&T Fund of Shandong Province for Pilot National Laboratory for Marine Science and Technology (Qingdao) (No. 2018SDKJ0102-2); and the Fundamental Research Funds for the Central Universities (Hohai University) (No. 2018B41814).
Funding: This project is supported by the Significant Project Foundation of the National 863 Program, China.
Funding: Supported by the Natural Science Foundation of Hebei Province under Grant No. C2013202192.
Funding: Project supported by the National Basic Research Program of China (Grant No. 2013CB338002).
Funding: Supported by the Trans-Century Training Program Foundation for Talents of the State Education Commission.
Funding: The National Natural Science Foundation of China (Grant Nos. 42377164 and 52079062) and the Interdisciplinary Innovation Fund of Natural Science, Nanchang University (Grant No. 9167-28220007-YB2107).
Funding: The National Natural Science Foundation of China (60574034) and the Aeronautical Science Foundation of China (20080818004).
Funding: Supported by the National Defense Foundation of China (71601183).
Funding: Supported by the Marine Industry Scientific Research Special Funds for Public Welfare Projects "The development and application of a fine-scale high-precision comprehensive forecast system for key protected coastal areas" (under Contract No. 201305031) and "The modular construction and application of a marine forecasting operational system" (under Contract No. 201205017).
文摘Numerical modeling is an important tool to study and predict the transport of oil spills. However, the accu- racy of numerical models is not always good enough to provide reliable information for oil spill transport. It is necessary to analyze and identify major error sources for the models. A case study was conducted to analyze error sources of a three-dimensional oil spill model that was used operationally for oil spill forecast- ing in the National Marine Environmental Forecasting Center (NMEFC), the State Oceanic Administration, China. On June 4, 2011, oil from sea bed spilled into seawater in Penglai 19-3 region, the largest offshore oil field of China, and polluted an area of thousands of square kilometers in the Bohai Sea. Satellite remote sensing images were collected to locate oil slicks. By performing a series of model sensitivity experiments with different wind and current forcings and comparing the model results with the satellite images, it was identified that the major errors of the long-term simulation for oil spill transport were from the wind fields, and the wind-induced surface currents. An inverse model was developed to estimate the temporal variabil- ity of emission intensity at the oil spill source, which revealed the importance of the accuracy in oil spill source emission time function.
Abstract: Today, Linear Mixed Models (LMMs) are mostly fitted by assuming that the random effects and errors have Gaussian distributions, and therefore by Maximum Likelihood (ML) or REML estimation. However, for many data sets this double assumption is unlikely to hold, particularly for the random effects, a crucial component whose magnitude assessment is key in such modeling. Alternative fitting methods that do not rely on that assumption (such as ANOVA estimators and Rao's MINQUE) quite often apply only to the very constrained class of variance components models. In this paper, a new computationally feasible estimation methodology is designed, first for the widely used class of 2-level (or longitudinal) LMMs, with the only assumption (beyond the usual basic ones) that the residual errors are uncorrelated and homoscedastic, and with no distributional assumption imposed on the random effects. A major asset of this new approach is that it yields nonnegative variance estimates and covariance matrix estimates that are symmetric and at least positive semi-definite. Furthermore, it is shown that when the LMM is indeed Gaussian, this new methodology differs from ML only through a slight variation in the denominator of the residual variance estimate. The new methodology actually generalizes to LMMs a well-known nonparametric fitting procedure for standard Linear Models. Finally, the methodology is also extended to ANOVA LMMs, generalizing an old method by Henderson for ML estimation in such models under normality.
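The flavor of distribution-free variance component estimation with a nonnegativity guarantee can be shown on the simplest 2-level LMM, the one-way random-intercept model. The sketch below uses the classical ANOVA (method-of-moments) estimator with truncation at zero; this is a standard textbook illustration of the idea, not the paper's new estimator, and all parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# One-way random-intercept model y_ij = mu + u_i + e_ij: the simplest
# 2-level LMM.  The ANOVA/moment fit below needs no distributional
# assumption on u_i; truncating at zero keeps the variance nonnegative.
m, n = 30, 8                              # groups, observations per group
sig_u, sig_e = 1.5, 1.0                   # true std devs (for simulation)
y = 2.0 + rng.normal(0, sig_u, (m, 1)) + rng.normal(0, sig_e, (m, n))

group_means = y.mean(axis=1)
msw = ((y - group_means[:, None]) ** 2).sum() / (m * (n - 1))        # within MS
msb = n * ((group_means - group_means.mean()) ** 2).sum() / (m - 1)  # between MS

sigma_e_hat = msw                          # residual variance estimate
sigma_u_hat = max(0.0, (msb - msw) / n)    # truncated, hence nonnegative
```

Without the truncation, (msb - msw)/n can go negative in small samples, which is exactly the pathology that motivates estimators guaranteeing nonnegative variances and positive semi-definite covariance estimates.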
Abstract: The quality of the radiation dose depends upon the gamma count rate of the radionuclide used. Any reduction of the error in the count rate is reflected in a reduction of the error in the activity and, consequently, in the quality of the dose. All efforts so far have been directed only at minimizing the random errors in the count rate by repetition. In the absence of a probability distribution for the systematic errors, we propose to minimize these errors by estimating their upper and lower limits via the technique of determinant inequalities developed by us. Using the algorithm we have developed, based on the technique of determinant inequalities and the concept of maximization of mutual information (MI), we show how to process the covariance matrix element by element to minimize the correlated systematic errors in the count rate of 113mIn. This element-wise processing of the covariance matrix is unique to our technique and gives experimentalists enough maneuverability to mitigate the different factors causing systematic errors in the count rate and, consequently, in the activity of 113mIn.
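Why correlated systematic errors need element-wise control of the covariance matrix can be seen from basic error propagation: for a weighted combination of measurements, the variance w'Σw depends on every off-diagonal element, and for any positive semi-definite Σ it is bounded above by the fully correlated worst case (Σ|w_i|σ_i)². The sketch below illustrates this bound; the weights, standard deviations, and correlation structure are invented for illustration and are not the paper's determinant-inequality algorithm or the 113mIn data.

```python
import numpy as np

# Propagating correlated systematic errors through a weighted mean.
# For any PSD covariance Sigma with std devs sigma_i, the variance
# w' Sigma w is at most (sum |w_i| sigma_i)**2 (fully correlated case).
sigma = np.array([0.8, 1.1, 0.6, 0.9])    # per-channel systematic std devs
w = np.ones_like(sigma) / len(sigma)      # equal-weight mean

rho = np.array([[1.0, 0.5, 0.2, 0.1],     # assumed correlation structure
                [0.5, 1.0, 0.25, 0.2],
                [0.2, 0.25, 1.0, 0.3],
                [0.1, 0.2, 0.3, 1.0]])
Sigma = rho * np.outer(sigma, sigma)      # covariance matrix

var_actual = w @ Sigma @ w                # with the assumed correlations
var_uncorr = w @ np.diag(sigma**2) @ w    # if errors were independent
var_upper = (np.abs(w) @ sigma) ** 2      # fully correlated worst case
```

With nonnegative weights and nonnegative correlations the true variance sits between the independent-error value and the worst-case bound, which is why shrinking individual off-diagonal elements (as the element-wise processing aims to do) directly tightens the error on the combined count rate.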