This paper investigates the modified likelihood ratio test (LRT) for homogeneity in normal mixtures of two samples with mixing proportions unknown. It is proved that the limit distribution of the modified likelihood ratio test is χ²(1).
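The χ²(1) calibration described above can be sketched as follows. This is a hedged illustration rather than the paper's procedure: the `penalty` argument is a placeholder for the modification term of the modified LRT, and the log-likelihood values in the usage lines are hypothetical. The χ²(1) survival function is computed from the identity P(X > x) = erfc(√(x/2)).

```python
import math

def chi2_1_sf(x):
    """Survival function of the chi-squared(1) limit distribution:
    P(X > x) = erfc(sqrt(x / 2))."""
    return math.erfc(math.sqrt(x / 2.0))

def modified_lrt(loglik_null, loglik_alt, penalty=0.0):
    """Modified LRT statistic and its chi-squared(1) p-value.
    `penalty` stands in for the modification term, whose exact form
    is given in the paper rather than in this sketch."""
    stat = max(2.0 * (loglik_alt - loglik_null) - penalty, 0.0)
    return stat, chi2_1_sf(stat)

# Hypothetical maximized log-likelihoods under the null (homogeneous
# normal) and the two-component mixture alternative:
stat, p = modified_lrt(loglik_null=-520.3, loglik_alt=-517.1)
```

At the usual 5% level the decision compares `stat` with the χ²(1) critical value 3.841, equivalently `p` with 0.05.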
Hypothesis testing analysis and unknown parameter estimation for both intermediate frequency (IF) and baseband GPS signal detection are given using the generalized likelihood ratio test (GLRT) approach, applying the model of a GPS signal in white Gaussian noise. It is proved that the test statistic follows a central or noncentral F distribution, and it is pointed out that the test statistic is nearly identical to a central or noncentral chi-squared variate, because the processing samples in the GPS acquisition problem are large enough to be considered infinite. It is also proved that the probability of false alarm, the probability of detection, and the threshold are strongly affected when the hypothesis test refers to the full pseudorandom noise (PRN) code phase and Doppler frequency search space rather than to each individual cell. The performance of the test statistic combined with noncoherent integration is also given.
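The effect of the search-space size on the threshold can be illustrated with a small sketch. It uses the chi-squared(2) approximation noted above for the noncoherent I² + Q² statistic, for which the threshold has the closed form t = −2 ln(Pfa); the cell count (1023 code phases × 41 Doppler bins) is a hypothetical example, and the cells are assumed independent.

```python
import math

def cell_threshold(pfa_cell):
    """Threshold for a chi-squared(2) detection statistic (noncoherent
    I^2 + Q^2): P(X > t) = exp(-t / 2), hence t = -2 ln(pfa_cell)."""
    return -2.0 * math.log(pfa_cell)

def per_cell_pfa(pfa_global, n_cells):
    """Per-cell false-alarm rate so that the full code-phase x Doppler
    search space (n_cells cells, assumed independent) meets pfa_global."""
    return 1.0 - (1.0 - pfa_global) ** (1.0 / n_cells)

t_single = cell_threshold(1e-3)                           # one cell
t_search = cell_threshold(per_cell_pfa(1e-3, 1023 * 41))  # whole space
```

The whole-space threshold is markedly higher than the single-cell one, which is the effect the abstract describes.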
In this paper, we propose a test statistic to check whether the nonparametric function in partially linear models is linear or not. We estimate the nonparametric function under the alternative by the local linear method, and then estimate the parameters by the two-stage method. The test statistic under the null hypothesis is calculated and shown to be asymptotically normal.
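A minimal sketch of the local linear method used for the nonparametric component (a generic Gaussian-kernel smoother, not the paper's exact estimator; the bandwidth `h` and the data are illustrative). At each point x0 a weighted line is fitted, and its intercept is the estimate; note that the fit reproduces an exactly linear function regardless of the kernel, which is why linearity is a natural null hypothesis for it.

```python
import math

def local_linear(x0, xs, ys, h):
    """Local linear estimate of m(x0): weighted least-squares line fit
    with Gaussian kernel weights of bandwidth h, centred at x0."""
    w = [math.exp(-0.5 * ((x - x0) / h) ** 2) for x in xs]
    s0 = sum(w)
    s1 = sum(wi * (x - x0) for wi, x in zip(w, xs))
    s2 = sum(wi * (x - x0) ** 2 for wi, x in zip(w, xs))
    t0 = sum(wi * y for wi, y in zip(w, ys))
    t1 = sum(wi * (x - x0) * y for wi, x, y in zip(w, xs, ys))
    # intercept of the locally fitted line = estimate at x0
    return (s2 * t0 - s1 * t1) / (s0 * s2 - s1 * s1)

# Illustrative data on the line y = 2x + 1; the local linear fit is exact here.
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [2 * x + 1 for x in xs]
estimate = local_linear(1.0, xs, ys, h=0.5)
```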
We propose a test statistic to check whether the nonparametric functions in two partially linear models are equal or not. We estimate the nonparametric function both under the null hypothesis and under the alternative by the local linear method, ignoring the parametric components, and then estimate the parameters by the two-stage method. The test statistic is derived and shown to be asymptotically normal under the null hypothesis.
Coutsourides derived an ad hoc nuisance-parameter removal test for the equality of the multiple correlation matrices of two independent p-variate normal populations, under the assumption that a sample of size n is available from each population. This paper presents a likelihood ratio test criterion for testing the equality of K multiple correlation matrices and extends the results to testing the equality of K partial correlation matrices.
Consider I pairs of independent binomial variates x0i and x1i with corresponding parameters p0i and p1i and sample sizes n0i and n1i for i = 1, …, I. Let Δi = p1i − p0i be the difference of the two binomial parameters, where the Δi are of interest and the p0i are nuisance parameters. The null hypothesis of homogeneity of the risk differences can then be written as H0: Δ1 = Δ2 = ⋯ = ΔI.
Count data are almost always over-dispersed, with the variance exceeding the mean. Several count-data models have been proposed, but the problem of over-dispersion remains unresolved, especially in the context of change point analysis. This study develops a likelihood-based algorithm that detects and estimates multiple change points in count data assumed to follow the negative binomial distribution. Discrete change point procedures discussed in the literature work well for equi-dispersed data; the new algorithm produces reliable estimates of change points for both equi-dispersed and over-dispersed count data, hence its advantage over other count-data change point techniques. The negative binomial multiple change point algorithm was tested on simulated data for different sample sizes and varying positions of change. Changes in the distribution parameters were detected and estimated by conducting a likelihood ratio test on partitions of the data obtained through step-wise recursive binary segmentation. Critical values for the likelihood ratio test were developed and used to check the significance of the maximum likelihood estimates of the change points. The algorithm works best for large datasets, though it also performs well for small and medium-sized datasets with little to no error in the location of the change points; it correctly detects changes when they are present and signals none when no change has occurred. Power analysis of the likelihood ratio test for change was performed through Monte Carlo simulation in the single change point setting. Sensitivity analysis of the test power showed that the likelihood ratio test is most powerful when the simulated change point lies mid-way through the sample, as opposed to the periphery; further, the test is more powerful when the change is located three-quarters of the way through the sample than when it is closer (one quarter of the way) to the first observation.
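The recursive binary segmentation with a likelihood ratio test described above can be sketched as follows. This is a hedged illustration, not the paper's algorithm: the negative binomial log-likelihood is evaluated at method-of-moments estimates (falling back to a Poisson likelihood for segments that are not over-dispersed), and the critical value 10.0 is a hypothetical threshold, whereas the paper derives its own critical values.

```python
import math

def nb_loglik(xs):
    """Negative binomial log-likelihood at method-of-moments estimates.
    Falls back to the Poisson likelihood when the segment is not
    over-dispersed (sample variance <= sample mean)."""
    n = len(xs)
    m = sum(xs) / n
    v = sum((x - m) ** 2 for x in xs) / n
    if m <= 0:
        return 0.0
    if v <= m:  # equi-/under-dispersed: Poisson limit
        return sum(x * math.log(m) - m - math.lgamma(x + 1) for x in xs)
    r = m * m / (v - m)   # size parameter
    p = r / (r + m)       # success probability
    return sum(math.lgamma(x + r) - math.lgamma(r) - math.lgamma(x + 1)
               + r * math.log(p) + x * math.log(1 - p) for x in xs)

def best_split(xs):
    """Most likely single change point by the likelihood ratio test."""
    whole = nb_loglik(xs)
    best_tau, best_lrt = None, 0.0
    for tau in range(2, len(xs) - 2):
        lrt = 2.0 * (nb_loglik(xs[:tau]) + nb_loglik(xs[tau:]) - whole)
        if lrt > best_lrt:
            best_tau, best_lrt = tau, lrt
    return best_tau, best_lrt

def binary_segmentation(xs, critical, offset=0):
    """Recursive binary segmentation: keep splitting while the LRT
    exceeds `critical` (a hypothetical threshold here)."""
    if len(xs) < 8:
        return []
    tau, lrt = best_split(xs)
    if tau is None or lrt < critical:
        return []
    return (binary_segmentation(xs[:tau], critical, offset)
            + [offset + tau]
            + binary_segmentation(xs[tau:], critical, offset + tau))

# A mean shift after observation 40 is recovered as the single change point.
data = [2] * 40 + [9] * 40
change_points = binary_segmentation(data, critical=10.0)
```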
To resolve the problems of complicated clutter, fast-varying scenes, and low signal-to-clutter ratio (SCR) in target detection on the sea for space-based radar (SBR), a target detection approach based on adaptive waveform design is proposed in this paper. Firstly, complicated sea clutter is modeled as a compound Gaussian process, and a target is modeled as several scatterers with Gaussian reflectivity. Secondly, every dwell duration of the radar is divided into several sub-dwells. Regular linear frequency modulated pulses are transmitted at sub-dwell 1, and the signal received at this sub-dwell is used for clutter covariance matrix estimation and pre-detection. The estimated matrices are updated at every following sub-dwell by multiple particle filtering to cope with the fast-varying clutter scenes of SBR. Furthermore, the waveform of every following sub-dwell is designed adaptively according to a mean square optimization technique. Finally, principal component analysis and the generalized likelihood ratio test are used for mitigation of colored interference and for the constant false alarm rate property, respectively. Simulation results show that, for the SBR configuration and complicated clutter considered, the proposed approach reduces the SCR required for reliable detection by 9 dB. The work in this paper can therefore markedly improve radar detection performance for weak targets.
The occurrence of lightning-induced forest fires during a time period is count data featuring over-dispersion (i.e., the variance is larger than the mean) and a high frequency of zero counts. In this study, we used six generalized linear models to examine the relationship between the occurrence of lightning-induced forest fires and meteorological factors in the Northern Daxing'an Mountains of China. The six models were the Poisson, negative binomial (NB), zero-inflated Poisson (ZIP), zero-inflated negative binomial (ZINB), Poisson hurdle (PH), and negative binomial hurdle (NBH) models. Goodness-of-fit was compared and tested among the six models using the Akaike information criterion (AIC), sum of squared errors, likelihood ratio test, and Vuong test. The predictive performance of the models was assessed and compared on independent validation data obtained by data splitting. By AIC, the ZINB model fitted the fire occurrence data best, followed (in order of increasing AIC) by the NBH, ZIP, NB, PH, and Poisson models. The ZINB model was also best for predicting either zero counts or positive counts (≥1). The two hurdle models (PH and NBH) were better than the ZIP, Poisson, and NB models for predicting positive counts, but worse for predicting zero counts. Thus, the ZINB model was the first choice for modeling the occurrence of lightning-induced forest fires in this study, which implies that the excess zero counts of lightning-induced fires come from both structural and sampling zeros.
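The AIC ranking used above can be sketched as follows. The log-likelihood and parameter-count values are hypothetical placeholders, chosen only so that the resulting order matches the one reported in the abstract; AIC = 2k − 2·loglik, smaller is better.

```python
def aic(loglik, n_params):
    """Akaike information criterion: 2k - 2 * maximized log-likelihood."""
    return 2 * n_params - 2 * loglik

# Hypothetical fits: (maximized log-likelihood, number of parameters).
fits = {
    "Poisson": (-412.7, 5),
    "NB":      (-396.2, 6),
    "ZIP":     (-391.0, 10),
    "ZINB":    (-384.5, 11),
    "PH":      (-398.9, 10),
    "NBH":     (-387.0, 11),
}
ranking = sorted(fits, key=lambda m: aic(*fits[m]))  # best model first
```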
To detect and estimate a shift in the mean, in the deviation, or in both during a preliminary analysis, the most popular statistical process control (SPC) tool is the control chart based on the likelihood ratio test (LRT). Sullivan and Woodall pointed out that the test statistic lrt(n1, n2) is approximately distributed as χ²(2) when the sample sizes n, n1, and n2 are all very large, where n1 = 2, 3, …, n − 2 and n2 = n − n1; it is therefore inevitable that n1 or n2 is not large. In this paper the limit distribution of lrt(n1, n2) for fixed n1 or n2 is derived, and exact analytic formulae for the expectation and the variance of the limit distribution are obtained. In addition, the properties of the standardized likelihood ratio statistic slr(n1, n) are discussed. Although slr(n1, n) contains the most important information, slr(i, n) (i ≠ n1) also contains much information, and a cumulative sum (CUSUM) control chart can exploit it. We therefore propose two CUSUM control charts based on the likelihood ratio statistics for the preliminary analysis of individual observations: one focuses on detecting location shifts in the historical data, and the other is more general, detecting a shift in location, in scale, or in both. Moreover, simulation results show that the proposed control charts are superior to their respective competitors, not only in detecting sustained shifts but also in the other out-of-control situations considered in this paper.
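A hedged sketch of the split-point likelihood ratio statistic underlying such charts, for a normal sample with a possible simultaneous shift in mean and variance after observation n1 (MLE variances). The data are illustrative, and the mapping to the paper's exact lrt(n1, n2) definition is an assumption of this sketch.

```python
import math

def lrt_split(xs, n1):
    """Likelihood ratio statistic for a single shift in mean and/or
    variance after observation n1 (normal model, MLE variances):
    n*ln(v0) - n1*ln(v1) - n2*ln(v2)."""
    def mle_var(seg):
        m = sum(seg) / len(seg)
        return sum((x - m) ** 2 for x in seg) / len(seg)
    n = len(xs)
    v0, v1, v2 = mle_var(xs), mle_var(xs[:n1]), mle_var(xs[n1:])
    return n * math.log(v0) - n1 * math.log(v1) - (n - n1) * math.log(v2)

# Scan all admissible split points, as the control chart does; the
# maximizing split estimates the change point.
data = [0.1, -0.3, 0.2, 0.0, -0.1, 2.1, 1.8, 2.3, 1.9, 2.2]
stats = {n1: lrt_split(data, n1) for n1 in range(2, len(data) - 1)}
n1_hat = max(stats, key=stats.get)
```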
In this article, radar echoes of aircraft wake vortices are modeled as weighted sums of the frequency components of the echoes, with a special covariance matrix for the weighting coefficients. With a proposed detection scheme, two generalized likelihood ratio test (GLRT) detectors are derived for aircraft wake vortices with time-varying and time-invariant Doppler spectra, respectively. Analytical expressions for the detection and false alarm probabilities of the detectors are then derived, and three factors that mainly influence the detection performance are investigated: the Doppler extension of the aircraft wake vortex, its Doppler uncertainty, and the number of detection cells. The results indicate that the signal-to-noise ratio (SNR) loss induced by Doppler extension is generally several decibels. The SNR loss due to Doppler uncertainty is approximately proportional to the logarithm of the number of spectrum lines in the uncertain Doppler spectrum intervals. For a large number of detection cells, the SNR gain is approximately proportional to the square root of the number of detection cells.
As location-based techniques and applications have become ubiquitous in emerging wireless networks, the verification of location information has become more important. In recent years, there has been an explosion of activity related to location-verification techniques in wireless networks. In particular, there has been a specific focus on intelligent transport systems because of the mission-critical nature of vehicle location verification. In this paper, we review recent research on wireless location verification related to vehicular networks. We focus on location verification systems that rely on formal mathematical classification frameworks and show how many systems are either partially or fully encompassed by such frameworks.
An optimized detection model based on weighted entropy for multiple-input multiple-output (MIMO) radar in a multipath environment is presented. After defining the multipath distance difference (MDD), the multipath received-signal model with four paths is built systematically. Both the variance and the correlation coefficient of the multipath scattering coefficient are analyzed as functions of the MDD, which indicates that multipath variability can degrade detection performance by reducing the echo power. Using the likelihood ratio test (LRT), a new method based on weighted entropy is introduced to exploit positive multipath echo power and suppress negative echo power, which results in better performance. Simulation results show that, compared with the non-multipath environment and other recently developed methods, the proposed method achieves a detection performance improvement that grows with the number of sensors.
When all the sensor decision rules are known, the optimal distributed decision fusion, which relies only on the joint conditional probability densities, can be derived for very general decision systems, including systems with interdependent sensor observations and any network structure. The result is also valid for m-ary Bayesian decision problems and for binary problems under the Neyman-Pearson criterion. Local decision rules for a sensor receiving communication from other sensors that are optimal for the sensor itself are also presented; they take the form of a generalized likelihood ratio test. Numerical examples reveal an interesting phenomenon: communication between sensors can improve the performance of an individual sensor decision, but cannot guarantee an improvement in the global fusion performance when the sensor rules were fixed before fusing.
Traditional two-parameter constant false alarm rate (CFAR) target detection algorithms for SAR images ignore the target distribution, so their performance is not the best, or even near the best. As the resolution of SAR images increases, small targets occupy more pixels in the image, so the target distribution becomes significant. A distribution-based CFAR (DBCFAR) detection algorithm is presented: the pixels around the test cell are pooled to estimate the distribution of the test cell, and the generalized likelihood ratio test (GLRT) is used to derive the detectors. The performance of the DBCFAR detectors is analyzed theoretically; at the same detection rate, DBCFAR detection yields fewer false alarms than conventional CFAR. Finally, experiments show that DBCFAR outperforms conventional CFAR: the false alarms of DBCFAR detection are concentrated, while those of CFAR detection are dispersed.
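For comparison, the conventional cell-averaging CFAR baseline referred to above can be sketched as follows. This is a generic textbook CA-CFAR under exponentially distributed clutter power, not the paper's DBCFAR; guard cells are omitted for brevity and the data are illustrative.

```python
def ca_cfar_alpha(n_ref, pfa):
    """CA-CFAR scaling factor for exponentially distributed clutter
    power: alpha = N * (Pfa^(-1/N) - 1)."""
    return n_ref * (pfa ** (-1.0 / n_ref) - 1.0)

def ca_cfar_detect(cells, cut, n_ref, pfa):
    """Compare the cell under test with alpha times the mean power of
    n_ref reference cells taken from both sides of it."""
    half = n_ref // 2
    ref = cells[max(0, cut - half):cut] + cells[cut + 1:cut + 1 + half]
    threshold = ca_cfar_alpha(len(ref), pfa) * sum(ref) / len(ref)
    return cells[cut] > threshold

# Unit-power clutter with one strong return in the cell under test:
cells = [1.0] * 8 + [30.0] + [1.0] * 8
detected = ca_cfar_detect(cells, cut=8, n_ref=16, pfa=1e-4)
```

Because the threshold scales with the local clutter estimate, the false-alarm rate stays constant as the clutter level varies, which is the CFAR property the abstract refers to.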
Varying-coefficient models are a useful extension of the classical linear model. They are widely applied in economics, biomedicine, epidemiology, and other fields, and have been studied extensively over the last three decades. In this paper, many models related to varying-coefficient models are gathered, and the estimation procedures and hypothesis-testing theory for varying-coefficient models are summarized. Finally, some aspects awaiting further study are proposed.
In this paper, we give an approach for detecting one or more outliers in randomized linear models. The likelihood ratio test statistic and its distributions under the null hypothesis and the alternative hypothesis are given. Furthermore, the robustness of the test statistic in a certain sense is proved. Finally, optimality properties of the test are derived.
The problem of detecting a signal with multiple-input multiple-output (MIMO) radar in a scenario dominated by correlated Gaussian clutter with unknown covariance matrix is dealt with. The general MIMO model, with widely separated sub-arrays and co-located antennas at each sub-array, is adopted. Firstly, the generalized likelihood ratio test (GLRT) with known covariance matrix is obtained, and then the Rao and Wald detectors are devised; it is proved that the Rao and Wald tests coincide with the GLRT detector. To make the detectors fully adaptive, signal-free secondary data are collected to estimate the covariance. The performance of the proposed detector is analyzed, though this analysis is ancillary. A thorough performance assessment by several numerical examples is also given, covering the case of co-located transmit and receive antennas at each sub-array. The results show that the performance of the proposed adaptive detector is better than that of the LJ-GLRT, and the loss relative to the non-adaptive counterparts is acceptable.
The problem of adaptive detection under signal mismatch is considered; that is, the actual signal steering vector is not aligned with the nominal one. Two novel tunable detectors are proposed. They can control the degree to which mismatched signals are rejected. Remarkably, both cover existing well-known detectors as special cases. More importantly, they possess the constant false alarm rate (CFAR) property and achieve enhanced mismatched-signal rejection or improved robustness compared with their natural competitors. Besides, they provide slightly better detection performance for matched signals than the existing detectors.
The problem of adaptive radar detection in compound-Gaussian clutter without secondary data is considered in this paper. In most practical applications, the number of training data is limited. To overcome the lack of training data, an autoregressive (AR)-process-based covariance matrix estimator is proposed. Then, with the estimated covariance matrix, the one-step generalized likelihood ratio test (GLRT) detector is designed without training data. Finally, the detection performance of the proposed detector is assessed.
Funding: Supported by the National Natural Science Foundation of China (10661003), the SRF for ROCS, SEM ([2004]527), and the NSF of Guangxi (0728092).
Funding: Supported by the National Defense Pre-Research Foundation of China.
Funding: Funded by Asia-Pacific Forests Net (APFNET/2010/FPF/001) and the National Natural Science Foundation of China (Grant No. 31400552).
Funding: This work was supported by the Natural Science Foundation of Tianjin (Grant No. 033603111).
Funding: National Defense Exploratory Research Project (7130620).
Funding: Supported by the University of New South Wales and the Australian Research Council under grant No. DP120102607
Abstract: As location-based techniques and applications have become ubiquitous in emerging wireless networks, the verification of location information has become more important. In recent years, there has been an explosion of activity related to location-verification techniques in wireless networks. In particular, there has been a specific focus on intelligent transport systems because of the mission-critical nature of vehicle location verification. In this paper, we review recent research on wireless location verification related to vehicular networks. We focus on location verification systems that rely on formal mathematical classification frameworks and show how many systems are either partially or fully encompassed by such frameworks.
Funding: Supported by the Natural Science Foundation Research Project of Shaanxi Province (2016JQ6020)
Abstract: An optimized detection model based on weighted entropy for multiple-input multiple-output (MIMO) radar in a multipath environment is presented. After defining the multipath distance difference (MDD), a multipath received-signal model with four paths is built systematically. Both the variance and the correlation coefficient of the multipath scattering coefficient with respect to the MDD are analyzed, which shows that multipath propagation can degrade detection performance by reducing the echo power. Building on the likelihood ratio test (LRT), a new method based on weighted entropy is introduced that exploits the constructive multipath echo power while suppressing the destructive component, yielding better performance. Simulation results show that, compared with the non-multipath case and with other recently developed methods, the proposed method improves detection performance as the number of sensors increases.
Abstract: When all the sensor decision rules are known, the optimal distributed decision fusion, which relies only on the joint conditional probability densities, can be derived for very general decision systems, including systems with interdependent sensor observations and arbitrary network structure. The result is also valid for m-ary Bayesian decision problems and for binary problems under the Neyman-Pearson criterion. Local decision rules for a sensor receiving communication from other sensors that are optimal for that sensor itself are also presented; they take the form of a generalized likelihood ratio test. Numerical examples reveal an interesting phenomenon: communication between sensors can improve the performance of an individual sensor's decision, but cannot guarantee improved global fusion performance when the sensor rules were fixed before fusion.
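A concrete special case of the fusion rule described above is the classical conditionally-independent setting (Chair-Varshney style). The sketch below is that textbook special case under our own assumed parameter names; the paper's result is more general and also covers interdependent observations:

```python
import numpy as np

def fuse_decisions(u, pd, pf, prior1=0.5):
    """Bayesian (MAP) fusion of binary sensor decisions u[i] in {0, 1},
    assuming conditionally independent sensors with known per-sensor
    detection probabilities pd[i] and false-alarm probabilities pf[i].
    Returns the fused decision 1 (target) or 0 (no target)."""
    # log prior odds, then accumulate each sensor's log-likelihood ratio
    llr = np.log(prior1 / (1.0 - prior1))
    for ui, d, f in zip(u, pd, pf):
        llr += np.log(d / f) if ui else np.log((1.0 - d) / (1.0 - f))
    return int(llr > 0)
```

For example, with three identical sensors (pd = 0.9, pf = 0.1) and a flat prior, two positive local decisions outweigh one negative, so `fuse_decisions([1, 1, 0], [0.9]*3, [0.1]*3)` declares a target while `[1, 0, 0]` does not.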
Abstract: Traditional two-parameter constant false alarm rate (CFAR) target detection algorithms for SAR images ignore the target distribution, so their performance is not optimal or near-optimal. As the resolution of SAR images increases, small targets occupy more pixels in the image, so the target distribution becomes significant. A distribution-based CFAR detection algorithm is therefore presented: the pixels around the test cell are pooled to estimate the distribution of the test cell, and the generalized likelihood ratio test (GLRT) is used to derive the detectors. The performance of the distribution-based CFAR (DBCFAR) detectors is analyzed theoretically: at the same detection rate, DBCFAR detection produces fewer false alarms than CFAR. Finally, experiments confirm that DBCFAR outperforms conventional CFAR; the false alarms of DBCFAR detection are spatially concentrated, while those of CFAR detection are dispersed.
Funding: Supported by the National Natural Science Foundation of China (10501053). Acknowledgement: I would like to thank the Henan Society of Applied Statistics for giving me the opportunity to present my views on the varying-coefficient model.
Abstract: Varying-coefficient models are a useful extension of the classical linear model. They are widely applied in economics, biomedicine, epidemiology, and other fields, and have been studied extensively over the last three decades. In this paper, many models related to varying-coefficient models are gathered together, and the various estimation procedures and the theory of hypothesis testing for varying-coefficient models are summarized. In the author's view, several aspects remain open for study, and these are proposed.
Abstract: In this paper, we give an approach for detecting one or more outliers in a randomized linear model. The likelihood ratio test statistic and its distributions under the null hypothesis and the alternative hypothesis are given. Furthermore, the robustness of the test statistic in a certain sense is proved. Finally, the optimality properties of the test are derived.
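In the ordinary (non-randomized) linear model, the LRT for a single mean-shift outlier reduces to scanning the externally studentized residuals; the sketch below shows that textbook special case, not the paper's randomized-model statistic, and the demo data are synthetic:

```python
import numpy as np

def studentized_residuals(X, y):
    """Externally studentized residuals of the OLS fit y = X b + e.
    For a single mean-shift outlier the LRT statistic is a monotone
    function of the largest |t_i|, so flagging argmax |t_i| is
    equivalent to the likelihood ratio rule in this special case."""
    n, p = X.shape
    H = X @ np.linalg.solve(X.T @ X, X.T)    # hat matrix
    e = y - H @ y                            # OLS residuals
    s2 = e @ e / (n - p)
    h = np.diag(H)
    r = e / np.sqrt(s2 * (1.0 - h))          # internally studentized
    # convert to externally studentized (leave-one-out variance)
    return r * np.sqrt((n - p - 1) / (n - p - r ** 2))

# demo: a straight line with one gross outlier injected at index 10
rng = np.random.default_rng(2)
n = 40
X = np.column_stack([np.ones(n), np.arange(n, dtype=float)])
y = 1.0 + 0.5 * np.arange(n) + rng.normal(0.0, 0.5, n)
y[10] += 6.0
t = studentized_residuals(X, y)
```

Under the null of no outliers, each t_i follows a t distribution with n − p − 1 degrees of freedom, which gives the critical values for the scan (with a multiplicity correction over the n candidates).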
Funding: Supported by the Fundamental Research Funds for the Central Universities (103.1.2-E022050205)
Abstract: The problem of detecting a signal with multiple-input multiple-output (MIMO) radar in a scenario dominated by correlated Gaussian clutter with unknown covariance matrix is dealt with. The general MIMO model, with widely separated sub-arrays and co-located antennas within each sub-array, is adopted. First, the generalized likelihood ratio test (GLRT) with known covariance matrix is obtained; then the Rao and Wald detectors are devised, and it is proved that the Rao and Wald tests coincide with the GLRT detector. To make the detectors fully adaptive, signal-free secondary data are collected to estimate the covariance. The performance of the proposed detector is analyzed, and a thorough assessment through several numerical examples is given, including the case of co-located transmit and receive antenna arrays. The results show that the proposed adaptive detector performs better than the LJ-GLRT, and that its loss relative to the non-adaptive counterparts is acceptable.
Funding: Supported by the National Natural Science Foundation of China (61102169, 60925005)
Abstract: The problem of adaptive detection under signal mismatch is considered; that is, the actual signal steering vector is not aligned with the nominal one. Two novel tunable detectors are proposed that can control the degree to which mismatched signals are rejected. Remarkably, both cover existing well-known detectors as special cases. More importantly, they possess the constant false alarm rate (CFAR) property and achieve either enhanced mismatched-signal rejection or improved robustness compared with their natural competitors. Besides, they can provide slightly better detection performance for matched signals than the existing detectors.
Funding: Supported by the Fundamental Research Funds for the Central Universities under Grant No. E022050205
Abstract: The problem of adaptive radar detection in compound-Gaussian clutter without secondary data is considered in this paper. In most practical applications the number of training data is limited. To overcome this lack of training data, an autoregressive (AR)-process-based covariance matrix estimator is proposed. Then, with the estimated covariance matrix, a one-step generalized likelihood ratio test (GLRT) detector is designed that requires no training data. Finally, the detection performance of the proposed detector is assessed.
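The core idea of an AR-based covariance estimator can be sketched as follows: fit AR coefficients by Yule-Walker from the cell under test alone, then extend the autocovariance sequence to build a Toeplitz covariance matrix. This is a generic illustration of the idea for real-valued data under our own assumed interface; the paper's estimator for compound-Gaussian clutter differs in detail:

```python
import numpy as np

def ar_covariance(x, p, N):
    """Estimate an N x N Toeplitz covariance matrix from a single
    real-valued snapshot x by fitting an AR(p) model (Yule-Walker)
    and extending its autocovariance sequence beyond lag p."""
    n = len(x)
    # biased sample autocovariances r[0..p]
    r = np.array([x[:n - k] @ x[k:] / n for k in range(p + 1)])
    # Yule-Walker equations for the AR coefficients a[1..p]
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    a = np.linalg.solve(R, r[1:p + 1])
    # extend via the AR recursion r[k] = sum_m a_m r[k-m]
    rext = list(r)
    for k in range(p + 1, N):
        rext.append(sum(a[m] * rext[k - 1 - m] for m in range(p)))
    rext = np.asarray(rext[:N])
    # Toeplitz covariance built from the extended sequence
    return np.array([[rext[abs(i - j)] for j in range(N)] for i in range(N)])

# demo: AR(1) data with coefficient 0.8 (synthetic, for illustration)
rng = np.random.default_rng(3)
w = rng.normal(0.0, 1.0, 5000)
x = np.zeros(5000)
for t in range(1, 5000):
    x[t] = 0.8 * x[t - 1] + w[t]
S = ar_covariance(x, p=1, N=4)   # lag-1 correlation of S should be near 0.8
```

Because only p + 1 lags are estimated from the data, the estimator needs far fewer samples than a full sample covariance matrix, which is exactly what makes it attractive when no secondary data are available.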