Laboratory experiments, together with a variety of observations, especially in subduction zones, have explored the existence of a premonitory stable slow-slip growth phase preceding large earthquakes. These phenomena play an important role in the earthquake cycle, so precise imaging and monitoring of these events are of great significance. In the literature, the ENIF (extended network inversion filter) has been proposed as a rigorous algorithm capable of isolating the signal from different types of noise, thereby providing deep insight into the spatio-temporal evolution of slow slip events. Despite its considerable advantages, ENIF still suffers from some limitations. ENIF applies Tikhonov regularization with a quadratic cost function. Whereas anomalous slip regions in reality contrast sharply with the background slip, Tikhonov regularization tends to over-smooth (globally smooth) the slipping portion of the estimated images. To avoid this over-smoothing, we have incorporated into ENIF an image segmentation step that aims to preserve the edges of the slow slip event. As a second limitation, the nonlinearity imposed by constraints such as the non-negativity of the slip rate makes uncertainty propagation through the model nontrivial. The core of ENIF, the EKF (extended Kalman filter), performs uncertainty propagation by linearizing the nonlinear model using Jacobian and Hessian matrices. As an alternative to the EKF, we have also investigated the application of the UKF (unscented Kalman filter), which uses the UT (unscented transform) for uncertainty propagation. Finally, we tested the proposed algorithm on a synthetic data set with a low signal-to-noise ratio. The results show a significant improvement in the performance of ENIF when the segmentation step is incorporated into the algorithm.
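The abstract does not give the authors' implementation, but the unscented transform it mentions is a standard construction. As a generic sketch (not the paper's code), the snippet below propagates a Gaussian state through a nonlinear map via sigma points, using a softplus function as a stand-in for a smooth non-negativity constraint on slip rate:

```python
import numpy as np

def unscented_transform(mean, cov, f, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate a Gaussian (mean, cov) through a nonlinear map f
    using the standard scaled sigma-point scheme."""
    n = len(mean)
    lam = alpha ** 2 * (n + kappa) - n
    # 2n+1 sigma points: the mean plus/minus scaled Cholesky columns
    S = np.linalg.cholesky((n + lam) * cov)
    sigma = np.vstack([mean[None, :], mean + S.T, mean - S.T])
    # Weights for the mean and covariance estimates
    wm = np.full(2 * n + 1, 0.5 / (n + lam))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1.0 - alpha ** 2 + beta)
    # Push every sigma point through the nonlinearity, then re-estimate moments
    y = np.array([f(s) for s in sigma])
    y_mean = wm @ y
    d = y - y_mean
    y_cov = (wc[:, None] * d).T @ d
    return y_mean, y_cov

# Example: a smooth non-negativity map (softplus), standing in for the
# slip-rate positivity constraint described above (illustrative only)
m, P = unscented_transform(np.array([0.5, -0.2]), 0.1 * np.eye(2),
                           lambda x: np.log1p(np.exp(x)))
```

Unlike the EKF's Jacobian-based linearization, this requires only function evaluations at the sigma points, which is the practical appeal of the UKF for constrained slip-rate models.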
Count data are almost always over-dispersed, with the variance exceeding the mean. Several count data models have been proposed, but the problem of over-dispersion remains unresolved, especially in the context of change point analysis. This study develops a likelihood-based algorithm that detects and estimates multiple change points in a set of count data assumed to follow the Negative Binomial distribution. Discrete change point procedures discussed in the literature work well for equi-dispersed data. The new algorithm produces reliable estimates of change points for both equi-dispersed and over-dispersed count data, hence its advantage over other count data change point techniques. The Negative Binomial Multiple Change Point Algorithm was tested using simulated data for different sample sizes and varying positions of change. Changes in the distribution parameters were detected and estimated by conducting a likelihood ratio test on several partitions of the data obtained through step-wise recursive binary segmentation. Critical values for the likelihood ratio test were developed and used to check the significance of the maximum likelihood estimates of the change points. The change point algorithm was found to work best for large datasets, though it also performs well for small and medium-sized datasets, with little to no error in the location of change points. The algorithm correctly detects changes when they are present and reports none when no change exists. The power of the likelihood ratio test was analyzed through Monte Carlo simulation in the single change point setting. Sensitivity analysis of the test power showed that the likelihood ratio test is most powerful when the simulated change points are located midway through the sample data, as opposed to in the periphery. Further, the test is more powerful when the change is located three-quarters of the way through the sample data than when the change point is closer (a quarter of the way) to the first observation.
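The abstract does not include the authors' estimation details; as a minimal illustration of one binary-segmentation step with a Negative Binomial likelihood ratio test, the sketch below fits segment parameters by the method of moments (an assumption for simplicity, not necessarily the paper's maximum likelihood fit) and maximizes the LR statistic over candidate splits:

```python
import numpy as np
from math import lgamma, log

def nb_loglik(x):
    """Negative Binomial log-likelihood of a segment, with mean m and
    dispersion r set by the method of moments (variance = m + m**2 / r)."""
    x = np.asarray(x, dtype=float)
    m, v = x.mean(), x.var()
    if v <= m:                      # equi-/under-dispersed: use the Poisson limit
        m = max(m, 1e-9)
        return float(sum(k * log(m) - m - lgamma(k + 1) for k in x))
    r = m * m / (v - m)
    p = r / (r + m)                 # success probability in the (r, p) form
    return float(sum(lgamma(k + r) - lgamma(r) - lgamma(k + 1)
                     + r * log(p) + k * log(1 - p) for k in x))

def lr_change_point(x, min_seg=5):
    """One step of binary segmentation: maximize the likelihood ratio
    statistic 2*(ll(left) + ll(right) - ll(all)) over candidate splits."""
    x = np.asarray(x)
    ll_all = nb_loglik(x)
    best_tau, best_stat = None, -np.inf
    for tau in range(min_seg, len(x) - min_seg):
        stat = 2.0 * (nb_loglik(x[:tau]) + nb_loglik(x[tau:]) - ll_all)
        if stat > best_stat:
            best_tau, best_stat = tau, stat
    return best_tau, best_stat
```

Recursing on the two sub-segments whenever the statistic exceeds a critical value yields a multiple-change-point procedure of the kind the abstract describes.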
The IP (Internet Protocol) transformation of broadcast television satellite earth stations introduces high-performance signal routing optimization techniques to improve the carrying capacity and transmission efficiency of ultra-high-definition video services. By analyzing the bottlenecks of the traditional synchronous digital hierarchy in 4K/8K transmission, this study investigates the application of SRv6 (Segment Routing over Internet Protocol Version 6)-driven intelligent routing decisions, fast switchover mechanisms, and a QoS (Quality of Service) guarantee system in satellite backhaul links. It further proposes a multi-dimensional resource co-optimization framework and designs a hybrid networking strategy based on IPoSDN (IP over Software Defined Network), aiming to improve routing convergence efficiency and link utilization, reduce end-to-end latency, and enhance service continuity.
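The abstract stays at the architecture level; as a rough, hypothetical illustration of what an SRv6 head-end's intelligent routing decision with precomputed fast switchover can look like, the sketch below (plain Python, invented topology and latencies) computes a lowest-latency primary path and a node-disjoint backup that could be installed in advance:

```python
import heapq

def dijkstra(graph, src, dst):
    """Lowest-latency path; graph is {node: {neighbor: latency_ms}}."""
    dist, prev, seen = {src: 0.0}, {}, set()
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u in seen:
            continue
        seen.add(u)
        if u == dst:
            break
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    if dst not in seen:
        return None, float("inf")
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[dst]

def primary_and_backup(graph, src, dst):
    """Precompute a primary path and a node-disjoint backup so the
    head-end can switch over without waiting for routing reconvergence."""
    primary, _ = dijkstra(graph, src, dst)
    if primary is None:
        return None, None
    # Prune the primary's intermediate nodes to force a disjoint backup
    mid = set(primary[1:-1])
    pruned = {u: {v: w for v, w in nbrs.items() if v not in mid}
              for u, nbrs in graph.items() if u not in mid}
    backup, _ = dijkstra(pruned, src, dst)
    return primary, backup
```

In an actual SRv6 deployment the chosen node sequences would be encoded as segment lists and the backup pre-installed for local protection; this sketch only shows the path-selection logic.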
Funding: The International Association of Geodesy (IAG) Secretary General, Herman Drewes, for providing financial support to present the current study at this symposium.