Least squares linear fitting analyzes data without error only under certain conditions. To make linear data fitting solve more accurately for the relationship between quantities in scientific experiments and engineering practice, this article analyzes the data error of the commonly used linear data fitting method and proposes an improved procedure, the least distance squares method, based on the least squares method. Finally, the paper discusses the advantages and disadvantages of the two kinds of linear data fitting through example analysis and gives reasonable control conditions for their application.
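To make the contrast between the two fitting ideas concrete, the sketch below (hypothetical data, NumPy only) fits a straight line first by ordinary least squares, which minimizes vertical residuals, and then by a least-distance (orthogonal, total least squares) fit obtained from the SVD of the centred data, which minimizes perpendicular distances; this is one common formalization of a "least distance squares" fit and is not taken from the paper itself.

```python
import numpy as np

# Hypothetical noisy data with scatter in both x and y.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50) + rng.normal(0.0, 0.3, 50)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.3, 50)

# Ordinary least squares: minimizes the vertical residuals sum (y - a*x - b)^2.
a_ols, b_ols = np.polyfit(x, y, 1)

# Least-distance (orthogonal) fit: minimizes perpendicular distances.  The line
# passes through the centroid, and its direction is the leading right singular
# vector of the centred data matrix.
X = np.column_stack([x - x.mean(), y - y.mean()])
_, _, vt = np.linalg.svd(X, full_matrices=False)
dx, dy = vt[0]
a_tls = dy / dx
b_tls = y.mean() - a_tls * x.mean()

print(f"OLS fit:        y = {a_ols:.3f} x + {b_ols:.3f}")
print(f"orthogonal fit: y = {a_tls:.3f} x + {b_tls:.3f}")
```

With errors in both coordinates, the orthogonal fit is typically the less biased of the two, which is the trade-off the abstract's "control conditions" are meant to delimit.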
Infection with hepatitis B virus (HBV) has been a serious public health issue worldwide, causing more than one million fatalities per year. Mathematical modelling of the disease allows a better understanding of its transmission and helps government policy makers choose the best control strategies. With this inspiration, we propose a novel dynamic model that incorporates infection-age structure to imitate the transmission of HBV, especially the age heterogeneity in the horizontal and vertical (mother-to-child) transmission modes. We also discuss its impact on control measures and analyze the dynamics of waning immunity and reinfection. We conduct sensitivity analysis to evaluate the effectiveness of each control measure. Our research concentrates on HBV acute patient case data for the United States from the Centers for Disease Control and Prevention (CDC). Our findings show that a mixed approach including vaccination, medication and periodic health assessments can effectively control HBV transmission. Among these measures, we find that early vaccination with a single-dose vaccine of US$50 is the most cost-effective control strategy.
Hepatitis B is an infectious disease worthy of attention. Considering the incubation period, a psychological inhibition factor, vaccination, limited medical resources and horizontal transmission, an SIRS model is proposed to describe hepatitis B transmission dynamics. To describe the behavior changes caused by people's psychological changes, a non-monotonic incidence rate is adopted in the model, and a saturated treatment rate is used to describe the limited medical resources. Mathematical analysis establishes the existence conditions of the equilibria, forward or backward bifurcation, Hopf bifurcation and Bogdanov-Takens bifurcation. Observation of hepatitis B case data in China reveals three main features: periodic outbreaks, aperiodic outbreaks, and periodic outbreaks turning into aperiodic outbreaks. According to these features, we select three representative regions, Jiangxi and Zhejiang provinces and Beijing, and use our model to fit the actual monthly hepatitis B case data. The estimated basic reproduction numbers are 1.7712, 1.4805 and 1.4132, respectively. The results of the data fitting are consistent with those of the theoretical analysis. According to the sensitivity analysis of R0, we conclude that reducing contact, increasing the treatment rate, strengthening vaccination and revaccinating can effectively prevent and control the prevalence of hepatitis B.
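The abstract does not reproduce the model equations; the sketch below is an illustrative SIRS system in the same spirit, with a non-monotonic incidence kSI/(1+aI²) for psychological inhibition and a saturated treatment term rI/(1+bI) for limited medical resources. Every parameter value is hypothetical and not fitted to the Chinese case data.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameter values (hypothetical, not the paper's estimates).
Lam, mu, k, a = 10.0, 0.01, 0.0006, 0.05     # recruitment, natural death, transmission, inhibition
gamma, r, b, delta = 0.05, 0.3, 0.1, 0.02    # recovery, treatment, saturation, immunity loss

def sirs(t, u):
    S, I, R = u
    incidence = k * S * I / (1.0 + a * I ** 2)   # non-monotonic incidence
    treatment = r * I / (1.0 + b * I)            # saturated treatment
    dS = Lam - mu * S - incidence + delta * R
    dI = incidence - (mu + gamma) * I - treatment
    dR = gamma * I + treatment - (mu + delta) * R
    return [dS, dI, dR]

# Basic reproduction number of this illustrative model at the disease-free
# equilibrium S0 = Lam/mu:  R0 = k*S0 / (mu + gamma + r).
R0 = k * (Lam / mu) / (mu + gamma + r)
sol = solve_ivp(sirs, (0.0, 3650.0), [990.0, 10.0, 0.0])
print(f"illustrative R0 = {R0:.3f}, infectious after 10 years = {sol.y[1, -1]:.1f}")
```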
We propose a new reconstruction scheme for the backward heat conduction problem. Using eigenfunction expansions, this ill-posed problem is solved through an optimization problem that is essentially a regularizing scheme for the noisy input data, with both the number of truncation terms and the approximation accuracy of the final data as multiple regularizing parameters. A convergence rate analysis depending on the strategy for choosing the regularizing parameters, as well as on the computational accuracy of the eigenfunctions, is given. Numerical implementations are presented to show the validity of this new scheme.
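As a minimal sketch of why truncating the eigenfunction expansion regularizes the backward problem, the code below works on the 1-D model equation u_t = u_xx with Dirichlet boundary conditions, where the eigenfunctions sin(nπx) and eigenvalues (nπ)² are explicit; the test initial state, noise level and truncation level are assumptions, and this is not the paper's optimization scheme.

```python
import numpy as np

# Given noisy final data g ~ u(.,T), the backward problem recovers u(.,0);
# truncating the expansion at N terms controls the exponential amplification
# exp((n*pi)**2 * T) of the data noise.
T, N = 0.05, 3
x = np.linspace(0.0, 1.0, 201)
dx = x[1] - x[0]
u0_true = np.sin(np.pi * x) + 0.5 * np.sin(3.0 * np.pi * x)     # assumed test initial state

# Exact forward solution at t = T plus measurement noise.
g = (np.exp(-np.pi ** 2 * T) * np.sin(np.pi * x)
     + 0.5 * np.exp(-9.0 * np.pi ** 2 * T) * np.sin(3.0 * np.pi * x))
g = g + np.random.default_rng(1).normal(0.0, 1e-3, x.size)

# Fourier-sine coefficients of the noisy data, amplified back to t = 0 for the
# retained modes only.
u0_rec = np.zeros_like(x)
for n in range(1, N + 1):
    phi = np.sin(n * np.pi * x)
    gn = 2.0 * np.sum(g * phi) * dx
    u0_rec += gn * np.exp((n * np.pi) ** 2 * T) * phi

print("max reconstruction error:", np.abs(u0_rec - u0_true).max())
```

Raising N beyond a few terms lets the amplified noise dominate, which is exactly why the truncation level acts as a regularizing parameter.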
Given a set of scattered data with derivative values, if the data are noisy or the number of data points is extremely large, we use an extension of the penalized least squares method of von Golitschek and Schumaker [Serdica, 18 (2002), pp. 1001-1020] to fit the data. We show that the extension of the penalized least squares method produces a unique spline fitting the data, and we give the error bound for the extended method. Some numerical examples are presented to demonstrate the effectiveness of the proposed method.
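The paper's setting is bivariate splines with derivative data; purely as a much simpler illustration of the penalized least squares idea itself, the sketch below fits a 1-D discrete smoother minimizing ||y − z||² + λ||D₂z||², with the penalty weight λ as the smoothing parameter. Data and λ are hypothetical, and this is not the authors' spline construction.

```python
import numpy as np

# Discrete penalized least squares: minimize ||y - z||^2 + lam * ||D2 z||^2,
# where D2 is the second-difference operator; the normal equations are
# (I + lam * D2^T D2) z = y.
rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2.0 * np.pi * x) + rng.normal(0.0, 0.15, x.size)   # hypothetical noisy samples

def penalized_ls(y, lam):
    n = y.size
    D2 = np.diff(np.eye(n), n=2, axis=0)          # (n-2) x n second-difference matrix
    return np.linalg.solve(np.eye(n) + lam * D2.T @ D2, y)

z = penalized_ls(y, lam=50.0)
print("rms residual of the smoothed fit:", np.linalg.norm(y - z) / np.sqrt(y.size))
```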
The Dazu Rock Carvings in Chongqing were inscribed on the World Heritage List in 1999. In recent years, the Dazu Rock Carvings have faced environmental challenges such as geological forces, increased precipitation, pollution and tourism, which have led to rock deterioration and structural instability. The multi-source monitoring system for the protection of the rock carvings, based on the Internet of Things (IoT), includes Global Navigation Satellite System (GNSS) displacement monitoring, static level displacement monitoring, laser rangefinder displacement monitoring, roof pressure sensor monitoring and environmental damage monitoring. This paper analyses data from each sub-monitoring system within the multi-source monitoring system applied to Yuanjue Cave in the Dazu Rock Carvings. Initially, a correlation analysis between the climate monitoring data and the roof displacement data was carried out to assess the effect of temperature. Based on the results of this analysis, a temperature correction equation for the laser rangefinder was derived to improve the laser rangefinder displacement monitoring system. The improved system was then used to monitor Cave 168, revealing the deformation and erosion patterns of the roof. The results demonstrate that the multi-parameter monitoring system is capable of accurately measuring and analyzing the stability of the Dazu stone carvings, as well as the effects of environmental conditions on them. The use of the IoT and real-time data collection to monitor rock deformation and environmental conditions is an innovative application of technology in cultural heritage conservation. Interpretation of the monitoring system and statistical correlation analysis of the temperature and laser rangefinder data highlight the thoroughness of the methodology and its relevance to sustainable mountain development. In the future, multi-source monitoring systems will have broader application in the conservation of other UNESCO World Heritage Sites.
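The abstract does not give the form of the temperature correction equation; one common and minimal choice is a linear regression of the rangefinder reading against temperature over a reference period, with the fitted thermal component then subtracted from later readings. The sketch below (entirely synthetic data and coefficients) illustrates that generic workflow only, not the paper's derived equation.

```python
import numpy as np

# Synthetic reference period: the raw rangefinder reading mixes a slow real
# displacement with a thermally induced apparent displacement (numbers invented).
rng = np.random.default_rng(3)
temp = 15.0 + 10.0 * np.sin(np.linspace(0.0, 20.0 * np.pi, 2000))   # deg C
d_true = 1e-4 * np.arange(2000)                                      # mm, slow real drift
d_raw = d_true + 0.02 * (temp - temp.mean()) + rng.normal(0.0, 0.005, temp.size)

# Least squares fit of the reading against temperature gives the thermal
# sensitivity k; subtracting k*(T - T_ref) yields the corrected displacement.
A = np.column_stack([temp - temp.mean(), np.ones_like(temp)])
k, c = np.linalg.lstsq(A, d_raw, rcond=None)[0]
d_corr = d_raw - k * (temp - temp.mean())

print(f"fitted thermal sensitivity: {k:.4f} mm per deg C")
print("apparent-displacement std before/after correction:",
      np.std(d_raw - d_true), np.std(d_corr - d_true))
```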
Based on the definition of MQ-B-splines, this article constructs five types of univariate quasi-interpolants for non-uniformly distributed data. The error estimates and the shape-preserving properties are shown in detail, and examples are given to demonstrate the capacity of the quasi-interpolants for curve representation.
With the observation of a series of ground-based laser interferometer gravitational wave (GW) detectors such as LIGO and Virgo, nearly 100 GW events have been detected successively. At present, all detected GW events are generated by the mergers of compact binary systems and are identified through the data processing of matched filtering. Based on matched filtering, we use the GW waveform of the Newtonian approximate (NA) model constructed by linearized theory to match the events detected by LIGO and injections to determine the coalescence time, and we fit the frequency curve to estimate the chirp masses of binary black holes (BBHs). The average chirp mass of our results is 22.05^{+6.31}_{-6.31} M_⊙, which is very close to the 23.80^{+4.83}_{-3.52} M_⊙ provided by GWOSC. In the process, we can analyze LIGO GW events and estimate the chirp masses of the BBHs. This work demonstrates the feasibility and accuracy of the low-order approximate model and data fitting in GW data processing. It is beneficial for further data processing and has certain research value for the preliminary application of GW data.
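The leading-order (Newtonian) chirp relation is f(t)^{-8/3} = (256/5) π^{8/3} (G M_c / c³)^{5/3} (t_c − t), so a linear fit of f^{-8/3} against time yields both the chirp mass M_c and the coalescence time. The sketch below illustrates that fit on a synthetic frequency track with an assumed 25 M_⊙ chirp mass; it uses no LIGO data and is not the paper's matched-filtering pipeline.

```python
import numpy as np

G, c, M_sun = 6.674e-11, 2.998e8, 1.989e30   # SI units

# Leading-order chirp:  f(t)**(-8/3) = (256/5) * pi**(8/3) * (G*Mc/c**3)**(5/3) * (tc - t)
def f_of_t(t, Mc, tc):
    kc = (256.0 / 5.0) * np.pi ** (8.0 / 3.0) * (G * Mc / c ** 3) ** (5.0 / 3.0)
    return (kc * (tc - t)) ** (-3.0 / 8.0)

# Synthetic frequency track for an assumed chirp mass (not a real event).
Mc_true, tc_true = 25.0 * M_sun, 0.0
t = np.linspace(-0.60, -0.05, 300)
f_obs = f_of_t(t, Mc_true, tc_true) * (1.0 + np.random.default_rng(4).normal(0.0, 0.01, t.size))

# Linear fit of f**(-8/3) versus t: the slope is -(256/5)*pi**(8/3)*(G*Mc/c**3)**(5/3).
slope, intercept = np.polyfit(t, f_obs ** (-8.0 / 3.0), 1)
Mc_est = (-slope * 5.0 / 256.0) ** 0.6 / np.pi ** 1.6 * c ** 3 / G
tc_est = -intercept / slope
print(f"estimated chirp mass: {Mc_est / M_sun:.2f} M_sun, coalescence time: {tc_est:.4f} s")
```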
As a basic protective element, the steel plate has attracted worldwide attention because of the frequent threat of explosive loads. This paper reports the relationships between microscopic defects of Q345 steel plate under explosive load and its macroscopic dynamics simulation. Firstly, the defect characteristics of the steel plate were investigated by stereoscopic microscope (SM) and scanning electron microscope (SEM). At the macroscopic level, the defect was the formation of caves concentrated within 0-3.0 cm of the explosion center, while at the microscopic level, cavity and void formation were the typical damage characteristics. The difference in defect morphology at different positions was the combined result of high temperature and high pressure. Secondly, the variation of the mechanical properties of the steel plate under explosive load was studied. The Arbitrary Lagrange-Euler (ALE) algorithm and a multi-material fluid-structure coupling method were used to simulate the explosion process. The accuracy of the method was verified by comparing the deformation in the simulation results with the experimental results, and the pressure and stress at different positions on the surface of the steel plate were obtained. The simulation results indicated that the critical pressure causing the plate defects may be approximately 2.01 GPa. On this basis, it was found that the variations of surface pressure and of microscopic defect area of the Q345 steel plate were strikingly similar, and the corresponding mathematical relationship between them was established. Compared with the monomolecular growth fitting model (MGFM) and the logistic fitting model (LFM), the relationship is better expressed by a cubic polynomial fitting model (CPFM). This paper illustrates that the explosive defect characteristics of a metal plate at the microscopic level can be explored by analyzing its macroscopic dynamic mechanical response.
Some recent developments (the accelerated expansion) in the Universe cannot be explained by the conventional formulation of general relativity. We apply the recently proposed f(T,B) gravity to investigate the accelerated expansion of the Universe. By parametrizing the Hubble parameter and estimating the best-fit values of the model parameters b_0, b_1, and b_2 imposed from Supernovae type Ia, Cosmic Microwave Background, Baryon Acoustic Oscillation, and Hubble data using the Markov chain Monte Carlo method, we propose a method to determine precise solutions to the field equations. We observe that the model appears to be in good agreement with the observations. A change from the deceleration to the acceleration phase of the Universe is shown by the evolution of the deceleration parameter. In addition, we investigate the behavior of the statefinder analysis and the equation of state (EoS) parameters, along with the energy conditions. Furthermore, to discuss other cosmological parameters, we consider some well-known f(T,B) gravity models, specifically f(T,B) = aT^b + cB^d. Lastly, we find that the considered f(T,B) gravity models predict that the present Universe is accelerating and that the EoS parameter behaves like the ΛCDM model.
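The abstract does not state the parametrization of H(z); purely to illustrate the MCMC fitting step, the sketch below samples the posterior of a hypothetical quadratic form H(z) = b0 + b1 z + b2 z² against synthetic H(z) points, using a hand-rolled Metropolis sampler. The data, the parametrization and the priors are all assumptions, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic H(z) points from a fiducial quadratic law (all values hypothetical).
z = np.linspace(0.05, 2.0, 15)
sig = 5.0 + 3.0 * z
H_obs = 68.0 + 35.0 * z + 12.0 * z ** 2 + rng.normal(0.0, sig)

def log_like(theta):
    b0, b1, b2 = theta
    model = b0 + b1 * z + b2 * z ** 2
    return -0.5 * np.sum(((H_obs - model) / sig) ** 2)

# Plain Metropolis sampler for the posterior of (b0, b1, b2) under flat priors.
theta = np.array([70.0, 30.0, 10.0])
step = np.array([3.0, 6.0, 4.0])
chain = []
for _ in range(20000):
    proposal = theta + step * rng.normal(size=3)
    if np.log(rng.uniform()) < log_like(proposal) - log_like(theta):
        theta = proposal
    chain.append(theta.copy())
chain = np.array(chain[5000:])          # discard burn-in
print("posterior mean of (b0, b1, b2):", chain.mean(axis=0))
print("posterior std  of (b0, b1, b2):", chain.std(axis=0))
```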
The fitting of lifetime distributions to real-life data has been studied in various fields of research. As increasingly complex data from real-world scenarios continue to emerge, many researchers have made commendable efforts to develop new lifetime distributions that can fit such data. In this paper, we utilize the KM-transformation technique to increase the flexibility of the power Lindley distribution, resulting in the Kavya-Manoharan power Lindley (KMPL) distribution. We study the mathematical treatment of the KMPL distribution in detail and adapt the widely used method of maximum likelihood to estimate its unknown parameters. We carry out a Monte Carlo simulation study to investigate the performance of the maximum likelihood estimates (MLEs) of the parameters of the KMPL distribution. To demonstrate the effectiveness of the KMPL distribution for data fitting, we use a real dataset comprising the waiting times of 100 bank customers. We compare the KMPL distribution with other models that are extensions of the power Lindley distribution. Based on several statistical model selection criteria, the summary results of the analysis are in favor of the KMPL distribution. We further investigate the density fit and probability-probability (p-p) plots to validate the superiority of the KMPL distribution over the competing distributions for fitting the waiting time dataset.
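A minimal sketch of the maximum likelihood step is given below, assuming the standard power Lindley density f(x) = αβ²/(β+1) (1+x^α) x^{α−1} e^{−βx^α} and the Kavya-Manoharan transform, which rescales a baseline pdf f and cdf F as g(x) = (e/(e−1)) f(x) e^{−F(x)}; the data here are simulated stand-ins, not the bank waiting-time set, and this is not the paper's full estimation study.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gamma

E = np.e

def pl_pdf(x, a, b):
    # Power Lindley density (assumed standard form).
    return a * b ** 2 / (b + 1.0) * (1.0 + x ** a) * x ** (a - 1.0) * np.exp(-b * x ** a)

def pl_cdf(x, a, b):
    return 1.0 - (1.0 + b * x ** a / (b + 1.0)) * np.exp(-b * x ** a)

def kmpl_neg_loglike(theta, x):
    a, b = theta
    if a <= 0.0 or b <= 0.0:
        return np.inf
    # Kavya-Manoharan transform of the baseline (f, F): g = e/(e-1) * f * exp(-F).
    logpdf = np.log(E / (E - 1.0)) + np.log(pl_pdf(x, a, b)) - pl_cdf(x, a, b)
    return -np.sum(logpdf)

# Simulated positive "waiting time" data as a stand-in for the real bank dataset.
x = gamma.rvs(2.0, scale=4.0, size=100, random_state=6)

res = minimize(kmpl_neg_loglike, x0=[1.0, 0.5], args=(x,), method="Nelder-Mead")
print("MLEs (alpha, beta):", res.x, "  maximized log-likelihood:", -res.fun)
```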
In the working state, the dynamic performance of a dry gas seal, generated by the rotating end face with spiral grooves, is determined by the opening force of the gas film and the leakage flow rate. Generally, the opening force and the leakage flow rate can be obtained by the finite element method, computational fluid dynamics, or experimental measurement, but these measurements and calculations take considerable time. In this paper, an approximate model of parallel grooves based on the narrow groove theory is used to establish the dynamic equations of the gas film for the purpose of obtaining its dynamic parameters. The nonlinear differential equations of the gas film model are solved by the Runge-Kutta method and the shooting method. The numerical values of the pressure profiles, leakage flux and opening force on the seal surface are integrated and then compared with experimental data to check the reliability of the numerical simulation. The results show that the numerical simulation curves are in good agreement with the experimental values. Furthermore, the opening force and the leakage flux are shown to be strongly correlated with the operating parameters. A function-coupling method is then introduced to analyze the numerical results and obtain correlation formulae for the opening force and the leakage flux in terms of the operating parameters, i.e., the inlet pressure and the rotating speed. This study intends to provide an effective way to predict the aerodynamic performance for designing and optimizing the groove styles in dry gas seals rapidly and accurately.
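The gas-film equations themselves are not reproduced in the abstract; as a generic illustration of the Runge-Kutta plus shooting approach to a two-point boundary value problem, the sketch below solves the toy problem y'' = −y + x with y(0) = 0, y(1) = 1 by shooting on the unknown initial slope (the exact solution is y = x, so the slope found should be 1).

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

# Generic two-point BVP:  y'' = -y + x,  y(0) = 0,  y(1) = 1.
# Shooting: guess the slope s = y'(0), integrate with an RK scheme, and adjust s
# until the boundary condition at x = 1 is satisfied.
def rhs(x, u):
    y, yp = u
    return [yp, -y + x]

def end_residual(s):
    sol = solve_ivp(rhs, (0.0, 1.0), [0.0, s], method="RK45", rtol=1e-9, atol=1e-9)
    return sol.y[0, -1] - 1.0          # y(1) - 1

s_star = brentq(end_residual, -5.0, 5.0)   # root of the residual in the slope
print("initial slope found by shooting:", s_star)   # exact answer is 1.0
```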
Uncontrolled residual stresses have significant effects on the service life and defects of spun parts. Nowadays, the X-ray diffraction (XRD) method is widely used for residual stress measurement of industrial products made by different forming processes. The residual stress is usually calculated from the slope of the fitted strain-versus-angle data using the ordinary least squares (OLS) method, but this fitting method is not always suitable for data with large fluctuations. In this paper, the weighted least squares (WLS) method is used for the data fitting and compared with the OLS method. The nickel-based superalloy GH3030 and the iron-based superalloy GH1140 are used in multi-pass cold spinning experiments. The residual stress distributions of normal, potential-crack and wrinkled workpieces are discussed together with the grain structure. The results show that the WLS method gives a better goodness of fit than the OLS method, and that the residual stress distributions have a particular relationship with potential cracks, wrinkled workpieces and the grain structure.
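In the sin²ψ technique the residual stress is proportional to the slope of lattice strain versus sin²ψ, so the fitting choice matters when the scatter is uneven. The sketch below (synthetic strain data with hypothetical numbers) contrasts the OLS slope with a WLS slope that down-weights the noisier points.

```python
import numpy as np

# Synthetic sin^2(psi) data: strain = slope * sin2psi + intercept, with scatter
# that grows at high tilt angles (all values hypothetical).
rng = np.random.default_rng(7)
sin2psi = np.linspace(0.0, 0.5, 9)
sigma = 2e-5 + 2e-4 * sin2psi                     # assumed per-point uncertainty
strain = -1.2e-3 * sin2psi + 3e-4 + rng.normal(0.0, sigma)

# Ordinary least squares slope.
slope_ols, _ = np.polyfit(sin2psi, strain, 1)

# Weighted least squares: np.polyfit takes weights proportional to 1/sigma,
# which corresponds to minimizing sum((residual/sigma)**2).
slope_wls, _ = np.polyfit(sin2psi, strain, 1, w=1.0 / sigma)

print(f"OLS slope: {slope_ols:.3e}   WLS slope: {slope_wls:.3e}")
# The stress then follows from the slope and the X-ray elastic constants,
# roughly sigma_res = E/(1+nu) * slope in the standard sin^2(psi) analysis.
```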
The state estimation of a maneuvering target whose trajectory shape is independent of its dynamic characteristics is studied. The conventional motion models in Cartesian coordinates imply that the trajectory of a target is completely determined by its dynamic characteristics. However, this is not true in road-target, sea-route-target or flight-route-target tracking applications, where the target trajectory shape is uncoupled from the target velocity properties. In this paper, a new estimation algorithm based on separate modeling of the target trajectory shape and the dynamic characteristics is proposed. The trajectory of a target over a sliding window is described by a linear function of the arc length. To determine the unknown target trajectory, an augmented system is derived by treating the unknown coefficients of the function as states in mileage coordinates. At every estimation cycle except the first, the interaction (mixing) stage of the proposed algorithm starts from the latest estimated base state and a recalculated parameter vector determined by least squares (LS). Numerical experiments are conducted to assess the performance of the proposed algorithm. Simulation results show that the proposed algorithm achieves better performance than conventional coupled-model-based algorithms in the presence of target maneuvers.
With the development of computational power, there has been an increased focus on data-fitting-related seismic inversion techniques for high-fidelity seismic velocity models and images, such as full-waveform inversion and least squares migration. However, although more advanced than conventional methods, these data-fitting methods can be very expensive in terms of computational cost. Recently, various techniques to optimize these data-fitting seismic inversion problems have been implemented to cater for the industrial need for much improved efficiency. In this study, we propose a general stochastic conjugate gradient method for these data-fitting-related inverse problems. We first present the basic theory of our method and then give synthetic examples. Our numerical experiments illustrate the potential of this method for large-scale seismic inversion applications.
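The abstract does not spell out the algorithm; the sketch below shows one common reading of a stochastic conjugate gradient scheme on a toy linear data-fitting problem: each iteration evaluates the gradient of ||Ax − b||² on a random subset of the data rows and updates the search direction with a Fletcher-Reeves coefficient. All sizes and data are placeholders, far from a seismic-scale problem, and this is not the authors' exact method.

```python
import numpy as np

# Toy data-fitting problem min_x ||A x - b||^2 with many more data rows than unknowns.
rng = np.random.default_rng(8)
A = rng.normal(size=(2000, 50))
x_true = rng.normal(size=50)
b = A @ x_true + rng.normal(0.0, 0.01, 2000)

x = np.zeros(50)
d, g_old = None, None
for it in range(200):
    batch = rng.choice(2000, 200, replace=False)       # random data subset each iteration
    Ab, bb = A[batch], b[batch]
    scale = A.shape[0] / batch.size
    g = 2.0 * scale * Ab.T @ (Ab @ x - bb)             # stochastic gradient of the misfit
    if d is None:
        d = -g
    else:
        beta = (g @ g) / (g_old @ g_old)               # Fletcher-Reeves coefficient
        d = -g + beta * d
    alpha = -(g @ d) / (2.0 * scale * np.sum((Ab @ d) ** 2) + 1e-30)  # exact step on the batch
    x = x + alpha * d
    g_old = g

print("relative model error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```

Because only a batch of rows is touched per iteration, the per-step cost scales with the batch size rather than with the full data volume, which is the efficiency argument behind stochastic variants of these data-fitting solvers.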
Spherical indentations that rely on original data are analyzed with the physically correct mathematical formula, and its integration, that take into account the change of radius over depth upon penetration. Linear plots, phase-transition onsets, energies, and pressures are obtained algebraically for germanium, zinc oxide and gallium nitride. There are low-pressure phase transitions that correspond to, or are not resolved by, hydrostatic anvil onset pressures. This enables the attribution of polymorph structures by comparison with known structures from pulsed laser deposition or molecular beam epitaxy and twinning. Spherical indentation is the easiest way to synthesize and further characterize polymorphs, now available in pure form under the diamond calotte and in contact with their corresponding less dense polymorph. The unprecedented results and new possibilities require loading curves from experimental data. These are now easily distinguished from data that are "fitted" to make them concur with the widely used but unphysical Johnson formula for spheres, P = (4/3) h^(3/2) R^(1/2) E*, which does not take care of the R/h variation. Challenging that formula is indispensable, because its use involves "fitting equations" to make the data concur. Such faked reports (with no experimental data) provide dangerously false moduli and theories. Fitted spherical indentation reports with radii ranging from 4 to 250 μm are identified for PDMS, GaAs, Al, Si, SiC, MgO, and steel. The detailed analysis reveals characteristic features.
Based on an analysis of 280 Type Ia supernova (SNIa) and gamma-ray burst redshifts in the range z = 0.0104 - 8.1, the Hubble diagram is shown to follow a strictly exponential slope, predicting an exponentially expanding or static universe. At redshifts > 2 - 3, ΛCDM models show poor agreement with the observed data. Based on the results presented in this paper, the Hubble diagram test does not necessarily support the idea of expansion according to the big-bang concordance model.
In this paper we consider the quintessence reconstruction of interacting holographic dark energy in a non-flat background. As the system's IR cutoff we choose the radius of the event horizon measured on the sphere of the horizon, defined as L = a(t)r(t). To this end we construct a quintessence model from a real, single scalar field. The evolution of the potential, V(φ), as well as the dynamics of the scalar field, φ, is obtained according to the respective holographic dark energy. The reconstructed potentials show a cosmological-constant behavior at the present time. We constrain the model parameters in a flat universe by using the observational data and applying a Markov chain Monte Carlo simulation. We obtain the best-fit values of the holographic dark energy model parameter and the interaction parameter as c = 1.0576^{+0.3010 +0.3052}_{-0.6632 -0.6632} and ζ = 0.2433^{+0.6373 +0.6373}_{-0.2251 -0.2251}, respectively. From the data-fitting results we also find that the model can cross the phantom line in the present universe, where the best-fit value of the dark energy equation of state is w_D = -1.2429.
In experimental tests, besides data within the range of allowable error, experimenters usually obtain some unexpected wrong data, called bad points. In usual experimental data processing, methods for excluding bad points based on automatic programming are seldom considered by researchers. This paper presents a new method to reject bad points based on the Hough transform, which is modified to reduce computational and memory consumption. It is suited to linear data processing and can be extended to data that can be transformed to and from linear form, i.e., curved lines, which can be effectively detected by the Hough transform. In this paper, the premise is that the distribution of the data, such as a linear or exponential distribution, is predetermined. The algorithm starts by searching for an approximate curve that minimizes the sum of the parameters of the data points; the data points whose parameters are above a self-adapting threshold are then deleted. Simulation experiments have shown that the proposed method performs efficiently and robustly.
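A minimal sketch of the idea for linearly distributed data is given below (the synthetic points and the threshold rule are assumptions, not the paper's modified algorithm): each point votes in a discretized (θ, ρ) accumulator using the normal form ρ = x cos θ + y sin θ, the peak cell is taken as the underlying line, and points lying too far from that line are rejected as bad points.

```python
import numpy as np

# Synthetic linear data with a few injected "bad points" (all values illustrative).
rng = np.random.default_rng(9)
x = np.linspace(0.0, 10.0, 60)
y = 1.5 * x + 2.0 + rng.normal(0.0, 0.1, x.size)
y[[7, 23, 41]] += rng.uniform(5.0, 10.0, 3)

# Hough transform in normal form rho = x*cos(theta) + y*sin(theta): each point
# votes once per theta for the rho bin of the line passing through it.
thetas = np.linspace(0.0, np.pi, 720, endpoint=False)
rho_max = np.hypot(x, y).max()
rho_bins = np.linspace(-rho_max, rho_max, 801)
acc = np.zeros((thetas.size, rho_bins.size - 1), dtype=int)
for xi, yi in zip(x, y):
    rho = xi * np.cos(thetas) + yi * np.sin(thetas)
    idx = np.clip(np.digitize(rho, rho_bins) - 1, 0, rho_bins.size - 2)
    acc[np.arange(thetas.size), idx] += 1

# The peak cell gives the dominant line; points far from it are rejected.
ti, ri = np.unravel_index(acc.argmax(), acc.shape)
theta0 = thetas[ti]
rho0 = 0.5 * (rho_bins[ri] + rho_bins[ri + 1])
dist = np.abs(x * np.cos(theta0) + y * np.sin(theta0) - rho0)
keep = dist < 5.0 * np.median(dist)           # simple self-adapting threshold (illustrative)
print("rejected point indices:", np.where(~keep)[0])
```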