Outbreaks of hand-foot-mouth disease (HFMD) have occurred many times and caused a serious health burden in China since 2008. Applying modern information technology to prediction and early response can support efficient HFMD prevention and control. A seasonal auto-regressive integrated moving average (ARIMA) model for time series analysis was designed in this study. Eighty-four months (January 2009 to December 2015) of retrospective data obtained from the Chinese Information System for Disease Prevention and Control were subjected to ARIMA modeling. The coefficient of determination (R^2), normalized Bayesian Information Criterion (BIC), and Q-test P value were used to evaluate the goodness-of-fit of the constructed models. The best-fitted ARIMA model was then applied to predict the expected incidence of HFMD from January 2016 to December 2016. The best-fitted seasonal ARIMA model was identified as (1,0,1)(0,1,1)_12, with the largest coefficient of determination (R^2 = 0.743) and the lowest normalized BIC (3.645). The residuals of the model also showed non-significant autocorrelations (Box-Ljung Q test, P = 0.299). The predictions of the optimal ARIMA model adequately captured the pattern in the data and exhibited two peaks of activity over the forecast interval: a major peak from April to June and a lighter peak from September to November. The ARIMA model proposed in this study can forecast the HFMD incidence trend effectively and could provide useful support for future HFMD prevention and control in the study area. Further observations should be added to the modeling data set continually, and the model parameters should be adjusted accordingly.
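To make the modeling step concrete, here is a minimal sketch (not the authors' code) of fitting a seasonal ARIMA(1,0,1)(0,1,1)_12 to a monthly incidence series with statsmodels; the file name and column name are placeholders.

```python
# Hedged sketch: fitting a seasonal ARIMA(1,0,1)(0,1,1)_12 to monthly HFMD counts
# with statsmodels. "hfmd_monthly.csv" and the "cases" column are hypothetical.
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

monthly_cases = pd.read_csv("hfmd_monthly.csv", index_col=0, parse_dates=True)["cases"]

model = SARIMAX(monthly_cases,
                order=(1, 0, 1),               # non-seasonal (p, d, q)
                seasonal_order=(0, 1, 1, 12))  # seasonal (P, D, Q) with period 12
res = model.fit(disp=False)

print(res.bic)                                          # compare candidates by BIC
print(res.test_serial_correlation(method="ljungbox"))   # Box-Ljung check on residuals
forecast = res.forecast(steps=12)                       # expected incidence, next 12 months
```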
An auto-regressive moving average model in polynomial matrix form with a block coefficient matrix representation (referred to as the PM-ARMA model) is constructed in this paper for actively controlled multi-degree-of-freedom (MDOF) structures with time delay, by equivalently transforming the preliminary state space realization into a new state space realization. The PM-ARMA model is a more general formulation than the polynomial ARMA model using the coefficient representation, because it can cope with actively controlled structures having any number of structural degrees of freedom and any chosen number of sensors and actuators (the numbers of sensors and actuators are required to be identical) under stationary stochastic excitation of any dimension.
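For readers unfamiliar with the notation, the block below records a generic matrix (vector) ARMA form and an equivalent state-space realization; it only fixes notation and is not the paper's specific PM-ARMA construction for time-delayed control.

```latex
% Generic matrix ARMA(p,q) model and an equivalent discrete-time state-space form.
\[
  y_k + A_1 y_{k-1} + \cdots + A_p y_{k-p}
      \;=\; B_0 u_k + B_1 u_{k-1} + \cdots + B_q u_{k-q},
\]
\[
  x_{k+1} = F x_k + G u_k, \qquad y_k = H x_k + D u_k,
\]
% where the block coefficient matrices A_i and B_j are obtained from the state-space
% matrices (F, G, H, D) of the controlled structure.
```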
Statistical properties of winds near the Taichung Harbour are investigated. Twenty-six years of incomplete hourly wind speed data are used as the reference. The possibility of imputation using simulated results of the Auto-Regressive (AR), Moving-Average (MA), and/or Auto-Regressive Moving-Average (ARMA) models is studied. Predictions of the 25-year extreme wind speeds based upon the augmented data are compared with those from the original series. Based upon the results, predictions of the 50- and 100-year extreme wind speeds are then made.
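A minimal sketch of the model-selection and gap-filling idea is given below (assuming statsmodels and a placeholder wind record); it is not the study's actual procedure for the Taichung Harbour data.

```python
# Hedged sketch: choose among AR, MA and ARMA candidates for hourly wind speeds by
# information criterion, then impute a gap with forecasts from the winning model.
# "hourly_wind.txt" and the gap length are placeholders.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

wind = np.loadtxt("hourly_wind.txt")      # observed series up to a gap (hypothetical)
candidates = {"AR(2)": (2, 0, 0), "MA(2)": (0, 0, 2), "ARMA(2,1)": (2, 0, 1)}

fits = {name: ARIMA(wind, order=order).fit() for name, order in candidates.items()}
best = min(fits, key=lambda name: fits[name].aic)   # smallest AIC wins
print(best, fits[best].aic)

gap_length = 24                                     # e.g., one missing day
imputed = fits[best].forecast(steps=gap_length)     # values used to augment the record
```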
Signal-to-noise ratio (SNR) estimation for signals that can be modeled by an auto-regressive (AR) process is studied in this paper. First, the conventional frequency-domain method is introduced to estimate the SNR of the received signal in an additive white Gaussian noise (AWGN) channel. Then a parametric SNR estimation algorithm is proposed that takes advantage of the AR model information of the received signal. Simulation results show that the proposed parametric method performs better than the conventional frequency-domain method in the AWGN channel.
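As an illustration of the conventional frequency-domain baseline mentioned above, the sketch below estimates the noise floor from a band assumed to be signal-free and subtracts it from the in-band power; the sampling rate and band edges are assumptions, not values from the paper.

```python
# Hedged sketch of a frequency-domain SNR estimate for a signal in AWGN:
# estimate the flat noise floor out of band, then integrate the in-band excess power.
import numpy as np
from scipy.signal import welch

fs = 1e4                                   # assumed sampling rate (Hz)
x = np.loadtxt("received_signal.txt")      # hypothetical received AR-like signal + AWGN

f, pxx = welch(x, fs=fs, nperseg=1024)     # averaged-periodogram PSD estimate
in_band = (f >= 100) & (f <= 1500)         # assumed signal band
out_band = f > 3000                        # assumed noise-only band

noise_psd = np.mean(pxx[out_band])                     # flat AWGN floor (power/Hz)
noise_power = noise_psd * fs / 2                       # total noise power over [0, fs/2]
signal_power = np.trapz(pxx[in_band] - noise_psd, f[in_band])

snr_db = 10 * np.log10(max(signal_power, 1e-12) / noise_power)
print(f"estimated SNR: {snr_db:.1f} dB")
```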
An absolute gravimeter is a precision instrument for measuring gravitational acceleration, which plays an important role in earthquake monitoring, crustal deformation, national defense construction, etc. The frequency of the laser interference fringes of an absolute gravimeter gradually increases over the fall time, so the data are sparse in the early stage and dense in the late stage. The fitting accuracy of gravitational acceleration is therefore affected when least-squares fitting uses a fixed number of zero-crossing groups. In response to this problem, a method based on Fourier series fitting is proposed in this paper to calculate the zero-crossing points. The whole falling process is divided into five frequency bands using the Hilbert transform. A multiplicative auto-regressive moving average model is then trained according to the number of optimal zero-crossing groups obtained by the honey badger algorithm. Through this model, the number of optimal zero-crossing groups in each segment is predicted for the least-squares fitting, and the mean value of gravitational acceleration in each segment is then obtained. The method improves the accuracy of gravity measurement by more than 25% compared with the fixed zero-crossing-groups method and provides a new way to improve the measuring accuracy of an absolute gravimeter.
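The Hilbert-transform segmentation step can be pictured with the short sketch below, which estimates the instantaneous frequency of the fringe signal and cuts the record at chosen band edges; the sampling rate and edges are illustrative, not the paper's values.

```python
# Hedged sketch: split a chirp-like fringe record into frequency bands via the
# instantaneous frequency from the analytic signal (Hilbert transform).
import numpy as np
from scipy.signal import hilbert

fs = 1e6                                   # assumed sampling rate of the fringe signal
fringe = np.loadtxt("fringe_signal.txt")   # hypothetical interference-fringe record

analytic = hilbert(fringe)
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) * fs / (2 * np.pi)     # instantaneous frequency in Hz

edges = [5e3, 2e4, 5e4, 1e5]               # illustrative band edges -> five segments
band_id = np.searchsorted(edges, inst_freq)
segments = [np.where(band_id == k)[0] for k in range(len(edges) + 1)]
```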
Memristors have a synapse-like two-terminal structure and electrical properties, and are widely used in the construction of artificial synapses. However, compared to inorganic materials, organic materials are rarely used for artificial spiking synapses due to their relatively poor memristive performance. Here, for the first time, we present an organic memristor based on an electropolymerized dopamine-based memristive layer. This polydopamine-based memristor demonstrates improvements in key performance metrics, including a low threshold voltage of 0.3 V, a thin thickness of 16 nm, and a high parasitic capacitance of about 1 μF·mm^(-2). By leveraging these properties in combination with its stable threshold switching behavior, we construct a capacitor-free and low-power artificial spiking neuron capable of outputting an oscillation voltage, whose spiking frequency increases with increasing current stimulation, analogous to a biological neuron. The experimental results indicate that our artificial spiking neuron holds potential for applications in neuromorphic computing and systems.
With the increasing popularity of blockchain applications, the security of data sources on the blockchain is gradually receiving attention. Providing reliable data for the blockchain safely and efficiently has become a research hotspot, and the security of the oracle responsible for providing reliable data has attracted much attention. The most widely used centralized oracles in blockchain, such as Provable and Town Crier, all rely on a single oracle to obtain data, which suffers from a single point of failure and limits the large-scale development of blockchain. To this end, the distributed oracle scheme is put forward, but the existing distributed oracle schemes such as Chainlink and Augur generally have low execution efficiency and high communication overhead, which leads to their poor applicability. To solve the above problems, this paper proposes a trusted distributed oracle scheme based on a share recovery threshold signature. First, a data verification method of distributed oracles is designed based on threshold signature. By aggregating the signatures of oracles, data from different data sources can be mutually verified, leading to a more efficient data verification and aggregation process. Then, a credibility-based cluster head election algorithm is designed, which reduces the communication overhead by clarifying the function distribution and building a hierarchical structure. Considering the good performance of the BLS threshold signature in large-scale applications, this paper combines it with distributed oracle technology and proposes a BLS threshold signature algorithm that supports share recovery in distributed oracles. The share recovery mechanism enables the proposed scheme to solve the key loss issue, and the setting of the threshold value enables the proposed scheme to complete signature aggregation with only a threshold number of oracles, making the scheme more robust. Finally, experimental results indicate that, by using the threshold signature technology and the cluster head election algorithm, our scheme effectively improves the execution efficiency of oracles and solves the problem of a single point of failure, leading to higher scalability and robustness.
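The share-recovery ingredient can be illustrated independently of BLS: in a (t, n) threshold scheme, any t shares determine the sharing polynomial, so a lost share (or the group secret) can be re-evaluated by Lagrange interpolation over a prime field. The sketch below is generic Shamir sharing, not the paper's BLS threshold-signature implementation.

```python
# Hedged sketch: Lagrange interpolation over GF(p) reconstructs the secret (x = 0)
# or recovers a lost share (x = that share's index) from any t valid shares.
PRIME = 2**127 - 1            # illustrative field modulus

def lagrange_at(x_target, shares, prime=PRIME):
    """Evaluate the degree-(t-1) sharing polynomial at x_target from [(x_i, y_i), ...]."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (x_target - xj) % prime
                den = den * (xi - xj) % prime
        total = (total + yi * num * pow(den, -1, prime)) % prime
    return total

# Toy shares of f(x) = 42 + 100x + 200x^2 evaluated at x = 1, 3, 5.
shares = [(1, 342), (3, 2142), (5, 5542)]
print(lagrange_at(0, shares))   # 42   -> reconstructed secret
print(lagrange_at(2, shares))   # 1042 -> recovered share for participant 2
```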
Objective: To study the relationship between cortical auditory evoked potential (CAEP) thresholds and behavioral thresholds in pediatric populations with sensorineural hearing loss (SNHL). Methods: Fifteen children (mean age 6.8 years) with bilateral SNHL underwent behavioral pure-tone audiometry and CAEP testing at 0.5, 1, 2, and 4 kHz. CAEP thresholds were determined using tone bursts, and correlations between CAEP and pure-tone thresholds were analyzed using Pearson correlation and t-tests. Results: A strong positive correlation was observed between P1 thresholds and behavioral thresholds across all test frequencies: 0.5 kHz (r = 0.765, p …). Conclusion: The strong correlation between P1 and behavioral thresholds demonstrates the reliability of CAEP testing for estimating auditory thresholds in children. These findings support the use of CAEP testing as a reliable objective tool for threshold estimation, particularly in cases where behavioral responses cannot be reliably obtained. When adjusted with frequency-specific correction values, CAEP testing provides a reliable method for assessing hearing thresholds in pediatric populations.
The societal risk related to rainfall-triggered rapid debris flows is commonly managed in urbanized areas by means of early warning systems based on monitoring of hydrological parameters (such as rainfall or soil moisture) and threshold values. In Alpine catchments, this type of landslide is recurrent and represents one of the major geohazards. Debris flows are typically initiated by high-intensity rainstorms, prolonged rainfall of moderate intensity, or snow melting. They frequently occur under temporary infiltration into soils that are initially unsaturated. During significant rainfall events, the rise in pore water pressure can become critical for the stability of slopes in particular areas. This phenomenon depends on hydraulic and geotechnical characteristics, along with the thickness of the involved soils. The process can result in a local drop in shear strength, as both apparent cohesion and effective stress decline, while driving forces rise because of the increase in unit weight. Accordingly, this study estimates Intensity-Duration (I-D) rainfall thresholds at the site-specific and distributed scales by combining empirical and physics-based approaches and modeling of the soil coverings involved in soil slips or debris slides that induce debris flows. The approach was tested for mountain slopes of the Valtellina valley (Lombardia region, northern Italy), which suffered several catastrophic landslide events in recent decades. The empirical approach was adopted to reconstruct physics-based slope models of representative source areas of past debris flow events. To this end, spatially distributed rather than point data of hydro-mechanical soil properties and thicknesses were considered. To reconstruct the unsaturated/saturated critical conditions leading to slope instability, combined hydrological modeling and infinite-slope stability analysis were adopted. This combined hydro-mechanical numerical model was used to determine a three-dimensional Intensity-Duration threshold for landslide initiation considering plausible rainfall for the Valtellina valley. Because the lack of reliable records of past landslides hinders a thorough empirical analysis, the presented approach can be considered a feasible way of establishing a warning standard in urbanized areas at risk of rainfall-triggered shallow landslides. Moreover, the findings highlight the importance of having access to spatially distributed soil characteristics to define and enhance input data for physics-based modelling. Finally, the proposed approach can aid an early warning system for the onset of shallow landslides by utilizing real-time rainfall monitoring or nowcasting through a meteorological radar technique.
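For orientation, the infinite-slope stability check that such physics-based thresholds build on can be written as follows; this is the standard textbook form, whereas the paper's model additionally handles unsaturated conditions and transient infiltration.

```latex
% Standard infinite-slope factor of safety:
\[
  FS \;=\; \frac{c' + \left(\gamma z \cos^{2}\beta - u_w\right)\tan\varphi'}
                {\gamma z \sin\beta \cos\beta},
\]
% where c' is the effective cohesion, \varphi' the effective friction angle, \gamma the
% soil unit weight, z the depth of the potential failure surface, \beta the slope angle
% and u_w the pore water pressure; rainfall-driven increases in u_w drive FS toward 1,
% which is what the Intensity-Duration threshold encodes.
```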
Radio environment plays an important role in radio astronomy observations. Further analysis is needed on the time and intensity distributions of interference signals for long-term radio environment monitoring. Sample variance is an important estimate of the interference signal decision threshold. Here, we propose an improved algorithm for calculating data sample variance relying on four established statistical methods: the variance of the trimmed data, winsorized sample variance, median absolute deviation, and median of the trimmed data pairwise averaged squares method. The variance and decision threshold in the protected section of the radio astronomy L-band are calculated. Among the four methods, the improved median of the trimmed data pairwise averaged squares algorithm has higher accuracy, but in a comparison of overall experimental results, the cleanliness rate of all algorithms is above 96%. In a comparison between the improved algorithm and the four methods, the cleanliness rate of the improved algorithm is above 98%, verifying its feasibility. The time-intensity interference distribution in the radio protection band is also obtained. Finally, we use comprehensive monitoring data of radio astronomy protection bands, radio interference bands, and interfered frequency bands to establish a comprehensive evaluation system for radio observatory sites, including the observable time proportion in the radio astronomy protection band, the occasional time-intensity distribution in the radio interference frequency band, and the intensity distribution of the interfered frequency band.
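The four robust scale estimates can be sketched as below (assuming SciPy and a placeholder sample); the "median of the trimmed data pairwise averaged squares" variant is our reading of the description, not the authors' exact definition.

```python
# Hedged sketch of four robust variance estimates for interference-threshold setting.
import numpy as np
from scipy.stats import mstats, median_abs_deviation

x = np.loadtxt("rfi_samples.txt")          # hypothetical monitoring samples
alpha = 0.1                                # trim/winsorize proportion per tail

xs = np.sort(x)
k = int(alpha * x.size)
trimmed = xs[k: x.size - k]
var_trimmed = np.var(trimmed, ddof=1)                       # variance of trimmed data

winsorized = np.asarray(mstats.winsorize(x, limits=(alpha, alpha)))
var_winsorized = np.var(winsorized, ddof=1)                 # winsorized sample variance

var_mad = median_abs_deviation(x, scale="normal") ** 2      # MAD rescaled to a variance

i, j = np.triu_indices(trimmed.size, k=1)                   # O(n^2): fine for a sketch
var_pairwise = np.median((trimmed[i] - trimmed[j]) ** 2 / 2)  # trimmed pairwise averaged squares
```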
Low-pressure mercury lamps, with a main emission at about 254 nm, have been used for disinfecting air, water and surfaces for nearly a century. However, only a few studies on the corneal damage threshold at the wavelength of 254 nm exist. In this paper, the in vivo corneal damage threshold was determined in a chinchilla rabbit model using a laser system at 254 nm. The irradiance of the laser spot was nearly flat-top distributed, and the beam diameter on the animal corneal surface was about 3.44 mm and 3.28 mm along the horizontal and vertical directions, respectively. Damage lesion determinations were performed at 12 h post-exposure using fluorescein sodium staining. The ED50 was 17.7 mJ/cm^(2) with a 95% confidence interval of 15.3-20.1 mJ/cm^(2). The obtained results may contribute to the knowledge base for the refinement of the UV hazard function.
Atmospheric aerosols are the primary contributors to environmental pollution. As such aerosols are micro- to nano-sized particles invisible to the naked eye, it is necessary to utilize LiDAR technology for their detection. The laser radar echo signal is vulnerable to background light and electronic thermal noise. While single-photon LiDAR can effectively reduce background light interference, electronic thermal noise remains a significant challenge, especially at long distances and in environments with a low signal-to-noise ratio (SNR). However, conventional denoising methods cannot achieve satisfactory results in this case. In this paper, a novel adaptive continuous threshold wavelet denoising algorithm is proposed to filter out the noise. The algorithm features an adaptive threshold and a continuous threshold function. The adaptive threshold is dynamically adjusted according to the wavelet decomposition level, and the continuous threshold function ensures continuity with lower constant error, thus optimizing the denoising process. Simulation results show that the proposed algorithm has excellent performance in improving SNR and reducing root mean square error (RMSE) compared with other algorithms. Experimental results show that denoising of an actual LiDAR echo signal results in a 4.37 dB improvement in SNR and a 39.5% reduction in RMSE. The proposed method significantly enhances the ability of single-photon LiDAR to detect weak signals.
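A minimal stand-in for the denoising step is sketched below with PyWavelets: soft thresholding of the detail coefficients with a level-dependent scaling of the universal threshold. The exact adaptive continuous threshold function of the paper is not reproduced here.

```python
# Hedged sketch: level-adaptive soft-threshold wavelet denoising of a LiDAR echo.
import numpy as np
import pywt

echo = np.loadtxt("lidar_echo.txt")        # hypothetical noisy single-photon echo
wavelet, level = "db4", 5

coeffs = pywt.wavedec(echo, wavelet, level=level)   # [cA_L, cD_L, ..., cD_1]
sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # noise scale from finest details
universal = sigma * np.sqrt(2 * np.log(echo.size))

denoised_coeffs = [coeffs[0]]
for j, detail in enumerate(coeffs[1:], start=1):    # coarsest detail band first
    thr = universal / np.log(j + 1)                 # illustrative level-dependent scaling
    denoised_coeffs.append(pywt.threshold(detail, thr, mode="soft"))

denoised = pywt.waverec(denoised_coeffs, wavelet)
```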
To handle small errors in clock bias data and improve the effectiveness of satellite clock bias prediction, a clock bias data processing method based on interval-correlation-coefficient wavelet threshold denoising is suggested. Wavelet analysis was first used to decompose the satellite clock frequency data into several levels, producing high- and low-frequency coefficients for each layer. These coefficients were split into three sub-intervals, and the correlation coefficients of the high- and low-frequency coefficients in each sub-interval were then determined. The main noise region—the sub-interval with the lowest correlation coefficient—was chosen for thresholding treatment and noise threshold computation. The clock frequency data were then processed using wavelet reconstruction and converted back to clock bias data. Lastly, three different kinds of satellite clock data—RTS, whu-o, and IGS-F—were used to validate the processed data. According to the results, our method enhanced the stability of the Quadratic Polynomial (QP) model's predictions for the C16 satellite by about 40%. The accuracy and stability of the Auto-Regressive Integrated Moving Average (ARIMA) model improved by up to 41.8% and 14.2%, respectively, whilst the Wavelet Neural Network (WNN) model improved by roughly 27.8% and 63.6%, respectively. Although our method has little effect on forecasting for IGS-F series satellites, the experimental findings show that it can improve the accuracy and stability of QP, ARIMA, and WNN model forecasts for RTS and whu-o satellite clock bias.
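The interval-correlation selection step can be pictured with the sketch below: at a given decomposition level, split the low-frequency (cA) and high-frequency (cD) coefficients into three sub-intervals and threshold only the one whose |correlation| is lowest, treating it as the main noise region. This follows our reading of the description, not the authors' implementation.

```python
# Hedged sketch: correlation-guided sub-interval wavelet thresholding of clock
# frequency data (single decomposition level shown for brevity).
import numpy as np
import pywt

def denoise_one_level(signal, wavelet="db4"):
    cA, cD = pywt.dwt(signal, wavelet)                      # low/high frequency parts
    thirds = np.array_split(np.arange(cD.size), 3)          # three sub-intervals
    corrs = [abs(np.corrcoef(cA[idx], cD[idx])[0, 1]) for idx in thirds]
    noisy = thirds[int(np.argmin(corrs))]                   # lowest-correlation third

    sigma = np.median(np.abs(cD[noisy])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(cD.size))
    cD = cD.copy()
    cD[noisy] = pywt.threshold(cD[noisy], thr, mode="soft")
    return pywt.idwt(cA, cD, wavelet)[: signal.size]

clock_freq = np.loadtxt("clock_frequency.txt")              # hypothetical frequency series
cleaned = denoise_one_level(clock_freq)
```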
In clinical research, subgroup analysis can help identify patient groups that respond better or worse to specific treatments, improve therapeutic effect and safety, and is of great significance in precision medicine. This article considers subgroup analysis methods for longitudinal data containing multiple covariates and biomarkers. We divide subgroups based on whether a linear combination of these biomarkers exceeds a predetermined threshold, and assess the heterogeneity of treatment effects across subgroups using the interaction between subgroups and exposure variables. Quantile regression is used to better characterize the global distribution of the response variable, and sparsity penalties are imposed to achieve variable selection of covariates and biomarkers. The effectiveness of our proposed methodology for both variable selection and parameter estimation is verified through random simulations. Finally, we demonstrate the application of this method by analyzing data from the PA.3 trial, further illustrating the practicality of the method proposed in this paper.
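The estimation core can be sketched as a penalized quantile-regression fit: minimize the check (pinball) loss plus an L1 penalty on the coefficients. The threshold-based subgroup indicator and the subgroup-exposure interaction are omitted here; the design matrix and tuning values are placeholders.

```python
# Hedged sketch: L1-penalized quantile regression via direct minimization of the
# check loss (toy data; not the paper's estimator for the PA.3 trial).
import numpy as np
from scipy.optimize import minimize

def check_loss(u, tau):
    return np.sum(u * (tau - (u < 0)))          # pinball / check loss

def objective(beta, X, y, tau, lam):
    resid = y - X @ beta
    return check_loss(resid, tau) + lam * np.sum(np.abs(beta[1:]))   # intercept unpenalized

rng = np.random.default_rng(0)
n, p = 200, 6
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])       # toy design
y = X @ np.array([1.0, 2.0, 0.0, 0.0, -1.5, 0.0]) + rng.normal(size=n)

tau, lam = 0.5, 5.0
fit = minimize(objective, np.zeros(p), args=(X, y, tau, lam), method="Powell")
print(np.round(fit.x, 2))   # sparse-ish coefficient estimates at the median
```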
Background: It's crucial to study the effect of changes in thresholds (T) and most comfortable levels (M) on behavioral measurements in young children using cochlear implants. This would help the clinician with the optimization and validation of programming parameters. Objective: The study attempted to describe the changes in behavioral responses with modification of T and M levels. Methods: Twenty-five participants in the age range of 5 to 12 years using HR90K/HiFocus1J or HR90KAdvantage/HiFocus1J with Harmony speech processors participated in the study. A decrease in T levels, a rise in T levels, or a decrease in M levels in the everyday program was used to create experimental programs. Sound field thresholds and speech perception were measured at 50 dB HL for the three experimental programs and the everyday program. Conclusion: The results indicated that only reductions of M levels resulted in significantly (p < 0.01) poorer aided thresholds and speech perception. On the other hand, variation in T levels did not produce significant changes in either sound field thresholds or speech perception. The results highlight that M levels must be correctly established in order to prevent decreased speech perception and audibility.
The oilseed crop Camelina sativa exhibits salinity tolerance, but the effects on early growth stages across a range of different salts, and in combination with salicylic acid (SA), have not been thoroughly evaluated. In this study, seeds were germinated in varying concentrations of six salts (NaCl, CaCl_(2), ZnCl_(2), KCl, MgSO_(4), and Na_(2)SO_(4)) with or without 0.5 mM SA. Using the halotime model, we estimated salt thresholds for germination and parameters of seedling growth. Germination and seedling growth parameters of camelina decreased significantly with increasing salt concentration across all salt types. Salts containing Zn and SO_(4) were the most detrimental to germination and seedling growth. Except for KCl, 0.5 mM SA generally reduced the salinity tolerance threshold (Salt_b(50)) of camelina. Specifically, Salt_b(50) was 21.5% higher for KCl and 16.1%, 25.0%, 54.9%, 21.0%, and 5.6% lower for CaCl_(2), NaCl, MgSO_(4), Na_(2)SO_(4), and ZnCl_(2), respectively, when 0.5 mM SA was compared to 0 mM SA. Furthermore, camelina seedling growth was consistently more sensitive than germination across all salt types. SA did not significantly enhance germination or seedling growth and was harmful when combined with certain salts or at the germination stage. It can be concluded that both the type of salt and the concentration of SA are as critical as the salt concentration in saline irrigation water.
The flow behaviors of gas and water in hydrate-bearing sediments (HBS) are significantly affected by the threshold pressure gradient (TPG). During long-term natural gas hydrate (NGH) production, creep deformation occurs in HBS, which significantly alters pore structures, makes fluid flow paths more complex, and leads to changes in TPG. Clarifying the evolution of TPG in HBS during creep is therefore essential for NGH production, but it also poses enormous challenges. In this study, based on a nonlinear creep constitutive model, a novel theoretical TPG model of HBS during creep is proposed that considers pore structures and hydrate pore morphology. The established model is validated against experimental data, demonstrating its ability to capture the evolution of TPG and permeability in HBS during creep. Additionally, the relationship between initial hydrate saturation and the TPG of HBS during creep is revealed by sensitivity analysis. The creep strain increases as the initial hydrate saturation decreases, leading to a greater TPG and a lower permeability. The evolution of TPG at the stable creep stage and the accelerated creep stage is primarily controlled by the Kelvin element and the visco-plastic element, respectively. The proposed model provides a mechanistic understanding of TPG evolution in HBS during creep and is of great significance for optimizing the exploitation of NGHs.
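For context, the role of the TPG in the flow law can be written in the common low-velocity non-Darcy form below; the paper's contribution is how the parameters evolve with creep, which this generic expression does not capture.

```latex
% A common threshold-pressure-gradient modification of Darcy's law (1-D form):
\[
  v \;=\;
  \begin{cases}
    -\dfrac{k}{\mu}\,\dfrac{\mathrm{d}p}{\mathrm{d}x}
      \left(1 - \dfrac{G}{\left|\mathrm{d}p/\mathrm{d}x\right|}\right),
      & \left|\dfrac{\mathrm{d}p}{\mathrm{d}x}\right| > G,\\[8pt]
    0, & \text{otherwise},
  \end{cases}
\]
% where k is permeability, \mu the fluid viscosity and G the threshold pressure
% gradient; creep-induced changes in pore structure enter through k and G.
```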