Ribonucleic acid (RNA) contact prediction holds great significance for modeling RNA 3D structures and further understanding RNA biological functions. The rapid growth of RNA sequencing data has driven the development of diverse computational methods for RNA contact prediction, and a benchmark evaluation of these methods remains essential. In this work, we first classified RNA contact prediction methods into statistical inference-based and neural network-based ones. We then evaluated eight state-of-the-art methods on three test sets: a sequence-diverse set, a structurally non-redundant set and a CASP RNA targets set. Our evaluation shows that for identifying non-local and long-range contacts, neural network-based methods outperform statistical inference-based ones, with SPOT-RNA-2D achieving the best performance, followed by CoCoNet and RNAcontact. However, for identifying the long-range tertiary contacts, which are vital for stabilizing RNA tertiary structure, statistical inference-based methods exhibit superior performance, with GREMLIN emerging as the top performer. This work provides a comprehensive benchmarking of RNA contact prediction methods, highlighting their strengths and limitations to guide further methodological improvements and applications in RNA structure modeling.
In radio astronomy, radio frequency interference (RFI) is an increasingly serious problem for radio observational facilities. RFI hampers the search for and study of interesting astronomical objects, so mitigating it is an essential procedure in any survey data processing. The Five-hundred-meter Aperture Spherical radio Telescope (FAST) is an extremely sensitive radio telescope, and an effective and precise RFI mitigation method is needed for FAST data processing. In this work, we introduce a method to mitigate the RFI in FAST spectral observations and compile RFI statistics from ~300 h of FAST data. The details are as follows. First, according to the characteristics of FAST spectra, we propose to use the Asymmetrically Reweighted Penalized Least Squares algorithm for baseline fitting; our test results show that it performs well. Second, we flag the RFI with four strategies: flagging extremely strong RFI, long-lasting RFI, polarized RFI, and beam-combined RFI. The test results show that all RFI above a preset threshold can be flagged. Third, we compute the probabilities of polarized XX and YY RFI in FAST observations. These statistics indicate which frequencies are relatively quiescent, allowing us to avoid such frequencies in our spectral observations. Finally, based on the ~300 h of FAST data, we obtained an RFI table, which is currently the most complete such database for FAST.
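The Asymmetrically Reweighted Penalized Least Squares (arPLS) baseline-fitting step mentioned above can be sketched as follows. This is a minimal, illustrative implementation of the published arPLS algorithm, not the authors' FAST pipeline; the function name and the parameter defaults (`lam`, `ratio`) are our own choices.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def arpls_baseline(y, lam=1e5, ratio=1e-6, n_iter=50):
    """Asymmetrically Reweighted Penalized Least Squares baseline fit.

    Samples far above the running baseline (spectral lines, RFI) receive
    weights near zero; samples near or below it keep weights near one."""
    n = len(y)
    # second-difference operator for the smoothness penalty
    D = sparse.diags([1.0, -2.0, 1.0], [0, 1, 2], shape=(n - 2, n))
    H = lam * (D.T @ D)
    w = np.ones(n)
    for _ in range(n_iter):
        W = sparse.diags(w)
        z = spsolve(sparse.csc_matrix(W + H), w * y)
        d = y - z
        dn = d[d < 0]                      # residuals below the baseline
        if dn.size == 0:
            break
        m, s = dn.mean(), dn.std()
        if s == 0:
            break
        # logistic reweighting from the negative-residual statistics
        w_new = 1.0 / (1.0 + np.exp(2.0 * (d - (2.0 * s - m)) / s))
        if np.linalg.norm(w - w_new) / np.linalg.norm(w) < ratio:
            w = w_new
            break
        w = w_new
    return z
```

Because a straight line incurs zero second-difference penalty, a slowly varying baseline is recovered faithfully while narrow positive features are ignored.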
In this study, geochemical anomaly separation was carried out with methods based on the distribution model, which include the probability diagram (MPD), fractal (concentration-area technique), and U-statistic methods. The main objective is to evaluate the efficiency and accuracy of these methods in separating anomalies associated with shear-zone gold mineralization. For this purpose, samples were taken from the secondary lithogeochemical environment (stream sediment samples) over the gold mineralization in Saqqez, NW Iran. Interpretation of the histograms and diagrams showed that the MPD is capable of identifying two phases of mineralization. The fractal method could separate only one phase of change, based on the fractal dimension of the high-concentration areas of Au. The spatial analysis showed two mixed subpopulations after U=0 and another subpopulation with very high U values. The MPD analysis followed the spatial analysis, which shows the detail of the variations. Six mineralized zones detected from local geochemical exploration results were used to validate the methods mentioned above. The MPD method was able to identify more than 90% of the anomalous areas, whereas the other two methods identified at most 60%. The MPD method used the raw data, without any estimation of the concentrations, and required a minimum of calculations to determine the threshold values; it is therefore more robust than the other methods. The spatial analysis identified the details of the geological and mineralization events that affected the study area. MPD is recommended as the best method, and spatial U-analysis as the next most reliable. The fractal method could show more detail of the events and variations in the area with an asymmetrical grid net and a higher sampling density, or at the detailed exploration stage.
The quality of low frequency electromagnetic data is affected by spike and trend noises. Failure to remove the spikes and trends reduces the credibility of data interpretation. Based on analyses of the causes and characteristics of these noises, this paper presents a preset statistics stacking method (PSSM) and a piecewise linear fitting method (PLFM) for de-noising the spikes and trends, respectively. The magnitudes of the spikes are either higher or lower than the normal values, which distorts the useful signal. Comparisons among the average, statistics and PSSM methods in removing the spikes indicate that only the PSSM can remove them successfully. On the other hand, the spectra of the linear and nonlinear trends mainly lie in the low frequency band and can change the calculated resistivity significantly; no influence of the trends is observed when the frequency is higher than a certain threshold. The PLFM can effectively remove both linear and nonlinear trends, with errors around 1% in the power spectrum. The proposed methods present an effective way to de-noise spike and trend noises in low frequency electromagnetic data, and establish a research basis for de-noising low frequency noises.
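As a generic illustration of threshold-based spike flagging (a simple robust scheme, not the paper's PSSM), one can flag samples whose deviation from the median exceeds a multiple of the robust standard deviation and patch them by interpolation; all names here are ours.

```python
import numpy as np

def despike_mad(x, k=5.0):
    """Flag samples deviating from the median by more than k robust
    sigmas (1.4826 * MAD) and replace them by linear interpolation."""
    med = np.median(x)
    sigma = 1.4826 * np.median(np.abs(x - med))  # robust std estimate
    bad = np.abs(x - med) > k * sigma
    xi = np.arange(x.size)
    y = x.copy()
    y[bad] = np.interp(xi[bad], xi[~bad], x[~bad])
    return y, bad
```

The median/MAD statistics are insensitive to the spikes themselves, which is why this works where a plain mean/standard-deviation threshold would be dragged toward the outliers.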
We collected the basic parameters of 231 supernova remnants (SNRs) in our Galaxy, namely, distances (d) from the Sun, linear diameters (D), Galactic heights (Z), estimated ages (t), luminosities (L), surface brightnesses (Σ) and flux densities (S_1) at 1 GHz, and spectral indices (α). We tried to find possible correlations between these parameters. As expected, the linear diameters were found to increase with age for the shell-type remnants, and also to tend to increase with Galactic height. Both the surface brightness and the 1 GHz luminosity of SNRs tend to decrease with linear diameter and with age. No other relations between the parameters were found.
In this study, methods based on the distribution model (with and without personal opinion) were used to separate anomalous zones: the U-spatial statistics method and the mean plus multiples of the standard deviation (X+nS). The primary purpose is to compare the results of these methods with each other. To increase the accuracy of the comparison, regional geochemical data were used from an area where occurrences and mineralization zones of epithermal gold have been reported. The study area is part of the Hashtjin geological map, which structurally belongs to the folded and thrust belt and to the Alborz Tertiary magmatic complex. Samples were taken from secondary lithogeochemical environments, and Au data related to epithermal gold reserves were used to investigate the efficacy of the two methods. In the U-spatial statistics method, specific criteria were used to determine the threshold; in the X+nS method, the element enrichment index of the rock units of the region was obtained by grouping these units, and the anomalous areas were identified by the corresponding criteria. The methods were compared with respect to the positions of the discovered occurrences and those obtained from the methods, the flexibility of the methods in separating the anomalous zones, and the two-dimensional spatial correlation of As, Pb and Ag with Au. The ability of both methods to identify potential areas is acceptable. Among them, the method based on these criteria seems to have a high degree of flexibility in separating anomalous regions in the case of epithermal-type gold deposits.
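The X+nS idea reduces to computing thresholds at the sample mean plus multiples of the sample standard deviation and flagging values above them; a minimal sketch (function and variable names are our own, not from the paper):

```python
import numpy as np

def xns_thresholds(values, ns=(1, 2, 3)):
    """Background/anomaly thresholds at mean + n * standard deviation."""
    x_bar = values.mean()
    s = values.std(ddof=1)          # sample standard deviation
    return {n: x_bar + n * s for n in ns}

def flag_anomalies(values, threshold):
    """Boolean mask of samples above a chosen threshold."""
    return values > threshold
```

Higher n gives stricter thresholds, so the nested masks for n = 1, 2, 3 separate progressively more anomalous zones.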
Existing theory and models suggest that a Type I (merger) GRB should have a larger jet beaming angle than a Type II (collapsar) GRB, but so far no statistical evidence has been available to support this suggestion. In this paper, we obtain a sample of 37 beaming angles and calculate the probability that the suggestion is true. A correction is also devised to account for the scarcity of Type I GRBs in our sample. The probability is 83% without the correction and 71% with it.
“Four classes of enterprises above designated size” (hereinafter, four-classes enterprises) are objects of statistical survey that have reached a certain scale in China’s current statistical method system. They cover four classes of the national economy, namely industrial enterprises above designated size; construction and real estate development and management enterprises above qualification grade; wholesale and retail, catering and accommodation enterprises above designated size; and service enterprises above designated size, which form the primary part of national economic and social development activities. This paper analyzes the practice of, and difficulties in, the current statistical work on four-classes enterprises, and then proposes some recommendations.
Meta-analysis plays a crucial role in synthesizing evidence across studies, yet its application to rare diseases poses unique methodological challenges. A major limitation is the small sample size typical of rare disease studies, which undermines statistical power and increases uncertainty in pooled estimates. Publication bias is particularly pronounced, as studies with non-significant or negative results are less likely to be published, distorting the overall evidence base. High heterogeneity in study designs, populations, and outcomes, especially between observational studies and randomized controlled trials, further complicates the integration of findings. Additionally, rare disease datasets are often characterized by sparse data, including zero-event studies, which are difficult to analyse using traditional meta-analytic approaches. The frequent use of inappropriate statistical methods, such as fixed-effects models in the presence of heterogeneity or continuity corrections for zero-event data, can yield misleading results. These issues collectively limit the generalisability of meta-analytic conclusions to broader patient populations. This article critically evaluates these problems and highlights the need for advanced statistical techniques, rigorous study selection, and transparent reporting standards to enhance the validity and utility of meta-analyses in rare disease research.
The solar corona is the primary driver of solar storms and space weather. Routinely monitoring coronal activity requires a dedicated solar telescope: a coronagraph. However, the site conditions suitable for a ground-based coronagraph are critical and necessitate stringent site selection criteria. Among the numerous astronomical site parameters, the daytime sky background brightness is recognized as a key parameter in evaluating the quality of a solar observatory for coronal observation before installing a coronagraph. To achieve optimal sky brightness conditions, coronagraphs are usually installed on high mountains, where the atmosphere is thin and background scattered light from the sky is significantly reduced. Given the unique characteristics of the Tibetan Plateau, often referred to as the “Third Pole” of the Earth, a scientific assessment is crucial to evaluate its suitability as a potential location for the next-generation Chinese Giant Solar Telescope and large coronagraphs. Here, we report the first results of our measurements of the sky brightness at the Namco Lake site (30°45.0′N, 90°40.0′E, 4730 m above sea level, the highest large lake in the world), using a new Sky Brightness Monitor, during the period 2013 August to December. The results show that the average normalized (per airmass) sky brightness is as low as 13.84 μI_☉ at 530 nm. The excellent sky brightness conditions at Namco Lake demonstrate its high suitability for routine coronal observations. The Namco site therefore deserves attention for solar observations, and in-depth monitoring and systematic study of multiple site parameters should continue in the future.
Solar magnetic field measurements mainly use the Zeeman effect, but this method has two problems: low accuracy of the transverse magnetic field components, and a 180° ambiguity. Multi-perspective observations can increase the measurement accuracy and resolve the ambiguity. This study investigates how combined observations from the Sun-Earth L5 point, the Sun-Earth line, and solar polar-orbiting satellites improve the accuracy of the transverse solar magnetic field under different satellite positional configurations. A three-satellite model is developed using spherical trigonometry to establish the coordinate relationships, and error propagation formulas are applied to correct transverse field measurement errors. The magnetic field measurement error distribution of the Helioseismic and Magnetic Imager is analyzed, and the magnetograms from the three satellites are simulated. The improvement in the transverse field accuracy under various satellite configurations is then assessed from the simulation results. The results show that multi-perspective measurements can reduce the transverse component errors ΔB_x to approximately 10% and ΔB_y to about 15%, compared with the error from a single satellite. An optimally designed polar orbit can decrease the transverse field error by nearly an order of magnitude for 80% of its operation time.
We perform a time-resolved statistical study of GRB 221009A’s X-ray emission using Swift XRT Photon Counting and Windowed Timing data. After standard reduction (barycentric correction, pile-up correction, and background subtraction via HEASOFT), we extracted light curves for each observational ID and for their aggregation. Count-rate histograms were fitted with various statistical distributions; fit quality was assessed by chi-squared and the Bayesian Information Criterion (BIC). The first observational segment is best described by a Gaussian distribution (χ²=68.4; BIC=7651.2), and the second by a Poisson distribution (χ²=33.5; BIC=4413.3). When all segments are combined, the lognormal model provides the superior fit (χ²=541.9; BIC=34365.5), indicating that the full data set’s count rates exhibit the skewness expected from a multiplicative process. These findings demonstrate that while individual time intervals conform to discrete or symmetric statistics, the collective emission profile across multiple observations is better captured by a lognormal distribution, consistent with complex, compounded variability in GRB afterglows.
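The model-selection step can be sketched generically: fit each candidate distribution by maximum likelihood and rank by BIC = k ln n − 2 ln L, where lower is better. This is an illustrative scipy sketch fitting the raw samples rather than binned histograms, not the authors' pipeline; the function name is ours.

```python
import numpy as np
from scipy import stats

def bic_rank(data, dists=(("norm", stats.norm), ("lognorm", stats.lognorm))):
    """Fit each candidate distribution by maximum likelihood and return
    its BIC = k * ln(n) - 2 * ln(L); the lowest BIC wins."""
    n = len(data)
    bic = {}
    for name, dist in dists:
        params = dist.fit(data)                       # MLE fit
        loglik = np.sum(dist.logpdf(data, *params))   # ln L at the MLE
        bic[name] = len(params) * np.log(n) - 2.0 * loglik
    return bic
```

Because BIC penalizes each extra parameter by ln(n), it guards against the more flexible model winning purely on parameter count.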
In this study, we investigate the potential of mark-weighted angular correlation functions, which integrate β-cosmic-web classification with angular correlation function analysis, to improve cosmological constraints. Using SDSS DR12 CMASS-NGC galaxies and mock catalogs with Ω_m varying from 0.25 to 0.40, we assess the discriminative power of different statistics via the average improvement in chi-squared, Δχ², across six redshift bins. This metric quantifies how effectively each statistic distinguishes between cosmological models. Incorporating cosmic-web weights leads to substantial improvements: weighting by the mean neighbor distance (D_nei) increases Δχ² by approximately 40%–130%, while inverse mean neighbor distance weighting (1/D_nei) yields even larger gains, boosting Δχ² by a factor of 2–3 compared with traditional unweighted angular statistics. These enhancements are consistent with previous 3D clustering results, demonstrating the superior sensitivity of the β-weighted approaches. Our method, based on thin redshift slices, is particularly suitable for slitless surveys (e.g., Euclid, CSST), where redshift uncertainties limit 3D analyses. This study also offers a framework for applying marked statistics to 2D angular clustering.
To take into account the influence of uncertainties on the dynamic response of a vibro-acoustic structure, a hybrid modeling technique combining the finite element (FE) method and statistical energy analysis (SEA) is proposed to analyze vibro-acoustic responses with uncertainties at middle frequencies. The mid-frequency dynamic response of a framework-plate structure with uncertainties is studied with the hybrid FE-SEA method, and a Monte Carlo (MC) simulation is performed to provide a benchmark comparison. The energy response of the framework-plate structure matches the MC simulation results well, which validates the effectiveness of the hybrid FE-SEA method in handling both the complexity of the vibro-acoustic structure and the uncertainties in mid-frequency vibro-acoustic analysis. Based on the hybrid method, a vibro-acoustic model of a construction machinery cab with random properties is established, and the excitations of the model are measured by experiment. The sound pressure level response of the cab and the vibration power spectral density of the front windscreen are calculated and compared with the experimental results. At middle frequencies, the results agree well with the tests, and the prediction error is less than 3.5 dB.
Aim: To improve the efficiency of fatigue material tests and the associated statistical treatment of test data. Methods: A least squares approach and other special treatments were used. Results and Conclusion: The concepts of each phase in fatigue testing and statistical treatment are clarified. The proposed method has three important properties. The reduced number of specimens lowers test expenditures, and the whole test procedure is more flexible, since there is no need to conduct many tests at the same stress level as in the traditional approach.
Statistical approaches for evaluating causal effects and for discovering causal networks are discussed in this paper. A causal relation between two variables is different from an association or correlation between them. An association measurement between two variables may change dramatically, even from positive to negative, when a third variable is omitted; this is called the Yule-Simpson paradox. We discuss how to evaluate the causal effect of a treatment or exposure on an outcome so as to avoid Yule-Simpson-type phenomena. Surrogates and intermediate variables are often used to reduce measurement costs or duration when measurement of the endpoint variables is expensive, inconvenient, infeasible or unobservable in practice. Many criteria for surrogates have been proposed. However, it is possible that, for a surrogate satisfying these criteria, a treatment has a positive effect on the surrogate, which in turn has a positive effect on the outcome, and yet the treatment has a negative effect on the outcome; this is called the surrogate paradox. We discuss criteria for surrogates that avoid the surrogate paradox. Causal networks, which describe the causal relationships among a large number of variables, have been applied in many research fields, and it is important to discover their structures from observed data. We propose a recursive approach for discovering a causal network, in which the structural learning of a large network is decomposed recursively into the learning of small networks. To discover causal relationships further, we present an active learning approach based on external interventions on some variables. When we focus on the causes of an outcome of interest, instead of discovering a whole network, we propose a local learning approach to discover these causes.
Using a GIS spatial statistical analysis method, with ArcGIS software as the analysis tool and the diseased maize in Hedong District of Linyi City as the study object, the spatial distribution characteristics of the diseased crops were analyzed. The results showed that the diseased crops were mainly distributed along river tributaries and the downstream reaches of the main rivers. The correlation between adjacent diseased plots was small, so infection by pests and diseases was excluded, and the major cause of the incidence may be river pollution.
Coal oxidation at low temperatures is the heat source responsible for the self-heating and spontaneous combustion of coal, a phenomenon that has imposed severe problems on coal-related industries. Previous attempts to understand this phenomenon have produced significant progress. It is well known that coal oxidation at low temperatures involves oxygen consumption and the formation of gaseous and solid oxidation products, a process influenced mainly by temperature, the oxidation history of the coal, coal properties, the particle size distribution of the coal, etc. The current understanding of the self-heating and spontaneous combustion of coal is discussed, along with the different experimental and numerical models established to predict the self-heating characteristics of coal. This paper focuses on the global state of the research carried out by academia, research institutes and industry on the spontaneous combustion of coal and coal mine fires. Within this framework, the commonly used techniques for predicting the spontaneous combustion liability of coal are evaluated. These techniques are widely used, but no specific method has become a standard for predicting spontaneous combustion liability. Further study is still needed to address a number of outstanding issues and to obtain a more complete understanding of the phenomenon.
Different artificial intelligence (AI) methods have been applied to various aspects of rock mechanics, but the fact that none of these methods has become a standard implies that doubt as to their generality and validity still exists. We therefore present a literature review of the application of AI to rock mechanics, based on comprehensive study of research published in the top journals in rock mechanics and computer applications in engineering, as well as textbooks. The performance of the AI methods used in rock mechanics applications was evaluated. The review shows that AI methods have been used successfully to solve various problems in rock mechanics and have performed better than traditional empirical, mathematical or statistical methods. However, their practical applicability remains a concern, as many of the existing AI models require some level of expertise before they can be used, because they are not in the form of tractable mathematical equations; some advanced AI methods thus remain to be explored. The limited availability of datasets for AI simulations is also identified as a major problem. Solutions to the identified problems and possible future research directions are proposed.
As the performance of dedicated facilities has continually improved, large numbers of pulsar candidates are being collected, which makes selecting valuable pulsar signals from the candidates challenging. In this paper, we describe the design of a deep convolutional neural network (CNN) with 11 layers for classifying pulsar candidates. Instead of artificially designed features, the CNN takes the sub-integrations plot and the sub-bands plot of each candidate as inputs, without introducing bias. To address the class imbalance problem, a data augmentation method based on synthesizing minority-class samples is proposed, tailored to the characteristics of pulsars: the maximum pulses of pulsar candidates are first translated to the same position, and new samples are then generated by summing multiple subplots of pulsars. The augmentation method is simple and effective for obtaining varied and representative samples that preserve pulsar characteristics. In experiments on the HTRU 1 dataset, the model achieves a recall of 0.962 and a precision of 0.963.
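The augmentation idea, peak alignment followed by averaging of minority-class samples, can be sketched in one dimension; this is a simplified stand-in for the paper's 2-D sub-integration/sub-band plots, and all names are our own.

```python
import numpy as np

def center_peak(profile):
    """Roll a 1-D profile so its maximum lands at the array centre."""
    return np.roll(profile, profile.size // 2 - int(np.argmax(profile)))

def synthesize_sample(profiles, rng, n_mix=2):
    """New minority-class sample: average n_mix peak-aligned profiles."""
    idx = rng.choice(len(profiles), size=n_mix, replace=False)
    return np.mean([center_peak(profiles[i]) for i in idx], axis=0)
```

Aligning the peaks before averaging is what keeps the synthetic sample pulsar-like: without the roll, averaging profiles with peaks at different phases would smear the pulse away.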
Funding: Supported by grants from the National Natural Science Foundation of China (Grant No. 12205223 to YLT, Grant No. 12375038 to ZJT and Grant No. 11605125 to YZS) and the Department of Education of Hubei Province (Grant No. Q20221705 to YLT).
Funding: Supported by the National Key R&D Program of China (2018YFE0202900), the NAOC Nebula Talents Program, and the Cultivation Project for FAST Scientific Payoff and Research Achievement of CAMS-CAS.
Abstract: In this study, geochemical anomaly separation was carried out with methods based on the distribution model, including the probability diagram (MPD), fractal (concentration-area technique), and U-statistic methods. The main objective is to evaluate the efficiency and accuracy of these methods in separating anomalies for shear-zone gold mineralization. For this purpose, samples were taken from the secondary lithogeochemical environment (stream sediment samples) over the gold mineralization in Saqqez, NW Iran. Interpretation of the histograms and diagrams showed that MPD is capable of identifying two phases of mineralization. The fractal method could separate only one phase of change, based on the fractal dimension, in areas with high Au concentrations. The spatial analysis showed two mixed subpopulations after U=0 and another subpopulation with very high U values. The MPD analysis followed the spatial analysis, revealing the details of the variations. Six mineralized zones detected from local geochemical exploration results were used to validate the methods mentioned above. The MPD method identified more than 90% of the anomalous areas, whereas the other two methods identified at most 60%. The MPD method uses the raw data, without any estimation of the concentration, and a minimum of calculation to determine the threshold values; it is therefore more robust than the other methods. The spatial analysis identified the details of the geological and mineralization events that affected the study area. MPD is recommended as the best method, and spatial U-analysis as the next most reliable. The fractal method could reveal more detail of the events and variations in the area with an asymmetrical grid net and a higher sampling density, or at the detailed exploration stage.
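The concentration-area fractal technique mentioned above can be sketched in a few lines. This is a simplified illustration: the quantile-based threshold grid and the single power-law fit are assumptions (a full C-A analysis fits separate straight segments in the log-log plot and reads anomaly thresholds off their breakpoints).

```python
import numpy as np

def concentration_area(values, n_thresholds=20):
    """Concentration-Area (C-A) fractal sketch.

    For each concentration threshold c, A(c) is the number of grid cells
    with value >= c. Straight-line segments with different slopes in the
    log-log C-A plot separate background from anomalous populations; here
    we return the curve and its overall log-log slope."""
    v = np.asarray(values, dtype=float).ravel()
    thresholds = np.quantile(v, np.linspace(0.05, 0.95, n_thresholds))
    areas = np.array([(v >= c).sum() for c in thresholds], dtype=float)
    slope, _ = np.polyfit(np.log10(thresholds), np.log10(areas), 1)
    return thresholds, areas, slope
```

Applied to a gridded element map (e.g. interpolated Au stream-sediment values), `areas` decreases monotonically with the threshold, and changes in the local slope flag the transition from background to anomaly.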
Abstract: The quality of low-frequency electromagnetic data is degraded by spike and trend noise, and failure to remove the spikes and trends reduces the credibility of data interpretation. Based on analyses of the causes and characteristics of these noises, this paper presents a preset statistics stacking method (PSSM) and a piecewise linear fitting method (PLFM) for removing the spikes and trends, respectively. Spike magnitudes are either higher or lower than the normal values, which distorts the useful signal. Comparisons of spike removal among the averaging, statistics, and PSSM methods indicate that only the PSSM removes the spikes successfully. On the other hand, the spectra of the linear and nonlinear trends lie mainly in the low-frequency band and can change the calculated resistivity significantly; no influence of the trends is observed above a certain threshold frequency. The PLFM effectively removes both linear and nonlinear trends, with errors of around 1% in the power spectrum. The proposed methods offer an effective way to remove spike and trend noise from low-frequency electromagnetic data and establish a basis for further research on de-noising low-frequency data.
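The abstract does not give the PSSM's exact statistics, but the general idea of removing spikes by stacking repeated recordings with a robust threshold can be conveyed with a short sketch. The function below and its MAD-based threshold are illustrative stand-ins, not the paper's method.

```python
import numpy as np

def despike_stack(segments, k=5.0):
    """Robust-stacking de-spiker (illustrative stand-in for the PSSM).

    Across repeated recordings of the same segment, samples deviating from
    the per-sample median by more than k scaled MADs are treated as spikes
    and excluded from the stacked average."""
    x = np.asarray(segments, dtype=float)               # (n_repeats, n_samples)
    med = np.median(x, axis=0)
    mad = 1.4826 * np.median(np.abs(x - med), axis=0)   # ~sigma for Gaussian noise
    keep = np.abs(x - med) <= k * mad + 1e-12
    clean = np.where(keep, x, np.nan)
    return np.nanmean(clean, axis=0)                    # spike-free stacked trace
```

Because the median and MAD are insensitive to a few outliers, a spike in one repeat is masked while the remaining repeats still contribute to the stacked trace, which is why plain averaging fails where robust stacking succeeds.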
Funding: supported by the National Natural Science Foundation of China.
Abstract: We collected the basic parameters of 231 supernova remnants (SNRs) in our Galaxy, namely distances (d) from the Sun, linear diameters (D), Galactic heights (Z), estimated ages (t), luminosities (L), surface brightnesses (Σ), flux densities (S_1) at 1 GHz, and spectral indices (α), and searched for possible correlations between these parameters. As expected, the linear diameters were found to increase with age for the shell-type remnants, and they also tend to increase with Galactic height. Both the surface brightness and the 1 GHz luminosity of SNRs tend to decrease with linear diameter and with age. No other relations between the parameters were found.
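The kind of rank-correlation test behind such a diameter-age relation can be illustrated with synthetic, Sedov-like data. All numbers below are hypothetical and chosen only to mimic a power-law trend with scatter; this is not the paper's catalog.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
t = rng.uniform(1e3, 1e5, 200)                            # ages in years (synthetic)
D = 5.0 * (t / 1e4) ** 0.4 * rng.lognormal(0.0, 0.2, 200) # diameters in pc (synthetic)

# Spearman's rho is appropriate here: it tests for any monotonic D-t
# relation without assuming linearity, and is robust to outliers.
rho, pval = stats.spearmanr(t, D)
```

A strongly positive `rho` with a tiny `pval` would correspond to the reported tendency of shell-type remnant diameters to grow with age.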
Abstract: In this study, methods based on the distribution model (with and without personal opinion) were used to separate anomalous zones: U-spatial statistics and the mean plus standard deviation (X+nS) method. The primary purpose is to compare the results of these methods with each other. To increase the accuracy of the comparison, regional geochemical data were used from an area with known occurrences and mineralization zones of epithermal gold. The study area is part of the Hashtjin geological map, which structurally belongs to the fold-and-thrust belt and to the Alborz Tertiary magmatic complex. Samples were taken from secondary lithogeochemical environments. Au element data for the epithermal gold reserves were used to investigate the efficacy of the two methods. In the U-spatial statistics method, criteria were used to determine the threshold; in the X+nS method, the element enrichment index of the region's rock units was obtained by grouping these units, and the anomalous areas were identified accordingly. The methods were compared with respect to the positions of the discovered occurrences versus those obtained from the methods, the flexibility of the methods in separating anomalous zones, and the two-dimensional spatial correlation of the three elements As, Pb, and Ag with Au. Both methods identify potential areas acceptably well. Of the two, the method based on these criteria appears to have a high degree of flexibility in separating anomalous regions in the case of epithermal-type gold deposits.
Funding: supported by the National Natural Science Foundation of China (Grant No. 10873009) and the National Basic Research Program of China (973 Program, No. 2007CB815404).
Abstract: Existing theory and models suggest that a Type I (merger) GRB should have a larger jet beaming angle than a Type II (collapsar) GRB, but so far no statistical evidence has been available to support this suggestion. In this paper, we obtain a sample of 37 beaming angles and calculate the probability that the suggestion is true. A correction is also devised to account for the scarcity of Type I GRBs in our sample. The probability is calculated to be 83% without the correction and 71% with it.
Abstract: "Four classes of enterprises above designated size" (hereinafter "four-classes enterprises") refers to the objects of statistical survey that have reached a certain scale under China's current statistical method system. They comprise four classes of the national economy: industrial enterprises above designated size; construction and real estate development and management enterprises above qualification grade; wholesale, retail, catering and accommodation enterprises above designated size; and service enterprises above designated size. These enterprises form the primary part of national economic and social development activities. This paper analyzes the practice of, and difficulties in, current statistical work on four-classes enterprises, and then proposes some recommendations.
Abstract: Meta-analysis plays a crucial role in synthesizing evidence across studies, yet its application in the context of rare diseases poses unique methodological challenges. A major limitation is the small sample size typical of rare disease studies, which undermines statistical power and increases uncertainty in pooled estimates. Publication bias is particularly pronounced, as studies with non-significant or negative results are less likely to be published, distorting the overall evidence base. High heterogeneity in study designs, populations, and outcomes, especially between observational studies and randomized controlled trials, further complicates the integration of findings. Additionally, rare disease datasets are often characterized by sparse data, including zero-event studies, which are difficult to analyse using traditional meta-analytic approaches. The frequent use of inappropriate statistical methods, such as fixed-effects models in the presence of heterogeneity or continuity corrections for zero-event data, can yield misleading results. These issues collectively limit the generalisability of meta-analytic conclusions to broader patient populations. This article critically evaluates these problems and highlights the need for advanced statistical techniques, rigorous study selection, and transparent reporting standards to enhance the validity and utility of meta-analyses in rare disease research.
Funding: supported by the National Natural Science Foundation of China (NSFC, grant Nos. 12373063 and 11533009) and by the Deanship of Scientific Research at King Saud University through research group No. RG-1440-092.
Abstract: The solar corona is the primary driver of solar storms and space weather. Routine monitoring of coronal activity requires a dedicated solar telescope: a coronagraph. However, the site conditions suitable for a ground-based coronagraph are critical and necessitate stringent site selection criteria. Among the numerous astronomical observatory site parameters, daytime sky background brightness is recognized as the key parameter for evaluating the quality of a solar observatory for coronal observation before installing a coronagraph. To achieve optimal sky brightness conditions, coronagraphs are usually installed on high mountains, where the atmosphere is thin and background scattered light from the sky is significantly reduced. Given the unique characteristics of the Tibetan Plateau, often referred to as the "Third Pole" of the Earth, a scientific assessment is crucial to evaluate its suitability as a potential location for the next-generation Chinese Giant Solar Telescope and large coronagraphs. Here we report the first results of our sky brightness measurements at the Namco Lake site (30°45.0′N, 90°40.0′E, 4730 m above sea level, beside the highest large lake in the world), obtained with a new Sky Brightness Monitor over the period 2013 August to December. The results show that the average normalized (per airmass) sky brightness is as low as 13.84 μI_☉ at 530 nm. These excellent sky brightness conditions demonstrate that the Namco Lake site is highly suitable for routine coronal observations. The Namco site therefore deserves attention for solar observations, and in-depth monitoring and systematic study of multiple site parameters should continue in the future.
Funding: supported by the National Key R&D Program of China (2022YFF0503004, 2020YFC2201200 and 2022YFF0503800), the National Natural Science Foundation of China (NSFC, grant No. 12333009), and the National Space Administration of China under the Pre-research Program of Civil Space Technology, "Design and Key Technologies of the Solar All-round Stereoscopic Detection System".
Abstract: Solar magnetic field measurements mainly use the Zeeman effect, but this method has two problems: low accuracy in the transverse magnetic field components and a 180° ambiguity. Multi-perspective observations can increase the measurement accuracy and resolve the ambiguity. This study investigates how combined observations from the Sun-Earth L5 point, the Sun-Earth line, and solar polar-orbiting satellites improve the accuracy of the transverse solar magnetic field under different satellite positional configurations. A three-satellite model is developed using spherical trigonometry to establish the coordinate relationships, and error propagation formulas are applied to correct the transverse field measurement errors. The magnetic field measurement error distribution of the Helioseismic and Magnetic Imager is analyzed, and the magnetograms from the three satellites are simulated. The improvement in transverse field accuracy under various satellite configurations is then assessed from the simulation results. The results show that multi-perspective measurements can reduce the transverse component errors ΔB_x to approximately 10% and ΔB_y to about 15% of the error from a single satellite. An optimally designed polar orbit can decrease the transverse field error by nearly an order of magnitude for 80% of its operation time.
Abstract: We perform a time-resolved statistical study of GRB 221009A's X-ray emission using Swift XRT Photon Counting and Windowed Timing data. After standard reduction (barycentric correction, pile-up correction, and background subtraction via HEASOFT), we extracted light curves for each observation ID and for their aggregation. Count-rate histograms were fitted with various statistical distributions, and fit quality was assessed by chi-squared and the Bayesian Information Criterion (BIC). The first observational segment is best described by a Gaussian distribution (χ² = 68.4; BIC = 7651.2), and the second by a Poisson distribution (χ² = 33.5; BIC = 4413.3). When all segments are combined, the lognormal model provides the superior fit (χ² = 541.9; BIC = 34365.5), indicating that the full data set's count rates exhibit the skewness expected from a multiplicative process. These findings demonstrate that while individual time intervals conform to discrete or symmetric statistics, the collective emission profile across multiple observations is better captured by a lognormal distribution, consistent with complex, compounded variability in GRB afterglows.
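The model-comparison step (fit each candidate distribution by maximum likelihood, then pick the lowest BIC) can be sketched as follows. The synthetic count rates and the restriction to continuous models here are assumptions for illustration; the study's actual histograms and the Poisson (discrete) fit are not reproduced.

```python
import numpy as np
from scipy import stats

def bic_of_fit(data, dist, n_params):
    """Fit a continuous scipy.stats distribution by maximum likelihood
    and return its Bayesian Information Criterion: BIC = k ln n - 2 ln L."""
    params = dist.fit(data)
    loglik = np.sum(dist.logpdf(data, *params))
    return n_params * np.log(data.size) - 2.0 * loglik

rng = np.random.default_rng(3)
rates = rng.lognormal(mean=1.0, sigma=0.5, size=2000)  # synthetic skewed "count rates"

bic_gauss = bic_of_fit(rates, stats.norm, 2)     # loc, scale
bic_logn = bic_of_fit(rates, stats.lognorm, 3)   # shape, loc, scale
```

For a skewed sample like this one, the lognormal model yields the lower BIC, mirroring the paper's conclusion for the combined data set.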
Funding: supported by the Ministry of Science and Technology of China (2020SKA0110401, 2020SKA0110402 and 2020SKA0110100), the National Key Research and Development Program of China (2018YFA0404504, 2018YFA0404601 and 2020YFC2201600), the National Natural Science Foundation of China (12373005, 11890691, 12205388, 12220101003 and 12473097), the China Manned Space Project (CMS-CSST-2021-A02, A03 and B01), and the Guangdong Basic and Applied Basic Research Foundation (2024A1515012309).
Abstract: In this study, we investigate the potential of mark-weighted angular correlation functions, which integrate β-cosmic-web classification with angular correlation function analysis, to improve cosmological constraints. Using SDSS DR12 CMASS-NGC galaxies and mock catalogs with Ω_m varying from 0.25 to 0.40, we assess the discriminative power of different statistics via the average improvement in chi-squared, Δχ², across six redshift bins. This metric quantifies how effectively each statistic distinguishes between cosmological models. Incorporating cosmic-web weights leads to substantial improvements: statistics weighted by the mean neighbor distance (D_nei) increase Δχ² by approximately 40%-130%, while inverse mean neighbor distance weighting (1/D_nei) yields even larger gains, boosting Δχ² by a factor of 2-3 compared to traditional unweighted angular statistics. These enhancements are consistent with previous 3D clustering results, demonstrating the superior sensitivity of the β-weighted approaches. Our method, based on thin redshift slices, is particularly suitable for slitless surveys (e.g., Euclid, CSST), where redshift uncertainties limit 3D analyses. This study also offers a framework for applying marked statistics to 2D angular clustering.
Funding: supported by the Science and Technology Support Planning of Jiangsu Province (No. BE2014133), the Open Foundation of the Key Laboratory of Underwater Acoustic Signal Processing (No. UASP1301), and the Prospective Joint Research Project of Jiangsu Province (No. BY2014127-01).
Abstract: To take into account the influence of uncertainties on the dynamic response of a vibro-acoustic structure, a hybrid modeling technique combining the finite element method (FE) and statistical energy analysis (SEA) is proposed to analyze vibro-acoustic responses with uncertainties at middle frequencies. The mid-frequency dynamic response of a framework-plate structure with uncertainties is studied with the hybrid FE-SEA method, and a Monte Carlo (MC) simulation is performed to provide a benchmark comparison. The energy response of the framework-plate structure matches the MC simulation results well, which validates the effectiveness of the hybrid FE-SEA method in handling both the complexity of the vibro-acoustic structure and the uncertainties in mid-frequency vibro-acoustic analysis. Based on the hybrid method, a vibro-acoustic model of a construction machinery cab with random properties is established, with the excitations of the model measured by experiment. The sound pressure level response of the cab and the vibration power spectral density of the front windscreen are calculated and compared with the experimental results. At middle frequencies, the results show good consistency with the tests, with a prediction error of less than 3.5 dB.
Abstract: Aim: To improve the efficiency of fatigue material tests and the associated statistical treatment of test data. Methods: A least-squares approach and other special treatments were used. Results and Conclusion: The concepts of each phase in fatigue testing and statistical treatment are clarified. The proposed method has three important properties. The reduced number of specimens lowers test expenditures, and the whole test procedure gains flexibility because there is no need to conduct many tests at the same stress level, as in the traditional approach.
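The least-squares treatment at the heart of such fatigue data analysis is typically a log-log (Basquin-type) fit of cycles to failure against stress amplitude. The sketch below uses hypothetical stress/cycle data, not the paper's measurements.

```python
import numpy as np

# Basquin-type S-N curve fitted by least squares: log10(N) = a + b*log10(S).
# Stress amplitudes and cycles-to-failure below are hypothetical test data.
stress = np.array([400.0, 350.0, 300.0, 250.0, 200.0])   # MPa
cycles = np.array([1.2e4, 3.1e4, 1.0e5, 4.2e5, 2.0e6])   # cycles to failure

b, a = np.polyfit(np.log10(stress), np.log10(cycles), 1)  # slope b, intercept a

def predicted_life(s_mpa):
    """Predicted cycles to failure at stress amplitude s_mpa (sketch)."""
    return 10.0 ** (a + b * np.log10(s_mpa))
```

Because each specimen contributes one (S, N) point to a single regression, fewer replicate tests at any given stress level are needed than in the classical fixed-level approach, which is the efficiency gain the abstract points to.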
Abstract: Statistical approaches for evaluating causal effects and for discovering causal networks are discussed in this paper. A causal relation between two variables is different from an association or correlation between them. An association measurement between two variables may change dramatically from positive to negative when a third variable is omitted, a phenomenon known as the Yule-Simpson paradox. We discuss how to evaluate the causal effect of a treatment or exposure on an outcome so as to avoid the Yule-Simpson paradox. Surrogates and intermediate variables are often used to reduce measurement costs or duration when measurement of endpoint variables is expensive, inconvenient, infeasible or unobservable in practice. Many criteria for surrogates have been proposed. However, it is possible that, for a surrogate satisfying these criteria, a treatment has a positive effect on the surrogate, which in turn has a positive effect on the outcome, and yet the treatment has a negative effect on the outcome; this is called the surrogate paradox. We discuss criteria for surrogates that avoid the surrogate paradox. Causal networks, which describe the causal relationships among a large number of variables, have been applied in many research fields, and it is important to discover their structures from observed data. We propose a recursive approach for discovering a causal network, in which the structural learning of a large network is decomposed recursively into the learning of small networks. To further discover causal relationships, we present an active learning approach based on external interventions on some variables. When the focus is on the causes of an outcome of interest, instead of discovering a whole network, we propose a local learning approach to discover the causes that affect the outcome.
Abstract: Using a GIS spatial statistical analysis method, with ArcGIS software as the analysis tool and the diseased maize in Hedong District of Linyi City as the study object, the spatial distribution characteristics of the diseased crops were analyzed. The results showed that the diseased crops were mainly distributed along river tributaries and the downstream reaches of main rivers. The correlation between adjacent diseased plots was small, so infection by pests and diseases was excluded, and the major cause of the incidence might be river pollution.
Funding: The work presented in this paper is part of a PhD research study in the School of Mining Engineering at the University of the Witwatersrand, Johannesburg, South Africa.
Abstract: Coal oxidation at low temperatures is the heat source responsible for the self-heating and spontaneous combustion of coal, a phenomenon that has caused severe problems in coal-related industries. Previous attempts to understand this phenomenon have yielded significant progress. It is well known that coal oxidation at low temperatures involves oxygen consumption and the formation of gaseous and solid oxidation products, a process influenced mainly by temperature, the oxidation history of the coal, coal properties, the particle size distribution of the coal, and related factors. The current understanding of the self-heating and spontaneous combustion of coal is discussed, along with the experimental and numerical models established to predict the self-heating characteristics of coal. This paper focuses on the global landscape of work by academics, research institutes and industry on spontaneous combustion of coal and coal mine fires. Within this framework, the commonly used techniques for predicting the spontaneous combustion liability of coal are evaluated. These techniques are widely used, but no specific method has become a standard for predicting spontaneous combustion liability. Further study is still needed to address a number of outstanding issues and to obtain a more complete understanding of the phenomenon.
Funding: This work was supported by the Korea Research Fellowship Program through the National Research Foundation of Korea, funded by the Ministry of Science and ICT (Grant No. 2019H1D3A1A01102993).
Abstract: Different artificial intelligence (AI) methods have been applied to various aspects of rock mechanics, but the fact that none of them has been adopted as a standard implies that doubts about their generality and validity remain. We therefore present a literature review of AI applications in the field of rock mechanics, based on a comprehensive study of research published in the top journals of rock mechanics and of computer applications in engineering, as well as textbooks. The performance of the AI methods used in rock mechanics applications is evaluated. The review shows that AI methods have successfully solved various problems in the rock mechanics field and have performed better than traditional empirical, mathematical or statistical methods. However, their practical applicability remains a concern: many existing AI models require some level of expertise before they can be used, because they are not in the form of tractable mathematical equations, and some advanced AI methods are still unexplored. The limited availability of datasets for AI simulations is also identified as a major problem. Solutions to the identified problems and possible future research directions are subsequently proposed.
Abstract: As the performance of dedicated facilities has continually improved, large numbers of pulsar candidates are being received, which makes selecting valuable pulsar signals from the candidates challenging. In this paper, we describe the design of a deep convolutional neural network (CNN) with 11 layers for classifying pulsar candidates. Instead of artificially designed features, the CNN takes the sub-integration and sub-band plots of each candidate as inputs, without introducing bias. To address the class imbalance problem, a data augmentation method based on synthetic minority samples is proposed according to the characteristics of pulsars: the pulse peaks of the pulsar candidates are first shifted to the same position, and new samples are then generated by adding up multiple subplots of pulsars. The data augmentation method is simple and effective for obtaining varied and representative samples that preserve pulsar characteristics. In experiments on the HTRU 1 dataset, the model achieves a recall of 0.962 and a precision of 0.963.
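The align-then-combine augmentation step described above can be sketched in a few lines of NumPy. The function name is illustrative, and averaging two aligned examples is used here as a normalized variant of the paper's "adding up" of subplots; this is a sketch of the idea, not the authors' code.

```python
import numpy as np

def synthesize_pulsar_sample(subints, rng):
    """Augmentation sketch: roll each sub-integration matrix so its pulse
    peak sits in the central phase bin, then combine two aligned pulsar
    examples into a new, characteristic-preserving minority-class sample."""
    aligned = []
    for m in subints:
        peak = int(np.argmax(m.sum(axis=0)))           # phase bin of the pulse
        aligned.append(np.roll(m, m.shape[1] // 2 - peak, axis=1))
    i, j = rng.choice(len(aligned), size=2, replace=False)
    return 0.5 * (aligned[i] + aligned[j])             # normalized combination
```

Because all pulses are first moved to the same phase bin, the combined sample keeps a single coherent pulse instead of two offset ones, which is what makes the synthetic minority samples look like genuine pulsar candidates.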