Solar magnetic field measurements mainly use the Zeeman effect, but this method has two problems, namely, low accuracy of the transverse magnetic field components and a 180° ambiguity. Multi-perspective observations can increase the measurement accuracy and resolve the ambiguity. This study investigates how combined observations from the Sun-Earth L5 point, the Sun-Earth line, and solar polar-orbiting satellites improve the accuracy of the transverse solar magnetic field under different satellite positional configurations. A three-satellite model is developed using spherical trigonometry to establish the coordinate relationships, and error propagation formulas are applied to correct the transverse field measurement errors. The magnetic field measurement error distribution of the Helioseismic and Magnetic Imager is analyzed, and the magnetograms from the three satellites are simulated. The improvement to the transverse field accuracy under various satellite configurations is then assessed based on the simulation results. The results show that multi-perspective measurements can reduce the transverse component errors ΔB_x to approximately 10% and ΔB_y to about 15%, compared to the error from a single satellite. An optimally designed polar orbit can decrease the transverse field error by nearly an order of magnitude for 80% of its operation time.
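The following is a minimal 2-D sketch, not taken from the paper, of the error-propagation idea described above: each satellite measures its own line-of-sight component with a small error, and the component transverse to the first satellite's line of sight is reconstructed from the two line-of-sight values. The error values, the planar simplification, and the function name are illustrative assumptions only.

```python
import numpy as np

sigma_los = 5.0     # assumed error of a line-of-sight (circular-polarization) measurement
sigma_trans = 50.0  # assumed error of a single-satellite transverse (linear-pol.) measurement

def transverse_error_two_views(gamma_deg):
    """Propagated error of B_y when it is reconstructed from two line-of-sight
    measurements m1 = B_x and m2 = B_x*cos(g) + B_y*sin(g) separated by angle g:
    B_y = (m2 - m1*cos(g)) / sin(g)."""
    g = np.radians(gamma_deg)
    var_by = (sigma_los ** 2 * np.cos(g) ** 2 + sigma_los ** 2) / np.sin(g) ** 2
    return np.sqrt(var_by)

for gamma in (10, 30, 60, 90):
    ratio = transverse_error_two_views(gamma) / sigma_trans
    print(f"separation {gamma:3d} deg: combined / single-satellite transverse error = {ratio:.2f}")
```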
We perform a time-resolved statistical study of GRB 221009A's X-ray emission using Swift XRT Photon Counting and Windowed Timing data. After standard reduction (barycentric correction, pile-up, background subtraction via HEASOFT), we extracted light curves for each observational ID and for their aggregation. Count-rate histograms were fitted with various statistical distributions, and fit quality was assessed by chi-squared and the Bayesian Information Criterion. The first observational segment is best described by a Gaussian distribution (χ² = 68.4; BIC = 7651.2), and the second by a Poisson distribution (χ² = 33.5; BIC = 4413.3). When all segments are combined, the lognormal model provides the superior fit (χ² = 541.9; BIC = 34365.5), indicating that the full data set's count rates exhibit the skewness expected from a multiplicative process. These findings demonstrate that while individual time intervals conform to discrete or symmetric statistics, the collective emission profile across multiple observations is better captured by a lognormal distribution, consistent with complex, compounded variability in GRB afterglows.
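As a rough illustration of the histogram-fitting and model-comparison workflow described above (not the authors' pipeline), the sketch below fits Gaussian, lognormal, and Poisson models to a synthetic count-rate sample and reports chi-squared and BIC for each; the simulated rates, the bin choices, and the rounding of rates for the Poisson case are assumptions made purely for the demonstration.

```python
import numpy as np
from scipy import stats

# synthetic stand-in for XRT count rates; real light curves would come from the
# reduced XRT products for each observation ID
rng = np.random.default_rng(42)
rates = rng.lognormal(mean=1.0, sigma=0.5, size=5000)

counts, edges = np.histogram(rates, bins=50)
centers = 0.5 * (edges[:-1] + edges[1:])
width = edges[1] - edges[0]
n_data = rates.size

def chi2_of_fit(observed, expected):
    """Pearson chi-squared between observed and model-expected bin counts."""
    mask = expected > 0
    return float(np.sum((observed[mask] - expected[mask]) ** 2 / expected[mask]))

def bic(loglike, n_params, n_points):
    """Bayesian Information Criterion: k*ln(n) - 2*ln(L)."""
    return n_params * np.log(n_points) - 2.0 * loglike

results = {}

# Gaussian model (2 parameters)
mu, sigma = rates.mean(), rates.std(ddof=1)
exp_gauss = n_data * width * stats.norm.pdf(centers, mu, sigma)
results["Gaussian"] = (chi2_of_fit(counts, exp_gauss),
                       bic(stats.norm.logpdf(rates, mu, sigma).sum(), 2, n_data))

# Lognormal model (2 parameters, location fixed at zero)
shape, loc, scale = stats.lognorm.fit(rates, floc=0)
exp_logn = n_data * width * stats.lognorm.pdf(centers, shape, loc, scale)
results["Lognormal"] = (chi2_of_fit(counts, exp_logn),
                        bic(stats.lognorm.logpdf(rates, shape, loc, scale).sum(), 2, n_data))

# Poisson model (1 parameter), evaluated on rates rounded to integer counts purely
# for illustration; expected bin contents sum the pmf of integers inside each bin
k = np.round(rates).astype(int)
lam = k.mean()
k_ints = np.arange(0, int(edges[-1]) + 2)
inside = (k_ints >= edges[0]) & (k_ints < edges[-1])
bin_idx = np.digitize(k_ints[inside], edges) - 1
exp_pois = n_data * np.bincount(bin_idx, weights=stats.poisson.pmf(k_ints[inside], lam),
                                minlength=counts.size)
results["Poisson"] = (chi2_of_fit(counts, exp_pois),
                      bic(stats.poisson.logpmf(k, lam).sum(), 1, n_data))

for name, (chi2, b) in results.items():
    print(f"{name:9s}  chi2 = {chi2:8.1f}   BIC = {b:10.1f}")
```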
In this study, we investigate the potential of mark-weighted angular correlation functions, which integrate β-cosmic-web classification with angular correlation function analysis, to improve cosmological constraints. Using SDSS DR12 CMASS-NGC galaxies and mock catalogs with Ω_m varying from 0.25 to 0.40, we assess the discriminative power of different statistics via the average improvement in chi-squared, ΔX², across six redshift bins. This metric quantifies how effectively each statistic distinguishes between different cosmological models. Incorporating cosmic-web weights leads to substantial improvements. Using statistics weighted by the mean neighbor distance (D_nei) increases ΔX² by approximately 40%–130%, while applying inverse mean neighbor distance weighting (1/D_nei) yields even larger gains, boosting ΔX² by a factor of 2–3 compared to traditional unweighted angular statistics. These enhancements are consistent with previous 3D clustering results, demonstrating the superior sensitivity of the β-weighted approaches. Our method, based on thin redshift slices, is particularly suitable for slitless surveys (e.g., Euclid, CSST) where redshift uncertainties limit 3D analyses. This study also offers a framework for applying marked statistics to 2D angular clustering.
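A minimal brute-force sketch, assuming small toy catalogs, of a mark-weighted angular correlation function via a weighted Landy-Szalay estimator; the uniform random points, the pair-count normalization, and the choice of mark (e.g., 1/D_nei) are placeholders, not the paper's pipeline.

```python
import numpy as np

def angular_separation(ra1, dec1, ra2, dec2):
    """Angular separation in degrees between two sets of points (broadcasting)."""
    ra1, dec1, ra2, dec2 = map(np.radians, (ra1, dec1, ra2, dec2))
    cos_t = (np.sin(dec1) * np.sin(dec2) +
             np.cos(dec1) * np.cos(dec2) * np.cos(ra1 - ra2))
    return np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))

def weighted_pair_counts(ra_a, dec_a, w_a, ra_b, dec_b, w_b, bins, auto=False):
    """Sum of weight products of pairs falling in each angular bin."""
    theta = angular_separation(ra_a[:, None], dec_a[:, None], ra_b[None, :], dec_b[None, :])
    ww = w_a[:, None] * w_b[None, :]
    if auto:  # drop self-pairs and double counting for auto-correlations
        iu = np.triu_indices(len(ra_a), k=1)
        theta, ww = theta[iu], ww[iu]
    hist, _ = np.histogram(theta, bins=bins, weights=ww)
    return hist

def marked_w_theta(ra_d, dec_d, marks, ra_r, dec_r, bins):
    """Weighted Landy-Szalay estimator (DD - 2DR + RR) / RR with marks as data weights."""
    w_r = np.ones_like(ra_r)
    dd = weighted_pair_counts(ra_d, dec_d, marks, ra_d, dec_d, marks, bins, auto=True)
    rr = weighted_pair_counts(ra_r, dec_r, w_r, ra_r, dec_r, w_r, bins, auto=True)
    dr = weighted_pair_counts(ra_d, dec_d, marks, ra_r, dec_r, w_r, bins)
    norm_dd = 0.5 * (marks.sum() ** 2 - (marks ** 2).sum())
    norm_rr = 0.5 * (w_r.sum() ** 2 - (w_r ** 2).sum())
    norm_dr = marks.sum() * w_r.sum()
    return (dd / norm_dd - 2 * dr / norm_dr + rr / norm_rr) / (rr / norm_rr)

# toy usage with random points; the mark could be, e.g., 1/D_nei per galaxy
rng = np.random.default_rng(0)
ra_d, dec_d = rng.uniform(0, 10, 300), rng.uniform(-5, 5, 300)
ra_r, dec_r = rng.uniform(0, 10, 900), rng.uniform(-5, 5, 900)
marks = rng.uniform(0.5, 1.5, 300)
bins = np.logspace(-1, 0.7, 8)          # angular bins in degrees
print(marked_w_theta(ra_d, dec_d, marks, ra_r, dec_r, bins))
```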
In radio astronomy, radio frequency interference (RFI) is becoming more and more serious for radio observational facilities. RFI always influences the search for and the study of interesting astronomical objects, so mitigating it has become an essential procedure in any survey data processing. The Five-hundred-meter Aperture Spherical radio Telescope (FAST) is an extremely sensitive radio telescope, and it is necessary to find an effective and precise RFI mitigation method for FAST data processing. In this work, we introduce a method to mitigate the RFI in FAST spectral observations and compile RFI statistics using ~300 h of FAST data. The details are as follows. First, according to the characteristics of FAST spectra, we propose to use the Asymmetrically Reweighted Penalized Least Squares algorithm for baseline fitting; our test results show that it performs well. Second, we flag the RFI with four strategies, which flag extremely strong RFI, long-lasting RFI, polarized RFI, and beam-combined RFI, respectively. The test results show that all the RFI above a preset threshold can be flagged. Third, we compute statistics for the probabilities of polarized XX and YY RFI in FAST observations. These statistics tell us which frequencies are relatively quiescent, so we can avoid using such frequencies in our spectral observations. Finally, based on the ~300 h of FAST data, we obtained an RFI table, which is currently the most complete such database for FAST.
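Below is a compact sketch of the Asymmetrically Reweighted Penalized Least Squares (arPLS) baseline fit named above, applied to a synthetic spectrum; the smoothing parameter, convergence tolerance, and toy data are illustrative assumptions rather than the values used for FAST processing.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def arpls(y, lam=1e5, ratio=1e-6, n_iter=50):
    """Estimate a smooth baseline under a 1-D spectrum y (arPLS-style reweighting)."""
    n = len(y)
    D = sparse.diags([1.0, -2.0, 1.0], [0, 1, 2], shape=(n - 2, n))  # 2nd-difference operator
    H = lam * (D.T @ D)                                              # smoothness penalty
    w = np.ones(n)
    z = y.copy()
    for _ in range(n_iter):
        W = sparse.diags(w)
        z = spsolve(sparse.csc_matrix(W + H), w * y)
        d = y - z
        dn = d[d < 0]                       # residuals below the current baseline
        if dn.size == 0 or dn.std() == 0:
            break
        m, s = dn.mean(), dn.std()
        w_new = 1.0 / (1.0 + np.exp(2.0 * (d - (2.0 * s - m)) / s))  # asymmetric reweighting
        if np.linalg.norm(w - w_new) / np.linalg.norm(w) < ratio:
            break
        w = w_new
    return z

# toy usage: sloping baseline + a narrow emission-like line + noise
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 2000)
spectrum = 5.0 + 3.0 * x + np.exp(-0.5 * ((x - 0.3) / 0.01) ** 2) + 0.05 * rng.standard_normal(x.size)
baseline = arpls(spectrum)
flattened = spectrum - baseline
```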
In this study, geochemical anomaly separation was carried out with methods based on the distribution model, which include the probability diagram (MPD), fractal (concentration-area technique), and U-statistic methods. The main objective is to evaluate the efficiency and accuracy of these methods in separating anomalies related to shear-zone gold mineralization. For this purpose, samples were taken from the secondary lithogeochemical environment (stream sediment samples) over the gold mineralization in Saqqez, NW Iran. Interpretation of the histograms and diagrams showed that the MPD is capable of identifying two phases of mineralization. The fractal method could separate only one phase of change, based on the fractal dimension, with high-concentration areas of Au. The spatial analysis showed two mixed subpopulations after U = 0 and another subpopulation with very high U values. The MPD analysis followed the spatial analysis, showing the details of the variations. Six mineralized zones detected from local geochemical exploration results were used to validate the methods mentioned above. The MPD method was able to identify more than 90% of the anomalous areas, whereas the two other methods identified at most 60% of the anomalous areas. The MPD method used the raw data, without any estimation of the concentration, and required a minimum of calculations to determine the threshold values; therefore, the MPD method is more robust than the other methods. The spatial analysis identified the details of the geological and mineralization events that affected the study area. MPD is recommended as the best method, and the spatial U-analysis as the next most reliable. The fractal method could show more detail of the events and variations in the area with an asymmetrical grid net and a higher sampling density, or at the detailed exploration stage.
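A small self-contained sketch, with synthetic Au values, of the concentration-area (C-A) fractal technique mentioned above: the area enclosed by concentration contours is plotted against concentration on log-log axes, and a break in slope is taken as the anomaly threshold. The grid, cell area, and two-segment model are illustrative assumptions, not the study's workflow.

```python
import numpy as np

rng = np.random.default_rng(1)
au_ppb = rng.lognormal(mean=2.0, sigma=0.8, size=2500)   # toy gridded Au values (ppb)
cell_area = 1.0                                          # km^2 represented by each cell

thresholds = np.quantile(au_ppb, np.linspace(0.05, 0.99, 30))
areas = np.array([(au_ppb >= c).sum() * cell_area for c in thresholds])
log_c, log_a = np.log10(thresholds), np.log10(areas)

def seg_rss(x, y):
    """Residual sum of squares of a straight-line fit."""
    p = np.polyfit(x, y, 1)
    return float(np.sum((y - np.polyval(p, x)) ** 2))

# choose the break point of a two-segment log-log model by minimizing the total RSS
breaks = range(3, len(log_c) - 3)
k = min(breaks, key=lambda i: seg_rss(log_c[:i], log_a[:i]) + seg_rss(log_c[i:], log_a[i:]))

print(f"anomaly threshold ~ {10 ** log_c[k]:.1f} ppb")
print(f"background segment slope: {np.polyfit(log_c[:k], log_a[:k], 1)[0]:.2f}")
print(f"anomalous segment slope : {np.polyfit(log_c[k:], log_a[k:], 1)[0]:.2f}")
```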
Statistical approaches for evaluating causal effects and for discovering causal networks are discussed in this paper. A causal relation between two variables is different from an association or correlation between them. An association measurement between two variables may be changed dramatically from positive to negative by omitting a third variable, which is called the Yule-Simpson paradox. We discuss how to evaluate the causal effect of a treatment or exposure on an outcome so as to avoid the Yule-Simpson paradox. Surrogates and intermediate variables are often used to reduce measurement costs or duration when measurement of the endpoint variables is expensive, inconvenient, infeasible, or unobservable in practice. There have been many criteria for surrogates. However, it is possible that, for a surrogate satisfying these criteria, a treatment has a positive effect on the surrogate, which in turn has a positive effect on the outcome, and yet the treatment has a negative effect on the outcome; this is called the surrogate paradox. We discuss criteria for surrogates that avoid the surrogate paradox. Causal networks, which describe the causal relationships among a large number of variables, have been applied in many research fields, and it is important to discover the structures of causal networks from observed data. We propose a recursive approach for discovering a causal network in which the structural learning of a large network is decomposed recursively into the learning of small networks. Further, to discover causal relationships, we present an active learning approach in terms of external interventions on some variables. When we focus on the causes of an outcome of interest, instead of discovering a whole network, we propose a local learning approach to discover the causes that affect the outcome.
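The toy table below, with classic illustrative recovery counts, shows the Yule-Simpson reversal described above: the treatment has the higher recovery rate within every stratum of a third variable, yet the lower rate in the pooled data.

```python
# recovered / total, stratified by a third variable (e.g., severity); numbers
# are a standard textbook-style illustration, not real trial data
strata = {
    "mild":   {"treated": (81, 87),   "control": (234, 270)},
    "severe": {"treated": (192, 263), "control": (55, 80)},
}

pooled = {"treated": [0, 0], "control": [0, 0]}
for name, arms in strata.items():
    for arm, (recovered, total) in arms.items():
        pooled[arm][0] += recovered
        pooled[arm][1] += total
        print(f"{name:6s} {arm:7s} recovery rate = {recovered / total:.2f}")

for arm, (recovered, total) in pooled.items():
    print(f"pooled {arm:7s} recovery rate = {recovered / total:.2f}")
# Within each stratum the treated rate exceeds the control rate, yet in the
# pooled table the ordering reverses -- the Yule-Simpson paradox.
```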
The quality of low-frequency electromagnetic data is affected by spike and trend noises, and failure to remove the spikes and trends reduces the credibility of the data interpretation. Based on analyses of the causes and characteristics of these noises, this paper presents a preset statistics stacking method (PSSM) and a piecewise linear fitting method (PLFM) for de-noising the spikes and the trends, respectively. The magnitudes of the spikes are either higher or lower than the normal values, which distorts the useful signal. Comparisons of spike removal among the average, the statistics, and the PSSM methods indicate that only the PSSM can remove the spikes successfully. On the other hand, the spectra of the linear and nonlinear trends lie mainly in the low-frequency band and can change the calculated resistivity significantly, while no influence of the trends is observed when the frequency is higher than a certain threshold. The PLFM can effectively remove both the linear and nonlinear trends, with errors of around 1% in the power spectrum. The proposed methods provide an effective way to de-noise the spike and trend noises in low-frequency electromagnetic data and establish a research basis for de-noising low-frequency noises.
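As a generic illustration of the two noise types discussed above (not the paper's PSSM or PLFM implementation), the sketch below removes spikes with a running-median threshold and removes drift with segment-wise linear fits; the window size, threshold, segment count, and synthetic series are assumptions.

```python
import numpy as np

def despike(x, window=11, n_sigma=5.0):
    """Replace samples that deviate strongly from a running median by the median value."""
    pad = window // 2
    xp = np.pad(x, pad, mode="edge")
    med = np.array([np.median(xp[i:i + window]) for i in range(len(x))])
    resid = x - med
    mad = np.median(np.abs(resid)) + 1e-12
    mask = np.abs(resid) > n_sigma * 1.4826 * mad
    out = x.copy()
    out[mask] = med[mask]
    return out

def piecewise_detrend(x, n_segments=8):
    """Remove a linear trend fitted independently in each of n_segments pieces."""
    out = x.copy()
    for seg in np.array_split(np.arange(len(x)), n_segments):
        t = np.arange(len(seg))
        a, b = np.polyfit(t, x[seg], 1)
        out[seg] = x[seg] - (a * t + b)
    return out

# toy usage: useful signal + linear drift + spike noise
t = np.linspace(0, 10, 2000)
sig = np.sin(2 * np.pi * 1.5 * t) + 0.3 * t
sig[[100, 700, 1500]] += 15.0
clean = piecewise_detrend(despike(sig))
```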
We collected the basic parameters of 231 supernova remnants (SNRs) in our Galaxy, namely, distances (d) from the Sun, linear diameters (D), Galactic heights (Z), estimated ages (t), luminosities (L), surface brightnesses (Σ) and flux densities (S_1) at 1 GHz, and spectral indices (α). We tried to find possible correlations between these parameters. As expected, the linear diameters were found to increase with age for the shell-type remnants, and also to have a tendency to increase with Galactic height. Both the surface brightness and the luminosity of SNRs at 1 GHz tend to decrease with the linear diameter and with age. No other relations between the parameters were found.
In this study, methods based on the distribution model (with and without personal opinion) were used for the separation of anomalous zones; these include two different methods, U-spatial statistics and the mean plus multiples of the standard deviation (X+nS). The primary purpose is to compare the results of these methods with each other. To increase the accuracy of the comparison, regional geochemical data were used from an area where occurrences and mineralization zones of epithermal gold have been reported. The study area is part of the Hashtjin geological map, which is structurally part of the folded and thrust belt and part of the Alborz Tertiary magmatic complex. Samples were taken from secondary lithogeochemical environments, and Au data related to epithermal gold reserves were used to investigate the efficacy of the two methods. In the U-spatial statistics method, statistical criteria were used to determine the threshold, and in the X+nS method, the element enrichment index of the region's rock units was obtained by grouping these units; the anomalous areas were then identified by these criteria. The comparison of the methods was made considering the positions of the discovered occurrences relative to those obtained from these methods, the flexibility of the methods in separating the anomalous zones, and the two-dimensional spatial correlation of the three elements As, Pb, and Ag with Au. The ability of the two methods to identify potential areas is acceptable. Among them, the method based on these criteria appears to have a high degree of flexibility in separating anomalous regions in the case of epithermal-type gold deposits.
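A minimal sketch of the classical mean-plus-n-standard-deviations (X+nS) threshold referred to above, applied to synthetic Au values; the log-transform, the choice n = 2, and the sample sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
background = rng.lognormal(mean=1.5, sigma=0.4, size=950)   # ppb, background population
anomaly = rng.lognormal(mean=3.0, sigma=0.3, size=50)       # ppb, mineralized population
au = np.concatenate([background, anomaly])

def xbar_ns_threshold(values, n=2, log_transform=True):
    """Return the mean + n * standard deviation (X + nS) threshold; geochemical
    data are often close to log-normal, so the statistics are commonly computed
    on log-transformed values and transformed back."""
    v = np.log10(values) if log_transform else values
    thr = v.mean() + n * v.std(ddof=1)
    return 10 ** thr if log_transform else thr

thr = xbar_ns_threshold(au, n=2)
print(f"threshold = {thr:.1f} ppb, flagged samples = {(au > thr).sum()}")
```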
Existing theory and models suggest that a Type I (merger) GRB should have a larger jet beaming angle than a Type II (collapsar) GRB, but so far no statistical evidence is available to support this suggestion. In this paper, we obtain a sample of 37 beaming angles and calculate the probability that this is true. A correction is also devised to account for the scarcity of Type I GRBs in our sample. The probability is calculated to be 83% without the correction and 71% with it.
“Four classes of enterprises above designated size” (hereinafter called four-classes enterprises) refer to objects of statistical survey that have reached a certain scale in China's current statistical method system. They include four classes in the national economy, namely, industrial enterprises above designated size; construction and real estate development and management enterprises above qualifications; wholesale and retail, catering and accommodation enterprises; and service enterprises above designated size, which are the primary part of national economic and social development activities. This paper focuses on analyzing the practice of and the difficulties in the current statistical work on four-classes enterprises, and then proposes some recommendations.
Objective: To facilitate quality evaluation suited to the unique characteristics of Chinese materia medica (CMM) by developing and implementing a novel approach known as the matching frequency statistical moment (MFSM) method. Methods: This study established the MFSM method. To demonstrate its effectiveness, we applied this novel approach to analyze Danxi Granules (丹膝颗粒, DXG) and its constituent herbal materials. To begin with, ultra-performance liquid chromatography (UPLC) was applied to obtain the chromatographic fingerprints of DXG and its constituent herbal materials. Next, the MFSM was used to compress and integrate them into a new fingerprint with fewer analytical units. Then, we characterized the properties and variability of both the original and integrated fingerprints by calculating the total quantum statistical moment (TQSM) parameters, information entropy and information amount, along with their relative standard deviations (RSD). Finally, we compared the TQSM parameters, information entropy and information amount, and their RSD between the traditional and novel fingerprints to validate the new analytical method. Results: The chromatographic peaks of DXG and its 12 raw herbal materials were divided and integrated into peak families by the MFSM method. Before integration, the ranges of the peak number, the three TQSM parameters, information entropy and information amount for each peak or peak family of the UPLC fingerprints of DXG and its 12 raw herbal materials were 95.07−209.73, 9390−183064 μV·s, 5.928−21.33 min, 22.62−106.69 min², 4.230−6.539, and 50530−974186 μV·s, respectively. After integration, the ranges of these parameters were 10.00−88.00, 9390−183064 μV·s, 5.951−22.02 min, 22.27−104.73 min², 2.223−5.277, and 38159−807200 μV·s, respectively. Correspondingly, the RSD of all the aforementioned parameters before integration were 2.12%−9.15%, 6.04%−49.78%, 1.15%−23.10%, 3.97%−25.79%, 1.49%−19.86%, and 6.64%−51.20%, respectively; after integration, they changed to 0.00%, 6.04%−49.87%, 1.73%−23.02%, 3.84%−26.85%, 1.17%−16.54%, and 6.40%−48.59%, respectively. The results demonstrated that in the newly integrated fingerprint, the analytical units of the constituent herbal materials, the information entropy, and the information amount were significantly reduced (P < 0.05), while the TQSM parameters remained unchanged (P > 0.05). Additionally, the RSD of the TQSM parameters, information entropy, and information amount did not show significant differences before and after integration (P > 0.05), but the RSD of the number and area of the integrated analytical units significantly decreased (P < 0.05). Conclusion: The MFSM method can reduce the analytical units of the constituent herbal materials while maintaining the properties and variability of the original fingerprint. Thus, it can serve as a feasible and reliable tool to reduce the difficulty of analyzing multiple components within CMMs and to facilitate the evaluation of their quality.
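For readers unfamiliar with the two descriptors, the short sketch below computes the Shannon information entropy of a fingerprint's normalized peak areas and the relative standard deviation (RSD) of a parameter across repeated measurements; the peak areas, batch values, and the base-2 logarithm are illustrative assumptions, not DXG data.

```python
import numpy as np

def information_entropy(peak_areas):
    """Shannon entropy of the peak-area distribution (base-2 here; the log base is a convention)."""
    p = np.asarray(peak_areas, dtype=float)
    p = p / p.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def rsd_percent(values):
    """Relative standard deviation (coefficient of variation) in percent."""
    v = np.asarray(values, dtype=float)
    return 100.0 * v.std(ddof=1) / v.mean()

areas_before = [12000, 8500, 30500, 950, 4300, 2100, 760, 15800]   # many small analytical units
areas_after = [20500, 35450, 19960]                                # merged peak families
print("entropy before integration:", information_entropy(areas_before))
print("entropy after integration :", information_entropy(areas_after))
print("RSD of a parameter over 6 batches (%):", rsd_percent([5.93, 6.01, 5.88, 6.10, 5.95, 5.99]))
```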
The correlation between close-in super Earths and distant cold Jupiters in planetary systems has important implications for their formation and evolution. Contrary to some earlier findings, a recent study conducted by Bonomo et al. suggests that the occurrence of cold Jupiter companions is not excessive in super-Earth systems. Here we show that this discrepancy can be seen as a Simpson's paradox and is resolved once the metallicity dependence of the super-Earth-cold Jupiter relation is taken into account. A common feature is that almost all cold Jupiter detections with inner super-Earth companions are found around metal-rich stars. Focusing on Sun-like hosts with super-solar metallicities, we show that the frequency of cold Jupiters conditioned on the presence of inner super Earths is 39 (+12/−11)%, whereas the frequency of cold Jupiters in the same metallicity range is no more than 20%. Therefore, the occurrences of close-in super Earths and distant cold Jupiters appear correlated around metal-rich hosts. The relation between the two types of planets remains unclear for metal-poor hosts due to the limited sample size and the much lower occurrence rate of cold Jupiters, but a correlation between the two cannot be ruled out.
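The sketch below illustrates the kind of conditional occurrence rate quoted above, a binomial fraction with a Jeffreys credible interval; the counts are hypothetical and are not the study's sample.

```python
from scipy import stats

n_super_earth_systems = 38   # hypothetical metal-rich hosts with inner super Earths
n_with_cold_jupiter = 15     # hypothetical subset that also shows a cold Jupiter

p_hat = n_with_cold_jupiter / n_super_earth_systems
# Jeffreys interval: quantiles of the Beta(k + 1/2, n - k + 1/2) posterior
lo, hi = stats.beta.ppf([0.16, 0.84],
                        n_with_cold_jupiter + 0.5,
                        n_super_earth_systems - n_with_cold_jupiter + 0.5)
print(f"P(cold Jupiter | inner super Earth) = {p_hat:.2f} (+{hi - p_hat:.2f}/-{p_hat - lo:.2f})")
```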
We present a study of low surface brightness galaxies (LSBGs) selected by fitting the images of all the galaxies in the α.40 SDSS DR7 sample with two kinds of single-component models and two kinds of two-component (disk+bulge) models: single exponential, single Sérsic, exponential+de Vaucouleurs (exp+deV), and exponential+Sérsic (exp+ser). Under the criteria of B-band disk central surface brightness μ_0,disk(B) ≥ 22.5 mag arcsec^-2 and axis ratio b/a > 0.3, we selected four non-edge-on LSBG samples from the models, containing 1105, 1038, 207, and 75 galaxies, respectively. There are 756 galaxies in common between the LSBGs selected by the exponential and Sérsic models, corresponding to 68.42% of the LSBGs selected by the exponential model and 72.83% of those selected by the Sérsic model; the rest of the discrepancy is due to the difference in obtaining μ_0 between the exponential and Sérsic models. Based on the fitting, in the range 0.5 ≤ n ≤ 1.5, the relation between the μ_0 values from the two models can be written as μ_0,Sérsic − μ_0,exp = −1.34(n−1). The LSBGs selected by disk+bulge models (LSBG_2comps) are more massive than the LSBGs selected by single-component models (LSBG_1comp), and also show a larger disk component. Though the bulges in the majority of our LSBG_2comps are not prominent, more than 60% of them would not be selected if we adopted a single-component model only. We also identified 31 giant low surface brightness galaxies (gLSBGs) from the LSBG_2comps; they are located in the same region of the color-magnitude diagram as other gLSBGs. After comparing different criteria for gLSBG selection, we find that for gas-rich LSBGs, M_* > 10^10 M_⊙ is the best criterion to distinguish between gLSBGs and normal LSBGs with bulges.
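A small sketch applying the quoted empirical relation μ_0,Sérsic − μ_0,exp = −1.34(n − 1) to convert between the two central surface brightnesses for 0.5 ≤ n ≤ 1.5; the example galaxy values are hypothetical.

```python
import numpy as np

def mu0_exp_from_sersic(mu0_sersic, n):
    """Convert a Sérsic central surface brightness to the equivalent exponential-fit
    value using the relation mu0_sersic - mu0_exp = -1.34 * (n - 1)."""
    n = np.asarray(n, dtype=float)
    if np.any((n < 0.5) | (n > 1.5)):
        raise ValueError("relation quoted only for 0.5 <= n <= 1.5")
    return np.asarray(mu0_sersic, dtype=float) + 1.34 * (n - 1.0)

# e.g. a hypothetical galaxy with mu0_sersic = 22.3 mag/arcsec^2 and n = 1.3 maps to an
# exponential-fit value of about 22.7, crossing the >= 22.5 LSBG threshold in one
# model but not the other -- the kind of discrepancy discussed in the abstract.
print(mu0_exp_from_sersic(22.3, 1.3))
```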
Glitch activity refers to the mean increase in pulsar spin frequency per year due to rotational glitches. It is an important tool for studying super-nuclear matter using neutron star interiors as templates. Glitch events are typically observed in the spin frequency (ν) and the frequency derivative (ν̇) of pulsars. The rate of glitch recurrence decreases as the pulsar ages, and the activity parameter is usually measured by linear regression of cumulative glitches over a given period. This method is effective for pulsars with multiple regular glitch events. However, due to the scarcity of glitch events and the difficulty of monitoring all known pulsars, only a few have multiple records of glitch events, which limits the use of the activity parameter in studying neutron star interiors with multiple pulsars. In this study, we examined the relationship between the activity parameter and the pulsar spin parameters (spin frequency, frequency derivative, and characteristic age). We found that a quadratic function provides a better fit for the relationship between the activity parameter and the spin parameters than the commonly used linear functions. Using this information, we were able to estimate the activity parameters of other pulsars that have no recorded glitches. Our analysis shows that the relationship between the estimated activity parameters and the pulsar spin parameters is consistent with that of the observed activity parameters in the ensemble of pulsars.
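As a schematic of the model comparison described above, the sketch below fits both a linear and a quadratic polynomial to a synthetic activity-parameter versus spin-parameter relation and compares them with a Gaussian-error BIC; the synthetic relation and noise level are assumptions, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(3)
log_age = np.sort(rng.uniform(3.0, 7.0, 40))                 # log10(characteristic age / yr)
log_activity = (0.3 * (log_age - 5.0) ** 2 - 1.2 * log_age + 2.0
                + 0.3 * rng.standard_normal(log_age.size))   # toy activity-age relation

def fit_and_score(x, y, degree):
    """Least-squares polynomial fit plus a Gaussian-error BIC for model comparison."""
    coeffs = np.polyfit(x, y, degree)
    rss = float(np.sum((y - np.polyval(coeffs, x)) ** 2))
    k = degree + 1
    bic = x.size * np.log(rss / x.size) + k * np.log(x.size)
    return coeffs, rss, bic

for degree, label in ((1, "linear"), (2, "quadratic")):
    _, rss, bic = fit_and_score(log_age, log_activity, degree)
    print(f"{label:9s}  RSS = {rss:7.2f}   BIC = {bic:8.2f}")
```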
The impact of World War II on the physical landscape of British towns and cities as a result of airborne assault is well known. However, less newsworthy but arguably no less significant is the impact of the war on agriculture and the countryside, especially in South-East England. This paper outlines the building of an historical Geographical Information System (GIS) from different data sources including the National Farm Survey (NFS), Luftwaffe and Royal Air Force (RAF) aerial photographs, and basic topographic mapping for the South Downs in East and West Sussex. It explores the impact and legacy of World War II on the agricultural landscape of this area through both the 'plough-up' campaigns aimed at increasing agricultural production and the occupation of farm land for military training purposes. Farms surrounding an area where extensive tracts of land were taken over for military training and defensive purposes on the Downs close to Brighton and the county town of Lewes in East Sussex are the focus of attention, illuminating the beneficial and disruptive impacts of the government's drive to increase food output by bringing land into more productive use by means of a 'plough-up' campaign and using formerly agricultural land for military training. These changes contributed to the transformation of the region into "an arable monoculture" and the virtual disappearance of traditional sheep rearing in the post-war decades.
To take into account the influence of uncertainties on the dynamic response of a vibro-acoustic structure, a hybrid modeling technique combining the finite element method (FE) and statistical energy analysis (SEA) is proposed to analyze vibro-acoustic responses with uncertainties at middle frequencies. The mid-frequency dynamic response of a framework-plate structure with uncertainties is studied based on the hybrid FE-SEA method, and a Monte Carlo (MC) simulation is performed to provide a benchmark comparison with the hybrid method. The energy response of the framework-plate structure matches the MC simulation results well, which validates the effectiveness of the hybrid FE-SEA method, considering both the complexity of the vibro-acoustic structure and the uncertainties in mid-frequency vibro-acoustic analysis. Based on the hybrid method, a vibro-acoustic model of a construction machinery cab with random properties is established, and the excitations of the model are measured by experiments. The sound pressure level response of the cab and the vibration power spectral density of the front windscreen are calculated and compared with those of the experiment. At middle frequencies, the results show good consistency with the tests, and the prediction error is less than 3.5 dB.
Aim: To improve the efficiency of fatigue material tests and the relevant statistical treatment of test data. Methods: The least squares approach and other special treatments were used. Results and Conclusion: The concepts of each phase in fatigue tests and their statistical treatment are clarified. The proposed method leads to three important properties. The reduced number of specimens brings the advantage of lowering test expenditures, and the whole test procedure has more flexibility, since there is no need to conduct many tests at the same stress level as in traditional cases.
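A minimal sketch, with illustrative stress-life data, of the least-squares treatment typical of fatigue testing: fitting a Basquin-type S-N line log N = a + b log S and using it to predict life at an untested stress level. The data values and the specific S-N form are assumptions, not the paper's procedure.

```python
import numpy as np

stress = np.array([400., 360., 320., 300., 280., 260.])        # MPa (illustrative)
cycles = np.array([2.1e4, 5.5e4, 1.6e5, 3.0e5, 7.8e5, 1.9e6])  # cycles to failure (illustrative)

x, y = np.log10(stress), np.log10(cycles)
b, a = np.polyfit(x, y, 1)              # slope and intercept of the regression line

def predicted_life(s_mpa):
    """Median fatigue life at stress s_mpa implied by the fitted S-N line."""
    return 10 ** (a + b * np.log10(s_mpa))

print(f"log10(N) = {a:.2f} + {b:.2f} * log10(S)")
print(f"predicted life at 340 MPa: {predicted_life(340):.3g} cycles")
```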
Using the GIS spatial statistical analysis method, with ArcGIS software as the analysis tool and the diseased maize in Hedong District of Linyi City as the study object, the spatial distribution characteristics of the diseased crops were analyzed. The results showed that the diseased crops were mainly distributed along river tributaries and the downstream reaches of the main rivers. The correlation between adjacent diseased plots was small, so infection by pests and diseases was excluded, and the major cause of the incidence might be river pollution.
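As one common way to quantify the correlation between adjacent plots mentioned above (the abstract does not state which statistic was used), the sketch below computes Moran's I with a distance-band weight matrix on synthetic plot coordinates and incidence values.

```python
import numpy as np

rng = np.random.default_rng(5)
xy = rng.uniform(0, 10, size=(60, 2))          # plot centroids (km), synthetic
incidence = rng.uniform(0, 1, size=60)         # fraction of diseased plants per plot, synthetic

def morans_i(values, coords, max_dist=2.0):
    """Moran's I with a binary distance-band weight matrix (w_ij = 1 if two
    plots lie within max_dist of each other, 0 otherwise)."""
    z = values - values.mean()
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    w = ((d > 0) & (d <= max_dist)).astype(float)
    s0 = w.sum()
    return (len(values) / s0) * (z @ w @ z) / (z @ z)

print("Moran's I =", morans_i(incidence, xy))
# Values near 0 indicate no spatial autocorrelation; positive values indicate
# clustering of similar incidence levels among neighbouring plots.
```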
Coal oxidation at low temperatures is the heat source responsible for the self-heating and spontaneous combustion of coal, a phenomenon that has imposed severe problems on coal-related industries. Attempts by previous researchers to understand this phenomenon have yielded significant progress. It is well known that coal oxidation at low temperatures involves oxygen consumption and the formation of gaseous and solid oxidation products, and that this process is mainly influenced by temperature, the oxidation history of the coal, coal properties, the particle size distribution of the coal, etc. The current understanding of the self-heating and spontaneous combustion of coal is discussed, along with the different experimental and numerical models established to predict the self-heating characteristics of coal. This paper focuses on the global position of the studies carried out by academics, research institutes, and industries on spontaneous combustion of coal and coal mine fires. Within this framework, the commonly used techniques for predicting the spontaneous combustion liability of coal were evaluated. These techniques are well established in their usage, but no specific method has become a standard for predicting spontaneous combustion liability. Further study is still needed to address a number of pending issues and to obtain a more complete understanding of the phenomenon.