To investigate the damage evolution caused by stress-driven and sub-critical crack propagation within Beishan granite under multi-stage creep triaxial compressive conditions, distributed optical fiber sensing and X-ray computed tomography were combined to obtain the strain distribution over the sample surface and the internal fractures of the samples. The Gini and skewness (G-S) coefficients were used to quantify strain localization during the tests: the Gini coefficient reflects the degree of clustering of elements with high strain values, i.e., strain localization/delocalization, while the skewness coefficient quantifies the asymmetry of the data distribution induced by strain localization. A precursor to granite failure is defined by the rapid and simultaneous increase of the G-S coefficients calculated from strain increments, which gives a warning of failure about 8% of peak stress earlier than coefficients calculated from absolute strain values. Moreover, once the stress exceeds the crack initiation stress, the accumulation of damage due to stress-driven crack propagation in Beishan granite differs with confining pressure: strain localization proceeds continuously until brittle failure at higher confining pressure, whereas both strain localization and delocalization occur at lower confining pressure. Despite the different stress conditions, a similar statistical characteristic of strain localization is observed during the creep stage: the Gini coefficient increases and the skewness coefficient decreases slightly while the creep stress remains below 95% of peak stress; once accelerated strain localization begins, the Gini and skewness coefficients increase rapidly and simultaneously.
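The Gini and skewness coefficients used above are standard statistics and can be computed directly from any set of strain values. The sketch below (plain Python, hypothetical function names) illustrates one common formulation of each; it demonstrates the statistics themselves, not the authors' fiber-optic processing pipeline.

```python
def gini(values):
    """Gini coefficient of non-negative values: 0 for a perfectly uniform
    field, approaching 1 as strain concentrates in few elements."""
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    if total == 0:
        return 0.0
    # Mean-absolute-difference formulation on the sorted sample
    cum = 0.0
    for i, x in enumerate(xs, start=1):
        cum += (2 * i - n - 1) * x
    return cum / (n * total)

def skewness(values):
    """Population (Fisher-Pearson) skewness: positive when a long right
    tail of high-strain elements develops."""
    n = len(values)
    mean = sum(values) / n
    m2 = sum((x - mean) ** 2 for x in values) / n
    m3 = sum((x - mean) ** 3 for x in values) / n
    return m3 / m2 ** 1.5 if m2 > 0 else 0.0
```

A uniform strain field gives a Gini coefficient of zero; as localization concentrates strain into a few elements, both statistics rise, which is the behavior the G-S precursor tracks.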
Objectives: This review aimed to systematically synthesize the available research on the disclosure of diagnosis and related issues in childhood cancer from the perspectives of healthcare professionals, with the goal of informing the optimization of disclosure processes and meeting the communication needs of affected families. Methods: In accordance with the Joanna Briggs Institute (JBI) methodology for mixed-methods systematic reviews, the convergent segregated approach was used. Articles were retrieved from 11 databases: PubMed, Web of Science, CINAHL, CENTRAL, Embase, Ovid/Medline, PsycINFO, PsycArticles, Scopus, ERIC, and China National Knowledge Infrastructure (CNKI). The quality of the selected articles was assessed using the Mixed Methods Appraisal Tool (MMAT). The review protocol was registered on PROSPERO (CRD42024542746). Results: A total of 21 studies from 10 countries were included. Their methodological quality was generally medium to high, with MMAT scores ranging from 60% to 100%. The synthesis yielded three core themes: 1) the spectrum of professional and societal attitudes toward disclosure; 2) the dynamic practices of navigating disclosure amid uncertainty, including the timing, environment, stakeholders, and content of disclosure; and 3) factors influencing disclosure, including children's, parental, healthcare professionals', and socio-cultural factors. Conclusions: This review synthesized the perspectives and experiences of healthcare professionals regarding disclosure in childhood cancer, highlighting the complexity and multidimensional nature of this process in clinical practice. Future research should further investigate the experiences and needs of children and their parents, explore cultural variations in disclosure practices, develop context-appropriate assessment tools, and construct multidimensional intervention strategies to enhance the humanistic care and professional effectiveness of the disclosure process.
The Good Wife is an American TV series that focuses on women's independence, politics, and law. The drama has been remade in China, Japan, and South Korea. This research uses Nida's Functional Equivalence Theory to analyze the methods of its English-to-Chinese subtitle translation, taking into account the social, cultural, and historical differences between China and America. After data collection and case analysis, the study found that: (1) Five major translation methods are adopted in the subtitle translation of The Good Wife: free translation, variation, literal translation, addition, and omission. Among them, free translation is the most frequently used, while omission is used least. (2) The subtitle translation of films and TV series is constrained by time and space restrictions, socio-cultural differences, and other factors. When translating, translators should make use of humorous wording, euphemism, intonation, and other devices, and combine methods such as literal translation, free translation, variation, addition, and omission to seek equivalence in both the meaning and the function of subtitles under the guidance of Functional Equivalence Theory.
Geological prospecting and the identification of adverse geological features are essential in tunnel construction, providing critical information to ensure safety and guide engineering decisions. As tunnel projects extend into deeper and more mountainous terrain, engineers face increasingly complex geological conditions, including high water pressure, intense geo-stress, elevated geothermal gradients, and active fault zones. These conditions pose substantial risks such as high-pressure water inrush, large-scale collapses, and tunnel boring machine (TBM) blockages. Addressing these challenges requires advanced detection technologies capable of long-distance, high-precision, and intelligent assessment of adverse geology. This paper presents a comprehensive review of recent advances in tunnel geological ahead-prospecting methods. It summarizes the fundamental principles, technical maturity, key challenges, development trends, and real-world applications of various detection techniques. Airborne and semi-airborne geophysical methods enable large-scale reconnaissance for initial surveys in complex terrain. Tunnel- and borehole-based approaches offer high-resolution detection during excavation, including seismic ahead prospecting (SAP), TBM rock-breaking source seismic methods, full-time-domain tunnel induced polarization (TIP), borehole electrical resistivity, and ground penetrating radar (GPR). To address scenarios involving multiple coexisting adverse geologies, intelligent inversion and geological identification methods have been developed based on multi-source data fusion and artificial intelligence (AI) techniques. Overall, these advances significantly improve detection range, resolution, and geological characterization capabilities. The methods demonstrate strong adaptability to complex environments and provide reliable subsurface information, supporting safer and more efficient tunnel construction.
Dongguan (东莞) City, located in the Pearl River Delta, South China, is known for its rapid industrialization over the past 30 years. A total of 90 topsoil samples were collected from agricultural fields in the city, including vegetable and orchard soils, and eight heavy metals (As, Cu, Cd, Cr, Hg, Ni, Pb, and Zn), together with pH values and organic matter, were analyzed to evaluate the influence of anthropogenic activities on the environmental quality of agricultural soils and to identify the spatial distribution and possible sources of trace elements. Hg, Pb, and Cd have accumulated markedly in comparison with the soil background contents of Guangdong (广东) Province. Pollution is more serious in the western plain and the central region, which are densely occupied by industries and rivers. Multivariate and geostatistical methods were applied to differentiate the influences of natural processes and human activities on heavy-metal pollution of topsoils in the study area. The results of cluster analysis (CA) and factor analysis (FA) show that Ni, Cr, Cu, Zn, and As are grouped in factor F1, Pb in F2, and Cd and Hg in F3. The spatial pattern of the three factors is well demonstrated by geostatistical analysis. The first factor can be considered a natural source controlled by parent rocks; the second factor can be attributed to industrial and traffic pollution sources; and the third factor is mainly controlled by long-term anthropogenic activities, as a consequence of agricultural practices, fossil fuel consumption, and atmospheric deposition.
Landslide distribution and susceptibility mapping are fundamental steps for landslide-related hazard and disaster risk management, especially in the Himalaya region, where landslides have caused a great deal of death and damage to property. To better understand landslide conditions in the Nepal Himalaya, we investigated landslide distribution and susceptibility using landslide inventory data and 12 contributing factors in the Dailekh district, Western Nepal. Based on the frequency distribution of landslides, the relationship between landslides and the various contributing factors was determined. The landslide susceptibility was then calculated using logistic regression and statistical index methods, together with topographic factors (slope, aspect, relative relief, plan curvature, altitude, and topographic wetness index), non-topographic factors (distance from river, normalized difference vegetation index (NDVI), distance from road, precipitation, land use and land cover, and geology), and 470 (70%) of the total 658 landslides. Receiver operating characteristic (ROC) curve analysis using the remaining 198 (30%) landslides showed that the prediction rate (area under the curve, AUC) values for the logistic regression and statistical index methods were 0.826 and 0.823, with success rates of 0.793 and 0.811, respectively. The R-index values for the logistic regression and statistical index methods were 83.66 and 88.54, respectively, for the high-susceptibility classes. In general, this research concluded that the natural interplay of topographic and non-topographic factors strongly affects landslide occurrence, distribution, and susceptibility in the Nepal Himalaya region. Furthermore, the reliability of these two methods is verified for landslide susceptibility mapping in Nepal's central mountain region.
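The AUC values reported above come from ROC analysis of the held-out landslide inventory. As a self-contained illustration of the metric (not the authors' GIS workflow), the AUC of any susceptibility score can be computed from validation labels via the rank-sum identity, AUC = P(score of a random landslide cell > score of a random non-landslide cell), with ties counted as one half:

```python
def roc_auc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney rank-sum identity.
    labels: 1 for landslide cells, 0 for non-landslide cells."""
    pos = [s for lab, s in zip(labels, scores) if lab == 1]
    neg = [s for lab, s in zip(labels, scores) if lab == 0]
    wins = 0.0
    for p in pos:
        for q in neg:
            if p > q:
                wins += 1.0       # positive ranked above negative
            elif p == q:
                wins += 0.5       # tie counts as half a win
    return wins / (len(pos) * len(neg))
```

The O(n^2) pairwise loop is fine for a few hundred validation points, as used here; a sort-based O(n log n) version would be preferred for full raster maps.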
In this paper, the problem of increasing the authenticity of information transfer is formulated. To address it, control methods and algorithms based on statistical and structural information redundancy are presented. It is assumed that the controlled information is submitted as text-element images and contains redundancy caused by statistical relations and the non-uniform probability distribution of the transmitted data. The use of statistical redundancy makes it possible to develop adaptive rules of authenticity control that take into account the non-stationarity of image data during information transfer. The structural redundancy peculiar to the image container in a data transfer package is used to develop new rules for controlling information authenticity on the basis of pattern recognition mechanisms. The techniques offered in this work are used to estimate the authenticity of the structure of data transfer packages. A comparative analysis of the developed methods and algorithms shows that their efficiency is improved in terms of the probability of undetected errors, labor input, and cost of realization.
Statistical analysis is critical in medical research. The objective of this article is to summarize the appropriate use and reporting of commonly used statistical methods in medical research, on the basis of existing statistical guidelines and the authors' experience in reviewing manuscripts, and to provide recommendations for statistical applications and reporting.
In recent years there has been considerable new legislation, along with efforts by vehicle manufacturers, aimed at reducing pollutant emissions to improve air quality in urban areas. Carbon monoxide (CO) is a major pollutant in urban areas, and in this study we analyze monthly CO data from Valencia City, a representative Mediterranean city in terms of its structure and climatology. Temporal and spatial trends in pollution were recorded from a monitoring network consisting of five monitoring sites. A multiple linear model, incorporating meteorological parameters, annual cycles, and random error due to serial correlation, was used to estimate the temporal changes in pollution. An analysis performed on the meteorologically adjusted data reveals a significant decreasing trend in CO concentrations and an annual seasonal cycle. The model parameters are estimated by the least-squares method, and the standard errors of the parameters are determined while taking into account the serial correlation in the residuals. The decreasing trend implies, to a certain extent, an improvement in the air quality of the study area. The seasonal cycle shows variations mainly associated with traffic and meteorological patterns. Analysis of the stochastic spatial component shows that most of the intersite covariances can be described by an exponential variogram model.
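The core of such a trend-plus-seasonal-cycle model can be written as an ordinary least-squares problem. The sketch below is a simplified assumption, a linear trend plus one annual harmonic, omitting the meteorological covariates and the serial-correlation correction the study applies; it solves the normal equations X'X beta = X'y directly with a small Gaussian elimination:

```python
import math

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small linear system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_trend_cycle(y):
    """Least-squares fit of y_t = a + b*t + c*cos(2*pi*t/12) + d*sin(2*pi*t/12)
    for monthly data; returns (a, b, c, d), with b the trend per month."""
    rows = [[1.0, float(t),
             math.cos(2 * math.pi * t / 12),
             math.sin(2 * math.pi * t / 12)] for t in range(len(y))]
    k = 4
    XtX = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yt for r, yt in zip(rows, y)) for i in range(k)]
    return solve(XtX, Xty)
```

With noiseless synthetic data the fit recovers the generating coefficients exactly (up to floating-point error); with real CO series, the residuals would then be checked for serial correlation as the abstract describes.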
With the increasing use of complex industrial systems, it is important to identify the occurrence and cause of machine failures without stopping the machines. In this study, two new diagnosis methods based on the correlation between the sound and vibration emitted by a machine are derived. First, a diagnostic method that can detect which part of the machine is faulty, among several assumed faults, is proposed by simultaneously measuring time-series data on sound and vibration. Next, a diagnosis method based on estimating changes in the correlation between sound and vibration is considered, using prior information from the normal situation only. The effectiveness of the proposed theory is experimentally confirmed by applying it to data observed from a rotational machine driven by an electric motor.
In this study, petrophysical parameters such as density, sonic, neutron, and porosity were investigated and presented in 3D models. The 3D models were built using a geostatistical method to estimate the studied parameters throughout the reservoir. For this purpose, the variogram of each parameter was determined to specify the spatial correlation of the data. The resulting variograms were non-monotonic, which indicates structural anisotropy; the lithology and porosity parameters are the main causes of this anisotropy. The 3D models also show that the petrophysical data vary more in the northern part of the reservoir than in the southern part. In addition, the western limb of the reservoir shows higher porosity than the eastern limb. The variations of the sonic and neutron data are similar, whereas the density data show the opposite variation.
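The spatial-correlation step described above rests on the empirical semivariogram: for each lag distance h, the semivariance is the mean of half the squared differences of all point pairs separated by approximately h. A compact sketch (plain Python, hypothetical names; a real reservoir study would use a geostatistics package) is:

```python
import math

def semivariogram(points, values, lags, tol):
    """Empirical semivariance gamma(h) = mean of (z_i - z_j)^2 / 2 over
    point pairs whose separation distance lies within tol of each lag h."""
    gammas = []
    for h in lags:
        acc, cnt = 0.0, 0
        for i in range(len(points)):
            for j in range(i + 1, len(points)):
                d = math.dist(points[i], points[j])
                if abs(d - h) <= tol:
                    acc += (values[i] - values[j]) ** 2
                    cnt += 1
        gammas.append(acc / (2 * cnt) if cnt else float("nan"))
    return gammas
```

A variogram that rises monotonically to a sill indicates simple spatial continuity; the non-monotonic shapes reported in the study are what signal anisotropy, which would then be examined by computing directional variograms.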
The sea surface temperature (SST) in the Indian Ocean affects the regional climate over the Asian continent mostly through a modulation of the monsoon system. It is still difficult to provide an a priori indication of the seasonal variability over the Indian Ocean. It is widely recognized that the warm and cold SST events over the tropical Indian Ocean are strongly linked to those of the equatorial eastern Pacific. In this study, a statistical prediction model has been developed to predict the monthly SST over the tropical Indian Ocean. This is a linear regression model based on the lag relationship between the SST over the tropical Indian Ocean and the Nino3.4 (5°S-5°N, 170°W-120°W) SST Index. The predictor (i.e., the Nino3.4 SST Index) has been operationally predicted by a large-ensemble El Niño-Southern Oscillation (ENSO) forecast system with coupled data assimilation (Leefs_CDA), which achieves high predictive skill up to a 24-month lead time for the equatorial eastern Pacific SST. As a result, the prediction skill of the present statistical model over the tropical Indian Ocean is better than that of persistence prediction for January 1982 through December 2009.
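The core of such a lag-relationship model is ordinary least squares on lagged pairs: regress the target SST at month t on the predictor index at month t - lag. A minimal sketch (hypothetical helper name; the operational system is far more elaborate) is:

```python
def lag_regression(predictor, target, lag):
    """Fit target[t] = a + b * predictor[t - lag] by ordinary least squares
    over all months where both values exist; returns (a, b)."""
    x = predictor[:len(predictor) - lag] if lag else list(predictor)
    y = target[lag:]
    n = len(y)
    xm = sum(x) / n
    ym = sum(y) / n
    b = (sum((xi - xm) * (yi - ym) for xi, yi in zip(x, y))
         / sum((xi - xm) ** 2 for xi in x))
    return ym - b * xm, b
```

Given a forecast of the predictor index (here, from the ENSO system), the fitted (a, b) pair then yields an Indian Ocean SST prediction at the chosen lead time.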
The development of adaptation measures to climate change relies on data from climate models or impact models. Analyzing these large data sets, or ensembles of them, requires statistical methods. In this paper, the methodological approach to collecting, structuring, and publishing the methods that have been used or developed by former or present adaptation initiatives is described. The intention is to communicate the achieved knowledge and thus support future users. A key component is the participation of users in the development process. The main elements of the approach are standardized, template-based descriptions of the methods, including their specific applications, references, and method assessment. All contributions have been quality-checked, sorted, and placed in a larger context. The result is a report on statistical methods that is freely available in printed and online versions. Examples of how to use the methods are presented in this paper and are also included in the brochure.
Data on traffic flow, speed, and density are required for planning, designing, and modelling the traffic stream for all parts of the road system. Specialized equipment such as stationary counters is used to record volume and speed, but it is expensive, difficult to set up, and requires periodic maintenance. The moving observer method was proposed in 1954 by Wardrop and Charlesworth to estimate these variables inexpensively. Basically, the observer counts the number of vehicles that overtake the test vehicle, the number of vehicles it passes, and the number of vehicles encountered while traveling in the opposite direction. The trip time is recorded for both travel directions, and the length of the road segment is measured. These variables are then used to estimate speeds and volumes. Using a westbound section of Interstate Highway 30 (I-30) in the DFW area, this study examined the accuracy and feasibility of the method by comparing it with the stationary observer method as the standard for such counts. Statistical tests were used to assess the accuracy. Results show that the method provides accurate volume and speed estimates compared with the stationary method for a road segment with three lanes per direction, especially when several runs are taken.
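The moving observer estimates follow directly from Wardrop and Charlesworth's formulas: flow q = (x + y) / (t_a + t_w), mean stream travel time t = t_w - y/q, and space-mean speed v = L/t, where x is the count of vehicles met during the against-flow run, y the net count (overtaking minus overtaken) during the with-flow run, and t_a, t_w the two run times. A direct transcription (hypothetical function name):

```python
def moving_observer(x_opposing, y_net, t_with, t_against, length):
    """Wardrop moving-observer estimates for one road segment.
    x_opposing: vehicles met travelling against the stream;
    y_net: vehicles that overtook the observer minus vehicles passed;
    t_with, t_against: observer run times with/against the stream;
    length: segment length. Returns (flow, mean travel time, speed)."""
    q = (x_opposing + y_net) / (t_against + t_with)  # flow, veh per unit time
    t = t_with - y_net / q                           # mean stream travel time
    return q, t, length / t                          # speed = length / time
```

For example, meeting 100 vehicles on a 0.1 h opposing run and being net-overtaken by 10 vehicles on a 0.1 h with-flow run over a 2 km segment gives q = 110 / 0.2 = 550 veh/h, with the travel time and speed following from the two remaining formulas. Averaging over several runs, as the study recommends, reduces the sampling noise in y.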
The validity of the compensation between enthalpies and entropies obtained from calorimetric methods was statistically examined for the first time based on computer simulations. It turned out that several enthalpy-entropy compensations claimed in the literature on the basis of calorimetric measurements were statistically correct. Interestingly, a linear relationship was found between the slopes and the correlation coefficients of the TΔS-ΔH plots for compensation behaviors of different physical origin.
Statistical approaches for evaluating causal effects and for discovering causal networks are discussed in this paper. A causal relation between two variables is different from an association or correlation between them. An association measurement between two variables may change dramatically, even from positive to negative, when a third variable is omitted; this is called the Yule-Simpson paradox. We discuss how to evaluate the causal effect of a treatment or exposure on an outcome so as to avoid the Yule-Simpson paradox. Surrogates and intermediate variables are often used to reduce measurement costs or duration when the measurement of endpoint variables is expensive, inconvenient, infeasible, or unobservable in practice. Many criteria for surrogates have been proposed. However, it is possible that, for a surrogate satisfying these criteria, a treatment has a positive effect on the surrogate, which in turn has a positive effect on the outcome, and yet the treatment has a negative effect on the outcome; this is called the surrogate paradox. We discuss criteria for surrogates that avoid the surrogate paradox. Causal networks, which describe the causal relationships among a large number of variables, have been applied in many research fields, and it is important to discover their structures from observed data. We propose a recursive approach for discovering a causal network in which the structural learning of a large network is decomposed recursively into the learning of small networks. To further discover causal relationships, we present an active learning approach based on external interventions on some variables. When we focus on the causes of an outcome of interest, instead of discovering a whole network, we propose a local learning approach to discover the causes that affect the outcome.
In general, digital images can be classified into photographs, textual documents, and mixed documents. This taxonomy is very useful in many applications, such as archiving. However, there are no effective methods to perform this classification automatically. In this paper, we present a method for classifying and archiving documents into the following semantic classes: photographs, textual documents, and mixed documents. Our method is based on combining low-level image features such as the mean, standard deviation, and skewness. Both decision tree and neural network classifiers are used for the classification task.
Spatial downscaling methods are widely used for the production of bioclimatic variables (e.g., temperature and precipitation) in studies related to species' ecological niches and drainage basin management and planning. This study applied three different statistical methods, i.e., moving window regression (MWR), nonparametric multiplicative regression (NPMR), and the generalized linear model (GLM), to downscale the annual mean temperature (Bio1) and annual precipitation (Bio12) in central Iran from a coarse scale (1 km × 1 km) to a fine scale (250 m × 250 m). Elevation, aspect, distance from the sea, and the normalized difference vegetation index (NDVI) were used as covariates to create the downscaled bioclimatic variables. Model assessment was performed by comparing model outcomes with observational data from weather stations. The coefficient of determination (R2), bias, and root-mean-square error (RMSE) were used to evaluate the models and covariates. Elevation effectively explained the changes in the bioclimatic factors related to temperature and precipitation. All three models could downscale the mean annual temperature data with similar R2, RMSE, and bias values. MWR had the best performance and highest accuracy in downscaling annual precipitation (R2 = 0.70; RMSE = 123.44). In general, the two nonparametric models, i.e., MWR and NPMR, can be reliably used for the downscaling of bioclimatic variables, which have wide applications in species distribution modeling.
To sustain the management of natural resources, land use and land cover (LULC) should be spatially mapped and temporally monitored using GIS. For large areas, conventional methods are laborious; alternatively, remote sensing can be used for LULC mapping and monitoring. The normalized difference vegetation index (NDVI) is the most widely used vegetation index for crop identification and phenology. For agricultural areas, crop statistics are estimated yearly at the regional level following administrative units. However, these statistics give no information about the spatial extent of the crops within the administrative units, and such information is crucial for crop monitoring. The main objective of this research was to fill this gap, based on statistical methods and GIS, by adding spatial information to crop statistics through the analysis of temporal NDVI profiles. The study area covers 1300 km2. The data consist of 147 decadal SPOT Vegetation NDVI images. Crop statistics were compiled on a seasonal basis and aggregated to different administrative levels. The images were processed using an unsupervised classification method, with a series of classification runs corresponding to different numbers of clusters. Using stepwise multiple linear regression, cropped areas from the agricultural statistics were related to the areas of each NDVI-profile cluster, and the estimated regression coefficients were used to generate maps showing cropped fractions by map unit. The optimal number of clusters was 18; similar profiles were merged, leading to eight clusters. The results show that, for example, rice was grown in autumn on 50% of the area of the map units represented by NDVI-profile group 4 and on 75% of the area of group 7, while in spring it was grown on 2%, 69%, and 25% of the areas of NDVI-profile groups 2, 6, and 7, respectively. The regression coefficients were used to generate a crop map. This research illustrates the benefit of integrating statistical methods, GIS, remote sensing, and crop statistics to delineate NDVI-profile clusters with their corresponding agricultural land cover map units and to link these statistics to geographical locations. These map units can be used as a reference for future monitoring of natural resources, in particular crop growth and development, and for forecasting crop production, yield, and stresses such as drought.
Modal parameters can accurately characterize the structural dynamic properties and assess the physical state of a structure. It is therefore particularly important to identify structural modal parameters from the monitoring data in a structural health monitoring (SHM) system, so as to provide a scientific basis for structural damage identification and dynamic model updating. In view of this, this paper reviews methods for identifying structural modal parameters under environmental excitation and briefly describes how to identify structural damage based on the derived modal parameters. The paper primarily introduces data-driven modal parameter identification methods (e.g., time-domain, frequency-domain, and time-frequency-domain methods), briefly describes damage identification methods based on variations of modal parameters (e.g., natural frequencies, mode shapes, and curvature mode shapes) and modal validation methods (e.g., the stability diagram and the modal assurance criterion), and further discusses the current status of artificial intelligence (AI) methods for modal parameter identification and damage identification. Based on this analysis, the main development trends of structural modal parameter identification and damage identification methods are given, providing scientific references for the optimized design and functional upgrading of SHM systems.
Funding: supported by the National Natural Science Foundation of China (Grant No. 52339001).
Funding: Supported by the Fuxing Nursing Research Foundation of Fudan University [FNF202352].
Abstract: Objectives: This review aimed to systematically synthesize the available research on the disclosure of diagnosis and related issues in childhood cancer from the perspectives of healthcare professionals, with the goal of informing the optimization of disclosure processes and meeting the communication needs of affected families. Methods: In accordance with the Joanna Briggs Institute (JBI) methodology for mixed methods systematic reviews, the convergent segregated approach was used. Articles were retrieved from 11 databases: PubMed, Web of Science, CINAHL, CENTRAL, Embase, Ovid/Medline, PsycINFO, PsycArticles, Scopus, ERIC, and China National Knowledge Infrastructure (CNKI). The quality of the selected articles was assessed using the Mixed Methods Appraisal Tool (MMAT). The review protocol was registered on PROSPERO (CRD42024542746). Results: A total of 21 studies from 10 countries were included. Their methodological quality was generally medium to high, with MMAT scores ranging from 60% to 100%. The synthesis yielded three core themes: 1) the spectrum of professional and societal attitudes toward disclosure; 2) the dynamic practices of navigating disclosure amid uncertainty, including the timing, environment, stakeholders, and content of disclosure; and 3) factors influencing disclosure, including children's, parental, healthcare professionals', and socio-cultural factors. Conclusions: This review synthesized the perspectives and experiences of healthcare professionals regarding disclosure in childhood cancer, highlighting the complexity and multidimensional nature of this process in clinical practice. Future research should further investigate the experiences and needs of children and their parents, explore cultural variations in disclosure practices, develop context-appropriate assessment tools, and construct multidimensional intervention strategies to enhance the humanistic care and professional effectiveness of the disclosure process.
Abstract: The Good Wife is an American TV series that focuses on women's independence, politics, and law. The drama has been remade in China, Japan, and South Korea. This research uses Nida's Functional Equivalence Theory to analyze the methods used in its English-to-Chinese subtitle translation, taking into account the social, cultural, and historical differences between China and America. After data collection and case analysis, the study found that: (1) Five major translation methods are adopted in the subtitle translation of The Good Wife: free translation, variation, literal translation, addition, and omission. Among them, free translation is the most frequently used, while omission is used least. (2) The subtitle translation of films and TV series is constrained by time and space restrictions, socio-cultural differences, and other factors. When translating, translators should make use of humorous wording, euphemism, intonation, and similar devices, and combine methods such as literal translation, free translation, variation, addition, and omission to achieve equivalence in both the meaning and the function of subtitles under the guidance of Functional Equivalence Theory.
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 52021005, 52325904, and 51991391).
Abstract: Geological prospecting and the identification of adverse geological features are essential in tunnel construction, providing critical information to ensure safety and guide engineering decisions. As tunnel projects extend into deeper and more mountainous terrains, engineers face increasingly complex geological conditions, including high water pressure, intense geo-stress, elevated geothermal gradients, and active fault zones. These conditions pose substantial risks such as high-pressure water inrush, large-scale collapses, and tunnel boring machine (TBM) blockages. Addressing these challenges requires advanced detection technologies capable of long-distance, high-precision, and intelligent assessment of adverse geology. This paper presents a comprehensive review of recent advancements in tunnel geological ahead-prospecting methods. It summarizes the fundamental principles, technical maturity, key challenges, development trends, and real-world applications of various detection techniques. Airborne and semi-airborne geophysical methods enable large-scale reconnaissance for initial surveys in complex terrain. Tunnel- and borehole-based approaches offer high-resolution detection during excavation, including seismic ahead prospecting (SAP), TBM rock-breaking source seismic methods, full-time-domain tunnel induced polarization (TIP), borehole electrical resistivity, and ground penetrating radar (GPR). To address scenarios involving multiple coexisting adverse geologies, intelligent inversion and geological identification methods have been developed based on multi-source data fusion and artificial intelligence (AI) techniques. Overall, these advances significantly improve detection range, resolution, and geological characterization capabilities. The methods demonstrate strong adaptability to complex environments and provide reliable subsurface information, supporting safer and more efficient tunnel construction.
Funding: Supported by the Ministry of Land and Resources of China (No. [2005]011-16), the State Environment Protection Administration of China (No. 2001-1-2), the State Key Laboratory of Geological Processes and Mineral Resources, China University of Geosciences, and the Guangdong Provincial Office of Sciences and Technology via the NSF Team Project and Key Project (Nos. 06202438, 2004A3030800).
Abstract: Dongguan (东莞) City, located in the Pearl River Delta, South China, is famous for its rapid industrialization over the past 30 years. A total of 90 topsoil samples were collected from agricultural fields in the city, including vegetable and orchard soils, and eight heavy metals (As, Cu, Cd, Cr, Hg, Ni, Pb, and Zn) together with other items (pH values and organic matter) were analyzed, to evaluate the influence of anthropic activities on the environmental quality of agricultural soils and to identify the spatial distribution and possible sources of trace elements. The elements Hg, Pb, and Cd have accumulated remarkably, in comparison with the soil background content of elements in Guangdong (广东) Province. Pollution is more serious in the western plain and the central region, where industries and rivers are densely distributed. Multivariate and geostatistical methods were applied to differentiate the influences of natural processes and human activities on heavy-metal pollution in the topsoils of the study area. The results of cluster analysis (CA) and factor analysis (FA) show that Ni, Cr, Cu, Zn, and As are grouped in factor F1, Pb in F2, and Cd and Hg in F3. The spatial patterns of the three factors are well demonstrated by geostatistical analysis. The first factor can be considered a natural source controlled by parent rocks. The second factor can be referred to as industrial and traffic pollution sources. The source of the third factor is mainly controlled by long-term anthropic activities, as a consequence of agricultural activities, fossil fuel consumption, and atmospheric deposition.
Funding: Under the auspices of the CAS Overseas Institutions Platform Project (No. 131C11KYSB20200033), the National Natural Science Foundation of China (No. 42071349), and the Sichuan Science and Technology Program (No. 2020JDJQ0003).
Abstract: Landslide distribution and susceptibility mapping are fundamental steps for landslide-related hazard and disaster risk management, especially in the Himalaya region, where landslides have caused a great deal of death and damage to property. To better understand landslide conditions in the Nepal Himalaya, we investigated landslide distribution and susceptibility using landslide inventory data and 12 contributing factors in the Dailekh district, Western Nepal. Based on an evaluation of the frequency distribution of the landslides, the relationship between the landslides and the various contributing factors was determined. Landslide susceptibility was then calculated using logistic regression and statistical index methods with six topographic factors (slope, aspect, relative relief, plan curvature, altitude, and topographic wetness index), six non-topographic factors (distance from river, normalized difference vegetation index (NDVI), distance from road, precipitation, land use and land cover, and geology), and 470 (70%) of the 658 inventoried landslides. Receiver operating characteristic (ROC) curve analysis using the remaining 198 (30%) landslides showed that the prediction-rate (area under the curve, AUC) values for the two methods (logistic regression and statistical index) were 0.826 and 0.823, with success rates of 0.793 and 0.811, respectively. The R-Index values for the logistic regression and statistical index methods were 83.66 and 88.54, respectively, for the high-susceptibility classes. In general, this research concludes that the natural interplay of topographic and non-topographic factors strongly affects landslide occurrence, distribution, and susceptibility in the Nepal Himalaya region. Furthermore, the reliability of these two methods is verified for landslide susceptibility mapping in Nepal's central mountain region.
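The logistic regression workflow summarized here (70/30 split, then AUC on the held-out landslides) can be sketched with scikit-learn. The sketch below uses entirely synthetic factor values with made-up coefficients, so the resulting AUC only illustrates the procedure, not the study's 0.826:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 658  # same inventory size as the abstract; the data here are synthetic
slope = rng.uniform(0, 60, n)       # degrees
dist_road = rng.uniform(0, 5, n)    # km
ndvi = rng.uniform(-0.1, 0.9, n)
# Hypothetical susceptibility: steeper slopes, nearer roads, sparser vegetation
logit = 0.08 * slope - 0.6 * dist_road - 2.0 * ndvi - 0.5
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)
X = np.column_stack([slope, dist_road, ndvi])

# 70/30 training/validation split, mirroring the abstract's protocol
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"prediction-rate AUC = {auc:.3f}")
```

In a real application the rows would be raster cells with landslide presence/absence labels and the twelve factor layers as columns; the AUC computed on the withheld 30% corresponds to the prediction-rate curve the abstract reports.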
Abstract: In this paper, the problem of increasing the authenticity of information transfer is formulated, and control methods and algorithms based on statistical and structural information redundancy are presented to address it. It is assumed that the controlled information is submitted as text-element images and contains redundancy caused by statistical relations and the non-uniform probability distribution of the transmitted data. The use of statistical redundancy allows adaptive authenticity-control rules to be developed that take into account the non-stationarity of image data during information transfer. The structural redundancy peculiar to the image container in a data-transfer package is used to develop new rules for controlling information authenticity on the basis of pattern recognition mechanisms. The techniques offered in this work are used to estimate the authenticity of the structure of data-transfer packages. The results of a comparative analysis of the developed methods and algorithms show that they improve efficiency in terms of the probability of undetected errors, labour input, and cost of realization.
Abstract: Statistical analysis is critical in medical research. The objective of this article is to summarize the appropriate use and reporting of commonly used statistical methods in medical research, on the basis of existing statistical guidelines and the authors' experience in reviewing manuscripts, and to provide recommendations for statistical applications and reporting.
Abstract: In recent years there has been considerable new legislation, together with efforts by vehicle manufacturers, aimed at reducing pollutant emissions to improve air quality in urban areas. Carbon monoxide is a major pollutant in urban areas, and in this study we analyze monthly carbon monoxide (CO) data from Valencia City, a representative Mediterranean city in terms of its structure and climatology. Temporal and spatial trends in pollution were recorded from a monitoring network that consisted of five monitoring sites. A multiple linear model, incorporating meteorological parameters, annual cycles, and random error due to serial correlation, was used to estimate the temporal changes in pollution. An analysis performed on the meteorologically adjusted data reveals a significant decreasing trend in CO concentrations and an annual seasonal cycle. The model parameters are estimated by applying the least-squares method, and the standard errors of the parameters are determined while taking into account the serial correlation in the residuals. The decreasing trend implies, to a certain extent, an improvement in the air quality of the study area. The seasonal cycle shows variations that are mainly associated with traffic and meteorological patterns. Analysis of the stochastic spatial component shows that most of the intersite covariances can be modeled using an exponential variogram model.
Abstract: Because of the increased use of various complex industrial systems, it is important to identify the occurrence and cause of machine failure without stopping the machine. In this study, two new diagnosis methods based on the correlation between the sound and vibration emitted from a machine are derived. First, a diagnostic method that can detect which part of the machine is faulty, among several assumed faults, is proposed by simultaneously measuring time-series data of sound and vibration. Next, a diagnosis method based on estimating changes in the correlation between sound and vibration is considered, using prior information from the normal situation only. The effectiveness of the proposed theory is experimentally confirmed by applying it to data observed from a rotational machine driven by an electric motor.
Abstract: In this study, petrophysical parameters such as density, sonic, neutron, and porosity were investigated and presented in 3D models. The 3D models were built using a geostatistical method to estimate the studied parameters throughout the entire reservoir. For this purpose, the variogram of each parameter was determined to specify the spatial correlation of the data. The resulting variograms were non-monotonic, which indicates structural anisotropy; the lithology and porosity parameters are the main causes of this anisotropy. The 3D models also show that the petrophysical data vary more in the northern part of the reservoir than in the southern part. In addition, the western limb of the reservoir shows higher porosity than the eastern limb. The variations of the sonic and neutron data are similar, whereas the density data show the opposite variation.
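The variogram workflow behind such geostatistical models (also the exponential variogram mentioned in the air-quality abstract above) can be sketched in a few lines. This is a simplified 1-D illustration on a synthetic, autocorrelated "porosity" transect, not the reservoir data; the exponential model form γ(h) = nugget + sill·(1 − e^(−h/range)) is standard:

```python
import numpy as np
from scipy.optimize import curve_fit

def exp_variogram(h, nugget, sill, vrange):
    """Exponential model: gamma(h) = nugget + sill * (1 - exp(-h / vrange))."""
    return nugget + sill * (1.0 - np.exp(-h / vrange))

def empirical_variogram(coords, values, lag_edges):
    """Mean semivariance 0.5*(z_i - z_j)^2, binned by separation distance."""
    d = np.abs(coords[:, None] - coords[None, :])
    sv = 0.5 * (values[:, None] - values[None, :]) ** 2
    gam = np.array([sv[(d > lo) & (d <= hi)].mean()
                    for lo, hi in zip(lag_edges[:-1], lag_edges[1:])])
    return 0.5 * (lag_edges[:-1] + lag_edges[1:]), gam

# Synthetic 1-D "porosity" transect with short-range correlation (AR(1))
rng = np.random.default_rng(1)
x = np.linspace(0.0, 100.0, 200)
z = np.zeros(x.size)
for i in range(1, x.size):
    z[i] = 0.9 * z[i - 1] + rng.normal(0.0, 0.5)

h, gam = empirical_variogram(x, z, np.linspace(0.0, 30.0, 13))
params, _ = curve_fit(exp_variogram, h, gam, p0=[0.1, 1.0, 5.0],
                      bounds=(0.0, np.inf))
print(params)  # fitted nugget, sill, and range
```

In a real study the fitted model would then parameterize kriging of the property over the reservoir grid; direction-dependent variograms (computed per azimuth) would reveal the anisotropy the abstract describes.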
Funding: Supported by the National Basic Research Program of China (Grant No. 2012CB417404) and the National Natural Science Foundation of China (Grant Nos. 41075064 and 41176014).
Abstract: The sea surface temperature (SST) in the Indian Ocean affects the regional climate over the Asian continent mostly through a modulation of the monsoon system. It is still difficult to provide an a priori indication of the seasonal variability over the Indian Ocean. It is widely recognized that the warm and cold SST events over the tropical Indian Ocean are strongly linked to those of the equatorial eastern Pacific. In this study, a statistical model has been developed to predict the monthly SST over the tropical Indian Ocean. It is a linear regression model based on the lag relationship between the SST over the tropical Indian Ocean and the Nino3.4 (5°S-5°N, 170°W-120°W) SST Index. The predictor (i.e., the Nino3.4 SST Index) has been operationally predicted by a large-ensemble El Niño and Southern Oscillation (ENSO) forecast system with coupled data assimilation (Leefs_CDA), which achieves a high predictive skill of up to a 24-month lead time for the equatorial eastern Pacific SST. As a result, the prediction skill of the present statistical model over the tropical Indian Ocean is better than that of persistence prediction for January 1982 through December 2009.
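The core of such a model is a lagged linear regression between two monthly series. A minimal sketch with NumPy, using a synthetic stand-in index and an assumed illustrative lag of four months (the paper's actual lags and coefficients are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)
months = 336  # e.g. Jan 1982 - Dec 2009
# Stand-in "Nino3.4" index: a slow oscillation plus noise (synthetic)
nino34 = (np.sin(2 * np.pi * np.arange(months) / 48.0)
          + 0.3 * rng.normal(size=months))
lag = 4  # assumed lag (months) of Indian Ocean SST behind Nino3.4
# Synthetic "Indian Ocean SST" that lags the index with added noise
io_sst = 0.5 * np.roll(nino34, lag) + 0.1 * rng.normal(size=months)

# Align predictor and predictand on the valid (non-wrapped) overlap
x, y = nino34[:-lag], io_sst[lag:]
b, a = np.polyfit(x, y, 1)  # lagged linear regression y = a + b * x
corr = np.corrcoef(y, a + b * x)[0, 1]
print(b, corr)
```

With real data, the regression would be fitted per calendar month and per lead time, and the predictor would be the forecast Nino3.4 index rather than its observed value.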
Abstract: The development of adaptation measures to climate change relies on data from climate models or impact models. Analyzing these large data sets, or an ensemble of them, requires the use of statistical methods. In this paper, the methodological approach to collecting, structuring, and publishing the methods that have been used or developed by former or present adaptation initiatives is described. The intention is to communicate the achieved knowledge and thus support future users. A key component is the participation of users in the development process. The main elements of the approach are standardized, template-based descriptions of the methods, including the specific applications, references, and a method assessment. All contributions have been quality-checked, sorted, and placed in a larger context. The result is a report on statistical methods that is freely available in printed and online versions. Examples of how to use the methods are presented in this paper and are also included in the brochure.
Abstract: Data on traffic flow, speed, and density are required for the planning, design, and modelling of traffic streams for all parts of the road system. Specialized equipment such as stationary counters is used to record volume and speed, but such equipment is expensive, difficult to set up, and requires periodic maintenance. The moving observer method was proposed in 1954 by Wardrop and Charlesworth to estimate these variables inexpensively. Basically, the observer counts the number of vehicles overtaken, the number of vehicles that pass the observer, and the number of vehicles encountered while traveling in the opposite direction; the trip time is recorded for both travel directions, and the length of the road segment is measured. These variables are then used to estimate speeds and volumes. On a westbound section of Interstate Highway 30 (I-30) in the DFW area, this study examined the accuracy and feasibility of the method by comparing it with the stationary observer method as the standard for such counts. Statistical tests were used to assess the accuracy. The results show that the moving observer method provides accurate volume and speed estimates compared with the stationary method for a road segment with three lanes per direction, especially when several runs are taken.
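The counts described above feed the standard Wardrop-Charlesworth estimators: flow q = (x + y) / (t_a + t_w) and mean journey time t̄ = t_w − y/q, from which the space-mean speed follows as segment length over t̄. A small sketch with made-up observation numbers:

```python
def moving_observer(x_opposing, y_net, t_against_h, t_with_h, length_km):
    """Wardrop-Charlesworth moving observer estimates.
    x_opposing : vehicles met while driving against the stream
    y_net      : (vehicles that passed the observer) - (vehicles overtaken)
                 while driving with the stream
    t_against_h, t_with_h : travel times in hours against / with the stream
    Returns (flow in veh/h, mean journey time in h, space-mean speed in km/h)."""
    q = (x_opposing + y_net) / (t_against_h + t_with_h)  # stream flow
    t_mean = t_with_h - y_net / q                        # mean journey time
    return q, t_mean, length_km / t_mean

# Illustrative (made-up) run: 80 opposing vehicles met, net +2 passings,
# 3 minutes each way on a 2-km segment
q, t_mean, v = moving_observer(80, 2, 3 / 60, 3 / 60, 2.0)
print(q, v)  # flow = 820 veh/h
```

Averaging q and v over several runs, as the abstract recommends, reduces the sampling error of the single-run estimates.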
基金the NSFC. Insights from Prof. E. Grunwald of Brandeis University and from Prof. W. Linert of Technical University of Vienna are
Abstract: The validity of the compensation between enthalpies and entropies obtained from calorimetric methods was statistically examined for the first time using computer simulations. It turned out that several enthalpy-entropy compensations claimed in the literature on the basis of calorimetric measurements are statistically sound. Interestingly, a linear relationship between the slopes and the correlation coefficients of the TΔS-ΔH plots was found, regardless of the physical origin of the compensation behavior.
Abstract: Statistical approaches for evaluating causal effects and for discovering causal networks are discussed in this paper. A causal relation between two variables is different from an association or correlation between them. An association measurement between two variables may change dramatically, even from positive to negative, when a third variable is omitted; this is called the Yule-Simpson paradox. We discuss how to evaluate the causal effect of a treatment or exposure on an outcome so as to avoid the Yule-Simpson paradox. Surrogates and intermediate variables are often used to reduce measurement costs or duration when the measurement of endpoint variables is expensive, inconvenient, infeasible, or unobservable in practice. There have been many criteria for surrogates. However, it is possible that, for a surrogate satisfying these criteria, a treatment has a positive effect on the surrogate, which in turn has a positive effect on the outcome, and yet the treatment has a negative effect on the outcome; this is called the surrogate paradox. We discuss criteria for surrogates that avoid the surrogate paradox. Causal networks, which describe the causal relationships among a large number of variables, have been applied in many research fields, and it is important to discover their structures from observed data. We propose a recursive approach for discovering a causal network in which the structural learning of a large network is decomposed recursively into the learning of small networks. To further discover causal relationships, we present an active learning approach in terms of external interventions on some variables. When we focus on the causes of an outcome of interest, instead of discovering a whole network, we propose a local learning approach to discover the causes that affect the outcome.
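The Yule-Simpson paradox mentioned above is easy to demonstrate numerically. The snippet below uses the classic kidney-stone success counts (purely as a textbook illustration, not data from this paper): treatment A wins inside each stratum, yet B wins when the strata are pooled.

```python
# Success counts (successes, trials) for two treatments in two strata.
small = {"A": (81, 87), "B": (234, 270)}    # small stones
large = {"A": (192, 263), "B": (55, 80)}    # large stones

def rate(s, n):
    return s / n

# Pool the two strata (this is the "omitted third variable" step)
pooled = {t: (small[t][0] + large[t][0], small[t][1] + large[t][1])
          for t in ("A", "B")}

# A has the higher success rate within EACH stratum ...
assert rate(*small["A"]) > rate(*small["B"])
assert rate(*large["A"]) > rate(*large["B"])
# ... yet B has the higher success rate in the pooled table.
assert rate(*pooled["B"]) > rate(*pooled["A"])
print(rate(*pooled["A"]), rate(*pooled["B"]))  # 0.78 vs ~0.83
```

The reversal arises because stone size is associated with both the treatment assignment and the outcome, which is exactly why the paper argues that causal effects, not raw associations, must be evaluated.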
Abstract: In general, digital images can be classified into photographs, textual documents, and mixed documents. This taxonomy is very useful in many applications, such as archiving. However, there are no fully effective methods to perform this classification automatically. In this paper, we present a method for classifying and archiving documents into the following semantic classes: photographs, textual documents, and mixed documents. Our method is based on combining low-level image features such as the mean, standard deviation, and skewness. Both decision tree and neural network classifiers are used for the classification task.
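The intuition behind these low-level features can be shown with NumPy on synthetic images (the thresholds and image statistics here are illustrative assumptions, not the paper's): a photograph has a broad, roughly symmetric intensity histogram, while a text page is dominated by bright background with sparse dark ink, giving a strongly skewed distribution.

```python
import numpy as np

def low_level_features(img):
    """Mean, standard deviation and skewness of the intensity values --
    the kind of low-level features the classifiers are trained on."""
    v = np.asarray(img, dtype=float).ravel()
    m, s = v.mean(), v.std()
    return np.array([m, s, np.mean(((v - m) / s) ** 3)])

rng = np.random.default_rng(0)
# A "photograph": broad, roughly symmetric intensity distribution
photo = rng.normal(128, 40, (64, 64)).clip(0, 255)
# A "textual" page: bright background with sparse dark ink pixels
text = np.full((64, 64), 245.0)
text[rng.random((64, 64)) < 0.1] = 10.0

print(low_level_features(photo))  # skewness near 0
print(low_level_features(text))   # strongly negative skewness
```

A decision tree or neural network trained on such feature vectors can then separate the two classes; mixed documents fall between the extremes.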
Abstract: Spatial downscaling methods are widely used for the production of bioclimatic variables (e.g., temperature and precipitation) in studies related to species' ecological niches and drainage basin management and planning. This study applied three different statistical methods, i.e., moving window regression (MWR), nonparametric multiplicative regression (NPMR), and the generalized linear model (GLM), to downscale the annual mean temperature (Bio1) and annual precipitation (Bio12) in central Iran from a coarse scale (1 km × 1 km) to a fine scale (250 m × 250 m). Elevation, aspect, distance from the sea, and the normalized difference vegetation index (NDVI) were used as covariates to create the downscaled bioclimatic variables. Model assessment was performed by comparing model outcomes with observational data from weather stations. The coefficient of determination (R2), bias, and root-mean-square error (RMSE) were used to evaluate the models and covariates. Elevation effectively explained the changes in the temperature- and precipitation-related bioclimatic factors. All three models could downscale the mean annual temperature data with similar R2, RMSE, and bias values. MWR had the best performance and highest accuracy in downscaling annual precipitation (R2 = 0.70; RMSE = 123.44). In general, the two nonparametric models, i.e., MWR and NPMR, can be reliably used for the downscaling of bioclimatic variables, which have wide applications in species distribution modeling.
Abstract: To sustain the management of natural resources, land use and land cover (LULC) should be spatially mapped and temporally monitored using GIS. For large areas, conventional methods are laborious; alternatively, remote sensing can be used for LULC mapping and monitoring. The normalized difference vegetation index (NDVI) is the most widely used vegetation index for crop identification and phenology. For agricultural areas, crop statistics are estimated yearly at the regional level, following administrative units. However, these statistics give no information about the spatial extent of the crops within administrative units; such information is crucial for crop monitoring. The main objective of this research was to fill this gap, based on statistical methods and GIS, by adding spatial information to crop statistics through the analysis of temporal NDVI profiles. The study area covers 1300 km2. The data consist of 147 decadal SPOT Vegetation NDVI images. Crop statistics were compiled on a seasonal basis and aggregated to different administrative levels. The images were processed using an unsupervised classification method, with a series of classification runs corresponding to different numbers of clusters. Using stepwise multiple linear regression, cropped areas from the agricultural statistics were related to the areas of each NDVI-profile cluster, and the estimated regression coefficients were used to generate maps showing cropped fractions by map unit. The optimal number of clusters was 18; similar profiles were merged, leading to eight clusters. The results show that, for example, rice was grown in autumn on 50% of the area of map units represented by NDVI-profile group 4 and on 75% of the area of group 7, while it was grown in spring on 2%, 69%, and 25% of the areas of NDVI-profile groups 2, 6, and 7, respectively. The regression coefficients were used to generate crop maps.
This research illustrates the benefit of integrating statistical methods, GIS, remote sensing, and crop statistics to delineate NDVI-profile clusters with their corresponding agricultural land cover map units and to link these statistics to geographical locations. These map units can be used as a reference for future monitoring of natural resources, in particular crop growth and development, and for forecasting crop production, yield, and stresses such as drought.
Funding: Supported by the Innovation Foundation of the Provincial Education Department of Gansu (2024B-005), the Gansu Province National Science Foundation (22YF7GA182), and the Fundamental Research Funds for the Central Universities (No. lzujbky2022-kb01).
Abstract: Modal parameters can accurately characterize the structural dynamic properties and assess the physical state of a structure. It is therefore particularly significant to identify structural modal parameters from the monitoring data in a structural health monitoring (SHM) system, so as to provide a scientific basis for structural damage identification and dynamic model updating. In view of this, this paper reviews methods for identifying structural modal parameters under environmental excitation and briefly describes how to identify structural damage based on the derived modal parameters. The paper primarily introduces data-driven modal parameter identification methods (e.g., time-domain, frequency-domain, and time-frequency-domain methods), briefly describes damage identification methods based on variations of modal parameters (e.g., natural frequencies, mode shapes, and curvature mode shapes) and modal validation methods (e.g., the stabilization diagram and the modal assurance criterion), and further discusses the current status of artificial intelligence (AI) methods applied to modal parameter identification and damage identification. Based on the previous analysis, the main development trends of structural modal parameter identification and damage identification methods are given, to provide scientific references for the optimized design and functional upgrading of SHM systems.
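One of the validation tools named in this review, the modal assurance criterion (MAC), has a simple closed form: MAC = |φ_aᴴφ_b|² / ((φ_aᴴφ_a)(φ_bᴴφ_b)). A minimal sketch using the analytical mode shapes of a simply supported beam as illustrative test vectors:

```python
import numpy as np

def mac(phi_a, phi_b):
    """Modal Assurance Criterion between two mode-shape vectors.
    Values near 1 indicate consistent shapes; values near 0 indicate
    independent (e.g. different) modes. Insensitive to scaling."""
    num = np.abs(np.vdot(phi_a, phi_b)) ** 2
    return num / (np.vdot(phi_a, phi_a).real * np.vdot(phi_b, phi_b).real)

# First two mode shapes of a simply supported beam, sampled at 10 points
x = np.linspace(0.0, 1.0, 12)[1:-1]
mode1 = np.sin(np.pi * x)
mode2 = np.sin(2 * np.pi * x)

print(mac(mode1, 2.5 * mode1))  # arbitrary scaling: MAC = 1
print(mac(mode1, mode2))        # distinct modes: MAC ~ 0
```

In practice the MAC matrix between identified and reference mode sets is used to pair modes across identification runs and to flag spurious poles alongside the stabilization diagram.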