AIM To investigate and compare the analytical and clinical performance of the Tian Long automatic hypersensitive hepatitis B virus (HBV) DNA quantification system and the Roche CAP/CTM system. METHODS Two hundred blood samples for HBV DNA testing, HBV-DNA-negative samples and high-titer HBV-DNA mixture samples were collected and prepared. National standard materials for serum HBV and a worldwide HBV DNA panel were employed for performance verification. Analytical performance characteristics, including limit of detection, limit of quantification, accuracy, precision, reproducibility, linearity, genotype coverage and cross-contamination, were determined for the Tian Long automatic hypersensitive HBV DNA quantification system (TL system). Correlation and Bland-Altman plot analyses were carried out to compare the clinical performance of the TL system and the CAP/CTM system. RESULTS The detection limit of the TL system was 10 IU/mL, and its limit of quantification was 30 IU/mL. The differences between the expected and tested concentrations of the national standards were less than ±0.4 log10 IU/mL, demonstrating the high accuracy of the system. The precision, reproducibility and linearity tests showed that the coefficient of variation (CV) of repeated tests of the same sample was less than 5% over 10^(2)-10^(6) IU/mL, and the linear correlation coefficient was r^(2) = 0.99 over 30-10^(8) IU/mL. The TL system detected HBV genotypes A-H, and there was no cross-contamination during the "checkerboard" test. Compared with the CAP/CTM assay, the two assays showed 100% consistency in both negative and positive sample results (15 negative samples and 185 positive samples). No statistically significant differences between the two assays in the HBV DNA quantification values were observed (P > 0.05).
Correlation analysis indicated a significant correlation between the two assays (r^(2) = 0.9774). Bland-Altman plot analysis showed that 98.9% of the positive data were within the 95% limits of agreement, and the maximum difference was -0.49. CONCLUSION The TL system has good analytical performance and shows good agreement with the CAP/CTM system in clinical performance.
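The assay comparison above rests on a Bland-Altman analysis. As a minimal sketch (with hypothetical paired log10 titres, not the study's data), the bias and 95% limits of agreement can be computed as follows:

```python
import numpy as np

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two paired measurement sets."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b                                  # per-sample difference (e.g. log10 IU/mL)
    bias = diff.mean()                            # mean difference between assays
    sd = diff.std(ddof=1)                         # sample SD of the differences
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)    # 95% limits of agreement
    within = np.mean((diff >= loa[0]) & (diff <= loa[1]))  # fraction inside LoA
    return bias, loa, within

# hypothetical paired log10 titres from two assays for 185 positive samples
rng = np.random.default_rng(0)
truth = rng.uniform(2, 7, 185)
assay_a = truth + rng.normal(0, 0.1, 185)
assay_b = truth + rng.normal(0.02, 0.1, 185)
bias, loa, within = bland_altman(assay_a, assay_b)
```

A Bland-Altman plot then scatters `(a + b) / 2` against `diff` with horizontal lines at the bias and the two limits.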
This study establishes and validates a method for the precise quantification of aquatic microbial loads using microbial diversity absolute quantitative sequencing. By adding synthetic spike-in DNA to water samples from the Dahei River prior to DNA extraction and 16S rRNA gene sequencing, it generates standard curves that convert sequencing data into absolute microbial copy numbers. The method, which proved highly accurate (R^(2) > 0.99), reveals a clear contrast between the river sites: the upstream community has not only a significantly higher total microbial load but also a completely different species composition compared with the downstream site. This approach effectively overcomes the limitations of relative abundance analysis, providing a powerful tool for environmental monitoring, and proposes key steps for future standardization to ensure data comparability and integration.
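The core of the spike-in approach is a standard curve that maps read counts back to absolute copy numbers. A minimal sketch, with hypothetical spike-in copy numbers and read counts (not the study's values):

```python
import numpy as np

# hypothetical spike-in standards: known copy numbers and observed read counts
spike_copies = np.array([1e4, 1e5, 1e6, 1e7])
spike_reads = np.array([820.0, 8_300.0, 81_000.0, 835_000.0])

# fit log10(reads) vs log10(copies); a slope near 1 means reads scale linearly with load
logx, logy = np.log10(spike_copies), np.log10(spike_reads)
slope, intercept = np.polyfit(logx, logy, 1)
r2 = np.corrcoef(logx, logy)[0, 1] ** 2          # goodness of fit of the curve

def reads_to_copies(reads):
    """Invert the standard curve: observed read count -> absolute copy number."""
    return 10 ** ((np.log10(reads) - intercept) / slope)
```

Taxon-level read counts from the same sequencing run can then be passed through `reads_to_copies` to obtain absolute, cross-sample-comparable loads.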
The regional hydrological system is extremely complex because it is affected not only by physical factors but also by human dimensions, and hydrological models play a very important role in simulating this complex system. However, effective methods for model reliability and uncertainty analysis have been lacking because of this complexity. The uncertainties in hydrological modeling come from four important sources: input data and parameters, model structure, the analysis method, and the initial and boundary conditions. This paper systematically reviews recent advances in uncertainty analysis approaches for large-scale complex hydrological models on the basis of these uncertainty sources. The shortcomings and insufficiencies of current uncertainty analysis for complex hydrological models are also pointed out. A new uncertainty quantification platform, PSUADE, and its uncertainty quantification methods are then introduced, providing a powerful tool and platform for uncertainty analysis of large-scale complex hydrological models. Finally, some future perspectives on uncertainty quantification are put forward.
This paper proposes a novel method to quantify the error of a nominal normalized right graph symbol (NRGS) for an errors-in-variables (EIV) system corrupted with bounded noise. Following an identification framework for estimation of a perturbation model set, a worst-case ν-gap error bound for the estimated nominal NRGS can first be determined from a priori and a posteriori information on the underlying EIV system. Then, an NRGS perturbation model set can be derived from a close relation between the ν-gap metric of two models and the H∞-norm of the difference of their NRGSs. The obtained NRGS perturbation model set paves the way for robust controller design using an H∞ loop-shaping method, because it is a standard form of the well-known NCF (normalized coprime factor) perturbation model set. Finally, a numerical simulation is used to demonstrate the effectiveness of the proposed identification method.
A novel method based on time-dependent stochastic orthogonal bases for stochastic response surface approximation is proposed to overcome the significant errors that arise when the generalized polynomial chaos (GPC) method approximates the stochastic response by orthogonal polynomials. The accuracy and effectiveness of the method are illustrated by different numerical examples, including both linear and nonlinear problems. The results indicate that the proposed method adapts the stochastic bases over time and approximates the probability density function better than the GPC method.
There is growing interest in using ecosystem services to aid development of management strategies that target sustainability and enhance ecosystem support to humans. Challenges remain in the search for methods and indicators that can quantify ecosystem services using metrics that are meaningful in light of their high priorities. We developed a framework to link ecosystems to human wellbeing based on a stepwise approach. We evaluated prospective models in terms of their capacity to quantify national ecosystem services of forests. The most applicable models were subsequently used to quantify ecosystem services. The Korea Forest Research Institute model satisfied all criteria in its first practical use. A total of 12 key ecosystem services were identified. For our case study, we quantified four ecosystem functions, viz. water storage capacity in forest soil for the water storage service, reduced suspended sediment for the water purification service, reduced soil erosion for the landslide prevention service, and reduced sediment yield for the sediment regulation service. Water storage capacity in forest soil was estimated at 2142 t/ha, and reduced suspended sediment was estimated at 608 kg/ha. Reduced soil erosion was estimated at 77 m^3/ha, and reduced sediment yield was estimated at 285 m^3/ha. These results were similar to those reported in previous studies. Mapped results revealed hotspots of ecosystem services around protected areas that were particularly rich in biodiversity. In addition, the proposed framework illustrated that quantification of ecosystem services could be supported by the spatial flow of ecosystem services. However, our approach did not address challenges faced when quantifying connections between ecosystem indicators and the actual benefits of the services described.
This mini-review attempts to guide researchers in the quantification of fluorescently labelled proteins within cultured thick sections, or chromogenically stained proteins within thin sections, of brain tissue. It follows from our examination of the utility of Fiji/ImageJ thresholding and binarization algorithms. After describing how we identified the maximum intensity projection as the best of six methods tested for two-dimensional (2D) rendering of three-dimensional (3D) images derived from a series of z-stacked micrographs, the review summarises our comparison of 16 global and 9 local algorithms for their ability to accurately quantify the expression of astrocytic glial fibrillary acidic protein (GFAP), microglial ionized calcium binding adapter molecule 1 (IBA1) and oligodendrocyte lineage Olig2 within fixed cultured rat hippocampal brain slices. The application of these algorithms to chromogenically stained GFAP and IBA1 within thin tissue sections is also described. Fiji's BioVoxxel plugin allowed categorisation of algorithms according to their sensitivity, specificity, accuracy and relative quality. The Percentile algorithm was deemed best for quantifying levels of GFAP, the Li algorithm was best for quantifying IBA1 expression, and the Otsu algorithm was optimal for Olig2 staining, albeit with over-quantification of oligodendrocyte number compared with a stereological approach. GFAP and IBA1 expression in 3,3′-diaminobenzidine (DAB)/haematoxylin-stained cerebellar tissue was best quantified with the Default, Isodata and Moments algorithms. The workflow presented in Figure 1 could help to improve the quality of research outcomes that are based on the quantification of protein within brain tissue.
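Global thresholding algorithms such as Otsu's method, named above, pick a cutoff from the image histogram alone. A self-contained NumPy sketch of Otsu's between-class-variance criterion, applied to a synthetic bimodal "micrograph" (Fiji's own implementation should of course be used in practice):

```python
import numpy as np

def otsu_threshold(img, nbins=256):
    """Global Otsu threshold: maximise the between-class variance of the histogram."""
    hist, edges = np.histogram(img.ravel(), bins=nbins)
    p = hist / hist.sum()                          # bin probabilities
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                              # class-0 weight at each candidate cut
    w1 = 1 - w0
    cum_mean = np.cumsum(p * centers)
    mu0 = cum_mean / np.where(w0 == 0, 1, w0)      # class-0 mean (guard empty class)
    mu1 = (cum_mean[-1] - cum_mean) / np.where(w1 == 0, 1, w1)
    between = w0 * w1 * (mu0 - mu1) ** 2           # between-class variance
    return centers[np.argmax(between)]

# synthetic bimodal intensities: dim background plus bright labelled pixels
rng = np.random.default_rng(1)
img = np.concatenate([rng.normal(40, 8, 9000), rng.normal(180, 10, 1000)])
t = otsu_threshold(img)
labelled_fraction = np.mean(img > t)               # area fraction above threshold
```

Binarizing at `t` and measuring the above-threshold area fraction is the basic quantification step the review compares across algorithms.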
Complex systems exist widely, including medicines from natural products, functional foods, and biological samples. The biological activity of a complex system is often the result of the synergistic effect of multiple components. In the quality evaluation of complex samples, multicomponent quantitative analysis (MCQA) is usually needed. To overcome the difficulty of obtaining standard products, scholars have proposed achieving MCQA through the "single standard to determine multiple components" (SSDMC) approach. This method has been used for the determination of multicomponent content in natural-source drugs and the analysis of impurities in chemical drugs, and it has been included in the Chinese Pharmacopoeia. Given a convenient (ultra-)high-performance liquid chromatography method, how can the repeatability and robustness of an MCQA method be improved? How can the chromatographic conditions be optimized to increase the number of quantifiable components? How can computer software technology be introduced to improve the efficiency of multicomponent analysis (MCA)? These are the key problems that remain to be solved in practical MCQA. First, this review summarizes the calculation methods for relative correction factors in the SSDMC approach over the past five years, as well as the evaluation of method robustness and accuracy. Second, it summarizes methods to improve peak capacity and quantitative accuracy in MCA, including column selection and two-dimensional chromatographic analysis technology. Finally, computer software technologies for predicting chromatographic conditions and analytical parameters are introduced, which provides an avenue for intelligent method development in MCA. This review aims to provide methodological ideas for improving the analysis of complex systems, especially MCQA.
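The SSDMC idea is that each sibling component's detector response is tied to the single standard through a relative correction factor (RCF) measured once with authentic standards. A minimal sketch with entirely hypothetical component names, slopes and areas:

```python
# Calibration phase (done once with authentic standards): detector response
# slope (peak area per unit concentration) for the single standard "s" and for
# each sibling component. All numbers below are hypothetical.
slope_s = 5200.0                                    # area units per (ug/mL)
slope_i = {"component_B": 4100.0, "component_C": 6500.0}

# RCF: response of the single standard relative to each sibling component
rcf = {name: slope_s / k for name, k in slope_i.items()}

def quantify(area_i, name, area_s, conc_s):
    """Concentration of component i from its peak area, via the single standard.

    area_s / conc_s re-derives the standard's response in the current run, so
    day-to-day detector drift cancels out of the sibling quantification."""
    slope_from_run = area_s / conc_s
    return area_i * rcf[name] / slope_from_run

# routine run: the standard injected at 10 ug/mL gives area 52000;
# component B shows area 20500 in the same chromatogram
c_B = quantify(area_i=20500.0, name="component_B", area_s=52000.0, conc_s=10.0)
```

Only the standard's calibrant is needed per run; the RCFs carry the sibling components' calibration.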
This paper presents a dynamic modeling method for motor-driven electromechanical systems and investigates the uncertainty quantification of mechanism motion based on this method. The main contribution is a novel mechanism-motor coupling dynamic modeling method in which the relationship between mechanism motion and motor rotation is established according to the geometric coordination of the system. Its advantages include establishing an intuitive coupling between the mechanism and the motor, facilitating discussion of the influence of both mechanical and electrical parameters on the mechanism, and enabling dynamic simulation with a controller that takes the randomness of the electric load into account. Dynamic simulation of an ammunition delivery system considering feedback control is carried out, and the feasibility of the model is verified experimentally. Based on probability density evolution theory, we comprehensively discuss the effects of system parameters on mechanism motion from the perspective of uncertainty quantification. This work not only provides guidance for the engineering design of ammunition delivery mechanisms but also provides theoretical support for modeling and uncertainty quantification research on mechatronic systems.
Cardiac modeling entails epistemic uncertainty in the input parameters, such as bundle and chamber geometry, electrical conductivities and cell parameters, thus calling for an uncertainty quantification (UQ) analysis. Since cardiac activation and the subsequent muscular contraction are provided by a complex electrophysiology system made of interconnected conductive media, we focus here on the fast conduction structures of the atria (internodal pathways) with the aim of identifying which of the uncertain inputs most influence the propagation of the depolarization front. First, the distributions of the input parameters are calibrated using data available from the literature, taking gender differences into account. Output quantities of interest (QoIs) of medical relevance are defined, and a set of metamodels (one for each QoI) is then trained according to a polynomial chaos expansion (PCE) in order to run a global sensitivity analysis with non-linear variance-based Sobol' indices, with confidence intervals evaluated through the bootstrap method. The most sensitive parameters for each QoI are then identified for both genders, showing the same order of importance of the model inputs on the electrical activation. Lastly, the probability distributions of the QoIs are obtained through a forward sensitivity analysis using the same trained metamodels. It turns out that several input parameters, including the position of the internodal pathways and the electrical impulse applied at the sinoatrial node, have little influence on the QoIs studied. Conversely, the electrical activation of the atrial fast conduction system is sensitive to the bundle geometry and electrical conductivities, which need to be carefully measured or calibrated for the electrophysiology model to be accurate and predictive.
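The PCE metamodels above mainly serve to evaluate variance-based Sobol' indices cheaply. As a minimal illustration of what a first-order Sobol' index measures, here is a plain Monte Carlo (Saltelli-style) estimator on the standard Ishigami test function, not the cardiac model:

```python
import numpy as np

def ishigami(x, a=7.0, b=0.1):
    """Ishigami benchmark; analytic first-order indices S1~0.314, S2~0.442, S3=0."""
    return np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2 + b * x[:, 2] ** 4 * np.sin(x[:, 0])

def sobol_first_order(f, dim, n, rng):
    """Saltelli Monte Carlo estimator of first-order Sobol' indices on U(-pi, pi)^dim."""
    A = rng.uniform(-np.pi, np.pi, (n, dim))
    B = rng.uniform(-np.pi, np.pi, (n, dim))
    fA, fB = f(A), f(B)
    var = np.var(np.concatenate([fA, fB]))        # total output variance
    s = np.empty(dim)
    for i in range(dim):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                       # swap column i: isolates input i's effect
        s[i] = np.mean(fB * (f(ABi) - fA)) / var  # fraction of variance due to input i alone
    return s

rng = np.random.default_rng(42)
S = sobol_first_order(ishigami, dim=3, n=50_000, rng=rng)
```

A PCE-based workflow obtains the same indices analytically from the expansion coefficients instead of by sampling, which is what makes it attractive for expensive cardiac simulators.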
OBJECTIVE: To propose an automatic acupuncture robot system for performing acupuncture operations. METHODS: The acupuncture robot system consists of three components: automatic acupoint localization, acupuncture manipulations, and De Qi sensation detection. The OptiTrack motion capture system is used to locate acupoints, which are then translated into coordinates in the robot control system. A flexible collaborative robot with an intelligent gripper then performs acupuncture manipulations with high precision. In addition, a De Qi sensation detection system is proposed to evaluate the effect of acupuncture. To verify the stability of the designed acupuncture robot, acupoint coordinates localized by the robot are compared with the gold standard labeled by a professional acupuncturist using significance tests. RESULTS: Through repeated experiments on eight acupoints, the acupuncture robot achieved a positioning error within 3.3 mm, which is within the allowable range for acupoint localization and needle insertion. During needle insertion, the robot arm followed the prescribed trajectory with a mean deviation distance of 0.02 mm and a deviation angle of less than 0.15°. The results of the lifting-thrusting operation in the Xingzhen process show that the mean acupuncture depth error of the designed robot is approximately 2 mm, which is within the recommended depth range for the Xingzhen operation. In addition, the average detection accuracy for the De Qi keywords is 94.52%, which meets the requirements of acupuncture effect testing across different dialects. CONCLUSION: The proposed acupuncture robot system streamlines the acupuncture process, increases efficiency, and reduces practitioner fatigue, while also allowing for the quantification of acupuncture manipulations and evaluation of therapeutic effects. The development of an acupuncture robot system has the potential to revolutionize low back pain treatment and improve patient outcomes.
For uncertainty quantification of complex models with high-dimensional, nonlinear, multi-component coupling, such as digital twins, traditional statistical sampling methods such as random sampling and Latin hypercube sampling require a large number of samples, which entails huge computational costs. How to construct a small sample space has therefore become a topic of keen interest for researchers. To this end, this paper proposes a sequential search-based Latin hypercube sampling scheme to generate efficient and accurate samples for uncertainty quantification. First, the sampling range is formed by characterizing the polymorphic uncertainty based on theoretical analysis. Then, the optimal Latin hypercube design is selected using the Latin hypercube sampling method combined with the "space filling" criterion. Finally, a sample selection function is established, and the next most informative sample is optimally selected to obtain the sequential test sample. Compared with classical sampling methods, the generated samples retain more information while remaining sparse. A series of numerical experiments demonstrates the superiority of the proposed sequential search-based Latin hypercube sampling scheme, which provides reliable uncertainty quantification results with small sample sizes.
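The building block of the scheme above is the basic Latin hypercube design, in which each dimension is split into n equal strata and exactly one sample falls in each stratum per dimension. A minimal NumPy sketch (the paper's sequential search and space-filling selection are not shown):

```python
import numpy as np

def latin_hypercube(n, dim, rng):
    """One LHS design on [0,1)^dim: each of the n strata holds exactly one point
    per dimension, giving better space coverage than plain random sampling."""
    # jittered stratum positions: row i lies in [i/n, (i+1)/n) for every column
    u = (np.arange(n)[:, None] + rng.random((n, dim))) / n
    # independent shuffle per dimension breaks the diagonal correlation
    for j in range(dim):
        u[:, j] = u[rng.permutation(n), j]
    return u

rng = np.random.default_rng(7)
X = latin_hypercube(n=10, dim=3, rng=rng)

# stratification check: every dimension has exactly one sample per 1/n interval
strata = np.sort((X * 10).astype(int), axis=0)
```

An "optimal" LHS additionally scores candidate designs with a space-filling criterion (e.g. maximin inter-point distance) and keeps the best.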
Viral diseases are an important threat to crop yield, as they are responsible for losses greater than US$30 billion annually. Thus, understanding the dynamics of virus propagation within plant cells is essential for devising effective control strategies. However, viruses are complex to propagate and quantify, and existing methodologies for viral quantification tend to be expensive and time-consuming. Here, we present a rapid, cost-effective approach to quantify viral propagation using an engineered virus expressing a fluorescent reporter. Using a microplate reader, we measured viral protein levels and validated our findings by comparison with western blot analysis of the viral coat protein, the most common approach to quantifying viral titer. Our proposed methodology provides a practical and accessible approach to studying virus-host interactions and could contribute to enhancing our understanding of plant virology.
In pre-modern China, systematic records of astronomical phenomena were kept, measured in units including "zhang," "chi," and "cun," alongside records using phrases such as "as large as something" (making comparison with images, 取象比类). Some of these records survive, and together they constitute a "scale system" for records of astronomical phenomena. A textual model of the celestial sphere based on naked-eye observations also survives. According to this model, and using the conversion ratio that 1 chi equals 1 degree, the author has reconstituted the geometric meaning of the records in "zhang," "chi," and "cun," while records that compare with a certain object have been converted to apparent diameters or magnitudes. Finally, these two kinds of records are integrated into one system. Through an analysis of the origin of the chi system, the author finds that the radius of the celestial sphere, as the ancients observed the sky with the naked eye, was about 13 meters. The paper supports this conclusion from various perspectives, including psychological factors, the radius of planetariums, and the nautical Method of Reckoning by the Stars. As naked-eye observers always regard the vault of heaven as a flattened hemisphere, they share a common false impression, and the pre-modern observational data therefore contain some systematic errors. Under different illumination and weather conditions, the vault of heaven appears flattened to varying degrees, for which the author has defined "the angle of apparent flatness" (视扁度角). To correct these visual errors, the author has devised a series of calculation tables for daytime, nighttime, cloudy days, clear days, moonlit nights, and moonless nights, to convert the apparent heights or sizes of celestial bodies to the true heights or sizes.
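The ~13 m radius follows directly from the arc-length relation once a metric length is assumed for the chi. As a worked check, assuming a pre-modern chi of roughly 0.23 m (the unit varied considerably by period; this value is an assumption, not taken from the paper):

```python
import math

# Assumption: one pre-modern chi taken as ~0.23 m (the unit varied by dynasty).
chi_m = 0.23

# If 1 chi of apparent length on the vault corresponds to 1 degree of arc,
# the arc-length relation s = r * theta gives the implied dome radius.
radius_m = chi_m / math.radians(1.0)
```

With this assumed chi, the implied radius is about 13 m, consistent with the figure the author derives.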
BACKGROUND The Streptococcus salivarius (S. salivarius) group, which produces the enzyme urease, has been identified as a potential contributor to ammonia production in the gut. Researchers have reported that patients with minimal hepatic encephalopathy have an increased abundance of the S. salivarius group, a specific change in the gut microbiota that distinguishes them from healthy individuals. The correlation between the aggregation of specific bacterial species and fibrosis progression in chronic liver disease (CLD) has yet to be fully elucidated. AIM To quantify S. salivarius using digital PCR (dPCR) as a liver fibrosis marker in CLD. METHODS This study retrospectively analysed 52 patients with CLD. To quantify S. salivarius in patients with CLD using dPCR, we evaluated the specificity and sensitivity of S. salivarius bacterial load measurement by dPCR using a type strain. Next, we evaluated the clinical usefulness of dPCR quantification of S. salivarius load for detecting liver fibrosis in patients with CLD. The liver fibrosis stage was categorized into mild and advanced fibrosis based on pathological findings. RESULTS The dPCR assay revealed that S. salivarius was highly positive for the tnpA gene. The lower limit of quantification for dPCR using the tnpA gene with a 1 μL template comprising 1.28×10^(2) CFU/mL was 4.3 copies. After considering the detection range of dPCR, we adjusted the DNA concentration extracted from 200 mg stool samples to 5.0×10^(-4) ng/μL. The median bacterial loads of S. salivarius in stool samples from patients with mild and advanced fibrosis were 1.9 and 7.4 copies/μL, respectively. A quantifiable S. salivarius load was observed more frequently in patients with advanced fibrosis than in those with mild fibrosis (P = 0.032). CONCLUSION Quantification of S. salivarius load using digital PCR is a useful biomarker for liver fibrosis in patients with CLD.
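Digital PCR reports copies/μL by partitioning the template and Poisson-correcting the fraction of positive partitions; a partition is positive whenever it received at least one molecule, so the mean occupancy is not simply the positive fraction. A generic sketch with hypothetical chip parameters (not the study's assay):

```python
import math

def dpcr_copies_per_ul(positive, total, partition_volume_nl):
    """Poisson-corrected digital PCR concentration in copies per microlitre.

    p = positive fraction; mean copies per partition is lam = -ln(1 - p),
    because a partition with >= 1 molecule reads positive."""
    p = positive / total
    lam = -math.log(1.0 - p)                       # mean copies per partition
    return lam / partition_volume_nl * 1000.0      # nL partitions -> copies/uL

# hypothetical chip: 20,000 partitions of 0.85 nL, of which 3,000 are positive
conc = dpcr_copies_per_ul(positive=3000, total=20000, partition_volume_nl=0.85)
```

The correction matters most at high occupancy, where many partitions hold multiple molecules and the raw positive fraction would undercount.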
During high-speed forward flight, helicopter rotor blades operate across a wide range of Reynolds and Mach numbers. Under such conditions, their aerodynamic performance is significantly influenced by dynamic stall, a complex, unsteady flow phenomenon that is highly sensitive to inlet conditions such as Mach and Reynolds numbers. The key features of three-dimensional blade stall can be effectively represented by the dynamic stall behavior of a pitching airfoil. In this study, we conduct an uncertainty quantification analysis of dynamic stall aerodynamics in high-Mach-number flows over pitching airfoils, accounting for uncertainties in inlet parameters. A computational fluid dynamics (CFD) model based on the compressible unsteady Reynolds-averaged Navier-Stokes (URANS) equations, coupled with sliding mesh techniques, is developed to simulate the unsteady aerodynamic behavior and associated flow fields. To capture the aerodynamic responses efficiently while maintaining high accuracy, a multi-fidelity Co-Kriging surrogate model is constructed. This model integrates the precision of high-fidelity wind tunnel experiments with the computational efficiency of lower-fidelity URANS simulations, and its accuracy is validated through direct comparison with experimental data. Building on this surrogate model, we employ interval analysis and the Sobol sensitivity method to quantify the uncertainty and parameter sensitivity of the unsteady aerodynamic forces resulting from inlet condition variability. Both the inlet Mach number and Reynolds number are treated as uncertain inputs, modeled using interval representations. Our results demonstrate that variations in Mach number contribute far more significantly to aerodynamic uncertainty than those in Reynolds number. Moreover, the presence of dynamic stall vortices markedly amplifies the aerodynamic sensitivity to Mach number fluctuations.
Excessive Fe^(3+) ion concentrations in wastewater pose a long-standing threat to human health. Achieving low-cost, high-efficiency quantification of the Fe^(3+) ion concentration in unknown solutions can guide environmental management decisions and optimize water treatment processes. In this study, by leveraging the rapid, real-time detection capabilities of nanopores and the specific chemical binding affinity of tannic acid for Fe^(3+), a linear relationship between the ion current and the Fe^(3+) ion concentration was established. Using this linear relationship, quantification of the Fe^(3+) ion concentration in unknown solutions was achieved. Furthermore, ethylenediaminetetraacetic acid disodium salt was employed to displace Fe^(3+) from the nanopores, allowing them to be restored to their initial condition and reused for Fe^(3+) ion quantification. The reusable bioinspired nanopores remained functional over 330 days of storage. This recyclability and long-term stability contribute to a significant reduction in costs. This study provides a strategy for the quantification of unknown Fe^(3+) concentrations using nanopores, with potential applications in environmental assessment, health monitoring, and beyond.
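The quantification step is a standard linear calibration: fit current against known concentrations, then invert the line for an unknown. A minimal sketch with hypothetical calibration points (not the paper's measurements):

```python
import numpy as np

# hypothetical calibration: nanopore ion current (nA) at known Fe3+ concentrations (uM);
# current falls as bound Fe3+ partially blocks the pore
conc = np.array([0.0, 10.0, 20.0, 40.0, 80.0])
current = np.array([5.02, 4.61, 4.18, 3.42, 1.88])

# least-squares line: current = slope * conc + intercept
slope, intercept = np.polyfit(conc, current, 1)

def fe3_concentration(i_measured):
    """Invert the calibration line to read an unknown Fe3+ concentration."""
    return (i_measured - intercept) / slope

unknown = fe3_concentration(3.8)   # current measured for the unknown solution
```

After an EDTA wash restores the baseline current, the same calibration can be reused for the next sample.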
In the data transaction process within a data asset trading platform, quantifying the trustworthiness of data source nodes is challenging due to their numerous attributes and complex structures. To address this issue, a distributed data source trust assessment management framework, a trust quantification model, and a dynamic adjustment mechanism are proposed. The model integrates the Analytic Hierarchy Process (AHP) and Dempster-Shafer (D-S) evidence theory to determine attribute weights and calculate direct trust values, while the PageRank algorithm is employed to derive indirect trust values. The direct and indirect trust values are then combined to compute the comprehensive trust value of the data source. Furthermore, a dynamic adjustment mechanism is introduced to continuously update the comprehensive trust value based on historical assessment data. By leveraging the collaborative efforts of multiple nodes in the distributed network, the proposed framework enables a comprehensive, dynamic, and objective evaluation of data source trustworthiness. Extensive experimental analyses demonstrate that the trust quantification model effectively handles large-scale data source trust assessments, exhibiting both strong trust differentiation capability and high robustness.
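The indirect-trust component reuses the classic PageRank recursion: a node is trusted in proportion to how much trusted nodes vouch for it. A minimal power-iteration sketch over a hypothetical four-node trust graph (the paper's AHP/D-S direct-trust weighting is not shown):

```python
import numpy as np

def pagerank(adj, damping=0.85, tol=1e-10):
    """Power-iteration PageRank over a directed trust/recommendation graph."""
    n = adj.shape[0]
    out = adj.sum(axis=1, keepdims=True)
    out[out == 0] = 1                          # guard against divide-by-zero rows
    M = adj / out                              # row-stochastic transition matrix
    r = np.full(n, 1.0 / n)
    while True:
        r_new = (1 - damping) / n + damping * (M.T @ r)
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new

# hypothetical trust graph: edge i -> j means node i vouches for node j
adj = np.array([[0, 1, 1, 0],
                [0, 0, 1, 0],
                [1, 0, 0, 1],
                [0, 0, 1, 0]], float)
indirect_trust = pagerank(adj)                 # node 2, with three endorsers, ranks first
```

These scores would then be fused with the AHP/D-S direct trust values to form the comprehensive trust value.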
Compared to other energy sources, nuclear reactors offer several advantages as a spacecraft power source, including compact size, high power density, and long operating life. These qualities make nuclear power an ideal energy source for future deep space exploration. A whole-system model of a space nuclear reactor was developed, consisting of the reactor neutron kinetics, reactivity control, reactor heat transfer, heat exchanger, and thermoelectric converter. In addition, an electrical power control system was designed based on the developed dynamic model. The GRS method was used to quantify the uncertainty of the coupling parameters of the neutronics, thermal-hydraulics, and control system of the space reactor. The Spearman correlation coefficient was applied in the sensitivity analysis of system input parameters with respect to output parameters. The calculation results showed that the uncertainty of the output parameters caused by the coupling parameters exhibited the largest variation, with a relative standard deviation below 2.01%. Electrical power was most sensitive to the effective delayed neutron fraction. To obtain optimal control performance, the non-dominated sorting genetic algorithm was employed to optimize the controller parameters based on the uncertainty quantification results. Two typical transient simulations were conducted to test the adaptive ability of the optimized controller in the uncertain dynamic system: a step load reduction from 100% full power (FP) to 90% FP, and a 5% FP/min linear variable-load transient. The results showed that, considering the influence of system uncertainty, the optimized controller improved the response speed and load-following accuracy of electrical power control, and its effectiveness and superiority were verified.
Abstract: Quantitative analysis of clinical function parameters from MRI images is crucial for diagnosing and assessing cardiovascular disease. However, the manual calculation of these parameters is challenging due to the high variability among patients and the time-consuming nature of the process. In this study, the authors introduce a framework named MultiJSQ, comprising the feature presentation network (FRN) and the indicator prediction network (IEN), which is designed for simultaneous joint segmentation and quantification. The FRN is tailored for representing global image features, facilitating the direct acquisition of left ventricle (LV) contour images through pixel classification. Additionally, the IEN incorporates specifically designed modules to extract relevant clinical indices. The authors' method considers the interdependence of the different tasks, demonstrating the validity of these relationships and yielding favourable results. Through extensive experiments on cardiac MR images from 145 patients, MultiJSQ achieves impressive outcomes, with low mean absolute errors of 124 mm^2, 1.72 mm, and 1.21 mm for areas, dimensions, and regional wall thicknesses, respectively, along with a Dice score of 0.908. The experimental findings underscore the excellent performance of the framework in LV segmentation and quantification, highlighting its promising clinical application prospects.
Abstract: AIM To investigate and compare the analytical and clinical performance of the Tian Long automatic hypersensitive hepatitis B virus (HBV) DNA quantification system and the Roche CAP/CTM system. METHODS Two hundred blood samples for HBV DNA testing, HBV-DNA-negative samples and high-titer HBV-DNA mixture samples were collected and prepared. National standard materials for serum HBV and a worldwide HBV DNA panel were employed for performance verification. The analytical performance, including limit of detection, limit of quantification, accuracy, precision, reproducibility, linearity, genotype coverage and cross-contamination, was determined for the Tian Long automatic hypersensitive HBV DNA quantification system (TL system). Correlation and Bland-Altman plot analyses were carried out to compare the clinical performance of the TL system assay and the CAP/CTM system. RESULTS The detection limit of the TL system was 10 IU/mL, and its limit of quantification was 30 IU/mL. The differences between the expected and tested concentrations of the national standards were less than ±0.4 log10 IU/mL, indicating high accuracy of the system. The precision, reproducibility and linearity tests showed that the coefficient of variation (CV) of repeated tests of the same sample was less than 5% over 10^2-10^6 IU/mL, and over 30-10^8 IU/mL the linear correlation coefficient was r^2 = 0.99. The TL system detected HBV genotypes A-H, and there was no cross-contamination in the "checkerboard" test. When compared with the CAP/CTM assay, the two assays showed 100% consistency in both negative and positive sample results (15 negative samples and 185 positive samples). No statistically significant differences between the two assays in the HBV DNA quantification values were observed (P > 0.05). Correlation analysis indicated a significant correlation between the two assays (r^2 = 0.9774). The Bland-Altman plot analysis showed that 98.9% of the positive data were within the 95% acceptable range, and the maximum difference was -0.49. CONCLUSION The TL system has good analytical performance and exhibits good agreement with the CAP/CTM system in clinical performance.
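The precision and agreement statistics reported above (CV, bias, 95% limits of agreement) follow standard formulas. A minimal sketch with hypothetical paired log10 titres, not the study's actual data:

```python
import numpy as np

# Hypothetical paired log10 HBV DNA titres (IU/mL) from two assays.
tl  = np.array([2.10, 3.05, 4.02, 5.11, 6.00, 6.98])
cap = np.array([2.18, 3.00, 4.10, 5.02, 6.05, 7.01])

# Intra-assay precision: coefficient of variation of replicate measurements.
replicates = np.array([4.01, 3.97, 4.05, 3.99, 4.03])
cv = replicates.std(ddof=1) / replicates.mean() * 100  # percent

# Bland-Altman statistics: mean difference (bias) and 95% limits of agreement.
diff = tl - cap
bias = diff.mean()
loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))
within = np.mean((diff >= loa[0]) & (diff <= loa[1])) * 100  # % inside limits
```

In a real comparison, `within` is the fraction of paired samples falling inside the limits of agreement, analogous to the 98.9% reported here.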
Funding: Supported by the National Natural Science Foundation of China (Grant No. 32160172), the Key Science-Technology Project of Inner Mongolia (2023KYPT0010), the Natural Science Foundation of Inner Mongolia Autonomous Region of China (Grant No. 2025QN03006), and the 2023 Inner Mongolia Public Institution High-level Talent Introduction Scientific Research Support Project.
Abstract: This study establishes and validates a method for the precise quantification of aquatic microbial loads using microbial diversity absolute quantitative sequencing. By adding synthetic spike-in DNA to water samples from the Dahei River prior to DNA extraction and 16S rRNA gene sequencing, it generates standard curves to convert sequencing data into absolute microbial copy numbers. The method, which proved highly accurate (R^2 > 0.99), reveals a clear contrast between the river sites: the upstream community has not only a significantly higher total microbial load but also a completely different species makeup compared to the downstream site. This approach effectively overcomes the limitations of relative abundance analysis, providing a powerful tool for environmental monitoring, and proposes key steps for future standardization to ensure data comparability and integration.
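The spike-in approach described above amounts to fitting a standard curve between known spike-in copies and recovered reads, then applying it to the sample taxa. A sketch with hypothetical spike-in numbers (the copy/read values are illustrative, not from the study):

```python
import numpy as np

# Hypothetical spike-in standards: known copies added vs. reads recovered.
spike_copies = np.array([1e3, 1e4, 1e5, 1e6])
spike_reads  = np.array([210.0, 2050.0, 21000.0, 208000.0])

# Fit a log-log standard curve: log10(copies) = a * log10(reads) + b.
a, b = np.polyfit(np.log10(spike_reads), np.log10(spike_copies), 1)
r2 = np.corrcoef(np.log10(spike_reads), np.log10(spike_copies))[0, 1] ** 2

def reads_to_copies(reads):
    """Convert taxon read counts to absolute copy numbers via the curve."""
    return 10 ** (a * np.log10(reads) + b)

# Total absolute load of a sample from its per-taxon read counts.
total = reads_to_copies(np.array([5000.0, 120.0])).sum()
```

The R^2 of the fitted curve plays the same role as the R^2 > 0.99 accuracy check reported in the abstract.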
Funding: National Key Basic Research Program of China, No. 2010CB428403; National Grand Science and Technology Special Project of Water Pollution Control and Improvement, No. 2009ZX07210-006.
Abstract: The regional hydrological system is extremely complex because it is affected not only by physical factors but also by human dimensions, and hydrological models play a very important role in simulating this complex system. However, effective methods for model reliability and uncertainty analysis have been lacking because of the system's complexity. The uncertainties in hydrological modeling come from four important aspects: uncertainties in input data and parameters, uncertainties in model structure, uncertainties in the analysis method, and the initial and boundary conditions. This paper systematically reviews recent advances in uncertainty analysis approaches for large-scale complex hydrological models on the basis of these uncertainty sources, and points out the shortcomings and insufficiencies of current uncertainty analysis for such models. It then introduces a new uncertainty quantification platform, PSUADE, and its uncertainty quantification methods, which offer a powerful tool and platform for uncertainty analysis of large-scale complex hydrological models. Finally, some future perspectives on uncertainty quantification are put forward.
Funding: Supported in part by the National Natural Science Foundation of China (Nos. 61203119, 61304153), the Key Program of Tianjin Natural Science Foundation, China (No. 14JCZDJC36300), and a Tianjin University of Technology and Education funded project (No. RC14-48).
Abstract: This paper proposes a novel method to quantify the error of a nominal normalized right graph symbol (NRGS) for an errors-in-variables (EIV) system corrupted with bounded noise. Following an identification framework for estimation of a perturbation model set, a worst-case ν-gap error bound for the estimated nominal NRGS can first be determined from a priori and a posteriori information on the underlying EIV system. Then, an NRGS perturbation model set can be derived from a close relation between the ν-gap metric of two models and the H∞-norm of the difference of their NRGSs. The obtained NRGS perturbation model set paves the way for robust controller design using an H∞ loop-shaping method, because it is a standard form of the well-known NCF (normalized coprime factor) perturbation model set. Finally, a numerical simulation is used to demonstrate the effectiveness of the proposed identification method.
Funding: Project supported by the National Natural Science Foundation of China (Nos. 11632011, 11572189, and 51421092) and the China Postdoctoral Science Foundation (No. 2016M601585).
Abstract: A novel method based on time-dependent stochastic orthogonal bases for stochastic response surface approximation is proposed to overcome the significant errors that can arise when the generalized polynomial chaos (GPC) method approximates the stochastic response by orthogonal polynomials. The accuracy and effectiveness of the method are illustrated with different numerical examples, including both linear and nonlinear problems. The results indicate that the proposed method modifies the stochastic bases adaptively and yields a better approximation of the probability density function than the GPC method.
Funding: Supported by the Korea Ministry of Environment through the "Climate Change Correspondence Program (2014001310008)" and "The Eco-Innovation Project (Project Number: 2012-00021-0002)".
Abstract: There is growing interest in using ecosystem services to aid development of management strategies that target sustainability and enhance ecosystem support to humans. Challenges remain in the search for methods and indicators that can quantify ecosystem services using metrics that are meaningful in light of their high priorities. We developed a framework to link ecosystems to human wellbeing based on a stepwise approach. We evaluated prospective models in terms of their capacity to quantify national ecosystem services of forests, and the most applicable models were subsequently used to quantify these services. The Korea Forest Research Institute model satisfied all criteria in its first practical use. A total of 12 key ecosystem services were identified. For our case study, we quantified four ecosystem functions, viz. water storage capacity in forest soil for the water storage service, reduced suspended sediment for the water purification service, reduced soil erosion for the landslide prevention service, and reduced sediment yield for the sediment regulation service. Water storage capacity in forest soil was estimated at 2142 t/ha, and reduced suspended sediment was estimated at 608 kg/ha. Reduced soil erosion was estimated at 77 m^3/ha, and reduced sediment yield was estimated at 285 m^3/ha. These results were similar to those reported by previous studies. Mapped results revealed hotspots of ecosystem services around protected areas that were particularly rich in biodiversity. In addition, the proposed framework illustrated that quantification of ecosystem services could be supported by the spatial flow of ecosystem services. However, our approach did not address the challenges faced when quantifying connections between ecosystem indicators and the actual benefits of the services described.
Funding: Supported by a grant from the Thomas Crawford Hayes Research Fund, a NUI Galway College of Science scholarship to SH, and a grant from the NUI Galway Foundation Office to JM.
Abstract: The following mini-review attempts to guide researchers in the quantification of fluorescently-labelled proteins within thick sections of cultured brain tissue, or chromogenically-stained proteins within thin sections. It follows from our examination of the utility of Fiji ImageJ thresholding and binarization algorithms. Describing how we identified the maximum intensity projection as the best of six methods tested for two-dimensional (2D) rendering of three-dimensional (3D) images derived from a series of z-stacked micrographs, the review summarises our comparison of 16 global and 9 local algorithms for their ability to accurately quantify the expression of astrocytic glial fibrillary acidic protein (GFAP), microglial ionized calcium binding adapter molecule 1 (IBA1) and oligodendrocyte lineage Olig2 within fixed cultured rat hippocampal brain slices. The application of these algorithms to chromogenically-stained GFAP and IBA1 within thin tissue sections is also described. Fiji's BioVoxxel plugin allowed categorisation of the algorithms according to their sensitivity, specificity, accuracy and relative quality. The Percentile algorithm was deemed best for quantifying levels of GFAP, the Li algorithm was best for quantifying IBA1 expression, and the Otsu algorithm was optimal for Olig2 staining, albeit with over-quantification of oligodendrocyte number when compared to a stereological approach. GFAP and IBA1 expression in 3,3′-diaminobenzidine (DAB)/haematoxylin-stained cerebellar tissue was best quantified with the Default, Isodata and Moments algorithms. The workflow presented in Figure 1 could help to improve the quality of research outcomes that are based on the quantification of protein within brain tissue.
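Global thresholding algorithms such as Otsu's (one of those compared above) pick the histogram cut that maximises between-class variance; the fraction of pixels above the threshold then serves as the staining readout. A self-contained sketch on a synthetic "micrograph" (the image is simulated, not from the review):

```python
import numpy as np

def otsu_threshold(img):
    """Global Otsu threshold: maximise between-class variance of the histogram."""
    hist, _ = np.histogram(img.ravel(), bins=256, range=(0, 256))
    p = hist / hist.sum()
    w0 = np.cumsum(p)                      # cumulative class probability
    mu = np.cumsum(p * np.arange(256))     # cumulative mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * w0 - mu) ** 2 / (w0 * (1 - w0))
    return np.nanargmax(sigma_b)

# Synthetic image: dim background plus a bright "stained" square region.
rng = np.random.default_rng(0)
img = rng.normal(40, 5, (100, 100)).clip(0, 255)
img[30:60, 30:60] = rng.normal(180, 5, (30, 30)).clip(0, 255)

t = otsu_threshold(img)
percent_area = (img > t).mean() * 100   # % of pixels classified as stained
```

The same percent-area measure is what the review's algorithm comparison effectively evaluates for GFAP, IBA1 and Olig2 staining.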
Funding: Supported by the National Natural Science Foundation of China (Grant No. 81803734) and the National S&T Major Special Project for New Innovative Drugs (Grant No. 2019ZX09201005).
Abstract: Complex systems exist widely, including medicines from natural products, functional foods, and biological samples. The biological activity of complex systems is often the result of the synergistic effect of multiple components, so the quality evaluation of complex samples usually requires multicomponent quantitative analysis (MCQA). To overcome the difficulty of obtaining standard products, scholars have proposed achieving MCQA through the "single standard to determine multiple components" (SSDMC) approach. This method has been used in the determination of multicomponent content in natural-source drugs and the analysis of impurities in chemical drugs, and has been included in the Chinese Pharmacopoeia. Relying on a convenient (ultra-)high-performance liquid chromatography method, how can the repeatability and robustness of the MCQA method be improved? How can the chromatography conditions be optimized to increase the number of quantifiable components? How can computer software technology be introduced to improve the efficiency of multicomponent analysis (MCA)? These are the key problems that remain to be solved in practical MCQA. First, this review summarizes the calculation methods for relative correction factors in the SSDMC approach over the past five years, together with the evaluation of method robustness and accuracy. Second, it summarizes methods to improve peak capacity and quantitative accuracy in MCA, including column selection and two-dimensional chromatographic analysis technology. Finally, computer software technologies for predicting chromatographic conditions and analytical parameters are introduced, providing an avenue toward intelligent method development in MCA. This paper aims to provide methodological ideas for the improvement of complex system analysis, especially MCQA.
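The SSDMC calculation itself is a small amount of arithmetic once the relative correction factors are fixed during validation. A hedged sketch with hypothetical component names, areas and factors (one common convention defines the factor of component i against the single standard s as f_i = (A_s/C_s)/(A_i/C_i)):

```python
# Single standard to determine multiple components (SSDMC), hypothetical values.
# Relative correction factors fixed during method validation:
#   f_i = (A_s / C_s) / (A_i / C_i)
f = {"component_B": 1.25, "component_C": 0.80}   # hypothetical factors

# One run with the single reference standard at a known concentration:
A_s, C_s = 1500.0, 50.0      # peak area and concentration (ug/mL) of standard
k_s = A_s / C_s              # response factor of the standard

# Peak areas of the other components in the sample chromatogram:
areas = {"component_B": 900.0, "component_C": 2400.0}

# Concentration of each component from its own area and correction factor:
conc = {name: areas[name] * f[name] / k_s for name in areas}
```

Only one reference substance is purchased; every other component's concentration follows from its peak area and its pre-validated factor.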
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 11472137 and U2141246).
Abstract: In this paper, a dynamic modeling method for motor-driven electromechanical systems is presented, and the uncertainty quantification of mechanism motion is investigated based on this method. The main contribution is a novel mechanism-motor coupling dynamic modeling method, in which the relationship between mechanism motion and motor rotation is established according to the geometric coordination of the system. Its advantages include establishing intuitive coupling between the mechanism and motor, facilitating discussion of the influence of both mechanical and electrical parameters on the mechanism, and enabling dynamic simulation with a controller that takes the randomness of the electric load into account. A dynamic simulation of an ammunition delivery system considering feedback control is carried out, and the feasibility of the model is verified experimentally. Based on probability density evolution theory, we comprehensively discuss the effects of system parameters on mechanism motion from the perspective of uncertainty quantification. This work not only provides guidance for the engineering design of ammunition delivery mechanisms, but also provides theoretical support for modeling and uncertainty quantification research on mechatronic systems.
Funding: This study was performed with the support of the project "Fluid dynamics of hearts at risk of failure: towards methods for the prediction of disease progression", funded by the Italian Ministry of Education and University (Grant 2017A889FP).
Abstract: Cardiac modeling entails epistemic uncertainty in the input parameters, such as bundle and chamber geometry, electrical conductivities and cell parameters, thus calling for an uncertainty quantification (UQ) analysis. Since cardiac activation and the subsequent muscular contraction are provided by a complex electrophysiology system made of interconnected conductive media, we focus here on the fast conductivity structures of the atria (internodal pathways) with the aim of identifying which of the uncertain inputs most influence the propagation of the depolarization front. First, the distributions of the input parameters are calibrated using data available from the literature, taking gender differences into account. Output quantities of interest (QoIs) of medical relevance are defined, and a set of metamodels (one for each QoI) is then trained according to a polynomial chaos expansion (PCE) in order to run a global sensitivity analysis with nonlinear variance-based Sobol' indices, with confidence intervals evaluated through the bootstrap method. The most sensitive parameters for each QoI are then identified for both genders, showing the same order of importance of the model inputs on the electrical activation. Lastly, the probability distributions of the QoIs are obtained through a forward sensitivity analysis using the same trained metamodels. It turns out that several input parameters, including the position of the internodal pathways and the electrical impulse applied at the sinoatrial node, have little influence on the QoIs studied. Conversely, the electrical activation of the atrial fast conduction system is sensitive to the bundle geometry and electrical conductivities, which therefore need to be carefully measured or calibrated for the electrophysiology model to be accurate and predictive.
Funding: Modernization of Traditional Chinese Medicine Project of the National Key R&D Program of China: The construction of the theoretical system of Traditional Chinese Medicine non-pharmacological therapy based on body surface stimulation (2023YFC3502704); Sichuan Provincial Science and Technology Program Projects: Research and Development of Chinese Medicine Intelligent Tongue Diagnosis Equipment for Digestive System Chinese Medicine Advantageous Diseases (2023YFS0327), Research and Development of a Chinese Medicine Intelligent Detection System for Intestinal Functions (2024YFFK0044), and Research and Application of a Chinese Medicine Diagnosis and Treatment Program for Herpes Zoster Treated by Shu Pai Fire Acupuncture (2024YFFK0089); Major Research and Development Project of the China Academy of Chinese Medical Sciences Innovation: Construction and application of the theoretical research mode of Traditional Chinese Medicine diagnosis and treatment of modern diseases (CI2021A00104).
Abstract: OBJECTIVE: To propose an automatic acupuncture robot system for performing acupuncture operations. METHODS: The acupuncture robot system consists of three components: automatic acupoint localization, acupuncture manipulations, and De Qi sensation detection. The OptiTrack motion capture system is used to locate acupoints, which are then translated into coordinates in the robot control system. A flexible collaborative robot with an intelligent gripper is then used to perform acupuncture manipulations with high precision. In addition, a De Qi sensation detection system is proposed to evaluate the effect of acupuncture. To verify the stability of the designed acupuncture robot, acupoint coordinates localized by the robot are compared with the gold standard labeled by a professional acupuncturist using significance tests. RESULTS: In repeated experiments on eight acupoints, the acupuncture robot achieved a positioning error within 3.3 mm, which is within the allowable range for needle insertion at acupoints. During needle insertion, the robot arm followed the prescribed trajectory with a mean deviation distance of 0.02 mm and a deviation angle of less than 0.15°. The results of the lifting-thrusting operation in the Xingzhen process show that the mean acupuncture depth error of the designed robot is approximately 2 mm, which is within the recommended depth range for the Xingzhen operation. In addition, the average detection accuracy of the De Qi keywords is 94.52%, which meets the requirements of acupuncture effect testing across different dialects. CONCLUSION: The proposed acupuncture robot system streamlines the acupuncture process, increases efficiency, and reduces practitioner fatigue, while also allowing for the quantification of acupuncture manipulations and evaluation of therapeutic effects. The development of an acupuncture robot system has the potential to revolutionize low back pain treatment and improve patient outcomes.
Funding: Co-supported by the National Natural Science Foundation of China (Nos. 51875014, U2233212 and 51875015), the Natural Science Foundation of Beijing Municipality, China (No. L221008), the Science and Technology Innovation 2025 Major Project of Ningbo, China (No. 2022Z005), and the Tianmushan Laboratory Project, China (No. TK2023-B-001).
Abstract: For uncertainty quantification of complex models with high-dimensional, nonlinear, multi-component coupling, such as digital twins, traditional statistical sampling methods such as random sampling and Latin hypercube sampling require a large number of samples, which entails huge computational costs. How to construct a small sample space has therefore become a topic of strong research interest. To this end, this paper proposes a sequential search-based Latin hypercube sampling scheme to generate efficient and accurate samples for uncertainty quantification. First, the sampling range is formed by characterizing the polymorphic uncertainty based on theoretical analysis. Then, the optimal Latin hypercube design is selected using the Latin hypercube sampling method combined with the "space filling" criterion. Finally, a sample selection function is established, and the next most informative sample is optimally selected to obtain the sequential test sample. Compared with classical sampling methods, the generated samples retain more information while remaining sparse. A series of numerical experiments demonstrates the superiority of the proposed sequential search-based Latin hypercube sampling scheme, which provides reliable uncertainty quantification results with small sample sizes.
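The baseline the paper builds on, plain Latin hypercube sampling, places exactly one sample in each equal-probability stratum of every dimension. A minimal NumPy sketch of that baseline (not the paper's sequential search refinement):

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """Basic Latin hypercube: one sample per equal-probability stratum per dim."""
    # Row i gets a jittered point inside stratum [i/n, (i+1)/n) in every column.
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
    # Independently shuffle the strata within each column.
    for j in range(d):
        u[:, j] = u[rng.permutation(n), j]
    return u

rng = np.random.default_rng(42)
X = latin_hypercube(8, 2, rng)

# Stratification check: each dimension has exactly one point per 1/8 interval.
counts = [np.bincount((X[:, j] * 8).astype(int), minlength=8) for j in range(2)]
```

The sequential scheme in the paper then augments such a design point by point via a sample-selection function, rather than drawing all n samples at once.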
Funding: Funding from the Natural Sciences and Engineering Research Council of Canada, award number RGPIN/4002-2020.
Abstract: Viral diseases are an important threat to crop yield, as they are responsible for losses greater than US$30 billion annually. Thus, understanding the dynamics of virus propagation within plant cells is essential for devising effective control strategies. However, viruses are complex to propagate and quantify, and existing methodologies for viral quantification tend to be expensive and time-consuming. Here, we present a rapid, cost-effective approach to quantify viral propagation using an engineered virus expressing a fluorescent reporter. Using a microplate reader, we measured viral protein levels, and we validated our findings through comparison with western blot analysis of the viral coat protein, the most common approach to quantifying viral titer. Our proposed methodology provides a practical and accessible approach to studying virus-host interactions and could contribute to enhancing our understanding of plant virology.
Abstract: In pre-modern China, systematic records of astronomical phenomena measured in units including "zhang," "chi," and "cun" were kept, alongside records using phrases such as "as large as something" (making comparison with images, 取象比类). Some of these records survive, and together they constitute a "scale system" for records of astronomical phenomena. A textual model of the celestial sphere based on naked-eye observations also survives. According to this model, and using the conversion ratio that 1 chi equals 1 degree, the author has reconstituted the geometric meaning of the records of "zhang," "chi," and "cun," while records that compare with a certain object have been converted to apparent diameters or magnitudes. Finally, these two kinds of records are integrated into one system. Through an analysis of the origin of the chi system, the author finds that the radius of the celestial sphere, as the ancients observed the sky with the naked eye, was about 13 meters. The paper supports this conclusion from various perspectives, including psychological factors, the radius of planetariums, and the nautical Method of Reckoning by the Stars. As naked-eye observers always regard the vault of heaven as a flattened hemisphere, they share a common false impression, and the pre-modern observational data therefore contain systematic errors. Under different illumination and weather conditions, the vault of heaven appears flattened to varying degrees, for which the author has defined "the angle of apparent flattening" (视扁度角). To correct these visual errors, the author has devised a series of calculation tables for daytime, nighttime, cloudy days, clear days, moonlit nights, and moonless nights, to convert the apparent heights or sizes of celestial bodies to true heights or sizes.
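The ~13 m radius follows from simple arc-length geometry once a metric length is assumed for the chi. A back-of-envelope check, assuming 1 chi ≈ 0.23 m (one commonly cited pre-modern value; the exact length varied by period, so this is an illustrative assumption, not the paper's derivation):

```python
import math

# If 1 chi of arc on the celestial vault subtends 1 degree at the observer,
# then s = r * theta gives the implied radius of the vault.
chi_in_m = 0.23                         # assumed metric length of one chi
radius = chi_in_m / math.radians(1.0)   # r = s / theta, theta in radians
```

With this assumed chi the radius comes out close to 13 m, consistent with the figure reported in the abstract.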
Abstract: BACKGROUND The Streptococcus salivarius (S. salivarius) group, which produces the enzyme urease, has been identified as a potential contributor to ammonia production in the gut. Researchers have reported that patients with minimal hepatic encephalopathy have an increased abundance of the S. salivarius group, a specific change in the gut microbiota that distinguishes them from healthy individuals. The correlation between the aggregation of specific bacterial species and fibrosis progression in chronic liver disease (CLD) is yet to be fully elucidated. AIM To quantify S. salivarius using digital PCR (dPCR) as a liver fibrosis marker in CLD. METHODS This study retrospectively analysed 52 patients with CLD. To quantify S. salivarius in patients with CLD using dPCR, we evaluated the specificity and sensitivity of the S. salivarius bacterial load assay by dPCR on a type strain. Next, we evaluated the clinical usefulness of dPCR quantification of S. salivarius load for detecting liver fibrosis in patients with CLD. The liver fibrosis stage was categorized into mild and advanced fibrosis based on pathological findings. RESULTS The dPCR assay revealed that S. salivarius was highly positive for the tnpA gene. The lower limit of quantification for dPCR using the tnpA gene with a 1 μL template comprising 1.28×10^2 CFU/mL was 4.3 copies. After considering the detection range of dPCR, we adjusted the extracted DNA concentration to 5.0×10^-4 ng/μL from 200 mg stool samples. The median bacterial loads of S. salivarius in stool samples from patients with mild and advanced fibrosis were 1.9 and 7.4 copies/μL, respectively. Quantifiable S. salivarius loads were observed more frequently in patients with advanced fibrosis than in those with mild fibrosis (P = 0.032). CONCLUSION Quantifying S. salivarius load using digital PCR is a useful biomarker for liver fibrosis in patients with CLD.
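Digital PCR reports copies/μL via Poisson correction of the positive-partition fraction, which is how loads like the 1.9 and 7.4 copies/μL above are derived. A sketch with hypothetical partition counts and an assumed partition volume (both are illustrative, not values from this study):

```python
import math

# Hypothetical digital-PCR run: Poisson statistics turn the fraction of
# positive partitions into a mean copies-per-partition, hence copies/uL.
total_partitions = 20000
positive = 3000
partition_volume_ul = 0.00085     # ~0.85 nL per partition (assumed)

p = positive / total_partitions   # fraction of positive partitions
lam = -math.log(1 - p)            # mean copies per partition (Poisson)
copies_per_ul = lam / partition_volume_ul
```

Note that lam exceeds p: the Poisson correction accounts for partitions that received more than one template copy.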
Abstract: During high-speed forward flight, helicopter rotor blades operate across a wide range of Reynolds and Mach numbers. Under such conditions, their aerodynamic performance is significantly influenced by dynamic stall, a complex, unsteady flow phenomenon highly sensitive to inlet conditions such as Mach and Reynolds numbers. The key features of three-dimensional blade stall can be effectively represented by the dynamic stall behavior of a pitching airfoil. In this study, we conduct an uncertainty quantification analysis of dynamic stall aerodynamics in high-Mach-number flows over pitching airfoils, accounting for uncertainties in inlet parameters. A computational fluid dynamics (CFD) model based on the compressible unsteady Reynolds-averaged Navier-Stokes (URANS) equations, coupled with sliding mesh techniques, is developed to simulate the unsteady aerodynamic behavior and associated flow fields. To efficiently capture the aerodynamic responses while maintaining high accuracy, a multi-fidelity Co-Kriging surrogate model is constructed. This model integrates the precision of high-fidelity wind tunnel experiments with the computational efficiency of lower-fidelity URANS simulations, and its accuracy is validated through direct comparison with experimental data. Building upon this surrogate model, we employ interval analysis and the Sobol sensitivity method to quantify the uncertainty and parameter sensitivity of the unsteady aerodynamic forces resulting from inlet condition variability. Both the inlet Mach number and Reynolds number are treated as uncertain inputs, modeled using interval representations. Our results demonstrate that variations in Mach number contribute far more significantly to aerodynamic uncertainty than those in Reynolds number. Moreover, the presence of dynamic stall vortices markedly amplifies the aerodynamic sensitivity to Mach number fluctuations.
Funding: Supported by the National Natural Science Foundation of China (Nos. 52303380, 52025132, 52273305, 22205185, 21621091, 22021001, and 22121001), the Fundamental Research Funds for the Central Universities (No. 20720240041), the 111 Project (Nos. B17027 and B16029), the Natural Science Foundation of Fujian Province of China (No. 2022J02059), the Science and Technology Projects of the Innovation Laboratory for Sciences and Technologies of Energy Materials of Fujian Province (No. RD2022070601), and the New Cornerstone Science Foundation through the XPLORER PRIZE.
Abstract: Excessive Fe^3+ ion concentrations in wastewater pose a long-standing threat to human health. Achieving low-cost, high-efficiency quantification of Fe^3+ ion concentration in unknown solutions can guide environmental management decisions and optimize water treatment processes. In this study, by leveraging the rapid, real-time detection capabilities of nanopores and the specific chemical binding affinity of tannic acid for Fe^3+, a linear relationship between the ion current and Fe^3+ ion concentration was established. Using this linear relationship, quantification of the Fe^3+ ion concentration in unknown solutions was achieved. Furthermore, ethylenediaminetetraacetic acid disodium salt was employed to displace Fe^3+ from the nanopores, allowing them to be restored to their initial condition and reused for Fe^3+ ion quantification. The reusable bioinspired nanopores remain functional after 330 days of storage. This recycling capability and the long-term stability of the nanopores contribute to a significant reduction in costs. This study provides a strategy for the quantification of unknown Fe^3+ concentrations using nanopores, with potential applications in environmental assessment, health monitoring, and related fields.
Funding: Funded by the Haikou Science and Technology Plan Project (2022-007), and in part by the Key Laboratory of PK System Technologies Research of Hainan, China.
Abstract: In the data transaction process within a data asset trading platform, quantifying the trustworthiness of data source nodes is challenging due to their numerous attributes and complex structures. To address this issue, a distributed data source trust assessment management framework, a trust quantification model, and a dynamic adjustment mechanism are proposed. The model integrates the Analytic Hierarchy Process (AHP) and Dempster-Shafer (D-S) evidence theory to determine attribute weights and calculate direct trust values, while the PageRank algorithm is employed to derive indirect trust values. The direct and indirect trust values are then combined to compute the comprehensive trust value of the data source. Furthermore, a dynamic adjustment mechanism is introduced to continuously update the comprehensive trust value based on historical assessment data. By leveraging the collaborative efforts of multiple nodes in the distributed network, the proposed framework enables a comprehensive, dynamic, and objective evaluation of data source trustworthiness. Extensive experimental analyses demonstrate that the trust quantification model effectively handles large-scale data source trust assessments, exhibiting both strong trust differentiation capability and high robustness.
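The indirect-trust component above uses PageRank, which can be sketched as power iteration on a row-stochastic endorsement matrix. A minimal example on a hypothetical four-node endorsement graph (the graph and damping factor are illustrative assumptions, not the paper's configuration):

```python
import numpy as np

# Hypothetical endorsement graph: adj[i, j] = 1 means node i endorses node j.
adj = np.array([[0, 1, 1, 0],
                [0, 0, 1, 0],
                [1, 0, 0, 1],
                [0, 0, 1, 0]], dtype=float)

d = 0.85                                   # damping factor (assumed)
out = adj.sum(axis=1, keepdims=True)       # every node here has outlinks
M = adj / out                              # row-stochastic transition matrix
n = adj.shape[0]

r = np.full(n, 1.0 / n)                    # uniform initial trust
for _ in range(100):                       # power iteration to convergence
    r = (1 - d) / n + d * M.T @ r          # indirect trust scores
```

Node 2, endorsed by all three other nodes, ends up with the highest indirect trust; in the paper these scores are then fused with the AHP/D-S direct trust values.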
Funding: Supported by the National Natural Science Foundation of China (12305185), the Natural Science Foundation of Hunan Province, China (No. 2023JJ50122), the International Cooperative Research Project of the Ministry of Education, China (No. HZKY20220355), and the Scientific Research Foundation of the Education Department of Hunan Province, China (No. 22A0307).
Abstract: Compared to other energy sources, nuclear reactors offer several advantages as a spacecraft power source, including compact size, high power density, and long operating life. These qualities make nuclear power an ideal energy source for future deep space exploration. A whole-system model of the space nuclear reactor, comprising the reactor neutron kinetics, reactivity control, reactor heat transfer, heat exchanger, and thermoelectric converter, was developed. In addition, an electrical power control system was designed based on the developed dynamic model. The GRS method was used to quantitatively calculate the uncertainty of the coupling parameters of the neutronics, thermal hydraulics, and control system for the space reactor. The Spearman correlation coefficient was applied in the sensitivity analysis of system input parameters with respect to output parameters. The calculation results showed that the uncertainty of the output parameters caused by the coupling parameters had the most considerable variation, with a relative standard deviation of less than 2.01%. The effective delayed neutron fraction was most sensitive for electrical power. To obtain optimal control performance, the non-dominated sorting genetic algorithm was employed to optimize the controller parameters based on the uncertainty quantification results. Two typical transient simulations were conducted to test the adaptive ability of the optimized controller in the uncertain dynamic system: a step load reduction from 100% full power (FP) to 90% FP, and a 5% FP/min linear variable-load transient. The results showed that, considering the influence of system uncertainty, the optimized controller improved the response speed and load-following accuracy of the electrical power control, verifying its effectiveness and superiority.
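The Spearman-based sensitivity screening mentioned in this abstract amounts to rank-correlating each sampled input parameter with the output of interest. The toy response model and sample ranges below are assumptions chosen so that, as in the study's finding, the effective delayed neutron fraction dominates the output; they are not the paper's actual model.

```python
import numpy as np

def spearman(x, y):
    """Spearman rank correlation via Pearson correlation of the ranks
    (no tie handling; the samples here are continuous, so ties are not expected)."""
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

rng = np.random.default_rng(0)
n = 200
beta_eff = rng.uniform(0.006, 0.008, n)   # effective delayed neutron fraction samples
htc = rng.uniform(900.0, 1100.0, n)       # heat transfer coefficient samples (assumed)

# Toy output: electrical power responds strongly to beta_eff, weakly to htc.
power = 100.0 * (beta_eff / 0.007) + 0.001 * htc + rng.normal(0.0, 0.5, n)

for name, x in [("beta_eff", beta_eff), ("htc", htc)]:
    print(name, round(spearman(x, power), 2))
```

A real GRS study would draw the samples from the parameters' specified uncertainty distributions and propagate them through the coupled system model rather than through a closed-form toy response.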
Funding: Hefei Municipal Natural Science Foundation, Grant/Award Number: 2022009; Suqian Guiding Program Project, Grant/Award Number: Z202309; Suqian Traditional Chinese Medicine Science and Technology Plan, Grant/Award Number: MS202301.
Abstract: Quantitative analysis of clinical function parameters from MRI images is crucial for diagnosing and assessing cardiovascular disease. However, the manual calculation of these parameters is challenging due to the high variability among patients and the time-consuming nature of the process. In this study, the authors introduce a framework named MultiJSQ, comprising a feature representation network (FRN) and an indicator prediction network (IEN), which is designed for simultaneous joint segmentation and quantification. The FRN is tailored for representing global image features, facilitating the direct acquisition of left ventricle (LV) contour images through pixel classification. Additionally, the IEN incorporates specifically designed modules to extract the relevant clinical indices. The authors' method considers the interdependence of the different tasks, demonstrating the validity of these relationships and yielding favourable results. Through extensive experiments on cardiac MR images from 145 patients, MultiJSQ achieves impressive outcomes, with low mean absolute errors of 124 mm^(2), 1.72 mm, and 1.21 mm for areas, dimensions, and regional wall thicknesses, respectively, along with a Dice metric score of 0.908. The experimental findings underscore the excellent performance of the framework in LV segmentation and quantification, highlighting its promising clinical application prospects.
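The Dice metric reported in this abstract is a standard overlap score between a predicted and a reference binary segmentation mask. The sketch below computes it on toy arrays, not on cardiac MR data.

```python
import numpy as np

def dice(pred, gt):
    """Dice = 2|A∩B| / (|A| + |B|) for binary masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    return 2.0 * inter / (pred.sum() + gt.sum())

gt = np.zeros((8, 8), dtype=int)
gt[2:6, 2:6] = 1        # reference mask: 16 pixels
pred = np.zeros((8, 8), dtype=int)
pred[3:7, 2:6] = 1      # prediction shifted down one row: 12 pixels overlap
print(round(dice(pred, gt), 3))  # 2*12 / (16+16) = 0.75
```

A score of 1.0 means perfect overlap; the 0.908 reported for MultiJSQ indicates the predicted LV contours closely match the expert annotations.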