The monument thermal effect (MTE) displacements can produce periodic signals with magnitudes of several millimeters in the vertical and horizontal GPS position time series. However, the interaction of periodic signals of various origins in GPS observations makes it difficult to isolate the millimeter-level MTE displacement from other signals and noise. In this study, to assess the diurnal and semidiurnal signals induced by MTE, we processed 12 very short GPS baselines (VSGB) with lengths < 150 m. The monument pairs for each baseline differ in their heights, horizontal structure, or base foundations. Two zero-baselines were also processed as a control group. Results showed that the seasonal signals observed in the VSGB time series in the horizontal and vertical directions were mainly induced by seasonal MTE. Time-varying diurnal and semidiurnal signals with amplitudes up to 4 mm were observed in the vertical direction for baselines with a monument height difference (MHD) larger than 10 m. A horizontal diurnal signal with an amplitude of about 2 mm was also detected for baselines with non-axisymmetric monument structures. The orientation of the detected horizontal displacement was coherent with the direction of daily temperature variation (DTV) driven by direct solar radiation, which indicates that the diurnal and semidiurnal signals are likely induced by MTE. The observed high-frequency MTE displacements, if not well modeled and removed, may propagate into spurious long-term signals and bias velocity estimates in daily GPS time series.
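One generic way to quantify diurnal and semidiurnal amplitudes of the kind reported here is least-squares harmonic fitting of a sub-daily position series. The sketch below is a minimal illustration of that technique, not the study's processing chain; the series `t`/`up` and the 30-day, 30-minute sampling are synthetic assumptions.

```python
# Minimal sketch: estimate diurnal (S1) and semidiurnal (S2) amplitudes of a
# sub-daily baseline component by least squares. Inputs are synthetic, not
# data from the study.
import numpy as np

def harmonic_amplitudes(t, y, periods=(1.0, 0.5)):
    """Fit offset + trend + sin/cos pairs at the given periods (days)."""
    cols = [np.ones_like(t), t]
    for p in periods:
        w = 2.0 * np.pi / p
        cols += [np.sin(w * t), np.cos(w * t)]
    A = np.column_stack(cols)
    x, *_ = np.linalg.lstsq(A, y, rcond=None)
    # Amplitude of each constituent from its sine/cosine coefficient pair.
    return [np.hypot(x[2 + 2 * i], x[3 + 2 * i]) for i in range(len(periods))]

t = np.arange(0, 30, 1 / 48)                               # 30 days, 30-min sampling
up = 4 * np.sin(2 * np.pi * t) + np.random.randn(t.size)   # synthetic 4 mm S1 signal
print(harmonic_amplitudes(t, up))                          # ~[4.0, ~0] mm
```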
In the case of a medium-long baseline, the fixed rate of integer ambiguity resolution in real-time kinematic (RTK) positioning is low due to the distance between the base station and the observation station. Moreover, the atmospheric delay remaining after differential processing cannot be ignored. To correct the residual atmospheric errors, we propose a GPS/BDS/Galileo/GLONASS four-system fusion RTK positioning algorithm based on the extended Kalman filter (EKF). After realizing the spatio-temporal unification of the multiple global navigation satellite systems (GNSSs), we introduce EKF-based parameter estimation of the atmospheric errors and use the least-squares ambiguity decorrelation adjustment (LAMBDA) method to resolve the integer ambiguities. Experiments on different baselines show that the proposed RTK positioning algorithm can achieve centimeter-level positioning accuracy for medium-long baselines. In addition, the time required to obtain a fixed solution is shorter than that of the traditional RTK positioning algorithm.
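The core idea, estimating the residual atmospheric delay as an extra filter state, can be sketched generically. The snippet below is a toy state-augmentation example, not the paper's algorithm; the state layout, geometry matrix, and noise values are all illustrative assumptions, and `x[4:]` would then be handed to a LAMBDA implementation.

```python
# Generic EKF step with the residual zenith atmospheric delay modeled as a
# random-walk state alongside position and float ambiguities. All matrices
# and tuning values are illustrative assumptions.
import numpy as np

def ekf_step(x, P, z, H, R, Q):
    """One predict/update cycle with an identity state transition."""
    P = P + Q                                   # predict: random-walk states
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
    x = x + K @ (z - H @ x)                     # update with DD residuals z
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# State: [dE, dN, dU, zenith_delay, N1, N2]; Q lets the delay vary slowly.
x = np.zeros(6)
P = np.eye(6) * 10.0
Q = np.diag([1e-4] * 3 + [1e-6] + [0.0, 0.0])
H = np.array([[1.0, 0, 0, 0.5, 1, 0],
              [0.0, 1, 0, 0.5, 0, 1]])          # toy DD geometry + mapping
z = np.array([0.012, -0.008])
R = np.eye(2) * 1e-4
x, P = ekf_step(x, P, z, H, R, Q)               # x[4:] would go to LAMBDA
```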
Drifting baselines in capillary electrophoresis greatly affect the accuracy of analysis. This paper presents a Threshold Fitting Technique (TFT) to subtract the baselines from the original signals and correct them. In TFT, wavelet and curve-fitting techniques are applied jointly, and thresholds are determined automatically by the computer. Many signal-processing experiments indicate that TFT is simple to use, involves few operator-dependent factors, and gives satisfactory results. TFT can be applied to noisy signals without any pre-processing.
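TFT itself combines wavelet denoising with curve fitting; the sketch below shows only the thresholded-fitting half of that idea, a common iterative polynomial baseline estimate in which points above the current fit (peaks) are clipped on each pass. The degree, iteration count, and synthetic signal are assumptions.

```python
# Iterative thresholded polynomial fit of a drifting baseline: on each pass,
# samples above the current fit (i.e., peaks) are clipped before refitting.
import numpy as np

def fit_baseline(y, degree=3, n_iter=20):
    x = np.arange(y.size)
    base = y.copy()
    for _ in range(n_iter):
        coef = np.polyfit(x, base, degree)
        fit = np.polyval(coef, x)
        base = np.minimum(base, fit)    # clip points above the fit
    return fit

signal = np.exp(-0.5 * ((np.arange(500) - 250) / 8) ** 2)   # one peak
drift = 0.002 * np.arange(500)                              # linear drift
corrected = (signal + drift) - fit_baseline(signal + drift)
```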
Methane (CH4) is a very potent greenhouse gas, with a global warming potential approximately 80 times that of carbon dioxide (CO2) on a decadal scale [1]. By November 2024, 159 countries had signed the Global Methane Pledge, committing to reduce anthropogenic emissions by at least 30% by 2030 relative to the 2020 level (https://www.globalmethanepledge.org). In November 2023, the Chinese government released an action plan dedicated to methane emission control, including specific targets for different source sectors (https://www.mee.gov.cn/). However, large discrepancies still exist among China's current methane inventories, hindering the setting of mitigation goals. At present, national anthropogenic methane emission estimates still vary by at least 30%, and sectoral emission estimates may differ by a factor of 2-5 (Fig. 1). To support China's methane emission control plan, a key priority is to determine the emission baselines, which this work aims to address.
The basic communication standards that we rely upon reduce friction for consumers, customers, and businesses; speed innovation; and allow for cross-border collaboration among companies. That is critical for establishing global baselines of trust. What standards can do that politics cannot is bring people together, encourage collaboration and trusting relationships, and enable the sharing and development of interoperable communication systems. Today we are building roads that lead to quantum technologies, artificial intelligence, and blockchain technologies, but the rules of those roads are yet to be written. The industry has an important role to play in developing them. Those rules need to start from a very basic foundation: data identification secured by design standards, standards for privacy protection, and watermarking, tracing, and accountability of AI technologies and algorithms. These are all very achievable objectives, and together they form a foundation of trust that customers and governments worldwide can rely upon.
This paper delves into baseline designs under the baseline parameterization model in experimental design, focusing on the relationship between the K-aberration criterion and the word length pattern (WLP) of regular two-level designs. It provides a detailed analysis of the relationship between K5 and the WLP for regular two-level designs with resolution t = 3 and proposes corresponding theoretical results. These results not only reveal the theoretical connection between the orthogonal parameterization model and the baseline parameterization model, but also support the search for K-aberration-optimal regular two-level baseline designs, and they demonstrate how to apply the theory to evaluate and select optimal experimental designs. In practice, experimental designers can use these results to quickly assess and select regular two-level baseline designs with minimal K-aberration by analyzing the WLP of a candidate design. This allows the key factors that significantly affect the experimental outcomes to be identified without frequently changing the factor levels, thereby maximizing the benefits of the experiment.
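The WLP referenced above counts the words of each length in a regular design's defining contrast subgroup. A minimal sketch of that computation follows, using an example 2^{5-2} design with generators D = AB and E = AC; this example design is illustrative and is not taken from the paper.

```python
# Compute the word length pattern (WLP) of a regular two-level design by
# enumerating its defining contrast subgroup from the generator words.
from itertools import combinations

def wlp(generators, n_factors):
    words = set()
    for r in range(1, len(generators) + 1):
        for combo in combinations(generators, r):
            w = frozenset()                  # product of words = symmetric diff
            for g in combo:
                w = w ^ frozenset(g)
            words.add(w)
    counts = [0] * (n_factors + 1)
    for w in words:
        counts[len(w)] += 1
    return counts[3:]                        # (A3, A4, ...) by convention

# 2^{5-2} example with D = AB, E = AC: words ABD, ACE, BCDE.
print(wlp(["ABD", "ACE"], 5))                # -> [2, 1, 0], i.e., A3=2, A4=1
```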
Extreme disturbance activity is a signature of anthropogenic environmental change. Empirical information describing the historical normative limits of disturbance regimes provides baseline data that facilitates the detection of contemporary trends in both disturbances and community-level responses. Quantifying the attributes of historical disturbances is challenging due to their transient, episodic nature, with decades- to centuries-long intervals between recurrences. Unmanaged primary forests that support centuries-old trees therefore serve as unique reference systems for quantifying past disturbance regimes. We surveyed relict stands of primary beech-dominated forests over wide environmental gradients in the Carpathian Mountains of Europe. We collected core samples from 3,026 trees in 208 field survey plots distributed across 13 forest stands in two countries. We used dendrochronological methods to analyze time series of annually resolved ring-width variation and to identify anomalous growth patterns diagnostic of past forest canopy removal. A 180-year record (1810-1990) of spatially and temporally explicit disturbance events (n = 333) was compiled and used to derive statistical attributes of the disturbance regime. We quantified disturbance severity (canopy area lost), patch size, and return intervals. Our analyses describe a complex regime in which a background of relatively frequent, small-scale, low- to intermediate-severity disturbance was punctuated by episodic large-scale, high-severity events. Even the most severe events were non-catastrophic at the stand level, leaving significant residual tree cover that supported a continuity of ecological function. We did not detect evidence for an expected climate-induced intensification of disturbance over time, but methodological limitations precluded an assessment of disturbance activity in the decades since 1990.
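A standard way to flag the "anomalous growth patterns diagnostic of past canopy removal" mentioned above is the percent-growth-change release criterion of Nowacki and Abrams. The sketch below applies that generic criterion, not necessarily the study's exact detection rules; the 10-year windows and 25% threshold are common conventions assumed here.

```python
# Percent-growth-change release detection: flag years where the mean ring
# width of the following window exceeds that of the preceding window by
# more than the threshold, suggesting canopy removal above the tree.
import numpy as np

def detect_releases(rw, window=10, threshold=0.25):
    """Return indices where mean growth jumps by more than `threshold`."""
    events = []
    for i in range(window, rw.size - window):
        m1 = rw[i - window:i].mean()         # prior 10-yr mean
        m2 = rw[i:i + window].mean()         # subsequent 10-yr mean
        if (m2 - m1) / m1 > threshold:
            events.append(i)
    return events

rings = np.r_[np.full(40, 1.0), np.full(40, 1.8)]   # abrupt growth release
print(detect_releases(rings))                        # flags years around 40
```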
Toroidal torques generated by the resonant magnetic perturbation (RMP) and acting on the plasma column are systematically investigated numerically for an ITER baseline scenario. The neoclassical toroidal viscosity (NTV), in particular its resonant portion, is found to provide the dominant contribution to the total toroidal torque in the slow plasma flow regime in ITER. While the electromagnetic torque always opposes the plasma flow, the toroidal torque associated with the Reynolds stress enhances the plasma flow independent of the flow direction. A peculiar double-peak structure in the net NTV torque is robustly computed for ITER as the toroidal rotation frequency is scanned near zero. This structure is ultimately due to a non-monotonic behavior of the wave-particle resonance integral (over the particle pitch angle) in the superbanana-plateau NTV regime in ITER. These findings are qualitatively insensitive to variations of a range of factors, including the wall resistivity, the plasma pedestal flow, and the assumed frequency of the rotating RMP field.
Sea topography information holds significant importance in oceanic research and the detection of climate change. Radar imaging altimetry has emerged as the leading approach for global ocean observation, employing synthetic aperture radar (SAR) interferometry to enhance the spatial resolution of sea topography. Nevertheless, current payload capacity and satellite hardware limitations prevent extending the interferometric baseline by enlarging the physical antenna size, a constraint that hinders achieving centimeter-level accuracy in interferometric altimetry. To address this challenge, we conducted a numerical simulation to assess the viability of a large-baseline interferometric imaging altimeter (LB-IIA). We control the baseline within the range of 600-1000 m through a spiral orbit design for two satellites and mitigate baseline decorrelation with the carrier frequency shift (CFS) technique. Our findings demonstrate the efficacy of the CFS technique in compensating for baseline decoherence, elevating coherence from less than 0.1 to over 0.85. Concurrently, the height-difference accuracy between neighboring sea surfaces reaches 1 cm at a 1 km resolution. This study is anticipated to serve as a foundational reference for future interferometric imaging altimeter development, catering to the demand for high-precision sea topography data in accurate global bathymetry inversion.
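The decorrelation that CFS compensates is the classic InSAR spectral (wavenumber) shift, which for flat terrain is approximately Δf = c·B_perp/(λ·R·tanθ). The sketch below evaluates that textbook formula; the Ka-band wavelength, slant range, and incidence angle are illustrative assumptions, not the LB-IIA design values.

```python
# Standard flat-terrain InSAR spectral shift: the carrier frequency offset
# needed on one receiver to realign the two ground-range spectra.
import math

def spectral_shift(b_perp, wavelength, slant_range, incidence_deg):
    theta = math.radians(incidence_deg)
    c = 299_792_458.0
    return c * b_perp / (wavelength * slant_range * math.tan(theta))

# e.g., 800 m baseline, Ka band (~8.4 mm), ~900 km slant range, 4 deg incidence
df = spectral_shift(800.0, 0.0084, 900e3, 4.0)
print(f"required carrier shift ~ {df / 1e6:.0f} MHz")
```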
The natural visibility graph method has been widely used in physiological signal analysis, but it fails to accurately handle signals with data points below the baseline. Such signals are common across various physiological measurements, including electroencephalography (EEG) and functional magnetic resonance imaging (fMRI), and are crucial for insights into physiological phenomena. This study introduces a novel method, the baseline perspective visibility graph (BPVG), which can analyze time series by accurately capturing connectivity across data points both above and below the baseline. We present the BPVG construction process and validate its performance using simulated signals. Results demonstrate that BPVG accurately translates periodic, random, and fractal signals into regular, random, and scale-free networks, respectively, exhibiting diverse degree-distribution traits. Furthermore, we apply BPVG to classify Alzheimer's disease (AD) patients against healthy controls using EEG data and to identify non-demented adults at varying dementia risk using resting-state fMRI (rs-fMRI) data. Using degree-distribution entropy derived from BPVG networks, our results exceed the best accuracy benchmark (77.01%) in EEG analysis, especially at channels F4 (78.46%) and O1 (81.54%). Additionally, our rs-fMRI analysis achieves a statistically significant classification accuracy of 76.74%. These findings highlight the effectiveness of BPVG in distinguishing various types of time series and its practical utility in EEG and rs-fMRI analysis for early AD detection and dementia-risk assessment. In conclusion, BPVG's validation on both simulated and real data confirms its capability to capture comprehensive information from time series, irrespective of baseline constraints, providing a novel method for studying neurophysiological signals.
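For reference, the classic natural visibility criterion that BPVG generalizes links samples (a, y[a]) and (b, y[b]) whenever every intermediate sample lies below the straight line connecting them. The sketch below implements only this standard criterion, which is where the sub-baseline failure mode arises; the paper's baseline-aware extension is not reproduced here.

```python
# Standard natural visibility graph (NVG) construction over a 1-D series.
import numpy as np

def natural_visibility_graph(y):
    n = len(y)
    edges = []
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                y[c] < y[a] + (y[b] - y[a]) * (c - a) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.append((a, b))
    return edges

y = np.sin(np.linspace(0, 4 * np.pi, 50))
degree = np.bincount(np.ravel(natural_visibility_graph(y)), minlength=len(y))
print(degree[:10])    # per-node degrees feed the degree-distribution entropy
```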
To explore emission baselines, technically the most difficult issue for Clean Development Mechanism (CDM) project development, and to examine whether the CDM is a possible way to help Beijing restructure its heating energy consumption, this paper conducts a CDM baseline case study on residential heating in Beijing. Based on an investigation, an energy consumption forecast, and an economic analysis of future technology options, the technology benchmark and site-specific baselines for both retrofit projects and new heating projects are discussed. The results indicate that fuel switching from coal to natural gas can meet the additionality criteria in many cases and will be the main type of CDM project. In addition, the study shows that both the technology-benchmark and the case-by-case baseline-setting approaches are applicable for future CDM cooperation projects on heating in Beijing.
Relative positioning is recognized as an important issue for vehicles in urban environments. Multi-vehicle Cooperative Positioning (CP) techniques, which fuse the Global Navigation Satellite System (GNSS) and inter-vehicle ranging, have attracted attention for improving the performance of baseline estimation between vehicles. However, current CP methods estimate the baselines separately and ignore the interactions among the positioning information of different baselines. These interactions are called 'information coupling'. In this work, we propose a new multi-vehicle precise CP framework that uses the coupled information in the network, based on Carrier Differential GNSS (CDGNSS) and inter-vehicle ranging. We demonstrate the benefit of the coupled information by deriving the Cramer-Rao Lower Bound (CRLB) of the float estimation in CP. To fully use this coupled information, we propose a Whole-Net CP (WN-CP) method that consists of the Whole-Net Extended Kalman Filter (WN-EKF) as the float-estimation filter and Partial Baseline Fixing (PBF) as the ambiguity-resolution part. The WN-EKF fuses the measurements of all baselines simultaneously to improve the performance of float estimation, and the PBF strategy fixes the ambiguities of only the baseline to be estimated, instead of performing full ambiguity resolution, to reduce the computational load. Field tests involving four vehicles were conducted in urban environments. The results show that the proposed WN-CP method achieves better performance while maintaining a lower computational load than existing methods.
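The information-coupling argument can be illustrated with a toy CRLB computation: for a linear Gaussian model the bound is (H^T R^-1 H)^-1, and stacking the measurements of several baselines, tied together by an inter-vehicle range, cannot raise the per-baseline bound. The geometry and noise values below are illustrative assumptions, not the paper's model.

```python
# CRLB for a linear Gaussian model, comparing joint (coupled) vs. separate
# estimation of two baselines b1, b2 linked by one ranging observation.
import numpy as np

def crlb(H, R):
    return np.linalg.inv(H.T @ np.linalg.inv(R) @ H)

H_joint = np.array([[1.0, 0.0],     # GNSS observation of b1
                    [0.0, 1.0],     # GNSS observation of b2
                    [-1.0, 1.0]])   # inter-vehicle range couples the two
R = np.eye(3) * 0.01

print(np.diag(crlb(H_joint, R)))               # coupled: ~0.0067 per baseline
print(np.diag(crlb(H_joint[:2], R[:2, :2])))   # uncoupled: 0.01 per baseline
```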
Linear and nonlinear simulations are carried out using the gyrokinetic code NLT for electrostatic instabilities in the core region of a deuterium plasma based on the International Thermonuclear Experimental Reactor (ITER) baseline scenario. The kinetic-electron effects on the linear frequency and nonlinear transport are studied by adopting the adiabatic electron model and the fully drift-kinetic electron model in the NLT code, respectively. The linear simulations focus on the dependence of the linear frequency on plasma parameters such as the ion and electron temperature gradients κ_(Ti,e) ≡ R/L_(Ti,e), the density gradient κ_n ≡ R/L_n, and the ion-electron temperature ratio τ = T_e/T_i. Here, R is the major radius, and T_e and T_i denote the electron and ion temperatures, respectively. L_A = -(∂_r ln A)^(-1) is the gradient scale length, with A denoting the density or the ion or electron temperature. In the kinetic electron model, the ion temperature gradient (ITG) instability and the trapped electron mode (TEM) dominate in the small and large k_θ regions, respectively, where k_θ is the poloidal wavenumber. The TEM-dominant region becomes wider when κ_(Te) increases, κ_(Ti) decreases, or κ_n decreases. For the nominal parameters of the ITER baseline scenario, the maximum growth rate of the dominant ITG instability in the kinetic electron model is about three times larger than that in the adiabatic electron model. The normalized linear frequency depends on the value of τ, rather than on T_e or T_i individually, in both the adiabatic and kinetic electron models. The nonlinear simulation results show that the ion heat diffusivity in the kinetic electron model is considerably larger than that in the adiabatic electron model, the radial structure is finer, and the time oscillation is more rapid. In addition, the magnitude of the fluctuating potential at the saturated stage peaks in the ITG-dominated region, and contributions from the TEM (dominating in the higher-k_θ region) to the nonlinear transport can be neglected. In the adiabatic electron model, the zonal radial electric field is found to be mainly driven by the turbulent energy flux, and the contribution of the turbulent poloidal Reynolds stress is quite small due to the toroidal shielding effect. However, in the kinetic electron model, the turbulent energy flux is not strong enough to drive the zonal radial electric field in the nonlinear saturated stage. The kinetic-electron effects on the mechanism of the turbulence-driven zonal radial electric field should be investigated further.
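As a numerical sanity check on the definitions above (κ_A ≡ R/L_A with L_A = -(∂_r ln A)^(-1)), the sketch below recovers κ from a synthetic exponential temperature profile; the ITER-like R = 6.2 m and κ = 7 are illustrative values only, not scenario parameters from the paper.

```python
# Verify that the normalized gradient R/L_T recovers kappa for a profile
# T(r) = T0 * exp(-kappa * r / R), where L_T = -(d ln T / dr)^-1.
import numpy as np

R0 = 6.2                                     # ITER-like major radius, m
r = np.linspace(0.1, 1.9, 200)               # minor-radius coordinate, m
kappa_true = 7.0
T = 10.0 * np.exp(-kappa_true * r / R0)      # keV, synthetic profile

dlnT_dr = np.gradient(np.log(T), r)
L_T = -1.0 / dlnT_dr
print(np.allclose(R0 / L_T, kappa_true))     # -> True
```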
The Earth's free core nutation (FCN) causes Earth tides and forced nutations with frequencies close to the FCN frequency to exhibit resonance effects. High-precision superconducting gravimetry (SG) and very long baseline interferometry (VLBI) provide good observation techniques for detecting the FCN parameters. However, some choices in data processing and solution procedures increase the uncertainty of the FCN parameters. In this study, we analyzed the differences and the effectiveness of weight functions and ocean tide corrections in FCN parameter detection using synthetic data, SG data from thirty-one stations, and 10 celestial pole offset (CPO) series. The results show that different computing options cause significant discrepancies for a single SG station. The stacking method, for which the selection of the weighting function and the ocean tide model (OTM) results in a variation of 0.24-5 sidereal days (SDs) in the FCN period (T) and of 10^3-10^4 in the quality factor (Q), can effectively suppress this influence. The statistical analysis of the synthetic data shows that different weight choices, while adjusting the proportion of diurnal tidal waves involved, do not significantly improve the accuracy of the FCN parameters fitted from gravity observations. The study evaluated a series of OTMs using the loading-correction efficiency; the fitting of the FCN parameters can be improved by selecting the mean of appropriate OTMs based on the evaluation results. Through the estimation of the FCN parameters based on the forced nutation, it was found that the weight function P_1 is more suitable than the others, and different CPO series (after 2009) resulted in a difference of 0.4 SDs in T and of 10^3 in Q. We estimated the FCN parameters for SG (T = 430.4±1.5 SDs and Q = 1.52×10^4 ± 2.5×10^3) and for VLBI (T = 429.8±0.7 SDs, Q = 1.88×10^4 ± 2.1×10^3).
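FCN parameters are typically recovered from gravimetric data by fitting a diurnal resonance model to tidal admittance factors. The sketch below fits a simplified resonance form δ(σ) ≈ δ0 + a/(σ - σ_FCN), assuming SciPy is available; the tidal frequencies and factors are synthetic placeholders, not SG results from the study.

```python
# Fit a simplified diurnal resonance model to synthetic gravimetric factors
# and convert the recovered FCN frequency (cycles per sidereal day) to a
# period in sidereal days via T = 1 / (sigma_FCN - 1).
import numpy as np
from scipy.optimize import curve_fit

def resonance(sigma, d0, a, sigma_fcn):
    return d0 + a / (sigma - sigma_fcn)

sigma = np.array([0.9295, 0.9973, 1.0000, 1.0055])    # O1-, P1-, K1-, PSI1-like
factors = resonance(sigma, 1.16, -4e-4, 1.0023)       # synthetic admittances
popt, _ = curve_fit(resonance, sigma, factors, p0=(1.16, -1e-4, 1.0020))
print(f"FCN freq ~ {popt[2]:.4f} cpsd -> T ~ {1 / (popt[2] - 1):.0f} SDs")
```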
Clinical practice guidelines drive clinical practice, and clinicians rely on them when trying to answer their most common questions. One of the most important position papers in the field of gastro-esophageal reflux disease (GERD) is that produced by the Lyon Consensus, of which an updated second version has recently been released. Mean nocturnal baseline impedance (MNBI) was proposed by the first Consensus to act as supportive evidence for GERD diagnosis. Originally a cut-off of 2292 Ohms was proposed, a value revised in the second edition: the updated Consensus recommends that an MNBI < 1500 Ohms strongly suggests GERD, while a value > 2500 Ohms can be used to refute GERD. The proposed cut-offs move in the correct direction by lowering the original cut-off; nevertheless, they arise from a study of normal subjects in which cut-offs were derived as the mean value ± 2 SD, not from symptomatic patients. However, data exist showing that even symptomatic patients with inconclusive disease or reflux hypersensitivity (RH) have lower MNBI values than normal subjects or patients with functional heartburn (FH). Moreover, according to the data, MNBI, even among symptomatic patients, is affected by age and body mass index. Also, various studies using receiver operating characteristic curve analysis have proposed different cut-offs, some even lower than the one recommended. Finally, no guidance is given for patients submitted to on-proton-pump-inhibitor pH-impedance studies, even though new and extremely important data now exist. Therefore, even if MNBI is an extremely important tool when approaching patients with reflux symptoms and can distinguish conclusive GERD from RH or FH, its values should be interpreted with caution.
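For clarity, the updated decision rule described above can be stated directly in code; this is a literal encoding of the two published thresholds, and the caveats raised in the text (age, body mass index, on-PPI testing) are deliberately not modeled.

```python
# Updated Lyon Consensus MNBI rule: < 1500 Ohms supports GERD, > 2500 Ohms
# refutes it, intermediate values are inconclusive.
def interpret_mnbi(mnbi_ohms: float) -> str:
    if mnbi_ohms < 1500:
        return "supports GERD"
    if mnbi_ohms > 2500:
        return "refutes GERD"
    return "inconclusive"

print(interpret_mnbi(1200), interpret_mnbi(2000), interpret_mnbi(2800))
```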
Arctic shipping poses environmental risks due to the region's fragile ecosystems and rapid climate change. Effective risk assessment tools are needed to ensure sustainable expansion and to carry out environmental impact assessments. This paper explores applications of Failure Modes and Effects Analysis (FMEA) and Systems-Theoretic Process Analysis (STPA), coupled with a 'dynamic baseline approach', for Arctic shipping environmental impact assessment. Shipping entails complex interactions between environmental, technical, human, and organizational factors. FMEA identifies failure modes and their effects through component-level analysis; STPA examines how unsafe control actions can emerge from interactions between system components. Combining these techniques with a dynamic (variable) baseline, which accounts for the inherent ongoing changes in Arctic conditions, offers a robust methodology. A qualitative case study shows that prioritizing hazards by risk yields the highest concerns: increased greenhouse gas emissions, black carbon deposition on ice and snow, and response delays to accidents represent some of the most important identified threats to the environment. FMEA and STPA are complementary, and their differences are highlighted. The methodology applied should be representative of qualitative risk analysis methodology; while the findings are influenced by the perspectives of the authors, the process followed is intended to identify and rank risks in a consistent manner. Mitigation measures must be in place to target these issues, and constant monitoring of the changing ecological and socioeconomic Arctic baselines supports the responses. This methodology offers a starting point for systematically addressing environmental impact risks in the data-limited Arctic. Integrating failure modes and effects analysis, systems theory, and dynamic baselines accounts for the identification of the complex interactions influencing environmental risks in this rapidly evolving region.
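The FMEA half of such a methodology is often operationalized by ranking failure modes with a risk priority number (RPN = severity × occurrence × detection). The sketch below applies that standard FMEA convention; the hazard list loosely echoes the case study's top concerns, but the 1-10 scores are illustrative assumptions, not the paper's ratings.

```python
# Rank illustrative Arctic-shipping hazards by FMEA risk priority number
# (RPN = severity * occurrence * detection, each scored 1-10).
hazards = [
    ("Increased greenhouse gas emissions",  8, 9, 3),
    ("Black carbon deposition on ice/snow", 7, 7, 4),
    ("Delayed response to accidents",       9, 4, 5),
    ("Oil spill from grounding",           10, 2, 4),
]

ranked = sorted(hazards, key=lambda h: h[1] * h[2] * h[3], reverse=True)
for name, s, o, d in ranked:
    print(f"RPN {s * o * d:3d}  {name}")
```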
The study of designs for the baseline parameterization has attracted attention in recent years. This paper focuses on two-level regular designs for the baseline parameterization, and a general result on the relationship between K-aberration and the word length pattern is developed.