Traditional sampling-based path planning algorithms, such as the rapidly-exploring random tree star (RRT*), encounter critical limitations in unstructured orchard environments, including low sampling efficiency in narrow passages, slow convergence, and high computational costs. To address these challenges, this paper proposes a novel hybrid global path planning algorithm integrating Gaussian sampling and quadtree optimization (RRT*-GSQ). This methodology aims to enhance path planning by synergistically combining a Gaussian mixture sampling strategy to improve node generation in critical regions, an adaptive step-size and direction optimization mechanism for enhanced obstacle avoidance, a Quadtree-AABB collision detection framework to lower computational complexity, and a dynamic iteration control strategy for more efficient convergence. In obstacle-free and obstructed scenarios, compared with the conventional RRT*, the proposed algorithm reduced the number of node evaluations by 67.57% and 62.72%, and decreased the search time by 79.72% and 78.52%, respectively. In path tracking tests, the proposed algorithm achieved substantial reductions in RMSE of the final path compared to the conventional RRT*. Specifically, the lateral RMSE was reduced by 41.5% in obstacle-free environments and 59.3% in obstructed environments, while the longitudinal RMSE was reduced by 57.2% and 58.5%, respectively. Furthermore, the maximum absolute errors in both lateral and longitudinal directions were constrained within 0.75 m. Field validation experiments in an operational orchard confirmed the algorithm's practical effectiveness, showing reductions in the mean tracking error of 47.6% (obstacle-free) and 58.3% (obstructed), alongside a 5.1% and 7.2% shortening of the path length compared to the baseline method. The proposed algorithm effectively enhances path planning efficiency and navigation accuracy for robots, presenting a superior solution for high-precision autonomous navigation of agricultural robots in orchard environments and holding significant value for engineering applications.
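The Gaussian mixture sampling strategy above can be sketched as a biased sampler that draws candidate RRT* nodes near the goal or near narrow-passage centers with Gaussian noise, falling back to uniform exploration otherwise. This is a minimal illustration only; the mixture weights, standard deviation, and passage-center inputs are hypothetical, not the paper's parameters.

```python
import random

def gaussian_mixture_sample(goal, passages, bounds, p_goal=0.2, p_passage=0.5, sigma=0.5):
    """Draw one candidate node: biased toward the goal or a narrow-passage
    center with Gaussian noise, otherwise uniform over the workspace.
    All weights and sigma are illustrative assumptions."""
    r = random.random()
    (xmin, xmax), (ymin, ymax) = bounds
    if r < p_goal:
        cx, cy = goal
    elif r < p_goal + p_passage and passages:
        cx, cy = random.choice(passages)
    else:
        # uniform exploration component keeps probabilistic completeness
        return (random.uniform(xmin, xmax), random.uniform(ymin, ymax))
    x = random.gauss(cx, sigma)
    y = random.gauss(cy, sigma)
    # clamp to the workspace so samples stay valid
    return (min(max(x, xmin), xmax), min(max(y, ymin), ymax))
```

In a full planner, each sample would then be steered toward, collision-checked (here, via the Quadtree-AABB stage), and rewired as in standard RRT*.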
Artificial Intelligence (AI) in healthcare enables predicting diabetes using data-driven methods instead of the traditional ways of screening the disease, which include hemoglobin A1c (HbA1c), oral glucose tolerance test (OGTT), and fasting plasma glucose (FPG) screening techniques, which are invasive and limited in scale. Machine learning (ML) and deep neural network (DNN) models use large datasets to learn complex, nonlinear feature interactions, but conventional ML algorithms are data sensitive and often show unstable predictive accuracy. Conversely, DNN models are more robust, though the ability to reach a high accuracy rate consistently on heterogeneous datasets is still an open challenge. For predicting diabetes, this work proposes a hybrid DNN approach that integrates a bidirectional long short-term memory (BiLSTM) network with a bidirectional gated recurrent unit (BiGRU). A robust DL model was developed by combining various datasets with weighted coefficients, dense operations in the connection of deep layers, and output aggregation using batch normalization and dropout functions to avoid overfitting. The goal of this hybrid model is better generalization and consistency across various datasets, which facilitates effective management and early intervention. The proposed DNN model exhibits excellent predictive performance compared to state-of-the-art and baseline ML and DNN models for diabetes prediction tasks. The robust performance indicates the potential usefulness of DL-based models in the development of disease prediction in healthcare and other areas that demand high-quality analytics.
We read with great interest Deng et al.'s study [1] comparing sextant (6-core) and 12-core systematic biopsy in the MRI-targeted era, which valuably challenges the "more cores = higher accuracy" dogma by proposing a precision sampling strategy based on prostate cancer's spatial distribution, aligning with personalized diagnosis trends.
Xylogenesis, the process through which wood cells are formed, results in the long-term storage of carbon in woody biomass, making it a key component of the global carbon cycle. Understanding how environmental drivers influence xylogenesis during the growing season is therefore of great interest. However, studying short-term drivers of wood production using xylogenetic data is complicated by the usual sampling scheme and the influence of eccentric growth, i.e., heterogeneous growth around the stem. In this study, we improve xylogenesis research by introducing a statistical approach that explicitly considers seasonal phenology, short-term growth rates, and growth eccentricity. To this end, we developed Bayesian models of xylogenesis and compared them with a conventional method based on the use of Gompertz functions. Our results show that eccentricity generated high temporal autocorrelation between successive samples, and that explicitly taking it into account improved both the representativeness of phenology and intra-ring variability. We observed consistent short-term patterns in the model residuals, suggesting the influence of an unaccounted-for environmental variable on cell production. The proposed models offer several advantages over traditional methods, including robust confidence intervals around predictions, consistency with phenology, and reduced sensitivity to extreme observations at the end of the growing season, often linked to eccentric growth. These models also provide a benchmark for mechanistic testing of short-term drivers of wood formation.
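The conventional baseline mentioned above fits a Gompertz function to seasonal cell accumulation. A common cumulative form is G(t) = A·exp(−b·exp(−k·t)), where A is the asymptotic total, b shifts the inflection, and k sets the growth rate; the parameterization shown here is one standard choice, not necessarily the exact one used in the study.

```python
import math

def gompertz(t, A, b, k):
    """Cumulative Gompertz growth curve G(t) = A * exp(-b * exp(-k * t)):
    sigmoidal, asymmetric, approaching the asymptote A late in the season."""
    return A * math.exp(-b * math.exp(-k * t))
```

Fitting A, b, and k by nonlinear least squares to ring-width or cell-count series gives the conventional phenology estimates the Bayesian models are compared against.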
Portable ratiometric fluorescent platforms have emerged as promising tools for multifarious detection, yet remain unexplored for point-of-care monitoring of doxorubicin (DOX), a clinically important antineoplastic drug. To this end, we herein develop a portable self-calibrating platform, namely carbon dots (C-dots)-embedded hydrogel sensors with a smartphone-assisted high-throughput imaging device, for DOX sensing. The prepared green-emitting (λem = 508 nm) and negatively charged C-dots (−11.40 ± 1.21 mV), which have sufficient spectral overlap with the absorption band of DOX (~500 nm), can strongly bind with positively charged DOX molecules through electrostatic attraction. As a result, DOX molecules are selectively and rapidly (20 s) determined with a detection limit of 10.26 nmol/L via Förster resonance energy transfer, demonstrating a remarkable chromatic shift from green to red. Further integrated with a 3D-printed smartphone-assisted device, the platform enabled high-throughput quantification, achieving recoveries of 96.40%-101.85% in human urine/serum (RSDs < 2.94%, n = 3). Notably, the dual linear detection ranges of the platform align with the reported clinical DOX concentrations in urine and plasma (0-4 h post-administration), validating its capability for direct quantification of DOX in clinical samples without special pre-treatment processes. By virtue of its attractive analytical performance and robust feasibility, this platform bridges laboratory precision and point-of-care testing needs, offering promising potential for personalized chemotherapy and multiplexed analyte screening.
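The "self-calibrating" part of a ratiometric readout can be sketched as follows: the red/green channel ratio cancels illumination and exposure drift, and a linear calibration maps the ratio to concentration. The slope and intercept below are hypothetical fitted constants for illustration, not values from the paper.

```python
def ratiometric_readout(red_intensity, green_intensity, slope, intercept):
    """Ratiometric concentration estimate: the red/green intensity ratio is
    insensitive to overall brightness changes; slope and intercept come from
    a calibration line fitted to known standards (hypothetical here)."""
    ratio = red_intensity / green_intensity
    return slope * ratio + intercept
```

In the smartphone workflow, red_intensity and green_intensity would be mean channel values extracted from each hydrogel spot in the captured image.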
Remote sensing plays a pivotal role in forest inventory by enabling efficient large-scale monitoring while minimizing fieldwork costs. However, missing values pose a critical challenge in remote sensing applications, as ignoring or mishandling such data gaps can introduce systematic bias into the estimation of target variables for natural resource monitoring. This can lead to cascading errors that propagate through forest and ecosystem management decisions, ultimately hindering progress toward sustainable forest management, biodiversity conservation, and climate change mitigation strategies. This study aims to propose and demonstrate a procedure that employs hybrid estimators to address the limitations of missing remotely sensed data in forest inventory, using Landsat 7 ETM+ SLC-off data as an archived source for forest resource monitoring as a case in point. We compared forest inventory estimates from the hybrid estimator with those from a conventional model-based (CMB) estimator using Sentinel-2 data without missing values. Monte Carlo simulations revealed three key findings: (1) the hybrid estimator, leveraging missing-data remote sensing represented by Landsat 7 ETM+ SLC-off data, achieved a sampling precision of over 90%, meeting China's national standard for the National Forest Inventory (NFI); (2) the hybrid estimator demonstrated comparable efficiency to the CMB estimator; (3) the uncertainty associated with hybrid estimators was primarily dominated by model parameter estimation, which could be effectively mitigated by slightly increasing the training sample size or refining model specification. Overall, in forest inventory, the hybrid estimator can surmount the limitations posed by missing values in remotely sensed auxiliary data, effectively balancing cost-effectiveness and flexibility.
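The "sampling precision of over 90%" reported above is typically a relative precision of the form P = 1 − t·SE/ŷ, where ŷ is the estimate, SE its standard error, and t a confidence multiplier. This is an assumed, commonly used formulation for inventory reporting, not a formula quoted from the paper.

```python
def sampling_precision(estimate, standard_error, t_value=1.96):
    """Relative sampling precision P = 1 - t * SE / estimate.
    t_value ~ 1.96 corresponds to ~95% confidence (an assumption here);
    P >= 0.90 would meet a '90% precision' reporting threshold."""
    return 1.0 - t_value * standard_error / estimate
```

For example, a mean volume estimate of 100 m³·ha⁻¹ with SE = 2 m³·ha⁻¹ yields P ≈ 0.96, comfortably above a 90% threshold.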
With the increasing emphasis on personal information protection, encryption through security protocols has emerged as a critical requirement in data transmission and reception processes. Nevertheless, IoT ecosystems comprise heterogeneous networks where outdated systems coexist with the latest devices, spanning a range from non-encrypted to fully encrypted devices. Given the limited visibility into payloads in this context, this study investigates AI-based attack detection methods that leverage encrypted traffic metadata, eliminating the need for decryption and minimizing system performance degradation, especially in light of these heterogeneous devices. Using the UNSW-NB15 and CICIoT-2023 datasets, encrypted and unencrypted traffic were categorized according to security protocol, and AI-based intrusion detection experiments were conducted for each traffic type based on metadata. To mitigate the problem of class imbalance, eight different data sampling techniques were applied. The effectiveness of these sampling techniques was then comparatively analyzed from various perspectives using two ensemble models and three Deep Learning (DL) models. The experimental results confirmed that metadata-based attack detection is feasible using only encrypted traffic. In the UNSW-NB15 dataset, the F1-score of encrypted traffic was approximately 0.98, which is 4.3% higher than that of unencrypted traffic (approximately 0.94). In addition, analysis of the encrypted traffic in the CICIoT-2023 dataset using the same method showed a significantly lower F1-score of roughly 0.43, indicating that the quality of the dataset and the preprocessing approach have a substantial impact on detection performance. Furthermore, when data sampling techniques were applied to encrypted traffic, the recall in the UNSW-NB15 (Encrypted) dataset improved by up to 23.0%, and in the CICIoT-2023 (Encrypted) dataset by 20.26%, a similar level of improvement. Notably, in CICIoT-2023, the F1-score and Receiver Operating Characteristic-Area Under the Curve (ROC-AUC) increased by 59.0% and 55.94%, respectively. These results suggest that data sampling can have a positive effect even in encrypted environments. However, the extent of the improvement may vary depending on data quality, model architecture, and sampling strategy.
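One of the simplest of the class-balancing techniques mentioned above is random oversampling: duplicating minority-class samples until every class matches the majority-class count. The sketch below is a minimal, dependency-free illustration of that one technique; the study's eight sampling methods are not specified here.

```python
import random

def random_oversample(samples, labels, seed=0):
    """Random oversampling: duplicate randomly chosen minority-class samples
    until each class reaches the majority-class count. Returns new lists;
    a minimal illustration, not the paper's exact pipeline."""
    rng = random.Random(seed)
    by_class = {}
    for x, y in zip(samples, labels):
        by_class.setdefault(y, []).append(x)
    target = max(len(v) for v in by_class.values())
    out_x, out_y = [], []
    for y, xs in by_class.items():
        resampled = xs + [rng.choice(xs) for _ in range(target - len(xs))]
        out_x.extend(resampled)
        out_y.extend([y] * target)
    return out_x, out_y
```

Balancing is applied only to the training split; evaluation sets keep their natural class distribution so recall and F1 reflect real conditions.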
Video emotion recognition is widely used due to its alignment with the temporal characteristics of human emotional expression, but existing models have significant shortcomings. On the one hand, Transformer multi-head self-attention modeling of global temporal dependency suffers from high computational overhead and feature similarity. On the other hand, fixed-size convolution kernels are often used, which have weak perception of emotional regions at different scales. Therefore, this paper proposes a video emotion recognition model that combines multi-scale region-aware convolution with temporal interactive sampling. In the spatial dimension, multi-branch large-kernel stripe convolution is used to perceive emotional region features at different scales, and attention weights are generated for each scale's features. In the temporal dimension, multi-layer odd-even down-sampling is performed on the time series, and odd-even sub-sequence interaction is performed to address the problem of feature similarity, while reducing computational costs due to the linear relationship between sampling and convolution overhead. The model was tested on CMU-MOSI, CMU-MOSEI, and Hume Reaction, where Acc-2 reached 83.4%, 85.2%, and 81.2%, respectively. The experimental results show that the model can significantly improve the accuracy of emotion recognition.
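The first step of the odd-even down-sampling described above is splitting a temporal feature sequence into even- and odd-indexed sub-sequences, halving the length processed by each branch. The split itself is shown below; the learned interaction between the two sub-sequences is model-specific and omitted.

```python
def odd_even_split(sequence):
    """Split a time series into its even-indexed and odd-indexed
    sub-sequences (the basic operation of odd-even down-sampling);
    applying this recursively gives the multi-layer variant."""
    even = sequence[0::2]
    odd = sequence[1::2]
    return even, odd
```

Because each layer halves the sequence length, total cost over all layers grows roughly linearly with the input length, which is the source of the computational saving mentioned above.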
Nuclear magnetic resonance (NMR) spectroscopy is a powerful tool for analyzing molecular structure and composition. However, traditional NMR experiments suffer from long acquisition times, especially in multidimensional NMR spectroscopy. This problem, to some extent, limits broader application of NMR techniques. Various methods have been proposed to accelerate sampling, including non-uniform sampling (NUS), multi-FID acquisition (MFA), Hadamard encoding, Fourier encoding, spatially encoded ultrafast 2D NMR (UF2DNMR), and so on. This review focuses on rapid sampling methods developed in contemporary China, introducing their fundamental principles and applications while discussing their respective advantages and disadvantages.
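Of the methods listed, NUS is the most easily sketched: instead of acquiring every increment of the indirect dimension, only a subset is measured according to a sampling schedule, and the spectrum is reconstructed afterwards. A uniformly random schedule generator is shown below for illustration; practical schedules are often weighted (e.g. Poisson-gap) rather than uniform.

```python
import random

def nus_schedule(n_points, fraction, seed=0):
    """Generate a non-uniform sampling schedule: a sorted random subset of
    indirect-dimension increment indices covering `fraction` of the full
    grid. Uniformly random here; real NUS schedules are often weighted."""
    rng = random.Random(seed)
    k = max(1, int(n_points * fraction))
    return sorted(rng.sample(range(n_points), k))
```

With a 25% schedule, a 2D experiment needs only a quarter of the indirect increments, cutting acquisition time roughly fourfold at the cost of a reconstruction step.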
In coal mines, dynamic disasters such as rock bursts seriously threaten the safety of mining activities. Exploring the dynamic behaviors and disaster characteristics of coal during impact failure serves as the basis and prerequisite for monitoring and early warning of rock bursts. In this context, impact failure tests of coal were carried out under different axial static loads and impact velocities to analyze the dynamic behaviors and acoustic emission (AE) response characteristics of coal. The results show that the dynamic behaviors of coal under combined dynamic and static loads differ significantly from those under static loads alone, and the stress-strain curve displays double peaks without an obvious compaction stage. As the axial static load grows, both the dynamic strength and the peak strain vary as quadratic functions of the axial static load. When coal damage intensifies instantaneously, the AE count and energy parameters both show pulse-like increases and reach their peak values. The damage effect of axial static loads on coal, though limited, has an extreme point. In contrast, the impact velocity strengthens the response of AE signals and has linear relationships with the peak values of AE count and energy; it plays a leading role in the damage to samples and sets a critical point for coal failure and fracture. Compared with the analysis of stress and strain, the responses of AE signals are more accurate and reliable. Based on AE response characteristics, the damage evolution process of coal under combined dynamic and static loads can be identified more accurately, revealing the moment of coal damage and the characteristics of coal failure. The research results are conducive to the further application of AE monitoring methods to early warning of rock burst disasters in coal mining sites.
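The reported linear relation between impact velocity and peak AE count/energy is the kind of result obtained by an ordinary least-squares line fit to the test data. A minimal closed-form fit is sketched below; the actual regression coefficients from the experiments are not reproduced here.

```python
def linear_fit(x, y):
    """Ordinary least squares for y = slope * x + intercept, the form of
    the reported linear relation between impact velocity and peak AE
    parameters. Closed-form solution for one predictor."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx
```

The quadratic dependence on axial static load would be fitted analogously with a second-order polynomial.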
Successful ex situ conservation of plant populations requires a high degree of genetic representativeness. However, spatially biased sampling in ex situ conservation efforts may fail to capture all wild genetic clusters for species with range-wide genetic structure. To investigate the extent of spatially biased sampling in living collections and the coverage of wild genetic clusters in plant populations under ex situ conservation worldwide, we combined a global synthesis of ex situ conservation efforts with a case study of an endangered riparian plant species, Myricaria laxiflora. Our analysis of ex situ conservation worldwide revealed that the majority (82.6%) of ex situ populations fail to cover all wild genetic clusters, largely due to spatially biased sampling with low geographic coverage. Our case study of M. laxiflora showed that genetic diversity differed between the ex situ and upstream populations, while it was comparable between ex situ populations and other wild populations. However, current ex situ populations did not cover all wild genetic clusters, as the upstream genetic cluster was previously uncollected. Our study suggests that the failure to cover all wild genetic clusters in ex situ populations is a widespread issue, and that ex situ populations with high genetic diversity can still fail to cover all wild genetic clusters. Future ex situ conservation programs should prioritize both high genetic diversity and high coverage of wild genetic clusters.
Lightweight nodes are crucial for blockchain scalability, but verifying the availability of complete block data puts significant strain on bandwidth and latency. Existing data availability sampling (DAS) schemes either require trusted setups or suffer from high communication overhead and low verification efficiency. This paper presents ISTIRDA, a DAS scheme that lets light clients certify availability by sampling small random codeword symbols. Built on ISTIR, an improved Reed-Solomon interactive oracle proof of proximity, ISTIRDA combines adaptive folding with dynamic code rate adjustment to preserve soundness while lowering communication. This paper formalizes opening consistency and proves security with bounded error in the random oracle model, giving polylogarithmic verifier queries and no trusted setup. In a prototype compared with FRIDA under equal soundness, ISTIRDA reduces communication by 40.65% to 80%. For data larger than 16 MB, ISTIRDA verifies faster and the advantage widens; at 128 MB, proofs are about 60% smaller and verification time is roughly 25% shorter, while prover overhead remains modest. In peer-to-peer emulation under injected latency and loss, ISTIRDA reaches confidence more quickly and is less sensitive to packet loss and load. These results indicate that ISTIRDA is a scalable and provably secure DAS scheme suitable for high-throughput, large-block public blockchains, substantially easing bandwidth and latency pressure on lightweight nodes.
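The core intuition behind "certifying availability by sampling small random codeword symbols" is probabilistic: if a fraction f of the erasure-coded block is withheld, a client that samples n random symbols misses the withheld region with probability (1 − f)^n. The sketch below shows this generic DAS sampling argument, not ISTIRDA's exact soundness bound.

```python
def availability_confidence(erasure_fraction, n_samples):
    """Probability that at least one of n uniformly random symbol samples
    hits a withheld symbol when a fraction `erasure_fraction` of the
    encoded block is unavailable: 1 - (1 - f)^n."""
    return 1.0 - (1.0 - erasure_fraction) ** n_samples
```

With Reed-Solomon coding, withholding enough symbols to prevent reconstruction forces f to be large (e.g. f ≥ 0.5 at rate 1/2), so even a few dozen samples per light client drive the miss probability toward zero.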
Optical two-way time-frequency transfer (O-TWTFT), utilizing optical frequency comb carriers and linear optical sampling, effectively enables space-to-ground optical frequency standard comparisons. Previously reported detection sensitivities of O-TWTFT were typically at the nanowatt level, necessitating high-power optical frequency combs to compensate for significant losses in high-orbit satellite-to-ground passes. Such hardware-based solutions, while effective, tend to be costly. This paper presents a novel data post-processing algorithm to enhance sensitivity. Unlike previous timing methods, which depend solely on optical phase data and discard intensity information, resulting in elevated errors, especially under low received power, our approach employs complex least squares (CLS) estimation in the complex frequency domain. By preserving all intermediate data and avoiding noise from phase unwrapping, it achieves superior sensitivity and accuracy. Experiments over a 113-kilometer free-space link validate the algorithm's robustness, delivering a detection sensitivity of 0.1 nanowatts, more than tenfold better than prior techniques, despite a 100-decibel link loss comparable to Earth-Moon optical links.
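A minimal stand-in for the idea of using both intensity and phase without unwrapping: the argument of the complex inner product sum(conj(r)·z) between a reference and measured complex samples is an intensity-weighted phase estimate that never unwraps individual phases. This is an illustrative sketch of the principle, not the paper's full CLS timing estimator.

```python
import cmath

def cls_phase(reference, measured):
    """Intensity-weighted phase estimate from complex samples: strong
    samples contribute more, and no per-sample phase unwrapping is
    needed. Sketch of the complex least squares principle only."""
    acc = sum(r.conjugate() * z for r, z in zip(reference, measured))
    return cmath.phase(acc)
```

Under low received power, noisy low-amplitude samples are automatically down-weighted, which is where phase-only estimators degrade.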
In this study, we use observations from the Sounding of the Atmosphere using Broadband Emission Radiometry (SABER) instrument onboard the Thermosphere-Ionosphere-Mesosphere Energetics and Dynamics (TIMED) satellite to develop and apply a new local-time binning method to investigate the long-term evolution of mesospheric water vapor at high latitudes. The proposed method accounts for the gradual local-time drift of the SABER orbit by aligning seasonal observation windows and selecting samples observed at similar local times. This approach minimizes tidal aliasing and ensures more consistent sampling, yielding more reliable estimates of long-term water vapor trends at high latitudes. The results show that drying signals primarily appear in the polar regions. However, in the southern hemisphere, a drying trend is observed only in autumn, whereas winter and summer mainly show moistening trends. In contrast, the northern hemisphere exhibits drying signals in the polar regions during all seasons, showing a clear seasonal asymmetry. Additionally, the water vapor trend in the northern hemisphere is particularly pronounced in February (late winter), with moistening reaching up to +2.0 ppmv. Winter in the southern hemisphere (July-August) also shows moistening, but the trend is weaker than in the northern hemisphere. These differences highlight the strong moistening trend in the northern hemisphere during winter and underscore the significant asymmetry in seasonal water vapor changes between the two hemispheres. These findings emphasize the limitations of water vapor trend estimates across different seasons and latitudes. Moreover, they provide new insights into the spatiotemporal variability associated with tidal structures, underscoring the importance of optimizing local-time sampling strategies for reliable long-term trend detection.
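The local-time selection step described above can be sketched as a filter that keeps only observations within a window around a chosen local time, with 24 h wrap-around so that, e.g., 23.5 h and 0.5 h are treated as neighbors. The dictionary field name is hypothetical; the window width and center would follow the orbit-drift alignment.

```python
def bin_by_local_time(observations, center_lt, half_width):
    """Keep observations whose local time (hours) lies within
    +/- half_width of center_lt, handling the 24 h wrap-around, so that
    drifting-orbit samples are compared at similar local times."""
    def lt_distance(lt):
        d = abs(lt - center_lt) % 24.0
        return min(d, 24.0 - d)
    return [ob for ob in observations if lt_distance(ob["local_time"]) <= half_width]
```

Trends fitted to each such bin are then less contaminated by tidal aliasing than trends fitted to all samples regardless of local time.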
Pinus radiata (D. Don) dominates New Zealand's forestry industry, constituting 91% of plantations, and is among the world's most important plantation species. Given the socio-economic and environmental importance of this species, it is important to have accurate and precise projections over time to make efficient decisions for forest management and greenfield investments in afforestation projects, especially for permanent carbon forests. Future projections of any natural resource system rely on modeling; however, the acceleration of climate change makes future projections of yield less certain. These challenges also impact national expectations of the contribution planted forests will make to addressing climate change and meeting international commitments under the Paris Agreement. Using a large national-scale set of contemporary ground-measured data (2013-2023), this study investigates the performance of two growth models developed over 30 years ago that are widely used by NZ plantation growers: 1) the Pumice Plateau Model 1988 (PPM88) and 2) the 300-index (including a model variant for regional drift). Model simulations were made using the FORECASTER modeling suite with geographic boundaries to adjust for drift in space and time. Basal area (BA, m²·ha⁻¹) and volume (m³·ha⁻¹) were simulated, and standard errors and goodness-of-fit metrics were calculated up to a typical rotation age of 30 years. Model residuals were then separated and analysed for the main plantation growing regions. The models overpredicted observed growth by between 6.8% and 16.2%, but model predictions and errors varied significantly between regions. The results of this study provide clear evidence of divergence between the outputs of both models and the measured data. Finally, this study suggests future measures to address the challenges posed by these discrepancies that will provide better information for forest management and investment decisions in a changing climate.
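The reported 6.8%-16.2% overprediction is the style of figure produced by a mean percentage bias computed over paired model predictions and plot measurements. The formulation below is one common choice, shown only to make the metric concrete; the study's exact goodness-of-fit definitions are not reproduced here.

```python
def mean_percent_bias(predicted, observed):
    """Mean percentage bias: 100 * sum(pred - obs) / sum(obs).
    Positive values indicate overprediction of observed growth."""
    return 100.0 * sum(p - o for p, o in zip(predicted, observed)) / sum(observed)
```

Computing this per region, as done for the residual analysis, exposes the regional variation in model error noted above.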
Distribution transformers play a vital role in power distribution systems, and their reliable operation is crucial for grid stability. This study presents a simulation-based framework for active fault diagnosis and early warning of distribution transformers, integrating Sample Ensemble Learning (SEL) with a Self-Optimizing Support Vector Machine (SO-SVM). The SEL technique enhances data diversity and mitigates class imbalance, while the SO-SVM adaptively tunes its hyperparameters to improve classification accuracy. A comprehensive transformer model was developed in MATLAB/Simulink to simulate diverse fault scenarios, including inter-turn winding faults, core saturation, and thermal aging. Feature vectors were extracted from voltage, current, and temperature measurements to train and validate the proposed hybrid model. Quantitative analysis shows that the SEL-SO-SVM framework achieves a classification accuracy of 97.8%, a precision of 96.5%, and an F1-score of 97.2%. Beyond classification, the model effectively identified incipient faults, providing an early-warning lead time of up to 2.5 s before significant deviations in operational parameters. This predictive capability underscores its potential for preventing catastrophic transformer failures and enabling timely maintenance actions. The proposed approach demonstrates strong applicability for enhancing the reliability and operational safety of distribution transformers in simulated environments, offering a promising foundation for future real-time and field-level implementations.
As one of the major volatile components in extraterrestrial materials, nitrogen (N₂) isotopes serve not only as tracers of the formation and evolution of the solar system, but also play a critical role in assessing planetary habitability and the search for extraterrestrial life. The integrated measurement of N₂ and argon (Ar) isotopes using noble gas mass spectrometry represents a state-of-the-art technique for such investigations. To support the growing demands of planetary science research in China, we have developed a high-efficiency, high-precision method for the integrated analysis of N₂ and Ar isotopes. This was achieved by enhancing gas extraction and purification systems and integrating them with a static noble gas mass spectrometer. This method enables integrated N₂-Ar isotope measurements on submilligram samples, significantly improving sample utilization and reducing the impact of sample heterogeneity on volatile analysis. The system integrates CO₂ laser heating, a modular two-stage Zr-Al getter pump, and a CuO furnace-based purification process, effectively reducing background levels (N₂ blank as low as 0.35×10⁻⁶ cubic centimeters at standard temperature and pressure [ccSTP]). Analytical precision is ensured through calibration with atmospheric air and CO corrections. To validate the reliability of the method, we performed N₂-Ar isotope analyses on the Allende carbonaceous chondrite, one of the most extensively studied meteorites internationally. The measured N₂ concentrations range from 19.2 to 29.8 ppm, with δ¹⁵N values between −44.8‰ and −33.0‰. Concentrations of ⁴⁰Ar, ³⁶Ar, and ³⁸Ar are (12.5-21.1)×10⁻⁶ ccSTP/g, (90.9-150.3)×10⁻⁹ ccSTP/g, and (19.2-30.7)×10⁻⁹ ccSTP/g, respectively. These values correspond to cosmic-ray exposure ages of 4.5-5.7 Ma, consistent with previous reports. Step-heating experiments further reveal distinct release patterns of N and Ar isotopes, as well as their associations with specific mineral phases in the meteorite. In summary, the combined N₂-Ar isotopic system offers significant advantages for tracing volatile sources in extraterrestrial materials and will provide essential analytical support for upcoming Chinese planetary missions, such as Tianwen-2.
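The δ¹⁵N values quoted above follow the standard delta notation: δ¹⁵N = (R_sample / R_air − 1) × 1000 ‰, with R = ¹⁵N/¹⁴N and atmospheric N₂ as the reference. The atmospheric ratio used below (~0.003676) is the conventional standard value, stated here as background rather than taken from the paper.

```python
def delta_15N(ratio_sample, ratio_air=0.003676):
    """delta-15N in permil relative to atmospheric N2:
    d15N = (R_sample / R_air - 1) * 1000, with R = 15N/14N.
    Negative values mean the sample is lighter than air."""
    return (ratio_sample / ratio_air - 1.0) * 1000.0
```

Values near −40‰, as measured for Allende, correspond to ¹⁵N/¹⁴N ratios a few percent below atmospheric.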
Graph Neural Networks (GNNs), as a deep learning framework specifically designed for graph-structured data, have achieved deep representation learning of graph data through message passing mechanisms and have become a core technology in the field of graph analysis. However, current reviews of GNN models mainly focus on narrow domains, and there is a lack of systematic reviews of the classification and applications of GNN models. This review systematically synthesizes the three canonical branches of GNNs, the Graph Convolutional Network (GCN), the Graph Attention Network (GAT), and the Graph Sampling and Aggregation Network (GraphSAGE), and analyzes their integration pathways from both structural and feature perspectives. Drawing on representative studies, we identify three major integration patterns: cascaded fusion, where heterogeneous modules such as Convolutional Neural Networks (CNN), Long Short-Term Memory (LSTM), and GraphSAGE are sequentially combined for hierarchical feature learning; parallel fusion, where multi-branch architectures jointly encode complementary graph features; and feature-level fusion, which employs concatenation, weighted summation, or attention-based gating to adaptively merge multi-source embeddings. Through these patterns, integrated GNNs achieve enhanced expressiveness, robustness, and scalability across domains including transportation, biomedicine, and cybersecurity.
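The feature-level fusion pattern via gating can be sketched as a scalar gate g = sigmoid(logit) mixing two branch embeddings as g·a + (1 − g)·b. This is a minimal illustration of the weighted-summation/gating idea; in a trained model the gate logit (often a vector) would be learned from the embeddings themselves.

```python
import math

def gated_fusion(embedding_a, embedding_b, gate_logit):
    """Gated feature-level fusion of two branch embeddings:
    g = sigmoid(gate_logit); output = g * a + (1 - g) * b.
    A scalar, fixed gate is used here purely for illustration."""
    g = 1.0 / (1.0 + math.exp(-gate_logit))
    return [g * a + (1.0 - g) * b for a, b in zip(embedding_a, embedding_b)]
```

Concatenation and plain weighted summation are the degenerate cases of this scheme with no gate, while attention-based gating replaces the fixed logit with a learned function of the inputs.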
Background: The true risk of chorionic villus sampling (CVS) is poorly defined. The objective of this study was to review the clinical outcome of transabdominal CVS performed in a university teaching unit, with an emphasis on the complication rate. Methods: A comprehensive audit database was maintained for 1351 pregnant women, including 17 sets of twin pregnancies, who had a CVS. Details and outcomes of all CVSs performed in the unit between May 1996 and May 2004 were reviewed. All CVSs were performed by one of 5 operators using identical techniques. Results: All procedures were performed transabdominally. A total of 1355 CVSs were performed because 4 dichorionic twin pregnancies required 2 punctures. The mean gestation at CVS was (11.8 ± 0.7) weeks, and 97.3% of the procedures were performed between 11 and 13 completed weeks. The majority (96.2%) required only 1 puncture to achieve correct needle placement. The procedure failed to obtain an adequate sample in 4 subjects (0.30%). A total of 1351 chromosomal studies were requested, and there was 1 case (0.07%) of culture failure. The results of the chromosomal studies were available within 14 days in 36.7% of the cases and within 21 days in 94.0%. Overall, 77 chromosomal abnormalities (5.7%) and 5 cases of thalassemia major were detected. Pregnancy outcome was unknown in only 13 singleton subjects (0.96%). In the remaining 1355 fetuses, there were 76 pregnancy terminations (5.56%), 10 fetal losses with obvious obstetric causes (0.73%), and 21 potentially procedure-related fetal losses (1.54%). In the last group, the majority had one or more co-existing obstetric complications. The background fetal loss rate for pregnancies at similar gestational age in the unit was about 0.8%. Therefore, the procedure-related fetal loss rate was estimated to be at most 0.74%. Conclusions: In experienced hands, first-trimester transabdominal CVS is an accurate and safe invasive prenatal diagnostic procedure. It should be one of the options available to pregnant women who require prenatal genetic diagnosis.
Efficient tool condition monitoring techniques help to realize intelligent management of tool life and reduce tool usage costs. In this paper, the influence of different degrees of ball-end milling cutter wear on the texture shape of machined tool marks is investigated, and a method is proposed for predicting the wear state (including the position and degree of tool wear) of ball-end milling cutters based on entropy measurement of tool-mark texture images. Firstly, data samples are prepared through wear experiments, and the variation of the tool-mark texture shape with the tool wear state is analyzed. Then, a two-dimensional sample entropy algorithm is developed to quantify the texture morphology. Finally, the processing parameters and tool attitude are integrated into the prediction process to predict the wear value and wear position of the ball-end milling cutter. In testing, the correlation between the predicted values and the standard values of the proposed tool condition monitoring method reaches 95.32%, and the accuracy reaches 82.73%, indicating that the proposed method meets the requirements of tool condition monitoring.
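The abstract does not give the paper's exact entropy formulation or parameters; the following is a minimal sketch of a standard two-dimensional sample entropy (SampEn2D), the kind of texture-irregularity measure described above. The window size m and tolerance r defaults are common conventions, not values from the paper.

```python
import math

def sample_entropy_2d(img, m=2, r=None):
    """Two-dimensional sample entropy of a grayscale patch.

    Counts pairs of m x m windows whose point-wise gray-level difference
    stays within tolerance r (B), and how many of those pairs still match
    at size (m+1) x (m+1) (A); SampEn2D = -ln(A / B).  Uniform textures
    give 0; irregular textures give larger values.
    """
    H, W = len(img), len(img[0])
    flat = [v for row in img for v in row]
    mean = sum(flat) / len(flat)
    std = math.sqrt(sum((v - mean) ** 2 for v in flat) / len(flat))
    if r is None:
        r = 0.2 * std  # common convention: 20% of the patch's std
    # Use only origins where the larger (m+1) window also fits,
    # so the m and m+1 match counts cover the same window pairs.
    origins = [(i, j) for i in range(H - m) for j in range(W - m)]

    def similar(a, b, k):
        return all(abs(img[a[0] + u][a[1] + v] - img[b[0] + u][b[1] + v]) <= r
                   for u in range(k) for v in range(k))

    B = A = 0
    for x in range(len(origins)):
        for y in range(x + 1, len(origins)):
            if similar(origins[x], origins[y], m):
                B += 1
                if similar(origins[x], origins[y], m + 1):
                    A += 1
    return -math.log(A / B) if A and B else float("inf")
```

A perfectly uniform patch yields an entropy of 0, while a worn cutter's irregular tool marks would push the value up, which is the monotonic relationship such a predictor relies on.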
Funding: National Natural Science Foundation of China (32301712); Natural Science Foundation of Jiangsu Province (BK20230548, BK20250876); Project of the Faculty of Agricultural Equipment of Jiangsu University (NGXB20240203); a project funded by the Priority Academic Program Development of Jiangsu Higher Education Institutions (PAPD-2023-87); Open Funding Project of the Key Laboratory of Modern Agricultural Equipment and Technology (Jiangsu University), Ministry of Education (MAET202101).
Abstract: Traditional sampling-based path planning algorithms, such as the rapidly-exploring random tree star (RRT*), encounter critical limitations in unstructured orchard environments, including low sampling efficiency in narrow passages, slow convergence, and high computational costs. To address these challenges, this paper proposes a novel hybrid global path planning algorithm integrating Gaussian sampling and quadtree optimization (RRT*-GSQ). This methodology enhances path planning by synergistically combining a Gaussian mixture sampling strategy to improve node generation in critical regions, an adaptive step-size and direction optimization mechanism for enhanced obstacle avoidance, a Quadtree-AABB collision detection framework to lower computational complexity, and a dynamic iteration control strategy for more efficient convergence. In obstacle-free and obstructed scenarios, compared with the conventional RRT*, the proposed algorithm reduced the number of node evaluations by 67.57% and 62.72% and decreased the search time by 79.72% and 78.52%, respectively. In path tracking tests, the proposed algorithm achieved substantial reductions in the RMSE of the final path compared to the conventional RRT*. Specifically, the lateral RMSE was reduced by 41.5% in obstacle-free environments and 59.3% in obstructed environments, while the longitudinal RMSE was reduced by 57.2% and 58.5%, respectively. Furthermore, the maximum absolute errors in both the lateral and longitudinal directions were constrained within 0.75 m. Field validation experiments in an operational orchard confirmed the algorithm's practical effectiveness, showing reductions in the mean tracking error of 47.6% (obstacle-free) and 58.3% (obstructed), alongside 5.1% and 7.2% shortenings of the path length compared to the baseline method. The proposed algorithm effectively enhances path planning efficiency and navigation accuracy for robots, presenting a superior solution for high-precision autonomous navigation of agricultural robots in orchard environments and holding significant value for engineering applications.
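The abstract does not detail the mixture's components or weights; purely as an illustration, a Gaussian mixture sampler for an RRT*-style planner might bias draws toward the goal and toward existing tree nodes while keeping a uniform fallback. All parameters below (sigma, the mixture weights) are assumptions, not values from the paper.

```python
import random

def mixture_sample(goal, bounds, tree_nodes=(), sigma=1.5,
                   p_goal=0.3, p_node=0.3):
    """One planner sample from a simple Gaussian mixture:
    - prob. p_goal: Gaussian centred on the goal (pulls growth forward),
    - prob. p_node: Gaussian around a random existing tree node
      (densifies promising regions such as narrow passages),
    - otherwise: uniform over the workspace (keeps the whole space
      reachable)."""
    (xmin, xmax), (ymin, ymax) = bounds
    u = random.random()
    if u < p_goal:
        cx, cy = goal
    elif u < p_goal + p_node and tree_nodes:
        cx, cy = random.choice(list(tree_nodes))
    else:
        return (random.uniform(xmin, xmax), random.uniform(ymin, ymax))
    # Clamp Gaussian draws back into the workspace.
    x = min(max(random.gauss(cx, sigma), xmin), xmax)
    y = min(max(random.gauss(cy, sigma), ymin), ymax)
    return (x, y)
```

The uniform branch matters: dropping it would concentrate all samples near known regions and could starve unexplored parts of the orchard.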
Funding: Supported by the School of Digital Science, Universiti Brunei Darussalam, Brunei.
Abstract: Artificial Intelligence (AI) in healthcare enables predicting diabetes using data-driven methods instead of traditional screening approaches, which include hemoglobin A1c (HbA1c), oral glucose tolerance test (OGTT), and fasting plasma glucose (FPG) techniques that are invasive and limited in scale. Machine learning (ML) and deep neural network (DNN) models use large datasets to learn complex, nonlinear feature interactions, but conventional ML algorithms are data sensitive and often show unstable predictive accuracy. Conversely, DNN models are more robust, though consistently reaching a high accuracy rate on heterogeneous datasets remains an open challenge. For predicting diabetes, this work proposes a hybrid DNN approach integrating a bidirectional long short-term memory (BiLSTM) network with a bidirectional gated recurrent unit (BiGRU). A robust DL model was developed by combining various datasets with weighted coefficients, applying dense operations in the connections between deep layers, and aggregating the output using batch normalization and dropout functions to avoid overfitting. The goal of this hybrid model is better generalization and consistency across various datasets, which facilitates effective management and early intervention. The proposed DNN model exhibits excellent predictive performance compared to state-of-the-art and baseline ML and DNN models for diabetes prediction tasks. This robust performance indicates the potential usefulness of DL-based models for disease prediction in healthcare and other areas that demand high-quality analytics.
Abstract: We read with great interest Deng et al.'s study [1] comparing sextant (6-core) and 12-core systematic biopsy in the MRI-targeted era, which valuably challenges the "more cores = higher accuracy" dogma by proposing a precision sampling strategy based on the spatial distribution of prostate cancer, aligning with trends toward personalized diagnosis.
Funding: Supported by the Discovery Grants program of the Natural Sciences and Engineering Research Council of Canada (No. RGPIN-2021-03553); the Canada Research Chair in dendroecology and dendroclimatology (CRC-2021-00368); the Ministère des Ressources naturelles et des Forêts (MRNF, Contract No. 142332177-D); the Natural Sciences and Engineering Research Council of Canada (Alliance Grant No. ALLRP 557148-20, obtained in partnership with the MRNF and Resolute Forest Products); and the Fonds de recherche du Québec – Nature et technologies (Partnership Research Program on the Contribution of the Forestry Sector to Climate Change Mitigation, Grant No. 2022-0FC-309064).
Abstract: Xylogenesis, the process through which wood cells are formed, results in the long-term storage of carbon in woody biomass, making it a key component of the global carbon cycle. Understanding how environmental drivers influence xylogenesis during the growing season is therefore of great interest. However, studying short-term drivers of wood production using xylogenetic data is complicated by the usual sampling scheme and by the influence of eccentric growth, i.e., heterogeneous growth around the stem. In this study, we improve xylogenesis research by introducing a statistical approach that explicitly considers seasonal phenology, short-term growth rates, and growth eccentricity. To this end, we developed Bayesian models of xylogenesis and compared them with a conventional method based on the use of Gompertz functions. Our results show that eccentricity generated high temporal autocorrelation between successive samples, and that explicitly taking it into account improved both the representativeness of phenology and intra-ring variability. We observed consistent short-term patterns in the model residuals, suggesting the influence of an unaccounted-for environmental variable on cell production. The proposed models offer several advantages over traditional methods, including robust confidence intervals around predictions, consistency with phenology, and reduced sensitivity to extreme observations at the end of the growing season, often linked to eccentric growth. These models also provide a benchmark for mechanistic testing of short-term drivers of wood formation.
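For context, the conventional baseline mentioned above fits cumulative cell counts with a Gompertz curve. A minimal sketch of that function follows; the parameter values in the test are illustrative, not the study's fitted estimates.

```python
import math

def gompertz(t, A, b, c):
    """Cumulative xylem cell production at day-of-year t.

    A is the upper asymptote (final cell count), b shifts the curve
    along the season, and c sets the growth rate.  The inflection,
    i.e. the day of fastest production, falls at t = ln(b) / c,
    where the curve reaches A / e.
    """
    return A * math.exp(-b * math.exp(-c * t))
```

For example, with A = 100 cells, b = 50, and c = 0.05 per day, the fastest production occurs near day ln(50) / 0.05 ≈ 78. A single sigmoid like this cannot express the eccentricity-driven autocorrelation the Bayesian models above are designed to capture.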
Funding: Supported by the National Natural Science Foundation of China (No. 22274001); the Key Project of Natural Science Research of the Education Department of Anhui Province (No. 2022AH051032); and the Excellent Research and Innovation Team of Universities in Anhui Province (No. 2024AH010016).
Abstract: Portable ratiometric fluorescent platforms have emerged as promising tools for multifarious detection, yet they remain unexplored for point-of-care monitoring of doxorubicin (DOX), one of the clinically used antineoplastic drugs. To this end, we herein develop a portable self-calibrating platform, namely carbon dots (C-dots)-embedded hydrogel sensors with a smartphone-assisted high-throughput imaging device, for DOX sensing. The prepared green-emitting (λem = 508 nm) and negatively charged C-dots (−11.40 ± 1.21 mV), which have sufficient spectral overlap with the absorption band of DOX (~500 nm), can bind strongly with positively charged DOX molecules through electrostatic attraction. As a result, DOX molecules are selectively and rapidly (20 s) determined with a detection limit of 10.26 nmol/L via Förster resonance energy transfer processes, demonstrating a remarkable chromatic shift from green to red. Further integrated with a 3D-printed smartphone-assisted device, the platform enabled high-throughput quantification, achieving recoveries of 96.40%–101.85% in human urine/serum (RSDs < 2.94%, n = 3). Notably, the dual linear detection ranges of the platform align with the reported clinical DOX concentrations in urine and plasma (0–4 h post-administration), validating its capability for direct quantification of DOX in clinical samples without special pre-treatment processes. By virtue of its attractive analytical performance and robust feasibility, this platform bridges laboratory precision and point-of-care testing needs, offering promising potential for personalized chemotherapy and multiplexed analyte screening.
Funding: Supported by the National Key R&D Program of China (No. 2023YFF1304002-05); the National Social Science Fund of China (No. 22BTJ005); and the National Natural Science Foundation of China (No. 32572049).
Abstract: Remote sensing plays a pivotal role in forest inventory by enabling efficient large-scale monitoring while minimizing fieldwork costs. However, missing values pose a critical challenge in remote sensing applications, as ignoring or mishandling such data gaps can introduce systematic bias into the estimation of target variables for natural resource monitoring. This can lead to cascading errors that propagate through forest and ecosystem management decisions, ultimately hindering progress toward sustainable forest management, biodiversity conservation, and climate change mitigation strategies. This study aims to propose and demonstrate a procedure that employs hybrid estimators to address the limitations of missing remotely sensed data in forest inventory, using Landsat 7 ETM+ SLC-off data as an archived source for forest resource monitoring as a case in point. We compared forest inventory estimates from the hybrid estimator with those from a conventional model-based (CMB) estimator using Sentinel-2 data without missing values. Monte Carlo simulations revealed three key findings: (1) the hybrid estimator, leveraging missing-data remote sensing represented by Landsat 7 ETM+ SLC-off data, achieved a sampling precision of over 90%, meeting China's national standard for the National Forest Inventory (NFI); (2) the hybrid estimator demonstrated efficiency comparable to that of the CMB estimator; (3) the uncertainty associated with hybrid estimators was primarily dominated by model parameter estimation, which could be effectively mitigated by slightly increasing the training sample size or refining the model specification. Overall, in forest inventory, the hybrid estimator can surmount the limitations posed by missing values in remotely sensed auxiliary data, effectively balancing cost-effectiveness and flexibility.
Funding: Supported by the Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. RS-2023-00235509, Development of security monitoring technology based on network behavior against encrypted cyber threats in an ICT convergence environment).
Abstract: With the increasing emphasis on personal information protection, encryption through security protocols has emerged as a critical requirement in data transmission and reception. Nevertheless, IoT ecosystems comprise heterogeneous networks where outdated systems coexist with the latest devices, spanning a range from non-encrypted to fully encrypted devices. Given the limited visibility into payloads in this context, this study investigates AI-based attack detection methods that leverage encrypted traffic metadata, eliminating the need for decryption and minimizing system performance degradation, especially in light of these heterogeneous devices. Using the UNSW-NB15 and CICIoT-2023 datasets, encrypted and unencrypted traffic were categorized according to security protocol, and AI-based intrusion detection experiments were conducted for each traffic type based on metadata. To mitigate the problem of class imbalance, eight different data sampling techniques were applied. The effectiveness of these sampling techniques was then comparatively analyzed from various perspectives using two ensemble models and three Deep Learning (DL) models. The experimental results confirmed that metadata-based attack detection is feasible using only encrypted traffic. In the UNSW-NB15 dataset, the F1-score for encrypted traffic was approximately 0.98, which is 4.3% higher than that of unencrypted traffic (approximately 0.94). In addition, analysis of the encrypted traffic in the CICIoT-2023 dataset using the same method showed a significantly lower F1-score of roughly 0.43, indicating that dataset quality and the preprocessing approach have a substantial impact on detection performance. Furthermore, when data sampling techniques were applied to encrypted traffic, the recall on the UNSW-NB15 (encrypted) dataset improved by up to 23.0%, and on the CICIoT-2023 (encrypted) dataset by 20.26%, a similar level of improvement. Notably, on CICIoT-2023, the F1-score and Receiver Operating Characteristic–Area Under the Curve (ROC-AUC) increased by 59.0% and 55.94%, respectively. These results suggest that data sampling can have a positive effect even in encrypted environments. However, the extent of the improvement may vary depending on data quality, model architecture, and sampling strategy.
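The eight sampling techniques are not named in the abstract; random oversampling, sketched below, is the simplest member of that family and shows the mechanics of rebalancing attack versus benign flows. The feature rows and labels are illustrative.

```python
import random
from collections import Counter

def random_oversample(X, y, seed=0):
    """Rebalance a labelled flow dataset by duplicating randomly chosen
    minority-class rows until every class reaches the majority count."""
    rng = random.Random(seed)
    counts = Counter(y)
    target = max(counts.values())
    Xr, yr = list(X), list(y)
    for cls, n in counts.items():
        idx = [i for i, label in enumerate(y) if label == cls]
        for _ in range(target - n):
            j = rng.choice(idx)  # pick an original row of this class
            Xr.append(X[j])
            yr.append(cls)
    return Xr, yr
```

More elaborate members of the family, such as SMOTE, synthesize new minority points by interpolating between neighbours instead of duplicating rows, which can reduce the overfitting that plain duplication invites.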
Funding: Supported, in part, by the National Natural Science Foundation of China under Grants 62272236 and 62376128, and, in part, by the Natural Science Foundation of Jiangsu Province under Grants BK20201136 and BK20191401.
Abstract: Video emotion recognition is widely used because it aligns with the temporal characteristics of human emotional expression, but existing models have significant shortcomings. On the one hand, Transformer multi-head self-attention modeling of global temporal dependency suffers from high computational overhead and feature similarity. On the other hand, fixed-size convolution kernels are often used, which have weak ability to perceive emotional regions of different scales. Therefore, this paper proposes a video emotion recognition model that combines multi-scale region-aware convolution with temporal interactive sampling. Spatially, multi-branch large-kernel stripe convolution is used to perceive emotional region features at different scales, and attention weights are generated for each scale's features. Temporally, multi-layer odd-even down-sampling is performed on the time series, and odd-even sub-sequence interaction is performed to resolve the feature similarity problem while reducing computational cost, owing to the linear relationship between sampling and convolution overhead. The model was tested on CMU-MOSI, CMU-MOSEI, and Hume Reaction, where Acc-2 reached 83.4%, 85.2%, and 81.2%, respectively. The experimental results show that the model can significantly improve the accuracy of emotion recognition.
Funding: Financially supported by the National Natural Science Foundation of China (Grant Nos. 22174118, 12411530077, and 22374124).
Abstract: Nuclear magnetic resonance (NMR) spectroscopy is a powerful tool for analyzing molecular structure and composition. However, traditional NMR experiments suffer from long acquisition times, especially in multidimensional NMR spectroscopy. This problem, to some extent, limits broader application of NMR techniques. Various methods have been proposed to accelerate sampling, including non-uniform sampling (NUS), multi-FID acquisition (MFA), Hadamard encoding, Fourier encoding, spatially encoded ultrafast 2D NMR (UF2DNMR), and so on. This review focuses on rapid sampling methods developed in contemporary China, introducing their fundamental principles and applications while discussing their respective advantages and disadvantages.
Funding: Open Fund of the State Key Laboratory of Coal Mine Disaster Dynamics and Control (Grant No. 2011DA105287-FW202306); Postgraduate Research & Practice Innovation Program of Jiangsu Province (Grant No. KYCX24_2925); Fundamental Research Program of Xuzhou (Grant No. KC23017); National Natural Science Foundation of China (Grant No. 52104234); Fundamental Research Funds for the Central Universities (Grant No. 2024-10962); National Science Foundation for Young Scientists of Jiangsu Province (Grant No. BK20200657); Graduate Innovation Program of China University of Mining and Technology (Grant No. 2024WLKXJ152).
Abstract: In coal mines, dynamic disasters such as rock bursts seriously threaten the safety of mining activities. Exploring the dynamic behaviors and disaster characteristics of coal during impact failure is the basis and prerequisite for monitoring and warning of rock bursts. In this context, impact failure tests of coal were carried out under different axial static loads and impact velocities to analyze the dynamic behaviors and acoustic emission (AE) response characteristics of coal. The results show that the dynamic behaviors of coal under combined dynamic and static loads differ significantly from those under static loads, and the stress-strain curve displays double peaks without an obvious compaction stage. As the axial static load grows, the dynamic strength and peak strain both follow a quadratic function of the axial static load. When coal damage intensifies instantaneously, the AE count and energy parameters both show pulse-like increases and reach their peak values. The damage effect of axial static loads on coal, though limited, has an extreme point. In contrast, the impact velocity can strengthen the response of AE signals and has linear relationships with the peak values of AE count and energy. It plays a leading role in the damage to samples and sets a critical point for coal failure and fracture. Compared with the analysis results of stress and strain, the responses of AE signals are more accurate and reliable. Based on AE response characteristics, the damage evolution process of coal under combined dynamic and static loads can be identified more accurately to reveal the moment of coal damage and the characteristics of coal failure. These results are conducive to the further application of AE monitoring methods to early warning of rock burst disasters at coal mining sites.
Funding: Supported by the National Key Research and Development Program of China (2024YFF1307400); the Hubei Provincial Natural Science Foundation and Three Gorges Innovation Development Joint Fund (Grant No. 2023AFD195); and the China Three Gorges Corporation (NBZZ202300130).
Abstract: Successful ex situ conservation of plant populations requires a high degree of genetic representativeness. However, spatially biased sampling in ex situ conservation efforts may fail to capture all wild genetic clusters for species with range-wide genetic structure. To investigate the extent of spatially biased sampling in living collections and the coverage of wild genetic clusters in plant populations under ex situ conservation worldwide, we combined a global synthesis of ex situ conservation efforts with a case study of an endangered riparian plant species, Myricaria laxiflora. Our analysis of ex situ conservation worldwide revealed that the majority (82.6%) of ex situ populations fail to cover all wild genetic clusters, largely due to spatially biased sampling with low geographic coverage. Our case study of M. laxiflora showed that genetic diversity differed between the ex situ and upstream populations, while it was comparable between ex situ populations and other wild populations. However, current ex situ populations did not cover all wild genetic clusters, as the upstream genetic cluster was previously uncollected. Our study suggests that failure to cover all wild genetic clusters in ex situ populations is a widespread issue, and that ex situ populations with high genetic diversity can also fail to cover all wild genetic clusters. In future ex situ conservation programs, both high genetic diversity and high coverage of wild genetic clusters should be prioritized.
Funding: Supported in part by the Research Fund of the Key Lab of Education Blockchain and Intelligent Technology, Ministry of Education (EBME25-F-08).
Abstract: Lightweight nodes are crucial for blockchain scalability, but verifying the availability of complete block data puts significant strain on bandwidth and latency. Existing data availability sampling (DAS) schemes either require trusted setups or suffer from high communication overhead and low verification efficiency. This paper presents ISTIRDA, a DAS scheme that lets light clients certify availability by sampling a small number of random codeword symbols. Built on ISTIR, an improved Reed-Solomon interactive oracle proof of proximity, ISTIRDA combines adaptive folding with dynamic code rate adjustment to preserve soundness while lowering communication. This paper formalizes opening consistency and proves security with bounded error in the random oracle model, achieving polylogarithmic verifier queries with no trusted setup. In a prototype compared with FRIDA under equal soundness, ISTIRDA reduces communication by 40.65% to 80%. For data larger than 16 MB, ISTIRDA verifies faster and the advantage widens; at 128 MB, proofs are about 60% smaller and verification time is roughly 25% shorter, while prover overhead remains modest. In peer-to-peer emulation under injected latency and loss, ISTIRDA reaches confidence more quickly and is less sensitive to packet loss and load. These results indicate that ISTIRDA is a scalable and provably secure DAS scheme suitable for high-throughput, large-block public blockchains, substantially easing the bandwidth and latency pressure on lightweight nodes.
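The sampling argument behind any DAS scheme (generic, not specific to ISTIRDA) is simple arithmetic: with a rate-ρ erasure code, a block is unrecoverable only if more than a 1 − ρ fraction of coded symbols is withheld, so each uniformly random query hits a withheld symbol with at least that probability. A sketch:

```python
def das_confidence(num_samples, withheld_fraction):
    """Probability that at least one of `num_samples` independent,
    uniformly random symbol queries lands on a withheld symbol."""
    return 1.0 - (1.0 - withheld_fraction) ** num_samples

# With a rate-1/2 code an adversary must withhold over half of the
# symbols to make the block unrecoverable, so 20 queries already
# detect such withholding with probability at least 1 - 2**-20.
conf = das_confidence(num_samples=20, withheld_fraction=0.5)
```

This per-client cost is independent of block size, which is why light clients can certify availability of multi-megabyte blocks with a handful of small samples.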
Funding: Supported by the National Key Research and Development Programme of China (Grant Nos. 2020YFC2200103 and 2020YFA0309800); the National Natural Science Foundation of China (Grant No. T2125010); the Strategic Priority Research Programme of the Chinese Academy of Sciences (Grant No. XDB35030000); the Anhui Initiative in Quantum Information Technologies (Grant No. AHY010100); the Key R&D Plan of Shandong Province (Grant No. 2021ZDPT01); the Shanghai Municipal Science and Technology Major Project (Grant No. 2019SHZDZX01); and the Innovation Programme for Quantum Science and Technology (Grant Nos. 2021ZD0300100, 2021ZD0300300, and 2021ZD0300903).
Abstract: Optical two-way time-frequency transfer (O-TWTFT), utilizing optical frequency comb carriers and linear optical sampling, effectively enables space-to-ground optical frequency standard comparisons. Previously reported detection sensitivities of O-TWTFT were typically at the nanowatt level, necessitating high-power optical frequency combs to compensate for significant losses in high-orbit satellite-to-ground passes. Such hardware-based solutions, while effective, tend to be costly. This paper presents a novel data post-processing algorithm to enhance sensitivity. Unlike previous timing methods, which depend solely on optical phase data and discard intensity information, resulting in elevated errors especially at low received power, our approach employs complex least squares (CLS) estimation in the complex frequency domain. By preserving all intermediate data and avoiding the noise introduced by phase unwrapping, it achieves superior sensitivity and accuracy. Experiments over a 113-kilometer free-space link validate the algorithm's robustness, delivering a detection sensitivity of 0.1 nanowatts, over tenfold better than prior techniques, despite a 100-decibel link loss comparable to Earth-Moon optical links.
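The paper's CLS estimator itself is not reproduced in the abstract; the sketch below only illustrates the underlying idea of keeping intensity information, here by fitting the phase-versus-frequency slope with amplitude-squared weights so that weak, noisy bins contribute less. Unlike the authors' method, this toy version still evaluates phases, so it assumes they stay within ±π (no unwrapping); the function name and signal model are assumptions for illustration.

```python
import cmath
import math

def weighted_delay_estimate(freqs, spectrum):
    """Estimate a time offset tau from complex spectrum samples
    S_k ~ A_k * exp(-i*2*pi*f_k*tau) via a weighted least-squares
    line fit of phase against frequency, with weights |S_k|^2."""
    w = [abs(s) ** 2 for s in spectrum]      # intensity as weight
    ph = [cmath.phase(s) for s in spectrum]  # assumes phases in (-pi, pi)
    sw = sum(w)
    fbar = sum(wi * f for wi, f in zip(w, freqs)) / sw
    pbar = sum(wi * p for wi, p in zip(w, ph)) / sw
    slope = (sum(wi * (f - fbar) * (p - pbar)
                 for wi, f, p in zip(w, freqs, ph)) /
             sum(wi * (f - fbar) ** 2 for wi, f in zip(w, freqs)))
    return -slope / (2 * math.pi)  # phase(f) = -2*pi*f*tau
```

Under noise, the intensity weighting suppresses the bins whose phase estimates are least reliable, which is the same motivation the abstract gives for retaining intensity rather than discarding it.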
Funding: Supported by the National Key R&D Program of China (Grant No. 2022YFF0503703) and the National Natural Science Foundation of China (Grant Nos. 42130203, 42275133, and 42241135).
Abstract: In this study, we use observations from the Sounding of the Atmosphere using Broadband Emission Radiometry (SABER) instrument onboard the Thermosphere-Ionosphere-Mesosphere Energetics and Dynamics (TIMED) satellite to develop and apply a new local-time binning method to investigate the long-term evolution of mesospheric water vapor at high latitudes. The proposed method accounts for the gradual local-time drift of the SABER orbit by aligning seasonal observation windows and selecting samples observed at similar local times. This approach minimizes tidal aliasing and ensures more consistent sampling, yielding more reliable estimates of long-term water vapor trends at high latitudes. The results show that drying signals appear primarily in the polar regions. However, in the southern hemisphere, a drying trend is observed only in autumn, whereas winter and summer mainly show moistening trends. In contrast, the northern hemisphere exhibits drying signals in the polar regions during all seasons, showing a clear seasonal asymmetry. Additionally, the water vapor trend in the northern hemisphere is particularly pronounced in February (late winter), with moistening reaching up to +2.0 ppmv. Winter in the southern hemisphere (July-August) also shows moistening, but the trend is weaker than in the northern hemisphere. These differences highlight the strong moistening trend in the northern hemisphere during winter and underscore the significant asymmetry in seasonal water vapor changes between the two hemispheres. These findings emphasize the limitations of water vapor trend estimates across different seasons and latitudes. Moreover, they provide new insights into the spatiotemporal variability associated with tidal structures, underscoring the importance of optimizing local-time sampling strategies for reliable long-term trend detection.
Funding: Funded by Scion's Strategic Science Investment Fund (SSIF) and the Forest Growers Levy Trust (FGLT) through the Resilient Forests Programme (Task No. A89220).
Abstract: Pinus radiata (D.Don) dominates New Zealand's forestry industry, constituting 91% of plantations, and is among the world's most important plantation species. Given the socio-economic and environmental importance of this species, accurate and precise projections over time are important for making efficient forest management decisions and greenfield investments in afforestation projects, especially for permanent carbon forests. Future projections of any natural resource system rely on modeling; however, the acceleration of climate change makes future projections of yield less certain. These challenges also impact national expectations of the contribution planted forests will make to addressing climate change and meeting international commitments under the Paris Agreement. Using a large national-scale set of contemporary ground-measured data (2013–2023), this study investigates the performance of two growth models developed over 30 years ago that are widely used by New Zealand plantation growers: 1) the Pumice Plateau Model 1988 (PPM88) and 2) the 300-index model (including a model variant for regional drift). Model simulations were made using the FORECASTER modeling suite with geographic boundaries to adjust for drift in space and time. Basal area (BA, m²·ha⁻¹) and volume (m³·ha⁻¹) were simulated, and standard errors and goodness-of-fit metrics were calculated up to a typical rotation age of 30 years. Model residuals were then separated and analysed for the main plantation growing regions. The models overpredicted observed growth by between 6.8% and 16.2%, but model predictions and errors varied significantly between regions. The results of this study provide clear evidence of divergence between the outputs of both models and the measured data. Finally, this study suggests future measures to address the challenges posed by these discrepancies that will provide better information for forest management and investment decisions in a changing climate.
Abstract: Distribution transformers play a vital role in power distribution systems, and their reliable operation is crucial for grid stability. This study presents a simulation-based framework for active fault diagnosis and early warning of distribution transformers, integrating Sample Ensemble Learning (SEL) with a Self-Optimizing Support Vector Machine (SO-SVM). The SEL technique enhances data diversity and mitigates class imbalance, while the SO-SVM adaptively tunes its hyperparameters to improve classification accuracy. A comprehensive transformer model was developed in MATLAB/Simulink to simulate diverse fault scenarios, including inter-turn winding faults, core saturation, and thermal aging. Feature vectors were extracted from voltage, current, and temperature measurements to train and validate the proposed hybrid model. Quantitative analysis shows that the SEL–SO-SVM framework achieves a classification accuracy of 97.8%, a precision of 96.5%, and an F1-score of 97.2%. Beyond classification, the model effectively identified incipient faults, providing an early-warning lead time of up to 2.5 s before significant deviations in operational parameters. This predictive capability underscores its potential for preventing catastrophic transformer failures and enabling timely maintenance actions. The proposed approach demonstrates strong applicability for enhancing the reliability and operational safety of distribution transformers in simulated environments, offering a promising foundation for future real-time and field-level implementations.
Funding: Supported by the Bureau of Frontier Sciences and Basic Research, Chinese Academy of Sciences (Grant No. QYJ-2025-0103); the National Natural Science Foundation of China (Grant Nos. 42441834, 42241105, 42441825, and 42203048); and the Key Research Program of the Institute of Geology and Geophysics, Chinese Academy of Sciences (Grant No. IGGCAS-202401).
Abstract: As one of the major volatile components in extraterrestrial materials, nitrogen (N₂) isotopes serve not only as tracers of the formation and evolution of the solar system but also play a critical role in assessing planetary habitability and the search for extraterrestrial life. The integrated measurement of N₂ and argon (Ar) isotopes by noble gas mass spectrometry represents a state-of-the-art technique for such investigations. To support the growing demands of planetary science research in China, we have developed a high-efficiency, high-precision method for the integrated analysis of N₂ and Ar isotopes. This was achieved by enhancing the gas extraction and purification systems and integrating them with a static noble gas mass spectrometer. The method enables integrated N₂-Ar isotope measurements on submilligram samples, significantly improving sample utilization and reducing the impact of sample heterogeneity on volatile analysis. The system integrates CO₂ laser heating, a modular two-stage Zr-Al getter pump, and a CuO furnace-based purification process, effectively reducing background levels (N₂ blank as low as 0.35 × 10⁻⁶ cubic centimeters at standard temperature and pressure [ccSTP]). Analytical precision is ensured through calibration with atmospheric air and CO corrections. To validate the reliability of the method, we performed N₂-Ar isotope analyses on the Allende carbonaceous chondrite, one of the most extensively studied meteorites internationally. The measured N₂ concentrations range from 19.2 to 29.8 ppm, with δ¹⁵N values between −44.8‰ and −33.0‰. Concentrations of ⁴⁰Ar, ³⁶Ar, and ³⁸Ar are (12.5–21.1) × 10⁻⁶ ccSTP/g, (90.9–150.3) × 10⁻⁹ ccSTP/g, and (19.2–30.7) × 10⁻⁹ ccSTP/g, respectively. These values correspond to cosmic-ray exposure ages of 4.5–5.7 Ma, consistent with previous reports. Step-heating experiments further reveal distinct release patterns of N and Ar isotopes, as well as their associations with specific mineral phases in the meteorite. In summary, the combined N₂-Ar isotopic system offers significant advantages for tracing volatile sources in extraterrestrial materials and will provide essential analytical support for upcoming Chinese planetary missions, such as Tianwen-2.
Funding: Funded by Guangzhou Huashang University (2024HSZD01, HS2023JYSZH01).
Abstract: Graph Neural Networks (GNNs), as a deep learning framework specifically designed for graph-structured data, have achieved deep representation learning of graph data through message-passing mechanisms and have become a core technology in the field of graph analysis. However, current reviews of GNN models focus mainly on narrower subdomains, and systematic reviews of the classification and applications of GNN models are lacking. This review systematically synthesizes the three canonical branches of GNNs, the Graph Convolutional Network (GCN), the Graph Attention Network (GAT), and the Graph Sampling and Aggregation Network (GraphSAGE), and analyzes their integration pathways from both structural and feature perspectives. Drawing on representative studies, we identify three major integration patterns: cascaded fusion, where heterogeneous modules such as Convolutional Neural Networks (CNNs), Long Short-Term Memory (LSTM) networks, and GraphSAGE are sequentially combined for hierarchical feature learning; parallel fusion, where multi-branch architectures jointly encode complementary graph features; and feature-level fusion, which employs concatenation, weighted summation, or attention-based gating to adaptively merge multi-source embeddings. Through these patterns, integrated GNNs achieve enhanced expressiveness, robustness, and scalability across domains including transportation, biomedicine, and cybersecurity.
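The three feature-level fusion operators named in the abstract (concatenation, weighted summation, attention-based gating) can be sketched on two hypothetical per-node embedding matrices, say one from a GCN branch and one from a GraphSAGE branch. Shapes, the mixing coefficient, and the gate weights below are illustrative stand-ins, not values from any surveyed model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical embeddings for 5 nodes, 8 features, from two GNN branches.
h_gcn = rng.normal(size=(5, 8))
h_sage = rng.normal(size=(5, 8))

# 1) Concatenation: keep both views side by side.
fused_cat = np.concatenate([h_gcn, h_sage], axis=1)     # shape (5, 16)

# 2) Weighted summation with a fixed mixing coefficient.
alpha = 0.7
fused_sum = alpha * h_gcn + (1 - alpha) * h_sage        # shape (5, 8)

# 3) Attention-style gating: a per-node scalar gate computed from the
# concatenated features (random weights stand in for learned ones).
w = rng.normal(size=(16,))
gate = 1.0 / (1.0 + np.exp(-(fused_cat @ w)))           # sigmoid, shape (5,)
fused_gate = gate[:, None] * h_gcn + (1 - gate)[:, None] * h_sage

print(fused_cat.shape, fused_sum.shape, fused_gate.shape)
```

In a trained model, `alpha`, `w`, or a full attention module would be learned end-to-end; the sketch only shows how the three operators combine the two embedding sources.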
Abstract: Background The true risk of chorionic villus sampling (CVS) is poorly defined. The objective of this study was to review the clinical outcome of transabdominal CVS performed in a university teaching unit, with an emphasis on the complication rate. Methods A comprehensive audit database was maintained for 1351 pregnant women, including 17 sets of twin pregnancies, who had a CVS. Details and outcomes of all CVSs performed in the unit between May 1996 and May 2004 were reviewed. All CVSs were performed by one of 5 operators using identical techniques. Results All procedures were performed transabdominally. A total of 1355 CVSs were performed, because 4 dichorionic twin pregnancies required 2 punctures each. The mean gestation at CVS was (11.8 ± 0.7) weeks, and 97.3% of the procedures were performed between 11 and 13 completed weeks. The majority (96.2%) required only 1 puncture to achieve correct needle placement. The procedure failed to obtain an adequate sample in 4 subjects (0.30%). A total of 1351 chromosomal studies were requested, and there was 1 case (0.07%) of culture failure. The results of chromosomal studies were available within 14 days in 36.7% of cases and within 21 days in 94.0%. Overall, 77 chromosomal abnormalities (5.7%) and 5 cases of thalassemia major were detected. Pregnancy outcome was unknown in only 13 singleton subjects (0.96%). In the remaining 1355 fetuses, there were 76 pregnancy terminations (5.56%), 10 fetal losses with obvious obstetric causes (0.73%), and 21 potentially procedure-related fetal losses (1.54%). In the last group, the majority had one or more co-existing obstetric complications. The background fetal loss rate for pregnancies at a similar gestational age in the unit was about 0.8%. Therefore, the procedure-related fetal loss rate was estimated to be at most 0.74%. Conclusions In experienced hands, first-trimester transabdominal CVS is an accurate and safe invasive prenatal diagnostic procedure. It should be one of the options available to pregnant women who require prenatal genetic diagnosis.
Funding: Project (51975169) supported by the National Natural Science Foundation of China; Project (LH2022E085) supported by the Natural Science Foundation of Heilongjiang Province, China.
Abstract: Efficient tool condition monitoring techniques help to realize intelligent management of tool life and reduce tool usage costs. In this paper, the influence of different degrees of wear of ball-end milling cutters on the texture shape of machined tool marks is investigated, and a method is proposed for predicting the wear state (including the position and degree of tool wear) of ball-end milling cutters based on entropy measurement of tool-mark texture images. First, data samples are prepared through wear experiments, and the variation of the tool-mark texture shape with the tool wear state is analyzed. Then, a two-dimensional sample entropy algorithm is developed to quantify the texture morphology. Finally, the processing parameters and tool attitude are integrated into the prediction process to predict the wear value and wear position of the ball-end milling cutter. After testing, the correlation between the predicted value and the standard value of the proposed tool condition monitoring method reaches 95.32%, and the accuracy reaches 82.73%, indicating that the proposed method meets the requirements of tool condition monitoring.
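Two-dimensional sample entropy extends the classic SampEn idea from time series to images: count pairs of m × m patches that lie within a tolerance of each other, repeat for (m + 1) × (m + 1) patches, and take −ln(A/B). The paper's exact formulation is not given in the abstract, so the following is only a minimal brute-force sketch of that general scheme on tiny synthetic "textures"; a regular texture should score lower (more predictable) than an irregular one.

```python
import numpy as np

def sample_entropy_2d(img, m=2, r=0.3):
    """Minimal 2D sample entropy sketch: B counts similar m x m patch
    pairs, A counts similar (m+1) x (m+1) pairs (Chebyshev distance
    below r * std), and the entropy is -ln(A / B)."""
    img = np.asarray(img, dtype=float)
    tol = r * img.std()

    def match_count(size):
        h, w = img.shape
        patches = [img[i:i + size, j:j + size]
                   for i in range(h - size + 1)
                   for j in range(w - size + 1)]
        count = 0
        for a in range(len(patches)):            # all unordered pairs,
            for b in range(a + 1, len(patches)):  # self-matches excluded
                if np.max(np.abs(patches[a] - patches[b])) < tol:
                    count += 1
        return count

    B = match_count(m)
    A = match_count(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(2)
smooth = np.tile(np.linspace(0.0, 1.0, 12), (12, 1))  # regular gradient
noisy = rng.random((12, 12))                          # irregular texture
print(sample_entropy_2d(smooth) < sample_entropy_2d(noisy))
```

The O(n²) pairwise loop is fine for small patches like these tool-mark crops; a production version would vectorize the patch extraction and comparisons.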