Portable ratiometric fluorescent platforms have emerged as promising tools for multifarious detection,yet remain unexplored for point-of-care monitoring doxorubicin(DOX),one of clinically antineoplastic drugs.To this ...Portable ratiometric fluorescent platforms have emerged as promising tools for multifarious detection,yet remain unexplored for point-of-care monitoring doxorubicin(DOX),one of clinically antineoplastic drugs.To this end,we herein develop a portable self-calibrating platform namely carbon dots(C-dots)-embedded hydrogel sensors with a smartphone-assisted high-throughput imaging device,for DOX sensing.The prepared green-emitting(λ_(em)=508 nm)and negatively-charged C-dots(−11.40±1.21 mV),which have sufficient spectral overlap with the absorption band of DOX(∼500 nm),can strongly bind with positively-charged DOX molecules by electrostatic attraction effects.As a result,DOX molecules are selectively and rapid(20 s)determined with a detection limit of 10.26 nmol/L via Förster resonance energy transfer processes,demonstrating a remarkably chromatic shift from green to red.Further integrated with a 3D-printed smartphone-assisted device,the platform enabled high-throughput quantification,achieving recoveries of 96.40%-101.85%in human urine/serum(RSDs<2.94%,n=3).Notably,the dual linear detection ranges of the platform align with the reported clinical DOX concentrations in urine and plasma(0-4 h post-administration),validating their capability for direct quantification of DOX in clinical samples without special pre-treatment processes.By virtue of attractive analytical performances and robust feasibility,this platform bridges laboratory precision and point-of-care testing needs,offering promising potential for personalized chemotherapy and multiplexed analyte screening.展开更多
Small datasets are often challenging due to their limited sample size.This research introduces a novel solution to these problems:average linkage virtual sample generation(ALVSG).ALVSG leverages the underlying data st...Small datasets are often challenging due to their limited sample size.This research introduces a novel solution to these problems:average linkage virtual sample generation(ALVSG).ALVSG leverages the underlying data structure to create virtual samples,which can be used to augment the original dataset.The ALVSG process consists of two steps.First,an average-linkage clustering technique is applied to the dataset to create a dendrogram.The dendrogram represents the hierarchical structure of the dataset,with each merging operation regarded as a linkage.Next,the linkages are combined into an average-based dataset,which serves as a new representation of the dataset.The second step in the ALVSG process involves generating virtual samples using the average-based dataset.The research project generates a set of 100 virtual samples by uniformly distributing them within the provided boundary.These virtual samples are then added to the original dataset,creating a more extensive dataset with improved generalization performance.The efficacy of the ALVSG approach is validated through resampling experiments and t-tests conducted on two small real-world datasets.The experiments are conducted on three forecasting models:the support vector machine for regression(SVR),the deep learning model(DL),and XGBoost.The results show that the ALVSG approach outperforms the baseline methods in terms of mean square error(MSE),root mean square error(RMSE),and mean absolute error(MAE).展开更多
Lightweight nodes are crucial for blockchain scalability,but verifying the availability of complete block data puts significant strain on bandwidth and latency.Existing data availability sampling(DAS)schemes either re...Lightweight nodes are crucial for blockchain scalability,but verifying the availability of complete block data puts significant strain on bandwidth and latency.Existing data availability sampling(DAS)schemes either require trusted setups or suffer from high communication overhead and low verification efficiency.This paper presents ISTIRDA,a DAS scheme that lets light clients certify availability by sampling small random codeword symbols.Built on ISTIR,an improved Reed–Solomon interactive oracle proof of proximity,ISTIRDA combines adaptive folding with dynamic code rate adjustment to preserve soundness while lowering communication.This paper formalizes opening consistency and prove security with bounded error in the random oracle model,giving polylogarithmic verifier queries and no trusted setup.In a prototype compared with FRIDA under equal soundness,ISTIRDA reduces communication by 40.65%to 80%.For data larger than 16 MB,ISTIRDA verifies faster and the advantage widens;at 128 MB,proofs are about 60%smaller and verification time is roughly 25%shorter,while prover overhead remains modest.In peer-to-peer emulation under injected latency and loss,ISTIRDA reaches confidence more quickly and is less sensitive to packet loss and load.These results indicate that ISTIRDA is a scalable and provably secure DAS scheme suitable for high-throughput,large-block public blockchains,substantially easing bandwidth and latency pressure on lightweight nodes.展开更多
As one of the major volatile components in extraterrestrial materials,nitrogen(N_(2))isotopes serve not only as tracers for the formation and evolution of the solar system,but also play a critical role in assessing pl...As one of the major volatile components in extraterrestrial materials,nitrogen(N_(2))isotopes serve not only as tracers for the formation and evolution of the solar system,but also play a critical role in assessing planetary habitability and the search for extraterrestrial life.The integrated measurement of N_(2)and argon(Ar)isotopes by using noble gas mass spectrometry represents a state-of-the-art technique for such investigations.To support the growing demands of planetary science research in China,we have developed a high-efficiency,high-precision method for the integrated analysis of N_(2)and Ar isotopes.This was achieved by enhancing gas extraction and purification systems and integrating them with a static noble gas mass spectrometer.This method enables integrated N_(2)-Ar isotope measurements on submilligram samples,significantly improving sample utilization and reducing the impact of sample heterogeneity on volatile analysis.The system integrates CO_(2)laser heating,a modular two-stage Zr-Al getter pump,and a CuO furnace-based purification process,effectively reducing background levels(N_(2)blank as low as 0.35×10^(−6)cubic centimeters at standard temperature and pressure[ccSTP]).Analytical precision is ensured through calibration with atmospheric air and CO corrections.To validate the reliability of the method,we performed N_(2)-Ar isotope analyses on the Allende carbonaceous chondrite,one of the most extensively studied meteorites internationally.The measured N_(2)concentrations range from 19.2 to 29.8 ppm,withδ15N values between−44.8‰and−33.0‰.Concentrations of 40Ar,36Ar,and 38Ar are(12.5-21.1)×10^(−6)ccSTP/g,(90.9-150.3)×10^(−9)ccSTP/g,and(19.2-30.7)×10^(−9)ccSTP/g,respectively.These values correspond to cosmic-ray exposure ages of 4.5-5.7 Ma,consistent with previous reports.Step-heating experiments further reveal distinct release patterns of N and Ar isotopes,as well as their associations with specific mineral phases in the meteorite.In summary,the combined N_(2)-Ar isotopic system offers significant advantages for tracing volatile sources in extraterrestrial materials and will provide essential analytical support for upcoming Chinese planetary missions,such as Tianwen-2.展开更多
Critical Height Sampling(CHS)estimates stand volume free from any model and tree form assumptions.Despite its introduction more than four decades ago,CHS has not been widely applied in the field due to perceived chall...Critical Height Sampling(CHS)estimates stand volume free from any model and tree form assumptions.Despite its introduction more than four decades ago,CHS has not been widely applied in the field due to perceived challenges in measurement.The objectives of this study were to compare estimated stand volume between CHS and sampling methods that used volume or taper models,the equivalence of the sampling methods,and their relative efficiency.We established 65 field plots in planted forests of two coniferous tree species.We estimated stand volume for a range of Basal Area Factors(BAFs).Results showed that CHS produced the most similar mean stand volume across BAFs and tree species with maximum differences between BAFs of 5-18m^(3)·ha^(−1).Horizontal Point Sampling(HPS)using volume models produced very large variability in mean stand volume across BAFs with the differences up to 126m^(3)·ha^(−1).However,CHS was less precise and less efficient than HPS.Furthermore,none of the sampling methods were statistically interchangeable with CHS at an allowable tolerance of≤55m^(3)·ha^(−1).About 72%of critical height measurements were below crown base indicating that critical height was more accessible to measurement than expected.Our study suggests that the consistency in the mean estimates of CHS is a major advantage when planning a forest inventory.When checking against CHS,results hint that HPS estimates might contain potential model bias.These strengths of CHS could outweigh its lower precision.Our study also implies serious implications in financial terms when choosing a sampling method.Lastly,CHS could potentially benefit forest management as an alternate option of estimating stand volume when volume or taper models are lacking or are not reliable.展开更多
The laboratories in the bauxite processing industry are always under a heavy workload of sample collection, analysis, and compilation of the results. After size reduction from grinding mills, the samples of bauxite ar...The laboratories in the bauxite processing industry are always under a heavy workload of sample collection, analysis, and compilation of the results. After size reduction from grinding mills, the samples of bauxite are collected after intervals of 3 to 4 hours. Large bauxite processing industries producing 1 million tons of pure aluminium can have three grinding mills. Thus, the total number of samples to be tested in one day reaches a figure of 18 to 24. The sample of bauxite ore coming from the grinding mill is tested for its particle size and composition. For testing the composition, the bauxite ore sample is first prepared by fusing it with X-ray flux. Then the sample is sent for X-ray fluorescence analysis. Afterwards, the crucibles are washed in ultrasonic baths to be used for the next testing. The whole procedure takes about 2 - 3 hours. With a large number of samples reaching the laboratory, the chances of error in composition analysis increase. In this study, we have used a composite sampling methodology to reduce the number of samples reaching the laboratory without compromising their validity. The results of the average composition of fifteen samples were measured against composite samples. The mean of difference was calculated. The standard deviation and paired t-test values were evaluated against predetermined critical values obtained using a two-tailed test. It was found from the results that paired test-t values were much lower than the critical values thus validating the composition attained through composite sampling. The composite sampling approach not only reduced the number of samples but also the chemicals used in the laboratory. The objective of improved analytical protocol to reduce the number of samples reaching the laboratory was successfully achieved without compromising the quality of analytical results.展开更多
One of the detection objectives of the Chinese Asteroid Exploration mission is to investigate the space environment near the Main-belt Comet(MBC,Active Asteroid)311P/PANSTARRS.This paper outlines the scientific object...One of the detection objectives of the Chinese Asteroid Exploration mission is to investigate the space environment near the Main-belt Comet(MBC,Active Asteroid)311P/PANSTARRS.This paper outlines the scientific objectives,measurement targets,and measurement requirements for the proposed Gas and Ion Analyzer(GIA).The GIA is designed for in-situ mass spectrometry of neutral gases and low-energy ions,such as hydrogen,carbon,and oxygen,in the vicinity of 311P.Ion sampling techniques are essential for the GIA's Time-of-Flight(TOF)mass analysis capabilities.In this paper,we present an enhanced ion sampling technique through the development of an ion attraction model and an ion source model.The ion attraction model demonstrates that adjusting attraction grid voltage can enhance the detection efficiency of low-energy ions and mitigate the repulsive force of ions during sampling,which is influenced by the satellite's surface positive charging.The ion source model simulates the processes of gas ionization and ion multiplication.Simulation results indicate that the GIA can achieve a lower pressure limit below 10-13Pa and possess a dynamic range exceeding 10~9.These performances ensure the generation of ions with stable and consistent current,which is crucial for high-resolution and broad dynamic range mass spectrometer analysis.Preliminary testing experiments have verified GIA's capability to detect gas compositions such as H2O and N2.In-situ measurements near 311P using GIA are expected to significantly contribute to our understanding of asteroid activity mechanisms,the evolution of the atmospheric and ionized environments of main-belt comets,the interactions with solar wind,and the origin of Earth's water.展开更多
Industrial data mining usually deals with data from different sources.These heterogeneous datasets describe the same object in different views.However,samples from some of the datasets may be lost.Then the remaining s...Industrial data mining usually deals with data from different sources.These heterogeneous datasets describe the same object in different views.However,samples from some of the datasets may be lost.Then the remaining samples do not correspond one-to-one correctly.Mismatched datasets caused by missing samples make the industrial data unavailable for further machine learning.In order to align the mismatched samples,this article presents a cooperative iteration matching method(CIMM)based on the modified dynamic time warping(DTW).The proposed method regards the sequentially accumulated industrial data as the time series.Mismatched samples are aligned by the DTW.In addition,dynamic constraints are applied to the warping distance of the DTW process to make the alignment more efficient.Then a series of models are trained with the cumulated samples iteratively.Several groups of numerical experiments on different missing patterns and missing locations are designed and analyzed to prove the effectiveness and the applicability of the proposed method.展开更多
As an emerging microscopic detection tool,quantum microscopes based on the principle of quantum precision measurement have attracted widespread attention in recent years.Compared with the imaging of classical light,qu...As an emerging microscopic detection tool,quantum microscopes based on the principle of quantum precision measurement have attracted widespread attention in recent years.Compared with the imaging of classical light,quantum-enhanced imaging can achieve ultra-high resolution,ultra-sensitive detection,and anti-interference imaging.Here,we introduce a quantum-enhanced scanning microscope under illumination of an entangled NOON state in polarization.For the phase imager with NOON states,we propose a simple four-basis projection method to replace the four-step phase-shifting method.We have achieved the phase imaging of micrometer-sized birefringent samples and biological cell specimens,with sensitivity close to the Heisenberg limit.The visibility of transmittance-based imaging shows a great enhancement for NOON states.Besides,we also demonstrate that the scanning imaging with NOON states enables the spatial resolution enhancement of√N compared with classical measurement.Our imaging method may provide some reference for the practical application of quantum imaging and is expected to promote the development of microscopic detection.展开更多
Marine gas hydrates are highly sensitive to temperature and pressure fluctuations,and deviations from in-situ conditions may cause irreversible changes in phase state,microstructure,and mechanical properties.However,c...Marine gas hydrates are highly sensitive to temperature and pressure fluctuations,and deviations from in-situ conditions may cause irreversible changes in phase state,microstructure,and mechanical properties.However,conventional samplers often fail to maintain sealing and thermal stability,resulting in low sampling success rates.To address these challenges,an in-situ temperature-and pressure-preserved sampler for marine applications has been developed.The experimental results indicate that the selfdeveloped magnetically controlled pressure-preserved controller reliably achieves autonomous triggering and self-sealing,provides an initial sealing force of 83 N,and is capable of maintaining pressures up to 40 MPa.Additionally,a custom-designed intelligent temperature control chip and high-precision sensors were integrated into the sampler.Through the design of an optimized heat transfer structure,a temperature-preserved system was developed,achieving no more than a 0.3℃ rise in temperature within 2 h.The performance evaluation and sampling operations of the sampler were conducted at the Haima Cold Seep in the South China Sea,resulting in the successful recovery of hydrate maintained under in-situ pressure of 13.8 MPa and a temperature of 6.5℃.This advancement enables the acquisition of high-fidelity hydrate samples,providing critical support for the safe exploitation and scientific analysis of marine gas hydrate resources.展开更多
Weighted exponential distribution W ED(α,λ)with shape parameterαand scale parameterλpossesses some good properties and can be used as a good fit to survival time data compared to other distributions such as gamma,...Weighted exponential distribution W ED(α,λ)with shape parameterαand scale parameterλpossesses some good properties and can be used as a good fit to survival time data compared to other distributions such as gamma,Weibull,or generalized exponential distribution.In this article,we proved the existence and uniqueness of the maximum likelihood estimator(MLE)of the parameters of W ED(α,λ)in simple random sampling(SRS)and provided explicit expressions for the Fisher information number in SRS.Moreover,we also proved the existence and uniqueness of the MLE of the parameters of W ED(α,λ)in ranked set sampling(RSS)and provided explicit expressions for the Fisher information number in RSS.Simulation studies show that these MLEs in RSS can be real competitors for those in SRS.展开更多
Selection of negative samples significantly influences landslide susceptibility assessment,especially when establishing the relationship between landslides and environmental factors in regions with complex geological ...Selection of negative samples significantly influences landslide susceptibility assessment,especially when establishing the relationship between landslides and environmental factors in regions with complex geological conditions.Traditional sampling strategies commonly used in landslide susceptibility models can lead to a misrepresentation of the distribution of negative samples,causing a deviation from actual geological conditions.This,in turn,negatively affects the discriminative ability and generalization performance of the models.To address this issue,we propose a novel approach for selecting negative samples to enhance the quality of machine learning models.We choose the Liangshan Yi Autonomous Prefecture,located in southwestern Sichuan,China,as the case study.This area,characterized by complex terrain,frequent tectonic activities,and steep slope erosion,experiences recurrent landslides,making it an ideal setting for validating our proposed method.We calculate the contribution values of environmental factors using the relief algorithm to construct the feature space,apply the Target Space Exteriorization Sampling(TSES)method to select negative samples,calculate landslide probability values by Random Forest(RF)modeling,and then create regional landslide susceptibility maps.We evaluate the performance of the RF model optimized by the Environmental Factor Selection-based TSES(EFSTSES)method using standard performance metrics.The results indicated that the model achieved an accuracy(ACC)of 0.962,precision(PRE)of 0.961,and an area under the curve(AUC)of 0.962.These findings demonstrate that the EFSTSES-based model effectively mitigates the negative sample imbalance issue,enhances the differentiation between landslide and non-landslide samples,and reduces misclassification,particularly in geologically complex areas.These improvements offer valuable insights for disaster prevention,land use planning,and risk mitigation strategies.展开更多
An intelligent diagnosis method based on self-adaptiveWasserstein dual generative adversarial networks and feature fusion is proposed due to problems such as insufficient sample size and incomplete fault feature extra...An intelligent diagnosis method based on self-adaptiveWasserstein dual generative adversarial networks and feature fusion is proposed due to problems such as insufficient sample size and incomplete fault feature extraction,which are commonly faced by rolling bearings and lead to low diagnostic accuracy.Initially,dual models of the Wasserstein deep convolutional generative adversarial network incorporating gradient penalty(1D-2DWDCGAN)are constructed to augment the original dataset.A self-adaptive loss threshold control training strategy is introduced,and establishing a self-adaptive balancing mechanism for stable model training.Subsequently,a diagnostic model based on multidimensional feature fusion is designed,wherein complex features from various dimensions are extracted,merging the original signal waveform features,structured features,and time-frequency features into a deep composite feature representation that encompasses multiple dimensions and scales;thus,efficient and accurate small sample fault diagnosis is facilitated.Finally,an experiment between the bearing fault dataset of CaseWestern ReserveUniversity and the fault simulation experimental platformdataset of this research group shows that this method effectively supplements the dataset and remarkably improves the diagnostic accuracy.The diagnostic accuracy after data augmentation reached 99.94%and 99.87%in two different experimental environments,respectively.In addition,robustness analysis is conducted on the diagnostic accuracy of the proposed method under different noise backgrounds,verifying its good generalization performance.展开更多
Optical solitons,as self-sustaining waveforms in a nonlinear medium where dispersion and nonlinear effects are balanced,have key applications in ultrafast laser systems and optical communications.Physics-informed neur...Optical solitons,as self-sustaining waveforms in a nonlinear medium where dispersion and nonlinear effects are balanced,have key applications in ultrafast laser systems and optical communications.Physics-informed neural networks(PINN)provide a new way to solve the nonlinear Schrodinger equation describing the soliton evolution by fusing data-driven and physical constraints.However,the grid point sampling strategy of traditional PINN suffers from high computational complexity and unstable gradient flow,which makes it difficult to capture the physical details efficiently.In this paper,we propose a residual-based adaptive multi-distribution(RAMD)sampling method to optimize the PINN training process by dynamically constructing a multi-modal loss distribution.With a 50%reduction in the number of grid points,RAMD significantly reduces the relative error of PINN and,in particular,optimizes the solution error of the(2+1)Ginzburg–Landau equation from 4.55%to 1.98%.RAMD breaks through the lack of physical constraints in the purely data-driven model by the innovative combination of multi-modal distribution modeling and autonomous sampling control for the design of all-optical communication devices.RAMD provides a high-precision numerical simulation tool for the design of all-optical communication devices,optimization of nonlinear laser devices,and other studies.展开更多
Background:The local pivotal method(LPM)utilizing auxiliary data in sample selection has recently been proposed as a sampling method for national forest inventories(NFIs).Its performance compared to simple random samp...Background:The local pivotal method(LPM)utilizing auxiliary data in sample selection has recently been proposed as a sampling method for national forest inventories(NFIs).Its performance compared to simple random sampling(SRS)and LPM with geographical coordinates has produced promising results in simulation studies.In this simulation study we compared all these sampling methods to systematic sampling.The LPM samples were selected solely using the coordinates(LPMxy)or,in addition to that,auxiliary remote sensing-based forest variables(RS variables).We utilized field measurement data(NFI-field)and Multi-Source NFI(MS-NFI)maps as target data,and independent MS-NFI maps as auxiliary data.The designs were compared using relative efficiency(RE);a ratio of mean squared errors of the reference sampling design against the studied design.Applying a method in NFI also requires a proven estimator for the variance.Therefore,three different variance estimators were evaluated against the empirical variance of replications:1)an estimator corresponding to SRS;2)a Grafström-Schelin estimator repurposed for LPM;and 3)a Matérn estimator applied in the Finnish NFI for systematic sampling design.Results:The LPMxy was nearly comparable with the systematic design for the most target variables.The REs of the LPM designs utilizing auxiliary data compared to the systematic design varied between 0.74–1.18,according to the studied target variable.The SRS estimator for variance was expectedly the most biased and conservative estimator.Similarly,the Grafström-Schelin estimator gave overestimates in the case of LPMxy.When the RS variables were utilized as auxiliary data,the Grafström-Schelin estimates tended to underestimate the empirical variance.In systematic sampling the Matérn and Grafström-Schelin estimators performed for practical purposes equally.Conclusions:LPM optimized for a specific variable tended to be more efficient than systematic sampling,but all of the considered LPM designs were less efficient than the systematic sampling design for some target variables.The Grafström-Schelin estimator could be used as such with LPMxy or instead of the Matérn estimator in systematic sampling.Further studies of the variance estimators are needed if other auxiliary variables are to be used in LPM.展开更多
Nonperiodic interrupted sampling repeater jamming(ISRJ)against inverse synthetic aperture radar(ISAR)can obtain two-dimensional blanket jamming performance by joint fast and slow time domain interrupted modulation,whi...Nonperiodic interrupted sampling repeater jamming(ISRJ)against inverse synthetic aperture radar(ISAR)can obtain two-dimensional blanket jamming performance by joint fast and slow time domain interrupted modulation,which is obviously dif-ferent from the conventional multi-false-target deception jam-ming.In this paper,a suppression method against this kind of novel jamming is proposed based on inter-pulse energy function and compressed sensing theory.By utilizing the discontinuous property of the jamming in slow time domain,the unjammed pulse is separated using the intra-pulse energy function diffe-rence.Based on this,the two-dimensional orthogonal matching pursuit(2D-OMP)algorithm is proposed.Further,it is proposed to reconstruct the ISAR image with the obtained unjammed pulse sequence.The validity of the proposed method is demon-strated via the Yake-42 plane data simulations.展开更多
Three-dimensional printing(3DP)offers valuable insight into the characterization of natural rocks and the verification of theoretical models due to its high reproducibility and accurate replication of complex defects ...Three-dimensional printing(3DP)offers valuable insight into the characterization of natural rocks and the verification of theoretical models due to its high reproducibility and accurate replication of complex defects such as cracks and pores.In this study,3DP gypsum samples with different printing directions were subjected to a series of uniaxial compression tests with in situ micro-computed tomography(micro-CT)scanning to quantitatively investigate their mechanical anisotropic properties and damage evolution characteristics.Based on the two-dimensional(2D)CT images obtained at different scanning steps,a novel void ratio variable was derived using the mean value and variance of CT intensity.Additionally,a constitutive model was formulated incorporating the proposed damage variable,utilizing the void ratio variable.The crack evolution and crack morphology of 3DP gypsum samples were obtained and analyzed using the 3D models reconstructed from the CT images.The results indicate that 3DP gypsum samples exhibit mechanical anisotropic characteristics similar to those found in naturally sedimentary rocks.The mechanical anisotropy is attributed to the bedding planes formed between adjacent layers and pillar-like structures along the printing direction formed by CaSO_(4)·2H_(2)O crystals of needle-like morphology.The mean gray intensity of the voids has a positive linear relationship with the threshold value,while the CT variance and void ratio have concave and convex relationships,respectively.The constitutive model can effectively match the stress–strain curves obtained from uniaxial compression experiments.This study provides comprehensive explanations of the failure modes and anisotropic mechanisms of 3DP gypsum samples,which is important for characterizing and understanding the failure mechanism and microstructural evolution of 3DP rocks when modeling natural rock behavior.展开更多
Electro-Optic Sampling(EOS)detection technique has been widely used in terahertz science and tech⁃nology,and it also can measure the field time waveform of the few-cycle laser pulse.Its frequency response and band lim...Electro-Optic Sampling(EOS)detection technique has been widely used in terahertz science and tech⁃nology,and it also can measure the field time waveform of the few-cycle laser pulse.Its frequency response and band limitation are determined directly by the electro-optic crystal and duration of the probe laser pulse.Here,we investigate the performance of the EOS with thin GaSe crystal in the measurement of the mid-infrared few-cycle la⁃ser pulse.The shift of the central frequency and change of the bandwidth induced by the EOS detection are calcu⁃lated,and then the pulse distortions induced in this detection process are discussed.It is found that this technique produces a red-shift of the central frequency and narrowing of the bandwidth.These changings decrease when the laser wavelength increases from 2μm to 10μm.This work can help to estimate the performance of the EOS de⁃tection technique in the mid-infrared band and offer a reference for the related experiment as well.展开更多
Task-oriented point cloud sampling aims to select a representative subset from the input,tailored to specific application scenarios and task requirements.However,existing approaches rarely tackle the problem of redund...Task-oriented point cloud sampling aims to select a representative subset from the input,tailored to specific application scenarios and task requirements.However,existing approaches rarely tackle the problem of redundancy caused by local structural similarities in 3D objects,which limits the performance of sampling.To address this issue,this paper introduces a novel task-oriented point cloud masked autoencoder-based sampling network(Point-MASNet),inspired by the masked autoencoder mechanism.Point-MASNet employs a voxel-based random non-overlapping masking strategy,which allows the model to selectively learn and capture distinctive local structural features from the input data.This approach effectively mitigates redundancy and enhances the representativeness of the sampled subset.In addition,we propose a lightweight,symmetrically structured keypoint reconstruction network,designed as an autoencoder.This network is optimized to efficiently extract latent features while enabling refined reconstructions.Extensive experiments demonstrate that Point-MASNet achieves competitive sampling performance across classification,registration,and reconstruction tasks.展开更多
基金supported by the National NaturalScience Foundation of China(No.22274001)the Key Project of Natural Science Research of the Education Department of Anhui Province(No.2022AH051032)the Excellent Research and Innovation Team of Universities in Anhui Province(No.2024AH010016).
文摘Portable ratiometric fluorescent platforms have emerged as promising tools for multifarious detection,yet remain unexplored for point-of-care monitoring doxorubicin(DOX),one of clinically antineoplastic drugs.To this end,we herein develop a portable self-calibrating platform namely carbon dots(C-dots)-embedded hydrogel sensors with a smartphone-assisted high-throughput imaging device,for DOX sensing.The prepared green-emitting(λ_(em)=508 nm)and negatively-charged C-dots(−11.40±1.21 mV),which have sufficient spectral overlap with the absorption band of DOX(∼500 nm),can strongly bind with positively-charged DOX molecules by electrostatic attraction effects.As a result,DOX molecules are selectively and rapid(20 s)determined with a detection limit of 10.26 nmol/L via Förster resonance energy transfer processes,demonstrating a remarkably chromatic shift from green to red.Further integrated with a 3D-printed smartphone-assisted device,the platform enabled high-throughput quantification,achieving recoveries of 96.40%-101.85%in human urine/serum(RSDs<2.94%,n=3).Notably,the dual linear detection ranges of the platform align with the reported clinical DOX concentrations in urine and plasma(0-4 h post-administration),validating their capability for direct quantification of DOX in clinical samples without special pre-treatment processes.By virtue of attractive analytical performances and robust feasibility,this platform bridges laboratory precision and point-of-care testing needs,offering promising potential for personalized chemotherapy and multiplexed analyte screening.
基金funding support from the National Science and Technology Council(NSTC),under Grant No.114-2410-H-011-026-MY3.
文摘Small datasets are often challenging due to their limited sample size.This research introduces a novel solution to these problems:average linkage virtual sample generation(ALVSG).ALVSG leverages the underlying data structure to create virtual samples,which can be used to augment the original dataset.The ALVSG process consists of two steps.First,an average-linkage clustering technique is applied to the dataset to create a dendrogram.The dendrogram represents the hierarchical structure of the dataset,with each merging operation regarded as a linkage.Next,the linkages are combined into an average-based dataset,which serves as a new representation of the dataset.The second step in the ALVSG process involves generating virtual samples using the average-based dataset.The research project generates a set of 100 virtual samples by uniformly distributing them within the provided boundary.These virtual samples are then added to the original dataset,creating a more extensive dataset with improved generalization performance.The efficacy of the ALVSG approach is validated through resampling experiments and t-tests conducted on two small real-world datasets.The experiments are conducted on three forecasting models:the support vector machine for regression(SVR),the deep learning model(DL),and XGBoost.The results show that the ALVSG approach outperforms the baseline methods in terms of mean square error(MSE),root mean square error(RMSE),and mean absolute error(MAE).
基金supported in part by the Research Fund of Key Lab of Education Blockchain and Intelligent Technology,Ministry of Education(EBME25-F-08).
文摘Lightweight nodes are crucial for blockchain scalability,but verifying the availability of complete block data puts significant strain on bandwidth and latency.Existing data availability sampling(DAS)schemes either require trusted setups or suffer from high communication overhead and low verification efficiency.This paper presents ISTIRDA,a DAS scheme that lets light clients certify availability by sampling small random codeword symbols.Built on ISTIR,an improved Reed–Solomon interactive oracle proof of proximity,ISTIRDA combines adaptive folding with dynamic code rate adjustment to preserve soundness while lowering communication.This paper formalizes opening consistency and prove security with bounded error in the random oracle model,giving polylogarithmic verifier queries and no trusted setup.In a prototype compared with FRIDA under equal soundness,ISTIRDA reduces communication by 40.65%to 80%.For data larger than 16 MB,ISTIRDA verifies faster and the advantage widens;at 128 MB,proofs are about 60%smaller and verification time is roughly 25%shorter,while prover overhead remains modest.In peer-to-peer emulation under injected latency and loss,ISTIRDA reaches confidence more quickly and is less sensitive to packet loss and load.These results indicate that ISTIRDA is a scalable and provably secure DAS scheme suitable for high-throughput,large-block public blockchains,substantially easing bandwidth and latency pressure on lightweight nodes.
基金supported by the Bureau of Frontier Sciences and Basic Research,Chinese Academy of Sciences(Grant No.QYJ-2025-0103)the National Natural Science Foundation of China(Grant Nos.42441834,42241105,42441825,and 42203048)the Key Research Program of the Institute of Geology and Geophysics,Chinese Academy of Sciences(Grant No.IGGCAS-202401).
文摘As one of the major volatile components in extraterrestrial materials,nitrogen(N_(2))isotopes serve not only as tracers for the formation and evolution of the solar system,but also play a critical role in assessing planetary habitability and the search for extraterrestrial life.The integrated measurement of N_(2)and argon(Ar)isotopes by using noble gas mass spectrometry represents a state-of-the-art technique for such investigations.To support the growing demands of planetary science research in China,we have developed a high-efficiency,high-precision method for the integrated analysis of N_(2)and Ar isotopes.This was achieved by enhancing gas extraction and purification systems and integrating them with a static noble gas mass spectrometer.This method enables integrated N_(2)-Ar isotope measurements on submilligram samples,significantly improving sample utilization and reducing the impact of sample heterogeneity on volatile analysis.The system integrates CO_(2)laser heating,a modular two-stage Zr-Al getter pump,and a CuO furnace-based purification process,effectively reducing background levels(N_(2)blank as low as 0.35×10^(−6)cubic centimeters at standard temperature and pressure[ccSTP]).Analytical precision is ensured through calibration with atmospheric air and CO corrections.To validate the reliability of the method,we performed N_(2)-Ar isotope analyses on the Allende carbonaceous chondrite,one of the most extensively studied meteorites internationally.The measured N_(2)concentrations range from 19.2 to 29.8 ppm,withδ15N values between−44.8‰and−33.0‰.Concentrations of 40Ar,36Ar,and 38Ar are(12.5-21.1)×10^(−6)ccSTP/g,(90.9-150.3)×10^(−9)ccSTP/g,and(19.2-30.7)×10^(−9)ccSTP/g,respectively.These values correspond to cosmic-ray exposure ages of 4.5-5.7 Ma,consistent with previous reports.Step-heating experiments further reveal distinct release patterns of N and Ar isotopes,as well as their associations with specific mineral phases in the meteorite.In summary,the combined N_(2)-Ar isotopic system offers significant advantages for tracing volatile sources in extraterrestrial materials and will provide essential analytical support for upcoming Chinese planetary missions,such as Tianwen-2.
文摘Critical Height Sampling(CHS)estimates stand volume free from any model and tree form assumptions.Despite its introduction more than four decades ago,CHS has not been widely applied in the field due to perceived challenges in measurement.The objectives of this study were to compare estimated stand volume between CHS and sampling methods that used volume or taper models,the equivalence of the sampling methods,and their relative efficiency.We established 65 field plots in planted forests of two coniferous tree species.We estimated stand volume for a range of Basal Area Factors(BAFs).Results showed that CHS produced the most similar mean stand volume across BAFs and tree species with maximum differences between BAFs of 5-18m^(3)·ha^(−1).Horizontal Point Sampling(HPS)using volume models produced very large variability in mean stand volume across BAFs with the differences up to 126m^(3)·ha^(−1).However,CHS was less precise and less efficient than HPS.Furthermore,none of the sampling methods were statistically interchangeable with CHS at an allowable tolerance of≤55m^(3)·ha^(−1).About 72%of critical height measurements were below crown base indicating that critical height was more accessible to measurement than expected.Our study suggests that the consistency in the mean estimates of CHS is a major advantage when planning a forest inventory.When checking against CHS,results hint that HPS estimates might contain potential model bias.These strengths of CHS could outweigh its lower precision.Our study also implies serious implications in financial terms when choosing a sampling method.Lastly,CHS could potentially benefit forest management as an alternate option of estimating stand volume when volume or taper models are lacking or are not reliable.
文摘The laboratories in the bauxite processing industry are always under a heavy workload of sample collection, analysis, and compilation of the results. After size reduction from grinding mills, the samples of bauxite are collected after intervals of 3 to 4 hours. Large bauxite processing industries producing 1 million tons of pure aluminium can have three grinding mills. Thus, the total number of samples to be tested in one day reaches a figure of 18 to 24. The sample of bauxite ore coming from the grinding mill is tested for its particle size and composition. For testing the composition, the bauxite ore sample is first prepared by fusing it with X-ray flux. Then the sample is sent for X-ray fluorescence analysis. Afterwards, the crucibles are washed in ultrasonic baths to be used for the next testing. The whole procedure takes about 2 - 3 hours. With a large number of samples reaching the laboratory, the chances of error in composition analysis increase. In this study, we have used a composite sampling methodology to reduce the number of samples reaching the laboratory without compromising their validity. The results of the average composition of fifteen samples were measured against composite samples. The mean of difference was calculated. The standard deviation and paired t-test values were evaluated against predetermined critical values obtained using a two-tailed test. It was found from the results that paired test-t values were much lower than the critical values thus validating the composition attained through composite sampling. The composite sampling approach not only reduced the number of samples but also the chemicals used in the laboratory. The objective of improved analytical protocol to reduce the number of samples reaching the laboratory was successfully achieved without compromising the quality of analytical results.
基金Supported by the National Natural Science Foundation of China(42474239,41204128)China National Space Administration(Pre-research project on Civil Aerospace Technologies No.D010301)Strategic Priority Research Program of the Chinese Academy of Sciences(XDA17010303)。
文摘One of the detection objectives of the Chinese Asteroid Exploration mission is to investigate the space environment near the Main-belt Comet(MBC,Active Asteroid)311P/PANSTARRS.This paper outlines the scientific objectives,measurement targets,and measurement requirements for the proposed Gas and Ion Analyzer(GIA).The GIA is designed for in-situ mass spectrometry of neutral gases and low-energy ions,such as hydrogen,carbon,and oxygen,in the vicinity of 311P.Ion sampling techniques are essential for the GIA's Time-of-Flight(TOF)mass analysis capabilities.In this paper,we present an enhanced ion sampling technique through the development of an ion attraction model and an ion source model.The ion attraction model demonstrates that adjusting attraction grid voltage can enhance the detection efficiency of low-energy ions and mitigate the repulsive force of ions during sampling,which is influenced by the satellite's surface positive charging.The ion source model simulates the processes of gas ionization and ion multiplication.Simulation results indicate that the GIA can achieve a lower pressure limit below 10-13Pa and possess a dynamic range exceeding 10~9.These performances ensure the generation of ions with stable and consistent current,which is crucial for high-resolution and broad dynamic range mass spectrometer analysis.Preliminary testing experiments have verified GIA's capability to detect gas compositions such as H2O and N2.In-situ measurements near 311P using GIA are expected to significantly contribute to our understanding of asteroid activity mechanisms,the evolution of the atmospheric and ionized environments of main-belt comets,the interactions with solar wind,and the origin of Earth's water.
基金the Key National Natural Science Foundation of China(No.U1864211)the National Natural Science Foundation of China(No.11772191)the Natural Science Foundation of Shanghai(No.21ZR1431500)。
文摘Industrial data mining usually deals with data from different sources.These heterogeneous datasets describe the same object in different views.However,samples from some of the datasets may be lost.Then the remaining samples do not correspond one-to-one correctly.Mismatched datasets caused by missing samples make the industrial data unavailable for further machine learning.In order to align the mismatched samples,this article presents a cooperative iteration matching method(CIMM)based on the modified dynamic time warping(DTW).The proposed method regards the sequentially accumulated industrial data as the time series.Mismatched samples are aligned by the DTW.In addition,dynamic constraints are applied to the warping distance of the DTW process to make the alignment more efficient.Then a series of models are trained with the cumulated samples iteratively.Several groups of numerical experiments on different missing patterns and missing locations are designed and analyzed to prove the effectiveness and the applicability of the proposed method.
基金supported by he National Natural Science Foundation of China(Grant Nos.12304359,12304398,12404382,12234009,12274215,and 12427808)the China Postdoctoral Science Foundation(Grant No.2023M731611)+4 种基金the Jiangsu Funding Program for Excellent Postdoctoral Talent(Grant No.2023ZB717)Innovation Program for Quantum Science and Technology(Grant No.2021ZD0301400)Key R&D Program of Jiangsu Province(Grant No.BE2023002)Natural Science Foundation of Jiangsu Province(Grant Nos.BK20220759 and BK20233001)Program for Innovative Talents and Entrepreneurs in Jiangsu,and Key R&D Program of Guangdong Province(Grant No.2020B0303010001).
文摘As an emerging microscopic detection tool,quantum microscopes based on the principle of quantum precision measurement have attracted widespread attention in recent years.Compared with the imaging of classical light,quantum-enhanced imaging can achieve ultra-high resolution,ultra-sensitive detection,and anti-interference imaging.Here,we introduce a quantum-enhanced scanning microscope under illumination of an entangled NOON state in polarization.For the phase imager with NOON states,we propose a simple four-basis projection method to replace the four-step phase-shifting method.We have achieved the phase imaging of micrometer-sized birefringent samples and biological cell specimens,with sensitivity close to the Heisenberg limit.The visibility of transmittance-based imaging shows a great enhancement for NOON states.Besides,we also demonstrate that the scanning imaging with NOON states enables the spatial resolution enhancement of√N compared with classical measurement.Our imaging method may provide some reference for the practical application of quantum imaging and is expected to promote the development of microscopic detection.
基金financially supported by Shenzhen Science and Technology Program(Nos.JSGG20220831105002005 and KJZD20231025152759002)Support from the National Natural Science Foundation of China(Nos.52374357 and 523B2101)funded by the Shared Voyages Project for Deep-sea and Abyss Scientific Research and Equipment Sea Trials of Hainan Deep-Sea Technology Innovation Center(No.DSTIC-GXHC-2022002)。
文摘Marine gas hydrates are highly sensitive to temperature and pressure fluctuations,and deviations from in-situ conditions may cause irreversible changes in phase state,microstructure,and mechanical properties.However,conventional samplers often fail to maintain sealing and thermal stability,resulting in low sampling success rates.To address these challenges,an in-situ temperature-and pressure-preserved sampler for marine applications has been developed.The experimental results indicate that the selfdeveloped magnetically controlled pressure-preserved controller reliably achieves autonomous triggering and self-sealing,provides an initial sealing force of 83 N,and is capable of maintaining pressures up to 40 MPa.Additionally,a custom-designed intelligent temperature control chip and high-precision sensors were integrated into the sampler.Through the design of an optimized heat transfer structure,a temperature-preserved system was developed,achieving no more than a 0.3℃ rise in temperature within 2 h.The performance evaluation and sampling operations of the sampler were conducted at the Haima Cold Seep in the South China Sea,resulting in the successful recovery of hydrate maintained under in-situ pressure of 13.8 MPa and a temperature of 6.5℃.This advancement enables the acquisition of high-fidelity hydrate samples,providing critical support for the safe exploitation and scientific analysis of marine gas hydrate resources.
基金Supported by the National Science Foundation of China(11901236,12261036)Scientific Research Fund of Hunan Provincial Education Department(21A0328)+2 种基金Provincial Natural Science Foundation of Hunan(2022JJ30469)Young Core Teacher Foundation of Hunan Province([2020]43)Provincial Postgraduate Innovation Foundation of Hunan(CX20221113)。
文摘Weighted exponential distribution W ED(α,λ)with shape parameterαand scale parameterλpossesses some good properties and can be used as a good fit to survival time data compared to other distributions such as gamma,Weibull,or generalized exponential distribution.In this article,we proved the existence and uniqueness of the maximum likelihood estimator(MLE)of the parameters of W ED(α,λ)in simple random sampling(SRS)and provided explicit expressions for the Fisher information number in SRS.Moreover,we also proved the existence and uniqueness of the MLE of the parameters of W ED(α,λ)in ranked set sampling(RSS)and provided explicit expressions for the Fisher information number in RSS.Simulation studies show that these MLEs in RSS can be real competitors for those in SRS.
基金supported by Natural Science Research Project of Anhui Educational Committee(2023AH030041)National Natural Science Foundation of China(42277136)Anhui Province Young and Middle-aged Teacher Training Action Project(DTR2023018).
文摘Selection of negative samples significantly influences landslide susceptibility assessment,especially when establishing the relationship between landslides and environmental factors in regions with complex geological conditions.Traditional sampling strategies commonly used in landslide susceptibility models can lead to a misrepresentation of the distribution of negative samples,causing a deviation from actual geological conditions.This,in turn,negatively affects the discriminative ability and generalization performance of the models.To address this issue,we propose a novel approach for selecting negative samples to enhance the quality of machine learning models.We choose the Liangshan Yi Autonomous Prefecture,located in southwestern Sichuan,China,as the case study.This area,characterized by complex terrain,frequent tectonic activities,and steep slope erosion,experiences recurrent landslides,making it an ideal setting for validating our proposed method.We calculate the contribution values of environmental factors using the relief algorithm to construct the feature space,apply the Target Space Exteriorization Sampling(TSES)method to select negative samples,calculate landslide probability values by Random Forest(RF)modeling,and then create regional landslide susceptibility maps.We evaluate the performance of the RF model optimized by the Environmental Factor Selection-based TSES(EFSTSES)method using standard performance metrics.The results indicated that the model achieved an accuracy(ACC)of 0.962,precision(PRE)of 0.961,and an area under the curve(AUC)of 0.962.These findings demonstrate that the EFSTSES-based model effectively mitigates the negative sample imbalance issue,enhances the differentiation between landslide and non-landslide samples,and reduces misclassification,particularly in geologically complex areas.These improvements offer valuable insights for disaster prevention,land use planning,and risk mitigation strategies.
基金supported by the National Natural Science Foundation of China(Grant Nos.12272259 and 52005148).
文摘An intelligent diagnosis method based on self-adaptiveWasserstein dual generative adversarial networks and feature fusion is proposed due to problems such as insufficient sample size and incomplete fault feature extraction,which are commonly faced by rolling bearings and lead to low diagnostic accuracy.Initially,dual models of the Wasserstein deep convolutional generative adversarial network incorporating gradient penalty(1D-2DWDCGAN)are constructed to augment the original dataset.A self-adaptive loss threshold control training strategy is introduced,and establishing a self-adaptive balancing mechanism for stable model training.Subsequently,a diagnostic model based on multidimensional feature fusion is designed,wherein complex features from various dimensions are extracted,merging the original signal waveform features,structured features,and time-frequency features into a deep composite feature representation that encompasses multiple dimensions and scales;thus,efficient and accurate small sample fault diagnosis is facilitated.Finally,an experiment between the bearing fault dataset of CaseWestern ReserveUniversity and the fault simulation experimental platformdataset of this research group shows that this method effectively supplements the dataset and remarkably improves the diagnostic accuracy.The diagnostic accuracy after data augmentation reached 99.94%and 99.87%in two different experimental environments,respectively.In addition,robustness analysis is conducted on the diagnostic accuracy of the proposed method under different noise backgrounds,verifying its good generalization performance.
基金supported by the National Key R&D Program of China(Grant No.2022YFA1604200)National Natural Science Foundation of China(Grant No.12261131495)+1 种基金Beijing Municipal Science and Technology Commission,Adminitrative Commission of Zhongguancun Science Park(Grant No.Z231100006623006)Institute of Systems Science,Beijing Wuzi University(Grant No.BWUISS21)。
文摘Optical solitons,as self-sustaining waveforms in a nonlinear medium where dispersion and nonlinear effects are balanced,have key applications in ultrafast laser systems and optical communications.Physics-informed neural networks(PINN)provide a new way to solve the nonlinear Schrodinger equation describing the soliton evolution by fusing data-driven and physical constraints.However,the grid point sampling strategy of traditional PINN suffers from high computational complexity and unstable gradient flow,which makes it difficult to capture the physical details efficiently.In this paper,we propose a residual-based adaptive multi-distribution(RAMD)sampling method to optimize the PINN training process by dynamically constructing a multi-modal loss distribution.With a 50%reduction in the number of grid points,RAMD significantly reduces the relative error of PINN and,in particular,optimizes the solution error of the(2+1)Ginzburg–Landau equation from 4.55%to 1.98%.RAMD breaks through the lack of physical constraints in the purely data-driven model by the innovative combination of multi-modal distribution modeling and autonomous sampling control for the design of all-optical communication devices.RAMD provides a high-precision numerical simulation tool for the design of all-optical communication devices,optimization of nonlinear laser devices,and other studies.
基金the Ministry of Agriculture and Forestry key project“Puuta liikkeelle ja uusia tuotteita metsästä”(“Wood on the move and new products from forest”)Academy of Finland(project numbers 295100 , 306875).
文摘Background:The local pivotal method(LPM)utilizing auxiliary data in sample selection has recently been proposed as a sampling method for national forest inventories(NFIs).Its performance compared to simple random sampling(SRS)and LPM with geographical coordinates has produced promising results in simulation studies.In this simulation study we compared all these sampling methods to systematic sampling.The LPM samples were selected solely using the coordinates(LPMxy)or,in addition to that,auxiliary remote sensing-based forest variables(RS variables).We utilized field measurement data(NFI-field)and Multi-Source NFI(MS-NFI)maps as target data,and independent MS-NFI maps as auxiliary data.The designs were compared using relative efficiency(RE);a ratio of mean squared errors of the reference sampling design against the studied design.Applying a method in NFI also requires a proven estimator for the variance.Therefore,three different variance estimators were evaluated against the empirical variance of replications:1)an estimator corresponding to SRS;2)a Grafström-Schelin estimator repurposed for LPM;and 3)a Matérn estimator applied in the Finnish NFI for systematic sampling design.Results:The LPMxy was nearly comparable with the systematic design for the most target variables.The REs of the LPM designs utilizing auxiliary data compared to the systematic design varied between 0.74–1.18,according to the studied target variable.The SRS estimator for variance was expectedly the most biased and conservative estimator.Similarly,the Grafström-Schelin estimator gave overestimates in the case of LPMxy.When the RS variables were utilized as auxiliary data,the Grafström-Schelin estimates tended to underestimate the empirical variance.In systematic sampling the Matérn and Grafström-Schelin estimators performed for practical purposes equally.Conclusions:LPM optimized for a specific variable tended to be more efficient than systematic sampling,but all of the considered LPM designs were less efficient than the systematic sampling design for some target variables.The Grafström-Schelin estimator could be used as such with LPMxy or instead of the Matérn estimator in systematic sampling.Further studies of the variance estimators are needed if other auxiliary variables are to be used in LPM.
基金supported by the National Natural Science Foundation of China(62001481,61890542,62071475)the Natural Science Foundation of Hunan Province(2022JJ40561)the Research Program of National University of Defense Technology(ZK22-46).
文摘Nonperiodic interrupted sampling repeater jamming(ISRJ)against inverse synthetic aperture radar(ISAR)can obtain two-dimensional blanket jamming performance by joint fast and slow time domain interrupted modulation,which is obviously dif-ferent from the conventional multi-false-target deception jam-ming.In this paper,a suppression method against this kind of novel jamming is proposed based on inter-pulse energy function and compressed sensing theory.By utilizing the discontinuous property of the jamming in slow time domain,the unjammed pulse is separated using the intra-pulse energy function diffe-rence.Based on this,the two-dimensional orthogonal matching pursuit(2D-OMP)algorithm is proposed.Further,it is proposed to reconstruct the ISAR image with the obtained unjammed pulse sequence.The validity of the proposed method is demon-strated via the Yake-42 plane data simulations.
基金supported by grants from the Human Resources Development program(Grant No.20204010600250)the Training Program of CCUS for the Green Growth(Grant No.20214000000500)by the Korea Institute of Energy Technology Evaluation and Planning(KETEP)funded by the Ministry of Trade,Industry,and Energy of the Korean Government(MOTIE).
文摘Three-dimensional printing(3DP)offers valuable insight into the characterization of natural rocks and the verification of theoretical models due to its high reproducibility and accurate replication of complex defects such as cracks and pores.In this study,3DP gypsum samples with different printing directions were subjected to a series of uniaxial compression tests with in situ micro-computed tomography(micro-CT)scanning to quantitatively investigate their mechanical anisotropic properties and damage evolution characteristics.Based on the two-dimensional(2D)CT images obtained at different scanning steps,a novel void ratio variable was derived using the mean value and variance of CT intensity.Additionally,a constitutive model was formulated incorporating the proposed damage variable,utilizing the void ratio variable.The crack evolution and crack morphology of 3DP gypsum samples were obtained and analyzed using the 3D models reconstructed from the CT images.The results indicate that 3DP gypsum samples exhibit mechanical anisotropic characteristics similar to those found in naturally sedimentary rocks.The mechanical anisotropy is attributed to the bedding planes formed between adjacent layers and pillar-like structures along the printing direction formed by CaSO_(4)·2H_(2)O crystals of needle-like morphology.The mean gray intensity of the voids has a positive linear relationship with the threshold value,while the CT variance and void ratio have concave and convex relationships,respectively.The constitutive model can effectively match the stress–strain curves obtained from uniaxial compression experiments.This study provides comprehensive explanations of the failure modes and anisotropic mechanisms of 3DP gypsum samples,which is important for characterizing and understanding the failure mechanism and microstructural evolution of 3DP rocks when modeling natural rock behavior.
基金Supported by the National Natural Science Foundation of China(12064028)Jiangxi Provincial Natural Science Foundation(20232BAB201045).
文摘Electro-Optic Sampling(EOS)detection technique has been widely used in terahertz science and tech⁃nology,and it also can measure the field time waveform of the few-cycle laser pulse.Its frequency response and band limitation are determined directly by the electro-optic crystal and duration of the probe laser pulse.Here,we investigate the performance of the EOS with thin GaSe crystal in the measurement of the mid-infrared few-cycle la⁃ser pulse.The shift of the central frequency and change of the bandwidth induced by the EOS detection are calcu⁃lated,and then the pulse distortions induced in this detection process are discussed.It is found that this technique produces a red-shift of the central frequency and narrowing of the bandwidth.These changings decrease when the laser wavelength increases from 2μm to 10μm.This work can help to estimate the performance of the EOS de⁃tection technique in the mid-infrared band and offer a reference for the related experiment as well.
基金supported by the National Key Research and Development Program of China(2022YFB3103500)the National Natural Science Foundation of China(62473033,62571027)+1 种基金in part by the Beijing Natural Science Foundation(L231012)the State Scholarship Fund from the China Scholarship Council.
文摘Task-oriented point cloud sampling aims to select a representative subset from the input,tailored to specific application scenarios and task requirements.However,existing approaches rarely tackle the problem of redundancy caused by local structural similarities in 3D objects,which limits the performance of sampling.To address this issue,this paper introduces a novel task-oriented point cloud masked autoencoder-based sampling network(Point-MASNet),inspired by the masked autoencoder mechanism.Point-MASNet employs a voxel-based random non-overlapping masking strategy,which allows the model to selectively learn and capture distinctive local structural features from the input data.This approach effectively mitigates redundancy and enhances the representativeness of the sampled subset.In addition,we propose a lightweight,symmetrically structured keypoint reconstruction network,designed as an autoencoder.This network is optimized to efficiently extract latent features while enabling refined reconstructions.Extensive experiments demonstrate that Point-MASNet achieves competitive sampling performance across classification,registration,and reconstruction tasks.