Data hiding methods involve embedding secret messages into cover objects to enable covert communication in a way that is difficult to detect. In data hiding methods based on image interpolation, the image size is reduced and then enlarged through interpolation, followed by the embedding of secret data into the newly generated pixels. A general improvement approach for embedding secret messages is proposed. The approach may be regarded as a general model for enhancing the data embedding capacity of various existing image interpolation-based data hiding methods. This enhancement is achieved by expanding the range of pixel values available for embedding secret messages, removing the limitation of many existing methods, where the range is restricted to powers of two to facilitate the direct embedding of bit-based messages. This improvement is accomplished through the application of multiple-based number conversion to the secret message data. The method converts the message bits into a multiple-based number and uses an algorithm to embed each digit of this number into an individual pixel, thereby enhancing the message embedding efficiency, as proved by a theorem derived in this study. The proposed improvement method has been tested through experiments on three well-known image interpolation-based data hiding methods. The results show that the proposed method can enhance the three data embedding rates by approximately 14%, 13%, and 10%, respectively, create stego-images with good quality, and resist RS steganalysis attacks. These experimental results indicate that the use of the multiple-based number conversion technique to improve the three interpolation-based methods for embedding secret messages increases the number of message bits embedded in the images. For many other image interpolation-based data hiding methods that use power-of-two pixel-value ranges for message embedding, beyond the three tested here, the proposed improvement method is also expected to be effective in enhancing their data embedding capabilities.
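To make the multiple-based (mixed-radix) conversion idea above concrete, the sketch below converts a bit string into digits under per-pixel bases and writes each digit into a pixel by adjusting its residue modulo the base. It is only an illustration of the principle; the function names, the choice of bases, and the embedding rule are assumptions, not the algorithm or theorem of the cited study.

```python
# Illustrative sketch (not the paper's algorithm): convert a bit string into
# mixed-radix ("multiple-based") digits and embed one digit per interpolated
# pixel by adjusting the pixel value modulo its base. The bases are assumed
# to be chosen per pixel; here they are given explicitly.

def bits_to_mixed_radix(bits: str, bases: list[int]) -> list[int]:
    """Interpret the bit string as an integer and expand it in the given bases."""
    value = int(bits, 2)
    digits = []
    for base in bases:
        digits.append(value % base)
        value //= base
    return digits

def embed_digits(pixels: list[int], digits: list[int], bases: list[int]) -> list[int]:
    """Replace each pixel's residue modulo its base with the message digit."""
    stego = []
    for p, d, b in zip(pixels, digits, bases):
        stego.append(p - (p % b) + d)   # nearest value carrying digit d (mod b)
    return stego

if __name__ == "__main__":
    bases = [5, 6, 7, 5]                # per-pixel bases need not be powers of two
    digits = bits_to_mixed_radix("101101011", bases)
    print(embed_digits([120, 98, 200, 57], digits, bases))
```

Because the product of the bases (5 x 6 x 7 x 5 = 1050) exceeds 2^9, the nine message bits fit into four pixels; this is the capacity argument that the conversion relies on.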
The uniaxial compressive strength (UCS) of rocks is a vital geomechanical parameter widely used for rock mass classification, stability analysis, and engineering design in rock engineering. Various UCS testing methods and apparatuses have been proposed over the past few decades. The objective of the present study is to summarize the status and developments in the theories, test apparatuses, and data processing of the existing testing methods for UCS measurement. It starts with elaborating the theories of these test methods. Then the test apparatuses and development trends for UCS measurement are summarized, followed by a discussion on rock specimens for the test apparatuses and on data processing methods. Next, recommendations on method selection for UCS measurement are given. It reveals that the rock failure mechanisms in the UCS testing methods can be divided into compression-shear, compression-tension, composite failure mode, and no obvious failure mode. The trends of these apparatuses are towards automation, digitization, precision, and multi-modal testing. Two size correction methods are commonly used. One is to develop an empirical correlation between the measured indices and the specimen size. The other is to use a standard specimen to calculate the size correction factor. Three to five input parameters are commonly utilized in soft computing models to predict the UCS of rocks. The selection of the test methods for UCS measurement can be carried out according to the testing scenario and the specimen size. Engineers can gain a comprehensive understanding of the UCS testing methods and their potential developments in various rock engineering endeavors.
Snow cover in mountainous areas is characterized by high reflectivity, strong spatial heterogeneity, rapid changes, and susceptibility to cloud interference. However, due to the limitations of a single sensor, it is challenging to obtain high-resolution satellite remote sensing data for monitoring the dynamic changes of snow cover within a day. This study focuses on two typical data fusion methods for polar-orbiting satellites (Sentinel-3 SLSTR) and geostationary satellites (Himawari-9 AHI), and explores the snow cover detection accuracy of a multitemporal cloud-gap snow cover identification model (Loose data fusion) and the ESTARFM (Spatiotemporal data fusion). Taking the Qilian Mountains as the research area, the accuracy of the two data fusion results was verified using the snow cover extracted from Landsat-8 SR products. The results showed that both data fusion models could effectively capture the spatiotemporal variations of snow cover, but the ESTARFM demonstrated superior performance. It not only obtained fusion images at any target time, but also extracted snow cover that was closer to the spatial distribution of real satellite images. Therefore, the ESTARFM was utilized to fuse images for hourly reconstruction of the snow cover on February 14–15, 2023. It was found that the maximum snow cover area of this snowfall reached 83.84% of the Qilian Mountains area, and the snow melted extremely rapidly, with the snow-covered fraction of the study area changing by up to 4.30% per hour. This study offers reliable high spatiotemporal resolution satellite remote sensing data for monitoring snow cover changes in mountainous areas, contributing to more accurate and timely assessments.
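As a rough illustration of how an hourly melt rate such as the reported 4.30% per hour can be derived from fused snow maps, the sketch below computes the snow-covered fraction of a study area from binary snow masks at two times; the arrays, the valid-pixel mask, and the time gap are invented and do not reproduce the study's data.

```python
import numpy as np

# Toy sketch (values invented): snow-covered fraction of a study area from a
# binary snow map, and the hourly change rate between two acquisition times.
def snow_fraction(snow_map: np.ndarray, valid_mask: np.ndarray) -> float:
    """Fraction of valid (cloud-free, in-area) pixels flagged as snow."""
    return float(snow_map[valid_mask].mean())

rng = np.random.default_rng(0)
valid = np.ones((500, 500), dtype=bool)
snow_t0 = rng.random((500, 500)) < 0.80          # hypothetical earlier snow map
snow_t1 = rng.random((500, 500)) < 0.70          # hypothetical later snow map
hours_between = 3.0

f0, f1 = snow_fraction(snow_t0, valid), snow_fraction(snow_t1, valid)
print(f"snow cover: {f0:.2%} -> {f1:.2%}, "
      f"change rate {abs(f1 - f0) / hours_between:.2%} per hour")
```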
Accurate acquisition and prediction of acoustic parameters of seabed sediments are crucial in marine sound propagation research. While the relationship between sound velocity and the physical properties of sediment has been extensively studied, there is still no consensus on the correlation between the acoustic attenuation coefficient and sediment physical properties. Predicting the acoustic attenuation coefficient remains a challenging issue in sedimentary acoustic research. In this study, we propose a prediction method for the acoustic attenuation coefficient using machine learning algorithms, specifically the random forest (RF), support vector regression (SVR), and convolutional neural network (CNN) algorithms. We utilized the acoustic attenuation coefficient and sediment particle size data from 52 stations as training parameters, with the particle size parameters as the input feature matrix and the measured acoustic attenuation as the training label, to validate the attenuation prediction model. Our results indicate that the error of the attenuation prediction model is small. Among the three models, the RF model exhibited the lowest prediction error, with a mean squared error of 0.8232, a mean absolute error of 0.6613, and a root mean squared error of 0.9073. Additionally, when we applied the models to predict data collected at different times in the same region, we found that the models developed in this study also demonstrated a certain level of reliability in real prediction scenarios. Our approach demonstrates that constructing a sediment acoustic characteristics model based on machine learning is feasible to a certain extent and offers a novel perspective for studying sediment acoustic properties.
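The random-forest route described above can be sketched with scikit-learn as follows; the particle-size features and attenuation values here are synthetic stand-ins for the 52-station data set, and the hyperparameters are assumptions rather than those used in the study. The script reports the same three error metrics quoted in the abstract (MSE, MAE, RMSE).

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the particle-size feature matrix and measured attenuation.
rng = np.random.default_rng(42)
X = rng.normal(size=(52, 4))          # e.g. mean size, sorting, skewness, kurtosis
y = 2.0 + 1.5 * X[:, 0] - 0.8 * X[:, 1] + rng.normal(scale=0.5, size=52)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)

mse = mean_squared_error(y_te, pred)
mae = mean_absolute_error(y_te, pred)
print(f"MSE={mse:.4f}, MAE={mae:.4f}, RMSE={np.sqrt(mse):.4f}")
```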
Seismic data plays a pivotal role in fault detection, offering critical insights into subsurface structures and seismic hazards. Understanding fault detection from seismic data is essential for mitigating seismic risks and guiding land-use plans. This paper presents a comprehensive review of existing methodologies for fault detection, focusing on the application of Machine Learning (ML) and Deep Learning (DL) techniques to enhance accuracy and efficiency. Various ML and DL approaches are analyzed with respect to fault segmentation, adaptive learning, and fault detection models. These techniques, benchmarked against established seismic datasets, reveal significant improvements over classical methods in terms of accuracy and computational efficiency. Additionally, this review highlights emerging trends, including hybrid model applications and the integration of real-time data processing for seismic fault detection. By providing a detailed comparative analysis of current methodologies, this review aims to guide future research and foster advancements in the effectiveness and reliability of seismic studies. Ultimately, the study seeks to bridge the gap between theoretical investigations and practical implementations in fault detection.
When assessing seismic liquefaction potential with data-driven models, addressing the uncertainties in establishing models, interpreting cone penetration test (CPT) data, and setting the decision threshold is crucial for avoiding biased data selection, ameliorating overconfident models, and remaining flexible to varying practical objectives, especially when the training and testing data are not identically distributed. A workflow characterized by leveraging Bayesian methodology was proposed to address these issues. Employing a Multi-Layer Perceptron (MLP) as the foundational model, this approach was benchmarked against empirical methods and advanced algorithms for its efficacy in simplicity, accuracy, and resistance to overfitting. The analysis revealed that, while MLP models optimized via the maximum a posteriori algorithm suffice for straightforward scenarios, Bayesian neural networks showed great potential for preventing overfitting. Additionally, integrating decision thresholds through various evaluative principles offers insights for challenging decisions. Two case studies demonstrate the framework's capacity for nuanced interpretation of in situ data, employing a model committee for a detailed evaluation of liquefaction potential via Monte Carlo simulations and basic statistics. Overall, the proposed step-by-step workflow for analyzing seismic liquefaction incorporates multifold testing and real-world data validation, showing improved robustness against overfitting and greater versatility in addressing practical challenges. This research contributes to the seismic liquefaction assessment field by providing a structured, adaptable methodology for accurate and reliable analysis.
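A minimal sketch of the model-committee idea, assuming a small ensemble of scikit-learn MLP classifiers in place of the paper's Bayesian networks: the committee's mean predicted probability is compared against a decision threshold chosen for the practical objective. Features, labels, and the threshold value are invented.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Sketch of a model-committee evaluation: several MLPs trained with different
# seeds act as a crude ensemble; their mean predicted probability is compared
# with a decision threshold chosen to suit the practical objective.
rng = np.random.default_rng(1)
X = rng.normal(size=(400, 5))                        # stand-in CPT-derived features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.7, size=400) > 0).astype(int)

committee = [MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                           random_state=seed).fit(X, y) for seed in range(5)]

x_new = rng.normal(size=(1, 5))                      # a new site to be assessed
p = np.mean([m.predict_proba(x_new)[0, 1] for m in committee])
threshold = 0.3                                      # conservative screening value (illustrative)
print(f"liquefaction probability ~ {p:.2f}; flagged: {p > threshold}")
```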
Accurate determination of rockhead is crucial for underground construction. Traditionally, borehole data are mainly used for this purpose. However, borehole drilling is costly, time-consuming, and sparsely distributed. Non-invasive geophysical methods, particularly those using passive seismic surface waves, have emerged as viable alternatives for geological profiling and rockhead detection. This study proposes three interpretation methods for rockhead determination using passive seismic surface wave data from Microtremor Array Measurement (MAM) and Horizontal-to-Vertical Spectral Ratio (HVSR) tests. These are: (1) the Wavelength-Normalized phase velocity (WN) method, in which a nonlinear relationship between rockhead depth and wavelength is established; (2) the Statistically Determined shear wave velocity (SD-Vs) method, in which the representative Vs value for rockhead is automatically determined using a statistical method; and (3) the empirical HVSR method, in which the rockhead is determined by interpreting resonant frequencies using a reliably calibrated empirical equation. These methods were implemented to determine rockhead depths at 28 locations across two distinct geological formations in Singapore, and the results were evaluated against borehole data. The WN method can determine rockhead depths accurately and reliably with minimal absolute errors (average RMSE = 3.11 m), demonstrating robust performance across both geological formations. Its advantage lies in interpreting dispersion curves alone, without the need for an inversion process. The SD-Vs method is practical in engineering practice owing to its simplicity. The empirical HVSR method reasonably determines rockhead depths with moderate accuracy, benefiting from a reliably calibrated empirical equation.
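For the empirical HVSR route, the textbook quarter-wavelength relation h ≈ Vs/(4·f0) illustrates how a resonant frequency maps to a depth estimate; it is shown here only as a generic stand-in, not the calibrated empirical equation developed in the study, and the shear wave velocity and frequencies are assumed values.

```python
# Illustrative quarter-wavelength estimate of depth-to-bedrock from an HVSR
# resonant frequency: h ~ Vs / (4 * f0). This is the textbook relation, not the
# calibrated empirical equation developed in the study.
def rockhead_depth(f0_hz: float, vs_mps: float) -> float:
    return vs_mps / (4.0 * f0_hz)

for f0 in (2.0, 5.0, 10.0):                  # hypothetical resonant frequencies
    print(f"f0 = {f0:4.1f} Hz -> depth ~ {rockhead_depth(f0, 300.0):5.1f} m")
```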
Snow cover plays a critical role in global climate regulation and hydrological processes. Accurate monitoring is essential for understanding snow distribution patterns, managing water resources, and assessing the impacts of climate change. Remote sensing has become a vital tool for snow monitoring, with Moderate Resolution Imaging Spectroradiometer (MODIS) snow products from the Terra and Aqua satellites being widely used. However, cloud cover often interferes with snow detection, making cloud removal techniques crucial for reliable snow product generation. This study evaluated the accuracy of four MODIS snow cover datasets generated through different cloud removal algorithms. Using real-time field camera observations from four stations in the Tianshan Mountains, China, this study assessed the performance of these datasets during three distinct snow periods: the snow accumulation period (September–November), the snowmelt period (March–June), and the stable snow period (December–February of the following year). The findings showed that cloud-free snow products generated using the Hidden Markov Random Field (HMRF) algorithm consistently outperformed the others, particularly under cloud cover, while cloud-free snow products using near-day synthesis and the spatiotemporal adaptive fusion method with error correction (STAR) demonstrated varying performance depending on terrain complexity and cloud conditions. This study highlighted the importance of considering terrain features, land cover types, and snow dynamics when selecting cloud removal methods, particularly in areas with rapid snow accumulation and melting. The results suggested that future research should focus on improving cloud removal algorithms through the integration of machine learning, multi-source data fusion, and advanced remote sensing technologies. By expanding validation efforts and refining cloud removal strategies, more accurate and reliable snow products can be developed, contributing to enhanced snow monitoring and better management of water resources in alpine and arid areas.
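A toy example of the kind of station-level validation described above: a cloud-free snow product is scored against daily camera observations with a confusion matrix, yielding overall accuracy and omission/commission errors. The observation sequences are invented and the metric choice is an assumption, not the study's exact protocol.

```python
import numpy as np

# Toy validation of a cloud-free snow product against daily camera observations
# at one station (1 = snow, 0 = no snow); the values are invented.
camera  = np.array([1, 1, 1, 0, 0, 1, 0, 0, 1, 1])
product = np.array([1, 1, 0, 0, 0, 1, 1, 0, 1, 1])

tp = np.sum((product == 1) & (camera == 1))
tn = np.sum((product == 0) & (camera == 0))
fp = np.sum((product == 1) & (camera == 0))
fn = np.sum((product == 0) & (camera == 1))

print(f"overall accuracy = {(tp + tn) / camera.size:.2f}")
print(f"omission error (missed snow) = {fn / (tp + fn):.2f}")
print(f"commission error (false snow) = {fp / (tp + fp):.2f}")
```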
Gastric cancer (GC), the fifth most common cancer globally, remains a leading cause of cancer deaths worldwide. Inflammation-induced tumorigenesis is the predominant process in GC development; therefore, systematic research in this area should improve understanding of the biological mechanisms that initiate GC development and promote cancer hallmarks. Here, we summarize biological knowledge regarding gastric inflammation-induced tumorigenesis, and characterize the multi-omics data and systems biology methods for investigating GC development. Of note, we highlight pioneering studies in multi-omics data and state-of-the-art network-based algorithms used for dissecting the features of gastric inflammation-induced tumorigenesis, and we propose translational applications in early GC warning biomarkers and precise treatment strategies. This review offers integrative insights for GC research, with the goal of paving the way to novel paradigms for GC precision oncology and prevention.
Cable-stayed bridges have been widely used in high-speed railway infrastructure. The accurate determination of the cables' representative temperatures is vital during the intricate processes of design, construction, and maintenance of cable-stayed bridges. However, the representative temperatures of stay cables are not specified in the existing design codes. To address this issue, this study investigates the distribution of the cable temperature and determines its representative temperature. First, an experimental investigation, spanning a period of one year, was carried out near the bridge site to obtain the temperature data. Statistical analysis of the measured data reveals that the temperature distribution is generally uniform along the cable cross-section, without a significant temperature gradient. Then, based on the limited data, the Monte Carlo, gradient boosted regression trees (GBRT), and univariate linear regression (ULR) methods are employed to predict the cable's representative temperature throughout the service life. These methods effectively overcome the limitations of insufficient monitoring data and accurately predict the representative temperature of the cables. However, each method has its own advantages and limitations in terms of applicability and accuracy. A comprehensive evaluation of the performance of these methods is conducted, and practical recommendations are provided for their application. The proposed methods and representative temperatures provide a good basis for the operation and maintenance of in-service long-span cable-stayed bridges.
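The GBRT route can be sketched as below, assuming cable temperature is regressed on ambient air temperature and solar radiation over a year of synthetic hourly records; the features, data, and hyperparameters are illustrative assumptions, not the monitored bridge data.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Hypothetical sketch of the GBRT route: predict cable temperature from ambient
# air temperature and solar radiation using one year of (synthetic) hourly data.
rng = np.random.default_rng(7)
air = rng.uniform(-10, 35, size=8760)                # ambient air temperature (degC)
rad = rng.uniform(0, 1000, size=8760)                # solar radiation (W/m^2)
cable = air + 0.01 * rad + rng.normal(scale=1.0, size=8760)

X = np.column_stack([air, rad])
model = GradientBoostingRegressor(n_estimators=200, max_depth=3).fit(X, cable)

print("predicted cable temperature:", model.predict([[30.0, 800.0]])[0])
```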
Iced transmission line galloping poses a significant threat to the safety and reliability of power systems, leading directly to line tripping, disconnections, and power outages. Existing early warning methods for iced transmission line galloping suffer from issues such as reliance on a single data source, neglect of irregular time series, and lack of attention-based closed-loop feedback, resulting in high rates of missed and false alarms. To address these challenges, we propose an Internet of Things (IoT) empowered early warning method for transmission line galloping that integrates time series data from optical fiber sensing and weather forecasts. Initially, the method applies a primary adaptive weighted fusion to the IoT-empowered optical fiber real-time sensing data and the weather forecast data, followed by a secondary fusion based on a Back Propagation (BP) neural network, and uses the K-medoids algorithm to cluster the fused data. Furthermore, an adaptive irregular time series perception adjustment module is introduced into the traditional Gated Recurrent Unit (GRU) network, and closed-loop feedback based on an attention mechanism is employed to update network parameters through gradient feedback of the loss function, enabling closed-loop training and time series data prediction by the GRU network model. Subsequently, considering the various types of prediction data and the duration of icing, an iced transmission line galloping risk coefficient is established, and warnings are categorized based on this coefficient. Finally, using an IoT-driven realistic dataset of iced transmission line galloping, the effectiveness of the proposed method is validated through multi-dimensional simulation scenarios.
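As a simplified illustration of the primary adaptive weighted fusion step, the sketch below blends the sensing and forecast series with weights inversely proportional to assumed error variances; it omits the BP-network secondary fusion, K-medoids clustering, and GRU prediction, and all numbers are invented.

```python
import numpy as np

# Minimal sketch of a variance-based adaptive weighted fusion of two sources
# (optical fiber sensing vs. weather forecast); weights are inversely
# proportional to each source's assumed error variance. Purely illustrative.
def adaptive_fuse(sensing: np.ndarray, forecast: np.ndarray,
                  var_sensing: float, var_forecast: float) -> np.ndarray:
    w_s = (1.0 / var_sensing) / (1.0 / var_sensing + 1.0 / var_forecast)
    return w_s * sensing + (1.0 - w_s) * forecast

sensing = np.array([4.1, 4.4, 5.0, 5.6])      # e.g. measured ice-related signal
forecast = np.array([3.8, 4.6, 5.3, 5.5])     # e.g. forecast-derived estimate
print(adaptive_fuse(sensing, forecast, var_sensing=0.2, var_forecast=0.5))
```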
Hydrogen is the new-age alternative energy source to combat energy demand and climate change. Storage of hydrogen is vital for a nation's growth. The literature provides different methods for storing the produced hydrogen, and the rational selection of a viable method is crucial for promoting sustainability and green practices. Typically, hydrogen storage is associated with diverse sustainable and circular economy (SCE) criteria. As a result, the authors consider the situation a multi-criteria decision-making (MCDM) problem. Studies infer that previous models for hydrogen storage method (HSM) selection (i) do not consider preferences in natural language form; (ii) do not methodically determine the weights of experts; (iii) do not effectively explore the hesitation of experts during criteria weight assessment; and (iv) leave a three-stage solution for the suitable selection of an HSM unexplored. Driven by these gaps, in this paper the authors put forward a new integrated framework, which considers double hierarchy linguistic information for rating, criteria importance through inter-criteria correlation (CRITIC) for expert weight calculation, an evidence-based Bayesian method for criteria weight estimation, and the combined compromise solution (CoCoSo) for ranking HSMs. The applicability of the developed framework is demonstrated using a case example of HSM selection in India. Sensitivity and comparative analyses reveal the merits and limitations of the developed framework.
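The CRITIC step of the framework can be sketched as follows: each criterion's weight is proportional to its standard deviation multiplied by its summed (1 − correlation) with the other criteria, computed on a normalized decision matrix. The matrix below is invented and all criteria are assumed to be benefit-type.

```python
import numpy as np

# CRITIC weighting sketch: criterion weight proportional to its standard
# deviation multiplied by the sum of (1 - correlation) with the other criteria.
# The decision matrix below (alternatives x criteria) is invented.
X = np.array([[0.7, 0.4, 0.9],
              [0.5, 0.8, 0.6],
              [0.9, 0.3, 0.7],
              [0.6, 0.6, 0.8]])

# Min-max normalise each criterion (benefit-type assumed for all columns).
Xn = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
sigma = Xn.std(axis=0, ddof=1)
corr = np.corrcoef(Xn, rowvar=False)
info = sigma * (1.0 - corr).sum(axis=0)     # information carried by each criterion
weights = info / info.sum()
print("CRITIC weights:", np.round(weights, 3))
```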
In recent decades, control performance monitoring (CPM) has experienced remarkable progress in research and industrial applications. While CPM research has been investigated using various benchmarks, the historical data benchmark (HIS) has garnered the most attention due to its practicality and effectiveness. However, existing CPM reviews usually focus on the theoretical benchmark, and there is a lack of an in-depth review that thoroughly explores HIS-based methods. In this article, a comprehensive overview of HIS-based CPM is provided. First, we provide a novel static-dynamic perspective on the data-level manifestations of control performance underlying typical controller capacities, including regulation and servo: static and dynamic properties. The static property portrays time-independent variability in the system output, and the dynamic property describes temporal behavior driven by closed-loop feedback. Accordingly, existing HIS-based CPM approaches and their intrinsic motivations are classified and analyzed from these two perspectives. Specifically, two mainstream solutions for CPM methods are summarized, static analysis and dynamic analysis, which match data-driven techniques with actual control behavior. Furthermore, this paper also points out various opportunities and challenges faced by CPM in modern industry and provides promising directions in the context of artificial intelligence to inspire future research.
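One common flavor of a HIS-based static index, shown only as an illustration of the static-property idea rather than any specific method from this review, compares the output variance of a historical benchmark period with that of the current period:

```python
import numpy as np

# Sketch of a historical-data (HIS) style static performance index: ratio of
# the benchmark-period output variance (a period judged to be well controlled)
# to the current-period variance. Values near 1 suggest benchmark-like
# performance; values well below 1 suggest degradation. Data are synthetic.
rng = np.random.default_rng(3)
benchmark_output = rng.normal(scale=1.0, size=2000)    # historical "golden" period
current_output   = rng.normal(scale=1.6, size=2000)    # current operation

eta = np.var(benchmark_output) / np.var(current_output)
print(f"static performance index eta = {eta:.2f}")
```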
Viral infectious diseases, characterized by their intricate nature and wide-ranging diversity, pose substantial challenges in the domain of data management. The vast volume of data generated by these diseases, spanning from the molecular mechanisms within cells to large-scale epidemiological patterns, has surpassed the capabilities of traditional analytical methods. In the era of artificial intelligence (AI) and big data, there is an urgent necessity to optimize these analytical methods so that the information can be handled and utilized more effectively. Despite the rapid accumulation of data associated with viral infections, the lack of a comprehensive framework for integrating, selecting, and analyzing these datasets has left numerous researchers uncertain about which data to select, how to access them, and how to utilize them most effectively in their research. This review endeavors to fill these gaps by exploring the multifaceted nature of viral infectious diseases and summarizing relevant data across multiple levels, from the molecular details of pathogens to broad epidemiological trends. The scope extends from the micro-scale to the macro-scale, encompassing pathogens, hosts, and vectors. In addition to data summarization, this review thoroughly investigates various dataset sources. It also traces the historical evolution of data collection in the field of viral infectious diseases, highlighting the progress achieved over time. Simultaneously, it evaluates the current limitations that impede data utilization. Furthermore, we propose strategies to surmount these challenges, focusing on the development and application of advanced computational techniques, AI-driven models, and enhanced data integration practices. By providing a comprehensive synthesis of existing knowledge, this review is designed to guide future research and contribute to more informed approaches in the surveillance, prevention, and control of viral infectious diseases, particularly within the context of the expanding big-data landscape.
The Intelligent Internet of Things (IIoT) involves real-world things that communicate or interact with each other through networking technologies, by collecting data from these “things” and using intelligent approaches, such as Artificial Intelligence (AI) and machine learning, to make accurate decisions. Data science is the science of dealing with data and its relationships through intelligent approaches. Most state-of-the-art research focuses independently on either data science or IIoT, rather than exploring their integration. Therefore, to address this gap, this article provides a comprehensive survey on the advances in, and integration of, data science with the Intelligent IoT (IIoT) system, by classifying the existing IoT-based data science techniques and presenting a summary of their various characteristics. The paper analyzes data science or big data security and privacy features, including network architecture, data protection, and continuous monitoring of data, which face challenges in various IoT-based systems. Extensive insights into IoT data security, privacy, and challenges are visualized in the context of data science for IoT. In addition, this study reveals current opportunities to enhance data science and IoT market development. The current gaps and challenges faced in the integration of data science and IoT are comprehensively presented, followed by the future outlook and possible solutions.
Air pollution in China covers a large area with complex sources and formation mechanisms, making it a unique place to conduct air pollution and atmospheric chemistry research. The National Natural Science Foundation of China’s Major Research Plan entitled “Fundamental Researches on the Formation and Response Mechanism of the Air Pollution Complex in China” (or the Plan) has funded 76 research projects to explore the causes of air pollution in China, and the key processes of air pollution in atmospheric physics and atmospheric chemistry. In order to summarize the abundant data from the Plan and exhibit the long-term impacts domestically and internationally, an integration project is responsible for collecting the various types of data generated by the 76 projects of the Plan. This project has classified and integrated these data, forming eight categories containing 258 datasets and 15 technical reports in total. The integration project has led to the successful establishment of the China Air Pollution Data Center (CAPDC) platform, providing storage, retrieval, and download services for the eight categories. This platform has distinct features including data visualization, related project information querying, and bilingual services in both English and Chinese, which allows for rapid searching and downloading of data and provides a solid foundation of data and support for future related research. Air pollution control in China, especially in the past decade, is undeniably a global exemplar, and this data center is the first in China to focus on research into the country’s air pollution complex.
Lunar wrinkle ridges are an important stress geological structure on the Moon, which reflect the stress state and geological activity on the Moon. They provide important insights into the evolution of the Moon and are key factors influencing future lunar activity, such as the choice of landing sites. However, automatic extraction of lunar wrinkle ridges is a challenging task due to their complex morphology and ambiguous features. Traditional manual extraction methods are time-consuming and labor-intensive. To achieve automated and detailed detection of lunar wrinkle ridges, we have constructed a lunar wrinkle ridge data set, incorporating previously unused aspect data to provide edge information, and proposed a Dual-Branch Ridge Detection Network (DBR-Net) based on deep learning technology. This method employs a dual-branch architecture and an Attention Complementary Feature Fusion module to address the issue of insufficient lunar wrinkle ridge features. Through comparisons with the results of various deep learning approaches, it is demonstrated that the proposed method exhibits superior detection performance. Furthermore, the trained model was applied to lunar mare regions, generating a distribution map of lunar mare wrinkle ridges; a significant linear relationship between the length and area of the lunar wrinkle ridges was obtained through statistical analysis, and six previously unrecorded potential lunar wrinkle ridges were detected. The proposed method upgrades the automated extraction of lunar wrinkle ridges to pixel-level precision and verifies the effectiveness of DBR-Net in lunar wrinkle ridge detection.
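The reported length–area relationship can be checked with an ordinary least-squares fit of the kind sketched below; the ridge lengths and areas are invented, not the catalogue produced by DBR-Net.

```python
import numpy as np

# Least-squares check of a linear length-area relationship for detected ridges;
# the lengths and areas below are invented, not the catalogue from the study.
length_km = np.array([12.0, 25.0, 40.0, 55.0, 80.0, 110.0])
area_km2  = np.array([30.0, 70.0, 115.0, 150.0, 230.0, 320.0])

slope, intercept = np.polyfit(length_km, area_km2, deg=1)
r = np.corrcoef(length_km, area_km2)[0, 1]
print(f"area ~ {slope:.2f} * length + {intercept:.2f}, r = {r:.3f}")
```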
BFOSC and YFOSC are the most frequently used instruments on the Xinglong 2.16 m telescope and the Lijiang 2.4 m telescope, respectively. We developed a software package named “BYSpec” (BFOSC and YFOSC Spectra Reduction Package) dedicated to automatically reducing the long-slit and echelle spectra obtained by these two instruments. The package supports bias and flat-fielding correction, order location, background subtraction, automatic wavelength calibration, and absolute flux calibration. The optimal extraction method maximizes the signal-to-noise ratio and removes most of the cosmic rays imprinted in the spectra. A comparison with the 1D spectra reduced with IRAF verifies the reliability of the results. This open-source software is publicly available to the community.
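The first two reduction steps the package performs, bias subtraction and flat-field normalization, can be sketched schematically as below; the frames are synthetic and the snippet is not BYSpec code.

```python
import numpy as np

# Schematic bias and flat-field correction, the first steps of a long-slit
# reduction; the frames here are synthetic and this is not BYSpec code.
rng = np.random.default_rng(5)
ny, nx = 100, 100
response = np.outer(np.hanning(ny) + 0.5, np.hanning(nx) + 0.5)   # pixel sensitivity
bias = 300.0 + rng.normal(scale=2.0, size=(ny, nx))               # bias frame

flat_frame    = 20000.0 * response + bias            # exposure of a uniform lamp
science_frame = 5000.0 * response + bias              # object frame with the same response

flat_norm = (flat_frame - bias) / np.median(flat_frame - bias)
reduced = (science_frame - bias) / flat_norm           # bias-subtracted, flat-fielded
print(f"flat-fielded level: {reduced.mean():.1f} +/- {reduced.std():.1f} counts")
```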
As a new type of production factor in healthcare, healthcare data elements have been rapidly integrated into various health production processes, such as clinical assistance, health management, biological testing, and operation and supervision[1,2]. Healthcare data elements include biological and clinical data that are related to disease, environmental health data that are associated with life, and operational and healthcare management data that are related to healthcare activities (Figure 1). Activities such as the construction of a data value assessment system, the development of a data circulation and sharing platform, and the authorization of data compliance and operation products support the strong growth momentum of the market for healthcare data elements in China[3].
When inverting the S-wave velocity and azimuthal anisotropy from ambient noise data, partially overlapping inversion results are often obtained in contiguous regions. Merging the different datasets to achieve a consistent model therefore becomes an essential requirement. Based on the S-wave velocity and azimuthal anisotropy obtained from different contiguous regions, this paper introduces three methods for merging the data. For data from different regions with partially overlapping areas, the merged results can be calculated by direct average weighting (DAW), linear dynamic weighting (LDW), and Gaussian function weighting (GFW), respectively. Data tests demonstrate that the LDW and GFW methods can effectively merge the data by reasonably allocating data weights to capitalize on the data quality advantages in each zone. In particular, they can resolve the data smoothness at the boundaries of the data areas, resulting in a consistent data model over larger regions. This paper presents effective methods and practical experience that can serve as a reference for advancing data merging technology.
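A one-dimensional sketch of the three weighting schemes in an overlap zone is given below: DAW uses equal weights, LDW lets one model's weight decay linearly across the overlap, and GFW uses a Gaussian decay. The velocity profiles and the Gaussian width are assumed for illustration.

```python
import numpy as np

# 1-D illustration of merging two overlapping velocity profiles A and B across
# an overlap zone, with direct-average (DAW), linear-dynamic (LDW), and
# Gaussian-function (GFW) weights. Profiles and the Gaussian width are invented.
x = np.linspace(0.0, 1.0, 11)          # normalised position across the overlap (A side -> B side)
v_a = 3.40 + 0.02 * x                  # model A in the overlap (km/s)
v_b = 3.50 - 0.03 * x                  # model B in the overlap (km/s)

w_daw = np.full_like(x, 0.5)                       # equal weights everywhere
w_ldw = 1.0 - x                                    # A's weight decays linearly toward B's side
w_gfw = np.exp(-(x / 0.4) ** 2)                    # Gaussian decay of A's weight (width assumed)

for name, w in [("DAW", w_daw), ("LDW", w_ldw), ("GFW", w_gfw)]:
    merged = w * v_a + (1.0 - w) * v_b
    print(name, np.round(merged, 3))
```

Note that the LDW and GFW weights reduce to the pure A model at one edge of the overlap and (nearly) the pure B model at the other, which is what keeps the merged model smooth at the zone boundaries.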
文摘Data hiding methods involve embedding secret messages into cover objects to enable covert communication in a way that is difficult to detect.In data hiding methods based on image interpolation,the image size is reduced and then enlarged through interpolation,followed by the embedding of secret data into the newly generated pixels.A general improving approach for embedding secret messages is proposed.The approach may be regarded a general model for enhancing the data embedding capacity of various existing image interpolation-based data hiding methods.This enhancement is achieved by expanding the range of pixel values available for embedding secret messages,removing the limitations of many existing methods,where the range is restricted to powers of two to facilitate the direct embedding of bit-based messages.This improvement is accomplished through the application of multiple-based number conversion to the secret message data.The method converts the message bits into a multiple-based number and uses an algorithm to embed each digit of this number into an individual pixel,thereby enhancing the message embedding efficiency,as proved by a theorem derived in this study.The proposed improvement method has been tested through experiments on three well-known image interpolation-based data hiding methods.The results show that the proposed method can enhance the three data embedding rates by approximately 14%,13%,and 10%,respectively,create stego-images with good quality,and resist RS steganalysis attacks.These experimental results indicate that the use of the multiple-based number conversion technique to improve the three interpolation-based methods for embedding secret messages increases the number of message bits embedded in the images.For many image interpolation-based data hiding methods,which use power-of-two pixel-value ranges for message embedding,other than the three tested ones,the proposed improvement method is also expected to be effective for enhancing their data embedding capabilities.
基金the National Natural Science Foundation of China(Grant Nos.52308403 and 52079068)the Yunlong Lake Laboratory of Deep Underground Science and Engineering(No.104023005)the China Postdoctoral Science Foundation(Grant No.2023M731998)for funding provided to this work.
文摘The uniaxial compressive strength(UCS)of rocks is a vital geomechanical parameter widely used for rock mass classification,stability analysis,and engineering design in rock engineering.Various UCS testing methods and apparatuses have been proposed over the past few decades.The objective of the present study is to summarize the status and development in theories,test apparatuses,data processing of the existing testing methods for UCS measurement.It starts with elaborating the theories of these test methods.Then the test apparatus and development trends for UCS measurement are summarized,followed by a discussion on rock specimens for test apparatus,and data processing methods.Next,the method selection for UCS measurement is recommended.It reveals that the rock failure mechanism in the UCS testing methods can be divided into compression-shear,compression-tension,composite failure mode,and no obvious failure mode.The trends of these apparatuses are towards automation,digitization,precision,and multi-modal test.Two size correction methods are commonly used.One is to develop empirical correlation between the measured indices and the specimen size.The other is to use a standard specimen to calculate the size correction factor.Three to five input parameters are commonly utilized in soft computation models to predict the UCS of rocks.The selection of the test methods for the UCS measurement can be carried out according to the testing scenario and the specimen size.The engineers can gain a comprehensive understanding of the UCS testing methods and its potential developments in various rock engineering endeavors.
基金funded by the National Natural Science Foundation of China(42361058)supported by the Science and Technology Program of Gansu Province(22YF7FA074)。
文摘Snow cover in mountainous areas is characterized by high reflectivity,strong spatial heterogeneity,rapid changes,and susceptibility to cloud interference.However,due to the limitations of a single sensor,it is challenging to obtain high-resolution satellite remote sensing data for monitoring the dynamic changes of snow cover within a day.This study focuses on two typical data fusion methods for polar-orbiting satellites(Sentinel-3 SLSTR)and geostationary satellites(Himawari-9 AHI),and explores the snow cover detection accuracy of a multitemporal cloud-gap snow cover identification model(Loose data fusion)and the ESTARFM(Spatiotemporal data fusion).Taking the Qilian Mountains as the research area,the accuracy of two data fusion results was verified using the snow cover extracted from Landsat-8 SR products.The results showed that both data fusion models could effectively capture the spatiotemporal variations of snow cover,but the ESTARFM demonstrated superior performance.It not only obtained fusion images at any target time,but also extracted snow cover that was closer to the spatial distribution of real satellite images.Therefore,the ESTARFM was utilized to fuse images for hourly reconstruction of the snow cover on February 14–15,2023.It was found that the maximum snow cover area of this snowfall reached 83.84%of the Qilian Mountains area,and the melting rate of the snow was extremely rapid,with a change of up to 4.30%per hour of the study area.This study offers reliable high spatiotemporal resolution satellite remote sensing data for monitoring snow cover changes in mountainous areas,contributing to more accurate and timely assessments.
基金funded by the Basic Scientific Fund for National Public Research Institutes of China(No.2022 S01)the National Natural Science Foundation of China(Nos.42176191,42049902,and U22A2012)+5 种基金the Shandong Provincial Natural Science Foundation,China(No.ZR2022YQ40)the National Key R&D Program of China(No.2021YFF0501202)the Southern Marine Science and Engineering Guangdong Laboratory(Zhuhai)(No.SML2023 SP232)the Fundamental Research Funds for the Central Universities,Sun Yat-sen University(No.241gqb006)Data acquisition and sample collections were supported by the National Natural Science Foundation of China Open Research Cruise(Cruise No.NORC2021-02+NORC2021301)funded by the Shiptime Sharing Project of the National Natural Science Foundation of China。
文摘Accurate acquisition and prediction of acoustic parameters of seabed sediments are crucial in marine sound propagation research.While the relationship between sound velocity and physical properties of sediment has been extensively studied,there is still no consensus on the correlation between acoustic attenuation coefficient and sediment physical properties.Predicting the acoustic attenuation coefficient remains a challenging issue in sedimentary acoustic research.In this study,we propose a prediction method for the acoustic attenuation coefficient using machine learning algorithms,specifically the random forest(RF),support vector machine(SVR),and convolutional neural network(CNN)algorithms.We utilized the acoustic attenuation coefficient and sediment particle size data from 52 stations as training parameters,with the particle size parameters as the input feature matrix,and measured acoustic attenuation as the training label to validate the attenuation prediction model.Our results indicate that the error of the attenuation prediction model is small.Among the three models,the RF model exhibited the lowest prediction error,with a mean squared error of 0.8232,mean absolute error of 0.6613,and root mean squared error of 0.9073.Additionally,when we applied the models to predict the data collected at different times in the same region,we found that the models developed in this study also demonstrated a certain level of reliability in real prediction scenarios.Our approach demonstrates that constructing a sediment acoustic characteristics model based on machine learning is feasible to a certain extent and offers a novel perspective for studying sediment acoustic properties.
文摘Seismic data plays a pivotal role in fault detection,offering critical insights into subsurface structures and seismic hazards.Understanding fault detection from seismic data is essential for mitigating seismic risks and guiding land-use plans.This paper presents a comprehensive review of existing methodologies for fault detection,focusing on the application of Machine Learning(ML)and Deep Learning(DL)techniques to enhance accuracy and efficiency.Various ML and DL approaches are analyzed with respect to fault segmentation,adaptive learning,and fault detection models.These techniques,benchmarked against established seismic datasets,reveal significant improvements over classical methods in terms of accuracy and computational efficiency.Additionally,this review highlights emerging trends,including hybrid model applications and the integration of real-time data processing for seismic fault detection.By providing a detailed comparative analysis of current methodologies,this review aims to guide future research and foster advancements in the effectiveness and reliability of seismic studies.Ultimately,the study seeks to bridge the gap between theoretical investigations and practical implementations in fault detection.
文摘When assessing seismic liquefaction potential with data-driven models,addressing the uncertainties of establishing models,interpreting cone penetration tests(CPT)data and decision threshold is crucial for avoiding biased data selection,ameliorating overconfident models,and being flexible to varying practical objectives,especially when the training and testing data are not identically distributed.A workflow characterized by leveraging Bayesian methodology was proposed to address these issues.Employing a Multi-Layer Perceptron(MLP)as the foundational model,this approach was benchmarked against empirical methods and advanced algorithms for its efficacy in simplicity,accuracy,and resistance to overfitting.The analysis revealed that,while MLP models optimized via maximum a posteriori algorithm suffices for straightforward scenarios,Bayesian neural networks showed great potential for preventing overfitting.Additionally,integrating decision thresholds through various evaluative principles offers insights for challenging decisions.Two case studies demonstrate the framework's capacity for nuanced interpretation of in situ data,employing a model committee for a detailed evaluation of liquefaction potential via Monte Carlo simulations and basic statistics.Overall,the proposed step-by-step workflow for analyzing seismic liquefaction incorporates multifold testing and real-world data validation,showing improved robustness against overfitting and greater versatility in addressing practical challenges.This research contributes to the seismic liquefaction assessment field by providing a structured,adaptable methodology for accurate and reliable analysis.
基金partially supported by the Singapore Ministry of National Development and the National Research Foundation,Prime Minister’s Office,Singapore,under the Land and Liveability National Innovation Challenge(L2 NIC)Research Program(Grant No.L2NICCFP2-2015-1)by the National Research Foundation(NRF)of Singapore,under the Virtual Singapore program(Grant No.NRF2019VSG-GMS-001).
文摘Accurate determination of rockhead is crucial for underground construction.Traditionally,borehole data are mainly used for this purpose.However,borehole drilling is costly,time-consuming,and sparsely distributed.Non-invasive geophysical methods,particularly those using passive seismic surface waves,have emerged as viable alternatives for geological profiling and rockhead detection.This study proposes three interpretation methods for rockhead determination using passive seismic surface wave data from Microtremor Array Measurement(MAM)and Horizontal-to-Vertical Spectral Ratio(HVSR)tests.These are:(1)the Wavelength-Normalized phase velocity(WN)method in which a nonlinear relationship between rockhead depth and wavelength is established;(2)the Statistically Determined-shear wave velocity(SD-V_(s))method in which the representative V_(s) value for rockhead is automatically determined using a statistical method;and(3)the empirical HVSR method in which the rockhead is determined by interpreting resonant frequencies using a reliably calibrated empirical equation.These methods were implemented to determine rockhead depths at 28 locations across two distinct geological formations in Singapore,and the results were evaluated using borehole data.The WN method can determine rockhead depths accurately and reliably with minimal absolute errors(average RMSE=3.11 m),demonstrating robust performance across both geological formations.Its advantage lies in interpreting dispersion curves alone,without the need for the inversion process.The SD-V_(s) method is practical in engineering practice owing to its simplicity.The empirical HVSR method reasonably determines rockhead depths with moderate accuracy,benefiting from a reliably calibrated empirical equation.
基金funded by the Third Xinjiang Scientific Expedition Program(2021xjkk1400)the National Natural Science Foundation of China(42071049)+2 种基金the Natural Science Foundation of Xinjiang Uygur Autonomous Region(2019D01C022)the Xinjiang Uygur Autonomous Region Innovation Environment Construction Special Project&Science and Technology Innovation Base Construction Project(PT2107)the Tianshan Talent-Science and Technology Innovation Team(2022TSYCTD0006).
文摘Snow cover plays a critical role in global climate regulation and hydrological processes.Accurate monitoring is essential for understanding snow distribution patterns,managing water resources,and assessing the impacts of climate change.Remote sensing has become a vital tool for snow monitoring,with the widely used Moderate-resolution Imaging Spectroradiometer(MODIS)snow products from the Terra and Aqua satellites.However,cloud cover often interferes with snow detection,making cloud removal techniques crucial for reliable snow product generation.This study evaluated the accuracy of four MODIS snow cover datasets generated through different cloud removal algorithms.Using real-time field camera observations from four stations in the Tianshan Mountains,China,this study assessed the performance of these datasets during three distinct snow periods:the snow accumulation period(September-November),snowmelt period(March-June),and stable snow period(December-February in the following year).The findings showed that cloud-free snow products generated using the Hidden Markov Random Field(HMRF)algorithm consistently outperformed the others,particularly under cloud cover,while cloud-free snow products using near-day synthesis and the spatiotemporal adaptive fusion method with error correction(STAR)demonstrated varying performance depending on terrain complexity and cloud conditions.This study highlighted the importance of considering terrain features,land cover types,and snow dynamics when selecting cloud removal methods,particularly in areas with rapid snow accumulation and melting.The results suggested that future research should focus on improving cloud removal algorithms through the integration of machine learning,multi-source data fusion,and advanced remote sensing technologies.By expanding validation efforts and refining cloud removal strategies,more accurate and reliable snow products can be developed,contributing to enhanced snow monitoring and better management of water resources in alpine and arid areas.
基金supported by funds from the National Natural Science Foundation of China (Grant No. T2341008)。
文摘Gastric cancer(GC), the fifth most common cancer globally, remains the leading cause of cancer deaths worldwide. Inflammation-induced tumorigenesis is the predominant process in GC development;therefore, systematic research in this area should improve understanding of the biological mechanisms that initiate GC development and promote cancer hallmarks. Here, we summarize biological knowledge regarding gastric inflammation-induced tumorigenesis, and characterize the multi-omics data and systems biology methods for investigating GC development. Of note, we highlight pioneering studies in multi-omics data and state-of-the-art network-based algorithms used for dissecting the features of gastric inflammation-induced tumorigenesis, and we propose translational applications in early GC warning biomarkers and precise treatment strategies. This review offers integrative insights for GC research, with the goal of paving the way to novel paradigms for GC precision oncology and prevention.
基金Project(2017G006-N)supported by the Project of Science and Technology Research and Development Program of China Railway Corporation。
文摘Cable-stayed bridges have been widely used in high-speed railway infrastructure.The accurate determination of cable’s representative temperatures is vital during the intricate processes of design,construction,and maintenance of cable-stayed bridges.However,the representative temperatures of stayed cables are not specified in the existing design codes.To address this issue,this study investigates the distribution of the cable temperature and determinates its representative temperature.First,an experimental investigation,spanning over a period of one year,was carried out near the bridge site to obtain the temperature data.According to the statistical analysis of the measured data,it reveals that the temperature distribution is generally uniform along the cable cross-section without significant temperature gradient.Then,based on the limited data,the Monte Carlo,the gradient boosted regression trees(GBRT),and univariate linear regression(ULR)methods are employed to predict the cable’s representative temperature throughout the service life.These methods effectively overcome the limitations of insufficient monitoring data and accurately predict the representative temperature of the cables.However,each method has its own advantages and limitations in terms of applicability and accuracy.A comprehensive evaluation of the performance of these methods is conducted,and practical recommendations are provided for their application.The proposed methods and representative temperatures provide a good basis for the operation and maintenance of in-service long-span cable-stayed bridges.
基金research was funded by Science and Technology Project of State Grid Corporation of China under grant number 5200-202319382A-2-3-XG.
文摘Iced transmission line galloping poses a significant threat to the safety and reliability of power systems,leading directly to line tripping,disconnections,and power outages.Existing early warning methods of iced transmission line galloping suffer from issues such as reliance on a single data source,neglect of irregular time series,and lack of attention-based closed-loop feedback,resulting in high rates of missed and false alarms.To address these challenges,we propose an Internet of Things(IoT)empowered early warning method of transmission line galloping that integrates time series data from optical fiber sensing and weather forecast.Initially,the method applies a primary adaptive weighted fusion to the IoT empowered optical fiber real-time sensing data and weather forecast data,followed by a secondary fusion based on a Back Propagation(BP)neural network,and uses the K-medoids algorithm for clustering the fused data.Furthermore,an adaptive irregular time series perception adjustment module is introduced into the traditional Gated Recurrent Unit(GRU)network,and closed-loop feedback based on attentionmechanism is employed to update network parameters through gradient feedback of the loss function,enabling closed-loop training and time series data prediction of the GRU network model.Subsequently,considering various types of prediction data and the duration of icing,an iced transmission line galloping risk coefficient is established,and warnings are categorized based on this coefficient.Finally,using an IoT-driven realistic dataset of iced transmission line galloping,the effectiveness of the proposed method is validated through multi-dimensional simulation scenarios.
文摘Hydrogen is the new age alternative energy source to combat energy demand and climate change.Storage of hydrogen is vital for a nation’s growth.Works of literature provide different methods for storing the produced hydrogen,and the rational selection of a viable method is crucial for promoting sustainability and green practices.Typically,hydrogen storage is associated with diverse sustainable and circular economy(SCE)criteria.As a result,the authors consider the situation a multi-criteria decision-making(MCDM)problem.Studies infer that previous models for hydrogen storage method(HSM)selection(i)do not consider preferences in the natural language form;(ii)weights of experts are not methodically determined;(iii)hesitation of experts during criteria weight assessment is not effectively explored;and(iv)three-stage solution of a suitable selection of HSM is unexplored.Driven by these gaps,in this paper,authors put forward a new integrated framework,which considers double hierarchy linguistic information for rating,criteria importance through inter-criteria correlation(CRITIC)for expert weight calculation,evidence-based Bayesian method for criteria weight estimation,and combined compromise solution(CoCoSo)for ranking HSMs.The applicability of the developed framework is testified by using a case example of HSM selection in India.Sensitivity and comparative analysis reveal the merits and limitations of the developed framework.
基金supported in part by the National Natural Science Foundation of China(62125306)Zhejiang Key Research and Development Project(2024C01163)the State Key Laboratory of Industrial Control Technology,China(ICT2024A06)
文摘In recent decades,control performance monitoring(CPM)has experienced remarkable progress in research and industrial applications.While CPM research has been investigated using various benchmarks,the historical data benchmark(HIS)has garnered the most attention due to its practicality and effectiveness.However,existing CPM reviews usually focus on the theoretical benchmark,and there is a lack of an in-depth review that thoroughly explores HIS-based methods.In this article,a comprehensive overview of HIS-based CPM is provided.First,we provide a novel static-dynamic perspective on data-level manifestations of control performance underlying typical controller capacities including regulation and servo:static and dynamic properties.The static property portrays time-independent variability in system output,and the dynamic property describes temporal behavior driven by closed-loop feedback.Accordingly,existing HIS-based CPM approaches and their intrinsic motivations are classified and analyzed from these two perspectives.Specifically,two mainstream solutions for CPM methods are summarized,including static analysis and dynamic analysis,which match data-driven techniques with actual controlling behavior.Furthermore,this paper also points out various opportunities and challenges faced in CPM for modern industry and provides promising directions in the context of artificial intelligence for inspiring future research.
Funding: Supported by the National Natural Science Foundation of China (32370703), the CAMS Innovation Fund for Medical Sciences (CIFMS) (2022-I2M-1-021, 2021-I2M-1-061), and the Major Project of Guangzhou National Laboratory (GZNL2024A01015).
Abstract: Viral infectious diseases, characterized by their intricate nature and wide-ranging diversity, pose substantial challenges in the domain of data management. The vast volume of data generated by these diseases, spanning from the molecular mechanisms within cells to large-scale epidemiological patterns, has surpassed the capabilities of traditional analytical methods. In the era of artificial intelligence (AI) and big data, there is an urgent need to optimize these analytical methods so that the information can be handled and utilized more effectively. Despite the rapid accumulation of data associated with viral infections, the lack of a comprehensive framework for integrating, selecting, and analyzing these datasets has left numerous researchers uncertain about which data to select, how to access them, and how to utilize them most effectively in their research. This review endeavors to fill these gaps by exploring the multifaceted nature of viral infectious diseases and summarizing relevant data across multiple levels, from the molecular details of pathogens to broad epidemiological trends. The scope extends from the micro-scale to the macro-scale, encompassing pathogens, hosts, and vectors. In addition to data summarization, this review thoroughly investigates various dataset sources. It also traces the historical evolution of data collection in the field of viral infectious diseases, highlighting the progress achieved over time. Simultaneously, it evaluates the current limitations that impede data utilization. Furthermore, we propose strategies to surmount these challenges, focusing on the development and application of advanced computational techniques, AI-driven models, and enhanced data integration practices. By providing a comprehensive synthesis of existing knowledge, this review is designed to guide future research and contribute to more informed approaches to the surveillance, prevention, and control of viral infectious diseases, particularly within the context of the expanding big-data landscape.
Funding: Supported in part by the National Natural Science Foundation of China under Grant 62371181 and in part by the Changzhou Science and Technology International Cooperation Program under Grant CZ20230029; also supported by a National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (2021R1A2B5B02087169) and under the framework of the international cooperation program managed by the National Research Foundation of Korea (2022K2A9A1A01098051).
Abstract: The Intelligent Internet of Things (IIoT) involves real-world things that communicate or interact with each other through networking technologies, collecting data from these "things" and using intelligent approaches, such as Artificial Intelligence (AI) and machine learning, to make accurate decisions. Data science is the science of dealing with data and its relationships through intelligent approaches. Most state-of-the-art research focuses independently on either data science or the IIoT, rather than exploring their integration. To address this gap, this article provides a comprehensive survey of the advances in and integration of data science with the IIoT by classifying the existing IoT-based data science techniques and presenting a summary of their various characteristics. The paper analyzes data science and big data security and privacy features, including network architecture, data protection, and continuous monitoring of data, which face challenges in various IoT-based systems. Extensive insights into IoT data security, privacy, and challenges are visualized in the context of data science for the IoT. In addition, this study reveals the current opportunities to enhance data science and IoT market development. The current gaps and challenges faced in the integration of data science and the IoT are comprehensively presented, followed by the future outlook and possible solutions.
Funding: Supported by the National Natural Science Foundation of China (Grant No. 92044303).
Abstract: Air pollution in China covers a large area with complex sources and formation mechanisms, making it a unique place to conduct air pollution and atmospheric chemistry research. The National Natural Science Foundation of China's Major Research Plan entitled "Fundamental Researches on the Formation and Response Mechanism of the Air Pollution Complex in China" (the Plan) has funded 76 research projects to explore the causes of air pollution in China and the key atmospheric physics and atmospheric chemistry processes involved. In order to summarize the abundant data from the Plan and showcase its long-term impacts domestically and internationally, an integration project is responsible for collecting the various types of data generated by the 76 projects of the Plan. This project has classified and integrated these data, forming eight categories containing 258 datasets and 15 technical reports in total. The integration project has led to the successful establishment of the China Air Pollution Data Center (CAPDC) platform, which provides storage, retrieval, and download services for the eight categories. The platform has distinct features including data visualization, querying of related project information, and bilingual services in English and Chinese, which allow for rapid searching and downloading of data and provide a solid foundation of data and support for future related research. Air pollution control in China, especially in the past decade, is undeniably a global exemplar, and this data center is the first in China to focus on research into the country's air pollution complex.
Abstract: Lunar wrinkle ridges are important stress-related geological structures on the Moon that reflect its stress state and geological activity. They provide important insights into the evolution of the Moon and are key factors influencing future lunar activities, such as the choice of landing sites. However, automatic extraction of lunar wrinkle ridges is a challenging task due to their complex morphology and ambiguous features, and traditional manual extraction methods are time-consuming and labor-intensive. To achieve automated and detailed detection of lunar wrinkle ridges, we have constructed a lunar wrinkle ridge dataset, incorporating previously unused aspect data to provide edge information, and proposed a Dual-Branch Ridge Detection Network (DBR-Net) based on deep learning. This method employs a dual-branch architecture and an Attention Complementary Feature Fusion module to address the issue of insufficient lunar wrinkle ridge features. Comparisons with the results of various deep learning approaches demonstrate that the proposed method exhibits superior detection performance. Furthermore, the trained model was applied to lunar mare regions, generating a distribution map of lunar mare wrinkle ridges; a significant linear relationship between the length and area of the lunar wrinkle ridges was obtained through statistical analysis, and six previously unrecorded potential lunar wrinkle ridges were detected. The proposed method raises the automated extraction of lunar wrinkle ridges to pixel-level precision and verifies the effectiveness of DBR-Net in lunar wrinkle ridge detection.
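The following PyTorch sketch is purely illustrative of the dual-branch, attention-gated fusion idea described above; it is not the authors' DBR-Net, and the layer sizes, module names, and input channels are assumptions.

```python
# Illustrative dual-branch fusion: one branch encodes the image tile, the
# other encodes the aspect (slope-direction) map; channel attention gates
# each branch's features before they are fused for per-pixel ridge logits.
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1), nn.ReLU(),
            nn.Conv2d(channels // reduction, channels, 1), nn.Sigmoid())

    def forward(self, x):
        return x * self.gate(x)           # re-weight channels of x

class DualBranchFusion(nn.Module):
    def __init__(self, channels: int = 16):
        super().__init__()
        def branch(cin):
            return nn.Sequential(
                nn.Conv2d(cin, channels, 3, padding=1), nn.ReLU(),
                nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU())
        self.image_branch = branch(1)     # grayscale image tile
        self.aspect_branch = branch(1)    # aspect-map tile
        self.attn_img = ChannelAttention(channels)
        self.attn_asp = ChannelAttention(channels)
        self.head = nn.Conv2d(2 * channels, 1, 1)   # per-pixel ridge logit

    def forward(self, image, aspect):
        f_img = self.attn_img(self.image_branch(image))
        f_asp = self.attn_asp(self.aspect_branch(aspect))
        return self.head(torch.cat([f_img, f_asp], dim=1))

model = DualBranchFusion()
logits = model(torch.randn(2, 1, 64, 64), torch.randn(2, 1, 64, 64))
print(logits.shape)   # torch.Size([2, 1, 64, 64]): pixel-level prediction
```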
Funding: Supported by the National Natural Science Foundation of China under grant No. U2031144, and partially supported by the Open Project Program of the Key Laboratory of Optical Astronomy, National Astronomical Observatories, Chinese Academy of Sciences; also supported by the National Key R&D Program of China (No. 2021YFA1600404), the National Natural Science Foundation of China (12173082), the Yunnan Fundamental Research Projects (grant 202201AT070069), the Top-notch Young Talents Program of Yunnan Province, the Light of West China Program of the Chinese Academy of Sciences, and the International Centre of Supernovae, Yunnan Key Laboratory (No. 202302AN360001).
Abstract: BFOSC and YFOSC are the most frequently used instruments on the Xinglong 2.16 m telescope and the Lijiang 2.4 m telescope, respectively. We developed a software package named "BYSpec" (BFOSC and YFOSC Spectra Reduction Package) dedicated to automatically reducing the long-slit and echelle spectra obtained by these two instruments. The package supports bias and flat-fielding correction, order location, background subtraction, automatic wavelength calibration, and absolute flux calibration. The optimal extraction method maximizes the signal-to-noise ratio and removes most of the cosmic rays imprinted in the spectra. A comparison with the 1D spectra reduced with IRAF verifies the reliability of the results. This open-source software is publicly available to the community.
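As a schematic illustration (not BYSpec's actual code), the snippet below performs the generic bias and flat-fielding correction step that precedes the later reduction stages: subtract a master bias, then divide by a normalized master flat. The frame sizes and count levels are synthetic.

```python
# Basic CCD pre-processing: master bias subtraction and flat-field division.
import numpy as np

def correct_frame(raw: np.ndarray, bias_frames: list, flat_frames: list) -> np.ndarray:
    master_bias = np.median(np.stack(bias_frames), axis=0)
    master_flat = np.median(np.stack(flat_frames), axis=0) - master_bias
    master_flat /= np.median(master_flat)          # normalize to unit response
    return (raw - master_bias) / master_flat

rng = np.random.default_rng(2)
biases = [rng.normal(300, 3, (64, 64)) for _ in range(5)]        # bias level ~300 ADU
flats = [rng.normal(20000, 100, (64, 64)) + 300 for _ in range(5)]
science = rng.normal(5000, 50, (64, 64)) + 300
print(correct_frame(science, biases, flats).mean())               # ~5000 ADU after correction
```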
Funding: Supported by the National Natural Science Foundation of China (Grants 72474022, 71974011, 72174022, 71972012, 71874009) and the "BIT Think Tank" Promotion Plan of the Science and Technology Innovation Program of Beijing Institute of Technology (Grants 2024CX14017, 2023CX13029).
Abstract: As a new type of production factor in healthcare, healthcare data elements have been rapidly integrated into various health production processes, such as clinical assistance, health management, biological testing, and operation and supervision [1,2]. Healthcare data elements include biological and clinical data related to disease, environmental health data associated with life, and operational and healthcare management data related to healthcare activities (Figure 1). Activities such as the construction of a data value assessment system, the development of a data circulation and sharing platform, and the authorization of data compliance and operation products support the strong growth momentum of the market for healthcare data elements in China [3].
Funding: Supported by the National Natural Science Foundation of China (Project 42330311), the Central Public-interest Scientific Institution Basal Research Fund (No. 2021IEF0103), and the National Key R&D Project of China (2017YFC1500304).
Abstract: When inverting S-wave velocity and azimuthal anisotropy from ambient noise data, partially overlapping inversion results are often obtained for contiguous regions. Merging the different datasets into a consistent model therefore becomes an essential requirement. Based on the S-wave velocity and azimuthal anisotropy obtained from different contiguous regions, this paper introduces three methods for merging data. For data from different regions with partially overlapping areas, the merged results can be calculated by direct average weighting (DAW), linear dynamic weighting (LDW), and Gaussian function weighting (GFW), respectively. Data tests demonstrate that the LDW and GFW methods can effectively merge data by reasonably allocating data weights so as to exploit the data quality advantages in each zone. In particular, they preserve data smoothness at the boundaries of the data areas, resulting in a consistent model over larger regions. This paper presents effective methods and practical experience that can serve as a reference for advancing data merging technology.
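A one-dimensional illustration of the three weighting schemes across an overlap zone is sketched below; the exact weight functions and the Gaussian taper width are assumptions, since the abstract does not give the formulas, and the velocity profiles are synthetic.

```python
# Merge two overlapping 1-D models A and B with three weighting schemes:
# direct averaging (constant 0.5/0.5), linear dynamic weighting (ramp from
# 1 to 0 across the overlap), and Gaussian function weighting (smooth taper).
import numpy as np

def merge_overlap(model_a: np.ndarray, model_b: np.ndarray,
                  scheme: str = "gaussian") -> np.ndarray:
    n = len(model_a)                        # samples across the overlap zone
    x = np.linspace(0.0, 1.0, n)            # 0 = A-side edge, 1 = B-side edge
    if scheme == "direct":
        w_a = np.full(n, 0.5)
    elif scheme == "linear":
        w_a = 1.0 - x
    elif scheme == "gaussian":
        g_a = np.exp(-(x / 0.4) ** 2)       # taper width 0.4 is an assumed choice
        g_b = np.exp(-((1.0 - x) / 0.4) ** 2)
        w_a = g_a / (g_a + g_b)             # normalize against B's weight
    else:
        raise ValueError(scheme)
    return w_a * model_a + (1.0 - w_a) * model_b

vel_a = np.linspace(3.4, 3.6, 50)   # S-wave velocity from region A (km/s)
vel_b = np.linspace(3.5, 3.7, 50)   # velocity from region B over the same zone
for s in ("direct", "linear", "gaussian"):
    print(s, merge_overlap(vel_a, vel_b, s)[[0, 25, 49]])
```

The linear and Gaussian schemes both make the merged model approach each input model near its own side of the overlap, which is why they avoid visible jumps at the region boundaries.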