Funding: National Natural Science Foundation of China (No. 50905169).
Abstract: This paper describes the detailed design of a multi-channel, high-precision data acquisition device for aerospace applications. Based on a detailed analysis of the advantages and disadvantages of two common acquisition circuits, the design of the acquisition device focuses on accuracy, sampling rate, hardware overhead, and design space. The mechanical structure of the system is divided into card layers by function, giving it high reliability, ease of installation, and scalability. To ensure a reliable operating mode, the interface is isolated from the external circuit by optocouplers. Signal transmission is determined by the current in the current loop formed by the optocouplers between the acquisition device and the test bench. For the multi-channel switching circuit, an analog multiplexer model is established and the principles for selecting circuit modes are given.
Funding: Financially supported by the National Key R&D Program of China (No. 2023YFF0803404), the Zhejiang Provincial Natural Science Foundation (No. LY23D040001), the Open Research Fund of the Key Laboratory of Engineering Geophysical Prospecting and Detection of the Chinese Geophysical Society (No. CJ2021GB01), the Open Research Fund of the Changjiang River Scientific Research Institute (No. CKWV20221011/KY), the ZhouShan Science and Technology Project (No. 2023C81010), the National Natural Science Foundation of China (No. 41904100), and the Chinese Natural Science Foundation Open Research Cruise (Cruise No. NORC2019-08).
Abstract: INTRODUCTION. The crustal velocity model is crucial for describing subsurface composition and structure, and has significant implications for offshore oil and gas exploration and marine geophysical engineering (Xie et al., 2024). Currently, travel-time tomography is the most commonly used method for velocity modeling based on ocean bottom seismometer (OBS) data (Zhang et al., 2023; Sambolian et al., 2021). This method usually assumes that the sub-seafloor structure is layered, and therefore faces challenges in high-precision modeling where strong lateral discontinuities are present.
Funding: Funded by the National Natural Science Foundation of China Instrumentation Program (52327806), the Youth Fund of the National Natural Science Foundation of China (62405020), and the China Postdoctoral Science Foundation (2024M764131).
Abstract: Large-aperture optical components are of paramount importance in domains such as integrated circuits, photolithography, aerospace, and inertial confinement fusion. However, measuring their surface profiles relies predominantly on the phase-shifting approach, which involves collecting multiple interferograms and imposes stringent demands on environmental stability. These issues significantly hinder real-time, dynamic, high-precision measurement. Therefore, this study proposes a high-precision large-aperture single-frame interferometric surface profile measurement (LA-SFISPM) method based on deep learning and explores its capability for dynamic measurement with high accuracy. The interferogram is matched to the phase by training on data measured with a small aperture. The consistency of the surface features of the small and large apertures is enhanced via contrastive learning and feature-distribution alignment. Hence, high-precision phase reconstruction of large-aperture optical components can be achieved without a phase shifter. The experimental results show that for a tested mirror with Φ = 820 mm, the surface profile obtained from LA-SFISPM, subtracted point by point from the ground truth, yields a maximum single-point error of 4.56 nm. Meanwhile, the peak-to-valley (PV) value is 0.0758λ and the simple repeatability of root mean square (SR-RMS) value is 0.00025λ, which agrees well with the results measured by ZYGO. In particular, the measurement time is reduced by a factor of 48 compared with the traditional phase-shifting method. The proposed method provides an efficient, rapid, and accurate way to obtain the surface profiles of optical components of different diameters without a phase-shifting approach, which is highly desirable for large-aperture interferometric measurement systems.
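To make the reported error metrics concrete, the short sketch below shows how a maximum single-point error, a peak-to-valley (PV) value, and an RMS value in units of λ can be computed from a reconstructed surface map and a ground-truth map. The array contents and the 632.8 nm wavelength are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical surface maps in nanometres; NaN marks points outside the aperture.
wavelength_nm = 632.8                              # assumed test wavelength
recon = np.random.normal(0.0, 1.0, (512, 512))     # placeholder reconstruction
truth = np.random.normal(0.0, 1.0, (512, 512))     # placeholder ground truth

residual = recon - truth                           # point-by-point difference
valid = residual[~np.isnan(residual)]

max_single_point_error = np.max(np.abs(valid))                 # nm
pv_waves = (valid.max() - valid.min()) / wavelength_nm         # peak-to-valley, in λ
rms_waves = np.sqrt(np.mean((valid - valid.mean()) ** 2)) / wavelength_nm

print(f"max error {max_single_point_error:.2f} nm, "
      f"PV {pv_waves:.4f} λ, RMS {rms_waves:.5f} λ")
```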
Funding: Supported by the National Natural Science Foundation of China (Nos. 12172388 and 12472400), the Guangdong Basic and Applied Basic Research Foundation of China (No. 2025A1515011975), and the Scientific Research Project of Guangdong Polytechnic Normal University of China (No. 2023SDKYA010).
Abstract: The elliptic integral method (EIM) is an efficient analytical approach for analyzing large deformations of elastic beams. However, it faces the following challenges. First, the existing EIM can only handle cases with known deformation modes. Second, the existing EIM is only applicable to Euler beams, and no EIM is available for the higher-precision Timoshenko and Reissner beams when both a force and a moment are applied at the end. This paper proposes a general EIM for Reissner beams under arbitrary boundary conditions. On this basis, an analytical equation for determining the sign of the elliptic integral is provided. Based on this equation, we discover a class of elliptic integral piecewise points that are distinct from inflection points. More importantly, we propose an algorithm that automatically calculates the number of inflection points and other piecewise points during the nonlinear solution process, which is crucial for beams with unknown or changing deformation modes.
Abstract: With the intensifying competition in the integrated circuit (IC) industry, the high turnover rate of integrated circuit engineers has become a prominent issue affecting the technological continuity of high-precision, specialized, and innovative enterprises. As a representative of such enterprises, JL Technology has faced challenges to its R&D efficiency due to talent loss in recent years. This study takes this enterprise as a case to explore feasible paths to reduce turnover rates by optimizing training and career development systems. The research designs a method combining learning maps and talent maps, uses a competency model to clarify the direction for engineers' skill improvement, implements talent classification management using a nine-grid model, and achieves personalized training through Individual Development Plans (IDPs). Analysis of the enterprise's historical data reveals that the main reasons for turnover are unclear career development paths and insufficient resources for skill improvement. After pilot implementation, the turnover rate in core departments decreased by 12%, and employee satisfaction with training increased by 24%. The results indicate that matching systematic talent reviews with dynamic learning resources can effectively enhance engineers' sense of belonging. This study provides a set of highly operational management tools for small and medium-sized high-precision, specialized, and innovative technology enterprises, verifies their applicability in such enterprises, and offers replicable experience for similar enterprises to optimize their talent strategies [1].
Funding: This work was funded through Research Group No. KS-2024-376.
Abstract: Arabic Sign Language (ArSL) recognition plays a vital role in enhancing communication for the Deaf and Hard of Hearing (DHH) community. Researchers have proposed multiple methods for automated recognition of ArSL; however, these methods face several challenges, including high gesture variability, occlusions, limited signer diversity, and the scarcity of large annotated datasets. Existing methods, often relying solely on either skeletal data or video-based features, struggle with generalization and robustness, especially in dynamic and real-world conditions. This paper proposes a novel multimodal ensemble classification framework that integrates geometric features derived from 3D skeletal joint distances and angles with temporal features extracted from RGB videos using the Inflated 3D ConvNet (I3D). By fusing these complementary modalities at the feature level and applying a majority-voting ensemble of XGBoost, Random Forest, and Support Vector Machine (SVM) classifiers, the framework robustly captures both the spatial configurations and the motion dynamics of sign gestures. Feature selection using the Pearson correlation coefficient further enhances efficiency by reducing redundancy. Extensive experiments on the ArabSign dataset, which includes RGB videos and corresponding skeletal data, demonstrate that the proposed approach significantly outperforms state-of-the-art methods, achieving an average F1-score of 97% and improving recognition accuracy by more than 7% over the previous best methods. This work not only advances the technical state of the art in ArSL recognition but also provides a scalable, real-time solution for practical deployment in educational, social, and assistive communication technologies. Although this study focuses on Arabic Sign Language, the proposed framework can be extended to other sign languages, opening the way to potentially worldwide applicability in sign language recognition tasks.
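A minimal sketch of the classification stage described above: Pearson-correlation-based feature pruning followed by a hard majority-voting ensemble of XGBoost, Random Forest, and SVM. The fused feature matrix, labels, and the 0.9 correlation threshold are placeholders, not the paper's actual data or settings.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.svm import SVC
from xgboost import XGBClassifier

# Placeholder fused feature matrix: skeletal distances/angles concatenated with
# I3D video embeddings, plus integer gesture labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 128))
y = rng.integers(0, 10, size=300)

# Pearson-correlation feature selection: drop any feature that is strongly
# correlated (|r| > 0.9) with an earlier feature.
corr = np.corrcoef(X, rowvar=False)
upper = np.triu(np.abs(corr), k=1)
keep = [j for j in range(X.shape[1]) if not np.any(upper[:j, j] > 0.9)]
X_sel = X[:, keep]

# Hard majority voting across the three classifiers named in the abstract.
ensemble = VotingClassifier(
    estimators=[
        ("xgb", XGBClassifier(n_estimators=200, eval_metric="mlogloss")),
        ("rf", RandomForestClassifier(n_estimators=200)),
        ("svm", SVC(kernel="rbf")),
    ],
    voting="hard",
)
ensemble.fit(X_sel, y)
print("training accuracy:", ensemble.score(X_sel, y))
```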
Funding: The National Natural Science Foundation of China (No. U22B2086) and the National Science and Technology Major Project (No. 2019-VII-0017-0158).
Abstract: Shot peening is commonly employed for surface deformation strengthening of cylindrical surface parts. Therefore, it is critical to understand the effects of shot peening on residual stress and surface topography. Compared to flat surfaces, cylindrical surface shot peening has two significant features: (i) the curvature of the cylindrical surface and the scattering of the shot stream cause distributed impact velocities; (ii) the rotation of the part results in a periodic variation of the impact velocity component. It is therefore a challenge to quickly and accurately predict the shot peening residual stress and surface topography of cylindrical surfaces. This paper develops a high-precision model that considers a more realistic shot peening process. Firstly, a kinematic analysis model was developed to simulate the relative movement of numerous shots and the cylindrical surface. Then, the spatial distribution and time-varying impact information was calculated. Subsequently, the impact information was used in finite element modeling to predict residual stress and surface topography. The proposed kinematic analysis method was validated by comparison with the discrete element method. Meanwhile, shot peening tests on 9310 high-strength steel rollers verified the effectiveness of the model in predicting residual stress and surface topography. In addition, the effects of air pressure and attack angle on residual stress and surface topography were investigated. This work provides a functional package for efficient prediction of surface integrity and guides industrial application of cylindrical surface shot peening.
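The toy sketch below illustrates the kind of kinematic decomposition the abstract refers to: the velocity of a shot relative to a point on a spinning roller is split into normal and tangential components, which vary with the surface location facing the stream and with the rotation. All numerical values are assumed for illustration and are not taken from the paper.

```python
import numpy as np

# A shot stream travelling in the -y direction strikes a roller of radius r
# spinning about its axis (z) with angular speed omega. All values are assumed.
shot_speed = 60.0                        # m/s
omega = 2.0 * np.pi * 5.0                # rad/s (5 rev/s)
r = 0.02                                 # m
v_shot = np.array([0.0, -shot_speed])

for theta in np.linspace(-60.0, 60.0, 5):           # surface points facing the stream
    t = np.radians(theta)
    point = r * np.array([np.sin(t), np.cos(t)])     # position on the upper half
    normal = point / r                               # outward unit normal
    tangent = np.array([-normal[1], normal[0]])      # local tangential direction
    v_surface = omega * r * tangent                  # surface-point velocity from rotation
    v_rel = v_shot - v_surface                       # shot velocity relative to the surface
    v_n = -np.dot(v_rel, normal)                     # normal impact component
    v_t = np.dot(v_rel, tangent)                     # tangential impact component
    print(f"theta {theta:6.1f} deg: v_normal {v_n:6.2f} m/s, v_tangential {v_t:7.2f} m/s")
```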
Funding: National Key Research and Development Program (No. 2018YFB1305001) and Major Consulting and Research Project of the Chinese Academy of Engineering (No. 2018-ZD-02-07).
Abstract: Taking autonomous and driverless driving as the research object, we discuss and define the intelligent high-precision map. The intelligent high-precision map is considered a key link of future travel, a carrier of real-time perception of traffic resources across the entire space-time range, and the criterion for operation and control throughout the whole driving process. As a new form of map, it has distinctive features in terms of cartography theory and application requirements compared with traditional navigation electronic maps. Thus, it is necessary to analyze and discuss its key features and problems to promote research on and application of intelligent high-precision maps. Accordingly, we propose an information transmission model based on cartography theory, combined with the control flow of a wheeled robot in practical applications. Next, we put forward the data logic structure of the intelligent high-precision map and analyze its application in autonomous driving. Then, we summarize the computing mode of "Crowdsourcing + Edge-Cloud Collaborative Computing" and carry out a key technical analysis of how to improve the quality of crowdsourced data. We also analyze the effective application scenarios of intelligent high-precision maps in the future. Finally, we present some thoughts and suggestions for the future development of this field.
Funding: This research was funded by the Science and Technology Project of the State Grid Corporation of China under grant number 5200-202319382A-2-3-XG.
Abstract: Iced transmission line galloping poses a significant threat to the safety and reliability of power systems, leading directly to line tripping, disconnections, and power outages. Existing early warning methods for iced transmission line galloping suffer from issues such as reliance on a single data source, neglect of irregular time series, and lack of attention-based closed-loop feedback, resulting in high rates of missed and false alarms. To address these challenges, we propose an Internet of Things (IoT)-empowered early warning method for transmission line galloping that integrates time series data from optical fiber sensing and weather forecasts. Initially, the method applies a primary adaptive weighted fusion to the IoT-empowered optical fiber real-time sensing data and the weather forecast data, followed by a secondary fusion based on a Back Propagation (BP) neural network, and uses the K-medoids algorithm to cluster the fused data. Furthermore, an adaptive irregular time series perception adjustment module is introduced into the traditional Gated Recurrent Unit (GRU) network, and closed-loop feedback based on an attention mechanism is employed to update network parameters through gradient feedback of the loss function, enabling closed-loop training and time series prediction by the GRU network model. Subsequently, considering the various types of prediction data and the duration of icing, an iced transmission line galloping risk coefficient is established, and warnings are categorized based on this coefficient. Finally, using an IoT-driven realistic dataset of iced transmission line galloping, the effectiveness of the proposed method is validated through multi-dimensional simulation scenarios.
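The "primary adaptive weighted fusion" step is not specified in detail in the abstract; the sketch below assumes a simple inverse-variance weighting of the two sources, with each source's noise level estimated from its deviation around a rolling mean. The series names, noise levels, and window length are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
truth = 8.0 + np.sin(np.linspace(0.0, 6.0, 200))      # unknown true quantity
sensor = truth + rng.normal(0.0, 0.3, 200)            # optical-fiber sensing data (lower noise)
forecast = truth + rng.normal(0.0, 1.0, 200)          # weather-forecast data (higher noise)

def noise_variance(x, window=20):
    # Crude noise proxy: variance of the series around its own rolling mean.
    smooth = np.convolve(x, np.ones(window) / window, mode="same")
    return np.var(x - smooth) + 1e-9

# Inverse-variance weights lean the fusion toward the more consistent source.
w_sensor = 1.0 / noise_variance(sensor)
w_forecast = 1.0 / noise_variance(forecast)
fused = (w_sensor * sensor + w_forecast * forecast) / (w_sensor + w_forecast)

print("RMSE sensor :", np.sqrt(np.mean((sensor - truth) ** 2)))
print("RMSE fused  :", np.sqrt(np.mean((fused - truth) ** 2)))
```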
Funding: This work was conducted with the financial support of the National Key Research and Development Program of China (Grant No. 2021YFE0103400), of the Shanghai Science and Technology Commission through its Scientific Research Project program (Grant No. 21511104100), and of the National Natural Science Foundation of China (Grant No. 12073062).
Abstract: Before charge-coupled device detectors became widely employed in observational astronomy, the main detection method for more than a hundred years was photography on astronomical glass plates. Recently, in order to preserve these historical data and maintain their usability, the International Astronomical Union has appealed to all countries for global digitization of astronomical plates by developing or adopting advanced digitization technology. Specialized digitizers with high precision and high measuring speed represent the key equipment for this task. The Shanghai Astronomical Observatory and Nishimura Co., Ltd. in Japan cooperated between 2013 and 2016 to develop the first Chinese high-precision astronomical plate digitizer, which was then used for the complete digitization of all nighttime-observation astronomical plates in China. Then, in 2019-2021, the Shanghai Astronomical Observatory independently developed new models of plate digitizers that enabled countries such as Uzbekistan and Italy to digitize their astronomical plates. Additionally, a new high-precision, multifunction digitizer was also used to digitize valuable microscope slides from the Shanghai Natural History Museum, providing a successful example of cross-domain application of high-precision digitization technology.
Funding: Supported by the National Natural Science Foundation of China (32370703), the CAMS Innovation Fund for Medical Sciences (CIFMS) (2022-I2M-1-021, 2021-I2M-1-061), and the Major Project of Guangzhou National Laboratory (GZNL2024A01015).
Abstract: Viral infectious diseases, characterized by their intricate nature and wide-ranging diversity, pose substantial challenges in the domain of data management. The vast volume of data generated by these diseases, spanning from the molecular mechanisms within cells to large-scale epidemiological patterns, has surpassed the capabilities of traditional analytical methods. In the era of artificial intelligence (AI) and big data, there is an urgent need to optimize these analytical methods to handle and utilize the information more effectively. Despite the rapid accumulation of data associated with viral infections, the lack of a comprehensive framework for integrating, selecting, and analyzing these datasets has left many researchers uncertain about which data to select, how to access them, and how to utilize them most effectively in their research. This review endeavors to fill these gaps by exploring the multifaceted nature of viral infectious diseases and summarizing relevant data across multiple levels, from the molecular details of pathogens to broad epidemiological trends. The scope extends from the micro-scale to the macro-scale, encompassing pathogens, hosts, and vectors. In addition to data summarization, this review thoroughly investigates various dataset sources. It also traces the historical evolution of data collection in the field of viral infectious diseases, highlighting the progress achieved over time. At the same time, it evaluates the current limitations that impede data utilization. Furthermore, we propose strategies to surmount these challenges, focusing on the development and application of advanced computational techniques, AI-driven models, and enhanced data integration practices. By providing a comprehensive synthesis of existing knowledge, this review is designed to guide future research and contribute to more informed approaches to the surveillance, prevention, and control of viral infectious diseases, particularly within the context of the expanding big-data landscape.
Funding: Supported in part by the National Natural Science Foundation of China under Grant 62371181, in part by the Changzhou Science and Technology International Cooperation Program under Grant CZ20230029, by a National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (2021R1A2B5B02087169), and under the framework of the international cooperation program managed by the National Research Foundation of Korea (2022K2A9A1A01098051).
Abstract: The Intelligent Internet of Things (IIoT) involves real-world things that communicate or interact with each other through networking technologies, collecting data from these "things" and using intelligent approaches, such as Artificial Intelligence (AI) and machine learning, to make accurate decisions. Data science is the science of dealing with data and its relationships through intelligent approaches. Most state-of-the-art research focuses independently on either data science or the IIoT, rather than exploring their integration. Therefore, to address this gap, this article provides a comprehensive survey on the advances in and integration of data science with the Intelligent IoT (IIoT) by classifying the existing IoT-based data science techniques and presenting a summary of their various characteristics. The paper analyzes data science and big data security and privacy features, including network architecture, data protection, and continuous monitoring of data, which face challenges in various IoT-based systems. Extensive insights into IoT data security, privacy, and challenges are visualized in the context of data science for the IoT. In addition, this study reveals current opportunities to enhance data science and IoT market development. The current gaps and challenges faced in the integration of data science and the IoT are comprehensively presented, followed by the future outlook and possible solutions.
Funding: Supported by the National Natural Science Foundation of China (Grant No. 92044303).
Abstract: Air pollution in China covers a large area with complex sources and formation mechanisms, making it a unique place to conduct air pollution and atmospheric chemistry research. The National Natural Science Foundation of China's Major Research Plan entitled "Fundamental Researches on the Formation and Response Mechanism of the Air Pollution Complex in China" (the Plan) has funded 76 research projects to explore the causes of air pollution in China and the key atmospheric physics and atmospheric chemistry processes involved. In order to summarize the abundant data from the Plan and sustain its long-term impact domestically and internationally, an integration project is responsible for collecting the various types of data generated by the 76 projects of the Plan. This project has classified and integrated these data, forming eight categories containing 258 datasets and 15 technical reports in total. The integration project has led to the successful establishment of the China Air Pollution Data Center (CAPDC) platform, providing storage, retrieval, and download services for the eight categories. The platform has distinct features including data visualization, querying of related project information, and bilingual services in both English and Chinese, which allow rapid searching and downloading of data and provide a solid foundation of data and support for future related research. Air pollution control in China, especially in the past decade, is undeniably a global exemplar, and this data center is the first in China to focus on research into the country's air pollution complex.
Abstract: In order to improve the reliability of a spacecraft micro cold gas propulsion system and realize precise control of spacecraft attitude and orbit, a micro-thrust, high-precision cold gas thruster was developed. Because the spacecraft design requires this micro-thrust to be sustained continuously for more than 60 minutes, the traditional solenoid valve used in such thrusters cannot complete the mission; a long-life micro latching valve was therefore developed as the control valve for the thruster, since a latching valve keeps its position when power is cut off. Firstly, the authors introduce the design scheme and concept of the thruster. Secondly, the performance of the latching valve and the flow characteristics of the nozzle are simulated. Finally, the experimental results, compared with the numerical study, show that the long-life micro cold gas thruster developed in this paper meets the mission requirements.
Funding: Supported by grants from the National Natural Science Foundation of China (Grant Nos. 41864001 and U1831132), the Open Fund of the State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University (Grant No. 17P03), the Guizhou Normal University Doctoral Research Fund, the Hubei Province Foundation innovation group project (2015CFA011, 2018CFA087), the Open Project of the Lunar and Planetary Science Laboratory, Macao University of Science and Technology (FDCT 119/2017/A3), and the Open Fund of the Guizhou Provincial Key Laboratory of Radio Astronomy and Data Processing (KF201813).
Abstract: With the increasing precision of the GRAIL gravity field models and the topography from LOLA, it is possible to investigate the substructure beneath the crater Clavius. The admittance between gravity and topography data is commonly used to estimate selenophysical parameters, including the load ratio, crustal thickness and density, and elastic thickness. Not only a surface load but also a subsurface load is considered in the estimation. The particle swarm optimization (PSO) algorithm with a swarm size of 400 is employed as well. Results indicate that the observed admittance is best fitted by the modeled admittance based on a spherical shell model, which was judged unsatisfactory in a previous study. The best-fitted load ratio f is around −0.194. Such a small load ratio conforms to the direct proportion between the nearly uncompensated topography and its corresponding negative gravity anomaly. It also indicates that a surface load dominates all the loads. Constrained within 2σ STD, a small crustal thickness (~30 km) and a crustal density of ~2587 kg m⁻³ are found, quite close to the results from previous GRAIL research. Considering the well-constrained crustal thickness and density, the best-fitted elastic thickness (~7 km) is rational. This result is slightly smaller than that of the previous study (~12 km). The difference can be attributed to the difference in the crustal density used and in the precision of the gravity and topography data. Considering that the difference between the modeled gravity anomaly and the observations is quite small, the parameters inverted here can serve as an indicator of the subsurface structure beneath Clavius.
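As a rough illustration of the inversion strategy, the sketch below fits four parameters (load ratio, crustal thickness, crustal density, elastic thickness) to a synthetic admittance curve with a basic particle swarm optimizer using a swarm of 400, as in the abstract. The forward admittance function here is a placeholder, not the lithospheric flexure model used in the paper, and all numerical settings are assumptions.

```python
import numpy as np

# Placeholder forward model: maps (load ratio f, crustal thickness Tc [km],
# crustal density rho_c [kg/m^3], elastic thickness Te [km]) to an admittance
# curve over spherical-harmonic degree. Not the paper's physical model.
degrees = np.arange(20, 120)

def model_admittance(params, l=degrees):
    f, tc, rho_c, te = params
    return rho_c * 1e-3 * (1.0 + f * np.exp(-l * te / 300.0)) * np.exp(-l * tc / 1.0e3)

rng = np.random.default_rng(42)
true_params = np.array([-0.2, 30.0, 2587.0, 7.0])
observed = model_admittance(true_params) + rng.normal(0.0, 0.5, degrees.size)

def misfit(p):
    return np.sum((model_admittance(p) - observed) ** 2)

# Basic PSO with standard constriction-type coefficients.
n_particles, n_iter = 400, 200
lo = np.array([-1.0, 10.0, 2200.0, 1.0])
hi = np.array([1.0, 60.0, 3000.0, 30.0])
pos = rng.uniform(lo, hi, (n_particles, 4))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([misfit(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)]

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 4))
    vel = 0.72 * vel + 1.49 * r1 * (pbest - pos) + 1.49 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    vals = np.array([misfit(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)]

print("best-fitted parameters:", np.round(gbest, 3))
```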
Funding: Supported by the National Natural Science Foundation of China (Grants 72474022, 71974011, 72174022, 71972012, 71874009) and the "BIT Think Tank" Promotion Plan of the Science and Technology Innovation Program of the Beijing Institute of Technology (Grants 2024CX14017, 2023CX13029).
Abstract: As a new type of production factor in healthcare, healthcare data elements have been rapidly integrated into various health production processes, such as clinical assistance, health management, biological testing, and operation and supervision [1,2]. Healthcare data elements include biological and clinical data related to disease, environmental health data associated with daily life, and operational and healthcare management data related to healthcare activities (Figure 1). Activities such as the construction of a data value assessment system, the development of a data circulation and sharing platform, and the authorization of data compliance and operation products support the strong growth momentum of the market for healthcare data elements in China [3].
Funding: This work was funded by the Economic and Social Research Council (ESRC) in the United Kingdom [grant number 1477365].
Abstract: Smart-card automated fare collection systems now routinely record large volumes of data comprising the origins and destinations of travelers. Processing and analyzing these data open new opportunities in urban modeling and travel behavior research. This study seeks to develop an accurate framework for the study of urban mobility from smart card data by developing a heuristic primary location model to identify home and work locations. The model uses journey counts as an indicator of usage regularity, visit frequency to identify activity locations for regular commuters, and stay time for the classification of work and home locations and activities. London is taken as a case study, and the model results were validated against survey data from the London Travel Demand Survey and a volunteer survey. The results demonstrate that the proposed model is able to detect meaningful home and work places with high precision. This study offers a new and cost-effective approach to travel behavior and demand research.
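A minimal sketch of a heuristic primary-location model in the spirit described above: journey counts filter out irregular users, visit frequency picks candidate activity locations, and time-of-day (standing in for stay time) separates home from work. The sample records, thresholds, and time windows are illustrative assumptions, not the paper's specification.

```python
import pandas as pd

# Hypothetical tap-in records: one row per journey stage.
taps = pd.DataFrame({
    "card_id": [1, 1, 1, 1, 2, 2],
    "origin": ["A", "B", "A", "B", "C", "D"],
    "destination": ["B", "A", "B", "A", "D", "C"],
    "start": pd.to_datetime(["2024-03-04 08:10", "2024-03-04 17:45",
                             "2024-03-05 08:05", "2024-03-05 18:02",
                             "2024-03-04 09:30", "2024-03-04 20:15"]),
})

# 1) Regularity filter: keep cards with enough journeys to count as regular commuters.
journey_counts = taps.groupby("card_id").size()
regular = taps[taps["card_id"].isin(journey_counts[journey_counts >= 4].index)]

# 2) Visit frequency: how often each station is used as an origin by each card.
visits = regular.groupby(["card_id", "origin"]).size().rename("n_visits").reset_index()

# 3) Home/work split: most frequent morning origin is labelled 'home', most
#    frequent evening origin is labelled 'work' (assumed 06:00-11:00 / 15:00-21:00 windows).
regular = regular.assign(hour=regular["start"].dt.hour)
morning = regular[regular["hour"].between(6, 11)]
evening = regular[regular["hour"].between(15, 21)]
home = morning.groupby("card_id")["origin"].agg(lambda s: s.mode().iat[0])
work = evening.groupby("card_id")["origin"].agg(lambda s: s.mode().iat[0])

print(pd.DataFrame({"home": home, "work": work}))
```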
Funding: Supported by the National Key R&D Program of China (No. 2023YFB2703700), the National Natural Science Foundation of China (Nos. U21A20465, 62302457, 62402444, 62172292), the Fundamental Research Funds of Zhejiang Sci-Tech University (Nos. 23222092-Y, 22222266-Y), the Program for Leading Innovative Research Team of Zhejiang Province (No. 2023R01001), the Zhejiang Provincial Natural Science Foundation of China (Nos. LQ24F020008, LQ24F020012), the Foundation of the State Key Laboratory of Public Big Data (No. [2022]417), and the "Pioneer" and "Leading Goose" R&D Program of Zhejiang (No. 2023C01119).
Abstract: As smart grid technology rapidly advances, the vast amount of user data collected by smart meters presents significant challenges for data security and privacy protection. Current research emphasizes data security and user privacy concerns within smart grids. However, existing methods struggle with efficiency and security when processing large-scale data. Balancing efficient data processing with stringent privacy protection during data aggregation in smart grids remains an urgent challenge. This paper proposes an AI-based multi-type data aggregation method designed to enhance aggregation efficiency and security by standardizing and normalizing various data modalities. The approach optimizes data preprocessing, integrates Long Short-Term Memory (LSTM) networks for handling time-series data, and employs homomorphic encryption to safeguard user privacy. It also explores the application of Boneh-Lynn-Shacham (BLS) signatures for user authentication. The proposed scheme's efficiency, security, and privacy protection capabilities are validated through rigorous security proofs and experimental analysis.
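The abstract does not specify the homomorphic scheme; the sketch below assumes an additive Paillier-style aggregation using the third-party python-paillier (phe) package, so an aggregator can sum encrypted meter readings without seeing any individual value. The readings are placeholders, and the BLS authentication step is not reproduced.

```python
# Additive homomorphic aggregation of per-meter readings, sketched with the
# third-party `phe` (python-paillier) library.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# Each meter encrypts its normalized reading under the aggregator's public key.
readings_kwh = [1.82, 0.47, 3.10, 2.25]
ciphertexts = [public_key.encrypt(r) for r in readings_kwh]

# The aggregator sums ciphertexts without ever seeing individual readings.
encrypted_total = ciphertexts[0]
for c in ciphertexts[1:]:
    encrypted_total = encrypted_total + c

# Only the holder of the private key (e.g., the control center) can decrypt
# the aggregate, which equals the plaintext sum.
print(private_key.decrypt(encrypted_total), sum(readings_kwh))
```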
Funding: Supported by the National Natural Science Foundation of China (42250101) and the Macao Foundation.
Abstract: Earth's internal core and crustal magnetic fields, as measured by geomagnetic satellites such as MSS-1 (Macao Science Satellite-1) and Swarm, are vital for understanding core dynamics and tectonic evolution. To model these internal magnetic fields accurately, data selection based on specific criteria is often employed to minimize the influence of rapidly changing current systems in the ionosphere and magnetosphere. However, the quantitative impact of various data selection criteria on internal geomagnetic field modeling is not well understood. This study aims to address this issue and provide a reference for constructing and applying geomagnetic field models. First, we collect the latest MSS-1 and Swarm satellite magnetic data and summarize the data selection criteria widely used in geomagnetic field modeling. Second, we briefly describe the method used to co-estimate the core, crustal, and large-scale magnetospheric fields from satellite magnetic data. Finally, we conduct a series of field modeling experiments with different data selection criteria to quantitatively estimate their influence. Our numerical experiments confirm that without selecting data from dark regions and geomagnetically quiet times, the resulting internal field differences at the Earth's surface can range from tens to hundreds of nanotesla (nT). Additionally, we find that the uncertainties introduced into field models by different data selection criteria are significantly larger than the measurement accuracy of modern geomagnetic satellites. These uncertainties should be considered when using the constructed magnetic field models for scientific research and applications.
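For concreteness, the sketch below applies the kind of dark-region and quiet-time selection the paper summarizes to a toy along-track dataset. The Kp, dDst/dt, and solar zenith angle thresholds are illustrative examples of commonly used values, not the paper's own criteria.

```python
import pandas as pd

# Hypothetical along-track samples: each row is one vector measurement with
# geomagnetic activity indices and the solar zenith angle attached.
data = pd.DataFrame({
    "kp": [1.0, 2.3, 0.7, 3.7],            # planetary K index
    "dDst_dt": [0.5, -4.0, 1.2, 2.8],      # nT/h, ring-current variability
    "sun_zenith_deg": [112.0, 95.0, 120.0, 130.0],
    "B_nT": [45210.3, 44987.1, 45102.8, 45350.6],
})

# Example quiet-time / dark-region criteria (assumed thresholds).
quiet = (
    (data["kp"] <= 2.0)                    # geomagnetically quiet
    & (data["dDst_dt"].abs() <= 3.0)       # slowly varying ring current
    & (data["sun_zenith_deg"] >= 100.0)    # sun well below horizon (dark region)
)
selected = data[quiet]
print(f"kept {len(selected)} of {len(data)} samples")
```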
Funding: Supported in part by NIH grants R01NS39600, U01MH114829, and RF1MH128693 (to GAA).
Abstract: Many fields, such as neuroscience, are experiencing a vast proliferation of cellular data, underscoring the need for organizing and interpreting large datasets. A popular approach partitions data into manageable subsets via hierarchical clustering, but objective methods to determine the appropriate classification granularity are missing. We recently introduced a technique to systematically identify when to stop subdividing clusters, based on the fundamental principle that cells must differ more between clusters than within them. Here we present the corresponding protocol to classify cellular datasets by combining data-driven unsupervised hierarchical clustering with statistical testing. These general-purpose functions are applicable to any cellular dataset that can be organized as two-dimensional matrices of numerical values, including molecular, physiological, and anatomical datasets. We demonstrate the protocol using cellular data from the Janelia MouseLight project to characterize morphological aspects of neurons.
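A simplified stand-in for the protocol's stopping rule: clusters are recursively split by hierarchical clustering, and a split is kept only if between-cluster pairwise distances are statistically larger than within-cluster ones (here via a one-sided Mann-Whitney U test). The toy data, Ward linkage, minimum cluster size, and significance level are assumptions, not the authors' exact settings.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist, squareform
from scipy.stats import mannwhitneyu

def split_is_supported(dist, members_a, members_b, alpha=0.01):
    """Accept a split only if between-cluster distances exceed within-cluster
    distances (one-sided Mann-Whitney U test on the two pairwise-distance sets)."""
    within = np.concatenate([
        squareform(dist[np.ix_(members_a, members_a)], checks=False),
        squareform(dist[np.ix_(members_b, members_b)], checks=False),
    ])
    between = dist[np.ix_(members_a, members_b)].ravel()
    if len(within) < 2 or len(between) < 2:
        return False
    return mannwhitneyu(between, within, alternative="greater").pvalue < alpha

# Toy cells-by-features matrix with two obvious groups.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (30, 5)), rng.normal(4, 1, (30, 5))])
D = squareform(pdist(X))

def classify(indices):
    # Walk down the dendrogram: split, test the split, and recurse only if supported.
    if len(indices) < 4:
        return [indices]
    sub_Z = linkage(X[indices], method="ward")
    labels = fcluster(sub_Z, t=2, criterion="maxclust")
    a, b = indices[labels == 1], indices[labels == 2]
    if not split_is_supported(D, a, b):
        return [indices]            # stop subdividing: children are not distinct enough
    return classify(a) + classify(b)

clusters = classify(np.arange(len(X)))
print([len(c) for c in clusters])
```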