The spatial offset of a bridge has a significant impact on the safety, comfort, and durability of high-speed railway (HSR) operations, so it is crucial to rapidly and effectively detect the spatial offset of operational HSR bridges. Drive-by monitoring of bridge uneven settlement demonstrates significant potential due to its practicality, cost-effectiveness, and efficiency. However, existing drive-by methods for detecting bridge offset have limitations such as reliance on a single data source, low detection accuracy, and the inability to identify lateral deformations of bridges. This paper proposes a novel drive-by inspection method for the spatial offset of HSR bridges based on multi-source data fusion from a comprehensive inspection train. Firstly, dung beetle optimizer-variational mode decomposition was employed to achieve adaptive decomposition of non-stationary dynamic signals and explore the hidden temporal relationships in the data. Subsequently, a long short-term memory neural network was developed to achieve feature fusion of multi-source signals and accurate prediction of the spatial settlement of HSR bridges. A dataset of track irregularities and CRH380A high-speed train responses was generated using a 3D train-track-bridge interaction model, and the accuracy and effectiveness of the proposed hybrid deep learning model were numerically validated. Finally, the reliability of the proposed drive-by inspection method was further validated by analyzing actual measurement data obtained from a comprehensive inspection train. The research findings indicate that the proposed approach enables rapid and accurate detection of spatial offset in HSR bridges, ensuring the long-term operational safety of HSR bridges.
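The abstract relies on a long short-term memory (LSTM) network for fusing multi-source signals. As a minimal sketch of the mechanism involved, the following pure-Python code runs one scalar LSTM cell over a short signal; the weight values and the signal are arbitrary illustrations, not the paper's trained model.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    """One LSTM cell step for scalar input and state.

    w holds illustrative weights for the input (i), forget (f),
    output (o) gates and the candidate cell state (g).
    """
    i = sigmoid(w["w_ix"] * x + w["w_ih"] * h_prev + w["b_i"])    # input gate
    f = sigmoid(w["w_fx"] * x + w["w_fh"] * h_prev + w["b_f"])    # forget gate
    o = sigmoid(w["w_ox"] * x + w["w_oh"] * h_prev + w["b_o"])    # output gate
    g = math.tanh(w["w_gx"] * x + w["w_gh"] * h_prev + w["b_g"])  # candidate
    c = f * c_prev + i * g   # new cell state: keep old memory, add new
    h = o * math.tanh(c)     # new hidden state exposed to later layers
    return h, c

# Run a short hypothetical acceleration signal through the cell.
weights = {k: 0.5 for k in
           ("w_ix", "w_ih", "b_i", "w_fx", "w_fh", "b_f",
            "w_ox", "w_oh", "b_o", "w_gx", "w_gh", "b_g")}
h, c = 0.0, 0.0
for x in [0.1, -0.2, 0.3]:
    h, c = lstm_step(x, h, c, weights)
```

The gating structure is what lets the network carry long-range temporal context from the decomposed signal components into the settlement prediction.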
Domain adaptation aims to reduce the distribution gap between the training data (source domain) and the target data. This enables effective predictions even for domains not seen during training. However, most conventional domain adaptation methods assume a single source domain, making them less suitable for modern deep learning settings that rely on diverse and large-scale datasets. To address this limitation, recent research has focused on Multi-Source Domain Adaptation (MSDA), which aims to learn effectively from multiple source domains. In this paper, we propose Efficient Domain Transition for Multi-source (EDTM), a novel and efficient framework designed to tackle two major challenges in existing MSDA approaches: (1) integrating knowledge across different source domains and (2) aligning label distributions between source and target domains. EDTM leverages an ensemble-based classifier expert mechanism to enhance the contribution of source domains that are more similar to the target domain. To further stabilize the learning process and improve performance, we incorporate imitation learning into the training of the target model. In addition, Maximum Classifier Discrepancy (MCD) is employed to align class-wise label distributions between the source and target domains. Experiments were conducted using Digits-Five, one of the most representative benchmark datasets for MSDA. The results show that EDTM consistently outperforms existing methods in terms of average classification accuracy. Notably, EDTM achieved significantly higher performance on target domains such as the Modified National Institute of Standards and Technology with blended background images (MNIST-M) and Street View House Numbers (SVHN) datasets, demonstrating enhanced generalization compared to baseline approaches. Furthermore, an ablation study analyzing the contribution of each loss component validated the effectiveness of the framework, highlighting the importance of each module in achieving optimal performance.
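The MCD component mentioned above measures how much two classifier heads disagree on a target sample. A common form of that discrepancy term is the mean L1 distance between their class-probability outputs, sketched here in pure Python; the logit values are hypothetical.

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    s = sum(exps)
    return [v / s for v in exps]

def mcd_discrepancy(logits_a, logits_b):
    """Mean L1 distance between two classifiers' class probabilities,
    the kind of discrepancy MCD-style alignment minimizes/maximizes."""
    pa, pb = softmax(logits_a), softmax(logits_b)
    return sum(abs(x - y) for x, y in zip(pa, pb)) / len(pa)

# Two hypothetical classifier heads scoring the same target sample.
d = mcd_discrepancy([2.0, 0.5, -1.0], [1.0, 1.2, -0.5])
```

In MCD training, the classifiers are pushed to maximize this quantity on target data while the feature extractor is trained to minimize it, which drives target features toward regions where the heads agree.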
Here we report on simultaneous lidar observations of sporadic Ni (Nis) layers and sporadic Na (Nas) layers in the atmosphere over Yanqing, Beijing (40.42°N, 116.02°E) from April 2019 to October 2022. During 343 nights of observation, 68 Nis and 56 Nas were observed. The seasonal variation of Nis and Nas was also obtained, with the highest occurrence of Nis being in July (43%) and that of Nas being in June (61%). We found that the seasonal variation of Nis is similar to that of Nas and that both occur more frequently in summer than in winter. In addition, we found 23 events in which Nis and Nas occurred simultaneously. The average peak altitude of Nas is approximately 1 km higher than that of Nis, and the peak density ratio of Nas to Nis is approximately 5, which is half the density ratio of the two main layers. Additionally, the strength factor for Nas is smaller than that for Nis. Through data analysis of sporadic E layers (Es), we found that Nis and Nas have a significant correlation with Es. The neutralization rates of Ni^(+)/Na^(+) were calculated according to the dissociative recombination reaction of Ni^(+)/Na^(+) and the WACCM-Ni (Whole Atmosphere Community Climate Model of Ni). The production rates of Ni and Na were estimated to be approximately 1:4.4, which is consistent with the density ratio of Nis to Nas. The results showed that the neutralization reaction of Ni^(+), Na^(+), and electrons in Es is the main reason for the formation of the Nis layer and the Nas layer.
Benthic habitat mapping is an emerging discipline in the international marine field in recent years, providing an effective tool for marine spatial planning, marine ecological management, and decision-making applications. Seabed sediment classification is one of the main contents of seabed habitat mapping. In response to the impact of remote sensing imaging quality and the limitations of acoustic measurement range, where a single data source does not fully reflect the substrate type, we proposed a high-precision seabed habitat sediment classification method that integrates data from multiple sources. Based on WorldView-2 multi-spectral remote sensing image data and multibeam bathymetry data, we constructed a random forest (RF) classifier with optimal feature selection. A seabed sediment classification experiment integrating optical remote sensing and acoustic remote sensing data was carried out in the shallow water area of Wuzhizhou Island, Hainan, South China. Different seabed sediment types, such as sand, seagrass, and coral reefs, were effectively identified, with an overall classification accuracy of 92%. Experimental results show that the RF classifier optimized by fusing multi-source remote sensing data for feature selection performed better than classifiers built on simple combinations of data sources, which improved the accuracy of seabed sediment classification. Therefore, the method proposed in this paper can be effectively applied to high-precision seabed sediment classification and habitat mapping around islands and reefs.
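The random forest classifier at the heart of this abstract aggregates many weak trees by majority vote. The toy sketch below illustrates just that voting rule using one-split decision stumps in place of full trees; the feature names, thresholds, and class labels are hypothetical, not the paper's trained model.

```python
from collections import Counter

def stump(feature_idx, threshold, below, above):
    """A one-split decision stump: a minimal stand-in for one tree."""
    return lambda x: below if x[feature_idx] < threshold else above

def forest_predict(stumps, x):
    """Majority vote across stumps, the aggregation rule a random
    forest applies over its trees."""
    votes = Counter(s(x) for s in stumps)
    return votes.most_common(1)[0][0]

# Hypothetical stumps over [water_depth_m, blue_reflectance] features,
# one drawing on acoustic bathymetry and one on optical imagery.
trees = [stump(0, 5.0, "coral", "sand"),
         stump(1, 0.3, "sand", "coral"),
         stump(0, 8.0, "coral", "sand")]
label = forest_predict(trees, [3.0, 0.6])  # a shallow, bright pixel
```

Because each tree can key on a different data source, the vote is one intuitive way multi-source fusion raises accuracy over any single-source classifier.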
Snow depth (SD) is a key parameter for research into global climate changes and land surface processes. A method was developed to obtain daily SD images at a higher 4 km spatial resolution and higher precision with SD measurements from in situ observations and passive microwave remote sensing of the Advanced Microwave Scanning Radiometer-EOS (AMSR-E), together with snow cover measurements of the Interactive Multisensor Snow and Ice Mapping System (IMS). AMSR-E SD at 25 km spatial resolution was retrieved from AMSR-E products of snow density and snow water equivalent and then corrected using the SD from in situ observations and IMS snow cover. Corrected AMSR-E SD images were then resampled to act as "virtual" in situ observations to combine with the real in situ observations to interpolate SD at 4 km spatial resolution using the Cressman method. Finally, daily SD data generation for several regions of China demonstrated that the method is well suited to the generation of higher spatial resolution SD data in regions with a lower Digital Elevation Model (DEM) but not so well suited to regions at high altitude and with an undulating terrain, such as the Tibetan Plateau. Analysis of the longer time period SD data generation for January between 2003 and 2010 in northern Xinjiang also demonstrated the feasibility of the method.
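The Cressman method named above estimates each grid point as a distance-weighted average of nearby observations using the classic weight (R² − d²)/(R² + d²) within an influence radius R. A minimal pure-Python sketch, with hypothetical station values:

```python
import math

def cressman(x0, y0, obs, radius):
    """Cressman-weighted average of observations within `radius`
    of grid point (x0, y0); obs is a list of (x, y, value) tuples."""
    num = den = 0.0
    r2 = radius ** 2
    for x, y, v in obs:
        d2 = (x - x0) ** 2 + (y - y0) ** 2
        if d2 < r2:
            w = (r2 - d2) / (r2 + d2)  # classic Cressman weight
            num += w * v
            den += w
    return num / den if den > 0 else None  # None if no station in range

# Hypothetical station snow depths (cm) around a 4 km grid point;
# the far station at (9, 9) falls outside the influence radius.
stations = [(0.0, 0.0, 12.0), (3.0, 1.0, 18.0), (9.0, 9.0, 40.0)]
sd = cressman(1.0, 1.0, stations, radius=5.0)
```

In the paper's setting, both real stations and the resampled "virtual" AMSR-E observations would appear together in the observation list, which is what lets sparse ground data and coarse satellite retrievals blend into one 4 km field. Operational schemes typically iterate this with a shrinking radius.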
On January 7, 2025, an Ms6.8 earthquake struck Dingri County, Xigazê City, in the Xizang Autonomous Region. The epicenter, located near the Shenzha-Dingjie fault zone at the boundary between the Qinghai-Xizang Plateau and the Indian Plate, marked the largest earthquake in the region in recent years. The Shenzha-Dingjie fault zone, situated at this plate boundary, is a key tectonic feature in the India-Eurasia collision process, exhibiting both thrust and strike-slip faulting. This study analyzed the disaster characteristics induced by the earthquake using Differential Synthetic Aperture Radar Interferometry (D-InSAR) to process Sentinel-1 satellite data and derive pre- and post-earthquake surface deformation information. Additionally, high-resolution optical remote sensing data, UAV (unmanned aerial vehicle) imagery, and airborne LiDAR (light detection and ranging) data were employed to analyze the spatial distribution of the surface rupture zone, with field investigations validating the findings. Key results include: (1) Field verification confirmed that potential landslide hazard points identified via optical image interpretation did not exhibit secondary landslide activity; (2) D-InSAR revealed the co-seismic surface deformation pattern, providing detailed deformation information for the Dingri region; (3) Integration of LiDAR and optical imagery further refined and validated the surface rupture characteristics identified by optical and InSAR analysis, indicating a predominantly north-south rupture zone. Additionally, surface fracture features extending in a near east-west direction were observed on the southeast side of the epicenter, accompanied by some infrastructure damage; (4) Surface fracture was most severe in high-intensity seismic areas near the epicenter, with the maximum surface displacement approximately 28 km from the epicenter. The earthquake-induced surface deformation zone spanned approximately 6 km by 46 km, with deformation concentrated primarily on the western side of the Dingmucuo Fault, where maximum subsidence of 0.65 m was detected. On the eastern side, uplift was dominant, reaching a maximum of 0.75 m. This earthquake poses significant threats to local communities and infrastructure, underscoring the urgent need for continued monitoring in affected areas. The findings highlight the effectiveness of multi-source data fusion (space-air-ground based observation) in seismic disaster assessment, offering a methodological framework for rapid post-earthquake disaster response and providing a valuable scientific foundation for mitigating secondary disasters in the region.
The Macao satellites differ from their predecessors in their orbits: MSS-1 (Macao Science Satellite-1) is in a low-inclination orbit and the planned MSS-2 will be in a highly elliptical orbit. This paper reviews the fundamental advantages and disadvantages of the different possible magnetic measurements: the component (declination, intensity, etc.) and location (satellite, ground, etc.). When planning a survey, the choice of component is the "What?" question; the choice of location the "Where?" question. Results from potential theory inform the choice of measurement and data analysis. For example, knowing the vertical component of the magnetic field provides a solution for the full magnetic field everywhere in the potential region. This is the familiar Neumann problem. In reality this ideal dataset is never available. In the past we were restricted to declination data only, then direction only, then total intensity only. There have also been large swathes of Earth's surface with no measurements at all (MSS-1 is restricted to latitudes below). These incomplete datasets throw up new questions for potential theory, questions that have some intriguing answers. When only declination is known, uniqueness is provided by horizontal intensity measurements on a single line joining the dip-poles. When only directions are involved, uniqueness is provided by a single intensity measurement, at least in principle. Paleomagnetic intensities can help. When only total intensity is known, as was largely the case in the early satellite era, uniqueness is provided by a precise location of the magnetic equator. Holes in the data distribution are a familiar problem in geophysical studies. All magnetic measurements sample, to a greater or lesser extent, the potential field everywhere. There is a trade-off between measurements close to the source, good for small targets and high resolution, and the broader sample of a distant measurement. The sampling of a measurement is given by the appropriate Green's function of the Laplacian, which determines both the resolution and scope of the measurement. For example, radial and horizontal measurements near the Earth's surface give a weighted average of the radial component over a patch of the core surface beneath the measurement site about in radius. The patch is smaller for shallower surfaces, for example from satellite to ground. Holes in the data distribution do not correspond to similar holes at the source surface; the price paid is in resolution of the source. I argue that, in the past, we have been too reluctant to take advantage of incomplete and apparently hopeless datasets.
Accurate estimation of understory terrain has significant scientific importance for maintaining ecosystem balance and biodiversity conservation. Addressing the issue of inadequate representation of spatial heterogeneity when traditional forest topographic inversion methods consider the entire forest as the inversion unit, this study proposes a differentiated modeling approach to forest types based on refined land cover classification. Taking Puerto Rico and Maryland as study areas, a multi-dimensional feature system is constructed by integrating multi-source remote sensing data: ICESat-2 spaceborne LiDAR is used to obtain benchmark values for understory terrain, topographic factors such as slope and aspect are extracted based on SRTM data, and vegetation cover characteristics are analyzed using Landsat-8 multispectral imagery. This study incorporates forest type as a classification modeling condition and applies the random forest algorithm to build differentiated topographic inversion models. Experimental results indicate that, compared to traditional whole-area modeling methods (RMSE=5.06 m), forest type-based classification modeling significantly improves the accuracy of understory terrain estimation (RMSE=2.94 m), validating the effectiveness of spatial heterogeneity modeling. Further sensitivity analysis reveals that canopy structure parameters (with RMSE variation reaching 4.11 m) exert a stronger regulatory effect on estimation accuracy compared to forest cover, providing important theoretical support for optimizing remote sensing models of forest topography.
Adaptive optics (AO) has significantly advanced high-resolution solar observations by mitigating atmospheric turbulence. However, traditional post-focal AO systems suffer from external configurations that introduce excessive optical surfaces, reduced light throughput, and instrumental polarization. To address these limitations, we propose an embedded solar adaptive optics telescope (ESAOT) that intrinsically incorporates the solar AO (SAO) subsystem within the telescope's optical train, featuring a co-designed correction chain with a single Hartmann-Shack full-wavefront sensor (HS f-WFS) and a deformable secondary mirror (DSM). The HS f-WFS uses a temporal-spatial hybrid sampling technique to simultaneously resolve tip-tilt and high-order aberrations, while the DSM performs real-time compensation through adaptive modal optimization. This unified architecture achieves symmetrical polarization suppression and high system throughput by minimizing optical surfaces. A 600 mm ESAOT prototype incorporating a 12×12 micro-lens array HS f-WFS and a 61-actuator piezoelectric DSM has been developed and has successfully conducted on-sky photospheric observations. Validations including turbulence simulations, optical bench testing, and practical observations at the Lijiang observatory collectively confirm the system's capability to maintain about λ/10 wavefront error during active region tracking. This architectural breakthrough of the ESAOT addresses long-standing SAO integration challenges in solar astronomy, and scalability analyses confirm direct applicability to existing and future large solar observation facilities.
To elucidate the fracturing mechanism of deep hard rock under complex disturbance environments, this study investigates the dynamic failure behavior of pre-damaged granite subjected to multi-source dynamic disturbances. Blasting vibration monitoring was conducted in a deep-buried drill-and-blast tunnel to characterize in-situ dynamic loading conditions. Subsequently, true triaxial compression tests incorporating multi-source disturbances were performed using a self-developed wide-low-frequency true triaxial system to simulate disturbance accumulation and damage evolution in granite. The results demonstrate that combined dynamic disturbances and unloading damage significantly accelerate strength degradation and trigger shear-slip failure along preferentially oriented blast-induced fractures, with strength reductions up to 16.7%. Layered failure was observed on the free surface of pre-damaged granite under biaxial loading, indicating a disturbance-induced fracture localization mechanism. Time-stress-fracture-energy coupling fields were constructed to reveal the spatiotemporal characteristics of fracture evolution. Critical precursor frequency bands (105-150, 185-225, and 300-325 kHz) were identified, which serve as diagnostic signatures of impending failure. A dynamic instability mechanism driven by multi-source disturbance superposition and pre-damage evolution was established. Furthermore, a grouting-based wave-absorption control strategy was proposed to mitigate deep dynamic disasters by attenuating disturbance amplitude and reducing excitation frequency.
The SiO_(2) inverse opal photonic crystals (PC) with a three-dimensional macroporous structure were fabricated by the sacrificial template method, followed by infiltration of a pyrene derivative, 1-(pyren-8-yl)but-3-en-1-amine (PEA), to achieve a formaldehyde (FA)-sensitive and fluorescence-enhanced sensing film. Utilizing the specific Aza-Cope rearrangement reaction between the allylamine of PEA and FA to generate a strongly fluorescent product emitting at approximately 480 nm, we chose a PC whose blue stopband edge overlapped with the fluorescence emission wavelength. By virtue of the fluorescence enhancement derived from the slow photon effect of the PC, FA was detected highly selectively and sensitively. The limit of detection (LoD) was calculated to be 1.38 nmol/L. Furthermore, fast detection of FA (within 1 min) is realized due to the interconnected three-dimensional macroporous structure of the inverse opal PC and its high specific surface area. The prepared sensing film can be used for the detection of FA in air, aquatic products, and living cells. The close agreement between the measured indoor-air FA content and the result from an FA detector, the recovery rate of 101.5% for detecting FA in aquatic products, and fast fluorescence imaging within 2 min for living cells demonstrate the reliability and accuracy of our method in practical applications.
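The abstract reports an LoD but not the formula used. A common convention for fluorescence sensors is LoD = 3σ/slope, where σ is the standard deviation of blank readings and the slope comes from the calibration curve; the sketch below uses that convention with purely hypothetical numbers.

```python
import math

def lod_3sigma(blank_readings, slope):
    """Limit of detection via the common 3*sigma/slope rule:
    three times the sample std. dev. of blank fluorescence readings,
    divided by the calibration-curve slope."""
    n = len(blank_readings)
    mean = sum(blank_readings) / n
    var = sum((r - mean) ** 2 for r in blank_readings) / (n - 1)
    return 3.0 * math.sqrt(var) / slope

# Hypothetical blank intensities (a.u.) and a hypothetical slope
# in intensity units per nmol/L of FA.
blanks = [100.2, 100.5, 99.8, 100.1, 100.4]
lod = lod_3sigma(blanks, slope=0.6)  # nmol/L
```

Whether the authors used exactly this rule is not stated in the abstract; the code only illustrates how such a nmol/L-scale LoD is typically derived from blank noise and calibration sensitivity.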
Due to the development of cloud computing and machine learning, users can upload their data to the cloud for machine learning model training. However, dishonest clouds may infer user data, resulting in user data leakage. Previous schemes have achieved secure outsourced computing, but they suffer from low computational accuracy, difficulty in handling heterogeneous distributions of data from multiple sources, and high computational cost, which result in extremely poor user experience and expensive cloud computing costs. To address the above problems, we propose a multi-precision, multi-sourced, and multi-key outsourcing neural network training scheme. Firstly, we design a multi-precision functional encryption computation based on Euclidean division. Second, we design the outsourcing model training algorithm based on multi-precision functional encryption with multi-sourced heterogeneity. Finally, we conduct experiments on three datasets. The results indicate that our framework achieves an accuracy improvement of 6% to 30%. Additionally, it offers a memory space optimization of 1.0×2^(24) times compared to the previous best approach.
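The scheme's multi-precision computation is built on Euclidean division (value = q·base + r). As a minimal illustration of that arithmetic idea only (the paper's actual encrypted construction is not given in the abstract), the sketch below encodes a real number as an integer and splits it into high and low parts that could be handled at different precisions.

```python
def split_precision(value, base):
    """Split an integer-encoded value into a high part (quotient) and
    a low part (remainder) by Euclidean division: value = q*base + r."""
    q, r = divmod(value, base)
    return q, r

def merge_precision(q, r, base):
    """Invert the split exactly."""
    return q * base + r

# Encode 3.14159 at 10^-5 precision, then split on base 1000 so the
# high and low parts could, in principle, be encrypted or processed
# separately and recombined without precision loss.
encoded = round(3.14159 * 10**5)          # integer encoding: 314159
hi, lo = split_precision(encoded, 1000)
restored = merge_precision(hi, lo, 1000) / 10**5
```

Because the split is exact integer arithmetic, recombination is lossless, which is the property a multi-precision encrypted pipeline needs to avoid the accuracy degradation the abstract criticizes in prior schemes.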
Accurate monitoring of track irregularities is very helpful for improving vehicle operation quality and formulating appropriate track maintenance strategies. Existing methods have the problem that they rely on complex signal processing algorithms and lack multi-source data analysis. Driven by multi-source measurement data, including the axle box, bogie frame, and carbody accelerations, this paper proposes a track irregularities monitoring network (TIMNet) based on deep learning methods. TIMNet uses the feature extraction capability of convolutional neural networks and the sequence mapping capability of the long short-term memory model to explore the mapping relationship between vehicle accelerations and track irregularities. The particle swarm optimization algorithm is used to optimize the network parameters, so that both vertical and lateral track irregularities can be accurately identified in the time and spatial domains. The effectiveness and superiority of the proposed TIMNet are analyzed under different simulation conditions using a vehicle dynamics model. Field tests are conducted to prove the availability of the proposed TIMNet in quantitatively monitoring vertical and lateral track irregularities. Furthermore, comparative tests show that the TIMNet has a better fitting degree and timeliness in monitoring track irregularities (vertical R2 of 0.91, lateral R2 of 0.84, and time cost of 10 ms) compared to other classical regression models. The tests also prove that the TIMNet has better anti-interference ability than other regression models.
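The particle swarm optimization (PSO) step mentioned above tunes network parameters by letting candidate solutions share information. Here is a minimal, self-contained PSO on a toy 2-D loss surface standing in for the network's parameter search; the inertia and cognitive/social coefficients (0.7, 1.5, 1.5) are conventional illustrative values, not the paper's settings.

```python
import random

def pso_minimize(loss, dim, n_particles=10, iters=50, seed=0):
    """Minimal particle swarm optimization: each particle remembers
    its personal best; the swarm shares one global best."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [loss(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # inertia + pull toward personal best + pull toward global best
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            v = loss(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

# Toy stand-in for a network-parameter loss surface, minimum at (1, -2).
best, best_val = pso_minimize(lambda p: (p[0] - 1) ** 2 + (p[1] + 2) ** 2, dim=2)
```

In the paper's pipeline the same loop would score each particle by the validation error of a TIMNet configured with that particle's parameters, rather than by this analytic toy function.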
In the heterogeneous power internet of things (IoT) environment, data signals are acquired to support different business systems and realize advanced intelligent applications, and these signals are massive, multi-source, and heterogeneous. Reliable perception of information and efficient transmission of energy in multi-source heterogeneous environments are crucial issues. Compressive sensing (CS), as an effective method of signal compression and transmission, can accurately recover the original signal from very few samples. In this paper, we study a new method for multi-source heterogeneous data signal reconstruction in the power IoT based on compressive sensing technology. Building on traditional compressive sensing, which directly recovers multi-source heterogeneous signals, we make full use of the interference subspace information to design the measurement matrix, which directly and effectively eliminates the interference while making the measurement. The measurement matrix is further optimized by minimizing its average cross-coherence, which improves the reconstruction performance of the new method. Finally, the effectiveness of the new method with different parameter settings under different multi-source heterogeneous data signal cases is verified using orthogonal matching pursuit (OMP) and sparsity adaptive matching pursuit (SAMP), covering practical environments both with and without prior information on signal sparsity.
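The optimization objective named above, average cross-coherence, is the mean absolute normalized inner product between distinct columns of the measurement matrix; lower values generally aid sparse recovery. A pure-Python sketch of the metric itself (the paper's minimization procedure is not given in the abstract):

```python
import math

def average_cross_coherence(matrix):
    """Average absolute normalized inner product between distinct
    columns of a measurement matrix (given as a list of rows)."""
    ncols = len(matrix[0])
    cols = [[row[j] for row in matrix] for j in range(ncols)]
    norms = [math.sqrt(sum(v * v for v in c)) for c in cols]
    total, pairs = 0.0, 0
    for i in range(ncols):
        for j in range(i + 1, ncols):
            dot = sum(a * b for a, b in zip(cols[i], cols[j]))
            total += abs(dot) / (norms[i] * norms[j])
            pairs += 1
    return total / pairs

# A toy 3x4 measurement matrix; an optimizer would perturb its entries
# to drive this metric down before running OMP/SAMP recovery.
phi = [[1.0, 0.0, 1.0, 0.5],
       [0.0, 1.0, 1.0, 0.5],
       [1.0, 1.0, 0.0, 0.5]]
mu = average_cross_coherence(phi)
```

Orthonormal columns give a coherence of zero; the closer the designed matrix gets to that ideal while still canceling the interference subspace, the fewer measurements OMP and SAMP need for exact reconstruction.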
This paper discusses in depth the causes of gear howling noise, the identification and analysis of multi-source excitation, the transmission path of dynamic noise, simulation and experimental research, case analysis, and optimization effects, aiming to provide guidance and reference for relevant researchers.
With the accelerating intelligent transformation of energy systems, the monitoring of equipment operation status and the optimization of production processes in thermal power plants face the challenge of multi-source heterogeneous data integration. In view of the heterogeneous characteristics of physical sensor data, including temperature, vibration, and pressure generated by boilers, steam turbines, and other key equipment, as well as real-time working condition data from the SCADA system, this paper proposes a multi-source heterogeneous data fusion and analysis platform for thermal power plants based on edge computing and deep learning. By constructing a multi-level fusion architecture, the platform adopts a dynamic weight allocation strategy and a 5D digital twin model to realize the collaborative analysis of physical sensor data, simulation calculation results, and expert knowledge. The data fusion module combines Kalman filtering, wavelet transform, and Bayesian estimation to solve the problems of data time-series alignment and dimension differences. Simulation results show that the data fusion accuracy can be improved to more than 98%, and the calculation delay can be controlled within 500 ms. The data analysis module integrates a Dymola simulation model and the AERMOD pollutant diffusion model, supports cascade analysis of boiler combustion efficiency prediction and flue gas emission monitoring, keeps system response time below 2 seconds, and achieves a data consistency verification accuracy of 99.5%.
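The Kalman filtering used in the fusion module blends an uncertain estimate with each new measurement in proportion to their variances. A minimal scalar sketch, with hypothetical sensor variances and readings standing in for the plant's real instrumentation:

```python
def kalman_update(x, p, z, r):
    """One scalar Kalman update: fuse state estimate x (variance p)
    with measurement z (variance r)."""
    k = p / (p + r)              # Kalman gain: trust ratio
    x_new = x + k * (z - x)      # blended estimate
    p_new = (1 - k) * p          # uncertainty shrinks after fusing
    return x_new, p_new

# Fuse two hypothetical boiler temperature sensors: a noisy one
# (variance 4.0) and a precise one (variance 1.0), starting from a
# vague prior estimate.
x, p = 500.0, 100.0              # prior mean (deg C) and variance
x, p = kalman_update(x, p, 512.0, 4.0)
x, p = kalman_update(x, p, 510.0, 1.0)
```

After both updates the estimate sits near the precise sensor's reading and the variance has collapsed well below either sensor's noise, which is the behavior a dynamic weight allocation strategy exploits when sources of different quality stream in together.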
Multi-source data fusion provides high-precision spatial situational awareness essential for analyzing granular urban social activities. This study used Shanghai's catering industry as a case study, leveraging electronic reviews and consumer data sourced from third-party restaurant platforms collected in 2021. By performing weighted processing on two-dimensional point-of-interest (POI) data, clustering hotspots of high-dimensional restaurant data were identified. A hierarchical network of restaurant hotspots was constructed following the Central Place Theory (CPT) framework, while the Geo-Informatic Tupu method was employed to resolve the challenges posed by network deformation in multi-scale processes. These findings suggest the necessity of enhancing the spatial balance of Shanghai's urban centers by moderately increasing the number and service capacity of suburban centers at the urban periphery. Such measures would contribute to a more optimized urban structure and facilitate the outward dispersion of comfort-oriented facilities such as the restaurant industry. At a finer spatial scale, the distribution of restaurant hotspots demonstrates a polycentric and symmetric spatial pattern, with a developmental trend radiating outward along the city's ring roads. This trend can be attributed to the efforts of restaurants to establish connections with other urban functional spaces, leading to the reconfiguration of urban spaces, expansion of restaurant-dedicated land use, and the reorganization of associated commercial activities. The results validate the existence of a polycentric urban structure in Shanghai but also highlight the instability of the restaurant hotspot network during cross-scale transitions.
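The weighted POI processing described above boils down to aggregating weighted points in space and keeping the dense cells. A simple grid-binning sketch of that idea, with hypothetical coordinates and review-derived weights (the study's actual clustering algorithm is not specified in the abstract):

```python
from collections import defaultdict

def weighted_hotspots(pois, cell_size, threshold):
    """Bin weighted POIs (x, y, weight) into square grid cells and
    return the cells whose total weight exceeds `threshold`."""
    cells = defaultdict(float)
    for x, y, w in pois:
        key = (int(x // cell_size), int(y // cell_size))
        cells[key] += w
    return {c: w for c, w in cells.items() if w > threshold}

# Hypothetical restaurants weighted by normalized review volume;
# two cluster near the origin, one sits far away.
pois = [(0.2, 0.3, 0.9), (0.4, 0.1, 0.8), (5.1, 5.2, 0.2)]
hot = weighted_hotspots(pois, cell_size=1.0, threshold=1.0)
```

Varying `cell_size` is one way to probe the cross-scale instability the study reports: hotspots that cohere at a coarse grid can fragment or merge when the cell size changes.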
This study explored the observation strategy and effectiveness of synoptic-scale adaptive observations for improving sea fog prediction in coastal regions around the Bohai Sea, based on a poorly predicted fog event with a cold-front synoptic pattern (CFSP). An ensemble Kalman filter data assimilation system for the Weather Research and Forecasting model was adopted together with ensemble sensitivity analysis (ESA). By comparing observation impacts (estimated from a 40-member ensemble with ESA) among different meteorological observation variables and pressure levels, the temperature at 850 hPa and in the surface layer (850 hPa-and-surface temperature) was selected as the target observation type. Additionally, the area with large observation impacts for this observation type was predicted to lie in the transition region of the surface low–high system. This area developed southward with the low and moved eastward with the low–high system, which could be explained by the main features of the CFSP. Moreover, experiments assimilating both synthetic and real observations showed that assimilating 850 hPa-and-surface temperature observations generally yielded better fog coverage forecasts in areas with greater observation impacts than in areas with smaller impacts. However, the effectiveness of adaptive observations was reduced when real rather than synthetic observations were assimilated, possibly due to factors such as observation and model errors. The main conclusions were verified with another typical fog event with CFSP characteristics. The results highlight the importance of improved initial conditions in the transition region of the low–high system for fog prediction and provide scientific guidance for implementing an observation network for fog forecasting over the Bohai Sea.
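The ensemble sensitivity calculation described above regresses a scalar forecast metric onto each initial-condition variable using ensemble statistics. A minimal numpy sketch under that standard formulation (the variable names and synthetic ensemble are illustrative, not the study's data):

```python
import numpy as np

def ensemble_sensitivity(J, X):
    """Regress a scalar forecast metric J (n_members,) onto each initial-state
    variable in X (n_members, n_vars): dJ/dx_i ~ cov(J, x_i) / var(x_i)."""
    Jc = J - J.mean()
    Xc = X - X.mean(axis=0)
    cov = Jc @ Xc / (len(J) - 1)                 # covariance of J with each variable
    var = (Xc ** 2).sum(axis=0) / (len(J) - 1)   # ensemble variance per variable
    return cov / var

rng = np.random.default_rng(0)
x = rng.normal(size=(40, 3))                     # 40-member ensemble, 3 state variables
j = 2.0 * x[:, 0] + rng.normal(scale=0.1, size=40)  # metric driven by variable 0
s = ensemble_sensitivity(j, x)
# the first variable should dominate the sensitivity estimate
```

Large |sensitivity| flags the state variables (and, by extension, regions) where improved observations would most change the forecast metric.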
Taking the Ming Tombs Forest Farm in Beijing as the research object, this research applied multi-source data fusion and GIS heat-map overlay analysis techniques. It systematically collected bird observation point data from the Global Biodiversity Information Facility (GBIF), population distribution data from the Oak Ridge National Laboratory (ORNL) in the United States, and information on the composition of tree species in forest areas suitable for birds, together with the forest geographical information of the Ming Tombs Forest Farm, based on literature research and field investigations. Using GIS technology, spatial processing was carried out on bird observation points and population distribution data to identify suitable bird-watching areas in different seasons. These areas were then classified by suitability value range into grades from unsuitable to highly suitable. The findings indicated significant spatial heterogeneity in the bird-watching suitability of the Ming Tombs Forest Farm. The north side of the reservoir was a core area with high suitability in all seasons. The old-growth broad-leaved mixed forests in the forest interior supported overlapping ecological niches of various bird species, such as Zosterops simplex and Urocissa erythrorhyncha. In contrast, the pure coniferous forests and mixed forests at the forest edge were more suitable for specialized species such as Carduelis sinica. The southern urban area and the core area of the mausoleums had relatively low suitability due to ecological fragmentation or human interference. Based on these results, this paper proposed a three-level protection framework of "core area conservation – buffer zone management – isolation zone construction" and a spatio-temporally coordinated human-bird co-existence strategy. It also suggested that the human-bird co-existence space could be optimized through measures such as constructing sound and light buffer interfaces, restoring ecological corridors, and integrating cultural heritage elements. This research provides an operational technical approach and decision-making support for the scientific planning of bird-watching sites and the coordination of ecological protection and tourism development.
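The grading step above maps continuous suitability values into discrete classes by value range. A minimal sketch (the 0–1 score scale and the grade boundaries are assumptions; the abstract does not give the actual value ranges):

```python
import numpy as np

# Hypothetical grade boundaries on a 0-1 suitability score.
BINS = [0.25, 0.5, 0.75]
LABELS = ["unsuitable", "marginal", "suitable", "highly suitable"]

def grade(suitability):
    """Map an array of suitability scores to ordinal class labels."""
    idx = np.digitize(suitability, BINS)   # bin index per score
    return [LABELS[i] for i in idx]

grades = grade(np.array([0.1, 0.3, 0.6, 0.9]))
# → ['unsuitable', 'marginal', 'suitable', 'highly suitable']
```

In a GIS workflow the same binning would be applied cell-by-cell to the overlaid heat-map raster to produce the seasonal suitability maps.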
The ice-phase microphysical characteristics of a stratiform cloud system over the Qilian Mountains in northwestern China on 15 September 2022 were analyzed using aircraft data. The stratiform cloud system developed under southwesterly flow at 500 hPa and was affected locally by topography. Synoptic features and aircraft observations revealed strengthened cloud development on the leeward slope. The ice particle habits and microphysical processes at heights of 6-8 km were investigated. The cloud system was characterized by extremely low supercooled liquid water content at temperatures between −4°C and −17°C. Ice particle concentrations ranged predominantly from 10 to 30 L^(−1), corresponding to ice water content of 0.01 to 0.05 g m^(−3). Active ice aggregation was observed at temperatures colder than −10°C. The windward side of the cloud system exhibited weaker development and two distinct cloud layers. Intense orographic uplift on the leeward slope enhanced ice particle aggregation. Clouds on the lee side presented lower ice particle concentrations but larger sizes than those on the windward side. The influence of aggregation on the ice particle size distribution was reflected in two main aspects: bimodal spectra at −16°C, with the first peak at 125 μm and a subpeak at 400-500 μm, and broadened size spectra at −13°C due to significant aggregation of dendrites.
Funding: Sponsored by the National Natural Science Foundation of China (Grant No. 52178100).
Abstract: The spatial offset of bridges has a significant impact on the safety, comfort, and durability of high-speed railway (HSR) operations, so it is crucial to detect the spatial offset of operational HSR bridges rapidly and effectively. Drive-by monitoring of bridge uneven settlement shows significant potential due to its practicality, cost-effectiveness, and efficiency. However, existing drive-by methods for detecting bridge offset have limitations such as reliance on a single data source, low detection accuracy, and the inability to identify lateral deformations. This paper proposes a novel drive-by inspection method for the spatial offset of HSR bridges based on multi-source data fusion from a comprehensive inspection train. First, dung beetle optimizer-variational mode decomposition was employed to adaptively decompose non-stationary dynamic signals and explore the hidden temporal relationships in the data. Subsequently, a long short-term memory neural network was developed to fuse features from multi-source signals and accurately predict the spatial settlement of HSR bridges. A dataset of track irregularities and CRH380A high-speed train responses was generated using a 3D train-track-bridge interaction model, and the accuracy and effectiveness of the proposed hybrid deep learning model were numerically validated. Finally, the reliability of the proposed drive-by inspection method was further validated by analyzing actual measurement data obtained from a comprehensive inspection train. The findings indicate that the proposed approach enables rapid and accurate detection of spatial offset in HSR bridges, supporting their long-term operational safety.
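The long short-term memory network used for feature fusion can be illustrated with a minimal single-cell forward pass in numpy. This is a generic LSTM cell, not the paper's actual architecture; the weights are random and the stacked gate order [i, f, o, g] is one common convention:

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step. x: input (d,); h, c: hidden/cell states (n,).
    W: (4n, d), U: (4n, n), b: (4n,) stacked gate parameters [i, f, o, g]."""
    z = W @ x + U @ h + b
    n = h.size
    i = 1 / (1 + np.exp(-z[:n]))          # input gate
    f = 1 / (1 + np.exp(-z[n:2*n]))       # forget gate
    o = 1 / (1 + np.exp(-z[2*n:3*n]))     # output gate
    g = np.tanh(z[3*n:])                  # candidate cell state
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

rng = np.random.default_rng(1)
d, n = 3, 4                               # e.g. 3 fused sensor channels, 4 hidden units
W = rng.normal(size=(4*n, d))
U = rng.normal(size=(4*n, n))
b = np.zeros(4*n)
h = c = np.zeros(n)
for t in range(5):                        # run a short multi-channel sequence
    h, c = lstm_step(rng.normal(size=d), h, c, W, U, b)
```

Stacking such cells over time is what lets the network exploit the temporal relationships exposed by the decomposition step.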
Funding: Supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. RS-2024-00406320) and the Institute of Information & Communications Technology Planning & Evaluation (IITP) Innovative Human Resource Development for Local Intellectualization Program grant funded by the Korea government (MSIT) (IITP-2026-RS-2023-00259678).
Abstract: Domain adaptation aims to reduce the distribution gap between the training data (source domain) and the target data, enabling effective predictions even for domains not seen during training. However, most conventional domain adaptation methods assume a single source domain, making them less suitable for modern deep learning settings that rely on diverse and large-scale datasets. To address this limitation, recent research has focused on Multi-Source Domain Adaptation (MSDA), which aims to learn effectively from multiple source domains. In this paper, we propose Efficient Domain Transition for Multi-source (EDTM), a novel and efficient framework designed to tackle two major challenges in existing MSDA approaches: (1) integrating knowledge across different source domains and (2) aligning label distributions between source and target domains. EDTM leverages an ensemble-based classifier expert mechanism to enhance the contribution of source domains that are more similar to the target domain. To further stabilize the learning process and improve performance, we incorporate imitation learning into the training of the target model. In addition, Maximum Classifier Discrepancy (MCD) is employed to align class-wise label distributions between the source and target domains. Experiments were conducted on Digits-Five, one of the most representative benchmark datasets for MSDA. The results show that EDTM consistently outperforms existing methods in average classification accuracy. Notably, EDTM achieved significantly higher performance on target domains such as the Modified National Institute of Standards and Technology dataset with blended background images (MNIST-M) and the Street View House Numbers (SVHN) dataset, demonstrating enhanced generalization compared to baseline approaches. Furthermore, an ablation study analyzing the contribution of each loss component validated the effectiveness of the framework, highlighting the importance of each module in achieving optimal performance.
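The MCD criterion mentioned above measures the disagreement between two task classifiers on target samples; training alternately maximizes it (over the classifiers) and minimizes it (over the feature extractor). A minimal numpy sketch of the discrepancy itself, with toy logits in place of real network outputs:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))   # numerically stable softmax
    return e / e.sum(axis=1, keepdims=True)

def classifier_discrepancy(p1, p2):
    """Mean L1 distance between two classifiers' softmax outputs --
    the quantity MCD alternately maximizes and minimizes."""
    return float(np.mean(np.abs(p1 - p2)))

logits1 = np.array([[2.0, 0.0], [0.0, 2.0]])       # classifier 1 on 2 target samples
logits2 = np.array([[0.0, 2.0], [0.0, 2.0]])       # classifier 2 disagrees on sample 1
d = classifier_discrepancy(softmax(logits1), softmax(logits2))
# disagreement on the first sample yields a positive discrepancy
```

Target samples with high discrepancy lie near the decision boundary, which is exactly where the feature extractor is then pushed to align the domains.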
Funding: Supported by the Specialized Research Fund for State Key Laboratories, the Chinese Meridian Project, the Specialized Research Fund for the State Key Laboratory of Solar Activity and Space Weather, the Postgraduate Education Reform and Quality Improvement Project of Henan Province (Grant No. YJS2024JD32), the Natural Science Foundation Project of Henan Province (Grant No. 242300420253), and the National Natural Science Foundation of China for Young Scientists (Grant No. 42504156).
Abstract: Here we report simultaneous lidar observations of sporadic Ni (Nis) layers and sporadic Na (Nas) layers in the atmosphere over Yanqing, Beijing (40.42°N, 116.02°E) from April 2019 to October 2022. During 343 nights of observation, 68 Nis and 56 Nas events were observed. The seasonal variation of Nis and Nas was also obtained, with the highest occurrence of Nis in July (43%) and of Nas in June (61%). We found that the seasonal variation of Nis is similar to that of Nas and that both occur more frequently in summer than in winter. In addition, we found 23 events in which Nis and Nas occurred simultaneously. The average peak altitude of Nas is approximately 1 km higher than that of Nis, and the peak density ratio of Nas to Nis is approximately 5, which is half the density ratio of the two main layers. Additionally, the strength factor for Nas is smaller than that for Nis. Through data analysis of sporadic E layers (Es), we found that Nis and Nas have a significant correlation with Es. The neutralization rates of Ni^(+)/Na^(+) were calculated according to the dissociative recombination reaction of Ni^(+)/Na^(+) and WACCM-Ni (the Whole Atmosphere Community Climate Model of Ni). The production rates of Ni and Na were estimated at approximately 1:4.4, consistent with the density ratio of Nis to Nas. The results show that the neutralization reaction of Ni^(+), Na^(+), and electrons in Es is the main driver of the formation of the Nis and Nas layers.
Funding: Supported by the National Natural Science Foundation of China (Nos. 42376185, 41876111) and the Shandong Provincial Natural Science Foundation (No. ZR2023MD073).
Abstract: Benthic habitat mapping is an emerging discipline in the international marine field, providing an effective tool for marine spatial planning, marine ecological management, and decision-making applications. Seabed sediment classification is one of the main components of seabed habitat mapping. In response to the limitations of remote sensing imaging quality and acoustic measurement range, where a single data source does not fully reflect the substrate type, we proposed a high-precision seabed sediment classification method that integrates data from multiple sources. Based on WorldView-2 multi-spectral remote sensing image data and multibeam bathymetry data, we constructed a random forest (RF) classifier with optimal feature selection. A seabed sediment classification experiment integrating optical and acoustic remote sensing data was carried out in the shallow water area of Wuzhizhou Island, Hainan, South China. Different seabed sediment types, such as sand, seagrass, and coral reefs, were effectively identified, with an overall classification accuracy of 92%. Experimental results show that the RF classifier with feature selection over fused multi-source remote sensing data outperformed simple combinations of data sources, improving the accuracy of seabed sediment classification. Therefore, the proposed method can be effectively applied to high-precision seabed sediment classification and habitat mapping around islands and reefs.
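Feature selection of the kind described above ranks candidate features (spectral bands, bathymetric derivatives) by how informative they are for the class labels. As a stand-in for RF-based importance, here is a minimal correlation-scoring sketch on synthetic data (the scoring rule and data are assumptions, not the paper's procedure):

```python
import numpy as np

def feature_scores(X, y):
    """Score each feature by |correlation| with binary class labels --
    a simple proxy for the importance ranking an RF classifier provides."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    num = np.abs(yc @ Xc)
    den = np.sqrt((Xc ** 2).sum(axis=0) * (yc ** 2).sum()) + 1e-12
    return num / den

def select_top_k(X, y, k):
    """Indices of the k highest-scoring features."""
    return np.argsort(feature_scores(X, y))[::-1][:k]

rng = np.random.default_rng(2)
y = rng.integers(0, 2, size=200).astype(float)   # e.g. sand vs. seagrass labels
X = rng.normal(size=(200, 5))                    # 5 candidate features
X[:, 3] += 2.0 * y                               # make feature 3 informative
top = select_top_k(X, y, 2)
# feature 3 should rank first
```

In practice the classifier is then retrained on the selected subset, which is what "RF with optimal feature selection" amounts to.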
Funding: Meteorological Research in the Public Interest, No. GYHY201106014; Beijing Nova Program, No. 2010B037; China Special Fund for the National High Technology Research and Development Program of China (863 Program), No. 412230.
Abstract: Snow depth (SD) is a key parameter for research into global climate change and land surface processes. A method was developed to obtain daily SD images at a higher 4 km spatial resolution and with higher precision, using in situ SD observations, passive microwave remote sensing from the Advanced Microwave Scanning Radiometer-EOS (AMSR-E), and snow cover measurements from the Interactive Multisensor Snow and Ice Mapping System (IMS). AMSR-E SD at 25 km spatial resolution was retrieved from AMSR-E products of snow density and snow water equivalent and then corrected using SD from in situ observations and IMS snow cover. Corrected AMSR-E SD images were resampled to act as "virtual" in situ observations, combined with the real in situ observations, and interpolated to 4 km spatial resolution SD using the Cressman method. Daily SD data generation for several regions of China demonstrated that the method is well suited to regions with lower elevations in the Digital Elevation Model (DEM) but less suited to high-altitude regions with undulating terrain, such as the Tibetan Plateau. Analysis of SD data generated for January between 2003 and 2010 in northern Xinjiang also demonstrated the feasibility of the method over a longer time period.
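The Cressman interpolation used above weights each station within an influence radius R by w = (R² − r²)/(R² + r²). A minimal sketch with a toy two-station example (the radius and values are illustrative only):

```python
import numpy as np

def cressman(obs_xy, obs_val, grid_xy, R):
    """Cressman analysis: for each grid point, average observations within
    radius R with weights w = (R^2 - r^2) / (R^2 + r^2)."""
    out = np.full(len(grid_xy), np.nan)
    for g, p in enumerate(grid_xy):
        r2 = ((obs_xy - p) ** 2).sum(axis=1)   # squared distance to each station
        m = r2 < R ** 2
        if m.any():
            w = (R ** 2 - r2[m]) / (R ** 2 + r2[m])
            out[g] = (w * obs_val[m]).sum() / w.sum()
    return out

stations = np.array([[0.0, 0.0], [1.0, 0.0]])
depths = np.array([10.0, 30.0])                # snow depth at two stations (cm)
grid = np.array([[0.0, 0.0], [0.5, 0.0]])
sd = cressman(stations, depths, grid, R=2.0)
# the midpoint gets the equal-weight average of the two stations (20.0)
```

The "virtual" observations from corrected AMSR-E simply enter `obs_xy`/`obs_val` alongside the real stations, densifying the input to the same analysis.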
Funding: Supported by the National Natural Science Foundation of China (No. 42477170), the Major Project of the National Natural Science Foundation of China (No. 42090054), the Research Fund Program of the Hubei Key Laboratory of Resources and Eco-Environment Geology (No. HBREGKFJJ-202411), and the Innovative Group Project of the Natural Science Foundation of Hubei Province (No. 2024AFA015).
Abstract: On January 7, 2025, an Ms 6.8 earthquake struck Dingri County, Xigazê City, in the Xizang Autonomous Region. The epicenter, located near the Shenzha-Dingjie fault zone at the boundary between the Qinghai-Xizang Plateau and the Indian Plate, marked the largest earthquake in the region in recent years. The Shenzha-Dingjie fault zone is a key tectonic feature in the India-Eurasia collision process, exhibiting both thrust and strike-slip faulting. This study analyzed the disaster characteristics induced by the earthquake using Differential Synthetic Aperture Radar Interferometry (DInSAR) to process Sentinel-1 satellite data and derive pre- and post-earthquake surface deformation information. Additionally, high-resolution optical remote sensing data, UAV (unmanned aerial vehicle) imagery, and airborne LiDAR (light detection and ranging) data were employed to analyze the spatial distribution of the surface rupture zone, with field investigations validating the findings. Key results include: (1) Field verification confirmed that potential landslide hazard points identified via optical image interpretation did not exhibit secondary landslide activity; (2) DInSAR revealed the co-seismic surface deformation pattern, providing detailed deformation information for the Dingri region; (3) Integration of LiDAR and optical imagery further refined and validated the surface rupture characteristics identified by optical-InSAR analysis, indicating a predominantly north-south rupture zone. Additionally, surface fracture features extending in a near east-west direction were observed on the southeast side of the epicenter, accompanied by some infrastructure damage; (4) Surface fracturing was most severe in high-intensity seismic areas near the epicenter, with the maximum surface displacement approximately 28 km from the epicenter. The earthquake-induced surface deformation zone spanned approximately 6 km by 46 km, with deformation concentrated primarily on the western side of the Dingmucuo Fault, where maximum subsidence of 0.65 m was detected. On the eastern side, uplift was dominant, reaching a maximum of 0.75 m. This earthquake poses significant threats to local communities and infrastructure, underscoring the urgent need for continued monitoring in affected areas. The findings highlight the effectiveness of multi-source data fusion (space-air-ground observation) in seismic disaster assessment, offering a methodological framework for rapid post-earthquake disaster response and providing a valuable scientific foundation for mitigating secondary disasters in the region.
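The DInSAR deformation retrieval above rests on the standard phase-to-displacement relation d = −φ·λ/(4π) along the line of sight. A small sketch (the Sentinel-1 C-band wavelength value and the sign convention are assumptions; conventions vary by processor):

```python
import numpy as np

# Sentinel-1 C-band wavelength, roughly 5.55 cm.
LAM = 0.0555

def phase_to_los(phi, lam=LAM):
    """Convert unwrapped interferometric phase (radians) to line-of-sight
    displacement (m); two-way path gives the factor 4*pi."""
    return -phi * lam / (4 * np.pi)

# one full fringe (2*pi of phase) corresponds to half a wavelength of LOS motion
d = phase_to_los(np.array([2 * np.pi]))
```

Converting fringes to metric displacement this way is what turns an interferogram into the subsidence/uplift values (0.65 m, 0.75 m) reported above.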
Abstract: The Macao satellites differ from their predecessors in their orbits: MSS-1 (Macao Science Satellite-1) is in a low-inclination orbit and the planned MSS-2 will be in a highly elliptical orbit. This paper reviews the fundamental advantages and disadvantages of the different possible magnetic measurements: the component (declination, intensity, etc.) and the location (satellite, ground, etc.). When planning a survey, the choice of component is the "What?" question; the choice of location the "Where?" question. Results from potential theory inform the choice of measurement and data analysis. For example, knowing the vertical component of the magnetic field provides a solution for the full magnetic field everywhere in the potential region. This is the familiar Neumann problem. In reality this ideal dataset is never available. In the past we were restricted to declination data only, then direction only, then total intensity only. There have also been large swathes of Earth's surface with no measurements at all (MSS-1 is restricted to latitudes below). These incomplete datasets throw up new questions for potential theory, questions that have some intriguing answers. When only declination is known, uniqueness is provided by horizontal intensity measurements on a single line joining the dip-poles. When only directions are involved, uniqueness is provided by a single intensity measurement, at least in principle. Paleomagnetic intensities can help. When only total intensity is known, as was largely the case in the early satellite era, uniqueness is provided by a precise location of the magnetic equator. Holes in the data distribution are a familiar problem in geophysical studies. All magnetic measurements sample, to a greater or lesser extent, the potential field everywhere. There is a trade-off between measurements close to the source, good for small targets and high resolution, and the broader sample of a distant measurement. The sampling of a measurement is given by the appropriate Green's function of the Laplacian, which determines both the resolution and the scope of the measurement. For example, radial and horizontal measurements near the Earth's surface give a weighted average of the radial component over a patch of the core surface beneath the measurement site about in radius. The patch is smaller for shallower surfaces, for example from satellite to ground. Holes in the data distribution do not correspond to similar holes at the source surface; the price paid is in resolution of the source. I argue that, in the past, we have been too reluctant to take advantage of incomplete and apparently hopeless datasets.
Funding: Supported by the National Natural Science Foundation of China (42401488, 42071351), the National Key Research and Development Program of China (2020YFA0608501, 2017YFB0504204), the Liaoning Revitalization Talents Program (XLYC1802027), the Talent Recruited Program of the Chinese Academy of Sciences (Y938091), the Project Supported Discipline Innovation Team of Liaoning Technical University (LNTU20TD-23), the Liaoning Province Doctoral Research Initiation Fund Program (2023-BS-202), and the Basic Research Projects of the Liaoning Department of Education (JYTQN2023202).
Abstract: Accurate estimation of understory terrain has significant scientific importance for maintaining ecosystem balance and biodiversity conservation. Addressing the inadequate representation of spatial heterogeneity when traditional forest topographic inversion methods treat the entire forest as the inversion unit, this study proposes a differentiated modeling approach for forest types based on refined land cover classification. Taking Puerto Rico and Maryland as study areas, a multi-dimensional feature system is constructed by integrating multi-source remote sensing data: ICESat-2 spaceborne LiDAR is used to obtain benchmark values for understory terrain, topographic factors such as slope and aspect are extracted from SRTM data, and vegetation cover characteristics are analyzed using Landsat-8 multispectral imagery. The study incorporates forest type as a classification modeling condition and applies the random forest algorithm to build differentiated topographic inversion models. Experimental results indicate that, compared with traditional whole-area modeling (RMSE = 5.06 m), forest type-based classification modeling significantly improves the accuracy of understory terrain estimation (RMSE = 2.94 m), validating the effectiveness of spatial heterogeneity modeling. Further sensitivity analysis reveals that canopy structure parameters (with RMSE variation reaching 4.11 m) exert a stronger regulatory effect on estimation accuracy than forest cover, providing important theoretical support for optimizing remote sensing models of forest topography.
Funding: Supported by the National Science Foundation of China (NSFC) (Grants No. 12293031 and No. 61905252), the National Science Foundation for Distinguished Young Scholars (Grant No. 12022308), and the National Key R&D Program of China (Grants No. 2021YFC2202200 and No. 2021YFC2202204).
Abstract: Adaptive optics (AO) has significantly advanced high-resolution solar observations by mitigating atmospheric turbulence. However, traditional post-focal AO systems suffer from external configurations that introduce excessive optical surfaces, reduced light throughput, and instrumental polarization. To address these limitations, we propose an embedded solar adaptive optics telescope (ESAOT) that intrinsically incorporates the solar AO (SAO) subsystem within the telescope's optical train, featuring a co-designed correction chain with a single Hartmann-Shack full-wavefront sensor (HS f-WFS) and a deformable secondary mirror (DSM). The HS f-WFS uses a temporal-spatial hybrid sampling technique to simultaneously resolve tip-tilt and high-order aberrations, while the DSM performs real-time compensation through adaptive modal optimization. This unified architecture achieves symmetrical polarization suppression and high system throughput by minimizing optical surfaces. A 600 mm ESAOT prototype incorporating a 12×12 micro-lens array HS f-WFS and a 61-actuator piezoelectric DSM has been developed and has successfully conducted on-sky photospheric observations. Validations including turbulence simulations, optical bench testing, and practical observations at the Lijiang Observatory collectively confirm the system's capability to maintain about λ/10 wavefront error during active region tracking. This architectural breakthrough addresses long-standing SAO integration challenges in solar astronomy, with scalability analyses confirming direct applicability to existing and future large solar observation facilities.
Funding: Supported by the National Key R&D Program of China (No. 2023YFB2603602) and the National Natural Science Foundation of China (Nos. 52222810 and 52178383).
Abstract: To elucidate the fracturing mechanism of deep hard rock under complex disturbance environments, this study investigates the dynamic failure behavior of pre-damaged granite subjected to multi-source dynamic disturbances. Blasting vibration monitoring was conducted in a deep-buried drill-and-blast tunnel to characterize in-situ dynamic loading conditions. Subsequently, true triaxial compression tests incorporating multi-source disturbances were performed using a self-developed wide-low-frequency true triaxial system to simulate disturbance accumulation and damage evolution in granite. The results demonstrate that combined dynamic disturbances and unloading damage significantly accelerate strength degradation and trigger shear-slip failure along preferentially oriented blast-induced fractures, with strength reductions of up to 16.7%. Layered failure was observed on the free surface of pre-damaged granite under biaxial loading, indicating a disturbance-induced fracture localization mechanism. Time-stress-fracture-energy coupling fields were constructed to reveal the spatiotemporal characteristics of fracture evolution. Critical precursor frequency bands (105-150, 185-225, and 300-325 kHz) were identified, which serve as diagnostic signatures of impending failure. A dynamic instability mechanism driven by multi-source disturbance superposition and pre-damage evolution was established. Furthermore, a grouting-based wave-absorption control strategy was proposed to mitigate deep dynamic disasters by attenuating disturbance amplitude and reducing excitation frequency.
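Identifying precursor frequency bands like those above amounts to measuring how much signal energy falls inside each band. A minimal FFT-based sketch (the sampling rate and the synthetic tone are assumptions; real acoustic-emission data would replace them):

```python
import numpy as np

def band_energy(signal, fs, bands):
    """Fraction of spectral energy in each (lo, hi) frequency band in Hz."""
    spec = np.abs(np.fft.rfft(signal)) ** 2        # one-sided power spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    total = spec.sum()
    return [spec[(freqs >= lo) & (freqs < hi)].sum() / total for lo, hi in bands]

fs = 1_000_000                                     # hypothetical 1 MHz AE sampling
t = np.arange(4096) / fs
sig = np.sin(2 * np.pi * 120_000 * t)              # tone inside the 105-150 kHz band
e = band_energy(sig, fs, [(105e3, 150e3), (185e3, 225e3), (300e3, 325e3)])
# nearly all energy falls in the first precursor band
```

Tracking these band-energy fractions over time is one way the rise of a diagnostic band could be flagged as a failure precursor.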
Funding: Supported by the National Natural Science Foundation of China (21663032 and 22061041), the Open Sharing Platform for Scientific and Technological Resources of Shaanxi Province (2021PT-004), and the National Innovation and Entrepreneurship Training Program for College Students of China (S202110719044).
Abstract: SiO_(2) inverse opal photonic crystals (PCs) with a three-dimensional macroporous structure were fabricated by the sacrificial template method, followed by infiltration of a pyrene derivative, 1-(pyren-8-yl)but-3-en-1-amine (PEA), to obtain a formaldehyde (FA)-sensitive, fluorescence-enhanced sensing film. Utilizing the specific aza-Cope rearrangement reaction between the allylamine of PEA and FA, which generates a strongly fluorescent product emitting at approximately 480 nm, we chose a PC whose blue stopband edge overlapped with the fluorescence emission wavelength. By virtue of the fluorescence enhancement derived from the slow photon effect of the PC, FA was detected with high selectivity and sensitivity. The limit of detection (LoD) was calculated to be 1.38 nmol/L. Furthermore, fast detection of FA (within 1 min) is realized thanks to the interconnected three-dimensional macroporous structure of the inverse opal PC and its high specific surface area. The prepared sensing film can be used for the detection of FA in air, aquatic products, and living cells. The indoor-air FA content very close to the result from an FA detector, the recovery rate of 101.5% for detecting FA in aquatic products, and fast fluorescence imaging within 2 min for living cells demonstrate the reliability and accuracy of the method in practical applications.
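A limit of detection like the 1.38 nmol/L above is conventionally computed as LoD = 3·σ_blank/slope, with σ_blank the standard deviation of the blank response and the slope taken from the calibration curve. A sketch with made-up blank readings and slope (the paper's raw data are not given):

```python
import numpy as np

def limit_of_detection(blank_signals, slope):
    """LoD = 3 * sigma_blank / slope (sigma: sample std of the blank response,
    slope: calibration-curve sensitivity in signal units per concentration)."""
    return 3 * np.std(blank_signals, ddof=1) / slope

blanks = np.array([100.2, 99.8, 100.1, 99.9, 100.0])   # hypothetical blank fluorescence
slope = 500.0                                          # hypothetical signal per nmol/L
lod = limit_of_detection(blanks, slope)                # concentration in nmol/L
```

Some conventions use 3.3·σ/slope instead of 3; the choice should match the calibration protocol reported in the full paper.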
Funding: Supported by the Natural Science Foundation of China (Nos. 62303126, 62362008), the Major Scientific and Technological Special Project of Guizhou Province ([2024]014), the Guizhou Provincial Science and Technology Projects (No. ZK[2022]General149), the Open Project of the Key Laboratory of Computing Power Network and Information Security, Ministry of Education (Grant 2023ZD037), and the Open Research Project of the State Key Laboratory of Industrial Control Technology, Zhejiang University, China (No. ICT2024B25).
Abstract: Due to the development of cloud computing and machine learning, users can upload their data to the cloud for machine learning model training. However, dishonest clouds may infer user data, resulting in data leakage. Previous schemes have achieved secure outsourced computing, but they suffer from low computational accuracy, difficulty handling heterogeneous data from multiple sources, and high computational cost, which result in a poor user experience and expensive cloud computing. To address these problems, we propose a multi-precision, multi-sourced, and multi-key outsourcing neural network training scheme. First, we design a multi-precision functional encryption computation based on Euclidean division. Second, we design an outsourced model training algorithm based on multi-precision functional encryption with multi-sourced heterogeneity. Finally, we conduct experiments on three datasets. The results indicate that our framework achieves an accuracy improvement of 6% to 30%. Additionally, it offers a memory space optimization of 1.0×2^(24) times compared to the previous best approach.
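The abstract does not detail the Euclidean-division construction, so as a purely generic illustration of the arithmetic building block: repeated Euclidean division (`divmod`) decomposes a scaled value into fixed-base digits, each of which can then be processed at its own precision level. All names and the digit layout here are hypothetical, not the paper's scheme:

```python
def to_digits(value, base, n):
    """Decompose a non-negative integer into n base-`base` digits,
    least significant first, via repeated Euclidean division."""
    digits = []
    for _ in range(n):
        value, r = divmod(value, base)   # Euclidean division step
        digits.append(r)
    return digits

def from_digits(digits, base):
    """Recombine the digit vector into the original integer."""
    return sum(d * base ** i for i, d in enumerate(digits))

x = int(3.14159 * 10 ** 5)               # fixed-point encoding of a float
d = to_digits(x, 1 << 16, 3)             # three 16-bit digits
assert from_digits(d, 1 << 16) == x      # the round trip is exact
```

Encrypting digits rather than full-width values is one plausible way such a decomposition supports multi-precision homomorphic-style computation, but the actual construction should be taken from the full paper.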
Funding: Supported by the Sichuan Science and Technology Program (Nos. 2024JDRC0100 and 2023YFQ0091), the National Natural Science Foundation of China (Nos. U21A20167 and 52475138), and the Scientific Research Foundation of the State Key Laboratory of Rail Transit Vehicle System (No. 2024RVL-T08).
Abstract: Accurate monitoring of track irregularities is very helpful for improving vehicle operation quality and formulating appropriate track maintenance strategies. Existing methods rely on complex signal processing algorithms and lack multi-source data analysis. Driven by multi-source measurement data, including axle box, bogie frame, and carbody accelerations, this paper proposes a track irregularities monitoring network (TIMNet) based on deep learning methods. TIMNet uses the feature extraction capability of convolutional neural networks and the sequence mapping capability of the long short-term memory model to explore the mapping relationship between vehicle accelerations and track irregularities. The particle swarm optimization algorithm is used to optimize the network parameters, so that both vertical and lateral track irregularities can be accurately identified in the time and spatial domains. The effectiveness and superiority of the proposed TIMNet are analyzed under different simulation conditions using a vehicle dynamics model. Field tests prove the availability of TIMNet in quantitatively monitoring vertical and lateral track irregularities. Furthermore, comparative tests show that TIMNet has a better fitting degree and timeliness in monitoring track irregularities (vertical R² of 0.91, lateral R² of 0.84, and a time cost of 10 ms) than other classical regression models. The tests also prove that TIMNet has better anti-interference ability than other regression models.
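The particle swarm optimization used for parameter tuning above can be sketched compactly; this is a generic PSO on a toy quadratic objective, with typical inertia/cognitive/social coefficients (0.7, 1.5, 1.5) that are assumptions, not the paper's settings:

```python
import numpy as np

def pso(f, dim, n=20, iters=100, seed=3):
    """Minimal particle swarm optimizer minimizing f over R^dim."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n, dim))                  # particle positions
    v = np.zeros((n, dim))                            # particle velocities
    pbest, pval = x.copy(), np.array([f(p) for p in x])
    g = pbest[pval.argmin()].copy()                   # global best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = x + v
        fx = np.array([f(p) for p in x])
        better = fx < pval
        pbest[better], pval[better] = x[better], fx[better]
        g = pbest[pval.argmin()].copy()
    return g, float(pval.min())

best, val = pso(lambda p: ((p - 1.0) ** 2).sum(), dim=2)
# converges near the minimum at (1, 1)
```

In the network-tuning setting, `f` would evaluate validation loss for a candidate hyperparameter vector instead of this toy quadratic.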
Funding: Supported by the National Natural Science Foundation of China (12174350) and the Science and Technology Project of State Grid Henan Electric Power Company (5217Q0240008).
Abstract: In the heterogeneous power Internet of Things (IoT) environment, data signals are acquired to support different business systems and enable advanced intelligent applications; these signals are massive, multi-source, and heterogeneous. Reliable perception of information and efficient transmission of energy in multi-source heterogeneous environments are crucial issues. Compressive sensing (CS), an effective method of signal compression and transmission, can accurately recover the original signal from very few samples. In this paper, we study a new method for multi-source heterogeneous data signal reconstruction in the power IoT based on compressive sensing. Rather than using traditional compressive sensing to directly recover multi-source heterogeneous signals, we fully exploit the interference subspace information to design the measurement matrix, which directly and effectively eliminates the interference while taking the measurements. The measurement matrix is further optimized by minimizing its average cross-coherence, improving the reconstruction performance of the new method. Finally, the effectiveness of the new method with different parameter settings under different multi-source heterogeneous data signal cases is verified using orthogonal matching pursuit (OMP) and sparsity adaptive matching pursuit (SAMP), covering practical environments both with and without prior information on signal sparsity.
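The OMP reconstruction algorithm referenced above is standard and compact enough to sketch in numpy; the random dictionary and 2-sparse test signal below are illustrative, not the paper's measurement design:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick the column most correlated
    with the residual, refit by least squares, repeat until k atoms are chosen."""
    support, r = [], y.copy()
    x = np.zeros(A.shape[1])
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ r))))   # best-matching atom
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        r = y - A[:, support] @ coef                      # orthogonal residual
    x[support] = coef
    return x

rng = np.random.default_rng(4)
A = rng.normal(size=(50, 60))
A /= np.linalg.norm(A, axis=0)            # unit-norm dictionary columns
x0 = np.zeros(60)
x0[[5, 17]] = [2.0, -3.0]                 # 2-sparse ground truth
x_hat = omp(A, A @ x0, k=2)               # recover from 50 noiseless measurements
```

SAMP follows the same matching-pursuit skeleton but adapts the sparsity level `k` on the fly, which is what makes it usable without prior sparsity information.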
Abstract: This paper discusses in depth the causes of gear howling noise, the identification and analysis of multi-source excitation, the transmission paths of dynamic noise, simulation and experimental research, case analyses, and optimization effects, aiming to provide guidelines and a reference for relevant researchers.
Abstract: With the accelerating intelligent transformation of energy systems, the monitoring of equipment operation status and the optimization of production processes in thermal power plants face the challenge of multi-source heterogeneous data integration. In view of the heterogeneous characteristics of physical sensor data, including temperature, vibration, and pressure generated by boilers, steam turbines, and other key equipment, together with real-time working condition data from the SCADA system, this paper proposes a multi-source heterogeneous data fusion and analysis platform for thermal power plants based on edge computing and deep learning. By constructing a multi-level fusion architecture, the platform adopts a dynamic weight allocation strategy and a 5D digital twin model to realize the collaborative analysis of physical sensor data, simulation calculation results, and expert knowledge. The data fusion module combines the Kalman filter, wavelet transform, and Bayesian estimation to solve the problems of time-series alignment and dimensional differences between data sources. Simulation results show that the data fusion accuracy can be improved to more than 98%, and the calculation delay can be controlled within 500 ms. The data analysis module integrates a Dymola simulation model and the AERMOD pollutant diffusion model, supporting cascade analysis of boiler combustion efficiency prediction and flue gas emission monitoring, with a system response time of less than 2 s and a data consistency verification accuracy of 99.5%.
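The fusion module above lists the Kalman filter among its methods. As a minimal sketch only (the sensor readings, variances, and "drum temperature" scenario below are made up for illustration, not from the platform), a scalar Kalman filter fusing two channels of unequal noise looks like:

```python
def kalman_fuse(readings, x0=0.0, p0=1e4, q=1e-4):
    """Scalar Kalman filter fusing (value, variance) readings from mixed sensors."""
    x, p = x0, p0
    for z, r in readings:
        p += q                    # predict: constant state plus small process noise
        k = p / (p + r)           # Kalman gain weighs estimate vs. measurement noise
        x += k * (z - x)          # correct toward the new measurement
        p *= (1.0 - k)
    return x, p

# Hypothetical drum-temperature stream interleaving a precise sensor (var 0.25)
# with a noisier SCADA channel (var 1.0); the true value is around 540 degC.
readings = [(540.3, 0.25), (539.2, 1.0), (540.1, 0.25),
            (541.0, 1.0), (539.9, 0.25), (540.4, 1.0)]
estimate, variance = kalman_fuse(readings)
```

The gain automatically weights the precise sensor more heavily, so the fused estimate ends up closer to its readings while the posterior variance shrinks below that of any single channel.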
Funding: Under the auspices of the Key Program of the National Natural Science Foundation of China (No. 42030409).
Abstract: Multi-source data fusion provides the high-precision spatial situational awareness essential for analyzing granular urban social activities. This study used Shanghai's catering industry as a case study, leveraging electronic reviews and consumer data sourced from third-party restaurant platforms in 2021. By performing weighted processing on two-dimensional point-of-interest (POI) data, clustering hotspots of high-dimensional restaurant data were identified. A hierarchical network of restaurant hotspots was constructed following the Central Place Theory (CPT) framework, while the Geo-Informatic Tupu method was employed to resolve the challenges posed by network deformation in multi-scale processes. The findings suggest the necessity of enhancing the spatial balance of Shanghai's urban centers by moderately increasing the number and service capacity of suburban centers at the urban periphery. Such measures would contribute to a more optimized urban structure and facilitate the outward dispersion of comfort-oriented facilities such as the restaurant industry. At a finer spatial scale, the distribution of restaurant hotspots demonstrates a polycentric and symmetric spatial pattern, with a developmental trend radiating outward along the city's ring roads. This trend can be attributed to the efforts of restaurants to establish connections with other urban functional spaces, leading to the reconfiguration of urban spaces, the expansion of restaurant-dedicated land use, and the reorganization of associated commercial activities. The results validate the existence of a polycentric urban structure in Shanghai but also highlight the instability of the restaurant hotspot network during cross-scale transitions.
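The weighted POI processing described above can be sketched, under simplifying assumptions, as grid binning of weighted points followed by ranking cells by total weight. The coordinates, weights, and cell size below are invented for illustration and are not the study's actual data or clustering method:

```python
from collections import Counter

def weighted_hotspots(pois, cell=0.01, top_k=2):
    """Bin weighted POIs (lon, lat, weight) into a grid; rank cells by total weight."""
    grid = Counter()
    for lon, lat, w in pois:
        # integer grid key: points in the same cell accumulate their weights
        grid[(int(lon // cell), int(lat // cell))] += w
    return grid.most_common(top_k)

# Hypothetical restaurants: (lon, lat, weight = review count); values are made up.
pois = [
    (121.47, 31.23, 800), (121.47, 31.23, 650), (121.475, 31.232, 500),  # dense cluster
    (121.60, 31.30, 120), (121.605, 31.302, 90),                          # suburban cluster
    (121.30, 31.10, 40),                                                  # isolated outlet
]
hotspots = weighted_hotspots(pois, cell=0.01)
```

Weighting by review or consumer counts, rather than raw POI counts, is what lets a smaller number of heavily frequented restaurants register as a stronger hotspot than many low-activity ones.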
Funding: Supported by the National Natural Science Foundation of China (Grant No. 41705081), the Shandong Natural Science Foundation Project (Grant No. ZR2019ZD12), and the Laoshan Laboratory (Grant No. LSKJ202202203).
Abstract: This study explored the observation strategy and effectiveness of synoptic-scale adaptive observations for improving sea fog prediction in coastal regions around the Bohai Sea, based on a poorly predicted fog event with a cold-front synoptic pattern (CFSP). An ensemble Kalman filter data assimilation system for the Weather Research and Forecasting model was adopted, together with ensemble sensitivity analysis (ESA). By comparing observation impacts (estimated from a 40-member ensemble with ESA) among different meteorological observation variables and pressure levels, the temperature at 850 hPa and in the surface layer (850-hPa-and-surface temperature) was selected as the target observation type. The area with large observation impacts for this observation type was predicted to lie in the transition region of the surface low-high system. This area developed southward with the low and moved eastward with the low-high system, which could be explained by the main features of the CFSP. Moreover, experiments assimilating both synthetic and real observations showed that assimilating 850-hPa-and-surface temperature observations generally yielded better fog coverage forecasts in areas with greater observation impacts than in areas with smaller impacts. However, the effectiveness of adaptive observations was reduced when real rather than synthetic observations were assimilated, possibly due to factors such as observation and model errors. These conclusions were verified with another typical fog event exhibiting CFSP characteristics. The results highlight the importance of improved initial conditions in the transition region of the low-high system for improving fog prediction and provide scientific guidance for implementing an observation network for fog forecasting over the Bohai Sea.
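At its core, ensemble sensitivity analysis regresses a scalar forecast metric J onto each initial-state variable across ensemble members, estimating dJ/dx_i as cov(J, x_i)/var(x_i). The five synthetic "members" and the linear response below are invented purely to illustrate the formula, not taken from the study's 40-member ensemble:

```python
def ensemble_sensitivity(J, X):
    """Ensemble sensitivity dJ/dx_i ~ cov(J, x_i) / var(x_i) across members."""
    n = len(J)
    jm = sum(J) / n
    sens = []
    for i in range(len(X[0])):
        xi = [member[i] for member in X]
        xm = sum(xi) / n
        cov = sum((a - jm) * (b - xm) for a, b in zip(J, xi)) / (n - 1)
        var = sum((b - xm) ** 2 for b in xi) / (n - 1)
        sens.append(cov / var)
    return sens

# Five synthetic members; columns play the role of 850-hPa and surface temperature
# perturbations at one grid point (hypothetical values).
X = [[1.0, 1.0], [-1.0, 1.0], [1.0, -1.0], [-1.0, -1.0], [0.0, 0.0]]
J = [2.0 * a - 0.5 * b for a, b in X]   # forecast metric responds linearly here
sens = ensemble_sensitivity(J, X)
```

Mapping such sensitivities over the domain is what identifies regions, like the low-high transition zone above, where targeted observations should have the largest forecast impact.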
Funding: Sponsored by the Beijing Youth Innovation Talent Support Program for Urban Greening and Landscaping: The 2024 Special Project for Promoting High-Quality Development of Beijing's Landscaping through Scientific and Technological Innovation (KJCXQT202410).
Abstract: Taking the Ming Tombs Forest Farm in Beijing as the research object, this study applied multi-source data fusion and GIS heat-map overlay analysis techniques. It systematically collected bird observation point data from the Global Biodiversity Information Facility (GBIF), population distribution data from the Oak Ridge National Laboratory (ORNL) in the United States, and information on the composition of tree species in forest areas suitable for birds, together with the forest geographic information of the Ming Tombs Forest Farm, based on literature research and field investigations. Using GIS technology, spatial processing was carried out on bird observation points and population distribution data to identify suitable bird-watching areas in different seasons. These areas were then classified into grades (from unsuitable to highly suitable) according to their suitability values. The findings indicated significant spatial heterogeneity in the bird-watching suitability of the Ming Tombs Forest Farm. The north side of the reservoir was generally a core area with high suitability in all seasons. The mature broad-leaved mixed forests in the deeper areas supported the overlapping coexistence of the ecological niches of various bird species, such as Zosterops simplex and Urocissa erythrorhyncha. In contrast, the coniferous pure forests and mixed forests at the shallow forest edge were more suitable for specialized species such as Carduelis sinica. The southern urban area and the core area of the mausoleums had relatively low suitability due to ecological fragmentation or human disturbance. Based on these results, this paper proposed a three-level protection framework of "core area conservation, buffer zone management, and isolation zone construction" and a spatio-temporally coordinated human-bird coexistence strategy. It was also suggested that the human-bird coexistence space could be optimized through measures such as constructing sound and light buffer interfaces, restoring ecological corridors, and integrating cultural heritage elements. This research provides an operational technical approach and decision-making support for the scientific planning of bird-watching sites and the coordination of ecological protection and tourism development.
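A GIS heat-map overlay with graded output, as described above, can be sketched as a weighted combination of normalized layers followed by thresholding. The layer names, weights, thresholds, and the two example cells below are hypothetical illustrations, not the study's actual model:

```python
def suitability(bird_density, habitat_score, disturbance, w=(0.5, 0.3, 0.2)):
    """Weighted overlay of normalized layers (each in [0, 1]) into a graded score.
    Disturbance counts against suitability, so it enters as (1 - disturbance)."""
    s = w[0] * bird_density + w[1] * habitat_score + w[2] * (1.0 - disturbance)
    for grade, threshold in [("highly suitable", 0.75), ("suitable", 0.5),
                             ("marginal", 0.25)]:
        if s >= threshold:
            return grade, s
    return "unsuitable", s

# Hypothetical cells: a north-of-reservoir mixed forest vs. a disturbed urban edge.
north = suitability(bird_density=0.9, habitat_score=0.8, disturbance=0.1)
south = suitability(bird_density=0.2, habitat_score=0.3, disturbance=0.9)
```

Running the same scoring per season, with season-specific observation layers, would yield the kind of season-dependent suitability maps the study reports.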
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 42475100 and 42405091), the CMA Key Innovation Team (Grant No. CMA2022ZD10), the CMA Weather Modification Centre Innovation Team (Grant No. WMC2023IT02), and the National Key R&D Program of China (Grant No. 2019YFC1510305).
Abstract: The ice-phase microphysical characteristics of a stratiform cloud system over the Qilian Mountains in northwestern China on 15 September 2022 were analyzed using aircraft data. The stratiform cloud system developed under southwesterly flow at 500 hPa and was affected locally by topography. Synoptic features and aircraft observations revealed strengthened cloud development on the leeward slope. The ice particle habits and microphysical processes at heights of 6-8 km were investigated. The cloud system was characterized by extremely low supercooled liquid water content at temperatures between −4°C and −17°C. The ice particle concentrations ranged predominantly from 10 to 30 L^(−1), corresponding to ice water contents ranging from 0.01 to 0.05 g m^(−3). Active ice aggregation was observed at temperatures colder than −10°C. The windward side of the cloud system exhibited weaker development and two distinct cloud layers. Intense orographic uplift on the leeward slope enhanced ice particle aggregation. The clouds on the lee side presented lower ice particle concentrations but larger sizes than those on the windward side. The influence of aggregation on the ice particle size distribution was reflected in two main aspects: one was the bimodal spectra at −16°C, with the first peak at 125 μm and a subpeak at 400-500 μm; the other was the broadened size spectra at −13°C due to significant aggregation of dendrites.