Tunnel deformation monitoring is a crucial task for evaluating tunnel stability during the metro operation period. As an innovative technique, Terrestrial Laser Scanning (TLS) can collect high-density, high-accuracy point cloud data in a few minutes, which offers promising applications in tunnel deformation monitoring. Here, an efficient method for extracting tunnel cross-sections and performing convergence analysis using dense TLS point cloud data is proposed. First, the tunnel orientation is determined using principal component analysis (PCA) in the Euclidean plane. Two control points are introduced to detect and remove unsuitable points through point cloud division, and the ground points are then removed by defining an elevation band of 0.5 m. Next, a z-score method is introduced to detect and remove outliers. Because the standard shape of a tunnel cross-section is round, circle fitting is implemented using the least-squares method. Afterward, convergence analysis is performed at angles of 0°, 30° and 150°. The feasibility of the proposed approach is tested on a TLS point cloud of a Nanjing subway tunnel acquired with a FARO X330 laser scanner. The results indicate that the proposed methodology achieves an overall accuracy of 1.34 mm, in agreement with measurements acquired by a total station instrument. The proposed methodology provides new insights and references for the application of TLS to tunnel deformation monitoring and can also be extended to other engineering applications.
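The outlier-rejection and circle-fitting steps of such a pipeline can be sketched as below. This is a minimal illustration rather than the authors' implementation; the radial z-score criterion and the Kåsa algebraic circle fit are assumptions on my part, since the abstract does not specify the exact formulations:

```python
import numpy as np

def zscore_filter(points, k=3.0):
    """Remove points whose radial distance from the centroid has |z-score| > k."""
    c = points.mean(axis=0)
    r = np.linalg.norm(points - c, axis=1)
    z = (r - r.mean()) / r.std()
    return points[np.abs(z) <= k]

def fit_circle(points):
    """Algebraic (Kasa) least-squares circle fit.

    Uses x^2 + y^2 = 2*a*x + 2*b*y + c with c = R^2 - a^2 - b^2,
    which turns circle fitting into a linear least-squares problem.
    """
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
    rhs = x ** 2 + y ** 2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return a, b, np.sqrt(c + a ** 2 + b ** 2)
```

For tunnel data the points would first be projected onto the cross-section plane given by the PCA-derived tunnel axis; that step is omitted here.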
Large-scale point cloud datasets form the basis for training various deep learning networks and achieving high-quality network processing tasks. To satisfy the diversity and robustness requirements of the data, data augmentation (DA) methods are utilised to expand dataset diversity and scale. However, because LiDAR point cloud data from different platforms (such as missile-borne and vehicular LiDAR) have complex and distinct characteristics, directly applying traditional 2D visual-domain DA methods to 3D data can leave the trained networks unable to perform the corresponding tasks robustly. To address this issue, the present study explores DA for missile-borne LiDAR point clouds using a Monte Carlo (MC) simulation method that closely resembles practical application. First, a model of the multi-sensor imaging system is established, taking into account the joint errors arising from the platform itself and from the relative motion during the imaging process. A distortion simulation method based on MC simulation for augmenting missile-borne LiDAR point cloud data is then proposed, underpinned by an analysis of the combined errors between sensors of different modalities, achieving high-quality augmentation of point cloud data. The effectiveness of the proposed method in addressing imaging system errors and distortion simulation is validated using the imaging scene dataset constructed in this paper. Comparative experiments against current state-of-the-art algorithms on point cloud detection and single-object tracking tasks demonstrate that the proposed method improves the performance of networks trained on unaugmented datasets by over 17.3% and 17.9% respectively, surpassing the SOTA performance of current point cloud DA algorithms.
Three-dimensional variable cross-section roll forming is a new metal forming technology that combines large forming force, multi-axis linkage movement and space synergic movement; the sequential synergic movement of the ganged roller group is used to complete the metal sheet forming according to the shape data of complicated, variable forming parts. The control system should meet the demand of quick response to the test requirements of the product part. A new real-time data-driven control strategy for the multi-axis linkage and synergic movement of 3D roll forming is put forward in this paper. In the new control strategy, the forming data are automatically generated according to the shape of the parts, the multi-axis linkage movement together with the cooperative motion among the six stands of the 3D roll forming machine is driven by real-time information, and the control nodes are also driven by the forming data. The new control strategy is applied to a 48-axis 3D roll forming machine developed by our research center, with a control servo period of less than 10 ms. A forming experiment on a variable cross-section part was carried out, and the forming precision achieved with the control strategy is better than ±0.5 mm. The experimental result proves that the control strategy has significant potential for the development of large-scale, multi-axis ganged and synergic 3D roll forming production lines.
In this review, we highlight some recent methodological and theoretical developments in the estimation and testing of large panel data models with cross-sectional dependence. The paper begins with a discussion of the issues raised by cross-sectional dependence and introduces the concepts of weak and strong cross-sectional dependence. Attention is then primarily paid to spatial and factor approaches for modeling cross-sectional dependence in both linear and nonlinear (nonparametric and semiparametric) panel data models. Finally, we conclude with some speculations on future research directions.
Airborne LiDAR (Light Detection and Ranging) is an evolving high-tech active remote sensing technology that can acquire large-area topographic data and quickly generate DEM (Digital Elevation Model) products. Combined with image data, this technology can further enrich and extract spatial geographic information. In practice, however, owing to the limited operating range of airborne LiDAR and the large area of a typical task, it is necessary to register and stitch the point clouds of adjacent flight strips; by eliminating gross errors, the systematic errors in the data can be effectively reduced. This paper therefore investigates point cloud registration methods for urban building areas, aiming to improve the accuracy and processing efficiency of airborne LiDAR data. An improved post-ICP (Iterative Closest Point) point cloud registration method is proposed to achieve accurate registration and efficient stitching of point clouds, providing potential technical support for applications in related fields.
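The classical point-to-point ICP that the proposed post-ICP method builds on can be sketched as follows. The abstract does not detail the improvement, so only the standard ICP core (brute-force nearest neighbours plus a Kabsch rigid-transform step) is shown, and all function names are illustrative:

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Kabsch: rigid (R, t) minimising ||R @ src_i + t - dst_i|| for matched pairs."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

def icp(src, dst, iters=20):
    """Minimal point-to-point ICP with brute-force nearest neighbours."""
    cur = src.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(iters):
        # nearest neighbour in dst for every point of cur
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(axis=2)
        matched = dst[d2.argmin(axis=1)]
        R, t = best_rigid_transform(cur, matched)
        cur = cur @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```

Real airborne strips would use a k-d tree for the neighbour search and robust rejection of bad correspondences; both are omitted for brevity.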
We present new data on the ^(63)Cu(γ,n) cross-section studied using a quasi-monochromatic, energy-tunable γ beam produced at the Shanghai Laser Electron Gamma Source, with the aim of resolving the long-standing discrepancy between existing measurements and evaluations of this cross-section. Using an unfolding iteration method, ^(63)Cu(γ,n) data were obtained with an uncertainty of less than 4%, and the inconsistencies between the available experimental data are discussed. The γ-ray strength function of ^(63)Cu was successfully extracted as an experimental constraint. We further calculated the cross-section of the radiative neutron capture reaction ^(62)Cu(n,γ) using the TALYS code. Our calculation method enables the extraction of (n,γ) cross-sections for unstable nuclides.
To transmit customer power data collected by smart meters (SMs) to utility companies, the data must first be transmitted to the data aggregation point (DAP) corresponding to each SM. The number of DAPs installed and their installation locations greatly impact the whole network. In traditional DAP placement algorithms, the number of DAPs must be set in advance, yet determining the best number of DAPs is difficult, which undoubtedly reduces the overall performance of the network. Moreover, an excessive gap between the loads of different DAPs is another important factor affecting network quality. To address these problems, this paper proposes a DAP placement algorithm, APSSA, based on an improved affinity propagation (AP) algorithm and the sparrow search algorithm (SSA), which can select an appropriate number of DAPs and the corresponding installation locations according to the number of SMs and their distribution in different environments. The algorithm adds an allocation mechanism to the SSA to optimize the subnetworks. APSSA is evaluated in three different areas and compared with other DAP placement algorithms. The experimental results validate that the proposed method can reduce the network cost, shorten the average transmission distance, and reduce the load gap.
In order to improve the precision of super point detection and control the consumption of measurement resources, this paper proposes a super point detection method based on sampling and data streaming algorithms (SDSD), and proves that only sources or destinations with many flows can be sampled probabilistically by the SDSD algorithm. The SDSD algorithm uses both an IP table and a flow Bloom filter (BF) data structure to maintain the IP and flow information. The IP table is used to judge whether an IP address has been recorded: if the IP exists, all its subsequent flows are recorded into the flow BF; otherwise, the IP flow is sampled. This paper also analyzes the accuracy and memory requirements of the SDSD algorithm and tests them using the CERNET trace. The theoretical analysis and experimental tests demonstrate that most of the relative errors of the super points estimated by the SDSD algorithm are less than 5%, whereas those of other algorithms are about 10%. Because of the BF structure, the SDSD algorithm is also better than previous algorithms in terms of memory consumption.
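A toy version of the IP-table/flow-BF bookkeeping described above might look like this. The filter size, hash construction, and sampling probability p are placeholders chosen for illustration, not the parameters analysed in the paper:

```python
import hashlib
import random

class BloomFilter:
    """Tiny Bloom filter: k hash positions per item over an m-bit array."""
    def __init__(self, m=1 << 16, k=3):
        self.m, self.k = m, k
        self.bits = bytearray(m // 8)
    def _hashes(self, item):
        for i in range(self.k):
            yield int(hashlib.md5(f"{i}:{item}".encode()).hexdigest(), 16) % self.m
    def add(self, item):
        for h in self._hashes(item):
            self.bits[h // 8] |= 1 << (h % 8)
    def __contains__(self, item):
        return all(self.bits[h // 8] & (1 << (h % 8)) for h in self._hashes(item))

def process_flows(flows, p=0.1, seed=0):
    """SDSD-style sketch: sample the first flow of an unseen IP with probability p;
    record every later flow of a sampled IP in the flow Bloom filter."""
    rng = random.Random(seed)
    ip_table = {}           # sampled source IP -> distinct-flow count
    flow_bf = BloomFilter()
    for src, dst in flows:
        if src in ip_table:
            if (src, dst) not in flow_bf:    # a new flow for a tracked IP
                flow_bf.add((src, dst))
                ip_table[src] += 1
        elif rng.random() < p:
            ip_table[src] = 1
            flow_bf.add((src, dst))
    return ip_table
```

Because a Bloom filter can return false positives, flow counts may be slightly underestimated; quantifying such effects is part of what an accuracy analysis of this family of algorithms must do.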
A new object-oriented method has been developed for the extraction of Mars rocks from Mars rover data. It is based on a combination of Mars rover imagery and 3D point cloud data. First, Navcam or Pancam images taken by the Mars rovers are segmented into homogeneous objects with a mean-shift algorithm. Then, the objects in the segmented images are classified into small rock candidates, rock shadows, and large objects. Rock shadows and large objects are considered as the regions within which large rocks may exist. In these regions, large rock candidates are extracted through ground-plane fitting with the 3D point cloud data. Small and large rock candidates are combined and postprocessed to obtain the final rock extraction results. The shape properties of the rocks (angularity, circularity, width, height, and width-height ratio) have been calculated for subsequent geological studies.
OBJECTIVE: To identify the acupoint combinations used in the treatment of Alzheimer's disease (AD). METHODS: The clinical literature regarding acupuncture and moxibustion for AD was searched and collected from databases including Chinese Biomedical Medicine, China National Knowledge Infrastructure, Wanfang Database and PubMed. A database of acupuncture and moxibustion prescriptions for AD was established using Excel software to conduct descriptive and association analyses of the data. RESULTS: Baihui (GV 20), Sishencong (EX-HN 1), Shenmen (HT 7), Zusanli (ST 36), Neiguan (PC 6), Fengchi (GB 20), Taixi (KI 3), Dazhui (GV 14), Shenshu (BL 23), Sanyinjiao (SP 6), Shenting (GV 24), Fenglong (ST 40), Xuanzhong (GB 39), Shuigou (GV 26) and Taichong (LR 3) were the acupoints of highest frequency in the treatment of AD with acupuncture and moxibustion. Most acupoints were selected from the Governor Vessel. The commonly used acupoints were located on the head, face, neck and lower limbs. The combination of local acupoints with distal ones predominated. The crossing points among the specific points presented an advantage in the treatment. The association analysis indicated that the correlation of Fengchi (GB 20)-Baihui (GV 20) was the strongest, followed by the combinations Dazhui (GV 14)-Baihui (GV 20), Shenshu (BL 23)-Baihui (GV 20) and Neiguan (PC 6)-Baihui (GV 20), revealing common rules of clinical acupoint selection and combination for AD. CONCLUSION: Our findings provide a reference for acupoint selection and combination for AD in clinical acupuncture practice.
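The association analysis can be illustrated with a small support/confidence computation over acupoint prescriptions. This is a generic pair-counting sketch, not the authors' Excel-based workflow; the min_support threshold and the example prescriptions in the test are invented for illustration:

```python
from itertools import combinations
from collections import Counter

def pair_association(prescriptions, min_support=2):
    """Return confidence of a -> b for every acupoint pair whose
    co-occurrence count (support) reaches min_support."""
    pair_count = Counter()
    item_count = Counter()
    for rx in prescriptions:
        items = sorted(set(rx))          # deduplicate, canonical pair order
        item_count.update(items)
        pair_count.update(combinations(items, 2))
    rules = {}
    for (a, b), n in pair_count.items():
        if n >= min_support:
            rules[(a, b)] = n / item_count[a]   # confidence of a -> b
            rules[(b, a)] = n / item_count[b]   # confidence of b -> a
    return rules
```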
In a round-oval-round pass rolling sequence, the cross-section profile of the outgoing workpiece was predicted first, after obtaining the maximum spread. The concept of the "critical point on the contact boundary" was proposed and the coordinates of the critical point were solved. The equivalent contact section area was represented and the mean roll radius was determined. The validity of this model was examined by an alloy bar rolling experiment and a rigid-plastic FEM simulation. Compared with the existing models, the mean roll radius obtained by this model agrees well with the experimental data.
Data loss or distortion adversely affects the accuracy and stability of thunderstorm point charge localization. To solve this problem, we propose a data complementary method based on an array group of atmospheric electric field apparatuses. The electric field component measurement model of the atmospheric electric field apparatus is established, and the orientation parameters of the thunderstorm point charge are defined. Based on the mirror method, the thunderstorm point charge coordinates are obtained using the potential distribution formulas. To test the validity of the basic algorithm, the electric field component measurement error and the localization accuracy are studied. Besides the azimuth angle and the elevation angle, the localization parameters also include the distance from the apparatus to the thunderstorm cloud. Starting from a primary electric field apparatus, we establish the array group of apparatuses; the data measured by each apparatus are then complementarily processed to recover the thunderstorm point charge position. The results show that, compared with radar map data, this method accurately reflects the location of the thunderstorm point charge and achieves a better localization effect. Additionally, several observation results during thunderstorm weather are presented.
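For a point charge above a perfectly conducting ground plane, the mirror (image-charge) method gives a closed-form vertical field at ground level, and the charge position can in principle be recovered from several stations by minimising the misfit. The sketch below assumes a known charge magnitude and uses a brute-force grid search purely for illustration; the paper's actual inversion is not reproduced:

```python
import numpy as np

K = 8.9875e9  # Coulomb constant, N*m^2/C^2

def ez_at_station(charge_pos, q, station_xy):
    """Vertical E-field at ground-level stations from a point charge at (x0, y0, h)
    above a perfectly conducting plane: the image charge doubles the vertical term."""
    x0, y0, h = charge_pos
    dx = station_xy[:, 0] - x0
    dy = station_xy[:, 1] - y0
    s = np.sqrt(dx ** 2 + dy ** 2 + h ** 2)
    return -2.0 * K * q * h / s ** 3

def locate_charge(stations, measured, q, xs, ys, hs):
    """Brute-force grid search for the (x, y, h) minimising the squared misfit."""
    best, best_err = None, np.inf
    for x in xs:
        for y in ys:
            for h in hs:
                pred = ez_at_station((x, y, h), q, stations)
                err = np.sum((pred - measured) ** 2)
                if err < best_err:
                    best, best_err = (x, y, h), err
    return best
```

A real system would also estimate q and use a continuous optimiser, and the complementary processing across the array group would weight or discard stations with lost/distorted data.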
For the accurate extraction of the cavity decay time, a selection of data points is added to the weighted least-squares method. We derive the expected precision, accuracy and computation cost of this improved method, and examine these performances by simulation. Comparing this method with the nonlinear least-squares fitting (NLSF) method and the linear regression of the sum (LRS) method in both derivations and simulations, we find that it can achieve the same or even better precision, comparable accuracy, and lower computation cost. We test the method on experimental decay signals; the results agree with those obtained from the NLSF method.
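The weighted least-squares core of such decay-time extraction can be sketched as below: for y(t) = A·exp(-t/τ), regressing ln y on t with weights w = y² linearises the fit while compensating for how the log transform amplifies noise at small y. The paper's additional data-point selection step is not reproduced, and the weighting choice is an assumption:

```python
import numpy as np

def decay_time_wls(t, y):
    """Estimate tau from y(t) = A*exp(-t/tau) by weighted linear least
    squares on ln(y), with weights w = y^2."""
    w = y ** 2
    X = np.column_stack([np.ones_like(t), t])   # columns: [ln A, slope = -1/tau]
    ln_y = np.log(y)
    WX = X * w[:, None]
    # Solve the weighted normal equations (X^T W X) beta = X^T W ln(y)
    beta = np.linalg.solve(X.T @ WX, WX.T @ ln_y)
    return -1.0 / beta[1]
```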
Understanding the mechanisms and risks of forest fires by building a spatial prediction model is an important means of controlling forest fires. Non-fire point data are important training data for constructing such a model, and their quality significantly impacts its prediction performance. However, non-fire point data obtained using existing sampling methods generally suffer from low representativeness. Therefore, this study proposes a non-fire point data sampling method based on geographical similarity to improve the quality of non-fire point samples. The method is based on the idea that the less similar the geographical environment between a sample point and a historical fire point, the greater the confidence in its being a non-fire point sample. Yunnan Province, China, with a high frequency of forest fires, was used as the study area. We compared the prediction performance of traditional sampling methods and the proposed method using three commonly used forest fire risk prediction models: logistic regression (LR), support vector machine (SVM), and random forest (RF). The results show that the modeling and prediction accuracies of the forest fire prediction models established with the proposed sampling method are significantly improved compared with those based on traditional sampling. Specifically, in 2010 the modeling and prediction accuracies improved by 19.1% and 32.8%, respectively, and in 2020 they improved by 13.1% and 24.3%, respectively. We therefore believe that collecting non-fire point samples based on the principle of geographical similarity is an effective way to improve the quality of forest fire samples and thus enhance the prediction of forest fire risk.
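The geographical-similarity idea can be sketched as follows: scale the environmental covariates, score each candidate by its similarity to its most similar historical fire point, and keep the least similar candidates as non-fire samples. The similarity measure used here (one minus the mean absolute difference of min-max scaled covariates) is my assumption; the study's own similarity definition may differ:

```python
import numpy as np

def sample_non_fire(candidates, fire_points, n_samples):
    """Pick the n_samples candidates least similar to any historical fire point.

    candidates, fire_points: arrays of environmental covariates, one row per point.
    """
    allp = np.vstack([candidates, fire_points])
    lo, hi = allp.min(axis=0), allp.max(axis=0)
    scale = np.where(hi > lo, hi - lo, 1.0)       # min-max scale each covariate
    c = (candidates - lo) / scale
    f = (fire_points - lo) / scale
    # similarity of each candidate to every fire point, then to the closest one
    sim = 1.0 - np.abs(c[:, None, :] - f[None, :, :]).mean(axis=2)
    max_sim = sim.max(axis=1)
    order = np.argsort(max_sim)                   # least similar first
    return candidates[order[:n_samples]]
```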
For the first time, this article introduces a LiDAR Point Cloud Dataset of Ships, composed of both collected and simulated data, to address the scarcity of LiDAR data in maritime applications. The collected data are acquired using specialized maritime LiDAR sensors in both inland waterways and wide-open ocean environments. The simulated data are generated by placing a ship in the LiDAR coordinate system and scanning it with a redeveloped Blensor that emulates the operation of a LiDAR sensor equipped with various laser beams. Furthermore, we also render point clouds for foggy and rainy weather conditions. To describe a realistic shipping environment, a dynamic tail wave is modeled by iterating the wave elevation of each point in a time series. Finally, networks designed for small objects are migrated to ship applications by feeding them our dataset. The positive effect of the simulated data is demonstrated in object detection experiments, and the negative impact of tail waves as noise is verified in single-object tracking experiments. The dataset is available at https://github.com/zqy411470859/ship_dataset.
This paper gives a definition of the permanent optimal data point of the Least Absolute Deviation (LAD) problem. Some theoretical results on the non-degenerate LAD problem are obtained. For computing the LAD problem, an efficient algorithm is given based on the idea of the permanent optimal data point. Numerical experience shows that our algorithm outperforms many others, including the well-known BR algorithm.
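The permanent-optimal-data-point algorithm itself is not specified in the abstract, but the LAD problem it solves can be illustrated with a simple iteratively reweighted least-squares (IRLS) solver; this is a stand-in for, not an implementation of, either the proposed algorithm or the BR algorithm:

```python
import numpy as np

def lad_fit(X, y, iters=100, eps=1e-8):
    """LAD regression (minimise sum |y - X beta|) via IRLS: repeatedly solve a
    weighted least-squares problem with weights 1/|residual|."""
    X = np.column_stack([np.ones(len(y)), X])        # prepend intercept column
    beta = np.linalg.lstsq(X, y, rcond=None)[0]      # ordinary LS start
    for _ in range(iters):
        r = np.abs(y - X @ beta)
        w = 1.0 / np.maximum(r, eps)                 # eps floor avoids division by 0
        WX = X * w[:, None]
        beta_new = np.linalg.solve(X.T @ WX, WX.T @ y)
        if np.allclose(beta_new, beta, atol=1e-10):
            break
        beta = beta_new
    return beta                                      # [intercept, slopes...]
```

Unlike least squares, the LAD fit is robust to a single gross outlier, which the test below exploits.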
A four-dimensional variational data assimilation (4DVar) scheme based on a dimension-reduced projection (DRP-4DVar) has been developed as a hybrid of the 4DVar and Ensemble Kalman Filter (EnKF) concepts. Its good flow-dependent features are demonstrated in single-point experiments through comparisons with adjoint-based 4DVar and three-dimensional variational (3DVar) data assimilation using the fifth-generation Pennsylvania State University-National Center for Atmospheric Research Mesoscale Model (MM5). The results reveal that DRP-4DVar can reasonably generate a background error covariance matrix (the B-matrix) during the assimilation window from an initial estimation, using a number of initial-condition-dependent historical forecast samples. In contrast, flow-dependence in the B-matrix of MM5 4DVar is barely detectable; it is argued that the use of a diagonal estimation of the B-matrix at the initial time in the MM5 4DVar method leads to this failure. The experiments also show that the increments produced by DRP-4DVar are anisotropic and no longer symmetric with respect to the observation location, owing to the effects of the weather trends captured in its B-matrix. This differs from MM5 3DVar, which does not consider the influence of heterogeneous forcing on the correlation structure of the B-matrix, a condition that is realistic for many situations; thus, the MM5 3DVar assimilation can only present an isotropic and homogeneous structure in its increments.
The experimental random error and the desired values of non-observed points in dynamic indexes were estimated by establishing linear regression equations for the variation patterns of the dynamic indexes. Methods for significance testing of differences among treatments using dynamic points as indexes were presented, without requiring replication at each observed dynamic point.
Semantic segmentation of 3D point clouds in the railway environment holds significant economic value, but its development is severely hindered by the lack of suitable, specific datasets. Additionally, models trained on existing urban road point cloud datasets generalise poorly to railway data owing to the large domain gap caused by non-overlapping special or rare categories, for example, rail track, track bed, etc. To harness the potential of supervised learning methods in the domain of 3D railway semantic segmentation, we introduce RailPC, a new point cloud benchmark. RailPC provides a large-scale dataset with rich annotations for semantic segmentation in the railway environment. Notably, RailPC contains twice the number of annotated points of the largest available mobile laser scanning (MLS) point cloud dataset and is the first railway-specific 3D dataset for semantic segmentation. It covers nearly 25 km of railway in total across two different scenes (urban and mountain), with 3 billion points finely labelled into the 16 most typical railway classes; the data acquisition was completed in China by MLS systems. Through extensive experimentation, we evaluate the performance of advanced scene understanding methods on the annotated dataset and present a synthetic analysis of the semantic segmentation results. Based on our findings, we identify some critical challenges for railway-scale point cloud semantic segmentation. The dataset is available at https://github.com/NNU-GISA/GISA-RailPC, and we will continuously update it based on community feedback.
This paper investigated non-stationary trend change-point patterns. The results show the existence of a trend, its magnitude, and change points in the 24-hour annual maximum series (AMS), extracted from monthly maximum series (MMS) data, of thirty years (1986-2015) of rainfall data for the Uyo metropolis. Trend analysis was performed using the Mann-Kendall (MK) test, with Sen's slope estimator (SSE) used to obtain the trend magnitude, while the trend change-point analysis was conducted using the distribution-free cumulative sum test (CUSUM) and the sequential Mann-Kendall test (SQMK). A free CUSUM plot gave 2002 as the change-point year of the rainfall trend at the 90% confidence interval, from which the increasing trend started; the trend became more pronounced in 2011, another change-point year identified from the SQMK plot. The SSE gave average rates of change in rainfall of 2.1288 and 2.16 mm/year for the AMS and MMS time series data respectively. The condition for applying the non-stationarity concept to intensity-duration-frequency modeling is therefore met.
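The MK statistic and Sen's slope used above are straightforward to compute. A minimal sketch follows (the S statistic only; a full MK test would add the variance and significance step, and the CUSUM/SQMK change-point procedures are not reproduced):

```python
import numpy as np

def mann_kendall_s(x):
    """Mann-Kendall S statistic: sum of sign(x[j] - x[i]) over all pairs i < j.
    Positive S indicates an increasing trend, negative S a decreasing one."""
    x = np.asarray(x, dtype=float)
    s = 0.0
    for i in range(len(x) - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()
    return s

def sens_slope(x, t=None):
    """Sen's slope estimator: median of all pairwise slopes (x[j]-x[i])/(t[j]-t[i])."""
    x = np.asarray(x, dtype=float)
    t = np.arange(len(x), dtype=float) if t is None else np.asarray(t, dtype=float)
    slopes = [(x[j] - x[i]) / (t[j] - t[i])
              for i in range(len(x)) for j in range(i + 1, len(x))]
    return float(np.median(slopes))
```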
Funding: National Natural Science Foundation of China (No. 41801379); Fundamental Research Funds for the Central Universities (No. 2019B08414); National Key R&D Program of China (No. 2016YFC0401801).
Funding: Postgraduate Innovation Top-notch Talent Training Project of Hunan Province (Grant No. CX20220045); Scientific Research Project of National University of Defense Technology (Grant No. 22-ZZCX-07); New Era Education Quality Project of Anhui Province (Grant No. 2023cxcysj194); National Natural Science Foundation of China (Grant Nos. 62201597, 62205372, 1210456); Foundation of Hefei Comprehensive National Science Center (Grant No. KY23C502).
Funding: Supported by the National Key Technology R&D Program (No. 2011BAG03B03).
Funding: Supported by the National Natural Science Foundation of China (71131008 (Key Project) and 71271179).
Abstract: In this review, we highlight some recent methodological and theoretical developments in the estimation and testing of large panel data models with cross-sectional dependence. The paper begins with a discussion of issues of cross-sectional dependence and introduces the concepts of weak and strong cross-sectional dependence. Attention is then paid primarily to spatial and factor approaches for modeling cross-sectional dependence in both linear and nonlinear (nonparametric and semiparametric) panel data models. Finally, we conclude with some speculations on future research directions.
Funding: Guangxi Key Laboratory of Spatial Information and Geomatics (21-238-21-12); Guangxi Young and Middle-aged Teachers' Research Fundamental Ability Enhancement Project (2023KY1196).
Abstract: Airborne LiDAR (Light Detection and Ranging) is an evolving active remote sensing technology that can acquire large-area topographic data and quickly generate DEM (Digital Elevation Model) products. Combined with image data, it can further enrich extracted spatial geographic information. In practice, however, because the operating range of airborne LiDAR is limited while task areas are large, the point clouds of adjacent flight strips must be registered and stitched; gross errors must be eliminated and the systematic errors in the data effectively reduced. This paper therefore studies point cloud registration methods for urban building areas, aiming to improve the accuracy and processing efficiency of airborne LiDAR data. An improved post-ICP (Iterative Closest Point) registration method is proposed to achieve accurate registration and efficient stitching of point clouds, providing potential technical support for practitioners in related fields.
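The abstract's improvements to ICP are not detailed, but the baseline algorithm it builds on can be shown compactly: alternate nearest-neighbour correspondence search with a closed-form rigid-transform fit (Kabsch/SVD). This is a generic textbook ICP sketch, not the paper's post-ICP variant; the brute-force neighbour search is O(N²) and only suitable for small demo clouds.

```python
import numpy as np

def icp(src, dst, iters=20):
    """Minimal point-to-point ICP: returns (R, t) such that
    src @ R.T + t approximately aligns with dst."""
    R, t = np.eye(3), np.zeros(3)
    cur = src.copy()
    for _ in range(iters):
        # brute-force nearest neighbour in dst for every point of cur
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        match = dst[d2.argmin(axis=1)]
        # closed-form rigid transform for the current correspondences (Kabsch)
        mu_s, mu_d = cur.mean(0), match.mean(0)
        H = (cur - mu_s).T @ (match - mu_d)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        Ri = Vt.T @ D @ U.T
        ti = mu_d - Ri @ mu_s
        cur = cur @ Ri.T + ti
        R, t = Ri @ R, Ri @ t + ti   # accumulate the overall transform
    return R, t
```

Strip-to-strip registration of airborne LiDAR replaces the brute-force search with a spatial index (e.g. a k-d tree) and adds outlier rejection, which is where method-specific improvements such as the paper's come in.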
Funding: Supported by the National Key Research and Development Program (Nos. 2023YFA1606901 and 2022YFA1602400), the National Natural Science Foundation of China (Nos. U2230133, 12275338, and 12388102), and the Open Fund of the CIAE Key Laboratory of Nuclear Data (No. JCKY2022201C152).
Abstract: We present new data on the ^(63)Cu(γ,n) cross-section studied using a quasi-monochromatic, energy-tunable γ beam produced at the Shanghai Laser Electron Gamma Source, to resolve the long-standing discrepancy between existing measurements and evaluations of this cross-section. Using an unfolding iteration method, ^(63)Cu(γ,n) data were obtained with an uncertainty of less than 4%, and the inconsistencies among the available experimental data are discussed. The γ-ray strength function of ^(63)Cu(γ,n) was successfully extracted as an experimental constraint. We further calculated the cross-section of the radiative neutron capture reaction ^(62)Cu(n,γ) using the TALYS code. Our calculation method enables the extraction of (n,γ) cross-sections for unstable nuclides.
Funding: Supported by the Fujian University of Technology under Grants GYZ20016, GY-Z18183, and GY-Z19005; partially supported by the National Science and Technology Council under Grant NSTC 113-2221-E-224-056-.
Abstract: To transmit the customer power data collected by smart meters (SMs) to utility companies, the data must first be transmitted to the data aggregation point (DAP) corresponding to each SM. The number of DAPs installed and their installation locations greatly impact the whole network. Traditional DAP placement algorithms require the number of DAPs to be set in advance, yet determining the best number is difficult, which reduces overall network performance. Moreover, an excessive gap between the loads of different DAPs also degrades network quality. To address these problems, this paper proposes APSSA, a DAP placement algorithm based on an improved affinity propagation (AP) algorithm and the sparrow search algorithm (SSA), which selects an appropriate number of DAPs and the corresponding installation locations according to the number of SMs and their distribution in different environments. The algorithm adds an allocation mechanism to the SSA to optimize the subnetworks. APSSA is evaluated in three different areas and compared with other DAP placement algorithms. The experimental results validate that the proposed method reduces network cost, shortens the average transmission distance, and narrows the load gap.
Funding: The National Basic Research Program of China (973 Program) (No. 2009CB320505), the Natural Science Foundation of Jiangsu Province (No. BK2008288), the Excellent Young Teachers Program of Southeast University (No. 4009001018), and the Open Research Program of the Key Laboratory of Computer Network of Guangdong Province (No. CCNL200706).
Abstract: In order to improve the precision of super point detection and control the consumption of measurement resources, this paper proposes a super point detection method based on sampling and data streaming algorithms (SDSD), and proves that only sources or destinations with many flows can be sampled probabilistically by the SDSD algorithm. The SDSD algorithm uses both an IP table and a flow Bloom filter (BF) data structure to maintain IP and flow information. The IP table is used to judge whether an IP address has been recorded: if the IP exists, all its subsequent flows are recorded in the flow BF; otherwise, the IP flow is sampled. The paper also analyzes the accuracy and memory requirements of the SDSD algorithm and tests them on a CERNET trace. The theoretical analysis and experimental tests demonstrate that the relative errors of the super points estimated by the SDSD algorithm are mostly less than 5%, whereas those of other algorithms are about 10%. Because of the BF structure, the SDSD algorithm also outperforms previous algorithms in memory consumption.
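The IP-table-plus-flow-BF bookkeeping described above can be sketched in a few lines. This is a simplified illustration of the data structures only: the probabilistic sampling stage of SDSD is omitted, and the filter sizes are arbitrary demo values.

```python
import hashlib

class BloomFilter:
    """Small Bloom filter: k hash positions in an m-bit array."""
    def __init__(self, m=8192, k=4):
        self.bits, self.m, self.k = bytearray(m // 8), m, k

    def _positions(self, item):
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def __contains__(self, item):
        return all(self.bits[p // 8] & (1 << (p % 8))
                   for p in self._positions(item))

def count_flows(packets):
    """Sketch of the SDSD bookkeeping: the IP table records seen sources,
    and the flow BF deduplicates flows, so each distinct flow increments
    its source's counter only once."""
    ip_table, flow_bf, counts = set(), BloomFilter(), {}
    for src, flow in packets:
        ip_table.add(src)
        key = f"{src}|{flow}"
        if key not in flow_bf:      # first time this flow is seen
            flow_bf.add(key)
            counts[src] = counts.get(src, 0) + 1
    return counts
```

A source whose counter exceeds a threshold would be reported as a super point; the BF keeps the per-flow memory cost constant at the price of a small false-positive rate.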
基金supported by the National Natural Science Foundation of China(Nos.41171355and41002120)
Abstract: A new object-oriented method has been developed for the extraction of Mars rocks from Mars rover data. It is based on a combination of Mars rover imagery and 3D point cloud data. First, Navcam or Pancam images taken by the Mars rovers are segmented into homogeneous objects with a mean-shift algorithm. The objects in the segmented images are then classified into small rock candidates, rock shadows, and large objects. Rock shadows and large objects are treated as regions within which large rocks may exist; in these regions, large rock candidates are extracted through ground-plane fitting with the 3D point cloud data. Small and large rock candidates are combined and post-processed to obtain the final rock extraction results. The shape properties of the rocks (angularity, circularity, width, height, and width-height ratio) have been calculated for subsequent geological studies.
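The ground-plane-fitting step can be illustrated with a minimal least-squares sketch: fit a plane z = ax + by + c to candidate ground points, then keep points standing sufficiently above it as large-rock candidates. The function names and the 0.10 m threshold are illustrative assumptions, not the paper's values.

```python
import numpy as np

def fit_ground_plane(pts):
    """Least-squares plane z = a*x + b*y + c through (N, 3) ground points."""
    A = np.c_[pts[:, 0], pts[:, 1], np.ones(len(pts))]
    coef, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    return coef  # (a, b, c)

def rock_candidates(pts, plane, height_thresh=0.10):
    """Keep points higher than height_thresh above the fitted plane."""
    a, b, c = plane
    residual = pts[:, 2] - (a * pts[:, 0] + b * pts[:, 1] + c)
    return pts[residual > height_thresh]
```

In practice a robust fit (e.g. RANSAC) would replace plain least squares so that the rocks themselves do not bias the plane.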
Funding: Supported by the National Natural Science Foundation of China (Nos. 81373741 and 81473786), the Chinese Medicine and Integrated Medicine Research Projects (2017, No. 20), and the Health and Family Planning Commission of Hubei Province (No. 24).
Abstract: OBJECTIVE: To identify the acupoint combinations used in the treatment of Alzheimer's disease (AD). METHODS: The clinical literature on acupuncture and moxibustion for AD was collected from databases including Chinese Biomedical Medicine, China National Knowledge Infrastructure, Wanfang Database and PubMed. A database of acupuncture and moxibustion prescriptions for AD was built in Excel for descriptive and association analysis of the data. RESULTS: Baihui (GV 20), Sishencong (EX-HN 1), Shenmen (HT 7), Zusanli (ST 36), Neiguan (PC 6), Fengchi (GB 20), Taixi (KI 3), Dazhui (GV 14), Shenshu (BL 23), Sanyinjiao (SP 6), Shenting (GV 24), Fenglong (ST 40), Xuanzhong (GB 39), Shuigou (GV 26) and Taichong (LR 3) were used most frequently in the treatment of AD with acupuncture and moxibustion. Most acupoints were selected from the Governor Vessel. The commonly used acupoints were located on the head, face, neck and lower limbs, and the combination of local acupoints with distal ones predominated. The crossing points among the specific points showed an advantage in the treatment. The association analysis indicated that the correlation between Fengchi (GB 20) and Baihui (GV 20) was the strongest, followed by the combinations Dazhui (GV 14)-Baihui (GV 20), Shenshu (BL 23)-Baihui (GV 20) and Neiguan (PC 6)-Baihui (GV 20), revealing common rules of clinical acupoint selection and combination for AD. CONCLUSION: Our findings provide a reference for acupoint selection and combination for AD in clinical acupuncture practice.
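The frequency and association analysis above amounts to counting, across prescriptions, how often each acupoint appears and how often each pair co-occurs (pair support). A hypothetical minimal sketch, with made-up prescription data, not the study's dataset:

```python
from collections import Counter
from itertools import combinations

def analyse(prescriptions):
    """Count single-acupoint frequency and pairwise co-occurrence
    (support), the basis of a simple association analysis."""
    freq, pair = Counter(), Counter()
    for p in prescriptions:
        items = sorted(set(p))          # ignore duplicates within one prescription
        freq.update(items)
        pair.update(combinations(items, 2))
    return freq, pair
```

Dividing a pair's count by a single point's count then gives the confidence of the corresponding association rule.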
Abstract: In a round-oval-round pass rolling sequence, the cross-section profile of the outgoing workpiece was predicted after first obtaining the maximum spread. The concept of a "critical point on the contact boundary" was proposed and the coordinates of the critical point were solved. The equivalent contact section area was represented and the mean roll radius determined. The validity of this model was examined by an alloy bar rolling experiment and a rigid-plastic FEM simulation. Compared with existing models, the mean roll radius obtained by this model is consistent with the experimental data.
Funding: This work is supported by the National Key Research and Development Program of China (Grant No. 2021YFE0105500), the National Natural Science Foundation of China (Grant No. 61671248), the Key Research and Development Plan of Jiangsu Province, China (Grant No. BE2018719), the Postgraduate Research and Practice Innovation Program of Jiangsu Province (Grant No. SJCX19_0309), and the Advantage Discipline of Information and Communication Engineering of Jiangsu Province, China.
Abstract: Data loss or distortion adversely affects the accuracy and stability of thunderstorm point charge localization. To solve this problem, we propose a data complementation method based on an array group of atmospheric electric field apparatuses. The electric field component measurement model of the atmospheric electric field apparatus is established, and the orientation parameters of the thunderstorm point charge are defined. Based on the mirror (image charge) method, the thunderstorm point charge coordinates are obtained using the potential distribution formulas. To test the validity of the basic algorithm, the electric field component measurement error and the localization accuracy are studied. Besides the azimuth and elevation angles, the localization parameters include the distance from the apparatus to the thunderstorm cloud. Starting from a primary electric field apparatus, we establish an array group of apparatuses, and the data measured by each apparatus are complementarily processed to recover the thunderstorm point charge position. The results show that, compared with radar map data, this method accurately reflects the location of the thunderstorm point charge and achieves a better localization effect. Several observation results during thunderstorm weather are also presented.
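The mirror-method forward model is standard electrostatics: a point charge q at height h above a conducting ground plane, together with its image charge -q below the plane, doubles the vertical field component at a ground-level sensor. The sketch below shows that forward model plus a brute-force grid inversion; the function names, sensor layout and grid are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

K = 8.9875e9  # Coulomb constant, N*m^2/C^2

def ground_field(q, charge_pos, sensor_xy):
    """Vertical E-field at a ground-level sensor from a point charge at
    (x0, y0, h) above a conducting plane; the image charge doubles the
    vertical component: E = 2*K*q*h / (dx^2 + dy^2 + h^2)^(3/2)."""
    x0, y0, h = charge_pos
    dx, dy = sensor_xy[0] - x0, sensor_xy[1] - y0
    r2 = dx * dx + dy * dy + h * h
    return 2.0 * K * q * h / r2 ** 1.5

def locate(sensors, readings, grid):
    """Brute-force inversion sketch: pick the (q, position) candidate
    whose predicted fields best match the array's readings."""
    best, best_err = None, np.inf
    for q, pos in grid:
        pred = np.array([ground_field(q, pos, s) for s in sensors])
        err = np.sum((pred - readings) ** 2)
        if err < best_err:
            best, best_err = (q, pos), err
    return best
```

With several apparatuses, redundant readings like these are what allow a lost or distorted channel to be complemented by the rest of the array.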
基金supported by the Preeminent Youth Fund of Sichuan Province,China(Grant No.2012JQ0012)the National Natural Science Foundation of China(Grant Nos.11173008,10974202,and 60978049)the National Key Scientific and Research Equipment Development Project of China(Grant No.ZDYZ2013-2)
Abstract: For the accurate extraction of the cavity decay time, a selection of data points is added to the weighted least squares method. We derive the expected precision, accuracy and computation cost of this improved method, and examine these performances by simulation. By comparing this method with the nonlinear least squares fitting (NLSF) method and the linear regression of the sum (LRS) method in derivations and simulations, we find that it can achieve the same or even better precision, comparable accuracy, and lower computation cost. We test the method on experimental decay signals; the results agree with those obtained from the NLSF method.
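A common form of this estimator, shown here as a generic sketch rather than the paper's exact formulation: for y ≈ A·exp(-t/τ), fit ln(y) linearly in t with weights y², which compensates for the noise amplification of the log transform, and keep only points above a noise floor (the data selection step). The floor value is an illustrative assumption.

```python
import numpy as np

def decay_time_wls(t, y, floor=0.05):
    """Weighted least-squares estimate of the decay time tau from
    y ~ A*exp(-t/tau): weighted linear fit of ln(y) against t."""
    keep = y > floor * y.max()          # data selection: drop noise-dominated tail
    t, y = t[keep], y[keep]
    w = y ** 2                          # weights compensating the log transform
    ly = np.log(y)
    W = np.sum(w)
    tb, lb = np.sum(w * t) / W, np.sum(w * ly) / W   # weighted means
    slope = np.sum(w * (t - tb) * (ly - lb)) / np.sum(w * (t - tb) ** 2)
    return -1.0 / slope                 # ln(y) = ln(A) - t/tau
```

Because the fit is a closed-form linear regression, its computation cost is far below an iterative nonlinear fit, which is the trade-off the abstract quantifies.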
基金financially supported by the National Natural Science Fundation of China(Grant Nos.42161065 and 41461038)。
Abstract: Understanding the mechanisms and risks of forest fires by building a spatial prediction model is an important means of controlling them. Non-fire point data are important training data for constructing such a model, and their quality significantly impacts its prediction performance. However, non-fire point data obtained using existing sampling methods generally suffer from low representativeness. This study therefore proposes a non-fire point sampling method based on geographical similarity to improve the quality of non-fire samples. The method rests on the idea that the less similar the geographical environment between a sample point and an observed fire point, the greater the confidence that it is a non-fire sample. Yunnan Province, China, which has a high frequency of forest fires, was used as the study area. We compared the prediction performance of the traditional sampling method and the proposed method using three commonly used forest fire risk prediction models: logistic regression (LR), support vector machine (SVM), and random forest (RF). The results show that the modeling and prediction accuracies of forest fire prediction models established with the proposed sampling method are significantly better than those based on the traditional method: in 2010, the modeling and prediction accuracies improved by 19.1% and 32.8%, respectively, and in 2020 by 13.1% and 24.3%. We therefore believe that collecting non-fire samples on the principle of geographical similarity is an effective way to improve sample quality and thus enhance forest fire risk prediction.
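The sampling principle can be sketched generically: standardise the environmental attributes, score each candidate by its similarity to the nearest fire point in attribute space, and keep the least similar candidates as high-confidence negatives. The similarity function below (an inverse-distance form) is an illustrative assumption, not the paper's exact measure.

```python
import numpy as np

def sample_non_fire(candidates, fire_env, n):
    """Similarity-based negative sampling sketch.

    candidates : (M, d) environmental attributes of candidate points
    fire_env   : (F, d) attributes of observed fire points
    Returns indices of the n candidates least similar to any fire point.
    """
    env = np.vstack([candidates, fire_env])
    mu, sd = env.mean(0), env.std(0) + 1e-12       # standardise attributes
    c = (candidates - mu) / sd
    f = (fire_env - mu) / sd
    d = np.sqrt(((c[:, None, :] - f[None, :, :]) ** 2).sum(-1))
    similarity = 1.0 / (1.0 + d.min(axis=1))       # close in attribute space => similar
    return np.argsort(similarity)[:n]              # least similar first
```

Candidates that resemble past fire environments are thus excluded from the negative class, which is exactly the representativeness problem the abstract attributes to random sampling.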
基金supported by the National Natural Science Foundation of China (62173103)the Fundamental Research Funds for the Central Universities of China (3072022JC0402,3072022JC0403)。
Abstract: For the first time, this article introduces a LiDAR Point Clouds Dataset of Ships composed of both collected and simulated data, to address the scarcity of LiDAR data in maritime applications. The collected data are acquired using specialized maritime LiDAR sensors in both inland waterways and wide-open ocean environments. The simulated data are generated by placing a ship in the LiDAR coordinate system and scanning it with a redeveloped Blensor that emulates the operation of a LiDAR sensor equipped with various laser beams. Furthermore, we also render point clouds for foggy and rainy weather conditions. To describe a realistic shipping environment, a dynamic tail wave is modeled by iterating the wave elevation of each point in a time series. Finally, networks designed for small objects are migrated to ship applications by feeding them our dataset. The positive effect of the simulated data is shown in object detection experiments, and the negative impact of tail waves as noise is verified in single-object tracking experiments. The dataset is available at https://github.com/zqy411470859/ship_dataset.
Funding: The project was supported by the Natural Science Foundation of Jiangsu Province.
Abstract: This paper gives a definition of the permanent optimal data point of the Least Absolute Deviation (LAD) problem. Some theoretical results on the non-degenerate LAD problem are obtained. For computing the LAD problem, an efficient algorithm is given based on the idea of the permanent optimal data point. Numerical experience shows that our algorithm is better than many others, including the famous BR algorithm.
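For context, the LAD problem itself — minimise the sum of absolute residuals — can be solved by a generic iteratively reweighted least squares (IRLS) scheme. This is a standard solver sketch, not the paper's permanent-optimal-data-point algorithm; the iteration count and residual floor are demo choices.

```python
import numpy as np

def lad_fit(X, y, iters=200, eps=1e-8):
    """LAD (L1) regression y ~ b0 + b1*x via IRLS: repeatedly solve a
    weighted least-squares problem with weights 1/|residual|."""
    A = np.c_[np.ones(len(X)), X]
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)   # OLS starting point
    for _ in range(iters):
        r = np.abs(y - A @ beta)
        w = 1.0 / np.maximum(r, eps)               # floor avoids division by zero
        Aw = A * w[:, None]
        beta = np.linalg.solve(A.T @ Aw, Aw.T @ y) # weighted normal equations
    return beta
```

Unlike ordinary least squares, the L1 criterion is barely moved by a single gross outlier, which is why LAD fitting is worth a specialised fast algorithm such as the paper's.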
Funding: We acknowledge the Ministry of Science and Technology of China (Grant No. 2006BAC03B01) and the Ministry of Science and Technology of China for funding the 973 project (Grant No. 2005CB321703).
Abstract: A four-dimensional variational data assimilation (4DVar) scheme based on a dimension-reduced projection (DRP-4DVar) has been developed as a hybrid of the 4DVar and Ensemble Kalman Filter (EnKF) concepts. Its good flow-dependent features are demonstrated in single-point experiments through comparisons with adjoint-based 4DVar and three-dimensional variational (3DVar) data assimilation using the fifth-generation Pennsylvania State University-National Center for Atmospheric Research Mesoscale Model (MM5). The results reveal that DRP-4DVar can reasonably generate a background error covariance matrix (B-matrix) during the assimilation window from an initial estimate, using a number of initial-condition-dependent historical forecast samples. In contrast, flow dependence in the B-matrix of MM5 4DVar is barely detectable; it is argued that the diagonal estimation of the B-matrix at the initial time in the MM5 4DVar method leads to this failure. The experiments also show that the increments produced by DRP-4DVar are anisotropic and no longer symmetric with respect to observation location, owing to the effects of the weather trends captured in its B-matrix. This differs from MM5 3DVar, which does not consider the influence of heterogeneous forcing on the correlation structure of the B-matrix, a condition that is realistic in many situations. Thus, the MM5 3DVar assimilation can only present an isotropic and homogeneous structure in its increments.
Abstract: The experimental random error and the expected values at non-observed points of dynamic indexes were estimated by establishing linear regression equations describing the variation patterns of the dynamic indexes. Methods are presented for significance tests of differences among treatments that use dynamic points as indexes, without setting replications at each observed dynamic point.
Funding: Key Laboratory of Degraded and Unused Land Consolidation Engineering, Ministry of Natural Resources of China (Grant No. SXDJ2024-22); Technology Innovation Centre for Integrated Applications in Remote Sensing and Navigation, Ministry of Natural Resources of China (Grant No. TICIARSN-2023-06); National Natural Science Foundation of China (Grant Nos. 42171446 and 62302246); Zhejiang Provincial Natural Science Foundation of China (Grant No. LQ23F010008); Science and Technology Program of Tianjin, China (Grant No. 23ZGSSSS00010).
Abstract: Semantic segmentation of 3D point clouds in the railway environment holds significant economic value, but its development is severely hindered by the lack of suitable, specific datasets. Moreover, models trained on existing urban road point cloud datasets generalise poorly to railway data because of a large domain gap caused by non-overlapping special or rare categories, for example rail track and track bed. To harness the potential of supervised learning methods for 3D railway semantic segmentation, we introduce RailPC, a new point cloud benchmark. RailPC provides a large-scale dataset with rich annotations for semantic segmentation in the railway environment. Notably, RailPC contains twice the number of annotated points of the largest available mobile laser scanning (MLS) point cloud dataset and is the first railway-specific 3D dataset for semantic segmentation. It covers nearly 25 km of railway in two different scenes (urban and mountain), with 3 billion points finely labelled into the 16 most typical railway-related classes; data acquisition was completed in China using MLS systems. Through extensive experimentation, we evaluate the performance of advanced scene understanding methods on the annotated dataset and present a synthetic analysis of the semantic segmentation results. Based on our findings, we identify some critical challenges for railway-scale point cloud semantic segmentation. The dataset is available at https://github.com/NNU-GISA/GISA-RailPC, and we will continuously update it based on community feedback.
Abstract: This paper investigates non-stationary trend change-point patterns. The results show the existence of a trend, its magnitude, and change points in the 24-hour annual maximum series (AMS) extracted from monthly maximum series (MMS) rainfall data over thirty years (1986-2015) for the Uyo metropolis. Trend analysis was performed using the Mann-Kendall (MK) test, with Sen's slope estimator (SSE) used to obtain the trend magnitude, while change-point analysis was conducted using the distribution-free cumulative sum (CUSUM) test and the sequential Mann-Kendall (SQMK) test. The distribution-free CUSUM plot gave 2002 as the change-point year of the rainfall trend at the 90% confidence level, from which the increasing trend started; the trend became more pronounced in 2011, another change-point year identified from the SQMK plot, with the trend intensifying thereafter. The SSE gave average rates of change in rainfall of 2.1288 and 2.16 mm/year for the AMS and MMS time series data, respectively. Accordingly, the condition for applying the non-stationary concept in intensity-duration-frequency modeling is met.
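The two core statistics used above are simple to state: the Mann-Kendall S statistic is the sum of signs of all pairwise differences in the series (positive S indicating an increasing trend), and Sen's slope is the median of all pairwise slopes. A minimal sketch of both (without the variance normalisation and significance test that complete the MK procedure):

```python
import numpy as np

def mann_kendall_S(x):
    """Mann-Kendall S statistic: sum of sign(x[j] - x[i]) over all j > i."""
    x = np.asarray(x, float)
    s = 0.0
    for i in range(len(x) - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()
    return s

def sens_slope(x):
    """Sen's slope estimator: median of pairwise slopes (x[j]-x[i])/(j-i)."""
    x = np.asarray(x, float)
    slopes = [(x[j] - x[i]) / (j - i)
              for i in range(len(x) - 1) for j in range(i + 1, len(x))]
    return float(np.median(slopes))
```

Applied to an AMS of annual rainfall maxima, S and its normalised Z-score give the trend's existence and direction, while Sen's slope gives the mm/year magnitude reported in the abstract.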