Journal Articles
1,163 articles found
Support vector machine regression (SVR)-based nonlinear modeling of radiometric transforming relation for the coarse-resolution data-referenced relative radiometric normalization (RRN) (Cited by: 3)
1
Authors: Jing Geng, Wenxia Gan, Jinying Xu, Ruqin Yang, Shuliang Wang. Geo-Spatial Information Science (SCIE, CSCD), 2020, No. 3, pp. 237-247, I0004 (12 pages)
Radiometric normalization, as an essential step for multi-source and multi-temporal data processing, has received critical attention. The Relative Radiometric Normalization (RRN) method has been primarily used for eliminating radiometric inconsistency. The radiometric transforming relation between the subject image and the reference image is an essential aspect of RRN. Aiming at accurate modeling of this relation, the learning-based nonlinear regression method Support Vector Machine Regression (SVR) is used to fit the complicated radiometric transforming relation for the coarse-resolution data-referenced RRN. To evaluate the effectiveness of the proposed method, a series of experiments is performed, including two synthetic-data experiments and one real-data experiment, and the proposed method is compared with other methods that use linear regression, Artificial Neural Networks (ANN), or Random Forests (RF) for radiometric transforming relation modeling. The results show that the proposed method fits the radiometric transforming relation well and can enhance RRN performance.
Keywords: Support Vector Machine Regression (SVR), nonlinear radiometric transforming relation, Relative Radiometric Normalization (RRN), multi-source data
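The abstract above describes fitting the subject-to-reference radiometric transforming relation with SVR. A minimal, hypothetical sketch using scikit-learn's `SVR` on synthetic digital numbers — the nonlinear relation, kernel, and hyperparameters below are illustrative assumptions, not the paper's:

```python
# Sketch: fit a nonlinear subject-to-reference radiometric relation with SVR.
# Synthetic data; kernel/C/epsilon are illustrative, not from the paper.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
subject = rng.uniform(0, 255, size=500)                    # subject-image DNs
reference = 0.8 * subject + 10 + 5 * np.sin(subject / 40)  # nonlinear relation

model = SVR(kernel="rbf", C=100.0, epsilon=1.0)
model.fit(subject.reshape(-1, 1), reference)

normalized = model.predict(subject.reshape(-1, 1))         # normalized DNs
rmse = float(np.sqrt(np.mean((normalized - reference) ** 2)))
print(round(rmse, 2))
```

A linear-regression baseline would miss the sinusoidal component entirely, which is the motivation the abstract gives for a learning-based nonlinear regressor.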
Optical-Elevation Data Co-Registration and Classification-Based Height Normalization for Building Detection in Stereo VHR Images (Cited by: 1)
2
Authors: Alaeldin Suliman, Yun Zhang. Advances in Remote Sensing, 2017, No. 2, pp. 103-119 (17 pages)
Building detection in very high resolution (VHR) images is crucial for mapping and analysing urban environments. Since buildings are elevated objects, elevation data need to be integrated with images for reliable detection. This process requires two critical steps: optical-elevation data co-registration and aboveground elevation calculation, both of which remain challenging to some extent. Therefore, this paper introduces optical-elevation data co-registration and normalization techniques for generating a dataset that facilitates elevation-based building detection. For accurate co-registration, a dense set of stereo-based elevations is generated and co-registered to the relevant image based on the corresponding image locations. To normalize these co-registered elevations, bare-earth elevations are detected based on classification information of terrain-level features after the image co-registration is achieved. The developed method was executed and validated: an overall detection quality of 80% was achieved, with 94% correct detection. Together, the developed techniques successfully facilitate the incorporation of stereo-based elevations for detecting buildings in VHR remote sensing images.
Keywords: building detection, very high resolution images, optical-elevation data, co-registration, classification-based height normalization
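The height-normalization idea above — subtract a bare-earth surface estimated from classified terrain-level pixels, then flag elevated pixels — can be sketched in a few lines. The grid, mask threshold, and building-height cutoff are illustrative assumptions, not values from the paper:

```python
# Illustrative sketch (not the paper's code): normalize co-registered
# elevations by a bare-earth estimate from terrain-level pixels,
# then threshold to flag elevated (building) pixels.
import numpy as np

dsm = np.array([[100.0, 101.0, 108.0],
                [100.5, 109.0, 109.5],
                [101.0, 101.5, 102.0]])   # co-registered surface elevations (m)
terrain_mask = dsm < 105.0                # classification says: terrain-level

bare_earth = dsm[terrain_mask].mean()     # crude bare-earth elevation estimate
ndsm = dsm - bare_earth                   # normalized (aboveground) heights
buildings = ndsm > 3.0                    # elevated-object threshold (m)
print(int(buildings.sum()))
```

A production pipeline would interpolate a spatially varying bare-earth surface rather than use a single mean, but the normalization step is the same subtraction.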
Evaluation of Two Absolute Radiometric Normalization Algorithms for Pre-processing of Landsat Imagery (Cited by: 13)
3
Author: Xu Hanqiu (徐涵秋). Journal of China University of Geosciences (SCIE, CSCD), 2006, No. 2, pp. 146-150, 157 (6 pages)
In order to evaluate radiometric normalization techniques, two image normalization algorithms for absolute radiometric correction of Landsat imagery were quantitatively compared: the Illumination Correction Model proposed by Markham and Irish, and the Illumination and Atmospheric Correction Model developed by the Remote Sensing and GIS Laboratory of Utah State University. Relative noise, the correlation coefficient, and the slope value were used as evaluation criteria, derived from pseudo-invariant features identified in multitemporal Landsat image pairs of the Xiamen (厦门) and Fuzhou (福州) areas, both located in eastern Fujian (福建) Province, China. Compared with the unnormalized images, the radiometric differences between the normalized multitemporal images were significantly reduced when the images came from different seasons; there was no significant difference between the normalized and unnormalized images under similar seasonal conditions. Furthermore, the correction results of the two algorithms are similar when the images are relatively clear with uniform atmospheric conditions. Therefore, radiometric normalization should be carried out when the multitemporal images have a significant seasonal difference.
Keywords: Landsat, radiometric correction, data normalization, pseudo-invariant features, image processing
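Two of the evaluation criteria named above — the correlation coefficient and the regression slope over pseudo-invariant feature (PIF) pixels — are straightforward to compute. The pixel values below are made up for illustration:

```python
# Sketch of the comparison criteria: correlation coefficient and
# regression slope over pseudo-invariant feature (PIF) pixels
# from two image dates. Values are illustrative.
import numpy as np

pif_date1 = np.array([50.0, 80.0, 120.0, 160.0, 200.0])
pif_date2 = np.array([55.0, 83.0, 125.0, 158.0, 204.0])

r = np.corrcoef(pif_date1, pif_date2)[0, 1]             # correlation coefficient
slope, intercept = np.polyfit(pif_date1, pif_date2, 1)  # regression slope
print(round(r, 3), round(slope, 3))
```

For well-normalized multitemporal images, PIF pixels should give r close to 1 and a slope close to 1 with a small intercept.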
A new edge recognition technology based on the normalized vertical derivative of the total horizontal derivative for potential field data (Cited by: 103)
4
Authors: Wang Wanyin, Pan Yu, Qiu Zhiyun. Applied Geophysics (SCIE, CSCD), 2009, No. 3, pp. 226-233, 299 (9 pages)
Edge detection and enhancement techniques are commonly used to recognize the edges of geologic bodies in potential field data. We present a new edge recognition technology based on the normalized vertical derivative of the total horizontal derivative, which serves both edge detection and edge enhancement. First, we calculate the total horizontal derivative (THDR) of the potential-field data and then compute the n-order vertical derivative (VDRn) of the THDR. For the n-order vertical derivative, the peak value of the total horizontal derivative (PTHDR) is obtained using a threshold value greater than 0; this PTHDR can be used for edge detection. Second, the PTHDR value is divided by the total horizontal derivative and normalized by the maximum value. Finally, we use different kinds of numerical models to verify the effectiveness and reliability of the new edge recognition technology.
Keywords: potential field data, edge recognition, edge enhancement, total horizontal derivative, normalized vertical derivative
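The first step named above — the total horizontal derivative of a gridded potential field, normalized by its maximum — can be sketched with finite differences. The synthetic anomaly and grid are illustrative; the paper's method additionally takes vertical derivatives of the THDR, which is omitted here:

```python
# Sketch of the first step only: total horizontal derivative (THDR)
# of a gridded potential field, normalized to [0, 1]. Synthetic grid.
import numpy as np

x = np.linspace(-1, 1, 64)
y = np.linspace(-1, 1, 64)
X, Y = np.meshgrid(x, y)
field = np.exp(-(X**2 + Y**2) / 0.1)   # synthetic anomaly

gy, gx = np.gradient(field, y, x)      # horizontal derivatives
thdr = np.sqrt(gx**2 + gy**2)          # total horizontal derivative
thdr_norm = thdr / thdr.max()          # normalized to [0, 1]
print(float(thdr_norm.max()))
```

The THDR peaks over the steepest gradients of the anomaly, which is why its maxima trace the edges of geologic bodies.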
Similarity measurement method of high-dimensional data based on normalized net lattice subspace (Cited by: 4)
5
Authors: Li Wenfa (李文法), Wang Gongming, Li Ke, Huang Su. High Technology Letters (EI, CAS), 2017, No. 2, pp. 179-184 (6 pages)
The performance of conventional similarity measurement methods is seriously affected by the curse of dimensionality in high-dimensional data: the differences contributed by sparse and noisy dimensions occupy a large proportion of the similarity, so that almost any two points appear dissimilar. A similarity measurement method for high-dimensional data based on a normalized net lattice subspace is proposed. The data range of each dimension is divided into several intervals, and the components in different dimensions are mapped onto the corresponding intervals; only components in the same or an adjacent interval are used to calculate the similarity. To validate this method, three data types are used and seven common similarity measurement methods are compared. The experimental results indicate that the relative difference of the method increases with dimensionality and is approximately two to three orders of magnitude higher than that of the conventional methods. In addition, the similarity range of this method is [0, 1] in any dimensionality, which makes it suitable for similarity analysis after dimensionality reduction.
Keywords: high-dimensional data, curse of dimensionality, similarity, normalization, subspace, NPsim
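The interval-based idea above can be sketched as follows. The function name, the equal-width intervals, and the fraction-of-dimensions scoring rule are illustrative assumptions — the paper's NPsim measure is more elaborate — but the sketch shows why the result always lands in [0, 1]:

```python
# Hedged sketch: per-dimension ranges split into equal intervals;
# only dimensions whose two components fall in the same or an
# adjacent interval contribute. Scoring rule is an assumption.
import numpy as np

def lattice_similarity(a, b, lo, hi, n_intervals=10):
    """Fraction of dimensions where a and b land in the same or
    adjacent interval of the per-dimension range [lo, hi]."""
    ia = np.floor((a - lo) / (hi - lo) * n_intervals).astype(int)
    ib = np.floor((b - lo) / (hi - lo) * n_intervals).astype(int)
    ia = np.clip(ia, 0, n_intervals - 1)
    ib = np.clip(ib, 0, n_intervals - 1)
    return float(np.mean(np.abs(ia - ib) <= 1))

a = np.array([0.11, 0.52, 0.93, 0.05])
b = np.array([0.13, 0.58, 0.31, 0.97])
print(lattice_similarity(a, b, 0.0, 1.0))
```

Because the score is a fraction of dimensions, it stays in [0, 1] regardless of dimensionality, matching the property the abstract highlights.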
An Evolutionary Normalization Algorithm for Signed Floating-Point Multiply-Accumulate Operation (Cited by: 1)
6
Authors: Rajkumar Sarma, Cherry Bhargava, Ketan Kotecha. Computers, Materials & Continua (SCIE, EI), 2022, No. 7, pp. 481-495 (15 pages)
In the era of digital signal processing, as in graphics and computation systems, multiply-accumulate is one of the prime operations. A MAC unit is a vital component of digital systems such as Fast Fourier Transform (FFT) algorithms, convolution, and image processing algorithms. Normalization architectures are widely used in digital signal processing, mainly to perform comparison and shift operations. In this paper, an evolutionary approach for designing an optimized normalization algorithm is proposed using basic logical blocks such as multiplexers and adders. The proposed normalization algorithm is then used to design an 8×8-bit Signed Floating-Point Multiply-Accumulate (SFMAC) architecture. Since the SFMAC accepts an 8-bit significand and a 3-bit exponent, its input can lie between −(7.96872)_(10) and +(7.96872)_(10). The architecture is designed and implemented in Cadence Virtuoso using 90 and 130 nm technologies (in the Generic Process Design Kit (GPDK) and Taiwan Semiconductor Manufacturing Company (TSMC) processes, respectively). To reduce the power consumption of the proposed normalization architecture, techniques such as block enabling and clock gating are used rigorously. According to the analysis done in Cadence, the proposed architecture consumes less power than its predecessors.
Keywords: data normalization, Cadence Virtuoso, signed floating-point MAC, evolutionary optimized algorithm, block enabling, clock gating
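For readers unfamiliar with the normalization a floating-point MAC performs, here is a toy software model of the left-normalization step: shift the significand until its most significant bit is set, decrementing the exponent accordingly. Bit widths are illustrative (an 8-bit significand, as in the abstract), and the overflow case that needs a right shift is omitted:

```python
# Toy model of floating-point left-normalization: shift the
# significand until the MSB is set, adjusting the exponent.
# Handles only the left-shift case; widths are illustrative.
def normalize(significand: int, exponent: int, width: int = 8):
    if significand == 0:
        return 0, 0
    while significand < (1 << (width - 1)):  # MSB not yet set
        significand <<= 1
        exponent -= 1
    return significand, exponent

sig, exp = normalize(0b00010110, 3)
print(bin(sig), exp)
```

In hardware this loop becomes a leading-zero counter plus a barrel shifter, which is where the comparison and shift blocks the abstract mentions come in.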
Performance Evaluation of Quicksort with GPU Dynamic Parallelism for Gene-Expression Quantile Normalization
7
Authors: Roberto Pinto Souto, Carla Osthoff, Douglas Augusto, Oswaldo Trelles, Ana Tereza Ribeiro de Vasconcelos. 通讯和计算机(中英文版) (Journal of Communication and Computer), 2013, No. 12, pp. 1522-1528 (7 pages)
Keywords: quicksort algorithm, gene-expression data, parallel implementation, GPU, performance evaluation, quantile, modern molecular biology, oligonucleotide microarray
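The quantile normalization named in this title (the sorting step is what the paper parallelizes with GPU quicksort) replaces each sample's sorted values with the mean of the sorted values across samples. A minimal numpy sketch on an illustrative genes-by-samples matrix:

```python
# Sketch of quantile normalization for a genes x samples matrix:
# each sample's sorted values are replaced by the mean sorted profile.
# Ties are broken by position here; full implementations average them.
import numpy as np

expr = np.array([[5.0, 4.0, 3.0],
                 [2.0, 1.0, 4.0],
                 [3.0, 4.0, 6.0],
                 [4.0, 2.0, 8.0]])               # rows: genes, cols: samples

order = np.argsort(expr, axis=0)                 # per-sample sort order
ranks = np.argsort(order, axis=0)                # rank of each entry in its column
mean_quantiles = np.sort(expr, axis=0).mean(axis=1)
normalized = mean_quantiles[ranks]               # assign mean quantile by rank
print(normalized)
```

After normalization every sample has exactly the same value distribution, which is the point of the technique for cross-array comparison.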
Bayesian Inference of Spatially Correlated Binary Data Using Skew-Normal Latent Variables with Application in Tooth Caries Analysis
8
Author: Solaiman Afroughi. Open Journal of Statistics, 2015, No. 2, pp. 127-139 (13 pages)
The analysis of spatially correlated binary data observed on lattices is an interesting topic that attracts the attention of scholars in fields such as epidemiology, medicine, agriculture, biology, geology, and geography. To overcome the difficulties encountered in fitting the autologistic regression model to such data via Bayesian and/or Markov chain Monte Carlo (MCMC) techniques, the Gaussian latent variable model has been incorporated into the methodology. However, assuming a normal distribution for the latent random variable may not be realistic; a wrong normality assumption can bias the parameter estimates and affect the accuracy of results and inferences. This calls for more flexible prior distributions for the latent variable in spatial models. A review of the recent spatial statistics literature shows an increasing tendency toward models involving skew distributions, especially skew-normal ones. In this study, a skew-normal latent variable model was developed for the Bayesian analysis of spatially correlated binary data acquired on lattices. The proposed methodology was applied to inspect the spatial dependency and related factors of tooth caries occurrence in a sample of students of Yasuj University of Medical Sciences, Yasuj, Iran. The results indicated that the skew-normal latent variable model was valid and provided a good fit to the caries data.
Keywords: spatial data, latent variable, autologistic model, skew-normal distribution, Bayesian inference, tooth caries
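The latent-variable idea above — binary outcomes arise by thresholding a continuous latent draw — can be illustrated with scipy's skew-normal distribution. This is far simpler than the paper's autologistic/MCMC setup: no spatial correlation, and the shape parameter and threshold are illustrative assumptions:

```python
# Toy illustration only: threshold skew-normal latent draws to get
# independent binary outcomes. Omits the paper's spatial structure.
import numpy as np
from scipy.stats import skewnorm

rng = np.random.default_rng(1)
latent = skewnorm.rvs(a=4.0, size=1000, random_state=rng)  # right-skewed latent
binary = (latent > 0).astype(int)                          # observed 0/1 data
print(binary.mean())
```

With positive skewness (a > 0) the latent mass shifts right, so the thresholded outcomes are not balanced — the asymmetry a Gaussian latent variable cannot express.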
Leveraging the knee point: Boosting remaining useful life prediction accuracy for lithium-ion batteries with virtual-enhanced normalizing flow
9
Authors: Bowei Zhang, Mingzhe Leng, Changhua Hu, Hong Pei, Zhaoqiang Wang, Chuanyang Li, Li Wang, Xiangming He. Journal of Energy Chemistry, 2025, No. 11, pp. 535-547, I0013 (14 pages)
Deep learning has emerged as a powerful tool for predicting the remaining useful life (RUL) of batteries, contingent upon access to ample data. However, the inherent limitations of data availability from traditional or accelerated life testing pose significant challenges. To mitigate the prediction accuracy issues arising from small sample sizes in existing intelligent methods, we introduce a novel data augmentation framework for RUL prediction. This framework harnesses the high coincidence of degradation patterns exhibited by lithium-ion batteries to pinpoint the knee point, a critical juncture marking a significant shift in the degradation trajectory. Focusing on this knee point, we leverage normalizing flow models to generate virtual data, effectively augmenting the training sample size. Additionally, we integrate a Bayesian Long Short-Term Memory network, optimized with the Box-Cox transformation, to address the uncertainty associated with predictions based on augmented data. This integration allows a more nuanced understanding of RUL prediction uncertainty, offering valuable confidence intervals. The efficacy and superiority of the proposed framework are validated through extensive experiments on the CS2 dataset from the University of Maryland and the CrFeMnNiCo dataset from our laboratory. The results clearly demonstrate a substantial improvement in the confidence interval of RUL predictions compared to pre-optimization, highlighting the framework's ability to achieve high-precision RUL predictions even with limited data.
Keywords: remaining useful life, data augmentation, knee point, normalizing flow, Box-Cox transformation
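Of the components above, the Box-Cox step is the easiest to show in isolation: it transforms a positive, skewed quantity toward normality. The synthetic data below stand in for a degradation-related variable — they are not battery measurements:

```python
# Sketch of a Box-Cox transformation (scipy.stats.boxcox) pulling a
# skewed positive variable toward normality. Data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
skewed = rng.lognormal(mean=0.0, sigma=0.8, size=2000)  # positive, right-skewed

transformed, lam = stats.boxcox(skewed)                 # lam: fitted lambda
print(round(float(stats.skew(skewed)), 2),
      round(float(stats.skew(transformed)), 2))
```

`boxcox` fits the power parameter by maximum likelihood; for lognormal data the fitted lambda is near 0, where the transform reduces to a logarithm.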
Research on the theoretical system and top-level architecture of normalized intelligent operation in coal mines
10
Authors: Li Rui (李瑞), Wang Gaowei (王高伟), Wang Zhongqiang (王忠强). 煤炭工程 (Coal Engineering) (Peking University core journal), 2026, No. 1, pp. 1-11 (11 pages)
To address key problems in intelligent coal-mine construction — the absence of a normalized operation and management mechanism, system performance falling short of expectations, and an immature technology ecosystem — this study constructs a systematic theoretical system and top-level architecture for normalized intelligent coal-mine operation. Introducing the integrated concept of "integrated management, intelligent systems, continuous operation, and standardized evaluation" and combining a dual-drive model of "intelligent technology + management", a top-level architecture is designed that covers business logic, the technical system, the data architecture, and evaluation management. The study clarifies the collaborative relationships among the four systems of business control, AI collaboration, safety assurance, and evaluation management; describes a technical implementation path based on a PaaS cloud foundation, an intelligent operation-and-maintenance AI engine, and a full-lifecycle data architecture; and proposes a closed-loop evaluation management process centered on a maturity model. The results show that this top-level architecture can provide theoretical guidance and methodological support for the long-term, stable, self-adaptively optimized normalized operation of intelligent coal-mine systems, helping transform coal production toward a safe, efficient, green, and intelligent sustainable mode.
Keywords: normalized intelligent coal-mine operation, top-level architecture, business logic, data architecture, coal-mine intelligentization
Quantile regression inference for nonparametric fixed-effects panel data models (Cited by: 1)
11
Author: Lü Xiumei (吕秀梅). 统计与信息论坛 (Journal of Statistics and Information) (CSSCI), 2012, No. 6, pp. 28-32 (5 pages)
Using quantile regression, the estimation and testing of the nonparametric fixed-effects panel data model are discussed, and the asymptotic normality and convergence rate of the parameter estimators are obtained. A rank score statistic is also constructed to test the model's fixed effects, and it is shown to be asymptotically standard normal.
Keywords: quantile regression, asymptotic normality, fixed-effects panel data model
High-accuracy diabetic retinopathy image classification based on M3TEA
12
Authors: Li Fengshuo (李丰硕), Wu Yangdong (吴扬东), Deng Zhifang (邓智方), Zhao Lian (赵炼). 计算机工程与设计 (Computer Engineering and Design) (Peking University core journal), 2026, No. 3, pp. 769-777 (9 pages)
To address insufficient classification accuracy, image diversity, and the difficulty of recognizing lesion details in diabetic retinopathy image classification, a task-enhanced attention model based on MobileNetV3 is proposed. A multi-task loss function exploits multi-level image information to strengthen lesion-feature recognition; multiple data augmentation strategies increase the diversity and complexity of the training data and improve generalization; and a normalization-based attention mechanism focuses on key lesion regions to further raise classification accuracy. Experimental results on the Kaggle diabetic retinopathy dataset show that, compared with the MobileNetV3 baseline, the model significantly improves classification accuracy, verifying the effectiveness and superiority of the method.
Keywords: diabetic retinopathy, image classification, multi-task loss function, generalization, data augmentation, normalized attention mechanism, computer-aided diagnosis
Photovoltaic power forecasting based on CNN-BiLSTM-Attention
13
Authors: Zhu Junxi (朱峻嬉), Zheng Shuxian (郑淑娴), Jin Dian (金典), Sun Shikang (孙世康), Feng Jingyao (冯靖瑶), Chen Shijun (陈仕军). 四川电力技术 (Sichuan Electric Power Technology), 2026, No. 1, pp. 14-21, 95 (9 pages)
To handle the fluctuating and intermittent nature of photovoltaic power output, a hybrid forecasting model combining a convolutional neural network (CNN), a bidirectional long short-term memory network (BiLSTM), and an attention mechanism is proposed. First, the local outlier factor (LOF) algorithm detects and removes anomalies from the power data, and horizontal normalization removes dimensional (unit) differences. Then a forecasting model is built in which the CNN captures local spatial features and the BiLSTM captures long-term temporal dependencies. Finally, attention is introduced in the optimization stage to dynamically weight key time steps. The model was validated on three years of photovoltaic power data from a provincial grid. The results show that the proposed CNN-BiLSTM-Attention model achieves a mean absolute error, root mean square error, and mean relative error of 0.02, 0.04, and 0.06, respectively, enabling high-accuracy photovoltaic power forecasting of practical value for power dispatch optimization and renewable energy integration.
Keywords: photovoltaic power forecasting, data normalization, LOF anomaly detection, CNN-BiLSTM-Attention hybrid model, attention mechanism
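The normalization step in the pipeline above removes unit and scale differences before training. A minimal per-feature min-max sketch — the paper's "horizontal normalization" may differ in detail, and the feature matrix here is illustrative:

```python
# Minimal per-feature min-max normalization sketch (scale/unit removal
# before model training). The paper's exact scheme may differ.
import numpy as np

def minmax_normalize(x):
    lo, hi = x.min(axis=0), x.max(axis=0)
    return (x - lo) / (hi - lo)

features = np.array([[100.0, 0.2],     # e.g. irradiance, power (made up)
                     [300.0, 0.8],
                     [200.0, 0.5]])
scaled = minmax_normalize(features)
print(scaled)
```

After scaling, every feature spans [0, 1], so no single large-unit feature dominates the network's loss.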
Simulation of a privacy-preserving FedAvg optimization model for cross-institution medical data
14
Authors: Xiao Wei (肖薇), Wang Xiaoguang (王晓光), Li Changyue (李长悦), Wang Chensha (王晨莎). 计算机仿真 (Computer Simulation), 2026, No. 1, pp. 472-476 (5 pages)
The heterogeneity of electronic health record systems across medical institutions complicates data exchange and causes misunderstandings or omissions in practice. Moreover, because cross-institution medical data sharing involves many participants, some institutions may lose their private- and public-key parameters and fail to update encrypted information in time, weakening privacy protection. To address this, a privacy-preserving FedAvg optimization method for cross-institution medical data is proposed. Cross-institution medical data are standardized to unify data formats and structures; an attribute mapping function is constructed and private data are tagged via hash functions to reduce integration errors; the global model is initialized over a cyclic group of safe prime order so that every institution holds private- and public-key parameters; and a weight-aware FedAvg algorithm adjusts the model update mechanism, with homomorphic encryption protecting the update information, enabling secure access to medical data. Simulations show that the proposed method has low bandwidth and encryption-task overhead in communication cost, maintains an F1 score above 0.9 with greater stability, preserves patient data privacy with higher model accuracy, and consumes few computational resources.
Keywords: cross-institution medical data, data encryption, privacy preservation, federated averaging algorithm, data normalization, homomorphic encryption
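The core of the weighted FedAvg update described above is a sample-size-weighted mean of client model parameters. A bare sketch with the encryption and attribute-mapping machinery omitted; the parameter vectors and sample counts are made up:

```python
# Sketch of weighted federated averaging (FedAvg): the global model is
# the sample-size-weighted mean of client models. Encryption omitted.
import numpy as np

client_models = [np.array([1.0, 2.0]),
                 np.array([3.0, 4.0]),
                 np.array([5.0, 6.0])]        # per-institution parameters
client_sizes = np.array([100, 200, 700])      # records per institution

w = client_sizes / client_sizes.sum()         # normalized weights
global_model = sum(wi * m for wi, m in zip(w, client_models))
print(global_model)
```

In the paper's setting each client would encrypt its update homomorphically before sending it, so the server computes this weighted sum without seeing plaintext parameters.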
Gene expression data clustering based on Normalized Cut (Cited by: 4)
15
Authors: Wang Junsheng (王俊生), Wang Nian (王年), Guo Xiuli (郭秀丽), Tang Jun (唐俊). 安徽大学学报(自然科学版) (Journal of Anhui University, Natural Science Edition) (CAS, Peking University core journal), 2012, No. 4, pp. 68-72 (5 pages)
Cluster analysis of gene expression data can improve the accuracy of tumor diagnosis and is of great significance to biomedical research. This paper applies Normalized Cut to the clustering of gene expression data: samples are mapped to points in a high-dimensional space; a normalized Laplacian matrix is constructed from the affinity matrix and the degree matrix; SVD decomposition yields an indicator vector reflecting the class information of the original samples; and the sign differences of the indicator vector's components are used to cluster the gene expression data. Experiments on leukemia and colon cancer datasets demonstrate the effectiveness of the method.
Keywords: clustering, indicator vector, Normalized Cut, gene expression data
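The spectral steps above — affinity matrix, normalized Laplacian, sign-based split on an indicator vector — can be sketched on toy 1-D "samples" (real gene-expression samples are high-dimensional; the Gaussian affinity bandwidth is an illustrative assumption):

```python
# Sketch of the Normalized Cut steps: affinity matrix, normalized
# Laplacian, and a sign split on the second-smallest eigenvector.
import numpy as np

X = np.array([[0.0], [0.1], [0.2], [5.0], [5.1], [5.2]])  # two clear groups
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.exp(-d2 / 2.0)                          # Gaussian affinity matrix
deg = W.sum(axis=1)
D = np.diag(deg)                               # degree matrix
D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
L = D_inv_sqrt @ (D - W) @ D_inv_sqrt          # normalized Laplacian

vals, vecs = np.linalg.eigh(L)                 # eigendecomposition (symmetric)
indicator = vecs[:, 1]                         # second-smallest eigenvector
labels = (indicator > 0).astype(int)           # split by component sign
print(labels)
```

The paper uses SVD of the normalized Laplacian to obtain the indicator vector; for a symmetric positive semidefinite matrix the eigendecomposition used here yields the same subspace.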
Research on a whole-process cost control method for distribution networks based on data mining algorithms
16
Authors: Chen Fulei (陈付雷), Li Jianqing (李建青), Shi Xiaomin (施晓敏). 电子设计工程 (Electronic Design Engineering), 2026, No. 5, pp. 106-110, 116 (6 pages)
Viewed over the whole process, many factors of differing influence affect distribution network cost. A whole-process cost control method based on data mining algorithms is therefore proposed. The method analyzes the cost-influencing factors from three aspects — the design stage, the construction process, and the procurement of materials and equipment — builds a judgment matrix of these factors through pairwise comparison, normalizes it, and weights the factors using the maximum eigenvalue. In the control stage, the analytic hierarchy process is used to construct a layered control-objective system comprising the overall control objective, sub-objectives, control-stage objectives, and control indicators. The factor states whose sensitivity coefficients approach 0 are taken as the final control result, reducing their disturbance to the cost. Experimental results show that the unit investment cost under the designed control method is significantly lower than that of the control group.
Keywords: data mining algorithms, whole-process distribution network, cost control, judgment matrix, normalization, maximum eigenvalue, sensitivity coefficient
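The judgment-matrix weighting step above follows the standard AHP recipe: the principal (maximum-eigenvalue) eigenvector of the pairwise comparison matrix, normalized to sum to one, gives the factor weights. The 3x3 matrix below is illustrative, not from the paper:

```python
# Sketch of AHP-style weighting: principal eigenvector of a pairwise
# judgment matrix, normalized to sum to 1. Matrix is illustrative.
import numpy as np

A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])      # pairwise comparisons of 3 factors

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)                 # maximum (Perron) eigenvalue
w = np.abs(vecs[:, k].real)
w /= w.sum()                             # normalized factor weights
print(w.round(3))
```

A consistency check (comparing the maximum eigenvalue to the matrix order) usually follows this step in practice, but is omitted here.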
ASYMPTOTIC PROPERTIES OF ESTIMATORS IN PARTIALLY LINEAR SINGLE-INDEX MODEL FOR LONGITUDINAL DATA (Cited by: 3)
17
Authors: Tian Ping (田萍), Yang Lin (杨林), Xue Liugen (薛留根). Acta Mathematica Scientia (SCIE, CSCD), 2010, No. 3, pp. 677-687 (11 pages)
In this article, a partially linear single-index model for longitudinal data is investigated. Generalized penalized spline least squares estimates of the unknown parameters are proposed. All parameters can be estimated simultaneously by the proposed method while the features of longitudinal data are taken into account. The existence, strong consistency, and asymptotic normality of the estimators are proved under suitable conditions. A simulation study investigates the finite-sample performance of the proposed method. The approach can also be used to study the pure single-index model for longitudinal data.
Keywords: longitudinal data, partially linear single-index model, penalized spline, strong consistency, asymptotic normality
Assessment of Human Impacts on Vegetation in Built-up Areas in China Based on AVHRR, MODIS and DMSP_OLS Nighttime Light Data, 1992–2010 (Cited by: 6)
18
Authors: Liu Qinping, Yang Yongchun, Tian Hongzhen, Zhang Bo, Gu Lei. Chinese Geographical Science (SCIE, CSCD), 2014, No. 2, pp. 231-244 (14 pages)
Since the reform and opening-up program started in 1978, the level of urbanization has increased rapidly in China. Rapid urban expansion and restructuring have had significant impacts on the ecological environment, especially within built-up areas. In this study, ArcGIS 10, ENVI 4.5, and Visual FoxPro 6.0 were used to analyze the human impacts on vegetation in the built-up areas of 656 Chinese cities from 1992 to 2010. First, an existing algorithm was refined to extract the boundaries of the built-up areas from Defense Meteorological Satellite Program Operational Linescan System (DMSP_OLS) nighttime light data; the improved algorithm has the advantages of high accuracy and speed. Second, a mathematical model of human impacts (HI) was constructed to measure the impacts of human factors on vegetation during rapid urbanization, based on Advanced Very High Resolution Radiometer (AVHRR) and Moderate Resolution Imaging Spectroradiometer (MODIS) Normalized Difference Vegetation Index (NDVI) data. HI values greater than zero indicate relatively beneficial effects, while values less than zero indicate proportionally adverse effects. The results were analyzed from four aspects: city size (metropolises, large cities, medium-sized cities, and small cities), large regions (eastern, central, western, and northeastern China), administrative divisions (provinces, autonomous regions, and municipalities), and vegetation zones (humid and semi-humid forest zone, semi-arid steppe zone, and arid desert zone). Finally, we discussed how human factors affected vegetation change in the built-up areas. We found that urban planning policies and development stages affected vegetation change in the built-up areas; the negative human impacts followed an inverted 'U' shape, first rising and then falling with increasing urban scale. China's national policies and social and economic development affected vegetation change in the built-up areas. The findings can provide a scientific basis for municipal planning departments, a decision-making reference for government, and scientific guidance for sustainable development in China.
Keywords: vegetation change, human impact, urbanization, built-up areas, nighttime light data, Normalized Difference Vegetation Index (NDVI)
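The NDVI underlying the HI model above has a standard definition: the normalized difference of near-infrared and red reflectance. A minimal per-pixel sketch with illustrative reflectance values:

```python
# Minimal NDVI sketch: (NIR - Red) / (NIR + Red), per pixel.
# Reflectance values are illustrative.
import numpy as np

nir = np.array([0.50, 0.60, 0.30])   # near-infrared reflectance
red = np.array([0.10, 0.20, 0.25])   # red reflectance
ndvi = (nir - red) / (nir + red)
print(ndvi.round(3))
```

Dense green vegetation pushes NDVI toward 1, while bare or built-up surfaces sit near 0, which is what makes NDVI time series usable as a vegetation-change proxy in built-up areas.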
A new approach to retrieve leaf normal distribution using terrestrial laser scanners (Cited by: 2)
19
Authors: Shengye Jin, Masayuki Tamura, Junichi Susaki. Journal of Forestry Research (SCIE, CAS, CSCD), 2016, No. 3, pp. 631-638 (8 pages)
Leaf normal distribution is an important structural characteristic of the forest canopy. Although terrestrial laser scanners (TLS) have potential for estimating canopy structural parameters, distinguishing between leaves and non-photosynthetic structures to retrieve the leaf normals has been challenging. We used an approach that accurately retrieves the leaf normals of camphorwood (Cinnamomum camphora) from TLS point cloud data. First, non-photosynthetic structures were filtered out using a curvature threshold at each point. Then, the point cloud data were segmented by a voxel method and clustered by a Gaussian mixture model in each voxel. Finally, the normal vector of each cluster was computed by principal component analysis to obtain the leaf normal distribution. We also measured leaf inclination angles and estimated their distribution, which we compared with the retrieved leaf normal distribution. The correlation coefficient between measurements and the retrieved results was 0.96, indicating good agreement.
Keywords: leaf normal distribution, leaf inclination angle, terrestrial laser scanner, point cloud data, curvature, clustering
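The final step above — a surface normal from PCA on a cluster of 3-D points — can be sketched directly: the eigenvector of the smallest eigenvalue of the cluster's covariance matrix is the normal. The synthetic points below model a single flat leaf with slight scanner noise:

```python
# Sketch of per-cluster normal estimation by PCA: the eigenvector of
# the smallest covariance eigenvalue is the surface (leaf) normal.
# Points are synthetic: a noisy patch of the plane z = 0.
import numpy as np

rng = np.random.default_rng(3)
pts = np.column_stack([rng.uniform(-1, 1, 200),
                       rng.uniform(-1, 1, 200),
                       rng.normal(0, 0.01, 200)])   # leaf lying flat, noisy

cov = np.cov((pts - pts.mean(axis=0)).T)            # 3x3 covariance matrix
vals, vecs = np.linalg.eigh(cov)                    # ascending eigenvalues
normal = vecs[:, 0]                                 # smallest-eigenvalue direction
print(normal.round(3))
```

For this flat-leaf cluster the estimated normal should be close to (0, 0, ±1); in the paper this is repeated per Gaussian-mixture cluster to accumulate the leaf normal distribution.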