The Tayatea Dyke Swarm (also known as the Tayatea Dolerite) comprises well-exposed northeast-trending tholeiitic dykes that intrude the Rocky Cape Group (RCG) of northwest Tasmania, Australia. The dykes commonly
Optimal location query in road networks is a basic operation in location intelligence applications. Given a set of clients and servers on a road network, the purpose of an optimal location query is to obtain a location for a new server such that a certain objective function, calculated from the locations of clients and servers, is optimised. Existing works assume no labels for servers and that a client only visits its nearest server. These assumptions are not realistic and render the existing work inapplicable in many cases. In this paper, we relax these assumptions and consider the k nearest neighbours (KNN) of clients. We introduce the problem of KNN-based optimal location query (KOLQ), which considers the k nearest servers of clients and labeled servers. We also introduce a variant problem called relocation KOLQ (RKOLQ), which aims at relocating an existing server to an optimal location. Two main analysis algorithms are proposed for these problems. Extensive experiments on real road networks illustrate the efficiency of our proposed solutions.
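As a concrete illustration of the problem statement (not the paper's algorithms), the following sketch evaluates the KOLQ objective by brute force over candidate sites. Euclidean distance stands in for road-network shortest-path distance, and the objective (sum of each client's mean distance to its k nearest servers) is an assumed example; all names and parameters are illustrative.

```python
# Brute-force sketch of a KNN-based optimal location query (assumed objective).
import numpy as np

def kolq_objective(clients, servers, k):
    """Sum over clients of the mean distance to their k nearest servers."""
    d = np.linalg.norm(clients[:, None, :] - servers[None, :, :], axis=2)
    knn = np.sort(d, axis=1)[:, :k]          # each client's k smallest distances
    return knn.mean(axis=1).sum()

def best_new_location(clients, servers, candidates, k):
    """Pick the candidate site for a new server that minimises the objective."""
    scores = [kolq_objective(clients, np.vstack([servers, c[None, :]]), k)
              for c in candidates]
    return candidates[int(np.argmin(scores))], min(scores)

rng = np.random.default_rng(0)
clients = rng.uniform(0, 10, size=(200, 2))
servers = rng.uniform(0, 10, size=(5, 2))
candidates = rng.uniform(0, 10, size=(50, 2))   # candidate sites, e.g. road vertices
loc, score = best_new_location(clients, servers, candidates, k=3)
print(loc, score)
```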
Aims: This systematic review highlights the relative support and implications of the attractant-decoy and repellent-plant hypotheses, discussing important linkages between these theories and the opportunity for novel integration into ecological and applied research. Methods: An extensive systematic review of the current literature on the attractant-decoy and repellent-plant hypotheses was done to describe the following attributes of the research to date: (i) the geographic extent (country and biome) of studies on this topic, (ii) the scope of experimental designs used, (iii) the level of support for these hypotheses with respect to the breadth of ecological niches tested, (iv) the level of support for these hypotheses with respect to the classes of herbivores examined and, lastly, (v) the ecological impact or purpose of these studies. Herein, we summarize important research gaps in the empirical literature on this topic and identify novel opportunities for critical linkages between ecological and applied theories. Important Findings: A total of 37% of experiments testing these two associated hypotheses were done in North America, frequently in either temperate broadleaf (26% of studies) or taiga ecosystems (15% of studies). The majority of these studies involved experimental manipulations such as removing and transplanting vegetation, and either tracked or excluded mammalian herbivores. Ecological implications were primarily examined (59% of studies), but implications for agriculture and commercial forestry were also described in 22% of studies. The repellent-plant hypothesis was well supported in many ecological systems, particularly for mammalian herbivores, but the attractant-decoy hypothesis has been tested less frequently, representing an important research gap. Insect herbivores were under-represented in all categories except applied contexts such as commercial forestry and agriculture. There is a clear need for studies to connect these two ecological hypotheses with the management of agriculture and restoration efforts in many ecosystems. Research on the co-evolution and facilitation between palatable and unpalatable plants also represents another novel area of future study.
The study presents the Half Max Insertion Heuristic (HMIH) as a novel approach to solving the Travelling Salesman Problem (TSP). The goal is to outperform existing techniques such as the Farthest Insertion Heuristic (FIH) and Nearest Neighbour Heuristic (NNH). The paper discusses the limitations of current construction tour heuristics, focusing particularly on the significant margin of error in FIH. It then proposes HMIH as an alternative that minimizes the increase in tour distance and includes more nodes. HMIH improves tour quality by starting with an initial tour consisting of a 'minimum' polygon and iteratively adding nodes using our novel Half Max routine. The paper thoroughly examines and compares HMIH with FIH and NNH via rigorous testing on standard TSP benchmarks. The results indicate that HMIH consistently delivers superior performance, particularly with respect to tour cost and computational efficiency. HMIH's tours were sometimes 16% shorter than those generated by FIH and NNH, showcasing its potential and value as a novel benchmark for TSP solutions. The study used statistical methods, including Friedman's non-parametric test, to validate the performance of HMIH over FIH and NNH. This guarantees that the identified advantages are statistically significant and consistent in various situations. This comprehensive analysis emphasizes the reliability and efficiency of the heuristic, making a compelling case for its use in solving TSP issues. The research shows that, in general, HMIH fared better than FIH in all cases studied, except for a few instances (pr439, eil51, and eil101) where FIH either performed equally or slightly better than HMIH. HMIH's efficiency is shown by its improvements in error percentage (δ) and goodness values (g) compared to FIH and NNH. In the att48 instance, HMIH had an error rate of 6.3%, whereas FIH had 14.6% and NNH had 20.9%, indicating that HMIH was closer to the optimal solution. HMIH consistently showed superior performance across many benchmarks, with lower percentage error and higher goodness values, suggesting a closer match to the optimal tour costs. This study substantially contributes to combinatorial optimization by enhancing current insertion algorithms and presenting a more efficient solution for the Travelling Salesman Problem. It also creates new possibilities for progress in heuristic design and optimization methodologies.
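The Half Max routine itself is not specified in the abstract, so it cannot be reproduced here; as background, the following is a compact sketch of the Farthest Insertion Heuristic (FIH) baseline that HMIH is compared against, run on a random Euclidean instance.

```python
# Farthest Insertion Heuristic sketch: repeatedly pick the unvisited city farthest
# from the current tour and insert it at the position of minimum length increase.
import numpy as np

def farthest_insertion(points):
    d = np.linalg.norm(points[:, None] - points[None, :], axis=2)
    n = len(points)
    start = 0
    tour = [start, int(np.argmax(d[start]))]        # start with the longest edge
    remaining = set(range(n)) - set(tour)
    while remaining:
        # city farthest from the current tour
        nxt = max(remaining, key=lambda c: min(d[c][t] for t in tour))
        # cheapest insertion position
        best_pos, best_inc = None, float("inf")
        for i in range(len(tour)):
            a, b = tour[i], tour[(i + 1) % len(tour)]
            inc = d[a][nxt] + d[nxt][b] - d[a][b]
            if inc < best_inc:
                best_pos, best_inc = i + 1, inc
        tour.insert(best_pos, nxt)
        remaining.remove(nxt)
    return tour

pts = np.random.default_rng(1).uniform(0, 100, size=(30, 2))
print(farthest_insertion(pts))
```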
This paper deals with a real-life application of epilepsy classification, where three phases of absence seizure, namely pre-seizure, seizure and seizure-free, are classified using real clinical data. Artificial neural networks (ANN) and support vector machines (SVMs) combined with supervised learning algorithms, and k-means clustering (k-MC) combined with unsupervised techniques, are employed to classify the three seizure phases. Different techniques to combine binary SVMs, namely One Vs One (OvO), One Vs All (OVA) and Binary Decision Tree (BDT), are employed for multiclass classification. Comparisons are performed with two traditional classification methods, namely the k-Nearest Neighbour (k-NN) and Naive Bayes classifiers. It is concluded that SVM-based classifiers outperform the traditional ones in terms of recognition accuracy and robustness when the original clinical data is distorted with noise. Furthermore, the SVM-based classifier with OvO provides the highest recognition accuracy, whereas the ANN-based classifier overtakes it by demonstrating the maximum accuracy in the presence of noise.
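A hedged sketch of the multiclass comparison, using scikit-learn's One-vs-One wrapper around a binary SVM alongside the two traditional baselines. Synthetic data stands in for the clinical EEG features, which are not available here; the kernel and hyperparameters are illustrative.

```python
# One-vs-One SVM vs k-NN vs Naive Bayes on a synthetic 3-class stand-in problem.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsOneClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB

# stand-in for the three seizure phases: pre-seizure, seizure, seizure-free
X, y = make_classification(n_samples=600, n_features=20, n_informative=8,
                           n_classes=3, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "SVM (OvO)": OneVsOneClassifier(SVC(kernel="rbf", C=1.0)),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "Naive Bayes": GaussianNB(),
}
for name, clf in models.items():
    print(name, clf.fit(Xtr, ytr).score(Xte, yte))
```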
Deep learning has achieved many successes in video processing. Video has become an increasingly important part of our daily digital interactions. The advancement of higher-resolution content and the large volume of data pose serious challenges to receiving, distributing, compressing and displaying high-quality video content. In this paper we propose an effective and efficient video compression framework based on deep learning and built on Flask, which creatively combines Convolutional Neural Networks (CNN) and Generative Adversarial Networks (GAN). In the proposed method, the layers are divided into different groups for data processing: a CNN removes duplicate frames, a single image is reused in place of the duplicate images by recognizing and detecting minute changes with a GAN, and the sequence is recorded with a Long Short-Term Memory (LSTM) network. Instead of the complete image, only the small changes generated by the GAN are substituted, which enables frame-level compression. Pixel-wise comparison is performed using K-Nearest Neighbours (KNN) over the frame, clustered with K-means, and Singular Value Decomposition (SVD) is applied to every frame in the video for all three colour channels [Red, Green, Blue] to decrease the dimension of the utility matrix [R, G, B] by extracting its latent factors. Video frames are packed with parameters with the aid of a codec and converted to video format, and the results are compared with the original video. Repeated experiments on several videos with different sizes, durations, frames per second (FPS), and quality demonstrated a significant resampling rate. On average, the output deviated by around 10% in quality and more than 50% in size when compared with the original video.
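The per-channel SVD step is concrete enough to illustrate. Below is a minimal sketch of a rank-r approximation of a single synthetic frame; the CNN/GAN/LSTM stages and the codec packaging are not reproduced, and the rank value is an arbitrary assumption.

```python
# Low-rank SVD approximation of each colour channel of one frame.
import numpy as np

def compress_frame_svd(frame, rank):
    """Return a rank-`rank` approximation of an H x W x 3 frame."""
    out = np.empty_like(frame, dtype=float)
    for c in range(3):                      # R, G, B channels
        U, s, Vt = np.linalg.svd(frame[..., c].astype(float), full_matrices=False)
        out[..., c] = (U[:, :rank] * s[:rank]) @ Vt[:rank, :]
    return np.clip(out, 0, 255).astype(np.uint8)

frame = np.random.default_rng(2).integers(0, 256, size=(144, 176, 3), dtype=np.uint8)
approx = compress_frame_svd(frame, rank=20)
print(approx.shape)
```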
This paper describes a procedure for estimating travel time on signalized arterial roads based on multiple data sources, with the application of dimensionality reduction. The travel time estimation approach incorporates forecasts of transportation node impedance and travel time on network links. The forecasting period is two hours and the estimation is based on historical data and real-time data on traffic conditions. Travel time estimation combines multivariate regression, principal component analysis, KNN (k-nearest neighbours), cross validation and EWMA (exponentially weighted moving average) methods. When comparing estimation methodologies, considerably better results were achieved by the KNN method than by the EWMA method. This holds for every time interval considered except the evening interval, when the signalized arterial roads were uncongested.
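A small sketch of the KNN-versus-EWMA comparison on synthetic travel-time data; the features, neighbour count and smoothing factor are illustrative assumptions, not values from the paper.

```python
# Compare k-NN regression with an EWMA baseline for one-step travel-time prediction.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(3)
t = np.arange(500)                                # e.g. 5-minute intervals
travel_time = 60 + 20 * np.sin(2 * np.pi * t / 288) + rng.normal(0, 3, t.size)

# k-NN: predict from time-of-day and the previous observation
X = np.column_stack([t % 288, np.roll(travel_time, 1)])[1:]
y = travel_time[1:]
knn = KNeighborsRegressor(n_neighbors=5).fit(X[:-100], y[:-100])
knn_pred = knn.predict(X[-100:])

# EWMA: exponentially weighted average of past observations as the forecast
alpha, ewma = 0.3, [y[0]]
for v in y[:-1]:
    ewma.append(alpha * v + (1 - alpha) * ewma[-1])
ewma_pred = np.array(ewma)[-100:]

print("kNN MAE :", np.mean(np.abs(knn_pred - y[-100:])))
print("EWMA MAE:", np.mean(np.abs(ewma_pred - y[-100:])))
```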
The natural element method (NEM) is a newly developed numerical method based on the Voronoi diagram and Delaunay triangulation of scattered points, which adopts natural neighbour interpolation to construct trial functions in the framework of the Galerkin method. Owing to its distinctive advantages, the NEM is used widely in many problems of computational mechanics. Utilizing the NEM, this paper deals with numerical limit analysis of structures made of perfectly rigid-plastic material. According to the kinematic theorem of plastic limit analysis, a mathematical-programming natural element formulation is established for determining the upper-bound multiplier of plane problems, and a direct iteration algorithm is proposed accordingly to solve it. In this algorithm, the plastic incompressibility condition is handled by two different treatments, and the nonlinearity and nonsmoothness of the goal function are overcome by distinguishing the rigid zones from the plastic zones at each iteration. The implementation of the iterative process is quite simple and effective because each iteration is equivalent to solving an associated elastic problem. The obtained limit load multiplier is proved to converge monotonically to the upper bound of the true solution. Several benchmark examples are investigated to validate the performance of the NEM in the application field of limit analysis.
The firefly algorithm (FA) is a recently proposed swarm intelligence technique. It has shown good performance in solving various optimization problems. According to the standard firefly algorithm and most of its variants, a firefly migrates to every other brighter firefly in each iteration. However, this method leads to oscillations of positions, which hampers convergence to the optimum. To address these problems and enhance the performance of FA, we propose a new firefly algorithm, called the Best Neighbor Firefly Algorithm (BNFA). It employs a best-neighbor guided strategy, where each firefly is attracted to the best firefly among some randomly chosen neighbors, thus reducing the firefly oscillations in every attraction-induced migration stage while increasing the probability of being guided in a new, better direction. Moreover, it selects neighbors randomly to prevent a firefly from being trapped in a local optimum. Extensive experiments are conducted to find the optimal parameter settings. To verify the performance of BNFA, 13 classical benchmark functions are tested. Results show that BNFA outperforms the standard FA and other recently proposed modified FAs.
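A minimal sketch of the best-neighbour attraction step, assuming the standard FA update rule x_i += β0·exp(−γ·r²)·(x_j − x_i) + α·(rand − 0.5); the neighbour count m and the other parameters are illustrative, not the tuned settings from the paper's experiments.

```python
# One BNFA-style migration step: each firefly moves toward the best of m random neighbours.
import numpy as np

def bnfa_step(pop, fitness, m=5, beta0=1.0, gamma=1.0, alpha=0.2, rng=None):
    rng = rng or np.random.default_rng()
    n, dim = pop.shape
    new_pop = pop.copy()
    for i in range(n):
        # choose m random neighbours and keep only the single best (brightest) one
        cand = rng.choice([j for j in range(n) if j != i], size=m, replace=False)
        j = cand[np.argmin(fitness[cand])]           # minimisation problem
        if fitness[j] < fitness[i]:                  # move only towards a better firefly
            r2 = np.sum((pop[i] - pop[j]) ** 2)
            new_pop[i] = (pop[i]
                          + beta0 * np.exp(-gamma * r2) * (pop[j] - pop[i])
                          + alpha * (rng.random(dim) - 0.5))
    return new_pop

sphere = lambda x: np.sum(x ** 2, axis=1)            # benchmark objective
rng = np.random.default_rng(4)
pop = rng.uniform(-5, 5, size=(30, 10))
for _ in range(100):
    pop = bnfa_step(pop, sphere(pop), rng=rng)
print("best fitness:", sphere(pop).min())
```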
VisuShrink, ModineighShrink and NeighShrink are efficient image denoising algorithms based on the discrete wavelet transform (DWT). These methods have the disadvantage of using a suboptimal universal threshold and an identical neighbouring window size in all wavelet subbands. In this paper, an improved method is proposed that determines a threshold as well as a neighbouring window size for every subband based on its length. Our experimental results illustrate that the proposed approach is better than the existing ones, i.e., NeighShrink, ModineighShrink and VisuShrink, in terms of peak signal-to-noise ratio (PSNR), i.e., the visual quality of the image.
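A hedged sketch of NeighShrink-style shrinkage with PyWavelets: each detail coefficient is scaled by max(0, 1 − T²/S²), where S² is the squared-coefficient sum over a small neighbourhood window and T is the universal threshold. The paper's per-subband threshold and window selection is not reproduced; the wavelet, level and window size are arbitrary choices.

```python
# NeighShrink-style wavelet denoising with a global universal threshold.
import numpy as np
import pywt
from scipy.ndimage import uniform_filter

def neighshrink(img, wavelet="db8", level=3, win=3):
    coeffs = pywt.wavedec2(img.astype(float), wavelet, level=level)
    # estimate noise sigma from the finest diagonal subband (median rule)
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    T2 = 2.0 * sigma ** 2 * np.log(img.size)        # universal threshold squared
    out = [coeffs[0]]
    for (cH, cV, cD) in coeffs[1:]:
        shrunk = []
        for c in (cH, cV, cD):
            S2 = uniform_filter(c ** 2, size=win) * win ** 2   # neighbourhood energy
            shrunk.append(c * np.maximum(0.0, 1.0 - T2 / np.maximum(S2, 1e-12)))
        out.append(tuple(shrunk))
    return pywt.waverec2(out, wavelet)

noisy = np.random.default_rng(5).normal(0, 20, (256, 256)) + 128
print(neighshrink(noisy).shape)
```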
This paper studies the evolutionary mechanism of parameter selection in the construction of the weight function for the nearest neighbour estimate in nonparametric regression. We construct an algorithm that adaptively evolves fine weights and makes good predictions at unknown points. The numerical experiments indicate that this method is effective. It is a meaningful discussion of the practicability of nonparametric regression and the methodology of adaptive model-building.
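A small sketch of a nearest-neighbour regression estimate with a parameterised weight function; the evolutionary selection of the weight parameters described above is replaced here by a fixed inverse-distance weighting, purely for illustration.

```python
# Weighted k-nearest-neighbour regression estimate at a query point.
import numpy as np

def knn_regress(x_query, x_train, y_train, k=5, p=1.0):
    """Predict y at x_query as a weighted mean of the k nearest training targets."""
    d = np.abs(x_train - x_query)
    idx = np.argsort(d)[:k]
    w = 1.0 / (d[idx] ** p + 1e-9)          # weight function; p could be evolved
    return np.sum(w * y_train[idx]) / np.sum(w)

rng = np.random.default_rng(6)
x = np.sort(rng.uniform(0, 2 * np.pi, 200))
y = np.sin(x) + rng.normal(0, 0.1, x.size)
print(knn_regress(1.5, x, y), np.sin(1.5))
```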
Background: Species-specific genotypic features, local neighbourhood interactions and resource supply strongly influence the tree stature and growth rate. In mixed-species forests, diversity-mediated biomass allocation has been suggested to be a fundamental mechanism underlying the positive biodiversity-productivity relationships. Empirical evidence, however, is rare about the impact of local neighbourhood diversity on tree characteristics analysed at a very high level of detail. To address this issue we analysed these effects on the individual-tree crown architecture and tree productivity in a mature mixed forest in northern Germany. Methods: Our analysis considers multiple target tree species across a local neighbourhood species richness gradient ranging from 1 to 4. We applied terrestrial laser scanning to quantify a large number of individual mature trees (N = 920) at very high accuracy. We evaluated two different neighbour inclusion approaches by analysing both a fixed-radius selection procedure and a selection based on overlapping crowns. Results and conclusions: We show that local neighbourhood species diversity significantly increases the crown dimension and wood volume of target trees. Moreover, we found a size-dependency of diversity effects on tree productivity (basal area and wood volume increment), with positive effects for large-sized trees (diameter at breast height (DBH) > 40 cm) and negative effects for small-sized (DBH < 40 cm) trees. In our analysis, the neighbour inclusion approach has a significant impact on the outcome. For scientific studies and the validation of growth models we recommend a neighbour selection by overlapping crowns, because this seems to be the relevant scale at which local neighbourhood interactions occur. Because local neighbourhood diversity promotes individual-tree productivity in mature European mixed-species forests, we conclude that a small-scale species mixture should be considered in management plans.
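A toy sketch of the two neighbour-inclusion approaches compared in the study, on synthetic stem positions, species and crown radii; the laser-scanning crown metrics and productivity models are not reproduced, and all values are illustrative.

```python
# Fixed-radius vs crown-overlap neighbour selection around a target tree.
import numpy as np

rng = np.random.default_rng(7)
n = 200
xy = rng.uniform(0, 100, size=(n, 2))           # stem positions (m)
crown_r = rng.uniform(2, 6, size=n)             # crown radii (m)
species = rng.integers(0, 4, size=n)            # 4 species

def neighbours_fixed_radius(i, radius=10.0):
    d = np.linalg.norm(xy - xy[i], axis=1)
    return np.where((d < radius) & (d > 0))[0]

def neighbours_crown_overlap(i):
    d = np.linalg.norm(xy - xy[i], axis=1)
    return np.where((d < crown_r + crown_r[i]) & (d > 0))[0]

target = 0
for name, nb in [("fixed radius", neighbours_fixed_radius(target)),
                 ("crown overlap", neighbours_crown_overlap(target))]:
    richness = len(np.unique(species[nb]))
    print(f"{name}: {len(nb)} neighbours, species richness {richness}")
```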
In this paper a new continuous variable called the core-ratio is defined to describe the probability for a residue to be in a binding site, thereby replacing the previous binary (0/1) description of interface residues. This allows us to use the support vector machine regression method to fit the core-ratio value and predict protein binding sites. We also design a new group of physical and chemical descriptors to characterize the binding sites. The new descriptors are more effective, with an averaging procedure used. Our tests show that much better prediction results can be obtained by the support vector regression (SVR) method than by the support vector classification method.
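A hedged sketch of the regression step: support vector regression fitted to a continuous core-ratio target. The features and targets are synthetic stand-ins for the paper's physico-chemical descriptors, and the 0.5 cut-off is an arbitrary assumption.

```python
# SVR on a continuous core-ratio target instead of a binary interface label.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(8)
X = rng.normal(size=(500, 12))                      # descriptor vectors per residue
core_ratio = 1 / (1 + np.exp(-X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.3, 500)))

Xtr, Xte, ytr, yte = train_test_split(X, core_ratio, random_state=0)
svr = SVR(kernel="rbf", C=10.0, epsilon=0.05).fit(Xtr, ytr)
pred = svr.predict(Xte)
# residues predicted above a cut-off would be called binding-site residues
print("R^2:", svr.score(Xte, yte), "predicted interface fraction:", (pred > 0.5).mean())
```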
It is an important problem in chaos theory whether an observed irregular signal is deterministic chaotic or stochastic. We propose an efficient method for distinguishing deterministic chaotic from stochastic time series for short scalar time series. We first investigate, as the embedding dimension increases, the changing trend of the distance between two points which stay close in phase space. We then obtain the differences between Gaussian white noise and deterministic chaotic time series under this method. Finally, numerical experiments are presented to verify the validity and robustness of the method. Simulation results indicate that our method can distinguish deterministic chaotic from stochastic time series effectively even when the data are short and contaminated.
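A simplified sketch of the idea: track how the distance between points that are close in a low-dimensional delay embedding changes as the embedding dimension grows, comparing a chaotic series (logistic map) with Gaussian white noise. This is an illustrative variant, not the paper's exact statistic.

```python
# Neighbour-distance growth with embedding dimension: chaos vs white noise.
import numpy as np

def delay_embed(x, m, tau=1):
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(m)])

def neighbour_distance_curve(x, dims=range(2, 11)):
    base = delay_embed(x, 2)
    d = np.linalg.norm(base[:, None] - base[None, :], axis=2)
    np.fill_diagonal(d, np.inf)
    nn = np.argmin(d, axis=1)                 # nearest neighbour in the 2-D embedding
    curve = []
    for m in dims:
        emb = delay_embed(x, m)
        k = len(emb)
        keep = nn[:k] < k                     # keep pairs where both points still exist
        dist = np.linalg.norm(emb[np.arange(k)[keep]] - emb[nn[:k][keep]], axis=1)
        curve.append(dist.mean())
    return curve

rng = np.random.default_rng(9)
logistic = np.empty(1200); logistic[0] = 0.4
for i in range(1, 1200):
    logistic[i] = 4.0 * logistic[i - 1] * (1 - logistic[i - 1])
noise = rng.normal(size=1000)

print("chaos:", np.round(neighbour_distance_curve(logistic[200:]), 3))
print("noise:", np.round(neighbour_distance_curve(noise), 3))
```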
Using a tight-binding transfer matrix method, we calculate the complex band structure of armchair graphene nanoribbons. The real part of the complex band structure calculated by the transfer matrix method fits well with the bulk band structure calculated with a Hermitian matrix. The complex band structure gives extra information on the carriers' decay behaviour. The imaginary loop connects the conduction and valence bands, and can profoundly affect the characteristics of nanoscale electronic devices made with graphene nanoribbons. In this work, the complex band structure calculation includes not only the first-nearest-neighbour interaction, but also the effects of edge bond relaxation and the third-nearest-neighbour interaction. The band gap is classified into three classes. Due to the edge bond relaxation and the third-nearest-neighbour interaction term, a band gap opens for N = 3M − 1. The band gap is almost unchanged for N = 3M + 1, but decreased for N = 3M. The maximum imaginary wave vector length provides additional information about the electrical characteristics of graphene nanoribbons, and is also classified into three classes.
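As a much-simplified illustration of the transfer-matrix idea (for a 1-D nearest-neighbour tight-binding chain, not an armchair nanoribbon): the eigenvalues of the transfer matrix are exp(±ika), so energies outside the band yield a complex k whose imaginary part is the decay constant. Hopping strength and lattice constant are arbitrary.

```python
# Complex band structure of a 1-D tight-binding chain via its transfer matrix.
import numpy as np

t = 1.0          # hopping (eV), on-site energy set to zero
a = 1.0          # lattice constant
for E in [0.0, 1.0, 1.9, 2.5, 3.0]:
    # Schrodinger equation:  psi_{n+1} = (E/t) psi_n - psi_{n-1}
    T = np.array([[E / t, -1.0], [1.0, 0.0]])
    lam = np.linalg.eigvals(T)                       # lam = exp(i k a)
    k = np.log(lam.astype(complex)) / (1j * a)       # complex wave vector
    kappa = np.abs(k.imag).max()                     # decay constant inside the gap
    print(f"E = {E:4.1f} eV  ->  max |Im(k)| = {kappa:.3f} / a")
```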
By using the density-matrix renormalization-group method to solve the different spin-spin correlation functions, the nearest-neighbouring entanglement (NNE) and the next-nearest-neighbouring entanglement (NNNE) of a one-dimensional alternating Heisenberg XY spin chain are investigated in the presence of alternating nearest-neighbouring exchange couplings, external magnetic fields and a next-nearest-neighbouring interaction. For a dimerised ferromagnetic spin chain, the NNNE appears only above a critical dimerized interaction; meanwhile, the dimerized interaction affects the quantum phase transition point and improves the NNNE to a large extent. We also study the effect of a ferromagnetic or antiferromagnetic next-nearest-neighbouring (NNN) interaction on the dynamics of the NNE and NNNE. The ferromagnetic NNN interaction increases and shrinks the NNE below and above a critical frustrated interaction, respectively, while the antiferromagnetic NNN interaction always reduces the NNE. The antiferromagnetic NNN interaction results in a larger value of NNNE compared with the case where the NNN interaction is ferromagnetic.
Forest health is currently assessed in Europe (ICP Forests monitoring program). Crown defoliation and dieback, tree mortality, and pathogenic damage are the main aspects considered in tree health assessment. The worsening of environmental conditions (i.e., the increase of temperature and drought events) may cause large-spatial-scale tree mortality and forest decline. However, the role of stand features, including tree species assemblage and diversity, as factors that modify environmental impacts is poorly considered. The present contribution reanalyses the historical dataset of crown conditions in Italian forests from 1997 to 2014 to identify ecological and structural factors that influence tree crown defoliation, highlighting in a special manner the role of tree diversity. The effects of tree diversity were explored using the entire data set through multivariate cluster analyses and on individual trees, analysing the influence of neighbouring tree diversity and identity at the local (neighbour) level. Preliminary results suggest that each tree species shows a specific behaviour in relation to crown defoliation, and the distribution of crown defoliation across Italian forests reflects the distribution of the main forest types and their ecological equilibrium with the environment. The potential and the problems connected to the possible extension of this analysis to a more general level (European and North American) are discussed.