Recently, dealing with non-Euclidean data and its characterization has been considered a major issue by researchers. The first problem arises in defining the distinction between Euclidean and non-Euclidean geometry with examples. The second problem arises in dealing with non-Euclidean geometry in true, false, and uncertain regions. The third problem arises in investigating patterns in non-Euclidean data sets. This paper focuses on tackling these issues with real-life examples in data processing, data visualization, knowledge representation, and quantum computing.
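The Euclidean/non-Euclidean distinction discussed above can be made concrete with a small sketch (an illustration, not taken from the paper): on a sphere, the intrinsic geodesic (great-circle) distance between two points always exceeds the Euclidean straight-line chord through the interior, so treating spherical data with a Euclidean metric systematically underestimates distances.

```python
import math

def chord_distance(p, q):
    """Euclidean straight-line distance between two points in R^3."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def great_circle_distance(p, q, radius=1.0):
    """Geodesic distance along a sphere (a non-Euclidean metric)."""
    dot = sum(a * b for a, b in zip(p, q)) / radius ** 2
    dot = max(-1.0, min(1.0, dot))  # clamp for floating-point safety
    return radius * math.acos(dot)

p = (1.0, 0.0, 0.0)
q = (0.0, 1.0, 0.0)
print(chord_distance(p, q))         # straight-line chord through the sphere
print(great_circle_distance(p, q))  # arc along the surface, always longer
```

For these two points on the unit sphere the chord is sqrt(2) while the geodesic is pi/2, illustrating why a non-Euclidean data set needs its own distance characterization.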
The relation between the circumradius and the inradius of an n-dimensional simplex in E^n is studied. Two new generalizations of the Euler inequality for the n-dimensional simplex are established. In addition, we obtain generalizations of the Euler inequality for the n-dimensional simplex that are stronger than previously known results.
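For reference, the classical result this abstract generalizes is the Euler inequality for an n-dimensional simplex in E^n with circumradius R and inradius r:

```latex
R \ge n\,r ,
```

with equality exactly for the regular simplex; the case n = 2 recovers Euler's triangle inequality R \ge 2r.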
In this letter a new skeletonization algorithm is proposed. It combines fast construction of Euclidean Distance Maps (EDMs), ridge extraction, the Hit-or-Miss Transformation (HMT) of structuring elements, and set operators. It first produces the EDM image in no more than 4 passes through an image of any kind; the ridge image is then extracted by applying a turn-on scheme and performing a rain-fall elimination to accelerate the processing. The one-pixel-wide skeleton is finally obtained by carrying out the HMTs of two structuring elements and the SUBTRACT and OR operations. Experimental results from practical applications are also presented.
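The pass-based construction of a distance map can be sketched with a minimal two-pass city-block distance transform (a simplified stand-in: the letter's algorithm computes the true Euclidean distance in at most 4 passes, which this sketch does not reproduce):

```python
def distance_map(img):
    """Two-pass city-block distance transform of a binary image.

    img: 2-D list of 0/1; foreground pixels (1) get distance 0 and
    background pixels get the distance to the nearest foreground pixel.
    """
    INF = 10 ** 9
    rows, cols = len(img), len(img[0])
    d = [[0 if img[r][c] else INF for c in range(cols)] for r in range(rows)]
    # Forward raster pass: propagate distances from the top-left.
    for r in range(rows):
        for c in range(cols):
            if r > 0:
                d[r][c] = min(d[r][c], d[r - 1][c] + 1)
            if c > 0:
                d[r][c] = min(d[r][c], d[r][c - 1] + 1)
    # Backward raster pass: propagate distances from the bottom-right.
    for r in range(rows - 1, -1, -1):
        for c in range(cols - 1, -1, -1):
            if r < rows - 1:
                d[r][c] = min(d[r][c], d[r + 1][c] + 1)
            if c < cols - 1:
                d[r][c] = min(d[r][c], d[r][c + 1] + 1)
    return d

img = [[0, 0, 0],
       [0, 1, 0],
       [0, 0, 0]]
dm = distance_map(img)  # center 0, edge neighbours 1, corners 2
```

Ridge pixels of such a map (local maxima of the distance values inside the object) are what the letter's turn-on scheme and rain-fall elimination extract as the skeleton candidate set.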
A new method for constructing Quasi-Cyclic (QC) Low-Density Parity-Check (LDPC) codes based on Euclidean Geometry (EG) is presented. The proposed method yields a class of QC-LDPC codes with girth of at least 6, and the designed codes perform very close to the Shannon limit under iterative decoding. Simulations show that the designed QC-LDPC codes have almost the same performance as existing EG-LDPC codes.
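The quasi-cyclic structure itself can be sketched generically: a QC-LDPC parity-check matrix is assembled from circulant permutation blocks selected by a base matrix of shift values. The base matrix below is hypothetical; the EG-based shift selection that gives the paper's girth guarantee is not reproduced here.

```python
def circulant_permutation(size, shift):
    """size x size identity matrix cyclically right-shifted by `shift`."""
    return [[1 if c == (r + shift) % size else 0 for c in range(size)]
            for r in range(size)]

def qc_ldpc_parity_matrix(shifts, size):
    """Assemble a QC-LDPC parity-check matrix from a base matrix of shifts.

    shifts: 2-D list; an entry s >= 0 places a circulant permutation
    shifted by s, and an entry of -1 places an all-zero block.
    """
    H = []
    for shift_row in shifts:
        blocks = [circulant_permutation(size, s) if s >= 0
                  else [[0] * size for _ in range(size)]
                  for s in shift_row]
        for r in range(size):
            H.append([v for b in blocks for v in b[r]])
    return H

# Hypothetical 2x3 base matrix with circulant size 4 -> an 8x12 sparse H.
H = qc_ldpc_parity_matrix([[0, 1, 2], [3, -1, 1]], 4)
```

Because every block is a shifted identity, the code is fully described by the small base matrix, which is what makes QC-LDPC encoders and decoders hardware-friendly.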
In this paper, source localization from the measurements of a single electromagnetic (EM) vector-sensor is investigated in the framework of the geometric algebra of Euclidean 3-space (G3). To describe the orthogonality between the electric and magnetic measurements, two multivectors of G3 are used to model the outputs of a spatially collocated EM vector-sensor. Two estimators of the wave propagation vector are then formulated via the inner product between a vector and a bivector in G3. Since the two estimators use different information, a weighted inner product estimator is proposed to fuse them in the minimum mean square error (MMSE) sense. Analytical results show that the statistical performance of the weighted inner product estimator is always better than that of its traditional cross product counterpart. The efficacy of the weighted inner product estimator and the correctness of the analytical predictions are demonstrated by simulation results.
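The MMSE weighting idea behind fusing two estimators can be illustrated with a generic inverse-variance combination of two independent unbiased estimates (a textbook sketch; the paper's actual estimators live in G3 and are not reproduced here):

```python
def mmse_fuse(est1, var1, est2, var2):
    """Fuse two independent unbiased estimates by inverse-variance
    weighting, the linear combination that minimizes mean square error."""
    w1 = (1.0 / var1) / (1.0 / var1 + 1.0 / var2)
    w2 = 1.0 - w1
    fused = w1 * est1 + w2 * est2
    fused_var = 1.0 / (1.0 / var1 + 1.0 / var2)  # below min(var1, var2)
    return fused, fused_var
```

The fused variance is always smaller than either input variance, which mirrors the paper's analytical finding that the weighted estimator outperforms either single-source estimator.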
The Euclidean Steiner minimum tree problem is a classical NP-hard combinatorial optimization problem. Because of this intrinsic hardness, no efficient algorithm is known that solves the problem exactly, and its extensive real-world applications make good heuristics important. The stochastic diffusion search algorithm is a recent population-based algorithm whose operating mechanism differs considerably from that of ordinary intelligent algorithms, which gives it an advantage on some optimization problems. This paper studies the stochastic diffusion search algorithm carefully and designs a cellular automata stochastic diffusion search algorithm of low time complexity for the Euclidean Steiner minimum tree problem. Experimental results show that the proposed algorithm finds satisfactory results in a short time even for large-scale instances, whereas exact algorithms need several hours.
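The test/diffusion mechanism of stochastic diffusion search can be sketched on a toy discrete search space (a generic illustration under the standard passive-recruitment scheme, not the paper's cellular automata variant for Steiner trees):

```python
import random

def stochastic_diffusion_search(fitness, n_agents=50, iters=100, seed=0):
    """Minimal stochastic diffusion search over a discrete search space.

    fitness: list of values in [0, 1]; a partial test of hypothesis h
    succeeds with probability fitness[h].  Returns the hypothesis held
    by the largest cluster of agents after `iters` test/diffusion cycles.
    """
    rng = random.Random(seed)
    n = len(fitness)
    hyp = [rng.randrange(n) for _ in range(n_agents)]
    active = [False] * n_agents
    for _ in range(iters):
        # Test phase: a cheap randomized partial evaluation per agent.
        for i in range(n_agents):
            active[i] = rng.random() < fitness[hyp[i]]
        # Diffusion phase: each inactive agent polls a random agent and
        # copies its hypothesis if that agent is active, else re-samples.
        for i in range(n_agents):
            if not active[i]:
                j = rng.randrange(n_agents)
                hyp[i] = hyp[j] if active[j] else rng.randrange(n)
    return max(set(hyp), key=hyp.count)

scores = [0.1, 0.2, 0.9, 0.3, 0.4]
best = stochastic_diffusion_search(scores)
```

Good hypotheses survive more tests and recruit more agents, so the population rapidly clusters on the best-scoring candidate without any agent ever evaluating a hypothesis in full; that cheap partial evaluation is what keeps the per-iteration cost low.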
Funding: the National Natural Science Foundation of China (61305038, 61273249, 61502282); the Special Fund for Marine Scientific Research in the Public Interest (201505002); the Key Laboratory of Autonomous Systems and Networked Control, Ministry of Education; the Guangdong Provincial Key Laboratory of Biomedical Engineering; the National Engineering Research Center for Tissue Restoration and Reconstruction
Foundation item: Supported by the National Science Foundation of China (60671051) and the Foundation of Anhui Higher School (KJ2009A45)
Funding: Supported by the National Key Basic Research Program (973) Project (No. 2010CB328300) and the 111 Project (No. B08038)
Funding: the National Natural Science Foundation of China (61171127) and the National Basic Research Program of China (2011CB302903)
Funding: the National Natural Science Foundation of China (No. 70871081), the Science and Technology Department Research Project of Henan Province (No. 112102310448), and the Natural Science Foundation of Henan University (No. 2010YBZR047)