Blasting in geological bodies is an industrial process that operates in an environment characterized by high uncertainties (natural joints, faults, voids, abrupt structural changes), which are transposed into the process parameters (e.g. energy transfer to the rock mass, hole deviations, misfires, vibrations, fly-rock). Approaching this problem by searching for the "optimum" result can be ineffective: the geological environment carries too many uncertainties for a single "optimum" to suit different applications. Searching instead for "robustness" in a blast design is far more effective. Robustness is the capability of the system to behave consistently under varying conditions, without producing unexpected results. Since geology varies from site to site, a robust method can deliver better results in varying environments, lowering costs and increasing benefits and safety. Complexity Analysis (C.A.) is an innovative approach to systems: it allows analyzing the complexity of the blast system and the criticality of each variable (drilling, charging and initiation parameters). The lower the complexity, the more robust the system, and the lower the chance of unexpected results. The paper presents the results obtained with the C.A. approach in an underground gypsum quarry (Italy), exploited by the conventional room-and-pillar method with drilling and blasting. The application of C.A. led to a reliable way to reduce the charge per delay, hence reducing the impact of ground vibration on the surrounding structures. The analysis of the degree of correlation between the variables also allowed empirical laws to be recognized.
The pooling problem, also called the blending problem, is fundamental in petroleum production planning. It can be formulated as an optimization problem similar to the minimum-cost flow problem. However, Alfaki and Haugland (J Glob Optim 56:897–916, 2013) proved the strong NP-hardness of the pooling problem in the general case. They also pointed out that determining the computational complexity of the pooling problem with a fixed number of qualities was an open problem. In this paper, we prove that the pooling problem is still strongly NP-hard even with only one quality. This means the quality attribute is an essential difference between the minimum-cost flow problem and the pooling problem. For solving large-scale pooling problems in real applications, we adopt a non-monotone strategy to improve the traditional successive linear programming method. Global convergence of the algorithm is established. The numerical experiments show that the non-monotone strategy effectively pushes the algorithm to explore the global minimizer or provide a good local minimizer. Our results for real problems from factories show that the proposed algorithm is competitive with the one embedded in the well-known commercial software Aspen PIMS.
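The quality attribute that drives this hardness result can be seen in a one-line model: the outlet quality of a pool is the flow-weighted average of its inlet qualities, so flow-times-quality products make the constraints bilinear rather than linear, unlike a pure minimum-cost flow. A minimal sketch with hypothetical numbers:

```python
# Sketch: outlet quality of a single pool as a flow-weighted average of
# inlet qualities. The flows and sulfur contents below are illustrative,
# not data from the paper.
def pool_quality(inflows, qualities):
    # Bilinear in (inflows, qualities): the source of nonconvexity.
    total = sum(inflows)
    return sum(f * q for f, q in zip(inflows, qualities)) / total

# Two feed streams, e.g. sulfur content 0.5% and 2.1%, blended 30:10.
q = pool_quality([30.0, 10.0], [0.5, 2.1])
print(q)  # 0.9
```

When the inflows themselves are decision variables, this averaging constraint couples them nonlinearly with the quality variables, which is exactly what a minimum-cost flow formulation cannot express.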
To resolve the ontology understanding problem, the structural features and potential important terms of a large-scale ontology are investigated from the perspective of complex network analysis. Through empirical studies of the gene ontology from various perspectives, this paper shows that the whole gene ontology displays the same topological features as complex networks, including "small world" and "scale-free", while some sub-ontologies have the "scale-free" property but no "small world" effect. The potential important terms in an ontology are discovered by several well-known complex network centralization methods. An evaluation method based on information retrieval in MEDLINE is designed to measure the effectiveness of the discovered important terms. According to the relevant literature of the gene ontology terms, the suitability of these centralization methods for discovering important ontology concepts is quantitatively evaluated. The experimental results indicate that betweenness centrality is the most appropriate method among all the evaluated centralization measures.
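The winning measure, betweenness centrality, scores a term by how many shortest paths between other terms pass through it. A self-contained sketch on a toy term graph (the graph and term names are hypothetical, not taken from the gene ontology):

```python
# Sketch: ranking terms of a toy ontology graph by betweenness centrality,
# the measure the study found most effective for important-term discovery.
from collections import deque, defaultdict

def betweenness(adj):
    # Brandes' algorithm for unweighted, undirected betweenness (unnormalized).
    bc = dict.fromkeys(adj, 0.0)
    for s in adj:
        dist = {s: 0}
        sigma = defaultdict(float); sigma[s] = 1.0   # shortest-path counts
        preds = defaultdict(list)
        order, q = [], deque([s])
        while q:
            v = q.popleft(); order.append(v)
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1; q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]; preds[w].append(v)
        delta = dict.fromkeys(adj, 0.0)              # dependency accumulation
        for w in reversed(order):
            for v in preds[w]:
                delta[v] += sigma[v] / sigma[w] * (1.0 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return {v: c / 2.0 for v, c in bc.items()}       # undirected: halve

edges = [("process", "metabolism"), ("process", "signaling"),
         ("metabolism", "glycolysis"), ("metabolism", "lipid_metabolism"),
         ("signaling", "kinase_activity"), ("glycolysis", "kinase_activity")]
adj = defaultdict(list)
for u, v in edges:
    adj[u].append(v); adj[v].append(u)

bc = betweenness(dict(adj))
top = max(bc, key=bc.get)
print(top)  # prints "metabolism", the term on most shortest paths
```

Terms that bridge otherwise distant parts of the ontology come out on top, which matches the intuition behind using centrality for concept discovery.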
The effects of the colorimetric buffer solutions were investigated while the two colorimetric reactions of the Al-ferron complex and the Fe-ferron complex occurred individually, and the effects of the testing wavelength and the pH of the solutions were also investigated. A timed complexation colorimetric analysis method for Al-Fe-ferron, in view of the total concentration of {Al + Fe}, was then established to determine the species distribution of polymeric Al-Fe. The recommended testing wavelength was 362 nm and the testing pH value was 5. With a comparison of the n(Al)/n(Fe) ratios, the standard absorption curves of the polymeric Al-Fe solutions were derived from the experimental results. Furthermore, the solutions' compositions were varied in both the molar n(Al)/n(Fe) ratios, i.e. 0/0, 5/5, 9/1 and 0/10, and the total (Al + Fe) concentrations, which ranged from 10^-5 to 10^-4 mol/L.
Air traffic complexity is an objective metric for evaluating the operational condition of airspace, with several applications such as airspace design and traffic flow management. Identifying a reliable method to accurately measure traffic complexity is therefore important. Considering that many factors correlate with traffic complexity in complicated nonlinear ways, researchers have proposed several complexity evaluation methods based on machine learning models trained with large samples. However, the high cost of sample collection usually results in a limited training set. In this paper, an ensemble learning model is proposed for measuring air traffic complexity within a sector based on small samples. To exploit the classification information within each factor, multiple diverse factor subsets (FSSs) are generated under guidance from factor noise and independence analysis. Then, a base complexity evaluator is built for each FSS. The final complexity evaluation result is obtained by integrating all results from the base evaluators. Experimental studies using real-world air traffic operation data demonstrate the advantages of our model for small-sample-based traffic complexity evaluation over other state-of-the-art methods.
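The ensemble idea, one simple base evaluator per factor subset, combined by vote, can be sketched in a few lines. The toy samples, factor subsets, and the 1-nearest-neighbour base learner below are illustrative stand-ins, not the authors' setup:

```python
# Hedged sketch: each base evaluator sees only its factor subset (FSS);
# the final complexity label is a majority vote over the base evaluators.
def nn_predict(train, labels, subset, x):
    # 1-nearest-neighbour restricted to the features in `subset`.
    def sqdist(a, b):
        return sum((a[i] - b[i]) ** 2 for i in subset)
    best = min(range(len(train)), key=lambda j: sqdist(train[j], x))
    return labels[best]

def ensemble_predict(train, labels, subsets, x):
    votes = [nn_predict(train, labels, s, x) for s in subsets]
    return max(set(votes), key=votes.count)   # majority vote

# Toy sector samples: (aircraft count, climb events, conflicts) -> 0/1 label.
train = [(3, 1, 0), (4, 0, 1), (12, 5, 4), (11, 6, 3)]
labels = [0, 0, 1, 1]
subsets = [(0,), (1, 2), (0, 2)]              # three diverse factor subsets
print(ensemble_predict(train, labels, subsets, (10, 4, 3)))  # prints 1
```

With few samples, each weak evaluator overfits differently; voting across diverse subsets is what makes the small-sample regime workable.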
Background: Clinical studies on acupuncture treatment of hyperplasia of the mammary gland (HMG) have proved its effectiveness, but most studies have paid little attention to acupoint prescription and acupoint compatibility. Clinical prescriptions are not identical, and their curative effects also differ. Therefore, through data mining and network analysis, this study explored the core acupoints and the compatibility rules of acupoints in acupuncture treatment of HMG. Methods: Relevant clinical research literature on acupuncture treatment of HMG was searched and selected according to inclusion and exclusion criteria in CNKI, the VIP database, the WanFang database, PubMed, etc. Relevant information was then extracted to establish a database. Using statistical and complex network analysis, this paper studies the core acupoints and the rules of acupoint compatibility. Results: A total of 104 Chinese articles and no English articles were included, and 106 acupuncture prescriptions were extracted. The core acupoints in the treatment of HMG are Danzhong (CV 17), Wuyi (ST 15), Zusanli (ST 36) and Jianjing (GB 21). Danzhong (CV 17) and Zusanli (ST 36), Danzhong (CV 17) and Wuyi (ST 15), Jianjing (GB 21) and Tianzong (SI 11), and Jianjing (GB 21) and Wuyi (ST 15) have the highest correlation degree. The acupoint-matching methods mainly consist of local-remote, upper-lower and front-rear acupoint combinations. Conclusion: The results of the network analysis substantially accord with the general rules of acupuncture theories in traditional Chinese medicine, reflect the point-selection principles and features in acupuncture treatment of HMG, and provide evidence for acupoint selection in the clinical acupuncture treatment of HMG.
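The network-analysis step rests on pair co-occurrence counting: how often two acupoints appear in the same prescription. A minimal sketch with made-up prescriptions (not the 106 extracted from the literature):

```python
# Sketch of the co-occurrence counting behind the acupoint network: edge
# weight = number of prescriptions containing both points.
from collections import Counter
from itertools import combinations

prescriptions = [               # illustrative prescriptions only
    {"CV17", "ST36", "ST15"},
    {"CV17", "ST36", "GB21"},
    {"CV17", "ST15", "GB21", "SI11"},
    {"CV17", "ST36"},
]
pairs = Counter(frozenset(p) for rx in prescriptions
                for p in combinations(sorted(rx), 2))
top_pair, count = pairs.most_common(1)[0]
print(sorted(top_pair), count)  # the strongest acupoint pairing
```

The resulting weighted pair counts are what a complex-network toolkit then treats as edges when computing correlation degree and communities.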
Objective The Yangtze craton experienced Paleoproterozoic collisional orogeny at ca. 1.95-2.0 Ga and post-orogenic extensional events at ca. 1.85 Ga, related to the amalgamation of the Columbia (Nuna) supercontinent (Zhao and Cawood, 2012). A ca. 2.15 Ga suprasubduction-zone ophiolitic melange was recognized in the Archean-Paleoproterozoic Kongling Complex of the northern Yangtze craton (Han et al., 2017). However, the tectonic evolution in the early Paleoproterozoic, from 2.4 Ga to 2.2 Ga, remains unclear. We report here the presence of a suite of Paleoproterozoic (2.2 Ga) granites in the Huangling dome, northern Yangtze craton, which may provide important insights into crustal growth processes in the craton prior to the assembly of Columbia.
Objective The 2014 Ludian Mw 6.1 earthquake in Yunnan occurred in a mountainous area with complex tectonics and topography, and caused serious damage as well as co-seismic landslides of unusually large scale. Because the suspected seismogenic faults on the surface, the distribution of aftershocks and the focal mechanism solutions are not consistent, it remains difficult to determine the real causal fault or seismogenic structure for this event. This inconsistency may reflect the complexity of the seismic source at depth. In addition, the distribution of the co-seismic landslides also exhibits some diffusion that differs from general cases, likely associated with the structure of the seismic focus.
In this paper, the variability characteristics of the global field of sea surface temperature (SST) anomaly are studied by complex principal component (c.p.c.) analysis, whose results are also compared with those of real p.c. analysis. The data consist of 40 years of global SST monthly averages over latitudes from 42.5°S to 67.5°N. In the spatial domain, the distribution of the first complex loading amplitude is characterized by three areas of large values: the first in the eastern and central equatorial Pacific Ocean, the second in the northern tropical Indian Ocean and the South China Sea, and the third in the northern Pacific Ocean. As explained in the paper, this pattern may be considered representative of the El Niño mode. The first complex loading phase pattern shows a stationary wave in the Pacific (also revealed by real p.c. analysis) superimposed on an oscillating disturbance propagating from the Pacific to the Indian Ocean or the opposite way. A subsequent correlation analysis among different spatial points reveals disturbances actually propagating westward from the Pacific to the Indian Ocean, which could therefore represent reflected Rossby waves, i.e. the west phase of the signals that propagate disturbances of the thermal structure in the tropical Pacific Ocean. In the time domain, a relation between the trend of the first complex principal component and the ENSO cycle is also established.
Methodological problems of climatic reconstruction for different periods of the Holocene are discussed on the basis of a multiple-group biological analysis of peat-sapropel sediments. The possibilities of biological analysis are exemplified by paleoclimatic reconstructions for the Carpathian and Altai mountain ranges. For the "Skolevsky Beskidy" national park in the Carpathians, paleoclimatic scenarios have been drawn up aiming at a more precise definition of the climatic conditions during the period of mass mountain-slope terracing. The stability of terrace systems of various designs under current climatic conditions has been assessed. It is shown that during periods of humid climate, terraces whose designs focused on drainage were built; in periods of dry and warm climate, terrace systems capable of accumulating water were built. Both these types of terrace systems are being destroyed nowadays. Only those terrace systems are stable which were adjusted by their builders to contrasting variations of precipitation. For the Western Altai, a paleoclimatic scenario has been developed to forecast the preservation of the Bronze Age kurgans (burial earth mounds) with permafrost inside the construction. In the Altai region during the Holocene, two periods of sharp cooling were revealed, the peaks of which occurred in the intervals 4500-4300 and 2500-2300 years ago, together with two periods of pronounced climatic drying 4900-4700 and 130-70 years ago. Depletion of the algae composition in the layer corresponding to the last period of climatic drying indicates a very sharp change in moisture parameters and the turning of the lake into a dry swamp.
Periods of cold weather may have contributed to the formation of the special ritual traditions of the Sakan tribes that required frozen ground to bury the dead. The later climate fluctuations identified have not affected the preservation of permafrost in burial mounds constructed in the 5th-3rd centuries BC.
With the vigorous expansion of nonlinear adaptive filtering with real-valued kernel functions, counterpart complex kernel adaptive filtering algorithms have also been proposed to solve the complex-valued nonlinear problems arising in almost all real-world applications. This paper first presents two schemes of complex Gaussian kernel-based adaptive filtering algorithms to illustrate their respective characteristics. Then the theoretical convergence behavior of the complex Gaussian kernel least mean square (LMS) algorithm is studied using the fixed dictionary strategy. The simulation results demonstrate that the theoretical curves predicted by the derived analytical models consistently coincide with the Monte Carlo simulation results, in both transient and steady-state stages, for the two introduced complex Gaussian kernel LMS algorithms using non-circular complex data. The analytical models can be regarded as a theoretical tool for evaluating and comparing the mean square error (MSE) performance of complex kernel LMS (KLMS) methods according to the specified kernel bandwidth and dictionary length.
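The basic recursion behind kernel LMS on complex data is short: predict with a kernel expansion over stored centers, then store the current input with a coefficient proportional to the complex error. This is a minimal sketch in the spirit of the complexified real-Gaussian-kernel scheme (one of the two families the paper discusses); the step size, bandwidth, and toy nonlinear target are illustrative assumptions, not the paper's setup:

```python
# Minimal kernel-LMS sketch on complex-valued data with a growing dictionary.
import math

def gauss(x, y, sigma=1.0):
    # Real Gaussian kernel of the complex difference.
    return math.exp(-abs(x - y) ** 2 / sigma ** 2)

def cklms(xs, ds, mu=0.5):
    centers, alphas, errs = [], [], []
    for x, d in zip(xs, ds):
        y = sum(a * gauss(c, x) for a, c in zip(alphas, centers))
        e = d - y                      # complex a-priori error
        errs.append(abs(e))
        centers.append(x)              # growing dictionary
        alphas.append(mu * e)          # complex coefficient update
    return errs

# Non-circular complex input and a nonlinear complex target d = x^2.
xs = [complex(math.cos(0.3 * k), 0.6 * math.sin(0.2 * k)) for k in range(300)]
ds = [x * x for x in xs]
errs = cklms(xs, ds)
print(sum(errs[:20]) / 20, sum(errs[-20:]) / 20)  # error shrinks with training
```

The paper's analytical models predict exactly this transient and steady-state error behavior, here only observed empirically on a toy signal.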
Today's emergence of nano-micro hybrid structures with almost biological complexity is of fundamental interest. Our ability to adapt intelligently to the challenges has ramifications all the way from fundamentally changing research itself, through applications critical to future survival, to posing globally existential dangers. Touching on specific issues such as how complexity relates to the catalytic prowess of multi-metal compounds, we also discuss the increasingly urgent issues in nanotechnology very generally, guided by the motto 'Bio Is Nature's Nanotech'. Technology belongs to macro-evolution; for example, integration with artificial intelligence (AI) is inevitable. Darwinian adaptation manifests as integration of complexity, and awareness of this helps in developing adaptable research methods that can find use across a wide range of research. The second half of this work reviews a diverse range of projects which all benefited from 'playful' programming aimed at dealing with complexity. The main purpose of reviewing them is to show how such projects benefit from and fit in with the general, philosophical approach, proving the relevance of the 'big picture' where it is usually disregarded.
This study addresses the critical need for efficient routing in Mobile Ad Hoc Networks (MANETs), whose dynamic topologies pose great challenges because of node mobility. The main objective was to examine and refine the application of Dijkstra's algorithm in this context, a method conventionally esteemed for its efficiency in static networks. The paper carries out a comparative theoretical analysis with the Bellman-Ford algorithm, considering adaptation to the dynamic network conditions typical of MANETs. Detailed algorithmic analysis shows that Dijkstra's algorithm, when adapted for dynamic updates, yields a workable solution to the problem of real-time routing in MANETs. The results indicate that with these changes, Dijkstra's algorithm performs much better computationally, and 30% better in routing optimization, than Bellman-Ford when working with sparse network configurations. The adapted theoretical framework, with Dijkstra's algorithm modified for dynamically changing network topologies, is novel in this work and quite different from traditional applications. The adaptation offers more efficient routing and less computational overhead, well suited to the resource-limited environment of MANETs. From these findings, one may conclude that the proposed version of Dijkstra's algorithm is the most feasible choice of routing protocol for MANETs, given all pertinent key performance and resource consumption indicators, and that the proposed method offers a marked improvement over traditional methods. This paper therefore operationalizes the theoretical model into practical scenarios and invites further research with empirical simulations to understand more about its operational effectiveness.
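The core of any such scheme is standard Dijkstra plus a recompute step when mobility changes the topology. A self-contained sketch (the recompute-on-link-break policy and the toy network are illustrative, not the paper's adapted algorithm):

```python
# Sketch: Dijkstra over a weighted adjacency map, re-run after a link breaks.
import heapq

def dijkstra(adj, src):
    """Shortest-path distances from src in a graph given as {u: {v: weight}}."""
    dist = {src: 0}
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                      # stale queue entry
        for v, w in adj[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

# Toy MANET: routes are recomputed when a link breaks due to node mobility.
adj = {
    "A": {"B": 1, "C": 4},
    "B": {"A": 1, "C": 1, "D": 5},
    "C": {"A": 4, "B": 1, "D": 1},
    "D": {"B": 5, "C": 1},
}
before = dijkstra(adj, "A")["D"]          # A-B-C-D = 3
del adj["B"]["C"]; del adj["C"]["B"]      # link B-C breaks
after = dijkstra(adj, "A")["D"]           # reroutes via A-C-D = 5
print(before, after)
```

A full-from-scratch recompute like this is the naive baseline; the point of an adapted variant is to update routes incrementally instead, cutting the per-change cost in sparse topologies.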
As a global strategic reserve resource, rare earth has been widely used in important industries such as military equipment and biomedicine. However, existing analyses based solely on the total volume of rare earth trade fail to uncover the underlying competition and dependency dynamics. To address this gap, this paper employs the principles of trade preference and import similarity to construct dependency and competition networks. Complex network analysis is then employed to study the evolution of the global rare earth trade network from 2002 to 2018. The main conclusions are as follows. The global rare earth trade follows the Pareto principle, and the trade network shows a scale-free distribution. China has been the world's largest importer and exporter of rare earth since 2017. In the dependency network, China has been the most dependent country since 2006. The result of community division shows that China has separated from the American community and formed new communities with the Association of Southeast Asian Nations (ASEAN) countries. The United States of America has formed a super-strong community with European and Asian countries. In the competition network, the distribution of competition intensity follows a scale-free distribution. Most countries face low-intensity competition, but the number of competing countries is large. Competition involving China has increased significantly. Lastly, the competition source for the United States has shifted from Mexico to China, making China, the USA and Japan the core participants in the competition network.
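The import-similarity principle behind the competition network can be illustrated with a similarity score over import portfolios: two importers compete more strongly the more alike their sourcing patterns are. Country names, supplier columns, and volumes below are hypothetical; cosine similarity stands in for whatever similarity measure the paper actually uses:

```python
# Sketch: competition intensity as similarity between import vectors.
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

baskets = {                 # tonnes imported from suppliers (S1, S2, S3)
    "USA":   [100, 20, 5],
    "Japan": [90, 25, 10],
    "EU":    [10, 80, 60],
}
# Similar portfolios -> strong competition edge; dissimilar -> weak edge.
print(round(cosine(baskets["USA"], baskets["Japan"]), 3))
print(round(cosine(baskets["USA"], baskets["EU"]), 3))
```

Thresholding or weighting these pairwise scores yields the competition network on which the scale-free intensity distribution is then measured.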
Heart monitoring improves quality of life. Electrocardiograms (ECGs or EKGs) detect heart irregularities. Machine learning algorithms enable several ECG diagnosis processing methods. The first method uses raw ECG and time-series data. The second method classifies the ECG by patient experience. The third technique translates ECG impulses into Q-wave, R-wave and S-wave (QRS) features carrying richer information. Because ECG signals vary naturally between humans and activities, we combine the three feature selection methods to improve classification accuracy and diagnosis. Classification using all three approaches together had not been examined until now. Several researchers have found that machine learning (ML) techniques can improve ECG classification. This study compares popular machine learning techniques for evaluating ECG features. Four algorithms are compared on categorization results: Support Vector Machine (SVM), Decision Tree, Naive Bayes, and Neural Network. SVM plus prior knowledge has the highest accuracy (99%) of the four ML methods. QRS characteristics failed to identify signals without chaos theory. With 99.8% classification accuracy, the Decision Tree technique outperformed all previous experiments.
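The third feature route, turning a raw trace into QRS-style features, starts with R-peak detection and RR intervals. A hedged sketch on a synthetic trace (the signal shape and threshold are illustrative only, not a clinical detector):

```python
# Sketch: crude R-peak detection and RR-interval features from a toy "ECG".
import math

def r_peaks(signal, thresh=0.5):
    # A sample is an R peak if it exceeds the threshold and both neighbours.
    return [i for i in range(1, len(signal) - 1)
            if signal[i] > thresh
            and signal[i] > signal[i - 1] and signal[i] >= signal[i + 1]]

# Synthetic trace: small baseline wave plus a sharp spike every 50 samples.
sig = [0.1 * math.sin(0.2 * i) + (1.0 if i % 50 == 0 else 0.0)
       for i in range(1, 301)]
peaks = r_peaks(sig)
rr = [b - a for a, b in zip(peaks, peaks[1:])]   # RR intervals in samples
print(len(peaks), rr)
```

Features like peak count and RR-interval statistics are then what an SVM or decision tree consumes, instead of the raw time series of the first route.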
Inertial navigation system/visual navigation system (INS/VNS) integrated navigation is a commonly used autonomous navigation method for planetary rovers. Since visual measurements are related to both the previous and current state vectors (position and attitude) of a planetary rover, the performance of the Kalman filter (KF) is challenged by this time-correlation problem. A state augmentation method, which augments the state vector with the previous state value, is commonly used when dealing with this problem. However, augmenting the state dimensions increases the computational load. In this paper, a state-dimension-reduced INS/VNS integrated navigation method based on the coordinates of feature points is presented; it uses the information obtained through INS/VNS integrated navigation at the previous moment to overcome the time-correlation problem and reduce the dimensions of the state vector. Equations of the extended Kalman filter (EKF) are used to demonstrate the equivalence of the calculated results between the proposed method and traditional state-augmented methods. Results of simulation and experimentation indicate that this method has less computational load but similar accuracy when compared with traditional methods.
Based on the ideas of infeasible interior-point methods and predictor-corrector algorithms, two interior-point predictor-corrector algorithms for second-order cone programming (SOCP) are presented. The two algorithms use the Newton direction and the Euler direction as the predictor directions, respectively. The corrector directions belong to the category of Alizadeh-Haeberly-Overton (AHO) directions. These algorithms are suitable for both feasible and infeasible interior iterative points. A simpler neighborhood of the central path for the SOCP is proposed, which is the pivotal difference from other interior-point predictor-corrector algorithms. Under some assumptions, the algorithms possess global, linear, and quadratic convergence. The complexity bound O(r ln(ε0/ε)) is obtained, where r denotes the number of second-order cones in the SOCP problem. The numerical results show that the proposed algorithms are effective.
This paper deals with the problem of uniqueness of meromorphic functions with two deficient values and obtains a result which improves that of F. Gross and Yi Hongxun.
Genetic Algorithms (GAs) are efficient non-gradient stochastic search methods, and Parallel GAs (PGAs) have been proposed to overcome deficiencies of sequential GAs such as low speed and proneness to local convergence. However, the tremendous increase in communication costs that accompanies parallelization stunts further improvement of PGAs. This letter takes the reduction of communication costs as the key to this problem and advances a new Migration Scheme based on the Schema Theorem (MSST). MSST distills schemata from the populations and then proportionately disseminates them to other populations, which decreases the total communication cost among the populations and gives the multiple-population model higher speed and better scalability.
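The schema-distillation step can be sketched simply: bit positions where a clear majority of the population agrees become fixed bits, and the rest become wildcards, so one short schema string is migrated instead of many whole individuals. The population and agreement threshold below are illustrative assumptions, not the MSST specifics:

```python
# Sketch: distilling a schema (fixed bits + '*' wildcards) from a population.
def extract_schema(pop, thresh=0.75):
    n = len(pop)
    out = []
    for column in zip(*pop):                 # one column per bit position
        ones = sum(b == '1' for b in column)
        if ones / n >= thresh:
            out.append('1')
        elif (n - ones) / n >= thresh:
            out.append('0')
        else:
            out.append('*')                  # no consensus: wildcard
    return ''.join(out)

# Migrating this one short string instead of whole individuals cuts traffic.
pop = ["11010", "11000", "11011", "10010"]
print(extract_schema(pop))  # prints "11010"
```

The receiving population can then seed new individuals consistent with the schema, preserving the building blocks the Schema Theorem says selection is propagating.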
Funding (pooling problem paper): This research is supported by the National Natural Science Foundation of China (Nos. 11631013, 71331001, 11331012) and the National 973 Program of China (No. 2015CB856002).
Funding (ontology analysis paper): The National Basic Research Program of China (973 Program) (No. 2005CB321802), the Program for New Century Excellent Talents in University (No. NCET-06-0926), and the National Natural Science Foundation of China (Nos. 60873097, 90612009).
Funding (Al-Fe-ferron paper): The National Natural Science Foundation of China (No. 29677004).
Funding: Co-supported by the State Key Program of the National Natural Science Foundation of China (No. 91538204), the National Science Fund for Distinguished Young Scholars (No. 61425014), and the National Key Technologies R&D Program of China (No. 2015BAG15B01)
Abstract: Air traffic complexity is an objective metric for evaluating the operational condition of the airspace. It has several applications, such as airspace design and traffic flow management. Therefore, identifying a reliable method to accurately measure traffic complexity is important. Considering that many factors correlate with traffic complexity in complicated nonlinear ways, researchers have proposed several complexity evaluation methods based on machine learning models trained with large samples. However, the high cost of sample collection usually results in limited training sets. In this paper, an ensemble learning model is proposed for measuring air traffic complexity within a sector based on small samples. To exploit the classification information within each factor, multiple diverse factor subsets (FSSs) are generated under guidance from factor noise and independence analysis. Then, a base complexity evaluator is built corresponding to each FSS. The final complexity evaluation result is obtained by integrating all results from the base evaluators. Experimental studies using real-world air traffic operation data demonstrate the advantages of our model for small-sample-based traffic complexity evaluation over other state-of-the-art methods.
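The base-evaluator-per-factor-subset design described above can be sketched minimally as follows. The choice of a 1-nearest-neighbour base evaluator, the toy data, and the function names are illustrative assumptions; the paper's actual base learners and FSS generation procedure are not specified here:

```python
from collections import Counter

def one_nn(train, labels, subset, x):
    """1-nearest-neighbour evaluator restricted to the given factor subset."""
    def d(a, b):
        return sum((a[i] - b[i]) ** 2 for i in subset)
    best = min(range(len(train)), key=lambda j: d(train[j], x))
    return labels[best]

def ensemble_predict(train, labels, subsets, x):
    """Integrate the base evaluators (one per factor subset) by majority vote."""
    votes = [one_nn(train, labels, s, x) for s in subsets]
    return Counter(votes).most_common(1)[0][0]
```

Each base evaluator sees only its own factors, so the vote pools complementary views of the same small training set, which is the intuition behind the ensemble.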
Abstract: Background: Clinical studies on acupuncture treatment of hyperplasia of the mammary gland (HMG) have proved its effectiveness, but most studies have paid little attention to acupoint prescription and acupoint compatibility. Clinical prescriptions are not identical, and curative effects also differ. Therefore, through data mining and network analysis, this study explored the core acupoints and the compatibility law of acupoints in acupuncture treatment of HMG. Methods: Qualified literature was searched and selected according to inclusion and exclusion criteria from relevant clinical research literature on acupuncture treatment of HMG in CNKI, the VIP database, the WanFang database, PubMed, etc. Relevant information was then extracted to establish a database. Using statistical and complex network analysis, the core acupoints and the law of acupoint compatibility were studied. Results: A total of 104 Chinese-language articles and no English-language articles were included, and 106 acupuncture prescriptions were extracted. The core acupoints in the treatment of HMG are Danzhong (CV 17), Wuyi (ST 15), Zusanli (ST 36) and Jianjing (GB 21). Danzhong (CV 17) and Zusanli (ST 36), Danzhong (CV 17) and Wuyi (ST 15), Jianjing (GB 21) and Tianzong (SI 11), and Jianjing (GB 21) and Wuyi (ST 15) have the highest correlation degrees. The acupoint-matching methods mainly consist of local-remote, upper-lower and front-rear acupoint combinations. Conclusion: The results of the network analysis substantially accord with the general rules of acupuncture theory in traditional Chinese medicine, reflect the point-selection principles and features in acupuncture treatment of HMG, and provide evidence for acupoint selection in the clinical acupuncture treatment of HMG.
Funding: Supported by the funded project of the China Geological Survey (grant Nos. 12120113061700, 121201009000150013 and DD20160029)
Abstract: Objective The Yangtze craton experienced Paleoproterozoic collisional orogeny at ca. 1.95-2.0 Ga and post-orogenic extensional events at ca. 1.85 Ga related to amalgamation of the Columbia (Nuna) supercontinent (Zhao and Cawood, 2012). A ca. 2.15 Ga suprasubduction-zone ophiolitic melange was recognized in the Archean-Paleoproterozoic Kongling Complex of the northern Yangtze craton (Han et al., 2017). However, the tectonic evolution in the early Paleoproterozoic, from 2.4 Ga to 2.2 Ga, remains unclear. We report here the presence of a suite of Paleoproterozoic (2.2 Ga) granites in the Huangling dome, northern Yangtze craton, which may provide important insights into crustal growth processes in the craton prior to the assembly of Columbia.
Funding: Supported by the National Natural Science Foundation of China (grant No. 41572194), the Institute of Geology, China Earthquake Administration (grant No. IGCEA1604), and the National Key Basic Research Program of China (grant No. 2013CB733205)
Abstract: Objective The 2014 Ludian Mw 6.1 earthquake in Yunnan occurred in a mountainous area with complex tectonics and topography, and caused serious damage as well as co-seismic landslides of an unusually large scale. Because the suspected seismogenic faults on the surface, the distribution of aftershocks and the focal mechanism solutions are not consistent, it remains difficult to determine the real causal fault or seismogenic structure for this event; this may imply the complexity of the seismic source at depth. In addition, the distribution of the co-seismic landslides also exhibits some diffusion that differs from general cases, likely associated with the seismic focus structure.
Abstract: In this paper, the variability characteristics of the global field of sea surface temperature (SST) anomalies are studied by complex principal component (c.p.c.) analysis, whose results are also compared with those of real p.c. analysis. The data consist of 40 years of global monthly SST averages over latitudes from 42.5°S to 67.5°N. In the spatial domain, the distribution of the first complex loading amplitude is characterized by three areas of large values: the first in the eastern and central equatorial Pacific Ocean, the second in the northern tropical Indian Ocean and South China Sea, and the third in the northern Pacific Ocean. As explained in the paper, this pattern may be considered representative of the El Niño mode. The first complex loading phase pattern shows a stationary wave in the Pacific (also revealed by real p.c. analysis) superimposed on an oscillating disturbance propagating from the Pacific to the Indian Ocean or the opposite way. A subsequent correlation analysis among different spatial points reveals disturbances actually propagating westward from the Pacific to the Indian Ocean, which could therefore represent reflected Rossby waves, i.e. the west phase of the signals that propagate disturbances of the thermal structure in the tropical Pacific Ocean. In the time domain, a relation between the trend of the first complex principal component and the ENSO cycle is also established.
Funding: Supported by the Russian Foundation for Basic Research (Grant No. 08-05-92223)
Abstract: Methodological problems of climatic reconstruction for different periods of the Holocene are discussed on the basis of a multiple-group biological analysis of peat-sapropel sediments. The possibilities of biological analysis are exemplified by paleoclimatic reconstructions for the Carpathian and Altai mountain ranges. For the "Skolevsky Beskidy" national park in the Carpathians, paleoclimatic scenarios have been drawn up aiming at a more precise definition of climatic conditions for the period of mass mountain-slope terracing. The stability of terrace systems of various designs under current climatic conditions has been assessed. It is shown that during periods of humid climate, terraces whose designs focused on drainage were built, whereas in periods of dry and warm climate, terrace systems capable of accumulating water were built. Both types of terrace system are being destroyed nowadays; only those terrace systems are stable which were adjusted by their builders to contrasting variations of precipitation. For the Western Altai, a paleoclimatic scenario has been developed to forecast the safety of the Bronze Age kurgans (burial earth mounds) with permafrost inside the construction. In the Altai region during the Holocene, two periods of sharp cooling have been revealed, the peaks of which occurred in the intervals 4500-4300 and 2500-2300 years ago, as well as two periods of pronounced climatic drying 4900-4700 and 130-70 years ago. Depletion of the algae composition in the layer corresponding to the last period of climatic drying indicates a very sharp change in moisture parameters and the turning of the lake into a dry swamp. Periods of cold weather may have contributed to the formation of special ritual traditions of the Sakan tribes that required frozen ground to bury the dead. The later climate fluctuations identified have not affected the safety of permafrost in burial mounds constructed in the 5th-3rd centuries BC.
Funding: Supported by the National Natural Science Foundation of China (Nos. 61001153, 61271415, 61401499 and 61531015), the Fundamental Research Funds for the Central Universities (Nos. 3102014JCQ01010 and 3102014ZD0041), and the Opening Research Foundation of the State Key Laboratory of Underwater Information Processing and Control (No. 9140C231002130C23085)
Abstract: With the vigorous expansion of nonlinear adaptive filtering with real-valued kernel functions, counterpart complex kernel adaptive filtering algorithms have also been proposed to solve the complex-valued nonlinear problems arising in almost all real-world applications. This paper first presents two schemes of complex Gaussian kernel-based adaptive filtering algorithms to illustrate their respective characteristics. Then the theoretical convergence behavior of the complex Gaussian kernel least mean square (LMS) algorithm is studied using the fixed dictionary strategy. The simulation results demonstrate that the theoretical curves predicted by the derived analytical models consistently coincide with the Monte Carlo simulation results in both the transient and steady-state stages for the two introduced complex Gaussian kernel LMS algorithms using non-circular complex data. The analytical models can be regarded as a theoretical tool for evaluating and comparing the mean square error (MSE) performance of complex kernel LMS (KLMS) methods according to the specified kernel bandwidth and the length of the dictionary.
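A minimal sketch of the kernel LMS recursion on complex data may help fix ideas. This uses the growing-dictionary form rather than the paper's fixed-dictionary analysis, and a real Gaussian kernel evaluated on complex samples; the kernel choice, step size, and function names are illustrative assumptions:

```python
import math

def gauss_kernel(x, y, sigma):
    # Real Gaussian kernel applied to complex samples via |x - y|^2
    return math.exp(-abs(x - y) ** 2 / sigma ** 2)

def klms(inputs, desired, eta=0.5, sigma=1.0):
    """Kernel LMS on complex data; returns the a-priori error magnitudes."""
    centers, coeffs, errors = [], [], []
    for u, d in zip(inputs, desired):
        # Predict from the current dictionary, then compute the a-priori error
        y = sum(a * gauss_kernel(c, u, sigma) for c, a in zip(centers, coeffs))
        e = d - y
        errors.append(abs(e))
        centers.append(u)       # every sample becomes a dictionary centre
        coeffs.append(eta * e)  # LMS-style coefficient for the new centre
    return errors
```

On a repeated stationary target the a-priori error shrinks geometrically, the qualitative transient behavior whose exact curves the paper's analytical models predict.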
Funding: Jointly supported by the Natural Science Foundation of Jiangsu Province (No. 2012729), the Innovation Fund of Jiangsu Province (No. BY2013072-06), and the National Natural Science Foundation of China (Nos. 51171078 and 11374136)
Abstract: Today's emergence of nano-micro hybrid structures of almost biological complexity is of fundamental interest. Our ability to adapt intelligently to the challenges has ramifications all the way from fundamentally changing research itself, through applications critical to future survival, to posing globally existential dangers. Touching on specific issues such as how complexity relates to the catalytic prowess of multi-metal compounds, we also discuss the increasingly urgent issues in nanotechnology very generally, guided by the motto 'Bio Is Nature's Nanotech'. Technology belongs to macro-evolution; for example, integration with artificial intelligence (AI) is inevitable. Darwinian adaptation manifests as integration of complexity, and awareness of this helps in developing adaptable research methods that can find use across a wide range of research. The second half of this work reviews a diverse range of projects which all benefited from 'playful' programming aimed at dealing with complexity. The main purpose of reviewing them is to show how such projects benefit from and fit in with the general, philosophical approach, proving the relevance of the 'big picture' where it is usually disregarded.
Funding: Supported by Northern Border University, Arar, Kingdom of Saudi Arabia, through Project Number "NBU-FFR-2024-2248-03"
Abstract: This study addresses the critical need for efficient routing in Mobile Ad Hoc Networks (MANETs), whose dynamic topologies pose great challenges because of node mobility. The main objective was to examine and refine the application of Dijkstra's algorithm in this context, a method conventionally esteemed for its efficiency in static networks. The paper carries out a comparative theoretical analysis with the Bellman-Ford algorithm, considering adaptation to the dynamic network conditions typical of MANETs. Detailed algorithmic analysis shows that Dijkstra's algorithm, when adapted for dynamic updates, yields a workable solution to the problem of real-time routing in MANETs. The results indicate that, with these changes, Dijkstra's algorithm performs much better computationally, and 30% better in routing optimization, than Bellman-Ford when working with sparse network configurations. The theoretical framework, with its adaptation of Dijkstra's algorithm to dynamically changing network topologies, is novel in this work and quite different from the traditional application. The adaptation should offer more efficient routing and lower computational overhead, making it well suited to the resource-limited environment of MANETs. From these findings one may conclude that the proposed version of Dijkstra's algorithm is the most feasible choice of routing protocol for MANETs given all pertinent key performance and resource-consumption indicators, and that the proposed method offers a marked improvement over traditional methods. The paper therefore operationalizes the theoretical model into practical scenarios and calls for further research with empirical simulations to better understand its operational effectiveness.
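The baseline algorithm under discussion can be sketched as follows; in a MANET setting the simplest form of "dynamic update" is to recompute routes when a link changes. The adjacency format and names are illustrative assumptions, not the paper's adapted implementation:

```python
import heapq

def dijkstra(adj, src):
    """Single-source shortest paths; adj maps node -> list of (neighbour, weight)."""
    dist = {src: 0}
    pq = [(0, src)]                      # priority queue of (distance, node)
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float('inf')):
            continue                     # stale queue entry, already improved
        for v, w in adj[u]:
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist
```

With a binary heap this runs in O((V + E) log V), which is where the computational advantage over Bellman-Ford's O(VE) on sparse topologies comes from; node mobility is handled here simply by re-running the computation after each topology change.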
Funding: Supported by the Ministry of Education of the People's Republic of China Humanities and Social Sciences Youth Foundation (Grant No. 22YJC910014), the Social Sciences Planning Youth Project of Anhui Province (Grant No. AHSKQ2022D138), and the Innovation Development Research Project of Anhui Province (Grant No. 2021CX053)
Abstract: As a global strategic reserve resource, rare earth has been widely used in important industries such as military equipment and biomedicine. However, existing analyses based solely on the total volume of rare earth trade fail to uncover the underlying competition and dependency dynamics. To address this gap, this paper employs the principles of trade preference and import similarity to construct dependency and competition networks. Complex network analysis is then employed to study the evolution of the global rare earth trade network from 2002 to 2018. The main conclusions are as follows. The global rare earth trade follows the Pareto principle, and the trade network shows a scale-free distribution. China has been the world's largest importer and exporter of rare earth since 2017. In the dependency network, China has been the most dependent country since 2006. The result of community division shows that China has separated from the American community and formed new communities with Association of Southeast Asian Nations (ASEAN) countries, while the United States of America has formed a super-strong community with European and Asian countries. In the competition network, the distribution of competition intensity follows a scale-free distribution: most countries face low-intensity competition, but there are numerous competing countries, and competition related to China has increased significantly. Lastly, the competition source for the United States of America has shifted from Mexico to China, making China, the USA and Japan the core participants in the competition network.
Funding: The authors extend their appreciation to the Deanship of Scientific Research at King Khalid University for funding this work through Large Groups (Grant Number RGP.2/246/44), B.B., and https://www.kku.edu.sa/en
Abstract: Heart monitoring improves quality of life. Electrocardiograms (ECGs or EKGs) detect heart irregularities, and machine learning algorithms enable several ECG diagnosis processing methods. The first method uses raw ECG time-series data. The second method classifies the ECG by patient experience. The third technique translates ECG impulses into Q-wave, R-wave and S-wave (QRS) features carrying richer information. Because ECG signals vary naturally between humans and activities, we combine the three feature selection methods to improve classification accuracy and diagnosis; classifications using all three approaches together have not been examined until now. Several researchers have found that Machine Learning (ML) techniques can improve ECG classification. This study compares popular machine learning techniques for evaluating ECG features. Four algorithms, Support Vector Machine (SVM), Decision Tree, Naive Bayes and Neural Network, are compared on classification results. SVM plus prior knowledge has the highest accuracy (99%) of the four ML methods. QRS characteristics failed to identify signals without chaos theory. With 99.8% classification accuracy, the Decision Tree technique outperformed all previous experiments.
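Of the four classifiers compared, Naive Bayes is simple enough to sketch from scratch. The Gaussian variant below, the toy two-feature vectors, and the function names are illustrative assumptions, not the study's actual pipeline or features:

```python
import math
from collections import defaultdict

def fit_gnb(X, y, smoothing=1e-3):
    """Estimate per-class mean and variance for each feature."""
    groups = defaultdict(list)
    for xi, yi in zip(X, y):
        groups[yi].append(xi)
    model = {}
    for label, rows in groups.items():
        stats = []
        for col in zip(*rows):          # iterate over features
            mean = sum(col) / len(col)
            var = sum((v - mean) ** 2 for v in col) / len(col) + smoothing
            stats.append((mean, var))
        model[label] = (len(rows), stats)
    return model

def predict_gnb(model, x):
    """Pick the class with the highest Gaussian log-posterior."""
    total = sum(n for n, _ in model.values())
    best, best_lp = None, float('-inf')
    for label, (n, stats) in model.items():
        lp = math.log(n / total)        # log-prior from class frequencies
        for v, (mean, var) in zip(x, stats):
            lp -= 0.5 * math.log(2 * math.pi * var) + (v - mean) ** 2 / (2 * var)
        if lp > best_lp:
            best, best_lp = label, lp
    return best
```

The "naive" independence assumption lets the per-feature log-likelihoods simply add, which is why the method remains competitive on modest training sets like hand-labeled ECG features.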
Funding: Supported by the National Natural Science Foundation of China (Nos. 61233005 and 61503013), the National Basic Research Program of China (No. 2014CB744202), the Beijing Youth Talent Program, the Fundamental Science on Novel Inertial Instrument & Navigation System Technology Laboratory, and the Program for Changjiang Scholars and Innovative Research Team in University (IRT1203), for their valuable comments
Abstract: Inertial navigation system/visual navigation system (INS/VNS) integrated navigation is a commonly used autonomous navigation method for planetary rovers. Since visual measurements are related to both the previous and current state vectors (position and attitude) of a planetary rover, the performance of the Kalman filter (KF) is challenged by this time-correlation problem. A state augmentation method, which appends the previous state value to the state vector, is commonly used when dealing with this problem; however, augmenting the state dimension increases the computation load. In this paper, a state-dimension-reduced INS/VNS integrated navigation method based on the coordinates of feature points is presented, which utilizes the information obtained through INS/VNS integrated navigation at the previous moment to overcome the time-correlation problem and reduce the dimension of the state vector. Extended Kalman filter (EKF) equations are used to demonstrate the equivalence of the calculated results between the proposed method and traditional state-augmented methods. Results of simulation and experimentation indicate that this method has a lower computational load but similar accuracy compared with traditional methods.
Funding: Supported by the National Natural Science Foundation of China (Nos. 71061002 and 11071158) and the Natural Science Foundation of Guangxi Province of China (Nos. 0832052 and 2010GXNSFB013047)
Abstract: Based on the ideas of infeasible interior-point methods and predictor-corrector algorithms, two interior-point predictor-corrector algorithms for second-order cone programming (SOCP) are presented. The two algorithms use the Newton direction and the Euler direction as predictor directions, respectively; the corrector directions belong to the category of Alizadeh-Haeberly-Overton (AHO) directions. These algorithms are suitable for both feasible and infeasible interior iterative points. A simpler neighborhood of the central path for the SOCP is proposed, which is the pivotal difference from other interior-point predictor-corrector algorithms. Under some assumptions, the algorithms possess global, linear and quadratic convergence. The complexity bound O(r ln(ε0/ε)) is obtained, where r denotes the number of second-order cones in the SOCP problem. The numerical results show that the proposed algorithms are effective.
Abstract: This paper deals with the problem of uniqueness of meromorphic functions with two deficient values and obtains a result which improves those of F. Gross and Yi Hongxun.
Funding: Supported by the National Natural Science Foundation of China (No. 60073012), the Natural Science Foundation of Jiangsu, China (No. BK2001004), and the Visiting Scholar Foundation of Key Laboratory in the University
Abstract: Genetic Algorithms (GAs) are efficient non-gradient stochastic search methods, and Parallel GAs (PGAs) have been proposed to overcome deficiencies of sequential GAs such as low speed and susceptibility to local convergence. However, the tremendous increase in communication costs that accompanies parallelization stunts further improvement of PGAs. This letter takes the reduction of communication costs as the key to this problem and advances a new Migration Scheme based on the Schema Theorem (MSST). MSST distills schemata from the populations and then proportionately disseminates them to other populations, which decreases the total communication cost among the populations and gives the multiple-population model higher speed and better scalability.
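The schema-distillation step can be illustrated with a toy sketch: a schema fixes the bit positions on which most of a population agrees and wildcards the rest, so migrating the short schema string costs far less than migrating whole individuals. The agreement threshold, bit-string encoding, and function names are assumptions for illustration, not MSST's actual procedure:

```python
def extract_schema(population, threshold=0.8):
    """Fix a bit where at least `threshold` of individuals agree; else '*'."""
    n = len(population)
    schema = []
    for bits in zip(*population):       # iterate over bit positions
        ones = bits.count('1')
        if ones / n >= threshold:
            schema.append('1')
        elif (n - ones) / n >= threshold:
            schema.append('0')
        else:
            schema.append('*')          # no consensus: wildcard
    return ''.join(schema)

def matches(schema, individual):
    """True if the individual is an instance of the schema."""
    return all(s == '*' or s == b for s, b in zip(schema, individual))
```

A receiving population could then bias its own individuals toward instances of the incoming schema, which conveys the sender's building blocks without shipping full chromosomes.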