Experimental and theoretical studies of the mechanisms of vibration stimulation of oil recovery in watered fields lead to the conclusion that resonance oscillations develop in fractured-block formations. These oscillations, excited by weak but long-lasting and frequency-stable influences, create the conditions for ultrasonic wave generation in the layers, which is capable of destroying thickened oil membranes in reservoir cracks. For fractured-porous reservoirs exploited by high-pressure water displacement of oil, the possibility of intensifying ultrasonic vibrations can have important technological significance. Even very weak ultrasound can, over a long period of time, destroy the viscous oil membranes formed in the cracks between blocks; these membranes lower the permeability of the layers, so their destruction can increase oil recovery. To describe these effects, it is necessary to consider the wave process in a hierarchically blocky medium and to model theoretically the mechanism by which self-oscillations appear under the action of relaxing shear stresses. For the time-domain analysis of the seismoacoustic response over fixed intervals along the borehole, an algorithm based on phase diagrams of the state of the multiphase medium is suggested.
Mitigating increasing cyberattack incidents may require strategies such as reinforcing organizations' networks with honeypots and effectively analyzing attack traffic to detect zero-day attacks and vulnerabilities. To effectively detect and mitigate cyberattacks, both computerized and visual analyses are typically required. However, most security analysts are not adequately trained in visualization principles and methods, which are required for effective visual perception of the useful attack information hidden in attack data. Additionally, honeypots have proven useful in cyberattack research, but no studies have comprehensively investigated visualization practices in the field. In this paper, we review visualization practices and methods commonly used in the discovery and communication of attack patterns based on honeypot network traffic data. Using the PRISMA methodology, we identified and screened 218 papers and evaluated only the 37 papers having a high impact. Most honeypot papers computed summary statistics of honeypot data based on static data metrics such as IP address, port, and packet size. They visually analyzed honeypot attack data using simple graphical methods (such as line, bar, and pie charts) that tend to hide useful attack information. Furthermore, only a few papers conducted extended attack analysis, commonly visualizing attack data using scatter and line plots. Papers rarely included simple yet sophisticated graphical methods, such as box plots and histograms, which allow for critical evaluation of analysis results.
While a significant number of automated visualization tools incorporate visualization standards by default, constructing effective and expressive graphical methods for easy pattern discovery and explainable insights still requires applied knowledge and skill in visualization principles and tools, and occasionally an interdisciplinary collaboration with peers. We therefore suggest the need for non-classical graphical methods for visualizing attack patterns and communicating analysis results, and we recommend training investigators in visualization principles and standards for effective visual perception and presentation.
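As a concrete illustration of the kind of box-plot summary the review finds missing, the sketch below computes the five-number summary and Tukey fences that a box plot draws, using NumPy. The packet-size data are synthetic and purely illustrative, not drawn from any honeypot data set:

```python
import numpy as np

def boxplot_stats(x):
    """Five-number summary plus Tukey fences, i.e. the quantities a box
    plot draws; values beyond the fences are flagged as outliers."""
    q1, med, q3 = np.percentile(x, [25, 50, 75])
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    outliers = x[(x < lo) | (x > hi)]
    return {"q1": q1, "median": med, "q3": q3,
            "fences": (lo, hi), "outliers": outliers}

# hypothetical packet sizes: mostly small probes plus a burst of
# anomalously large packets that a pie chart would simply absorb
sizes = np.r_[np.full(97, 64.0), [1500.0, 1500.0, 1500.0]]
stats = boxplot_stats(sizes)
```

Unlike a bar or pie chart of totals, the fences immediately isolate the three 1500-byte packets as anomalies worth investigating.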
Ancient stellar observations are a valuable cultural heritage, profoundly influencing both cultural domains and modern astronomical research. Shi's Star Catalog (石氏星经), the oldest extant star catalog in China, faces controversy regarding its observational epoch. Determining this epoch via precession assumes accurate ancient coordinates and correspondence with contemporary stars, posing significant challenges. This study introduces a novel method using the Generalized Hough Transform to ascertain the catalog's observational epoch. This approach statistically accommodates errors in ancient coordinates and discrepancies between ancient and modern stars, addressing limitations in prior methods. Our findings date Shi's Star Catalog to the 4th century BCE, with 2nd-century CE adjustments. In comparison, the Western tradition's oldest known catalog, the Ptolemaic Star Catalog (2nd century CE), likely derives from the Hipparchus Star Catalog (2nd century BCE). Thus, Shi's Star Catalog is identified as the world's oldest known star catalog. Beyond establishing its observation period, this study aims to consolidate and digitize these cultural artifacts.
The clock difference PT-TAI between the ensemble pulsar timescale (PT) and International Atomic Time (TAI), derived from the International Pulsar Timing Array (IPTA) data set, shows a variation trend very similar to that of Terrestrial Time, TT(BIPMXXXX)-TAI, but PT has larger measurement errors. In this paper, we discuss smoothing PT with a combined smoothing filter and compare the results with those from other filters. The PT-TAI clock-difference sequence and the first time derivative series of TT(BIPMXXXX)-TAI can be processed by the combined smoothing filter to yield two smooth curves tied by constraints ensuring that the latter is the derivative of the former. To demonstrate the properties of the smoothed results, the ensemble pulsar time IPTA2016 with respect to TAI published by G. Hobbs et al. and the first time derivative series of TT(BIPM2017)-TAI, with quadratic polynomial terms removed, are processed by the combined smoothing filter. How to correctly estimate the two smoothing coefficients is described, and the output of the combined smoothing filter is analyzed. The results show that the combined smoothing method efficiently removes high-frequency noise from the two input data series, and that the smoothed PT-TAI data combine the long-term fractional frequency stability of the pulsar time with the frequency accuracy of the terrestrial time. Fractional frequency stability analysis indicates that both the short- and medium-interval stability of the smoothed PT-TAI is improved while its original long-term frequency stability level is preserved. The combined smoothing filter is more suitable for smoothing observational pulsar timescale data than any filter that only smooths a single pulsar time series. The pulsar time smoothed by the combined filter is a combined pulsar-atomic timescale, which can also be used as terrestrial time.
In recent years, improper allocation of safety input has prevailed in coal mines in China, resulting in frequent accidents in coal mining operations. A comprehensive assessment of the input efficiency of coal mine safety should lead to improved efficiency in the use of funds and management resources, helping government and enterprise managers better understand how safety inputs are used and how to optimize the allocation of resources. This paper studies the efficiency assessment of coal mine safety input. An output-based C^2R model with a non-Archimedean infinitesimal vector is established after consideration of the input characteristics and the model properties. An assessment of an operating mine was conducted using a specific set of input and output criteria. It is found that the safety input was efficient in 2002 and 2005 and weakly efficient in 2003, whereas efficiency was relatively low in both 2001 and 2004. The safety input resources can be optimized and adjusted by means of projection theory. Such analysis shows that, on average in 2001 and 2004, 45% of the expended funds could have been saved; likewise, 10% of the safety management and technical staff could have been eliminated, and working hours devoted to safety could have been reduced by 12%, while still giving the same results.
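The envelopment form of a C^2R (CCR) model can be sketched as a small linear program. For simplicity the sketch below is the standard input-oriented variant without the non-Archimedean infinitesimal used in the paper, solved with `scipy.optimize.linprog` on toy data:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR envelopment LP for DMU o:
    min theta s.t. sum_j lam_j*x_ij <= theta*x_io,
                   sum_j lam_j*y_rj >= y_ro, lam >= 0.
    X is (m inputs x n DMUs), Y is (s outputs x n DMUs)."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                 # variables: [theta, lam]
    A_in = np.hstack([-X[:, [o]], X])           # input constraints
    A_out = np.hstack([np.zeros((s, 1)), -Y])   # output constraints
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[:, o]],
                  bounds=[(0, None)] * (n + 1))
    return res.x[0]

# toy data: DMU 1 uses twice DMU 0's inputs for the same output,
# so it should score 0.5 while DMU 0 scores 1.0
X = np.array([[2.0, 4.0],
              [3.0, 6.0]])
Y = np.array([[1.0, 1.0]])
theta = [ccr_efficiency(X, Y, o) for o in range(2)]
```

The projection step mentioned above corresponds to scaling the inefficient DMU's inputs by its theta (and adding slack adjustments) to land on the efficient frontier.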
A set of indices is proposed for evaluating the performance of business processes with multiple inputs and multiple outputs, such as those found in machinery manufacturers. Based on the traditional methods of data envelopment analysis (DEA) and the analytic hierarchy process (AHP), a hybrid model called the DEA/AHP model is proposed for evaluating business process performance. In the proposed method, DEA is first used to develop a pairwise comparison matrix, and AHP is then applied to evaluate the performance of the business process using that matrix. The significant advantage of this hybrid model is the use of objective data instead of subjective human judgment for performance evaluation. In the case study, a business process reengineering (BPR) project at a hydraulic machinery manufacturer is used to demonstrate the effectiveness of the DEA/AHP model.
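The AHP step of such a hybrid can be illustrated on its own: given a reciprocal pairwise comparison matrix (which in the DEA/AHP model would come from DEA cross-evaluations rather than human judgment), the priority vector is the principal eigenvector. A minimal sketch with a synthetic, perfectly consistent matrix:

```python
import numpy as np

def ahp_weights(P, n_iter=500):
    """AHP priority vector of a positive reciprocal pairwise-comparison
    matrix P, via power iteration on the principal eigenvector, plus
    Saaty's consistency ratio (random index values for n = 3..9)."""
    n = P.shape[0]
    w = np.ones(n) / n
    for _ in range(n_iter):
        w = P @ w
        w /= w.sum()
    lam_max = (P @ w).sum()                 # valid because w sums to 1
    ci = (lam_max - n) / (n - 1)            # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24,
          7: 1.32, 8: 1.41, 9: 1.45}[n]
    return w, ci / ri

w_true = np.array([0.5, 0.3, 0.2])
P = np.outer(w_true, 1.0 / w_true)          # consistent: P[i,j] = w_i/w_j
w, cr = ahp_weights(P)                      # recovers w_true, CR ~ 0
```

With a matrix built from objective DEA scores, the consistency ratio also serves as a sanity check that the derived comparisons are coherent (CR below about 0.1 is conventionally acceptable).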
In the last decade, ranking units in data envelopment analysis (DEA) has attracted the interest of many DEA researchers, and a variety of models have been developed to rank units with multiple inputs and multiple outputs. These performance factors (inputs and outputs) can be classified into two groups: desirable and undesirable. Obviously, undesirable factors in the production process should be reduced to improve performance. Also, some of these data may be known only in terms of ordinal relations. While the models developed in the past are interesting and meaningful, they did not consider both undesirable and ordinal factors at the same time. In this research, we develop an evaluation model and a ranking model to overcome these deficiencies in the earlier models. This paper incorporates undesirable and ordinal data in DEA and discusses the efficiency evaluation and ranking of decision making units (DMUs) with such data. For this purpose, we transform the ordinal data into definite data, and then treat each undesirable input and output as a desirable output and input, respectively. Finally, an application that shows the capability of the proposed method is illustrated.
The application of data envelopment analysis (DEA) as a multiple criteria decision making (MCDM) technique has been gaining more and more attention in recent research. In practice, uncertainties in the input and output data of a decision making unit (DMU) may make the nominal solution infeasible and render the efficiency scores meaningless from a practical viewpoint. This paper analyzes the impact of data uncertainty on DEA evaluation results and proposes several robust DEA models, based on the adaptation of recently developed robust optimization approaches, that are immune to input and output data uncertainties. The robust DEA models are built on the input-oriented and output-oriented CCR models, for uncertainties appearing in the output data and the input data, respectively. Furthermore, the robust DEA models can deal with random symmetric uncertainty and unknown-but-bounded uncertainty, in both of which the distributions of the random data entries are permitted to be unknown. The robust DEA models are implemented in a numerical example, and the efficiency scores and rankings of these models are compared. The results indicate that the robust DEA approach can be a more reliable method for efficiency evaluation and ranking in MCDM problems.
This paper establishes the phase space on the basis of spatial series data, discusses the fractal structure of geological data in terms of correlation functions, and studies the chaos of these data. In addition, it introduces R/S analysis, originally developed for time series, into spatial series in order to calculate the structural fractal dimensions of the range and standard deviation of spatial series data, to establish the fractal dimension matrix, and to set out the procedure for plotting the fractal dimension anomaly diagram with vector distances of fractal dimension. Finally, application examples are given.
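The R/S calculation transfers directly from time series to any ordered series. A minimal sketch of classical rescaled-range estimation of the Hurst exponent, where the dyadic window sizes and the white-noise test data are illustrative choices rather than the paper's geological series:

```python
import numpy as np

def hurst_rs(x, min_n=8):
    """Estimate the Hurst exponent of an ordered series x by rescaled-
    range (R/S) analysis: average R/S over non-overlapping windows of
    size n, then regress log(R/S) on log(n) over dyadic n."""
    x = np.asarray(x, float)
    N = len(x)
    ns, rs = [], []
    n = min_n
    while n <= N // 2:
        vals = []
        for start in range(0, N - n + 1, n):
            seg = x[start:start + n]
            dev = np.cumsum(seg - seg.mean())   # cumulative departures
            R = dev.max() - dev.min()           # range of departures
            S = seg.std()                       # window std deviation
            if S > 0:
                vals.append(R / S)
        ns.append(n)
        rs.append(np.mean(vals))
        n *= 2
    H, _ = np.polyfit(np.log(ns), np.log(rs), 1)
    return H

rng = np.random.default_rng(0)
# white noise: H near 0.5 (finite-sample R/S is biased slightly upward)
H_noise = hurst_rs(rng.standard_normal(4096))
```

Applied along a survey line instead of a time axis, the same slope gives the structural fractal dimension the paper derives from the range and standard deviation of spatial series.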
Data envelopment analysis (DEA) has become a standard nonparametric approach to productivity analysis, especially to the relative efficiency analysis of decision making units (DMUs). Extended to the prediction field, it can solve prediction problems with multiple inputs and outputs that cannot easily be solved by regression analysis. However, traditional DEA models cannot handle undesirable outputs. In this paper, the inherent relationship between goal programming and the DEA method is therefore explored, based on the relationship between multiple goal programming and goal programming, and a mixed DEA model is built in which all input factors and undesirable outputs decrease in different proportions while, at the same time, all desirable outputs increase in different proportions.
This paper introduces the basic theory and algorithm of the surrogate data method, which provides a rigorous way to distinguish deterministic structure from seemingly stochastic behavior in a system. Gaussian data and Rössler data are used to show the availability and effectiveness of the method. Analysis by this method of short-circuiting current signals recorded under the same voltage but different wire feed speeds demonstrates that the electrical signal time series exhibit apparent randomness when the welding parameters do not match, whereas the time series are deterministic when a match is found. The stability of the short-circuiting transfer process can thus be judged exactly by the surrogate data method.
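A common way to generate surrogate data is FFT phase randomization, which preserves the power spectrum (and hence the linear autocorrelation) while destroying any deterministic structure; comparing a nonlinear statistic between the original and an ensemble of surrogates then tests for determinism. A minimal sketch, with a synthetic sine standing in for a welding current record:

```python
import numpy as np

def phase_surrogate(x, rng):
    """FFT phase-randomized surrogate of a real series x: keep the
    amplitude spectrum, draw random phases, invert. DC and Nyquist
    phases are pinned to zero so the inverse transform stays real."""
    n = len(x)
    X = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(X))
    phases[0] = 0.0                  # keep the DC (mean) bin real
    if n % 2 == 0:
        phases[-1] = 0.0             # keep the Nyquist bin real
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n)

rng = np.random.default_rng(1)
x = np.sin(np.linspace(0.0, 40.0 * np.pi, 1024))   # deterministic signal
s = phase_surrogate(x, rng)
# the surrogate preserves the power spectrum of x exactly
spec_match = np.allclose(np.abs(np.fft.rfft(s)),
                         np.abs(np.fft.rfft(x)), atol=1e-6)
```

If a nonlinear statistic (e.g. a prediction error) computed on x falls outside the distribution of the same statistic over many such surrogates, the null hypothesis of a linear stochastic process is rejected, which is the logic the abstract applies to the welding signals.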
This paper proposes a new approach for ranking efficient units in data envelopment analysis as a modification of the super-efficiency models developed by Tone [1]. The new approach, based on the slacks-based measure of efficiency (SBM), modifies the objective function used to classify the decision-making units, allows the ranking of all inefficient DMUs, and overcomes the disadvantage of infeasibility. The method is applied to rank super-efficiency scores for a sample of 145 agricultural bank branches in Viet Nam during 2007-2010. We then compare the estimated results from the new SCI model and the existing SBM model using some statistical tests.
According to eco-efficiency theory, combined with the characteristics of agricultural production, I point out the environmental impacts and the material and energy consumption characteristics of agricultural production. On this basis, I establish an eco-efficiency evaluation indicator system for agricultural production and conduct a comprehensive analysis of the agricultural eco-efficiency of 17 prefecture-level cities in Anhui Province, using the data envelopment analysis method.
Tunnel deformation monitoring is a crucial task for evaluating tunnel stability during the metro operation period. As an innovative technique, Terrestrial Laser Scanning (TLS) can collect high-density, high-accuracy point cloud data in a few minutes, which makes it promising for tunnel deformation monitoring. Here, an efficient method for extracting tunnel cross-sections and performing convergence analysis from dense TLS point cloud data is proposed. First, the tunnel orientation is determined using principal component analysis (PCA) in the Euclidean plane. Two control points are introduced to detect and remove unsuitable points by point cloud division, and the ground points are then removed by defining an elevation band of 0.5 m. Next, a z-score method is introduced to detect and remove outliers. Because the standard shape of a tunnel cross-section is round, circle fitting is implemented using the least-squares method. Afterward, the convergence analysis is made at angles of 0°, 30° and 150°. The feasibility of the proposed approach is tested on a TLS point cloud of a Nanjing subway tunnel acquired with a FARO X330 laser scanner. The results indicate that the proposed methodology achieves an overall accuracy of 1.34 mm, in agreement with measurements acquired by a total station instrument. The proposed methodology provides new insights and references for the application of TLS to tunnel deformation monitoring and can also be extended to other engineering applications.
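The cross-section fitting step can be sketched with an algebraic (Kåsa) least-squares circle fit followed by z-score screening of the radial residuals. The circle parameters and noise level below are synthetic stand-ins, not the Nanjing data:

```python
import numpy as np

def fit_circle(x, y):
    """Algebraic (Kasa) least-squares circle fit: solve the linear
    system x^2 + y^2 = a*x + b*y + c, then recover center and radius
    via cx = a/2, cy = b/2, r = sqrt(c + cx^2 + cy^2)."""
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = x**2 + y**2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    cx, cy = a / 2.0, b / 2.0
    r = np.sqrt(c + cx**2 + cy**2)
    return cx, cy, r

# synthetic cross-section: radius 2.75 m, center (1, -0.5), 2 mm noise
rng = np.random.default_rng(2)
t = rng.uniform(0.0, 2.0 * np.pi, 500)
x = 1.0 + 2.75 * np.cos(t) + rng.normal(0.0, 0.002, t.size)
y = -0.5 + 2.75 * np.sin(t) + rng.normal(0.0, 0.002, t.size)
cx, cy, r = fit_circle(x, y)

# z-score screening of radial residuals, as in the outlier-removal step
res = np.hypot(x - cx, y - cy) - r
keep = np.abs((res - res.mean()) / res.std()) < 3.0
```

In practice the fit and the screening would be iterated: refit on the kept points so that gross outliers do not bias the center, then compare radii between epochs for the convergence analysis.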
The paper studies the non-zero slacks in data envelopment analysis. A procedure is developed for the treatment of non-zero slacks, so that DEA projections can be done in just one step.
In this paper, principal component analysis is carried out using 8 seismicity parameters: earthquake frequency N (ML≥3.0), b-value, η-value, A(b)-value, Mf-value, Ac-value, C-value and D-value, which reflect the characteristics of the magnitude, time and space distribution of seismicity from different perspectives. Using principal component analysis, a synthesis parameter W reflecting the anomalous features of the magnitude, time and space distribution of earthquakes can be obtained. Generally, there is some correlation among the 8 parameters, but their variations differ in different periods, and earthquake prediction based on these individual parameters does not perform well. However, the synthesis parameter W showed obvious anomalies before 13 earthquakes (MS≥5.8) that occurred in North China, which indicates that W better reflects the anomalous characteristics of the magnitude, time and space distribution of seismicity. Other problems related to the conclusions drawn by the principal component analysis method are also discussed.
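A minimal sketch of how a synthesis parameter like W falls out of principal component analysis: standardize the parameter matrix, diagonalize its correlation matrix, and project onto the leading eigenvector. The eight correlated columns below are synthetic stand-ins for the seismicity parameters, not real catalog data:

```python
import numpy as np

def pca(X):
    """Principal components of data matrix X (rows = time windows,
    columns = seismicity parameters). Columns are standardized, so
    eigenvalues of the correlation matrix give explained variance."""
    Z = (X - X.mean(0)) / X.std(0)
    C = np.cov(Z, rowvar=False)
    evals, evecs = np.linalg.eigh(C)        # ascending eigenvalues
    order = np.argsort(evals)[::-1]         # sort descending
    evals, evecs = evals[order], evecs[:, order]
    scores = Z @ evecs                      # column 0 plays the role
    return evals, scores                    # of the synthesis parameter

rng = np.random.default_rng(3)
common = rng.standard_normal(200)           # shared "seismicity level"
X = np.column_stack([common + 0.3 * rng.standard_normal(200)
                     for _ in range(8)])    # 8 correlated parameters
evals, scores = pca(X)
explained = evals[0] / evals.sum()          # variance share carried by W
```

Because the eight columns share one driving factor, the first component captures most of the variance; in the paper's setting this is why W summarizes the joint anomaly better than any single parameter.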
This work evaluates dry port competitiveness through an efficiency analysis of selected dry ports in Africa. Five dry ports were selected and analyzed over a period of four years: Mojo and Kality in Ethiopia, Mombasa in Kenya, Isaka in Tanzania and Casablanca in Morocco. Data Envelopment Analysis (DEA) was applied, with the container throughput of each port used as the output variable, and the number of reach stackers, the number of tractors, the number of forklifts and the size of the dry port used as the input variables. From the results, the Mombasa dry port was found to be the most efficient, with an average score of approximately 1 over the period under consideration. Casablanca was the second most efficient dry port with an average score of 0.762, while Isaka was the least efficient with an average score of 0.142. This research is significant since African countries have embraced the dry port concept, as witnessed by the huge investments in this sector, and it serves to highlight areas needing improvement for the few existing dry port facilities, most of which are undergoing expansion and modernization.
This paper improves the slacks-based method for estimating inefficiency, derives criteria for selecting the weights of output and input inefficiencies in the objective function, and creates a new nonparametric method for growth accounting. Based on this method, the paper estimates the sources of China's economic growth from 1978 to 2013. Our findings suggest that factor input, and especially capital, is a major source of economic growth for China as a whole and for its major regions, and that economic growth in recent years has become increasingly dependent on capital. For a rather long period before 2005, China's northeastern, central and western regions lagged behind the eastern region in economic growth, and TFP and factor input are major reasons behind such regional growth disparities. Although other regions have narrowed their disparities with, and even overtaken, the eastern region in terms of economic growth, the key driver is the rapid increase in the contribution of factor input. The advanced technologies of the eastern region should be utilized to promote TFP progress in other regions, which is vital to economic growth in those regions and in China as a whole.
Although investment is regarded as a key force of China's economic growth, little study has been done to measure China's investment efficiency. The present paper applies data envelopment analysis (DEA) to Chinese provincial panel data from 2003 to 2008 to measure the investment efficiencies of 30 Chinese provinces and autonomous regions and to identify their trends. A cross-efficiency DEA model with a benevolent formulation is used to provide accurate efficiency scores and a complete ranking. The empirical results suggest that the differences in investment efficiency across regions are distinct but tend to diminish year by year, and that the investment efficiencies of some provinces are significantly correlated with their investment rates relative to the national total investment.
Funding (Shi's Star Catalog study): supported by the China National Astronomical Data Center (NADC), the CAS Astronomical Data Center and the Chinese Virtual Observatory (China-VO); also supported by the Astronomical Big Data Joint Research Center, co-founded by the National Astronomical Observatories, Chinese Academy of Sciences and Alibaba Cloud.
Funding (pulsar timescale study): supported by the Strategic Priority Research Program of the Chinese Academy of Sciences (grant No. XDA0350502), the National SKA Program of China (grant No. 2020SKA0120103) and the National Natural Science Foundation of China (NSFC, grant Nos. U1831130 and 11973046).
Funding (coal mine safety study): Project 70771105 supported by the National Natural Science Foundation of China.
Funding: This project is supported by the National Natural Science Foundation of China (No. 70471009) and the Natural Science Foundation Project of CQ CSTC, China (No. 2006BA2033).
Abstract: A set of indices is proposed for evaluating the performance of business processes with multiple inputs and multiple outputs, such as those found in machinery manufacturers. Based on the traditional methods of data envelopment analysis (DEA) and the analytic hierarchy process (AHP), a hybrid model called the DEA/AHP model is proposed for evaluating business process performance. In the proposed method, DEA is first used to develop a pairwise comparison matrix, and AHP is then applied to evaluate the performance of the business process using that matrix. The significant advantage of this hybrid model is the use of objective data instead of subjective human judgment for performance evaluation. In the case study, a business process reengineering (BPR) project with a hydraulic machinery manufacturer is used to demonstrate the effectiveness of the DEA/AHP model.
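The AHP step above derives priority weights from a pairwise comparison matrix as its normalized principal eigenvector. A minimal sketch with a hypothetical 3×3 reciprocal matrix (in the DEA/AHP model this matrix would be built from DEA results rather than human judgment):

```python
import numpy as np

# Hypothetical pairwise comparison matrix for 3 business processes:
# entry a_ij expresses how strongly process i outperforms process j,
# with reciprocal symmetry a_ji = 1 / a_ij.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])
n = A.shape[0]

# AHP priority vector = normalized principal eigenvector of A
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1); RI = 0.58 for n = 3
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58
print(w, cr)  # weights sum to 1; cr < 0.1 means acceptable consistency
```

The resulting weights rank the processes; the consistency ratio guards against contradictory pairwise judgments (e.g. A beats B, B beats C, C beats A).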
Abstract: In the last decade, ranking units in data envelopment analysis (DEA) has attracted the interest of many DEA researchers, and a variety of models have been developed to rank units with multiple inputs and multiple outputs. These performance factors (inputs and outputs) can be classified into two groups: desirable and undesirable. Obviously, undesirable factors in the production process should be reduced to improve performance. Also, some of these data may be known only in terms of ordinal relations. While the models developed in the past are interesting and meaningful, they did not consider undesirable and ordinal factors at the same time. In this research, we develop an evaluation model and a ranking model that overcome some deficiencies of the earlier models. This paper incorporates undesirable and ordinal data in DEA and discusses the efficiency evaluation and ranking of decision making units (DMUs) with such data. For this purpose, we transform the ordinal data into definite data, and then treat each undesirable input and output as a desirable output and input, respectively. Finally, an application demonstrating the capability of the proposed method is presented.
Abstract: The application of data envelopment analysis (DEA) as a multiple criteria decision making (MCDM) technique has been gaining more and more attention in recent research. In practice, uncertainties in the input and output data of a decision making unit (DMU) may make the nominal solution infeasible and render the efficiency scores meaningless from a practical point of view. This paper analyzes the impact of data uncertainty on DEA evaluation results and proposes several robust DEA models, based on the adaptation of recently developed robust optimization approaches, that are immune to input and output data uncertainties. The robust DEA models are built on the input-oriented and output-oriented CCR models, for uncertainties appearing in the output data and the input data, respectively. Furthermore, the robust DEA models can deal with both random symmetric uncertainty and unknown-but-bounded uncertainty, in both of which the distributions of the random data entries are permitted to be unknown. The robust DEA models are implemented in a numerical example, and the efficiency scores and rankings of these models are compared. The results indicate that the robust DEA approach can be a more reliable method for efficiency evaluation and ranking in MCDM problems.
Abstract: This paper establishes the phase space in the light of spatial series data, discusses the fractal structure of geological data in terms of correlation functions, and studies the chaotic character of these data. In addition, it extends R/S analysis from time series to spatial series to calculate the structural fractal dimensions of the range and standard deviation of spatial series data, to establish the fractal dimension matrix, and to set out the procedure for plotting fractal dimension anomaly diagrams using vector distances of fractal dimension. Finally, examples of its application are given.
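R/S (rescaled range) analysis, referred to above, characterizes a series by how its rescaled range grows with window size; the Hurst exponent H is the slope in log-log coordinates, from which a fractal dimension can be derived (for a one-dimensional profile, D = 2 − H). A minimal sketch on synthetic white noise, for which H should be near 0.5 (the window sizes are arbitrary choices for this sketch):

```python
import numpy as np

rng = np.random.default_rng(0)
series = rng.standard_normal(1024)  # white noise: expected Hurst exponent ~0.5

def rescaled_range(x):
    """R/S statistic of one window: range of cumulative deviations
    from the window mean, divided by the window standard deviation."""
    y = np.cumsum(x - x.mean())
    return (y.max() - y.min()) / x.std()

def hurst(x, window_sizes=(8, 16, 32, 64, 128, 256)):
    """Estimate H as the slope of log(mean R/S) versus log(window size)."""
    log_n, log_rs = [], []
    for n in window_sizes:
        chunks = [x[i:i + n] for i in range(0, len(x) - n + 1, n)]
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean([rescaled_range(c) for c in chunks])))
    slope, _ = np.polyfit(log_n, log_rs, 1)
    return slope

h = hurst(series)
print(h)  # close to 0.5 for uncorrelated noise
```

Applied to a spatial series rather than a time series, the same computation yields the structural fractal dimensions the abstract describes.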
Abstract: Data envelopment analysis (DEA) has become a standard non-parametric approach to productivity analysis, especially to the relative efficiency analysis of decision making units (DMUs). Extended to the prediction field, it can solve prediction problems with multiple inputs and outputs that cannot easily be solved by regression analysis. However, the traditional DEA models cannot handle undesirable outputs. In this paper, the inherent relationship between goal programming and the DEA method is therefore explored, based on the relationship between multiple goal programming and goal programming, and a mixed DEA model is built in which all input factors and undesirable outputs decrease in different proportions while all desirable output factors increase in different proportions.
Funding: Supported by the Young Scientists Fund of the National Natural Science Foundation of China (Grant No. 51205283).
Abstract: This paper introduces the basic theory and algorithm of the surrogate data method, which provides a rigorous way to detect random and seemingly stochastic characteristics in a system. Gaussian data and Rossler data are used to show the validity and effectiveness of the method. Analysis by this method of short-circuiting current signals recorded under the same voltage but different wire feed speeds demonstrates that the electrical signal time series exhibit apparent randomness when the welding parameters do not match, whereas the time series are deterministic when a match is found. The stability of the short-circuiting transfer process can thus be judged exactly by the surrogate data method.
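The standard way to build surrogates for such a test is Fourier phase randomization: keep the amplitude spectrum of the measured signal and scramble the phases, then compare a discriminating statistic on the original against its distribution over many surrogates. A minimal sketch with a stand-in signal (the paper's actual signals and test statistic are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(1)
# A stand-in "current signal": a periodic component plus measurement noise
signal = np.sin(np.linspace(0, 20 * np.pi, 512)) + 0.1 * rng.standard_normal(512)

def phase_randomized_surrogate(x, rng):
    """FFT surrogate: keep the amplitude spectrum, randomize the phases.

    The surrogate preserves the linear properties of x (power spectrum,
    hence autocorrelation) while destroying any deterministic structure,
    so it realizes the null hypothesis of a linear stochastic process."""
    spectrum = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(spectrum))
    phases[0] = 0.0   # keep the DC (mean) component real
    phases[-1] = 0.0  # keep the Nyquist component real for even-length x
    return np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n=len(x))

surrogate = phase_randomized_surrogate(signal, rng)
same_power = np.allclose(np.abs(np.fft.rfft(signal)),
                         np.abs(np.fft.rfft(surrogate)))
print(same_power)  # True: the spectra agree although the waveforms differ
```

If a nonlinear statistic (e.g. a prediction error or correlation dimension) computed on the original falls outside the surrogate distribution, the null hypothesis of pure stochasticity is rejected — the logic behind the determinism verdict in the abstract.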
Abstract: This paper proposes a new approach for ranking efficient units in data envelopment analysis as a modification of the super-efficiency models developed by Tone [1]. The new approach, based on the slacks-based measure of efficiency (SBM), modifies the objective function used to classify the decision-making units; it allows the ranking of all efficient DMUs and overcomes the disadvantage of infeasibility. The method is applied to rank super-efficiency scores for a sample of 145 agricultural bank branches in Viet Nam during 2007-2010. We then compare the estimated results of the new SCI model with those of the existing SBM model using statistical tests.
Funding: Supported by the Special Project for Youth Research of the Anhui Institute of Architecture & Industry (20104012).
Abstract: Based on eco-efficiency theory and combined with the characteristics of agricultural production, I point out the environmental impacts and the material and energy consumption characteristics of agricultural production. On this basis, I establish an eco-efficiency evaluation indicator system for agricultural production and conduct a comprehensive analysis of the agricultural eco-efficiency of 17 prefecture-level cities in Anhui Province using the data envelopment analysis method.
Funding: National Natural Science Foundation of China (No. 41801379), the Fundamental Research Funds for the Central Universities (No. 2019B08414), and the National Key R&D Program of China (No. 2016YFC0401801).
Abstract: Tunnel deformation monitoring is a crucial task for evaluating tunnel stability during the metro operation period. Terrestrial Laser Scanning (TLS), an innovative technique, can collect high-density and high-accuracy point cloud data in a few minutes, which makes it promising for tunnel deformation monitoring. Here, an efficient method for extracting tunnel cross-sections and performing convergence analysis from dense TLS point cloud data is proposed. First, the tunnel orientation is determined using principal component analysis (PCA) in the Euclidean plane. Two control points are introduced to detect and remove unsuitable points by point cloud division, and the ground points are then removed by defining an elevation band of 0.5 m. Next, a z-score method is introduced to detect and remove outliers. Because the standard shape of the tunnel cross-section is round, circle fitting is implemented using the least-squares method. Afterward, the convergence analysis is carried out at angles of 0°, 30° and 150°. The feasibility of the proposed approach is tested on a TLS point cloud of a Nanjing subway tunnel acquired with a FARO X330 laser scanner. The results indicate that the proposed methodology achieves an overall accuracy of 1.34 mm, in agreement with measurements acquired by a total station instrument. The proposed methodology provides new insights and references for applications of TLS in tunnel deformation monitoring and can also be extended to other engineering applications.
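The circle-fitting and z-score screening steps above can be sketched compactly. The sketch below uses the algebraic (Kåsa) least-squares circle fit on synthetic cross-section points — the radius, centre and noise level are invented for illustration, and the paper's exact least-squares variant may differ:

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic cross-section: 500 points on a circle of radius 2.75 m centred
# at (1.0, -0.5), with 2 mm measurement noise (all values hypothetical)
theta = rng.uniform(0, 2 * np.pi, 500)
pts = np.column_stack([
    1.0 + 2.75 * np.cos(theta),
    -0.5 + 2.75 * np.sin(theta),
]) + rng.normal(0, 0.002, (500, 2))

def fit_circle(p):
    """Kasa least-squares circle fit: write the circle equation as the
    linear model x^2 + y^2 = 2a x + 2b y + c, with centre (a, b) and
    c = r^2 - a^2 - b^2, and solve it by linear least squares."""
    A = np.column_stack([2 * p[:, 0], 2 * p[:, 1], np.ones(len(p))])
    rhs = (p ** 2).sum(axis=1)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return a, b, np.sqrt(c + a ** 2 + b ** 2)

cx, cy, r = fit_circle(pts)

# z-score outlier screen on the radial residuals, as in the pipeline above
resid = np.hypot(pts[:, 0] - cx, pts[:, 1] - cy) - r
z = (resid - resid.mean()) / resid.std()
inliers = pts[np.abs(z) < 3]
print(cx, cy, r)  # recovers the centre and radius to millimetre level
```

In practice the fit and the z-score screen would be iterated: fit, discard points with |z| above a threshold, refit; the radial residuals at fixed angles then give the convergence values.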
Abstract: This paper studies non-zero slacks in data envelopment analysis. A procedure is developed for the treatment of non-zero slacks, so that DEA projections can be done in just one step.
Funding: Project of the Joint Seismological Science Foundation of China (104090).
Abstract: In this paper, principal component analysis is applied to 8 seismicity parameters — earthquake frequency N (ML≥3.0), b-value, η-value, A(b)-value, Mf-value, Ac-value, C-value and D-value — that reflect the characteristics of the magnitude, time and space distribution of seismicity from different perspectives. Using principal component analysis, a synthesis parameter W reflecting the anomalous features of earthquake magnitude, time and space distribution is obtained. Generally, there is some correlation among the 8 parameters, but their variations differ in different periods, and earthquake prediction based on the individual parameters does not perform well. However, the synthesis parameter W showed obvious anomalies before 13 earthquakes (MS≥5.8) that occurred in North China, indicating that W can better reflect the anomalous characteristics of the magnitude, time and space distribution of seismicity. Other problems related to the conclusions drawn by the principal component analysis method are also discussed.
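A synthesis parameter of this kind is typically the projection of the standardized parameter matrix onto the first principal component. A minimal sketch with simulated data standing in for the 8 seismicity parameters (the sample size, loadings and noise level are invented assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical monthly values of 8 seismicity parameters (120 samples x 8),
# standing in for N, b, eta, A(b), Mf, Ac, C and D; a shared anomaly signal
# plus independent noise mimics the partial correlation among them
common = rng.standard_normal(120)
params = np.outer(common, rng.uniform(0.5, 1.0, 8)) \
    + 0.3 * rng.standard_normal((120, 8))

# Standardize each parameter, then project onto the first principal component
Z = (params - params.mean(axis=0)) / params.std(axis=0)
cov = np.cov(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
pc1 = eigvecs[:, -1]                     # direction of largest variance
W = Z @ pc1                              # synthesis parameter W

explained = eigvals[-1] / eigvals.sum()
print(explained)  # fraction of total variance captured by W
```

W condenses the co-varying part of the 8 parameters into one series, so a common precursory anomaly shows up more clearly in W than in any single parameter.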
Abstract: This work evaluates dry port competitiveness through an efficiency analysis of selected dry ports in Africa. Five dry ports were selected and analyzed over a period of four years: Mojo and Kality in Ethiopia, Mombasa in Kenya, Isaka in Tanzania, and Casablanca in Morocco. Data Envelopment Analysis (DEA) was applied. Container throughput for each port was used as the output variable of the model, while the number of reach stackers, the number of tractors, the number of forklifts and the size of the dry port were used as the input variables. From the results, the Mombasa dry port was found to be the most efficient, with an average score of approximately 1 over the period under consideration. Casablanca was the second most efficient dry port with an average score of 0.762, while Isaka was the least efficient with an average score of 0.142. This research is significant because African countries have embraced the dry port concept, as witnessed by the huge investments in this sector, and it serves to highlight areas that need improvement for the few existing dry port facilities, most of which are undergoing expansion and modernization.
Abstract: This paper improves the slacks-based method for estimating inefficiency, derives criteria for selecting the weights of output and input inefficiencies in the objective function, and creates a new nonparametric method for growth accounting. Based on this method, the paper estimates the sources of China's economic growth from 1978 to 2013. Our findings suggest that factor input, and especially capital, is a major source of economic growth for China as a whole and for its major regions, and that economic growth in recent years has become increasingly dependent on capital. For a rather long period before 2005, China's northeast, central and western regions lagged behind the eastern region in economic growth, and TFP and factor input are the major reasons behind these regional growth disparities. Although other regions have since narrowed their disparities with, and even overtaken, the eastern region in terms of economic growth, the key driver has been a rapid increase in the contribution of factor input. The advanced technologies of the eastern region should be utilized to promote TFP progress in other regions, which is vital to economic growth in those regions and in China as a whole.
Abstract: Although investment is regarded as a key force of China's economic growth, little work has been done to measure China's investment efficiency. The present paper applies data envelopment analysis (DEA) to Chinese provincial panel data from 2003 to 2008 to measure the investment efficiencies, and identify their trends, of China's 30 provinces and autonomous regions. A cross-efficiency DEA model with a benevolent formulation is used to provide accurate efficiency scores and a complete ranking. The empirical results suggest that the differences in investment efficiency across regions are distinct but tend to diminish year by year, and that the investment efficiencies of some provinces are significantly correlated with their investment rates relative to the national total investment.