Fund: Technology Foundation of Guizhou Province, China (No. QianKeHeJZi[2015]2064); Scientific Research Foundation for Advanced Talents in Guizhou Institute of Technology and Science, China (No. XJGC20150106); Joint Foundation of Guizhou Province, China (No. QianKeHeLHZi[2015]7105)
Abstract: Masked data are system failure data for which the exact component that caused the system failure may be unknown. In this paper, a mathematical description of general masked data in software reliability engineering is presented. Furthermore, a general masked-data-based additive non-homogeneous Poisson process (NHPP) model is considered for analyzing component reliability. The main difficulty of the masked-data-based additive model, however, lies in estimating its parameters, so a maximum likelihood estimation procedure is derived. Finally, a numerical example illustrates the applicability of the proposed model, with the immune particle swarm optimization (IPSO) algorithm used to maximize the log-likelihood function.
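The masked-data likelihood described above can be sketched as follows. This is an illustrative example, not the paper's numerical study: the failure times, masks, two-component Goel-Okumoto intensities, and the plain particle swarm optimizer (standing in for the paper's IPSO, without the immune operator) are all assumptions.

```python
import numpy as np

# Masked failure data: (time, candidate-component set). A mask {0, 1}
# means the failing component could be either component 0 or 1.
times = np.array([2.1, 5.3, 7.8, 9.0, 12.4, 15.1, 18.9, 22.0])
masks = [{0}, {0, 1}, {1}, {0}, {0, 1}, {1}, {0, 1}, {0}]
T_end = 25.0  # end of the observation window

def log_likelihood(theta):
    """Masked-data log-likelihood for a 2-component additive NHPP with
    Goel-Okumoto mean value functions m_j(t) = a_j (1 - exp(-b_j t))."""
    a, b = theta[0:2], theta[2:4]
    if np.any(theta <= 0):
        return -np.inf
    lam = lambda j, t: a[j] * b[j] * np.exp(-b[j] * t)  # component intensity
    # Each event contributes the summed intensity over its mask set;
    # the integral term uses the full additive mean value function.
    ll = sum(np.log(sum(lam(j, t) for j in s)) for t, s in zip(times, masks))
    ll -= sum(a[j] * (1 - np.exp(-b[j] * T_end)) for j in range(2))
    return ll

def pso_maximize(f, bounds, n_particles=40, iters=200, seed=0):
    """Plain particle swarm optimizer (no immune operator)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    x = rng.uniform(lo, hi, (n_particles, len(lo)))
    v = np.zeros_like(x)
    pbest, pval = x.copy(), np.array([f(p) for p in x])
    g = pbest[np.argmax(pval)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        val = np.array([f(p) for p in x])
        better = val > pval
        pbest[better], pval[better] = x[better], val[better]
        g = pbest[np.argmax(pval)].copy()
    return g, f(g)

bounds = np.array([[1.0, 50.0], [1.0, 50.0], [0.01, 1.0], [0.01, 1.0]])
theta_hat, ll_hat = pso_maximize(log_likelihood, bounds)
print(theta_hat, ll_hat)
```

The swarm parameters (inertia 0.7, acceleration 1.5) are common textbook defaults, not values taken from the paper.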
Abstract: The processing of nonlinear data has been one of the hot topics in the surveying and mapping field in recent years, and many linear and nonlinear methods have been developed as a result. However, methods for processing generalized nonlinear surveying and mapping data, especially data of different types whose unknown parameters may be random or nonrandom, have received little attention. This paper presents a new algorithm model for processing nonlinear dynamic multi-period, multi-accuracy data derived from a deformation monitoring network.
Fund: Project (51274250) supported by the National Natural Science Foundation of China; Project (2012BAK09B02-05) supported by the National Key Technology R&D Program during the 12th Five-Year Plan of China
Abstract: An integrated processing system for three-dimensional laser scanning information visualization in goafs was developed. It provides multiple functions, such as laser scanning information management for goafs, point cloud data de-noising optimization, construction, display, and manipulation of three-dimensional models, model editing, profile generation, calculation of goaf volume and roof area, Boolean operations among models, and interaction with third-party software. With a concise interface and plentiful data input/output interfaces, the system features high integration and simple, convenient operation. In practice, the system has proved well-adapted, reliable, and stable.
Abstract: This paper studies the asymptotic normality of the Nelson-Aalen and Kaplan-Meier estimators in a competing risks context in the presence of independent right-censoring. To prove our results, we use Rebolledo's theorem, which makes it possible to apply the central limit theorem to certain types of martingales. From the results obtained, confidence bounds for the hazard and survival functions are provided.
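A minimal sketch of the Nelson-Aalen cumulative hazard estimator with pointwise normal-approximation bounds of the kind the asymptotic normality justifies. The data below are invented for illustration, and the simple plug-in variance is the standard textbook estimate, not necessarily the exact construction used in the paper.

```python
import numpy as np

# Right-censored sample: (time, event) with event=1 for an observed
# failure and event=0 for a censored observation. Illustrative data.
times  = np.array([1.0, 2.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0])
events = np.array([1,   1,   0,   1,   1,   0,   1,   1])

def nelson_aalen(times, events):
    """Nelson-Aalen cumulative hazard H(t) with its variance estimate:
    H(t) = sum d_i / n_i,  Var(t) ~ sum d_i / n_i^2 over event times <= t."""
    uniq = np.unique(times[events == 1])  # distinct observed failure times
    H, V = [], []
    h = v = 0.0
    for u in uniq:
        n_at_risk = np.sum(times >= u)               # still under observation
        n_events = np.sum((times == u) & (events == 1))
        h += n_events / n_at_risk
        v += n_events / n_at_risk**2
        H.append(h)
        V.append(v)
    return uniq, np.array(H), np.array(V)

t_grid, H, V = nelson_aalen(times, events)
# Pointwise 95% bounds from the martingale central limit theorem:
lower, upper = H - 1.96 * np.sqrt(V), H + 1.96 * np.sqrt(V)
print(np.column_stack([t_grid, H, lower, upper]))
```

The Kaplan-Meier survival estimate follows the same event-time loop with the product form S(t) = prod(1 - d_i / n_i).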
Abstract: Traditional project effort estimation relies on development models that span the entire project life cycle and thus accumulates estimation errors. This exploratory research is a first attempt to break each project activity down into smaller work elements. Eight work elements are defined, each symbolized with a visually distinct shape, to standardize the operations of the development process in the form of a visual symbolic flow map so that developers can streamline their work systematically. Project effort can then be estimated from these standard work elements, which not only help identify the essential cost drivers for estimation but also reduce latency cost and thereby improve estimation efficiency. The benefits of the proposed work element scheme are project visibility, better control for immediate pay-off, and, in long-term management, standardization for software process automation.
Fund: This work was sponsored by the National Natural Science Foundation of China (Grant Nos. 41931075 and 42274041).
Abstract: To meet the demand for combining data from multiple space geodetic techniques at the observation level, we developed a new software platform with high extensibility and computational efficiency, named the space Geodetic SpatioTemporal data Analysis and Research software (GSTAR). Most GSTAR modules are coded in C++ with object-oriented programming. A layered modular design is adopted, and an antenna-based data architecture is proposed so that users can easily construct personalized geodetic application scenarios. The initial performance of GSTAR is evaluated by processing Global Navigation Satellite System (GNSS) data collected from 315 globally distributed stations over two and a half years. The accuracy of the GNSS-based geodetic products is evaluated by comparing them with those released by International GNSS Service (IGS) Analysis Centers (ACs). Taking the products released by the European Space Agency (ESA) as reference, the three-dimensional (3D) root-mean-squares (RMS) of the orbit differences are 2.7/6.7/3.3/7.7/21.0 cm and the standard deviations (STD) of the clock differences are 19/48/16/32/25 ps for Global Positioning System (GPS), GLObal NAvigation Satellite System (GLONASS), Galileo navigation satellite system (Galileo), BeiDou Navigation Satellite System (BDS) Medium Earth Orbit (MEO), and BDS Inclined Geo-Synchronous Orbit (IGSO) satellites, respectively. The mean values of the X and Y components of the pole coordinates and the Length of Day (LOD) with respect to the International Earth Rotation and Reference Systems Service (IERS) 14 C04 products are -17.6 microarcseconds (μas), 9.2 μas, and 14.0 μs/d. Compared to the IGS daily solution, the RMSs of the site position differences in the north/east/up directions are 1.6/1.5/3.9, 3.8/2.4/7.6, 2.5/2.4/7.9, and 2.7/2.3/7.4 mm for the GPS-only, GLONASS-only, Galileo-only, and BDS-only solutions, respectively. The RMSs of the differences of the tropospheric Zenith Path Delay (ZPD), the north gradients, and the east gradients are 5.8, 0.9, and 0.9 mm with respect to the IGS products. The X and Y components of the geocenter motion estimated from GPS-only, Galileo-only, and BDS-only observations agree well with the IGS products, while the Z component is much noisier, with anomalous harmonics at the GNSS draconitic year. The accuracies of the above products calculated by GSTAR are comparable with those from different IGS ACs. Compared to precise scientific orbit products, the 3D RMS of the orbit differences for the two Gravity Recovery and Climate Experiment Follow-On (GRACE-FO) satellites is below 1.5 cm when conducting Precise Point Positioning with Ambiguity Resolution (PPP-AR). In addition, a series of rapid data processing algorithms are developed, making GSTAR 5.6 times faster than the Positioning and Navigation Data Analyst (PANDA) software for the quad-system precise orbit determination procedure.
Fund: Supported by the National Key Research and Development Program of China (Grant No. 2024YFF0808200), the National Natural Science Foundation of China (Grant Nos. 42330104 and 42473035), the Hubei Provincial Natural Science Foundation of China (Grant No. 2025AFA005), and the MOST Special Fund from the State Key Laboratory of Geological Processes and Mineral Resources, China University of Geosciences (Grant Nos. MSFGPMR04 and MSFGPMR08).
Abstract: LA-MC-ICP-MS offers the ability to directly measure small variations in isotope composition at the micrometer scale in geological samples. However, when analyzing isotope ratios in complex minerals, mass spectrometric interferences from the minerals themselves and the isotopic mass fractionation arising from the analytical process must be carefully corrected. The lack of professional software for visualizing and processing raw data presents a significant challenge for all LA-MC-ICP-MS laboratories. Moreover, the rapid development of instruments and innovations in micro-analytical techniques, such as in situ isochron dating and new linear regression correction techniques, call for specialized software to carry out these complex corrections and improve efficiency. In this study, Plume, a free data processing software package, was developed specifically for LA-MC-ICP-MS isotopic data. The software provides a comprehensive set of functions, including baseline correction, signal selection and integration, mass spectrometric interference correction, isotopic fractionation correction, uncertainty propagation, delta value calculation, and real-time data processing. It also supports a complete correction workflow for the latest isochron dating using LA-(MC)-ICP-MS/MS, including correction of isotopic and elemental ratios, isochron drawing, and age calculation for multiple models. Additionally, a dedicated linear regression calculation unit specializes in processing data with low signal-to-noise ratios or from high-spatial-resolution analyses. These new features expand the application range of LA-MC-ICP-MS and improve work efficiency through batch processing.
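Two of the correction steps listed above (baseline correction and fractionation correction) can be illustrated with a toy calculation. The numbers, the certified standard value, and the equal-weight standard-sample bracketing with a linear drift assumption are all invented for illustration; they are not Plume's actual algorithms or data.

```python
import numpy as np

def corrected_ratio(sig_a, sig_b, base_a, base_b):
    """Baseline-corrected isotope ratio A/B from mean intensities:
    subtract the mean gas-blank baseline from each signal, then divide."""
    return (np.mean(sig_a) - np.mean(base_a)) / (np.mean(sig_b) - np.mean(base_b))

# Raw ratios measured on two bracketing standards and the sample between them
# (hypothetical values), plus a hypothetical certified value for the standard.
r_std1, r_sample, r_std2 = 0.5124, 0.5101, 0.5128
r_std_true = 0.5120

# Mass-bias factor under a linear-drift assumption: average the two
# bracketing standards and compare to the certified value.
bias = 0.5 * (r_std1 + r_std2) / r_std_true
r_corrected = r_sample / bias
print(r_corrected)
```

Real workflows additionally propagate the uncertainty of each term into `r_corrected`, which is one of the functions the abstract lists.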
Abstract: Software reliability growth models (SRGMs) incorporating imperfect debugging and the learning phenomenon of developers have recently been developed by many researchers to estimate software reliability measures such as the number of remaining faults and software reliability. However, the parameters of the fault content rate function and the fault detection rate function of these SRGMs are usually assumed to be independent of each other. In practice this assumption may not hold, and it is worth investigating what happens when it does not. In this paper we undertake such a study and propose a software reliability model that connects imperfect debugging and the learning phenomenon through a parameter common to the two functions, called the imperfect-debugging fault-detection dependent-parameter model. Software testing data collected from real applications are used to illustrate both the descriptive and the predictive power of the proposed model by determining the non-zero initial debugging process.
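The structure of such a model can be sketched numerically. The functional forms below (a linear fault-content function for imperfect debugging, a logistic-style detection rate for learning, and a shared parameter `c` appearing in both) are illustrative assumptions, not the paper's exact model; the mean value function m(t) solves dm/dt = b(t)(a(t) - m(t)).

```python
import numpy as np

def mean_value_function(t_grid, a0=100.0, b0=0.3, c=0.05):
    """Expected cumulative faults detected by time t for an SRGM whose
    fault-content function a(t) and detection-rate function b(t) share
    the common parameter c (hypothetical forms for illustration)."""
    a = lambda t: a0 * (1.0 + c * t)                # faults introduced over time
    b = lambda t: b0 / (1.0 + c * np.exp(-b0 * t))  # learning-curve detection rate
    m = np.zeros_like(t_grid)
    for i in range(1, len(t_grid)):                 # simple Euler integration
        dt = t_grid[i] - t_grid[i - 1]
        m[i] = m[i - 1] + dt * b(t_grid[i - 1]) * (a(t_grid[i - 1]) - m[i - 1])
    return m

t = np.linspace(0.0, 20.0, 2001)
m = mean_value_function(t)
print(m[-1])  # cumulative detected faults at t = 20
```

Because `c` appears in both a(t) and b(t), a fit of this model estimates the imperfect-debugging and learning effects jointly rather than independently, which is the point of the dependent-parameter construction.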
Abstract: This paper analyzes the effect of censoring on failure rate estimation and presents a framework for a censored nonparametric software reliability model. The model is based on a nonparametric test of whether the failure rate is monotonically decreasing and on weighted kernel estimation of the failure rate under the monotonically decreasing constraint. The model not only has the advantages of few assumptions and weak constraints, but also allows the number of residual defects in the software system to be estimated. Numerical experiments and real data analysis show that the model performs well with censored data.
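A crude version of the constrained estimate described above: a Gaussian-kernel intensity estimate followed by projection onto the monotonically decreasing constraint. The running-minimum projection is a deliberately simple stand-in for the paper's weighted-kernel scheme, and the failure times and bandwidth are invented.

```python
import numpy as np

# Hypothetical uncensored failure times of a software system under test.
failure_times = np.array([0.5, 0.9, 1.6, 2.8, 4.1, 6.0, 9.5, 14.0])

def kernel_rate(grid, times, bandwidth=1.5):
    """Raw kernel failure-rate estimate: a Gaussian kernel is placed at
    each observed failure time and the contributions are summed."""
    diffs = (grid[:, None] - times[None, :]) / bandwidth
    k = np.exp(-0.5 * diffs**2) / (np.sqrt(2 * np.pi) * bandwidth)
    return k.sum(axis=1)

grid = np.linspace(0.0, 16.0, 321)
raw = kernel_rate(grid, failure_times)
# Enforce a non-increasing failure rate via a running minimum.
monotone = np.minimum.accumulate(raw)
print(monotone[0], monotone[-1])
```

With censored data, each kernel term would additionally be weighted by the inverse of the number at risk, which is the direction the paper's weighted estimator takes.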
Fund: the National High Technology Research and Development (863) Program of China (No. 2006AA05Z214)
Abstract: Parameter values that in reality change with circumstances such as weather and load level have a great effect on the result of state estimation. A new parameter estimation method based on data mining technology is proposed. A clustering method is used to classify the historical data in the supervisory control and data acquisition (SCADA) database into several types, and data preprocessing is applied within each classified group to treat isolated points, missing data, and noisy data in the samples. The measurement data belonging to each class are then fed into a linear regression equation, and the regression coefficients and actual parameters are obtained by the least squares method. A practical system demonstrates the correctness, reliability, and strong practicability of the proposed method.
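The two-stage idea (group historical samples by operating condition, then fit a linear regression per group by least squares) can be sketched as follows. The synthetic data, the threshold-based "clustering", and the regression form are all invented for illustration; the paper's clustering and SCADA preprocessing are more elaborate.

```python
import numpy as np

rng = np.random.default_rng(1)
load = rng.uniform(0.2, 1.0, 200)   # per-unit load level
temp = rng.uniform(-10, 35, 200)    # ambient temperature, deg C
# Hypothetical measurement: a loss-like quantity depending on load and
# temperature, with small measurement noise.
loss = 0.04 * load**2 + 0.0005 * temp + rng.normal(0, 0.001, 200)

# Stage 1: a crude two-way grouping by load level stands in for clustering.
groups = (load > 0.6).astype(int)

# Stage 2: ordinary least squares within each group.
coeffs = {}
for g in (0, 1):
    m = groups == g
    X = np.column_stack([load[m]**2, temp[m], np.ones(m.sum())])
    beta, *_ = np.linalg.lstsq(X, loss[m], rcond=None)
    coeffs[g] = beta  # [load^2 coefficient, temp coefficient, intercept]
print(coeffs)
```

Fitting per group lets the recovered coefficients differ across operating conditions, which is how condition-dependent line parameters would be exposed.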
Abstract: In this paper, the stochastic processes developed by Aalen [1] [2] are adapted to the Nelson-Aalen and Kaplan-Meier [3] estimators in a context of competing risks. We focus only on the probability distributions of individuals with complete downtimes whose causes are known, which leads us to consider a partition of the individuals into sub-groups for each cause. We then study the asymptotic properties of the nonparametric estimators obtained.