Software-related security aspects are a growing and legitimate concern, especially with 5G putting data right at our fingertips. To conduct research in this field, periodic comparative analysis is needed as new techniques emerge rapidly. The purpose of this study is to review recent developments in the field of security integration in the software development lifecycle (SDLC) by analyzing articles published in the last two decades, and to propose a way forward. This review follows Kitchenham's review protocol and is divided into three main stages: planning, execution, and analysis. From the 100 selected articles, it becomes evident that a collaborative approach is necessary for addressing critical software security risks (CSSRs) through effective risk management and estimation techniques. Quantifying risks on a numeric scale enables a comprehensive understanding of their severity, facilitating focused resource allocation and mitigation efforts. Through a comprehensive understanding of potential vulnerabilities and proactive mitigation efforts facilitated by protection poker, organizations can prioritize resources effectively to ensure the successful outcome of projects and initiatives in today's dynamic threat landscape. The review reveals that threat analysis and security testing need automated tool support in the future. Accurate estimation of the effort required to prioritize potential security risks is a major challenge in software security. The accuracy of effort estimation can be further improved by exploring new techniques, particularly those involving deep learning. It is also imperative to validate these effort estimation methods to ensure all potential security threats are addressed. Another challenge is selecting the right model for each specific security threat. To achieve a comprehensive evaluation, researchers should use well-known benchmark checklists.
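The numeric risk quantification the review highlights can be illustrated with a small sketch. This is not taken from any of the reviewed papers: the 1-5 likelihood/impact scales, the sample risks, and the likelihood × impact scoring rule are illustrative assumptions in the spirit of techniques such as protection poker.

```python
# Illustrative sketch (assumed, not from the reviewed studies): scoring
# security risks on a numeric scale so mitigation effort can be prioritized.

def risk_score(likelihood: int, impact: int) -> int:
    """Exposure score as likelihood x impact, each on an assumed 1-5 scale."""
    if not (1 <= likelihood <= 5 and 1 <= impact <= 5):
        raise ValueError("likelihood and impact must be in 1..5")
    return likelihood * impact

def prioritize(risks: dict) -> list:
    """Sort named risks by descending exposure score."""
    scored = [(name, risk_score(l, i)) for name, (l, i) in risks.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Hypothetical risk register entries, (likelihood, impact):
risks = {
    "SQL injection": (4, 5),
    "Stale dependency": (3, 3),
    "Verbose error messages": (2, 2),
}
ranking = prioritize(risks)
```

In a team setting, the likelihood and impact values would come from group estimation rounds rather than a single engineer's guess.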
Purpose - The rapid development of China's railway construction has led to an increase in data generated by the high-speed rail (HSR) catenary system. Traditional management methods struggle with challenges such as poor information sharing, disconnected business applications and insufficient intelligence throughout the lifecycle. This study aims to address these issues by applying building information modeling (BIM) technology to improve lifecycle management efficiency for HSR catenary systems. Design/methodology/approach - Based on the lifecycle management needs of catenary engineering and incorporating the intelligent HSR "Model-Data Driven, Axis-Plane Coordination" philosophy, this paper constructs a BIM-based lifecycle management framework for HSR catenary engineering. Findings - This study investigates the full-process lifecycle management of the catenary system across the design, manufacture, construction and operation stages, exploring integrated BIM models and data transmission methods, along with key technologies for BIM model transmission, transformation and lightweighting. Originality/value - This study establishes a lossless information circulation and transmission system for HSR catenary lifecycle management. Multi-stage applications are verified through the construction of the Chongqing-Kunming High-Speed Railway, comprehensively advancing the intelligent promotion and high-quality development of catenary engineering.
With the rapid adoption of artificial intelligence (AI) in domains such as power, transportation, and finance, the number of machine learning and deep learning models has grown exponentially. However, challenges such as delayed retraining, inconsistent version management, insufficient drift monitoring, and limited data security still hinder efficient and reliable model operations. To address these issues, this paper proposes the Intelligent Model Lifecycle Management Algorithm (IMLMA). The algorithm employs a dual-trigger mechanism based on both data volume thresholds and time intervals to automate retraining, and applies Bayesian optimization for adaptive hyperparameter tuning to improve performance. A multi-metric replacement strategy, incorporating MSE, MAE, and R², ensures that new models replace existing ones only when performance improvements are guaranteed. A versioning and traceability database supports comparison and visualization, while real-time monitoring with stability analysis enables early warnings of latency and drift. Finally, hash-based integrity checks secure both model files and datasets. Experimental validation in a power metering operation scenario demonstrates that IMLMA reduces model update delays, enhances predictive accuracy and stability, and maintains low latency under high concurrency. This work provides a practical, reusable, and scalable solution for intelligent model lifecycle management, with broad applicability to complex systems such as smart grids.
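The dual-trigger retraining and multi-metric replacement ideas above can be sketched as follows. This is a hedged reconstruction, not the paper's algorithm: the class names, the either-condition trigger logic, and the strict improve-on-every-metric replacement rule are assumptions.

```python
# Hedged sketch in the spirit of IMLMA's dual-trigger retraining and
# multi-metric replacement; thresholds and names are assumptions.
import time

class RetrainTrigger:
    def __init__(self, volume_threshold: int, interval_seconds: float):
        self.volume_threshold = volume_threshold
        self.interval_seconds = interval_seconds
        self.new_samples = 0
        self.last_retrain = time.monotonic()

    def record(self, n_samples: int) -> None:
        """Account for newly arrived training data."""
        self.new_samples += n_samples

    def should_retrain(self) -> bool:
        """Fire when EITHER enough new data arrived OR enough time passed."""
        elapsed = time.monotonic() - self.last_retrain
        return (self.new_samples >= self.volume_threshold
                or elapsed >= self.interval_seconds)

    def reset(self) -> None:
        """Call after a retraining run completes."""
        self.new_samples = 0
        self.last_retrain = time.monotonic()

def should_replace(old: dict, new: dict) -> bool:
    """Replace only if the candidate improves on every tracked metric."""
    return (new["mse"] <= old["mse"]
            and new["mae"] <= old["mae"]
            and new["r2"] >= old["r2"])
```

A production system would persist both the trigger state and the metric history in the versioning database the abstract describes.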
Housing construction and municipal engineering have full-lifecycle characteristics involving multiple stages. Emphasizing the coherence and systematicity of each stage, the supervisor should establish a three-dimensional management system. By establishing quantitative evaluation models and visual monitoring schemes to ensure quality and safety, introducing cost control methods, and innovating collaborative management mechanisms, a supervision-led paradigm is ultimately formed and directions for the application of digital twin technology are proposed.
Measuring the lifecycle of low-carbon energy technologies is critical to better understanding the innovation pattern. However, previous studies on the lifecycle either focus on technical details or just provide a general overview, due to the lack of connection with innovation theories. This article attempts to fill this gap by analyzing the lifecycle from a combinatorial innovation perspective, based on patent data of ten low-carbon energy technologies in China from 1999 to 2018. The problem of estimating lifecycle stages can be transformed into analyzing the rise and fall of knowledge combinations. By building the international patent classification (IPC) co-occurrence matrix, this paper demonstrates the lifecycle evolution of technologies and develops an efficient quantitative index to define lifecycle stages. The mathematical measurement can effectively reflect the evolutionary pattern of technologies. Additionally, this article relates the macro evolution of the lifecycle to the micro dynamic mechanism of technology paradigms. The sign of technology maturity is that new inventions tend to follow the patterns established by prior ones. Following this logic, this paper identifies different trends of paradigms in each technology field and analyzes their transition. Furthermore, the catching-up literature shows that drastic transformation of technology paradigms may open "windows of opportunity" for laggard regions. From the results of this paper, it is clear that latecomers can catch up with pioneers, especially when there is a radical change in paradigms. Therefore, it is important for policy makers to capture such opportunities during the technology lifecycle and coordinate regional innovation resources.
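The IPC co-occurrence matrix at the core of this analysis can be sketched in a few lines. This is an assumed minimal implementation, not the paper's code, and the sample IPC classes are illustrative: each patent lists several IPC classes, and co-occurrence counts how often two classes appear on the same patent.

```python
# Illustrative sketch (assumed): building an IPC co-occurrence matrix
# from patent records, each record being the list of IPC classes assigned
# to one patent.
from itertools import combinations
from collections import Counter

def ipc_cooccurrence(patents: list) -> Counter:
    """Count unordered IPC-class pairs that co-occur on the same patent."""
    pairs = Counter()
    for ipc_classes in patents:
        # sort + dedupe so (A, B) and (B, A) land in the same counter key
        for a, b in combinations(sorted(set(ipc_classes)), 2):
            pairs[(a, b)] += 1
    return pairs

# Hypothetical sample: three patents in battery/EV technology.
patents = [
    ["H01M", "H02J"],          # battery + power-supply circuitry
    ["H01M", "H02J", "B60L"],  # adds electric-vehicle propulsion
    ["H01M", "B60L"],
]
matrix = ipc_cooccurrence(patents)
```

Tracking how such pair counts rise and fall over yearly patent cohorts is one way to operationalize the "rise and fall of knowledge combinations" the abstract describes.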
Hefei Light Source (HLS) is a synchrotron radiation light source that primarily produces vacuum ultraviolet and soft X-rays. It currently consists of ten experimental stations, including a soft X-ray microscopy station. As part of its ongoing efforts to establish a centralized scientific data management platform, HLS is in the process of developing a test system that covers the entire lifecycle of scientific data, including data generation, acquisition, processing, analysis, and destruction. However, the instruments used in the soft X-ray microscopy experimental station rely on commercial proprietary software for data acquisition and processing. We developed a semi-automatic data acquisition program to facilitate the integration of the soft X-ray microscopy station into the centralized scientific data management platform. Additionally, we created an online data processing platform to assist users in analyzing their scientific data. The system we developed and deployed meets the design requirements, successfully integrating the soft X-ray microscopy station into the full lifecycle management of scientific data.
To comprehensively assess fractionated spacecraft, an assessment tool is developed based on lifecycle simulation under uncertainty, driven by modular evolutionary stochastic models. First, fractionated spacecraft nomenclature and architecture are clarified, and assessment criteria are analyzed. The mean and standard deviation of risk-adjusted lifecycle cost and net present value (NPV) are defined as assessment metrics. Second, fractionated spacecraft sizing models are briefly described, followed by a detailed discussion of the risk-adjusted lifecycle cost and NPV models. Third, uncertainty sources over the fractionated spacecraft lifecycle are analyzed and modeled with probability theory. Then the chronological lifecycle simulation process is expounded, and simulation modules are developed with object-oriented methodology to build up the assessment tool. The preceding uncertainty models are integrated in these simulation modules, so random object status can be simulated and evolve along the lifecycle timeline. A case study investigating fractionated spacecraft for a hypothetical earth observation mission is carried out with the proposed assessment tool, and the results show that fractionation degree and launch manifest have great influence on cost and NPV, and that a fractionated spacecraft is generally more advanced than its monolithic counterpart under the effect of uncertainty. Finally, some conclusions are given and future research topics are highlighted.
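The kind of lifecycle simulation described above, producing a mean and standard deviation of risk-adjusted NPV, can be sketched with a simple Monte Carlo loop. All numbers here (cash flows, discount rate, anomaly probability and cost) are illustrative assumptions, not the paper's models, which also include sizing and evolutionary module logic.

```python
# Minimal Monte Carlo sketch, under assumed numbers, of computing the mean
# and standard deviation of a risk-adjusted NPV over a spacecraft lifecycle.
import random
import statistics

def simulate_npv(cash_flows, rate, loss_prob, loss_cost, runs=10_000, seed=42):
    """Each year may incur a random anomaly cost; discount and aggregate."""
    rng = random.Random(seed)
    samples = []
    for _ in range(runs):
        npv = 0.0
        for year, flow in enumerate(cash_flows):
            if rng.random() < loss_prob:   # e.g. a launch or on-orbit failure
                flow -= loss_cost
            npv += flow / (1 + rate) ** year
        samples.append(npv)
    return statistics.mean(samples), statistics.stdev(samples)

# Hypothetical mission: year 0 is build/launch cost, later years are revenue (M$).
mean_npv, std_npv = simulate_npv(
    cash_flows=[-100.0, 30.0, 40.0, 50.0, 60.0],
    rate=0.08, loss_prob=0.05, loss_cost=20.0)
```

Comparing the (mean, standard deviation) pair across fractionated and monolithic architectures is what makes the metric "risk-adjusted" rather than a single deterministic NPV.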
Two dynamic grey models DGM(1, 1) for the verification cycle and the lifecycle of a measuring instrument, based on a time sequence and a frequency sequence, were set up according to the statistical features of examination data and a weighting method. Through a specific case, the vernier caliper, it is shown that the fit precision and forecast precision of the models are high, that the cycles differ markedly under different working conditions, and that the forecast result of the frequency sequence model is better than that of the time sequence model. Combining the dynamic grey model with an auto-manufacturing case, the controlling and information subsystems of the verification cycle and the lifecycle, based on information integration, multi-sensor control and management control, are given. The models can be used in the production process to help enterprises reduce errors, costs and flaws.
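For readers unfamiliar with grey models, a plain GM(1,1) fit-and-forecast routine is sketched below. This is the standard textbook method, not the paper's time- or frequency-sequence DGM(1,1) variants, and the geometric test series is illustrative rather than the paper's vernier-caliper data.

```python
# Standard GM(1,1) grey model: accumulate the series (1-AGO), fit the
# whitened equation x0(k) + a*z1(k) = b by least squares, then forecast
# and restore by differencing (IAGO).
import math

def gm11(x0: list, steps: int = 1) -> list:
    """Fit GM(1,1) to series x0 and forecast `steps` future values."""
    n = len(x0)
    x1 = [sum(x0[: k + 1]) for k in range(n)]           # 1-AGO
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]  # background values
    y = x0[1:]
    # Least-squares estimate of a (development coefficient) and b (grey input)
    m = n - 1
    sz, szz = sum(z), sum(v * v for v in z)
    sy, szy = sum(y), sum(v * w for v, w in zip(z, y))
    det = m * szz - sz * sz
    a = (sz * sy - m * szy) / det
    b = (szz * sy - sz * szy) / det

    def x1_hat(k: int) -> float:
        """Time-response function of the accumulated series, k is 0-based."""
        return (x0[0] - b / a) * math.exp(-a * k) + b / a

    # Restore forecasts for the original series by first differences
    return [x1_hat(n + s) - x1_hat(n + s - 1) for s in range(steps)]
```

GM(1,1) fits near-exponential growth or decay well, which is why it suits small samples of cycle-length data where classical statistics lack power.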
The principles for lifecycle safety guarantee of engineering structures are proposed, and a conception is developed for building the safety guarantee system by integrating the monitoring, analysis and maintenance systems on the basis of multi-sensor, distributed-measurement, data fusion and digital-signal-processor (DSP) technologies, as well as the three-dimensional (3D) unified fatigue fracture theory. As all the systems should work in situ and in real time, miniaturization and integration are important. Damage detectability is introduced to clarify the relationship between life prediction and health monitoring or fault diagnosis. The research work needed to realize the lifecycle safety guarantee system is summarized and perspectives for future efforts are outlined.
Software engineering's lifecycle models have proven to be very important for traditional software development. However, can these models be applied to the development of Web-based applications as well? In recent years, Web-based applications have become more and more complicated, and a lot of effort has been placed on introducing new technologies such as J2EE, PHP, and .NET, which have been universally accepted as the development technologies for Web-based applications. However, there is no universally accepted process model for the development of Web-based applications. Moreover, shaping the process model for small and medium-sized enterprises (SMEs), which have limited resources, has been relatively neglected. Based on our previous work, this paper presents an expanded lifecycle process model for the development of Web-based applications in SMEs. It consists of three sets of processes, i.e., requirement processes, development processes, and evolution processes. In particular, the post-delivery evolution processes are important for SMEs to develop and maintain quality Web applications with limited resources and time.
Mass customization and global economic collaboration drive product development and management beyond the internal enterprise to cover the whole product value chain. To meet this requirement, a strategic approach focusing on data organization for product lifecycle management is proposed. The approach takes the product platform as the base, a view engine and rule-based access as the data access mechanism, and an integration and collaboration bus as an enabler, allowing participants involved in the product lifecycle to get convenient, Web-based access to internal and external content, applications, and services.
The provision of services as e-commodities and the wide use of networked business models have driven buyer experience to new heights. Innovation coupled with quality is the competitive advantage e-commerce vendors strive to achieve when designing or re-designing their software. In terms of quality, evaluation stood in the spotlight for many years; however, software analysis and design based on quality models have been used mostly for understanding rather than improving. In this work, we present a new model for the analysis and design of e-commerce software systems mapped to the software life-cycle process. Quality control procedures are mapped to every phase, from initiation and design to delivery. Based on current ISO standards such as the ISO 25000 series, this work discusses technical and managerial principles that need to be applied in order to obtain quality e-commerce software.
In recent times, interest in Supercritical Fluid Chromatography (SFC) has been growing across various domains, especially pharmaceutical analysis. However, to the best of our knowledge, modern SFC is not yet applied to drug quality control in the daily routine framework. Among the numerous reported SFC methods, none could be found that fully satisfies all steps of the analytical method lifecycle. The present contribution therefore aims to provide an overview of current and past achievements related to SFC techniques, with targeted attention to this lifecycle and its successive steps. The discussions are structured accordingly, emphasizing the analytical method lifecycle in accordance with the International Conference on Harmonisation (ICH). Recent and important scientific outputs in the field of analytical SFC, as well as instrumental evolution, qualification strategies, method development methodologies and discussions on the topic of method validation, are reviewed.
Based on an analysis of the whole lifecycle of equipment, in this paper we propose the lifecycle-oriented diagnosis and maintenance philosophy and expound its connotation and characteristics. We then present its open framework of the lifecycle-oriented diagnosis view in detail, which consists of three levels: information model, net model and open space. The open-space level covers the exchange of information, the integration of structure and the modeling of knowledge. Together, these three levels effectively capture the breadth and depth of the diagnosis process. Finally, the main work of the paper and future work on this problem are discussed.
In this paper, the assessment indicator hierarchy of the Product Multi-Lifecycle System (PMLS) is proposed based on the PMLS connotations and objectives, upon which the PMLS assessment method is discussed and an example is presented using fuzzy judgment theory.
The PLDIM (Product Lifecycle Dynamic Information Model) is the most important part of the PLDM (Product Lifecycle Dynamic Model) and is the basis for creating the information system and implementing PLM (Product Lifecycle Management). The information classification, the relationships among all information items, the PLDIM mathematical expression, information coding and the 3D synthetic description of the PLDIM are presented. The information flow and information system structure based on two information centers and Internet/Intranet are proposed, and how to implement this system for ship diesel engines is also introduced according to the PLDIM and PLM solutions.
In today's rapidly evolving digital landscape, web application security has become paramount as organizations face increasingly sophisticated cyber threats. This work presents a comprehensive methodology for implementing robust security measures in modern web applications, along with a proof of the methodology applied to a Vue.js, Spring Boot, and MySQL architecture. The proposed approach addresses critical security challenges through a multi-layered framework that encompasses essential security dimensions, including multi-factor authentication, fine-grained authorization controls, sophisticated session management, data confidentiality and integrity protection, secure logging mechanisms, comprehensive error handling, high availability strategies, advanced input validation, and security headers implementation. Significant contributions are made to the field of web application security. First, a detailed catalogue of security requirements is provided, specifically tailored to protect web applications against contemporary threats and backed by rigorous analysis and industry best practices. Second, the methodology is validated through a carefully designed proof-of-concept implementation in a controlled environment, demonstrating the practical effectiveness of the security measures. The validation process employs cutting-edge static and dynamic analysis tools for comprehensive dependency validation and vulnerability detection, ensuring robust security coverage. The validation results confirm that the methodology prevents and avoids security vulnerabilities. A key innovation of this work is the seamless integration of DevSecOps practices throughout the secure Software Development Life Cycle (SSDLC), creating a security-first mindset from initial design to deployment. By combining proactive secure coding practices with defensive security approaches, a framework is established that not only strengthens application security but also fosters a culture of security awareness within development teams. This hybrid approach ensures that security considerations are woven into every aspect of the development process, rather than being treated as an afterthought.
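The "security headers" dimension mentioned above can be made concrete with a small framework-agnostic sketch. The header names are web standards; the specific policy values are assumptions to be tuned per application, and this is not the paper's Vue.js/Spring Boot configuration.

```python
# Framework-agnostic sketch of applying and auditing HTTP security headers.
# Header names are standard; the policy values shown are assumed defaults.
SECURITY_HEADERS = {
    "Strict-Transport-Security": "max-age=31536000; includeSubDomains",
    "Content-Security-Policy": "default-src 'self'",
    "X-Content-Type-Options": "nosniff",
    "X-Frame-Options": "DENY",
    "Referrer-Policy": "no-referrer",
}

def apply_security_headers(response_headers: dict) -> dict:
    """Add any missing security headers without overriding explicit choices."""
    merged = dict(SECURITY_HEADERS)
    merged.update(response_headers)   # the application's own values win
    return merged

def missing_security_headers(response_headers: dict) -> list:
    """Report which recommended headers a response still lacks."""
    return sorted(h for h in SECURITY_HEADERS if h not in response_headers)
```

In a real deployment these defaults would be set once at the middleware or reverse-proxy layer, and the audit function would run inside the dynamic-analysis stage of the DevSecOps pipeline.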
The standards system for cultural heritage digitalization aims to build a clear and logically rigorous framework to guide the development and revision of relevant standards. This system enhances the scientific, systematic, and practical aspects of cultural heritage digitalization. This paper comprehensively analyzes the current status and needs of cultural heritage digitalization and standardization. It further examines the methods used to construct the standards system. Through comparative analysis, it establishes a lifecycle-based framework for cultural heritage. This framework accounts for the unique characteristics of cultural heritage and systematically integrates key processes such as the collection, processing, storage, transmission, and utilization of data. The standards system is divided into six sections: general, data, information, knowledge, intelligence, and application. Based on current digitalization efforts, this paper proposes key standardization directions for each section. This framework ensures the integrity and consistency of data throughout the digitalization process. It also supports the application of intelligent technologies in cultural heritage conservation, contributing to the sustainable preservation and utilization of cultural heritage data.
As a foundational component of cloud computing platforms, Virtual Machines (VMs) are confronted with numerous security threats. However, existing solutions tend to focus on threats in one specific state of the VM. In this paper, we propose a novel VM lifecycle security protection framework based on trusted computing to address the security threats to VMs throughout their entire lifecycle. Specifically, a concept of the VM lifecycle is presented, divided into stages by the different active conditions of the VM. Then, a trusted computing based security protection framework is developed, which can extend the trust relationship from the trusted platform module to the VM and protect the security and reliability of the VM throughout its lifecycle. Theoretical analysis shows that our proposed framework can provide comprehensive protection for the VM in all of its states. Furthermore, experimental results demonstrate that the proposed framework is feasible and achieves a higher level of security compared with some state-of-the-art schemes.
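The trust-extension idea rests on hash-chained measurements of each component before it runs. The following sketch models the TPM-style "extend" operation in Python; the component names and the zeroed 32-byte starting register are illustrative assumptions, not the paper's implementation.

```python
# Hedged sketch of the hash-chain "extend" operation underlying
# trusted-computing measurement (modeled on TPM PCR extension).
import hashlib

def extend(register: bytes, measurement: bytes) -> bytes:
    """New register value = SHA-256(old register || measurement)."""
    return hashlib.sha256(register + measurement).digest()

def measure_chain(components: list) -> bytes:
    """Fold component measurements into one register; order-sensitive."""
    register = b"\x00" * 32          # register starts zeroed at boot
    for blob in components:
        register = extend(register, hashlib.sha256(blob).digest())
    return register

# Hypothetical VM boot chain: any change or reordering in the measured
# components yields a different final register value.
boot_chain = [b"vm-firmware", b"vm-kernel", b"vm-initrd"]
expected = measure_chain(boot_chain)
tampered = measure_chain([b"vm-firmware", b"evil-kernel", b"vm-initrd"])
```

A verifier that knows the expected final value can thus detect tampering anywhere in the measured chain without inspecting each component individually.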
This article proposes that problems of information security are mainly caused by the ineffective integration of people, operation, and technology, not merely by the poor use of technology. Based on the information lifecycle, a model of the information security assurance lifecycle is presented. The crucial parts of the model, the information risk value and protection level, are further discussed, and the solution in each step of the lifecycle is presented with an ensured information risk level, in terms of the integration of people, operation, and technology.
Funding (BIM-based HSR catenary lifecycle management study): supported by the China Academy of Railway Sciences Foundation (Research on Multi-Agent Collaborative Mechanism of Intelligent High-Speed Rail System Based on Complex Adaptive System Theory, Grant 2023YJ392).
Funding (IMLMA study): funded by Anhui NARI ZT Electric Co., Ltd., under the project "Research on the Shared Operation and Maintenance Service Model for Metering Equipment and Platform Development for the Modern Industrial Chain" (Grant No. 524636250005).
Funding (low-carbon energy technology lifecycle study): supported by the Natural Science Foundation of China (Grants No. 42122006, 42471187).
Abstract: Measuring the lifecycle of low-carbon energy technologies is critical to better understanding innovation patterns. However, previous studies on the lifecycle either focus on technical details or provide only a general overview, owing to a lack of connection with innovation theories. This article attempts to fill this gap by analyzing the lifecycle from a combinatorial-innovation perspective, based on patent data for ten low-carbon energy technologies in China from 1999 to 2018. The problem of estimating lifecycle stages can be transformed into analyzing the rise and fall of knowledge combinations. By building the international patent classification (IPC) co-occurrence matrix, this paper demonstrates the lifecycle evolution of the technologies and develops an efficient quantitative index to define lifecycle stages. This mathematical measurement effectively reflects the evolutionary pattern of the technologies. Additionally, the article relates the macro evolution of the lifecycle to the micro dynamic mechanism of technology paradigms. The sign of technological maturity is that new inventions tend to follow the patterns established by prior ones. Following this logic, the paper identifies different paradigm trends in each technology field and analyzes their transitions. Furthermore, the catching-up literature shows that drastic transformations of technology paradigms may open "windows of opportunity" for laggard regions. The results make clear that latecomers can catch up with pioneers, especially when there is a radical change in paradigms. It is therefore important for policy makers to capture such opportunities during the technology lifecycle and to coordinate regional innovation resources.
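The co-occurrence matrix described above can be illustrated with a toy example. The counting itself is straightforward: each patent carries a set of IPC codes, and every unordered pair of codes on the same patent increments one cell of the matrix (the patent records below are invented for illustration):

```python
from collections import Counter
from itertools import combinations

def ipc_cooccurrence(patents):
    """Count how often two IPC codes appear on the same patent.
    `patents` is an iterable of lists of IPC subclass codes."""
    counts = Counter()
    for codes in patents:
        # sorted(set(...)) deduplicates codes and fixes pair ordering
        for pair in combinations(sorted(set(codes)), 2):
            counts[pair] += 1
    return counts

# Three hypothetical patents tagged with IPC subclasses.
patents = [["H01M", "H02J"], ["H01M", "H02J", "F03D"], ["F03D", "H02J"]]
counts = ipc_cooccurrence(patents)  # counts[("H01M", "H02J")] == 2
```

Tracking how such pair counts rise and fall over successive years is what lets the paper turn lifecycle-stage estimation into an analysis of knowledge combinations.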
Funding: Supported by the Fundamental Research Funds for the Central Universities (WK2310000102).
Abstract: Hefei Light Source (HLS) is a synchrotron radiation light source that primarily produces vacuum ultraviolet and soft X-rays. It currently comprises ten experimental stations, including a soft X-ray microscopy station. As part of its ongoing effort to establish a centralized scientific data management platform, HLS is developing a test system that covers the entire lifecycle of scientific data, including data generation, acquisition, processing, analysis, and destruction. However, the instruments at the soft X-ray microscopy station rely on commercial proprietary software for data acquisition and processing. We therefore developed a semi-automatic data acquisition program to facilitate the station's integration into the centralized platform, and created an online data processing platform to help users analyze their scientific data. The system we developed and deployed meets the design requirements, successfully bringing the soft X-ray microscopy station into the full lifecycle management of scientific data.
Funding: Supported by the National Natural Science Foundation of China (50975280, 61004094), the Program for New Century Excellent Talents in University (NCET-08-0149), the Fund of Innovation of the Graduate School of the National University of Defense Technology (B090102), and the Hunan Provincial Innovation Foundation for Postgraduates, China.
Abstract: To comprehensively assess fractionated spacecraft, an assessment tool is developed based on lifecycle simulation under uncertainty, driven by modular evolutionary stochastic models. First, fractionated-spacecraft nomenclature and architecture are clarified and assessment criteria are analyzed; the mean and standard deviation of the risk-adjusted lifecycle cost and net present value (NPV) are defined as assessment metrics. Second, fractionated-spacecraft sizing models are briefly described, followed by a detailed discussion of the risk-adjusted lifecycle cost and NPV models. Third, uncertainty sources over the fractionated-spacecraft lifecycle are analyzed and modeled with probability theory. The chronological lifecycle simulation process is then expounded, and simulation modules are developed with object-oriented methodology to build the assessment tool. The preceding uncertainty models are integrated into these simulation modules, so random object status can be simulated and evolve along the lifecycle timeline. A case study investigating a fractionated spacecraft for a hypothetical Earth-observation mission is carried out with the proposed tool; the results show that the fractionation degree and launch manifest strongly influence cost and NPV, and that, under uncertainty, the fractionated spacecraft is generally more advantageous than its monolithic counterpart. Finally, conclusions are given and future research topics are highlighted.
Abstract: Two dynamic grey models, DGM(1,1), for the verification cycle and the lifecycle of measuring instruments, based on a time sequence and a frequency sequence respectively, were set up according to the statistical features of examination data and a weighting method. A specific case, the vernier caliper, shows that the fit and forecast precision of the models is high, that the cycles differ markedly under different working conditions, and that the frequency-sequence model forecasts better than the time-sequence model. Combining the dynamic grey models with an auto-manufacturing case, controlling and information subsystems for the verification cycle and the lifecycle were given, based on information integration, multi-sensor control, and management control. The models can be used in the production process to help enterprises reduce errors, costs, and flaws.
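The grey models above build on the classic GM(1,1) construction: accumulate the raw series, fit the grey differential equation by least squares, and forecast from the time-response function. A minimal sketch follows; the paper's dynamic, weighted DGM(1,1) variants and its frequency-sequence formulation are not reproduced here:

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """Fit the classic GM(1,1) grey model to series x0 and forecast
    `steps` values beyond the data."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                        # accumulated generating sequence
    z1 = 0.5 * (x1[1:] + x1[:-1])             # mean-generated background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]   # grey parameters
    k = np.arange(len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # time-response function
    x0_hat = np.diff(x1_hat, prepend=0.0)     # restore the original sequence
    return x0_hat[len(x0):]                   # out-of-sample forecast only
```

For a roughly exponential series such as 100, 110, 121, 133.1 (10% growth), the one-step forecast lands near the continued growth of about 146, which is the regime where GM(1,1) performs well.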
Funding: This project is supported by the National Natural Science Foundation of China (No. 50275073) and the Space Science Foundation of China (No. 03B52011).
Abstract: Principles for the lifecycle safety guarantee of engineering structures are proposed, and a conception is developed for building the safety guarantee system by integrating the monitoring, analysis, and maintenance systems on the basis of multi-sensor, distributed-measurement, data-fusion, and digital-signal-processor (DSP) technologies, as well as the three-dimensional (3D) unified fatigue-fracture theory. As all the systems should work in situ and in real time, miniaturization and integration are important. Damage detectability is introduced to clarify the relationship between life prediction and health monitoring or fault diagnosis. The research work toward realizing the lifecycle safety guarantee system is summarized, and perspectives for future efforts are outlined.
Abstract: Software engineering's lifecycle models have proven very important for traditional software development. However, can these models be applied to the development of Web-based applications as well? In recent years, Web-based applications have become more and more complicated, and much effort has gone into introducing new technologies such as J2EE, PHP, and .NET, which have been widely accepted as development technologies for Web-based applications. However, there is no universally accepted process model for their development. Moreover, shaping the process model for small and medium-sized enterprises (SMEs), which have limited resources, has been relatively neglected. Based on our previous work, this paper presents an expanded lifecycle process model for the development of Web-based applications in SMEs. It consists of three sets of processes: requirement processes, development processes, and evolution processes. In particular, the post-delivery evolution processes are important for SMEs to develop and maintain quality Web applications with limited resources and time.
Abstract: Mass customization and global economic collaboration drive product development and management beyond the internal enterprise to cover the whole product value chain. To meet this requirement, a strategic approach focusing on data organization for product lifecycle management is proposed. The approach takes the product platform as its base, a view engine and rule-based access as the data-access mechanism, and an integration and collaboration bus as the enabler that allows participants in the product lifecycle to gain convenient, Web-based access to internal and external content, applications, and services.
Abstract: The provision of services as e-commodities and the wide use of networked business models have driven buyer experience to new heights. Innovation, coupled with quality, is the competitive advantage e-commerce vendors strive for when designing or re-designing their software. In terms of quality, evaluation has stood in the spotlight for many years; however, software analysis and design based on quality models have mostly been used for understanding rather than improving. In this work, we present a new model for the analysis and design of e-commerce software systems, mapped to the software lifecycle process. Quality-control procedures are mapped to every phase, from initiation and design to delivery. Based on current ISO standards such as the ISO 25000 series, this work discusses the technical and managerial principles that must be applied to obtain quality e-commerce software.
Abstract: In recent times, overall interest in Supercritical Fluid Chromatography (SFC) has been growing across various domains, especially pharmaceutical analysis. However, to the best of our knowledge, modern SFC is not yet applied to routine drug quality control. Among the numerous reported SFC methods, none could be found that fully satisfies all steps of the analytical method lifecycle. The present contribution therefore provides an overview of current and past achievements related to SFC techniques, with targeted attention to this lifecycle and its successive steps. The discussion is structured accordingly, emphasizing the analytical method lifecycle in accordance with the International Conference on Harmonisation (ICH). Recent and important scientific outputs in the field of analytical SFC, as well as instrumental evolution, qualification strategies, method-development methodologies, and method validation, are reviewed.
Abstract: Based on an analysis of the whole lifecycle of equipment, this paper proposes a lifecycle-oriented diagnosis and maintenance philosophy and expounds its connotation and characteristics. It then presents in detail the open framework of the lifecycle-oriented diagnosis view, which consists of three levels: the information model, the net model, and the open space. The open-space level covers the interchange of information, the integration of structure, and the modeling of knowledge. Together, these three levels capture the synthesis of the diagnosis process in both length and breadth. Finally, the main work of the paper and future work on this problem are discussed.
Abstract: In this paper, an assessment indicator hierarchy for the Product Multi-Lifecycle System (PMLS) is proposed based on the PMLS connotations and objectives; upon this hierarchy, a PMLS assessment method is discussed, and an example using fuzzy judging theory is presented.
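Fuzzy judging over an indicator hierarchy is commonly computed by combining indicator weights with a membership matrix (indicators × judgement grades). The sketch below is a toy illustration of that general technique; the indicator weights, grades, and numbers are invented and not taken from the paper:

```python
import numpy as np

def fuzzy_evaluate(weights, membership):
    """Weighted-average fuzzy operator M(·, +): combine indicator
    weights (n,) with a membership matrix (n indicators x m grades)
    and return the normalized grade distribution."""
    w = np.asarray(weights, dtype=float)
    R = np.asarray(membership, dtype=float)
    b = w @ R
    return b / b.sum()   # degree of membership in each judgement grade

# Three indicators, three grades (good / fair / poor); numbers invented.
grades = fuzzy_evaluate([0.5, 0.3, 0.2],
                        [[0.6, 0.3, 0.1],
                         [0.2, 0.5, 0.3],
                         [0.1, 0.2, 0.7]])
```

The grade with the largest resulting membership is taken as the overall judgement; hierarchical schemes apply the same operation level by level up the indicator tree.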
Abstract: The PLDIM (Product Lifecycle Dynamic Information Model) is the most important part of the PLDM (Product Lifecycle Dynamic Model) and the basis for creating the information system and implementing PLM (Product Lifecycle Management). The information classification, the relationships among all information items, the mathematical expression of the PLDIM, information coding, and the 3D synthetic description of the PLDIM are presented. An information flow and an information-system structure based on two information centers and the Internet/Intranet are proposed, and how to implement this system for ship diesel engines is also introduced according to the PLDIM and the PLM solutions.
Abstract: In today's rapidly evolving digital landscape, web application security has become paramount as organizations face increasingly sophisticated cyber threats. This work presents a comprehensive methodology for implementing robust security measures in modern web applications, together with a proof of concept applying the methodology to a Vue.js, Spring Boot, and MySQL architecture. The proposed approach addresses critical security challenges through a multi-layered framework covering the essential security dimensions: multi-factor authentication, fine-grained authorization controls, sophisticated session management, data confidentiality and integrity protection, secure logging mechanisms, comprehensive error handling, high-availability strategies, advanced input validation, and security-header implementation. The work makes two significant contributions to web application security. First, it offers a detailed catalogue of security requirements specifically tailored to protect web applications against contemporary threats, backed by rigorous analysis and industry best practices. Second, the methodology is validated through a carefully designed proof-of-concept implementation in a controlled environment, demonstrating the practical effectiveness of the security measures. The validation process employs state-of-the-art static and dynamic analysis tools for comprehensive dependency validation and vulnerability detection, and the results confirm that the methodology prevents and avoids security vulnerabilities. A key innovation of this work is the seamless integration of DevSecOps practices throughout the secure Software Development Life Cycle (SSDLC), creating a security-first mindset from initial design to deployment. By combining proactive secure-coding practices with defensive security approaches, a framework is established that not only strengthens application security but also fosters a culture of security awareness within development teams. This hybrid approach ensures that security considerations are woven into every aspect of the development process, rather than being treated as an afterthought.
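One of the framework's dimensions, security-header implementation, is easy to make concrete. The sketch below is a framework-agnostic illustration, not the paper's Spring Boot implementation; the header values are widely published hardening recommendations, not prescriptions from the paper:

```python
# Common hardening headers; values are widespread recommendations.
SECURITY_HEADERS = {
    "Strict-Transport-Security": "max-age=31536000; includeSubDomains",
    "X-Content-Type-Options": "nosniff",
    "X-Frame-Options": "DENY",
    "Content-Security-Policy": "default-src 'self'",
    "Referrer-Policy": "no-referrer",
}

def apply_security_headers(response_headers):
    """Merge the security defaults into a response's header dict,
    keeping any value the application already set explicitly."""
    merged = dict(SECURITY_HEADERS)
    merged.update(response_headers)
    return merged
```

Centralizing the defaults in one middleware-style function is what makes the control auditable: a static-analysis or DAST tool can verify every response path passes through it.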
Funding: Supported by "The Palace Museum Talent Program". The Palace Museum Talent Program is supported by The Hong Kong Jockey Club and exclusively sponsored by the Institute of Philanthropy.
Abstract: The standards system for cultural heritage digitalization aims to build a clear and logically rigorous framework to guide the development and revision of relevant standards, enhancing the scientific rigor, systematicity, and practicality of cultural heritage digitalization. This paper comprehensively analyzes the current status of and needs for cultural heritage digitalization and standardization, and examines the methods used to construct the standards system. Through comparative analysis, it establishes a lifecycle-based framework for cultural heritage that accounts for the heritage's unique characteristics and systematically integrates key processes such as the collection, processing, storage, transmission, and utilization of data. The standards system is divided into six sections: general, data, information, knowledge, intelligence, and application. Based on current digitalization efforts, the paper proposes key standardization directions for each section. The framework ensures the integrity and consistency of data throughout the digitalization process and supports the application of intelligent technologies in cultural heritage conservation, contributing to the sustainable preservation and utilization of cultural heritage data.
Funding: Supported by the National Natural Science Foundation of China (Nos. 61802270 and 61802271) and the Fundamental Research Funds for the Central Universities (Nos. SCU2018D018 and SCU2018D022).
Abstract: As a foundational component of cloud computing platforms, Virtual Machines (VMs) are confronted with numerous security threats, yet existing solutions tend to address threats only in a specific state of the VM. In this paper, we propose a novel VM lifecycle security protection framework based on trusted computing to counter security threats to VMs throughout their entire lifecycle. Specifically, a concept of the VM lifecycle is presented, divided according to the VM's different active conditions. A trusted-computing-based security protection framework is then developed, which can extend the trust relationship from the trusted platform module to the VM and protect the VM's security and reliability throughout its lifecycle. Theoretical analysis shows that the proposed framework provides comprehensive safety for the VM in all of its states, and experimental results demonstrate that it is feasible and achieves a higher level of security than some state-of-the-art schemes.
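The measurement step that underlies such trusted-computing protection can be illustrated in miniature: before a VM image is trusted, it is hashed and the digest compared against a recorded reference value. The sketch below shows only the digest side; the paper anchors the trust chain in a TPM, which is not reproduced here, and the function names are ours:

```python
import hashlib

def measure(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 so large images need not fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            digest.update(block)
    return digest.hexdigest()

def verify(path, expected_digest):
    """An image is trusted only if its measurement matches the recorded value."""
    return measure(path) == expected_digest
```

Any modification to the image, even a single appended byte, changes the digest and causes verification to fail, which is what lets the framework detect tampering between lifecycle states.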
Abstract: This article proposes that problems of information security are mainly caused by the ineffective integration of people, operations, and technology, not merely by the poor use of technology. Based on the information lifecycle, a model of the information security assurance lifecycle is presented. The crucial parts of the model are discussed further, along with the information risk value and protection level, and the solution at each step of the lifecycle is presented so as to ensure the required information risk level, in terms of the integration of people, operations, and technology.