An evaluation approach for the response time probability distribution of workflows, based on the fluid stochastic Petri net formalism, is presented. Firstly, some problems concerning stochastic workflow net modeling are discussed. Then, the conversion of a stochastic workflow net model into a fluid stochastic Petri net model is described. The response time distribution can be obtained directly from the transient-state solution of the fluid stochastic Petri net model. The proposed approach places no restrictions on the structure of workflow models, and the processing times of workflow tasks can be modeled using arbitrary probability distributions. Large workflow models can be tackled efficiently by recursively applying a net reduction technique.
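The quantity this approach evaluates can be illustrated with a simple Monte Carlo sketch (not the fluid stochastic Petri net transient solution itself): a hypothetical four-task workflow with one parallel branch, where each task time follows an arbitrary distribution, as the approach permits. The workflow shape and all distributions here are illustrative assumptions.

```python
import random

def sample_response_time(rng):
    # Hypothetical 4-task workflow: A, then B and C in parallel, then D.
    # Task times use arbitrary distributions, as the approach permits.
    a = rng.expovariate(1.0)          # exponential, mean 1
    b = rng.uniform(0.5, 1.5)         # uniform
    c = rng.lognormvariate(0.0, 0.5)  # lognormal
    d = rng.gammavariate(2.0, 0.5)    # gamma, mean 1
    return a + max(b, c) + d          # parallel branch ends at the slower task

def empirical_cdf(samples, t):
    # Estimate P(response time <= t) from Monte Carlo samples.
    return sum(1 for s in samples if s <= t) / len(samples)

rng = random.Random(42)
samples = [sample_response_time(rng) for _ in range(20000)]
p = empirical_cdf(samples, 3.0)  # probability the workflow finishes within 3 time units
```

The analytical method in the paper obtains this distribution exactly from the transient solution; the simulation merely shows what is being computed.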
In recent years, several researchers have applied workflow technologies to service automation in ubiquitous computing environments. However, most context-aware workflows do not offer a method to compose several workflows into a larger or more complicated workflow; they provide only a simple workflow model, not a composite one. In this paper, the authors propose a context-aware workflow model that supports composite workflows by expanding the patterns of existing context-aware workflows, which support only the basic workflow patterns. The suggested model offers composite workflow patterns for a context-aware workflow, consisting of various flow patterns such as simple, split, and parallel flows, as well as subflows. With the suggested model, existing workflows can easily be reused to build a new workflow. As a result, it can save development effort and time for context-aware workflows and increase workflow reusability. The suggested model is therefore expected to ease the development of applications for context-aware workflow services in ubiquitous computing environments.
Based on acquaintance caching and group-based intelligent forwarding of service recommendations, a novel group-based active service (GAS) protocol for migrating workflows is proposed. This protocol does not require service requesters to discover services or resources. A semantic acquaintance knowledge representation is exploited to describe service groups, and this semantic information is used to recommend services to the respective clients. The experimental results show that the proposed protocol outperforms other protocols in terms of first response time, success scope, and the ratio of successful packets to total packets. When the number of service request packets is 20, the first response time of the GAS protocol is only 5.1 s, significantly lower than that of the other protocols. Its success scope is 49.1%, showing that the GAS protocol can effectively improve the reliability of mobile transactions, and its ratio of successful packets to total packets reaches 0.080, clearly higher than that of the other protocols.
As the Internet of Things(IoT)and mobile devices have rapidly proliferated,their computationally intensive applications have developed into complex,concurrent IoT-based workflows involving multiple interdependent task...As the Internet of Things(IoT)and mobile devices have rapidly proliferated,their computationally intensive applications have developed into complex,concurrent IoT-based workflows involving multiple interdependent tasks.By exploiting its low latency and high bandwidth,mobile edge computing(MEC)has emerged to achieve the high-performance computation offloading of these applications to satisfy the quality-of-service requirements of workflows and devices.In this study,we propose an offloading strategy for IoT-based workflows in a high-performance MEC environment.The proposed task-based offloading strategy consists of an optimization problem that includes task dependency,communication costs,workflow constraints,device energy consumption,and the heterogeneous characteristics of the edge environment.In addition,the optimal placement of workflow tasks is optimized using a discrete teaching learning-based optimization(DTLBO)metaheuristic.Extensive experimental evaluations demonstrate that the proposed offloading strategy is effective at minimizing the energy consumption of mobile devices and reducing the execution times of workflows compared to offloading strategies using different metaheuristics,including particle swarm optimization and ant colony optimization.展开更多
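The trade-off such an offloading strategy optimizes can be sketched with a textbook local-versus-edge cost model. The E = κ·C·f² energy term and all parameter values below are standard illustrative assumptions, not taken from the paper:

```python
def local_cost(cycles, f_local, kappa=1e-27):
    # Execution time and dynamic energy for running the task on the device.
    # E = kappa * cycles * f^2 is a common modeling assumption.
    time = cycles / f_local
    energy = kappa * cycles * f_local ** 2
    return time, energy

def offload_cost(data_bits, bandwidth, cycles, f_edge, p_tx=0.5):
    # Upload time plus edge execution time; the device spends energy
    # only while transmitting (p_tx watts).
    t_tx = data_bits / bandwidth
    t_exec = cycles / f_edge
    return t_tx + t_exec, p_tx * t_tx

# Toy task: a 1-Gcycle job with a 1 MB input
t_l, e_l = local_cost(1e9, f_local=1e9)        # 1 GHz device CPU
t_o, e_o = offload_cost(8e6, 10e6, 1e9, 10e9)  # 10 Mbps link, 10 GHz edge server
```

For this toy task offloading wins on both objectives; with a slower link or a smaller job the balance flips, which is exactly the decision the metaheuristic searches over for every task in the workflow.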
With the prevalence of service-oriented architecture (SOA), web services have become the dominant technology for constructing workflow systems. As a workflow is the composition of a series of interrelated web services that realize its activities, workflow interoperability can be treated as the composition of web services. To address this, a framework for the interoperability of business process execution language (BPEL)-based workflows is presented, covering three phases: transformation, conformance testing, and execution. The core components of the framework are described, with emphasis on how they promote interoperability. In particular, dynamic binding and re-composition of workflows based on web service testing are presented. An example of business-to-business (B2B) collaboration illustrates how composition and conformance testing are performed.
With the rapid growth of the Industrial Internet of Things (IIoT), mobile edge computing (MEC) has become widely used in many emerging scenarios. In MEC, each workflow task can be executed locally or offloaded to the edge to improve quality of service (QoS) and reduce energy consumption. However, most existing offloading strategies focus on independent applications and cannot be applied efficiently to workflow applications with a series of dependent tasks. To address this issue, this paper proposes an energy-efficient task offloading strategy for large-scale workflow applications in MEC. First, we formulate task offloading as an optimization problem whose goal is to minimize the utility cost, defined as the trade-off between energy consumption and total execution time. Then, a novel heuristic algorithm named Green DVFS-GA is proposed, which combines a task offloading step based on a genetic algorithm with a further step that reduces energy consumption using the Dynamic Voltage and Frequency Scaling (DVFS) technique. Experimental results show that the proposed strategy significantly reduces energy consumption and achieves the best trade-off compared with other strategies.
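The DVFS step exploits schedule slack: running a task at the lowest frequency that still meets its deadline cuts dynamic energy quadratically. A minimal sketch under the common E = κ·C·f² assumption (not necessarily the paper's exact energy model):

```python
def dvfs_energy(cycles, f, kappa=1e-27):
    # Dynamic energy under the common E = kappa * cycles * f^2 model.
    return kappa * cycles * f ** 2

def slowest_feasible_frequency(cycles, f_max, deadline):
    # cycles / deadline is the frequency that finishes exactly on time;
    # never exceed the core's maximum frequency.
    return min(f_max, cycles / deadline)

cycles, f_max = 1e9, 2e9   # a 1-Gcycle task on a 2 GHz core (0.5 s at full speed)
deadline = 1.0             # but 1.0 s of schedule slack is available
f_new = slowest_feasible_frequency(cycles, f_max, deadline)
saving = 1 - dvfs_energy(cycles, f_new) / dvfs_energy(cycles, f_max)
```

Halving the frequency here saves 75% of the dynamic energy while still meeting the deadline, which is why a DVFS pass after the genetic-algorithm placement step can lower energy without changing the schedule's feasibility.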
Workflow-based systems are typically said to lead to better use of staff, better management, and higher productivity. The first phase in building a workflow-based system is capturing the real-world process in a conceptual representation suitable for the subsequent phases of formalization and implementation. The specification may be in text or diagram form, or written in a formal language. This paper proposes a flow-based diagrammatic methodology as a tool for workflow specification. The expressiveness of the method is appraised through its ability to capture a workflow-based application. We show that the proposed conceptual diagrams can express situations arising in practice and serve as an alternative to the tools currently used in workflow systems. This is demonstrated by using the proposed methodology to partially build demonstration systems for two government agencies.
The ease of accessing a virtually unlimited pool of resources makes Infrastructure as a Service (IaaS) clouds an ideal platform for running data-intensive workflow applications comprising hundreds of computational tasks. However, executing scientific workflows in IaaS cloud environments poses significant challenges due to conflicting objectives, such as minimizing execution time (makespan) and reducing resource utilization costs. Responding to the growing need for efficient and adaptable optimization in dynamic, complex environments, this study presents an innovative multi-objective approach for scheduling scientific workflows in IaaS clouds. The proposed algorithm, MOS-MWMC, aims to minimize total makespan and resource utilization costs by leveraging key features of virtual machine instances, such as a high number of cores and fast local SSD storage. By integrating realistic simulations based on the WRENCH framework, the method effectively dimensions the cloud infrastructure and optimizes resource usage. Experimental results highlight the superiority of MOS-MWMC over the benchmark algorithms HEFT and Max-Min: the Pareto fronts obtained for the CyberShake, Epigenomics, and Montage workflows lie closer to the optimal front, confirming the algorithm's ability to balance conflicting objectives. This work contributes to optimizing scientific workflows in complex environments by providing solutions tailored to specific user needs while minimizing costs and execution times.
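The multi-objective comparison behind such Pareto fronts rests on the standard dominance relation between (makespan, cost) pairs; a minimal sketch, with illustrative numbers rather than the paper's data:

```python
def dominates(a, b):
    # a dominates b if a is no worse in every objective and strictly
    # better in at least one (both objectives are minimized).
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    # Keep every schedule that no other schedule dominates.
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# (makespan in s, cost in $) of candidate schedules -- illustrative numbers
schedules = [(100, 9.0), (120, 5.0), (150, 4.0), (130, 6.0), (160, 4.5)]
front = pareto_front(schedules)  # the non-dominated trade-off curve
```

A front "closer to the optimal front" means its points dominate (or tie with) more of the competitors' schedules across the whole makespan-cost range.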
The current application of digital workflows for the understanding, promotion, and participation in the conservation of heritage sites involves several technical challenges and should be governed by serious ethical engagement. Recording consists of capturing (or mapping) the physical characteristics of the character-defining elements that give cultural heritage sites their significance. Usually, the outcome of this work represents the cornerstone information serving their conservation, whether it is used actively for maintaining them or for ensuring a record for posterity in case of destruction. The records produced can guide the decision-making process at different levels by property owners, site managers, public officials, and conservators around the world, as well as present the historical knowledge and values of these resources. Rigorous documentation may also serve a broader purpose: over time, it becomes the primary means by which scholars and the public apprehend a site that has since changed radically or disappeared. This contribution aims to provide an overview of the potential applications and threats of the technology utilized by heritage recording professionals, addressing the need to develop ethical principles that can improve heritage recording practice at large.
Research data currently face a huge increase in data objects, with an increasing variety of types (data types, formats) and of the workflows by which objects must be managed across their lifecycle by data infrastructures. Researchers wish to shorten the workflows from data generation to analysis and publication, and the full workflow needs to become transparent to multiple stakeholders, including research administrators and funders. This poses challenges for research infrastructures and user-oriented data services, not only in making data and workflows findable, accessible, interoperable, and reusable, but also in doing so in a way that leverages machine support for better efficiency. One primary need is findability, and achieving better findability benefits other aspects of data and workflow management. In this article, we describe how machine capabilities can be extended to make workflows more findable, in particular by leveraging the Digital Object Architecture, common object operations, and machine learning techniques.
In Geographic Information Systems (GIS), geoprocessing workflows allow analysts to organize their methods on spatial data into complex chains. We propose a method for expressing workflows as linked data and for semi-automatically enriching them with semantics at the level of their operations and datasets. Linked workflows can easily be published on the Web and queried for types of inputs, results, or tools. Thus, GIS analysts can reuse their workflows in a modular way, selecting, adapting, and recommending resources based on compatible semantic types. Our typing approach starts from minimal annotations of workflow operations with classes of GIS tools, and then propagates data types and implicit semantic structures through the workflow using an OWL typing scheme and SPARQL rules, backtracking over GIS operations. The method is implemented in Python and evaluated on two real-world geoprocessing workflows generated with Esri's ArcGIS. To illustrate potential applications of our typing method, we formulate and execute competency questions over these workflows.
When defining environmental indicators, using existing initiatives should take priority over redefining indicators each time. From an information and communication technology perspective, data interoperability and standardization are critical to improving data access and exchange, as promoted by the Group on Earth Observations. GEOEssential follows an end-user-driven approach by defining Essential Variables (EVs) as an intermediate layer between environmental policy indicators and their appropriate data sources. Environmental policies and indicators are increasingly percolating down from global to local agendas. The scientific business processes for generating EVs and related indicators can be formalized in workflows specifying the necessary logical steps. To this aim, GEOEssential is developing a Virtual Laboratory whose main objective is to instantiate conceptual workflows, stored in a dedicated knowledge base, into executable workflows. To interpret and present the relevant outputs produced by the thematic workflows considered in GEOEssential (i.e., biodiversity, ecosystems, extractives, night light, and the food-water-energy nexus), a dashboard is built as a visual front end. This is a valuable instrument for tracking progress towards environmental policy goals.
There is growing recognition of the interdependencies among the supply systems for food, water, and energy. Billions of people lack safe and sufficient access to these systems, while global demand grows rapidly and resource constraints increase. Modeling frameworks are considered one of the few means available to understand the complex interrelationships among the sectors; however, the development of nexus-related frameworks has been limited. We describe three open-source models well known in their respective domains (TerrSysMP, WOFOST, and SWAT) whose components, if combined, could help decision-makers address the nexus issue. As a first step, we propose developing simple workflows that utilize essential variables and address components of the above-mentioned models, which can act as building blocks for an eventual comprehensive nexus model framework. The outputs of the workflows and the model framework are designed to address the SDGs.
Since their introduction by James Dixon in 2010, data lakes have received growing attention, driven by the promise of high reusability of the stored data due to schema-on-read semantics. Building on this idea, several additional requirements have been discussed in the literature to improve the general usability of the concept, such as a central metadata catalog including all provenance information, overarching data governance, and integration with (high-performance) processing capabilities. Although the necessity of a logical and physical organization of data lakes to meet those requirements is widely recognized, no concrete guidelines have yet been provided. The most common architecture implementing this conceptual organization is the zone architecture, where data is assigned to a certain zone depending on its degree of processing. This paper discusses how FAIR Digital Objects can be used in a novel approach to organize a data lake based on data types instead of zones, how they can abstract the physical implementation, and how they enable generic and portable processing capabilities built on a provenance-based approach.
The article by Chauhan et al. highlights the transformative potential of magnification tools in improving precision and outcomes across various dental specialties. While the authors discuss the advantages of magnification, they do not address the potential integration of artificial intelligence (AI) with magnification devices to further enhance diagnostic and therapeutic efficiency. This letter explores the synergy of AI with magnification tools, emphasizing its applicability in image-guided diagnostics, workflow optimization, and personalized treatment planning. The integration of AI and magnification also paves the way for personalized, data-driven treatment strategies, marking a significant evolution in dental care. However, it is important to acknowledge the limitations and challenges associated with AI, such as data privacy concerns, algorithmic biases, and the need for robust validation before clinical implementation. This discussion underscores the need for interdisciplinary research to realize this potential.
Patients in intensive care units (ICUs) require rapid, critical decision making. Modern ICUs are data rich, with information streaming from diverse sources. Machine learning (ML) and neural networks (NNs) can leverage these rich data for prognostication and clinical care. They can handle complex nonlinear relationships in medical data and have advantages over traditional predictive methods. A number of models are used, including (1) feedforward networks and (2) recurrent and convolutional NNs, to predict key outcomes such as mortality, length of stay in the ICU, and the likelihood of complications. Current NN models exist in silos; their integration into clinical workflows requires greater transparency about the data that are analyzed. Most models accurate enough for use in clinical care operate as 'black boxes' in which the logic behind their decision making is opaque. Advances have occurred that see through this opacity and peer into the processing of the black box. In the near future, ML is positioned to help in clinical decision making far beyond what is currently possible. Transparency is the first step toward validation, which is followed by clinical trust and adoption. In summary, NNs have the transformative ability to enhance predictive accuracy and improve patient management in ICUs; the concept should soon become reality.
Magnesium-ion batteries hold promise as future energy storage solutions, yet current Mg cathodes are challenged by low voltage and specific capacity. Herein, we present an AI-driven workflow for discovering high-performance Mg cathode materials. Utilizing the common characteristics of various ionic intercalation-type electrodes, we design and train a Crystal Graph Convolutional Neural Network model that accurately predicts electrode voltages for various ions, with mean absolute errors (MAE) between 0.25 and 0.33 V. By deploying the trained model on stable Mg compounds from the Materials Project and the GNoME AI dataset, we identify 160 high-voltage structures out of 15,308 candidates, with voltages above 3.0 V and volumetric capacities over 800 mA h/cm^3. We further train a precise NequIP model to enable accurate and rapid simulations of Mg ionic conductivity. From the 160 high-voltage structures, machine-learning molecular dynamics simulations selected 23 cathode materials with both high energy density and high ionic conductivity. This AI-driven workflow dramatically boosts the efficiency and precision of material discovery for multivalent-ion batteries, paving the way for advanced Mg battery development.
BACKGROUND: A key cardiac magnetic resonance (CMR) challenge is breath-holding duration, which is difficult for cardiac patients. AIM: To evaluate whether artificial intelligence-assisted compressed sensing CINE (AI-CS-CINE) reduces the image acquisition time of CMR compared to conventional CINE (C-CINE). METHODS: Cardio-oncology patients (n = 60) and healthy volunteers (n = 29) underwent sequential C-CINE and AI-CS-CINE on a 1.5-T scanner. Acquisition time, visual image quality, and biventricular metrics (end-diastolic volume, end-systolic volume, stroke volume, ejection fraction, left ventricular mass, and wall thickness) were analyzed and compared between C-CINE and AI-CS-CINE using Bland-Altman analysis and calculation of the intraclass correlation coefficient (ICC). RESULTS: In 89 participants (58.5 ± 16.8 years; 42 males, 47 females), total AI-CS-CINE acquisition and reconstruction time (37 seconds) was 84% faster than C-CINE (238 seconds). C-CINE required repeats in 23% (20/89) of cases (approximately 8 minutes lost), while AI-CS-CINE needed only one repeat (1%; 2 seconds lost). AI-CS-CINE had slightly lower contrast but preserved structural clarity. Bland-Altman plots and ICC (0.73 ≤ r ≤ 0.98) showed strong agreement for left ventricle (LV) and right ventricle (RV) metrics, including those in the cardiac amyloidosis subgroup (n = 31). AI-CS-CINE enabled faster, easier imaging in patients with claustrophobia, dyspnea, arrhythmias, or restlessness. Motion-artifacted C-CINE images could be reliably interpreted from AI-CS-CINE. CONCLUSION: AI-CS-CINE accelerated CMR image acquisition and reconstruction, preserved anatomical detail, and diminished the impact of patient-related motion. Quantitative AI-CS-CINE metrics agreed closely with C-CINE in cardio-oncology patients, including the cardiac amyloidosis cohort, as well as in healthy volunteers, regardless of left and right ventricular size and function. AI-CS-CINE significantly enhanced the CMR workflow, particularly in challenging cases. The strong analytical concordance underscores the reliability and robustness of AI-CS-CINE as a valuable tool.
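The reported time saving and the Bland-Altman agreement analysis can both be sketched in a few lines; only the 238 s and 37 s times come from the abstract, and the paired ejection-fraction values below are hypothetical:

```python
from statistics import mean, stdev

def percent_faster(t_old, t_new):
    # Relative reduction in total acquisition + reconstruction time.
    return 100.0 * (t_old - t_new) / t_old

def bland_altman_limits(x, y):
    # Bias (mean difference) and 95% limits of agreement between two
    # measurement methods, as used to compare C-CINE and AI-CS-CINE.
    diffs = [a - b for a, b in zip(x, y)]
    bias = mean(diffs)
    sd = stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Times reported in the abstract: 238 s for C-CINE, 37 s for AI-CS-CINE
speedup = percent_faster(238, 37)   # about 84%, matching the abstract

# Hypothetical paired LV ejection fractions (%) from both sequences
c_cine     = [55.0, 60.0, 48.0, 62.0, 57.0]
ai_cs_cine = [54.0, 61.0, 47.5, 62.5, 56.0]
bias, lo, hi = bland_altman_limits(c_cine, ai_cs_cine)
```

Narrow limits of agreement around a near-zero bias are what "strong agreement" means operationally in the abstract.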
We discuss the problem of accountability when multiple parties cooperate towards an end result, such as multiple companies in a supply chain or departments of a government service under different authorities. In cases where a fully trusted central point does not exist, it is difficult to obtain a trusted audit trail of a workflow when each individual participant is unaccountable to all others. We propose AudiWFlow, an auditing architecture that makes participants accountable for their contributions in a distributed workflow. Our scheme provides confidentiality in most cases, collusion detection, and availability of evidence after the workflow terminates. AudiWFlow is based on verifiable secret sharing and real-time peer-to-peer verification of records; it further supports multiple levels of assurance to meet a desired trade-off between the availability of evidence and the overhead resulting from the auditing approach. We propose and evaluate two implementation approaches for AudiWFlow. The first is fully distributed except for a central auxiliary point that nevertheless needs only a low level of trust. The second is based on smart contracts running on a public blockchain, which removes the need for any central point but requires integration with a blockchain.
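AudiWFlow's evidence availability rests on verifiable secret sharing. A plain (non-verifiable) Shamir threshold scheme conveys the core idea: any k of n shares reconstruct the record, while fewer reveal nothing. This sketch is illustrative, not AudiWFlow's actual construction, which additionally lets participants verify that their shares are consistent:

```python
import random

P = 2**127 - 1  # a Mersenne prime used as the field modulus

def make_shares(secret, k, n, rng=random):
    # Split `secret` into n shares; any k of them reconstruct it.
    coeffs = [secret] + [rng.randrange(P) for _ in range(k - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):   # Horner evaluation mod P
            acc = (acc * x + c) % P
        return acc
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 over GF(P).
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % P
                den = (den * (xi - xj)) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(123456789, k=3, n=5, rng=random.Random(7))
recovered = reconstruct(shares[:3])   # any 3 of the 5 shares suffice
```

Distributing shares among workflow participants is what lets evidence survive after the workflow terminates even if some parties withhold their records.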
Funding: The National Natural Science Foundation of China (No. 60175027).
Funding: Supported by the Ministry of Knowledge Economy, Korea, under the ITRC (Information Technology Research Center) support program (IITA-2009-(C1090-0902-0007)).
Funding: Project (60573169) supported by the National Natural Science Foundation of China.
基金the National High Technology.Research and Development Programme of China(No.2006AAO4Z151 and 2006AA04Z166)the National Natural Science Foundation of China(No.60674080 and No.60504030)the EU FP6(No.033610)
基金Funding: Supported by the National Natural Science Foundation of China (62102292), the Hubei Key Laboratory of Intelligent Robot (Wuhan Institute of Technology) of China (HBIRL202103, HBIRL202204), the Science Foundation Research Project of Wuhan Institute of Technology of China (K202035), and the Graduate Innovative Fund of Wuhan Institute of Technology of China (CX2021265).
文摘Abstract: With the rapid growth of the Industrial Internet of Things (IIoT), mobile edge computing (MEC) has become widely used in many emerging scenarios. In MEC, each workflow task can be executed locally or offloaded to the edge to help improve Quality of Service (QoS) and reduce energy consumption. However, most existing offloading strategies focus on independent applications and cannot be applied efficiently to workflow applications with a series of dependent tasks. To address this issue, this paper proposes an energy-efficient task offloading strategy for large-scale workflow applications in MEC. First, we formulate task offloading as an optimization problem with the goal of minimizing the utility cost, defined as the trade-off between energy consumption and total execution time. Then, a novel heuristic algorithm named Green DVFS-GA is proposed, which includes a task offloading step based on the genetic algorithm and a further step that reduces energy consumption using the Dynamic Voltage and Frequency Scaling (DVFS) technique. Experimental results show that the proposed strategy can significantly reduce energy consumption and achieves the best trade-off compared with other strategies.
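The exact form of the utility cost is not given in the abstract; a common way to express such an energy/time trade-off is a weighted sum of normalized terms, sketched below with illustrative weights, bounds, and numbers (all assumptions, not the paper's model):

```python
def utility_cost(energy_j, time_s, w=0.5, e_max=100.0, t_max=60.0):
    """Weighted trade-off between normalized energy and execution time.
    The weight w and the normalization bounds e_max/t_max are illustrative."""
    return w * (energy_j / e_max) + (1 - w) * (time_s / t_max)

# Comparing a local execution against an edge offload (made-up numbers):
local = utility_cost(energy_j=80.0, time_s=50.0)  # full CPU energy, slower
edge  = utility_cost(energy_j=20.0, time_s=30.0)  # radio energy only, faster
```

With equal weights, the offloaded variant wins here; shifting `w` toward 1 prioritizes battery life, toward 0 prioritizes makespan, which is the knob such a utility function exposes.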
文摘Abstract: Workflow-based systems are typically said to lead to better use of staff and better management and productivity. The first phase in building a workflow-based system is capturing the real-world process in a conceptual representation suitable for the subsequent phases of formalization and implementation. The specification may be in text or diagram form, or written in a formal language. This paper proposes a flow-based diagrammatic methodology as a tool for workflow specification. The expressiveness of the method is appraised through its ability to capture a workflow-based application. Here we show that the proposed conceptual diagrams are able to express situations arising in practice as an alternative to tools currently used in workflow systems. This is demonstrated by using the proposed methodology to build partial demo systems for two government agencies.
文摘Abstract: The ease of accessing a virtually unlimited pool of resources makes Infrastructure as a Service (IaaS) clouds an ideal platform for running data-intensive workflow applications comprising hundreds of computational tasks. However, executing scientific workflows in IaaS cloud environments poses significant challenges due to conflicting objectives, such as minimizing execution time (makespan) and reducing resource utilization costs. This study responds to the increasing need for efficient and adaptable optimization solutions in dynamic and complex environments, which are critical for meeting the evolving demands of modern users and applications. It presents an innovative multi-objective approach for scheduling scientific workflows in IaaS cloud environments. The proposed algorithm, MOS-MWMC, aims to minimize total execution time (makespan) and resource utilization costs by leveraging key features of virtual machine instances, such as a high number of cores and fast local SSD storage. By integrating realistic simulations based on the WRENCH framework, the method effectively dimensions the cloud infrastructure and optimizes resource usage. Experimental results highlight the superiority of MOS-MWMC compared to the benchmark algorithms HEFT and Max-Min. The Pareto fronts obtained for the CyberShake, Epigenomics, and Montage workflows demonstrate closer proximity to the optimal front, confirming the algorithm's ability to balance conflicting objectives. This study contributes to optimizing scientific workflows in complex environments by providing solutions tailored to specific user needs while minimizing costs and execution times.
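The abstract compares Pareto fronts of (makespan, cost) solutions; the following minimal sketch shows the dominance test and front extraction that underlie such comparisons, on made-up schedule data (not results from the paper):

```python
def dominates(a, b):
    """True if point a Pareto-dominates b, minimizing every objective."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the schedules not dominated by any other candidate."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical (makespan_s, cost_usd) pairs for candidate schedules:
candidates = [(120, 5.0), (100, 7.5), (150, 3.0), (130, 6.0), (100, 8.0)]
front = pareto_front(candidates)
```

Here (130, 6.0) is dominated by (120, 5.0) and (100, 8.0) by (100, 7.5); the three survivors form the front. "Closer proximity to the optimal front" in the abstract is measured over such non-dominated sets.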
文摘Abstract: The current application of digital workflows for the understanding, promotion and participation in the conservation of heritage sites involves several technical challenges and should be governed by serious ethical engagement. Recording consists of capturing (or mapping) the physical characteristics of the character-defining elements that give a cultural heritage site its significance. Usually, the outcome of this work represents the cornerstone information serving their conservation, whether it is used actively to maintain them or to ensure a posterity record in case of destruction. The records produced can guide the decision-making process at different levels by property owners, site managers, public officials, and conservators around the world, as well as present the historical knowledge and values of these resources. Rigorous documentation may also serve a broader purpose: over time, it becomes the primary means by which scholars and the public apprehend a site that has since changed radically or disappeared. This contribution aims to provide an overview of the potential applications and threats of the technology utilised by heritage recording professionals, addressing the need to develop ethical principles that can improve heritage recording practice at large.
文摘Abstract: Research data currently face a huge increase in the number of data objects, with an increasing variety of types (data types, formats) and of workflows by which objects need to be managed across their lifecycle by data infrastructures. Researchers desire to shorten the workflows from data generation to analysis and publication, and the full workflow needs to become transparent to multiple stakeholders, including research administrators and funders. This poses challenges for research infrastructures and user-oriented data services in terms of not only making data and workflows findable, accessible, interoperable and reusable, but also doing so in a way that leverages machine support for better efficiency. One primary need to be addressed is that of findability, and achieving better findability has benefits for other aspects of data and workflow management. In this article, we describe how machine capabilities can be extended to make workflows more findable, in particular by leveraging the Digital Object Architecture, common object operations and machine learning techniques.
文摘Abstract: In Geographic Information Systems (GIS), geoprocessing workflows allow analysts to organize their methods on spatial data in complex chains. We propose a method for expressing workflows as linked data, and for semi-automatically enriching them with semantics on the level of their operations and datasets. Linked workflows can be easily published on the Web and queried for types of inputs, results, or tools. Thus, GIS analysts can reuse their workflows in a modular way, selecting, adapting, and recommending resources based on compatible semantic types. Our typing approach starts from minimal annotations of workflow operations with classes of GIS tools, and then propagates data types and implicit semantic structures through the workflow using an OWL typing scheme and SPARQL rules by backtracking over GIS operations. The method is implemented in Python and is evaluated on two real-world geoprocessing workflows generated with Esri's ArcGIS. To illustrate the potential applications of our typing method, we formulate and execute competency questions over these workflows.
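The abstract's OWL/SPARQL machinery is not reproduced here; as a rough illustration of the type-propagation idea only, the following Python sketch forward-propagates a dataset type through a chain of operations annotated with hypothetical tool-class signatures (Buffer, Rasterize, SlopeCalc are made-up names, not the paper's ontology):

```python
# Hypothetical tool-class signatures: each GIS tool class maps an input type
# to an output type, standing in for an ontology of tool annotations.
TOOL_SIGNATURES = {
    "Buffer":    {"in": "VectorLayer", "out": "VectorLayer"},
    "Rasterize": {"in": "VectorLayer", "out": "RasterLayer"},
    "SlopeCalc": {"in": "RasterLayer", "out": "RasterLayer"},
}

def propagate_types(workflow, source_type):
    """Forward-propagate a dataset type through annotated operations.
    Returns the inferred type after each step, or raises on a mismatch."""
    types, current = [], source_type
    for tool in workflow:
        sig = TOOL_SIGNATURES[tool]
        if sig["in"] != current:
            raise TypeError(f"{tool} expects {sig['in']}, got {current}")
        current = sig["out"]
        types.append(current)
    return types

inferred = propagate_types(["Buffer", "Rasterize", "SlopeCalc"], "VectorLayer")
```

In the paper this propagation runs over RDF triples with SPARQL rules and can also backtrack to infer input types; the sketch shows only the forward direction.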
基金Funding: This work was supported by the European Commission [grant number H2020 ERA-PLANET project No. 689443].
文摘Abstract: When defining indicators on the environment, the use of existing initiatives should be a priority rather than redefining indicators each time. From an Information, Communication and Technology perspective, data interoperability and standardization are critical to improving data access and exchange, as promoted by the Group on Earth Observations. GEOEssential follows an end-user-driven approach by defining Essential Variables (EVs) as an intermediate layer between environmental policy indicators and their appropriate data sources. Environmental policies and indicators are increasingly percolating down from global to local agendas. The scientific business processes for the generation of EVs and related indicators can be formalized in workflows specifying the necessary logical steps. To this aim, GEOEssential is developing a Virtual Laboratory whose main objective is to instantiate the conceptual workflows stored in a dedicated knowledge base into executable workflows. To interpret and present the relevant outputs carried out by the different thematic workflows considered in GEOEssential (i.e., biodiversity, ecosystems, extractives, night light, and the food-water-energy nexus), a Dashboard is built as a visual front-end. This is a valuable instrument to track progress towards environmental policies.
基金Funding: The authors would like to acknowledge the European Commission Horizon 2020 Program, which funded both the ERA-PLANET/GEOEssential (Grant Agreement No. 689443) and ConnectinGEO (Grant Agreement No. 641538) projects.
文摘Abstract: There is a growing recognition of the interdependencies among the supply systems that rely upon food, water and energy. Billions of people lack safe and sufficient access to these systems, coupled with a rapidly growing global demand and increasing resource constraints. Modeling frameworks are considered one of the few means available to understand the complex interrelationships among the sectors; however, the development of nexus-related frameworks has been limited. We describe three open-source models well known in their respective domains (i.e., TerrSysMP, WOFOST and SWAT), components of which, if combined, could help decision-makers address the nexus issue. We propose as a first step the development of simple workflows utilizing essential variables and addressing components of the above-mentioned models, which can act as building blocks to be used ultimately in a comprehensive nexus model framework. The outputs of the workflows and the model framework are designed to address the SDGs.
基金funding by the"Niedersachsisches Vorab"funding line of the Volkswagen Foundation.
文摘Abstract: Since their introduction by James Dixon in 2010, data lakes have received increasing attention, driven by the promise of high reusability of the stored data due to their schema-on-read semantics. Building on this idea, several additional requirements have been discussed in the literature to improve the general usability of the concept, such as a central metadata catalog including all provenance information, overarching data governance, or integration with (high-performance) processing capabilities. Although the necessity of a logical and a physical organisation of data lakes in order to meet those requirements is widely recognized, no concrete guidelines have yet been provided. The most common architecture implementing this conceptual organisation is the zone architecture, where data is assigned to a certain zone depending on its degree of processing. This paper discusses how FAIR Digital Objects can be used in a novel approach to organize a data lake based on data types instead of zones, how they can be used to abstract the physical implementation, and how they empower generic and portable processing capabilities based on a provenance-based approach.
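As a hedged illustration of the type-based (rather than zone-based) organisation the abstract describes, the following Python sketch models a simplified FAIR Digital Object carrying provenance and a registry keyed by data type; all identifiers, type names, and operation labels are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class FairDigitalObject:
    """Simplified FAIR Digital Object: a PID, a data type, and provenance."""
    pid: str
    data_type: str
    provenance: list = field(default_factory=list)

class TypedDataLake:
    """Organize objects by data type instead of processing-degree zones."""
    def __init__(self):
        self.by_type = {}

    def ingest(self, obj, operation):
        obj.provenance.append(operation)  # record how the object was produced
        self.by_type.setdefault(obj.data_type, []).append(obj)

    def find(self, data_type):
        return self.by_type.get(data_type, [])

lake = TypedDataLake()
lake.ingest(FairDigitalObject("hdl:21.x/abc", "sensor-timeseries"), "raw-upload")
lake.ingest(FairDigitalObject("hdl:21.x/def", "sensor-timeseries"), "cleaning")
```

In a zone architecture the two objects would land in different zones (raw vs. cleansed); organized by type, both are retrievable together while their provenance entries still distinguish their degree of processing.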
文摘Abstract: The article by Chauhan et al highlights the transformative potential of magnification tools in improving precision and outcomes across various dental specialties. While the authors discuss the advantages of magnification, they do not address the potential integration of artificial intelligence (AI) with magnification devices to further enhance diagnostic and therapeutic efficiency. This letter explores the synergy of AI with magnification tools, emphasizing its applicability in image-guided diagnostics, workflow optimization, and personalized treatment planning. The integration of AI and magnification also paves the way for personalized, data-driven treatment strategies, marking a significant evolution in dental care. However, it is important to acknowledge the limitations and challenges associated with AI, such as data privacy concerns, algorithmic biases, and the need for robust validation before clinical implementation. This discussion underscores the need for interdisciplinary research to realize this potential.
文摘Abstract: Patients in intensive care units (ICUs) require rapid, critical decision making. Modern ICUs are data rich, with information streaming from diverse sources. Machine learning (ML) and neural networks (NN) can leverage the rich data for prognostication and clinical care. They can handle complex nonlinear relationships in medical data and have advantages over traditional predictive methods. A number of models are used: (1) feedforward networks; and (2) recurrent NNs and convolutional NNs to predict key outcomes such as mortality, length of stay in the ICU and the likelihood of complications. Current NN models exist in silos; their integration into clinical workflow requires greater transparency about the data that are analyzed. Most models that are accurate enough for use in clinical care operate as 'black boxes' in which the logic behind their decision making is opaque. Advances have occurred to see through the opacity and peer into the processing of the black box. In the near future, ML is positioned to help in clinical decision making far beyond what is currently possible. Transparency is the first step toward validation, which is followed by clinical trust and adoption. In summary, NNs have the transformative ability to enhance predictive accuracy and improve patient management in ICUs. The concept should soon be turning into reality.
基金supported by the National Key R&D Program of China(2022YFA1203400)the National Natural Science Foundation of China(W2441009)。
文摘Abstract: Magnesium-ion batteries hold promise as future energy storage solutions, yet current Mg cathodes are challenged by low voltage and specific capacity. Herein, we present an AI-driven workflow for discovering high-performance Mg cathode materials. Utilizing the common characteristics of various ionic intercalation-type electrodes, we design and train a Crystal Graph Convolutional Neural Network model that can accurately predict electrode voltages for various ions, with mean absolute errors (MAE) between 0.25 and 0.33 V. By deploying the trained model on stable Mg compounds from the Materials Project and the GNoME AI dataset, we identify 160 high-voltage structures out of 15,308 candidates with voltages above 3.0 V and volumetric capacities over 800 mA h/cm³. We further train a precise NequIP model to facilitate accurate and rapid simulations of Mg ionic conductivity. From the 160 high-voltage structures, machine learning molecular dynamics simulations have selected 23 cathode materials with both high energy density and high ionic conductivity. This AI-driven workflow dramatically boosts the efficiency and precision of materials discovery for multivalent-ion batteries, paving the way for advanced Mg battery development.
基金Funding: Supported by the James Russell Hornsby and Jun Xiong Fund and United Imaging Healthcare.
文摘Abstract: BACKGROUND A key cardiac magnetic resonance (CMR) challenge is breath-holding duration, which is difficult for cardiac patients. AIM To evaluate whether artificial intelligence-assisted compressed sensing CINE (AI-CS-CINE) reduces the image acquisition time of CMR compared to conventional CINE (C-CINE). METHODS Cardio-oncology patients (n = 60) and healthy volunteers (n = 29) underwent sequential C-CINE and AI-CS-CINE with a 1.5-T scanner. Acquisition time, visual image quality assessment, and biventricular metrics (end-diastolic volume, end-systolic volume, stroke volume, ejection fraction, left ventricular mass, and wall thickness) were analyzed and compared between C-CINE and AI-CS-CINE with Bland–Altman analysis and calculation of the intraclass correlation coefficient (ICC). RESULTS In 89 participants (58.5 ± 16.8 years, 42 males, 47 females), the total AI-CS-CINE acquisition and reconstruction time (37 seconds) was 84% faster than C-CINE (238 seconds). C-CINE required repeats in 23% (20/89) of cases (approximately 8 minutes lost), while AI-CS-CINE needed only one repeat (1%; 2 seconds lost). AI-CS-CINE had slightly lower contrast but preserved structural clarity. Bland–Altman plots and ICC (0.73 ≤ r ≤ 0.98) showed strong agreement for left ventricle (LV) and right ventricle (RV) metrics, including those in the cardiac amyloidosis subgroup (n = 31). AI-CS-CINE enabled faster, easier imaging in patients with claustrophobia, dyspnea, arrhythmias, or restlessness. Motion-artifacted C-CINE images were reliably interpreted from AI-CS-CINE. CONCLUSION AI-CS-CINE accelerated CMR image acquisition and reconstruction, preserved anatomical detail, and diminished the impact of patient-related motion. Quantitative AI-CS-CINE metrics agreed closely with C-CINE in cardio-oncology patients, including the cardiac amyloidosis cohort, as well as in healthy volunteers, regardless of left and right ventricular size and function. AI-CS-CINE significantly enhanced the CMR workflow, particularly in challenging cases. The strong analytical concordance underscores the reliability and robustness of AI-CS-CINE as a valuable tool.
文摘Abstract: We discuss the problem of accountability when multiple parties cooperate towards an end result, such as multiple companies in a supply chain or departments of a government service under different authorities. In cases where a fully trusted central point does not exist, it is difficult to obtain a trusted audit trail of a workflow when each individual participant is unaccountable to all the others. We propose AudiWFlow, an auditing architecture that makes participants accountable for their contributions in a distributed workflow. Our scheme provides confidentiality in most cases, collusion detection, and availability of evidence after the workflow terminates. AudiWFlow is based on verifiable secret sharing and real-time peer-to-peer verification of records; it further supports multiple levels of assurance to meet a desired trade-off between the availability of evidence and the overhead resulting from the auditing approach. We propose and evaluate two implementation approaches for AudiWFlow. The first one is fully distributed except for a central auxiliary point that, nevertheless, needs only a low level of trust. The second one is based on smart contracts running on a public blockchain, which removes the need for any central point but requires integration with a blockchain.
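AudiWFlow's full scheme uses *verifiable* secret sharing, which adds commitments on top of plain secret sharing; as a minimal sketch of only the underlying share/reconstruct mechanism, here is plain Shamir secret sharing over a prime field (the threshold, field size, and secret are chosen purely for illustration):

```python
import random

PRIME = 2**61 - 1  # a Mersenne prime; all arithmetic is over GF(PRIME)

def split_secret(secret, n_shares, threshold, seed=0):
    """Evaluate a random degree-(threshold-1) polynomial with f(0) = secret."""
    rng = random.Random(seed)
    coeffs = [secret] + [rng.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret from any
    threshold-many shares; fewer shares reveal nothing about it."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        # pow(den, PRIME - 2, PRIME) is the modular inverse (Fermat).
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

shares = split_secret(123456789, n_shares=5, threshold=3)
recovered = reconstruct(shares[:3])
```

In AudiWFlow the shared values are audit-record evidence distributed among workflow participants; the verifiable variant additionally lets each participant check that the shares it received are consistent, which this sketch omits.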