Funding: Supported in part by the National Natural Science Foundation of China under Grant 52307134 and the Fundamental Research Funds for the Central Universities (xzy012025022).
Abstract: Active distribution network (ADN) planning is crucial for achieving a cost-effective transition to modern power systems, yet it poses significant challenges as the system scale increases. The advent of quantum computing offers a transformative approach to solving ADN planning problems. To fully leverage the potential of quantum computing, this paper proposes a photonic quantum acceleration algorithm. First, a quantum-accelerated framework for ADN planning is proposed on the basis of coherent photonic quantum computers. The ADN planning model is then formulated and decomposed into discrete master problems and continuous subproblems to facilitate the quantum optimization process. The photonic quantum-embedded adaptive alternating direction method of multipliers (PQA-ADMM) algorithm is subsequently proposed to equivalently map the discrete master problem onto a quantum-interpretable model, enabling its deployment on a photonic quantum computer. Finally, a comparative analysis with various solvers, including Gurobi, demonstrates that the proposed PQA-ADMM algorithm achieves significant speedup on the modified IEEE 33-node and IEEE 123-node systems, highlighting its effectiveness.
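The master-problem/subproblem alternation described above follows the general ADMM template. As a purely classical toy illustration (not the photonic PQA-ADMM itself), the sketch below alternates closed-form updates with a dual update on a scalar problem:

```python
# Minimal classical ADMM sketch (illustrative only; the paper's PQA-ADMM
# additionally maps the discrete master problem onto a photonic device).
# Toy problem: minimize (x - 3)^2 + |z|  subject to  x = z.

def admm(rho=1.0, iters=500):
    x = z = u = 0.0
    for _ in range(iters):
        # x-update: argmin (x-3)^2 + (rho/2)(x - z + u)^2, in closed form
        x = (2 * 3 + rho * (z - u)) / (2 + rho)
        # z-update: soft-thresholding, the proximal operator of |.|
        v = x + u
        z = max(abs(v) - 1.0 / rho, 0.0) * (1 if v > 0 else -1)
        # dual update enforces the consensus constraint x = z
        u += x - z
    return x, z

x, z = admm()
print(round(x, 3), round(z, 3))  # both converge to 2.5, the true optimum
```

The same split-then-alternate pattern is what makes the decomposition above attractive: the master update and the subproblem update can each be handed to whichever solver (classical or quantum) suits them best.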
Abstract: In this paper, sticker-based DNA computing was used to solve the independent set problem. First, the solution space was constructed using appropriate DNA memory complexes. We defined a new operation called "divide" and applied it in the construction of the solution space. Then, by applying a sticker-based parallel algorithm using biological operations, the independent set problem was solved in polynomial time.
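For readers unfamiliar with the underlying problem, a classical brute-force sketch clarifies what is being computed (purely illustrative; the DNA algorithm gains its efficiency from massive biochemical parallelism, not from enumeration like this):

```python
from itertools import combinations

def max_independent_set(vertices, edges):
    """Brute-force maximum independent set: the largest subset of
    vertices with no edge between any two members. Exponential time
    classically, which is what motivates molecular parallelism."""
    edge_set = {frozenset(e) for e in edges}
    for size in range(len(vertices), 0, -1):
        for subset in combinations(vertices, size):
            if all(frozenset(pair) not in edge_set
                   for pair in combinations(subset, 2)):
                return set(subset)
    return set()

# Path graph 1-2-3-4-5: the maximum independent set is {1, 3, 5}.
print(max_independent_set([1, 2, 3, 4, 5],
                          [(1, 2), (2, 3), (3, 4), (4, 5)]))
```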
Abstract: Deep vein thrombosis (DVT) is a common vascular event that is potentially fatal when it leads to pulmonary embolism. Occurring as part of the broader phenomenon of venous thromboembolism (VTE), DVT classically arises when Virchow's triad of hypercoagulability, changes in blood flow (e.g. stasis), and endothelial dysfunction is fulfilled. Although the immobilisation that produces stasis is most often seen in bedbound patients and travellers on long-distance flights, there is increasing evidence that prolonged periods of work or leisure spent seated at a desk using a computer are an independent risk factor. In this report, we present two cases of "e-thrombosis" from prolonged sitting while using a computer.
Abstract: Although AI and quantum computing (QC) are fast emerging as key enablers of the future Internet, experts believe they pose an existential threat to humanity. Responding to the frenzied release of ChatGPT/GPT-4, thousands of alarmed tech leaders recently signed an open letter to pause AI research to prepare for the catastrophic threats to humanity from uncontrolled AGI (Artificial General Intelligence). Perceived as an "epistemological nightmare", AGI is believed to be on the anvil with GPT-5. Two computing rules appear responsible for these risks: 1) mandatory third-party permissions that allow computers to run applications at the expense of introducing vulnerabilities, and 2) the Halting Problem of Turing-complete AI programming languages, which potentially renders AGI unstoppable. The combination of these two inherent weaknesses remains intractable under legacy systems. A recent cybersecurity breakthrough shows that banning all permissions reduces the computer attack surface to zero, delivering a new zero vulnerability computing (ZVC) paradigm. Deploying ZVC and blockchain, this paper formulates and supports a hypothesis: "Safe, secure, ethical, controllable AGI/QC is possible by conquering the two unassailable rules of computability." Pursued by a European consortium, testing/proving the proposed hypothesis will have a groundbreaking impact on the future digital infrastructure when AGI/QC starts powering the 75 billion internet devices expected by 2025.
Abstract: We are already familiar with computers: they work for us at home, in offices, and in factories. But it is also true that many children today are using computers at school before they can write. What does this mean for the future? Are these children lucky or not?
Funding: The National Fundamental Research Program under Grant No. 2006CB921106 and the National Natural Science Foundation of China under Grant Nos. 10325521 and 60433050.
Abstract: In this letter, we propose a duality computing mode, which resembles the particle-wave duality property when a quantum system such as a quantum computer passes through a double slit. In this mode, computing operations are not necessarily unitary. The duality mode provides a natural link between classical computing and quantum computing. In addition, the duality mode provides a new tool for quantum algorithm design.
Abstract: A new approach for the implementation of variogram models and ordinary kriging using the R statistical language, in conjunction with Fortran, MPI (the Message Passing Interface), and the "pbdDMAT" package within R on the Bridges and Stampede Supercomputers, is described. This new technique has led to great improvements in timing compared with R alone, or R with C and MPI. These improvements include processing and forecasting vectors of size 25,000 in an average time of 6 minutes on the Stampede Supercomputer and 2.5 minutes on the Bridges Supercomputer, compared with previous processing times of 3.5 hours.
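As context for the kriging system the abstract's implementation parallelizes, a minimal single-threaded sketch follows (in Python rather than the paper's R/Fortran/MPI stack, and with an illustrative spherical variogram; it only shows the linear system ordinary kriging solves):

```python
import numpy as np

def spherical_variogram(h, sill=1.0, rng=10.0, nugget=0.0):
    """Spherical variogram model, one of the standard choices."""
    h = np.asarray(h, dtype=float)
    g = nugget + sill * (1.5 * h / rng - 0.5 * (h / rng) ** 3)
    return np.where(h < rng, g, nugget + sill)

def ordinary_kriging(xs, zs, x0, variogram=spherical_variogram):
    """Ordinary kriging at x0 from 1-D samples (xs, zs). Solves the
    bordered kriging system with a Lagrange multiplier so that the
    weights sum to 1 (the unbiasedness constraint)."""
    n = len(xs)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(np.abs(xs[:, None] - xs[None, :]))
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = variogram(np.abs(xs - x0))
    w = np.linalg.solve(A, b)[:n]
    return float(w @ zs)

xs = np.array([0.0, 1.0, 4.0])
zs = np.array([2.0, 3.0, 5.0])
print(round(ordinary_kriging(xs, zs, 1.0), 6))  # exact at a sample: 3.0
```

The distance-matrix construction and the dense solve are exactly the operations that the distributed `pbdDMAT` matrices accelerate at scale.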
Funding: Supported by the National Natural Science Foundation of China (Grant No. 12174224).
Abstract: Nonadiabatic holonomic quantum computers serve as the physical platform for nonadiabatic holonomic quantum computation. As quantum computation has entered the noisy intermediate-scale era, building accurate intermediate-scale nonadiabatic holonomic quantum computers is clearly necessary. Given that measurements are the sole means of extracting information, they play an indispensable role in nonadiabatic holonomic quantum computers. Accordingly, developing methods to reduce measurement errors in these computers is of great importance. However, while much attention has been given to research on nonadiabatic holonomic gates, research on reducing measurement errors in nonadiabatic holonomic quantum computers is severely lacking. In this study, we propose a measurement error reduction method tailored for intermediate-scale nonadiabatic holonomic quantum computers: it not only reduces measurement errors but is also useful in mitigating errors originating from the nonadiabatic holonomic gates themselves. Given these features, our method significantly advances the construction of accurate intermediate-scale nonadiabatic holonomic quantum computers.
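The abstract does not detail the proposed method, so as general background only, one standard measurement-error-mitigation idea, inverting an assumed readout confusion matrix, can be sketched as follows (the matrix entries and observed counts are invented for illustration and are not from the paper):

```python
import numpy as np

# Generic readout-error mitigation by confusion-matrix inversion.
# This is a textbook illustration, NOT the paper's tailored method.
# A[i, j] = P(measure outcome i | true state j), estimated from
# calibration runs on known basis states.
A = np.array([[0.95, 0.08],
              [0.05, 0.92]])

observed = np.array([0.689, 0.311])       # noisy measured distribution
corrected = np.linalg.solve(A, observed)  # undo the assumed readout noise
corrected = np.clip(corrected, 0.0, None) # clip any tiny negatives, then
corrected /= corrected.sum()              # renormalize to a distribution
print(np.round(corrected, 3))  # recovers the true distribution [0.7, 0.3]
```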
Funding: Supported by the National Natural Science Foundation of China (12325501 and 12447101).
Abstract: In the effort to develop useful quantum computers, simulating quantum machines with conventional classical computing resources is a key capability. Such simulations will always face limits, preventing the emulation of quantum computers at substantial scale; however, by pushing the envelope through optimal choices of algorithms and hardware, the value of simulator tools can be maximized. This work reviews state-of-the-art numerical simulation methods, i.e., classical algorithms that emulate quantum computer evolution under specific operations. We focus on the mainstream state-vector and tensor-network paradigms, while briefly mentioning alternative methods. Moreover, we review the diverse applications of simulation across different facets of quantum computer development, including understanding the fundamental differences between quantum and classical computations, exploring algorithmic design for quantum advantage, predicting quantum processor performance at the design stage, and efficiently characterizing fabricated devices for rapid iterations. This review complements recent surveys of current tools and implementations; here, we aim to provide readers with an essential understanding of the theoretical basis of classical simulation methods, a detailed discussion of their advantages and limitations, and an overview of the demands and challenges arising from practical use cases.
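The state-vector paradigm mentioned above can be shown in a minimal sketch: an n-qubit state is a complex vector of length 2^n, and gates act by matrix multiplication, which is why memory doubles with every added qubit and why such simulators "always face limits":

```python
import numpy as np

# Minimal state-vector simulation of a 2-qubit Bell-state circuit.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                 # control = first qubit
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.zeros(4)
state[0] = 1.0                    # start in |00>

state = np.kron(H, I) @ state     # H on the first qubit
state = CNOT @ state              # entangle: CNOT(first -> second)

# Bell state (|00> + |11>)/sqrt(2): amplitudes [0.707, 0, 0, 0.707]
print(np.round(state, 3))
```

Real state-vector simulators avoid building full 2^n x 2^n operators with `kron`, instead contracting small gate tensors against the state, but the underlying representation is the same.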
Abstract: Background: Describing where distribution hotspots and coldspots are located is crucial for any science-based species management and governance. Thus, here we created the world's first Super Species Distribution Models (SDMs) including all described primate species and the best-available predictor set. These Super SDMs are conducted using an ensemble of modern machine learning algorithms, including Maxent, Tree Net, Random Forest, CART, CART Boosting and Bagging, and MARS, with the utilization of cloud supercomputers (as an add-on option for more powerful models). For the global cold/hotspot models, we obtained global distribution data from www.GBIF.org (approx. 420,000 raw occurrence records) and utilized the world's largest open-access environmental predictor set of 201 layers. For this analysis, all occurrences were merged into one multi-species (400+ species) pixel-based analysis. Results: We present the first quantified pixel-based global primate hotspot prediction for Central and Northern South America, West Africa, East Africa, Southeast Asia, Central Asia, and Southern Africa. The global primate coldspots are Antarctica, the Arctic, most temperate regions, and Oceania past the Wallace line. We additionally describe all these modeled hotspots/coldspots and discuss reasons for a quantified understanding of where the world's non-human primates occur (or not). Conclusions: This shows us where the focus of most future research and conservation management efforts should be, using state-of-the-art digital data indication tools with reasoning. Those areas should be considered of the highest conservation management priority, ideally following 'no killing zones' and sustainable land stewardship approaches, if primates are to have a chance of survival.
Abstract: Noncohesive particle clusters are identified and tracked in turbulent flows to determine the breakdown and time evolution of cluster statistics and their implications for interscale mass transfer, which has connections to the classical turbulent energy cascade and its mass cascade counterpart running in parallel. In particular, the formation and dynamics of sediment and larvae clusters are of interest to coral larvae settlement in coastal regions and particularly the resilience of green-gray coastal protection solutions. Analogous cluster behavior is relevant to cloud microphysics and precipitation initiation, radiation transport and light transmission through colloids and suspensions, heat and mass transfer in particle-laden flows, and viral and pollutant transmission. Following a comparison between various clustering techniques, we adopt a density-based cluster identification algorithm for its simplicity and efficiency, where particles are clustered based on the number of neighboring particles in their individual spheres of influence. We establish parallels with lattice-based percolation theory, as evident in the power-law scaling of the cluster size distribution near the percolation threshold. The degree of discontinuity of the phase transition associated with this percolation threshold is observed to broaden with larger Stokes numbers and thereby large-scale clustering. The sensitivity of our findings to the employed clustering algorithm is discussed. A novel cluster tracking algorithm is deployed to determine the interscale transfer rate along the particle-number phase-space dimension via accounting of cluster breakup and merger events, extending previous work on the bubble breakup cascade beneath surface breaking waves. Our findings shed light on the interaction between particle clusters and their carrier turbulent flows, with an eye toward transport models incorporating cluster characteristics and dynamics.
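The density-based identification step described above (grouping particles by the number of neighbors within a sphere of influence) can be sketched as follows; the radius, threshold, and sample coordinates are illustrative assumptions, not values from the paper:

```python
import numpy as np

def neighbor_count_clusters(points, radius, min_neighbors):
    """Density-based cluster identification sketch: a particle is
    'dense' if at least min_neighbors other particles lie within its
    sphere of influence; connected dense particles are then grouped
    by flood fill. Non-dense particles keep the label -1."""
    n = len(points)
    d = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    adj = (d < radius) & ~np.eye(n, dtype=bool)
    dense = adj.sum(axis=1) >= min_neighbors
    labels = -np.ones(n, dtype=int)
    cid = 0
    for i in range(n):
        if dense[i] and labels[i] < 0:
            stack = [i]
            labels[i] = cid
            while stack:                       # flood fill one cluster
                j = stack.pop()
                for k in np.flatnonzero(adj[j] & dense):
                    if labels[k] < 0:
                        labels[k] = cid
                        stack.append(k)
            cid += 1
    return labels

pts = np.array([[0.0, 0.0], [0.1, 0.0], [0.2, 0.0],   # dense clump
                [5.0, 5.0], [5.1, 5.0], [5.2, 5.0],   # second clump
                [10.0, 0.0]])                         # isolated particle
print(neighbor_count_clusters(pts, radius=0.3, min_neighbors=2))
```

The O(n^2) distance matrix keeps the sketch short; production codes would use spatial hashing or k-d trees for the neighbor search.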
Funding: Supported by the National Natural Science Foundation of China (22127802, 22573091) and the HY Action (62402010305).
Abstract: Biomass-based hydrocarbon fuels, as one of the alternatives to traditional fossil fuels, have attracted considerable attention in the energy field due to their renewability and environmental benefits. This article provides a systematic review of recent research progress in the chemical synthesis of biomass-based hydrocarbon fuels. It outlines the conversion pathways for feedstocks such as lipids, terpenoids, cellulose/hemicellulose, and lignin. Depending on the feedstock, various products with distinct structural characteristics can be prepared through reactions such as cyclization, condensation, and catalytic hydrogenation. Throughout the synthesis process, three key factors play a critical role: efficient catalyst development, production process optimization, and computational-chemistry-based molecular design. Finally, the article discusses future perspectives for biomass-based hydrocarbon fuel synthesis research.
Abstract: The capture of atmospheric carbon dioxide by adsorbents is an important strategy for mitigating the greenhouse effect. Compared with traditional CO₂ adsorption materials such as activated carbon, silica gel, and zeolite molecular sieves, covalent organic frameworks (COFs) have excellent thermal and chemical stability and can be produced in many different forms. Using their different possible construction units, ordered structures for specific applications can be produced, giving them broad prospects in fields such as gas storage. This review analyzes the different types of COFs that have been synthesized and their different methods of CO₂ capture. It then discusses ways to increase CO₂ adsorption by changing the internal structure of COFs and modifying their surfaces. The limitations of COF-derived carbon materials in CO₂ capture are reviewed; finally, the key role of machine learning and computational simulation in improving CO₂ adsorption is discussed, and the current status and possible future uses of COFs are summarized.
Abstract: We consider the relevance of computer hardware and simulations not only to science and technology but also to social life. Evolutionary processes are part of all we know, from the physical and inanimate world to the simplest or most complex biological system. Evolution is manifested by landmark discoveries which deeply affect our social life. Demographic pressure, demand for improved living standards, and devastation of the natural environment pose new and complex challenges. We believe that the implementation of new computational models based on the latest scientific methodology can provide a reasonable chance of overcoming today's social problems. To ensure this goal, however, we need a change of mindset, placing findings obtained from modern science above traditional concepts and beliefs. In particular, the type of modeling used with success in the computational sciences must be extended to allow simulations of novel models for social life.