It has been over a decade since the first coded aperture video compressive sensing (CS) system was reported. The underlying principle of this technology is to employ a high-frequency modulator in the optical path to modulate a recorded high-speed scene within one integration time. The superimposed image captured in this manner is modulated and compressed, since multiple modulation patterns are imposed. Following this, reconstruction algorithms are utilized to recover the desired high-speed scene. One leading advantage of video CS is that a single captured measurement can be used to reconstruct a multi-frame video, thereby enabling a low-speed camera to capture high-speed scenes. Inspired by this, a number of variants of video CS systems have been built, mainly using different modulation devices. Meanwhile, in order to obtain high-quality reconstructed videos, many algorithms have been developed, from optimization-based iterative algorithms to deep-learning-based ones. Recently, emerging deep learning methods have become dominant due to their high-speed inference and high-quality reconstruction, highlighting the possibility of deploying video CS in practical applications. Toward this end, this paper reviews the progress that has been achieved in video CS during the past decade. We further analyze the efforts that need to be made, in terms of both hardware and algorithms, to enable real applications. Research gaps are put forward and future directions are summarized to help researchers and engineers working on this topic.
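To make the modulate-and-sum principle concrete, here is a minimal numpy sketch of the video CS forward model; the frame count, resolution, and binary patterns are illustrative assumptions, not parameters of any reviewed system.

```python
import numpy as np

# Hedged sketch of the coded-aperture video CS forward model: B high-speed
# frames X_k within one integration time are each masked by a modulation
# pattern C_k and summed into a single compressed measurement Y.
B, H, W = 8, 64, 64                          # frames per measurement and frame size (assumed)
X = np.random.rand(B, H, W)                  # high-speed scene (unknown at capture time)
C = np.random.randint(0, 2, size=(B, H, W))  # high-frequency binary modulation patterns
Y = (C * X).sum(axis=0)                      # one superimposed, modulated measurement
# A reconstruction algorithm then estimates the B frames X_k from Y and the known C_k.
```

The compression ratio here is B to 1: one captured image stands in for eight frames, which is what lets a low-speed camera record a high-speed scene.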
With the rapid advancements in technology and science, optimization theory and algorithms have become increasingly important. A wide range of real-world problems is classified as optimization challenges, and meta-heuristic algorithms have shown remarkable effectiveness in solving these challenges across diverse domains, such as machine learning, process control, and engineering design, showcasing their capability to address complex optimization problems. The Stochastic Fractal Search (SFS) algorithm is one of the most popular meta-heuristic optimization methods inspired by the fractal growth patterns of natural materials. Since its introduction by Hamid Salimi in 2015, SFS has garnered significant attention from researchers and has been applied to diverse optimization problems across multiple disciplines. Its popularity can be attributed to several factors, including its simplicity, practical computational efficiency, ease of implementation, rapid convergence, high effectiveness, and ability to address single- and multi-objective optimization problems, often outperforming other established algorithms. This review paper offers a comprehensive and detailed analysis of the SFS algorithm, covering its standard version, modifications, hybridization, and multi-objective implementations. The paper also examines several SFS applications across diverse domains, including power and energy systems, image processing, machine learning, wireless sensor networks, environmental modeling, economics and finance, and numerous engineering challenges. Furthermore, the paper critically evaluates the SFS algorithm's performance, benchmarking its effectiveness against recently published meta-heuristic algorithms. In conclusion, the review highlights key findings and suggests potential directions for future developments and modifications of the SFS algorithm.
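For orientation, the sketch below illustrates the Gaussian-walk diffusion step at the heart of SFS's fractal growth; it loosely follows the diffusion rule reported for the original algorithm, while the second walk variant and the two update phases of the full method are omitted. The toy problem and variable names are our own assumptions.

```python
import numpy as np

def sfs_diffusion(point, best, gen, rng):
    """One Gaussian-walk diffusion step in the spirit of SFS: particles are
    spawned around the best point with a step size that shrinks as the
    generation counter grows (a hedged sketch, not the full algorithm)."""
    sigma = np.abs(np.log(gen) / gen) * np.abs(point - best)
    eps, eps2 = rng.random(), rng.random()
    return rng.normal(best, sigma) + eps * best - eps2 * point

rng = np.random.default_rng(0)
best = np.zeros(3)                              # incumbent best solution
x = rng.uniform(-5.0, 5.0, size=3)              # a candidate solution
x_new = sfs_diffusion(x, best, gen=2, rng=rng)  # gen >= 2 keeps sigma nonzero
```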
In the 9 December 2024 issue of Nature [1], a team of Google engineers reported breakthrough results using “Willow”, their latest quantum computing chip (Fig. 1). By meeting a milestone “below threshold” reduction in the rate of errors that plague superconducting circuit-based quantum computing systems (Fig. 2), the work moves the field another step towards its promised supercharged applications, albeit likely still many years away. Areas expected to benefit from quantum computing include, among others, drug discovery, materials science, finance, cybersecurity, and machine learning.
Accurate brain tumor classification in medical imaging requires real-time processing and efficient computation, making hardware acceleration essential. Field Programmable Gate Arrays (FPGAs) offer parallelism and reconfigurability, making them well-suited for such tasks. In this study, we propose a hardware-accelerated Convolutional Neural Network (CNN) for brain cancer classification, implemented on the PYNQ-Z2 FPGA. Our approach optimizes the first Conv2D layer using different numerical representations: 8-bit fixed-point (INT8), 16-bit floating-point (FP16), and 32-bit floating-point (FP32), while the remaining layers run on an ARM Cortex-A9 processor. Experimental results demonstrate that FPGA acceleration significantly outperforms the CPU (Central Processing Unit)-based approach. The obtained results emphasize the critical importance of selecting the appropriate numerical representation for hardware acceleration in medical imaging. On the PYNQ-Z2 FPGA, INT8 achieves a 16.8% reduction in latency and 22.2% power savings compared with FP32, making it ideal for real-time and energy-constrained applications. FP16 offers a strong balance, delivering only a 0.1% drop in accuracy compared with FP32 (94.1% vs. 94.2%) while improving latency by 5% and reducing power consumption by 11.1%. Compared with prior works, the proposed FPGA-based CNN model achieves the highest classification accuracy (94.2%) with a throughput of up to 1.562 FPS, outperforming GPU-based and traditional CPU methods in both accuracy and hardware efficiency. These findings demonstrate the effectiveness of FPGA-based AI acceleration for real-time, power-efficient, and high-performance brain tumor classification, showcasing its practical potential in next-generation medical imaging systems.
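As a hedged illustration of the fixed-point side of this comparison (the paper does not state its word-length split, so the Q3.4-style scaling below is an assumption), the following shows how INT8 quantization maps convolution weights onto an integer grid:

```python
import numpy as np

def quantize_fixed_point(x, total_bits=8, frac_bits=4):
    """Symmetric fixed-point quantization sketch: scale onto an integer
    grid, round, saturate to the signed range, then dequantize."""
    scale = 2 ** frac_bits
    lo, hi = -(2 ** (total_bits - 1)), 2 ** (total_bits - 1) - 1
    q = np.clip(np.round(x * scale), lo, hi)   # round and saturate
    return q / scale                           # value seen by later layers

weights = np.array([0.73, -1.91, 0.004])
print(quantize_fixed_point(weights))           # -> [ 0.75 -1.9375  0. ]
```

The rounding error visible on the last weight is the accuracy-for-efficiency trade the abstract quantifies.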
BACKGROUND: Photon-counting detector (PCD) CT represents a transformative advancement in radiological imaging, offering superior spatial resolution, enhanced contrast-to-noise ratio, and reduced radiation dose compared with conventional energy-integrating detector CT. AIM: To evaluate PCD CT in oncologic imaging, focusing on its role in tumor detection, staging, and treatment response assessment. METHODS: We performed a systematic PubMed search from January 1, 2017 to December 31, 2024, using the keywords “photon-counting CT”, “cancer”, and “tumor” to identify studies on its use in oncologic imaging. We included experimental studies on humans or human phantoms and excluded reviews, commentaries, editorials, non-English, animal, and non-experimental studies. Study selection followed Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. Out of 175 initial studies, 39 met the inclusion criteria after screening and full-text review. Data extraction focused on study type, country of origin, and oncologic applications of photon-counting CT. No formal risk of bias assessment was performed, and the review was not registered in PROSPERO as it did not include a meta-analysis. RESULTS: Key findings highlighted the advantages of PCD CT in imaging renal masses, adrenal adenomas, ovarian cancer, breast cancer, prostate cancer, pancreatic tumors, hepatocellular carcinoma, metastases, multiple myeloma, and lung cancer. Additionally, PCD CT has demonstrated improved lesion characterization and enhanced diagnostic accuracy in oncology. Despite its promising capabilities, challenges related to data processing, storage, and accessibility remain. CONCLUSION: As PCD CT technology evolves, its integration into routine oncologic imaging has the potential to significantly enhance cancer diagnosis and patient management.
Unmanned aerial vehicle (UAV)-assisted mobile edge computing (MEC) has been deemed a promising solution for energy-constrained devices to run smart applications with computation-intensive and latency-sensitive requirements, especially in infrastructure-limited areas or emergency scenarios. However, the multi-UAV-assisted MEC network remains largely unexplored. In this paper, dynamic trajectory optimization and computation offloading are studied in a multi-UAV-assisted MEC system where multiple UAVs fly over a target area with different trajectories to serve ground users. By considering the dynamic channel condition and random task arrival, and jointly optimizing UAVs' trajectories, user association, and subchannel assignment, a problem is formulated to minimize the long-term average sum of user energy consumption. To address the problem, which involves both discrete and continuous variables, a hybrid-decision deep reinforcement learning (DRL)-based intelligent energy-efficient resource allocation and trajectory optimization algorithm, named the HDRT algorithm, is proposed, in which a deep Q network (DQN) and deep deterministic policy gradient (DDPG) are invoked to process discrete and continuous variables, respectively. Simulation results show that the proposed HDRT algorithm converges fast and outperforms other benchmarks in terms of user energy consumption and latency.
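The hybrid discrete/continuous decision structure can be sketched as follows; the stubbed linear "networks", dimensions, and action semantics are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

class HybridPolicySketch:
    """Hedged sketch of the hybrid decision idea behind HDRT: a DQN-style
    head scores discrete choices (user association, subchannel assignment)
    while a DDPG-style actor emits continuous trajectory commands."""
    def __init__(self, state_dim, n_discrete, traj_dim, rng):
        self.Wq = rng.normal(size=(state_dim, n_discrete))  # stand-in for the Q-network
        self.Wa = rng.normal(size=(state_dim, traj_dim))    # stand-in for the actor

    def act(self, state):
        discrete = int(np.argmax(state @ self.Wq))          # e.g., subchannel index
        trajectory = np.tanh(state @ self.Wa)               # e.g., normalized UAV velocity
        return discrete, trajectory

rng = np.random.default_rng(1)
policy = HybridPolicySketch(state_dim=6, n_discrete=4, traj_dim=2, rng=rng)
action = policy.act(rng.normal(size=6))
```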
As a computing paradigm that combines temporal and spatial computation, dynamic reconfigurable computing provides the advantages of flexibility, energy efficiency, and area efficiency, attracting interest from both academia and industry. However, dynamic reconfigurable computing is not yet mature because of several unsolved problems. This work introduces the concept, architecture, and compilation techniques of dynamic reconfigurable computing. It also discusses the existing major challenges and points out its potential applications.
Cloud computing is a highly demanded technology nowadays. Due to its service-oriented architecture, seamless accessibility, and other advantages, many transaction-rich applications make use of it. At the same time, it is vulnerable to hacks and threats. Hence, securing this environment is of utmost importance, and many research works focusing on it have been reported. This paper proposes a safe storage mechanism using elliptic curve cryptography (ECC) for Transaction Rich Applications (TRA). With the ECC-based security scheme, the security level of the protected system is increased, and the scheme is well suited to securing delivered data on portable devices. The proposed scheme protects the mapping of different kinds of data elements to each provider using an ECC algorithm. Analysis, comparison, and simulation show that the proposed system is more effective and secure for transaction-rich applications in the cloud.
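The abstract does not specify the scheme's construction, so as a generic, hedged illustration of ECC-protected storage, here is an ECIES-style hybrid flow using the Python cryptography package; the key roles, the HKDF info label, and the payload are illustrative assumptions.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

storage_key = ec.generate_private_key(ec.SECP256R1())   # provider's long-term ECC key pair
ephemeral = ec.generate_private_key(ec.SECP256R1())     # per-upload client key

# ECDH over the curve yields a shared secret; HKDF turns it into an AES key.
shared = ephemeral.exchange(ec.ECDH(), storage_key.public_key())
aes_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
               info=b"tra-cloud-storage").derive(shared)

nonce = os.urandom(12)
ciphertext = AESGCM(aes_key).encrypt(nonce, b"transaction record", None)
# The ephemeral public key and nonce are stored with the ciphertext so the
# provider can re-derive aes_key and decrypt.
```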
Since virtualization technology enables the abstraction and sharing of resources in a flexible management way, the overall expenses of network deployment can be significantly reduced. Therefore, the technology has been widely applied in the core network. With the tremendous growth in mobile traffic and services, it is natural to extend virtualization technology to cloud computing based radio access networks (CC-RANs) for achieving high spectral efficiency at low cost. In this paper, the virtualization technologies in CC-RANs are surveyed, including the system architecture, key enabling techniques, challenges, and open issues. The key enabling technologies for virtualization in CC-RANs, mainly including virtual resource allocation, radio access network (RAN) slicing, mobility management, and social awareness, are comprehensively surveyed with respect to the isolation, customization, and high-efficiency utilization of radio resources. The challenges and open issues mainly concern virtualization levels for CC-RANs, signaling design for CC-RAN virtualization, performance analysis for CC-RAN virtualization, and network security for virtualized CC-RANs.
In light of the coronavirus disease 2019 (COVID-19) outbreak caused by the novel coronavirus, companies and institutions have instructed their employees to work from home as a precautionary measure to reduce the risk of contagion. Employees, however, have been exposed to different security risks because of working from home. Moreover, the rapid global spread of COVID-19 has increased the volume of data generated from various sources. Working from home depends mainly on cloud computing (CC) applications that help employees to efficiently accomplish their tasks. The cloud computing environment (CCE) is an unsung hero in the COVID-19 pandemic crisis. It offers fast-paced services that reflect the trend toward rapidly deployable applications for maintaining data. Despite the increase in the use of CC applications, there are ongoing research challenges in the domains of CCE concerning data, guaranteeing security, and the availability of CC applications. This paper, to the best of our knowledge, is the first to thoroughly explain the impact of the COVID-19 pandemic on CCE. Additionally, it highlights the security risks of working from home during the COVID-19 pandemic.
Cloud computing technology is changing the development and usage patterns of IT infrastructure and applications. Virtualized and distributed systems, as well as unified management and scheduling, have greatly improved computing and storage. Management has become easier, and OAM costs have been significantly reduced. Cloud desktop technology is developing rapidly. With this technology, users can flexibly and dynamically use virtual machine resources, companies' efficiency in using and allocating resources is greatly improved, and information security is ensured. In most existing virtual cloud desktop solutions, computing and storage are bound together, and data is stored as image files. This limits the flexibility and expandability of systems and is insufficient for meeting customers' requirements in different scenarios.
In 2010, cloud computing gained momentum. Cloud computing is a model for real-time, on-demand, pay-for-use network access to a shared pool of configurable computing and storage resources. It has matured from a promising business concept to a working reality in both the private and public IT sectors. The U.S. government, for example, has requested all its agencies to evaluate cloud computing alternatives as part of their budget submissions for new IT investment.
In many IIoT architectures, various devices connect to the edge cloud via gateway systems, and large amounts of data are delivered to the edge cloud for processing. Delivering data to an appropriate edge cloud is critical to improving IIoT service efficiency. There are two types of costs in this kind of IoT network: a communication cost and a computing cost. For service efficiency, the communication cost of data transmission should be minimized, and the computing cost in the edge cloud should also be minimized. Therefore, in this paper, the communication cost for data transmission is defined as the delay factor, and the computing cost in the edge cloud is defined as the waiting time induced by the computing intensity. The proposed method selects an edge cloud that minimizes the total of the communication and computing costs; that is, a device chooses a routing path to the selected edge cloud based on these costs. The proposed method controls the data flows in a mesh-structured network and appropriately distributes the data processing load. The performance of the proposed method is validated through extensive computer simulation. When the transition probability from good to bad is 0.3 and the transition probability from bad to good is 0.7 in the wireless and edge cloud states, the proposed method reduced both the average delay and the service pause counts to about 25% of those of the existing method.
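The selection rule reduces to picking the minimizer of the two cost terms; a minimal sketch follows (names and numbers are illustrative assumptions):

```python
# Pick the edge cloud minimizing communication cost (path delay to it) plus
# computing cost (its current waiting time), as described above.
def select_edge_cloud(device, edge_clouds):
    return min(edge_clouds,
               key=lambda e: e["path_delay"][device] + e["waiting_time"])

edge_clouds = [
    {"name": "edge-A", "path_delay": {"dev1": 4.0}, "waiting_time": 9.0},
    {"name": "edge-B", "path_delay": {"dev1": 7.5}, "waiting_time": 2.5},
]
best = select_edge_cloud("dev1", edge_clouds)  # edge-B: total 10.0 beats edge-A: 13.0
```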
In this paper, we conduct research on the development trends and general applications of fuzzy rough granular computing theory. Granular computing is a new general information processing and computing paradigm that covers the study of theories, methods, techniques, and tools across all levels of granularity. The basic ideas of granular computing appear in many areas, such as interval analysis, rough set theory, clustering analysis, information retrieval, machine learning, and databases. Its domain-partition view of target concepts and its rule acquisition are widely used in knowledge discovery, data mining, and pattern recognition. On this basis, we propose a fuzzy rough theory based computing paradigm that attains ideal performance.
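To ground the granulation idea, here is a minimal rough-set sketch: a target concept is bracketed from below and above by the granules (equivalence classes) of a partition. The data is illustrative.

```python
def rough_approximations(granules, target):
    """Lower/upper approximation of a target concept by a partition."""
    lower = set().union(*(g for g in granules if g <= target))  # granules fully inside
    upper = set().union(*(g for g in granules if g & target))   # granules overlapping
    return lower, upper

granules = [{1, 2}, {3, 4}, {5}]   # partition induced by known attributes
target = {1, 2, 3}                 # concept to describe
lower, upper = rough_approximations(granules, target)  # -> {1, 2}, {1, 2, 3, 4}
```

Elements in the gap between the two approximations ({3, 4} here) are exactly those the available granularity cannot classify, which is where fuzzy memberships enter.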
IT is a dynamic field that changes very rapidly; for most companies, efficient management of such systems requires handling tremendously complex situations in terms of hardware and software setup. Hardware and software themselves change quickly over time, and keeping them updated is a difficult problem for most companies; the problem is more pronounced for companies with a large infrastructure of IT facilities, such as data centers, which are expensive to maintain. Many applications run on company premises and require well-prepared staff to maintain them successfully. With the inception of cloud computing, many companies have transferred their applications and data onto cloud computing based platforms in order to reduce maintenance costs, ease maintenance in terms of hardware and software, and obtain reliable and securely accessible services. The benefits of building distributed applications using Google infrastructure are discussed in this paper.
Fog computing has recently developed as a new paradigm that aims to serve time-sensitive applications better than cloud computing by placing and processing tasks in close proximity to the data sources. However, the majority of fog nodes in this environment are geographically scattered, with resources that are limited in capability compared with cloud nodes, thus making the application placement problem more complex than in cloud computing. A cost-efficient approach to application placement in fog-cloud computing environments combines the benefits of both fog and cloud computing to optimize the placement of applications and services while minimizing costs. Such an approach is particularly relevant in scenarios where latency, resource constraints, and cost considerations are crucial factors for the deployment of applications. In this study, we propose a hybrid approach that combines a genetic algorithm (GA) with the Flamingo Search Algorithm (FSA) to place application modules while minimizing cost. We consider four cost types for application deployment: computation, communication, energy consumption, and violations. The proposed hybrid approach, called GA-FSA, is designed to place the application modules with respect to the application's deadline and deploy them appropriately to fog or cloud nodes to curtail the overall cost of the system. An extensive simulation is conducted to assess the performance of the proposed approach compared with other state-of-the-art approaches. The results demonstrate that the GA-FSA approach is superior to the other approaches with respect to task guarantee ratio (TGR) and total cost.
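To illustrate the GA half of such a hybrid, the sketch below evolves module-to-node placements against a total-cost fitness; crossover and the FSA refinement step are omitted, and the per-node cost table collapsing the four cost types is an assumption for brevity.

```python
import random

NODES = ["fog-1", "fog-2", "cloud"]
COST = {"fog-1": 3.0, "fog-2": 3.5, "cloud": 5.0}  # stand-in for computation +
                                                   # communication + energy + violation

def total_cost(placement):                  # fitness: cheaper placements are better
    return sum(COST[node] for node in placement)

def mutate(placement, rate=0.2):            # reassign random modules to other nodes
    return [random.choice(NODES) if random.random() < rate else n for n in placement]

population = [[random.choice(NODES) for _ in range(6)] for _ in range(20)]
for _ in range(50):                         # evolve 6-module placements
    population.sort(key=total_cost)
    parents = population[:10]               # keep the cheaper half
    population = parents + [mutate(random.choice(parents)) for _ in range(10)]
best = min(population, key=total_cost)
```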
In this study, we delve into the realm of efficient Big Data Engineering and Extract, Transform, Load (ETL) processes within the healthcare sector, leveraging the robust foundation provided by the MIMIC-III Clinical Database. Our investigation entails a comprehensive exploration of various methodologies aimed at enhancing the efficiency of ETL processes, with a primary emphasis on optimizing time and resource utilization. Through meticulous experimentation utilizing a representative dataset, we shed light on the advantages associated with the incorporation of PySpark and Docker containerized applications. Our research illuminates significant advancements in time efficiency, process streamlining, and resource optimization attained through the utilization of PySpark for distributed computing within Big Data Engineering workflows. Additionally, we underscore the strategic integration of Docker containers, delineating their pivotal role in augmenting scalability and reproducibility within the ETL pipeline. This paper encapsulates the pivotal insights gleaned from our experimental journey, accentuating the practical implications and benefits entailed in the adoption of PySpark and Docker. By streamlining Big Data Engineering and ETL processes in the context of clinical big data, our study contributes to the ongoing discourse on optimizing data processing efficiency in healthcare applications. The source code is available on request.
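As a small, hedged illustration of the PySpark side of such a pipeline (table and column names follow MIMIC-III conventions, but the specific transform is our assumption, not the study's code):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("mimic-etl-sketch").getOrCreate()

# Extract: read raw MIMIC-III-style tables.
admissions = spark.read.csv("ADMISSIONS.csv", header=True, inferSchema=True)
patients = spark.read.csv("PATIENTS.csv", header=True, inferSchema=True)

# Transform: join, derive a length-of-stay feature, aggregate per patient.
stays = (admissions.join(patients, "SUBJECT_ID")
         .withColumn("LOS_DAYS", F.datediff(F.col("DISCHTIME"), F.col("ADMITTIME")))
         .groupBy("SUBJECT_ID")
         .agg(F.avg("LOS_DAYS").alias("AVG_LOS_DAYS")))

# Load: write a columnar output for downstream analysis.
stays.write.mode("overwrite").parquet("avg_los_by_patient.parquet")
```

Running the same script inside a Docker image pinning the Spark and Python versions is what gives the reproducibility the study emphasizes.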
The current education field is experiencing an innovation driven by big data and cloud technologies, and these advanced technologies play a central role in the construction of smart campuses. Big data technology has a wide range of applications in student learning behavior analysis, teaching resource management, campus safety monitoring, and decision support, which improves the quality of education and management efficiency. Cloud computing technology supports the integration, distribution, and optimal use of educational resources through cloud resource sharing, virtual classrooms, intelligent campus management systems, and Infrastructure-as-a-Service (IaaS) models, which reduce costs and increase flexibility. This paper comprehensively discusses the practical application of big data and cloud computing technologies in smart campuses, showing how these technologies can contribute to the development of smart campuses and laying the foundation for the future innovation of education models.