The number of students demanding computer science (CS) education is rapidly rising, and while faculty sizes are also growing, the traditional pipeline consisting of a CS major, a CS master's, and then a move to industry or a Ph.D. program is simply not scalable. To address this problem, the Department of Computing at the University of Illinois has introduced a multidisciplinary approach to computing: a scalable, collaborative approach that capitalizes on the tremendous demand for computer science education. The key component of the approach is the blended major, also referred to as "CS+X", where CS denotes computer science and X denotes a non-computing field. These CS+X blended degrees enable win-win partnerships among multiple subject areas, distributing the educational responsibilities while growing the entire university. To meet demand from non-CS majors, a graduate certificate program is offered in addition to the traditional minor program. To accommodate the large number of students, scalable teaching tools, such as automatic graders, have also been developed.
Computer science (CS) is the discipline that studies scientific and practical approaches to computation and its applications. As we enter the Internet era, computers and the Internet have become intimate parts of our daily life. Due to the field's recent rapid development and wide applications, more CS graduates are needed in industries around the world. In the USA, this situation is even more severe due to the rapid expansion of several big IT-related companies such as Microsoft, Google, Facebook, Amazon, and IBM. Hence, how to effectively train a large number of…
At the panel session of the 3rd Global Forum on the Development of Computer Science, attendees had an opportunity to deliberate recent issues affecting computer science departments as a result of the field's recent growth. Six heads of university computer science departments participated in the discussions, including the moderator, Professor Andrew Yao. The first issue was how universities are managing the growing number of applicants in addition to swelling class sizes. Several approaches were suggested, including increasing faculty hiring, implementing scalable teaching tools, and working more closely with other departments through degree programs that integrate computer science with other fields. The second issue was the position and role of computer science within the broader sciences. Participants generally agreed that all fields increasingly rely on computer science techniques, and that effectively disseminating these techniques to others is a key to unlocking broader scientific progress.
Improving quality assurance (QA) processes and acquiring accreditation are top priorities for academic programs. Learning outcomes (LOs) assessment and continuous quality improvement represent core components of the quality assurance system (QAS). Current assessment methods suffer deficiencies related to accuracy and reliability, and they lack well-organized processes for continuous improvement planning. Moreover, the absence of automation and integration in QA processes forms a major obstacle to developing an efficient quality system. There is also a pressing need to adopt security protocols that provide the security services required to safeguard the valuable information processed by the QAS. This research proposes an effective methodology for LOs assessment and continuous improvement processes. The proposed approach ensures more accurate and reliable LOs assessment results and provides a systematic way to utilize those results in continuous quality improvement. These systematic and well-specified QA processes were then used to model and implement an automated, secure QAS that efficiently performs quality-related processes. The proposed system adopts two security protocols that provide confidentiality, integrity, and authentication for quality data and reports. The security protocols also prevent source repudiation, which is important in a quality reporting system; this is achieved by implementing strong cryptographic algorithms. The QAS enables the efficient data collection and processing required for analysis and interpretation. It also lays the groundwork for datasets that can be used in future artificial intelligence (AI) research to support decision making and improve the quality of academic programs. The proposed approach is demonstrated in a successful real case study for a computer science program. The study serves scientific programs struggling to achieve academic accreditation, and points toward fully automating and integrating QA processes and adopting modern AI and security technologies to develop effective QASs.
The need for information systems in organizations and economic units is growing: many business processes generate a great deal of data that must be processed into information useful to multiple users. New management accounting systems are expected to meet the financial, accounting, and management needs of institutions and individuals easily, while ensuring the accuracy, speed, and confidentiality of the information for which the system is designed. This paper describes a computerized system that predicts the budget for the new year from past budgets using time series analysis, keeping forecast errors to a minimum, and that controls the budget during the year: it tracks expenditure against the plan, calculates deviations, measures performance ratios, and computes a number of budget-related indicators, such as the capital intensity rate, the growth rate, and the profitability ratio, giving a clear indication of whether these ratios are good or not. The system has a positive impact on information systems practice through its ability to perform complex calculations and process paperwork faster than before, and it offers high flexibility, since it can make any adjustments required to help the relevant parties control financial matters and take appropriate decisions.
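As a generic illustration of the forecasting step (not the paper's exact model), next year's budget can be extrapolated from past budgets with a simple linear trend, and the deviation computed once actuals arrive. All figures below are hypothetical:

```python
import numpy as np

# Hypothetical annual budget figures (illustrative numbers only).
years = np.array([2019, 2020, 2021, 2022, 2023], dtype=float)
budgets = np.array([100.0, 108.0, 115.0, 124.0, 131.0])  # e.g. thousands of dollars

# Fit a linear trend: budget ≈ slope * year + intercept.
slope, intercept = np.polyfit(years, budgets, deg=1)

# Extrapolate one year ahead to obtain next year's predicted budget.
forecast_2024 = slope * 2024 + intercept

# During the year: compare actual spend against the plan and report deviation
# and simple budget indicators such as the growth rate.
actual_2024 = 140.0  # hypothetical observed spend
deviation = actual_2024 - forecast_2024
growth_rate = (forecast_2024 - budgets[-1]) / budgets[-1]
print(f"forecast={forecast_2024:.1f} deviation={deviation:.1f} growth={growth_rate:.3f}")
```

More sophisticated time-series methods (moving averages, exponential smoothing) slot into the same forecast-then-monitor loop.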
The importance of prerequisites in education has recently become a promising research direction. This work proposes a statistical model for measuring dependencies between knowledge units in learning resources. Instructors are expected to present knowledge units in a semantically well-organized manner to facilitate students' understanding of the material. The proposed model reveals how the inner concepts of a knowledge unit depend on each other and on concepts outside the knowledge unit. To help capture the complexity of the inner concepts themselves, WordNet is included as an external knowledge base in this model. The goal is to develop a model that enables instructors to evaluate whether or not a learning regime has hidden relationships that might hinder students' ability to understand the material. The evaluation, employing three textbooks, shows that the proposed model succeeds in discovering hidden relationships among knowledge units in learning resources and in exposing the knowledge gaps in some knowledge units.
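As a toy illustration of measuring concept dependencies (not the paper's statistical model, which the abstract does not specify), one simple asymmetric score is the conditional co-occurrence probability P(a | b), estimated from the sections of a learning resource:

```python
from collections import Counter
from itertools import combinations

# Toy "learning resource": each section is the set of concepts it mentions.
sections = [
    {"recursion", "stack", "function"},
    {"stack", "queue"},
    {"recursion", "tree", "function"},
    {"tree", "graph"},
]

concept_count = Counter()
pair_count = Counter()
for sec in sections:
    concept_count.update(sec)
    pair_count.update(combinations(sorted(sec), 2))

def dependency(a, b):
    """Estimate P(a appears | b appears): an asymmetric dependency score."""
    joint = pair_count[tuple(sorted((a, b)))]
    return joint / concept_count[b] if concept_count[b] else 0.0

print(dependency("recursion", "function"))  # every 'function' section also mentions recursion
```

A high score for a pair that the instructor never presented together would flag exactly the kind of hidden relationship the model is meant to expose.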
It's a great pleasure for me to be here today and have this opportunity to talk to you about my view of the future of computer science, because I think this is a very important time for those of you, the students. What I like to do is I like…
In the very beginning, the Computer Laboratory of the University of Cambridge was founded to provide computing service for different disciplines across the university. As computer science developed into a discipline in its own right, boundaries necessarily arose between it and other disciplines, in a way that is now often detrimental to progress. It is therefore necessary to reinvigorate the relationship between computer science and other academic disciplines and to celebrate exploration and creativity in research. To do this, the structures of the academic department have to act as supporting scaffolding rather than barriers. Examples are given of the efforts being made at the University of Cambridge to approach this problem.
The rapid digitalization of urban infrastructure has made smart cities increasingly vulnerable to sophisticated cyber threats. In the evolving landscape of cybersecurity, the efficacy of Intrusion Detection Systems (IDS) is increasingly measured by technical performance, operational usability, and adaptability. This study introduces and rigorously evaluates an HCI-Integrated IDS, built on a Convolutional Neural Network (CNN), a CNN-Long Short-Term Memory (LSTM) network, and a Random Forest (RF), against both a baseline machine learning (ML) model and a traditional IDS, where HCI denotes Human-Computer Interaction. The evaluation uses an extensive experimental framework encompassing many performance metrics, including detection latency, accuracy, alert prioritization, classification errors, system throughput, usability, ROC-AUC, precision-recall, confusion matrix analysis, and statistical accuracy measures. Our findings, across three major datasets (CICIDS 2017, KDD Cup 1999, and UNSW-NB15), consistently demonstrate the superiority of the HCI-Integrated approach. Experimental results indicate that the HCI-Integrated model outperforms its counterparts, achieving an AUC-ROC of 0.99, a precision of 0.93, and a recall of 0.96, while maintaining the lowest false positive rate (0.03) and the fastest detection time (~1.5 s). These findings validate the efficacy of incorporating HCI to enhance anomaly detection capabilities, improve responsiveness, and reduce alert fatigue in critical smart city applications. The model achieves markedly lower detection times, higher accuracy across all threat categories, reduced false positive and false negative rates, and enhanced system throughput under concurrent load. The HCI-Integrated IDS also excels in alert contextualization and prioritization, offering more actionable insights while minimizing analyst fatigue. Usability feedback underscores increased analyst confidence and operational clarity, reinforcing the importance of user-centered design. These results collectively position the HCI-Integrated IDS as a highly effective, scalable, and human-aligned solution for modern threat detection environments.
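The headline numbers (precision 0.93, recall 0.96, false positive rate 0.03) all derive from a confusion matrix. A short sketch with hypothetical counts, chosen only to reproduce those figures, shows how:

```python
# Deriving the headline detection metrics from a confusion matrix.
# Counts are hypothetical, picked to match the abstract's reported values.
tp, fp, fn, tn = 960, 72, 40, 2328   # attack = positive class

precision = tp / (tp + fp)            # flagged traffic that really was an attack
recall = tp / (tp + fn)               # attacks that were caught (detection rate)
false_positive_rate = fp / (fp + tn)  # benign traffic wrongly flagged
f1 = 2 * precision * recall / (precision + recall)

print(f"precision={precision:.2f} recall={recall:.2f} "
      f"FPR={false_positive_rate:.2f} F1={f1:.2f}")
```

Note that precision and the false positive rate use different denominators, which is why both must be reported: a low FPR alone can hide poor precision when benign traffic vastly outnumbers attacks.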
This paper introduces a novel fractional-order model based on the Caputo-Fabrizio (CF) derivative for analyzing computer virus propagation in networked environments. The model partitions the computer population into four compartments: susceptible, latently infected, breaking-out, and antivirus-capable systems. By employing the CF derivative, which uses a nonsingular exponential kernel, the framework effectively captures memory-dependent and nonlocal characteristics intrinsic to cyber systems, aspects inadequately represented by traditional integer-order models. Under Lipschitz continuity and boundedness assumptions, the existence and uniqueness of solutions are rigorously established via fixed-point theory. We develop a tailored two-step Adams-Bashforth numerical scheme for the CF framework and prove its second-order accuracy. Extensive numerical simulations across various fractional orders reveal that memory effects significantly influence virus transmission and control dynamics; smaller fractional orders produce more pronounced memory effects, delaying both infection spread and antivirus activation. Further theoretical analysis, including Hyers-Ulam stability and sensitivity assessments, reinforces the model's robustness and identifies key parameters governing virus dynamics. The study also extends the framework to incorporate stochastic effects through a stochastic CF formulation. These results underscore fractional-order modeling as a powerful analytical tool for developing robust and effective cybersecurity strategies.
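The paper's tailored scheme is specific to the CF kernel, but its backbone is the classical two-step Adams-Bashforth method, y_{n+1} = y_n + h(3/2 f_n − 1/2 f_{n−1}). A minimal sketch of that classical method (not the paper's CF-specific variant) on a toy ODE:

```python
import math

def ab2(f, y0, h, steps):
    """Classical two-step Adams-Bashforth:
        y_{n+1} = y_n + h * (3/2 * f(y_n) - 1/2 * f(y_{n-1})).
    The first step is bootstrapped with a single Euler step."""
    y_prev = y0
    y = y0 + h * f(y0)          # Euler bootstrap for the missing history point
    f_prev = f(y_prev)
    for _ in range(steps - 1):
        f_curr = f(y)
        y, y_prev = y + h * (1.5 * f_curr - 0.5 * f_prev), y
        f_prev = f_curr
    return y

# Sanity check on y' = -y, y(0) = 1, whose exact solution is exp(-t).
approx = ab2(lambda y: -y, 1.0, 0.01, 100)   # integrate to t = 1
print(approx, math.exp(-1.0))
```

The second-order accuracy the paper proves for its CF variant mirrors the classical result: halving h here cuts the global error by roughly a factor of four.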
Mortality prediction in respiratory health is challenging, especially when using large-scale clinical datasets composed primarily of categorical variables. Traditional digital twin (DT) frameworks often rely on longitudinal or sensor-based data, which are not always available in public health contexts. In this article, we propose a novel proto-DT framework for mortality prediction in respiratory health using a large-scale categorical biomedical dataset. This dataset contains 415,711 severe acute respiratory infection cases from the Brazilian Unified Health System, including both COVID-19 and non-COVID-19 patients. Four classification models, extreme gradient boosting (XGBoost), logistic regression, random forest, and a deep neural network (DNN), are trained using cost-sensitive learning to address class imbalance. The models are evaluated using accuracy, precision, recall, F1-score, and area under the receiver operating characteristic curve (AUC-ROC). The framework supports simulated interventions by modifying selected inputs and recalculating predicted mortality. Additionally, we incorporate multiple correspondence analysis and K-means clustering to explore model sensitivity. A Python library has been developed to ensure reproducibility. All models achieve AUC-ROC values near or above 0.85. XGBoost yields the highest accuracy (0.84), while the DNN achieves the highest recall (0.81). Scenario-based simulations reveal how key clinical factors, such as intensive care unit admission and oxygen support, affect predicted outcomes. The proposed proto-DT framework demonstrates the feasibility of mortality prediction and intervention simulation using categorical data alone. This framework provides a foundation for data-driven explainable DTs in public health, even in the absence of time-series data.
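The cost-sensitive learning step can be illustrated generically: weight each class inversely to its frequency so the minority (mortality) class is not ignored. A minimal numpy sketch with synthetic, hypothetical data (not the Brazilian dataset or any of the paper's actual models):

```python
import numpy as np

rng = np.random.default_rng(0)

# Imbalanced toy data: 90% survivors (label 0), 10% deaths (label 1),
# with one informative feature separating the classes.
n0, n1 = 900, 100
X = np.concatenate([rng.normal(0.0, 1.0, n0), rng.normal(2.0, 1.0, n1)])[:, None]
y = np.concatenate([np.zeros(n0), np.ones(n1)])

# Cost-sensitive learning: weight each class inversely to its frequency.
w_class = len(y) / (2 * np.bincount(y.astype(int)))
sample_w = w_class[y.astype(int)]

# Weighted logistic regression via plain gradient descent.
Xb = np.hstack([X, np.ones((len(y), 1))])   # add a bias column
theta = np.zeros(2)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-Xb @ theta))
    grad = Xb.T @ (sample_w * (p - y)) / len(y)   # weighted log-loss gradient
    theta -= 0.1 * grad

# Recall on the minority class, the metric cost-sensitivity is meant to protect.
pred = 1.0 / (1.0 + np.exp(-Xb @ theta)) > 0.5
recall = (pred & (y == 1)).sum() / n1
```

Without the weights, the optimizer can achieve low average loss by neglecting the rare class; the weighting makes each mortality case count as much as several survivor cases.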
Since its inaugural issue in 1986, the Journal of Computer Science and Technology (JCST) has been the premier English-language journal of the China Computer Federation (CCF), serving international readers and authors by disseminating scholarly and technical papers under a rigorous review process.
Technological innovation ushered in the computer era, and, after a few years of tutelage by established disciplines, computer science emerged as an independent discipline. In the subsequent decades, computer science developed its special identity, sharing the dual character of engineering and mathematics. This evolution is revisited here based on my personal experience. In my view, the notion of the computational model has been the enabler of extraordinary creativity, and at the same time the source of critical reflection two decades ago. However, capitalizing on a vibrant technology, computer science is reinventing itself as the indispensable enabler of applications. This is a crucial profile that calls for a pedagogical adaptation, in which the notion of model morphs from means to end.
As computer science enrollments continue to surge, assessments that involve student collaboration may play a more critical role in improving student learning. We provide a review of some of the most commonly adopted collaborative assessments in computer science, including pair programming, collaborative exams, and group projects. Existing research on these assessment formats is categorized and compared. We also discuss potential future research topics on these collaborative assessment formats.
This paper presents CMOS circuit designs of a ternary adder and a ternary multiplier, formulated using transmission function theory. Binary carry signals appearing in these designs allow conventional look-ahead carry techniques to be used. Compared with previous similar designs, the circuits proposed in this paper have advantages such as low dissipation, low output impedance, and simplicity of construction.
The Intelligent Internet of Things (IIoT) involves real-world things that communicate or interact with each other through networking technologies, collecting data from these "things" and using intelligent approaches, such as Artificial Intelligence (AI) and machine learning, to make accurate decisions. Data science is the science of dealing with data and its relationships through intelligent approaches. Most state-of-the-art research focuses independently on either data science or the IIoT, rather than exploring their integration. To address this gap, this article provides a comprehensive survey on the advances in, and integration of, data science with the IIoT, classifying existing IoT-based data science techniques and summarizing their various characteristics. The paper analyzes data science and big data security and privacy features, including network architecture, data protection, and continuous monitoring of data, which face challenges in various IoT-based systems. Extensive insights into IoT data security, privacy, and challenges are presented in the context of data science for the IoT. In addition, this study reveals current opportunities to enhance data science and IoT market development. The current gaps and challenges in the integration of data science and the IoT are comprehensively presented, followed by the future outlook and possible solutions.
Recent years have witnessed the ever-increasing performance of Deep Neural Networks (DNNs) in computer vision tasks. However, researchers have identified a potential vulnerability: carefully crafted adversarial examples can easily mislead DNNs into incorrect behavior via the injection of imperceptible modifications to the input data. In this survey, we focus on (1) adversarial attack algorithms that generate adversarial examples, (2) adversarial defense techniques that secure DNNs against adversarial examples, and (3) important problems in the realm of adversarial examples beyond attack and defense, including theoretical explanations, trade-off issues, and benign attacks. Additionally, we draw a brief comparison between recently published surveys on adversarial examples, and identify future directions for research on adversarial examples, such as the generalization of methods and the understanding of transferability, that might offer solutions to the open problems in this field.
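A canonical instance of the "carefully crafted adversarial examples" the survey covers is the Fast Gradient Sign Method (FGSM), which perturbs the input along the sign of the loss gradient. A minimal numpy sketch on a toy logistic model (the weights are illustrative, not from any real DNN):

```python
import numpy as np

# A fixed linear "model": logit = w.x + b, weights assumed already trained.
w = np.array([1.5, -2.0, 0.5])
b = 0.1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bce_loss(x, y):
    """Binary cross-entropy loss of the model on input x with label y."""
    p = sigmoid(w @ x + b)
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))

x = np.array([0.2, -0.4, 0.9])
y = 1.0

# FGSM: step each input coordinate by eps in the direction that increases
# the loss. For this model the input gradient is (p - y) * w.
eps = 0.25
grad_x = (sigmoid(w @ x + b) - y) * w
x_adv = x + eps * np.sign(grad_x)

print(bce_loss(x, y), bce_loss(x_adv, y))  # loss rises after the attack
```

For deep networks the same one-line update applies, with the input gradient obtained by backpropagation; the perturbation stays within an L∞ ball of radius eps, which is what makes it visually imperceptible for small eps.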
Deep-time Earth research plays a pivotal role in deciphering the rates, patterns, and mechanisms of Earth's evolutionary processes throughout geological history, providing essential scientific foundations for climate prediction, natural resource exploration, and sustainable planetary stewardship. To advance deep-time Earth research in the era of big data and artificial intelligence, the International Union of Geological Sciences initiated the "Deep-time Digital Earth International Big Science Program" (DDE) in 2019. At the core of this ambitious program lies the development of geoscience knowledge graphs, serving as a transformative knowledge infrastructure that enables the integration, sharing, mining, and analysis of heterogeneous geoscience big data. The DDE knowledge graph initiative has made significant strides in three critical dimensions: (1) establishing a unified knowledge structure across geoscience disciplines that ensures consistent representation of geological entities and their interrelationships through standardized ontologies and semantic frameworks; (2) developing a robust and scalable software infrastructure capable of supporting both expert-driven and machine-assisted knowledge engineering for large-scale graph construction and management; and (3) implementing a comprehensive three-tiered architecture encompassing basic, discipline-specific, and application-oriented knowledge graphs, spanning approximately 20 geoscience disciplines. Through its open knowledge framework and international collaborative network, this initiative has fostered multinational research collaborations, establishing a robust foundation for next-generation geoscience research while propelling the discipline toward FAIR (Findable, Accessible, Interoperable, Reusable) data practices in deep-time Earth systems research.
Accurate fingertip detection is critical for translating hand gestures into actionable commands in vision-based human-computer interaction (HCI) systems. However, challenges such as complex backgrounds, dynamic hand postures, and real-time processing constraints hinder reliable detection. This paper introduces a robust framework integrating three key innovations: (1) an adaptive Gaussian mixture model (GMM) enhanced with neighborhood pixel connectivity for precise motion extraction; (2) a weighted YCbCr color-space shadow-removal algorithm to eliminate false positives; and (3) a centroid distance method refined with circularity constraints for accurate fingertip localization. Extensive experiments demonstrate a recognition accuracy of 97.26% across diverse scenarios, including varying illuminations, occlusions, and hand rotations. The algorithm processes each frame in 23.43 ms on average, satisfying real-time requirements. Comparative evaluations against state-of-the-art methods reveal significant improvements in precision (8.3%), recall (6.1%), and F-measure (7.8%). This work advances HCI applications such as virtual keyboards, gesture-controlled interfaces, and augmented reality systems.
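The shadow-removal step operates in the YCbCr color space, which separates luma (Y) from chroma (Cb, Cr). A minimal numpy sketch of the standard ITU-R BT.601 full-range conversion (the paper's specific weighting scheme is not reproduced here, and the example pixels are hypothetical):

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    """ITU-R BT.601 full-range RGB -> YCbCr conversion, the color space in
    which the shadow-removal step thresholds pixels."""
    r, g, b = rgb.astype(float)
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.array([y, cb, cr])

# Chroma (Cb, Cr) is far less sensitive to illumination changes than raw RGB,
# which is why thresholding in YCbCr helps separate shadow pixels (dark, but
# near-neutral chroma) from true skin pixels (distinctly skin-toned chroma).
skin = rgb_to_ycbcr(np.array([200, 140, 120]))       # hypothetical skin pixel
shadow = rgb_to_ycbcr(np.array([60, 60, 65]))        # hypothetical shadow pixel
print(skin, shadow)
```

A neutral gray or white input maps to Cb = Cr = 128 exactly, which makes the neutral point of the chroma axes easy to sanity-check.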
With the rapid development of artificial intelligence technology, AIGC (Artificial Intelligence-Generated Content) has triggered profound changes in high-level language programming courses. This paper explores the application principles, advantages, and limitations of AIGC in intelligent code generation, analyzes the new mode of human-computer collaboration in high-level language programming courses driven by AIGC, discusses the impact of human-computer collaboration on programming efficiency and code quality through practical case studies, and looks ahead to future development trends. This research aims to provide theoretical and practical guidance for high-level language programming courses and to promote their innovative development under the human-computer collaboration paradigm.
文摘Improving the quality assurance (QA) processes and acquiring accreditation are top priorities for academic programs. The learning outcomes (LOs)assessment and continuous quality improvement represent core components ofthe quality assurance system (QAS). Current assessment methods suffer deficiencies related to accuracy and reliability, and they lack well-organized processes forcontinuous improvement planning. Moreover, the absence of automation, andintegration in QA processes forms a major obstacle towards developing efficientquality system. There is a pressing need to adopt security protocols that providerequired security services to safeguard the valuable information processed byQAS as well. This research proposes an effective methodology for LOs assessment and continuous improvement processes. The proposed approach ensuresmore accurate and reliable LOs assessment results and provides systematic wayfor utilizing those results in the continuous quality improvement. This systematicand well-specified QA processes were then utilized to model and implement automated and secure QAS that efficiently performs quality-related processes. Theproposed system adopts two security protocols that provide confidentiality, integrity, and authentication for quality data and reports. The security protocols avoidthe source repudiation, which is important in the quality reporting system. This isachieved through implementing powerful cryptographic algorithms. The QASenables efficient data collection and processing required for analysis and interpretation. It also prepares for the development of datasets that can be used in futureartificial intelligence (AI) researches to support decision making and improve thequality of academic programs. The proposed approach is implemented in a successful real case study for a computer science program. 
The current study servesscientific programs struggling to achieve academic accreditation, and gives rise tofully automating and integrating the QA processes and adopting modern AI andsecurity technologies to develop effective QAS.
文摘The need for information systems in organizations and economic units increases as there is a great deal of data that arise from doing many of the processes in order to be addressed to provide information that can bring interest to multi-users, the new and distinctive management accounting systems which meet in a manner easily all the needs of institutions and individuals from financial business, accounting and management, which take into account the accuracy, speed and confidentiality of the information for which the system is designed. The paper aims to describe a computerized system that is able to predict the budget for the new year based on past budgets by using time series analysis, which gives results with errors to a minimum and controls the budget during the year, through the ability to control exchange, compared to the scheme with the investigator and calculating the deviation, measurement of performance ratio and the expense of a number of indicators relating to budgets, such as the rate of condensation of capital, the growth rate and profitability ratio and gives a clear indication whether these ratios are good or not. There is a positive impact on information systems through this system for its ability to accomplish complex calculations and process paperwork, which is faster than it was previously and there is also a high flexibility, where the system can do any adjustments required in helping relevant parties to control the financial matters of the decision-making appropriate action thereon.
Abstract: The importance of prerequisites for education has recently become a promising research direction. This work proposes a statistical model for measuring dependencies between knowledge units in learning resources. Instructors are expected to present knowledge units in a semantically well-organized manner to facilitate students' understanding of the material. The proposed model reveals how the inner concepts of a knowledge unit depend on each other and on concepts outside the knowledge unit. To help capture the complexity of the inner concepts themselves, WordNet is included as an external knowledge base in this model. The goal is to develop a model that enables instructors to evaluate whether a learning regime has hidden relationships that might hinder students' ability to understand the material. The evaluation, employing three textbooks, shows that the proposed model succeeds in discovering hidden relationships among knowledge units in learning resources and in exposing the knowledge gaps in some knowledge units.
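The general idea of measuring concept dependencies from a textbook can be sketched very simply: if concept b rarely appears without concept a, then b plausibly presupposes a. The following toy estimator is an illustration of that idea only, not the paper's statistical model:

```python
# Toy sketch of scoring concept dependence from co-occurrence counts in
# textbook sections. This is NOT the paper's model; it only illustrates
# the intuition that a concept which never appears without another one
# may depend on it as a prerequisite.

def dependence(sections, a, b):
    """Estimate P(a appears | b appears) over textbook sections:
    a high value suggests b rarely occurs without a."""
    count_b = sum(1 for s in sections if b in s)
    count_ab = sum(1 for s in sections if a in s and b in s)
    return count_ab / count_b if count_b else 0.0

# Hypothetical "textbook": each section is the set of concepts it mentions.
sections = [
    {"variable", "loop"},
    {"variable", "loop", "recursion"},
    {"variable", "function"},
    {"function", "recursion", "variable"},
]
print(dependence(sections, "variable", "recursion"))  # 1.0
print(dependence(sections, "recursion", "variable"))  # 0.5
```

The asymmetry of the score (1.0 vs. 0.5) is what lets such a measure suggest a direction for the prerequisite relationship.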
Abstract: It's a great pleasure for me to be here today and have this opportunity to talk to you about my view of the future of computer science, because I think this is a very important time for those of you, the students. What I like to do is I like
Abstract: In the very beginning, the Computer Laboratory of the University of Cambridge was founded to provide a computing service for different disciplines across the university. As computer science developed as a discipline in its own right, boundaries necessarily arose between it and other disciplines, in a way that is now often detrimental to progress. It is therefore necessary to reinvigorate the relationship between computer science and other academic disciplines and to celebrate exploration and creativity in research. To do this, the structures of the academic department have to act as supporting scaffolding rather than barriers. Some examples are given of the efforts being made at the University of Cambridge to approach this problem.
Funding: Funded and supported by the Ongoing Research Funding program (ORF-2025-314), King Saud University, Riyadh, Saudi Arabia.
Abstract: The rapid digitalization of urban infrastructure has made smart cities increasingly vulnerable to sophisticated cyber threats. In the evolving landscape of cybersecurity, the efficacy of Intrusion Detection Systems (IDS) is increasingly measured by technical performance, operational usability, and adaptability. This study introduces and rigorously evaluates a Human-Computer Interaction (HCI)-integrated IDS, built on a Convolutional Neural Network (CNN), a CNN-Long Short-Term Memory (LSTM) network, and a Random Forest (RF), against both a baseline machine learning (ML) model and a traditional IDS, through an extensive experimental framework covering many performance metrics, including detection latency, accuracy, alert prioritization, classification errors, system throughput, usability, ROC-AUC, precision-recall, confusion matrix analysis, and statistical accuracy measures. Our findings consistently demonstrate the superiority of the HCI-integrated approach on three major datasets (CICIDS 2017, KDD Cup 1999, and UNSW-NB15). Experimental results indicate that the HCI-integrated model outperforms its counterparts, achieving an AUC-ROC of 0.99, a precision of 0.93, and a recall of 0.96, while maintaining the lowest false positive rate (0.03) and the fastest detection time (~1.5 s). These findings validate the efficacy of incorporating HCI to enhance anomaly detection capabilities, improve responsiveness, and reduce alert fatigue in critical smart city applications. The model achieves markedly lower detection times, higher accuracy across all threat categories, reduced false positive and false negative rates, and enhanced system throughput under concurrent load. The HCI-integrated IDS also excels in alert contextualization and prioritization, offering more actionable insights while minimizing analyst fatigue. Usability feedback underscores increased analyst confidence and operational clarity, reinforcing the importance of user-centered design. These results collectively position the HCI-integrated IDS as a highly effective, scalable, and human-aligned solution for modern threat detection environments.
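The headline numbers in the abstract (precision, recall, false positive rate) all derive from a confusion matrix in the standard way; the sketch below shows the arithmetic with hypothetical alert counts chosen to reproduce the reported values, not the study's actual data:

```python
# How the reported IDS metrics derive from a confusion matrix.
# The counts below are hypothetical, chosen only to illustrate the
# arithmetic behind precision 0.93, recall 0.96, and FPR 0.03.

def precision(tp, fp):
    """Fraction of raised alerts that were real attacks."""
    return tp / (tp + fp)

def recall(tp, fn):
    """Fraction of real attacks that were detected."""
    return tp / (tp + fn)

def false_positive_rate(fp, tn):
    """Fraction of benign traffic wrongly flagged."""
    return fp / (fp + tn)

# Hypothetical counts: 960 attacks caught, 40 missed,
# 72 false alarms, 2328 benign flows correctly passed.
tp, fn, fp, tn = 960, 40, 72, 2328
print(round(precision(tp, fp), 3))            # 0.93
print(round(recall(tp, fn), 3))               # 0.96
print(round(false_positive_rate(fp, tn), 3))  # 0.03
```

Note the trade-off these definitions encode: lowering the alert threshold raises recall but tends to raise the false positive rate, which is exactly the alert-fatigue problem the HCI-integrated design targets.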
Funding: Supported and funded by the Deanship of Scientific Research at Imam Mohammad Ibn Saud Islamic University (IMSIU) (grant number IMSIU-DDRSP2601).
Abstract: This paper introduces a novel fractional-order model based on the Caputo-Fabrizio (CF) derivative for analyzing computer virus propagation in networked environments. The model partitions the computer population into four compartments: susceptible, latently infected, breaking-out, and antivirus-capable systems. By employing the CF derivative, which uses a nonsingular exponential kernel, the framework effectively captures the memory-dependent and nonlocal characteristics intrinsic to cyber systems, aspects inadequately represented by traditional integer-order models. Under Lipschitz continuity and boundedness assumptions, the existence and uniqueness of solutions are rigorously established via fixed-point theory. We develop a tailored two-step Adams-Bashforth numerical scheme for the CF framework and prove its second-order accuracy. Extensive numerical simulations across various fractional orders reveal that memory effects significantly influence virus transmission and control dynamics; smaller fractional orders produce more pronounced memory effects, delaying both infection spread and antivirus activation. Further theoretical analysis, including Hyers-Ulam stability and sensitivity assessments, reinforces the model's robustness and identifies the key parameters governing virus dynamics. The study also extends the framework to incorporate stochastic effects through a stochastic CF formulation. These results underscore fractional-order modeling as a powerful analytical tool for developing robust and effective cybersecurity strategies.
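For reference, the nonsingular exponential kernel mentioned in the abstract is the defining feature of the Caputo-Fabrizio derivative, which in its standard form reads:

```latex
{}^{\mathrm{CF}}D_t^{\alpha} f(t)
  \;=\; \frac{M(\alpha)}{1-\alpha}
        \int_0^t f'(s)\,
        \exp\!\left(-\frac{\alpha\,(t-s)}{1-\alpha}\right)\mathrm{d}s,
  \qquad 0 < \alpha < 1,
```

where $M(\alpha)$ is a normalization function with $M(0)=M(1)=1$. Unlike the classical Caputo derivative, whose power-law kernel $(t-s)^{-\alpha}$ is singular at $s=t$, the exponential kernel here is smooth, which is what makes the memory effects tractable in the model's analysis and numerics.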
Abstract: Mortality prediction in respiratory health is challenging, especially when using large-scale clinical datasets composed primarily of categorical variables. Traditional digital twin (DT) frameworks often rely on longitudinal or sensor-based data, which are not always available in public health contexts. In this article, we propose a novel proto-DT framework for mortality prediction in respiratory health using a large-scale categorical biomedical dataset. This dataset contains 415,711 severe acute respiratory infection cases from the Brazilian Unified Health System, including both COVID-19 and non-COVID-19 patients. Four classification models, extreme gradient boosting (XGBoost), logistic regression, random forest, and a deep neural network (DNN), are trained using cost-sensitive learning to address class imbalance. The models are evaluated using accuracy, precision, recall, F1-score, and the area under the receiver operating characteristic curve (AUC-ROC). The framework supports simulated interventions by modifying selected inputs and recalculating predicted mortality. Additionally, we incorporate multiple correspondence analysis and K-means clustering to explore model sensitivity. A Python library has been developed to ensure reproducibility. All models achieve AUC-ROC values near or above 0.85. XGBoost yields the highest accuracy (0.84), while the DNN achieves the highest recall (0.81). Scenario-based simulations reveal how key clinical factors, such as intensive care unit admission and oxygen support, affect predicted outcomes. The proposed proto-DT framework demonstrates the feasibility of mortality prediction and intervention simulation using categorical data alone, and it provides a foundation for data-driven, explainable DTs in public health, even in the absence of time-series data.
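One common way to implement the cost-sensitive learning the abstract mentions is to weight classes inversely to their frequency; the sketch below uses the widely used "balanced" heuristic as an illustration (the paper's exact weighting scheme is not specified here), with hypothetical labels:

```python
# Sketch of cost-sensitive class weighting for an imbalanced mortality
# label, using the common "balanced" heuristic
#   w_c = n_samples / (n_classes * n_samples_in_class_c),
# which up-weights the rare (deceased) class during training.
# Illustrative only; the paper's exact scheme may differ.

from collections import Counter

def balanced_weights(labels):
    counts = Counter(labels)
    n, k = len(labels), len(counts)
    return {c: n / (k * m) for c, m in counts.items()}

# Hypothetical 10%-positive outcome labels (1 = death, 0 = survival):
labels = [0] * 900 + [1] * 100
print(balanced_weights(labels))  # {0: 0.555..., 1: 5.0}
```

The resulting per-class weights are then passed to the loss function (or sample weights) of each classifier, so that misclassifying a rare death costs roughly nine times as much as misclassifying a survival in this example.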
Abstract: Since its inaugural issue in 1986, the Journal of Computer Science and Technology (JCST) has been the premier English-language journal of the China Computer Federation (CCF), serving international readers and authors by disseminating scholarly and technical papers under a rigorous review process.
Abstract: Technological innovation ushered in the computer era, and, after a few years of tutelage by established disciplines, computer science emerged as an independent discipline. In the subsequent decades computer science developed its special identity, sharing the dual character of engineering and mathematics. This evolution is revisited here based on my personal experience. In my view, the notion of computational model has been the enabler of extraordinary creativity, and at the same time the source of critical reflection two decades ago. However, capitalizing on a vibrant technology, computer science is reinventing itself as the indispensable enabler of applications. This is a crucial profile that calls for a pedagogical adaptation, where the notion of model morphs from means to end.
Abstract: As computer science enrollments continue to surge, assessments that involve student collaboration may play a more critical role in improving student learning. We provide a review of some of the most commonly adopted collaborative assessments in computer science, including pair programming, collaborative exams, and group projects. Existing research on these assessment formats is categorized and compared. We also discuss potential future research topics on the aforementioned collaborative assessment formats.
Funding: Project supported by the National Natural Science Foundation of China.
Abstract: This paper presents CMOS circuit designs of a ternary adder and a ternary multiplier, formulated using transmission function theory. The binary carry signals appearing in these designs allow conventional look-ahead carry techniques to be used. Compared with previous similar designs, the circuits proposed in this paper have advantages such as low dissipation, low output impedance, and simplicity of construction.
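The key property the abstract highlights, that the carry signals stay binary even in a ternary adder, can be seen in a simple behavioral model of one digit position (this models logic behavior only, not the paper's CMOS transmission-function circuits):

```python
# Behavioral sketch of a single-trit (radix-3) full adder with a binary
# carry: even in ternary, the carry out of each digit position can only
# be 0 or 1 (since 2 + 2 + 1 = 5 < 2 * 3), which is why conventional
# binary look-ahead carry techniques still apply.
# Logic behavior only; not the paper's CMOS circuit design.

def trit_full_adder(a, b, carry_in):
    """Add two ternary digits (0..2) plus a binary carry.
    Returns (sum_trit, carry_out), with carry_out in {0, 1}."""
    total = a + b + carry_in       # at most 2 + 2 + 1 = 5
    return total % 3, total // 3   # 5 // 3 == 1, so carry_out <= 1

print(trit_full_adder(2, 2, 1))  # (2, 1): 5 = 1*3 + 2
print(trit_full_adder(1, 1, 0))  # (2, 0)
```

Because the carry chain is binary, a multi-trit adder built from this cell can propagate or generate carries exactly as a binary ripple or look-ahead adder would.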
Funding: Supported in part by the National Natural Science Foundation of China under Grant 62371181; in part by the Changzhou Science and Technology International Cooperation Program under Grant CZ20230029; by a National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (2021R1A2B5B02087169); and under the framework of the international cooperation program managed by the National Research Foundation of Korea (2022K2A9A1A01098051).
Abstract: The Intelligent Internet of Things (IIoT) involves real-world things that communicate or interact with each other through networking technologies, collecting data from these "things" and using intelligent approaches, such as artificial intelligence (AI) and machine learning, to make accurate decisions. Data science is the science of dealing with data and its relationships through intelligent approaches. Most state-of-the-art research focuses independently on either data science or the IIoT, rather than exploring their integration. To address this gap, this article provides a comprehensive survey on the advances in, and the integration of, data science with the IIoT by classifying existing IoT-based data science techniques and summarizing their various characteristics. The paper analyzes data science and big data security and privacy features, including network architecture, data protection, and continuous monitoring of data, which face challenges in various IoT-based systems. Extensive insights into IoT data security, privacy, and challenges are presented in the context of data science for IoT. In addition, this study reveals current opportunities to enhance data science and IoT market development. The current gaps and challenges in integrating data science and the IoT are comprehensively presented, followed by a future outlook and possible solutions.
Funding: Supported by the National Natural Science Foundation of China (U1903214, 62372339, 62371350, 61876135), the Ministry of Education Industry-University Cooperative Education Project (202102246004, 220800006041043, 202002142012), and the Fundamental Research Funds for the Central Universities (2042023kf1033).
Abstract: Recent years have witnessed the ever-increasing performance of deep neural networks (DNNs) in computer vision tasks. However, researchers have identified a potential vulnerability: carefully crafted adversarial examples can easily mislead DNNs into incorrect behavior via the injection of imperceptible modifications to the input data. In this survey, we focus on (1) adversarial attack algorithms to generate adversarial examples, (2) adversarial defense techniques to secure DNNs against adversarial examples, and (3) important problems in the realm of adversarial examples beyond attack and defense, including theoretical explanations, trade-off issues, and benign attacks. Additionally, we draw a brief comparison between recently published surveys on adversarial examples, and we identify future directions for adversarial example research, such as the generalization of methods and the understanding of transferability, that might provide solutions to the open problems in this field.
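To make the "imperceptible modification" concrete, here is a toy sketch of the fast gradient sign method (FGSM), one of the classic attack algorithms such surveys cover, applied to a plain logistic model rather than a DNN; all weights and inputs are hypothetical:

```python
# Minimal FGSM sketch on a toy logistic "classifier": perturb the input
# by epsilon in the sign of the loss gradient to reduce the model's
# confidence in the true class. Illustrative only; real attacks target
# deep networks, and all numbers here are hypothetical.

import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(w, x):
    """P(class 1 | x) under a linear logistic model."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)))

def fgsm(w, x, y, eps):
    """x_adv = x + eps * sign(dL/dx), where for cross-entropy loss
    with label y the input gradient is dL/dx_i = (p - y) * w_i."""
    p = predict(w, x)
    return [xi + eps * math.copysign(1.0, (p - y) * wi)
            for xi, wi in zip(x, w)]

w = [1.5, -2.0, 0.5]
x = [1.0, -1.0, 2.0]             # confidently classified as class 1
x_adv = fgsm(w, x, y=1, eps=0.3)
print(predict(w, x))             # high confidence (~0.99)
print(predict(w, x_adv))         # lower confidence after the attack
```

Even this linear toy shows the mechanism: a small, bounded per-feature perturbation aligned with the loss gradient systematically degrades the prediction, and on high-dimensional image inputs the same budget becomes visually imperceptible.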
Funding: Strategic Priority Research Program of the Chinese Academy of Sciences, No. XDB0740000; National Key Research and Development Program of China, No. 2022YFB3904200 and No. 2022YFF0711601; Key Project of Innovation LREIS, No. PI009; National Natural Science Foundation of China, No. 42471503.
Abstract: Deep-time Earth research plays a pivotal role in deciphering the rates, patterns, and mechanisms of Earth's evolutionary processes throughout geological history, providing essential scientific foundations for climate prediction, natural resource exploration, and sustainable planetary stewardship. To advance deep-time Earth research in the era of big data and artificial intelligence, the International Union of Geological Sciences initiated the Deep-time Digital Earth International Big Science Program (DDE) in 2019. At the core of this ambitious program lies the development of geoscience knowledge graphs, which serve as a transformative knowledge infrastructure enabling the integration, sharing, mining, and analysis of heterogeneous geoscience big data. The DDE knowledge graph initiative has made significant strides in three critical dimensions: (1) establishing a unified knowledge structure across geoscience disciplines that ensures consistent representation of geological entities and their interrelationships through standardized ontologies and semantic frameworks; (2) developing a robust and scalable software infrastructure capable of supporting both expert-driven and machine-assisted knowledge engineering for large-scale graph construction and management; and (3) implementing a comprehensive three-tiered architecture encompassing basic, discipline-specific, and application-oriented knowledge graphs, spanning approximately 20 geoscience disciplines. Through its open knowledge framework and international collaborative network, this initiative has fostered multinational research collaborations, establishing a robust foundation for next-generation geoscience research while propelling the discipline toward FAIR (Findable, Accessible, Interoperable, Reusable) data practices in deep-time Earth systems research.
Funding: Funded by the Jiaying University Research Start-Up Fund (grant number 323E0431).
Abstract: Accurate fingertip detection is critical for translating hand gestures into actionable commands in vision-based human-computer interaction (HCI) systems. However, challenges such as complex backgrounds, dynamic hand postures, and real-time processing constraints hinder reliable detection. This paper introduces a robust framework integrating three key innovations: (1) an adaptive Gaussian mixture model (GMM) enhanced with neighborhood pixel connectivity for precise motion extraction; (2) a weighted YCbCr color-space shadow removal algorithm to eliminate false positives; and (3) a centroid distance method refined with circularity constraints for accurate fingertip localization. Extensive experiments demonstrate a recognition accuracy of 97.26% across diverse scenarios, including varying illumination, occlusions, and hand rotations. The algorithm processes each frame in 23.43 ms on average, satisfying real-time requirements. Comparative evaluations against state-of-the-art methods reveal significant improvements in precision (8.3%), recall (6.1%), and F-measure (7.8%). This work advances HCI applications such as virtual keyboards, gesture-controlled interfaces, and augmented reality systems.
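The circularity constraint mentioned in item (3) is commonly implemented with the isoperimetric measure 4&#960;A/P&#178;, which is 1.0 for a perfect circle and smaller for elongated shapes; the sketch below illustrates that filter with a hypothetical threshold, not the paper's exact values:

```python
# Sketch of a circularity filter for fingertip candidates: a contour is
# kept only if it is sufficiently circular. Circularity is measured as
# 4*pi*Area / Perimeter**2 (1.0 for a perfect circle, smaller for thin
# or elongated shapes). The 0.7 threshold is a hypothetical value.

import math

def circularity(area, perimeter):
    return 4.0 * math.pi * area / (perimeter ** 2)

def is_fingertip_candidate(area, perimeter, threshold=0.7):
    return circularity(area, perimeter) >= threshold

r = 10.0
circle = (math.pi * r * r, 2.0 * math.pi * r)  # ideal circular contour
sliver = (50.0, 120.0)                          # long, thin contour
print(circularity(*circle))             # 1.0
print(is_fingertip_candidate(*circle))  # True
print(is_fingertip_candidate(*sliver))  # False
```

Combined with the centroid-distance peaks, such a filter discards elongated false candidates (e.g., shadow edges) while keeping the roughly circular fingertip blobs.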
Funding: Education and Teaching Research Project of Beijing University of Technology (ER2024KCB08).
Abstract: With the rapid development of artificial intelligence technology, AIGC (Artificial Intelligence-Generated Content) has triggered profound changes in high-level language programming courses. This paper explores the application principles, advantages, and limitations of AIGC in intelligent code generation, analyzes the new mode of human-computer collaboration in high-level language programming courses driven by AIGC, discusses the impact of human-computer collaboration on programming efficiency and code quality through practical case studies, and looks forward to future development trends. This research aims to provide theoretical and practical guidance for high-level language programming courses and to promote their innovative development under the human-computer collaboration paradigm.