Funding: The author would like to thank the Deanship of Scientific Research at Majmaah University for supporting this work under Project Number R-2022-85.
Abstract: The paper addresses the challenge of transmitting a large number of files stored in a data center (DC), encrypting them by compilers, and sending them through a network in an acceptable time. Given the large number of files, a single compiler may not be sufficient to encrypt the data in an acceptable time. In this paper, we consider the problem with several compilers, and the objective is to find an algorithm that produces an efficient schedule for assigning the given files to the compilers. The main objective of the work is to minimize the gap in the total size of assigned files between compilers. This minimization ensures a fair distribution of files across compilers. The problem is considered to be very hard. This paper presents two research axes. The first axis concerns architecture: we propose a novel pre-compiler architecture in this context. The second axis is algorithmic development: we develop six algorithms to solve the problem, based on the dispatching-rules method, a decomposition method, and an iterative approach. These algorithms give approximate solutions to the studied problem. An experimental study is conducted to show the performance of the algorithms, and several indicators are used to measure it. In addition, five classes of instances, 2350 in total, are proposed to test the algorithms. A comparison between the proposed algorithms is presented and discussed in tables to show the performance of each one. The results showed that the best algorithm is the Iterative-mixed Smallest-Longest Heuristic (ISL), with a percentage equal to 97.7% and an average running time of 0.148 s. No other algorithm exceeded 22%. The best algorithm excluding ISL is the Iterative-mixed Longest-Smallest Heuristic (ILS), with a percentage equal to 21.4% and an average running time of 0.150 s.
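The gap objective is essentially a load-balancing one. As an illustration of the dispatching-rules idea (a minimal sketch, not the paper's ISL or ILS heuristics), the following fragment assigns files longest-first to the currently least-loaded compiler; the function name and example sizes are hypothetical.

```python
# Minimal longest-first dispatching rule: sort files by size and give each one
# to the compiler with the smallest total load, which tends to keep the gap
# between compiler loads small. Illustrative only; not the paper's algorithms.
import heapq

def dispatch_longest_first(file_sizes: list[int], n_compilers: int) -> list[list[int]]:
    """Assign each file (by size) to the currently least-loaded compiler."""
    loads = [(0, i) for i in range(n_compilers)]  # min-heap of (load, compiler)
    heapq.heapify(loads)
    assignment = [[] for _ in range(n_compilers)]
    for size in sorted(file_sizes, reverse=True):  # longest files first
        load, idx = heapq.heappop(loads)
        assignment[idx].append(size)
        heapq.heappush(loads, (load + size, idx))
    return assignment

files = [70, 50, 45, 30, 20, 10]
plan = dispatch_longest_first(files, n_compilers=3)
totals = [sum(c) for c in plan]
print(plan, "gap =", max(totals) - min(totals))
```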
Abstract: Should the article be accepted and published by Agricultural Science & Technology, the author hereby grants exclusively to the editorial department of Agricultural Science & Technology the digital reproduction, distribution, compilation and information network transmission rights.
Abstract: In compliance with the "Copyright Law of the People's Republic of China", it is agreed by the author(s) of the said article as follows upon signing of this statement: The said article shall be published in Journal of Translational Neuroscience. The author(s) shall grant the following worldwide exclusive rights carried by the said article in different languages to Journal of Translational Neuroscience free of charge: reproduction, distribution, electronic dissemination, translation and compilation. The author(s) authorize(s) Journal of Translational Neuroscience to register the said article (including all the intermedia) with the proper copyright authorities.
Funding: funded by the Natural Science Foundation of Heilongjiang Province (Grant No. LH2022F035), the Cultivation Programme for Young Innovative Talents in Ordinary Higher Education Institutions of Heilongjiang Province (Grant No. UNPYSCT-2020212), and the Cultivation Programme for Young Innovative Talents in Scientific Research of Harbin University of Commerce (Grant No. 2023-KYYWF-0983).
Abstract: Traditional quantum circuit scheduling approaches underutilize the inherent parallelism of quantum computation in the Noisy Intermediate-Scale Quantum (NISQ) era, overlooking that inter-layer operations can be further parallelized. To address this, two quantum circuit scheduling optimization approaches are designed and integrated into the quantum circuit compilation process. First, we introduce the Layered Topology Scheduling Approach (LTSA), which employs a greedy algorithm and leverages the principles of topological sorting in graph theory. LTSA allocates quantum gates to a layered structure, maximizing the concurrent execution of quantum gate operations. Second, the Layerwise Conflict Resolution Approach (LCRA) is proposed. LCRA focuses on utilizing directly executable quantum gates within layers. Through the insertion of SWAP gates and conflict-resolution checks, it minimizes conflicts and enhances parallelism, thereby optimizing overall computational efficiency. Experimental findings indicate that LTSA and LCRA individually achieve a noteworthy reduction of 51.1% and 53.2%, respectively, in the number of inserted SWAP gates. Additionally, they reduce hardware gate overhead by 14.7% and 15%, respectively. Considering the intricate nature of quantum circuits and the temporal dependencies among layers, the combination of both approaches yields a remarkable 51.6% reduction in inserted SWAP gates and a 14.8% decrease in hardware gate overhead. These results underscore the efficacy of the combined LTSA and LCRA in optimizing quantum circuit compilation.
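As a minimal sketch of the topological-layering idea behind LTSA (not the paper's implementation), the following fragment performs a Kahn-style pass over a gate-dependency DAG and groups into each layer the gates whose predecessors have already been scheduled; the gate indices and dependencies are illustrative.

```python
# Greedy topological layering: gates in the same layer have no unmet
# dependencies on each other and can execute in parallel.
from collections import defaultdict

def layer_gates(n_gates: int, deps: list[tuple[int, int]]) -> list[list[int]]:
    """deps contains (a, b) pairs meaning gate a must run before gate b."""
    succs = defaultdict(list)
    indeg = [0] * n_gates
    for a, b in deps:
        succs[a].append(b)
        indeg[b] += 1
    frontier = [g for g in range(n_gates) if indeg[g] == 0]
    layers = []
    while frontier:
        layers.append(frontier)
        nxt = []
        for g in frontier:
            for s in succs[g]:
                indeg[s] -= 1
                if indeg[s] == 0:
                    nxt.append(s)
        frontier = nxt
    return layers

# Gates 0 and 1 are independent; gate 2 waits on both; gate 3 waits on 2.
print(layer_gates(4, [(0, 2), (1, 2), (2, 3)]))  # [[0, 1], [2], [3]]
```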
Abstract: Translation validation was invented in the 1990s by Pnueli et al. as a technique to formally verify the correctness of code generators. Rather than certifying the code generator or exhaustively qualifying it, translation validators attempt to verify that program transformations preserve semantics. In this work, we adopt this approach to formally verify that clock semantics and data dependence are preserved during compilation by the Signal compiler. Translation validation is implemented for every compilation phase, from the initial phase to the final phase where the executable code is generated, by proving that the transformation in each phase of the compiler preserves the semantics. We represent the clock semantics and the data dependence of a program and its transformed counterpart as first-order formulas called clock models and as synchronous dependence graphs (SDGs), respectively. We then introduce clock refinement and dependence refinement relations, which express the preservation of clock semantics and of dependence as relations on clock models and on SDGs, respectively. Our validator requires no instrumentation or modification of the compiler, nor any rewriting of the source program.
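As an illustrative sketch of discharging one refinement obligation (not the paper's validator), the following fragment encodes two made-up clock models as Boolean formulas and uses the Z3 SMT solver to check that the transformed model implies the source model: the implication is valid iff its negation is unsatisfiable.

```python
# Clock models as first-order (here purely Boolean) formulas; a phase preserves
# clock semantics when the transformed model implies the source model.
# The two formulas below are invented for illustration only.
from z3 import Bools, And, Implies, Not, Solver, unsat

cx, cy, cz = Bools("cx cy cz")  # presence of clocks x, y, z at an instant
phi_source = And(Implies(cz, cx), Implies(cz, cy))  # z ticks only when x and y do
phi_transformed = cz == And(cx, cy)                 # a more constrained model

s = Solver()
s.add(Not(Implies(phi_transformed, phi_source)))    # negate the implication
print("refines" if s.check() == unsat else "does not refine")
```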
Abstract: Edge devices, due to their limited computational and storage resources, often require the use of compilers for program optimization. Ensuring the security and reliability of these compilers is therefore of paramount importance in the emerging field of edge AI. One widely used testing method for this purpose is fuzz testing, which detects bugs by feeding random test cases to the target program. However, this process consumes significant time and resources. To improve the efficiency of compiler fuzz testing, it is common practice to use test case prioritization techniques. Some researchers use machine learning to predict the code coverage of test cases, aiming to maximize the test capability for the target compiler by increasing the overall predicted coverage of the test cases. Nevertheless, these methods can only forecast the code coverage of the compiler at a specific optimization level, potentially missing many optimization-related bugs. In this paper, we introduce C-CORE (short for Clustering by Code Representation), the first framework to prioritize test cases according to their code representations, which are derived directly from the source code. This approach avoids being limited to specific compiler states and extends to a broader range of compiler bugs. Specifically, we first train a scaled pre-trained programming language model to capture as many common features as possible from the test cases generated by a fuzzer. Using this pre-trained model, we then train two downstream models: one for predicting the likelihood of triggering a bug and another for identifying code representations associated with bugs. Subsequently, we cluster the test cases according to their code representations and select the highest-scoring test case from each cluster as a high-quality test case. This reduction in redundant test cases saves time. Comprehensive evaluation results reveal that code representations are better at distinguishing test capabilities, and that C-CORE significantly enhances testing efficiency. Across four datasets, C-CORE increases the average percentage of faults detected (APFD) by 0.16 to 0.31 and reduces test time by over 50% in 46% of cases. Compared to the best results from approaches using predicted code coverage, C-CORE improves the APFD value by 1.1% to 12.3% and achieves an overall time saving of 159.1%.
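As a hypothetical sketch of the selection step (assuming the code representations and bug-likelihood scores have already been produced by the trained downstream models), the following fragment clusters the representations with k-means and keeps the highest-scoring test case per cluster; the data is synthetic and the function name is illustrative.

```python
# Cluster test cases by their code representations, then pick the test case
# with the highest bug-likelihood score from each cluster, discarding the
# redundant rest. Model training/inference is out of scope here.
import numpy as np
from sklearn.cluster import KMeans

def prioritize(reprs: np.ndarray, scores: np.ndarray, n_clusters: int) -> list[int]:
    """Return one test-case index per cluster: the one with the highest score."""
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(reprs)
    chosen = []
    for c in range(n_clusters):
        members = np.flatnonzero(labels == c)
        chosen.append(int(members[np.argmax(scores[members])]))
    return chosen

rng = np.random.default_rng(0)
reprs = rng.normal(size=(100, 16))  # stand-in code-representation embeddings
scores = rng.random(100)            # stand-in bug-likelihood scores
print(prioritize(reprs, scores, n_clusters=5))
```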
Funding: supported by Spaceage Geoconsulting, a research-oriented consulting firm.
Abstract: Episodes of tectonic activity since Archaean time in one of the oldest cratons, the eastern Yilgarn Craton of Western Australia, have left a complex pattern in its architectural settings. Insights into the crustal-scale architecture of the Craton have been gained through geophysical data modelling and imaging using high-resolution aeromagnetic and Bouguer gravity data. Advanced image-processing techniques using pseudocolour composition, hill-shading and the compilation of multiple data layers in hue, saturation and intensity (HSI) space have been applied in the image-based analysis of the potential field data. Geophysical anomaly-enhancement techniques, together with the imaging techniques, are used to delineate several regional as well as local structures. Multiscale analysis of the geophysical data, applying varying upward-continuation levels together with anomaly-enhancement techniques based on spatial derivatives, is used to delineate major shear zones and regional-scale structures. A data-based interpretation of the basement architecture of the study area is given.
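As an illustrative sketch of compiling data layers in HSI-style space (not the paper's workflow), the following fragment drives hue from one synthetic grid and intensity from a second, shaded grid while holding saturation high, using matplotlib's HSV conversion as a stand-in for the HSI composition; both grids are placeholders.

```python
# Compose two data layers into one image: hue encodes anomaly amplitude,
# value/intensity encodes shaded relief, saturation stays near full.
import numpy as np
from matplotlib.colors import hsv_to_rgb
import matplotlib.pyplot as plt

def normalize(grid):
    g = grid - grid.min()
    return g / g.max()

y, x = np.mgrid[0:256, 0:256]
anomaly = np.sin(x / 20.0) + 0.5 * np.cos(y / 35.0)  # stand-in magnetic grid
shading = np.cos((x + y) / 40.0)                     # stand-in hill-shade

hsv = np.dstack([
    0.7 * normalize(anomaly),     # hue from the anomaly amplitude
    np.full(anomaly.shape, 0.9),  # near-full saturation
    normalize(shading),           # intensity from the shaded relief
])
plt.imshow(hsv_to_rgb(hsv))
plt.axis("off")
plt.show()
```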
Funding: supported by a general research programme, "Research on the Compilation System of Primary School Science Textbooks with Marxist Epistemology as a Main Theme", under the 2023 Henan Province Education Sciences Plan launched by the Education Department of Henan Province (grant no. 2023YB0610).
Abstract: Since 2008, the author of this paper has conducted historiographic research on the visual history of science in the West since the mid-twentieth century. The findings show that the cognitive functions of visual scientific representations in the history of science are connected with theories of knowledge development in dialectical materialist epistemology and theories on children's cognitive features at different ages in developmental psychology, as well as the stage-specific curriculum objectives outlined in the Compulsory Education Science Curriculum Standards (2022 Edition). These insights provide essential inspiration and theoretical support for the establishment of the twin-theme logical structure in the Primary School Science Textbooks (Daxiang Edition), with core competencies as the warp and cognitive development as the weft, and for the intentional cultivation of students' cognitive abilities using scientific images across different learning stages and textbooks.
Abstract: VHDL and its supporting environment are an active domain in the field of logic design. This paper introduces the design principles and some key techniques used to solve the problems encountered in implementing a VHDL parser. Following the methods discussed in the paper, a VHDL parser based on the IEEE 1076 standard version of VHDL was implemented and subjected to a series of strict tests. This VHDL parser is the front-end tool of the VHDL high-level synthesis and mixed-level simulation system developed by the Research Center of ASIC at BIT.
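As a toy sketch of the kind of recognition a parser front end performs (the parser described above covers the full IEEE 1076 language; this regex-based fragment handles only a single simplified entity declaration and is purely illustrative):

```python
# Recognize a simplified VHDL entity declaration and extract its port list.
# Real VHDL parsing needs a full grammar; this is illustration only.
import re

ENTITY_RE = re.compile(
    r"entity\s+(?P<name>\w+)\s+is\s+port\s*\((?P<ports>.*?)\)\s*;\s*end",
    re.IGNORECASE | re.DOTALL,
)

def parse_entity(src: str) -> dict:
    m = ENTITY_RE.search(src)
    if not m:
        raise SyntaxError("no entity declaration found")
    ports = []
    for decl in m.group("ports").split(";"):
        names, _, rest = decl.partition(":")
        mode, _, ptype = rest.strip().partition(" ")
        for n in names.split(","):
            ports.append({"name": n.strip(), "mode": mode, "type": ptype.strip()})
    return {"entity": m.group("name"), "ports": ports}

src = """
entity half_adder is
  port (a, b : in bit;
        s, c : out bit);
end half_adder;
"""
print(parse_entity(src))
```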
Abstract: A prototype landscape refers to the impressive scenes that one has experienced in one's living environment before the age of 20. Based on an analysis of the existing literature, the authors compiled a standard scale-type questionnaire by means of a field survey concerning the influence of prototype landscape on one's landscape perception. With a Likert scale as its main part, the questionnaire analyzed the influence of prototype landscape on landscape perception along the perception, attitude, and behavior dimensions. To further improve its soundness, the authors tested other aspects of the questionnaire, including logical validity, construct validity, congeniality reliability, and split-half reliability. The results validated that the questionnaire possesses a good theoretical structure and meets its validity targets, and that it can evaluate the various influences of prototype landscape on one's landscape perception in an effective and reliable way. Therefore, the questionnaire put forward by this study not only enriches studies of prototype landscape in landscape design, but also provides an effective tool for quantitative analysis of the influences of prototype landscape on one's landscape perception.
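As a hypothetical sketch of one of the reliability checks named above (not the authors' analysis), the following fragment computes split-half reliability on synthetic Likert responses: item scores are split into odd and even halves, the half totals are correlated across respondents, and the Spearman-Brown correction is applied.

```python
# Split-half reliability with Spearman-Brown correction on synthetic data.
import numpy as np

def split_half_reliability(responses: np.ndarray) -> float:
    """responses: (n_respondents, n_items) matrix of Likert scores."""
    odd = responses[:, 0::2].sum(axis=1)
    even = responses[:, 1::2].sum(axis=1)
    r = np.corrcoef(odd, even)[0, 1]  # correlation of the two half scores
    return 2 * r / (1 + r)            # Spearman-Brown correction

rng = np.random.default_rng(1)
latent = rng.normal(size=(200, 1))                 # a single latent trait
items = latent + 0.8 * rng.normal(size=(200, 10))  # 10 noisy Likert-style items
print(round(split_half_reliability(items), 3))
```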
Abstract: This paper briefly introduces the systemic structure of the Vocational English series, Basic English, and puts forward its four key compiling principles, namely system, cognition, practicality and interest.