With the rapid development of electronic information engineering, high-speed digital circuits have been applied increasingly widely in various fields. In high-speed digital circuits, signal integrity is prone to interference from various external factors, leading to issues such as signal distortion or degradation of system performance. Against this background, this paper studies optimization strategies for the signal integrity of high-speed digital circuits in electronic information engineering. It analyzes the importance of high-speed digital circuits, elaborates on the challenges they face and the specific manifestations of signal integrity issues, and proposes a series of optimization strategies. The aim is to improve the signal integrity of high-speed digital circuits and to provide theoretical support and practical guidance for the development of related fields.
L2 reading is not only an important channel for obtaining information and knowledge, but also the main way people learn a foreign language. Reading information processing can be divided into controlled processing and automatic processing: controlled information processing is a conscious, resource-intensive mode, while automatic information processing is an unconscious, automatic mode. This study investigates the characteristics and interactivity of controlled and automatic information processing in L2 reading, and explores the roles of controlled and automatic processing strategies in improving L2 reading ability. The findings are as follows: (a) controlled and automatic information processing interact in L2 reading; and (b) the use of controlled and automatic information processing strategies benefits the reading ability of L2 learners. The study has theoretical and practical value for improving the efficiency of L2 reading teaching and learning.
Aim: To develop an information processing system with real-time processing capability and an artistic user interface for the optoelectronic antagonism general measuring system. Methods: The A/D board and the multifunctional board communicating with every instrument were designed, and data collection and processing were realized by selecting an appropriate software platform. Results: Simulation results show that the information processing system operates correctly and dependably; the measuring rules, interactive interface, and data handling method were all accepted by the user. Conclusion: The design approach based on the mixed platform takes advantage of the two operating systems, and the desired performance is achieved both in real-time processing and in the friendly, artistic user interface.
The hot deformation behavior and microstructure evolution of industrial-grade American Iron and Steel Institute (AISI) M35 high-speed steel produced by electroslag remelting were investigated at different deformation parameters. The results indicated that grain coarsening and decomposition of M2C carbides appeared in the steel at 1150 ℃ for 5 min, and that the network carbides were broken and deformed radially after hot deformation. A constitutive equation was determined from flow stress-strain curves corrected for the effects of friction and temperature, and a strain-compensated constitutive model was established. The dynamic recrystallization (DRX) characteristic values were calculated based on the Cingara-McQueen model, and the grain distribution under different conditions was observed and analyzed. Significantly, the mechanisms by which carbides act on DRX were illuminated. A functional relation between average grain size and the Z parameter showed that grain size increased with increasing temperature and decreasing strain rate. Optimal hot deformation parameters were determined as 980-1005 ℃ at 0.01-0.015 s^(−1) and 1095-1110 ℃ at 0.01-0.037 s^(−1) for strains ranging from 0.05 to 0.8. Appropriately increasing the strain rate during deformation was suggested in order to obtain fine and uniformly distributed carbides. In addition, an industrial-grade forging deformation verified the practicability of the above parameters.
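For orientation, a strain-compensated Arrhenius-type constitutive model and the Zener-Hollomon (Z) parameter referred to above usually take the standard forms below; this is a generic sketch, and the material constants fitted in the paper are not reproduced here.

```latex
% Generic strain-compensated Arrhenius-type constitutive law (standard form;
% A, \alpha, n, Q are material functions of strain, R the gas constant, T the temperature):
\dot{\varepsilon} = A(\varepsilon)\,\bigl[\sinh\bigl(\alpha(\varepsilon)\,\sigma\bigr)\bigr]^{n(\varepsilon)}
                    \exp\!\left(-\frac{Q(\varepsilon)}{RT}\right)

% Zener-Hollomon parameter; the reported grain-size trend (coarser grains at higher
% temperature and lower strain rate) is consistent with the commonly used power law
% d_{DRX} = B\,Z^{-m} (B, m material constants):
Z = \dot{\varepsilon}\,\exp\!\left(\frac{Q}{RT}\right)
```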
During drilling operations, the low resolution of seismic data often limits the accurate characterization of small-scale geological bodies near the borehole and ahead of the drill bit. This study investigates high-resolution seismic data processing technologies and methods tailored to drilling scenarios. The high-resolution processing of seismic data is divided into three stages: pre-drilling processing, post-drilling correction, and while-drilling updating. By integrating seismic data from different stages, spatial ranges, and frequencies, together with information from drilled wells and while-drilling data, and by applying artificial intelligence modeling techniques, a progressive high-resolution seismic data processing technology based on multi-source information fusion is developed, which performs simple and efficient seismic information updates during drilling. Case studies show that, with the gradual integration of multi-source information, the resolution and accuracy of seismic data are significantly improved, and thin-bed weak reflections are imaged more clearly. The seismic information updated while drilling demonstrates high value in predicting geological bodies ahead of the drill bit. Validation with logging, mud logging, and drilling engineering data ensures the fidelity of the high-resolution processing results. This provides clearer and more accurate stratigraphic information for drilling operations, enhancing both drilling safety and efficiency.
Purpose - The purpose of this paper is to eliminate the fluctuations in train arrival and departure times caused by skewed distributions of interval operation times. These fluctuations arise from random origin and process factors during interval operations and can accumulate over multiple intervals. The aim is to enhance the robustness of arrival and departure track utilization schemes at high-speed rail stations. Design/methodology/approach - To achieve this objective, the paper simulates actual train operations, incorporating the fluctuations in interval operation times into the utilization of arrival and departure tracks at the station. The Monte Carlo simulation method is adopted to solve this problem. This approach transforms a nonlinear model, which includes constraints from probability distribution functions and is difficult to solve directly, into a linear programming model that is easier to handle. The method then linearly weights two objectives to optimize the solution. Findings - Through the application of Monte Carlo simulation, the study successfully converts the complex nonlinear model with probability distribution function constraints into a manageable linear programming model. By continuously adjusting the weighting coefficients of the linear objectives, the method is able to optimize the Pareto solution. Notably, this approach does not require extensive scene data to obtain a satisfactory Pareto solution set. Originality/value - The paper contributes to the field by introducing a novel method for optimizing arrival and departure track utilization at high-speed rail stations in the presence of fluctuations in interval operation times. The use of Monte Carlo simulation to transform the problem into a tractable linear programming model represents a significant advancement. Furthermore, the method's ability to produce satisfactory Pareto solutions without relying on extensive data sets adds to its practical value and applicability in real-world scenarios.
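As a rough illustration of the Monte Carlo idea described above, the sketch below samples skewed interval running-time deviations and linearly weights two objectives; the log-normal distribution, the two objectives, and the buffer-time decision variable are hypothetical placeholders, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample skewed (log-normal) interval running-time deviations, in minutes,
# and accumulate them over successive intervals (illustrative placeholders).
n_scenarios, n_intervals = 10_000, 5
deviations = rng.lognormal(mean=0.0, sigma=0.5, size=(n_scenarios, n_intervals))
total_dev = deviations.sum(axis=1)          # total deviation arriving at the station

def weighted_objective(buffer_minutes, w):
    """Linearly weight two illustrative objectives: expected residual delay
    after the buffer is consumed vs. track capacity lost to the buffer."""
    residual_delay = np.maximum(total_dev - buffer_minutes, 0.0).mean()
    capacity_loss = buffer_minutes          # proxy: longer buffers occupy the track longer
    return w * residual_delay + (1.0 - w) * capacity_loss

# Sweep the weighting coefficient to trace out candidate Pareto points.
for w in (0.2, 0.5, 0.8):
    best = min(range(21), key=lambda b: weighted_objective(b, w))
    print(f"weight {w}: best buffer = {best} min")
```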
In a measurement system, new representation methods are necessary to maintain uncertainty and to provide a more powerful ability for reasoning and for transformation between the numerical system and the symbolic system. A grey measurement system is discussed from the viewpoint of intelligent sensors and incomplete information processing, in comparison with numerical and symbolized measurement systems. Methods of grey representation and grey information processing are proposed for data collection and reasoning. As a case study, multi-ultrasonic sensor systems are used to verify the effectiveness of the proposed methods.
Data processing of small samples is an important and valuable research problem in electronic equipment testing. Because it is difficult and complex to determine the probability distribution of small samples, traditional probability theory can hardly be used to process the samples and assess the degree of uncertainty. Using grey relational theory and norm theory, this article proposes the grey distance information approach, which is based on the grey distance information quantity of a sample and the average grey distance information quantity of the samples. The definitions of the grey distance information quantity of a sample and the average grey distance information quantity of the samples, together with their characteristics and algorithms, are introduced. The related problems, including the algorithm of the estimated value, the standard deviation, and the acceptance and rejection criteria for the samples and estimated results, are also addressed. Moreover, the information whitening ratio is introduced to select the weighting algorithm and to compare different samples. Several examples are given to demonstrate the application of the proposed approach. The examples show that the approach, which places no demand on the probability distribution of small samples, is feasible and effective.
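The abstract does not reproduce the paper's definitions of the grey distance information quantity, so as a generic orientation only, the sketch below computes the standard Deng grey relational coefficient between a small sample and its mean; the measurement values are made up and this is not the authors' formulation.

```python
import numpy as np

def grey_relational_coefficients(reference, sample, rho=0.5):
    """Standard Deng grey relational coefficient between a reference sequence
    and a comparison sequence (rho is the distinguishing coefficient)."""
    delta = np.abs(np.asarray(reference, float) - np.asarray(sample, float))
    d_min, d_max = delta.min(), delta.max()
    return (d_min + rho * d_max) / (delta + rho * d_max)

# Made-up small-sample measurements compared against their own mean value.
samples = np.array([10.2, 10.5, 9.8, 10.1, 10.4])
reference = np.full_like(samples, samples.mean())
coeffs = grey_relational_coefficients(reference, samples)
print("grey relational coefficients:", np.round(coeffs, 3))
print("grey relational grade:", round(coeffs.mean(), 3))
```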
The nitrogen-vacancy (NV) center in diamond is one of the most promising candidates for implementing room-temperature quantum computing. In this review, we briefly discuss the working principles and recent experimental progress of this spin qubit. These results focus on understanding and prolonging center spin coherence, steering and probing spin states with dedicated quantum control techniques, and exploiting the quantum nature of these multi-spin systems, such as superposition and entanglement, to demonstrate the superiority of quantum information processing. These techniques also stimulate the fast development of NV-based quantum sensing, an interdisciplinary field with great potential applications.
Based on the cognitive radar concept and the basic connotation of cognitive sky-wave over-the-horizon radar (SWOTHR), the system structure and information processing mechanism of cognitive SWOTHR are studied. A hybrid network system architecture, i.e., a distributed configuration combined with centralized cognition, and its software/hardware framework with sense-detection integration are proposed. An information processing frame based on the lens principle and an information processing flow with receive-transmit joint adaptation are designed, which build and parse the working law of cognition and its self-feedback adjustment through the lens focus model and a five-stage information processing sequence. System simulation, performance analysis, and comparison are then provided, which initially prove the rationality and advantages of the proposed ideas. Finally, four important development directions of future SWOTHR toward a "high-frequency intelligent information processing system" are discussed: scene information fusion, dynamically reconfigurable systems, hierarchical and modular design, and sustainable development. It is concluded that cognitive SWOTHR can improve radar performance.
Radar is an electronic device that uses radio waves to determine the range, angle, or velocity of objects. The real-time signal and information processor is an important module for real-time positioning, imaging, detection and recognition of targets. With the development of ultra-wideband technology, synthetic aperture technology, and signal and information processing technology, the radar coverage, detection accuracy and resolution have been greatly improved, especially in terms of one-dimensional (1D) high-resolution radar detection, tracking, recognition, and two-dimensional (2D) synthetic aperture radar imaging technology. Meanwhile, for radar detection and remote sensing applications with high resolution and wide swath, the amount of data has greatly increased. Therefore, the radar is required to have low-latency, real-time processing capability under the constraints of size, weight and power consumption. This paper systematically introduces the new technology of high-resolution radar and real-time signal and information processing. The key problems and solutions are discussed, including the detection and tracking of 1D high-resolution radar, accurate signal modeling and wide-swath imaging for geosynchronous orbit synthetic aperture radar, and real-time signal and information processing architectures and efficient algorithms. Finally, the latest research progress and representative results are presented, and the development trends are prospected.
We investigate a planar ion chip design with a two-dimensional array of linear ion traps for scalable quantum information processing. Qubits are formed from the internal electronic states of trapped ^40Ca^+ ions. The segmented electrodes reside in a single plane on a substrate, with a grounded metal plate placed separately; a combination of appropriate RF and DC potentials is applied to them for stable ion confinement. Every two adjacent electrodes can generate a linear ion trap in and between the electrodes above the chip, at a distance that depends on the geometrical scale and other considerations. The potential distributions are calculated qualitatively using a static electric field. This architecture provides a conceptually simple avenue toward microfabrication and large-scale quantum computation based on arrays of trapped ions.
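For background, the confinement produced by such RF electrode geometries is commonly assessed with the standard time-averaged pseudopotential, given below in its textbook form; this is context only and is not taken from the paper's own field calculation.

```latex
% Standard time-averaged (ponderomotive) pseudopotential of an RF trap for an ion
% of charge q and mass m, driven at angular frequency \Omega_{rf};
% E_{rf}(r) is the local RF electric-field amplitude.
\Phi_{\mathrm{ps}}(\mathbf{r}) = \frac{q^{2}\,\lvert\mathbf{E}_{\mathrm{rf}}(\mathbf{r})\rvert^{2}}{4\,m\,\Omega_{\mathrm{rf}}^{2}}
```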
In order to study the problem of intelligent information processing in new types of imaging fuzes, the method of extracting invariance features of target images is adopted, and a radial basis function neural network is used to recognize targets. Owing to its parallel processing ability, robustness, and generalization, the method can recognize the conditions of missile-target encounters and meet the real-time recognition requirements of the imaging fuze. It is shown that target recognition and burst point control based on an artificial neural network are feasible.
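A minimal sketch of a Gaussian radial basis function network of the kind mentioned above, trained by least squares on made-up toy features; it is a generic illustration, not the authors' network or data.

```python
import numpy as np

class RBFNet:
    """Gaussian RBF network: fixed centres, output weights fitted by least squares."""

    def __init__(self, centers, sigma=1.0):
        self.centers = np.asarray(centers, float)
        self.sigma = sigma

    def _design(self, X):
        # Pairwise squared distances between samples and centres.
        d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2.0 * self.sigma ** 2))

    def fit(self, X, y):
        phi = self._design(np.asarray(X, float))
        self.w, *_ = np.linalg.lstsq(phi, np.asarray(y, float), rcond=None)
        return self

    def predict(self, X):
        return self._design(np.asarray(X, float)) @ self.w

# Toy invariance features for two encounter classes (hypothetical values).
X = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]])
y = np.array([0.0, 0.0, 1.0, 1.0])
net = RBFNet(centers=X, sigma=0.5).fit(X, y)
print((net.predict(X) > 0.5).astype(int))   # -> [0 0 1 1]
```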
The delay-causing text data contain valuable information, such as the specific reason for the delay and the location and time of the disturbance, which can efficiently support the prediction of train delays and improve the efficiency of train control. Based on the train operation data and delay-causing data of the Wuhan-Guangzhou high-speed railway, relevant algorithms from the natural language processing field are used to process the delay-causing text data. The train operating-environment information and the delay-causing text information are then integrated to develop a cause-based train delay propagation prediction model. The Word2vec model is first used to vectorize the delay-causing text description after word segmentation. The mean model or the term frequency-inverse document frequency-weighted model is then used to generate the delay-causing sentence vector from the original word vectors. Afterward, the train operating-environment features and the delay-causing sentence vector are input into the extreme gradient boosting (XGBoost) regression algorithm to develop a delay propagation prediction model. In this work, 4 text feature processing methods and 8 regression algorithms are considered. The results demonstrate that the XGBoost regression algorithm has the highest prediction accuracy using the test features processed by the continuous bag of words and the mean models. Compared with the prediction model that only considers the train operating-environment features, the prediction accuracy with multiple regression algorithms is significantly improved after integrating the delay-causing features.
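A compressed sketch of the mean-pooling-plus-XGBoost pipeline described above; the word vectors, environment features, and hyperparameters are invented placeholders (in the paper the vectors come from a trained Word2vec model), so this shows the data flow rather than the authors' implementation.

```python
import numpy as np
import xgboost as xgb

# Hypothetical word vectors (in the paper these would come from a trained Word2vec model).
word_vec = {"signal": np.array([0.2, 0.1]), "failure": np.array([0.7, 0.3]),
            "storm":  np.array([0.1, 0.9]), "switch":  np.array([0.4, 0.2])}

def sentence_vector(tokens, dim=2):
    """Mean model: average the word vectors of a segmented delay-cause description."""
    vecs = [word_vec[t] for t in tokens if t in word_vec]
    return np.mean(vecs, axis=0) if vecs else np.zeros(dim)

# Toy samples: segmented cause text, operating-environment features, delay propagation (min).
causes = [["signal", "failure"], ["storm"], ["switch", "failure"], ["storm", "signal"]]
env = np.array([[2.0, 1.0], [5.0, 0.0], [3.0, 1.0], [4.0, 0.0]])   # e.g. headway, peak-hour flag
y = np.array([6.0, 12.0, 8.0, 10.0])

# Concatenate environment features with the delay-causing sentence vectors.
X = np.hstack([env, np.vstack([sentence_vector(c) for c in causes])])
model = xgb.XGBRegressor(n_estimators=50, max_depth=3, learning_rate=0.2)
model.fit(X, y)
print(np.round(model.predict(X), 1))
```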
It is acknowledged that a lack of interdisciplinary communication amongst designers can result in poor coordination performance in building design. Viewing communication as an information processing activity, this paper aims to explore the relationship between interdisciplinary information processing (IP) and design coordination performance. Both the amount and the quality of information processing are considered. 698 project-based samples were collected by questionnaire survey from design institutes in mainland China. Statistical data analysis shows that the relationship between information processing amount and design coordination performance follows a nonlinear exponential expression, performance = 3.691(1 − 0.235^(IP amount)), rather than an inverted U curve. It implies that the design period is too short to allow information overload, and indicates that the main problem in interdisciplinary communication in design institutes in China is insufficient information. In addition, it is found that the correlation between IP quality and coordination process performance is much stronger than that between IP amount and coordination process performance. For practitioners, this reminds design managers to pay more attention to information processing quality rather than amount.
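To make the saturating shape of that fitted expression concrete, the sketch below fits a curve of the same form, performance = a(1 − b^x), to fabricated survey-like data with scipy; the coefficients 3.691 and 0.235 reported in the paper come from the authors' survey, not from this code.

```python
import numpy as np
from scipy.optimize import curve_fit

def saturating(x, a, b):
    """Exponential-saturation form: performance = a * (1 - b ** x)."""
    return a * (1.0 - b ** x)

# Made-up (IP amount, coordination performance) pairs roughly following such a curve.
ip_amount = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 6.0])
performance = np.array([1.9, 2.7, 3.3, 3.5, 3.6, 3.65])

(a, b), _ = curve_fit(saturating, ip_amount, performance, p0=(3.5, 0.3))
print(f"fitted: performance = {a:.3f} * (1 - {b:.3f} ** IP_amount)")
print("performance saturates near", round(a, 2), "as IP amount grows")
```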
Oil monitoring and vibration monitoring are two principal techniques for mechanical fault diagnosis and condition monitoring at present. They monitor the mechanical condition through different approaches; nevertheless, oil and vibration monitoring are related in information collection and processing. In the same mechanical system, the information obtained from the same information source can be described with the same expression form, constituted of a structure matrix, a relative matrix, and a system matrix. For oil and vibration monitoring, the information sources are correlated, while the collection is independent and complementary. Oil monitoring and vibration monitoring also share the same processing method when they yield their information. This research provides a reasonable and useful approach to combining oil monitoring and vibration monitoring.
Quantum information processing is an active cross-disciplinary field drawing upon theoretical and experimental physics, computer science, engineering, mathematics, and material science. Its scope ranges from fundamental issues in quantum physics to prospective commercial exploitation by the computing and communications industries.
Background: This work investigates the histology of the hippocampal formation as a structural model of information processing. The study addressed the question of whether the pattern of cellular type distribution within hippocampal fields could be used to support information processing in the hippocampus. Method: Pyramidal-shaped neurons presenting a clearly outlined cytoplasm and nucleus were measured systematically on brain slides, using a light microscope connected to a microcomputer equipped with scanner software for measuring particles. Morphological types of cells were identified according to class sizes, and their distribution was determined across hippocampal fields. Results: A battery of statistical tests (Sturges' classification, class-size distribution around the overall mean, Bartlett's sphericity test, principal components analysis (PCA) followed by correlation matrix analysis, and ANOVA) allowed two cellular groups to be identified in the hippocampus: large and small pyramidal-shaped cells. Conclusion: The results show that sensory information processing in the hippocampus could be built on two classes of pyramidal neurons that differ anatomically and probably have different physiological functions. The study suggests combination ensembles clustering large and small pyramidal cells at different rates as fundamental signaling units of the hippocampus.
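As a small illustration of the class-size workflow named above (Sturges' rule for the number of size classes, followed by PCA), run on fabricated soma measurements rather than the study's data:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

# Fabricated morphometry for two hypothetical populations of pyramidal-shaped
# neurons: 80 "small" and 40 "large" cells (soma area in um^2, perimeter in um).
small = np.column_stack([rng.normal(120, 15, 80), rng.normal(45, 5, 80)])
large = np.column_stack([rng.normal(260, 30, 40), rng.normal(70, 6, 40)])
cells = np.vstack([small, large])

# Sturges' rule for the number of size classes: k = 1 + log2(n).
n = cells.shape[0]
k = int(np.ceil(1 + np.log2(n)))
counts, _ = np.histogram(cells[:, 0], bins=k)
print(f"Sturges classes for soma area: k = {k}, counts = {counts}")

# PCA on the standardized features: the first principal component orders cells
# by overall size, so splitting at zero separates the two fabricated populations.
z = (cells - cells.mean(axis=0)) / cells.std(axis=0)
pc1 = PCA(n_components=1).fit_transform(z).ravel()
groups = pc1 > 0
print("group sizes along PC1:", int(groups.sum()), "and", int((~groups).sum()))
```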