Integrated circuit chips are produced on silicon wafers. Robotic cluster tools are widely used since they provide a reconfigurable and efficient environment for most wafer fabrication processes. Recent advances in new semiconductor materials bring about new functionality for integrated circuits. After a wafer is processed in a processing chamber, it should be removed as fast as possible to guarantee high-quality integrated circuits. Meanwhile, maximization of the throughput of robotic cluster tools is desired. This work aims to perform post-processing time-aware scheduling for such tools subject to wafer residency-time constraints. To do so, closed-form algorithms are derived to compute robot waiting time accurately, based on an analysis of the particular events of robot waiting in single-arm cluster tools. Examples are given to show the application and effectiveness of the proposed algorithms.
Weather forecasts from numerical weather prediction models play a central role in solar energy forecasting, where a cascade of physics-based models is used in a model chain approach to convert forecasts of solar irradiance to solar power production. Ensemble simulations from such weather models aim to quantify uncertainty in the future development of the weather, and can be used to propagate this uncertainty through the model chain to generate probabilistic solar energy predictions. However, ensemble prediction systems are known to exhibit systematic errors, and thus require post-processing to obtain accurate and reliable probabilistic forecasts. The overarching aim of our study is to systematically evaluate different strategies to apply post-processing in model chain approaches with a specific focus on solar energy: not applying any post-processing at all; post-processing only the irradiance predictions before the conversion; post-processing only the solar power predictions obtained from the model chain; or applying post-processing in both steps. In a case study based on a benchmark dataset for the Jacumba solar plant in the U.S., we develop statistical and machine learning methods for post-processing ensemble predictions of global horizontal irradiance (GHI) and solar power generation. Further, we propose a neural-network-based model for direct solar power forecasting that bypasses the model chain. Our results indicate that post-processing substantially improves the solar power generation forecasts, in particular when post-processing is applied to the power predictions. The machine learning methods for post-processing slightly outperform the statistical methods, and the direct forecasting approach performs comparably to the post-processing strategies.
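The post-processing-strategy comparison described above can be made concrete with a toy sketch: a member-by-member affine correction of the ensemble (a crude stand-in for the statistical post-processing methods) followed by a deliberately simplified one-step irradiance-to-power conversion. All names, coefficients and the linear PV model here are illustrative assumptions, not the paper's actual model chain.

```python
import statistics

def postprocess_ensemble(ensemble, a, b, c):
    """Toy member-by-member correction: shift/scale the ensemble mean and
    re-spread the members around it. Coefficients a, b, c would normally be
    fitted on past forecast/observation pairs."""
    mean = statistics.mean(ensemble)
    corrected_mean = a + b * mean
    return [corrected_mean + c * (m - mean) for m in ensemble]

def model_chain_power(ghi_w_m2, capacity_kw=1000.0, ghi_ref=1000.0):
    """Hypothetical one-step 'model chain': convert global horizontal
    irradiance (W/m^2) to plant output with a simple clipped linear PV model."""
    return capacity_kw * min(max(ghi_w_m2, 0.0), ghi_ref) / ghi_ref

# Strategy "post-process irradiance, then convert to power":
raw_ghi_ensemble = [620.0, 655.0, 700.0, 585.0, 640.0]
ghi_pp = postprocess_ensemble(raw_ghi_ensemble, a=10.0, b=1.05, c=1.2)
power_members = [model_chain_power(g) for g in ghi_pp]
```

The alternative strategies amount to moving the correction step: applying `postprocess_ensemble` to the power members instead of (or in addition to) the irradiance members.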
As wafer circuit width has shrunk below ten nanometers in recent years, stringent quality control in the wafer manufacturing process is increasingly important. Owing to the coupling of neighboring cluster tools and the coordination of multiple robots in a multi-cluster tool, wafer production scheduling becomes rather complicated. After a wafer is processed, due to high-temperature chemical reactions in a chamber, the robot should be controlled to take it out of the processing chamber at the right time. To ensure the uniformity of integrated circuits on wafers, it is highly desirable to make the differences in wafer post-processing time among the individual tools in a multi-cluster tool as small as possible. To achieve this goal, for the first time, this work aims to find an optimal schedule for a dual-arm multi-cluster tool to regulate the wafer post-processing time. We propose polynomial-time algorithms to find an optimal schedule that achieves the highest throughput and minimizes the total post-processing time of the processing steps. We further propose a linear programming model and another algorithm to balance the differences in post-processing time between any pair of adjacent cluster tools. Two industrial examples are given to illustrate the application and effectiveness of the proposed method.
In the present computational fluid dynamics (CFD) community, post-processing is regarded as a procedure to view parameter distributions, detect characteristic structures and reveal the physical mechanisms of fluid flow based on computational or experimental results. Field plots by contours, iso-surfaces, streamlines, vectors and others are traditional post-processing techniques. However, the shock wave, an important and critical flow structure in many aerodynamic problems, can hardly be detected or distinguished directly with these traditional methods, owing to possible confusion with other, similar discontinuous flow structures such as slip lines and contact discontinuities. Therefore, methods for automatic detection of shock waves in post-processing are of great importance for both academic research and engineering applications. In this paper, the current status of methodologies developed for shock wave detection and their implementations in post-processing platforms is reviewed, together with discussions of the advantages and limitations of the existing methods and proposals for further studies of shock wave detection. We also develop an advanced post-processing software package with improved shock detection.
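As a minimal illustration of the kind of detector such reviews survey, the sketch below flags shocks in 1-D cell data by combining a pressure-jump test with a compression test; the compression condition is what lets it pass over a contact discontinuity (continuous pressure) that would fool a plain gradient detector. The threshold and variable names are assumptions for the example, not a method from the paper.

```python
def detect_shocks_1d(pressure, velocity, dx=1.0, jump_threshold=0.1):
    """Flag cell interfaces that look like shocks: a significant relative
    pressure jump combined with local compression (du/dx < 0). A contact
    discontinuity has a density jump but continuous pressure, so this
    simple sensor ignores it. Thresholds are illustrative, not tuned."""
    flags = []
    for i in range(len(pressure) - 1):
        dp = abs(pressure[i + 1] - pressure[i]) / max(pressure[i], pressure[i + 1])
        compressing = (velocity[i + 1] - velocity[i]) / dx < 0.0
        flags.append(dp > jump_threshold and compressing)
    return flags
```

Production detectors use sturdier criteria (e.g., the normal Mach number or characteristic-based sensors), but the jump-plus-compression structure is the same.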
Travel time data collection is used to assist congestion management. The use of traditional sensors (e.g., inductive loops, AVI sensors) or more recent Bluetooth sensors installed on major roads for collecting data is not sufficient because of their limited coverage and the expensive costs of installation and maintenance. Application of the Global Positioning System (GPS) to travel time and delay data collection has proven to be efficient in terms of accuracy, level of detail, and the man-power required. While data collection automation is improved by the GPS technique, human errors can easily find their way into the post-processing phase, and therefore data post-processing remains a challenge, especially in big projects with large amounts of data. This paper introduces a stand-alone post-processing tool called GPS Calculator, which provides an easy-to-use environment for carrying out data post-processing. It is a Visual Basic application that processes the data files obtained in the field and integrates them into Geographic Information Systems (GIS) for analysis and representation. The results show that this tool obtains results similar to the currently used data post-processing method, reduces the post-processing effort, and also eliminates the need for a second person during data collection.
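The kind of computation such a post-processing tool automates can be sketched as follows. The fix format and the crawl-speed definition of stopped delay are assumptions for illustration; the actual GPS Calculator works on field data files and GIS integration.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    R = 6371000.0  # mean Earth radius, metres
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

def segment_stats(fixes):
    """fixes: list of (t_seconds, lat, lon), time-ordered. Returns
    (travel_time_s, distance_m, delay_s), where delay counts time spent
    below a crawl speed of 2 m/s -- an illustrative stopped-delay rule."""
    travel_time = fixes[-1][0] - fixes[0][0]
    distance = delay = 0.0
    for (t0, la0, lo0), (t1, la1, lo1) in zip(fixes, fixes[1:]):
        d = haversine_m(la0, lo0, la1, lo1)
        distance += d
        if d / (t1 - t0) < 2.0:  # crawling or stopped
            delay += t1 - t0
    return travel_time, distance, delay
```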
Quantum random number generators adopting single-photon detection have been restricted by the non-negligible dead time of avalanche photodiodes (APDs). We propose a new approach based on an APD array to improve the generation rate of random numbers significantly. This method compares the detectors' responses to consecutive optical pulses and generates the random sequence. We implement a demonstration experiment to show its simplicity, compactness and scalability. The generated numbers are proved to be unbiased, post-processing free and ready to use, and their randomness is verified using the National Institute of Standards and Technology statistical test suite. The random bit generation efficiency is as high as 32.8%, and the potential generation rate adopting a 32×32 APD array is up to tens of Gbit/s.
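The comparison step can be illustrated with a toy sketch. The pairing rule below (emit a bit only when exactly one of two consecutive pulses produced a click, a von Neumann style comparison) is an assumption chosen to show why such schemes can be bias-free without further post-processing; the real system operates on hardware detector responses.

```python
def bits_from_pulse_pairs(responses):
    """responses: list of (first_pulse_click, second_pulse_click) booleans
    for one detector over consecutive optical-pulse pairs. A bit is emitted
    only when exactly one pulse of the pair produced a click, which cancels
    a constant per-detector click probability; equal pairs are discarded."""
    bits = []
    for first, second in responses:
        if first != second:
            bits.append(1 if second else 0)
    return bits
```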
To carry out numerical simulation using geologic structural data obtained from Landmark (a seismic interpretation system), underground geological structures are abstracted into mechanical models that reflect actual situations and facilitate computation and analysis. Given the importance of model building, further processing methods for traditional seismic interpretation results from Landmark should be studied so that the processed results can be used directly in numerical simulation computations. Through this data conversion procedure, Landmark and FLAC (a general-purpose stress-analysis code) are seamlessly connected. Thus, the format conversion between the two systems and the pre- and post-processing in simulation computation are realized. A practical application indicates that this method has many advantages, such as simple operation, high accuracy of element subdivision and high speed, which may satisfy the actual needs of floor grid cutting.
As castings become more complicated and the demands for precision of numerical simulation become higher, the numerical data of casting simulation become massive. On a general personal computer, these massive numerical data may exceed the capacity of available memory, resulting in failure of rendering. Based on the out-of-core technique, this paper proposes a method to effectively utilize external storage and reduce memory usage dramatically, so as to solve the problem of insufficient memory for massive data rendering on general personal computers. Based on this method, a new post-processor is developed. It is capable of illustrating the filling and solidification processes of casting, as well as thermal stress. The new post-processor also provides fast interaction with simulation results. Theoretical analysis as well as several practical examples prove that the memory usage and loading time of the post-processor are independent of the size of the relevant files and depend only on the proportion of cells on the surface. Meanwhile, the speed of rendering and of fetching values at the mouse position is appreciable, and the demands of real-time interaction are satisfied.
To improve the ability to detect underwater targets in strong wideband interference environments, an efficient method of line spectrum extraction is proposed, which fully utilizes the feature of the target spectrum that high-intensity, stable line spectra are superimposed on a wide continuous spectrum. This method modifies the traditional beamforming algorithm by calculating and fusing the beamforming results over multiple frequency bands and azimuth intervals, providing an excellent way to extract the line spectrum when the interference and the target are not in the same azimuth interval. The statistical efficiency of the estimated azimuth variance and the corresponding power of the line spectrum band depend on the line spectra ratio (LSR). The variation of the output signal-to-noise ratio (SNR) with the LSR, the input SNR, the integration time and the filtering bandwidth of different algorithms yields a selection principle for the critical LSR. On this basis, the detection gains of wideband energy integration and of the narrowband line spectrum algorithm are theoretically analyzed. The simulated detection gain shows a good match with the theoretical model. The application conditions of all methods are verified by receiver operating characteristic (ROC) curves and experimental data from Qiandao Lake. In fact, combining the two methods for target detection reduces the missed detection rate. The proposed two-dimensional post-processing method, with a Kalman filter in the time dimension and a background equalization algorithm in the azimuth dimension, exploits the strong correlation between adjacent frames and can further remove background fluctuation and improve the display effect.
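The core spectral feature exploited above, narrow lines standing well above a wide continuous background, can be sketched with a simple sliding-median detector. The median background estimate and the threshold factor are illustrative assumptions; the paper's method works on fused beamforming outputs, not a raw spectrum.

```python
def extract_line_spectrum(power, half_window=5, factor=4.0):
    """Flag spectral bins whose power stands well above the local
    continuous-spectrum background, estimated by a sliding median.
    The median is robust to the narrow lines themselves, so a line does
    not inflate its own background estimate."""
    lines = []
    n = len(power)
    for i in range(n):
        lo, hi = max(0, i - half_window), min(n, i + half_window + 1)
        neighbourhood = sorted(power[lo:hi])
        background = neighbourhood[len(neighbourhood) // 2]
        if power[i] > factor * background:
            lines.append(i)
    return lines
```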
This paper proposes improvements to a low-bit-rate parametric audio coder with a sinusoid model as its kernel. Firstly, we propose a new method to effectively order and select the perceptually most important sinusoids: the sinusoid that contributes most to the reduction of the overall noise-to-mask ratio (NMR) is chosen. Combined with our improved parametric psychoacoustic model and advanced peak-riddling techniques, the number of sinusoids required can be greatly reduced and the coding efficiency greatly enhanced. A lightweight version is also given to reduce the amount of computation with only a small sacrifice in performance. Secondly, we propose two enhancement techniques for sinusoid synthesis: bandwidth enhancement and line enhancement. With little overhead, the effective bandwidth can be extended by one more octave; the timbre tends to sound much brighter, thicker and more pleasant.
The low contrast of Magnetic Resonance (MR) images limits the visibility of subtle structures and adversely affects the outcome of both subjective and automated diagnosis. State-of-the-art contrast boosting techniques intolerably alter the inherent features of MR images. Drastic changes in brightness features induced by post-processing are not appreciated in medical imaging, as the grey level values have certain diagnostic meanings. To overcome these issues, this paper proposes an algorithm that enhances the contrast of MR images while preserving the underlying features. This method, termed Power-law and Logarithmic Modification-based Histogram Equalization (PLMHE), partitions the histogram of the image into two sub-histograms after a power-law transformation and a log compression. After a modification intended to improve the dispersion of the sub-histograms and subsequent normalization, cumulative histograms are computed, from which enhanced grey level values are obtained. The performance of the PLMHE algorithm is compared with traditional histogram-equalization-based algorithms, and the results show that PLMHE can boost image contrast without causing dynamic range compression, a significant change in mean brightness, or contrast overshoot.
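A much-simplified sketch in the spirit of PLMHE: apply a power-law transform, split the histogram at the transformed median, and equalize each sub-histogram into its own half of the grey range, which limits the mean-brightness shift relative to global equalization. This omits the log compression and dispersion-modification steps of the actual PLMHE; the gamma value and split rule are assumptions for illustration.

```python
def bi_histogram_equalize(pixels, levels=256, gamma=0.8):
    """Power-law transform, then split-at-median bi-histogram equalization.
    Each sub-histogram is stretched over its own half of [0, levels-1],
    so dark and bright populations cannot swap ranges."""
    g = [int((p / (levels - 1)) ** gamma * (levels - 1)) for p in pixels]
    split = sorted(g)[len(g) // 2]
    lo_vals = sorted(set(v for v in g if v <= split))
    hi_vals = sorted(set(v for v in g if v > split))
    lut = {}
    for ranks, lo_out, hi_out in ((lo_vals, 0, split), (hi_vals, split + 1, levels - 1)):
        span = max(len(ranks) - 1, 1)
        for i, v in enumerate(ranks):
            lut[v] = lo_out + round(i * (hi_out - lo_out) / span)
    return [lut[v] for v in g]
```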
In the analysis of high-rise buildings, traditional displacement-based plane elements are often used to obtain the in-plane internal forces of shear walls by stress integration. Limited by the singularity problem produced by wall holes and the loss of precision induced by using the differential method to derive strains, displacement-based elements cannot always provide sufficient accuracy for design. In this paper, a hybrid post-processing procedure based on the Hellinger-Reissner variational principle is used to improve the stress precision of two quadrilateral plane elements with drilling DOF. In order to find the best stress field, three different forms are assumed. Numerical results show that, by using the proposed method, the accuracy of the stress solutions of these two displacement-based plane elements can be improved.
Laser additive manufacturing (LAM) of titanium (Ti) alloys has emerged as a transformative technology with vast potential across multiple industries. To recap the state of the art, Ti alloys processed by two essential LAM techniques (i.e., laser powder bed fusion and laser-directed energy deposition) are reviewed, covering processes, materials and post-processing. The impacts of process parameters and strategies for optimizing them are elucidated. Various types of Ti alloys processed by LAM, including α-Ti, (α+β)-Ti and β-Ti alloys, are overviewed in terms of microstructures and benchmark properties. Furthermore, the post-processing methods for improving the performance of LAM-processed Ti alloys, including conventional and novel heat treatment, hot isostatic pressing, and surface processing (e.g., ultrasonic and laser shot peening), are systematically reviewed and discussed. The review summarizes the process windows, properties and performance envelopes and benchmarks the research achievements in LAM of Ti alloys. Outlooks on further trends in LAM of Ti alloys are also highlighted at the end. This comprehensive review could serve as a valuable resource for researchers and practitioners, promoting further advancements in LAM-built Ti alloys and their applications.
Accuracy allocation is crucial in the accuracy design of machine tools. Current accuracy allocation methods primarily focus on positional deviation, with little consideration of tool direction deviation. To address this issue, we propose a geometric-error cost-sensitivity-based accuracy allocation method for five-axis machine tools. A geometric error model consisting of 41 error components is constructed based on homogeneous transformation matrices. Volumetric points with positional and tool direction deviations are randomly sampled to evaluate the accuracy of the machine tool. The sensitivity of each error component at these sampling points is analyzed using the Sobol method. To balance the needs of geometric precision and manufacturing cost, a geometric error cost sensitivity function is developed to estimate the required cost. By allocating the error components affecting tool direction deviation first and the remaining components second, the allocation scheme ensures that both deviations meet the requirements. We also perform a numerical simulation of a BC-type (B-axis and C-axis) five-axis machine tool to validate the method. The results show that the new allocation scheme reduces the total geometric error cost by 27.8% compared to a uniform allocation scheme, while yielding the same positional and tool direction machining accuracies.
Trochoidal milling is known for its advantages in machining difficult-to-machine materials, as it facilitates chip removal and tool cooling. However, the conventional trochoidal tool path presents challenges such as lower machining efficiency and longer machining time, due to its time-varying cutter-workpiece engagement angle and a high percentage of non-cutting tool paths. To address these issues, this paper introduces a parameter-variant trochoidal-like (PVTR) tool path planning method for chatter-free and high-efficiency milling. This method ensures a constant engagement angle for each tool path period by adjusting the trochoidal radius and step. First, the nonlinear equation of the PVTR tool path is established. Then, a segmented recurrence method is proposed to plan tool paths based on the desired engagement angle. The impact of the trochoidal tool path parameters on the engagement angle is analyzed, and this information is coupled with a milling stability model based on spindle speed and engagement angle to determine the desired engagement angle throughout the machining process. Finally, several experimental tests are carried out using a bull-nose end mill to validate the feasibility and effectiveness of the proposed method.
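A trochoidal path is a circular loop whose centre advances along the feed direction, and the parameter-variant idea is to let each loop use its own radius and step. The sketch below only illustrates that geometry; the paper derives the per-period radius and step from a desired constant engagement angle via a segmented recurrence, which is not reproduced here.

```python
from math import cos, sin, pi

def trochoidal_like_path(radii, steps, pts_per_period=90):
    """Generate (x, y) points of a trochoidal-like tool path in which loop
    period k uses radius radii[k] and forward step steps[k]. The tool
    centre advances linearly within each loop, as in a classical trochoid:
        x = x_c + step * t + r * cos(2*pi*t),  y = r * sin(2*pi*t)."""
    path = []
    centre_x = 0.0
    for r, step in zip(radii, steps):
        for j in range(pts_per_period):
            theta = 2.0 * pi * j / pts_per_period
            x = centre_x + step * j / pts_per_period + r * cos(theta)
            path.append((x, r * sin(theta)))
        centre_x += step
    return path
```

Shrinking the radius while the slot widens (as in the test values below) is one crude way to keep the engaged arc from growing; the actual PVTR method computes both parameters from the stability-limited engagement angle.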
The developmental and reproductive toxicity (DART) endpoint entails a toxicological assessment of all developmental stages and reproductive cycles of an organism. In silico tools to predict DART provide a method to assess this complex toxicity endpoint and will be valuable for screening emerging pollutants as well as for managing new chemicals in China. Currently, there are few published DART prediction models in China, but many related research and development projects are in progress. In 2013, WU et al. published an expert-rule-based DART decision tree (DT). This DT relies on known chemical structures linked to DART to forecast the DART potential of a given chemical. Within this procedure, accurate DART data interpretation is the foundation for building and expanding the DT. This paper presents case studies demonstrating DART data curation and interpretation for four chemicals (8-hydroxyquinoline, 3,5,6-trichloro-2-pyridinol, thiacloprid, and imidacloprid) to expand the existing DART DT. The chemicals were first selected from the database of the Solid Waste and Chemicals Management Center, Ministry of Ecology and Environment (MEESCC) in China. Their structures were analyzed and preliminarily grouped by chemists based on core structural features, functional groups, receptor binding properties, metabolism, and possible modes of action. Then, DART conclusions were derived by toxicologists through collecting chemical information and searching, integrating, and interpreting DART data. Finally, these chemicals were classified into either an existing category or a new category by integrating their chemical features, DART conclusions, and biological properties. The results showed that 8-hydroxyquinoline impacted estrous cyclicity, sexual organ weights, and embryonal development, and 3,5,6-trichloro-2-pyridinol caused central nervous system (CNS) malformations; these were added to the existing subcategory 8e (aromatic compounds with multi-halogen and nitro groups) of the DT. Thiacloprid caused dystocia and fetal skeletal malformation, and imidacloprid disrupted the endocrine system and male fertility. Both contain a 2-chloro-5-methylpyridine-substituted imidazolidine cyclic ring and were expected to create a new category of neonicotinoids. The current work delineates a transparent process of curating toxicological data for the purpose of DART data interpretation. Given sufficient related structures and DART data, the DT can be expanded by iteratively adding chemicals within the applicable domain of each category or subcategory. This DT can potentially serve as a tool for screening emerging pollutants and assessing new chemicals in China.
Insect-derived traditional Chinese medicine (TCM) constitutes an essential component of TCM, with the earliest records found in "52 Bingfang" (Prescriptions of Fifty-two Diseases, one of the earliest Chinese medical prescription texts).
Background: Clinical decision support tools provide suggestions to support healthcare providers and clinicians as they attend to patients. Clinicians use these tools to rapidly consult the evidence at the point of care, a practice which has been found to reduce the time patients spend in hospitals, promote the quality of care and improve healthcare outcomes. Such tools include Medscape, VisualDx, ClinicalKey, DynaMed, BMJ Best Practice and UpToDate. However, use of such tools has not yet been fully embraced in low-resource settings such as Uganda. Objective: This paper collates data on the use and uptake of one such tool, UpToDate, which was provided at no cost to five medical schools in Uganda. Methods: Free access to UpToDate was granted through the IP addresses of five medical schools in Uganda in collaboration with Better Evidence at the Global Health Delivery Project at Harvard, Brigham and Women's Hospital, and Wolters Kluwer Health. Following the donation, medical librarians in the respective institutions conducted training sessions and created awareness of the tool. Usage data were aggregated based on logins and content views, then presented and analyzed using Excel tables and graphs. Results: The data show similar trends of increased usage over the period August 2022 to August 2023 across the five medical schools. The most common topics viewed, the mode of access (computer or mobile app), total usage by institution, the ratio of uses to eligible users by institution, and the ratio of uses to students by institution are shared. Conclusion: The study revealed that the tool was used by various user categories across the institutions, with a similar steady increase in usage over the year. These results can inform the librarians as they encourage their respective institutions to continue using the tool, supporting the uptake of point-of-care tools in clinical practice.
Microgrinding is widely used in clinical bone surgery, but saline spray cooling faces technical challenges such as low wettability at the microgrinding tool-bone interface, easy clogging of the microgrinding tools, and high grinding temperatures. These issues can lead to bone necrosis, irreversible thermal damage to nerves, or even surgical failure. Inspired by the water-trapping and directional transport abilities of desert beetles, this study proposes a biomimetic desert-beetle microgrinding tool. The flow-field distribution directly influences the convective heat transfer of the cooling medium in the grinding zone, which in turn affects the grinding temperature. To address this, a mathematical model of the two-phase flow field at the biomimetic microgrinding tool-bone interface is developed. The results indicate an average error of 14.74% between the calculated and experimentally obtained airflow field velocities. Next, a biomimetic desert-beetle microgrinding tool is prepared, and experiments with physiological saline spray cooling are conducted on fresh bovine femur, which has mechanical properties similar to human bone. The results show that, compared with conventional microgrinding tools, the biomimetic tools reduced bone surface temperature by 21.7%, 13.2%, 5.8%, 20.3%, and 25.8% at grit sizes of 150#, 200#, 240#, 270#, and 300#, respectively. The surface morphology of the biomimetic microgrinding tools after grinding is observed and analyzed, revealing a maximum clogging-area reduction of 23.0%, which is 6.1%, 6.0%, 10.0%, 15.6%, and 9.5% less than that observed with conventional tools. Finally, this study unveils the dynamic mechanism of cooling-medium transfer in the flow field at the biomimetic microgrinding tool-bone interface. This research provides theoretical guidance and technical support for clinical bone resection surgery.
Cardiovascular disease is the leading cause of human mortality, and calcified tissue blocking blood vessels is the main cause of major adverse cardiovascular events (MACE). Rotational atherectomy (RA) is a minimally invasive catheter-based treatment that removes calcified tissue by high-speed cutting with miniature tools. However, the cutting forces, heat, and debris can induce tissue damage and give rise to serious surgical complications. To enhance the effectiveness and efficiency of RA, a novel eccentric rotational cutting tool, with one side comprising axially and circumferentially staggered micro-blades, was designed and fabricated in this study. In addition, a series of experiments was conducted to analyze its performance across five dimensions: tool kinematics, force, temperature, debris, and the surface morphology of the specimens. Experimental results show that the force, temperature and debris size of the novel tool were well suppressed at the highest rotational speed. For a tool of standard clinical size (diameter 1.25 mm), the maximum force is 0.75 N, with a maximum temperature rise in the operating area of 1.09 °C. Debris size followed a normal distribution, with 90% of debris smaller than 9.12 μm. All tool metrics met clinical safety requirements, indicating superior performance. This study provides a new idea for the design of calcified-tissue removal tools and contributes positively to the advancement of RA.
Funding: Supported in part by the National Natural Science Foundation of China (61673123, 61803397, 61603100) and the Science and Technology Development Fund (FDCT), Macao SAR, China (0017/2019/A1, 005/2018/A1, 011/2017/A).
Funding: The Young Investigator Group "Artificial Intelligence for Probabilistic Weather Forecasting" funded by the Vector Stiftung, with funding from the Federal Ministry of Education and Research (BMBF) and the Baden-Württemberg Ministry of Science as part of the Excellence Strategy of the German Federal and State Governments.
Abstract: Weather forecasts from numerical weather prediction models play a central role in solar energy forecasting, where a cascade of physics-based models is used in a model chain approach to convert forecasts of solar irradiance to solar power production. Ensemble simulations from such weather models aim to quantify uncertainty in the future development of the weather, and can be used to propagate this uncertainty through the model chain to generate probabilistic solar energy predictions. However, ensemble prediction systems are known to exhibit systematic errors, and thus require post-processing to obtain accurate and reliable probabilistic forecasts. The overarching aim of our study is to systematically evaluate different strategies for applying post-processing in model chain approaches with a specific focus on solar energy: not applying any post-processing at all; post-processing only the irradiance predictions before the conversion; post-processing only the solar power predictions obtained from the model chain; or applying post-processing in both steps. In a case study based on a benchmark dataset for the Jacumba solar plant in the U.S., we develop statistical and machine learning methods for post-processing ensemble predictions of global horizontal irradiance (GHI) and solar power generation. Further, we propose a neural-network-based model for direct solar power forecasting that bypasses the model chain. Our results indicate that post-processing substantially improves the solar power generation forecasts, in particular when post-processing is applied to the power predictions. The machine learning methods for post-processing slightly outperform the statistical methods, and the direct forecasting approach performs comparably to the post-processing strategies.
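The strategies compared in this abstract can be mocked up end to end with a toy model chain. Everything below is invented for illustration (the biased "NWP" forecast, the clipped linear power curve, the affine corrections); the paper itself uses real benchmark data and richer statistical and machine learning post-processing models.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic truth and a systematically biased "NWP" irradiance forecast
# (all numbers are hypothetical, chosen only to make the bias visible).
ghi_true = rng.uniform(100.0, 900.0, 500)
ghi_fc = ghi_true + 50.0 + rng.normal(0.0, 30.0, 500)

def power(ghi):
    # Toy model chain: linear PV response clipped at an inverter limit.
    return 0.8 * np.minimum(ghi, 700.0)

def affine_fit(x, y):
    # Simplest possible post-processing: least-squares fit y ≈ a + b*x.
    A = np.column_stack([np.ones_like(x), x])
    (a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
    return a, b

obs_power = power(ghi_true)

# Strategy A: post-process the irradiance, then run the model chain.
a, b = affine_fit(ghi_fc, ghi_true)
power_a = power(a + b * ghi_fc)

# Strategy B: run the model chain first, then post-process the power.
a2, b2 = affine_fit(power(ghi_fc), obs_power)
power_b = a2 + b2 * power(ghi_fc)

mae_raw = np.abs(power(ghi_fc) - obs_power).mean()
mae_a = np.abs(power_a - obs_power).mean()
mae_b = np.abs(power_b - obs_power).mean()
```

On this synthetic data both strategies beat the uncorrected conversion; the paper's point is that on real data the choice of where to correct (irradiance, power, or both) matters.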
Funding: Supported in part by the National Natural Science Foundation of China (61673123); the Natural Science Foundation of Guangdong Province, China (2020A151501482); the Science and Technology Development Fund (FDCT), Macao SAR (0083/2021/A2, 0015/2020/AMJ); and the Research Fund of the Guangdong-Hong Kong-Macao Joint Laboratory for Intelligent Micro-Nano Optoelectronic Technology (2020B1212030010).
Abstract: As wafer circuit width has shrunk to less than ten nanometers in recent years, stringent quality control in the wafer manufacturing process has become increasingly important. Owing to the coupling of neighboring cluster tools and the coordination of multiple robots in a multi-cluster tool, wafer production scheduling becomes rather complicated. After a wafer is processed, due to high-temperature chemical reactions in a chamber, the robot should be controlled to take it out of the processing chamber at the right time. In order to ensure the uniformity of integrated circuits on wafers, it is highly desirable to make the differences in wafer post-processing time among the individual tools in a multi-cluster tool as small as possible. To achieve this goal, for the first time, this work aims to find an optimal schedule for a dual-arm multi-cluster tool to regulate the wafer post-processing time. We propose polynomial-time algorithms to find an optimal schedule that achieves the highest throughput and minimizes the total post-processing time of the processing steps. We also propose a linear programming model and a further algorithm to balance the differences in post-processing time between any pair of adjacent cluster tools. Two industrial examples are given to illustrate the application and effectiveness of the proposed method.
Abstract: In the present computational fluid dynamics (CFD) community, post-processing is regarded as the procedure for viewing parameter distributions, detecting characteristic structures and revealing physical mechanisms of fluid flow based on computational or experimental results. Field plots using contours, iso-surfaces, streamlines, vectors and the like are traditional post-processing techniques. However, the shock wave, an important and critical flow structure in many aerodynamic problems, can hardly be detected or distinguished directly using these traditional methods, due to possible confusion with other similar discontinuous flow structures such as slip lines and contact discontinuities. Therefore, methods for automatic detection of shock waves in post-processing are of great importance for both academic research and engineering applications. In this paper, the current status of methodologies developed for shock wave detection and their implementation in post-processing platforms is reviewed, together with discussions on the advantages and limitations of existing methods and proposals for further study of shock wave detection. We have also developed advanced post-processing software with improved shock detection.
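The shock/contact confusion described above has a classical one-dimensional resolution: a shock jumps in both pressure and density, while a contact discontinuity jumps in density only. The sketch below encodes just that criterion with an arbitrary jump threshold; it is a toy discriminator, not one of the production detection methods the paper reviews.

```python
import numpy as np

def flag_shock_interfaces(p, rho, eps=0.05):
    """Flag cell interfaces where pressure AND density both jump.

    Crude 1-D discriminator: across a shock both p and rho jump, while
    across a contact discontinuity rho jumps but p stays continuous --
    exactly the confusion that plain contour plots suffer from.
    Relative jumps above `eps` (an assumed threshold) count as jumps.
    """
    p, rho = np.asarray(p, float), np.asarray(rho, float)
    dp = np.abs(np.diff(p)) / np.minimum(p[:-1], p[1:])
    drho = np.abs(np.diff(rho)) / np.minimum(rho[:-1], rho[1:])
    return (dp > eps) & (drho > eps)

# Tiny field with one shock-like and one contact-like interface.
p   = [1.0, 1.0, 0.4, 0.4, 0.4]
rho = [1.0, 1.0, 0.5, 0.25, 0.25]
flags = flag_shock_interfaces(p, rho)
```

Only the interface where both fields jump is flagged; the density-only jump (a contact) is correctly ignored.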
Abstract: Travel time data collection is used to assist congestion management. The use of traditional sensors (e.g. inductive loops, AVI sensors) or more recent Bluetooth sensors installed on major roads is not sufficient because of their limited coverage and the high costs of installation and maintenance. Application of the Global Positioning System (GPS) to travel time and delay data collection has proven efficient in terms of accuracy, level of detail and the man-power required. While data collection is automated by the GPS technique, human errors can easily find their way into the post-processing phase, and therefore data post-processing remains a challenge, especially for big projects with large amounts of data. This paper introduces a stand-alone post-processing tool called GPS Calculator, which provides an easy-to-use environment for carrying out data post-processing. This is a Visual Basic application that processes the data files obtained in the field and integrates them into Geographic Information Systems (GIS) for analysis and representation. The results show that this tool obtains results similar to the currently used data post-processing method, reduces the post-processing effort, and eliminates the need for a second person during data collection.
Funding: Supported by the Chinese Academy of Sciences Center for Excellence and Synergetic Innovation Center in Quantum Information and Quantum Physics, Shanghai Branch, University of Science and Technology of China, and by the National Natural Science Foundation of China under Grant No. 11405172.
Abstract: Quantum random number generators adopting single-photon detection have been restricted due to the non-negligible dead time of avalanche photodiodes (APDs). We propose a new approach based on an APD array to improve the generation rate of random numbers significantly. This method compares the detectors' responses to consecutive optical pulses and generates the random sequence. We implement a demonstration experiment to show its simplicity, compactness and scalability. The generated numbers are proved to be unbiased, post-processing free and ready to use, and their randomness is verified using the National Institute of Standards and Technology statistical test suite. The random bit generation efficiency is as high as 32.8%, and the potential generation rate adopting the 32×32 APD array is up to tens of Gbit/s.
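The comparison scheme described in the abstract can be imitated in software. The toy simulation below (the click probability and discard rule are assumptions, and a real device does this in hardware) shows why comparing responses to two consecutive pulses yields unbiased bits even from a biased click probability, by the same symmetry argument as von Neumann unbiasing.

```python
import random

def consecutive_pulse_bits(click_prob, n_pairs, seed=0):
    """Toy simulation of compare-consecutive-pulses bit extraction.

    For each pair of optical pulses: (click, no-click) -> bit 0,
    (no-click, click) -> bit 1, identical outcomes are discarded.
    Both informative outcomes have probability p*(1-p), so the output
    is unbiased regardless of the click probability p.
    """
    rng = random.Random(seed)
    bits = []
    for _ in range(n_pairs):
        a = rng.random() < click_prob
        b = rng.random() < click_prob
        if a != b:                      # discard equal outcomes
            bits.append(1 if b else 0)
    return bits

bits = consecutive_pulse_bits(click_prob=0.2, n_pairs=100_000)
```

Even with a strongly biased 20% click probability, the kept bits come out balanced, at the cost of discarding the equal-outcome pairs.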
Funding: Projects 50221402, 50490271 and 50025413 supported by the National Natural Science Foundation of China; the National Basic Research Program of China (2009CB219603, 2009CB724601, 2006CB202209 and 2005CB221500); the Key Project of the Ministry of Education (306002); and the Program for Changjiang Scholars and Innovative Research Teams in Universities of MOE (IRT0408).
Abstract: In order to carry out numerical simulation using geologic structural data obtained from Landmark (a seismic interpretation system), underground geological structures are abstracted into mechanical models that reflect actual situations and facilitate computation and analysis. Given the importance of model building, further processing methods for traditional seismic interpretation results from Landmark should be studied so that the processed results can be used directly in numerical simulation computations. Through this data conversion procedure, Landmark and FLAC (an internationally used stress-analysis code) are seamlessly connected. Thus, the format conversion between the two systems and the pre- and post-processing in simulation computation are realized. A practical application indicates that this method has many advantages, such as simple operation, high accuracy of element subdivision and high speed, which satisfy the actual needs of floor grid cutting.
Funding: Supported by the New Century Excellent Talents in University program (NCET-09-0396); the National Science & Technology Key Projects of Numerical Control (2012ZX04014-031); the Natural Science Foundation of Hubei Province (2011CDB279); and the Foundation for Innovative Research Groups of the Natural Science Foundation of Hubei Province, China (2010CDA067).
Abstract: When castings become more complicated and the demands for precision of numerical simulation become higher, the numerical data of casting simulation become massive. On a general personal computer, these massive data may exceed the capacity of available memory, resulting in failure of rendering. Based on the out-of-core technique, this paper proposes a method to effectively utilize external storage and reduce memory usage dramatically, so as to solve the problem of insufficient memory for massive data rendering on general personal computers. Based on this method, a new post-processor is developed. It is capable of illustrating the filling and solidification processes of casting, as well as thermal stress, and provides fast interaction with simulation results. Theoretical analysis and several practical examples prove that the memory usage and loading time of the post-processor are independent of the size of the relevant files and depend only on the proportion of cells on the surface. Meanwhile, the speed of rendering and of fetching values at the mouse position is appreciable, and the demands of real-time interaction are satisfied.
基金supported by the National Natural Science Foundation of China(51875535)the Natural Science Foundation for Young Scientists of Shanxi Province(201701D221017,201901D211242)。
Abstract: To improve the detection of underwater targets in strong wideband-interference environments, an efficient line-spectrum extraction method is proposed. It fully utilizes the feature of the target spectrum that high-intensity, stable line spectra are superimposed on a wide continuous spectrum. The method modifies the traditional beamforming algorithm by calculating and fusing the beamforming results over multiple frequency bands and azimuth intervals, providing an excellent way to extract the line spectrum when the interference and the target are not in the same azimuth interval. The statistical efficiency of the estimated azimuth variance and the corresponding power of the line-spectrum band depend on the line spectra ratio (LSR). The variation of the output signal-to-noise ratio (SNR) with the LSR, the input SNR, the integration time and the filtering bandwidth of the different algorithms yields a selection principle for the critical LSR. On this basis, the detection gains of wideband energy integration and the narrowband line-spectrum algorithm are theoretically analyzed, and the simulated detection gain matches the theoretical model well. The application conditions of all methods are verified by receiver operating characteristic (ROC) curves and experimental data from Qiandao Lake. In practice, combining the two methods for target detection reduces the missed detection rate. The proposed two-dimensional post-processing method, with a Kalman filter in the time dimension and a background equalization algorithm in the azimuth dimension, exploits the strong correlation between adjacent frames and can further remove background fluctuation and improve the display.
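The quantity that drives algorithm selection above, the line spectra ratio (LSR), can be approximated directly from a spectrum. The sketch below is a simplified stand-in (peak-to-mean power of an FFT, with invented signal parameters), not the paper's estimator:

```python
import numpy as np

def line_spectrum_ratio(x):
    """Peak narrowband power over mean broadband power of the spectrum.

    A toy proxy for the LSR: a target whose strong, stable line spectrum
    rides on a continuous spectrum yields a large ratio, while broadband
    noise alone stays near the statistical peak of the noise floor.
    """
    psd = np.abs(np.fft.rfft(x)) ** 2
    psd = psd[1:]  # drop the DC bin
    return psd.max() / psd.mean()

# Invented test signals: white noise vs. noise plus one line component.
rng = np.random.default_rng(0)
n = 4096
noise = rng.normal(0.0, 1.0, n)
tone = np.cos(2 * np.pi * 200 * np.arange(n) / n)
lsr_noise = line_spectrum_ratio(noise)
lsr_tone = line_spectrum_ratio(noise + tone)
```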
Abstract: This paper proposes improvements to a low-bit-rate parametric audio coder with a sinusoidal model as its kernel. First, we propose a new method to effectively order and select the perceptually most important sinusoids: the sinusoid that contributes most to the reduction of the overall noise-to-mask ratio (NMR) is chosen. Combined with our improved parametric psychoacoustic model and advanced peak-riddling techniques, the number of sinusoids required can be greatly reduced and the coding efficiency greatly enhanced. A lightweight version is also given to reduce the amount of computation with only a small sacrifice in performance. Second, we propose two enhancement techniques for sinusoid synthesis: bandwidth enhancement and line enhancement. With little overhead, the effective bandwidth can be extended by one more octave, and the timbre tends to sound much brighter, thicker and more beautiful.
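The "pick the most important sinusoid first" loop can be sketched by swapping the paper's NMR criterion for plain residual energy, which turns it into a matching-pursuit-style greedy loop. Everything here (bin-aligned test tones, sample rate) is invented for the demo; the paper's perceptual weighting is not reproduced.

```python
import numpy as np

def greedy_sinusoid_pick(signal, sr, k):
    """Greedy selection of k sinusoids by residual-energy reduction.

    Each iteration estimates amplitude/phase of the strongest remaining
    FFT bin and subtracts that component -- a simplified, non-perceptual
    stand-in for choosing the sinusoid that most reduces overall NMR.
    """
    residual = np.asarray(signal, dtype=np.float64).copy()
    n = len(residual)
    t = np.arange(n) / sr
    picks = []
    for _ in range(k):
        spec = np.fft.rfft(residual)
        i = int(np.argmax(np.abs(spec[1:])) + 1)   # strongest bin, skip DC
        amp = 2.0 * np.abs(spec[i]) / n
        phase = np.angle(spec[i])
        freq = i * sr / n
        residual = residual - amp * np.cos(2.0 * np.pi * freq * t + phase)
        picks.append(freq)
    return picks, residual

# Two bin-aligned tones: the stronger one is picked first.
sr, n = 1000, 1000
t = np.arange(n) / sr
sig = np.cos(2 * np.pi * 50 * t) + 0.5 * np.cos(2 * np.pi * 120 * t)
picks, residual = greedy_sinusoid_pick(sig, sr, 2)
```

For bin-aligned tones two picks capture essentially all the energy; real signals need windowing and leakage handling on top of this.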
Funding: This work was supported by Taif University Researchers Supporting Project Number (TURSP-2020/114), Taif University, Taif, Saudi Arabia.
Abstract: The low contrast of Magnetic Resonance (MR) images limits the visibility of subtle structures and adversely affects the outcome of both subjective and automated diagnosis. State-of-the-art contrast-boosting techniques intolerably alter the inherent features of MR images, and drastic post-processing-induced changes in brightness are not appreciated in medical imaging, as grey-level values have certain diagnostic meanings. To overcome these issues, this paper proposes an algorithm that enhances the contrast of MR images while preserving the underlying features. This method, termed Power-law and Logarithmic Modification-based Histogram Equalization (PLMHE), partitions the histogram of the image into two sub-histograms after a power-law transformation and a log compression. After a modification intended to improve the dispersion of the sub-histograms and subsequent normalization, cumulative histograms are computed and enhanced grey-level values are derived from them. The performance of the PLMHE algorithm is compared with traditional histogram-equalization-based algorithms, and the results show that PLMHE can boost image contrast without causing dynamic range compression, a significant change in mean brightness, or contrast overshoot.
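A heavily simplified cousin of PLMHE can illustrate the partition-then-equalize idea. The sketch below applies a power-law pre-transform, splits the histogram at the mean (the paper's actual split rule, log compression, and dispersion modification are not reproduced), and equalizes each sub-histogram into its own output range, which limits the mean-brightness shift relative to global equalization.

```python
import numpy as np

def bi_histogram_equalize(img, gamma=0.8, bins=64):
    """Illustrative bi-histogram equalization with a power-law pre-step.

    Simplified stand-in for PLMHE: gamma-transform the image, split its
    histogram at the mean grey level, and equalize each sub-histogram
    into its own half of [0, 255] via the sub-range CDF.
    """
    img = np.asarray(img, dtype=np.float64)
    pl = np.power(img / 255.0, gamma) * 255.0   # power-law transform
    split = pl.mean()

    out = np.empty_like(pl)
    for lo, hi, mask in [(0.0, split, pl <= split),
                         (split, 255.0, pl > split)]:
        vals = pl[mask]
        if vals.size == 0:
            continue
        hist, edges = np.histogram(vals, bins=bins, range=(lo, hi))
        cdf = np.cumsum(hist).astype(np.float64)
        cdf /= cdf[-1]
        # Map each pixel through the sub-range CDF into [lo, hi].
        idx = np.clip(np.searchsorted(edges, vals, side="right") - 1,
                      0, bins - 1)
        out[mask] = lo + cdf[idx] * (hi - lo)
    return out.astype(np.uint8)

# Invented low-contrast test image confined to grey levels 100..140.
rng = np.random.default_rng(0)
img = rng.integers(100, 141, size=(64, 64))
out = bi_histogram_equalize(img)
```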
Abstract: In the analysis of high-rise buildings, traditional displacement-based plane elements are often used to obtain the in-plane internal forces of shear walls by stress integration. Limited by the singularity problem produced by wall holes and the loss of precision induced by using the differential method to derive strains, displacement-based elements cannot always provide sufficient accuracy for design. In this paper, a hybrid post-processing procedure based on the Hellinger-Reissner variational principle is used to improve the stress precision of two quadrilateral plane elements. In order to find the best stress field, three different forms are assumed for the displacement-based plane elements with drilling DOFs. Numerical results show that the proposed method improves the accuracy of the stress solutions of these two displacement-based plane elements.
Funding: Financially supported by the 2022 MTC Young Individual Research Grants under the Singapore Research, Innovation and Enterprise (RIE) 2025 Plan (No. M22K3c0097); the National Science Foundation of the US (No. DMR-2104933); and the sponsorship of the China Scholarship Council (No. 202106130051).
Abstract: Laser additive manufacturing (LAM) of titanium (Ti) alloys has emerged as a transformative technology with vast potential across multiple industries. To recap the state of the art, Ti alloys processed by two essential LAM techniques (laser powder bed fusion and laser-directed energy deposition) are reviewed, covering processes, materials and post-processing. The impacts of process parameters and strategies for optimizing them are elucidated. The various types of Ti alloys processed by LAM, including α-Ti, (α+β)-Ti, and β-Ti alloys, are surveyed in terms of microstructures and benchmark properties. Furthermore, post-processing methods for improving the performance of LAM-processed Ti alloys, including conventional and novel heat treatment, hot isostatic pressing, and surface processing (e.g., ultrasonic and laser shot peening), are systematically reviewed and discussed. The review summarizes the process windows, properties and performance envelopes and benchmarks the research achievements in LAM of Ti alloys. Outlooks on further trends in LAM of Ti alloys are highlighted at the end. This comprehensive review could serve as a valuable resource for researchers and practitioners, promoting further advancements in LAM-built Ti alloys and their applications.
Funding: Supported by the Key R&D Program of Zhejiang Province (Nos. 2023C01166 and 2024SJCZX0046); the Zhejiang Provincial Natural Science Foundation of China (Nos. LDT23E05013E05 and LD24E050009); and the Natural Science Foundation of Ningbo (No. 2021J150), China.
Abstract: Accuracy allocation is crucial in the accuracy design of machine tools. Current accuracy allocation methods primarily focus on positional deviation, with little consideration for tool direction deviation. To address this issue, we propose a geometric error cost sensitivity-based accuracy allocation method for five-axis machine tools. A geometric error model consisting of 41 error components is constructed based on homogeneous transformation matrices. Volumetric points with positional and tool direction deviations are randomly sampled to evaluate the accuracy of the machine tool. The sensitivity of each error component at these sampling points is analyzed using the Sobol method. To balance the needs of geometric precision and manufacturing cost, a geometric error cost sensitivity function is developed to estimate the required cost. By allocating the error components affecting tool direction deviation first and the remaining components second, this allocation scheme ensures that both deviations meet the requirements. We also perform a numerical simulation of a BC-type (B-axis and C-axis type) five-axis machine tool to validate the method. The results show that the new allocation scheme reduces the total geometric error cost by 27.8% compared to a uniform allocation scheme, while yielding the same positional and tool direction machining accuracies.
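The Sobol sensitivity step can be illustrated generically. The sketch below implements the Jansen pick-freeze estimator of first-order indices for a toy model with independent uniform inputs; the machine tool's actual geometric error model is not reproduced, and the toy function's true sensitivities (0.8 vs 0.2) are known analytically, which makes the estimate easy to check.

```python
import numpy as np

def sobol_first_order(f, d, n=20000, seed=0):
    """Jansen pick-freeze estimator of first-order Sobol indices.

    Inputs are assumed independent U(0,1); `f` maps an (n, d) sample
    array to n outputs. S_i = 1 - E[(f(B) - f(AB_i))^2] / (2 Var),
    where AB_i is A with its i-th column taken from B.
    """
    rng = np.random.default_rng(seed)
    A, B = rng.random((n, d)), rng.random((n, d))
    fA, fB = f(A), f(B)
    var = np.concatenate([fA, fB]).var()
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]          # "freeze" every input except the i-th
        S[i] = 1.0 - 0.5 * np.mean((fB - f(ABi)) ** 2) / var
    return S

# Toy error model: the output is twice as sensitive to x0 as to x1,
# so the exact first-order indices are 4/5 and 1/5.
S = sobol_first_order(lambda X: 2.0 * X[:, 0] + X[:, 1], d=2)
```

In the paper's setting, `f` would map sampled error components to a positional or tool direction deviation, and the resulting ranking feeds the cost-sensitivity allocation.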
基金supported by the National Natural Science Foundation of China(Grant Nos.U22A20202 and 52275477).
Abstract: Trochoidal milling is known for its advantages in machining difficult-to-machine materials, as it facilitates chip removal and tool cooling. However, the conventional trochoidal tool path presents challenges such as lower machining efficiency and longer machining time due to its time-varying cutter-workpiece engagement angle and a high percentage of non-cutting tool paths. To address these issues, this paper introduces a parameter-variant trochoidal-like (PVTR) tool path planning method for chatter-free and high-efficiency milling. This method ensures a constant engagement angle for each tool path period by adjusting the trochoidal radius and step. First, the nonlinear equation of the PVTR tool path is established. Then, a segmented recurrence method is proposed to plan tool paths based on the desired engagement angle. The impact of the trochoidal tool path parameters on the engagement angle is analyzed and coupled with a milling stability model based on spindle speed and engagement angle to determine the desired engagement angle throughout the machining process. Finally, several experimental tests are carried out using a bull-nose end mill to validate the feasibility and effectiveness of the proposed method.
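For contrast with the PVTR method, the baseline geometry it improves on is easy to generate: a conventional trochoid advances the tool centre by a fixed step per revolution at a fixed radius, which is exactly what makes the engagement angle time-varying. The parameters below are invented for illustration.

```python
import numpy as np

def trochoidal_path(radius, step, periods, pts_per_period=200):
    """Conventional constant-parameter trochoidal tool path.

    The tool centre advances by `step` per revolution while circling at
    `radius`. The paper's PVTR method would instead adjust radius and
    step each period to hold the engagement angle constant; this fixed
    version only shows the baseline geometry being improved upon.
    """
    t = np.linspace(0.0, 2.0 * np.pi * periods, pts_per_period * periods)
    x = step * t / (2.0 * np.pi) + radius * np.cos(t)
    y = radius * np.sin(t)
    return x, y

# Hypothetical parameters: 2 mm trochoidal radius, 1 mm step, 5 periods.
x, y = trochoidal_path(radius=2.0, step=1.0, periods=5)
```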
Abstract: The developmental and reproductive toxicity (DART) endpoint entails a toxicological assessment of all developmental stages and reproductive cycles of an organism. In silico tools to predict DART provide a method to assess this complex toxicity endpoint and are valuable for screening emerging pollutants as well as for managing new chemicals in China. Currently, there are few published DART prediction models in China, but many related research and development projects are in progress. In 2013, WU et al. published an expert rule-based DART decision tree (DT). This DT relies on known chemical structures linked to DART to forecast the DART potential of a given chemical. Within this procedure, accurate DART data interpretation is the foundation of building and expanding the DT. This paper excerpts case studies demonstrating DART data curation and interpretation for four chemicals (8-hydroxyquinoline, 3,5,6-trichloro-2-pyridinol, thiacloprid, and imidacloprid) to expand the existing DART DT. The chemicals were first selected from the database of the Solid Waste and Chemicals Management Center, Ministry of Ecology and Environment (MEESCC) in China. Their structures were analyzed and preliminarily grouped by chemists based on core structural features, functional groups, receptor binding properties, metabolism, and possible modes of action. The DART conclusions were then derived by toxicologists through collecting chemical information and searching, integrating, and interpreting DART data. Finally, the chemicals were classified into either an existing category or a new category by integrating their chemical features, DART conclusions, and biological properties. The results showed that 8-hydroxyquinoline impacted estrous cyclicity, sexual organ weights, and embryonal development, and 3,5,6-trichloro-2-pyridinol caused central nervous system (CNS) malformations; these were added to the existing subcategory 8e (aromatic compounds with multi-halogen and nitro groups) of the DT. Thiacloprid caused dystocia and fetal skeletal malformation, and imidacloprid disrupted the endocrine system and male fertility. Both contain a 2-chloro-5-methylpyridine-substituted imidazolidine cyclic ring, which was expected to create a new category of neonicotinoids. The current work delineates a transparent process of curating toxicological data for the purpose of DART data interpretation. Given sufficient related structures and DART data, the DT can be expanded by iteratively adding chemicals within the applicable domain of each category or subcategory. This DT can potentially serve as a tool for screening emerging pollutants and assessing new chemicals in China.
Funding: Funded by the National Natural Science Foundation of China (Grant Nos. 82222068, 82070423, 82270348, and 82173779); the Innovation Team and Talents Cultivation Program of the National Administration of Traditional Chinese Medicine, China (Grant No. ZYYCXTD-D-202206); the Fujian Province Science and Technology Project, China (Grant Nos. 2021J01420479, 2021J02058, 2022J011374, and 2022J02057); and the Fundamental Research Funds for the Chinese Central Universities, China (Grant No. 20720230070).
Abstract: Insect-derived traditional Chinese medicine (TCM) constitutes an essential component of TCM, with the earliest records found in "52 Bingfang" (Prescriptions of Fifty-two Diseases, one of the earliest collections of Chinese medical prescriptions).
文摘Background: Clinical decision support tools provide suggestions to support healthcare providers and clinicians, as they attend to patients. Clinicians use these tools to rapidly consult the evidence at the point of care, a practice which has been found to reduce the time patients spend in hospitals, promote the quality of care and improve healthcare outcomes. Such tools include Medscape, VisualDx, Clinical Key, DynaMed, BMJ Best Practice and UpToDate. However, use of such tools has not yet been fully embraced in low-resource settings such as Uganda. Objective: This paper intends to collate data on the use and uptake of one such tool, UpToDate, which was provided at no cost to five medical schools in Uganda. Methods: Free access to UpToDate was granted through the IP addresses of five medical schools in Uganda in collaboration with Better Evidence at The Global Health Delivery Project at Harvard and Brigham and Women’s Hospital and Wolters Kluwer Health. Following the donation, medical librarians in the respective institutions conducted training sessions and created awareness of the tool. Usage data was aggregated, based on logins and content views, presented and analyzed using Excel tables and graphs. Results: The data shows similar trends in increased usage over the period of August 2022 to August 2023 across the five medical schools. The most common topics viewed, mode of access (using either the computer or the mobile app), total usage by institution, ratio of uses to eligible users by institution and ratio of uses to students by institution are shared. Conclusion: The study revealed that the tool was used by various user categories across the institutions with similar steady improved usage over the year. These results can inform the librarians as they encourage their respective institutions to continue using the tool to support uptake of point-of-care tools in clinical practice.
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 52205481, 52305477); the Outstanding Youth Innovation Team in Universities of Shandong Province (Grant No. 2023KJ114); the Qingdao Science and Technology Planning Park Cultivation Plan (Grant No. 23-1-5-yqpy-17-qy); the Young Talent of Lifting Engineering for Science and Technology in Shandong (Grant No. SDAST2024QTA043); and the Key Lab of Industrial Fluid Energy Conservation and Pollution Control (Ministry of Education) (Grant No. CK-2024-0033).
Abstract: Microgrinding is widely used in clinical bone surgery, but saline spray cooling faces technical challenges such as low wettability at the microgrinding tool–bone interface, easy clogging of the microgrinding tools, and high grinding temperatures. These issues can lead to bone necrosis, irreversible thermal damage to nerves, or even surgical failure. Inspired by the water-trapping and directional transportation abilities of desert beetles, this study proposes a biomimetic desert beetle microgrinding tool. The flow-field distribution directly influences the convective heat transfer of the cooling medium in the grinding zone, which in turn affects the grinding temperature. To address this, a mathematical model of the two-phase flow field at the biomimetic microgrinding tool–bone interface is developed. The results indicate an average error of 14.74% between the calculated and experimentally obtained airflow field velocities. Next, a biomimetic desert beetle microgrinding tool is prepared. Experiments with physiological saline spray cooling were conducted on fresh bovine femur bone, which has mechanical properties similar to human bone. Results show that, compared with conventional microgrinding tools, the biomimetic tools reduced bone surface temperature by 21.7%, 13.2%, 5.8%, 20.3%, and 25.8% at grit sizes of 150#, 200#, 240#, 270#, and 300#, respectively. The surface morphology of the biomimetic microgrinding tools after grinding is observed and analyzed, revealing a maximum clogging-area reduction of 23.0%, which is 6.1%, 6.0%, 10.0%, 15.6%, and 9.5% less than that observed with conventional tools. Finally, this study unveils the dynamic mechanism of cooling medium transfer in the flow field at the biomimetic microgrinding tool–bone interface. This research provides theoretical guidance and technical support for clinical bone resection surgery.
Funding: Supported by the National Natural Science Foundation of China (Grant No. 52205455) and the Fujian Provincial Health Technology Project (Grant Nos. 2022CXA005, 2022CXA015).
Abstract: Cardiovascular disease is the leading cause of human mortality, and calcified tissue blocking blood vessels is the main cause of major adverse cardiovascular events (MACE). Rotational atherectomy (RA) is a minimally invasive catheter-based treatment that removes calcified tissue by high-speed cutting with miniature tools. However, the cutting forces, heat, and debris can induce tissue damage and give rise to serious surgical complications. To enhance the effectiveness and efficiency of RA, a novel eccentric rotational cutting tool, with one side comprising axially and circumferentially staggered micro-blades, was designed and fabricated in this study. In addition, a series of experiments was conducted to analyze its performance across five dimensions: tool kinematics, force, temperature, debris, and surface morphology of the specimens. Experimental results show that the force, temperature and debris size of the novel tool were well suppressed at the highest rotational speed. For the tool of standard clinical size (diameter 1.25 mm), the maximum force is 0.75 N, with a maximum temperature rise in the operation area of 1.09 °C. The debris size followed a normal distribution, with 90% of debris measuring smaller than 9.12 μm. All tool metrics met clinical safety requirements, indicating superior performance. This study provides a new approach to the design of calcified-tissue removal tools and contributes positively to the advancement of RA.