Weather forecasts from numerical weather prediction models play a central role in solar energy forecasting, where a cascade of physics-based models is used in a model chain approach to convert forecasts of solar irradiance to solar power production. Ensemble simulations from such weather models aim to quantify uncertainty in the future development of the weather, and can be used to propagate this uncertainty through the model chain to generate probabilistic solar energy predictions. However, ensemble prediction systems are known to exhibit systematic errors, and thus require post-processing to obtain accurate and reliable probabilistic forecasts. The overarching aim of our study is to systematically evaluate different strategies to apply post-processing in model chain approaches with a specific focus on solar energy: not applying any post-processing at all; post-processing only the irradiance predictions before the conversion; post-processing only the solar power predictions obtained from the model chain; or applying post-processing in both steps. In a case study based on a benchmark dataset for the Jacumba solar plant in the U.S., we develop statistical and machine learning methods for post-processing ensemble predictions of global horizontal irradiance (GHI) and solar power generation. Further, we propose a neural-network-based model for direct solar power forecasting that bypasses the model chain. Our results indicate that post-processing substantially improves the solar power generation forecasts, in particular when post-processing is applied to the power predictions. The machine learning methods for post-processing slightly outperform the statistical methods, and the direct forecasting approach performs comparably to the post-processing strategies.
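The four strategies enumerated above amount to different placements of a calibration step around the irradiance-to-power conversion. The following minimal sketch illustrates this, assuming a toy multiplicative bias correction and a placeholder conversion function; the plant capacity, function names, and the calibration itself are hypothetical stand-ins, not the statistical or machine learning post-processing models developed in the study.

```python
import numpy as np

def pv_model_chain(ghi):
    # placeholder irradiance-to-power conversion; a real model chain stacks
    # several physics-based models (transposition, PV cell, inverter, ...)
    capacity_mw = 20.0  # hypothetical plant capacity
    return capacity_mw * np.clip(ghi / 1000.0, 0.0, 1.0)

def calibrate(ens, scale=1.05):
    # toy post-processing step (simple multiplicative bias correction)
    return ens * scale

ghi_ens = np.random.default_rng(0).gamma(2.0, 200.0, size=50)  # 50-member GHI ensemble, W/m^2

power_none  = pv_model_chain(ghi_ens)                        # no post-processing at all
power_ghi   = pv_model_chain(calibrate(ghi_ens))             # post-process irradiance only
power_power = calibrate(pv_model_chain(ghi_ens))             # post-process power only
power_both  = calibrate(pv_model_chain(calibrate(ghi_ens)))  # post-process both steps
```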
Laser additive manufacturing (LAM) of titanium (Ti) alloys has emerged as a transformative technology with vast potential across multiple industries. To recap the state of the art, Ti alloys processed by two essential LAM techniques (i.e., laser powder bed fusion and laser-directed energy deposition) will be reviewed, covering the aspects of processes, materials and post-processing. The impacts of process parameters and strategies for optimizing parameters will be elucidated. Various types of Ti alloys processed by LAM, including α-Ti, (α+β)-Ti, and β-Ti alloys, will be overviewed in terms of microstructures and benchmarking properties. Furthermore, the post-processing methods for improving the performance of LAM-processed Ti alloys, including conventional and novel heat treatment, hot isostatic pressing, and surface processing (e.g., ultrasonic and laser shot peening), will be systematically reviewed and discussed. The review summarizes the process windows, properties, and performance envelopes and benchmarks the research achievements in LAM of Ti alloys. Outlooks on further trends in LAM of Ti alloys are also highlighted at the end of the review. This comprehensive review could serve as a valuable resource for researchers and practitioners, promoting further advancements in LAM-built Ti alloys and their applications.
In the present computational fluid dynamics (CFD) community, post-processing is regarded as a procedure to view parameter distributions, detect characteristic structures and reveal the physical mechanisms of fluid flow based on computational or experimental results. Field plots by contours, iso-surfaces, streamlines, vectors and others are traditional post-processing techniques. The shock wave, however, as one important and critical flow structure in many aerodynamic problems, can hardly be detected or distinguished in a direct way using these traditional methods, due to possible confusion with other similar discontinuous flow structures such as slip lines and contact discontinuities. Therefore, methods for automatic detection of shock waves in post-processing are of great importance for both academic research and engineering applications. In this paper, the current status of methodologies developed for shock wave detection and their implementation in post-processing platforms is reviewed, together with discussions on the advantages and limitations of the existing methods and proposals for further studies of shock wave detection. We also develop an advanced post-processing software package with improved shock detection.
Quantum random number generators adopting single-photon detection have been restricted due to the non-negligible dead time of avalanche photodiodes (APDs). We propose a new approach based on an APD array to improve the generation rate of random numbers significantly. This method compares the detectors' responses to consecutive optical pulses and generates the random sequence. We implement a demonstration experiment to show its simplicity, compactness and scalability. The generated numbers are proved to be unbiased, post-processing free, and ready to use, and their randomness is verified by using the National Institute of Standards and Technology statistical test suite. The random bit generation efficiency is as high as 32.8% and the potential generation rate adopting the 32×32 APD array is up to tens of Gbits/s.
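The abstract does not spell out the comparison rule for consecutive pulses; one plausible reading is a von Neumann-style pairing of detection outcomes, sketched below for a single APD pixel. The pairing rule, function name, and simulated click probability are assumptions for illustration, not necessarily the authors' exact scheme.

```python
import numpy as np

def extract_bits(clicks):
    # pair consecutive pulse outcomes and keep only pairs whose two responses differ;
    # this removes the bias of the raw click probability (von Neumann extraction)
    pairs = clicks[: clicks.size // 2 * 2].reshape(-1, 2)
    differing = pairs[:, 0] != pairs[:, 1]
    return pairs[differing, 0].astype(np.uint8)

clicks = np.random.default_rng(1).random(10_000) < 0.3  # simulated, biased detection record
bits = extract_bits(clicks)
print(bits.size, round(bits.mean(), 3))  # surviving bits are unbiased: mean is close to 0.5
```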
To improve the ability to detect underwater targets in a strong wideband interference environment, an efficient method of line spectrum extraction is proposed, which fully utilizes the feature of the target spectrum that a highly intense and stable line spectrum is superimposed on a wide continuous spectrum. This method modifies the traditional beamforming algorithm by calculating and fusing the beamforming results over multiple frequency bands and azimuth intervals, providing an excellent way to extract the line spectrum when the interference and the target are not in the same azimuth interval. The statistical efficiency of the estimated azimuth variance and the corresponding power of the line spectrum band depends on the line spectra ratio (LSR) of the line spectrum. The variation of the output signal-to-noise ratio (SNR) with the LSR, the input SNR, the integration time and the filtering bandwidth of the different algorithms yields the selection principle for the critical LSR. On this basis, the detection gains of wideband energy integration and the narrowband line spectrum algorithm are analyzed theoretically. The simulated detection gain matches the theoretical model well. The application conditions of all methods are verified by the receiver operating characteristic (ROC) curve and experimental data from Qiandao Lake; in fact, combining the two methods for target detection reduces the missed detection rate. The proposed two-dimensional post-processing method, with a Kalman filter in the time dimension and a background equalization algorithm in the azimuth dimension, makes use of the strong correlation between adjacent frames and can further remove background fluctuation and improve the display effect.
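For reference, the sketch below computes conventional frequency-domain beam power on a uniform linear array for each (azimuth, frequency band) cell, which is the quantity that the proposed method fuses across bands and azimuth intervals. The array spacing, sound speed, and band edges are illustrative assumptions, and the fusion and line-spectrum selection logic of the proposed method is not reproduced.

```python
import numpy as np

def band_beam_power(x, fs, d, c, az_grid, bands):
    # x: (n_sensors, n_samples) time series from a uniform linear array
    n_sensors, n_samples = x.shape
    X = np.fft.rfft(x, axis=1)
    freqs = np.fft.rfftfreq(n_samples, 1.0 / fs)
    sensors = np.arange(n_sensors)[:, None]
    power = np.zeros((len(az_grid), len(bands)))
    for b, (f_lo, f_hi) in enumerate(bands):
        idx = (freqs >= f_lo) & (freqs < f_hi)
        f_sel = freqs[idx][None, :]
        for a, theta in enumerate(az_grid):
            # plane-wave steering phases for arrival angle theta (radians)
            steer = np.exp(-2j * np.pi * f_sel * d * sensors * np.sin(theta) / c)
            power[a, b] = np.sum(np.abs(np.sum(np.conj(steer) * X[:, idx], axis=0)) ** 2)
    return power  # beam power per azimuth and frequency band, ready for fusion

# example: 16-element array, 0.5 m spacing, 1500 m/s sound speed, three bands
x = np.random.default_rng(2).normal(size=(16, 4096))
p = band_beam_power(x, fs=8000.0, d=0.5, c=1500.0,
                    az_grid=np.linspace(-np.pi / 2, np.pi / 2, 91),
                    bands=[(100, 500), (500, 1000), (1000, 2000)])
```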
Travel time data collection methods are used to assist congestion management. The use of traditional sensors (e.g., inductive loops, AVI sensors) or more recent Bluetooth sensors installed on major roads for collecting data is not sufficient because of their limited coverage and the expensive costs of installation and maintenance. Application of the Global Positioning System (GPS) to travel time and delay data collection has proven to be efficient in terms of accuracy, level of detail of the data, and the man-power required for data collection. While data collection automation is improved by the GPS technique, human errors can easily find their way into the post-processing phase, and therefore data post-processing remains a challenge, especially in the case of big projects with large amounts of data. This paper introduces a stand-alone post-processing tool called GPS Calculator, which provides an easy-to-use environment to carry out data post-processing. This is a Visual Basic application that processes the data files obtained in the field and integrates them into Geographic Information Systems (GIS) for analysis and representation. The results show that this tool obtains similar results to the currently used data post-processing method, reduces the post-processing effort, and also eliminates the need for a second person during data collection.
In order to carry out numerical simulation using geologic structural data obtained from Landmark (a seismic interpretation system), underground geological structures are abstracted into mechanical models which can reflect actual situations and facilitate computation and analysis. Given the importance of model building, further processing methods for traditional seismic interpretation results from Landmark should be studied so that the processed results can be used directly in numerical simulation computations. Through this data conversion procedure, Landmark and FLAC (the international general stress software) are seamlessly connected. Thus, the format conversion between the two systems and the pre- and post-processing in simulation computation are realized. A practical application indicates that this method has many advantages, such as simple operation, high accuracy of the element subdivision and high speed, which may definitely satisfy the actual needs of floor grid cutting.
When castings become complicated and the demands for precision of numerical simulation become higher, the numerical data of casting simulation become more massive. On a general personal computer, these massive numerical data may exceed the capacity of available memory, resulting in failure of rendering. Based on the out-of-core technique, this paper proposes a method to effectively utilize external storage and reduce memory usage dramatically, so as to solve the problem of insufficient memory for massive data rendering on general personal computers. Based on this method, a new post-processor is developed. It is capable of illustrating the filling and solidification processes of casting, as well as thermal stress. The new post-processor also provides fast interaction with simulation results. Theoretical analysis as well as several practical examples prove that the memory usage and loading time of the post-processor are independent of the size of the relevant files and depend only on the proportion of cells on the surface. Meanwhile, the speed of rendering and of fetching values with the mouse is appreciable, and the demands of real-time display and interaction are satisfied.
Low contrast of Magnetic Resonance (MR) images limits the visibility of subtle structures and adversely affects the outcome of both subjective and automated diagnosis. State-of-the-art contrast boosting techniques intolerably alter the inherent features of MR images. Drastic changes in brightness features induced by post-processing are not appreciated in medical imaging, as the grey level values have certain diagnostic meanings. To overcome these issues, this paper proposes an algorithm that enhances the contrast of MR images while preserving the underlying features as well. This method, termed Power-law and Logarithmic Modification-based Histogram Equalization (PLMHE), partitions the histogram of the image into two sub-histograms after a power-law transformation and a log compression. After a modification intended to improve the dispersion of the sub-histograms and subsequent normalization, cumulative histograms are computed. Enhanced grey level values are computed from the resultant cumulative histograms. The performance of the PLMHE algorithm is compared with traditional histogram equalization based algorithms, and it has been observed from the results that PLMHE can boost image contrast without causing dynamic range compression, a significant change in mean brightness, or contrast overshoot.
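The processing chain described above (power-law transform, log compression, histogram split, per-segment cumulative mapping) can be sketched roughly as follows, assuming an 8-bit input and omitting the dispersion-modification step of the published PLMHE; parameter values and function names are illustrative only.

```python
import numpy as np

def plmhe_like(img, gamma=0.8, bins=256):
    # simplified sketch: power-law transform, log compression, split the histogram
    # into two sub-histograms and equalize each via its cumulative histogram
    x = img.astype(np.float64)
    x = x / x.max()
    t = np.log1p(x ** gamma)        # power-law transformation followed by log compression
    t = t / t.max()
    split = float(t.mean())         # split point between the two sub-histograms
    out = np.empty_like(t)
    for mask, lo, hi in [(t <= split, 0.0, split), (t > split, split, 1.0)]:
        vals = t[mask]
        if vals.size == 0:
            continue
        hist, edges = np.histogram(vals, bins=bins, range=(lo, hi))
        cdf = np.cumsum(hist).astype(np.float64)
        cdf = cdf / cdf[-1]
        # map each segment onto itself through its cumulative histogram
        out[mask] = lo + (hi - lo) * np.interp(vals, edges[:-1], cdf)
    return np.clip(out * 255.0, 0, 255).astype(np.uint8)

mr_slice = np.random.default_rng(0).integers(0, 120, size=(128, 128)).astype(np.uint8)  # stand-in for a low-contrast MR slice
enhanced = plmhe_like(mr_slice)
```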
This paper proposes improvements to the low bit rate parametric audio coder with a sinusoidal model as its kernel. Firstly, we propose a new method to effectively order and select the perceptually most important sinusoids: the sinusoid which contributes most to the reduction of the overall noise-to-mask ratio (NMR) is chosen. Combined with our improved parametric psychoacoustic model and advanced peak riddling techniques, the number of sinusoids required can be greatly reduced and the coding efficiency greatly enhanced. A lightweight version is also given to reduce the amount of computation with only a small sacrifice in performance. Secondly, we propose two enhancement techniques for sinusoid synthesis: bandwidth enhancement and line enhancement. With little overhead, the effective bandwidth can be extended by one more octave; the timbre tends to sound much brighter, thicker and more pleasant.
In the analysis of high-rise buildings, traditional displacement-based plane elements are often used to obtain the in-plane internal forces of shear walls by stress integration. Limited by the singularity problem produced by wall holes and the loss of precision induced by using the differential method to derive strains, the displacement-based elements cannot always provide sufficient accuracy for design. In this paper, a hybrid post-processing procedure based on the Hellinger-Reissner variational principle is used to improve the stress precision of two quadrilateral plane elements. In order to find the best stress field, three different forms are assumed for the displacement-based plane elements with drilling DOF. Numerical results show that, by using the proposed method, the accuracy of the stress solutions of these two displacement-based plane elements can be improved.
Developing reliable weather prediction systems is still a challenging task due to the complexity of the Earth system and the chaotic behavior of its components. Small errors introduced by observations, their assimilation and the forecast model configuration escalate chaotically, leading to a significant loss in forecast skill with time. Traditionally, rainfall forecasts have been generated at grid-based spatial resolutions, providing valuable information on regional precipitation patterns. However, weather varies markedly within a grid box, and forecasts for specific sites have inevitably failed on occasion. Grid-based forecasts may not always meet the needs of decision-makers at specific points of interest. The major challenge is dealing with sub-grid variability, that is to say, the variation seen among rainfall point values within a given model grid box, especially in convective situations. While ensemble forecasts have shown promise in capturing the uncertainty inherent in average rainfall predictions over much larger grid boxes, their utility at point locations has not been extensively explored. Most evaluation studies focus on grid-based verification metrics, which may not accurately reflect forecast performance at individual points of interest. EcPoint, a post-processing approach developed at the European Centre for Medium-Range Weather Forecasts (ECMWF), is tailored to forecast rainfall at point locations. In this study, we evaluate the performance of the EcPoint post-processing method over the south China region. The analysis focuses on the reliability, accuracy and discrimination skill of this post-processing method over three provinces (Anhui, Zhejiang and Jiangsu). We examine performance versus lead time, season, and altitude. Through verification, the study highlights the added value of the post-processing method over the Raw ensemble forecasts. One year of verification demonstrates that, between the Raw ensemble and post-processed EcPoint forecasts, EcPoint is the more reliable and skillful system, adding significant value to most rainfall events occurring during the day and the associated seasonal events, as well as topography-associated rainfall events. To complement the one-year verification analysis, a case study was conducted on an extremely heavy rainfall event observed on June 2, 2022 at 12 UTC. The analysis demonstrated EcPoint's ability to provide more localized and refined forecasts, whereas the Raw ensemble did not indicate any possibility of rainfall, particularly at short lead times. At longer lead times, EcPoint ensembles maintained relatively low probabilities but offered improved performance in capturing rainfall variability, while the Raw ensemble exhibited broader but less precise rainfall predictions with a tendency to over-warn in some areas. Future work can extend the evaluation to more diverse climatic and topographic regions of China to enhance the general applicability of the method. Although based solely on the global ECMWF-IFS model, EcPoint performs well over the small domain of south China (three provinces). Besides verifying EcPoint, the study confirms that the post-processing method can significantly improve forecast performance.
Previous research utilizing Cartoon Generative Adversarial Network (CartoonGAN) has encountered limitations in managing intricate outlines and accurately representing lighting effects, particularly in complex scenes requiring detailed shading and contrast. This paper presents a novel Enhanced Pixel Integration (EPI) technique designed to improve the visual quality of images generated by CartoonGAN. Rather than modifying the core model, the EPI approach employs post-processing adjustments that enhance images without significant computational overhead. In this method, images produced by CartoonGAN are converted from Red-Green-Blue (RGB) to Hue-Saturation-Value (HSV) format, allowing for precise adjustments in hue, saturation, and brightness, thereby improving color fidelity. Specific correction values are applied to fine-tune colors, ensuring they closely match the original input while maintaining the characteristic, stylized effect of CartoonGAN. The corrected images are blended with the originals to retain aesthetic appeal and visual distinctiveness, resulting in improved color accuracy and overall coherence. Experimental results demonstrate that EPI significantly increases similarity to original input images compared to the standard CartoonGAN model, achieving a 40.14% enhancement in visual similarity in Learned Perceptual Image Patch Similarity (LPIPS), a 30.21% improvement in structural consistency in Structural Similarity Index Measure (SSIM), and an 11.81% reduction in pixel-level error in Mean Squared Error (MSE). By addressing limitations present in the traditional CartoonGAN pipeline, EPI offers practical enhancements for creative applications, particularly within media and design fields where visual fidelity and artistic style preservation are critical. These improvements align with the goals of Fog and Edge Computing, which also seek to enhance processing efficiency and application performance in sensitive industries such as healthcare, logistics, and education. This research not only resolves key deficiencies in existing CartoonGAN models but also expands its potential applications in image-based content creation, bridging gaps between technical constraints and creative demands. Future studies may explore the adaptability of EPI across various datasets and artistic styles, potentially broadening its impact on visual transformation tasks.
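A rough sketch of the RGB-to-HSV correction and blending described above is given below; the correction gains and blend weight are hypothetical constants chosen for illustration, whereas the published EPI derives its correction values from the input.

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv, hsv_to_rgb

def epi_like_correction(cartoon_rgb, original_rgb, sat_gain=1.1, val_gain=1.05, blend=0.7):
    # hypothetical gains and blend weight; inputs are uint8 RGB arrays of equal shape
    c = cartoon_rgb.astype(np.float64) / 255.0
    o = original_rgb.astype(np.float64) / 255.0
    hsv = rgb_to_hsv(c)
    hsv[..., 1] = np.clip(hsv[..., 1] * sat_gain, 0, 1)   # saturation adjustment
    hsv[..., 2] = np.clip(hsv[..., 2] * val_gain, 0, 1)   # brightness adjustment
    corrected = hsv_to_rgb(hsv)
    blended = blend * corrected + (1 - blend) * o          # blend with the original input
    return (np.clip(blended, 0, 1) * 255).astype(np.uint8)

rng = np.random.default_rng(0)
cartoon = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)   # stand-in for a CartoonGAN output
original = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)  # stand-in for the input photo
result = epi_like_correction(cartoon, original)
```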
Metal halide perovskites have become one of the most competitive new-generation optoelectronic materials due to their excellent optoelectronic properties. Vacuum evaporation can produce high-purity and large-area films, leading to the wide application of this method in the semiconductor industry and the optoelectronics field. However, the electroluminescent performance of vacuum-evaporated perovskite light-emitting diodes (PeLEDs) still lags behind that of counterparts fabricated by solution methods. Herein, based on vacuum evaporation, 3D perovskite films are obtained by three-source co-evaporation. The unique quantum well structure of quasi-2D perovskites can significantly enhance the exciton binding energy and improve the radiative recombination rate, leading to a high photoluminescence quantum yield (PLQY). Accordingly, the highly stable and low-defect-density quasi-2D perovskite is introduced into the 3D perovskite films through post-treatment with phenethylammonium chloride (PEACl). To minimize the degradation of film quality caused by the PEACl treatment, a layer of guanidinium bromide (GABr) is vacuum evaporated on top of the PEACl treatment to further improve the quality of the emitting layer. Finally, under the synergistic post-processing modification of PEACl and GABr, blue PeLEDs with a maximum external quantum efficiency (EQE) of 6.09% and a maximum brightness of 1325 cd/m² are successfully obtained. This work deepens the understanding of 2D/3D heterojunctions and provides a new approach to constructing PeLEDs with high performance.
The concept of local shock strength and a quantitative measure index str of local shock strength are proposed, derived from the oblique shock relation and the monotonic relationship between the total pressure loss ratio and the normal Mach number. Utilizing the high density gradient characteristic of shock waves and the oblique shock relation, a post-processing algorithm for two-dimensional flow field data is developed. The objective of the post-processing algorithm is to obtain specific shock wave location coordinates and calculate the corresponding str from flow field data under the calibration of the oblique shock relation. Validation of this post-processing algorithm is conducted using a standard model example that can be solved analytically. Combining the concept of local shock strength with the post-processing algorithm, a local shock strength quantitative mapping approach is established for the first time. This approach enables a quantitative measure and visualization of local shock strength at distinct locations, represented by color mapping on the shock structures. The approach can be applied to post-processing numerical simulation data of two-dimensional flows. Applications to the intersection of two left-running oblique shock waves (straight shock waves), the bow shock in front of a cylinder (curved shock wave), and Mach reflection (mixed straight and curved shock waves) demonstrate the accuracy and effectiveness of the mapping approach in investigating diverse shock wave phenomena. The quantitative mapping approach for str may be a valuable tool in the design of supersonic/hypersonic vehicles and the exploration of shock wave evolution.
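The two ingredients named above are standard gas-dynamics relations, shown here for a calorically perfect gas with specific heat ratio γ and shock angle β (textbook relations, not the authors' definition of str):

\[
M_{n1} = M_1 \sin\beta, \qquad
\frac{p_{0,2}}{p_{0,1}} =
\left[\frac{(\gamma+1)M_{n1}^{2}}{(\gamma-1)M_{n1}^{2}+2}\right]^{\gamma/(\gamma-1)}
\left[\frac{\gamma+1}{2\gamma M_{n1}^{2}-(\gamma-1)}\right]^{1/(\gamma-1)} .
\]

The total pressure ratio decreases monotonically as the normal Mach number increases beyond 1, which is what allows a local strength measure to be read off from the normal Mach number; the specific index str built on this relationship is defined by the authors and is not reproduced here.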
Magnesium (Mg) and its alloys are emerging as structural materials for the aerospace, automobile, and electronics industries, driven by the imperative of weight reduction. They are also drawing notable attention in the medical industries owing to their biodegradability and a low elastic modulus comparable to that of bone. The ability to manufacture near-net-shape products featuring intricate geometries has sparked huge interest in additive manufacturing (AM) of Mg alloys, reflecting a transformation in the manufacturing sectors. However, AM of Mg alloys presents formidable challenges due to inherent properties, particularly susceptibility to oxidation, gas trapping, a high thermal expansion coefficient, and a low solidification temperature. This leads to defects such as porosity, lack of fusion, cracking, delamination, residual stresses, and inhomogeneity, ultimately influencing the mechanical, corrosion, and surface properties of AM Mg alloys. To address these issues, post-processing of AM Mg alloys is often needed to make them suitable for application. The present article reviews all post-processing techniques adapted for AM Mg alloys to date, including heat treatment, hot isostatic pressing, friction stir processing, and surface peening. The utilization of these methods within hybrid AM processes employing interlayer post-processing is also discussed. Optimal post-processing conditions are reported, and their influence on the microstructure, mechanical, and corrosion properties is detailed. Additionally, future prospects and research directions are proposed.
Despite the maturity of ensemble numerical weather prediction (NWP), the resulting forecasts are still, more often than not, under-dispersed. As such, forecast calibration tools have become popular. Among those tools, quantile regression (QR) is highly competitive in terms of both flexibility and predictive performance. Nevertheless, a long-standing problem of QR is quantile crossing, which greatly limits the interpretability of QR-calibrated forecasts. On this point, this study proposes a non-crossing quantile regression neural network (NCQRNN) for calibrating ensemble NWP forecasts into a set of reliable quantile forecasts without crossing. The overarching design principle of NCQRNN is to add on top of the conventional QRNN structure another hidden layer, which imposes a non-decreasing mapping between the combined output from nodes of the last hidden layer to the nodes of the output layer, through a triangular weight matrix with positive entries. The empirical part of the work considers a solar irradiance case study, in which four years of ensemble irradiance forecasts at seven locations, issued by the European Centre for Medium-Range Weather Forecasts, are calibrated via NCQRNN, as well as via an eclectic mix of benchmarking models, ranging from the naïve climatology to the state-of-the-art deep-learning and other non-crossing models. Formal and stringent forecast verification suggests that the forecasts post-processed via NCQRNN attain the maximum sharpness subject to calibration, amongst all competitors. Furthermore, the proposed conception to resolve quantile crossing is remarkably simple yet general, and thus has broad applicability as it can be integrated with many shallow- and deep-learning-based neural networks.
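One minimal way to realize the non-crossing constraint described above is to form the output quantiles as cumulative sums of non-negative contributions, which corresponds to applying a triangular matrix with positive entries; the NumPy sketch below (forward pass only, with illustrative names and sizes) shows that idea rather than the exact NCQRNN layer.

```python
import numpy as np

def softplus(x):
    # numerically stable log(1 + exp(x)); keeps weights/contributions positive
    return np.maximum(x, 0.0) + np.log1p(np.exp(-np.abs(x)))

def non_crossing_quantiles(h, W_raw, b0):
    # h: activations of the last hidden layer; W_raw: unconstrained (K, n_hidden) weights
    increments = softplus(W_raw) @ softplus(h)      # K non-negative contributions
    L = np.tril(np.ones((W_raw.shape[0],) * 2))     # lower-triangular matrix of positive entries
    return b0 + L @ increments                      # cumulative sums -> non-decreasing quantiles

rng = np.random.default_rng(0)
q = non_crossing_quantiles(rng.normal(size=8), rng.normal(size=(9, 8)), b0=50.0)
assert np.all(np.diff(q) >= 0)                      # quantiles cannot cross by construction
```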
Large-scale components of steel and aluminum alloys (Fe-Al) with high bonding strength are highly needed, from space exploration to the fabrication of transportation systems. The formation of detrimental intermetallic compounds at the Al-Fe interface has limited the application range of Fe-Al components. A modified friction stir additive manufacturing process was developed for fabricating large-scale Fe-Al components with homogeneously distributed interfacial amorphous layers rather than detrimental intermetallic compounds. The interfacial amorphous layers comprised an Mg-O-rich amorphous layer <20 nm in thickness and an Al-Fe-Si amorphous layer <120 nm in thickness. The interfacial amorphous layers exhibited high thermal stability and did not change even after a post-processing heat treatment of heating at 500 °C for 20 min and aging at 170 °C for 7 h. The tensile strengths of the Fe-Al tensile specimens increased from 160 to 250 MPa after application of the post-processing heat treatment. Fracture occurred in the aluminum alloy instead of at the dissimilar metal interface, demonstrating that the high bonding strength at the Al-Fe interface was enabled by the formation of homogeneously distributed interfacial amorphous layers.
Association rule learning (ARL) is a widely used technique for discovering relationships within datasets. However, it often generates excessive irrelevant or ambiguous rules. Therefore, post-processing is crucial not only for removing irrelevant or redundant rules but also for uncovering hidden associations that impact other factors. Recently, several post-processing methods have been proposed, each with its own strengths and weaknesses. In this paper, we propose THAPE (Tunable Hybrid Associative Predictive Engine), which combines descriptive and predictive techniques. By leveraging both techniques, our aim is to enhance the quality of analyzing generated rules. This includes removing irrelevant or redundant rules, uncovering interesting and useful rules, exploring hidden association rules that may affect other factors, and providing backtracking ability for a given product. The proposed approach offers a tailored method that suits specific goals for retailers, enabling them to gain a better understanding of customer behavior based on factual transactions in the target market. We applied THAPE to a real dataset as a case study in this paper to demonstrate its effectiveness. Through this application, we successfully mined a concise set of highly interesting and useful association rules. Out of the 11,265 rules generated, we identified 125 rules that are particularly relevant to the business context. These identified rules significantly improve the interpretability and usefulness of association rules for decision-making purposes.
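As an illustration of the descriptive pruning side, the sketch below drops a rule whenever a simpler rule with the same consequent and a subset antecedent already achieves at least the same confidence, one common redundancy criterion; it is a stand-in for this single step, not THAPE's full hybrid descriptive-predictive pipeline, and the rule data are invented.

```python
def prune_redundant_rules(rules):
    # keep a rule only if no already-kept rule with a subset antecedent and the
    # same consequent reaches at least the same confidence
    kept = []
    for r in sorted(rules, key=lambda r: len(r["antecedent"])):
        redundant = any(
            k["consequent"] == r["consequent"]
            and k["antecedent"] <= r["antecedent"]
            and k["confidence"] >= r["confidence"]
            for k in kept
        )
        if not redundant:
            kept.append(r)
    return kept

rules = [
    {"antecedent": frozenset({"bread"}), "consequent": "butter", "confidence": 0.80},
    {"antecedent": frozenset({"bread", "milk"}), "consequent": "butter", "confidence": 0.78},
]
print(prune_redundant_rules(rules))  # the two-item antecedent rule is pruned
```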
Vat photopolymerization (VPP) 3D printing technology has broken through mold limitations and shown great potential for manufacturing complex-structured ceramic cores for turbine blades. However, improving dimensional accuracy is difficult for VPP 3D-printed parts due to high contraction deformation, and reducing shrinkage is a key challenge in developing 3D-printed ceramic cores. In this study, 3D-printed alumina ceramic cores with near-zero shrinkage in the X direction were achieved for the first time using a novel approach called atmosphere-controlled in-situ oxidation of aluminum powder. The in-situ oxidation reaction of the aluminum powder was creatively tuned by changing the atmosphere transition temperature from argon to air. The microstructure and properties of the ceramic core could then be controlled by liquid-phase sintering with the participation of atmosphere-protected molten aluminum. As a result, the pore size of the ceramic cores was increased almost tenfold, while the bonding strength of the grains was also increased. In addition, the powder consolidation generated by the action of molten aluminum was considered to be an important reason for the reduced linear shrinkage of the ceramic cores. Under the optimized parameters, the linear shrinkage of the ceramic cores was as low as 0.3% in the X direction. A high apparent porosity (45.02%) and flexural strength (72.7 MPa) of the alumina ceramic cores were realized at the same time. In-situ control of sintering by changing the atmosphere will be a creative method for regulating the properties of ceramic materials.
基金the Young Investigator Group“Artificial Intelligence for Probabilistic Weather Forecasting”funded by the Vector Stiftungfunding from the Federal Ministry of Education and Research(BMBF)and the Baden-Württemberg Ministry of Science as part of the Excellence Strategy of the German Federal and State Governments。
文摘Weather forecasts from numerical weather prediction models play a central role in solar energy forecasting,where a cascade of physics-based models is used in a model chain approach to convert forecasts of solar irradiance to solar power production.Ensemble simulations from such weather models aim to quantify uncertainty in the future development of the weather,and can be used to propagate this uncertainty through the model chain to generate probabilistic solar energy predictions.However,ensemble prediction systems are known to exhibit systematic errors,and thus require post-processing to obtain accurate and reliable probabilistic forecasts.The overarching aim of our study is to systematically evaluate different strategies to apply post-processing in model chain approaches with a specific focus on solar energy:not applying any post-processing at all;post-processing only the irradiance predictions before the conversion;post-processing only the solar power predictions obtained from the model chain;or applying post-processing in both steps.In a case study based on a benchmark dataset for the Jacumba solar plant in the U.S.,we develop statistical and machine learning methods for postprocessing ensemble predictions of global horizontal irradiance(GHI)and solar power generation.Further,we propose a neural-network-based model for direct solar power forecasting that bypasses the model chain.Our results indicate that postprocessing substantially improves the solar power generation forecasts,in particular when post-processing is applied to the power predictions.The machine learning methods for post-processing slightly outperform the statistical methods,and the direct forecasting approach performs comparably to the post-processing strategies.
基金financially supported by the 2022 MTC Young Individual Research Grants under Singapore Research,Innovation and Enterprise(RIE)2025 Plan(No.M22K3c0097)the Natural Science Foundation of US(No.DMR-2104933)the sponsorship of the China Scholarship Council(No.202106130051)。
文摘Laser additive manufacturing(LAM)of titanium(Ti)alloys has emerged as a transformative technology with vast potential across multiple industries.To recap the state of the art,Ti alloys processed by two essential LAM techniques(i.e.,laser powder bed fusion and laser-directed energy deposition)will be reviewed,covering the aspects of processes,materials and post-processing.The impacts of process parameters and strategies for optimizing parameters will be elucidated.Various types of Ti alloys processed by LAM,includingα-Ti,(α+β)-Ti,andβ-Ti alloys,will be overviewed in terms of micro structures and benchmarking properties.Furthermore,the post-processing methods for improving the performance of L AM-processed Ti alloys,including conventional and novel heat treatment,hot isostatic pressing,and surface processing(e.g.,ultrasonic and laser shot peening),will be systematically reviewed and discussed.The review summarizes the process windows,properties,and performance envelopes and benchmarks the research achievements in LAM of Ti alloys.The outlooks of further trends in LAM of Ti alloys are also highlighted at the end of the review.This comprehensive review could serve as a valuable resource for researchers and practitioners,promoting further advancements in LAM-built Ti alloys and their applications.
文摘In the present computational fluid dynamics (CFD) community, post-processing is regarded as a procedure to view parameter distribution, detect characteristic structure and reveal physical mechanism of fluid flow based on computational or experimental results. Field plots by contours, iso-surfaces, streamlines, vectors and others are traditional post-processing techniques. While the shock wave, as one important and critical flow structure in many aerodynamic problems, can hardly be detected or distinguished in a direct way using these traditional methods, due to possible confusions with other similar discontinuous flow structures like slip line, contact discontinuity, etc. Therefore, method for automatic detection of shock wave in post-processing is of great importance for both academic research and engineering applications. In this paper, the current status of methodologies developed for shock wave detection and implementations in post-processing platform are reviewed, as well as discussions on advantages and limitations of the existing methods and proposals for further studies of shock wave detection method. We also develop an advanced post-processing software, with improved shock detection.
基金Supported by the Chinese Academy of Sciences Center for Excellence and Synergetic Innovation Center in Quantum Information and Quantum Physics,Shanghai Branch,University of Science and Technology of Chinathe National Natural Science Foundation of China under Grant No 11405172
文摘Quantum random number generators adopting single negligible dead time of avalanche photodiodes (APDs) photon detection have been restricted due to the non- We propose a new approach based on an APD array to improve the generation rate of random numbers significantly. This method compares the detectors' responses to consecutive optical pulses and generates the random sequence. We implement a demonstration experiment to show its simplicity, compactness and scalability. The generated numbers are proved to be unbiased, post-processing free, ready to use, and their randomness is verified by using the national institute of standard technology statistical test suite. The random bit generation efficiency is as high as 32.8% and the potential generation rate adopting the 32× 32 APD array is up to tens of Gbits/s.
基金supported by the National Natural Science Foundation of China(51875535)the Natural Science Foundation for Young Scientists of Shanxi Province(201701D221017,201901D211242)。
文摘To improve the ability of detecting underwater targets under strong wideband interference environment,an efficient method of line spectrum extraction is proposed,which fully utilizes the feature of the target spectrum that the high intense and stable line spectrum is superimposed on the wide continuous spectrum.This method modifies the traditional beam forming algorithm by calculating and fusing the beam forming results at multi-frequency band and multi-azimuth interval,showing an excellent way to extract the line spectrum when the interference and the target are not in the same azimuth interval simultaneously.Statistical efficiency of the estimated azimuth variance and corresponding power of the line spectrum band depends on the line spectra ratio(LSR)of the line spectrum.The change laws of the output signal to noise ratio(SNR)with the LSR,the input SNR,the integration time and the filtering bandwidth of different algorithms bring the selection principle of the critical LSR.As the basis,the detection gain of wideband energy integration and the narrowband line spectrum algorithm are theoretically analyzed.The simulation detection gain demonstrates a good match with the theoretical model.The application conditions of all methods are verified by the receiver operating characteristic(ROC)curve and experimental data from Qiandao Lake.In fact,combining the two methods for target detection reduces the missed detection rate.The proposed post-processing method in2-dimension with the Kalman filter in the time dimension and the background equalization algorithm in the azimuth dimension makes use of the strong correlation between adjacent frames,could further remove background fluctuation and improve the display effect.
文摘The travel time data collection method is used to assist the congestion management. The use of traditional sensors (e.g. inductive loops, AVI sensors) or more recent Bluetooth sensors installed on major roads for collecting data is not sufficient because of their limited coverage and expensive costs for installation and maintenance. Application of the Global Positioning Systems (GPS) in travel time and delay data collections is proven to be efficient in terms of accuracy, level of details for the data and required data collection of man-power. While data collection automation is improved by the GPS technique, human errors can easily find their way through the post-processing phase, and therefore data post-processing remains a challenge especially in case of big projects with high amount of data. This paper introduces a stand-alone post-processing tool called GPS Calculator, which provides an easy-to-use environment to carry out data post-processing. This is a Visual Basic application that processes the data files obtained in the field and integrates them into Geographic Information Systems (GIS) for analysis and representation. The results show that this tool obtains similar results to the currently used data post-processing method, reduces the post-processing effort, and also eliminates the need for the second person during the data collection.
基金Projects 50221402, 50490271 and 50025413 supported by the National Natural Science Foundation of Chinathe National Basic Research Program of China (2009CB219603, 2009 CB724601, 2006CB202209 and 2005CB221500)+1 种基金the Key Project of the Ministry of Education (306002)the Program for Changjiang Scholars and Innovative Research Teams in Universities of MOE (IRT0408)
文摘In order to carry out numerical simulation using geologic structural data obtained from Landmark(seismic interpretation system), underground geological structures are abstracted into mechanical models which can reflect actual situations and facilitate their computation and analyses.Given the importance of model building, further processing methods about traditional seismic interpretation results from Landmark should be studied and the processed result can then be directly used in numerical simulation computations.Through this data conversion procedure, Landmark and FLAC(the international general stress software) are seamlessly connected.Thus, the format conversion between the two systems and the pre-and post-processing in simulation computation is realized.A practical application indicates that this method has many advantages such as simple operation, high accuracy of the element subdivision and high speed, which may definitely satisfy the actual needs of floor grid cutting.
基金supported by the New Century Excellent Talents in University(NCET-09-0396)the National Science&Technology Key Projects of Numerical Control(2012ZX04014-031)+1 种基金the Natural Science Foundation of Hubei Province(2011CDB279)the Foundation for Innovative Research Groups of the Natural Science Foundation of Hubei Province,China(2010CDA067)
文摘When castings become complicated and the demands for precision of numerical simulation become higher,the numerical data of casting numerical simulation become more massive.On a general personal computer,these massive numerical data may probably exceed the capacity of available memory,resulting in failure of rendering.Based on the out-of-core technique,this paper proposes a method to effectively utilize external storage and reduce memory usage dramatically,so as to solve the problem of insufficient memory for massive data rendering on general personal computers.Based on this method,a new postprocessor is developed.It is capable to illustrate filling and solidification processes of casting,as well as thermal stess.The new post-processor also provides fast interaction to simulation results.Theoretical analysis as well as several practical examples prove that the memory usage and loading time of the post-processor are independent of the size of the relevant files,but the proportion of the number of cells on surface.Meanwhile,the speed of rendering and fetching of value from the mouse is appreciable,and the demands of real-time and interaction are satisfied.
基金This work was supported by Taif university Researchers Supporting Project Number(TURSP-2020/114),Taif University,Taif,Saudi Arabia.
文摘Low contrast of Magnetic Resonance(MR)images limits the visibility of subtle structures and adversely affects the outcome of both subjective and automated diagnosis.State-of-the-art contrast boosting techniques intolerably alter inherent features of MR images.Drastic changes in brightness features,induced by post-processing are not appreciated in medical imaging as the grey level values have certain diagnostic meanings.To overcome these issues this paper proposes an algorithm that enhance the contrast of MR images while preserving the underlying features as well.This method termed as Power-law and Logarithmic Modification-based Histogram Equalization(PLMHE)partitions the histogram of the image into two sub histograms after a power-law transformation and a log compression.After a modification intended for improving the dispersion of the sub-histograms and subsequent normalization,cumulative histograms are computed.Enhanced grey level values are computed from the resultant cumulative histograms.The performance of the PLMHE algorithm is comparedwith traditional histogram equalization based algorithms and it has been observed from the results that PLMHE can boost the image contrast without causing dynamic range compression,a significant change in mean brightness,and contrast-overshoot.
文摘This paper proposed improvements to the low bit rate parametric audio coder with sinusoid model as its kernel. Firstly, we propose a new method to effectively order and select the perceptually most important sinusoids. The sinusoid which contributes most to the reduction of overall NMR is chosen. Combined with our improved parametric psychoacoustic model and advanced peak riddling techniques, the number of sinusoids required can be greatly reduced and the coding efficiency can be greatly enhanced. A lightweight version is also given to reduce the amount of computation with only little sacrifice of performance. Secondly, we propose two enhancement techniques for sinusoid synthesis: bandwidth enhancement and line enhancement. With little overhead, the effective bandwidth can be extended one more octave; the timbre tends to sound much brighter, thicker and more beautiful.
文摘In the analysis of high-rise building, traditional displacement-based plane elements are often used to get the in-plane internal forces of the shear walls by stress integration. Limited by the singular problem produced by wall holes and the loss of precision induced by using differential method to derive strains, the displacement-based elements cannot always present accuracy enough for design. In this paper, the hybrid post-processing procedure based on the Hellinger-Reissner variational principle is used for improving the stress precision of two quadrilateral plane elements. In order to find the best stress field, three different forms are assumed for the displacement-based plane elements and with drilling DOF. Numerical results show that by using the proposed method, the accuracy of stress solutions of these two displacement-based plane elements can be improved.
文摘Developing reliable weather prediction systems is still a challenging task due to the complexity of the Earth System and the chaotic behavior of its compo-nents.Small errors introduced by observations,their assimilation and the forecast model configuration escalate chaotically,leading to a significant loss in forecast skill with time.Traditionally,rainfall forecasts have been generated at grid-based spatial resolutions,providing valuable information on regional precipitation patterns.However,Weather varies markedly within a grid box and forecasts for specific sites have occasionally failed inevitably.The grid-based forecasts may not always meet the needs of decision-makers at specific points of interest.The major challenge is dealing with variations in sub-grid variability,that is to say,the variation seen amongst rainfall point values within a given model grid box,more especially in convective situations.While ensemble forecasts have shown promise in capturing the uncertainty inherent in average rainfall predictions of much larger grid boxes,their utility at point locations has not been extensively explored.Most evaluation studies focus on grid-based verification metrics,which may not accurately reflect forecast per-formance at individual points of interest.EcPoint,a post-processing approach developed at the European Centre for Medium Range Forecasts(ECMWF),is tailored to forecast rainfall at point locations.In this study,we evaluate the performance of the EcPoint post-processing method over the south China re-gion.The analysis focuses on the reliability,accuracy and discrimination skill of this post-processing method over the three provinces in south China(An-hui,Zhejiang and Jiangsu).We examine performance versus lead time,sea-sons,and altitude.Through verifications,the study highlighted the added value of the post-processing method over Raw ensemble forecasts.One year of verification demonstrates that,between the Raw ensemble and post-pro-cessed EcPoint forecasts,EcPoint is the more reliable and skillful system,add-ing significant value to most rainfall events occurring during the day and the seasonal associated events,as well as the topography-associated rainfall events.To complement the one-year verification analysis,a case study was conducted on an extremely heavy rainfall event observed on June 2,2022 at 12 UTC.The analysis demonstrated EcPoint’s ability to provide more localized and refined forecasts whereas Raw didn’t provide any possibility of rainfall,particularly at short lead times.At longer lead times,EcPoint ensembles maintained rela-tively low probabilities,but offered improved performance in capturing rain-fall variability,while Raw ensemble exhibited broader but less precise rainfall predictions with a tendency of over warning of some areas.Future work can extend the evaluation to more diverse climatic and topographic regions of China to enhance the general applicability of the method.Although based solely on the global ECMWF-IFS model,EcPoint performs well over the small domain of south China(three provinces).Besides verifying EcPoint,the study confirms that the post-processing method can significantly improve the fore-cast performance.
基金supported by the National Research Foundation of Korea(NRF)under Grant RS-2022-NR-069955(2022R1A2C1092178).
文摘Previous research utilizing Cartoon Generative Adversarial Network(CartoonGAN)has encountered limitations in managing intricate outlines and accurately representing lighting effects,particularly in complex scenes requiring detailed shading and contrast.This paper presents a novel Enhanced Pixel Integration(EPI)technique designed to improve the visual quality of images generated by CartoonGAN.Rather than modifying the core model,the EPI approach employs post-processing adjustments that enhance images without significant computational overhead.In this method,images produced by CartoonGAN are converted from Red-Green-Blue(RGB)to Hue-Saturation-Value(HSV)format,allowing for precise adjustments in hue,saturation,and brightness,thereby improving color fidelity.Specific correction values are applied to fine-tune colors,ensuring they closely match the original input while maintaining the characteristic,stylized effect of CartoonGAN.The corrected images are blended with the originals to retain aesthetic appeal and visual distinctiveness,resulting in improved color accuracy and overall coherence.Experimental results demonstrate that EPI significantly increases similarity to original input images compared to the standard CartoonGAN model,achieving a 40.14%enhancement in visual similarity in Learned Perceptual Image Patch Similarity(LPIPS),a 30.21%improvement in structural consistency in Structural Similarity Index Measure(SSIM),and an 11.81%reduction in pixel-level error in Mean Squared Error(MSE).By addressing limitations present in the traditional CartoonGAN pipeline,EPI offers practical enhancements for creative applications,particularly within media and design fields where visual fidelity and artistic style preservation are critical.These improvements align with the goals of Fog and Edge Computing,which also seek to enhance processing efficiency and application performance in sensitive industries such as healthcare,logistics,and education.This research not only resolves key deficiencies in existing CartoonGAN models but also expands its potential applications in image-based content creation,bridging gaps between technical constraints and creative demands.Future studies may explore the adaptability of EPI across various datasets and artistic styles,potentially broadening its impact on visual transformation tasks.
基金supported by the National Key Research and Development Program of China(No.2022YFA1204800)the National Natural Science Foundation of China(Grant No.U2001219)+1 种基金Hubei Provincial Natural Science Foundation of China(No.2023AFA034)the Key R&D program of Hubei Province(No.2023BAB102).
文摘Metal halide perovskites have become one of the most competitive new-generation optoelectronic materials due to their excellent optoelectronic properties. Vacuum evaporation can produce high-purity and large-area films, leading to the wide application of this method in the semiconductor industry and optoelectronics field. However, the electroluminescent performance of vacuum-evaporated perovskite light-emitting diodes(PeLEDs) still lags behind those counterparts fabricated by solution methods. Herein, based on vacuum evaporation, 3D perovskite films are obtained by three-source co-evaporation.Considering the unique quantum well structure of quasi-2D perovskite can significantly enhance the exciton binding energy and improve the radiative recombination rate, leading to a high photoluminescence quantum yield(PLQY). Subsequently,the highly stable and low-defect-density quasi-2D perovskite is introduced into 3D perovskite films through post-treatment with phenethylammonium chloride(PEACl). To minimize the degradation of film quality caused by PEACl treatment, a layer of guanidinium bromide(GABr) is vacuum evaporated on top of PEACl treatment to further improve the quality of emitting layer. Finally, under the synergistic post-processing modification of PEACl and GABr, blue PeLEDs with a maximum external quantum efficiency(EQE) of 6.09% and a maximum brightness of 1325 cd/m^(2) are successfully obtained. This work deepens the understanding of 2D/3D heterojunctions and provides a new approach to construct PeLEDs with high performance.
Funding: Supported by the National Natural Science Foundation of China (Grant No. 12372233), the Fund of the NPU-Duke China Seed Program (Grant No. 119003067), and the "111 Project" of China (Grant No. B17037-106).
Abstract: The concept of local shock strength and a quantitative measure, the index str of local shock strength, are proposed, derived from the oblique shock relation and the monotonic relationship between the total pressure loss ratio and the normal Mach number. Utilizing the high density-gradient characteristic of shock waves and the oblique shock relation, a post-processing algorithm for two-dimensional flow field data is developed. The objective of the algorithm is to obtain specific shock wave location coordinates and to calculate the corresponding str from flow field data under the calibration of the oblique shock relation. Validation of this post-processing algorithm is conducted using a standard model example that can be solved analytically. Combining the concept of local shock strength with the post-processing algorithm, a local shock strength quantitative mapping approach is established for the first time. This approach enables a quantitative measure and visualization of local shock strength at distinct locations, represented by color mapping on the shock structures, and can be applied to post-processing numerical simulation data of two-dimensional flows. Applications to the intersection of two left-running oblique shock waves (straight shock waves), the bow shock in front of a cylinder (curved shock wave), and Mach reflection (mixed straight and curved shock waves) demonstrate the accuracy and effectiveness of the mapping approach in investigating diverse shock wave phenomena. The quantitative mapping approach of str may be a valuable tool in the design of supersonic/hypersonic vehicles and the exploration of shock wave evolution.
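The abstract does not spell out the exact definition of str, but the monotonic relationship it builds on, between the total pressure loss ratio and the normal Mach number, is standard gas dynamics. The sketch below assumes a calorically perfect gas with γ = 1.4 and uses the normal-shock total pressure loss as an illustrative strength measure; the paper's index str and its calibration may differ.

```python
import numpy as np

GAMMA = 1.4  # ratio of specific heats for air (assumption)

def total_pressure_ratio(m_n):
    """Total pressure ratio p02/p01 across a normal shock (Rankine-Hugoniot),
    valid for normal Mach number m_n >= 1; decreases monotonically with m_n."""
    m2 = np.asarray(m_n, dtype=float) ** 2
    term1 = ((GAMMA + 1.0) * m2 / ((GAMMA - 1.0) * m2 + 2.0)) ** (GAMMA / (GAMMA - 1.0))
    term2 = ((GAMMA + 1.0) / (2.0 * GAMMA * m2 - (GAMMA - 1.0))) ** (1.0 / (GAMMA - 1.0))
    return term1 * term2

def shock_strength_measure(m_n):
    """Illustrative local strength measure: total pressure loss ratio,
    which is zero at m_n = 1 and grows monotonically with m_n."""
    return 1.0 - total_pressure_ratio(m_n)

if __name__ == "__main__":
    for m in (1.0, 1.5, 2.0, 3.0):
        print(f"M_n = {m:.1f}: total pressure loss ratio = {shock_strength_measure(m):.4f}")
```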
Abstract: Magnesium (Mg) and its alloys are emerging as structural materials for the aerospace, automobile, and electronics industries, driven by the imperative of weight reduction. They are also drawing notable attention in the medical industries owing to their biodegradability and a low elastic modulus comparable to that of bone. The ability to manufacture near-net-shape products featuring intricate geometries has sparked huge interest in additive manufacturing (AM) of Mg alloys, reflecting a transformation in the manufacturing sectors. However, AM of Mg alloys presents formidable challenges due to inherent properties, particularly susceptibility to oxidation, gas trapping, a high thermal expansion coefficient, and a low solidification temperature. These lead to defects such as porosity, lack of fusion, cracking, delamination, residual stresses, and inhomogeneity, ultimately influencing the mechanical, corrosion, and surface properties of AM Mg alloys. To address these issues, post-processing of AM Mg alloys is often needed to make them suitable for application. The present article reviews all post-processing techniques adapted for AM Mg alloys to date, including heat treatment, hot isostatic pressing, friction stir processing, and surface peening. The utilization of these methods within hybrid AM processes, employing interlayer post-processing, is also discussed. Optimal post-processing conditions are reported, and their influence on the microstructure, mechanical, and corrosion properties is detailed. Additionally, future prospects and research directions are proposed.
Funding: Supported by the National Natural Science Foundation of China (Project No. 42375192), the China Meteorological Administration Climate Change Special Program (CMA-CCSP, Project No. QBZ202315), and the Vector Stiftung through the Young Investigator Group "Artificial Intelligence for Probabilistic Weather Forecasting".
Abstract: Despite the maturity of ensemble numerical weather prediction (NWP), the resulting forecasts are still, more often than not, under-dispersed. As such, forecast calibration tools have become popular. Among those tools, quantile regression (QR) is highly competitive in terms of both flexibility and predictive performance. Nevertheless, a long-standing problem of QR is quantile crossing, which greatly limits the interpretability of QR-calibrated forecasts. On this point, this study proposes a non-crossing quantile regression neural network (NCQRNN) for calibrating ensemble NWP forecasts into a set of reliable quantile forecasts without crossing. The overarching design principle of NCQRNN is to add, on top of the conventional QRNN structure, another hidden layer that imposes a non-decreasing mapping from the combined output of the nodes of the last hidden layer to the nodes of the output layer, through a triangular weight matrix with positive entries. The empirical part of the work considers a solar irradiance case study, in which four years of ensemble irradiance forecasts at seven locations, issued by the European Centre for Medium-Range Weather Forecasts, are calibrated via NCQRNN, as well as via an eclectic mix of benchmarking models, ranging from the naïve climatology to state-of-the-art deep-learning and other non-crossing models. Formal and stringent forecast verification suggests that the forecasts post-processed via NCQRNN attain the maximum sharpness subject to calibration among all competitors. Furthermore, the proposed conception for resolving quantile crossing is remarkably simple yet general, and thus has broad applicability, as it can be integrated with many shallow- and deep-learning-based neural networks.
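A minimal sketch of the non-crossing output construction described above, assuming it can be reduced to a base quantile plus positive increments accumulated through a lower-triangular matrix with positive (here, unit) entries; the exact NCQRNN parameterization, with trainable positive weights inside a full network, may differ.

```python
import numpy as np

def softplus(x):
    # numerically stable softplus: maps real values to positive increments
    return np.logaddexp(0.0, x)

def non_crossing_quantiles(z):
    """Map raw outputs z (shape: n_samples x n_quantiles) to non-decreasing
    quantile forecasts. The first output is the lowest quantile; subsequent
    quantiles add positive increments, implemented as a lower-triangular
    matrix of ones acting on positive-transformed outputs."""
    base = z[:, :1]                                      # unconstrained lowest quantile
    increments = softplus(z[:, 1:])                      # strictly positive increments
    tri = np.tril(np.ones((z.shape[1], z.shape[1])))     # triangular matrix, positive entries
    stacked = np.concatenate([base, increments], axis=1)
    return stacked @ tri.T                               # cumulative sum -> sorted quantiles

# toy check: arbitrary raw network outputs become non-crossing quantiles
rng = np.random.default_rng(0)
q = non_crossing_quantiles(rng.normal(size=(3, 5)))
assert np.all(np.diff(q, axis=1) >= 0), "quantiles must not cross"
print(q)
```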
Funding: Supported by the National Natural Science Foundation of China (Nos. 52375396, 52034005, and 51975553), the Liaoning Provincial Department of Science and Technology (No. 2023JH2/101300149), the Shenyang Science and Technology Bureau (No. 22-315-6-03), the Institute of Metal Research, Chinese Academy of Sciences (No. 2023-ZD02-01), the Liaoning Province Excellent Youth Foundation (No. 2021-YQ-01), the Program of the Youth Innovation Promotion Association of the Chinese Academy of Sciences (No. Y2021061), and the Bintech-IMR R&D Program (No. GYY-JSBU-2022-002).
Abstract: Large-scale components of steel and aluminum alloys (Fe-Al) with high bonding strength are highly needed in applications ranging from space exploration to the fabrication of transportation systems. The formation of detrimental intermetallic compounds at the Al-Fe interface has limited the application range of Fe-Al components. A modified friction stir additive manufacturing process was developed for fabricating large-scale Fe-Al components with homogeneously distributed interfacial amorphous layers rather than detrimental intermetallic compounds. The interfacial amorphous layers comprised an Mg-O-rich amorphous layer <20 nm in thickness and an Al-Fe-Si amorphous layer <120 nm in thickness. The interfacial amorphous layers exhibited high thermal stability and did not change even after a post-processing heat treatment of heating at 500℃ for 20 min and aging at 170℃ for 7 h. The tensile strength of the Fe-Al specimens increased from 160 to 250 MPa after the application of the post-processing heat treatment. Fracture occurred in the aluminum alloy instead of at the dissimilar-metal interface, demonstrating that the high bonding strength at the Al-Fe interface was enabled by the formation of homogeneously distributed interfacial amorphous layers.
Abstract: Association rule learning (ARL) is a widely used technique for discovering relationships within datasets. However, it often generates excessive irrelevant or ambiguous rules. Therefore, post-processing is crucial not only for removing irrelevant or redundant rules but also for uncovering hidden associations that impact other factors. Recently, several post-processing methods have been proposed, each with its own strengths and weaknesses. In this paper, we propose THAPE (Tunable Hybrid Associative Predictive Engine), which combines descriptive and predictive techniques. By leveraging both techniques, our aim is to enhance the quality of the analysis of generated rules. This includes removing irrelevant or redundant rules, uncovering interesting and useful rules, exploring hidden association rules that may affect other factors, and providing backtracking ability for a given product. The proposed approach offers a tailored method that suits retailers' specific goals, enabling them to gain a better understanding of customer behavior based on factual transactions in the target market. We applied THAPE to a real dataset as a case study to demonstrate its effectiveness. Through this application, we successfully mined a concise set of highly interesting and useful association rules: out of the 11,265 rules generated, we identified 125 rules that are particularly relevant to the business context. These identified rules significantly improve the interpretability and usefulness of association rules for decision-making purposes.
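By way of illustration only (this is not the THAPE algorithm itself), the sketch below shows the kind of descriptive post-processing such an engine builds on: dropping weak rules by confidence and lift thresholds, then removing redundant rules whose antecedent is a superset of a simpler rule with the same consequent and no lower confidence. All names, thresholds, and example rules are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    antecedent: frozenset
    consequent: frozenset
    support: float
    confidence: float
    lift: float

def prune_rules(rules, min_conf=0.6, min_lift=1.2):
    """Generic rule post-processing: keep strong rules, then drop redundant
    ones (a rule is redundant if a strictly simpler rule with the same
    consequent achieves at least the same confidence)."""
    strong = [r for r in rules if r.confidence >= min_conf and r.lift >= min_lift]
    kept = []
    for r in strong:
        redundant = any(
            other is not r
            and other.consequent == r.consequent
            and other.antecedent < r.antecedent        # strictly simpler antecedent
            and other.confidence >= r.confidence
            for other in strong
        )
        if not redundant:
            kept.append(r)
    return kept

rules = [
    Rule(frozenset({"bread"}), frozenset({"butter"}), 0.20, 0.80, 1.6),
    Rule(frozenset({"bread", "milk"}), frozenset({"butter"}), 0.10, 0.75, 1.5),  # redundant
    Rule(frozenset({"soap"}), frozenset({"butter"}), 0.02, 0.30, 0.9),           # weak
]
for r in prune_rules(rules):
    print(sorted(r.antecedent), "->", sorted(r.consequent), r.confidence)
```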
Funding: Supported by the National Natural Science Foundation of China (Nos. 52130204 and 52174376), the Guangdong Basic and Applied Basic Research Foundation (No. 2021B1515120028), the Science and Technology Innovation Team Plan of Shaanxi Province (No. 2021TD-17), the Youth Innovation Team of Shaanxi Universities, the Xi'an Science and Technology Program (No. 21ZCZZHXJS-QCY6-0005), the Fundamental Research Funds for the Central Universities (Nos. D5000230348 and D5000210902), the TQ Innovation Foundation (No. 23-TQ09-02-ZT-01-005), the Aeronautical Science Foundation of China (No. 20220042053001), the Thousands Person Plan of Jiangxi Province (No. JXSQ2020102131), and the Innovation Foundation for Doctor Dissertation of Northwestern Polytechnical University (No. CX2022033), China.
Abstract: Vat photopolymerization (VPP) 3D printing technology has broken through mold limitations and shows great potential for manufacturing complex-structured ceramic cores for turbine blades. However, improving dimensional accuracy is difficult for VPP 3D-printed parts due to their high contraction deformation, so reducing shrinkage is a key challenge for developing 3D-printed ceramic cores. In this study, 3D-printed alumina ceramic cores with near-zero shrinkage in the X direction were achieved for the first time using a novel approach termed atmosphere-controlled in-situ oxidation of aluminum powder. The in-situ oxidation reaction of the aluminum powder was tuned by changing the atmosphere transition temperature from argon to air. The microstructure and properties of the ceramic core could then be controlled by liquid-phase sintering with the participation of atmosphere-protected molten aluminum. As a result, the pore size of the ceramic cores was increased by almost ten times, while the bonding strength of the grains was also increased. In addition, the powder consolidation generated by the action of molten aluminum was considered to be an important reason for the reduced linear shrinkage of the ceramic cores. Under the optimized parameters, the linear shrinkage of the ceramic cores was as low as 0.3% in the X direction, and a high apparent porosity (45.02%) and flexural strength (72.7 MPa) of the alumina ceramic cores were realized at the same time. In-situ control of sintering by changing the atmosphere offers a creative method for regulating the properties of ceramic materials.