Data-driven process monitoring is an effective approach to assuring the safe operation of modern manufacturing and energy systems, such as the thermal power plants studied in this work. Industrial processes are inherently dynamic and need to be monitored with dynamic algorithms. Mainstream dynamic algorithms rely on concatenating the current measurement with past data. This work proposes a new, alternative dynamic process monitoring algorithm based on dot product feature analysis (DPFA). DPFA computes the dot product of consecutive samples, thus naturally capturing process dynamics through temporal correlation. At the same time, DPFA's online computational complexity is lower not only than that of existing dynamic algorithms, but also than that of classical static algorithms (e.g., principal component analysis and slow feature analysis). The detectability of the new algorithm is analyzed for three types of faults typically seen in process systems: sensor bias, process faults, and gain-change faults. Through experiments with a numerical example and real data from a thermal power plant, DPFA is shown to be superior to state-of-the-art methods in terms of both monitoring performance (fault detection rate and false alarm rate) and computational complexity.
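The core dot-product idea is compact enough to sketch directly. The snippet below is illustrative only: the function name and toy data are invented here, and the paper's full DPFA algorithm (feature statistics, control limits, fault detectability analysis) goes well beyond this step.

```python
import numpy as np

def dot_product_features(X):
    """Minimal sketch of the dot-product idea behind DPFA:
    each feature is the inner product of consecutive samples,
    so temporal correlation enters at O(m) cost per sample
    (m = number of measured variables)."""
    X = np.asarray(X, dtype=float)
    # f[t] = <x_t, x_{t-1}>, defined from the second sample onward
    return np.einsum('ij,ij->i', X[1:], X[:-1])

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))   # 100 samples, 4 process variables
f = dot_product_features(X)
print(f.shape)  # (99,)
```

Because each online update touches only the current and previous sample, the per-sample cost is linear in the number of variables, which is consistent with the complexity advantage claimed above.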
Green building construction typically incurs higher costs than conventional methods. To facilitate broader adoption by construction entities, cost optimization is essential. Firms must align with technological advancements, judiciously apply emerging technologies, and ensure resource efficiency through context-specific strategies. Proactive and precise scheduling is critical to avert delays from unforeseen events. Additionally, construction units should enhance on-site safety training, promote mastery of innovative techniques, and foster environmental awareness among personnel. Finally, companies ought to capitalize on government incentives for green materials while adopting bulk procurement from local sources to minimize transportation costs and secure lower unit prices.
The textile industry, while creating material wealth, also exerts a significant impact on the environment. The problem of high energy usage is especially notable in the textile manufacturing phase, the most energy-intensive phase of the product lifecycle. Nevertheless, current analyses of carbon emissions in textile manufacturing emphasize dynamic temporal characteristics while failing to adequately consider critical information such as material flows and energy consumption. A carbon emission analysis method based on a holographic process model (HPM) is proposed to address these issues. First, the system boundary of textile manufacturing is defined and the characteristics of its carbon emissions are analyzed. Next, an HPM based on the object-centric Petri net (OCPN) is constructed, and simulation experiments are conducted on three different textile manufacturing scenarios. Subsequently, the constructed HPM is used for a multi-perspective analysis of carbon emissions. Finally, the feasibility of the method is verified using production data for pure cotton products from a textile manufacturing enterprise. The results indicate that the method can analyze the impact of various factors on the carbon emissions of pure cotton production; by applying targeted optimization strategies, carbon emissions were reduced by nearly 20%. This contributes to propelling the textile manufacturing industry toward sustainable development.
This paper presents a comprehensive review and comparative analysis of the two heavy snow to blizzard processes that occurred in the Beijing area during December 13-15, 2023 and February 20-21, 2024, covering weather situation diagnosis, forecasting, and decision-making services, and summarizes the meteorological service support experience gained from such heavy snow weather processes. It was found that both blizzard processes were jointly influenced by the 700 hPa southwesterly warm and humid jet stream and the near-surface easterly backflow; the numerical forecast described the overall snowfall process relatively accurately, and the forecast bias in the position of the 700 hPa southwesterly warm and humid jet stream determined the bias of the snowfall magnitude forecast at a given location; when a deviation between actual and forecast snowfall is found, its cause should be analyzed in a timely manner and the warning and forecast conclusions updated. With the full cooperation of the relevant departments, such updates can largely compensate for deviations in the early snowfall forecast and ensure the safety and efficiency of people's travel.
US EPA's Community Multiscale Air Quality modeling system (CMAQ) with the Process Analysis tool was used to simulate and quantify the contribution of individual atmospheric processes to PM2.5 concentration in Qingdao during three representative PM2.5 pollution events in the winters of 2015 and 2016. Compared with observed surface PM2.5 concentrations, CMAQ reasonably reproduced the temporal and spatial variations of PM2.5 during these three events. Process analysis results show that primary emissions accounted for 72.7%–93.2% of the accumulation of surface PM2.5 before and after the events. When the events occurred, primary emissions were still the major contributor to the increase of PM2.5 in Qingdao, but their contribution dropped significantly, accounting for only 51.4%–71.8%. The net contribution of horizontal and vertical transport to the accumulation of PM2.5 was also positive, and its share increased when events occurred. Only 1.1%–4.6% of aerosol accumulation was due to PM processes and aqueous chemical processes before and after the events; during the events, their contribution increased to 6.0%–11.8%. Loss of PM2.5 occurred mainly through horizontal transport, vertical transport, and dry deposition, both during and outside the events. Wet deposition became the main removal pathway of PM2.5 when precipitation occurred.
Astragali Radix, the root of Astragalus membranaceus(Fisch.) Bge. var. mongholicus(Bge.) Hsiao or Astragalus membranaceus(Fisch.) Bge., is widely used as tonic decoction pieces in the clinical practice of traditional Chinese medicine (TCM). Astragali Radix has various processed products with varying pharmacological actions, yet there is no modern scientific evidence explaining the differences in pharmacological activities and the related mechanisms. In the present study, we explored the changes in the chemical components of Astragali Radix after processing, using ultra-high performance liquid chromatography quadrupole time-of-flight mass spectrometry (UPLC-QTOF-MS) combined with the UNIFI informatics platform and multivariate statistical analysis. Our results showed that the crude and various processed products could be clearly separated in the PCA scores plot, and that 15 significant markers identified by OPLS-DA in the UNIFI platform could be used to distinguish the crude product from the various processed products. In conclusion, the present study provides a chemical-component basis for revealing the effects of different processing techniques on Astragali Radix.
Traditional data-driven fault detection methods assume a unimodal distribution of process data, so they often perform poorly in chemical processes with multiple operating modes. In order to monitor multimode chemical processes effectively, this paper presents a novel fault detection method based on local neighborhood similarity analysis (LNSA). In the proposed method, prior process knowledge is not required, and only multimode normal operation data are used to construct a reference dataset. For online monitoring of the process state, LNSA applies a moving-window technique to obtain a current snapshot data window. A neighborhood searching technique is then used to acquire the corresponding local neighborhood data window from the reference dataset. Similarity analysis between the snapshot and neighborhood data windows is performed, which includes the calculation of a principal component analysis (PCA) similarity factor and a distance similarity factor. The PCA similarity factor captures changes in data direction, while the distance similarity factor monitors shifts in the data center position. Based on these similarity factors, two monitoring statistics are built for multimode process fault detection. Finally, a simulated continuous stirred tank system is used to demonstrate the effectiveness of the proposed method. The simulation results show that LNSA can detect multimode process changes effectively and performs better than traditional fault detection methods.
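The two similarity factors can be sketched generically. The PCA similarity factor below is the Krzanowski-style subspace similarity commonly used for this purpose; the abstract does not give the exact formula or window sizes, so treat the functions and data as an assumption-laden illustration, not the paper's implementation.

```python
import numpy as np

def pca_similarity(W1, W2, k=2):
    """Krzanowski-style PCA similarity factor between two data
    windows (rows = samples).  1.0 means the top-k principal
    subspaces coincide; values near 0 mean they are orthogonal."""
    def top_dirs(W):
        Wc = W - W.mean(axis=0)
        # right singular vectors = principal directions
        _, _, Vt = np.linalg.svd(Wc, full_matrices=False)
        return Vt[:k].T                     # shape (m, k)
    L1, L2 = top_dirs(W1), top_dirs(W2)
    return np.trace(L1.T @ L2 @ L2.T @ L1) / k

def distance_factor(W1, W2):
    """Shift of the data center between the two windows."""
    return np.linalg.norm(W1.mean(axis=0) - W2.mean(axis=0))

rng = np.random.default_rng(1)
W = rng.normal(size=(50, 3))               # one 50-sample window
print(round(pca_similarity(W, W), 6))      # 1.0 for identical windows
print(distance_factor(W, W))               # 0.0
```

A snapshot window drifting away from its reference neighborhood would show the PCA factor falling below 1 (direction change) or the distance factor growing (center shift), which is the intuition behind the two monitoring statistics.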
The genetic diversity of 18 processing apple varieties and two fresh varieties was evaluated using 12 simple sequence repeat (SSR) primer pairs previously identified in Malus domestica Borkh. A total of 87 alleles at 10 loci were detected using the 10 polymorphic SSR markers selected, with 5-14 alleles per locus. All 20 varieties could be distinguished using two primer pairs, and cluster analysis divided them into four groups. The genetic similarity (GS) of the groups in the cluster analysis varied from 0.14 to 0.83. The high-acid variety Avrolles separated from the other varieties with a GS of less than 0.42. The second group contained Longfeng and Dolgo from Northeast China, which inherited genes from Chinese crab apple. The five cider varieties with high tannin contents, namely Dabinette, Frequin rouge, Kermerrien, M.Menard, and D.Coetligne, were clustered into the third group. The fourth group was mainly composed of 12 juice and fresh varieties. Principal coordinate analysis (PCO) also divided the varieties into four groups. The juice and fresh apple varieties, and Longfeng and Dolgo, were clustered together, respectively, in both analyses. Both analyses showed considerable differences between cider and juice varieties, between cider and fresh varieties, and between Chinese crab apple and western European crab apple, whereas juice and fresh varieties had a similar genetic background. Combining the two analytical methods sufficiently reflected the genetic diversity and differentiation.
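As one concrete reading of a GS value, a band-sharing coefficient over binary allele-presence profiles can be computed as below. The Dice/Nei-Li coefficient is a common choice for SSR data, but the abstract does not state which coefficient was used, so this sketch and its toy profiles are illustrative.

```python
def dice_similarity(a, b):
    """Dice (Nei-Li) similarity between two binary allele-presence
    profiles: twice the shared bands over the total band counts."""
    shared = sum(1 for x, y in zip(a, b) if x and y)
    return 2 * shared / (sum(a) + sum(b))

# hypothetical presence/absence calls over six SSR alleles
v1 = [1, 1, 0, 1, 0, 1]
v2 = [1, 0, 0, 1, 1, 1]
print(dice_similarity(v1, v2))  # 3 shared bands, 4 + 4 total -> 0.75
```

A GS matrix built this way over all variety pairs is the typical input to the clustering and PCO steps described above.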
Uncertainty analysis is an effective sensitivity analysis method for system model analysis and optimization. However, existing single-factor uncertainty analysis methods are poorly suited to logistic support systems with multiple decision-making factors. The multiple transfer parameters graphical evaluation and review technique (MTP-GERT) is used to model the logistic support process in consideration of two important factors, support activity time and support activity resources, which are the two primary causes of logistic support process uncertainty. On this basis, a global sensitivity analysis (GSA) method based on covariance is designed to analyze the logistic support process uncertainty. The aircraft support process is selected as a case application, which illustrates the validity of the proposed method for analyzing support process uncertainty, and some feasible recommendations are proposed for aircraft support decision making on carriers.
This paper, based on material processes and relational processes, aims to analyze the deeper meaning of chapter one of Pride and Prejudice. The relevant theories come first in this paper. I then analyze the extract from three aspects: the analysis of the objective plane of narration, the analysis of Mrs. Bennet's discourse, and the analysis of Mr. Bennet's discourse.
Despite spending considerable effort on the development of manufacturing technology during the production process, manufacturing companies experience resource waste and adverse ecological impacts. To reconcile energy saving with environmental conservation, a uniform way of reporting and classifying the relevant information was presented. Based on the establishment of the carbon footprint (CFP) of machine tool operation, carbon footprint per kilogram (CFK) was proposed as a normalized index to evaluate the machining process. Furthermore, a classification approach was developed as a tracking and analyzing system for the machining process. A case study was also used to illustrate the validity of the methodology. The results show that the approach is reasonable and feasible for machining process evaluation, providing a reliable reference for optimization measures in low-carbon manufacturing.
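The CFK index itself is a simple normalization: the operation's carbon footprint divided by the mass of material processed. A minimal sketch, with an illustrative electricity emission factor (the paper's CFP accounting for machine tool operation is more detailed than a single energy term):

```python
def carbon_footprint_per_kg(energy_kwh, grid_factor, mass_kg):
    """CFK sketch: operation carbon footprint (CFP) normalized by
    the mass of material processed.  grid_factor is the emission
    factor of the electricity mix in kg CO2e/kWh; all numbers
    below are illustrative, not from the paper."""
    cfp = energy_kwh * grid_factor   # kg CO2e for the operation
    return cfp / mass_kg             # kg CO2e per kg processed

print(carbon_footprint_per_kg(12.0, 0.5, 30.0))  # 0.2
```

Normalizing by mass is what makes CFK comparable across machining operations of different batch sizes, which is the point of the index.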
The rapidly increasing demand and complexity of manufacturing processes call for using manufacturing data, with the highest priority, to achieve precise analysis and control, rather than relying on simplified physical models and human expertise. In the era of data-driven manufacturing, the explosion in data volume has revolutionized how data are collected and analyzed. This paper overviews the advance of technologies developed for in-process manufacturing data collection and analysis. It can be concluded that groundbreaking sensing technology to facilitate direct measurement is one important trend in advanced data collection, given the complexity and uncertainty of indirect measurement. On the other hand, physical model-based data analysis involves inevitable simplifications and sometimes ill-posed solutions, owing to its limited capacity to describe complex manufacturing processes. Machine learning, especially deep learning, has great potential for making better decisions to automate the process when fed with abundant data, while trending data-driven manufacturing approaches have succeeded in reaching similar or even better decisions with limited data. These trends can be demonstrated by analyzing some typical applications in manufacturing processes.
Fault diagnosis and monitoring are very important for complex chemical processes. Numerous methods have been studied in this field, among which effective visualization remains challenging. In order to obtain a better visualization effect, a novel fault diagnosis method combining the self-organizing map (SOM) with Fisher discriminant analysis (FDA) is proposed. FDA reduces the dimension of the data by maximizing the separability of the classes. After feature extraction by FDA, the SOM can clearly distinguish the different states on the output map and can also be employed to monitor abnormal states. The Tennessee Eastman (TE) process is employed to illustrate the fault diagnosis and monitoring performance of the proposed method. The results show that the SOM integrated with FDA is efficient and capable of real-time monitoring and fault diagnosis in complex chemical processes.
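The FDA feature-extraction step can be sketched with plain numpy. This is a generic Fisher discriminant (no regularization, toy two-class data), not the paper's exact implementation; the resulting low-dimensional features are what the SOM would then be trained on.

```python
import numpy as np

def fda_directions(X, y, k=1):
    """Fisher discriminant directions: leading eigenvectors of
    Sw^{-1} Sb, maximizing between-class over within-class
    scatter.  Minimal sketch without regularization."""
    X, y = np.asarray(X, float), np.asarray(y)
    mu = X.mean(axis=0)
    Sw = np.zeros((X.shape[1], X.shape[1]))
    Sb = np.zeros_like(Sw)
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)          # within-class scatter
        d = (mc - mu)[:, None]
        Sb += len(Xc) * (d @ d.T)              # between-class scatter
    vals, vecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(vals.real)[::-1]
    return vecs.real[:, order[:k]]

# two well-separated synthetic classes in 2-D
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 0.1, (30, 2)), rng.normal(3, 0.1, (30, 2))])
y = np.array([0] * 30 + [1] * 30)
W = fda_directions(X, y, k=1)
z = X @ W            # 1-D discriminant features, e.g. input to an SOM
print(W.shape)       # (2, 1)
```

Because FDA compresses the data along class-separating directions first, the SOM's 2-D output map has an easier job keeping fault classes apart, which is the visualization benefit claimed above.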
The theoretical minimum and actual specific energy consumptions (SEC) of a typical manufacturing process (SMP) were studied. First, a process division of the typical SMP in question was conducted using the theory of SEC analysis. Second, an exergy analysis model of a subsystem consisting of several parallel processes and a SEC analysis model of the SMP were developed. Finally, based on these analysis models, the SEC of the SMP was analyzed in terms of statistical significance. The results show that the SEC of the typical SMP comprises the theoretical minimum SEC and an additional SEC arising from irreversibility; the SMP has a theoretical minimum SEC of 6.74 GJ/t and an additional SEC of 19.32 GJ/t, which account for 25.88% and 74.12% of the actual SEC, respectively.
[Objective] The aim was to analyze a cold wave weather process in Chengdu in March 2010. [Method] Based on NCEP 1°×1° 6-h reanalysis data and daily observation data, using synoptic analysis and diagnosis methods combined with the spring cold wave forecast index for Sichuan, a cold wave event covering the whole region between March 21 and 24, 2010 was analyzed in terms of circulation background, influencing weather systems, and weather causation. [Result] The results showed that the 500 hPa high-altitude cold vortex, the 700-850 hPa low-level shear, and the ground cold front were the main systems influencing this cold wave; at 500 hPa there was a ridge extending from Lake Balkhash across Lake Baikal. The early stage of the process was controlled by the high pressure ridge, the temperature rose obviously, and the daily mean temperature was high. The cold high pressure covered a large area with a central intensity of 1 043.0 hPa; the cold air was strong and deep, in accordance with the strong surface cooling center. The strong northerly airstream from Lake Balkhash to Lake Baikal, changes in the intensity of the ground cold high pressure center, north-south pressure and temperature differences, 850 hPa temperature changes, and the route and intensity of cold advection were considered reference factors for forecasting cold wave intensity. [Conclusion] The study provides a theoretical basis for improving the forecasting of cold wave weather.
In the past decades, on-line monitoring of batch processes using multi-way independent component analysis (MICA) has received considerable attention in both academia and industry. This paper focuses on two troublesome issues: selecting dominant independent components without a standard criterion, and determining the control limits of monitoring statistics in the presence of non-Gaussian distributions. To optimize the number of key independent components, we introduce a novel concept of system deviation, which is able to evaluate the reconstructed observations with different independent components. The monitored statistics are transformed to Gaussian-distributed data by means of the Box-Cox transformation, which helps readily determine the control limits. The proposed method is applied to on-line monitoring of a fed-batch penicillin fermentation simulator, and the experimental results indicate the advantages of the improved MICA monitoring over conventional methods.
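The Box-Cox step for setting control limits can be sketched as follows. The chi-square-like statistic stands in for a real monitoring statistic, and the 3-sigma rule in the transformed space is one common choice; the paper's simulator data and exact limit rule may differ.

```python
import numpy as np
from scipy import stats

# Stand-in for a skewed, non-Gaussian monitoring statistic
# collected under normal operation.
rng = np.random.default_rng(3)
stat = rng.chisquare(df=4, size=2000)

# Box-Cox transform toward Gaussianity; lam is fitted by MLE.
z, lam = stats.boxcox(stat)

# 3-sigma limit in the transformed space, mapped back through the
# inverse transform x = (lam*z + 1)**(1/lam) (x = exp(z) if lam = 0).
limit_z = z.mean() + 3 * z.std()
limit = (lam * limit_z + 1) ** (1 / lam) if lam != 0 else np.exp(limit_z)
print(float(limit))
```

Setting the limit in the transformed (approximately Gaussian) space and mapping it back avoids assuming Gaussianity for the raw statistic, which is exactly the difficulty the abstract raises.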
The damage smear method (DSM) is adopted to study the trans-scale progressive rock failure process, based on a statistical meso-damage model and a finite element solver. The statistical approach is utilized to reflect mesoscopic rock heterogeneity. The constitutive law of the representative volume element (RVE) is established according to continuum damage mechanics, in which a double-damage criterion is considered. The damage evolution and accumulation of RVEs are used to reveal macroscopic rock failure characteristics. Each single RVE is represented by one unique element. The initiation, propagation, and coalescence of meso- to macro-cracks are captured by smearing failed elements. These ideas are formulated into the framework of the DSM and programmed into the self-developed rock failure process analysis (RFPA) software. Two laboratory-scale examples are conducted, and the well-known engineering-scale tests, i.e., Atomic Energy of Canada Limited's (AECL's) Underground Research Laboratory (URL) tests, are used for verification. The simulation results match other experimental results and field observations.
Energy efficiency data from ethylene production equipment are high-dimensional, dynamic, and time-sequential, so their evaluation is affected by many factors. Abnormal data from ethylene production are eliminated through a consistency test, normalizing the consumption data to improve their comparability. Owing to the limits on the input and output data of a decision-making unit in data envelopment analysis (DEA), the energy efficiency data for the same technology in a given year are processed monthly using DEA. The monthly DEA results for the same technology are then weighted and fused using the analytic hierarchy process. The energy efficiency data of different technologies are evaluated by their relative effectiveness to find directions for energy saving and consumption reduction.
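The analytic-hierarchy-process weighting step can be sketched generically: weights come from the principal eigenvector of a pairwise comparison matrix. The 3x3 matrix below is an invented, perfectly consistent example, not data from the paper.

```python
import numpy as np

def ahp_weights(P):
    """AHP weights from a pairwise comparison matrix: the
    principal eigenvector, normalized to sum to 1.  Generic
    sketch of the weighting step applied to monthly DEA scores."""
    vals, vecs = np.linalg.eig(np.asarray(P, float))
    w = np.abs(vecs[:, np.argmax(vals.real)].real)
    return w / w.sum()

# consistent comparison matrix: item 1 is twice item 2, etc.
P = [[1,   2,   4],
     [1/2, 1,   2],
     [1/4, 1/2, 1]]
print(np.round(ahp_weights(P), 4))  # [0.5714 0.2857 0.1429]
```

In the workflow above, each month's DEA efficiency score would be multiplied by its AHP weight and summed to give one fused annual score per technology.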
The construction of basic wavelets is discussed and many basic analyzing wavelets are compared. A complex analyzing wavelet that is continuous, smooth, orthogonal, and exponentially decreasing is presented, and it is used to decompose two blasting seismic signals with the continuous wavelet transform (CWT). The results show that wavelet analysis is a better method than Fourier analysis for determining the essential factors that create damage effects.
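A CWT of the kind used to decompose blast-vibration signals can be sketched with a complex Morlet mother wavelet and direct convolution. The paper presents its own analyzing wavelet, so the Morlet choice, the test tone, and all parameters here are illustrative assumptions.

```python
import numpy as np

def morlet_cwt(signal, scales, w0=6.0):
    """Continuous wavelet transform with a complex Morlet mother
    wavelet, computed by direct convolution at each scale."""
    n = len(signal)
    out = np.empty((len(scales), n), dtype=complex)
    t = np.arange(-n // 2, n // 2)
    for i, s in enumerate(scales):
        tau = t / s
        # Morlet wavelet, conjugated and time-reversed for convolution
        psi = (np.pi ** -0.25) * np.exp(1j * w0 * tau - tau ** 2 / 2)
        kernel = np.conj(psi[::-1]) / np.sqrt(s)
        out[i] = np.convolve(signal, kernel, mode='same')
    return out

fs = 200.0                                 # sampling rate, Hz
t = np.arange(0, 2, 1 / fs)
x = np.sin(2 * np.pi * 10 * t)             # 10 Hz test tone
coef = morlet_cwt(x, scales=np.arange(1, 32))
print(coef.shape)  # (31, 400)
```

The magnitude of `coef` forms the scale-time map; for the 10 Hz tone its ridge sits near scale w0*fs/(2*pi*10) ≈ 19, which is how dominant frequencies of a blast signal are localized in time.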
Since there are not enough fault data in historical data sets, it is very difficult to diagnose faults in batch processes. In addition, a complete batch trajectory is only available at the end of the batch operation. In order to avoid estimating or filling in future unmeasured values during online fault diagnosis, to fully utilize the finite fault information, and to enhance diagnostic performance, an improved multi-model Fisher discriminant analysis is presented. The distinguishing trait of the proposed method is that the training data sets consist of the current measured information and the past major discriminant information, rather than only the current information or the whole batch data. An industrial, typical multi-stage streptomycin fermentation process is used to test the fault diagnosis performance of the proposed method.
基金supported in part by the National Science Fund for Distinguished Young Scholars of China(62225303)the National Natural Science Fundation of China(62303039,62433004)+2 种基金the China Postdoctoral Science Foundation(BX20230034,2023M730190)the Fundamental Research Funds for the Central Universities(buctrc202201,QNTD2023-01)the High Performance Computing Platform,College of Information Science and Technology,Beijing University of Chemical Technology
文摘Data-driven process monitoring is an effective approach to assure safe operation of modern manufacturing and energy systems,such as thermal power plants being studied in this work.Industrial processes are inherently dynamic and need to be monitored using dynamic algorithms.Mainstream dynamic algorithms rely on concatenating current measurement with past data.This work proposes a new,alternative dynamic process monitoring algorithm,using dot product feature analysis(DPFA).DPFA computes the dot product of consecutive samples,thus naturally capturing the process dynamics through temporal correlation.At the same time,DPFA's online computational complexity is lower than not just existing dynamic algorithms,but also classical static algorithms(e.g.,principal component analysis and slow feature analysis).The detectability of the new algorithm is analyzed for three types of faults typically seen in process systems:sensor bias,process fault and gain change fault.Through experiments with a numerical example and real data from a thermal power plant,the DPFA algorithm is shown to be superior to the state-of-the-art methods,in terms of better monitoring performance(fault detection rate and false alarm rate)and lower computational complexity.
文摘Green building construction typically incurs higher costs than conventional methods.To facilitate broader adoption by construction entities,cost optimization is essential.Firms must align with technological advancements,judiciously apply emerging technologies,and ensure resource efficiency through context-specific strategies.Proactive and precise scheduling is critical to avert delays from unforeseen events.Additionally,construction units should enhance on-site safety training,promote mastery of innovative techniques,and foster environmental awareness among personnel.Finally,companies ought to capitalize on government incentives for green materials while adopting bulk procurement from local sources to minimize transportation costs and secure lower unit prices.
基金National Key R&D Program of China(No.2019YFB1706300)。
文摘The textile industry,while creating material wealth,also exerts a significant impact on the environment.Particularly in the textile manufacturing phase,which is the most energy-intensive phase throughout the product lifecycle,the problem of high energy usage is increasingly notable.Nevertheless,current analyses of carbon emissions in textile manufacturing emphasize the dynamic temporal characteristics while failing to adequately consider critical information such as material flows and energy consumption.A carbon emission analysis method based on a holographic process model(HPM)is proposed to address these issues.First,the system boundary in the textile manufacturing is defined,and the characteristics of carbon emissions are analyzed.Next,an HPM based on the object-centric Petri net(OCPN)is constructed,and simulation experiments are conducted on three different scenarios in the textile manufacturing.Subsequently,the constructed HPM is utilized to achieve a multi-perspective analysis of carbon emissions.Finally,the feasibility of the method is verified by using the production data of pure cotton products from a certain textile manufacturing enterprise.The results indicate that this method can analyze the impact of various factors on the carbon emissions of pure cotton product production,and by applying targeted optimization strategies,carbon emissions have been reduced by nearly 20%.This contributes to propelling the textile manufacturing industry toward sustainable development.
文摘This paper conducted a more comprehensive review and comparative analysis of the two heavy to blizzard processes that occurred in the Beijing area during December 13-15,2023,and February 20-21,2024,in terms of comprehensive weather situation diagnosis,forecasting,and decision-making services,and summarized the meteorological service support experience of such heavy snow weather processes.It was found that both blizzard processes were jointly influenced by the 700 hPa southwesterly warm and humid jet stream and the near-surface easterly backflow;the numerical forecast was relatively accurate in the overall description of the snowfall process,and the forecast bias of the position of the 700 hPa southwesterly warm and humid jet stream determined the bias of the snowfall magnitude forecast at a certain point;when a deviation was found between the actual snowfall and the forecast,the cause should be analyzed in a timely manner,and the warning and forecast conclusions should be updated.With the full cooperation of relevant departments,it can greatly make up for the deviation of the early forecast snowfall amount,and ensure the safety and efficiency of people's travel.
基金supported by the National Natural Science Foundation of China(Nos.41430646,41305087)the Shandong Provincial Natural Science Foundation,China(No.ZR2013DQ022)+1 种基金the National Key Basic Research Program of China(No.2014CB953701)the Qingdao science and technology project(14-8-3-10-NSH)
文摘US EPA's Community Multiscale Air Quality modeling system(CMAQ) with Process Analysis tool was used to simulate and quantify the contribution of individual atmospheric processes to PM_(2.5) concentration in Qingdao during three representative PM_(2.5) pollution events in the winter of 2015 and 2016. Compared with the observed surface PM_(2.5) concentrations, CMAQ could reasonably reproduce the temporal and spatial variations of PM_(2.5) during these three events. Process analysis results show that primary emissions accounted for 72.7%–93.2% of the accumulation of surface PM_(2.5) before and after the events.When the events occurred, primary emissions were still the major contributor to the increase of PM_(2.5) in Qingdao, however the contribution percentage reduced significantly,which only account for 51.4%–71.8%. Net contribution from horizontal and vertical transport to the accumulation of PM_(2.5) was also positive and its percentage increased when events occurred. Only 1.1%–4.6% of aerosol accumulation was due to PM processes and aqueous chemical processes before and after events. When the events occurred,contribution from PM processes and aqueous chemistry increased to 6.0%–11.8%. Loss of PM_(2.5) was mainly through horizontal transport, vertical transport and dry deposition, no matter during or outside the events. Wet deposition would become the main removal pathway of PM_(2.5), when precipitation occurred.
Funding: Supported by the Construction Foundation of the State Administration of Traditional Chinese Medicine for the Processing Technology Heritage Base of TCM, and the Special Fund of the State Administration of Traditional Chinese Medicine for the Industry of TCM (No. 20110700711)
Abstract: Astragali Radix, the root of Astragalus membranaceus (Fisch.) Bge. var. mongholicus (Bge.) Hsiao or Astragalus membranaceus (Fisch.) Bge., is widely used as tonic decoction pieces in the clinic of traditional Chinese medicine (TCM). Astragali Radix has various processed products with varying pharmacological actions, yet there is no modern scientific evidence to explain the differences in pharmacological activities and the related mechanisms. In the present study, we explored the changes in chemical components of Astragali Radix after processing by ultra-high performance liquid chromatography quadrupole time-of-flight mass spectrometry (UPLC-QTOF-MS) combined with the UNIFI informatics platform and multivariate statistical analysis. Our results showed that the crude and various processed products could be clearly separated in the PCA scores plot, and 15 significant markers identified by OPLS-DA in the UNIFI platform could be used to distinguish the crude product from the various processed products. In conclusion, the present study provides a chemical-component basis for revealing the effects of different processing techniques on Astragali Radix.
Funding: Supported by the National Natural Science Foundation of China (61273160, 61403418), the Natural Science Foundation of Shandong Province (ZR2011FM014), the Fundamental Research Funds for the Central Universities (10CX04046A), and the Doctoral Fund of Shandong Province (BS2012ZZ011)
Abstract: Traditional data-driven fault detection methods assume a unimodal distribution of process data, so they often perform poorly in chemical processes with multiple operating modes. In order to monitor multimode chemical processes effectively, this paper presents a novel fault detection method based on local neighborhood similarity analysis (LNSA). In the proposed method, prior process knowledge is not required, and only the multimode normal operation data are used to construct a reference dataset. For online monitoring of the process state, LNSA applies a moving window technique to obtain a current snapshot data window. A neighborhood searching technique is then used to acquire the corresponding local neighborhood data window from the reference dataset. Similarity analysis between the snapshot and neighborhood data windows is performed, which includes the calculation of a principal component analysis (PCA) similarity factor and a distance similarity factor. The PCA similarity factor captures the change of data direction, while the distance similarity factor monitors the shift of the data center position. Based on these similarity factors, two monitoring statistics are built for multimode process fault detection. Finally, a simulated continuous stirred tank system is used to demonstrate the effectiveness of the proposed method. The simulation results show that LNSA can detect multimode process changes effectively and performs better than traditional fault detection methods.
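The two similarity factors at the heart of LNSA can be sketched in a few lines of numpy. The window sizes, number of components k, and simulated data below are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def pca_similarity(X, Y, k=2):
    """Krzanowski-style PCA similarity factor between two data windows:
    compares the subspaces spanned by the first k principal components
    (1 means identical principal directions, 0 means orthogonal)."""
    _, _, Vx = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
    _, _, Vy = np.linalg.svd(Y - Y.mean(axis=0), full_matrices=False)
    L, M = Vx[:k].T, Vy[:k].T              # loading matrices, shape (vars, k)
    return np.trace(L.T @ M @ M.T @ L) / k

def distance_similarity(X, Y):
    """Distance factor monitoring the shift of the data center position."""
    return float(np.linalg.norm(X.mean(axis=0) - Y.mean(axis=0)))

# Illustrative snapshot vs. neighborhood windows (simulated data)
rng = np.random.default_rng(0)
snapshot = rng.normal(size=(100, 4))
neighborhood = snapshot + 0.01 * rng.normal(size=(100, 4))
print(pca_similarity(snapshot, neighborhood),
      distance_similarity(snapshot, neighborhood))
```

In the paper's scheme these factors are computed per moving window and turned into two monitoring statistics; thresholding those statistics is what flags a mode change.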
Abstract: The genetic diversity of 18 processing apple varieties and two fresh varieties was evaluated using 12 simple sequence repeat (SSR) primer pairs previously identified in Malus domestica Borkh. A total of 87 alleles at 10 loci were detected using 10 polymorphic SSR markers, with 5-14 alleles per locus. All 20 varieties could be distinguished using two primer pairs, and they were divided into four groups by cluster analysis. The genetic similarity (GS) of the groups varied from 0.14 to 0.83. The high-acid variety Avrolles separated from the other varieties with GS less than 0.42. The second group contained Longfeng and Dolgo from Northeast China, which inherited genes from Chinese crab apple. The five cider varieties with high tannin contents, namely Dabinette, Frequin rouge, Kermerrien, M. Menard, and D. Coetligne, were clustered into the third group. The fourth group was mainly composed of 12 juice and fresh varieties. Principal coordinate analysis (PCO) also divided all the varieties into four groups. In both analyses, the juice and fresh apple varieties clustered together, as did Longfeng and Dolgo. Both analyses showed substantial differences between cider and juice varieties, between cider and fresh varieties, and between Chinese crab apple and western European crab apple, whereas juice and fresh varieties had a similar genetic background. The genetic diversity and differentiation could be sufficiently reflected by combining the two analytical methods.
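Grouping varieties from a genetic-similarity matrix is typically done by converting similarity to distance and applying hierarchical clustering. The sketch below uses a small hypothetical GS matrix (the values and the 0.5 cut are illustrative, not the paper's data):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Hypothetical pairwise genetic-similarity matrix for 4 varieties
# (values illustrative only, not from the SSR study).
names = ["Avrolles", "Longfeng", "Dolgo", "Dabinette"]
GS = np.array([
    [1.00, 0.30, 0.28, 0.35],
    [0.30, 1.00, 0.80, 0.40],
    [0.28, 0.80, 1.00, 0.38],
    [0.35, 0.40, 0.38, 1.00],
])
# Convert similarity to distance and cluster with UPGMA (average linkage)
D = squareform(1.0 - GS, checks=False)
Z = linkage(D, method="average")
groups = fcluster(Z, t=0.5, criterion="distance")
print(dict(zip(names, groups)))
```

With these illustrative numbers, Longfeng and Dolgo (GS 0.80) fall into one cluster while the other two varieties remain separate, mirroring how a dendrogram cut yields the groups reported in such studies.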
Funding: Supported by the National Natural Science Foundation of China (71171008)
Abstract: Uncertainty analysis is an effective sensitivity analysis method for system model analysis and optimization. However, the existing single-factor uncertainty analysis methods are not well suited to logistic support systems with multiple decision-making factors. The multiple transfer parameters graphical evaluation and review technique (MTP-GERT) is used to model the logistic support process in consideration of two important factors, support activity time and support activity resources, which are the two primary causes of logistic support process uncertainty. On this basis, a global sensitivity analysis (GSA) method based on covariance is designed to analyze the logistic support process uncertainty. The aircraft support process is selected as a case application, which illustrates the validity of the proposed method for analyzing support process uncertainty, and some feasible recommendations are proposed for carrier-based aircraft support decision making.
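A minimal sketch of a covariance-based first-order sensitivity measure, in the spirit of the GSA described above (the toy support-process model, its coefficients, and spreads are assumptions; for near-linear models this reduces to squared standardized regression coefficients):

```python
import numpy as np

def covariance_sensitivity(X, y):
    """First-order sensitivity via standardized regression coefficients:
    a covariance-based GSA sketch valid for near-linear models.
    Returns each input's estimated share of the output variance."""
    A = np.column_stack([X, np.ones(len(y))])      # add intercept column
    a, *_ = np.linalg.lstsq(A, y, rcond=None)
    return (a[:-1] ** 2) * X.var(axis=0) / y.var()

# Toy support-process model: total support time dominated by activity 1,
# whose duration varies most (spreads 3.0 > 1.0 > 0.5 are illustrative).
rng = np.random.default_rng(3)
X = rng.normal(size=(5000, 3)) * [3.0, 1.0, 0.5]   # activity time spreads
y = X @ np.array([1.0, 1.0, 1.0]) + rng.normal(0, 0.1, 5000)
S = covariance_sensitivity(X, y)
print(S)
```

The resulting indices rank the activities by their contribution to overall process uncertainty, which is the kind of ranking the paper uses to target support decision making.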
Abstract: Based on material processes and relational processes, this paper aims to analyze the deeper meaning of chapter one of Pride and Prejudice. The relevant theories come first in this paper. I then analyze the extract from three aspects: the analysis of the objective plane of narration, the analysis of Mrs. Bennet's discourse, and the analysis of Mr. Bennet's discourse.
Funding: Supported by the National Science & Technology Pillar Program during the Twelfth Five-Year Plan Period (No. 2012BAF01B02) and the National Science and Technology Major Project of China (No. 2012ZX04005031)
Abstract: Despite spending considerable effort on the development of manufacturing technology during the production process, manufacturing companies experience resource waste and adverse ecological impacts. To overcome the inconsistencies between energy saving and environmental conservation, a uniform way of reporting and classifying the information was presented. Based on the establishment of the carbon footprint (CFP) of machine tool operation, carbon footprint per kilogram (CFK) was proposed as a normalized index to evaluate the machining process. Furthermore, a classification approach was developed as a tracking and analyzing system for the machining process. In addition, a case study was used to illustrate the validity of the methodology. The results show that the approach is reasonable and feasible for machining process evaluation, and provides a reliable reference for optimization measures in low-carbon manufacturing.
Funding: Supported by the National Natural Science Foundation of China (Grant No. 51805260), the National Natural Science Foundation for Distinguished Young Scholars of China (Grant No. 51925505), and the National Natural Science Foundation of China (Grant No. 51775278).
Abstract: The rapidly increasing demand and complexity of manufacturing processes motivate the use of manufacturing data, with the highest priority, to achieve precise analysis and control, rather than relying on simplified physical models and human expertise. In the era of data-driven manufacturing, the explosion in data volume has revolutionized how data are collected and analyzed. This paper overviews the advances in technologies developed for in-process manufacturing data collection and analysis. It can be concluded that groundbreaking sensing technology to facilitate direct measurement is one important leading trend in advanced data collection, owing to the complexity and uncertainty of indirect measurement. On the other hand, physical model-based data analysis involves inevitable simplifications and sometimes ill-posed solutions, due to its limited capacity to describe complex manufacturing processes. Machine learning, and especially deep learning, has great potential for making better decisions to automate the process when fed with abundant data, while trending data-driven manufacturing approaches have succeeded in achieving similar or even better decisions using limited data. These trends can be demonstrated by analyzing some typical applications in manufacturing processes.
Funding: Supported by the National Basic Research Program of China (2013CB733600), the National Natural Science Foundation of China (21176073), the Doctoral Fund of Ministry of Education of China (20090074110005), the Program for New Century Excellent Talents in University (NCET-09-0346), the Shu Guang Project (09SG29), and the Fundamental Research Funds for the Central Universities.
Abstract: Fault diagnosis and monitoring are very important for complex chemical processes. Numerous methods have been studied in this field, among which effective visualization remains challenging. In order to achieve a better visualization effect, a novel fault diagnosis method which combines the self-organizing map (SOM) with Fisher discriminant analysis (FDA) is proposed. FDA can reduce the dimension of the data in terms of maximizing the separability of the classes. After feature extraction by FDA, SOM can distinguish the different states clearly on the output map, and it can also be employed to monitor abnormal states. The Tennessee Eastman (TE) process is employed to illustrate the fault diagnosis and monitoring performance of the proposed method. The results show that the SOM integrated with FDA is efficient and capable of real-time monitoring and fault diagnosis in complex chemical processes.
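The FDA feature-extraction step can be sketched directly from scatter matrices. The two simulated fault classes below are illustrative stand-ins for TE process data; in the paper's pipeline the projected scores would then be fed to a SOM for visualization:

```python
import numpy as np

def fda_directions(X, y, n_dirs=1):
    """Fisher discriminant directions: maximize between-class scatter
    relative to within-class scatter (a plain-numpy FDA sketch)."""
    classes = np.unique(y)
    mu = X.mean(axis=0)
    Sw = np.zeros((X.shape[1], X.shape[1]))   # within-class scatter
    Sb = np.zeros_like(Sw)                    # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        Sb += len(Xc) * np.outer(mc - mu, mc - mu)
    # Generalized eigenproblem Sb w = lambda Sw w, solved via Sw^{-1} Sb
    vals, vecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(vals.real)[::-1]
    return vecs.real[:, order[:n_dirs]]

# Two simulated operating states, separated along the first variable
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 3)), rng.normal([4, 0, 0], 1, (50, 3))])
y = np.array([0] * 50 + [1] * 50)
W = fda_directions(X, y)
scores = X @ W    # low-dimensional features; the SOM would map these
```

Because FDA maximizes class separability rather than variance, the projected scores separate the states far more cleanly than a PCA projection would for the same data.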
Funding: Sponsored by the Fundamental Research Funds for the Central Universities of China (N090602007) and the National Key Technology Research and Development Program in the 11th Five-Year Plan of China (2006BAE03A09)
Abstract: The theoretical minimum and actual specific energy consumptions (SEC) of a typical manufacturing process (SMP) were studied. Firstly, a process division of the typical SMP in question was conducted with the theory of SEC analysis. Secondly, an exergy analysis model of a subsystem consisting of several parallel processes and a SEC analysis model of the SMP were developed. Finally, based on the analysis models, the SEC of the SMP was analyzed in terms of statistical significance. The results show that the SEC of the typical SMP comprises the theoretical minimum SEC and an additional SEC derived from irreversibility, and that the SMP has a theoretical minimum SEC of 6.74 GJ/t and an additional SEC of 19.32 GJ/t, which account for 25.88% and 74.12% of the actual SEC, respectively.
Abstract: [Objective] The aim was to analyze a cold wave weather process in Chengdu in March 2010. [Method] Based on NCEP 1°×1° 6 h interval reanalysis data and daily observation data, using synoptic analysis and diagnosis methods and combining the spring cold wave forecast index of Sichuan, a cold wave event covering the whole region between March 21 and 24, 2010 was analyzed from the aspects of circulation background, influencing weather systems and weather causation. [Result] The results showed that the 500 hPa high-altitude cold vortex, the 700-850 hPa low-layer shear, and the ground cold front were the main systems influencing this cold wave; there was a ridge from Lake Balkhash across Lake Baikal at 500 hPa. The early stage of the process was controlled by the high pressure ridge, the temperature was increasing obviously, and the daily mean temperature was high. The range of the cold high pressure was large and its central intensity was 1043.0 hPa; the cold air was strong and deep, which was in accordance with the strong surface temperature-reduction center. The strong northerly airstream from Lake Balkhash to Lake Baikal, changes in the intensity of the ground cold high pressure center, north-south sea-level pressure and temperature differences, 850 hPa temperature changes, and the route and intensity of cold advection were considered reference factors for forecasting cold wave intensity. [Conclusion] The study provides a theoretical basis for improving the forecasting of cold wave weather.
Abstract: In the past decades, on-line monitoring of batch processes using multi-way independent component analysis (MICA) has received considerable attention in both academia and industry. This paper focuses on two troublesome issues: selecting dominant independent components without a standard criterion, and determining the control limits of monitoring statistics in the presence of non-Gaussian distributions. To optimize the number of key independent components, we introduce a novel concept of system deviation, which is able to evaluate the reconstructed observations with different independent components. The monitored statistics are transformed to Gaussian-distributed data by means of the Box-Cox transformation, which helps readily determine the control limits. The proposed method is applied to on-line monitoring of a fed-batch penicillin fermentation simulator, and the experimental results indicate the advantages of the improved MICA monitoring compared to conventional methods.
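The Box-Cox step for setting control limits can be sketched as follows. The lognormal statistic, the 3-sigma rule, and the sample size are illustrative assumptions, not the penicillin benchmark's actual statistics:

```python
import numpy as np
from scipy import stats
from scipy.special import inv_boxcox

# Illustrative non-Gaussian monitoring statistic (positive, right-skewed);
# simulated data, not from the fed-batch penicillin simulator.
rng = np.random.default_rng(2)
stat = rng.lognormal(mean=0.0, sigma=0.5, size=2000)

# Box-Cox transform toward Gaussianity (lambda fitted by max likelihood),
# then a conventional 3-sigma limit in the transformed domain
z, lam = stats.boxcox(stat)
limit_z = z.mean() + 3.0 * z.std()
limit = inv_boxcox(limit_z, lam)   # map the limit back to the raw scale

false_alarm_rate = float(np.mean(stat > limit))
print(lam, limit, false_alarm_rate)
```

Because the limit is set where the transformed statistic is approximately Gaussian, the in-control exceedance rate lands near the nominal 3-sigma value (about 0.13%), which is the point of applying the transformation before thresholding.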
Funding: Supported in part by the National Natural Science Foundation of China (Grant Nos. 51679028 and 51879034), the Key Laboratory for Geomechanics and Deep Underground Engineering, China University of Mining and Technology (Grant No. SKLGDUEK1804), and the Fundamental Research Funds for the Central Universities (Grant No. DUT18JC10)
Abstract: The damage smear method (DSM) is adopted to study the trans-scale progressive rock failure process, based on a statistical meso-damage model and a finite element solver. The statistical approach is utilized to reflect mesoscopic rock heterogeneity. The constitutive law of the representative volume element (RVE) is established according to continuum damage mechanics, in which a double-damage criterion is considered. The damage evolution and accumulation of RVEs are used to reveal the macroscopic rock failure characteristics. Each single RVE is represented by one unique element. The initiation, propagation and coalescence of meso- to macro-cracks are captured by smearing failed elements. The above ideas are formulated into the framework of the DSM and programmed into the self-developed rock failure process analysis (RFPA) software. Two laboratory-scale examples are conducted, and the well-known engineering-scale tests, i.e. Atomic Energy of Canada Limited's (AECL's) Underground Research Laboratory (URL) tests, are used for verification. It is shown that the simulation results match other experimental results and field observations.
Funding: Supported by the National Natural Science Foundation of China (61374166), the Doctoral Fund of Ministry of Education of China (20120010110010), and the Fundamental Research Funds for the Central Universities (YS1404)
Abstract: Energy efficiency data from ethylene production equipment are high-dimensional, dynamic and time-sequential, so their evaluation is affected by many factors. Abnormal data from ethylene production are eliminated through a consistency test, making the consumption data uniform to improve their comparability. Due to the limits on the input and output data of a decision-making unit in data envelopment analysis (DEA), the energy efficiency data from the same technology in a given year are processed monthly using DEA. The monthly DEA results for the same technology are then weighted and fused using the analytic hierarchy process. The energy efficiency data from different technologies are evaluated by their relative effectiveness to find directions for energy saving and consumption reduction.
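The weighting-and-fusion step via the analytic hierarchy process (AHP) can be sketched with a principal-eigenvector computation. The pairwise comparison matrix and the monthly DEA scores below are hypothetical illustrations, not the paper's data:

```python
import numpy as np

def ahp_weights(P):
    """AHP priority weights: principal eigenvector of the pairwise
    comparison matrix, normalized to sum to 1 (numpy sketch)."""
    vals, vecs = np.linalg.eig(P)
    w = np.abs(vecs[:, np.argmax(vals.real)].real)   # Perron eigenvector
    return w / w.sum()

# Hypothetical pairwise comparisons of three evaluation criteria
# (Saaty 1-9 scale; reciprocal entries below the diagonal)
P = np.array([
    [1.0,   3.0, 5.0],
    [1/3.0, 1.0, 2.0],
    [1/5.0, 1/2.0, 1.0],
])
w = ahp_weights(P)
dea_scores = np.array([0.92, 0.85, 0.78])   # illustrative DEA efficiencies
fused = float(w @ dea_scores)               # weighted fusion of DEA results
print(w, fused)
```

The fused score is a convex combination of the DEA efficiencies, so comparing fused scores across technologies preserves the relative-effectiveness interpretation used for identifying energy-saving directions.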
Abstract: The construction of basic wavelets was discussed and many basic analyzing wavelets were compared. A complex analyzing wavelet which is continuous, smooth, orthogonal and exponentially decreasing is presented, and it was used to decompose two blasting seismic signals with the continuous wavelet transform (CWT). The result shows that wavelet analysis is a better method than Fourier analysis for determining the essential factors which create damage effects.
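A CWT decomposition of a transient signal can be sketched with a complex Morlet wavelet (a standard choice, not necessarily the paper's wavelet; the toy burst, sampling rate, and normalization convention are assumptions):

```python
import numpy as np

def morlet_cwt(signal, scales, w0=6.0):
    """Continuous wavelet transform with a complex Morlet wavelet
    (plain-numpy sketch; normalization conventions vary in the literature)."""
    n = len(signal)
    t = np.arange(n) - n // 2
    out = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        # Complex Morlet: carrier exp(i*w0*t/s) under a Gaussian envelope,
        # L2-normalized so responses are comparable across scales
        psi = np.exp(1j * w0 * t / s) * np.exp(-0.5 * (t / s) ** 2)
        psi /= np.sqrt(s) * np.pi ** 0.25
        out[i] = np.convolve(signal, np.conj(psi)[::-1], mode="same")
    return out

# A toy "blasting" trace: a 5 Hz burst sampled at 100 Hz
fs = 100.0
tt = np.arange(0, 4, 1 / fs)
x = np.sin(2 * np.pi * 5 * tt) * np.exp(-((tt - 2) ** 2))
scales = np.arange(1, 40)
C = np.abs(morlet_cwt(x, scales))
best = int(scales[np.argmax(C.max(axis=1))])   # scale with largest response
print("dominant scale:", best)
```

For a Morlet wavelet with w0 = 6, scale s corresponds roughly to frequency w0*fs/(2*pi*s), so the 5 Hz burst should dominate near s ≈ 19; localizing such energy in both time and scale is exactly what Fourier analysis alone cannot do for transient blasting signals.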
Funding: Supported by the National Natural Science Foundation of China (No. 60421002).
Abstract: Since there are not enough fault data in historical data sets, it is very difficult to diagnose faults in batch processes. In addition, a complete batch trajectory can only be obtained at the end of its operation. In order to avoid the need for estimated or filled-in future unmeasured values in online fault diagnosis, to make full use of the limited fault information, and to enhance diagnostic performance, an improved multi-model Fisher discriminant analysis is presented. The distinguishing feature of the proposed method is that the training data sets consist of the current measured information and the past major discriminant information, rather than only the current information or the whole batch of data. An industrial, typical multi-stage streptomycin fermentation process is used to test the fault diagnosis performance of the proposed method.