The efficacy of error correction and of the various correction approaches is one of the key issues in second language writing faced by both teachers and researchers. The current paper reviews the definition of error correction and examines the different views on whether errors in L2 writing should be corrected. In particular, the paper discusses and analyses three common correction methods: direct correction, peer feedback and indirect correction. Teachers are encouraged to weigh the advantages and disadvantages of these methods against the current literature, employ the most beneficial error correction method in L2 writing, and adapt it to their teaching context.
Digital measurement and processing is an important direction in the measurement and control field. The quantization error that pervades digital processing has long been the decisive factor restricting the development and application of digital technology. In this paper, we find that the stability of a digital quantization system is markedly better than its quantization resolution. Exploiting a border effect in digital quantization can therefore greatly improve the accuracy of digital processing: the effective precision is independent of the number of quantization bits and depends only on the stability of the quantization system. The high-precision measurement results obtained from a low-level quantization system with a high sampling rate are of significant value for progress in the digital measurement and processing field.
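The central claim here, that effective precision is set by the stability of the quantization system rather than by the number of bits, can be illustrated with a generic oversampling-and-averaging experiment. The sketch below shows only that general principle, not the paper's border-effect algorithm; the quantizer step, the noise level standing in for system stability, and the sample counts are illustrative assumptions.

```python
# Illustration: averaging many samples of a coarsely quantized input that sits
# near a quantization border recovers the input well below one LSB, provided
# the system has some random instability (noise/dither) comparable to the step.
# All values below are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)
lsb = 1.0                    # quantization step of a very coarse (low-bit) quantizer
noise_sd = 0.5 * lsb         # "stability" of the quantization system, modeled as Gaussian noise
true_value = 4.37 * lsb      # input lying between two quantization levels

def quantize(x):
    # coarse uniform quantizer: round to the nearest level
    return np.round(x / lsb) * lsb

for n_samples in (1, 100, 10_000, 1_000_000):
    samples = quantize(true_value + rng.normal(0.0, noise_sd, n_samples))
    estimate = samples.mean()    # averaging across the border exposes sub-LSB detail
    print(f"n={n_samples:>9}  estimate={estimate:.4f}  error={abs(estimate - true_value):.4f}")
```

With a single sample the error can be up to half an LSB, while the averaged estimate converges far below that, even though the quantizer's bit count never changes.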
This paper points out that, with interference from their native language and culture, Chinese students will inevitably make errors in the process of learning English. It is important for teachers to know when and how to correct students' errors. By employing error correction skillfully and appropriately, teachers can expect to improve English teaching and learning and to develop students' self-confidence and self-esteem.
For actively modulated in-line Sagnac interferometric all optical fiber current transformers (AOFCTs), accuracy is directly affected by the amplitude of the modulation signal. In order to understand the function of the modulator in depth, a theoretical model of the modulation effect on AOFCTs is built in this paper. The effect of the modulation signal amplitude on the output intensity of AOFCTs is formulated theoretically and calculated numerically. The results show that variations in the modulation voltage can affect the output accuracy significantly. These results may serve as a reference for investigating practical applications of AOFCTs.
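As a rough illustration of why the modulation amplitude matters, the following sketch uses the generic textbook square-wave bias-modulation model of a Sagnac interferometer, I = (I0/2)[1 + cos(phi_F ± phi_b)], where phi_F is the Faraday phase and phi_b is set by the modulation voltage. It is not the model derived in the paper; the nominal bias of pi/2 and the drift values are assumptions.

```python
# Generic square-wave bias-modulation model of a Sagnac-type sensor, used only
# to illustrate how modulation-amplitude drift produces a scale-factor error.
import numpy as np

I0 = 1.0
phi_F = 0.01                       # small Faraday phase shift to be measured (rad)

def demodulated_output(phi_b):
    """Half-period difference of the interference intensity."""
    i_plus = 0.5 * I0 * (1 + np.cos(phi_F + phi_b))
    i_minus = 0.5 * I0 * (1 + np.cos(phi_F - phi_b))
    return i_minus - i_plus        # = I0 * sin(phi_F) * sin(phi_b)

nominal = demodulated_output(np.pi / 2)            # modulation depth set to pi/2
for drift in (0.0, 0.02, 0.05, 0.10):              # fractional drift of the modulation voltage
    out = demodulated_output(np.pi / 2 * (1 + drift))
    print(f"drift={drift:4.0%}  relative scale-factor error={out / nominal - 1:+.4%}")
```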
Single event upsets (SEUs) induced by heavy ions were observed in 65 nm SRAMs to quantitatively evaluate the applicability and effectiveness of a single-bit error correcting code (ECC) utilizing a Hamming code. The results show that the ECC improved performance dramatically, with the SEU cross sections of SRAMs with ECC on the order of 10^(-11) cm^2/bit, two orders of magnitude lower than those without ECC (on the order of 10^(-9) cm^2/bit). Failures of the ECC module, involving 1-, 2- and 3-bit errors within a single word (not multiple-bit upsets), were also detected. The ECC modules in SRAMs utilizing a (12,8) Hamming code fail once a 2-bit upset accumulates in one codeword. Finally, the probabilities of the failure modes involving 1-, 2- and 3-bit errors were calculated to be 39.39%, 37.88% and 22.73%, respectively, which agree well with the experimental results.
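To make the failure mechanism concrete, here is a minimal textbook (12,8) Hamming single-error-correcting sketch: any single upset is corrected via the syndrome, but once a second upset accumulates in the same codeword the syndrome points at the wrong position and the word is miscorrected. This is a generic construction for illustration, not the ECC circuit of the tested SRAMs.

```python
# Minimal (12,8) Hamming SEC sketch: parity bits at positions 1, 2, 4, 8.
import random

def encode(data_bits):                      # data_bits: list of 8 ints (0/1)
    code = [0] * 13                         # positions 1..12 (index 0 unused)
    data_pos = [p for p in range(1, 13) if p not in (1, 2, 4, 8)]
    for p, b in zip(data_pos, data_bits):
        code[p] = b
    for p in (1, 2, 4, 8):                  # parity over the positions each parity bit covers
        code[p] = sum(code[i] for i in range(1, 13) if i & p and i != p) % 2
    return code

def decode(code):
    syndrome = 0
    for p in (1, 2, 4, 8):
        if sum(code[i] for i in range(1, 13) if i & p) % 2:
            syndrome |= p                   # failed check contributes its position bit
    corrected = code[:]
    if 1 <= syndrome <= 12:                 # nonzero syndrome -> flip the indicated position
        corrected[syndrome] ^= 1
    return corrected, syndrome

word = [random.randint(0, 1) for _ in range(8)]
cw = encode(word)

one_bit = cw[:]; one_bit[5] ^= 1                      # single upset: corrected properly
print(decode(one_bit)[0] == cw)                       # True

two_bit = cw[:]; two_bit[3] ^= 1; two_bit[9] ^= 1     # accumulated double upset
print(decode(two_bit)[0] == cw)                       # False: a SEC-only code miscorrects
```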
For a product degradation process with random effects (RE), measurement error (ME) and nonlinearity in a step-stress accelerated degradation test (SSADT), a nonlinear Wiener-based degradation model with RE and ME is built. An analytical approximation to the probability density function (PDF) of the product's lifetime is derived in closed form. The process and data of the SSADT are analyzed to obtain the relation model of the observed data under each accelerated stress. The likelihood function for the population-based observed data is constructed, and the population-based model parameters and the prior values of the random coefficient are estimated. From newly observed data of the target product in the SSADT, an analytical approximation to the PDF of its residual lifetime (RL) is derived in accordance with its individual degradation characteristics. A parameter updating method based on Bayesian inference is applied to obtain the posterior value of the random coefficient of the RL model. A simulated numerical example is analyzed to verify the accuracy and advantage of the proposed model.
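For intuition, the sketch below simulates one common form of such a model: degradation X(t) = a·Λ(t) + σ_B·B(Λ(t)) with a power-law time scale Λ(t) = t^b, a normally distributed random drift a (the random effect) and additive Gaussian measurement error, with lifetime taken as the first passage of a threshold. The model form and all parameter values are illustrative assumptions, not those estimated in the paper.

```python
# Simulation sketch of a nonlinear Wiener degradation model with a random
# drift (RE) and measurement error (ME); all values are illustrative.
import numpy as np

rng = np.random.default_rng(0)

mu_a, sigma_a = 1.0, 0.2        # random-effect drift: a ~ N(mu_a, sigma_a^2)
sigma_B = 0.3                   # diffusion coefficient
sigma_eps = 0.1                 # measurement-error standard deviation
b = 1.5                         # nonlinearity of the time scale Lambda(t) = t**b
threshold = 10.0                # failure threshold on the true degradation
t = np.linspace(0.0, 5.0, 501)
lam = t ** b

def simulate_unit():
    a = rng.normal(mu_a, sigma_a)                       # unit-specific drift (RE)
    dB = rng.normal(0.0, np.sqrt(np.diff(lam)))         # Brownian increments on the Lambda scale
    x = np.concatenate(([0.0], np.cumsum(a * np.diff(lam) + sigma_B * dB)))
    y = x + rng.normal(0.0, sigma_eps, size=x.shape)    # observed path with ME
    cross = np.argmax(x >= threshold)                   # first-passage index (0 if never crossed)
    life = t[cross] if x[cross] >= threshold else np.inf
    return y, life

lifetimes = np.array([simulate_unit()[1] for _ in range(2000)])
print("mean simulated lifetime:", np.mean(lifetimes[np.isfinite(lifetimes)]))
```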
We predict proton single event effect (SEE) error rates for the VATA160 ASIC chip on the Dark Matter Particle Explorer (DAMPE) to evaluate its radiation tolerance. Lacking proton test facilities, we built a Monte Carlo simulation tool named PRESTAGE to calculate the proton SEE cross-sections. PRESTAGE is based on the particle transport toolkit Geant4. It adopts a location-dependent strategy to derive the SEE sensitivity of the device from heavy-ion test data, which have been measured at the HI-13 tandem accelerator of the China Institute of Atomic Energy and the Heavy Ion Research Facility in Lanzhou. The AP-8, SOLPRO, and August 1972 worst-case models are used to predict the average and peak proton fluxes on the DAMPE orbit. Calculation results show that the averaged proton SEE error rate for the VATA160 chip is approximately 2.17×10^(-5)/device/day. Worst-case error rates for the Van Allen belts and solar energetic particle events are 1-3 orders of magnitude higher than the averaged error rate.
Based on the basic formulas for the confidence interval and the sampling error in mathematical statistics, this paper presents a statistical method for evaluating the application effects of a new type of gas anchor. Using this method, the confidence intervals and sampling errors of the mean differences in daily liquid production, daily oil production and submergence depth, before and after the new gas anchors were installed, were calculated for 10 sampled test wells among the 150 wells of block S in the Daqing Oilfield. The calculation results show that the new gas anchor is effective in increasing the oil production of a well and enhancing pump efficiency. This was verified by analyzing the actual differences in daily liquid production, daily oil production and submergence depth for all 150 wells before and after the gas anchors were installed. The confidence intervals and sampling errors of these quantities before and after installation can therefore be used to evaluate the application effects of the new gas anchor, and a mathematical statistics method for such evaluation is thereby established.
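The core calculation, a confidence interval for the mean before/after difference on a small sample of test wells, can be sketched as follows. The well count matches the abstract's 10 sampled wells, but the production figures are made-up placeholders, not data from the Daqing study.

```python
# Paired before/after analysis: mean difference, sampling error, and a t-based
# 95% confidence interval.  The numbers are illustrative placeholders.
import numpy as np
from scipy import stats

before = np.array([18.2, 22.5, 15.1, 30.4, 12.8, 25.0, 19.7, 27.3, 16.9, 21.4])  # e.g. daily oil production
after  = np.array([20.1, 24.9, 16.0, 33.8, 14.2, 27.6, 21.5, 30.0, 18.3, 23.9])

diff = after - before
n = diff.size
mean_diff = diff.mean()
se = diff.std(ddof=1) / np.sqrt(n)             # sampling error of the mean difference
t_crit = stats.t.ppf(0.975, df=n - 1)          # 95% two-sided critical value
ci = (mean_diff - t_crit * se, mean_diff + t_crit * se)

print(f"mean difference = {mean_diff:.2f}, sampling error = {se:.2f}")
print(f"95% confidence interval: ({ci[0]:.2f}, {ci[1]:.2f})")
# If the interval lies entirely above zero, the data support an improvement
# after installing the gas anchor.
```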
This study uses an empirical analysis to quantify the downstream analysis effects of data pre-processing choices. Bootstrap data simulation is used to measure the bias-variance decomposition of an empirical risk function, the mean square error (MSE). Results of the risk function decomposition are used to measure the effects of model development choices on model bias, variance, and irreducible error. Measurements of bias and variance are then applied as diagnostic procedures for model pre-processing and development. Best performing model-normalization-data structure combinations were found to illustrate the downstream analysis effects of these model development choices. In addition, results found from simulations were verified and expanded to include additional data characteristics (imbalanced, sparse) by testing on benchmark datasets available from the UCI Machine Learning Repository. Normalization results on benchmark data were consistent with those found using simulations, while also illustrating that more complex and/or non-linear models provide better performance on datasets with additional complexities. Finally, applying the findings from simulation experiments to previously tested applications led to equivalent or improved results with less model development overhead and processing time.
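A minimal sketch of the kind of bootstrap-based bias-variance decomposition described here: refit a model on bootstrap resamples of one simulated dataset, then read off squared bias, variance, and the irreducible noise level. The data-generating process, the linear model, and the sample sizes are assumptions for illustration, not the study's experimental design.

```python
# Bootstrap bias-variance decomposition of MSE on simulated data (illustrative).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

def true_f(x):
    return np.sin(2 * np.pi * x)

n, n_boot, noise_sd = 60, 200, 0.3
base_x = rng.uniform(0, 1, size=(n, 1))
base_y = true_f(base_x).ravel() + rng.normal(0, noise_sd, size=n)
x_test = np.linspace(0, 1, 50)[:, None]
preds = np.empty((n_boot, x_test.shape[0]))

for b in range(n_boot):
    idx = rng.integers(0, n, size=n)                          # bootstrap resample of the simulated data
    preds[b] = LinearRegression().fit(base_x[idx], base_y[idx]).predict(x_test)

mean_pred = preds.mean(axis=0)
bias2 = np.mean((mean_pred - true_f(x_test).ravel()) ** 2)    # squared bias vs the known truth
variance = np.mean(preds.var(axis=0))                         # variance across bootstrap refits
irreducible = noise_sd ** 2                                    # noise floor of the simulation
print(f"bias^2={bias2:.3f}  variance={variance:.3f}  irreducible={irreducible:.3f}")
print(f"approximate expected MSE: {bias2 + variance + irreducible:.3f}")
```

Swapping the linear model for a more flexible one (and changing the normalization of base_x) shifts the balance between the bias and variance terms, which is the diagnostic use described above.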
Today, Linear Mixed Models (LMMs) are fitted, mostly, by assuming that random effects and errors have Gaussian distributions, therefore using Maximum Likelihood (ML) or REML estimation. However, for many data sets, that double assumption is unlikely to hold, particularly for the random effects, a crucial component whose magnitude is key to assess in such modeling. Alternative fitting methods not relying on that assumption (such as ANOVA methods and Rao's MINQUE) apply, quite often, only to the very constrained class of variance components models. In this paper, a new computationally feasible estimation methodology is designed, first for the widely used class of 2-level (or longitudinal) LMMs, with the only assumption (beyond the usual basic ones) that residual errors are uncorrelated and homoscedastic, and with no distributional assumption imposed on the random effects. A major asset of this new approach is that it yields nonnegative variance estimates and covariance matrix estimates which are symmetric and, at least, positive semi-definite. Furthermore, it is shown that when the LMM is, indeed, Gaussian, this new methodology differs from ML just through a slight variation in the denominator of the residual variance estimate. The new methodology actually generalizes to LMMs a well-known nonparametric fitting procedure for standard Linear Models. Finally, the methodology is also extended to ANOVA LMMs, generalizing an old method by Henderson for ML estimation in such models under normality.
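For context, the standard Gaussian-likelihood baseline that the paper contrasts its distribution-free estimator against can be run in a few lines with statsmodels; the sketch below fits a 2-level (longitudinal) LMM with a random intercept and slope by REML on simulated data. It shows only the conventional approach, not the paper's new methodology, and all simulated parameter values are assumptions.

```python
# Conventional Gaussian REML fit of a 2-level LMM (baseline, not the paper's method).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subjects, n_obs = 40, 8
rows = []
for g in range(n_subjects):
    b0 = rng.normal(0, 1.0)                    # random intercept (need not be Gaussian in practice)
    b1 = rng.normal(0, 0.5)                    # random slope
    t = np.arange(n_obs)
    y = 2.0 + b0 + (0.8 + b1) * t + rng.normal(0, 0.7, n_obs)   # uncorrelated, homoscedastic residuals
    rows.append(pd.DataFrame({"subject": g, "t": t, "y": y}))
data = pd.concat(rows, ignore_index=True)

model = smf.mixedlm("y ~ t", data, groups=data["subject"], re_formula="~t")
fit = model.fit(reml=True)                     # Gaussian REML estimation
print(fit.summary())
```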
Technology scaling makes the propagation-induced pulse broadening and quenching (PIPBQ) effect increasingly noticeable. In order to effectively evaluate the soft error rate of combinational logic circuits, a soft error rate analysis approach considering the PIPBQ effect is proposed. As pulses of different original widths propagate through logic gate cells, the resulting pulse broadening and quenching are measured with HSPICE. Electrical effect look-up tables (EELUTs) for logic gate cells are then created to evaluate the PIPBQ effect. Sensitized paths are accurately retrieved by the proposed re-convergence-aware sensitized path search algorithm. Further, by propagating pulses along these paths to simulate fault injection, the PIPBQ effect on these paths can be quantified with the EELUTs. As a result, the soft error rate of circuits can be computed effectively by the proposed technique. Simulation results verify the improvement in soft error rate estimation compared with a PIPBQ-unaware method.
Single event effects (SEEs) induced by radiation have become a significant challenge to the reliability of modern electronic systems. To evaluate the SEE susceptibility of microelectronic devices and integrated circuits (ICs), a flexible and robust SEE testing system was developed at the Heavy Ion Research Facility in Lanzhou (HIRFL). The system is compatible with various types of microelectronic devices and ICs, and supports a range of complex and high-speed test schemes and plans for the irradiated devices under test (DUTs). Thanks to the combination of meticulous circuit design and hardened logic design, the system also provides protection against overheating and against irradiation by stray radiation. The system has been tested and verified in device irradiation experiments at HIRFL.
Single event effects of 1-T structure programmable read-only memory (PROM) devices fabricated with a 130-nm complementary metal oxide semiconductor-based thin/thick gate oxide anti-fuse process were investigated using heavy ions and a picosecond pulsed laser. The cross sections of a single event upset (SEU) for radiation-hardened PROMs were measured using a linear energy transfer (LET) ranging from 9.2 to 95.6 MeV cm^2 mg^(-1). The result indicated that the LET threshold for a dynamic bit upset was ~9 MeV cm^2 mg^(-1), which was lower than the threshold of ~20 MeV cm^2 mg^(-1) for an address counter upset owing to the additional triple modular redundancy structure present in the latch. In addition, a slight hard error was observed in the anti-fuse structure when employing ^(209)Bi ions with extremely high LET values (~91.6 MeV cm^2 mg^(-1)) and large ion fluence (~1×10^8 ions cm^(-2)). To identify the detailed sensitive position of a SEU in PROMs, a pulsed laser with a 5-μm beam spot was used to scan the entire surface of the device. This revealed that the upset occurred in the peripheral circuits of the internal power source and I/O pairs rather than in the internal latches and buffers. This was subsequently confirmed by a ^(181)Ta experiment. Based on the experimental data and a rectangular parallelepiped model of the sensitive volume, the space error rates for the used PROMs were calculated using the CRÈME-96 prediction tool. The results showed that this type of PROM was suitable for specific space applications, even in the geosynchronous orbit.
Medical errors are reported with increasing frequency both in Europe and in the United States of America, and measures are being put in place to deal with the problem. In Greece, more and more patients think it likely that they will experience a medical error during health care delivery, while the organizations they can turn to if this happens are few and offer meagre response. The consequences of medical errors are multiple and complex, with significant financial implications. Nowadays there is an urgent need to resolve problems related to cost containment in the Greek health system. Some research findings from the review of 128 compensations awarded by civil courts for medical errors in Greece in the years 2000 to 2009 are quite interesting. The mean compensation amounted to €292,613, representing 35.41% of the claimed compensation. Only a small proportion of medical errors gain publicity, as the majority of claims are settled out of court, covered by the insurance policy or by the hospitals. The burden of the obvious and hidden costs affects not only the patient, the family and the hospital but also society as a whole. This follows from our estimate that the level of compensation awarded by the civil courts for medical errors is remarkably high. Unfortunately, only rough estimates of the cost are possible owing to the lack of statistical data. The creation of an independent oversight body for the review of medical errors and complaints nationwide, as well as the modernization of hospitals' monitoring systems, is necessary in order to handle the medical error phenomenon. Above all, cooperation and trust between patients, health care professionals, hospital managers, medical boards and the government are essential to get to the root of the problem.