Funding: This work is supported by the National Science Foundation under grants CAREER CMMI-1834710 and IS-1849280. The research of Ziyi Huang and Haofeng Zhang is supported in part by the Cheung-Kong Innovation Doctoral Fellowship.
Abstract: Deep learning has recently been studied as a means of generating high-quality prediction intervals (PIs) for uncertainty quantification in regression tasks, including recent applications in simulation metamodeling. The high-quality criterion requires PIs to be as narrow as possible whilst maintaining a pre-specified level of data (marginal) coverage. However, most existing works on high-quality PIs lack accurate information on conditional coverage, which may lead to unreliable predictions when the conditional coverage is significantly smaller than the marginal coverage. To address this problem, we propose an end-to-end framework that outputs high-quality PIs and simultaneously provides an estimate of their conditional coverage. In doing so, we design a new loss function that is both easy to implement and theoretically justified via an exponential concentration bound. Our evaluation on real-world benchmark datasets and synthetic examples shows that our approach not only achieves competitive results on high-quality PIs in terms of average PI width, but also accurately estimates conditional coverage information that is useful in assessing model uncertainty.
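The abstract does not spell out the network architecture or the proposed loss. As a rough, hypothetical illustration of the general idea (not the paper's actual method), the sketch below assumes a small PyTorch model `PINet` with three heads (lower bound, upper bound, and a conditional-coverage logit) and a quality-driven-style objective: interval width on covered points, a penalty when marginal coverage falls below the target `1 - alpha`, and a cross-entropy term fitting the coverage head to the observed coverage indicator. The names `PINet`, `pi_loss`, `alpha`, `lam`, and `soften` are all placeholders introduced here for illustration.

```python
# Minimal, hypothetical sketch of an end-to-end PI network with a
# conditional-coverage head. This is NOT the loss proposed in the paper,
# only an illustration of the high-quality-PI idea described in the abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PINet(nn.Module):
    def __init__(self, in_dim, hidden=64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.lower = nn.Linear(hidden, 1)  # PI lower bound
        self.upper = nn.Linear(hidden, 1)  # PI upper bound
        self.cov = nn.Linear(hidden, 1)    # logit of estimated conditional coverage

    def forward(self, x):
        h = self.body(x)
        return self.lower(h), self.upper(h), self.cov(h)


def pi_loss(lo, up, cov_logit, y, alpha=0.1, lam=10.0, soften=50.0):
    """Illustrative quality-driven-style loss:
    width on covered points + marginal-coverage penalty + BCE for the coverage head."""
    y = y.view(-1, 1)
    # soft indicator of y falling inside [lo, up], for differentiability
    covered_soft = torch.sigmoid(soften * (y - lo)) * torch.sigmoid(soften * (up - y))
    covered_hard = ((y >= lo) & (y <= up)).float()
    # average width over (softly) covered points -- encourages narrow PIs
    width = ((up - lo) * covered_soft).sum() / (covered_soft.sum() + 1e-8)
    # penalize marginal coverage falling below the 1 - alpha target
    picp = covered_soft.mean()
    coverage_pen = torch.relu((1.0 - alpha) - picp) ** 2
    # fit the coverage head to the realized coverage indicator
    bce = F.binary_cross_entropy_with_logits(cov_logit, covered_hard)
    return width + lam * coverage_pen + bce
```

At test time, `torch.sigmoid(cov_logit)` would serve as the per-input conditional coverage estimate accompanying each PI; training proceeds with a standard optimizer (e.g., Adam) over mini-batches.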