Funding: Support from ExxonMobil Corporation to the Subsurface Mechanics and Geo-Energy Laboratory under grant SP22230020CEEXXU008957 is gratefully acknowledged. Support from the Ministry of Education, Government of India, and IIT Madras under grant SB20210856CEMHRD008957 is also gratefully acknowledged.
Abstract: We present an efficient physics-informed neural networks (PINNs) framework, termed Adaptive Interface-PINNs (AdaI-PINNs), to improve the modeling of interface problems with discontinuous coefficients and/or interfacial jumps. This framework is an enhanced version of its predecessor, Interface PINNs or I-PINNs (Sarma et al. [1]; https://doi.org/10.1016/j.cma.2024.117135), which involves domain decomposition and the assignment of different predefined activation functions to the neural networks in each subdomain across a sharp interface, while keeping all other parameters of the neural networks identical. In AdaI-PINNs, the activation functions vary solely in their slopes, which are trained along with the other parameters of the neural networks. This makes the AdaI-PINNs framework fully automated, without requiring preset activation functions. Comparative studies on one-dimensional, two-dimensional, and three-dimensional benchmark elliptic interface problems reveal that AdaI-PINNs outperform I-PINNs, reducing computational costs by 2-6 times while producing similar or better accuracy.
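The core mechanism described in the abstract, one network per subdomain that is identical to the others except for a trainable slope inside its activation function, can be sketched as follows. This is a minimal illustration assuming a PyTorch implementation; the class name `SubdomainNet`, the tanh(a·x) activation form, and the chosen hyperparameters are assumptions for exposition, not the authors' released code.

```python
# Minimal sketch of the AdaI-PINNs idea (not the authors' implementation):
# each subdomain gets its own small network, and the only architectural
# difference between subdomains is a trainable slope `a` inside the
# activation tanh(a * x), optimized jointly with the network weights.
import torch
import torch.nn as nn


class SubdomainNet(nn.Module):
    def __init__(self, in_dim: int = 1, hidden: int = 20, layers: int = 3):
        super().__init__()
        # Trainable activation slope, initialized to 1 (assumed initialization).
        self.slope = nn.Parameter(torch.tensor(1.0))
        dims = [in_dim] + [hidden] * layers + [1]
        self.linears = nn.ModuleList(
            nn.Linear(dims[i], dims[i + 1]) for i in range(len(dims) - 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for lin in self.linears[:-1]:
            # Activations differ across subdomains only through self.slope.
            x = torch.tanh(self.slope * lin(x))
        return self.linears[-1](x)


# One network per subdomain across the sharp interface; all other
# hyperparameters are kept identical, and the slopes are trained
# together with the weights, so no activation function is preset.
nets = [SubdomainNet(), SubdomainNet()]
params = [p for net in nets for p in net.parameters()]
optimizer = torch.optim.Adam(params, lr=1e-3)
```

In a full PINN loss, each subdomain's residual would be evaluated with its own network, and interface conditions would couple the two networks at the interface collocation points, as in the I-PINNs setup the paper builds on.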