Background The efficacy of percutaneous transluminal angioplasty and stenting (PTAS) relative to medical management in treating symptomatic intracranial arterial stenosis (ICAS) varies with the qualifying artery. This study aims to evaluate PTAS compared with medical therapy alone in cases of ICAS involving the internal carotid artery (ICA), middle cerebral artery (MCA), vertebral artery (VA) and basilar artery (BA). Methods This study is a pooled analysis of individual patient data from two randomised controlled trials evaluating the efficacy of PTAS in comparison with medical management for symptomatic ICAS with different qualifying arteries. The primary outcome was stroke or death within 30 days post-enrolment, or stroke in the territory of the qualifying artery beyond 30 days through 1 year. An intention-to-treat methodology was employed, and HRs with 95% CIs were used to convey risk estimates. Results Data from 809 individuals were collected from the Stenting vs Aggressive Medical Management for Preventing Recurrent Stroke in Intracranial Stenosis trial and the China Angioplasty and Stenting for Symptomatic Intracranial Severe Stenosis trial. Four hundred were assigned to PTAS, while 409 were assigned to medical therapy alone. For the primary outcome, patients with symptomatic BA stenosis had a significantly higher risk with PTAS than with medical therapy (17.17% vs 7.77%; HR 2.38 (1.03 to 5.52); p=0.04). However, PTAS showed no significant difference from medical therapy in patients with symptomatic ICA (26.67% vs 16.67%; HR 1.68 (0.78 to 3.62); p=0.19), MCA (8.28% vs 9.79%; HR 0.85 (0.42 to 1.74); p=0.66) or VA stenosis (9.52% vs 10.71%; HR 0.91 (0.32 to 2.62); p=0.86). Conclusions PTAS significantly increases the risk of both short-term and long-term stroke in patients with symptomatic BA stenosis. Without significant technological advances to mitigate these risks, PTAS offers limited benefit. For symptomatic ICA, MCA and VA stenosis, PTAS provided no significant advantage.
The three-dimensional variable cross-section roll forming is a new metal forming technology which combines large forming force, multi-axis linkage movement and space synergic movement, and the sequential synergic movement of the ganged roller group is used to complete the metal sheet forming according to the shape data of the complicated and variable forming part. The control system should meet the demand of quick response to the test requirements of the product part. A new real-time data-driven multi-axis linkage and synergic movement control strategy for 3D roll forming is put forward in this paper. In the new control strategy, the forming data are automatically generated according to the shape of the parts; the multi-axis linkage movement, together with cooperative motion among the six stands of the 3D roll forming machine, is driven by real-time information; and the control nodes are also driven by the forming data. The new control strategy is applied to a 48-axis 3D roll forming machine developed by our research center, and the control servo period is less than 10 ms. A forming experiment on a variable cross-section part is carried out, and the forming precision achieved by the control strategy is better than ±0.5 mm. The result of the experiment proves that the control strategy has significant potential for the development of 3D roll forming production lines with large-scale, multi-axis ganged and synergic movement.
In this review, we highlight some recent methodological and theoretical developments in the estimation and testing of large panel data models with cross-sectional dependence. The paper begins with a discussion of issues of cross-sectional dependence, and introduces the concepts of weak and strong cross-sectional dependence. Then, attention is primarily paid to spatial and factor approaches for modeling cross-sectional dependence in both linear and nonlinear (nonparametric and semiparametric) panel data models. Finally, we conclude with some speculations on future research directions.
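As a minimal illustration of testing for cross-sectional dependence, the sketch below computes Pesaran's CD statistic, one standard diagnostic in this literature; the panel dimensions and synthetic residuals are invented for the example, and this is not code from the review itself:

```python
import numpy as np

def pesaran_cd(resid):
    """Pesaran's CD statistic for cross-sectional dependence.

    resid : (T, N) array of panel residuals (T periods, N units).
    Under the null of no cross-sectional dependence, CD is
    approximately standard normal.
    """
    T, N = resid.shape
    corr = np.corrcoef(resid, rowvar=False)       # N x N pairwise correlations
    iu = np.triu_indices(N, k=1)                  # upper triangle, i < j
    return np.sqrt(2.0 * T / (N * (N - 1))) * corr[iu].sum()

rng = np.random.default_rng(0)
cd_indep = pesaran_cd(rng.standard_normal((200, 30)))    # independent units
factor = rng.standard_normal((200, 1))                   # one strong common factor
cd_dep = pesaran_cd(rng.standard_normal((200, 30)) + 3 * factor)
print(cd_indep, cd_dep)    # the second should be far larger
```

A large positive CD on the factor-driven panel is exactly the "strong dependence" case for which factor approaches are designed.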
Tunnel deformation monitoring is a crucial task in evaluating tunnel stability during the metro operation period. Terrestrial Laser Scanning (TLS), an innovative technique, can collect high-density and high-accuracy point cloud data in a few minutes, which provides promising applications in tunnel deformation monitoring. Here, an efficient method for extracting tunnel cross-sections and convergence analysis using dense TLS point cloud data is proposed. First, the tunnel orientation is determined using principal component analysis (PCA) in the Euclidean plane. Two control points are introduced to detect and remove unsuitable points by point cloud division, and the ground points are then removed by defining an elevation band of 0.5 m. Next, a z-score method is introduced to detect and remove the outliers. Because the standard shape of a tunnel cross-section is round, circle fitting is implemented using the least-squares method. Afterward, the convergence analysis is made at the angles of 0°, 30° and 150°. The feasibility of the proposed approach is tested on a TLS point cloud of a Nanjing subway tunnel acquired using a FARO X330 laser scanner. The results indicate that the proposed methodology achieves an overall accuracy of 1.34 mm, in agreement with the measurements acquired by a total station instrument. The proposed methodology provides new insights and references for the application of TLS in tunnel deformation monitoring, and can also be extended to other engineering applications.
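The least-squares circle fitting and z-score screening steps can be sketched as follows; this is an illustrative reimplementation on synthetic data (Kåsa's algebraic fit, with made-up tunnel geometry), not the authors' code:

```python
import numpy as np

def fit_circle(x, y):
    """Algebraic least-squares circle fit (Kasa method).

    Solves a*x + b*y + c = -(x^2 + y^2) in the least-squares sense;
    centre = (-a/2, -b/2), radius = sqrt(a^2/4 + b^2/4 - c).
    """
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x**2 + y**2)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    cx, cy = -a / 2.0, -b / 2.0
    r = np.sqrt(cx**2 + cy**2 - c)
    return cx, cy, r

# synthetic noisy cross-section: 5 m radius, centred at (2, 3)
rng = np.random.default_rng(1)
theta = rng.uniform(0, 2 * np.pi, 500)
x = 2 + 5 * np.cos(theta) + rng.normal(0, 0.005, 500)
y = 3 + 5 * np.sin(theta) + rng.normal(0, 0.005, 500)

cx, cy, r = fit_circle(x, y)

# z-score screening of the radial residuals, as in the outlier-removal step
res = np.hypot(x - cx, y - cy) - r
z = (res - res.mean()) / res.std()
keep = np.abs(z) < 3
cx, cy, r = fit_circle(x[keep], y[keep])
print(round(cx, 2), round(cy, 2), round(r, 2))    # ≈ 2.0 3.0 5.0
```

Convergence at a given angle would then be read off as the radial distance from the fitted centre to the retained points near that angle.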
We present new data on the ^(63)Cu(γ,n) cross-section studied using a quasi-monochromatic and energy-tunable γ beam produced at the Shanghai Laser Electron Gamma Source, to resolve the long-standing discrepancy between existing measurements and evaluations of this cross-section. Using an unfolding iteration method, ^(63)Cu(γ,n) data were obtained with an uncertainty of less than 4%, and the inconsistencies between the available experimental data are discussed. The γ-ray strength function of ^(63)Cu(γ,n) was successfully extracted as an experimental constraint. We further calculated the cross-section of the radiative neutron capture reaction ^(62)Cu(n,γ) using the TALYS code. Our calculation method enables the extraction of (n,γ) cross-sections for unstable nuclides.
Often in longitudinal studies, some subjects complete their follow-up visits while others miss them for various reasons. Among those who miss follow-up visits, some may learn that the event of interest has already happened by the time they come back. In this case, not only are their event times interval-censored, but their time-dependent measurements are also incomplete. This problem was motivated by data from a national longitudinal survey of youth. A maximum likelihood estimation (MLE) method based on the expectation-maximization (EM) algorithm is used for parameter estimation. The missing information principle is then applied to estimate the variance-covariance matrix of the MLEs. Simulation studies demonstrate that the proposed method works well in terms of bias, standard error, and power for samples of moderate size. The National Longitudinal Survey of Youth 1997 (NLSY97) data are analyzed for illustration.
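To convey the flavour of EM under interval censoring, here is a deliberately simple sketch for a plain exponential event-time model, far simpler than the survey model the paper analyzes; the visit-width setup and all names are invented for illustration:

```python
import numpy as np

def em_exponential_interval(a, b, iters=200):
    """EM estimate of an Exponential rate when each event time is only
    known to lie in the interval [a_i, b_i] (interval censoring).

    E-step: conditional mean of T_i given a_i <= T_i <= b_i at the current rate.
    M-step: rate = n / sum(expected times), the complete-data MLE.
    """
    lam = 1.0
    for _ in range(iters):
        ea, eb = np.exp(-lam * a), np.exp(-lam * b)
        # E[T | a <= T <= b] for an Exponential(lam) variable
        et = 1.0 / lam + (a * ea - b * eb) / (ea - eb)
        lam = len(a) / et.sum()
    return lam

rng = np.random.default_rng(2)
t = rng.exponential(scale=2.0, size=5000)    # true rate = 0.5
a = np.floor(t / 0.5) * 0.5                  # events seen only at 0.5-wide visits
b = a + 0.5
print(em_exponential_interval(a, b))         # close to the true rate 0.5
```

The paper's method layers incomplete time-dependent covariates on top of this censoring structure, which is what makes the missing information principle necessary for the variance estimates.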
The modeling of volatility and correlation is important for calculating hedge ratios, value-at-risk estimates, CAPM (Capital Asset Pricing Model) betas, derivative pricing and risk management in general. Recent access to intra-daily high-frequency data for two of the most liquid contracts at the Nord Pool exchange has made it possible to apply new and promising methods for analyzing volatility and correlation. The concepts of realized volatility and realized correlation are applied, and this study statistically describes the distribution (both distributional properties and temporal dependencies) of electricity forward data from 2005 to 2009. The main findings show that the logarithmic realized volatility is approximately normally distributed, while realized correlation seems not to be. Further, realized volatility and realized correlation have a long-memory feature. There also appears to be a high correlation between realized correlation and volatilities, and positive relations between trading volume and realized volatility and between trading volume and realized correlation. These results are largely consistent with earlier studies of stylized facts in other financial and commodity markets.
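The realized measures used here can be sketched in a few lines: realized variance is the sum of squared intraday log returns, and realized covariance the sum of their cross products. The synthetic five-minute prices below are invented for illustration and do not come from the Nord Pool data:

```python
import numpy as np

def realized_vol_and_corr(p1, p2):
    """Daily realized volatility and realized correlation from intraday prices.

    p1, p2 : arrays of intraday prices for two contracts on one day.
    Realized variance  = sum of squared intraday log returns;
    realized covariance = sum of cross products of the two return series.
    """
    r1 = np.diff(np.log(p1))
    r2 = np.diff(np.log(p2))
    rv1, rv2 = np.sum(r1**2), np.sum(r2**2)
    rcov = np.sum(r1 * r2)
    return np.sqrt(rv1), np.sqrt(rv2), rcov / np.sqrt(rv1 * rv2)

rng = np.random.default_rng(3)
common = rng.normal(0, 0.001, 288)    # shared 5-minute shocks
pa = 100 * np.exp(np.cumsum(common + rng.normal(0, 0.001, 288)))
pb = 50 * np.exp(np.cumsum(common + rng.normal(0, 0.001, 288)))
vol_a, vol_b, corr = realized_vol_and_corr(pa, pb)
print(vol_a, vol_b, corr)    # correlation near 0.5 by construction
```

Computing these per trading day yields the daily series whose distribution and long-memory behaviour the study examines.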
The deep learning algorithm, increasingly applied in the field of petroleum geophysical prospecting, has achieved good results in improving efficiency and accuracy in test applications. To play a greater role in actual production, these algorithm modules must be integrated into software systems and used more often in production projects. Deep learning frameworks such as TensorFlow and PyTorch basically take Python as the core architecture, while application programs mainly use Java, C#, and other programming languages. During integration, the seismic data read by the Java and C# data interfaces must be transferred to the Python main program module. The data exchange methods between Java, C#, and Python include shared memory, shared directories, and so on. However, these methods have the disadvantages of low transmission efficiency and unsuitability for asynchronous networks. Considering the large volume of seismic data and the need for network support for deep learning, this paper proposes a method of transmitting seismic data based on Socket. By exploiting Socket's cross-network capability and efficient long-distance transmission, this approach solves the problem of inefficient transmission of underlying data when integrating the deep learning algorithm module into a software system. Furthermore, actual production applications show that this method effectively overcomes the shortcomings of data transmission via shared memory, shared directories, and other modes, while improving the transmission efficiency of massive seismic data across modules at the bottom of the software.
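A minimal sketch of the Socket idea, assuming a simple length-prefixed framing protocol (the abstract does not specify the actual wire format, so the 4-byte header here is an assumption for illustration):

```python
import socket
import struct
import threading

# Length-prefixed framing: a 4-byte big-endian size header, then the payload.
# A fixed header tells the receiver exactly how many bytes to expect, which is
# the usual way to move large binary blocks (e.g. seismic traces) over a
# stream socket between processes written in different languages.

def send_block(sock, data: bytes):
    sock.sendall(struct.pack(">I", len(data)) + data)

def recv_exact(sock, n: int) -> bytes:
    buf = bytearray()
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-frame")
        buf.extend(chunk)
    return bytes(buf)

def recv_block(sock) -> bytes:
    (size,) = struct.unpack(">I", recv_exact(sock, 4))
    return recv_exact(sock, size)

# demo over a local socket pair (stands in for the Java/C# <-> Python link)
a, b = socket.socketpair()
payload = bytes(range(256)) * 4096            # ~1 MB of binary "trace" data
t = threading.Thread(target=send_block, args=(a, payload))
t.start()
received = recv_block(b)
t.join()
a.close()
b.close()
print(len(received) == len(payload))          # True
```

The same framing works unchanged over a TCP connection between hosts, which is what gives the approach its cross-network reach compared with shared memory or shared directories.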
Accurate prediction of formation pore pressure is essential to predict fluid flow and manage hydrocarbon production in petroleum engineering. Deep learning techniques have recently received growing interest due to their great potential for pore pressure prediction. However, most traditional deep learning models are inefficient at addressing generalization problems. To fill this technical gap, in this work we developed a new adaptive physics-informed deep learning model with high generalization capability to predict pore pressure values directly from seismic data. Specifically, the new model, named CGP-NN, consists of a novel parametric feature extraction approach (1DCPP), a stacked multilayer gated recurrent model (multilayer GRU), and an adaptive physics-informed loss function. Through machine training, the developed model can automatically select the optimal physical model to constrain the results for each pore pressure prediction. The CGP-NN model generalizes best when the physics-related metric λ = 0.5. A hybrid approach combining the Eaton and Bowers methods is also proposed to build machine-learnable labels, addressing the problem of few labels. To validate the developed model and methodology, a case study on a complex reservoir in the Tarim Basin was performed, demonstrating high accuracy in the pore pressure prediction of new wells along with strong generalization ability. The adaptive physics-informed deep learning approach presented here has potential application in the prediction of pore pressures coupled with multiple genesis mechanisms using seismic data.
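Eaton's method, one half of the label-building hybrid, can be sketched as follows; the numerical inputs are invented for illustration, and the paper's actual label pipeline is not reproduced here:

```python
def eaton_pore_pressure(overburden, hydrostatic, dt_normal, dt_observed, n=3.0):
    """Eaton's sonic method for pore-pressure labels.

    Pp = S - (S - Pn) * (dt_normal / dt_observed) ** n
    overburden (S), hydrostatic (Pn) : pressures or gradients in the same units
    dt_normal, dt_observed           : normal-trend and measured transit times
    n                                : Eaton exponent (3 is the classic sonic value)
    """
    return overburden - (overburden - hydrostatic) * (dt_normal / dt_observed) ** n

# on the normal compaction trend the estimate collapses to hydrostatic pressure
print(eaton_pore_pressure(100.0, 45.0, 90.0, 90.0))    # 45.0
# slower-than-normal transit time (undercompaction) implies overpressure
print(eaton_pore_pressure(100.0, 45.0, 90.0, 110.0))   # between 45 and 100
```

Bowers' method, the other half of the hybrid, instead relates velocity to effective stress with separate loading and unloading curves, which is why combining the two gives labels covering more overpressure mechanisms.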
Aim Alcoholism is a disease in which a patient becomes dependent on or addicted to alcohol. This paper aims to design a novel artificial intelligence model that can recognize alcoholism more accurately. Methods We propose the VGG-Inspired stochastic pooling neural network (VISPNN) model based on three components: (i) a VGG-inspired mainstay network; (ii) the stochastic pooling technique, which aims to outperform traditional max pooling and average pooling; and (iii) an improved 20-way data augmentation (Gaussian noise, salt-and-pepper noise, speckle noise, Poisson noise, horizontal shear, vertical shear, rotation, Gamma correction, random translation, and scaling, on both the raw image and its horizontally mirrored image). In addition, two networks (Net-I and Net-II) are proposed for ablation studies. Net-I is based on VISPNN with stochastic pooling replaced by ordinary max pooling. Net-II removes the 20-way data augmentation. Results Ten runs of 10-fold cross-validation show that our VISPNN model attains a sensitivity of 97.98±1.32%, a specificity of 97.80±1.35%, a precision of 97.78±1.35%, an accuracy of 97.89±1.11%, an F1 score of 97.87±1.12%, an MCC of 95.79±2.22%, an FMI of 97.88±1.12%, and an AUC of 0.9849. Conclusion The performance of our VISPNN model is better than the two internal networks (Net-I and Net-II) and ten state-of-the-art alcoholism recognition methods.
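Stochastic pooling, the component Net-I ablates, samples an activation within each pooling window with probability proportional to its value instead of taking the max or the mean. A simplified NumPy illustration (not the authors' implementation; window size and handling of all-zero windows are assumptions):

```python
import numpy as np

def stochastic_pool2x2(x, rng):
    """Stochastic pooling over non-overlapping 2x2 windows.

    Within each window, one (non-negative) activation is sampled with
    probability proportional to its value; an all-zero window falls back
    to a uniform draw.
    """
    h, w = x.shape
    out = np.empty((h // 2, w // 2))
    for i in range(0, h, 2):
        for j in range(0, w, 2):
            win = x[i:i + 2, j:j + 2].ravel()
            s = win.sum()
            p = win / s if s > 0 else np.full(4, 0.25)
            out[i // 2, j // 2] = rng.choice(win, p=p)
    return out

rng = np.random.default_rng(4)
act = np.array([[0.0, 0.0, 1.0, 3.0],
                [0.0, 8.0, 0.0, 0.0],
                [2.0, 2.0, 0.0, 0.0],
                [2.0, 2.0, 0.0, 5.0]])
print(stochastic_pool2x2(act, rng))
```

During training the randomness acts as a regularizer, which is the intuition behind expecting it to outperform plain max or average pooling.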
Purpose: This paper relates the definition of data quality procedures for knowledge organizations such as Higher Education Institutions. The main purpose is to present the flexible approach developed for monitoring the data quality of the European Tertiary Education Register (ETER) database, illustrating its functioning and highlighting the main challenges that still have to be faced in this domain.
Design/methodology/approach: The proposed data quality methodology is based on two kinds of checks, one to assess the consistency of cross-sectional data and the other to evaluate the stability of multiannual data. This methodology has an operational and empirical orientation: the proposed checks do not assume any theoretical distribution for the determination of the threshold parameters that identify potential outliers, inconsistencies, and errors in the data.
Findings: We show that the proposed cross-sectional and multiannual checks are helpful for identifying outliers and extreme observations and for detecting ontological inconsistencies not described in the available metadata. For this reason, they may be a useful complement to the processing of the available information.
Research limitations: The coverage of the study is limited to European Higher Education Institutions. The cross-sectional and multiannual checks are not yet completely integrated.
Practical implications: Considering the quality of the available data and information is important for data quality-aware empirical investigations, highlighting problems and areas where investment is needed to improve the coverage and interoperability of data in future data collection initiatives.
Originality/value: The data-driven quality checks proposed in this paper may be useful as a reference for building and monitoring the data quality of new databases, or of existing databases for other countries or systems characterized by high heterogeneity and complexity of the units of analysis, without relying on pre-specified theoretical distributions.
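A multiannual stability check of the empirical, distribution-free kind described above might look like the following sketch; the ratio thresholds and the student-count series are invented for illustration and are not ETER's actual rules:

```python
import numpy as np

def multiannual_ratio_check(values, low=0.5, high=2.0):
    """Flag suspicious year-on-year jumps in one institution's time series.

    A value is flagged when its ratio to the previous year falls outside
    [low, high] -- an empirical threshold rule that assumes no particular
    distribution, in the spirit of the checks described above.
    """
    v = np.asarray(values, dtype=float)
    ratios = v[1:] / v[:-1]
    flags = (ratios < low) | (ratios > high)
    return np.concatenate([[False], flags])     # first year has no predecessor

students = [12000, 12400, 12100, 31000, 12600]  # one implausible spike
print(multiannual_ratio_check(students))
# [False False False  True  True]
```

Note that a single erroneous year triggers two flags (the jump in and the jump out), which is one reason the flagged values are candidates for manual review rather than automatic correction.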
On the basis of Argo temperature and salinity profile data from January 2001 to July 2014, the spatial distributions of the upper-ocean heat content (OHC) and ocean salt content (OSC) of the western Pacific warm pool (WPWP) region and their seasonal and interannual variations are studied by cyclostationary empirical orthogonal function (CSEOF) decomposition, maximum entropy spectral analysis, and correlation analysis. Probable reasons for the variations are discussed. The results show the following. (1) The OHC variations in the subsurface layer of the WPWP are much greater than those in the surface layer; on the contrary, the OSC variations occur mainly in the surface layer, while the subsurface layer varies little. (2) Compared with the OSC, the OHC of the WPWP region is more affected by El Niño-Southern Oscillation (ENSO) events. The CSEOF analysis shows that the OHC pattern in mode 1 has a strong interannual oscillation, with the eastern and western parts opposite in phase, while the distribution of the OSC has a positive-negative-positive tripole pattern. Time series analysis shows that the OHC underwent three phase adjustments with the occurrence of ENSO events after 2007, while the OSC had only one such adjustment during the same period. Further analysis indicates that the OHC variations are mainly caused by ENSO events, local winds, and zonal currents, whereas the OSC variations have much more complex causes. Two of these, the zonal current and the freshwater flux, have a positive feedback on the OSC change in the WPWP region.
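CSEOF analysis is beyond a short snippet, but its building block, a plain EOF decomposition via SVD, can be sketched on a synthetic field; the data here are invented and this is not the study's cyclostationary variant:

```python
import numpy as np

def eof(anomaly):
    """Plain EOF decomposition of an anomaly field via SVD.

    anomaly : (time, space) matrix with the time mean removed.
    Returns the spatial patterns (EOFs, one per row), the principal-component
    time series, and the fraction of variance explained by each mode.
    """
    u, s, vt = np.linalg.svd(anomaly, full_matrices=False)
    pcs = u * s                       # time series of each mode
    eofs = vt                         # spatial patterns (rows)
    var_frac = s**2 / np.sum(s**2)
    return eofs, pcs, var_frac

# synthetic field dominated by one oscillating spatial mode plus noise
t = np.linspace(0, 4 * np.pi, 120)[:, None]
pattern = np.sin(np.linspace(0, np.pi, 50))[None, :]
field = np.sin(t) @ pattern + 0.05 * np.random.default_rng(5).standard_normal((120, 50))
field -= field.mean(axis=0)
eofs, pcs, var_frac = eof(field)
print(var_frac[0])    # the first mode explains most of the variance
```

CSEOF generalizes this by letting the spatial patterns themselves evolve through a nested cycle (e.g. the seasonal cycle), which is what allows it to separate the interannual ENSO-related signals discussed above.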
Residential water use has gradually become the focus of China's municipal water supply planning and management in recent years. Little is known, however, about residential water use in modern China, owing to the economic transition and the strengthening of water conservation management. To better understand the characteristics of residential water use in North China, a model for identifying the determinants of residential water use was established and analyzed using panel data and cross-section data methodologies. Taiyuan, the capital of Shanxi Province in northern China, was then selected as a case study. Both the analyses and the field investigation indicate that the relatively slow increase in residential water use in recent years may result from the implementation of strict laws and regulations on water conservation. Through the investigation, first-hand information about water consumption patterns, water reuse and conservation, and people's attitudes toward water quantity and quality was also obtained.
Funding (PTAS pooled analysis): Beijing Hospitals Authority's Ascent Plan (DFL20220702); National Natural Science Foundation of China (82101398).
Funding (3D roll forming): Supported by the National Key Technology R&D Program (No. 2011BAG03B03).
Funding (panel data review): Supported by the National Natural Science Foundation of China (71131008, Key Project, and 71271179).
Funding (tunnel TLS): National Natural Science Foundation of China (No. 41801379); Fundamental Research Funds for the Central Universities (No. 2019B08414); National Key R&D Program of China (No. 2016YFC0401801).
Funding (^(63)Cu cross-section): Supported by the National Key Research and Development Program (Nos. 2023YFA1606901 and 2022YFA1602400); National Natural Science Foundation of China (Nos. U2230133, 12275338, and 12388102); Open Fund of the CIAE Key Laboratory of Nuclear Data (No. JCKY2022201C152).
Funding (seismic data transmission): Supported by the PetroChina Prospective, Basic, and Strategic Technology Research Project (No. 2021ZG03-02 and No. 2023DJ8402).
Funding (pore pressure prediction): Funded by the National Natural Science Foundation of China (General Program No. 52074314; No. U19B6003-05); National Key Research and Development Program of China (2019YFA0708303-05).
Funding: This paper is partially supported by the Royal Society International Exchanges Cost Share Award, UK (RP202G0230); Medical Research Council Confidence in Concept Award, UK (MC_PC_17171); Hope Foundation for Cancer Research, UK (RM60G0680); British Heart Foundation Accelerator Award, UK; Sino-UK Industrial Fund, UK (RP202G0289); and Global Challenges Research Fund (GCRF), UK (P202PF11). In addition, we acknowledge Dr. Hemil Patel and Dr. Qinghua Zhou for their help with English correction.
Abstract: Aim: Alcoholism is a disease in which a patient becomes dependent on or addicted to alcohol. This paper aims to design a novel artificial intelligence model that recognizes alcoholism more accurately. Methods: We propose the VGG-inspired stochastic pooling neural network (VISPNN) model based on three components: (i) a VGG-inspired mainstay network; (ii) the stochastic pooling technique, which aims to outperform traditional max pooling and average pooling; and (iii) an improved 20-way data augmentation (Gaussian noise, salt-and-pepper noise, speckle noise, Poisson noise, horizontal shear, vertical shear, rotation, gamma correction, random translation, and scaling, applied to both the raw image and its horizontally mirrored image). In addition, two networks (Net-I and Net-II) are proposed for ablation studies: Net-I is based on VISPNN with stochastic pooling replaced by ordinary max pooling, and Net-II removes the 20-way data augmentation. Results: Ten runs of 10-fold cross-validation show that our VISPNN model attains a sensitivity of 97.98±1.32, a specificity of 97.80±1.35, a precision of 97.78±1.35, an accuracy of 97.89±1.11, an F1 score of 97.87±1.12, an MCC of 95.79±2.22, an FMI of 97.88±1.12, and an AUC of 0.9849. Conclusion: Our VISPNN model outperforms the two internal networks (Net-I and Net-II) and ten state-of-the-art alcoholism recognition methods.
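Stochastic pooling, in the generic sense of sampling one activation from each window with probability proportional to its magnitude rather than taking the window's max or mean, can be sketched as follows; this is an illustrative sketch, not the paper's implementation:

```python
import random

def stochastic_pool(window, rng=random.random):
    # window: the non-negative activations of one pooling region,
    # e.g. the four values of a 2x2 window after ReLU.
    total = sum(window)
    if total == 0:
        return 0.0  # all-zero window: nothing to sample
    # Sample one activation with probability a_i / sum(a).
    r = rng() * total
    acc = 0.0
    for a in window:
        acc += a
        if r <= acc:
            return a
    return window[-1]  # guard against floating-point rounding

pooled = stochastic_pool([1.0, 2.0, 3.0, 2.0])  # one of 1.0, 2.0, or 3.0
```

At test time, stochastic pooling is commonly replaced by the probability-weighted average of the window, so that inference is deterministic.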
Funding: This work was supported by the European Commission ETER Project (No. 934533-2017-AO8-CH) and the H2020 RISIS 2 project (No. 824091).
Abstract: Purpose: This paper concerns the definition of data quality procedures for knowledge organizations such as Higher Education Institutions. The main purpose is to present the flexible approach developed for monitoring the data quality of the European Tertiary Education Register (ETER) database, illustrating how it works and highlighting the main challenges that still have to be faced in this domain. Design/methodology/approach: The proposed data quality methodology is based on two kinds of checks, one assessing the consistency of cross-sectional data and the other evaluating the stability of multiannual data. The methodology has an operational, empirical orientation: the proposed checks do not assume any theoretical distribution when determining the threshold parameters that identify potential outliers, inconsistencies, and errors in the data. Findings: We show that the proposed cross-sectional and multiannual checks help identify outliers and extreme observations and detect ontological inconsistencies not described in the available metadata. They may therefore be a useful complement when processing the available information. Research limitations: The coverage of the study is limited to European Higher Education Institutions, and the cross-sectional and multiannual checks are not yet completely integrated. Practical implications: Considering the quality of the available data and information is important for data quality-aware empirical investigations, highlighting problems and areas where investment is needed to improve the coverage and interoperability of data in future data collection initiatives. Originality/value: The data-driven quality checks proposed in this paper may serve as a reference for building and monitoring the data quality of new databases, or of existing databases for other countries or systems characterized by highly heterogeneous and complex units of analysis, without relying on pre-specified theoretical distributions.
Funding: The National Natural Science Foundation of China under contract Nos 41406022 and 41606003; the Scientific Research Fund of the Second Institute of Oceanography, State Oceanic Administration of China, under contract Nos JG1812 and JG1709; the Special Program for the National Basic Research of China under contract No. 2012FY112300.
Abstract: On the basis of Argo temperature and salinity profile data from January 2001 to July 2014, the spatial distributions of the upper ocean heat content (OHC) and ocean salt content (OSC) of the western Pacific warm pool (WPWP) region and their seasonal and interannual variations are studied by cyclostationary empirical orthogonal function (CSEOF) decomposition, maximum entropy spectral analysis, and correlation analysis, and probable reasons for the variations are discussed. The results show the following. (1) The OHC variations in the subsurface layer of the WPWP are much greater than those in the surface layer; in contrast, the OSC variations occur mainly in the surface layer, while the subsurface layer varies little. (2) Compared with the OSC, the OHC of the WPWP region is more strongly affected by El Niño-Southern Oscillation (ENSO) events. The CSEOF analysis shows that the OHC pattern in mode 1 has strong interannual oscillation, with the eastern and western parts opposite in phase, whereas the distribution of the OSC has a positive-negative-positive tripole pattern. Time series analysis shows that the OHC underwent three phase adjustments with the occurrence of ENSO events after 2007, while the OSC had only one such adjustment during the same period. Further analysis indicates that the OHC variations are mainly caused by ENSO events, local winds, and zonal currents, whereas the OSC variations have much more complex causes; two of these, the zonal current and the freshwater flux, exert a positive feedback on the OSC change in the WPWP region.
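As a simplified stand-in for the decomposition step (a full CSEOF analysis additionally resolves a cyclostationary nested period and is considerably more involved), an ordinary EOF analysis of an anomaly field via SVD can be sketched as:

```python
import numpy as np

# Synthetic stand-in data: 120 months x 50 grid points (not Argo data).
rng = np.random.default_rng(0)
field = rng.standard_normal((120, 50))

# Remove the time mean at each grid point to form anomalies.
anom = field - field.mean(axis=0)

# SVD of the anomaly matrix yields the EOF modes directly.
u, s, vt = np.linalg.svd(anom, full_matrices=False)

eofs = vt                       # spatial patterns, one row per mode
pcs = u * s                     # principal-component time series
var_frac = s**2 / np.sum(s**2)  # fraction of variance explained per mode

# The modes reconstruct the anomaly field exactly: anom = pcs @ eofs
recon = pcs @ eofs
```

Statements such as "the eastern and western parts are opposite in phase" correspond to sign changes within a single row of `eofs`, while the interannual oscillation appears in the corresponding column of `pcs`.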
Abstract: Residential water use has gradually become a focus of China's municipal water supply planning and management in recent years. Little is known, however, about residential water use in modern China, owing to the economic transition and the strengthening of water conservation management. To better understand the characteristics of residential water use in North China, a model for identifying its determinants was established and analyzed using panel data and cross-section data methodologies, with Taiyuan, the capital city of Shanxi Province in northern China, selected as a case study. Both the analyses and a field investigation indicate that the relatively slow increase in residential water use in recent years may result from the implementation of strict laws and regulations on water conservation. The investigation also yielded first-hand information about water consumption patterns, water reuse and conservation, and people's attitudes toward water quantity and quality.
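The abstract does not specify the estimator, but one common way to analyze such a panel is a within (fixed-effects) regression that sweeps out unobserved city-level heterogeneity; the variables, synthetic data, and coefficients below are assumptions for illustration only:

```python
import numpy as np

# Synthetic panel: 5 cities observed over 8 years (not the Taiyuan data).
rng = np.random.default_rng(1)
cities, years = 5, 8
price = rng.uniform(1.0, 3.0, (cities, years))      # hypothetical water price
income = rng.uniform(10.0, 30.0, (cities, years))   # hypothetical income
city_effect = rng.normal(0.0, 2.0, (cities, 1))     # unobserved city heterogeneity
use = (20.0 - 1.5 * price + 0.3 * income + city_effect
       + rng.normal(0.0, 0.1, (cities, years)))     # residential water use

def within(x):
    # Demean within each city to remove the fixed effect.
    return (x - x.mean(axis=1, keepdims=True)).ravel()

X = np.column_stack([within(price), within(income)])
y = within(use)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
# beta recovers roughly (-1.5, 0.3) despite the city-level effects
```

The within transformation is what lets panel data separate the determinants of water use from persistent differences between cities that a pure cross-section cannot control for.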