Abstract: The fracture volume changes gradually as fracture pressure depletes during production. However, few flowback models are available so far that can estimate fracture volume loss using pressure-transient and rate-transient data. The initial flowback involves producing back the fracturing fluid after hydraulic fracturing, while the second flowback involves producing back the preloading fluid injected into the parent wells before fracturing of the child wells. The main objective of this research is to compare the initial and second flowback data to capture the changes in fracture volume after the production and preload processes. Such a comparison is useful for evaluating well performance and optimizing fracturing operations. We construct rate-normalized pressure (RNP) versus material balance time (MBT) diagnostic plots using both initial and second flowback data (FB1 and FB2, respectively) of six multi-fractured horizontal wells completed in the Niobrara and Codell formations in the DJ Basin. In general, the slope of the RNP plot during the FB2 period is higher than that during the FB1 period, indicating a potential loss of fracture volume from the FB1 to the FB2 period. We estimate the changes in effective fracture volume (Vef) by analyzing the changes in the RNP slope and total compressibility between these two flowback periods. Vef during FB2 is in general 3%-45% lower than that during FB1. We also compare the drive mechanisms for the two flowback periods by calculating the compaction-drive index (CDI), hydrocarbon-drive index (HDI), and water-drive index (WDI). The dominant drive mechanism during both flowback periods is CDI, but its contribution is reduced by 16% in the FB2 period. This drop is generally compensated by a relatively higher HDI during this period. The loss of effective fracture volume might be attributed to pressure depletion in the fractures, which occurs during the production period and can extend over 800 days.
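The RNP/MBT diagnostic described above can be sketched numerically. The helpers below are a minimal illustration, not the authors' implementation: the function names, the trapezoidal cumulation of rate, and the use of a single initial pressure `p_init` are assumptions. The motivation is that, during fracture-storage-dominated flow, the RNP-vs-MBT slope scales roughly as 1/(c_t·Vef), so comparing slopes (and total compressibilities) between the two flowback periods yields the change in effective fracture volume.

```python
import numpy as np

def rnp_mbt(t_days, q, p_wf, p_init):
    """Rate-normalized pressure and material-balance time from flowback data.

    RNP = (p_init - p_wf) / q ;  MBT = Q / q, where Q is the cumulative
    produced volume (trapezoidal cumulation of rate -- an assumption here).
    """
    q = np.asarray(q, float)
    # cumulative volume via trapezoid rule, starting from zero
    Q = np.concatenate(([0.0], np.cumsum(0.5 * (q[1:] + q[:-1]) * np.diff(t_days))))
    rnp = (p_init - np.asarray(p_wf, float)) / q
    mbt = np.where(q > 0, Q / q, np.nan)
    return rnp, mbt

def rnp_slope(rnp, mbt):
    """Least-squares slope of RNP vs MBT. During fracture-storage flow this
    slope is ~ 1/(c_t * Vef), so Vef ratios follow from slope ratios."""
    mask = np.isfinite(mbt)
    return np.polyfit(mbt[mask], rnp[mask], 1)[0]
```

Applying `rnp_slope` separately to the FB1 and FB2 datasets, together with the corresponding total compressibilities, would give the Vef ratio between the two flowback periods.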
Abstract: [Objective] The research aimed to briefly discuss the application of L-band sounding second-level data in artificial precipitation. [Method] The characteristics, acquisition, and display methods of L-band sounding second-level data were introduced briefly, and its application prospects in artificial precipitation operations were analyzed, with the aim of increasing its use in such operations. [Result] Compared with previous sounding data, L-band sounding second-level data showed a great improvement in spatio-temporal resolution and spatial positioning accuracy, with the precision reaching the one-second level. It can provide a high-precision data basis for the assimilation of the initial field of artificial precipitation numerical models and thereby improve the models. Moreover, the sounding products can provide an accurate scientific basis for selecting artificial precipitation operation tools, determining the operation height and range, guiding the operation, and improving operational efficiency. [Conclusion] The research provides an analysis and reference basis for the command of artificial precipitation operations.
Abstract: With the arrival of the "housing stock" era in first-tier cities, the second-hand housing market will become the dominant property market. This article performs an empirical cointegration analysis of second-hand housing prices and the new-home price index in first-tier cities. The analysis finds that, in China's first-tier city real estate markets, the new-home price index is a significant factor influencing the second-hand house price index. A Johansen test on the time series of the second-hand housing and new-home price indices for Beijing and Shanghai shows that a cointegration relationship exists between the two variables: new-home prices in the urban real estate market clearly guide prices on the secondary market. Therefore, real estate market regulation aimed at first-tier cities and the "housing stock" should take the second-hand housing market as the main direction, starting with the sale prices and influencing factors of new houses. At the same time, different cities should adhere to city-specific policies, reflect policy differentiation, promote reform of the real estate supply side, and promote the return of housing to its residential function.
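The abstract relies on the Johansen test; as a hedged, numpy-only sketch of the underlying idea, the example below illustrates the simpler Engle-Granger-style view of cointegration on synthetic data (the series, scale factors, and seed are invented for illustration, not the paper's data): two hypothetical price indices share a common stochastic trend, so each is individually nonstationary, yet the OLS residuals of one regressed on the other stay bounded.

```python
import numpy as np

def ols_residuals(y, x):
    """Regress y on x (with intercept) and return the residual series.
    If y and x are cointegrated, these residuals are stationary even though
    y and x individually follow stochastic trends."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

rng = np.random.default_rng(0)
trend = np.cumsum(rng.normal(size=2000))                 # shared random-walk "market" trend
new_home = trend + rng.normal(scale=0.5, size=2000)      # hypothetical new-home index
second_hand = 1.3 * trend + 2.0 + rng.normal(scale=0.5, size=2000)  # tracks the same trend
resid = ols_residuals(second_hand, new_home)
```

The residual series has a far smaller spread than either index, which is the signature a formal Johansen (or augmented Dickey-Fuller) test would confirm on real data.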
Funding: Project supported by the National Natural Science Foundation of China (Grant Nos. 61203147, 61374047, and 61403168).
Abstract: This paper investigates the consensus tracking problem of second-order multi-agent systems with a virtual leader via event-triggered control. A novel distributed event-triggered transmission scheme is proposed, which is examined intermittently at constant sampling instants. Only partial neighbor information and local measurements are required for event detection. The corresponding event-triggered consensus tracking protocol is then presented to guarantee that the second-order multi-agent systems achieve consensus tracking. Numerical simulations illustrate the effectiveness of the proposed strategy.
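The scheme above can be sketched in a minimal simulation. Everything specific in the code is an illustrative assumption rather than the paper's actual protocol: the gains, the ring communication graph, the trigger threshold, and the constant-velocity virtual leader. The key mechanism survives, though: each double-integrator agent re-broadcasts its state only when, checked at constant sampling instants, its deviation from the last broadcast exceeds a threshold, and the control law uses only held (broadcast) states.

```python
import numpy as np

def simulate(n=3, T=20.0, dt=0.002, h_steps=5, thresh=0.05, k1=4.0, k2=4.0, c=1.0):
    """Event-triggered consensus tracking for double-integrator agents
    (illustrative gains/graph/leader; returns final tracking error, event count)."""
    rng = np.random.default_rng(1)
    x = rng.uniform(-2, 2, n)
    v = rng.uniform(-1, 1, n)
    xh, vh = x.copy(), v.copy()            # last-broadcast (held) states
    A = np.roll(np.eye(n), 1, 0) + np.roll(np.eye(n), -1, 0)  # ring adjacency
    v0 = 0.5                                # virtual leader: x0(t) = v0 * t
    events = 0
    steps = int(T / dt)
    for k in range(steps):
        t = k * dt
        if k % h_steps == 0:                # trigger checked only at sampling instants
            err = np.abs(x - xh) + np.abs(v - vh)
            fire = err > thresh
            xh[fire], vh[fire] = x[fire], v[fire]
            events += int(fire.sum())
        x0 = v0 * t
        u = (-k1 * (xh - x0) - k2 * (vh - v0)      # leader-tracking feedback
             - c * (A.sum(1) * xh - A @ xh))       # graph-Laplacian consensus term
        x, v = x + dt * v, v + dt * u               # explicit Euler step
    return np.abs(x - v0 * T).max(), events
```

Because the trigger is only evaluated every `h_steps` samples and fires only on sufficiently large deviations, the broadcast count stays well below the number of sampling instants while tracking error converges to a small neighborhood set by the threshold.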
Abstract: This work presents a comprehensive second-order predictive modeling (PM) methodology based on the maximum entropy (MaxEnt) principle for obtaining best-estimate mean values and correlations for model responses and parameters. This methodology is designated by the acronym 2nd-BERRU-PMP, where the attribute "2nd" indicates that this methodology incorporates second-order uncertainties (means and covariances) and second- (and higher-) order sensitivities of computed model responses to model parameters. The acronym BERRU stands for "Best-Estimate Results with Reduced Uncertainties," and the last letter ("P") in the acronym indicates "probabilistic," referring to the MaxEnt probabilistic inclusion of the computational model responses. This is in contradistinction to the 2nd-BERRU-PMD methodology, which deterministically combines the computed model responses with the experimental information, as presented in the accompanying work (Part I). Although both the 2nd-BERRU-PMP and 2nd-BERRU-PMD methodologies yield expressions that include second- (and higher-) order sensitivities of responses to model parameters, the respective expressions for the predicted responses, for the calibrated predicted parameters, and for their predicted uncertainties (covariances) are not identical. Nevertheless, the results predicted by both methodologies encompass, as particular cases, the results produced by the extant data assimilation and data adjustment procedures, which rely on the minimization, in a least-squares sense, of a user-defined functional meant to represent the discrepancies between measured and computed model responses.
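The role of the MaxEnt principle with second-order information can be made concrete by a standard textbook result (not taken from the abstract above): among all densities with a prescribed mean vector and covariance matrix, the entropy-maximizing density is the multivariate Gaussian.

```latex
% MaxEnt density under first- and second-order moment constraints:
% maximize  H[p] = -\int p(\mathbf{z}) \ln p(\mathbf{z})\, d\mathbf{z}
% subject to \int p\, d\mathbf{z} = 1,\quad
%            \mathbb{E}[\mathbf{z}] = \bar{\mathbf{z}},\quad
%            \mathrm{Cov}[\mathbf{z}] = \mathbf{C}.
% The maximizer is the multivariate normal density:
p(\mathbf{z}) \;=\; (2\pi)^{-n/2}\,\lvert \mathbf{C} \rvert^{-1/2}
\exp\!\left[-\tfrac{1}{2}\,(\mathbf{z}-\bar{\mathbf{z}})^{\mathsf T}
\mathbf{C}^{-1}(\mathbf{z}-\bar{\mathbf{z}})\right].
```

This is why a "second-order" MaxEnt formulation works directly with means and covariances of parameters and responses.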
Abstract: This work presents a comprehensive second-order predictive modeling (PM) methodology designated by the acronym 2nd-BERRU-PMD. The attribute "2nd" indicates that this methodology incorporates second-order uncertainties (means and covariances) and second-order sensitivities of computed model responses to model parameters. The acronym BERRU stands for "Best-Estimate Results with Reduced Uncertainties," and the last letter ("D") in the acronym indicates "deterministic," referring to the deterministic inclusion of the computational model responses. The 2nd-BERRU-PMD methodology is fundamentally based on the maximum entropy (MaxEnt) principle. This principle is in contradistinction to the fundamental principle underlying the extant data assimilation and/or adjustment procedures, which minimize, in a least-squares sense, a subjective user-defined functional meant to represent the discrepancies between measured and computed model responses. It is shown that the 2nd-BERRU-PMD methodology generalizes and extends current data assimilation and/or data adjustment procedures while overcoming their fundamental limitations. In the accompanying work (Part II), the alternative framework for developing the "second-order MaxEnt predictive modeling methodology" is presented by incorporating the computed model responses probabilistically (as opposed to deterministically).
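For contrast, the "subjective user-defined functional" minimized by conventional data-adjustment procedures typically has the generalized least-squares form sketched below. The notation is assumed for illustration: α are the model parameters with nominal values α⁰ and covariance C_α, r_m are the measured responses with covariance C_r, and R(α) are the computed responses.

```latex
\min_{\boldsymbol{\alpha}} \; Q(\boldsymbol{\alpha}) \;=\;
(\boldsymbol{\alpha}-\boldsymbol{\alpha}^{0})^{\mathsf T}\,
\mathbf{C}_{\alpha}^{-1}\,
(\boldsymbol{\alpha}-\boldsymbol{\alpha}^{0})
\;+\;
\left[\mathbf{R}(\boldsymbol{\alpha})-\mathbf{r}_{m}\right]^{\mathsf T}
\mathbf{C}_{r}^{-1}
\left[\mathbf{R}(\boldsymbol{\alpha})-\mathbf{r}_{m}\right].
```

The MaxEnt-based formulation replaces this ad hoc minimization with moment constraints on a probability density, which is the sense in which it generalizes the least-squares procedures.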
Abstract: SI: Agentic AI for 6G Networks. Introduction. 6G networks are poised to provide full coverage across air, land, and sea, deliver terabit-per-second data rates, and achieve microsecond-level latency. They promise comprehensive upgrades across industries through embedded intelligence, ushering in an era of intelligent interconnection of all things. However, managing real-time interactions among devices, infrastructure, and services in 6G networks is much more complex than in previous generations. Massive data streams from terrestrial nodes (e.g., edge devices, sensors, distributed computing) and non-terrestrial nodes (LEO/MEO/GEO satellites) demand more intelligent and efficient processing.
Abstract: This work illustrates the innovative results obtained by applying the recently developed 2nd-order predictive modeling methodology called "2nd-BERRU-PM," where the acronym BERRU denotes "best-estimate results with reduced uncertainties" and "PM" denotes "predictive modeling." The physical system selected for this illustrative application is a polyethylene-reflected plutonium (acronym: PERP) OECD/NEA reactor physics benchmark. This benchmark is modeled using the neutron transport Boltzmann equation (involving 21,976 uncertain parameters), the solution of which is representative of "large-scale computations." The results obtained in this work confirm that the 2nd-BERRU-PM methodology predicts best-estimate results that fall between the corresponding computed and measured values, while reducing the predicted standard deviations of the results to values smaller than either the experimentally measured or the computed standard deviations. The results also indicate that 2nd-order response sensitivities must always be included to quantify the need for including (or not) the 3rd- and/or 4th-order sensitivities. When the parameters are known with high precision, the contributions of the higher-order sensitivities diminish with increasing order, so including the 1st- and 2nd-order sensitivities may suffice for obtaining accurate best-estimate response values and standard deviations. On the other hand, when the parameters' standard deviations are large enough to approach (or fall outside) the radius of convergence of the multivariate Taylor series that represents the response in the phase space of model parameters, the contributions stemming from the 3rd- and even 4th-order sensitivities are necessary to ensure consistency between the computed and measured response. In such cases, using only the 1st-order sensitivities erroneously indicates that the computed results are inconsistent with the respective measured response. Ongoing research aims at extending the 2nd-BERRU-PM methodology to fourth order, thus enabling the computation of third-order response correlations (skewness) and fourth-order response correlations (kurtosis).
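The multivariate Taylor series invoked above, with the sensitivities of each order appearing as its coefficients, can be written out explicitly (standard notation, assumed here: R is a response, α the parameter vector, α⁰ its nominal value, δα = α - α⁰):

```latex
R(\boldsymbol{\alpha}) \;=\; R(\boldsymbol{\alpha}^{0})
\;+\; \sum_{i}\frac{\partial R}{\partial \alpha_{i}}\,\delta\alpha_{i}
\;+\; \frac{1}{2!}\sum_{i,j}\frac{\partial^{2} R}{\partial \alpha_{i}\,\partial \alpha_{j}}\,
      \delta\alpha_{i}\,\delta\alpha_{j}
\;+\; \frac{1}{3!}\sum_{i,j,k}\frac{\partial^{3} R}{\partial \alpha_{i}\,\partial \alpha_{j}\,\partial \alpha_{k}}\,
      \delta\alpha_{i}\,\delta\alpha_{j}\,\delta\alpha_{k}
\;+\; \cdots
```

When the standard deviations of δα are large relative to the series' radius of convergence, the 3rd- and 4th-order terms are no longer negligible, which is exactly the regime in which first-order analysis misjudges the consistency between computed and measured responses.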