Offline policy evaluation, i.e. evaluating and selecting complex decision-making policies using only offline datasets, is important in reinforcement learning. At present, model-based offline policy evaluation (MBOPE) is widely adopted because it is easy to implement and performs well. MBOPE directly approximates the unknown value of a given policy using the Monte Carlo method, given the estimated transition and reward functions of the environment. Usually, multiple models are trained, and then one of them is selected for use. However, a challenge remains in selecting an appropriate model from those trained for further use. The authors first analyse the upper bound of the difference between the approximated value and the unknown true value. Theoretical results show that this difference is related to the trajectories generated by the given policy on the learnt model and to the prediction error of the transition and reward functions at these generated data points. Based on the theoretical results, a new criterion is proposed to indicate which trained model is better suited to evaluating the given policy. Finally, the effectiveness of the proposed criterion is demonstrated on both benchmark and synthetic offline datasets.
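The Monte Carlo approximation at the core of MBOPE can be sketched as follows; the `policy` and `model` callables, the rollout count, and the discount value are illustrative placeholders, not the authors' implementation.

```python
import random

def mc_policy_value(policy, model, start_states, gamma=0.99, horizon=50, n_rollouts=100):
    """Approximate the value of `policy` by Monte Carlo rollouts on a learnt model.

    `model(s, a)` returns (next_state, reward) and stands in for the estimated
    transition and reward functions; `policy(s)` returns an action.
    """
    total = 0.0
    for _ in range(n_rollouts):
        s = random.choice(start_states)       # sample an initial state
        ret, discount = 0.0, 1.0
        for _ in range(horizon):
            a = policy(s)
            s, r = model(s, a)                # one step on the learnt model
            ret += discount * r
            discount *= gamma
        total += ret
    return total / n_rollouts                 # average discounted return
```

When the learnt model is inaccurate in the regions the policy visits, the rollout returns drift from the true value, which is exactly the error the authors' upper bound ties to the model's prediction error on policy-generated trajectories.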
Proteolysis-targeting chimeras (PROTACs) represent a promising class of drugs that can target disease-causing proteins more effectively than traditional small-molecule inhibitors can, potentially revolutionizing drug discovery and treatment strategies. However, the links between in vitro and in vivo data are poorly understood, hindering a comprehensive understanding of the absorption, distribution, metabolism, and excretion (ADME) of PROTACs. In this work, 14C-labeled vepdegestrant (ARV-471), which is currently in phase III clinical trials for breast cancer, was synthesized as a model PROTAC to characterize its preclinical ADME properties and simulate its clinical pharmacokinetics (PK) by establishing a physiologically based pharmacokinetics (PBPK) model. For in vitro–in vivo extrapolation (IVIVE), hepatocyte clearance correlated more closely with in vivo rat PK data than liver microsomal clearance did. The PBPK models, initially developed and validated in rats, accurately simulated ARV-471's PK across fed and fasted states, with parameters within 1.75-fold of the observed values. Human models, informed by in vitro ADME data, closely mirrored post-oral-dose plasma profiles at 30 mg. Furthermore, no human-specific metabolites were identified in vitro, and the rat metabolic profile overlapped that of humans. This work presents a roadmap for developing future PROTAC medications by elucidating the correlation between in vitro and in vivo characteristics.
A new series of benzothiazole Schiff bases 3–29 was synthesized and screened for antitumor activity against cervical cancer (HeLa) and kidney fibroblast cancer (COS-7) cell lines. Results indicated that compounds 3, 14, 19, 27 and 28 have promising activity against the HeLa cell line, with IC50 values of 2.41, 3.06, 6.46, 2.22 and 6.25 mmol/L, respectively, in comparison to doxorubicin as a reference antitumor agent (IC50 2.05 mmol/L). In addition, compound 3 displayed excellent activity against the COS-7 cell line, with an IC50 value of 4.31 mmol/L in comparison to doxorubicin (IC50 3.04 mmol/L). In the present work, structure-based pharmacophore mapping, molecular docking, protein-ligand interaction, fingerprint and binding energy calculations were employed in a virtual screening strategy to identify the interaction between the compounds and the active site of the putative target, EGFR tyrosine kinase. Molecular properties, toxicity, drug-likeness, and drug score profiles of compounds 3, 14, 19, 27, 28 and 29 were also assessed.
Amplitudes have been found to be a function of incident angle and offset. Hence, data required to test for amplitude variation with angle or offset need to have their amplitudes preserved for all offsets rather than stacked. Amplitude Variation with Offset (AVO)/Amplitude Variation with Angle (AVA) analysis is necessary to account for the information carried in the offset/angle parameter (mode-converted S-wave and P-wave velocities). Since amplitudes are a function of the converted S- and P-waves, it is important to investigate the dependence of amplitudes on the elastic (P- and S-wave) parameters from the seismic data. By modelling these effects for different reservoir fluids via fluid substitution, various AVO geobody classes present along the well and in the entire seismic cube can be observed. AVO analysis was performed on one test well (Well_1) and 3D pre-stack angle gathers from the Tano Basin. The analysis involves creating a synthetic model to infer the effect of offset scaling techniques on amplitude responses in the Tano Basin, as compared to the effect of unscaled seismic data. A spectral balancing process was performed to match the amplitude spectra of all angle stacks to that of the mid (26°) stack on the test lines. The process primarily affected the far (34°-40°) stacks, whose frequency content slightly increased to match that of the near and mid stacks. In the offset scaling process, the root mean square (RMS) amplitude comparison between the synthetic and seismic data suggests that the amplitude of the far traces should be reduced relative to the near traces by up to 16%. However, the exact scalar values depend on the time window considered. This suggests that the amplitude scaling with offset delivered from seismic processing is only approximately correct and needs to be checked against well synthetics and adjusted accordingly prior to use in AVO studies. The AVO attribute volumes generated resolved anomalies better on spectrally balanced and offset-scaled data than on data delivered from conventional processing. A typical class II AVO anomaly is seen along the test well from the cross-plot analysis and AVO attribute cube, which indicates an oil-filled reservoir.
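The RMS amplitude comparison behind the offset scaling can be sketched as a ratio of per-window RMS values between synthetic and seismic traces; the function names and toy traces are illustrative, not the study's workflow.

```python
import math

def rms(trace):
    """Root mean square amplitude of a trace window."""
    return math.sqrt(sum(x * x for x in trace) / len(trace))

def offset_scalar(synthetic_trace, seismic_trace):
    """Scalar that matches the seismic trace's RMS amplitude to the synthetic's.

    A value below 1 means the seismic amplitudes should be reduced,
    e.g. 0.84 corresponds to a 16% reduction of the far traces.
    """
    return rms(synthetic_trace) / rms(seismic_trace)
```

Because the scalar is computed over a specific time window, its value changes with the window chosen, consistent with the caveat in the abstract.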
With the aid of a multi-agent based modeling approach to complex systems, hierarchical simulation models of carrier-based aircraft catapult launch are developed. Ocean, carrier, aircraft, and atmosphere are treated as aggregation agents; detailed components such as the catapult, landing gears, and disturbances are considered meta-agents belonging to their aggregation agent. Thus, a two-layer model is formed: the aggregation-agent layer and the meta-agent layer. The information communication among all agents is described. Meta-agents within one aggregation agent communicate with each other directly through information sharing, but meta-agents belonging to different aggregation agents exchange their information through the aggregation layer first and then perceive it from the shared environment, that is, the aggregation agent. Thus, not only is the hierarchical model built, but the environment perceived by each agent is also specified. Meanwhile, the problem of balancing agent independence against the resource consumption brought by real-time communication within a multi-agent system (MAS) is resolved. Each agent involved in carrier-based aircraft catapult launch is depicted, considering the interaction within the disturbed atmospheric environment and the multiple motion bodies, including carrier, aircraft, and landing gears. The models of the reactive agents among them are derived based on tensors, and the perceived messages and inner frameworks of each agent are characterized. Finally, results of a simulation instance are given. Simulation and modeling of dynamic systems based on a multi-agent system helps express physical concepts and logical hierarchy clearly and precisely. The system model can easily incorporate other kinds of agents to achieve a precise simulation of a more complex system. This modeling technique decomposes the complex integral dynamic equations of multibodies into parallel operations of single agents, and makes the program code convenient to expand, maintain, and reuse.
Software defect prediction plays a critical role in software development and quality assurance processes. Effective defect prediction enables testers to accurately prioritize testing efforts and enhance defect detection efficiency. Additionally, this technology provides developers with a means to quickly identify errors, thereby improving software robustness and overall quality. However, current research in software defect prediction often faces challenges, such as relying on a single data source or failing to adequately account for the characteristics of multiple coexisting data sources. This approach may overlook the differences and potential value of various data sources, affecting the accuracy and generalization performance of prediction results. To address this issue, this study proposes a multivariate heterogeneous hybrid deep learning algorithm for defect prediction (DP-MHHDL). Initially, Abstract Syntax Tree (AST), Code Dependency Network (CDN), and static code quality metrics are extracted from source code files and used as inputs to ensure data diversity. Subsequently, for the three types of heterogeneous data, the study employs a graph convolutional network optimization model based on adjacency and spatial topologies, a Convolutional Neural Network-Bidirectional Long Short-Term Memory (CNN-BiLSTM) hybrid neural network model, and a TabNet model to extract data features. These features are then concatenated and processed through a fully connected neural network for defect prediction. Finally, the proposed framework is evaluated on ten PROMISE defect repository projects, and performance is assessed with three metrics: F1, area under the curve (AUC), and Matthews correlation coefficient (MCC). The experimental results demonstrate that the proposed algorithm outperforms existing methods, offering a novel solution for software defect prediction.
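The fusion stage, which concatenates the three branch feature vectors before a fully connected classifier, can be sketched with a single unit; the feature vectors, weights, and bias here are hypothetical stand-ins for the trained network, not DP-MHHDL's actual parameters.

```python
import math

def fuse_and_score(ast_feats, cdn_feats, metric_feats, weights, bias):
    """Concatenate features from the three extractors and apply one
    fully connected unit with a sigmoid, yielding a defect probability.

    The extractors themselves (GCN, CNN-BiLSTM, TabNet) are assumed to
    have already produced fixed-length vectors.
    """
    fused = ast_feats + cdn_feats + metric_feats     # simple concatenation
    z = sum(w * x for w, x in zip(weights, fused)) + bias
    return 1.0 / (1.0 + math.exp(-z))                # probability in (0, 1)
```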
Understanding migratory waterfowl spatiotemporal distributions is important because, in addition to their economic and cultural value, wild waterfowl can be infectious reservoirs of highly pathogenic avian influenza virus (HPAIV). Waterfowl migration has been implicated in regional and intercontinental HPAIV dispersal, and the ability to predict where and when HPAIV may be introduced to susceptible spillover hosts would facilitate biosecurity and mitigation efforts. To develop forecasts for HPAIV dispersal, an improved understanding of how individual birds interact with their environment and move at a landscape scale is required. Using an agent-based modeling approach, we integrated individual-scale energetics, species-specific morphology and behavior, and landscape-scale weather and habitat data in a mechanistic stochastic framework to simulate Mallard (Anas platyrhynchos) and Northern Pintail (Anas acuta) annual migration across the Northern Hemisphere. Our model recreated biologically realistic migratory patterns using a first-principles approach to waterfowl ecology, behavior, and physiology. Conducting a limited structural sensitivity analysis comparing reduced models to eBird Status and Trends in reference to the full model, we identified density dependence as the main factor influencing spring migration and breeding distributions, and wind as the main factor influencing fall migration and overwintering distributions. We show evidence of weather patterns in Northeast Asia causing significant intercontinental pintail migration to North America. By linking individual energetics to landscape-scale processes, we identify key drivers of waterfowl migration while developing a predictive model responsive to daily weather patterns. This model paves the way for future waterfowl migration research predicting HPAIV transmission, climate change impacts, and oil spill effects.
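One daily departure decision in such an agent-based framework can be sketched as below, assuming (hypothetically) that departure probability rises with fat reserves and tailwind support; the threshold and wind-gain parameters are invented for illustration and are not the paper's calibrated values.

```python
import random

def migrate_step(energy, tailwind, energy_threshold=0.6, wind_gain=0.3, rng=random.random):
    """One daily departure decision for a simulated bird.

    `energy` and `tailwind` are normalized to [0, 1]; departure becomes
    more likely with larger energy reserves and stronger tailwind.
    Returns True if the agent departs today.
    """
    p_depart = min(1.0, max(0.0, energy - energy_threshold + wind_gain * tailwind))
    return rng() < p_depart
```

Linking the energy term to landscape-scale habitat and the wind term to daily weather fields is what lets such a model respond to events like the Northeast Asian weather patterns that drove intercontinental pintail movement.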
Understanding the fracture behavior of rocks subjected to temperature, while accounting for the rock's texture, is vital for safe and efficient design. Prior studies have often focused on isolated aspects of rock fracture behavior, neglecting the combined influence of grain size and temperature. This study employs specimens based on the particle flow code grain-based model to scrutinize the influence of temperature and grain size discrepancies on the fracture characteristics of sandstone. In pursuit of this goal, we manufactured ninety-six semi-circular bend specimens with grain sizes spanning from 0.5 mm to 1.5 mm, predicated on the mineral composition of sandstone. Recognizing the significance of intra-granular and inter-granular fractures, the grains were considered deformable and susceptible to breakage. The numerical model was calibrated using the results of uniaxial compressive strength (UCS) and Brazilian tests. We implemented thermo-mechanical coupled analysis to simulate mode Ⅰ, mode Ⅱ, and mixed mode (Ⅰ-Ⅱ) fracture toughness tests and subsequently studied alterations in the fracture behavior of sandstone at temperatures from 25 ℃ to 700 ℃. Our findings revealed increased fracture toughness as the temperature escalated from 25 ℃ to 200 ℃. However, beyond the threshold of 200 ℃, we noted a decline in fracture toughness. More specifically, the drop in mode Ⅰ fracture toughness was more pronounced in specimens with finer grains than in those with coarser grains. Contrarily, the trend was reversed for mode Ⅱ fracture toughness. In contrast, the reduction of mixed mode (Ⅰ-Ⅱ) fracture toughness was almost linear across all grain sizes. Furthermore, we identified a correlation between temperature and grain size and their collective impact on crack propagation patterns. Comparing our results with established theoretical benchmarks, we confirmed that both temperature and grain size variations influence the fracture envelopes of sandstone.
Efficiently executing inference tasks of deep neural networks on devices with limited resources imposes a significant load in IoT systems. To alleviate this load, one innovative method is branching, which adds extra layers with classification exits to a pre-trained model, enabling inputs with high-confidence predictions to exit early and thus reducing inference cost. However, branching networks, not originally tailored for IoT environments, are susceptible to noisy and out-of-distribution (OOD) data, and they demand additional training for optimal performance. The authors introduce BrevisNet, a novel branching methodology for creating on-device branching models that are both resource-adaptive and noise-robust for IoT applications. The method leverages the refined uncertainty estimation capabilities of Dirichlet distributions for classification predictions, combined with the superior OOD detection of energy-based models. The authors propose a unique training approach and thresholding technique that enhances the precision of branch predictions, offering robustness against noise and OOD inputs. The findings demonstrate that BrevisNet surpasses existing branching techniques in training efficiency, accuracy, overall performance, and robustness.
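Early exit in a branching model reduces to a confidence test at each attached classifier. This sketch uses a plain max-softmax threshold for clarity; BrevisNet itself replaces this with Dirichlet-based uncertainty and energy scores, which the simple rule below does not capture.

```python
def early_exit_predict(branch_outputs, threshold=0.8):
    """Run branch classifiers in order; return at the first branch whose
    top probability clears the confidence threshold.

    `branch_outputs` is a list of per-branch probability vectors, standing
    in for the intermediate classifiers attached to a pre-trained backbone.
    Returns (predicted_class, exit_depth).
    """
    for depth, probs in enumerate(branch_outputs):
        confidence = max(probs)
        if confidence >= threshold:
            return probs.index(confidence), depth
    # no branch was confident enough: fall through to the final classifier
    last = branch_outputs[-1]
    return last.index(max(last)), len(branch_outputs) - 1
```

The inference saving comes from easy inputs exiting at shallow depths, so only hard (or noisy/OOD) inputs pay for the full network, which is why the quality of the confidence measure matters so much on IoT data.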
Decision support systems (DSS) based on physically based numerical models are standard tools used by water services and utilities. However, few DSS based on holistic approaches combining distributed hydrological, hydraulic, and hydrogeological models are operationally exploited. This holistic approach was adopted for the development of the AquaVar DSS, used for water resource management in the French Mediterranean Var watershed. The year 2019 marked the initial use of the DSS in its operational environment. Over the next five years, multiple hydrological events allowed the performance of the DSS to be tested. The results show that the tool is capable of simulating the peak flows associated with two extreme rainfall events (storms Alex and Aline). For a moderate flood, the real-time functionality was able to simulate forecast discharges 26 h before the flood peak, with a maximum local error of 30%. Finally, simulations for the 2022-2023 drought period highlighted the essential need for DSS to evolve in line with changing climatic conditions, which give rise to unprecedented hydrological processes. The lessons learned from these first five years of AquaVar use under operational conditions are synthesized, addressing topics such as DSS modularity, evolution, data positioning, technology, and governance.
Power transmission lines are a critical component of the entire power system, and ice accretion incidents on various types of power systems can result in immeasurable harm. Currently, network models used for ice detection on power transmission lines require a substantial amount of sample data to support their training, and their detection accuracy is significantly affected by inaccurate annotations in the training dataset. We therefore propose a transformer-based detection model, structured in two stages, to collectively address the impact of inaccurate datasets on model training. In the first stage, a spatial similarity enhancement (SSE) module is designed to leverage spatial information to enhance the construction of the detection framework, thereby improving the accuracy of the detector. In the second stage, a target similarity enhancement (TSE) module is introduced to enhance object-related features, reducing the impact of inaccurate data on model training and thereby expanding global correlation. Additionally, by incorporating a multi-head adaptive attention window (MAAW), spatial information is combined with category information to achieve information interaction. Simultaneously, a quasi-wavelet structure, compatible with deep learning, is employed to highlight subtle features at different scales. Experimental results indicate that the proposed model outperforms existing mainstream detection models, demonstrating superior performance and stability.
The rise in construction activities within mountainous regions has significantly increased the frequency of rockfalls. Statistical models for rockfall hazard assessment often struggle to achieve high precision at a large scale. This limitation arises primarily from the scarcity of historical rockfall data and the inadequacy of conventional assessment indicators in capturing the physical and structural characteristics of rockfalls. This study proposes a physically based deterministic model designed to accurately quantify rockfall hazards at a large scale. The model accounts for multiple rockfall failure modes and incorporates the key physical and structural parameters of the rock mass. Rockfall hazard is defined as the product of three factors: the rockfall failure probability, the probability of reaching a specific position, and the corresponding impact intensity. The failure probability includes the probabilities of formation and instability of rock blocks under different failure modes, modeled on the combination patterns of slope surfaces and rock discontinuities. The Monte Carlo method is employed to account for the randomness of mechanical and geometric parameters when quantifying instability probabilities. Additionally, the rock trajectories and impact energies simulated using the Flow-R software are combined with the rockfall failure probability to enable regional rockfall hazard zoning. A case study was conducted in Tiefeng, Chongqing, China, considering four types of rockfall failure modes. Hazard zoning results identified the steep and elevated terrains of the northern and southern anaclinal slopes as the areas of highest rockfall hazard. These findings align with observed conditions, providing detailed hazard zoning and validating the effectiveness and potential of the proposed model.
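The three-factor hazard product and the Monte Carlo treatment of parameter randomness can be sketched as below. The planar-sliding criterion (block unstable when discontinuity dip exceeds the friction angle) and all distribution parameters are simplified illustrations, not the study's calibrated inputs.

```python
import random

def sliding_failure_probability(n_samples=10000, seed=1,
                                friction_deg=(30.0, 5.0), dip_deg=(35.0, 3.0)):
    """Monte Carlo estimate of a planar-sliding instability probability.

    Friction angle and discontinuity dip are drawn from normal
    distributions given as (mean, std) in degrees; the values here are
    illustrative only.
    """
    rng = random.Random(seed)
    failures = sum(
        1 for _ in range(n_samples)
        if rng.gauss(*dip_deg) > rng.gauss(*friction_deg)
    )
    return failures / n_samples

def rockfall_hazard(p_failure, p_reach, intensity):
    """Hazard as the product of the three factors defined in the text."""
    return p_failure * p_reach * intensity
```

In the full model, `p_reach` and `intensity` would come from the trajectory and impact-energy simulations (Flow-R in the study), evaluated per map cell to produce the hazard zoning.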
Rapid increases in carbon dioxide (CO₂) levels could trigger unpredictable climate change. The assessment of spatiotemporal variation and influencing factors of CO₂ concentration is helpful in understanding the source/sink balance and supporting the formulation of climate policy. In this study, Greenhouse Gases Observing Satellite (GOSAT) data were used to explore the variability of CO₂ concentrations in China from 2009 to 2020. Meteorological parameters, vegetation cover, and anthropogenic activities were combined to explain the increase in CO₂ concentration, using pixel-based correlations and covariance-based structural equation modeling (CB-SEM) analysis. The results showed that the influence of vertical CO₂ transport diminished with altitude, with a distinct inter-annual increase in CO₂ concentrations at 17 vertical levels. Spatially, the highest values were observed in East China, whereas the lowest were observed in Northwest China. There were significant seasonal variations in CO₂ concentration, with maximum and minimum values in spring (April) and summer (August), respectively. According to the pixel-based correlation analysis, the near-surface CO₂ concentration was positively correlated with population (r = 0.99, P < 0.001), leaf area index (LAI, r = 0.95, P < 0.001), emissions (r = 0.91, P < 0.001), temperature (r = 0.60, P < 0.05), precipitation (r = 0.34, P > 0.05), soil water (r = 0.29, P > 0.05), and nightlight (r = 0.28, P > 0.05); and negatively correlated with wind speed (r = −0.58, P < 0.05). CB-SEM analysis revealed that LAI was the most important controlling factor explaining CO₂ concentration variation (total effect of 0.66), followed by emissions (0.58), temperature (0.45), precipitation (0.30), wind speed (−0.28), and soil water (−0.07). The model explained 93% of the increase in CO₂ concentration. Our results provide crucial information on the patterns of CO₂ concentrations and their driving mechanisms, which are particularly significant in the context of climate change.
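The pixel-based correlation analysis rests on the Pearson coefficient, which for two aligned series can be computed directly; this is a generic sketch, not the study's processing chain.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series,
    as used per pixel to relate CO2 concentration to each driver
    (population, LAI, emissions, wind speed, ...)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

Applied to every pixel's time series, this yields the r values reported above; the associated P values additionally require a significance test against the series length.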
In this article, an approach for economic performance assessment of model predictive control (MPC) systems is presented. The method builds on steady-state economic optimization techniques and uses the linear quadratic Gaussian (LQG) benchmark, rather than conventional minimum variance control (MVC), to estimate the potential for variance reduction. The LQG benchmark is more practical than MVC for performance assessment since it considers both input and output variance, and it thus provides a sound basis for determining the theoretical maximum economic benefit attainable through variability reduction. By combining the LQG benchmark directly with the benefit potential of the MPC system, both the economic benefit and the optimal operating condition can be obtained by solving the economic optimization problem. The proposed algorithm is illustrated by a simulated example as well as an application to economic performance assessment of an industrial model predictive control system.
Mathematical hydraulic system models established from physical laws usually suffer from complex modelling processes and from the low reliability and practicality caused by large uncertainties. Instead, a novel modelling method for the highly nonlinear system of a hydraulic excavator is presented. Based on data collected in experiments driving the excavator's arms, a data-based excavator dynamic model is established using Simplified Refined Instrumental Variable (SRIV) identification and estimation algorithms. The validity of the proposed data-based model is demonstrated indirectly by the performance of computer simulations and real-machine motion control experiments.
This paper presents a state-of-the-art review of modeling approaches for hardware-in-the-loop simulation (HILS) realization of electric machine drives using commercial real-time machines. HILS implementation using digital signal processors (DSPs) and field programmable gate arrays (FPGAs) for electric machine drives has been investigated, but those methods have drawbacks such as complexity in development and verification. Among the various HILS implementation approaches, more efficient development and verification for electric machine drives can be achieved through the use of commercial real-time machines. Besides the implementation of the HILS itself, accurate modeling of the control target system plays an important role; modeling trends in electric machine drives for HILS implementation therefore need to be reviewed. This paper provides a background on HILS and commercially available real-time machines, and the characteristics of each real-time machine are introduced. Recent trends and progress in permanent magnet synchronous machine (PMSM) modeling are also presented to support more accurate HILS implementation approaches.
In the manufacturing of thin-wall components for the aerospace industry, apart from the side-wall contour error, the Remaining Bottom Thickness Error (RBTE) of a thin-wall pocket component (e.g. a rocket shell) is of the same importance but is overlooked in current research. If the RBTE is reduced by 30%, the weight reduction of the entire component can reach tens of kilograms while improving the dynamic balance performance of the large component. Current RBTE control requires off-process measurement of limited discrete points on the component bottom to provide the reference value for compensation. This leads to incomplete control of the remaining bottom thickness and redundant measurement in manufacturing. In this paper, a framework of data-driven physics-based modeling is proposed and developed for real-time prediction of critical quality measures of large components, which enables accurate prediction and compensation of the RBTE value for thin-wall components. The physics-based model considers the primary root cause of RBTE formation, namely the variation in Axial Material Removal Thickness (AMRT) induced by tool deflection and clamping stiffness. To incorporate the dynamic and inherent coupling of the complicated manufacturing system, multi-feature fusion and machine learning algorithms, i.e. kernel Principal Component Analysis (kPCA) and kernel Support Vector Regression (kSVR), are incorporated with the physics-based model. The proposed data-driven physics-based model therefore combines the process mechanism with the system disturbances to achieve better prediction accuracy. A final verification experiment validates the effectiveness of the proposed method for dimensional accuracy prediction in pocket milling; the prediction accuracy of AMRT reaches 0.014 mm and 0.019 mm for straight and corner milling, respectively.
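Both the kPCA and kSVR stages operate on a kernel evaluation between feature vectors; a minimal RBF Gram-matrix sketch (with an illustrative `gamma`, not the paper's tuned value) is:

```python
import math

def rbf_kernel_matrix(X, gamma=0.5):
    """Gram matrix K[i][j] = exp(-gamma * ||x_i - x_j||^2) for a list of
    feature vectors X, the kernel form shared by kernel PCA (feature
    fusion) and kernel SVR (regression on the fused features)."""
    def sq_dist(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return [[math.exp(-gamma * sq_dist(a, b)) for b in X] for a in X]
```

kPCA then eigendecomposes a centered version of this matrix to extract nonlinear components, and kSVR fits a regressor in the same implicit feature space; both steps would typically be delegated to a library in practice.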
Climate change can escalate rainfall intensity and cause a further increase in sediment transport in arid lands, which in turn can adversely affect water quality. Hence, there is a strong need to predict the fate of sediments in order to provide measures for sound erosion control and water quality management. The presence of micro-topography on hillslopes influences the processes of runoff generation and erosion, which should be taken into account to achieve more accurate modelling results. This study presents a physically based mathematical model for erosion and sediment transport coupled to one-dimensional overland flow equations that simulate rainfall-runoff generation in the rill and interrill areas of a bare hillslope. Modelling at such a fine resolution, considering the flow connection between interrill areas and rills, is rarely verified. The developed model was applied to data gathered from an experimental setup in which a 650 cm × 136 cm erosion flume was pre-formed with a longitudinal rill and an interrill of plane geometry, and was equipped with a rainfall simulator that reproduces natural rainfall characteristics. The flume can be given both longitudinal and lateral slopes. For calibration and validation, the model was applied to the experimental results obtained from the flume setup with 5% lateral and 10% longitudinal slopes under rainfall intensities of 105 and 45 mm/h, respectively. Calibration showed that the model produced good results based on the R2 (0.84) and NSE (0.80) values. The model performance was further tested through validation, which also produced good statistics (R2 = 0.83, NSE = 0.72). Results in terms of the sedigraphs, cumulative mass curves and performance statistics suggest that the model can be a useful and important step towards verifying and improving mathematical models of erosion and sediment transport.
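The NSE statistic used alongside R2 for calibration and validation can be computed directly from the observed and simulated series:

```python
def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 minus the ratio of the model's squared
    error to the variance of the observations. 1 is a perfect fit; 0 means
    the model is no better than predicting the observed mean."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / var
```

Values of 0.80 (calibration) and 0.72 (validation), as reported above, indicate the simulated sedigraphs track the observations substantially better than the observed mean would.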
To improve the agility, dynamics, composability, reusability, and development efficiency restricted by monolithic federation object model (FOM), a modular FOM is proposed by high level architecture (HLA) evolved p...To improve the agility, dynamics, composability, reusability, and development efficiency restricted by monolithic federation object model (FOM), a modular FOM is proposed by high level architecture (HLA) evolved product development group. This paper reviews the state-of-the-art of HLA evolved modular FOM. In particular, related concepts, the overall impact on HLA standards, extension principles, and merging processes are discussed. Also permitted and restricted combinations, and merging rules are provided, and the influence on HLA interface specification is given. The comparison between modular FOM and base object model (BOM) is performed to illustrate the importance of their combination. The applications of modular FOM are summarized. Finally, the significance to facilitate compoable simulation both in academia and practice is presented and future directions are pointed out.展开更多
Abstract: Offline policy evaluation, the task of evaluating and selecting complex decision-making policies using only offline datasets, is important in reinforcement learning. At present, model-based offline policy evaluation (MBOPE) is widely adopted because it is easy to implement and performs well. MBOPE directly approximates the unknown value of a given policy by the Monte Carlo method, given estimated transition and reward functions of the environment. Usually, multiple models are trained, and one of them is then selected for use. However, a challenge remains in selecting an appropriate model from those trained. The authors first analyse the upper bound of the difference between the approximated value and the unknown true value. Theoretical results show that this difference is related to the trajectories generated by the given policy on the learnt model and to the prediction error of the transition and reward functions at the generated data points. Based on these theoretical results, a new criterion is proposed to indicate which trained model is better suited to evaluating the given policy. Finally, the effectiveness of the proposed criterion is demonstrated on both benchmark and synthetic offline datasets.
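As a concrete illustration of the Monte Carlo approximation that MBOPE performs, the sketch below rolls a policy out on learnt transition and reward functions and averages the discounted returns. All names (`mc_policy_value`, the toy environment in the usage note) are hypothetical illustrations, not the authors' code.

```python
import random

def mc_policy_value(policy, trans_model, reward_model, start_states,
                    gamma=0.99, horizon=50, n_rollouts=100, seed=0):
    """Estimate the value of `policy` by Monte Carlo rollouts on a
    *learnt* model (estimated transition and reward functions)."""
    rng = random.Random(seed)
    returns = []
    for _ in range(n_rollouts):
        s = rng.choice(start_states)
        g, discount = 0.0, 1.0
        for _ in range(horizon):
            a = policy(s)
            g += discount * reward_model(s, a)   # accumulate discounted reward
            discount *= gamma
            s = trans_model(s, a, rng)           # step the learnt dynamics
        returns.append(g)
    return sum(returns) / len(returns)
```

For a deterministic toy model with constant reward 1 and gamma = 0.5 over 3 steps, the estimate is exactly 1 + 0.5 + 0.25 = 1.75, matching the closed-form discounted sum.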
Funding: supported by the National Natural Science Foundation of China (Grant Nos. 82373938, 82104275, and 82204585), the Key Technologies R&D Program of Guangdong Province, China (Grant No. 2023B1111030004), and the National Key R&D Program of China (Grant No. 2022YFF1202600).
Abstract: Proteolysis-targeting chimeras (PROTACs) represent a promising class of drugs that can target disease-causing proteins more effectively than traditional small-molecule inhibitors can, potentially revolutionizing drug discovery and treatment strategies. However, the links between in vitro and in vivo data are poorly understood, hindering a comprehensive understanding of the absorption, distribution, metabolism, and excretion (ADME) of PROTACs. In this work, 14C-labeled vepdegestrant (ARV-471), currently in phase III clinical trials for breast cancer, was synthesized as a model PROTAC to characterize its preclinical ADME properties and to simulate its clinical pharmacokinetics (PK) by establishing a physiologically based pharmacokinetic (PBPK) model. For in vitro–in vivo extrapolation (IVIVE), hepatocyte clearance correlated more closely with in vivo rat PK data than liver microsomal clearance did. The PBPK models, initially developed and validated in rats, accurately simulated ARV-471's PK across fed and fasted states, with parameters within 1.75-fold of the observed values. Human models, informed by in vitro ADME data, closely mirrored plasma profiles after a 30 mg oral dose. Furthermore, no human-specific metabolites were identified in vitro, and the metabolic profile of rats overlapped with that of humans. This work presents a roadmap for developing future PROTAC medications by elucidating the correlation between in vitro and in vivo characteristics.
Abstract: A new series of benzothiazole Schiff bases 3–29 was synthesized and screened for antitumor activity against cervical cancer (HeLa) and kidney fibroblast cancer (COS-7) cell lines. The results indicated that compounds 3, 14, 19, 27, and 28 have promising activity against the HeLa cell line, with IC50 values of 2.41, 3.06, 6.46, 2.22, and 6.25 mmol/L, respectively, in comparison with doxorubicin as the reference antitumor agent (IC50 2.05 mmol/L). In addition, compound 3 displayed excellent activity against the COS-7 cell line, with an IC50 value of 4.31 mmol/L compared with doxorubicin (IC50 3.04 mmol/L). In the present work, structure-based pharmacophore mapping, molecular docking, protein-ligand interaction, fingerprints, and binding energy calculations were employed in a virtual screening strategy to identify the interaction between the compounds and the active site of the putative target, EGFR tyrosine kinase. The molecular properties, toxicity, drug-likeness, and drug score profiles of compounds 3, 14, 19, 27, 28, and 29 were also assessed.
Abstract: Amplitudes have been found to be a function of incident angle and offset. Hence, data used to test for amplitude variation with angle or offset must have the amplitudes at all offsets preserved, not stacked. Amplitude Variation with Offset (AVO)/Amplitude Variation with Angle (AVA) analysis is necessary to account for the information carried in the offset/angle parameter (mode-converted S-wave and P-wave velocities). Since amplitudes are a function of the converted S- and P-waves, it is important to investigate the dependence of amplitudes on the elastic (P- and S-wave) parameters in the seismic data. By modelling these effects for different reservoir fluids via fluid substitution, the AVO geobody classes present along the well and in the entire seismic cube can be observed. AVO analysis was performed on one test well (Well_1) and on 3D pre-stack angle gathers from the Tano Basin. The analysis involved creating a synthetic model to infer the effect of offset-scaling techniques on amplitude responses in the Tano Basin, compared with the effect of unscaled seismic data. Spectral balancing was performed to match the amplitude spectra of all angle stacks to that of the mid (26°) stack on the test lines. The process primarily affected the far (34°–40°) stacks, whose frequency content increased slightly to match that of the near and mid stacks. In the offset-scaling process, the root mean square (RMS) amplitude comparison between the synthetic and the seismic data suggested that the amplitude of the far traces should be reduced relative to the near traces by up to 16%. However, the exact scalar values depend on the time window considered. This suggests that the amplitude scaling with offset delivered from seismic processing is only approximately correct and needs to be checked against well synthetics and adjusted accordingly before use in AVO studies. The AVO attribute volumes generated resolved anomalies better on spectrally balanced and offset-scaled data than on data delivered from conventional processing. A typical class II AVO anomaly is seen along the test well in the cross-plot analysis and the AVO attribute cube, indicating an oil-filled reservoir.
Funding: Aeronautical Science Foundation of China (2006ZA51004).
Abstract: With the aid of a multi-agent modeling approach to complex systems, hierarchical simulation models of carrier-based aircraft catapult launch are developed. Ocean, carrier, aircraft, and atmosphere are treated as aggregation agents, while detailed components such as the catapult, landing gears, and disturbances are considered meta-agents belonging to their aggregation agents. A two-layer model is thus formed: the aggregation-agent layer and the meta-agent layer. The information communication among all agents is described. Meta-agents within one aggregation agent communicate with each other directly by information sharing, whereas meta-agents belonging to different aggregation agents first exchange their information through the aggregation layer and then perceive it from the shared environment, i.e., the aggregation agent. In this way, not only is the hierarchical model built, but the environment perceived by each agent is also specified, and the problem of balancing agent independence against the resource consumption of real-time communication within a multi-agent system (MAS) is resolved. Each agent involved in carrier-based aircraft catapult launch is described, considering the interaction within the disturbed atmospheric environment and among multiple moving bodies, including the carrier, aircraft, and landing gears. The models of the reactive agents are derived based on tensors, and the perceived messages and inner frameworks of each agent are characterized. Finally, results of a simulation instance are given. Simulation and modeling of dynamic systems based on a multi-agent system help express physical concepts and logical hierarchy clearly and precisely. The system model can easily incorporate other kinds of agents to achieve precise simulation of more complex systems. This modeling technique decomposes the complex integrated dynamic equations of multiple bodies into parallel operations of single agents, and it makes the program code convenient to expand, maintain, and reuse.
Abstract: Software defect prediction plays a critical role in software development and quality assurance. Effective defect prediction enables testers to prioritize testing efforts accurately and enhances defect detection efficiency; it also provides developers with a means to identify errors quickly, thereby improving software robustness and overall quality. However, current research in software defect prediction often relies on a single data source or fails to adequately account for the characteristics of multiple coexisting data sources. This may overlook the differences and potential value of the various sources, affecting the accuracy and generalization performance of prediction results. To address this issue, this study proposes a multivariate heterogeneous hybrid deep learning algorithm for defect prediction (DP-MHHDL). First, Abstract Syntax Trees (AST), Code Dependency Networks (CDN), and static code quality metrics are extracted from source code files and used as inputs to ensure data diversity. Then, for the three types of heterogeneous data, the study employs a graph convolutional network optimization model based on adjacency and spatial topologies, a Convolutional Neural Network-Bidirectional Long Short-Term Memory (CNN-BiLSTM) hybrid neural network, and a TabNet model to extract features. These features are concatenated and passed through a fully connected neural network for defect prediction. Finally, the proposed framework is evaluated on ten PROMISE defect repository projects, and performance is assessed with three metrics: F1, Area Under the Curve (AUC), and Matthews Correlation Coefficient (MCC). The experimental results demonstrate that the proposed algorithm outperforms existing methods, offering a novel solution for software defect prediction.
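The final fusion step, concatenating the heterogeneous feature vectors and scoring them with a fully connected network, can be sketched minimally as follows. The single logistic unit stands in for the paper's deeper network, and all names and shapes are illustrative assumptions, not DP-MHHDL's actual implementation.

```python
import math

def fuse_and_predict(ast_feat, cdn_feat, metric_feat, weights, bias):
    """Concatenate three heterogeneous feature vectors (AST, CDN, and
    static-metric features) and score defect probability with one
    logistic unit, a minimal stand-in for a fully connected network."""
    x = ast_feat + cdn_feat + metric_feat            # feature concatenation
    logit = sum(w * v for w, v in zip(weights, x)) + bias
    return 1.0 / (1.0 + math.exp(-logit))            # sigmoid -> probability
```

With all-zero features and weights the output is exactly 0.5, the sigmoid of a zero logit, which is a convenient sanity check on the fusion wiring.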
Abstract: Understanding the spatiotemporal distributions of migratory waterfowl is important because, in addition to their economic and cultural value, wild waterfowl can be infectious reservoirs of highly pathogenic avian influenza virus (HPAIV). Waterfowl migration has been implicated in regional and intercontinental HPAIV dispersal, and the ability to predict where and when HPAIV may be introduced to susceptible spillover hosts would facilitate biosecurity and mitigation efforts. Developing forecasts of HPAIV dispersal requires an improved understanding of how individual birds interact with their environment and move at a landscape scale. Using an agent-based modeling approach, we integrated individual-scale energetics, species-specific morphology and behavior, and landscape-scale weather and habitat data in a mechanistic stochastic framework to simulate the annual migration of Mallard (Anas platyrhynchos) and Northern Pintail (Anas acuta) across the Northern Hemisphere. Our model recreated biologically realistic migratory patterns using a first-principles approach to waterfowl ecology, behavior, and physiology. Conducting a limited structural sensitivity analysis comparing reduced models against eBird Status and Trends with reference to the full model, we identified density dependence as the main factor influencing spring migration and breeding distributions, and wind as the main factor influencing fall migration and overwintering distributions. We show evidence of weather patterns in Northeast Asia causing significant intercontinental pintail migration to North America. By linking individual energetics to landscape-scale processes, we identify key drivers of waterfowl migration while developing a predictive model responsive to daily weather patterns. This model paves the way for future waterfowl migration research predicting HPAIV transmission, climate change impacts, and oil spill effects.
Abstract: Understanding the fracture behavior of rocks subjected to temperature while accounting for the rock's texture is vital for safe and efficient design. Prior studies have often focused on isolated aspects of rock fracture behavior, neglecting the combined influence of grain size and temperature. This study employs specimens based on the particle flow code grain-based model to scrutinize the influence of temperature and grain size discrepancies on the fracture characteristics of sandstone. To this end, we manufactured ninety-six semi-circular bend specimens with grain sizes spanning 0.5 mm to 1.5 mm, based on the mineral composition of sandstone. Recognizing the significance of intra-granular and inter-granular fractures, the grains were treated as deformable and susceptible to breakage. The numerical model was calibrated using the results of uniaxial compressive strength (UCS) and Brazilian tests. We implemented thermo-mechanically coupled analysis to simulate mode Ⅰ, mode Ⅱ, and mixed-mode (Ⅰ-Ⅱ) fracture toughness tests and subsequently studied alterations in the fracture behavior of sandstone at temperatures from 25℃ to 700℃. Our findings revealed increasing fracture toughness as the temperature rose from 25℃ to 200℃; beyond the 200℃ threshold, however, fracture toughness declined. More specifically, the drop in mode Ⅰ fracture toughness was more pronounced in specimens with finer grains than in those with coarser grains, whereas the trend was reversed for mode Ⅱ fracture toughness. In contrast, the reduction in mixed-mode (Ⅰ-Ⅱ) fracture toughness was almost linear across all grain sizes. Furthermore, we identified a correlation between temperature and grain size and their collective impact on crack propagation patterns. Comparing our results with established theoretical benchmarks, we confirmed that both temperature and grain size variations influence the fracture envelopes of sandstone.
Funding: Australian Research Council, Grant/Award Numbers: DE200101465, DP240101108.
Abstract: Efficiently executing inference tasks of deep neural networks on devices with limited resources places a significant load on IoT systems. One innovative method to alleviate this load is branching, which adds extra layers with classification exits to a pre-trained model, enabling inputs with high-confidence predictions to exit early and thus reducing inference cost. However, branching networks, not originally tailored for IoT environments, are susceptible to noisy and out-of-distribution (OOD) data, and they demand additional training for optimal performance. The authors introduce BrevisNet, a novel branching methodology for creating on-device branching models that are both resource-adaptive and noise-robust for IoT applications. The method leverages the refined uncertainty estimation capabilities of Dirichlet distributions for classification predictions, combined with the superior OOD detection of energy-based models. The authors propose a unique training approach and thresholding technique that enhance the precision of branch predictions, offering robustness against noise and OOD inputs. The findings demonstrate that BrevisNet surpasses existing branching techniques in training efficiency, accuracy, overall performance, and robustness.
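A plausible form of a branch-exit rule combining prediction confidence with an energy-based OOD score, the general recipe the abstract describes rather than BrevisNet's exact thresholding, is sketched below; function name and thresholds are illustrative assumptions.

```python
import math

def should_exit(logits, conf_thresh=0.9, energy_thresh=0.0):
    """Early-exit rule for one branch: exit only if the prediction is
    confident AND the energy score does not flag the input as OOD."""
    m = max(logits)
    # numerically stable log-sum-exp of the logits
    lse = m + math.log(sum(math.exp(z - m) for z in logits))
    probs = [math.exp(z - lse) for z in logits]
    energy = -lse                       # low energy ~ in-distribution
    return max(probs) >= conf_thresh and energy <= energy_thresh
```

A sharply peaked logit vector exits early, while a flat (uncertain) one is passed on to deeper layers; an OOD input with uniformly high energy would likewise be retained.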
Abstract: Decision support systems (DSS) based on physically based numerical models are standard tools for water services and utilities. However, few operational DSS take a holistic approach combining distributed hydrological, hydraulic, and hydrogeological models. This holistic approach was adopted for the development of the AquaVar DSS, used for water resource management in the French Mediterranean Var watershed. The year 2019 marked the first use of the DSS in its operational environment, and over the next five years multiple hydrological events allowed its performance to be tested. The results show that the tool is capable of simulating the peak flows associated with two extreme rainfall events (storms Alex and Aline). For a moderate flood, the real-time functionality simulated forecast discharges 26 h before the flood peak, with a maximum local error of 30%. Finally, simulations for the 2022-2023 drought period highlighted the essential need for DSS to evolve in line with changing climatic conditions, which give rise to unprecedented hydrological processes. The lessons learned from these first five years of AquaVar use under operational conditions are synthesized, addressing topics such as DSS modularity, evolution, data positioning, technology, and governance.
Abstract: Power transmission lines are a critical component of the power system, and ice accretion incidents in various types of power systems can result in immeasurable harm. Network models currently used for ice detection on power transmission lines require a substantial amount of sample data for training, and their detection accuracy is significantly affected by inaccurate annotation in the training dataset. We therefore propose a transformer-based detection model, structured in two stages, to collectively address the impact of inaccurate datasets on model training. In the first stage, a spatial similarity enhancement (SSE) module is designed to leverage spatial information in building the detection framework, thereby improving the accuracy of the detector. In the second stage, a target similarity enhancement (TSE) module is introduced to enhance object-related features and reduce the impact of inaccurate data on model training, thereby expanding global correlation. Additionally, a multi-head adaptive attention window (MAAW) combines spatial information with category information to achieve information interaction, while a quasi-wavelet structure compatible with deep learning highlights subtle features at different scales. Experimental results indicate that the proposed model outperforms existing mainstream detection models, demonstrating superior performance and stability.
Funding: supported by the National Natural Science Foundation of China (Grant Nos. 42172318 and 42377186) and the National Key R&D Program of China (Grant No. 2023YFC3007201).
Abstract: The rise in construction activities within mountainous regions has significantly increased the frequency of rockfalls. Statistical models for rockfall hazard assessment often struggle to achieve high precision at a large scale. This limitation arises primarily from the scarcity of historical rockfall data and the inadequacy of conventional assessment indicators in capturing the physical and structural characteristics of rockfalls. This study proposes a physically based deterministic model designed to accurately quantify rockfall hazards at a large scale. The model accounts for multiple rockfall failure modes and incorporates the key physical and structural parameters of the rock mass. Rockfall hazard is defined as the product of three factors: the rockfall failure probability, the probability of reaching a specific position, and the corresponding impact intensity. The failure probability includes the probabilities of formation and instability of rock blocks under different failure modes, modeled from the combination patterns of slope surfaces and rock discontinuities. The Monte Carlo method is employed to account for the randomness of mechanical and geometric parameters when quantifying instability probabilities. Additionally, the rock trajectories and impact energies simulated using the Flow-R software are combined with the rockfall failure probability to enable regional rockfall hazard zoning. A case study was conducted in Tiefeng, Chongqing, China, considering four rockfall failure modes. The hazard zoning results identified the steep and elevated terrain of the northern and southern anaclinal slopes as the areas of highest rockfall hazard. These findings align with observed conditions, providing detailed hazard zoning and validating the effectiveness and potential of the proposed model.
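The hazard definition above, the product of failure probability, reach probability, and impact intensity, together with a Monte Carlo estimate of instability probability, can be sketched as follows. The planar-sliding factor of safety and all parameter values are simplified stand-ins for the paper's four failure modes, not its actual formulation.

```python
import math
import random

def instability_probability(slope_deg, phi_mean, phi_sd, n=10000, seed=1):
    """Monte Carlo estimate of P(FoS < 1) for simple planar sliding,
    FoS = tan(phi) / tan(slope), with a normally distributed friction
    angle phi (a stand-in for the randomness of rock-mass parameters)."""
    rng = random.Random(seed)
    t_slope = math.tan(math.radians(slope_deg))
    unstable = sum(
        1 for _ in range(n)
        if math.tan(math.radians(rng.gauss(phi_mean, phi_sd))) / t_slope < 1.0
    )
    return unstable / n

def rockfall_hazard(p_formation, p_instability, p_reach, intensity):
    # hazard = failure probability x reach probability x impact intensity
    return p_formation * p_instability * p_reach * intensity
```

When the mean friction angle equals the slope angle, roughly half the sampled blocks are unstable, so the estimated instability probability converges to about 0.5 as the sample count grows.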
Funding: Under the auspices of the National Natural Science Foundation of China (Nos. 41871193 and U1910207) and the Program for the Philosophy and Social Science of Shanxi Province (No. 2023YJ107).
Abstract: Rapid increases in carbon dioxide (CO₂) levels could trigger unpredictable climate change. Assessing the spatiotemporal variation and influencing factors of CO₂ concentration helps in understanding the source/sink balance and supports the formulation of climate policy. In this study, Greenhouse Gases Observing Satellite (GOSAT) data were used to explore the variability of CO₂ concentrations in China from 2009 to 2020. Meteorological parameters, vegetation cover, and anthropogenic activities were combined to explain the increase in CO₂ concentration, using pixel-based correlations and Covariance-Based Structural Equation Modeling (CB-SEM). The results showed that the influence of vertical CO₂ transport diminished with altitude, with a distinct inter-annual increase in CO₂ concentrations at 17 vertical levels. Spatially, the highest values were observed in East China and the lowest in Northwest China. There were significant seasonal variations in CO₂ concentration, with maximum and minimum values in spring (April) and summer (August), respectively. According to the pixel-based correlation analysis, near-surface CO₂ concentration was positively correlated with population (r = 0.99, P < 0.001), Leaf Area Index (LAI, r = 0.95, P < 0.001), emissions (r = 0.91, P < 0.001), temperature (r = 0.60, P < 0.05), precipitation (r = 0.34, P > 0.05), soil water (r = 0.29, P > 0.05), and nightlight (r = 0.28, P > 0.05), and negatively correlated with wind speed (r = −0.58, P < 0.05). CB-SEM analysis revealed that LAI was the most important controlling factor explaining CO₂ concentration variation (total effect of 0.66), followed by emissions (0.58), temperature (0.45), precipitation (0.30), wind speed (−0.28), and soil water (−0.07). The model explained 93% of the increase in CO₂ concentration. Our results provide crucial information on the patterns of CO₂ concentrations and their driving mechanisms, which is particularly significant in the context of climate change.
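The pixel-based correlation underlying the r values above is an ordinary Pearson correlation between the CO₂ time series and a driver series (e.g. LAI or temperature) at each pixel. A minimal implementation:

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series,
    as computed per pixel between CO2 concentration and a driver."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))   # covariance term
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5
```

Perfectly linear series give r = 1 (or −1 for a negative slope), matching the sign convention used for the driver correlations reported in the abstract.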
Funding: Supported by the National Creative Research Groups Science Foundation of China (60421002) and the National Basic Research Program of China (2007CB714000).
Abstract: In this article, an approach for assessing the economic performance of model predictive control (MPC) systems is presented. The method builds on steady-state economic optimization techniques and uses the linear quadratic Gaussian (LQG) benchmark, rather than conventional minimum variance control (MVC), to estimate the potential for variance reduction. The LQG benchmark is more practical than MVC for performance assessment because it considers both input variance and output variance, and it thus provides a sound basis for determining the theoretical maximum economic benefit attainable through variability reduction. By combining the LQG benchmark directly with the benefit potential of the MPC system, both the economic benefit and the optimal operating condition can be obtained by solving the economic optimization problem. The proposed algorithm is illustrated by a simulated example and by application to the economic performance assessment of an industrial model predictive control system.
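The economic benefit of variance reduction is often argued via constraint back-off: if the output standard deviation shrinks towards the benchmark level, the setpoint can be moved closer to a constraint by the recovered back-off. A hedged sketch of that arithmetic follows; it is an illustrative back-off argument, not the paper's exact optimization formulation, and all names and the z-factor are assumptions.

```python
def benefit_potential(sigma_actual, sigma_benchmark, margin_per_unit, z=2.0):
    """Economic benefit from moving the setpoint closer to a constraint
    when output variance shrinks to the (e.g. LQG) benchmark level:
    the required back-off is z * sigma, so the recoverable setpoint
    shift is z * (sigma_actual - sigma_benchmark)."""
    shift = z * (sigma_actual - sigma_benchmark)
    return max(shift, 0.0) * margin_per_unit   # no benefit if already better
```

For example, reducing the standard deviation from 3.0 to the benchmark 1.0 with a margin of 10 per unit of setpoint shift yields a benefit of 2 × 2 × 10 = 40.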
Abstract: Instead of establishing mathematical hydraulic system models from physical laws, which usually entails complex modelling processes and suffers low reliability and practicality caused by large uncertainties, a novel modelling method for a highly nonlinear hydraulic excavator system is presented. Based on data collected in experiments driving the excavator's arms, a data-based excavator dynamic model using Simplified Refined Instrumental Variable (SRIV) identification and estimation algorithms is established. The validity of the proposed data-based model is indirectly demonstrated by the performance of computer simulations and real-machine motion control experiments.
Funding: supported in part by the National Research Foundation of Korea (NRF) grant funded by the Korea government (No. 2020R1C1C1013260), and in part by an INHA UNIVERSITY research grant.
Abstract: This paper presents a state-of-the-art review of modeling approaches for hardware-in-the-loop simulation (HILS) of electric machine drives using commercial real-time machines. HILS implementation using digital signal processors (DSPs) and field programmable gate arrays (FPGAs) has been investigated for electric machine drives, but those methods have drawbacks such as complexity in development and verification. Among the various HILS implementation approaches, commercial real-time machines enable more efficient development and verification for electric machine drives. In addition to the HILS implementation itself, accurate modeling of the control target system plays an important role, so modeling trends in electric machine drives for HILS implementation also need to be reviewed. This paper provides the background of HILS and of commercially available real-time machines, and the characteristics of each real-time machine are introduced. Recent trends and progress in permanent magnet synchronous machine (PMSM) modeling are also presented to support more accurate HILS implementation.
Funding: the Science and Technology Major Project of China (Nos. 2019ZX04020001-004 and 2017ZX04007001).
Abstract: In the manufacturing of thin-wall components for the aerospace industry, apart from the side wall contour error, the Remaining Bottom Thickness Error (RBTE) of thin-wall pocket components (e.g., rocket shells) is equally important but overlooked in current research. If the RBTE is reduced by 30%, the weight of the entire component can be reduced by up to tens of kilograms while its dynamic balance performance improves. Current RBTE control requires off-process measurement of a limited number of discrete points on the component bottom to provide reference values for compensation. This leads to incomplete control of the remaining bottom thickness and to redundant measurement in manufacturing. In this paper, a framework for a data-driven, physics-based model is proposed and developed for real-time prediction of critical quality attributes of large components, enabling accurate prediction and compensation of the RBTE for thin-wall components. The physics-based model considers the primary root cause of RBTE formation, namely the variation in Axial Material Removal Thickness (AMRT) induced by tool deflection and clamping stiffness. To incorporate the dynamic and inherent coupling of the complicated manufacturing system, multi-feature fusion and machine learning algorithms, i.e., kernel Principal Component Analysis (kPCA) and kernel Support Vector Regression (kSVR), are integrated with the physics-based model. The proposed data-driven, physics-based model therefore combines the process mechanism with system disturbances to achieve better prediction accuracy. A final verification experiment validates the effectiveness of the proposed method for dimensional accuracy prediction in pocket milling; the prediction accuracy of AMRT reaches 0.014 mm and 0.019 mm for straight and corner milling, respectively.
Funding: This study was based on the international project "Development of a Hillslope-scale Sediment Transport Model", bilaterally supported by the National Research Foundation of Korea (NRF-2007-614-D00036, NRF-2008-614-D00018, NRF-2011013-D00124, and NRF-2013R1A1A4A01007676) and TUBITAK (The Scientific and Technological Research Council of Turkey, 108Y250). It was supported in part by a grant (13CRTI-B052117-01) from the Regional Technology Innovation Program and another grant from the Advanced Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean Government, a 2011–2012 grant from the Geum-River Environment Research Center, National Institute of Environmental Research, Korea, and a Korea University Grant.
Abstract: Climate change can escalate rainfall intensity and cause a further increase in sediment transport in arid lands, which in turn can adversely affect water quality. Hence, there is a strong need to predict the fate of sediments in order to provide measures for sound erosion control and water quality management. The presence of micro-topography on hillslopes influences the processes of runoff generation and erosion, which should be taken into account to achieve more accurate modelling results. This study presents a physically based mathematical model for erosion and sediment transport coupled to one-dimensional overland flow equations that simulate rainfall-runoff generation on the rill and interrill areas of a bare hillslope. Modelling at such a fine resolution, considering the flow connection between interrill areas and rills, has rarely been verified. The developed model was applied to a set of data gathered from an experimental setup in which a 650 cm × 136 cm erosion flume was pre-formed with a longitudinal rill and an interrill of plane geometry and was equipped with a rainfall simulator that reproduces natural rainfall characteristics. The flume can be given both longitudinal and lateral slopes. For calibration and validation, the model was applied to the experimental results obtained with the flume set at a 5% lateral and 10% longitudinal slope under rainfall intensities of 105 and 45 mm/h, respectively. Calibration showed that the model produced good results based on the R² (0.84) and NSE (0.80) values. Model performance was further confirmed through validation, which also produced good statistics (R² = 0.83, NSE = 0.72). Results in terms of the sedigraphs, cumulative mass curves, and performance statistics suggest that the model can be a useful and important step towards verifying and improving mathematical models of erosion and sediment transport.
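The NSE statistic used for calibration and validation of hydrological models like this one is defined as one minus the ratio of the squared model error to the variance of the observations; a minimal implementation:

```python
def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / total variance of the
    observations. 1.0 is a perfect fit; 0.0 means the model is no
    better than predicting the observed mean."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    sst = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / sst
```

A simulation identical to the observations scores exactly 1.0, and a constant simulation equal to the observed mean scores exactly 0.0, which is why NSE values near 0.8, as reported above, indicate a good fit.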
Funding: supported by the National Natural Science Foundation of China (60674069 and 60574056).
Abstract: To improve the agility, dynamics, composability, reusability, and development efficiency restricted by the monolithic federation object model (FOM), a modular FOM has been proposed by the high level architecture (HLA) evolved product development group. This paper reviews the state of the art of the HLA evolved modular FOM. In particular, related concepts, the overall impact on the HLA standards, extension principles, and merging processes are discussed. Permitted and restricted combinations and merging rules are also provided, and the influence on the HLA interface specification is given. A comparison between the modular FOM and the base object model (BOM) illustrates the importance of their combination. Applications of the modular FOM are summarized. Finally, its significance in facilitating composable simulation in both academia and practice is presented, and future directions are pointed out.