This paper addresses urban sustainability challenges amid global urbanization, emphasizing the need for innovative approaches aligned with the Sustainable Development Goals. While traditional tools and linear models offer insights, they fall short of presenting a holistic view of complex urban challenges. System dynamics (SD) models, which are often used to provide a holistic, systematic understanding of a research subject such as the urban system, emerge as valuable tools, but data scarcity and theoretical inadequacy pose challenges. The research reviews papers on SD model applications in urban sustainability published since 2018, categorizing them according to nine key indicators. Among the reviewed papers, data limitations and model assumptions were identified as major challenges in applying SD models to urban sustainability. This led to exploring the transformative potential of big data analytics, an approach this study finds to be rare in the field, to strengthen the empirical foundation of SD models. Integrating big data could provide data-driven calibration, potentially improving predictive accuracy and reducing reliance on simplified assumptions. The paper concludes by advocating for new approaches that reduce assumptions and promote real-time applicable models, contributing to a comprehensive understanding of urban sustainability through the synergy of big data and SD models.
In recent decades, control performance monitoring (CPM) has experienced remarkable progress in research and industrial applications. While CPM research has been investigated using various benchmarks, the historical data benchmark (HIS) has garnered the most attention due to its practicality and effectiveness. However, existing CPM reviews usually focus on the theoretical benchmark, and there is a lack of an in-depth review that thoroughly explores HIS-based methods. In this article, a comprehensive overview of HIS-based CPM is provided. First, we offer a novel static-dynamic perspective on the data-level manifestations of control performance underlying typical controller capacities (regulation and servo): static and dynamic properties. The static property portrays time-independent variability in the system output, and the dynamic property describes temporal behavior driven by closed-loop feedback. Accordingly, existing HIS-based CPM approaches and their intrinsic motivations are classified and analyzed from these two perspectives. Specifically, two mainstream solutions for CPM methods are summarized, static analysis and dynamic analysis, which match data-driven techniques with actual controlling behavior. Furthermore, this paper also points out various opportunities and challenges faced by CPM in modern industry and provides promising directions in the context of artificial intelligence to inspire future research.
In the context of the rapid development of digital education, the security of educational data has become an increasing concern. This paper explores strategies for the classification and grading of educational data and constructs a higher-education data security management and control model centered on the integration of medical and educational data. By implementing a multi-dimensional strategy of dynamic classification, real-time authorization, and secure execution according to educational data security levels, dynamic access control is applied to effectively enhance the security and controllability of educational data, providing a secure foundation for data sharing and openness.
Addressing the current challenges in transforming pixel displacement into physical displacement in visual monitoring technologies, as well as the inability to achieve precise full-field monitoring, this paper proposes a method for identifying the structural dynamic characteristics of wind turbines based on visual monitoring data fusion. Firstly, the Lucas-Kanade Tomasi (LKT) optical flow method and a multi-region of interest (ROI) monitoring structure are employed to track pixel displacements, which are subsequently subjected to band-pass filtering and resampling operations. Secondly, the actual displacement time history is derived through double integration of the acquired acceleration data and subsequent band-pass filtering. The scale factor is obtained by applying the least squares method to compare the visual displacement with the displacement derived from double integration of the acceleration data. On this basis, the multi-point displacement time histories in physical coordinates are obtained from the vision data and the scale factor. Subsequently, when visual monitoring of displacements becomes impossible due to issues such as image blurring or lens occlusion, the structural vibration equation and boundary condition constraints, among other key parameters, are employed to predict the displacements at unknown monitoring points, thereby enabling full-field displacement monitoring and dynamic characteristic testing of the structure. Finally, a small-scale shaking table test was conducted on a simulated wind turbine structure undergoing shutdown to validate the proposed method. The research results indicate that the proposed method achieves a time-domain error within the submillimeter range and a frequency-domain accuracy of over 99%, effectively monitoring the full-field structural dynamic characteristics of wind turbines and providing a basis for the condition assessment of wind turbine structures.
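As a rough illustration of the scale-factor step described in the abstract above, the following sketch estimates a pixel-to-metre factor by least squares between a synthetic optical-flow displacement and a displacement obtained from double integration of acceleration. The signals, sampling rate, and detrending shortcut are illustrative assumptions, not the paper's pipeline.

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid

fs = 200.0                                            # sampling rate (Hz), illustrative
t = np.arange(0, 10, 1 / fs)
true_disp = 0.002 * np.sin(2 * np.pi * 1.5 * t)       # metres
accel = np.gradient(np.gradient(true_disp, t), t)     # simulated accelerometer signal

# Double integration of acceleration to obtain displacement (band-pass filtering in practice)
vel = cumulative_trapezoid(accel, t, initial=0.0)
disp_from_accel = cumulative_trapezoid(vel, t, initial=0.0)
disp_from_accel -= np.polyval(np.polyfit(t, disp_from_accel, 2), t)  # crude drift removal

# Pixel displacement from vision (unknown scale, plus measurement noise)
pixel_disp = true_disp / 2.5e-4 + np.random.normal(0, 0.05, t.size)  # here 1 px = 0.25 mm

# Least-squares scale factor mapping pixels to metres
scale = np.linalg.lstsq(pixel_disp[:, None], disp_from_accel, rcond=None)[0][0]
physical_disp = scale * pixel_disp
print(f"estimated scale: {scale:.2e} m/px")
```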
Cloud computing has become an essential technology for the management and processing of large datasets, offering scalability, high availability, and fault tolerance. However, optimizing data replication across multiple data centers poses a significant challenge, especially when balancing competing goals such as latency, storage costs, energy consumption, and network efficiency. This study introduces a novel dynamic optimization algorithm called Dynamic Multi-Objective Gannet Optimization (DMGO), designed to enhance data replication efficiency in cloud environments. Unlike traditional static replication systems, DMGO adapts dynamically to variations in network conditions, system demand, and resource availability. The approach uses multi-objective optimization to balance data access latency, storage efficiency, and operational costs. DMGO continuously evaluates data center performance and adjusts replication strategies in real time to maintain optimal system efficiency. Experimental evaluations conducted in a simulated cloud environment demonstrate that DMGO significantly outperforms conventional static algorithms, achieving faster data access, lower storage overhead, reduced energy consumption, and improved scalability. The proposed methodology offers a robust and adaptable solution for modern cloud systems, ensuring efficient resource consumption while maintaining high performance.
Background Cardiovascular disease (CVD) remains a major global health challenge, particularly in aging populations. Using data from the China Health and Retirement Longitudinal Study (CHARLS), this study examines the dynamics of the triglyceride-glucose (TyG) index, a marker of insulin resistance, and its relationship with CVD in Chinese adults aged 45 and older. Methods This reanalysis utilized five waves of CHARLS data with multistage sampling. From 17,705 participants, 5,625 with TyG index and subsequent CVD data were included, after excluding those lacking 2011 and 2015 TyG data. The TyG index was derived from glucose and triglyceride levels, and CVD outcomes were obtained from self-reports and records. Participants were divided into four groups based on TyG changes from 2011 to 2015: low-low, low-high, high-low, and high-high. Results After adjusting for covariates, the stable-high group showed a significantly higher risk of incident CVD than the stable-low group, with an HR of 1.18 (95% CI: 1.03-1.36). Similarly, for stroke risk, the stable-high group had an HR of 1.45 (95% CI: 1.11-1.89). Survival curves indicated that individuals with stable high TyG levels had a significantly increased CVD risk compared with controls. The dynamic TyG change indicated a greater risk for CVD than abnormal glucose metabolism, notably for stroke. However, there was no statistically significant difference in the risk of incident heart disease alone between the stable-low and stable-high groups. Subgroup analyses underscored demographic disparities, with the stable-high group consistently showing elevated risks, particularly among individuals under 65 years, females, and those with higher education, lower BMI, or higher depression scores. Machine learning models, including random forest, XGBoost, CoxBoost, DeepSurv, and GBM, underscored the predictive superiority of dynamic TyG over abnormal glucose metabolism for CVD. Conclusions Dynamic TyG changes correlate with CVD risk. Monitoring these changes could help predict and manage cardiovascular health in middle-aged and older adults. Targeted interventions based on TyG index trends are crucial for reducing CVD risk in this population.
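The abstract above does not restate the index definition; the sketch below uses the commonly published TyG formula (an assumption here, since the CHARLS-specific processing may differ): TyG = ln[fasting triglycerides (mg/dL) × fasting glucose (mg/dL) / 2].

```python
import numpy as np

def tyg_index(triglycerides_mg_dl: float, glucose_mg_dl: float) -> float:
    """Triglyceride-glucose index, using the commonly published definition."""
    return float(np.log(triglycerides_mg_dl * glucose_mg_dl / 2.0))

print(tyg_index(150.0, 100.0))  # ~8.92 for these illustrative fasting values
```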
Dynamic data driven simulation (DDDS) is proposed to improve a model by incorporating real data from the practical system into the model. Instead of giving a static input, multiple possible sets of inputs are fed into the model, and the computational errors are corrected using statistical approaches. It involves a variety of aspects, including uncertainty modeling, measurement evaluation, coupling of the system model and the measurement model, computational complexity, and performance issues. The authors set up the architecture of DDDS for the wildfire spread model DEVS-FIRE, based on the discrete event system specification (DEVS) formalism. The experimental results show that the framework can track the dynamically changing fire front based on fire sensor data and thus provides more accurate predictions.
In this paper, the stability of iterative learning control with data dropouts is discussed. Using the super-vector formulation, an iterative learning control (ILC) system with data dropouts can be modeled as an asynchronous dynamical system with rate constraints on events in the iteration domain. The stability condition is provided in the form of linear matrix inequalities (LMIs), drawing on the stability theory of asynchronous dynamical systems. The analysis is supported by simulations.
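A toy sketch of the super-vector (lifted) ILC setting with random measurement dropouts follows; the plant, learning gain, and dropout model are illustrative assumptions, and the sketch does not reproduce the paper's LMI-based stability analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50                                    # trial length
# Lifted (super-vector) plant y = P u, built from a quickly decaying impulse response
h = 0.3 ** np.arange(N)
P = np.array([[h[i - j] if i >= j else 0.0 for j in range(N)] for i in range(N)])

y_ref = np.sin(np.linspace(0, 2 * np.pi, N))
u = np.zeros(N)
gamma = 0.8                               # P-type learning gain
p_drop = 0.3                              # probability a sample is lost in transmission

for k in range(150):                      # iteration domain
    y = P @ u
    e = y_ref - y
    received = rng.random(N) > p_drop     # dropped samples contribute no update
    u = u + gamma * np.where(received, e, 0.0)

print("final tracking RMSE:", np.sqrt(np.mean((y_ref - P @ u) ** 2)))
```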
In order to test the anti-interference ability of an Unmanned Aerial Vehicle (UAV) data link in a complex electromagnetic environment, a method for simulating the dynamic electromagnetic interference of an indoor wireless environment is proposed. This method can estimate the relational degree between the interference environment actually faced by a UAV data link and the simulation scenarios in an anechoic chamber by using Grey Relational Analysis (GRA) theory. The dynamic drive of the microwave instrument produces a corresponding interference signal in real time and realises scene mapping. The experimental results show that the maximal correlation between the interference signal in the real scene and the angular domain of the radiation antenna in the anechoic chamber is 0.9593. Further, the relational degree of the Signal-to-Interference Ratio (SIR) of the UAV at its reception terminal indoors and in the anechoic chamber is 0.9968, and the instrument drive time is only approximately 10 μs. All of the above illustrates that this method can closely reproduce the real-field dynamic electromagnetic interference signal of an indoor UAV data link.
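A minimal sketch of the Grey Relational Analysis step mentioned above, using Deng's grey relational degree on two synthetic signals standing in for the real-scene and anechoic-chamber interference records; the min-max preprocessing and distinguishing coefficient are common defaults, not necessarily those used in the paper.

```python
import numpy as np

def grey_relational_degree(reference, comparison, rho=0.5):
    """Deng's grey relational degree between two equal-length sequences.
    Sequences are min-max normalised first; rho is the distinguishing coefficient."""
    norm = lambda x: (x - x.min()) / (x.max() - x.min())
    ref = norm(np.asarray(reference, dtype=float))
    cmp_ = norm(np.asarray(comparison, dtype=float))
    delta = np.abs(ref - cmp_)
    coeff = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
    return coeff.mean()

# Illustrative signals standing in for real-scene vs. anechoic-chamber interference records
t = np.linspace(0, 1, 200)
real_scene = np.sin(2 * np.pi * 5 * t)
chamber = np.sin(2 * np.pi * 5 * t + 0.1) + 0.05 * np.random.default_rng(1).normal(size=t.size)
print(grey_relational_degree(real_scene, chamber))
```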
A number of proposals have been suggested to tackle data integrity and privacy concerns in cloud storage, and some existing schemes suffer from vulnerabilities in data dynamics. In this paper, we propose an improved fairness and dynamic provable data possession scheme that supports public verification and batch auditing while preserving data privacy. The rb23Tree is utilized to facilitate data dynamics. Moreover, fairness is considered to prevent a dishonest user from accusing the cloud service provider of manipulating the data. The scheme allows a third-party auditor (TPA) to verify data integrity without learning any information about the data content during the auditing process. Furthermore, our scheme also supports batch auditing, which greatly accelerates the auditing process when there are multiple auditing requests. Security analysis and extensive experimental evaluations show that our scheme is secure and efficient.
An outlier in one variable will smear the estimation of other measurements in data reconciliation (DR). In this article, a novel robust method is proposed for nonlinear dynamic data reconciliation to reduce the influence of outliers on the result of DR. The method introduces a penalty function matrix into a conventional least-squares objective function to assign small weights to outliers and large weights to normal measurements. To avoid the loss of data information, an element-wise Mahalanobis distance is proposed, as an improvement on the vector-wise distance, to construct the penalty function matrix. The correlation of measurement errors is also considered. The method brings robust statistical theory into the conventional least-squares estimator by constructing the penalty weight matrix, achieving not only good robustness but also simple calculation. Simulation of a continuous stirred tank reactor verifies the effectiveness of the proposed algorithm.
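The following is a simplified, steady-state illustration of the penalty-weight idea: a gross error is iteratively down-weighted via a Huber-style rule on standardized residuals. The flow network, thresholds, and weighting rule are illustrative stand-ins, not the paper's element-wise Mahalanobis construction for nonlinear dynamic systems.

```python
import numpy as np

# Reconcile flow measurements y subject to mass balances A x = 0, iteratively
# down-weighting suspected outliers with a penalty matrix.
A = np.array([[1.0, -1.0, 0.0, 0.0],       # node 1: stream 1 -> stream 2
              [0.0, 1.0, -1.0, -1.0]])     # node 2: stream 2 -> streams 3 and 4
x_true = np.array([10.0, 10.0, 4.0, 6.0])
y = x_true + np.array([0.05, -0.05, 2.0, 0.03])   # gross error on stream 3
sigma = np.full(4, 0.1)

W = np.diag(1.0 / sigma**2)                # ordinary weighted least squares to start
for _ in range(20):
    Winv = np.linalg.inv(W)
    lam = np.linalg.solve(A @ Winv @ A.T, A @ y)
    x_hat = y - Winv @ A.T @ lam           # equality-constrained WLS estimate
    r = np.abs(y - x_hat) / sigma          # standardized residuals
    penalty = np.where(r > 2.0, 2.0 / np.maximum(r, 1e-12), 1.0)  # small weights for outliers
    W = np.diag(penalty / sigma**2)

print(np.round(x_hat, 3))  # the outlier's pull on streams 1 and 2 is reduced vs. plain WLS
```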
Mapping crop distribution with remote sensing data is of great importance for agricultural production, food security, and agricultural sustainability. Winter rape is an important oil crop, which plays an important role in China's cooking oil market. The Jianghan Plain and Dongting Lake Plain (JPDLP) are major agricultural production areas in China. Essential changes in winter rape distribution have taken place in this area during the 21st century; however, the pattern of these changes remains unknown. In this study, the spatial and temporal dynamics of winter rape from 2000 to 2017 on the JPDLP were analyzed. An artificial neural network (ANN)-based classification method was proposed to map fractional winter rape distribution by fusing moderate resolution imaging spectrometer (MODIS) data and high-resolution imagery. The results are as follows: (1) The total winter rape acreage on the JPDLP dropped significantly, especially on the Jianghan Plain, with a decline of about 45% between 2000 and 2017. (2) Winter rape abundance keeps changing, with about 20-30% of croplands changing their abundance drastically between every two consecutive observation years. (3) Winter rape shows obvious regional differentiation in its change trend at the county level, and the decreasing trend was stronger in the traditionally dominant agricultural counties.
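A minimal sketch of an ANN regressor mapping per-pixel MODIS-like NDVI time series to fractional winter rape abundance appears below; the data are synthetic and the network size is arbitrary, so it only illustrates the kind of model described, not the paper's actual fusion workflow.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_pixels, n_dates = 2000, 23                      # e.g. 16-day MODIS NDVI composites
frac = rng.uniform(0, 1, n_pixels)                # "true" winter rape fraction per coarse pixel
season = np.sin(np.linspace(0, np.pi, n_dates))   # idealised rape NDVI profile
ndvi = 0.2 + 0.6 * np.outer(frac, season) + rng.normal(0, 0.03, (n_pixels, n_dates))

X_train, X_test, y_train, y_test = train_test_split(ndvi, frac, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X_train, y_train)                       # labels would come from high-res imagery
pred = np.clip(model.predict(X_test), 0, 1)
print("RMSE of fractional abundance:", np.sqrt(np.mean((pred - y_test) ** 2)))
```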
The efficacy of vegetation dynamics simulations in offline land surface models (LSMs) largely depends on the quality and spatial resolution of meteorological forcing data. In this study, the Princeton Global Meteorological Forcing Data (PMFD) and the high spatial resolution and upscaled China Meteorological Forcing Data (CMFD) were used to drive the Simplified Simple Biosphere model version 4/Top-down Representation of Interactive Foliage and Flora Including Dynamics (SSiB4/TRIFFID) and investigate how meteorological forcing datasets with different spatial resolutions affect simulations over the Tibetan Plateau (TP), a region with complex topography and sparse observations. By comparing the monthly Leaf Area Index (LAI) and Gross Primary Production (GPP) against observations, we found that SSiB4/TRIFFID driven by the upscaled CMFD improved the performance in simulating the spatial distributions of LAI and GPP over the TP, reducing RMSEs by 24.3% and 20.5%, respectively. The multi-year averaged GPP decreased from 364.68 gC m^(-2) yr^(-1) to 241.21 gC m^(-2) yr^(-1), with the percentage bias dropping from 50.2% to -1.7%. When using the high spatial resolution CMFD, the RMSEs of the spatial distributions of the LAI and GPP simulations were further reduced by 7.5% and 9.5%, respectively. This study highlights the importance of more realistic and high-resolution forcing data in simulating vegetation growth and carbon exchange between the atmosphere and biosphere over the TP.
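For reference, the two skill scores quoted above can be computed as in the sketch below; the numbers used are illustrative, not the paper's gridded LAI/GPP fields.

```python
import numpy as np

def rmse(sim, obs):
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return float(np.sqrt(np.mean((sim - obs) ** 2)))

def percent_bias(sim, obs):
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return float(100.0 * (sim.sum() - obs.sum()) / obs.sum())

# Illustrative values only (e.g. simulated vs. observed GPP, gC m^-2 yr^-1)
obs = np.array([210.0, 250.0, 260.0])
sim = np.array([330.0, 360.0, 404.0])
print(rmse(sim, obs), percent_bias(sim, obs))
```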
Health monitoring data or data about infectious diseases such as COVID-19 may need to be constantly updated and dynamically released, but they may contain users' sensitive information. Thus, how to preserve users' privacy before release is critically important yet challenging. Differential Privacy (DP) is well known to provide effective privacy protection, and dynamic DP-preserving data release schemes were therefore designed to publish histograms that meet the DP guarantee. Unfortunately, such schemes may result in high cumulative errors and lower data availability. To address this problem, in this paper we apply Jensen-Shannon (JS) divergence to design an OPTICS (Ordering Points To Identify The Clustering Structure)-based scheme. It uses JS divergence to measure the difference between the updated data set at the current release time and the private data set at the previous release time. The difference is compared with a threshold, and only when the difference exceeds the threshold is OPTICS applied to publish a DP-protected data set. Our experimental results show that the absolute errors and average relative errors are significantly lower than those of existing works.
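A simplified sketch of the divergence-gated release idea follows: a new Laplace-noised histogram is published only when the JS divergence between the current raw data and the raw data behind the previous release exceeds a threshold. The clustering (OPTICS) step and the budget accounting for the comparison are omitted, and all parameter values are assumptions.

```python
import numpy as np

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence (natural log) between two histograms."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: float(np.sum(a * np.log((a + eps) / (b + eps))))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def dynamic_release(updated_raw, prev_raw, prev_release, threshold=0.01,
                    epsilon=1.0, sensitivity=1.0):
    """Publish a fresh Laplace-noised histogram only when the raw data have drifted
    enough since the previous release; otherwise re-use the old release."""
    if prev_release is not None and js_divergence(updated_raw, prev_raw) <= threshold:
        return prev_release
    noise = np.random.laplace(0.0, sensitivity / epsilon, size=len(updated_raw))
    return np.clip(np.asarray(updated_raw, float) + noise, 0, None)

hist_t0 = np.array([120, 80, 40, 10])
rel_t0 = dynamic_release(hist_t0, None, None)
hist_t1 = np.array([121, 79, 41, 10])            # small drift: the old release is re-used
print(dynamic_release(hist_t1, hist_t0, rel_t0))
```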
A social network contains the interactions between social members, which constitute the structure and attributes of the social network. These interactive relationships contain a great deal of personal privacy information, so the direct release of social network data will cause the disclosure of private information. Aiming at the dynamic characteristics of social network data release, a new dynamic social network data publishing method based on differential privacy is proposed. The method, named DDPA (Dynamic Differential Privacy Algorithm), satisfies differential privacy and improves on privacy protection algorithms for static social network data publishing. DDPA adds Laplace noise to network edge weights, identifies the edge-weight information that changes as the number of iterations increases, and allocates the privacy protection budget accordingly. Experiments on real data sets show that the DDPA algorithm satisfies users' privacy requirements in social networks, reduces the execution time brought by iterations, and reduces the information loss rate of the graph structure.
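A static, single-release sketch of the edge-weight perturbation at the core of DDPA is shown below (Laplace noise added to each weight); the iteration-aware budget allocation that DDPA adds is not modeled, and the graph and parameters are illustrative.

```python
import numpy as np
import networkx as nx

def perturb_edge_weights(G, epsilon=1.0, sensitivity=1.0, rng=None):
    """Return a copy of G with Laplace noise added to every edge weight
    (a static, single-release version of the weight-perturbation step)."""
    rng = rng or np.random.default_rng()
    H = G.copy()
    for u, v, data in H.edges(data=True):
        data["weight"] = data.get("weight", 1.0) + rng.laplace(0.0, sensitivity / epsilon)
    return H

G = nx.Graph()
G.add_weighted_edges_from([("a", "b", 5.0), ("b", "c", 2.0), ("a", "c", 7.0)])
H = perturb_edge_weights(G, epsilon=0.5)
print([(u, v, round(d["weight"], 2)) for u, v, d in H.edges(data=True)])
```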
Nowadays, massive amounts of data have been accumulated in various and wide-ranging fields, and analyzing existing data to extract as much useful information as possible has become one of the central issues in interdisciplinary research. It is often the case that the output data of systems are measurable while the dynamic structures producing these data are hidden, and thus studies that reveal system structures by analyzing available data, i.e., reconstructions of systems, have become one of the most important tasks of information extraction. In the past, most of the work in this respect was based on theoretical analyses and numerical verifications; direct analyses of experimental data are very rare. In physical science, most analyses of experimental setups are based on the first principles of physical laws, i.e., so-called top-down analyses. In this paper, we conducted an experiment with the "Boer resonant instrument for forced vibration" (BRIFV) and inferred the dynamic structure of the experimental setup purely from analysis of the measurable experimental data, i.e., by applying a bottom-up strategy. The dynamics of the experimental setup is strongly nonlinear and chaotic, and it is subject to inevitable noise. We propose to use high-order correlation computations to treat the nonlinear dynamics and two-time correlations to treat noise effects. By applying these approaches, we have successfully reconstructed the structure of the experimental setup, and the dynamic system reconstructed from the measured data reproduces the experimental results well over a wide range of parameters.
A new method of nonlinear analysis is established by combining phase space reconstruction and data-reduction sub-frequency-band wavelet analysis. This method is applied to two types of chaotic dynamic systems (Lorenz and Rössler) to examine its anti-noise ability for complex systems. Results show that the nonlinear dynamic system analysis method resists noise and reveals the internal dynamics of a weak signal under noise pollution. On this basis, the vertical upward gas–liquid two-phase flow in a 2 mm × 0.81 mm small rectangular channel is investigated. The frequency and energy distributions of the main oscillation mode are revealed by analyzing the time–frequency spectra of the pressure signals of different flow patterns. The positive power spectral density of singular-value frequency entropy and the damping ratio are extracted to characterize the evolution of flow patterns and achieve accurate recognition of different vertical upward gas–liquid flow patterns (bubbly flow: 100%, slug flow: 92%, churn flow: 96%, annular flow: 100%). The proposed analysis method will enrich the dynamics theory of multi-phase flow in small channels.
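As a sketch of the phase-space-reconstruction half of the method (the sub-frequency-band wavelet reduction is not shown), the following performs a standard time-delay embedding of a scalar series from a Lorenz-type system; the embedding dimension, delay, and integration scheme are illustrative choices.

```python
import numpy as np

def delay_embed(x, dim=3, tau=10):
    """Time-delay embedding of a scalar series into a dim-dimensional phase space."""
    x = np.asarray(x, float)
    n = x.size - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Illustrative scalar observable from the Lorenz system (Euler integration for brevity)
dt, steps = 0.01, 10000
state = np.array([1.0, 1.0, 1.0])
xs = np.empty(steps)
for k in range(steps):
    x, y, z = state
    state = state + dt * np.array([10.0 * (y - x), x * (28.0 - z) - y, x * y - 8.0 / 3.0 * z])
    xs[k] = state[0]

trajectory = delay_embed(xs, dim=3, tau=10)
print(trajectory.shape)      # (steps - 2*tau, 3): reconstructed attractor coordinates
```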
When castings become complicated and the demands for precision of numerical simulation become higher, the numerical data of casting simulations become massive. On a general personal computer, these massive numerical data may exceed the capacity of available memory, resulting in rendering failure. Based on the out-of-core technique, this paper proposes a method to effectively utilize external storage and reduce memory usage dramatically, so as to solve the problem of insufficient memory for massive data rendering on general personal computers. Based on this method, a new post-processor is developed. It is capable of illustrating the filling and solidification processes of casting, as well as thermal stress, and it provides fast interaction with simulation results. Theoretical analysis as well as several practical examples prove that the memory usage and loading time of the post-processor are independent of the size of the relevant files and depend only on the proportion of cells on the surface. Meanwhile, the speed of rendering and of fetching values at the mouse position is appreciable, and the demands of real-time interaction are satisfied.
Data is the last line of defense in security. To prevent data loss, no matter where data is stored, copied, or transmitted, it is necessary to accurately detect the data type and to further clarify the form and encryption structure of the data during transmission, so as to ensure its accuracy and prevent leakage. Taking data characteristics as the core and transparent encryption and decryption technology as the lead, and according to data element characteristics such as identity authentication, authority management, outgoing management, file audit, and external device management, terminal data is marked with attributes to form a data leakage prevention module. This controls data over its whole life cycle, from creation, storage, transmission, and use to destruction, regardless of whether the data is stored on a server, a PC, or a mobile device, provides unified policy management, and forms an ecological data chain with vital characteristics. It also provides a comprehensive protection system for dynamically encrypted file transmission, with prevention in advance, control during the event, and audit after the event, so as to ensure the security of dynamic encryption in the process of file transmission, protect the core data of files, and help enterprises keep away from the risk of data leakage.