In the era of digital intelligence, data is a key element in promoting social and economic development. Educational data, as a vital component, not only supports teaching and learning but also contains a great deal of sensitive information. How to effectively classify and protect sensitive data has become an urgent issue in educational data security. This paper systematically constructs a multi-dimensional classification framework for sensitive educational data and discusses security protection strategies from the perspectives of identification and desensitization, aiming to provide new ideas for the security management of sensitive educational data and to support the construction of an educational data security ecosystem in the era of digital intelligence.

Funding: Education Science Planning Project of Jiangsu Province in 2024 (Grant No. B-b/2024/01/152); 2025 Jiangsu Normal University Graduate Research and Innovation Program school-level project "Research on the Construction and Desensitization Strategies of Education Sensitive Data Classification from the Perspective of Educational Ecology".
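The abstract describes an identification-and-desensitization pipeline without giving an implementation; the following minimal sketch illustrates the general idea by flagging and masking two hypothetical sensitive field types (national ID numbers and mobile phone numbers) in a student record. The patterns, masking rules, and record text are illustrative assumptions, not the authors' framework.

```python
import re

# Hypothetical patterns for two sensitive field types (illustrative only).
PATTERNS = {
    "national_id": re.compile(r"\b\d{17}[\dXx]\b"),   # 18-character ID format
    "phone":       re.compile(r"\b1\d{10}\b"),        # 11-digit mobile number
}

def identify(text: str) -> dict:
    """Return the sensitive categories detected in a record."""
    return {name: pat.findall(text) for name, pat in PATTERNS.items() if pat.search(text)}

def desensitize(text: str) -> str:
    """Mask detected values, keeping only short prefixes/suffixes."""
    text = PATTERNS["national_id"].sub(lambda m: m.group()[:4] + "*" * 10 + m.group()[-4:], text)
    text = PATTERNS["phone"].sub(lambda m: m.group()[:3] + "****" + m.group()[-4:], text)
    return text

record = "Student Li, ID 11010120050307123X, phone 13812345678, grade 92."
print(identify(record))      # {'national_id': [...], 'phone': [...]}
print(desensitize(record))   # masked copy safe for secondary use
```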
Big data has ushered in an era of unprecedented access to vast amounts of new, unstructured data, particularly in the realm of sensitive information. This presents unique opportunities for enhancing risk-alerting systems, but also poses extraction and analysis challenges due to the diversity of file formats. This paper proposes a deep auto-encoder (DAE) based model for projecting risk associated with financial data. The research develops an indicator that assesses the degree to which organizations avoid bias in handling financial information. Simulation results demonstrate the superior performance of the DAE algorithm, showing fewer false positives, improved overall detection rates, and a noteworthy 9% reduction in failure jitter. The optimized DAE algorithm achieves an accuracy of 99%, surpassing existing methods and presenting a robust solution for sensitive-data risk projection.
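The abstract does not specify the network architecture; the sketch below is a generic deep auto-encoder in PyTorch that scores records by reconstruction error, which is one common way such a model is used for risk flagging. The layer sizes, training setup, and threshold rule are assumptions for illustration.

```python
import torch
import torch.nn as nn

# A generic deep auto-encoder: records with high reconstruction error are
# flagged as high-risk (an assumption; the paper's exact design is not given).
class DAE(nn.Module):
    def __init__(self, n_features: int):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_features, 32), nn.ReLU(),
                                     nn.Linear(32, 8), nn.ReLU())
        self.decoder = nn.Sequential(nn.Linear(8, 32), nn.ReLU(),
                                     nn.Linear(32, n_features))

    def forward(self, x):
        return self.decoder(self.encoder(x))

x = torch.randn(1024, 16)                  # placeholder financial features
model, loss_fn = DAE(16), nn.MSELoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):                       # train to reconstruct "normal" data
    opt.zero_grad()
    loss = loss_fn(model(x), x)
    loss.backward()
    opt.step()

with torch.no_grad():                      # per-record risk score
    err = ((model(x) - x) ** 2).mean(dim=1)
flags = err > err.mean() + 3 * err.std()   # simple threshold rule (assumed)
```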
This study developed a new methodology for analyzing the risk level of marine spill accidents from two perspectives, namely marine traffic density and sensitive resources. Through a case study conducted in Busan, South Korea, detailed procedures of the methodology were proposed and its scalability was confirmed. To analyze the risk from a more detailed and microscopic viewpoint, vessel routes were delineated as hazard sources on the basis of automatic identification system (AIS) big data. The outliers and errors in the AIS big data were removed using the density-based spatial clustering of applications with noise (DBSCAN) algorithm, and a marine traffic density map was produced by combining all of the gridded routes. The vulnerability of the marine environment was identified on the basis of the sensitive resource map constructed by the Korea Coast Guard, in a manner similar to the National Oceanic and Atmospheric Administration environmental sensitivity index approach. In this study, aquaculture sites, water intake facilities of power plants, and beach/resort areas were selected as representative indicators for each category. The vulnerability values of neighboring cells decrease with Euclidean distance from the resource cells. The two resulting maps were aggregated to construct a final sensitive resource and traffic density (SRTD) risk analysis map of the Busan-Ulsan sea areas. We confirmed the effectiveness of the SRTD risk analysis by comparing it with actual marine spill accident records. Results show that all of the marine spill accidents in 2018 occurred within 2 km of high-risk cells (level 6 and above). Thus, if accident management and monitoring capabilities are concentrated on the high-risk cells, which account for only 6.45% of the total study area, it should be possible to cope with most marine spill accidents effectively.

Funding: Grant [KCG-01-2017-01] through the Disaster and Safety Management Institute, funded by the Ministry of Public Safety and Security; National Research Foundation of Korea (NRF) grant [No. 2018R1D1A1B07050208], funded by the Ministry of Science and ICT of the Korean Government.
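The abstract names DBSCAN as the filter for AIS outliers; a minimal sketch of that step with scikit-learn follows. The eps/min_samples values and the toy coordinates are assumptions, as the paper's tuned parameters are not given in the abstract.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Toy AIS position reports (longitude, latitude); real data would be much denser.
rng = np.random.default_rng(0)
route = rng.normal(loc=[129.0, 35.1], scale=0.01, size=(500, 2))  # a dense route
noise = rng.uniform(low=[128.5, 34.6], high=[129.5, 35.6], size=(10, 2))
points = np.vstack([route, noise])

# DBSCAN labels sparse, isolated reports as -1 (noise); eps/min_samples are assumed.
labels = DBSCAN(eps=0.02, min_samples=10).fit_predict(points)
cleaned = points[labels != -1]   # keep only points belonging to dense routes
print(f"removed {np.sum(labels == -1)} outliers of {len(points)} reports")
```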
Numerous industries, especially the medical industry, are likely to exhibit significant developments in the future. Ever since the announcement of the precision medicine initiative by the United States in 2015, interest in the field has increased considerably. The techniques of precision medicine are employed to provide optimal treatment and medical services to patients, as well as for the prevention and management of diseases, via the collection and analysis of big data related to their individual genetic characteristics, occupation, living environment, and dietary habits. As this involves the accumulation and utilization of sensitive information, such as patient history, DNA, and personal details, implementation is difficult if the data are inaccurate, exposed, or forged. There is also a privacy concern, as massive amounts of data are collected; hence, ensuring information security is essential. It is therefore necessary to develop methods for securely sharing sensitive data in order to establish a precision medicine system. An authentication and data sharing scheme based on an analysis of sensitive data is presented in this study. The proposed scheme securely shares the sensitive data of each entity in the precision medicine system according to its architecture and data flow.
The evolution of Arctic sea ice and its snow cover during the SHEBA year was simulated with a high-resolution thermodynamic snow/ice model (HIGHTSI). Attention was paid to the impact of albedo on the snow and sea ice mass balance, the effect of snow on the total ice mass balance, and the model's vertical resolution. The SHEBA annual simulation used the best available external forcing data set, created by the Sea Ice Model Intercomparison Project. The HIGHTSI control run reasonably reproduced the observed snow and ice thickness. A number of albedo schemes were incorporated into HIGHTSI to study the feedback processes between albedo and snow and ice thickness. Snow thickness turned out to be an essential variable in the albedo parameterization. Albedo schemes dependent on the surface temperature were liable to excessive positive feedback generated by errors in the modelled surface temperature. Superimposed ice formation should be taken into account for the annual Arctic sea ice mass balance.

Funding: EC-funded project DAMOCLES (grant 18509), part of the Sixth Framework Programme; DFG (grant LU 818/1-1); Natural Science Foundation of China (grants No. 40233032, 40376006).
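The abstract highlights snow thickness as an essential variable in the albedo parameterization; the sketch below shows one common form such a scheme can take, blending bare-ice and snow albedo with an e-folding snow depth. The constants and the exponential blending form are illustrative assumptions, not HIGHTSI's actual schemes.

```python
import numpy as np

# One illustrative snow-thickness-dependent albedo scheme (assumed form):
# bare-ice albedo transitions toward snow albedo as snow deepens, with an
# e-folding depth H_STAR. Constants are typical textbook values, not HIGHTSI's.
ALBEDO_ICE, ALBEDO_SNOW, H_STAR = 0.55, 0.80, 0.05  # (-, -, m)

def surface_albedo(snow_depth_m: np.ndarray) -> np.ndarray:
    weight = 1.0 - np.exp(-snow_depth_m / H_STAR)
    return ALBEDO_ICE + (ALBEDO_SNOW - ALBEDO_ICE) * weight

depths = np.array([0.0, 0.02, 0.05, 0.10, 0.30])
print(surface_albedo(depths))  # rises quickly toward the snow albedo
```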
In direct-sequence spread-spectrum (DSSS) communication, for both satellite-to-ground and inter-satellite links, system constraints on radio-frequency spectral occupation, channel data throughput, and link performance in terms of data channel coding may result in a signal structure in which the symbol duration is shorter than the pseudo-code period. This can create difficulties in DSSS signal acquisition because of the polarity inversions caused by the data modulation. To eliminate the influence of polarity inversion, this paper proposes a novel acquisition algorithm based on the simultaneous search of code phase, data phase, and Doppler frequency. In the proposed algorithm, the data phase is predicted, and the correlation period for coherent integration can be set equal to the symbol duration. Non-coherent accumulation over different symbols is then implemented to enhance the sensitivity of the acquisition algorithm; the interval of non-coherent accumulation is the least common multiple of the symbol duration and the pseudo-code period. The proposed algorithm largely reduces the SNR loss caused by data polarity inversion and enhances acquisition performance without a noticeable increase in hardware complexity. Theoretical analysis, simulations, and measured results verify the validity of the algorithm.

Funding: National High Technology Research and Development Program of China (863 Program) (Grant No. 2012AA1406).
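A simplified baseband sketch of the accumulation structure described above: coherent integration over each symbol, then non-coherent (magnitude) summation across symbols over the least common multiple of the symbol length and the code length. It assumes BPSK data, Doppler already removed, and a known data phase (symbol-aligned frames); the toy code length and symbol length are assumptions, and the full three-dimensional search of the paper is omitted.

```python
import numpy as np
from math import lcm

# Toy setup: symbol duration shorter than one PN-code period, as in the paper.
rng = np.random.default_rng(1)
code = rng.choice([-1.0, 1.0], size=63)          # toy PN code (one period)
sym_len = 21                                     # samples per data symbol
span = lcm(sym_len, len(code))                   # non-coherent accumulation interval

true_phase = 17
bits = rng.choice([-1.0, 1.0], size=span // sym_len)
data = np.repeat(bits, sym_len)                  # BPSK data causing polarity inversions
rx = np.roll(np.tile(code, span // len(code)), true_phase) * data
rx += 0.5 * rng.normal(size=span)                # additive noise

metric = np.zeros(len(code))
for phase in range(len(code)):                   # code-phase search
    local = np.roll(np.tile(code, span // len(code)), phase)
    prod = rx * local
    # coherent integration within each symbol, non-coherent sum across symbols
    per_symbol = prod.reshape(-1, sym_len).sum(axis=1)
    metric[phase] = np.sum(np.abs(per_symbol))   # |.| removes polarity inversions

print("estimated code phase:", int(np.argmax(metric)), "true:", true_phase)
```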
Many multi-story or high-rise buildings consisting of a number of identical stories can be treated as periodic spring-mass systems. General expressions for the natural frequencies, mode shapes, and slopes and curvatures of the mode shapes of a periodic spring-mass system are derived in this paper using periodic structure theory. The sensitivities of these modal parameters with respect to structural damage, which do not depend on the physical parameters of the original structure, are obtained. Based on the sensitivity analysis of these modal parameters, a two-stage method is proposed to localize and quantify damage in multi-story or high-rise buildings. The slopes and curvatures of mode shapes, which are highly sensitive to local damage, are used to localize the damage. Subsequently, the limited measured natural frequencies, which are more accurate than the other modal parameters, are used to quantify the extent of damage at the potential damage locations. Experimental results from a 3-story test building demonstrate that single or multiple damages, whether slight or severe, can be correctly localized using only the slope or curvature of the mode shape in one of the lower modes (the one in which the change of natural frequency is largest), and can be accurately quantified from the limited measured natural frequencies, even under noise pollution.

Funding: National Natural Science Foundation of China (No. 50378041); Specialized Research Fund for Doctoral Programs of Higher Education (No. 20030487016).
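As a minimal illustration of the localization stage, the sketch below computes the slope and curvature of a mode shape by finite differences and flags the story where the curvature change between baseline and damaged states peaks. The toy mode shape, story height, and index form are assumptions; the paper's closed-form expressions from periodic structure theory are not reproduced here.

```python
import numpy as np

# Slope and curvature of a measured mode shape by finite differences, and a
# simple damage index as the curvature change between baseline and damaged
# states. The index form is a common choice, not necessarily the paper's.
def slope_curvature(phi: np.ndarray, h: float):
    slope = np.gradient(phi, h)                  # first derivative
    curvature = np.gradient(slope, h)            # second derivative
    return slope, curvature

h = 3.0                                          # story height (assumed, m)
floors = np.arange(1, 9)
phi_intact = np.sin(np.pi * floors / 16)         # toy first-mode shape
phi_damaged = phi_intact.copy()
phi_damaged[4] += 0.01                           # local stiffness loss near story 5

_, k0 = slope_curvature(phi_intact, h)
_, k1 = slope_curvature(phi_damaged, h)
damage_index = np.abs(k1 - k0)
print("suspected damaged story:", floors[np.argmax(damage_index)])
```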
In this paper, we present a distributed multi-level cache system based on cloud storage, aimed at the low access efficiency of small spatio-temporal data files in Smart City information service systems. Taking the classification attributes of small spatio-temporal data files in the Smart City as the basis for cache content selection, the cache system adopts different cache-pool management strategies at different cache levels. Experimental results from a prototype system indicate that the proposed multi-level cache effectively increases the access bandwidth for small spatio-temporal files in the Smart City and greatly improves the quality of service under multiple concurrent accesses.

Funding: Natural Science Foundation of Hubei Province (2012FFC034, 2014CFC1100).
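The abstract states that each cache level uses a different pool-management strategy but does not name them; as an illustration of the general idea, the sketch below chains a small LRU first level over a larger FIFO second level. The policies, sizes, and key names are assumptions, not the paper's design.

```python
from collections import OrderedDict, deque

# Illustrative two-level cache: a small LRU level-1 in front of a larger FIFO
# level-2. The specific policies/sizes are assumptions; the paper only states
# that different levels use different pool-management strategies.
class TwoLevelCache:
    def __init__(self, l1_size=4, l2_size=16):
        self.l1, self.l1_size = OrderedDict(), l1_size   # LRU
        self.l2, self.l2_order = {}, deque()             # FIFO
        self.l2_size = l2_size

    def get(self, key, load):
        if key in self.l1:                    # L1 hit: refresh recency
            self.l1.move_to_end(key)
            return self.l1[key]
        if key in self.l2:                    # L2 hit: promote to L1
            value = self.l2[key]
        else:                                 # miss: load from cloud storage
            value = load(key)
            if len(self.l2) >= self.l2_size:  # FIFO eviction in L2
                self.l2.pop(self.l2_order.popleft())
            self.l2[key] = value
            self.l2_order.append(key)
        self.l1[key] = value
        if len(self.l1) > self.l1_size:       # LRU eviction in L1
            self.l1.popitem(last=False)
        return value

cache = TwoLevelCache()
print(cache.get("tile_42", load=lambda k: f"<data for {k}>"))
```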
At present, the process of digital image information fusion suffers from low data-cleaning accuracy and frequent omission of repeated data, resulting in poor information fusion. To address this, a visualized multi-component information fusion method for big data based on radar maps is proposed in this paper. A data model of the perceptual digital image is constructed using linear regression analysis. The ID tag of each collected image datum, its Transaction Identification (TID), is compared; if two data items carry the same TID, repeated-data detection is carried out. After this test, the data set is processed repeatedly following the same procedure to improve the precision of data cleaning and reduce omissions. Based on radar images, hierarchical visualization of the processed multi-level information fusion is realized. Experiments show that the method can clean redundant data accurately and achieve efficient fusion of the multi-level big-data information in digital images.

Funding: 2018 National-Level Innovation and Entrepreneurship Training Program for College Students, China (No. 201811562005); Research Project of Gansu University, China (No. 2016A-105); Innovation and Entrepreneurship Education Project of Gansu Province in 2019, China (No. 2019024).
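A minimal sketch of the TID-comparison step described above: records sharing a TID are treated as candidate duplicates and collapsed. The record layout is hypothetical; the paper describes only the TID check itself.

```python
# Minimal sketch of the TID-comparison step: records that share a Transaction
# Identification (TID) are treated as candidate duplicates and collapsed.
# The record layout is hypothetical; the paper describes only the TID check.
records = [
    {"tid": "A001", "pixels": "..."},
    {"tid": "A002", "pixels": "..."},
    {"tid": "A001", "pixels": "..."},   # repeated TID -> duplicate
]

def clean(records):
    seen, kept, dropped = set(), [], 0
    for rec in records:
        if rec["tid"] in seen:          # same TID: repeated-data detection
            dropped += 1
            continue
        seen.add(rec["tid"])
        kept.append(rec)
    return kept, dropped

kept, dropped = clean(records)
print(f"kept {len(kept)} records, dropped {dropped} duplicates")
```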
The crowdsourced OpenStreetMap mapping platform is utilized by countless stakeholders worldwide for various purposes and applications. Individuals, researchers, governments, and commercial and humanitarian organizations, in addition to engineers, professionals, and technical developers, use OpenStreetMap as both data contributors and data consumers. The storage, usage, and integration of volunteered geographic data in software applications often create complex ethical dilemmas and value conflicts in the relationships between different categories of stakeholders, and it is therefore common for the moral preferences of stakeholders to be neglected. This paper investigates the integration of ethical values in OpenStreetMap using the value sensitive design methodology, which examines technical, empirical, and conceptual aspects at each design stage. We use the Humanitarian OpenStreetMap Team, an existing volunteered geographic information initiative, as a case study. Our investigation shows that although OpenStreetMap does integrate ethical values in its organizational structure, a deeper understanding of the perspectives of its direct and indirect stakeholders is still required. This study is expected to assist organizations that contribute to or use OpenStreetMap in recognizing and preserving existing and important ethical values. To the best of our knowledge, this is the first attempt to evaluate ethical values methodically and comprehensively in the design process of the OpenStreetMap platform.
In this paper, a numerical modeling tool is described which can be used to explore various aspects of four-dimensional variational data assimilation and parameter estimation arising in the geophysical, environmental, biological, and engineering sciences. A major component of this tool is a coupled chaotic dynamical system obtained by coupling two versions of the well-known Lorenz (1963) model whose time scales differ by a certain time-scale factor. A tangent linear model and its adjoint corresponding to the coupled chaotic system are considered. The general idea of applying sensitivity measures (sensitivity functions) to coupled systems, with emphasis on data assimilation aspects, is also explored through the forward sensitivity approach. For this purpose, the set of sensitivity equations is derived from the nonlinear equations of the coupled dynamical system. To estimate the influence of model parameter uncertainties on the simulated state variables, the relative error in the energy norm is used.
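A minimal sketch of the core ingredient, two Lorenz-63 systems with different time scales coupled together and integrated with SciPy, is given below. The coupling form, coupling strength, and time-scale factor are illustrative assumptions; the abstract specifies only that two Lorenz (1963) models differing by a time-scale factor are coupled.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Two Lorenz-63 systems: a fast one and a slow one whose time scale is reduced
# by factor EPS, with simple linear coupling of strength C. The coupling form
# and coefficients are illustrative assumptions, not the paper's exact system.
SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0
EPS, C = 0.1, 1.0   # time-scale factor and coupling strength (assumed)

def coupled_lorenz(t, s):
    x, y, z, X, Y, Z = s
    return [
        SIGMA * (y - x),                       # fast subsystem
        RHO * x - y - x * z + C * Y,           #   ... coupled through Y
        x * y - BETA * z,
        EPS * (SIGMA * (Y - X)),               # slow subsystem (time scale / EPS)
        EPS * (RHO * X - Y - X * Z + C * y),
        EPS * (X * Y - BETA * Z),
    ]

sol = solve_ivp(coupled_lorenz, (0, 40), [1, 1, 20, 1, 1, 20],
                dense_output=True, rtol=1e-8, atol=1e-8)
print(sol.y[:, -1])   # final state of the six coupled variables
```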
The elliptic curve cryptography algorithm represents a major advancement in the field of computer security. This innovative algorithm uses elliptic curves to encrypt and secure data, providing an exceptional level of security while making efficient use of computing resources. This study focuses on how elliptic curve cryptography helps protect sensitive data. Text is encrypted using the elliptic curve technique because it provides strong security with a smaller key on resource-constrained devices such as mobile phones; the elliptic curve cryptography used in this study offers better protection than a 256-bit RSA key. To achieve this protection with elliptic curve cryptography, several Python libraries, such as cryptography, pycryptodome, PyQt5, and secp256k1, were used. These technologies were used to develop software based on elliptic curves that encrypts and decrypts data such as text messages and provides authentication for the communication.
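The abstract names the Python `cryptography` library and the secp256k1 curve; a common way to encrypt messages with elliptic curves is an ECDH key agreement followed by symmetric encryption, sketched below. This is a generic ECIES-style construction under those assumptions, not necessarily the authors' exact protocol.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Generic ECIES-style flow on secp256k1 with the `cryptography` library:
# ECDH key agreement -> HKDF key derivation -> AES-GCM encryption.
receiver_key = ec.generate_private_key(ec.SECP256K1())
sender_key = ec.generate_private_key(ec.SECP256K1())   # ephemeral sender key

def derive_key(private_key, peer_public_key) -> bytes:
    shared = private_key.exchange(ec.ECDH(), peer_public_key)
    return HKDF(algorithm=hashes.SHA256(), length=32,
                salt=None, info=b"ecc-demo").derive(shared)

# Sender encrypts with the shared secret.
key = derive_key(sender_key, receiver_key.public_key())
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"a sensitive text message", None)

# Receiver derives the same key from the sender's public key and decrypts.
key2 = derive_key(receiver_key, sender_key.public_key())
print(AESGCM(key2).decrypt(nonce, ciphertext, None))
```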
Accurate prediction of landslide displacement is crucial for effective early warning of landslide disasters. While most existing prediction methods focus on time-series forecasting for individual monitoring points, there is limited research on the spatiotemporal characteristics of landslide deformation. This paper proposes a novel Multi-Relation Spatiotemporal Graph Residual Network with Multi-Level Feature Attention (MFA-MRSTGRN) that effectively improves landslide displacement prediction through spatiotemporal fusion. The model integrates internal seepage factors, as data feature enhancements, with external triggering factors, allowing accurate capture of the complex spatiotemporal characteristics of landslide displacement and the construction of a multi-source heterogeneous dataset. The MFA-MRSTGRN model incorporates dynamic graph theory and four key modules: multi-level feature attention, temporal-residual decomposition, spatial multi-relational graph convolution, and spatiotemporal fusion prediction. This comprehensive approach enables efficient analysis of multi-source heterogeneous datasets, facilitating adaptive exploration of the evolving multi-relational, multi-dimensional spatiotemporal complexities of landslides. Applying the model to predict the displacement of the Liangshuijing landslide, we demonstrate that MFA-MRSTGRN surpasses traditional models such as random forest (RF), long short-term memory (LSTM), and spatial-temporal graph convolutional network (ST-GCN) models on various evaluation metrics, including mean absolute error (MAE = 1.27 mm), root mean square error (RMSE = 1.49 mm), mean absolute percentage error (MAPE = 0.026), and R-squared (R^2 = 0.88). Furthermore, feature ablation experiments indicate that incorporating internal seepage factors improves the predictive performance of landslide displacement models. This research provides an advanced and reliable method for landslide displacement prediction.

Funding: National Natural Science Foundation of China (Grant No. 52308340); Chongqing Talent Innovation and Entrepreneurship Demonstration Team Project (Grant No. cstc2024ycjh-bgzxm0012); Science and Technology Projects of China Coal Technology and Engineering Chongqing Design and Research Institute (Group) Co., Ltd. (Grant No. H20230317).
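The abstract reports MAE, RMSE, MAPE, and R^2; the sketch below shows how those four metrics are computed with scikit-learn from predicted versus observed displacements. The displacement values are toy numbers, not the study's Liangshuijing data.

```python
import numpy as np
from sklearn.metrics import (mean_absolute_error, mean_squared_error,
                             mean_absolute_percentage_error, r2_score)

# Toy displacement series (mm); the paper's monitoring data are not shown here.
y_true = np.array([12.1, 13.4, 15.0, 18.2, 21.7, 25.3])
y_pred = np.array([11.8, 13.9, 14.2, 18.9, 20.8, 26.1])

mae = mean_absolute_error(y_true, y_pred)
rmse = np.sqrt(mean_squared_error(y_true, y_pred))
mape = mean_absolute_percentage_error(y_true, y_pred)
r2 = r2_score(y_true, y_pred)
print(f"MAE={mae:.2f} mm  RMSE={rmse:.2f} mm  MAPE={mape:.3f}  R^2={r2:.2f}")
```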
BACKGROUND: Currently, only tumors classified as LR-5 are considered definitive hepatocellular carcinoma (HCC), and no further pathologic confirmation is required to initiate therapy. Previous studies have shown that the sensitivity of LR-5 is modest, and lesions enhanced with gadoxetic acid (Gd-EOB-DTPA) may exhibit lower sensitivity than those enhanced with Gd-DTPA. AIM: To identify malignant ancillary features (AFs) that can independently and significantly predict HCC under Liver Imaging Reporting and Data System version 2018, and to develop modified LR-5 criteria that improve diagnostic performance on Gd-EOB-DTPA-enhanced magnetic resonance imaging. METHODS: Imaging data from patients with HCC risk factors who underwent abdominal Gd-EOB-DTPA-enhanced magnetic resonance imaging were collected. Univariate and multivariate logistic regression analyses were performed to determine the AFs that independently and significantly predict HCC. The modified LR-5 criteria involved reclassifying LR-4/LR-3 lesions based on major features combined with independently significant AFs for HCC, or substituting significant AFs for threshold growth. McNemar's test was used to compare the diagnostic performance of the modified LR-5 criteria. RESULTS: A total of 244 lesions from 216 patients were included. Transitional-phase hypointensity, mild-moderate T2 hyperintensity, and fat in mass (more than adjacent liver) were identified as significant independent predictors of HCC. Using the modified LR-5 criteria (e.g., LR-5-M1: LR-4 + transitional-phase hypointensity; LR-5-M4: LR-5 with transitional-phase hypointensity instead of threshold growth; LR-5-M5: LR-5 with mild-moderate T2 hyperintensity instead of threshold growth; LR-5-M8: LR-3/LR-4 + any two of transitional-phase hypointensity, mild-moderate T2 hyperintensity, and fat in mass), sensitivities increased significantly (88.5%-89.1%) compared with the standard LR-5 (60.6%; all P values < 0.05), while specificities (84.8%-89.9%) remained largely unchanged (93.7%; all P values > 0.05). The LR-5-M8 criterion achieved the highest sensitivity. CONCLUSION: Mild-moderate T2 hyperintensity, transitional-phase hypointensity, and fat in mass are independent and significant malignant AFs predicting HCC. The modified LR-5 criteria can improve sensitivity without significantly reducing specificity.

Ethics: This study was approved by the Medical Ethics Committee of Jieshou City People's Hospital, approval No. [2022]21.
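McNemar's test compares paired diagnostic criteria applied to the same lesions; the sketch below runs it with statsmodels on a hypothetical 2x2 discordance table for standard versus modified LR-5. The counts are invented for illustration, not the study's data.

```python
from statsmodels.stats.contingency_tables import mcnemar

# Paired comparison of two criteria (standard LR-5 vs. a modified LR-5) on the
# same HCC lesions. The 2x2 table crosses detection by each criterion; these
# counts are hypothetical, not the study's data.
#                 modified +   modified -
table = [[95,  5],    # standard +
         [45, 20]]    # standard -

result = mcnemar(table, exact=True)   # exact binomial test on discordant pairs
print(f"statistic={result.statistic}, p-value={result.pvalue:.4f}")
```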