As a significant city in the Yangtze River Delta region, Hefei has experienced rapid changes in the sources of air pollution due to its high-speed economic development and urban expansion. However, there has been limited research in recent years on the spatial-temporal distribution and emission of its atmospheric pollutants. To address this, this study conducted mobile observations of urban roads using the Mobile-DOAS instrument from June 2021 to May 2022. The monitoring results exhibit favorable consistency with TROPOMI satellite data and ground monitoring station data. Temporally, there were pronounced seasonal variations in air pollutants. Spatially, high concentrations of HCHO and NO₂ were closely associated with traffic congestion on roadways, while heightened SO₂ levels were attributed to winter heating and industrial emissions. The study also revealed that with the implementation of road policies, the average vehicle speed increased by 95.4%, while the NO₂ concentration decreased by 54.4%. In the estimation of the urban NOₓ emission flux, the emissions calculated via mobile measurements exhibited more distinct seasonal patterns than the inventory data, with the highest emission rate of 349 g/s in winter and the lowest of 142 g/s in summer. Spatially, the significant difference in emissions between the inner and outer ring roads also suggests that the city's primary NOₓ emission sources lie in the area between these two rings. This study offers data support for formulating the next phase of air pollution control measures in urban areas.
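The loop-based flux estimate behind these numbers can be sketched as a line integral of the measured column against the wind component normal to a closed driving track. The sketch below is a minimal illustration under simplifying assumptions (a single representative wind vector, columns given in molec/cm², invented numbers); it does not reproduce the paper's retrieval or averaging details.

```python
import numpy as np

def loop_emission_flux(vcd, track_xy, wind_speed, wind_dir_deg):
    """Emission flux enclosed by a closed driving loop (molecules/s).

    vcd          : vertical column densities at the track vertices (molec/cm^2)
    track_xy     : (N, 2) track vertices in metres; first == last (closed loop)
    wind_speed   : scalar wind speed (m/s)
    wind_dir_deg : meteorological wind direction (degrees the wind comes FROM)
    """
    theta = np.deg2rad(270.0 - wind_dir_deg)            # compass -> math angle
    wind = wind_speed * np.array([np.cos(theta), np.sin(theta)])
    seg = np.diff(track_xy, axis=0)                     # segment vectors (m)
    normals = np.column_stack([seg[:, 1], -seg[:, 0]])  # outward for CCW loops
    vcd_mid = 0.5 * (vcd[:-1] + vcd[1:]) * 1e4          # molec/cm^2 -> molec/m^2
    return float(np.sum(vcd_mid * (normals @ wind)))    # integral of c (v . n) dl

# illustrative numbers: wind from the southwest, enhanced columns downwind
track = np.array([[0, 0], [5000, 0], [5000, 5000], [0, 5000], [0, 0]], float)
vcd = np.array([4e15, 4e15, 9e15, 9e15, 4e15])
flux = loop_emission_flux(vcd, track, wind_speed=3.0, wind_dir_deg=225.0)
print(f"{flux * 46.0 / 6.022e23:.1f} g/s of NO2")       # molar mass 46 g/mol
```

For a uniform column field the closed-loop integral vanishes, so a positive value indicates a net source enclosed by the track.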
Normal forms have a significant role in the theory of relational database normalization. The definitions of normal forms are established through the functional dependency (FD) relationship between a prime or nonprime attribute and a key. However, determining whether an attribute is a prime attribute is a nondeterministic polynomial-time complete (NP-complete) problem, making it intractable to determine if a relation scheme is in a specific normal form. While the prime attribute problem is generally NP-complete, there are cases where identifying prime attributes is not challenging. In a relation scheme R(U, F), we partition U into four distinct subsets based on where attributes in U appear in F: U₁ (attributes only appearing on the left-hand side of FDs), U₂ (attributes only appearing on the right-hand side of FDs), U₃ (attributes appearing on both sides of FDs), and U₄ (attributes not present in F). Next, we demonstrate the necessary and sufficient conditions for a key to be the unique key of a relation scheme. Subsequently, we illustrate the features of prime attributes in U₃ and generalize the features of common prime attributes. The findings lay the groundwork for distinguishing between complex and simple cases in prime attribute identification, thereby deepening the understanding of this problem.
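The four-way partition defined above is directly computable. A minimal sketch, assuming each FD is represented as a pair of attribute sets (the example scheme is invented for illustration):

```python
def partition_attributes(U, F):
    """Split U by where attributes appear in F: U1 (left-hand sides only),
    U2 (right-hand sides only), U3 (both sides), U4 (absent from F)."""
    left = set().union(*(set(l) for l, _ in F)) if F else set()
    right = set().union(*(set(r) for _, r in F)) if F else set()
    return left - right, right - left, left & right, set(U) - left - right

# invented example: R(U, F) with U = {A, B, C, D, E}, F = {A->B, B->C, C->B}
U = {"A", "B", "C", "D", "E"}
F = [({"A"}, {"B"}), ({"B"}, {"C"}), ({"C"}, {"B"})]
U1, U2, U3, U4 = partition_attributes(U, F)
print(U1, U2, U3, U4)  # {'A'}  set()  {'B', 'C'}  {'D', 'E'}
```

Attributes in U₁ and U₄ must belong to every candidate key (nothing in F derives them) and are therefore always prime, while attributes in U₂ can never be prime; the genuinely hard cases live in U₃, which is what the paper characterizes.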
In the current situation of decelerating economic expansion, examining the digital economy (DE) as a novel economic model is beneficial for the local economy's sustainable and high-quality development (HQD). We analyzed panel data from the Yellow River (YR) region from 2013 to 2021 and discovered notable spatial variances in the composite index and coupling coordination of the two systems. Specifically, the downstream region exhibited the highest coupling coordination, while the upstream region had the lowest. We identified that favorable factors such as economic development, innovation, industrial upgrading, and government intervention can bolster the coupling. Our findings provide a valuable framework for promoting DE and HQD in the YR region.
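The coupling coordination analysis reported here typically follows the standard two-system coupling coordination degree model; below is a sketch assuming that common formulation (the paper may weight the systems differently or use a variant).

```python
import math

def coupling_coordination(u1, u2, alpha=0.5, beta=0.5):
    """Standard two-system coupling coordination degree model.

    u1, u2 : composite indices of the two systems, scaled to (0, 1]
    Returns (C, D): coupling degree and coupling coordination degree.
    """
    C = 2.0 * math.sqrt(u1 * u2) / (u1 + u2)   # coupling degree
    T = alpha * u1 + beta * u2                 # comprehensive development level
    D = math.sqrt(C * T)                       # coupling coordination degree
    return C, D

# illustrative indices only, e.g., a downstream vs. an upstream province
print(coupling_coordination(0.72, 0.68))   # high, well-coordinated
print(coupling_coordination(0.25, 0.18))   # low coordination
```

D close to 1 indicates the two systems develop in step at a high level; values near 0 indicate imbalance or underdevelopment.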
Detecting overlapping communities in attributed networks remains a significant challenge due to the complexity of jointly modeling topological structure and node attributes, the unknown number of communities, and the need to capture nodes with multiple memberships. To address these issues, we propose a novel framework named density peaks clustering with neutrosophic C-means. First, we construct a consensus embedding by aligning structure-based and attribute-based representations using spectral decomposition and canonical correlation analysis. Then, an improved density peaks algorithm automatically estimates the number of communities and selects initial cluster centers based on a newly designed cluster strength metric. Finally, a neutrosophic C-means algorithm refines the community assignments, modeling uncertainty and overlap explicitly. Experimental results on synthetic and real-world networks demonstrate that the proposed method achieves superior performance in terms of detection accuracy, stability, and its ability to identify overlapping structures.
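A sketch of the classic density peaks step (Rodriguez and Laio) that the improved algorithm builds on: local density ρ and separation δ are computed on the embedding, and points where both are large become centers. The paper's consensus embedding and its own cluster strength metric are not reproduced here; the generic product γ = ρ·δ stands in for the latter.

```python
import numpy as np

def density_peaks_centers(X, dc=None, n_centers=None):
    """Classic density-peaks step on embedded points X of shape (n, d)."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    if dc is None:
        dc = np.percentile(D[D > 0], 2.0)           # common heuristic cutoff
    rho = np.exp(-(D / dc) ** 2).sum(axis=1) - 1.0  # Gaussian-kernel density
    order = np.argsort(-rho)
    delta = np.zeros(len(X))
    delta[order[0]] = D[order[0]].max()             # densest point: max distance
    for i, p in enumerate(order[1:], start=1):
        delta[p] = D[p, order[:i]].min()            # distance to nearest denser point
    gamma = rho * delta                             # stand-in for cluster strength
    if n_centers is None:
        return gamma
    return np.argsort(-gamma)[:n_centers]

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
print(density_peaks_centers(X, n_centers=2))  # indices of the two cluster centers
```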
Accurate traffic flow prediction has a profound impact on modern traffic management. Traffic flow has complex spatial-temporal correlations and periodicity, which poses difficulties for precise prediction. To address this problem, a Multi-head Self-attention and Spatial-Temporal Graph Convolutional Network (MSSTGCN) for multiscale traffic flow prediction is proposed. Firstly, to capture the hidden periodicity of traffic flow, the series is divided into three kinds of periods: hourly, daily, and weekly data. Secondly, a graph attention residual layer is constructed to learn the global spatial features across regions. Local spatial-temporal dependence is captured by using a T-GCN module. Thirdly, a transformer layer is introduced to learn the long-term dependence in time. A position embedding mechanism is introduced to label position information for all traffic sequences. Thus, this multi-head self-attention mechanism can recognize the sequence order and allocate weights for different time nodes. Experimental results on four real-world datasets show that the MSSTGCN performs better than the baseline methods and can be successfully adapted to traffic prediction tasks.
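The hourly/daily/weekly split described above amounts to slicing three windows out of the history that end at the same time slot. A minimal indexing sketch, assuming 5-minute sampling (12 points per hour); the names and synthetic series are illustrative:

```python
import numpy as np

def periodic_inputs(flow, t, horizon, points_per_hour=12):
    """Build the three periodic input segments for a prediction starting at t.

    flow    : 1-D traffic series
    horizon : number of steps to predict
    Returns (recent, daily, weekly) windows, each of length `horizon`.
    """
    day, week = 24 * points_per_hour, 7 * 24 * points_per_hour
    recent = flow[t - horizon:t]                 # the hour(s) just before t
    daily  = flow[t - day:t - day + horizon]     # same slot one day earlier
    weekly = flow[t - week:t - week + horizon]   # same slot one week earlier
    return recent, daily, weekly

flow = np.sin(np.arange(12 * 24 * 14) * 2 * np.pi / (12 * 24))  # two synthetic weeks
r, d, w = periodic_inputs(flow, t=12 * 24 * 10, horizon=12)
print(r.shape, d.shape, w.shape)  # (12,) (12,) (12,)
```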
Linguistic steganography (LS) aims to embed secret information into normal natural text for covert communication. It includes modification-based (MLS) and generation-based (GLS) methods. MLS often relies on limited manual rules, resulting in low embedding capacity, while GLS achieves higher embedding capacity through automatic text generation but typically ignores extraction efficiency. To address this, we propose a sentence attribute encoding-based MLS method that enhances extraction efficiency while maintaining strong performance. The proposed method designs a lightweight semantic attribute analyzer to encode sentence attributes for embedding secret information. When the attribute values of the cover sentence differ from the secret information to be embedded, a semantic attribute adjuster based on paraphrasing is used to automatically generate paraphrase sentences with the target attribute, thereby alleviating the problem of insufficient manual rules. During extraction, the secret information can be recovered solely by employing the semantic attribute analyzer, thereby eliminating the dependence on the paraphrase generation model. Experimental results show that this method achieves an extraction speed of 1141.54 bits/s; compared with existing methods, it has remarkable advantages in extraction speed. Meanwhile, the stego text generated by this method reaches 68.53, 39.88, and 80.77 on BLEU, ΔPPL, and BERTScore, respectively. Compared with existing methods, the text quality is effectively improved.
Security attributes are the premise and foundation for implementing Attribute-Based Access Control (ABAC) mechanisms. However, when dealing with massive volumes of unstructured text big data resources, the current attribute management methods based on manual extraction face several issues, such as high costs for attribute extraction, long processing times, unstable accuracy, and poor scalability. To address these problems, this paper proposes an attribute mining technology for access control institutions based on hybrid capsule networks. This technology leverages transfer learning ideas, utilizing Bidirectional Encoder Representations from Transformers (BERT) pre-trained language models to achieve vectorization of unstructured text data resources. Furthermore, we have designed a novel end-to-end parallel hybrid network structure, where the parallel networks respectively handle the global and local information features of the text that they excel at. By employing techniques such as attention mechanisms, capsule networks, and dynamic routing, effective mining of security attributes for access control resources has been achieved. Finally, we evaluated the performance of the proposed attribute mining method through experiments on a medical referral text resource dataset. The experimental results show that, compared with baseline algorithms, our method adopts a parallel network structure that better balances global and local feature information, resulting in improved overall performance. Specifically, it achieves a comprehensive performance enhancement of 2.06% to 8.18% in the F1 score metric. Therefore, this technology can effectively provide attribute support for access control of unstructured text big data resources.
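The BERT vectorization step can be sketched with the HuggingFace transformers API; the checkpoint and the mean-pooling choice below are placeholders, since the abstract specifies neither.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# placeholder checkpoint; the paper's exact pre-trained model is not specified
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def vectorize(texts, max_length=128):
    """Encode unstructured text resources into fixed-size vectors."""
    batch = tokenizer(texts, padding=True, truncation=True,
                      max_length=max_length, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state        # (B, L, 768)
    mask = batch["attention_mask"].unsqueeze(-1)         # zero out padding tokens
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)  # mean-pooled vectors

vecs = vectorize(["referral note: chest pain, transferred to cardiology",
                  "referral note: chronic cough, respiratory medicine"])
print(vecs.shape)  # torch.Size([2, 768])
```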
Seismic attributes encapsulate substantial reservoir characterization information and can effectively support reservoir prediction. Given the high-dimensional nonlinear relationship between sandbodies and seismic attributes, this study employs the RFECV method for seismic attribute selection, inputting the optimized attributes into a LightGBM model to enhance the spatial delineation of sandbodies. By constructing training datasets based on optimized seismic attributes and well logs, applying class imbalance correction, using the result as input variables for machine learning models with sandbody probability as the output variable, and employing grid search to optimize model parameters, a high-precision sandbody prediction model was established. Taking the 3D seismic data of Block F3 in the Dutch North Sea as an example, this method successfully depicted the three-dimensional spatial distribution of the target formation sandstones. The results indicate that even under strong noise conditions, the multi-attribute sandbody identification method based on LightGBM effectively characterizes the distribution features of sandbodies. Compared to unselected attributes, the predictions using selected attributes have higher vertical resolution and inter-well conformity, with the prediction accuracy for single wells reaching 80.77%, significantly improving the accuracy of sandbody boundary delineation.
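The selection-plus-classification pipeline maps naturally onto scikit-learn and LightGBM. A sketch with synthetic data standing in for the attribute volume and well labels; the parameter grid and the class_weight choice (as a stand-in for the paper's imbalance correction) are illustrative:

```python
import numpy as np
from lightgbm import LGBMClassifier
from sklearn.feature_selection import RFECV
from sklearn.model_selection import GridSearchCV, StratifiedKFold

rng = np.random.default_rng(42)
X = rng.normal(size=(2000, 20))     # 20 candidate seismic attributes per sample
y = (X[:, 0] + 0.8 * X[:, 3] - 0.5 * X[:, 7] + rng.normal(0, 1, 2000) > 0).astype(int)

# 1) RFECV drops the weakest attribute per step and keeps the CV-optimal subset
base = LGBMClassifier(n_estimators=200, class_weight="balanced", verbose=-1)
selector = RFECV(base, step=1, cv=StratifiedKFold(5), scoring="f1").fit(X, y)
print("attributes kept:", selector.n_features_)

# 2) grid search tunes the sandbody classifier on the selected attributes
grid = GridSearchCV(LGBMClassifier(class_weight="balanced", verbose=-1),
                    {"num_leaves": [15, 31], "learning_rate": [0.05, 0.1],
                     "n_estimators": [100, 300]},
                    cv=5, scoring="f1").fit(selector.transform(X), y)
print("best params:", grid.best_params_)
sand_proba = grid.predict_proba(selector.transform(X))[:, 1]  # sandbody probability
```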
Thunderstorm detection based on the Atmospheric Electric Field (AEF) has evolved from time-domain models to space-domain models. It is especially important to evaluate and determine the Weather Attribute (WA), which is directly related to detection reliability and authenticity. In this paper, a strategy is proposed to integrate three currently competitive WA evaluation methods. First, a conventional evaluation method based on AEF statistical indicators is selected. Subsequent evaluation approaches include competing AEF-based predicted value intervals and AEF classification based on fuzzy c-means. Different AEF attributes contribute to an accurate AEF classification to different degrees, and the resulting dynamic weighting applied to these attributes improves the classification accuracy. Each evaluation method is applied to evaluate the WA of a particular AEF to obtain the corresponding evaluation score. The integration in the proposed strategy takes the form of score accumulation, with different cumulative score levels corresponding to different final WA results. Thunderstorm imaging is performed to visualize thunderstorm activities using those AEFs already evaluated as exhibiting thunderstorm attributes. Empirical results confirm that the proposed strategy effectively and reliably images thunderstorms, with 100% accuracy of WA evaluation. This is the first study to design an integrated thunderstorm detection strategy from the new perspective of WA evaluation, which provides promising solutions for more reliable and flexible thunderstorm detection.
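The fuzzy c-means component used for AEF classification can be sketched in plain NumPy. The two-dimensional AEF features below (e.g., window mean and variance of field strength) are invented for illustration, and the paper's dynamic attribute weighting is not included.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Plain fuzzy c-means: returns (centers, membership matrix U)."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)               # each row sums to 1
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=-1) + 1e-12
        # u_ik = d_ik^(-2/(m-1)) / sum_j d_ij^(-2/(m-1))
        U = 1.0 / (d ** (2 / (m - 1)) *
                   np.sum(d ** (-2 / (m - 1)), axis=1, keepdims=True))
    return centers, U

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.2, 0.05, (40, 2)),   # fair-weather windows
               rng.normal(2.5, 0.40, (40, 2))])  # thunderstorm windows
centers, U = fuzzy_c_means(X, c=2)
print(centers.round(2), U.argmax(axis=1)[:5], U.argmax(axis=1)[-5:])
```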
Conditional proxy re-encryption (CPRE) is an effective cryptographic primitive that enhances the access control mechanism and makes the delegation of decryption permissions more granular, but most of the attribute-based conditional proxy re-encryption (AB-CPRE) schemes proposed so far do not take into account the importance of user attributes. A weighted attribute-based conditional proxy re-encryption (WAB-CPRE) scheme is thus designed to provide more precise delegation of decryption rights. By introducing the concept of weighted attributes, the quantity of system attributes managed by the server is greatly reduced. At the same time, a weighted tree structure is constructed to simplify the expression of the access structure effectively. With conditional proxy re-encryption, large amounts of data and complex computations are outsourced to cloud servers, so the data owner (DO) can revoke a user's decryption rights directly with minimal cost. The proposed scheme achieves security against chosen-plaintext attacks (CPA). Experimental simulation results demonstrate that the decryption time is within 6–9 ms, and the scheme achieves a significant reduction in communication and computation costs on the user side with better functionality compared to other related schemes, which enables users to access cloud data on devices with limited resources.
Attribute-based encryption (ABE) is a cryptographic framework that provides flexible access control by allowing encryption based on user attributes. ABE is widely applied in cloud storage, file sharing, e-Health, and digital rights management. ABE schemes rely on hard cryptographic assumptions, such as pairings and others (pairing-free), to ensure their security against external and internal attacks. Internal attacks are carried out by authorized users who misuse their access to compromise security with potentially malicious intent. One common internal attack is the attribute collusion attack, in which users with different attribute keys collaborate to decrypt data they could not individually access. This paper focuses on ciphertext-policy ABE (CP-ABE), a type of ABE where ciphertexts are produced with access policies. Our first work is to carry out the attribute collusion attack against several existing pairing-free CP-ABE schemes. As a main contribution, we introduce a novel attack, termed the anonymous key-leakage attack, concerning the context in which users could anonymously publish their secret keys associated with certain attributes on public platforms without the risk of detection. This kind of internal attack has not been defined or investigated in the literature. We then show that several prominent pairing-based CP-ABE schemes are vulnerable to this attack. We believe that this work will help the community evaluate suitable CP-ABE schemes for secure deployment in real-life applications.
Attributed graph clustering plays a vital role in uncovering hidden network structures, but it presents significant challenges. In recent years, various models have been proposed to identify meaningful clusters by integrating both structural and attribute-based information. However, these models often emphasize node proximities without adequately balancing the efficiency of clustering based on both structural and attribute data. Furthermore, they tend to neglect the critical fuzzy information inherent in attributed graph clusters. To address these issues, we introduce a new framework, Markov lumpability optimization, for efficient clustering of large-scale attributed graphs. Specifically, we define a lumped Markov chain on an attribute-augmented graph and introduce a new metric, Markov lumpability, to quantify the differences between the original and lumped Markov transition probability matrices. To minimize this measure, we propose a conjugate gradient projection-based approach that ensures the partitioning closely aligns with the intrinsic structure of fuzzy clusters through conditional optimization. Extensive experiments on both synthetic and real-world datasets demonstrate the superior performance of the proposed framework compared to existing clustering algorithms. This framework has many potential applications, including dynamic community analysis of social networks, user profiling in recommendation systems, functional module identification in biological molecular networks, and financial risk control, offering a new paradigm for mining complex patterns in high-dimensional attributed graph data.
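The lumping operation at the heart of the framework can be sketched directly: a random walk on the (attribute-augmented) graph is aggregated over a candidate partition, and the lumpability condition P H ≈ H P_lumped is checked. This is a minimal sketch; the paper's Markov lumpability metric and its conjugate gradient projection optimization are not reproduced.

```python
import numpy as np

def lump(P, labels, pi=None):
    """Aggregate a row-stochastic P over a hard partition (lumped chain)."""
    n, k = len(P), labels.max() + 1
    H = np.zeros((n, k))
    H[np.arange(n), labels] = 1.0               # hard membership matrix
    pi = np.full(n, 1.0 / n) if pi is None else pi
    w = pi[:, None] * H                         # pi-weighted membership
    return (w.T @ P @ H) / w.sum(axis=0)[:, None], H

# two dense blocks weakly connected: lumpable into a 2-state chain
A = np.block([[np.ones((4, 4)), 0.05 * np.ones((4, 4))],
              [0.05 * np.ones((4, 4)), np.ones((4, 4))]])
np.fill_diagonal(A, 0.0)
P = A / A.sum(axis=1, keepdims=True)            # random-walk transition matrix
labels = np.array([0, 0, 0, 0, 1, 1, 1, 1])
P_l, H = lump(P, labels)
print(P_l.round(3))
# the gap ||PH - H P_l|| is the kind of discrepancy a lumpability metric scores;
# the symmetric example above is exactly lumpable, so the gap is ~0
print("lumpability gap:", np.linalg.norm(P @ H - H @ P_l).round(6))
```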
Spatial-temporal traffic prediction technology is crucial for network planning, optimizing resource allocation, and improving user experience. With the development of virtual network operators, multi-operator collaborations, and edge computing, spatial-temporal traffic data has taken on a distributed nature. Consequently, non-centralized spatial-temporal traffic prediction solutions have emerged as a recent research focus. Currently, the majority of research adopts federated learning methods to train traffic prediction models distributed on each base station. This method reduces the additional burden on communication systems. However, it has a drawback: it cannot handle irregular traffic data. Due to unstable wireless network environments, device failures, insufficient storage resources, and similar causes, data loss inevitably occurs while collecting traffic data, which makes distributed traffic data irregular. Yet, commonly used traffic prediction models such as Recurrent Neural Networks (RNN) and Long Short-Term Memory (LSTM) typically assume that the data is complete and regular. To address the challenge of handling irregular traffic data, this paper transforms irregular traffic prediction into the problems of estimating latent variables and generating future traffic. To solve these problems, this paper introduces split learning to design a structured distributed learning framework. The framework comprises a Global-level Spatial structure mining Model (GSM) and several Node-level Generative Models (NGMs); the NGMs are Seq2Seq models deployed on the base stations, and the GSM is a graph neural network model deployed on the cloud or central controller. Firstly, the time embedding layer in the NGM establishes the mapping relationship between irregular traffic data and regular latent temporal feature variables. Secondly, the GSM collects statistical feature parameters of the latent temporal feature variables from the various nodes and executes graph embedding for spatial-temporal traffic data. Finally, the NGM generates future traffic based on the latent temporal and spatial feature variables. The introduction of the time attention mechanism enhances the framework's capability to handle irregular traffic data. The graph attention network introduces spatially correlated base station traffic feature information into local traffic prediction, which compensates for missing information in local irregular traffic data. The proposed framework effectively addresses the distributed prediction of irregular traffic data. In tests on real-world datasets, the proposed framework improves traffic prediction accuracy by 35% compared to other commonly used distributed traffic prediction methods.
Studies to enhance the management of electrical energy have gained considerable momentum in recent years. The question of how much energy will be needed in households is a pressing issue, as it enables planning of the available resources at the power grid and consumer levels. A non-intrusive inference process can be adopted to predict the amount of energy required by appliances. In this study, an inference process for appliance consumption based on temporal and environmental factors used as a soft sensor is proposed. First, a study of the correlation between the electrical and environmental variables is presented. Then, a resampling process is applied to the initial data set to generate three other subsets of data. All the subsets were evaluated to deduce the adequate granularity for the prediction of the energy demand. Then, a cloud-assisted deep neural network model is designed to forecast short-term energy consumption in a residential area while preserving user privacy. The solution is applied to the consumption data of four appliances selected from a set of real household power data. The experimental results show that the proposed framework is effective for estimating consumption with convincing accuracy.
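The resampling step that derives coarser-granularity subsets from the initial data set is a one-liner with pandas; the column names, frequencies, and synthetic readings below are illustrative, not from the study.

```python
import numpy as np
import pandas as pd

# one synthetic day of per-minute appliance power and temperature readings
idx = pd.date_range("2024-01-01", periods=24 * 60, freq="min")
rng = np.random.default_rng(0)
df = pd.DataFrame(
    {"fridge_w": 80 + 20 * rng.random(len(idx)),
     "temp_c": 20 + 3 * np.sin(np.arange(len(idx)) * 2 * np.pi / (24 * 60))},
    index=idx)

# derive three coarser subsets from the initial per-minute set
subsets = {freq: df.resample(freq).mean() for freq in ("15min", "30min", "1h")}
for freq, sub in subsets.items():
    print(freq, sub.shape)   # each candidate granularity is then evaluated
```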
Conservation and enhancement of old-growth forests are key in forest planning and policies. In order to do so, more knowledge is needed on how the attributes traditionally associated with old-growth forests are distributed in space, what differences exist across distinct forest types, and what natural or anthropic conditions affect the distribution of these old-growthness attributes. Using data from the Third Spanish National Forest Inventory (1997–2007), we calculated six indicators commonly associated with forest old-growthness for the plots in the territory of Peninsular Spain and the Balearic Islands, and then combined them into an aggregated index. We then assessed their spatial distribution and the differences across five forest functional types, as well as the effects of ten climate, topographic, landscape, and anthropic variables on their distribution. Relevant geographical patterns were apparent, with climate factors, namely temperature and precipitation, playing a crucial role in the distribution of these attributes. The distribution of the indicators also varied across different forest types, while the effects of recent anthropic impacts were weaker but still relevant. Aridity seemed to be one of the main impediments to the development of old-growthness attributes, coupled with a negative impact of recent human pressure. However, these effects seemed to be mediated by other factors, especially the legacies imposed by the complex history of forest management practices, land use changes, and natural disturbances that have shaped the forests of Spain. The results of this exploratory analysis highlight, on one hand, the importance of climate in the dynamics of forests towards old-growthness, which is relevant in a context of climate change, and, on the other hand, the need for more insight into the history of our forests in order to understand their present and future.
Generative image steganography is a technique that directly generates stego images from secret information. Unlike traditional methods, it theoretically resists steganalysis because there is no cover image. Existing generative image steganography methods generally achieve good steganographic performance, but there is still room for enhancing both the quality of stego images and the accuracy of secret information extraction. Therefore, this paper proposes a generative image steganography algorithm based on attribute feature transformation and an invertible mapping rule. Firstly, the reference image is disentangled by a content encoder and an attribute encoder to obtain content features and attribute features, respectively. Then, a mean mapping rule is introduced to map the binary secret information into a noise vector conforming to the distribution of the attribute features. This noise vector is input into the generator to produce the attribute-transformed stego image with the content features of the reference image. Additionally, we design an adversarial loss, a reconstruction loss, and an image diversity loss to train the proposed model. Experimental results demonstrate that the stego images generated by the proposed method are of high quality, with an average extraction accuracy of 99.4% for the hidden information. Furthermore, since the stego image has a distribution similar to that of an attribute-transformed image without secret information, it effectively resists both subjective and objective steganalysis.
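The abstract does not spell out the mean mapping rule, so the sketch below only illustrates the general idea of that step: secret bits select halves of a standard normal so that the resulting noise vector still looks like the expected N(0, 1) input, and extraction inverts the mapping deterministically. This is an invented stand-in, not the paper's rule.

```python
import numpy as np

def bits_to_noise(bits, rng=None):
    """Map secret bits to a noise vector that still looks ~N(0, 1).

    Illustrative rule only: bit 1 picks the positive half of a standard
    normal, bit 0 the negative half; |z| keeps the half-normal shape, so
    over random bits the marginal of z remains standard normal.
    """
    rng = rng or np.random.default_rng()
    z = np.abs(rng.standard_normal(len(bits)))
    return np.where(np.asarray(bits) == 1, z, -z)

def noise_to_bits(z):
    return (np.asarray(z) > 0).astype(int)   # extraction is just the sign

secret = np.array([1, 0, 1, 1, 0, 0, 1, 0])
z = bits_to_noise(secret, np.random.default_rng(7))
assert (noise_to_bits(z) == secret).all()    # lossless recovery
print(z.round(3))
```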
The information from sparsely logged wellbores is currently under-utilized in reservoir simulation models and their proxies using deep and machine learning (DL/ML). This is particularly problematic for large heterogeneous gas/oil reservoirs being considered for repurposing as gas storage reservoirs for CH₄, CO₂, or H₂ and/or enhanced oil recovery technologies. Lack of well-log data leads to inadequate spatial definition of complex models due to the large uncertainties associated with the extrapolation of petrophysical rock types (PRT) calibrated with limited core data across heterogeneous and/or anisotropic reservoirs. Extracting well-log attributes from the few well logs available in many wells and tying PRT predictions based on them to seismic data has the potential to substantially improve the confidence in PRT 3D-mapping across such reservoirs. That process becomes more efficient when coupled with DL/ML models incorporating feature importance and optimized, dual-objective feature selection techniques.
0 INTRODUCTION As a high-risk construction project, underground engineering is characterized by large investment, long construction periods, complex construction techniques, numerous unforeseeable risk factors, and significant environmental impacts. Identifying potential disaster risks from the intricate web of influencing factors plays a critical role in ensuring project safety.
Ground penetrating radar (GPR) attribute technology has been applied in many fields in recent years, but there are very few examples in archaeology. In particular, how can we extract effective attributes from two- or three-dimensional radar data to map and describe numerous archaeological targets across a large cultural site? In this paper, we applied GPR attribute technology to investigate the ancient Nanzhao castle site in Tengchong, Yunnan Province. To better analyze and describe the archaeological targets (the ancient wall, the ancient kiln site, and the ancient tomb), we collated the GPR data through standardized collection and then loaded them into a seismic data processing and interpretation workstation. The data were processed, including extraction, analysis, and optimization of a variety of GPR attributes, and combined with the archaeological drilling data. We chose the RMS Amplitude, Average Peak Amplitude, Instantaneous Phase, and Maximum Peak Time to interpret the three archaeological targets. Through comparative analysis, we clarified that different attributes should be used to interpret different archaeological targets, and that the results of attribute analysis after horizon tracking are much better than results based on a time slice.
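Two of the attributes named above are easy to sketch from a single trace: sliding-window RMS amplitude, and instantaneous phase from the analytic (Hilbert-transformed) signal. The synthetic trace and window length are illustrative.

```python
import numpy as np
from scipy.signal import hilbert

def rms_amplitude(trace, win=16):
    """Sliding-window RMS amplitude along one GPR trace."""
    kernel = np.ones(win) / win
    return np.sqrt(np.convolve(np.asarray(trace, float) ** 2, kernel, mode="same"))

def instantaneous_phase(trace):
    """Phase of the analytic signal, in radians."""
    return np.angle(hilbert(trace))

# synthetic trace: one reflection wavelet buried in noise
t = np.arange(512)
rng = np.random.default_rng(3)
trace = (np.exp(-((t - 200) / 15.0) ** 2) * np.cos(0.5 * t)
         + 0.05 * rng.standard_normal(512))
print(rms_amplitude(trace).argmax())              # peaks near sample 200, the event
print(instantaneous_phase(trace)[195:200].round(2))
```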
To investigate the distribution and velocity attributes of gas hydrates in the northern continental slope of South China Sea, Guangzhou Marine Geological Survey conducted four-component (4C) ocean-bottom seismometer (OBS) surveys. A case study is presented to show the results of acquiring and processing OBS data for detecting gas hydrates. Key processing steps such as repositioning, reorientation, PZ summation, and mirror imaging are discussed. Repositioning and reorientation find the correct location and direction of nodes. PZ summation matches P- and Z-components and sums them to separate upgoing and downgoing waves. Upgoing waves are used in conventional imaging, whereas downgoing waves are used in mirror imaging. Mirror imaging uses the energy of the receiver ghost reflection to improve the illumination of shallow structures, where gas hydrates and the associated bottom-simulating reflections (BSRs) are located. We developed a new method of velocity analysis using mirror imaging. The proposed method is based on velocity scanning and iterative prestack time migration. The final imaging results are promising. When combined with the derived velocity field, we can characterize the BSR and shallow structures; hence, we conclude that using 4C OBS can reveal the distribution and velocity attributes of gas hydrates.
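The PZ summation step can be sketched as a matched sum and difference of the hydrophone (P) and vertical geophone (Z) traces; sign conventions vary with acquisition, and in practice the scalar below is a frequency-dependent matching filter, so this is only a minimal sketch of the idea.

```python
import numpy as np

def pz_separation(p, z, scalar):
    """Split up/downgoing wavefields at the seafloor by matched PZ summation.

    p, z   : calibrated hydrophone and vertical-geophone traces
    scalar : amplitude/impedance match between P and Z (illustrative)
    One common sign convention is shown; others flip the roles of +/-.
    """
    zs = scalar * z
    up = 0.5 * (p + zs)    # P and Z in phase for upgoing energy
    down = 0.5 * (p - zs)  # receiver ghost: the input to mirror imaging
    return up, down

# synthetic: an upgoing arrival at sample 100 and its downgoing ghost at 160
n = 256
up_true, ghost = np.zeros(n), np.zeros(n)
up_true[100], ghost[160] = 1.0, -0.8
p = up_true + ghost        # pressure records both with the same polarity
z = up_true - ghost        # vertical velocity flips the downgoing polarity
up, down = pz_separation(p, z, scalar=1.0)
print(up[100], down[160])  # 1.0 and -0.8 recovered
```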
基金supported by the National Natural Science Foundation of China(Nos.U19A2044,42105132,42030609,41975037,and 42105133)the National Key Research and Development Program of China(No.2022YFC3703502)+1 种基金the Plan for Anhui Major Provincial Science&Technology Project(No.202203a07020003)Hefei Ecological Environment Bureau Project(No.2020BFFFD01804).
文摘As a significant city in the Yangtze River Delta regions,Hefei has experienced rapid changes in the sources of air pollution due to its high-speed economic development and urban expansion.However,there has been limited research in recent years on the spatial-temporal distribution and emission of its atmospheric pollutants.To address this,this study conducted mobile observations of urban roads using the Mobile-DOAS instrument from June 2021 to May 2022.The monitoring results exhibit a favourable consistent with TROPOMI satellite data and ground monitoring station data.Temporally,there were pronounced seasonal variations in air pollutants.Spatially,high concentration of HCHO and NO_(2)were closely associated with traffic congestion on roadways,while heightened SO_(2)levels were attributed to winter heating and industrial emissions.The study also revealed that with the implementation of road policies,the average vehicle speed increased by 95.4%,while the NO concentration decreased by 54.4%.In the estimation of urban NO_(x)emission flux,it was observed that in temporal terms,compared with inventory data,the emissions calculated viamobile measurements exhibitedmore distinct seasonal patterns,with the highest emission rate of 349 g/sec in winter and the lowest of 142 g/sec in summer.In spatial terms,the significant difference in emissions between the inner and outer ring roads also suggests the presence of the city’s primary NO_(x)emission sources in the area between these two rings.This study offers data support for formulating the next phase of air pollution control measures in urban areas.
文摘Normal forms have a significant role in the theory of relational database normalization.The definitions of normal forms are established through the functional dependency(FD)relationship between a prime or nonprime attribute and a key.However,determining whether an attribute is a prime attribute is a nondeterministic polynomial-time complete(NP-complete)problem,making it intractable to determine if a relation scheme is in a specific normal form.While the prime attribute problem is generally NP-complete,there are cases where identifying prime attributes is not challenging.In a relation scheme R(U,F),we partition U into four distinct subsets based on where attributes in U appear in F:U_(1)(attributes only appearing on the left-hand side of FDs),U_(2)(attributes only appearing on the right-hand side of FDs),U_(3)(attributes appearing on both sides of FDs),and U_(4)(attributes not present in F).Next,we demonstrate the necessary and sufficient conditions for a key to be the unique key of a relation scheme.Subsequently,we illustrate the features of prime attributes in U_(3) and generalize the features of common prime attributes.The findings lay the groundwork for distinguishing between complex and simple cases in prime attribute identification,thereby deepening the understanding of this problem.
基金supported by the National Office for Philosophy and Social Sciences(grant reference 22&ZD067).
文摘In the current situation of decelerating economic expansion,examining the digital economy(DE)as a novel economic model is beneficial for the local economy’s sustainable and high-quality development(HQD).We analyzed panel data from the Yellow River(YR)region from 2013 to 2021 and discovered notable spatial variances in the composite index and coupling coordination of the two systems.Specifically,the downstream region exhibited the highest coupling coordination,while the upstream region had the lowest.We identified that favorable factors such as economic development,innovation,industrial upgrading,and government intervention can bolster the coupling.Our findings provide a valuable framework for promoting DE and HQD in the YR region.
基金supported by the Natural Science Foundation of China(Grant No.72571150)。
文摘Detecting overlapping communities in attributed networks remains a significant challenge due to the complexity of jointly modeling topological structure and node attributes,the unknown number of communities,and the need to capture nodes with multiple memberships.To address these issues,we propose a novel framework named density peaks clustering with neutrosophic C-means.First,we construct a consensus embedding by aligning structure-based and attribute-based representations using spectral decomposition and canonical correlation analysis.Then,an improved density peaks algorithm automatically estimates the number of communities and selects initial cluster centers based on a newly designed cluster strength metric.Finally,a neutrosophic C-means algorithm refines the community assignments,modeling uncertainty and overlap explicitly.Experimental results on synthetic and real-world networks demonstrate that the proposed method achieves superior performance in terms of detection accuracy,stability,and its ability to identify overlapping structures.
基金supported by the National Natural Science Foundation of China(Grant Nos.62472149,62376089,62202147)Hubei Provincial Science and Technology Plan Project(2023BCB04100).
文摘Accurate traffic flow prediction has a profound impact on modern traffic management. Traffic flow has complex spatial-temporal correlations and periodicity, which poses difficulties for precise prediction. To address this problem, a Multi-head Self-attention and Spatial-Temporal Graph Convolutional Network (MSSTGCN) for multiscale traffic flow prediction is proposed. Firstly, to capture the hidden traffic periodicity of traffic flow, traffic flow is divided into three kinds of periods, including hourly, daily, and weekly data. Secondly, a graph attention residual layer is constructed to learn the global spatial features across regions. Local spatial-temporal dependence is captured by using a T-GCN module. Thirdly, a transformer layer is introduced to learn the long-term dependence in time. A position embedding mechanism is introduced to label position information for all traffic sequences. Thus, this multi-head self-attention mechanism can recognize the sequence order and allocate weights for different time nodes. Experimental results on four real-world datasets show that the MSSTGCN performs better than the baseline methods and can be successfully adapted to traffic prediction tasks.
基金supported by the National Natural Science Foundation of China under Grant 61972057Hunan Provincial Natural Science Foundation of China under Grant 2022JJ30623.
文摘Linguistic steganography(LS)aims to embed secret information into normal natural text for covert communication.It includes modification-based(MLS)and generation-based(GLS)methods.MLS often relies on limited manual rules,resulting in low embedding capacity,while GLS achieves higher embedding capacity through automatic text generation but typically ignores extraction efficiency.To address this,we propose a sentence attribute encodingbased MLS method that enhances extraction efficiency while maintaining strong performance.The proposed method designs a lightweight semantic attribute analyzer to encode sentence attributes for embedding secret information.When the attribute values of the cover sentence differ from the secret information to be embedded,a semantic attribute adjuster based on paraphrasing is used to automatically generate paraphrase sentences of the target attribute,thereby improving the problem of insufficient manual rules.During the extraction,secret information can be extracted solely by employing the semantic attribute analyzer,thereby eliminating the dependence on the paraphrasing generation model.Experimental results show that thismethod achieves an extraction speed of 1141.54 bits/sec,compared with the existing methods,it has remarkable advantages regarding extraction speed.Meanwhile,the stego text generated by thismethod respectively reaches 68.53,39.88,and 80.77 on BLEU,△PPL,and BERTScore.Compared with the existing methods,the text quality is effectively improved.
基金supported by National Natural Science Foundation of China(No.62102449).
文摘Security attributes are the premise and foundation for implementing Attribute-Based Access Control(ABAC)mechanisms.However,when dealing with massive volumes of unstructured text big data resources,the current attribute management methods based on manual extraction face several issues,such as high costs for attribute extraction,long processing times,unstable accuracy,and poor scalability.To address these problems,this paper proposes an attribute mining technology for access control institutions based on hybrid capsule networks.This technology leverages transfer learning ideas,utilizing Bidirectional Encoder Representations from Transformers(BERT)pre-trained language models to achieve vectorization of unstructured text data resources.Furthermore,we have designed a novel end-to-end parallel hybrid network structure,where the parallel networks handle global and local information features of the text that they excel at,respectively.By employing techniques such as attention mechanisms,capsule networks,and dynamic routing,effective mining of security attributes for access control resources has been achieved.Finally,we evaluated the performance level of the proposed attribute mining method for access control institutions through experiments on the medical referral text resource dataset.The experimental results show that,compared with baseline algorithms,our method adopts a parallel network structure that can better balance global and local feature information,resulting in improved overall performance.Specifically,it achieves a comprehensive performance enhancement of 2.06%to 8.18%in the F1 score metric.Therefore,this technology can effectively provide attribute support for access control of unstructured text big data resources.
基金co-funded by the China National Nuclear Corporation-State Key Laboratory of Nuclear Resources and Environment(East ChinaUniversity of Technology)Joint Innovation Fund Project(No.2023NRE-LH-08)the Natural Science Foundation of Jiangxi Province,China(No.20252BAC240270)+1 种基金the Funding of National Key Laboratory of Uranium Resources Exploration-Mining and Nuclear Remote Sensing(2025QZ-YZZ-08)the National Major Science and Technology Project on Deep Earth of China(No.2024ZD 1003300)。
文摘Seismic attributes encapsulate substantial reservoir characterization information and can effectively support reservoir prediction.Given the high-dimensional nonlinear between sandbodies and seismic attributes,this study employs the RFECV method for seismic attribute selection,inputting the optimized attributes into a LightGBM model to enhance spatial delineation of sandbody identification.By constructing training datasets based on optimized seismic attributes and well logs,followed by class imbalance correction as input variables for machine learning models,with sandbody probability as the output variable,and employing grid search to optimize model parameters,a high-precision sandbody prediction model was established.Taking the 3D seismic data of Block F3 in the North Sea of Holland as an example,this method successfully depicted the three-dimensional spatial distribution of target formation sandstones.The results indicate that even under strong noise conditions,the multi-attribute sandbody identification method based on LightGBM effectively characterizes the distribution features of sandbodies.Compared to unselected attributes,the prediction results using selected attributes have higher vertical resolution and inter-well conformity,with the prediction accuracy for single wells reaching 80.77%,significantly improving the accuracy of sandbody boundary delineation.
基金supported in part by the National Natural Science Foundation of China under Grant 62171228in part by the National Key R&D Program of China under Grant 2021YFE0105500in part by the Program of China Scholarship Council under Grant 202209040027。
文摘Thunderstorm detection based on the Atmospheric Electric Field(AEF)has evolved from time-domain models to space-domain models.It is especially important to evaluate and determine the particularly Weather Attribute(WA),which is directly related to the detection reliability and authenticity.In this paper,a strategy is proposed to integrate three currently competitive WA's evaluation methods.First,a conventional evaluation method based on AEF statistical indicators is selected.Subsequent evaluation approaches include competing AEF-based predicted value intervals,and AEF classification based on fuzzy c-means.Different AEF attributes contribute to a more accurate AEF classification to different degrees.The resulting dynamic weighting applied to these attributes improves the classification accuracy.Each evaluation method is applied to evaluate the WA of a particular AEF,to obtain the corresponding evaluation score.The integration in the proposed strategy takes the form of a score accumulation.Different cumulative score levels correspond to different final WA results.Thunderstorm imaging is performed to visualize thunderstorm activities using those AEFs already evaluated to exhibit thunderstorm attributes.Empirical results confirm that the proposed strategy effectively and reliably images thunderstorms,with a 100%accuracy of WA evaluation.This is the first study to design an integrated thunderstorm detection strategy from a new perspective of WA evaluation,which provides promising solutions for a more reliable and flexible thunderstorm detection.
基金Programs for Science and Technology Development of Henan Province,grant number 242102210152The Fundamental Research Funds for the Universities of Henan Province,grant number NSFRF240620+1 种基金Key Scientific Research Project of Henan Higher Education Institutions,grant number 24A520015Henan Key Laboratory of Network Cryptography Technology,grant number LNCT2022-A11.
文摘Conditional proxy re-encryption(CPRE)is an effective cryptographic primitive language that enhances the access control mechanism and makes the delegation of decryption permissions more granular,but most of the attribute-based conditional proxy re-encryption(AB-CPRE)schemes proposed so far do not take into account the importance of user attributes.A weighted attribute-based conditional proxy re-encryption(WAB-CPRE)scheme is thus designed to provide more precise decryption rights delegation.By introducing the concept of weight attributes,the quantity of system attributes managed by the server is reduced greatly.At the same time,a weighted tree structure is constructed to simplify the expression of access structure effectively.With conditional proxy re-encryption,large amounts of data and complex computations are outsourced to cloud servers,so the data owner(DO)can revoke the user’s decryption rights directly with minimal costs.The scheme proposed achieves security against chosen plaintext attacks(CPA).Experimental simulation results demonstrated that the decryption time is within 6–9 ms,and it has a significant reduction in communication and computation cost on the user side with better functionality compared to other related schemes,which enables users to access cloud data on devices with limited resources.
文摘Attribute-based encryption(ABE)is a cryptographic framework that provides flexible access control by allowing encryption based on user attributes.ABE is widely applied in cloud storage,file sharing,e-Health,and digital rightsmanagement.ABE schemes rely on hard cryptographic assumptions such as pairings and others(pairingfree)to ensure their security against external and internal attacks.Internal attacks are carried out by authorized users who misuse their access to compromise security with potentially malicious intent.One common internal attack is the attribute collusion attack,in which users with different attribute keys collaborate to decrypt data they could not individually access.This paper focuses on the ciphertext-policy ABE(CP-ABE),a type of ABE where ciphertexts are produced with access policies.Our firstwork is to carry out the attribute collusion attack against several existing pairingfree CP-ABE schemes.As a main contribution,we introduce a novel attack,termed the anonymous key-leakage attack,concerning the context in which users could anonymously publish their secret keys associated with certain attributes on public platforms without the risk of detection.This kind of internal attack has not been defined or investigated in the literature.We then show that several prominent pairing-based CP-ABE schemes are vulnerable to this attack.We believe that this work will contribute to helping the community evaluate suitable CP-ABE schemes for secure deployment in real-life applications.
基金supported by the National Natural Science Foundation of China(Grant No.72571150)Beijing Natural Science Foundation(Grant No.9182015)。
文摘Attributed graph clustering plays a vital role in uncovering hidden network structures,but it presents significant challenges.In recent years,various models have been proposed to identify meaningful clusters by integrating both structural and attribute-based information.However,these models often emphasize node proximities without adequately balancing the efficiency of clustering based on both structural and attribute data.Furthermore,they tend to neglect the critical fuzzy information inherent in attributed graph clusters.To address these issues,we introduce a new framework,Markov lumpability optimization,for efficient clustering of large-scale attributed graphs.Specifically,we define a lumped Markov chain on an attribute-augmented graph and introduce a new metric,Markov lumpability,to quantify the differences between the original and lumped Markov transition probability matrices.To minimize this measure,we propose a conjugate gradient projectionbased approach that ensures the partitioning closely aligns with the intrinsic structure of fuzzy clusters through conditional optimization.Extensive experiments on both synthetic and real-world datasets demonstrate the superior performance of the proposed framework compared to existing clustering algorithms.This framework has many potential applications,including dynamic community analysis of social networks,user profiling in recommendation systems,functional module identification in biological molecular networks,and financial risk control,offering a new paradigm for mining complex patterns in high-dimensional attributed graph data.
基金supported by the Beijing Natural Science Foundation(Certificate Number:L234025).
文摘Spatial-temporal traffic prediction technology is crucial for network planning,resource allocation optimizing,and user experience improving.With the development of virtual network operators,multi-operator collaborations,and edge computing,spatial-temporal traffic data has taken on a distributed nature.Consequently,noncentralized spatial-temporal traffic prediction solutions have emerged as a recent research focus.Currently,the majority of research typically adopts federated learning methods to train traffic prediction models distributed on each base station.This method reduces additional burden on communication systems.However,this method has a drawback:it cannot handle irregular traffic data.Due to unstable wireless network environments,device failures,insufficient storage resources,etc.,data missing inevitably occurs during the process of collecting traffic data.This results in the irregular nature of distributed traffic data.Yet,commonly used traffic prediction models such as Recurrent Neural Networks(RNN)and Long Short-Term Memory(LSTM)typically assume that the data is complete and regular.To address the challenge of handling irregular traffic data,this paper transforms irregular traffic prediction into problems of estimating latent variables and generating future traffic.To solve the aforementioned problems,this paper introduces split learning to design a structured distributed learning framework.The framework comprises a Global-level Spatial structure mining Model(GSM)and several Nodelevel Generative Models(NGMs).NGM and GSM represent Seq2Seq models deployed on the base station and graph neural network models deployed on the cloud or central controller.Firstly,the time embedding layer in NGM establishes the mapping relationship between irregular traffic data and regular latent temporal feature variables.Secondly,GSM collects statistical feature parameters of latent temporal feature variables from various nodes and executes graph embedding for spatial-temporal traffic data.Finally,NGM generates future traffic based on latent temporal and spatial feature variables.The introduction of the time attention mechanism enhances the framework’s capability to handle irregular traffic data.Graph attention network introduces spatially correlated base station traffic feature information into local traffic prediction,which compensates for missing information in local irregular traffic data.The proposed framework effectively addresses the distributed prediction issues of irregular traffic data.By testing on real world datasets,the proposed framework improves traffic prediction accuracy by 35%compared to other commonly used distributed traffic prediction methods.
基金funded by NARI Group’s Independent Project of China(Grant No.524609230125)the Foundation of NARI-TECH Nanjing Control System Ltd.of China(Grant No.0914202403120020).
文摘Studies to enhance the management of electrical energy have gained considerable momentum in recent years. The question of how much energy will be needed in households is a pressing issue as it allows the management plan of the available resources at the power grids and consumer levels. A non-intrusive inference process can be adopted to predict the amount of energy required by appliances. In this study, an inference process of appliance consumption based on temporal and environmental factors used as a soft sensor is proposed. First, a study of the correlation between the electrical and environmental variables is presented. Then, a resampling process is applied to the initial data set to generate three other subsets of data. All the subsets were evaluated to deduce the adequate granularity for the prediction of the energy demand. Then, a cloud-assisted deep neural network model is designed to forecast short-term energy consumption in a residential area while preserving user privacy. The solution is applied to the consumption data of four appliances elected from a set of real household power data. The experiment results show that the proposed framework is effective for estimating consumption with convincing accuracy.
基金supported by the Spanish Ministry of Science and Innovation project GREEN-RISK(Evaluation of past changes in ecosystem services and biodiversity in forests and restoration priorities under global change impacts-PID2020-119933RB-C21)A.C.received a pre-doctoral fellowship funded by the Spanish Ministry of Science and Innovation(PRE2021-099642).
文摘Conservation and enhancement of old-growth forests are key in forest planning and policies.In order to do so,more knowledge is needed on how the attributes traditionally associated with old-growth forests are distributed in space,what differences exist across distinct forest types and what natural or anthropic conditions are affecting the distribution of these old-growthness attributes.Using data from the Third Spanish National Forest Inventory(1997–2007),we calculated six indicators commonly associated with forest old-growthness for the plots in the territory of Peninsular Spain and Balearic Islands,and then combined them into an aggregated index.We then assessed their spatial distribution and the differences across five forest functional types,as well as the effects of ten climate,topographic,landscape,and anthropic variables in their distribution.Relevant geographical patterns were apparent,with climate factors,namely temperature and precipitation,playing a crucial role in the distribution of these attributes.The distribution of the indicators also varied across different forest types,while the effects of recent anthropic impacts were weaker but still relevant.Aridity seemed to be one of the main impediments for the development of old-growthness attributes,coupled with a negative impact of recent human pressure.However,these effects seemed to be mediated by other factors,specially the legacies imposed by the complex history of forest management practices,land use changes and natural disturbances that have shaped the forests of Spain.The results of this exploratory analysis highlight on one hand the importance of climate in the dynamic of forests towards old-growthness,which is relevant in a context of Climate Change,and on the other hand,the need for more insights on the history of our forests in order to understand their present and future.
Funding: supported in part by the National Natural Science Foundation of China (Nos. 62202234 and 62401270), the China Postdoctoral Science Foundation (No. 2023M741778), and the Natural Science Foundation of Jiangsu Province (Nos. BK20240706 and BK20240694).
Abstract: Generative image steganography is a technique that directly generates stego images from secret information. Unlike traditional methods, it theoretically resists steganalysis because there is no cover image. Existing generative image steganography methods generally perform well, but there is still room to improve both the quality of the stego images and the accuracy of secret-information extraction. This paper therefore proposes a generative image steganography algorithm based on attribute feature transformation and an invertible mapping rule. First, a reference image is disentangled by a content encoder and an attribute encoder to obtain content features and attribute features, respectively. Then, a mean mapping rule is introduced to map the binary secret information into a noise vector conforming to the distribution of the attribute features. This noise vector is fed into the generator to produce an attribute-transformed stego image that retains the content features of the reference image. Additionally, we design an adversarial loss, a reconstruction loss, and an image diversity loss to train the proposed model. Experimental results demonstrate that the stego images generated by the proposed method are of high quality, with an average extraction accuracy of 99.4% for the hidden information. Furthermore, since the stego image follows a distribution similar to that of an attribute-transformed image carrying no secret information, it effectively resists both subjective and objective steganalysis.
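The mean mapping rule is not specified in full here, but one plausible reading is that each secret bit selects which side of the attribute-feature mean its noise sample falls on, so the vector still roughly follows the attribute distribution while the bits remain recoverable by comparison with the mean. The sketch below illustrates that interpretation; it is an assumption, not the paper's exact rule.

```python
import numpy as np

rng = np.random.default_rng(42)

def embed(bits, mu=0.0, sigma=1.0):
    # Draw half-Gaussian magnitudes, then place each sample above (bit 1)
    # or below (bit 0) the assumed attribute-feature mean mu.
    z = np.abs(rng.normal(0.0, sigma, size=len(bits)))
    signs = np.where(np.asarray(bits) == 1, 1.0, -1.0)
    return mu + signs * z

def extract(noise, mu=0.0):
    return (noise > mu).astype(int)  # invertible: compare with the mean

bits = [1, 0, 1, 1, 0]
assert list(extract(embed(bits))) == bits
```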
Abstract: The information from sparsely logged wellbores is currently under-utilized in reservoir simulation models and their deep-learning/machine-learning (DL/ML) proxies. This is particularly problematic for large heterogeneous gas/oil reservoirs being considered for repurposing as storage reservoirs for CH_(4), CO_(2), or H_(2), and/or for enhanced oil recovery technologies. The lack of well-log data leads to inadequate spatial definition of complex models, owing to the large uncertainties involved in extrapolating petrophysical rock types (PRT) calibrated with limited core data across heterogeneous and/or anisotropic reservoirs. Extracting well-log attributes from the few logs available in many wells, and tying the PRT predictions based on them to seismic data, can substantially improve confidence in 3D PRT mapping across such reservoirs. The process becomes more efficient when coupled with DL/ML models that incorporate feature importance and optimized, dual-objective feature-selection techniques.
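As one way to picture a dual-objective selection, the hedged sketch below greedily adds well-log attributes to a PRT classifier while charging a per-feature penalty, trading cross-validated accuracy against attribute count; the greedy wrapper, the random-forest classifier, and the penalty value are illustrative stand-ins for the optimized method referred to above.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def dual_objective_select(X, y, penalty=0.01):
    """Greedy forward selection: maximize CV accuracy minus a feature-count cost."""
    remaining, chosen, best = list(range(X.shape[1])), [], -np.inf
    while remaining:
        scores = {f: cross_val_score(
                      RandomForestClassifier(n_estimators=50, random_state=0),
                      X[:, chosen + [f]], y, cv=3).mean()
                     - penalty * (len(chosen) + 1)
                  for f in remaining}
        f, s = max(scores.items(), key=lambda kv: kv[1])
        if s <= best:
            break  # another attribute no longer pays for itself
        chosen.append(f); remaining.remove(f); best = s
    return chosen

# Toy PRT-like labels from synthetic "well-log attributes".
rng = np.random.default_rng(0)
X = rng.random((120, 8)); y = (X[:, 0] + X[:, 3] > 1).astype(int)
print(dual_objective_select(X, y))
```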
Funding: supported by the National Natural Science Foundation of China (Nos. 42107211 and 42130719), the Natural Science Foundation of Sichuan Province (No. 2025ZNSFSC0097), and the open project of the State Key Laboratory of Performance Monitoring and Protecting of Rail Transit Infrastructure, East China Jiaotong University (No. HJGZ2022104).
Abstract: As a high-risk type of construction project, underground engineering is characterized by large investment, long construction periods, complex construction techniques, numerous unforeseeable risk factors, and significant environmental impacts. Identifying potential disaster risks within the intricate web of influencing factors plays a critical role in ensuring project safety.
Funding: sponsored by the National Natural Science Foundation of China (Grant No. 41176167) and the Projects of Cultural Heritage Protection, Zhejiang Province (Grant Nos. 2010001 and 2011008).
Abstract: Ground penetrating radar (GPR) attribute technology has been applied in many fields in recent years, but there are very few examples in archaeology. In particular, how can effective attributes be extracted from two- or three-dimensional radar data so that numerous archaeological targets in a large cultural site can be mapped and described? In this paper, we applied GPR attribute technology to investigate the ancient Nanzhao castle site in Tengchong, Yunnan Province. To better analyze and describe the archaeological targets (the ancient wall, the ancient kiln site, and the ancient tomb), we standardized the collected GPR data and loaded them into a seismic data processing and interpretation workstation. The processing included the extraction, analysis, and optimization of a variety of GPR attributes, combined with archaeological drilling data. We chose the RMS amplitude, average peak amplitude, instantaneous phase, and maximum peak time to interpret the three archaeological targets. Comparative analysis clarified that different attributes should be used to interpret different archaeological targets, and that attribute analysis after horizon tracking yields much better results than analysis based on a time slice.
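Two of the attributes named above are straightforward to compute on a single trace. The sketch below shows windowed RMS amplitude and instantaneous phase (via the analytic signal) on a synthetic trace; the window length and the decaying sinusoid are illustrative choices, not the survey's parameters.

```python
import numpy as np
from scipy.signal import hilbert

def rms_amplitude(trace, win=32):
    # Root-mean-square amplitude in a sliding window of `win` samples.
    power = np.convolve(trace.astype(float) ** 2, np.ones(win) / win, mode="same")
    return np.sqrt(power)

def instantaneous_phase(trace):
    # Phase of the analytic signal, in radians, per sample.
    return np.angle(hilbert(trace))

t = np.linspace(0.0, 1.0, 512)
trace = np.sin(2 * np.pi * 40 * t) * np.exp(-3 * t)  # toy decaying wavelet
print(rms_amplitude(trace)[:4], instantaneous_phase(trace)[:4])
```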
Funding: supported by the National Hi-tech Research and Development Program of China (863 Program) (Grant No. 2013AA092501) and the China Geological Survey Projects (Grant Nos. GZH201100303 and GZH201100305).
Abstract: To investigate the distribution and velocity attributes of gas hydrates in the northern continental slope of the South China Sea, the Guangzhou Marine Geological Survey conducted four-component (4C) ocean-bottom seismometer (OBS) surveys. A case study is presented to show the results of acquiring and processing OBS data for detecting gas hydrates. Key processing steps, such as repositioning, reorientation, PZ summation, and mirror imaging, are discussed. Repositioning and reorientation recover the true location and orientation of the nodes. PZ summation matches the P- and Z-components and sums them to separate upgoing and downgoing waves. Upgoing waves are used in conventional imaging, whereas downgoing waves are used in mirror imaging. Mirror imaging uses the energy of the receiver ghost reflection to improve the illumination of shallow structures, where gas hydrates and the associated bottom-simulating reflections (BSRs) are located. We also developed a new method of velocity analysis based on mirror imaging, using velocity scanning and iterative prestack time migration. The final imaging results are promising: combined with the derived velocity field, they characterize the BSRs and shallow structures. We therefore conclude that 4C OBS data can reveal the distribution and velocity attributes of gas hydrates.
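As a minimal sketch of the PZ-summation idea, assuming the hydrophone (P) and vertical geophone (Z) traces have already been calibrated to a common amplitude scale, the sum and difference below separate the wavefield; the sign convention and the scalar s are assumptions, and real processing estimates the matching filter from the data.

```python
import numpy as np

def pz_summation(p, z, s=1.0):
    up = 0.5 * (p + s * z)    # upgoing wavefield -> conventional imaging
    down = 0.5 * (p - s * z)  # downgoing (ghost) wavefield -> mirror imaging
    return up, down

# Toy calibrated traces.
p = np.array([1.0, 0.5, -0.2]); z = np.array([0.9, 0.6, -0.1])
up, down = pz_summation(p, z)
print(up, down)
```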