In the heterogeneous power Internet of Things (IoT) environment, data signals are acquired to support different business systems in realizing advanced intelligent applications, and these signals are massive, multi-source, and heterogeneous. Reliable perception of information and efficient transmission of energy in multi-source heterogeneous environments are therefore crucial issues. Compressive sensing (CS), an effective method of signal compression and transmission, can accurately recover the original signal from very few samples. In this paper, we study a new method for multi-source heterogeneous data signal reconstruction in the power IoT based on compressive sensing. Building on traditional compressive sensing, which recovers multi-source heterogeneous signals directly, we fully exploit the interference-subspace information to design the measurement matrix, which effectively eliminates interference at measurement time. The measurement matrix is further optimized by minimizing its average cross-coherence, which further improves the reconstruction performance of the new method. Finally, the effectiveness of the new method under different parameter settings and different multi-source heterogeneous signal cases is verified using orthogonal matching pursuit (OMP) and sparsity adaptive matching pursuit (SAMP), covering the practical scenarios in which prior information about the signal sparsity is and is not available.
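The abstract leaves the reconstruction step unspecified; as a minimal sketch of the standard OMP algorithm it mentions (the toy sizes and random Gaussian measurement matrix are illustrative assumptions, not the paper's interference-aware design):

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily add the column of A most
    correlated with the residual, then re-fit by least squares."""
    n = A.shape[1]
    residual, support = y.copy(), []
    coef = np.zeros(0)
    for _ in range(k):
        # column most correlated with the current residual
        idx = int(np.argmax(np.abs(A.T @ residual)))
        if idx not in support:
            support.append(idx)
        # least-squares re-fit on the selected support
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(n)
    x[support] = coef
    return x

# 3-sparse signal of length 64 recovered from 32 noiseless random measurements
rng = np.random.default_rng(0)
n, m, k = 64, 32, 3
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[[5, 20, 40]] = [1.0, -2.0, 0.5]
x_hat = omp(A, A @ x_true, k)
```

With noiseless measurements and a well-conditioned random matrix, the recovered `x_hat` matches `x_true` up to numerical precision, which is the behavior the paper's measurement-matrix optimization aims to make more robust.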
Due to the development of cloud computing and machine learning, users can upload their data to the cloud for machine learning model training. However, dishonest clouds may infer user data, resulting in data leakage. Previous schemes have achieved secure outsourced computing, but they suffer from low computational accuracy, difficulty in handling heterogeneously distributed data from multiple sources, and high computational cost, which result in a poor user experience and expensive cloud computing. To address these problems, we propose a multi-precision, multi-sourced, and multi-key outsourcing neural network training scheme. First, we design a multi-precision functional-encryption computation based on Euclidean division. Second, we design an outsourced model training algorithm based on this multi-precision functional encryption with multi-sourced heterogeneity. Finally, we conduct experiments on three datasets. The results indicate that our framework achieves an accuracy improvement of 6% to 30%. Additionally, it offers a memory space optimization of 1.0×2^(24) times compared to the previous best approach.
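The paper's functional-encryption construction is not reproduced here, but the role of Euclidean division in multi-precision arithmetic can be illustrated: a high-precision value is split into small limbs, each processable independently (e.g. under a low-precision encryption scheme), and later reassembled. The base and scale below are illustrative assumptions:

```python
def to_limbs(x, base, count):
    """Split a non-negative integer into `count` base-`base` limbs
    (least-significant first) via repeated Euclidean division."""
    limbs = []
    for _ in range(count):
        x, r = divmod(x, base)   # quotient carries on; remainder is the limb
        limbs.append(r)
    return limbs

def from_limbs(limbs, base):
    """Reassemble the integer from its limbs."""
    return sum(d * base ** i for i, d in enumerate(limbs))

# encode 3.14159 as a scaled integer, split into 16-bit limbs, reassemble
scale = 10 ** 5
value = round(3.14159 * scale)        # 314159
limbs = to_limbs(value, 2 ** 16, 2)   # each limb fits in 16 bits
restored = from_limbs(limbs, 2 ** 16) / scale
```

Each limb stays below the chosen base, so a computation with limited numeric precision can handle the limbs separately without overflow, which is the intuition behind the scheme's accuracy gains.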
With the accelerating intelligent transformation of the energy system, the monitoring of equipment operation status and the optimization of the production process in thermal power plants face the challenge of multi-source heterogeneous data integration. In view of the heterogeneous characteristics of physical sensor data, including the temperature, vibration, and pressure data generated by boilers, steam turbines, and other key equipment, together with the real-time operating-condition data of the SCADA system, this paper proposes a multi-source heterogeneous data fusion and analysis platform for thermal power plants based on edge computing and deep learning. By constructing a multi-level fusion architecture, the platform adopts a dynamic weight allocation strategy and a 5D digital twin model to realize the collaborative analysis of physical sensor data, simulation results, and expert knowledge. The data fusion module combines Kalman filtering, the wavelet transform, and Bayesian estimation to solve the problems of time-series alignment and dimensional differences. Simulation results show that the data fusion accuracy can be improved to more than 98%, and the computation delay can be kept within 500 ms. The data analysis module integrates a Dymola simulation model and the AERMOD pollutant diffusion model and supports the cascaded analysis of boiler combustion efficiency prediction and flue gas emission monitoring; the system response time is less than 2 s, and the data consistency verification accuracy reaches 99.5%.
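The fusion pipeline above is described only at a high level; a minimal scalar Kalman measurement update, folding noisy redundant sensor readings into a running estimate, illustrates the filtering step (the temperature values and noise variances are illustrative assumptions):

```python
def kalman_update(x, p, z, r):
    """One scalar Kalman measurement update.
    x, p: current estimate and its variance; z, r: reading and its variance."""
    k = p / (p + r)        # Kalman gain: trust the reading in proportion
    x = x + k * (z - x)    # move the estimate toward the reading
    p = (1.0 - k) * p      # the fused estimate is more certain than either input
    return x, p

# fuse two redundant temperature readings with different noise levels
x, p = 540.0, 4.0          # prior: 540 degC with variance 4
for z, r in [(541.2, 1.0), (539.5, 2.0)]:
    x, p = kalman_update(x, p, z, r)
```

Note how the second, noisier reading (variance 2.0) pulls the estimate less than the first, and the posterior variance `p` shrinks after every update, the property that lets such a filter align asynchronous multi-source streams.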
The power Internet of Things (IoT) is a significant trend in technology and a requirement for national strategic development. With the deepening digital transformation of the power grid, China's power system has initially built a power IoT architecture comprising perception, network, and platform application layers. However, owing to the structural complexity of the power system, the construction of the power IoT continues to face problems such as complex access management of massive heterogeneous equipment, diverse IoT protocol access methods, high concurrency of network communications, and weak data security protection. To address these issues, this study optimizes the existing architecture of the power IoT and designs an integrated management framework for the access of multi-source heterogeneous data in the power IoT, comprising cloud, pipe, edge, and terminal parts. It further reviews and analyzes the key technologies involved in the power IoT, such as unified management of the physical model, highly concurrent access, multi-protocol access, multi-source heterogeneous data storage management, and data security control, to provide a more flexible, efficient, secure, and easy-to-use solution for multi-source heterogeneous data access in the power IoT.
Rock mass quality serves as a vital index for predicting the stability and safety status of rock tunnel faces. In tunneling practice, rock mass quality is often assessed via a combination of qualitative and quantitative parameters. However, due to harsh on-site construction conditions, it is rather difficult to obtain some of the evaluation parameters that are essential for rock mass quality prediction. In this study, a novel improved Swin Transformer is proposed to detect, segment, and quantify rock mass characteristic parameters such as water leakage, fractures, and weak interlayers. Site experiment results demonstrate that the improved Swin Transformer achieves the best segmentation results, with accuracies of 92%, 81%, and 86% for water leakage, fractures, and weak interlayers, respectively. A multi-source rock tunnel face characteristic (RTFC) dataset including 11 parameters for predicting rock mass quality is established. To address the limited predictive performance caused by the incomplete evaluation parameters in this dataset, a novel tree-augmented naive Bayesian network (BN) is proposed, which achieved a prediction accuracy of 88%. In comparison with other commonly used machine learning models, the proposed BN-based approach showed improved performance in predicting rock mass quality with the incomplete dataset. Using the established BN, a further sensitivity analysis was conducted to quantitatively evaluate the importance of the various parameters; the results indicate that the rock strength and fracture parameters exert the most significant influence on rock mass quality.
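The paper's tree-augmented naive Bayesian network is not reproduced here; as a simplified sketch of the underlying idea, a Bayesian classifier that copes with incomplete inputs by marginalizing out missing evidence rather than imputing it, consider this Gaussian naive Bayes with hypothetical class parameters:

```python
import math

def gauss_logpdf(x, mu, var):
    """Log-density of a Gaussian with mean mu and variance var."""
    return -0.5 * (math.log(2 * math.pi * var) + (x - mu) ** 2 / var)

def classify(sample, priors, params):
    """Pick the class with the highest posterior; features given as None
    are simply skipped, i.e. marginalized out of the likelihood."""
    best_cls, best_lp = None, -math.inf
    for cls, prior in priors.items():
        lp = math.log(prior)
        for j, xj in enumerate(sample):
            if xj is None:          # missing evidence contributes nothing
                continue
            mu, var = params[cls][j]
            lp += gauss_logpdf(xj, mu, var)
        if lp > best_lp:
            best_cls, best_lp = cls, lp
    return best_cls

# hypothetical two-feature model: (rock strength, fracture density), standardized
priors = {"good": 0.5, "poor": 0.5}
params = {"good": [(1.0, 1.0), (-1.0, 1.0)],
          "poor": [(-1.0, 1.0), (1.0, 1.0)]}
label = classify((0.9, None), priors, params)   # fracture density unobserved
```

Even with one of the two evaluation parameters missing, the classifier still yields a prediction from whatever evidence is present, which is the practical advantage the BN offers over models that require complete inputs.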
With the increasing frequency of floods, in-depth flood event analyses are essential for effective disaster relief and prevention. Satellite-based flood event datasets have become the primary data source for flood event analyses, replacing scarce disaster maps, owing to their greater availability. Nevertheless, despite the vast amount of available remote sensing imagery, existing flood event datasets still pose significant challenges for flood event analyses because of the uneven geographical distribution of data, the scarcity of time-series data, and the limited availability of flood-related semantic information. Deep learning models have been increasingly adopted for flood event analyses, but some existing flood datasets do not align well with model training, and distinguishing flooded areas has proven difficult with limited data modalities and semantic information. Moreover, the efficient retrieval and pre-screening of flood-related imagery from vast satellite archives pose notable obstacles, particularly in large-scale analyses. To address these issues, we propose a Multimodal Flood Event Dataset (MFED) for deep-learning-based flood event analyses and data retrieval. It consists of 18 years of multi-source remote sensing imagery and heterogeneous textual information covering flood-prone areas worldwide. Incorporating optical and radar imagery exploits the correlation and complementarity between the two image modalities to capture pixel features in flood imagery. Notably, the text-modality data, including auxiliary hydrological information extracted from the Global Flood Database and text refined from online news records, offer a semantic supplement to the images for flood event retrieval and analysis. To verify the applicability of the MFED in deep learning models, we carried out experiments with different models using a single modality and different combinations of modalities, which fully verified the effectiveness of the dataset. Furthermore, we also verified the efficiency of the MFED in comparative experiments with existing multimodal datasets and diverse neural network structures.
The sparsity of ground gauges poses a significant challenge for evaluating and merging satellite-based and reanalysis-based precipitation datasets in lake regions. While the standard triple collocation (TC) method offers a solution that requires no ground-based observations, it fails to address rain/no-rain classification, and its suitability for assessing and merging lake precipitation has not been explored. This study combines categorical triple collocation (CTC) with standard TC to create an integrated framework (CTC-TC) tailored to evaluating and merging global gridded precipitation products (GPPs). We assess the efficacy of CTC-TC using six GPPs (ERA5-Land, SM2RAIN-ASCAT, IMERG-Early, IMERG-Late, GSMaP-MVK, and PERSIANN-CCS) across the five largest freshwater lakes in China. CTC-TC effectively captures the spatial patterns of the metrics for all GPPs and precisely estimates the correlation coefficient and root mean square error for the satellite-based datasets apart from SM2RAIN-ASCAT, but it overestimates the classification accuracy indicator V for all GPPs. Regarding multi-source fusion, CTC-TC leverages the strengths of the individual products in each triplet, yielding significant improvements in the critical success index (CSI) of over 11.9% and in the modified Kling-Gupta efficiency (KGE') of more than 13.3%. Compared with baseline models, including standard TC, simple model averaging, one-outlier removal, and Bayesian model averaging, CTC-TC achieves gains in CSI and KGE' of no less than 24.7% and 3.6%, respectively. In conclusion, the CTC-TC framework offers a thorough evaluation and efficient fusion of GPPs, addressing both categorical and continuous accuracy in data-scarce regions such as lakes.
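Standard TC estimates each product's error variance from the pairwise covariances of three collocated datasets with mutually independent errors, with no reference truth required. A minimal sketch on synthetic data (the simulated noise levels are illustrative assumptions):

```python
import numpy as np

def tc_error_variances(x1, x2, x3):
    """Standard triple collocation: the error variance of product i is its
    total variance minus the truth variance inferred from the
    cross-covariances of the other two products."""
    C = np.cov(np.vstack([x1, x2, x3]))
    v1 = C[0, 0] - C[0, 1] * C[0, 2] / C[1, 2]
    v2 = C[1, 1] - C[0, 1] * C[1, 2] / C[0, 2]
    v3 = C[2, 2] - C[0, 2] * C[1, 2] / C[0, 1]
    return v1, v2, v3

rng = np.random.default_rng(1)
truth = rng.gamma(2.0, 2.0, size=200_000)        # synthetic "precipitation"
x1 = truth + rng.normal(0, 0.5, truth.size)      # true error variances:
x2 = truth + rng.normal(0, 1.0, truth.size)      # 0.25, 1.0, 2.25
x3 = truth + rng.normal(0, 1.5, truth.size)
v1, v2, v3 = tc_error_variances(x1, x2, x3)
```

With enough collocated samples the three estimates converge to the true error variances; the CTC extension adds the categorical (rain/no-rain) counterpart that standard TC lacks.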
To address the underutilization of Chinese research materials in nonferrous metals, a method for constructing a domain knowledge graph of nonferrous metals (DNMKG) was established. Starting from a domain thesaurus, entities and relationships were mapped to resource description framework (RDF) triples to form the graph's skeleton. Properties and related entities were then extracted from open knowledge bases, enriching the graph. A large-scale, multi-source heterogeneous corpus of over 1×10^(9) words was compiled from recent literature to further expand the DNMKG. Using the knowledge graph as prior knowledge, natural language processing techniques were applied to the corpus to generate word vectors. A novel entity evaluation algorithm was used to identify and extract genuine domain entities, which were added to the DNMKG. A prototype system was developed to visualize the knowledge graph and support human-computer interaction. Results demonstrate that the DNMKG can enhance knowledge discovery and improve research efficiency in the nonferrous metals field.
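An RDF graph is simply a set of subject-predicate-object triples; a minimal in-memory sketch shows how such a skeleton supports queries (the entity and predicate names are hypothetical, not taken from the DNMKG):

```python
# a tiny triple store: each fact is a (subject, predicate, object) tuple
triples = {
    ("dnm:Copper",  "rdf:type",        "dnm:NonferrousMetal"),
    ("dnm:Bauxite", "rdf:type",        "dnm:Ore"),
    ("dnm:Bauxite", "dnm:oreOf",       "dnm:Aluminium"),
    ("dnm:Copper",  "dnm:hasProperty", "dnm:ElectricalConductivity"),
}

def objects(subject, predicate):
    """Query: all objects linked to `subject` by `predicate`."""
    return {o for s, p, o in triples if s == subject and p == predicate}

# pattern matching over triples, the basic operation behind graph queries
metals = {s for s, p, o in triples
          if p == "rdf:type" and o == "dnm:NonferrousMetal"}
```

Enriching the graph, as the abstract describes, amounts to adding further triples from open knowledge bases and from entities mined out of the corpus; the query patterns stay the same.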
As a product of the deep integration between next-generation information technology and industrial systems, digital twin technology has demonstrated significant advantages in real-time monitoring, predictive maintenance, and optimized decision-making for thermal power plants. To address challenges such as low equipment efficiency, high maintenance costs, and difficulties in safety risk management in traditional thermal power plants, this study developed a digital twin simulation system covering the entire lifecycle of power generation units. The system achieves real-time collection and processing of critical parameters such as temperature, pressure, and flow rate through a collaborative architecture integrating multi-source heterogeneous sensor networks with programmable logic controllers (PLCs). A three-tier processing framework handles data preprocessing, feature extraction, and intelligent analysis, while a hybrid storage system combining time-series and relational databases enables millisecond-level queries and data traceability. The simulation model development module employs a modular design methodology, integrating multi-physics coupling algorithms including computational fluid dynamics (CFD) and thermal circulation equations. Automated parameter calibration is achieved through intelligent optimization algorithms, with model accuracy validated via unit-level verification, system-level cascaded debugging tests, and virtual test platform simulations. Following a modular layout strategy, the user interface and interaction module integrates a 3D plant panoramic view, dynamic equipment models, and multi-mode interaction channels; supports cross-terminal adaptation to PC, mobile, and control-screen clients; and improves fault-handling efficiency through an AR-assisted diagnosis function.
The rapid urbanization and structural imbalances in Chinese megacities have exacerbated the housing supply-demand mismatch, creating an urgent need for fine-scale diagnostic tools. This study addresses this critical gap by developing the Housing Contradiction Evaluation Weighted Index (HCEWI) model, making three key contributions to high-resolution housing monitoring. First, we establish a tripartite theoretical framework integrating dynamic population pressure (PPI), housing supply potential (HSI), and functional diversity (HHI). The PPI innovatively combines mobile signaling data with principal component analysis to capture real-time commuting patterns, while the HSI introduces a novel dual-criteria system based on Local Climate Zones (LCZ), weighted by building density and residential function ratio. Second, we develop a spatiotemporal coupling architecture featuring an entropy-weighted dynamic integration mechanism with self-correcting modules, demonstrating robust performance against data noise. Third, our 25-month longitudinal analysis of Shenzhen reveals significant findings, including persistent bipolar clustering patterns, contrasting volatility between peripheral and core areas, and seasonal policy responsiveness. Methodologically, we advance urban diagnostics through 500-meter-grid monthly monitoring and process-oriented temporal operators that reveal "tentacle-like" spatial restructuring along transit corridors. Our findings provide a replicable framework for precision housing governance and demonstrate the transformative potential of mobile signaling data in implementing China's "city-specific policy" approach. We further propose targeted intervention strategies, including balance regulation for high-contradiction zones, Transit-Oriented Development (TOD) activation for low-contradiction clusters, and dynamic land conversion mechanisms for transitional areas.
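The entropy-weighted integration mentioned above rests on the standard entropy weight method, which assigns larger weights to indicators whose values vary more across spatial units and thus carry more information. A minimal sketch on a toy indicator matrix (the numbers are illustrative, not HCEWI data):

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: rows are spatial units, columns are positive
    indicators; returns one weight per indicator, summing to one."""
    P = X / X.sum(axis=0)                       # column-wise share
    n = X.shape[0]
    logP = np.log(P, where=P > 0, out=np.zeros_like(P))
    E = -(P * logP).sum(axis=0) / np.log(n)     # entropy, normalized to [0, 1]
    d = 1.0 - E                                 # degree of diversification
    return d / d.sum()

# indicator 1 is identical everywhere (no information);
# indicator 2 varies strongly, so it should receive nearly all the weight
X = np.array([[1.0, 1.0],
              [1.0, 9.0]])
w = entropy_weights(X)
```

A uniform column has maximum entropy and hence zero diversification, so its weight vanishes; this is what makes the integration mechanism self-adjusting as the monthly indicator distributions change.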
Deep learning algorithms show good prospects for remote sensing flood monitoring, but they mostly rely on huge amounts of labeled data, and there is a lack of labeled data available for actual needs. In this paper, we propose a high-resolution multi-source remote sensing dataset for flood area extraction: GF-FloodNet. GF-FloodNet contains 13,388 samples from Gaofen-3 (GF-3) and Gaofen-2 (GF-2) images. We use a multi-level sample selection and interactive annotation strategy based on active learning to construct it. Compared with other flood-related datasets, GF-FloodNet not only has a spatial resolution of up to 1.5 m and provides pixel-level labels but also consists of multi-source remote sensing data. We thoroughly validate and evaluate the dataset using several deep learning models, including quantitative analysis, qualitative analysis, and validation on large-scale remote sensing data in real scenes. Experimental results reveal that GF-FloodNet has significant advantages owing to its multi-source data and can support the training of different deep learning models for flood area extraction. There appears to be an optimal boundary for model training in any deep learning dataset; for GF-FloodNet, it is close to 4824 samples. We provide GF-FloodNet at https://www.kaggle.com/datasets/pengliuair/gf-floodnet and https://pan.baidu.com/s/1vdUCGNAfFwG5UjZ9RLLFMQ?pwd=8v6o.
Funding (compressive-sensing signal reconstruction for the power IoT): supported by the National Natural Science Foundation of China (12174350) and the Science and Technology Project of State Grid Henan Electric Power Company (5217Q0240008).
Funding (outsourced neural network training): supported by the Natural Science Foundation of China (Nos. 62303126 and 62362008), the Major Scientific and Technological Special Project of Guizhou Province ([2024]014), the Guizhou Provincial Science and Technology Projects (No. ZK[2022]General 149), the Open Project of the Key Laboratory of Computing Power Network and Information Security, Ministry of Education (Grant 2023ZD037), and the Open Research Project of the State Key Laboratory of Industrial Control Technology, Zhejiang University (No. ICT2024B25).
Funding (power IoT data access framework): supported by the National Key Research and Development Program of China (grant number 2019YFE0123600).
Funding (rock mass quality prediction): supported by the National Natural Science Foundation of China (Nos. 52279107 and 52379106), Qingdao Guoxin Jiaozhou Bay Second Submarine Tunnel Co., Ltd., the Academician and Expert Workstation of Yunnan Province (No. 202205AF150015), and the Science and Technology Innovation Project of YCIC Group Co., Ltd. (No. YCIC-YF-2022-15).
Funding (Multimodal Flood Event Dataset): supported by the National Natural Science Foundation of China [Grant No. 42071413] and the GHfund C [Grant No. 202302039381].
Funding (CTC-TC precipitation framework): supported by the National Key R&D Program of China (No. 2022YFC3202802), the National Natural Science Foundation of China (Nos. 52009081, 52121006, and 52279071), and the Special Funded Project for Basic Scientific Research Operation Expenses of the Central Public Welfare Scientific Research Institutes of China (No. Y524017).
Abstract: As a product of the deep integration between next-generation information technology and industrial systems, digital twin technology has demonstrated significant advantages in real-time monitoring, predictive maintenance, and optimization decision-making for thermal power plants. To address challenges such as low equipment efficiency, high maintenance costs, and difficulties in safety risk management in traditional thermal power plants, this study developed a digital twin simulation system covering the entire lifecycle of power generation units. The system achieves real-time collection and processing of critical parameters such as temperature, pressure, and flow rate through a collaborative architecture integrating multi-source heterogeneous sensor networks with Programmable Logic Controllers (PLCs). A three-tier processing framework handles data preprocessing, feature extraction, and intelligent analysis, while a hybrid storage system combining time-series and relational databases enables millisecond-level queries and data traceability. The simulation model development module employs a modular design methodology, integrating multi-physics coupling algorithms including computational fluid dynamics (CFD) and thermal circulation equations. Automated parameter calibration is achieved through intelligent optimization algorithms, with model accuracy validated via unit-level verification, system-level cascaded debugging tests, and virtual test platform simulations. Based on a modular layout strategy, the user interface and interaction module integrates a 3D plant panoramic view, dynamic equipment models, and multi-mode interaction channels; it supports cross-terminal adaptation for PCs, mobile devices, and control-room screens, and improves fault-handling efficiency through an AR-assisted diagnosis function.
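The three-tier processing framework described above (preprocessing, feature extraction, intelligent analysis) can be sketched as a simple pipeline. The sensor record fields, validity ranges, and alarm limits are illustrative assumptions, not values from the paper:

```python
from statistics import mean

def preprocess(samples):
    """Tier 1: drop readings outside each sensor's plausible range."""
    return [s for s in samples if s["lo"] <= s["value"] <= s["hi"]]

def extract_features(samples):
    """Tier 2: aggregate per-parameter statistics over a time window."""
    by_param = {}
    for s in samples:
        by_param.setdefault(s["param"], []).append(s["value"])
    return {p: {"mean": mean(v), "max": max(v)} for p, v in by_param.items()}

def analyze(features, alarm_limits):
    """Tier 3: flag parameters whose window maximum exceeds its alarm limit."""
    return [p for p, f in features.items()
            if f["max"] > alarm_limits.get(p, float("inf"))]

# Illustrative readings: main steam temperature (C) and pressure (MPa).
samples = [
    {"param": "steam_temp", "value": 540.0, "lo": 0.0, "hi": 700.0},
    {"param": "steam_temp", "value": 9999.0, "lo": 0.0, "hi": 700.0},  # glitch
    {"param": "steam_press", "value": 16.0, "lo": 0.0, "hi": 30.0},
]
clean = preprocess(samples)            # glitch removed
alarms = analyze(extract_features(clean), {"steam_temp": 520.0})
```

In the real system each tier would run continuously against the PLC-fed sensor streams, with results written to the hybrid time-series/relational store.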
Funding: National Natural Science Foundation of China (No. 42101346); Undergraduate Training Programs for Innovation and Entrepreneurship of Wuhan University (GeoAI Special Project) (No. 202510486196).
Abstract: The rapid urbanization and structural imbalances in Chinese megacities have exacerbated the housing supply-demand mismatch, creating an urgent need for fine-scale diagnostic tools. This study addresses this critical gap by developing the Housing Contradiction Evaluation Weighted Index (HCEWI) model, making three key contributions to high-resolution housing monitoring. First, we establish a tripartite theoretical framework integrating dynamic population pressure (PPI), housing supply potential (HSI), and functional diversity (HHI). The PPI innovatively combines mobile signaling data with principal component analysis to capture real-time commuting patterns, while the HSI introduces a novel dual-criteria system based on Local Climate Zones (LCZ), weighted by building density and residential function ratio. Second, we develop a spatiotemporal coupling architecture featuring an entropy-weighted dynamic integration mechanism with self-correcting modules, demonstrating robust performance against data noise. Third, our 25-month longitudinal analysis in Shenzhen reveals significant findings, including persistent bipolar clustering patterns, contrasting volatility between peripheral and core areas, and seasonal policy responsiveness. Methodologically, we advance urban diagnostics through 500-meter-grid monthly monitoring and process-oriented temporal operators that reveal "tentacle-like" spatial restructuring along transit corridors. Our findings provide a replicable framework for precision housing governance and demonstrate the transformative potential of mobile signaling data in implementing China's "city-specific policy" approach. We further propose targeted intervention strategies, including balance regulation for high-contradiction zones, Transit-Oriented Development (TOD) activation for low-contradiction clusters, and dynamic land-conversion mechanisms for transitional areas.
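The entropy-weighted integration step above follows the standard entropy weight method: indicators that vary more across grid cells carry more information and receive larger weights. A minimal sketch (the toy indicator matrix is an assumption for illustration; the paper's actual PPI/HSI/HHI inputs and self-correcting modules are not reproduced here):

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method for an (n_cells, n_indicators) matrix of
    positive indicator values: lower-entropy (more varied) columns get
    larger weights; weights sum to 1."""
    P = X / X.sum(axis=0, keepdims=True)            # column-wise proportions
    with np.errstate(divide="ignore", invalid="ignore"):
        logP = np.where(P > 0, np.log(P), 0.0)      # 0*log(0) treated as 0
    e = -(P * logP).sum(axis=0) / np.log(X.shape[0])  # entropy per indicator
    d = 1.0 - e                                      # degree of divergence
    return d / d.sum()

def composite_index(X):
    """Weighted sum of indicators per grid cell (HCEWI-style aggregation)."""
    return X @ entropy_weights(X)

# Toy example: 3 grid cells, 2 indicators; the first column is constant.
X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
w = entropy_weights(X)       # constant column gets (near-)zero weight
scores = composite_index(X)  # driven entirely by the varying indicator
```

A constant indicator carries no discriminating information across cells, so its entropy is maximal and its weight collapses to zero, which is exactly the property that makes the method robust for multi-indicator fusion.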
Funding: Supported by the National Natural Science Foundation of China under Grant Nos. U2243222, 42071413, and 41971397.
Abstract: Deep learning algorithms show good prospects for remote sensing flood monitoring, but they mostly rely on huge amounts of labeled data, and labeled data are scarce in practice. In this paper, we propose a high-resolution multi-source remote sensing dataset for flood area extraction: GF-FloodNet. GF-FloodNet contains 13388 samples from Gaofen-3 (GF-3) and Gaofen-2 (GF-2) images, constructed using a multi-level sample selection and interactive annotation strategy based on active learning. Compared with other flood-related datasets, GF-FloodNet not only has a spatial resolution of up to 1.5 m and provides pixel-level labels, but also consists of multi-source remote sensing data. We thoroughly validate and evaluate the dataset with several deep learning models, including quantitative analysis, qualitative analysis, and validation on large-scale remote sensing data in real scenes. Experimental results reveal that GF-FloodNet gains significant advantages from its multi-source data and can support the training of different deep learning models for flood area extraction. Any deep learning dataset appears to have a potential optimal boundary for model training; in GF-FloodNet, this boundary is close to 4824 samples. We provide GF-FloodNet at https://www.kaggle.com/datasets/pengliuair/gf-floodnet and https://pan.baidu.com/s/1vdUCGNAfFwG5UjZ9RLLFMQ?pwd=8v6o.