Over the last few years, the Internet of Things (IoT) has become an omnipresent term. The IoT expands the existing common concepts of anytime and anyplace to connectivity for anything. The proliferation of IoT offers opportunities but may also bear risks. A hitherto neglected aspect is the possible increase in power consumption, as smart devices in IoT applications are expected to be reachable by other devices at all times. This implies that a device consumes electrical energy even when it is not in use for its primary function. Many research communities have started addressing the storage ability, such as the cache memory of smart devices, using the concept of Named Data Networking (NDN) to achieve a more energy-efficient communication model. In NDN, memory or buffer overflow is a common challenge, especially when the internal memory of a node exceeds its limit and data with the highest degree of freshness cannot be accommodated, so that the entire scenario behaves like a traditional network. In such cases, data caching is not performed by intermediate nodes to guarantee the highest degree of freshness. For the periodic updates sent from data producers, data consumers must receive up-to-date information at the least energy cost. Consequently, there is a challenge in maintaining the tradeoff between freshness and energy consumption during publisher-subscriber interaction. In our work, we propose an architecture that addresses the cache strategy issue with a smart caching algorithm for improved memory management and data freshness. The smart caching strategy updates the data at precise intervals while taking garbage data into consideration. It is also observed from experiments that data redundancy can be reduced by dropping data packets carrying information that is not of interest to other participating nodes in the network, ultimately optimizing the tradeoff between freshness and required energy.
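The freshness-versus-energy tradeoff described above can be illustrated with a minimal freshness-aware cache, in which stale entries are dropped before any fresh entry is evicted. This is only a sketch: the class name, capacity handling, and eviction policy are invented for illustration and are not the paper's Smart Caching Algorithm.

```python
import time
from collections import OrderedDict

class FreshnessCache:
    """Toy freshness-aware cache: stale entries are purged first, and
    only then is the least-recently-used fresh entry evicted."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()  # name -> (data, expiry_time)

    def put(self, name, data, freshness_s, now=None):
        now = time.monotonic() if now is None else now
        self._purge_stale(now)
        if name in self.store:
            self.store.pop(name)
        elif len(self.store) >= self.capacity:
            self.store.popitem(last=False)  # evict the LRU fresh entry
        self.store[name] = (data, now + freshness_s)

    def get(self, name, now=None):
        now = time.monotonic() if now is None else now
        entry = self.store.get(name)
        if entry is None or entry[1] <= now:  # missing or stale
            self.store.pop(name, None)
            return None
        self.store.move_to_end(name)          # mark as recently used
        return entry[0]

    def _purge_stale(self, now):
        for key in [k for k, (_, exp) in self.store.items() if exp <= now]:
            self.store.pop(key)
```

Intermediate NDN nodes could apply the same policy so that only content still within its freshness window is served from cache.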
The traditional threat score based on fixed thresholds for precipitation verification is sensitive to intensity forecast bias. In this study, the neighborhood precipitation threat score is modified by defining the thresholds in terms of the percentiles of overall precipitation instead of fixed threshold values, reducing the impact of intensity forecast bias on the calculated threat score. The method is tested with the forecasts of a tropical storm that re-intensified after making landfall and caused heavy flooding. The forecasts are produced with and without radar data assimilation. The forecast with assimilation of both radial velocity and reflectivity produces precipitation patterns that better match observations but have a large positive intensity bias. When fixed thresholds are used, the neighborhood threat scores fail to yield high scores for forecasts that match observed patterns well, due to the large intensity bias. In contrast, the percentile-based neighborhood method yields the highest score for the forecast with the best pattern match and the smallest position error. The percentile-based method also yields scores that are more consistent with object-based verification, which is less sensitive to intensity bias, demonstrating the potential value of percentile-based verification.
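The percentile idea can be sketched as follows: each field is binarized at its own q-th percentile before scoring, so a uniformly biased forecast is not penalized. The helper names and the simple (non-neighborhood) threat score are illustrative assumptions, not the paper's verification code.

```python
def percentile(values, q):
    """Linear-interpolation percentile, q in [0, 100], stdlib only."""
    s = sorted(values)
    if len(s) == 1:
        return s[0]
    pos = q / 100 * (len(s) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (pos - lo)

def threat_score(fcst, obs, q):
    """Binarize forecast and observation at their own q-th percentiles,
    then TS = hits / (hits + misses + false alarms)."""
    tf, to = percentile(fcst, q), percentile(obs, q)
    hits = misses = false_alarms = 0
    for f, o in zip(fcst, obs):
        f_event, o_event = f >= tf, o >= to
        if f_event and o_event:
            hits += 1
        elif o_event:
            misses += 1
        elif f_event:
            false_alarms += 1
    denom = hits + misses + false_alarms
    return hits / denom if denom else float("nan")
```

Because each field supplies its own threshold, multiplying the forecast by a constant bias factor leaves the event pattern, and hence the score, unchanged.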
Funding: supported by the Hainan Provincial Natural Science Foundation of China (Nos. 622RC617 and 624RC485) and the Open Foundation of the State Key Laboratory of Networking and Switching Technology (Beijing University of Posts and Telecommunications) (SKLNST-2023-1-07).
Abstract: The advent of the digital age has consistently provided impetus for facilitating global trade, as evidenced by the numerous customs clearance documents and participants involved in the international trade process, including enterprises, agents, and government departments. However, the urgent issue that requires immediate attention is how to achieve secure and efficient cross-border data sharing among these government departments and enterprises in complex trade processes. To address this need, this paper proposes a data exchange architecture employing Multi-Authority Attribute-Based Encryption (MA-ABE) in combination with blockchain technology. This scheme supports proxy decryption, attribute revocation, and policy update, while allowing each participating entity to manage its keys autonomously, ensuring system security and enhancing trust among participants. To enhance system decentralization, the architecture includes a mechanism in which multiple institutions interact with smart contracts and jointly participate in the generation of public parameters. Integration with the multi-party process execution engine Caterpillar boosts the transparency of cross-border information flow and the cooperation between different organizations. The scheme ensures the auditability of data access control information and the visualization of on-chain data sharing. The MA-ABE scheme is statically secure under the q-Decisional Parallel Bilinear Diffie-Hellman Exponent (q-DPBDHE2) assumption in the random oracle model, and can resist ciphertext rollback attacks to achieve true backward and forward security. Theoretical analysis and experimental results demonstrate the appropriateness of the scheme for cross-border data collaboration between different institutions.
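The cryptographic construction itself is beyond a short sketch, but one small, standard ingredient of attribute-based access control, checking whether a user's attribute set satisfies an AND/OR access policy, can be illustrated. The policy encoding and attribute names below are hypothetical, not the paper's scheme.

```python
# Policies are nested tuples: ("AND"/"OR", child, child, ...),
# where leaves are attribute strings, possibly issued by different
# authorities (the "authority:attribute" naming is an invented convention).
def satisfies(policy, attributes):
    """Return True if the attribute set satisfies the AND/OR policy tree."""
    if isinstance(policy, str):          # leaf: a single attribute
        return policy in attributes
    op, *children = policy
    results = (satisfies(c, attributes) for c in children)
    return all(results) if op == "AND" else any(results)
```

In an MA-ABE setting, decryption only succeeds when this check would pass; the check itself is enforced cryptographically rather than by code like this.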
Funding: supported by the Beijing Natural Science Foundation (Grant 2232066) and the Open Project Foundation of the State Key Laboratory of Solid Lubrication (Grant LSL-2212).
Abstract: The composition of base oils affects the performance of lubricants made from them. This paper proposes a hybrid model based on gradient-boosted decision trees (GBDT) to analyze the effect of different ratios of the KN4010, PAO40, and PriEco3000 components in a composite base oil system on the performance of lubricants. The study was conducted under small laboratory sample conditions, and a data expansion method using the Gaussian copula function was proposed to improve the prediction ability of the hybrid model. The study also compared four optimization algorithms, the slime mould algorithm (SMA), genetic algorithm (GA), whale optimization algorithm (WOA), and seagull optimization algorithm (SOA), to predict the kinematic viscosity at 40℃, kinematic viscosity at 100℃, viscosity index, and oxidation induction time of the lubricant. The results showed that the Gaussian copula data expansion method improved the prediction ability of the hybrid model in the case of small samples. The SOA-GBDT hybrid model had the fastest convergence speed and the best prediction performance, with coefficients of determination (R^2) for the four lubricant indicators reaching 0.98, 0.99, 0.96, and 0.96, respectively. Thus, this model can significantly reduce prediction error and has good predictive ability.
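A Gaussian-copula data expansion for a small two-variable sample can be sketched with the standard library alone: estimate the dependence between the variables in normal-score space, draw correlated normal samples, and map them back through empirical quantiles. The marginal handling via empirical quantiles is an assumption for illustration, not necessarily the paper's exact procedure.

```python
import math
import random
from statistics import NormalDist

def gaussian_copula_expand(x, y, n_new, seed=0):
    """Generate n_new synthetic (x, y) pairs that preserve the rank
    dependence of the small input sample (two variables, for brevity)."""
    nd = NormalDist()
    rng = random.Random(seed)

    def z_scores(v):
        # rank-based normal scores: z_i = Phi^{-1}((rank_i + 0.5) / n)
        order = sorted(range(len(v)), key=lambda i: v[i])
        z = [0.0] * len(v)
        for rank, i in enumerate(order):
            z[i] = nd.inv_cdf((rank + 0.5) / len(v))
        return z

    def quantile(v, u):
        # empirical inverse CDF with linear interpolation
        s = sorted(v)
        pos = u * (len(s) - 1)
        lo = int(pos)
        hi = min(lo + 1, len(s) - 1)
        return s[lo] + (s[hi] - s[lo]) * (pos - lo)

    zx, zy = z_scores(x), z_scores(y)
    r = (sum(a * b for a, b in zip(zx, zy))
         / math.sqrt(sum(a * a for a in zx) * sum(b * b for b in zy)))

    samples = []
    for _ in range(n_new):
        z1 = rng.gauss(0.0, 1.0)
        z2 = r * z1 + math.sqrt(max(0.0, 1.0 - r * r)) * rng.gauss(0.0, 1.0)
        samples.append((quantile(x, nd.cdf(z1)), quantile(y, nd.cdf(z2))))
    return samples
```

Synthetic pairs stay within the observed range of each variable while reproducing the correlation structure, which is what makes the expanded set usable as extra training data for the hybrid model.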
Funding: supported by the Radioactive Waste Management grant of the Korea Institute of Energy Technology Evaluation and Planning, funded by the Korea government Ministry of Knowledge (20193210100130), and by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. 202008980000).
Abstract: An autoregressive long short-term memory (AR-LSTM) model was applied to develop a real-time probabilistic slope stability estimation model for the engineered barrier system (EBS) of a near-surface radioactive waste disposal facility. The effectiveness of the developed model was verified using actual data acquired from South Korea, including precipitation, soil moisture content, and inclinometer time-series data. The precipitation and the factor of safety (FS) ensemble results were used as the input and output variables of the AR-LSTM model, respectively, where the FS ensemble results were calculated by the Taylor model, integrating the Mualem-van Genuchten soil water retention model with consideration of the multivariate statistics of the hydrophysical properties of the soil. The estimation accuracy of the AR-LSTM model was reasonable, showing a high correlation coefficient (0.9468) and a low root mean squared error (0.0070) between the actual and estimated FS values. Moreover, a significant correlation was observed between the estimated FS ensemble results and displacement events recorded by the inclinometer sensor. All the results suggest the effectiveness of the developed model for the long-term integrity assurance of the EBS.
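The van Genuchten retention model referenced above has a standard closed form, sketched below; the parameter values used are hypothetical, not the study's fitted values.

```python
def van_genuchten_theta(h, theta_r, theta_s, alpha, n):
    """van Genuchten (1980) soil water retention curve:
    theta(h) = theta_r + (theta_s - theta_r) / (1 + (alpha*|h|)^n)^m,
    with m = 1 - 1/n; h is the pressure head (h >= 0 means saturation)."""
    if h >= 0:
        return theta_s
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) / (1.0 + (alpha * abs(h)) ** n) ** m
```

As the soil dries (h more negative), the water content decreases monotonically from theta_s toward the residual theta_r, which is the behavior the FS ensemble calculation depends on.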
Funding: supported by the National Natural Science Foundation of China (Grant Nos. 42172318 and 42377186) and the National Key R&D Program of China (Grant No. 2023YFC3007201).
Abstract: The rise in construction activities within mountainous regions has significantly increased the frequency of rockfalls. Statistical models for rockfall hazard assessment often struggle to achieve high precision on a large scale. This limitation arises primarily from the scarcity of historical rockfall data and the inadequacy of conventional assessment indicators in capturing the physical and structural characteristics of rockfalls. This study proposes a physically based deterministic model designed to accurately quantify rockfall hazards at a large scale. The model accounts for multiple rockfall failure modes and incorporates the key physical and structural parameters of the rock mass. Rockfall hazard is defined as the product of three factors: the rockfall failure probability, the probability of reaching a specific position, and the corresponding impact intensity. The failure probability includes the probabilities of formation and instability of rock blocks under different failure modes, modeled based on the combination patterns of slope surfaces and rock discontinuities. The Monte Carlo method is employed to account for the randomness of mechanical and geometric parameters when quantifying instability probabilities. Additionally, the rock trajectories and impact energies simulated using the Flow-R software are combined with the rockfall failure probability to enable regional rockfall hazard zoning. A case study was conducted in Tiefeng, Chongqing, China, considering four types of rockfall failure modes. Hazard zoning results identified the steep and elevated terrains of the northern and southern anaclinal slopes as the areas of highest rockfall hazard. These findings align with observed conditions, providing detailed hazard zoning and validating the effectiveness and potential of the proposed model.
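The Monte Carlo instability-probability factor can be sketched for a single failure mode, planar sliding, using the textbook factor-of-safety formula FS = (c·A + W·cosθ·tanφ) / (W·sinθ) and random sampling of the strength parameters. The geometry and parameter distributions below are invented for illustration, not the case-study values.

```python
import math
import random

def planar_sliding_pf(n_trials=5000, seed=42):
    """Monte Carlo probability of instability for a planar sliding block:
    count the fraction of sampled (c, phi) realizations with FS < 1."""
    rng = random.Random(seed)
    W = 500.0                     # block weight, kN (hypothetical)
    A = 10.0                      # sliding-plane area, m^2 (hypothetical)
    theta = math.radians(35.0)    # joint dip angle (hypothetical)
    failures = 0
    for _ in range(n_trials):
        c = max(0.0, rng.gauss(10.0, 4.0))                   # cohesion, kPa
        phi = math.radians(max(1.0, rng.gauss(30.0, 5.0)))   # friction angle
        fs = (c * A + W * math.cos(theta) * math.tan(phi)) / (W * math.sin(theta))
        failures += fs < 1.0
    return failures / n_trials
```

Multiplying such a failure probability by a reach probability and an impact-intensity term (e.g. derived from simulated trajectories) yields the three-factor hazard product defined in the abstract.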
Funding: supported in part by the National Natural Science Foundation of China (Grant Nos. 62433005, 62272036, 62132003, and 62173167).
Abstract: The consultation intention of emergency decision-makers in urban rail transit (URT) is input into the emergency knowledge base in the form of domain questions to obtain emergency decision support services. This approach facilitates the rapid collection of complete knowledge and rules to form effective decisions. However, the current degree of structure of the URT emergency knowledge base remains low, and the domain questions lack labeled datasets, resulting in a large deviation between the consultation outcomes and the intended objectives. To address this issue, this paper proposes a question intention recognition model for the URT emergency domain, leveraging a knowledge graph (KG) and data enhancement technology. First, structured storage of emergency cases and emergency plans is realized based on the KG. Subsequently, a comprehensive question template is developed, and the labeled dataset of emergency domain questions in URT is generated through the KG. Lastly, data enhancement is applied via prompt learning and the NLP Chinese Data Augmentation (NLPCDA) tool, and an intention recognition model combining Generalized Auto-regression Pre-training for Language Understanding (XLNet) and the Recurrent Convolutional Neural Network for Text Classification (TextRCNN) is constructed. Word embeddings are generated by XLNet, context information is further captured using a Bidirectional Long Short-Term Memory network (BiLSTM), and salient features are extracted with a Convolutional Neural Network (CNN). Experimental results demonstrate that the proposed model can enhance the clarity of classification and the identification of domain questions, thereby providing supportive knowledge for emergency decision-making in URT.
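Generating a labeled question dataset from a KG through templates, as described above, can be sketched as follows. The triples, relation name, templates, and intent label are all invented examples, not the paper's schema.

```python
# Hypothetical KG triples (head, relation, tail) and question templates.
TRIPLES = [
    ("fire alarm", "response_plan", "evacuate platform"),
    ("power outage", "response_plan", "switch to backup power"),
]
TEMPLATES = [
    ("What is the response plan for a {head}?", "ask_plan"),
    ("How should staff react to a {head}?", "ask_plan"),
]

def generate_labeled_questions(triples, templates):
    """Expand each matching KG triple through every template, yielding
    (question, intent_label) pairs for an intention-recognition training set."""
    return [(tpl.format(head=head), label)
            for head, rel, tail in triples
            for tpl, label in templates
            if rel == "response_plan"]
```

Each relation type in the KG gets its own template set and intent label, so the labeled dataset grows with the graph instead of requiring manual annotation.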
Abstract: The investigation of climate change trends and their effects on ecology, economy, and sociology is essential for long-term policy making. The long-term measurement of the physical and chemical properties of the atmosphere with state-of-the-art instruments provides high-quality data for these studies. The evaluated data are stored in highly specialized file structures and formats that cannot be inserted into one common database. Moreover, the observed data are usually available in ASCII format, and users sometimes need to convert them into other formats. File conversion is usually a time-consuming procedure and can contribute to the uncertainties. MeteoRead is a client database software that imports observed atmospheric data, e.g. wind direction, wind speed, and aerosol particle concentration, and makes them available in the file formats most commonly used in climate research. This Java™-based program applies Structured Query Language (SQL) functions such as table creation on a database server, insertion of data or figures into tables, and data selection via a Graphical User Interface. The selected data can be stored in NetCDF, HDF5, DataBase, or TXT file formats, and the figures can be exported in PNG, JPG, JPNG, PDF, or GIF file formats. The program was tested on Linux and Windows platforms with different Java™ Development Kits. MeteoRead is planned to be extended to visualize the annual, seasonal, monthly, daily, or hourly average values of the selected data and to use the functionality of the SQL database to calculate various mathematical and statistical correlations.
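The SQL round trip MeteoRead performs (table creation, insertion, selection) can be sketched with the standard library's sqlite3 module. MeteoRead itself targets a database server and is written in Java; the table layout and column names here are a hypothetical stand-in.

```python
import sqlite3

def demo_meteo_table():
    """Create an observations table, insert rows, and select by a
    wind-speed condition, mirroring the create/insert/select workflow."""
    con = sqlite3.connect(":memory:")
    con.execute("""CREATE TABLE observations (
                       ts TEXT, wind_dir_deg REAL,
                       wind_speed_ms REAL, aerosol_cm3 REAL)""")
    rows = [("2024-01-01T00:00", 270.0, 5.2, 1200.0),
            ("2024-01-01T01:00", 255.0, 4.8, 1150.0)]
    con.executemany("INSERT INTO observations VALUES (?, ?, ?, ?)", rows)
    cur = con.execute(
        "SELECT ts, wind_speed_ms FROM observations WHERE wind_speed_ms > ?",
        (5.0,))
    result = cur.fetchall()
    con.close()
    return result
```

The selected rows would then be written out in the user's chosen export format (NetCDF, HDF5, TXT, etc.) by a separate serialization layer.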
Funding: granted by the National Science & Technology Major Projects of China (Grant No. 2016ZX05033).
Abstract: 1 Introduction. Information technology has been playing an ever-increasing role in geoscience. Sophisticated database platforms are essential for geological data storage, analysis, and the exchange of Big Data (Feblowitz, 2013; Zhang et al., 2016; Teng et al., 2016; Tian and Li, 2018). The United States has built an information-sharing platform for state-owned scientific data as a national strategy.
Funding: supported by the National Science Foundation of China (60075015) and a Key Project of the Scientific and Technological Department in Anhui.
Abstract: This paper first puts forward a case-based system framework built on data mining techniques. It then examines the possibility of using neural networks as the retrieval method in such a case-based system. Within this system, data mining algorithms are proposed to discover case knowledge, together with other supporting algorithms.
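The paper explores neural networks for retrieval; as a simpler, self-contained baseline, case retrieval over feature vectors can be sketched with cosine similarity. The case structure and field names are invented for illustration.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(case_base, query, k=1):
    """Return the k cases most similar to the query feature vector."""
    ranked = sorted(case_base,
                    key=lambda c: cosine(c["features"], query),
                    reverse=True)
    return ranked[:k]
```

A neural retriever would replace the fixed similarity with a learned one, but the ranking interface stays the same.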
Abstract: Since web-based GIS processes large amounts of spatial geographic information on the internet, the efficiency of spatial data query processing and transmission should be improved. This paper presents two efficient methods for this purpose: division transmission and progressive transmission. In the division transmission method, a map is divided into several parts, called “tiles”, and only the requested tiles are transmitted to a client. In the progressive transmission method, a map is split into several phase views based on the significance of vertices, and the server produces a target object and transmits it progressively when the spatial object is requested by a client. To realize these methods, the “tile division” and “priority order estimation” algorithms and the corresponding data transmission strategies are proposed in this paper. Compared with traditional methods such as “map total transmission” and “layer transmission”, the web-based GIS data transmission proposed in this paper increases data transmission efficiency by a great margin.
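The tile-division idea can be sketched as index arithmetic over a regular grid: a client viewport maps to a small set of tile indices, and only those tiles are transmitted. The function names and viewport convention are assumptions, not the paper's algorithm.

```python
def tile_of(x, y, min_x, min_y, tile_w, tile_h):
    """Map a map coordinate to its (column, row) tile index."""
    return int((x - min_x) // tile_w), int((y - min_y) // tile_h)

def tiles_for_viewport(vx0, vy0, vx1, vy1, min_x, min_y, tile_w, tile_h):
    """All tiles a client viewport intersects; only these need sending."""
    c0, r0 = tile_of(vx0, vy0, min_x, min_y, tile_w, tile_h)
    c1, r1 = tile_of(vx1, vy1, min_x, min_y, tile_w, tile_h)
    return [(c, r) for r in range(r0, r1 + 1) for c in range(c0, c1 + 1)]
```

Compared with transmitting the whole map, the client downloads only the tiles covering its current view, which is the source of the efficiency gain described above.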
Funding: Supported by the NSC under Grant Nos. NSC-101-2221-E-239-032 and NSC-102-2221-E-239-020.
Abstract: Sensor nodes in a wireless sensor network (WSN) are typically battery-powered, so their energy is constrained. Our design goal is to utilize each sensor node's energy efficiently to extend its lifetime, and thereby prolong the lifetime of the whole WSN. In this paper, we propose a path-based data aggregation scheme (PBDAS) for grid-based wireless sensor networks. To extend the lifetime of a WSN, we construct a grid infrastructure by partitioning the sensor field into a grid of cells. Each cell has a head responsible for aggregating its own data with the data sensed by the other nodes in the same cell and then transmitting the result. To transmit data to the base station (BS) efficiently and rapidly, we link the cell heads to form a chain. Each cell head on the chain takes a turn as the chain leader responsible for transmitting data to the BS. Aggregated data moves from head to head along the chain, and finally the chain leader transmits it to the BS. In PBDAS, only the cell heads transmit data toward the BS, so the number of transmissions to the BS decreases substantially. Moreover, the cell heads and the chain leader are designated in turn according to energy level, so that energy depletion is evenly distributed among nodes. Simulation results show that PBDAS extends the lifetime of the sensor nodes, and hence of the whole network.
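The energy-based rotation of heads and chain leader described above can be sketched as follows. This is a minimal illustration of the selection rule only (not the authors' simulation code); the cell layout, node names, and energy values are assumptions.

```python
# Illustrative PBDAS-style election: in each cell the node with the most
# residual energy becomes the head; among heads, the most energetic one
# becomes the chain leader for the current round.

def elect_heads(cells):
    """cells: dict cell_id -> list of (node_id, residual_energy)."""
    return {cid: max(nodes, key=lambda n: n[1])[0]
            for cid, nodes in cells.items()}

def elect_leader(cells, heads):
    """Pick the chain leader: the cell head with the highest residual energy."""
    energy = {nid: e for nodes in cells.values() for nid, e in nodes}
    return max(heads.values(), key=lambda nid: energy[nid])

cells = {
    "A": [("a1", 0.9), ("a2", 0.4)],
    "B": [("b1", 0.7), ("b2", 0.8)],
}
heads = elect_heads(cells)            # head of A is a1, head of B is b2
leader = elect_leader(cells, heads)   # a1 has the most energy among heads
```

Re-running the election each round as energies drain is what spreads depletion evenly across nodes.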
Funding: Supported by the National High Technology Research and Development Program of China under Grant No. 2015AA016902, the National Natural Science Foundation of China under Grant Nos. 61435013 and 61405188, and the K. C. Wong Education Foundation.
Abstract: An 8×10 GHz receiver optical sub-assembly (ROSA), consisting of an 8-channel arrayed waveguide grating (AWG) and an 8-channel PIN photodetector (PD) array, is designed and fabricated based on silica hybrid integration technology. Multimode output waveguides in the silica AWG with a 2% refractive index difference are used to obtain flat-top spectra. The output waveguide facet is polished to a 45° bevel to redirect the light into the mesa-type PIN PD, which simplifies the packaging process. The experimental results show that the single-channel 1 dB bandwidth of the AWG ranges from 2.12 nm to 3.06 nm, the ROSA responsivity ranges from 0.097 A/W to 0.158 A/W, and the 3 dB bandwidth is up to 11 GHz. The device is promising for eight-lane WDM transmission systems in data center interconnection.
Abstract: Over the last few years, the Internet of Things (IoT) has become an omnipresent term. The IoT extends the existing concepts of anytime and anyplace to connectivity for anything. This proliferation offers opportunities but may also carry risks. A hitherto neglected aspect is the possible increase in power consumption, since smart devices in IoT applications are expected to be reachable by other devices at all times; a device thus consumes electrical energy even when it is not in use for its primary function. Many research communities have begun addressing the storage capability (such as the cache memory) of smart devices using Named Data Networking (NDN) to achieve a more energy-efficient communication model. In NDN, memory or buffer overflow is a common challenge: when a node's internal memory exceeds its limit, the data with the highest degree of freshness may not be accommodated, and the scenario degenerates into a traditional network in which intermediate nodes perform no data caching to guarantee freshness. Given the periodic updates sent by data producers, data consumers must receive up-to-date information at the least energy cost. Consequently, maintaining the tradeoff between freshness and energy consumption during publisher-subscriber interaction is a challenge. In this work, we propose an architecture that addresses the cache strategy issue with a Smart Caching Algorithm for improved memory management and data freshness. The smart caching strategy updates data at precise intervals while taking garbage data into consideration. Experiments also show that data redundancy can be reduced by dropping packets carrying information of no interest to the other participating nodes in the network, ultimately optimizing the tradeoff between freshness and the energy required.
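The overflow behavior described above can be made concrete with a freshness-aware cache sketch: on overflow, the least-fresh (oldest-timestamp) entry is evicted, and stale duplicates are dropped to avoid redundancy. This is a hedged illustration of the general idea, not the paper's Smart Caching Algorithm; the class and content names are assumptions.

```python
# Hedged sketch of a freshness-aware NDN-style cache: on overflow, the entry
# with the oldest timestamp (lowest freshness) is evicted first, so the
# freshest data survives instead of new updates being silently dropped.

class FreshnessCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = {}  # content name -> (timestamp, data)

    def insert(self, name, timestamp, data):
        # Drop stale duplicates: only a strictly newer copy replaces an entry.
        if name in self.store and self.store[name][0] >= timestamp:
            return
        if name not in self.store and len(self.store) >= self.capacity:
            stalest = min(self.store, key=lambda k: self.store[k][0])
            del self.store[stalest]  # evict the least-fresh entry
        self.store[name] = (timestamp, data)

cache = FreshnessCache(capacity=2)
cache.insert("/temp", 1, "20C")
cache.insert("/humidity", 2, "40%")
cache.insert("/temp", 0, "19C")   # stale duplicate: ignored
cache.insert("/wind", 3, "5mps")  # full: evicts /temp (oldest timestamp)
```

A traditional buffer would instead refuse the newest item on overflow, which is exactly the failure mode the abstract describes.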
Funding: Supported by the National Natural Science Foundation of China (61304079, 61125306, 61034002), the Open Research Project from SKLMCCS (20120106), the Fundamental Research Funds for the Central Universities (FRF-TP-13-018A), and the China Postdoctoral Science Foundation (2013M530527).
Abstract: A novel optimal tracking control method is proposed in this paper for a class of discrete-time systems with actuator saturation and unknown dynamics. The scheme is based on the iterative adaptive dynamic programming (ADP) algorithm. To implement the control scheme, a data-based identifier is first constructed for the unknown system dynamics. By introducing the M-network, an explicit formulation of the steady control is obtained. To eliminate the effect of actuator saturation, a nonquadratic performance functional is introduced, and an iterative ADP algorithm with convergence analysis is then established to obtain the optimal tracking control solution. For implementation, neural networks are used to build the data-based identifier, compute the performance index function, approximate the optimal control policy, and solve for the steady control, respectively. A simulation example is provided to verify the effectiveness of the presented optimal tracking control scheme.
Funding: Primarily supported by the National 973 Fundamental Research Program of China (Grant No. 2013CB430103) and the Department of Transportation Federal Aviation Administration (Grant No. NA17RJ1227) through the National Oceanic and Atmospheric Administration; also supported by the National Science Foundation of China (Grant No. 41405100) and the Fundamental Research Funds for the Central Universities (Grant No. 20620140343).
Abstract: The traditional threat score based on fixed thresholds for precipitation verification is sensitive to intensity forecast bias. In this study, the neighborhood precipitation threat score is modified by defining the thresholds in terms of the percentiles of overall precipitation instead of fixed values, which reduces the impact of intensity forecast bias on the calculated score. The method is tested with forecasts of a tropical storm that re-intensified after making landfall and caused heavy flooding. The forecasts are produced with and without radar data assimilation. The forecast assimilating both radial velocity and reflectivity produces precipitation patterns that better match observations but has a large positive intensity bias. With fixed thresholds, the neighborhood threat scores fail to reward forecasts that match the observed pattern well, because of this large intensity bias. In contrast, the percentile-based neighborhood method yields the highest score for the forecast with the best pattern match and the smallest position error. The percentile-based method also yields scores that are more consistent with object-based verifications, which are less sensitive to intensity bias, demonstrating the potential value of percentile-based verification.
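The percentile-based thresholding idea can be shown with a toy calculation: each field's event threshold is taken at the same percentile of that field's own values, so a uniform intensity bias no longer hurts the score. This is a simplified single-point (non-neighborhood) sketch under assumed toy data, not the study's actual verification code.

```python
# Minimal sketch of percentile-based thresholds: instead of one fixed rain
# threshold for both fields, each field uses its own percentile value, which
# removes the penalty from a uniform intensity bias.
import numpy as np

def threat_score(forecast, observed, percentile):
    """Simple threat score (hits / (hits + misses + false alarms)) with
    percentile-defined event thresholds."""
    f_event = forecast >= np.percentile(forecast, percentile)
    o_event = observed >= np.percentile(observed, percentile)
    hits = np.sum(f_event & o_event)
    misses = np.sum(~f_event & o_event)
    false_alarms = np.sum(f_event & ~o_event)
    return hits / (hits + misses + false_alarms)

obs = np.array([0., 1., 2., 3., 4., 5., 6., 7., 8., 9.])
fcst = 2.0 * obs                      # same spatial pattern, doubled intensity
score = threat_score(fcst, obs, 80)   # perfect score despite the 2x bias
```

With a fixed threshold (say 7.0 on both fields), the doubled forecast would generate false alarms and lose score even though its pattern is identical; the percentile version does not.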