Switzerland is one of the most desirable European destinations for Chinese tourists; therefore, a better understanding of Chinese tourists is essential for successful business practices. In China, the largest and leading social media platform, Sina Weibo (a hybrid of Twitter and Facebook), has more than 600 million users. Weibo's great market penetration suggests that tourism operators and marketers need to understand how to build effective and sustainable communications on Chinese social media platforms. To offer a better decision-support platform to tourism destination managers as well as Chinese tourists, we propose a framework using linked data on Sina Weibo. Linked Data refers to using the Internet to connect related data. We show how it can be used and how an ontology can be designed to include the users' context (e.g., GPS locations). Our framework provides a sound theoretical foundation for further understanding Chinese tourists' expectations, experiences, behaviors, and new trends in Switzerland.
Detecting and identifying radionuclides through a gamma energy spectrum data acquisition and processing system is one of the key issues in intelligent gamma-ray spectrum measurement. For this reason, a software and hardware implementation scheme based on an ARM (Advanced RISC Machines) + DSP (Digital Signal Processor) architecture for a gamma energy spectrum data acquisition and processing system is proposed. The paper discusses in detail several key technologies, such as the communication interface design between the ARM microcontroller and the DSP, task scheduling under multi-tasking in ARM-Linux, and DSP handling procedures for multi-channel high-speed A/D sampling. In addition, because traditional Gaussian fitting does not determine peak boundaries well, a weighted least-squares Gaussian fitting method is put forward to determine peak boundaries. Finally, gamma-spectrum data from a NaI(Tl) sodium iodide scintillation detector were acquired and processed in the new system. The results show that the system is fully functional, stable, and converges on unimodal peaks. Compared with data from conventional energy spectrometers, the system maintains better energy resolution over a wide range of pulse pass rates.
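The peak-boundary idea in the abstract above can be sketched numerically: fit a Gaussian to a spectral peak with count-dependent weights in the least-squares objective, then take the boundary as mu ± 3σ. This is an illustrative reconstruction, not the paper's exact algorithm; the synthetic data, the weighting scheme, and the 3σ boundary convention are all assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, mu, sigma):
    # single Gaussian peak model for one photopeak
    return amp * np.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

rng = np.random.default_rng(0)
channels = np.arange(0, 200, dtype=float)
true = gaussian(channels, amp=1000.0, mu=100.0, sigma=8.0)
counts = true + rng.normal(0.0, 5.0, channels.size)

# Poisson-like weighting: higher-count channels get larger weight,
# emphasizing the peak region when locating its boundaries.
weights = np.sqrt(np.clip(counts, 1.0, None))
popt, _ = curve_fit(gaussian, channels, counts,
                    p0=(counts.max(), channels[np.argmax(counts)], 5.0),
                    sigma=1.0 / weights)

amp, mu, sigma = popt
# take the peak region as mu +/- 3 sigma, a common boundary convention
left, right = mu - 3.0 * sigma, mu + 3.0 * sigma
```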
In this paper, we present a study of activation functions for an MLNN (multi-layered neural network) and propose a suitable activation function for data enlargement processing. We carefully studied the training performance of the Sigmoid, ReLU, Leaky-ReLU, and L & exp. activation functions for few-input to multiple-output training patterns. Our MLNN model has L hidden layers, with two or three inputs and four or six outputs, trained by BP (backpropagation) NN (neural network) training. We focused on multiple teacher training signals to investigate and evaluate training performance in MLNNs and to select the best activation function for data enlargement, which could then be applicable to image and signal processing (synaptic divergence) along with the proposed methods with convolutional networks. Of the four activation functions studied, we found that the L & exp. function suits DENN (data enlargement neural network) training, since it gave the highest training-ability percentages compared to Sigmoid, ReLU, and Leaky-ReLU during simulation and training. Finally, we recommend the L & exp. function for MLNNs; it may be applicable to signal processing for data and information enlargement because of its training performance with multiple teacher training patterns on originally generated data, and it can be tried with CNNs (convolutional neural networks) for image processing.
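For reference, the three standard activation functions compared in the abstract can be written in a few lines of NumPy; the paper's L & exp. function is its own construction and is not reproduced here, and the alpha value below is the usual default rather than anything taken from the paper.

```python
import numpy as np

def sigmoid(x):
    # squashes inputs to the open interval (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # zero for negative inputs, identity for positive inputs
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # small slope alpha for negative inputs avoids "dead" units
    return np.where(x > 0.0, x, alpha * x)
```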
With the increasing variety of application software in meteorological satellite ground systems, how to provision reasonable hardware resources and improve software efficiency is receiving more and more attention. In this paper, a software classification method based on software operating characteristics is proposed. The method uses run-time resource consumption to describe software running characteristics. Firstly, principal component analysis (PCA) is used to reduce the dimensionality of the software running-feature data and to interpret software characteristic information. Then a modified K-means algorithm is used to classify the meteorological data processing software. Finally, the clusters are combined with the PCA results to explain the meaning of each class's overall operating characteristics, which serves as the basis for optimizing the allocation of hardware resources and improving software operating efficiency.
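The PCA-then-K-means pipeline described above can be illustrated with a NumPy-only sketch on synthetic run-time features. The feature names, the two-cluster setup, and the plain K-means step (rather than the paper's modified variant) are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
# hypothetical run-time features per software instance (illustrative):
# [cpu_pct, mem_mb, disk_mb_s, net_mb_s]
light = rng.normal([5.0, 200.0, 1.0, 1.0], 1.0, size=(30, 4))
heavy = rng.normal([80.0, 4000.0, 50.0, 20.0], 1.0, size=(30, 4))
X = np.vstack([light, heavy])

# PCA via SVD on the standardized feature matrix
Z = (X - X.mean(axis=0)) / X.std(axis=0)
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt[:2].T            # project onto the first two components

# plain 2-means on the reduced data (vanilla K-means step)
centers = scores[[0, -1]]        # seed with one point from each end
for _ in range(10):
    d = np.linalg.norm(scores[:, None, :] - centers[None, :, :], axis=2)
    labels = d.argmin(axis=1)            # assign each sample to nearest center
    centers = np.array([scores[labels == k].mean(axis=0) for k in range(2)])
```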
The rapid development of urbanization requires that land management business change its former pattern of isolated, single-purpose systems and advance toward functional integration and data sharing. To meet this requirement, this paper presents new thinking on the land management pattern and management tools based on a data center for the integration of urban and rural areas. The tools are based on MapGIS and make possible the management of multi-subject, multi-area, multi-source, and multi-measurement data. The system is designed in accordance with the relevant national standards. Experimental results show that the tools have obvious technical advantages in the integrated management of land resource business.
The increase in computing capacity caused a rapid and sudden increase in the Operational Expenses (OPEX) of data centers. OPEX reduction is a big concern and a key target in modern data centers. In this study, the scalability of the Dynamic Voltage and Frequency Scaling (DVFS) power management technique is studied under multiple workloads. The environment of this study is a 3-tier data center. We conducted multiple experiments to find the impact of using DVFS on energy reduction under two scheduling techniques, namely Round Robin and Green. We observed that the amount of energy reduction varies according to data center load: when the data center load increases, the energy reduction decreases. Experiments using the Green scheduler showed around an 83% decrease in power consumption when DVFS is enabled and the DC is lightly loaded. When the DC is fully loaded, in which case the servers' CPUs are constantly busy with no idle time, the effect of DVFS decreases and stabilizes at less than 10%. Experiments using the Round Robin scheduler showed less energy saving by DVFS: around 25% at light DC load and less than 5% at heavy DC load. To find the effect of task weight on energy consumption, a set of experiments was conducted applying thin and fat tasks; a thin task has far fewer instructions than a fat task. We observed through simulation that the difference in power reduction between the two task types when using DVFS is less than 1%.
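The reason DVFS saves so much at light load can be seen from the standard dynamic-power relation P ≈ C·V²·f: lowering voltage and frequency together reduces power superlinearly, and a lightly loaded server spends more time at the low operating point. The two operating points below are hypothetical, not taken from the study.

```python
def dynamic_power(c_eff, voltage, freq_hz):
    # dynamic CMOS power: effective capacitance * V^2 * f
    return c_eff * voltage ** 2 * freq_hz

# hypothetical high and DVFS-reduced operating points
p_full = dynamic_power(1.0e-9, 1.2, 2.0e9)    # 1.2 V @ 2.0 GHz
p_scaled = dynamic_power(1.0e-9, 0.8, 1.0e9)  # 0.8 V @ 1.0 GHz

saving = 1.0 - p_scaled / p_full              # fraction of dynamic power saved
```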
To provide a scientific management basis for garden planning, project construction, maintenance, and social services, this paper proposes that urban gardening administration sectors need to construct a gardening information management system. On the basis of a full requirements analysis of gardening sectors, this paper discusses the key technologies for system construction. It also proposes flexibly building up the system by using the secondary development design environment and runtime environment of a data-center-based integration development platform. This system greatly helps daily management and plays a very important role in improving the urban ecological environment and investment environment.
Objective expertise evaluation of individuals, as a prerequisite stage for team formation, has been a long-term desideratum in large software development companies. With the rapid advancements in machine learning methods, based on reliable existing data stored in project management tools' datasets, automating this evaluation process becomes a natural step forward. In this context, our approach focuses on quantifying software developer expertise by using metadata from task-tracking systems. For this, we mathematically formalize two categories of expertise: technology-specific expertise, which denotes the skills required for a particular technology, and general expertise, which encapsulates overall knowledge in the software industry. Afterward, we automatically classify the zones of expertise associated with each task a developer has worked on, using Bidirectional Encoder Representations from Transformers (BERT)-like transformers to handle the unique characteristics of project tool datasets effectively. Finally, our method evaluates the proficiency of each software specialist across already completed projects from both technology-specific and general perspectives. The method was experimentally validated, yielding promising results.
The growth of computing power in data centers (DCs) leads to an increase in the energy consumption and noise pollution of air cooling systems. Chip-level cooling with a high-efficiency coolant is one of the promising methods to address the cooling challenge for high-power devices in DCs. Hybrid nanofluid (HNF) has the advantages of high thermal conductivity and good rheological properties. This study summarizes the numerical investigations of HNFs in mini/micro heat sinks, including the numerical methods, hydrothermal characteristics, and enhanced heat transfer technologies. The innovations of this paper include: (1) the characteristics, applicable conditions, and scenarios of each theoretical and numerical method are clarified; (2) molecular dynamics (MD) simulation can reveal the synergy effect, micro motion, and agglomeration morphology of different nanoparticles, while machine learning (ML) presents a feasible method for parameter prediction, providing the opportunity for intelligent regulation of the thermal performance of HNFs; (3) HNF flow boiling and the synergy of passive and active technologies may further improve the overall efficiency of liquid cooling systems in DCs. This review provides valuable insights and references for exploring the multi-phase flow and heat transport mechanisms of HNFs and promoting the practical application of HNFs in chip-level liquid cooling in DCs.
Owing to the wide application of automatic control systems in the process industries, the impact of controller performance on industrial processes is becoming increasingly significant. Consequently, controller maintenance is critical to guarantee the routine operation of industrial processes. The workflow of controller maintenance generally involves the following steps: monitoring operating controller performance and detecting performance degradation, diagnosing the probable root causes of control system malfunctions, and taking specific actions to resolve the associated problems. In this article, a comprehensive overview of mainstream control loop monitoring and diagnosis methods is provided, and some existing problems are analyzed and discussed. From the viewpoint of synthesizing abundant information in the context of big data, some prospective ideas and promising methods are outlined to potentially solve problems in industrial applications.
The effect of a gradient exhaust strategy and blind plate installation on the inhibition of backflow and thermal stratification in data center cabinets is systematically investigated in this study through numerical methods. The validated Re-Normalization Group (RNG) k-ε turbulence model was used to analyze airflow patterns within cabinet structures equipped with backplane air conditioning. Key findings reveal that server-generated thermal plumes induce hot air accumulation at the cabinet apex, creating a 0.8℃ temperature elevation at the top server's inlet compared to the ideal situation (23℃). Strategic increases in backplane fan exhaust airflow rates reduce server 1's inlet temperature from 26.1℃ (0% redundancy case) to 23.1℃ (40% redundancy case). Gradient exhaust strategies achieve server temperature performance equivalent to uniform exhaust distributions while requiring 25% less redundant airflow. This approach decreases the recirculation ratio from 1.52% (uniform exhaust at 15% redundancy) to 0.57% (gradient exhaust at equivalent redundancy). Comparative analyses demonstrate divergent thermal behaviors: in bottom-server-absent configurations, gradient exhaust reduces top server inlet temperatures by 1.6℃ vs. uniform exhaust, whereas top-server-absent configurations exhibit a 1.8℃ temperature increase under gradient conditions. The blind plate implementation achieves a 0.4℃ top server temperature reduction compared to 15%-redundancy uniform exhaust systems without requiring additional airflow redundancy. Partially installed server arrangements with blind plates maintain thermal characteristics comparable to fully populated cabinets. This study validates gradient exhaust and blind plate technologies as effective countermeasures against cabinet-scale thermal recirculation, providing actionable insights for optimizing backplane air conditioning systems in mission-critical data center environments.
This article proposes a framework called BP-M*, which includes: 1) a methodology to analyze, engineer, restructure, and implement business processes, and 2) a process model that extends the process diagram with the specification of the resources that execute the process activities, allocation policies, schedules, activity times, management of the queues feeding the activities, and workloads, so that the same model can be simulated by a discrete-event simulator. The BP-M* framework has been applied to a real case study, a public Contact Center that provides different typologies of answers to users' requests. The simulation allows the study of different system operating scenarios ("What-If" analysis), providing useful information for analysts to evaluate restructuring actions.
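As a minimal flavor of the discrete-event side, a single-operator FIFO queue (one contact-center agent, exponential arrivals and service) can be simulated in a few lines. The arrival and service rates are invented for illustration; the real BP-M* model covers resources, allocation policies, and schedules well beyond this sketch.

```python
import random

def simulate_single_server(arrivals, services):
    """Waiting times for a FIFO single-server queue (one 'operator')."""
    t_free, waits = 0.0, []
    for arr, svc in zip(arrivals, services):
        start = max(arr, t_free)      # wait until the server frees up
        waits.append(start - arr)
        t_free = start + svc
    return waits

random.seed(0)
arrivals, t = [], 0.0
for _ in range(1000):
    t += random.expovariate(1.0)                 # mean inter-arrival time 1.0
    arrivals.append(t)
services = [random.expovariate(1.25) for _ in range(1000)]  # mean service 0.8

waits = simulate_single_server(arrivals, services)
avg_wait = sum(waits) / len(waits)               # a "What-If" output metric
```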
The transmission of scientific data over long distances is required to enable interplanetary science expeditions. Current approaches include transmitting all collected data or transmitting low-resolution data to enable ground-controller review and selection of data for transmission. Model-based data transmission (MBDT) seeks to increase the amount of knowledge conveyed per unit of data transmitted by comparing high-resolution data collected in situ to a pre-existing (or potentially co-transmitted) model. This paper describes the application of MBDT to gravitational data and characterizes its utility and performance. This is done by applying the MBDT technique to a selection of gravitational data previously collected for the Earth and comparing the transmission requirements to the level required for raw data transmission and non-application-aware compression. Transmission reductions of up to 31.8% (without maximum-error thresholding) and up to 97.17% (with maximum-error thresholding) resulted. These levels significantly exceed what is possible with non-application-aware compression.
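The maximum-error-thresholding variant of MBDT can be sketched directly: transmit only the samples whose residual against the shared model exceeds a threshold. The sine-wave model, noise level, anomaly pattern, and threshold below are assumptions for illustration, not the paper's gravitational data.

```python
import numpy as np

def mbdt_select(measured, model, max_err):
    """Select only samples whose model prediction errs by more than max_err."""
    resid = measured - model
    idx = np.flatnonzero(np.abs(resid) > max_err)
    return idx, resid[idx]            # indices + residuals to transmit

rng = np.random.default_rng(2)
model = np.sin(np.linspace(0.0, 6.28, 1000))        # shared a-priori model
measured = model + rng.normal(0.0, 0.01, 1000)      # in-situ measurements
measured[::100] += 0.5                              # a few genuine anomalies

idx, resid = mbdt_select(measured, model, max_err=0.05)
reduction = 1.0 - idx.size / measured.size          # fraction not transmitted
```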
This paper presents a lineament detection method using multi-band remote sensing images. The main objective of this work is to design an automatic image processing tool for lineament mapping from Landsat-7 ETM+ satellite data. Five procedures were involved: 1) principal component analysis; 2) image enhancement using the histogram equalization technique; 3) directional Sobel filtering of the original data; 4) histogram segmentation; and 5) binary image generation. The applied methodology contributed to identifying several known large-scale faults in northeastern Tunisia. The statistical and spatial analyses of the lineament map indicate differences in the morphological appearance of lineaments in the satellite image. Indeed, all the lineaments present a specific organization: five groups were classified based on three orientations, NE-SW, E-W, and NW-SE. Overlaying the lineament map on the geologic map confirms that these lineaments of diverse directions can be identified and recognized in the field as faults. The identified lineaments were linked to deep faults caused by tectonic movements in Tunisia. This study shows the performance of satellite image processing in the analysis and mapping of faults in the northern Atlas.
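Step 3 (directional Sobel filtering) followed by step 5 (binary image generation) can be sketched on a toy image. The kernel, the 50% threshold, and the test pattern are illustrative choices, not the paper's parameters.

```python
import numpy as np
from scipy.ndimage import convolve

# directional Sobel kernel responding to vertical (N-S) edges;
# kernels for other directions (E-W, diagonals) are built analogously
sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)

img = np.zeros((9, 9))
img[:, 4:] = 1.0                       # a sharp N-S "lineament" edge

edges = np.abs(convolve(img, sobel_x, mode="nearest"))
binary = (edges > edges.max() * 0.5).astype(np.uint8)   # binary lineament map
```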
While the Ordos Basin is recognized for its substantial hydrocarbon exploration prospects, its rugged loess tableland terrain has rendered seismic exploration exceptionally challenging [1-3]. Persistent obstacles such as complex 3D survey planning, low signal-to-noise-ratio raw data, inadequate near-surface velocity modeling, and imaging inaccuracy have long hindered the advancement of seismic exploration across this region. Through a problem-solving approach rooted in geological target analysis, this research systematically investigates the behavioral patterns of nodal-seismometer-based high-density seismic acquisition on the loess plateau. Tailored advancements in waveform enhancement and depth velocity modeling methodologies have been engineered. Field validations confirm that the optimized workflow demonstrates marked improvements in amplitude preservation and imaging resolution, offering novel insights for future reservoir characterization endeavors.
A method of data processing to determine the coefficients of the linearization equations for the 1050 anemometer (produced by Thermo-Systems Inc. (TSI), USA) with sensors made of domestic hot wire, using the program preferred in this paper, is described. Calculation and testing indicate that the error resulting from this method is about 0.5% of full scale and less than TSI's. Using this method, we can set up the calibration curve according to the measurement range and the diameter of the hot wire at a given accuracy.
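The abstract does not reproduce the paper's exact linearization equations; the standard hot-wire form is King's law, E² = A + B·Uⁿ, whose coefficients can be fitted by least squares from calibration pairs and then inverted to linearize the probe output. The exponent n and the synthetic calibration data below are assumptions for illustration.

```python
import numpy as np

# King's law for a hot wire: E^2 = A + B * U**n  (n ~ 0.45 is typical).
n = 0.45
U = np.array([2.0, 4.0, 6.0, 8.0, 10.0])    # calibration velocities (m/s)
E = np.sqrt(1.0 + 0.8 * U ** n)             # synthetic bridge voltages

# linear least squares for A and B in E^2 = A + B * U^n
M = np.column_stack([np.ones_like(U), U ** n])
A, B = np.linalg.lstsq(M, E ** 2, rcond=None)[0]

# invert the fitted law to recover velocity from a measured voltage
U_hat = ((E ** 2 - A) / B) ** (1.0 / n)
```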
This article studies the fault recorder in power systems and introduces the COMTRADE format. It uses C++ programming to read recorded fault data and adopts Fourier analysis and the symmetrical component method to filter and extract the fundamental waves. Finally, the effectiveness of the data processing method introduced in this paper is verified with CAAP software.
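The two signal-processing steps named above, full-cycle Fourier extraction of the fundamental and the symmetrical-component transform, are standard and can be sketched as follows (in Python rather than the article's C++). The samples-per-cycle count and the test waveform are illustrative assumptions.

```python
import numpy as np

def fundamental_phasor(samples, n_per_cycle):
    """Full-cycle DFT: complex phasor of the fundamental from one cycle."""
    k = np.arange(n_per_cycle)
    w = np.exp(-2j * np.pi * k / n_per_cycle)
    return 2.0 / n_per_cycle * np.sum(samples[:n_per_cycle] * w)

def symmetrical_components(pa, pb, pc):
    """Positive-, negative-, and zero-sequence phasors from abc phasors."""
    a = np.exp(2j * np.pi / 3.0)                 # 120-degree rotation operator
    pos = (pa + a * pb + a * a * pc) / 3.0
    neg = (pa + a * a * pb + a * pc) / 3.0
    zero = (pa + pb + pc) / 3.0
    return pos, neg, zero

n_per_cycle = 64
t = np.arange(n_per_cycle) / n_per_cycle
# fundamental of amplitude 10 plus a 3rd harmonic the DFT filters out
x = 10.0 * np.cos(2 * np.pi * t) + 3.0 * np.cos(2 * np.pi * 3 * t)
ph = fundamental_phasor(x, n_per_cycle)
```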
Data centers operate as the physical digital infrastructure for generating, storing, computing, transmitting, and utilizing massive data and information, constituting the backbone of the flourishing digital economy across the world. Given the lack of a consistent analysis for studying the locational factors of data centers and the empirical deficiencies in longitudinal investigations of the spatial dynamics of heterogeneous data centers, this paper develops a comprehensive analytical framework to examine the dynamic geographies and locational factors of techno-environmentally heterogeneous data centers across Chinese cities in the period 2006–2021. First, we develop a "supply-demand-environment trinity" analytical framework as well as an accompanying evaluation indicator system with Chinese characteristics. Second, the dynamic geographies of data centers in Chinese cities over the last decades are characterized as spatial polarization in economically leading urban agglomerations alongside persistent interregional gaps across the eastern, central, and western regions. Data centers present dual spatial expansion trajectories featuring outward radiation from eastern core urban agglomerations to adjacent peripheries and leapfrog diffusion to strategic central and western digital infrastructure hubs. Third, it is empirically verified that data center construction in Chinese cities over the last decades has been jointly influenced by supply-, demand-, and environment-side locational factors, echoing the efficacy of the trinity analytical framework. Overall, our findings demonstrate the temporal variance, contextual contingency, and attribute-based differentiation of the locational factors underlying techno-environmentally heterogeneous data centers in Chinese cities.
The InSight mission has obtained seismic data from Mars, offering new insights into the planet's internal structure and seismic activity. However, the raw data released to the public contain various sources of noise, such as ticks and glitches, which hamper further seismological studies. This paper presents step-by-step processing of InSight's Very Broad Band seismic data, focusing on the suppression and removal of non-seismic noise. The processing stages include tick noise removal, glitch signal suppression, multicomponent synchronization, instrument response correction, and rotation of orthogonal components. The processed datasets and associated codes are openly accessible and will support ongoing efforts to explore the geophysical properties of Mars and contribute to the broader field of planetary seismology.
Aiming at problems such as the low throughput and unbalanced load of data center networks caused by traditional multipath routing strategies, a dynamic load balancing strategy with flow classification for Fat-Tree topologies, based on the software-defined network (SDN) architecture, is proposed, named DLB-FC. Multi-index evaluation methods using link state information and network traffic characteristics are considered. The DLB-FC mechanism can dynamically adjust the flow classification threshold to differentiate between large and small flows, and selects different forwarding paths to meet the transmission performance requirements of different flow characteristics. On this basis, an SDN simulation platform is built for performance testing. The simulation results show that the DLB-FC algorithm can dynamically distinguish large flows from small flows and achieve load balancing effectively. Compared with the equal-cost multi-path (ECMP), global first fit (GFF), and minimum total delay load routing (MTDLR) algorithms, the DLB-FC scheme effectively improves the network throughput and link utilization of the data center network. Transmission delay is also reduced, with better load balance.
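The dynamic large/small-flow split at the heart of DLB-FC can be sketched as a size threshold that is nudged toward a target "large flow" fraction. The threshold values, the 20% target, and the multiplicative adjustment step are all invented for illustration; the actual DLB-FC mechanism combines multiple link-state and traffic indices.

```python
def classify_flows(flows, threshold_bytes):
    """Split flows into large and small by cumulative byte count."""
    large = [f for f in flows if f["bytes"] >= threshold_bytes]
    small = [f for f in flows if f["bytes"] < threshold_bytes]
    return large, small

def adjust_threshold(threshold, large_ratio, target=0.2, step=0.1):
    """Nudge the split point so roughly `target` of flows count as large."""
    if large_ratio > target:
        return threshold * (1.0 + step)   # too many "large" flows: raise bar
    if large_ratio < target:
        return threshold * (1.0 - step)   # too few: lower bar
    return threshold

# synthetic flow table: sizes cycle through 1 KB, 10 KB, 100 KB, 1 MB
flows = [{"id": i, "bytes": 10 ** (3 + i % 4)} for i in range(20)]
threshold = 10 ** 5
large, small = classify_flows(flows, threshold)
```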
Funding: Sponsored by the Natural Science Foundation of Jiangxi Province (Grant No. 20114BAB211026 and No. 20122BAB201028) and the Open Science Fund of the Key Laboratory of Radioactive Geology and Exploration Technology Fundamental Science for National Defense, East China Institute of Technology (Grant No. 2010RGET11).
文摘Research for detecting or obtaining radionuclide by gamma energy spectrum data acquisition and process system is one of the key issues about intelligent measurement of gamma-ray spectrum. For this reason, a software and hardware implementation schematic design based on ARM ( Advanced RISC Machines) + DSP ( Digital Signal Processor) architecture for gamma energy spectrum data acquisition and processing system is proposed. The paper discusses in detail some key technologies such as communication interface design between microcontroller ARM and digital signal processor DSP,distribution scheduling under multi-task in the ARM-Linux,DSP handling procedures for multi-channel A / D high-speed sample. At the same time,because the traditional Gaussian fitting to determine the boundary of peak is not ideal,it puts forward a weighting factor of Gaussian function least squares fitting realize boundary determined. Finally gamma-spectrum data from sodium iodide NaI( TI) scintillation detector is tested and processed in the new system. The results show that gamma energy spectrum data acquisition and process system is perfect functionality, stable and convergence in unimodal. Compared with data from conventional energy spectrometers,the system can keep better energy resolution in a wide range of pulse pass rate.
文摘In this paper, we present a study on activity functions for an MLNN (multi-layered neural network) and propose a suitable activity function for data enlargement processing. We have carefully studied the training performance of Sigmoid, ReLu, Leaky-ReLu and L & exp. activity functions for few inputs to multiple output training patterns. Our MLNNs model has L hidden layers with two or three inputs to four or six outputs data variations by BP (backpropagation) NN (neural network) training. We focused on the multi teacher training signals to investigate and evaluate the training performance in MLNNs to select the best and good activity function for data enlargement and hence could be applicable for image and signal processing (synaptic divergence) along with the proposed methods with convolution networks. We specifically used four activity functions from which we found out that L & exp. activity function can suite DENN (data enlargement neural network) training since it could give the highest percentage training abilities compared to the other activity functions of Sigmoid, ReLu and Leaky-ReLu during simulation and training of data in the network. And finally, we recommend L & exp. function to be good for MLNNs and may be applicable for signal processing of data and information enlargement because of its performance training characteristics with multiple teacher training patterns using original generated data and hence can be tried with CNN (convolution neural networks) of image processing.
文摘With the increasing variety of application software of meteorological satellite ground system, how to provide reasonable hardware resources and improve the efficiency of software is paid more and more attention. In this paper, a set of software classification method based on software operating characteristics is proposed. The method uses software run-time resource consumption to describe the software running characteristics. Firstly, principal component analysis (PCA) is used to reduce the dimension of software running feature data and to interpret software characteristic information. Then the modified K-means algorithm was used to classify the meteorological data processing software. Finally, it combined with the results of principal component analysis to explain the significance of various types of integrated software operating characteristics. And it is used as the basis for optimizing the allocation of software hardware resources and improving the efficiency of software operation.
文摘The rapid development of urbanization requires land management business should change the former single systematic pattern, and advance to integration of functions and data sharing. In order to meets the requirement, this paper presents a new thinking for land management pattern, and management tools of data center for integration of urban and rural areas. The tools were based on MapGIS, which have made the management of multi-subjects, multi-areas, multi-sources and multi-measurement data possible. The techniques of this system are designed accord with national related standard. Experimental result shows that the tools have obvious technical advantage in land resource business integration management.
Abstract: The increase in computing capacity caused a rapid and sudden increase in the Operational Expenses (OPEX) of data centers. OPEX reduction is a big concern and a key target in modern data centers. In this study, the scalability of the Dynamic Voltage and Frequency Scaling (DVFS) power management technique is studied under multiple different workloads. The environment of this study is a 3-tier data center. We conducted multiple experiments to find the impact of using DVFS on energy reduction under two scheduling techniques, namely Round Robin and Green. We observed that the amount of energy reduction varies with data center load: as the load increases, the energy reduction decreases. Experiments using the Green scheduler showed around an 83% decrease in power consumption when DVFS is enabled and the DC is lightly loaded. When the DC is fully loaded, so that the servers' CPUs are constantly busy with no idle time, the effect of DVFS decreases and stabilizes below 10%. Experiments using the Round Robin scheduler showed less energy saving by DVFS: around 25% at light DC load and less than 5% at heavy DC load. To find the effect of task weight on energy consumption, a set of experiments was conducted with thin and fat tasks, where a thin task has far fewer instructions than a fat task. We observed through simulation that the difference in power reduction between the two task types when using DVFS is less than 1%.
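The load dependence reported above follows from the usual CMOS dynamic-power model: at light load a CPU can run at a reduced voltage/frequency point, while a fully loaded CPU must stay near nominal V and f, so DVFS has little room to save. A minimal sketch, with illustrative coefficients and operating points not taken from the paper:

```python
# Dynamic CMOS power model P = C * V^2 * f. All numbers are illustrative
# assumptions, not measurements from the study.

C = 1.0e-9          # effective switched capacitance (F), illustrative

def dynamic_power(v, f):
    """Dynamic CPU power (W) at supply voltage v (V) and frequency f (Hz)."""
    return C * v * v * f

p_nominal = dynamic_power(1.2, 3.0e9)   # full-speed operating point
p_scaled = dynamic_power(0.9, 1.5e9)    # a DVFS-reduced operating point

saving = 1.0 - p_scaled / p_nominal
print(f"power saving at the scaled point: {saving:.1%}")
```

Because voltage enters squared, lowering V and f together yields roughly cubic power reduction with frequency, which is why lightly loaded scenarios show such large savings.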
Abstract: To provide a scientific management basis for garden planning, project construction, maintenance, and social service, this paper proposes that urban gardening administration sectors construct a gardening information management system. On the basis of a full requirements analysis of the gardening sectors, this paper discusses the key technologies for system construction. It also proposes building the system flexibly and smartly using the secondary development design and runtime environments of a data-center integration development platform. The system greatly helps daily management and plays a very important role in improving the urban ecological environment and investment environment.
Funding: Supported by the project "Romanian Hub for Artificial Intelligence - HRIA", Smart Growth, Digitization and Financial Instruments Program, 2021–2027, MySMIS No. 334906.
Abstract: Objective expertise evaluation of individuals, as a prerequisite stage for team formation, has been a long-term desideratum in large software development companies. With the rapid advancement of machine learning methods, and based on reliable existing data stored in the datasets of project management tools, automating this evaluation process becomes a natural step forward. In this context, our approach focuses on quantifying software developer expertise using metadata from task-tracking systems. To this end, we mathematically formalize two categories of expertise: technology-specific expertise, which denotes the skills required for a particular technology, and general expertise, which encapsulates overall knowledge of the software industry. Afterward, we automatically classify the zones of expertise associated with each task a developer has worked on, using Bidirectional Encoder Representations from Transformers (BERT)-like transformers to handle the unique characteristics of project tool datasets effectively. Finally, our method evaluates the proficiency of each software specialist across already completed projects from both the technology-specific and the general perspective. The method was experimentally validated, yielding promising results.
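The two expertise categories formalized above can be sketched from task-tracker metadata as follows. The field names, effort measure, and data are illustrative assumptions; the paper's actual formalization and its BERT-based zone classifier are not reproduced here.

```python
# Illustrative sketch: technology-specific vs. general expertise computed
# from completed-task metadata. Developers, zones, and story points are
# hypothetical examples, not the paper's dataset.

completed_tasks = [
    # (developer, technology zone, story points)
    ("ana",    "java", 5),
    ("ana",    "java", 8),
    ("ana",    "sql",  3),
    ("bogdan", "java", 2),
]

def technology_expertise(dev, zone, tasks):
    """Technology-specific expertise: effort completed by `dev` in `zone`."""
    return sum(p for d, z, p in tasks if d == dev and z == zone)

def general_expertise(dev, tasks):
    """General expertise: total effort completed by `dev` across all zones."""
    return sum(p for d, z, p in tasks if d == dev)

print(technology_expertise("ana", "java", completed_tasks))
print(general_expertise("ana", completed_tasks))
```

In the full method, the `zone` label for each task would come from the BERT-like classifier applied to the task's textual metadata rather than being recorded directly.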
Funding: Funded by the Science and Technology Project of Tianjin (No. 24YDTPJC00680) and the National Natural Science Foundation of China (No. 52406191).
Abstract: The growth of computing power in data centers (DCs) leads to an increase in the energy consumption and noise pollution of air cooling systems. Chip-level cooling with a high-efficiency coolant is one of the promising ways to address the cooling challenge of high-power devices in DCs. Hybrid nanofluids (HNFs) offer high thermal conductivity and good rheological properties. This study summarizes numerical investigations of HNFs in mini/micro heat sinks, covering the numerical methods, hydrothermal characteristics, and enhanced heat transfer technologies. The contributions of this paper include: (1) the characteristics, applicable conditions, and scenarios of each theoretical and numerical method are clarified; (2) molecular dynamics (MD) simulation can reveal the synergy effects, micro-scale motion, and agglomeration morphology of different nanoparticles, while machine learning (ML) presents a feasible method for parameter prediction, opening the way to intelligent regulation of the thermal performance of HNFs; (3) HNF flow boiling and the synergy of passive and active technologies may further improve the overall efficiency of liquid cooling systems in DCs. This review provides valuable insights and references for exploring the multi-phase flow and heat transport mechanisms of HNFs and for promoting the practical application of HNFs in chip-level liquid cooling in DCs.
Funding: Supported by the National Basic Research Program of China (2012CB720505), the National Natural Science Foundation of China (21276137, 61433001), the Tsinghua University Initiative Scientific Research Program, and the Seventh Framework Programme (FP7-PEOPLE-2013-IRSES-612230) of the European Union.
Abstract: Owing to the wide application of automatic control systems in the process industries, the impact of controller performance on industrial processes is becoming increasingly significant. Consequently, controller maintenance is critical to guaranteeing routine operation of industrial processes. The controller maintenance workflow generally involves the following steps: monitoring operating controller performance and detecting performance degradation, diagnosing the probable root causes of control system malfunctions, and taking specific actions to resolve the associated problems. In this article, a comprehensive overview of mainstream control loop monitoring and diagnosis methods is provided, and some existing problems are analyzed and discussed. From the viewpoint of synthesizing the abundant information available in the context of big data, some prospective ideas and promising methods are outlined to potentially solve problems in industrial applications.
Funding: Financially supported by the Basic Research Funds for the Central Government "Innovative Team of Zhejiang University" under contract number 2022FZZX01-09.
Abstract: The effect of a gradient exhaust strategy and blind plate installation on the inhibition of backflow and thermal stratification in data center cabinets is systematically investigated in this study through numerical methods. The validated Re-Normalization Group (RNG) k-ε turbulence model was used to analyze airflow patterns within cabinet structures equipped with backplane air conditioning. Key findings reveal that server-generated thermal plumes induce hot air accumulation at the cabinet apex, creating a 0.8℃ temperature elevation at the top server's inlet compared to the ideal situation (23℃). Strategic increases in backplane fan exhaust airflow rates reduce server 1's inlet temperature from 26.1℃ (0% redundancy case) to 23.1℃ (40% redundancy case). Gradient exhaust strategies achieve server temperature performance equivalent to uniform exhaust distributions while requiring 25% less redundant airflow. This approach decreases the recirculation ratio from 1.52% (uniform exhaust at 15% redundancy) to 0.57% (gradient exhaust at equivalent redundancy). Comparative analyses demonstrate divergent thermal behaviors: in bottom-server-absent configurations, gradient exhaust reduces top server inlet temperatures by 1.6℃ vs. uniform exhaust, whereas top-server-absent configurations exhibit a 1.8℃ temperature increase under gradient conditions. The blind plate implementation achieves a 0.4℃ top server temperature reduction compared to 15%-redundancy uniform exhaust systems without requiring additional airflow redundancy. Partially populated server arrangements with blind plates maintain thermal characteristics comparable to fully populated cabinets. This study validates gradient exhaust and blind plate technologies as effective countermeasures against cabinet-scale thermal recirculation, providing actionable insights for optimizing backplane air conditioning systems in mission-critical data center environments.
Abstract: This article proposes a framework called BP-M*, which includes: 1) a methodology to analyze, engineer, restructure, and implement business processes, and 2) a process model that extends the process diagram with the specification of the resources that execute the process activities, allocation policies, schedules, activity times, management of the queues feeding the activities, and workloads, so that the same model can be simulated by a discrete-event simulator. The BP-M* framework has been applied to a real case study, a public contact center that provides different typologies of answers to users' requests. The simulation allows different system operating scenarios to be studied ("what-if" analysis), providing useful information for analysts to evaluate restructuring actions.
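The kind of discrete-event simulation such a process model feeds can be sketched with a single-queue contact center served by a pool of agents (the "resources" of the process model). All parameters below are illustrative assumptions, not the case study's figures:

```python
import heapq
import random

# Minimal discrete-event sketch of a contact center: Poisson arrivals,
# exponential handling times, a pool of agents, one FIFO queue.
# All rates are illustrative, not from the BP-M* case study.

random.seed(1)

ARRIVAL_RATE = 1.0 / 2.0      # one request every 2 minutes on average
SERVICE_RATE = 1.0 / 3.0      # 3-minute average handling time
AGENTS = 2
N_REQUESTS = 1000

events = []                   # (time, kind) min-heap of future events
t = 0.0
for _ in range(N_REQUESTS):
    t += random.expovariate(ARRIVAL_RATE)
    heapq.heappush(events, (t, "arrival"))

busy, queue, waits = 0, [], []
while events:
    now, kind = heapq.heappop(events)
    if kind == "arrival":
        if busy < AGENTS:                 # a free agent takes the request
            busy += 1
            waits.append(0.0)
            heapq.heappush(events, (now + random.expovariate(SERVICE_RATE), "done"))
        else:                             # all agents busy: join the queue
            queue.append(now)
    else:                                 # an agent finishes a request
        if queue:
            arrived = queue.pop(0)
            waits.append(now - arrived)
            heapq.heappush(events, (now + random.expovariate(SERVICE_RATE), "done"))
        else:
            busy -= 1

print(f"mean wait: {sum(waits) / len(waits):.2f} min")
```

A "what-if" analysis then amounts to rerunning the simulation with different values of `AGENTS`, schedules, or arrival patterns and comparing the resulting waiting times.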
Abstract: The transmission of scientific data over long distances is required to enable interplanetary science expeditions. Current approaches include transmitting all collected data, or transmitting low-resolution data so that ground controllers can review it and select data for transmission. Model-based data transmission (MBDT) seeks to increase the amount of knowledge conveyed per unit of data transmitted by comparing high-resolution data collected in situ to a pre-existing (or potentially co-transmitted) model. This paper describes the application of MBDT to gravitational data and characterizes its utility and performance. This is done by applying the MBDT technique to gravitational data previously collected for the Earth and comparing the transmission requirements to those of raw data transmission and non-application-aware compression. Transmission reductions of up to 31.8% (without maximum-error thresholding) and up to 97.17% (with maximum-error thresholding) resulted. These levels significantly exceed what is possible with non-application-aware compression.
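The core MBDT idea with maximum-error thresholding can be sketched in a few lines: both ends share a model, and the sender transmits only the samples whose deviation from the model prediction exceeds the threshold. The model, data, and threshold below are illustrative, not the paper's:

```python
# Illustrative MBDT sketch: transmit only (index, value) corrections where
# the in-situ sample deviates from the shared model beyond the threshold.

def mbdt_encode(samples, model, threshold):
    """Return the residual corrections that must be transmitted."""
    return [(i, x) for i, x in enumerate(samples)
            if abs(x - model(i)) > threshold]

def mbdt_decode(corrections, model, n):
    """Reconstruct the field from the shared model plus corrections."""
    out = [model(i) for i in range(n)]
    for i, x in corrections:
        out[i] = x
    return out

model = lambda i: 9.80 + 0.001 * i          # shared gravity model (illustrative)
samples = [9.801, 9.802, 9.950, 9.804]      # in-situ measurements
sent = mbdt_encode(samples, model, threshold=0.01)

reduction = 1 - len(sent) / len(samples)
print(sent)                                  # only the anomalous sample is sent
print(f"transmission reduced by {reduction:.0%}")
```

The threshold directly trades reconstruction error for transmission volume, which is why the thresholded variant reaches much larger reductions than the exact one.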
Abstract: This paper presents a lineament detection method using multi-band remote sensing images. The main objective of this work is to design an automatic image processing tool for lineament mapping from Landsat-7 ETM+ satellite data. Five procedures were involved: 1) principal component analysis; 2) image enhancement using the histogram equalization technique; 3) directional Sobel filtering of the original data; 4) histogram segmentation; and 5) binary image generation. The applied methodology contributed to identifying several known large-scale faults in the northeast of Tunisia. The statistical and spatial analyses of the lineament map indicate differences in the morphological appearance of lineaments in the satellite image. Indeed, all the lineaments present a specific organization: five groups were classified based on three orientations, NE-SW, E-W, and NW-SE. Overlaying the lineament map on the geologic map confirms that these lineaments of diverse directions can be identified and recognized in the field as faults. The identified lineaments were linked to deep faults caused by tectonic movements in Tunisia. This study shows the performance of satellite image processing in the analysis and mapping of the tectonic structures of the northern Atlas.
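Steps 2) and 3) of the pipeline above can be sketched as follows: histogram equalization of one band, then a directional Sobel filter. The tiny synthetic band stands in for a Landsat-7 ETM+ channel; kernel choice and sizes are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch of histogram equalization followed by a directional
# Sobel filter. The synthetic 8x8 "band" is a stand-in for real imagery.

def hist_equalize(band):
    """Spread an 8-bit band over the full 0-255 range via its CDF."""
    hist = np.bincount(band.ravel(), minlength=256)
    cdf = hist.cumsum().astype(float)
    cdf = (cdf - cdf.min()) * 255 / (cdf.max() - cdf.min())
    return cdf[band].astype(np.uint8)

def sobel_horizontal_edges(img):
    """Convolve with the Sobel kernel that responds to E-W lineaments."""
    k = np.array([[-1, -2, -1],
                  [ 0,  0,  0],
                  [ 1,  2,  1]], dtype=float)
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = (k * img[i:i + 3, j:j + 3]).sum()
    return out

band = np.zeros((8, 8), dtype=np.uint8)
band[4:, :] = 120                      # a sharp E-W boundary (a "lineament")
equalized = hist_equalize(band)
edges = sobel_horizontal_edges(equalized.astype(float))
```

Rotating the kernel gives the other directional filters (NE-SW, NW-SE), and thresholding `edges` corresponds to the histogram segmentation and binary image steps.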
Abstract: While the Ordos Basin is recognized for its substantial hydrocarbon exploration prospects, its rugged loess tableland terrain has made seismic exploration exceptionally challenging [1-3]. Persistent obstacles, such as complex 3D survey planning, low signal-to-noise ratio raw data, inadequate near-surface velocity modeling, and imaging inaccuracy, have long hindered the advancement of seismic exploration across this region. Through a problem-solving approach rooted in geological target analysis, this research systematically investigates the behavior of nodal-seismometer-based high-density seismic acquisition on the loess plateau. Tailored advancements in waveform enhancement and depth velocity modeling methodologies have been engineered. Field validations confirm that the optimized workflow delivers marked improvements in amplitude preservation and imaging resolution, offering novel insights for future reservoir characterization endeavors.
Abstract: A data processing method for determining the coefficients of the linearization equations of the TSI 1050 anemometer (produced by Thermo-Systems Inc., USA) fitted with sensors made of domestic hot wire, using the program presented in this paper, is described. Calculation and testing indicate that the error resulting from this method is about 0.5% of full scale, which is less than TSI's. Using this method, the calibration curve can be set up according to the measurement range and the diameter of the hot wire at a given accuracy.
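A hot-wire calibration fit of this kind can be sketched with the classical King's law, E² = A + B·√U, fitted by ordinary least squares. This is an illustrative stand-in: the paper's specific program and coefficients are not public, and synthetic calibration points are used here.

```python
import numpy as np

# Illustrative hot-wire linearization sketch: fit King's law
# E^2 = A + B * sqrt(U) to synthetic calibration points, then invert it.

A_true, B_true = 1.5, 2.0
U = np.linspace(0.5, 10.0, 12)                 # calibration velocities (m/s)
E = np.sqrt(A_true + B_true * np.sqrt(U))      # bridge voltages (noise-free)

# Linear least squares in the transformed variables y = E^2, x = sqrt(U).
B_fit, A_fit = np.polyfit(np.sqrt(U), E ** 2, 1)

def velocity(e):
    """Linearization: invert the fitted King's law to recover velocity."""
    return ((e ** 2 - A_fit) / B_fit) ** 2

print(round(A_fit, 6), round(B_fit, 6))
```

With real, noisy calibration data the fit residuals are what determine the full-scale error figure; a weighted least-squares variant simply scales each residual before minimizing.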
Abstract: This article studies fault recorders in power systems and introduces the COMTRADE format. It uses C++ programming to read the recorded fault data and adopts Fourier analysis and the symmetrical component method to filter the data and extract the fundamental waves. Finally, the effectiveness of the data processing method introduced in this paper is verified with CAAP software.
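The fundamental-wave extraction step can be sketched with a single-bin Fourier correlation: correlating one cycle of samples with sine and cosine at the power frequency yields the fundamental's amplitude and phase while rejecting harmonics. The sample rate and waveform below are illustrative (a 50 Hz system sampled at 1 kHz), not data from an actual COMTRADE record.

```python
import math

# Illustrative full-cycle Fourier extraction of the fundamental component.

N = 20                                     # samples per 50 Hz cycle at 1 kHz

def fundamental(samples):
    """Return (amplitude, phase) of the 50 Hz component of one cycle."""
    re = sum(x * math.cos(2 * math.pi * k / N) for k, x in enumerate(samples))
    im = sum(x * math.sin(2 * math.pi * k / N) for k, x in enumerate(samples))
    return 2 * math.hypot(re, im) / N, math.atan2(im, re)

# Fundamental (100 A) plus a 3rd harmonic (30 A), as seen in fault records.
wave = [100 * math.sin(2 * math.pi * k / N) +
        30 * math.sin(6 * math.pi * k / N) for k in range(N)]

amp, phase = fundamental(wave)
print(round(amp, 6))
```

Applying this to each of the three phase channels gives the phasors from which the symmetrical (positive-, negative-, zero-sequence) components are then computed.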
Funding: Major Program of the National Social Science Foundation of China, No. 21&ZD107.
Abstract: Data centers operate as the physical digital infrastructure for generating, storing, computing, transmitting, and utilizing massive data and information, constituting the backbone of the flourishing digital economy across the world. Given the lack of a consistent analytical approach to studying the locational factors of data centers, and the empirical deficiencies of longitudinal investigations into the spatial dynamics of heterogeneous data centers, this paper develops a comprehensive analytical framework to examine the dynamic geographies and locational factors of techno-environmentally heterogeneous data centers across Chinese cities over the period 2006–2021. First, we develop a "supply-demand-environment trinity" analytical framework, together with an accompanying evaluation indicator system with Chinese characteristics. Second, the dynamic geographies of data centers in Chinese cities over the last decades are characterized by spatial polarization in economically leading urban agglomerations alongside persistent interregional gaps across the eastern, central, and western regions. Data centers present dual spatial expansion trajectories, featuring outward radiation from eastern core urban agglomerations to adjacent peripheries and leapfrog diffusion to strategic central and western digital infrastructure hubs. Third, it is empirically verified that data center construction in Chinese cities over the last decades has been jointly influenced by supply-, demand-, and environment-side locational factors, echoing the efficacy of the trinity analytical framework. Overall, our findings demonstrate the temporal variance, contextual contingency, and attribute-based differentiation of the locational factors underlying techno-environmentally heterogeneous data centers in Chinese cities.
Funding: Supported by the National Key R&D Program of China (Nos. 2022YFF0503203 and 2024YFF0809900), the Research Funds of the Institute of Geophysics, China Earthquake Administration (No. DQJB24X28), and the National Natural Science Foundation of China (Nos. 42474226 and 42441827).
Abstract: The InSight mission has obtained seismic data from Mars, offering new insights into the planet's internal structure and seismic activity. However, the raw data released to the public contain various sources of noise, such as ticks and glitches, which hamper further seismological studies. This paper presents step-by-step processing of InSight's Very Broad Band seismic data, focusing on the suppression and removal of non-seismic noise. The processing stages include tick noise removal, glitch signal suppression, multi-component synchronization, instrument response correction, and rotation of the orthogonal components. The processed datasets and associated codes are openly accessible and will support ongoing efforts to explore the geophysical properties of Mars and contribute to the broader field of planetary seismology.
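The last processing stage listed above, rotating the orthogonal horizontal components, can be sketched as below under one common seismological convention (rotating North/East into Radial/Transverse with respect to an event's back-azimuth). The back-azimuth value and samples are illustrative; the released InSight pipeline and codes are not reproduced here.

```python
import math

# Illustrative NE -> RT rotation under a common convention:
#   R = -N*cos(ba) - E*sin(ba),  T = N*sin(ba) - E*cos(ba)
# where ba is the event back-azimuth measured clockwise from north.

def rotate_ne_to_rt(n, e, back_azimuth_deg):
    """Rotate North/East sample pairs to Radial/Transverse components."""
    ba = math.radians(back_azimuth_deg)
    r = [-ni * math.cos(ba) - ei * math.sin(ba) for ni, ei in zip(n, e)]
    t = [ni * math.sin(ba) - ei * math.cos(ba) for ni, ei in zip(n, e)]
    return r, t

# A pulse arriving exactly from back-azimuth 90 deg (due east) should map
# entirely onto the radial component, leaving the transverse near zero.
n = [0.0, 0.0, 0.0]
e = [0.0, -1.0, 0.0]
r, t = rotate_ne_to_rt(n, e, 90.0)
```

This separation matters physically: P and SV energy appears on the radial (and vertical) components, while SH energy appears on the transverse one.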
Funding: Supported by the National Natural Science Foundation of China (No. 61672270) and the Jiangsu Province Teaching Reform Project for Cloud Computing Technology and Application Talent Training (No. 201802130049).
Abstract: Aiming at problems such as the low throughput and unbalanced load of data center networks caused by traditional multipath routing strategies, a dynamic load balancing strategy with flow classification for Fat-Tree topologies, based on the software defined network (SDN) architecture, is proposed, named DLB-FC. Multi-index evaluation methods based on link state information and network traffic characteristics are considered. The DLB-FC mechanism dynamically adjusts the flow classification threshold to differentiate between large and small flows, and selects different forwarding paths to meet the transmission performance requirements of different flow characteristics. On this basis, an SDN simulation platform is built for performance testing. The simulation results show that the DLB-FC algorithm can dynamically distinguish large flows from small flows and achieve load balancing effectively. Compared with the equal-cost multi-path (ECMP), global first fit (GFF), and minimum total delay load routing (MTDLR) algorithms, the DLB-FC scheme effectively improves the network throughput and link utilization of the data center network. The transmission delay is also reduced, with better load balance.
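The dynamic flow-classification step of such a scheme can be sketched as follows: instead of a fixed elephant/mice threshold, the threshold is derived from the observed flow-size distribution (here, a quantile), and only the flows above it are rerouted onto the least loaded path. Flow sizes, the quantile, the load increment, and path statistics are all illustrative assumptions, not DLB-FC's actual parameters.

```python
# Illustrative sketch of dynamic flow classification and path selection
# in a DLB-FC-style load balancer. All values are hypothetical.

flow_bytes = {"f1": 9_000, "f2": 450_000, "f3": 12_000, "f4": 2_300_000}
path_load = {"p1": 0.62, "p2": 0.35, "p3": 0.48}   # utilization per path

def dynamic_threshold(sizes, quantile=0.75):
    """Pick the threshold so roughly the top quarter of flows count as large."""
    ordered = sorted(sizes)
    return ordered[int(quantile * (len(ordered) - 1))]

def schedule(flows, paths):
    threshold = dynamic_threshold(list(flows.values()))
    placement = {}
    for name, size in flows.items():
        if size > threshold:                    # elephant flow
            best = min(paths, key=paths.get)    # least utilized path
            placement[name] = best
            paths[best] += 0.10                 # account for its added load
        else:                                   # mice flow
            placement[name] = "p1"              # default ECMP-style path
    return threshold, placement

threshold, placement = schedule(flow_bytes, dict(path_load))
print(threshold, placement)
```

Because the threshold tracks the current traffic mix, the classifier adapts as the data center load shifts, which is what static-threshold ECMP-style hashing cannot do.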