Glacier mass balance is a key indicator of glacier health and climate change sensitivity. Its influencing factors include both climatic and non-climatic elements, forming a complex set of drivers. There is a lack of quantitative analysis of these composite factors, particularly in climatically representative regions like the Tanggula Mountains on the central Tibetan Plateau. We collected data on various factors affecting glacier mass balance from 2000 to 2020, including climate variables, topographic variables, geometric parameters, and glacier dynamics. We utilized linear regression models, ensemble learning models, and the Open Global Glacier Model (OGGM) to analyze glacier mass balance changes in the Tanggula Mountains. Results indicate that linear models explain 58% of the variance in glacier mass balance, with seasonal temperature and precipitation having significant impacts. Our findings show that ensemble learning models improved explanatory power by 5.2% by including topographic and geometric factors such as mean glacier elevation, glacier tongue slope, ice flow velocity, and glacier area. Interpretable machine learning identified the spatial distribution of the positive and negative impacts of these characteristics and the interaction between glacier topography and ice dynamics. Finally, we predicted the responses of glaciers of different sizes to future climate change based on the results of interpretable machine learning. We found that relatively large glaciers (>1 km²) are likely to persist until the end of this century under low-emission scenarios, whereas small glaciers (<1 km²) are expected to nearly disappear by 2080 under any emission scenario. Our research provides technical support for improving glacier change modeling and protection on the Tibetan Plateau.
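The linear-model baseline in this abstract reports the share of variance explained (R²). A minimal one-predictor sketch of that computation, with invented values standing in for the study's climate and mass-balance data:

```python
# Ordinary least squares for y = a + b*x and the resulting R^2.
# The numbers below are illustrative, not the study's measurements.

def fit_line(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

def r_squared(xs, ys):
    """Share of variance in ys explained by the fitted line."""
    a, b = fit_line(xs, ys)
    my = sum(ys) / len(ys)
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1.0 - ss_res / ss_tot

# Illustrative: summer temperature anomaly (x) vs. mass balance in m w.e. (y)
temp = [0.1, 0.4, 0.8, 1.1, 1.5]
mb = [-0.2, -0.35, -0.6, -0.7, -1.0]
print(round(r_squared(temp, mb), 3))
```

With multiple predictors the same ss_res/ss_tot definition applies; the paper's 58% corresponds to R² = 0.58 over its full predictor set.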
A local and global context representation learning model for Chinese characters is designed, and a Chinese word segmentation method based on character representations is proposed in this paper. First, the proposed Chinese character learning model uses the semantics of local context and global context to learn the representation of Chinese characters. Then, a Chinese word segmentation model is built with a neural network, and the segmentation model is trained with the character representations as its input features. Finally, experimental results show that the Chinese character representations can effectively capture semantic information: characters with similar semantics cluster together in the visualization space. Moreover, the proposed Chinese word segmentation model also achieves a considerable improvement in precision, recall, and F-measure.
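Character-based segmentation of the kind described here is usually cast as per-character tagging: each character gets a B/M/E/S label (begin/middle/end of a word, or single-character word). A small sketch of producing those training labels from segmented text:

```python
# Convert a word-segmented sentence into per-character BMES tags,
# the standard formulation for character-based Chinese word segmentation.

def bmes_tags(words):
    """Map a list of words to one B/M/E/S tag per character."""
    tags = []
    for w in words:
        if len(w) == 1:
            tags.append("S")            # single-character word
        else:
            tags.append("B")            # word-initial character
            tags.extend("M" * (len(w) - 2))  # word-internal characters
            tags.append("E")            # word-final character
    return tags

print(bmes_tags(["我", "喜欢", "机器学习"]))
# → ['S', 'B', 'E', 'B', 'M', 'M', 'E']
```

The neural segmenter then predicts these tags from the learned character representations of each character's context window.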
In this study, we investigated the variations in warming between Japanese cities for 1960-1989 and 1990-2019 using principal component analysis (PCA) and k-means clustering. Precipitation and sunshine hours exhibited opposite tendencies in the PCA results. The 1960M and 1990M, which denote the mean temperature anomalies in Japanese cities for 1960-1989 and 1990-2019 respectively, were found to be correlated (r = 0.51). There was a strong correlation between temperature and precipitation (r = 0.62). There was an inverse correlation between 1960M and sunshine hours (r = −0.25), but a positive correlation between 1990M and sunshine hours (r = 0.11); sunshine hours thus had less effect on the 1960M but more impact on the 1990M. The k-means clustering of 1960M and 1990M yields four types: high 1960M and high 1990M, indicating that global warming is progressing rapidly (Sapporo, Tokyo, Kyoto, Osaka, Fukuoka, Nagasaki); low 1960M and low 1990M, global warming progressing slowly (Nemuro, Ishinomaki, Yamagata, Niigata, Fushiki, Nagano, Karuizawa, Mito, Suwa, Iida, Hamada, Miyazaki, Naha); low 1960M and high 1990M, global warming accelerating since 1990 (Utsunomiya, Kofu, Okayama, Hiroshima); and normal 1960M and normal 1990M, a normal rate of warming among the 38 cities (Asahikawa, Aomori, Akita, Kanazawa, Maebashi, Matsumoto, Yokohama, Gifu, Nagoya, Hamamatsu, Kochi, Kagoshima). Higher annual temperatures were correlated with higher annual precipitation according to the k-means clustering of temperature and precipitation. Two of the four categories consisted of places with high annual temperatures and high precipitation (Fushiki, Kanazawa, Kochi, Miyazaki, Kagoshima, Naha, Ishigakijima), and places with low annual temperatures and low precipitation (Asahikawa, Nemuro, Sapporo, Karuizawa).
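The clustering step described here groups cities by their (1960M, 1990M) pairs. A stripped-down k-means sketch on invented anomaly pairs (two fast-warming and two slow-warming "cities"):

```python
# Lloyd's k-means on 2-D points: assign each point to the nearest centroid,
# recompute centroids as cluster means, repeat. Fixed initial centroids keep
# the toy example deterministic.

def kmeans(points, centroids, iters=20):
    clusters = [[] for _ in centroids]
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:
            i = min(range(len(centroids)),
                    key=lambda j: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[j])))
            clusters[i].append(p)
        centroids = [
            tuple(sum(col) / len(cl) for col in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

# (1960M, 1990M) anomaly pairs, illustrative only
pts = [(0.9, 1.1), (1.0, 1.2), (0.1, 0.2), (0.2, 0.1)]
cents, clus = kmeans(pts, [(0.0, 0.0), (1.0, 1.0)])
print(sorted(len(c) for c in clus))  # → [2, 2]
```

The study runs the same procedure with k = 4 on the 38 cities' anomaly pairs.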
Accurate reconstruction of understory terrain is essential for environmental monitoring and resource management. This study integrates 1:10,000 Digital Elevation Model, Global Ecosystem Dynamics Investigation (GEDI), and AW3D30 Digital Surface Model data, combined with three machine learning algorithms, Random Forest (RF), Back Propagation Neural Network (BPNN), and Extreme Gradient Boosting (XGBoost), to evaluate the performance of canopy height inversion and understory terrain reconstruction. The analysis emphasizes the impact of topographic and vegetation-related factors on model accuracy. Results reveal that slope is the most influential variable, contributing three to five times more to model performance than other features. In low-slope areas, understory terrain tends to be underestimated, whereas high-slope areas often result in overestimation. Moreover, the Normalized Difference Vegetation Index (NDVI) and land cover types, particularly forests and grasslands, significantly affect prediction accuracy, with model performance showing heightened sensitivity to vegetation characteristics in these regions. Among the models tested, XGBoost demonstrated superior performance, achieving a canopy height bias of −0.06 m, a root mean square error (RMSE) of 4.69 m for canopy height, and an RMSE of 9.82 m for understory terrain. Its ability to capture complex nonlinear relationships and handle high-dimensional data underlines its robustness. While the RF model exhibited strong stability and resistance to noise, its accuracy lagged slightly behind XGBoost. The BPNN model, by contrast, struggled in areas with complex terrain. This study offers valuable insights into feature selection and optimization in remote sensing applications, providing a reference framework for enhancing the accuracy and efficiency of environmental monitoring practices.
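The accuracy figures quoted (bias of −0.06 m, RMSE of 4.69 m) come from paired predictions and reference heights. A minimal sketch of both metrics on toy numbers:

```python
# Bias = mean signed error; RMSE = root of the mean squared error.
# Values are illustrative, not the study's validation data.
import math

def bias(pred, ref):
    return sum(p - r for p, r in zip(pred, ref)) / len(pred)

def rmse(pred, ref):
    return math.sqrt(sum((p - r) ** 2 for p, r in zip(pred, ref)) / len(pred))

pred = [10.2, 15.1, 7.8, 21.0]   # model canopy heights (m), illustrative
ref = [10.0, 15.5, 8.0, 20.5]    # reference heights (m), illustrative
print(round(bias(pred, ref), 3), round(rmse(pred, ref), 3))  # → 0.025 0.35
```

A near-zero bias with a larger RMSE, as XGBoost shows in the study, means the errors are sizeable but roughly symmetric around zero.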
In the contemporary era, globalization is advancing at an unprecedented rate in a multitude of arenas. Globalization has brought us into contact with the cultures, customs, and thinking of countries around the world, and English learning in the context of globalization has changed to some extent. As globalization flourishes, specific, targeted learning rather than systematic learning is what is required.
A novel approach to optimizing any given mathematical function, called the MOdified REinforcement Learning Algorithm (MORELA), is proposed. Although Reinforcement Learning (RL) was primarily developed for solving Markov decision problems, it can be used, with some improvements, to optimize mathematical functions. At the core of MORELA, a sub-environment is generated around the best solution found in the feasible solution space and compared with the original environment. MORELA thus makes it possible to discover the global optimum of a mathematical function, because the optimum is sought around the best solution achieved in the previous learning episode using the sub-environment. The performance of MORELA was tested against results obtained from other optimization methods described in the literature. The results showed that MORELA improved the performance of RL and outperformed many of the optimization methods to which it was compared in terms of the adopted robustness measures.
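MORELA's core idea, as the abstract describes it, is to explore a sub-environment centred on the best solution found so far. A heavily simplified sketch of that loop (plain random search with a shrinking sub-environment, not the full RL machinery) on the sphere function:

```python
# Each episode samples candidates inside a box ("sub-environment") around
# the current best solution; the box shrinks over episodes. This is a toy
# stand-in for MORELA, not the published algorithm.
import random

def sphere(x):
    return sum(v * v for v in x)

def sub_env_search(f, dim=2, episodes=200, radius=1.0, seed=0):
    rng = random.Random(seed)
    best = [rng.uniform(-5, 5) for _ in range(dim)]
    best_f = f(best)
    for _ in range(episodes):
        for _ in range(20):  # candidates per episode
            cand = [b + rng.uniform(-radius, radius) for b in best]
            fc = f(cand)
            if fc < best_f:
                best, best_f = cand, fc
        radius *= 0.98  # shrink the sub-environment each episode
    return best_f

print(sub_env_search(sphere))
```

The printed objective value ends up very close to the global optimum of 0, illustrating why re-centring the search on the previous episode's best solution works.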
Dear Editor, Through distributed machine learning, multi-UAV systems can achieve global optimization goals, such as optimal target tracking, without a centralized server by leveraging local computation and communication with neighbors. In this work, we implement the stochastic gradient descent (SGD) algorithm distributedly to optimize tracking errors based on local state and aggregation of the neighbors' estimates. However, Byzantine agents can mislead neighbors, causing deviations from optimal tracking. We prove that the swarm achieves resilient convergence if the aggregated results lie within the normal neighbors' convex hull, which can be guaranteed by the introduced centerpoint-based aggregation rule. In the given simulated scenarios, distributed learning using average, geometric median (GM), and coordinate-wise median (CM) based aggregation rules fails to track the target. Compared to using the centerpoint aggregation method alone, our approach, which combines a pre-filter with the centerpoint aggregation rule, significantly enhances resilience against Byzantine attacks, achieving faster convergence and smaller tracking errors.
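To see why the choice of aggregation rule matters, compare plain averaging with the coordinate-wise median (one of the baselines the letter evaluates) under a single Byzantine contribution. Note the letter reports that even GM/CM-based rules can fail against coordinated attacks, which motivates its centerpoint rule; this toy example only shows the basic robustness gap:

```python
# One adversarial gradient skews the mean badly; the coordinate-wise
# median stays near the honest values. Gradients here are invented.

def mean_agg(grads):
    return [sum(col) / len(col) for col in zip(*grads)]

def cm_agg(grads):
    """Coordinate-wise median of a list of gradient vectors."""
    out = []
    for col in zip(*grads):
        s = sorted(col)
        n = len(s)
        out.append(s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2)
    return out

honest = [[1.0, 2.0], [1.1, 2.1], [0.9, 1.9]]
byzantine = [[100.0, -100.0]]       # adversarial contribution
grads = honest + byzantine
print(mean_agg(grads))   # dragged far from the honest gradients
print(cm_agg(grads))     # close to the honest gradients
```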
Accessible communication based on sign language recognition (SLR) is key to emergency medical assistance for the hearing-impaired community. Balancing the capture of both local and global information in SLR for emergency medicine poses a significant challenge. To address this, we propose a novel approach based on the inter-learning of visual features between global and local information. Specifically, our method enhances the perception capabilities of the visual feature extractor by strategically leveraging the strengths of convolutional neural networks (CNNs), which are adept at capturing local features, and vision transformers, which excel at perceiving global features. Furthermore, to mitigate the overfitting caused by the limited availability of sign language data for emergency medical applications, we introduce an enhanced short temporal module for data augmentation through additional subsequences. Experimental results on three publicly available sign language datasets demonstrate the efficacy of the proposed approach.
Global learning professional competencies (GLPCs) are essential for college students to be able to address the impact of globalization in the 21st century. Organizations and society at large look to higher education to prepare college students with GLPCs. In addition, there is a body of literature that suggests personal tacit knowledge (P-TK) enhances GLPCs. However, researchers have done little from an empirical perspective to determine the relationship between the use of P-TK and the enhancement of GLPCs, hence the purpose of this study. The statistical results revealed significant correlations, suggesting that college students can be prepared for the 21st-century knowledge society through the use of P-TK.
Introduction: Human papillomavirus (HPV) vaccination is a cornerstone of cervical cancer prevention, particularly in low- and middle-income countries (LMICs), where the burden of disease remains high [1]. The World Health Organization (WHO) HPV Vaccine Introduction Clearing House reported that 147 countries (of 194 reporting) had fully introduced the HPV vaccine into their national schedules as of 2024 [2]. After COVID-19 pandemic disruptions, global coverage is again increasing.
Climate change is a controversial topic of debate, especially in the US, where many do not believe in anthropogenic climate change. Because its consequences are predicted to be dire, such as a mass ocean extinction and frequent extreme weather events, it is important to learn what causes the warming in order to better combat it. In this study, the first challenge is how to construct reliable statistical models based on massive climate data spanning 800,000 years and accurately capture the relationship between temperature and potential factors such as concentrations of carbon dioxide (CO2), nitrous oxide (N2O), and methane (CH4). We compared the performance of several mainstream machine learning algorithms on our data, including linear regression, lasso, support vector regression, and random forest, to build a state-of-the-art model that verifies the warming of the Earth and identifies the factors contributing to global warming. We found that random forest outperforms the other algorithms in creating accurate climate models that use features, including concentrations of different greenhouse gases, to precisely forecast global atmospheric temperature. The further challenge of identifying factor importance can be met by the built-in feature-importance measure of the ensemble tree-based random forest algorithm. We found that CO2 is the largest contributor to temperature change, followed by CH4 and then N2O. All of them had some impact, though, meaning their release into the atmosphere should all be controlled to help restrain temperature increase and prevent climate change's potential ramifications.
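Before fitting any model, the temperature-greenhouse-gas relationship the study targets can be checked with a plain Pearson correlation. A sketch on toy values standing in for the ice-core records:

```python
# Pearson correlation coefficient between two series.
# The CO2/temperature values below are illustrative, not the 800,000-year data.
import math

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

co2 = [180, 200, 240, 280, 300]        # ppm, illustrative
temp = [-8.0, -6.5, -4.0, -1.0, 0.5]   # temperature anomaly in °C, illustrative
print(round(pearson(co2, temp), 3))
```

A strong positive correlation of this kind is what the nonlinear models (random forest, SVR) then refine into an actual forecast.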
Unmanned Aerial Vehicles (UAVs) are widely used and meet many demands in military and civilian fields. With the continuous enrichment and extensive expansion of application scenarios, the safety of UAVs is constantly being challenged. To address this challenge, we propose algorithms that detect anomalies in data collected from drones to improve drone safety. We deployed a one-class kernel extreme learning machine (OCKELM) to detect anomalies in drone data. By default, OCKELM uses the radial basis function (RBF) kernel as its kernel function. To improve the performance of OCKELM, we choose a Triangular Global Alignment Kernel (TGAK) instead of an RBF kernel and introduce the Fast Independent Component Analysis (FastICA) algorithm to reconstruct UAV data. Based on these improvements, we create a novel anomaly detection strategy, FastICA-TGAK-OCKELM. The method is validated on the UCI dataset and tested on the Aeronautical Laboratory Failures and Anomalies (ALFA) dataset. The experimental results show that, compared with other methods, the accuracy of this method is improved by more than 30%, and point anomalies are effectively detected.
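The baseline kernel that OCKELM uses is the RBF kernel, k(x, y) = exp(−γ‖x − y‖²); the paper replaces it with a triangular global alignment kernel, which is more involved and not sketched here. A minimal RBF implementation for reference:

```python
# RBF (Gaussian) kernel between two vectors. Identical inputs give 1.0;
# similarity decays with squared Euclidean distance.
import math

def rbf(x, y, gamma=0.5):
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

print(rbf([1.0, 2.0], [1.0, 2.0]))                 # → 1.0
print(round(rbf([0.0, 0.0], [1.0, 1.0]), 4))       # → 0.3679
```

Kernel methods like OCKELM only ever touch the data through such pairwise similarities, which is why swapping the kernel (RBF for TGAK) changes behavior without changing the learning machinery.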
With the globalization of English, the macro and micro cultures of the users of English around the world interact intensively. Under these conditions, the interface between local and global culture seems an important issue that needs to be clarified in the materials and books used for learning English. Thus, the focus of this study was to explore the language learning policy of the new Iranian English course book for high schools, Prospect 1, recently published and taught for a year in Iran, in light of globalization and culture. This qualitative study was conducted through semi-structured interviews. The participants were 30 teachers from the Ministry of Education who had a year's experience of teaching Prospect 1; they were mostly chosen from Mashhad, with the rest from other cities of Khorasan province, Iran. The interview contained four main questions posed to the teachers. The findings indicate that the language learning policy of Iran needs to pay more attention to learners' intercultural communicative competence, because it mainly attempts to teach the English language with a focus on the home culture in the Iranian context. The article ends with some pedagogical implications and recommendations for further research.
This paper studies the division of labor and economic development under global value chains in North-South trade by investigating the changes in production hours and cost per unit alongside growing output and trade value in several industries in the U.S., since the U.S. is at the leading position in the division of labor in global value chains. The empirical evidence reveals that the more international outsourcing there is, the more detailed the division of labor becomes, and an industry's unit production time and production cost show a clearer declining trend year by year. This is consistent with the fact that global value chains and outsourcing play increasingly important roles in the international division of labor and economic growth in both developed and developing countries, and it helps explain the integration of the workforce across countries in global value chains.
In recent years, multi-label learning has received a lot of attention. However, most existing methods consider only global label correlation or only local label correlation. In fact, on the one hand, both global and local label correlations can appear in real-world situations at the same time; on the other hand, we should not be limited to pairwise labels while ignoring high-order label correlations. In this paper, we propose a novel and effective method called GLLCBN for multi-label learning. First, we obtain the global label correlation by exploiting label semantic similarity. Then, we analyze the pairwise labels in the label space of the data set to acquire the local correlation. Next, we build the original version of the label dependency model from the global and local label correlations. After that, we use graph theory, probability theory, and Bayesian networks to eliminate redundant dependency structure in the initial model, so as to obtain the optimal label dependency model. Finally, we obtain the feature extraction model by adapting the Inception-V3 convolutional neural network and combine it with the GLLCBN model to achieve multi-label learning. The experimental results show that our proposed model performs better than other multi-label learning methods in the evaluation.
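The "local pairwise correlation" step amounts to estimating how often one label appears given another, from the binary label matrix. A minimal sketch (toy matrix, not the paper's data):

```python
# Estimate P(label j present | label i present) from co-occurrence counts
# in a binary label matrix (rows = samples, columns = labels).

def cond_prob(labels, i, j):
    with_i = [row for row in labels if row[i] == 1]
    if not with_i:
        return 0.0
    return sum(row[j] for row in with_i) / len(with_i)

Y = [
    [1, 1, 0],
    [1, 1, 0],
    [1, 0, 1],
    [0, 0, 1],
]
print(cond_prob(Y, 0, 1))  # how often label 1 co-occurs with label 0
```

Conditional probabilities of this form are exactly the quantities a Bayesian network over labels encodes, with the graph-pruning step removing edges whose dependencies are redundant.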
The safety assessment of high-level radioactive waste repositories requires high predictive accuracy for radionuclide diffusion and a comprehensive understanding of the diffusion mechanism. In this study, a through-diffusion method and six machine-learning methods were employed to investigate the diffusion of ReO₄⁻, HCrO₄⁻, and I⁻ in saturated compacted bentonite under different salinities and compacted dry densities. The machine-learning models were trained using two datasets. One dataset contained six input features and 293 instances obtained from the diffusion database system of the Japan Atomic Energy Agency (JAEA-DDB) and 15 publications. The other dataset, comprising 15,000 pseudo-instances, was produced using a multi-porosity model and contained eight input features. The results indicate that the former dataset yielded higher predictive accuracy than the latter. Light gradient boosting exhibited higher prediction accuracy (R² = 0.92) and lower error (MSE = 0.01) than the other machine-learning algorithms. In addition, Shapley Additive Explanations, Feature Importance, and Partial Dependence Plot analyses indicate that the rock capacity factor and compacted dry density had the two most significant effects on predicting the effective diffusion coefficient, thereby offering valuable insights.
Building an automatic fish recognition and detection system for large-scale fish classes is helpful for marine researchers and marine scientists because there are large numbers of fish species. However, it is quite difficult to build such systems owing to data imbalance problems and the large number of classes. To solve these issues, we propose a transfer learning-based technique in which we use EfficientNet, pre-trained on the ImageNet dataset and fine-tuned on the QUT Fish dataset, which is a large-scale dataset. Furthermore, prior to the activation layer, we use Global Average Pooling (GAP) instead of a dense layer, with the aim of averaging the prediction results while retaining more information than a dense layer. To check the validity of our model, we validate it on the validation set, where it achieves satisfactory results. For the localization task, we propose an architecture that consists of a localization-aware block, which captures localization information for better prediction, and residual connections to handle the overfitting problem; the residual connections help the layers combine missing information with the relevant information. In addition, we use class weights and Focal Loss (FL) to handle class imbalance and reduce false predictions; class weights assign larger weights to classes with fewer instances and smaller weights to classes with more instances. During localization, the assessment shows that we achieve 57% mean Intersection over Union (IoU) on the testing data, and the classification results show 75% precision, 70% recall, 78% accuracy, and a 74% F1-score for 468 fish species.
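Focal loss, used here against class imbalance, is FL(p_t) = −α(1 − p_t)^γ log(p_t): it down-weights examples the model already classifies confidently. A scalar sketch:

```python
# Focal loss for a single example, where p_t is the model's probability
# for the true class. alpha and gamma follow the common defaults.
import math

def focal_loss(p_t, alpha=0.25, gamma=2.0):
    return -alpha * (1 - p_t) ** gamma * math.log(p_t)

# An easy example (p_t = 0.9) contributes far less than a hard one (p_t = 0.1):
easy = focal_loss(0.9)
hard = focal_loss(0.1)
print(hard > 100 * easy)  # → True
```

With γ = 0 and α = 1 this reduces to ordinary cross-entropy; raising γ pushes training effort toward the rare, hard classes.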
Smart Grid (SG) is a power system development concept that has received significant national attention. SG relies on real-time data with specific communication requirements, and strong monitoring and control capabilities are essential to system stability. One of the most critical needs for smart-grid execution is fast, precise, and economically synchronized measurement, made feasible by Phasor Measurement Units (PMUs). PMUs can provide synchronized measurements and dynamically measure voltage and current phasors. PMUs utilize GPS time-stamping at Coordinated Universal Time (UTC) to capture electric phasors with great accuracy and precision. This research draws on deep learning (DL) advances to design a Residual Network (ResNet) model that can accurately identify and classify faults in grid-connected systems. For fault detection and diagnosis, the proposed strategy uses a ResNet-50 technique to evaluate real-time measurement data from geographically scattered PMUs. As a result of its excellent signal classification efficiency and ability to extract high-quality signal features, its fault diagnosis performance is excellent. Our results demonstrate that the proposed method is effective in detecting and classifying faults in a timely manner. The proposed approach classifies the fault type with a precision of 98.5% and an accuracy of 99.1%. Long short-term memory (LSTM), Convolutional Neural Network (CNN), and CNN-LSTM algorithms are applied for comparison, and real-world data are used to evaluate these networks.
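The phasors a PMU reports (magnitude and angle of a sampled waveform) come down to a single-frequency DFT over one fundamental cycle. A sketch, assuming the input is exactly one cycle of samples:

```python
# Estimate the fundamental phasor (peak magnitude, angle in degrees) from
# one cycle of waveform samples via a single-bin DFT. The 2/n scaling
# yields peak amplitude; an RMS convention would divide by sqrt(2).
import cmath
import math

def phasor(samples):
    n = len(samples)
    s = sum(x * cmath.exp(-2j * math.pi * k / n)
            for k, x in enumerate(samples)) * 2 / n
    return abs(s), math.degrees(cmath.phase(s))

# 64 samples of cos(wt + 30°), amplitude 1.0
n = 64
wave = [math.cos(2 * math.pi * k / n + math.radians(30)) for k in range(n)]
mag, ang = phasor(wave)
print(round(mag, 3), round(ang, 1))  # → 1.0 30.0
```

Real PMUs additionally GPS-align the sampling window to UTC so that phasor angles from geographically scattered units are directly comparable, which is what makes the ResNet's multi-PMU input meaningful.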
Due to the drastic increase in data generation, it is tedious to examine and derive high-level knowledge from the data. The rising trend of high-dimensional data gathering and problem representation necessitates a feature selection (FS) process in many machine learning pipelines. The FS procedure poses a commonly encountered global combinatorial optimization problem: it can lessen the number of features by removing unwanted and repetitive data. In this regard, this article introduces an improved harmony search based global optimization for feature selection with optimal deep learning (IHSFS-ODL) enabled classification model. The proposed IHSFS-ODL technique intends to reduce the curse of dimensionality and enhance classification outcomes. In addition, the IHSFS-ODL technique derives an IHSFS technique by combining a local search method with the traditional harmony search algorithm (HSA) for global optimization. Besides, an ODL-based classifier comprising quantum-behaved particle swarm optimization (QPSO) with a gated recurrent unit (GRU) is applied for the data classification process. The use of the HSA for feature selection and the QPSO algorithm for hyperparameter tuning helps accomplish maximum classification performance. To demonstrate the enhanced outcomes of the IHSFS-ODL technique, a series of simulations were carried out, and the results demonstrated its superiority over recent state-of-the-art approaches.
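The harmony search loop the paper builds on keeps a memory of candidate solutions, composes new ones mostly from memory (rate HMCR) with occasional pitch adjustment (rate PAR), and replaces the worst member when the new harmony improves on it. A hedged continuous-domain sketch, with the sphere function standing in for the feature-selection objective:

```python
# Basic harmony search minimizing f. Parameter names (hmcr, par, bw) follow
# the standard HSA; the local-search refinement of the paper's improved
# IHSFS variant is noted but not implemented.
import random

def harmony_search(f, dim=2, hm_size=10, iters=2000,
                   hmcr=0.9, par=0.3, bw=0.1, lo=-5.0, hi=5.0, seed=1):
    rng = random.Random(seed)
    memory = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(hm_size)]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if rng.random() < hmcr:          # draw this dimension from memory
                v = rng.choice(memory)[d]
                if rng.random() < par:        # pitch adjustment
                    v += rng.uniform(-bw, bw)
            else:                             # random consideration
                v = rng.uniform(lo, hi)
            new.append(min(hi, max(lo, v)))
        worst = max(range(hm_size), key=lambda i: f(memory[i]))
        if f(new) < f(memory[worst]):
            memory[worst] = new
        # (the improved IHSFS would apply a local search step here)
    return min(f(m) for m in memory)

print(harmony_search(lambda x: sum(v * v for v in x)))
```

For feature selection the same loop runs over binary vectors (feature included or not) with the classifier's validation error as f.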
Funding: National Key Research and Development Program of China (2023YFC3206300); Gansu Provincial Science and Technology Program (22ZD6FA005); Gansu Youth Science and Technology Fund (E4310103); Gansu Postdoctoral Science Foundation (E339880112); Tibet Science and Technology Program (XZ202301ZY0001G and XZ202401JD0007).
Funding: Supported by the National Natural Science Foundation of China (Nos. 61303179, U1135005, 61175020).
Funding: funded by the National Key Research and Development Program (Grant No. 2023YFE0207900).
Abstract: Accurate reconstruction of understory terrain is essential for environmental monitoring and resource management. This study integrates 1:10,000 Digital Elevation Model, Global Ecosystem Dynamics Investigation (GEDI), and AW3D30 Digital Surface Model data, combined with three machine learning algorithms, Random Forest (RF), Back Propagation Neural Network (BPNN), and Extreme Gradient Boosting (XGBoost), to evaluate the performance of canopy height inversion and understory terrain reconstruction. The analysis emphasizes the impact of topographic and vegetation-related factors on model accuracy. Results reveal that slope is the most influential variable, contributing three to five times more to model performance than other features. In low-slope areas, understory terrain tends to be underestimated, whereas high-slope areas often result in overestimation. Moreover, the Normalized Difference Vegetation Index (NDVI) and land cover types, particularly forests and grasslands, significantly affect prediction accuracy, with model performance showing heightened sensitivity to vegetation characteristics in these regions. Among the models tested, XGBoost demonstrated superior performance, achieving a canopy height bias of −0.06 m, a root mean square error (RMSE) of 4.69 m for canopy height, and an RMSE of 9.82 m for understory terrain. Its ability to capture complex nonlinear relationships and handle high-dimensional data underlines its robustness. While the RF model exhibited strong stability and resistance to noise, its accuracy lagged slightly behind XGBoost. The BPNN model, by contrast, struggled in areas with complex terrain. This study offers valuable insights into feature selection and optimization in remote sensing applications, providing a reference framework for enhancing the accuracy and efficiency of environmental monitoring practices.
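The slope-dependent error pattern described above (underestimation on gentle slopes, overestimation on steep ones) can be checked by binning residuals by slope. The sketch below computes mean bias per slope bin on toy records; the record layout and threshold are assumptions, not the study's protocol.

```python
def bias_by_slope(records, threshold=15.0):
    """Group (predicted_elevation, true_elevation, slope_deg) records
    into low/high-slope bins and report mean bias (pred - true) per bin,
    mirroring a slope-dependent error analysis (toy data)."""
    bins = {"low": [], "high": []}
    for pred, true, slope in records:
        key = "low" if slope < threshold else "high"
        bins[key].append(pred - true)
    return {k: sum(v) / len(v) for k, v in bins.items() if v}

# Hypothetical records: terrain underestimated on gentle slopes,
# overestimated on steep ones (values in meters and degrees).
demo = [(98.0, 100.0, 5.0), (99.0, 100.5, 8.0),
        (205.0, 200.0, 30.0), (310.0, 304.0, 40.0)]
print(bias_by_slope(demo))
```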
Abstract: In the contemporary era, globalization is advancing at an unprecedented rate across a multitude of arenas. Globalization has brought us into contact with the cultures, customs, and ways of thinking of countries around the world, and English learning in the context of globalization has changed to some extent. As globalization flourishes, specific, purpose-driven learning rather than systematic learning is what is needed.
Abstract: A novel approach to optimizing any given mathematical function, called the MOdified REinforcement Learning Algorithm (MORELA), is proposed. Although Reinforcement Learning (RL) was primarily developed for solving Markov decision problems, it can, with some improvements, be used to optimize mathematical functions. At the core of MORELA, a sub-environment is generated around the best solution found in the feasible solution space and compared with the original environment. MORELA thus makes it possible to discover the global optimum of a mathematical function, because the search is concentrated around the best solution achieved in the previous learning episode using the sub-environment. The performance of MORELA was tested against results obtained from other optimization methods described in the literature. The results showed that MORELA improved the performance of RL and performed better than many of the optimization methods to which it was compared in terms of the adopted robustness measures.
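The core idea, searching a shrinking sub-environment around the episode's best solution alongside the original bounds, can be sketched as a simple iterated random search. This is an illustrative simplification, not MORELA itself; the shrink factor and sampling split are assumptions.

```python
import random

def sub_env_search(f, lo, hi, episodes=60, samples=20, seed=1):
    """Sketch of the sub-environment idea: each episode samples both the
    full environment and a shrinking interval around the best solution
    found so far, keeping whichever candidate minimizes f."""
    rng = random.Random(seed)
    best_x = rng.uniform(lo, hi)
    best_y = f(best_x)
    width = (hi - lo) / 2
    for _ in range(episodes):
        for _ in range(samples):
            # half the samples explore globally, half exploit the sub-environment
            if rng.random() < 0.5:
                x = rng.uniform(lo, hi)
            else:
                x = min(hi, max(lo, best_x + rng.uniform(-width, width)))
            y = f(x)
            if y < best_y:
                best_x, best_y = x, y
        width *= 0.9   # tighten the sub-environment after each episode
    return best_x, best_y

# Minimize a simple quadratic with minimum at x = 3.
x, y = sub_env_search(lambda x: (x - 3.0) ** 2, -10, 10)
```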
Funding: supported by the Guangdong Major Project of Basic and Applied Basic Research (2023B0303000009), the Guangdong Basic and Applied Basic Research Foundation (2024A1515030153, 2025A1515011587), the Project of the Department of Education of Guangdong Province (2023ZDZX4046), the Shenzhen Natural Science Fund (Stable Support Plan Program 20231122121608001), and the Ningbo Municipal Science and Technology Bureau (ZX2024000604).
Abstract: Dear Editor, Through distributed machine learning, multi-UAV systems can achieve global optimization goals, such as optimal target tracking, without a centralized server by leveraging local computation and communication with neighbors. In this work, we implement the stochastic gradient descent (SGD) algorithm in a distributed manner to optimize tracking errors based on local state and aggregation of the neighbors' estimates. However, Byzantine agents can mislead neighbors, causing deviations from optimal tracking. We prove that the swarm achieves resilient convergence if the aggregated results lie within the normal neighbors' convex hull, which can be guaranteed by the introduced centerpoint-based aggregation rule. In the given simulated scenarios, distributed learning using average, geometric median (GM), and coordinate-wise median (CM) based aggregation rules fails to track the target. Compared to using the centerpoint aggregation method alone, our approach, which combines a pre-filter with the centroid aggregation rule, significantly enhances resilience against Byzantine attacks, achieving faster convergence and smaller tracking errors.
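One of the baseline rules mentioned above, coordinate-wise median (CM) aggregation, is easy to contrast with plain averaging: a single Byzantine estimate drags the mean far from the target but barely moves the per-coordinate median. The agent estimates below are toy numbers, not the letter's simulation data.

```python
from statistics import mean, median

def cm_aggregate(estimates):
    """Coordinate-wise median (CM) aggregation: each agent contributes an
    estimate vector, and the median is taken independently per coordinate."""
    return [median(coord) for coord in zip(*estimates)]

# Three honest agents near the true target (5.0, 5.0) plus one
# Byzantine agent reporting an extreme value (toy numbers).
est = [[4.9, 5.1], [5.0, 5.0], [5.1, 4.9], [100.0, -100.0]]
print(cm_aggregate(est))                 # median stays near the target
print([mean(coord) for coord in zip(*est)])  # the mean is dragged away
```

Note that CM alone is not sufficient in the letter's scenarios; the convex-hull guarantee is what the centerpoint-based rule provides.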
Funding: supported by the National Natural Science Foundation of China (No. 62376197), the Tianjin Science and Technology Program (No. 23JCYBJC00360), and the Tianjin Health Research Project (No. TJWJ2025MS045).
Abstract: Accessible communication based on sign language recognition (SLR) is key to emergency medical assistance for the hearing-impaired community. Balancing the capture of local and global information in SLR for emergency medicine poses a significant challenge. To address this, we propose a novel approach based on the inter-learning of visual features between global and local information. Specifically, our method enhances the perception capability of the visual feature extractor by strategically combining the strengths of convolutional neural networks (CNNs), which are adept at capturing local features, and vision transformers, which perform well at perceiving global features. Furthermore, to mitigate overfitting caused by the limited availability of sign language data for emergency medical applications, we introduce an enhanced short temporal module for data augmentation through additional subsequences. Experimental results on three publicly available sign language datasets demonstrate the efficacy of the proposed approach.
Abstract: Global learning professional competencies (GLPCs) are essential for college students to be able to address the impact of globalization in the 21st century. Organizations and society at large look to higher education to prepare college students with GLPCs. In addition, there is a body of literature suggesting that personal tacit knowledge (P-T K) enhances GLPCs. However, researchers have done little from an empirical perspective to determine the relationship between the use of P-T K and the enhancement of GLPCs, hence the purpose of this study. The statistical results revealed significant correlations, suggesting that college students can be better prepared for the 21st century knowledge society through the use of P-T K.
Abstract: Introduction: Human papillomavirus (HPV) vaccination is a cornerstone of cervical cancer prevention, particularly in low- and middle-income countries (LMICs), where the burden of disease remains high [1]. The World Health Organization (WHO) HPV Vaccine Introduction Clearing House reported that 147 countries (of 194 reporting) had fully introduced the HPV vaccine into their national schedules as of 2024 [2]. After COVID-19 pandemic disruptions, global coverage is again increasing.
Abstract: Climate change is a controversial topic of debate, especially in the US, where many do not believe in anthropogenic climate change. Because its predicted consequences are dire, such as a mass ocean extinction and frequent extreme weather events, it is important to learn what causes the warming in order to combat it better. In this study, the first challenge is how to construct reliable statistical models from massive climate data spanning 800,000 years and accurately capture the relationship between temperature and potential factors such as concentrations of carbon dioxide (CO2), nitrous oxide (N2O), and methane (CH4). We compared the performance of several mainstream machine learning algorithms on our data, including linear regression, lasso, support vector regression, and random forest, to build a state-of-the-art model for verifying the warming of the earth and identifying factors contributing to global warming. We found that random forest outperforms the other algorithms, creating accurate climate models that use features such as the concentrations of different greenhouse gases to forecast global temperature. The second challenge, identifying factor importance, can be met by the feature-importance capability of the ensemble tree-based random forest algorithm. It was found that CO2 is the largest contributor to temperature change, followed by CH4 and then N2O. All three had some impact, however, meaning their release into the atmosphere should be controlled to help restrain temperature increase and prevent climate change's potential ramifications.
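The factor-ranking step can be illustrated without a tree ensemble: ranking features by absolute Pearson correlation with temperature is a much cruder stand-in for random forest feature importance, but it shows the shape of the analysis. The data below are synthetic (generated so the first feature dominates, mimicking the reported CO2 > CH4 > N2O ordering), not ice-core measurements.

```python
import math
import random

def pearson(xs, ys):
    """Sample Pearson correlation coefficient between two sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Synthetic records: temperature responds most strongly to the first
# feature, then the second, then the third (illustrative coefficients).
rng = random.Random(0)
co2 = [rng.uniform(180, 280) for _ in range(500)]
ch4 = [rng.uniform(350, 700) for _ in range(500)]
n2o = [rng.uniform(200, 300) for _ in range(500)]
temp = [0.05 * c + 0.005 * m + 0.002 * n + rng.gauss(0, 0.5)
        for c, m, n in zip(co2, ch4, n2o)]

scores = {"CO2": abs(pearson(co2, temp)),
          "CH4": abs(pearson(ch4, temp)),
          "N2O": abs(pearson(n2o, temp))}
ranking = sorted(scores, key=scores.get, reverse=True)
print(ranking)
```

Unlike this linear proxy, random forest importance also captures nonlinear and interaction effects, which is one reason it outperformed the linear models in the study.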
Funding: supported by the Natural Science Foundation of the Jiangsu Higher Education Institutions of China (Grant No. 19JKB520031).
Abstract: Unmanned Aerial Vehicles (UAVs) are widely used and meet many demands in military and civilian fields. With the continuous enrichment and extensive expansion of application scenarios, the safety of UAVs is constantly being challenged. To address this challenge, we propose algorithms that detect anomalies in data collected from drones to improve drone safety. We deployed a one-class kernel extreme learning machine (OCKELM) to detect anomalies in drone data. By default, OCKELM uses the radial basis function (RBF) kernel as the kernel function of the model. To improve the performance of OCKELM, we choose a Triangular Global Alignment Kernel (TGAK) instead of an RBF kernel and introduce the Fast Independent Component Analysis (FastICA) algorithm to reconstruct UAV data. Based on these improvements, we create a novel anomaly detection strategy, FastICA-TGAK-OCKELM. The method is validated on the UCI dataset and then applied to the Aeronautical Laboratory Failures and Anomalies (ALFA) dataset. The experimental results show that, compared with other methods, the accuracy of this method is improved by more than 30%, and point anomalies are effectively detected.
Abstract: With the globalization of English, the macro and micro cultures of English users around the world interact intensively. Under these conditions, the local and global cultural interface is an important issue that needs to be clarified in the materials and books used for learning English. Thus, the focus of this study was to explore the language learning policy of the new Iranian English course book for high schools, Prospect 1, recently published and taught for a year in Iran, in light of globalization and culture. This qualitative study was conducted through semi-structured interviews. The participants were 30 teachers from the Ministry of Education who had the experience of teaching Prospect 1 for a year; they were mostly chosen from Mashhad and the rest from other cities of Khorasan province, Iran. The interview contained four main questions posed to the teachers. The findings indicate that the language learning policy of Iran needs to pay more attention to learners' intercultural communicative competence, because it mainly attempts to teach the English language with a focus on the home culture in the Iranian context. The article ends with some pedagogical implications and further recommendations for future research.
Abstract: This paper studies the division of labor and economic development under global value chains in North-South trade, mainly by investigating changes in production hours and cost per unit as output and trade value increase in several U.S. industries, because the U.S. is at the leading position in the division of labor by global value chains. The empirical evidence reveals that with more international outsourcing there is a more detailed division of labor, and industry unit production time and production cost show a clearer declining trend year by year. This is consistent with global value chains and outsourcing playing increasingly important roles in the international division of labor and economic growth in both developed and developing countries, and it helps explain the integration of the workforce across countries in global value chains.
Abstract: In recent years, multi-label learning has received a lot of attention. However, most existing methods consider only global label correlation or only local label correlation. In fact, both global and local label correlations can appear in real-world situations at the same time, and we should not be limited to pairwise labels while ignoring higher-order label correlations. In this paper, we propose a novel and effective method called GLLCBN for multi-label learning. First, we obtain the global label correlation by exploiting label semantic similarity. Then, we analyze pairwise labels in the label space of the data set to acquire the local correlation. Next, we build the original version of the label dependency model from the global and local label correlations. After that, we use graph theory, probability theory, and Bayesian networks to eliminate redundant dependency structure in the initial model, so as to obtain the optimal label dependency model. Finally, we obtain the feature extraction model by adjusting the Inception V3 convolutional neural network and combine it with the GLLCBN model to achieve multi-label learning. The experimental results show that the proposed model performs better than other multi-label learning methods on the evaluation metrics.
基金the Key Program of National Natural Science Foundation of China(No.12335008),the Postgraduate Research and Innovation Project of Huzhou University(No.2023KYCX62)the Scientific Research Fund of Zhejiang Provincial Education Department(No.Y202352712)the Huzhou science and technology planning project(No.2021GZ60)。
Abstract: The safety assessment of high-level radioactive waste repositories requires high predictive accuracy for radionuclide diffusion and a comprehensive understanding of the diffusion mechanism. In this study, a through-diffusion method and six machine-learning methods were employed to investigate the diffusion of ReO_(4)^(−), HCrO_(4)^(−), and I^(−) in saturated compacted bentonite under different salinities and compacted dry densities. The machine-learning models were trained using two datasets. One dataset contained six input features and 293 instances obtained from the diffusion database system of the Japan Atomic Energy Agency (JAEA-DDB) and 15 publications. The other dataset, comprising 15,000 pseudo-instances, was produced using a multi-porosity model and contained eight input features. The results indicate that the former dataset yielded higher predictive accuracy than the latter. Light gradient boosting exhibited higher prediction accuracy (R^2 = 0.92) and lower error (MSE = 0.01) than the other machine-learning algorithms. In addition, Shapley Additive Explanations, Feature Importance, and Partial Dependence Plot analyses indicate that the rock capacity factor and compacted dry density had the two most significant effects on predicting the effective diffusion coefficient, thereby offering valuable insights.
Funding: Zamil S. Alzamil would like to thank the Deanship of Scientific Research at Majmaah University for supporting this work under Project No. R-2022-172.
Abstract: Building an automatic fish recognition and detection system covering large numbers of fish classes is helpful for marine researchers and marine scientists because there are many fish species. However, it is quite difficult to build such systems owing to the lack of data, class imbalance problems, and the large number of classes. To solve these issues, we propose a transfer learning-based technique in which we use EfficientNet, pre-trained on the ImageNet dataset and fine-tuned on the QUT Fish Database, a large-scale dataset. Furthermore, prior to the activation layer, we use Global Average Pooling (GAP) instead of a dense layer, with the aim of averaging the prediction results while retaining more information than a dense layer. To check the validity of our model, we validate it on the validation set, where it achieves satisfactory results. For the localization task, we propose an architecture that consists of a localization-aware block, which captures localization information for better prediction, and residual connections to handle the overfitting problem; the residual connections help layers combine missing information with relevant features. In addition, we use class weights and Focal Loss (FL) to handle class imbalance and reduce false predictions: class weights assign smaller weights to classes with fewer instances and larger weights to classes with more instances. For localization, the qualitative assessment shows that we achieve 57% Mean Intersection over Union (IoU) on test data, and the classification results show 75% precision, 70% recall, 78% accuracy, and a 74% F1-score for 468 fish species.
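The focal loss mentioned above down-weights well-classified examples so training concentrates on hard, rare classes. A scalar sketch of the standard formulation, FL(p) = -alpha * (1 - p)^gamma * log(p) for a positive example with predicted probability p, makes the damping factor visible (gamma and alpha values below are the common defaults, not necessarily the paper's settings):

```python
import math

def focal_loss(p, gamma=2.0, alpha=0.25):
    """Focal loss for a positive example with predicted probability p.
    The (1 - p)**gamma factor shrinks the loss of confident, correct
    predictions (toy scalar version; frameworks apply it per logit)."""
    return -alpha * (1.0 - p) ** gamma * math.log(p)

def weighted_cross_entropy(p, alpha=0.25):
    """Plain alpha-weighted cross entropy, for comparison."""
    return -alpha * math.log(p)

easy, hard = 0.95, 0.30
# The easy example's loss is damped by (1-0.95)^2 = 0.0025,
# the hard example's only by (1-0.30)^2 = 0.49.
print(focal_loss(easy) / weighted_cross_entropy(easy))
print(focal_loss(hard) / weighted_cross_entropy(hard))
```

With gamma = 0 the focal loss reduces to the weighted cross entropy, which is one way to sanity-check an implementation.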
Abstract: Smart Grid (SG) is a power system development concept that has received significant national attention. SG relies on real-time data with specific communication requirements, and strong capabilities for monitoring and controlling the grid are essential to system stability. One of the most critical needs for smart-grid operation is fast, precise, and economically synchronized measurement, made feasible by Phasor Measurement Units (PMUs). PMUs provide synchronized measurements and dynamically measure voltage and current phasors, using GPS time-stamping at Coordinated Universal Time (UTC) to capture electric phasors with great accuracy and precision. This research leverages Deep Learning (DL) advances to design a Residual Network (ResNet) model that can accurately identify and classify defects in grid-connected systems. For fault detection and diagnosis, the proposed strategy uses a ResNet-50 technique to evaluate real-time measurement data from geographically scattered PMUs. As a result of its excellent signal classification efficiency and ability to extract high-quality signal features, its fault diagnosis performance is excellent. Our results demonstrate that the proposed method is effective in detecting and classifying faults in sufficient time, classifying the fault type with a precision of 98.5% and an accuracy of 99.1%. Long short-term memory (LSTM), Convolutional Neural Network (CNN), and CNN-LSTM algorithms are applied for comparison, and real-world data are used to evaluate these networks.
基金This work was funded by the Deanship of Scientific Research(DSR),King Abdulaziz University,Jeddah,under Grant No.(D-914-611-1443).
Abstract: Due to the drastic increase in data generation, it is tedious to examine data and derive high-level knowledge from them. The rising trends of high-dimensional data gathering and problem representation necessitate a feature selection (FS) process in many machine learning workflows. Feature selection poses a commonly encountered problem of global combinatorial optimization, and the FS process can lessen the number of features by removing unwanted and repetitive data. In this regard, this article introduces an improved harmony search based global optimization for feature selection with optimal deep learning (IHSFS-ODL) enabled classification model. The proposed IHSFS-ODL technique intends to reduce the curse of dimensionality and enhance classification outcomes. The IHSFS-ODL technique derives an IHSFS method by using a local search method with the traditional harmony search algorithm (HSA) for global optimization. Besides, an ODL-based classifier comprising quantum-behaved particle swarm optimization (QPSO) with a gated recurrent unit (GRU) is applied for the data classification process. The use of HSA for feature selection and the QPSO algorithm for hyperparameter tuning helps accomplish maximum classification performance. To demonstrate the enhanced outcomes of the IHSFS-ODL technique, a series of simulations were carried out, and the results demonstrated improvements over recent state-of-the-art approaches.
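The traditional harmony search algorithm (HSA) that the IHSFS method builds on composes each new candidate note-by-note from a harmony memory, with occasional pitch adjustment and random consideration. The sketch below is a minimal continuous-optimization HSA, not the paper's improved variant (no local search step, and applied to a test function rather than feature selection); parameter values are conventional defaults.

```python
import random

def harmony_search(f, lo, hi, dim=2, hms=10, hmcr=0.9, par=0.3,
                   bw=0.1, iters=3000, seed=42):
    """Minimal harmony search for function minimization.
    hms: harmony memory size; hmcr: memory considering rate;
    par: pitch adjusting rate; bw: pitch-adjustment bandwidth."""
    rng = random.Random(seed)
    hm = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    scores = [f(x) for x in hm]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if rng.random() < hmcr:
                v = hm[rng.randrange(hms)][d]      # pick note from memory
                if rng.random() < par:
                    v += rng.uniform(-bw, bw)      # pitch adjustment
            else:
                v = rng.uniform(lo, hi)            # random consideration
            new.append(min(hi, max(lo, v)))
        y = f(new)
        worst = max(range(hms), key=lambda i: scores[i])
        if y < scores[worst]:                      # replace worst harmony
            hm[worst], scores[worst] = new, y
    best = min(range(hms), key=lambda i: scores[i])
    return hm[best], scores[best]

# Sphere function: global minimum 0 at the origin.
x, y = harmony_search(lambda v: sum(t * t for t in v), -5, 5)
```

For feature selection, the same loop would operate on binary inclusion vectors scored by classifier performance instead of continuous coordinates.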