Cyber-Physical Systems (CPS) represent an integration of computational and physical elements, revolutionizing industries by enabling real-time monitoring, control, and optimization. A complementary technology, the Digital Twin (DT), acts as a virtual replica of physical assets or processes, facilitating better decision making through simulations and predictive analytics. CPS and DT underpin the evolution of Industry 4.0 by bridging the physical and digital domains. This survey explores their synergy, highlighting how DT enriches CPS with dynamic modeling, real-time data integration, and advanced simulation capabilities. The layered architecture of DTs within CPS is examined, showcasing the enabling technologies and tools vital for seamless integration. The study addresses key challenges in CPS modeling, such as concurrency and communication, and underscores the importance of DT in overcoming these obstacles. Applications in various sectors are analyzed, including smart manufacturing, healthcare, and urban planning, emphasizing the transformative potential of CPS-DT integration. In addition, the review identifies gaps in existing methodologies and proposes future research directions for developing comprehensive, scalable, and secure CPS-DT systems. By synthesizing insights from the current literature and presenting a taxonomy of CPS and DT, this survey serves as a foundational reference for academics and practitioners. The findings stress the need for unified frameworks that align CPS and DT with emerging technologies, fostering innovation and efficiency in the digital transformation era.
Background: Zonal application maps are designed to represent field variability using key variables that can be translated into tailored management practices. For cotton, zonal maps for crop growth regulator (CGR) applications under variable-rate (VR) strategies are commonly based exclusively on variability in vegetation indices (VIs). However, VIs often saturate in dense crop vegetation, limiting their effectiveness in distinguishing variability in crop growth. This study aimed to compare unsupervised framework (UF) and supervised framework (SUF) approaches for generating zonal application maps for CGR under VR conditions. During the 2022-2023 agricultural season, a UF was employed to generate zonal maps based on locally collected field data on cotton plant height, satellite imagery, soil texture, and phenology data. Subsequently, a SUF (based on historical data from the 2020-2021 through 2022-2023 agricultural seasons) was developed to predict plant height using remote sensing and phenology data, aiming to replicate the same zonal maps without relying on direct field measurements of plant height. Both approaches were tested in three fields and on two different dates per field. Results: The SUF predictive model for plant height performed well, as indicated by the model metrics. However, when comparing zonal application maps for specific field-date combinations, the predicted plant height exhibited lower variability than the field measurements. This led to variable compatibility between the SUF maps, which utilized the model predictions, and the UF maps, which were based on real field data. Fields characterized by more pronounced soil texture variability yielded the highest compatibility between the zonal application maps produced by the two approaches, predominantly because plant development patterns were estimated more consistently in these heterogeneous field environments. While a VR application approach can facilitate product savings during the application operation, other key factors must be considered, including the availability of the specialized machinery required for this type of application and the operational costs inherent in applying a single CGR product, which differs from typical uniform-rate applications that often integrate multiple inputs. Conclusion: Predictive modeling shows promise for assisting in the creation of zonal application maps for VR CGR applications. The SUF approach, based on plant height prediction, demonstrated potential for supporting such maps; however, the degree to which it captures the actual variability in crop growth observed in the field may vary, necessitating field-by-field evaluation.
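For readers who want to prototype the unsupervised framework's zoning step, the sketch below clusters co-registered per-pixel layers into management zones with k-means and ranks the zones by mean plant height. The feature choice, zone count, and synthetic data are illustrative assumptions, not the paper's exact UF pipeline.

```python
# Minimal sketch of unsupervised zone delineation, assuming per-pixel
# features are already co-registered on a common grid. Synthetic data
# stands in for real NDVI, soil-texture, and plant-height layers.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_pixels = 10_000
ndvi = rng.uniform(0.2, 0.9, n_pixels)          # vegetation index (assumed)
clay_pct = rng.uniform(10, 60, n_pixels)        # soil texture proxy (assumed)
plant_height = rng.uniform(40, 120, n_pixels)   # cm, field-measured in the UF

X = StandardScaler().fit_transform(
    np.column_stack([ndvi, clay_pct, plant_height]))

# Three zones (low/medium/high growth) is an illustrative choice.
zones = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Rank zones by mean plant height so class 0 = shortest crop, which
# would receive the lowest growth-regulator rate.
order = np.argsort([plant_height[zones == k].mean() for k in range(3)])
rate_class = np.empty_like(zones)
for rank, k in enumerate(order):
    rate_class[zones == k] = rank
print(np.bincount(rate_class))
```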
Healthcare systems nowadays depend on IoT sensors for sending data over the internet as a common practice. Encryption of medical images is very important to secure patient information, but encrypting these images consumes a lot of time on edge computing devices; therefore, using an auto-encoder for compression before encoding solves this problem. In this paper, we use an auto-encoder to compress a medical image before encryption, and the encrypted output (a vector) is sent over the network. On the receiving side, a decoder reproduces the original image after the vector is received and decrypted. Two convolutional neural networks were constructed to evaluate the proposed approach: the first is the auto-encoder, which is utilized to compress and encrypt the images, and the other assesses the classification accuracy of the image after decryption and decoding. Different hyperparameters of the encoder were tested, followed by classification of the image to verify that no critical information was lost and to test the encryption and encoding resolution. Sixteen hyperparameter permutations were evaluated, of which three main cases are discussed in detail. The first case shows that the combination of Mean Squared Logarithmic Error (MSLE), Adagrad, two layers for the auto-encoder, and ReLU gave the best auto-encoder results, with a Mean Absolute Error (MAE) of 0.221 after 50 epochs and 75% classification accuracy, the best result for the classification algorithm. The second case shows the reflection of auto-encoder results on the classification results: the combination of Mean Squared Error (MSE), RMSprop, three layers for the auto-encoder, and ReLU reached a classification accuracy of 65%, with the auto-encoder giving an MAE of 0.31 after 50 epochs. The third case is the worst: the combination of hinge loss, RMSprop, three layers for the auto-encoder, and ReLU, yielding 20% accuracy and an MAE of 0.485.
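The best-performing configuration reported above (MSLE loss, Adagrad, a two-layer encoder, ReLU) is easy to sketch; the code below assumes grayscale images flattened to vectors, and the image size, layer widths, and synthetic data are illustrative assumptions rather than the paper's setup.

```python
# Minimal dense auto-encoder sketch matching the reported best case:
# MSLE loss, Adagrad optimizer, two encoder layers, ReLU activations.
# Input size and layer widths are illustrative assumptions.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

img_dim = 64 * 64                      # assumed flattened grayscale image
inputs = keras.Input(shape=(img_dim,))
h = layers.Dense(256, activation="relu")(inputs)
code = layers.Dense(64, activation="relu")(h)   # compressed vector to encrypt
h = layers.Dense(256, activation="relu")(code)
outputs = layers.Dense(img_dim, activation="sigmoid")(h)

autoencoder = keras.Model(inputs, outputs)
encoder = keras.Model(inputs, code)    # produces the vector sent over the network
autoencoder.compile(optimizer="adagrad", loss="msle", metrics=["mae"])

# Synthetic stand-in for normalized medical images.
x = np.random.rand(512, img_dim).astype("float32")
autoencoder.fit(x, x, epochs=5, batch_size=32, verbose=0)

compressed = encoder.predict(x[:1])    # this vector would be encrypted and sent
print(compressed.shape)
```

In the paper's pipeline it is this compressed vector, not the full image, that is encrypted before transmission, and a matching decoder reconstructs the image after decryption.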
Social Edge Service (SES) is an emerging mechanism in Social Internet of Things (SIoT) orchestration for effective, user-centric, reliable communication and computation. These services are affected by active and/or passive attacks such as replay attacks and message tampering, because devices share the same spectrum, as well as by inadequate trust measurement methods among intelligent devices (roadside units, mobile edge devices, servers) during computing and content sharing. These issues lead to computation and communication overhead on servers and computation nodes. To address them, we propose the HybridgrAph-Deep-learning (HAD) approach, which secures communication and computation in two stages. First, an Adaptive Trust Weight (ATW) model with relation-based feedback fusion analysis estimates the fitness-priority of every node based on directed graph theory, detecting malicious nodes and reducing computation and communication overhead. Second, a Quotient User-centric Coeval-Learning (QUCL) mechanism formulates secure channel selection, with a Nash equilibrium method optimizing the communication used to share data over edge devices. The simulation results confirm that the proposed approach achieves effective communication and computation performance, and greater SES reliability, compared with state-of-the-art approaches.
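The ATW model itself is not specified in the abstract; as a rough illustration of feedback-based trust scoring on a directed graph, the sketch below averages the feedback carried by each node's incoming edges and flags low-trust nodes. The threshold, prior, and interaction data are all assumptions, not the paper's model.

```python
# Illustrative feedback-fusion trust scoring on a directed graph.
# This is NOT the paper's ATW model, only a generic stand-in:
# trust(v) = mean of the feedback carried by v's incoming edges.
import networkx as nx

G = nx.DiGraph()
# (reporter, subject, feedback in [0, 1]) -- synthetic interactions
edges = [("rsu1", "edge1", 0.9), ("rsu2", "edge1", 0.8),
         ("rsu1", "edge2", 0.2), ("server", "edge2", 0.1),
         ("edge1", "server", 0.95)]
for u, v, fb in edges:
    G.add_edge(u, v, feedback=fb)

def trust(G, node):
    fbs = [d["feedback"] for _, _, d in G.in_edges(node, data=True)]
    return sum(fbs) / len(fbs) if fbs else 0.5   # neutral prior (assumed)

THRESHOLD = 0.4                                   # assumed cut-off
for node in G.nodes:
    t = trust(G, node)
    print(f"{node}: trust={t:.2f}", "SUSPECT" if t < THRESHOLD else "")
```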
Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) can show subtle lesion morphology, improve the display of lesion definitions, and objectively reflect the blood supply of breast tumors; it can also reflect the different enhancement patterns of normal tissues and lesion areas after medical tracer injection. DCE-MRI has therefore become an important basis for the clinical diagnosis of breast cancer. In this paper, a series of in-silico computational methods for lesion segmentation and breast tumor identification were applied to DCE-MRI data acquired from several hospitals across multiple provinces. The image segmentation methods include Otsu segmentation of subtraction images, a signal-interference-ratio segmentation method, and an improved variational level set method, each of which has its own application scope. After segmentation, the distribution of benign and malignant tissue in the lesion region is identified based on three-time-point theory. The experiments show that this analysis of DCE-MRI data can reveal the distribution of benign and malignant tissue in the lesion region, provide substantial help to clinicians in diagnosing breast cancer more expediently, and lay a basis for medical diagnosis and treatment planning.
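As one concrete piece of the segmentation toolbox, the sketch below applies Otsu thresholding to a subtraction image (post-contrast minus pre-contrast), which highlights enhancing lesion tissue. The synthetic frames are stand-ins for real co-registered DCE-MRI data.

```python
# Otsu segmentation of a DCE-MRI subtraction image: subtract the
# pre-contrast frame from a post-contrast frame, then threshold.
# Synthetic arrays stand in for real co-registered MRI frames.
import numpy as np
from skimage.filters import threshold_otsu

rng = np.random.default_rng(1)
pre = rng.normal(100, 5, (128, 128))          # pre-contrast frame
post = pre.copy()
post[40:60, 50:80] += 60                      # simulated enhancing lesion
post += rng.normal(0, 5, post.shape)

subtraction = np.clip(post - pre, 0, None)    # enhancement map
t = threshold_otsu(subtraction)
lesion_mask = subtraction > t
print(f"threshold={t:.1f}, lesion pixels={lesion_mask.sum()}")
```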
Classification of edge-on galaxies is important to astronomical studies because our own Milky Way is an edge-on galaxy. Edge-on galaxies pose a classification problem due to their lower overall brightness and smaller pixel counts. In the current work, a novel technique for the classification of edge-on galaxies has been developed, based on the mathematical treatment of galaxy brightness data extracted from images. A special treatment of galaxy brightness data is developed to enhance faint galaxies and eliminate the adverse effects of high-brightness backgrounds as well as of bright background stars. A novel slimness weighting factor is developed to classify edge-on galaxies based on their slimness. The technique can be optimized for different catalogs with different brightness levels. In the current work, it is optimized for the EFIGI catalog and trained using a set of 1800 galaxies from that catalog. Upon classification of the full set of 4458 galaxies from the EFIGI catalog, an accuracy of 97.5% is achieved, with an average processing time of about 0.26 seconds per galaxy on an average laptop.
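The paper's slimness weighting factor is not given in the abstract; as an illustration of the underlying idea, the sketch below estimates a galaxy's apparent axis ratio from the second-order moments of its thresholded brightness map, treating a small ratio as "slim" (edge-on-like). The threshold and cut-off are assumptions, not the published factor.

```python
# Illustrative slimness estimate from image moments: the ratio of the
# minor to major axis of the brightness distribution. This is a generic
# stand-in, not the paper's slimness weighting factor.
import numpy as np

def axis_ratio(image, background=0.0):
    ys, xs = np.nonzero(image > background)
    w = image[ys, xs]
    cov = np.cov(np.vstack([xs, ys]), aweights=w)   # brightness-weighted moments
    evals = np.sort(np.linalg.eigvalsh(cov))
    return np.sqrt(evals[0] / evals[1])             # b/a in (0, 1]

# Synthetic elongated "galaxy": a thin bright diagonal stripe.
img = np.zeros((64, 64))
for i in range(10, 54):
    img[i, i - 2:i + 2] = 1.0

ratio = axis_ratio(img)
print(f"axis ratio b/a = {ratio:.2f}", "-> edge-on" if ratio < 0.3 else "")
```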
A Mobile Ad Hoc Network (MANET) is a self-governing network of mobile nodes without any wired links. Since each node can move in an ad hoc manner, such a network requires routing protocols that can adapt to dynamically changing topologies. Numerous protocols have been proposed for this purpose; however, the trajectories followed by individual nodes have not been distinctly dealt with. This paper presents a meticulous study of the QoS parameters of proactive (OLSR) and reactive (DSR) MANET protocols for uniform as well as dissimilar trajectories of individual nodes in a small network of about 20 nodes. Partial node failures are also examined for both protocols. The performance metrics used in this study are average throughput and average delay, and the OPNET modeler is used throughout. The assessment shows that for uniform trajectories, OLSR has almost the same average delay as DSR but a higher average throughput. It is also seen that non-uniform trajectories deliver a much higher average throughput than uniform trajectories. Node failures reduce only average throughput, whereas average delay remains unchanged.
Artificial Intelligence (AI) is finding increasing application in healthcare monitoring. Machine learning systems are utilized to monitor patient health through IoT sensors, which track the physiological state by way of various health data, so early detection of any disease or derangement can aid doctors in saving patients' lives. However, there are challenges associated with predicting health status using common algorithms, such as time requirements, chances of error, and improper classification. We propose an Artificial Krill Herd based on Random Forest (AKHRF) technique for monitoring patients' health and eliciting an optimal prescription based on their health status. To begin with, various patient datasets were collected through IoT sensors and used to train the system. The resulting framework comprises four processes: preprocessing, feature extraction, classification, and result visibility. Preprocessing removes errors, noise, and missing values from the dataset, whereas feature extraction extracts the relevant information. Then, in the classification layer, we update the fitness function of the krill herd to classify the patient's health status and also generate a prescription. We found that the results from the proposed framework are comparable to those of other state-of-the-art techniques in terms of sensitivity, specificity, Area Under the Curve (AUC), accuracy, precision, recall, and F-measure.
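The krill-herd fitness update is specific to the paper; as a hedged sketch of the surrounding pipeline (preprocessing that removes missing values, then classification of health status), the code below imputes missing sensor readings and trains a random forest. The feature set, labels, and data are synthetic assumptions, and the metaheuristic stage is omitted.

```python
# Sketch of the preprocessing -> classification stages around AKHRF:
# impute missing IoT sensor readings, then classify health status with
# a random forest. The krill-herd fitness optimization is omitted.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.impute import SimpleImputer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.normal(size=(500, 6))             # e.g., HR, SpO2, temperature (assumed)
X[rng.random(X.shape) < 0.05] = np.nan    # simulate dropped IoT readings
y = (np.nan_to_num(X[:, 0]) > 0).astype(int)   # synthetic "at risk" label

clf = Pipeline([
    ("impute", SimpleImputer(strategy="median")),            # preprocessing
    ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
])
print(cross_val_score(clf, X, y, cv=5).mean())
```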
Intensive agricultural practices have undeniably reduced soil fertility and crop productivity. Furthermore, alkaline calcareous soils represent a significant challenge for agricultural production, particularly for durum wheat, which is vital for ensuring food security. It is therefore essential to explore new cereal management strategies to maintain food production and promote crop sustainability. The application of soil microorganisms, particularly plant growth-promoting rhizobacteria (PGPR), as inoculants to enhance crop production is a growing area of interest. This study investigates the effects of the rhizobacteria Paenibacillus polymyxa SGH1 and SGK2, applied both individually and in combination, on the growth and productivity of durum wheat in alkaline calcareous soil. We conducted field experiments over two growing seasons using a randomized complete block design with three blocks and four treatments: non-inoculated wheat grains (T0), inoculation with the P. polymyxa SGH1 strain (T1), inoculation with the P. polymyxa SGK2 strain (T2), and co-inoculation with both strains (T3). The results clearly showed that SGH1 and SGK2 inoculation improved the morphometric characteristics of wheat plants, with co-inoculation of both strains inducing the most pronounced improvements relative to T0 in collar diameter (+16.9%), tillers per plant (+89.8%), and SA/RA ratio (+35.5%). Co-inoculation was also the most effective treatment for improving wheat grain yield (+41.1% in season I and +16.6% in season II). In addition, T3 significantly increased the grain starch content (+220%). T1 produced the highest grain protein content in both seasons (9.5% in season I and 9.66% DW in season II). This study demonstrated that bacterial inoculation and co-inoculation strategies can significantly enhance wheat productivity and grain quality in alkaline calcareous soils while at the same time reducing the ecological footprint of agriculture.
1 Introduction to Multimodal Learning in Image Processing. Image processing (IP), a classical research domain in computer application technology, has been studied for decades. It is one of the most important research directions in computer vision and underlies many current hotspots such as intelligent transportation, education, and industry. Because image processing is the strongest link between AI (artificial intelligence) and real-world applications, it has remained a challenging research field throughout the development of AI, from DNNs (deep neural networks), attention mechanisms, and LSTMs (long short-term memory networks) to Transformer-, Diffusion-, and Mamba-based generative AI (GAI) models, e.g., GPT and Sora [1]. Today, the descriptive ability of single-modality features limits the performance of image processing; a more comprehensive description of the image is required to match the computational capacity of current large-scale models.
A network intrusion detection system is critical for cyber security against illegitimate attacks. From a feature perspective, network traffic may include a variety of elements such as attack reference, attack type, subcategory of attack, host information, malicious scripts, etc. From a network perspective, traffic may contain an imbalanced number of harmful attacks compared to normal traffic. Identifying a specific attack is therefore challenging due to complex features and data imbalance. To address these issues, this paper proposes an Intrusion Detection System using transformer-based transfer learning for Imbalanced Network Traffic (IDS-INT). IDS-INT uses transformer-based transfer learning to learn feature interactions in both the network feature representation and the imbalanced data. First, detailed information about each type of attack is gathered from network interaction descriptions, which include network nodes, attack type, reference, host information, etc. Second, the transformer-based transfer learning approach is developed to learn detailed feature representations using their semantic anchors. Third, the Synthetic Minority Oversampling Technique (SMOTE) is applied to balance abnormal traffic and detect minority attacks. Fourth, a Convolutional Neural Network (CNN) model is designed to extract deep features from the balanced network traffic. Finally, a hybrid CNN-Long Short-Term Memory (CNN-LSTM) model is developed to detect different types of attacks from the deep features. Detailed experiments test the proposed approach on three standard datasets, i.e., UNSW-NB15, CIC-IDS2017, and NSL-KDD. An explainable AI approach is implemented to interpret the proposed method and develop a trustable model.
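A minimal sketch of the balancing and detection stages (SMOTE followed by a CNN-LSTM) appears below, assuming tabular flow features reshaped into a short sequence; the feature count, layer sizes, and synthetic data are illustrative assumptions, not the IDS-INT configuration.

```python
# Sketch of SMOTE balancing followed by a CNN-LSTM detector.
# Synthetic data stands in for preprocessed flow features; the layer
# sizes are illustrative, not the IDS-INT architecture.
import numpy as np
from imblearn.over_sampling import SMOTE
from tensorflow import keras
from tensorflow.keras import layers

rng = np.random.default_rng(3)
X = rng.normal(size=(2000, 20))                 # 20 flow features (assumed)
y = (rng.random(2000) < 0.05).astype(int)       # 5% attack traffic: imbalanced

X_bal, y_bal = SMOTE(random_state=0).fit_resample(X, y)
X_seq = X_bal.reshape(-1, 20, 1)                # treat features as a sequence

model = keras.Sequential([
    layers.Input(shape=(20, 1)),
    layers.Conv1D(32, 3, activation="relu"),    # deep feature extraction
    layers.MaxPooling1D(2),
    layers.LSTM(32),                            # sequential modeling
    layers.Dense(1, activation="sigmoid"),      # attack vs. normal
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(X_seq, y_bal, epochs=3, batch_size=64, verbose=0)
```

The paper detects multiple attack types rather than a binary label; the binary head here is only to keep the sketch self-contained.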
In this paper, we investigate the growth of solutions of higher-order linear differential equations with meromorphic coefficients. Under certain conditions, we obtain precise estimates of the order and hyper-order of growth of solutions of the equation.
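For context, equations in this line of work typically take the form below, with growth measured by the standard order and hyper-order from Nevanlinna theory; the paper's precise hypotheses on the coefficients are not reproduced in the abstract, so this block records only the standard setup.

```latex
% Higher-order linear ODE with meromorphic coefficients A_j(z):
f^{(k)} + A_{k-1}(z) f^{(k-1)} + \cdots + A_1(z) f' + A_0(z) f = 0 .

% Growth of a meromorphic solution f is measured by its order and
% hyper-order, defined via the Nevanlinna characteristic T(r, f):
\sigma(f)   = \limsup_{r \to \infty} \frac{\log T(r, f)}{\log r}, \qquad
\sigma_2(f) = \limsup_{r \to \infty} \frac{\log \log T(r, f)}{\log r}.
```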
Parkinson's disease (PD), classified as a neurological syndrome, affects the brain and leads to motor and non-motor symptoms. Among the motor symptoms, one of the major disabling symptoms is Freezing of Gait (FoG), which affects the daily standard of living of PD patients. Available treatments aim to improve the symptoms of PD, but detecting PD at the early stages is an arduous task because patients can be indistinguishable from healthy individuals. This work proposes a novel attention-based model for detecting FoG events and PD, and for measuring the intensity of PD on the Unified Parkinson's Disease Rating Scale (UPDRS). Two separate datasets, the Daphnet Freezing of Gait dataset and the PhysioNet Gait in Parkinson's Disease dataset, were used for training and validation on their respective problems. The results show a definite rise in the various performance metrics when compared to landmark models on these problems and datasets: accuracy of 98.74% for FoG detection, 98.72% for PD detection, and 98.05% for measuring PD intensity on the UPDRS. The model was also analyzed for robustness against noisy samples, where it likewise exhibited consistent performance. These results strongly suggest that the proposed attention-based deep learning model provides a consistent, efficient, and better-performing classification method for the selected problems.
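A minimal sketch of an attention-based sequence classifier of the general kind described appears below, assuming windowed accelerometer sequences; the window length, channel count, and layer sizes are illustrative assumptions, not the paper's model.

```python
# Sketch of attention pooling over a gait sensor sequence: a recurrent
# encoder produces per-timestep states, a learned softmax weighting
# pools them, and a dense head classifies FoG vs. normal walking.
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

T, C = 128, 9                            # window length, channels (assumed)
inputs = keras.Input(shape=(T, C))
h = layers.LSTM(32, return_sequences=True)(inputs)   # (batch, T, 32)

score = layers.Dense(1)(h)               # unnormalized attention scores
weights = layers.Softmax(axis=1)(score)  # attention over time steps
context = layers.Lambda(
    lambda args: tf.reduce_sum(args[0] * args[1], axis=1))([h, weights])

outputs = layers.Dense(1, activation="sigmoid")(context)
model = keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

x = np.random.rand(64, T, C).astype("float32")       # synthetic windows
y = np.random.randint(0, 2, 64)
model.fit(x, y, epochs=2, verbose=0)
```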
Dynamic analysis of malware allows us to examine malware samples and then group those samples into families based on observed behavior. Using Boolean variables to represent the presence or absence of a range of malware behaviors, we create a bitstring that represents each sample behaviorally, and then group samples into the same class if they exhibit the same behavior. Combining class definitions with malware discovery dates, we can construct a timeline showing the emergence date of each class, in order to examine the prevalence, complexity, and longevity of each class. We find that certain behavior classes are more prevalent than others, following a frequency power law. Some classes have had lower longevity, indicating that their attack profile is no longer manifested by new variants of malware, while others, of greater longevity, continue to affect new computer systems. We verify for the first time commonly held intuitions on malware evolution, showing quantitatively from the archaeological record that over 80% of the time, classes of higher malware complexity emerged later than classes of lower complexity. In addition to providing historical perspective on malware evolution, the methods described in this paper may aid malware detection through classification, leading to new proactive methods to identify malicious software.
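The grouping step is straightforward to make concrete: represent each sample as a Boolean behavior vector, use the resulting bitstring as the class key, and take the earliest discovery date per class for the timeline. The behavior names, sample names, and dates below are invented for illustration, and counting set bits is only one simple complexity proxy.

```python
# Behavioral classes as bitstrings: samples sharing the same Boolean
# behavior vector fall into one class; the class emergence date is the
# earliest discovery date among its members. Data here is invented.
from collections import defaultdict
from datetime import date

BEHAVIORS = ["writes_registry", "opens_socket", "injects_process",
             "deletes_files"]

samples = [
    ("w32.a", {"writes_registry", "opens_socket"},  date(2004, 3, 1)),
    ("w32.b", {"writes_registry", "opens_socket"},  date(2005, 7, 9)),
    ("w32.c", {"injects_process", "deletes_files"}, date(2007, 1, 2)),
]

classes = defaultdict(list)
for name, behaviors, discovered in samples:
    bits = "".join("1" if b in behaviors else "0" for b in BEHAVIORS)
    classes[bits].append((name, discovered))

for bits, members in sorted(classes.items()):
    emerged = min(d for _, d in members)
    complexity = bits.count("1")          # simple complexity proxy
    print(f"class {bits}: n={len(members)}, emerged={emerged}, "
          f"complexity={complexity}")
```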
Heart monitoring improves quality of life. Electrocardiograms (ECGs or EKGs) detect heart irregularities, and machine learning algorithms enable several ECG diagnosis processing methods. The first method uses raw ECG and time-series data. The second method classifies the ECG by patient experience. The third technique translates ECG impulses into Q-wave, R-wave, and S-wave (QRS) features, using richer information. Because ECG signals vary naturally between humans and activities, we combine the three feature selection methods to improve classification accuracy and diagnosis; classifications using all three approaches together have not been examined until now. Several researchers have found that Machine Learning (ML) techniques can improve ECG classification, so this study compares popular machine learning techniques for evaluating ECG features. Four algorithms are compared on categorization results: Support Vector Machine (SVM), Decision Tree, Naive Bayes, and Neural Network. SVM plus prior knowledge has the highest accuracy (99%) of the four ML methods. QRS characteristics failed to identify signals without chaos theory. With 99.8% classification accuracy, the Decision Tree technique outperformed all previous experiments.
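The four-way comparison maps directly onto scikit-learn; the sketch below cross-validates the four model families on placeholder feature vectors, with synthetic data standing in for the extracted raw/experience/QRS ECG features.

```python
# Comparing the study's four classifier families on ECG feature
# vectors. Synthetic data stands in for the real extracted features.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(4)
X = rng.normal(size=(600, 12))              # assumed combined feature set
y = (X[:, :3].sum(axis=1) > 0).astype(int)  # synthetic normal/abnormal label

models = {
    "SVM": SVC(kernel="rbf"),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Naive Bayes": GaussianNB(),
    "Neural Network": MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                                    random_state=0),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: {acc:.3f}")
```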
This paper presents a new method for obtaining network properties from incomplete data sets. Problems associated with missing data represent well-known stumbling blocks in Social Network Analysis. The method of "estimating connectivity from spanning tree completions" (ECSTC) is specifically designed to address situations where only spanning tree(s) of a network are known, such as those obtained through respondent-driven sampling (RDS). Using repeated random completions derived from degree information, this method forgoes the usual step of trying to obtain final edge or vertex rosters, and instead aims to estimate network-centric properties of vertices probabilistically from the spanning trees themselves. In this paper, we discuss the problem of missing data, describe the protocols of our completion method, and finally present the results of an experiment in which ECSTC was used to estimate graph-dependent vertex properties from spanning trees sampled from a graph whose characteristics were known ahead of time. The results show that ECSTC methods hold more promise for obtaining network-centric properties of individuals from a limited set of data than researchers may have previously assumed. Such an approach represents a break with past strategies for working with missing data, which have mainly sought means to complete the graph; ECSTC instead estimates the network properties themselves without deciding on a final edge set.
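A rough sketch of the ECSTC idea appears below, assuming a sampled spanning tree and each vertex's true degree are known: repeatedly complete the tree with random edges consistent with the remaining degree budget, compute a network-centric property (betweenness here) on each completion, and average. The completion rule is a simplification of the published protocol.

```python
# Sketch of estimating vertex properties from spanning-tree completions:
# add random edges consistent with known degrees, then average a
# network metric over many completions. Simplified vs. the real ECSTC.
import random
import networkx as nx

def random_completion(tree, degrees, rng):
    G = tree.copy()
    budget = {v: degrees[v] - G.degree(v) for v in G}
    stubs = [v for v, b in budget.items() for _ in range(max(b, 0))]
    rng.shuffle(stubs)
    while len(stubs) >= 2:
        u, v = stubs.pop(), stubs.pop()
        if u != v and not G.has_edge(u, v):   # invalid pairs are discarded
            G.add_edge(u, v)
    return G

rng = random.Random(0)
true_graph = nx.erdos_renyi_graph(30, 0.2, seed=1)
tree = nx.minimum_spanning_tree(true_graph)   # the "observed" sample
degrees = dict(true_graph.degree())           # assumed known

est = {v: 0.0 for v in tree}
R = 200
for _ in range(R):
    bc = nx.betweenness_centrality(random_completion(tree, degrees, rng))
    for v, val in bc.items():
        est[v] += val / R

truth = nx.betweenness_centrality(true_graph)
v0 = max(truth, key=truth.get)
print(f"vertex {v0}: true={truth[v0]:.3f}, estimated={est[v0]:.3f}")
```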
Hospital facilities use a collection of heterogeneous devices, produced by many different vendors, to monitor the state of patient vital signs. The limited interoperability of current devices makes it difficult to synthesize multivariate monitoring data into a unified array of real-time information regarding the patient's state. Without an infrastructure for the integrated evaluation, display, and storage of vital sign data, one cannot adequately ensure that the assignment of caregivers to patients reflects the relative urgency of patient needs. This is an especially serious issue in critical care units (CCUs). We present a formal mathematical model of an operational critical care unit, together with metrics for evaluating the systematic impact of caregiver scheduling decisions on patient care. The model is rich enough to capture the essential features of device and patient diversity, and so enables us to test the hypothesis that integration of vital sign data could realistically yield a significant positive impact on the efficacy of critical care delivery. To test the hypothesis, we employ the model within a computer simulation. The simulation enables us to compare the current scheduling processes in widespread use within CCUs against a new scheduling algorithm that makes use of an integrated array of patient information collected by an (anticipated) vital sign data integration infrastructure. The simulation study provides clear evidence that such an infrastructure reduces risk to patients and lowers operational costs, and in so doing reveals the inherent costs of medical device non-interoperability.
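As a toy version of the comparison, the sketch below simulates one caregiver serving patients under two policies: first-come-first-served versus urgency-first (the latter standing in for scheduling informed by integrated vital-sign data), and reports urgency-weighted waiting time as a crude risk proxy. All distributions, parameters, and the risk metric are assumptions, far simpler than the paper's formal CCU model.

```python
# Toy comparison of caregiver scheduling policies: FCFS vs. urgency-
# first (a stand-in for scheduling driven by integrated vital signs).
# Risk proxy: urgency-weighted waiting time. All parameters assumed.
import random

def simulate(priority_by_urgency, seed=0, n_patients=200):
    rng = random.Random(seed)
    patients, t = [], 0.0
    for _ in range(n_patients):
        t += rng.expovariate(0.8)                # interarrival times (assumed)
        patients.append((t,                      # arrival time
                         rng.uniform(0.5, 1.5),  # service time (assumed)
                         rng.random()))          # urgency score in [0, 1]
    queue, pending = patients, []
    clock, risk = 0.0, 0.0
    while queue or pending:
        while queue and queue[0][0] <= clock:
            pending.append(queue.pop(0))
        if not pending:                          # caregiver idles until next arrival
            clock = queue[0][0]
            continue
        key = (lambda p: -p[2]) if priority_by_urgency else (lambda p: p[0])
        pending.sort(key=key)
        arrival, service, urgency = pending.pop(0)
        risk += (clock - arrival) * urgency      # waiting weighted by urgency
        clock += service
    return risk

print(f"FCFS risk:          {simulate(False):8.1f}")
print(f"urgency-first risk: {simulate(True):8.1f}")
```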
With the rapid expansion of the Internet, Web servers have played a major role in accessing the enormous mass of Web pages to find the information needed by the user. Despite the exponential growth of the WWW, very little research has been conducted on Web server performance analysis with a view to improving the time a Web server takes to connect to, receive, and analyze a request sent by the client and then send the answer back to the client. In this paper, we propose a multi-layer analytical approach to studying Web server performance. A simple client-server model is used to represent the WWW server in order to demonstrate how to apply the proposed approach. We develop a systematic, analytical methodology to quantify the communication delay and queueing overhead in a distributed Web server system. The approach uses the Computation Structure Model to derive the server processing time required to handle a request sent from a client, and a queueing model to analyze the communication between the clients and the server.
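The paper derives service times from the Computation Structure Model; as a generic illustration of the queueing half of the analysis, the sketch below computes the classic M/M/1 response-time decomposition for a single server. Treating the server as M/M/1 is an assumption standing in for the paper's more detailed model.

```python
# M/M/1 illustration of server response time: total time in system
# splits into queueing delay plus service time. This generic model is
# a stand-in for the paper's Computation Structure Model analysis.
def mm1_metrics(arrival_rate, service_rate):
    assert arrival_rate < service_rate, "queue must be stable (rho < 1)"
    rho = arrival_rate / service_rate            # server utilization
    wq = rho / (service_rate - arrival_rate)     # mean wait in queue
    w = 1.0 / (service_rate - arrival_rate)      # mean time in system
    return rho, wq, w

# 80 requests/s arriving, server processes 100 requests/s on average.
rho, wq, w = mm1_metrics(arrival_rate=80.0, service_rate=100.0)
print(f"utilization={rho:.0%}, queueing delay={wq*1e3:.1f} ms, "
      f"response time={w*1e3:.1f} ms")
```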