Due to the restricted satellite payloads in LEO mega-constellation networks (LMCNs), remote sensing image analysis, online learning, and other big data services call for onboard distributed processing (OBDP). In existing technologies, the efficiency of big data applications (BDAs) in distributed systems hinges on stable, low-latency links between worker nodes. However, LMCNs, with their highly dynamic nodes and long-distance links, cannot provide these conditions, which makes the performance of OBDP hard to measure intuitively. To bridge this gap, a multidimensional simulation platform is indispensable: one that can simulate the network environment of LMCNs and run BDAs within it for performance testing. Using STK's APIs and a parallel computing framework, we achieve real-time simulation of thousands of satellite nodes, which are mapped to application nodes through software-defined networking (SDN) and container technologies. We elaborate the architecture and mechanisms of the simulation platform, and take Starlink and Hadoop as realistic examples for simulation. The results indicate that LMCNs have dynamic end-to-end latency that fluctuates periodically with the constellation's movement. Compared with ground data center networks (GDCNs), LMCNs degrade computing and storage job throughput, a loss that can be alleviated by erasure codes and by data flow scheduling among worker nodes.
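To make the latency behaviour concrete, the following minimal sketch (not part of the paper's STK-based platform; the circular-orbit geometry, 550 km altitude, and 95-minute period are illustrative assumptions, and visibility constraints are ignored) computes the one-way propagation delay of a ground-to-satellite link over one orbital period and shows the periodic fluctuation the simulation has to reproduce.

```python
import numpy as np

C = 299_792.458          # speed of light, km/s
R_EARTH = 6371.0         # mean Earth radius, km
ALT = 550.0              # assumed Starlink-like shell altitude, km
T_ORBIT = 95 * 60        # assumed orbital period, s

def ground_to_satellite_latency(t):
    """One-way propagation delay (ms) from a fixed equatorial ground station to a
    satellite on a circular equatorial orbit at time t (idealized geometry)."""
    theta = 2 * np.pi * t / T_ORBIT                  # satellite phase angle
    sat = (R_EARTH + ALT) * np.array([np.cos(theta), np.sin(theta)])
    gs = np.array([R_EARTH, 0.0])                    # ground station position
    return np.linalg.norm(sat - gs) / C * 1e3        # delay in ms

# Sample one orbital period: the delay rises and falls periodically, which is
# the latency fluctuation an OBDP simulation platform must reproduce.
delays = [ground_to_satellite_latency(t) for t in range(0, T_ORBIT, 60)]
print(f"min {min(delays):.2f} ms, max {max(delays):.2f} ms")
```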
Various mobile devices and applications are now used in daily life. These devices require high-speed data processing, low energy consumption, low communication latency, and secure data transmission, especially in 5G and 6G mobile networks. High-security cryptography guarantees that essential data can be transmitted securely; however, it increases energy consumption and reduces data processing speed. Therefore, this study proposes a low-energy data encryption (LEDE) algorithm based on the Advanced Encryption Standard (AES) for improving data transmission security and reducing the energy consumption of encryption in Internet-of-Things (IoT) devices. In the proposed LEDE algorithm, the system time parameter is employed to create a dynamic S-Box that replaces the static S-Box of AES. Tests indicated that six-round LEDE encryption achieves the same security level as 10-round conventional AES encryption. This reduction in rounds gives the LEDE algorithm 67.4% lower energy consumption and a 43.9% shorter encryption time than conventional AES; thus, the proposed LEDE algorithm can improve the performance and reduce the energy consumption of IoT edge devices.
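A minimal sketch of the dynamic S-Box idea (an illustrative stand-in, not the paper's exact construction or its security analysis): a system-time parameter seeds a pseudo-random permutation of the 256 byte values, so both endpoints that share the time parameter derive the same invertible substitution table.

```python
import time
import numpy as np

def dynamic_sbox(time_param=None):
    """Build a time-seeded byte substitution table and its inverse.

    Illustrative stand-in for the LEDE dynamic S-Box: a plain seeded permutation,
    not a cryptographically analysed construction.
    """
    seed = int(time.time()) if time_param is None else time_param
    rng = np.random.default_rng(seed)
    sbox = rng.permutation(256).astype(np.uint8)     # bijective byte mapping
    inv = np.empty(256, dtype=np.uint8)
    inv[sbox] = np.arange(256, dtype=np.uint8)       # inverse table for decryption
    return sbox, inv

# Both sides derive the same table from a shared time parameter (hypothetical value).
sbox, inv_sbox = dynamic_sbox(time_param=1700000000)
assert all(inv_sbox[sbox[b]] == b for b in range(256))
```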
The current education field is experiencing an innovation driven by big data and cloud technologies, and these advanced technologies play a central role in the construction of smart campuses. Big data technology has a wide range of applications in student learning behavior analysis, teaching resource management, campus safety monitoring, and decision support, which improves the quality of education and management efficiency. Cloud computing technology supports the integration, distribution, and optimal use of educational resources through cloud resource sharing, virtual classrooms, intelligent campus management systems, and Infrastructure-as-a-Service (IaaS) models, which reduce costs and increase flexibility. This paper comprehensively discusses the practical application of big data and cloud computing technologies in smart campuses, showing how these technologies can contribute to the development of smart campuses and laying the foundation for the future innovation of education models.
As an introductory course for the emerging major of big data management and application, "Introduction to Big Data" has not yet formed a widely accepted curriculum standard and implementation plan. To address this gap, we discuss some of our explorations and attempts in the construction and teaching of big data courses for the major of big data management and application, from the perspectives of course planning, course implementation, and course summary. Interviews with students and questionnaire feedback show that students are highly satisfied with some of the teaching measures and programs currently adopted.
This study aims to investigate the influence of social media on college choice among undergraduates majoring in Big Data Management and Application in China. The study attempts to reveal how information on social media platforms such as Weibo, WeChat, and Zhihu influences the cognition and choice process of prospective students. By employing an online quantitative survey questionnaire, data were collected from the 2022 and 2023 classes of new students majoring in Big Data Management and Application at Guilin University of Electronic Technology. The aim was to evaluate the role of social media in their college choice process and understand the features and information that most attract prospective students. Social media has become a key factor influencing the college choice decision-making of undergraduates majoring in Big Data Management and Application in China. Students tend to obtain school information through social media platforms and use this information as an important reference in their decision-making process. Higher education institutions should strengthen their social media information dissemination, providing accurate, timely, and attractive information. It is also necessary to ensure effective management of social media platforms, maintain a positive reputation for the school on social media, and increase the interest and trust of prospective students. Simultaneously, educational decision-makers should consider incorporating social media analysis into their recruitment strategies to better attract new student enrollment. This study provides a new perspective for understanding higher education choice behavior in the digital age, particularly by revealing the importance of social media in the educational decision-making process. This has important practical and theoretical implications for higher education institutions, policymakers, and social media platform operators.
This paper analyzes the advantages of legal digital currencies and explores their impact on bank big data practices. By combining bank big data collection and processing, it clarifies that legal digital currencies can enhance the efficiency of bank data processing, enrich data types, and strengthen data analysis and application capabilities. To meet future development needs, banks should strengthen data collection management, enhance data processing capabilities, and innovate big data application models. These findings provide a reference for bank big data practice and promote the transformation and upgrading of the banking industry in the context of legal digital currencies.
Dealing with data scarcity is the biggest challenge faced by Artificial Intelligence (AI), and it will be interesting to see how we overcome this obstacle in the future, but for now, "THE SHOW MUST GO ON!!!" As AI spreads and transforms more industries, the lack of data is a significant obstacle to the best methods for teaching machines how real-world processes work. This paper explores the considerable implications of data scarcity for the AI industry, which threaten to restrict its growth and potential, and proposes plausible solutions and perspectives. In addition, this article focuses closely on different ethical considerations: privacy, consent, and non-discrimination principles during AI model development under limited-data conditions. The paper also investigates innovative technologies, incorporating transfer learning, few-shot learning, and data augmentation to adapt models so they can be used effectively in low-resource settings. This emphasizes the need for collaborative frameworks and sound methodologies that ensure applicability and fairness, tackling the technical and ethical challenges associated with data scarcity in AI. The article also discusses prospective approaches to dealing with data scarcity, emphasizing the blend of synthetic data and traditional models and the use of advanced machine learning techniques such as transfer learning and few-shot learning. These techniques aim to enhance the flexibility and effectiveness of AI systems across various industries while ensuring sustainable AI technology development amid ongoing data scarcity.
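As a concrete example of one technique the article discusses, the sketch below applies simple data augmentation (noise injection and horizontal flips) to a scarce labelled image set; the array shapes and transform choices are illustrative assumptions, not taken from the article.

```python
import numpy as np

def augment(images, labels, copies=3, noise_std=0.05, rng=None):
    """Expand a small labelled image set with noisy and flipped copies.

    images: array of shape (n, h, w) with values in [0, 1]; labels: array of shape (n,).
    A simple stand-in for the augmentation strategies discussed in the article.
    """
    rng = np.random.default_rng() if rng is None else rng
    out_x, out_y = [images], [labels]
    for _ in range(copies):
        noisy = np.clip(images + rng.normal(0, noise_std, images.shape), 0, 1)
        flipped = noisy[:, :, ::-1]                 # horizontal flip
        out_x.extend([noisy, flipped])
        out_y.extend([labels, labels])
    return np.concatenate(out_x), np.concatenate(out_y)

# Ten random images stand in for a scarce dataset.
x_small = np.random.default_rng(0).random((10, 28, 28))
y_small = np.arange(10) % 2
x_aug, y_aug = augment(x_small, y_small)
print(x_aug.shape)   # (70, 28, 28): 10 originals + 3 rounds * 2 variants * 10 images
```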
In today's digital world, the Internet of Things (IoT) plays an important role in both local and global economies due to its widespread adoption in different applications. This technology has the potential to offer several advantages over conventional technologies in the near future. However, the potential growth of this technology also attracts attention from hackers, which introduces new challenges for the research community that range from hardware and software security to user privacy and authentication. Therefore, we focus on a particular security concern associated with malware detection. The literature presents many countermeasures, but inconsistent results on identical datasets and algorithms raise concerns about model biases, training quality, and complexity. This highlights the need for an adaptive, real-time learning framework that can effectively mitigate malware threats in IoT applications. To address these challenges, (i) we propose an intelligent framework based on Two-step Deep Reinforcement Learning (TwStDRL) that is capable of learning and adapting in real time to counter malware threats in IoT applications. This framework balances exploration and exploitation during both the training and testing phases by storing results in a replay memory. The stored knowledge allows the model to effectively navigate the environment and maximize cumulative rewards. (ii) To demonstrate the superiority of the TwStDRL framework, we implement and evaluate several machine learning algorithms for comparative analysis, including Support Vector Machines (SVM), Multi-Layer Perceptron, Random Forests, and k-means Clustering. The selection of these algorithms is driven by the inconsistent results reported in the literature, which cast doubt on their robustness and reliability in real-world IoT deployments. (iii) Finally, we provide a comprehensive evaluation to justify why the TwStDRL framework outperforms them in mitigating security threats. During this analysis, we observed that the proposed TwStDRL scheme achieves an average performance of 99.45% across accuracy, precision, recall, and F1-score, an absolute improvement of roughly 3% over existing malware-detection models.
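A minimal sketch of the two ingredients the abstract highlights, a replay memory and an exploration/exploitation policy; the tiny tabular Q-table and the toy transition below are illustrative stand-ins for the TwStDRL deep network and the IoT malware environment.

```python
import random
from collections import deque

import numpy as np

N_STATES, N_ACTIONS = 16, 4          # toy sizes standing in for IoT traffic features
memory = deque(maxlen=10_000)        # replay memory of (s, a, r, s') transitions
Q = np.zeros((N_STATES, N_ACTIONS))  # tabular stand-in for the deep Q-network

def act(state, epsilon=0.1):
    """Epsilon-greedy policy: explore with probability epsilon, otherwise exploit Q."""
    if random.random() < epsilon:
        return random.randrange(N_ACTIONS)
    return int(np.argmax(Q[state]))

def replay(batch_size=32, gamma=0.95, lr=0.1):
    """Sample stored transitions and move Q toward the bootstrapped target."""
    if len(memory) < batch_size:
        return
    for s, a, r, s_next in random.sample(memory, batch_size):
        target = r + gamma * np.max(Q[s_next])
        Q[s, a] += lr * (target - Q[s, a])

# One interaction step with a made-up environment transition:
state = 0
action = act(state)
reward, next_state = 1.0, 5          # placeholder feedback from the environment
memory.append((state, action, reward, next_state))
replay()
```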
The distillation process is an important chemical process, and the application of data-driven modelling approaches has the potential to reduce model complexity compared with mechanistic modelling, thus improving the efficiency of process optimization and monitoring studies. However, the distillation process is highly nonlinear and has multiple uncertainty perturbation intervals, which makes accurate data-driven modelling of distillation processes challenging. This paper proposes a systematic data-driven modelling framework to solve these problems. Firstly, data segment variance was introduced into the K-means algorithm to form K-means data interval (KMDI) clustering, which clusters the data into perturbed and steady-state intervals for steady-state data extraction. Secondly, the maximal information coefficient (MIC) was employed to calculate the nonlinear correlation between variables and remove redundant features. Finally, extreme gradient boosting (XGBoost) was integrated as the base learner into adaptive boosting (AdaBoost), with an error threshold (ET) set to improve the weight-update strategy, to construct a new ensemble learning algorithm, XGBoost-AdaBoost-ET. The superiority of the proposed framework is verified by applying it to a real industrial propylene distillation process.
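A minimal sketch of the boosting stage (the resampling-based AdaBoost loop, the specific error-threshold rule, and the XGBoost hyperparameters below are illustrative assumptions; the paper defines the actual XGBoost-AdaBoost-ET weight-update strategy). It assumes the xgboost package and NumPy arrays X, y.

```python
import numpy as np
from xgboost import XGBRegressor

def adaboost_xgb_et(X, y, n_rounds=5, error_threshold=0.2, rng=None):
    """AdaBoost.R2-style loop with XGBoost base learners; samples whose relative
    error exceeds the threshold keep full weight so later learners focus on them."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(y)
    w = np.full(n, 1.0 / n)
    learners, alphas = [], []
    for _ in range(n_rounds):
        idx = rng.choice(n, size=n, replace=True, p=w)     # weighted resampling
        model = XGBRegressor(n_estimators=50, max_depth=3, verbosity=0)
        model.fit(X[idx], y[idx])
        rel = np.abs(model.predict(X) - y)
        rel = rel / (rel.max() + 1e-12)                    # relative error in [0, 1]
        eps = np.sum(w * rel)
        if eps >= 0.5:                                     # learner too weak, stop
            break
        beta = eps / (1 - eps)
        # Illustrative ET rule: down-weight only the samples below the threshold.
        w *= np.where(rel > error_threshold, 1.0, beta)
        w /= w.sum()
        learners.append(model)
        alphas.append(np.log(1 / beta))
    return learners, np.array(alphas)

def predict(learners, alphas, X):
    """Alpha-weighted average of the base learners' predictions."""
    preds = np.stack([m.predict(X) for m in learners])
    return np.average(preds, axis=0, weights=alphas)
```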
The integration of artificial intelligence (AI) into the realm of robotic urologic surgery represents a remarkable paradigm shift in the field of urology and surgical healthcare. AI, with its advanced data analysis and machine learning capabilities, has not only expedited the evolution of robotic surgical procedures but also significantly improved diagnostic accuracy and surgical outcomes.
Particle Swarm Optimization (PSO) has been utilized as a useful tool for solving intricate optimization problems in various applications across different fields. This paper provides an update on PSO, reviewing its recent developments and applications while also presenting arguments for its efficacy in resolving optimization problems in comparison with other algorithms. Covering six strategic areas, namely Data Mining, Machine Learning, Engineering Design, Energy Systems, Healthcare, and Robotics, the study demonstrates the versatility and effectiveness of PSO. Experimental results are used to show the strengths and weaknesses of PSO, and performance results are included in tables for ease of comparison. The results stress PSO's efficiency in providing optimal solutions but also show that there are aspects that need to be improved, through combination with other algorithms or tuning of the method's parameters. The review of the advantages and limitations of PSO is intended to provide academics and practitioners with a well-rounded view of how to employ such a tool most effectively and to encourage optimized designs of PSO for solving theoretical and practical problems in the future.
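For readers unfamiliar with the method under review, here is a minimal global-best PSO loop on a toy sphere objective; the inertia and acceleration coefficients are common textbook values, not parameters reported in the paper.

```python
import numpy as np

def pso(objective, dim=5, n_particles=30, n_iters=200, bounds=(-5.0, 5.0),
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal global-best PSO: each particle is pulled toward its own best
    position and the swarm's best position found so far."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))           # positions
    v = np.zeros_like(x)                                  # velocities
    pbest = x.copy()
    pbest_val = np.apply_along_axis(objective, 1, x)
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(n_iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.apply_along_axis(objective, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

best_x, best_f = pso(lambda z: float(np.sum(z ** 2)))     # sphere function
print(best_f)                                             # should be close to 0
```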
Expenditure on wells constitutes a significant part of the operational costs for a petroleum enterprise, and most of that cost results from drilling. This has prompted drilling departments to continuously look for ways to reduce their drilling costs and be as efficient as possible. A system called the Drilling Comprehensive Information Management and Application System (DCIMAS) is developed and presented here, with the aim of collecting, storing and making full use of the valuable well data and information relating to all drilling activities and operations. The DCIMAS comprises three main parts: a data collection and transmission system, a data warehouse (DW) management system, and an integrated platform of core applications. With the support of the application platform, the DW management system is introduced, whereby operation data are captured at well sites and transmitted electronically to a data warehouse via transmission equipment and ETL (extract, transform and load) tools. With the high quality of the data guaranteed, our central task is to make the best use of the operation data and information for drilling analysis and to provide further information to guide later production stages. Applications have been developed and integrated on a uniform platform to interface directly with different layers of the multi-tier DW. Now, engineers in every department spend less time on data handling and more time on applying technology to their real work with the system.
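A minimal sketch of the extract-transform-load step at the heart of the DW management system (the CSV layout, field names, unit conversion, and SQLite staging table are hypothetical; the real DCIMAS uses dedicated transmission equipment and commercial ETL tools).

```python
import csv
import sqlite3

def etl_well_reports(csv_path, db_path="drilling_dw.sqlite"):
    """Extract daily drilling reports from a CSV file, normalise units, and
    load them into a data-warehouse staging table (illustrative only)."""
    conn = sqlite3.connect(db_path)
    conn.execute("""CREATE TABLE IF NOT EXISTS staging_drilling
                    (well_id TEXT, report_date TEXT, depth_m REAL, rop_m_per_hr REAL)""")
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            depth_m = float(row["depth_ft"]) * 0.3048        # transform: feet -> metres
            rop = float(row["rop_ft_per_hr"]) * 0.3048
            conn.execute("INSERT INTO staging_drilling VALUES (?, ?, ?, ?)",
                         (row["well_id"], row["date"], depth_m, rop))
    conn.commit()
    conn.close()
```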
With the development of the Internet of Things, cloud computing, and the mobile Internet, the scale of data shows an alarming growth trend. Agricultural information is an important part of modern agricultural construction, and the development of the agricultural industry is becoming more and more deeply intertwined with the application of information technology. This paper reviews the concept and characteristics of big data and the development history of big data at home and abroad, and emphatically expounds the connotation of agricultural big data, the development status of agricultural big data at home and abroad, as well as the applications of agricultural big data technology, agricultural big data resources and agricultural big data in various fields.
This paper reviews the current achievements of the China Argo project. It considers the construction of the Argo observing array, float technology, and the quality control and sharing of its data. The development of associated data products and data applications for use in the fields of ocean, atmosphere, and climate research is discussed, particularly those related to tropical cyclones (typhoons), ocean circulation, mesoscale eddies, turbulence, oceanic heat/salt storage and transport, water masses, and operational oceanic/atmospheric/climatic forecasts and predictions. Finally, the challenges and opportunities involved in the long-term maintenance and sustained development of the China Argo ocean observation network are outlined. Discussion also focuses on the necessity of increasing the number of floats in the Indian Ocean and of expanding the regional Argo observation network in the South China Sea, together with the importance of promoting the use of Argo data by the maritime countries of Southeast Asia and India.
Fixture design and planning is one of the most important manufacturing activities, playing a pivotal role in deciding the lead time for product development. Fixture design, which affects part quality in terms of geometric accuracy and surface finish, can be enhanced by using the product manufacturing information (PMI) stored in the neutral standard for the exchange of product model data (STEP) file, thereby integrating design and manufacturing. The present paper proposes a unique fixture design approach that extracts the geometry information from STEP application protocol (AP) 242 files of computer-aided design (CAD) models to provide automatic suggestions of locator positions and clamping surfaces. Automatic feature extraction software, "FiXplan", developed using the programming language C#, is used to extract the part feature, dimension and geometry information. The information from the STEP AP 242 file is deduced using geometric reasoning techniques, which in turn are utilized for fixture planning. The developed software is observed to be adept at identifying the primary, secondary, and tertiary locating faces and locator position configurations of prismatic components. Structural analysis of the prismatic part under different locator positions was performed using the commercial finite element method software ABAQUS, and the optimized locator position was identified on the basis of minimum deformation of the workpiece. The area-ratio (base locator enclosed area (%) / workpiece base area (%)) for the ideal locator configuration was observed to be 33%. Experiments were conducted on a prismatic workpiece using a specially designed fixture, for different locator configurations. The surface roughness and waviness of the machined surfaces were analysed using an Alicona non-contact optical profilometer. The best surface characteristics were obtained for the surface machined under the ideal locator positions having an area-ratio of 33%, thus validating the predicted numerical results. The efficiency, capability and applicability of the developed software are demonstrated for the finishing operation of a sensor cover, a typical prismatic component with applications in the naval industry, under different locator configurations. The best results were obtained under the proposed ideal locator configuration with an area-ratio of 33%.
The China Seismo-Electromagnetic Satellite, launched into orbit from Jiuquan Satellite Launch Centre on February 2nd, 2018, is China's first space satellite dedicated to geophysical exploration. The satellite carries eight scientific payloads, including high-precision magnetometers, to detect electromagnetic changes in space, in particular changes associated with global earthquake disasters. In order to encourage and facilitate the use by geophysical scientists of data from the satellite's payloads, this paper introduces the application systems developed for the China Seismo-Electromagnetic Satellite by the Institute of Crustal Dynamics, China Earthquake Administration; these include platform construction, data classification, data storage, data format, and data access and acquisition.
Efficient real-time data exchange over the Internet plays a crucial role in the successful application of web-based systems. In this paper, a data transfer mechanism over the Internet is proposed for real-time web-based applications. The mechanism incorporates the eXtensible Markup Language (XML) and the Hierarchical Data Format (HDF) to provide a flexible and efficient data format. Heterogeneous transfer data are classified into light and heavy data, which are stored using XML and HDF respectively; the HDF data format is then mapped to Java Document Object Model (JDOM) objects in XML in the Java environment. These JDOM data objects are sent across computer networks with the support of the Java Remote Method Invocation (RMI) data transfer infrastructure. Client-defined data priority levels are implemented in RMI, which guide a server to transfer data objects at different priorities. A remote monitoring system for an industrial reactor process simulator is used as a case study to illustrate the proposed data transfer mechanism.
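The priority idea can be sketched independently of the Java RMI and JDOM plumbing; the following Python illustration (the payloads and priority labels are hypothetical) shows how client-defined priority levels order outgoing light and heavy data objects.

```python
import heapq
import itertools

# Client-defined priority levels: lower number means sent earlier.
LIGHT_DATA, HEAVY_DATA = 0, 1      # e.g. XML-style metadata vs. HDF-style bulk arrays

class PriorityTransferQueue:
    """Order outgoing data objects by client-defined priority; an illustrative
    stand-in for the RMI-based transfer described in the paper."""
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()   # tie-breaker keeps insertion order

    def enqueue(self, payload, priority):
        heapq.heappush(self._heap, (priority, next(self._counter), payload))

    def next_to_send(self):
        return heapq.heappop(self._heap)[2] if self._heap else None

q = PriorityTransferQueue()
q.enqueue(bytes(10_000_000), HEAVY_DATA)                            # heavy data can wait
q.enqueue({"sensor": "reactor_temp", "value": 352.4}, LIGHT_DATA)   # light, urgent metadata
print(type(q.next_to_send()))   # the light metadata object goes out first
```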
The unique composition of milk makes this basic foodstuff an exceptional raw material for the production of new ingredients with desired properties and diverse applications in the food industry. The fractionation of milk is the key to the development of those ingredients and products; hence continuous research and development in this field, especially on various levels of fractionation and separation by filtration, have been carried out. This review focuses on the production of milk fractions as well as their particular properties, applications and the processes that increase their exploitation. Whey proteins and caseins from the protein fraction are excellent emulsifiers and protein supplements. Besides, they can be chemically or enzymatically modified to obtain bioactive peptides with numerous functional and nutritional properties. In this context, valorization techniques for cheese-whey proteins, a by-product of the dairy industry that poses both economic and environmental problems, are being developed. Phospholipids from the milk fat fraction are powerful emulsifiers and also have exclusive nutraceutical properties. In addition, enzymatic modification of milk phospholipids makes it possible to tailor emulsifiers with particular properties. However, several aspects remain to be overcome; these refer to a deeper understanding of the health, functional and nutritional properties of these new ingredients, which might otherwise be barriers to their use and acceptability. Additionally, in this review, alternative applications of milk constituents in the non-food area, such as in the manufacture of plastic materials and textile fibers, are also introduced. The unmet needs, the cross-fertilization between various protein domains, the carbon footprint requirements, the environmental necessities, the health and wellness demands, etc., are dominant factors in the search for innovation approaches; these factors also outline the further innovation potential deriving from those "apparent" constraints, obliging science and technology to take them into account.
A kind of second-order implicit fractional-step characteristic finite difference method is presented in this paper for the numerical simulation of the coupled system of enhanced (chemical) oil production in porous media. Some techniques, such as the calculus of variations, the energy analysis method, the commutativity of products of difference operators, the decomposition of high-order difference operators and the theory of a priori estimates, are introduced, and an optimal-order error estimate in the l^2 norm is derived. This method has been applied successfully to the numerical simulation of enhanced oil production in actual oilfields, and the simulation results are quite interesting and satisfactory.
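As a generic illustration of the characteristic finite difference idea behind such schemes (this is the standard modified-method-of-characteristics approximation for a convection-dominated concentration equation, not the paper's specific second-order implicit fractional-step scheme):

```latex
% For a convection-dominated equation
%   \phi\,\partial c/\partial t + \mathbf{u}\cdot\nabla c = \nabla\cdot(D\nabla c),
% the time derivative is taken along the characteristic direction \tau:
\[
  \psi\,\frac{\partial c}{\partial \tau}
  \;=\; \phi\,\frac{\partial c}{\partial t} + \mathbf{u}\cdot\nabla c ,
  \qquad \psi = \sqrt{\phi^{2} + |\mathbf{u}|^{2}},
\]
\[
  \psi\,\frac{\partial c}{\partial \tau}\Big|_{t^{n}}
  \;\approx\;
  \phi\,\frac{c^{\,n}(x) - c^{\,n-1}\!\bigl(x - \mathbf{u}\,\Delta t/\phi\bigr)}{\Delta t},
\]
% i.e. the foot of the characteristic is found by tracking back from x over one
% time step, which keeps the scheme stable for convection-dominated transport.
```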