Abstract: The Japan International Machine Tool Fair 2024 (JIMTOF2024) was held on November 5–10, 2024 at Tokyo Big Sight, with an exhibition area of 118,540 square meters. The fair's theme was "Technologies passed down to the future offer unlimited possibilities."
Funding: Sponsored by the U.S. Department of Housing and Urban Development (Grant No. NJLTS0027-22). The opinions expressed in this study are the authors' alone and do not represent the opinions of the U.S. Department of HUD.
Abstract: This paper addresses urban sustainability challenges amid global urbanization, emphasizing the need for innovative approaches aligned with the Sustainable Development Goals. While traditional tools and linear models offer insights, they fall short in presenting a holistic view of complex urban challenges. System dynamics (SD) models, which are often used to provide a holistic, systematic understanding of a research subject such as the urban system, emerge as valuable tools, but data scarcity and theoretical inadequacy pose challenges. The research reviews relevant papers on SD model applications in urban sustainability since 2018, categorizing them based on nine key indicators. Among the reviewed papers, data limitations and model assumptions were identified as major challenges in applying SD models to urban sustainability. This led to exploring the transformative potential of big data analytics, an approach this study found to be rare in the field, to strengthen the empirical foundation of SD models. Integrating big data could provide data-driven calibration, potentially improving predictive accuracy and reducing reliance on simplified assumptions. The paper concludes by advocating for new approaches that reduce assumptions and promote real-time applicable models, contributing to a comprehensive understanding of urban sustainability through the synergy of big data and SD models.
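To make the modelling idea concrete, the sketch below simulates a single-stock system dynamics model of urban population with assumed inflow and outflow rates; the variable names and parameter values are illustrative, not taken from any of the reviewed papers.

```python
import numpy as np

def simulate_urban_stock(p0=1.0e6, birth_rate=0.012, migration_in=8_000,
                         out_rate=0.009, years=30, dt=0.25):
    """Euler integration of a one-stock SD model: a population stock with
    proportional inflow/outflow plus a constant net in-migration flow."""
    steps = int(years / dt)
    t = np.linspace(0.0, years, steps + 1)
    p = np.empty(steps + 1)
    p[0] = p0
    for k in range(steps):
        inflow = birth_rate * p[k] + migration_in   # people per year
        outflow = out_rate * p[k]
        p[k + 1] = p[k] + dt * (inflow - outflow)   # stock update
    return t, p

# A data-driven calibration step (as the abstract suggests) would fit
# birth_rate / out_rate to an observed population series, e.g. by least squares.
t, pop = simulate_urban_stock()
print(f"Population after {t[-1]:.0f} years: {pop[-1]:,.0f}")
```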
Funding: Partially supported by the Construction of Collaborative Innovation Center of Beijing Academy of Agricultural and Forestry Sciences (KJCX20240406), the Beijing Natural Science Foundation (JQ24037), the National Natural Science Foundation of China (32330075), and the Earmarked Fund for China Agriculture Research System (CARS-02 and CARS-54).
Abstract: The security of the seed industry is crucial for ensuring national food security. Currently, developed countries in Europe and America, along with international seed industry giants, have entered the Breeding 4.0 era. This era integrates biotechnology, artificial intelligence (AI), and big data information technology. In contrast, China is still in a transition period between stages 2.0 and 3.0, which relies primarily on conventional selection and molecular breeding. In the context of increasingly complex international situations, accurately identifying core issues in China's seed industry innovation and seizing the frontier of international seed technology are strategically important. These efforts are essential for ensuring food security and revitalizing the seed industry. This paper systematically analyzes the characteristics of crop breeding data from artificial selection to intelligent design breeding. It explores the applications and development trends of AI and big data in modern crop breeding from several key perspectives. These include high-throughput phenotype acquisition and analysis, construction of multi-omics big data databases and management systems, AI-based multi-omics integrated analysis, and the development of intelligent breeding software tools based on biological big data and AI technology. Based on an in-depth analysis of the current status and challenges of China's seed industry technology development, we propose strategic goals and key tasks for China's new generation of AI and big data-driven intelligent design breeding. These suggestions aim to accelerate the development of an intelligence-driven crop breeding engineering system that features large-scale gene mining, efficient gene manipulation, engineered variety design, and systematized bio-breeding. This study provides a theoretical basis and practical guidance for the development of China's seed industry technology.
Abstract: On October 18, 2017, the 19th National Congress Report called for the implementation of the Healthy China Strategy. The development of biomedical data plays a pivotal role in advancing this strategy. Since the 18th National Congress of the Communist Party of China, China has vigorously promoted the integration and implementation of the Healthy China and Digital China strategies. The National Health Commission has prioritized the development of health and medical big data, issuing policies to promote standardized applications and foster innovation in "Internet + Healthcare." Biomedical data has significantly contributed to precision medicine, personalized health management, drug development, disease diagnosis, public health monitoring, and epidemic prediction capabilities.
Funding: Supported in part by the National Natural Science Foundation of China under Grant 62371181; in part by the Changzhou Science and Technology International Cooperation Program under Grant CZ20230029; in part by a National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (2021R1A2B5B02087169); and in part under the framework of the international cooperation program managed by the National Research Foundation of Korea (2022K2A9A1A01098051).
Abstract: The Intelligent Internet of Things (IIoT) involves real-world things that communicate or interact with each other through networking technologies, collecting data from these "things" and using intelligent approaches, such as Artificial Intelligence (AI) and machine learning, to make accurate decisions. Data science is the science of dealing with data and its relationships through intelligent approaches. Most state-of-the-art research focuses independently on either data science or IIoT, rather than exploring their integration. Therefore, to address the gap, this article provides a comprehensive survey on the advances and integration of data science with the Intelligent IoT (IIoT) system by classifying the existing IoT-based data science techniques and presenting a summary of various characteristics. The paper analyzes the data science or big data security and privacy features, including network architecture, data protection, and continuous monitoring of data, which face challenges in various IoT-based systems. Extensive insights into IoT data security, privacy, and challenges are visualized in the context of data science for IoT. In addition, this study reveals current opportunities to enhance data science and IoT market development. The current gaps and challenges faced in the integration of data science and IoT are comprehensively presented, followed by the future outlook and possible solutions.
Funding: Supported by the National Natural Science Foundation of China (32370703), the CAMS Innovation Fund for Medical Sciences (CIFMS) (2022-I2M-1-021, 2021-I2M-1-061), and the Major Project of Guangzhou National Laboratory (GZNL2024A01015).
Abstract: Viral infectious diseases, characterized by their intricate nature and wide-ranging diversity, pose substantial challenges in the domain of data management. The vast volume of data generated by these diseases, spanning from the molecular mechanisms within cells to large-scale epidemiological patterns, has surpassed the capabilities of traditional analytical methods. In the era of artificial intelligence (AI) and big data, there is an urgent necessity to optimize these analytical methods to more effectively handle and utilize the information. Despite the rapid accumulation of data associated with viral infections, the lack of a comprehensive framework for integrating, selecting, and analyzing these datasets has left numerous researchers uncertain about which data to select, how to access it, and how to utilize it most effectively in their research. This review endeavors to fill these gaps by exploring the multifaceted nature of viral infectious diseases and summarizing relevant data across multiple levels, from the molecular details of pathogens to broad epidemiological trends. The scope extends from the micro-scale to the macro-scale, encompassing pathogens, hosts, and vectors. In addition to data summarization, this review thoroughly investigates various dataset sources. It also traces the historical evolution of data collection in the field of viral infectious diseases, highlighting the progress achieved over time. Simultaneously, it evaluates the current limitations that impede data utilization. Furthermore, we propose strategies to surmount these challenges, focusing on the development and application of advanced computational techniques, AI-driven models, and enhanced data integration practices. By providing a comprehensive synthesis of existing knowledge, this review is designed to guide future research and contribute to more informed approaches in the surveillance, prevention, and control of viral infectious diseases, particularly within the context of the expanding big-data landscape.
Funding: 2024 Anqing Normal University University-Level Key Project (ZK2024062D).
Abstract: This study examines the Big Data Collection and Preprocessing course at Anhui Institute of Information Engineering, implementing a hybrid teaching reform using the Bosi Smart Learning Platform. The proposed hybrid model follows a "three-stage" and "two-subject" framework, incorporating a structured design for teaching content and assessment methods before, during, and after class. Practical results indicate that this approach significantly enhances teaching effectiveness and improves students' learning autonomy.
Funding: Supported by the Deanship of Graduate Studies and Scientific Research at the University of Bisha through the Promising Program under grant number UB-Promising-33-1445.
Abstract: Open networks and heterogeneous services in the Internet of Vehicles (IoV) can lead to security and privacy challenges. One key requirement for such systems is the preservation of user privacy, ensuring a seamless experience in driving, navigation, and communication. These privacy needs are influenced by various factors, such as data collected at different intervals, trip durations, and user interactions. To address this, the paper proposes a Support Vector Machine (SVM) model designed to process large amounts of aggregated data and recommend privacy-preserving measures. The model analyzes data based on user demands and interactions with service providers or neighboring infrastructure. It aims to minimize privacy risks while ensuring service continuity and sustainability. The SVM model helps validate the system's reliability by creating a hyperplane that distinguishes between maximum and minimum privacy recommendations. The results demonstrate the effectiveness of the proposed SVM model in enhancing both privacy and service performance.
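As an illustration of the classification step described above, the following sketch trains a linear SVM that separates "maximum" from "minimum" privacy recommendations on synthetic aggregated features; the feature names (collection interval, trip duration, interaction count) and the labelling rule are assumptions for demonstration, not the paper's dataset.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Synthetic aggregated records: [collection_interval_s, trip_duration_min, interactions]
n = 400
X = np.column_stack([
    rng.uniform(1, 60, n),      # data collection interval (s)
    rng.uniform(5, 120, n),     # trip duration (min)
    rng.integers(0, 50, n),     # interactions with providers/infrastructure
])
# Assumed rule for demo labels: frequent collection + many interactions -> "max privacy" (1)
y = ((X[:, 0] < 15) & (X[:, 2] > 20)).astype(int)

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
clf.fit(X, y)

sample = np.array([[5.0, 45.0, 35]])   # a hypothetical trip
level = "maximum" if clf.predict(sample)[0] == 1 else "minimum"
print(f"Recommended privacy level: {level}")
```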
Funding: Strategic Priority Research Program of the Chinese Academy of Sciences (No. XDB0740000); National Key Research and Development Program of China (No. 2022YFB3904200, No. 2022YFF0711601); Key Project of Innovation LREIS (No. PI009); National Natural Science Foundation of China (No. 42471503).
Abstract: Deep-time Earth research plays a pivotal role in deciphering the rates, patterns, and mechanisms of Earth's evolutionary processes throughout geological history, providing essential scientific foundations for climate prediction, natural resource exploration, and sustainable planetary stewardship. To advance deep-time Earth research in the era of big data and artificial intelligence, the International Union of Geological Sciences initiated the "Deep-time Digital Earth" (DDE) international big science program in 2019. At the core of this ambitious program lies the development of geoscience knowledge graphs, serving as a transformative knowledge infrastructure that enables the integration, sharing, mining, and analysis of heterogeneous geoscience big data. The DDE knowledge graph initiative has made significant strides in three critical dimensions: (1) establishing a unified knowledge structure across geoscience disciplines that ensures consistent representation of geological entities and their interrelationships through standardized ontologies and semantic frameworks; (2) developing a robust and scalable software infrastructure capable of supporting both expert-driven and machine-assisted knowledge engineering for large-scale graph construction and management; and (3) implementing a comprehensive three-tiered architecture encompassing basic, discipline-specific, and application-oriented knowledge graphs, spanning approximately 20 geoscience disciplines. Through its open knowledge framework and international collaborative network, this initiative has fostered multinational research collaborations, establishing a robust foundation for next-generation geoscience research while propelling the discipline toward FAIR (Findable, Accessible, Interoperable, Reusable) data practices in deep-time Earth systems research.
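The sketch below shows, in miniature, what a tiered geoscience knowledge graph looks like as subject–predicate–object triples with a simple two-hop query; the entities and relations are invented for illustration and do not come from the DDE ontologies.

```python
from collections import defaultdict

# A toy knowledge graph stored as (subject, predicate, object) triples.
triples = [
    ("Granite", "is_a", "IgneousRock"),            # basic tier: rock taxonomy
    ("IgneousRock", "is_a", "Rock"),
    ("Granite", "contains_mineral", "Quartz"),     # discipline tier: petrology
    ("Granite", "contains_mineral", "Feldspar"),
    ("SampleX1", "classified_as", "Granite"),      # application tier: samples
    ("SampleX1", "collected_in", "Cretaceous"),
]

index = defaultdict(list)
for s, p, o in triples:
    index[(s, p)].append(o)

def query(subject, predicate):
    """Return all objects linked to `subject` by `predicate`."""
    return index.get((subject, predicate), [])

# Follow two hops: which minerals does the rock type of SampleX1 contain?
for rock in query("SampleX1", "classified_as"):
    print(rock, "->", query(rock, "contains_mineral"))
```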
Funding: Supported by the National Key Research and Development Program of China (Grant Nos. 2021YFF0901705, 2021YFF0901700); the State Key Laboratory of Media Convergence and Communication, Communication University of China; the Fundamental Research Funds for the Central Universities; and the High-Quality and Cutting-Edge Disciplines Construction Project for Universities in Beijing (Internet Information, Communication University of China).
Abstract: In the Internet era, recommendation systems play a crucial role in helping users find relevant information in large datasets. Class imbalance is known to severely affect data quality and therefore reduce the performance of recommendation systems. Due to the imbalance, machine learning algorithms tend to classify inputs into the positive (majority) class every time to achieve high prediction accuracy. Imbalance can be categorized, for example, by features and by classes, but most studies consider only class imbalance. In this paper, we propose a recommendation system that can integrate multiple networks to adapt to a large number of imbalanced features and can deal with highly skewed and imbalanced datasets through a loss function. We propose a loss-aware feature attention mechanism (LAFAM) to address feature imbalance. The network incorporates an attention mechanism and uses multiple sub-networks to classify and learn features. For better results, the network can learn the weights of the sub-networks and assign higher weights to important features. We propose a suppression loss to address class imbalance; it favors the negative loss by penalizing the positive loss and pays more attention to sample points near the decision boundary. Experiments on two large-scale datasets verify that the performance of the proposed system is greatly improved compared to baseline methods.
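A minimal sketch of the class-imbalance idea: a binary loss that down-weights the positive (majority) term and up-weights samples whose predicted probability sits near the decision boundary. The weighting constants and the boundary-emphasis term are assumptions for illustration; the paper's actual suppression loss may be defined differently.

```python
import torch
import torch.nn.functional as F

def suppression_loss(logits, targets, pos_weight=0.3, boundary_gamma=2.0):
    """Binary loss that penalizes the positive (majority) class and emphasizes
    samples near the decision boundary (p close to 0.5).

    logits, targets: tensors of shape (batch,); targets in {0, 1}.
    """
    p = torch.sigmoid(logits)
    bce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")

    # Down-weight the loss contributed by positive-class samples.
    class_w = torch.where(targets > 0.5, torch.full_like(p, pos_weight),
                          torch.ones_like(p))

    # Emphasis peaks at p = 0.5 (the decision boundary) and fades toward 0/1.
    boundary_w = (1.0 - (2.0 * p - 1.0).abs()) ** boundary_gamma

    return (class_w * (1.0 + boundary_w) * bce).mean()

# Tiny usage example with random logits and a skewed label distribution.
logits = torch.randn(8)
targets = torch.tensor([1., 1., 1., 1., 1., 1., 0., 0.])
print(suppression_loss(logits, targets))
```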
Abstract: High-entropy alloys (HEAs) exhibit significant potential across multiple domains due to their unique properties. However, conventional research methodologies face limitations in composition design, property prediction, and process optimization, characterized by low efficiency and high costs. The integration of artificial intelligence (AI) technologies has provided innovative solutions for HEA research. This review presents a detailed overview of recent advances in AI applications for structural modeling and mechanical property prediction of HEAs. Furthermore, it discusses the advantages of big data analytics in facilitating alloy composition design and screening, quality control, and defect prediction, as well as the construction and sharing of specialized material databases. The paper also addresses the existing challenges in current AI-driven HEA research, including issues related to data quality, model interpretability, and cross-domain knowledge integration. Additionally, it outlines prospects for the synergistic development of AI-enhanced computational materials science and experimental validation systems.
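To illustrate the kind of data-driven property prediction the review surveys, the sketch below fits a random-forest regressor that maps alloy composition fractions to a mechanical property; the compositions, the target values, and the choice of model are placeholders, not results from any HEA database.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic "dataset": molar fractions of five elements (rows sum to 1)
# and a made-up hardness-like target correlated with two of them.
n = 300
comp = rng.dirichlet(alpha=np.ones(5), size=n)          # e.g. Al, Co, Cr, Fe, Ni fractions
hardness = 400 + 300 * comp[:, 0] - 150 * comp[:, 2] + rng.normal(0, 10, n)

X_tr, X_te, y_tr, y_te = train_test_split(comp, hardness, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)

print(f"Held-out R^2: {model.score(X_te, y_te):.3f}")
print("Feature importances (one per element):", model.feature_importances_.round(3))
```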
Abstract: A team of researchers from Beijing Normal University, the Institute of High Energy Physics (IHEP) under the Chinese Academy of Sciences (CAS), and the National Astronomical Observatories, CAS (NAOC), reported in Nature Astronomy on January 23, 2025 their discovery of an X-ray flash about 12.5 billion light-years away. The signals burst out only 1.2 billion years after the Big Bang, when our 13.8-billion-year-old universe was still in its infancy, and a science satellite swiftly recorded them.
Funding: Funded by the National Natural Science Foundation of China (No. 42220104008).
Abstract: Research into metamorphism plays a pivotal role in reconstructing the evolution of continents, particularly through the study of ancient rocks that are highly susceptible to metamorphic alteration due to multiple tectonic events. In the big data era, the establishment of new data platforms and the application of big data methods have become a focus for metamorphic rocks. Significant progress has been made in creating specialized databases, compiling comprehensive datasets, and utilizing data analytics to address complex scientific questions. However, many existing databases are inadequate for the specific requirements of metamorphic research, with a substantial amount of valuable data remaining uncollected. Therefore, constructing new databases that can keep pace with the development of the data era is necessary. This article provides an extensive review of existing databases related to metamorphic rocks and discusses data-driven studies in this field. Accordingly, several crucial factors that need to be taken into consideration in establishing specialized metamorphic databases are identified, with the aim of leveraging data-driven applications to achieve broader scientific objectives in metamorphic research.
Abstract: Managing sensitive data in dynamic and high-stakes environments, such as healthcare, requires access control frameworks that offer real-time adaptability, scalability, and regulatory compliance. BIG-ABAC introduces a transformative approach to Attribute-Based Access Control (ABAC) by integrating real-time policy evaluation and contextual adaptation. Unlike traditional ABAC systems that rely on static policies, BIG-ABAC dynamically updates policies in response to evolving rules and real-time contextual attributes, ensuring precise and efficient access control. Leveraging decision trees evaluated in real time, BIG-ABAC overcomes the limitations of conventional access control models, enabling seamless adaptation to complex, high-demand scenarios. The framework adheres to the NIST ABAC standard while incorporating modern distributed streaming technologies to enhance scalability and traceability. Its flexible policy enforcement mechanisms facilitate the implementation of regulatory requirements such as HIPAA and GDPR, allowing organizations to align access control policies with compliance needs dynamically. Performance evaluations demonstrate that BIG-ABAC processes 95% of access requests within 50 ms and updates policies dynamically with a latency of 30 ms, significantly outperforming traditional ABAC models. These results establish BIG-ABAC as a benchmark for adaptive, scalable, and context-aware access control, making it an ideal solution for dynamic, high-risk domains such as healthcare, smart cities, and Industrial IoT (IIoT).
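The sketch below shows the general shape of attribute-based access decisions made by walking a small decision tree over subject, resource, and context attributes; the attributes, roles, and thresholds are invented examples, not BIG-ABAC's actual policy model or NIST profile.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Request:
    role: str            # subject attribute
    resource: str        # resource attribute
    emergency: bool      # contextual attribute
    hour: int            # contextual attribute (0-23)

def decide(req: Request) -> str:
    """A hand-rolled decision tree over attributes: each branch tests one
    attribute, and each leaf returns PERMIT or DENY."""
    if req.resource == "patient_record":
        if req.role == "physician":
            return "PERMIT"
        if req.role == "nurse":
            # Nurses: only during day shift, unless an emergency is flagged.
            return "PERMIT" if (8 <= req.hour < 20 or req.emergency) else "DENY"
        return "DENY"
    return "DENY"

now = datetime.now()
print(decide(Request(role="nurse", resource="patient_record",
                     emergency=False, hour=now.hour)))
```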
Abstract: Social media outlets deliver customers a medium for communication, exchange, and expression of their thoughts with others. The advent of social networks and the fast escalation of the quantity of data have created opportunities for textual evaluation. Utilising the user corpus, characteristics of social platform users, and other data, academic research may accurately discern the personality traits of users. This research examines the traits of consumer personalities. Usually, personality tests administered by psychological experts via interviews or self-report questionnaires are costly, time-consuming, complex, and labour-intensive. Currently, academics in computational linguistics are increasingly focused on predicting personality traits from social media data. An individual's personality comprises both enduring traits and more transient behavioural states. To address this distinction, we propose a novel LSTM approach (BERT-LIWC-LSTM) that simultaneously incorporates users' enduring and immediate personality characteristics for textual personality recognition. In the proposed paradigm, long-term personality encoding captures and represents persisting personality traits, while short-term personality capturing records changing personality states. Experimental results demonstrate that the designed BERT-LIWC-LSTM model achieves an average improvement in accuracy of 3.41% on the Big Five dataset compared to current methods, thereby justifying the efficacy of encoding both stable and dynamic personality traits simultaneously through long- and short-term feature interaction.
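A minimal PyTorch sketch of the idea of fusing an enduring (long-term) user representation with a sequence of recent-post (short-term) features inside an LSTM classifier; the dimensions, the fusion by concatenation, and the five-trait output head are illustrative assumptions, not the exact BERT-LIWC-LSTM architecture.

```python
import torch
import torch.nn as nn

class LongShortPersonalityNet(nn.Module):
    def __init__(self, short_dim=64, long_dim=32, hidden=128, n_traits=5):
        super().__init__()
        self.lstm = nn.LSTM(short_dim, hidden, batch_first=True)   # short-term stream
        self.long_proj = nn.Linear(long_dim, hidden)                # long-term stream
        self.head = nn.Linear(2 * hidden, n_traits)                 # Big Five scores

    def forward(self, short_seq, long_vec):
        # short_seq: (batch, n_posts, short_dim)  per-post features
        # long_vec:  (batch, long_dim)            aggregate user-level features
        _, (h_n, _) = self.lstm(short_seq)
        short_repr = h_n[-1]                       # (batch, hidden)
        long_repr = torch.relu(self.long_proj(long_vec))
        return self.head(torch.cat([short_repr, long_repr], dim=-1))

model = LongShortPersonalityNet()
scores = model(torch.randn(4, 10, 64), torch.randn(4, 32))
print(scores.shape)   # torch.Size([4, 5]) -> one score per Big Five trait
```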
Abstract: We put forward an enlightening view on the repulsive force between antimatter: antimatter repels other antimatter, and the repulsive force is proportional to the product of the masses and inversely proportional to the square of the distance between them; there is no gravitational or anti-gravitational interaction between antimatter and positive matter. As an application, we explain the Big Bang process in a new light.
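Written out as a formula, the relation the abstract describes has the same inverse-square form as Newtonian gravity but acts repulsively between antimatter masses; the symbol K for the proportionality constant is our notation, not the authors'.

```latex
% Assumed notation: m_1, m_2 antimatter masses, r their separation, K > 0 a constant.
F_{\text{anti}} = +K\,\frac{m_1 m_2}{r^2}
  \quad \text{(repulsive, antimatter--antimatter)},
\qquad
F_{\text{anti--matter}} = 0
  \quad \text{(no coupling between antimatter and positive matter)}.
```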
Funding: Yunnan Provincial Science and Technology Project at Southwest United Graduate School (No. 202302AO370012); National Natural Science Foundation of China (No. 42401510); Postdoctoral Fellowship Program of CPSF (No. GZC20240017).
Abstract: Morphology, the study of shapes or forms, when applied to tourism, emphasizes the multifarious spatial practices between morphological elements and tourism activities. However, existing literature on morphology in the context of tourism usually focuses on only a single or a limited number of study areas, overlooking common or even universal patterns across various tourism destinations. To address this gap, we utilize geospatial big data and present a case study on the morphology of 406 "AAAAA"-rated scenic areas in China. A framework based on "points", "lines", "planes", and "solids" was designed to systematically organize and analyze morphological elements across scenic areas. The findings provide valuable insights for tourism planning and development, such as the co-occurrence of dense road networks and fragmented landscapes within scenic areas, as well as the resource-context-influenced (cultural or natural) associations between morphological features and tourism indicators. This research provides valuable strategic guidance for more effective and informed tourism development while acknowledging the trade-offs between generalizability and local specificity.
Funding: Research on the Digital Transformation of the Financial Management Major and the Training Model of Outstanding Talents (2023122203988); Research on the Integration of Haikou Logistics and Manufacturing Driven by Big Data and Its Consumption Promotion Effect (HKKY2024-ZD-24).
Abstract: As the times change, China's science and technology have entered a period of rapid development. At the same time, the economic structure is also evolving, and Haikou's established logistics industry faces new impacts and challenges in the process. For related enterprises to stand out in fierce market competition, the current state of industry development must be optimized and upgraded, and the integrated development of Haikou's logistics and manufacturing industries must be promoted, so as to continuously advance the innovative application of digital technology in both sectors and form a multi-force model of economic development. This paper starts from the current state of Haikou logistics, analyzes the importance of integrating Haikou's logistics and manufacturing industries in the context of big data, and discusses in depth the paths toward such big data-driven integration, in the hope of contributing new strength to the development of the social economy.
Funding: Supported in part by the National Natural Science Foundation of China under Project 62166047; in part by the Yunnan International Joint Laboratory of Natural Rubber Intelligent Monitor and Digital Applications under Grant 202403AP140001; and in part by the Xingdian Talent Support Program under Grant YNWR-QNBJ-2019-270.
Abstract: The era of big data brings new challenges for information network systems (INS), while simultaneously offering unprecedented opportunities for advancing intelligent intrusion detection systems. In this work, we propose a data-driven intrusion detection system for Distributed Denial of Service (DDoS) attack detection, focusing on intrusion detection from a big data perspective. As intelligent information processing methods, big data and artificial intelligence have been widely used in information systems, and the INS is an important information system in cyberspace. In advanced INS, network architectures have become more complex, and smart devices collect network data at large scale. How to improve the performance of a complex intrusion detection system with big data and artificial intelligence is a major challenge. To address the problem, we design a novel intrusion detection system (IDS) from a big data perspective. The IDS uses tensors to represent large-scale and complex multi-source network data in a unified form. Then, a novel tensor decomposition (TD) method is developed to carry out big data mining. The TD method collaborates seamlessly with XGBoost (eXtreme Gradient Boosting) to complete the intrusion detection. To verify the proposed IDS, a series of experiments is conducted on two real network datasets. The results reveal that the proposed IDS attains an impressive accuracy rate of over 98%. Additionally, when the scale of the datasets is altered, the proposed IDS still maintains excellent detection performance, which demonstrates its robustness.
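As a rough illustration of the tensor-plus-boosting pipeline, the sketch below builds a small (sample × feature × time-window) tensor, extracts low-rank features from its mode-1 unfolding with an SVD, and feeds them to a gradient-boosting classifier. The synthetic data, the SVD-based feature extraction, and the use of scikit-learn's GradientBoostingClassifier in place of XGBoost are all simplifications; the paper's TD method is not reproduced here.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Synthetic traffic tensor: 600 flows x 8 features x 5 time windows,
# with "attack" flows (label 1) shifted upward in a few features.
n, f, w, rank = 600, 8, 5, 4
X = rng.normal(size=(n, f, w))
y = rng.integers(0, 2, size=n)
X[y == 1, :3, :] += 1.5

# Mode-1 unfolding: each flow becomes one row of length f * w.
unfolded = X.reshape(n, f * w)

# Low-rank features via truncated SVD of the unfolding (a crude stand-in
# for the paper's tensor decomposition step).
_, _, vt = np.linalg.svd(unfolded - unfolded.mean(axis=0), full_matrices=False)
features = unfolded @ vt[:rank].T            # (n, rank) factor scores

X_tr, X_te, y_tr, y_te = train_test_split(features, y, random_state=0)
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
print(f"Detection accuracy on held-out flows: {clf.score(X_te, y_te):.3f}")
```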
Funding: Supported by the Postdoctoral Fellowship Program (Grade B) of China (GZB20250435) and the National Natural Science Foundation of China (62403270).
Abstract: High-quality data is essential for the success of data-driven learning tasks. The characteristics, precision, and completeness of the datasets critically determine the reliability, interpretability, and effectiveness of subsequent analyses and applications, such as fault detection, predictive maintenance, and process optimization. However, for many industrial processes, obtaining sufficient high-quality data remains a significant challenge due to high costs, safety concerns, and practical constraints. To overcome these challenges, data augmentation has emerged as a rapidly growing research area, attracting considerable attention across both academia and industry. By expanding datasets, data augmentation techniques enable greater generalization and more robust performance in real applications. This paper provides a comprehensive, multi-perspective review of data augmentation methods for industrial processes. For clarity and organization, existing studies are systematically grouped into four categories: small sample with low dimension, small sample with high dimension, large sample with low dimension, and large sample with high dimension. Within this framework, the review examines current research from both methodological and application-oriented perspectives, highlighting the main methods, their advantages, and their limitations. By synthesizing these findings, this review offers a structured overview for scholars and practitioners, serving as a valuable reference for newcomers and experienced researchers seeking to explore and advance data augmentation techniques in industrial processes.
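For a concrete sense of what simple augmentation of industrial process data can look like, the sketch below applies three common transforms (jittering, amplitude scaling, window slicing) to a sensor signal; the transforms and their parameters are generic textbook choices, not the specific methods catalogued in the review.

```python
import numpy as np

rng = np.random.default_rng(7)

def jitter(x, sigma=0.03):
    """Add small Gaussian noise to a 1-D sensor signal."""
    return x + rng.normal(0.0, sigma, size=x.shape)

def scale(x, low=0.9, high=1.1):
    """Multiply the whole signal by a random amplitude factor."""
    return x * rng.uniform(low, high)

def window_slice(x, keep=0.9):
    """Crop a random contiguous window and stretch it back to full length."""
    n = len(x)
    m = int(n * keep)
    start = rng.integers(0, n - m + 1)
    sliced = x[start:start + m]
    return np.interp(np.linspace(0, m - 1, n), np.arange(m), sliced)

# A toy process signal: slow drift plus a periodic component.
t = np.linspace(0, 10, 500)
signal = 0.05 * t + np.sin(2 * np.pi * 0.8 * t)

augmented = [jitter(signal), scale(signal), window_slice(signal)]
print([a.shape for a in augmented])   # each keeps the original length (500,)
```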