Journal Articles
362,226 articles found
1. A Convolutional Neural Network-Based Deep Support Vector Machine for Parkinson's Disease Detection with Small-Scale and Imbalanced Datasets
Authors: Kwok Tai Chui, Varsha Arya, Brij B. Gupta, Miguel Torres-Ruiz, Razaz Waheeb Attar. Computers, Materials & Continua, 2026, No. 1, pp. 1410-1432 (23 pages)
Parkinson's disease (PD) is a debilitating neurological disorder affecting over 10 million people worldwide. PD classification models using voice signals as input are common in the literature. It is believed that deep learning algorithms can further enhance performance; nevertheless, this is challenging given the small-scale and imbalanced nature of PD datasets. This paper proposes a convolutional neural network-based deep support vector machine (CNN-DSVM) that automates feature extraction with a CNN and extends the conventional SVM to a DSVM for better classification performance on small-scale PD datasets. A customized kernel function reduces the impact of biased classification towards the majority class (healthy candidates in our setting). An improved generative adversarial network (IGAN) was designed to generate additional training data to enhance the model's performance. The proposed algorithm achieves a sensitivity of 97.6% and a specificity of 97.3%. The performance comparison is evaluated from five perspectives, including comparisons with different data generation algorithms, feature extraction techniques, kernel functions, and existing works. Results reveal the effectiveness of the IGAN algorithm, which improves sensitivity and specificity by 4.05%-4.72% and 4.96%-5.86%, respectively, and of the CNN-DSVM algorithm, which improves sensitivity by 1.24%-57.4% and specificity by 1.04%-163% while reducing biased detection towards the majority class. Ablation experiments confirm the effectiveness of individual components. Two future research directions are also suggested.
Keywords: convolutional neural network; data generation; deep support vector machine; feature extraction; generative artificial intelligence; imbalanced dataset; medical diagnosis; Parkinson's disease; small-scale dataset
Read online | Download PDF
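The abstract does not disclose the customized kernel or the weighting scheme; as a rough illustration of the general idea only, the sketch below (hypothetical names, plain Python) pairs a standard RBF kernel with inverse-frequency class weights so that misclassifying the minority (PD) class is penalized more heavily:

```python
import math
from collections import Counter

def rbf_kernel(x, y, gamma=0.5):
    """Standard RBF kernel; a stand-in for the paper's customized kernel."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq_dist)

def inverse_frequency_weights(labels):
    """Weight each class by n_samples / (n_classes * n_class_samples),
    so the minority class contributes more to the training loss."""
    counts = Counter(labels)
    n, k = len(labels), len(counts)
    return {c: n / (k * m) for c, m in counts.items()}

labels = ["healthy"] * 80 + ["pd"] * 20   # imbalanced toy dataset
weights = inverse_frequency_weights(labels)
```

Such weights are what a class-balanced SVM loss would multiply each sample's hinge term by; the actual CNN-DSVM pipeline is more involved.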
2. Statistical Methods of SNP Data Analysis and Applications
Authors: Alexander Bulinski, Oleg Butkovsky, Victor Sadovnichy, Alexey Shashkin, Pavel Yaskov, Alexander Balatskiy, Larisa Samokhodskaya, Vsevolod Tkachuk. Open Journal of Statistics, 2012, No. 1, pp. 73-87 (15 pages)
We develop various statistical methods important for multidimensional genetic data analysis. Theorems justifying the application of these methods are established. We concentrate on multifactor dimensionality reduction, logic regression, random forests, and stochastic gradient boosting, along with their new modifications. We use complementary approaches to study the risk of complex diseases such as cardiovascular ones. The roles of certain combinations of single nucleotide polymorphisms and non-genetic risk factors are examined. To perform the data analysis concerning coronary heart disease and myocardial infarction, the Lomonosov Moscow State University supercomputer "Chebyshev" was employed.
Keywords: genetic data; statistical analysis; multifactor dimensionality reduction; ternary logic regression; random forests; stochastic gradient boosting; independent rule; single nucleotide polymorphisms; coronary heart disease; myocardial infarction
Not subscribed
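Multifactor dimensionality reduction, one of the methods the authors build on, labels each multi-locus genotype combination high- or low-risk by comparing its case/control ratio with the overall ratio; a minimal sketch on made-up genotype data (not the paper's implementation):

```python
from collections import defaultdict

def mdr_risk_labels(genotypes, status):
    """genotypes: one tuple of SNP genotypes per subject;
    status: 1 = case, 0 = control. A combination is 'high' risk when
    its case/control ratio exceeds the overall case/control ratio."""
    cases = sum(status)
    controls = len(status) - cases
    overall = cases / controls
    tally = defaultdict(lambda: [0, 0])   # combo -> [controls, cases]
    for g, s in zip(genotypes, status):
        tally[g][s] += 1
    return {g: ("high" if (c[1] / c[0] if c[0] else float("inf")) > overall
                else "low")
            for g, c in tally.items()}

genotypes = [("AA", "GG")] * 6 + [("AG", "GG")] * 4
status = [1, 1, 1, 1, 1, 0] + [0, 0, 0, 1]
risk = mdr_risk_labels(genotypes, status)
```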
3. A Panel Data Model Analysis of the Impact of FDI on Income Distribution in China (Cited: 3)
Author: Lin Hong. 浙江统计 (Zhejiang Statistics), 2005, No. 3, pp. 19-21 (3 pages)
Keywords: FDI; panel data
Read online | Download PDF
4. Potential Applications of Milk Fractions and Valorization of Dairy By-Products: A Review of the State-of-the-Art Available Data, Outlining the Innovation Potential from a Bigger Data Standpoint (Cited: 3)
Authors: Serge Rebouillat, Salvadora Ortega-Requena. Journal of Biomaterials and Nanobiotechnology, 2015, No. 3, pp. 176-203 (28 pages)
The unique composition of milk makes this basic foodstuff an exceptional raw material for the production of new ingredients with desired properties and diverse applications in the food industry. The fractionation of milk is key in the development of those ingredients and products; hence continuous research and development in this field, especially various levels of fractionation and separation by filtration, have been carried out. This review focuses on the production of milk fractions as well as their particular properties, applications, and processes that increase their exploitation. Whey proteins and caseins from the protein fraction are excellent emulsifiers and protein supplements. Besides, they can be chemically or enzymatically modified to obtain bioactive peptides with numerous functional and nutritional properties. In this context, valorization techniques for cheese-whey proteins, a by-product of the dairy industry that poses both economic and environmental problems, are being developed. Phospholipids from the milk fat fraction are powerful emulsifiers and also have exclusive nutraceutical properties. In addition, enzyme modification of milk phospholipids makes it possible to tailor emulsifiers with particular properties. However, several aspects remain to be overcome; these refer to a deeper understanding of the healthy, functional, and nutritional properties of these new ingredients, which might be barriers to their use and acceptability. Additionally, this review also introduces alternative applications of milk constituents in non-food areas, such as the manufacture of plastic materials and textile fibers. The unmet needs, the cross-fertilization between various protein domains, the carbon footprint requirements, the environmental necessities, the new demand for health and wellness, etc., are dominant factors in the search for innovation approaches; these factors also outline the further innovation potential deriving from those "apparent" constraints, obliging science and technology to take them into account.
Keywords: milk product; milk fractionation; casein; phospholipid; whey protein; non-food application; valorization; enzyme modification; bioactive peptides; bigger data; innovation: closed, open, collaborative, disruptive, inclusive, nested
Not subscribed
5. A Review: On Smart Materials Based on Some Polysaccharides; within the Contextual Bigger Data, Insiders, "Improvisation" and Said Artificial Intelligence Trends (Cited: 1)
Authors: Serge Rebouillat, Fernand Pla. Journal of Biomaterials and Nanobiotechnology, 2019, No. 2, pp. 41-77 (37 pages)
Smart Materials, along with Innovation attributes and Artificial Intelligence, are among the most used "buzz" words in all media. Central to their practical occurrence, many talents are to be gathered within new contextual data influxes. Has this, in the last 20 years, changed some of the essential fundamental dimensions and the required skills of the actors, such as providers, users, insiders, etc.? This is a preliminary focus and prelude of this review. As an example, polysaccharide materials are the most abundant macromolecules present as an integral part of the natural system of our planet. They are renewable, biodegradable, and carbon neutral, with low environmental, health, and safety risks, and serve as structural materials in the cell walls of plants. Most of them have been used for many years as engineering materials in many important industrial processes, such as pulp and papermaking and the manufacture of synthetic textile fibres. They are also used in other domains, such as conversion into biofuels and, more recently, in the design of processes using polysaccharide nanoparticles. The main properties of polysaccharides (e.g. low density, thermal stability, chemical resistance, high mechanical strength), together with their biocompatibility, biodegradability, functionality, durability, and uniformity, allow their use for manufacturing smart materials such as blends and composites, electroactive polymers, and hydrogels, which can be obtained 1) through direct utilization and/or 2) after chemical or physical modifications of the polysaccharides. This paper reviews recent works on polysaccharides, mainly cellulose, hemicelluloses, chitin, chitosans, alginates, and their by-products (blends and composites), with the objective of manufacturing smart materials. It is worth noting that, today, the fundamental understanding of the molecular-level interactions that confer smartness to polysaccharides remains poor, and one can predict that new experimental and theoretical tools will emerge to develop the necessary understanding of the structure-property-function relationships that will enable polysaccharide smartness to be better understood and controlled, giving rise to new and innovative applications in nanotechnology, foods, cosmetics, and medicine (e.g. controlled drug release and regenerative medicine), and so opening up major commercial markets in the context of green chemistry.
Keywords: polysaccharides; cellulose; hemicelluloses; chitosan; alginate; composites; blends; hydrogels; smart materials; electro-active papers; sensors; actuators; bigger data; innovation; science in education; jazz; 4C; CRAC
Not subscribed
6. Modeling and Simulation Study of Space Data Link Protocol
Authors: Ismail Hababeh, Rizik M. H. Al-Sayyed, Ja'far Alqatawna, Yousef Majdalawi, Marwan Nabelsi. International Journal of Communications, Network and System Sciences, 2014, No. 10, pp. 440-452 (13 pages)
This research paper describes the design and implementation of the Consultative Committee for Space Data Systems (CCSDS) standards [1] for the Space Data Link Layer Protocol (SDLP). The primary focus is the telecommand (TC) part of the standard. The standard was implemented as DLL functions using the C++ programming language. The second objective of this paper was to use the DLL functions with the OMNeT++ simulation environment to create a simulator in order to analyze the mean end-to-end packet delay, the maximum achievable application-layer throughput for a given fixed link capacity, and the normalized protocol overhead, defined as the total number of bytes transmitted on the link in a given period of time (e.g. per second) divided by the number of bytes of application data received at the application-layer model data sink. In addition, the DLL was also integrated with the Ground Support Equipment Operating System (GSEOS), a software system for space instruments and small spacecraft especially suited for low-budget missions. The SDLP is designed for rapid test system design and high flexibility for changing telemetry and command requirements. GSEOS can be seamlessly moved from EM/FM development (bench testing) to flight operations. It features the Python programming language as a configuration/scripting tool and can easily be extended to accommodate custom hardware interfaces. This paper also presents the simulation results and their analysis.
Keywords: Consultative Committee for Space Data Systems standards; Space Data Link Protocol; mean end-to-end packet delay; maximum achievable application layer throughput; normalized protocol overhead; telecommand; spacecraft; space instruments
Not subscribed
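The normalized protocol overhead the simulator reports is defined in the abstract as link bytes transmitted per period divided by application bytes delivered; under that definition (illustrative numbers, not the paper's results):

```python
def normalized_overhead(link_bytes_per_s, app_bytes_per_s):
    """Total bytes on the link per second divided by application-layer
    bytes received at the data sink per second, per the abstract's
    definition for the SDLP simulation."""
    return link_bytes_per_s / app_bytes_per_s

# e.g. 1200 B/s on the link carrying 1000 B/s of TC application data
ratio = normalized_overhead(1200, 1000)
```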
7. Spatio-Temporal Earthquake Analysis via Data Warehousing for Big Data-Driven Decision Systems
Authors: Georgia Garani, George Pramantiotis, Francisco Javier Moreno Arboleda. Computers, Materials & Continua, 2026, No. 3, pp. 1963-1988 (26 pages)
Earthquakes are highly destructive spatio-temporal phenomena whose analysis is essential for disaster preparedness and risk mitigation. Modern seismological research produces vast volumes of heterogeneous data from seismic networks, satellite observations, and geospatial repositories, creating the need for scalable infrastructures capable of integrating and analyzing such data to support intelligent decision-making. Data warehousing technologies provide a robust foundation for this purpose; however, existing earthquake-oriented data warehouses remain limited, often relying on simplified schemas, domain-specific analytics, or cataloguing efforts. This paper presents the design and implementation of a spatio-temporal data warehouse for seismic activity. The framework integrates spatial and temporal dimensions in a unified schema and introduces a novel array-based approach for managing many-to-many relationships between facts and dimensions without intermediate bridge tables. A comparative evaluation against a conventional bridge-table schema demonstrates that the array-based design improves fact-centric query performance, while the bridge-table schema remains advantageous for dimension-centric queries. To reconcile these trade-offs, a hybrid schema is proposed that retains both representations, ensuring balanced efficiency across heterogeneous workloads. The proposed framework demonstrates how spatio-temporal data warehousing can address schema complexity, improve query performance, and support multidimensional visualization. In doing so, it provides a foundation for integrating seismic analysis into broader big data-driven intelligent decision systems for disaster resilience, risk mitigation, and emergency management.
Keywords: data warehouse; data analysis; big data; decision systems; seismology; data visualization
Read online | Download PDF
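The array-based design the paper evaluates replaces the bridge table of a many-to-many fact-dimension relationship with an array stored on the fact row itself; a toy in-memory contrast (hypothetical field names, not the paper's schema):

```python
# Bridge-table style: many-to-many links live in a separate table
# of (fact_id, dimension_id) pairs.
facts = {1: {"magnitude": 6.1}, 2: {"magnitude": 5.4}}
bridge = [(1, "faultA"), (1, "faultB"), (2, "faultB")]

def dims_via_bridge(fact_id):
    """Fact-centric query through the bridge: scan the pair table."""
    return [d for f, d in bridge if f == fact_id]

# Array-based style: dimension keys embedded in the fact row,
# so a fact-centric lookup needs no join at all.
facts_array = {
    1: {"magnitude": 6.1, "faults": ["faultA", "faultB"]},
    2: {"magnitude": 5.4, "faults": ["faultB"]},
}

def dims_via_array(fact_id):
    return facts_array[fact_id]["faults"]

def facts_for_dim(dim_id):
    """Dimension-centric query on the array shape: every fact row
    must be unnested, which is why the bridge wins this direction."""
    return [f for f, row in facts_array.items() if dim_id in row["faults"]]
```

The asymmetry visible here mirrors the paper's finding: arrays favour fact-centric access, bridges favour dimension-centric access, motivating the hybrid schema.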
8. Proposed Caching Scheme for Optimizing Trade-off between Freshness and Energy Consumption in Named Data Networking Based IoT (Cited: 1)
Authors: Rahul Shrimali, Hemal Shah, Riya Chauhan. Advances in Internet of Things, 2017, No. 2, pp. 11-24 (14 pages)
Over the last few years, the Internet of Things (IoT) has become an omnipresent term. The IoT expands the existing common concepts, anytime and anyplace, to connectivity for anything. The proliferation of IoT offers opportunities but may also bear risks. A hitherto neglected aspect is the possible increase in power consumption, as smart devices in IoT applications are expected to be reachable by other devices at all times. This implies that a device consumes electrical energy even when it is not in use for its primary function. Many research communities have started addressing the storage capability (cache memory) of smart devices using the concept of Named Data Networking (NDN) to achieve a more energy-efficient communication model. In NDN, memory or buffer overflow is a common challenge, especially when the internal memory of a node exceeds its limit and data with the highest degree of freshness cannot be accommodated, so the entire scenario behaves like a traditional network. In such cases, data caching is not performed by intermediate nodes to guarantee the highest degree of freshness. With periodical updates sent from data producers, it is strongly demanded that data consumers get up-to-date information at the least energy cost. Consequently, there is a challenge in maintaining the trade-off between freshness and energy consumption during publisher-subscriber interaction. In our work, we propose an architecture that overcomes the cache strategy issue with a Smart Caching Algorithm for improved memory management and data freshness. The smart caching strategy updates the data at precise intervals while taking garbage data into consideration. It is also observed from experiments that data redundancy can easily be reduced by ignoring/dropping data packets for information that is not of interest to other participating nodes in the network, ultimately optimizing the trade-off between freshness and the energy required.
Keywords: Internet of Things (IoT); Named Data Networking; Smart Caching Table; Pending Interest; Forwarding Information Base; Content Store; Content Centric Networking; Information Centric Networking; Data & Interest Packets; SCT Smart Caching
Not subscribed
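The core idea, keeping the content store within its memory budget while preferring the freshest data, can be sketched as a bounded cache that evicts its stalest entry on overflow (a hypothetical interface; the paper's Smart Caching Table algorithm is more involved):

```python
class FreshnessCache:
    """Bounded content store that evicts the least-fresh entry when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = {}   # content name -> (freshness_timestamp, data)

    def put(self, name, timestamp, data):
        """Cache a data packet; returns False when it is too stale to keep."""
        if len(self.store) >= self.capacity and name not in self.store:
            stalest = min(self.store, key=lambda n: self.store[n][0])
            # Only evict when the newcomer is fresher than the stalest entry.
            if timestamp <= self.store[stalest][0]:
                return False
            del self.store[stalest]
        self.store[name] = (timestamp, data)
        return True

cache = FreshnessCache(capacity=2)
cache.put("/room/temp", 1, "21C")
cache.put("/room/humidity", 2, "40%")
accepted = cache.put("/hall/temp", 3, "22C")   # evicts the stalest entry
```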
9. Multimodal artificial intelligence integrates imaging, endoscopic, and omics data for intelligent decision-making in individualized gastrointestinal tumor treatment
Authors: Hui Nian, Yi-Bin Wu, Yu Bai, Zhi-Long Zhang, Xiao-Huang Tu, Qi-Zhi Liu, De-Hua Zhou, Qian-Cheng Du. Artificial Intelligence in Gastroenterology, 2026, No. 1, pp. 1-19 (19 pages)
Gastrointestinal tumors require personalized treatment strategies due to their heterogeneity and complexity. Multimodal artificial intelligence (AI) addresses this challenge by integrating diverse data sources, including computed tomography (CT), magnetic resonance imaging (MRI), endoscopic imaging, and genomic profiles, to enable intelligent decision-making for individualized therapy. This approach leverages AI algorithms to fuse imaging, endoscopic, and omics data, facilitating comprehensive characterization of tumor biology, prediction of treatment response, and optimization of therapeutic strategies. By combining CT and MRI for structural assessment, endoscopic data for real-time visual inspection, and genomic information for molecular profiling, multimodal AI enhances the accuracy of patient stratification and treatment personalization. The clinical implementation of this technology demonstrates potential for improving patient outcomes, advancing precision oncology, and supporting individualized care in gastrointestinal cancers. Ultimately, multimodal AI serves as a transformative tool in oncology, bridging data integration with clinical application to effectively tailor therapies.
Keywords: multimodal artificial intelligence; gastrointestinal tumors; individualized therapy; intelligent diagnosis; treatment optimization; prognostic prediction; data fusion; deep learning; precision medicine
Read online | Download PDF
10. High-precision classification of benthic habitat sediments in shallow waters of islands by multi-source data
Authors: Qiuhua Tang, Ningning Li, Yujie Zhang, Zhipeng Dong, Yongling Zheng, Jingjing Bao, Jingyu Zhang. Journal of Oceanology and Limnology, 2026, No. 1, pp. 99-108 (10 pages)
Benthic habitat mapping is a discipline that has emerged in the international marine field in recent years, providing an effective tool for marine spatial planning, marine ecological management, and decision-making applications. Seabed sediment classification is one of the main components of benthic habitat mapping. In response to the impact of remote sensing imaging quality and the limitations of acoustic measurement range, where a single data source does not fully reflect the substrate type, we propose a high-precision seabed sediment classification method that integrates data from multiple sources. Based on WorldView-2 multispectral remote sensing imagery and multibeam bathymetry data, we constructed a random forest (RF) classifier with optimal feature selection. A seabed sediment classification experiment integrating optical and acoustic remote sensing data was carried out in the shallow water area of Wuzhizhou Island, Hainan, South China. Different seabed sediment types, such as sand, seagrass, and coral reefs, were effectively identified, with an overall classification accuracy of 92%. Experimental results show that the RF classifier, optimized by fusing multi-source remote sensing data for feature selection, outperformed classifiers trained on simple combinations of data sources, improving the accuracy of seabed sediment classification. Therefore, the method proposed in this paper can be effectively applied to high-precision seabed sediment classification and habitat mapping around islands and reefs.
Keywords: Wuzhizhou Island; marine remote sensing; coastal mapping; multi-spectral remote sensing; shallow water reef; seabed sediment classification; benthic habitat mapping; multi-source data fusion; random forest (RF)
Read online | Download PDF
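The fusion step, concatenating per-pixel optical and acoustic features before classification, can be illustrated with a nearest-centroid stand-in for the paper's random forest (toy feature values, hypothetical scales):

```python
import math

def fuse(optical, acoustic):
    """Concatenate per-pixel multispectral reflectances with
    multibeam bathymetry features into one feature vector."""
    return optical + acoustic

def nearest_centroid(sample, centroids):
    """Stand-in classifier; the paper trains a random forest instead."""
    return min(centroids, key=lambda label: math.dist(sample, centroids[label]))

centroids = {
    "sand":     fuse([0.8, 0.7], [-3.0]),
    "seagrass": fuse([0.2, 0.5], [-4.0]),
    "coral":    fuse([0.5, 0.3], [-2.0]),
}
pixel = fuse([0.75, 0.65], [-3.1])   # bright reflectance, ~3 m depth
label = nearest_centroid(pixel, centroids)
```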
11. Linked Data Based Framework for Tourism Decision Support System: Case Study of Chinese Tourists in Switzerland
Authors: Zhan Liu, Anne Le Calvé, Fabian Cretton, Nicole Glassey Balet, Maria Sokhn, Nicolas Délétroz. Journal of Computer and Communications, 2015, No. 5, pp. 118-126 (9 pages)
Switzerland is one of the most desirable European destinations for Chinese tourists; therefore, a better understanding of Chinese tourists is essential for successful business practices. In China, the largest and leading social media platform, Sina Weibo, a hybrid of Twitter and Facebook, has more than 600 million users. Weibo's great market penetration suggests that tourism operators and marketers need to understand how to build effective and sustainable communications on Chinese social media platforms. In order to offer a better decision support platform to tourism destination managers as well as Chinese tourists, we propose a framework using linked data on Sina Weibo. Linked Data refers to using the Internet to connect related data. We show how it can be used and how an ontology can be designed to include the users' context (e.g., GPS locations). Our framework provides a good theoretical foundation for further understanding Chinese tourists' expectations, experiences, behaviors, and new trends in Switzerland.
Keywords: linked data; Semantic Web; decision support system; natural language processing; behavior analysis; social networks; Chinese tourists; Switzerland; new trends; Sina Weibo
Read online | Download PDF
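Linked Data boils down to subject-predicate-object triples that can be pattern-matched; a minimal in-memory store with a Weibo-flavoured example (made-up IRIs, not the paper's ontology):

```python
def match(triples, s=None, p=None, o=None):
    """Return all triples matching the pattern; None is a wildcard,
    mimicking a basic SPARQL triple pattern."""
    return [(ts, tp, to) for ts, tp, to in triples
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

triples = [
    ("weibo:user42", "ex:visited", "dbpedia:Zermatt"),
    ("weibo:user42", "ex:postedAt", "geo:46.02,7.75"),
    ("dbpedia:Zermatt", "ex:locatedIn", "dbpedia:Switzerland"),
]
visits = match(triples, p="ex:visited")
```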
12. On the Combination of "The Textual Research on Historical Documents" and "The Comparative Study of Historical Data", and a Discussion on "The Law of Quan-ma and Gui-mei" in Chinese Language Studies
Author: Lu Guoyao. 宏观语言学 (Macrolinguistics), 2007, No. 1, pp. 46-59 (14 pages)
In Chinese language studies, both "The Textual Research on Historical Documents" and "The Comparative Study of Historical Data" are traditional methodologies, and both deserve to be treasured, passed on, and further developed. It will certainly harm the development of academic research if either of the two methods is given unreasonable priority. The author claims that the best, or one of the best, methodologies for the historical study of the Chinese language is the combination of the two, hence a new interpretation of "The Double-proof Method". Meanwhile, this essay also attempts to put forward "The Law of Quan-ma and Gui-mei" in Chinese language studies, in which the author argues that it is not advisable to treat Gui-mei as Quan-ma, or vice versa, in linguistic research. It is crucial for us always to respect the language facts first, which is considered the very soul of linguistics.
Keywords: history of the Chinese language; methodology; The Textual Research on Historical Documents; The Comparative Study of Historical Data; Double-proof Method; the Law of Quan-ma and Gui-mei
Read online | Download PDF
13. Mapping of moraine dammed glacial lakes and assessment of their areal changes in the central and eastern Himalayas using satellite data (Cited: 3)
Authors: Sazeda Begam, Dhrubajyoti Sen. Journal of Mountain Science, 2019, No. 1, pp. 77-94 (18 pages)
The relatively rapid recession of glaciers in the Himalayas and the formation of moraine-dammed glacial lakes (MDGLs) in the recent past have increased the risk of glacier lake outburst floods (GLOF) in Nepal and Bhutan and in the mountainous territory of Sikkim in India. As a product of climate change and global warming, such risk has not only raised the level of threats to the habitation and infrastructure of the region, but has also contributed to the worsening balance of the unique ecosystem of this domain, which sustains several of the highest mountain peaks of the world. This study presents an up-to-date mapping of the MDGLs in the central and eastern Himalayan regions using remote sensing data, with the objective of analysing their surface area variations from 1990 through 2015, disaggregated over six episodes. The study also evaluates the susceptibility of MDGLs to GLOF with least criteria decision analysis (LCDA). Forty-two major MDGLs, each having a lake surface area greater than 0.2 km², identified in the Himalayan ranges of Nepal, Bhutan, and Sikkim, have been categorized according to their surface area expansion rates in space and time. The lakes lie within the elevation range of 3800 m to 6800 m above mean sea level (amsl). With a total surface area of 37.9 km², these MDGLs as a whole were observed to have expanded by an astonishing 43.6% in area over the 25-year period of this study. A factor is introduced to numerically sort the lakes in terms of their relative yearly expansion rates, based on the interpretation of their surface area extents from satellite imagery. Verification of predicted GLOF events in the past using this factor against the limited field data reported in the literature indicates that the present analysis may be considered a sufficiently reliable and rapid technique for assessing the potential bursting susceptibility of the MDGLs. The analysis also indicates that, as of now, there are eight MDGLs in the region which appear to be in highly vulnerable states and have a high chance of causing potential GLOF events in the near future.
Keywords: glacier retreat; lakes mapping; moraine-dammed glacial lake (MDGL); surface area change of lakes; Landsat imagery data; least criteria decision analysis (LCDA)
Document delivery
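A relative yearly expansion rate, the kind of sorting factor the abstract describes, follows directly from two area observations; the sketch below uses the study's aggregate figures (43.6% growth over 25 years) purely as an illustration, not any individual lake's data:

```python
def relative_yearly_expansion(area_start, area_end, years):
    """Fractional area growth per year, relative to the starting area."""
    return (area_end - area_start) / (area_start * years)

# Aggregate MDGL area grew 43.6% over 1990-2015 (25 years);
# normalizing the start area to 1.0 gives the aggregate yearly rate.
rate = relative_yearly_expansion(1.0, 1.436, 25)
```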
14. Constructing Large Scale Cohort for Clinical Study on Heart Failure with Electronic Health Record in Regional Healthcare Platform: Challenges and Strategies in Data Reuse (Cited: 2)
Authors: Daowen Liu, Liqi Lei, Tong Ruan, Ping He. Chinese Medical Sciences Journal, 2019, No. 2, pp. 90-102 (13 pages)
Regional healthcare platforms collect clinical data from hospitals in specific areas for the purpose of healthcare management. It is a common requirement to reuse these data for clinical research. However, we have to face challenges like the inconsistency of terminology in electronic health records (EHR) and the complexities of data quality and data formats on a regional healthcare platform. In this paper, we propose a methodology and process for constructing large-scale cohorts, which form the basis of causality and comparative-effectiveness relationships in epidemiology. We first constructed a Chinese terminology knowledge graph to deal with the diversity of vocabularies on the regional platform. Second, we built special disease case repositories (i.e., a heart failure repository) that utilize the graph to search for related patients and to normalize the data. Based on the requirements of clinical research aimed at exploring the effectiveness of taking statins on 180-day readmission in patients with heart failure, we built a large-scale retrospective cohort of 29,647 heart failure patients from the heart failure repository. After propensity score matching, a study group (n=6346) and a control group (n=6346) with parallel clinical characteristics were acquired. Logistic regression analysis showed that taking statins had a negative correlation with 180-day readmission in heart failure patients. This paper presents the workflow and an application example of big data mining based on regional EHR data.
Keywords: electronic health records; clinical terminology; knowledge graph; clinical special disease case repository; evaluation of data quality; large-scale cohort study
Not subscribed
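The cohort's 1:1 propensity score matching step can be sketched as greedy nearest-neighbour pairing within a caliper (the scores below are made up; in the study they would first be estimated from patient covariates):

```python
def greedy_match(treated, control, caliper=0.05):
    """Pair each treated score with the nearest unused control score
    within the caliper; returns (treated_idx, control_idx) pairs."""
    pairs, used = [], set()
    for i, t in enumerate(treated):
        best, best_d = None, caliper
        for j, c in enumerate(control):
            if j in used:
                continue
            d = abs(t - c)
            if d <= best_d:
                best, best_d = j, d
        if best is not None:
            pairs.append((i, best))
            used.add(best)
    return pairs

treated = [0.30, 0.62]          # statin group propensity scores (toy)
control = [0.29, 0.55, 0.61]    # candidate controls (toy)
pairs = greedy_match(treated, control)
```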
Action Recognition via Shallow CNNs on Intelligently Selected Motion Data
15
作者 Jalees Ur Rahman Muhammad Hanif +2 位作者 Usman Haider Saeed Mian Qaisar Sarra Ayouni 《Computers, Materials & Continua》 2026年第3期2223-2243,共21页
Deep neural networks have achieved excellent classification results on several computer vision benchmarks.This has led to the popularity of machine learning as a service,where trained algorithms are hosted on the clou... Deep neural networks have achieved excellent classification results on several computer vision benchmarks.This has led to the popularity of machine learning as a service,where trained algorithms are hosted on the cloud and inference can be obtained on real-world data.In most applications,it is important to compress the vision data due to the enormous bandwidth and memory requirements.Video codecs exploit spatial and temporal correlations to achieve high compression ratios,but they are computationally expensive.This work computes the motion fields between consecutive frames to facilitate the efficient classification of videos.However,contrary to the normal practice of reconstructing the full-resolution frames through motion compensation,this work proposes to infer the class label from the block-based computed motion fields directly.Motion fields are a richer and more complex representation of motion vectors,where each motion vector carries the magnitude and direction information.This approach has two advantages:the cost of motion compensation and video decoding is avoided,and the dimensions of the input signal are highly reduced.This results in a shallower network for classification.The neural network can be trained using motion vectors in two ways:complex representations and magnitude-direction pairs.The proposed work trains a convolutional neural network on the direction and magnitude tensors of the motion fields.Our experimental results show 20×faster convergence during training,reduced overfitting,and accelerated inference on a hand gesture recognition dataset compared to full-resolution and downsampled frames.We validate the proposed methodology on the HGds dataset,achieving a testing accuracy of 99.21%,on the HMDB51 dataset,achieving 
82.54%accuracy,and on the UCF101 dataset,achieving 97.13%accuracy,outperforming state-of-the-art methods in computational efficiency. 展开更多
关键词 Action recognition block matching algorithm convolutional neural network deep learning data compression motion fields optimization videos classification
16. AI-driven integration of multi-omics and multimodal data for precision medicine
Authors: Heng-Rui Liu. Medical Data Mining, 2026, Issue 1, pp. 1-2.
High-throughput transcriptomics has evolved from bulk RNA-seq to single-cell and spatial profiling, yet its clinical translation still depends on effective integration across diverse omics and data modalities. Emerging foundation models and multimodal learning frameworks are enabling scalable and transferable representations of cellular states, while advances in interpretability and real-world data integration are bridging the gap between discovery and clinical application. This paper outlines a concise roadmap for AI-driven, transcriptome-centered multi-omics integration in precision medicine (Figure 1).
Keywords: high-throughput transcriptomics; multi-omics; single-cell; multimodal learning frameworks; foundation models; omics data modalities; AI-driven precision medicine
17. Cosmic Acceleration and the Hubble Tension from Baryon Acoustic Oscillation Data
Authors: Xuchen Lu, Shengqing Gao, Yungui Gong. Chinese Physics Letters, 2026, Issue 1, pp. 327-332.
We investigate null tests of cosmic accelerated expansion using the baryon acoustic oscillation (BAO) data measured by the Dark Energy Spectroscopic Instrument (DESI), and reconstruct the dimensionless Hubble parameter E(z) from the DESI BAO Alcock-Paczynski (AP) data using a Gaussian process to perform the null test. We find strong evidence of accelerated expansion from the DESI BAO AP data. By reconstructing the deceleration parameter q(z) from the DESI BAO AP data, we find that accelerated expansion persisted until z ≈ 0.7 at the 99.7% confidence level. Additionally, to provide insight into the Hubble tension problem, we propose combining the reconstructed E(z) with D_H/r_d data to derive a model-independent result r_d h = 99.8 ± 3.1 Mpc. This result is consistent with measurements from cosmic microwave background (CMB) anisotropies using the ΛCDM model. We also propose a model-independent method for reconstructing the comoving angular diameter distance D_M(z) from the distance modulus μ using SNe Ia data, and combine this result with the DESI BAO D_M/r_d data to constrain the value of r_d. We find that the value of r_d derived from this model-independent method is smaller than that obtained from CMB measurements, with a significant discrepancy of at least 4.17σ. All conclusions drawn in this paper are independent of cosmological models and gravitational theories.
Keywords: baryon acoustic oscillation; BAO data; cosmic accelerated expansion; dimensionless Hubble parameter; deceleration parameter; null test; Gaussian process
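The Gaussian-process reconstruction of E(z) described in this abstract can be sketched with a minimal numpy-only regression. The data points, error bars, and kernel hyperparameters below are illustrative assumptions (a synthetic flat ΛCDM fiducial), not the actual DESI BAO AP measurements, and the hyperparameters are fixed rather than optimized as a real analysis would do.

```python
import numpy as np

def rbf(x1, x2, ell=0.5, sf=2.0):
    """Squared-exponential (RBF) covariance between two 1-D point sets."""
    d = x1[:, None] - x2[None, :]
    return sf ** 2 * np.exp(-0.5 * (d / ell) ** 2)

def gp_reconstruct(z_obs, y_obs, sigma, z_grid, ell=0.5, sf=2.0):
    """Minimal Gaussian-process regression: condition the GP
    (after subtracting the sample mean) on noisy observations."""
    K = rbf(z_obs, z_obs, ell, sf) + np.diag(sigma ** 2)
    Ks = rbf(z_grid, z_obs, ell, sf)
    w = np.linalg.solve(K, y_obs - y_obs.mean())
    mean = y_obs.mean() + Ks @ w
    var = np.diag(rbf(z_grid, z_grid, ell, sf) - Ks @ np.linalg.solve(K, Ks.T))
    return mean, np.sqrt(np.clip(var, 0.0, None))

# Synthetic E(z) "data" from a flat LambdaCDM fiducial (Omega_m = 0.3);
# placeholders, NOT the actual DESI measurements.
z_obs = np.array([0.30, 0.51, 0.71, 0.93, 1.32, 1.49, 2.33])
sigma = np.full_like(z_obs, 0.05)
rng = np.random.default_rng(0)
E_obs = np.sqrt(0.3 * (1 + z_obs) ** 3 + 0.7) + rng.normal(0, 0.05, z_obs.size)

z_grid = np.linspace(0.0, 2.5, 51)
E_mean, E_std = gp_reconstruct(z_obs, E_obs, sigma, z_grid)
```

The reconstructed E(z) and its uncertainty band are what a null test would then differentiate to obtain the deceleration parameter q(z).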
18. Characterizing Potential Fishing Zone of Skipjack Tuna during the Southeast Monsoon in the Bone Bay-Flores Sea Using Remotely Sensed Oceanographic Data (Cited by 3)
Authors: Mukti Zainuddin, Alfa Nelwan, Siti Aisjah Farhum, Najamuddin, Muhammad A. Ibnu Hajar, Muhammad Kurnia, Sudirman. International Journal of Geosciences, 2013, Issue 1, pp. 259-266.
Potential fishing zones for skipjack tuna in the Bone Bay-Flores Sea were investigated from satellite-based oceanography and catch data, using a linear model (generalized linear model) constructed from generalized additive models and geographic information systems. Monthly mean remotely sensed sea surface temperature and surface chlorophyll-a concentration during the southeast monsoon (April-August) were used for the year 2012. The best generalized additive model was selected to assess the effect of marine environmental variables (sea surface temperature and chlorophyll-a concentration) on skipjack tuna abundance (catch per unit effort). The appropriate linear model was then constructed from the functional relationship of the generalized additive model to generate a robust predictive model. Model selection for the generalized additive model was based on the significance of model terms, decrease in residual deviance, and increase in cumulative variance explained, whereas selection for the linear model was based on decrease in residual deviance, reduction in Akaike's Information Criterion, increase in cumulative variance explained, and significance of model terms. The best model was selected to predict skipjack tuna abundance and its spatial distribution pattern over the entire study area. A simple linear model was used to verify the predicted values. Results indicated that the distribution of potential fishing zones for skipjack during the southeast monsoon was well characterized by sea surface temperatures ranging from 28.5°C to 30.5°C and chlorophyll-a ranging from 0.10 to 0.20 mg·m⁻³. The predicted highest catch per unit efforts were significantly consistent with the fishing data (R² = 0.8), suggesting that the oceanographic indicators correspond well with the potential feeding ground for skipjack tuna. This good feeding opportunity for skipjack was driven by the dynamics of upwelling operating within the study area, which is capable of creating a highly potential fishing zone during the southeast monsoon.
Keywords: skipjack tuna; satellite data; generalized additive model; linear model; upwelling; potential fishing zones; Bone Bay and Flores Sea; southeast monsoon
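The workflow in this abstract, relating catch per unit effort (CPUE) to sea surface temperature and chlorophyll-a and then flagging favourable cells, can be sketched with a simple linear model. The data below are synthetic and the coefficients are invented for illustration; this is a stand-in for the paper's GAM-informed GLM, not its actual fit.

```python
import numpy as np

# Synthetic grid cells (NOT the paper's data): SST in degC, chl-a in mg/m^3,
# with CPUE generated from an assumed linear relationship plus noise.
rng = np.random.default_rng(1)
sst = rng.uniform(27.0, 31.5, 200)
chl = rng.uniform(0.05, 0.30, 200)
cpue = 2.0 + 0.8 * sst + 15.0 * chl + rng.normal(0, 0.5, 200)

# Ordinary least squares fit of CPUE on SST and chl-a
X = np.column_stack([np.ones_like(sst), sst, chl])
beta, *_ = np.linalg.lstsq(X, cpue, rcond=None)
pred = X @ beta

# Potential fishing zone mask using the ranges reported in the abstract:
# 28.5-30.5 degC SST and 0.10-0.20 mg/m^3 chlorophyll-a
pfz = (sst >= 28.5) & (sst <= 30.5) & (chl >= 0.10) & (chl <= 0.20)
```

In the paper the smooth terms come from a generalized additive model first; the linear model above only mirrors the final predictive step.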
19. Paving the High-Way to Sustainable, Value Adding Open-Innovation Integrating Bigger-Data Challenges: Three Examples from Bio-Ingredients to Robust Durable Applications of Electrochemical Impacts (Cited by 1)
Authors: Salvadora Ortega-Requena, Serge Rebouillat, Fernand Pla. Journal of Biomaterials and Nanobiotechnology, 2018, Issue 2, pp. 117-188.
A trilogy review, based on more than 300 references, is used to underline three challenges facing 1) the supply of sustainable, durable, and protected biosourced ingredients such as lipids, 2) the accounting for valuable bio-by-products, such as whey proteins, whose added-value potential removes their environmental weight, and 3) practical, reliable synthetic biology and evolutionary engineering that already serve as a technology and science basis to expand from, such as for biopolymer growth. Bioresources, the major topic of this review, must provide answers to several major challenges related to health, food, energy, and the chemistry of tomorrow. They offer a wide range of ingredients available in trees, plants, grasses, vegetables, algae, milk, food wastes, animal manures, and other organic wastes. Research in this domain must be oriented towards a bio-sustainable economy based on new valuations of the potential of these renewable biological resources. This aims at the substitution of fossil raw materials with renewable raw materials to ensure the sustainability of industrial processes by providing bioproducts through innovative processes using, for instance, micro-organisms and enzymes (so-called white biotechnology). The final objective is to manufacture high value-added products endowed with the right set of physical, chemical, and biological properties, leading to particularly innovative applications. In this review, three examples are considered in a green-context, open-innovation, and bigger-data environment. Two of them (lipid antioxidants and milk proteins) concern the food industry, while the third (biomonomers and the corresponding bioplastics and derivatives) relates to the biomaterials industry.
Lipids play a crucial role in the food industry, but they are chemically unstable and very sensitive to atmospheric oxidation, which leads to the formation of numerous by-compounds with adverse effects on lipid quality attributes and on the nutritive value of meat. To overcome this problem, natural antioxidants with a positive impact on the safety and acceptability of the food system have been discovered and evaluated. In the same context, milk proteins and their derivatives are of great interest. They can be modified by enzymatic means, leading to the formation of by-products able to increase their functionality and possible applications. They can also produce bioactive peptides, a field with almost unlimited research potential. On the other hand, biosourced chemicals and materials, mainly biomonomers and biopolymers, are already produced today. Metabolic engineering tools and strategies to engineer synthetic enzyme pathways are being developed to manufacture, from renewable feedstocks and with high yields, a number of monomer building-block chemicals that can be used to produce replacements for many conventional plastic materials. Through these three examples, this review aims to highlight recent and important advances in the production, modification, and application of the studied bioproducts. Bigger-data analysis and artificial intelligence may help reweight practical and theoretical observations and concepts in these fields, helping to cross the borders of traditional expert exploration fields and sometimes fortresses.
Keywords: bio; green sustainability; bigger data; biomimetic; artificial intelligence; synthetic biology; lipid oxidation; antioxidants; milk protein; whey; biopolymers; electrochemical; conductive
20. Source complexity of the 2016 MW7.8 Kaikoura (New Zealand) earthquake revealed from teleseismic and InSAR data (Cited by 4)
Authors: HaiLin Du, Xu Zhang, LiSheng Xu, WanPeng Feng, Lei Yi, Peng Li. Earth and Planetary Physics, 2018, Issue 4, pp. 310-326.
On November 13, 2016, an MW7.8 earthquake struck Kaikoura in the South Island of New Zealand. By means of back-projection of array recordings, ASTFs-analysis of global seismic recordings, and joint inversion of global seismic data and co-seismic InSAR data, we investigated the complexity of the earthquake source. The results show that the 2016 MW7.8 Kaikoura earthquake ruptured for about 100 s unilaterally from south to northeast (~N28°-33°E), producing a rupture area about 160 km long and about 50 km wide and releasing a scalar moment of 1.01×10²¹ N·m. In particular, the rupture area consisted of two slip asperities: one close to the initial rupture point with a maximal slip value of ~6.9 m, and the other far away in the northeast with a maximal slip value of ~9.3 m. The first asperity slipped for about 65 s, and the second one started 40 s after the first had initiated; the two slipped simultaneously for about 25 s. Furthermore, the first had a nearly pure thrust slip while the second had both thrust and strike slip. Interestingly, the rupture velocity was not constant, and the whole process may be divided into 5 stages in which the velocities were estimated to be 1.4 km/s, 0 km/s, 2.1 km/s, 0 km/s, and 1.1 km/s, respectively. The high-frequency sources were distributed nearly along the lower edge of the rupture area, the high-frequency radiation mainly occurred at the launching of the asperities, and it seems that no high-frequency energy was radiated when the rupture was about to stop.
Keywords: 2016 MW7.8 Kaikoura earthquake; back-projection of array recordings; ASTFs-analysis of global recordings; joint inversion of teleseismic and InSAR data; source complexity