Funding: Supported by the Outstanding Youth Science and Technology Innovation Team Project of Colleges and Universities in Hubei Province (Grant no. T201923), the Key Science and Technology Project of Jingmen (Grant nos. 2021ZDYF024, 2022ZDYF019), and the Cultivation Project of Jingchu University of Technology (Grant no. PY201904).
Abstract: Brain tumors, among the most lethal diseases and associated with low survival rates, require early detection and accurate diagnosis to enable effective treatment planning. While deep learning architectures, particularly Convolutional Neural Networks (CNNs), have shown significant performance improvements over traditional methods, they struggle to capture the subtle pathological variations between different brain tumor types. Recent attention-based models have attempted to address this by focusing on global features, but they come with high computational costs. To address these challenges, this paper introduces a novel parallel architecture, ParMamba, which uniquely integrates Convolutional Attention Patch Embedding (CAPE) and the ConvMamba block, comprising a CNN, Mamba, and a channel enhancement module, marking a significant advancement in the field. The design of the ConvMamba block enhances the model's ability to capture both local features and long-range dependencies, improving the detection of subtle differences between tumor types. The channel enhancement module refines feature interactions across channels. Additionally, CAPE is employed as a downsampling layer that extracts both local and global features, further improving classification accuracy. Experimental results on two publicly available brain tumor datasets demonstrate that ParMamba achieves classification accuracies of 99.62% and 99.35%, outperforming existing methods. Notably, ParMamba surpasses the Vision Transformer (ViT) by 1.37% in accuracy, with a throughput improvement of over 30%. These results demonstrate that ParMamba delivers superior performance while operating faster than traditional attention-based methods.
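The abstract describes the ConvMamba block as two parallel paths, a convolutional path for local features and a Mamba state-space path for long-range dependencies, whose combined output is refined by a channel enhancement module. The sketch below is a minimal, hypothetical PyTorch illustration of that parallel layout only; it is not the paper's implementation. The Mamba branch is replaced by a simple gated token-mixing stand-in, and the squeeze-and-excite-style channel enhancement, module names, and tensor shapes are all assumptions.

```python
# Hypothetical sketch of a ConvMamba-style block: parallel local (conv) and
# global (sequence) branches followed by channel enhancement. Not the paper's
# code; the Mamba state-space scan is replaced by a gated-mixing placeholder.
import torch
import torch.nn as nn


class ChannelEnhancement(nn.Module):
    """Squeeze-and-excite-style channel reweighting (assumed design)."""
    def __init__(self, dim: int, reduction: int = 4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(dim, dim // reduction),
            nn.GELU(),
            nn.Linear(dim // reduction, dim),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (B, N, C)
        w = self.fc(x.mean(dim=1))      # pool over tokens -> per-channel weights
        return x * w.unsqueeze(1)       # reweight channels


class ConvMambaBlock(nn.Module):
    """Parallel conv and sequence branches, then channel enhancement (assumed)."""
    def __init__(self, dim: int):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        # Local branch: depthwise conv over the token grid.
        self.local = nn.Conv2d(dim, dim, kernel_size=3, padding=1, groups=dim)
        # Global branch: gated linear mixing as a stand-in for the Mamba SSM.
        self.gate = nn.Linear(dim, dim)
        self.mix = nn.Linear(dim, dim)
        self.enhance = ChannelEnhancement(dim)

    def forward(self, x: torch.Tensor, h: int, w: int) -> torch.Tensor:  # x: (B, N, C)
        y = self.norm(x)
        b, n, c = y.shape
        # Local features on the 2D token grid.
        loc = self.local(y.transpose(1, 2).reshape(b, c, h, w))
        loc = loc.reshape(b, c, n).transpose(1, 2)
        # Long-range mixing (placeholder for the selective state-space branch).
        glob = torch.sigmoid(self.gate(y)) * self.mix(y)
        # Fuse branches, refine channels, and keep a residual connection.
        return x + self.enhance(loc + glob)


if __name__ == "__main__":
    blk = ConvMambaBlock(dim=64)
    tokens = torch.randn(2, 14 * 14, 64)   # batch of 2, 14x14 token grid
    print(blk(tokens, 14, 14).shape)        # torch.Size([2, 196, 64])
```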