Funding: supported by the National Key R&D Program of China, Project No. 2020YFB2104402.
Abstract: Breast cancer is the most common cancer among women worldwide, posing significant diagnostic challenges. Traditional diagnostic techniques, while foundational, often lack precision and fail to provide clear insights into their decision-making processes. This limitation underscores the need for advanced diagnostic tools that enhance both accuracy and interpretability. This study integrates deep learning models with Gradient-weighted Class Activation Mapping (Grad-CAM) to improve the accuracy and transparency of breast cancer diagnosis through mammographic analysis. We propose robust approaches using MobileNet, Xception, and DenseNet models, enhanced with Grad-CAM, to analyze mammogram images. This integration facilitates a deeper understanding of model decisions, highlighting critical diagnostic features through visual explanations. The models were rigorously tested on the MIAS dataset to evaluate their diagnostic performance and reliability, achieving a diagnostic accuracy of 94.17% and demonstrating superior performance compared to traditional methods. The findings show significant potential for clinical application, promising to enhance patient outcomes through more accurate and transparent diagnostic practices in oncology.
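The core of Grad-CAM, as used above to visualize which mammogram regions drive a prediction, is a weighted sum of the final convolutional layer's feature maps, where each channel's weight is the global average of the class score's gradient with respect to that channel. A minimal NumPy sketch of that computation (the function name and array shapes are illustrative, not from the paper; in practice the activations and gradients would come from a framework such as TensorFlow or PyTorch):

```python
import numpy as np

def grad_cam(feature_maps, gradients):
    """Compute a Grad-CAM heatmap.

    feature_maps: (H, W, K) activations of the last conv layer.
    gradients:    (H, W, K) gradients of the class score w.r.t.
                  those activations.
    Returns an (H, W) heatmap normalized to [0, 1].
    """
    # Channel importance weights: global-average-pool the gradients.
    weights = gradients.mean(axis=(0, 1))                  # shape (K,)
    # Weighted combination of feature maps, then ReLU to keep only
    # regions with a positive influence on the class score.
    cam = np.maximum((feature_maps * weights).sum(axis=-1), 0.0)
    # Normalize for overlay on the input image.
    if cam.max() > 0:
        cam = cam / cam.max()
    return cam
```

The resulting low-resolution heatmap is typically upsampled to the input image size and overlaid as a color map, which is how the "visual explanations" described in the abstract are produced.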
Abstract: Skin cancer is a serious and potentially life-threatening disease that affects millions of people worldwide. Early detection and accurate diagnosis are critical for successful treatment and improved patient outcomes. In recent years, deep learning has emerged as a powerful tool for medical image analysis, including the diagnosis of skin cancer. The importance of deep learning in diagnosing skin cancer lies in its ability to analyze large amounts of data quickly and accurately, which can help doctors make more informed decisions about patient care and improve overall outcomes. Additionally, deep learning models can be trained to recognize subtle patterns and features that may not be visible to the human eye, leading to earlier detection and more effective treatment. In this study, the pre-trained Visual Geometry Group 16 (VGG16) architecture was used to classify skin cancer images, and the images were converted into other color spaces for evaluation, namely: 1) Hue Saturation Value (HSV), 2) YCbCr, and 3) Grayscale. Results show that the dataset created with RGB and YCbCr images under field conditions was promising, with a classification accuracy of 84.242%. The dataset was also evaluated with other popular architectures for comparison, and the performance of VGG16 with images in each color space is analyzed. In addition, feature parameters were extracted from the different layers, and the extracted features were fed to VGG16 to evaluate their ability to classify the disease.
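The color-space conversion step above can be illustrated with the RGB-to-YCbCr transform, the best-performing space alongside RGB in the study. The abstract does not state which conversion convention was used; the sketch below assumes the common full-range ITU-R BT.601 coefficients (the same convention as JPEG, and as OpenCV's `cv2.COLOR_RGB2YCrCb` up to channel order):

```python
import numpy as np

def rgb_to_ycbcr(img):
    """Convert an RGB image (uint8, H x W x 3) to full-range YCbCr
    using ITU-R BT.601 luma coefficients."""
    img = img.astype(np.float64)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    y  = 0.299 * r + 0.587 * g + 0.114 * b                 # luminance
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b     # blue-difference chroma
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b     # red-difference chroma
    out = np.round(np.stack([y, cb, cr], axis=-1))
    return np.clip(out, 0, 255).astype(np.uint8)
```

Because the converted image still has three channels, it can be passed to a pre-trained VGG16 without architectural changes; separating luminance from chrominance in this way is a plausible reason YCbCr inputs performed competitively with RGB.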