A Multi-Modal Deep Learning Framework Combining Histopathological Imaging and Gene Expression for Automated Cancer Detection

Authors

  • Kantilal P Rane, Prof. (Dr.) Chandra Kumar Dixit, Dr. Sreemoy

Keywords

Multi-Colony Optimization, Multi-Modal Learning, CNN, Gene Expression, Feature Fusion, Heterogeneous Ant Colonies.

Abstract

Accurate cancer subtype diagnosis is essential for personalized medicine and is conventionally performed with either histopathological imaging or genetic assessment alone. Either approach in isolation is limited: imaging-based diagnosis neglects the underlying genetic components, while histological assessment relies on subjective visual interpretation without genetic confirmation. Here, a Multi-Modal Deep Learning Framework is established that analyzes histopathological images with Convolutional Neural Networks (CNNs) and gene expression profiles with a Multi-Layer Perceptron (MLP), while Cooperative Multi-Colony Ant Optimization (MCO) manages the high dimensionality of both data streams. The framework is the first to employ a cooperative multi-colony scheme in which heterogeneous colonies operate independently yet collaborate throughout the optimization process: one colony performs spatial feature selection for the CNN, a second performs genomic feature selection for the MLP, and a third "Master Colony" evaluates both to optimally tune the fused layer. Results on TCGA-based multimodal datasets (lung, breast, and colorectal cancer) indicate that the cooperative multi-colony approach outperforms single-colony and single-modality counterparts in both convergence time and classification accuracy.
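To make the cooperative multi-colony idea concrete, the sketch below shows a minimal, illustrative version of ant-colony feature selection with two modality-specific colonies and a simple fusion step standing in for the "Master Colony". It is not the authors' implementation: the fitness function (sum of per-feature relevance scores), the pheromone update rule, and all parameter values (`n_ants`, `n_iters`, `rho`) are assumptions chosen for clarity.

```python
import random

def colony_select(scores, n_select, n_ants=20, n_iters=30, rho=0.1, seed=0):
    """One ant colony selecting n_select features from a list of
    per-feature relevance scores (hypothetical fitness = sum of scores)."""
    rng = random.Random(seed)
    n = len(scores)
    tau = [1.0] * n  # pheromone trail per feature
    best_subset, best_fit = None, -1.0
    for _ in range(n_iters):
        for _ in range(n_ants):
            # Each ant samples features with probability proportional to
            # pheromone * heuristic score until it has n_select distinct ones.
            weights = [tau[i] * (scores[i] + 1e-9) for i in range(n)]
            subset = set()
            while len(subset) < n_select:
                subset.add(rng.choices(range(n), weights=weights)[0])
            fit = sum(scores[i] for i in subset)
            if fit > best_fit:
                best_fit, best_subset = fit, subset
        # Evaporate all trails, then deposit pheromone on the best subset.
        tau = [(1 - rho) * t for t in tau]
        for i in best_subset:
            tau[i] += best_fit
    return sorted(best_subset), best_fit

def cooperative_mco(image_scores, gene_scores, k_img=2, k_gene=2):
    """Run one colony per modality; the 'master' step is reduced here to
    weighting each modality's fused contribution by its colony's fitness."""
    img_sel, img_fit = colony_select(image_scores, k_img, seed=1)
    gene_sel, gene_fit = colony_select(gene_scores, k_gene, seed=2)
    total = img_fit + gene_fit
    fusion_weights = (img_fit / total, gene_fit / total)
    return img_sel, gene_sel, fusion_weights
```

In the full framework the master colony would tune the fusion layer jointly with both feature subsets; the fitness-proportional weighting above only gestures at that coordination.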

Published

2025-12-04