
Transforming Tabular Data For Multi-modality: Enhancing Breast Cancer Metastasis Prediction Through Data Conversion

Citation Author(s):
Faseela Abdullakutty, Younes Akbari, Somaya Al-maadeed, Ahmed Bouridane, Rifat Hamoudi
Submitted by:
Faseela Abdullakutty
Last updated:
1 December 2024 - 10:14am
Document Type:
Research Manuscript
Document Year:
2024
Event:
Presenters:
Faseela Abdullakutty
Paper Code:
ICIP BID-2841
 

Breast cancer metastasis prediction plays a key role in clinical decision-making and secondary analysis. Traditionally, metastasis classification models have been built on structured tabular clinical data, but such approaches can lose information and lack context. This article presents a multi-modal approach to predicting breast cancer metastasis: structured clinical data are first converted into unstructured text, which carries richer contextual information, and that text is then used to generate histopathology images.
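The abstract does not spell out the conversion template, but the idea of serializing a tabular record into descriptive text can be sketched as follows; the column names in this example are hypothetical and not taken from the study's dataset.

def record_to_text(record: dict) -> str:
    """Serialize one tabular clinical record into a descriptive sentence."""
    parts = [f"{column.replace('_', ' ')} is {value}" for column, value in record.items()]
    return "The patient's " + "; ".join(parts) + "."

# Hypothetical fields for illustration; real column names depend on the dataset used.
sample = {"age": 54, "tumor_size_mm": 22, "positive_lymph_nodes": 3, "er_status": "positive"}
print(record_to_text(sample))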
For text classification, features were extracted from the generated text and the underlying language model was fine-tuned. Classified with Logistic Regression and XGBoost, these features showed improved performance, and fine-tuning further increased the accuracy of metastasis detection.
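A minimal sketch of this text branch, assuming BERT [CLS] embeddings as the extracted features and off-the-shelf Logistic Regression and XGBoost classifiers; the checkpoint name, pooling choice, and placeholder texts are illustrative rather than the authors' exact configuration.

import numpy as np
import torch
from transformers import AutoTokenizer, AutoModel
from sklearn.linear_model import LogisticRegression
from xgboost import XGBClassifier

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased").eval()

def embed(texts):
    """Return one [CLS] feature vector per input text."""
    with torch.no_grad():
        batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
        out = bert(**batch)
    return out.last_hidden_state[:, 0, :].numpy()

# Placeholder texts and labels; in practice these come from the converted records.
texts = ["tumor size is 22 mm; 3 positive lymph nodes",
         "tumor size is 9 mm; no positive lymph nodes"]
labels = np.array([1, 0])

features = embed(texts)
lr = LogisticRegression(max_iter=1000).fit(features, labels)
xgb = XGBClassifier(n_estimators=200).fit(features, labels)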
To address the multi-modality gap, a pre-trained diffusion model was used to generate histopathology images from the same clinical data.
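The abstract does not name the diffusion model, so the sketch below stands in with Hugging Face diffusers and a Stable Diffusion checkpoint as an assumed example; the prompt is the unstructured text derived from a clinical record.

import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # placeholder checkpoint, not necessarily the one used in the paper
    torch_dtype=torch.float16,
).to("cuda")

prompt = ("Histopathology slide of breast tissue; "
          "tumor size is 22 mm with 3 positive lymph nodes")
image = pipe(prompt, num_inference_steps=30).images[0]
image.save("generated_histopathology.png")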
Classifying the features extracted from these images with pre-trained vision models such as VGG-16 and ViT gave results comparable to traditional tabular predictions.
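Feature extraction from the generated images can be sketched with torchvision's pre-trained VGG-16 and ViT-B/16 backbones, with their classification heads replaced by identity layers; the exact checkpoints, preprocessing, and input file name (reused from the previous sketch) are assumptions rather than the paper's reported setup.

import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

vgg = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
vgg.classifier = nn.Identity()   # keep the 25088-d flattened convolutional features
vit = models.vit_b_16(weights=models.ViT_B_16_Weights.IMAGENET1K_V1)
vit.heads = nn.Identity()        # keep the 768-d class-token features
vgg.eval(); vit.eval()

img = preprocess(Image.open("generated_histopathology.png").convert("RGB")).unsqueeze(0)
with torch.no_grad():
    vgg_features = vgg(img)   # shape (1, 25088)
    vit_features = vit(img)   # shape (1, 768)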
A multi-modal early-fusion approach was then built by combining the vision-derived features with the BERT text features.
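Early fusion here amounts to concatenating the two feature sets before a single classifier. The sketch below uses random arrays as stand-ins for the real BERT and vision embeddings, and a Logistic Regression head as one plausible choice of fused classifier.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Placeholder arrays standing in for the real per-patient features:
# 768-d BERT text embeddings and 768-d ViT image embeddings.
rng = np.random.default_rng(0)
text_features = rng.normal(size=(100, 768))
image_features = rng.normal(size=(100, 768))
labels = rng.integers(0, 2, size=100)

# Early fusion: concatenate the modalities, then train one classifier.
fused = np.concatenate([text_features, image_features], axis=1)
clf = LogisticRegression(max_iter=1000).fit(fused, labels)
print(clf.predict_proba(fused)[:, 1][:5])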
Using unstructured text and histopathology images in this way can effectively address the limitations of multi-modal data, providing a context-rich approach to breast cancer metastasis prediction and a promising alternative for future research.
