Written by: Russ Nelson
February 5, 2025

Dr. Keka Biswas, a lecturer in the UAH Department of Biological Sciences. Michael Mercier | UAH

One of the most agonizing experiences a cancer patient suffers is waiting without knowing: waiting for a diagnosis, waiting to get test results back, waiting to learn the outcome of treatment protocols. Now a researcher at The University of Alabama in Huntsville (UAH) is co-author of a study that examines the use of artificial intelligence (AI) and neural networks to significantly cut the time required for medical professionals to classify lesions in breast cancer ultrasound images. Accurate classification is pivotal for early diagnosis and treatment, and a deep-learning (DL) approach can effectively represent and utilize the digital content of images for more precise and rapid medical image analysis.

Dr. Keka Biswas is a lecturer of biological sciences at UAH, a part of The University of Alabama System. Her current focus is examining human cancerous cells in a collaborative approach with biologists, mathematicians and statisticians in the field of mathematical biology. Recent advances in AI and medical imaging have led to the widespread use of deep-learning technology, particularly in image processing and classification.

"Deep learning is a subfield of machine learning that employs neural network-based models to imitate the human brain's capacity to analyze huge amounts of complicated data in areas such as image recognition," Biswas explains. "The applications include cancer subtype discovery, text classification, medical imaging, etc. With AI, you can actually use these advances during surgery to see what stage the cancer is in, and the imaging of it is a much faster turnaround time."

Breast ultrasound imaging is useful for detecting and distinguishing benign masses from malignant ones, but interpreting imaging reporting and data system features is difficult and time consuming for practitioners and radiologists. The study investigated the relationship between breast cancer imaging features and the need for rapid classification and analysis of precise medical images of breast lesions.

"While progress has been made in the diagnosis, prognosis and treatment of cancer patients, individualized and data-driven care remains a challenge," the researcher notes. "AI has been used to predict and automate many cancers and has emerged as a promising option for improving health care accuracy and patient outcomes."

AI applications in oncology include risk assessment, early diagnosis, patient prognosis estimation and treatment selection, based on deep-learning knowledge. Biswas' study investigates the relationship between breast cancer imaging features and the roles of inter- and extra-lesional tissues and their impact on refining the performance of deep-learning classification. These advances are all predicated on the different ways benign and malignant tumors affect neighboring tissues, such as the pattern of growth and border irregularities, the degree of penetration into the adjacent tissue and tissue-level changes.

"This is where deep learning comes in, looking into the deeper tissues and the outside tissue, and that can give us datasets to work with," Biswas says.

Working to speed up hope

Researchers use AI and pre-training to obtain "neural network datasets," collections of data used to train a neural network, a type of machine-learning algorithm inspired by the human brain. The training employs labeled examples that the network learns from to identify patterns and make predictions on new data.
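As a rough picture of what learning from labeled examples looks like in code, the sketch below builds a small image classifier in PyTorch and runs one training pass over a folder of ultrasound images labeled benign or malignant. The folder layout, network and settings are illustrative assumptions, not the study's actual pipeline.

```python
# Illustrative sketch only: assumes a hypothetical folder layout
# (ultrasound/benign/*.png, ultrasound/malignant/*.png) and a tiny classifier.
import torch
from torch import nn
from torchvision import datasets, transforms

# Each image is paired with a label (0 = benign, 1 = malignant) taken from its folder name.
data = datasets.ImageFolder(
    "ultrasound/",
    transform=transforms.Compose([
        transforms.Grayscale(num_output_channels=3),  # ultrasound frames are grayscale
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
    ]),
)
loader = torch.utils.data.DataLoader(data, batch_size=16, shuffle=True)

# A small convolutional network; training nudges its weights so predictions match the labels.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
    nn.Flatten(), nn.Linear(32, 2),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

for images, labels in loader:             # one pass over the labeled examples
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()                       # learn from the gap between prediction and label
    optimizer.step()
```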
A model is first trained on a dataset to learn broad features and patterns, which then serves as a "pre-trained" model that can be further fine-tuned on smaller, task-specific datasets to achieve better performance for a particular research goal. In essence, the knowledge gained from the large pre-training dataset is leveraged to make training on smaller, specialized datasets more efficient.

In 2023, Biswas took a serendipitous trip to a conference in South Africa, where she met Dr. Luminita Moraru, a professor at the University of Galati in Romania who shares a similar research focus while approaching the challenge with different tools.

"I met Dr. Moraru, who was from a department of chemistry at her university where she has a modeling and simulation lab," Biswas says. "But she didn't have the biological or anatomical background for this kind of research."

By teaming up, the two researchers found they could complement one another's skillsets to delve much deeper into data challenges like these.

"Then, about the same time, one of my friends was diagnosed with breast cancer, and she had given up hope," Biswas says. "It affected her tremendously. The thing that was scary was how long it was taking to go through all the pathological testing."

Events soon took an even more personal turn for the UAH educator.

"This summer [2024], I actually experienced this all for myself," the researcher says. "I had gone in for a routine exam and told the doctor what my symptoms were, and she said let's run a biopsy. Two days later I got a call, you need to come in, and I knew something was wrong. You have cancer, my doctor said. Do you know which stage it is in? I asked. She couldn't tell me a stage, even with the pathological tests that had been done. I have a child, a family. I needed to know which stage it is in, what the treatment is going to be. That took another three to four weeks for the results to come in. The delay was the major thing: do I need surgery? How long will it take?

"It all was a much longer process. My oncologist was very frustrated," the researcher says. "I had a diagnosis, but none of the imaging was telling us what stage it was in, whether it had metastasized or not. I needed to be aware of how soon I could recover."

Beating the waiting game

Machine-learning algorithms use data such as investigations performed, scans conducted, patients' medical histories and other information to forecast or diagnose a cancerous condition from biopsies and to help determine cancer stages.

"Initially, the model in the study was trained using more than one million well-annotated images," Biswas says. "The pre-trained models can classify images into 1,000 object categories, because they learn meaningful feature representations for a wide range of images."

Breast ultrasound imaging is most notably limited by its low sensitivity and specificity for small masses or solid tumors. In addition, for an accurate diagnosis, the number of scans required to cover the entire breast depends on breast size. On average, two or three volumes are acquired for each breast per examination, so large volumes of breast ultrasound images have to be reviewed, representing a daunting amount of time and attention for accurate disease diagnosis.

"For efficiency in time and robust classification, we focused on discrete or localized areas inside and around the breast lesions as significant attributes," Biswas says.
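As a rough illustration of the pre-train-then-fine-tune idea described above, the sketch below loads an off-the-shelf network pre-trained on a 1,000-category image database and retrains only its final layer for a two-class benign/malignant task. PyTorch, ResNet-18 and the two-class setup are assumptions made for illustration; the study's own models and data are not shown here.

```python
# Illustrative fine-tuning sketch, not the study's code.
import torch
from torch import nn
from torchvision import models

# Load a network pre-trained on ImageNet (1,000 object categories).
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pre-trained layers so their learned feature representations are reused as-is.
for param in backbone.parameters():
    param.requires_grad = False

# Swap the 1,000-way output layer for a 2-way benign/malignant classifier.
backbone.fc = nn.Linear(backbone.fc.in_features, 2)

# Only the new layer is updated while fine-tuning on the smaller, specialized ultrasound dataset.
optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def fine_tune_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One fine-tuning step on a batch of labeled lesion images."""
    optimizer.zero_grad()
    loss = loss_fn(backbone(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because most of the network stays frozen, relatively few annotated ultrasound images are needed, which is the efficiency gain the passage describes.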
"The novelty of this study lies in considering the features extracted from the tissue inside the tumor and the role of inter- and extra-lesional tissues. Based on how these were affecting the neighboring cells, three criteria were addressed: the pattern of growth and border irregularities; the degree of penetration; and tissue-level changes.

"We used specific algorithms for morphological classification," the researcher continues. "To get quicker results, we looked at the degree of machine learning that allows us to not only look on the surface, but also deeper inside. And this can be used in robotics surgery, reducing the surgery time significantly. With robotics and machine learning, you have a lot more images to look at, and it allows the surgeon to not only do the surgery, but also the diagnosis and retreatment, three things all at the same time.

"By reducing the number of false positives and negatives, DL networks provide highly accurate breast cancer detection," Biswas says. "These methods can be used to diagnose different breast cancer subtypes at various stages by analyzing clinical and test data. To address the limited number of available annotated images, various DL networks pre-trained on large image databases are now available."

Looking to the future, Biswas sees wide-ranging applications for these technologies.

"The system proposed in our study outperforms the existing breast cancer detection algorithms reported in the literature and will facilitate this challenging and time-consuming classification task. In future work, the proposed system could be effective in assisting in the diagnosis of other cancerous lesions in colorectal, endometrial and melanoma tumors as well," the researcher concludes. "Additionally, various AI-powered data generators could be used to overcome the shortage of datasets."

Learn More

College of Science
Department of Biological Sciences

Contact

Kristina Hendrix
256-824-6341
kristina.hendrix@uah.edu

Julie Jansen
256-824-6926
julie.jansen@uah.edu