Addressing Catastrophic Forgetting in Neural Networks: A Review and Comparative Analysis of Algorithms for Crack Detection

Authors

DOI:

https://doi.org/10.56042/jsir.v84i10.18406

Keywords:

ResNet18, surface cracks, continuous learning, catastrophic forgetting, image classification

Abstract

Deep neural networks have demonstrated impressive results across a wide range of tasks, but they suffer from catastrophic forgetting: when learning new tasks, they tend to forget what was previously learned. Catastrophic forgetting in neural networks has garnered significant attention in recent years, yet the problem is too complex to draw general conclusions independent of the network types and specific problems analyzed. This paper addresses catastrophic forgetting in neural networks applied to crack detection. Given the limited research in this area, this study highlights the existing research gap by reviewing recent work on catastrophic forgetting and the learning methods proposed to mitigate it, and conducts a comprehensive survey of these techniques. Three representative algorithms, one from each major approach, are evaluated on crack detection datasets. Finally, a comparative analysis establishes a foundation for a novel algorithm that integrates the strengths of these different approaches.

Published

02-12-2025

Issue

Section

Computer Sciences, Communication and Information Technology

How to Cite

Addressing Catastrophic Forgetting in Neural Networks: A Review and Comparative Analysis of Algorithms for Crack Detection. (2025). Journal of Scientific & Industrial Research (JSIR), 84(10), 1095-1106. https://doi.org/10.56042/jsir.v84i10.18406
