A Software Reliability Model Incorporating Delay in Correction of Fault, Single Change Point and Hazard Function in Imperfect Debugging Environment

  • Asheesh Tiwari, Vishishta Agrawal, Yash Jalan, Nikhil Pandey, Namrata Tiwari, Piyoosh Srivastav

Abstract

    Software Reliability Growth Models (SRGMs) are crucial tools for estimating the probability of failure of software under given conditions, and several such models have been proposed over time to predict faults in a system within a fixed time frame. Owing to software complexity, a detected fault may not be corrected perfectly, and correcting the original fault may introduce new faults; this phenomenon is called imperfect debugging. We have also assumed that when a fault is detected, there is some delay before it is removed. During testing, the rate of fault detection may not remain the same at all points of time and may change at some point; to account for this, a single change point is incorporated in the proposed model. Testing time and testing effort are combined using the Cobb-Douglas production function. For quantitative comparison between the proposed model and previously existing models, the Mean Square Error (MSE) is calculated.
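
    The ingredients named in the abstract can be sketched in a few lines. This is a minimal illustration only: the exponent `alpha`, the exponential mean-value function, and the sample data are assumptions for demonstration, not the paper's actual parameterization.

    ```python
    import math

    def cobb_douglas(time, effort, alpha=0.5):
        """Fuse testing time and testing effort into a single resource
        measure via the Cobb-Douglas production function:
        tau = time**alpha * effort**(1 - alpha)."""
        return (time ** alpha) * (effort ** (1.0 - alpha))

    def mean_value(tau, a=100.0, b=0.1):
        """A simple exponential mean-value function (expected cumulative
        faults), used here only as a stand-in for the paper's model."""
        return a * (1.0 - math.exp(-b * tau))

    def mse(observed, predicted):
        """Mean Square Error between observed and predicted fault counts,
        the comparison criterion the abstract mentions."""
        n = len(observed)
        return sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n

    # Hypothetical failure data: (testing time, testing effort, observed faults)
    data = [(1, 2, 10), (2, 4, 25), (3, 6, 38), (4, 8, 50)]
    taus = [cobb_douglas(t, w) for t, w, _ in data]
    preds = [mean_value(tau) for tau in taus]
    print(mse([y for _, _, y in data], preds))
    ```

    A lower MSE on the same failure data indicates a better-fitting model, which is how the proposed model is compared against earlier ones.
    
    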

Published
2020-03-30
How to Cite
Asheesh Tiwari, Vishishta Agrawal, Yash Jalan, Nikhil Pandey, Namrata Tiwari, Piyoosh Srivastav. (2020). A Software Reliability Model Incorporating Delay in Correction of Fault, Single Change Point and Hazard Function in Imperfect Debugging Environment. International Journal of Advanced Science and Technology, 29(3), 10523 -. Retrieved from https://sersc.org/journals/index.php/IJAST/article/view/27129
Section
Articles