Ruchika Malhotra / Department of Computer Science and Engineering, Delhi Technological University
Arjun Rajpal / Delhi Technological University
Dushyant Rathore / Delhi Technological University
Background: Machine learning algorithms are used in software engineering to predict defects. These defect predictors are powerful in comparison to manual methods, and they are also simple to understand and apply. However, an important step that is often ignored is the tuning of these defect predictors to optimize their performance.
Objective: We aim to identify simple, easy-to-implement methods for tuning defect predictors and to compare the performance of these methods.
Method: We ran Differential Evolution and Simulated Annealing as optimizers on several datasets from open-source Java systems to explore the tuning space. We then tested the resulting tunings and compared the results obtained from the two methods.
Results: We found that tuning improved performance in the majority of cases. We also found that the optimization algorithms used for tuning did not produce the same results.
Conclusion: (1) As there is a significant improvement in performance after parameter tuning, there is a need to change the standard methods used in software analytics. It is not sufficient to present results without performing a proper tuning optimization study, especially in the case of defect prediction. (2) Differential Evolution and Simulated Annealing did not give similar results, with Differential Evolution outperforming Simulated Annealing on the majority of the datasets. Therefore, it is necessary to perform tuning using different optimization algorithms to obtain the best possible results.
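The tuning procedure described above can be sketched in a few lines. This is a minimal illustration, not the paper's actual implementation: it assumes a decision-tree defect predictor, a synthetic dataset standing in for the Java system datasets, and SciPy's `differential_evolution` and `dual_annealing` (a Simulated Annealing variant) as the two optimizers, each minimizing the negated cross-validated F1 score over the tuning space.

```python
from scipy.optimize import differential_evolution, dual_annealing
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic, imbalanced stand-in for the open-source Java defect datasets.
X, y = make_classification(n_samples=300, n_features=10,
                           weights=[0.8, 0.2], random_state=1)

# Tuning space: two illustrative hyperparameters of the defect predictor.
bounds = [(1, 12),   # max_depth
          (2, 20)]   # min_samples_split

def objective(params):
    depth = int(round(params[0]))
    split = int(round(params[1]))
    clf = DecisionTreeClassifier(max_depth=depth,
                                 min_samples_split=split,
                                 random_state=1)
    # Both optimizers minimize, so negate the cross-validated F1 score.
    return -cross_val_score(clf, X, y, cv=3, scoring="f1").mean()

de = differential_evolution(objective, bounds, maxiter=10, seed=1)
sa = dual_annealing(objective, bounds, maxiter=10, seed=1)
print(f"DE best F1: {-de.fun:.3f}, SA best F1: {-sa.fun:.3f}")
```

Comparing `-de.fun` and `-sa.fun` against the score of an untuned predictor reproduces, in miniature, the study's comparison of the two optimization algorithms.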