Interpretable Machine Learning
Queue, 2021, Vol. 19(6), pp. 28–56
Top 1% of 2021 papers by citations
Abstract
The emergence of machine learning as a society-changing technology in the past decade has triggered concerns about people's inability to understand the reasoning of increasingly complex models. The field of IML (interpretable machine learning) grew out of these concerns, with the goal of empowering various stakeholders to tackle use cases, such as building trust in models, performing model debugging, and generally informing real human decision-making.