A4 Refereed article in a conference publication
Human-in-the-loop: Explainable or accurate artificial intelligence by exploiting human bias?
Authors: Valtonen Laura, Mäkinen Saku J.
Editors: N/A
Conference name: IEEE International Conference on Engineering, Technology and Innovation
Publication year: 2022
Book title: 2022 IEEE 28th International Conference on Engineering, Technology and Innovation (ICE/ITMC) & 31st International Association For Management of Technology (IAMOT) Joint Conference
ISBN: 978-1-6654-8818-1
eISBN: 978-1-6654-8817-4
DOI: https://doi.org/10.1109/ICE/ITMC-IAMOT55089.2022.10033225
Web address: https://ieeexplore.ieee.org/document/10033225
Artificial intelligence (AI) is a major contributor to Industry 4.0, and there is a strong push for AI adoption across fields in both research and practice. However, the risks that AI poses to business and to society at large are well documented. Care must therefore be taken to avoid the hurried adoption of counter-productive practices. For both managerial and broader societal issues, the same solution is often proposed: human-in-the-loop (HITL). The HITL literature is nevertheless contradictory: HITL is proposed to promote the fairness, accountability, and transparency of AI, which are sometimes assumed to come at the cost of AI accuracy, yet HITL is also considered a way to improve accuracy. To make sense of this convoluted literature, we begin to explore qualitatively how explainability is constructed in a HITL process and how method accuracy is affected as a function of that process. To do this, we study a multi-class classification task, both qualitatively and quantitatively, using multiple machine learning algorithms. We find that HITL can increase both accuracy and explainability, but not without deliberate effort to do so. Achieving both requires an iterative HITL process in which accuracy improvements are not continuous but are disrupted by unique and varying human biases that shed additional perspectives on the task at hand.
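To make the iterative HITL process described in the abstract concrete, the sketch below shows one possible shape of such a loop: train several multi-class classifiers, measure accuracy, then apply a "human" feedback step before retraining. This is a minimal illustration only; the scikit-learn models, the synthetic dataset, and the simulated feature-pruning feedback step are all assumptions of this sketch, not the experimental setup used in the paper.

    # Minimal sketch of an iterative HITL loop (assumed setup, not the paper's method).
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    # Synthetic multi-class data as a stand-in for the real task.
    X, y = make_classification(n_samples=1000, n_features=20, n_informative=8,
                               n_classes=4, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    models = {"logreg": LogisticRegression(max_iter=1000),
              "forest": RandomForestClassifier(random_state=0)}

    selected = list(range(X.shape[1]))  # feature subset curated across HITL rounds
    for iteration in range(3):          # a few iterative HITL rounds
        for name, model in models.items():
            model.fit(X_train[:, selected], y_train)
            acc = accuracy_score(y_test, model.predict(X_test[:, selected]))
            print(f"round {iteration} | {name}: accuracy {acc:.3f}")
        # Simulated human step: inspect importances and prune the least
        # informative features (a stand-in for expert judgement and its biases).
        importances = models["forest"].feature_importances_
        order = np.argsort(importances)[::-1]
        selected = [selected[i] for i in order[: max(4, len(selected) - 4)]]

In such a loop the accuracy trace is typically not monotone: each human intervention changes the representation the models see, which matches the abstract's observation that improvements arrive in disrupted, iteration-by-iteration steps rather than continuously.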