Musleh, M., Raidou, R. G., & Ceneda, D. (2025). TrustME: A Context-Aware Explainability Model to Promote User Trust in Guidance. IEEE Transactions on Visualization and Computer Graphics. https://doi.org/10.1109/TVCG.2025.3562929
Journal:
IEEE Transactions on Visualization and Computer Graphics
-
ISSN:
1077-2626
-
Date (published):
21-Apr-2025
-
Number of Pages:
17
-
Publisher:
IEEE Computer Society
-
Peer reviewed:
Yes
-
Keywords:
Explainability; Explainable Guidance; User Trust; Visual Analytics
Abstract:
Guidance-enhanced approaches are used to support users in making sense of their data and in overcoming challenging analytical scenarios. While recent literature underscores the value of guidance, a lack of clear explanations motivating system interventions may still negatively impact guidance effectiveness. Hence, guidance-enhanced VA approaches require meticulous design, demanding contextual adjustments to develop appropriate explanations. Our paper discusses the concept of explainable guidance and how it impacts the user-system relationship, specifically a user's trust in guidance within the VA process. We subsequently propose a model that supports the design of explainability strategies for guidance in VA. The model builds upon the flourishing literature on explainable AI, available guidelines for developing effective guidance in VA systems, and accrued knowledge of user-system trust dynamics. Our model responds to challenges concerning guidance adoption and context-effectiveness by fostering trust through appropriately designed explanations. To demonstrate the model's value, we employ it to design explanations within two existing VA scenarios. We also describe a design walk-through with a guidance expert to showcase how our model supports designers in clarifying the rationale behind system interventions and in designing explainable guidance.