2020
DOI: 10.1007/978-3-030-45691-7_49

Personalising Explainable Recommendations: Literature and Conceptualisation

Abstract: Explanations in intelligent systems aim to enhance users' understanding of the system's reasoning process and the resulting decisions and recommendations. Explanations typically increase trust, user acceptance and retention. The need for explanations is on the rise due to increasing public concern about AI and the emergence of new laws, such as the General Data Protection Regulation (GDPR) in Europe. However, users differ in their needs for explanations, and such needs can depend on their dynamic con…



Cited by 21 publications (13 citation statements)
References 40 publications
“…For instance, simple feedback such as "explain more", "redundant explanation" or "different explanation" can support users who wish to engage with the explanations and can improve the explanations in future interactions. In a previous paper [61], we reported on the results related to input modalities meant for tailoring the explanations to a specific user or group of users, i.e. personalisation.…”
Section: Discussion and Research Challenges (mentioning)
confidence: 99%
“…Sokol et al. [19] present 11 usability requirements for explanations, including: Soundness, Completeness, Contextfullness, Interactiveness, Actionability, Chronology, Coherence, Novelty, Complexity and Personalisation. The initial findings from this PhD research also provide an in-depth investigation of the conceptualisation of the personalisation aspect in previous work [14]. Since the implementation of these aspects has been limited to low-stakes applications, these principles and findings may not translate to high-stakes applications, where trust calibration and safety are crucial requirements.…”
Section: Human-Computer Interaction (HCI) and Explainability (mentioning)
confidence: 93%
“…This means that explanation alone is not enough; rather, its delivery and presentation should adapt to the user and task context to avoid that conflict. Also, our previous literature review revealed that explanations should be adapted to users' level of knowledge and expertise in a given Human-AI collaborative decision-making task [56]. Similarly, previous studies showed that adaptive information increases collaborative Human-Robot performance [82].…”
Section: Adaptation (mentioning)
confidence: 94%