CfP. "The ethics and epistemology of explanatory AI in medicine and healthcare" in Ethics and Information Technology
Call for papers for a Special Issue on “The ethics and epistemology of explanatory AI in medicine and healthcare” in Ethics and Information Technology.
Please find the full call for papers here: https://www.springer.com
We are particularly interested in contributions that shed new light on the following questions:
• What are the distinctive characteristics of explanations in AI for medicine and healthcare?
• Which epistemic and normative values (e.g., explainability, accuracy, transparency) should guide the design and use of AI in medicine and healthcare?
• Does AI in medicine impose particular requirements on explanations?
• Is explanatory pluralism (i.e., pluralism of disciplines and of agents receiving or offering explanations) a viable option for medical AI?
• Which virtues (e.g., social, moral, cultural, cognitive) underpin explainable medical AI?
• What is the epistemic and normative connection between explanation and understanding?
• How are trust (e.g., normative and epistemic) and explainability related?
• What kinds of explanations are required to increase trust in medical decisions?
• What is the role of transparency in explanations in medical AI?
• How are accountability and explainability related in medical AI?
The deadline for submissions is May 1st, 2021. We expect the special issue to be published by the end of the following year. Papers should be prepared for blind peer review and must not exceed 8,000 words.
If you have any questions, please do not hesitate to contact one of the guest editors: Martin Sand, Karin Jongsma, or Juan M. Durán.