Target audience: Tech Buyer | Publication date: August 2021 | Document type: IDC PeerScape | Document number: EUR148147721
IDC PeerScape: Practices for Successfully Delivering Explainable AI
Abstract
This IDC PeerScape describes three practices from successful explainable AI deployments to overcome challenges involved in introducing the technology:
- Begin projects by identifying the nature of the explanations you need to generate for AI systems and the point in the process at which they need to be delivered.
- Based on an evaluation of your organization's current position, develop a clear resourcing strategy that can inform technology purchasing and staffing decisions.
- Ensure explainability is meaningful to employees beyond the data science team and intentionally enhances the work of other teams interacting with AI systems.
"Explainability is rapidly becoming a must-have for European organizations looking to implement AI. For many use case areas, it will become a regulatory requirement in the next few years. Making AI explainable isn't a simple process; it can require specialized technical expertise, a resourcing strategy when deployed at scale, and process change," said Jack Vernon, senior research analyst for European AI Strategies at IDC. "The following practices and commercial examples should help guide enterprises to successfully introduce explainable AI."