How to Build, Implement and Visualize Explainable AI (XAI) in Python
Languages • October 2020
About

Explainable AI (XAI) has become legally and ethically mandatory. Artificial intelligence often surpasses human performance and understanding, and European and US laws, for example, increasingly impose regulations and sanctions on automated decisions that cannot be explained. In 2020, an AI system must produce trustworthy, unbiased results and non-discriminatory explanations.
We will go through user-friendly XAI visualization tools in Python and TensorFlow 2 that explain AI decisions which would otherwise be impossible to interpret, working through case studies with LIME, SHAP, WIT, CEM, and more.
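To give a flavor of what these case studies involve, here is a minimal sketch (not taken from the talk) of explaining a model with the SHAP library; the dataset, model, and plot choices below are illustrative assumptions only.

# Illustrative sketch only: explaining a tree-based model with SHAP.
# Assumes the shap and scikit-learn packages are installed.
import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

# Train a small "black-box" model on a built-in toy dataset (illustrative choice).
X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# TreeExplainer computes SHAP values: each value is one feature's contribution
# to pushing a single prediction away from the average prediction.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Global summary plot: which features matter most, and in which direction.
shap.summary_plot(shap_values, X)

The other tools named above offer complementary views: LIME fits local surrogate models around individual predictions, WIT (the What-If Tool) supports interactive probing of a model, and CEM produces contrastive explanations.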

About the speaker
Denis Rothman
AI Author, Speaker, Developer, and Instructor
Denis Rothman graduated from Sorbonne University and Paris-Diderot University, writing one of the very first word2vector embedding solutions. He began his career authoring one of the first AI cognitive natural language processing (NLP) chatbots, applied as a language teacher for Moët et Chandon and other companies. He also authored an AI resource optimizer for IBM and apparel producers, and then an advanced planning and scheduling (APS) solution that is used worldwide. Denis is the author of Artificial Intelligence by Example (2020, Packt) and Hands-On Explainable AI (2020, Packt).
Details
Language
English
Level
Advanced
Length
31 minutes
Type
online conference