Abstract: Transformers have taken the AI research and product community by storm. We have seen them advance multiple fields of AI, such as NLP, Computer Vision, and Robotics. In this talk, I will give some background on Conversational AI, NLP, and Transformer-based large-scale language models such as BERT and GPT-3. I will then walk the audience through hands-on examples showing how they can leverage these techniques in their own applications.
Bio: Chandra Khatri is the Chief Scientist and Head of AI at Got It AI, where his team is transforming the AI space by leveraging state-of-the-art technologies to deliver self-discovering, self-training, and self-optimizing products. Under his leadership, Got It AI is democratizing Conversational AI and related ecosystems through automation. Prior to Got It AI, Chandra led a range of applied research projects at Uber AI, including Conversational AI, Multi-modal AI, and Recommendation Systems.
Prior to Uber AI, he was a founding member of the Alexa Prize Competition at Amazon, where he led R&D and had the opportunity to significantly advance the field of Conversational AI, particularly open-domain dialog systems, which are considered the holy grail of Conversational AI and remain one of the open-ended problems in AI. Before Alexa AI, he drove applied research in NLP, Deep Learning, and Recommendation Systems at eBay. He graduated from Georgia Tech with a specialization in Deep Learning in 2015 and holds an undergraduate degree from BITS Pilani, India.
His current areas of research include Artificial General Intelligence, Democratization of AI, Reinforcement Learning, Language Understanding, Conversational AI, Multi-modal and Human-agent Interactions, and introducing common sense within artificial agents.