Integrating Prior Knowledge with Learning in Natural Language Processing

Abstract: 

Prior knowledge is believed to aid the understanding of natural language, and integrating prior knowledge with machine learning models has proven useful in various NLP tasks. Such prior knowledge falls into two categories: structured knowledge, explicitly defined in resources such as knowledge graphs, and unstructured knowledge, implicitly contained in large text corpora. Our research focuses on the effectiveness of integrating these two kinds of prior knowledge with machine learning models for text classification and summarisation.

Bio: 

Jingqing Zhang is a 3rd-year PhD student (HiPEDS) in the Department of Computing, Imperial College London, under the supervision of Prof. Yi-Ke Guo. His research interests include Natural Language Processing, Text Mining, Data Mining and Deep Learning. He received his BEng degree in Computer Science and Technology from Tsinghua University in 2016, and his MRes degree with distinction in Computing from Imperial College London in 2017.

Platform Prerequisite 

To facilitate the hands-on sessions, we are delighted to announce that DataRobot, our diamond partner, is providing hands-on session attendees with free access to their AI platform. 

Many of our workshops and tutorials will use the AI Platform for instruction and collaboration. The platform automates the end-to-end process of building, deploying, and maintaining AI at scale, and also provides feature engineering, automated model evaluation, and advanced machine learning techniques. It comes preloaded with models and datasets, so you can get started before the event. Please note: YOU MUST BE REGISTERED TO GET FREE ACCESS to the AI Platform.