Advances in NLP and their Applications to Healthcare

Ndapa Nakashole

Moderators: Jessica Schrouff · Tim Althoff



Recent advances in Natural Language Processing (NLP) have propelled the state of the art to new highs. One such advance is the use of external memory to support reasoning in deep learning models such as Transformers.
Without external memory to store sufficient background knowledge, reasoning in NLP systems must be performed over limited information, leading to poor performance on knowledge-rich tasks. Conversely, NLP systems with access to external memory have achieved significant performance gains on many important tasks, including question answering (QA) and related tasks such as fact verification and entity linking. This tutorial will present: 1) an overview of state-of-the-art approaches for representing background knowledge in addressable memory, and 2) applications in the healthcare domain.
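The idea of addressable memory described above can be illustrated with a minimal sketch: a memory of stored facts is addressed by vector similarity, so a query embedding retrieves the most relevant fact to support reasoning. All names, embeddings, and facts below are hypothetical toy values, not from the tutorial itself.

```python
import numpy as np

# Toy addressable external memory: each row of memory_keys is a (made-up)
# embedding of a stored fact; memory_values holds the facts themselves.
memory_keys = np.array([
    [1.0, 0.0, 0.0],   # key for fact 0
    [0.0, 1.0, 0.0],   # key for fact 1
    [0.0, 0.0, 1.0],   # key for fact 2
])
memory_values = [
    "Paris is the capital of France.",
    "Insulin regulates blood glucose.",
    "Transformers use self-attention.",
]

def retrieve(query: np.ndarray) -> str:
    """Return the stored fact whose key is most similar to the query,
    scored by dot-product similarity (the addressing mechanism)."""
    scores = memory_keys @ query          # one score per memory slot
    return memory_values[int(np.argmax(scores))]

# A query vector closest to key 1 retrieves the corresponding fact.
query = np.array([0.1, 0.9, 0.2])
print(retrieve(query))  # Insulin regulates blood glucose.
```

In real retrieval-augmented systems the keys are learned dense embeddings over millions of passages and lookup uses approximate nearest-neighbor search, but the addressing principle is the same as in this sketch.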
