
Learning to Select Best Forecast Tasks for Clinical Outcome Prediction
Yuan Xue · Nan Du · Anne Mottram · Martin Seneviratne · Andrew Dai

Mon Dec 07 09:00 PM -- 11:00 PM (PST) @ Poster Session 0 #119

The paradigm of `pretraining' from a set of relevant auxiliary tasks and then `finetuning' on a target task has been successfully applied in many different domains. However, when the auxiliary tasks are abundant, with complex relationships to the target task, using domain knowledge or searching over all possible pretraining setups are inefficient strategies. To address this challenge, we propose a method to automatically select, from a large set of auxiliary tasks, those which yield a representation most useful to the target task. In particular, we develop an efficient algorithm that uses automatic auxiliary task selection within a nested-loop meta-learning process. We have applied this algorithm to the task of clinical outcome prediction in electronic medical records, learning from a large number of self-supervised tasks related to forecasting patient trajectories. Experiments on a real clinical dataset demonstrate the superior predictive performance of our method compared to direct supervised learning, naive pretraining, and multitask learning, particularly in low-data scenarios where the primary task has very few examples. With detailed ablation analysis, we further show that the selection rules are interpretable and able to generalize to unseen target tasks with new data.
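The abstract does not spell out the selection mechanism, but the nested-loop idea can be illustrated with a minimal toy sketch: an inner loop trains shared parameters on a softmax-weighted mixture of auxiliary losses, and an outer loop adjusts the task weights using gradient alignment with the target task as a proxy for usefulness. Everything here (the linear tasks, the alignment score, the learning rates) is an illustrative assumption, not the paper's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative, not the paper's data): a shared linear model w,
# five auxiliary regression tasks of varying relevance to the target task,
# and a small target dataset (the low-data regime the abstract mentions).
d = 8
true_w = rng.normal(size=d)
scales = [0.1, 0.2, 2.0, 3.0, 5.0]   # small scale = task closely related to target
aux = []
for s in scales:
    X = rng.normal(size=(50, d))
    y = X @ (true_w + rng.normal(scale=s, size=d))
    aux.append((X, y))
Xt = rng.normal(size=(10, d))
yt = Xt @ true_w

w = np.zeros(d)                      # shared (inner-loop) parameters
logits = np.zeros(len(aux))          # task-selection logits (outer-loop variables)

def grad(X, y, w):
    """Gradient of mean squared error for a linear model."""
    return 2 * X.T @ (X @ w - y) / len(X)

for step in range(200):
    p = np.exp(logits - logits.max())
    p /= p.sum()                     # softmax task weights
    # Inner loop: one gradient step on the weighted auxiliary objective.
    g_aux = [grad(X, y, w) for X, y in aux]
    w -= 0.05 * sum(pi * gi for pi, gi in zip(p, g_aux))
    # Outer loop: favour tasks whose gradients align with the target task's
    # gradient -- a common heuristic proxy for task usefulness (an assumption).
    g_t = grad(Xt, yt, w)
    align = np.array([gi @ g_t / (np.linalg.norm(gi) * np.linalg.norm(g_t) + 1e-12)
                      for gi in g_aux])
    logits += 0.1 * align

print(np.round(p, 3))  # closely related tasks end up with larger weights
```

In this sketch the two low-noise tasks dominate the learned weights, mirroring the paper's claim that the method identifies which forecast tasks yield a useful representation; the real algorithm operates on self-supervised patient-trajectory tasks rather than toy regressions.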

Author Information

Yuan Xue (Google)
Nan Du (Google Brain)
Anne Mottram (DeepMind)
Martin Seneviratne (Google Health)
Andrew Dai (Google Brain)