Academic Paper

To Read or To Do? That's The Task: Using Transfer Learning to Detect the Intent of an Email
Document Type
Conference
Source
2018 International Conference on Computational Science and Computational Intelligence (CSCI), pp. 1105-1110, Dec. 2018
Subject
Computing and Processing
Scientific computing
Computational intelligence
email overload
email intent
transfer learning
classification
data science
Language
English
Abstract
This research studies the problem of email overload and proposes a system that automatically detects whether an email is "to read" or "to do". The goal of our research is to test whether a Language Model trained on the entire Enron email dataset (around 500,000 emails) and then retrained on a subset of intent-labeled emails (i.e., transfer learning) would outperform an LSTM model trained on the intent email dataset only. We conduct several experiments using transfer learning in three different scenarios: 1) using the pre-trained Language Model as a feature extractor for an SVC; 2) retraining all layers of the pre-trained Language Model on the intent email dataset; 3) gradually retraining the pre-trained Language Model layer by layer, starting from the last layer. We evaluate the results on a subset of the Enron Email Dataset. Compared with a baseline model consisting of two LSTM layers, we were able to increase the accuracy from 85.42% to 88.43%.
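The third scenario in the abstract, gradually retraining the pre-trained model layer by layer from the last layer backward, can be illustrated with a minimal framework-free sketch. The layer names below are hypothetical stand-ins (the abstract does not specify the Language Model's architecture); the point is only the unfreezing schedule.

```python
# Hypothetical layer names for the pre-trained Language Model,
# listed input-first; the classifier head sits at the end.
layers = ["embedding", "lstm_1", "lstm_2", "classifier_head"]

# Start with every layer frozen (not updated during fine-tuning).
trainable = {name: False for name in layers}

def unfreeze_next(trainable, order):
    """Unfreeze the next still-frozen layer, starting from the last.

    Called once per fine-tuning stage, this reproduces the schedule
    of scenario 3: head first, then progressively earlier layers.
    Returns the layer just unfrozen, or None if all are trainable.
    """
    for name in reversed(order):
        if not trainable[name]:
            trainable[name] = True
            return name
    return None

unfreeze_next(trainable, layers)  # first stage unfreezes "classifier_head"
unfreeze_next(trainable, layers)  # next stage unfreezes "lstm_2", and so on
```

In a real training loop, each `unfreeze_next` call would be followed by a few epochs of fine-tuning on the intent-labeled emails before the next layer is unfrozen.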