Deep learning has captured the hearts and minds of ML engineers and researchers. However, is it always the best tool for the job? In this talk, we describe two NLP applications deployed at Facebook where deep learning classifiers were outperformed by simpler, more lightweight techniques. For detecting online content related to COVID, we show that data-driven regular expressions outperform deep learning in both precision and recall. For detecting clickbait link titles in 20+ languages, we show that a carefully designed regularized Logistic Regression outperforms the Kaggle-winning BiGRU model on AUROC and pointwise precision/recall. We discuss characteristics of practical problems that favor such pedestrian ML methods.
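The first result rests on matching text against curated regular expressions. A minimal sketch of the idea in Python; the patterns below are hypothetical stand-ins, not the talk's actual data-driven regexes:

```python
import re

# Hypothetical COVID-related patterns (illustrative only); in practice,
# such patterns would be derived from labeled data.
covid_patterns = [
    r"\bcovid[\s-]?19\b",
    r"\bcoronavirus\b",
    r"\bsars[\s-]?cov[\s-]?2\b",
]
# Compile one alternation for fast matching across all patterns.
matcher = re.compile("|".join(covid_patterns), re.IGNORECASE)

def mentions_covid(text: str) -> bool:
    """Return True if the text matches any COVID-related pattern."""
    return bool(matcher.search(text))

print(mentions_covid("New COVID-19 guidelines announced"))  # True
print(mentions_covid("Local bakery wins award"))            # False
```

A compiled alternation like this is cheap to evaluate at scale and trivially interpretable, which is part of the appeal over a deep classifier.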
Session Summary
Outwitting Deep Learning Models
MLconf Online 2020
Igor Markov, Research Scientist, Facebook