As any company’s use of ML grows, it becomes difficult to maintain a definitive view of all the places where ML impacts the user experience. This makes comprehensive audits of production ML, e.g. for governance or compliance purposes, difficult if not impossible. In this talk, we will introduce Hendrix, the next generation of Spotify’s general-purpose machine learning platform, built with traceability at its core to enable safe, auditable production ML. We will discuss why these concerns matter to us at Spotify, and how platform features such as the Hendrix Ontology of Machine Learning and Hendrix Configuration enable us to tackle this complexity in a formalized, methodical manner. We will conclude with a preview of the future of Hendrix as it relates to Spotify’s commitment to responsible AI.
Session Summary
Empowering Traceable and Auditable ML in Production at Spotify with Hendrix
MLconf 2022 San Francisco
Jonathan Jin
Spotify
Senior Machine Learning Engineer
Spotify has incorporated machine learning into all corners of the product, ranging from front-and-center features like Discover Weekly to less obvious, more “hidden” use cases like playlist recommendations and natural-language search. Despite their power and potential, however, machine learning models have earned a reputation for being inscrutable and unexplainable “black boxes.” This opacity becomes all the more consequential as our usage of machine learning impacts the livelihoods of more and more musicians and audio creators. As one part of its ongoing investments in responsible AI, Spotify is committed to deploying and using ML in traceable and auditable ways; in fact, we have designed our entire machine learning platform around this commitment.