Federated learning enables many devices to collaboratively train a global ML model without centralizing their data. Check out a comic-book-style introduction to FL, and dive deeper into the details at
In the hustle of I/O, I missed this awesome comic on federated learning. Kudos to Lucy Bellwood and Scott McCloud for the comic. At the bottom of the page are links to research papers and other content about federated learning from researchers.
Federated learning: train machine learning models while preserving user privacy, by keeping user data on device (e.g. a mobile phone) and only sending encrypted gradient updates (which can only be decrypted in aggregate) back to the server.
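The core loop behind that idea can be sketched in a few lines: clients take gradient steps on their own data and ship only a weight delta; the server averages the deltas. This is a toy simulation of federated averaging, assuming a simple linear-regression model and in-process "clients"; all names here are illustrative, not any real FL library's API.

```python
# Toy federated averaging (FedAvg) sketch: data never leaves each
# client; only model-weight deltas are shared and averaged.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local gradient steps on its own data.
    Only the resulting weight delta leaves the "device", never X or y."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # MSE gradient
        w -= lr * grad
    return w - weights  # the update (delta), not the raw data

def federated_round(weights, client_data):
    """Server averages the clients' deltas, weighted by data size."""
    total = sum(len(y) for _, y in client_data)
    avg = sum(len(y) * local_update(weights, X, y)
              for X, y in client_data) / total
    return weights + avg

# Simulate three clients holding private samples of the same task.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(20, 2))
    clients.append((X, X @ true_w + 0.01 * rng.normal(size=20)))

w = np.zeros(2)
for _ in range(50):
    w = federated_round(w, clients)
# After enough rounds, the global model approaches true_w even though
# the server never saw any client's (X, y).
```

In the real protocol the comic describes, the updates are additionally encrypted so the server can only read their aggregate, not any single client's delta.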
With federated learning, a new approach to machine learning invented by Google, products like #Gboard get better, faster—without collecting data from your device. #io19
How can privacy be preserved when performing machine learning on users' data? Google released a comic explainer highlighting secure aggregation, a protocol co-developed by Cornell Tech PhD alum Karn Seth and current PhD candidate Antonio Marcedone.
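The trick behind secure aggregation is worth seeing concretely: pairs of clients agree on random masks that make each individual update look like noise but cancel exactly when everything is summed. A minimal sketch of that pairwise-masking idea, assuming integer updates and skipping the key agreement and dropout handling a real deployment needs:

```python
# Pairwise-masking sketch: each masked update is individually
# meaningless, yet the masks cancel in the sum.
import random

MOD = 2**32  # work modulo a large number

def mask_updates(updates):
    """Each pair of clients (i, j) shares a random pad; i adds it and
    j subtracts it, so the pads vanish from the aggregate."""
    masked = list(updates)
    n = len(masked)
    for i in range(n):
        for j in range(i + 1, n):
            pad = random.randrange(MOD)
            masked[i] = (masked[i] + pad) % MOD
            masked[j] = (masked[j] - pad) % MOD
    return masked

true_updates = [5, 11, 7]
masked = mask_updates(true_updates)
# The server sees only the masked values, but the sum is preserved:
assert sum(masked) % MOD == sum(true_updates) % MOD
```

The full protocol layers Diffie-Hellman key agreement and secret sharing on top of this so it survives clients dropping out mid-round.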
This is a fun online comic (from Google) which explains the idea of "federated learning" -- part of a solution for privacy. It's a way to share the advantages of aggregated data without aggregating the data, only aggregating the results. Cool.
Federated whatnow? It's federated #AI and will likely be quite important in medicine. An instructive cartoon-o-graphic gets you up to speed.
This cartoon on federated learning, mentioned by Brendan McMahan at #RAAIS2019, does a fine job of introducing an important idea.