Attention for time series forecasting COVID predictions - Isaac Godfried


Self-attention and the transformer architecture have broken many benchmarks and enabled widespread progress in NLP. At this point, however, neither researchers nor companies in industry (with a few exceptions) have leveraged them in a time series context. This talk will explore both the barriers to and the promise of self-attention models and transfer learning for time series. It will also look into why time series tasks (forecasting/prediction) have not had their BERT/ImageNet moment and what can be done to enable transfer learning on temporal data. The extremely limited COVID-19 time series forecasting dataset will serve as an example of the need to address such limited-data scenarios and to enable effective few-shot and transfer learning more generally.
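As a rough illustration of the kind of model the talk discusses (not the speaker's implementation), here is a minimal PyTorch sketch of a transformer encoder applied to multi-step time series forecasting. The class name, hyperparameters, and the 30-day-input / 7-day-forecast window sizes are all illustrative assumptions.

```python
# Minimal sketch: self-attention for time series forecasting.
# All names and hyperparameters below are illustrative, not the talk's code.
import torch
import torch.nn as nn


class TimeSeriesTransformer(nn.Module):
    """Encode a window of past observations with self-attention and
    project the final position's representation to a multi-step forecast."""

    def __init__(self, n_features: int, d_model: int = 64,
                 n_heads: int = 4, n_layers: int = 2, horizon: int = 7):
        super().__init__()
        # Project raw features (e.g. daily case counts) into model space.
        self.input_proj = nn.Linear(n_features, d_model)
        # Learned positional embeddings stand in for timestamps
        # (assumes windows of at most 512 steps).
        self.pos_embed = nn.Parameter(torch.zeros(1, 512, d_model))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Forecast `horizon` future steps from the last encoded position.
        self.head = nn.Linear(d_model, horizon)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_features)
        seq_len = x.size(1)
        h = self.input_proj(x) + self.pos_embed[:, :seq_len]
        h = self.encoder(h)
        return self.head(h[:, -1])  # (batch, horizon)


# Toy usage: forecast 7 days of cases from a 30-day window.
model = TimeSeriesTransformer(n_features=1)
window = torch.randn(8, 30, 1)  # batch of 8 univariate series
forecast = model(window)
print(forecast.shape)  # torch.Size([8, 7])
```

In a limited-data setting like COVID forecasting, a model like this would typically be pretrained on larger related series and then fine-tuned, which is the transfer-learning gap the talk highlights.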

Isaac Godfried is a machine learning engineer at Monster, focusing on the data and machine learning platform. Prior to his current position, Isaac worked on machine learning problems in both retail and healthcare, and he has also participated in many Kaggle competitions. Isaac's main focus is removing barriers to the use of deep learning in industry. Specifically, this involves researching techniques like transfer and meta learning for data-constrained scenarios, designing tools to effectively track and manage experiments, and creating frameworks to deploy models at scale. In his spare time, Isaac also conducts AI-for-good research in areas like medicine and climate.

👩🏼‍🚀 Weights & Biases:
We’re always free for academics and open source projects. Email carey@wandb.com with any questions or feature suggestions.

- Blog: https://www.wandb.com/articles
- Gallery: See what you can create with W&B - https://app.wandb.ai/gallery
- Continue the conversation in our Slack community - http://bit.ly/wandb-forum
