
A decoder-only foundation model for time-series forecasting

2024/5/14

Papers Read on AI

Shownotes

Motivated by recent advances in large language models for Natural Language Processing (NLP), we design a time-series foundation model for forecasting whose out-of-the-box zero-shot performance on a variety of public datasets comes close to the accuracy of state-of-the-art supervised forecasting models for each individual dataset. Our model is based on pretraining a patched-decoder-style attention model on a large time-series corpus, and it can work well across different forecasting history lengths, prediction lengths, and temporal granularities.
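For readers who want a concrete picture of the patched-decoder idea described in the abstract, here is a minimal PyTorch sketch. The patch length, model sizes, linear patch embedding, and next-patch output head are all illustrative assumptions made for this sketch, not the paper's actual architecture or hyperparameters (see the arXiv link below for those).

```python
import torch
import torch.nn as nn

class PatchedDecoderForecaster(nn.Module):
    """Toy patched decoder-only forecaster (illustrative, not the paper's model)."""

    def __init__(self, patch_len=32, d_model=256, n_heads=4, n_layers=4):
        super().__init__()
        self.patch_len = patch_len
        # Embed each non-overlapping patch of raw values into one token.
        self.embed = nn.Linear(patch_len, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads,
            dim_feedforward=4 * d_model, batch_first=True)
        # "Decoder-only" here means a self-attention stack with a causal mask.
        self.decoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        # Each token predicts the next patch of raw values.
        self.head = nn.Linear(d_model, patch_len)

    def forward(self, series):
        # series: (batch, context_len), with context_len divisible by patch_len.
        b, t = series.shape
        patches = series.reshape(b, t // self.patch_len, self.patch_len)
        tokens = self.embed(patches)
        n = tokens.size(1)
        # Causal mask so each patch token attends only to earlier patches.
        mask = torch.triu(torch.full((n, n), float("-inf")), diagonal=1)
        hidden = self.decoder(tokens, mask=mask)
        return self.head(hidden)  # (batch, n_patches, patch_len)

model = PatchedDecoderForecaster()
context = torch.randn(8, 128)       # 8 series, 128 past time steps
next_patch = model(context)[:, -1]  # 32-step forecast from the last token
```

Patching lets one token summarize many raw time steps, so a fixed context window covers long histories, and horizons longer than one patch can be reached by feeding predictions back in autoregressively, which is one way the flexible history and prediction lengths mentioned in the abstract can be realized.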

2023: Abhimanyu Das, Weihao Kong, Rajat Sen, Yichen Zhou

https://arxiv.org/pdf/2310.10688