TimelyGPT: Extrapolatable Transformer Pre-training for Long-term Time-Series Forecasting in Healthcare
2024, pp. 1–10
Abstract
Motivation: Large-scale pre-trained models (PTMs) such as BERT and GPT have recently achieved great success in the Natural Language Processing and Computer Vision domains. However, the development of PTMs for healthcare time-series data lags behind. This gap underscores two limitations of existing transformer-based architectures: their limited scalability to large-scale time series and their limited ability to capture long-term temporal dependencies.