Building Time Series Foundational Models: Past, Present and Future

Jul 21, 2024

Data Phoenix Events

Time series data is ubiquitous across industries: a startup COO predicts customer demand, a clinician in the ICU reads medical charts, a stock broker forecasts security prices. In the past, a technical and domain expert would build, train, and deploy a new model for each task, in each industry's swim lane. That fragmentation is a massive bottleneck!
Luckily, transformer architectures, which enable zero-shot sequence modeling across modalities, are a natural solution. We introduce a new transformer modality: time series. Massive amounts of domain knowledge are distilled into large time series models (LTSMs), forming a universal prior across forecasting, imputation, classification, and anomaly detection tasks.
Join us as we review this next frontier of AI, showcasing Gradient's LTSM: a novel architecture and a massive time series dataset that together achieve state-of-the-art performance on time series tasks. Our foundational model and datasets are fully open-sourced. Finally, we preview multimodal foundational time series models, where working with time series data becomes as easy as prompting ChatGPT.
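To give a flavor of the idea, here is a minimal, hypothetical sketch of the patch-and-predict pattern that many decoder-style time series foundational models build on: the series is split into fixed-length patches, each patch is embedded as a token, a causal transformer processes the token sequence, and the model predicts the next patch. Everything in the sketch (the TinyPatchForecaster class, the patch length, and the layer sizes) is an illustrative assumption and does not describe Gradient's LTSM architecture or API.

```python
# A minimal, hypothetical sketch of the patch-and-predict idea behind
# decoder-style time series models. This is NOT Gradient's LTSM; names
# and sizes are illustrative assumptions only.
import torch
import torch.nn as nn


class TinyPatchForecaster(nn.Module):
    def __init__(self, patch_len=16, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.patch_len = patch_len
        self.embed = nn.Linear(patch_len, d_model)   # each patch becomes one token
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True
        )
        self.backbone = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, patch_len)    # predict the next patch of values

    def forward(self, series):
        # series: (batch, length), with length divisible by patch_len
        b, t = series.shape
        patches = series.view(b, t // self.patch_len, self.patch_len)
        tokens = self.embed(patches)
        # causal mask: each patch may only attend to earlier patches
        n = tokens.size(1)
        mask = torch.triu(torch.ones(n, n, dtype=torch.bool), diagonal=1)
        hidden = self.backbone(tokens, mask=mask)
        return self.head(hidden[:, -1])              # forecast for the next patch


model = TinyPatchForecaster()
history = torch.randn(2, 128)   # two previously unseen series of length 128
forecast = model(history)       # a pretrained model is called the same way,
                                # with no per-series fitting (zero-shot use)
print(forecast.shape)           # torch.Size([2, 16])
```

Because inference requires no per-series fitting, the same forward call can be applied to any new series, which is the sense in which a pretrained LTSM acts as a zero-shot, universal prior.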

📌 Key Highlights of the Webinar:
🔶 Cross-Industry Time Series Analysis: Learn how time series data is used in diverse fields, from demand forecasting in retail, to ICU patient monitoring in healthcare, to stock movement prediction in finance.
🔶 Introduction to Transformer Modalities in Time Series: Discover how transformer architectures can be applied to time series data and how these models perform zero-shot learning across different types of time series tasks.
🔶 State-of-the-Art Time Series Models: Get an in-depth look at Gradient's LTSM (Large Time Series Model), a novel architecture and massive time series dataset achieving state-of-the-art performance on time series tasks.
🔶 Preview of Multimodal Foundational Models: Preview upcoming advancements in multimodal foundational models that promise to make working with time series data as easy as prompting ChatGPT.

🎤 About Our Speaker:
Leo is Chief Scientist at Gradient, a full-stack AI platform that enables businesses to build customized agents to power enterprise workloads, where he leads research and analytics. Prior to Gradient, Leo led CloudTruck's ML and data science organizations, pioneering the application of ML to operational challenges. Before that, he held leadership roles at Opendoor, Optimizely, and Disney. Leo holds a bachelor's degree in economics and a master's and PhD in statistics, all from Stanford.

🔗 Useful Links:
Airtrain AI: https://www.airtrain.ai/
Data Phoenix: https://dataphoenix.info/
Join Our Discord: https://discord.gg/Ac9YCzmBYj

👉 Follow us:
LinkedIn - https://www.linkedin.com/company/data-phoenix/
X - https://x.com/data_phoenix