SPS
IEEE Members: $11.00
Non-members: $15.00
Length: 00:56:32
In the ever-evolving landscape of artificial intelligence, handling and leveraging data effectively has been, and will continue to be, a critical challenge, especially in the age of foundation models. Recent developments in utilizing these models, e.g., large language models (LLMs), have opened new horizons for research. Although most algorithms are trained in a centralized fashion, access to the necessary data can be restricted by factors such as privacy, regulation, geopolitics, and the sheer effort of moving large datasets. Because federated learning (FL) fundamentally addresses the pivotal balance between data access and the collaborative enhancement of AI models, this talk explores how FL can address these challenges with easy and scalable integration capabilities. Enabled by practical frameworks like NVIDIA FLARE, we discuss the particular challenges of, and solutions for, embedding FL in foundation model development and customization to enhance accuracy and robustness. Ultimately, this talk underscores the transformative potential of FL for foundation models, offering insights into its current achievements and future possibilities.
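To make the core FL idea mentioned above concrete, the sketch below illustrates federated averaging (FedAvg): clients train on their private data and share only model updates, which a server aggregates into a new global model. This is a minimal, self-contained illustration of the concept; the function names (`local_update`, `federated_average`) and numbers are hypothetical and do not represent the NVIDIA FLARE API.

```python
from typing import List


def local_update(weights: List[float], gradient: List[float],
                 lr: float = 0.1) -> List[float]:
    """One local SGD step on a client's private data.

    The gradient is a stand-in for whatever the client computes locally;
    the raw data never leaves the client.
    """
    return [w - lr * g for w, g in zip(weights, gradient)]


def federated_average(client_weights: List[List[float]],
                      client_sizes: List[int]) -> List[float]:
    """Server-side aggregation: average client models, weighted by
    each client's local dataset size."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [
        sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
        for i in range(dim)
    ]


# One round: two clients refine a shared global model without sharing data.
global_model = [1.0, 2.0]
client_a = local_update(global_model, gradient=[0.5, -0.5])  # e.g. 100 samples
client_b = local_update(global_model, gradient=[1.0, 1.0])   # e.g. 300 samples
global_model = federated_average([client_a, client_b], [100, 300])
print(global_model)  # new global model after one aggregation round
```

The same update-and-aggregate loop applies when the shared model is a foundation model (or a parameter-efficient adapter on top of one); frameworks like NVIDIA FLARE handle the communication, scheduling, and security around this pattern.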