The Future of Computing: AI Technologies in the Exascale Era
Stephan Gillich, Tomislav Tipurić, Christian Wiebus, Alan Southall
#1 · about 3 minutes
Defining exascale computing and its relevance for AI
Exascale computing, defined in high-performance computing benchmarks as at least 10^18 floating-point operations per second, provides the massive compute capacity that makes it highly relevant for training large AI models.
#2 · about 4 minutes
Comparing GPU and CPU architectures for deep learning
GPUs excel at AI tasks due to their specialized, parallel processing of matrix operations, while CPUs are being enhanced with features like Advanced Matrix Extensions to also handle these workloads.
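The matrix operations mentioned above are the core workload both GPUs and matrix-extension-equipped CPUs accelerate. A minimal sketch (our own illustration, not from the talk) of why: a dense neural-network layer is essentially one matrix multiply.

```python
import numpy as np

# A deep-learning layer reduces to a matrix multiplication:
#   output = inputs @ weights + bias
# GPUs, and CPU features such as Intel's Advanced Matrix Extensions,
# accelerate exactly this pattern by running many multiply-accumulate
# operations in parallel.

rng = np.random.default_rng(0)
inputs = rng.standard_normal((4, 8))    # batch of 4 samples, 8 features each
weights = rng.standard_normal((8, 16))  # dense layer mapping 8 -> 16 features
bias = np.zeros(16)

# One forward pass through the layer: a single matrix multiply plus bias.
output = inputs @ weights + bias
print(output.shape)  # (4, 16): 4 samples, 16 output features
```

Training multiplies this cost by millions of iterations, which is why specialized matrix hardware dominates AI performance discussions.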
#3 · about 2 minutes
Implementing machine learning on resource-constrained edge devices
Machine learning is becoming essential on edge devices to improve data quality and services, requiring specialized co-processors to achieve performance within strict power budgets.
#4 · about 4 minutes
Addressing the growing power consumption of AI computing
The massive energy demand of data centers for AI training is a major challenge, addressed by improving grid-to-core power efficiency and offloading computation to the edge.
#5 · about 1 minute
Key security considerations for AI systems and edge devices
Securing AI systems involves a multi-layered approach including secure boot, safe updates, certificate management, and ensuring the trustworthiness of the AI models themselves.
#6 · about 5 minutes
Leveraging open software and AI for code development
Open software stacks enable hardware choice, while development tools and large language models can be used to automatically optimize code for better performance on specific platforms.
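As a concrete illustration of the kind of rewrite such optimization tools and language models target (the example is ours, not from the talk): replacing an interpreted loop with a single vectorized library call that maps onto tuned, parallel kernels.

```python
import numpy as np

def dot_naive(a, b):
    # Straightforward element-by-element loop: correct and portable,
    # but every iteration pays Python interpreter overhead.
    total = 0.0
    for x, y in zip(a, b):
        total += x * y
    return total

def dot_optimized(a, b):
    # The form an optimizing tool or LLM might suggest: one vectorized
    # call that dispatches to a tuned BLAS kernel for the platform.
    return float(np.dot(a, b))

a = np.arange(1000, dtype=np.float64)
b = np.ones(1000)

# Both versions compute the same result; only the performance differs.
assert abs(dot_naive(a, b) - dot_optimized(a, b)) < 1e-9
```

The correctness check at the end matters: automated optimization is only useful if the transformed code provably matches the original behavior.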
#7 · about 8 minutes
Exploring future computing architectures and industry collaboration
The future of computing will be shaped by power efficiency challenges, leading to innovations in materials like silicon carbide, alternative architectures like neuromorphic computing, and cross-industry partnerships.
#8 · about 3 minutes
Balancing distributed edge AI with centralized cloud computing
A hybrid architecture that balances local processing on the edge with centralized cloud resources is the most practical approach for AI, optimizing for latency, power, and data privacy based on the specific use case.
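The edge-versus-cloud trade-off described above can be sketched as a simple routing decision. This is a hypothetical helper under illustrative assumptions (the thresholds and parameter names are ours, not from the talk):

```python
# Hypothetical routing helper: decide whether a single inference request
# should run locally on the edge device or go to centralized cloud compute.
# The 50 ms latency threshold is an illustrative assumption.

def choose_backend(latency_budget_ms: float,
                   payload_is_private: bool,
                   edge_power_available: bool) -> str:
    """Return 'edge' or 'cloud' for one inference request."""
    if payload_is_private:
        return "edge"   # data privacy: keep sensitive payloads on-device
    if latency_budget_ms < 50 and edge_power_available:
        return "edge"   # tight latency budget favors local processing
    return "cloud"      # otherwise, use centralized resources

print(choose_backend(30.0, False, True))   # tight latency -> edge
print(choose_backend(500.0, False, True))  # relaxed latency -> cloud
```

In practice such a policy would weigh many more factors (bandwidth, battery state, model size), but the structure, a per-request decision across latency, power, and privacy, matches the hybrid approach the talk advocates.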