Distributed Evolutionary Deep Learning: Optimizing Neural Network Weights, Hyperparameters, and Architectures

April 14, 2026
11:00 am
Location: S321

Speaker: CJ Chung

Modern deep learning is increasingly bottlenecked by two factors:

  1. the human limitations of hand-crafting optimal network architectures, and
  2. the immense computational power required to train large models.

This seminar explores how to address both challenges by combining principles from biological evolution with distributed high-performance computing.

The talk introduces Evolutionary Deep Learning (EDL), a framework that uses evolutionary algorithms, such as genetic algorithms and evolution strategies, to optimize neural network weights, hyperparameters, and architectures. By evolving populations of models, EDL can explore optimization landscapes that traditional gradient-based methods may struggle with.
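The core idea can be illustrated with a toy example: a simple (mu, lambda) evolution strategy that evolves the weights of a tiny network on the XOR task. This is a hedged sketch of the general technique the abstract names, not the speaker's framework; all names and parameter choices here are illustrative.

```python
import numpy as np

# Toy task: fit XOR with a tiny 2-2-1 network whose weights are
# evolved by a simple (mu, lambda) evolution strategy, instead of
# gradient descent. Illustrative only; not the talk's EDL framework.

rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

N_WEIGHTS = 2 * 2 + 2 + 2 * 1 + 1  # 9 parameters for a 2-2-1 net

def forward(w, x):
    """Run the 2-2-1 network with tanh hidden units."""
    W1 = w[:4].reshape(2, 2)
    b1 = w[4:6]
    W2 = w[6:8]
    b2 = w[8]
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2

def fitness(w):
    """Negative mean squared error (higher is better)."""
    pred = forward(w, X)
    return -np.mean((pred - y) ** 2)

mu, lam, sigma = 5, 30, 0.5      # parents, offspring, mutation scale
pop = rng.normal(0.0, 1.0, size=(lam, N_WEIGHTS))

for gen in range(200):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-mu:]]   # truncation selection
    # each offspring = a random parent plus Gaussian mutation
    idx = rng.integers(0, mu, size=lam)
    pop = parents[idx] + rng.normal(0.0, sigma, size=(lam, N_WEIGHTS))
    sigma *= 0.99                             # anneal mutation size

best = max(pop, key=fitness)
print("best fitness:", fitness(best))
```

Note that no gradients are computed anywhere: selection and mutation alone drive the weights toward a solution, which is why such methods can handle non-differentiable or deceptive optimization landscapes.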

Because evolutionary approaches require evaluating many candidate models, they are computationally intensive but naturally
parallelizable. The seminar will also discuss multi-GPU scaling techniques and demonstrate how evolving populations can be efficiently distributed across high-performance computing systems.
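Because every candidate's fitness can be computed independently, scoring a generation is an embarrassingly parallel map. The following hedged sketch distributes evaluations across worker processes; `evaluate` is a stand-in for what would, in practice, be a full training-and-validation run on a GPU, and all names here are illustrative.

```python
from concurrent.futures import ProcessPoolExecutor
import numpy as np

def evaluate(weights):
    # Placeholder fitness: negative squared distance to a fixed
    # target vector. In real EDL this would train/validate a model.
    target = np.arange(weights.size, dtype=float)
    return -float(np.sum((weights - target) ** 2))

def score_population(pop, workers=4):
    """Score every candidate in parallel across worker processes."""
    with ProcessPoolExecutor(max_workers=workers) as ex:
        return list(ex.map(evaluate, pop))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    population = [rng.normal(size=8) for _ in range(16)]
    scores = score_population(population)
    print("scored", len(scores), "candidates")
```

The same map-over-candidates pattern scales from local processes to multi-GPU nodes and beyond: only the executor changes, not the evolutionary loop.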
