LISTENDOCK

PDF TO MP3

Example · 153 min · 103 chapters · 103 audios ready

2019 Evolutionary Algorithms Review

This 2019 review introduces a new taxonomy for evolutionary algorithms based on User Control Attributes (limiters, explainability, causality, fairness, correction) and surveys traditional and specialized EAs, their applications, challenges, and future directions.

Title

A review of evolutionary algorithms proposes a new taxonomy centered on user control: limiters, explainability, causality, fairness, and correction.

1:34 · Explained

Preface

An evolutionary algorithm can replace manual experimentation by chemists when exploring a chemical problem space.

1:47 · Explained

Introduction

AI science sits at the boundary of philosophy and science, combining theoretical ideas with practical engineering.

1:23 · Explained

User Control Attributes

Machine learning is shifting from rule-based systems to outcome-oriented ones, raising trust considerations captured by the User Control Attributes (UCA): limiters, explainability, causality, fairness, and correction.

1:50 · Explained

Control Attributes in ML

Modern ML evaluates models by control attributes: limiters, explainability, causality, fairness, and correction.

1:43 · Explained

End of Moore's Law

Advances in silicon are hitting physical and economic limits, signaling the end of Moore’s Law.

1:33 · Explained

SpiNNaker

SpiNNaker uses a million interconnected ARM cores to run Spiking Neural Networks, modelling roughly 1% of a human brain.

1:36 · Explained

Hybrid Evolutionary Algorithms

Hybridizing evolutionary algorithms with neural networks is a growing approach to automate complex problems.

1:28 · Explained

Introduction

EAs balance exploitative local search with explorative stochastic search.

1:19 · Explained

Figure 2.1

The figure illustrates how hardware capability and algorithmic efficiency co-evolve, expanding the unknown search space.

1:26 · Explained

End of Dennard Scaling

The end of Dennard scaling raises power-density concerns, pushing new hardware concepts and contributing to the deep-learning resurgence.

1:42 · Explained

No Free Lunch Theorem

No single algorithm dominates across all problems; domain knowledge is needed to achieve efficiency.

1:35 · Explained

Evolutionary Algorithms Overview

Evolutionary algorithms are population-based metaheuristics driven by selection, variation, and reproduction.

1:21 · Explained
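
The selection–variation–reproduction loop this chapter describes can be sketched in a few lines. This is an illustrative toy (the function names and the OneMax task are assumptions, not the review's algorithm):

```python
import random

def evolve(init, fitness, mutate, pop_size=50, generations=100):
    """Minimal evolutionary loop: selection keeps the fitter half,
    variation (mutation) and reproduction refill the population."""
    population = [init() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]          # truncation selection
        children = [mutate(random.choice(parents))
                    for _ in range(pop_size - len(parents))]
        population = parents + children                # parents survive (elitism)
    return max(population, key=fitness)

# Toy usage: maximize the number of ones in a 20-bit string (OneMax).
best = evolve(
    init=lambda: [random.randint(0, 1) for _ in range(20)],
    fitness=sum,
    mutate=lambda bits: [b ^ (random.random() < 0.05) for b in bits],
)
```

Because the parents survive unchanged each generation, the best fitness never decreases.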

Fitness and Objectives in EAs

Fitness functions quantify success and guide selection; multi-objective optimization seeks Pareto-optimal solutions.

1:28 · Explained
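
The Pareto optimality used in multi-objective EAs reduces to a dominance check; a minimal sketch, assuming all objectives are minimized:

```python
def dominates(a, b):
    """True if objective vector `a` Pareto-dominates `b`: no worse in
    every objective and strictly better in at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Toy usage: two conflicting objectives, e.g. (cost, error), both minimized.
front = pareto_front([(1, 9), (3, 3), (9, 1), (5, 5), (2, 8)])
```

Here (5, 5) is dominated by (3, 3) and drops out; the remaining points are mutually non-dominated trade-offs.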

Traditional Techniques Overview

An overview of established EAs used in industry and research.

1:47 · Explained

Evolutionary Strategy (ES)

ES evolves continuous parameters using mutation and selection, with CMA-ES and CMSA-ES variants.

1:30 · Explained
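
As a rough illustration of the plain ES mechanism (Gaussian mutation of continuous parameters plus truncation selection), one might write the sketch below. The CMA-ES and CMSA-ES variants mentioned above adapt the mutation distribution itself and are not shown; all parameter values here are illustrative:

```python
import random

def es_minimize(f, x0, sigma=0.3, mu=5, lam=20, generations=200):
    """(mu, lambda) evolution strategy sketch: each offspring is a parent
    plus Gaussian noise; only the best mu offspring become parents."""
    parents = [list(x0) for _ in range(mu)]
    for _ in range(generations):
        offspring = [[xi + random.gauss(0.0, sigma) for xi in random.choice(parents)]
                     for _ in range(lam)]
        offspring.sort(key=f)      # comma selection: old parents are discarded
        parents = offspring[:mu]
        sigma *= 0.99              # simple step-size decay (self-adaptation omitted)
    return min(parents, key=f)

# Toy usage: minimize the sphere function.
best = es_minimize(lambda x: sum(xi * xi for xi in x), x0=[2.0, -2.0])
```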

Genetic Algorithms (GA)

GA optimizes fixed-length strings representing variables or parameters using crossover and mutation.

1:30 · Explained
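
A single GA generation over fixed-length strings might look like this sketch; tournament selection, one-point crossover, and bit-flip mutation are common choices, not necessarily the exact operators surveyed:

```python
import random

def ga_step(population, fitness, p_mut=0.02):
    """One GA generation over fixed-length bitstrings: tournament
    selection, one-point crossover, and per-bit mutation."""
    def tournament():
        a, b = random.sample(population, 2)
        return a if fitness(a) >= fitness(b) else b

    next_pop = []
    while len(next_pop) < len(population):
        p1, p2 = tournament(), tournament()
        cut = random.randrange(1, len(p1))                       # one-point crossover
        child = [b ^ (random.random() < p_mut) for b in p1[:cut] + p2[cut:]]
        next_pop.append(child)
    return next_pop

# Toy usage: evolve 16-bit strings toward all ones.
pop = [[random.randint(0, 1) for _ in range(16)] for _ in range(40)]
for _ in range(60):
    pop = ga_step(pop, fitness=sum)
```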

Genetic Programming (GP)

GP evolves executable programs or equations using tree-like representations.

1:34 · Explained

Genetic Improvement (GI)

GI optimizes existing working code to improve performance or correctness.

1:23 · Explained

Grammatical Evolution (GE)

GE evolves programs by mapping integer genomes through a BNF grammar, so generated code is always syntactically valid.

1:29 · Explained

Linear Genetic Programming (LGP)

LGP uses linear programs for sequential problems and low-level optimizations.

1:45 · Explained

Cartesian Genetic Programming (CGP)

CGP encodes programs as node graphs arranged on a Cartesian grid and typically evolves very small populations.

1:34 · Explained

Differential Evolution (DE)

DE perturbs candidate vectors with weighted differences of other population members, letting the population self-organize around optima.

1:40 · Explained
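
The weighted-difference mutation that defines DE can be sketched as one DE/rand/1/bin generation; the control parameters F and CR below are illustrative defaults, not values from the review:

```python
import random

def de_step(pop, f, F=0.8, CR=0.9):
    """One DE/rand/1/bin generation: each target vector is perturbed by a
    weighted difference of two other members, then binomially crossed over;
    the trial replaces the target only if it is no worse (greedy selection)."""
    new_pop = []
    for i, target in enumerate(pop):
        a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
        mutant = [ai + F * (bi - ci) for ai, bi, ci in zip(a, b, c)]
        j_rand = random.randrange(len(target))       # force at least one mutant gene
        trial = [m if (random.random() < CR or k == j_rand) else t
                 for k, (m, t) in enumerate(zip(mutant, target))]
        new_pop.append(trial if f(trial) <= f(target) else target)
    return new_pop

# Toy usage: minimize the sphere function in 3 dimensions.
pop = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(20)]
for _ in range(150):
    pop = de_step(pop, f=lambda x: sum(v * v for v in x))
```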

Gene Expression Programming (GEP)

GEP uses fixed-length strings encoding expression trees to generate valid programs.

1:19 · Explained

Specialized Techniques

Covers exotic and hybrid EAs beyond traditional methods.

1:41 · Explained

Auto-constructive Evolution

Individuals construct their own offspring, so evolution proceeds without a centrally imposed reproduction mechanism.

0:57 · Explained

Neuroevolution

Uses genetic algorithms to optimize neural-network weights and architectures.

0:32 · Explained
