Handbook of metaheuristic algorithms : from fundamental theories to advanced applications / Chun-Wei Tsai and Ming-Chao Chiang.

O'Reilly Online Learning: Academic/Public Library Edition Available online

Format:
Book
Author/Creator:
Tsai, Chun-Wei, author.
Chiang, Ming-Chao, author.
Series:
Uncertainty, computational techniques, and decision intelligence.
Language:
English
Subjects (All):
Metaheuristics.
Physical Description:
1 online resource (xxxviii, 584 pages) : illustrations (black & white).
Place of Publication:
London ; San Diego, CA : Academic Press, an imprint of Elsevier, [2023]
Summary:
Handbook of Metaheuristic Algorithms: From Fundamental Theories to Advanced Applications provides an introduction to metaheuristic algorithms from the ground up, covering both basic ideas and advanced solutions. Although readers may be able to find source code for some metaheuristic algorithms on the Internet, the coding styles and explanations vary widely, leaving readers to bridge the gap between theory and implementation on their own. This book can also help students and researchers construct an integrated perspective of metaheuristic and unsupervised algorithms for artificial intelligence research in computer science and applied engineering domains. Metaheuristic algorithms can be considered the epitome of unsupervised learning algorithms for the optimization of engineering and artificial intelligence problems, and include simulated annealing (SA), tabu search (TS), the genetic algorithm (GA), ant colony optimization (ACO), particle swarm optimization (PSO), differential evolution (DE), and others. Unlike most supervised learning algorithms, which need labeled data to learn and construct decision models, metaheuristic algorithms inherit the characteristics of unsupervised learning algorithms: they solve complex engineering optimization problems without labeled data, in a self-learning manner.
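To illustrate the kind of algorithm the summary describes, here is a minimal sketch (not taken from the book) of simulated annealing applied to the one-max problem, one of the benchmark problems the book uses throughout. The parameter values (string length, iteration count, initial temperature, cooling rate) are illustrative assumptions, not the book's settings.

```python
import random, math

def one_max(s):
    # Fitness of a bit string: the number of 1 bits (maximum = len(s)).
    return sum(s)

def simulated_annealing(n=20, max_iter=2000, t0=5.0, cooling=0.99, seed=0):
    rng = random.Random(seed)
    s = [rng.randint(0, 1) for _ in range(n)]  # random initial solution
    best, t = s[:], t0
    for _ in range(max_iter):
        # Transition: flip one randomly chosen bit to get a neighbor.
        v = s[:]
        v[rng.randrange(n)] ^= 1
        delta = one_max(v) - one_max(s)
        # Determination: always accept improvements; accept worse moves
        # with probability exp(delta / t), which shrinks as t cools.
        if delta >= 0 or rng.random() < math.exp(delta / t):
            s = v
        if one_max(s) > one_max(best):
            best = s[:]
        t *= cooling  # geometric cooling schedule
    return best

if __name__ == "__main__":
    solution = simulated_annealing()
    print(one_max(solution), "of", len(solution), "bits set")
```

No labeled training data is involved: the algorithm only queries the fitness function, which is exactly the self-learning character the summary attributes to metaheuristics.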
Contents:
Front Cover
Handbook of Metaheuristic Algorithms
Copyright
Contents
List of figures
List of tables
List of algorithms
List of listings
About the authors
Chun-Wei Tsai (1978-)
Ming-Chao Chiang (1956-)
Preface
Part 1 Fundamentals
1 Introduction
1.1 Why metaheuristic algorithms
1.2 Organization of this book
2 Optimization problems
2.1 Problem definition
2.2 Combinatorial optimization problems
2.2.1 The one-max and 0-1 knapsack problems
2.2.2 The B2D and deceptive problems
2.2.3 The traveling salesman problem (TSP)
2.3 Continuous optimization problems
2.3.1 The single-objective optimization problem
2.3.2 The multi-objective optimization problem
2.4 Summary
3 Traditional methods
3.1 Exhaustive search (ES)
3.1.1 The basic idea of ES
3.1.2 Implementation of ES for the one-max problem
3.1.3 Discussion of ES
3.2 Hill climbing (HC)
3.2.1 The basic idea of HC
3.2.2 Implementation of HC for the one-max problem
3.2.2.1 Main function
3.2.2.2 Search function of HC
3.2.2.2.1 Declaration of parameters and functions
3.2.2.2.2 The main loop
3.2.2.2.3 Additional functions
3.2.2.3 Library function
3.2.3 Discussion of HC
3.3 Comparisons between ES and HC
3.3.1 Simulation results of ES and HC for the one-max problem
3.3.2 Simulation results of ES and HC for the deceptive problem
3.4 Summary of ES and HC
Supplementary source code
4 Metaheuristic algorithms
4.1 What is a metaheuristic algorithm?
4.2 A unified framework for metaheuristic algorithms
4.3 Comparisons of metaheuristics with exhaustive and greedy search
5 Simulated annealing
5.1 The basic idea of simulated annealing (SA)
5.2 Implementation of SA for the one-max and deceptive problems
5.2.1 Declaration of functions and parameters
5.2.2 The main loop
5.2.3 Additional functions
5.3 Simulation results of SA
5.3.1 Simulation results of SA for the one-max problem
5.3.2 Simulation results of SA for the deceptive problem
5.4 Discussion
6 Tabu search
6.1 The basic idea of tabu search (TS)
6.2 Implementation of TS for the one-max and deceptive problems
6.2.1 Declaration of functions and parameters
6.2.2 The main loop
6.2.3 Additional functions
6.3 Simulation results of TS
6.3.1 Simulation results of TS for the one-max problem
6.3.2 Simulation results of TS for the deceptive problem
6.4 Discussion
7 Genetic algorithm
7.1 The basic idea of genetic algorithm (GA)
7.2 Implementation of GA for the one-max and deceptive problems
7.2.1 Declaration of functions and parameters
7.2.2 The main loop
7.2.3 Additional functions
7.3 Simulation results of GA
7.3.1 Simulation results of GA for the one-max problem
7.3.2 Simulation results of GA for the deceptive problem
7.4 Discussion
8 Ant colony optimization
8.1 The basic idea of ant colony optimization (ACO)
8.1.1 The ant system (AS)
8.1.2 The ant colony system (ACS)
8.2 Implementation of ACO for the traveling salesman problem
8.2.1 Declaration of functions and parameters
8.2.2 The main loop
8.2.3 Additional functions
8.3 Simulation results of ACO for the traveling salesman problem
8.4 Discussion
9 Particle swarm optimization
9.1 The basic idea of particle swarm optimization (PSO)
9.2 Implementation of PSO for the function optimization problem
9.2.1 Declaration of functions and parameters
9.2.2 The main loop
9.2.3 Additional functions
9.3 Simulation results of PSO for the function optimization problem
9.4 Discussion
Supplementary source code
10 Differential evolution
10.1 The basic idea of differential evolution (DE)
10.2 Implementation of DE for the function optimization problem
10.2.1 Declaration of functions and parameters
10.2.2 The main loop
10.2.3 Additional functions
10.2.4 Other mutation strategies
10.3 Simulation results of DE for the function optimization problem
10.4 Discussion
Part 2 Advanced technologies
11 Solution encoding and initialization operator
11.1 Encoding of solutions
11.2 Initialization operator
11.3 Discussion
12 Transition operator
12.1 Why use different transition operators
12.2 Different transition operators of GA for solving the TSP
12.3 Implementation of GA for the TSP with different crossover operators
12.3.1 Declaration of functions and parameters
12.3.2 The main loop
12.3.3 Additional functions
12.3.4 Adding new crossover operators
12.4 Simulation results of GA for the TSP with different crossover operators
12.5 Discussion
13 Evaluation and determination operators
13.1 Evaluation operator
13.2 Determination operator
13.2.1 Determination operator for single-solution-based metaheuristic algorithms
13.2.2 Determination operator for population-based metaheuristic algorithms
13.3 Schema theorem
13.3.1 Selection and fitness function for the schema
13.3.2 Crossover for the schema
13.3.3 Mutation for the schema
13.3.4 A simple example of schema theory
13.4 Fitness landscape analysis
13.5 Discussion
14 Parallel metaheuristic algorithm
14.1 The basic idea of the parallel metaheuristic algorithm
14.1.1 Single-solution-based parallel metaheuristic algorithms
14.1.2 Population-based parallel metaheuristic algorithms
14.2 Implementation of parallel GA for the TSP
14.2.1 Declaration of functions and parameters
14.2.2 The main loop
14.2.3 Additional functions
14.3 Simulation results of parallel GA for the TSP
14.4 Discussion
15 Hybrid metaheuristic and hyperheuristic algorithms
15.1 The basic idea of the hybrid metaheuristic algorithm
15.2 The basic idea of the hyperheuristic algorithm
15.3 Implementation of the hybrid heuristic algorithm for the TSP
15.3.1 Declaration of functions and parameters
15.3.2 The main loop of HGA
15.3.3 Additional functions of HGA
15.3.4 The declaration of functions and parameters of SA
15.3.5 The main loop of SA
15.3.6 Additional functions of SA
15.4 Simulation results of the hybrid heuristic algorithm for the TSP
15.5 Discussion
16 Local search algorithm
16.1 The basic idea of local search
16.1.1 Iterating with different solutions
16.1.2 Changing the landscape of the problem
16.1.3 Accepting non-improving neighbors
16.1.4 k-opt
16.2 Metaheuristic algorithm with local search
16.3 Implementation of GA with 2-opt for the TSP
16.3.1 Declaration of functions and parameters
16.3.2 The main loop
16.3.3 Additional functions
16.4 Simulation results of GA with 2-opt for the TSP
16.5 Discussion
17 Pattern reduction
17.1 The basic idea of pattern reduction
17.2 Implementation of PREGA for clustering problems
17.2.1 Declaration of functions and parameters
17.2.2 The main loop
17.2.3 Additional functions
17.3 Simulation results of PREGA for clustering problems
17.4 Related work
17.5 Discussion
18 Search economics
18.1 The basic idea of search economics
18.1.1 The resource arrangement operator
18.1.2 The vision search operator
18.1.3 The marketing research operator
18.2 Implementation of SE for the one-max problem
18.2.1 Declaration of functions and parameters
18.2.2 The main loop
18.2.3 Additional functions
18.3 Simulation results of SE for the one-max problem
18.4 Discussion
19 Advanced applications
19.1 Data clustering
19.1.1 Problem description and definition
19.1.2 Solution encoding
19.1.3 Metaheuristic algorithm for data clustering
19.2 Cluster-head selection
19.2.1 Problem description and definition
19.2.2 Solution encoding
19.2.3 Metaheuristic algorithm for cluster-head selection
19.3 Traffic light control
19.3.1 Problem description and definition
19.3.2 Solution encoding
19.3.3 Metaheuristic algorithm for traffic light control
19.4 Hyperparameter optimization
19.4.1 Problem description and definition
19.4.2 Solution encoding
19.4.3 Metaheuristic algorithm for hyperparameter optimization
19.5 Convolutional neural network filter pruning
19.5.1 Problem description and definition
19.5.2 Solution encoding
19.5.3 Metaheuristic algorithm for convolutional neural network pruning
19.6 Discussion
20 Conclusion and future research directions
20.1 Conclusion
20.2 Future research directions
A Interpretations and analyses of simulation results
A.1 Interpretations of metaheuristics
A.1.1 Quality of the end result
A.1.2 Convergence curves
A.1.3 Number of evaluations and computation time
A.2 Analyses of metaheuristics
A.2.1 Impact of parameters and operators
A.2.2 Complexity and statistical analyses
A.3 Discussion
B Implementation in Python
References
Index
Back Cover
Notes:
Includes bibliographical references (pages 553-574) and index.
Description based on print version record.
Other Format:
Print version: Tsai, Chun-Wei Handbook of Metaheuristic Algorithms
ISBN:
0-443-19109-3
9780443191091
OCLC:
1380993852

