| Tag | Ind1 | Ind2 | Subfields / Content |
|---|---|---|---|
| 000 | | | 03042nam a22004695i 4500 |
| 001 | | | 978-3-319-03422-5 |
| 003 | | | DE-He213 |
| 005 | | | 20200420221259.0 |
| 007 | | | cr nn 008mamaa |
| 008 | | | 131204s2014 gw \| s \|\|\|\| 0\|eng d |
| 020 | | | _a9783319034225 _9978-3-319-03422-5 |
| 024 | 7 | | _a10.1007/978-3-319-03422-5 _2doi |
| 050 | | 4 | _aQ342 |
| 072 | | 7 | _aUYQ _2bicssc |
| 072 | | 7 | _aCOM004000 _2bisacsh |
| 082 | 0 | 4 | _a006.3 _223 |
| 100 | 1 | | _aKramer, Oliver. _eauthor. |
| 245 | 1 | 2 | _aA Brief Introduction to Continuous Evolutionary Optimization _h[electronic resource] / _cby Oliver Kramer. |
| 264 | | 1 | _aCham : _bSpringer International Publishing : _bImprint: Springer, _c2014. |
| 300 | | | _aXI, 94 p. 29 illus., 24 illus. in color. _bonline resource. |
| 336 | | | _atext _btxt _2rdacontent |
| 337 | | | _acomputer _bc _2rdamedia |
| 338 | | | _aonline resource _bcr _2rdacarrier |
| 347 | | | _atext file _bPDF _2rda |
| 490 | 1 | | _aSpringerBriefs in Applied Sciences and Technology, _x2191-530X |
| 505 | 0 | | _aPart I Foundations -- Part II Advanced Optimization -- Part III Learning -- Part IV Appendix. |
| 520 | | | _aPractical optimization problems are often hard to solve, in particular when they are black boxes and no further information about the problem is available except via function evaluations. This work introduces a collection of heuristics and algorithms for black box optimization with evolutionary algorithms in continuous solution spaces. The book gives an introduction to evolution strategies and parameter control. Heuristic extensions are presented that allow optimization in constrained, multimodal, and multi-objective solution spaces. An adaptive penalty function is introduced for constrained optimization. Meta-models reduce the number of fitness and constraint function calls in expensive optimization problems. The hybridization of evolution strategies with local search allows fast optimization in solution spaces with many local optima. A selection operator based on reference lines in objective space is introduced to optimize multiple conflictive objectives. Evolutionary search is employed for learning kernel parameters of the Nadaraya-Watson estimator, and a swarm-based iterative approach is presented for optimizing latent points in dimensionality reduction problems. Experiments on typical benchmark problems as well as numerous figures and diagrams illustrate the behavior of the introduced concepts and methods. |
| 650 | | 0 | _aEngineering. |
| 650 | | 0 | _aArtificial intelligence. |
| 650 | | 0 | _aComputational intelligence. |
| 650 | 1 | 4 | _aEngineering. |
| 650 | 2 | 4 | _aComputational Intelligence. |
| 650 | 2 | 4 | _aArtificial Intelligence (incl. Robotics). |
| 710 | 2 | | _aSpringerLink (Online service) |
| 773 | 0 | | _tSpringer eBooks |
| 776 | 0 | 8 | _iPrinted edition: _z9783319034218 |
| 830 | | 0 | _aSpringerBriefs in Applied Sciences and Technology, _x2191-530X |
| 856 | 4 | 0 | _uhttp://dx.doi.org/10.1007/978-3-319-03422-5 |
| 912 | | | _aZDB-2-ENG |
| 942 | | | _cEBK |
| 999 | | | _c53104 _d53104 |