derivative_free_optimization: revision 2012/05/24 09:59 (current); previous revision 2012/05/24 09:53 by prokop
====== Derivative Free Optimization ======
  - http://en.wikipedia.org/wiki/Pattern_search_(optimization)
  - http://en.wikipedia.org/wiki/Random_search
  - http://en.wikipedia.org/wiki/Nelder%E2%80%93Mead_method (Amoeba)
Pattern search is relatively fast because it samples only a distinguished set of directions: at least 1 and at most 2N per iteration. It remembers the previous successful direction, so it moves efficiently.

A problem arises when the landscape contains a narrow valley oblique to the main axes. In that case pattern search still converges, but only with a very short step (so that it fits inside the width of the valley), which leads to a large number of required iterations.

A remedy is to use the knowledge of the surrounding sampled points to estimate the gradient, and possibly also the vertex of a fitted parabola.
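The gradient-and-parabola idea can be sketched along a single axis. This is a minimal illustration, not the page's own method: the function name and the assumption of three equally spaced samples are mine.

```python
def estimate_gradient_and_vertex(f0, fm, fp, x0, h):
    """Given samples f(x0), f(x0 - h), f(x0 + h) along one axis,
    estimate the local slope by a central difference and locate the
    vertex of the parabola interpolating the three points."""
    grad = (fp - fm) / (2.0 * h)           # central-difference slope
    curv = (fp - 2.0 * f0 + fm) / h ** 2   # second-derivative estimate
    if curv <= 0.0:
        return grad, None                  # no minimum along this axis
    vertex = x0 - grad / curv              # Newton step to the parabola minimum
    return grad, vertex
```

For a quadratic energy the vertex is recovered exactly; e.g. for f(x) = (x - 3)^2 sampled at 0 with h = 1, the estimate returns slope -6 and vertex 3, so one such fit can replace many short pattern-search steps inside a narrow valley.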
| + | |||
| + | |||
| + | === algoritmus === | ||
| + | |||
| + | == Pattern run == | ||
  - choose a direction from the remaining ones and sample it
  - if the energy is lower, move there
  - if the energy is higher