Stochastic Optimization. Authors: Johannes Josef Schneider, Scott Kirkpatrick. The search for optimal solutions pervades our daily lives. From the scientific point of view, optimization procedures play an eminent role whenever exact solutions to a given problem are not at hand or a compromise has to be sought, e.g. to obtain a sufficiently accurate solution within a given amount of time. This book addresses stochastic optimization procedures in a broad manner, giving an overview of the most relevant optimization philosophies in the first part. The second part deals with benchmark problems in depth, applying a selection of optimization procedures to them in sequence. While written primarily with scientists and students from the physical and engineering sciences in mind, this book addresses the larger community of all those wishing to learn about stochastic optimization techniques and how to use them. (A minimal sketch of one such procedure appears below.)
2019-12-21 18:57:33 41.15MB Stochastic Optimization
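As a taste of the procedures the book surveys, here is a minimal simulated-annealing sketch; co-author Scott Kirkpatrick co-introduced the method. The objective function, cooling schedule, and all parameters below are illustrative assumptions, not material from the book.

```python
# A minimal simulated-annealing sketch. The rugged 1-D objective and the
# geometric cooling schedule are illustrative assumptions only.
import math
import random

def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.999, iters=10000):
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        y = x + random.uniform(-step, step)   # propose a random move
        fy = f(y)
        # always accept downhill moves; accept uphill moves with
        # Boltzmann probability exp(-delta/T)
        if fy < fx or random.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling                          # cool down gradually
    return best, fbest

# Usage: a 1-D function with many local minima.
x_best, f_best = simulated_annealing(lambda x: x * x + 10 * math.sin(x), 8.0)
print(x_best, f_best)
```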
E-book: a classic introductory text on structural topology optimization.
2019-12-21 18:57:12 40.89MB Mechanical Engineering, Introduction to Topology Optimization
Optimal Foraging Algorithm (OFA, 2016); the code includes the foraging operations described in the paper.
2019-12-21 18:53:35 3.99MB Foraging Algorithm
The Adam algorithm for deep learning, shared here for study. We introduce Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments. The method is straightforward to implement, is computationally efficient, has low memory requirements, is invariant to diagonal rescaling of the gradients, and is well suited for problems that are large in terms of data and/or parameters. The method is also appropriate for non-stationary objectives and problems with very noisy and/or sparse gradients. The hyper-parameters have intuitive interpretations and typically require little tuning. Some connections to related algorithms, by which Adam was inspired, are discussed. We also analyze the theoretical convergence properties of the algorithm and provide a regret bound on the convergence rate that is comparable to the best known results under the online convex optimization framework. Empirical results demonstrate that Adam works well in practice and compares favorably to other stochastic optimization methods. Finally, we discuss AdaMax, a variant of Adam based on the infinity norm. (A compact sketch of the update rule follows this entry.)
2019-12-21 18:52:51 571KB Adam Algorithm
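The abstract above fully specifies the update rule, so a compact sketch is straightforward; the hyper-parameter defaults below follow the paper, while the quadratic test objective and step count are made-up illustrations.

```python
# A sketch of the Adam update rule (Kingma & Ba). Hyper-parameter
# defaults follow the paper; the test problem is a made-up quadratic.
import numpy as np

def adam(grad, x0, steps=5000, alpha=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    x = x0.astype(float)
    m = np.zeros_like(x)   # biased first-moment (mean) estimate
    v = np.zeros_like(x)   # biased second-moment (uncentered variance) estimate
    for t in range(1, steps + 1):
        g = grad(x)
        m = beta1 * m + (1 - beta1) * g        # update first moment
        v = beta2 * v + (1 - beta2) * g * g    # update second moment
        m_hat = m / (1 - beta1 ** t)           # bias-corrected first moment
        v_hat = v / (1 - beta2 ** t)           # bias-corrected second moment
        x -= alpha * m_hat / (np.sqrt(v_hat) + eps)
    return x

# Usage: minimize f(x) = ||x - 3||^2, whose gradient is 2 * (x - 3).
print(adam(lambda x: 2 * (x - 3.0), np.zeros(5)))  # approaches [3, 3, 3, 3, 3]
```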
I searched for a long time within China without finding this, so I downloaded a copy from an overseas site and am sharing it here: the solutions to the additional exercises for Stanford professor Stephen Boyd's CVX optimization course. (A small illustrative example follows this entry.)
2019-12-21 18:52:03 4.82MB cvx additional exercises
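CVX itself is a MATLAB modeling toolbox, so the course exercises are not in Python; as a loose analogue (an assumption for illustration, not part of the course materials), the cvxpy library expresses the same class of problems. The toy non-negative least-squares problem below is made up, not one of the exercises.

```python
# A toy convex problem in cvxpy (a Python analogue of MATLAB's CVX):
# non-negative least squares with random data.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

x = cp.Variable(5)
problem = cp.Problem(cp.Minimize(cp.sum_squares(A @ x - b)), [x >= 0])
problem.solve()
print(problem.status, x.value)
```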
Evolutionary Optimization Algorithms, Wiley.
2019-12-21 18:49:31 10.38MB Evolutionary Optimization Algorithms
Theory of Multiobjective Optimization (multi-objective optimization theory).
2019-12-21 18:44:20 3.6MB Multi-objective
Convex Optimization (Boyd and Vandenberghe): solutions to the end-of-chapter exercises.
2010-09-24 00:00:00 1.75MB Convex Optimization, Exercises