Optimization and Operations Research: Proceedings of a Conference Held at Oberwolfach, July 27-August 2, 1975
Paperback

$138.99

This title is printed to order. This book may have been self-published; if so, we cannot guarantee the quality of the content. In the main, most books will have gone through the editing process; however, some may not. We therefore suggest that you be aware of this before ordering this book. If in doubt, check the author's or publisher's details, as we are unable to accept returns unless the book is faulty. Please contact us if you have any questions.

The variable metric algorithm is widely recognised as one of the most efficient ways of solving the following problem: locate x*, a local minimum point of f(x), x ∈ R^n (1). Considerable attention has been given to the study of the convergence properties of this algorithm, especially for the case where analytic expressions are available for the derivatives g_i = ∂f/∂x_i, i = 1, ..., n (2). In particular we shall mention the results of Wolfe (1969) and Powell (1972, 1975). Wolfe established general conditions under which a descent algorithm will converge to a stationary point, and Powell showed that two particular very efficient algorithms that cannot be shown to satisfy Wolfe's conditions do in fact converge to the minimum of convex functions under certain conditions. These results will be stated more completely in Section 2. In most practical problems analytic expressions for the gradient vector g (Eq. 2) are not available, and numerical derivatives are subject to truncation error. In Section 3 we shall consider the effects of these errors on Wolfe's convergence properties and will discuss possible modifications of the algorithms to make them reliable in these circumstances. The effects of rounding error are considered in Section 4, whilst in Section 5 these thoughts are extended to include the case of on-line function minimisation where each function evaluation is subject to random noise.
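As an illustration of the numerical-derivative issue the abstract raises, here is a minimal Python sketch (not taken from the book) of a forward-difference gradient approximation of the kind referred to in Equation (2); the function fd_gradient, the step size h, and the quadratic test function are assumptions for illustration only. The step size governs the trade-off between the truncation error and rounding error discussed in Sections 3 and 4.

import numpy as np

def fd_gradient(f, x, h=1e-6):
    # Approximate g_i = df/dx_i by forward differences (hypothetical helper).
    # Truncation error is O(h), while rounding error grows roughly like eps/h,
    # so h must balance the two.
    x = np.asarray(x, dtype=float)
    fx = f(x)
    g = np.zeros_like(x)
    for i in range(x.size):
        step = np.zeros_like(x)
        step[i] = h
        g[i] = (f(x + step) - fx) / h
    return g

# Example: the quadratic f(x) = x.T @ A @ x has exact gradient 2*A@x.
A = np.array([[2.0, 0.5], [0.5, 1.0]])
f = lambda x: x @ A @ x
x0 = np.array([1.0, -2.0])
print(fd_gradient(f, x0))  # approximately [2.0, -3.0]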

In Shop
Out of stock
Shipping & Delivery

$9.00 standard shipping within Australia
FREE standard shipping within Australia for orders over $100.00
Express & International shipping calculated at checkout

MORE INFO
Format: Paperback
Publisher: Springer-Verlag Berlin and Heidelberg GmbH & Co. KG
Country: Germany
Date: 1 February 1976
Pages: 318
ISBN: 9783540076162
