
Constrained and Unconstrained Optimization

Carlos Hurtado

Department of Economics

University of Illinois at Urbana-Champaign

hrtdmrt2@illinois.edu

Oct 10th, 2017

C. Hurtado (UIUC - Economics), Numerical Methods

On the Agenda

1. Numerical Optimization
2. Minimization of a Scalar Function
3. Golden Search
4. Newton's Method
5. Polytope Method
6. Newton's Method Reloaded
7. Quasi-Newton Methods
8. Non-linear Least-Squares
9. Constrained Optimization

Numerical Optimization



- In some economic problems, we would like to find the value that maximizes or minimizes a function.
- We are going to focus on minimization problems: min_x f(x), or min_x f(x) s.t. x ∈ B.
- Notice that minimization and maximization are equivalent, because we can maximize f(x) by minimizing -f(x).


- We want to solve this problem in a reasonable time.
- Most often, the CPU time is dominated by the cost of evaluating f(x).
- We would like to keep the number of evaluations of f(x) as small as possible.
- There are two types of objectives:
  - Finding a global minimum: the lowest possible value of the function over the range.
  - Finding a local minimum: the smallest value within a bounded neighborhood.

Minimization of Scalar Function



Bracketing Method

- We would like to find the minimum of a scalar function f(x), with f : R → R.
- The bracketing method is a direct method that does not use curvature or a local approximation.
- We start with a bracket (a, b, c) such that a < b < c, f(a) > f(b) and f(c) > f(b).
- We search for the minimum by selecting a trial point d in one of the two intervals.
- If c - b > b - a, take d = (b + c)/2.
- If f(d) > f(b), the new bracket is (a, b, d); if f(d) < f(b), the new bracket is (b, d, c).
- Repeat until the bracket is small.
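These steps can be sketched in Python (a minimal sketch, not from the slides; the function name `bracket_minimize`, the example function, and the tolerance `tol` are illustrative):

```python
def bracket_minimize(f, a, b, c, tol=1e-8):
    """Shrink a bracket (a, b, c) with f(b) < f(a) and f(b) < f(c)
    until it is narrower than tol; return the interior point b."""
    while c - a > tol:
        # Place the trial point d at the midpoint of the larger sub-interval.
        if c - b > b - a:
            d = (b + c) / 2.0
            if f(d) > f(b):
                c = d              # new bracket (a, b, d)
            else:
                a, b = b, d        # new bracket (b, d, c)
        else:
            d = (a + b) / 2.0
            if f(d) > f(b):
                a = d              # new bracket (d, b, c)
            else:
                c, b = b, d        # new bracket (a, d, b)
    return b

# The minimum of f(x) = (x - 1)^2 lies at x = 1.
x_min = bracket_minimize(lambda x: (x - 1.0)**2, -3.0, 0.0, 4.0)
```

Each iteration shrinks the bracket by a constant factor, so the loop terminates for any valid starting bracket.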


- We selected the new point as the midpoint between the extremes, but what is the best location for the new point d?
- One possibility is to minimize the size of the next search interval.
- The next search interval will be either from a to d or from b to c.
- The proportion of the left interval is w = (b - a)/(c - a).
- The proportion of the new interval is z = (d - b)/(c - a).

Golden Search



- The proportion of the new segment will be 1 - w = (c - b)/(c - a), or w + z = (d - a)/(c - a).
- Moreover, if d is the new candidate to minimize the function, z/(1 - w) = [(d - b)/(c - a)] / [(c - b)/(c - a)] = (d - b)/(c - b).
- Ideally we will have z = 1 - 2w and z/(1 - w) = w.


- The previous equations imply w² - 3w + 1 = 0, or w = (3 - √5)/2 ≈ 0.38197.
- In mathematics, the golden ratio is φ = (1 + √5)/2.
- This goes back to Pythagoras.
- Notice that 1 - 1/φ = (3 - √5)/2.
- The Golden Search algorithm uses the golden ratio to set the new point (using a weighted average).
- This reduces the bracket by about 40% at each step.
- The performance is independent of the function that is being minimized.
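A minimal golden-section search built from the ratio w = (3 - √5)/2 derived above (a sketch, not from the slides; the function name and stopping tolerance are illustrative, and for clarity it re-evaluates f instead of caching values):

```python
import math

def golden_search(f, a, c, tol=1e-8):
    """Locate a minimum of a unimodal f on [a, c] by golden-section search."""
    w = (3.0 - math.sqrt(5.0)) / 2.0   # ~0.38197, root of w^2 - 3w + 1 = 0
    b = a + w * (c - a)                # interior point at the golden fraction
    d = c - w * (c - a)                # symmetric interior point, b < d
    while c - a > tol:
        if f(b) < f(d):
            c, d = d, b                # minimum lies in [a, d]; reuse b
            b = a + w * (c - a)
        else:
            a, b = b, d                # minimum lies in [b, c]; reuse d
            d = c - w * (c - a)
    return (a + c) / 2.0

# Local minimum of y = x(x - 2)(x + 2)^2 on [0, 2], near x = 1.28078.
x_min = golden_search(lambda x: (x - 2.0) * x * (x + 2.0)**2, 0.0, 2.0)
```

Because each step keeps one interior point, only one new function evaluation is needed per iteration, and the bracket shrinks by the constant factor 1 - w ≈ 0.618.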


- Sometimes the performance can be improved substantially when a local approximation is used.
- When we use a combination of local approximation and golden search, we get a method called Brent's method.
- Let us suppose that we want to minimize y = x(x - 2)(x + 2)².

[Figure: plot of y = x(x - 2)(x + 2)²]


- We can use the minimize_scalar function from the scipy.optimize module.

>>> def f(x):
...     return (x - 2) * x * (x + 2)**2
>>> from scipy.optimize import minimize_scalar
>>> opt_res = minimize_scalar(f)
>>> print(opt_res.x)
1.28077640403
>>> opt_res = minimize_scalar(f, method='golden')
>>> print(opt_res.x)
1.28077640147
>>> opt_res = minimize_scalar(f, bounds=(-3, -1), method='bounded')
>>> print(opt_res.x)
-2.0000002026

Newton's Method



- Let us assume that the function f(x) : R → R is infinitely differentiable.
- We can approximate f around a point a by a second-order Taylor expansion: p(x) = f(a) + f'(a)(x - a) + (1/2) f''(a)(x - a)².
- To find an optimal value for p(x) we use the FOC: p'(x) = f'(a) + (x - a) f''(a) = 0.
- Hence, x = a - f'(a)/f''(a).


- Newton's method starts with a given x₁.
- To compute the next candidate to minimize the function we use x_{n+1} = x_n - f'(x_n)/f''(x_n).
- Do this until |x_{n+1} - x_n| < ε and |f'(x_{n+1})| < ε.
- Newton's method is very fast (quadratic convergence).
- Theorem: |x_{n+1} - x*| ≤ (|f'''(x*)| / (2|f''(x*)|)) |x_n - x*|².
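The iteration can be sketched directly, with the derivatives of the example function y = x(x - 2)(x + 2)² supplied by hand (a sketch, not from the slides; helper names and tolerances are illustrative):

```python
def newton_minimize(fprime, fprime2, x0, eps=1e-10, max_iter=100):
    """Minimize by iterating x_{n+1} = x_n - f'(x_n)/f''(x_n)."""
    x = x0
    for _ in range(max_iter):
        x_new = x - fprime(x) / fprime2(x)
        if abs(x_new - x) < eps and abs(fprime(x_new)) < eps:
            return x_new
        x = x_new
    return x

# For f(x) = (x - 2) x (x + 2)^2 = x^4 + 2x^3 - 4x^2 - 8x:
fp  = lambda x: 4*x**3 + 6*x**2 - 8*x - 8   # f'(x)
fp2 = lambda x: 12*x**2 + 12*x - 8          # f''(x)

# Starting near the local minimum at (1 + sqrt(17))/4 ~ 1.28078.
x_min = newton_minimize(fp, fp2, x0=1.0)
```

Note that the same iteration started elsewhere may converge to a different critical point (e.g. the local maximum near -0.78), which is one of the issues discussed below.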


- A quick detour: root finding.
- Consider the problem of finding zeros of p(x).
- Assume that you know a point a where p(a) is positive and a point b where p(b) is negative.
- If p(x) is continuous between a and b, we can approximate it as: p(x) ≈ p(a) + (x - a) p'(a).
- The approximate zero is then: x = a - p(a)/p'(a).
- The idea is the same as before: Newton's method also works for finding roots.
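The same update applied to root finding, sketched on an illustrative polynomial p(x) = x² - 2 (the names and starting point are assumptions, not from the slides):

```python
def newton_root(p, pprime, x0, tol=1e-12, max_iter=100):
    """Find a zero of p by iterating x <- x - p(x)/p'(x)."""
    x = x0
    for _ in range(max_iter):
        step = p(x) / pprime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# p(x) = x^2 - 2 has the positive root sqrt(2).
root = newton_root(lambda x: x*x - 2.0, lambda x: 2.0*x, x0=1.0)
```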


- There are several issues with Newton's method:
  - The iteration point can be stationary.
  - The starting point may enter a cycle.
  - The derivative may not exist.
  - The derivative may be discontinuous.
- Newton's method finds a local optimum, but not necessarily a global optimum.

Polytope Method



- The Polytope (a.k.a. Nelder-Mead) method is a direct method to find the solution of min_x f(x), where f : Rⁿ → R.
- We start with the points x₁, x₂ and x₃, such that f(x₁) ≥ f(x₂) ≥ f(x₃).
- Using the midpoint between x₂ and x₃, we reflect x₁ to the point y₁.
- Check whether f(y₁) < f(x₁); if so, replace the worst point x₁ with y₁ and repeat.

- Let us consider the following function: f(x₀, x₁) = (1 - x₀)² + 100(x₁ - x₀²)².
- The function looks like:

[Figure: surface and contour plots of f(x₀, x₁)]


- In Python we can do:

>>> def f2(x):
...     return (1 - x[0])**2 + 100*(x[1] - x[0]**2)**2
>>> from scipy.optimize import fmin
>>> opt = fmin(func=f2, x0=[0, 0])
>>> print(opt)
[ 1.00000439  1.00001064]

Newton's Method Reloaded



- What can we do if we want to use Newton's method for a function f : Rⁿ → R?
- We can use a quadratic approximation at a, with aᵀ = (a₁, ..., aₙ): p(x) = f(a) + ∇f(a)(x - a) + (1/2)(x - a)ᵀ H(a)(x - a), where xᵀ = (x₁, ..., xₙ).
- The gradient ∇f(x) is a multi-variable generalization of the derivative.


- The Hessian matrix H(x) is a square matrix of second-order partial derivatives that describes the local curvature of a function of many variables:

  H(x) = [ ∂²f/∂x₁²     ···   ∂²f/∂x₁∂xₙ ]
         [     ⋮                  ⋮      ]
         [ ∂²f/∂xₙ∂x₁   ···   ∂²f/∂xₙ²   ]

- The FOC is: ∇p = ∇f(a) + H(a)(x - a) = 0.
- We can solve this to get: x = a - H(a)⁻¹ ∇f(a).


- Following the same logic as in the one-dimensional case: x_{k+1} = x_k - H(x_k)⁻¹ ∇f(x_k).
- How do we compute H(x_k)⁻¹ ∇f(x_k)?
- Instead of inverting H, set s = H(x_k)⁻¹ ∇f(x_k) and solve the linear system H(x_k) s = ∇f(x_k).
- The search direction, s, is the solution of a system of equations (and we know how to solve that!).
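One way to sketch this step for the function f(x₀, x₁) = (1 - x₀)² + 100(x₁ - x₀²)² used earlier: the direction s comes from numpy.linalg.solve rather than an explicit inverse (a sketch; the hand-derived gradient and Hessian and the fixed iteration count are illustrative assumptions):

```python
import numpy as np

def newton_step(grad, hess, x):
    """x_{k+1} = x_k - s, where s solves H(x_k) s = grad f(x_k)."""
    s = np.linalg.solve(hess(x), grad(x))
    return x - s

# Gradient and Hessian of f(x0, x1) = (1 - x0)^2 + 100 (x1 - x0^2)^2,
# derived by hand.
def grad(x):
    return np.array([-2.0*(1.0 - x[0]) - 400.0*x[0]*(x[1] - x[0]**2),
                     200.0*(x[1] - x[0]**2)])

def hess(x):
    return np.array([[2.0 - 400.0*(x[1] - x[0]**2) + 800.0*x[0]**2, -400.0*x[0]],
                     [-400.0*x[0], 200.0]])

x = np.array([0.5, 0.5])
for _ in range(20):          # iterate the step; the minimum is at (1, 1)
    x = newton_step(grad, hess, x)
```

These are pure Newton steps without a line search, so convergence is only guaranteed close enough to the minimum; production solvers add safeguards.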

Quasi-Newton Methods



- For Newton's method we need the Hessian of the function.
- If the Hessian is unavailable, the "full" Newton's method cannot be used.
- Any method that replaces the Hessian with an approximation is a quasi-Newton method.
- One advantage of quasi-Newton methods is that the Hessian matrix does not need to be inverted.
- Newton's method requires the Hessian to be inverted, which is typically implemented by solving a system of equations.
- Quasi-Newton methods usually generate an estimate of the inverse directly.


- In the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm, the Hessian matrix is approximated using updates specified by gradient evaluations (or approximate gradient evaluations).
- In Python:

>>> import numpy as np
>>> from scipy.optimize import fmin_bfgs
>>> def f(x):
...     return (1 - x[0])**2 + 100*(x[1] - x[0]**2)**2
>>> opt = fmin_bfgs(f, x0=[0.5, 0.5])

- Using the gradient we can improve the approximation:

>>> def gradient(x):
...     return np.array((-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
...                      200*(x[1] - x[0]**2)))
>>> opt2 = fmin_bfgs(f, x0=[10, 10], fprime=gradient)


- One of the methods that requires the fewest function calls (and is therefore very fast) is the Newton-Conjugate-Gradient (NCG) method.
- The method uses a conjugate gradient algorithm to (approximately) invert the local Hessian.
- If the Hessian is positive definite, then the local minimum of this function can be found by setting the gradient of the quadratic form to zero.
- In Python:

>>> from scipy.optimize import fmin_ncg
>>> opt3 = fmin_ncg(f, x0=[10, 10], fprime=gradient)

Non-linear Least-Squares



- Suppose it is desired to fit a set of data {x_i, y_i} to a model y = f(x; p), where p is a vector of parameters for the model that need to be found.
- A common method for determining which parameter vector gives the best fit to the data is to minimize the sum of squared errors. (Why?)
- The error is usually defined for each observed data point as: e_i(y_i, x_i; p) = y_i - f(x_i; p).
- The sum of the squares of the errors is: S(p; x, y) = Σ_{i=1}^{N} e_i²(y_i, x_i; p).


- Suppose that we model some population data observed at several times t_i: y_i = f(t_i; (A, b)) = A e^{b t_i}.
- The parameters A and b are unknown to the economist.
- We would like to minimize the sum of squared errors to approximate the data.
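A sketch of such a fit with scipy.optimize.curve_fit, which minimizes S(p) internally (the synthetic data, generated here from A = 2 and b = 0.5, and the starting guess p0 are illustrative assumptions):

```python
import numpy as np
from scipy.optimize import curve_fit

def model(t, A, b):
    """Exponential growth model y = A * exp(b * t)."""
    return A * np.exp(b * t)

# Synthetic "population" data generated from A = 2, b = 0.5 (noiseless).
t = np.linspace(0.0, 4.0, 20)
y = 2.0 * np.exp(0.5 * t)

# curve_fit minimizes the sum of squared errors S(p) over p = (A, b).
p_opt, p_cov = curve_fit(model, t, y, p0=[1.0, 0.1])
A_hat, b_hat = p_opt
```

With noisy data the recovered parameters would only approximate the true values, and p_cov gives an estimate of their covariance.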