QUANTITATIVE METHODS for ECONOMIC APPLICATIONS / MATHEMATICS for ECONOMIC APPLICATIONS
TASK 3/2/2020
I M 1) Transform z = ((1 + √3) + (1 − √3)i)/(1 + i) into trigonometric form and then calculate z³.

z = ((1 + √3) + (1 − √3)i)/(1 + i) = ((1 + √3) + (1 − √3)i)(1 − i)/((1 + i)(1 − i)) =
= (1/2)·[(1 + √3) + (1 − √3) + ((1 − √3) − (1 + √3))i] = (1/2)·(2 − 2√3 i) = 1 − √3 i =
= 2·(1/2 − (√3/2)i) = 2·(cos(5π/3) + i sin(5π/3)).

So z³ = 2³·(cos 5π + i sin 5π) = 8·(cos π + i sin π) = −8.
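This result can be confirmed numerically; the following is a small Python sketch (my own check, not part of the exam solution):

```python
import cmath
import math

# z as given in the exercise: ((1 + √3) + (1 − √3)i) / (1 + i)
z = complex(1 + math.sqrt(3), 1 - math.sqrt(3)) / (1 + 1j)

r, phi = cmath.polar(z)        # modulus and principal argument
print(r)                       # modulus: ≈ 2
print(phi % (2 * math.pi))     # argument mod 2π: ≈ 5π/3
print(z ** 3)                  # ≈ -8 (up to floating-point noise)
```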
I M 2) Given the matrix A = [[1, 2, 1], [1, m, −2], [1, 1, k]] and the vector v = (1, 1, 1), determine the values of m and k for which the vector w = A·v is perpendicular to (0, 1, 1) and with modulus equal to √24.

w = A·v = (1 + 2 + 1, 1 + m − 2, 1 + 1 + k) = (4, m − 1, k + 2).

w = A·v perpendicular to (0, 1, 1) implies (4, m − 1, k + 2)·(0, 1, 1) = 0 ⇒
⇒ 0 + (m − 1) + (k + 2) = 0 ⇒ m + k = −1 ⇒ m = −k − 1.

Modulus of w equal to √24 implies |w| = √(4² + (m − 1)² + (k + 2)²) = √24; with m − 1 = −k − 2 we get
16 + (k + 2)² + (k + 2)² = 16 + 2(k + 2)² = 24 ⇒
⇒ 2k² + 8k + 24 = 24 ⇒ 2k(k + 4) = 0 ⇒ k = 0 and m = −k − 1 = −1,
or k = −4 and m = −k − 1 = 3.
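A quick Python check of both (m, k) pairs (my own verification sketch, not part of the exam text):

```python
import math

# w = A·(1,1,1) with A = [[1, 2, 1], [1, m, -2], [1, 1, k]]: just the row sums
def w(m, k):
    return (1 + 2 + 1, 1 + m - 2, 1 + 1 + k)

for m, k in [(-1, 0), (3, -4)]:
    w1, w2, w3 = w(m, k)
    dot = 0 * w1 + 1 * w2 + 1 * w3        # w · (0, 1, 1), should be 0
    mod2 = w1 ** 2 + w2 ** 2 + w3 ** 2    # |w|², should be 24
    print((m, k), dot, mod2, math.sqrt(mod2))
```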
I M 3) Given the matrix A = [[1, 0, 1], [1, k, 1], [1, 0, 1]], determine the two values of the parameter k for which the matrix admits multiple eigenvalues. For these values of k, check if the corresponding matrix is diagonalizable or not.

From det(A − λI) = 0 we get:

det [[1 − λ, 0, 1], [1, k − λ, 1], [1, 0, 1 − λ]] = (k − λ)·((1 − λ)² − 1) = (k − λ)(λ² − 2λ) = (k − λ)·λ·(λ − 2) = 0 ⇒ λ₁ = k, λ₂ = 0, λ₃ = 2.

To get multiple eigenvalues there are two possibilities.

1) If k = 0 ⇒ λ₁ = λ₂ = 0, λ₃ = 2, and so we get:

A − 0·I = [[1, 0, 1], [1, 0, 1], [1, 0, 1]];

from Rank(A − 0·I) = 1 ⇒ m_g(0) = 3 − 1 = 2 = m_a(0), and the matrix, for k = 0, is a diagonalizable one.

2) If k = 2 ⇒ λ₂ = 0, λ₁ = λ₃ = 2, and so we get:

A − 2·I = [[−1, 0, 1], [1, 0, 1], [1, 0, −1]]; since:

det [[−1, 1], [1, 1]] = −2 ≠ 0, from Rank(A − 2·I) = 2 ⇒ m_g(2) = 3 − 2 = 1 < 2 = m_a(2), and the matrix, for k = 2, is not a diagonalizable one.
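The factorization of the characteristic polynomial and the non-singular minor can be spot-checked in plain Python (my own sketch; `det2` and `char_poly` are helper names introduced here):

```python
def det2(a, b, c, d):
    # determinant of [[a, b], [c, d]]
    return a * d - b * c

def char_poly(k, lam):
    # det(A − λI) for A = [[1, 0, 1], [1, k, 1], [1, 0, 1]],
    # expanded along the second column
    return (k - lam) * det2(1 - lam, 1, 1, 1 - lam)

# det(A − λI) = (k − λ)·λ·(λ − 2): spot-check the factorization
for k in (5, -2, 0, 2):
    for lam in (-3, 1, 4):
        assert char_poly(k, lam) == (k - lam) * lam * (lam - 2)

# k = 0: A − 0·I has three identical rows (1, 0, 1), rank 1, so m_g = 2 = m_a
# k = 2: A − 2·I contains the minor [[-1, 1], [1, 1]], so rank 2 and m_g = 1 < 2 = m_a
print(det2(-1, 1, 1, 1))   # -2
```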
I M 4) Given the basis for ℝ³: B = {(1, 1, 0); (1, 0, 1); (0, 1, 1)}, find the coordinates of the vector w = (0, 1, 1) in such basis.

To solve the problem we must simply solve the system w = B·x:

[[1, 1, 0], [1, 0, 1], [0, 1, 1]]·(x₁, x₂, x₃) = (0, 1, 1) ⇒ {x₁ + x₂ = 0; x₁ + x₃ = 1; x₂ + x₃ = 1} ⇒

⇒ {x₂ = −x₁; x₃ = 1 − x₁; −x₁ + 1 − x₁ = 1} ⇒ {x₁ = 0; x₂ = 0; x₃ = 1}.
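A one-line check that these coordinates reproduce w (my own sketch, not part of the exam solution):

```python
# the claimed coordinates (x1, x2, x3) = (0, 0, 1) in the given basis
basis = [(1, 1, 0), (1, 0, 1), (0, 1, 1)]
coords = (0, 0, 1)

# linear combination x1·b1 + x2·b2 + x3·b3
w = tuple(sum(c * b[i] for c, b in zip(coords, basis)) for i in range(3))
print(w)   # (0, 1, 1)
```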
II M 1) Given the system

{f(x, y, z) = x e^y + y e^x − 2 e^z = 0
{g(x, y, z) = x²y + x z² − 2xyz = 0,

satisfied at P = (1, 1, 1), verify that an implicit function x → (y, z) can be defined with it, and then calculate the first order derivatives of this function.

The functions f(x, y, z) and g(x, y, z) are differentiable functions ∀(x, y, z) ∈ ℝ³. To apply Dini's Theorem we calculate the Jacobian matrix:

∂(f, g)/∂(x, y, z) = [[e^y + y e^x, x e^y + e^x, −2 e^z], [2xy + z² − 2yz, x² − 2xz, 2xz − 2xy]]

and so ∂(f, g)/∂(x, y, z) (1, 1, 1) = [[2e, 2e, −2e], [1, −1, 0]]. Since it is det [[2e, −2e], [−1, 0]] = −2e ≠ 0, it is possible to define an implicit function x → (y, z). For its derivatives we get:

dy/dx = − det [[f'_x, f'_z], [g'_x, g'_z]] / det [[f'_y, f'_z], [g'_y, g'_z]] = − det [[2e, −2e], [1, 0]] / (−2e) = −(2e)/(−2e) = 1

and

dz/dx = − det [[f'_y, f'_x], [g'_y, g'_x]] / det [[f'_y, f'_z], [g'_y, g'_z]] = − det [[2e, 2e], [−1, 1]] / (−2e) = −(4e)/(−2e) = 2.
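The Jacobian entries and the two Dini/Cramer quotients can be recomputed in Python (my own check; variable names are mine):

```python
import math

# partial derivatives of f = x·e^y + y·e^x − 2e^z and g = x²y + xz² − 2xyz at P = (1, 1, 1)
x = y = z = 1.0
fx = math.exp(y) + y * math.exp(x)      # 2e
fy = x * math.exp(y) + math.exp(x)      # 2e
fz = -2 * math.exp(z)                   # -2e
gx = 2 * x * y + z ** 2 - 2 * y * z     # 1
gy = x ** 2 - 2 * x * z                 # -1
gz = 2 * x * z - 2 * x * y              # 0

J = fy * gz - fz * gy                   # det ∂(f,g)/∂(y,z) = -2e ≠ 0
dy_dx = -(fx * gz - fz * gx) / J        # Dini / Cramer formula for y'(x)
dz_dx = -(fy * gx - fx * gy) / J        # Dini / Cramer formula for z'(x)
print(J, dy_dx, dz_dx)                  # ≈ -2e, 1.0, 2.0
```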
II M 2) Solve the problem:

Max/min f(x, y) = x² + y² − xy − 3x u.c.: {x ≥ 0; y ≥ 0; y ≤ 4 − x}.
The objective function of the problem is a continuous function, the feasible region T is a compact set and therefore there are certainly maximum and minimum values.
Given the number of constraints, it is not convenient to use the Kuhn-Tucker conditions which would require the resolution of 8 systems.
We firstly study the possible points of free relative maximum and minimum.
Applying the first order conditions we get:
∇f(x, y) = 0 ⇒ {f'_x = 2x − y − 3 = 0; f'_y = 2y − x = 0} ⇒ {x = 2y; 3y − 3 = 0} ⇒ {x = 2; y = 1} and (2, 1) ∈ T.

For the second order conditions we get, using the Hessian matrix:

H(x, y) = [[2, −1], [−1, 2]] = H(2, 1). Since |H₁| = 2 > 0 and |H| = 4 − 1 = 3 > 0, the point (2, 1) is a minimum point, with f(2, 1) = −3.
Now we study the objective function f(x, y) = x² + y² − xy − 3x at the points of the first constraint x = 0. It is f(0, y) = y², and for y > 0 it is an ever increasing function.

Let's study the objective function f(x, y) = x² + y² − xy − 3x at the points of the second constraint y = 0. It is f(x, 0) = x² − 3x ⇒ f'(x) = 2x − 3 > 0, and therefore the function is increasing for x > 3/2.

Finally, let's study the objective function f(x, y) = x² + y² − xy − 3x at the points of the third constraint y = 4 − x. It is:

f(x, 4 − x) = x² + 16 − 8x + x² − 4x + x² − 3x = 3x² − 15x + 16 ⇒ f'(x) = 6x − 15 > 0, and therefore the function is increasing for x > 5/2.
We therefore have the following situation, from which we see that:
- (0, 4) is a maximum point, with f(0, 4) = 16, and it is the absolute maximum;
- (4, 0) is a maximum point, with f(4, 0) = 4, and it is a relative maximum;
- (2, 1) is a minimum point, with f(2, 1) = −3, and it is the absolute minimum.
The point (3/2, 0) is a minimum point, with f(3/2, 0) = −9/4, relatively to the points of the constraint y = 0. For a complete analysis we form the Lagrangian function only for this constraint. We get: Λ(x, y, λ) = x² + y² − xy − 3x + λy. So:

{Λ'_x = 2x − y − 3 = 0; Λ'_y = 2y − x + λ = 0; y = 0} ⇒ {x = 3/2; y = 0; λ = 3/2}

and therefore the point (3/2, 0) may be, since λ = 3/2 > 0, with respect to the interior points of the feasible region T, a maximum point, and therefore it is neither a maximum point nor a minimum point.
The point (5/2, 3/2) is a minimum point, with f(5/2, 3/2) = −11/4, relatively to the points of the constraint y = 4 − x. For a complete analysis we form the Lagrangian function only for this constraint. We get: Λ(x, y, λ) = x² + y² − xy − 3x − λ(x + y − 4). So:

{Λ'_x = 2x − y − 3 − λ = 0; Λ'_y = 2y − x − λ = 0; y = 4 − x} ⇒ {6x = 15; λ = 8 − 3x; y = 4 − x} ⇒ {x = 5/2; y = 3/2; λ = 1/2}

and therefore the point (5/2, 3/2) may be, since λ = 1/2 > 0, with respect to the interior points of the feasible region T, a maximum point, and therefore it is neither a maximum point nor a minimum point.
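A brute-force grid scan of f over T confirms the extreme values found above (my own sketch; the 0.01 step is an arbitrary choice):

```python
def f(x, y):
    return x * x + y * y - x * y - 3 * x

# grid over T = {x ≥ 0, y ≥ 0, y ≤ 4 − x} with step 0.01: x = i/100, y = j/100
vals = [f(i / 100, j / 100) for i in range(401) for j in range(401 - i)]
print(max(vals), min(vals))        # 16.0 (at (0,4)) and -3.0 (at (2,1))
print(f(0, 4), f(4, 0), f(2, 1))   # values at the candidate points: 16, 4, -3
```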
II M 3) Given f(x, y) = x·y and the vectors (1, 1) and (1, −1), let v and w be their unit vectors. If D_v f(x₀, y₀) = √2 and D_w f(x₀, y₀) = 0, determine the coordinates of the point (x₀, y₀).

The function f(x, y) = x·y is a polynomial and therefore it is differentiable of any order. So D_v f(x₀, y₀) = ∇f(x₀, y₀)·v and D_w f(x₀, y₀) = ∇f(x₀, y₀)·w. Since ∇f(x, y) = (y, x) ⇒ ∇f(x₀, y₀) = (y₀, x₀).

Now we get v = (1/√2, 1/√2) and w = (1/√2, −1/√2). Then:

{D_v f(x₀, y₀) = (y₀, x₀)·(1/√2, 1/√2) = √2; D_w f(x₀, y₀) = (y₀, x₀)·(1/√2, −1/√2) = 0} ⇒

⇒ {(y₀ + x₀)·(1/√2) = √2; (y₀ − x₀)·(1/√2) = 0} ⇒ {x₀ + y₀ = 2; y₀ − x₀ = 0} ⇒ {x₀ = 1; y₀ = 1}.
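A short numerical confirmation that (1, 1) matches both directional derivatives (my own sketch, not part of the exam solution):

```python
import math

v = (1 / math.sqrt(2), 1 / math.sqrt(2))     # unit vector of (1, 1)
w = (1 / math.sqrt(2), -1 / math.sqrt(2))    # unit vector of (1, -1)

x0, y0 = 1, 1
grad = (y0, x0)                              # ∇f = (y, x) for f(x, y) = x·y
D_v = grad[0] * v[0] + grad[1] * v[1]
D_w = grad[0] * w[0] + grad[1] * w[1]
print(D_v, D_w)                              # ≈ √2 and 0.0
```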
II M 4) Given the function f(x, y) = x² + xy + y³, determine the nature of its stationary points.

Applying the first order conditions we get:

∇f(x, y) = 0 ⇒ {f'_x = 2x + y = 0; f'_y = x + 3y² = 0} ⇒ {y = −2x; 12x² + x = x(12x + 1) = 0} ⇒ {x = 0; y = 0} ∪ {x = −1/12; y = 1/6}.

For the second order conditions we use the Hessian matrix: H(x, y) = [[2, 1], [1, 6y]]. Since H(0, 0) = [[2, 1], [1, 0]] ⇒ |H| = −1 < 0, the point (0, 0) is a saddle point; since H(−1/12, 1/6) = [[2, 1], [1, 1]] ⇒ |H₁| = 2 > 0 and |H| = 2 − 1 = 1 > 0, the point (−1/12, 1/6) is a minimum point.
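Both stationary points and the Hessian determinants can be verified exactly with rational arithmetic (my own sketch using `fractions`):

```python
from fractions import Fraction as F

def grad(x, y):
    # ∇f for f(x, y) = x² + x·y + y³
    return (2 * x + y, x + 3 * y * y)

for x, y in [(F(0), F(0)), (F(-1, 12), F(1, 6))]:
    assert grad(x, y) == (0, 0)    # both points are stationary
    det_H = 2 * 6 * y - 1          # det of the Hessian [[2, 1], [1, 6y]]
    print((x, y), det_H)           # -1 at (0,0): saddle; 1 at (-1/12, 1/6): minimum
```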