TASK MATHEMATICS for ECONOMIC APPLICATIONS 23/03/2019
I M 1) If $z^2 = (1 - i)^3$, calculate $z$.
From $1 - i = \sqrt{2}\left(\frac{1}{\sqrt{2}} - i\,\frac{1}{\sqrt{2}}\right) = \sqrt{2}\left(\cos\frac{7\pi}{4} + i\sin\frac{7\pi}{4}\right)$ we get:
$(1 - i)^3 = 2\sqrt{2}\left(\cos\frac{21\pi}{4} + i\sin\frac{21\pi}{4}\right) = 2\sqrt{2}\left(\cos\frac{5\pi}{4} + i\sin\frac{5\pi}{4}\right)$ and so:
$z = \sqrt{2\sqrt{2}}\left(\cos\frac{\frac{5\pi}{4} + 2k\pi}{2} + i\sin\frac{\frac{5\pi}{4} + 2k\pi}{2}\right) = \sqrt[4]{8}\left(\cos\frac{5\pi + 8k\pi}{8} + i\sin\frac{5\pi + 8k\pi}{8}\right)$, $0 \le k \le 1$.
For $k = 0$: $z_0 = \sqrt[4]{8}\left(\cos\frac{5\pi}{8} + i\sin\frac{5\pi}{8}\right) = \sqrt[4]{8}\left(-\frac{\sqrt{2 - \sqrt{2}}}{2} + i\,\frac{\sqrt{2 + \sqrt{2}}}{2}\right)$,
for $k = 1$: $z_1 = \sqrt[4]{8}\left(\cos\frac{13\pi}{8} + i\sin\frac{13\pi}{8}\right) = \sqrt[4]{8}\left(\frac{\sqrt{2 - \sqrt{2}}}{2} - i\,\frac{\sqrt{2 + \sqrt{2}}}{2}\right)$.
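A quick numerical cross-check of the two roots (a sketch in Python using the standard cmath module; not part of the original solution):

```python
# Verify numerically that both roots satisfy z^2 = (1 - i)^3.
import cmath

target = (1 - 1j) ** 3                   # (1 - i)^3 = -2 - 2i
r = 8 ** 0.25                            # modulus of z: the fourth root of 8
roots = [r * cmath.exp(1j * (5 + 8 * k) * cmath.pi / 8) for k in (0, 1)]

for z in roots:
    print(abs(z ** 2 - target) < 1e-12)  # prints True for k = 0 and k = 1
```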
I M 2) The characteristic polynomial of $A = \begin{pmatrix} 2 & 0 & 0 \\ 0 & 1 & 1 \\ 0 & 1 & 1 \end{pmatrix}$ is:
$p(\lambda) = \det(A - \lambda I) = \begin{vmatrix} 2-\lambda & 0 & 0 \\ 0 & 1-\lambda & 1 \\ 0 & 1 & 1-\lambda \end{vmatrix} = (2-\lambda)\left[(1-\lambda)^2 - 1\right] = (2-\lambda)(\lambda^2 - 2\lambda) = -\lambda(\lambda - 2)^2 = 0$. So we get the three eigenvalues $\lambda_1 = 0$, $\lambda_2 = \lambda_3 = 2$.
To find an eigenvector associated to the eigenvalue $\lambda = 0$ we must solve the system:
$(A - 0 \cdot I) \cdot \mathbf{x} = \begin{pmatrix} 2 & 0 & 0 \\ 0 & 1 & 1 \\ 0 & 1 & 1 \end{pmatrix} \cdot \begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix} \Rightarrow \begin{cases} 2x_1 = 0 \\ x_2 + x_3 = 0 \end{cases} \Rightarrow \begin{cases} x_1 = 0 \\ x_3 = -x_2 \end{cases}$
and so the eigenvectors associated to the eigenvalue $\lambda = 0$ are $(0, x, -x)$. For $x = 1$ we get $(0, 1, -1)$ and the corresponding unit vector $\left(0, \frac{1}{\sqrt{2}}, -\frac{1}{\sqrt{2}}\right)$.
Corresponding to the eigenvalue $\lambda = 2$ we must find two orthogonal eigenvectors.
We solve the system:
$(A - 2 \cdot I) \cdot \mathbf{x} = \begin{pmatrix} 0 & 0 & 0 \\ 0 & -1 & 1 \\ 0 & 1 & -1 \end{pmatrix} \cdot \begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix} \Rightarrow -x_2 + x_3 = 0 \Rightarrow x_3 = x_2$, $\forall x_1$.
For $x_1 = x_2 = 1$ we get $(1, 1, 1)$ and the corresponding unit vector $\left(\frac{1}{\sqrt{3}}, \frac{1}{\sqrt{3}}, \frac{1}{\sqrt{3}}\right)$.
To find a second eigenvector associated to the eigenvalue $\lambda = 2$ and orthogonal to the eigenvector $(1, 1, 1)$ we pose $(1, 1, 1) \cdot (x_1, x_2, x_2) = x_1 + 2x_2 = 0 \Rightarrow x_1 = -2x_2$, from which we get $(-2, 1, 1)$ and the corresponding unit vector $\left(-\frac{2}{\sqrt{6}}, \frac{1}{\sqrt{6}}, \frac{1}{\sqrt{6}}\right)$.
So an orthogonal matrix which diagonalizes $A$ is $P = \begin{pmatrix} 0 & \frac{1}{\sqrt{3}} & -\frac{2}{\sqrt{6}} \\ \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{3}} & \frac{1}{\sqrt{6}} \\ -\frac{1}{\sqrt{2}} & \frac{1}{\sqrt{3}} & \frac{1}{\sqrt{6}} \end{pmatrix}$.
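A numerical sanity check of the diagonalization (a NumPy sketch, not part of the original solution; the names A and P follow the reconstruction above):

```python
# Verify that P is orthogonal and that P^T A P = diag(0, 2, 2).
import numpy as np

A = np.array([[2.0, 0.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 1.0, 1.0]])
P = np.column_stack([
    [0.0, 1 / np.sqrt(2), -1 / np.sqrt(2)],            # unit eigenvector for lambda = 0
    [1 / np.sqrt(3), 1 / np.sqrt(3), 1 / np.sqrt(3)],   # first unit eigenvector for lambda = 2
    [-2 / np.sqrt(6), 1 / np.sqrt(6), 1 / np.sqrt(6)],  # second one, orthogonal to the first
])

print(np.allclose(P.T @ P, np.eye(3)))   # True: P is orthogonal
print(np.round(P.T @ A @ P, 10))         # diag(0, 2, 2)
```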
I M 3) From the Rouché-Capelli Theorem, if $\operatorname{Rank} A = \operatorname{Rank}(A|b) = k$ the system has $\infty^{n-k}$ solutions, where $n$ is the number of the variables; in our problem $n = 4$. We study the rank of the augmented matrix:
$(A|b) = \begin{pmatrix} 1 & 1 & 1 & 1 & | & 1 \\ 1 & 1 & 1 & 2 & | & 1 \\ 3 & 3 & 3 & h & | & m \end{pmatrix}$.
By elementary operations on the rows, $R_2 \to R_2 - R_1$ and $R_3 \to R_3 - 3R_1$, we get:
$\begin{pmatrix} 1 & 1 & 1 & 1 & | & 1 \\ 0 & 0 & 0 & 1 & | & 0 \\ 0 & 0 & 0 & h-3 & | & m-3 \end{pmatrix}$. By $R_3 \to R_3 - (h-3) \cdot R_2$ we get:
$\begin{pmatrix} 1 & 1 & 1 & 1 & | & 1 \\ 0 & 0 & 0 & 1 & | & 0 \\ 0 & 0 & 0 & 0 & | & m-3 \end{pmatrix}$. And so:
for every $h$, if $m = 3$: $\operatorname{Rank} A = \operatorname{Rank}(A|b) = 2$: the system has $\infty^2$ solutions;
for every $h$, if $m \ne 3$: $\operatorname{Rank} A = 2 < \operatorname{Rank}(A|b) = 3$: the system has no solutions.
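A quick rank check for sample parameter values (a NumPy sketch, not part of the original solution; the values h = 7, m = 3 and m = 5 are arbitrary illustrations):

```python
# Compare Rank(A) and Rank(A|b) for sample values of h and m.
import numpy as np

def ranks(h, m):
    A = np.array([[1, 1, 1, 1],
                  [1, 1, 1, 2],
                  [3, 3, 3, h]], dtype=float)
    b = np.array([[1], [1], [m]], dtype=float)
    return np.linalg.matrix_rank(A), np.linalg.matrix_rank(np.hstack([A, b]))

print(ranks(h=7, m=3))   # (2, 2): infinitely many solutions
print(ranks(h=7, m=5))   # (2, 3): no solutions
```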
I M 4) Since the vector $\mathbf{x}$ has coordinates $(1, -1)$ in the basis $\mathcal{B} = \{(2, 1); (1, 1)\}$ we get:
$\mathbf{x} = \begin{pmatrix} 2 & 1 \\ 1 & 1 \end{pmatrix} \cdot \begin{pmatrix} 1 \\ -1 \end{pmatrix} = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$. In the basis $\mathcal{C} = \{(3, 1); (2, 1)\}$ we have:
$\begin{pmatrix} 1 \\ 0 \end{pmatrix} = \begin{pmatrix} 3 & 2 \\ 1 & 1 \end{pmatrix} \cdot \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} \Rightarrow \begin{cases} 3x_1 + 2x_2 = 1 \\ x_1 + x_2 = 0 \end{cases} \Rightarrow \begin{cases} 3x_1 - 2x_1 = 1 \\ x_2 = -x_1 \end{cases} \Rightarrow \begin{cases} x_1 = 1 \\ x_2 = -1 \end{cases}$.
So the coordinates of the vector $\mathbf{x}$ in the basis $\mathcal{C} = \{(3, 1); (2, 1)\}$ are $(1, -1)$. Exactly the same.
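A short verification of the change of basis (a NumPy sketch, not part of the original solution):

```python
# Build x from its coordinates in basis B, then solve for its coordinates in C.
import numpy as np

B = np.array([[2.0, 1.0], [1.0, 1.0]])   # basis vectors of B as columns
C = np.array([[3.0, 2.0], [1.0, 1.0]])   # basis vectors of C as columns

x = B @ np.array([1.0, -1.0])            # x = 1*(2,1) - 1*(1,1) = (1, 0)
coords_C = np.linalg.solve(C, x)         # coordinates of x in the basis C

print(x, coords_C)                       # [1. 0.] [ 1. -1.]
```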
II M 1) From the equation $f(x, y, z) = x e^y + y e^z + xz = 0$ we get:
$f(1, 0, -1) = 1 + 0 - 1 = 0$ and so the point $(1, 0, -1)$ satisfies the equation. Then:
$\nabla f(x, y, z) = \left(e^y + z;\; x e^y + e^z;\; y e^z + x\right) \Rightarrow \nabla f(1, 0, -1) = \left(0,\; 1 + e^{-1},\; 1\right)$.
Since $f'_z(1, 0, -1) = 1 \ne 0$ it is possible to define an implicit function $(x, y) \to z(x, y)$
whose derivatives are: $\dfrac{\partial z}{\partial x}(1, 0) = -\dfrac{0}{1} = 0$; $\dfrac{\partial z}{\partial y}(1, 0) = -\dfrac{1 + e^{-1}}{1} = -\left(1 + e^{-1}\right)$.
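A symbolic check of the implicit derivatives (a SymPy sketch, not part of the original solution):

```python
# Check f(1, 0, -1) = 0 and the implicit derivatives -f_x/f_z and -f_y/f_z there.
import sympy as sp

x, y, z = sp.symbols('x y z')
f = x * sp.exp(y) + y * sp.exp(z) + x * z
point = {x: 1, y: 0, z: -1}

print(f.subs(point))                          # 0: the point satisfies the equation
fx, fy, fz = (sp.diff(f, v).subs(point) for v in (x, y, z))
print(fz)                                     # 1, so f'_z is nonzero at the point
print(-fx / fz, -fy / fz)                     # 0 and -1 - exp(-1)
```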
II M 2) To solve the problem: Max/min of $f(x, y, z) = x^2 + y^2 + z^2$ u.c. $\begin{cases} x - y + z = 1 \\ x + y - z = 1 \end{cases}$, we observe that the objective function of the problem is a continuous function and the feasible region $X$ is not a compact set, but from the equations of the constraints we can easily solve explicitly with respect to one variable:
$\begin{cases} x - y + z = 1 \\ x + y - z = 1 \end{cases} \Rightarrow \begin{cases} 2x = 2 \\ z = x + y - 1 \end{cases} \Rightarrow \begin{cases} x = 1 \\ z = y \end{cases}$. Substituting we get:
$f(x, y, z) = f(1, z, z) = F(z) = 1 + z^2 + z^2 = 1 + 2z^2$. Since $F'(z) = 4z$, we simply get $F'(z) > 0$ for $z > 0$. For $z \le 0$ the function $F(z)$ is a decreasing function, for $z > 0$ the function $F(z)$ is an increasing function, and so the point $z = 0$ is a minimum point.
For $z = 0$ we have also $x = 1$ and $y = 0$, and so the point $(1, 0, 0)$ is the unique solution of the problem and it is a minimum point.
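A one-line check of the reduced objective (a SymPy sketch, not part of the original solution):

```python
# On the feasible set x = 1, y = z, the objective reduces to F(z) = 1 + 2*z**2.
import sympy as sp

z = sp.Symbol('z', real=True)
F = 1 + z**2 + z**2                       # f(1, z, z)
crit = sp.solve(sp.diff(F, z), z)         # stationary points of F
print(crit, sp.diff(F, z, 2))             # [0] 4: z = 0 is a minimum
```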
If we want to solve the problem using the traditional Lagrangian function with first and second order conditions we have:
$\Lambda(x, y, z, \lambda_1, \lambda_2) = x^2 + y^2 + z^2 - \lambda_1(x - y + z - 1) - \lambda_2(x + y - z - 1)$. First order conditions bring to the system:
$\begin{cases} \Lambda'_x = 2x - \lambda_1 - \lambda_2 = 0 \\ \Lambda'_y = 2y + \lambda_1 - \lambda_2 = 0 \\ \Lambda'_z = 2z - \lambda_1 + \lambda_2 = 0 \\ x - y + z = 1 \\ x + y - z = 1 \end{cases} \Rightarrow \begin{cases} x = 1 \\ y = 0 \\ z = 0 \\ \lambda_1 = 1 \\ \lambda_2 = 1 \end{cases}$. For the second order conditions we use the bordered Hessian matrix:
$\overline{\mathcal{H}}_5 = \begin{pmatrix} 0 & 0 & 1 & -1 & 1 \\ 0 & 0 & 1 & 1 & -1 \\ 1 & 1 & 2 & 0 & 0 \\ -1 & 1 & 0 & 2 & 0 \\ 1 & -1 & 0 & 0 & 2 \end{pmatrix}$. We must calculate only $\left|\overline{\mathcal{H}}_5\right|$.
Since
$\left|\overline{\mathcal{H}}_5\right| = \begin{vmatrix} 0 & 0 & 1 & -1 & 1 \\ 0 & 0 & 1 & 1 & -1 \\ 1 & 1 & 2 & 0 & 0 \\ -1 & 1 & 0 & 2 & 0 \\ 1 & -1 & 0 & 0 & 2 \end{vmatrix} = \begin{vmatrix} 0 & 0 & 1 & -1 & 1 \\ 0 & 0 & 1 & 1 & -1 \\ 1 & 1 & 2 & 0 & 0 \\ 0 & 2 & 2 & 2 & 0 \\ 0 & -2 & -2 & 0 & 2 \end{vmatrix} = 1 \cdot \begin{vmatrix} 0 & 1 & -1 & 1 \\ 0 & 1 & 1 & -1 \\ 2 & 2 & 2 & 0 \\ -2 & -2 & 0 & 2 \end{vmatrix} = \begin{vmatrix} 0 & 1 & -1 & 1 \\ 0 & 1 & 1 & -1 \\ 2 & 2 & 2 & 0 \\ 0 & 0 & 2 & 2 \end{vmatrix} =$
$= 2 \cdot \begin{vmatrix} 1 & -1 & 1 \\ 1 & 1 & -1 \\ 0 & 2 & 2 \end{vmatrix} = 2 \cdot (4 + 4) = 16 > 0$, the constraints are two, an even number, and
since $\left|\overline{\mathcal{H}}_5\right| > 0$ we find again that the point $(1, 0, 0)$ is the unique solution of the problem and it is a minimum point.
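A numerical check of the bordered Hessian determinant (a NumPy sketch, not part of the original solution):

```python
# The determinant of the bordered Hessian at (1, 0, 0) should equal 16.
import numpy as np

H5 = np.array([[ 0,  0,  1, -1,  1],
               [ 0,  0,  1,  1, -1],
               [ 1,  1,  2,  0,  0],
               [-1,  1,  0,  2,  0],
               [ 1, -1,  0,  0,  2]], dtype=float)

print(round(np.linalg.det(H5)))   # 16
```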
II M 3) Since the function $f(x, y) = x^2 y - x y^2 + y$ is clearly a differentiable function in $\mathbb{R}^2$, we simply calculate $D_{\mathbf{v}} f(1, 1) = \nabla f(1, 1) \cdot \mathbf{v}$ and $D_{\mathbf{w}} f(1, 1) = \nabla f(1, 1) \cdot \mathbf{w}$.
$\nabla f(x, y) = \left(2xy - y^2;\; x^2 - 2xy + 1\right) \Rightarrow \nabla f(1, 1) = (1, 0)$.
So $D_{\mathbf{v}} f(1, 1) = (1, 0) \cdot (\cos\alpha, \sin\alpha) = \cos\alpha$ while $D_{\mathbf{w}} f(1, 1) = (1, 0) \cdot (1, 0) = 1$ and from this we get $\cos\alpha = 1 \Rightarrow \alpha = 0$.
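A symbolic check of the gradient and the directional derivatives (a SymPy sketch, not part of the original solution; it uses the f reconstructed above):

```python
# Gradient of f at (1, 1) and the directional derivatives along v and w.
import sympy as sp

x, y, a = sp.symbols('x y alpha', real=True)
f = x**2 * y - x * y**2 + y
grad = sp.Matrix([sp.diff(f, x), sp.diff(f, y)]).subs({x: 1, y: 1})

v = sp.Matrix([sp.cos(a), sp.sin(a)])
w = sp.Matrix([1, 0])
print(grad.T)                    # Matrix([[1, 0]])
print(sp.simplify(grad.dot(v)))  # cos(alpha)
print(grad.dot(w))               # 1
```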
II M 4) To analyze the nature of the stationary points of the function $f(x, y) = (x - 2y)^2 + (y - 2x)^2$ we apply first and second order conditions. For the first order conditions we pose:
$\nabla f(x, y) = \mathbf{0} \Rightarrow \begin{cases} f'_x = 2(x - 2y) - 2 \cdot 2 \cdot (y - 2x) = 0 \\ f'_y = -2 \cdot 2 \cdot (x - 2y) + 2(y - 2x) = 0 \end{cases} \Rightarrow \begin{cases} 10x - 8y = 0 \\ -8x + 10y = 0 \end{cases}$
from which we get the unique solution $x = 0$, $y = 0$.
For the second order conditions we construct the Hessian matrix:
$\mathcal{H}(x, y) = \mathcal{H}(0, 0) = \begin{pmatrix} 10 & -8 \\ -8 & 10 \end{pmatrix}$. Since $\left|\mathcal{H}_1\right| = 10 > 0$ and $\left|\mathcal{H}_2\right| = 100 - 64 = 36 > 0$ we see that the point $(0, 0)$ is a minimum point.
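A final symbolic check (a SymPy sketch, not part of the original solution; it uses the f reconstructed above):

```python
# Stationary point and Hessian of f(x, y) = (x - 2y)^2 + (y - 2x)^2.
import sympy as sp

x, y = sp.symbols('x y', real=True)
f = (x - 2*y)**2 + (y - 2*x)**2

crit = sp.solve([sp.diff(f, x), sp.diff(f, y)], [x, y])
H = sp.hessian(f, (x, y))
print(crit)                # {x: 0, y: 0}
print(H, H.det())          # Matrix([[10, -8], [-8, 10]]) 36
```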