Lambda-Calculus and Type Theory Part IV
Ugo Dal Lago
Scuola Estiva AILA, Gargnano, August 2018
Section 1
Implicit Complexity at a Glance
Implicit Computational Complexity
▶ Goal
  ▶ Machine-free characterizations of complexity classes.
  ▶ No explicit reference to resource bounds.
  ▶ P, PSPACE, L, NC, . . .
▶ Why?
  ▶ Simple and elegant presentations of complexity classes.
  ▶ Formal methods for complexity analysis of programs.
▶ How?
  ▶ First-order functional programs [BC92], [Leivant94], . . .
  ▶ Model theory [Fagin73], . . .
  ▶ Type systems and λ-calculi [Hofmann97], [Girard97], . . .
  ▶ . . .
Characterizing Complexity Classes

[Diagram: a language L of programs on the left, a class C of functions on the right; the semantics ⟦·⟧ maps each program to the function it computes; a subsystem S ⊆ L is meant to correspond, through ⟦·⟧, to a complexity class P ⊆ C.]
Proving ⟦S⟧ = P
▶ P ⊆ ⟦S⟧
  ▶ For every function f which can be computed within the bounds prescribed by P, there is P ∈ S such that ⟦P⟧ = f.
▶ ⟦S⟧ ⊆ P
  ▶ Semantically
    ▶ For every P ∈ S, ⟦P⟧ ∈ P is proved by showing that some algorithm computing ⟦P⟧ exists which works within the prescribed resource bounds.
    ▶ P ∈ L does not necessarily exhibit a nice computational behavior.
    ▶ Example: soundness by realizability [DLHofmann05].
  ▶ Operationally
    ▶ Sometimes, L can be endowed with an effective operational semantics.
    ▶ Let L_P ⊆ L be the set of those programs which work within the bounds prescribed by C.
    ▶ ⟦S⟧ ⊆ P can be shown by proving S ⊆ L_P.
If Soundness is Proved Operationally...

[Diagram: S sits inside L_P, the set of programs respecting the prescribed bounds, which in turn sits inside L.]
ICC: Two Styles
[Diagram: S inside L_P ⊆ L.]
Safe Recursion [BC92]
Light Linear Logic [Girard97]
...
ICC: Two Styles
[Diagram: S inside L_P ⊆ L.]
Bounded Recursion on Notation [Cobham63]
Bounded Arithmetic [Buss80]
...
Section 2
Implicit Complexity in Function Algebras
Complexity Classes as Function Algebras?
▶ Recursion on Notation: since computation in unary notation is inefficient, we need to switch to binary notation.
▶ What Should we Keep in the Algebra?
  ▶ Basic functions are innocuous.
  ▶ Polytime functions are closed under composition.
  ▶ Minimization introduces partiality, and is not needed.
  ▶ Primitive Recursion?
    ▶ Certain uses of primitive recursion are dangerous, but if we do not have any form of recursion, we capture much less than the polytime functions.
Safe Functions
▶ Safe Functions: pairs of the form (f, n), where f : B^m → B and 0 ≤ n ≤ m.
▶ The number n identifies the normal arguments among those of f: they are the first n, while the other m − n are the safe arguments.
▶ Following [BellantoniCook93], we use a semicolon to separate normal from safe arguments: if (f, n) is a safe function, we write f(~W; ~V) to emphasize that the n words in ~W are the normal arguments, while those in ~V are the safe arguments.
Basic Safe Functions
▶ The safe function (e, 0), where e : B → B always returns the empty string ε.
▶ The safe function (a0, 0), where a0 : B → B is defined as follows: a0(W) = 0 · W.
▶ The safe function (a1, 0), where a1 : B → B is defined as follows: a1(W) = 1 · W.
▶ The safe function (t, 0), where t : B → B is defined as follows: t(ε) = ε, t(0W) = W and t(1W) = W.
▶ The safe function (c, 0), where c : B^4 → B is defined as follows: c(ε, W, V, Y) = W, c(0X, W, V, Y) = V and c(1X, W, V, Y) = Y.
▶ For every positive n ∈ N and whenever 1 ≤ m, k ≤ n, the safe function (Π^n_m, k), where Π^n_m is the m-th projection, defined in the natural way.
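The basic safe functions are simple enough to be spelled out concretely. The following sketch models binary words as Python strings, with the first character as the leftmost bit (an encoding chosen here purely for illustration; it is not part of the original formulation):

```python
def e(W):
    # (e, 0): always returns the empty string
    return ""

def a0(W):
    # (a0, 0): prepend a 0, i.e. a0(W) = 0 . W
    return "0" + W

def a1(W):
    # (a1, 0): prepend a 1, i.e. a1(W) = 1 . W
    return "1" + W

def t(W):
    # (t, 0): tail, dropping the first bit; t(eps) = eps
    return W[1:]

def c(X, W, V, Y):
    # (c, 0): case distinction on the first bit of X
    if X == "":
        return W
    return V if X[0] == "0" else Y
```

Note that a0, a1, t and c change the length of their argument by at most one bit, which is what makes them harmless from a complexity standpoint.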
Safe Composition

(f : B^n → B, m)
(g_j : B^k → B, k) for every 1 ≤ j ≤ m
(h_j : B^{k+i} → B, k) for every m + 1 ≤ j ≤ n
⇓
(p : B^{k+i} → B, k) defined as follows:

p(~W; ~V) = f(g_1(~W; ), . . . , g_m(~W; ); h_{m+1}(~W; ~V), . . . , h_n(~W; ~V)).
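As a concrete instance of the schema, consider p(W; V) = c(; W, V, a0(; V), a1(; V)), which prepends the first bit of W to V (returning V itself when W is empty). A minimal Python sketch, with each safe function taking the tuple of normal arguments first and the tuple of safe arguments second (an illustrative convention introduced here, not from the original paper):

```python
def c(W, V):
    # (c, 0): all four arguments of c are safe
    X, A, B, C = V
    if X == "":
        return A
    return B if X[0] == "0" else C

def p(W, V):
    # p(W; V) = c(; W, V, a0(; V), a1(; V)): one normal, one safe argument.
    (w,) = W
    (v,) = V
    # The h_j filling the safe positions of c may consult both w and v;
    # no g_j is needed here, since c has no normal arguments.
    return c((), (w, v, "0" + v, "1" + v))
```

The key discipline is visible in the schema: the g_j, which fill normal positions of f, may only look at the normal arguments ~W, while the h_j, which fill safe positions, may look at everything. Normal data may thus flow into safe positions, but never the other way around.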
Safe Recursion

(f : B^n → B, m)
(g_k : B^{n+2} → B, m + 1) for every k ∈ {0, 1}
⇓
(h : B^{n+1} → B, m + 1) defined as follows:

h(ε, ~W; ~V) = f(~W; ~V);
h(0X, ~W; ~V) = g_0(X, ~W; ~V, h(X, ~W; ~V));
h(1X, ~W; ~V) = g_1(X, ~W; ~V, h(X, ~W; ~V)).
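For instance, concatenation of binary words fits the schema: take f to be a projection and g_0, g_1 to prepend a bit to the recursive call, which only ever occupies a safe position. A direct Python sketch (again with words as strings, an illustrative encoding):

```python
def cat(X, V):
    # Concatenation X . V by safe recursion on X:
    #   cat(eps; V) = V
    #   cat(0X; V)  = a0(; cat(X; V)) = "0" + cat(X; V)
    #   cat(1X; V)  = a1(; cat(X; V)) = "1" + cat(X; V)
    # X is the normal recursion argument; the recursive call is only
    # ever used in a safe position of the step functions g_0, g_1.
    if X == "":
        return V
    return X[0] + cat(X[1:], V)
```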
Classes of Functions
▶ BCS is the smallest class of safe functions which includes the basic safe functions above and which is closed under safe composition and safe recursion.
▶ BC is the set of those functions f : B → B such that (f, n) ∈ BCS for some n ∈ {0, 1}.
Lemma (Max-Poly Lemma)
For every (f : B^n → B, m) in BCS, there is a monotonically increasing polynomial p_f : N → N such that:

|f(V_1, . . . , V_n)| ≤ p_f( Σ_{1≤k≤m} |V_k| ) + max_{m+1≤k≤n} |V_k|.

Proof.
▶ An induction on the structure of the proof of a function being in BCS.
▶ The restrictions safe recursion must satisfy are essential.
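To see why the restrictions are essential, note what happens if the recursive call is allowed to flow into a normal position: iterating a length-doubling function then produces words of length exponential in |X|. The following Python sketch is deliberately *illegal* as a safe recursion:

```python
def double(W):
    # |double(W)| = 2 * |W|; its argument is consumed "normally"
    return W + W

def bad(X, V):
    # NOT a legal safe recursion: the recursive call bad(X[1:], V)
    # is fed to double in a normal position, so |bad(X, V)| = 2^|X| * |V|,
    # violating the polynomial bound of the Max-Poly Lemma.
    if X == "":
        return V
    return double(bad(X[1:], V))
```

Under the safe discipline, by contrast, the recursive call can only sit in a safe position, where the Max-Poly Lemma guarantees it contributes additively rather than multiplicatively to the output length.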
Theorem (Polytime Soundness)
BC ⊆ FP_{0,1}.

Proof.
▶ This is an induction on the structure of the proof of a function being in BCS.
▶ The proof becomes much easier if we first prove that simultaneous primitive recursion can be encoded into ordinary primitive recursion.
▶ We make essential use of the Max-Poly Lemma.
Lemma (Polynomials)
For every polynomial p : N → N with natural coefficients there is a safe function (f, 1), where f : B → B, such that |f(W)| = p(|W|) for every W ∈ B.

Theorem (Polytime Completeness)
FP_{0,1} ⊆ BC.

Theorem (BellantoniCook1992)
FP_{0,1} = BC.
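The length behavior required by the Polynomials lemma can be sanity-checked on a small case: for p(n) = n², one function with the right output length repeats W as many times as it has bits. The sketch below only mimics the length specification in Python; it is not the actual Bellantoni-Cook construction inside BCS:

```python
def square_length(W):
    # |square_length(W)| = |W| * |W|, matching p(n) = n^2
    return W * len(W)
```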
Complexity Classes
Part I
Implicit Complexity in the λ-Calculus
λ-Calculus and Complexity
▶ The λ-calculus by itself is simply too powerful to form an ICC system.
▶ How should we proceed if we want to isolate, e.g., a class of λ-terms computing polytime functions?
  ▶ Type Systems [Hofmann1997, BNS2000, Hofmann1999].
    ▶ You endow the λ-calculus with a type system and with some constants for data, recursion, etc.
    ▶ This way you get something similar to Gödel's T.
    ▶ Then, you impose some constraints on recursion, akin to those from [BellantoniCook1992].
  ▶ Linearity Constraints [Girard1997, Lafont2004].
    ▶ Key observation: copying is the operation making evaluation of λ-expressions problematic from a complexity point of view.
    ▶ Let us define some constraints on duplication, then!
From Lambda Calculus to Soft Lambda Calculus
▶ Lambda Calculus Λ:
  M ::= x | λx.M | MM
  with no structural constraints.
▶ Linear Lambda Calculus Λ_!:
  M ::= x | λx.M | λ!x.M | MM | !M
  where x appears linearly in the body of λx.M.
▶ Soft Lambda Calculus Λ_S:
  M ::= x | λx.M | λ!x.M | MM | !M
  where additional constraints are needed for λ!x.M:
  ▶ x appears once in M, inside a single occurrence of !...
  ▶ ...or x appears more than once in M, outside !.
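The constraint on λ!x.M can be made concrete with a small occurrence counter. The sketch below uses a hypothetical tuple encoding of Λ_S terms ("var", "lam", "blam" for λ!, "app", "bang" for !), introduced here only for illustration:

```python
def occurrences(term, x, inside_bang=False):
    """Return (inside, outside): free occurrences of x under / not under !."""
    kind = term[0]
    if kind == "var":
        if term[1] != x:
            return (0, 0)
        return (1, 0) if inside_bang else (0, 1)
    if kind in ("lam", "blam"):
        if term[1] == x:               # x rebound: no free occurrences below
            return (0, 0)
        return occurrences(term[2], x, inside_bang)
    if kind == "app":
        i1, o1 = occurrences(term[1], x, inside_bang)
        i2, o2 = occurrences(term[2], x, inside_bang)
        return (i1 + i2, o1 + o2)
    if kind == "bang":
        return occurrences(term[1], x, True)
    raise ValueError("unknown term constructor")

def soft_ok(x, M):
    """Constraint on λ!x.M in Λ_S: either x occurs exactly once, under a
    single !, or every occurrence of x is outside !."""
    inside, outside = occurrences(M, x)
    return (inside == 1 and outside == 0) or (inside == 0 and outside >= 1)
```

On the slide's examples: λ!x.yxx passes (two occurrences, all outside !), λ!x.y!x passes (one occurrence, under a single !), and λ!x.y(!x)x fails (occurrences both inside and outside !).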
From Lambda Calculus to Soft Lambda Calculus
▶ Λ ⇒ Λ_! is a Refinement.
  ▶ Whenever a term can be copied, it must be marked as such, with !.
  ▶ Λ can be embedded into Λ_!:
    {x} = x    {λx.M} = λ!x.{M}    {MN} = {M} !{N}
  ▶ The embedding does not make use of λx.M.
▶ Λ_! ⇒ Λ_S is a Restriction.
  ▶ Whenever you copy, you lose the possibility of copying.
  ▶ Examples:
    λ!x.yxx ✓    λ!x.y!x ✓    λ!x.y(!x)x ✗
  ▶ Some results:
    ▶ Polytime soundness;
    ▶ Polytime completeness.
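The embedding {·} of Λ into Λ_! above can be written out directly, on the same hypothetical tuple encoding of terms used earlier ("var", "lam", "blam" for λ!, "app", "bang" for !):

```python
def embed(term):
    # {x} = x ; {λx.M} = λ!x.{M} ; {MN} = {M} !{N}
    kind = term[0]
    if kind == "var":
        return term
    if kind == "lam":
        return ("blam", term[1], embed(term[2]))
    if kind == "app":
        return ("app", embed(term[1]), ("bang", embed(term[2])))
    raise ValueError("unknown term constructor")
```

As the slide notes, the image of the embedding never uses the linear abstraction λx.M: every bound variable becomes a !-abstracted one, and every argument is banged so that it may be copied.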
Linear Logic
▶ The Curry-Howard Correspondence comes into play.
▶ Linear Logic can be seen as a way to decompose A → B into !A ⊸ B.
  ▶ ⊸ is the linear arrow operator.
  ▶ ⊗ is the conjunction operator.
  ▶ ! is a new operator governed by the following rules:
    !A ≅ !A ⊗ !A
    !A ⊗ !B ≅ !(A ⊗ B)
    !A ⊸ !!A
    !A ⊸ A
Linear Logic

Subsystems...

        !A⊗!B ≅ !(A⊗B)   !A ⊸ !!A   !A ⊸ A   !A ≅ !A⊗!A
ELL     YES              NO          NO        YES
LLL     NO               NO          NO        YES
SLL     YES              NO          !A ⊸ A ⊗ · · · ⊗ A (in place of the last two)

...and their expressive power:
ELL  Elementary Functions
LLL  Polytime Functions
SLL  Polytime Functions
Soft Linear Logic
▶ It is polynomial time sound [Lafont2002]:
  ▶ B_π is the box depth of the proof π.

Theorem
There is a family of polynomials {p_n}_n such that the normal form of any proof π can be computed in time p_{B_π}(|π|).

  ▶ This holds for many notions of proof: proof nets, sequent calculus, λ-terms, etc.
▶ It is also polynomial time complete [Lafont2002, MairsonTerui2003]:
  ▶ A function f : N → N is represented in soft linear logic if a proof π_f rewrites to an encoding of f(n) when cut against an encoding of n.

Theorem
Every polynomial time function can be represented in soft linear logic.