Lambda-Calculus and Type Theory Part IV

Ugo Dal Lago

Scuola Estiva AILA, Gargnano, August 2018


Section 1

Implicit Complexity at a Glance

Implicit Computational Complexity

- Goal
  - Machine-free characterizations of complexity classes.
  - No explicit reference to resource bounds.
  - P, PSPACE, L, NC, ...
- Why?
  - Simple and elegant presentations of complexity classes.
  - Formal methods for complexity analysis of programs.
- How?
  - First-order functional programs [BC92], [Leivant94], ...
  - Model theory [Fagin73], ...
  - Type systems and λ-calculi [Hofmann97], [Girard97], ...
  - ...

Characterizing Complexity Classes

The setting: a language L of programs and a class C of functions, related by the semantics map ⟦·⟧. A complexity class P ⊆ C is characterized by a subsystem S ⊆ L whose image under ⟦·⟧ is exactly P:

          ⟦·⟧
      L  ────→  C
      ∪         ∪
      S  ────→  P

    Programs   Functions

Proving ⟦S⟧ = P

- P ⊆ ⟦S⟧
  - For every function f which can be computed within the bounds prescribed by P, there is P ∈ S such that ⟦P⟧ = f.
- ⟦S⟧ ⊆ P
  - Semantically:
    - For every P ∈ S, ⟦P⟧ ∈ P is proved by showing that some algorithm computing ⟦P⟧ exists which works within the prescribed resource bounds.
    - P ∈ L does not necessarily exhibit a nice computational behavior.
    - Example: soundness by realizability [DLHofmann05].
  - Operationally:
    - Sometimes, L can be endowed with an effective operational semantics.
    - Let L_P ⊆ L be the set of those programs which work within the bounds prescribed by C.
    - ⟦S⟧ ⊆ P can then be shown by proving S ⊆ L_P.

If Soundness is Proved Operationally...

    S ⊆ L_P ⊆ L

(S sits inside L_P, the resource-bounded fragment of the language L.)

ICC: Two Styles

    S ⊆ L_P ⊆ L

Implicit (no explicit resource bounds):
- Safe Recursion [BC92]
- Light Linear Logic [Girard97]
- ...

Explicit bounds:
- Bounded Recursion on Notation [Cobham63]
- Bounded Arithmetic [Buss80]
- ...


Section 2

Implicit Complexity in Function Algebras

Complexity Classes as Function Algebras?

- Recursion on Notation: since computation in unary notation is inefficient, we need to switch to binary notation.
- What should we keep in the algebra?
  - Basic functions are innocuous.
  - Polytime functions are closed under composition.
  - Minimization introduces partiality, and is not needed.
- Primitive Recursion?
  - Certain uses of primitive recursion are dangerous, but if we drop recursion altogether, we capture much less than the polytime functions.

Safe Functions

- Safe functions: pairs of the form (f, n), where f : B^m → B and 0 ≤ n ≤ m.
- The number n identifies the number of normal arguments among those of f: they are the first n, while the other m − n are the safe arguments.
- Following [BellantoniCook93], we use semicolons to separate normal and safe arguments: if (f, n) is a safe function, we write f(~W; ~V) to emphasize that the n words in ~W are the normal arguments, while the ones in ~V are the safe arguments.

Basic Safe Functions

- The safe function (e, 0), where e : B → B always returns the empty string ε.
- The safe function (a0, 0), where a0 : B → B is defined as follows: a0(W) = 0 · W.
- The safe function (a1, 0), where a1 : B → B is defined as follows: a1(W) = 1 · W.
- The safe function (t, 0), where t : B → B is defined as follows: t(ε) = ε, t(0W) = W and t(1W) = W.
- The safe function (c, 0), where c : B^4 → B is defined as follows: c(ε, W, V, Y) = W, c(0X, W, V, Y) = V and c(1X, W, V, Y) = Y.
- For every positive n ∈ N and whenever 1 ≤ m, k ≤ n, the safe function (Π^n_m, k), where Π^n_m is defined in the natural way.
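As a concrete illustration (an informal Python sketch, not part of [BellantoniCook93]), the basic safe functions can be modeled over bit-strings:

```python
# Words of B are modeled as Python strings over {'0', '1'}; ε is "".

def e(w):              # (e, 0): always returns the empty string
    return ""

def a0(w):             # (a0, 0): a0(W) = 0 · W
    return "0" + w

def a1(w):             # (a1, 0): a1(W) = 1 · W
    return "1" + w

def t(w):              # (t, 0): t(ε) = ε, t(0W) = W, t(1W) = W
    return w[1:]

def c(x, w, v, y):     # (c, 0): case distinction on the first bit of x
    if x == "":
        return w
    return v if x[0] == "0" else y
```

The normal/safe split is pure bookkeeping and is not visible in the code itself; it only constrains how these functions may be combined.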

Safe Composition

Given:

    (f : B^n → B, m)
    (g_j : B^k → B, k)       for every 1 ≤ j ≤ m
    (h_j : B^{k+i} → B, k)   for every m + 1 ≤ j ≤ n

the safe function (p : B^{k+i} → B, k) is defined as follows:

    p(~W; ~V) = f(g_1(~W; ), ..., g_m(~W; ); h_{m+1}(~W; ~V), ..., h_n(~W; ~V)).
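The safe composition scheme can be sketched in Python; `safe_compose` is a hypothetical helper, with every safe function modeled as a callable taking the tuple of normal arguments and the tuple of safe arguments separately:

```python
def safe_compose(f, gs, hs):
    """Build p(W; V) = f(g1(W;), ..., gm(W;); h_{m+1}(W; V), ..., h_n(W; V)).

    Every function takes (normal, safe) as two tuples.  The g's see only the
    normal arguments, and their results fill the *normal* positions of f;
    the h's results fill the safe positions.  Safe values thus never reach
    a normal position — the point of the scheme.
    """
    def p(normal, safe):
        normal_results = tuple(g(normal, ()) for g in gs)
        safe_results = tuple(h(normal, safe) for h in hs)
        return f(normal_results, safe_results)
    return p

# Example: p(W;) = a0(; Π(W)) — prepend 0 to the sole normal argument.
proj1 = lambda normal, safe: normal[0]
prepend0 = lambda normal, safe: "0" + safe[0]    # (a0, 0): its argument is safe
p = safe_compose(prepend0, gs=[], hs=[proj1])
```

For instance `p(("101",), ())` yields `"0101"`: the projection result enters `prepend0` through a safe position, as the scheme demands.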

Safe Recursion

Given:

    (f : B^n → B, m)
    (g_k : B^{n+2} → B, m + 1)   for every k ∈ {0, 1}

the safe function (h : B^{n+1} → B, m + 1) is defined as follows:

    h(ε, ~W; ~V) = f(~W; ~V);
    h(0X, ~W; ~V) = g_0(X, ~W; ~V, h(X, ~W; ~V));
    h(1X, ~W; ~V) = g_1(X, ~W; ~V, h(X, ~W; ~V)).
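The safe recursion scheme can be sketched in Python; `safe_rec` and the concatenation instance below are hypothetical illustrations, with the recursive result passed only in a safe position:

```python
def safe_rec(f, g0, g1):
    """Build h by safe recursion on notation over the first (normal) argument.

    f(normal, safe) handles the base case X = ε; g_b(x, normal, safe) handles
    X = bX', and receives the recursive result appended to the *safe* tuple.
    """
    def h(x, normal, safe):
        if x == "":
            return f(normal, safe)
        rest = x[1:]
        r = h(rest, normal, safe)            # recursive call on the tail
        g = g0 if x[0] == "0" else g1
        return g(rest, normal, safe + (r,))  # r enters a safe position only
    return h

# concat(X; V) = X · V:  base h(ε; V) = V, step h(bX'; V) = b · h(X'; V).
concat = safe_rec(
    f=lambda normal, safe: safe[0],
    g0=lambda x, normal, safe: "0" + safe[-1],  # safe[-1] is the recursive result
    g1=lambda x, normal, safe: "1" + safe[-1],
)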

Classes of Functions

- BCS is the smallest class of safe functions which includes the basic safe functions above and which is closed under safe composition and safe recursion.
- BC is the set of those functions f : B → B such that (f, n) ∈ BCS for some n ∈ {0, 1}.

Lemma (Max-Poly Lemma)
For every (f : B^n → B, m) in BCS, there is a monotonically increasing polynomial p_f : N → N such that:

    |f(V_1, ..., V_n)| ≤ p_f( Σ_{1 ≤ k ≤ m} |V_k| ) + max_{m+1 ≤ k ≤ n} |V_k|.

Proof.
- An induction on the structure of the proof of a function being in BCS.
- The restrictions safe recursion must satisfy are essential.
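To see why the restrictions are essential, consider what goes wrong without them: if the recursive result could be duplicated freely, output lengths would double at every recursion step, escaping every max-poly bound. An informal Python sketch of such a function (deliberately *not* expressible in BCS):

```python
# exp_unrestricted(X) returns a word of length 2^|X|: each step concatenates
# the recursive result with itself.  In BCS this step is forbidden: the
# result r arrives in a safe position, and nothing in the algebra lets a
# safe value flow into a position where it can be duplicated and recursed on.
def exp_unrestricted(x):
    if x == "":
        return "0"
    r = exp_unrestricted(x[1:])
    return r + r          # the illegal step: duplicates a "safe" value

assert len(exp_unrestricted("11111")) == 2 ** 5   # 32, exponential in |X|
```

No polynomial p_f can satisfy |f(X)| ≤ p_f(|X|) + max(...) for this function, so it lies outside the reach of the Max-Poly Lemma.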

Theorem (Polytime Soundness)
BC ⊆ FP_{0,1}.

Proof.
- This is an induction on the structure of the proof of a function being in BCS.
- The proof becomes much easier if we first prove that simultaneous primitive recursion can be encoded into ordinary primitive recursion.
- We make essential use of the Max-Poly Lemma.

Lemma (Polynomials)
For every polynomial p : N → N with natural coefficients there is a safe function (f, 1), where f : B → B, such that |f(W)| = p(|W|) for every W ∈ B.

Theorem (Polytime Completeness)
FP_{0,1} ⊆ BC.

Theorem (BellantoniCook1992)
FP_{0,1} = BC.
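For intuition on the Polynomials lemma, here is an informal Python sketch (not a formal BCS derivation) of a function with |f(W)| = |W|^2: recurse over W, appending a fresh copy of W at each step. W is used only as a normal argument, and the recursive result only ever appears in a safe position of the concatenation, so the construction stays within the spirit of safe recursion.

```python
def square_length(w):
    """Return a word of length |w|**2, mimicking safe recursion:
    sq(ε, W;) = ε and sq(bX, W;) = W · sq(X, W;).  The normal argument W
    may be copied into the output; the recursive result may not."""
    def sq(x):
        return "" if x == "" else w + sq(x[1:])
    return sq(w)

assert len(square_length("1011")) == 16
```

Iterating the same idea yields any polynomial length |W|^k with natural coefficients, which is what the lemma asserts.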


Complexity Classes


Part I

Implicit Complexity in the λ-Calculus

λ-Calculus and Complexity

- The λ-calculus by itself is simply too powerful to form an ICC system.
- How should we proceed if we want to isolate, e.g., a class of λ-terms computing polytime functions?
- Type Systems [Hofmann1997, BNS2000, Hofmann1999]:
  - You endow the λ-calculus with a type system and with some constants for data, recursion, etc.
  - This way you get something similar to Gödel's T.
  - Then, you impose some constraints on recursion, akin to those from [BellantoniCook1992].
- Linearity Constraints [Girard1997, Lafont2004]:
  - Key observation: copying is the operation that makes evaluation of λ-expressions problematic from a complexity point of view.
  - Let us define some constraints on duplication, then!

From Lambda Calculus to Soft Lambda Calculus

- Lambda calculus Λ:
      M ::= x | λx.M | MM
  with no structural constraints.
- Linear lambda calculus Λ_!:
      M ::= x | λx.M | λ!x.M | MM | !M
  where x appears linearly in the body of λx.M.
- Soft lambda calculus Λ_S:
      M ::= x | λx.M | λ!x.M | MM | !M
  where additional constraints are needed for λ!x.M:
  - x appears once in M, inside a single occurrence of !...
  - ...or x appears more than once in M, outside !.
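The side condition on λ!x.M can be made mechanical. Below is an informal Python checker over a small term AST (all constructor names are hypothetical): for each λ! binder, the bound variable must either occur exactly once under a single !, or occur (one or more times) entirely outside any !.

```python
from dataclasses import dataclass

# A small AST for Lambda_S terms (hypothetical constructor names).
@dataclass
class Var:
    name: str

@dataclass
class Lam:          # λx.M
    var: str
    body: object

@dataclass
class BangLam:      # λ!x.M
    var: str
    body: object

@dataclass
class App:          # M N
    fun: object
    arg: object

@dataclass
class Bang:         # !M
    body: object

def occurrences(term, x, depth=0):
    """Count free occurrences of x at bang-depth 0 (outside) vs >= 1 (inside)."""
    if isinstance(term, Var):
        if term.name != x:
            return (0, 0)
        return (1, 0) if depth == 0 else (0, 1)
    if isinstance(term, (Lam, BangLam)):
        if term.var == x:          # x is shadowed below this binder
            return (0, 0)
        return occurrences(term.body, x, depth)
    if isinstance(term, App):
        o1, i1 = occurrences(term.fun, x, depth)
        o2, i2 = occurrences(term.arg, x, depth)
        return (o1 + o2, i1 + i2)
    return occurrences(term.body, x, depth + 1)   # Bang

def soft_ok(term):
    """Check the Lambda_S side condition on every λ! binder in the term."""
    if isinstance(term, Var):
        return True
    if isinstance(term, App):
        return soft_ok(term.fun) and soft_ok(term.arg)
    if isinstance(term, BangLam):
        out, ins = occurrences(term.body, term.var)
        ok = (ins == 1 and out == 0) or (ins == 0 and out >= 1)
        return ok and soft_ok(term.body)
    return soft_ok(term.body)      # Lam or Bang

# Three example terms:
y = Var("y")
t1 = BangLam("x", App(App(y, Var("x")), Var("x")))        # λ!x.yxx
t2 = BangLam("x", App(y, Bang(Var("x"))))                 # λ!x.y!x
t3 = BangLam("x", App(App(y, Bang(Var("x"))), Var("x")))  # λ!x.y(!x)x
```

On these terms the checker accepts λ!x.yxx (all occurrences outside !) and λ!x.y!x (one occurrence under a single !), but rejects λ!x.y(!x)x, which mixes an occurrence under ! with one outside.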

From Lambda Calculus to Soft Lambda Calculus

- Λ ⇒ Λ_! is a refinement.
  - Whenever a term can be copied, it must be marked as such, with !.
  - Λ can be embedded into Λ_!:
        {x} = x    {λx.M} = λ!x.{M}    {MN} = {M}!{N}
  - The embedding does not make use of λx.t.
- Λ_! ⇒ Λ_S is a restriction.
  - Whenever you copy, you lose the possibility of copying.
  - Examples:
        λ!x.yxx ✓    λ!x.y!x ✓    λ!x.y(!x)x ✗
  - Some results:
    - Polytime soundness;
    - Polytime completeness.

Linear Logic

- The Curry-Howard correspondence comes into play.
- Linear logic can be seen as a way to decompose A → B into !A ⊸ B.
- ⊸ is the arrow (implication) operator.
- ⊗ is the conjunction operator.
- ! is a new operator governed by the following rules:

      !A ≅ !A ⊗ !A
      !A ⊗ !B ≅ !(A ⊗ B)
      !A ⊸ !!A
      !A ⊸ A

Linear Logic

Subsystems...

           !A⊗!B ≅ !(A⊗B)   !A ⊸ !!A   !A ⊸ A           !A ≅ !A⊗!A
    ELL    YES               NO         NO               YES
    LLL    NO                NO         NO               YES
    SLL    YES               NO         !A ⊸ A⊗...⊗A     NO

...and their expressive power:

    ELL    Elementary Functions
    LLL    Polytime Functions
    SLL    Polytime Functions

Soft Linear Logic

- It is polynomial time sound [Lafont2002]:
  - B_π is the box depth of any proof π.

Theorem
There is a family of polynomials {p_n}_n such that the normal form of any proof π can be computed in time p_{B_π}(|π|).

- This holds for many notions of proofs: proof-nets, sequent calculus, lambda-terms, etc.
- It is also polynomial time complete [Lafont02, MairsonTerui03]:
  - A function f : N → N can be represented in soft linear logic if a proof π_f rewrites to an encoding of f(n) when cut against an encoding of n.

Theorem
Every polynomial time function can be represented in soft linear logic.
