
Part I: Higher-Order Probabilistic Programming


Ugo Dal Lago

(Based on joint work with Flavien Breuvart, Raphaëlle Crubillé, Charles Grellois, Davide Sangiorgi, ...)

POPL 2019, Lisbon, January 14th

Probabilistic Models

▸ The environment is supposed not to behave deterministically, but probabilistically.
▸ Crucial when modeling uncertainty.
▸ Useful to handle complex domains.
▸ Example:

  [Figure: a labelled Markov chain over four states q0, q1, q2, q3; the transition probabilities visible in the figure are 1/4, 3/4, 1, 1/2, 1/2, 1/3, 2/3, but the precise diagram did not survive extraction.]

▸ Abstractions:
  ▸ (Labelled) Markov chains.
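A chain like the one on the slide can be written down directly as a transition table and sampled. A minimal sketch, assuming a topology over q0–q3 consistent with the probability values visible in the figure (the exact edges are an assumption, since the diagram did not survive extraction):

```python
import random

# Hypothetical labelled Markov chain over the slide's states q0..q3;
# the edges are illustrative, only the state names and probability
# values come from the figure.
chain = {
    "q0": [("q1", 1 / 4), ("q2", 3 / 4)],
    "q1": [("q0", 1 / 2), ("q3", 1 / 2)],
    "q2": [("q2", 1 / 3), ("q3", 2 / 3)],
    "q3": [("q3", 1.0)],  # absorbing
}

def step(state):
    """Sample a successor state from the outgoing distribution."""
    targets, probs = zip(*chain[state])
    return random.choices(targets, weights=probs)[0]

def run(state, n):
    """Return a trajectory of n steps starting from `state`."""
    traj = [state]
    for _ in range(n):
        state = step(state)
        traj.append(state)
    return traj

print(run("q0", 10))
```

Each run samples a different trajectory: the environment's behaviour is a distribution over paths rather than a single path.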

ROBOTICS

ARTIFICIAL INTELLIGENCE

NATURAL LANGUAGE PROCESSING

Randomized Computation

▸ Algorithms and automata are assumed to have the ability to sample from a distribution [dLMSS1956, R1963].
▸ This is a powerful tool when solving computational problems.
▸ Example: the probabilistic Turing machine below.
▸ Abstractions:
  ▸ Randomized algorithms;
  ▸ Probabilistic Turing machines;
  ▸ Labelled Markov chains.

[Figure: animation frames of a probabilistic Turing machine on a tape reading · · · 0 1 1 1 0 · · ·. In state s scanning 0, the transition δ(s, 0) = {(q, 1, ←), (r, 0, →)} either writes 1, moves left and enters state q, or writes 0, moves right and enters state r; the frames annotate the two outcomes with probabilities 1/2, 1/2 and 2/3, 1/3.]
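One step of such a machine samples among the triples (new state, written symbol, head move) that δ assigns to the current state and scanned symbol. A minimal sketch; the table contains only the single rule visible on the slide, with the 1/2–1/2 split of one animation frame (an assumption, as the frames differ):

```python
import random
from collections import defaultdict

# δ maps (state, scanned symbol) to a distribution over
# (new state, written symbol, head move); -1 is left, +1 is right.
delta = {
    ("s", 0): [(("q", 1, -1), 1 / 2), (("r", 0, +1), 1 / 2)],
}

def ptm_step(state, tape, head):
    """Perform one probabilistic step, mutating the tape in place."""
    outcomes, probs = zip(*delta[(state, tape[head])])
    new_state, written, move = random.choices(outcomes, weights=probs)[0]
    tape[head] = written
    return new_state, head + move

# A tape that is blank (0) outside the written portion 0 1 1 1 0.
tape = defaultdict(int, enumerate([0, 1, 1, 1, 0]))
state, head = ptm_step("s", tape, 0)
print(state, head)
```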

ALGORITHMICS

CRYPTOGRAPHY

PROGRAM VERIFICATION

What Algorithms Compute

▸ Deterministic Computation
  ▸ For every input x, there is at most one output y any algorithm A produces when fed with x.
  ▸ As a consequence: A ⇝ ⟦A⟧ : ℕ ⇀ ℕ.
▸ Randomized Computation
  ▸ For every input x, any algorithm A outputs y with a probability 0 ≤ p ≤ 1.
  ▸ As a consequence: A ⇝ ⟦A⟧ : ℕ → D(ℕ).
  ▸ The distribution ⟦A⟧(n) sums to anything between 0 and 1, thus accounting for the probability of divergence.
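The shift from ℕ ⇀ ℕ to ℕ → D(ℕ) can be made concrete by letting a program denote its exact output distribution. A toy illustration (not from the slides): the algorithm "flip n fair coins, return the number of heads"; its denotation sums to 1 because this program always terminates, while a possibly diverging program would have total mass below 1.

```python
from fractions import Fraction
from itertools import product

def denotation(n):
    """⟦A⟧(n) for A = "flip n fair coins, output the number of heads":
    a finite map from outputs to exact probabilities."""
    dist = {}
    for flips in product([0, 1], repeat=n):
        heads = sum(flips)
        dist[heads] = dist.get(heads, Fraction(0)) + Fraction(1, 2 ** n)
    return dist

print(denotation(3))  # masses 1/8, 3/8, 3/8, 1/8 on outputs 0..3; total 1
```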

Higher-Order Computation

▸ Mainly useful in programming.
▸ Functions are first-class citizens:
  ▸ They can be passed as arguments;
  ▸ They can be obtained as results.
▸ Motivations:
  ▸ Modularity;
  ▸ Code reuse;
  ▸ Conciseness.
▸ Example: [not captured in extraction]
▸ Models:
  ▸ λ-calculus.
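First-class functions in a few lines (a generic illustration; the slide's own example was lost in extraction): `compose` consumes two functions as arguments and returns a new function as its result.

```python
def compose(f, g):
    """Return the function mapping x to f(g(x))."""
    return lambda x: f(g(x))

inc = lambda x: x + 1
double = lambda x: 2 * x

inc_then_double = compose(double, inc)  # functions in, a function out
print(inc_then_double(3))  # double(inc(3)) = 8
```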

FUNCTIONAL PROGRAMMING

FUNCTIONAL DATA STRUCTURES

λ-CALCULUS

Higher-Order Probabilistic Computation?

Does it Make Sense?

What Kind of Metatheory Does it Have?

Interesting Research Problems?

This Tutorial

1. Motivating Examples.
2. A λ-Calculus Foundation.
3. Operational and Denotational Semantics.
4. Termination and Resource Analysis.

A webpage: http://www.cs.unibo.it/~dallago/HOPP/

[Figure: animation frames contrasting merge sort and quick sort; the code shown on the slides was not captured in extraction.]
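The contrast the slides draw is presumably that merge sort is deterministic while quick sort becomes a randomized algorithm once the pivot is sampled uniformly, which makes the expected O(n log n) running time hold on every input. A minimal sketch of the randomized variant:

```python
import random

def quicksort(xs):
    """Randomized quicksort: the pivot is drawn uniformly from xs."""
    if len(xs) <= 1:
        return list(xs)
    pivot = random.choice(xs)
    smaller = [x for x in xs if x < pivot]
    equal = [x for x in xs if x == pivot]
    larger = [x for x in xs if x > pivot]
    return quicksort(smaller) + equal + quicksort(larger)

print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]
```

The output is the same on every run; only the running time is random.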

[Figure: a random walk on states 0, 1, 2, · · · , n, moving in one direction with probability p and in the other with probability 1 − p.]
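Such a walk can be simulated directly; below, the walker steps right with probability p and left otherwise (which direction each label refers to is an assumption, the figure being lost), and we estimate the chance of reaching n before 0.

```python
import random

def walk(start, n, p, max_steps=100_000):
    """Biased random walk on {0, ..., n}: +1 w.p. p, -1 w.p. 1 - p.
    Returns the absorbing endpoint reached (0 or n), or None on timeout."""
    pos = start
    for _ in range(max_steps):
        if pos in (0, n):
            return pos
        pos += 1 if random.random() < p else -1
    return None

# Estimate the probability of hitting n before 0 from the middle.
random.seed(42)
trials = 2000
hits = sum(walk(5, 10, 0.5) == 10 for _ in range(trials))
print(hits / trials)  # ≈ 0.5 in the unbiased case, by symmetry
```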

The Coupon Collector

▸ A brand distributes a large quantity of coupons, each labelled with i ∈ {1, . . . , n}.
▸ Every day, you collect one coupon at a local store. Any label i has probability 1/n to occur.
▸ You win a prize when you collect a set of n coupons, each with a distinct label.
▸ Example: if n = 5, you could get the following coupons: 3, 1, 5, 2, 3, 1, 5, 2, 3, 1, 5, 2, . . .
▸ Are you guaranteed to win the prize with probability 1? After how many days, on average?
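Both questions have classical answers: yes, the prize is won with probability 1 (every missing label eventually shows up almost surely), and the expected number of days is n · H_n = n · (1 + 1/2 + · · · + 1/n), roughly n ln n. A sketch checking the formula against simulation:

```python
import random

def days_to_collect(n):
    """Draw uniform coupons until all n labels have been seen."""
    seen, days = set(), 0
    while len(seen) < n:
        seen.add(random.randrange(n))
        days += 1
    return days

def expected_days(n):
    """Exact expectation n * H_n: the k-th new label takes n/(n-k+1)
    days on average, and these waiting times add up."""
    return n * sum(1 / k for k in range(1, n + 1))

n = 5
random.seed(0)
avg = sum(days_to_collect(n) for _ in range(5000)) / 5000
print(avg, expected_days(n))  # both close to 11.42 for n = 5
```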

Beyond Randomized Algorithms: The Grass Problem

▸ You get up in the morning, and notice that the grass in your garden is wet.
▸ You know that there is a 30% probability that it rained during the night.
▸ The sprinkler is activated once every two days.
▸ You further know that:
  ▸ If it rained, there is a 90% probability that the grass is wet the following morning.
  ▸ If the sprinkler was activated, there is an 80% probability that the grass is wet the following morning.
  ▸ In any other case, there is anyway a 10% probability that the grass is wet.
▸ In other words, you are in a situation like the following:

      rain   sprinkler
         \   /
      grass is wet

▸ Now: how likely is it that it rained?
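This is Bayesian inference: P(rain | wet) follows from enumerating the four (rain, sprinkler) worlds and conditioning on the evidence. The slides do not say how the two causes combine when both are active; the sketch below assumes a noisy-or combination (each active cause independently fails to wet the grass, the 10% leak applying only when neither is active), which is one standard reading.

```python
# Priors from the slides: rain 30%; sprinkler every other day, i.e. 50%.
P_RAIN, P_SPRINKLER = 0.3, 0.5

def p_wet(rain, sprinkler):
    """Noisy-or combination (an assumption, not stated on the slides)."""
    if not rain and not sprinkler:
        return 0.1  # leak: grass wet with no active cause
    p_dry = 1.0
    if rain:
        p_dry *= 1 - 0.9
    if sprinkler:
        p_dry *= 1 - 0.8
    return 1 - p_dry

# Enumerate all worlds; condition on the grass being wet.
joint_wet = joint_wet_rain = 0.0
for rain in (True, False):
    for sprinkler in (True, False):
        p_world = ((P_RAIN if rain else 1 - P_RAIN)
                   * (P_SPRINKLER if sprinkler else 1 - P_SPRINKLER))
        w = p_world * p_wet(rain, sprinkler)
        joint_wet += w
        if rain:
            joint_wet_rain += w

posterior = joint_wet_rain / joint_wet
print(round(posterior, 3))  # ≈ 0.472 under the noisy-or reading
```

So seeing wet grass raises the probability of rain from the 30% prior to roughly 47% under these assumptions.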

Questions?
