Lecture 7: ε-biased and almost k-wise independent spaces

Topics in Complexity Theory and Pseudorandomness (Spring 2013)

Rutgers University

Swastik Kopparty

Scribes: Ben Lund, Tim Naumovitz

Today we will see ε-biased spaces, almost k-wise independent spaces, and some applications.

1 Distributions

Let μ be a distribution on {0,1}^n.

Lemma 1. μ is uniformly distributed on {0,1}^n if and only if for each nonempty S ⊆ [n],

  Pr_{x∼μ}[Σ_{i∈S} x_i = 1] = 1/2

(where the addition is mod 2).

Proof. Let χ_S : {0,1}^n → ℝ be given by

  χ_S(x) = (−1)^{Σ_{i∈S} x_i}.

Thus χ_S(x) = 1 if Σ_{i∈S} x_i = 0, and χ_S(x) = −1 if Σ_{i∈S} x_i = 1. We can treat μ as a real-valued function μ : {0,1}^n → ℝ with μ(x) ≥ 0 for all x ∈ {0,1}^n and Σ_{x∈{0,1}^n} μ(x) = 1.

For nonempty S, consider the inner product:

  ⟨μ, χ_S⟩ = Σ_{x∈{0,1}^n} μ(x) χ_S(x)
           = Σ_{x : χ_S(x)=1} μ(x) − Σ_{x : χ_S(x)=−1} μ(x)
           = Pr_{x∼μ}[Σ_{i∈S} x_i = 0] − Pr_{x∼μ}[Σ_{i∈S} x_i = 1]
           = 0  (by our hypothesis).

Aside: Characters and Fourier analysis. The χ_S are called the characters of the group (Z_2^n, +). We have the following important properties:

1. χ_S(x + y) = χ_S(x) · χ_S(y).

2. For any nonempty S, we have Σ_{x∈{0,1}^n} χ_S(x) = 0. This follows from the following calculation. Pick any i ∈ S. Then:

  Σ_{x∈{0,1}^n} χ_S(x) = Σ_{x∈{0,1}^n} χ_S(x + e_i) = −Σ_{x∈{0,1}^n} χ_S(x),

where e_i is the 0-1 vector with a 1 only in the i-th coordinate.

3. Furthermore, χ_S(x) · χ_T(x) = (−1)^{Σ_{i∈S} x_i + Σ_{i∈T} x_i} = (−1)^{Σ_{i∈SΔT} x_i} = χ_{SΔT}(x). Sometimes we will also treat S, T as their characteristic vectors in Z_2^n, and in this notation χ_S(x) · χ_T(x) = χ_{S+T}(x).

4. ⟨χ_S, χ_T⟩ = Σ_{x∈{0,1}^n} χ_S(x) χ_T(x) = Σ_{x∈{0,1}^n} χ_{SΔT}(x) = 0 if S ≠ T, and 2^n if S = T.

This means that the χ_S are an orthogonal system, and since there are 2^n of them, they are a basis for ℝ^{{0,1}^n}.

Let μ̂(S) = ⟨μ, χ_S⟩: the S-th Fourier coefficient of μ.
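These Fourier coefficients can be computed by brute force for small n. The following sketch (plain Python; all helper names are mine, not from the notes) evaluates μ̂(S) = ⟨μ, χ_S⟩ directly and confirms that the uniform distribution has μ̂(∅) = 1 and μ̂(S) = 0 for every nonempty S, which is the fact used in the proof of Lemma 1.

```python
from itertools import product

n = 3
points = list(product([0, 1], repeat=n))

def chi(S, x):
    """Character chi_S(x) = (-1)^(sum of x_i over i in S)."""
    return (-1) ** sum(x[i] for i in S)

def fourier_coeff(mu, S):
    """mu_hat(S) = <mu, chi_S> = sum_x mu(x) * chi_S(x)."""
    return sum(mu[x] * chi(S, x) for x in points)

# The uniform distribution on {0,1}^3.
uniform = {x: 1 / 2 ** n for x in points}

# Every nonempty S subset of [n], decoded from a bitmask.
subsets = [[i for i in range(n) if mask >> i & 1] for mask in range(1, 1 << n)]
coeffs = [fourier_coeff(uniform, S) for S in subsets]

print(abs(fourier_coeff(uniform, [])), max(abs(c) for c in coeffs))  # 1.0 0.0
```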

End aside.

Proof (continued). We have μ̂(S) = 0 for each nonempty S. Thus μ = c · χ_∅ for some constant c, and since μ is a distribution, c = 1/2^n; this is the uniform distribution. ∎

Lemma 2. Suppose that for all nonempty S ⊆ [n],

  Pr_{x∼μ}[Σ_{i∈S} x_i = 1] ∈ ((1−ε)/2, (1+ε)/2).

Then μ is ε-close to the uniform distribution in L2, and thus μ is ε·2^{n/2}-close to the uniform distribution in L1 and ε-close to the uniform distribution in L∞.

Proof. The hypothesis gives us that |μ̂(S)| ≤ ε for every nonempty S. We also have μ̂(∅) = ⟨μ, χ_∅⟩ = 1.

Aside: Parseval's identity.

Lemma 3. Take any f : {0,1}^n → ℝ. We have

  f = (1/2^n) Σ_S ⟨f, χ_S⟩ χ_S = (1/2^n) Σ_S f̂(S) χ_S.

Then

  Σ_x f(x)² = (1/2^n) Σ_S f̂(S)².

Proof. This follows from the orthogonality of the χ_S:

  Σ_x f(x)² = ⟨f, f⟩ = (1/4^n) ⟨Σ_S f̂(S) χ_S, Σ_T f̂(T) χ_T⟩ = (1/4^n) Σ_{S,T} f̂(S) f̂(T) ⟨χ_S, χ_T⟩ = (1/2^n) Σ_S f̂(S)². ∎

End aside.
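Parseval's identity is easy to sanity-check numerically. A minimal sketch (plain Python; the random test function is my own choice): pick an arbitrary f on {0,1}^3, compute all coefficients f̂(S) = ⟨f, χ_S⟩, and compare Σ_x f(x)² against (1/2^n) Σ_S f̂(S)².

```python
import random
from itertools import product

random.seed(1)
n = 3
points = list(product([0, 1], repeat=n))

def chi(S, x):
    """Character chi_S(x) = (-1)^(sum of x_i over i in S)."""
    return (-1) ** sum(x[i] for i in S)

# An arbitrary real-valued function f on {0,1}^n.
f = {x: random.uniform(-1, 1) for x in points}

# All 2^n Fourier coefficients f_hat(S) = <f, chi_S>.
subsets = [[i for i in range(n) if m >> i & 1] for m in range(1 << n)]
fhat = [sum(f[x] * chi(S, x) for x in points) for S in subsets]

lhs = sum(f[x] ** 2 for x in points)       # sum_x f(x)^2
rhs = sum(c ** 2 for c in fhat) / 2 ** n   # (1/2^n) sum_S f_hat(S)^2

print(abs(lhs - rhs) < 1e-9)  # True
```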

Proof (cont'd). Let U = χ_∅/2^n be the uniform distribution. We have Û(∅) = 1 and Û(S) = 0 for all nonempty S. Thus μ − U (which is what we want to show is small in the L2 norm) has the following Fourier coefficients:

  (μ − U)^(S) = 0 if S = ∅, and μ̂(S) otherwise.

Now using Parseval's identity, we get that

  ‖μ − U‖₂² = (1/2^n) Σ_{S≠∅} μ̂(S)² ≤ (1/2^n) · ε² · (2^n − 1) ≤ ε².

This completes the proof; the results for L1 and L∞ follow from Cauchy–Schwarz and trivially, respectively. ∎

Thus if a distribution fools linear functions really well, it is almost uniform.
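Lemma 2 can likewise be checked on a toy example. The sketch below (plain Python; the particular tilted distribution is my own choice for illustration) builds a distribution whose only nonzero nonempty Fourier coefficient is 0.1, so it is 0.1-biased, and verifies the L2 and L1 bounds against the uniform distribution.

```python
from itertools import product

n = 3
points = list(product([0, 1], repeat=n))

# Tilt mass toward even-parity strings: mu(x) = (1 + 0.1 * (-1)^(x_1+...+x_n)) / 2^n.
mu = {x: (1 + 0.1 * (-1) ** sum(x)) / 2 ** n for x in points}

subsets = [[i for i in range(n) if m >> i & 1] for m in range(1, 1 << n)]

def coeff(S):
    """mu_hat(S) = <mu, chi_S>."""
    return sum(mu[x] * (-1) ** sum(x[i] for i in S) for x in points)

eps = max(abs(coeff(S)) for S in subsets)  # mu is eps-biased; here eps = 0.1

u = 1 / 2 ** n
l2 = sum((mu[x] - u) ** 2 for x in points) ** 0.5
l1 = sum(abs(mu[x] - u) for x in points)

# Lemma 2: ||mu - U||_2 <= eps and ||mu - U||_1 <= 2^(n/2) * eps.
print(l2 <= eps + 1e-12, l1 <= 2 ** (n / 2) * eps + 1e-12)  # True True
```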

1.1 Notes on the L1 distance

There are many distances between probability distributions, but the L1 distance has special status when we are interested in pseudorandomness.

Lemma 4 (Data processing inequality). Suppose μ, ν are distributions over the same domain D. Let f be a function defined on D, and pick x ∼ μ, y ∼ ν. Then

  ‖f(x) − f(y)‖_{L1} ≤ ‖μ − ν‖_{L1},

where f(x) and f(y) stand for the distributions of the images.

This is closely related to the following simple characterization of L1 distance: (1/2)‖μ − ν‖_{L1} ≥ ε if and only if there exists a distinguishing test T : D → {0,1} such that

  Pr_{x∼μ}[T(x) = 1] − Pr_{x∼ν}[T(x) = 1] ≥ ε.

Sketch of proof: Graph the distributions μ and ν over their (common) domain. Then (1/2)‖μ − ν‖_{L1} is the area between the curves where μ is above ν. Choose T to output 1 on the set where μ > ν. Then Pr_{x∼μ}[T(x) = 1] − Pr_{x∼ν}[T(x) = 1] is the area between μ and ν on this set, which is (1/2)‖μ − ν‖_{L1}.

Relationship to pseudorandom generators. When we studied PRGs, the goal was to find a simple μ such that for all small circuits C,

  |Pr_{x∼μ}[C(x) = 1] − Pr_{x∼U}[C(x) = 1]| ≤ ε.

The only difference between this condition and the condition for small L1 distance is the complexity constraint that C is small. THIS MAKES ALL THE DIFFERENCE IN THE WORLD! By the above discussion, we could try to show that μ is a PRG by showing the stronger condition that μ is ε-close to the uniform distribution in L1. This strengthening ruins the approach: there cannot be a μ which is generated using a small seed and is close to the uniform distribution in L1 distance:

  ‖μ − U‖_{L1} ≤ ε  ⟹  |support(μ)| ≥ (1 − ε) · 2^n.

Try to show this.
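The distinguishing-test characterization of the L1 distance is concrete enough to verify directly. A small sketch (plain Python; the example distribution μ and the tiny domain are arbitrary choices of mine): the test that outputs 1 exactly where μ exceeds ν achieves advantage equal to (1/2)‖μ − ν‖₁.

```python
from itertools import product

n = 2
points = list(product([0, 1], repeat=n))

# Two toy distributions on {0,1}^2 (mu is an arbitrary choice for illustration).
mu = {(0, 0): 0.4, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.1}
nu = {x: 0.25 for x in points}  # the uniform distribution

# Half the L1 distance (the total variation distance).
tv = 0.5 * sum(abs(mu[x] - nu[x]) for x in points)

# The optimal distinguishing test outputs 1 exactly where mu exceeds nu.
T = {x: 1 if mu[x] > nu[x] else 0 for x in points}
advantage = sum(mu[x] * T[x] for x in points) - sum(nu[x] * T[x] for x in points)

print(abs(advantage - tv) < 1e-12)  # True
```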

2 ε-biased distributions and k-wise independence

Definition 5. μ is ε-biased if for all nonempty S ⊆ [n],

  Pr[Σ_{i∈S} x_i = 1] ∈ [(1−ε)/2, (1+ε)/2].

Note:

1. μ is 0-biased ⟺ μ is uniform.
2. μ is ε-biased ⟹ μ is (ε, ε·2^{n/2})-close to uniform in (L2, L1).
3. μ is ε-biased ⟸ μ is ε-close to uniform in L1.

When is μ k-wise independent? μ is k-wise independent ⟺ for all S with 1 ≤ |S| ≤ k, we have μ̂(S) = 0.

Proof: (⟹) Take such an S. Since μ is k-wise independent, we have Pr_{x∼μ}[Σ_{i∈S} x_i = 1] = 1/2. By the definition of μ̂, this yields μ̂(S) = 0.

(⟸) Let S be as stated. Look at μ|_S, the restriction of μ to the coordinates in S. For every T ⊆ S, T ≠ ∅, we know that Pr_{x∼μ|_S}[Σ_{i∈T} x_i = 1] = 1/2, so (μ|_S)^(T) = 0, meaning μ|_S is uniform. This implies that μ is k-wise independent. ∎

When is μ ε-almost k-wise independent? Suppose that for all nonempty S ⊆ [n] with |S| ≤ k we have |μ̂(S)| ≤ ε. Then μ is δ-almost k-wise independent in L1 for δ = ε·2^{k/2}. Indeed, take any S ⊆ [n] with |S| = k, and look at ν = μ|_S, a distribution on {0,1}^S. For every T ⊆ S, since ν̂(T) = μ̂(T), we have |ν̂(T)| ≤ ε. This implies that ν is ε·2^{k/2}-close to uniform on {0,1}^S.

How much randomness is needed to generate these spaces?

For k-wise independence: we saw that O(k log n) bits suffice. In fact, Ω(k log n) bits are needed (you will prove this in the homework).

For ε-biased spaces: First let us show that there exist "simple" ε-biased spaces; we will later see how to get them explicitly. We try to get an ε-biased μ which is uniform on some K ⊆ {0,1}^n with |K| small. Then log|K| bits suffice to generate a sample from μ. We use the probabilistic method: choose K at random as follows: pick y_1, …, y_m ∈ {0,1}^n uniformly. We want that for all nonzero linear functions h : Z_2^n → Z_2,

  Pr_{i∈[m]}[h(y_i) = 1] ∈ ((1−ε)/2, (1+ε)/2).

Fix h. Then:

  Pr_{y_1,…,y_m}[y_1, …, y_m are bad for h] ≤ e^{−Ω(ε² m)},

by a Chernoff bound (this is because, for a uniformly random y ∈ Z_2^n, Pr[h(y) = 1] = 1/2). We then union bound over all h to get

  Pr_{y_1,…,y_m}[∃h : y_1, …, y_m are bad for h] ≤ 2^n · e^{−Ω(ε² m)}.

Now choose m = O(n/ε²), so that this is less than 1. We thus get an ε-biased space which can be generated using log n + 2 log(1/ε) + O(1) bits of true randomness. It turns out that this seed length is near optimal. Later this class we will explicitly construct ε-biased spaces with seed length O(log n + log(1/ε)).

For ε-almost k-wise independent spaces: We know that ε-biased spaces are automatically δ-almost k-wise independent for some δ. It turns out that this already gives us almost k-wise independent spaces using a smaller seed length than what is needed for pure k-wise independence. Indeed, if we take ε = δ·2^{−k/2} and take an explicit ε-biased space as mentioned above, then this is δ-almost k-wise independent, and has a seed length of O(log n + log(1/ε)) = O(log n + k + log(1/δ)).
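The probabilistic argument above is easy to simulate. The sketch below (plain Python; the constants and the fixed PRNG seed are my own choices) samples m = O(n/ε²) uniform points and checks every nonzero linear test h; with these parameters, the Chernoff-plus-union-bound calculation says the empirical bias stays below ε with overwhelming probability.

```python
import random

random.seed(0)  # fixed for reproducibility
n = 8
eps = 0.25
m = int(8 * n / eps ** 2)  # m = O(n / eps^2); the constant 8 is a generous choice

# K = {y_1, ..., y_m}: the candidate eps-biased multiset, each point an n-bit mask.
K = [random.getrandbits(n) for _ in range(m)]

def parity(h, y):
    """The linear function h applied to y: <h, y> over F_2."""
    return bin(h & y).count("1") & 1

# For every nonzero linear test h, the fraction of y in K with h(y) = 1
# should lie in ((1 - eps)/2, (1 + eps)/2), i.e. within eps/2 of 1/2.
worst = max(
    abs(sum(parity(h, y) for y in K) / m - 0.5)
    for h in range(1, 1 << n)
)
print(worst <= eps / 2)  # expected True, with overwhelming probability over K
```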

3 Efficient construction of ε-almost k-wise independent spaces

One way to get k-wise independence is to multiply a random seed y by a matrix M:

  y ↦ yᵀM.

In this construction, y is a vector with O(k log n) entries chosen uniformly at random, and M has n columns. Each entry of yᵀM is ⟨y, a_i⟩, where a_i is a column of M.

Claim: The output yᵀM will be k-wise independent if and only if every k columns of M are linearly independent.

Proof: Let S ⊆ [n] with |S| ≤ k. Suppose Σ_{i∈S} ⟨a_i, y⟩ is not uniformly distributed. This is equivalent to ⟨y, Σ_{i∈S} a_i⟩ not being uniformly distributed. But this implies that Σ_{i∈S} a_i = 0, since ⟨y, b⟩ is uniform for any fixed b ≠ 0. ∎

Claim: If, instead of taking y from the uniform distribution, we take y from an ε-biased distribution, then yᵀM will still be ε·2^{k/2}-almost k-wise independent.

Proof: Suppose not; then there exists a nonempty S ⊆ [n] with |S| ≤ k such that ⟨Σ_{i∈S} a_i, y⟩ has bias greater than ε. Since every set of k columns of M is linearly independent, we know that Σ_{i∈S} a_i ≠ 0. But y is chosen from an ε-biased space, so we have reached a contradiction. ∎

How many bits of randomness do we need to generate an n-bit sample from a δ-almost k-wise independent space using this procedure? We need a δ·2^{−k/2}-biased sample of length m = O(k log n). Since log m + 2 log(1/ε) bits of uniform randomness suffice for m bits of ε-biased randomness, we need O(log k + log log n) + O(log(1/δ) + k) = O(k + log log n + log(1/δ)) total bits of randomness.
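The first claim above (k-wise independence from a matrix whose every k columns are linearly independent) can be seen in action for k = 2. In the sketch below (plain Python; the matrix choice is mine), the columns of M are all seven nonzero vectors of F_2^3; any two distinct nonzero columns are linearly independent over F_2, so projecting the 8 outputs yᵀM onto any two coordinates should hit each of the four patterns equally often.

```python
from itertools import product, combinations

r, n, k = 3, 7, 2  # seed length r, output length n, independence parameter k

# Columns of M: all nonzero vectors of F_2^r (any two distinct ones are
# linearly independent over F_2, so the output should be 2-wise independent).
cols = [tuple((c >> j) & 1 for j in range(r)) for c in range(1, 1 << r)]

def output(y):
    """y -> y^T M: one inner product <y, a_i> per column a_i of M."""
    return tuple(sum(yj * aj for yj, aj in zip(y, a)) % 2 for a in cols)

outs = [output(y) for y in product([0, 1], repeat=r)]

# Every projection onto k coordinates hits each of the 2^k patterns equally often.
ok = True
for idx in combinations(range(n), k):
    counts = {}
    for x in outs:
        pat = tuple(x[i] for i in idx)
        counts[pat] = counts.get(pat, 0) + 1
    ok = ok and all(
        counts.get(p, 0) == len(outs) // 2 ** k
        for p in product([0, 1], repeat=k)
    )
print(ok)  # True
```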

4 Applications of ε-almost k-wise independent distributions

4.1 k-universal sets

A k-universal set S ⊆ {0,1}^n has the property that the projection of S onto any k indices contains all 2^k possible patterns. We can use ε-almost k-wise independent distributions to construct k-universal sets: if ε = (1/10) · 2^{−k}, then any ε-almost k-wise independent distribution has k-universal support. The size of the k-universal set we get out of this is 2^{O(k + log log n + log(1/ε))} = 2^{O(k)} · (log n)^{O(1)}, which is nearly optimal. (Note the surprisingly tiny dependence on n!)

4.2 Ramsey graphs

Pick the edge indicators (x_{ij})_{i<j}, one bit for each of the C(n,2) potential edges, from an ε-almost C(k,2)-wise independent space (here C(a,b) denotes the binomial coefficient); we can interpret x_{ij} = 1 as an edge between vertex i and vertex j, and x_{ij} = 0 as the absence of an edge. Fix S ⊆ [n] with |S| = k. By the data processing inequality,

  Pr[x_{ij} are all 0 or all 1 for all i, j ∈ S] ≤ 2 · 2^{−C(k,2)} + ε.

Taking a union bound,

  Pr[∃S, |S| = k, such that S is a clique or independent set] ≤ C(n,k) · (2 · 2^{−C(k,2)} + ε).

By setting ε = 2^{−C(k,2)} and n = 2^{k/10} in the above inequality, we ensure that the probability that there is a clique or independent set of size k is less than 1. Thus, we have described an explicit family of 2^{O(log² n)} graphs on n vertices, at least one of which is O(log n)-Ramsey. As a corollary, we can construct an O(log n)-Ramsey graph in 2^{O(log² n)} time.

5 Construction of ε-biased spaces

5.1 Finite extension field review

The construction described here will use the finite field F_{2^n}. This is an n-dimensional vector space over F_2. Addition is the same as for F_2^n; multiplication is a bilinear map. A polynomial P(x) ∈ F_{2^n}[x] of degree d has at most d roots.

5.2 Properties needed from an ε-biased space

Suppose y_1, …, y_m is an ε-biased space; let G be the n × m matrix with columns y_1, …, y_m. Pick x ∈ {0,1}^n and consider xᵀG. If x ≠ 0, it must have a fraction of 1s in ((1−ε)/2, (1+ε)/2). Pick any distinct x, y ∈ {0,1}^n and consider the Hamming distance Δ(xᵀG, yᵀG); this is the number of 1s in xᵀG − yᵀG = (x − y)ᵀG, which is in the range ((1−ε)m/2, (1+ε)m/2). Thus the image {xᵀG} is a linear space in {0,1}^m of dimension n such that any two vectors in the space have fractional distance at least 1/2 − ε/2.

5.3 Construction and proof of correctness

A point from the ε-biased space is calculated from two uniformly chosen elements α, β ∈ F_{2^n}. The point is calculated as

  (α, β) ↦ [⟨α¹, β⟩, ⟨α², β⟩, …, ⟨α^N, β⟩],

where ⟨·,·⟩ is the inner product over F_2^n, when elements of F_{2^n} are represented in some basis over F_2. If N is set to ε·2^n, then this construction needs 2 log(N/ε) = 2 log N + 2 log(1/ε) bits of randomness.

We need to show that the sample space described is ε-biased. Pick (α, β) uniformly from F_{2^n} × F_{2^n}, and take any nonempty S ⊆ [N]. We need to show that

  Pr[Σ_{i∈S} ⟨α^i, β⟩ = 0] ∈ ((1−ε)/2, (1+ε)/2).

We have Σ_{i∈S} ⟨α^i, β⟩ = ⟨Σ_{i∈S} α^i, β⟩. There are two cases to consider: either α is a root of P(x) = Σ_{i∈S} x^i, or it is not. If α is not a root of P(x), then Pr_β[⟨Σ_{i∈S} α^i, β⟩ = 0] = 1/2. If α is a root of P(x), then certainly ⟨Σ_{i∈S} α^i, β⟩ = 0. However, there are at most N roots of P(x), so the probability that α is a root is at most N · 2^{−n} = ε. Thus, the space is ε-biased. ∎
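This powering construction can be implemented directly over a small field. The sketch below (plain Python; the field size n = 4, the irreducible polynomial x⁴ + x + 1, and the output length N = 6 are my own illustrative choices) enumerates all 2^{2n} seeds (α, β) and confirms that no parity test deviates from 1/2 by more than ε/2 = N/2^{n+1}.

```python
# GF(2^4) arithmetic with irreducible polynomial x^4 + x + 1 (0b10011).
MOD = 0b10011
N_BITS = 4

def gf_mul(a, b):
    """Carry-less multiply of a and b, reduced modulo the irreducible polynomial."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if a & (1 << N_BITS):
            a ^= MOD
    return r

def gf_pow(a, e):
    """a^e in GF(2^4), for e >= 1."""
    r = 1
    for _ in range(e):
        r = gf_mul(r, a)
    return r

def ip(a, b):
    """Inner product over F_2^n: parity of the AND of the bit representations."""
    return bin(a & b).count("1") & 1

N = 6  # output length; the target bias is eps = N / 2^n = 6/16

# The whole sample space: one N-bit string per seed (alpha, beta).
samples = []
for alpha in range(1 << N_BITS):
    for beta in range(1 << N_BITS):
        samples.append([ip(gf_pow(alpha, i), beta) for i in range(1, N + 1)])

# Check every nonempty parity test S, encoded as a bitmask over [N].
worst = 0.0
for S in range(1, 1 << N):
    ones = sum(
        1 for x in samples
        if sum(x[i] for i in range(N) if S >> i & 1) % 2 == 1
    )
    worst = max(worst, abs(ones / len(samples) - 0.5))

print(worst)  # at most eps/2 = N / 2^(n+1) = 0.1875
```

Because a root α of P(x) = Σ_{i∈S} x^i forces the parity to 0, the deviation of each test equals (number of roots of P)/2^{n+1}, which the run above bounds by N/2^{n+1}.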
