[Notes] Learning From Data – A Short Course: e-Chapter 9

  • Page 30: Why is  E'_{out}(g_{r}) = \frac{1}{2}? My understanding:  \mathbb{P}\left[ r_{n} = +1 \mid g_{r}(x_{n}) = +1 \right] = \mathbb{P}\left[ r_{n} = -1 \mid g_{r}(x_{n}) = +1 \right] = \frac{1}{2}, because the events  g_{r}(x_{n}) = +1 and  r_{n} = +1 are independent; the same holds for the event  g_{r}(x_{n}) = -1. Note that both data points (x_{n}, +1) and (x_{n}, -1) may appear in the data set, so if  g_{r} is a deterministic hypothesis then clearly  E'_{out}(g_{r}) = \frac{1}{2}, since  \mathbb{P}\left[ r_{n} = +1 \mid x_{n} \right] = \mathbb{P}\left[ r_{n} = -1 \mid x_{n} \right] = \frac{1}{2}. If  g_{r} is a non-deterministic hypothesis, it is still independent of the random target (the  r_{1},\dots,r_{N} are generated independently), so the conclusion is unchanged. A small simulation illustrating this is sketched below.
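The following is a minimal numerical sketch (not from the book) of the argument above: because the random labels r_n are ±1 with probability 1/2 each and are independent of the inputs and of the hypothesis, any fixed hypothesis misclassifies a fresh point with probability 1/2. The particular hypothesis g_r and the input distribution below are arbitrary choices made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary fixed hypothesis (assumption for illustration): sign of the first coordinate.
def g_r(X):
    return np.sign(X[:, 0])

N = 1_000_000
X = rng.standard_normal((N, 2))        # arbitrary input distribution (assumption)
r = rng.choice([-1, +1], size=N)       # random targets: +/-1 with probability 1/2, independent of X and g_r

# Fraction of points misclassified against the random targets.
error = np.mean(g_r(X) != r)
print(f"estimated E'_out(g_r) = {error:.4f}")   # should be close to 0.5
```

Swapping in any other fixed hypothesis gives the same ≈0.5 estimate, since the error depends only on the labels being independent coin flips.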

 

