# Learning From Data – A Short Course: Exercise 4.3

Page 125:

Deterministic noise depends on $\mathcal{H}$, as some models approximate $f$ better than others.

(a) Assume $\mathcal{H}$ is fixed and we increase the complexity of $f$. Will deterministic noise in general go up or down? Is there a higher or lower tendency to overfit?

Deterministic noise in general will go up because it is harder for any hypothesis in $\mathcal{H}$ to approximate $f$, so the bias and var components of the expected out-of-sample error will go up. Since $\mathcal{H}$ is fixed, the expected in-sample error stays roughly the same, so the gap between in-sample and out-of-sample error widens and there is a higher tendency to overfit. (Recall that if the expected in-sample error is low while the expected out-of-sample error is high, the model is overfitting; if the expected in-sample error itself is high, the model is underfitting.)
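This can be checked numerically. Below is a minimal sketch (my own illustration, not from the book): I hold $\mathcal{H}$ fixed at degree-2 polynomials and make the target $f$ more complex by raising its frequency. The deterministic noise is approximated by the error of the *best* hypothesis in $\mathcal{H}$, i.e. the least-squares fit on a dense noiseless sample of $f$; the targets `cos(k*pi*x)` and the name `noise_by_freq` are my own choices.

```python
import numpy as np

# H is fixed: degree-2 polynomials.  We make f more complex by increasing
# its frequency k, and estimate the deterministic noise as the mean squared
# error of the best hypothesis in H on a dense noiseless sample of f.
x = np.linspace(-1, 1, 1000)

noise_by_freq = {}
for k in [0.5, 1, 2, 4]:  # increasing complexity of the target f
    f = np.cos(k * np.pi * x)
    coeffs = np.polynomial.polynomial.polyfit(x, f, 2)      # best h in H
    best_h = np.polynomial.polynomial.polyval(x, coeffs)
    noise_by_freq[k] = np.mean((f - best_h) ** 2)           # ~ deterministic noise
    print(f"f = cos({k}*pi*x): deterministic noise ~ {noise_by_freq[k]:.4f}")
```

As $f$ gets more complex the best fit in the fixed $\mathcal{H}$ captures less and less of it, so the printed deterministic noise grows with `k`.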

(b) Assume $f$ is fixed and we decrease the complexity of $\mathcal{H}$. Will deterministic noise in general go up or down? Is there a higher or lower tendency to overfit?

[Hint: There is a race between two factors that affect overfitting in opposite ways, but one wins.]

Deterministic noise in general will go up because $\mathcal{H}$ has a lower chance to approximate $f$, so the bias component of the expected out-of-sample error will go up; however, the var component will go down. I am not sure about the overfitting tendency in this case (though it looks like the drop in var eventually wins, so the tendency to overfit goes down), but we have this quote from the author:

So getting back to the point, if you make $\mathcal{H}$ more complex, you will decrease the det. noise (bias) but you will increase the var (its indirect impact). Usually the latter dominates (overfitting, not because of the direct impact of the noise, but because of its indirect impact) … unless you are in the underfitting regime when the former dominates.
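The race the quote describes can be watched directly. Below is a sketch in the spirit of the book's sin(πx) bias-variance example, but as my own simulation (the numbers it prints are estimates, not quoted from the book): $f$ is fixed at $\sin(\pi x)$, and a more complex $\mathcal{H}$ (lines) is compared against a simpler one (constants), each fit to N = 2 noiseless points per trial.

```python
import numpy as np

# f is fixed; we vary the complexity of H.  For each H we fit many
# independent 2-point samples of f, then decompose the expected
# out-of-sample error into bias (~ deterministic noise) and var.
rng = np.random.default_rng(0)
x_test = np.linspace(-1, 1, 500)
f = lambda x: np.sin(np.pi * x)
trials, N = 10000, 2

bias_by_degree, var_by_degree = {}, {}
for degree in [1, 0]:  # lines (more complex H), then constants (simpler H)
    fits = np.empty((trials, x_test.size))
    for t in range(trials):
        x_train = rng.uniform(-1, 1, N)
        c = np.polynomial.polynomial.polyfit(x_train, f(x_train), degree)
        fits[t] = np.polynomial.polynomial.polyval(x_test, c)
    g_bar = fits.mean(axis=0)                                  # average hypothesis
    bias_by_degree[degree] = np.mean((g_bar - f(x_test)) ** 2)  # ~ det. noise
    var_by_degree[degree] = np.mean(fits.var(axis=0))
    print(f"degree {degree}: bias ~ {bias_by_degree[degree]:.2f}, "
          f"var ~ {var_by_degree[degree]:.2f}")
```

Going from lines down to constants, the bias (deterministic noise) goes up but the var drops by much more, so the total expected out-of-sample error is lower for the simpler model: the var factor wins the race, and the tendency to overfit goes down.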

There is also a discussion about this on the forum.