### Tag Archive

## Protected: Introduction to Support Vector Machine – Part 1

## Learning From Data – A Short Course: Exercise 8.15

## Learning From Data – A Short Course: Exercise 8.14

## [Notes] Learning From Data – A Short Course: e-Chapter 8

## Learning From Data – A Short Course: Exercise 8.11

## Learning From Data – A Short Course: Exercise 8.9

## Learning From Data – A Short Course: Problem 8.7

## Learning From Data – A Short Course: Exercise 8.7

## Learning From Data – A Short Course: Exercise 8.3

## Learning From Data – A Short Course: Exercise 8.1


12 Entries in: **Support Vector Machine**

There is no excerpt because this is a protected post.

Consider two finite-dimensional feature transforms $\Phi_1$ and $\Phi_2$ and their corresponding kernels $K_1$ and $K_2$. (a) Define $\Phi(\mathbf{x}) = (\Phi_1(\mathbf{x}), \Phi_2(\mathbf{x}))$. Express the corresponding kernel of $\Phi$ in terms of $K_1$ and $K_2$. (b) Consider the matrix $\Phi_1(\mathbf{x})\Phi_2(\mathbf{x})^{\mathrm{T}}$ and let $\Phi(\mathbf{x})$ be the vector representation of the matrix (say, by concatenating all the rows). Express the corresponding kernel of $\Phi$ in terms of $K_1$ and $K_2$ [...]
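A quick numerical check of both parts, as a sketch with two small hypothetical feature maps of my own choosing (the book only asks for the algebraic identities $K = K_1 + K_2$ for concatenation and $K = K_1 K_2$ for the vectorized outer product):

```python
import numpy as np

# Hypothetical small feature transforms standing in for Phi_1 and Phi_2.
def phi1(x):
    return np.array([x[0], x[1]])                    # identity features

def phi2(x):
    return np.array([x[0]**2, x[0]*x[1], x[1]**2])   # degree-2 monomials

def k1(x, xp):
    return phi1(x) @ phi1(xp)

def k2(x, xp):
    return phi2(x) @ phi2(xp)

x, xp = np.array([1.0, 2.0]), np.array([0.5, -1.0])

# (a) Concatenation Phi = (Phi_1, Phi_2)  =>  kernel is K1 + K2.
phi_cat = np.concatenate([phi1(x), phi2(x)])
phi_cat_p = np.concatenate([phi1(xp), phi2(xp)])
assert np.isclose(phi_cat @ phi_cat_p, k1(x, xp) + k2(x, xp))

# (b) Vectorized outer product Phi = vec(Phi_1 Phi_2^T)  =>  kernel is K1 * K2,
# since vec(a b^T) . vec(c d^T) = (a.c)(b.d).
phi_vec = np.outer(phi1(x), phi2(x)).ravel()
phi_vec_p = np.outer(phi1(xp), phi2(xp)).ravel()
assert np.isclose(phi_vec @ phi_vec_p, k1(x, xp) * k2(x, xp))
```

The asserts pass for any choice of feature maps, which is the point: both constructions stay valid kernels because they have explicit feature-space representations.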

Suppose that we removed a data point $(\mathbf{x}_n, y_n)$ with $\alpha_n^* = 0$. (a) Show that the previous optimal solution $\alpha^*$ remains feasible for the new dual problem (8.21) (after removing $\alpha_n$). Here (8.21) is the old dual problem, and the problem after removing the point is the new dual problem. Because $\alpha_n$ appears exactly when $(\mathbf{x}_n, y_n)$ appears, and vice versa, when we substitute the value of $\alpha^*$ into (8.21), there is [...]

Page 29: "Then, at least one of the $\alpha_n$ will be strictly positive." Please refer to the constraint.

(a) Show that the problem in (8.21) is a standard QP-problem: $\min_{\alpha} \frac{1}{2}\alpha^{\mathrm{T}} Q_D \alpha - \mathbf{1}^{\mathrm{T}}\alpha$, where $Q_D$ and $A_D$ ($D$ for dual) are given by the dual quadratic-coefficient and constraint matrices. It is easy to show this; what should be noted here is that: [...] (b) The matrix $Q_D$ of quadratic coefficients is $[Q_D]_{mn} = y_m y_n \mathbf{x}_m^{\mathrm{T}}\mathbf{x}_n$. Show that $Q_D = X_s X_s^{\mathrm{T}}$, where $X_s$ is the 'signed data matrix' whose $n$-th row is $y_n\mathbf{x}_n^{\mathrm{T}}$. Hence, show that $Q_D$ is [...]
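Part (b) is easy to sanity-check numerically; a minimal sketch with random data (the variable names are mine, not the book's):

```python
import numpy as np

rng = np.random.default_rng(0)
N, d = 6, 3
X = rng.normal(size=(N, d))              # data matrix, rows are x_n^T
y = rng.choice([-1.0, 1.0], size=N)      # labels

# Signed data matrix: n-th row is y_n x_n^T.
Xs = y[:, None] * X

# Entrywise definition [Q_D]_{mn} = y_m y_n x_m^T x_n.
QD = np.array([[y[m] * y[n] * (X[m] @ X[n]) for n in range(N)]
               for m in range(N)])

# The factorization Q_D = X_s X_s^T holds entry by entry.
assert np.allclose(QD, Xs @ Xs.T)

# PSD follows since z^T Q_D z = ||X_s^T z||^2 >= 0; equivalently,
# all eigenvalues are non-negative (up to floating-point noise).
assert np.all(np.linalg.eigvalsh(QD) >= -1e-10)
```

The factorization is exactly why any dual SVM solver can rely on the QP being convex: $Q_D$ is a Gram matrix, hence positive semidefinite by construction.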

Let be optimal for (8.10), and let be optimal for (8.11). (a) Show that . If is optimal for (8.10) then it must satisfy the constraint , hence . If then it doesn't matter what is, . If then we can always choose , so . (b) Show that is feasible for (8.10). To show this, [...]

For any $\mathbf{x}_1, \dots, \mathbf{x}_N$ with $\|\mathbf{x}_n\| \le R$ and $N$ even, show that there exists a balanced dichotomy $y_1, \dots, y_N$ that satisfies $\sum_{n=1}^{N} y_n = 0$, and $\left\|\sum_{n=1}^{N} y_n\mathbf{x}_n\right\| \le \frac{NR}{\sqrt{N-1}}$. (This is the geometric lemma that is needed to bound the VC-dimension of $\rho$-fat hyperplanes.) The following steps are a guide for the proof. Suppose you randomly select $N/2$ of the labels to be $+1$, the others being [...]
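The lemma can be checked empirically by brute force over all balanced dichotomies of a small random point set; a sketch assuming points on the unit sphere, so $R = 1$ (the setup is mine, not the book's):

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
N, d = 8, 3                                       # N must be even
X = rng.normal(size=(N, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)     # place points on unit sphere
R = 1.0                                           # so ||x_n|| <= R = 1

bound = N * R / np.sqrt(N - 1)

# Enumerate every balanced dichotomy (N/2 labels +1, the rest -1) and keep
# the smallest ||sum_n y_n x_n||; the lemma guarantees it beats the bound.
best = np.inf
for pos in combinations(range(N), N // 2):
    y = -np.ones(N)
    y[list(pos)] = 1.0
    best = min(best, np.linalg.norm(y @ X))

assert best <= bound
```

This mirrors the probabilistic argument in the guided steps: the expected squared norm over a random balanced labeling is at most $N^2R^2/(N-1)$, so the minimizing dichotomy must satisfy the bound.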

Assume that the data is restricted to lie in a unit sphere. (a) Show that $d_{\mathrm{VC}}(\rho)$ is non-increasing in $\rho$. Suppose a dichotomy is shattered by a hyperplane with margin $\rho_2$; then it is also shattered by that hyperplane with any margin $\rho_1 \le \rho_2$, as the margin does not appear in the final hypothesis representation. (b) In 2 dimensions, show that $d_{\mathrm{VC}}(\rho) \ge 3$ for [...]

For separable data that contain both positive and negative examples, and a separating hyperplane $h$, define the positive-side margin $\rho_+(h)$ to be the distance between $h$ and the nearest data point of class $+1$. Similarly, define the negative-side margin $\rho_-(h)$ to be the distance between $h$ and the nearest data point of class $-1$. Argue that if $h$ is the [...]
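The balancing argument behind this exercise can be illustrated numerically: for a fixed direction $\mathbf{w}$, re-centering the bias so that the two side margins are equal never decreases the smaller margin. A sketch on toy data of my own choosing:

```python
import numpy as np

# Toy separable data; a hypothetical example, not from the book.
X_pos = np.array([[2.0, 2.0], [3.0, 1.0]])    # class +1
X_neg = np.array([[0.0, 0.0], [-1.0, 1.0]])   # class -1

w = np.array([1.0, 1.0])                      # a separating direction

def margins(w, b):
    """Positive-side and negative-side margins of the hyperplane w.x + b = 0."""
    d_pos = np.min((X_pos @ w + b) / np.linalg.norm(w))
    d_neg = np.min(-(X_neg @ w + b) / np.linalg.norm(w))
    return d_pos, d_neg

# An unbalanced hyperplane: one side's margin is smaller than the other's.
p, q = margins(w, b=-1.0)

# Re-center b halfway between the closest point of each class along w;
# the margins become equal and min(p, q) can only improve.
b_star = -(np.min(X_pos @ w) + np.max(X_neg @ w)) / 2
p2, q2 = margins(w, b_star)
assert np.isclose(p2, q2)
assert min(p2, q2) >= min(p, q)
```

Since re-centering improves any hyperplane with unequal side margins, the optimal hyperplane must have $\rho_+(h) = \rho_-(h)$, which is the conclusion the exercise is after.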

Assume $\mathcal{D}$ contains two data points $(\mathbf{x}_1, +1)$ and $(\mathbf{x}_2, -1)$. Show that: (a) No hyperplane can tolerate noise radius greater than $\frac{1}{2}\|\mathbf{x}_1 - \mathbf{x}_2\|$. Assume such a hyperplane exists and call it $h$. Let $\ell$ be the line connecting the two points $\mathbf{x}_1$ and $\mathbf{x}_2$; as the two points lie on different sides separated by the hyperplane $h$, it is always true that $\ell$ crosses [...]
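Part (a) can be simulated directly: for two fixed points and many random separating hyperplanes (a setup of my own, not the book's), the noise radius a hyperplane tolerates, i.e. its distance to the nearer of the two points, never exceeds half the distance between them.

```python
import numpy as np

rng = np.random.default_rng(2)
x1, x2 = np.array([1.0, 0.5]), np.array([-0.5, 1.5])   # (x1, +1), (x2, -1)
half_gap = np.linalg.norm(x1 - x2) / 2

checked = 0
for _ in range(1000):
    # Random hyperplane h: w.x + b = 0; keep it only if it separates
    # x1 (positive side) from x2 (negative side).
    w = rng.normal(size=2)
    b = rng.normal()
    if not ((w @ x1 + b > 0) and (w @ x2 + b < 0)):
        continue
    checked += 1
    # Noise radius tolerated by h = distance to the nearest of the two points.
    radius = min(abs(w @ x1 + b), abs(w @ x2 + b)) / np.linalg.norm(w)
    # Projecting the segment x1-x2 onto w shows the two distances sum to at
    # most ||x1 - x2||, so the smaller one is at most half of it.
    assert radius <= half_gap + 1e-12

assert checked > 0
```

The comment in the loop is the whole proof idea: since the segment between the points crosses $h$, the two point-to-hyperplane distances sum to at most $\|\mathbf{x}_1 - \mathbf{x}_2\|$.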