Suppose $A$ is a triangular matrix with nonzero diagonal entries, so $A$ is invertible. In the eigenvalue-finding process, we shift each of $A$'s diagonal entries by a scalar $\lambda$ to make $A - \lambda I$ non-invertible. The only way to do so is to choose a $\lambda$ that is itself one of $A$'s diagonal entries, so that after the shift one diagonal entry of $A - \lambda I$ will equal […]
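A quick numerical sanity check of the claim, using a small hypothetical upper-triangular matrix (the entries are mine, not from the post): the eigenvalues are exactly the diagonal entries, and shifting by one of them makes the matrix singular.

```python
import numpy as np

# Hypothetical upper-triangular matrix; its diagonal entries are 2, 5, 7.
A = np.array([[2.0, 1.0, 3.0],
              [0.0, 5.0, 4.0],
              [0.0, 0.0, 7.0]])

# Eigenvalues of a triangular matrix are exactly its diagonal entries.
eigs = np.linalg.eigvals(A)
print(sorted(eigs.real))  # → [2.0, 5.0, 7.0]

# Shifting by a diagonal entry makes A - lambda*I singular:
lam = 5.0
print(np.linalg.matrix_rank(A - lam * np.eye(3)))  # 2, i.e. rank-deficient
```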
Month: October 2016
Pictorial proof that Reflection(v) = 2*Projection(v) – Identity(v)
The hyperplane here is a 1-D line; I don't know if this visual proof holds for n-dimensional space haha…
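For what it's worth, the identity does hold in any dimension, and it's easy to check numerically. Below is a small sketch in $\mathbb{R}^3$ (the vectors are my own hypothetical choices): with $P$ the projection onto the line spanned by a unit vector $u$, the matrix $R = 2P - I$ behaves like a reflection — it fixes $u$, preserves lengths, and is its own inverse.

```python
import numpy as np

# Hypothetical unit vector spanning the "mirror" line in R^3.
u = np.array([1.0, 2.0, 2.0])
u = u / np.linalg.norm(u)

P = np.outer(u, u)        # projection onto the line spanned by u
R = 2 * P - np.eye(3)     # claimed reflection: 2*Projection - Identity

v = np.array([3.0, -1.0, 4.0])
rv = R @ v

print(np.allclose(R @ u, u))                               # u is fixed
print(np.allclose(np.linalg.norm(rv), np.linalg.norm(v)))  # lengths preserved
print(np.allclose(R @ R, np.eye(3)))                       # involution: R*R = I
```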
Proof that unless projection matrix P = I, P is singular
We have $P = A(A^\top A)^{-1}A^\top$; first we need to prove that $\operatorname{rank}(P) \le \operatorname{rank}(A)$. Set $B = (A^\top A)^{-1}A^\top$, so $P = AB$; hence each column vector of $P$ is a linear combination of column vectors of $A$, so $\operatorname{rank}(P) \le \operatorname{rank}(A)$. Here $A$ is an $m$ by $n$ matrix while $P$ is an $m$ by $m$ matrix, and the column vectors of $A$ must be linearly independent (so $\operatorname{rank}(A) = n \le m$), hence $\operatorname{rank}(P) \le n$. If $n < m$ then it's trivial to show that $P$ is […]
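The rank argument above can be checked numerically. Here is a minimal sketch with a hypothetical $3 \times 2$ matrix $A$ (my choice, not from the post): when $n < m$ the projection matrix has rank $n < m$ and is singular, and when $n = m$ it collapses to the identity.

```python
import numpy as np

# Hypothetical 3x2 matrix with linearly independent columns (m=3, n=2).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

P = A @ np.linalg.inv(A.T @ A) @ A.T  # projection onto col(A)

print(np.linalg.matrix_rank(P))           # 2 < 3, so P is singular
print(np.isclose(np.linalg.det(P), 0.0))  # True

# When n = m, A is square and invertible, and P reduces to the identity:
B = np.array([[1.0, 2.0],
              [3.0, 4.0]])
Q = B @ np.linalg.inv(B.T @ B) @ B.T
print(np.allclose(Q, np.eye(2)))          # True
```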
Learning From Data – A Short Course: Problem 7.1
Page 43. Implement the decision function below using a 3-layer perceptron. First I'll construct a rectangle like this: it's easy to see how: consider the four lines bounding the rectangle, and what we want is the hypothesis that is +1 inside all four half-planes and −1 otherwise. The corresponding MLP: Next I'll try to construct a cooler shape: now consider the three lines, and […]
Learning From Data – A Short Course: Exercise 8.17
Page 45. Show that $\frac{1}{N}\sum_{n=1}^{N}\xi_n$ is an upper bound on $E_{\text{in}}$, where $E_{\text{in}}$ is the classification error. We consider the error $e_n$ and the slack $\xi_n$ on data point $(\mathbf{x}_n, y_n)$. If $y_n(\mathbf{w}^\top\mathbf{x}_n + b) > 0$ (correct classification): $e_n = 0 \le \xi_n$. If $y_n(\mathbf{w}^\top\mathbf{x}_n + b) \le 0$ (wrong classification): $e_n = 1$ and $\xi_n \ge 1 - y_n(\mathbf{w}^\top\mathbf{x}_n + b) \ge 1 = e_n$. We have $\xi_n \ge e_n$ for every $n$. Hence $\frac{1}{N}\sum_{n=1}^{N}\xi_n \ge \frac{1}{N}\sum_{n=1}^{N}e_n = E_{\text{in}}$. So the statement follows.
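A quick numeric illustration of the pointwise bound, using random signals $s_n$ standing in for $\mathbf{w}^\top\mathbf{x}_n + b$ (the data here is synthetic, just to exercise the inequality): the hinge-style slack $\xi_n = \max(0,\, 1 - y_n s_n)$ dominates the 0/1 error $e_n$ on every point, so its mean dominates the classification error.

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.choice([-1.0, 1.0], size=1000)  # synthetic labels
s = rng.normal(size=1000)               # synthetic signals w.x + b

xi = np.maximum(0.0, 1.0 - y * s)       # slack variables
e = (y * s <= 0).astype(float)          # 0/1 classification errors

print(np.all(xi >= e))                  # True: xi_n >= e_n for every n
print(xi.mean() >= e.mean())            # True: (1/N) sum xi >= E_in
```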
Learning From Data – A Short Course: Exercise 8.12
Page 29. If all the data is from one class, then $y_n = y$ for $n = 1, \dots, N$. (a) What is $\boldsymbol{\alpha}^*$? (b) What is $\mathbf{w}^*$? From (8.23) we have $\mathbf{w}^* = \sum_{n=1}^{N} \alpha_n^* y_n \mathbf{x}_n$. As all the data is from one class, we also have $\sum_{n=1}^{N} \alpha_n^* y_n = y \sum_{n=1}^{N} \alpha_n^* = 0$, and since each $\alpha_n^* \ge 0$, this forces $\alpha_n^* = 0$ for all $n$. Hence: $\mathbf{w}^* = \mathbf{0}$.
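A small sanity check of this degenerate case on synthetic data (points and labels are mine): with $\mathbf{w}^* = \mathbf{0}$ and, as one would expect, $b^* = y$, every point satisfies the margin constraint $y_n(\mathbf{w}^{*\top}\mathbf{x}_n + b^*) \ge 1$ with equality, $\|\mathbf{w}^*\|$ is trivially minimal, and the hypothesis classifies everything as $y$.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 2))  # synthetic data, all from one class
y = 1.0                       # the shared label

w_star = np.zeros(2)          # degenerate solution: w* = 0
b_star = y                    # assumed b* = y, consistent with the margin

margins = y * (X @ w_star + b_star)
print(np.all(margins >= 1.0))                       # True: constraints hold
print(np.all(np.sign(X @ w_star + b_star) == y))    # True: h(x) = y everywhere
```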