1. Kernel PCA: In this exercise we show how PCA can be used for constructing nonlinear dimensionality reduction on the basis of the kernel trick (see Chapter 16). Let X be some instance space and let S = {x1, . . ., xm} be a set of points in X. Consider a feature mapping ψ : X → V, where V is some Hilbert space (possibly of infinite dimension). Let K : X × X → R be a kernel function, that is, K(x, x′) = ⟨ψ(x), ψ(x′)⟩. Kernel PCA is the process of mapping the elements of S into V using ψ, and then applying PCA over {ψ(x1), . . ., ψ(xm)} into R^n. The output of this process is the set of reduced elements.
Show how this process can be done in polynomial time in terms of m and n, assuming that each evaluation of K(·, ·) takes constant time. In particular, if your implementation requires multiplying two matrices A and B, verify that their product can be computed. Similarly, if an eigenvalue decomposition of some matrix C is required, verify that this decomposition can be computed.
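To make the polynomial-time claim concrete, below is a minimal NumPy sketch of one way the construction can go: it touches the feature space only through the m × m Gram matrix G with G_ij = K(x_i, x_j), so the cost is m² kernel evaluations plus an eigendecomposition of an m × m symmetric matrix, both polynomial in m and n. The function name kernel_pca and the Gaussian kernel in the usage snippet are illustrative choices, not from the text, and the sketch follows an uncentered PCA formulation (centering in feature space would add one extra step on G).

```python
import numpy as np

def kernel_pca(S, kernel, n):
    """Reduce the m points in S to n dimensions using only kernel evaluations.

    S      : list of m points (any objects the kernel accepts)
    kernel : function with kernel(x, x2) = <psi(x), psi(x2)>
    n      : target dimension
    Returns an (m, n) array whose j-th row is the reduced representation of S[j].
    """
    m = len(S)
    # Gram matrix: m^2 kernel evaluations, each assumed constant time.
    G = np.array([[kernel(S[i], S[j]) for j in range(m)] for i in range(m)])
    # Eigendecomposition of the symmetric m x m matrix G (polynomial in m).
    eigvals, eigvecs = np.linalg.eigh(G)      # eigenvalues in ascending order
    idx = np.argsort(eigvals)[::-1][:n]       # indices of the top-n eigenvalues
    lam, U = eigvals[idx], eigvecs[:, idx]
    lam = np.clip(lam, 0.0, None)             # guard against tiny negative values
    # If u is a unit eigenvector of G with eigenvalue lam, then
    # v = (1/sqrt(lam)) * sum_i u[i] * psi(x_i) is a unit principal direction in V,
    # and <psi(x_j), v> = sqrt(lam) * u[j]; so the reduced points need no explicit psi.
    return U * np.sqrt(lam)

# toy usage with a hypothetical Gaussian kernel
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 3))
    gauss = lambda a, b: np.exp(-np.sum((a - b) ** 2))
    Z = kernel_pca(list(X), gauss, n=2)
    print(Z.shape)  # (50, 2)
```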