Antti Honkela ICA: Exercise solutions
As V is a whitening matrix, it must satisfy

E{zz^T} = V C_x V^T = V E D E^T V^T = EM(E^T E)D(E^T E)ME^T = EMDME^T = I.  (84)
Multiplying this from the left by E^T and from the right by E yields

MDM = I.  (85)
Taking the inverse of both sides and multiplying the result from both sides by M yields

D^{-1} = M^2.  (86)

Thus M must be the unique symmetric positive semidefinite square root D^{-1/2} of D^{-1}.
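As a numerical sanity check of this result (a sketch, not part of the original derivation; all variable names and dimensions are illustrative), one can verify that V = E D^{-1/2} E^T whitens data whose covariance has the eigendecomposition C_x = E D E^T:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random symmetric positive definite covariance matrix C_x
# and its eigendecomposition C_x = E D E^T.
A = rng.standard_normal((4, 4))
C_x = A @ A.T + 4 * np.eye(4)
d, E = np.linalg.eigh(C_x)            # d holds the eigenvalues (diagonal of D)

# Whitening matrix V = E M E^T with M = D^{-1/2}.
M = np.diag(d ** -0.5)
V = E @ M @ E.T

# E{zz^T} = V C_x V^T should equal the identity, as in (84).
cov_z = V @ C_x @ V.T
print(np.allclose(cov_z, np.eye(4)))  # True
```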
Problem 7.4
Assuming x = As with A invertible, the model does not change if the data is multiplied by an
invertible matrix M:
z = Mx = (MA)s = Ãs  (87)
and while the mixing matrix changes, the independent components stay the same. If M is not invertible, the model no longer holds.
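A quick numerical illustration of this point (a sketch; the matrices and dimensions are arbitrary examples, not from the original text): multiplying the data by an invertible M changes the mixing matrix to MA but leaves the recovered components unchanged.

```python
import numpy as np

rng = np.random.default_rng(1)

s = rng.standard_normal((3, 1000))  # independent components
A = rng.standard_normal((3, 3))     # mixing matrix (invertible with probability 1)
x = A @ s

M = rng.standard_normal((3, 3))     # invertible transformation (a.s.)
z = M @ x                           # z = (MA) s, so the mixing matrix is now MA

# Inverting the new mixing matrix MA recovers exactly the same components.
s_rec = np.linalg.solve(M @ A, z)
print(np.allclose(s_rec, s))        # True
```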
Problem 7.6
If there are more independent components than observed mixtures, we cannot, in general, recover the values of the independent components. Finding them would require solving a set of linear equations with more unknowns than equations, and such systems have infinitely many solutions.
If there are more observed mixtures than independent components, the situation is reversed: now we have more equations than unknowns. If the data really is generated by mixing the independent components, the equations must have a solution, and thus we can recover the components.
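In the overdetermined case, the components can be recovered by solving the linear system, for example by least squares (a sketch with illustrative dimensions; when the data exactly follows x = As, the residual is zero and the solution is exact):

```python
import numpy as np

rng = np.random.default_rng(2)

n_comp, n_mix = 3, 5                       # more mixtures than components
s = rng.standard_normal((n_comp, 500))
A = rng.standard_normal((n_mix, n_comp))   # tall mixing matrix, full column rank a.s.
x = A @ s

# More equations than unknowns: least squares recovers the components
# exactly when the data truly follows x = As.
s_rec, res, rank, sv = np.linalg.lstsq(A, x, rcond=None)
print(np.allclose(s_rec, s))               # True
```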
Problems for Chapter 8
Problem 8.2
kurt(y) = E{y^4} − 3(E{y^2})^2  (88)

kurt(w^T z) = E{(w^T z)^4} − 3(E{(w^T z)^2})^2  (89)

|kurt(w^T z)| = sign(kurt(w^T z)) (E{(w^T z)^4} − 3(E{(w^T z)^2})^2)  (90)

∂|kurt(w^T z)| / ∂w
  = sign(kurt(w^T z)) ∂(E{(w^T z)^4} − 3(E{(w^T z)^2})^2) / ∂w
  = sign(kurt(w^T z)) (E{4(w^T z)^3 z} − 3 · 2 E{(w^T z)^2} E{2(w^T z) z})
  = 4 sign(kurt(w^T z)) (E{z(w^T z)^3} − 3 ||w||^2 E{zz^T} w)
  = 4 sign(kurt(w^T z)) (E{z(w^T z)^3} − 3 ||w||^2 w)  (91)

The last two steps use E{(w^T z)^2} = w^T E{zz^T} w = ||w||^2 and E{zz^T} = I, which hold because z is whitened.
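The gradient expression can be checked numerically against finite differences of |kurt(w^T z)| (a sketch; with a finite sample the expectations are only approximate, so the check is done with the sample version of the second-to-last line of (91), i.e. before substituting E{zz^T} = I):

```python
import numpy as np

rng = np.random.default_rng(3)

# Independent unit-variance Laplace (super-Gaussian) components.
z = rng.laplace(size=(2, 100000)) / np.sqrt(2)
w = np.array([0.8, 0.6])

def kurt_abs(w):
    y = w @ z
    return abs(np.mean(y ** 4) - 3 * np.mean(y ** 2) ** 2)

# Sample version of the gradient of |kurt(w^T z)| with respect to w:
# 4 sign(kurt) (E{z(w^T z)^3} - 3 E{(w^T z)^2} E{z(w^T z)}).
y = w @ z
k = np.mean(y ** 4) - 3 * np.mean(y ** 2) ** 2
grad = 4 * np.sign(k) * (np.mean(z * y ** 3, axis=1)
                         - 3 * np.mean(y ** 2) * np.mean(z * y, axis=1))

# Central finite-difference gradient for comparison.
eps = 1e-5
fd = np.array([(kurt_abs(w + eps * e) - kurt_abs(w - eps * e)) / (2 * eps)
               for e in np.eye(2)])
print(np.allclose(grad, fd, atol=1e-3))  # True
```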
HUT, Neural Networks Research Centre 14