
6 Dimensionality reduction
This chapter covers
- t-distributed stochastic neighbor embedding
- Multidimensional scaling
- Uniform manifold approximation and projection
- Python implementations of the algorithms
Life is really simple, but we insist on making it complicated.
Simplicity is a virtue—both in life and in data science. We have discussed many algorithms so far: some quite simple, others more involved. In part 1 of the book, we studied simpler clustering algorithms, and in the last chapter, we examined advanced ones. Similarly, we studied dimensionality reduction algorithms such as principal component analysis (PCA) in chapter 3. Continuing in the same vein, this chapter covers three advanced dimensionality reduction techniques.
The advanced topics in this and the next part of the book are meant to prepare you for complex problems. Although you can reach for these advanced solutions directly, it is always advisable to start with a classical technique like PCA for dimensionality reduction. Only if that solution does not adequately address the problem should you move on to the advanced ones.
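To make this workflow concrete, here is a minimal sketch of the "classical first, advanced second" approach using scikit-learn, which provides both PCA and t-SNE (t-SNE is covered in detail later in this chapter). The dataset, subset size, and parameter choices here are illustrative assumptions, not a prescription.

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

X, y = load_digits(return_X_y=True)
X = X[:300]  # small illustrative subset to keep the example fast

# Step 1: start with the classical baseline -- PCA down to 2 components
pca = PCA(n_components=2)
X_pca = pca.fit_transform(X)
print("Variance explained by 2 PCs:", pca.explained_variance_ratio_.sum())

# Step 2: if the linear projection is not informative enough,
# try an advanced non-linear technique such as t-SNE
tsne = TSNE(n_components=2, random_state=0)
X_tsne = tsne.fit_transform(X)
print("t-SNE embedding shape:", X_tsne.shape)
```

In practice you would inspect a scatter plot of `X_pca` first; only if its classes overlap badly would you pay the extra computational cost of t-SNE.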