1. EMBEDDINGS, LATENT SPACE, AND REPRESENTATIONS
In deep learning, we often use the terms embedding vectors, representations, and latent space. What do these concepts have in common, and how do they differ?
While these three terms are often used interchangeably, we can make subtle distinctions between them:
- Embedding vectors are learned representations of input data in which similar items are positioned close to each other.
- Latent vectors are intermediate, typically compressed representations of input data that are not directly observed.
- Representations are encoded versions of the original input, a broader term that covers both of the above.
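The defining property of embedding vectors, that similar items lie close together, can be illustrated with a minimal sketch. The toy vocabulary and embedding values below are assumptions for illustration, not taken from any trained model; in practice such vectors would be learned during training.

```python
import numpy as np

# Hypothetical 3-dimensional embeddings for a toy vocabulary
# (hand-picked values for illustration; real embeddings are learned).
vocab = ["cat", "kitten", "car"]
embeddings = np.array([
    [0.90, 0.80, 0.10],  # cat
    [0.85, 0.75, 0.20],  # kitten: semantically close to cat
    [0.10, 0.20, 0.95],  # car: semantically distant from cat
])

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors; 1.0 means identical direction.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

sim_cat_kitten = cosine_similarity(embeddings[0], embeddings[1])
sim_cat_car = cosine_similarity(embeddings[0], embeddings[2])

# Similar items end up close together in the embedding space.
assert sim_cat_kitten > sim_cat_car
```

Here cosine similarity serves as the distance notion, a common choice for embeddings; Euclidean distance would illustrate the same point.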
The following sections explore the relationship between embeddings, latent vectors, and representations and how each functions ...