Word Embeddings

Enter words to see how they are represented as vectors in 3D space

Control reference words

ON: Stable positions; PCA axes are fitted once on the reference words, and added words are positioned relative to them
OFF: Dynamic positions; PCA dimensionality reduction is recomputed over all displayed words, so points can shift as words are added (both modes are sketched below)
Reference words:
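
A minimal sketch of the difference between the two modes, assuming scikit-learn's PCA and random placeholder vectors in place of real embeddings:

```python
import numpy as np
from sklearn.decomposition import PCA

# Placeholder embeddings: rows are words, columns are the 1536 dimensions.
rng = np.random.default_rng(0)
reference_vecs = rng.normal(size=(10, 1536))  # fixed reference words
added_vecs = rng.normal(size=(3, 1536))       # words the user adds

# ON: fit PCA once on the reference words only. Added words are projected
# into that fixed basis, so existing points never move when a word is added.
stable_pca = PCA(n_components=3).fit(reference_vecs)
stable_coords = stable_pca.transform(added_vecs)

# OFF: refit PCA on everything currently displayed. The axes themselves are
# recomputed, so every point can shift whenever the word set changes.
all_vecs = np.vstack([reference_vecs, added_vecs])
dynamic_coords = PCA(n_components=3).fit_transform(all_vecs)
```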

Added Words (0)

Dimensionality Reduction to 3D

Word Arithmetic

Process Overview:

Input: "king - man + woman"
Parse: [king] [-] [man] [+] [woman]
Vectors: Get 1536D embeddings for each word
Vector Arithmetic: king[1536] - man[1536] + woman[1536] = result[1536]
Search: rank vocabulary words by cosine similarity to the result (see the sketch below)
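
The steps above can be sketched as follows, assuming embeddings are already loaded into a word-to-vector dictionary (the `vocab` parameter and the parsing logic are illustrative, not the demo's actual code):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def word_arithmetic(expression: str, vocab: dict[str, np.ndarray]) -> list[tuple[str, float]]:
    # Parse: "king - man + woman" -> [("+", "king"), ("-", "man"), ("+", "woman")]
    tokens = expression.split()
    signed = [("+", tokens[0])]
    for op, word in zip(tokens[1::2], tokens[2::2]):
        signed.append((op, word))

    # Vector arithmetic: sum the signed 1536-dimensional embeddings.
    result = sum((vocab[w] if op == "+" else -vocab[w]) for op, w in signed)

    # Search: rank vocabulary words by cosine similarity to the result,
    # excluding the input words themselves.
    inputs = {w for _, w in signed}
    scores = [(w, cosine_similarity(result, v)) for w, v in vocab.items() if w not in inputs]
    return sorted(scores, key=lambda s: s[1], reverse=True)
```

With a vocabulary embedded by a model like text-embedding-3-small, word_arithmetic("king - man + woman", vocab) should rank "queen" near the top.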

How Word Embeddings Work

What are Word Embeddings?

Word embeddings are dense vector representations of words where similar words have similar vectors. Each word is represented as a point in high-dimensional space.
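
Similarity between embedding vectors is typically measured with cosine similarity; here is a toy example with made-up 3-dimensional vectors (real embeddings have far more dimensions):

```python
import numpy as np

def cosine_similarity(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Toy 3-dimensional vectors (the demo's real embeddings have 1536 dimensions).
cat = np.array([0.9, 0.8, 0.1])
dog = np.array([0.8, 0.9, 0.2])
car = np.array([0.1, 0.2, 0.9])

print(cosine_similarity(cat, dog))  # ~0.99: similar words point the same way
print(cosine_similarity(cat, car))  # ~0.30: unrelated words diverge
```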

This demo uses OpenAI's text-embedding-3-small model to convert each word into a 1536-dimensional vector, then reduces those vectors to 3D with PCA for visualisation.
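
Putting the two steps together, here is a minimal sketch of the embed-then-reduce pipeline, assuming the official openai Python client, scikit-learn, and an OPENAI_API_KEY set in the environment:

```python
import numpy as np
from openai import OpenAI
from sklearn.decomposition import PCA

words = ["king", "queen", "man", "woman", "apple", "banana"]

# 1. Embed: one 1536-dimensional vector per word.
client = OpenAI()  # reads OPENAI_API_KEY from the environment
response = client.embeddings.create(model="text-embedding-3-small", input=words)
vectors = np.array([item.embedding for item in response.data])  # shape (6, 1536)

# 2. Reduce: project the 1536 dimensions down to 3 with PCA.
coords = PCA(n_components=3).fit_transform(vectors)  # shape (6, 3)

for word, (x, y, z) in zip(words, coords):
    print(f"{word}: ({x:.3f}, {y:.3f}, {z:.3f})")
```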