With this demo you can inspect the new AlphaEarth Foundations embeddings from Google DeepMind and query them using a drawn polygon. These embeddings are a general-purpose vector representation of every 10 m × 10 m area of Earth's land surface, trained on a large corpus of satellite data and text. One of the most promising applications of general-purpose embeddings is that they enable similarity search and change detection with no training or fine-tuning required. Draw a polygon on the map to select an area of interest, then click the "🔎 Search" button to find similar areas in the dataset!
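For intuition, both tasks reduce to comparing vectors. The toy sketch below (plain NumPy, not part of the demo's code) flags change between two dates as a drop in per-pixel cosine similarity; the 64-dimensional, unit-length embeddings and the 0.5 threshold are assumptions for illustration.

```python
# Toy per-pixel change detection with embedding rasters (illustrative only).
# Assumes unit-length embedding vectors, so cosine similarity is a dot product.
import numpy as np

def similarity(emb_a: np.ndarray, emb_b: np.ndarray) -> np.ndarray:
    """Per-pixel cosine similarity between two (H, W, D) embedding rasters."""
    return np.einsum("hwd,hwd->hw", emb_a, emb_b)

# Random stand-ins for two years of embeddings, normalised to unit length.
emb_2022 = np.random.default_rng(0).normal(size=(256, 256, 64))
emb_2023 = np.random.default_rng(1).normal(size=(256, 256, 64))
emb_2022 /= np.linalg.norm(emb_2022, axis=-1, keepdims=True)
emb_2023 /= np.linalg.norm(emb_2023, axis=-1, keepdims=True)

# Pixels whose embeddings drifted apart between the two dates.
change_mask = similarity(emb_2022, emb_2023) < 0.5  # threshold is illustrative
```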
AlphaEarth Foundations is licensed under CC-BY-4.0; the dataset is produced by Google and Google DeepMind and is available from Google Earth Engine.
Similarity search uses the embeddings of the drawn polygon to find similar areas in the dataset: the closest 'neighbours' in embedding space, i.e. the areas with the most similar vector representations. Adjust the number of neighbours (k) to control how many results are returned. Search is sensitive to the size of the drawn polygon, so expect results at a similar scale to the drawn area; a concept sketch follows below.
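Conceptually, the k results are just the nearest stored vectors to the query. The sketch below (brute-force NumPy, not the demo's actual backend) mean-pools the pixel embeddings inside the drawn polygon into a single query vector and ranks indexed tile vectors by cosine similarity; array names and shapes are assumptions.

```python
# Toy k-nearest-neighbour search in embedding space (illustrative only).
import numpy as np

def top_k_similar(polygon_pixels: np.ndarray, tiles: np.ndarray, k: int = 10):
    """polygon_pixels: (N, D) embeddings of pixels inside the drawn polygon.
    tiles: (M, D) mean-pooled, unit-length tile embeddings in the index."""
    query = polygon_pixels.mean(axis=0)
    query /= np.linalg.norm(query)        # re-normalise the pooled query vector
    scores = tiles @ query                # cosine similarity against every tile
    idx = np.argsort(scores)[::-1][:k]    # indices of the k closest neighbours
    return idx, scores[idx]
```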
Some technical details: the embeddings were extracted from Google Earth Engine using Apache Beam running on GCP Dataflow, together with xarray-beam and xee. They were then mean-pooled through a pyramid of spatial reductions and stored in a Milvus database, and a FastAPI application handles the similarity search. The UI is built with React and Next.js and served on Vercel; deployment is handled with GitHub Actions and Terraform.
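As a rough illustration of the pooling and storage steps, the sketch below coarsens an embedding raster into a pyramid of mean-pooled tiles with xarray and writes each level's vectors to Milvus. The collection name, pooling factors, dimensionality, and coordinate layout are assumptions for the sketch, not the project's actual pipeline code.

```python
# Hedged sketch: build a pyramid of mean-pooled embedding tiles with xarray
# and store each level's vectors in Milvus. Names and shapes are hypothetical.
import numpy as np
import xarray as xr
from pymilvus import MilvusClient

# Toy embedding raster: (y, x, dim) at 10 m resolution.
emb = xr.DataArray(
    np.random.default_rng(0).normal(size=(512, 512, 64)).astype("float32"),
    dims=("y", "x", "dim"),
)

client = MilvusClient(uri="http://localhost:19530")  # assumed local instance
client.create_collection(collection_name="alphaearth_tiles", dimension=64)

rows, factor = [], 8  # start with 80 m tiles, double the tile size each level
for level in range(4):
    pooled = emb.coarsen(y=factor, x=factor, boundary="trim").mean()
    vectors = pooled.values.reshape(-1, 64)
    vectors /= np.linalg.norm(vectors, axis=1, keepdims=True)  # keep unit length
    rows += [
        {"id": len(rows) + i, "vector": v.tolist(), "level": level}
        for i, v in enumerate(vectors)
    ]
    factor *= 2

client.insert(collection_name="alphaearth_tiles", data=rows)
```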