AI-powered Image Search in Just a Few Lines of Code
Hot on the heels of the Text Indexes release, we’re happy to introduce you to Image Indexes, launching into self-service Beta today! We’ve had some big customers running Image Indexes in prod for a while now, and we’re really looking forward to seeing what you all build with it. If you don't have an account yet, sign up now!
Your app just learned to see inside images 👀
We’ve all built some flavor of basic text search in our developer lives. And at one point or another, a lot of us have had to try to build some kind of image retrieval. Usually, the only way to do that is to turn your image search problem into a text search problem: tagging, JOIN tables, and saying a lot of mean things to your CLI that you wish you could take back.
And that’s because searching images for meaning is a hard problem. Fortunately, Objective Search speaks meaning, and it’s pretty phenomenal at it. It understands natural language, and extracts the human meaning behind a search query to find related meaning in the Objects in your Object Store.
Let’s build image search.
Let’s say you’re building search for a catalog of illustration ideas. You’ve decided that each Object in your Object Store has just one property: a primary illustration (illustration). This property holds a URL to a crawlable image your app hosts.
Gotta say, our taste is impeccable:
Before we get building, notice a few things. First, we aren’t adding any string properties to the Objects in our Object Store. It’s just the single crawlable image endpoint. That’s all. No tags. No descriptions. An image in your Object Store is worth at least a thousand words. You can, of course, add whatever you’d like to the Objects in your Object Store, but Objective Search can work its magic with just an image endpoint.
Now, jump into your CLI and grab our new Python SDK (or TypeScript, if that’s your flavor!).
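The package name below is an assumption for illustration, so check the SDK docs for the exact install command:

```bash
# Assumed package name; see the SDK docs for the real one
pip install objective-sdk
```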
And then let’s jump into your favorite .py file, and build the basics of your search.
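Here’s a minimal sketch of what that might look like. The import path, client constructor, and upsert method below are assumptions for illustration (and the image URLs are placeholders), so check the SDK reference for the exact calls:

```python
# Sketch only: the client and method names here are assumptions, not the exact SDK surface.
from objective import Client  # assumed import path

client = Client(api_key="YOUR_API_KEY")

# Each Object carries a single property: a URL to a crawlable illustration your app hosts.
illustrations = [
    {"illustration": "https://yourapp.example.com/illustrations/rainbow-arcs.png"},
    {"illustration": "https://yourapp.example.com/illustrations/rainbow-swatches.png"},
    {"illustration": "https://yourapp.example.com/illustrations/three-people-in-a-row.png"},
]

# Push the Objects into your Object Store.
client.object_store.upsert_objects(illustrations)
```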
Your Object Store now has some Objects in it, each with its primary illustration. In order to search your Object Store, we need to create an Image Index attached to it.
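Creating the Index might look something like this. The method name and parameters here are assumptions, so double-check the docs for the exact shape:

```python
# Sketch: create an Image Index on the Object Store and point it at the
# property that holds the crawlable image URL. Parameter names are assumptions.
index = client.indexes.create_index(
    index_type="image",
    fields={"crawlable": ["illustration"]},
)
```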
All Objective Search Indexes process (and reprocess!) your Objects automatically. While an Index is processing, you can poll its progress programmatically with the status() function.
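Something along these lines, continuing from the sketch above (the exact return shape may differ):

```python
# Check on the Index while it works through your Objects.
print(index.status())
```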
This returns a queue of in-flight indexing operations that you can use to decide how to wait or proceed:
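For example, a simple wait loop could watch that queue until nothing is left pending. The “pending” field below is an assumption about what status() returns:

```python
import time

# Keep polling until there are no in-flight indexing operations left.
# The "pending" key is an assumption about the status() payload.
while index.status().get("pending", 0) > 0:
    time.sleep(5)
```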
And now all that’s left is to search for literally anything you can think of:
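A query might look something like this. The search method and the shape of the response are assumptions, so adjust to the SDK reference:

```python
# Search the Image Index with plain natural language.
results = index.search(query="rainbow of colors", limit=10)

# The field names below are assumptions about the response shape.
for hit in results.get("results", []):
    print(hit)
```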
When your Image Index processes a query, it ‘sees’ inside all of the images in your Objects, prioritizing the most relevant Objects first in the result set. The first item in the JSON response for this query is one of the two images with a rainbow full of colors in our Object Store:
To help visualize this, jump into Console, and under “Indexes” find your new Image Index. You’ll see a “Search Index” button - click it! The Query Browser in Console is a great way to quickly visualize your results. A search for “rainbow of colors” should prioritize results like this:
And if you try something weirder and more open to interpretation, like “people in a row”, we see that (very abstract) illustration of three people in a row prioritized first:
Pretty cool, right?
Obviously, this is a pretty simple example of what’s possible when you load your Object Store with hundreds, thousands, or millions of Objects and put all of this power right in your app’s search bar for your users.
You can imagine how powerful your app’s search can be when it understands search queries full of human meaning like:
- “black and white photos of street buskers”
- “renaissance architecture”
- “snowy city at night”
- “an eerie farmhouse”
We really can’t wait to see what you all build with Image Indexes, and to show you what we have in the hopper next!