March 21, 2026

Fast WooCommerce Search: An Experiment with RediSearch and SHORTINIT

woocommerce performance search redis wordpress

WooCommerce search is slow. Not because someone wrote it badly — that’s just how WordPress works. Every keystroke in the search field boots the entire engine: plugins, themes, hooks, translations, sessions. By the time WordPress touches the database with your query, 100-300 milliseconds have already passed. The same problem affects checkout pages — it’s a systemic WordPress issue, not a search-specific one.

For autocomplete, where a user types a letter every 50-150 ms, that’s an eternity.

I’ve been experimenting with a different approach: RediSearch as the search engine + SHORTINIT as an ultra-fast WordPress bootstrap. A custom plugin, built from scratch. Early results on a cheap VPS: ~54 ms median full round-trip. Redis itself responds in under 10 ms.

In this post I’ll walk through what I’ve built so far, how it works, and why I think it’s a direction worth exploring.

Why Is the Default WooCommerce Search Slow?

Let’s start with what actually happens when a customer types something into your store’s search field.

A standard WordPress request looks roughly like this:

  1. PHP loads wp-config.php
  2. Loads every active plugin (even the ones that have nothing to do with search)
  3. Loads the theme with all its hooks
  4. Initializes translations, sessions, permissions
  5. Only now fires the SQL query to the database

And that query? By default, WooCommerce searches post titles and content using LIKE '%search_term%'. A leading wildcard means MySQL can’t use an index — it has to scan every row in the table.

Result: 30-50 MB of RAM and hundreds of milliseconds before your customer sees the first suggestion.

Plugins like FiboSearch or SearchWP solve part of this problem (FiboSearch Pro uses SHORTINIT internally), but none of the popular plugins use RediSearch as a full-text search engine.

RediSearch — What It Is and Why It Matters

Redis is a key-value database that keeps everything in RAM. It’s commonly used in WordPress as a cache for wp_options and sessions.

RediSearch is a Redis module that adds full-text search capabilities. It ships bundled in Redis Stack — but it’s still a module, not a separate database. It enables indexing, prefix matching, and typo tolerance on top of data you already keep in Redis.

Why does this matter? Because Redis operates on RAM and responds in microseconds. In my test on an index of 2,500 WooCommerce products, a typical FT.SEARCH response time is 2-5 ms per query. Practically always under 10 ms.

Compare that to MySQL LIKE, which on the same product count can take 50-200 ms.
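The plugin itself is PHP, but the shape of a prefix query is easy to sketch in Python. The index name idx:products, the title field, and the escaping set below are illustrative assumptions, not the plugin's actual schema:

```python
# Sketch: turning user input into a RediSearch prefix query for
# autocomplete. Index name "idx:products" and the "title" field
# are assumptions for illustration.

def build_prefix_query(term: str) -> str:
    """Escape user input and build a prefix query like '@title:(dry*)'."""
    # RediSearch treats these characters as query syntax, so they
    # must be backslash-escaped when they appear in user input.
    specials = ',.<>{}[]"\':;!@#$%^&*()-+=~'
    escaped = "".join("\\" + ch if ch in specials else ch for ch in term.strip())
    return f"@title:({escaped}*)"

query = build_prefix_query("dry")
print(query)  # @title:(dry*)

# Against a live Redis Stack instance, the query would be sent roughly as:
#   import redis
#   r = redis.Redis()
#   r.execute_command("FT.SEARCH", "idx:products", query, "LIMIT", "0", "10")
```

The escaping step matters: without it, a customer typing a hyphenated SKU would produce a syntactically invalid query instead of zero results.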

SHORTINIT — WordPress Without WordPress

This is where it gets interesting.

WordPress has a built-in constant called SHORTINIT that very few people know about. When you set it to true before loading wp-settings.php, WordPress loads the bare minimum:

  • wp-config.php (database credentials)
  • $wpdb (database object)
  • Core constants (ABSPATH, WPINC)

That’s it. No plugins. No themes. No hooks. No translations.

Instead of 30-50 MB RAM and 100-300 ms of bootstrap, you get ~5-10 MB and under 5 ms to the first line of your logic. The autocomplete endpoint starts faster than WordPress would take to load the list of active plugins.

The Catch

SHORTINIT means no access to wp_options, get_option(), or virtually any WordPress function beyond the raw $wpdb.

How do I deal with this? Plugin settings (Redis host, port, limits, search strategy) are saved to an auto-generated PHP file as define() constants. The config class detects which mode it booted in and picks one of two separate paths: if the MERIDA_SEARCH_REDIS_HOST constant exists, we’re in SHORTINIT and all config comes from constants (early return). Only when the constants aren’t there does it check whether get_option() even exists, and then reads from the database.

One or the other — never a mix. Unconventional, but it works surprisingly well.
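The constants-or-database split can be sketched like this (the plugin is PHP; this Python sketch uses a dict in place of define()'d constants and a callable in place of get_option(), and every key name except MERIDA_SEARCH_REDIS_HOST is an assumption):

```python
# Sketch of the two-path config resolution described above. A dict
# stands in for PHP define()'d constants; `load_option` stands in
# for WordPress get_option(). Key names other than
# MERIDA_SEARCH_REDIS_HOST are illustrative assumptions.

def resolve_config(constants: dict, load_option=None) -> dict:
    # SHORTINIT path: the generated constants file was loaded, so
    # every setting comes from constants. Early return, no database.
    if "MERIDA_SEARCH_REDIS_HOST" in constants:
        return {
            "host": constants["MERIDA_SEARCH_REDIS_HOST"],
            "port": constants.get("MERIDA_SEARCH_REDIS_PORT", 6379),
        }
    # Full-WordPress path: fall back to the options table, but only
    # if the options API actually exists in this runtime.
    if load_option is None:
        raise RuntimeError("no constants and no options API available")
    return {
        "host": load_option("merida_search_redis_host", "127.0.0.1"),
        "port": load_option("merida_search_redis_port", 6379),
    }

# SHORTINIT mode: constants win, the database is never touched.
print(resolve_config({"MERIDA_SEARCH_REDIS_HOST": "127.0.0.1"}))
# {'host': '127.0.0.1', 'port': 6379}
```

The point of the early return is that the two paths can never interleave: a request either trusts the generated file completely or trusts the database completely.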

Early Results from a Cheap VPS

I tested everything on a cheap VPS with a ~2 GHz processor. Deliberately not on a rocket — I wanted to see what this looks like under realistic conditions for a store that doesn’t spend a fortune on hosting. As I covered in why hosting should be step one, server quality sets the baseline for everything else.

First Observations

Disclaimer: these are sample requests from DevTools, not a full benchmark. Proper performance testing (including proper backend measurement methodology) is planned after confirming usability — meaning after I verify that search results actually make sense to the user. For now, I’m testing whether the architecture holds up.

What I’m seeing at this stage: median autocomplete response time (full cycle: client, server, Redis, response) sits around 50-60 ms. ~80% of requests come in under 90 ms. Redis itself responds practically always under 10 ms — the bottleneck is the network, not the application.

Three Ideas That Made a Difference

I don’t want to turn this into a full tutorial (not yet), but a few concepts are worth flagging — they might be useful to you even in a different context.

Precise First, Loose Later

Instead of immediately firing fuzzy search (which is slower and too liberal — “soap” fuzzy-matches “soup”), I use three passes:

  1. Exact prefix (dryer*) — match words starting with the typed phrase. Fastest and most accurate.
  2. Query trimming — remove filler words (“to”, “for”, “the”, “a”) and search again.
  3. Typo tolerance (%dryer%) — allows minor spelling mistakes; fires ONLY when the previous passes returned nothing.

The key rule: if exact matching returns results, typo tolerance never fires. Zero wasted work.
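The cascade can be sketched as follows (Python for illustration; the stop-word list and query shapes are assumptions, and `search` stands in for whatever executes the FT.SEARCH call):

```python
# Sketch of the "precise first, loose later" cascade. `search` is a
# stand-in for the function that runs an FT.SEARCH query; the filler
# list and query shapes are illustrative assumptions.

FILLER = {"to", "for", "the", "a"}

def cascade_search(term: str, search) -> list:
    # Pass 1: exact prefix match (fastest, most precise).
    results = search(f"{term}*")
    if results:
        return results
    # Pass 2: drop filler words and retry the prefix match.
    trimmed = " ".join(w for w in term.split() if w.lower() not in FILLER)
    if trimmed and trimmed != term:
        results = search(f"{trimmed}*")
        if results:
            return results
    # Pass 3: typo tolerance (RediSearch fuzzy syntax: %word%),
    # fired only when everything precise came back empty.
    return search(" ".join(f"%{w}%" for w in term.split()))

# A fake engine that only answers the prefix query "dryer*":
fake = lambda q: ["Hair Dryer 2000"] if q == "dryer*" else []
print(cascade_search("dryer", fake))  # pass 1 hits; fuzzy never runs
```

Because each pass short-circuits, the expensive fuzzy query only ever runs for queries that would otherwise return an empty dropdown.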

Diacritics Normalization

Product titles are indexed twice: with diacritics and without. A user typing “rak” matches “rąk” (Polish for “hands”). In any e-commerce market with non-ASCII characters, this is essential — customers routinely skip special characters when searching.
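A minimal sketch of the folding step, assuming Unicode NFD decomposition is used (the plugin's actual implementation may differ):

```python
import unicodedata

# Sketch of the double-index idea: each title is stored both as-is
# and in an ASCII-folded form, so "rak" can match "rąk".

def fold_diacritics(text: str) -> str:
    # NFD splits letters from their combining marks; dropping the
    # marks leaves the base letters. Caveat: Polish "ł" is a distinct
    # letter with no decomposition, so a real implementation needs a
    # small extra mapping table (e.g. ł → l) on top of this.
    decomposed = unicodedata.normalize("NFD", text)
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

print(fold_diacritics("rąk"))  # rak
```

At index time both forms go into the document; at query time the user's input is folded the same way, so either spelling lands on the same product.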

Re-Ranking Results After Redis

RediSearch sorts results by how well they match the query textually. But it knows nothing about the store’s context. So after Redis responds, PHP applies corrections: an exact SKU hit goes to the top, out-of-stock products drop in ranking, promoted products rank higher, and titles starting with the search phrase get a boost.

Redis returns 5x more candidates than needed, PHP recalculates scores and trims to the limit.
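A rough sketch of that re-ranking pass (field names, weights, and the candidate shape are illustrative assumptions, not the plugin's code):

```python
# Sketch of post-Redis re-ranking: take the oversized candidate set,
# adjust scores with store context, sort, and trim to the limit.

def rerank(candidates: list, query: str, limit: int) -> list:
    q = query.lower()
    def score(c):
        s = c["score"]                     # RediSearch textual score
        if c.get("sku", "").lower() == q:
            s += 1000                      # exact SKU hit goes straight to the top
        if c["title"].lower().startswith(q):
            s += 10                        # prefix-of-title boost
        if c.get("promoted"):
            s += 5                         # promoted products rank higher
        if not c.get("in_stock", True):
            s -= 50                        # out-of-stock drops in ranking
        return s
    return sorted(candidates, key=score, reverse=True)[:limit]

candidates = [
    {"title": "Hair Dryer 2000", "score": 3.0, "sku": "HD-2000", "in_stock": False},
    {"title": "Dryer Stand", "score": 2.0, "in_stock": True},
    {"title": "Travel Kit", "score": 1.0, "sku": "dry", "in_stock": True},
]
print([c["title"] for c in rerank(candidates, "dry", 2)])
# ['Travel Kit', 'Dryer Stand']
```

Note how the out-of-stock product that Redis ranked first ends up cut entirely: textual relevance alone would have put a product the customer can't buy at the top of the dropdown.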

When Does This Make Sense?

Let’s be honest — for 90% of WooCommerce stores, this is overkill. If you have 200 products and a few dozen daily visitors, FiboSearch (free or pro) will do the job.

This approach starts making sense when:

  • You have thousands of products and want sub-100 ms round-trip search
  • You have your own server (VPS or dedicated) where you can install the RediSearch module
  • You want full control over search logic (synonyms, field weights, re-ranking)
  • You care about resource efficiency — the autocomplete endpoint uses ~5-10 MB RAM instead of 30-50 MB

On shared hosting this will be difficult — most providers don’t offer Redis with the Search module, and even if Redis is available, loading custom modules is usually not an option. A VPS or dedicated server gives you full control here.

What’s Next?

This is just the beginning. What’s planned:

  • Thorough testing of search result quality — because speed without accuracy is pointless
  • Load testing under heavy traffic (many concurrent users, larger indexes)
  • If the plugin goes to production — a full case study with real data

If this caught your attention, have questions, or want to talk WooCommerce — find me on LinkedIn. You can also leave a comment below.


At SHIFT64 we engineer WooCommerce performance for a living — from ongoing care and monitoring to custom solutions like this one. If your store could be faster — let’s talk.

Written by

Mateusz Zadorozny

SHIFT64 Founder. WooCommerce performance specialist helping store owners achieve faster load times and better conversions.