Guides / Scaling / Servers and clusters

Algolia has one standard server setup:

  • 6 CPU cores
  • 12 threads
  • 128 GB of RAM
  • 2 × 800 GB SSDs in RAID-0

The earliest configuration was already quite powerful: 4 cores, 8 threads, 32 GB of RAM, and 2x120 GB Intel 320-series SSDs. However, these specs didn’t meet Algolia’s indexing requirements.

For example, 4 cores and 8 threads aren’t always enough to handle many operations running in parallel. With 6 cores and 12 threads, the engine can absorb a continuous flow of indexing operations without slowing down search. The upgrade also leaves headroom for managing the servers’ system resources.
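To illustrate the idea of absorbing indexing load without affecting search, here is a minimal sketch using two separate thread pools, one for indexing and one for queries. The pool sizes, function names, and records are illustrative assumptions, not Algolia's actual thread allocation:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical split of the 12 available threads: some reserved for
# indexing, the rest for search, so a burst of indexing jobs never
# queues in front of incoming queries.
INDEXING_THREADS = 4  # assumption, for illustration only
SEARCH_THREADS = 6    # assumption, for illustration only

indexing_pool = ThreadPoolExecutor(max_workers=INDEXING_THREADS)
search_pool = ThreadPoolExecutor(max_workers=SEARCH_THREADS)

def index_record(record: str) -> str:
    # Placeholder for real indexing work.
    return f"indexed:{record}"

def run_search(query: str) -> str:
    # Placeholder for real query processing.
    return f"results for {query}"

# Indexing jobs and searches run on independent pools.
index_futures = [indexing_pool.submit(index_record, r) for r in ["a", "b", "c"]]
search_future = search_pool.submit(run_search, "shoes")

print(search_future.result())  # the search never waits behind indexing jobs
print([f.result() for f in index_futures])
```

The key design point is isolation: because the two pools never share worker threads, a backlog in one cannot starve the other.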

For the disks, Algolia experimented with many kinds of SSD RAID-0 before picking the right one. The current Intel S3500-series SSDs offer much faster storage, removing serious performance bottlenecks.

To strike the right balance between RAM and disk space for both caching and data capacity, Algolia ran countless performance tests across several use cases and engine tweaks before settling on the right amount of memory and disk, no more, no less. For example, the memory is sized to process large indices while leaving enough room to perform searches in memory.

To get an idea of the importance of balancing RAM size with the right number of cores and threads, consider large indices. For performance reasons, Algolia loads all indices in memory. Additionally, larger indices are broken into smaller pieces, and each piece (shard) receives a dedicated thread on the same server. This setup allows for faster updates and searches. Meanwhile, other threads process searches, and still others monitor the system and manage consensus.
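The shard-per-thread idea above can be sketched as follows. This is not Algolia's engine code; the shard count, hashing scheme, and `build_shard` placeholder are assumptions chosen to show how records can be assigned to shards and each shard updated on its own worker thread:

```python
import zlib
from concurrent.futures import ThreadPoolExecutor

NUM_SHARDS = 4  # assumption: shard count chosen for illustration

def shard_for(record_id: str) -> int:
    # Deterministically map a record to one of the shards.
    return zlib.crc32(record_id.encode()) % NUM_SHARDS

def build_shard(shard_id: int, records: list) -> tuple:
    # Placeholder for building/updating one shard's in-memory structures.
    return shard_id, len(records)

# Partition the index into shards.
records = [f"record-{i}" for i in range(20)]
shards = {s: [] for s in range(NUM_SHARDS)}
for r in records:
    shards[shard_for(r)].append(r)

# One dedicated worker per shard, so shards are updated in parallel.
with ThreadPoolExecutor(max_workers=NUM_SHARDS) as pool:
    results = list(pool.map(lambda item: build_shard(*item), shards.items()))

print(sorted(results))
```

Because each shard is touched by exactly one worker, shard updates need no locking between themselves, which is part of what makes per-shard threads attractive for both updates and searches.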

In short, Algolia machines need a high baseline of RAM and cores, plus a reasonable buffer for flexibility.

To go into far more detail about the engine, take a look at the Inside the Engine blog series.
