For big sites with a lot of traffic and/or a large number of products (50,000 or more), the database might become a bottleneck. In this case it makes sense to move the cache to a different server to decrease the load on the main database. Aimeos provides a Redis cache extension for storing the cached parts of a page in a Redis server, which keeps the cached entries in memory.
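The exact keys depend on the host framework and the version of the Redis cache extension, but the general idea is to point the cache resource at the Redis instance. A minimal sketch, assuming a `resource/cache/redis` configuration key and a placeholder host name (check the extension's README for the exact keys of your setup):

```
resource/cache/redis = tcp://cache.example.com:6379
```

With this resource in place, Aimeos stores and reads cached page parts from the Redis server instead of the main database.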
The hardware that hosts the Redis server should be in the same network as the web servers to minimize the latency for each request. Usually, the transfer over the network takes much longer than the lookup of the data in the Redis server itself. Because a request is made for each shop component on a page, keep the round trip time as low as possible (ideally below 5ms) to get the best performance.
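You can verify the round trip time from a web server with the built-in latency test of `redis-cli` (the host name below is a placeholder for your Redis server):

```
# prints min/max/avg round trip time in milliseconds, updated continuously
redis-cli -h cache.example.com --latency
```

If the average value is well below 5ms, network latency won't be the limiting factor for page delivery.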
By default, if a product is added, changed or deleted, all cache entries that contain products will be deleted and regenerated on the next request. Even though Aimeos is fast without caching, regenerating the cache causes at least 50% more load on the server until it is filled again.
Aimeos can be told to use a more fine-grained cache tag system that allows deleting only those cache entries that are (probably) affected. In this case, if a product is added, changed or deleted, only the cache entries for the product lists and the detail view of that product are deleted, instead of the cached detail views of all products. To activate this behavior, use this configuration option:
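The setting below reflects the tag-based invalidation option documented for the Aimeos HTML clients; verify the exact key against the configuration reference for your Aimeos version:

```
client/html/common/cache/tag-all = 1
```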
This configuration option isn't enabled by default because it generates a lot of tags for each cached entry; up to 400 tags per entry are common in a standard setup. The default database cache implementation can't store tags within a single database update and requires a database request for each tag. This can slow down the first request! Therefore, this option is recommended only in combination with the Redis extension, which is able to store the cache entry and all of its tags in one request. If you use another cache implementation, test whether it has any negative performance impact when populating the cache.