Reducing requested field data cache size of [{}] to the maximum allowed size of [{}] – How to solve this Elasticsearch error

Opster Team

Aug-23, Version: 1.3

Briefly, this warning occurs when the requested field data cache size exceeds the maximum size Elasticsearch allows, and Elasticsearch clamps the cache down to that maximum. The field data cache holds field values used for sorting and aggregation operations; if the requested size is too large, it can cause memory pressure. Since the maximum is hard-coded rather than configurable, the fix is to reduce the requested field data cache size in the Elasticsearch settings so it fits under the allowed maximum. Also consider optimizing your queries to use less memory, or using doc values instead of field data for sorting and aggregations.
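The requested size is typically configured through the indices.fielddata.cache.size setting in elasticsearch.yml. A minimal sketch of a conservative configuration is shown below; the specific values are illustrative assumptions, not recommendations:

    # elasticsearch.yml -- static node-level settings, applied on restart

    # Cap the field data cache as a percentage of heap (or an absolute
    # value such as 10gb). Keeping this well below the hard maximum
    # avoids the warning discussed in this guide.
    indices.fielddata.cache.size: 30%

    # Optionally expire idle fielddata entries; this corresponds to the
    # "expire" setting read in the source excerpt further down.
    indices.fielddata.cache.expire: 10m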

This guide will help you check for common problems that cause the log “reducing requested field data cache size of [{}] to the maximum allowed size of [{}]” to appear. To understand the issues related to this log, read the explanation below about the following Elasticsearch concepts: cache, fielddata and indices.

Log Context

The log “reducing requested field data cache size of [{}] to the maximum allowed size of [{}]” is emitted from the class IndicesFieldDataCache.java. We extracted the following snippet from the Elasticsearch source code for those seeking in-depth context:

super(settings);
        this.indicesFieldDataCacheListener = indicesFieldDataCacheListener;
        String size = componentSettings.get("size", "-1");
        long sizeInBytes = componentSettings.getAsMemory("size", "-1").bytes();
        if (sizeInBytes > ByteSizeValue.MAX_GUAVA_CACHE_SIZE.bytes()) {
            logger.warn("reducing requested field data cache size of [{}] to the maximum allowed size of [{}]", new ByteSizeValue(sizeInBytes),
                    ByteSizeValue.MAX_GUAVA_CACHE_SIZE);
            sizeInBytes = ByteSizeValue.MAX_GUAVA_CACHE_SIZE.bytes();
            size = ByteSizeValue.MAX_GUAVA_CACHE_SIZE.toString();
        }
        final TimeValue expire = componentSettings.getAsTime("expire", null);
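Before tuning these settings, it can help to check how much memory fielddata is actually consuming on each node. Below is a minimal sketch using the standard statistics APIs, assuming a node reachable on localhost:9200 (exact URL paths can vary slightly between versions):

    # Per-node fielddata memory usage, in human-readable units
    curl -s 'localhost:9200/_nodes/stats/indices/fielddata?human'

    # Per-field fielddata usage across the cluster
    curl -s 'localhost:9200/_cat/fielddata?v'

If the reported usage is close to the configured cache size, reducing the requested size or switching hot fields to doc values is usually the safer option.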

 
