Briefly, this error occurs when the size of a compressed data stream exceeds the maximum number of bytes Elasticsearch allows, typically because of very large documents or oversized bulk requests. To resolve it, you can increase the http.max_content_length setting in the Elasticsearch configuration file (the default is 100mb), reduce the size of your documents, or split your bulk requests into smaller chunks. Be cautious when raising the limit, as larger payloads increase memory pressure on the nodes.
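If splitting bulk requests is the route you take, the batching can be done client-side before anything reaches Elasticsearch. Below is a minimal sketch in plain Java (the class name BulkChunker and the 5 MB budget are assumptions for illustration, not Elasticsearch APIs): it partitions NDJSON lines into batches that each stay under a byte budget, so no single _bulk request body comes close to http.max_content_length.

import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class BulkChunker {
    // Hypothetical client-side budget: keep each _bulk body well below
    // http.max_content_length (Elasticsearch's default is 100mb).
    private static final long MAX_BATCH_BYTES = 5 * 1024 * 1024; // 5 MB, an assumption

    /** Splits NDJSON lines into batches whose serialized size stays under the budget. */
    static List<List<String>> chunk(List<String> ndjsonLines) {
        List<List<String>> batches = new ArrayList<>();
        List<String> current = new ArrayList<>();
        long currentBytes = 0;
        for (String line : ndjsonLines) {
            long lineBytes = line.getBytes(StandardCharsets.UTF_8).length + 1; // +1 for '\n'
            // Start a new batch before the budget would be exceeded.
            // (A single line larger than the budget still forms its own batch.)
            if (currentBytes + lineBytes > MAX_BATCH_BYTES && !current.isEmpty()) {
                batches.add(current);
                current = new ArrayList<>();
                currentBytes = 0;
            }
            current.add(line);
            currentBytes += lineBytes;
        }
        if (!current.isEmpty()) {
            batches.add(current);
        }
        return batches; // each batch can be sent as a separate _bulk request
    }
}

Each resulting batch can then be sent as its own _bulk request; the budget should be tuned well below the configured limit to leave headroom for request overhead.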
This guide will help you check for common problems that cause the log "compressed stream is longer than maximum allowed bytes [" + streamSize + "]" to appear. To understand the issues related to this log, read the explanation below about the following Elasticsearch concept: plugin.
Log Context
The log "compressed stream is longer than maximum allowed bytes [" + streamSize + "]" is emitted from the class InferenceToXContentCompressor.java. We extracted the following from the Elasticsearch source code for those seeking in-depth context:
static InputStream inflate(String compressedString, long streamSize) throws IOException {
    byte[] compressedBytes = Base64.getDecoder().decode(compressedString.getBytes(StandardCharsets.UTF_8));
    // If the compressed length is already too large, it makes sense that the inflated length would be as well
    // In the extremely small string case, the compressed data could actually be longer than the uncompressed stream
    if (compressedBytes.length > Math.max(100L, streamSize)) {
        throw new CircuitBreakingException(
            "compressed stream is longer than maximum allowed bytes [" + streamSize + "]",
            CircuitBreaker.Durability.PERMANENT
        );
    }
    InputStream gzipStream = new GZIPInputStream(new BytesArray(compressedBytes).streamInput(), BUFFER_SIZE);
    return new SimpleBoundedInputStream(gzipStream, streamSize);
}
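Note that the limit is enforced lazily: the method does not inflate the whole payload up front, it wraps the GZIP stream in SimpleBoundedInputStream so that reads stop at streamSize. A minimal sketch of that bounding pattern, using a hypothetical stand-in class rather than the actual Elasticsearch implementation, could look like this:

import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;

// Hypothetical stand-in for SimpleBoundedInputStream: stops returning data
// once the configured number of bytes has been consumed from the wrapped stream.
class BoundedInputStream extends FilterInputStream {
    private final long maxBytes;
    private long readSoFar = 0;

    BoundedInputStream(InputStream in, long maxBytes) {
        super(in);
        this.maxBytes = maxBytes;
    }

    @Override
    public int read() throws IOException {
        if (readSoFar >= maxBytes) {
            return -1; // signal end-of-stream at the bound
        }
        int b = super.read();
        if (b != -1) {
            readSoFar++;
        }
        return b;
    }

    @Override
    public int read(byte[] buf, int off, int len) throws IOException {
        if (readSoFar >= maxBytes) {
            return -1;
        }
        // Never read past the remaining allowance.
        int allowed = (int) Math.min(len, maxBytes - readSoFar);
        int n = super.read(buf, off, allowed);
        if (n > 0) {
            readSoFar += n;
        }
        return n;
    }
}

Because the guard in the extracted code throws a CircuitBreakingException with PERMANENT durability, retrying the same oversized payload will not succeed; the payload itself has to shrink, or the allowed size has to grow.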