Max number of inference processors reached; total inference processors [{}] – How to solve this Elasticsearch exception

Opster Team

August 2023, Versions: 7.6-7.15

Briefly, this error occurs when the number of inference processors defined across your ingest pipelines reaches the maximum that Elasticsearch allows. Inference processors run trained machine learning models at ingest time, and the limit exists to prevent excessive resource usage. To resolve the issue, either raise the limit (if your system resources allow) or reduce the number of inference processors by consolidating your machine learning pipelines or removing unnecessary ones.
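
If your cluster has headroom, one option is to raise the limit through the cluster settings API. The sketch below assumes the setting key is `xpack.ml.max_inference_processors` (this is the key the error message itself tells you to adjust; verify it against your Elasticsearch version) and a cluster reachable on localhost:9200:

```shell
# Raise the cap on inference processors cluster-wide.
# "persistent" keeps the value across cluster restarts.
# The value 100 is only an example; size it to your available ML resources.
curl -X PUT "localhost:9200/_cluster/settings" \
  -H 'Content-Type: application/json' \
  -d '{
    "persistent": {
      "xpack.ml.max_inference_processors": 100
    }
  }'
```

Raising the limit only postpones the error if pipelines keep accumulating, so it is worth auditing existing pipelines as well.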

This guide will help you check for common problems that cause the log "Max number of inference processors reached; total inference processors [{}]." to appear. To understand the issues related to this log, read the explanation below about the following Elasticsearch concept: plugin.
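
Before raising the limit, it can help to see how many inference processors your ingest pipelines already define and to delete pipelines that are no longer needed. A minimal sketch, assuming a cluster on localhost:9200 (the pipeline name in the delete call is hypothetical):

```shell
# List all ingest pipelines and count how many inference processors they define.
# Each inference processor appears as an "inference" key in the pipeline body.
curl -s "localhost:9200/_ingest/pipeline" | grep -o '"inference"' | wc -l

# Remove an ingest pipeline that is no longer needed
# (replace the name with one of your own unused pipelines).
curl -X DELETE "localhost:9200/_ingest/pipeline/my_old_inference_pipeline"
```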

Log Context

The log "Max number of inference processors reached; total inference processors [{}]." is thrown from the class InferenceProcessor.java. We extracted the following from the Elasticsearch source code for those seeking in-depth context:

 @Override
 public InferenceProcessor create(Map<String, Processor.Factory> processorFactories, String tag,
                                  String description, Map<String, Object> config) {
     if (this.maxIngestProcessors <= currentInferenceProcessors) {
         throw new ElasticsearchStatusException(
             "Max number of inference processors reached; total inference processors [{}]. " +
                 "Adjust the setting [{}]: [{}] if a greater number is desired.",
             RestStatus.CONFLICT,
             currentInferenceProcessors,
             MAX_INFERENCE_PROCESSORS.getKey(),
             maxIngestProcessors);
     }

 
