
Best Practices On Limiting Memory Usage Of Elasticsearch For Local Development Environment

- 1 answer

I have a Ruby on Rails app that I develop locally with Docker Compose. I always work against a copy of our production PostgreSQL database, which is fairly large at around 500,000 rows. Much of that data is indexed for search in Elasticsearch via the Searchkick gem. I want to reproduce the production environment as closely as possible, but Elasticsearch's RAM usage on my laptop is slowing me down: it often exceeds 10 GB.

Has anyone thought of a solution for limiting the RAM usage of Elasticsearch just for local development?

Here are the Elasticsearch settings in my docker-compose.yml file:

  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.11.1
    environment:
      - "discovery.type=single-node"
      - "ELASTIC_USERNAME=elastic"
      - "ELASTIC_PASSWORD=DkIedPPSCb"
      - "xpack.security.enabled=true"
    ports: ['9200:9200', '9300:9300']

Answer

This is more of a Docker configuration issue than an Elasticsearch one. If you are using version 2 of the Compose file format, set the `mem_limit` option on the service:

Ref: Mem and CPU V2

  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.11.1
    environment:
      - "discovery.type=single-node"
      - "ELASTIC_USERNAME=elastic"
      - "ELASTIC_PASSWORD=DkIedPPSCb"
      - "xpack.security.enabled=true"
    ports: ['9200:9200', '9300:9300']
    mem_limit: 1024m
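
A container cap by itself can get the JVM OOM-killed once the heap grows past the limit; Elasticsearch also reads the `ES_JAVA_OPTS` environment variable, so you can pin the JVM heap safely below the container cap. A minimal sketch of the same v2 service with both limits (the 512m heap / 1024m container split is an assumption; tune it to your index size):

```yaml
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.11.1
    environment:
      - "discovery.type=single-node"
      - "ELASTIC_USERNAME=elastic"
      - "ELASTIC_PASSWORD=DkIedPPSCb"
      - "xpack.security.enabled=true"
      # Cap the JVM heap below the container limit so the OS keeps
      # some headroom for Lucene's off-heap file-system cache.
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    ports: ['9200:9200', '9300:9300']
    mem_limit: 1024m
```

Setting `-Xms` and `-Xmx` to the same value is the convention Elasticsearch recommends, so the heap never resizes at runtime.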

If it's v3, set the memory limits under `deploy.resources` like so. (Note: outside of Docker Swarm, older `docker-compose` releases ignored `deploy` limits unless run with the `--compatibility` flag; newer Compose versions apply them directly.)

Ref: Mem and CPU V3

  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.11.1
    environment:
      - "discovery.type=single-node"
      - "ELASTIC_USERNAME=elastic"
      - "ELASTIC_PASSWORD=DkIedPPSCb"
      - "xpack.security.enabled=true"
    ports: ['9200:9200', '9300:9300']
    deploy:
      resources:
        limits:
          cpus: '0.50'
          memory: 1024M
        reservations:
          cpus: '0.25'
          memory: 512M
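
To confirm the limits actually took effect, you can check both the container and the JVM once the stack is up. A sketch (assumes the service container is named `elasticsearch` and uses the credentials from the compose file above):

```shell
# Container-level memory limit and current usage, one-shot
docker stats --no-stream elasticsearch

# JVM heap Elasticsearch actually allocated, via the cat nodes API
curl -u elastic:DkIedPPSCb "localhost:9200/_cat/nodes?h=name,heap.max,ram.max&v"
```

If `heap.max` is larger than your container limit, the JVM heap needs to be lowered (e.g. via `ES_JAVA_OPTS`) or the container will be OOM-killed under load.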
source: stackoverflow.com