
docker swarm with elastic cluster

Ask Time: 2017-09-15T20:11:52    Author: Pásztor Sándor


I am a beginner with Elasticsearch and Docker Swarm. I have spent two weeks learning and trying to prepare a Docker Swarm setup with Elasticsearch. I would like to build an Elasticsearch cluster on Docker Swarm that can be scaled up and down easily. My thinking is that I need Swarm for the scaling and the Elasticsearch cluster to keep the data in sync between the Swarm nodes. For full automation I would also like to use Zen discovery configured with the container host name, because thanks to Swarm's round-robin DNS the "elasticsearch" hostname should return all of the node IPs (a quick way to check this is sketched after the version list below).

  • Docker version: 17.06.2-ce
  • Elasticsearch Docker image: elasticsearch:latest
  • docker-compose file format version: >= 3
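
A quick way to check that assumption once a stack is running is to look the name up from inside any container attached to the same overlay network as the elasticsearch service (the container id below is only a placeholder):

  # <container> = any container on the same overlay network, id taken from `docker ps`
  docker exec -it <container> getent hosts elasticsearch
  docker exec -it <container> getent hosts tasks.elasticsearch
  # the first lookup may return a single virtual IP,
  # the second returns one address per running task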

First I tried to follow this instruction: sematext.com/blog/2016/12/12/docker-elasticsearch-swarm

In that setup nginx-proxy does not work as a service (as part of the docker-compose file), but it does work as a container (docker run). I have no idea what the difference could be. The idea itself was prepared for an earlier version of Docker, and it does not work for me.

The main idea behind this instruction is that discovery.zen.ping.unicast.hosts is set to the service (container) name; Docker Swarm itself does the load balancing, and Elasticsearch can then find the other nodes.
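
Put differently, that instruction boils down to something like the following plain docker service commands (a rough sketch; the network name, replica count and flags here are chosen by me as examples, not copied from the article):

  # overlay network shared by all elasticsearch tasks
  docker network create --driver overlay es-net

  # every replica pings the service name "elasticsearch" for Zen discovery
  docker service create --name elasticsearch --network es-net --replicas 2 \
    elasticsearch:5 \
    elasticsearch \
      -E network.host=0.0.0.0 \
      -E discovery.zen.ping.unicast.hosts=elasticsearch \
      -E discovery.zen.minimum_master_nodes=1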

Because nginx-proxy doesn't work as a service, I tried to follow this instruction instead: derpturkey.com/elasticsearch-cluster-with-docker-engine-swarm-mode/

I defined an nginx service to connect to Elasticsearch and configured all of the parameters as follows:

version: '3'
services:
  elasticsearch:
    image: 'elasticsearch:5'
    command: [ elasticsearch, -E, network.host=0.0.0.0, -E, discovery.zen.ping.unicast.hosts=elasticsearch, -E, discovery.zen.minimum_master_nodes=1 ]
  nginx:
    image: 'nginx:1'
    ports:
      - '9200:9200'
    command: |
      /bin/bash -c "echo '
      server {
        listen 9200;
        add_header X-Frame-Options "SAMEORIGIN";
        location / {
          proxy_pass http://elasticsearch:9200;
          proxy_http_version 1.1;
          proxy_set_header Connection keep-alive;
          proxy_set_header Upgrade $$http_upgrade;
          proxy_set_header Host $$host;
          proxy_set_header X-Real-IP $$remote_addr;
          proxy_cache_bypass $$http_upgrade;
        }
      }' | tee /etc/nginx/conf.d/default.conf && nginx -g 'daemon off;'"
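
(For completeness, this is roughly how I deploy and inspect such a compose file as a swarm stack; the stack name es is only an example:)

  docker stack deploy -c docker-compose.yml es
  docker service ls            # should list es_elasticsearch and es_nginx
  docker service ps es_nginx   # shows whether the nginx task keeps restarting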

It didn't work. Later I made some changes:

  • I created my own image and set the nginx parameters directly in the nginx config file (build and push sketched right after this list).
  • My new docker-compose file is shown further below.
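
(The custom image is built and pushed in the usual way so that every swarm node can pull it; the Dockerfile itself only bakes the proxy config into the image, so it is omitted here:)

  docker build -t dodi1983/nginx:0.1 .
  docker push dodi1983/nginx:0.1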

I tested it and nginx is working: curl -XGET http://elastic:changeme@127.0.0.1:9200/_cluster/state?pretty returns the data.

version: '3'
services:
  elasticsearch:
    image: elasticsearch:latest
    deploy:
      replicas: 2
    ports: ["9300:9300"]
    command: [elasticsearch, -E, network.bind_host=0.0.0.0, -E, discovery.zen.ping.unicast.hosts=elasticsearch, -E, discovery.zen.minimum_master_nodes=1]
  nginx:
    image: 'dodi1983/nginx:0.1'
    ports:
      - 9200:9200
    depends_on:
      - elasticsearch
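
(Scaling up and down, which is the whole point of this setup for me, would then just be the following, again assuming the stack is called es:)

  docker service scale es_elasticsearch=3   # scale up
  docker service scale es_elasticsearch=2   # scale back down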

The results:

  • nginx is working.
  • From the nginx container, DNS resolution of elasticsearch works and requests are load balanced between the two nodes.

Unfortunately, when I ask Elasticsearch about the available nodes, it always returns only the info of the current node (because of the load balancing); I can see the different node ids. Only one node is ever reported, although I thought Zen discovery should have found both. I logged into the nginx container and tried to get the cluster info with curl -XGET http://elastic:changeme@elasticsearch:9200/_cluster/state?pretty; the responses come from different nodes and they are not in one cluster.
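
(A more compact way to see the same thing from inside the nginx container would be to list the nodes each instance knows about; with a working cluster both calls should show two nodes, but here each call would only show the single node that happened to answer:)

  # repeat a few times; swarm balances each request to a different task
  curl -u elastic:changeme 'http://elasticsearch:9200/_cat/nodes?v'
  curl -u elastic:changeme 'http://elasticsearch:9200/_cat/nodes?v'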

Does anyone have any idea or a solution? Thank you.

Author: Pásztor Sándor. Reproduced under the CC BY-SA 4.0 license with a link to the original source and this disclaimer.
Link to original article: https://stackoverflow.com/questions/46239296/docker-swarm-with-elastic-cluster