
Give it a like and make it a habit 👏👏

Preface

This article shows how to quickly set up a Kafka + ZooKeeper environment with Docker and docker-compose.

Environment preparation

This article uses Windows with Docker Desktop for the demonstration and assumes you already have some basic Docker knowledge.

Docker Desktop

Docker Desktop provides a complete desktop environment for learning Docker and brings a lot of convenience to software development.

Docker Desktop includes Docker Engine, Docker CLI client, Docker Compose, Docker Machine and Kitematic.

First, download Docker Desktop from the official Docker website and install it.

docker-compose

Compose is a tool for defining and running multi-container Docker applications. With Compose, you use a YAML file to configure all the services your application needs; then, with a single command, all of those services are created and started from that configuration.

Common Compose commands

Command                              Description
docker-compose up                    Start all containers
docker-compose up -d                 Start all containers in the background
docker-compose up --no-recreate -d   Start containers without recreating ones that already exist
docker-compose up -d test2           Start only the test2 container
docker-compose stop                  Stop the containers
docker-compose start                 Start the containers
docker-compose down                  Stop and remove the containers
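
To make these concrete, here is a minimal sketch of a typical session with the commands above; it assumes a docker-compose.yml already exists in the current directory, and the service name test2 in the table is just a placeholder.

# Start every service defined in docker-compose.yml in the background
docker-compose up -d

# Show the containers managed by this compose project
docker-compose ps

# Stop the containers without removing them, then bring them back
docker-compose stop
docker-compose start

# Stop and remove the containers (add -v to also remove named volumes)
docker-compose down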

docker-compose is installed together with Docker Desktop, so there is no need to install it separately.
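
A quick way to confirm it is available (the version strings shown will differ depending on your Docker Desktop release):

# Compose v1 command bundled with Docker Desktop
docker-compose --version

# Newer Docker Desktop releases also ship Compose as a plugin of the docker CLI
docker compose version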

Write the docker-compose.yml

version: '3'
services:
  zookeeper:
    image: wurstmeister/zookeeper
    volumes:
      - ./data:/data
    ports:
      - "2181:2181"
       
  kafka:
    image: wurstmeister/kafka
    ports:
      - "9092:9092"
    environment:
      # Address advertised to clients; 127.0.0.1 works for clients running on the Docker host
      KAFKA_ADVERTISED_HOST_NAME: 127.0.0.1
      # Maximum message size the broker accepts, in bytes
      KAFKA_MESSAGE_MAX_BYTES: 2000000
      # Topics created at startup, in name:partitions:replicas[:cleanup.policy] form
      KAFKA_CREATE_TOPICS: "Topic1:1:3,Topic2:1:1:compact"
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
    volumes:
      - ./kafka-logs:/kafka
      - /var/run/docker.sock:/var/run/docker.sock
 
  kafka-manager:
    image: sheepkiller/kafka-manager
    ports:
      - "9020:9000"
    environment:
      ZK_HOSTS: zookeeper:2181

Place docker-compose.yml in any directory.
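
Before starting anything, you can optionally check that the file parses correctly; a small sanity check, assuming it is run from the directory containing docker-compose.yml:

# Validate docker-compose.yml and print the fully resolved configuration
docker-compose config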

Run the Docker Compose commands from the Windows terminal

Open the Windows terminal and change into the directory that contains docker-compose.yml.
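
For example, from PowerShell (C:\projects\kafka is just a placeholder path used for this demonstration):

# Change into the directory that holds docker-compose.yml
cd C:\projects\kafka

# Confirm the file is there
dir docker-compose.yml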

Build the services

[root@rameo kafka]# docker-compose build
zookeeper uses an image, skipping
kafka uses an image, skipping

Start the services

[root@rameo kafka]# docker-compose up -d
Starting kafka_kafka_1     ... done
Starting kafka_zookeeper_1 ... done

At this point the containers have started successfully, and we can view both the images and the running containers in the Docker Desktop tool.
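
Beyond the Docker Desktop UI, a quick way to check that the broker is actually reachable is to list topics from inside the kafka container. This is only a rough sketch; the script location and flags depend on the Kafka version baked into the wurstmeister/kafka image:

# List the running compose services and their port mappings
docker-compose ps

# List topics from inside the kafka container
docker-compose exec kafka /opt/kafka/bin/kafka-topics.sh --bootstrap-server localhost:9092 --list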

Conclusion

With this example we can set up Kafka + ZooKeeper very efficiently. By combining the Docker capabilities integrated into Docker Desktop with docker-compose's orchestration, we no longer need to pull, run and configure each image individually; everything is managed centrally in a single docker-compose.yml and started with one command.