
Posts

Showing posts with the label docker

Utilizing Docker Secret

It often happens that we need to provide secrets such as passwords, keys, or other private data to our Docker containers. If we use the docker-compose tool to generate our containers, we can simply put the secrets as environment variables in the docker-compose.yaml file. But if we want to share our configuration with others, they will see those secrets too. To overcome that issue, we can utilize a feature provided by Docker itself: the Docker secret, or simply secret. For instance, say we want to build a container for the PostgreSQL database. The PostgreSQL image allows us to set a custom database password by providing a value for the POSTGRES_PASSWORD or POSTGRES_PASSWORD_FILE variable.

```yaml
services:
  postgres:
    image: postgres
    environment:
      - POSTGRES_PASSWORD=$3cureP4ssword
```

Or, we can utilize Docker bind mounting to store a file and instruct Docker to read the secret information from the mounted file.

services: ...
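As a sketch of the secret-based approach described above, a shareable compose file might look like the following; the secret name `db_password` and the file `db_password.txt` are illustrative assumptions, not names from the original post.

```yaml
# Hypothetical sketch: secret and file names are assumptions.
services:
  postgres:
    image: postgres
    environment:
      # The *_FILE variant tells the image to read the password from a file
      - POSTGRES_PASSWORD_FILE=/run/secrets/db_password
    secrets:
      - db_password

secrets:
  db_password:
    # Compose reads the secret content from this local file,
    # which can stay out of version control
    file: ./db_password.txt
```

Docker mounts each secret at /run/secrets/&lt;name&gt; inside the container, so the password itself never has to appear in the shared docker-compose.yaml.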

Running Docker Command Inside A Docker Container

But why do we need to do this? There are several occasions that make you want to perform this action, especially if you are working on the development of a continuous delivery procedure. For example:

- You want to set up a closed environment (like a container) for a software testing process that requires external applications which can be run as containers.
- You want to build an application in a container and then deploy it on another host using the Docker API only, without the need for shell command execution.

There are two common methods to achieve this objective. The first is binding the Unix socket of the running Docker Engine into the container. The second is installing a separate Docker Engine inside the container. For instance, we will run a container based on an image of Docker 20.10. We can run the following command.

```shell
docker run -v /var/run/docker.sock:/var/run/docker.sock -it --rm docker:20.10
```

Now, you can run any docker command inside the container that you just h...
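The socket-binding method can also be expressed in a compose file, which fits the continuous delivery use case mentioned above; the service name `ci-runner` and its command are illustrative assumptions.

```yaml
# Hypothetical sketch of the socket-binding method in compose form.
services:
  ci-runner:
    image: docker:20.10
    volumes:
      # docker commands inside the container talk to the host's Docker Engine
      - /var/run/docker.sock:/var/run/docker.sock
    # Example command only; a CI runner would execute its pipeline here
    command: docker ps
```

Note that containers started this way are created by the host's Docker Engine, so they appear as siblings of the `ci-runner` container rather than children inside it.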

Setting Up Docker Context

When we want to run a container on a remote Docker Engine host, we can utilize the context feature of Docker. A context allows us to maintain information about several Docker Engine hosts to be accessed remotely from our local Docker Engine host. Adding a record is done by running the following command.

```shell
docker context create yourContextName --docker "host=ssh://user@remote.host"
```

The connection utilizes the SSH protocol, so we need to generate keys for establishing communication with the remote host. After storing the public key value on the remote host, we can spawn a new SSH agent in the current session on our host and add the private key to the agent.

```shell
eval $(ssh-agent -s)
cat /path/to/private/key | ssh-add -
```

Before we can access the remote Docker API, we need to add the remote host's keys to our ~/.ssh/known_hosts file by making an SSH connection for the first time or by using ssh-keyscan. Now, we can access the remote Docker API by specifying the context on the lo...
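Instead of keeping an SSH agent running in every session, the key can also be pinned in the SSH client configuration; a minimal sketch, where the host alias `docker-remote` and the key path are assumptions:

```
# ~/.ssh/config — hypothetical entry; alias and key path are assumptions
Host docker-remote
    HostName remote.host
    User user
    IdentityFile ~/.ssh/id_ed25519
```

With this entry in place, the context can be created as `docker context create yourContextName --docker "host=ssh://docker-remote"`, and the SSH client resolves the user, host, and key automatically.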

Serving Single-Page React App with Docker

If you build a single-page application using React and your app only gets backend data through API access, it is convenient to utilize the Create React App toolchain. To serve your app to the public, you actually just need to serve the generated static files. Serving static files becomes easier when you utilize a container for shipping a web server.

Initiate a React project, then build the app.

```shell
npx create-react-app my-app
cd my-app
npm run build
```

Create an Nginx server configuration for your app, site.conf.

```
server {
    listen 80;
    server_name my-app.localhost;
    index index.html;
    root /app;
    error_log /var/log/nginx/error.log;
    access_log /var/log/nginx/access.log;
    try_files $uri $uri/ /index.html =404;
}
```

Create a Dockerfile for building the containerized web server.

```dockerfile
# syntax=docker/dockerfile:1
FROM nginx:latest
WORKDIR /app
COPY ./build .
COPY ["./site.conf", "/etc/nginx/conf.d/"]
```

Build the image and run the container.

docker build -t my-servic...
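As a variation on the Dockerfile above, the `npm run build` step can also run inside the image so the host does not need Node installed; a sketch, assuming the same `site.conf` and a `node:14` base image for the build stage (the version choice is an assumption):

```dockerfile
# syntax=docker/dockerfile:1
# Stage 1: build the static files (Node version is an assumption)
FROM node:14 AS build
WORKDIR /src
COPY package.json package-lock.json ./
RUN npm ci
COPY . .
RUN npm run build

# Stage 2: serve the generated files with Nginx
FROM nginx:latest
WORKDIR /app
COPY --from=build /src/build .
COPY site.conf /etc/nginx/conf.d/
```

Only the second stage ends up in the final image, so the Node toolchain and node_modules never ship with the web server.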

Create Multi-stage Dockerfile for Development and Production

By utilizing container technology like Docker, we can have an identical environment for development and production. But working in a development environment requires us to be able to change the source code directly. We can define multi-stage procedures in the Dockerfile and specific commands to run the container. For example, we have a Node.js program that will be shipped using a container.

1. Create a Dockerfile with stages for development and production.

```dockerfile
# syntax=docker/dockerfile:1
FROM node:14.17.1 as base
WORKDIR /app
COPY ["package.json", "package-lock.json", "./"]

FROM base as development
ENV NODE_ENV=development
RUN npm ci
COPY . .
CMD ["nodemon", "-L", "app.js"]

FROM base as production
ENV NODE_ENV=production
RUN npm ci --production
COPY . .
CMD ["node", "app.js"]
```

In this example, the program for development is run using nodemon, so we need to install nodemon first with npm i -D node...
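Selecting one of the stages at run time can be sketched with a compose file; the file name and service name below are illustrative assumptions:

```yaml
# docker-compose.dev.yaml — hypothetical sketch for the development stage
services:
  app:
    build:
      context: .
      target: development   # pick the stage defined in the Dockerfile
    volumes:
      - ./:/app             # bind-mount sources so nodemon sees edits
```

A sibling file for production would set `target: production` and drop the bind mount, so the container runs only the code baked into the image.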

Run MongoDB in Docker Container

For faster application development and delivery, Docker can be a good choice. A MongoDB image is available in the Docker registry. You can run MongoDB in a Docker container, then connect your application or service to it. To make the MongoDB service available on the container's network and persist its data, there are several configuration steps.

1. Create named volumes for MongoDB data and a configuration file, then create a network for the containers in your application system. For example, the volumes are my-mongo-data and my-mongo-config, and the network is named my-net.

```shell
docker volume create my-mongo-data
docker volume create my-mongo-config
docker network create my-net
```

2. As mentioned on MongoDB for Docker's site, database data is stored in the /data/db directory in the container. MongoDB for Docker also accepts environment variables for setting up the initial username and password of the root user, which are named MONGO_INITDB_ROOT_USERNAME and MONG...
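The steps above can be sketched as a compose service that ties the volumes and network together; the credential values are placeholders (in practice, a real password would come from a secret rather than plain text):

```yaml
# Hypothetical sketch; credential values are placeholders.
services:
  mongo:
    image: mongo
    environment:
      - MONGO_INITDB_ROOT_USERNAME=root
      - MONGO_INITDB_ROOT_PASSWORD=example   # placeholder value
    volumes:
      - my-mongo-data:/data/db               # database files
      - my-mongo-config:/data/configdb       # configuration data
    networks:
      - my-net

volumes:
  my-mongo-data:
    external: true    # created beforehand with `docker volume create`
  my-mongo-config:
    external: true

networks:
  my-net:
    external: true    # created beforehand with `docker network create`
```

Other containers attached to my-net can then reach the database at the hostname `mongo`.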

Shipping Node.js-based Application as Docker Image

Docker is a set of tools for managing and delivering services or application components as isolated packages called containers. This technology has become more popular in recent times along with the increasing demand for microservices architecture, continuous integration, and faster delivery of features or services. This article shows you how to ship your Node application as a Docker image and deploy it as a container.

1. Initiate a Node project.

```shell
mkdir my-node-app
cd ./my-node-app
npm init -y
```

2. Write a sample Node application named server.js in your project directory.

```javascript
const http = require('http');
const chalk = require('chalk');

const server = http.createServer(function (req, res) {
  res.end('Hello from Node server.');
});

server.listen(5000, () => {
  console.log(chalk.green('Your server is running on port 5000!'));
});
```

3. Install your Node application's dependency.

```shell
npm install --save chalk
```

4. Write a Dockerfile file in your project directory.

# s...
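The Dockerfile in step 4 is cut off in this excerpt, but a minimal Dockerfile for a Node app of this shape could look like the following sketch; the base image tag is an assumption, and the exposed port follows the server.js above:

```dockerfile
# syntax=docker/dockerfile:1
# Hypothetical sketch; the node:14 tag is an assumption
FROM node:14
WORKDIR /app
# Copy manifests first so dependency layers are cached between builds
COPY package.json package-lock.json ./
RUN npm ci
COPY . .
EXPOSE 5000
CMD ["node", "server.js"]
```

The image could then be built and run with the usual `docker build` and `docker run -p` commands, mapping a host port to container port 5000.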

Installing Multiple Instances of Linux Distributions in WSL

With the support of WSL (Windows Subsystem for Linux), you can install any Linux distro on a Windows machine. The method recommended by the WSL documentation is downloading the distribution from the Microsoft Store or finding an .appx installation file on the Microsoft website. For running multiple instances of the same Linux distribution, you can duplicate the data using an export-import procedure, as I have mentioned in another post. Another method that might be more beneficial is utilizing Docker. Currently, Docker already has a variety of Linux distribution images in its registry. You can also store your own customized distribution in the Docker registry so that it can be distributed to any machine instantly. After you have WSL 2 and a Linux distribution installed from the Microsoft Store, you are ready to have more Linux instances in your Windows.

1. List all installed distributions in your Windows.

```shell
wsl --list -v
```

2. Run the distribution you desire from the terminal; for example, you have insta...