Docker - Everything You Need To Know

What exactly is Docker? 

 
Whenever we have to install software, we have to take care of a lot of things. Different versions of the software are available for different operating systems and their different versions. You have to go through the documentation, choose the correct fit for your needs, and then run the executable. Even after that, you may need to complete some other steps before you can actually use the software. Docker runs containers, which package the software together with everything else it needs to run. This means you just use a ‘docker run’ command with the name of the image you want, and voila, your software runs in its own container, using its own resources. You do not have to worry about which version of the software suits your operating system. I will demonstrate this later with an example of a MongoDB installation.
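
For example, the single command below pulls the official nginx web server image and starts it as a container in one step. The container name and port numbers here are arbitrary choices for illustration, not part of the demos later in this article.

docker run -d --name my-nginx -p 8080:80 nginx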

What is the importance of Docker for me as a developer in IT? 

Well, developers can simply write their code and create an image. This image contains all the tools needed for the application to run. The image simply needs to be deployed on a production machine with no prior software installed, and the application will run exactly as it did on the development machine.

So how exactly do we use Docker?

Once we have installed Docker on our systems, we go to Docker Hub or some other registry and search for the software that we want to install. We can then run the PowerShell command ‘docker run imageName’ and the software is ready for our use. If you want to share an image of your own application instead, you can also save it as a 'tar' file and pass it around.
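
As a rough sketch of that sharing workflow, ‘docker save’ and ‘docker load’ export and re-import an image as a tar file. The image and file names below are just placeholders.

# Export a local image to a tar file, then load it again on another machine
docker save -o myapp.tar myapp
docker load -i myapp.tar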
 

What's the difference between containers and Docker?

 
Yes, these two terms are often used interchangeably, but they mean different things. Containers are self-contained processes which include the running software along with its dependencies. Containers have existed in Linux for a long time, but they were not widely used back then. According to the official Docker website, Docker is ‘a platform for developers and sysadmins to develop, deploy, and run applications with containers’. So to sum up, Docker helps us manage containers, and containers are processes that run applications.
 

Advantages of containers

 
As mentioned above, running software as containers simplifies the process of running software and applications. Suppose you have an ASP.NET application. A developer can create an image of the working application. This image will contain the application, the ASP.NET framework, and so on. This image can then be deployed as a container on a production machine that needs nothing else installed beforehand. Whatever the application needs to run will be present in the container, and the container will run the same on every system. So you will no longer have issues like an application running on dev but failing on prod.
 

Containers and virtual machines

 
Containers and virtual machines might look the same, but they are quite different. A container holds only the tools that the application needs and shares the host operating system kernel with other containers. Virtual machines, on the other hand, each have their own fully independent operating system. Since containers do not carry their own full-fledged operating system, they are much lighter than virtual machines.
 

Docker Engine 

The official Docker website describes the Docker Engine as a client-server application, summarized below.


The Docker Engine consists of a client and a server. We, the users, interact with the server using the Docker CLI, which is the client. The client talks to the server through the Docker REST API. The server, or Docker daemon, is responsible for running the containers. When the user types a command into the Docker CLI, for example ‘docker run imagename’, the request is received by the Docker daemon. The daemon searches for the image locally and, if it is found, runs it as a container. Think of an image as an executable file. If the image is not found locally, the daemon pulls it from a registry and then runs it as a container.
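
You can see this client-server split for yourself: the command below prints a ‘Client’ section and a ‘Server’ section in its output (the exact fields depend on the Docker version you have installed).

docker version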

Installing Docker

Now, let’s start exploring Docker practically. You need the Windows 10 Professional or Enterprise edition with at least 4 GB of RAM to install Docker. Since I didn’t have Windows 10 Professional, I created a virtual machine in Azure. Here are the steps:

Go to the Azure portal and click on 'Virtual machines'.


Choose a Windows 10 Professional image. Not all VM sizes support the nested virtualization that Docker needs, so I selected a VM of size D2s_v3.


Also, make sure that the inbound and outbound port rules allow RDP connections; otherwise, you might not be able to connect to the VM over RDP.


If you are trying to access Azure from your office, you might run into issues and may need to contact your system administrator to open up these ports. Once our VM is up and running, we need to install Docker. Go here to install Docker for Windows.

Once you have installed the above software, your system will restart and Docker will ask you to enable Hyper-V. Click "Yes, restart the system".


By default, Linux containers are enabled. You can switch between Linux and Windows containers by clicking on the whale icon.


Go to PowerShell, type ‘docker run hello-world‘, and press Enter. You should see the message ‘Hello from Docker!’, which means Docker is installed correctly.


Read the steps printed in the output of this command; they describe exactly the flow we talked about earlier: the client contacted the daemon, and the daemon pulled the ‘hello-world’ image from Docker Hub and ran it as a container.

If you are trying to run Docker from your workplace, you might face some proxy-related issues. You can set your proxy by navigating to the Docker settings.

Demo: Running your ASP.NET application as a container

Create a new ASP.NET Core MVC project in Visual Studio 2017. While creating it, make sure the ‘Enable Docker Support’ option is checked.


I call my app ‘aspnetapp’. Once you enable Docker support, a file called ‘Dockerfile’ is created in Solution Explorer.


Replace the existing code in this file with the following piece of code.

  FROM microsoft/dotnet:sdk AS build-env
  WORKDIR /app

  # Copy csproj and restore as distinct layers
  COPY *.csproj ./
  RUN dotnet restore

  # Copy everything else and build
  COPY . ./
  RUN dotnet publish -c Release -o out

  # Build runtime image
  FROM microsoft/dotnet:aspnetcore-runtime
  WORKDIR /app
  COPY --from=build-env /app/out .
  ENTRYPOINT ["dotnet", "aspnetapp.dll"]


Go to PowerShell and navigate to the project directory. Once there, run the command:

docker build -t aspnetapp .


Once the image is built, run the following command:

docker run -d -p 8080:80 --name myaspnetapp aspnetapp


Once this is successful, browse to localhost:8080 to see the app running.


So what happened here? The Dockerfile contains the information needed to create an image. For example, it says that the final runtime image should be built on the base image microsoft/dotnet:aspnetcore-runtime, while the application itself is compiled in an intermediate microsoft/dotnet:sdk image. An image for the application is created when you run the build command. If you run the following command in PowerShell, you will see the images listed.

docker images

 
When you run the image using the ‘docker run’ command, it runs this image as a container, where ‘myaspnetapp’ is the container name and ‘aspnetapp’ is the image name. The ‘-p 8080:80’ option maps port 8080 on the host to port 80 inside the container. So, when you navigate to localhost:8080, you can find your containerized application running. You can check out all the running containers using the command ‘docker ps’.
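
If you also want to see the application's console output, you can read the container's logs. This is an optional check, not a required step in the demo.

docker logs myaspnetapp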
 

For further information on this demo, refer to the official Docker website here.

So, developers can create their image and upload it to a repository. This image can then be simply run on production machines when the application needs to be made live.

Now let’s check out how we can run MongoDB as a container.

Demo: Installing MongoDB

The traditional way of installing software

Now let’s install MongoDB using the traditional method. If we head over to its documentation, it lists the steps required for installing MongoDB: downloading and running the executable, setting it up through the installer, and so on. Installing MongoDB this way is a lengthy process.

Now let’s see how docker simplifies this process.

Running software as a container

Go to Docker Hub and search for Mongo.

Before we run the command, click on the whale icon, go to Settings -> Daemon, and set the experimental flag to true.


Once you are done, Docker will restart. You can then type the following command in PowerShell or the command prompt.

docker run --name some-mongo -d mongo:4.1

Here Docker downloads the image (if it is not already present locally) and starts a container named ‘some-mongo’. You can give it some other name as you please. ‘mongo’ is the image name and ‘4.1’ is its version, or tag.
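
Note that we are not publishing any ports here, because in the next step we will connect from inside the container itself. As an optional aside, if you wanted to reach MongoDB from the host machine, you could publish its default port with a variation like the one below (run it in place of the command above, since two containers cannot share the same name):

docker run --name some-mongo -d -p 27017:27017 mongo:4.1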


The output shows that a newer image was downloaded and the container was started.

The container is already running in the background because of the ‘-d’ flag. Let’s check its logs to confirm that the server started:

docker logs some-mongo


In the logs, we will see a message like ‘waiting for connections on port 27017’, which means our server is up and running.

So open another instance of PowerShell and run the following command,

docker exec -it some-mongo mongo


Then type in the command,

show dbs


This shows that no user databases have been created yet on our server. We can now proceed with other MongoDB commands here.
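
For example, you could create a database and a document straight from this shell. The database and collection names below are made-up examples.

// switch to (and implicitly create) a new database
use testdb
// the first insert creates the collection and the database
db.customers.insertOne({ name: "docker-demo" })
// 'testdb' now appears in the list
show dbs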

So we see that Docker simplifies the process of installing software.

What happens behind the scenes?

Your operating system can be divided into two major portions: the kernel and user space. The kernel has control over the hardware and contains the drivers. Everything other than the kernel, such as our applications and the OS utilities and libraries they require, falls under user space. User space accesses the hardware through the kernel.

Traditionally, when we install software, we simply install the application and let it use the libraries already present in the user space. With the containerization approach, the image contains the application plus the user-space libraries and other dependencies it needs to run. The container still shares the host’s kernel, but the application no longer depends on whatever happens to be installed in the host’s user space.
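
You can observe the shared-kernel idea directly: a container reports the kernel of the host it runs on rather than one of its own. The command below is a small illustration (on Docker Desktop for Windows with Linux containers, the kernel it prints belongs to the lightweight Linux VM that Docker runs behind the scenes):

docker run --rm alpine uname -r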

Docker commands

docker run imagename

This will run the specified image. It is the equivalent of running an executable when installing software the traditional way.

docker --help

This will list all the Docker commands available to you.

docker ps

This will list the currently running containers.
 

docker ps -a

This will list all containers, both running and exited.
 

docker stop containername

This will stop the container. ‘docker ps -a’ will still list it as exited, but ‘docker ps’ will not.

docker rm containername

This will remove the container. This is like uninstalling software in the traditional sense. Neither ‘docker ps -a’ nor ‘docker ps’ will list it, since the container has been removed.

docker images

This will list the downloaded images. Images are like the executables used in traditional methods of installing software.
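
Putting these commands together, a typical container lifecycle might look like the sketch below (reusing the hypothetical nginx example from earlier; the names are arbitrary):

docker run -d --name my-nginx -p 8080:80 nginx   # download (if needed) and start a container
docker ps                                        # the container shows up as running
docker stop my-nginx                             # stop it
docker ps -a                                     # it is now listed as exited
docker rm my-nginx                               # remove the container
docker images                                    # the nginx image is still available locally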

For more Docker commands, visit here.

How to upload an image to Docker Hub?

In the first demo above, we created an image for the ASP.NET Core app, which was stored locally. We will now have a look at how to upload images to Docker Hub. First, create a free account on Docker Hub and then create a repository.

Log in from PowerShell as below:

docker login
 
 
Then run the following commands,
 

In the docker tag command, aspnetapp is the image name. Then comes my username/repository:tag.
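
Putting that together, the two commands look roughly like this, where the user name, repository, and tag are placeholders you should replace with your own values:

docker tag aspnetapp yourusername/yourrepository:v1
docker push yourusername/yourrepository:v1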

Once this image is pushed to Docker Hub, you can log in there and see the image in a browser.

 
I hope that this article has taken you one step closer to unraveling the mystery of Docker. Feel free to reach out to me if you want to discuss further.

This article was originally published on my website.

