Simple Kubernetes Cluster Using Ansible on AWS
There are many ways to create a Kubernetes cluster. The managed Kubernetes offerings from AWS, Azure and GCP are the easiest of them all, but what if you want to create a cluster on your own infrastructure? I spent a lot of time trying to do this, unsuccessfully. When I finally got it working, I wrote Ansible scripts to make it repeatable.
These Ansible scripts have been tested on AWS and require you to provision at least one master node and one worker node. It is recommended to associate an Elastic IP with each instance once you create it, so that the address does not change if the instance is stopped and restarted. I am using an administrator node (a node with Ansible installed), a t2.micro instance on AWS.
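If you prefer the AWS CLI over the console, associating an Elastic IP can look roughly like this; this is a sketch, not part of the original scripts, and the instance and allocation IDs below are placeholders:

```shell
# Allocate a new Elastic IP in the VPC (prints an AllocationId)
aws ec2 allocate-address --domain vpc

# Associate it with an instance; both IDs below are placeholders
aws ec2 associate-address \
  --instance-id i-0123456789abcdef0 \
  --allocation-id eipalloc-0123456789abcdef0
```

Repeat the association for each of the master and worker instances.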
Start by cloning the git repository that contains the ansible scripts.
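On the admin node, the clone step is the usual one; the URL below is a placeholder, not the actual repository:

```shell
# Clone the repository with the Ansible scripts onto the admin node
# (placeholder URL -- substitute the real repository)
git clone https://github.com/<user>/<k8s-ansible-repo>.git
cd <k8s-ansible-repo>
```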
First, you need to configure passwordless SSH for the k8s nodes that you create, so that Ansible on the admin node can SSH into them without a password. The steps to do this are available in the README.md file.
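The passwordless SSH setup typically boils down to something like the following; the user name and node IPs here are placeholders, and the README.md in the repository has the exact steps:

```shell
# Generate a key pair on the admin node (no passphrase)
ssh-keygen -t rsa -b 4096 -N "" -f ~/.ssh/id_rsa

# Copy the public key to each k8s node (user/IPs are placeholders)
ssh-copy-id ubuntu@10.0.0.10   # master
ssh-copy-id ubuntu@10.0.0.11   # worker

# Verify: this should log in without prompting for a password
ssh ubuntu@10.0.0.10 hostname
```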
The hosts file needs to be populated with the IP addresses of the master node and the worker node(s). Once this has been done, you can run the individual playbooks, or execute the run.sh script, which calls the playbooks in the correct order. This pretty much completes the setup of the Kubernetes cluster.
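A minimal inventory might look like this; the group names and IPs are assumptions, so match whatever group names the repository's playbooks actually expect:

```shell
# Write a minimal Ansible inventory; group names and IPs are placeholders
cat > hosts <<'EOF'
[master]
10.0.0.10

[workers]
10.0.0.11
EOF

# Then either run an individual playbook against it, e.g.:
#   ansible-playbook -i hosts <playbook>.yml
# or run everything in order:
#   ./run.sh
```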
The cluster is set up using kubeadm, which lets you bootstrap a minimal working cluster. Pretty soon I will be posting on setting this up using KOPS.
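Under the hood, a kubeadm bootstrap like the one these playbooks perform boils down to roughly the following; the pod CIDR, node IP, token and hash below are placeholders, and the exact flags depend on the CNI plugin the playbooks install:

```shell
# On the master: initialize the control plane
# (pod CIDR is a placeholder; it must match your CNI plugin)
sudo kubeadm init --pod-network-cidr=10.244.0.0/16

# Set up kubectl for the current user
mkdir -p $HOME/.kube
sudo cp /etc/kubernetes/admin.conf $HOME/.kube/config
sudo chown $(id -u):$(id -g) $HOME/.kube/config

# On each worker: join using the token printed by `kubeadm init`
# (master IP, token and hash below are placeholders)
sudo kubeadm join 10.0.0.10:6443 --token <token> \
  --discovery-token-ca-cert-hash sha256:<hash>
```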