Managing your SSH keys with Ansible

Well, this was all sorts of fun. I tried a bunch of options to manage the SSH hosts in my /root/.ssh/config file, experimenting with Ansible’s lineinfile and blockinfile modules, and NOTHING was working … until I found the community.general.ssh_config module. Sweet!

So first, start by making a hosts (inventory) file with your VMs defined

vim ./hosts

then add your hosts

[vps_hosts]
vps1 ansible_host=192.168.5.200 ansible_hostname=HOST1 ansible_ssh_pass="ZeZecretPazwords" ansible_ssh_user=root ansible_become_pass=root 
vps2 ansible_host=192.168.5.100 ansible_hostname=HOST2 ansible_ssh_pass="ZanotherZeZecretPazwords" ansible_ssh_user=root ansible_become_pass=root 
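If you’d rather not open vim, the same inventory can be written non-interactively with a heredoc. This is just a convenience sketch; the IPs and passwords are the placeholder values from above.

```shell
# Write the inventory without an editor (placeholder values from this post):
cat > ./hosts <<'EOF'
[vps_hosts]
vps1 ansible_host=192.168.5.200 ansible_hostname=HOST1 ansible_ssh_pass="ZeZecretPazwords" ansible_ssh_user=root ansible_become_pass=root
vps2 ansible_host=192.168.5.100 ansible_hostname=HOST2 ansible_ssh_pass="ZanotherZeZecretPazwords" ansible_ssh_user=root ansible_become_pass=root
EOF

# Quick sanity check: two host entries in the vps_hosts group
grep -c 'ansible_host=' ./hosts
```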

Pretty simple. Now let’s create our playbook.

vim createsshkeys.yml

and add this code. I’ll explain in the next step what everything does.

---
    - hosts: localhost
      gather_facts: no
      tasks:
        - name: Generate SSH ECDSA key pairs
          ansible.builtin.openssh_keypair:
            path: "/root/.ssh/{{ hostvars[item].ansible_hostname }}"
            type: ecdsa
            size: 521
          with_items: "{{ groups['vps_hosts'] }}"
        
        - name: Add VPS1 to ssh config file
          community.general.ssh_config:
            user: "root" # changes the config for user root
            remote_user: "{{ hostvars['vps1'].ansible_ssh_user }}"
            host: "{{ hostvars['vps1'].ansible_hostname }}"
            hostname: "{{ hostvars['vps1'].ansible_host }}"
            identity_file: "/root/.ssh/{{ hostvars['vps1'].ansible_hostname }}"
            state: present

        - name: Add VPS2 to ssh config file
          community.general.ssh_config:
            user: "root" # changes the config for user root
            host: "{{ hostvars['vps2'].ansible_hostname }}"
            hostname: "{{ hostvars['vps2'].ansible_host }}"
            remote_user: "{{ hostvars['vps2'].ansible_ssh_user }}"
            identity_file: "/root/.ssh/{{ hostvars['vps2'].ansible_hostname }}"
            proxyjump: "{{ hostvars['vps1'].ansible_hostname }}"
            state: present

– name: Generate SSH ECDSA key pairs will generate an SSH key pair (both private and public) for every entry in the vps_hosts group of your hosts file and store them in /root/.ssh/, named after each host’s ansible_hostname (HOST1 and HOST2 in this example).
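Under the hood this is roughly equivalent to calling ssh-keygen once per host. Here’s a sketch of what the task does for the first host, using a hypothetical throwaway directory instead of /root/.ssh:

```shell
# Generate a 521-bit ECDSA key pair, like the openssh_keypair task does.
# Using a temp dir here; the playbook writes to /root/.ssh/ instead.
dir=$(mktemp -d)
ssh-keygen -q -t ecdsa -b 521 -N '' -f "$dir/HOST1"

# Two files appear: the private key HOST1 and the public key HOST1.pub
ls "$dir"
```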

– name: Add VPS1 to ssh config file uses the community.general.ssh_config module to add a matching Host entry to your /root/.ssh/config file, so that you can log in to your VPS with nothing more than

ssh HOST1
or
ssh HOST2

(note that the aliases are the ansible_hostname values, HOST1 and HOST2, not the inventory names vps1/vps2).

user: “root” : defines whose config file gets edited. With root, the entry goes into /root/.ssh/config; for another user called “user”, it would go into /home/user/.ssh/config instead.

remote_user: “{{ hostvars[‘vps1’].ansible_ssh_user }}” : looks up the ansible_ssh_user for vps1 in the hosts file and sets it as the login user (the User option) for that host.

host: “{{ hostvars[‘vps1’].ansible_hostname }}” : looks up the ansible_hostname in the hosts file and uses it as the Host alias you will type after ssh.

hostname: “{{ hostvars[‘vps1’].ansible_host }}” : looks up the ansible_host in the hosts file and uses it as the real address (HostName) to connect to.

identity_file: “/root/.ssh/{{ hostvars[‘vps1’].ansible_hostname }}” : points the IdentityFile option at the private key the first task generated, which is named after the ansible_hostname.

state: present : ensures this Host entry exists in the config file. You can also delete it again by setting this to “absent”.
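Putting all of those options together, the /root/.ssh/config produced by the playbook should end up looking roughly like this (a sketch; the module may order or indent the options slightly differently):

```
Host HOST1
    HostName 192.168.5.200
    User root
    IdentityFile /root/.ssh/HOST1

Host HOST2
    HostName 192.168.5.100
    User root
    IdentityFile /root/.ssh/HOST2
    ProxyJump HOST1
```

To get there, run the playbook with ansible-playbook -i ./hosts createsshkeys.yml. The ssh_config module lives in the community.general collection, so if Ansible can’t find it, install it first with ansible-galaxy collection install community.general.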

You’ll notice that I’m using a proxyjump on the VPS2 entry, pointing at VPS1. So now, ssh HOST2 will first connect to HOST1 and proxy the connection on through to HOST2. groovy!
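You can verify the jump without actually connecting anywhere: ssh -G prints the options ssh would use for a given host. A sketch against a hypothetical throwaway config file with the same entries:

```shell
# Build a minimal config like the one the playbook generates
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
Host HOST1
    HostName 192.168.5.200
    User root
Host HOST2
    HostName 192.168.5.100
    User root
    ProxyJump HOST1
EOF

# ssh -G resolves the config without connecting; HOST2 should show the jump
ssh -G -F "$cfg" HOST2 | grep -i 'proxyjump'
```

The ProxyJump line in the config is the equivalent of running ssh -J HOST1 HOST2 by hand every time.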