SSH login into vSphere with Tanzu Supervisor Cluster nodes

A vSphere with Tanzu Supervisor Cluster is created as part of enabling the “Workload Management” functionality in a vSphere 7.x environment. After the Supervisor Cluster is created, you will see three new control plane virtual machines, but you never specify credentials for those VMs during deployment.

For example, the screenshot below shows the list of VMs created after the Supervisor Cluster has been successfully deployed.

Now, the question is: how do you log in to these VMs via SSH? In this blog post, I will walk you through the steps needed to do so.

Login to Supervisor Cluster Nodes using SSH

Retrieve the password for supervisor cluster nodes

– Log in to vCenter via SSH (using PuTTY)

– Switch to shell mode

Command> shell
Shell access is granted to root

– Run the command below to retrieve the password

root@vcenter7 [ ~ ]# /usr/lib/vmware-wcp/decryptK8Pwd.py
Read key from file

Connected to PSQL

Cluster: domain-c175172:3a316c1a-ec57-4d25-8a8c-914c2a0aa7a0
IP: 192.168.1.11
PWD: 8JVDvj/u/cLR7TeIKCjeB9BQWj4Zpq9skp6DXnFxTvbsawX7Pmi2RV4k/4g7PDG2434IwbJtRzYBIMGBb4LbkQTrrE5AhRj5xtRpiWPWc0fFvyCQ42v+ScBIr019bHEve+35616dsUHi1c7LI2oaAGY7qO5gVKNdNgfEIvBUNFA=
------------------------------------------------------------
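If you prefer not to open an interactive appliance shell each time, the same script can be invoked in one step over SSH. This is a minimal sketch, assuming the vCenter root account's default login shell has been switched to BASH (otherwise the appliance shell intercepts the remote command); vcenter7.example.com is a placeholder hostname:

ssh root@vcenter7.example.com /usr/lib/vmware-wcp/decryptK8Pwd.py

You will be prompted for the vCenter root password, and the script prints the cluster ID, IP, and decrypted node password exactly as in the output above.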

SSH access to the Supervisor Cluster Nodes

– Get the IP address of any one Supervisor Cluster node. You can find the IP addresses in the vCenter UI.

– Open PuTTY and enter the IP address

– Click the “Open” button and log in as root, using the password retrieved in the previous section

login as: root
Keyboard-interactive authentication prompts from server:
| Password:
End of keyboard-interactive prompts from server
 10:04:04 up 28 days, 17:56,  0 users,  load average: 2.05, 2.20, 2.06
root@42326d181b972234351a13bf97c76d55 [ ~ ]#
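PuTTY is not required; on a Linux or macOS machine the standard OpenSSH client works the same way. A minimal example, using the IP reported by decryptK8Pwd.py above (or a node IP from the vCenter UI) and pasting the decrypted password when prompted:

ssh root@192.168.1.11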

– You can run kubectl commands directly from those nodes.

root@42326d181b972234351a13bf97c76d55 [ ~ ]# kubectl get po -n kube-system
NAME                                                       READY   STATUS    RESTARTS   AGE
coredns-594c6dccdd-4z5sf                                   1/1     Running   6          28d
coredns-594c6dccdd-fx5vn                                   1/1     Running   0          28d
coredns-594c6dccdd-jfrsc                                   1/1     Running   0          28d
csr-signer-423235dc88aa819e12d3f02ae605ed4b                1/1     Running   0          28d
csr-signer-42326d181b972234351a13bf97c76d55                1/1     Running   0          28d
csr-signer-42328685de52fe8710b7aadbb9378bee                1/1     Running   0          28d
docker-registry-423235dc88aa819e12d3f02ae605ed4b           1/1     Running   0          28d
docker-registry-42326d181b972234351a13bf97c76d55           1/1     Running   0          28d
docker-registry-42328685de52fe8710b7aadbb9378bee           1/1     Running   0          28d
etcd-423235dc88aa819e12d3f02ae605ed4b                      1/1     Running   0          28d
etcd-42326d181b972234351a13bf97c76d55                      1/1     Running   0          28d
etcd-42328685de52fe8710b7aadbb9378bee                      1/1     Running   0          28d
kube-apiserver-423235dc88aa819e12d3f02ae605ed4b            1/1     Running   0          28d
kube-apiserver-42326d181b972234351a13bf97c76d55            1/1     Running   1          28d
kube-apiserver-42328685de52fe8710b7aadbb9378bee            1/1     Running   2          28d
kube-controller-manager-423235dc88aa819e12d3f02ae605ed4b   1/1     Running   0          28d
kube-controller-manager-42326d181b972234351a13bf97c76d55   1/1     Running   1          28d
kube-controller-manager-42328685de52fe8710b7aadbb9378bee   1/1     Running   0          28d
kube-proxy-6skth                                           1/1     Running   0          28d
kube-proxy-9g74g                                           1/1     Running   0          28d
kube-proxy-sft9j                                           1/1     Running   0          28d
kube-scheduler-423235dc88aa819e12d3f02ae605ed4b            2/2     Running   0          28d
kube-scheduler-42326d181b972234351a13bf97c76d55            2/2     Running   4          28d
kube-scheduler-42328685de52fe8710b7aadbb9378bee            2/2     Running   1          28d
kubectl-plugin-vsphere-423235dc88aa819e12d3f02ae605ed4b    1/1     Running   2          28d
kubectl-plugin-vsphere-42326d181b972234351a13bf97c76d55    1/1     Running   3          28d
kubectl-plugin-vsphere-42328685de52fe8710b7aadbb9378bee    1/1     Running   2          28d
wcp-authproxy-423235dc88aa819e12d3f02ae605ed4b             1/1     Running   0          28d
wcp-authproxy-42326d181b972234351a13bf97c76d55             1/1     Running   0          28d
wcp-authproxy-42328685de52fe8710b7aadbb9378bee             1/1     Running   0          28d
wcp-fip-423235dc88aa819e12d3f02ae605ed4b                   1/1     Running   0          28d
wcp-fip-42326d181b972234351a13bf97c76d55                   1/1     Running   0          28d
wcp-fip-42328685de52fe8710b7aadbb9378bee                   1/1     Running   0          28d
root@42326d181b972234351a13bf97c76d55 [ ~ ]#
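Since these are the Supervisor Cluster control plane nodes, any standard kubectl command works here. For example, two common read-only checks you might run next (output omitted):

kubectl get nodes -o wide
kubectl get pods -A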

That's all in this post.
