We have a Jenkins Job that builds something on node A and then needs to SCP it to node B.
You can just put the SSH keys for node B on node A and SCP with an "execute shell" build step; however, I suspect that keeping SSH keys on a Jenkins node is bad practice, and I would like to avoid it.
What is the best practice regarding this use case? Is there a plugin or a feature for this? Should we just install HashiCorp Vault on the node and configure the keys there separately?
2 Answers
Yes, storing your SSH keys directly on a build node is bad practice. Nodes can be replicated, deleted, or accessed from other systems, and you don't want to lose track of which systems have access to your secrets.
You should also not pass them directly into the build job, either as a parameter or as an environment variable. That creates a logging and auditing headache: you have to make sure the keys are never inadvertently printed somewhere they shouldn't be. Instead, you should:
- Use a centralized secret store (such as HashiCorp Vault) to retrieve the secrets at build time.
- Use a plugin (such as the Credentials or SSH Credentials plugin) to reference the keys during the build (see the sketch after this list).
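Here is a minimal declarative-pipeline sketch of the second option, using the `sshUserPrivateKey` binding from the Credentials Binding plugin. The credential ID `node-b-deploy-key`, the host `node-b.example.com`, and the artifact path are assumptions for illustration:

```groovy
// Minimal sketch, assuming the Credentials Binding plugin and an SSH private
// key stored in Jenkins under the (hypothetical) ID 'node-b-deploy-key'.
pipeline {
    agent { label 'node-a' }
    stages {
        stage('Build') {
            steps {
                sh 'make dist'   // placeholder for your actual build step
            }
        }
        stage('Copy to node B') {
            steps {
                // The binding writes the key to a temporary file and removes it
                // when the block exits, so the key never lives permanently on node A.
                withCredentials([sshUserPrivateKey(credentialsId: 'node-b-deploy-key',
                                                   keyFileVariable: 'SSH_KEY',
                                                   usernameVariable: 'SSH_USER')]) {
                    sh 'scp -i "$SSH_KEY" dist/app.tar.gz "$SSH_USER@node-b.example.com:/opt/app/"'
                }
            }
        }
    }
}
```

As a bonus, Jenkins masks the bound variables in the console log, which addresses the logging concern above.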
I would suggest using the Jenkins Credentials Plugin.
Another method, useful when the Jenkins nodes cannot connect to each other via SSH, is to push the artifact from node A to a remote bucket and use a trigger or webhook to pull it onto node B. This may, however, require internet access.
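A rough sketch of the node A side of that pattern, assuming the AWS CLI is installed and AWS credentials are stored in Jenkins under a hypothetical ID `aws-creds`; node B would then be triggered separately to download the artifact:

```groovy
// Sketch of the bucket hand-off. The bucket name 'my-artifact-bucket' and the
// credential ID 'aws-creds' are assumptions for illustration.
stage('Publish to bucket') {
    steps {
        withCredentials([usernamePassword(credentialsId: 'aws-creds',
                                          usernameVariable: 'AWS_ACCESS_KEY_ID',
                                          passwordVariable: 'AWS_SECRET_ACCESS_KEY')]) {
            // The AWS CLI picks the credentials up from the environment variables.
            sh 'aws s3 cp dist/app.tar.gz s3://my-artifact-bucket/app.tar.gz'
        }
    }
}
```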
To apply the credentials-based approach, use the SSH Credentials Plugin (https://wiki.jenkins.io/display/JENKINS/SSH+Credentials+Plugin) in conjunction with the credentials documentation (https://jenkins.io/doc/book/using/using-credentials).
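For example, with the SSH Agent plugin (which consumes the key material that the SSH Credentials plugin stores), the copy step could look like the sketch below; the credential ID and target path are assumptions:

```groovy
// Sketch assuming the SSH Agent plugin and an SSH key stored in Jenkins
// under the (hypothetical) ID 'node-b-ssh-key'.
sshagent(credentials: ['node-b-ssh-key']) {
    // The key is loaded into a transient ssh-agent for the duration of the
    // block, so nothing is written to disk on node A.
    sh 'scp dist/app.tar.gz deploy@node-b.example.com:/opt/app/'
}
```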