I’ve spent a wee bit more time on this issue today (I’ve commented on the issue you raised on GitHub, BTW, but I don’t think we’re going to be able to resolve it simply, as I’ll expand on below).
As noted, the root issue stems from the absence of the openssl-dev package when we run the Docker image. At first I thought a quick fix might be to add an install command to the Dockerfile. So, I tried adding an install line to the part of the Dockerfile that adds packages. This didn’t work, however…
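For reference, the line I tried was along these lines (treat this as a reconstruction rather than the exact diff — the package name is taken from the error message):

```dockerfile
# Install the OpenSSL development headers via Alpine's package manager.
RUN apk add --no-cache openssl-dev
```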
After speaking to Chris, our DevOps guy, it seems the issue is probably more deeply rooted than a simple missing package.
This Docker image is based on the Alpine image, a very minimal Linux distribution. The failure occurs because, when the container runs, it attempts to build safe_vault if a pre-built binary hasn’t already been supplied. The problem is that Alpine isn’t really a good OS for a build environment, so it isn’t a one-line change to simply install the missing package it’s complaining about: Alpine doesn’t have said package available.
We might want to refactor it into two different Docker images: one for building safe_vault and one for deploying a pre-built safe_vault binary. We’d need to take some time to think about exactly how we would go about this.
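If we did go the two-image route, one way to sketch it (image names, paths, and the build commands here are my assumptions, not our actual setup) would be Docker’s multi-stage build feature, which gives you the “build image + deploy image” split in a single Dockerfile:

```dockerfile
# Stage 1: a full build environment. rust:alpine is assumed here so the
# resulting binary links against musl and can run on the Alpine deploy
# image below.
FROM rust:alpine AS builder
RUN apk add --no-cache build-base openssl-dev
WORKDIR /src
COPY . .
RUN cargo build --release

# Stage 2: a minimal deploy image that only ships the pre-built binary.
FROM alpine:latest
COPY --from=builder /src/target/release/safe_vault /usr/local/bin/safe_vault
CMD ["safe_vault"]
```

The nice property of this split is that the heavy build toolchain never ends up in the image we actually deploy.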
So, basically, we are building a Docker container which, when run, itself builds safe_vault. A better solution would be to build safe_vault first and supply the binary to Docker when we build the container; then running the container would simply deploy a pre-made safe_vault.
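As a rough sketch of that flow (all paths, image names, and the build command are hypothetical, not our current scripts):

```shell
# Build safe_vault on the host, ahead of time.
cargo build --release

# Supply the pre-built binary to the Docker build context.
cp target/release/safe_vault docker/

# The image now only needs to COPY the binary in, not build it.
docker build -t safe_vault docker/

# Running the container deploys the pre-made binary directly.
docker run safe_vault
```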
Unfortunately, our Docker container doesn’t do this at the moment, and getting it to a position where it does would require some work. I’m afraid it isn’t high priority right now, so I really don’t know if/when it might happen…
If you’re up for a Docker learning “challenge”, you could maybe try it yourself?
Sorry we haven’t been able to resolve this more satisfactorily.