I’m not really sure if this is the place to ask this, but thought I’d start here!
I’m a developer who works a whole lot with provisioning and CI/CD.
I wanted to check whether the SaaS solution allows vaults to be used in CI/CD pipelines: are there rate limits or similar, any rules against it, or simply a preference from the hosts that it not be done?
I have no problems with self-hosting a separate vault for my own automated jobs, but if the hosted version works, it would be preferable!
This is the perfect place to ask
The CI/CD calls would actually use the CLI. Since that’s a local copy of your data, no restrictions are in place for accessing it.
Hi and thanks for the response!
So, I’m guessing it would be preferred to cache the vault in that case, so it’s not re-downloaded every time a pipeline runs?
I was expecting the CLI to query the online vault via the API for every request.
I might end up writing a Terraform provider for my use case, and I just want to make sure I don’t release something that would be frowned upon by the company that provides such an awesome service.
The CLI is like any other Bitwarden client: it downloads a JSON file with the encrypted data from the vault it is logged into, and all “calls” to the data reference that static file. Changes to non-shared credentials are synchronized nearly instantly, and shared credentials sync every 30 minutes or so. If you want to make sure you are up to date, you can simply call `bw sync` in your application/script/etc.
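For CI/CD, that workflow might look roughly like the sketch below. This is only an illustration: the item name and the `BW_MASTER_PASSWORD` variable are placeholders, and the authentication flow assumes API-key login with the relevant values stored as pipeline secrets.

```sh
# Sketch of a CI pipeline step using the Bitwarden CLI.
# BW_CLIENTID / BW_CLIENTSECRET / BW_MASTER_PASSWORD are assumed
# to be injected as CI secrets; "my-ci-database" is a placeholder item.

# Authenticate non-interactively with an API key
bw login --apikey

# Unlock the local vault and capture a session token
export BW_SESSION="$(bw unlock --passwordenv BW_MASTER_PASSWORD --raw)"

# Pull the latest encrypted vault data before reading anything
bw sync

# Subsequent reads hit the local encrypted JSON copy, not the server
DB_PASSWORD="$(bw get password "my-ci-database")"
```

Since all reads after `bw sync` are local, even a busy pipeline shouldn’t generate meaningful load against the hosted service.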
The Bitwarden RESTful APIs don’t actually interact with encrypted data; only the clients do.
Will see what I can do
Will likely come back here and drop some info if I get a good TF (or other kind of) script up and running.