
> pulling and pushing our images over the internet dozens of times a day caused us to hit the contracted bandwidth limit with our datacenter provider Deft repeatedly

I wonder what they were doing that resulted in blowing out their Docker layer cache on every pull and push.

Normally only the changed layers would be sent over the wire — e.g. a code change that didn't touch your dependencies only re-uploads the layers after the dependency install step.
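To illustrate the layer-reuse point: the common pattern is to copy the dependency manifest and install dependencies before copying the rest of the source, so that a pure code change leaves the expensive layers cached and only the final layers get rebuilt and pushed. A hypothetical Node.js Dockerfile sketch (file names and base image are assumptions, not from the thread):

```dockerfile
FROM node:20-slim
WORKDIR /app

# These layers only invalidate when the dependency manifest changes.
COPY package.json package-lock.json ./
RUN npm ci

# A code-only change invalidates the build from here down,
# so only these (small) layers are rebuilt and pushed.
COPY . .
CMD ["node", "server.js"]
```

With this ordering, `docker push` transfers just the diff: unchanged layers are skipped because the registry already has them.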



I'd rather have the agents prune their Docker cache (or destroy and recreate the agent) every night, but it's not uncommon to see pipelines use the --no-cache option on every run to make sure they pick up the latest security updates.
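A middle ground between `--no-cache` on every run and a fully cached build is to refresh only the base image and prune on a schedule. A sketch of what that might look like in an agent's nightly cron (the build context path and image tag are placeholders):

```shell
# Re-pull the base image so security updates land,
# while still reusing unchanged intermediate layers.
docker build --pull -t myapp:latest .

# Nightly, instead of --no-cache every run: drop the
# build cache and dangling images in one sweep.
docker builder prune -af
docker image prune -f
```

`--pull` forces a fresh check of the `FROM` image against the registry, which covers the "latest security updates" concern without throwing away the whole cache on every build.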



