Replacing your Dockerfile with Go code
January 11, 2023
Dagger's approach to building CI/CD pipelines is unique. Our SDKs let you develop your pipelines as code, in a programming language you already know, and then execute those pipelines as standard OCI containers. This approach is both supremely portable and consistent (because your pipeline runs the same way locally and in a remote CI environment) and it has the added bonus of compatibility with the existing Docker ecosystem.
One of the most common questions users ask when first encountering Dagger is, "I already have a Dockerfile, why should I use Dagger?" And the answer (like most good answers) depends very much on the intended use case.
Usage Scenarios
It's true that for most small and medium-sized projects without complex CI/CD needs, a Dockerfile can (and does) get the job done. But there are a number of scenarios where replacing a Dockerfile with a Dagger pipeline can yield significant benefits. Here are a few:
Replacing monolithic CI dependency images: Sometimes, teams maintain one giant, everything-including-the-kitchen-sink Dockerfile containing multiple dependencies so they only have to reference one image (like myorg/ci) in all their CI pipelines. Building and pushing these Dockerfiles becomes quite a chore, and teams must also deal with registry rate limits (if using a public registry) or maintain a private registry instead. Dagger lets you maintain images purely as code, so you never have to worry about building and pushing gigantic monolithic CI images.

Sharing data between CI pipelines and application code: Since your Dagger pipeline is usually written in the same language as your application, it's much easier to share data between the two. For example, if you have a constant (like an image name) that you use in your actual production codebase, you can import it directly into your CI code rather than attempting to pass it as a Dockerfile argument.
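Here is a minimal sketch of what that can look like with the Dagger Go SDK. It assumes the pipeline lives in the same Go module as the application, and that a hypothetical buildinfo package exports the image reference the production code already uses; the module path and build command are placeholders.

```go
package main

import (
	"context"
	"fmt"
	"os"

	"dagger.io/dagger"

	// Hypothetical shared package from the application codebase,
	// e.g. buildinfo.ImageRef = "myorg/myapp:latest".
	"example.com/myapp/internal/buildinfo"
)

func main() {
	ctx := context.Background()

	client, err := dagger.Connect(ctx, dagger.WithLogOutput(os.Stderr))
	if err != nil {
		panic(err)
	}
	defer client.Close()

	// Build the application image and publish it under the same reference
	// the application code uses, instead of duplicating it as a Dockerfile ARG.
	ref, err := client.Container().
		From("golang:1.19-alpine").
		WithDirectory("/src", client.Host().Directory(".")).
		WithWorkdir("/src").
		WithExec([]string{"go", "build", "-o", "/bin/myapp", "."}).
		WithEntrypoint([]string{"/bin/myapp"}).
		Publish(ctx, buildinfo.ImageRef)
	if err != nil {
		panic(err)
	}
	fmt.Println("published:", ref)
}
```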
Greater reusability and abstraction: Dagger lets you use general-purpose programming languages and therefore supports more reusability and abstraction. You can access native control structures like conditionals and loops, benefit from the language's existing testing tools, and add new functionality or integrate with third-party services using community extensions or packages. These operations are either extremely awkward or unsupported in the standard Dockerfile model, but very easy with Dagger. By the same token, Dagger pipelines also benefit from static typing and easier refactoring.
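As a small illustration, a plain Go loop is enough to run the same test step against several Go versions, something a Dockerfile cannot express on its own. This is just a sketch; the version list and source path are illustrative assumptions.

```go
package main

import (
	"context"
	"fmt"
	"os"

	"dagger.io/dagger"
)

func main() {
	ctx := context.Background()

	client, err := dagger.Connect(ctx, dagger.WithLogOutput(os.Stderr))
	if err != nil {
		panic(err)
	}
	defer client.Close()

	// Load the project source from the host.
	src := client.Host().Directory(".")

	// Run the test suite against multiple Go versions using a native loop.
	for _, goVersion := range []string{"1.18", "1.19"} {
		out, err := client.Container().
			From(fmt.Sprintf("golang:%s", goVersion)).
			WithDirectory("/src", src).
			WithWorkdir("/src").
			WithExec([]string{"go", "test", "./..."}).
			Stdout(ctx)
		if err != nil {
			panic(err)
		}
		fmt.Printf("go %s:\n%s\n", goVersion, out)
	}
}
```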
Gradually rewriting a legacy Dockerfile: Sometimes you are stuck with a Dockerfile that is both very important and painful to use. Everyone knows it should be replaced with something more powerful, ideally using a real programming language, but rewriting it all at once, without breaking compatibility, is too hard, so it never happens. Since Dagger can natively run Dockerfiles with full compatibility, it's easy to wrap your existing Dockerfile in a Dagger pipeline and gradually refactor it over time, without breaking your team's workflow.
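A minimal sketch of that wrapping step with the Go SDK might look like the following: Container.Build runs your existing Dockerfile as-is, so individual stages can later be rewritten as Go code without changing how the pipeline is invoked. The registry path below is a placeholder.

```go
package main

import (
	"context"
	"fmt"
	"os"

	"dagger.io/dagger"
)

func main() {
	ctx := context.Background()

	client, err := dagger.Connect(ctx, dagger.WithLogOutput(os.Stderr))
	if err != nil {
		panic(err)
	}
	defer client.Close()

	// Build the legacy Dockerfile at the repository root, exactly as
	// `docker build .` would, then publish the result.
	ref, err := client.Container().
		Build(client.Host().Directory(".")).
		Publish(ctx, "registry.example.com/myorg/legacy-app:latest")
	if err != nil {
		panic(err)
	}
	fmt.Println("published:", ref)
}
```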
Getting Started
If you've struggled with one or more of these scenarios, or if you're just curious about how it all works, we've put together a brief guide on replacing a Dockerfile with a Dagger pipeline. This guide explains how to use the Dagger Go SDK to perform all the same operations that you would typically perform with a Dockerfile.
It illustrates the process by replacing the Dockerfile for the popular open source Memcached caching system with a Dagger Go pipeline and covers the following common tasks:
Create a Dagger client in Go
Write a Dagger pipeline in Go to:
Configure a container with all required dependencies and environment variables
Download and build the application source code in the container
Set the container entrypoint
Publish the built container image to Docker Hub
Test the Dagger pipeline locally
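Putting those steps together, the overall shape of such a pipeline looks roughly like this. This is only a sketch, not the guide's exact code: the base image, build dependencies, environment variable, source URL, and registry path are illustrative assumptions.

```go
package main

import (
	"context"
	"fmt"
	"os"

	"dagger.io/dagger"
)

func main() {
	ctx := context.Background()

	// Create a Dagger client, streaming pipeline logs to stderr.
	client, err := dagger.Connect(ctx, dagger.WithLogOutput(os.Stderr))
	if err != nil {
		panic(err)
	}
	defer client.Close()

	// Configure a build container with the required dependencies and
	// an illustrative environment variable.
	builder := client.Container().
		From("alpine:3.17").
		WithExec([]string{"apk", "add", "build-base", "libevent-dev", "curl"}).
		WithEnvVariable("MAKEFLAGS", "-j4").
		WithWorkdir("/src").
		// Download and build the application source inside the container.
		WithExec([]string{"sh", "-c", "curl -fsSL https://memcached.org/latest | tar xz --strip-components=1"}).
		WithExec([]string{"./configure"}).
		WithExec([]string{"make"}).
		WithExec([]string{"make", "install"})

	// Set the entrypoint and publish the image (placeholder reference).
	ref, err := builder.
		WithEntrypoint([]string{"memcached"}).
		Publish(ctx, "docker.io/myuser/memcached:latest")
	if err != nil {
		panic(err)
	}

	fmt.Println("published image:", ref)
}
```

Because the pipeline is an ordinary Go program, testing it locally is just a matter of running it with go run, with Docker or another container runtime available for the Dagger engine.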
We hope you find this guide useful and look forward to releasing more such guides soon! In the meantime, if you have feedback or would like to suggest new features or documentation, let us know in Discord or create a GitHub issue.