Functional DevOps


DevOps is a set of practices that combines software development (Dev) and IT operations (Ops). It aims to shorten the systems development life cycle and provide continuous delivery with high software quality. — https://en.wikipedia.org/wiki/DevOps

So it is a set of practices. In other words, those YAML files in your repository are DevOps, and whoever creates them and makes them run is a DevOps engineer.

Functional Programming

  • Referential transparency
  • Immutability
  • Purity
  • Function and data segregation
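A tiny Haskell sketch of the first two properties (a toy example, not from the original text):

```haskell
-- Referential transparency: an expression can always be replaced by its value.
double :: Int -> Int
double x = x + x

-- `double 2` is always 4, so `double 2 + double 2` can be rewritten as `4 + 4`
-- without changing the program's meaning. A function reading a mutable counter
-- could not be replaced like this, because each call might return differently.
main :: IO ()
main = print (double 2 + double 2)
```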

FP vs OO

len :: String -> Int
httpServer :: Request -> Response

An object contains internal state in memory and reacts differently to the same input depending on that state.

Is this OO?

httpServer :: Request -> DBState -> Response

but the other function, which retrieves the DBState, will still be stateful

As long as you can factor out everything that may change, you will get a pure function.
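Sketched in Haskell with toy types (`Request`, `Response`, and `DBState` here are illustrative stand-ins, not a real server API):

```haskell
type Request  = String
type Response = String

-- A toy "database state": factored out as an explicit argument
-- instead of hidden mutable state inside an object.
type DBState  = [(String, String)]

-- Pure: the same Request and DBState always yield the same Response.
httpServer :: Request -> DBState -> Response
httpServer req db =
  case lookup req db of
    Just body -> "200 " ++ body
    Nothing   -> "404 not found"

main :: IO ()
main = putStrLn (httpServer "/health" [("/health", "ok")])
```

Calling it twice with the same arguments is guaranteed to give the same result; only the function that fetches the `DBState` from the outside world remains stateful.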

Infrastructure as Code


So what are the variables we can factor out here?

Ideally should be just one:

  • Git commit
genPipeline :: Commit -> Pipeline

🌏 In the real world it is not, unless the Buildkite agent and all scripts on it are versioned

Code can be locked by commit

So can infrastructure. One commit implicitly locks all the other factors as well.

  • pipeline.dhall
  • source code
  • build.sbt
  • Dockerfile

And each of them in turn locks a set of factors behind it as well.

For instance build.sbt will lock:

  • sbt version
  • jar dependency versions
  • Scala version
  • project configuration

While pipeline.dhall will lock:

  • steps of pipeline
  • build agent for each step
  • environments

Things out of the control of pipeline.dhall, such as tool versions on the build agent, are not locked unless the agent queue is also versioned.

Lock everything where possible

So the more things you can factor out and lock, the more predictable a result you will get.

FP makes no assumptions

🌏 Let us see how to apply FP with some practical tools in the real world.


Nix is a purely functional package manager. This means that it treats packages like values in purely functional programming languages such as Haskell — they are built by functions that don't have side-effects, and they never change after they have been built.

Immutable vs Mutable

Homebrew assumes your environments are all the same. Consider the following factors when you do brew install sbt:

  • sbt version
  • JRE version
  • macOS version
  • timing: even if your OS is exactly the same, running this command at different times will yield different results
  • What about your Linux friend?

Nix's only assumption

# for mac
nix-channel --add https://nixos.org/channels/nixpkgs-20.09-darwin nixpkgs
# for linux
nix-channel --add https://nixos.org/channels/nixos-20.09 nixpkgs
nix-channel --update

Once everyone subscribes to the same channel, the version and binary of anything you install should be exactly the same for everyone.

> nix-env -i awscli
> nix-env --installed --query --out-path awscli
awscli-1.18.80  /nix/store/2b2c56c44xi3gj4hvzcxcn1dp1lb579k-awscli-1.18.80

Give it a try: your awscli will be exactly the same as mine, and even the file path is identical. Note that 2b2c56c44xi3gj4hvzcxcn1dp1lb579k is a 160-bit truncated SHA-256 hash over the package's build inputs, which guarantees we all get exactly the same awscli.

Which means we are not even assuming a Python version: every dependency of awscli will be exactly the same.

> nix-store -q --references /nix/store/2b2c56c44xi3gj4hvzcxcn1dp1lb579k-awscli-1.18.80


If everyone is using brew, when you tell your friend to run

sbt test

You have no idea what your friend will have:

  • what version of sbt?
  • what JRE version sbt is running on?
  • what environment variables are in the context?
  • are required dependencies, e.g. a database, spun up yet?


with import <nixpkgs> {};
mkShell {
  shellHook = ''
    source .buildkite/hooks/post-checkout
    source .buildkite/hooks/pre-command
    set +e
    set -a
    source app.env
    set +a
    source ./ops/bin/deps-up
  '';
  buildInputs = [
    sbt # the original list is truncated here; sbt per the surrounding text
  ];
}

But if you run nix-shell --run='sbt test', you make no assumptions about the user's system other than Nix.

Everyone running this command is guaranteed to have exactly the same

  • sbt
  • JRE and everything which back sbt
  • tools like Dhall, aws, hub, etc.
  • all required scripts and environment variables sourced in post-checkout
  • the same app.env sourced
  • all dependency services up

To wrap this up in the earlier FP terms, nix-shell is something like a pure function:

nix-shell :: shell.nix -> ConfiguredRuntime


Nix makes sure your system is immutable and reproducible. There is another tool to make your configuration immutable and reproducible as well:

dhall :: xyz.dhall -> configuration

Dhall is a programmable configuration language that you can think of as: JSON + functions + types + imports

Dhall is a "total" functional programming language, which means that: - You can always type-check an expression in a finite amount of time - If an expression type-checks then evaluating that expression always succeeds in a finite amount of time


Similar to Nix, Dhall locks configuration and its dependencies with a cryptographic hash. It does not simply take the SHA-256 of the config file; it takes the SHA-256 of the normalized config.
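A minimal illustration (not from the pipeline above): both of the following expressions normalize to the same value, so `dhall hash` reports the identical hash for each.

```dhall
-- version 1: a function applied to an argument
let greet = λ(name : Text) → "hello " ++ name in greet "world"

-- version 2: just the literal, written directly in its own file:
--   "hello world"
-- Both beta-normalize to the text literal "hello world", so their
-- semantic hashes are the same even though the source files differ.
```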

let bk =
      https://raw.githubusercontent.com/jcouyang/buildkite.dhall/0.1.0/package.dhall sha256:3c5e9eb0182755e85c65d0b16a79b2b0f9614dcffde05151835e3b1daf587e20

let scalaAgent = Some { queue = "ody-lab-scala" }

let main = "master"

in  [ bk.Steps.Command
        { label = Some "lint"
        , commands = [ "shellcheck -x ops/bin/*" ]
        , agents = scalaAgent
        }
    , bk.Steps.Command
        { label = Some "test dhall"
        , commands = [ "echo '(./app.dhall).version' | dhall-to-bash" ]
        , agents = scalaAgent
        }
    , bk.Steps.Wait bk.Wait.default
    , bk.Steps.Command
        { label = Some ":shipit:"
        , commands = [ "./ops/bin/git-tag.sh", "./ops/bin/tag-release.sh" ]
        , agents = scalaAgent
        }
    ]
The above Dhall file has hash sha256:8ce5c8a0c0144bc5ff48b89087e5ef11c3523b4d28db1614ef7715cda1485154

No matter how you refactor it, the hash won't change as long as the normalized value doesn't change.

let bk =
      https://raw.githubusercontent.com/jcouyang/buildkite.dhall/0.1.0/package.dhall sha256:3c5e9eb0182755e85c65d0b16a79b2b0f9614dcffde05151835e3b1daf587e20

let scalaAgent = Some { queue = "ody-lab-scala" }

let main = "master"

let lint =
      { label = Some "lint"
      , commands = [ "shellcheck -x ops/bin/*" ]
      , agents = scalaAgent
      }

let test =
      { label = Some "test dhall"
      , commands = [ "echo '(./app.dhall).version' | dhall-to-bash" ]
      , agents = scalaAgent
      }

let ship =
      { label = Some ":shipit:"
      , commands = [ "./ops/bin/git-tag.sh", "./ops/bin/tag-release.sh" ]
      , agents = scalaAgent
      }

let wait = bk.Steps.Wait bk.Wait.default

in  [ bk.Steps.Command lint
    , bk.Steps.Command test
    , wait
    , bk.Steps.Command ship
    ]
The refactored file results in exactly the same config as the previous one; I'm 100% certain because the SHA is exactly the same:

> dhall hash < .buildkite/pipeline.dhall
sha256:8ce5c8a0c0144bc5ff48b89087e5ef11c3523b4d28db1614ef7715cda1485154

If any of the values actually changes, for instance if I make a typo:

-       , commands = [ "shellcheck -x ops/bin/*" ]
+       , commands = [ "shellcheckasdf -x ops/bin/*" ]

dhall hash < .buildkite/pipeline.dhall

will now output a different hash.

You can even tell what went wrong by comparing with the remote config on the master branch:

dhall diff "./.buildkite/pipeline.dhall" "https://raw.githubusercontent.com/MYOB-Technology/odyssey/master/.buildkite/pipeline.dhall"
[   < …
  . …
  { commands = [ "shellcheckasdf -x ops/bin/*"

  , …
, …

Type System

Dhall has a remarkably powerful type system, which at the type level is more powerful than even Scala's:

Bool : Type  -- The expression `Bool` has type `Type`

Type : Kind  -- The expression `Type` has type `Kind`

Kind : Sort  -- The expression `Kind` has type `Sort`

Scala sits somewhere just around the Kind level.

A powerful type system means you can do more computation at the type level (compile time), which is exactly what a config language needs: we don't need any fancy runtime for a config file, we just need the type system to help us check the correctness of the config.

⚠️ The following example is just to showcase the power of the type system; it is possible in proof-oriented languages like Idris but not likely in Scala.

Type is a first-class citizen: a normal function can consume a Type and return a Type.

let DependentType = ∀(a : Type) → Optional a → Type

The above defines the type of a type; now let's define a type that inhabits it:

let SomeTextOrNatural
    : DependentType
    = λ(x : Type) →
      λ(y : Optional x) →
        merge { Some = λ(z : x) → Text, None = Natural } y

SomeTextOrNatural is a function returning a Type: depending on the value of y, the returned type is either Text or Natural.

It mixes types and values together, which might be a little confusing, but once you figure out the following everything makes sense:

True : Bool : Type : Kind : Sort

  • y is a value because the right-hand side of : is Optional x
  • x is a type because the RHS is Type
  • z is a value because the RHS is x, which is a type
  • merge { Some = λ(z : x) → Text, None = Natural } y returns a Type because Text and Natural have type Type

Now we define some values, yay

let value = "asdf"

let someValue = Some value

And a value which has dependent type:

let someTextOrNatural
    : SomeTextOrNatural Text someValue
    = value

⚠️ someValue is a value, but in type position SomeTextOrNatural Text someValue returns a Type, which could be Natural or Text depending entirely on the value of someValue.

When we change the value of someValue:

let someValue = None Text

let someTextOrNatural
    : SomeTextOrNatural Text someValue
    = value

a compile error is printed, because based on the new value the type of someTextOrNatural is now Natural:

Error: Expression doesn't match annotation

- Natural
+ Text

15│       value

Wrap up

Basically with these two tools, we now can eliminate most of our assumptions.

We have all infrastructure as code; the system runtime is configured as code and immutable once checked into your codebase, which guarantees everyone will have the same runtime for the same commit of code.

Configuration itself is immutable: once it is checked in, we are all confident the pipeline will always be the same for the same commit of code.

Being immutable doesn't mean you can't change the file at all; files are like expressions. As long as the expression results in the same value, you can refactor however you want: change variable names, extract functions, split into multiple files and import them back in. These are all safe as long as the hash stays the same.