Most of the internal tools mentioned below are not open-sourced, and they depend heavily on multiple services and providers that we use within Kiwi.com. Because of that, we cannot share the specifics, but we hope that this article can serve as inspiration for your future setup for the deployment and development of Cloud Functions and beyond.
If you are interested in these tools and would like to see them open-sourced one day, feel free to give us feedback.
When do we use Cloud Functions?
Kiwi.com employs more than 400 software engineers divided into multiple teams that are taking care of various complex operations.
At the same time, we produce a variety of external and internal projects on a daily basis: Slack bots, simple APIs, backup tools, shippers, transformers, various cron jobs, etc. For these types of projects, we (mostly) use Cloud Functions.
At the moment, there are more than 30 Cloud Functions actively developed within Kiwi.com, owned by multiple teams, and new ones are coming up regularly.
How do we deploy Cloud Functions?
Our developers started using Cloud Functions more and more, seeing them as a very useful place to host the kinds of projects mentioned above. Due to the increased demand for Cloud Functions, we decided to help our developers and, at the end of the day, make them more productive.
Since we are using self-hosted Gitlab and Vault, and we love Gitlab CI/CD, we developed two internal tools that any of our developers can use to start a new Cloud Functions project. Let's call them sls-cookiecutter and sls-projects:
- sls-cookiecutter: a cookiecutter template that generates simple code examples and CI jobs
- sls-projects: a Terraform repository that takes care of the credentials, policies, Vault paths, CI/CD variables, etc.
The main goals of these tools are:
- Decrease the learning curve for someone who is just entering the world of Cloud Functions; in a huge software company like Kiwi.com, optimizing time spent on frequent tasks means everything
- Simplify the development and deployment of Cloud Functions
- Give developers some useful examples to motivate them
- Take care of security: we did not want developers to share credentials among themselves
- Grant the minimum possible permissions to those projects
- Have an overview of all the Cloud Functions in the company
With the tools mentioned above, the workflow for someone who wants to start a new Cloud Functions project looks as simple as this:
- Create a new Gitlab repository
- Use sls-cookiecutter to bootstrap your project
- Push your bootstrapped code to the Gitlab repository
- Take the patch from the sls-cookiecutter output, adjust it accordingly, and create a merge request to sls-projects
- Wait for merge request approval and apply changes in sls-projects
- Deploy your project directly from Gitlab CI/CD
- Check your new deployed project!
This means that after ~5 minutes, a developer will be able to deploy their new project directly from Gitlab CI/CD, without dealing with any service accounts, credentials, Google Cloud SDK, or anything of the sort.
Keep in mind that these steps deploy only a simple, workable version of the new Cloud Functions project, one that returns just a trivial response. The real magic happens only after developers commit their own work, but the main point was to be able to start a new project quickly.
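To illustrate what such a freshly bootstrapped function amounts to, here is a minimal HTTP Cloud Function handler in Python. This is our own placeholder sketch, not the actual code generated by sls-cookiecutter; the function name and response text are assumptions.

```python
# Minimal HTTP Cloud Function handler.
# Hypothetical placeholder, not the code sls-cookiecutter generates.
def main(request):
    """Entry point invoked by Cloud Functions for each HTTP request."""
    # A freshly bootstrapped project only needs to prove it deploys,
    # so a static response is enough; real logic comes later.
    return "OK from the example project!"
```

Cloud Functions passes a Flask request object to the entry point; since this sketch ignores it, you can exercise it locally by calling `main(None)`.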
But, how does it work?
As mentioned above, we developed two main tools, which we describe below so you can understand what kind of magic happens in the background.
Additionally, we will describe how an example project works after it has been configured with given tools.
sls-cookiecutter is built on top of cookiecutter, a Python command-line utility for generating projects from templates.
What happens in sls-cookiecutter is described in the diagram above. After triggering the sls-cookiecutter generator, a developer is asked to provide some basic inputs like project_name, author, language, dev_stage, prod_stage, cron, etc. The developer can choose between different programming languages (Python, Go, or Node.js), whether they want their function to run as a cron job, which stages they want enabled, and so on.
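In cookiecutter templates, these prompts are driven by a cookiecutter.json file, where a list value becomes a choice prompt. A hypothetical sketch of what such a file could look like for sls-cookiecutter (all keys and values here are illustrative assumptions, not the real internal template):

```json
{
  "project_name": "my-function",
  "author": "Your Name",
  "language": ["python", "go", "nodejs"],
  "dev_stage": "y",
  "prod_stage": "y",
  "cron": "n",
  "cron_schedule": "0 * * * *"
}
```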
After all the fields are provided, sls-cookiecutter outputs two main things. On one hand, there is a directory with generated files, where we can find the Gitlab CI config file, basic functions, Terraform files, and helpers. On the other hand, there is a code patch that one can use to create a merge request to sls-projects. The patch is Terraform code that has to be slightly adapted before it is pushed to sls-projects.
Before creating a merge request to sls-projects, one should also create a new Gitlab repository and push the newly generated code to it. This has to be done because sls-projects needs a Gitlab repository to properly set up CI variables.
sls-projects is a Terraform project that takes care of the configuration needed by our Cloud Functions projects.
After the merge request gets approved and applied, sls-projects configures the cloud identity (service accounts and roles) in the GCP project and Vault, configures Vault secret paths, and sets environment variables in our project's Gitlab repository. The whole deployment process depends on the environment variables we add to the Gitlab repository, but we will get to that below when we describe how an example project works as a whole.
A developer can also set different GCP projects for different stages, require special execution policies, etc.
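To give a flavor of what such per-project configuration might contain, here is a hedged Terraform sketch combining the Google, Vault, and Gitlab providers. Every name, path, and role below is an illustrative assumption; the real sls-projects code is internal and more involved.

```hcl
# Hypothetical sketch of per-project configuration in sls-projects.
# All identifiers are illustrative, not the internal implementation.

# Dedicated service account for the project, with minimal roles.
resource "google_service_account" "example" {
  project    = "example-gcp-project"
  account_id = "sls-example"
}

resource "google_project_iam_member" "example_deploy" {
  project = "example-gcp-project"
  role    = "roles/cloudfunctions.developer"
  member  = "serviceAccount:${google_service_account.example.email}"
}

# Vault GCP secrets engine roleset that issues short-lived
# service account keys for deployments.
resource "vault_gcp_secret_roleset" "example" {
  backend     = "gcp"
  roleset     = "sls-example"
  secret_type = "service_account_key"
  project     = "example-gcp-project"

  binding {
    resource = "//cloudresourcemanager.googleapis.com/projects/example-gcp-project"
    roles    = ["roles/cloudfunctions.developer"]
  }
}

# CI variable injected into the project's Gitlab repository,
# telling the deploy job where to fetch credentials from.
resource "gitlab_project_variable" "vault_path" {
  project = "group/example"
  key     = "VAULT_GCP_PATH"
  value   = "gcp/key/sls-example"
}
```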
Now, let’s imagine we have an “example” project that was configured using sls-cookiecutter and sls-projects and everything is already pushed to the Gitlab repository.
Most of the magic happens in functions.tf and .gitlab-ci.yml files.
Here, we define one or multiple Cloud Functions, where each function is defined as a Terraform module. To avoid repeating ourselves, we created a Cloud Functions module that is open-sourced and can also be used in your projects.
With the Cloud Functions module, you can easily set up your functions, their dependencies, triggers, and so on. You can configure simple HTTP functions, cron jobs, or functions triggered by other Google Cloud resources like a Storage bucket. If you want to know more about it, check the GitHub repository.
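As an illustration, calling such a module from functions.tf might look roughly like this. The module source and argument names are assumptions based on common Terraform module conventions, not the actual interface; check the repository for the real one.

```hcl
# Hypothetical usage of a Cloud Functions module in functions.tf.
# Source and argument names are illustrative assumptions.
module "hello_function" {
  source = "git::https://github.com/example/terraform-cloud-function.git"

  name         = "hello-${var.stage}"
  runtime      = "python38"
  entry_point  = "main"
  trigger_http = true
}
```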
Deployment jobs are specified in .gitlab-ci.yml, and even though they look simple at first sight, they depend on the included .gcp_sls_deploy, where most of the magic happens.
During each deployment, the following steps happen:
- Pull a Docker image with all the dependencies (Terraform, Vault, etc.) preinstalled
- Set environment variables based on the existing Gitlab CI variables and the current stage
- Request temporary credentials from Vault
- Initialize Terraform with temporary credentials and custom state bucket
- Deploy project using terraform apply
- Remove the temporary credentials from the Docker image and revoke them in Vault
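Put together, a project's .gitlab-ci.yml and the hidden deploy template could be sketched roughly like this. The include path, image name, job names, and script commands are illustrative assumptions; the real .gcp_sls_deploy template is internal.

```yaml
# Illustrative sketch, not the internal template.
include:
  - project: "automation/ci-templates"   # hypothetical location
    file: ".gcp_sls_deploy.yml"

# Hidden base job, normally defined inside the included template.
.gcp_sls_deploy:
  image: registry.example.com/sls-deploy:latest  # deps preinstalled
  script:
    # Fetch a short-lived service account key from Vault; the GCP
    # secrets engine returns it base64-encoded in private_key_data.
    - export GOOGLE_CREDENTIALS="$(vault read -field=private_key_data "$VAULT_GCP_PATH" | base64 -d)"
    # Initialize Terraform against the project's own state bucket.
    - terraform init -backend-config="bucket=$STATE_BUCKET"
    - terraform apply -auto-approve

# Concrete deployment job in the project's .gitlab-ci.yml.
deploy-production:
  extends: .gcp_sls_deploy
  variables:
    STAGE: production
  when: manual
  only:
    - master
```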
Cloud Functions at Kiwi.com are on the rise and are becoming a handy way for our developers to build and deploy simple projects. We wanted to make developers' lives easier and give them the ability to quickly set up a new Cloud Functions project, together with all the accompanying resources like CI templates, Vault secret paths, etc.
To achieve this, we developed two main tools, sls-cookiecutter and sls-projects, and we enjoy seeing how developers use them and how quickly this way of working caught on.
We hope this article gives you some ideas on how to tackle your future development of Cloud Functions. At this point, neither of these tools is open-sourced, but we will be more than happy if you give us some feedback, an opinion, or just an example of how you deal with the deployment and development of Cloud Functions.
Are you interested in similar articles from Kiwi.com writers or in our open positions? Check our open positions and subscribe to the code.kiwi.com newsletter to stay informed about upcoming events, new articles, and community activities.