Guide for Version Control with Ignition

Hey all,

I've been working on developing some guides on how to use version control with Ignition in a team-based environment. If you've never used git, or just don't understand the nuances of how to use it with Ignition, you may find these documents useful to get you started. These are by no means comprehensive, nor are they a gold standard; they're merely some guideposts as you find what will work best for you.

Note: This isn't designed to be a substitute for studying the intricacies of Git, GitHub (GitLab, etc.), or Ignition.

See a typo or problem, or want to contribute? Submit a PR or an issue!

1. Version Control with Ignition

This guide covers the basics of version control and shows you how to use it with Ignition. Good for learning or just a quick reference.

2. Version Control / Ignition Lab

A hands-on lab to help you apply version control concepts to Ignition projects. Admittedly, it's a bit contrived, but it's a start.

3. Version Control Style Guide

This is a basic style guide for version control in a team-based environment. Everyone has their two cents on how this should be done; here's mine. If you already have a convention, you may not need this.

4. Project Template

This is a sample of what we in Application Engineering use internally on our projects. It's basically a copy of what @Kevin.Collins has put together previously, but all in one place and curated for a small to medium sized project.


@Eric_Knorr

This is great stuff. I really appreciate it!


@Eric_Knorr and IA staff, are you able to share whether we might expect git integration/capabilities, or anything else that would substantially affect this workflow, when 8.3 is released?

  • Don't expect any first-party Git (or any other VCS) integration in 8.3.0.
  • Capabilities will expand significantly, since gateway config is moving to the filesystem in a similar manner to the existing project resource mechanism.
  • Expect some other improvements in the management of resources on disk, such as less hassle with the resource.json file.


@Eric_Knorr @Kevin.Collins would you be able to provide some context around your use of the project template? It's using git to manage a compose stack including a .gwbk and other assets to spool up, correct? But are you using SCM for the gwbk and projects within this? Do you also have a repo for your /data (gwbk) directory? That's what we've been testing as a root dir for our repo, then we can all run local dev gateways and push changes into SCM with a DevOps pipeline to update test gateways.

The way we use this in Application Engineering is pretty much only to source control whatever is currently held in files: projects, any non-standard modules, additional icons, etc. The gateway backup is there to hold everything else. The only difference is that the gateway backup has the projects stripped out of it (using the script in the repo), so that when the restore (-r) flag is run on docker-compose up, it doesn't overwrite the bind-mounted projects folder. I'm working on a way to version control tags using Design Group's Tag CICD module, but it's still a WIP.

To answer your last question, we do not have a separate repo for the data dir. We do mount a volume to hold data, but that's more of a convenience thing to save the gateway data - we haven't had a need to version control any deeper than that. So if a gateway change was made, we would take a new gateway backup, strip the projects, then commit that to the repo.


Can you expand on how y'all use the gateway backup script download-gateway-backups.sh? Do you just manually run it from the host while the container is running prior to the next commit?

Sure, that's exactly correct. After bringing the docker stack up, running the download-gateway-backups.sh script will go through the running containers, take backups, strip the projects, and put them in the gw-init/ dir. I have a short paragraph in the readme about how to use this script as well. :wink:


Ah, I must have missed it. Thanks for this though, this fits nicely in with the workflow I had going.

Just FYI, the script as-is was silently failing to remove the projects/ folder from the zipped gwbk.

Adding an asterisk to the zip -d seemed to do the trick:

...
zip -q -d "${ZIP_FILE}" "projects/*" > /dev/null 2>&1 || \
...

Not sure if this is just some quirk of running the script in WSL.


Looks like an amazing start! Thank you for publishing this. Lots to read through :slight_smile:


Is there any guidance if you’re using project inheritance and want to bring git version control into the game? Are they competitive or complementary?

Hey @wdougmiller, there shouldn't be any issues with project inheritance and version control. Inherited project resources aren't copied into the child project unless the resource is overridden, so there is no duplication of resources that would look funny from a version control perspective. In the case that you do have overridden resources, it will be found in the child project (with any saved changes) as well as in the parent project (as the unchanged version).


How are you guys setting up your workflow with Perspective/Vision projects and Designer? What I'm doing feels clunky, but I'm not too sure how to improve it at the moment. Nothing I'm doing is mission critical and I'm a developer of one, but we've been successful enough with Ignition that it's at risk of becoming otherwise.

  • Ignition is on a server VM, with Azure DevOps for repos.

  • The 'projects' folder of the Ignition install is set up as a git repo, so all projects are contained in one folder - I had it this way to manage the .resources folder. Is this managed now and I'm out of touch?

  • My Designer is connected to the server gateway; I work on 'Dev' projects and then, when satisfied, I push to the server and export data in Designer to the relevant 'Prod' project. This means I could accidentally overwrite a project in production.
    I want to build this export workflow in DevOps, any issue there?

Are people running local gateways for their designer and pulling filesystem locally?

Thanks!

Consider reading this article:

Some general points of advice:

  • You don't need to care about, nor should you commit, the .resources/ folder.
  • I'd generally recommend a git repo per-project, although a "monorepo" is possible. It's going to be a lot more meaningful/achievable in 8.3, though.
  • I would suggest setting up a local dev Gateway, managed through a VM or Docker or whatever, for each developer. Pushes can be initiated from your local repo to a remote hosted on a forge like GitHub/GitLab/Gitea/whatever, or directly pushed from local to the production system. A CI system like Jenkins could be used to ensure the local disk copy on the production VM is up to date with upstream.
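On the first point, a minimal .gitignore for a project repo might look something like this (the .resources/ entry is the one that matters here; the other entries are just common local-noise candidates and entirely optional):

```gitignore
# Designer-managed resource cache; regenerated on every save
.resources/

# Local-only noise (illustrative; adjust to your setup)
*.log
.DS_Store
```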

Thanks! Yeah, I'm not sure at all why I had one repo for everything; it's been a particular pain and not at all how I run my 'regular' software development pipeline. Thanks for the clarification on .resources/ - I think when I set this all up, I just saw changes in that folder whenever I saved and figured they needed to be captured.
Glad to hear that 8.3 will improve the experience if I get that going.

The local repo makes sense as well. I think early on I was doing more work getting devices talking on our network than working on projects, so working directly against the gateway this would all connect to seemed logical, but I like working locally and just moving a connection over when ready.

Thanks for the second set of eyes!

I might keep my Dev projects, though, because being able to open up a Vision client on a custom machine with, like, three different devices on the floor in the exact setup it'll be used on is nice when I have only 15-30 minutes to do so between shifts >_<

@Eric_Knorr Really appreciate these resources! I've been playing around with the project template and I have a few questions:

  1. Are you using the Projects folder to hold all projects in the gateway, or for an individual project? I see that one repo per project is recommended by @PGriffith, but this template seems to work more like a monorepo to me. Obviously there's more than one way to do things; just curious what the intention was.

  2. I would like to use containers for development but still be able to use a VM in production. With this project structure how would I be able to strip out the project files for production? I played around with Git submodules so the project/projects and the project template have their own repos. It works but adds an extra layer of complexity.

I'm still pretty new to containers/docker workflows so any feedback is appreciated. Thanks!

Hey Trevor,

  1. If the objective I'm working on has more than one project, such as an inherited project, I would create a bind mount for each of the projects, but I would keep this to only the projects related to the objective. Some of our initiatives, such as the Public Demo, have many projects in one gateway; we use one repo to hold all of those projects.

  2. This broaches a much larger topic about deployment.

    • In general, development can be done using containers, and a gateway backup, project export, or resource export could be leveraged to move this across to other VM-hosted gateways. This could probably be done via EAM if you get your docker networking set up right.
    • Once things are in a git repository, they can be spun up in a production environment using docker compose, or you can use a derived Ignition image and deploy that image to test/production.
    • If this isn't an option at this time, you could look into git sparse-checkout (example) to check out only the projects into another VM-based gateway. I have not tested this and do not recommend trying it on production until the process has been fully vetted. FWIW, I've looked into using submodules in the past, and they've always seemed to add more overhead and complexity than they're worth.
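For anyone curious what that sparse-checkout route might look like, here's a self-contained sketch (the repo layout and names are invented for illustration; as I said, vet this thoroughly before using it anywhere near production):

```shell
set -euo pipefail

# Stand-in "template" repo: projects plus other assets that a
# VM-based gateway wouldn't need.
git init -q src
mkdir -p src/projects/MyProject src/gw-init
echo '{}' > src/projects/MyProject/project.json
echo 'stripped backup' > src/gw-init/gateway.gwbk
git -C src add .
git -C src -c user.email=demo@example.com -c user.name=demo commit -qm 'initial commit'

# Clone with nothing checked out, then materialize only projects/
# (cone-mode sparse checkout, git >= 2.26).
git clone -q --no-checkout src deploy
git -C deploy sparse-checkout set projects
git -C deploy checkout -q

ls deploy   # only projects/ lands on disk; gw-init/ stays in history
```

On a real gateway the clone destination would be the gateway's projects location (or somewhere you sync from), with the remote pointing at your hosted repo instead of a local path.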

  1. That makes a lot of sense. I would prefer to keep the projects folder limited to only related projects. I will probably create separate repos for anything gateway scoped (e.g. tags, gateway backups, images, etc.) and manually move those resources into the project repos when necessary.

  2. I would like to eventually run containers in production, but for the time being I will most likely perform manual project exports between different VM environments. I also played around with git sparse-checkout and it seemed to be more trouble than it was worth.

Appreciate the response!