When an engineer I work closely with first suggested that we work on the Service Telemetry Framework docs upstream and then synchronize them downstream, I was enthusiastic but pretty clueless about how to even begin.
There are lots of advantages to single-sourcing your content: you and the engineers save time by working on a single set of docs, the quality of the upstream docs improves, and when something changes, you only have to update it in one place.
Let me walk you through how I approached creating this upstream to downstream pipeline, step by step:
Reach out: The first step I took is the obvious one. I turned to my Red Hat colleagues for some direction. Nicole Baratta shared the Community Collaboration Guide, while Melanie Corr walked me through the steps involved and told me what I needed to know in order to get started. It’s a good idea to talk to your Documentation Program Manager (DPM) or people manager to find out if there is someone else on your team already working upstream.
Create an upstream repo for the docs in your git-based repository hosting platform: Here is the one for the Service Telemetry Framework (STF) documentation that I worked on in GitHub. If you are not familiar with how to set up a repository, see the how-to guides for the git-based repository hosting platform you are using.
Think about the content: That was the easy part for me. I had my structure thought out in advance and the content was already structured in a topic-based, modular way. The source of truth is upstream, so I copied across the content to my new upstream repo and organized it according to my pre-planned structure.
Think about attributes and conditional formatting: I used conditional formatting to allow for upstream and downstream names. I kept the number of attributes small and made the engineers aware of them. When syncing, the build attribute is set to downstream so that the correct terminology is published in the downstream location.
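As a sketch of what that kind of conditional formatting can look like in AsciiDoc (the attribute name and product names here are hypothetical examples, not the actual STF source):

```asciidoc
// Set at build time, for example: asciidoctor -a build=downstream ...
ifeval::["{build}" == "downstream"]
:Project: Red Hat OpenStack Platform
endif::[]
ifeval::["{build}" != "downstream"]
:Project: OpenStack
endif::[]

Install the collector on your {Project} nodes.
```

With this pattern, the body text references only the `{Project}` attribute, and a single build flag switches all occurrences between upstream and downstream terminology.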
Agree on a process: The engineers I work with learned the AsciiDoc format quickly. Learning a particular format isn’t necessary for a successful doc sync, but it certainly helps. We had several meetings about how best to work upstream. Other teams worked on forked repos, but the engineers and I cloned the new repository directly. I created a branch, made my suggested changes, committed and pushed them, and opened a pull or merge request. I usually assigned at least one engineer to review the content. We agreed that every change requires approval from at least one member of our team. Once the engineers approve the changes, the content is ready for peer review. The upstream documentation is published automatically on merging to the master branch.
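The branch-and-review loop described above can be sketched as a shell session. This version runs against a throwaway local repository so it works anywhere; the file name, branch name, and commit messages are hypothetical:

```shell
#!/bin/sh
# Sketch of the branch/commit/review workflow, run in a throwaway
# local repository (file and branch names are hypothetical).
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "writer@example.com"
git config user.name "Docs Writer"

echo "= Service Telemetry Framework" > index.adoc
git add index.adoc
git commit -q -m "Initial docs"

# Create a topic branch for the suggested change
git checkout -q -b fix-terminology
echo "Some corrected content." >> index.adoc
git commit -q -am "Fix downstream terminology"

# In the real workflow you would now push the branch and open a pull
# or merge request, assigning at least one engineer as reviewer
git log --oneline
```

The pull or merge request itself is opened in the hosting platform's web UI (or its CLI, if it has one) once the branch is pushed.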
Get scripting: It’s a good idea to write a script for things that you'll do over and over. I use a simple Bash script with Git commands to sync the changes. I didn't have enough knowledge of Bash to write this script so I reached out to a colleague. When in doubt, reach out! Your script should be able to detect the changes you made upstream and copy them to the downstream repository.
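A minimal sketch of such a sync script, assuming both repos are already cloned locally and the docs live under doc/ in each. The directory layout and commit message are assumptions for illustration, not the actual STF script:

```shell
#!/bin/sh
# Minimal sync sketch: mirror the upstream doc/ tree into a local
# downstream clone and commit the result. Git itself detects what
# changed when the copied files are staged.
sync_docs() {
  upstream="$1"
  downstream="$2"
  # Mirror upstream docs into the downstream clone (this also removes
  # files that were deleted upstream)
  rm -rf "$downstream/doc"
  cp -R "$upstream/doc" "$downstream/doc"
  # Stage and commit; the real workflow would then push and open a
  # merge request for review
  git -C "$downstream" add doc
  git -C "$downstream" commit -m "Sync docs from upstream" \
    || echo "No changes to sync"
}
```

Keeping the copy step dumb (mirror the whole docs tree) and letting Git work out the diff is simpler and less error-prone than trying to track individual changed files yourself.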
Sync those docs: If you can, review the changes the script will make first without actually applying them. Run your script. Your changes have now been pushed to the downstream repository in the git-based repository hosting platform. Pull the changes to your local downstream branch and build your document. Ensure that everything is in order before you rebuild the document in the publishing tool and publish.
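One way to do that review-then-pull step, sketched as a function against a local clone. The remote and branch names (origin, main) are assumptions about your setup:

```shell
#!/bin/sh
# Sketch: fetch incoming synced changes, review a summary before
# applying them, then fast-forward the local downstream branch.
# Remote/branch names (origin, main) are assumptions.
review_and_pull() {
  repo="$1"
  git -C "$repo" fetch -q origin
  # Summary of what pulling will change locally
  git -C "$repo" diff --stat HEAD..origin/main
  git -C "$repo" pull -q --ff-only origin main
}
```

The `--ff-only` flag makes the pull fail loudly instead of creating a merge commit if the local downstream branch has somehow diverged, which is usually a sign something went wrong with the sync.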
The best part of this whole process was working with the engineers. The engineers I work with jumped on AsciiDoc and also ensured that the upstream doc published correctly. They are so knowledgeable, patient, and keen to deliver quality content.
What I found the most difficult was getting started. While the Community Collaboration Guide is a valuable resource, it didn’t have all of the information I needed to make a start. Thankfully, a quick video chat session with a colleague, Melanie Corr, put me on the right path. SSO writer Andrew Munro sent me the SSO upstream-downstream workflow document that his engineering team wrote.