Episode 30

Stack/Unstuck: Testing, PDFs, And Donkeys

Show Notes

We reach our penultimate episode of Stack/Unstuck, and arrive at the topic of testing. Testing isn’t necessarily part of any technology stack, but it is a vital part of building software. Sometimes it can feel like testing is an afterthought, or just a box for busy coders to tick once the work is done.


We hear from our guests about how testing doesn’t need to be saved for a curtain call. It can have a starring role in identifying problems within different components of a software stack. As we include it more in discussions and planning, and start thinking about it earlier in development cycles, testing can further an application’s potential and help teams build better software.


00:00 — Kim Huang
Recently I met with Schaffer Stewart. He's a senior software engineer at LifeOmic, but when he is not doing that, he's spending his time contributing to civic coding projects.
Right now, Schaffer is working with a group called Open Raleigh Brigade on a particular project for campaign finance.

00:20 — Schaffer Stewart
The goal of the project is to make the data more accessible.

00:23 — Kim Huang
Voters often research candidates before they head to the polls, right? But finding information about donations, who donated to whom, what organization, and how much money, can be tricky. A person can go to the Board of Elections website, but in this case, it's confusing. There are lots of things that only an expert in campaign finance would be able to understand. And let me tell you, the user experience does not get much better from there.

00:51 — Schaffer Stewart
Even if you do manage to get results, then you're given this page that's a list of 10 different campaign finance filing documents, and you have to know which out of that list you want, which document set is actually the contributions.

01:07 — Kim Huang
So the problem was well established. Schaffer and his team wanted to build an MVP—a minimum viable product that would then be tested against the original version for usability and accuracy.

01:22 — Schaffer Stewart
So now we have these user stories and this idea, and so it's trying to figure out how we can start building out an MVP and testing the user stories, and testing the idea, and see what's possible.

01:33 — Kim Huang
User stories are a type of documentation. They're written to capture functional requirements based on things a user would want to do with whatever the team is building.

01:43 — Schaffer Stewart
They were very simple things along the lines of, I should be able to search for a committee or a candidate by name and just see all the contributions associated with them.

01:56 — Kim Huang
But what does one find when testing software and what effect can testing have on the software stack?

02:06 — Brent Simoneaux
This is Compiler, an original podcast from Red Hat. I'm Brent Simoneaux.

02:12 — Angela Andrews
And I'm Angela Andrews.

02:14 — Brent Simoneaux
We're taking you on a journey through the software stack. We call the series Stack/Unstuck.

02:20 — Angela Andrews
Today's episode, we are talking about testing. This is one episode of a series. So if you want to listen from the beginning, you can start from our episode, The Great Stack Debate.

02:36 — Brent Simoneaux
Let's go to producer Kim Huang for our story. We're talking about testing today. And Kim, this might sound like a silly question because I think I know what testing is, but in this context, what is testing?

02:53 — Kim Huang
Simply put, software testing is something that is done, ideally, before something goes to market or is released for public use, right? You want to make sure that there are little to no bugs. I know it's sometimes unavoidable that bugs come up after a product release, but you want to make sure that the software that you design is doing what it's supposed to do, and it's not, as they say, fundamentally broken on release.

03:19 — Angela Andrews
I think testing is super important. I mean, I know we're talking about stacks here, and this really isn't part of a stack specifically, but it's so integral to every single part of the stack. I mean, I think it shows we want to do testing because we want to make sure that we're getting it right. We want to make sure that we're giving our users the experience that we expect them to have. So it's one of those things that you really can't, or shouldn't, live without.

03:52 — Brent Simoneaux
I've got a couple of questions here.

03:54 — Brent Simoneaux
One, who is doing the testing?

03:58 — Kim Huang
That depends.

03:59 — Angela Andrews
I was going to say my favorite answer, it depends.

04:02 — Brent Simoneaux
It depends. Okay.

04:04 — Angela Andrews
Yeah, it does.

04:05 — Brent Simoneaux
Well, say more about that.

04:06 — Angela Andrews
Well, some teams actually have testers who do testing because they're not the ones that are writing the code, they're just testing it to see that it behaves the way that the programmer said it was going to behave. So they do quality control. They're like, "Does it do X, Y, and Z?" Sometimes developers write their own tests to make sure that their code behaves the way that they expect it to behave. So I guess it depends, is really the best answer for who does the testing.

04:41 — Brent Simoneaux
And at what part of the development process do we test?

04:48 — Angela Andrews
All of them.

04:50 — Brent Simoneaux
All of them. Okay. So it's sort of like you continuously do this throughout the process.

04:53 — Angela Andrews
I think you do to some degree.

04:56 — Angela Andrews
Now, if we're talking about pipelines, there is a section inside of a CI/CD pipeline, that is, continuous integration and continuous delivery, where the testing happens. But I think back to when I was in bootcamp, where I would write code and I would do little tests to make sure that what I wrote would produce the result that I expected. So it was just me hacking away on my keyboard, making sure that this would really run. Does it really account for all of the variables that could quite possibly be entered here? So like I said, testing happens all over the place.
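The quick, informal checks Angela describes can be as small as a helper function and a couple of assertions run while you code. A minimal Python sketch, with a function and inputs invented purely for illustration:

```python
def normalize_name(raw):
    """Collapse runs of whitespace and trim the ends."""
    return " ".join(raw.split())

# The little checks: run them every time you save, long before any
# pipeline sees the code.
assert normalize_name("  Jane   Doe ") == "Jane Doe"
assert normalize_name("") == ""
```

In a real project, ad hoc checks like these usually graduate into a test suite that the CI/CD pipeline runs automatically on every change.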

05:39 — Brent Simoneaux
So testing isn't technically part of this software stack?

05:45 — Brent Simoneaux
Kim, I want to ask you, why are we talking about it in this series then? Why is testing important to this particular conversation that we're having?

05:54 — Kim Huang
Well, there can be a lot of friction inside of teams about what to test and when to test it. Just because you're trying to get something out there, trying to get an iteration of your product, kind of like what Schaffer was talking about in the first part of the episode. You're trying to get something out there as soon as possible, and you feel this kind of pressure, especially if you're a product owner, for example. You feel a pressure to get things out there very quickly.
So it's easy to forgo the lowest rung on the ladder because you know you can just skip that. But it's not really advantageous to anyone to skip something like testing. It should be the opposite: testing should not only be a part of every aspect of software development, it should also inform the development itself.

06:47 — Brent Simoneaux
All right. We've got our definitions.

06:49 — Angela Andrews
We have level set.

06:51 — Brent Simoneaux
We've level set. Now, let's dig in.

06:53 — Kim Huang
Okay. When I first started looking into making a testing episode for Stack/Unstuck, I went to the place where I normally go when I am trying to find out anything. I went to my search engines. I went to Wikipedia. I went to different websites, different articles, and I kept seeing a name over and over again.

07:24 — Lisa Crispin
My name is Lisa Crispin, and I'm a testing consultant, an author, a trainer, a speaker at conferences, and a lover of donkeys.

07:33 — Angela Andrews
Donkeys? Okay.

07:36 — Angela Andrews
Like Shrek donkeys.

07:38 — Kim Huang
Correct. Lisa Crispin is a person who raises donkeys.

07:42 — Angela Andrews
And knows a lot about software testing.

07:44 — Kim Huang
Yes, a lot.

07:47 — Lisa Crispin
Well, we have a little farm and we have four rescue donkeys that we take care of, and they pull carts and wagons that do work around the place, as well as just generally being cute and entertaining us and being nice companions.

07:59 — Angela Andrews
That's awesome.

08:00 — Kim Huang
I know. I love it. Lisa says there are a number of reasons why testing is crucial to software development, and not just in a practical sense.

08:12 — Lisa Crispin
I've actually worked with developers who were like, "Well, why do we need to test this?" "Really? Are you sincerely asking me that question?" And they were. And a lot of times they just see it as, "Oh, that's a less valuable activity, and people who are lower paid than I am should do that," which is not a great attitude.

08:31 — Angela Andrews
No way.

08:33 — Lisa Crispin
These days we really have to be concerned with ethics and we have to be concerned with regulations and laws.

08:41 — Angela Andrews
Wow. That's an interesting take because I could see someone saying that because they think that what they're doing is just so self-important. But these are all really important jobs and really important tasks. And without good tests, you're shipping crappy code. Seriously.

09:00 — Kim Huang
Yes. And crappy code carries a lot of risks, right?

09:07 — Kim Huang
And remember what Lisa just said about all of those regulations and laws, and ethics. Well, the world is only getting more and more robust with those types of laws as it pertains to IP, as it pertains to security. And not being in compliance with those laws can obviously be very bad.

09:26 — Lisa Crispin
A company could easily go out of business because let's say their code is not secure and somebody breaks into the database. There's just so many different ways. Or maybe their user interface is not accessible to people with disabilities, and there are regulations around that, that they're not meeting. And so they get fined.

09:43 — Angela Andrews
So I think we answered the question. That's why we test.

09:46 — Kim Huang
Yes. Releasing untested software and applications is a pretty risky proposition. But we talk about tests typically in the context of QA and end user testing, what the person using the software sees. What does this have to do with the stack that a development team can use?

10:10 — Lisa Crispin
Maybe you're testing at a unit level, but a lot of the problems don't come out because they occur between the different levels of the stack. It's when you go from the user interface all the way down to the database that a problem happens.

10:22 — Angela Andrews
Let's unpack this. So she just said something really telling. We're so focused on the front end and what users are experiencing and things like that. But testing has to go a little bit deeper than what you see, because she said it needs to happen all the way to the database. So think about forms that we see on websites where we're entering our information or whatever. That information has to be passed securely. We shouldn't be able to put any old character into a field and do some sort of SQL injection on a database and take it down. We need to test to make sure that what this application is doing is not going to cause harm. And we have to test for that.
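Angela's SQL injection example is exactly the kind of failure a test between layers can catch. Here's a minimal Python sketch using the standard library's sqlite3 module; the table, data, and malicious input are all made up for illustration (the project in this episode uses Postgres, but the parameterized-query idea is the same):

```python
import sqlite3

# In-memory database standing in for the real one; the table and its
# contents are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE donors (name TEXT)")
conn.execute("INSERT INTO donors VALUES ('Jane Doe')")

# Unsafe: splicing user input straight into the SQL string invites injection.
#   query = f"SELECT * FROM donors WHERE name = '{user_input}'"

# Safe: a parameterized query treats the input as data, never as SQL.
user_input = "Jane Doe'; DROP TABLE donors; --"
rows = conn.execute(
    "SELECT * FROM donors WHERE name = ?", (user_input,)
).fetchall()

assert rows == []  # the malicious string matches no donor
# ...and the table is still there:
assert conn.execute("SELECT COUNT(*) FROM donors").fetchone()[0] == 1
```

A check like this exercises the path from input handling all the way down to the database, which is where Lisa says the problems tend to hide.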

11:16 — Brent Simoneaux
It seems like she's also saying that if you're just testing in one part of the stack, you're not going to discover all the problems.

11:25 — Brent Simoneaux
Some of these problems occur between the different components of the stack.

11:31 — Kim Huang
Let's talk about that concept of testing at the unit level. Maybe in this scenario, a team can't test the API level for whatever reason, but they can test at the UI level. Those tests can make development easier. Maybe it lessens the amount of errors and bugs that come back when they actually try to compile everything to make sure that everything can work.
So if these types of tests can be written with each layer of the stack in mind, just like Angela was talking about at the beginning of the episode, how she was in bootcamp, a little bit of writing here, a little bit of code there, a little bit at a time makes for a more effective and more efficient development process.

12:12 — Brent Simoneaux
So this makes a lot of sense to me, and that makes me wonder why everyone doesn't do this.

12:22 — Angela Andrews
Maybe because it slows down your work. You know what I mean?

12:28 — Angela Andrews
Because if you're working on something and you are head down and you have to really think about how to error proof and find your bugs, well that takes away from putting your head down and writing your code. But I think we should give it just as much weight as we do anything else because quality is key. Security is key. Being compliant is key. And if we kind of coupled that into our code writing process, I think it would feel less burdensome.

13:02 — Kim Huang
I agree. They do say that time is money, and I understand that sometimes testing code means you're not writing it. But at the same time, you only get one chance to make a first impression. And when you want to launch a product, if you want to launch an app, you want the best experience possible for whatever user or whatever customer you're trying to serve. Lisa does go a bit further about testing. She says that testing isn't just beneficial for development, it can even be beneficial for teams before work even starts.

13:40 — Lisa Crispin
A great example is frameworks like React come with testing frameworks like Jest, which give a lot of advantages over others. And so when a team is thinking about what coding framework to use, they need to look at what testing framework it's using too, for the unit-level tests, and for things like API-level testing or workflow testing at the user interface level.

14:03 — Kim Huang
Testing is important, but not just when something is ready to be released. It should be something that, to Angela's point before, is at every stage of the development process. So unit-level tests can help catch problems at different layers of the stack. And the availability of testing support and frameworks can help teams choose the technologies that they use.

14:28 — Angela Andrews
I think that's interesting because you can decide, "Well, I want to do X and I want to use this language." You're now doing your homework and saying, "Well, does it come with a testing framework?" That kind of makes it easier because you are working within a language that you're already really familiar with, hopefully.

14:52 — Angela Andrews
Right? I think that's interesting that now these frameworks have their own testing framework. I would guess it would make your job a little bit easier.

15:01 — Kim Huang
Yes. So coming up, we go back to Schaffer. He and his team are starting to delve into the build, and they are going to discover that testing can change a project midstream. So at this point of the story, Schaffer and his team are putting together that campaign finance dashboard. Specifically, they're working on user stories. The team asked each other important questions. Who was going to be the primary user for this tool? What do they want to see in what they're using? Schaffer did talk about voters earlier, but even at the beginning, things started to change.

15:51 — Schaffer Stewart
If there are lay people using the tool and things are off by a little bit, or one or two records aren't quite right, that's not the end of the world. But one of our target users is researchers and journalists. So if these people are using the tool and stuff isn't 100% accurate, then the tool is not really useful for them, because they have to look at the data in the tool and then go back and double-check everything against the Board of Elections, which is sort of a time-consuming process.

16:21 — Kim Huang
So the first thing that the Open Raleigh Brigade team decided was to clean the data that they already had available. That was so development could start fresh.

16:32 — Schaffer Stewart
Having good data is critical. We have to make sure that during our cleaning process, we haven't done anything to misrepresent the data or introduced any inaccuracies into the data.

16:45 — Kim Huang
They started what they call a de-duplication process.

16:49 — Kim Huang
And those processes can require scripts to be written, at least to save time, so that people are not doing all of this manually.

16:58 — Angela Andrews
Manually, yeah.

16:59 — Kim Huang
And it also cuts down on errors, right? Because human error is always a factor. Schaffer stepped forward with his experience and expertise to help with this early on, and he wanted to use the technology that he knew to test and see if this could work and address some of the accuracy problems that were happening with the early iterations of the dashboard.
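A cleaning script like the one Kim describes can be sketched in a few lines of Python (the language Schaffer's team eventually landed on, as he explains below). The field names and records here are invented for illustration; the real pipeline against Board of Elections exports would be more involved:

```python
import csv
import io

# Toy contribution records standing in for real campaign finance data.
raw = io.StringIO(
    "donor,committee,amount\n"
    "Jane Doe,Committee A,100\n"
    "JANE DOE ,Committee A,100\n"   # same record, messy formatting
    "John Smith,Committee B,250\n"
)

def dedupe(rows):
    """Drop rows that are identical after normalizing case and spacing."""
    seen = set()
    unique = []
    for row in rows:
        key = tuple(" ".join(value.split()).lower() for value in row.values())
        if key not in seen:
            seen.add(key)
            unique.append(row)
    return unique

clean = dedupe(csv.DictReader(raw))
assert len(clean) == 2  # the messy duplicate of Jane Doe is gone
```

The point Schaffer makes still applies: a script like this has to be tested carefully itself, so the cleaning step doesn't introduce inaccuracies of its own.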

17:24 — Schaffer Stewart
And so in this case, I was helping out with it. I know Node.js and Postgres, and so those were the tools I picked to try and get it done.

17:33 — Kim Huang
So: Node.js, which is a runtime environment for JavaScript, and Postgres, which is for the database layer. He started working with those tools and then ran some early tests, little incremental tests, to see if things were working at the time.

17:52 — Schaffer Stewart
The MVP gave us enough data that we could take it and start building a dashboard around it, but it didn't really scale well enough as we tried to add additional data. Node and Postgres weren't really the right tools for that, so we did a second iteration, which was mostly Python. So that took us a little bit further. We were able to get more data. It worked better, it worked faster. It was just kind of nicer to use.

18:14 — Angela Andrews
So they had to find the right tool for the job.

18:17 — Kim Huang
Yes. And in order to do that, they had to stop and test the work that they'd already done thus far.

18:23 — Angela Andrews
And they found out what they were doing wasn't working how they expected.

18:28 — Angela Andrews
Well, that's the good part about starting early and not getting so deep into the process and having to change tools.

18:38 — Brent Simoneaux
So Kim, how else did testing inform their work?

18:43 — Kim Huang
Schaffer talks more about testing here and a lot of what he says is very indicative of the nature of civic coding projects.

18:52 — Schaffer Stewart
So I think mostly it's just as we're building, we're testing these things to see if they work as we expect. During our regular meetings, we often would demo the project to show where we currently are, and that was also a good opportunity for people to play around with it and test it. You're on a Zoom call with 10 or 15 people and you kind of show them the tool and then most of them will pull it up on their own computer and start plugging things in.
And then they'll hit an error and they'll let you know right there in the call, testing and seeing what happens if they misspell someone's name or throw a typo in there, that sort of thing. And based on that sort of feedback, refining, fixing bugs or small behaviors that don't really make sense, adjusting those.

19:32 — Angela Andrews
Well, that's pretty smart because having your team all plugging away, trying to pick something apart and someone finds something, one, you have a bug and you can file a bug on Bugzilla. I'm using that term, but you can file a bug and this is something that you have to program out of your code because it's behaving in such a way you don't want this to happen. So it's cool that they have this ongoing process. And because testing is so important, you have to do it like this. You have to pick it apart, move one piece at a time, and at some point you're going to find out you're going to have an application with a lot less bugs in it.

20:14 — Kim Huang
Exactly. You can even use testing to identify what's needed to protect people's information. Things like addresses, phone numbers, items that people may not want everyone to know.

20:28 — Schaffer Stewart
We don't include that information in our dashboard because we don't want to make it any easier for people to abuse that sort of information. So yeah, it's a lot of just testing it internally and figuring out what we do and don't like and what we do and don't want to see.

20:43 — Angela Andrews
They want their dashboard to produce X and anything outside of X is not what it was meant to do. So they're trying to keep the scope super small.

20:54 — Kim Huang
It is a tight line to walk.

20:56 — Angela Andrews
Yes. People's personal information, you do have to be super careful with exposing that.

21:04 — Kim Huang
So now that there's an MVP in place, they've been working on this at this point for many, many months. I think since 2020. So it's actually been a few years. And since then, they've built in even more complex user testing, specifically based on some of those power users that Brent was talking about before.

21:28 — Schaffer Stewart
One of the main leaders behind the project is a journalist and independent researcher. And so he's reached out to some contacts and colleagues in the journalism community, and we've gotten on some calls with them. We've given them the tool and had them test and search for people that they're familiar with to start looking at the data and making sure that the data is correct. So it's important that we get these subject matter experts and people who are in the domain to come in and start using it before we release it to a broader audience.

21:55 — Angela Andrews
Very smart.

21:56 — Brent Simoneaux
That is very smart, because I think you're not going to catch everything in the lab. You need people, experts, power users to kick the tires in the wild in some ways.

22:10 — Kim Huang
And you also need a discerning eye, an expert eye, to figure out what's missing, if anything is missing. I feel like especially with journalists and people who have that expertise in campaign finance, who are working a specific beat, or maybe working as a political analyst or something, they know exactly, in this case, what to look for, what it should look like, what it should not look like, and what is most useful to a user in the end.

22:39 — Angela Andrews
Yeah. They have that discerning eye.

22:44 — Kim Huang
Schaffer says the most important thing for people working on development teams, especially on projects like this that are so complex, is not to feel down about having to scrap work when testing gives you an outcome you don't expect.

23:02 — Schaffer Stewart
It can be easy to feel like you did something wrong or you didn't write good enough code. It can at times feel like an attack on you. Don't take it personally. You built what was needed at the time and now something else is needed and that's fine. It might be in a different stack, and that's fine too.

23:19 — Angela Andrews
I don't think it was a waste of time. When you realize that you have the wrong tool for the job, you're doing your job seriously. You don't want to fit a square peg into a round hole. You want to make sure you have exactly what you need because your end user, they depend on that. They really do. So I get where he's saying, it could feel like, "Oh, I didn't do a really good job with this. What did I do wrong?" But sometimes this is not the right tool for the job. So good for them figuring that out before they got way too far along.

23:59 — Brent Simoneaux
So we've been talking this whole episode about the importance of testing, especially at various stages of the development process. And to me that seems like the way to go. So from everything we've heard, that seems like what we should all be doing. So I'm curious if the tools are there, how do we build this into our culture?

24:28 — Angela Andrews
Well, my opinion is make it okay to change your mind. Look at Schaffer's example. When you realize you're not making the best choice, make it okay to switch and pivot. Being agile, not only in your development cycle but also in how you test, sounds like the happy medium. Testing is super important. We can't bolt it on at the end, just like we can't bolt security on at the end. They need to be first-class citizens in the development process.

25:02 — Kim Huang
Agreed. Testing is not a part of the software stack. We all know that, but it is a crucial part of the software development process. Testing can catch problems in the software stack before they become very big problems down the road.

25:19 — Brent Simoneaux
But my question is how do we change the culture around this? I think I understand the importance of it. I hear what you're saying, but to me this feels like a cultural issue. Right?

25:31 — Angela Andrews
This is all culture. It is.

25:33 — Brent Simoneaux
Yeah, it's all about culture.

25:34 — Angela Andrews
Well, how do you change any culture? You keep talking about it. You make it commonplace. It can't be taboo. I mean, think about how our language has changed over the years because we realize that this doesn't work. This does not work. We have to figure out a better way and how do we do it? We have to make it more approachable. We have to make it more accessible. We have to make it okay and...

25:59 — Brent Simoneaux
Normalize it.

26:00 — Angela Andrews
We have to normalize it. Exactly. So that's how any culture change happens. These weren't the ways that we did things. And what we had to do was we had to keep talking about newer and better ways. That's literally how you change the culture. It really does have to happen from up top too because that's where you get your buy-in. And then it all trickles down.

26:29 — Brent Simoneaux
Okay. So that's the stack. Did we do it?

26:32 — Angela Andrews
I think we did. We went through the entire stack from frontend to backend. Yeah, I think we did it.

26:40 — Brent Simoneaux
Well, up next, we're going to wrap this up. Right, Kim?

26:43 — Kim Huang
We do have one more episode though, and it's kind of like our wrap-up to go over some of the things we've learned while making Stack/Unstuck. And we have a few special guests who are going to come in and leave us with their thoughts as well.

26:57 — Angela Andrews
Remember, you can go back and listen to any of the previous episodes, from front end, to frameworks, to databases, to the operating system. Yeah. Check out any of the older episodes and we want to hear what you think about this series. You can tweet us at Red Hat and always use the #compilerpodcast. We would love to hear from you. And that does it for this episode of Compiler.

27:25 — Brent Simoneaux
Today's episode was produced by Kim Huang and Caroline Craighead. Victoria Lawton tests our patience at every step of the way. Just kidding. We love you.

27:37 — Angela Andrews
Our audio engineer is Kristie Chan. Special thanks to Shawn Cole. Our theme song was composed by Mary Ancheta.

27:46 — Brent Simoneaux
Thank you to our guests, Schaffer Stewart and Lisa Crispin and her donkeys for their insights.

27:53 — Angela Andrews
Our audio team includes Leigh Day, Laura Barnes, Stephanie Wonderlake, Mike Esser, Nick Burns, Aaron Williamson, Karen King, Boo Boo Howse, Rachel Ertel, Mike Compton, Ocean Matthews, Alex Traboulsi, and Laura Walters.

28:11 — Brent Simoneaux
If you liked today's episode, please follow the show. Rate us, leave us a review, share it with someone you know. It really does help us out.

28:21 — Angela Andrews
We enjoy you listening. We enjoy putting on this podcast for you. Until next time.

28:26 — Brent Simoneaux
All right. See you next time.

Featured guests

Schaffer Stewart

Lisa Crispin


In this limited run of Compiler, we speak to people within development teams and communities to bring clarity to the tech stack. 
