The Virtuous Cycle of Testing
Writing more tests makes it easier to write other tests in the future
There are many good reasons to write tests for your code: catching bugs, preventing functionality regressions, defining better interfaces to your modules, the list goes on.
But one reason isn't discussed quite as frequently: what I like to call "the virtuous cycle of testing." Put simply, this virtuous cycle is the idea that writing more tests makes it easier to write other tests in the future.
I've seen it time and time again: everybody happily embraces tests until they get too hard to write. It's easy to write the test confirming that a banner component renders on a page; it's harder to write the test confirming that the login flow and all of its error states work properly. Similarly, interactions with external APIs are a common place for teams to draw the line on what they're willing to test, since setting up a tool to mock responses can be daunting.
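That mocking setup is often smaller than it looks. Here's a minimal sketch using only Python's standard library, with a hypothetical `fetch_username` function standing in for a call to an external API (the URL and response shape are invented for illustration); `unittest.mock.patch` swaps out the network call so the test runs offline:

```python
import json
import unittest
from unittest import mock
from urllib.request import urlopen

# Hypothetical production code: fetch a user's name from an external API.
# The URL and response shape are made up for this example.
def fetch_username(user_id):
    with urlopen(f"https://api.example.com/users/{user_id}") as resp:
        return json.load(resp)["username"]

class FetchUsernameTest(unittest.TestCase):
    def test_returns_username_from_response(self):
        # Build a fake response object so no real network call happens.
        fake_resp = mock.MagicMock()
        fake_resp.__enter__.return_value.read.return_value = b'{"username": "ada"}'
        # Patch urlopen in this module's namespace with the fake.
        with mock.patch(f"{__name__}.urlopen", return_value=fake_resp):
            self.assertEqual(fetch_username(42), "ada")
```

Once one test like this exists in a codebase, the next engineer who needs to test an API interaction can copy the pattern instead of researching a mocking strategy from scratch.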
At a more abstract level, it's easy to write tests when you have a very similar bit of logic you can copy over and apply to the tests you need to write; it's more challenging to write tests that explore a new domain and require extra setup and overhead. On top of that, if the new code being added isn't mission-critical, there may be a conscious decision to not write tests at all.
And hey, I get it. Timelines force us into uncomfortable situations when it comes to delivering aspirational code. We can't write tests for every single use case in a system and that's ok. But that doesn't mean we should throw our hands in the air every time something is hard. A test suite that only covers easy-to-write use cases is better than nothing, but not much more.
Just as I've seen teams where a testing suite has dwindled over time, I've also seen teams that have a culture of excellence and go the extra mile for testing even when it's difficult.
The most powerful mental model I've seen for building effective test coverage is the attitude "we should default to writing tests for new code unless there is a very good reason not to." Once that attitude takes hold, other pieces of how the team operates fall into place: engineers pointedly ask where the tests are during code review and offer to pair program when tests are missing.
Those are the codebases and teams I've enjoyed working in the most over the course of my career. The difference is palpable - from having fewer regressions in a system to making tests a normalized part of the development workflow. I suppose there's something to be said for the social dynamic as well in these cases - if everyone else is writing tests, I don't want to be "that guy" who just skips over them.
The next time you find yourself tempted to skip a test because it's difficult, I would encourage you not to think of the extra testing overhead as aspirational, but as an investment. Each time someone contributes a new test that covers one of those hard-to-test areas of code, it becomes a reference point that pays dividends for every developer on your team in the future. The more others can utilize these reference points, the more tests they'll write that will in turn help you in the future.