TL;DR: Some bit of Go (or at least a Go library they're using) coerces NULL time values to zero, so they added logic further down the data pipeline to check for zeros and convert them back to NULLs.
The full context of the problem definitely sounds tricky; I doubt I would have caught it during development. But that little bit of data-coercion logic is a HUGE code smell, for me. It never would have passed review, in my book.
The author's takeaway seems to be "just because you've fixed a bug doesn't mean that's the end of the story," which is valid, but the far more important takeaway for me is "don't compromise on best practices in data management." In this case, they had a data model that leverages NULL (to indicate a "don't overwrite this field" scenario), and they fed it into something that doesn't support NULLs. They KNEW about this, and papered over it with a lossy data conversion, instead of actually implementing proper NULL handling for this conversion, or changing the data model to not require NULLs.