The Power of Undefined Values
Tools shape the way we work, because they change where we perceive risk as we write code. If common compilers warn about something, I'll code in a way that triggers the warning when I make a mistake. For example, instead of:
```c
int err = -EINVAL;

if (something())
        goto out;

err = -ENOSPC;
if (something_else())
        goto cleanup_something;
...

cleanup_something:
        undo_something();
out:
        return err;
```
I would now set err in every branch:
```c
int err;

if (something()) {
        err = -EINVAL;
        goto out;
}
if (something_else()) {
        err = -ENOSPC;
        goto out;
}
```
Because when I add another failure branch and forget to set err, gcc will warn me that it's used uninitialized. This bit me once, and the problem can be hard to spot when you're reviewing only a patch, not the code as a whole.
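Here's a minimal sketch of that pattern; something(), something_else() and do_work() are made-up names standing in for real checks. The point is that err starts life uninitialized, so if a future branch jumps to out without assigning it, gcc -Wall reports "err may be used uninitialized":

```c
#include <errno.h>

/* Hypothetical predicates, standing in for the article's
 * something() / something_else(). */
static int something(int x)      { return x < 0; }
static int something_else(int x) { return x == 0; }

static int do_work(int x)
{
        /* Deliberately not initialized: any path that reaches "out"
         * without assigning err triggers gcc's uninitialized-use
         * warning, instead of silently returning a stale default. */
        int err;

        if (something(x)) {
                err = -EINVAL;
                goto out;
        }
        if (something_else(x)) {
                err = -ENOSPC;
                goto out;
        }
        err = 0;
out:
        return err;
}
```

Delete the `err = -ENOSPC;` line above and recompile with -Wall to see the warning fire.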
These days, we have valgrind, and despite its fame as a use-after-free debugger, it really shines at telling you when you rely on the results of an uninitialized field. So, I've adapted to lean on it. I explicitly don't initialize structure members I don't use in a certain path. I avoid calloc(): while 0 is often less harmful than any other value, I'd much rather know that I've thought about and set up every field I actually use. When changing code this is particularly important, and I spend a lot of my time changing code. I have even changed to doing malloc() in some cases where I previously used on-stack or file-scope variables. Valgrind doesn't track on-stack usage very well, and static variables are defined to be zeroed, so valgrind can't tell when I wander into the weeds. I think these days, that's a misfeature.
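A sketch of that discipline, with invented names (struct conn, new_conn()) for illustration: the constructor sets only the fields every caller needs, and leaves the rest undefined on purpose.

```c
#include <stdlib.h>

/* Illustrative type; the names here are made up. */
struct conn {
        int fd;
        int flags;
};

/* Initialize only the fields every caller actually needs. */
static struct conn *new_conn(int fd)
{
        struct conn *c = malloc(sizeof(*c));
        if (!c)
                return NULL;
        c->fd = fd;
        /* c->flags is deliberately left undefined: any path that tests
         * it without setting it first shows up under valgrind as
         * "Conditional jump or move depends on uninitialised value(s)".
         * With calloc(), that same buggy path would quietly read a
         * plausible-looking 0 and sail through. */
        return c;
}
```

Running the program under valgrind --track-origins=yes points the report back at the malloc() that produced the undefined value, which makes these bugs quick to pin down.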
So, if I were designing a C-like language today, I'd bake in the concept of undefined values, knowing that the tools to leverage it are widely available. Ten years ago I'd have said zero-by-default is safest, but times change. I think Go chose wrong here, though it may not matter as much as it does in C, for other reasons. I'd have to code in it for a few years to really tell.