this post was submitted on 03 Nov 2025
121 points (96.9% liked)
I remember the recommendation to use a typedef (or #define, yuck) for integers, like INT32.

Useful if you want to recompile it on a weird CPU or something, I guess. What a stupid idea. At least where I worked it was dumb; if someone knows of any benefits I'd gladly hear them!
If you're directly interacting with any sort of binary protocol, i.e. file formats, network protocols, etc., you definitely want your variable types to be unambiguous. For future-proofing, yes, but also because I don't want to go confirm whether I remember correctly that `long` is the same size as `int`.

There's also clarity of meaning; `unsigned long long` is a noisy monstrosity, while `uint64_t` conveys what it is much more cleanly. `char` is great if it's representing text characters, but if you have a byte array of binary data, using a type alias helps convey that.

And then there are type aliases that are useful because they have different sizes on different platforms, like `size_t`.

I'd say that generally speaking, if it's not an `int` or a `char`, that probably means the exact size of the type is important, in which case it makes sense to convey that using a type alias. It conveys your intentions more clearly and tersely (in a good way), it makes your code more robust when compiled for different platforms, and it's not actually more work; that extra `#include <cstdint>` you may need to add pays for itself pretty quickly.

So we should not have `#define`s in the way, right?
Like INT32 instead of `int`, I mean. If you don't know the size, you probably won't be doing network protocols or reading binary stuff anyway.

`uint64_t` is good IMO; a bit long (why the `_t`?) maybe, but it's not one of the atrocities I'm talking about, where every project had its own defines.
The standard type aliases like `uint64_t` weren't in the C standard library until C99, and not in C++ until C++11, so there are plenty of older code bases that would have had to define their own.

The use of `#define` to make type aliases never made sense to me. The earliest versions of C didn't have `typedef`, I guess, but that's, like, the 1970s. Anyway, you wouldn't do it that way in modern C/C++.