As a Java engineer who has worked in web development for several years now, I've heard many times that X is good because of SOLID principles, or that Y is bad because it breaks SOLID principles, and I've had to memorize the "good" way to do everything before interviews. The more I dig into the real reason I'm doing something a particular way, the harder I find it to keep doing it that way.

One example is creating an interface for every goddamn class I make in the name of "loose coupling", when in reality none of these classes is ever going to have an alternative implementation.

Also, the more I get into languages like Rust, the more these doubts grow, and I'm starting to believe that most of it is dogma that has drifted far beyond its initial motivations and goals and is now just a mindless OOP circlejerk.

There are definitely occasions when these principles do make sense, especially in an OOP environment, and they can also make some design patterns really satisfying and easy.

What are your opinions on this?

[–] Valmond@lemmy.world 4 points 2 weeks ago

I remember the recommendation to use a typedef (or #define 😱) for integers, like INT32.

Useful if you want to recompile it on some weird CPU or something, I guess. What a stupid idea. At least where I worked it was dumb; if someone knows of any benefits, I'd gladly hear them!
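
For anyone who hasn't run into it, the pattern looked roughly like this (a sketch with made-up alias names, not code from any particular project):

    typedef int           INT32;    /* a real alias for int (assumes int is 32 bits) */
    typedef unsigned int  UINT32;

    #define INT16 short             /* the #define variant: plain text substitution */

    INT32 counter = 0;              /* identical to: int counter = 0; */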

[–] HetareKing@piefed.social 5 points 2 weeks ago

If you're directly interacting with any sort of binary protocol (file formats, network protocols, etc.), you definitely want your variable types to be unambiguous. Partly for future-proofing, but also because I don't want to have to go and confirm whether I remember correctly that long is the same size as int.

There's also clarity of meaning; unsigned long long is a noisy monstrosity, uint64_t conveys what it is much more cleanly. char is great if it's representing text characters, but if you have a byte array of binary data, using a type alias helps convey that.

And then there are type aliases like size_t that are useful precisely because they have different sizes on different platforms.

I'd say that generally speaking, if it's not an int or a char, that probably means the exact size of the type is important, in which case it makes sense to convey that using a type alias. It conveys your intentions more clearly and tersely (in a good way), it makes your code more robust when compiled for different platforms, and it's not actually more work; that extra #include <cstdint> you may need to add pays for itself pretty quickly.
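
As a rough sketch of what I mean (the file format here is hypothetical, and real code would also care about endianness and padding), fixed-width types keep the layout of a binary header the same no matter what int or long happen to be on the target platform:

    #include <cstdint>
    #include <cstdio>

    struct FileHeader {
        uint32_t magic;     // exactly 4 bytes on every platform
        uint16_t version;   // exactly 2 bytes
        uint8_t  flags;     // a raw byte of binary data, not a text character
        uint8_t  reserved;
    };

    int main() {
        FileHeader h{};
        std::FILE* f = std::fopen("data.bin", "rb");
        if (f && std::fread(&h, sizeof h, 1, f) == 1)
            std::printf("format version %u\n", static_cast<unsigned>(h.version));
        if (f) std::fclose(f);
        return 0;
    }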

[–] Valmond@lemmy.world -1 points 2 weeks ago

So we should not have #defines in the way, right?

Like INT32 instead of "int". I mean, if you don't know the size, you're probably not implementing network protocols or reading binary data anyway.

uint64_t is good IMO, maybe a bit long (why the _t?), but it's not one of the atrocities I'm talking about, where every project had its own defines.

[–] HetareKing@piefed.social 3 points 2 weeks ago

The standard type aliases like uint64_t weren't in the C standard library until C99 and in C++ until C++11, so there are plenty of older code bases that would have had to define their own.
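
Something along these lines, a hypothetical sketch of the kind of compatibility header those projects carried (not taken from any real code base):

    #if defined(_MSC_VER)
        typedef __int32          INT32;   // MSVC's built-in sized type
        typedef unsigned __int32 UINT32;
    #else
        typedef int              INT32;   // assumes int is 32 bits on this platform
        typedef unsigned int     UINT32;
    #endif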

The use of #define to make type aliases never made sense to me. The earliest versions of C didn't have typedef, I guess, but that's like, the 1970s. Anyway, you wouldn't do it that way in modern C/C++.
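
The classic illustration of why the #define version bites you (alias names made up for the example): the macro is plain text substitution, so it interacts badly with declarator syntax, while the typedef behaves like an actual type.

    #define CHARPTR char*       // text substitution
    typedef char* charptr_t;    // a real type alias

    CHARPTR   a, b;   // expands to "char* a, b;"  -> b is just a char
    charptr_t c, d;   // c and d are both char*    -> behaves as intended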
