
What is the point of these useless typedefs?

Gat Pelsinger
Go to solution: solved by igormp

Whenever I open any built-in C header file, it feels like I'm reading some really complicated kernel-level code, but upon close inspection it's just useless typedefs and definitions, with __ and capital letters as the naming scheme. For a better example, take the Windows API. When trying to code something with it, it feels like I'm writing a completely different language. Seriously, for example, they have a typedef LPVOID, which is literally just a normal void pointer. Something even more ridiculous is typedefing "void" as "VOID". What even is the point? And there are these DWORDs and QWORDs for 32-bit and 64-bit integers. Why not use normal ints and longs? Why use these completely new names with a horrifying capital-letter naming scheme (and __ for C built-in functions), which scares off beginner programmers?

 

Oh wait, I get it. It's for making your code look more cool and technical, isn't it? Yeah, I have this fetish too.

Microsoft owns my soul.

 

Also, Dell is evil, but HP kinda nice.


36 minutes ago, Gat Pelsinger said:

Oh wait, I get it. It's for making your code look more cool and technical, isn't it? Yeah, I have this fetish too.

Those are more often than not abstractions for other things, like different CPU architectures (an int and a long change in size depending on which architecture you're on), or higher-level concepts (DWORD and QWORD are common names throughout computing, and iirc they're also used in Windows' Registry Editor).

 

Underscores are often used to indicate private functions/variables. Since C doesn't have a proper concept of private functions, this is a common workaround that helps avoid naming collisions.

 

If you're planning on supporting Windows in the long term and across different versions, using their abstractions is likely the best way; otherwise you'll be writing your own #ifdefs for each version in case any of the underlying types change.

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


@igormp

 

hmm. Maybe I get it. So if I ever wanted to make DWORDs 64-bit, I could just change the typedef instead of changing every single variable in the code to be 64-bit.



6 minutes ago, Gat Pelsinger said:

@igormp

 

hmm. Maybe I get it. So if I ever wanted to make DWORDs 64-bit, I could just change the typedef instead of changing every single variable in the code to be 64-bit.

Yes, that's pretty much it. And anyone using your library wouldn't need to care about it when compiling for a new version of your library/system as long as they get the updated library itself.



2 hours ago, Gat Pelsinger said:

Oh wait, I get it. It's for making your code look more cool and technical, isn't it? Yeah, I have this fetish too.

The reason for these definitions is that these libraries are available across a variety of operating systems and architectures, which define stuff like integers differently. The underscores are just a naming convention used to avoid conflicts with something you might want to name your variables, for example.

Just now, Gat Pelsinger said:

hmm. Maybe I get it. So if I ever wanted to make DWORDs 64-bit, I could just change the typedef instead of changing every single variable in the code to be 64-bit.

More like your code won't suddenly be filled with overflows the moment you try to compile and run it on a 32-bit system and your integers are now only 32 bits long rather than 64, along with a whole variety of other problems that would make cross-platform support a nightmare.

3 hours ago, Gat Pelsinger said:

Why not use normal ints and longs? Why use these completely new names with a horrifying capital-letter naming scheme (and __ for C built-in functions), which scares off beginner programmers?

If you're a beginner then you shouldn't concern yourself with any of this. Just use your standard language types and you'll be fine. You should focus on understanding basic algorithms, general programming concepts that apply across languages, common pitfalls... and definitely not on platform specific typedefs that were never intended for you to read through, or whether typing out a for loop slightly differently will result in a 0.000000001 ns difference in execution times that will almost certainly be optimized away by the compiler.

 

I can assure you in a professional environment nobody is going to know by heart which exact sequence of C instructions would be marginally faster without optimization, or expect you to know; they probably won't care if you even know what a typedef is or why it's useful, because specific language syntax that is only useful in some circumstances is quickly learned on the job if need be. What they will expect you to know is how to structure a program, how to quickly familiarize yourself with a new language, what industry best practices can be applied to common problems.

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


The Windows API evolved from the 16-bit era all the way to 64-bit. It would need loads of preprocessor macros to handle all that legacy-era stuff as a result. They don't do this just to make things over-engineered and complicated to flex on you, although I've known some people who do (e.g. TypeScript programmers who write things like generic X intersects Y extends Z satisfies T of A union B, on top of needless design patterns like factory observable singleton dependency injection, holy shit dude).

 

It is done this way because there is a need for it. Also, premature optimization is a rookie mistake. If you spend time trying to shave a few milliseconds off runtime instead of building a working application, then you are solving the wrong problem.

 

Sudo make me a sandwich 


43 minutes ago, Sauron said:

If you're a beginner then you shouldn't concern yourself with any of this. Just use your standard language types and you'll be fine. You should focus on understanding basic algorithms, general programming concepts that apply across languages, common pitfalls... and definitely not on platform specific typedefs that were never intended for you to read through, or whether typing out a for loop slightly differently will result in a 0.000000001 ns difference in execution times that will almost certainly be optimized away by the compiler.

 

I can assure you in a professional environment nobody is going to know by heart which exact sequence of C instructions would be marginally faster without optimization, or expect you to know; they probably won't care if you even know what a typedef is or why it's useful, because specific language syntax that is only useful in some circumstances is quickly learned on the job if need be. What they will expect you to know is how to structure a program, how to quickly familiarize yourself with a new language, what industry best practices can be applied to common problems.

I completely agree with this. I can count on one hand the number of times I've had to use these Windows-specific typedefs on a project, and all of them were for programs meant to run only on Windows. If you're familiar with the important aspects of programming, picking up these kinds of things whenever you need them is easy.

 

The same thing goes for optimizing code. Nobody cares how fast the code runs at first, only that it's correct. If at some point you figure out that it runs too slow, then you can optimize it. Otherwise it's wasted effort most of the time. Reminds me of when I implemented bubble sort in a final project for my data structures and algorithms class once. Was it fast? No, but it didn't matter because I was only sorting 10 elements. Was it quick to write? Yes, and that was great considering the time crunch.

Computer engineering grad student, cybersecurity researcher, and hobbyist embedded systems developer

 

Daily Driver:

CPU: Ryzen 7 4800H | GPU: RTX 2060 | RAM: 16GB DDR4 3200MHz C16

 

Gaming PC:

CPU: Ryzen 5 5600X | GPU: EVGA RTX 2080Ti | RAM: 32GB DDR4 3200MHz C16


3 minutes ago, dcgreen2k said:

Reminds me of when I implemented bubble sort in a final project for my data structures and algorithms class once. Was it fast? No, but it didn't matter because I was only sorting 10 elements. Was it quick to write? Yes, and that was great considering the time crunch.

To add to this, sometimes algorithms that are significantly faster on millions of elements are actually slower on a few dozen elements 😛 but there's a certain subgroup of sorting nerds who will chastise you for using the "wrong" sort for your shopping list


