
What's a tech term you hate?

da na
1 minute ago, Bombastinator said:

That’s just bizarre

Yep, and that's probably what hurts my head the most and makes me want to erase my memory for the rest of my life

 


I hate 'Wi-Fi', but not for the reasons most others have stated.

 

Just because it's a meaningless term. They coined it to sound like 'hi-fi', which is a term with a meaning. This one is just... bleh.


4 minutes ago, FruitOfTheLum said:

I hate 'Wi-Fi', but not for the reasons most others have stated.

 

Just because it's a meaningless term. They coined it to sound like 'hi-fi', which is a term with a meaning. This one is just... bleh.

What would you rather call it?


1 hour ago, radoncombe said:

What would you rather call it?

Wireless would do! Same number of syllables!


Dongle. But not because I don't like the term, I couldn't care less. I just hate how people use the word. Even Linus uses "dongle" when referring to regular old hubs. A dongle is just a single device that you plug into the computer to enable some kind of functionality, like a license dongle or a Bluetooth dongle for your mouse or keyboard. A hub with additional USB ports, card readers, Ethernet and HDMI ports is not a dongle, it's a hub.


On 9/7/2021 at 10:05 AM, Rusty Proto said:

Twitter is actually not an app, but a platform. Try deleting the app itself from your hardware, then sending your browser to WWW.TWITTER.COM and see what I mean. You actually don't need to install any software to use Twitter.

But if you have the Twitter app on your phone, Twitter is an app.

And the website isn't great anyway, so really, the app is the only option.

I could use some help with this!

please, pm me if you would like to contribute to my gpu bios database (includes overclocking bios, stock bios, and upgrades to gpus via modding)

Bios database

My beautiful, but not that powerful, main PC:

prior build:



Ray tracing.

Nvidia did such a good PR job on that term that it means nothing now.

MSI X399 SLI Plus | AMD Threadripper 2990WX all-core 3 GHz lock | Thermaltake Floe Riing 360 | EVGA 2080, Zotac 2080 | G.Skill Ripjaws 128GB 3000 MHz | Corsair RM1200i | 150TB | Asus TUF Gaming mid tower | 10Gb NIC


"Techies" ... an umbrella term that speaks loudly but says nothing.

I edit my posts more often than not


1 hour ago, FruitOfTheLum said:

I hate 'Wi-Fi', but not for the reasons most others have stated.

 

Just because it's a meaningless term. They coined it to sound like 'hi-fi', which is a term with a meaning. This one is just... bleh.

Wireless is a broad term.

Wi-Fi is the actual term used for wireless internet connections.

Wireless could be Bluetooth, or cellular, or Wi-Fi, or some proprietary connection, or radio; it's too broad to mean just one thing.



Just now, dogwitch said:

Ray tracing.

Nvidia did such a good PR job on that term that it means nothing now.

Yeah, any time I tell someone "oh I raytraced my render" they go "no you didn't you have a GTX card" 

CPU raytracing has been a thing for quite a while. Heck, you can even do that with a Pentium D. 

And I have done it as a joke... It took about 20x as long as on my Xeon, but it worked.


2 minutes ago, dogwitch said:

Ray tracing.

Nvidia did such a good PR job on that term that it means nothing now.

It does mean something, though.

Ray tracing is just the practice of tracing rays of light and simulating how they interact with objects in a scene.

Real-time ray tracing, which is what Nvidia and AMD do now, is the idea of doing that in real time, 60 times a second.
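And the core operation really is just geometry, which is why it runs fine on a CPU with no RTX hardware. Here's a minimal sketch (function name and scene values are made up for illustration) of a ray-sphere intersection test, the basic building block of any ray tracer:

```python
# Minimal ray tracing building block: does a ray hit a sphere, and where?
# All names and numbers here are illustrative, not from any real renderer.
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance t to the nearest hit along the ray, or None on a miss."""
    # Solve |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)  # nearer of the two roots
    return t if t > 0 else None

# A ray from the origin looking down +z, at a unit sphere centered at z=5:
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # hits at t = 4.0
```

A full renderer just does this (plus shading) for millions of rays, which is why GPUs speed it up so much, but nothing about it requires one.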



2 minutes ago, HelpfulTechWizard said:

Wireless is a broad term.

Wi-Fi is the actual term used for wireless internet connections.

Wireless could be Bluetooth, or cellular, or Wi-Fi, or some proprietary connection, or radio; it's too broad to mean just one thing.

Not really. If I walk into your house and ask for your wireless password, assuming you don't call the police, you're going to know exactly what I mean!


2 minutes ago, FruitOfTheLum said:

Not really. If I walk into your house and ask for your wireless password, assuming you don't call the police, you're going to know exactly what I mean!

Maybe, but what if I have a Bluetooth device with a password? Some do.

Or my hotspot is on, which is its own separate thing; it creates a Wi-Fi network off of cellular. That has a password too.



  • More megapixels means more better
  • More Mega/Giga/Tera-Hertz means more better.

etc.

le sigh.

When i ask for more specs, don't expect me to know the answer!
I'm just helping YOU to help YOURSELF!
(The more info you give the easier it is for others to help you out!)

Not willing to capitulate to the ignorance of the masses!


4 minutes ago, HanZie82 said:
  • More megapixels means more better
  • More Mega/Giga/Tera-Hertz means more better.

etc.

le sigh.

For the most part, yes. For Hz there are other things: for memory, the latencies; for CPUs, the cores or architecture. But pretty much.

For the pixels thing, there are other factors, like the software for cameras, and whether you can even see past a certain resolution. But a 1 MP camera is going to be worse than a 10 MP camera.



2 minutes ago, HelpfulTechWizard said:

For the most part, yes. For Hz there are other things: for memory, the latencies; for CPUs, the cores or architecture. But pretty much.

For the pixels thing, there are other factors, like the software for cameras, and whether you can even see past a certain resolution. But a 1 MP camera is going to be worse than a 10 MP camera.

Those things are only true all other things being equal. A CPU using the same architecture, same core count, same cache size, etc will perform better at a higher clock speed. A camera with the same lenses, same firmware, same sensor, etc will give you a better picture at a higher pixel count.

 

However, a CPU using an inferior architecture, or with a lower core count, or with a smaller cache, etc can perform worse in spite of a higher clock speed. A camera with crappier lenses, poorly-made firmware, or a cheap sensor can give you a worse picture in spite of the higher pixel count.


2 hours ago, FruitOfTheLum said:

I hate 'Wi-Fi', but not for the reasons most others have stated.

 

Just because it's a meaningless term. They coined it to sound like 'hi-fi', which is a term with a meaning. This one is just... bleh.

What makes you think it is meaningless? One definition of fidelity is exactness or accuracy of detail. Before WiFi was created, the way to transfer data over a network was by using cables, so Wireless-Fidelity (WiFi) could be referring to the fact that the data that would have been transferred over a cable can be accurately transmitted over a radio signal using the WiFi protocol.


6 minutes ago, HelpfulTechWizard said:

For the most part, yes. For Hz there are other things: for memory, the latencies; for CPUs, the cores or architecture. But pretty much.

For the pixels thing, there are other factors, like the software for cameras, and whether you can even see past a certain resolution. But a 1 MP camera is going to be worse than a 10 MP camera.

Oh yeah, my old 4-megapixel camera with a full 1" sensor and properly sized lenses can still take better pictures than my 20 MP phone.

But yes, there are so many other things that matter more than just the numbers.



1 minute ago, YoungBlade said:

Those things are only true all other things being equal. A CPU using the same architecture, same core count, same cache size, etc will perform better at a higher clock speed. A camera with the same lenses, same firmware, same sensor, etc will give you a better picture at a higher pixel count.

 

However, a CPU using an inferior architecture, or with a lower core count, or with a smaller cache, etc can perform worse in spite of a higher clock speed. A camera with crappier lenses, poorly-made firmware, or a cheap sensor can give you a worse picture in spite of the higher pixel count.

Which is what I said?

1 minute ago, HanZie82 said:

Oh yeah, my old 4-megapixel camera with a full 1" sensor and properly sized lenses can still take better pictures than my 20 MP phone.

But yes, there are so many other things that matter more than just the numbers.

Which is kinda what I was getting at?

9 minutes ago, HelpfulTechWizard said:

For Hz there are other things: for memory, the latencies; for CPUs, the cores or architecture. But pretty much.

For Hz, that's pretty much right, but there are other things: latencies for memory, cores and architecture for CPUs.

9 minutes ago, HelpfulTechWizard said:

For the pixels thing, there are other factors, like the software for cameras, and whether you can even see past a certain resolution. But a 1 MP camera is going to be worse than a 10 MP camera.

For cameras there's software, and whether there's a point in taking a huge pixel-count photo if you're never going to zoom in thousands of times. I should've said "usually", but I think my conjecture still stands: if you have a vast MP difference, you're probably going to take better-looking photos.



1 minute ago, HelpfulTechWizard said:

Which is what I said?

Which is kinda what I was getting at?

For Hz, that's pretty much right, but there are other things: latencies for memory, cores and architecture for CPUs.

For cameras there's software, and whether there's a point in taking a huge pixel-count photo if you're never going to zoom in thousands of times. I should've said "usually", but I think my conjecture still stands: if you have a vast MP difference, you're probably going to take better-looking photos.

Then I misunderstood. I took especially the ending line "but a 1 mp camera is going to be worse than a 10mp camera" to mean that you disagreed with the sentiment. Having 10 times more of the metric still can't always make up the difference. Even a hypothetical 10GHz Pentium would get absolutely curb-stomped by a 1GHz modern CPU across the board.
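That argument can be sketched with a crude back-of-envelope model: single-thread performance scales roughly with clock × IPC (instructions per cycle). The function name and both IPC figures below are invented for the sake of argument, not measured values for any real chip, and the model ignores caches, SIMD, and branch prediction, which widen the real gap further.

```python
# Toy model: single-thread throughput ~ clock speed * IPC.
# IPC numbers are illustrative guesses, not benchmarks of real CPUs.
def relative_perf(clock_ghz, ipc):
    """Rough throughput, in billions of instructions per second."""
    return clock_ghz * ipc

hypothetical_pentium = relative_perf(10.0, 0.3)  # narrow mid-90s core at 10 GHz
modern_core = relative_perf(1.0, 6.0)            # wide modern core at 1 GHz

# Even with a 10x clock deficit, the wide core can come out ahead on IPC alone.
print(hypothetical_pentium, modern_core)
```

With these (made-up) numbers the 1 GHz modern core wins on raw throughput before you even count its bigger caches and vector units.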


14 hours ago, HelpfulTechWizard said:

Which is what I said?

Which is kinda what I was getting at?

Yes, and I (we?) were expanding on that. 😉
And I fully agree.



Just now, YoungBlade said:

Then I misunderstood. I took especially the ending line "but a 1 mp camera is going to be worse than a 10mp camera" to mean that you disagreed with the sentiment. Having 10 times more of the metric still can't always make up the difference. Even a hypothetical 10GHz Pentium would get absolutely curb-stomped by a 1GHz modern CPU across the board.

Really though? Pentiums are still a thing; give a recent one 10 GHz vs a 1 GHz locked Athlon 3300X, and the Pentium is going to win.



1 minute ago, HelpfulTechWizard said:

Really though? Pentiums are still a thing; give a recent one 10 GHz vs a 1 GHz locked Athlon 3300X, and the Pentium is going to win.

I meant the OG Pentium, like the 200MHz Pentium in my family's Gateway 2000 Windows 95 PC, not a modern Pentium.


3 hours ago, radoncombe said:

What would you rather call it?

Wireless internet?

 

