
Tech myth debunk thread

Boinbo
Message added by Spotty

This thread is for TECHNOLOGY related myths only. The LTT forum is not the place for conspiracy theories about politicians and aliens. 

If the thread goes off topic again it will be locked and warnings may be issued.

8 minutes ago, Jonas_2909 said:

Here's one, hope no one's said it before: Better cooler = lower room temperature

I think this is also related to people seeing lower temperatures due to using a better cooler.

 

There's also the idea that, say, an AIO is always better than air cooling as far as cooling performance goes. People run Cinebench once, or maybe Prime95 for five minutes, and cite the result as proof of their thinking, without realizing water has a much higher specific heat than metal, so the loop takes far longer to heat-soak.
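Rough numbers make the heat-soak point obvious. A back-of-the-envelope sketch (the masses here are my own illustrative guesses, not measured figures):

```python
# Energy needed to warm each cooler's thermal mass by 10 K.
# Q = m * c * dT; masses are rough guesses for illustration only.
C_COPPER = 0.385   # J/(g*K)
C_WATER  = 4.186   # J/(g*K)

air_metal_g = 500                  # guess: heatpipes, base, some fin mass
aio_water_g = 250                  # guess: coolant in the loop
aio_metal_g = 500                  # guess: cold plate, pump, radiator

dT = 10  # kelvin

q_air = air_metal_g * C_COPPER * dT
q_aio = aio_water_g * C_WATER * dT + aio_metal_g * C_COPPER * dT

print(f"air cooler: {q_air/1000:.1f} kJ to heat-soak")   # ~1.9 kJ
print(f"AIO:        {q_aio/1000:.1f} kJ to heat-soak")   # ~12.4 kJ
```

The AIO absorbs several times more energy before its temperature levels off, which is exactly why a one-off Cinebench run flatters it; steady-state behaviour is what actually measures the cooler.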


48 minutes ago, Mira Yurizaki said:

I think this is also related to people seeing lower temperatures due to using a better cooler.

 

There's also the idea that, say, an AIO is always better than air cooling as far as cooling performance goes. People run Cinebench once, or maybe Prime95 for five minutes, and cite the result as proof of their thinking, without realizing water has a much higher specific heat than metal, so the loop takes far longer to heat-soak.

IIRC, JayzTwoCents demonstrated that the whole AIO vs. high-end air cooler debate really hinges on what ambient temperatures are like. AIOs tend to outperform high-end air coolers in hotter environments, even if the opposite is true in cooler environments.



8 hours ago, wkdpaul said:

I would agree that seems to be the main issue here. Just like people arguing about scientific theories because they apply the general definition of the word "theory" to "scientific theory".

 

Seems to be what's happening here: a redundancy isn't a backup as far as people working in IT are concerned. It's only there to help with downtime, not data recovery.

Well, the thing is, the definition of a backup in computing is something used to recover from data loss. RAID is a mechanism used to prevent data loss; it cannot be used for recovery once data loss has occurred. When a RAID 1 array has a disk failure there is no data loss, so the mirror disk has not been used to recover from a data loss; it has prevented a loss from happening. The only time there is data loss in relation to disks/hardware in RAID is when the array goes past degraded to failed, and once the array has failed you cannot use any part of it for recovery without sending the hardware to a specialist data recovery service to try to get the data back from your primary, and only, copy of the data.

I don't even see where the confusion comes from in relation to the definition; it's quite clearly about recovering from data loss or damage, so a loss actually has to happen before you go to your backups, something that does not happen with RAID until the array fails or is corrupted.
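A toy illustration of why the distinction matters (just a sketch; the "disks" here are plain dicts): a mirror replicates every write, including destructive ones, while a backup is a point-in-time copy you can actually restore from.

```python
# Toy RAID 1 vs backup: the mirror faithfully replicates a deletion;
# only the point-in-time copy can recover the file.
disk_a = {"report.docx": "v1"}
disk_b = dict(disk_a)          # RAID 1 mirror of disk_a

backup = dict(disk_a)          # separate point-in-time copy, taken earlier

del disk_a["report.docx"]      # user error or malware deletes the file...
disk_b = dict(disk_a)          # ...and the mirror dutifully replicates it

print("disk_a:", disk_a)       # {}
print("disk_b:", disk_b)       # {}  -- redundancy preserved nothing
print("backup:", backup)       # {'report.docx': 'v1'}  -- recoverable
```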


2 hours ago, amdorintel said:

<snip>

This is for PC/technology related myths. Not aliens, UFOs, or dodgy plumbers. Please stay on topic.

 

On 6/29/2019 at 9:40 AM, Boinbo said:

What are some common PC myths that many people believe to be true, but aren't?


I've added "Tech" to the title to make it clear for everyone.



2 hours ago, amdorintel said:

A myth that needs to be debunked: just because someone works in a field, they are not experts.

 

Another myth: just because a person is labeled an expert does not mean they know shit.

 

Don't put too much faith in the guy behind the counter just because he wears a store uniform. Future Shop had commission sales staff on the floor. Canadian Tire has commission credit card staff roaming the aisles, and commission auto repair counter clerks too. Those are the people to trust least, if you trust anyone at all. Same goes for lawyers, because plenty of lawyers know fuck all, or maybe they know a lot but their fee for services is not high enough, or the case doesn't excite them enough, to actually give a fuck. Duty counsel in many cases is a joke; free lawyers don't care. Except the lawyers that actually do care, the ones that get people off for miscarriages of justice, but it takes a lot for those lawyers to get involved. Too many people just languish in the penal system.

 

What about Red Seal tradesmen, which is the goal for any journeyman? Too many journeymen just do one job for 5 years and know nothing else but how to pull wire and bend pipe. They pass the yearly tests, yet I wouldn't trust them to do anything beyond what they did for those 5 years. Same goes for a chef: they can reach the end goal of a Red Seal, but maybe they just worked in a Mexican restaurant for 5 years; that doesn't mean they make good Italian food.

 

It's easy to pass tests, but the person you want is the one with on-the-job training across a wide variety of tasks.

 

Another myth is UFOs: people hear of UFOs and automatically assume aliens from outer space. They don't realize that a UFO is just an object that hasn't been identified; a light in the sky could be a star, a meteor, a plane, a reflection.

 

It's the automatic assumption that is human nature.

 

What's not a myth is that the human senses are tricked all the time, and that is why eyewitness evidence is the weakest kind of evidence there is, yet in a court of law it's treated as the strongest. What a fucked up system that is, isn't it? Tests have proven time and time again that eyewitness testimony is skewed. I just watched a show last night about a wrongfully convicted man where the police gave positive reinforcement during the suspect photo lineup; they wanted the case closed. Some states in the USA have begun to put in place a system where an employee with no knowledge of the case has to show the witness the lineup so no positive reinforcement can be given, but in the other half of the states there is no law in place for that. In Canada it is only recommended. So government is, as always, messed up and fucked up.

 

 

 

Wow! That was all over the map! You also seemed to contradict yourself at first, based on what you ranted afterward.



7 hours ago, Drak3 said:

IIRC, JayzTwoCents demonstrated that the whole AIO vs. high-end air cooler debate really hinges on what ambient temperatures are like.

To quote what you said earlier, "Just lame excuses and """expert""" opinions (that are no more valid than asking random people on the street)."

 

Why is it you can quote an expert and expect us to accept it but I can link to multiple experts and none are acceptable, especially since you couldn't be bothered to read any of them?



1 hour ago, Lady Fitzgerald said:

To quote what you said earlier, "Just lame excuses and """expert""" opinions (that are no more valid than asking random people on the street)."

 

Why is it you can quote an expert and expect us to accept it but I can link to multiple experts and none are acceptable, especially since you couldn't be bothered to read any of them?

Because everyone is wrong unless they agree with me.



18 hours ago, leadeater said:

Well, who on earth buys a tape drive if your backup requirements don't even fill a single tape? Going back years that used to be the only option, but now for those use cases we have cloud storage. You can get an LTO-7 drive for about $2k and each tape after that is between $40-$60, so it's not expensive but rather very cheap, so long as you have the data footprint to meet that minimum entry point. I know it's a technicality here because the barrier of entry might seem high, and it is for home usage, but even small businesses today have a fairly decent amount of data, so tape is a front runner on cost for almost everyone. There are more important factors that would push these people away from tape though, operational expertise and physical location being the big ones.

Direct-to-tape backup is very rare today though; most short-term backups are done to disk systems, then longer-term weekly/monthly copies go to tape. Disk to Disk to Tape is what it's known as. You can also make multiple copies of these longer-term backups, one to cloud and one to tape.

Home users and business users are so different they should really be covered and considered separately. I mean, for home, a simple external HDD and a free backup tool is right up there among the best-in-class methods, only really beaten out by cloud backups. Over-engineering solutions is a big problem, but so is under-engineering them.

I had a dig through my office's boxes. There's another tape unit in a box, and a bunch of tapes, so that means there are three different tape drives, going back to at least the early 2000s. Then there's the box with "offsite storage" written in pen on it, also from the early 2000s. I don't know if there would even be a way to restore these, since the machine and the software to operate it were probably thrown away years ago.

And because this office apparently had all sorts of bizarre backup processes, there are also external drives with no power supplies, Zip drives, recordable CDs, and a drawer full of desktop hard drives. Which are backups and which are just junk? Unclear.


6 minutes ago, Kisai said:

I had a dig through my office's boxes. There's another tape unit in a box, and a bunch of tapes, so that means there are three different tape drives, going back to at least the early 2000s. Then there's the box with "offsite storage" written in pen on it, also from the early 2000s. I don't know if there would even be a way to restore these, since the machine and the software to operate it were probably thrown away years ago.

And because this office apparently had all sorts of bizarre backup processes, there are also external drives with no power supplies, Zip drives, recordable CDs, and a drawer full of desktop hard drives. Which are backups and which are just junk? Unclear.

We had a clear-out a year or so ago and threw out anything that was more than 7 years old. There were some extremely old tape reels that held only a few KB of data. I'm not afraid to throw shit out, especially if it's impossible to read the data anyway.


59 minutes ago, Kisai said:

I had a dig through my office's boxes. There's another tape unit in a box, and a bunch of tapes, so that means there are three different tape drives, going back to at least the early 2000s. Then there's the box with "offsite storage" written in pen on it, also from the early 2000s. I don't know if there would even be a way to restore these, since the machine and the software to operate it were probably thrown away years ago.

And because this office apparently had all sorts of bizarre backup processes, there are also external drives with no power supplies, Zip drives, recordable CDs, and a drawer full of desktop hard drives. Which are backups and which are just junk? Unclear.

 

54 minutes ago, leadeater said:

We had a clear-out a year or so ago and threw out anything that was more than 7 years old. There were some extremely old tape reels that held only a few KB of data. I'm not afraid to throw shit out, especially if it's impossible to read the data anyway.

This points out the need to keep backups and archives current. Because of obsolescence, you can't just stick the media on a shelf and expect to be able to access it years later. People lost data when drives for 8" floppies were phased out and were no longer available; same for 5.25" and 3.5" floppies, etc. Backups and archives have to be monitored to make sure the hardware and software needed to access them are still current, and migrated before that hardware and software go away.
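One cheap way to do that monitoring, at least for readability, is a hash manifest you re-verify on a schedule. A minimal sketch (the mount point and manifest name are made up):

```python
# Re-verify an archive against a stored manifest so unreadable or
# silently corrupted media shows up while you can still do something.
import hashlib, json, pathlib

ARCHIVE = pathlib.Path("/mnt/archive")     # hypothetical mount point
MANIFEST = ARCHIVE / "manifest.json"       # {"file name": "sha256 hex"}

def sha256(path: pathlib.Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

manifest = json.loads(MANIFEST.read_text())
for name, expected in manifest.items():
    status = "OK" if sha256(ARCHIVE / name) == expected else "MISMATCH"
    print(f"{status}  {name}")
```

It doesn't solve the drive-obsolescence problem, but it does tell you when it's time to migrate.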



On 12/12/2019 at 10:16 AM, Ben17 said:

It's usually open by default, I think, because you need to short the pins with a screwdriver to turn it on?

You are correct - I mixed them up (been a while since electrical engineering class). An open circuit (no current flowing) is the default state, and when you press the switch you close the circuit, which engages the operation (in this case, a power-on or shutdown signal).



Here's one: PC is not an operating system, as many people associate it with Windows. Nor is it custom hardware you build yourself. It stands for personal computer. So that MacBook your friend owns? That's his PC. He owns that computer. That under-powered Linux machine you built for no reason but use all the time? That's your PC. You own that computer.

Now how about that Windows laptop your company issued you? Nope. That's not a PC. That's not a computer you own; your company does.

 

This whole PC thing started way back in 1981 with IBM and their marketing. Since then it's just been bastardized by idiots and assholes. Yes, a Mac is a PC. Yes, a Windows machine is a PC. Yes, even a Chromebook is a PC. This whole thing irks the shit out of me. I wish I could put an end to all of it.


2 minutes ago, TempestCatto said:

Here's one: PC is not an operating system, as many people associate it with Windows. Nor is it custom hardware you build yourself. It stands for personal computer. So that MacBook your friend owns? That's his PC. He owns that computer. That under-powered Linux machine you built for no reason but use all the time? That's your PC. You own that computer.

Now how about that Windows laptop your company issued you? Nope. That's not a PC. That's not a computer you own; your company does.

 

This whole PC thing started way back in 1981 with IBM and their marketing. Since then it's just been bastardized by idiots and assholes. Yes, a Mac is a PC. Yes, a Windows machine is a PC. Yes, even a Chromebook is a PC. This whole thing irks the shit out of me. I wish I could put an end to all of it.

So my phone is a PC. So is my wireless router. It's got a pretty limited Linux in it, but it's Turing complete and I own it.

More definition-of-words issues.



13 minutes ago, TempestCatto said:

Here's one: PC is not an operating system, as many people associate it with Windows. Nor is it custom hardware you build yourself. It stands for personal computer. So that MacBook your friend owns? That's his PC. He owns that computer. That under-powered Linux machine you built for no reason but use all the time? That's your PC. You own that computer.

Now how about that Windows laptop your company issued you? Nope. That's not a PC. That's not a computer you own; your company does.

 

This whole PC thing started way back in 1981 with IBM and their marketing. Since then it's just been bastardized by idiots and assholes. Yes, a Mac is a PC. Yes, a Windows machine is a PC. Yes, even a Chromebook is a PC. This whole thing irks the shit out of me. I wish I could put an end to all of it.

Actually, it came from the fact that it could be used by a single user, vs. big mainframe computers that couldn't really be used by a single person (one person could send requests to the mainframe and staff would "process" those requests, or use the mainframe themselves, but only through time-sharing, so it's not really "personal" in the sense that they're not the only one using it).

So you're technically right when it comes to computers that are shared between users. But for all intents and purposes, it was to differentiate between mainframe and "not-mainframe" computers. ;)



1 hour ago, TempestCatto said:

This whole PC thing started way back in 1981 with IBM and their marketing. Since then it's just been bastardized by idiots and assholes. Yes, a Mac is a PC. Yes, a Windows machine is a PC. Yes, even a Chromebook is a PC. This whole thing irks the shit out of me. I wish I could put an end to all of it.

The term "personal computer" itself however, wasn't really used to describe... well... a personal computer because having a personal computer back then meant you were basically well to do. I believe people back then tended to use the terms "home computer" and "business computer" to describe self-contained units.

 

In a similar vein, "GPU" wasn't something used in common parlance to describe video card hardware until sometime after the GeForce 256, which NVIDIA used the term in their marketing. Before then, the chips themselves were usually something like "3D accelerator" And similarly, a lot of FPS games during the mid to late 90s were often called "Doom clones" rather than "first person shooter"

 

And as far as being technical, "PC" still has a firm, fixed meaning. The IBM PC standard requires specific functionality in order to be compatible with it. If you don't have this functionality in your hardware, then any IBM PC compatible software may not be guaranteed to work. Such as the case when fail0verflow hacked the PS4 to get Linux running on it. They actually couldn't use a bog-standard x86-64 version of Linux, but had to modify it so it wouldn't try to look for functionality that didn't exist.

 

In any case, I don't really think anyone cares about the term anyone except when differentiating between Macs and everyone else.


2 hours ago, Mira Yurizaki said:

The term "personal computer" itself however, wasn't really used to describe... well... a personal computer because having a personal computer back then meant you were basically well to do. I believe people back then tended to use the terms "home computer" and "business computer" to describe self-contained units.

 

In a similar vein, "GPU" wasn't something used in common parlance to describe video card hardware until sometime after the GeForce 256, which NVIDIA used the term in their marketing. Before then, the chips themselves were usually something like "3D accelerator" And similarly, a lot of FPS games during the mid to late 90s were often called "Doom clones" rather than "first person shooter"

 

And as far as being technical, "PC" still has a firm, fixed meaning. The IBM PC standard requires specific functionality in order to be compatible with it. If you don't have this functionality in your hardware, then any IBM PC compatible software may not be guaranteed to work. Such as the case when fail0verflow hacked the PS4 to get Linux running on it. They actually couldn't use a bog-standard x86-64 version of Linux, but had to modify it so it wouldn't try to look for functionality that didn't exist.

 

In any case, I don't really think anyone cares about the term anyone except when differentiating between Macs and everyone else.

In the early 80s the Commodore 64 was sold as a personal computer. The term PC was already embedded in most IT circles by then (it was called IT then). A home computer was often sold as a PC; people just called them home computers because they had one at home rather than at the office.



A few that keep bugging me that I see people say:

 

X Application is Single Threaded

My main issue with this is taking the meaning of the term literally: most applications are not single threaded. They have multiple threads.

 

You could distill this to "a single thread of execution," where "execution" is the actual work the application is supposed to do. And this I find acceptable, but only for simple applications, like running a LAME encoder with a GUI front-end. The LAME executable itself does indeed have a single thread of execution. Then again, the GUI front-end is simply calling another executable, which is literally single threaded.
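(It's also how such front-ends get decent throughput anyway: launch one single-threaded encoder process per file. A sketch of the pattern, assuming a lame binary on the PATH and some .wav files in the current directory:)

```python
# Process-level parallelism around a literally single-threaded encoder.
import pathlib
import subprocess
from concurrent.futures import ThreadPoolExecutor

def encode(wav: pathlib.Path) -> int:
    # Each lame process is single threaded; we just run several at once.
    return subprocess.call(["lame", str(wav), str(wav.with_suffix(".mp3"))])

wavs = list(pathlib.Path(".").glob("*.wav"))
with ThreadPoolExecutor(max_workers=4) as pool:   # 4 encoders at a time
    results = list(pool.map(encode, wavs))
print(f"{results.count(0)}/{len(wavs)} encoded OK")
```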

 

You could posit that maybe the application has multiple threads, but because each thread only executes one after another, it's effectively the same as the application running everything in a single thread. And I would agree with that. However, this does not necessarily mean the application can be called single threaded, only that it has the performance of running on a single thread. What if there's an exception-catching system where, if a thread runs into an exception or error, that thread dies? In a single threaded application, a problem anywhere would basically crash the whole application. But in an application with multiple threads that run one after another, with an error-catching mechanism, a thread that dies likely won't bring the application down with it.
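A quick sketch of that isolation point: one worker raising an exception dies alone, and the rest of the process carries on.

```python
# One thread crashing does not take the application down with it.
import threading
import time

def worker(n: int) -> None:
    if n == 2:
        raise RuntimeError(f"worker {n} blew up")   # only this thread dies
    time.sleep(0.1)
    print(f"worker {n} finished")

threads = [threading.Thread(target=worker, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("application still alive")   # reached despite worker 2 crashing
```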

 

Also, who's to say that even if an application runs tasks in order, each of those steps doesn't have a way to take advantage of multiple threads as well? For example, in the Killzone Shadow Fall Demo Post Mortem, slides 15-27 show how the game uses the CPU. Even though the game does run higher-level tasks (AI, game logic, draw call compilation) in a specific order, those higher-level tasks run across multiple CPUs, indicating they are multi-threaded.

 

Now, a problem with the performance of a single thread can introduce a bottleneck, but again, that doesn't mean the application is single threaded.

 

Applications don't see past X number of cores (or something similar)

Applications can see how many logical processors are in the system.

The only reason an application doesn't appear to use more than X number of cores is that, on average, only X number of threads are available for execution. One reason may be how the application was designed at the time: an application designed in 2003 probably wasn't designed with the expectation that people would be running multi-core processors. Even two-processor systems were extraordinarily rare. So it's going to be designed to perform best on single-core processors.
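Asking the OS is trivial, which is rather the point; a couple of standard-library calls (the second is a Linux-only call):

```python
import os
print(os.cpu_count())                  # logical processors in the system
print(len(os.sched_getaffinity(0)))    # how many this process may use (Linux only)
```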

 

This is primarily why Crysis can't seem to run past 80 FPS and performance takes a huge dump on certain levels. The game was designed around the expectation that we would have super high performance single-core processors in the future.

 

Applications need explicit simultaneous multi-threading support to take advantage of it

An application is not in charge of scheduling when it runs or where its threads go; that's the job of the OS. And modern OSes schedule threads, not processes. For example, in Windows (from https://docs.microsoft.com/en-us/windows/win32/procthread/about-processes-and-threads):

 

Quote

A thread is the entity within a process that can be scheduled for execution.

...

Microsoft Windows supports preemptive multitasking, which creates the effect of simultaneous execution of multiple threads from multiple processes. On a multiprocessor computer, the system can simultaneously execute as many threads as there are processors on the computer.

 

Therefore, an application doesn't need to explicitly support simultaneous multi-threading to take advantage of it. If it has multiple threads to run, it's already taking advantage of it.
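A small sketch of what that means in practice: the program below only creates threads; which logical processor (including SMT siblings) each one lands on is entirely the OS scheduler's decision.

```python
# The app never names a core or mentions SMT; the kernel's scheduler
# spreads these OS threads over whatever logical CPUs exist.
import threading

def spin(n: int) -> None:
    total = 0
    for i in range(5_000_000):     # plain busy work
        total += i
    print(f"thread {n} done")

threads = [threading.Thread(target=spin, args=(i,)) for i in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# (CPython's GIL serializes the bytecode, but these are still real OS
# threads and are scheduled preemptively exactly as the quote describes.)
```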


Whoa there! It's been a while since I've been on, and this thread has gotten big! Not gonna lie, it's cool that one of my threads sort of blew up. Anyways, take care guys.



On 12/10/2019 at 10:05 PM, comander said:

"640k ought to be enough for anybody" <- it isn't. 
 

Is this a distance-squared relationship? I.e., if you're 6x as far away you get 1/36th the radiation (more accurately, per unit of area)?

Also, are there threshold effects, i.e. does the probability of damage drop like a rock if the radiation is below some threshold (e.g. can't get through a phospholipid bilayer)? If there are threshold effects then the result is unlikely to be strictly additive (i.e. 1/36th the effect for 36x as long resulting in the same total effect).

Don't know if it's been answered earlier, but yes, it should be. The formula has (radius)^2 on the bottom, so a doubling in distance results in a quarter of the radiation. The same is true for gravitation and static electric charge.
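Written out, for a point source radiating power P evenly in all directions:

```latex
I(r) = \frac{P}{4\pi r^2},
\qquad
\frac{I(6r)}{I(r)} = \frac{r^2}{(6r)^2} = \frac{1}{36}
```

Double the distance, a quarter of the intensity; six times the distance, 1/36th.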



On 12/13/2019 at 12:58 PM, mr moose said:

In the early 80s the Commodore 64 was sold as a personal computer. The term PC was already embedded in most IT circles by then (it was called IT then). A home computer was often sold as a PC; people just called them home computers because they had one at home rather than at the office.

At one point in time there was "computer," "minicomputer," and "microcomputer." The last one is explicitly what the PC is. Yes, back when they were 50 lbs and not portable at all.

 

The "PC" basically only ever described the original 8088 IBM 5150. After that point "PC Compatible" was the term used until everything but the Mac as competition went out of business for the home computer market. Then of all things, the Mac switched to the same CPU as the "PC Compatible's" and thus you could even run Windows on a Mac, thus putting the entire "PC vs Mac" out to pasture.

 

It might be fair to call a smartwatch, smartphone, and tablet (e.g. iPad, not Tablet PC) a family of nanocomputers (of which H/PC (Handheld PC) and Pocket PC were both Microsoft marketing terms for early PDA devices, all of which were bigger than current smartphones).


On 12/13/2019 at 11:17 PM, Mira Yurizaki said:

A few that keep bugging me that I see people say:

 

X Application is Single Threaded

My main issue with this is taking the meaning of the term literally: most applications are not single threaded. They have multiple threads.

 

...

Therefore, an application doesn't need to explicitly support simultaneous multi-threading to take advantage of it. If it has multiple threads to run, it's already taking advantage of it.

Eh, no. The reason people say "X is single threaded" is that the application is incapable of using its threads in any form that splits the load evenly. For example, most games use exactly 1/(number of cores) of the CPU and no more. On a dual-core system this is not unreasonable, but on a quad core or larger it becomes immediately obvious when an application has not been designed to use multiple threads.

 

But many applications simply can not be multithreaded because of bad programming practices. For example, "forking" a process is an extremely inefficient use of CPU and memory resources, yet it continues to be the default in many programs, including ones that are capable of multi-threading, because of plugins/extensions that are not thread-safe. For example, Apache httpd continues to default to prefork on Linux and BSD systems because of mod_perl and mod_php still being used by developers instead of the socket-based process manager servers (e.g. php-fpm). What makes it worse is that people continue to believe the myth that "forking" is the correct model when it's the worst option.
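The two models sit side by side in Python's standard library, if you want to poke at them (a sketch; the handler just echoes a line, and the port is arbitrary):

```python
# Fork-per-connection vs thread-per-connection, stdlib edition.
import socketserver

class Echo(socketserver.StreamRequestHandler):
    def handle(self) -> None:
        self.wfile.write(self.rfile.readline())   # echo one line back

# Swap one name to swap the concurrency model: ForkingTCPServer clones
# the whole process per connection (the prefork model above, POSIX only);
# ThreadingTCPServer spawns a comparatively cheap thread instead.
with socketserver.ThreadingTCPServer(("127.0.0.1", 8642), Echo) as srv:
    srv.serve_forever()    # blocks; Ctrl-C to stop the sketch
```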

 

I've run web servers in multithreaded mode for over a decade. What made it possible was the predecessor to php-fpm, which was fcgid. That allowed PHP and Perl to work on multithreaded systems while leaving PHP and Perl (and anything else, like Python or Ruby) to run in their own single-threaded processes without the overhead of spinning up one Apache httpd server per script. It improved things so much that those machines ARE STILL RUNNING this configuration today.

 

So how do we get PHP, Perl, Python, Ruby to be multithreaded? You can't. Most scripting languages are designed to do scripting, not heavy lifting. JavaScript in the web browser? Not threaded; web workers are not threads, they are preforks. Web browser tabs? Preforked processes that you can actually see in Task Manager. Now there is a reason for the latter: one tab can't crash another, and can't access the memory of other tabs. But it wastes so much memory and CPU. There is something extremely sad and annoying about seeing 20 tabs take 4GB of RAM.

 

But it doesn't stop there. Go take a look at the kind of threading available in Unreal, Unity, Game Maker, and such. Two of these don't really support threads: Unity doesn't, because it doesn't have a thread-safe API, and Game Maker doesn't have any threads to speak of. If you want to use threads you're better off with Godot or Xenko, which support threading in their scripting languages.

 

When people complain about games being single threaded, they are talking about games made in engines like Unity, or HTML5 engines that can't make use of threads as a consequence of the platform. Sure, you can certainly make something cool, and the game engine will only use threads for what can safely be parallelized, if it's designed to use them, but in most cases that's simply not a thing.

 

Only two things are ever really parallelized in games: image/video decoding and AI/physics. The former is usually taken care of by a GPU library, whereas the latter barely makes a dent in CPU usage except in particle/cloth physics. In other words, you might see a game use 30% on a quad core, but only because ~5% of that is the media decoder, which can operate in a separate thread. The actual playback still has to take place in the main thread. This is why you typically see audio stutter along with frame drops in DX9-and-earlier programs; newer games use callback APIs that tell the main thread when they're done so it can copy the buffer.
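That callback/queue pattern in miniature (a sketch, not any real audio API): the decoder hands finished buffers to the main loop instead of the main loop blocking on the decode.

```python
# A worker posts finished buffers to a queue; the "main thread" drains
# whatever is ready each frame and never blocks on the decoder.
import queue
import threading
import time

ready = queue.Queue()

def decoder() -> None:
    for i in range(5):
        time.sleep(0.02)             # stand-in for decode work
        ready.put(f"buffer {i}")     # completion signal to the main loop

threading.Thread(target=decoder, daemon=True).start()

for frame in range(20):              # stand-in for the render loop
    try:
        buf = ready.get_nowait()     # copy the buffer only if one is ready
        print(f"frame {frame}: mixed {buf}")
    except queue.Empty:
        pass                         # nothing decoded yet; keep rendering
    time.sleep(0.01)
```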

 

 


50 minutes ago, Kisai said:

At one point in time there was "computer," "minicomputer," and "microcomputer." The last one is explicitly what the PC is. Yes, back when they were 50 lbs and not portable at all.

 

The "PC" basically only ever described the original 8088 IBM 5150. After that point "PC Compatible" was the term used until everything but the Mac as competition went out of business for the home computer market. Then of all things, the Mac switched to the same CPU as the "PC Compatible's" and thus you could even run Windows on a Mac, thus putting the entire "PC vs Mac" out to pasture.

 

It might be fair to call a smartwatch, smartphone, and tablet (e.g. iPad, not Tablet PC) a family of nanocomputers (of which H/PC (Handheld PC) and Pocket PC were both Microsoft marketing terms for early PDA devices, all of which were bigger than current smartphones).

 

A lot of computers in the 80s were referred to as personal computers. The term "PC compatible" came to mean IBM compatible colloquially, but the term PC was always used to mean any computer a single user would operate, be it a home computer, microcomputer, or whatever.

 

Several computers from the 80s were advertised as PCs:

 

ZX Spectrum

Commodore, most variants

Amstrad CPC (it's actually in the name: Colour Personal Computer)

 

When I was around in the 80s, it was common to call it a personal computer or PC.



Myth: Quality = Price (i.e. high quality = expensive, and low quality = cheap)

Fact: Quality is the ability to repeatedly and consistently meet a metric. If a product uses cheap parts, is assembled cheaply, is priced cheap, and still meets or surpasses that metric, that is high quality. Marketing uses what you think quality means to make a product seem upscale or high-end.



1 hour ago, The1Dickens said:

Myth: Quality = Price (i.e. high quality = expensive, and low quality = cheap)

Fact: Quality is the ability to repeatedly and consistently meet a metric. If a product uses cheap parts, is assembled cheaply, is priced cheap, and still meets or surpasses that metric, that is high quality. Marketing uses what you think quality means to make a product seem upscale or high-end.

While I agree with that in general, keep in mind that, generally speaking, price can be an indicator of quality. The old adage that you get what you pay for is true more often than not. A product that uses cheap parts, is assembled cheaply, has a cheap price tag, and still meets a certain standard usually will not last as long, or otherwise be as reliable, as a higher-priced product. On the flip side of the coin, many higher-priced products are higher priced only because of the name reputation they acquired in the past. Quality vs. price has to be evaluated on a case-by-case basis.



2 hours ago, Lady Fitzgerald said:

While I agree with that in general, keep in mind that, generally speaking, price can be an indicator of quality. The old adage that you get what you pay for is true more often than not. A product that uses cheap parts, is assembled cheaply, has a cheap price tag, and still meets a certain standard usually will not last as long, or otherwise be as reliable, as a higher-priced product. On the flip side of the coin, many higher-priced products are higher priced only because of the name reputation they acquired in the past. Quality vs. price has to be evaluated on a case-by-case basis.

How about “perceived quality and quality are not always the same thing”?


