
It's not just gamers who can't get graphics cards; it's scientists as well.

grimreeper132
10 minutes ago, asus killer said:

They shouldn't be buying $500 graphics cards in the first place; those are gamer cards, made for the consumer gaming market, which is a different market from the scientific cards. I guess no one prohibits them from doing so, but still, it's not the same market.

Not all researchers are flush with money; they have set amounts and goals they need to achieve with them. Also, if your task does not require a Quadro, why should you buy one? Just because you see the word 'research' doesn't mean they need ECC VRAM or some obscure driver optimization.

 

You know who calls Quadros professional cards? Nvidia. Yep, I'm sure they're totally impartial on the matter and have nothing to gain by up-selling you to a card you don't need.

 

Don't blame the researchers for the RGB LEDs on their GPUs; it might look 'unprofessional', but wasting money on things you don't need is even more so.


17 minutes ago, leadeater said:

Not all researchers are flush with money; they have set amounts and goals they need to achieve with them. Also, if your task does not require a Quadro, why should you buy one? Just because you see the word 'research' doesn't mean they need ECC VRAM or some obscure driver optimization.

 

You know who calls Quadros professional cards? Nvidia. Yep, I'm sure they're totally impartial on the matter and have nothing to gain by up-selling you to a card you don't need.

 

Don't blame the researchers for the RGB LEDs on their GPUs; it might look 'unprofessional', but wasting money on things you don't need is even more so.

The more I read this, the more I think of my PC, which sort of is a work PC for my course. It has RGB on it because that was the cheapest RAM and GPU (with good cooling; I wanted a tri-fan design rather than a dual-fan for my 1080 Ti just so I had maximum cooling).

The owner of "too many" computers, called

The Lord of all Toasters (1920X 1080ti 32GB)

The Toasted Controller (i5 4670, R9 380, 24GB)

The Semi Portable Toastie machine (i7 3612QM (was an i3) intel HD 4000 16GB)

Bread and Butter Pudding (i7 7700HQ, 1050ti, 16GB)

Pinoutbutter Sandwhich (raspberry pi 3 B)

The Portable Slice of Bread (N270, HAHAHA, 2GB)

Muffinator (C2D E6600, Geforce 8400, 6GB, 8X2TB HDD)

Toastbuster (WIP, should be cool)

loaf and let dough (A printer that doesn't print black ink)

The Cheese Toastie (C2D (of some sort), GTX 760, 3GB, win XP gaming machine)

The Toaster (C2D, intel HD, 4GB, 2X1TB NAS)

Matter of Loaf and death (some old shitty AMD laptop)

windybread (4X E5470, intel HD, 32GB ECC) (use coming soon, maybe)

And more, several more


46 minutes ago, SpaceGhostC2C said:

"shouldn't"? What does it even mean? They will buy whatever cover they needs at the lowest cost, like anyone else.

Those are GPUs, specialized processors focused on matrix operations. They are for the "whoever can benefit from matrix handling" market.

Except maybe you, I guess :P 

Everyone has every right to buy whatever for whatever reason; it's a free world (for the most part). Still, I was just correcting you when you said it was "one market": it isn't, and it's split up by the manufacturers. I should have written "shouldn't" in quotes rather than plain shouldn't; that part is my bad.

The point still stands: they are buying something that was not intended for the use they are giving it. It may seem irrelevant, but it isn't. Nvidia has to forecast sales based on predictions, and if people are buying cards for uses, and in quantities, they never accounted for, then shortages are more likely. And it's not a few GPUs; it's a deviation in the hundreds for just one case. Sum it all up and you can see how it goes so wrong. A gamer buys one or two cards at most; a project like this buying cards removes the cards Nvidia intended and predicted for hundreds of gamers.

 

That's my point.

.


31 minutes ago, asus killer said:

Everyone has every right to buy whatever for whatever reason; it's a free world (for the most part). Still, I was just correcting you when you said it was "one market",

 

Yes, it is. The market for GPUs is just one market in which GTX and Quadro (and non-Nvidia) cards are sold. But even if you want to treat market segmentation as two different markets (if they really were, you wouldn't need segmentation strategies to begin with), it would still be the case that the "GTX market" is just one market, and in that market buyers can be gamers, miners, scientists, deep-fakers, shiny-object collectors, or confused vacuum cleaner buyers. Whoever they are, they are all on the demand side of the same, single market for GTX GPUs, and when the price goes up, it goes up for everyone.

The news would be if any group had some special deal or other form of shielding against price changes, compared to the general public.

 


9 hours ago, BuckGup said:

AMD and Nvidia can easily produce more GPUs, but they are scared the prices might drop and they would lose some money. Also, Pascal is now end of life.

Keep in mind that there still is a global DRAM shortage... 

AMD Ryzen R7 1700 (3.8ghz) w/ NH-D14, EVGA RTX 2080 XC (stock), 4*4GB DDR4 3000MT/s RAM, Gigabyte AB350-Gaming-3 MB, CX750M PSU, 1.5TB SDD + 7TB HDD, Phanteks enthoo pro case


37 minutes ago, SpaceGhostC2C said:

Yes, it is. The market for GPUs is just one market in which GTX and Quadro (and non-Nvidia) cards are sold. But even if you want to treat market segmentation as two different markets (if they really were, you wouldn't need segmentation strategies to begin with), it would still be the case that the "GTX market" is just one market, and in that market buyers can be gamers, miners, scientists, deep-fakers, shiny-object collectors, or confused vacuum cleaner buyers. Whoever they are, they are all on the demand side of the same, single market for GTX GPUs, and when the price goes up, it goes up for everyone.

The news would be if any group had some special deal or other form of shielding against price changes, compared to the general public.

 

 

"About NVIDIA
NVIDIA (Nasdaq: NVDA) is the world leader in visual computing technologies and the inventor of the GPU, a high-performance processor which generates breathtaking, interactive graphics on workstations, personal computers, game consoles and mobile devices. NVIDIA serves the entertainment and consumer market with its GeForce® graphics products, the professional design and visualization market with its Quadro® graphics products and the high-performance computing market with its Tesla™ computing solutions products. NVIDIA is headquartered in Santa Clara, Calif. and has offices throughout Asia, Europe and the Americas. For more information, visit www.nvidia.com."

http://www.nvidia.com/object/io_1238405290161.html

 

They even have separate divisions. Agree to disagree and let's move on. :)

.


5 hours ago, Bananasplit_00 said:

1: Why not get Quadros to begin with?

I regularly watch the Puget build streams (not right now, they're on a hiatus when it comes to streaming), and plenty of the machines they build for scientific purposes have multiple 1080 Tis in them instead of Titans or Quadros. The difference in performance doesn't seem to justify the extra cost.


13 hours ago, LtStaffel said:

Is this a joke? Who needs GPUs to listen for extraterrestrial broadcasts? And better yet, what is their scientific evidence that such broadcasts have the potential to happen? What are the chances of it happening?

A grant sounds good to help scientists do really good stuff like find a cure for canc----no, they're listening to space with radios... Maybe the grant should go to folding@home instead?

+1

It sounds like the people running the SETI@home project. I suggest you have a read of their site, as it's really interesting:

https://setiathome.berkeley.edu

 

Also, think of it more like looking for any sign of communication or signs of advanced technology, not something like a radio broadcast ^^
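(For anyone wondering what the GPUs actually do in a search like this: the bulk of the work is signal processing, e.g. running FFTs over chunks of recorded radio data and flagging narrowband spikes that stand out from the noise floor. Below is a toy sketch of that idea in plain NumPy; the sample rate, tone frequency and threshold are made up purely for illustration, and this has nothing to do with SETI@home's real pipeline.)

```python
import numpy as np

# Toy "observation": white noise with a faint narrowband tone buried in it.
# Every number here is invented purely for illustration.
fs = 1_000_000                      # sample rate, Hz
n = 2**20                           # samples in one chunk
t = np.arange(n) / fs
rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, n) + 0.02 * np.sin(2 * np.pi * 123_456.0 * t)

# Power spectrum of the chunk; on real hardware this FFT is the part you'd
# push onto a GPU, since every chunk is independent of every other chunk.
spectrum = np.abs(np.fft.rfft(data)) ** 2
freqs = np.fft.rfftfreq(n, d=1.0 / fs)

# Flag bins that sit far above the median noise floor.
threshold = 20 * np.median(spectrum)
candidates = freqs[spectrum > threshold]
print("candidate narrowband frequencies (Hz):", candidates)
```

Each chunk can be processed independently, which is exactly why cheap, massively parallel consumer GPUs map so well onto this kind of workload.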

Folding stats

Vigilo Confido

 


1 hour ago, asus killer said:

 

"About NVIDIA
NVIDIA (Nasdaq: NVDA) is the world leader in visual computing technologies and the inventor of the GPU, a high-performance processor which generates breathtaking, interactive graphics on workstations, personal computers, game consoles and mobile devices. NVIDIA serves the entertainment and consumer market with its GeForce® graphics products, the professional design and visualization market with its Quadro® graphics products and the high-performance computing market with its Tesla™ computing solutions products. NVIDIA is headquartered in Santa Clara, Calif. and has offices throughout Asia, Europe and the Americas. For more information, visit www.nvidia.com."

http://www.nvidia.com/object/io_1238405290161.html

 

They even have separate divisions. Agree to disagree and let's move on. :)

They might have different divisions, but the people they sell to are whoever wants to buy them. They aren't picky, as they can't be; they're a company and they sell to whoever.

 

As to why a scientist uses them: budgets. If they need the performance of a 1080 Ti and the choice is between a 1080 Ti (meant to be £700, closer to £800-900 right now) and a P6000 at £5,000 with near-identical performance, they will go 1080 Ti, because it's a lot cheaper and it means they can get more. Considering they seem to be buying about 64 at a time, that saves them a lot for nearly the same performance. Using these cards just to render games isn't using their full power; they are designed so they can do things like scientific research even if they aren't targeted at it. From the sounds of it, they don't need the extra features of the more expensive cards either; if they did, they would probably buy those instead.
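A quick back-of-the-envelope using only the prices quoted above (treat the figures as rough; they're just the numbers from this thread):

```python
# Rough cost comparison using the prices quoted in the post above (GBP).
cards = 64                    # "they seem to be buying about 64 at a time"
gtx_1080ti = 900              # upper end of the quoted £800-900 street price
quadro_p6000 = 5000           # quoted P6000 price

gtx_total = cards * gtx_1080ti          # 57,600
quadro_total = cards * quadro_p6000     # 320,000
print(f"64x 1080 Ti: £{gtx_total:,}")
print(f"64x P6000:   £{quadro_total:,}")
print(f"difference:  £{quadro_total - gtx_total:,}")   # £262,400 saved per batch
```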


Not directly related, but here's a little something on why space exploration and NASA are a good thing.

 


Just now, ADM-Ntek said:

Not directly related, but here's a little something on why space exploration and NASA are a good thing.

 

I don't get how people think that NASA, ESA, etc. are wastes of money. They aren't; they are pushing the boundaries of science and engineering, which is never a bad thing, and they help us a hell of a lot. Also, it's really fucking cool, and I can't wait till we have a settlement on Mars, because I really hope that happens in my lifetime.


10 hours ago, BuckGup said:

But they could still ramp it up a bit. The demand has faded, but they could have easily flooded the market a month ago and still made money. Still, though, people would buy GPUs.

Quote

When asked about increasing card production, we were given a few primary answers:

  • Betting on cryptocurrency is a big bet, because the mining market has not proven stable, and has proven unpredictable. To order on current demand doesn’t mean those cards show up tomorrow – they show up in a few months, and if cryptomining dies down in that time, that’s a big bag to be left holding. The manufacturers are forecasting months ahead (quarterly, actually), not weeks ahead.
  • NVIDIA and AMD could likely produce more GPUs, but board partners need to actually place an order for them. We’ve seen some uninformed content creators online accuse nVidia and/or AMD of undersupplying the market. Well, nVidia and AMD need a customer to sell those GPUs to – that’d be the board partners. If they’re not ordering more, nVidia and AMD aren’t going to make more. Simple as that.
  • New supply is showing up weekly, but it’s selling out fast. The best bet is to show up at a local retailer and ask when the next shipment comes in, then go there that day.
  • There is concern about over-production, especially if mining falls enough that the second-hand market becomes flooded, outstripping the ability of first-parties to make money.

https://www.gamersnexus.net/industry/3211-what-do-manufacturers-think-of-mining-and-gpu-prices

"Put as much effort into your question as you'd expect someone to give in an answer"- @Princess Luna

Make sure to Quote posts or tag the person with @[username] so they know you responded to them!

 RGB Build Post 2019 --- Rainbow 🦆 2020 --- Velka 5 V2.0 Build 2021

Purple Build Post ---  Blue Build Post --- Blue Build Post 2018 --- Project ITNOS

CPU i7-4790k    Motherboard Gigabyte Z97N-WIFI    RAM G.Skill Sniper DDR3 1866mhz    GPU EVGA GTX1080Ti FTW3    Case Corsair 380T   

Storage Samsung EVO 250GB, Samsung EVO 1TB, WD Black 3TB, WD Black 5TB    PSU Corsair CX750M    Cooling Cryorig H7 with NF-A12x25


3 hours ago, grimreeper132 said:

True, but budgets don't always allow that. If they need 64 cards about as powerful as a 1080 Ti (meant to be about £650-700, actually about £800-900), and the cards they are 'meant' to be buying are double to triple that price for not a massive gain in performance, they will take the consumer cards.

If I were looking into building a large render server for Blender, Quadros would be the very last cards I would look at too. The 1080 Ti is faster in raw CUDA performance than the fastest Quadro (besides the GP100), and if that is what these scientists require, it makes no practical sense for them to consider going Quadro either.

My eyes see the past…

My camera lens sees the present…


14 minutes ago, Zodiark1593 said:

If I were looking into building a large render server for Blender, Quadros would be the very last cards I would look at too. The 1080 Ti is faster in raw CUDA performance than the fastest Quadro (besides the GP100), and if that is what these scientists require, it makes no practical sense for them to consider going Quadro either.

That's pretty much what I am trying to say


2 hours ago, grimreeper132 said:

They might have different divisions, but the people they sell to are whoever wants to buy them. They aren't picky, as they can't be; they're a company and they sell to whoever.

 

As to why a scientist uses them: budgets. If they need the performance of a 1080 Ti and the choice is between a 1080 Ti (meant to be £700, closer to £800-900 right now) and a P6000 at £5,000 with near-identical performance, they will go 1080 Ti, because it's a lot cheaper and it means they can get more. Considering they seem to be buying about 64 at a time, that saves them a lot for nearly the same performance. Using these cards just to render games isn't using their full power; they are designed so they can do things like scientific research even if they aren't targeted at it. From the sounds of it, they don't need the extra features of the more expensive cards either; if they did, they would probably buy those instead.

Money is not the issue, and neither are the types of cards. They don't want a graphics card to search for aliens; they want the same thing miners want: a calculator. What they ideally want is cheap RX 570s, same as the miners. They could do it with CPUs, with cheaper GPUs, or with more expensive GPUs, but it's a trade-off between performance and electricity cost.

I'm not an expert, but if they really wanted a graphics card per se, the Quadros have different specifications; it's not like comparing video game performance.

The problem I was pointing out is that if everyone starts using consumer video cards for whatever calculations, be it mining, scientific work, or anything else, then it's really hard for companies to predict sales and plan production accordingly, which helps me see Nvidia's and AMD's side. That's why they have separate divisions, different buyers, different salespeople, etc., so they know the different markets.

.


6 hours ago, miagisan said:

You are kidding, right? Most astronomical "listening" and "observing" is almost never done using audio or visual observations.

I was pointing out that extraterrestrial life probably does not affect us as much as other, more immediate issues whose research GPUs could be used to enhance.

Join the Appleitionist cause! See spoiler below for answers to common questions that shouldn't be common!

Spoiler

Q: Do I have a virus?!
A: If you didn't click a sketchy email, haven't left your computer physically open to attack, haven't downloaded anything sketchy/free, know that your software hasn't been exploited in a new hack, then the answer is: probably not.

 

Q: What email/VPN should I use?
A: Proton mail and VPN are the best for email and VPNs respectively. (They're free in a good way)

 

Q: How can I stay anonymous on the (deep/dark) webzz???....

A: By learning how to de-anonymize everyone else; if you can do that, then you know what to do for yourself.

 

Q: What Linux distro is best for x y z?

A: Lubuntu for things with little processing power, Ubuntu for normal PCs, and if you need to do anything else then it's best if you do the research yourself.

 

Q: Why is my Linux giving me x y z error?

A: Have you not googled it? Are you sure StackOverflow doesn't have an answer? Does the error tell you what's wrong? If the answer is no to all of those, message me.

 


Just now, LtStaffel said:

I was pointing out that extraterrestrial life probably does not affect us as much as other, more immediate issues whose research GPUs could be used to enhance.

Yea, true, but that's the example given by the BBC for some reason.


1 hour ago, asus killer said:

Money is not the issue, and neither are the types of cards. They don't want a graphics card to search for aliens; they want the same thing miners want: a calculator. What they ideally want is cheap RX 570s, same as the miners. They could do it with CPUs, with cheaper GPUs, or with more expensive GPUs, but it's a trade-off between performance and electricity cost.

I'm not an expert, but if they really wanted a graphics card per se, the Quadros have different specifications; it's not like comparing video game performance.

The problem I was pointing out is that if everyone starts using consumer video cards for whatever calculations, be it mining, scientific work, or anything else, then it's really hard for companies to predict sales and plan production accordingly, which helps me see Nvidia's and AMD's side. That's why they have separate divisions, different buyers, different salespeople, etc., so they know the different markets.

One could say that it was Nvidia that encouraged such use in the first place by bringing CUDA to consumer cards. Many "gaming" cards even advertise CUDA capability, so in a sense, they are also marketed for "whatever calculations" as well as gaming.
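For what it's worth, that's easy to verify: any CUDA-capable GeForce runs general-purpose compute through exactly the same tooling as a Quadro or Tesla. A minimal sketch using Numba's CUDA support (assumes the numba package and an Nvidia driver are installed; the kernel itself is just an arbitrary example):

```python
import numpy as np
from numba import cuda

@cuda.jit
def saxpy(a, x, y, out):
    # One GPU thread per element: out[i] = a * x[i] + y[i]
    i = cuda.grid(1)
    if i < out.size:
        out[i] = a * x[i] + y[i]

n = 1_000_000
x = np.random.rand(n).astype(np.float32)
y = np.random.rand(n).astype(np.float32)
out = np.zeros_like(x)

threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
# Runs the same on a GeForce as on a Quadro or Tesla -- CUDA doesn't care.
saxpy[blocks, threads_per_block](np.float32(2.0), x, y, out)

print(cuda.get_current_device().name)  # whatever CUDA-capable GPU is installed
print(np.allclose(out, 2.0 * x + y))   # True
```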


15 hours ago, BuckGup said:

AMD and Nvidia can easily produce more GPUs, but they are scared the prices might drop and they would lose some money. Also, Pascal is now end of life.

Well, in case you never noticed, production kept up just fine before last year. Then crypto skyrocketed and demand shot up. So Nvidia and AMD have two options: either spend millions ramping up production of an already dead generation, or spend those millions instead on their Volta production that is soon to be released.

 

It's bullshit that people are blaming Nvidia and AMD over the shortage of cards. It's like people think companies can magically pull another factory out of their ass in no time, with no cost to their bottom line. Especially for an end-of-life product.

 

Not to mention the fact that even if they ramped up production, the demand would be higher than ever due to the fact of TOR. I mean, imagine if a 1080 still only cost $500 when people at the moment are willing to pay $1200+ for one. It would be an endless fucking cycle.


10 hours ago, mr moose said:

This reminds me of a few years back when a few scientists were expressing their concerns about the reserves of helium. They were upset it was being wasted on party balloons when it was desperately needed for research. Being lighter than everything except hydrogen, once released it floats off into the atmosphere, from where it can actually escape Earth's gravity.

 

But fear not: just like helium, I am sure they will find a new natural reserve of 1080 Tis or Vegas somewhere.

If they literally wait a month, there will probably be thousands of 1080 Tis available for cheap, assuming Nvidia actually releases new cards (they probably will) that are significantly better than last gen.

"If a Lobster is a fish because it moves by jumping, then a kangaroo is a bird" - Admiral Paulo de Castro Moreira da Silva

"There is nothing more difficult than fixing something that isn't all the way broken yet." - Author Unknown

Spoiler

Intel Core i7-3960X @ 4.6 GHz - Asus P9X79WS/IPMI - 12GB DDR3-1600 quad-channel - EVGA GTX 1080ti SC - Fractal Design Define R5 - 500GB Crucial MX200 - NH-D15 - Logitech G710+ - Mionix Naos 7000 - Sennheiser PC350 w/Topping VX-1


9 hours ago, leadeater said:

[image]

And they said I was crazy when I suggested they'd be mining for GPUs in the future.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


23 hours ago, Canada EH said:

Scientists can afford the premium and buy in bulk if they require lots.

Yeah, that's not true at all. Budgets are being slashed every year (thanks, Congress; the cost of one of those tanks the army doesn't even want could fund our lab for a decade), and most grants are given preferentially to big, well-funded labs, leaving the rest of us clawing for basic supplies and reagents.

 

Our molecular docking core runs simulations on 1080s and 1080 Tis, and we are perpetually short on computational power (the job queue is always growing). The core buys a couple of new cards whenever the budget allows, and once a tower runs out of space (we run 8 per system), we kill several months of budget to get a new tower to add to the cluster.
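(The post doesn't describe how the lab actually schedules its work, but the "8 GPUs per tower plus a growing job queue" setup maps onto a very simple pattern: one worker per GPU draining a shared queue, pinning each job to its GPU via CUDA_VISIBLE_DEVICES. A hypothetical sketch, with made-up job script names:)

```python
import os
import subprocess
from queue import Queue, Empty
from threading import Thread

NUM_GPUS = 8  # "we run 8 per system"

# Hypothetical docking jobs; in reality these would come from the lab's own queue.
jobs = Queue()
for i in range(100):
    jobs.put(f"dock_job_{i:03d}.sh")

def worker(gpu_id: int) -> None:
    # Each worker owns one GPU and pulls jobs until the queue runs dry.
    while True:
        try:
            job = jobs.get_nowait()
        except Empty:
            return
        env = dict(os.environ, CUDA_VISIBLE_DEVICES=str(gpu_id))  # pin the job to this GPU
        subprocess.run(["bash", job], env=env, check=False)
        jobs.task_done()

threads = [Thread(target=worker, args=(g,)) for g in range(NUM_GPUS)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```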

 

Science is already hard enough when you're well funded, but regularly putting projects on standby because you run out of reagents, or having to repair 20-year-old equipment yourself because replacing it would cost six months of budget and having a tech come out would cost half that, is a massive resource drain.

 

We have been the world leader in scientific research for 100+ years, but now we think that science is a waste of money unless there is a profit to be made, and we're falling further and further behind. If people want to make America great, then maybe we should support the research that lays the foundation for every single technology that industry uses for innovation, progress, and prosperity.

 

It seems like half the people here think that researchers have a ton of money to spend. That's only true for maybe 5% of labs at the top few universities; the vast majority of academic labs are not limited by a lack of interesting research prospects or a lack of breakthroughs. The number one limiting factor is budget. Every PI (the PhD/professor who runs a lab) I know spends the vast majority of their time writing grant proposals and chasing new funding sources, and relies on grad students and lab techs to do the actual research. Ever wonder why professors seem so little concerned with the courses they teach, and TAs do so much of the teaching? It's because they have to spend all their time trying to fund the lab from a constantly shrinking pool of funding.

Primary PC-

CPU: Intel i7-6800k @ 4.2-4.4Ghz   CPU COOLER: Bequiet Dark Rock Pro 4   MOBO: MSI X99A SLI Plus   RAM: 32GB Corsair Vengeance LPX quad-channel DDR4-2800  GPU: EVGA GTX 1080 SC2 iCX   PSU: Corsair RM1000i   CASE: Corsair 750D Obsidian   SSDs: 500GB Samsung 960 Evo + 256GB Samsung 850 Pro   HDDs: Toshiba 3TB + Seagate 1TB   Monitors: Acer Predator XB271HUC 27" 2560x1440 (165Hz G-Sync)  +  LG 29UM57 29" 2560x1080   OS: Windows 10 Pro

Album

Other Systems:

Spoiler

Home HTPC/NAS-

CPU: AMD FX-8320 @ 4.4Ghz  MOBO: Gigabyte 990FXA-UD3   RAM: 16GB dual-channel DDR3-1600  GPU: Gigabyte GTX 760 OC   PSU: Rosewill 750W   CASE: Antec Gaming One   SSD: 120GB PNY CS1311   HDDs: WD Red 3TB + WD 320GB   Monitor: Samsung SyncMaster 2693HM 26" 1920x1200 -or- Steam Link to Vizio M43C1 43" 4K TV  OS: Windows 10 Pro

 

Offsite NAS/VM Server-

CPU: 2x Xeon E5645 (12-core)  Model: Dell PowerEdge T610  RAM: 16GB DDR3-1333  PSUs: 2x 570W  SSDs: 8GB Kingston Boot FD + 32GB Sandisk Cache SSD   HDDs: WD Red 4TB + Seagate 2TB + Seagate 320GB   OS: FreeNAS 11+

 

Laptop-

CPU: Intel i7-3520M   Model: Dell Latitude E6530   RAM: 8GB dual-channel DDR3-1600  GPU: Nvidia NVS 5200M   SSD: 240GB TeamGroup L5   HDD: WD Black 320GB   Monitor: Samsung SyncMaster 2693HM 26" 1920x1200   OS: Windows 10 Pro

Having issues with a Corsair AIO? Possible fix here:

Spoiler

Are you getting weird fan behavior, speed fluctuations, and/or other issues with Link?

Are you running AIDA64, HWinfo, CAM, or HWmonitor? (ASUS suite & other monitoring software often have the same issue.)

Corsair Link has problems with some monitoring software so you may have to change some settings to get them to work smoothly.

-For AIDA64: First make sure you have the newest update installed, then, go to Preferences>Stability and make sure the "Corsair Link sensor support" box is checked and make sure the "Asetek LC sensor support" box is UNchecked.

-For HWinfo: manually disable all monitoring of the AIO sensors/components.

-For others: Disable any monitoring of Corsair AIO sensors.

That should fix the fan issue for some Corsair AIOs (H80i GT/v2, H110i GTX/H115i, H100i GTX and others made by Asetek). The problem is bad coding in Link that fights for AIO control with other programs. You can test if this worked by setting the fan speed in Link to 100%, if it doesn't fluctuate you are set and can change the curve to whatever. If that doesn't work or you're still having other issues then you probably still have a monitoring software interfering with the AIO/Link communications, find what it is and disable it.


Errr... this is just a controversial thought:

 

Just think about all the cheap RX 480s/RX 580s/GTX 1060s flooding eBay once the hash rates of these cards drop significantly and miners can't unload them fast enough. It will be a glorious day for PC gaming! Well... RAM prices are still going up... so...


3 hours ago, pyrojoe34 said:

Yeah, that's not true at all. Budgets are being slashed every year (thanks, Congress; the cost of one of those tanks the army doesn't even want could fund our lab for a decade), and most grants are given preferentially to big, well-funded labs, leaving the rest of us clawing for basic supplies and reagents.

 

Our molecular docking core runs simulations on 1080s and 1080 Tis, and we are perpetually short on computational power (the job queue is always growing). The core buys a couple of new cards whenever the budget allows, and once a tower runs out of space (we run 8 per system), we kill several months of budget to get a new tower to add to the cluster.

 

Science is already hard enough when you're well funded, but regularly putting projects on standby because you run out of reagents, or having to repair 20-year-old equipment yourself because replacing it would cost six months of budget and having a tech come out would cost half that, is a massive resource drain.

 

We have been the world leader in scientific research for 100+ years, but now we think that science is a waste of money unless there is a profit to be made, and we're falling further and further behind. If people want to make America great, then maybe we should support the research that lays the foundation for every single technology that industry uses for innovation, progress, and prosperity.

 

It seems like half the people here think that researchers have a ton of money to spend. That's only true for maybe 5% of labs at the top few universities; the vast majority of academic labs are not limited by a lack of interesting research prospects or a lack of breakthroughs. The number one limiting factor is budget. Every PI (the PhD/professor who runs a lab) I know spends the vast majority of their time writing grant proposals and chasing new funding sources, and relies on grad students and lab techs to do the actual research. Ever wonder why professors seem so little concerned with the courses they teach, and TAs do so much of the teaching? It's because they have to spend all their time trying to fund the lab from a constantly shrinking pool of funding.

I understand what you're saying, but as a former university student this infuriated me a lot. Tuition fees are blowing up and the quality of the education couldn't be more disappointing! University professors didn't care much about teaching, but they were very passionate and proud about their research. I relied mostly on my TAs for everything; the professors were marginally useful. You could always tell they were too busy with research to provide a positive learning environment.

 

My point is that universities suck at teaching because they focus way more on research and funding than on learning. In my experience, tech school was wayyy better at teaching me things, as the instructors were more involved with the students.
