Everything posted by pyrojoe34

  1. If you think that’s what net neutrality is then you’ve been fooled by misinformation campaigns. Net neutrality does not give the government power to block sites at all, it just says that all data needs to be treated the same so that certain data can NOT be blocked or throttled.
  2. That's my point. Mouse sensitivity (DPI) is not polling rate. They are independent of each other. That's like saying a monitor's refresh rate is the same as its latency; they're not the same at all. DPI just tells the mouse what the conversion rate is between real-life movement and on-screen movement (like changing the location of the fulcrum on a lever). Polling rate tells the mouse how many times to report its location every second. What I'm saying is that your friend is confusing DPI with polling rate. What he's doing should have no effect on accuracy and doesn't make much sense to me.
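To make the independence concrete, here's a tiny illustrative sketch (made-up function names, not a real driver API): DPI determines how many counts a physical movement produces, while polling rate only determines how often those counts get reported.

```python
# Hypothetical illustration: DPI and polling rate are independent settings.
# DPI sets how physical movement converts to counts; polling rate only
# sets how often the accumulated counts are reported to the PC.

def counts_for_movement(inches_moved: float, dpi: int) -> float:
    """Total counts generated for a physical movement -- depends on DPI only."""
    return inches_moved * dpi

def reports_per_second(polling_rate_hz: int) -> int:
    """How often the mouse reports its counts -- DPI plays no role here."""
    return polling_rate_hz

# Moving the mouse 1 inch at 800 DPI always yields 800 counts,
# whether the mouse polls at 125 Hz or 1000 Hz.
print(counts_for_movement(1.0, 800))                      # 800.0 either way
print(reports_per_second(125), reports_per_second(1000))  # 125 1000
```

Changing one knob never changes the other function's output, which is the whole point.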
  3. I think he's confusing the polling rate with the DPI.
  4. You think Intel will actually license x86 to Nvidia? Unless they get ordered to (anti-trust protections have become a joke so I doubt it)... I don’t see why they would... Unless something big changes, only Intel, AMD, and VIA can make x86-based CPUs. They could try to make a completely novel architecture/instruction set but that would be a massive hurdle and require every OS/program to be completely rewritten which I just don’t see happening any time soon.
  5. 750W is more than enough. I have a 6800K (OCed), which has a higher TDP than the 8700K, a GTX 1080, 2 SSDs, 2 HDDs, an AIO, etc., and under gaming load it draws 350-400W. Peak usage (using benchmarks that draw more power than any real task ever would) was ~520W in bursts, 450W continuous. Your system should draw 50+W less than mine with 1 GPU, and at most 200W more than that with 2 GPUs.
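As a quick sanity check on the headroom, here's the arithmetic spelled out (the 520W figure is my measured burst peak; the 200W second-GPU allowance is a generous estimate, not a measurement of your exact parts):

```python
# Rough PSU headroom check using the numbers from the post above.
psu_watts = 750
measured_peak_burst = 520   # observed burst peak on a 6800K + GTX 1080 system
second_gpu_estimate = 200   # generous allowance for an additional GPU

worst_case = measured_peak_burst + second_gpu_estimate
headroom = psu_watts - worst_case
print(worst_case, headroom)  # 720 30 -- still inside 750W even in this worst case
```

And that worst case assumes your base system draws as much as mine, which it shouldn't.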
  6. Well then you should either set a different monitor as the primary or run the games off a secondary monitor. You could also get an HDMI to DP adapter.
  7. You don't see this? Weird. Anyway, switching the primary monitor should also work fine.
  8. Yes, if the main display is on the 1080Ti it will use that to drive games set on the primary display. You should have your primary monitor plugged into the 1030 if you don't want the 1080Ti to be used for games running on the primary display.
  9. You should be able to set in the BIOS which PCIe slot is the primary video output. You should also go into the nvidia control panel and select "Manage 3D settings". Under "Global Settings" select the GT1030 as the "preferred graphics processor". Under program settings you can specifically select which programs use the 1080Ti.
  10. 1U servers are already cramped for space. If you can squeeze in a few extra drives by using 2.5” then it’s a better option. They also block less airflow, and they obviously didn’t need the extra space to pack a ton of storage in there. I’m also wondering whether hanging too many NAND chips off a single controller hurts random read/write performance, and whether it’s better to wait for larger chips than to cram a ton of smaller chips onto a single PCB.
  11. I mean system ram usage, and not the frequency but the % of total used (unless you mean 5000-6000MB?). Also make sure your GPU drivers are up to date. You can also try a fresh install by downloading directly from NVIDIA, running DDU, then doing a fresh install.
  12. Which is more or less the same as I said in separating out the roles. Keep research separate from education, combining the two means both sides suffer. Hire teaching professors to teach and researchers to research. It's kinda like a company hiring an engineer to do engineering, accounting, training, PR work, and management simultaneously when all they should really be doing is engineering.
  13. I completely agree that that’s an issue and it needs to be rectified. The way I see it, the best way to tackle this is threefold: free up professors by not making them spend over 3/4 of their time chasing funding (most do little actual lab work, so it’s not the research itself that’s the issue, it’s the funding problem); hire professors not solely on their funding/publications but also on their teaching ability (right now the ones who focus on teaching get hired less often, because departments only hire the ones who pour all their effort into the lab); and separate the roles of teaching professors and researchers (hire teachers to teach and researchers to research). There is also the problem (as at every level of education) of requiring them to do more than ever before and stretching them too thin.
  14. What is your RAM usage? Is that maxing out? What is your per-thread CPU use? (As in, are any threads at 100%? E.g. a 4-core with only 1 core pegged at 100% will report overall CPU use of 25%, but the CPU is still the issue.) This actually might be the key, since you get better usage at higher resolutions and in areas with higher relative GPU demand (grassy areas with fewer cars/NPCs). What is your drive usage? (Not just read/write speeds but actual controller usage, shown as a %.) If you are using a HDD for both the OS and games you might be bottlenecked when loading assets. (BTW, if you are running your OS on a HDD I highly recommend you get at least a 120-250GB SSD for the OS. The difference is absolutely massive for general use, and it also lowers the load on the HDD so it can concentrate on loading game assets rather than running the OS. Also, if your page file is on the HDD and you're running out of RAM (which is possible with modern games and only 8GB), then your virtual memory will be super slow.) Make sure you're not limiting performance with vsync or a software framerate limit (in games, in the control panel, or in GPU programs like Precision X).
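The per-thread point is worth a concrete example, since it trips people up constantly: the "overall" CPU figure is an average across cores, so one fully pegged core can hide behind a low-looking number.

```python
# Why "25% CPU usage" can still mean a CPU bottleneck: the overall figure
# averages all cores, hiding a single pegged core.

def overall_cpu_percent(per_core: list) -> float:
    """Overall CPU % as task managers typically report it: mean of per-core usage."""
    return sum(per_core) / len(per_core)

# One core at 100%, three idle -> overall reads 25%, but any single-threaded
# game logic running on that pegged core is still fully CPU-bound.
print(overall_cpu_percent([100.0, 0.0, 0.0, 0.0]))  # 25.0
```

So always check the per-core graphs, not just the headline number.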
  15. Yea, that’s not true at all. Budgets are being slashed every year (thanks, Congress; the cost of one of those tanks the army doesn’t even want could fund our lab for a decade), and most grants go preferentially to big, well-funded labs, leaving the rest of us clawing for basic supplies and reagents. Our molecular docking core runs simulations on 1080s and 1080Tis and we are perpetually short on computational power (the job queue is always growing). The core buys a couple new cards whenever the budget allows, and once a tower runs out of space (we run 8 per system) we kill several months of budget to add a new tower to the cluster. Science is already hard enough when you’re well funded, but regularly putting projects on standby because you run out of reagents, or repairing 20-year-old equipment yourself because replacing it costs 6 months of budget and a service visit costs half that, is a massive resource drain. We have been the world leader in scientific research for 100+ years, but now we think science is a waste of money unless there is a profit to be made, and we’re falling further and further behind. If people want to make America great, then maybe we should support the research that lays the foundation for every single technology that industry uses for innovation, progression, and prosperity. It seems like half the people here think that researchers have a ton of money to spend? That's only true for maybe 5% of labs at the top few universities; the vast majority of academic labs are not limited by a lack of interesting research prospects or breakthroughs, the number 1 limiting factor is budget. Every PI (the PhD/professor who runs a lab) I know spends the vast majority of their time writing grant proposals and seeking new funding sources, and relies on grad students and lab techs to do the actual research. Ever wonder why professors seem to spend so little time on the courses they teach and TAs do so much of the teaching? It's because they have to spend all their time trying to fund the lab from a constantly shrinking pool of funding.
  16. Cut out a small piece of electrical tape. Less permanent than nail polish.
  17. Slightly off topic: you have a 75Hz monitor? Why wouldn't you cap the framerate at 75 so you're not wasting power? The screen tearing is because your framerate is above 75fps. Cap it at 75 and you won't get tearing (FreeSync doesn't help at all with a 1080Ti, and even if it did, it wouldn't help above 75Hz anyway).
  18. Hemp is (kinda) legal sure, but extracts from hemp are technically not legal. You can occasionally find stores that sell hemp extracts or CBD extracts in states where weed is still illegal but officially, if you make a hemp concentrate it is no longer legal and falls under the CSA. Those stores are technically breaking the law but it is officially federally illegal to possess hemp-based CBD extracts. Source
  19. According to DC law, sure, that means DC police and prosecutors can't arrest and charge you for it. However, if they wanted, DEA agents and federal prosecutors could still arrest you and charge you with breaking federal law, even if it is "legal" in the area.
  20. The weed thing is weird. That’s the gov (kinda) deciding not to enforce federal law in states with their own medical/recreational laws. They can, however, choose to enforce it again whenever they want. NN, on the other hand, was a regulation that got removed; they did NOT make NN illegal. States have the constitutional right to create laws that do not override federal jurisdiction. Since there is no law that says NN is illegal (although Pai tried to add an amendment to do exactly that), states can choose to make their own laws on it now that the FCC has repealed the existing regulations. Those two are distinctly different situations. I’m glad my state decided to take it into its own hands; we’re not exactly full of progressive constituents, but I’m happy the legislators decided to mandate NN in the state. Hopefully other states will follow soon.
  21. Jesus man... get organized, nobody needs that many tabs at once... it's okay to close a page and open it again later. Having said that, if you're running out of 32GB (I assume 4x8GB?) with a browser, games, and non-gigapixel Photoshop projects... you're doing something wrong. If I can train a non-optimized machine learning algorithm to run a 10+ million complex-variable prediction model in less than 20GB of RAM, then your porn and games should not use that much either.
  22. Initialize the disk first, then create a new partition. If Disk Management gives you trouble (it happens) use diskpart via an admin instance of CMD. You can look up the basic diskpart commands with Google.
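If Disk Management won't cooperate, the basic diskpart flow looks roughly like this (a sketch, not a definitive recipe: "disk 1" is an assumption, and `clean` wipes whatever disk is selected, so confirm the number with `list disk` first):

```
REM Run from an elevated Command Prompt.
diskpart
list disk                     REM find your new drive's number
select disk 1                 REM assumption -- use YOUR disk number here
clean                         REM erases the selected disk
convert gpt
create partition primary
format fs=ntfs quick
assign                        REM gives the volume a drive letter
exit
```

Double-check the disk number twice; selecting the wrong disk and running `clean` will destroy its data.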
  23. Everyone here is oversimplifying the answer. The truth is much more complicated than "IPC > Frequency > Cores > Threads". For example:
- Some games parallelize really well; in those cases a function of IPC*clock*threads picks the best CPU (higher is better). That is to say, a CPU with low IPC and clock speed but a ton of threads can still beat out a CPU with much higher IPC.
- For tasks that parallelize well but whose threads can't efficiently share resources/pipelines (the way hyperthreading does), the equation is IPC*clock*cores.
- For tasks that can only use a single thread at a time, the answer is IPC*clock.
- For a game that multi-threads decently but has certain single-threaded bottleneck functions, the answer depends on a lot of things. You need to know where the bottleneck lies: are the single-threaded tasks "waiting" for their dependencies to finish processing? Are all the available threads in use, with ops waiting for a free thread? Are the tasks limited to just 2/4/6/etc. threads? In these cases (most cases) you have to consider all the variables simultaneously.
- There are plenty of other things to consider beyond that. Is it not the "processing" itself but data flow to the right place that's the issue, because the workload needs a large low-latency cache? In that case you might be better off with a CPU with large caches (L1/2/3) even if its IPC*clock is lower.
Even with these few points I'm still way oversimplifying the issue. As others have said, the general trend is that modern games tend to prefer high IPC*clock given at least 4 available cores. That isn't true of every game, though, and different settings and framerates can favor different setups. My long-winded point is: never judge a CPU (or any microprocessor) by its public specs. Real-world tests, with your specific use case, are the only way to know which hardware is better for you.
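The first three rough models above can be sketched numerically. The CPU specs below are made up for illustration; real CPUs don't reduce to a single product, which is the post's whole point.

```python
# Toy comparison using the rough throughput models from the post.

def parallel_score(ipc: float, clock_ghz: float, threads: int) -> float:
    """Well-parallelized workload: IPC * clock * threads."""
    return ipc * clock_ghz * threads

def single_thread_score(ipc: float, clock_ghz: float) -> float:
    """Single-threaded workload: IPC * clock."""
    return ipc * clock_ghz

# Hypothetical CPUs: "many slow threads" vs "few fast threads".
many_slow = dict(ipc=0.8, clock_ghz=3.0, threads=16)
few_fast = dict(ipc=1.2, clock_ghz=4.5, threads=6)

# The low-IPC chip wins the well-parallelized case on sheer thread count...
print(parallel_score(**many_slow), parallel_score(**few_fast))
# ...but the high-IPC chip wins any single-threaded bottleneck.
print(single_thread_score(0.8, 3.0), single_thread_score(1.2, 4.5))
```

Same two chips, opposite winners depending on the workload, so "which CPU is better" has no workload-free answer.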
  24. Yes, those are exactly the speeds I get with mine... however, other than benchmarks (and things like simple file scans) you'll never see those speeds, since the drive is not the bottleneck for most applications. With network transfers it doesn't matter at all, since the max speed on a standard Gb connection is only ~120MB/s, not to mention that the speed of the other drive you are transferring to/from will also be a bottleneck unless both are high-speed NVMe drives. (Transferring files from my SATA SSD to my NVMe M.2 only gets me ~500MB/s because the SATA drive is the bottleneck.)
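The bottleneck logic is just a minimum over every link in the chain. A tiny sketch, with round illustrative speeds:

```python
# A transfer can only run as fast as its slowest link.

def effective_transfer_mb_s(*link_speeds_mb_s: float) -> float:
    """Effective throughput of a chain of links: the slowest one wins."""
    return min(link_speeds_mb_s)

# NVMe read (~3000 MB/s) over gigabit Ethernet (~120 MB/s) to a SATA SSD (~500 MB/s):
print(effective_transfer_mb_s(3000, 120, 500))  # 120 -- the network is the cap

# Local SATA SSD -> NVMe copy: the SATA drive (~500 MB/s) is the bottleneck:
print(effective_transfer_mb_s(500, 3000))  # 500
```

So a faster NVMe drive buys you nothing on a gigabit network transfer.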
  25. I think we're arguing different things. You seem to be saying that virtual surround tech right now is not very good (that's true, it is far from perfect). What you said above though is that a perfect surround experience with stereo headphones/mics is impossible. All I'm saying is that it is not at all impossible and only requires some tech advancements and personalized rendering models to get to the point where your brain cannot tell the difference between virtual surround and "true" surround. As long as the amplitude, frequency, and temporal shifts that reach your eardrums are identical, then the experience will also be identical.