Everything posted by Daniel644

  1. dual channel is damn near more important to Ryzen than RAM speed.
  2. depends on your definition of a "full sized graphics card"; a Founders 1080 Ti is significantly smaller than, say, a ROG Strix 1080 Ti, which is an inch and a quarter longer front to back and about an inch wider (toward the outside of the case).
  3. ultimately it depends on the resolution you game at (what monitor do you have?); as you move up in resolution, the CPU becomes less of a bottleneck.
  4. that's what I do with the one I linked above
  5. that one works in several ways: you set the toggle switch to the RX position, plug your headphones into it, pair it to your phone, and BAM, your wired headphones are now Bluetooth headphones. I've used this thing for nearly 2 years for exactly the purpose you described. Those pictures are just some EXAMPLES of what it can do; it can function as a transmitter or a receiver, and as a receiver it does what you want. The power button doubles as a Play/Pause button, a LONG PRESS of the volume buttons turns the volume up or down, and a short press of those buttons skips tracks. I literally used this exact device for 2 years to make my wired headphones Bluetooth so I could pair them to the Bluetooth in my phone or my laptop.
  6. YES, there are little devices called Bluetooth receivers (some can even be switched into transmitters), like this one https://www.amazon.com/gp/product/B01IV1H1ME/ref=ppx_yo_dt_b_search_asin_title?ie=UTF8&psc=1 you simply plug your wired headphones into this device and BANG, your headphones are now Bluetooth. I went this route when I got my laptop in late 2017 and the headphone jack was on the "wrong side" from where it needed to be for where my laptop sits while I work.
  7. that area being like that wouldn't be the cause of the temps; more likely it's another symptom of the main issue
  8. Intel Price Drops https://images.app.goo.gl/vZWGiAuhcv6XBy55A also, NOTHING will come with Ryzen 4000 that justifies upgrading from a 3000; there is barely a reason to upgrade from 1000 to 3000 unless you just need more cores, and the price to performance of 1000-series CPUs is insane, you can get an R5 1600 for $80 at Microcenter.
  9. it could be the headphone jack itself; I got one for one of my devices for like $8 on iFixit and replaced the jack in my Galaxy Tab 2 7".
  10. I once retrofitted JUST the LCD glass panel from one laptop to another; the screen sizes weren't 100% the same, so I had to disassemble the LCD/diffuser/backlight assembly and swap JUST the actual LCD panel, but it worked and got that laptop going again.
  11. it will work, but you will get better performance with faster RAM
  12. could be a loose connection in the jack (like if you've ever partially inserted a headphone plug before), there could be a left/right balance setting in the program you are using that got adjusted too far, you could have a left/right balance setting in the sound settings of the phone, or it could be a failing circuit.
  13. the 21:9 27 inch will be shorter than the 24" 16:9, but a 29" 21:9 is taller than a 27" 21:9 (same aspect ratio at a larger size means taller AND wider). Here is a website where you can punch in different sizes and aspect ratios and see a visual comparison: http://www.displaywars.com/24-inch-16x9-vs-27-inch-21x9 A 34" ultrawide is the same height as a 27" 16:9 but is about 7.75" WIDER; those are the screen sizes I suggest if you can afford them. edit: REMEMBER, screen sizes are measured DIAGONALLY (bottom left to top right or top left to bottom right), so if a screen gets wider but keeps the same diagonal dimension it MUST get shorter; just think about the triangle (see the geometry sketch after this list).
  14. EXACTLY, they could have put a regular 2080 into the test (surely they have those numbers from their Super review) since the 2070 Super IS a 2080 with disabled cores. Also (general comment unrelated to the post I am replying to), COME ON GUYS, really, REALLY???? STOP with these "performance per dollar weighted graphs", they are meaningless bullshit; trying to average out the value of a card whose performance is this sporadic is just stupid. People need to look at what they will actually do with this GPU (or any GPU) and compare it with the others, because the performance on these cards is all over the map: in one game the 5700 is beating a 2070 Super, in another the 2060 Super is SMASHING the XT. That inconsistency from game to game should scare anyone considering buying these GPUs; unless a card wins in everything you do with it, you have to weight it yourself based on how much you do each thing to decide if it's the better overall value. YOUR "weighting system" isn't the same as someone else's, and you don't know where a random user's important weight points are, so weighting averages across all programs is worthless, because how many people actually use EVERY program used to build that base statistic? (see the weighting sketch after this list) It's almost as pointless as that PCIe 4.0 bandwidth test AMD did, and as meaningless as the time Paul or Kyle (I honestly forget which one) tested a bunch of different games (with several different GPUs along the product stack) at 1080p, 1440p, and 4K, then averaged the framerates of the 3 resolutions while the higher end cards in the comparison were hitting engine framerate caps at 1080p, which resulted in lower averages than if they had just thrown out the 1080p results entirely.
  15. well, it's the 2060 Super (launching later today) vs the 5700 for NEW GPUs; check the video LTT just posted for the 5700/XT review to see how the 5700 compares to the 2060 Super. In the used market you might even find some 1080 Tis selling at that price point, depending on where you are.
  16. sometime today; I would estimate around 9 AM EST, as that is when stores begin to open in the USA and it's the typical embargo lift time when most videos get posted, although Nvidia lifted the embargo days early to get ahead of AMD this time.
  17. damn, the 5700 XT looks like Linus dropped something on the mold and AMD couldn't afford to replace it.
  18. ones with dual 8-pin power connectors, because my CableMod cables don't have any 4-pin CPU cables, just a pair of 8-pins.
  19. maybe in the future, that's the hope anyway, but right now there is only 1 game I know of that actually uses more than 4 cores. There is building for the future, and there is spending money not worth spending. All I'm saying is it's easy to test: if you see your CPU usage high and your GPU usage low while gaming, then you are being bottlenecked.
  20. not saying it will or it won't, I'm saying you have to do testing and see. I would still expect a higher framerate with a faster CPU, but for example, if you were playing at 1080p you might see your setup do 100 FPS where a newer CPU might get 140 FPS in some random game, while up at 4K you might only see a 55 vs 60 FPS kind of difference. These are all very rough numbers that depend on how much time your CPU spends pegged in the high 90% range, or your GPU spends short of 99%, while you game https://forums.geforce.com/default/topic/1081788/geforce-rtx-20-series/the-myth-of-cpu-bottlenecking-the-gpu/
  21. all of this is true, but note the OP is playing at 4K, which means the CPU is less of a bottleneck; you don't need a super high end CPU to "feed" a 2080 Ti at 4K resolution. In fact this is fairly easy to test: simply monitor CPU and GPU load while gaming. If the CPU is constantly hitting 98-100%, or the GPU never comes close to that percentage, then it's bottlenecking (a rough monitoring sketch follows this list). Even this isn't the end-all be-all test, though; a CPU that isn't maxing out can still cause stutter. Doubling the cores from 4 to 8 has little effect in most games (there are a few that can use the extra cores), so that's not really a reason to wait for the higher core counts to come. Just remember the IPC gain from Skylake to Coffee Lake really isn't much; it's effectively just the much, much higher clock speeds that would make a difference there.
  22. just remember, the only thing PCIe 4.0 will make a difference on for several more years is M.2 SSD speeds as these new 4.0 SSDs come out. They are stupid fast, like 2 GigaBYTES, not bits, per second transfer rates, but for 99.99999% of people that speed won't be noticeable in anything at all; even someone like Linus hitting his NVMe RAID server over a 40 gigabit link between the PC and the RAID server would barely be able to use those speeds. If you aren't routinely transferring large data sets to external RAID setups you won't have a use for it. Even the top end modern GPUs are barely saturating 8x PCIe 3.0, and those GPUs are plugged into 16x slots (see the bandwidth arithmetic after this list). Maybe if you were running next gen "3080 Tis" (or whatever name they choose) in SLI, then PCIe 4.0 might be worth it.
  23. you are nearly doubling the wattage of the chipset compared to X470; EVERY board I've seen, including EVERY MODEL from Asus and ASRock, has a fan. The Asus product page for X570 even specifically says EVERY X570 board has a heatsink and fan due to the thermal requirements https://edgeup.asus.com/2019/the-x570-motherboard-guide-ryzen-to-victory-with-pci-express-4-0/ (second paragraph under the "High Wattage Low Temperatures" title).
  24. the chipset requires too much power to be passively cooled on X570; if you don't want a fan, get an X470 board at that price point that is on the list of boards with a strong enough VRM to handle overclocking the best Ryzen 3000 chips.
  25. well, to upgrade to 3rd gen Ryzen you would need a new MOBO and CPU (but you would need both of those for ANY upgrade); the difference is that Ryzen responds better to faster RAM, so you might want to get faster RAM too. The good news is RAM is CHEAP AF right now; we are seeing some of the best RAM prices in literally 3 years, so now may very well be the time to grab faster RAM before it goes up again.
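
A worked version of the diagonal math from post 13. This Python sketch is my own illustration (the function and loop are mine; the monitor sizes are the ones named in the post): width and height follow from the diagonal and the aspect ratio by similar triangles.

```python
import math

def screen_dims(diagonal, ar_w, ar_h):
    """Width and height (inches) of a screen from its diagonal and aspect ratio."""
    scale = diagonal / math.hypot(ar_w, ar_h)
    return ar_w * scale, ar_h * scale

# The sizes named in the post:
for name, d, w, h in [("24in 16:9", 24, 16, 9),
                      ("27in 21:9", 27, 21, 9),
                      ("29in 21:9", 29, 21, 9),
                      ("27in 16:9", 27, 16, 9),
                      ("34in 21:9", 34, 21, 9)]:
    width, height = screen_dims(d, w, h)
    print(f"{name}: {width:5.2f} in wide x {height:5.2f} in tall")
```

Running it puts the 34" 21:9 at roughly the same height as the 27" 16:9 (about 13.4 vs 13.2 inches) but about 7.7 inches wider, matching the numbers in the post.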
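On the weighted performance-per-dollar graphs in post 14, a minimal sketch of the point being made. The card names are real, but every FPS and price number here is invented purely for illustration, not benchmark data:

```python
# INVENTED numbers, purely to illustrate the argument from the post.
fps = {
    "RX 5700 XT": {"game_a": 120, "game_b": 70},
    "RTX 2060 Super": {"game_a": 95, "game_b": 90},
}
price = {"RX 5700 XT": 399, "RTX 2060 Super": 399}

def value(card, weights):
    """Weighted average FPS per dollar for one user's game weights."""
    avg = sum(fps[card][g] * w for g, w in weights.items()) / sum(weights.values())
    return avg / price[card]

# Two users, same cards, opposite verdicts:
for weights in ({"game_a": 1.0, "game_b": 0.0},
                {"game_a": 0.0, "game_b": 1.0}):
    best = max(fps, key=lambda card: value(card, weights))
    print(f"weights {weights} -> better value: {best}")
```

Swap which game carries the weight and the "better value" card flips, which is exactly why one site-wide weighting can't stand in for your own workload.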
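Posts 19-21 keep coming back to the same test: watch CPU and GPU load while gaming. A rough sketch of that monitoring, assuming the third-party psutil package is installed and an NVIDIA card with nvidia-smi on the PATH (both assumptions; the posts name no tools, and AMD cards would need a different query):

```python
import subprocess

import psutil  # third-party: pip install psutil

def gpu_utilization():
    """GPU load in percent, read from nvidia-smi (NVIDIA cards only)."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True)
    return float(out.stdout.strip().splitlines()[0])

# Sample both loads once a second while the game runs. A CPU pinned near
# 100% while the GPU sits well under ~99% points at a CPU bottleneck.
# Caveat from the posts: the all-core average can look low while one core
# is maxed; psutil.cpu_percent(percpu=True) shows per-core numbers.
for _ in range(60):
    cpu = psutil.cpu_percent(interval=1.0)  # blocks 1s, averages all cores
    gpu = gpu_utilization()
    print(f"CPU {cpu:5.1f}%   GPU {gpu:5.1f}%")
```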
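For the PCIe claims in post 22, back-of-envelope arithmetic of my own from the published line rates (8 GT/s for PCIe 3.0, 16 GT/s for 4.0, both with 128b/130b encoding):

```python
# Per-lane throughput in GB/s: line rate (GT/s) x 128/130 encoding / 8 bits.
def lane_bw_gbps(transfer_rate_gt):
    return transfer_rate_gt * (128 / 130) / 8

for gen, rate in [("PCIe 3.0", 8), ("PCIe 4.0", 16)]:
    for lanes in (4, 8, 16):
        print(f"{gen} x{lanes}: {lane_bw_gbps(rate) * lanes:5.2f} GB/s")
```

PCIe 3.0 x8 and 4.0 x4 both land around 7.9 GB/s, which is why a GPU that barely saturates 3.0 x8 gains nothing from a 4.0 x16 slot, while a 4.0 x4 NVMe drive gets double the ceiling of a 3.0 x4 drive.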