
bowrilla

Member
  • Posts

    1,716
  • Joined

  • Last visited

Awards

This user doesn't have any awards

4 Followers

About bowrilla

  • Birthday Mar 22, 1986

Contact Methods

  • Steam
    sonofnyx

Profile Information

  • Gender
    Male
  • Location
    Berlin
  • Biography
    Professional Indiana Jones – just without occult Nazis and murderous cults … and unfortunately without whips as well …
  • Occupation
    Jack of all trades

System

  • CPU
    Ryzen 7 2700
  • Motherboard
    ASUS ROG STRIX B450I
  • RAM
    32GB Corsair Vengeance RGB DDR4-3200
  • GPU
    Zotac 1080Ti Mini
  • Case
    Phanteks Enthoo Evolv Shift X
  • Storage
    1x Adata M.2 500GB SSD, 2x 500GB SATA SSD
  • PSU
    Corsair SF600
  • Display(s)
    Samsung 46" LED TV
  • Cooling
    Custom Loop, 280mm Black Ice Nemesis GTX, 2x Noctua NF-A14 Industrial PWM, 2x BeQuiet Silent Wings 3 140mm PWM, 1x Cryorig XT140 PWM
  • Keyboard
    Das Keyboard 4C Professional
  • Mouse
    Corsair Harpoon RGB
  • Sound
    AKG Y50BT wireless
  • Operating System
    Win10 Pro, Ubuntu 18.10

Recent Profile Visitors

2,350 profile views
  1. Since there have been leaks and hints since July that the 4090 was in production to build up stock, my bet is that both 4080 versions have been in production for at least a month now. That means there's quite some stock of 4080 12GB cards in various warehouses. While you could use cheap, underpaid and exploited workers in China to repackage everything, switch out all the promo gear inside the boxes and put stickers over the silk screens, the cheapest bet would be to sell these (as someone else mentioned) as an Asia-only limited run (maybe even specifically for gaming/internet cafés - didn't they do that at least once before?) or sell them to system integrators. The work and money that would go into repackaging everything, just to be forced to sell the cards for less, doesn't add up (because, let's face it, the chips are the expensive part, and I don't think NVIDIA will reimburse AIBs if they suddenly decide to mark the chips as 4070). The question is: what happens with the rest of the stock of 4080 12GB chips? That probably depends on whether the actual 4070 uses the same chip or not. If it does, congratulations: more 4070 chips. But then NVIDIA would probably need to lower the price of those chips, and I bet AIBs have already paid for several supply runs - so NVIDIA would need to partially reimburse AIBs for those deliveries. If it's not the same chip, it will probably become a 4070 Ti or 4070 Super down the line (which raises the question: aren't those boards already in design?) - or they just cancel the whole chip production and find a solution (as suggested before) for a special sale. Which leaves the question: was the 4080 12GB always intended as a 4080, or was it renamed? Many folks speculated that it started out as a 4070. What does the chip lineup look like? How much would need to change? How many chips have already been made?
  2. It was a matter of convenience: one app to handle encrypted messages and also SMS. I've set it as the default app for messages, so I don't need the regular messages app on my screen for 2FA codes and those oddballs (including iMessage users).
  3. Fun for the benchmark run, pretty pointless for everyday use.
  4. Well ... most reference card models are easy to find blocks for. EK, Watercool and Alphacool usually have lots of options for the most widespread models. Custom-PCB cards are tricky though.
  5. A German media outlet has a performance index derived from all their benchmarks, for both native resolution and raytracing (and you can also filter by resolution if you want). It's not a full list (the lowest cards are the 1070 and Vega 56).
  6. DLSS renders at a lower resolution and upscales, so rendering at a resolution above native would not give you a performance benefit.
     List the GPUs in order of computational power and put the number of users on the y-axis. You'll probably end up with a bell curve, and you should make sure that the players around the peak of that bell can run your game. Steam has the raw data for you (see the first sketch below the post list).
     That's a long way off. The RTX 20 series was mostly useless for raytracing, with the exception of maybe the 2070 Super and above; the 2070 was already a stretch, and the 2060's RTX support was pointless. Compared to the 30 series and now the 40 series, everything below a 2080 or 2080 Ti has pointless RTX support. Not to mention all those users with AMD GPUs. So it needs to be standardized raytracing, not just NVIDIA raytracing. And then there's the issue with consoles. Yeah, not going to happen any time soon.
  7. It doesn't. Rendering at a higher resolution than your monitor can display has basically no positive effect, and there are better ways to avoid jagged lines.
     My thought exactly. If a game can't run at least acceptably on mainstream hardware, it's not going to be very successful.
     A link to what? That was purely hypothetical. I mean, we do expect the 4080s to at least match the 3090, right? Probably the 12GB matching it and the 16GB exceeding it a bit. If these kinds of performance jumps keep up, game devs will face some issues in terms of scalability. Many visual features will probably need to be optional, with some form of alternative to fall back to. That's a lot of work.
  8. The coolers are just bonkers oversized. Apparently all testers reported temps never exceeding 70°C under load, with the fans still staying very quiet. Another German outlet published some sound measurements: under load the 4090 FE stayed at around 3 sone, while 100% fan speed was >10 sone. That should tell you something about how oversized the coolers are. Sure, those are just rough values and we don't know their ambient air temperatures, but still. That's extremely quiet.
  9. Jay mentioned something about 10 minutes of tweaking settings.
  10. The major question is: how well do game engines scale? Enough to support higher-end 10 series or 20 series cards, at least at mild settings? A 40-80% performance jump is massive; after 2 or 3 of those jumps, the range even across just 3 generations becomes huge. And keep in mind: the 4090 is already CPU-limited at 1440p. A 5070, or at least a 5080, will be on the same level. Do CPUs gain performance at the same rate as GPUs? I don't think so. So 4k will become more and more normal, and 8k will become more interesting for high-end gaming, I assume.
     Yeah, I'm thinking 8k as well. But that would also require massive displays to make any sense. I mean, even 43" displays at 4k have >100ppi (see the second sketch below the post list), and that size is not really ideal on a desk anymore. Interesting times.
  11. While I do understand your point here, I don't fully agree, but that's ok. What I do think, though: I wonder how much sense RTX 50 will make in 2-3 years' time, when the 4090 is already CPU-limited at 1440p and even high-end games with a massive resource hunger run easily at 4k ultra. The "problem" with these kinds of performance leaps is that developers can't really aim for that hardware unless they intentionally leave out the majority of gamers on even slightly older hardware. It's hard to top all this, and it makes me wonder what's coming next. Personally, I expect the performance level of the 4090 to stay in the top 10 for at least 3, if not 4, generations of GPUs. This generation might actually be what the GTX 10 series was many years ago, with the 4090 being the Titan equivalent.
  12. Also keep in mind that this is a synthetic benchmark, so it should be basically the worst case in terms of performance loss. What we're seeing is a product that is basically overclocked by default: a 30% power increase for just 10% more performance is overclocking territory. It would still have been an impressive card at a 60-70% power target and would still have had a little headroom for overclocking. They could also have just gone with 3x 8-pin (like some AIBs did) and had a ton of headroom.
  13. Actually, der8auer did something like that: he checked performance vs. power target in 10% steps. Turns out, at a 60% power target it still beats everything else by a mile while consuming a third less power (see the third sketch below the post list). Imho that chart indicates that NVIDIA initially planned for the much less efficient Samsung node and designed their coolers and power delivery systems for that. After the switch to TSMC they had a far more efficient process, massively oversized cooling solutions and an unnecessary power plug. At a 60% power target you'd decrease performance by 10% (which is irrelevant considering that at 100% the card is 40-80% faster than anything else on the market) but save a third of the power. That kind of increase in power consumption for relatively little extra performance is what you'd usually try to squeeze out with overclocking. The 4090 could easily still be the fastest GPU by a mile with just 2x 8-pin PCIe plugs and a little bit of juice from the slot. In theory that cap would sit at 375W (comparing with the chart, that's roughly the 80% power target). Edit: This is der8auer's English video:
  14. The GTX 1080 FE MSRP (UVP) in Germany/Europe was 789€, custom designs 665€. That's ~910€ for the FE model in today's money and ~767€ for custom models - and we all know that manufacturers tend to go beyond that rather than below it. P.S.: Also remember that European prices depend on exchange rates and that US MSRPs do not include VAT. The EUR-USD exchange rate used to be well above 1.10USD per Euro, at times even above 1.20USD per Euro. Currently it's 0.969USD per Euro. That's ~17% more just from the exchange rate (see the last sketch below the post list).
  15. Depends. Both 4080s (or more like the 4080 and the 4070) are indeed pricier, though not by as much as people might think. Accounting for inflation, the 2080 would cost ~940USD in today's money. Then again, the 2080 was already more expensive than every xx80 card before it. I can only assume NVIDIA tried to push the 4080 16GB (aka the actual 4080) by also naming what probably started out as a 4070 a 4080, and pricing that card close to what an xx80 card should cost: the 1080 would be ~860USD today, and even a 780 would be ~825USD in today's money. The 4090, however, is not really more expensive than its predecessors - it's right in the range after adjusting for inflation:
     • Titan X (Pascal): MSRP 1200USD in 2016 -> 1480USD in 2022
     • Titan Xp: MSRP 1200USD in 2017 -> 1450USD in 2022
     • Titan RTX (20 series): MSRP 2499USD in 2018 -> 2950USD in 2022
     • 3090: MSRP 1499USD in 2020 -> 1715USD in 2022
     • 3090Ti: MSRP 1999USD in 2022
     • Even the Kepler and Maxwell Titans would be 1250-1300USD in 2022
     And this only accounts for the average inflation of each year, not for price increases due to low capacity at TSMC and other chip foundries. Also, with exploding fuel costs, logistics is getting more expensive. For anyone interested, I've made a sheet comparing prices after inflation: https://docs.google.com/spreadsheets/d/1PDQqdNWZKmrA8aAY26DwPTc6hcQmK9zXZ5dUJFZCRmQ/edit?usp=sharing Source for inflation figures: https://www.usinflationcalculator.com/
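
A minimal sketch of the "bell curve" idea from post 6. The GPU models, performance scores and user shares below are made-up placeholders; the real numbers would come from the Steam Hardware Survey.

    # Find the performance tier most surveyed players actually own.
    # All scores and shares are hypothetical placeholders; real data
    # would come from the Steam Hardware Survey.
    from collections import Counter

    survey = [                      # (model, perf score, % of users)
        ("GTX 1060", 100, 8.0),
        ("GTX 1650", 110, 6.5),
        ("RTX 2060", 160, 5.0),
        ("RTX 3060", 200, 4.5),
        ("RTX 3070", 260, 3.0),
        ("RTX 3080", 320, 1.5),
        ("RTX 4090", 600, 0.2),
    ]

    tiers = Counter()               # bucket users into 100-point tiers
    for model, perf, share in survey:
        tiers[perf // 100 * 100] += share

    peak_tier, peak_share = max(tiers.items(), key=lambda kv: kv[1])
    print(f"Peak of the bell: tier {peak_tier}-{peak_tier + 99} "
          f"({peak_share:.1f}% of users)")

Whatever tier wins that max() is the hardware a dev would target so the bulk of players can run the game.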
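The >100ppi figure from post 10 checks out; a quick sketch of the arithmetic:

    # Pixel density of a 43-inch 3840x2160 display.
    import math

    width_px, height_px, diagonal_in = 3840, 2160, 43
    ppi = math.hypot(width_px, height_px) / diagonal_in
    print(f"{ppi:.1f} ppi")  # ~102.5 ppi, i.e. just over 100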
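A rough sketch of the power-target math from posts 12 and 13. It assumes the card always draws its full cap (a simplification - measured draw varies per workload), uses the 4090's 450W reference board power and der8auer's roughly 10% performance loss at a 60% target; the 80% row is my own interpolation, not a measured value.

    # Relative perf-per-watt at reduced power targets.
    BASE_POWER_W = 450          # 4090 reference board power limit
    scenarios = [               # (power target, relative performance)
        (1.0, 1.00),            # stock
        (0.8, 0.97),            # assumed; ~360W, near the 375W that
                                # 2x 8-pin + slot could supply
        (0.6, 0.90),            # der8auer: ~10% loss at 60% target
    ]

    for target, rel_perf in scenarios:
        eff = rel_perf / target # perf/W relative to stock
        print(f"{target:.0%} target: {BASE_POWER_W * target:.0f}W, "
              f"{rel_perf:.0%} perf, {eff:.2f}x perf/W")

With these numbers the 60% target comes out at roughly 1.5x the stock perf-per-watt, which is the "overclocked by default" point in a nutshell.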
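And the currency/inflation arithmetic behind posts 14 and 15, sketched out. The MSRP, exchange rates, VAT rate and inflation factor are approximations lifted from those posts (the factor comes from the 1200USD -> 1480USD Titan X conversion), not authoritative figures.

    # How exchange rates and inflation move a US MSRP (which excludes VAT).
    USD_MSRP = 1199                   # hypothetical US list price
    VAT = 1.19                        # 19% German VAT on EU retail prices

    def eur_price(usd, usd_per_eur, vat=VAT):
        """Naive EU street price: convert to EUR, then add VAT."""
        return usd / usd_per_eur * vat

    then = eur_price(USD_MSRP, 1.13)  # rate "well above 1.10"
    now = eur_price(USD_MSRP, 0.969)  # rate quoted in post 14
    print(f"{then:.0f} EUR then vs. {now:.0f} EUR now "
          f"(+{now / then - 1:.0%} from the exchange rate alone)")

    # Inflation: Titan X (Pascal), 1200 USD in 2016, per post 15.
    factor_2016_2022 = 1480 / 1200    # cumulative US inflation factor
    print(f"1200 USD (2016) ~= {1200 * factor_2016_2022:.0f} USD (2022)")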