
daimonie

Member
  • Posts

    123
  • Joined

  • Last visited

Reputation Activity

  1. Like
    daimonie reacted to suits in Madison reveals experiences working at LMG   
    Can a mod pin the Verge article to the top? It took a long time of digging through the forum to find out there were official responses to this.
     
    https://www.theverge.com/2023/8/16/23834190/linus-tech-tips-gamersnexus-madison-reeves-controversy
  2. Like
    daimonie reacted to Flinty in EK Fluid Gaming A240G   
    Cheers Jay, I’ve ended up going with the 360G, seems to be the better choice in this situation. 
  3. Funny
    daimonie reacted to TechyBen in Battlefield V with DXR on tested from Techspot(Hardware Unboxed)   
    Releases RTX software... does not do "culling" or optimisation that's been done for like... 2 decades or more... adds it as an update 1 month after release:
     
    Yeah, so exactly as I said they should have on release. XD
     
    Though I give them the benefit of the doubt on the grass and rubble/explosion "bug", but even then, I guess it was REALLY rushed to not catch that in testing:
    Tester: Hey, Dave [the programmer], are we raytracing every ant, blade of grass, leaf and aphid in the forest level?
    Dave: Yeah... any problem with that?
    Tester: Nope, no worries, our target FPS is 2, right?...
  4. Like
    daimonie reacted to Lathlaer in Battlefield V with DXR on tested from Techspot(Hardware Unboxed)   
    Well, this seems relevant to the discussion.
     
     
  5. Like
    daimonie reacted to leadeater in Battlefield V with DXR on tested from Techspot(Hardware Unboxed)   
    Square Enix are never in a hurry to do anything lol. That's what I love, and hate, about them.
  6. Like
    daimonie reacted to leadeater in Battlefield V with DXR on tested from Techspot(Hardware Unboxed)   
    What if RTX came out with Geforce 20 series being the same cost, or better yet cheaper, than Geforce 10 series. Comments would be rather different under a different lens. 80% of it is it sucks because it's expensive to own and use, the other 20% is the less than expected performance increase over last generation.
     
    You can tell what people are actually trying to criticize and it's not RTX, that's just an easy punching bag heh.
  7. Like
    daimonie reacted to Brehohn in Battlefield V with DXR on tested from Techspot(Hardware Unboxed)   
    Couldn't have said it better. I don't understand why people are so hell-bent against a card that is able to do 1080p60 ray tracing in real time. I sometimes feel part of a minority of people that goes, "Wow. This is really cool! The future is bright!"
     
    Like, what else do people want? RTX is far more powerful in terms of ray tracing performance and slightly faster than the previous generation overall. Meanwhile, we have AMD giving us nothing. 
  8. Like
    daimonie reacted to Brehohn in Battlefield V with DXR on tested from Techspot(Hardware Unboxed)   
    Everyone that complains about the fps drop by using this feature, read this.
     
    These cards are doing real time ray tracing. It is GOING TO BE INTENSIVE. This is by far the most amazing breakthrough we've had in graphics technology so far. Give it time. Also, global illumination and other factors are not yet activated in BF V DXR. 
     
    Give. It. Time.
  9. Like
    daimonie reacted to TechyBen in Battlefield V with DXR on tested from Techspot(Hardware Unboxed)   
    Yep. Lots of games do the second viewport option.
     
    No, but it means the technology is already there. It just has to be toggled on/off. So Far Cry 5 (which has migrated away from CryEngine, though it might be an in-house derivative now) can choose planar/screen-space/etc.
     
    Yes. As said, they dropped Supersampling Anti-Aliasing. They dropped some of their lighting tech (from true lighting, to additive 2D planes, to volumetric, to other shortcuts). The supersampling tech is still there, the option to inject it in a shader is still there, but the hardware still cannot do it. It's not the developers/art team stopping people, or even the technology. The maths means some tech does not scale the way some people assume it does.
     
    For example, 1080p + 2x supersampling == 4K. So no one is hiding tech to prevent people using it... if we cannot do 4K, we also cannot improve 1080p with supersampling. Some tech scales with resolution, some scales the opposite way, and some becomes less necessary (anti-aliasing). A 4K game could spend the GPU cycles wasted on anti-aliasing on something else. A 1080p game may have to spend those cycles on anti-aliasing.
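    The "1080p + 2x supersampling == 4K" point above is just pixel arithmetic; a minimal sketch (assuming "2x supersampling" means doubling the sample grid on each axis, as the post implies):

```python
# Pixel-count arithmetic behind the "1080p + 2x supersampling == 4K" point.
# Assumption: 2x supersampling doubles the sample grid on each axis, so the
# shaded-sample count matches a native 4K (2160p) frame.

def shaded_samples(width, height, ss_factor=1):
    """Samples shaded per frame with ss_factor x ss_factor supersampling."""
    return (width * ss_factor) * (height * ss_factor)

native_1080p = shaded_samples(1920, 1080)
ss2x_1080p   = shaded_samples(1920, 1080, ss_factor=2)
native_4k    = shaded_samples(3840, 2160)

# Identical shading cost: if a GPU can't do native 4K, it can't do
# 2x-supersampled 1080p either.
assert ss2x_1080p == native_4k
print(native_4k / native_1080p)  # 4.0 -- 4K shades 4x the pixels of 1080p
```

    This is why the post says the two are interchangeable costs rather than a hidden feature: the same shading budget buys either one.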
     
    I've only said there is no roadblock to the art or technology. It will be a development, marketing or hardware choice.
     
    RTX may change things. RTX is proprietary and hardware dependent. DX12/Vulkan raytracing will not be. So we will have to see which wins out, or whether the tools will allow scaling/multi-platform interpolation.
     
    Yes! So, why blame the engine or the resolution target? Who decides what technology is used in "Grow Home" or "The Order: 1886"? Who chooses whether BFV gets RTX or planar reflections? Whether it gets MSAA or supersampling?
     
    RTX is nearly as bad a sponsorship/proprietary/exclusivity deal as "Only on PS4" or "Only on PhysX". It's a really poor choice, but it's not a limit of the tech or the art department.
     
    Here is a good video comparing the quality:
     
    The tech can do some things and cannot do others. It can be scaled to some extent. Notice how BFV is really poor at knowing whether the scene has any RTX elements (water/metal) and keeps performance down because it's still trying to render the empty RTX layer. So a lot of improvement can be made... but it still won't run on a GT 1030, even if we had an infinite budget and infinite time.
     
  10. Like
    daimonie reacted to leadeater in Battlefield V with DXR on tested from Techspot(Hardware Unboxed)   
    I'm a little confused by this section; it's saying the exact same thing as my quote, and the rest of what I said, which it appears you did not read given what you just tried to tell me. Remember, I was not limiting my "4k really is not necessary" point from about 6 pages ago to consoles, but applying it as a whole. At least on consoles they more actively look into things like checkerboard rendering, because they are stuck with a truly fixed hardware limitation; PCs can and do have different GPUs of different capabilities. Not that I'm complaining that there is more choice on the PC side; plenty of games have the ability to set 3D render resolution now anyway.
     
    Also, the section below that about the lighting: I had already linked to that and talked about it, but I guess you didn't read that either. Nice to see you've made your conclusions about what I know about game development on incomplete information that you actively ignored.
     
    I already do. Need I remind you of the programming background I do have, while limited, and the fact that I have personally made a few 2D and 3D games (3D assets provided). And I have followed game development with interest; I just never felt it was at all necessary to jump into the woods and get lost in pointless discussion over an extremely basic point: "Don't waste time trying to render 4k, we don't have the hardware for it yet." Because I thought I made it really obvious multiple times that I had no interest in that side of the discussion.
     
    But hey, at least you finally understood the discussion I was having; if only it didn't take 6 pages to get there.
     
    Edit: Btw, if you really must know so you can be a little less condescending: I am a Systems Engineer and a hardware specialist at an academic institution. I support researchers running computational experiments on large clusters of both CPUs and GPUs, and I make the hardware recommendations to them about what to purchase, as well as configuring the servers. So I might know a bit more about GPU hardware than you, or than you seem to think. But this is even less relevant and further into the woods of pointlessness. 
  11. Like
    daimonie reacted to leadeater in Battlefield V with DXR on tested from Techspot(Hardware Unboxed)   
    Thing is I never said it wasn't these.
     
    And this has been what I have been saying all along, one for another. I prefer the other, I'd like to see the other as that's just my personal preference to see.
     
    I actually gave more than enough information to illustrate my point (I would argue too much, since it wasn't actually required and clouded the point). I posted links to some information about the different types of choices that can be made and how they affect the visual look of the game as well as CPU and GPU resources. For example, and not the only thing, the choice to use real-time lighting, baked GI, or pre-computed GI.
     
    The choices that will be made will likely be different, not all of them but at least some, when targeting 4k@30 vs 1080p@30 on the same hardware.
     
    I was never blaming anything, just wishing for a different goal/focus. I don't want all games or developers to do it how I'd like to see; I don't want to stop developers from picking a certain art style that simply will not make sense with what I'd like to see. It's not an all-encompassing wish, and I don't want to stop 4k games from being developed.
     
    My general wish was only that I'd prefer to see a game where the absolute hardware limits were pushed at 1080p or 1440p; that would mean no user would want to play it at 4k, even if they could, until better hardware came out. If I see a game, a really good one, hitting say 100 FPS on a 2080 Ti at 1440p, then I'd say the hardware is potentially capable of doing much more. I'm not implying that the developers of that game didn't do everything they possibly could at the time.
     
    It's about the visual progression of games: will that progression happen as quickly for 1080p when 4k is the target? Maybe or maybe not; I know it's less likely while 4k is the target/primary consideration.
  12. Like
    daimonie reacted to CarlBar in Battlefield V with DXR on tested from Techspot(Hardware Unboxed)   
    To be fair, I totally get the going off on tangents thing when I can see how they relate to the discussion in question. I do that all the time. I'm currently being tested for Asperger's, and that's one of the common things about autism spectrum disorders. The issue here is I'm not really sure how it relates to what we're discussing in anything but the very loosest sense.
  13. Like
    daimonie reacted to leadeater in Battlefield V with DXR on tested from Techspot(Hardware Unboxed)   
    Well it is a complicated issue so fair enough for someone to try and point that out, though my fundamental point was graphical detail and resolution are not one and the same.
     
    It also does no good for my gaming experience if, say, the main character on screen is highly detailed and then hops into or onto a vehicle that is a simplistic model, and thus both a different visual quality and unrealistic looking; I find this very jarring. This applies to everything on screen. Increasing resolution won't fix that. Boiling it down to "detailed assets exist" is not at all the issue.
     
    Lighting is also very important to our visual perception but is, or can be, computationally expensive, and I don't mean ray tracing. There are many different lighting techniques, all with their own pros and cons, so if you use the wrong one, in the sense of needlessly picking a complex one when a simple one can achieve nearly the same result, or add an extra light source that isn't absolutely required, then you are greatly increasing the GPU or CPU resource demand.
    https://80.lv/articles/learning-lighting-for-video-games/
    https://iq.intel.com/bringing-games-life-with-light/
    https://unity3d.com/learn/tutorials/topics/graphics/choosing-lighting-technique
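    The per-light cost difference between techniques can be sketched with a toy cost model. The numbers and function names below are illustrative assumptions, not engine measurements; the point is only that cost scales very differently with light count depending on the technique chosen:

```python
# Toy cost model for "pick the right lighting technique": relative shading
# work per frame. All constants are illustrative assumptions, not measured
# engine costs.

def forward_cost(shaded_fragments, lights):
    # Classic forward shading: every shaded fragment evaluates every light,
    # so cost grows linearly with light count (and with overdraw).
    return shaded_fragments * lights

def deferred_cost(screen_pixels, lights, gbuffer_write=3):
    # Deferred shading: pay a fixed G-buffer pass, then light each screen
    # pixel once per light; overdraw no longer multiplies lighting cost.
    return screen_pixels * gbuffer_write + screen_pixels * lights

def baked_cost(screen_pixels, lightmap_lookup=1):
    # Baked / precomputed GI: runtime cost is just a lightmap lookup,
    # no matter how many lights were baked in.
    return screen_pixels * lightmap_lookup

pixels_1080p = 1920 * 1080
for lights in (1, 8, 64):
    print(lights,
          forward_cost(pixels_1080p * 2, lights),  # assume ~2x overdraw
          deferred_cost(pixels_1080p, lights),
          baked_cost(pixels_1080p))
```

    Under this sketch, adding a 64th light is nearly free when baked, moderate when deferred, and ruinous when forward-shaded with overdraw, which is the "needlessly picking a complex one" trap the post describes.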
     
    Or we could look at making hair look and act real to life, another extremely complicated effect to pull off convincingly.
     
    So for me, I would rather that what are currently considered non-critical game assets or effects get more attention: make that tree a more complicated model, make it look more real, "waste" some resources on that sort of thing, because for me it is not a waste. If doing so would mean 4k would now be impossible on current GPUs, so be it.
     
    4k resolution output is not actually a well-established thing; the hardware hasn't been around that long, nor at a generally affordable price point. However, there is a very big push towards it, mostly from marketing rather than any strong technical need. If 4k monitors and TVs did not exist, no one would be talking about it, and its absence wouldn't prevent graphical improvements.
     
    Games have been getting more visually detailed over a long period of time at 1080p. The hardware has been improving to allow it, the tools have been improving to allow it, and budgets in the industry have been increasing to allow it. What I personally don't want is for this progression to be slowed down to meet the demands of 4k when everyone is saying that 4k now, and even potentially in the next generation, is unrealistic. We are now developing techniques to get around the limitations to enable 4k, or accepting certain sacrifices, and this is, in my opinion, not the area I would like to see development resources go into.
  14. Like
    daimonie reacted to leadeater in Battlefield V with DXR on tested from Techspot(Hardware Unboxed)   
    This is the closest you've gotten to what I was actually talking about, which is good to see. Yep, it's always been in flux. I'm also not asking for true photorealism. I'm just saying that you can put assets, effects and lighting into a game such that the end result would be too slow, performance-wise, to run at 4k on any current GPU, but would be just barely possible at 1080p. I'm not saying photorealistic, I'm saying better than we have now. So if you actually wanted your game to be possible to run at 4k on current hardware, the total end result couldn't be as demanding.
     
    Here, in the above hypothetical, the decision to aim for 4k would have impacted overall design choices, if the goal was the best possible result.
     
    I don't think you're getting my point about making the actual end-result scene harder to render, to the point that no current hardware can do it at 4k at what anyone would call playable frame rates.
  15. Like
    daimonie reacted to leadeater in Battlefield V with DXR on tested from Techspot(Hardware Unboxed)   
    You are giving examples that are inconsequential and not relevant to the actual game rendering we will see and get, in an attempt to show that high quality assets exist. So what? That's a nice pre-render; why are you showing it? Are you thinking that I'm under the belief that we could get a game to that final product quality standard right now by limiting ourselves to 1080p on current hardware?
     
    You keep asking if it's possible to render it, that's not the right question, that doesn't actually apply to what I'm talking about.
     
    Here is what the final, actual game will most likely look like, not that pre-render.

     
    Boy it looks nothing like the pre-render, I'm totally shocked and did not expect that.
     
    You say you know the hardware limits, but then for some reason think that targeting 4k, in game, is not going to impact what is possible to achieve in the final look of the game.
  16. Agree
    daimonie reacted to Misanthrope in Battlefield V with DXR on tested from Techspot(Hardware Unboxed)   
    I was watching Jay's video on this, and what occurred to me is that this should have been on something like a big RPG title, where you actually encourage exploration and could have time to slow down and appreciate the new fanciness... also where playing at 30 FPS wouldn't fucking matter nearly as much.
     
    I guess Nvidia was shit out of luck, since there seem to be no major RPG releases lined up. There's Fallout 76, but that's already so broken I don't think slowing it down to 10 FPS was feasible anyway.
  17. Like
    daimonie reacted to Pixel5 in When did Watercooling PC components Become Mainstream?   
    Data centers are a completely different use case and usually don't water-cool the components themselves, but rather use water to cool entire rooms or containers.
     
    In the rare cases where they actually water-cool components, it's because it requires less space in the server rack, and rack space is more expensive than anything you usually put inside that space.
  18. Like
    daimonie reacted to leadeater in Battlefield V with DXR on tested from Techspot(Hardware Unboxed)   
    Yes, I know that; the frame rate target will affect the image quality. They aren't mutually exclusive. Raise one and the other lowers; they are linked. 
     
    No, I used that to show that using that asset in a game would result in a worse-looking game. The higher quality asset would result in a better-looking game. Where it actually matters to this discussion is if the low quality asset has to be used to achieve 4k on current hardware, but the high quality one could be used if the target was 1080p instead. There is no guarantee that the higher quality asset will be available in the game if the development goal from day 1 was 4k.
  19. Like
    daimonie reacted to leadeater in Battlefield V with DXR on tested from Techspot(Hardware Unboxed)   
    How many of those 3GB-4GB cards could have rendered the game at 4k with acceptable frame rates? None of them. The higher-res pack could have been used from the start, because at 1080p the VRAM usage would have allowed it, just not at 4k on cards not capable of it in the first place.
     
    And there is a lot in that article that supports both sides of this.
     
    What is clear, though, is that 4k is a target they are all trying to meet, they even say so, so from start to end of the development cycle 4k is a forefront requirement. There is no way this is not going to affect choices and drive technology in a certain direction.
  20. Like
    daimonie reacted to leadeater in Battlefield V with DXR on tested from Techspot(Hardware Unboxed)   
    I never said that, ever. I said the assets that make it into the game are not the highest possible; they may be limited because there is no point putting in something so highly detailed that it's not possible to render on current hardware. But what if it is possible, only at 1080p? Currently I do not see any games going with this choice, a choice that would mean no card in existence at the time could render the game at 4k.
     
    I'll just pick this one for example's sake. At 1080p VRAM usage is around 3GB, and at 4k it's around 4.2GB. So yes, VRAM was an issue and would be a reason not to ship the game with the HD textures now available; darn all those 4GB GPU users who wouldn't be able to run the game at 4k regardless. The game could have shipped with the HD textures. You also know this is a 2018 game, right? How long have 8GB graphics cards existed?
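    Part of why VRAM usage climbs with output resolution is that every screen-sized buffer quadruples going from 1080p to 4k. A back-of-envelope sketch, using a hypothetical deferred-renderer buffer layout (the bytes-per-pixel figures are assumptions, not the actual game's):

```python
# Back-of-envelope sketch of the 1080p -> 4k render-target VRAM jump.
# The buffer layout is a hypothetical deferred-renderer setup; the point
# is that screen-sized buffers scale with pixel count, i.e. 4x at 4k.

MB = 1024 * 1024

def render_target_mb(width, height, targets):
    """targets: bytes-per-pixel of each screen-sized buffer."""
    return sum(width * height * bpp for bpp in targets) / MB

# Assumed layout: HDR colour (8B), depth/stencil (4B), four G-buffer
# planes (4B each), two post-processing buffers (8B each).
layout = [8, 4, 4, 4, 4, 4, 8, 8]

mb_1080p = render_target_mb(1920, 1080, layout)
mb_4k    = render_target_mb(3840, 2160, layout)
print(round(mb_1080p), round(mb_4k), round(mb_4k / mb_1080p, 1))  # 87 348 4.0
```

    Texture memory behaves differently (mip streaming can keep it closer to constant), which is why the measured 1080p-to-4k gap in a real game is ~1.2GB rather than a full 4x of total usage.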
     
    Seems to me you have just shown an active choice, this year, to limit game graphics because of a developer requirement/target for a certain type of hardware and resolution setting.
     
    Let's go with an older 2014 one then: 1080p ~3.4GB and 4k ~4GB (no AA). For this game I would agree that the hardware at the time would not have allowed anything better; the ultra texture pack could not have been used in the base game.
     
     
    But this example in no way represents what I am talking about, nor what I'm wishing for from games. I am not mistakenly blaming resolution; in the examples I am using, the resolution would be to blame, because I am talking about creating a game so demanding that no hardware on the market could render it at 4k but could at 1080p. Who's doing that? No one. Why? Because right now 4k is a demand from users that developers wish to meet.
  21. Like
    daimonie reacted to asus killer in Battlefield V with DXR on tested from Techspot(Hardware Unboxed)   
    @leadeater and @TechyBen
     
    read this
    https://www.gamesradar.com/4k-gaming-the-hardware-isnt-really-ready-so-how-are-developers-making-it-happen-and-what-are-the-hidden-costs/
  22. Like
    daimonie reacted to leadeater in Battlefield V with DXR on tested from Techspot(Hardware Unboxed)   
    It may not be possible to use them if there is a requirement for the game to be playable at 4k. Why you ignore the critical part of this, I have no idea. I'm not talking about you, the customer, sitting in front of the computer selecting 4k resolution. If that is the design requirement of the game, before you can ever play it, then you aren't going to make that goal impossible to achieve, or you have just failed one of the project scope requirements.
     
    And what if it were to do the same to two 2080 Tis? Because you made the game so graphically demanding, and actually impressive looking compared to anything before, yet it worked perfectly fine on one 2080 Ti at 1080p?
     
    You keep bringing this back to user choice, the user selecting an in-game option. I'm talking about developer choices made before you ever see, know, or hear about the game: project scopes, choices that are made to achieve the final product that you eventually see.
  23. Like
    daimonie reacted to leadeater in Battlefield V with DXR on tested from Techspot(Hardware Unboxed)   
    Because the ability to pick a resolution has nothing to do with my point. Allowing the ability to select an output resolution is not the same as setting a 4k target during game development that has an overarching impact on the choices you make while developing the game, from assets to lighting models to shadows, etc. All the way through development that 4k choice has an impact; this has nothing to do with your ability to select a render output resolution.
     
    The choices you get in the game graphics menu for resolution is not what I'm talking about.
     
    And if your goal is photorealism, then as I have said, and you just said, rendering at 4k does nothing to achieve it. Aiming for 4k could prevent you from achieving it, though.
     
    Bringing game artistic choice into this really is not that relevant; the only thing it shows is the impact of developer choice.
     
    The game development industry is not limited by the choices of the film industry? How does this apply?
     
    There are very many 1440p monitors, there are also many 2560x1600 monitors. There are others available too.
     
    But we are talking about games
  24. Like
    daimonie reacted to leadeater in Battlefield V with DXR on tested from Techspot(Hardware Unboxed)   
    I don't bother with it much; rarely do I turn it up. It's another one of those things with very quickly diminishing returns and rapidly increasing computational demand.
     
    I honestly don't think you are aware of the point I was trying to make. If our only choice is to target 4k at the expense of making real, more noticeable differences, then we should not do it. Trying to render at 4k will not inherently give a game a big graphical improvement, and it is very demanding.
     
    I have seen many games rendered at 4k on very good monitors, and it's not a big jump in game graphics. We can stand around all day arguing back and forth about how big a difference it makes, but games have fundamentally gotten more graphically impressive over time independent of the render resolution.
     
    Targeting 4k, now, is a waste of resources and limits what we can do because we don't have unlimited GPU resources.
  25. Like
    daimonie reacted to CarlBar in Battlefield V with DXR on tested from Techspot(Hardware Unboxed)   
    Alright, quick cliff-notes version of the 3 raytracing ray types and how they work. 
     
    All raytracing, even hybrid, starts by shooting one or more rays per pixel from the camera position through the pixels on the virtual screen (the contents of this virtual screen are what you actually see, by the way). These rays keep going until they hit something in the game's 3D render of the scene. We'll call these rays master rays; there's probably some real fancy name for them, but "master rays" will suffice for our purposes. What happens next depends on what is being handled by raytracing. In a fully raytraced scene, several different types of ray then emanate from where each ray touches the scene. Again, for simplicity we'll put all of these under the general classification of "secondary rays". For the purposes of what is implemented in hybrid raytracing we only really care about 3, as those are the only types implemented.
     
    The first of these secondary rays is properly called the reflection ray. This shoots off at an angle set by the angle between the surface and the master ray, just like light striking a mirror in real life. The reflection ray then bounces around picking up colour info, which is combined with the colour info from the master ray to produce the final pixel colour value.
     
    The second is the shadow ray. I haven't seen a lot on this, so I'm not sure how its direction is decided, but it detects the shadowing effect of any surfaces it passes through on its way out. Again, this is combined with master ray data to decide the final pixel colour.
     
    The third type are known as light rays. Whilst hybrid ray tracing puts a cap on the number of light sources that can be used, this fires a ray off at every light source in the scene until it hits the ray cap, again picking up lighting factors as it goes, to combine with the master ray for a final result.
     
    The thing is, all 3 of these effects can use the same master ray; in fact they have to if combined, and the results of all 3 are put together to produce the final pixel values.
     
    What this also means, however, is that the master rays are a one-time overhead; adding extra RTX effects doesn't necessarily double the workload. In fact, because of the amount of bouncing they can do (I believe BFV limits it to 4 bounces total), reflection is among the most intensive to do, unless you have a scene with an enormous number of lights; but I think the software currently caps it at the 3 closest light sources. As a result, implementing additional effects isn't a simple scalar.
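    The ray budget described above can be sketched as a per-pixel count: one shared master (primary) ray, with secondary rays layered on top. The bounce and light caps mirror the figures quoted in the post (4 bounces, 3 nearest lights), but how an engine actually configures them is an assumption here:

```python
# Sketch of the per-pixel ray budget for hybrid raytracing as described
# above: one shared "master" ray, plus optional secondary ray types.
# The caps (4 bounces, 3 lights) mirror the post's figures; real engines
# configure these differently per scene.

def rays_per_pixel(reflections=False, shadows=False, lights=False,
                   max_bounces=4, max_lights=3):
    total = 1  # the master ray: a one-time overhead, always paid
    if reflections:
        total += max_bounces  # reflection ray may bounce several times
    if shadows:
        total += 1            # one shadow ray for the occlusion test
    if lights:
        total += max_lights   # one ray per light source, up to the cap
    return total

base = rays_per_pixel()                                         # 1
refl = rays_per_pixel(reflections=True)                         # 5
full = rays_per_pixel(reflections=True, shadows=True, lights=True)  # 9
# Adding shadow + light rays on top of reflections is less than a
# doubling (9 < 2 * 5), because the master ray is shared by all three.
print(base, refl, full)
```

    This is the "not a simple scalar" point: the fixed master-ray cost is amortised across every effect you enable, so each additional effect is cheaper than the first.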
     
     
    The fact that DICE calls it their denoising algorithm suggests they aren't using the tensor-core denoising, and from the sounds of it there are some serious hangups happening on the raster side in terms of getting the special second render to the RT cores in a timely manner, so the GPU is spending a lot of time sitting on its hands between frames waiting for things to happen. That fits with the lower power consumption too: if most of the GPU is idle for part of every second and then furiously busy the rest, you're going to see a lot of deactivated parts producing low power draw, but a high average clock speed.