
Zat0ichi
Member · 146 posts

Everything posted by Zat0ichi

  1. So Win11 might not be awful! I doubt it will be a flawless release. I have numerous little custom scripts running for quality-of-life improvements and I'm sure a new OS is going to mess them up. With that said, it looks like touch-screen daily drivers could be a thing. Weirdly, though, it won't be a graphics-tablet-grade interface, nor will it be a reason to repurchase a gaming monitor: just a high-resolution touch peripheral. I have a laptop for work that is touch screen, but it drives another 2 monitors with separate KBM so it is functionally useless. I read a snippet about Android integration. I'm up for integrating touch, and I can see people asking for touch-overlay add-ons. But what about dedicated products for current single-machine gamers? Are we looking at large, low-tier, high-resolution graphics tablets? This would almost sit under the KBM... an interesting intermediate product evolution. Sorry, bit of a brain fart, but it's out now.
  2. I got lucky and snagged a 3070 at MSRP a while back. Yay RTX and framerates! (Well, RTX isn't all that, but I have it now.) As a home worker and parent I can only game so much, so I looked into mining on a single card 20 hours a day, and as a complete n00b I needed a super simple setup. It's not much, but it's honest work. Tweaking things, I got the power down to 50% and underclocked the GPU, but the VRAM is clocked +1000. That clock is not gaming-stable, but the mine loves it (artifacting, not crashing). Finally, a question: if I'm not pushing huge voltages through the card and temps are low, am I damaging the VRAM?
  3. @Falcon1986 Appreciate you taking the time to answer and help. 1. BT Home Hub 5, FTTP. 2. Standard WEP on 5GHz. 3. Not very large; the problem can persist when standing next to the router. 4. There's a very strong WiFi repeater next door that in one spot is closer than my own router, but it's on a different channel, and this issue is device-specific, not location-specific.
  4. Content hanging. Sometimes browsers just stop responding and apps fail to connect. The one thing that suggests it could be the router is streaming locally from a server: that sometimes gets glitchy and I have to restart the streaming app. I often manage to run a speed test on my ISP while something is going wrong, and my connection is rock solid on spec, so it's not that. It's lots of minor issues across all devices. I can live with the occasional hiccup; billions of transistors, trillions of bits of information, some glitches are inevitable. But these minor issues seem to be getting more numerous. I'm streaming remux 4K content regularly over WiFi and there are a lot of devices connecting. I don't know much about networking, but I'm pretty sure I'm working the router hard.
  5. Yeah. Just listing the number of devices connected made me realise how many avenues I need to search. It's an intermittent fault, and as much as I like self-helping, this is a whole load of not-fun. First-world problems in a world on fire.
  6. I'm looking for some logging software to see if my router's WiFi is failing. 4 tablets, 3 phones, 2 laptops, a range extender, a TV and an Nvidia Shield; oh, and my PC is hard-wired. A few devices have issues maintaining a connection. I use a work VPN and I have an ISP-provided router (UK). The reviews of the latest BT-provided router are not bad, but it's not infallible, I'm sure. My work VPN is stretched; on a good day I'll get 30Mbit (domestic connection is FTTP 150Mbit). My phone is a Pixel 3a, so a bit old, but it also has issues, which makes me think my router might be the problem. How can I get an overview of that lot to see whether it's the router or the devices?
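For lack of a ready-made tool, a quick sketch of the kind of logging that can separate "router down" from "device down": probe the router (LAN) and a public host (WAN) from each flaky device and compare the logs. The router address below is a placeholder, not your actual Home Hub IP, and the 2-second timeout is an arbitrary choice.

```python
# Minimal connectivity logger sketch: distinguish "router (LAN) unreachable"
# from "internet (WAN) unreachable". ROUTER_ADDR is a made-up example --
# point it at your own router. Run on each flaky device; compare logs.
import socket
from datetime import datetime

ROUTER_ADDR = ("192.168.1.254", 80)   # hypothetical router web interface
WAN_ADDR = ("8.8.8.8", 53)            # a public DNS server as a WAN probe

def check_host(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def probe_once():
    """One probe cycle: print a timestamped LAN/WAN status line."""
    lan_ok = check_host(*ROUTER_ADDR)
    wan_ok = check_host(*WAN_ADDR)
    stamp = datetime.now().isoformat(timespec="seconds")
    print(f"{stamp}  LAN={'ok' if lan_ok else 'FAIL'}  WAN={'ok' if wan_ok else 'FAIL'}")

# In practice: call probe_once() in a loop with time.sleep(60), redirect the
# output to a file per device. LAN=FAIL on several devices at the same
# moment points at the router; FAILs on one device only point at that device.
```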
  7. I'm in no hurry. I'd like to join a few queues for a 3070. Then there's the AIB-or-reference question: AIBs are bound to have 2nd-gen solutions because of the late SKUs and the ironing out of engineering bugs. Gigabyte had their Eagle triple-fan for £470 at Currys (UK). Nvidia.co.uk has the reference listed for £470 (obviously out of stock). I'd rather go without than pay more than £530, and most AIBs are pre-ordering at £630-680! If it takes longer than 6 months then I reset my waiting clock for RDNA3. I love gaming, I love fidelity and immersive effects; I hate being fleeced more, though. My current 5700 XT will serve me well in the meantime and I'll just have to suck up watching RT implementation mature from the sidelines.
  8. Just using this thread to empty my head. So the truth is out: the 3070 is the best way to spend £500. If you need insane frames per second in non-raytraced titles, then AMD.
  9. I've more than just skimmed the surface of this but have by no means gone deep; it's not my profession (I'm an analyst by trade). I like immersive gaming experiences rather than bombastic high-FPS eyegasms, and we are in a place to be considering film-grade lighting when making games. The early ray tracing I became aware of was brute-force photon replication: models would have colour and material properties, and light sources emitted pixel rays that would bounce a number of times, calculating their colour from a 32-bit palette. Things have moved on; there are efficiencies in rendering and a lot more horsepower. I'm at a disadvantage as I have not read up on these new techniques, but what is being waved about are the visual outputs. Reflections are the most obvious, yet their accuracy is not worth the overhead. Contact shadows are also obvious, yet of limited value even when perfectly implemented. Ambient occlusion is one of the first subtle things: remarkably expensive, but it adds to the immersive feel of a scene. Global illumination is another good one: moonlight glinting off shiny surfaces rather than walking around under a spotlight. I'm interested to be educated on other ray-tracing benefits. Looking at the difference between the alpha build of Witcher 3 and the release version, you can use that as an example of the atmospheric boost ray tracing can give a game. TL;DR: RT is a good thing IMO, yet I think its marketable benefits are being skewed. You can easily spot reflections and shadows, yet you feel good lighting. (Sorry, got into writing this then lost steam halfway through.)
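The "rays bounce a number of times, picking up colour" idea above can be sketched as a toy recursive tracer. This is a deliberately naive illustration of the brute-force approach, not how modern RT hardware works, and the scene (one sphere, its colour, its reflectivity) is entirely made up:

```python
# Toy recursive ray "tracer": one mirror-ish sphere, rays bounce up to a
# fixed depth, mixing surface colour with whatever the bounce hits.
# All scene values are invented for illustration.
import math

def ray_sphere_t(origin, direction, center, radius):
    """Smallest positive t where origin + t*direction hits the sphere, else None.
    direction is assumed to be a unit vector."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-6 else None

SKY = (0.2, 0.4, 0.8)               # background colour
SPHERE = ((0.0, 0.0, 5.0), 1.0)     # centre, radius
ALBEDO = (0.9, 0.6, 0.3)            # surface colour
REFLECTIVITY = 0.5

def trace(origin, direction, depth=3):
    """Follow one ray for up to `depth` bounces; return an RGB tuple."""
    if depth == 0:
        return SKY
    center, radius = SPHERE
    t = ray_sphere_t(origin, direction, center, radius)
    if t is None:
        return SKY
    hit = [o + t * d for o, d in zip(origin, direction)]
    normal = [(h - c) / radius for h, c in zip(hit, center)]
    d_dot_n = sum(d * n for d, n in zip(direction, normal))
    reflected = [d - 2.0 * d_dot_n * n for d, n in zip(direction, normal)]
    bounce = trace(hit, reflected, depth - 1)        # follow the bounce
    return tuple(a * (1 - REFLECTIVITY) + b * REFLECTIVITY
                 for a, b in zip(ALBEDO, bounce))
```

Doing this per pixel, with many rays and many bounces, is why the naive version is so expensive; the modern tricks (denoising, importance sampling, hardware BVH traversal) are all about cutting that cost down.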
  10. Loving the competition. Value per FPS at your chosen resolution should be the new metric. 3RD PARTY REVIEWS. Nvidia GameWorks? Nvidia DLSS? Nvidia CUDA? Greedy Nvidia tech, designed to create revenue first and be a good product second. AMD and DX12 + consoles: gaming exemplified and democratised. For the 10+ years up to the 1070 I was unwilling to accept the AMD wonkiness and I paid the brand premium. They have now gouged it too long and too far. Budget Sapphire Pulse 5700 XT for me since March. It needed its fan curve tuned and I have had to spend, all in all, about half an hour reinstalling drivers, but that is a fair compromise for the value proposition when looking at FPS per £$€.
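The value-per-FPS metric is simple enough to compute yourself from review numbers. A minimal sketch; the card names, prices and FPS figures below are placeholders, not real benchmark data:

```python
# Price-per-frame comparison at a chosen resolution. Cards, prices and FPS
# figures are placeholders -- substitute real review numbers.
cards = {
    "Card A": {"price_gbp": 470, "fps_1440p": 90},
    "Card B": {"price_gbp": 650, "fps_1440p": 110},
}

def pounds_per_fps(card):
    """Lower is better: pounds paid per average frame per second."""
    return card["price_gbp"] / card["fps_1440p"]

# Print cards from best to worst value at 1440p.
for name, card in sorted(cards.items(), key=lambda kv: pounds_per_fps(kv[1])):
    print(f"{name}: £{pounds_per_fps(card):.2f} per FPS at 1440p")
```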
  11. Get the best 2nd-hand GPU with your money. Save again. Get some more RAM. Save again. Max out the CPU. Shame money is so tight, because this upgrade path is slow and is buying properly end-of-life components with no resale value. But this is the way to make the most out of what you have with what you can afford.
  12. Not sure. From what I've heard/read, the Ryzen 1600 is right on the line for what you need, and the new consoles are going to supersede it pretty soon (NPCs and physics...). I don't need high framerates; I'm more of a 60fps 1440p kind of guy. My knowledge is from the end of last year, when I realised my old i5 3570K (9000 CPU benchmark score) was not enough for 60fps in open-world games, so I went B450 and an R5 3600. If you're on a super tight budget: get a quality B450 mobo; the MSI Mortar, I think, was the best in class for the price. That gets you on the AMD CPU path, and that socket can go from the 1600 all the way to a 3900X, possibly further! If you get a 1600 it will return almost nothing in resale value. (I factor resale value into my purchasing a lot.) Budget: look at the cost of an MSI B450 Mortar/Tomahawk, add 16GB DDR4-3200 (3600 if you can), then add the best CPU you can afford. That's what you need to save up. Your 590 GPU is still valid, although you might have to drop shadow and ambient-occlusion settings.
  13. You're holding onto 1080p, so you're not another "I only have $2000 for a new PC" guy. B450 mobo, Ryzen 5 and 16GB of DDR4-3200: the cheapest way to get back in the game for a core upgrade, with no CPU bottlenecking, and you keep the rest of your rig. IMO that's good for 5 years. (In 5 years' time you'll be able to max out the motherboard with the highest CPU and RAM for a lot cheaper.) GPUs... in the UK you can get a 2nd-hand 1070 for half the price of a new 2070. Yeah, the 2070 is only 25-30% faster, and the 1070 is enough to get you triple-digit FPS. Buy it with a view to selling next year, "renting" it; assume you'll lose half its value once all the dust has settled with the new cards. Doing it that way lets you game how you want as soon as you can.
  14. 2nd-hand GTX 1070. CPUBenchmark is a very rough guide to CPUs; IMO you need a score of over 8000 to be safe in AAA open-world games, otherwise it's the CPU limiting your FPS, not the GPU.
  15. Disappointment complete: the 3070 cannot offer full RT at 1440p 60fps. Using Control as a flagship RTX implementation and VideoCardz's latest analysis: a 2080 Ti can do 40fps avg with full RTX (no DLSS) @1440p, and according to Nvidia's own hype the 3070 is a bit weaker. But there is always "medium RT": https://cdn.knightlab.com/libs/juxtapose/latest/embed/index.html?uid=c829d086-cab0-11e9-b9b8-0edaf8f81e27 Raytracing a desk shadow 100 feet (30m) away, is that really necessary? Taking the hit on fidelity in the face of insurmountable expense, and a dire need to play Witcher 3 and Cyberpunk raytraced, the 3070 looks good for 50-55 fps. Add a sprinkling of driver optimisation. AMD, over to thee.
  16. Fortnite! What a weird marketing battleground. AMD "hides" an Easter egg there and Nvidia ray-traces it. How long ago were these marketing strategies set in motion?
  17. A second hand 2080 I guess.
  18. So the smoke is clear: the 3080 is not a true 4K 60fps card. 60% faster for 60% less, all good. But the jaw-dropping part they hyped is, as we knew, hyperbole on steroids. DLSS is the key, and it's a kludge, just like all their other things such as the Nvidia GameWorks suite of effects and G-Sync. Before I continue, I'm not a red-team fanboy; I've paid the Nvidia premium in the past (Batman: Arkham City was a real moment for me on Nvidia hardware). DLSS is very clever and, from what I've seen, staggeringly effective, but after scanning the wiki article it appears the developer has to prep a game for it. If it were like antialiasing it would be a winner, but machine learning needs training. I'm not sure how easy that is for a dev: do they just hand over the game's textures, models and shaders and let it have a think? Custom expensive software? Massive computational power needed for a sustained period of time? If they update a game, do they then need to reship the DLSS component? So now, to reap all the benefits touted, a dev must implement RT, a non-RT lighting solution, and a DLSS component, after generating content and an actual game. And consoles: AMD non-tensor consoles, which dictate the AAA title limits. All this Nvidia fluff has to be put on top of that. I can't see adoption ramping up any time soon. Some nice reflections that punish FPS. Ambient-occlusion replacement, maybe? But now I'm really stretching my consumer-level knowledge of game tech. It looks like full-scene 1440p non-DLSS ray-traced lighting is not going to be on the cards. I get 1440p 60fps off my 5700 XT. A 3070 would be lovely, but I don't think there's going to be enough content to stretch it for a year or so.
  19. MSI used to be my go-to brand (borne out by reviews), but their 20-series Nvidia and last AMD offerings were poor. Brand loyalty is abused now. We have an army of reviewers out there generating multi-sample data to genuinely assess the cards. Read the reviews and make a choice.
  20. There are already cores within GPUs; stitching those together is the key mechanism you need. So, a core hub on die: set your usage parameters and it will print the prerequisite number of cores. Still, changing the print template at the scales involved on a chip-by-chip basis is a few years off. Single-chip printing? Sticking it to a PCB is relatively simple; 7nm fab is crazy small and delicate.
  21. That sounds normal. So CS:GO is an old game engine and Rocket League is UE3. I guess you have game-specific issues, not system ones. Try another UE3 and Source game? But you are dealing with high-framerate gaming, which I have never had to deal with.
  22. Hmm, bad DirectX install? Synthetic loads are OK; is it just those 2 games you've tried? Maybe try a full-fat AAA title.
  23. Stop being interesting. This is the internet. I get your point, though: you set an almost console-rigid standard so people get the most efficient product that does not have buckets of spare overhead. We're paying for that overhead. New GDDR6X is expensive, and 4K gaming is only ever going to need so much; why pay for more than we're ever going to use? You need just a bit more horsepower than a 3070 but not as much as a 3080. A mix-and-match, build-your-own GPU is a nice idea: a scanner for your system that will then customise any part of it with the optimum component. You could add a price filter, then use sliders on various stats to balance your own product. The technological ecosystem that would have to be in place for that to be possible would make the concept of discrete-component gaming obsolete. Neural laces, as in Peter F. Hamilton? Even though chip fab is just lithography, the templates have the yield issues; that's the reality stumbling block. Custom-printing chips rather than churning out the same one and hoping to get a percentage that works.
  24. After the big hairy man's MSI video, this pushing back of the review date stinks.