dragonhart6505

Member
  • Content Count

    669
  • Joined

  • Last visited

Awards


This user doesn't have any awards

1 Follower

About dragonhart6505

  • Title
    Member

System

  • CPU
    Phenom II X4 955
  • Motherboard
    Asus M4A78T-E
  • RAM
    8GB Samsung 1333MHz
  • GPU
    XFX RX 460 4GB Double Dissipation
  • Case
    Cooler Master Elite 330
  • Storage
    1TB WD1000 + 80GB WD800
  • PSU
    EVGA 900W 80+ Silver
  • Cooling
    Zalman CNPS5X
  • Keyboard
    Dell
  • Mouse
    Logitech
  • Operating System
    Windows 10


  1. At least in AMD Radeon ReLive, you can control the bitrate setting up to 100 Mbps. OBS has TORN UP my overall in-game performance for no improvement in video quality, worse in most cases. Whereas in ReLive I can set my RX 580 4GB to go all out at 1440p/60fps/100 Mbps while playing Black Ops 4 at the same resolution at the same time, and get tremendous quality with no performance loss. I may not be getting 60fps at all times even without recording, but compared to OBS it's a dream come true. The point I'm trying to make is that ShadowPlay/ReLive handle GPU encoding much better than OBS ever will, since they're built directly into the GPU driver and capture frames at a lower level than OBS can (see the bitrate sketch after this list).
  2. To each his own. We clearly don't see eye to eye... so I'll just be going now.
  3. I'll just leave this here then... https://cpu.userbenchmark.com/Compare/Intel-Core-i7-3770-vs-Intel-Core-i7-2600/1979vs620
  4. People see things differently; no fault in that. There are many more advantages to a physical upgrade of the components. I'm not saying the 2600 is bad, it's still quite good by today's standards, but it simply doesn't have the same output in overall metrics as even the 3770. By that logic, any upgrade would be a waste for anybody. Yet people still go from an i7 4790K to an i9 9900K to do the same damn thing every day... bitch at each other on forums about "technical improvements".
  5. Well... yeah. But this was simply a showcase/example of the performance improvement from driver updates between two cards of the same model but different spec, so the Vega 56 doesn't fit here. But I see what you're saying: Vega 56 reviews should be updated, since AMD has in fact made improvements to their platform overall. They're getting there, slowly but surely, but I don't have or want a Vega 56. If I upgrade eventually in the future, it'll be whatever is strongest within my budget. RTX doesn't interest me in the slightest, but if I can afford it I'll get it simply for the performance upgrade. Maybe the 5700 XT, if it comes to that even.
  6. "Can you suggest me a good motherboard..." Op said it themselves. there is no "good motherboard" to upgrade to, even if he just wants for 16gb RAM. he could just get 2x8gb RAM sticks "I want upgrade my pc" Op wants an UPGRADE. only option is to go to another newer generation/platform. i dont see whats so hard to understand here...
  7. Then he shouldn't need to upgrade the motherboard either; there's no benefit in either case. Just get a whole new-generation PC and be done with it, then. Real improvements only started around the 7700K, and not even by that much. The cheapest upgrade option would be Ryzen, then.
  8. I have absolutely no idea what you mean by this, or why you mention the Vega 56 at all. This is an RX 580; it's not on the same level as a Vega 56... not by much, anyway.
  9. A new motherboard wouldn't be much of an "improvement" either, since they'd be getting literally the same as they have now. Want an improvement suggestion? i7 4790K + new motherboard + 16GB RAM + GTX 1060/RX 580 + SSD... an entire new system. There ya go... major improvement.
  10. An improvement is an improvement, no matter how you look at it. Benchmarks put the 3770 ahead of the 2600 in every case, however small the margin. Optimal? Not for the money. Better performance for the money? No... but still better.
  11. Same, brother. I got LUCKY, checking every day for ANYTHING. I keep seeing 8350 + GTX 1060/570 or Ryzen 2400G + 1050 Ti systems on FB Marketplace for like $750 and laugh my ass off. There's a guy in my city selling a Ryzen 7 2700X and a 1050 Ti 4GB for $1000 and will not lower the price... no peripherals or monitor. The system I've got posted above originally came with a 1050 Ti 2GB for $425 after tax... probably more than I should have spent, but it's a damn good performer. Found a sucker to buy the 1050 Ti for $100 and got the RX 580 the same day for $90, so $10 profit lmao. Recently put it into a Fractal Design Define Mini Silent case and it's a dream. Here's what the system can do at 1440p/High. Ignore the stutters; I recently upgraded to 16GB of RAM and completely eliminated them. Smooth ~60fps all the time, with drops down to maybe 55fps for a second or so. Not bad for a $400 system:
  12. ikr?! Waiting to see if the same dude puts the 1070 Ti he upgraded to from the 580 on OfferUp anytime soon, and what kind of deal I can get off him for that lmfao
  13. Only by visuals; you can't really monitor it at the raw level. Like I said, max out a game like Black Ops 4 at high resolution past your GPU's VRAM capacity and you'll see what I'm talking about. If a texture is too large, it'll unload from VRAM and have to reload when it comes back into view. There are things like geometry and collision that obviously NEED to be there at all times, even when not in view, but textures can come and go. If a texture is predicted to be around a corner, through glass, or behind a wall close enough to the entity (your player), it can load in out of sight so it's not so ugly when the entity (you) gets to that point. But if the textures are too large (i.e. the 1440p/Very High setting), then you've run out of VRAM for your immediate surroundings within view, and distant objects/entities/textures can just come and go at a whim, without reason or predictability. That's how I understand it from experience. The actual DISPLAY RESOLUTION is what tears up your performance; you need the horsepower to push more pixels. In the case of 4K vs 1080p that's 4x the total pixel count (see the pixel-count sketch after this list). No amount of overclocking is gonna fix that; you need more, and more powerful, shader cores (or CUDA cores in Nvidia's case) to keep up with that pixel count. That being said, I don't mind 30fps at 4K if I can keep it stable, depending on what I'm doing. Not the first time I've done so, even in Black Ops 4, which loads the GPU to 100% just sitting in the friggin MENUS lmfao
  14. If the 2600 is doing fine, then leave it. It was a suggestion; I didn't say it was the best option.
  15. VRAM isn't quite as important, except for keeping textures in memory so they don't have to reload every time they're called upon again. With more physical RAM, those textures can load faster, since the game pulls them into RAM in the first place. I've noticed this in Black Ops 4 before and after upgrading my RAM from 8GB to 16GB: massive stutters whenever I went over my GPU's VRAM allotment. After upgrading the RAM, no more stutters, and I can max the game out while keeping nearly 60fps all the time. HOWEVER, I'll notice textures have unloaded and pop back into full resolution when turning. It's most noticeable on ground textures, like in Blood of the Dead, where the ground is ridiculously textured with virtual cracks and rocks. The texture will go totally grey for a second or so before becoming the texture it should be... but only in patches, sporadically around the area. As soon as it's out of view it unloads again, until I turn back to look at that area. Between 1080p and 1440p my performance differs by maybe a 10fps drop at the higher resolution. It's worth turning textures down to keep VRAM usage under 4GB so the textures at least don't pop in like they do (see the texture-cache sketch after this list).
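
On the recording settings in post 1: a minimal sketch of the arithmetic behind that 1440p/60fps/100 Mbps ceiling, using only the figures from the post. The ~0.1 bits-per-pixel floor mentioned in the comment is a common rule of thumb for watchable H.264, not something from the original posts.

```python
# Bits-per-pixel estimate for a 1440p/60fps capture at 100 Mbps
# (resolution, framerate, and bitrate are the figures from post 1).
width, height, fps = 2560, 1440, 60
bitrate_bps = 100_000_000                 # ReLive's 100 Mbps cap

pixels_per_second = width * height * fps  # 221,184,000 px/s
bpp = bitrate_bps / pixels_per_second
print(f"{bpp:.2f} bits per pixel")        # ~0.45 bpp, well above the ~0.1 bpp
                                          # often cited as a floor for H.264
```

At that rate the encoder has plenty of headroom, which would be consistent with the "tremendous quality" described.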
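The resolution comparison in post 13, worked out: 4K pushes exactly four times the pixels of 1080p per frame, with 1440p sitting at about 1.78x.

```python
# Total pixels per frame at the resolutions discussed above
res_1080p = 1920 * 1080   # 2,073,600 pixels
res_1440p = 2560 * 1440   # 3,686,400 pixels (~1.78x 1080p)
res_4k    = 3840 * 2160   # 8,294,400 pixels (exactly 4x 1080p)
print(res_4k / res_1080p) # 4.0
```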
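On the texture pop-in described in posts 13 and 15: a hypothetical sketch of LRU-style texture eviction under a fixed VRAM budget. The class and method names here are invented for illustration, and real engines stream far more cleverly, but the sketch reproduces the symptom described: a texture dropped under memory pressure must be re-uploaded (and pops in grey) when it comes back into view.

```python
from collections import OrderedDict

class TextureCache:
    """Illustrative LRU texture residency under a fixed VRAM budget."""

    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.resident = OrderedDict()        # texture name -> size in bytes
        self.used = 0

    def request(self, name, size):
        if name in self.resident:
            self.resident.move_to_end(name)  # refresh LRU position
            return "hit (already in VRAM)"
        # Evict least-recently-used textures until the new one fits.
        while self.used + size > self.budget and self.resident:
            _, freed = self.resident.popitem(last=False)
            self.used -= freed
        self.resident[name] = size           # re-upload from system RAM
        self.used += size
        return "miss (stall while re-uploading)"

cache = TextureCache(budget_bytes=4 * 1024**3)            # a 4 GB card
print(cache.request("ground_cracks_hi", 512 * 1024**2))   # miss on first use
print(cache.request("ground_cracks_hi", 512 * 1024**2))   # hit while resident
```

Turning texture quality down, as in post 15, shrinks the per-texture sizes so the working set fits the budget and the eviction path is rarely hit.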