LazyLizard

Member
  • Content count

    76
  • Joined

  • Last visited

Reputation

  • Agree
    0
  • Informative
    0
  • Funny
    0
  • Thumbs Up
    0
  • Likes (Old)
    5

Awards


This user doesn't have any awards

About LazyLizard

  • Title
    Member
  • Birthday 1992-05-30

System

  • CPU
    AMD FX-8350 4.0GHz 8-Core Processor
  • Motherboard
    Asus Sabertooth 990FX R2.0 ATX AM3+ Motherboard
  • RAM
    Kingston HyperX Blu 16GB (4 x 4GB) DDR3-1333 Memory
  • GPU
    SAPPHIRE Vapor-X Radeon HD 7950
  • Case
    CM Storm Enforcer
  • Storage
    OCZ Agility 3, 2 Vertex 460s, WD Green WD15EARX
  • PSU
    SuperNOVA NEX750G GOLD
  • Display(s)
    BenQ XL2720T
  • Cooling
    Corsair H100i
  • Keyboard
    Perixx PX-1100
  • Mouse
    Logitech G500
  • Sound
    Insignia 2.1 Speaker System
  • Operating System
    Windows 7 64 Bit

Contact Methods

  • Twitter
    twitter.com/CarlosMC92
  • Google+
    https://plus.google.com/u/0/104016388332942978993/posts/p/pub
  • Facebook
    https://www.facebook.com/carlos.cabral.1420
  • Skype
    carlos.cabral7
  • Origin
    carloss19
  • Steam
    xxcarlosxx19
  • Xbox Live
    carloss16
  • PlayStation
    XxcarlosxX19

Profile Information

  • Gender
    Male
  • Location
    New Haven, CT
  • Interests
    PCs, Cars, World history, Movies, Video games, TV, Science, Anime, Food
  • Occupation
    Student
  1. Thanks, you guys. Every time I googled "bandwidth", all I would get were networking results lol. Wish I could mark both your posts as "solved", since they both had pieces that I needed.
  2. So Techquickie brought HDMI 2.1 to my attention; I was surprised that I had never heard of it. The video gave me a general overview of what to expect from it and what I need to do to take full advantage of it. However, such a big performance leap should have earned the name 3.0, in my opinion. So my question is: how do cable manufacturers keep backwards compatibility when they increase the bandwidth, and why is the new standard just 2.1 if it is such a big improvement (see the bandwidth sketch after this list)? Honestly, I would like a pointer to where I can research how standards get tweaked for better performance and added features.
  3. Time for spontaneous stops during my road trip to play games on the side of buildings lol
  4. Oh, OK. Other forums are also saying it depends on an individual's eyesight, so you could get away with lower resolutions. Thanks for the reply.
  5. No need to insult the post just because you don't understand the question.
  6. I am not looking to upgrade now; the GTX 1080 can barely do 60 fps on average, and I know 8K won't happen for a while. I am just asking the forum what it has heard about 8K tech, for example how high a resolution you have to go to remove the need for AA, and so on.
  7. My favorite screen size for gaming right now is 27 inches, and I am currently running a PC for 1080p gaming. I figured that the pixel density at 4K on a 27-inch screen would remove the need for anti-aliasing; however, it seems that jagged edges are still present in games (see the pixel-density sketch after this list). Since I won't upgrade my setup until I find a resolution whose DPI removes the need for AA on a 27-inch screen, I was wondering if anyone has opinions on, or has seen anything about, 8K gaming.
  8. LazyLizard | AMD FX-8350 Black Edition Vishera | MSI GeForce GTX 970 GAMING 100ME | G.SKILL Sniper Series 16GB (2 x 8GB) 240-Pin DDR3 SDRAM DDR3 1866 | 6.5 (High)
  9. Virtual shot sounds interesting.
  10. Really, a 770? I mean, come on, just stop talking out your ass.
  11. Don't worry about your CPU until the next new CPU architectures come out from both AMD and Intel; all triple-A games lean mostly on the GPU. I myself have an 8350 paired with a 970 with no issues. I am also looking into the same GPU, specifically the 980 Ti Hybrid from EVGA. If you need proof of your CPU's capabilities, look at this: http://www.tweaktown.com/tweakipedia/56/amd-fx-8350-powering-gtx-780-sli-vs-gtx-980-sli-at-4k/index.html It's two 980s with an 8350.
  12. Lol my bad
  13. Is it a gigabit switch?
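
On the HDMI 2.1 bandwidth question in post 2: the short answer is that backwards compatibility comes from keeping the same connector and negotiating the link speed, so a 2.1 device falls back to the older TMDS signaling unless both ends (and the cable) can run the newer Fixed Rate Link mode. The sketch below is a rough back-of-the-envelope check, not the real signaling math: it uses the nominal 18 Gbps (HDMI 2.0) and 48 Gbps (HDMI 2.1) link rates and ignores blanking intervals and line-code overhead (8b/10b on TMDS, 16b/18b on FRL), so real headroom is smaller than shown.

```python
# Rough HDMI bandwidth check (illustrative; ignores blanking and
# line-code overhead, so it understates what the link must carry).

def raw_gbps(width, height, hz, bits_per_color=8, colors=3):
    """Uncompressed video payload in gigabits per second."""
    return width * height * hz * bits_per_color * colors / 1e9

LINKS = {"HDMI 2.0 (TMDS)": 18.0, "HDMI 2.1 (FRL)": 48.0}  # nominal Gbps

MODES = [
    ("1080p60", 1920, 1080, 60),
    ("4K60",    3840, 2160, 60),
    ("4K120",   3840, 2160, 120),
    ("8K60",    7680, 4320, 60),
]

for name, w, h, hz in MODES:
    need = raw_gbps(w, h, hz)
    fits = [link for link, cap in LINKS.items() if need <= cap]
    print(f"{name}: ~{need:.1f} Gbps raw -> {', '.join(fits) or 'needs DSC'}")
```

Run it and 4K120 comes out around 24 Gbps and 8K60 around 48 Gbps, which is why those modes needed the 2.1 jump (and, in practice, Display Stream Compression for 8K). The version number itself is just the HDMI Forum's naming choice; the wire format, not the name, is what changed.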
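
On the anti-aliasing question in posts 6 and 7, here is a minimal pixel-density sketch; the 30-inch viewing distance is an assumption (a typical desk distance), not something from the posts. A common rule of thumb puts the eye's acuity limit near 60 pixels per degree, and 4K at 27 inches already clears that from desk distance; aliasing can still show up because jagged edges come from how the game samples geometry, not from pixel size alone, which matches the observation that 4K alone doesn't kill jaggies.

```python
# Pixel density (PPI) and pixels-per-degree for a 27" screen
# at an assumed 30" viewing distance (illustrative numbers only).
import math

def ppi(w_px, h_px, diag_in):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(w_px, h_px) / diag_in

def pixels_per_degree(density, distance_in):
    """Pixels spanned by one degree of visual angle at this distance."""
    return density * distance_in * math.tan(math.radians(1))

DIAG_IN, DIST_IN = 27, 30  # screen diagonal and assumed viewing distance
for name, w, h in [("1080p", 1920, 1080), ("4K", 3840, 2160), ("8K", 7680, 4320)]:
    d = ppi(w, h, DIAG_IN)
    print(f"{name} @ {DIAG_IN}\": {d:.0f} PPI, ~{pixels_per_degree(d, DIST_IN):.0f} px/deg")
```

This prints roughly 82/163/326 PPI and about 43/85/171 pixels per degree, so by raw density 8K at 27 inches is far past the usual acuity threshold; whether that finally makes AA unnecessary is exactly the open question in post 7.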