winny3141

Member
  • Posts

    217
  • Joined

  • Last visited

Everything posted by winny3141

  1. What is Google turning into, a private equity fund? Anyway, it's still not as stupid as Cisco dissolving Flip camcorders (I mean, they didn't even sell off the company, they just had it disappear!).
  2. This is Nvidia's version of the 7790 Bonaire GPU (which was different from the current architecture at the time).
  3. AMD is not the best if an R9 290X sells for almost $700: http://www.amazon.com/XFX-RADEON-1000MHz-Graphics-R9290XENFC/dp/B00G2OTRMA The Litecoin mining craze is killing AMD's position in the computer gaming market despite their console wins D:. Also, FRAME LATENCY!
  4. I for one am excited to get a Shield or Steam Machine to stream my gaming PC to in order to play games on my TV and make my own console. With how Nvidia Grid is going, and the PlayStation Now streaming service just announced at CES 2014, I think it is very possible to have game streaming in a couple of years. And saying that internet speeds won't advance in the next few years is like saying we won't have octocopters in the sky delivering packages in 2015 once the FAA finalizes rules and regulations. Technology is changing. Fast.
  5. The Asus PB278Q is looking more attractive at $799 with a TN 60 Hz panel (I know it's not confirmed, but come on, Asus released it alongside a Republic of Gamers monitor; it will obviously be able to display 60 frames per second). The sweet spot for 4K is the PB278Q with the Nvidia DIY scaler coming out on Friday, because according to Anand Lal Shimpi you don't have to hold 60 frames per second; between 40 and 60 is fine. I am still deciding between the PA279Q, a PB278Q with the Nvidia scaler, or even the PG278Q. So many choices, so little time (I am building a new comp in a few weeks!)
  6. I loved Death Note and Steins;Gate and am planning to watch Sword Art Online, especially with all this stuff about the Oculus Rift Crystal Cove coming out. Also, Titanfall looks pretty good, but some anime is just plain old weird. I mean, just look at Japanese commercials and you'll understand. There are a few diamonds in the rough, but a lot of anime watching consists of sorting through a pile of un-watchable, childish crap. But that is not to say I don't like anime, I love anime!
  7. So this is basically like what Lightboost does for motion blur, in that it inserts black periods between graphics frames. I remember reading that OLED cannot handle this type of BPI (Black Period Insertion) between frames because it is not bright enough. To get impulse response equivalent to CRTs, the display has to be insanely bright: when black frames are inserted between graphics frames, the naked eye cannot notice the black frames themselves, but it does notice a darker tint to the image, along with diminished gamma, worse delta-E color error, and reduced brightness. Either Oculus has a revolutionary new OLED technology that no one else has up their sleeves, or they are going to run into serious brightness and image quality issues whether the screen is Full HD or 4K! Anyway, seeing all these videos of people using Kinects to control robotic arms, like NASA featured on the WAN Show, this is really exciting with the included webcam and the motion-dot sensors they use in special-effects production in Hollywood. Crystal Cove brings us much closer to virtual reality; it's just that I am afraid of the brightness hit and degraded image quality that will persist with an OLED screen using BPI. - winny3141
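The brightness hit from black period insertion described above can be sketched with simple duty-cycle arithmetic. This is a rough model, assuming perceived brightness scales linearly with the fraction of each frame the pixel is lit; the 250-nit figure is a made-up example, not a spec of any actual panel:

```python
def perceived_brightness(peak_nits: float, on_time_ms: float, frame_time_ms: float) -> float:
    """Rough model: with black period insertion, the eye averages light
    over the whole frame, so perceived brightness scales with the
    fraction of the frame the pixel is actually lit (the duty cycle)."""
    duty_cycle = on_time_ms / frame_time_ms
    return peak_nits * duty_cycle

# A 60 Hz frame lasts ~16.7 ms. If the panel is lit for only 2 ms
# (CRT-like impulse display), a 250-nit panel looks like ~30 nits:
frame_ms = 1000 / 60
print(perceived_brightness(250, 2.0, frame_ms))  # 30.0
```

This is why a display needs huge peak brightness to pull off CRT-like impulse response: shortening the lit period cuts the average light reaching the eye in direct proportion.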
  8. Sharp is negotiating with TV companies to have 8K TV show content by 2020. But then there is the argument: with Netflix and Hulu Plus, will we even still have TV by then?
  9. I would actually be okay with this panel being TN, so long as it is not a tiled-panel implementation like the PQ321Q, the $3,500 4K monitor Asus is selling. That one has two 1920x2160 panels stitched together using two professional technologies that help the graphics card display frames evenly across the two halves. FlipLock is the first technology: it "flips" the graphics card's frame buffer so frames are output to the two panels at the same time, making them act as one whole panel. ScanLock, the second technology, has the graphics card scan the monitor evenly, so it gets information about each panel equally and can process the frames to display as evenly as possible. Even with these technologies, there are issues: weird flashing in games, the inability to change to lower resolutions, and a distorted Windows logo when booting up (tested on Windows 7, 8, and 8.1). Also, with FlipLock, one could speculate that the process leads to some performance loss (processing one frame is probably easier for the GPU than two). So if it is one panel and $800, I don't care if it's TN, because it means 4K for the masses is coming. And let me tell you, this is no Seiki 4K crap TV like the one previously released; this probably has better panel quality than the Asus VG248QE, which according to reviews was a pretty good panel. However, that is not to say I don't like IPS, as I am considering investing in the PA279Q over the PG278Q with the new computer purchase I am making in a few weeks. This is because the difference between IPS and TN is drastic. A 6-bit TN panel can only display 262,144 different colors, while an 8-bit IPS panel can display 16,777,216 (equivalent to the signal that modern graphics cards output to monitors). TN panels, as a result, show color banding: parts of an image that look like lines and degrade image quality.
There are solutions to this problem, like dithering, where random noise is inserted into the areas where the display cannot render certain colors, fuzzing them out, for lack of a better term. Then there is Linus' review where he said that the XL2420TE, which is a 6-bit TN panel, is the best gaming LCD he has ever seen, with almost NO visible color banding. But even with a perfect dithering algorithm, there will always be noticeable discrepancies and obvious color bands, which is why 8-bit is better than 6-bit. The PB278Q WILL bring down prices of 4K panels so that 4K is more accessible to the masses, and it will encourage Nvidia and AMD to produce better-performing graphics cards, so we will have sharper, more realistic gaming experiences. I am excited for what this panel has to offer. - winny3141
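The 6-bit versus 8-bit numbers above come straight from the per-channel bit depth, and the dithering trick can be sketched in a few lines. This is a toy illustration of the idea, not any panel vendor's actual algorithm:

```python
import random

def color_count(bits_per_channel: int) -> int:
    # Each of R, G, B has 2^bits levels, so total colors = (2^bits)^3
    return (2 ** bits_per_channel) ** 3

print(color_count(6))  # 262144 colors on a 6-bit panel
print(color_count(8))  # 16777216 colors on an 8-bit panel

def dither_to_6bit(value_8bit: int, rng: random.Random) -> int:
    """Toy temporal dithering: a 6-bit panel can only show every 4th
    8-bit level, so randomly round up or down in proportion to the
    remainder; averaged over many frames, the eye sees the in-between shade."""
    level, remainder = divmod(value_8bit, 4)
    if rng.random() < remainder / 4:
        level = min(level + 1, 63)
    return level * 4  # back to the 8-bit scale

# Shade 130 sits halfway between the displayable levels 128 and 132;
# dithering alternates between them so the average lands near 130.
rng = random.Random(0)
samples = [dither_to_6bit(130, rng) for _ in range(10000)]
print(sum(samples) / len(samples))  # close to 130
```

The sketch also shows why dithering can never fully close the gap: each individual frame still only contains the coarser 6-bit levels, so the noise itself is visible on close inspection.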
  10. Why the heck would the NSA bite the hand that feeds them? Congress's job is to put words into law, and yet the NSA is going to alienate the very power that could keep it afloat? Who is the NSA fighting for?
  11. This will be a boon for Litecoin miners. Less power and more performance!
  12. The CatLeap PCBs used in Overlord monitors are derived from Yamakasi monitors (like the one that Paul from Newegg has) and can achieve up to a 120 Hz refresh rate. An 8-bit IPS panel can display up to 16,777,216 different colors; a 6-bit TN panel, by contrast, can display up to 262,144. It's not that the pixels can't change fast enough; it's whether the image processor located on the monitor's PCB can output 8-bit images fast enough, and CatLeap PCBs have been known to have this ability. If anyone is wondering why there is only one DVI-D input, it is to reduce input lag so that pixel persistence can be better. The prime suspect usually responsible for bad pixel persistence is the number of inputs the panel can take, as it takes longer for the monitor to decide where to get its information from. However, if you look at the VG248QE from Asus, it has a pixel persistence (the amount of time it takes to change a pixel's color, for anyone who didn't know) of 7 ms according to Christian Eberle's review on Tom's Hardware. The guy who actually hand-builds these monitors is David Schribler, who discovered these CatLeap PCBs while participating on the Overclock.net forums, after which he went on to make his own website selling these monitors along with a team, www.120hz.net. The current website that he commands is Overlord. Hope that clears the air about any confusion that persists about 8-bit IPS-equipped monitors' ability to display 120 consecutive frames per second (a 120 Hz refresh rate). - winny3141
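Whether a panel can keep up at a given refresh rate is easy to sanity-check against its quoted pixel persistence: each refresh leaves a fixed time window for the pixel to finish changing. A back-of-the-envelope check, using the 7 ms figure cited above:

```python
def frame_time_ms(refresh_hz: float) -> float:
    # Time between refreshes, in milliseconds
    return 1000.0 / refresh_hz

def fits_in_refresh(persistence_ms: float, refresh_hz: float) -> bool:
    # A pixel transition must complete before the next refresh arrives
    return persistence_ms <= frame_time_ms(refresh_hz)

print(round(frame_time_ms(120), 2))   # 8.33 ms per frame at 120 Hz
print(fits_in_refresh(7.0, 120))      # True: 7 ms just fits in the window
print(fits_in_refresh(7.0, 144))      # False: 144 Hz leaves only ~6.9 ms
```

So a 7 ms transition squeezes inside a 120 Hz refresh with about 1.3 ms to spare, which is consistent with the claim that the bottleneck is the PCB's image processor rather than the pixels themselves.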
  13. I personally like the Samsung Galaxy S4 ROM that I have. It performs all the functions that I need, but most importantly, it includes the gesture technology that I brag to my friends about all the time. Although many people may find the gesture technology a little tedious at times, the most helpful sensor feature I use is Smart Stay, because it can sense whether I am looking at the screen or not, so when I am reading an article it doesn't turn off the display. I have seen CyanogenMod, and I know it is cool, with the included camera feature that lets you pull down the notifications screen and take pictures right there, and you get a better touch interface. But the GS3 was tested by OptoFidelity to be more touch-responsive than the iPhone 5c and 5s, and its successor, the GS4, is only that much better, so I don't think I have problems with touch sensitivity. It's not like I am not a phone power user, I am, but I prefer using my included Samsung ROM, for now. Source (the Samsung Galaxy S3 has superior touch sensitivity compared to the iPhone 5c and 5s): http://www.tomshardware.com/news/galaxy-s3-beats-iphone-5s-5c-touch-accuracy,24874.html - winny3141
  14. The GTX 780 can run at 84°C with the reference cooler due to GPU Boost 2.0 keeping the card at that limit unless you specify otherwise. I thought the PowerTune technology featured in the R9 series of cards would keep the 290X a little more consistent with temps, but the official drivers for the card aren't out yet. Anyway, those are extremely high temps, and computer enthusiasts are probably angered about how AMD traded silence for a cooler temperature. I know I am.
  15. John Carmack has no idea what he is talking about. If he kept up with the rumors, he would know that Nvidia plans to put ARM chips in their Maxwell refresh, the 900 series, enabling some of the Nvidia driver work to be offloaded to the graphics card, reducing CPU bottleneck. This is better than an API, because this is a way Nvidia could improve performance in ALL APIs. With Mantle, to get a performance benefit the game has to be programmed against the API, and the only game to do that so far is Battlefield 4. AMD broke the 1 TFLOP barrier in 2008, the 2 TFLOPS barrier in 2009, and the 3 TFLOPS barrier with the 7970, and the 7970 GHz Edition had well over 4 TFLOPS of compute power. Also, AMD APUs are being used in the next-gen consoles. However, Nvidia takes everything that AMD does and makes it better, and as much of a fanboy as you think I am for saying this, just look at frame ratings, and now at driver processes being offloaded to ARM cores outdoing the Mantle API; Nvidia is definitely leading the graphics war, for now at least.
  16. http://www.youtube.c...h?v=OWnbEHMLRog Watch that video. It will show you that even on a single R9 280X, there is horrible frame variance, many times above the "unnoticeable" barrier you speak of. I mean, even in an AMD Gaming Evolved title, Battlefield 4, we see jumps well over the acceptable, playable barrier of 20 ms all the way to 40 ms, and in Battlefield 4 at 1440p we see jumps of up to 60 ms on a single R9 280X. To add, Bioshock Infinite at 1440p on the single R9 280X has frames taking up to 80 ms to render while the average is 24 ms. Finally, with very demanding Skyrim mods such as Project ENB, we see actual lag and variance of up to 80 ms from the 24 ms average. I mean, look at the screen recording Logan showed of the modded Skyrim game; it looks unplayable, laggy! Even the BF4 beta gameplay looked choppy in its own right. My friend, get with the times: the hardware AMD is rolling out might be great, but the drivers to support it are atrocious. I mean, look at the frame variance of even AMD Gaming Evolved titles! - winny3141
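Frame-time variance of the kind those benchmarks show is easy to quantify from a capture log. A minimal sketch using made-up frame times shaped like the modded-Skyrim example above (mostly ~24 ms frames with one 80 ms spike), not Tek Syndicate's actual data:

```python
def frame_time_stats(frame_times_ms):
    """Summarize a frame-time capture: average, worst spike, and the
    largest jump between consecutive frames (what you feel as stutter)."""
    avg = sum(frame_times_ms) / len(frame_times_ms)
    worst = max(frame_times_ms)
    biggest_jump = max(abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:]))
    return avg, worst, biggest_jump

# Hypothetical capture: steady ~24 ms frames with a single 80 ms hitch.
times = [24, 25, 23, 24, 80, 24, 23, 25, 24, 24]
avg, worst, jump = frame_time_stats(times)
print(avg, worst, jump)  # 29.6 80 56
```

Note how the average barely moves (29.6 ms still rounds to a decent FPS number) while the 56 ms jump between consecutive frames is exactly the stutter an FPS counter hides, which is why frame-time plots matter more than averages.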
  17. But you forgot to take into account FRAME LATENCY!!! Even on one card there is horrible frame latency; I mean, look at Tek Syndicate's benchmarks.
  18. Adaptive is great, because it gives the CPU the exact voltage it needs to run optimally. I would not fret over .6 Volts if that is what your motherboard feels your CPU needs to run optimally. You can set the CPU to manual voltage if you want, but that should NOT be your daily driver, because you would not be taking advantage of Haswell's power-saving technologies. However, do NOT run intensive benchmarks with adaptive on. As I remember, Linus said in his Haswell overclocking guide that running Prime95 in adaptive mode can make the vcore rise a really high amount when you're not intending it to, which risks overpowering your cooling solution and, as a result, damaging your CPU. DO NOT run intensive CPU benchmarks in adaptive mode; use manual mode only when you are experimenting with overclocking. Hope that helped. - winny3141
  19. Yes, PowerTune technology. Also, you can argue that because of this technology the card lasts longer, since the firmware is aware of temps and keeps the GPU at a safe level, as a result not melting the GPU itself. However, I guess the GPU clock would significantly decrease over time due to the TIM (thermal interface material) giving out, yes. - winny3141
  20. A fingerprint scanner on the back of the phone? That looks awfully similar to the LG G2's scroll bar. Is this going to be a more seamless way to unlock your phone?
  21. I can't wait until Linus does an unboxing of an invisibility cloak.
  22. A CPU is a serial processor while a GPU is a parallel processor. Kilobytez95 is right in the sense that a CPU has more powerful individual cores, but the GPU has more total compute power than the CPU. This article might help explain this a little better: http://www.tomshardware.com/news/AMD-HSA-hUMA-APU,22324.html. Also, this video should help explain parallel processors a little better: http://www.youtube.com/watch?v=jtZu9MP4llQ. Hope that helped! - winny3141
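The serial-versus-parallel distinction above comes down to data dependencies: a GPU-friendly workload applies the same operation to many independent elements, while a serial workload has each step depending on the previous result. A minimal illustration in plain Python (the functions are hypothetical examples, not any real API):

```python
# Parallel-friendly: every output depends only on its own input,
# so a GPU could compute all the elements at once.
def brighten(pixels, amount):
    return [min(p + amount, 255) for p in pixels]

# Inherently serial: each value depends on the previous accumulated
# result, so the steps cannot simply be spread across thousands of
# GPU cores.
def running_total(values):
    totals, acc = [], 0
    for v in values:
        acc += v
        totals.append(acc)
    return totals

print(brighten([10, 200, 250], 20))  # [30, 220, 255]
print(running_total([1, 2, 3, 4]))   # [1, 3, 6, 10]
```

The first function maps cleanly onto thousands of weak GPU cores; the second is the kind of dependent chain where a few fast CPU cores win.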