
NamelessTed

Member · 157 posts
Everything posted by NamelessTed

  1. 1. Don't get the new Dell 24" 4K monitor. It is not for gaming; at 4K resolution it is limited to a 30Hz refresh rate. 2. I am excited for 4K. I just made the jump to a 27" 1440p Korean import and love it. I am still adjusting to the pixel density and the massive amount of screen real estate. I think that even at 24", 4K would look great. 3. I think we might actually see a lot of monitor companies just skip 1440p and go straight to 4K. 4K is going to be the new big marketing thing that everybody talks about. 4. I think we are going to see the price of 4K monitors come down at about the same rate as the GPUs that are actually able to play games at that resolution. If you wanted to push 4K right now you would probably want dual 780 Tis in SLI to get a solid framerate, which is something like $1200? Those people can probably afford spending $800 on a monitor. It might be 3 years before a card in the $300-400 range will be able to push games at 4K at 60 FPS.
  2. So I ordered the display and have been using it for a few days now, almost a week. I have to say that I am extremely happy with the purchase. For $250 I got an X-Star that is in absolutely amazing condition compared to what I thought I was getting. The monitor was reported as having a dead pixel in the middle sector and a bright pixel in the right center sector. I have gotten close and really looked for them using solid colors on screen and can't find a bad dot anywhere. As far as backlight bleed goes there is a bit in the top left corner, but I had to turn off the lights in my room and put up a solid black screen to see it easily. It isn't that bad, IMO. In darker scenes I can see it if I look for it, but it doesn't stick out. All in all I feel like I got a great deal on this thing. I was expecting a couple of bad pixels and moderate backlight bleed and got a panel with no noticeable bad pixels and only slight backlight bleed. I am still getting used to the increased pixel density. It is just ridiculous how much screen real estate this thing offers. I definitely need to get Skyrim reinstalled with a few mods and lose another 100 hours of my life.
  3. I will say right away that I totally understand why a company would want to make a product like this, and I can understand the initial excitement for a totally modular and very easily upgradable PC. This will obviously be considerably more expensive than the current "traditional" PC form factor. I am just not sure the simplicity and the performance increases one would get via upgrading would be worth how much I imagine something like this would actually cost. You have to wonder what the target audience for something like this is. It reminds me of the ideas floating around for a modular smartphone, which I believe is equally doomed to fail.

From a mainstream perspective, most electronics are "disposable" in that you buy a device and use it until it doesn't work or isn't "good enough" anymore. That might be something like 2 years for a cell phone or tablet, 2-5 years for a laptop, or 5+ years for a desktop PC. By the time a device feels too slow or stops working it often just doesn't make sense to replace parts or upgrade bits and pieces to get a little more life out of it; buying a whole new system is often the smarter choice in terms of price per performance. Upgrading to an SSD, increasing RAM, or even installing a fancy new GPU might be smart upgrades depending on specs, but for a lot of mass-market people this would be too advanced. They would likely require somebody else to do it for them. In this instance I understand that the ease of just plugging a new block in would be great for them. But would this same market be interested in spending potentially double the money upfront? The more price-conscious option would be to spend half as much on a PC now, and then the same amount of money on a PC in 3-5 years' time.

A fully modular PC like this clearly isn't for the PC enthusiast that already knows how to build a system on their own. Those people will realize that it is way more cost effective to build their own machines. So maybe the current market for something like this would have to be people that have lots of money but don't want to or can't build their own systems. It would target the audience of the boutique PC market: people that play games or simulators and want a super high end rig, and want it to look cool as fuck and be able to run anything you throw at it. But even with those people, would they necessarily be interested in purchasing something that they could upgrade themselves? If they had a $4k rig built for them, wouldn't they just pay somebody to upgrade it for them if they ever needed it?

So maybe something like this would be really useful for production or enterprise use. Maybe a small studio has a computer that they 3D render or do video editing on. I guess I could see how it would be useful to be able to just slap another GPU into a rig and get more performance out of it. If you had 100 office computers, the task of going through every one and upgrading something like a CPU or GPU or whatever would be quite daunting. Even with maintenance, it isn't a huge deal when you are just at home, but if you have to maintain many PCs it would become a much simpler task of just pulling one module out and popping a new one in without having to open up a case and deal with any sort of cables or screws.

I think if a more modular design became affordable it would be great for those enterprise scenarios. At the end of the day, the major issue for me is open vs. proprietary. Linus talks about this a bit in his video. If Razer tries to keep this proprietary then it is totally dead in the water. Absolute waste of time. However, IF all the companies could get together and have an open standard then it could become useful. I think I had a much more negative view of Project Christine before writing this post. I still think it is likely pie in the sky, and would be too expensive. But I think it is a step in the direction of things to come. Maybe we could have something intermediary where certain things like the mobo, CPU, and RAM are all locked in place but the GPU, HDD/SSD, and front I/O are easily upgradable in a fashion similar to what Razer is showing.
  4. I have been interested in purchasing a Korean import monitor for a while now. I ended up buying new headphones and a tablet recently so I have to save up for a monitor again, but I have found a couple of sales and am wondering if it might be worth purchasing. There are two available online, listed at between $230 and $260. Both monitors are listed as having one dead pixel and one bright pixel each. I am just wondering how bad or noticeable these bad pixels might be. I am primarily going to be playing games and watching videos on the display but will also be using it for general web browsing. In my head I know the answer is to just wait a bit longer and save for a new monitor that shouldn't have any dead pixels. But if having just two bad pixels isn't very noticeable then I might just go with the cheaper option to get a new monitor sooner, and then maybe grab another one 6 months down the road or so and use the "bad" monitor as my secondary.
  5. Gotta say that your guys' video came at the perfect time for me. Audio is something that I have wanted to become more knowledgeable in for a while and something I have been reading more about, but I am hesitant to really start spending money if I don't know what I am buying. When it comes to CPUs and mobos and GPUs I know what I am paying for. With audio, it's all foreign to me. The only reason I bought an ASUS DGX a year ago or so is because the integrated audio chip on my motherboard died after a week and I didn't want to go through the huge RMA process on the motherboard for my brand new system. I got it for like $30 and it seemed to pair well with my Steelseries 5xB. The headphones always sounded pretty good to me, and I definitely noticed a difference when I plugged them into the amplified port, so that made me happy enough. Today I bought a pair of Sennheiser HD 598s. I had been looking at a few different pairs in that price range for a while, and I was at Fry's buying a CPU heatsink and fan for a friend's PC that I am about to go work on right now when I saw the HD 598s for $140 on the way to the register. I just had to grab them at that price and figured if I don't like them I can always return them with no issue at all; Fry's has never given me a single issue with returns. I've been listening to music for about 30 minutes now and have switched back and forth between the Steelseries and the HD 598s. I will have to spend more time with them, but so far they seem to sound noticeably better. I'm probably just going to stick with this audio card for now, but I now have my eye on the Mayflower DAC + O2 to put on my desk. I think if/when I finally build a new rig and go ITX or MicroATX I will have to fit some Mayflower hardware into my budget.
  6. I will say that I have had my 3rd H220 installed for a while now and it has been working great. If I could go back in time I don't think I would buy one, or any AIO water cooler for that matter. The new Cooler Master that is the replacement for the H220 seems to be almost identical, so that would probably be the one to go with if a person wants to go AIO water. But I would just grab a big ass Noctua air cooler if I could. I have considered potentially selling the H220, but at this point it's in my rig and I'm just a bit too lazy to take it out.
  7. I want to get something straight here. I have seen at least two other WCCFTech articles posted here on the forum in which it is stated that nVidia is cutting Tegra 5 short and releasing Tegra 6 to market by the end of 2014. This is now the third one that I have seen, all of which were written by Usman Pirzada. I still haven't seen a single other person or website on the entire internet make this claim, with the exception of articles linking directly back to a WCCFTech page. I also sent Usman a couple of messages on Twitter to get more information. He was not able to offer anything new. I pointed out that the roadmap image he used in his own article places Tegra 6 coming out sometime in 2015. He basically just claimed that it was an old roadmap and that there is a newer one, but offered no further insight.

Has anybody seen or heard anything from a single reputable source about Tegra 6 releasing in 2014? I feel like I am taking crazy pills. This guy continues to make this claim again and again, has absolutely no evidence to support it, and I have seen a dozen other sites report this false information. This kind of shit drives me absolutely crazy. To me it discredits the entire WCCFTech website and makes every other site sourcing them look stupid. It is also frustrating to see these things posted here and not see anybody else point out the obvious error. It would seem that everybody reading this on this forum believes what is being said, and I feel it necessary to point out the falsity. Of course, if I am the one that is wrong I would LOVE somebody to show me actual proof. I would be tickled pink if I could get a device with Tegra 6 by the end of 2014.
  8. This has me very excited. I was tempted to get a Nexus 7 a while back but I decided against it. When I first heard rumors of the Tegra Note 7 I was quite excited. While I would prefer a higher resolution, I think 1280x800 will be "good enough", especially if the price point is as low as is rumored. I am especially excited about front facing speakers, something that was a major reason for me not buying a Nexus 7. I am really hoping I can play some Hearthstone on this baby through nVidia's streaming. That would be perfect with the stylus.
  9. Absolutely agree with you on this. It would be awesome to be able to build a rig in the Corsair 350D with SLI/Crossfire as well as having that top slot available for something like a sound card or wireless adapter. I know ASUS has one of their boards with three full-length PCI Express slots, but the way it is configured means you would have to sandwich the sound card in between the two GPUs, which isn't something that I would want to do.
  10. I totally understand where you are coming from, but I don't think you are seeing the whole picture. I won't deny that the general trend going forward is going to include more and more automation for everyday eating places. McDonald's has already had completely automated drink pouring for a while. As technology becomes cheaper over time it only makes sense that more and more basic tasks will be done by computers/robots. For fast food or more basic dining experiences I don't necessarily have any issue with this trend. Of course, just because some restaurants follow this trend doesn't mean "traditional" restaurants are going to go out of business or change to become automated. There are a lot of people that work in restaurants because it is what they love to do, myself included. The people working in a lot of restaurants don't just behave like robots. We focus on making great food and offering great dining service. And there are a lot of people that really enjoy different dining experiences. I actually think it would be really fun to eat somewhere like that sushi joint that is highly automated in that way. It would be a very unique way to dine. I just take issue with your stance of painting the entire restaurant industry in such a negative way.
  11. I am totally ok with this. Watch Dogs is going to be a big game and I personally always prefer a game get delayed for 6 months rather than have it released with major issues. While I do really want to play the game, I would prefer that the game be finished properly and not rushed out the door. Good on Ubisoft for allowing more development time to get the game right. While Ubi has a tendency to pump out yearly releases for a couple of their major franchises, it is good to see that they still understand how important it is for a new IP to be properly developed. I just hope the game will be ready by spring.
  12. Everybody, pump the brakes. Tegra 6 is not coming to market in 2014. It isn't going to happen. Ethnod has managed to find "two" sources that claim that Tegra 6 is launching in 2014, but they have absolutely no evidence to support the claim. The horrendously written article at thinkcomputers.org is just using wccftech.com as its source. And if you go and read that article you will find no reason to believe anything that was written. The author simply claims that Tegra 6 is launching in 2014 for no apparent reason. He doesn't claim to have a source, or that nVidia announced it, or any sort of leaked roadmap. He presents absolutely no real information that would lead any person to believe that Tegra 6 is coming in 2014. In reality, he even posts the roadmap picture that we have all seen. Anybody that knows how to read a graph will instantly be able to tell you that "Parker" is clearly not going to be released until sometime in 2015. What is even crazier is that the author at wccftech.com even points out that the Denver CPU isn't supposed to be on the market until 2015, but he somehow magically thinks nVidia is just going to start putting it in Tegra 6 and release it in 2014.

Then there is the fact that Tegra 6 is going to be using a Maxwell GPU. nVidia has been rocking their Kepler chips in the desktop market for a couple of years now. Maxwell is slated to hit the market sometime in 2014 (hopefully sooner rather than later). Maxwell is a new manufacturing process at 20nm. It is a whole new chip. It is going to take time for nVidia to make these things, and they will very likely be in high demand when they release. Also understand that they don't have 100% yield rates on these things. Meanwhile, they already have, and can continue to manufacture, Kepler chips. So, is it more likely that nVidia will run Tegra 5 through 2014 and into 2015 and utilize their Kepler chips, or do you think they would potentially take chips away from the desktop market just so they can release Tegra 6 in 2014?

Everybody, please don't believe a bit of this. Hopefully most people will just dismiss this as silly, or they will read the article and realize it is complete bullshit. Unfortunately, crap like this is how misinformation gets spread. As of right now, there is absolutely zero reason to believe that Tegra 6 will release in 2014.
  13. It is nice to see some benchmarks coming out, but these are really weird to me. I understand that most people could extrapolate the performance numbers from one series of Intel chips onto another, but it is always better to benchmark the actual chips. I just wonder why they would test the newest chips from AMD and then test chips that are two full generations old from Intel. I am sure the numbers will end up being pretty close (with a slight increase for each generation), but it just seems very strange to compare a new chip from one manufacturer with an old chip from the other.
  14. I have read through most of the comments on this thread, and instead of getting into the whole debate I simply want to ask the question that was my initial reaction to reading the title and post. How is this desperate? Seriously, you can argue whether this is good or bad, or who it benefits, until the cows come home. But how is the fact that Google is allowing people to install a version of Chrome OS onto a Windows 8 machine desperate in any way? It isn't like they are forcing everybody using Windows 8 to give up the standard Chrome browser and only use this new Chrome OS version. Would you make the same argument about allowing people to install a virtual version of Android onto a desktop environment? Seriously, that would be an insane argument to make. I know you also complained about it only being available on Windows 8. It is possible that focusing on Windows 8 development was simply the easiest way for them to get the product out there. If you know anything about Google, you probably know that they tend to release all of their products early and will even keep a "beta" tag on them for years (hello, Gmail). Google doesn't develop behind the curtain and then release a full product. They start development on things, tend to release them before they are ready for mass-market use, and then make them better over time.
  15. Just to note, none of this information is going to be useful to anybody that doesn't actually understand what it means. The worth of a website/YouTube channel is actually extremely complex and varies drastically based on the type of content and the target audience of the site/channel. You can have two websites that each get 1000 unique visits a day. One site can be a Bieber fansite where everybody visiting is a 13-year-old girl. The other site could be a car enthusiast blog where a majority of the readers are 40+ years old and make $250k+ a year. Say a BMW ad runs on that car blog; it should hypothetically get a higher CPM or CPC than some Forever 21 ad on the Bieber fansite. If somebody running a website also knows advertising well, they can often get higher-CPM and higher-CPC ads simply by being good at managing them, regardless of the content and the audience.

And you also need to realize that a site's "worth" is often very, very different from how much money it actually makes. A lot of websites are "worth" a lot of money but don't make money. Reddit is theoretically worth a bunch of money but they basically have no revenue model at this point. You also can't limit your scope to just looking at uniques and page views per day. There is Twitter, Facebook likes, YouTube subscribers, Tumblr followers, twitch.tv viewers, etc.

Basically, a simple calculator like worthofweb.com isn't going to be very good at making any sort of accurate calculation for most websites. And in the case of LinusTech, it is even more unreliable. There is only a very brief period of data to pull from. The longer a site has been around, the more data is available and the more accurate an estimate might be. There is also the issue that the main website is a forum, so the kind of traffic it gets is very, very different from what something like a blog or a YouTube channel is going to get.

tl;dr: The estimated worth of LTT according to an online automated calculator is basically meaningless.
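To make that concrete, here is a minimal sketch (with completely made-up numbers) of the kind of pageviews-times-CPM formula an automated calculator like worthofweb.com presumably uses. The exact formula and the revenue multiple are my assumptions, not the real ones, but it shows how the same traffic produces wildly different "worth" figures depending on the CPM you assume.

```python
# Rough sketch of a naive "worth of web" style estimate.
# All figures below are invented for illustration, not real data.

def naive_site_worth(daily_pageviews, cpm_usd, revenue_multiple=24):
    """Estimate 'worth' as monthly ad revenue times a fixed multiple.

    cpm_usd: assumed ad revenue per 1000 page views.
    revenue_multiple: how many months of revenue the calculator
        treats as the site's value (an assumption).
    """
    monthly_revenue = daily_pageviews * 30 * (cpm_usd / 1000.0)
    return monthly_revenue * revenue_multiple

# Same traffic, very different audiences -> very different "worth".
fansite_estimate = naive_site_worth(daily_pageviews=1000, cpm_usd=0.50)
car_blog_estimate = naive_site_worth(daily_pageviews=1000, cpm_usd=15.00)

print(f"fansite estimate:  ${fansite_estimate:,.0f}")   # $360
print(f"car blog estimate: ${car_blog_estimate:,.0f}")  # $10,800
```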
  16. My current setup is a 500GB Samsung 840 SSD with a 3TB WD and a 2TB Seagate. Obviously the SSD is my primary boot drive. I had a smaller SSD previously but found it was constantly full, so I splurged and went with a 500GB drive. It allows me to have any games that I am playing on my SSD as well as the OS and any other programs I have going. The big reason for going big on the SSD is that games are just getting bigger and bigger these days. 25GB+ is becoming the norm, and I think once the next gen comes into full swing on the console side, 50GB is going to be standard, so I want to be able to have several games installed without having to move them to an HDD or delete and re-download them later. As for storage, the drives are just what I bought at the time that were a good price. I keep all my videos, photos, and all sorts of documents on the two storage drives. I would say a majority of my content is only on a single drive right now, which does worry me a bit, but the important stuff is copied to both drives as well as being in the cloud in one or two places. My ultimate plan is to pick up a couple of 4TB drives at some point and set up Storage Spaces in Windows 8 so my data is safe in the event of a drive failure and it is still easy to add more storage in the future. It also really bothers me having different-sized drives for some reason, so I would like to eventually replace my 3TB and 2TB with all 4TB drives moving forward, but we will see how that goes. tl;dr - 500GB SSD for the OS and games. 3TB and 2TB drives for storage and backup.
  17. These look like pretty slick PSUs at a really good price. As it stands right now there would be absolutely no reason for a person to buy any TX series PSU unless there is a pretty serious drop in price. At equal wattages the RM series offers slightly lower prices, fully modular cabling, and higher efficiency.
  18. This is really fucking cool. I wonder if it would be able to run other apps on the market as well, like Netflix and such. It would certainly have the power to do so. This makes me think that Sony understands what is happening in the market right now. IF this device offers streaming outside of the Sony universe it would be something I would really want over something like a Roku. Shit, as it is I am really considering picking up one of those $200 12GB PS3s since I currently don't have a media box on my TV or a PS3 and I really want to play Beyond: Two Souls. My other option would be to find a used PS3 on craigslist. But seriously, I feel that Sony is diversifying their platform in a smart way.
  19. "Maxing out" is just such a relative thing to say. This whole hoopla happens every single generation. Back when Gears of War released on Xbox 360 they said that game was maxing out the system. They said the same thing about the first Uncharted on PS3. They say it about pretty much every game that comes out. And they are mostly right. They are using every bit of processing power that these systems have available to them. BUT, that doesn't mean that they are doing it in an optimized fashion. You have to realize that developers of these big AAA games aren't ever going to leave performance on the table. They have to look at the whole picture, where resolution and framerate are just part of the story. If they see that they are above their framerate target at a given resolution they are either going to increase the resolution or go in and add more detail or implement different/more shaders, more tessellation, advanced lighting, etc. It works similarly on PC: if I have a GTX 760 I can run BF3 with all the settings maxed out and get X framerate, or I can lower the settings and get 2X that framerate. In both scenarios my GPU is going to be running at 100%. It just becomes a matter of how well the game engine and/or drivers are utilizing that 100%.
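A toy way to picture that last point (the numbers here are invented, not real BF3 or GTX 760 figures): treat the GPU as a fixed budget of work per second, and the graphics settings as deciding how much of that budget each frame costs.

```python
# Toy model: the GPU is always "maxed out"; settings decide how the budget is spent.
# All numbers are made up for illustration.

GPU_BUDGET_MS_PER_SECOND = 1000.0  # the GPU has 1000 ms of render time every second

def fps_for(frame_cost_ms):
    """Frames per second if each frame costs frame_cost_ms of GPU time."""
    return GPU_BUDGET_MS_PER_SECOND / frame_cost_ms

ultra_cost_ms = 16.7              # hypothetical per-frame cost with settings maxed
low_cost_ms = ultra_cost_ms / 2   # halve the per-frame work by lowering settings

print(round(fps_for(ultra_cost_ms)))  # ~60 FPS, GPU at 100%
print(round(fps_for(low_cost_ms)))    # ~120 FPS, GPU still at 100%
```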
  20. This looks really cool to me. I remember a few years back when they announced Mega Man 9 and I was baffled by it. They literally just made a game straight out of 1988 and sold it as brand new just to make money on the retro phase that people were really into. This game looks like the right way to do it. It looks like a totally modern game. They are taking the base ideas of what made those old-school games really fun but then attempting to update them for the modern era. The concept art looks absolutely stunning, the way a modern side scrolling game should look, not 8-bit. Hopefully they will be able to innovate on gameplay as well. THIS is a game I will very likely actually buy and play when it comes out. And they already hit their goal so congrats to Keiji and the rest of the team on that.
  21. Saw this price drop yesterday and was very tempted by it. I was still considering getting a Moto X but I'm not sure when it will actually be available from Google and current pricing from carriers seems to show it costing closer to $600. I pulled the trigger on the 8GB Nexus 4 just a bit ago and am quite excited about it. $200 is just a crazy cheap price for an unlocked no-contract phone and the specs are still pretty solid (much much faster than my current Nexus S). I know there are rumors that this price drop is just making room for a Nexus 5 but I am fine with that. I don't need that large of a screen, and I am very happy with the price. Best part of it all is that I could sell this phone for $150 after using it for a few months if something else comes along.
  22. I understand that people like the gold color scheme, I just can't stand it. It just looks so gaudy. I have an ASUS board with their different blue colors, which I don't really like either but it is bearable. I would just never be able to bring myself to buy a motherboard with all that gold coloring on it. Ugh.
  23. I have an ASUS P8Z77-V Pro in an R4 with a Swiftech H220 up top and it fits fine. It is a snug fit, but it fits. The tight spot is the RAM slots. My radiator sits right above the RAM clips, so I think I might have to remove the rad to replace RAM. I haven't tried it, but it is pretty close. From what I can see on the Sabertooth, the only part of the shroud that really sticks out is in line with the RAM clips, so I imagine everything would fit.
  24. The idea of MS already working on Windows 9 or 10 is absurd. They honestly need to stick with 8 for a while. Releasing a whole new iteration of Windows every few years is going to seriously fragment the market. I understand that they had to move to 8 with their whole new UI and universal approach to have an OS work across a wide range of devices. But now that they have done that, they just need to focus on tweaking and optimizing 8 to get the most out of it possible. It is really frustrating for me to have to buy a new OS every time MS releases one. I know I could stick with 7, but there are honestly things in 8 that are better. Even with all the issues people had with Vista, it was impossible for me to go back to XP once I had it for a couple of months. As for bringing back Aero, it seems like it would be a step backward. I really thought I was going to miss it when I moved to 8. I thought Aero looked good, and thought Windows 8 looked too flat and soft. Now that I have used it I actually really like the overall visual style of Windows 8 more. I mean, it wouldn't hurt to bring it back as an option. People can use it if they want, or use the new 8 look if they want.