Everything posted by ItsAFeature

  1. If they bundled it in with the Ultimate edition ($120... just bought it), then it would almost make more sense to get a Fury than even a 970. That could swing a good number of people. But of course EA wants 100% profit on the DLC content.
  2. There are customers in Europe who have already received their PG279Q. So far the consensus seems to be that the QC is better than on the previous ROG SWIFT, although the panel does suffer from a bit of IPS glow and a tiny bit of bleed, both of which are to be expected. I haven't heard about dead pixels yet, which is something the Acer version of this panel suffers from.
  3. I doubt the new flagships will be 2x as powerful as the 980Ti and Fury X. With the node shrink, I don't think it's illogical or unrealistic to expect at least 25%. And I'm currently holding off on my next upgrade for Pascal, but I would be blown away if we saw more than a 50% performance improvement for the flagships. I wouldn't be surprised if they were capable of much more but hold back so that they can release another generation on the same node if necessary.
  4. I really want one of the elegant simple ones that are only for hanging a single pair of headphones on. I never unplug mine, so taking them off and hanging them on the wall next to my desk would have been perfect. Not too interested in the winning monstrosity TBH.
  5. They introduced the Verified Purchase, but I agree. There is no reason to let people review something they didn't buy at Amazon.
  6. It's an interesting update, but we'd obviously need a larger sample size before drawing any conclusions. It also seems to me that the battery life is really low on both. Were they running something during the test? Or were the screens forced to stay on?
  7. Jaybird Bluebuds X. About $130. Great sound quality and I use mine at the gym all the time.
  8. Try living in New York City. I've been here a week and I'm already convinced that I'd have to make at least a million to live comfortably here. I imagine it's much the same in cities like San Francisco and Seattle. Plus I'm sure I could find a way to spend it on PC parts and cars.
  9. Oh come on. That comment was in reference to how it was said that he could have taken a paycut to his bonus in order to keep these people employed. It doesn't work that way. There's really a lot more to it than that. The company might be losing money and you could get a fantastic bonus because your business decisions saved the company from losing even more. Or the company might make a fair bit more than last year but your bonus is smaller this year simply because it is discretionary and subjective, therefore not directly tied to any one metric. For example, even my own bonus is only partly based on company performance. There is always a large chunk based on what those above you think about your own performance in your job duties. All I'm saying is that he didn't fire the Nokia employees to increase his own bonus, and neither would he have been able to spend from that bonus to keep them on. People got fired because their positions are no longer necessary. That's a good business decision. It's the right decision. Microsoft needs to change and adapt, the CEO seems to understand that and be leading a change that might save Microsoft and lead them to greater profitability than ever before. I'd say that kind of leadership is worth a hefty chunk of money. And as said above me, the $84 million is not entirely cash either.
  10. He isn't salaried at $84 million. It's a discretionary year-end bonus decided by the chairman and high equity holders of the company based on the job they think he did. Much of the bonus is meant to reward what they believe is a good job running the company, but a large part is also based around ensuring that your CEO does not get an offer they cannot refuse from another corporation. In no way was his decision to lay off a large amount of their workforce based around enlarging his own bonus. It's just business. The employees were part of a technology that the corporation no longer invested in, and thus there is no reason to keep them on. It's just part of life. If my own employer eliminated my entire team because it was no longer needed, I can't say that I would expect to be kept on. I haven't had exposure to, an interest in, or an understanding of the other parts of our technology and business. I would expect to be laid off and would start to look for opportunities elsewhere.
  11. One would think that if they're asking $850-1050 for that card, they could get the number of CUDA cores right on their website...
  12. Kind of sad that they put the most beautiful cooler on the one card I would never keep the air cooler on, isn't it? I probably would never spring for this. I already have enough of a problem paying $200 more for the same card in the hopes of ~100-200 MHz better clocks. But making a $650 card into a $1050 card is way too rich for my blood. I'm sure some people are both willing and able to pay that price, though.
  13. Wonder how that works with the difference in memory clock and VRAM. Sounds like you'd be better off buying another 290 if that's what you had (and saving some money), or buying another 390 to avoid gimping your current card.
  14. Only if you get caught. Disclaimer: This is a joke. Don't blame your underage kids' drinking on me.
  15. It's pretty clever. The Fury is already proving hard to get your hands on, so this is the time to make your own cards more enticing. At $629 the 980ti beats the $649 Fury X in almost every benchmark I've seen and is both available and a bit cheaper. At $479 the 980 might lose to the $549 Fury, but you have to pay $70 more for a few percent extra performance. And if the Fury is anywhere near as short on availability as the Fury X, then the 980 is a solid winner here. What I'm surprised we haven't seen is a price cut on the 970. Right now it's in an unfavorable position against the 390.
  16. I had SLI 770s, which are more powerful than a 690, and I basically gave up on them. Great cards, but I ran into limited VRAM issues, and with SLI support not always being the best, I decided to go with one more powerful GPU. The 690 is good, but it's old now.
  17. I don't really see a need to make a thread about some other reviewer on the LTT forum. Especially without some kind of supportive data. I don't feel like sifting through ToT's videos in an attempt to find errors. Everybody makes them. If you really think this is a cause for concern, you could post in the Off Topic section or something. But please write the OP in a more understandable way and support your claims. It would also be nice if it was less of an attack and more of a "Hey guys, I've noticed that ToT doesn't always get everything right. Just a heads up if you are using them for information about new products." And then point to some cases of this. It just has a better vibe to it.
  18. I spend a stupid amount on watches. I wear one every day and I match my watch to my outfit. I'm also a CS major, work in a tech job, and spend my free time on tech forums. And I vastly prefer using an iPhone. I should be everything that Apple is targeting with something like this. Not for wearing every day, but surely I could find times to wear it. Nope. They made something that I can't find a use for, that looks awful, and that doesn't fit in a professional environment. At least some of the Android watches and the Pebble can pass for a regular watch. The Apple Watch just looks out of place practically everywhere. You also look pretty goofy using it. I considered buying one for the gym to track my workouts (look! an actual use case for the bloody thing!). But it's too expensive for one use case. And I found using apps on it to be kind of garbage. Didn't even get to try a workout app before I decided to move on with my life. If someone can explain it to me, please do. But right now I see smart watches as a product geared towards people who can't be bothered to take their phone out of their pockets and are willing to spend hundreds for the honor of taking their calls on their wrist, after already spending hundreds on a perfectly capable cellular telephone that they also have to keep on their person.
  19. Don't take this the wrong way, but you sound young. So I'm gonna write this with the scenario that you are probably around high school age. Being young can be a disadvantage in many ways if you are trying to become a legitimate reviewer for new hardware, etc. You have to balance being relevant with some personal charm and facts. But not so many facts that you bore your viewers. And it's hard to get taken seriously by the companies you might try to contact. Then there's the time commitment. You have to write the review (a script for what you will say), you have to film it, and then edit it. So there are some investments there in terms of camera and editing software. You said you've done Let's Plays for years, so I'll assume you've got good flow and comfort in that area. That's what I might try to structure my channel around. I enjoy watching Jayztwocents videos where he benchmarks something while he plays a game and talks about the product. It kind of combines what you're already doing but with an actual review. And don't try to lock down on one thing just yet. Do a review of a game you've been playing. Do a review of the chair you're using if it's at all interesting. Talk about whether it's comfortable for long stretches of gaming. Review your microphone. Talk about its pros and cons in terms of positioning when you play a game. Obviously you can review a GPU just like Jayztwocents does. BUT. And there is a big but here. You have to bring something unique to get the viewers. You can't be exactly like Jayztwocents, because he already exists. Or Linus. Or Austin Evans. Etc. Watch the videos from reviewers you like and try to find out what you feel is missing or what could have been done better. Then do that. Or something that might appeal to those who aren't interested in their videos for some reason. Find your niche. Edit: inb4 JordanTechTips becomes bigger than LTT.
  20. I think we both agree that AMD did have a chance of recovery, but it would have taken a whole different approach and different products. What most people seem to miss is that when you're that far behind, you can't make it up by simply catching up to your competitor. Sure, you might recover a few sales. But what AMD needed was to release something that would have made them the irresistible or logical choice. I haven't seen that product from them in years. And while something like the Fury X may be competitive against the 980ti, give or take, it isn't the foundation of a solid business. Sure, all the gaming enthusiasts get to talking about it, but revenue from the Fury X is a microscopic part of AMD's revenue. Whether you win the flagship GPU battle isn't what matters in the long run. It probably also doesn't help that it sounds like there are fewer than, what, 1000 Fury Xs in the continental U.S.? Anyways, back to the Intel topic. From the impression I've gotten from Intel employees and from their actions lately, I've never once thought they didn't want to innovate. The slowdown is, as PatrickJP93 said, down to problems with yields. Seems like a lot of people think that Intel is happy selling you a 1% improvement year over year. But they know you won't spend $200 on a 1-5% improvement. That's not what they're trying to do, and even if they were a complete monopoly, I doubt they would ever try that. Seems like a great way to waste R&D money. While some are unhappy about 5% improvements, they fail to see the strides made in power consumption, iGPUs, and non-gaming technological advancements. If AMD went out of business, the only monopoly Intel would have is in a relatively small and shrinking market. They've got fierce competition in all of the big money-making arenas. Plus they have to sell you a new gaming CPU at some point. I seriously doubt the price would move much, if at all, if they were the sole gaming CPU option. They basically have been for the high end for years, and yet the 5820K is a better deal for an entry-level -E option than ever before, IMO.
  21. Let me preface this by saying that my first rig was an AMD rig and I've always had a soft spot for them in my heart. I am willing to bet my job that Zen isn't going to save AMD. In fact, Zen probably won't make much of a difference at this point. AMD is the master of making their upcoming products sound like the bees' knees, yet for me they've been underwhelming for years. We've heard this "our next CPUs will blow the doors off Intel" rabble before. Their R&D budget is abysmal and even their own reports about Zen indicate that they expect only enough gain to stand on an equal footing with Haswell. So let's say Zen comes out and matches Haswell in per-core performance and TDP. Let's even say that Zen gets you 8 cores for the price of a quad-core Haswell. Here are the problems with that: 1. Skylake will already be out - Zen won't be the fastest offering. And AMD said they no longer want to be thought of as the cheap option, so I can't see them pricing it so that people like me won't just spring for the better performing option. 2. Intel is already heavily entrenched with OEMs and in consumers' minds. AMD won't win that battle by barely being competitive - even if they are slightly better bang for the buck. And the extra cores they have don't matter to 90% of consumers anyway, since games don't really take advantage of them and professionals are going to go for the higher performing option like Skylake-E. 3. Servers / supercomputing. You can't win over Intel here by matching Haswell. I wish it weren't so, but I've mentally prepared to say goodbye to an old friend. Only hoping the name survives somehow. Oh, and maybe someone will bring back the ATI name while they're at it.
  22. I've been on the AT&T bandwagon since the original iPhone. At the time, they had the desirable phones and the coverage to match even though they were a bit pricier. Now I've converted to T-Mobile for the simple reason that it feels like a phone company for the 21st century. I get to switch phones often and they offer great rates for the kind of usage I'm interested in (lots of data).
  23. Recently went back to play RA2 and Yuri's Revenge to see if they hold up or if it's just nostalgia. Only really moved from my desk to eat and run out to buy another six pack. 10/10.
  24. I'm fairly certain that somebody on this forum already confirmed that Skylake won't be soldered.