
Velerio

Member
  • Posts

    12
  • Joined

  • Last visited

Awards

This user doesn't have any awards


Velerio's Achievements

  1. As said before: if you go Epic (and you shouldn't under any circumstances), you should use the Heroic Launcher. It's open source, so no CCP spyware included, and it also isn't the total crap that the official client is.
  2. I have absolutely no idea whether it's cheaper or not, but if Nvidia had gone with PCIe 3.0 x16 instead of PCIe 4.0 x8, it would deliver the same bandwidth as PCIe 4.0 x8 while being far better for everyone who wants an upgrade. But as we saw, tech reviewers mostly went "it has PCIe 4.0" and didn't even react to the lowered lane count. If this card had shipped as PCIe 3.0, I think many would have ripped it apart for even daring to do that; and I think this might be a reason (besides 8 PCIe 4.0 lanes possibly being cheaper than 16 PCIe 3.0 lanes) why AMD and Nvidia went this way: "now with PCIe 3.0" doesn't sound as good on the box as "PCIe 4.0". But it's a kick in the face for everyone who still has an older motherboard, and those people are exactly the target audience for this card.
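The bandwidth claim above is easy to sanity-check. Using the per-lane transfer rates from the PCIe spec (Gen3: 8 GT/s, Gen4: 16 GT/s, both with 128b/130b encoding) and ignoring protocol overhead, a rough per-direction calculation looks like this (my own sketch, names and numbers are not from any review):

```python
# Rough per-direction PCIe link bandwidth, ignoring packet/protocol overhead.
# Gen3: 8 GT/s per lane, Gen4: 16 GT/s per lane, both 128b/130b encoded.

def link_bandwidth_gbs(gt_per_s: float, lanes: int) -> float:
    """Raw one-direction link bandwidth in GB/s."""
    encoding_efficiency = 128 / 130      # 128b/130b line code
    return gt_per_s * encoding_efficiency / 8 * lanes  # bits -> bytes

gen3_x16 = link_bandwidth_gbs(8, 16)
gen4_x8 = link_bandwidth_gbs(16, 8)

print(f"PCIe 3.0 x16: {gen3_x16:.2f} GB/s")  # ~15.75 GB/s
print(f"PCIe 4.0 x8:  {gen4_x8:.2f} GB/s")   # ~15.75 GB/s
print("identical:", abs(gen3_x16 - gen4_x8) < 1e-9)  # True: 8*16 == 16*8
```

Since Gen4 runs exactly twice as fast per lane as Gen3, halving the lane count while doubling the generation is a wash in raw bandwidth, which is exactly the point.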
  3. Absolutely; the issue is that the 4060 Ti is by far more expensive than the 3060 Ti is nowadays, for a minuscule performance gain, and in some cases even less performance than the 3060 Ti, since that card has 16 lanes compared to the 8 lanes of the 4060 Ti. No, it isn't, even if you've seen the video; there are games where even the 3060 Ti is slightly above the 4060 Ti. And even then: is it worth 399 compared to what you can get a 3060 Ti for nowadays? And that's something LTT should cover in their reviews.
  4. Dear LTT: please don't forget this for your next review of the 4060. The target audience for an RTX 4060/Ti or RX 7600 probably doesn't have PCIe 4.0: there are people out there with 10th-gen Intel CPUs, or even B450 motherboards with the 5800X3D, which is still one of the best CPUs for gamers. The 4060 Ti is probably a non-upgrade for everyone who has a PCIe 3.0 x16 slot.
  5. Sad because it's so true. That said, I think lower-end GPUs will probably be replaced by APUs in the foreseeable future. The only savior, and I can't believe I'm saying it, might be Intel in this situation. Another issue is that AMD seems to follow their new business strategy: charge far too much, get bad reviews, then later discount to realistic prices, and still sell plenty and earn a crapton of money.
  6. I personally don't view cards below the 6 tier as real GPUs; every card at 5 or below was crap, because for each tier above 6 you lose around 30%, but with a 5 card you lose more than 50%. That's why this entire lineup of cards is so messy, since Nvidia simply messed with the numbers: the 3090 stands alone; 3080 = 3070; 3070 = 3060 Ti; 3060 Ti = 3050 Ti; 3060 = 3050. So, yes and no: I don't see the RX 6500 as the worst GPU; even though, as said before, no card with a 5 or lower can be called a proper GPU, at least it was a card that was buyable at MSRP when nothing else was possible, and that excuse doesn't exist in the current situation. And the icing on the crapcake is that this GPU only has 8 PCIe lanes, so on a PCIe 3.0 system you effectively get only the bandwidth of 4 PCIe 4.0 lanes. So yes, this card at this time is probably the shittiest card they have released in this decade.
  7. Do benchmarks exist for the 4060 Ti on a PCIe 3.0 motherboard? LTT sadly didn't even acknowledge that this card only has PCIe 4.0 x8, so it's even slower on a PCIe 3.0 motherboard, which most users in this price range probably possess. It's sad that there is only one YouTuber out there who even noticed that it only has 8 lanes, and that's der8auer. He did a test with PCIe 4.0 x4, but I'm not sure that counts as a proper test on a PCIe 3.0 motherboard. So anyone upgrading from a 3060 Ti might actually LOSE performance with a 4060 Ti; sadly there is no real test out there that accounts for it.
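For what it's worth, in terms of raw link bandwidth the PCIe 4.0 x4 configuration is an exact proxy for a 4.0 x8 card sitting in a 3.0 slot, since a Gen4 lane is precisely twice as fast as a Gen3 lane. A quick back-of-the-envelope check (my own numbers from the spec rates, not from anyone's review):

```python
# Sanity check: a PCIe 4.0 x8 card in a PCIe 3.0 slot trains at Gen3 x8,
# which has the same raw bandwidth as Gen4 x4 (the configuration tested).

GTS = {3: 8.0, 4: 16.0}  # transfer rate per lane in GT/s, by PCIe generation

def bandwidth_gbs(gen: int, lanes: int) -> float:
    return GTS[gen] * (128 / 130) / 8 * lanes  # 128b/130b encoding, bits -> bytes

native_gen4_x8 = bandwidth_gbs(4, 8)  # card on a PCIe 4.0 board
on_gen3_board = bandwidth_gbs(3, 8)   # same card on a PCIe 3.0 board
gen4_x4_proxy = bandwidth_gbs(4, 4)   # the 4.0 x4 test configuration

print(f"4.0 x8 native: {native_gen4_x8:.2f} GB/s")  # ~15.75
print(f"3.0 x8 board:  {on_gen3_board:.2f} GB/s")   # ~7.88, half the bandwidth
print(f"4.0 x4 proxy:  {gen4_x4_proxy:.2f} GB/s")   # ~7.88, same as 3.0 x8
```

Whether the real-world FPS numbers transfer is a separate question (latency and link-training behavior can differ), which is why a test on an actual PCIe 3.0 board would still be valuable.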
  8. What I would like to see is a more ECO-FRIENDLY PC TEST: it's always more watts, more watts, more watts, but what if that isn't even necessary? Could people tell the difference in a blind gaming test with two or three identical builds, one where CPU and GPU run at full power and one where both run in eco mode (and maybe one in the middle)? And if people can tell a difference, ask whether it's worth the higher consumption. And do all games need more FPS? For shooters, OK, but what about the rest? I think power consumption of CPUs and GPUs is getting way out of control nowadays, and this could raise a bit of awareness for it too. I'm absolutely not saying we should all limit our GPUs or that the state should regulate them, but people should be able to easily set how much power their PC consumes and easily switch from a low-power state to a high one themselves (and maybe set it on a per-game basis, because I think there are games where you don't need to run at 200 FPS).
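To put a rough number on the eco-mode idea: with some made-up but plausible example figures (a GPU limited from 300 W to 200 W, gaming 3 hours a day, 0.30 EUR/kWh), the yearly savings are easy to estimate:

```python
# Back-of-the-envelope eco-mode savings. All inputs are assumed example
# values, not measurements of any specific card.

full_power_w = 300    # hypothetical stock GPU power draw
eco_power_w = 200     # hypothetical eco-mode power draw
hours_per_day = 3
price_per_kwh = 0.30  # assumed electricity price in EUR

saved_kwh_per_year = (full_power_w - eco_power_w) * hours_per_day * 365 / 1000
saved_eur_per_year = saved_kwh_per_year * price_per_kwh

print(f"{saved_kwh_per_year:.1f} kWh/year saved")  # 109.5 kWh
print(f"{saved_eur_per_year:.2f} EUR/year saved")  # 32.85 EUR
```

That's not life-changing money per machine, but multiplied across millions of gaming PCs it's exactly the kind of number a blind test could put into perspective.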
  9. Just a question: you might already have a credit card or be able to get one. As far as I know, the debit card system will be abandoned and everyone will get a debit credit card that you can attach to your account; but it only gets replaced when your current card gets replaced. Maybe ask your bank; I don't know how it is in the Netherlands right now, but my bank said this will happen Europe-wide. If your debit card doesn't have a credit card number, ask your bank: normally they don't replace it before the expiration date, but often enough they'll swap it for the new one for free.
  10. I have a revolutionary new idea: you take a keyboard, a low-powered one, and instead of a cord you have a battery that is REPLACEABLE. These replaceable batteries come in different sizes; let's call them size A, AA and AAA, because... it's the first letter. You put these power cells inside the keyboard behind a closing mechanism. Then, when they run empty, you take the empty ones out and put new ones in. In the best case you can then recharge the empty ones with a charger that plugs into any standard power outlet and recharges them over several hours. Then, after recharging, you put them aside until your keyboard is empty again. But sarcasm aside: really, your video takes a problem that doesn't exist and replaces it with a bigger one: desks with power coils built in (until they break, and then you can replace your whole desk). Always talking about the environment, and then promoting keyboards with non-replaceable batteries that charge themselves on desks where you need to align the keyboard perfectly just to charge it... that's nonsensical in an utterly idiotic way. If you want RGB, use a cord; if not, get replaceable batteries, they last for MONTHS! Or simply use a perfectly fine corded keyboard without RGB. There is a reason nobody makes stuff like this: because it is stupid, and I think Logitech & co. thought about it. And there is a reason the one keyboard that does it is so extremely expensive.
  11. Hello. I only found your videos a while ago and then binge-watched them for hours (and even disabled my adblocker to support you more; a very rare occasion). I saw you using many cooling solutions out there, BUT: why waste the heat? You have a home, maybe even a server at home, and you cool that server; but then you also use energy to heat up water. So instead of wasting the heat, why not a solution where you not only have a very good cooling setup, but are also able to heat your water with it? There are already solutions out there, but a more technical review of this kind would be nice. Servers all over the world are blowing heat out of their buildings right now; some big server farms manage to reuse it, but is this also feasible for a smaller company, or maybe even for your home? Because saving energy not only saves money, it's also good for the world. Who knows: in the future we may no longer heat with gas or oil; instead we'll all heat our homes with the cloud!