
Midnitewolf

Member
  • Posts: 382
  • Joined

  • Last visited

Reputation Activity

  1. Informative
    Midnitewolf got a reaction from steelman in How important is RTX   
    I use Metro Exodus as the benchmark because it is the only really demanding game on the market right now.  The key words are "right now".  If history is anything to go by, many games 12-24 months out that use ray tracing will be as demanding as Metro Exodus or even more so.  This is why I say you need a 2080 Ti: it is the only RTX card with the performance to run Metro Exodus at its highest settings with ray tracing on. 
     
    I guess my point is that if you buy an RTX 2070 Super just for the ray tracing capability, the likelihood is you're going to have to either drop the settings considerably or turn ray tracing off on AAA titles 18 months down the road anyway. That being the case, unless you plan on replacing the card in that time span, there is no real point to ray tracing being a deciding factor in your purchase.  You would be just as well off saving money with a 5700 XT.  However, this changes at the 2080 Ti level because it is powerful enough to run full ray tracing on the most demanding current game, so it is likely to give you an adequate ray tracing experience for the foreseeable future, at least up to 1440p.  This is why I say don't bother with RTX unless you are willing to purchase an RTX 2080 Ti.
  2. Like
    Midnitewolf got a reaction from StrategyWarrior in Everytime when i push the windows button to show the taskbar i get this flickering (Gsync enabled)   
    Honestly I forget the exact reason this occurs, but it seems to be common whenever adaptive sync is being used.  I seem to recall it has something to do with the frame rate momentarily dipping outside the adaptive sync range, which is what causes the momentary flickers.
  3. Agree
    Midnitewolf got a reaction from Sir0Tek in What GPU to update to?   
    Get the best GPU you are comfortable paying for.  I say this because 144 Hz gaming is no joke; in fact it is more demanding than 4K 60 Hz.  Even a 2080 Ti can't run Metro Exodus or RDR2 at 1080p ultra settings and push the 144 fps you ideally want for 144 Hz gaming, and you're going to be down to probably around 80 fps at 1440p.  Of course this is only true for the half dozen or so most demanding titles, and dropping your settings to High instead of Ultra works wonders, but just be aware that if you're looking for the best of the best graphics and 144 Hz (144 fps) gaming in all games, you're going to need the top GPU out there and then some. 
     
    Honestly, I think most people should focus on 1080p or 1440p at 60 Hz and treat it as a bonus when they can use the entire range of their 144 Hz monitor.  That is what I have done, and I have been plenty satisfied with my performance. 
  4. Agree
    Midnitewolf got a reaction from Mister Woof in Ryzen 9 3900x vrs rysen 7 3700x   
    Just to reiterate what others have said: for current games you're not going to see much difference between a 3600 and a 3950X, perhaps 1-2% at each step or about a 5-10% spread across the whole lineup.
     
    However, in the future this might be different.  Let's face it, we are getting to the point where we just can't push raw clock speeds any further, which is why every chip is moving further and further toward multicore/multithread as the focus.  That being the case, more and more games are going to be developed to leverage the multicore/multithread capabilities of these processors, and at some point in the future, at least with some games, you might very well find that having 8, 10 or 12 cores versus 6 gives you a big performance advantage.
     
    The challenge is deciding how long it will be before games actually take advantage of 8 or more cores, and whether you will keep the processor you buy today long enough for that to even matter before you upgrade.
     
    For myself, there are at least 2 games I am waiting on that are releasing in 2020 whose developers have mentioned specifically that they plan to take advantage of more cores, so I opted for the 8-core 3800X over the 6-core 3600 just to give myself a bit more wiggle room in case, some time in the next 3-4 years, games start getting a big boost from having more cores.  Honestly, though, I am not going to hold my breath waiting for games that get a huge performance advantage from having lots of cores/threads, as I think we are at least 4-5 years out from that.
  5. Agree
    Midnitewolf got a reaction from BTGbullseye in Basic RTX 2070 Super vs Top RX 5700 XT?   
    I pretty much agree with everything you said.
     
    Honestly the biggest problem I have with Nvidia is that they are price gouging us and have been for some time now.  I will admit that I think Nvidia does have an edge, but it is an edge worth maybe $20-$30 more, not $100 more, at least when you're comparing the 5700 XT vs the 2070 Super, and it is just ridiculous that the 2080 Ti is selling for around $1200 on average when the 1080 Ti was only about $800.  That puts a bad taste in my mouth and makes me NOT want to buy Nvidia or support them, especially when there is another option available, like the 5700 XT vs the 2070 Super.  It honestly would make me feel like an idiot to give Nvidia $100 more for roughly the same performance.
  6. Informative
    Midnitewolf got a reaction from Cysquatch in AMD drivers really that bad?   
    I have used AMD for years and haven't had any major issues overall.  What tends to happen with their new product launches is that the drivers are a mess and it takes a bit of time for them to get things under control, but once they do, everything is fine.  Nvidia, on the other hand, can also have issues at the launch of a new product, but they tend to launch in better shape than AMD and usually get the issues under control a bit faster.  This tends to make Nvidia the better option if you're buying at the very introduction of a new product. 
     
    As for the 5700 XT, it has been out long enough that most of the driver issues have been tamed.  While I wouldn't have touched a 5700 XT at launch, I don't see any reason to hold that view now, and considering the price to performance the 5700 XT offers, I definitely feel it is the best card to purchase at that performance level currently. 
     
  7. Like
    Midnitewolf got a reaction from DildorTheDecent in What's going on with Ryzen 7 3800x clock?   
    I think others have mentioned this, but all-core boost is much lower than single-core boost.  Most stress tests load all cores, which is why you see a lower boost.
     
    When monitoring my 3800X I generally see 2-3 of my cores alternately hitting the 4.524 GHz max boost, while all of the cores can reach at least 4.441 GHz, which I feel is great since I am running it completely stock.  Under an all-core stress test I see around 4.211 GHz on all cores, if I am recalling correctly.  With manual overclocking I can achieve an all-core boost of 4.5 GHz at 1.4 volts no problem, and honestly I think if I really wanted to tweak it, I could probably get that at 1.385 volts, but since my non-OC boosts were so good, I didn't find it necessary to do much more than run a single stress test at 4.5 GHz all-core just to see if it could do it stably.  Overall I am tickled pink that each and every core can get within 50 MHz of the rated max boost and that 3 of the cores can exceed it at stock.  It is a great CPU.
  8. Like
    Midnitewolf got a reaction from Gard in 3600 - 3600x price worth it   
    This one is hard to say.  Honestly I don't really think so.  The 3600X will be a better binned chip, which means that all things being equal you might get another 100 MHz give or take out of the boost clock, but this translates into maybe 1-2% real world performance.  In fact, unless you want or need the extra cores, there isn't much of a case for getting a 3700X, 3800X or even 3900X over a 3600, because you're only looking at maybe a 4-8% gain in performance from one end of the lineup to the other. That is why the 3600 is the king of value right now.  To be honest, if I could get the 3600X for $10-$15 more than the 3600, I might consider it just because the "X" in the name makes it sound cooler, but I don't think I could justify the extra cost for any other reason.
  9. Like
    Midnitewolf reacted to HanZie82 in Why do I need a dedicated video card   
    I would suggest one of the newer Ryzen CPUs, the cheapest with an integrated GPU, plus a suitable motherboard and RAM.
    The built-in GPU has up-to-date decoding hardware, so 1080p and even 4K video is decoded in hardware, which improves speed tremendously.
     
    Edit: Maybe something like this. It might be a little more expensive than strictly necessary, but it has an upgrade path, so if you eventually want to game on it you don't have to buy a whole new machine. And if it's just for video, it should last you over a decade.
    PCPartPicker Part List
    CPU: AMD Ryzen 3 3200G 3.6 GHz Quad-Core Processor - $138.00 @ Shopping Express
    Motherboard: MSI B450M PRO-M2 MAX Micro ATX AM4 Motherboard - $98.00 @ Shopping Express
    Memory: Crucial 8 GB (2 x 4 GB) DDR4-3200 Memory - $68.20 @ Newegg Australia
    Storage: Kingston A400 120 GB 2.5" Solid State Drive - $29.95 @ Amazon Australia
    Power Supply: Cooler Master MWE Bronze V2 450 W 80+ Bronze Certified ATX Power Supply - $65.00 @ PCCaseGear
    Prices include shipping, taxes, rebates, and discounts
    Total: $399.15
    Generated by PCPartPicker 2020-02-03 15:42 AEDT+1100
    (Not 100% sure about the PSU, that might need some further research.)
    Oh, I forgot to check the motherboard specs; this one doesn't do 4K@60Hz. I'm sure there are others that would.
  10. Agree
    Midnitewolf got a reaction from CPUguy101 in RAM 3200 vs 3600   
    I was going to mention this.  Ryzen loves both high speed and low latency, so the goal is to get the fastest RAM with the lowest latency.  
     
     
  11. Agree
    Midnitewolf reacted to CPUguy101 in RAM 3200 vs 3600   
    If you enjoy playing at higher frame rates (120+), then I would recommend 3600-3800 MHz memory with nothing higher than CL16 for Zen 2. 
     
    RAM is really the ONLY area where you can tweak your setup with a Ryzen 3000 CPU, as there is essentially zero manual overclocking headroom, and PBO and Auto OC basically do nothing useful and can even hurt single-threaded performance. 
     
    I currently have a very highly binned Samsung B-die kit, but those can be expensive; I mainly bought it for fun to OC and tweak things (currently at 3800 MHz with CL14 timings). 
     
    No matter what you get, remember this simple trick:
    divide the memory frequency by the CAS latency (tCL); the higher the number, the better. Example below. 
     
    3600 MHz CL18: 3600 / 18 = 200
    3200 MHz CL14: 3200 / 14 ≈ 228
     
    In the example above, the 3200 MHz kit would be better and would most likely also allow for either an OC or even tighter timings. 
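     
    If it helps, here is a minimal sketch of that rule of thumb in Python (the function name is made up, and the two kits are just the examples from above):
     
        # Rough "frequency divided by CAS latency" comparison for RAM kits.
        # A higher score means lower absolute latency, so generally the better kit.
        def ram_score(freq_mhz: int, cas_latency: int) -> float:
            return freq_mhz / cas_latency

        kits = {
            "3600 MHz CL18": ram_score(3600, 18),  # 200.0
            "3200 MHz CL14": ram_score(3200, 14),  # ~228.6
        }

        for name, score in sorted(kits.items(), key=lambda kv: kv[1], reverse=True):
            print(f"{name}: {score:.1f}")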
     
    Regardless of what you pick, remember to tune beyond XMP by using the DRAM Calculator for Ryzen. 
  12. Informative
    Midnitewolf got a reaction from aldoggy in Speculated prices of future cards and by when   
    From what I have seen, prices don't really go down much anymore.  Usually when a new card comes out, the old cards drop maybe $50 and then just sit at that price until the inventory runs out. 
  13. Like
    Midnitewolf got a reaction from Viddiecent in Does RTX worth make it getting a 2070S or Should I get a 5700XT?   
    Overall I would say no.  From what I have seen, the performance hit is so drastic that even a 2080 Ti can struggle to hit 60 fps at 1080p if you max it out.  The reason to buy the 2070 Super is that Nvidia has better drivers and compatibility, though you have to pay $100 extra for that.  Myself, I would go for the 5700 XT just because you can't beat its price vs performance right now.
  14. Agree
    Midnitewolf reacted to Ralf in No RGB AIO Cooling solutions and RAM for 3950X on X570   
    Arctic Liquid Freezer II 240
  15. Agree
    Midnitewolf reacted to dalekphalm in Electricity is haywire in my building, whats the worst that can happen to my components.   
    Okay so if we're dealing mostly with flicker, you're in good shape because the individual fluctuations/spikes/drops will be seconds or even milliseconds, so the battery really only needs to kick in for short durations.
    Okay so let's just ballpark 700W total system draw (including spikes, such as when the system powers on, and when you're stressing the CPU/GPU to the max via games or benchmarks).
     
    If you get actual hard measurements, this number can be tweaked.
     
    As Alex says, your actual system draw is unlikely to actually be as high as 500W. I punched your PC's specs into a PSU calculator, set it to 100% CPU utilization with both a moderate CPU and GPU overclock, and it estimated just under 400W. Most monitors consume about 30W or so - some as high as 60W.
     
    So let's assume worst case 460W - round it to 500W and be done.
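     
    If you want to redo that ballpark with your own numbers, here is a rough sketch (the wattage figures below are just the estimates from this thread, not measurements):
     
        # Rough UPS sizing ballpark: estimated PC draw plus monitor, rounded up
        # to the next convenient wattage step for headroom.
        import math

        pc_draw_w = 400    # PSU-calculator estimate with CPU/GPU overclocks
        monitor_w = 60     # worst-case monitor draw
        round_step_w = 50  # round up to the nearest 50 W

        total_w = pc_draw_w + monitor_w                                # 460 W
        ups_min_w = math.ceil(total_w / round_step_w) * round_step_w   # 500 W

        print(f"Estimated draw: {total_w} W -> size the UPS for at least {ups_min_w} W")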
    A UPS would definitely still help - a good UPS will take your line voltage, use it to charge the battery, and condition the power so it outputs a cleaner signal.
     
    If it's happening all day long? Yeah - that UPS will only last so long. But at least it'll be the UPS killing itself instead of everything else.
     
    It should still help in the short term while the electrical is sorted.
    All true.
     
    @johndole25 what voltage does your home operate at? 120V? 230V? Something else?
    Basically I would recommend a 500W UPS of decent build quality.
     
    Now there are different types of UPS's. There are two main specifications to look at:
    1. Overall type
    A. Line Interactive
    These will power your equipment directly from your mains electricity, and if the unit detects voltage drops or spikes, it will use a transformer to bring the voltage back in line. They are less accurate and less protective, though - they have an average "accuracy" of about 15% (so if the target is 120V, output could be + or - 15%, meaning anywhere from 102V to 138V).
    B. On-line
    These are higher end units and will take the incoming voltage, convert it to DC, then convert it back to AC at the target voltage - they have an accuracy of about 2-3% of target voltage. These will be much better suited for you.
     
    2. Sinewave output
    A. Pure sine
    Most people will say these are best, and that they are recommended for computers.
    B. Stepped "approximate" sine
    Some say these are garbage, others say they are fine.
     
    I've run plenty of computers off of stepped sinewave UPS's (most low-end ones are stepped) and never had an issue - I wouldn't run a server off one though.
     
    A basic Line interactive UPS will still help, but it'll be much more susceptible to dying with lots of voltage flicker. But they're also significantly cheaper.
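     
    To put those "accuracy" figures in perspective, here is a quick sketch of the output windows you would expect at 120V nominal (the tolerance percentages are just the rough numbers quoted above, not datasheet values):
     
        # Approximate output-voltage windows for the two UPS types described above.
        nominal_v = 120.0

        tolerances = {
            "Line-interactive (~15%)": 0.15,
            "On-line double conversion (~3%)": 0.03,
        }

        for ups_type, tol in tolerances.items():
            low, high = nominal_v * (1 - tol), nominal_v * (1 + tol)
            print(f"{ups_type}: {low:.0f}V to {high:.0f}V")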
     
    here's a comparison:
    APC BX1300G
    1300VA / 780W
    Line-Interactive / Stepped approximate sinewave
    https://www.apc.com/shop/ca/en/products/Power-Saving-Back-UPS-XS-1300VA-120V-Canada/P-BX1300G-CA
    Retail: ~$200
     
    APC SRT 1000VA
    1000VA / 900W
    On-Line double conversion (AC -> DC -> AC) / Pure sinewave
    https://www.apc.com/shop/ca/en/products/APC-Smart-UPS-SRT-1000VA-120V/P-SRT1000XLA
    Retail: ~$1000
     
    Granted, if you do your research you can likely find an on-line/pure sine UPS for a lot cheaper. Plus there are plenty of good used deals (just buy a new battery).
     
    Even a shitty UPS will help a bit.
     
    But ultimately you need to get an electrician to solve the problem long term.
     
    Edit: Some reading material
    https://blog.tripplite.com/line-interactive-vs-on-line-network-ups-systems-and-which-should-you-choose/
  16. Agree
    Midnitewolf reacted to BarackOBatman in Better for the long run   
    I'd say buy as much graphics card as you can afford.
    5700 over the 5600XT. If you can afford it, 5700XT
  17. Agree
    Midnitewolf reacted to Mister Woof in Best card to pair with a Ryzen 5 3600 for 1080p 60fps gaming   
    there's always something on the horizon. at some point you just gotta roll w/it. but if you don't need it now, waiting isn't going to harm you
  18. Informative
    Midnitewolf got a reaction from Vejnemojnen in Finding Max Safe Voltage for 3800x?   
    I did a lot of research and honestly I don't see any reason whatsoever to do an all-core OC on my 3800X.  I can easily get a 4.5 GHz all-core on mine, but aside from workloads that utilize all the cores, I actually lose performance, since 3 of my cores will auto-boost to 4.543 GHz and none boost below 4.441 GHz.  Basically, for gaming, I end up getting a couple more fps just leaving everything at stock settings. Further, I don't have to worry about the voltages, because the chip only raises the voltage when needed and runs at very low voltages the vast majority of the time.
     
    As others have mentioned, you end up getting much more out of fine-tuning the RAM. In fact, if you want the best performance boost, start off by getting the best RAM you can afford, focusing on high speed, low latency and tight timings. Also make sure you're optimizing your Infinity Fabric settings as well.
     
    To be honest though, Ryzen doesn't have much OC'ing headroom anywhere.  The best you can do is achieve incremental enhancements of 1-2% here and there, which might give you a cumulative improvement of maybe 5-10% if all the stars align.  If you have a fantastic cooling solution that keeps the system running under 60 C, Ryzen CPUs boost a few hundred MHz better and you get 1-2% here. If you have the best RAM configured with the best speed, timings and latency, you get 1-3% there.  If you optimize the Infinity Fabric to work with your RAM OC, you get another 1-2% there, and so on.  It almost isn't worth it to try to OC Ryzen anymore, because out of the box the hardware (CPU, motherboard, RAM, etc.) is pretty much already pushing the limits.
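     
    Just to show how those small gains stack up, here is a quick sketch (the individual percentages are only the rough estimates from this post, not measured numbers):
     
        # Compound a few small, independent tuning gains into an overall estimate.
        gains = {
            "cooling under 60 C": 0.02,      # roughly 1-2%, upper end
            "fast, tight RAM": 0.03,         # roughly 1-3%, upper end
            "Infinity Fabric tuning": 0.02,  # roughly 1-2%, upper end
        }

        total = 1.0
        for source, gain in gains.items():
            total *= 1.0 + gain

        print(f"Combined best-case uplift: ~{(total - 1.0) * 100:.1f}%")  # ~7.2%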
  19. Like
    Midnitewolf got a reaction from Simple_Jack in Best card to pair with a Ryzen 5 3600 for 1080p 60fps gaming   
    Honestly, it is all relative.  I currently have a Vega 56, which generally benchmarks better than a 1660 or 1660 Ti, and I can tell you that my experience is already subpar for 1080p gaming, though I do tend to run my graphics settings on the higher side.  Also, it isn't subpar until it is.  What I mean is that my Vega 56 was perfectly fine for me until just a few weeks ago, when I picked up a few games it struggles with.  At that point it immediately became not fine.  This is why I think you honestly have to look at all current games, at least the popular ones, and make your recommendations based on that.  You never know what a person is going to want to play.  Even if 90% of all games can hit 60 fps at 1080p on a given card, what if a month down the road the person asking for advice decides he wants to play a game in the 10% where it can't hit 60 fps?  
     
    As far as needing a $400 GPU for good 1080p/60 fps, yes, I do believe that, because for the best overall experience your GPU should be able to output at least around 70 fps average (so that 1% lows don't drop below 60 fps) in all current mainstream games at least at high graphics settings.  Further, since you're going to own that card for quite some time, I think you should be purchasing with a fairly solid expectation that you will get those same minimum 70 fps on high settings in the vast majority of mainstream games for at least 18 months beyond the initial purchase.  
     
    As far as why the $250 market exists, that is because not everyone can afford a $400 GPU; in fact I have been there myself.  Those who can't have to make some sacrifices, and I believe most people who buy these cards either realize and accept those sacrifices because they don't have a choice, don't care that they are going to have to reduce settings to medium or lower in some games, or quite frankly don't understand the limitations of those cards because someone told them that all they needed for 1080p gaming was a 1660 Ti. 
  20. Like
    Midnitewolf got a reaction from steelo in Best card to pair with a Ryzen 5 3600 for 1080p 60fps gaming   
    Just to add my 2 cents: if we are talking RDR2 at HIGH settings, not Ultra, the 1660 Ti barely gets 60 fps with 1% lows down to 52 fps at 1080p on an OC'ed 9900K, so if you have anything other than the best current gaming processor, fairly aggressively OC'ed at that, you're getting well below 60 fps in that game.  Metro Exodus is not quite that bad but it is close, and I seem to recall the new Star Wars Jedi game is fairly demanding as well.  The point is that games are only going to get more demanding as we move forward, and if you're already having to reduce your graphics settings to high, medium-high or even as low as medium to get a solid 60 fps, you're just setting yourself up for disappointment a year or two from now.  
     
    As for the 2060/2060 Super, honestly, as I see it, they aren't even close to overkill for 1080p/60 fps; in fact I would say that is probably the minimum level of GPU you would want to buy if you're looking 2 years into the future.  To be fair, if your context is ONLY games available right now, then you would be correct in saying a 2060 Super or better is a bit overkill for 1080p/60, but I would expect most people buying a GPU today are looking for it to also perform 2, 3 and even 4 years into the future.
     
     
  21. Informative
    Midnitewolf got a reaction from Statik in Finding Max Safe Voltage for 3800x?   
    I did a lot of research and honestly I don't see any reason whatsoever to do an all-core OC on my 3800X.  I can easily get a 4.5 GHz all-core on mine, but aside from workloads that utilize all the cores, I actually lose performance, since 3 of my cores will auto-boost to 4.543 GHz and none boost below 4.441 GHz.  Basically, for gaming, I end up getting a couple more fps just leaving everything at stock settings. Further, I don't have to worry about the voltages, because the chip only raises the voltage when needed and runs at very low voltages the vast majority of the time.
     
    As others have mentioned, you end up getting much more out of fine-tuning the RAM. In fact, if you want the best performance boost, start off by getting the best RAM you can afford, focusing on high speed, low latency and tight timings. Also make sure you're optimizing your Infinity Fabric settings as well.
     
    To be honest though, Ryzen doesn't have much OC'ing headroom anywhere.  The best you can do is achieve incremental enhancements of 1-2% here and there, which might give you a cumulative improvement of maybe 5-10% if all the stars align.  If you have a fantastic cooling solution that keeps the system running under 60 C, Ryzen CPUs boost a few hundred MHz better and you get 1-2% here. If you have the best RAM configured with the best speed, timings and latency, you get 1-3% there.  If you optimize the Infinity Fabric to work with your RAM OC, you get another 1-2% there, and so on.  It almost isn't worth it to try to OC Ryzen anymore, because out of the box the hardware (CPU, motherboard, RAM, etc.) is pretty much already pushing the limits.
  22. Agree
    Midnitewolf reacted to Samfisher in GPU futureproof dilemma.   
    There's no such thing as no bottleneck, everything will be bottlenecked by 1 thing or another.  I'm still on a 1070 with a 3700x and I'm doing fine at 1440p and frequently hit 100+ fps on most games that I play.
  23. Like
    Midnitewolf got a reaction from parotia in Bought G.Skill Flare X, but I am no longer sure if it's compatible. Confused with information.   
    RAM can be compatible even if it isn't on the list.  I have G.Skill Ripjaws V in my system and they weren't on my motherboard's list, and while the XMP profile didn't work, I had no issues manually adjusting the timings to get them working correctly.  All the list indicates is the RAM kits that the motherboard manufacturer has actually tested and knows to work.
  24. Like
    Midnitewolf got a reaction from parotia in Bought G.Skill Flare X, but I am no longer sure if it's compatible. Confused with information.   
    I think you're referring to Samsung B-die when you say B-die.  Offhand I don't know whether the G.Skill Flare X uses Samsung B-die or not, but I am not sure I would classify Samsung B-die as a "feature".  Generally speaking, most people seem to feel it is the best quality of chip that can be put on a RAM module, but that doesn't mean other manufacturers like Micron or Hynix necessarily make bad chips.  No matter who manufactured the RAM, if it is rated to run at a specific speed, latency and timings, it should be able to operate at those ratings.  My Ripjaws are Hynix chips but reach their rated 3600 MHz CL16 spec just fine.
     
    I think the reason Samsung B-die gets so much recognition is its better ability to overclock beyond its rated specs. Other chips can often do this too, but Samsung B-die tends to more consistently offer OC potential beyond the rating and/or more headroom for overclocking, so it is definitely the best choice for going beyond the ratings.  However, that doesn't mean Micron or Hynix can't often match what you get from Samsung B-die, just that you have a lower chance of it happening.
  25. Like
    Midnitewolf got a reaction from Bombastinator in 2700x and rx580, GPU bottleneck or not?   
    Yeah.  Also make sure you're prioritizing function and reliability over looks. That is one of the first rules of building a PC. If you're on a budget, the first things that get tossed out the window are RGB and other glam.  You're always better off getting a better, more powerful (up to a point) PSU, graphics card, MB, processor or RAM than RGB or a fancy case.  
     
    Personally, if you can afford it, I would look at finding a well-reviewed 650W 80+ Gold PSU as a base.  It will offer more power than you will likely need, but that just means it will operate closer to peak efficiency and, down the road, will likely be powerful enough to take on any upgrades, even a full system rebuild, that you might want to throw at it.  It will probably cost $20-$30 more (sorry, I am not sure what the currency conversion rate is).  
     
    Here is my list of priorities when building a system, in descending order.
     
    1. PSU
    2. CPU
    3. MB
    4. GPU
    5. RAM
    6. HDD/SSD
    7. Case
    8. RGB
     
    Basically you want to balance things according to these priorities within your budget. 
     
    Obviously you want to start off by picking a CPU that meets your needs, but generally speaking even something as low-end as a Ryzen 1600 can give you solid gaming performance, so having the biggest, baddest CPU usually isn't that important.  As soon as you have that figured out, start with a good, solid PSU. It doesn't have to be massively expensive, but make sure it is good quality and gives you a bit of power overhead to work with for when you want to upgrade.  Then look at MBs and try to find a good balance of price and performance, paying attention to quality first and features second.  Then try to budget in the best GPU you can, but also consider that this is one of the easiest upgrades you can make to a PC later.
     
    As far as RAM, I have always felt I could skimp a bit here. It is nice to have super fast RAM with tight timings, but I have found that the performance difference between the fastest RAM and the slowest RAM is fairly small in the grand scheme of things, maybe 3-5% if that, so if budget is a concern you can cut a few corners here and go with 3200 or 3000 MHz RAM instead of 3600.  As for the HDD/SSD, it is nice to have an SSD, but honestly a good old-fashioned HDD is still adequate, so you can save money here if you have to.  Probably the best budget bet is a small SSD, maybe 256 GB, for the OS and a few games, plus a 1 TB HDD for everything else. This is better, in my opinion, than skimping on the PSU, for example.
     
    As far as the case, you can get by with a pretty cheap one, and there are some really good values around $50 out there if you do a bit of research.  Finally, RGB is just for show and the worst place to spend any money. Yes, it looks cool, but at the end of the day 99% of the time the only one looking at your PC is you, and I can guarantee you will have a more enjoyable time with better performance than with rainbow lights. 
     
    This is just my advice for building a gaming PC on a budget. Obviously, if you have other needs or use cases, some of the priorities might change, but generally speaking this method of building a gaming rig on a budget has worked well for me for around 20 years.