Posted (edited)

By that, I mean a fad that detracts from PC hardware availability for those that have zero interest in the advent of it (A.I.)?

I'm seeing it become ever more pervasive in general life & what I see it was intended for is (what I call) 'professional fields of life'; is there any way to avoid seeing it in everyday life, or is it a scourge that I may never get used to being subjected to (I see some of the 'slop' uses as deceitful/malicious/disingenuous)...?

Edited by Eighjan

**I frequently edit any posts you may quote; please check for anything I 'may' have amended.**

 

Current PC spec. in my profile.
Can I realistically call myself a gamer, if I only play ONE, twenty year old game...?

Did you test boot it, before you built it into the case?

WHY NOT...?!


Currently, yes. NVIDIA has cut production of the higher-end RTX 5000 cards so that their far more expensive AI cards sell more. That's why 5080 pricing is STILL not coming back down, nor 5090 pricing, five months later.


There are two plausible paths for generative AI:

 

1. It becomes truly ubiquitous - as in, it is used for everything all the time. Generative AI replaces graphic designers, programmers, and writers.

 

You know how, with the advent of modern manufacturing in the 20th century, old-school blacksmithing died as a common career? It is now a hobby, and something that only a few people in the world have as an actual job. There are places like Jamestown in Virginia, where they employ a blacksmith to give demonstrations. Otherwise, those that are able to make it a job do so by selling direct or by taking commissions. That would be the case for anything generative AI could do.

 

In this scenario, the level of social upheaval would be so enormous that the pricing of graphics cards and other computer parts is low on the list of things to worry about. We're talking billions of people who either have their jobs radically altered, or are forced to find new employment altogether.

 

2. It never manages to break past the current difficulties of producing obviously fake, mediocre content, and so it becomes associated with cheap junk. You'll see it used in advertising for cheap trash, like you see with bad Photoshop of products today, and there will be AI content on YouTube and garbage AI books on Amazon, just like what's emerging today. However, it wouldn't be used constantly - sort of the status quo we have today.

 

This is more like what we see with crypto. Crypto is not dead, and it's arguably growing, but it has not supplanted other currencies, and it does not have the trust of the vast majority of governments and organizations. Very few businesses accept crypto, and no major company today would dump all of their dollars, euros, and yen in favor of Bitcoin, Ethereum, and Solana.

 

In this scenario, you're not going to totally avoid AI, and it is going to impact the decisions of hardware makers, but it won't be the dominant driving force behind their decisions a decade from now. Just like with crypto, the impact will linger to a degree. If AMD, Intel, or Nvidia released a GPU tomorrow that could mine a coin and make back its price within 6 months, the card would be out of stock instantly, and the companies know this. They will not price their cards below that level ever again. But crypto isn't going to come up in every single meeting they have about their new products. Right now, I'm sure every single card launch has the companies talking about AI all the time, but in this scenario, that stops happening a few years from now, and AI becomes a background factor.
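To put rough numbers on that "pays for itself in 6 months" scenario, here is a back-of-the-envelope sketch; every figure in it is a made-up assumption, just to illustrate the break-even logic:

```python
# Hypothetical payback calculation for a GPU that can "mine back" its price.
card_price = 1000.00        # USD, assumed card price
daily_revenue = 7.00        # USD of coin mined per day (assumed)
daily_power_cost = 1.50     # USD of electricity per day (assumed)

net_per_day = daily_revenue - daily_power_cost
payback_days = card_price / net_per_day
print(f"Net ${net_per_day:.2f}/day -> pays for itself in about {payback_days:.0f} days")
# ~182 days, i.e. roughly the six-month payback that would empty shelves overnight
```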

 

There is no path that I can see where AI ever completely dies. It is a useful tool for what it is. The question is whether or not we can ever get to a point where you can create genuinely good works that don't require major human oversight all the time. Some are convinced that that is just a few years away - that by 2030, you can put a prompt into Sora, and it can give you a complete, Hollywood quality film, and George R. R. Martin won't need to finish A Song of Ice and Fire, because there will be an AI-written version of the last book that is so good, no one will care if he ever releases it himself.

 

I'm not that bullish on AI. I think the problems that AI has right now are non-trivial, and that it could take decades for AI to produce content that's better than what humans make. But even if that's true, it's not going away. You are going to see AI content frequently for the rest of your life - it's just a question of whether that's a few times a week, or whether AI is functionally producing all of the content you ever see.


35 minutes ago, YoungBlade said:

It never manages to break past the current difficulties of producing obviously fake, mediocre content, and so it becomes associated with cheap junk.

I feel like this is the general direction AI is going. That said, I've seen some AI-generated shitpost memes that were too convincing. Though what helped there was a smaller, lower-resolution phone screen, and the meme itself was slightly pixelated to simulate more realism (like security footage). Either way, at this point I feel like the average person despises AI and tends to see it as terrible and untrustworthy. I just hate how much AI is shoved down my throat, like all the ads on YouTube.


Crypto generated nothing new. 
We already had fiat currencies; there's no REAL reason for another fiat currency (or 1000 more).

I used AI to make this post. How about that?

---

 

The last bit was sarcastic, but I legitimately use it to be more productive all of the time. AI is basically the collective of human knowledge, easily on tap. There are flaws. But DANG is this profound.

3900x | 64 GB RAM | RTX 2080 

1.5TB Optane P4800X | 2TB Micron 1100 SSD | 16TB NAS w/ 10Gbe
LG C4 + QN90A | Sony AZ7000ES | Polk R200+R100, ELAC OW4.2, SVS PB12-NSD + 3x SB1000 | HD800


10 hours ago, Eighjan said:

By that, I mean a fad that detracts from PC hardware availability for those that have zero interest in the advent of it (A.I.)?

AI assist is nothing new. Your smartphone camera has been using AI to retouch pictures for years, and smartphones that don't do this don't sell, because they "have a bad camera". Sometimes with funny results, as when the model "knows" what the moon is supposed to look like. Every smartphone picture is AI assisted.

 

What changed is that revolutionary new capabilities have emerged in the last couple of years. In my opinion there are two AI currents going on, in a repeat of the dot-com bubble:

  1. Useful/world changing AI
  2. AI Frauds

The useful AI is the local models and the companies that are actually working and have already delivered world-changing products.

The frauds are many.

I predict that, just like in the dot-com bubble, 99.99% of AI companies will do nothing useful and run out of investor money. They are almost all investor frauds, designed to burn VC capital. Sam Altman and Musk promise the world, and will have zero staying power. Local models will win due to the economics: companies that shift the cost of inference onto consumers can be profitable, which is better for everyone. Even if the fraudsters could redirect civilization to an OpenAI API call, we should not do it.

 

I predict the remaining 0.01% are building useful products that will change the world even more than they already have.

 

There is zero reason we should have always-online internet microphones when you can do what Apple is developing: low-latency local STT and query handling on the device, with only the complicated stuff sent over the network.
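For illustration, here is a minimal sketch of that hybrid pattern as I understand it (this is not Apple's actual implementation; the intent list and helper functions are hypothetical stubs):

```python
# Hybrid voice handling: transcribe locally, answer simple intents on-device,
# and only send complicated queries over the network. All helpers are stubs.

LOCAL_INTENTS = {"set a timer", "open calendar", "turn on flashlight"}

def local_transcribe(audio_bytes: bytes) -> str:
    # Stub: a real device would run an on-device speech-to-text model here.
    return "set a timer"

def cloud_answer(text: str) -> str:
    # Stub: only reached for queries the device cannot handle by itself.
    return f"(sent to server) answer for: {text}"

def handle_voice_query(audio_bytes: bytes) -> str:
    text = local_transcribe(audio_bytes)   # low latency, audio never leaves the device
    if text in LOCAL_INTENTS:              # simple intents resolved locally
        return f"(handled on device) {text}"
    return cloud_answer(text)              # complex stuff goes over the network

print(handle_voice_query(b"\x00\x01"))     # -> "(handled on device) set a timer"
```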

 

To answer your question: you might have zero interest in AI, but you have been using AI for years (smartphone photos, Google search), and products will increasingly feature more, and more useful, AI.

  

10 hours ago, YoungBlade said:

It becomes truly ubiquitous - as in, it is used for everything all the time. Generative AI replaces graphic designers, programmers, and writers.

10 hours ago, YoungBlade said:

It never manages to break past the current difficulties of producing obviously fake, mediocre content, and so it becomes associated with cheap junk.

I don't see either scenario as likely. The first requires true AGI, which is quite a bit away: years at least, possibly decades; pessimists will say it will never arrive.

 

The second is the same problem Unity has. The people good at Unity pay to remove the Unity logo, while the bad products keep the Unity logo, so for a while you associated Unity with bad games, not knowing that good stuff used Unity too but just didn't advertise it. AI assist is used in good work today, but when it's done well, you just can't tell.

 

For several years AI will just be increasingly adopted in the tasks it's good at. E.g. Windows is working to use the NPUs that for years have been wasted silicon inside CPUs.


11 hours ago, Eighjan said:

By that, I mean a fad that detracts from PC hardware availability for those that have zero interest in the advent of it (A.I.)?

I guess the short answer is yes. As a relatively new area, it remains to be seen where it goes in future. They need more compute to continue developing it, but current tech has limits on scalability, and it might take something more radical to enable the next major step. As long as large organisations are throwing money at it, this won't change, but even they don't have bottomless pits of money and will have to see returns.

 

This contrasts with the crypto boom, where the scaling was simpler and more direct: more resources = more return. Anyone can do it, from a single GPU to multiple farms. You can't do this with AI; you have to work out the use case yourself if you want a chance to profit from it.

 

11 hours ago, Eighjan said:

I'm seeing it become ever more pervasive in general life & what I see it was intended for is (what I call) 'professional fields of life'; is there any way to avoid seeing it in everyday life, or is it a scourge that I may never get used to being subjected to (I see some of the 'slop' uses as deceitful/malicious/disingenuous)...?

It has happened, it is happening, it will keep going. As a whole the world will adapt. 

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, MSI Ventus 3x OC RTX 5070 Ti, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Alienware AW3225QF (32" 240 Hz OLED)
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 4070 FE, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, iiyama ProLite XU2793QSU-B6 (27" 1440p 100 Hz)
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


4 hours ago, 05032-Mendicant-Bias said:

I don't see either scenario as likely. The first requires true AGI, which is quite a bit away: years at least, possibly decades; pessimists will say it will never arrive.

 

The second is the same problem Unity has. The people good at Unity pay to remove the Unity logo, while the bad products keep the Unity logo, so for a while you associated Unity with bad games, not knowing that good stuff used Unity too but just didn't advertise it. AI assist is used in good work today, but when it's done well, you just can't tell.

 

For several years AI will just be increasingly adopted in the tasks it's good at. E.g. Windows is working to use the NPUs that for years have been wasted silicon inside CPUs.

I don't think you need human-level intelligence for AI images to stop having six fingers on hands or derp eyes. I also don't think you need that for hallucinations in writing to become exceedingly rare. You may need AGI for writing top tier novels or creating top tier digital art, but that's not most writing or most art.

 

Most writing that people do for a living is business correspondence. And most digital art made today is done for promotional materials. Those do not need to be flawless examples of human creativity or intelligence - they just need to be good enough. And while I am skeptical that AI can get there in 5 years, I don't think we need to crack AGI for that.

 

And I definitely think that AI is being used as a tool all over the place. I use AI for my own job a couple of times a week - sometimes more frequently. But there's a huge difference between me being a programmer that occasionally has ChatGPT generate a long SQL statement that I then review, fix, test, and deploy vs ChatGPT actually doing my job for me.
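For context, that workflow looks roughly like the sketch below (not my actual code; the model name and prompt are illustrative assumptions, and nothing here executes the SQL automatically):

```python
# Draft a SQL statement with an LLM, then hand it to a human for review.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

prompt = (
    "Write a SQL query that returns the ten customers with the highest "
    "total order value in 2024, joining the customers and orders tables."
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": prompt}],
)

draft_sql = response.choices[0].message.content
print(draft_sql)  # the programmer still reviews, fixes, tests, and deploys this
```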

 

In the graphic design space, "AI" tools have been around since the earliest days of Photoshop. The magic wand for fuzzy selection is a great example of something that could be termed an early AI tool. But while it made things easier, it didn't replace the employee altogether.
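For anyone curious what that kind of early "AI" tool amounts to under the hood, a magic-wand selection is essentially a tolerance-based flood fill. Here is a minimal, simplified grayscale sketch (my illustration, not Adobe's code; the image and tolerance values are assumptions):

```python
# Tolerance-based "magic wand" selection: flood fill from a seed pixel,
# collecting connected pixels whose brightness is close to the seed's.
from collections import deque

def magic_wand(image, seed, tolerance=32):
    """Return the set of (x, y) pixels connected to `seed` whose value is
    within `tolerance` of the seed pixel. `image` is a 2D list of ints."""
    h, w = len(image), len(image[0])
    sx, sy = seed
    target = image[sy][sx]
    selected = {(sx, sy)}
    queue = deque([(sx, sy)])
    while queue:
        x, y = queue.popleft()
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < w and 0 <= ny < h and (nx, ny) not in selected:
                if abs(image[ny][nx] - target) <= tolerance:
                    selected.add((nx, ny))
                    queue.append((nx, ny))
    return selected

# Example: select the bright region around pixel (1, 1)
img = [[200, 210, 40],
       [205, 215, 35],
       [ 30,  25, 20]]
print(magic_wand(img, (1, 1)))  # -> the four bright top-left pixels
```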

 

AI is going to be a tool in the future. The question is simply whether it can evolve from a labor-saving device into a replacement for human labor.


1 hour ago, YoungBlade said:

I don't think you need human-level intelligence for AI images to stop having six fingers on hands or derp eyes. I also don't think you need that for hallucinations in writing to become exceedingly rare. You may need AGI for writing top tier novels or creating top tier digital art, but that's not most writing or most art.

You need AGI to meaningfully replace professionals, and even that is not universally agreed. HiDream can consistently make five fingers, but it'll do nothing unless I prompt it to execute on the design I have in mind, and I just don't see CEOs starting to design their own company logos. They'll always hire professionals to do so; it's just that those professionals will also have AI-assist skills on top of all the other skills needed to do the job with the new tools.

 

Otherwise it's just handing better tools to the professionals, in the same way the car put virtually all carriage drivers out of work while creating far more work for vehicle drivers, far exceeding the historical demand for carriage drivers. Making a better LLM is just like making a better typewriter as far as writers are concerned.

 

Lowering the marginal cost of production increases demand, and demand creates the need for more work.


Crypto was (and largely still is) a speculative play looking for a killer use-case (that isn't buying drugs).

Generative AI is already a utility that quietly powers your phone's camera, discovers lifesaving drugs, writes code, and translates speech in real time.

Crypto mostly got its price and demand by promising future value. AI is lowering the marginal cost of creating value today.

That's why, hype aside, AI is here to stay, and why the short-term squeeze on gaming GPUs will ease as dedicated NPUs and datacenter silicon take over.

 

Even if AI models stopped improving today and no new models got released, we would still have many, many years of AI affecting more and more facets of our lives. We have barely begun using the things we have already developed.

 

 

One thing that is important to note is that almost everything seems to get branded as "AI" these days. That makes it very easy to dismiss the entire concept of "AI" as being nothing but a buzzword by pointing out failures or dumb stuff that got labeled as "AI".

But if you only point those things out and ignore all the massive breakthroughs that are also labeled "AI" then you are being intellectually dishonest.

It's like pointing to an Nvidia 5080 and going "it's called a gaming card? Pfff... Look at all these 10 dollar 'gaming' mice on Amazon. They are all bad so that 'gaming card' must be bad too".

 

 

AI is already revolutionizing medicine, translation, weather prediction, image upscaling, research, game development, and many other fields.

 

 

 

I also feel like people who point out things like "it generates images where people have 6 fingers!" are missing the bigger picture.

Image models can still slip an extra digit and LLMs sometimes miscount the number of "r"s in "strawberry". But those glitches are edge-cases, not proof the tech is worthless.

 

The same GPT-4 family that can't count letters also solves most LeetCode problems better than human coders.

Extra fingers went from being a thing in every image to being a rarity in like 2-3 years.

GPT-2 (released in 2019) lost coherence after like a paragraph of text. 2025-era models hold multi-hour context and scaffold full apps. That's six years.

It's also very important to realize that something can be extremely useful even if it isn't perfect. Humans aren't perfect. We hallucinate a ton. We forget a ton of stuff. We make mistakes all the time, yet nobody is denying that humans can be useful. Don't let perfection be the enemy of progress.

 

Focusing on the glitches while ignoring the important and useful capabilities is like not taking medication because of mild side effects, or not wanting to use a computer because some software has glitches in them.

 

 

 

Also, avoiding using AI is a really stupid idea, and so is sticking your head in the sand hoping it will go away.

You will just screw yourself over by knee-capping your own capabilities.

Honestly, it reminds me of old people who refuse to use the Internet because they don't like it.


I will admit I don't understand the need for A.I. beyond companies using it to streamline workloads that would otherwise require tens of humans to do &/or hundreds of man-hours to complete.

As far as "knee-capping my capabilities"... I can't conceive how I would?  I'm not in gainful employment & have no need or ambition to learn any major new skills.

Maybe it's a world I just need to let happen/progress because - as a 60yo - I'm not gonna be (don't see myself being) around to get any great benefit from where it goes.

Yes, I see it as a corporate/business tool.
No, I don't see it as being generally helpful to individuals in a non-work capacity.
No, a mobile phone isn't something I live "attached to" - a desktop PC, though...?  Another story.

**I frequently edit any posts you may quote; please check for anything I 'may' have amended.**

 

Current PC spec. in my profile.
Can I realistically call myself a gamer, if I only play ONE, twenty year old game...?

Did you test boot it, before you built it into the case?

WHY NOT...?!


Yes, in the sense that it is a tech bubble over-inflating the value of certain things and businesses, and causing rich investors to lose their minds.

 

No, in the sense that large language models do have multiple genuine uses for some things (not for everything, like these companies insist), whereas blockchain and cryptocurrency are useful for almost nothing (one or two very specific things).


On 6/23/2025 at 10:09 AM, smcoakley said:

Yes, in the sense that it is a tech bubble over-inflating the value of certain things and businesses, and causing rich investors to lose their minds.

 

No, in the sense that large language models do have multiple genuine uses for some things (not for everything, like these companies insist), whereas blockchain and cryptocurrency are useful for almost nothing (one or two very specific things).

My prediction is that, outside of assistive technologies, these are most likely going to be banned if they cannot run on a local device, primarily for privacy reasons. Google is forcing Google Workspace users to pay for this nonsense and nobody asked for it. Feels like the same garbage Adobe pulls.

 

TTS, ASR, CV, technologies that help the blind see and help people who are deaf or unable to speak, speak. Those are not going anywhere because the benefit outweighs the downsides. Prepare to hear a lot more Amazon Polly voices.

 

Translation is a bit of a mixed bag:

While I think generative AI for voices might pass muster for short translations, I think actually using it for any commercial video/product/marketing is a laughable mistake. The AI is not able to understand slang and sarcasm. The last thing you want is the AI dubbing to turn something like "go to hell m**********" into "I love your mom" in another language.

 

Google's own ASR and translation for YouTube is still pretty fricken bad. I manually fed it lines from PC98 games and the English that came back was quite the word salad.

 

There is very, very little practical use for generative AI for artwork, music, or video. At best, "inpainting" might be useful as a quick "patch" tool for photos and artwork, to make pages/panels fit shapes that would otherwise be blank. I don't see any creative use for it, though. It's fun to play with for some quick Cronenbergs of weird images, but... ah well, I saw someone else do a video on this that explained it better.

So forget what is being said in the video and just jump to 11:12, where they show these godawful, lame "Star Wars" aliens that just look like Earth animal mashups. AI cannot be creative; it has no imagination. Industrial Light and Magic basically let the wookie out of the bag about how they intend to use generative AI, and it's bad.

 

All these fools shilling for AI to replace things are snake oil salesmen. There is no magic AI that can replace humans. People will reject the use of AI in creative spaces, because some C-suite bozo will go "make the next Star Wars film", let the AI do everything, and it will flop and burn a tonne of money and energy doing so. AGI will not EVER happen on the current trajectory, because LLMs do not know anything about being creative. All they can do is autocomplete at a statistical level without understanding anything behind it.

 

That may be simple enough for people who watch Family Guy or South Park, who might very well be able to AI-generate an entire episode of their respective shows, because they often repeat the same jokes, so that's what the AI would do. It would not invent new ones.

