Everything posted by BuckGup
-
I posted a bit ago about AI not being able to generate content that can pass its own tests, like creating a driver's license even though it can validate them with 95% accuracy. I just found out it's this: https://en.wikipedia.org/wiki/P_versus_NP_problem
Which apparently is still an unsolved problem. It's a nice sanity check when you formulate something on your own and then find it reinforced by others.
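To make the verify-vs-generate asymmetry concrete, here's a minimal sketch using subset sum, a classic NP problem (the function names and numbers are just illustrative, not anything from the license-checking systems discussed above): checking a proposed answer takes linear time, but finding one by brute force means searching an exponential number of subsets.

```python
# Verification is cheap, generation is expensive: the asymmetry at the
# heart of P vs NP, shown with subset sum.
from itertools import combinations

def verifies(numbers, target, candidate):
    # Checking a candidate: sum it and compare -- linear time.
    return sum(candidate) == target and all(x in numbers for x in candidate)

def generate(numbers, target):
    # Finding a candidate: brute-force over all 2^n subsets.
    for r in range(len(numbers) + 1):
        for subset in combinations(numbers, r):
            if sum(subset) == target:
                return list(subset)
    return None

nums = [3, 9, 8, 4, 5, 7]
sol = generate(nums, 15)             # exponential search
print(sol, verifies(nums, 15, sol))  # but checking the answer is instant
```

Nobody has proven the generation side *must* be hard (that's exactly the open P vs NP question), but no efficient general method is known.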
-
Doesn't it seem strange that AI can validate documents, like driver's licenses, with 95% accuracy almost instantly, yet it can't create one that would pass its own test?
-
Imagine a beautiful, highly detailed image in your head; now draw it.
-
Yeah, it does seem strange. Just looking at it, it should be easy to create a neural net that generates an image, takes the other AI's output as the validation check, and keeps generating until it passes the check?
Seems doable.
But I guess the bigger problem is that every company has a different AI model for validating those documents, so creating a neural net that can generate a document passing all the different checks might be more difficult.
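The "generate until it passes the check" loop described above can be sketched as rejection sampling against a black-box validator. Everything here is a stand-in (`validator` and `random_candidate` are invented toy functions, not a real document model): blind retries only work when the acceptance rate is decent, which is part of why real systems like GANs instead feed the validator's signal back into the generator as gradients.

```python
# Toy rejection-sampling loop: keep generating random candidates until
# the (stand-in) validator accepts one.
import random

def validator(candidate):
    # Stand-in check: accept candidates whose values sum to a multiple of 7.
    # A real document validator would be a trained model, not this.
    return sum(candidate) % 7 == 0

def random_candidate(size=8):
    # Stand-in generator: purely random bytes, no learning involved.
    return [random.randrange(256) for _ in range(size)]

def generate_until_valid(max_tries=10_000):
    for attempt in range(1, max_tries + 1):
        c = random_candidate()
        if validator(c):
            return c, attempt
    return None, max_tries

candidate, tries = generate_until_valid()
print(f"accepted after {tries} tries: {candidate}")
```

With this toy validator about 1 in 7 candidates passes, so the loop ends quickly; against a strict 95%-accurate document checker the acceptance rate for random output would be astronomically small, which is the practical gap between validating and generating.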
-
How are content-serving algorithms supposed to stay relevant when users' interactions change based on the current algorithm, in essence creating a pseudo-algorithm on top of the existing one? You can find evidence of this in the hidden rings of recommended videos that you need to dig for. The content doesn't show up by traditional means like search results or "mainstream" recommendations; instead it's linked from a video that sort of acts as the entrance into it.
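The feedback loop in that question can be shown with a toy simulation (all numbers and item names invented): the algorithm recommends whatever was clicked most, users then give extra attention to whatever is recommended, and the ranking the algorithm "learned" starts reflecting its own output rather than the users' original preferences.

```python
# Toy recommender feedback loop: recommendations reshape the very click
# data the recommender is trained on.
clicks = {"a": 10, "b": 9, "c": 1}  # invented starting preferences

def recommend(clicks):
    # Naive policy: always surface the most-clicked item.
    return max(clicks, key=clicks.get)

for step in range(5):
    top = recommend(clicks)
    # Users react to the recommendation itself, not just their own taste,
    # so the recommended item accumulates extra clicks each round.
    clicks[top] += 5
    print(step, top, clicks)
```

Item "a" starts barely ahead of "b" but ends far ahead purely because it was recommended, which is one way the "hidden rings" of content can form: anything the loop never surfaces stays invisible to it.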