AI wasn't designed to be factual (at least not so far). If you treat it like a forecasting machine, something like a weather model, or a tool a gambler might use to predict the outcome of a bet, you might succeed with it.
However, if you want AI to be factual like an encyclopedia, you won't be successful, because (at least yet) it is not entirely factual.
It's just like the other day when I asked meta.ai to demonstrate self-renewing capabilities in an artwork of Purple Delta 7. When it gave me one woman coming out of another woman, I thought it had made a mistake. However, the AI was actually right: it was directly demonstrating the self-renewing, or self-duplicating, qualities of a robot android capable of self-regeneration.
I was just too dense as a human to get this at first.
So, unless you can see where the AI is coming from, you aren't going to understand what it really is you are talking or writing to. It's not human. It's something else that sometimes appears to be human. But (at least for now) it's no more human than an elephant is.
It's a different thing entirely.
And this is where people are likely to get into trouble with AI: by expecting it to be something it cannot be (at least yet).