Thursday, May 23, 2024

"Hallucinations" is now an AI Term to describe when prediction generators are not factual in what they are representing

So now, around people knowledgeable about AI, you will hear them say things like: "This result has hallucinations because the facts depicted are not fully accurate."

And you are likely going to hear this term in relation to AI more and more for a while; in fact, it might never go away. I'm listening to something on YouTube that my son recommended called "Artificial intelligence: How generative AI will change....." I'm presently listening to the section: "The Trust Factor: Navigating AI's Hallucinations".

One of the interesting things they were saying is that the quality of the questions you ask an AI greatly affects the output it is going to give you.

My son was telling me about this too, because he says it is like "panning for gold": things can go in all sorts of directions depending upon how you interact with the AI you are accessing.

The results likely won't ever be consistent in this sense; they are more quixotic, sort of like interacting with a new person, so you might not get any of the results you want or expect when you phrase your questions differently.

In other words, you could ask the same question 10 different ways and then get 10 completely different answers, in long or short form, in text, speech, art, music, lyrics, or prose.
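If you want to see that variability for yourself, here is a minimal sketch of the idea, assuming the OpenAI Python client with an API key set in your environment (the model name and the question are just illustrative, not anything from the video): asking the exact same question several times, with the sampling "temperature" above zero, will usually return differently worded answers each time.

    # Minimal sketch: the same question, asked repeatedly, rarely comes
    # back the same way twice when the model samples with temperature > 0.
    # Assumes the OpenAI Python client and OPENAI_API_KEY in the environment;
    # the model name and question below are illustrative placeholders.
    from openai import OpenAI

    client = OpenAI()

    question = "In one sentence, why do rivers carry gold?"

    for attempt in range(3):
        response = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content": question}],
            temperature=1.0,  # higher temperature = more varied answers
        )
        print(f"Answer {attempt + 1}: {response.choices[0].message.content}")

Run it a few times and each "Answer" line will likely read differently, which is exactly the panning-for-gold quality my son was describing.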

So, this is why my son says accessing AI is a lot like "panning for gold": because you never really know what you are going to get as a result.

So, it's always a surprise when you look at what the AI gives you in this interaction.
