I noticed this a few times when I tried using AI to solve a Wordle I was about to give up on. It isn't just unhelpful, it genuinely cannot guess a 5-letter word from a few rules like "second letter not a, d, or g; first letter l; last letter w; word is not 'lower'".
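For what it's worth, the rules quoted above are a trivial deterministic filter; a few lines of code check them mechanically. This is just a sketch, with a tiny made-up candidate list for illustration (note that "lower" itself fails the last-letter rule, which shows how far off that guess was):

```python
def matches(word: str) -> bool:
    """Check the quoted rules: 5 letters, first letter 'l', last letter 'w',
    second letter not a/d/g, and the word is not 'lower'."""
    return (
        len(word) == 5
        and word[0] == "l"
        and word[-1] == "w"
        and word[1] not in {"a", "d", "g"}
        and word != "lower"
    )

# Tiny illustrative list, not a real Wordle dictionary.
candidates = ["lower", "llama", "widow", "lasso", "leery"]
print([w for w in candidates if matches(w)])
```

Any guess a model produces could be validated against the rules the same way, which makes it obvious when a guess violates a constraint it was just given.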
It's not that it can't solve it; it can't even guess close to correct. I think AI language isn't as connected to spelling as we assume. I've heard of people using AI to translate instructions into Mandarin and then feeding the Mandarin query in, because Mandarin is more "meaning dense" and therefore uses fewer tokens and gets better answers.
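The spelling point has a plausible mechanism: models see subword tokens, not letters. Here's a toy greedy longest-match tokenizer (the vocabulary is made up; real tokenizers like BPE are more involved) showing how "lower" can become two opaque IDs in which no individual letter position is directly represented:

```python
def tokenize(text: str, vocab: dict[str, int]) -> list[int]:
    """Greedy longest-match tokenization: at each position, take the
    longest vocabulary entry that matches, and emit its integer ID."""
    tokens = []
    i = 0
    while i < len(text):
        for j in range(len(text), i, -1):  # try longest pieces first
            piece = text[i:j]
            if piece in vocab:
                tokens.append(vocab[piece])
                i = j
                break
        else:
            raise ValueError(f"no vocab entry covers {text[i]!r}")
    return tokens

# Hypothetical vocabulary: multi-letter chunks plus single-letter fallbacks.
vocab = {"low": 0, "er": 1, "l": 2, "o": 3, "w": 4, "e": 5, "r": 6}
print(tokenize("lower", vocab))  # "lower" -> "low" + "er" -> [0, 1]
```

So a constraint like "last letter is w" refers to structure inside a token, which the model only knows about indirectly from training data, not from the input it actually receives.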
Yeah, they don't think or reason; it's just most-likely-token prediction. That's most obvious in small models, but larger ones are a bit better at making you believe. Emperor's new clothes. I bet the house of cards will come down soon enough; going to be interesting times, at least.