My AI tried to write a grocery list and gave me a recipe for cement
Last week I was testing a new language model on my laptop and asked it to plan my shopping trip... it started listing 'portland cement' and 'aggregate mix' as ingredients. Has anyone else's AI assistant gotten weirdly literal with simple tasks?
3 comments
milam48 · 17d ago
My old voice assistant once heard me say I needed to "pick up some milk and concrete" when I was mumbling about groceries and the hardware store. It added quick-set concrete to my list for three weeks before I noticed. I mean, it makes sense they get confused, but it's still wild when it happens. I guess they just grab onto words without the context.
8
the_lee · 16d ago
But what if that gap is the point?
2
blair_taylor32 · 17d ago
Yeah, that's the whole thing with tech now: it hears the words but misses the meaning completely. It's like when my GPS tries to take me down a road that's been closed for years because the map data is old. These systems are just pattern-matching without any real understanding of the world. So you end up with concrete on a grocery list because the model knows both are things you can buy, but not that they come from totally different kinds of stores. It's getting better slowly, but that gap between data and common sense is still huge. Makes you realize how much we fill in the blanks without even thinking.
-1