You can make mistakes with a calculator. It’s more about looking at the results, verifying the data, not just blindly trusting it.
Your point has no bearing whatsoever on my statement. You could also misread a ruler, but that doesn't mean there's anything wrong with the ruler. Given that I can reliably read a ruler, I can ‘blindly trust’ it, assuming it's a well-manufactured ruler. If you can't, that's definitively a you problem.
I mean, it kinda does. If all you do is type numbers into a calculator and copy the results, there's a chance the result is wrong.
The same way some people use AI, which is wrong.
My point wasn't that people don't make mistakes; they obviously do. My point is that calculators are deterministic machines; to clarify, that means if they're given the same input they will always produce the same output. LLMs are not and do not. So no, it's not the same thing.
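To make the determinism point concrete, here's a minimal sketch. The `toy_llm` function, its candidate tokens, and their probabilities are all made up; it only stands in for the sampling step that real LLMs use, where a nonzero temperature means identical prompts can produce different outputs.

```python
import random

def calculator_add(a, b):
    # Deterministic: the same inputs always map to the same output.
    return a + b

def toy_llm(prompt, temperature=1.0):
    # Hypothetical stand-in for an LLM's sampling step: the "model"
    # assigns probabilities to possible next tokens and samples one.
    # With temperature > 0, identical prompts can yield different outputs.
    candidates = {"4": 0.90, "5": 0.07, "four": 0.03}
    tokens = list(candidates)
    weights = [candidates[t] ** (1.0 / temperature) for t in tokens]
    return random.choices(tokens, weights=weights, k=1)[0]

print(calculator_add(2, 2))                        # always 4
print({toy_llm("2 + 2 = ?") for _ in range(20)})   # usually {'4'}, sometimes more
```

Running it repeatedly, the calculator line never changes, while the sampled answers can, which is the whole distinction being argued here.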
I never said it was the same. I just said you have to be careful with the tools you use. That applies to every tool.
You are implying that one must ensure the veracity of a calculator's output in the same way that one must ensure the veracity of an LLM's output, and I'm saying no, that's strictly not true. If it were, then the only way you could use an LLM incorrectly would be to type your query incorrectly. With a calculator, that metaphor holds up. With an LLM, you could make no mistakes and still get incorrect output.
I’m implying that you should be careful when you use tools, and not blindly trust the output.