ChatGPT Is Gaslighting You With Math
ChatGPT can't reliably add a list of 61 numbers presented in random order: across repeated attempts it produced 12 different wrong answers, each backed by convincing but fabricated work. By default the model optimizes for speed over accuracy, simulating the calculation rather than running it through its code interpreter. Clicking "Think Longer" reveals it had a calculator available all along. The lesson for data professionals: if there's no code block, there's no trust. The model is performing analysis, not doing it.
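
A minimal sketch of the kind of check the article implies: demand actual executed code instead of narrated arithmetic. The values below are hypothetical stand-ins, since the article's 61 numbers aren't reproduced here.

    import random

    # Hypothetical stand-in for the article's test: 61 numbers in random order.
    numbers = [random.randint(1, 1000) for _ in range(61)]

    # Deterministic arithmetic, not a language-model guess at what the sum looks like.
    total = sum(numbers)
    print(f"count={len(numbers)} sum={total}")

Running the same snippet twice on the same list always gives the same total, which is exactly the property the model's "shown work" lacks.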
Source: HackerNoon →