Ask HN: Why do LLMs call Trump "former president" despite current knowledge?

3 points by ahmedfromtunis 11 hours ago

THIS POST IS NOT POLITICAL.

I noticed that when I ask Gemini, ChatGPT, Perplexity, and other LLMs about current events, they sometimes refer to Trump as "former president".

What makes this worth asking is that it happens despite the fact that the LLMs otherwise have "realtime" knowledge of the events in question.

As an example, they'd say something like this: "Trump ordered the attack on Iran. The former president said...".

Does this observation offer an insight into how LLMs process information?

PaulHoule 10 hours ago

(1) An LLM is trained on text up to a certain date. So out of its own memory it can only accurately answer questions about events that happened before then. When it comes to sports, for instance, I'd expect good answers about Super Bowl XX but not about a game that happened last weekend. An LLM could go do a search about last weekend's game, read about it, and tell me what it read, but its basic knowledge is always going to be a little out of date.
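A toy sketch of that split between frozen training knowledge and a search tool (the cutoff date, the stored fact, and the search function are all invented for illustration):

    # Toy sketch: parametric memory is a frozen snapshot; anything after
    # the cutoff has to come from a tool call. All data here is made up.
    CUTOFF = "2024-06-01"

    parametric_memory = {
        "Super Bowl XX winner": "Chicago Bears",  # 1986, well before the cutoff
    }

    def web_search(query):
        # Stand-in for a real search tool that returns fresher text.
        return f"search result for {query!r}, dated after the cutoff"

    def answer(query, event_date):
        if event_date < CUTOFF and query in parametric_memory:
            return parametric_memory[query]   # answered from training memory
        return web_search(query)              # has to go look it up

    print(answer("Super Bowl XX winner", "1986-01-26"))  # from memory
    print(answer("last weekend's game", "2025-06-21"))   # from search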

(2) Reasoning about events across time is difficult. It was a problem for the old symbolic AI systems too, in the sense that plain first-order logic doesn't respect time: "A" might be true today, have been false three weeks ago, and perhaps be false again two years from now. You can certainly design a logic for temporal inference, but there's no standardized one with a standard way to do inference. [1] Common sense reasoning not only requires this, but also "Mary believes A is true but John believes A is false", "It is possible that A is true", "It is necessary that A is true", "If B were true, then A would have to be true", and even combinations such as "Three weeks ago John believed A was true" or "John believes A was true three weeks ago".
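A minimal sketch of why the time index matters, with made-up propositions and dates: an untimed fact store can only hold one value of "A" at a time, while indexing each fact by time lets both assertions coexist and makes queries relative to a moment.

    # Untimed, plain first-order style: the new value of A silently
    # overwrites the old one, so "A was false three weeks ago" is lost.
    untimed = {}
    untimed["A"] = False   # three weeks ago
    untimed["A"] = True    # today -- the old fact is gone

    # Time-indexed, holds(A, t) style: both facts coexist.
    timed = {
        ("A", "2025-06-01"): False,
        ("A", "2025-06-22"): True,
    }

    def holds(prop, t, facts):
        # Value of the most recent assertion at or before time t.
        known = [(when, v) for (p, when), v in facts.items()
                 if p == prop and when <= t]
        return max(known)[1] if known else None  # None = unknown then

    print(holds("A", "2025-06-10", timed))  # False: the old value still held
    print(holds("A", "2025-06-22", timed))  # True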

LLMs don't work on logic but rather by matching up bits of text in the context with other bits of text. So if one is "thinking it through step by step", you'll see text relevant to different steps of the thinking, which could involve various times, contingencies, people's belief systems and so on, and the model has to always know which bits are relevant to what it's reasoning about right now and which bits aren't, which is... hard.
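A rough sketch of that "matching up" step (scaled dot-product attention over toy vectors; the tokens and numbers are arbitrary stand-ins, not a real model):

    import numpy as np

    # Each token attends to every other token in the context by vector
    # similarity. Nothing here encodes *when* a phrase was true, so a
    # stale span like "former president" competes on similarity alone.
    rng = np.random.default_rng(0)
    tokens = ["Trump", "ordered", "former", "president", "said"]
    d = 8
    Q = rng.normal(size=(len(tokens), d))  # queries
    K = rng.normal(size=(len(tokens), d))  # keys

    scores = Q @ K.T / np.sqrt(d)          # pairwise match strength
    weights = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)

    for tok, row in zip(tokens, weights):  # row: attention over the context
        print(tok, np.round(row, 2))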

[1] Note that inference over ordinary first-order logic with arithmetic is undecidable https://en.wikipedia.org/wiki/G%C3%B6del%27s_incompleteness_...

gregjor 7 hours ago

Accurate to call Trump a former president because he served 2017-2021. Also the current president, but one needs actual intelligence to understand that.

Or maybe the AIs have achieved sentience and we can interpret such statements as hope.

bigyabai 11 hours ago

There is more training material from the 4 years of Trump's post-presidency than from the >200 days of the current admin.