Quote from: Twark_Main on 02/25/2026 11:25 pm

    Quote from: JulesVerneATV on 02/25/2026 09:19 am

        Now they say AI thinks it is seeing something after adding it all up. Real, or another 'Artificial Hallucination'? Sometimes AI does find stuff, but other times it seems to talk about events that are not really there. An LLM problem?

        AI Reveals Unexpected New Physics in the Fourth State of Matter
        https://scitechdaily.com/ai-reveals-unexpected-new-physics-in-the-fourth-state-of-matter/

        and

        Physics-tailored machine learning reveals unexpected physics in dusty plasmas
        https://www.pnas.org/doi/10.1073/pnas.2505725122

        "Dusty plasma is ubiquitous throughout the universe, from Saturn’s rings to interstellar space, and is critically important for planet formation, technological processes, and potentially the emergence of life. In a dusty plasma, dust particles’ interactions have known approximations based on tractable physics, yet they are poorly understood in environments that deviate from the simplest equilibrium conditions, for example, in systems with background plasma flows or with external magnetic fields. Particles interact through complicated forces mediated by the plasma environment, and violate some of our basic expectations: They are nonreciprocal and can source energy from their nonequilibrium environment."

    Are we just calling anything with a neural network an "LLM" now, and attributing LLM problems to them all?

    As far as I can see this is nothing but good old classic pre-LLM machine learning (e.g. digit recognition), which isn't really known for the sort of hallucinations output by LLMs.

We first identified hallucination thru LLMs and now associate it with them. But when it's happening from the underlying neural network, then why would the mechanisms which give rise to hallucination be exclusive to LLMs? ...
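The distinction being debated here can be probed with entirely pre-LLM tooling: a classic regression network also returns confident numbers far outside its training distribution, with no signal that it is extrapolating. A minimal sketch, assuming scikit-learn is available; the data, hyperparameters, and query points are all illustrative and have nothing to do with the PNAS model:

```python
# Toy sketch only: a plain pre-LLM regression network (sklearn's
# MLPRegressor) trained on sin(x) over [0, 2*pi]. Queried far outside
# that range it still answers confidently, with no warning that it is
# extrapolating -- the failure mode loosely analogous to "hallucination".
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
x_train = rng.uniform(0.0, 2.0 * np.pi, size=(400, 1))
y_train = np.sin(x_train).ravel()

net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=3000,
                   random_state=0)
net.fit(x_train, y_train)

in_dist = net.predict([[np.pi / 2.0]])[0]    # inside the training range
out_dist = net.predict([[25.0]])[0]          # far outside it

print(f"in-distribution:     {in_dist:+.3f} (true {np.sin(np.pi / 2.0):+.3f})")
print(f"out-of-distribution: {out_dist:+.3f} (true {np.sin(25.0):+.3f})")
```

Whether that confident extrapolation deserves the word "hallucination" is exactly the terminology question in this exchange; mechanically, it requires no language model at all.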
Quote from: JulesVerneATV on 02/25/2026 09:19 am

    Now they say AI thinks it is seeing something after adding it all up. Real, or another 'Artificial Hallucination'? Sometimes AI does find stuff, but other times it seems to talk about events that are not really there. An LLM problem?

    AI Reveals Unexpected New Physics in the Fourth State of Matter
    https://scitechdaily.com/ai-reveals-unexpected-new-physics-in-the-fourth-state-of-matter/

    and

    Physics-tailored machine learning reveals unexpected physics in dusty plasmas
    https://www.pnas.org/doi/10.1073/pnas.2505725122

    "Dusty plasma is ubiquitous throughout the universe, from Saturn’s rings to interstellar space, and is critically important for planet formation, technological processes, and potentially the emergence of life. In a dusty plasma, dust particles’ interactions have known approximations based on tractable physics, yet they are poorly understood in environments that deviate from the simplest equilibrium conditions, for example, in systems with background plasma flows or with external magnetic fields. Particles interact through complicated forces mediated by the plasma environment, and violate some of our basic expectations: They are nonreciprocal and can source energy from their nonequilibrium environment."

Are we just calling anything with a neural network an "LLM" now, and attributing LLM problems to them all?

As far as I can see this is nothing but good old classic pre-LLM machine learning (e.g. digit recognition), which isn't really known for the sort of hallucinations output by LLMs.
Now they say AI thinks it is seeing something after adding it all up. Real, or another 'Artificial Hallucination'? Sometimes AI does find stuff, but other times it seems to talk about events that are not really there. An LLM problem?

AI Reveals Unexpected New Physics in the Fourth State of Matter
https://scitechdaily.com/ai-reveals-unexpected-new-physics-in-the-fourth-state-of-matter/

and

Physics-tailored machine learning reveals unexpected physics in dusty plasmas
https://www.pnas.org/doi/10.1073/pnas.2505725122

"Dusty plasma is ubiquitous throughout the universe, from Saturn’s rings to interstellar space, and is critically important for planet formation, technological processes, and potentially the emergence of life. In a dusty plasma, dust particles’ interactions have known approximations based on tractable physics, yet they are poorly understood in environments that deviate from the simplest equilibrium conditions, for example, in systems with background plasma flows or with external magnetic fields. Particles interact through complicated forces mediated by the plasma environment, and violate some of our basic expectations: They are nonreciprocal and can source energy from their nonequilibrium environment."
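The abstract's last claim, that nonreciprocal pair forces can source energy from a nonequilibrium environment, can be seen in a toy two-particle simulation. The sketch below is purely illustrative and is not the model from the PNAS paper: particle A is pulled toward B while B is pushed away from A with the same force sign, so the pair violates Newton's third law, and its total momentum and kinetic energy grow from zero:

```python
# Toy sketch (illustrative constants, not the paper's model): two
# unit-mass particles with a NONRECIPROCAL coupling, f_on_B != -f_on_A.
# A chases B, B flees A. Starting at rest, the pair's momentum and
# kinetic energy grow without bound -- energy is sourced from the
# implicit nonequilibrium environment, not conserved internally.
import numpy as np

k, dt, steps = 1.0, 1e-3, 10_000
x = np.array([0.0, 1.0])   # positions of A and B
v = np.array([0.0, 0.0])   # both start at rest

for _ in range(steps):
    d = x[1] - x[0]                  # separation (stays ~1 here)
    f = np.array([k * d, k * d])     # nonreciprocal: f[1] != -f[0]
    v += f * dt                      # explicit Euler step
    x += v * dt

momentum = v.sum()                   # a reciprocal pair would keep this at 0
kinetic = 0.5 * (v ** 2).sum()       # grew from exactly zero
print(momentum, kinetic)
```

Replacing the force pair with a reciprocal one, `np.array([k * d, -k * d])`, restores momentum conservation, which is the "basic expectation" the abstract says dust grains violate.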
Quote from: sanman on 02/27/2026 10:35 am

    We first identified hallucination thru LLMs and now associate it with them. But when it's happening from the underlying neural network, then why would the mechanisms which give rise to hallucination be exclusive to LLMs? ...

So just to be clear, you're not saying we've ever identified LLM-like hallucinations in such networks, correct?
We first identified hallucination thru LLMs and now associate it with them. But when it's happening from the underlying neural network, then why would the mechanisms which give rise to hallucination be exclusive to LLMs?...
Join us for the Royal Society Michael Faraday Prize Lecture delivered by 2025 winner Professor Michael John Wooldridge.