Been getting more hallucinations recently from ChatGPT-5 when asking for facts. Prediction markets should be able to price truth at inference time to solve this. Fast, liquid & reliable markets for micro-claims could be the answer to improving LLM accuracy long term.
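
Rough sketch of what I mean by pricing micro-claims: treat the market's YES price on each atomic claim as a probability that it's true, then use that to triage model output. Everything below (MicroClaim, yes_price, triage, the prices themselves) is hypothetical — no such micro-claim market API exists today, this is just the shape of the idea.

```python
from dataclasses import dataclass

@dataclass
class MicroClaim:
    text: str         # one atomic factual statement extracted from model output
    yes_price: float  # hypothetical market price for "this claim is true", in [0, 1]

def triage(claims: list[MicroClaim], keep_above: float = 0.9, drop_below: float = 0.2):
    """Sort claims into keep / verify / drop buckets based on the market's price of truth."""
    keep, verify, drop = [], [], []
    for c in claims:
        if c.yes_price >= keep_above:
            keep.append(c)
        elif c.yes_price <= drop_below:
            drop.append(c)
        else:
            verify.append(c)
    return keep, verify, drop

if __name__ == "__main__":
    # Made-up prices purely for illustration; a real system would query a live market.
    claims = [
        MicroClaim("The Eiffel Tower is in Paris.", yes_price=0.99),
        MicroClaim("The Eiffel Tower opened in 1921.", yes_price=0.04),
        MicroClaim("The Eiffel Tower is repainted every 7 years.", yes_price=0.55),
    ]
    keep, verify, drop = triage(claims)
    print("keep:  ", [c.text for c in keep])
    print("verify:", [c.text for c in verify])
    print("drop:  ", [c.text for c in drop])
```

The point isn't the thresholds, it's that a fast, liquid market gives you a continuously updated probability per claim that an LLM (or a wrapper around one) could consult before asserting things as fact.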