I don’t want machines thinking for me at all, and especially not when so many LLMs are so consistently wrong and produce such bad output! Why are people putting their faith in something that says 2+2 is 5 and then argues with you when you point out it’s wrong?