ChatGPT has a style-over-substance trick that seems to dupe people into thinking it’s smart, researchers found
Developers often prefer ChatGPT’s responses about code to those submitted by humans, despite the bot frequently being wrong, researchers found.
Whether “ChatGPT is great” depends on what context you’re talking about. Use it for generating mock data and I’d say it’s pretty great.
Getting help with my code as a professional software developer, it’s pretty decent and much better than Google or Stack Overflow.
Getting it to tell me about physics? Not great in my situation, since I don’t know enough about physics to know where it’s wrong.
Point being, it can be a great TOOL to aid you in your specialist field of work.
People think it’ll do your job for you, but it’s more akin to a calculator. It’s a tool to help people.
I find it to be an excellent tool to help me write. Staring at a blank page is one of the hardest hurdles to overcome. By asking ChatGPT questions, I start organizing my thoughts about what I want to write, and it gives me instant words on the page to start manipulating. I am a subject matter expert on these topics and therefore screen what it gives me for correctness. It’s surprisingly good, but it has hallucinated some things. On balance, though, I find it very helpful.