Letting a language model do the work of thinking is like building a house and using a circular saw to put nails in.
It will do it, but you should not trust the results.
It is not Google. It can, will, and has made up facts, so long as they fit the expected format.
Failing to at least proofread and fact-check the output is beyond lazy and a terrible use of the tool. Using it to create the end product and using it as a tool in the creation of an end product are two very different things.
I'm glad you understand my point.
ChatGPT is not Google. It's a language model that will give you something that looks like the thing you asked it to provide. It can and will pull facts out of its recycle bin if they fit the cadence of what it expects the answer to look like.
I don't mind the tool itself if you use it as such. I do mind when people use its output as the final product. See: the lawyer who used ChatGPT to write a legal brief.
When I say "fucking zoomer", I mean it in an endearing way. Promise.
If a Gen Z kid irritates me, I just send them a link to the sound of a dial-up modem. Much more cathartic.