I wouldn’t trust ChatGPT with teaching me about some tool. In my experience it very convincingly spews out stuff it invented, and if one is still learning I can see it being hard to spot those errors. I do use it to fix syntax errors in SQL queries, though, since I can’t be bothered to decipher the not-so-helpful error messages I get with my queries, and because if ChatGPT tells a lie there, it will be caught by my syntax checker.
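For example, with a made-up query of the sort I mean (not one of my real ones):

    -- broken: the stray comma before FROM is a syntax error,
    -- but the parser's message about it is usually cryptic
    SELECT id, name, FROM users WHERE active = 1;

    -- the kind of fix ChatGPT typically suggests,
    -- and the syntax checker confirms it parses
    SELECT id, name FROM users WHERE active = 1;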
So, I guess you can use it, if you always assume it to be trying to mislead you until proven to the contrary.
I would, and I have, but you can’t blindly trust what it says. It’s better to ask it to explain the code it produces in detail, both so you actually learn and as a safeguard.