When I asked a student about some bizarre code in her do-file, she said she had been using ChatGPT to help her. (I see there's a thread about a specialized Stata GPT; this isn't that.) She was running into serious memory problems and asked ChatGPT for code to reduce the amount of memory she used. It helpfully showed her how to specify the storage type when generating variables, and then had her declare every variable as a double, even ones that could have been bytes.
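For readers who haven't hit this: a minimal sketch of the issue, with hypothetical variable names (her actual do-file differed):

* Stata's default storage type for -generate- is float (4 bytes per observation)
generate elderly = (age >= 65)

* a double takes 8 bytes per observation, overkill for a 0/1 indicator
generate double elderly_dbl = (age >= 65)

* a byte takes 1 byte per observation and holds the same values
generate byte elderly_byte = (age >= 65)

* -compress- demotes every variable to the smallest type that loses no data
compress

In a case like hers, a single -compress- would have recovered most of the wasted memory, so the AI's advice was not just suboptimal but pointed in exactly the wrong direction.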
We appear to be entering a world where we need to warn our students not just about the danger of grabbing potentially helpful code that someone has posted (code that may or may not fit their needs), but also about the dangers of using AI-written code. I imagine the preferred solution is the same in both cases: by all means use the internet or AI to get ideas and sample code, but don't run it until you understand it. That said, my n=1 experience suggests that AI-written code carries a higher risk of doing the flat-out opposite of what you want, whereas the most common flaw in googled code is merely irrelevant gunk.