Using AI Appropriately

Humans do not stand alone. In programming, as in other arts, we do better work and learn more when we communicate and cooperate. Sometimes, however, people copy the work of others as a way to avoid their own work; such plagiarism is unhelpful and unethical. The other extreme, however, leaves you working without the benefit of other people. Finding the right middle path is one of the things students must learn in their work.

This document focuses on using Artificial Intelligence (AI) tools.

For guidance on getting help from other people, see the working with people guide.

For guidance on getting help from online sources, see the working with resources guide.

Generative AI as a software engineering tool

Generative AI is reshaping how programmers work. Recent developments have made AI remarkably capable of writing programs, and some software engineers are finding that AI tools make a good 'copilot' they can use to speed through the tedious parts of their jobs. AI does sometimes 'hallucinate' and write bizarrely wrong code, or worse, superficially correct code with subtly wrong behavior. There are also still many things a human programmer is expected to do that no current AI tool can.

However, with an experienced human programmer directing the AI and checking its work, the team can work very efficiently. The human spends less time writing boilerplate code, where they are limited by their typing speed, and more time on tasks only a human can do, such as deciding what problems to solve, choosing strategies for solving them, debugging complex issues, and working with other humans.

Generative AI as a learning tool

The strengths and weaknesses of generative AI, when it comes to writing code, make it almost useless for learning the skills relevant to an early CS class. What it can do well is mostly what you should be doing for yourself, and where it is limited and needs a human to guide and double-check it, the human has to already have those skills!

An AI chatbot is like an overeager advanced student who is eager to show off what they know and isn't at all worried about what you actually understand. It is not capable of identifying what you do and do not understand. It is not going to help you identify ways to test your knowledge and assumptions. It just wants to give you the answer to your problem.

Seeing examples of how code works and common strategies for solving problems is an important first step in learning what is possible and developing a mental toolbox. However, it is very easy to watch someone else solve a problem and believe you understand everything, only to run into all kinds of issues when you try to solve a similar problem on your own. Working your way through those issues is the only way you will learn how to effectively build algorithms and debug code.

Are there acceptable ways to use AI in class?

Maybe. A good heuristic is to think of an AI as a person, and ask whether your use of it would be acceptable when working with a human. This extends not only to the concern that they might try to do your work for you, making it harder for you to learn, but even to the concern that they might plagiarize from you if you let them see your work.

Questions that would be acceptable to ask a human, such as help interpreting a compiler message, or how to use a language feature, are probably safe. However, one way AI differs from a person is that if you ask a classmate something they don't know, they probably won't confidently make up something random. Because AI hallucinations, though rare, are unpredictable, AI is not as reliable as the materials and people from your actual class.

The role of AI in computer science—and society in general—is rapidly changing. If you think of a way to use AI that isn't addressed in this guide, or if you're not sure how it might apply, talk to your professor. Not only do we want to help you learn, we're curious about ways we might not have thought of to use AI tools!

Policies

Unless specifically instructed to do so as part of a learning activity, you should not use any form of generative AI to write code for a CS course. Turning in code generated by an AI represents a violation of the academic integrity policy.

This policy exists to protect your long-term interests (to develop a set of skills and knowledge) from short-term temptation (‘I need to get this assignment turned in and I need to leave for work in an hour.’).

You should recognize that the danger to your long-term goals exists even if you use AI with the best intentions. 'Just' asking how to solve a problem so you can 'see how it is done' is going to rob whatever you are working on of much of the potential learning, even if you don't directly copy and paste code generated by the AI. This is like studying a walkthrough of a game before you play it for the first time; even if you then play it yourself, you've lost the opportunity to experience it fresh.

Until there is an actual AI tutor that can guide you toward solving problems instead of showing you its answers, stay away from AI while trying to learn CS. As the old idiom goes, 'give someone a fish and you feed them for a day; teach them to fish and you feed them for a lifetime'. Current AI tools are only capable of giving you fish.