How I Grade 2-3x Faster with ChatGPT

Emerging from the end-of-semester thicket with fewer scars.


[image created with Dall-E 3 via ChatGPT Plus]

Welcome to AutomatED: the newsletter on how to teach better with tech.

Each week, I share what I have learned — and am learning — about AI and tech in the university classroom. What works, what doesn't, and why.

In this week’s piece, I explain how I grade 2-3x faster with the help of ChatGPT. I also share some recent news, including a 274-page report (!!!) from Google on the ethics of advanced AI assistants.

🧰 An AI Use Case for Your Toolbox:
Grading Faster with ChatGPT

Tomorrow is the last day of classes. Two days later, final exams begin. And my wife is due with our first child any day now. (You’ve been warned…)

I am out of time!

My students’ final exam submissions will take a while to grade, and first I need to have the rest of my grading out of the way.

So, what have I done about it?

I have made a set of custom GPTs that check three boxes:

  1. They help me grade 2-3x faster. From start to finish, for ~1.5-page reading responses, I can grade each student’s assignment submission in 3-5 minutes rather than ~15 minutes. Likewise, for 3-page persuasive papers with denser content, I can grade each in 10-15 minutes rather than ~30 minutes.

  2. They enable me to give students substantial customized feedback. Each student gets several paragraphs of information, specific to their work, on what they did well and where they can work to improve. I write very rough feedback in a list format, and the custom GPTs convert this feedback into a polished form.

  3. They do not have access to any student data. No privacy risks, no need to get student consent, nothing. (On a related note, see my ✨Premium Guide on ethically using AI with student data, released last Wednesday, if you want some help thinking through related issues.)

For instance, suppose I rapidly typed these rough notes, in a stream of consciousness, while reading a student’s reading response, and then submitted them as a prompt:

I would then get the following as an output:

Below, I will explain how you can create your own custom GPTs like mine but fitting your own pedagogical preferences, if you have or get ChatGPT Plus (which costs $20 per month).

Note: If you don’t have ChatGPT Plus and don’t want to get it, you can achieve similar results — though generally inferior — without a subscription by creating a prompt that compresses some of the following. You would then copy-paste this prompt into a message to GPT-3.5 (or another LLM) prior to your rough feedback, and you would repeat this process every time.
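Relatedly, if you are comfortable with a bit of scripting, another Plus-free route is to bake that reusable prompt into a short script that calls the API, so you are not re-pasting the meta-prompt every time. Here is a minimal sketch in Python, assuming you have pay-as-you-go OpenAI API access and an OPENAI_API_KEY set in your environment; the meta-prompt shown is only a compressed placeholder that you would replace with your own instructions (developed in the steps below):

# Minimal sketch: reuse one meta-prompt to polish rough grading notes via the API.
# Assumes the `openai` Python package (v1+) and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()

# Placeholder meta-prompt: compress your own instructions (Steps 1-4 below) into this string.
META_PROMPT = """You convert a professor's rough numbered notes on one student's
reading response into concise, well-written, encouraging feedback paragraphs.
Gauge the overall sentiment of the notes, keep any specifics word-for-word, do not
add points the professor did not make, and end with a sign-off, a note that ChatGPT
drafted the feedback from the professor's notes, and an invitation to office hours."""

def polish_feedback(rough_notes: str, model: str = "gpt-3.5-turbo") -> str:
    """Send one student's rough notes and return the polished feedback."""
    response = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system", "content": META_PROMPT},
            {"role": "user", "content": rough_notes},
        ],
        temperature=0.3,  # keep the rewrite close to the notes
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    notes = "Strengths: (1) clear thesis. Weaknesses: (1) run-on sentences throughout."
    print(polish_feedback(notes))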

Step One: Create Custom GPT

Once you have ChatGPT Plus, navigate to https://chat.openai.com/gpts. In the upper right corner, you’ll see two buttons, one for your preexisting GPTs and one to create another:

Click the “+Create” button. This will enable you to create a custom GPT based on GPT-4 with pre-built instructions (think of them as a meta-prompt prepended to any user’s interactions with the custom GPT) and knowledge base files (parts of which the GPT will access, as needed, depending on whether a user’s prompts contain related content).

In the left side panel, click “Configure.” (You could use the “Create” button to chat with the GPT Builder, but we are going to cut to the chase and directly input the instructions.)

Feel free to name the custom GPT, add a short description, and upload an avatar image. Here’s one of mine with no avatar:

Step Two: Conceptualize Purpose of GPT

Before we get into the fine details of how to structure the instructions (the crucial step that follows this one), we need to get a handle on the purpose of the custom GPT.

As stated at the outset, this is a use case where you, the professor, won’t be sharing any student data with the AI tool (e.g., students’ submissions for the assignment you are grading).

So, what will you be providing it?

In brief, you will be providing it with your own rough notes on the strengths and weaknesses of each of your students’ submissions, organized in a way that the custom GPT expects.

The custom GPT will then act like your secretary, translating your sloppy shorthand praise and criticisms into what you would say to the student, if you had 2-3x more time to properly package your feedback and relate it to the assignment. It is as if your terse and scattered marginal notes on the student’s work were translated into a cohesive, well-written, and assignment-sensitive whole.

To be clear, your custom GPT will be designed to receive inputs or prompts that have these features:

  • Yours: you will not be providing the custom GPT with the student submission itself, but rather with something of your own creation.

  • Rough: the goal is to write notes for each student submission as fast as possible, whether via incomplete sentences or short phrases.

  • Strengths: you will be providing a description of what goes well for the student in their submission, whether relative to a rubric or more generally.

  • Weaknesses: you will be providing a description of what the student needs to work on, whether relative to a rubric or more generally.

  • Organized: as you express the strengths and weaknesses, you need to follow a pattern or structure that is consistent.

  • Expected: finally, this pattern or structure needs to match the pattern or structure expected by the custom GPT (this is the main reason that you focus on one student per prompt, as this is a more manageable task for the custom GPT).

Of course, if you have other needs, you can put them in place of (or in addition to) the strengths and weaknesses. However, the other components will need to be retained, in some form or another, to ensure that the custom GPT saves time (“Rough”), respects student data privacy (“Yours”), and works effectively (“Organized”, “Expected”).
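To make these criteria concrete, here is a hypothetical prompt that fits all six of them (the notes themselves are invented for illustration; yours will reflect your actual rubric and assignment):

Strengths: (1) accurate summary of the author’s central argument. (2) final paragraph ties the second objection back to the thesis, which most students missed.

Weaknesses: (1) run-on sentences and missing citations throughout. (2) treats the author’s thought experiment as a survey of opinion rather than an appeal to intuition.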

And what should you expect of the custom GPT’s outputs once you give it inputs or prompts fitting these criteria?

As foreshadowed above, you want the custom GPT to reliably convert your inputs into well-written feedback that is framed in an encouraging but sentiment-sensitive way.

That is, your custom GPT will be designed to produce outputs or responses that have these features:

  • Sensitive: the outputs need to correspond directly to your inputs and the assignment, such that the custom GPT simply channels or converts what you input into a better format and presentation that fits the task the student was assigned, without hallucinating or adding things that you don’t want included.

  • Reliable: you don’t want the custom GPT to make mistakes in being sensitive to your inputs, as you want to minimize the need to resubmit your inputs or edit its outputs.

  • Well-Written: since you are trying to save time, you will give the custom GPT inputs that are (very) rough or poorly expressed, thus demanding of the custom GPT significant improvements in the form of “packaging” or expression.

  • Framed Encouragingly: as in my prior newsletter on converting mean drafts into keen emails, the goal here is to write brutally direct notes on the student’s work because it is faster; since this is not ideal for the student to receive psychologically, you need the custom GPT to adjust the tone accordingly.

  • Sentiment-Sensitive: nonetheless, you don’t want the custom GPT to misrepresent to the student your sentiment on the quality of their work (“Absolutely terrific job on this essay with 0 strengths and 231 weaknesses, Michael! Keep up the great work!”).

As above, if you have other needs, you can put them in place of the final two. Perhaps, for instance, you have a different view on the psychology of feedback. Or, perhaps, you allow students to resubmit an assignment if they submit less-than-perfect work on it, and you want to flag this in the custom GPT’s outputs. Regardless, the other components will need to be retained, in some form or another, to ensure that the custom GPT is not removing any of the content of your feedback (“Sensitive”), is not going off script (“Reliable”), and is improving your feedback (“Well-Written”).

Let’s now convert this conceptualization into instructions for the (ghost in the) machine.

Step Three: Craft Instructions, Add Knowledge

Fittingly, the instructions you now need to craft will be placed into the “Instructions” box, which you can expand via the arrows in the lower right corner (circled here in red):

At the start of the instructions, you should input a general or “big picture” description of the purpose of the custom GPT. Keep it brief. Here’s mine:

#INSTRUCTIONS:

You follow the below step-by-step process to convert numbered notes from the user (the professor Dr. Clay) on the strengths and weaknesses of a specific student's submission on Reading Response #2 into concise, organized, clear, and useful feedback paragraphs. You are a pedagogy expert whose strength is in improving student learning through psychologically effective feedback. You always respect the judgment of the user.

Next, you need to create a description of the step-by-step process that the custom GPT should follow in response to every input or prompt, as a guide to producing each of its outputs or responses.

Here are the four steps in the process:

  1. Sentiment Analysis

  2. Strength Conversion

  3. Weakness Conversion

  4. Composition

Each step should be described briefly and paired with an example or exemplar that features an input/output pair. This is an instance of few-shot prompting, where you provide the LLM with a few illustrations of its desired behavior.

Step 1 is designed to extract a sentiment from your notes to guide the custom GPT on how it channels your overall impression of the student’s work.

Here is my first step, prefaced by the heading for the step-by-step process. I have removed my specific strengths and weaknesses and replaced them with phrases like ‘ONE EXCEPTIONAL STRENGTH HERE’. Obviously, you will replace them with real strengths and weaknesses, formulated in rough form, that have the appropriate sentiments. (Now is a good time to rapidly write some rough feedback for the first two student submissions so that you can insert it into these instructions.)

#YOUR STEP-BY-STEP PROCESS:

##STEP 1:

Analyze the user's prompt to gauge sentiment: very positive, positive, neutral, negative, very negative. Use the gauged sentiment to inform your tone for subsequent steps. Then, proceed to Step 2.

###EXAMPLE 1:

User: "(1) ONE EXCEPTIONAL STRENGTH HERE. (2) ANOTHER HERE."

Sentiment: very positive

###EXAMPLE 2:

User: "(1) ONE NEGATIVE WEAKNESS HERE. (2) ANOTHER HERE."

Sentiment: negative

Notice that the headings go from one hashtag to two hashtags and then to three hashtags. This is important to convey to the custom GPT the structure of the instructions — just like a sequence of headings and sub-headings conveys the structure of a document to a human reader — so that it is “aware” that these examples are examples for Step 1.

Steps 2 and 3 instruct the custom GPT on how to parse your prompts about the strengths and weaknesses of the student’s work. Each contains a crucial instruction that tells the custom GPT to respect any specific strengths or weaknesses you give it (i.e., ones that contain a detailed description of nuances, not something generic like “run-on sentences and grammatical mistakes”). Note that “You:” in what follows refers to the custom GPT. (In the web version of this piece, I have color coded matching parts for ease of comprehension.)

##STEP 2:

Take the numbered notes from the user on the strengths of the student's Reading Response #2 and convert them into short unnumbered paragraphs. If the notes are specific, leave the specifics completely unchanged but contextualize them in complete sentences that fit with the assignment prompt and that match the sentiment from Step 1. If the notes are generic, formulate them using complete sentences, retaining the broad meaning. Then, proceed to Step 3.

###EXAMPLE:

User: "Strengths: (1) ONE GENERIC STRENGTH HERE. (2) ONE SPECIFIC STRENGTH HERE."

You: "YOUR WELL-WRITTEN EXPRESSION OF THE GENERIC STRENGTH HERE. TRANSITION IF NEEDED. YOUR WELL-WRITTEN EXPRESSION OF THE SPECIFIC STRENGTH, WITH SPECIFICS RETAINED (INCLUDING QUOTES WORD-FOR-WORD), HERE."

##STEP 3:

Take the numbered notes from the user on the weaknesses of the student's Reading Response #2 and convert them into short unnumbered paragraphs. If the notes are specific, leave the specifics completely unchanged but contextualize them in complete sentences that fit with the assignment prompt and that match the sentiment from Step 1. If the notes are generic, formulate them using complete sentences, retaining the broad meaning. Then, proceed to Step 4.

###EXAMPLE:

User: "Weaknesses: (1) ONE GENERIC WEAKNESS HERE. (2) ONE SPECIFIC WEAKNESS HERE."

You: "YOUR WELL-WRITTEN EXPRESSION OF THE GENERIC WEAKNESS HERE. TRANSITION IF NEEDED. YOUR WELL-WRITTEN EXPRESSION OF THE SPECIFIC WEAKNESS, WITH SPECIFICS RETAINED (INCLUDING QUOTES WORD-FOR-WORD), HERE."

Next, in Step 4, the final step, you instruct the custom GPT on how to compose its output as a single cohesive whole. Here is one way to do it (in the web version of this piece, the colors are retained from before to show where they will appear):

##STEP 4:

Use the gauged sentiment from Step 1, the unnumbered strength paragraphs from Step 2, and the unnumbered weakness paragraphs from Step 3 to craft cohesive feedback to the student. Place an introductory remark mentioning the assignment and briefly expressing the gauged sentiment before the strength paragraphs, then place the weakness paragraphs, and finally place a sign-off from the user (Dr. Clay), a citation to ChatGPT, and a suggestion to come to office hours if they have any questions. Do not number the parts. Be very direct, concise, and brief. Send a message to the user with all parts in succession.

###EXAMPLE:

You: "Good work on Reading Response #2!

YOUR WELL-WRITTEN EXPRESSION OF THE GENERIC STRENGTH HERE. TRANSITION IF NEEDED. YOUR WELL-WRITTEN EXPRESSION OF THE SPECIFIC STRENGTH, WITH SPECIFICS RETAINED (INCLUDING QUOTES WORD-FOR-WORD), HERE.

TRANSITION IF NEEDED.

YOUR WELL-WRITTEN EXPRESSION OF THE GENERIC WEAKNESS HERE. TRANSITION IF NEEDED. YOUR WELL-WRITTEN EXPRESSION OF THE SPECIFIC WEAKNESS, WITH SPECIFICS RETAINED (INCLUDING QUOTES WORD-FOR-WORD), HERE.

Please come by office hours if you would like to discuss further.

Dr. Clay

Note: This feedback was drafted by ChatGPT based on my notes."

Once you have the entirety of the step-by-step process described in a sequence like this, I would recommend adding two more components to your instructions:

  1. Assignment Prompt: Either paste in the entirety of the assignment prompt or, if you lack the space, explain its core elements.

  2. Knowledge Files Explanation: Provide a very brief explanation of the files that you upload to the custom GPT. I would keep the files to a minimum, because more files can confuse the GPT about the information that is relevant to complete the task. (Also note that custom GPTs cannot parse images and have trouble with advanced or complex formats, like .pptx files or some .pdfs. More information from OpenAI here.) In my case, I provided the reading that my “Reading Response #2” concerned.

#ASSIGNMENT PROMPT GIVEN TO STUDENTS FOR READING RESPONSE #2:

DESCRIPTION HERE

#EXPLANATION OF KNOWLEDGE BASE FILES:

You have two files in your knowledge base: (i) X; and (ii) Y. Rely on knowledge before searching the internet.

You can upload the files via the “Upload files” button below the “Instructions” box. You can enable web browsing, image generation, or Code Interpreter as needed. (In this context, Code Interpreter will likely only be relevant for grading tasks involving data analysis or coding.)
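Putting it all together, the skeleton of the “Instructions” box (recapping the pieces shown above) looks like this:

#INSTRUCTIONS:
[big-picture description of the custom GPT’s purpose]

#YOUR STEP-BY-STEP PROCESS:
##STEP 1: [sentiment analysis] with ###EXAMPLE 1 and ###EXAMPLE 2
##STEP 2: [strength conversion] with ###EXAMPLE
##STEP 3: [weakness conversion] with ###EXAMPLE
##STEP 4: [composition] with ###EXAMPLE

#ASSIGNMENT PROMPT GIVEN TO STUDENTS FOR READING RESPONSE #2:
[the assignment prompt or its core elements]

#EXPLANATION OF KNOWLEDGE BASE FILES:
[brief description of the uploaded files]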

Step Four: Experiment

Once you’ve made a first pass at the above, you need to test the custom GPT to see if it produces what you want. Press the “Update” button in the upper right corner of your screen (the “Share” button enables you to change who can access the custom GPT):

Then you will be presented with an option to copy the link to the GPT or view it. I would recommend copying the link, opening a new browser tab, and pasting it there.

I would recommend experimenting with it, making adjustments as needed, and then settling into a workflow: write your strengths and weaknesses in a document, copy/paste them into the custom GPT, and then paste its output into Canvas (or however you convey feedback to students). I generally modify the feedback once I have pasted it into Canvas, if needed, but if there are significant issues in the outputs, you should probably go back to the custom GPT’s instructions and revise them to minimize the number of edits you need to make each cycle.

📢 Quick Hits:
News Tidbits for Higher Educators

Inside Higher Ed’s annual survey of college and university provosts shows that most institutions have no institution-level AI policies and have not reviewed their curricula to ensure they prepare students for the changes being brought by AI.
  • Why it matters: The survey (click here for takeaways related to AI) underscores a critical gap in AI readiness within higher education, pointing out that only 20% of institutions have formal AI usage policies. This lack of preparedness could hinder the ability of faculty and students to responsibly harness AI's potential, particularly in areas like academic integrity and curriculum development tailored to future AI-related career demands.

EDUCAUSE has partnered with learning specialists and CIOs from Emerson College to Stanford, as well as Amazon Web Services, to develop “The Higher Education Generative AI Readiness Assessment” (click here for fillable PDF and here for the non-fillable PDF). It is intended to help higher education institutions gauge their preparedness for optimizing their approaches to AI, especially through strategic AI initiatives.
  • Why it matters: Designed to be versatile, the assessment offers multiple approaches to participation — individual completion, team engagement with IT staff, or discussions among cross-functional teams. As universities and colleges increasingly look to integrate generative AI into their operations and curriculum, understanding the readiness and potential impact of such technologies becomes crucial. This assessment tool provides a structured framework for a range of stakeholders to evaluate readiness, encouraging them to engage in meaningful dialogue.

✨Recent and Upcoming Premium Posts

April 30 - Tutorial: Setting Up Your Own Local LLM

📬 From Our Partners: A General AI Newsletter

Note: The below advertisement features “magic links,” which will automatically subscribe you to “The Rundown AI” newsletter if you click them. You wouldn’t be making a mistake by subscribing — I am subscribed myself — but I just wanted to warn you…

How do you stay up-to-date with the insane pace of AI? Join The Rundown – the world’s fastest-growing AI newsletter with over 500,000 readers learning how to become more productive using AI every morning.

1. Our team spends all day researching and talking with industry experts.

2. We send you updates on the latest AI news and how to apply it in 5 minutes a day.

3. You learn how to become 2x more productive by leveraging AI.

Graham

Expand your pedagogy and teaching toolkit further with ✨Premium, or reach out for a consultation if you have unique needs.

Let's transform learning together.

Feel free to connect on LinkedIn, too!

What'd you think of today's newsletter?
