Ideas on organizing, regulating, managing and using AI in the curriculum or education for college or university students

Hi all, I would love to hear your thoughts on using AI officially in the classroom, for college or university students.

I’m part of a board of advisors of a university in the Netherlands, and we are debating how we should regulate and organize the use of AI, stimulating students to use it (wisely) and to be open and transparent about it.

I can imagine there are tons of examples of “initial” policies or suggestions that people here could share, and I look forward to discussing this.

5 Likes

Hi there, founder of StudyWand.com here.

Since receiving a 15k university grant in 2020, we’ve been working on applying AI in the classroom, including some aspects in non-English languages.

You can use AI to cheat on essays, but also to formulate practice exercises and to personalise learning; this is what StudyWand does - it automatically converts YouTube videos and PDFs into AI quizzes.
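To make the idea concrete, here’s a minimal sketch of how one might turn a PDF into quiz questions with an LLM. This is not StudyWand’s actual pipeline - just an illustration assuming the OpenAI Python client and pypdf; the model name, prompt, and function names are placeholders.

```python
# Illustrative sketch only: "PDF -> AI quiz" in a few lines.
# Assumes the OpenAI Python client (>=1.0) and pypdf are installed,
# and that OPENAI_API_KEY is set in the environment.
from openai import OpenAI
from pypdf import PdfReader

client = OpenAI()

def extract_text(pdf_path: str) -> str:
    """Concatenate the text of every page in the PDF."""
    reader = PdfReader(pdf_path)
    return "\n".join(page.extract_text() or "" for page in reader.pages)

def build_quiz(source_text: str, n_questions: int = 5) -> str:
    """Ask the model for multiple-choice questions grounded in the source text."""
    prompt = (
        f"Write {n_questions} multiple-choice questions (4 options each, "
        "mark the correct answer) based only on the text below. "
        "Quote the supporting sentence after each question.\n\n"
        f"{source_text[:6000]}"  # naive truncation to stay within context limits
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(build_quiz(extract_text("lecture_notes.pdf")))
```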

In terms of how learners behave, I studied the psychology of learning and, in one experiment, found that users preferred, roughly in this order:

  • 1-2-1s with teachers
  • Studying with materials made by classmates
  • Studying with AI flashcards
  • Making their own flashcards to study / studying alone
  • Writing assignments/essays

If you incorporate AI-created flashcards, you can prioritise solutions that emphasise accuracy (like StudyWand, which quotes your source text and lets you review the cards), and help students adjust to an age in which they will increasingly rely on LLM-generated content.
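To make the “quotes your text” point concrete, here’s an illustrative sketch of one way to check that each AI-generated card’s supporting quote really appears in the source material before it reaches students. This is not how StudyWand works internally; the Flashcard structure and file name are assumptions for the example.

```python
# Illustrative accuracy check for AI-generated flashcards: flag any card
# whose claimed supporting quote cannot be found verbatim in the source text.
from dataclasses import dataclass

@dataclass
class Flashcard:
    question: str
    answer: str
    quote: str  # verbatim passage the card claims to be based on

def normalise(text: str) -> str:
    """Lowercase and collapse whitespace so minor formatting differences don't matter."""
    return " ".join(text.lower().split())

def unsupported_cards(cards: list[Flashcard], source_text: str) -> list[Flashcard]:
    """Return the cards whose quote is not found in the source text."""
    haystack = normalise(source_text)
    return [c for c in cards if normalise(c.quote) not in haystack]

# Example usage: flag cards for manual review before publishing them.
cards = [
    Flashcard(
        question="What is retrieval practice?",
        answer="Studying by quizzing yourself rather than re-reading",
        quote="Retrieval practice - typically, quizzing - is an exceedingly effective studying mechanism",
    )
]
flagged = unsupported_cards(cards, open("lecture_notes.txt").read())
print(f"{len(flagged)} card(s) need review")
```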

In terms of policies, I think it depends greatly on your philosophy on open versus closed-book exams. Essays will not be as powerful or useful in the future, and may make up a smaller proportion of assessments.

I entirely believe in the benefits of flashcards, as they helped me study at a top university coming from a less-than-academic start. Specifically, I wrote a dissertation on flashcards and would suggest your policies certainly permit AI-generated flashcards.

One of the key unconsidered benefits for teachers of AI-driven retrieval practice, compared with creating cards for students or students creating their own, is insight into students’ knowledge gaps - and in particular, addressing foresight bias: the fact that, particularly in subjects like physics, students don’t know what they don’t know (watch this amazing Veritasium video; it also explains why misconceptions are so handy for learning physics): Khan Academy and the Effectiveness of Science Videos - YouTube. Basically, if you use AI quizzes (or any prepared, subject-specific right/wrong system), you quickly learn where your knowledge sits and what to focus on, and you reduce your exam stress.

As a teacher, AI-supported learning can make studying more of a “gradual hill climb” than a “storm the trenches”, if you can convince your university along this path. It has worked for flipped classrooms, and it worked way back in 2002 when Frank Leeming demonstrated that students learned better, and preferred learning, with regular quiz questions every few days rather than one big final exam.

Lastly, for teachers, the constant preference for 1-2-1 learning, and the socioemotional support that scaffolds it, will become more important and remain irreplaceable by AI; you can’t trust AI to be looking out for you. Teachers of tomorrow will build two parts character, one part knowledge, where in the past they built two parts knowledge, one part character.

To quote from my dissertation experiment’s background reading on retrieval practice, which is being enabled by AI today (this is a requote of a comment I left on Hacker News this afternoon, but I think you’ll find it insightful):

Retrieval practice - typically, quizzing - is an exceedingly effective studying mechanism (Roediger & Karpicke, 2006; Roediger & Butler, 2011; Bae, Therriault & Redifer, 2017; see Binks, 2018 for a review), although it is underutilized relative to its recorded merit, with students vastly preferring to read content (Karpicke & Butler, 2009; Toppino & Cohen, 2009). Notably, mature students engage in practice quizzes more than younger students (Tullis & Maddox, 2020). Undertaking a quiz (retrieval practice) can enhance test scores significantly, including web-based quizzes (Daniel & Broida, 2017). Roediger & Karpicke (2006) analysed whether students who solely read content would score differently from students who took a practice quiz, one week after a 5-minute learning session. Students retained information to a higher level in memory after a week with the quiz (56% retained) versus without (42%), despite having read the content fewer times (an average of 3.4) than the control, read-only group (14.2).

Participants subjectively report a preference for regular quizzing (Leeming, 2002) over final exams, when assessed with the quiz results, with 81% and 83% of participants in two intervention classes recommending Leeming’s “exam-a-day” procedure for the next semester - which runs against the intuition that students might be biased against more exams/quizzes (due to test anxiety). Retrieval practice may increase performance via increased cognitive load, which is generally correlated with score outcomes in (multimedia) learning (Muller et al., 2008). Without adequate alternative stimuli, volume of content could influence results, so differentiated conditions to control for this possible confound are required when exploring retrieval practice effects (as seen in Renkl, 2010 and implemented in Methods). Retrieval practice in middle and high school students can reduce test anxiety, when operationalised by “nervousness” (Agarwal et al., 2014), though presently no research appears to have analysed the influence of retrieval practice on university students’ test anxiety. Quizzing can alleviate foresight bias - underestimation of required studying time - in that students then assign a greater, more realistic study time plan (Soderstrom & Bjork, 2014). Despite the underutilization noted by Karpicke and Butler (2009), quizzing is becoming more common in burgeoning eLearning courses, supported by research (e.g. Johnson & Johnson, 2006; Leeming, 2002; Glass et al., 2008) demonstrating efficacy in real exam performance.

3 Likes

Hi there, thanks for your answer. You have been doing amazing work in this space, and it describes a solution that could be helpful. However, I’m looking for inspiration in terms of policies, not product features.

I’m already convinced students are using tools, such as yours, that could be highly beneficial to their learning, but I’m looking for information / examples / experience in writing policies and the right way to stimulate the right behavior. Any ideas are welcome :smiley:

At the same time, I think the number of products will multiply over time, and AI will be used for solutions I cannot even dream up today, so how do we stimulate and structure this over time?

2 Likes

This article is about using ChatGPT for preparing presentations.

It’s in French.
Let me know if you can’t translate it.

1 Like

Hi Omeie, apologies for the delayed reply; French translations are no problem (thankfully). It is a great article, thank you. It makes a lot of sense and could be utilized.

1 Like

Hi there! I’m from noty.ai, and I believe you should try it in your education setting. Noty.ai is an AI-powered tool that automates meeting transcription and note-taking. With Noty.ai, you can easily record your meetings, and the tool will transcribe and summarize the content, extracting key points, action items, and highlights. Streamline your meeting notes and boost your productivity with Noty.ai.

Hi, founder of StudyWand - I’m trying it out and running into issues, receiving error messages such as “running out of memory”. How do I fix that? Thanks

Hi DrDoRo.

Thank you for flagging this, and sorry about the trouble - I hope I can help. Firstly, I believe I’ve found some information about your experience and see a related error about “Internet Connection”. This could be due to demand being oversaturated, or to an unstable internet connection (we require a stable connection throughout the upload due to bots).

Secondly, if you send the PDF (to [email protected]), we can investigate ourselves and share the generated test (I promise this is not us “going behind the curtain”), if you are comfortable and able to do that. 95%+ of PDFs process successfully on our current system, and the internet connection error suggests yours may well process fine once we overcome that issue.

1 Like

As a student, I have found the essays produced by typical ChatGPT use would probably earn me around a “C” grade, which is passing, but humans are definitely better at essay writing, with creativity and real-world experiences that enrich the overall essay. I would recommend a policy where students can use it (because it does help in areas like proper formatting, references, in-text citations, spelling, and such) but where, to get above a “C”, they need to show some effort, apply real-world experiences, and show some creativity.

Hello, friends of the community, I’m Gabriel from Brazil. I am starting an academic research project on the impact of AI in our country’s educational institutions, how our schools are preparing to receive this information, how they are preparing their students to handle this information, and how the school will provide the necessary support and education for their students to learn how to use AI in the best possible way. My goal with this is to understand the current needs and challenges of our schools. I apologize if there are any errors in my English; I am still developing my writing skills.