
AI Tools for Education

Teaching & Artificial Intelligence

Generative artificial intelligence (AI) tools are a class of algorithms that produce new and unique outputs, such as text, images, or music, based on input data or parameters. AI is rapidly reshaping every aspect of our world—including education.

From generative AI tools like ChatGPT, DALL·E, and Microsoft Copilot to recommendation systems, adaptive learning platforms, and predictive analytics, AI comes in many forms. While generative AI has captured public attention for its ability to create text, images, code, and more from simple prompts, it's just one part of a broader ecosystem of intelligent technologies already embedded in search engines, learning management systems, and student support tools.

This rapid evolution brings both excitement and uncertainty. Many instructors are grappling with how to respond—balancing the promise of enhanced learning and engagement with concerns about academic integrity, bias, transparency, and equitable access. Meanwhile, workforce demands are shifting: most employers are already using AI in their operations, yet many students report they haven’t been prepared to use these tools effectively.

A 2024 study found that employers around the world are using AI, and graduates are often entering the workforce underprepared:

“73% of employers use GenAI, while more than half (55%) of graduates said their program did not prepare them to use GenAI tools during their studies. Programs that incorporate GenAI into curricula not only drive new education value but also improve career-readiness. Seven in ten (70%) grads believe basic GenAI training should be integrated into courses to prepare students for the workplace, and 69% say they need more training on how to work alongside new technologies in their current roles.”

At the University of Utah’s Center for Teaching & Learning Excellence, we’re here to support you in navigating these changes. Whether you're exploring how to redesign assignments, integrate AI ethically into your courses, or help students build the AI literacy they’ll need in the workplace, this page offers strategies, examples, and resources to guide your teaching practice in a rapidly evolving landscape.

Learn more about the University of Utah's approved AI tools and AI Policies and Guidelines.


 

Why AI Matters

Artificial Intelligence (AI) is no longer a distant concept—it is currently reshaping higher education. Its integration into teaching and learning environments has brought about a shift in instructional design, classroom practices, and institutional decision-making. For faculty, this means moving beyond awareness to active engagement with AI tools in order to prepare students for a workforce that already demands AI literacy.

Research shows that AI technologies power adaptive learning systems that tailor instruction to individual student needs, increasing engagement and promoting deeper learning (Das, Mutsuddi, & Ray, 2024; Yadav, 2024). Tools such as intelligent tutoring systems, predictive analytics, and automated feedback mechanisms allow for real-time instructional adjustments based on student performance (Muhie & Woldie, 2019). These systems not only support students with immediate feedback and differentiated instruction but also help educators identify learning gaps and deliver targeted interventions (Asatryan, 2023).

However, these advances come with new responsibilities. Ethical challenges—such as data privacy, algorithmic bias, intellectual property rights, and the potential commercialization of education—demand thoughtful, transparent implementation (Maphalala & Ajani, 2025; Ojha, 2024). Educators must not only understand the opportunities AI presents but also lead its integration with a strong foundation in equity, pedagogy, and ethics. Faculty play a crucial role in helping students build the skills, mindsets, and critical thinking necessary to engage with AI meaningfully and responsibly.

At the University of Utah, we recognize both the promise and complexity of AI in education. We are committed to supporting instructors in developing the digital competencies, reflective practices, and instructional strategies needed to use AI for deeper learning, career readiness, and institutional innovation.


 

Getting Started with AI

Instructors: Are you thinking about how to engage with AI in your courses, but not sure where to begin? Check out the resources below!

As AI tools become more embedded in academic and professional settings, instructors play a critical role in guiding students toward thoughtful, ethical, and effective use. Rather than banning or ignoring AI, educators can design learning experiences that leverage its strengths while promoting deep thinking, skill development, and academic integrity. The tips below offer practical ways to integrate AI into your teaching with clarity, purpose, and impact.

  1. Start with a Purposeful Conversation

    Begin your course with an open discussion about AI. Clarify how tools like Microsoft Copilot can support—but not replace—student learning. Frame AI as a professional skill and ethical responsibility. Ask students:
    When is it helpful? When is it harmful?

  2. Design Assignments for Deep Thinking

    Use assignments that require analysis, synthesis, and evaluation—not just recall. Scaffold tasks that draw on student voice, lived experience, and multiple sources. Incorporate multimedia or collaborative elements that are harder for AI to replicate meaningfully.

  3. Build In AI Reflection

    Ask students to reflect on how they used AI in an assignment: What prompt did you use? What worked well? What did you revise or reject? Include these reflections as part of the grade or feedback process to support metacognitive growth.

  4. Offer Choice with Guardrails

    Consider giving students the option to complete assignments with or without AI tools. Offer both “AI-assisted” and “AI-free” versions to promote transparency, build ethical use habits, and allow students to practice discernment and responsibility.

  5. Ditch Detection Tools

    Avoid unreliable AI-detection tools, which often yield false positives. Instead, prevent misuse by designing clear, specific, and creative tasks—then emphasize honest communication and trust.

As an instructor, you have a duty to your students to ensure their information is protected. There are a few important security considerations to keep in mind whenever you choose to use generative AI tools:

  • "Never enter or upload sensitive or restricted data per Rule 4-004C: Data Classification and Encryption, including protected health information (PHI), FERPA-protected data, personally identifiable information (PII), donor information, intellectual property (IP) information, and payment card industry (PCI) information" (UIT). You may choose to anonymize your data for use with AI tools.
  • Any intellectual property belonging to students, staff, or faculty should not be input without explicit consent.
  • When in doubt, contact UIT or the U of U Automation Center of Excellence (ACoE) at ACoE@hsc.utah.edu for further information and guidance.
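
In practice, anonymizing data before sharing it with an AI tool can be as simple as scripted redaction. The sketch below is a minimal illustration, not an approved tool: the regex patterns and placeholder labels are assumptions for demonstration, and no script substitutes for reviewing your data against Rule 4-004C.

```python
import re

# A minimal sketch of redacting common identifiers before pasting text
# into a generative AI tool. The patterns below are illustrative, not
# exhaustive -- institutional data classification rules cover far more
# categories than these.

def anonymize(text: str) -> str:
    """Replace common identifiers with neutral placeholders."""
    # Redact email addresses
    text = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "[EMAIL]", text)
    # Redact uNID-style identifiers ('u' plus seven digits -- an assumed
    # format used here only for illustration)
    text = re.sub(r"\bu\d{7}\b", "[UNID]", text)
    # Redact phone numbers in the common ###-###-#### form
    text = re.sub(r"\b\d{3}-\d{3}-\d{4}\b", "[PHONE]", text)
    return text

sample = "Contact u1234567 at jane.doe@utah.edu or 801-555-0100."
print(anonymize(sample))
# -> Contact [UNID] at [EMAIL] or [PHONE].
```

Even with a script like this, a manual review is still needed before sharing: redaction patterns miss names, context clues, and any identifier format they were not written to catch.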

To ensure your data is best protected, we highly recommend using Microsoft Copilot over other generative AI tools like ChatGPT.  

Microsoft Copilot (formerly Bing Chat Enterprise) is a generative AI chat platform. Microsoft Copilot uses GPT-4, a language model created by ChatGPT creator OpenAI. It is currently available at no cost to University of Utah and University of Utah Health staff and faculty as part of the university’s A5-tier Microsoft Campus Agreement.

Microsoft Copilot provides commercial data protection; however, sensitive or restricted data, including protected health information (PHI) and employee and student information, should never be shared through Microsoft Copilot chat (access Policy 4-004: University of Utah Information Security for more information). In addition, to help University of Utah Health maintain the highest standards of privacy and confidentiality, using Microsoft Copilot for patient-related activity is prohibited.

(@theU)

To access Copilot:
  • Visit copilot.microsoft.com, or open Copilot via the sidebar in Microsoft Edge, the sidebar on your Windows OS machine, or mobile app (available for Android and iOS).
  • Select “Sign in with a work or school account” under the Sign in icon in the upper right corner of the page.
  • Enter your unid@umail.utah.edu and uNID password.
  • Complete Duo two-factor authentication.

The conversation is protected when a green shield appears in the upper right corner next to your username. It is critical to verify that the green shield is present for all conversations.

Chats with Copilot are not added to the OpenAI language model database. Copilot does not retain any personal information or store data from conversations. Do not input sensitive or restricted data without anonymizing your content.

We highly recommend including a statement in your syllabus detailing expectations around the use of generative AI tools with regard to academic honesty. This may look like:

It is expected that students adhere to University of Utah policies regarding academic honesty, including but not limited to refraining from cheating, plagiarizing, misrepresenting one's work, and/or inappropriately collaborating.

This includes the use of generative artificial intelligence (AI) tools without citation, documentation, or authorization. Students are expected to adhere to the prescribed professional and ethical standards of the profession/discipline for which they are preparing.

Any student who engages in academic dishonesty or who violates the professional and ethical standards for their profession/discipline may be subject to academic sanctions as per Policy 6-410, Student Academic Performance, Academic Conduct, and Professional and Ethical Conduct.

You are free to use and edit the statement listed above. Find other optional syllabus additions available from CTE.

Some departments may elect to create a broad AI policy for all courses in their department. Take a look at a sample department AI use statement from Writing and Rhetoric Studies.

As AI tools become more integrated into academic work, it’s essential for instructors to clearly define expectations around their use. Transparent, thoughtful course policies help ensure academic integrity, promote equitable access, and support student learning. Because there’s no universal standard for AI use in education, instructors must proactively guide students by setting clear boundaries, explaining appropriate use, and adapting policies as tools evolve.

Below are key areas to consider when establishing AI-related policies in your course:

  • Set clear syllabus policies: In the course syllabus and assignment instructions, instructors should clearly state if, when, and how generative AI may be used. They should define acceptable use cases (e.g. brainstorming ideas, improving grammar, changing linguistic style) versus prohibited uses (e.g. submitting AI-generated text as one’s own work). Use CTE’s Student Guide to AI Generative Tools to set classroom norms.
  • Clarify academic integrity expectations: Clear distinction should be made between legitimate assistance and cheating. Examples of both can help students understand. Since there is no universal standard for what counts as “AI plagiarism” and no foolproof detection method for AI-generated content, it’s crucial to spell out course expectations. For example, instructors might allow AI-generated support for researching a problem and potential solutions but not for writing entire case analyses.
  • Transparency and citation of AI usage: Treat AI outputs like any other source that students must cite. If students use a generative AI tool during an assignment, require them to openly acknowledge how they used it (e.g. “I used a generative AI assistant to suggest an outline for this report, which I then revised”). Prompts and any direct text or data from an AI model should be quoted or cited, and paraphrased AI content should be attributed to its source as well.
  • Privacy issues and data security: Instructors can help students understand that sensitive and confidential information (e.g., personal data, proprietary case details) should not be input into public AI tools. Many AI platforms retain and use input data, so sharing sensitive content can violate privacy or copyright.
  • Ensure Equity and Accessibility: Be mindful that not all students have equal access to or comfort using generative AI. Courses should clearly explain which AI resources are available and any associated costs, and the same AI resources should be available to all students.
  • Discussion of bias and ethical use: Instructors can discuss the ethical dimensions of AI as part of course norms. Students should understand that AI models can exhibit biases, make up information, or reflect limited perspectives based on their training data. For instance, instructors might present examples of biased AI decisions in a class discussion to emphasize why ethical use of AI is important in business decisions.
  • Continuous adapting and refining of policies: Generative AI technology and best practices are evolving quickly. Instructors and departments should treat AI usage policies as living documents – solicit student feedback on what is working or not, and stay updated on institutional guidelines or new tools. Instructors should be prepared to adjust guidelines each semester. For example, if they discover an assignment was too easily handled by AI, they should plan to revise it or change the allowed AI assistance for next time.

As generative AI becomes more integrated into academic and professional work, students need more than just permission to use these tools—they need structured support to use them effectively and ethically. Instructors play a key role in fostering AI literacy by introducing tools gradually, explaining how they work, and creating space for open dialogue. Building students' understanding of AI not only supports responsible use in your course, but also prepares them for the expectations of an evolving workforce.

  • Familiarization with the Tools: Before asking students to use a generative AI tool, instructors should take time to use and understand it themselves. Instructors can try out different prompts for their assignments to check what kind of output AI produces. This will alert them to any quirks, limitations, or inaccuracies specific to their assignments or class discussions. 
  • Gradual introduction of AI leads to better integration and adoption: Instructors should be careful not to drop students into a challenging assignment with AI as an allowed tool if students have never used it before. Instructors could start with a low-stakes or ungraded activity to let students get hands-on experience. They could have an introductory discussion board where everyone tries a simple prompt related to the course and shares an AI-generated result. This sandbox approach gives students a chance to use and learn about the interface.
  • Beginning with the basics of AI literacy: Dedicate some class time to explaining how generative AI works in general terms and what its limitations are. Students don’t need to know the technical details, but they should know, for instance, that large language models predict text based on patterns – they do not “think” or verify facts.
  • Foster an open dialogue: Create a classroom culture where students can openly discuss their experiences with AI as long as it pertains to the goals of the course. Some might be anxious that using AI is “cheating,” while others might be overconfident in its answers. Invite students to share their views and questions about AI tools. For instance, instructors could have a short reflection discussion asking: “In what ways do you think using AI could help you learn in this course, and what concerns do you have?”
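
The point that language models predict text from patterns rather than verified facts can be demonstrated in miniature. The toy bigram counter below is a deliberately simplified stand-in for an LLM, useful only as a classroom illustration; the tiny corpus and function names are invented for the example.

```python
from collections import Counter, defaultdict

# A toy illustration of pattern-based text prediction: the "model" simply
# counts which word follows each word in its training text, then predicts
# the most frequent continuation. No meaning, no fact-checking -- just
# statistics over observed sequences, which is the core intuition behind
# (vastly larger) language models.

corpus = (
    "students use ai tools . students question ai output . "
    "students verify ai output ."
).split()

# Count which word follows each word in the corpus
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word: str) -> str:
    """Return the most frequent continuation -- pure pattern matching."""
    return following[word].most_common(1)[0][0]

print(predict_next("ai"))   # "output" follows "ai" most often in this corpus
print(predict_next("use"))  # "ai" is the only word observed after "use"
```

Asking students why the model can only ever echo its training data, and what that implies about factual reliability, makes the limitation concrete without any technical background.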

If you choose to integrate AI into your assignments, the key is intentionality. Effective use of AI in coursework should always serve your learning objectives—not override them. By designing assignments that emphasize real-world application, critical thinking, and student reflection, instructors can help students engage with AI tools responsibly while still demonstrating their own understanding and originality. The tips below offer ways to thoughtfully align AI with your course goals and reinforce the human skills that matter most.

  • Align AI Use with Learning Objectives: When integrating AI into an assignment, instructors should start from their learning goals. Consider what knowledge or skill students must demonstrate, and let that drive how (or if) AI is involved. For example, if the objective is to improve strategic decision-making, instructors might allow students to use an AI tool to gather data or options, but the analysis and final decision must be entirely their own. It is therefore important to give clear instructions to students: “You may use generative AI to research factual information, but you must write the comparison analysis in your own words and explain the reasoning without AI assistance.”
  • Designing effective assignments: Design assignments that involve real-world business situations. Generative AI struggles with nuanced, context-rich problems. For instance, instead of a basic essay question that an AI could easily answer with a generic response, instructors could assign a project where students apply course concepts to a specific problem. Examples include analyzing a recent business case from the local community or tailoring a marketing plan to a niche product.
  • Scaffold assignments into stages: Break complex assignments into a series of smaller steps or milestones. For example, an assignment could require an initial proposal, workable solutions, a draft, a feedback stage, and a final submission. This step-wise design emphasizes that writing and thinking are iterative processes, not one-shot products. It mirrors actual business plans or projects, where initial drafts might be collaborative (possibly with AI) but final decisions are human-driven.
  • Describing and Documenting: Students can be asked to reflect on how they used generative AI as part of completing the assignment. This could be a short write-up or annotations in their submission – for instance, a concise “AI Usage Statement” with prompts such as: Which AI tool did you use, and for what purpose? What did the AI output, and how did you verify or modify that output? What would you do differently next time? By having students clearly describe their process, the AI use becomes transparent; a student is more likely to use the AI ethically if they know they must describe it.
  • Critical evaluation of AI output: Turn AI into a subject of inquiry within the assignment itself. Rather than treating the AI as an all-knowing oracle, instructors should design tasks where students must evaluate, fact-check, or critique the content produced by an AI. For example, an assignment might ask students to pose a question to a generative AI relevant to course material (e.g., a question about a management dilemma or segmentation strategy), then analyze the quality of the AI’s response. Students could be prompted to identify errors in the AI’s answer or missing considerations, and then provide a corrected or improved answer based on their understanding of the course content. This teaches an important skill for today’s workplace – never take AI output at face value without oversight and verification.
  • Emphasize the Human Role in Assessment: Design rubrics and feedback to explicitly reward the analysis, creativity, and original thought that students contribute. One effective strategy is to ask students to “describe the value you (the student) added” when using AI for a task. Consistently reinforce that grades reward human judgment, rationale, and customization; this helps students understand that AI is a support tool rather than a substitute for learning.

Incorporating generative AI into class activities doesn’t just support learning—it helps students build real-world skills in critical thinking, communication, and ethical technology use. From brainstorming sessions to debates and collaborative analysis, AI can serve as a thought partner, a devil’s advocate, or even a simulated stakeholder. These activities model how professionals use AI in dynamic, high-stakes environments and give students the chance to practice working with AI, not just around it.

  • In-class demonstration of AI capabilities and limitations: Explain simply how generative AI works or how instructors expect it to be used in their course. For example, they can do an in-class demonstration by using an assignment question and asking an AI tool to respond. The students can then critique the AI’s response. Students often find it eye-opening to critique an AI-generated answer as a group – they might spot plausible-sounding mistakes or wrong assumptions that they wouldn’t have questioned if they saw the answer alone. The discussion can then be guided further with questions like: “What did the AI do well?”; “What important points did it miss?”; “Is there any information that seems incorrect or unsupported?”; or “How can you build on this?”
  • Incorporate AI into Think-Pair-Share: Active learning routines like think-pair-share can be adapted to include AI as a participant. For instance, in a business strategy class, instructors might pose a question like “What are two potential risks of entering Market X in 2025?” First, have students “Think” individually and also query an AI tool for an answer. During the “Pair” phase, each student can compare their own thoughts with what the AI suggested, discussing differences. Then in the “Share” phase, student groups report whether the AI brought up any novel ideas or if the human perspectives caught nuances the AI missed. This approach uses AI to encourage ideas and solutions rather than to give a single correct answer. It educates students to critically compare AI-generated ideas with their own reasoning.
  • Using AI for case discussions: Generative AI can act as a proxy for perspectives or stakeholders in business cases. One idea is to set up a debate where one argument is supplied by an AI. For example, students could be assigned to argue in favor of a certain business decision, while an AI (prompted by the instructor beforehand) produces arguments against it – students then have to respond to the AI’s points in real time. Alternatively, students might debate against an AI: e.g., student groups vs. ChatGPT on a case problem, where the AI’s rebuttals are presented, and the students must counter them. This can be done by the instructor moderating (feeding the AI the students’ arguments and reading its replies). Such exercises encourage students to think on their feet and practice analytical skills. Another application is having the AI simulate a stakeholder. In a marketing class, an AI could play the role of a dissatisfied customer with a grievance, while a student practices a difficult conversation with it. The AI’s responses give a dynamic (if imperfect) simulation for the student to react to. These activities make abstract concepts more concrete and can build communication skills.
  • Brainstorm with AI: Teaching students how to brainstorm with generative AI helps them learn how to use it effectively and where it can add the most value. For instance, if students are developing a business plan or marketing campaign in class, they can ask the AI for 10 outside-the-box ideas to consider. The role of the AI here is to inject some unexpected or wide-ranging suggestions into the mix. Once the AI generates ideas, students can evaluate and build on those ideas – which ones are actually feasible or novel? Which ones are unrealistic and why?
  • Emphasize Human-AI Synergy Throughout: In every in-class use of AI, reinforce that AI is a partner to augment human ability, not a replacement for it. This can be as simple as consistently asking students after any AI-assisted activity: “How did your thinking or decision-making improve the AI’s contribution?” or “Would you have arrived at the same answer without the AI? Why or why not?” By having students articulate the unique value of human insight, judgment, and creativity, instructors can show students that professionals succeed by combining AI tools with human expertise. For instance, after an AI-assisted activity, instructors can discuss when the students outmaneuvered the AI or caught something it missed – these moments illustrate the importance of human critical thinking. Similarly, it can be pointed out when the AI offered a helpful cue that a human might have taken much longer to produce (showing the benefit of the tool). This balanced view helps students appreciate the synergy of working with AI.

At the University of Utah, we’re actively shaping how AI enhances teaching and learning. Through a growing suite of AI-powered tools and projects, faculty, staff, and students are exploring new ways to improve instruction, support student success, and drive academic innovation. From virtual tutors to feedback dashboards and exploratory guides, these tools reflect our commitment to ethical, student-centered, and research-informed use of AI in higher education. Below are some of the projects being developed right here on campus.

UBot: Your Always-On AI Study Partner

UBot is a virtual tutor developed in the University of Utah’s Innovation Lab to support student learning around the clock. Instead of giving direct answers, UBot uses a Socratic questioning approach to guide students toward solutions—encouraging critical thinking, problem-solving, and deeper learning. Built on advanced large language models, it’s designed to feel like a helpful, knowledgeable Teaching Assistant with access to course materials like syllabi and lectures.

Originally launched in high-failure-rate courses as part of the Academic Recovery Project, UBot has quickly proven its value—especially during late-night study hours when traditional tutoring isn’t available. With features like topic selection, chat folders, accessibility support, and secure, anonymous data handling, UBot helps students stay confident, supported, and on track for success.

UGuide: Explore Majors with Confidence

UGuide is a virtual assistant designed to help students explore majors in a personalized, low-pressure way. Built in the University of Utah’s Innovation Lab, it pulls together information from across campus—like career paths, course schedules, and support resources—and delivers it through an easy-to-use chat interface.

Currently in early pilot stages, UGuide is part of a growing suite of AI tools aimed at improving student success by making guidance more accessible, scalable, and student-centered.

TEF-Talk: Smarter Feedback for Better Teaching

TEF-Talk is an AI-powered dashboard that helps faculty and administrators make sense of student course feedback, peer reviews, and self-reflections—all in one place. Developed in partnership with CTLE, the Office of Undergraduate Studies, and the Spark Innovation Lab, the tool uses AI to turn qualitative data into clear, actionable insights that support teaching improvement.

Now in pilot phase, TEF-Talk aims to make teaching excellence more visible, measurable, and meaningful across campus.

Guides for Students

Guides for Educators

AI News

Because of the rapid development of AI technology and its many applications to different fields, we recommend faculty stay abreast of developments in their own field. Outlets that report on AI developments relevant to education include the AI Tool Report, The Rundown, and AI Education News. For specific ideas about how AI may affect educational practice, consider reading Derek Bruff’s Agile Learning blog, One Useful Thing, or the Hechinger Report.

Continue Learning

Take CTE's AI Course on Canvas.

Fortune recently published a list of 5 Free Online AI Classes from top tech firms and universities, or consider the online AI classes taught by Google and Stanford U experts. You can also take a Machine Learning and AI Micro Bootcamp offered by U of U Professional Education. There are also free online Intro to AI courses offered by Udacity and Coursera.

There are a lot of discussions surrounding AI and how it might impact the future of higher education. Take a look at some of the selected conversations we'd recommend:


 

AI Support

Interested in learning more? Take CTE's AI course via Canvas!

Discuss AI integration, assignment design, policies, and more with us! Send an email to cte@utah.edu or submit a help request to our team.


Last Updated: 7/7/25