AI use at MMU


Safe and Responsible AI Use at MMU: A Guide for Employees and Students

Artificial Intelligence (AI) is rapidly transforming how we learn, work, and communicate. At Mount Marty University (MMU), we embrace the potential of AI to enhance productivity, creativity, and innovation across campus. However, with great power comes great responsibility. This guide outlines key principles and best practices for using AI tools safely and ethically — whether you're a student, a faculty member, or a staff member.


Why Safe AI Use Matters

AI tools like ChatGPT, Grammarly, and image generators can support academic writing, automate routine tasks, and spark new ideas. But improper use can lead to serious consequences, including:

  • Data breaches involving sensitive student or institutional information
  • Academic integrity violations such as plagiarism or misrepresentation
  • Bias and misinformation from unverified or inaccurate AI-generated content
  • Compliance risks with FERPA, GLBA, and other regulatory standards
  • Course policy violations when AI tools are used without an instructor's prior approval

By using AI responsibly, we protect our community, uphold MMU’s values, and ensure these technologies serve us well.


Best Practices for Safe AI Use

1. Protect Sensitive Data - Never input personal, financial, or confidential university data into AI tools. These platforms may store or use your data in ways you can't control.

2. Maintain Academic Integrity - AI can assist with brainstorming or grammar checks, but it should never replace original thought.

3. Verify AI-Generated Content - AI can produce convincing but incorrect or biased information. Always fact-check and use trusted sources to validate any AI-generated output.

4. Respect Copyright and Intellectual Property - Avoid using AI to replicate or distribute copyrighted material. When creating content with AI, ensure it doesn’t infringe on others’ rights.

5. Understand Tool Limitations - AI is not human—it lacks context, judgment, and ethical reasoning. Use it as a support tool, not a decision-maker.

6. Adhere to the Code of Conduct - AI use is subject to MMU's Code of Conduct policies.


AI Use in the Classroom and Workplace

Faculty and staff should clearly communicate expectations around AI use in coursework and projects. Students should verify with instructors whether AI tools are permitted and in what capacity. In administrative settings, AI can streamline tasks, but it should never replace human oversight.


MMU’s Commitment to Ethical AI

MMU is actively exploring policies and training to support safe AI use across campus. We encourage open dialogue, continuous learning, and thoughtful integration of AI into our academic and operational environments.


Need Help or Have Questions?

If you're unsure about using AI in a specific context, reach out to your instructor, supervisor, or the IT department. MMU is here to support you in navigating this evolving landscape responsibly.


Mount Marty has access to the enterprise editions of Google Gemini and Microsoft Copilot. These are the only AI tools approved for use with 'Level 2 – Internal' data (see the Data Classification Policy).

All employees are bound to proper AI usage through their signing of the Acceptable Use Policy (which can be found here; the relevant section is shown below).

[Image: excerpt of the Acceptable Use Policy section on AI use]