From Policy to Practice: How One Campus is Guiding Faculty and Student AI Use
Across higher education, conversations about AI often begin and end with academic integrity. Most institutions have issued statements or updated policies warning against plagiarism and improper use. But as many faculty and students will tell you, those policies rarely provide the clarity or guidance needed to use AI constructively.
That gap was the focus of our recent AI on Campus webinar, “From Policy to Practice: A Framework for AI in Academic Tasks,” which spotlighted the Medical University of South Carolina’s (MUSC) efforts to create a shared framework for acceptable AI use. Developed with leadership sponsorship and collaboration between faculty and instructional designers, MUSC’s AI Acceptable Use Framework for Academic Tasks is emerging as a model for how institutions can move beyond prohibition toward purpose.
Why Policy Alone Isn’t Enough
Participants opened with a frank acknowledgement: policies that simply say “don’t cheat with AI” are not enough. Students remain anxious and confused, and faculty lack consistency in how to set expectations. Meanwhile, employers are already rewarding graduates who can demonstrate AI literacy, adaptability, and critical judgment.
Campus leaders face a choice: continue to treat AI as a compliance issue, or equip their communities with frameworks that translate values into practice.
MUSC’s Framework in Action
MUSC’s AI Acceptable Use Framework for Academic Tasks is an adaptation of the AI Assessment Scale (AIAS) developed by Perkins, Furze, Roe, and MacVaugh (2024), a peer-reviewed model for integrating generative AI into educational assessment. By building on this research-based foundation, MUSC created a practical framework that guides faculty and students at the task level and shifts attention to the process, not just the product, of student work.
The MUSC framework lays out five categories of AI use:
- No AI (e.g., exams, clinical skills evaluations)
- AI Planning (e.g., research and outlining)
- AI Limited (e.g., instructor-designated components, such as poster design or revisions)
- AI Extensive (e.g., student discretion with instructor oversight)
- AI Exploration (e.g., AI required as both tool and subject of inquiry)
Each category includes documentation expectations, helping students reflect on how they used AI and how it shaped their work. Faculty have flexibility to adapt categories across disciplines while maintaining consistency in communication.
In the webinar chat, attendees raised questions about accreditation, how students distinguish between AI use levels, and institutional privacy requirements, underscoring how relevant MUSC’s approach is for a broad range of campuses.
Faculty and Instructional Designer Perspectives
The heart of the webinar came from MUSC faculty and instructional designers who described how the framework plays out in real courses.
One example: the Musculoskeletal (MSK) III Teach Back assignment, in which doctoral physical therapy students use Microsoft Copilot to generate patient case studies, then critically evaluate, revise, and apply them. Students must produce a treatment plan and patient education strategies, and demonstrate their skills in the lab.
The assignment positions AI as a reflective partner rather than a shortcut. Students engage in evidence-based vetting, apply professional judgment, and build metacognitive skills while remaining fully accountable for the final product.
“Students begin to see AI not as a tool to ‘get the answer,’ but as something they must learn to interact with critically and professionally.” — MUSC faculty panelist
Faculty noted that embedding clear expectations into assignment sheets, FAQs, Brightspace quizzes, and video demos helped students shift their mindset from “Is it allowed?” to “What does appropriate use look like for me as a future clinician?”
Implementation Insights
Panelists emphasized that implementation required more than writing a framework:
- Clarity and consistency across syllabi and policies to ensure students encounter the same expectations.
- Faculty development through workshops and resources to help instructors adopt the framework in their own courses.
- Communication tools such as video walkthroughs and assignment FAQs that reduce confusion and promote confidence.
Participants echoed these strategies in the chat, with several connecting MUSC’s approach to TILT (Transparency in Learning and Teaching) and noting parallels to their own efforts to scaffold assignments more clearly.
Community Reflections
Beyond the panel, participant voices highlighted the broader momentum in higher education:
- Instructors described experimenting with Blackboard Ultra AI tools and sharing how students found them helpful for critical reflection.
- Others noted the emergence of tools like Google’s “Learn Your Way.”
- Attendees underscored the importance of treating AI not simply as a compliance issue but as a professional preparation priority.
“Accrediting bodies are already asking to see evidence of AI integration.” — Webinar participant, via chat
These reflections reinforced that MUSC’s framework is not an isolated experiment but part of a growing shift in higher education toward constructive, guided AI use.
Takeaways for Campus Leaders
The MUSC case study offers several lessons for leaders who want to move beyond policy statements:
- Adopt frameworks, not just rules. Clear categories help students and faculty navigate appropriate AI use.
- Support faculty implementation. Provide sample assignment language, FAQs, and training opportunities.
- Prioritize student preparation. Frame AI as a professional skill requiring judgment, reflection, and adaptability.
- Engage the community. Ensure leadership endorsement while involving faculty and IDs in development.
Now is the time for institutions to step beyond vague references in academic integrity policies. As MUSC’s example shows, frameworks that translate policy into practice build trust, consistency, and workforce readiness.
To explore the full discussion and hear directly from MUSC faculty and instructional designers, we invite you to watch the webinar replay.