AI & the LMS: Early Days

The learning management system (LMS) is at the center of the contemporary instructional technology ecosystem on most college campuses. At UP, the Moodle LMS is used for a range of learning management functions. Some faculty use it as a static course website or course library, where announcements, syllabi, and course materials are posted. Others take a more hybrid or ‘flipped classroom’ approach, using Moodle to post assignments, administer tests, and host class discussions. Still other faculty build out their entire course in Moodle, taking advantage of its various interactive and communications features to extend the classroom and create a fully fledged online course experience.

LMSs have been used in all these ways for the past twenty-five years. While there have been myriad functional improvements, expanded capabilities, and features to support a wide range of teaching and learning styles, the basic way faculty and students interact with the LMS is very much the same as it was in 2005.

That all seems about to change. The rapid onset and accelerating development of consumer-facing AI has LMS providers like Canvas, Blackboard, and Moodle scrambling to develop user-friendly, education-specific AI features to support faculty designing and offering hybrid and online courses.

In the short time that ChatGPT and similar products have been in the marketplace, they’ve become the subject of often contentious discussion in higher education circles. While virtually everyone understands that coming to terms with these tools is inevitable, concerns about their potential harms are often top of mind as institutions scramble to develop policies and practices for the responsible use of AI.

Those concerns have found their way into the instructional technology marketplace as well, and it has been interesting to see how LMS vendors are integrating ethical AI governance and evaluation into their organizational structures to ensure that these new tools are safe and credible. Each company began its AI journey by laying a carefully considered foundation of policies and principles. From Moodle’s AI Principles, to Blackboard’s Trustworthy AI Approach, and Canvas’s Approach to an Ethical AI Strategy, these statements of principle represent a collective, virtually industry-wide approach to managing the presence of AI in the LMS . . . before the tools even exist. As educators, we can be thankful for this — it hasn’t gone that way in the frenetic AI gold rush in other sectors of the economy.

Their caution is warranted. Many of us remember Turnitin rushing to market with AI writing detection features last summer, a tool so unreliable and harmful that many schools, including UP, turned it off. Nevertheless, changes are coming fast. One of the useful abilities of tools like Copilot and ChatGPT is the capacity to quickly create serviceable drafts or outlines that can then be human-edited. In LMS terms, building out an online course site can be time- and labor-intensive. With that in mind, companies are developing AI instructional design tools — Blackboard has already launched their version — and these promise to drastically reduce the time it takes to develop course structure, assignments, and assessments. In some ways these tools will mirror what some of us currently do with ChatGPT, engaging in “conversations” that help refine ideas about learning design while the design assistant takes on much of the tedious work of course building. Similarly, Moodle believes AI/LMS integration will help make online courses more accessible, inclusive, and personalized.

There may be significantly different approaches to integrating AI into the LMS. So far Blackboard seems to be taking the approach of either building or buying AI tools and making them core features of its platform. Canvas and Moodle, on the other hand, rely on third-party developers to extend the functionality of their platforms, so for them a vetted “marketplace” of AI tools is the right approach. Instructure, Canvas’s parent company, has already launched a beta site for its Emerging AI Marketplace. This should allow them to ensure a safer experience for Canvas users by making sure that the tools in their marketplace align with their goals and principles. Moodle, as a managed open source project, depends on its “community of the willing” developers to build apps that will be collectively vetted by the global Moodle community.

ATSI will certainly be following these industry conversations and hopes to evaluate some of these kinds of tools for use in Moodle over the coming months. As we have all experienced, AI is a volatile space, and it works better when used with care and caution. We believe that preparing ourselves through sound evaluation and implementation planning is as important as using the tools themselves. Our goal, as always, is a better Moodle experience for faculty and students, and we won’t add new tools that do not improve that experience, or that make it less inclusive or accessible to any member of our community. UP has done a good job raising awareness and laying the groundwork for responsible AI on campus, but the ground needing to be worked will only expand from here.

Mark Jenkins
