AI policy for webinars?


Amy Hofer

Apr 28, 2026, 2:08:13 PM (3 days ago)
to Open Oregon, CCCOER Advisory, SPARC OE Forum, Open Textbook Network
Hi all, this message is cross-posted. 

Has anyone developed a policy about whether to allow AI tools to attend webinars that you run? I'm curious about how others are balancing accessibility uses vs concerns about privacy and accuracy. 

Thanks in advance, and I'll share back a roundup of replies. 

Amy

--
Amy Hofer (she/her)
Statewide Open Education Program Director

Daniel Rockwell

Apr 28, 2026, 4:36:42 PM (3 days ago)
to Amy Hofer, Open Oregon, CCCOER Advisory, SPARC OE Forum, Open Textbook Network
Hi Amy et al,

We do not have a policy about AI tools and webinars yet, but I am adding this to my list of things to discuss. Thanks.

This does bring up other functionality questions for me: 
  • If the webinar is a “Zoom webinar” or set up with registration restrictions, can an "AI attendee" even access it?
  • Can Zoom (or other platforms) restrict AI bots from joining meetings?

Sincerely,
Daniel Rockwell, PhD
(he/him/they/them)
Director | WOU Center for Teaching & Learning
Western Oregon University

Land Acknowledgement: Western Oregon University in Monmouth, OR is located within the traditional homelands of the Luckiamute Band of Kalapuya. Following the Willamette Valley Treaty of 1855 (Kalapuya etc. Treaty), Kalapuya people were forcibly removed to reservations in Western Oregon. Today, living descendants of these people are a part of the Confederated Tribes of Grand Ronde Community of Oregon (https://www.grandronde.org) and the Confederated Tribes of the Siletz Indians (https://ctsi.nsn.us).


Ashley Langsdorf

Apr 28, 2026, 5:15:23 PM (3 days ago)
to Amy Hofer, Open Oregon, CCCOER Advisory, SPARC OE Forum, Open Textbook Network
I'm so glad you started this conversation! I'm also curious to hear how others are approaching this. 

I facilitate a peer mental health training that is considered a confidential space, so as part of our group agreements at the beginning of the training we set the expectation that no one records or uses a transcription program to capture what is being said. However, I anticipate future situations where someone may need an accommodation that involves AI/transcription programs like Otter and similar. I'm actually surprised it hasn't come up yet.

Ashley


Jennifer Cox

Apr 28, 2026, 5:44:10 PM (3 days ago)
to Ashley Langsdorf, Amy Hofer, Open Oregon, CCCOER Advisory, SPARC OE Forum, Open Textbook Network
Good afternoon all,

Good conversation. I'm co-leading our college-wide committee on AI. We've finished an Administrative Policy, Procedure, and Guidelines. We have an Instructional Policy, but we are currently working on the Instructional Procedure and Guidelines (a project for next year). So my question is: are you asking about all spaces, administrative use, or instructional use (faculty or student)?

Our college guidance on this is not in our Policy. But we also have a very specific governance structure: Policy (what) => Procedures (who/how) => Guidelines (best practices/details). We say the devil is always in the Guidelines ;-). Two items in our documentation potentially advise against using an AI meeting attendant.

1) In our Guidelines document, we spell out what acceptable use looks like. For any AI note-taking to be used in an administrative meeting, you must gain consent from all parties in attendance. You must stop the note-taking or recording if sensitive topics come up or guests request it, since you can't always anticipate where a conversation will lead.

2) Additionally, we require a human in the loop. That language lives in our Procedures document.

Between those two items, I don't believe that AI administrative meeting attendance would fly at our institution. More importantly, I think we need to discuss this and possibly outline it in our instructional procedure or guideline: "An AI attendant doesn't replace student attendance in a course."

To give some background on how we arrived at this: we recently added our General Counsel to our AI governance committee. Apparently, a number of legal cases have been filed regarding the use of agentic AI. These come down to discrimination and liability for not having a human in the loop or process (and wholesale outsourcing of tasks to AI).

Hope this helps and doesn't further muddy the waters. 

Sincerely,

Jennifer Cox

Dean of Library and Learning Resources

Chemeketa Community College | She/Her

4000 Lancaster Dr. NE, Salem, OR 97305

Bldg. 9, Room 210 | 503.399.5105 | jennif...@chemeketa.edu | www.chemeketa.edu






Chris Esposito

Apr 29, 2026, 12:24:15 PM (2 days ago)
to Jennifer Cox, Ashley Langsdorf, Amy Hofer, Open Oregon, CCCOER Advisory, SPARC OE Forum, Open Textbook Network

There may be a few distinctions we could make here that help define options for AI use, based at least in part on the enforceable restrictions that can be placed on an AI tool's dissemination and use of participants' identities and of the material discussed during a meeting or other use.

As background, before coming to academia (I'm in Computer Science at EOU) I spent several decades at Boeing on both the commercial airplane and military sides of the company. Most of the material discussed fell into one of five categories:
1) mundane stuff (e.g., meetings about the scheduling of other meetings :-) ),
2) technical but not proprietary (e.g., general questions about aerodynamics),
3) proprietary to Boeing,
4) export-controlled (i.e., disclosure limits imposed by the US State Department),
5) material requiring varying degrees of security clearance.

Disclose stuff in #3 under the wrong circumstances and you may get fired.
Disclose stuff in #4 or #5 under the wrong circumstances and you might wind up in prison.

My memory of the possible AI tool usage options was that there were 3:

1) You use a model that was available on some piece of the public cloud, and so any questions / answers were most likely incorporated as training data with no restrictions.

2) You use an on-premises version of a tool that ran on local systems wholly controlled and monitored by the company. The knowledge base of such a system was derived from and updated from internal sources only; access to the outside from the tool was blocked at the network level and strictly monitored.  Some meetings did have automated note takers, but the results were subject to review by the owner of the meeting before dissemination.

3) Some of the tools had license types that offered versions of the public tool that were segregated and promised not to incorporate context data for use as part of any training data set. Given AI tool provider behavior on privacy and intellectual property in general, I'm deeply skeptical of such promises (I'm also skeptical of AI as it is currently envisioned and implemented, but that's a separate discussion).

Boeing opted for usage option #2. If #1 was ever allowed from the company network, it was brief.
------
Suppose there were an option similar to #2: it would run on IT systems wholly controlled by a university. Note-taking at meetings or raw meeting contents, for example, would be visible only to those granted access to the university systems that hosted them. Would this satisfy the perfectly reasonable privacy concerns expressed so far?

Chris Esposito

Joe Corsini

1:01 PM (10 hours ago)
to Chris Esposito, Jennifer Cox, Ashley Langsdorf, Amy Hofer, Open Oregon, CCCOER Advisory, SPARC OE Forum, Open Textbook Network
All,
Being a software engineer, Chris has an in-depth technical view of 'AI' software and its dangers. AI writing tools are a huge issue in my classes: students use them outside of class to write their reports and to answer study and test questions in my online class. Recent studies are showing clear declines in cognitive ability in cohorts of students using AI writing programs. I have moved to having them write everything in class to make sure they actually do the thinking themselves. Even the search algorithms can be an issue if you aren't using the right ones, because of the way that information is accessed (poorly executed science or biased opinion in many cases) and presented by, say, Google AI. Managing this in and out of the classroom represents yet another level of control that we impose, making our jobs increasingly difficult.



--
 
 
Joe Corsini, PhD
Dept Chair Biology
AAP Faculty Union Steward
Eastern Oregon University
One University Blvd
La Grande, OR 97850

Tom Nejely

1:11 PM (9 hours ago)
to Joe Corsini, Chris Esposito, Jennifer Cox, Ashley Langsdorf, Amy Hofer, Open Oregon, CCCOER Advisory, SPARC OE Forum, Open Textbook Network
I'm at Klamath Community College. I see how this conversation is growing. Three factors are at play in this thread so far. And there's one 800 lb gorilla.

The 800 lb gorilla is that AI usage is approaching universal. College employees and college students are using AI and will continue to do so, whether that usage is appropriate or not, and whether or not it is applied with forethought and consideration of potential consequences.

The use cases that have been discussed in this thread are:
1. Internal to the college: staff, faculty, and other employees. Security and appropriate use are very real issues.
2. External to the college but still professional: staff, faculty, and other employees. This is the webinars, meetings, and conferences discussion.
3. Student use: assignments, learning assistance, and full substitution for work or learning.

I'm not offering solutions here, because the issues are many and complex. But I wanted to clarify the direction this thread has taken.

Angela Wood

3:08 PM (7 hours ago)
to Tom Nejely, Joe Corsini, Chris Esposito, Jennifer Cox, Ashley Langsdorf, Amy Hofer, Open Oregon, CCCOER Advisory, SPARC OE Forum, Open Textbook Network
Tom, 
I appreciate the clarification. This is very complex.

Angela
