Eckerd College community members learn about AI from Google global enterprise client executive

By Ashlyn Porter '28
Published November 19, 2025
Categories: Academics, AI Studies, Communication, Computer Science, Public Events

Kayla Siagha, a global enterprise client executive from Google, offered the Eckerd College community encouragement to be smart stewards of artificial intelligence. Photo by Julia M. Hildebrand

Artificial intelligence can be an overwhelming topic that is often avoided instead of explored. 

This tendency to ignore AI out of fear and fatigue was a central theme of a Nov. 6 Eckerd College event featuring Google Global Enterprise Client Executive Kayla Siagha. Her talk was organized by the College’s Communication discipline, the new AI studies minor, General Education and the Rahall Communication Center. All members of the Eckerd community were welcome to attend and ask questions.

The event drew a crowd of more than 50 students and faculty members hoping to find some clarity in the presentation.

Siagha earned her MBA in marketing from Coastal Carolina University. She worked at Cisco Systems for a couple of years before landing a position at Google as an account executive. Along the way, she gained experience in AI and its many applications at the corporate level.

Her presentation covered examples of AI that are used daily, such as Google’s search engine, Amazon Alexa, digital maps and autonomous vacuums. In addition, Siagha explained the types of AI and how to apply them to one’s daily routines.

Perhaps her most valuable advice concerned how students can prepare for a world turning to AI and remain competitive in the job market.

The lecture was co-sponsored by the Communication discipline, AI studies minor, Foundations Collegium and the Rahall Communication Center. Photo by Ashlyn Porter ’28

“There’s a lot of value in bringing AI into work,” she stated.

She emphasized the importance of adapting to and growing with this technology, a process that starts with curious exploration. Siagha recommended asking large language models such as Google Gemini or ChatGPT questions to get familiar with prompt engineering.

She also explained AI’s current weaknesses and stressed the value of strengthening human capabilities in those areas.

“We need to make sure we strengthen the skills that AI is not good at,” she said.

In other words, one cannot outsource critical thinking and use AI effectively. It is up to humans to broaden their emotional reasoning, communication skills and empathy—all traits these tools lack.

“[We must] learn to lead with AI and not be led by AI,” she repeated throughout her talk.

She noted that people who do not educate themselves on AI’s applications, ethics and limitations risk being outpaced by the technology in ways that lead to harmful consequences. It must remain a tool, not a crutch.

Following her presentation, she answered student questions, which mainly concerned the environment, security and privacy.

Although not all fears can be quelled, the event gave community members a space to learn about the rise of AI and how to prepare for it. To Siagha, this is the first step.