When Dan Wang, Columbia Business School's Lambert Family Professor of Social Enterprise, began encouraging discussion in his Technology Strategy course last year, he tapped into a centuries-old teaching model known as the tutorial system.
Popularized by academics at the University of Oxford and the University of Cambridge in the 1800s, the model sees professors lead students in small, frequent discussion sessions where students are expected to articulate, critique, and interrogate their ideas. In Wang’s Technology Strategy course, the system was simple but rigorous: Students were asked to analyze case studies and argue for prescriptions, defending their positions and building critical thinking skills along the way. The practice was a success, according to Wang, with positive spillovers into the classroom by way of high engagement and intense conversation.
There was just one major difference between the model used in Wang’s course and the one used by 19th century scholars: In the Technology Strategy course, the students’ discussion partner was wholly AI. That’s because Wang, together with then-CBS student Johnny Lee ’23, developed and hosted a portal on the machine learning platform Hugging Face that allowed students to have a dialogue with an AI-based chatbot fine-tuned with Wang’s notes, a relevant case study, and additional research material.
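The mechanics of such a portal are straightforward to sketch. The Python outline below is hypothetical, not the actual CBS implementation: it shows how course notes, a case study, and research material might be folded into the system prompt that steers a chat model (the function name and sample text are invented for illustration).

```python
def build_system_prompt(course_notes, case_study, extra_research):
    # Fold the instructor's material into one system prompt
    # that steers the chat model toward the course's framing.
    # This is an illustrative sketch, not the actual portal.
    sections = [
        "You are a discussion partner for a Technology Strategy course.",
        "Challenge the student's arguments and ask follow-up questions.",
        "COURSE NOTES:\n" + course_notes,
        "CASE STUDY:\n" + case_study,
        "ADDITIONAL RESEARCH:\n" + extra_research,
    ]
    return "\n\n".join(sections)

# Hypothetical course material.
prompt = build_system_prompt(
    course_notes="Firms capture value through complements and control points.",
    case_study="Case: a startup entering a crowded platform market.",
    extra_research="Recent reporting on platform regulation.",
)

# The assembled prompt would then be sent, along with each student
# message, to a hosted chat model (for example, via Hugging Face's
# hosted inference APIs) to produce the chatbot's replies.
```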
“It’s a precise conversational partner that students are engaging with,” says Wang, who also serves as co-director of the Tamer Institute for Social Enterprise and Climate Change.
Providing a unique debate partner is just one of the many ways CBS faculty are integrating AI into the School’s curriculum. At the same time, faculty are ensuring that AI use in the classroom remains ethical and that students understand the emerging technology’s limitations.
AI After Class
For Wang, integrating AI into his curriculum takes myriad forms, from the aforementioned discussion partner to a full-on scenario simulator. Just a few months after the launch of ChatGPT, Wang began integrating AI into assignments in his Technology Strategy course, which explores how firms capture value, deal with competitors, and navigate the social and environmental implications of new technologies. Throughout the class, students are presented with case studies and dilemmas that act as a vehicle for learning how to apply frameworks, marshal quantitative data, and ultimately gain fluency in the logics of technology.
For homework, Wang asks students to vote on related multiple-choice poll questions and explain their answers—a common pedagogical assignment Wang found ripe for AI enhancement. With ChatGPT, students could simply plug in questions and generate answers in a few clicks. Rather than interfere with this, however, Wang decided to include a new provision on his syllabus that not only allowed students to use an AI-based chatbot for poll question assignments but encouraged them to do so, as long as it improved the quality of their work, did not misrepresent their thinking, and was noted in their follow-up explanation.
With a data set of 2,200 homework assignments to analyze, Wang gleaned valuable data, such as how many students chose to use AI to answer questions—an average of around 25 percent on each question, though it varied widely—and how the technology impacted their answer choice. For example, students who used AI assistance tended to select the most popular poll question answer.
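The kind of tally Wang describes can be sketched in a few lines. The Python example below is hypothetical (the records, field names, and values are invented for illustration); it computes the share of AI-assisted answers for each poll question from a set of submission records.

```python
from collections import defaultdict

# Hypothetical submission records: one per homework answer.
# Field names and values are invented for illustration.
submissions = [
    {"question": "Q1", "used_ai": True,  "answer": "B"},
    {"question": "Q1", "used_ai": False, "answer": "B"},
    {"question": "Q1", "used_ai": False, "answer": "A"},
    {"question": "Q2", "used_ai": True,  "answer": "C"},
    {"question": "Q2", "used_ai": False, "answer": "D"},
]

def ai_use_rate_by_question(records):
    # Count AI-assisted answers and total answers per question,
    # then return the per-question share of AI use.
    used = defaultdict(int)
    total = defaultdict(int)
    for record in records:
        total[record["question"]] += 1
        used[record["question"]] += record["used_ai"]
    return {q: used[q] / total[q] for q in total}

rates = ai_use_rate_by_question(submissions)
print(rates)
```

With real data, the same per-question rates would also reveal the wide variation Wang observed around the roughly 25 percent average.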
Wang also notes that, anecdotally, allowing students to use AI at home translated into increased engagement and participation within the classroom. “Students often use GenAI to refine and challenge their positions,” Wang says.
Modeling the Growth Mindset
CBS Professor Ashli Carter’s use of text-to-image generative AI in the classroom has allowed her to help students build resilience, find connections, and learn from past failures—critical skills for any rising business leader. As a lecturer in the School’s Management Division, Carter began using generative AI as a teaching tool in 2023, especially in her LEAD: People, Teams, Organizations course sections.
There, Carter says generative AI has proved to be an excellent way of practicing a growth mindset. “Generative AI is actually really good at modeling how we can iterate and learn from failures and mistakes.”
In class, Carter, along with other LEAD professors, asked students to come up with an innovative product and use generative AI to complement their design process, helping them learn from one another and be inclusive in generating new ideas. She also uses the technology as a reflective tool in workshops and Executive Education classes, asking participants to use generative AI to illustrate their inner critic and inner champion. She finds it to be a great way for students to “negotiate” with their mindset and move closer to achieving their goals.
When using any emerging technology, whether as a student or at work, Carter encourages users to “mind the zones” with self-compassion and curiosity. That means recognizing whether and when a certain generative AI technique is outside or within your comfort zone.
“We are all at different points in our learning journey with generative AI. It’s important to notice where you are and think about how you might stretch,” Carter says.
AI for Good—and Bad
Olivier Toubia, the Glaubinger Professor of Business at CBS, wants the business leaders of tomorrow to understand that while AI can be beneficial, it can also create division and inequality when put into the wrong hands.
He says that AI, and generative AI in particular, has the potential to make workers more productive and decrease inequality, all while improving work-life balance. However, it’s crucial to understand that AI can just as easily increase inequality by displacing workers and reducing job opportunities.
“That would be determined by the actions of our leaders of tomorrow. That is why it is so important to have our students think about AI, because they will be the ones shaping the impact of AI on society and business,” Toubia says.
Not as Scary as It Seems
Daniel Guetta, an associate professor of professional practice at CBS, likes to teach students that AI models are not as complex as they might seem at first glance.
“At the end of the day, it’s a whole bunch of very simple operations that get combined in very complex ways to get the answer that you see,” says Guetta, who teaches business analytics and operations management courses.
With that in mind, Guetta has found success in the classroom showing students the constituent parts of an AI model. That starts with building a simpler version in programs such as Microsoft Excel, where Guetta has even built an entire neural network.
By doing so, Guetta is able to demonstrate that simple computations, addition and multiplication, work in tandem to power large AI applications.
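Guetta’s point holds outside Excel, too. The sketch below, a hypothetical Python example with made-up weights, runs a forward pass through a tiny two-layer neural network using nothing beyond multiplication, addition, and a simple max() nonlinearity.

```python
# A tiny two-layer neural network built from nothing but
# multiplication, addition, and a max() nonlinearity.
# All weights are made up for illustration.

def weighted_sum(weights, inputs):
    # Multiply each input by its weight, then add the results.
    return sum(w * x for w, x in zip(weights, inputs))

def relu(value):
    # The nonlinearity: keep positive values, zero out negatives.
    return max(0.0, value)

def forward(inputs, hidden_weights, output_weights):
    # Hidden layer: one weighted sum (plus relu) per hidden neuron.
    hidden = [relu(weighted_sum(w, inputs)) for w in hidden_weights]
    # Output layer: one more weighted sum over the hidden values.
    return weighted_sum(output_weights, hidden)

# Two inputs, two hidden neurons, one output.
hidden_weights = [[0.5, -0.2], [0.3, 0.8]]
output_weights = [1.0, -0.5]

print(forward([1.0, 2.0], hidden_weights, output_weights))
```

Every large model does essentially this, just with billions of weights instead of six, which is what makes the Excel demonstration possible in the first place.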