When Dan Wang, Columbia Business School's Lambert Family Professor of Social Enterprise, began encouraging discussion in his Technology Strategy course in 2023, he tapped into a centuries-old teaching model known as the tutorial system.
Popularized by academics at the University of Oxford and the University of Cambridge in the 1800s, the model sees professors lead students in small, frequent discussion sessions where students are expected to communicate, critique, and interrogate their ideas. In Wang’s course, the system was simple but rigorous: Students analyzed case studies and argued for prescriptions, defending their positions and building critical thinking skills along the way. The practice was a success, according to Wang, with positive spillovers into the classroom in the form of high engagement and intense conversation.
There was just one major difference between the model used in Wang’s course and the one used by 19th-century scholars: In the Technology Strategy course, the students’ discussion partner was wholly AI. Wang, together with former CBS student Johnny Lee ’23, developed and hosted a portal on the machine learning platform Hugging Face that let students hold a dialogue with an AI chatbot fine-tuned on Wang’s notes, a relevant case study, and additional research material.
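The article does not publish Wang and Lee’s implementation, but a minimal sketch of such a portal might look like the code below, which assumes a Gradio chat app of the kind commonly hosted on Hugging Face Spaces and grounds the model in the course material through its system prompt, a lightweight alternative to weight-level fine-tuning. The model choice and file names are placeholders.

```python
# Illustrative sketch only; not Wang and Lee's actual code.
# Assumes a Gradio chat app hosted on Hugging Face Spaces that grounds a hosted
# chat model in course material via its system prompt. The model and file names
# are hypothetical placeholders.
import gradio as gr
from huggingface_hub import InferenceClient

client = InferenceClient(model="meta-llama/Llama-3.1-8B-Instruct")  # placeholder model

# Course context: lecture notes, the assigned case study, extra research material.
paths = ["notes.md", "case_study.md", "research.md"]  # placeholder file names
context = "\n\n".join(open(p, encoding="utf-8").read() for p in paths)

SYSTEM_PROMPT = (
    "You are a rigorous discussion partner in a Technology Strategy course. "
    "Challenge the student's analysis and push them to defend their prescriptions, "
    "drawing on the course material below.\n\n" + context
)

def respond(message, history):
    # Rebuild the conversation each turn: system prompt, prior turns, new message.
    messages = [{"role": "system", "content": SYSTEM_PROMPT}]
    messages += [{"role": m["role"], "content": m["content"]} for m in history]
    messages.append({"role": "user", "content": message})
    reply = client.chat_completion(messages=messages, max_tokens=512)
    return reply.choices[0].message.content

# Each student gets a simple browser chat window backed by the grounded model.
gr.ChatInterface(respond, type="messages", title="Discussion Partner").launch()
```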
“It’s a precise conversational partner that students are engaging with,” says Wang, who is also co-director of the Tamer Institute for Social Enterprise and Climate Change.
Providing a unique debate partner is just one of the many ways CBS faculty are integrating AI into the School’s curriculum. At the same time, faculty are ensuring that AI use in the classroom remains ethical and that students understand the emerging technology’s limitations.
Modeling the Growth Mindset
For CBS Professor Ashli Carter, using text-to-image generative AI in the classroom has been a way to help students build resilience, find connections, and learn from past failures, all critical skills for any rising business leader. A lecturer in CBS’s Management Division, Carter began using generative AI as a teaching tool in 2023, particularly in her sections of LEAD: People, Teams, Organizations.
There, generative AI has proved to be an excellent way of practicing a growth mindset, Carter says: “Generative AI is actually really good at modeling how we can iterate and learn from failures and mistakes.”
In class, Carter and other LEAD professors asked students to come up with an innovative product and use generative AI to complement their design process, helping them learn from one another and be inclusive in generating new ideas. She also uses the technology as a reflective tool in workshops and Executive Education classes, asking participants to use generative AI to illustrate their inner critic and inner champion.
When using any emerging technology, whether as a student or at work, Carter encourages users to “mind the zones” with self-compassion and curiosity. That means recognizing whether and when a certain generative AI technique is outside or within your comfort zone.
“We are all at different points in our learning journey with generative AI. It’s important to notice where you are and think about how you might stretch,” Carter says.
AI After Class
For Professor Dan Wang, integrating AI into his curriculum takes myriad forms, from the aforementioned discussion partner to a full-on scenario simulator. Just a few months after the launch of ChatGPT, Wang began integrating AI into assignments in his Technology Strategy course, which explores how firms capture value, deal with competitors, and navigate the social and environmental implications of new technologies. Throughout the class, students are presented with case studies and dilemmas that serve as vehicles for learning how to apply frameworks, marshal quantitative data, and ultimately gain fluency in the logic of technology.
For homework, Wang asks students to vote on related multiple-choice poll questions and explain their answers, a common pedagogical assignment Wang found ripe for AI shortcuts: ChatGPT’s launch meant that students could simply plug in the questions and generate an answer in a few clicks. Rather than fight this, however, Wang added a clause to his syllabus that not only allowed students to use a model-based chatbot for poll question assignments but encouraged them to do so, as long as it improved the quality of their work, did not misrepresent their thinking, and was noted in their follow-up explanation.
With a data set of 2,200 homework submissions to analyze, Wang gleaned valuable insights, such as how many students chose to use AI to answer questions (an average of around 25 percent per question, though the rate varied widely) and how the technology affected their answer choices. For example, students who used AI assistance tended to select the most popular answer to each poll question.
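The article does not describe Wang’s analysis pipeline, but the two findings above could be computed along the following lines, assuming a hypothetical table with one row per student response, a disclosed-AI-use flag, and the chosen answer. All file and column names are invented for illustration.

```python
# Hypothetical sketch of the kind of analysis described above; the real data set
# and schema are not published. Assumes one row per student response with columns:
# question_id, student_id, answer, used_ai (True if the student disclosed AI use).
import pandas as pd

df = pd.read_csv("poll_responses.csv")  # placeholder file name

# Share of responses disclosing AI use, per question
# (the article reports roughly 25 percent on average, varying widely).
ai_rate = df.groupby("question_id")["used_ai"].mean()
print(ai_rate.describe())

# Did AI users pick the most popular answer more often than non-users?
modal_answer = df.groupby("question_id")["answer"].agg(lambda s: s.mode().iloc[0])
df["picked_modal"] = df["answer"] == df["question_id"].map(modal_answer)
print(df.groupby("used_ai")["picked_modal"].mean())
```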
Wang also noted that, anecdotally, allowing students to use AI at home translated into increased engagement and participation within the classroom. “Students often use GenAI to refine and challenge their positions,” Wang says.