
335: How AI is Changing Academia with Dave Marchick, Dean of the Kogod School of Business

2025/5/12

AI and the Future of Work

AI Deep Dive

People: Dan Turchin, Dave Marchick

Topics
Dave Marchick: As Dean of the Kogod School of Business, I believe higher education must embrace artificial intelligence rather than treat it as a threat. We are actively infusing AI into all of our courses, from day-one freshmen through the highest-level graduate courses. We are not trying to produce AI specialists; we are producing business leaders who can use AI tools. Our goal is for students to be fluent in AI and to build the key skills for success in the AI era, such as communication, collaboration, and critical thinking. I believe universities that refuse to embrace AI will fall behind, while those that actively adapt will thrive. I firmly believe AI is the future of higher education, and we must be prepared. Dan Turchin: I completely agree. Higher education institutions need to rethink the value they offer. In the AI era, knowledge is becoming ever easier to acquire, so universities need to focus on building the skills students need to succeed at work. What the Kogod School of Business is doing is very exciting, and I hope other universities follow suit. We need to embrace AI and treat it as a tool for improving education and empowering students.

Transcript

Future leaders need to learn AI as a tool for everything they do. I now hear from deans and college presidents and leaders across the country, and many schools are struggling with this. So in many parts of academia, the mandate is still do not use AI. It's cheating. And I would say that is putting your head in the sand.

Good morning, good afternoon, or good evening, depending on where you're listening. Welcome to AI and the Future of Work. I'm your host, Dan Turchin, CEO of PeopleReign, the AI platform for IT and HR employee service. Our community is growing thanks to all of you, our loyal listeners. As I hope you know by now, we launched a newsletter recently. It includes

some fun facts and clips that don't always make it into the weekly episode. I encourage you to click on the link in the show notes, register for that newsletter, and join us for additional content there. We learn from AI thought leaders weekly on this show. The added bonus, of course: you get one AI fun fact each week.

Today's fun fact comes from Jonathan Ford, who writes in Penn State online about a cheat-a-thon contest the university launched to explore AI's strengths and flaws in higher ed. The Center for Socially Responsible AI at Penn State is offering $10,000 in prizes, challenging faculty and staff across the nation to explore the boundaries of generative AI in academia.

Here's how it works. University faculty around the country can submit questions from their exams, projects or assignments that would be difficult for students to answer using generative AI tools like ChatGPT and Gemini.

Questions can come from any subject at the undergrad or grad level. Along with each question they submit, faculty must provide the ideal answer, which would receive full points. University students then select a question submitted by faculty. They may not use other tools like online libraries, search engines, or Wikipedia, but they can use generative AI.

The student responses are compared to the answers from the faculty, and prizes are awarded based on that. My commentary: I applaud the creativity and self-awareness here. Students will use AI to improve the quality of their output whether or not it's encouraged. Using AI doesn't need to compromise the learning experience. Today's students will also be expected to augment their work product with AI in the "real world".
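The article doesn't say how the comparison step is judged, so as a purely illustrative sketch, here's one way a first-pass automated comparison of student responses against the faculty's ideal answer could look. The `similarity` helper and the sample answers are invented for the example, not part of the contest:

```python
# Hypothetical sketch of the contest's comparison step: score each student
# response against the faculty's ideal answer. The actual judging method
# isn't described in the article; difflib's lexical ratio is a stand-in.
from difflib import SequenceMatcher

def similarity(ideal: str, response: str) -> float:
    """Return a rough 0..1 lexical similarity between two answers."""
    return SequenceMatcher(None, ideal.lower(), response.lower()).ratio()

ideal = "Photosynthesis converts light energy into chemical energy."
on_topic = "Photosynthesis converts light into chemical energy in plants."
off_topic = "Mitochondria are the powerhouse of the cell."

# A close paraphrase of the ideal answer should score higher than an
# unrelated response.
assert similarity(ideal, on_topic) > similarity(ideal, off_topic)
```

In practice a contest like this would more likely rely on human graders or semantic scoring; the point is only that each response is measured against the faculty's ideal answer.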

The wrong approach is naively expecting tone-deaf bans on AI to work in academia. The right approach is embracing AI in academia like we're embracing it in industry and everywhere else. This is very germane to the topic we'll be discussing today. And of course, we'll link to the full article in the show notes. Now, shifting to this week's amazing conversation, I've been looking forward to this one. Today's guest is no stranger

to anyone in the worlds of private equity, public service, or philanthropy, and perhaps podcasting as well. He needs no introduction, but I'll introduce Dave Marchick anyway.

He has had a distinguished career as an attorney, businessman, academic, and diplomat. He was recently named Dean of American University's Kogod School of Business. For 12 years prior to that, he was a managing director at the Carlyle Group, one of the world's largest private equity firms. He served on the firm's management committee, helped drive Carlyle's sustainability and diversity initiatives, and was part of the team that successfully took Carlyle public.

In the government sector, Mr. Marchick has served in the Biden and Clinton administrations, including as the senior appointee at the Development Finance Corporation, along with leadership roles at the State Department, the Commerce Department, and the White House. Mr. Marchick launched a podcast called Transition Lab, and he co-authored the book The Peaceful Transition of Power: An Oral History of Presidential Transitions.

Mr. Marchick holds a law degree from George Washington University, a master's in public affairs from the LBJ School of Public Affairs at the University of Texas at Austin, Hook 'em Horns, and a BA from the University of California at San Diego, go Tritons. And without further ado,

Dave, it's my pleasure to welcome you to AI and the Future of Work. Let's get started by having you share a bit more about that illustrious background and how you got into this space.

Thanks so much, Dan. Thanks for hosting me and congrats on everything that you've done, including this podcast with over a million listeners and bringing just incredible conversations about AI to the masses. And I listen and I'm a fan and it's a privilege to be on. So I'm kind of an accidental dean. I don't have an academic background.

I don't have a PhD. I tell people I was in the top third of the bottom half of my class as a student. I was not a perfect student. But I had a great, fun, successful business career. And I found myself retired at age 52, finished with that commercial phase of my life, and wanting to pivot into doing something

that was mission-oriented and in service of others. And I found myself teaching at the Tuck School of Business at Dartmouth and working on the transition. And then AU approached me, and I live a mile away from American University. And I thought it would be both fun, interesting, and educational for me to try to learn how to be a dean and to take the skills that I learned in the private sector into the academic world and try to take

a business school to the next level and support the next generation of leaders. And so here I am, and I'm having a fun time. What was more challenging for you: going from the public sector to private equity, or from private equity to academia? Such different domains, all of them. So I would say that I've applied the same skills to each of the different domains, which are leadership, communication,

branding, positioning, and being excellent at process management, driving teams of people towards a goal. And I've never been the smartest person in the room, but I'm told I've been someone who surrounds myself with people who are smarter, and I can listen to them and then enable them. So

I would say in academia, what I've found is the same skills apply to academia in terms of leadership, communication, positioning. The challenge is academia moves very, very slowly. It's tradition-based, and I've had to learn a lot, but I've also, I think, been able to bring a sense of urgency and

opportunity to the business school. And we'll talk about that today with AI. I think each of the different phases of my career has brought different challenges, but I've applied the same kind of toolkit to each of them and then adapted in different ways. In the opener, I talked about, call it, the challenging relationship that academia has with AI.

Now you're the dean of the Kogod School of Business at American University. What's Dave Marchick's perspective on AI and academia: friend or foe? I think it's a little of both. But it's akin to when the calculator was created and professors or teachers of math said, do not use the calculator because you're not going to learn math. Or when Excel became ubiquitous,

And professors would say, don't use Excel because you're going to lose your ability to enter data into a ledger. So I think that many in academia, frankly, have their head in the sand about the changes that this technology is going to bring to everything that everyone does in every domain, very much like what you said in the opener.

And that we need to embrace it and understand both the strengths and the weaknesses of AI and the challenges it's going to bring. But in the workforce, students who become graduates will be expected to be fluent in AI applications in whatever they do.

And so, you know, we are delivering essentially two products in academia. We're delivering knowledge to students and learners throughout their lifetimes, and we're producing knowledge for the masses through research and the traditional scholarship that faculty produce. In terms of student learning,

Our job is to prepare students to be better when they graduate, to have productive and fulfilling, interesting lives. And in the same way that someone needs to read and write and be a good communicator, future leaders...

need to learn AI as a tool for everything they do. And that's the way we've embraced it. And at my school, I would say I now hear from deans and college presidents and leaders across the country, and many schools are struggling with this. I heard from a leader of a major, highly ranked public university last week who said they put out a call for faculty to embrace AI in the classroom.

And this is a large public university. They got one faculty member to say, I'm charging ahead. So in many parts of academia, the mandate is still: do not use AI, it's cheating. And I would say that is putting your head in the sand. We agree in that respect. And I want to know. So take us inside.

You know, the rooms where the decisions are getting made about curriculum and, you know, you said it yourself, academia moves slowly. Presumably a lot of your faculty are kind of set in their ways. You have the power to intervene. You clearly have an opinion. How do you navigate that conversation with faculty members who may not be ready for it?

So let me tell you the story of why we did this. And like, this was totally accidental. And like you, Dan, in your career, you come up with an idea, but then you're running an organization and then you pivot. And so we had two speakers in the academic year two years ago. We had Kent Walker and Karan Bhatia from Google.

who basically said AI is going to be as profound as electricity or fire. And I said, okay, maybe that's hyperbole, but let's assume it's going to be big. And then we had the CEO of a venture capital firm named Brett Wilson, who runs a venture capital firm in San Francisco that only invests in AI. You may know Brett.

And a student asked Brett a question after his presentation: am I going to be replaced by AI? And Brett said, you won't be replaced by AI, but you could be replaced by someone who knows AI if you don't. And so a light bulb went off in my head right at that moment. And I said, we've got to embrace this. We have to run, not walk. So I went to my faculty and I said, we need to drive this throughout our curriculum.

And they said, great, let's form a committee. And I did like an eye roll. I'm like, okay, this is going to be a typical academic committee. It's going to take two years to produce a 100-page report, and nothing's going to happen. So I said, great, you have six weeks, and give me no more than five pages. And the faculty came back with a fantastic proposal, which said, let's infuse AI into everything we do.

American University is not going to produce engineers or computer scientists like Stanford or MIT. That's not our niche. We're producing business leaders who go into marketing and finance and accounting. And so let's infuse AI into everything we do, starting with the first day that students come for the orientation, ending with the highest level graduate courses and everything in between.

Let's have it in the undergraduate and graduate curriculum, in the core curriculum and electives. Let's create two types of courses, what we call SAGE courses, which are courses with at least 50% AI content, and artisan courses where there's a touch of AI. Let's train our faculty because our faculty needs to be trained on this technology and how it applies. So we brought in private sector experts to train our faculty.

And our faculty, I would say, you know, 70 to 80% jumped at the opportunity to learn and to change. There's a certain subset that says, I've been doing what I'm doing forever, I've been through 15 deans, and you're not going to change me. But I would say most of our faculty have run, not walked, and the impact has been fantastic. And perhaps the most important indicator of the

impact is that the market has responded. For our market, you can look at applications. Our applications were up 22% last year, and they're up 40% this year. And the questions we're getting from students and parents are: how do we learn AI in everything we do? Because that's the future. And so far, it's working.

So we've got to go down a level deeper. The parents and the prospective students say, how do we learn AI? And you said a good portion of your courses are now 50%-plus AI. There are a lot of ways you can incorporate AI: everything from the math and stats that make AI work and how neural nets work, to AI co-teaching a course and authoring curriculum, to encouraging students to use AI tools.

And then I'm just going to leave this one dangling, but I think I talk about this a lot on this podcast. There's the ethics of AI and maybe what it means to be an informed citizen in a world where these AI tools are ubiquitous. I don't want to bait the witness here, but what does it mean to incorporate, infuse AI into the curriculum? Okay, great question. So let's start with a first-year 18-year-old freshman, okay?

They get here for orientation and then their business 101 course. This is how business works, what's a company, the different aspects of a company, operations, sales, etc. This is really basic stuff for 18-year-olds that don't know anything about business. We start with giving them prompts where we know the AI is going to come up with the wrong answer. So the first thing we teach students to do is how to prompt, but then to doubt the output that AI produces.

Because the worst thing that anybody can do is just assume that AI is going to give you the right answer, cut and paste, and that's your answer. Okay. Then think about the sequence of learning: the basics of each aspect of business education, so marketing, finance, accounting, management, etc. Then we go deeper and deeper. So, one of the fundamental

skills that a business student needs to learn is negotiation. So we have AI help them with negotiations. Negotiate with your boss. You've had a million of these conversations in your life, Dan, on how to get more responsibility and more money. Negotiate that with AI. Next assignment: have AI evaluate the strengths and weaknesses of your argument. Okay, next assignment: now have the AI be a jerk. We've all had jerky bosses.

And you need to learn how to deal with jerks. That's part of learning. So have AI be a hostile boss: tell you you're dumb, that you're meaningless, that your work product is not very good, and that you don't deserve a raise or more responsibility. How do you respond to that? Have AI speak in a language that is different from yours. So those are cultural aspects.
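The negotiation assignments described here map naturally onto role-play prompts for any chat-style LLM API. A minimal sketch, assuming a generic chat-completions message format; the personas and wording are invented for illustration, not Kogod's actual course materials:

```python
# Build chat-message lists that cast the model in escalating negotiation
# personas, following the assignment sequence described above. The persona
# text is an invented example; any chat-completion API accepts this shape.

def roleplay_messages(persona: str, opening_line: str) -> list[dict]:
    """Cast the model as `persona` and open the negotiation with `opening_line`."""
    return [
        {"role": "system",
         "content": f"You are role-playing as {persona}. Stay in character."},
        {"role": "user", "content": opening_line},
    ]

ask = "I'd like to discuss taking on more responsibility and a raise."

# Assignment 1: a straightforward negotiation with a reasonable manager.
fair_boss = roleplay_messages("a fair but firm manager", ask)

# Later assignment: the same negotiation against a hostile boss.
hostile_boss = roleplay_messages(
    "a dismissive, hostile boss who belittles the employee's work", ask)

# These lists would then be sent to whatever chat model the course uses,
# e.g. client.chat.completions.create(model=..., messages=hostile_boss)
```

The follow-up assignment ("have AI evaluate the strengths and weaknesses of your argument") is just another message list whose user turn contains the student's own transcript.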

So over the life cycle of their curriculum, they learn how to underwrite an investment, how to do fundamental consumer research, how to design a marketing campaign, how to use AI for entrepreneurship ideation, and how to read a balance sheet or even write part of an S1 to take a company public. Then on top of that, we have doubled down on all the skills that AI cannot do.

Because the data shows that AI will take a bottom-decile writer and make them a median writer. It's not going to create a John Steinbeck, but it'll make everybody kind of okay. What AI cannot do is teach you how to collaborate with a team. It can't teach you how to speak in public and think on your feet, how to make a pitch work,

and respond to tough questions. Dan, you've taken companies public. When you go meet investors, they ask very tough questions. I'm sure that you practiced and knew the answers to every question before you went on your roadshow. So we have doubled down on emphasizing the professionalism and communication skills that are going to be more important in a world of AI, assuming that AI is going to be able to make you a kind of

medium level writer, coder, researcher, analyst. Dave, if you were here interviewing to take my job as host of this podcast, I'd say you're hired.

That was such a satisfying answer. Was that the recommendation in the five-page, six-week review that your faculty came back with? It was part of it. And here's the interesting debate that came out of this in academia, and I'm kind of opening the kimono here. So traditional academics would look at skills like professionalism, communication,

teamwork and collaboration as soft skills. We're here to teach statistics. We're here to teach coding. We're here to teach accounting. So we've had to have a cultural adjustment at our business school because a lot of those skills can be automated. And so the softer skills are actually more important than ever. And in many ways, they're not soft skills. They're skills of the future.

So when I was at Carlyle, it took us nine months, millions and millions of dollars of legal fees, 50 to 70 iterations, and thousands of hours of work to write an S1 to go public. Okay? AI can do that in a couple of days and get you to kind of 70% of where you want to go. So you can take any of the AI tools, or a specialized tool like Hebbia,

And you can say: here's all our data, here's our pro forma, here's a description of our business, here's our strategy. Take Blackstone's and KKR's and Apollo's S1s and give us a draft S1 modeled after them with all of our data. Okay? And then ask me questions where there are holes.

And so AI, like it could have saved us extraordinary amounts of time and money. We still would have had to have the teamwork, the analysis, the pitch, the shaping, but it's an incredible tool that can create incredible efficiency. And every student needs to know the strengths and weaknesses and the way to use it. And that's what we're teaching. We want them to be fluent in all AI applications.

That wasn't a plant, but George Sivulka, the CEO of Hebbia, is a recent great guest on the podcast. I listened to that podcast. He's great. Another great recent guest talking about AI and academia is a gentleman named Chris Caren, the CEO of a company called Turnitin. And Turnitin is probably the most used platform for plagiarism detection in higher ed.

I would call it kind of the hammer in the whack-a-mole game. The assumption is, you know, students will cheat, will try to violate the policies, and here's how to detect it. What I hear from you is a different approach. It's, here's how we...

are going to train you to partner with AI because it's a skill that you need. We're not here to penalize you for using it or make you feel guilty for using it. Was that intentional in your strategy? That is the goal. It's a cultural shift. So, you know, like on day three of freshmen being at our school, they kind of approach a lot of our faculty and say,

do you really want me to use this AI tool? Because in high school, I was told it was cheating, and I don't want to get in trouble, so I'm asking you quietly. And the professors have to say, yes, you're mandated to use it. On AI detection, I'd be interested in your view, Dan. I don't know if AI is going to be good enough to detect AI.

I mean, I have run materials that I wrote personally through some of the AI detectors, and they suggested that 50% of it was written by AI, and it wasn't. So I guess our assumption at our school is that everybody's using AI for everything (this is kind of like your opening), and we just assume they're using it and adjust

based on that. Now, if a student produces something that clearly came from the equivalent of Wikipedia and turns it in as their assignment, they're going to get busted and called into the professor's office. But we train our students not to do that. Look, in any academic setting, there's a certain number of students who are cheating. They were cheating before AI, and they'll cheat after AI. There are services all over the world for students writing their applications to

colleges where an advisor is being paid to help them. How much of it is the student's work? How much is the advisor's work? It's happening. AI doesn't change it; it just makes it a little easier. And we just assume that everybody's using AI for everything. If it replaces the learning process, that's bad. But if it accelerates the learning process, that's good. As educators, we should

celebrate learning tools that nurture, enhance, and augment the learning process. And I think, given that that's the world students are graduating into, the last thing in the world we want to do is make them feel like they're cheating by using a tool they're going to be expected to use in the workplace. 100%. And we're maybe two weeks away from announcing the next phase of our AI journey: we're going to have a tool on every student's

laptop, desktop, workstation, and mobile device, an AI tool which will allow for kind of mini LLMs to be created for each class. It'll allow students to create their own tutors. It'll allow students to create their own workgroups with AI. It'll allow our professors to create their own kind of environment where they can upload

all of their materials, past tests, worksheets, and students can use AI to educate themselves. And in the same way that you and I, when we were young, we would have study groups, and I always try to find someone that's smarter than me. Now students can use AI and create their own tutors. They can look at

previous tests that faculty have given and say, give me 10 questions and answers that I can use to prepare for my test. So we're trying to create tools that will enhance learning, not replace learning.
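The "create your own tutor" idea reduces to grounding a model in uploaded course materials and asking it for practice questions. A minimal sketch, assuming a plain prompt-stuffing approach; the helper name, prompt wording, and sample materials are invented for illustration, since the school's actual tool isn't described:

```python
# Build a tutoring prompt that asks an LLM for practice Q&A pairs grounded
# only in the supplied course materials. All names and wording here are
# illustrative assumptions, not the actual classroom tool.

def practice_question_prompt(materials: list[str], n: int = 10) -> str:
    """Join course materials and request n practice questions with answers."""
    corpus = "\n\n---\n\n".join(materials)
    return (
        f"Using ONLY the course materials below, write {n} practice exam "
        f"questions, each followed by a model answer.\n\n{corpus}"
    )

prompt = practice_question_prompt(
    ["Week 3 notes: net present value discounts future cash flows to today.",
     "Past midterm: define WACC and explain when to use it."],
    n=10,
)
# `prompt` would then be sent to whatever chat model the class environment
# wraps; a real deployment would more likely retrieve over the uploaded files
# rather than stuff everything into one prompt.
```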

So AI as a concept has been around since the 50s. John McCarthy coined the term at Dartmouth, which I guess is not your alma mater, but you taught at Tuck, Dartmouth's business school. The term was coined there in the mid-1950s. But it really wasn't until November of 2022, when ChatGPT from OpenAI was launched, that the world encountered this

fascination with the future of AI. Now, take that 18-year-old. Let's say they came in when the curriculum was available, even in 2023; I don't know exactly when it was. But presumably we're not yet at the point where a cohort of 18-year-olds has gone through

a full curriculum and graduated with this set of learning techniques, how do you think about measuring on a longitudinal basis? How are you going to know if this approach, this progressive approach to embedding AI in the curriculum is working? That is a great question. And the way I'd answer it is this. We don't need to know yet. This is an experiment.

Part of what I've done with my faculty is to say, in business, you take risks all the time and fail. And in business, you kind of make stuff up as you go along. You have a thesis, you test it, you adjust. We're doing that. So we are 18 months into this process. We have not yet graduated one student who came in

as a freshman under our new AI-infused curriculum. We're on a journey. So I would say, ask me in three or four years. We did engage with Google in a recent conversation, and are continuing to engage with them, around the exact question that you asked. And they basically said, what you're doing is incredible. We're not seeing other schools do it. How are you measuring outcomes?

And I was there with several faculty, and I kind of gave an answer, and then the other faculty gave an answer, and I stopped and said, you know, we don't really know the answer to that. We have to figure it out, and we'd love to work with you. So, you know, if you or any of your listeners have ideas for how we can better measure outcomes, that would be great. We need to figure that out. And we do know anecdotally

that, for example, our entrepreneurship faculty are saying the quality of ideas that students are coming up with for new businesses is materially better with AI. Because you can go to ChatGPT or Perplexity or any of the other tools and say, you know, give me 100 ideas for a new business I can create with this pair of scissors. And it'll come up with 100 ideas much faster than you and I can come up with them.

98 of those ideas might be terrible, but two of them might be good. And then you work those ideas in the same way, Dan, that you built companies and grew them and sold them. You had a thesis, and I'm sure the original idea for your company pivoted after testing it. And that's how we're teaching students to use AI. So how do you measure outcomes? We're figuring that out. We haven't figured it out yet. And we're really open to ideas.

Let's take the flip side of this conversation. So I think you're way ahead of where most in academia are in terms of embracing AI. What do you say to prospective students who say, what's the value of that degree? Because

In the real world, I'm going to be vastly accelerated beyond what I could learn in any textbook. Maybe higher ed is dead. Maybe I can't justify the $400,000 or $500,000 when the tools are getting so good and I may not need the degree when I go to work. I think that is an existential question that every academic leader should be grappling with. If you look at the data,

More and more people doubt the value of a traditional degree, and AI will accelerate that. So as a business leader, Dan, you would say, okay, what do we do about this? We can improve our marketing. We can lower our price.

or we can improve our product. Okay, my view, and what we're trying to do at the Kogod School of Business, is we have to improve our product. We have to do better. We have to add more value in order to justify our existence and the high cost that students pay. And so far, the market is responding positively, with an increase in applications. But

Schools are going to find themselves disintermediated. And there's already demographic pressure on universities in the United States. There's cost pressure. There's gender pressure: the number of males going to college is much lower than the number of females going to college. The Harvards and the Stanfords and others have market power, and they're going to be fine. But schools somewhere in the bottom half,

bottom quartile, or bottom decile in terms of value added are going to go out of business. Most schools don't have someone with a business sensibility who says academia is governed by market forces, and it's up to you to put a good product in the market. Because, yeah, you are competing, and maybe it's not going to be with other higher ed institutions. Maybe it's higher ed versus not higher ed.

I hope you get an opportunity to share that perspective. I've got two kids that are going to be in college soon, and certainly as a parent, I'm wondering where they're going to go. I mean, we're governed by supply and demand, okay? And in an environment with lowering demand, based on demographics and based on perception of value,

there's going to be a flight to quality, in the same way that in your businesses there's a flight to quality. And so, again, the tools at our disposal are better marketing, lower prices, or improving the product. I'm not aware of any university lowering their tuition. And so we are really looking inward every day, saying, what can we do better to serve our students and support them?

So imagine one version of the marketing pitch is, if you were to put on your Carlyle Group hat and say, I'm hiring for the best and the brightest: come to Kogod, and here's why you'll be attractive, because this is what I look for in private industry. What does that mean?

If you actually were to roll back the clock and you're at Carlyle, what is it about the education you're providing students at American University that you think makes them compelling candidates? So if you think about it from a business perspective, the tools at our disposal to create better outcomes start with faculty. Okay.

And we're taking a very, very strategic approach to picking spots where we can be excellent. There are 600 business schools in the United States, and there are thousands and thousands of universities. And there are only a few Harvards, Stanfords, Whartons, and Yales. So what I've said is we have to pick our spots to be excellent, to be truly world-class. And we're picking three or four areas where

we can be among the best in the world. So those are sustainability. We have one of the best sustainability programs in the world. 8% of our investment has been in AI-related faculty. And because of our innovation, I'd say we've been able to punch above our weight in terms of attracting higher quality faculty than we normally would. Obviously, if you're an academic and you get an offer from Stanford or Harvard or Yale,

You go there. That's the cream of the crop. Okay. So we're competing with Stanford, Harvard, and Yale, but we're also competing with other universities for the best quality faculty. And because of our innovation and leadership, I'd say we've been able to hire better faculty than a school like American traditionally could.

and attract fantastic faculty members who are going to have a greater impact on students and also produce the scholarship that helps create knowledge and enhance our reputation. But we're being very, very strategic in allocation of capital in the way a business would.

and investing in the areas of strength like a business would. Many in academia approach it as kind of you don't want to make anybody mad and you have to have peace in every valley. And we're taking the opposite approach, which is we're going to pick three or four verticals

AI, sustainability, finance. We have this niche in business and entertainment, and in entrepreneurship. And we're going to invest, invest, invest in those and be world-class in a few areas. And then there will be spillovers into other parts of the university, so that it creates an overall upward trajectory and both improves our brand and produces better, more attractive students.

This one has sped by, but unbelievably, we're about out of time, and I think we hit on about zero of the topics that we had teed up. But you're going to have to come back. I hope you'll take me up on the offer. I would love to. I'm grateful for the opportunity. I love what you're doing. And to have a million listeners is unbelievable. I mean, it's just...

From my old podcast days, I have podcast envy. So it's really impressive what you've done. Well, I'm actually, I'm not letting you off the hot seat without answering one last question for me. So not only were you a very successful former podcast host, but you're an expert in the field of presidential transitions. You wrote the book on it.

And we had a great former guest, Tom Wheeler, the former head of the FCC. He wrote a book called Techlash. It was such a fun discussion. He's a national treasure. But he talked a lot about some of the parallels between the Gilded Age and what we're encountering now. I want the listeners to learn a little bit from you, even outside of the topic of AI. Sure.

What did your research for the book and your podcast and all the learning that you did teach you? What can we learn about the time that we're going through right now? What does history tell us? So history tells you that a presidential transition is one of the most perilous periods for United States preparedness, because unlike in a business or a university or in any other organization,

When you have a transfer of power in the United States, the entire top ten layers of the organization depart. So 4,000 political positions are vacant and on January 20th, an entire new set of people come in. And it's impossible to get people in place in their seats at the right time fast enough. So let me give you an example.

In 2001, George W. Bush became president. He had a shortened transition because of Bush v. Gore in Florida. So the normal presidential transition is 75 days-ish. Because of the litigation, he was not declared president-elect until December 13th of the year 2000. So he had 35 days. Eight months after he took office,

9/11 happened and he only had 50% of his national security team in place. And when the 9/11 Commission did its autopsy of what happened leading up to that day and on that day, they said the fact that he did not have his whole team in place impaired our ability to respond. It didn't have an impact on preventing the tragic day, but it had an impact on our reaction time, okay?

So fast forward eight years later, President George W. Bush said, I don't want the next person to have what happened to me. So he rolled out the red carpet on an equal basis for the Republican nominee, who was then John McCain, and the Democratic nominee, Barack Obama. Okay, and in the fall of 2008, what happened was there was a financial crisis.

right at the time of the presidential election, which peaked during the transition. And George W. Bush's cooperation with Obama helped smooth the handoff, mitigate the damage, and also accelerate the recovery on the other side. We saved the auto industry. We saved a number of banks. We injected huge amounts of money into the economy as stimulus. Now fast forward to the two Trump transitions.

This is not a partisan statement because I just said George W. Bush basically laid the gold standard for... Fast forward to Trump on the way in and on the way out, and then on the way in again, those have been three of the worst presidential transitions in history and have created chaos. And so I guess the main lesson is like any business transition, any organization transition,

The transition of power in the United States is a critical period of time where continuity and effective management have a dramatic impact on the American people, and therefore it should be done well. And sometimes it is not done well, and that hurts all Americans, regardless of their political persuasion. That is fascinating. You started off with the adjective perilous, and I wasn't sure where you were going with that, and I hadn't thought about it in those terms.

I'd say chilling is another adjective. It's an interesting topic and very important for our country. I'd also say you should write a book on that, but you really did. Let the audience know, what's the book called? The book is called The Peaceful Transfer of Power: An Oral History. And actually what I did is I took a lot of our podcast transcripts and I published them with people like James Baker, the former Secretary of State, and

Josh Bolten, who was the Bush transition head and chief of staff. And Chris Christie, who managed the Trump transition. So it's a fun book. And if you like politics and you like management, it blends the two of those and history. And so...

Buy the book. And you didn't have the benefit of AI to assist you in editing. I didn't. It would have made it a lot easier. Good. Well, Dave, we're out of time. This has been fantastic. Please take me up on the offer. Come back. Let's continue the conversation and just all the best of luck to you and to the students that are part of the Kogod School. Thank you so much, Dan. Thanks for having me and thanks for what you do.

Well, that's a wrap for this week on AI and the Future of Work. As always, I'm your host, Dan Turchin from PeopleRain. And of course, we're back next week with another fascinating guest.