
Lindsay Weinberg, "Smart University: Student Surveillance in the Digital Age" (Johns Hopkins UP, 2024)

2024/12/21

New Books in Critical Theory

People
Lindsay Weinberg
Topics
Michael LaMagna: Explores the concept of the smart university, the use of predictive analytics in student recruitment and retention, and the privacy and equity problems these raise. He voices concern about how student data is collected and used, raises questions about the limitations of FERPA, and discusses the role of faculty in the smart university, the challenges they face, and how higher education institutions should respond.

Lindsay Weinberg: Elaborates on the concept of the smart university, noting that it involves not just the adoption of digital tools but broader imaginaries and ideologies about the future of higher education. She critically analyzes the use of predictive analytics in student recruitment and retention, pointing to algorithmic bias and discrimination against disadvantaged groups. She discusses student privacy, the limitations of FERPA, and the ethical problems with university and researcher access to student data. She also analyzes the smart university's impact on faculty working conditions and the broad, far-reaching effects of AI on higher education. Finally, she calls on students and faculty to organize against repressive digital technologies, to rethink the purpose of higher education, to advocate for more democratic integration of smart tools, and to rethink higher education funding and policy.


Key Insights

What is the central argument of Lindsay Weinberg's book 'Smart University: Student Surveillance in the Digital Age'?

The book argues that the integration of digital technologies in higher education, such as predictive analytics and AI, is part of a longer history of universities supporting technologies that reproduce racial and economic injustice. It critiques the smart university concept as a top-down initiative that often prioritizes cost savings and control over student and faculty autonomy.

How does Lindsay Weinberg define a 'smart university'?

A smart university refers to the integration of digital technologies like predictive analytics, IoT, and AI into college campuses. These technologies are used for student recruitment, retention, campus security, and mental health tracking, often under the guise of data-driven governance. The term also reflects broader ideologies about the future of higher education, often shaped by administrators or private tech vendors.

Why are smart university initiatives attractive to university administrators?

Administrators are drawn to smart university initiatives for their promise of cost savings, efficiency, and control. These technologies provide a way to visualize large amounts of data, guiding decisions about student recruitment, retention, and faculty performance. However, this often comes at the expense of student and faculty privacy and autonomy.

How are predictive analytics used in the student recruitment process?

Predictive analytics are used to create risk scores for students, determining which ones should receive the most recruitment efforts. This involves analyzing historical data on successful students, including demographic information, website engagement, and email responsiveness. The goal is to optimize recruitment strategies to maintain enrollment numbers and institutional competitiveness.

What are the privacy concerns related to student data in smart universities?

Students often unknowingly sign away their privacy through dense terms of service agreements. Universities collect detailed data on students, including behavioral and demographic information, which is frequently shared with third-party for-profit companies. This raises concerns about data misuse, especially since FERPA (Family Educational Rights and Privacy Act) facilitates public-private data sharing under the guise of providing educational services.

How do smart university technologies perpetuate racial and economic discrimination?

Smart university technologies often rely on historical data that includes biased demographic information, such as zip codes, which are proxies for race and class due to historical segregation. Predictive models assume that student success is based on individual choices, ignoring structural barriers like financial constraints or inadequate K-12 preparation. This reinforces existing inequalities in higher education.

What role does AI play in the smart university, and how does it impact students and faculty?

AI is increasingly used in smart universities for facial recognition, predictive policing, and monitoring student and faculty performance. It impacts research priorities, course content, and decision-making about promotion and tenure. AI tools also shape how students are prepared for the workforce, often prioritizing efficiency and control over ethical considerations.

What are some positive aspects of the smart university, according to Lindsay Weinberg?

Weinberg acknowledges the potential for digital tools to support students, such as auto-captioning for accessibility. However, she remains skeptical of most smart university initiatives, as they are often top-down, lack student and faculty input, and prioritize cost-cutting over meaningful educational improvements.

What future projects is Lindsay Weinberg working on after her book?

Weinberg is focusing on how faculty working conditions are being impacted by digital technologies, such as performance monitoring software and AI integration. She is also exploring how unionization in other sectors could provide solutions for protecting faculty rights and academic freedom in the age of AI.

Chapters
This chapter explores the increasing reliance on technology in higher education, examining its impact on students and the academic community. It introduces Lindsay Weinberg and her research on the intersection of technology and higher education.
  • Increasing reliance on technology in higher education.
  • Weinberg's background in targeted advertising and its relevance to higher education.
  • The start of Weinberg's work on the book Smart University.

Shownotes Transcript





Welcome to the New Books Network. I'm your host, Michael LaMagna. Today, I'm joined by the author of Smart University: Student Surveillance in the Digital Age, published in 2024 by Johns Hopkins University Press.

Higher education increasingly has become reliant on tech solutions and systems to provide greater access to resources and services, produce efficiencies and operations, and theoretically increase the quality of the educational experience for our students. This reliance on tech solutions and digital tools comes at an expense to students and the academic community.

Joining me to discuss this book is author Lindsay Weinberg, a clinical assistant professor and the director of the Tech Justice Lab at Purdue University. Welcome to the podcast, Lindsay. Thank you so much, Michael. I'm so excited to chat. So before we get started talking about your book, Smart University, I was hoping you can tell us a little bit about yourself, your background and your career path.

Yeah, so I did my PhD in the history of consciousness program at University of California, Santa Cruz. And I had done my dissertation on the history and ethics of targeted advertising. So I was really interested in understanding like where does the phenomenon of targeted advertising come from? What's its relationship to kind of cultures of consumerism and issues of mass production?

And I was really interested in tracing its history and its present day implications. And I also studied how targeted advertising can cause issues of discrimination online in terms of who's targeted for particular goods and services. I looked at issues of data privacy as well. And then finally, exploitation. So how is user data increasingly swept up into systems that benefit corporations at the expense of kind of everyday folks like you and me?

And so when I started my postdoctoral fellowship at Purdue, I actually was tasked with working with administrators at the university and kind of mapping the latest innovations in pedagogy. And so many of these innovations were technology. And what I found is many of the types of tools I had studied as a graduate student in the context of targeted advertising and e-commerce were

now being seen as ways to solve problems in higher education, whether that's targeting students with ads to get them to come to your university or trying to use predictive analytics to model for student success. Many of the types of issues around exploitation, discrimination, and data privacy felt very relevant to the higher education sector as well. And it was through that postdoc that I started working on the book. And then I finished the book as I've been doing kind of my professorship at Purdue as well.

Wow. So you could see what really sparked that interest in that movement from consumer culture all the way into how this is impacting higher education. So as we get ready to jump into your book, I was wondering, you know, how do you define or what do you mean by a smart university?

Yeah, I think I try and break it down in the book in a few ways. So one is just a way of naming all of the ways that digital technologies are being increasingly integrated into college campuses. And this could look like predictive analytics, the Internet of Things, artificial intelligence,

data, what's called data-driven governance in higher ed. And I use air quotes on that for the listeners and the audience. I think one of the reasons I do that right is because it's never really the data driving the decisions. There's always a human being or group of human beings who are involved in deciding what data is being counted, how it's being counted and what it's being used to do.

But nonetheless, the smart university is really trying to name this transformation toward an increasingly digitally mediated higher education sector. And that plays out in a number of ways. It could be at the level of student recruitment, student retention. It could be an influx of digital tools into campus security. And I also talk about in the book the rise of kind of self-tracking mental health apps on college campuses as well. So it can really touch on

all aspects of the teaching and learning experience, and it has implications for faculty labor in higher ed as well. And I think maybe one unique feature of the smart university is that it's not merely this kind of suite of digital tools being used on higher education campuses. It's tied to broader imaginaries or ideologies about what the future of higher education is supposed to look like. And so the question becomes

whose imaginary are we being trapped in as either students or faculty? And so the book also tries to map, when it comes to proponents of smart university initiatives, who those folks are, who those voices are. And it's oftentimes campus administrators or private education technology vendors, and sometimes university researchers and IT professionals if the tool is being developed in-house.

And then I guess the last thing I'll say about the rise of these smart universities is that the idea is really to be able to hedge against uncertainty. And that uncertainty might be demographic, when we think about precipitous declines in enrollment. It might be environmental, when we look at smart university initiatives that are really about using sensors to manage waste or water consumption on campus. Or it might be financial. And that's very much bound up in the kind of

defunding of higher education itself and a lack of kind of robust state and federal support for higher education as a public good. Very interesting. And so we could see how you came into this work when you arrived at Purdue. And, you know, the one thing I have to think about is what makes creating a smart university so attractive to administrators? And you kind of touched on some of that, but I was wondering if you can expand.

Yeah, I think part of it is about cost savings and efficiency. So again, this idea that administrators are having to make do with less and less resources. And so data provides this kind of untapped, almost seemingly raw material for higher education administrators to be able to use to make decisions.

I think the other thing that a lot of these tools promise is a way of visualizing at scale very, very large and vast amounts of data to be able to guide those decisions as well. So when we think about tools that are providing dashboards with all of these metrics and color-coded indicators that can purportedly be used to make these very difficult choices, I think part of the promise is control, and in particular, a form of administrative control.

that relies on a tremendous amount of data collection and surveillance over students and faculty. So I think it's both of those things. It's about efficiency, it's about the promise of cost savings, and it's about the promise of control over students and faculty.

Very interesting. And you talked a little bit about that demographic cliff that we're coming to, right, or we're actually at right now. And, you know, as we think about that in relation to predictive analytics, right, we can see why this is very much of interest, especially for administrators thinking about how to recruit students, the recruitment process and how to get them enrolled. And I was wondering if you could talk a little bit about how this smart university, these predictive analytics, these different tools are being used in the recruitment process.

Yeah, absolutely. So I think one of the interesting things that the book talks about, particularly in the wake of COVID, is there's this period where students are not having to sit for exams. And so there's less data that can be bought from the College Board. And I think many listeners might not realize that universities buy a tremendous amount of data from the College Board, which gathers information about students who sit for the standardized exams for admissions, to be able to kind of figure out who to target for

for enrollment, or to try and target students that they know they're going to reject because they want to inflate how competitive the university seems according to these national metrics, right? So there's already a precedent of universities collecting data to try and target and market themselves to particular types of students.

But I think one of the things that that moment created was more demand for different types of data sets that could be used by admissions officers to try and figure out what types of students to target with recruitment materials. And so there's a number of for-profit ed tech solutions that are now being marketed to university administrators to basically create risk scores for which students should receive the most investment from an admissions officer.

And many of those ways of allocating risk are based on predictive analytics. So they're using historical data sets based on students who've been successfully recruited, enrolled, and retained to try and guide decision-making around which students to target. And some of that data that they're using is not only demographic data, but it could be how often the student engages with the university's website for the department that they're saying they want to attend, or how

quickly the student opens a university email in addition to things like GPA, right? So there's all kinds of ways in which biases can creep into these predictive analytics, but nonetheless, that's the goal, right? It's trying to look at this problem of a precipitous enrollment decline and think about who can we target to make sure that they come to our campus or who can we target so that we can reject them to make sure that we remain seemingly competitive on a national scale.

I'll offer a little personal anecdote here. I have a college junior and a high school senior, and the advice that I gave both of them was make sure that you're engaging with the website, opening emails, attending those open houses for this exact reason so we can see how this system can be gamed, really, if you know what the practice is.

And one of the troubling things with predictive analytics is the underlying algorithms that basically perpetuate existing forms of racial, economic, and social discrimination. I was wondering if you could talk a little bit about that.

Yeah, so one of the data points that can also be used, particularly on the recruitment side, is zip codes. And we already know that zip codes are oftentimes a proxy for things like race and class because of longstanding histories of segregation in the United States. So where people tend to live, where their communities tend to be, can really reflect longstanding histories of redlining.

and racial segregation and economic segregation. So already there's ways that things like racial and economic bias could be creeping into a dataset if universities are trying to look at zip codes where clusters of students are applying and successfully being retained. So that's just one example that we could look at, but we can also think about it in the context of success models for student retention.

again, they're building on data sets that are looking at students who've successfully persisted throughout higher education. And it almost makes this assumption that the reason those students have persisted is because of their own individual choice-making behavior. And so if the institution could just monitor these students successfully, monitor the types of choices they make in college, whether that's the type of

classes they take or how frequently they go to the library, then they can nudge other students to make those same decisions, right? And this completely overlooks all of the structural barriers that contribute to why many students drop out, many of which are financial, many of which are related to a lack of adequate preparatory structures in K through 12 or institutional failures to provide adequate

financial support and mentorship once those students are actually enrolled. So that's just another example of how racial and economic bias play out in the context of predictive analytics here.

And you really hit on an important thing, and that's kind of how students are being labeled in these systems, right? And so as we're thinking about it, institutions are using a number of platforms, whether it's for recruitment, for retention, making the different decisions on tracking progress for students. Right.

And a lot of these systems will label at-risk students based on some of this historical data that we know could possibly be flawed. So how do these systems actually work to make these determinations?

Yeah, I think it's really interesting because there's so many examples of predictive analytics that are also being designed not only by kind of outside for-profit companies and then marketed to the higher education sector, but also former criminologists who are then building predictive models for student success, but they're importing all of the categories and philosophies that are essentially coming out of the criminal legal system. So I think just...

that's an interesting thing to be thinking about, right? We already have a criminal legal system that's racially biased, and the categories of risk tend to lean into very punitive and carceral systems. And I think we see the same type of parallels when we look at the type of stigma that might be attached to being labeled

as a high risk student. But to answer your question about kind of how these systems work, right, they're taking a tremendous amount of data based on how students have historically performed and then looking at current students and how they compare. So the data could be coming from the learning management software and how the student is engaging with the platform

that their courses are using. It could be things like their attendance records. It could be library visits. It could be high school GPA and other markers that are coming in to try and indicate where that student showed up in higher ed and then where they're going. But a lot of it

is also behavioral data. So how they choose to spend their time on campus, how often they're engaging with particular campus resources and opportunities. And then the goal is to mark those students for some kind of intervention or nudge. Sometimes those nudges are automated. Sometimes it's an indication that an advisor is going to step in and counsel that student.
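As an editorial illustration of the mechanics described here (this is not a model from the book; all feature names and data are invented), a retention "risk score" of this kind is often just a classifier fit on historical records of who persisted, then applied to current students:

```python
# Illustrative toy only: a minimal retention "risk score" of the kind described,
# fit on hypothetical historical records. All feature names and data are invented.
import math

def sigmoid(z):
    # Numerically stable logistic function.
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

def fit_logistic(rows, labels, lr=0.1, epochs=2000):
    """Plain-Python logistic regression via stochastic gradient descent.
    rows: feature vectors for past students; labels: 1 = persisted, 0 = dropped out."""
    w, b = [0.0] * len(rows[0]), 0.0
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            err = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def risk_score(w, b, x):
    """Higher = higher modeled risk of NOT persisting."""
    return 1.0 - sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

# Hypothetical history: [weekly LMS logins, attendance rate, high-school GPA / 4.0]
history = [
    [9, 0.95, 0.90], [8, 0.90, 0.80], [7, 0.85, 0.90], [6, 0.80, 0.70],
    [2, 0.50, 0.60], [1, 0.40, 0.50], [3, 0.55, 0.60], [2, 0.45, 0.70],
]
persisted = [1, 1, 1, 1, 0, 0, 0, 0]

w, b = fit_logistic(history, persisted)
engaged = risk_score(w, b, [8, 0.90, 0.80])     # resembles past "successes"
disengaged = risk_score(w, b, [2, 0.50, 0.60])  # resembles past dropouts
```

The critique Weinberg raises is visible even in this sketch: the model can only reward resembling past persisters, so any bias in the historical data is reproduced in the scores it hands to advisors.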

One really interesting example that the book talks about is really building from the scholarship of Maddie Whitman, who's an anthropologist. And she did a field study in an R1 big public research university with in-house data scientists who are building a student success model for their university.

And what she describes is the ways they were putting data into two buckets, one bucket being things that students can change. So they can change, you know, how often they show up to class, or they can change how often they go to the library, or they might be able to change how often they're spending time with friends studying.

But there's also this other bucket of data, which is the things that they can't change. And that's all the demographic data. That's their race, their class, their gender, their zip code. Those are just the facts that can't be included in this model, these data scientists are arguing, because these are not things that students can change. And yet we know that the greatest predictors of student success are actually those demographic categories.
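The "two buckets" partition described here can be sketched as follows (an editorial illustration; the field names are invented, not quoted from any real system). Note that dropping the demographic fields does not remove their influence, since the remaining behavioral features correlate with them:

```python
# Illustrative sketch of the "two buckets" feature partition described above.
# Field names are invented; no real system is being quoted.

# Bucket 1: things the model's designers treat as student "choices".
MUTABLE = {"lms_logins", "library_visits", "class_attendance", "study_hours"}
# Bucket 2: demographic fields excluded because students "cannot change" them,
# despite being the strongest predictors of persistence.
EXCLUDED_DEMOGRAPHICS = {"race", "gender", "zip_code", "family_income"}

def partition_features(record):
    """Split a student record into model inputs and excluded demographics."""
    inputs = {k: v for k, v in record.items() if k in MUTABLE}
    excluded = {k: v for k, v in record.items() if k in EXCLUDED_DEMOGRAPHICS}
    return inputs, excluded

# Hypothetical student record (values invented):
student = {
    "lms_logins": 4, "library_visits": 1, "class_attendance": 0.8,
    "study_hours": 6, "race": "…", "gender": "…", "zip_code": "47906",
    "family_income": 52000,
}
inputs, excluded = partition_features(student)
```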

And the reason why demographic categories are so predictive is not because there's anything fundamental or essential about having a particular demographic feature, but rather that demographics are really telling us about the

social and political forces that have often shaped that person's life, barriers that they've experienced during their K through 12, or ways that they're likely going to not be provided adequate institutional support once they're enrolled in a given institution. And yet these are framed as outside of what predictive analytics can solve. And so these data scientists were like, we're just not going to even include demographic data into our predictive model. And in that sense, they kind of conceal the

university's knowledge of what the greatest drivers of inequality actually are.

Oh, very interesting. Now, you talked about these nudges that the system can produce, whether it's an automated nudge, if it's alerting an academic advisor and things like that. What's really troubling as I listen to this is that often these nudges will...

you know, push students toward decisions that maybe they wouldn't have made, or keep them from being exposed to an academic discipline, subject area, or course program. They're really funneling students into areas where they believe they're going to retain them and where these students are going to be successful by completing their degree. And, you know, I'm just kind of wondering, why should we be concerned about this as educators?

Yeah, and I think we can think of it as decisions the student wouldn't have otherwise made, but we can also think of it as decisions that some students simply can't make. So for a student who's working two jobs or for a student who's taking care of a family, maybe in theory it would be nice to be spending more time at the library, but the idea that a student is simply able to make an infinite number of choices and it's just a matter of creating the right structures or incentives to get them there, I think already is

is presupposing a very specific type of student that those models are interfacing with that's non-representative of many students' experiences. But to your question about kind of these tools that are guiding students towards particular degree paths, I think if we want to deal with issues of segregation in different disciplines, or we want to deal with the fact that

that certain disciplines, especially in STEM, are already overrepresented with men, and disproportionately white men as well, then having predictive models that are based on students' persistence in those fields, with all of these biases embedded in the data sets, is going to entrench the segregation that we already see in a number of fields. I think it also...

it's basically presenting the solution as one about, again, student choices and also getting them in and out as fast as possible. So that's where I think the logic of austerity is playing out here, where rather than providing students with enough funding support that they can take four or five years to get a degree, there's this notion that we really need to get them in and out as fast as possible, make sure they persist.

And part of that is by getting them on a degree path that we know they're going to finish as quickly as possible and be successful within. And I also think that what these models aren't accounting for is the ways that institutions can fail students and the ways that institutions can be reformed, right? So if we see that students from a particular demographic background,

or students from a particular part of the country don't tend to be as successful in a given major, well, then maybe that major needs to be redesigned or maybe there needs to be more support provided during the first and second year of students' education. And what we've actually seen are trends towards the opposite. We're seeing an increase...

in ratios when it comes to students and faculty. So more faculty having to teach larger and larger and larger numbers of students without adequate support. We're seeing the overturning of affirmative action, anti-DEI legislation, which makes it harder and harder just to offer support for particular student groups as well. And so I think we're living in this moment where all of the robust solutions to these problems that require

institutional accountability and that require more substantial funding are framed as outside of what's possible to transform. And then technology is here as, oh, here's this great solution. And all it requires is for students to make different choices, or all it requires is for a student to choose a different degree path, even though that might dramatically shape their life chances in the future, and then, problem solved, right? And so I think it's that logic of problem-solving that the book is also trying to critique and challenge.

Yeah, that's very interesting. Now, as I'm thinking about what we're talking about right now, it really gets to me that there's so much data being collected on these students. And oftentimes when they first enroll at an institution, they're asked to sign all these different documents.

And we know with terms of agreements and terms of service, nobody's actually reading that. They're just signing away. And so what are the privacy concerns for students, especially as we think about a lot of this being offered from a third party for profit company? You know, what kind of privacy concerns should students be thinking about?

Yeah, I mean, I couldn't agree with you more, Michael, that the term privacy policy is so misleading. I mean, not only because many folks don't have the time or the expertise to actually read them, right? They're like laden with legal jargon and very, very dense and oftentimes very, very confusing. But most folks need these services. So if I'm a student and I don't,

like the privacy policies of the learning management software my professor is going to use, it's really hard to feel like a student can then opt out, right? These are necessary services to being part of these communities oftentimes. So I think it can be very coercive and extractive in that sense.

But yeah, just in terms of like the level of fine grained detail that institutions have access to about students, I think there's tremendous concern there. And I also think that for a lot of administrators and even university researchers, there's this sense of almost entitlement.

to data about students. That part of the contract of being part of the educational community is that you need to give over your information in order to be either cared for or to have services provided to you or for the good of research itself.

So I think one of the examples the book talks about are these research studies that were done at a number of universities. These were done with institutional review board approval. So they were seen as meeting the threshold of research ethics as far as these universities were concerned. And these were studies that were carried out oftentimes with funding from companies like IBM,

And these were studies that were being done to improve facial recognition technology. And so there were long-range surveillance camera photos of students walking between classes, taken without their knowledge or consent, to create these data sets. And these data sets were particularly attractive for computer vision researchers because these are so-called students in the wild, right?

They're partially obstructed views. They're walking in between classes in ways that are not particularly predictable. And so it was this very, very desirable data set on behalf of university researchers. And all of this was perfectly permissible, right? It was seen as perfectly permissible by the IRB. And also privacy was actually not quite effective in terms of raising concerns, right? Because technically students don't have a reasonable expectation of privacy when they're walking between classes because they're outside in a public space.

space. So I think the book also tries to think with privacy as a useful concept when it comes to increasing students' ability to control what information is taken from them and how it's used. But I also think privacy can be a very limiting concept, because it relies so much on an individual legal framework and whether or not it can be exercised by students, whether or not students can consent.

And those are important questions, but I don't know that they get to these other issues of discrimination, or what happens when something is legally acceptable but not ethically acceptable. Well, that's a great distinction between the legal and the ethical. I really think that's important.

Now, when you have these conversations on campus with other faculty, administrators, staff, do they often fall back on, well, FERPA exists? And so if we're not supposed to be sharing all this information with third parties, well, FERPA would tell us not to do it. But really, what's the limitation with FERPA when we talk about student privacy and interacting with these for-profit companies where we're basically giving them free student data?

Yeah, this is a really, really interesting question, and it's one the book thinks a lot about. But I also wrote a recent piece for Inside Higher Ed, because there was this controversy around the University of Michigan: there was a data set being sold by a third party that had student papers as part of it.

And the students had actually consented to having their papers used for research purposes, but this was like 20 years before this study with AI was being carried out and now their student research papers were being used to train AI.

I think that's just kind of an example where, if you looked at the comments when this story originally broke, everyone was like, how is this possible with FERPA? Like, how is something like this allowed? But FERPA enables public-private data sharing. Again, it falls into the same trap as a privacy policy: it's actually designed to facilitate these processes and to carve out

in what ways the data sharing can happen. And so FERPA, especially after 2008, was amended to give a tremendous amount of latitude to education technology providers, third parties outside the higher education ecosystem more broadly, as long as you could say they're providing a quote-unquote educational service.

And so there's so much discretionary power in terms of what counts as an educational service. And so really it's created a lot of possibilities for data to be shared among a number of different providers that I think are cause for concern. And also where it's being stored, right? Because these aren't on-campus servers. These are servers owned by the third party too. Right.

Yeah. So I was thinking, as you discuss in your book, you talk about, you know, health tracking. It was very interesting to hear the discussion about having smart devices in your dorm room that are there to help you, and we all know people have smartwatches or other smart devices to track their health. How does that fit into this conversation as it relates to the smart university?

Yeah, I love that example of a student asking Alexa why tuition is so high, and Alexa says, "Oh, I can't answer that one." But if you need to know when your library opens and closes, Alexa's great, right? So I think, yeah, that question of what can be asked and answered, how are problems framed, and how does technology play a role in structuring what seems possible to transform about the university is really what the project is about. And I think to the question about wellness,

I was really interested in the book in looking at the rise of mental health self-tracking apps. And so again, rather than universities investing resources in providing sufficient mental health resources on college campuses, the idea was: we're dealing with these constrained budgets, there are oftentimes students who have quote-unquote low levels of anxiety and depression, so why not offload those students onto essentially this form of self-help,

mental health self-tracking that was being facilitated through this private company. And so I downloaded the app myself and really did almost a kind of auto-ethnographic study of looking at, you know, what type of information is the app asking of me?

How is it structuring how I feel about my own mental health? What types of questions about mental health causes does the app attend to, and what doesn't it attend to? Things like dealing with experiences of racism or discrimination on campus, there's no real space for that. Same with issues of tremendous financial insecurity.

The way the app is structured, it's all about cognitive behavioral therapy. It really invites you to kind of document your moods and document your thoughts, and then see whether or not those thoughts can be restructured to be more positive, essentially. And so I was really interested both in the type of framing the app is using to talk about student mental health, and in trying to understand why it is that something like a self-tracking app

is more attractive in this particular moment than something like robust mental health resources for students on our campuses.

And I'm sure it all comes back to austerity and budgets, right? All the time. So as we're having this conversation, I'm just thinking about our students. And you make a great point in the book that, you know, universities, colleges are moving in this direction because we have these students that are, for lack of a better term, digital natives, right? That they've grown up their entire life with technology and that the university has to adjust to these students, right?

You know, it's just one of these things that we have to pivot and become this new institution. And, you know, as I think about this, we think about the racial, ethnic, economic, social discrimination that's built into these algorithms, and now I'm thinking, you know,

are they taking into consideration the digital divide that still exists among our students, right? And we've seen this during the pandemic. Everybody had to pivot online. A lot of students didn't have a computer in the home or reliable internet access. They're taking classes on their phone. So how does that fit in this? Are administrators, institutions thinking about the digital divide and whether students have the necessary digital literacy or technical skills to actually use all this technology?

Right. I think the concept of digital natives is so disingenuous in terms of how it's used in this conversation, right? This idea that the demand for smart universities is coming from the students themselves. They expect this kind of high-tech, seamless, integrated experience.

There's little if any concrete evidence cited when administrators or ed tech vendors are making this claim. And also to your point, there really is no such thing as a digital native, right? So many students come from such a range of backgrounds. Some are dealing with tech their whole lives. Others are still dealing with unreliable Wi-Fi access. And that unreliable Wi-Fi access will show up in terms of whether they're the student who clicks on the university email or

whether they're a student that needs to go to the public library to be able to access the internet. So I think you're totally right. There's both a disingenuous representation of where the demand is coming from, and the word digital native is used for that. And then I think a total absence when it comes to

engaging with this idea of the digital divide as well. And in terms of thinking about what is to be done, I do think helping students gain more critical digital literacy is really important. So having the lens turned on the institution itself: in what ways is your university using digital technology that's impacting your rights as a student, or your learning conditions, or the working conditions of the faculty you care about?

I think the university provides a really important opportunity: we can take students, regardless of how much familiarity with digital technology they walk into our classrooms with, and use that as a launching point to get them to start thinking about how technology is transforming the student experience and whether that transformation has been for better or worse when it comes to issues of privacy, discrimination, and exploitation. Yeah, I think that's the key, right? Better or worse.

How is it playing out? Now, we focused a lot on students right now. But, you know, if we think about the academic college community, we have to think about the work of faculty members. And so how are faculty members being impacted by this new smart university?

Yeah, so this is something I've been thinking about. I think the book touches a little bit on the faculty experience, but I'm really keen for my next project to really be thinking about faculty. One of the things the book talks about is faculty performance monitoring software, and that's becoming increasingly popular. So again, there's this idea that university administration should have a scalable, big-picture overview of all

of faculty research productivity. There's some technology now that uses RFID tags to know whether a faculty member is actually in their office when they say they are. So a lot of this is kind of, you know, issues of discipline and control and surveillance kind of go hand in hand, right? So it's a way of expanding that kind of algorithmic

administrative gaze over faculty. And also, I think, making work itself for faculty increasingly more monitorable, but also more able to be regulated and disciplined and made even more piecemeal. So when we think of even plugins for learning management software,

when we think about the conditions for online educators, the increasing pressures to use artificial intelligence tools in your classrooms, these are also technologies that turn teaching and learning into more discrete tasks that can then be uploaded to a learning management system. And then you can monitor whether students are engaged with it or not. So I think it's also part of a kind of digital Taylorization. If we think about, you know,

the rise of scientific management in the factories, I think we're going through a similar type of transformation when it comes to the rise of tools to track and manage and monitor every aspect of faculty instruction and performance.

That's an excellent point there. I really like that. And, you know, you had me thinking about, you know, these textbook publishers. Again, we're talking about these third-party ed tech outside vendors, right? And even these textbook publishers realize that it's now time to produce content that's easily plugged into a learning management system.

And really what happens there in terms of the educational experience? Because again, this is all predicated on producing a quality educational experience, right? But are we giving up too much control to, you know, for-profit ed tech companies, book publishers and things like that?

Yeah, I think it's definitely about a corporate takeover of the very foundational infrastructures and missions of universities. I think that's a huge part of it. But I also think this is building on decades

of trends towards a lack of shared governance and a lack of academic freedom as well. I think when we think about the decline of tenure-track faculty and protections post-tenure, we think about rising levels of precariously employed faculty. Part of that is also about trying to stop workplace democracy, workplace organizing, and faculty autonomy over curriculum and their working conditions more broadly. So I think it's part of both of those stories. And I think this is why

It's also a question of administrative power, in terms of who in the university system should really have control over decisions about how classrooms are governed or the type of content that faculty are teaching. That's very interesting. Now, you mentioned AI. So how is AI going to, dare I say, exacerbate this problem?

Yeah, I think, I mean, one problem we have is that lots of things are called AI that are really just, you know, statistics and linear regression, right? So I think one of the things we see is there are so many ed tech vendors who are using AI to talk about very rudimentary systems in the hopes of getting, you know, some of the subscription dollars or these...

Yeah, administrators and budgets. But I think it really depends on the context. So I think we're seeing the rise of facial recognition technology on college campuses to suppress dissent. We're seeing university researchers who are helping to build predictive policing technology and risk assessments for the criminal legal system. So I think at all levels, AI is impacting research priorities at universities. It's impacting

instruction and course content. And it's increasingly impacting how administrators are making decisions about things like promotion and tenure, or how students are being prepared for the workforce as well. So I think it has a really pervasive influence, and there are many different ways I think it's going to impact students and faculty, both while they're part of the university community and then once they've moved on.

Oh, interesting. So let me ask you this. Now, if we think about where we're heading as institutions of higher education, you know, what can students and even faculty do right now to kind of address some of the failures that we're seeing in these smart universities?

One thing that I think is really important is that a lot of these tools are predicated on trying to set up antagonisms between students and faculty and the companies behind these tools. So just to give you one example, OpenAI recently released a set of guidelines and the audience for those guidelines is primarily students. Basically, how can you use ChatGPT in the classroom in a way that your teacher is not going to get distracted?

And there's all kinds of ways where if you close read that document, it's constantly making it seem like you have these behind the times, old recalcitrant faculty and students really need to be at the forefront of pushing this stuff in a very, very careful and ethical way, but pushing this stuff into their classrooms.

And so I think it's really, really interesting the ways that it's trying to keep students and faculty apart and make them seem like they're on opposite sides of the spectrum here. And I think also we see kind of the weaponization of student feedback forms for faculty.

you know, in this particular period as well. So I'm really interested in the book in thinking about how students and faculty can come together and build coalitions to push back against repressive forms of digital technology on their campuses. And I think we have a lot of examples to work with. I think if we look at what happened with

exam proctoring software, which was found to be highly discriminatory. It presupposed that everyone had a reliable internet connection. It didn't work well for students with disabilities or students in religious head coverings.

It was expensive for many students, and of course it was found to be in violation of privacy when it came to requiring room searches. So the list goes on and on in terms of how harmful a tool it was, but nonetheless, at many universities, it was still being used. And through open letters and really, really working together, students and faculty at a number of campuses have been able to either get the tool moved to an opt-in system

on their campuses or banned altogether. So I think that's just one really great example where folks who are situated differently in the academy are working together to push back. I also think it's really important for folks to organize. I think it's important for faculty to be able to organize: if you really want to advocate for workplace democracy, you need support and you need that kind of collective power that's only possible through organizing. And I think the same thing for students, right? To find ways to come together

as communities with a shared set of interests, and to demand greater control over the types of tools that are being used on your campus, and not just the tools themselves but the broader vision of higher ed that they're symbolizing, right? And I think that's also really important. It's not only about the technology itself

but about the technology as a symptom of a larger notion of what the purpose of higher education should be. Is the purpose of higher education glorified job market training and inevitable indebtedness? Or is the purpose...

of higher education preparedness for full participation in democracy, and a public good that deserves robust investment? Unfortunately, I don't think we're having enough of those conversations right now about what the role of higher education is. Now, if we think about it, we're going to continue to see these constrained budgets, right? At the state level with public institutions, even with a lot of private institutions. We know the demographic cliff is here; enrollments are going to be declining. But really, how can we

imagine a future university that integrates some of these smart tools, but does it in a positive way?

Yeah, I think there are efforts to reform a lot of the vendor assessments that are done at universities, to make those more democratic, to ensure that students and faculty are in the room in addition to IT professionals and university administrators. I think the more you democratize conversations around to what extent, if any, a specific process should be automated or digital tools should be integrated into the campus community, I think that's a step in the right direction.

But I would say I'm not fatalistic. I do think it's possible to turn back the clock on some of these terrible higher education policies. And I think we don't really have much of a choice if we actually want to see a much more just future in the coming years and decades. And I think, you know, there's groups, there's New Deal for Higher Education, there's HELU, there's lots of groups out there that are trying to think about how to transform higher education policy and funding to be more just and equitable.

That's such a great point. And hopefully we can get to those policy level solutions, right? And so as we're thinking about it, we've really focused in on some of the problems that we see with these smart universities, the kind of digital surveillance that is happening. But really, I was hoping we can kind of pivot a little bit and talk about some positives. So really, what are some of the positive aspects of this smart university?

That's a good question. I have to say, I'm pretty suspicious and critical. So I think on the face of it, right, the idea that campuses should be more personalized and attentive to student needs, and if there's a way to leverage data to do that, that sounds great, right? But I think the problem is that so many of these tools are taking as a given the fact that there's just so little funding to support students, or that we need to get students in and out of campus as fast as possible. So in that sense, I'm

pretty much a skeptic, but that isn't to say that I don't think there's a possibility for digital tools to be useful, or that I'm anti-artificial intelligence broadly. I think being able to use things like auto-captioning to better support students with disabilities in your classroom if you can't have an interpreter, right? That's a good thing to have. But nonetheless, I'm really concerned that most of these solutions are coming from a very top-down

decision-making process. They're primarily designed by people who have little to no experience in the higher education sector, and they take as a given very, very political choices that have been made about both the present and future of the higher education system. So I think that's why I tend to lean into a more critical interpretation of what these tools are doing. I agree with you.

So as we're thinking about this, and I've taken up a lot of your time, but I'm wondering, you kind of hinted at it: what kind of projects are you working on now that the book is out?

Yeah, so really, I really want to drill down into the faculty side of this. In what ways are faculty working conditions being transformed by the rise of things like faculty performance management software and the increasing pressure to integrate AI tools into your classroom? I really want to get more of a sense of how these new digital transformations are fitting into larger trends towards increased precarity and lack of academic freedom for faculty.

And I'm also interested, and this is work I'm doing with a collaborator, Robert Ovetz. We're both really interested in what types of solutions are possible through unionization, and in learning from unionization in other sectors where AI has really been a site of struggle and a battleground. And then to what extent, if any, can higher education as a sector learn from those struggles in other areas when it comes to imagining guardrails for ourselves that protect us?

Oh, that's very interesting. That sounds like a very interesting project and something I'm personally interested in. But Lindsay, I want to thank you so much for taking the time to speak with me today. I got to tell you, I really enjoyed your book, if you couldn't tell. This has been a great conversation. Thank you. I really, really appreciate it. Well, again, I want to thank you for being here. I'm your host, Michael LaMagna, and thank you for listening to The New Books Network.