Failure To Disrupt Book Club: September 21, 2020
Oct 30, 2020 08:00 · 8231 words · 39 minute read
- Well welcome everyone formally to the Failure to Disrupt Book Club. We’re so glad that you’re all here. We’d invite you to introduce yourself in the chat window. Tell us who you are and where you’re from and what kinds of things you do that are related to these topics. And I’m so thrilled to welcome Chris Gilliard and Audrey Watters. Audrey Watters is a long-term friend and colleague.
00:25 - Chris is a new friend and colleague, and I'm excited that we can get together. We're gonna spend the next 10 weeks, each Monday at 3:00 p.m. for those of you who can come, or on a recording for those of you who are sleeping right now but wanna listen later, talking about one chapter of the book at a time. We'll have a bunch of folks who come in to help us advance the conversation. I'd encourage those of you who are listening synchronously to be active discussing things in the chat window as we're going along talking, so that you all can interact as well. There are about 70 people online.
01:02 - So that's probably too many for all of us to turn on microphones and chat with each other that way. But hopefully we can have a great back channel going. As a number of you have found, we also have a set of forums that are associated with the book club, and those are at failuretodisrupt.teachingsystemslab.org. A number of you have introduced yourselves, and I'm super glad and grateful that you've done so. And then a number of folks have also started discussing the book and discussing the chapters in the book.
01:34 - You can see some of the initial conversations that are happening there on the prologue and introduction chapter page. From what we can tell of the people who are coming, you represent all different kinds of backgrounds: you're faculty members at colleges, you're K-12 teachers, you're librarians, you're technologists. You have a wide range of backgrounds. And I'm excited that we can all get together and talk about this book that I've been working on for the past five years and finally got to release. As you all saw in the prologue, it was written before the pandemic. I was doing copy editing on March 23rd or something like that, right as schools were trying to shut down. And certainly one of the steps that we took is we went through it and we said, is there anything that I've written in the last five years that I'm gonna feel really stupid about having written now that we're in the midst of a pandemic?
02:36 - And we actually went through and we didn't change anything. So now there may be things that do in fact look stupid, but as far as I could tell, and the other people I asked to look through it agreed, there was some sense of, nope, this is a book that at least makes a fair effort at trying to explain how we got to where we got. So I want to let Chris and Audrey introduce themselves and tell us a little bit about their work, and then sort of turn it over to them to share a little bit about what they read. But we'll just start by asking you, Chris and Audrey, not just to tell us who you are, but kind of, what's your EdTech story? What's the interaction that you had with EdTech somewhere along your journey that got you particularly engaged in addressing these issues?
03:21 - Chris, why don't we ask you to go first, introduce yourself and tell us your EdTech story. - [Chris] Sure, my name is Chris Gilliard. I teach at a community college, Macomb Community College outside of Detroit. I teach English and I mainly do work in privacy and surveillance. So my EdTech story, I've told this several times, but I teach first-year writing.
03:51 - And my students were doing some work on what used to be called revenge porn and is now more often referred to as non-consensual intimate imagery. But my college had heavily filtered internet. And so when my students searched the term revenge porn, the filter acted as if they were looking for porn and returned either no results, or results for things like a TV show called Revenge. And that's kind of what started me down the road of thinking about what it means when institutions set a policy, set educational or tech policy, for one reason. The reason they had filtered the internet was because, after hours, staff were apparently looking at porn. But they set it for one reason, and it had all kinds of fallout that negatively affected students' ability to do work.
04:58 - And so that, there's a lot of reasons why I do what I do now, actually. - That's great. Thanks Chris. Yeah, I mean, what a great introduction to schools as complex systems and the unintended effects of technology decisions in one part of an institution on another. Audrey, can you introduce yourself? - Yeah, I'm Audrey Watters. I write about education technology, both, I would say, the history, how we got here, and sort of the speculations about what the future's gonna look like. I am gonna be participating throughout this book club. So I've got lots of EdTech stories, which is always challenging when people ask me how I got into EdTech.
05:46 - And I sort of have to pick the one that is most suitable for the topic at hand. So I'll choose this one today. In the late 90s I was a graduate student, also at the time teaching writing. And the university that I was a grad student at very generously, I thought, gave us all web spaces. Our own domain, our own little tilde domain. And I could make my own website. But I could also make websites for all of my courses. And I was really excited about this. I thought, isn't this great? If students lose the syllabus, I can say, oh, it's online.
06:26 - And somehow (laughs) when students lost the syllabus they expected me still to print it out. But then, like I said, in the late 90s, the university decided that everyone should start using this new product that they'd purchased called Blackboard. And they strongly discouraged any of us from posting our syllabus and our course materials on our websites. They wanted it all inside the learning management system. And I didn't really think much of it at the time, but I think about it a lot now.
07:02 - But it really did strike me as a fascinating decision, that the university had decided that we were discouraged from sharing our teaching materials online. And as a graduate student teaching for the first time, it was really useful for me to be able to see what other people were doing in their classes. And so I was shocked, shocked that the university would make a decision that would discourage us from openly sharing our work. - And thus– (murmurs) Audrey's background there. They had no idea what they unleashed. (Audrey laughs) Some middle-level administrator or subcommittee there made some decisions to…
07:46 - They could have just left well enough alone. They could have just said, please put it in Blackboard or wherever you want to. - Wherever you want, (mumbles) - They had no idea what they unlocked, and that's great. Well, I'm Justin Reich. I teach at MIT now. I used to be a world history teacher, and then I started a consultancy called EdTech Teacher with my colleague Tom Daccord, and yeah, sort of found myself in education technology that way. I will say that I definitely feel like I came of age with personal computers, and my dad was not an engineer.
08:23 - He was a doctor who did drug development kinds of things. But he got us an Apple II Plus, and we subscribed to BASIC magazines and would type the computer programs from the magazines into the computer. And in schools, I feel like that sort of… The EdTech memory that strikes me most is I went to a Montessori school which had a handful of computers that had Logo installed on them. And I used to write choose-your-own adventures. That was sort of my sweet spot, recreating the choose-your-own-adventure books that were popular.
09:00 - And just in the last few years, I’ve really been thinking like, I think I was alone doing that. I think I spent a lot of time in that. Like, I don’t think there were kids on either side of me and that’s been sort of striking me as well. But why don’t we dive into it now? So hopefully if you haven’t, if anybody didn’t do the reading you’re forgiven, you all still pass the book club. Nobody can fail book club. Yeah, the first rule of book club is nobody can fail book club. But Chris or Audrey could you start us off telling us like, what did you read? I’m sort of gone now as the author, it’s out here in the ether to be sort of reinterpreted in postmodern sorts of ways.
09:44 - From the start of this, what's the book about? What's it not about? What has it got and what is it missing? I know we started with Chris for the introduction, so Audrey, do you wanna kick us off here? - Wow, I'll pick just a few of the things that really resonated with me. One thing, being in the middle of my own book, was the prologue and this thought, which is something that I've been thinking about as well, and I think that probably everybody in or near education and education technology is too: all of these things that I think I know, that I've done and written about EdTech, what do they mean and what do they matter now that we're in this world in which we're all online? Because, well, not all of us are online. Most of us are online, but we are compelled to change the way in which we teach and learn because of the pandemic.
10:42 - And so actually reading that prologue, my heart went out to you, Justin, because having to rethink a book at that moment must have been daunting. But really what fascinates me about the intro, and I think about what you do throughout the book, is I'm so interested in the ways in which some of these predictions from the charismatic people that you talk about keep getting told, keep sort of failing to disrupt, right? And yet every time some of these folks reemerge, and they're often even the same people that reemerge, we all sort of nod and act as though this is going to be something that changes education forever. And so one of the things I hope we can maybe talk about is: what is it about the charismatic person in EdTech, but maybe also what is it about the tech side of EdTech, that makes people really interested and really perhaps susceptible to some of these ludicrous predictions? So I think that that's the piece that really interests me. - Awesome, how about you Chris? What's your starting point to it? - [Chris] A couple things.
12:06 - I mean I'm interested in that as well, because there are people out there right now making the claims. You know, I love in the book where you say, well, this thing that they were saying was new has actually been around for millennia. But there are people right now still making those claims as if they're new, and they're kind of media darlings. And so I am kind of interested to tease out, and I don't know if the book dives more into this, why are we still telling the same story as if it's new? I think that's really interesting. But there are two other things. One, as I mentioned, I'm interested in privacy and surveillance, and one of the things you mention in the book is how school as an institution serves multiple functions.
13:05 - And one of those functions is to watch people. And I'm really interested in how that function, which is not always openly stated, but as more and more EdTech moves in from other industries, whether that's prison or platforms or whatever, that surveillance aspect gets magnified and more openly articulated. - Chris, will you say more about what (indistinct) you said before, which I found very compelling because I don't think I've ever thought about it this way. A purpose of school is to watch people. What do you mean by that? What are some examples of that? - [Chris] Yeah, I mean for one, the pandemic highlighted that. That it's a place where kids go because their parents have to go to work. I mean, it's a place where kids get fed. It's all these things.
14:08 - And I mean, as an educator, I don't want to overstate this, right? I believe strongly in education, but it is a place that, in some ways, holds people until they are adults. And I'm trying to state that in the least offensive way possible. But I mean watch in all the different ways you might think about it. Watch as in oversee, watch as in take care of, watch as in monitor. And so that is often a function that's not… I think again the pandemic has really highlighted the extent to which that is true.
15:01 - But I also think that most people understood that to some extent or another. But a lot of EdTech, and it could be the LMS, or it could be cameras in schools or whatever it is, that surveillance function has really blossomed, not the right word, but in the last– - Most certainly in the way that kudzu blossoms. (all laugh) - [Chris] In the last 10 years or something. - (mumbles) Mushrooms blossom too. - [Chris] Yeah. - I mean, I think that there's something about even the analog school, that architecture, right? If we think about the architecture of the classroom, it is traditionally designed in a way to have the teacher at the front of the class, in a particular physical spot in which she or he, but she I think, has the purview of being able to see everybody, but then also the ability to walk up and down and monitor, like you said, monitor what students are doing. And I think someone said in the chat, that's part of the hidden curriculum.
16:21 - But I do think that there is this dilemma where some folks now see the surveillance and are pushing back on it, and some are doubling down and really demanding more of it, right? And so, how dare… students must have their screens on. We must be able to see the students, because now students have the ability to turn the technology off, and there are these new ways in which we're actually further asking students to give up their autonomy under these scenarios. - So Chris, the default intuition that I have when I analyze these things is to say something like, wow, that's great, so I should be more attentive as an educator, as a researcher, to all the different roles of watching in schools. And then my thought is, okay, so there are probably some ways that watching makes technology worse, or where technology makes watching worse.
17:30 - And there are probably some ways that technology makes watching better. Actually, my wife is teaching at MIT this semester and she's teaching a lab class in materials science. And normally when they do these labs, all the students huddle around the machine and watch a machine do some operation and shoot data out. And actually no one has a very good view of the machine. It's just that the most convenient thing to do is to have everybody go up there.
17:54 - But this year they’ve taken a video camera and put the video camera in front of the machine. And now everyone has the exact same excellent view of the machine. I have another colleague who teaches Photoshop and she’s saying, we’re gonna be logged into Zoom forever now in my courses. Because it is just way easier to have people share the screen sort of immediately, like she’s using Zoom both with her students in-person and remotely because now instead of like walking around to look at someone’s machine and things like that she can just put it on the screen for everyone to look at. And then I think about all the terrible ways in which we’re recording every activity that happens in schools and that seems dreadful.
18:34 - And then I sort of, in my mind, start doing these cost-benefit, risk-reward analyses, sort of what's good, what's not so good. I just wonder, for someone who comes at this through the lens of privacy and surveillance, do you have the same sort of intuition towards risk and reward, cost and benefit? Or from your point of view, is it too risky to think about reward? Are there concerns here that just automatically outweigh potential benefits? - Yeah, I mean I think part of it is that we're stuck with technology invented by people who actually didn't think about those questions. So (laughs) the big example I'm using now is the whole thread that went around Twitter and made it into a bunch of different magazines, about Zoom backgrounds and how often, for people with dark skin, their face is not picked up when you use a virtual background.
19:43 - And it was, I think, an educational technologist who posted a thread on this. And he literally posted his own headshot, he's bald, appears to be a white guy, and he has a virtual background and that works fine. And then a faculty member who was seeking his assistance is what appears to be a dark-skinned Black male, and so he looks like the headless horseman. So (laughs) it's just a body with no head. And so, and Zoom, I mean… Well, I'll shorten this. The people who made Zoom didn't think about these things. They didn't think about harassment. They didn't think about Zoom bombing. There are all these things they didn't think about. And so it's a difficult question to do risk-reward, because it forces that question onto the user, when those questions should have been asked and answered, or addressed, or at least gamed out to some extent way before that. And now we're just kind of stuck using technology that wasn't invented for us, or by us, or for the purpose for which people are using it.
21:08 - - Or bias. Audrey, what's your response to that? - Yeah, I mean I think that, interestingly, this was Colin Madland, I think at TRU up in British Columbia, who posted about this. Chris, this is the anecdote that Chris relayed, and he posted it on Twitter. And interestingly, each time Colin posted a screenshot, Twitter privileged Colin's half of the screenshot, with him in it. When confronted, Twitter was insistent that they had thought about racial bias when they created the algorithm for cropping. But obviously something had gone awry, and even farther back, where these questions about algorithms, about this use of algorithms to do this work, should have been asked doesn't seem to have been analyzed either. And I think, I mean, it comes back actually to some of the things I'm interested in in this introduction, Justin: what is it? Is it a culture? Is it about the disciplinary training that technologists have? Is it something about this idea of wanting to engineer society or engineer school that I think leads us to end up with these technologies being built by people who haven't thought about these things?
22:36 - And how do we get here, with the engineering crowd missing the boat so dramatically on these questions, that– - Well, I think that you're asking great questions. So Chris introduced us to this idea that the technologies that we use in education are often not designed by educators, and therefore they don't even have a hope of having these considerations, because Zoom was designed for people who are thinking about board meetings and corporate meetings and things like that. - There wouldn't be any Black people on the board. So wait (all laugh). - Among groups of people that are disproportionately white, or disproportionately white and Asian men, who are doing the development of them. And well, it seems like the point you're pushing on is sort of, why do these kinds of people get power? And then why does it become compelling to them to want to change things so dramatically? Why is the charismatic such a compelling rhetorical argument? Why is it compelling to the university heads or the K-12 school boards to adopt these things? Why is it compelling to venture capitalists and philanthropists? I mean, to me, and I don't know what you all think of this, one of the reasons is actually the one that my good colleague Cheyenne Theodey just mentioned in the chat, which is because technology has changed every other industry.
24:24 - That's something that I certainly felt when I started my doctoral research in 2009, 2010. I was studying the use of social media in K-12 settings, and an explicit framing of my research at that time was something like: this social media stuff seems to be dramatically changing journalism and dating. It's changed the meaning of the word friend. It's changed the meaning of the word like. It's probably somewhat reasonable for people to think that education is just another sector like medicine or journalism or retail or law. Except that it's not. Except that there are a bunch of features of education which make it not like every other sector, and which are why it won't change in the same ways. That has always been a compelling argument to me. It is just sort of, I don't know, bad luck for society, or just a weird set of coincidences, that it turns out that there are a bunch of parts of society that you can dramatically change, maybe dramatically improve, maybe dramatically worsen, through technology. But education just happens to be one which, for a variety of reasons, is more impenetrable to those forces.
25:36 - But I don't know if that argument sounds compelling or reasonable to you all. - So my thing is, I think that education has changed. And I think that education technology has changed education, but I don't think it's changed it in the sweeping way in which some of these charismatic people talk about it. I mean we haven't yet, although, give them a chance, we haven't yet destroyed public education. And we haven't yet outsourced and privatized all of our public institutions in the way in which some of these narratives talk about.
26:10 - Clayton Christensen and Michael Horn's other prediction, well, maybe we won't blame Michael Horn. Clayton Christensen's other prediction was that half of universities will be bankrupt in the next, I think, 15 years, which he made seven years ago. And maybe they will be bankrupt, but maybe it will be because we've decided to divest from public education. I do think education has changed, and I think it's changed in a lot of ways, but I don't know that it's changed in the ways in which engineers necessarily identify, because they're looking at certain technologies having made certain kinds of changes in certain kinds of ways, and they tend not to see the rich landscape and the ways in which things change through politics and sociology, through culture. But I always think of the ways that education has changed as more akin to Larry Cuban and David Tyack's sort of, well, tinkering, things changed a little bit. But I think education has changed.
27:17 - I just don't think that it's changed in the sort of big science-fictiony way that some of these technologists like to predict. - Well, that's why we're friends, Audrey (both laugh). If anybody hasn't heard of Larry Cuban, Larry Cuban is a brilliant historian of education and technology. He's now professor emeritus at Stanford. He was a high school history teacher, and then he was a principal, and then he was a superintendent. He was actually the superintendent of Arlington, Virginia public schools at the time that my wife was an elementary school student there.
27:45 - And he's written many brilliant books, and one of them is “Teachers and Machines,” which details the efforts to incorporate radio and film and early personal computers in schools. And I actually just sent Larry a copy of the book, and I said, the press wouldn't let me call this “Teachers and Machines Too,” but I wish they had. Certainly the aspiration is to have the book sort of do that. Oh, did you have any thoughts on– - [Chris] Yeah, I mean I'm really interested, and I like how you call it kind of a tinkerer's guide. And I think actually Audrey would probably be much better at talking about this than I am, but let me try to say this the right way.
28:34 - The obsession with disrupting, with massive change, that impetus doesn't come from wanting things to be better; it comes from wanting to scale and make more money. And so (laughs) the idea that education is going to drastically change, or even, to go back to this issue of scale, right? I often get the critique when I talk about proctoring systems, remote proctoring systems. Someone says, well, isn't it the same as someone walking around the room while you're taking a test? It's actually not. So that drive towards massive disruption and scale, I think, is part of the reason that so many of these things are problematic. They're driven more by investors and by the notion of the massive. Education remains one of the places that hasn't yet been emptied out in the ways that many of our other institutions already have. I'll just put it that way. - That there's still space there. - This is, we're maybe jumping ahead to some of the other parts that we were thinking of doing during the book club, the stump-the-chump thing.
29:59 - But this is one of the things I would like to reach back at you with, Justin, this idea of scale. And I know learning at scale is kind of your jam, but for me, that's the problem with this word scale, right? Does scale mean something different than public education? To scale means something different than adequately coming up with funding, public funding, taxpayer-supported funding, that supports access for everybody to top educational opportunities. Does scale mean something different than open, for example? And if not, why not? And if so, what does it mean to talk about learning at scale versus, for example, public education? - That is great.
30:54 - I mean, I guess learning at scale for me is: there are lots of learning environments with many, many learners and few experts to guide them. And some of them historically have been printed books and printed textbooks, some of which have been integrated in a variety of ways into public education systems. And some of them have been deliberate ways of creating new pathways into higher education, or into education, like the Harvard Classics, this library of books that one of the Harvard presidents published in the early 20th century, which says, read these 50 books, and this is basically as good as a Harvard education, and it will be free and accessible and so forth.
31:44 - Children's television is another mechanism which is about serving many, many learners with few experts to guide them. And the availability of the internet just creates lots of new pathways for these kinds of large-scale learning environments to exist, which build on existing efforts but are not exactly the same as existing technologies. You know, the proliferation of adaptive tutors, of massive open online courses, of peer learning communities. They seem to be things that are not quite like books and television, in that they have a different set of affordances. I think it comes back to my kind of somewhat pragmatic optimism, that we can build these things, and we can build terrible things with them, or we can build great things with them. And it's gonna matter a lot.
32:38 - I think a point where we agree: it's gonna matter a lot what the political economy is in which we generate these things. A political economy in which we have very robust support for public education, for public higher education, is one in which we're gonna build technologies and people are going to be like, cool, this can slot in here. This is how we can prepare people for these things, or stuff like that. And then I think there are other political economies, including the one that we're in, particularly in higher education, with austerity and adjunctification, where as we shrink higher education, we shrink the value of what we can generate. I would also say that some of this is an artifact of being interested in learning at scale too. One of the things that I was interested in doing with the book, which I think the vast majority of the public is not particularly interested in. It's weird that it's still in there.
33:38 - But I just observed that there are different communities of people that study things that try to operate at scale. So I propose these three genres of learning at scale that we're gonna read about in the next few weeks: instructor-guided things, algorithm-guided things and peer-guided things. And I observed that it tends to be different communities of people who build and study these things, but I actually think they have a bunch of similar kinds of challenges and problems. And so part of what learning at scale is meant to do is to say, oh well, let's get people to come together, because maybe there's some things about making more equitable technologies that the folks at Scratch figured out that might be useful for the people who are working at edX or Khan Academy or other kinds of places like that.
34:22 - And then sometimes I think that is a weird piece of scholarly politics to try to weave into your book, Justin (laughs). Most people are not going to find that helpful or interesting, but you can find bits and pieces in there. I don't know, Audrey, does that help at all? Or Chris, do you have reactions to that as what learning at scale is? - [Chris] Go ahead Audrey. I'm still trying to process this, so go ahead Audrey. - I mean, I think you're right. I'm pushing at you purposefully. I do think that it matters in some ways, though, how much we let these narratives, again, we're circling back on things again, but how much these powerful narratives seem to seize particularly the imaginations of politicians and administrators, right? That there's something about these techno-fantasies that really resonates.
35:12 - I mean, I just remember during the year of the MOOC, the ways in which people lost their minds, administrators lost their minds. I remember when UVA, the University of Virginia, fired its– - My alma mater. - Yeah, the board fired the president because they thought that she wasn't moving quickly enough. And all of the David Brooks op-eds saying, this is it. This is the end, everyone get on board. Higher ed will never be the same. It's the end of college as we know it, I think TechCrunch pronounced. And it was very much part of this narrative that you could see being really crafted and repeated by people who might've had a background in teaching machines to think but didn't necessarily have a background in teaching humans to learn.
36:10 - And so it was, it's such a powerful, politically so powerful. - So, one way I might reinterpret your critique is something like: Justin, it was the charismatics who invented this at-scale phrase, why are we using it? - Why are you using it? - (All laugh) Because the frame gives them a privileged higher ground. That's a great critique and one I hadn't thought of, and I hope that people will keep thinking about that too. One question that Kristen DiCerbo asked both in the chat and in her post in the forums: first, I propose that there are these three groups, building on work by Morgan Ames, who's a wonderful anthropologist who just wrote this book called “The Charisma Machine” about the One Laptop per Child program. And I completely stole the term charismatics from her, and she stole the term tinkerers from David Tyack and Larry Cuban, and skeptics is widely used.
37:05 - So I deserve no credit for any of these three terms except perhaps to use them. But she asks this question: are skeptics and tinkerers and charismatics trying to solve different problems? Are they different stances toward the same problem, or do you think they have fundamentally different problems that they’re trying to address? Oh Chris, do you have a reaction to that? - [Chris] Oh well, I might. I wanted to go back though, I’m sorry (laughs) - Go back, go back, you’ll come back later. - [Chris] So the one thing I wanted to interject is that, like, with all of these narratives sort of about how things have been disrupted and what that means, it’s very important to note who’s telling that story.
37:51 - And like, I’m not saying anything new, but, so it’s easy to bash Facebook right now, right? But if we look at sort of Zuckerberg’s narrative of what Facebook is, and if we look at what people in Myanmar think Facebook is, or what white supremacists think Facebook is… Americans in particular, I think, really enjoy this narrative. I mean, you can see it when these guys go testify before Congress, right? Everyone’s just falling all over themselves to talk about how great they are. But like, to me… well, it’s actually not an open question to me. Like, I know where I stand on this, but it’s an open question whether or not the scale that Amazon or Facebook or Google has been able to achieve has actually made society better. Like, I think it’s obvious.
38:53 - Like, I think that they have not (laughs). But that story about what that means, I think, is very important. Like, how easily we adopt that or accept it. I mean, Facebook was like, you stole a bunch of data and people’s pictures to rank women. Like, that’s how it started. (laughs) And every other layer on top of that, about how it’s used to connect people, like, it does those things, but it’s also a massive engine for white supremacy and misinformation. And so I always wanna inject suspicion into that overall equation.
39:43 - ‘Cause I think it is a question about whether or not things are better because of scale. - I definitely feel like Facebook is a great example of where the skeptics figured things out faster than me as a tinkerer. And looking back, the tinkering that I was doing with Facebook was probably a pretty dumb idea. I mean, there was definitely a period, I don’t know, in like 2010, 2011. I think a stance that I often bring as a tinkerer is to be like, oh come on, people are doing these things. Like, check it out and see what could come of it.
40:16 - And I think there was a moment in 2010, 2011, 2012, when educators were really down on Facebook, and you could find a bunch of evidence of young people organizing on Facebook to do productive educational things. Like, kids were forming Facebook groups, they were networking with each other, sometimes they were doing in-class or other kinds of things like that. But a key feature of what they were doing is a lot of times they were talking about their homework or helping each other or starting new groups. And so the tinkerer stance on that is like, come on, there’s some risk and reward here, but look at the reward component of this. You’ve got young people sort of organizing in some meaningful way.
40:54 - And I look back on my advocacy of those kinds of approaches now and think a lot more like, the skeptics totally had won (laughs). Like, I should’ve just told everyone to get their kids to log off and leave it alone, because it was a bad idea. And to me, that is one part of my EdTech advocacy history that I definitely don’t feel particularly good about right now. Audrey, do you have any thoughts on this question of whether skeptics, tinkerers and charismatics have different problems that they’re trying to solve, or different stances on the same problem? - I think that’s a really interesting question. And I think about where I put myself, obviously, and where other people would put me. I think my work is pretty clear (laughs), pretty clear what– - People have referred to you as a skeptic, Audrey.
41:51 - People have suggested in the past that I might not be the biggest proponent. But I think that in some ways it comes down to questions of power too. I don’t feel like the world needs another charismatic person willing to do sort of a TED talk version of why we need to disrupt education with the latest gizmo or gadget. And in fact, I feel like there’s so much power in the technology sector, in finance, in the way it moves politics. Like Chris said, when these folks testify in front of Congress, it’s major headlines. These entrepreneurs become philanthropists, and philanthropy, in my opinion, is sort of a way to bypass democratic decision making, particularly around public education issues.
42:47 - So to me, it feels like it’s incumbent upon me to push back as hard as I can. And I don’t feel like I have to solve, like, it’s not about solving a problem per se. It’s just about how does one push back against the vast political and economic power of the technology industry? And so, I mean, that’s a different role, that’s a– - Yeah, and I think that’s… I think in the frame of Kristen’s question, are they trying to solve different problems? The answer would be, yeah. One thing that I hear you proposing is something like: where power is centralized and small numbers of unelected people get to impose their vision on public education.
43:35 - That should just be resisted, because that is not a good thing in a democracy. And technology is a vehicle by which powerful people can make those impositions without negotiating democratic processes, sometimes while getting celebrated in the press. And presumably, if we give charismatics the most charitable possible interpretation, these are people who think they have some kind of power, who wanna use their power for good to solve problems that are limiting human development or other kinds of things. I mean, I think that would be an example of solving different problems, in which a person coming at it from your perspective would say, no, no, no. The first-order problem is not that kids aren’t learning to compute math problems fast enough; the first-order problem is, how do we decide what happens in public education? And after that, we can figure out whether or not we want the intelligent tutors in there or not.
44:38 - - [Chris] Yeah, I think– - Here’s a question. Go ahead, Chris. - [Chris] No, I was just gonna, I thought Kristen’s question was really important. But the other part I would add is that I think I read in the chapter that at some point you described education as conservative. - As a small-c conservative system that needs to change. - [Chris] Yeah, and I think rightfully so.
45:07 - I mean, I think that’s where, so I do think that those three groups are probably trying to solve different problems. But even the framework of whether something’s a problem, right? And how much it needs to be changed, and at what pace, I think that attitude is very different, right? If you see education as a problem, or something that needs to be disrupted, that’s very different. But also, obviously, there are lives that are being disrupted. Like, it’s conservative, small-c, for a reason, right? And that’s because large-scale changes, or quick changes, things like that, have all these ripple effects that charismatics are often not equipped or interested in answering. - Yeah. I mean, who gets hurt when someone, to borrow Sebastian Thrun’s phrase, has a lousy product? It can be pretty significant, like the experiment that Udacity ran at San Diego State. No, it wasn’t San Diego State. It was– - San Jose State.
46:24 - Yeah, I mean, people get hurt when we screw things up in education. And so I think the stakes are pretty high. So, sweeping aside the sort of gestures that they make, I think they do often forget that educators and educational institutions have responsibilities that are… I think it’s a different set of responsibilities than, say, the pizza restaurant has to its customers. I think it’s a different responsibility than doctors have to their patients. - We’ve got two minutes left, actually, because there are a few MIT students in the class, and by contract an hour-long period can only take up 50 minutes. So I only get 3:00 to 3:50, and then I have to let them go. Here’s another thing that I’ve been thinking about, sort of maybe an implicit critique in some of our discussion. One question you asked, Audrey, was why does this cycle keep happening, where charismatics propose disruptive change, the arguments are found compelling, and they don’t work out as people hope.
47:34 - And here we are back again, proposing them again. Another person in that cycle is someone who raises their hand and says, hey, we can look at the history of past cycles to make better judgments about what happens in the future. Like, we can analyze these charismatic arguments and see how likely they are. In fact, it might be possible to neutralize the power of the charismatic argument altogether with sufficient empirical analysis of history. Like, is that person an egghead who should be shuffled off the stage, or is that a viable approach to addressing these sort of boom-and-bust cycles in education? - I don’t know how viable it is, but I do think one of the things David Tyack said is that even when people are utterly ignorant of history, and let’s just include, say, tech CEOs in that description, I think that they still have an idea of history, right? And so even if someone has never read a book on the history of education, doesn’t know who Larry Cuban is, has no idea when technology first appeared in the classroom, thinks that maybe it was when they invented their little app, they still carry with them powerful ideas about what the past was like. And I think it’s incumbent upon us to talk about what the past was actually like, because we’re still burdened with that. We still carry that forward.
49:02 - So I do think that history matters, ‘cause we’re stuck with it even if people are ignorant of it. - What do you think, Chris? The last one. - [Chris] Yeah, I completely agree. I mean, to use a current example, that Social Dilemma documentary on Netflix: one of the guys is getting roasted because he said there was no uproar when bicycles were invented, when actually there was. And so (laughs) the notion, like, because we all went to school, people feel like they know the history of school. But that lack of perspective, that lack of history, I think is super dangerous. And whether you wanna call it an institution, or however you want to name schools, it’s more dangerous, I think, in a place like schools than it is in any other institution.
49:58 - Well, Chris Gilliard and Audrey Watters, this has been a fascinating conversation, a great way to kick off the book series. To those of you who are here, there are a bunch of questions in the chat and a bunch of questions in the discussion forums. I’m sorry that we didn’t get to all of them, but keep the conversation going in the forums, failuretodisrupt.teachingsystemslab.org. Next week, we’ll have George Siemens and Liz Loesch, two folks who were very much involved in the year of the MOOC, thinking and interacting and talking with us. Audrey will keep coming back. Chris, I hope you’ll keep hopping in as you have time in the weeks ahead.
50:37 - But really wonderful to be able to get to spend this 50 minutes with you all. As I mentioned in the chat, if you live overseas and you don’t have a copy of the book yet, shoot me an email, which I typed in the chat, and I’m more than happy to send you a PDF. And Chris and Audrey, once again, thanks for joining me this week. This was a really rich conversation. - Thanks. - [Chris] Thanks so much for inviting me.