OA21 Plenary: ALT & Apereo Screenside Chat. Ethics & Learning Technology.

Jun 23, 2021 19:09 · 6534 words · 31 minute read

okay welcome to this alt and apereo screenside chat on ethics and learning technology perspectives from the pandemic i’m ian dolphin i’m the current executive director of apereo if you’re not familiar with apereo it’s a membership organization largely of higher education institutions which collaborate to produce open source software this panel as you might have guessed is about global responses to the pandemic and concerns that have been raised not just in our community but across higher education about privacy extending outwards from legal to ethical concerns if you have questions please ask them you can ask them via twitter the hashtag is openapereo21 or in the frame on the pod page that you’re viewing this from and with that i’d like to hand over to maren from alt because this is a partnership screenside chat with the association for learning technology and i’m very pleased it is thank you ian for the warm welcome and hello to everybody who is joining us here this is my first open apereo conference and i’m delighted that we are collaborating on this session my name is dr maren deepwell i’m the chief executive of alt the association for learning technology in the uk but we have members all across the world and one of the key pieces of work we’ve been engaging in over the past six months is to develop an ethical framework for professional practice in learning technology thanks maren perhaps the panel could introduce themselves starting with bella just say a few words about who you are

thank you my name is bella abrams i am the it director at the university of sheffield we’re a large research intensive university in the north of england and i am also a trustee of alt with a background of working with learning technology for most of my career and i also have a real interest in the ethics of using technology and ensuring that we get the best for our students and for our staff who choose that technology and so that’s why i’m here on the panel today thanks bella over to you paul for an intro hi there i’m paul prinsloo i’m from the university of south africa down south in africa where we are having winter so i’m a bit freezing at the moment but i do look nice i’m a research professor in open distance learning and my main focus is on the ethics of learning analytics and the privacy concerns with that thanks paul over to the man in the hat chuck thanks my name is charles severance i’m a faculty member at the university of michigan school of information and one of the founding members of the sakai project and i like to say that eventually sakai and moodle will be the only ways to really do learning management while guaranteeing the privacy of your students although i think it’ll be a while before the world agrees with me on that fact and so i’m really excited about getting the narrative of privacy and protecting student data as part of the sacred trust that campus it folks should dedicate themselves to rather than dedicating themselves to the easiest possible solution which doesn’t respect privacy so putting privacy as a value that is on par with convenience thanks chuck we’re going to start because this was a flipped panel and i hope all the audience have managed to find the video on the program that we shared in advance but i wondered if the panel might just like to review the other videos that they’ve seen not their own at this point and raise any issues that they’d like and highlight any pointers from the
videos that they’ve seen and offer perhaps some brief observations to get us started and let’s do it in reverse order so chuck well i think that paul’s video talked about learning analytics and privacy and i think that this is really an under-discussed topic in that learning analytics is apparently the solution to all of humanity’s problems at least according to the people who are selling analytics solutions and people ask me why aren’t you more critical publicly of learning analytics and privacy and part of that is because it’s like tilting at windmills i mean people are so convinced that learning analytics is such a great thing that you know how could you be opposed to it it’s like oxygen i mean how could you not like oxygen and then from maren and bella’s video i just felt so much joy listening to somebody who is taking a quieter gentler approach i mean my goal is to just scream and shout and wave a club and tell people that they’re stupid if they’re not doing privacy and you may or may not be surprised but that’s not necessarily accomplishing all that much even though it is how i feel i feel very strongly about this and so watching bella and maren talk in a way that might actually convince real policymakers and real people means i can be the crazy one in the background that delivers the solutions but i don’t think i can motivate people to go the way i want so that’s my reaction to the two that i saw thanks chuck over to you paul great chuck one day when i grow up i want to produce videos like you with that aside what i really loved about what you did is that you pointed to the fact that many institutions delegated privacy to the provider and delegated the responsibility to oversee the ethics to providers and providers do offer institutions especially in africa a one-stop shop institutions on the african continent often don’t have the capacity to do the analytics
themselves so the providers and the platform providers come in and offer us this one-stop shop that is very convenient and then bella and maren i can just echo what chuck said what i really loved about your soft approach is to make room for alternative interpretations and to really open the floor to say maybe there are other ways of looking at informed consent maybe there are other views on privacy and the ethics in using learning technologies so thank you very much bella we’re in a really interesting moment and we’re at the intersection of the law in some cases and there’s a lot of institutions making really significant decisions without fully understanding the consequences of what they do and as you both said delegating the responsibility of that to people who don’t have an interest and that’s one of the things that i got from both of your videos is the idea that someone else will worry about it and if someone else does worry about it then it’ll all be fine in the end and i think that’s the kind of risk that we run at the moment the worry that in two years time people that have made really on the surface of it simple decisions about the use of technology are actually then dealing with the adverse consequences that massively impact students that impact the way that staff feel about the institution that they work for and that all of those things combine into possibly even a backlash against the use of technology from our students and thank you both for your views on maren’s and my approach to how we’re doing the ethical framework work i think what i wanted for us to do with having the kind of principles and then the checklist is for people to start thinking about them for themselves to understand how having principles around the use of technology can then affect the decisions that you and possibly the wider institution that you work in might be
affected that’s why both of your videos were so insightful thank you over to you maren thanks bella thanks everybody for those introductions and we’d like to encourage all of our participants to go and have a look at the online program if you haven’t yet spotted the videos i wanted to start us off with a bit of discussion and i think bella’s comments just now really lead well into that so setting aside different legal frameworks for a moment it seems probable that perspectives on what is ethical will differ from culture to culture and the question for the panel is what implications do you feel this has for those who design and create open technology and software in a global context so that is our opening discussion question and chuck maybe if we come back to you first and then paul and bella we’ll come to you as well okay maren that’s a great question so for me i get around to a lot of different countries and have this conversation with lots of people and it is entirely different culture to culture and there are three kinds of cultural responses so one is the let’s do what america is doing response the whatever america is doing must be right so we’re just going to line up and examples of places where that culture is the norm are the uk for example i’ve seen a lot of things happen in the uk where they just install the same thing that’s popular in america they just go and say yes and i’ve seen that in spain for example and then the second kind of cultural response is one that you could almost call isolationist and that is this is our culture and we are going to view this through the lens of our culture and we don’t care what the united states is doing and a good example of that would be france and germany france and germany have i think very high standards and they use completely different software and sometimes they use software
that’s only for their country because it’s the way that they can control it japan is kind of the same right they’ve kind of figured out where their center is and then they stick with it and then the third kind of culture i would say is probably india and china and that is the culture that basically says everything we’re doing is so challenging and difficult that we’re going to take the easiest solution that we can afford and just run it and so for them privacy isn’t even part of their decision making because they already feel like they’ve got their own little closed environment in india and china for example and so that’s the cultural question it really is who are you following are you following the united states which is a terrible example are you following the soul of your own country or are you so far away that you don’t really think too much about it i think many people hope that they are so far away that they don’t really have to think about it but i think we can all agree that that’s not going to be a viable position for very long at all and bella maybe if you want to come into that conversation now and then we’ll come to paul as well what are your thoughts i’m hoping that my sound is slightly better i’ve tried to move the microphone i think that culture and the way that people consider ethics massively impacts this but i also think that it is driven in some cases by the money available that chuck’s just mentioned and institutional capability or company capability to invest in the best possible technology i think there’s also the way that analytics or any technology are used in an organization i think the actual choice of why something is used can drive a lot of that decision-making and so what we’ve seen in our institution during the pandemic is our
faculty of education making some really hard decisions about the use of technology because they were adamant that you could not do open book examinations that were unproctored in an engineering situation which i didn’t agree with but they were adamant that it meant that their exams remained fair i think what we’ve seen in other areas of our institution and more broadly in the uk is that there has been more compassion around the use of technology and the use of techniques and the selection of technology particularly in the pandemic my question for the future is whether that will continue as we start to see more blended learning taking place across higher education and in other areas of teaching and learning and i think that one of the reasons that we were so keen to think about our alt ethical principles in terms of a framework was to give people almost scaffolding to go through those thought processes whilst they were thinking about technology and i think that’s one of the things that we could probably use to help people adapt to all of those different conditions which are cultural financial and institutional thanks bella that’s really helpful and i think you make an interesting point around care here as well because at least in the uk but i think in other countries as well we’ve seen more concerns around issues of digital poverty and digital equity which i think we could maybe come back to at the end of this question and it would be interesting to hear from those in the room as well if you have any thoughts on how that informs our understanding of how to develop ethical open learning technology solutions but paul let’s come to you next and then we’ll see where we get with the discussion and chuck bella if at any point you want to jump in just give me a signal and we’ll bring you back in thanks maren i think what we should not underestimate and i pointed
that out in the video is to think about the use of technologies as part of the imaginary and i propose that we think in terms of a global colonial imaginary so thinking about the framework and the frameworks that are in place they come from a particular north atlantic epistemology and ontology the way we are and the way we see the world as if that is the norm and as if that is the only way to look at data and the only way to look at ethics so that’s the first point the second point is that the notion of an individual providing consent is a western or north atlantic concept based on the notion of a social contract between an institution and the individual and in other cultures and other communities i cannot make a decision on my own my decisions affect those around me my decisions are decisions that affect those that came before me and those that come after me so i think it’s very important to consider how our frameworks embody a particular dominant narrative so i want to ask just a few questions how do i give consent if i’m connected to those around me and their data are my data my data are their data when i’m connected to the earth beneath my feet the sky above me those who came before me and those who follow me when i consent i consent to be categorized and seen through the eyes and the categories of an ontology and epistemology white categories and a white male gaze by giving permission i allow myself my relations those who came before me and those who will follow me to submit to and be colonized by the white gaze by consenting i perpetuate the employment of my data to serve those knowledges that do not care about me my relations those who came before me and those who will follow me by providing consent they will make claims about people like me and my data will become a tool to further exploit people like me so what are ways to think about consent that are not embedded in the north atlantic gaze that’s from my side maren that’s really
that’s so interesting and i think you’re right the north atlantic gaze is certainly a strong one when it comes to many of the policies that inform the development of the tools that we’re using before we move on to any further question i just want to give both chuck and bella the chance to come back into this question maybe chuck to you first and then bella just to put any other observations on this so as bella was talking i was thinking to myself about the sort of revolutionary warrior perspective that i take that says you must be brave and you must go into the dangerous unknown just because i said that it’s the right thing to do and even though that’s kind of how i act i understand that what is incumbent on us is to give people something that’s easier to use doesn’t require as much institutional commitment and costs less to run and so to some degree even though i am sort of fire breathing i believe that we’ve got to dedicate ourselves in sakai and moodle to making these things so that the good choice is a much more equivalent choice rather than just saying it’s the good choice it’s the tough choice make the tough choice and so i’m a warrior every day but i understand that there’s a much longer term here and if we’re going to actually win the battle we have to take a much more gentle and long-term approach thank you i think there is a longer game here and also a bit of a sprint i think we will definitely need to put some urgency behind this because it is an urgent matter bella we’ll come to you and then ian will bring you back into the conversation as well and kind of thinking of both what paul and chuck have said you know the idea that products and services should be designed with more than just the creators in mind so thinking about that gaze because this is the kind of trap we all fall into we build something because we think it’s a good idea we don’t think of something because necessarily
we’re coming rooted in a different culture or thinking about how other people might consume that technology other than from our own perspective but also balancing that with everything that chuck said you know the need to do things quickly the need to do things that are frictionless in a lot of ways because we’re all facing problems that we want to solve for people as quickly as we can so i think that’s the paradox that all of us are working with at the moment how can we adopt low-friction solutions that take into account the global nature of what we do and i think the way that we do that is to keep asking questions why are we doing this who are we doing it for who’s going to use it is it going to be different in south africa or india and not just ask ourselves those questions but ask people in south africa and india those questions and using the research and service design work that we all do as a kind of basic principle the question of what harm could be done to you by this product should also come into it so maybe those are the questions that we should be asking our users as well but i think the problem is that trying to do all of those things as quickly as we all want to do them is really hard so doing that in a structured way i think that’s a really good point for us to include a bit of a call to action to our participants who are listening today because one of the things we’ve been trying to do as part of the work that bella and i are talking about to develop an ethical framework is to make a list of those questions those prompts and we’ve extended the consultation to get your input so the consultation is open and we’ve put a link into the chat and we hope to get your input we’d really like to have as diverse voices as possible and if you think there is a better way or a different way then we definitely want to hear from you so please do provide us
with some input and ian one of the questions that you’ve raised i think was whether we’re looking to maybe translate the framework into different contexts and potentially different languages and certainly one of the things that we’re looking into is making it openly licensed so it can be adapted and remixed into different contexts and ian i wonder whether you wanted to join the conversation to take us further in our discussion i think it’s incredibly valuable that alt is making the ethical framework that they’re developing an open publication and an open framework because that gives people the chance to look at translation issues and this actually connects with the apereo communities that i represent who spend a lot of their time developing software there’s a tendency to think about internationalization in linguistic terms in terms of translation rather than in cultural terms also and that of course adds development cycles but it makes a product which is vastly more valuable in the long run so that’s a point that i didn’t want to lose well a few members of the panel have mentioned consent and i think paul mentioned informed consent and it’s possible that the legality of data gathering might ultimately revolve around that concept could the panelists comment on what you feel informed consent might look like in higher education first of all and then perhaps also make some comments about how equipped you feel higher education is both culturally and technically to secure and manage that consent now i am aware that that is a big question and we’ve got 15 minutes but perhaps paul if i start with you you can give us some perspectives you’re muted great thanks ian i like to speak about consent as fragile it is really consent until further notice there’s enough evidence to show that notice and terms and conditions and consent are unable to really protect privacy there’s just too many loopholes and the law is
constantly trying to catch up so i think an ethical framework to provide key pointers for consideration is most probably the way to go secondly i do think it depends on how we see data we can see data as a commodity that i own as property and you can steal my data and i must protect my data or we can see data as a right my right to privacy where i can control who has access to what or we can see data as identity an integral part of me and i think that’s a view that is shared by many indigenous peoples that my blood samples my digital footprints are part of me they’re my identity you don’t just steal from me you steal my identity so i think that is very important and lastly i do think our notion of consent should encapsulate that data is relational it’s not only about my individual data when you have my data you actually have access to the data of the people around me and my community you use my data to inform decisions about people like me so data is relational secondly we should ensure participant input in all aspects of our design and our use of learning technologies that’s very important thirdly we should ensure that our students and the users of our technology own the data and i know that’s contestable what will they do with their ownership but i think it’s community input it’s community say in how their data are going to be used and then finally if we do use their data if they do consent to use a particular learning technology we should show them the benefits and let them provide input into the benefits and how they want to benefit from the data we collect from them thank you yeah thanks paul that’s incredibly valuable bella would you like to make a comment on that i’m nodding vigorously at the idea that in order to gain consent you can demonstrate to users the benefits of the use of that technology i think that’s the key and i think i’m probably the gdpr veteran on the panel
and i think back three years now my daughter is three years old and she came into the world i think the day that gdpr became law and if we think back to what we thought informed consent was when we were thinking about gdpr it was how can we write our statements to make sure that people properly read them not whether they actually genuinely give informed consent and i think we’re moving beyond that now and back to all the points that paul has just made on a daily basis we all use in our lives so many products that consume our data and the vast majority of those are not in the educational space they are getting us to places we are buying services and in all of those cases we all click a thing quickly because we want to do that thing and the reality is that there are patterns and things that everybody knows about us because we wanted to use that service and i don’t think that genuinely is informed consent we click that button in order to do that thing but in order for us to do all of the things that paul talked about and the things that i would want our students to do which rather than a kind of blanket you agree to something at the beginning is actually quite a long process you have to engage users in the idea of what the product will do for them and why their data is important as part of that process and then like paul says you avoid issues around identity the concept of theft the fact that people feel harmed by the use of that technology but doing that in such a way that doesn’t decrease user experience is critical and that i think is the challenge that we all have how can you explain to someone cogently and quickly what it is you’re going to do with their data and then for them to understand that rapidly in order to consent to it and then use the service i think it’s important that institutions do it at a kind of global level for students but i think that when we
are building software and putting together different products to provide services across higher education we should also be explaining that there are a lot of things that we know about people and bringing those things together and being able to clearly articulate to those consumers what those things are is an incredible challenge and i think for it to be realistically informed it will require a huge amount of work thank you chuck we could go for a whole hour on this a lot of really good ideas have come out here and i think that consent is a disaster and i’ll tell you why paul is always talking about the north atlantic and here in the middle of the united states of america we are affected in the same kind of way by silicon valley no one cares what we think silicon valley makes decisions and then we’re stuck with them and so if we look at this obsession with consent it’s not really consent what you’re doing is waiving your rights i mean all these little things are waivers of rights which means that facebook google canvas blackboard pop up a thing that basically says i waive my rights i’m not consenting to anything i’m waiving my rights which means that these companies do not have the laws applied to them which means that all of the gdpr work that we did which is a great thing is null and void because the first thing you do just as a condition of entry is give up your rights and so i’ll be honest if we had more time to talk about this i’d say consent is a waste of time what i think we should do is pick up another of the gdpr’s concepts and that’s the retention of data if we could focus on the notion of look my university is going to retain some of this data for quite some time because i need a transcript but a lot of this other data really only needs to live for about six months and if that data could just vanish then all the
things paul was talking about it’s me it’s my community it’s my society well the data is gone right and so why are you holding this data forever what is the university’s purpose in holding all the data paul was talking about literally forever and the answer is there is no good reason whatsoever to hold that data for more than perhaps six months or three months after the course is over we do need to hold certain things but then we should know who holds them my university holds my grades my university has now outsourced the holding of my grades and then if i’ve put anything somewhere like google docs it goes away and we don’t have expiration and so the gdpr hasn’t had the effect that we would like it to have but there are wonderful ideas in it if we would just listen to it and do what it says bella you’ve got a raised hand it’s a really interesting point chuck and i’m going to bring in another dimension which is the legal or the political dimension in the uk we have the new office for students who are a kind of government agency who monitor higher education and i think your point would be fantastic without the niggle in our minds that we might be asked to submit data to the government at any given point we don’t know what that data might be and so then i think you get into the kind of institutional loop of well there are a lot of things that we know about our students and i think most of them probably fit within that six-month horizon that you just talked about but there’s always the what if and i think that’s another thing that we as institutions as educationists probably need to push back on which is what do we actually need to hold about our students both from a governance point of view and from a privacy point of view and then like you say what can we delete because we never delete because we
always think that it will be useful at some point thanks that’s a really good point we’re not getting questions from the audience so audience if you’ve got questions please post them on twitter with the hashtag openapereo21 or if you’re viewing this in pod use the frame there to post a question but i wondered since we’ve got three different panelists here from three very different places is there anything you want to ask one another to explore any of the things that you’ve heard over the course of this session or in preparation for it chuck you have your real hand up yep so maren could you talk a little bit about the it leaders at various universities that you’ve kind of test marketed the framework with do they just kick you right out of their office in five seconds or do they go oh that’s interesting can you talk a little bit about actually deploying this and talking to people about it and what their reaction is absolutely so we started this work in october last year and bella alongside two other trustees is chairing a working group which includes over 100 people and that includes learners all the way up to director level senior staff and also representation from industry and our experience has been that there’s a couple of different responses so overall the idea of a framework with principles is very welcome but at a senior level in particular there’s obviously some concern about all sorts of existing frameworks and policies with which it might have to align but the idea of having a checklist has come out really positively in the consultation that we’ve just done so we’ve had 160 plus responses so far and over three quarters of the people who responded would like to use a checklist not in terms of checking a box to say we’ve complied but having a checklist of prompts to guide their discussion either internally or with suppliers because a lot of our members i think struggle to know what
questions to ask suppliers particularly people who might not be super passionate advocates of privacy already and who haven’t thought a great deal about this on the other end of the spectrum of our membership we have people who are so passionate about this work they maybe feel we don’t go quite far enough and want us to be more radical and make it more broad and maybe more abstract and what we’re trying to balance is something between having the right principles and also making it practical because the community we represent includes many leaders and practitioners for whom this is maybe not a new thing but certainly not something that they can devote all of their time to however important they feel it is and they need something very tangible so by trying to align it with existing similar principles for learning analytics for example we’ve been trying to give them some tools to get the conversation rolling and i think that’s a little bit of what we discussed earlier you know we want to try and effect change at scale not only for the enthusiasts or the passionate advocates that are out there and in order to lift everybody out of the sort of oh it’s someone else’s problem mindset we need to give them an entry into that conversation would you say that’s fair bella i would and i’ve really enjoyed the process that we’ve just gone through with our working group i think engaging with a member-led community on something as complex as this has huge benefit but as maren said we can’t meet everyone’s needs with that so we are coming at it with a kind of balancing act one of the things that i have noticed though and this goes for colleagues that i’ve met from the us as well chuck is that actually having something like this is really welcomed by senior people in institutions not least because nothing like this really exists at the moment and a lot of people don’t
necessarily spend huge amounts of time thinking about privacy unless it comes as part of a checklist or the wider ethics of the use of technology so having some concept of principles and a checklist even if they’re not perfect is a really good starting point for people who do want to think about those things but wouldn’t necessarily know where to begin and i think that’s been one of the real benefits of this work we are almost at time and that seems like a good place to pause the conversation and allow it to continue through other channels thanks to all our panelists thanks to alt for agreeing to participate in this joint venture which i sincerely hope is the first of many i have a word from our sponsors before we leave sponsor expo breakout room assignments room one is big blue button engaging students online with big blue button room two is longsight stump the developer bring your thorny questions for Earle Nietzel and adrian fish so there you go look forward to that thanks everyone been a great time thoroughly enjoyed it merci thanks thanks everybody great discussion bye thank you everyone bye-bye.