The following is an unedited transcript from the ATLIS Summit: DEI - Understanding the Conversations

Susan Davis  00:03

Welcome, everybody, to the final ATLIS summit of 2021. How did that happen? This is an important summit for us, and I think for everyone in our space: DEI - Understanding the Conversations. This topic grew out of some of the work we did in our virtual conference last April, a lot of it around the idea of conversations. So we're going to talk to you in a minute about how to approach this conversation today, because it didn't seem right to have a conversation about conversations without making it a conversation. We're going to take a few minutes at the beginning, though. This is December, a time I think we often take for reflection, which is appropriate in winter. I would love for you all to take a moment of gratitude for everyone who has paved the way for us to come together today, to have this space to talk about these important things. So we'll just take a moment of reflection for that, and we'll have a few quiet moments following it. We hope that everybody will contribute to the collaborative document for this conversation today. I am just going to copy a bunch of stuff into the chat, and we're going to quietly reflect and then spend some time thinking on paper, so to speak. I'm just going to repeat the comments in the chat for those who are coming in and wondering why we're having this quiet, reflective time. Okay, so don't stop writing in the shared collaborative document just because we're starting to talk. But I do want to kick off the conversation by saying that essentially our question today is about leaders in technology in schools. What are the conversations that they need to be having with their teams?
What do they and their teams need to understand about the conversations that are happening in the diversity and inclusion space? What conversations are happening in the classrooms, or online in that sort of alternative classroom space that students inhabit? What are the DEI conversations that are happening at a systems or administrative level at your school? And what are the conversations that we need to be thinking about from a cultural and historical perspective, because we're all human beings living in this world together? So that's sort of the framework we're going to start out with, but I first want to introduce our panelists, and I'll describe a little bit about what this format is going to be today. Welcome to Dr. Beth Holland, who is the Partner for Research and Measurement at The Learning Accelerator. She is an old friend of ATLIS, and we're really happy to have you here with us today. We also have with us Dr. Jeffrey Morrison, Director of Education Technology at Trinity School, Atlanta. And we have as well Frances Cortez O'Connor, Academic Technology. I thought that was really interesting about your title, Frances, that it's like all of academic technology — you're not worrying about "director" or anything like that, it's just everything. She is at National Cathedral School in DC. And Colin Anton Samuel — I'm going to say your full name, because that's the way I see it on everything that I look at, Colin — he is the Chief Technology Officer at the Brearley School in New York. So welcome to all of our panelists. They are going to talk for just a couple of minutes about the area of concern, or the area of focus, that they are bringing to this conversation today.
And then we are going to turn to them and have each of them ask you the most pertinent, or challenging, or on-fire question that's in their heads right now today. We might skip over some of the basic stuff; we might go deep into some of the conversations that I know many of us really want to have. So I'm going to start with Frances: tell us about your overview of the perspective you're bringing to this panel today.


Frances Cortez O'Connor  07:31

Thank you, Susan. I am here today among some amazing thinkers, really grateful to step outside of my own classroom experiences and my own school experiences, which I suspect are shared with a lot of colleagues elsewhere. The first question represented in the document was the first question I wanted to bring to the table, and it's recurring in so many ways for me at school. I run a diversity group — it's an affinity group for middle school students of Latinx heritage. And for the first time, or maybe I would say the first time was last year, I had students bring some reading to the table. It was Twitter-sourced reading, and much of it was really accurate. They were really on fire to discuss the inequity described within that reading. But Twitter, being the echo chamber that it is, didn't entirely allow for a holistic conversation. So I found myself in new ways wanting to nurture students as they learn, feel on fire for something, and desire to be seen. But there were also some pieces that, if they didn't outright elide the truth, just didn't tell the whole picture. And students were clinging, because they trust these information sources so dearly, because the web told us they're there. So that was an interesting dissonance for me that I would love to hear others speak to: challenge me to, again, focus on nurturing relationships with students and helping them be seen and represented in our environment, while also understanding that they're not necessarily adept at how to have these conversations, or at how to seek information that's entirely accurate. So that was my first and big question. You'll see a second represented there, and these do not have to go in any particular order.
But as I develop curriculum for this year, as my students return from a year at home, I am dusting off the scope and sequence. I want to understand where we're going and what we're doing in our math classes, where we are. But I am also really cautioned — whether by the Facebook reporting that we see (I teach in an all-girls school), or by so many examples in popular culture — about students being made to create content without guidance, while the content, i.e., the algorithm, gets away from them. And then, very quickly: what's the big public interest tech question?


Susan Davis  10:04

So, okay,


Frances Cortez O'Connor  10:05

pretty big. So okay,


Susan Davis  10:07

let's circle back to those. Let's get a little bit of an overview of everybody who's here on the panel, and then we're going to come back and start with your first question, Frances. Beth, do you want to tell us about your digital equity perspective?


Dr. Beth Holland  10:22

Sure. Hi, everybody. It's always so nice to be back at ATLIS; it's like coming home. I work right now as a Partner at The Learning Accelerator. We're a national nonprofit; we support districts across the country who are looking to understand what systems and structures would lead to a more equitable education for all of our learners. From a research perspective, digital equity has been at the foundation of the work I've been doing for the last several years, and most recently that's manifested in the creation of a new digital equity leadership guide. The question that's coming up right now — and why I'm really excited to hear from all of you in this particular audience — is that I have been starting to uncover this ongoing sense of: we've given every kid a device, we've got every kid a hotspot or access, we're done with digital equity. And at the same time, I'm finding more and more challenges: what are the ways in which our students are empowered to use these tools? Do our students have ownership of them? Is the device sufficient to meet their learning needs? Is the content representative of their motivations and their identities and their interests? So I'm really starting to dig into a much more nuanced and complex understanding of digital equity, and trying to make sure that tech leaders like yourselves are also asking those questions. And so, again, we've got these questions in the document, and I see that some of you have already started writing, and we'll circle back. But it really becomes this question of: how do you, as tech leaders, help design and facilitate experiences and practices that meet the unique needs and interests and identities of every single student?
And what are the conditions that are required to support this work? Because, you know, Frances wants to do wonderful things in her classroom, and we can also acknowledge that if the conditions don't exist for her to be able to create those amazing experiences, then there's going to be a disconnect. So that's really where I'm coming from with all of this. How's that for a relatively brief introduction?


Susan Davis  12:35

Great. Great. So we're going to move on to Colin, who is with us today representing the IT infrastructure and systems perspective.


Colin Anton Samuel  12:51

I was going to go last — you caught me off guard. My name is Colin; I'm the Chief Technology Officer at the Brearley School in New York City. Just like Frances's school, we're also an all-girls school. I got to this place from working at my previous school, where I taught technology in a co-ed school. My class was an elective course that was coupled with a yearbook course, and only boys took the course. So basically, my school had created a system by which girls did not want to take technology. As we worked within our systems trying to figure that out, I changed schools and came to this school. And I started from the point of representation in technology, which is that, increasingly, young women in my school didn't believe they belonged in the technology space, because the departments that represented it in those schools didn't look like them, or didn't represent what they thought technology was. So we went through a process of not only looking at how our department looks and feels and acts towards our student body, but also at other ways we can use digital equity as a lens to provide a better student experience for our students. And that's taken me on a lot of different journeys — it's gotten me to ATLIS a few times. I'll stop talking now. My questions are in the template; I see many of you already talking in there, and we'll have a conversation about it.


Susan Davis  14:21

Okay, great. Great, Colin. And Dr. Jeffrey Morrison, share your perspective — you're coming to us as a member of the ATLIS DEI Task Force and a member of the ATLIS board, but also as yourself, on your personal journey, talking about the cultural and historical perspective.


Dr. Jeff Morrison  14:45

Sure. Sure. And I think it's always important to frame kind of being the white guy in the DEI conversations — my positioning in this. My PhD work was in social foundations, multicultural education, and I looked at pedagogy within that context. So I've been immersed in this world for a long time, and I look at the historical context and the construction of whiteness and white supremacy within capitalist frameworks, and how those have functioned throughout society since the beginning of America. My question today is looking specifically at the historical context of technology in DEI, and I'm really interested in the intersection. I'm going to call out the elephant in the room: we're in independent schools, and we're having conversations right now — I'd be surprised if not all of us are — about DEI topics and DEI frameworks that we haven't had in the past. I'm in the South, I'm in Georgia, so that may be a little different than New York City as far as those conversations go, but we're having conversations that were not brought up in the past. So I'm really interested in having the conversation about how these conversations are being framed in schools, and how those of us in the technology department can help set a contextualization within those frameworks, in order to make sure our students are learning critical digital literacies. It's extremely important in that realm that they learn how to be critical of the digital world we live in, but in a framework that is politically sensitive to the insanity that's happening today. So that's kind of where I'm coming from in this conversation.


Susan Davis  16:22

Thanks. Thanks, Jeff. So let's circle back to Frances. You can see we've kind of built this out, from being in the weeds in the classroom to a broader and broader context. We want to ask each of our panelists to pose the question that is most pertinent on their minds, but the document, as you see, has many more questions for us to delve into. We hope you'll continue to respond to that document, both during our conversation and after. So let's go back to Frances. Frances, let's start with your first question. Ask it again.


Frances Cortez O'Connor  17:02

Okay, here we go. My very first question, again — I'm scrolling — is: kids are having DEI conversations in digital spaces now. So how are we jumping in to support them?


Susan Davis  17:17

And we really would love for everyone to just chime in, if you feel comfortable speaking on camera. If you are more comfortable just writing in the document, that's fine, but we really hope to have a conversation here. And I would just follow up on what a couple of people have said in the responses: we are not having conversations about this; we're sort of pretending that it doesn't exist. That's my sort of step beyond that comment. Does anybody want to talk about why and how you're not having that conversation, when these things are happening in the world and in our larger cultural context?


Dr. Jeff Morrison  18:09

I can try, man — you know, as a panelist?


Susan Davis  18:10

Sure, yes. Well,


Dr. Jeff Morrison  18:13

I think a lot of it has to do with the "Black at" Instagram movement — you know, that was a digital freedom kind of discourse among students, and it really threw the heads of school at these elite private schools back on their heels. For the first time, I think, students of color in these white elite private schools had the ability to explain their perspective in a safe environment, anonymously. And I think that freaked a lot of our schools out, honestly — the traditions — so they don't know how to grapple with that. I think it was a very needed conversation, and a really good use of social media, and I think it should be encouraged, to frame social media constructs within a space where students feel safe. But I think that DEI conversation, thrown right in the face of our private schools, really set them back in having those conversations with students. Honestly, that's just my personal opinion.


Susan Davis  19:17

Yeah, but some time has passed, right? So some schools have begun to address this. Yeah. Yeah. Go ahead.


Frances Cortez O'Connor  19:25

Oh, I'm sorry, Beth — I'll let you go after. I just wanted to acknowledge Beverly. Go ahead, Beverly. I'll be quiet.


Beverly Golden  19:31

Oh, yes, we are. I'm from Polytechnic in Pasadena, California. Yes, we had all these discussions, and we had reparations discussions, and it's been pretty intense. The one thing that I wanted to bring up: I put in resources — I put a link to a CNN report about algorithms and the far right — and my take is, I'm trying to figure out how to teach that to kids. I'm a tech integration specialist, but I'm trying to figure out how to bring this information to children. And then what do they do with it, you know? So my question is — do I even have a question?


Frances Cortez O'Connor  20:14

Actually, Beverly, what I was going to say is that I'm going to add to my own question — a question to a question — but I promise I'll answer as well. The word "problematize" belongs there, right? I need to construct with my students as much as I challenge, and, like you're saying, bring the article to class; that's what I desire and need to do as well. But I have to be there as, like — not Cassandra — but still hold the voice of moral caution as we go, because we understand the implications of the predictive technologies we're going to create, and our kiddos have to as well, right? So how do we create bumpers, how do we create a meaningful goalpost for them, as we always do in instruction, but now also with a series of guesses as to why this other thing can happen? So I want to call out the problematizing piece. I don't want it to be a stopping point for my kiddos, but I'm now pushed to educate — and to problematize, right, to stop and think. Maybe the answer to this, as I think aloud, is that I'm asking them to stop and think more frequently. Is that more process points for all of us as we go? Is that more critical thinking, or talking back to implication — you know, me posing a potential issue — rather than letting them move as quickly as I once would?


Beverly Golden  21:35

I also really want them to understand — I have high school kids — I want them to create algorithms and understand how algorithms work. And I'm hoping that with that kind of understanding, as future leaders, they embrace that when they go off and become lawyers and whatever. But I don't know if anybody has any thoughts about teaching that, or if this is part of what we're talking about today.


Susan Davis  22:09

Certainly, it's absolutely a part of what we're talking about today.


Vinny Vrotney  22:17

This is Vinny — I'm going to go ahead and jump in. To me, the question is: how are we jumping in to support those students? And in my job as a technology leader, I'm not jumping in. That doesn't mean that others aren't jumping in — at the divisional level, with our counselors, with other members of senior leadership. So I think part of it is: how are we defining the "we" within that? Because some of them are jumping into that particular conversation, and this did lead our school to engage in many listening sessions as part of the work that we're doing. But in terms of the technology department specifically, we did not do anything to help support those particular conversations at that particular point in time.


Susan Davis  23:21

I know, Peter, you have your hand up?


Peter Antupit  23:25

Yeah, I was actually thinking a little bit about this not just as the tech leader, but in the alternate hat that many of us wear, right — as an advisor, or as a classroom teacher, etc. And that's where our conversations are really starting: in advisories, and even in assemblies, where we've given students an anonymous voice. They'll read about microaggressions that they've encountered, so the whole community hears it, and then we bring it back to advisory and have a discussion about those types of microaggressions — just straight-up scenarios, and all sorts of other things that we talk about. And we do it through a bunch of different lenses at different times of the year. Then I take that and bring it to my technology team, as a little bit of a debrief about what the school is talking about from the student-centered part. No, we don't really have an ed tech department, but that's where we're talking about it. We also, of course, do all-faculty meetings; our senior leadership team has gone through multiple rounds of DEI work; we have a new DEI framework, etc. But it's centering the student voices first, using our different hats, and then bringing it back to our teams to understand what our students are experiencing.


Susan Davis  24:56

Beth, this seems like a good transition to move on to you and your question.


Dr. Beth Holland  25:01

Actually, as a researcher, I would like to do a little experiment first, because this is something I've been working with. Here's how this is going to work. I'm going to hit return in a moment, and a couple of letters are going to appear in the chat, and I want everyone to type what they think the unscramble is — we're going to unscramble a word. But don't hit enter until I say go. Those are the rules: I'll put in the letters, you type, don't hit enter, so we'll see it all waterfall out. And I'll explain my rationale in a minute. So look at these letters: S, C, O, T, A. I want you to type what you think the word is if you were to unscramble the letters. I'll give you a second — short word. And then on the count of three, we're all going to hit return. Okay, we're ready. One, two, three, everybody hit return. Perfect. Now, who's right and who's wrong? So this is where we have a question. A colleague of mine was working with an adaptive learning platform in California, and one of the answers that was coded to have only a single correct right-or-wrong answer was these letters, right? So in California, what do we think the predominant answer might have been? Tacos, right? Well, the system said it was coats. I'm in New England, so the second I saw it, I went, oh, it's coats. And in California, they went, oh, it's tacos. And the point is: when we think about these algorithms — and I am bringing this back to this idea of digital equity — as technology leaders, we're determining what technology is coming into our schools. So we can ask the question: if we're bringing something in that has an algorithm, if we're bringing something in that's driven by technology, where the technology is essentially making a judgment, how are we asking the questions of vendors — what's the bias in your system, and how did you mitigate it?
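The grading flaw Beth describes can be made concrete. This is not the platform's actual code — just a minimal, hypothetical Python sketch contrasting the single-hard-coded-answer policy with one that accepts any valid unscramble:

```python
# Hypothetical sketch: two ways an adaptive platform might grade the
# anagram S-C-O-T-A. Function names and the tiny word list are invented
# for illustration, not taken from any real product.

def grade_single_answer(response: str) -> bool:
    # The biased policy: one hard-coded "correct" answer.
    return response.lower() == "coats"

def grade_any_valid_word(response: str, letters: str, dictionary: set) -> bool:
    # A fairer policy: accept any dictionary word that uses exactly
    # the given letters.
    return (sorted(response.lower()) == sorted(letters.lower())
            and response.lower() in dictionary)

WORDS = {"coats", "tacos", "ascot"}  # tiny stand-in dictionary

print(grade_single_answer("tacos"))                   # False: marked wrong
print(grade_any_valid_word("tacos", "scota", WORDS))  # True
print(grade_any_valid_word("coats", "scota", WORDS))  # True
```

Under the first policy, the California student's "tacos" is marked wrong even though it is a perfectly valid unscramble; the second policy treats both regional answers as correct.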
Who was the representative sample that was used to generate this algorithm, this artificial intelligence? As CTOs and technology leaders, that's our first responsibility. But then, how are we helping our teachers and our deans understand that when a kid goes, "Wait a minute, I said tacos, and it told me it was wrong," they have the language to help their students see that just because a computer said something doesn't mean we have to accept it? This is a really simple example, but imagine if some of these systems are starting to pass essential judgments potentially based on bias. This is an equity issue. And tag-teaming Frances's question about how we are teaching kids, there's this broader question of how we as technology leaders are essentially creating the conditions for practices to happen. I go back to — there was a big TED talk about filter bubbles that everybody was talking about, I don't know, five years ago — does no one remember? Eli Pariser, I think — and the idea that if you search on Google, you'll get what you want to hear, which ties into that question of how someone could go down the right-wing pathway. And actually, to answer that question, Beverly, Data & Society did a fascinating, almost anthropological study of how people get so far down, because of the way the algorithms are driving them. So, all of this to say: particularly as there's more and more technology coming into our schools and into our kids' lives, what's our responsibility as technology leaders to really design and facilitate these experiences? It means we are going to be working with our DEI teams and our curriculum teams, so that we're meeting the needs and the interests of our students.
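The filter-bubble drift Beth and Beverly are describing — a recommender nudging a user toward ever more extreme content — can be sketched in a few lines. The item names and "intensity" scores below are entirely made up; this is a toy model of engagement-driven recommendation, not any real platform's algorithm:

```python
# Toy "filter bubble": a recommender that always serves the item
# slightly more intense than what the user last engaged with.
# Items and their intensity scores are invented for illustration.

ITEMS = {
    "mild_news": 0.1,
    "opinion":   0.4,
    "partisan":  0.7,
    "extreme":   1.0,
}

def recommend(last_score: float) -> str:
    # Crude stand-in for engagement optimization: pick the item whose
    # intensity is closest to a bit above the user's last click.
    return min(ITEMS, key=lambda item: abs(ITEMS[item] - (last_score + 0.2)))

# Start from mild content and follow the recommendations.
history = ["mild_news"]
for _ in range(4):
    history.append(recommend(ITEMS[history[-1]]))

print(history)  # ['mild_news', 'opinion', 'partisan', 'extreme', 'extreme']
```

Each recommendation ratchets the intensity upward until the user is stuck at the extreme end — a five-line version of the rabbit hole in the study Beth mentions.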
And then I think our real position is using this idea of: what are the conditions? And conditions is everything from infrastructure to policy — how are we communicating what we mean by digital equity? You know, I taught, and I was the Director of Academic Technology at an independent preschool-through-eighth-grade school for years. The way we were handling digital equity was through what was called the STAR Schools grant with the state of Rhode Island, where we made sure that every single kid had Internet access. This was before hotspots existed; this was really pre-smartphone — I had a flip phone when I was still teaching there, so this was a while ago. And we used to collect old laptops from our more affluent families, who turned them over every ten months, and we'd wipe them down and give them to the kids who couldn't afford a laptop. That was how we could ensure every kid had something at home. So this all ties together into this digital equity space. Hopefully that just connected some dots. But that's where I am curious: how are we thinking about this at a conditions level? Like, what are we doing to create opportunities for success? I'm going to stop talking now.


Beverly Golden  30:32

Do you want us to chime in? Or are you guys...?


Susan Davis  30:34

Absolutely, yeah, please.


Beverly Golden  30:37

I love what you just did with the whole tacos thing. And the article that I put in the chat — they talked to an alt-right person who actually went down the algorithmic rabbit hole and came out as an extreme conservative, and he reflects on it. And I feel like our boys, our white boys, are subject to all of this, as well as, I think, the issue of gaming. So I just wanted to say: yay.


Vinny Vrotney  31:16

I'll go ahead. You know, one of the things we've been looking at was our one-to-one iPad program. It used to be you could bring in pretty much any iPad you wanted, so, as you can well imagine, some were iPad Pros and some were hand-me-downs with cracked screens. We went to one size fits all: the school buys the iPad, everyone gets the exact same model with the exact same case, to ensure equity there. And then, as we have a minimum standard for our BYOD program in the Upper School, we're having conversations about starting a maximum standard as well, so that the playing field can be level — so we don't have some kids coming in with something that barely meets the standards and someone else coming in with something totally off the charts. So we're trying to find some equity in BYOD as well.


Dr. Jeff Morrison  32:23

I just want to say that I love that — I'm stealing it. Did you create that? I need to cite you for it.


Dr. Beth Holland  32:28

I was in a workshop when that one came up, so you can just take it — but I thought it was a pretty brilliant example. Yeah.


Dr. Jeff Morrison  32:37

And I think what that speaks to is — you know, I think it's important to teach AP Computer Science kids the importance of algorithmic development, but just teaching kids that this exists, right, that's the first thing: allowing them to see it exists in an example like that. That's really amazing, and that's where I go with it — just letting them know it exists, right? This is happening to everyone. There was an article on Medium today about a woman — a married woman with three children — and the algorithm's bias: it thinks she should date 20-year-old men. She went through this whole thing about how women are sexualized and stereotyped in a certain way, to hook up with younger men. And that's all algorithms, right? So this is happening pervasively throughout all of society. And I just love that that exercise is a great way to tell kids: this is happening — don't think that you're unique in the way that TikTok is addressing your values. So thank you for that.


Susan Davis  33:50

Great. I really wish we had hours and hours today. I want to move on to Colin.


Colin Anton Samuel  33:59

First of all, there are lots of really good conversations happening here, and I want to thank both Frances and Beth for what they shared. My perspective is very, very different; I came at it in a very different way. I mentioned in the beginning that I worked in a school where it was really problematic: the school had set up systems by which boys could take computer science classes and electives, but not girls. And then I went to a new school that was all girls, and we had this big problem. You know, a lot of us have talked about digital equity and sort of algorithmic injustices — and then I became a victim of one, where I had the police pull me over and put eight guns in my face in June of 2017. In my fight against the system of what's known as policing software, I started to research companies that did this type of work, that actually developed this software. And I was shocked to find out that most of the leaders and boards of directors of these companies are actually white males trained by independent schools. So, in essence, the schools that we've supported and brought up have created systems that almost got me killed. So I started to work on ways that I, as a technology person, a systems person, can really create a level of equity — not so much in terms of handing things out, but in terms of what I'd call representational equity: in a school of all girls, how can I ensure that somebody who is working in computer science, or teaching it, or learning it, can design systems that allow them to understand that the world is really their test space?
So we did a lot of work around how to recruit and get people into our department who are representative of the world and the environment that we're in. We've turned everything on its side and focused heavily on empathy as a key point in hiring for our department — if you don't have it, we don't hire you — but also on a lot of other things that people have already said: how can we have empathy be the lead mission point of the department, not necessarily focusing on the technology and the blinking red and green lights, but on the people in the building? And I have a bunch of questions there. It was nice to prep for this and to see Beth's work on a digital equity audit — I then fell down the rabbit hole of the audit. I know that Vinny just made a point about the one-to-one program and handing everyone the same computer, which to me is sort of what I would call a first-generation one-to-one program, because a lot of schools did that in COVID. But here's what I found out: we gave everyone the same computer, but does everyone get the same support system by which they use that computer remotely? Probably not. So how do we design systems that get to that second and third level, where just giving out a computer is the beginning point and not the end?


Dr. Beth Holland  37:25

Can I jump back in and build off that? Colin and I have talked about this at length over the years. One thing it made me think about, putting my researcher hat back on: I went down a different rabbit hole in the writing of the digital equity guide, really starting to think about the role of race in technology. Who creates the technology? Who's responsible for deploying it? And what is the history that we understand about the technology in place? I ended up reading a book by Dr. Charlton McIlwain, who's at the NYU Steinhardt School, called Black Software: The Internet and Racial Justice, from the AfroNet to Black Lives Matter. He traces the history of technology, from really the beginnings of computing in the 60s all the way up through the Black Lives Matter movement, from the Black perspective. And as someone who grew up in Atlanta, Georgia, I had no idea that Georgia Tech was the center of so much work that never gets talked about, because everyone jumps straight to the white guys in Silicon Valley; there was an entire history that was erased. So again, when we think about representation for our students, and we think about the experiences and the tools that we're giving them, particularly as they get older, are we also helping them to question everything and ask: whose perspective is driving this thing that I'm using right now? And whose perspective determines the value of what I'm doing right now? There's another professor, Dr. S. Craig Watkins at the University of Texas at Austin, who talks about the digital edge. He spent a lot of time, this was the early 2000s, observing the ways in which Black and Latino youth were interacting with technology, and how that conflicted with what teachers in schools held up as the right way to use it. So how is that being communicated in terms of what we want our students to do?
And so I think it really ties back, Colin, to what you're saying: how are we helping our students understand, through an empathetic lens, what it means to create and consume technology, and to use it with an eye toward a better future? And not necessarily, tying back to Jeff's perspective, that capitalistic perspective, which is really what drives the Silicon Valley side of tech life. I know from my own experience, and I've been out of independent schools for a while now, that we were barely touching on any of these things. This is almost a question: I would not have had the administrative support to fully bring these things up. I was tiptoeing around, and this was, oh my gosh, I'm old now, almost ten years ago, and they were definitely not ready for the conversation then. I'm not sure how the conversations are going now.


Susan Davis  40:14

Let's pick up on that question. We keep coming back to the conversations: how to frame them, how to have a conversation that's going to move things forward, or even just wrestle with how we get there. And Beth's question is, you know, I'm sure more conversations have been happening, but it seems to me that the fear of those conversations is still pretty prevalent.


Beverly Golden  40:51

I still think that even as a tech person, there's so much I don't understand, and I'm old, I've been around, and I'm old-school tech. But when we're addressing administrators, these are really abstract things that we're talking about. I remember years ago, we taught people that supermarkets were designed a certain way for kids: anything that was extra was placed at a child's eye level. And I feel like we need to find ways to show administrators: this is happening, and it's affecting our kids.


Susan Davis  41:34

Yeah. Jeff, let's not leave you out of the discussion here. Let me turn in your direction for that bigger cultural and historical context.


Dr. Jeff Morrison  41:47

So how do you want me to frame that question, or should I just kind of keep going?


Susan Davis  41:51

Just keep going, just go, do whatever makes sense.


Dr. Jeff Morrison  41:55

Totally. So I think it's about how we scaffold and teach these things. Right now, I think it's worse than it's ever been since I've been in private schools. It's always been bad, but with the whole CRT panic and all these other markers, this dog-whistle racism, it's a constant pervasiveness in our faces now. Historically, we're seeing people that would have been on the margins coming into the fold of whiteness, things like that. And so there's this "you're teaching kids CRT" narrative. Honestly, we had a thing at school the other day. Hanukkah was early this year, so we had a huge Hanukkah celebration, and we actually had parents, and my head of school is very progressive, we actually had parents asking why we weren't talking about Christmas. My head of school and I were talking like, are you kidding me? It's everywhere. What do you mean we're not talking about it? It's because Hanukkah was early, and Diwali was before Thanksgiving. So there's this quote-unquote white fragility in the world now more than ever. So I think it's very important for us as educators to push the conversation forward without creating noise on the outside. And one way to do that, tying into the whole algorithm talk, is the way we scaffold the conversation with our students and how we build off historical perspective. One way I've done this, just to give you one little thing that I push on: I oversee our makerspace, but also our media space.
So one way we do it in our media space, and I've been doing this for years, I'm in my tenth year here, is I wanted to make sure that African American history, Indian history, Native American history, and women's history are celebrated every single day in our media center. We will celebrate the designated months, but it's going to be at the forefront every day; we're not going to pull out these books only at a certain time. So I think media centers and high school libraries are great ways to ensure that kids see through the window a little bit more without creating this big "oh, they're teaching CRT" reaction. Make sure you have your book challenge policies down to a T. I don't know if you all have that on record; make sure you have it. We've got ours down, and I relish a book challenge. And also in the makerspace: one thing that we did there, to scaffold off where we've come as a society with this unfair representation, especially within technology, is we started talking about housing. So we created communities, and that allowed us to bring in redlining, and that allowed us to bring in zoning. So I think there are ways, as educators and technology educators and makerspace educators, because we're not seen as the social studies teacher who's "teaching my kid about anti-whiteness," right? We're seen as, oh, the makery people. And my makery students are reading about redlining and how unfair that was, and how to create more integrated communities. So I think giving kids experiences, especially thinking about how we give younger kids experiences in empathy, like Colin was saying, and allowing them to learn empathy through relevant instruction, will hopefully scaffold into larger, more complicated conversations when they're older. So I think there are a lot of ways that tech people and ed tech people can meld these things in. Like fairness.
When we talk about fairness and digital literacy in our Instructional Technology Lab, we talk about gender pay inequality. To little kids, that's a huge unfairness, right? So when we start talking about fairness and equity, we bring up gender pay and how women don't make as much money, and the fourth graders go nuts. They're like, that's not fair. And we talk about why that's important. How can women be treated differently on the internet than men? And that opens up conversations. So I think there are a lot of ways. I work in a K-8, so I'm not doing the algorithms, and I really respect the teachers who are teaching students how to write algorithms without bias. I'm just trying to get kids to understand empathy and fairness, and that the world hasn't been fair for everybody. You know, this is a rich private school; their life is fair, right? So I just think that, as educators, the cultural context and historical context are really great ways to integrate today's quote-unquote controversial topics into instruction without causing a headline. And kids will get it. They're like, oh my God, that's so unfair. When you talk about redlining with sixth graders, they're just blown away.


Susan Davis  46:32

But I want to challenge you also: yes, I think oftentimes as educators our natural impulse is, let's go solve this in the classroom. We've got to be having these conversations as adults, too. Oh, I agree, absolutely. And we've got to be having these conversations within the tech team, which may think this doesn't have anything to do with them.


Colin Anton Samuel  46:55

So, yeah. One of the things I always talk about: I'm very fortunate that I not only run the systems at my school but also teach advanced computer science, so I'm in the building twelve months of the year, and I work with everyone, anyone in the building. And it's so easy to find places to carve out empathy in a school. One of the things we do in our advanced computer science class is the kids will design and write an application, and I really don't care if it works. I actually have a member of our dining services staff take the app. I call him up and say, all right, you're not in the kitchen right now, come and look, I want you to run this app that a student of mine has designed. Mind you, 80 percent of our dining services staff does not speak English. And the message is clear: the student thinks, oh, I've written an app, but this entire group of people doesn't understand the instructions. And then the light goes on. You do little things like that, where you say, okay, here's an example of how designing for the world works: you've designed it to get a grade, you've designed it so the code loads without an error, but to me it doesn't work, because the person you just handed it to doesn't understand your instructions. So now go and redesign it. I know there's a lot of conversation about algorithms, but I think if we start earlier and focus on ways to design our way out of bad algorithms, that's a big place to start.


Susan Davis  48:34

Absolutely. Absolutely. We have about ten minutes left. I want to turn to the people who have come to this summit because you're interested in the conversations, and hear what is on your mind: either a question that's on our document that you'd like to go to, or something else that we haven't thought about yet. And don't feel like you have to raise your hand; just jump right in. I told our panelists I was going to be really patient with quiet time to think. Who has something on their mind that hasn't been addressed yet? And feel free to use the questions that we've already started in the document.


Dr. Beth Holland  50:07

Do I count?


Susan Davis  50:08

If you have questions, you will absolutely always count.


Dr. Beth Holland  50:16

I can pause another minute, or do you want me to ask another question?


Colin Anton Samuel  50:20

You should share the digital equity audit with the group. Yeah.


Beverly Golden  50:24

I would love to hear more about, was it Colin Samuel? Oh no, I forgot my thought. I'd love to hear more about how you dealt with the programmers, or was that Jeff Morrison? I apologize. How did you actually address the programmers? What were some of the things you did, or some of the questions you asked, or some of the results you got? But we can talk about that later.


Colin Anton Samuel  50:59

I mean, essentially, I don't tell them anything. I give them an assignment to create an app, or create code that works, and they'll run it. They think that I'm the person who's going to grade it, but I literally give it to someone else to run, whether it's a younger student, or someone who doesn't speak English, or someone who is a Luddite. I have them run it, and I go, well, I understood what you wrote here, but you wrote something that only I could use. If you really wanted to do this for a job, you have to write things that everyone can use, that multiple people can use; you have to understand how to use words and things of that sort. To me, teaching computer science has moved beyond "okay, you have to learn how these functions work" to "you have to learn the ethics behind what you're building." I even tell people, if it doesn't load, I don't care. Have you really made considerations so that everyone can use what you've built? I hope more people think that way. I think it's really important.


Beverly Golden  52:09

I was thinking: how do you bring that to non-computer-science kids and make it something for the entire school, to create an app? I'm doing a thing with a group of ninth graders, but I was thinking this is something you could just blow up and have every kid create an app, and then ask the questions.


Colin Anton Samuel  52:32

Right. How you want to frame it is up to you, if you have the kind of framework where you can bring it to other students during a lunch period, or students can come in and try it. Fortunately, I can do that, and I'm sure most people can anyway. But I feel strongly that our job as educators is to attach empathy to every single thing we do. One of the things I've done with my staff: my staff always wants the latest, greatest computer, but they always get the worst computer that someone has in our school, so that they learn very, very quickly, when people complain, what those complaints are about. There's a whole litany of little things you can do, but we spend a lot of time and a lot of effort really dissecting the things we say, the words that matter, and attaching empathy to all of them. For that lesson, you could literally use a lunch period, or some after-school club, where you screen-test the things kids have written and see how well others understand them. I do it off the cuff, but in other places they actually use a survey, like, did you understand this? There are many different ways to do that.


Susan Davis  54:01

Beth, do you want to ask your question?


Dr. Beth Holland  54:05

I got distracted by Colin's question and forgot mine. Oh, actually...


Susan Davis  54:10

Denise, do you want to jump in? Thanks for keeping an eye on that, Francis.


Denise Musslewhite  54:16

Hi, everybody. I am just in awe of all of your insights on this topic, and I apologize for missing the first part of this. I wanted to add that I have recently spent time doing the DEI in the Workplace certification, and one of the basic underpinnings of that certification is that there needs to be an individual self-discovery process as the number one component of making progress. And I am challenged by that call to action, because I believe, as someone who's been in independent schools for over twenty years, that we are in our infancy when it comes to that self-discovery process as faculty and employees of independent schools. And here we are talking about how to move the needle for our students, when we ourselves, the educators, haven't done that internal work as organizations to discover our own blind spots. So I wonder what other schools are doing to move the needle there, so that we can have these conversations and move the needle from a technology standpoint in unison, and not while seeing resistance from our administrators.


Colin Anton Samuel  55:55

Denise, are you taking the certification at a specific place? I know there are a bunch of them; I'm just curious.


Denise Musslewhite  56:01

The certification that I am almost finished with is the USF certification, DEI in the Workplace, which has been completed by over 100,000 people and is available for free.


Susan Davis  56:19

Yeah, and I've mentioned that a little bit in this conversation, Denise: how we've got to address these things on a personal level, but also at the interpersonal level, with whoever is sitting next to me, as well as at the level of a leader of a tech team with all the people you work with on a daily basis. But we also acknowledged, maybe a little earlier in this program, that the environment, not just the history of independent schools but the current difficulty of having any kind of conversation, much less about something this important, makes people really afraid to even speak. And I'm really hoping that we can find some ways past that difficulty of not speaking, whether it's because "I don't know what to say" or because "I'm afraid this will get out and the parents will be after me," whatever it is. I feel like we're at a real turning point, and we need to take on some responsibility for moving forward.


Dr. Beth Holland  57:35

Susan, if I can add one thing: in some of the earlier days of the work that I started doing on the digital equity front, I was in a conversation similar to this one with a different organization, and a gentleman chimed in. At that point we were still talking a lot about the homework gap, right, who has access at home and who doesn't. And he says to me, let me see if I get this straight: you're basically saying that we have to care about digital equity because it's a way to get people to actually think about somebody else's kids. At the time, I was really offended, but the more I thought about it, the more I realized he was absolutely right. Even at TLA, this is a conversation we're having: we don't really care about digital equity in itself, we care about big-picture equity, but digital equity is a really, really concrete way to start having these conversations. And Colin had asked me to mention the digital equity guide. In putting that together, the very first step is that you have to get a team, and the team can't just be the technology department. And it's by starting with these super concrete questions, like does everyone have a device, but then coming back to Larry's point: is that device sufficient to do your work, and do you have ownership over it? There was an article that just came out today about Gaggle, and I am not passing judgment on it one way or the other, but it did make the point that kids who don't own their own personal device can never get out from under being surveilled by their school. So what does it mean to have ownership? Now we're having these really concrete conversations around the device, which seems kind of unthreatening, but could be the stepping stone into more critical conversations around equity.
So that's my secret back door, the one I've been working with ever since that person I thought had offended me, when I realized he was actually pretty brilliant.


Susan Davis  59:35

I hate to say that we are at the top of the hour, and I'm doing what I tell myself I should never do: I've skipped over the reflection piece. So I am going to entreat you all to move to the bottom of the shared document, and if you don't do it now, do it before the end of the day, and take some time to reflect on all or part of what we have talked about today. We've really just scratched the surface, but I want to applaud everyone who joined us today. We scratched the surface. Thank you so much.