Hi everyone, thank you so much for inviting me here today; it's a pleasure to speak to you all. I'm Denae Ford Robinson, a researcher at the intersection of online communities, software engineering, and human-computer interaction, and today my talk takes a little bit of a different twist from what others have been talking about. I'm going to be consolidating findings from prior research by other amazing scholars and researchers looking into safety. I want to take this time to talk about how we can keep developers safe, drawing on their work, in the hope that we can devise interventions that better support the developer community and really understand how we can keep people, and the community, safe at scale.

Safety is something many of us take for granted. Unfortunately, it isn't until we're on the other side of an uncomfortable or unsettling situation that we begin to think about why it's important to have processes or tools in place to keep us protected. The article I have on the screen here, from The Atlantic, really highlighted that: in a series of studies of online engagement, the authors dove into how a small number of vocal people can silence the majority view in online forums. Thinking back to some of the work already presented today by Kelly Blincoe and others about how criticism and harassment can be discouraging to folks, sometimes this silencing is easier to accomplish in certain settings. Likewise, a Pew Research Center study conducted in 2017 found that four in ten Americans have experienced some form of harassment online. The type of harassment I'm describing includes name-calling, doxxing, stalking, or simply spamming: being disrespectful, not being welcoming. These types of online harassment can be especially damaging for open source communities, maintainers, and marginalized people specifically, because they make it difficult to participate productively in these online spaces, especially when working publicly.

When we translate this activity to open source, we really have to think about how it impacts the people who build systems in the public eye. Some of the projects I've previously investigated were centered around open source software for social good: developers building tools to support a broad societal good or to help a specific population. The one on the left here is Little Window, which is centered around supporting victims of intimate partner violence, and the one on the right is Refuge Restrooms, which helps non-binary, genderqueer, and intersex folks find safe restrooms. Developers and contributors who work on these types of projects can hold quite polarizing views, the people observing and using the projects can see those polarizing political perspectives, and so contributors are at risk. And like many other socio-technical forums, the conversations around these projects can get heated, escalate, get taken out of context, and become quite harmful. This is where psychological safety really matters: developers feeling unsafe directly impacts the quality of their work, and thus the quality of the systems and tools they build.
A survey from the Global Wellness Institute reported that 62 percent of US workers who felt unwell said it significantly impacted their work performance and how they see their growth overall. Safety ultimately provides a level of confidence that makes you feel more comfortable engaging. I like to think of safety in relation to going outside in below-freezing weather: you could still go outside as you are, in a t-shirt or shorts, but with a jacket or a blanket around you, you feel a bit more protected from the elements while you're out there. In the same way, you can still contribute to open source and engage in the broader developer community, but with safety precautions and safety mechanisms you can feel a bit more protected on that journey. So the question I want to pose to our broader developer community is this: how do we recreate that safety and assurance in online programming communities? Because if you don't feel safe and supported, honestly, why would you engage? Long story short, when it comes to products, safety is the minimum, and it's important that developers feel psychologically safe. Without feeling safe, you cannot be effective or produce work at the quality you need to. At the bottom of the screen is Stuart Geiger's report on the 2017 GitHub survey, which discusses many of the approaches people have used to circumvent or resolve the harassment and unwelcoming experiences they've had on the platform. I'm hoping that throughout this talk I can offer a few nuggets from prior empirical research that can inspire ongoing work that you, as the developer community, can start building on as well.

When we try to answer this question for the developer community, it's extremely important that we leverage prior studies that have investigated it in other settings. Several studies have looked at what collective action looks like when community members work to feel safe and protected. I draw on scholars from the broader computer-supported cooperative work community, like Amy Zhang, who built a tool called Squadbox that is geared at supporting people facing email harassment by letting them recruit a trusted group of peers to moderate incoming messages on their behalf. Another one I highlight on the slide is HeartMob, a community dedicated to offering support to victims of harassment: again, collectively gathering your peers and colleagues into a system of your own, so victims feel supported and are encouraged to keep engaging. So the big question here is: what does that kind of collective action look like for developers trying to engage in their communities?

I also want to highlight what has been quite successful already. The things I'm calling out on this slide are from GitHub's community safety team. Some of you in the audience may be familiar with these folks; this was a previous team at GitHub whose members included Lexi Galantino, Danielle Leong, Coraline Ada Ehmke, February Keeney, and a slew of others. They were tasked with building tools to keep people safe on these platforms.
One of those tools is temporary interaction limits, which let you caution people, or encourage them to take a beat, when they may be spamming issues. Another is tiered reporting mechanisms: thinking about which reports should go to the admins or maintainers of an organization versus what content may be more appropriate to escalate to GitHub support overall. I'll show a small sketch of what scripting these limits can look like in a moment. Seeing these successful features built in situ into the platform's infrastructure, what I want to charge us to look at is what we can do more broadly, at a different scale, within each organization.

There has also been significant Code of Conduct research, in developer settings as well as academic research settings, looking at how we can encourage folks to engage. On the slide here I have the Python Code of Conduct as well as the Contributor Covenant, which many projects have been adopting as guidelines for how people should engage. But there hasn't been as much growth in this area as we would have hoped to see. Renee Li, Hana Frluckaj, and Laura Dabbish have done some really interesting work analyzing how developers and repositories talk about codes of conduct. After removing noise and machine-generated commits and conversations, they found that of their sample of 3,800 Code of Conduct conversations, 32 percent centered on creation, which is actually still quite low, because the Contributor Covenant has been around for some years now. There is also a significant amount of conversation about how to moderate these communities, how to keep people safe, how to engage in collective action, and what it means to keep folks engaged and safe in ultimately sustainable communities.

What I want to charge you all to look at is what this looks like for the next generation of developers, or the next generation of community safety. One way we can support and grow folks is by highlighting those who have been doing successful and desirable activities; note that we're separating the individual from their behavior here, because it's the behavior we want to promote. Another perspective is to think about how we engage human and automated approaches to sanction and discourage undesired behaviors.

With that said, I want to charge us to think about what we can do today to support community safety, with a quick 101. First, if a community cares about something, it should be measured. Second, if it's important, there should be a person accountable for it: it's not enough to pay lip service to creating and fostering a great community; we want human and automated processes, with a person we can point to who can help us mitigate gray-area situations. Likewise, if there is accountability for the process, there should be an approach to track it: we want to be transparent about how people may be sanctioned and how we promote and encourage positive behavior on these platforms.
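As promised, here is a minimal sketch of what scripting a temporary interaction limit can look like against GitHub's REST API. This is an illustration, not the safety team's actual tooling: it assumes a personal access token with admin rights on the repository, and the owner and repository names are placeholders.

```python
# Hedged sketch: set a temporary interaction limit on a repository via
# GitHub's REST API. Assumes a token with admin rights is in GITHUB_TOKEN;
# the owner/repo names below are placeholders, not real projects.
import os
import requests

GITHUB_TOKEN = os.environ["GITHUB_TOKEN"]

def set_interaction_limit(owner: str, repo: str,
                          limit: str = "existing_users",
                          expiry: str = "one_day") -> None:
    """Temporarily restrict who can comment, open issues, or create PRs.

    `limit` is one of "existing_users", "contributors_only", or
    "collaborators_only"; `expiry` ranges from "one_day" to "six_months".
    """
    resp = requests.put(
        f"https://api.github.com/repos/{owner}/{repo}/interaction-limits",
        headers={
            "Accept": "application/vnd.github+json",
            "Authorization": f"Bearer {GITHUB_TOKEN}",
        },
        json={"limit": limit, "expiry": expiry},
    )
    resp.raise_for_status()

# Example: give a heated repository a one-day cooling-off period.
# set_interaction_limit("example-org", "example-repo")
```

Notice the expiry parameter: it captures the "take a beat" idea, because the restriction lifts automatically, giving a heated thread a cooling-off period without a permanent lockout.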
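And to make the first point of that quick 101 concrete, measuring what you care about, here is a second hedged sketch that polls GitHub's community profile endpoint to track code-of-conduct coverage across a set of repositories. The organization and repository names are placeholders, and a real accountability dashboard would track reports and resolutions too, which this sketch does not attempt.

```python
# Hedged sketch of "if it matters, measure it": check which repositories
# report a code of conduct in their GitHub community profile. The org and
# repo names are placeholders; GITHUB_TOKEN is assumed to be set.
import os
import requests

GITHUB_TOKEN = os.environ["GITHUB_TOKEN"]
HEADERS = {
    "Accept": "application/vnd.github+json",
    "Authorization": f"Bearer {GITHUB_TOKEN}",
}

def has_code_of_conduct(owner: str, repo: str) -> bool:
    """True if the repo's community profile lists a code of conduct file."""
    resp = requests.get(
        f"https://api.github.com/repos/{owner}/{repo}/community/profile",
        headers=HEADERS,
    )
    resp.raise_for_status()
    return resp.json().get("files", {}).get("code_of_conduct") is not None

# Placeholder repository names for illustration.
repos = ["project-a", "project-b", "project-c"]
covered = sum(has_code_of_conduct("example-org", r) for r in repos)
print(f"{covered}/{len(repos)} repositories have a code of conduct")
```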
There is also the CHAOSS community, which many of you may be familiar with, where folks are looking for developers and researchers interested in testing out these interventions at a broader scale. I want to charge you all to look at psychological safety beyond just reducing harassment or reducing recidivism: think about how we can grow contributor safety and grow learners in this space as well.

So, as I close, I want to leave you all with a few things you can do today. First, make sure your project has a Code of Conduct. If it doesn't, that's already a signal for many contributors. You can ask the project maintainers why they don't have one; better yet, ask them how it is enforced. Second, if your project does not have a Code of Conduct, propose they adopt one. The Contributor Covenant is a great place to start; several projects have adopted it and made it make sense for themselves. You can adapt broader systems and rules to be bespoke to your project, and a lot of the research I mentioned before traces the evaluative, iterative process by which these covenants grew. Third, I want to encourage folks to be transparent about the safety procedures in place. When you're not, you often put contributors at risk. If you want to grow your community and also increase the diversity of your contributors, you have to be sure you're ready for them: have these procedures in place, and then the people will come, because in fact it can be more dangerous to promise people safety that you honestly cannot offer them. I also want to put out a call, as an FYI, that we are working on approaches to support community safety and sustainable growth at scale, so be on the lookout for that as well. And with that I'll close; thank you all for your time.