Building the Evidence Base
What is evidence-based research? Why is it important to measure program activities and impacts, and what are some strategies to do so? How can research be used to support engagement and empowerment for historically marginalized and underserved communities? Find answers in a recorded discussion moderated by Linda A. Seabrook, Senior Counsel for Racial Justice & Equity for OJP, with a panel of distinguished experts in the field.
This is the first in a series of six webinars designed to help reduce barriers that community-based and culturally specific organizations may face in accessing OJP programming and funding.
STACY LEE: Good morning, everyone. And welcome to today's webinar, The OJP Capacity Building Series, Session 1: Building the Evidence Base, hosted by the Office of Justice Programs. At this time, it's my pleasure to introduce Linda A. Seabrook, Senior Counsel for Racial Justice and Equity at the Office of Justice Programs, for some welcoming remarks and to begin the presentation. Linda?
LINDA A. SEABROOK: Thank you so much, Stacy. Good morning, everyone. And thank you so much for joining us today for the first in this series of six webinars that respond to what we have heard from community-based and culturally specific organizations as information necessary to help address—and hopefully reduce—barriers to accessing OJP programming and funding. In my role as Senior Counsel for Racial Justice and Equity, my primary responsibilities are to lead OJP's implementation of one of the first executive orders signed by President Biden, EO 13985, which is entitled, “Advancing Racial Equity and Support for Underserved Communities Through the Federal Government,” as well as DOJ's equity plan. Both the EO and DOJ's plan call on grant-making entities like OJP to engage with historically marginalized and underserved communities to identify and address barriers to receiving federal funding and programming in order to remedy the historic failure to invest sufficiently, justly, and equally in communities of color and other underserved communities. This capacity-building series is only one way OJP is working to fulfill that charge. So let's get to our conversation, because we started a little late, and meet our panelists.
First, our goals for this conversation. As a result of our discussion today, our aim is that you will be better able to explain what makes a program or practice “evidence-based,” develop strategies to measure your program's activities and impacts, and identify ways to locate and recruit the “right” research partner. To help us in meeting these objectives, we are so lucky to have such a distinguished panel of experts. We want them to take a little bit of time to introduce themselves to you. So let's start with Karma Cottman. Karma?
KARMA COTTMAN: Good morning. Thank you so much, Linda. And apologies, everyone, I'm having some technical difficulties. My name is Karma Cottman, and I am Executive Director with Ujima, The National Center on Violence Against Women in the Black Community.
LINDA A. SEABROOK: And Dr. Hawkins?
DR. STEPHANIE HAWKINS: Great. Thank you so much. It's really a pleasure to be with everyone this morning. I am Stephanie Hawkins. I use she/her pronouns, and I lead the Transformative Research Unit for Equity at RTI International, which we lovingly refer to as TRUE. And just to give a little bit of context, TRUE is the research organization within RTI that leads the equity-centered research portfolio. And there are two core impact areas that we focus on, both researcher capacity building as well as equity-centered research, which includes our narrative change research and our science of community engagement. So with that, I will pass it on to Nancy.
DR. NANCY LA VIGNE: Good morning, everyone. I'm Nancy La Vigne. I'm Director of the National Institute of Justice. We are one component of the Office of Justice Programs within the U.S. Department of Justice. And our mission is to support research, evaluation, and development of new technologies and standards for a wide array of crime and justice topics. I only joined NIJ as its new Director in May of this year. And really, it's a thrill of a lifetime to be able to lead this agency. And Linda asked me to share a little bit more about my vision since I'm brand new. And that is that I understand that research has a rather sordid—checkered, shall we say—history in many communities, and particularly communities of color. And I was trained very formally, as were most people who acquire Ph.D.s, around evaluation methodologies that were objective and that kept an arm's length from the people and the topics that were the objects of study. (And I use that word “object” deliberately.) But I've learned over the years that that is not the best way to conduct research, that you can hide behind this guise of objectivity in a way that is really lacking in inclusivity. And so I want to promote an inclusive research platform for NIJ and all the research that we support throughout the country. And that means research has to engage the people closest to the issue or problem at hand. Researchers can be an arrogant bunch. Stephanie, you might agree. We think we're the experts, right? Well, maybe we're the experts on methodology, but we're not the experts on the specific topics. Those experts are people with lived experience and practitioners who are working on the topics. So that's something I want to bring to the fore, both during my tenure at NIJ and in this conversation today. Thank you.
LINDA A. SEABROOK: Thank you, Nancy. And we're so excited to see what will come from NIJ. So let's start with you, Nancy. What is meant by the term evidence-based, and why is it important to measure a program or practice?
DR. NANCY LA VIGNE: Well, as I was saying before, evidence is a subjective term, right? But there are some things we know. We know that it's important to document things empirically with numbers but combine those numbers with narratives, right? Because we can have some really good, hard data and not understand how to interpret it. And that brings me back, of course, to this notion of inclusive research. But evidence is important from the perspective of people who are developing and implementing programs on the ground, because they need to be able to track whether they're implementing those programs as intended. So that's the first thing, right? Well actually, even before that, it's trying to figure out how to develop a program that's going to be impactful. So, if you're developing a violence prevention program, community-based, the first question is, “Well, where should we focus our efforts and who are the people we need to focus on with?” And that is guided by data and an understanding of local context. And when I say data, I'm talking about both quantitative data, with administrative records, which we know there's some issues with. Sometimes there's a lot of biases that are baked into that data. And that's why it's also important to include a qualitative component. So with that information, you can help develop programs that are going to reach the right people, engage the right stakeholders. And then with that baseline information, you can also track, are you meeting your program goals? And that's really important because you want to make sure that you're implementing the programs with the greatest degree of fidelity. And then that data also enables you to see whether you're meeting the outcomes that you aspire to see. And so all of that is critical both to making sure that you're using your program resources wisely—you're increasing the odds that you're going to get that return on investment that you so want to see—and you're also setting yourself up for success with prospective funders who want to see the quality of the program implementation as well as whether it's leading to positive change and increased safety.
LINDA A. SEABROOK: Great. So just building off that, I'm going to turn to Karma now. As a culturally specific organization, Karma, what impediments do culturally specific and other community-based organizations sometimes face in building that evidence base for their programs and their intervention?
KARMA COTTMAN: Thank you, Linda. And thank you so much to Director La Vigne for saying more about qualitative and quantitative information, and particularly for culturally specific organizations. I neglected to say that Ujima is a national culturally specific organization focused on domestic violence, sexual assault, community violence in the Black community. And one of the challenges just from the beginning for communities of color is that we have a storied history with research. There have been, as many of us know, historically, challenges with the ways that research has impacted our community, the way that research has been conducted, and then ultimately, how research has, in a long term, told our story, right? And the story not necessarily being directed by our communities. And so, in the same way that you have historical trauma, historical issues with systems, you have that with research. And even as we talk about evidence—for many of our community, they kind of cringe, they cringe when we hear the term evidence-based, and so don't really understand how it can be used effectively, right? And so one of the largest challenges that we have is, one, having researchers that understand our community, are vested in our community. Having the capacity to negotiate with researchers, so that research tells the story that we want it to tell. Having research that checks back in with our communities, so that once the research is conducted, we know how it's going to be utilized and we have a say in that. Having research that actually talks about the resiliency, the strength, the good things about our community. Often, when we see research about our communities, it's from a place of lack or from a place of challenge. And so being able to tell the full story of our communities and all of our diversity, all of our strength, but also using research and really understanding how research can be used to harm our communities is critically important. And so having those relationships, built-in and banked relationships, is also something that our communities often do not have. And then finally, what I would say is that research takes resources. And for many of our communities, we don't have the resources to navigate, right? To engage with the researchers in a way from a strength-based place rather than from a place of being in lack. So, you know, there's a variety of things. I'm really excited, though, to be on today, to be able to talk about how research can impact our community in a positive way, how we can use evidence, and to really sort of be able to navigate away from some of the harm that's been caused.
LINDA A. SEABROOK: Yeah. There's so many things you said in there that are so, so important, especially that it's not just about deficits, right? But where are the advantages in communities of color and how do we highlight and uplift, use research to uplift? I really appreciate that answer. And you also talked about researchers being part of communities of color and culturally specific communities. So let's then go to Stephanie. How can the community drive change and find the best learning partner or evaluator in order to support them in building that evidence base?
DR. STEPHANIE HAWKINS: Thank you. Excellent question. And, you know, as I think about the role that community plays, community is critical, right? They're core. The community knows their context better, certainly, than anyone else. And so even the idea of the question of community driving change feels like a no-brainer. But that's not the reality of how we have worked in the research world. So I think part of the value of this conversation is really amplifying the importance of sort of shifting how we think about research. So one of the ways that I think about it is, it's about power sharing. So, Nancy, you talked about researchers and sort of the experts. And I do think that researchers and practitioners come with a toolkit of both measurement, experiences that can be shared, right? So researchers and practitioners bring value, but it is not to elevate those roles above community experts. So I think one of the ways we can think about this is, we're all learning together. So what's the ecosystem that we want to create for change? How do we build off this ecosystem to drive change? And that that driving is really based on community needs rather than the way that either researchers or practitioners believe things should happen. I think a core opportunity that we have, and specifically the National Institute of Justice has, is how we think about solicitations and how we frame the opportunities for funding, like Karma, you just said, resources. Well, how do we think about the whole ecosystem—from solicitation, to reviews, to funding, and execution of that work—if everyone is here as a part of that learning ecosystem? So not elevating any one participant above another. And so I do think when we create opportunities for the community to detail the issues they want to solve, then we can use the rest of the ecosystem to help solve those problems.
LINDA A. SEABROOK: So, Dr. Hawkins, I want to thank you for kind of lifting—unintentionally, maybe—lifting up some of our other equity work here at OJP. So that's exactly what we're doing. You know, we have a multipronged approach to increasing equity and increasing access to funding opportunities for culturally specific programs and community-based organizations. And, you know, what you say about framing opportunities in a way that makes them more accessible, right? So that's one path, as well as increasing our peer reviewer pool. So I really encourage everyone to go on the OJP website, www.ojp.--I guess maybe Stacy can put it in. I don't usually access the exterior website. Sorry. Because there is a section there about becoming a peer reviewer, and we would really, really appreciate community-based organizations, culturally specific organizations, being involved in the process of evaluating funding opportunities here at OJP. So thank you for that, Dr. Hawkins. So let's turn to you again, Nancy. What are some ways that organizations evaluate...
DR. NANCY LA VIGNE: We have to welcome Karma first.
LINDA A. SEABROOK: Oh, okay. Sorry. I turned over the mic.
DR. NANCY LA VIGNE: No, no.
LINDA A. SEABROOK: Yay, Karma.
DR. NANCY LA VIGNE: No, I'm just saying Karma's on screen, so I want to celebrate that.
LINDA A. SEABROOK: Yeah, yeah.
KARMA COTTMAN: Thank you.
LINDA A. SEABROOK: So good to see your face.
DR. NANCY LA VIGNE: There's nothing like technical difficulties to kind of disrupt...
KARMA COTTMAN: I know.
DR. NANCY LA VIGNE: ...things and just so glad that you're here on screen, on camera.
KARMA COTTMAN: Thank you so much.
LINDA A. SEABROOK: Yeah, absolutely. Absolutely. Thank you.
KARMA COTTMAN: Yeah.
LINDA A. SEABROOK: So, Nancy, going back to you. What are some ways that organizations can evaluate or measure their programs, maybe outside of partnering with a researcher?
DR. NANCY LA VIGNE: Could I answer a different question first? I just want to pick up on the...
LINDA A. SEABROOK: Sure.
DR. NANCY LA VIGNE: ...thread that Stephanie started. And just share two concepts with the audience. One is known as community-based participatory research. And that is, I think, exactly what Stephanie was referring to. This notion where it's not the researchers that get to decide the specific topic, they don't get to define the problem, and then say, "We're going to now research it." It's a partnership from the outset, right? So you might have some general notion as a researcher and researchers might be better positioned to seek funding to conduct the research, but then the question is, reaching out to a community and saying, "All right, we see there's a need with violence in your community. What is the research question that you think we should be exploring here?" And then it's defining that in partnership, and then it's, "How are we going to study this?" And it's engaging community members, the people closest to the problem, in actually conducting the research.
So I had a little experience with this a few years back. It wasn't full-on community-based participatory research (and this was when I was at the Urban Institute) because we did have a topic in mind, and that was community perceptions of law enforcement. We also identified the community to work with in partnership with a local advocacy organization as well as by examining data on where the heaviest police presence was, and the most police stops, traffic stops, and stops and frisks. So it was in one particular Latinx community. But we crafted the survey questions in partnership with them. We said, "What should we be asking you and your fellow residents about your experiences with police, your views of police, etc.?" And so we crafted that survey together. And then we didn't just parachute in and conduct the surveys ourselves. We recruited, trained, and compensated people from that community to conduct those surveys. And then when we had the data, we did our analyses, but then we brought them back to the community and learned things that it wouldn't have occurred to us to explore in ways that really made a huge difference. And I'll give one example, if I may. So we asked whether people wanted about the same, more, or less police presence in their community. And we found that most respondents said they wanted the same or more. And our advocacy partner said, "What? That cannot be true." So when we started exploring the data in partnership with the community, they said, "Can you cut the data by age? So let's see if the responses are different for people over 40 versus under 40." And we did that. And the difference was striking, right? People over 40 wanted more police in their community; people under 40 wanted less. If we had not engaged them at that level, we might have misrepresented something as being truth that wasn't truth. So that's just one example.
I know it's not the question you wanted me to answer, but I wanted to kind of bring to life this notion of true engagement and true partnership with community members.
LINDA A. SEABROOK: No, I really appreciate that. And you brought up such important points, it almost expands, right, the legitimacy of that type of research, because you're connecting with the community. So just getting back to the question of, so can you think of or tell us some ways that organizations can perhaps evaluate or measure their programs outside of partnering with a researcher? Because that's a--that's a big hurdle.
DR. NANCY LA VIGNE: I can. But I'm wondering if Stephanie might want to talk about what it would be like to partner with a researcher and the pros and cons of that, and then I can, kind of, follow up and say, "Well, if you're not partnering with a researcher, here's some other ways." Just so I'm not hogging all the airtime.
DR. STEPHANIE HAWKINS: Yeah, no problem. So, excellent question. So how to partner with a researcher? So I'll give some--I'll start off with a story to ground, sort of, what my thoughts are. Several years ago, I led the Girls Study Group project funded by the Office of Juvenile Justice and Delinquency Prevention. And that was really to understand, sort of, the trajectory and the opportunities that we have as a community to interrupt young girls’ involvement with the juvenile justice system. In that work, we realized that many of the community-based organizations doing the very, very hard work of creating innovative opportunities for young people and really focused on the application of services, not necessarily on research, didn't have bandwidth, they didn't have the human capital to devote to research. And so they often didn't have those opportunities to get funding. Because in those times, as many of you all probably know very well, having an RCT, a randomized control trial, was the main thing to focus on as you were thinking about research designs, and some of these smaller organizations really didn't have the opportunity or the numbers to really generate that type of design. So in the course of that work, we realized that what's really needed is more evaluation technical assistance. What was really needed was to help the smaller organizations know how to be the best consumers of what researchers can offer and how to know if certain designs were really applicable to their work versus being told the types of things that they should do. So that type of empowerment is critical for smaller organizations who may not have that skill set in their staff, but it's also critical for us really uncovering the types of programs and services that are really based in evidence. So it's not just having a certain research design that leads to a certain outcome. It's really making sure there's a close match between understanding what's happening on the ground and what types of success and opportunities for replication that can come from those types of opportunities. So that's the story. We created this—we used some of our funds, with approval—to create an evaluation TA and workshops where we worked really one-on-one with community-based organizations, developing their logic models, and really on a finer-tuned understanding of what their needs were. That's an example of how you match really well the needs of an organization with the services that an evaluation team or a set of researchers might offer. Understanding that sometimes our toolkits are very different, but finding the best match is really critical. So Nancy, I don't know if that's the--if you want to ping off of that. Or even, Karma, from your experience….
DR. NANCY LA VIGNE: Yeah, I'd love--I'd love to hear from Karma, actually, and maybe some good and some less good experiences that you may have had in partnering with researchers.
KARMA COTTMAN: Absolutely. So I want to lift up one--something that you said, Director La Vigne, which was recruit, train, compensate. And that's going over and over in my head: recruit, train, compensate. And I think if there was something that we could leave people with today, I will leave them with those three words: recruit, train, compensate. Because for so many of our communities, research has been about us, it has been centered around us, but it has not been driven by, led by, our communities. And so those--that was so incredibly important. And then as Stephanie talked about her work with OJJDP, one of the things that comes up for me is, very often our communities are not seen as experts in our own reality. So even when we're talking with and working with researchers, there's this idea that we have to provide evidence based on our own realities, right? And you have people who kind of shrink away from that because they know that they're the expert in what they've experienced. And we talk about that in so many different forms, whether it's in victimization, whether it's in lived realities, or as the example that you gave, related to, you know, law enforcement experiences. So the first thing is, really being able to believe the community, believe what we say, believe community as the expert in their realities, and then having research that complements that but does not necessarily negate the voice of community. That is so critically important.
From my experience, an example that I can give is that I've worked very closely with the National Organization of Sisters of Color Ending Sexual Assault, SCESA, and we worked together on advocacy. And one of the things that we were looking at is how to engage researchers from historically Black colleges and universities to really tell the story of the Black community, Black victims, etc. And the issue that we kept running up against is that we had folks who said, "Well, they don't have the experience in this area. They're not a part of any of the academies." And so it was almost like the work that had been done by these astounding organizations was being negated. And so one of the issues that even comes up for researchers is their own validity, right? And so we are continually, I think, navigating this place of how do we identify expertise? How does that get solidified? When we define evidence, what does that look like for our communities, and who gets to tell our stories?
DR. NANCY LA VIGNE: Uh-hmm. Yeah. Karma, just focusing on what's termed academe or academia, and everything that is built into that structure, creates disincentives to engage in the kind of research that we've been talking about. You know, what's elevated as the best kind of research is the research that most closely replicates experimental designs conducted in laboratories, right? That's just so removed from narratives and human experience. And scholars across all race, ethnicity, identities, who are attracted to more qualitative research often end up at the lower part of the hierarchy because of this very traditional notion of what counts for credible research. And that's where I do feel very strongly that community-based, participatory research, which is not for everyone, because it is very time consuming and very expensive, you know, that's on one, like, exemplary end of a continuum. But even if you're conducting research that is more empirically based, still, it should take the time to be inclusive. And that means in engaging the people on the ground, making sure that you're not dictating the research questions, always bringing the data back to the people who helped develop it. It's their data more than it's your data, researcher. I mean, that's the thing that, I think that, if I could say nothing else, like, honor where the data came from, right? You know, it's not your survey data. It's the data that came from the voices and experiences of people who participated in the survey. And then, when you have the findings, what are the implications for changes on the ground? Do you get to decide that in your ivory tower? No. You bring those findings back and you say, "Okay, what does this mean for you? How do these findings resonate with you? And what do you see as the implications for change and improvement?" And that's the--that's the process, right? But it's really hard to find the right researchers to team up for this work, because they're not supported, and they're not rewarded, for this kind of work. And that's something that we certainly want to change at NIJ. And when we'll be releasing solicitations for research proposals, you'll see new language in there that says that, you know, we will prioritize research that is inclusive in nature and we will also prioritize research proposals that have lead roles for HBCUs and other minority serving institutions, as well as proposals that team up with culturally specific organizations, not as a token five percent of the proposed funding, but at least thirty percent or more, right? We've got to prioritize this in meaningful ways that can really speak to the community, so that people can be better rewarded and incentivized for engaging in this type of research that we all on this panel think needs to occur.
LINDA A. SEABROOK: Right. And such an important point. And just so everyone knows, it's actually going to be 40 percent or more. So that's something to be on the lookout for. So I do want to kind of get back to that question, though, because--and it doesn't have to be you, Nancy, if anybody else has any thoughts about this, of like, what is a creative...
DR. NANCY LA VIGNE: No, I'm sorry. I will--I will answer it, because now you probably think I'm dodging this question altogether and I’m really not.
LINDA A. SEABROOK: No, not at all. I don't think you're dodging it. I'm just--I'm just opening it up to others, too.
DR. NANCY LA VIGNE: So let me just take a quick crack at it.
LINDA A. SEABROOK: How do we evaluate or how can organizations evaluate or measure these programs outside of partnering? Yeah.
DR. NANCY LA VIGNE: Right. I think--I think it's hard to do outside partnerships that help create the capacity and build the knowledge and skills on the ground, right? Maybe it's through technical assistance, maybe it's through one of these partnerships that we've just been discussing, but we can't expect people to all of a sudden have skills, right? We need to build those skills. But also, you know, if we're engaging in participatory research, yes, people are learning skills, but like, maybe we can give them certificates, you know, you've now demonstrated that you understand how to conduct focus groups, right? You know, give them ways that they can build their resume and their ability to be those local community-based researchers. I think it's kind of hard to do without some kind of support or guidance and I know that's what OJP is here for.
DR. STEPHANIE HAWKINS: And may I just add to that, you know, I don't want us to leave with the idea that partnering is not a good thing. I actually think the partnering is critical, because what we ultimately are doing is bringing different experts together to solve a problem, right? And so it--it's really about the power sharing. So not elevating one expert higher than another. The partnership to me really creates the robust ecosystem that we want, where we have community members driving the questions that they desire to be solved. We have practitioners and researchers coming together to co-create those solutions and creating those opportunities. And one of the things I do also want to lift up is also the critical role of narrative change. Because much of what we think and how those thoughts lead to policy are about how we think about things. So even this conversation about elevating community is really about shifting the narratives about the role of community in solving the problems and the challenges that are in communities that, by the way, are not created by the communities, right? So if we're thinking about structural racism, and if we're thinking about how that plays out across generations, we're asking communities to solve problems that aren't theirs alone to solve. So the partnership piece really becomes critical because we are able to bring the best of our ideas together to solve problems.
KARMA COTTMAN: Can I just add to that? And that is so critical, that the partnership is so important, and that the issue around power, and what power looks like in those--in those partnerships, I think is the other thing that has to be addressed. Like you have to invest in the capacity of culturally specific organizations in order for them to show up in those partnerships, right? You have to be able to talk about what does power sharing look like when the resources are so different, you know, when the staffing is going to be so different, when the resources to invest are so different. And think about developing those partnerships, which to Stephanie's point, are important because they lead to policy on the ground. When you're talking about interrupting structures, you know, there is a requirement that there is evidence around what those structures look like, how they have impacted the community, etc. And so our communities need to be able to show up in those partnerships from a place of strength, they need to have the capacity to do that. And for culturally specific communities, for communities of color, that has not been our history, that has not been the investment, and so there has to be that investment in the capacity to show up.
LINDA A. SEABROOK: Yeah, I just really appreciate that everyone is giving me talking points for OJP's Equity Program, because Karma, that's exactly what we're trying to do and what we are going to do. So, stay tuned for more, everyone. So Karma touched on this quite a bit, and I just want to develop it a little more, Dr. Hawkins, about—or Stephanie—about research, right, on Black and Brown communities. I remember that like, some years ago, Oprah Winfrey starred in a movie called “The Immortal Life of Henrietta Lacks.” And a lot of people knew about the life of Henrietta Lacks and, you know, and the death of Henrietta Lacks and what happened thereafter. But there were a lot of people that didn't know, and it's inspired, like, this national conversation, I remember, like on the history of research conducted on Black and Brown people, which is the opposite of what we've been saying, right? Can you explain a little bit of that history, and then, you know, your reasons for establishing TRUE?
DR. STEPHANIE HAWKINS: Yeah. Thank you. So it's interesting. Henrietta Lacks, the story of her life and the immortal use of her cells, right? That's sort of what that story centers. I actually serve on RTI's Institutional Review Board. So this becomes really relevant when we are talking about research, and permission, and approvals, and awareness. And that's really the critical piece of Henrietta Lacks. It's the awareness that your cells, in this case, are going to continue to be used for science purposes. And are you--are you aware of your rights as a research participant? That's the critical piece. And as her family uncovered this, they were not aware. And, you know, all of the amazing scientific outputs of her cells continue to be used. And I know that they are--you know, her life has really been elevated to reflect the contributions that her cells have made to science. But really, what that does is it demonstrates the ways in which Black and Brown communities are engaged in research, and their awareness about their roles in research. So thank you for using the word “on” versus “with,” right? Because what we do is we join with people, right, to conduct research and solve problems. But that isn't the history of this country and how research has been conducted.
So, you asked about the origin story of TRUE. And for us at RTI, the origin story of TRUE is really about how do we center research differently? So, like many organizations and places around the country, TRUE emanated from me watching—and thousands and millions others—the murder of George Floyd. And working at a research institute, it made me ask the question, how is our research being conducted in service of equity? And thankfully having a leadership team that wanted to really explore how are we doing that? And yes, we are doing that in many places across our institute. But how can we center this to go deeper? So, ultimately, TRUE was designed so that we can really transform research and center equity in how we look at our outcomes. So for us, it's really about the researcher. How do we take a critical view of our roles as researchers, the power that we hold in that space? How do we critically look at our privilege and how that plays out in the work that we do? How do we think about the process of research? I look at equity as both a process and an outcome. So what's the process of conducting research? Is that done equitably? So we talk about community and we talk about permissions. Are we engaging the community equitably so that they can help solve some of the questions that they want solved? Not that we want solved. And then ultimately, are we using methodological approaches? To Nancy's point, you know, are we also amplifying qualitative data collection? Are we really exploring narrative change in addition to the high level of quantitative explorations that we do? So it's both the researcher, it's the process of researching, and it's also the outcomes of our research that we really wanted to shape at RTI. And that's the work that we're doing. And I mentioned, sort of, these two core components. We also realized that there's a role for us to build the capacity of our fellow researchers. We talked about HBCUs. I'm a very proud graduate of Spelman College and Howard University. So I have done all my core training at HBCUs, and I'm very proud of that. So I've had some advantage of being aware of what it means to take a racial equity lens just in the course of your training. But I also appreciate that not everyone has had that opportunity. So, we are also taking a look at ourselves as researchers across the institute to figure out where are the ways in which we can build our capacity so that all of our work is done equitably, not just as a one-off research project. So, those are sort of some of the origins of TRUE. I don't want to take up all the space. I want to make sure that we have opportunities for questions.
LINDA A. SEABROOK: Exactly. I was just going to turn to questions, but before I do that, you mentioned a term that I just want to make sure the group knows about. IRB, an Institutional Review Board, what is that, just very quickly, and how it is that--what's its purpose?
DR. STEPHANIE HAWKINS: Yes. Yes. It's a critical purpose. So, you know, all of the federal, I guess, contracts and--for us, all of the federal agencies, we operate with our own Institutional Review Board; not all organizations have their own. But it's a manner in which we can make sure that the safety of research participants is upheld, that they are aware of the fact that they are participating in research, and that mostly they know their rights as research participants. So those are just some of the core things. It is all very federally mandated. So I highly encourage everyone, even as a research participant, that you understand your rights as a research participant. That's what consent forms do. And so we don't always look at the details, but those are all driven by the requirements for research studies to conduct their work. And justice is a core component, interestingly, of IRBs. So it's an interesting exploration. If you want to know more, I'm happy to talk.
LINDA A. SEABROOK: Thank you. And also thank you for founding TRUE. It's just so critically important. So I just wanted to make sure I said that. So, I want to make sure that we are able to field some questions from these excellent panelists and take advantage of this opportunity. So, I'm going to ask Stacy, who's helping us out. Do you have a question for the panel? And any panelist can take it.
STACY LEE: Yes. What initiatives does OJP have in place to help younger researchers stay in the field and develop skills related to evidence-based research?
LINDA A. SEABROOK: I think that's for you, Nancy.
DR. NANCY LA VIGNE: I'll take the first crack at that. So, we have a few different avenues to support young and emerging scholars. We have the W.E.B. Du Bois research solicitation, released last year after taking a four-year hiatus that coincided with a different administration. So, we've brought that back, and that supports research that is conducted on various issues related to racial disparities in the criminal justice system and ways to mitigate or prevent them, but also lifts up opportunities for emerging scholars to be supported and mentored by more senior researchers—or what are called principal investigators. So that's one way. We also are bringing back the Graduate Research Fellowship Program, which is a grant program specific to young scholars. They're typically seeking funds to support their dissertation research. So that will be released this year. It was not released last year, unfortunately. And NIJ also welcomes research assistants, who are typically doctoral students and work part time with us to learn more about the different topics that we cover and learn more about what it's like to be in a grant-making organization. And of course, we learn from them as well. And they leave not just with more information and more experience but also an inside look at what it takes to write winning proposals, right? So that really helps also support emerging scholars. I also want to acknowledge Angela Moore on team NIJ, who has joined here and has the ability to weigh in if I've missed anything critical on this front.
LINDA A. SEABROOK: Angela, can you join us?
ANGELA MOORE: Yes. Hello, everyone. And Nancy, you covered everything.
DR. NANCY LA VIGNE: I stole your thunder.
LINDA A. SEABROOK: Well, I'll just add too, I mean, I think even across OJP programs, we have those priority considerations that Nancy spoke about earlier. Across other OJP programs as well, we have priority considerations to promote and encourage projects that focus on racial equity and the removal of barriers to access and opportunity for historically marginalized communities, as well as that priority consideration about partnering—and meaningfully partnering—with culturally specific organizations. So let's take one more question, in the interest of time. Stacy, do you have another one?
STACY LEE: Thank you. So, can the panelists share examples of community participatory research done right, along with a few points on why it's a good example?
LINDA A. SEABROOK: I'm going to actually alter that question. When is--when is community participatory research done right and when is it done wrong?
DR. NANCY LA VIGNE: I was going to defer to Stephanie, but...
DR. STEPHANIE HAWKINS: Yeah. I was going to say…. I’ve moved back so I can create space. I don't know Karma if you had direct experience. I'm happy to jump in, but I feel like I've been talking a lot.
KARMA COTTMAN: No, please.
DR. STEPHANIE HAWKINS: Okay. Okay. So, when is community-based participatory research done right and when is it not done right? Is that the question? First of all, having the community in the development from the very beginning is the absolute first step. So it's not that, "Hey, I've got a funded project, now let me go find a community to work with." It's at the very beginning, you're talking with communities and figuring out what questions do we need solved? And then together we're pursuing funding for the answers to that. So it's a mix between funding and doing the work, right? So, I would say making sure we are establishing authentic, trust-based relationships with communities. And that's one of the core components of TRUE. You know, I talked about the science of community engagement, but the reality is, you should never pursue a community for a funding opportunity, right? You should be working with communities and figuring out like, hey, we have some questions we need answered. And so let's go find some funding to solve these questions. But not the other way around, which historically has happened. And some of that is driven by how opportunities are shaped, not always, but some. So I'm delighted to hear the sort of forward focus that NIJ and perhaps DOJ more broadly has for making sure that the community is the center of the questions that are being developed and finding that research ecosystem to support it.
DR. NANCY LA VIGNE: Can I weigh in on something that bugs me? And that's like, what do we mean by “the community?” And it relates to when this work is not done as well or as authentically as it could or should be. And I mean, first off, as my little story about that research project indicates, the community is not a monolith. A community doesn't mean that everyone thinks the same, everyone has the same priorities, you know, everyone has the same access to opportunity. I mean, we know that, right? But I feel like we dumb this down in a way that makes it too easy for certain types of researchers to check a box, right? And that we need to be really intentional about even who we engage with when we say we're engaging with the community. And I think that intentionality is around making sure that that engagement isn't with one person or with an advocate for the community that may not even reside in the community, right? These are some of the details that I think, you know, when very well-meaning, well-intended researchers say, "Okay. We're going to start doing this work." And they're not thinking as intentionally as perhaps they could or should.
LINDA A. SEABROOK: Yeah. And I'm going to turn to Karma now, because I'm like channeling Karma as Nancy was talking, right, about folks speaking, right, monolithically about communities of color. And I would just love if you could just expand on that real quickly.
KARMA COTTMAN: One of the most important things is to say who you're talking about. When we talk about communities, we're often not specific. When we're talking about Black and Brown communities, we are just lumping people together and not really, one, being respectful of their individuality, of the diversity within our communities, of the languages that we speak, right? And so, really being able to be very clear, who are we talking about when we're talking about community? And I--for me, it really goes back again to leadership and resources, right? And really being able to take leadership from community. That's how I've seen this work well, is being able to take leadership from community, as both Stephanie and Nancy said, ensuring that community has a role from the beginning to the end, right? Making sure that the research that's being collected or conducted has an impact that's intended to be uplifting for the community and understanding the implications of policy. I'm always going into any relationship, particularly with researchers, thinking about, what are the unintended consequences for our community? What does this look like when it--when policy comes from the research that's being conducted, right? Because very often, that's what the research leads to. And so, really understanding that when we're talking about community, we're talking about a body of experts that are experts in their lived reality. And so, there has to be engagement from that perspective.
LINDA A. SEABROOK: Yeah, 100 percent. And, you know, it reminds me of, like, when we talk about immigrants, right? The immigrant community. And automatically, there's almost this—you lean toward Hispanic/Latino communities. And, you know, there are immigrants from all over the world, the entire continent of Africa, the Caribbean, right? So, yeah, that's just such, such an important point.
So, mindful of time, and I just really enjoyed this conversation so, so much. And I want to thank our amazing speakers for spending their valuable time with us today and sharing their thoughts and their expertise. And I also thank all of you for joining us. Please be on the lookout for the remainder of the capacity-building series that will cover topics such as logic models and program planning, budgeting and record keeping, and audits, what are they good for? (I know you want to sing the song.) So have a great rest of your day. And until next time, be safe, be well, and thank you.
Disclaimer:
Opinions or points of view expressed in these recordings represent those of the speakers and do not necessarily represent the official position or policies of the U.S. Department of Justice. Any commercial products and manufacturers discussed in these recordings are presented for informational purposes only and do not constitute product approval or endorsement by the U.S. Department of Justice.