Social Work Tech Talk: Tech and the Real World


by Gina Griffin, DSW, MSW, LCSW

     If you haven’t guessed by now, I love technology. My dad was a radar tech in the military, and he passed his love of all things tech on to me. By way of example, I just downloaded the Disney app for the Disney theme parks, which makes me rapturous because I can easily track all of my reservations. And we bought a Portal to help my mom keep in touch with us. However, as with most things, technology has a side that is less than benign. Used in the wrong ways, it can, and does, cause harm to our communities.

     Here are some common challenges presented by technology, in the context of social work practice:

Automated Decision Making

     This has become standard in the realm of social services, and in services in general. SNAP and loan applications, applying for college, and attempting to find help with healthcare or housing services have all become automated processes. The original belief was that allocation of these services would become fairer and less biased. Unfortunately, the opposite has proven true. These processes rely on algorithms: sets of instructions trained on specific data sets so that they learn how to make decisions. Personal biases still wind up in the process (O’Neil, 2017). Although race and gender may never be referenced directly, the training data may include features such as ZIP codes, which are very closely tied to race. For women, listing a known women’s college on a job application may reveal your gender, and you may be down-selected in comparison to men. In the long run, women, people of color, and members of the LGBTQ+ community may find themselves denied loans on the presumption that they are a bad risk, even when this is untrue. Worse, there may be outright errors in their files, and, because of the hidden nature of the algorithms, finding and disproving those errors can be next to impossible (Goldkind et al., 2018).
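The proxy effect described above can be made concrete with a toy simulation. The data, ZIP codes, and approval rule below are entirely hypothetical; the point is only that a rule which never sees a protected attribute can still disadvantage one group when a correlated feature stands in for it.

```python
# Illustrative sketch (hypothetical data): a "race-blind" rule can still
# discriminate through a proxy variable such as ZIP code.
import random

random.seed(0)

def make_applicant(group):
    # Assumption for illustration: residential segregation means group "A"
    # applicants mostly live in ZIP 10001, group "B" mostly in ZIP 10002.
    if group == "A":
        zip_code = "10001" if random.random() < 0.9 else "10002"
    else:
        zip_code = "10002" if random.random() < 0.9 else "10001"
    return {"group": group, "zip": zip_code}

applicants = [make_applicant(random.choice("AB")) for _ in range(10_000)]

def approve(applicant):
    # A rule learned from historical data might simply favor the ZIP code
    # where past approvals were concentrated. It never looks at "group".
    return applicant["zip"] == "10001"

rates = {}
for g in "AB":
    members = [a for a in applicants if a["group"] == g]
    rates[g] = sum(approve(a) for a in members) / len(members)
    print(f"Group {g}: approval rate {rates[g]:.0%}")
```

Even though the rule never references group membership, the two groups end up with sharply different approval rates, which is exactly the dynamic O’Neil (2017) describes.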

     How can we help? Social workers and other human service providers need to be involved on the front end when decisions about these products are made. Even if you’re not inclined toward technology, you can help by having a basic understanding of how this technology affects your clients, so you can help them to understand what is happening. And overall, we need to lobby for laws that allow our clients to own and control their own data, to be provided with a transparent view of the systems and decisions that affect them, and to opt out of data collection or uses that they believe will be harmful.

Justice System and Policing

     The justice system already relies heavily on technology to police our communities, as well as to make decisions about who will be granted parole and who won’t. Recidivism models are routinely used in the court system to make recommendations about who is likely to return to jail. Although many of these systems are known to be biased, and to disproportionately affect people of color, they’re still used in the majority of states to make these determinations (O’Neil, 2017).

     Predictive policing has also become a problem. Police forces have begun using software that predicts which areas they believe are most likely to be affected by crime. However, the use of these methods has become a self-fulfilling prophecy (Goldkind et al., 2018; Robé, 2016). They create a recursive loop: areas that are heavily policed will naturally demonstrate more crime, because we find what we are looking for. That data is fed back into the equation, those communities are deemed to have a higher rate of criminal activity, and so more policing is assigned. Individuals are also assigned risk profiles based on predicted behavior compiled from various data in their lives (Goldkind et al., 2018). There are many questions about the constitutionality and legality of these methods, but law enforcement still uses them.
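The recursive loop above can be sketched as a toy simulation. All numbers here are hypothetical: two neighborhoods have identical underlying crime, but a small initial imbalance in patrols snowballs because detected crime (not actual crime) drives the next round of patrol assignments.

```python
# Toy model of the predictive-policing feedback loop (hypothetical numbers).
true_crime = [100, 100]        # identical real incidents per period in both areas
patrols = [60.0, 40.0]         # patrol share (%), slightly imbalanced at start

for period in range(8):
    # Detected crime scales with patrol presence: we find what we look for.
    detected = [c * p / 100.0 for c, p in zip(true_crime, patrols)]
    # The apparent "hot spot" is handed an even larger share of next
    # period's patrols, up to a cap.
    hot = 0 if detected[0] >= detected[1] else 1
    patrols[hot] = min(patrols[hot] + 5.0, 95.0)
    patrols[1 - hot] = 100.0 - patrols[hot]

print(patrols)
```

After a few iterations, nearly all patrols sit in the first neighborhood, even though both neighborhoods had the same true crime rate throughout; the initial imbalance, not reality, drove the outcome.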

     Over time, these types of methods create considerable stress on individuals and communities, and people of color are more commonly affected. As an example, in 2011, Black and Latino individuals ages 14 to 24 made up 4.7% of the NYC population, but they made up 41% of stop-and-frisk incidents (Robé, 2016). Concerns about winding up on the wrong side of these types of methods become a part of the negative impacts that we consider when we discuss social determinants of health.

     Additionally, other types of surveillance are problematic. For example, social activists are subject to AI-powered facial recognition performed for law enforcement agencies. Private organizations gather information from social media, build databases, and sell law enforcement subscriptions to these resources (Miyamoto, 2020). Again, there are questions about the legality of these processes, and the intention of these tactics is to intimidate community organizers.

     How can we help? Again, awareness is the first step, and advocating for more just and transparent laws regulating the use of surveillance tactics is an area where social workers can provide assistance. Additionally, we have to think of ways to become proactive. Protesters can use their own video to record events as they unfold, and apps such as I’m Getting Arrested (available free in the Google Play Store) can help protesters and organizers have a plan in place for being detained before it happens.

Agencies

     Agencies are being called on more and more often to quantify their outcomes based on data. Large agencies may not consider this an issue; many employ data analysts or data scientists with specialized skills in collecting data and sharing outcomes (Miyamoto, 2020). Smaller agencies may find themselves, and their clients, at a disadvantage. Even though most social workers learn the basics of statistics, it’s likely that they haven’t used those skills since their master’s program. Furthermore, data science skills are much better suited to the prediction and analysis of agency data, and at this time these are not skills to which most social work students are exposed (Perron et al., 2020). In the long run, it’s the clients who pay. Small, specialized organizations that focus on assistance to the LGBTQ+ community, or on services for women or people of color, are more likely to lose out against larger organizations better equipped to provide data for funding. When those small agencies suffer, their clients suffer.

     How can we help? Social workers can help systemically by offering data science as part of the formal education of our students. While several schools have begun to offer training in tools such as R and Python, many have not. Individually, if you have a love of math, programming, and data analysis, you may want to learn these skills on your own. Larger organizations often offer this training, along with skills such as hospital informatics, to employees as part of continuing education. If you’re part of a smaller organization, you may want to learn one of these skills independently. There are lots of excellent resources online, and many of them are free.

     As social workers, it’s important that we begin to understand that the abuse of technology is a human rights issue. We need to more actively dedicate ourselves to finding ways to help our clients and our communities to make sense of these practices.

References

Goldkind, L., Thinyane, M., & Choi, M. (2018). Small data, big justice: The intersection of data science, social good, and social services. Journal of Technology in Human Services, 36(4), 175-178. https://doi.org/10.1080/15228835.2018.1539369

Miyamoto, I. (2020). Surveillance technology challenges political culture of democratic states. In A. L. Vuving (Ed.), Hindsight, insight, foresight: Thinking about security in the Indo-Pacific (pp. 49-66). Daniel K. Inouye Asia-Pacific Center for Security Studies. https://apcss.org/wp-content/uploads/2020/09/04-miyamoto-25thA.pdf

O’Neil, C. (2017). Weapons of math destruction. Penguin Books.

Perron, B. E., Victor, B. G., Hiltz, B. S., & Ryan, J. (2020). Teaching note—Data science in the MSW curriculum: Innovating training in statistics and research methods. Journal of Social Work Education. https://doi.org/10.1080/10437797.2020.1764891 

Robé, C. (2016, Fall). Criminalizing dissent: Western state repression, video activism, and counter-summit protests. Framework: The Journal of Cinema and Media, 57(2), 161-188.

Dr. Gina Griffin, DSW, MSW, LCSW, is a Licensed Clinical Social Worker. In 2012, she completed her Master of Social Work at the University of South Florida, and in 2021, she completed her DSW at the University of Southern California. She began learning R programming for data analysis in order to develop her research-related skills. She now teaches programming and data science skills through her website (A::ISWR) and free Saturday morning #swRk workshops.
