Social Work Tech Notes - Social Work and Future Technology: What Can Be Automated, Will Be

by Stephen P. Cummings, MSW, ACSW, LSW

     Early in my career, I worked in the Emergent Treatment Center (ETC) at a Level I trauma hospital. I’d been working on the inpatient side of the hospital for a time, but the pace in the ETC was nothing like anything I’d experienced before. Patients would arrive in clusters. AirCare would bring patients to the hospital from out of state. Family members who woke up that morning expecting a normal day were now sitting by their loved ones, trying to make meaning of the series of events rapidly unfolding around them.

    As the social worker, I would seek to complete assessments, determine who would make decisions, and anticipate discharge planning needs during these emergent admissions. Throughout this experience, I held on to the belief that social work, along with the other allied health professions in this setting, would be safe from the encroaching tide of automation through technology.

    Looking back, I wonder if I was making a misguided presumption. Is automation a threat to social work practice in the field? Or is it a tool?

“What Can Be Automated, Will Be”

    The phrase “what can be automated, will be automated” has been bandied about so often of late that it has taken on the status of a truism. It is attributed to Harvard Business School professor Shoshana Zuboff, who made the prediction roughly 30 years ago (Zuboff, 2013), and it seems to be coming to fruition. Although robots aren’t working the front lines of social work practice, there are numerous examples of automation in the wild, embraced and used by the public and by our profession.

The Difference Between AI and Automation

     Before diving in, let’s first clarify what we mean by “automation” and “artificial intelligence.” The two terms are frequently used interchangeably. I’m guilty of this myself, especially when I consider how I presumed my social work practice was protected from the encroaching expansion of workplace technologies by simple definition: social workers believe in the dignity and worth of the person, and the operative word, “believe,” presumes human engagement.

    This is something of a false dichotomy, a logical fallacy that frames a problem as a choice between only two options. In this case, I saw the crossroads in simplistic terms: embrace technology fully, or reject it out of hand. As I noted in an earlier column (see Remember Hand-Written Progress Notes? A Social Worker’s Tale of Technology and Cultural Change, Winter 2018), I embraced new technology that allowed social workers to enter assessments and progress notes into a computer database. That technology was not embraced by every member of the social work team to which I belonged.

    Moore, Solomon, and Barney (2018) provide an overview of automation in multiple contexts: numerical control (programmable tools that control repetitive tasks, such as 3-D printing), computer-aided manufacturing (computer software that controls machinery), flexible manufacturing systems (arrays of machines and other automation tools), and industrial robots (useful for handling materials and manipulating pallets).

    While automation may seem far removed from the routine engagement of social work practice, it’s likely you’ve benefited from it already. Auto-correction, a useful if sometimes frustrating part of our daily electronic communication, is an example of automation. Google embeds auto-response suggestions to speed up the composition of email replies. My son’s school often sends questions about upcoming volunteer opportunities (“Would you be able to help out the booster club this Friday?”); Google offers a small selection of canned choices (“Of course!” “I’m sorry, but I can’t.”). I probably haven’t written an original response to my son’s school in about a year.
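    For readers curious what even the simplest form of this kind of automation looks like under the hood, here is a toy sketch in Python. It is not how Google’s suggestion feature actually works (that relies on machine learning models trained on enormous volumes of email); it simply matches keywords to canned replies, which is enough to illustrate the idea of a machine taking over a repetitive task. All names in the sketch are hypothetical.

    # Toy illustration of automated reply suggestions (hypothetical; not Google's actual system).
    # A real system would use trained language models; this one only matches keywords.

    CANNED_REPLIES = {
        "volunteer": ["Of course!", "I'm sorry, but I can't."],
        "meeting": ["That time works for me.", "Could we reschedule?"],
        "thank": ["You're welcome!", "Happy to help."],
    }

    def suggest_replies(message):
        """Return canned reply suggestions based on simple keyword matching."""
        text = message.lower()
        suggestions = []
        for keyword, replies in CANNED_REPLIES.items():
            if keyword in text:
                suggestions.extend(replies)
        # Fall back to a generic reply when no keyword matches.
        return suggestions or ["Thanks for your message."]

    if __name__ == "__main__":
        email = "Would you be able to help out the booster club this Friday? We need volunteers."
        print(suggest_replies(email))
        # ['Of course!', "I'm sorry, but I can't."]

    Even this crude version shows why the technique appeals: once the canned responses exist, replying takes a single tap instead of a composed sentence.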

    If automation takes the effort out of computation and repetitive tasks, artificial intelligence (AI) seeks to emulate human nature. As Moore et al. (2018) put it, “Another way of looking at AI is taking human skills and tendencies and applying them to inanimate objects and ideas.” One of the better-known goals of AI is for a machine to become so convincing that it passes the so-called Turing test. If you’ve been frustrated by automated voice assistance on the phone, or misunderstood by your Alexa or Siri application, you’ve watched a machine fail the Turing test. This is related to the concept of the Uncanny Valley, in which an object resembles something human, but our emotional response to it tells us something’s...off. If you watched the recent movie Rogue One: A Star Wars Story and found yourself feeling uneasy during scenes in which well-known actors were recreated with digital technology, you’ve experienced being lost in the Uncanny Valley.

    Google has received public attention for attempting to make a machine seem convincingly human. Onstage at a recent developers’ conference, the automated Google Assistant appeared to place a voice call to a local establishment and set up an appointment with a human employee. Some in the tech community speculate that the demonstration may have been staged (Kosoff, 2018). Watching it, I thought it was a high-risk move: the human on the other end of the conversation could have derailed the calm tones of the Google Duplex assistant in any number of ways, by mumbling, getting distracted or frustrated, or simply ending the call abruptly. Still, the overall goal of the presentation doesn’t seem too outlandish. Google has been working on real-time language interpretation through the mobile-based Google Translate app (Russell, 2015), a feat I would not have believed possible 20 years ago.

 The Impact on Social Work

     So, where is social work positioned in this change? Existing indicators suggest that, despite upheaval in other professional areas, social workers are well positioned to thrive in the midst of automation. NPR, in a 2015 overview of professions, indicated that social workers (in the category of “mental health and substance abuse social workers”) face a 0.3% risk of automation, making theirs, according to that report, the hardest job for robots to do, because of its reliance on “cleverness, negotiation, and helping others” (National Public Radio, 2015). (The most likely career to be automated? Telemarketers.)

Social Work Leadership in the Tech Arena

     Here, I’ll return to my mantra: social workers do not have the choice, nor the luxury, of ignoring technology’s impact on our profession. The NASW Code of Ethics, in updates that took effect in 2018, weaves an understanding of technology throughout its revisions. Those revisions include technology in our definition of competence (standard 1.04): “Social workers who use technology in the provision of social work services should ensure that they have the necessary knowledge and skills to provide such services in a competent manner.” I note here that “social workers who use technology” could encompass our practice in just about every area; any social worker using technology to chart, communicate, or promote services fits this description.

    In addition to the updated NASW Code of Ethics, the Council on Social Work Education recently published a report, “Envisioning the Future of Social Work,” which includes a strong focus on the social worker’s role (CSWE, 2018). The report emphasizes the need for social workers to embrace technology in practice and to take the lead on technology within the profession.

    I reflect on my experience in the emergency room years ago and how little social work has changed in how we engage with patients and families. Some elements are quite different: in that environment, screening can now be completed by the patient or caregiver on an electronic touchscreen, allowing demographic data and patient history to be gathered before the initial meeting with the social worker. Perhaps the most significant shift in how patients interact with health professionals through technology can be observed at the Department of Veterans Affairs, which offers its patients a Virtual Medical Center (https://vavmc.com/). This virtual environment isn’t for every patient, but it does reflect the changing landscape.

    Social workers should prepare, not only to adapt, but to lead in these new virtual spaces.

References

Council on Social Work Education. (2018). Envisioning the future of social work: Report of the CSWE Futures Task Force. Retrieved from: https://www.cswe.org/About-CSWE/Governance/Board-of-Directors/2018-19-Strategic-Planning-Process/CSWE-FTF-Four-Futures-for-Social-Work-FINAL-2.aspx

Kosoff, M. (2018, May 17). Um, did Google fake its big A.I. demo? Vanity Fair. Retrieved from: https://www.vanityfair.com/news/2018/05/uh-did-google-fake-its-big-ai-demo

Moore, A., Solomon, R., & Barney, M. (2018). Automation overview: From manual processes to machines. Computer Science Online. Retrieved from: https://www.computerscienceonline.org/cutting-edge/automation-ai/

National Association of Social Workers. (2018). Code of ethics. Retrieved from: https://www.socialworkers.org/About/Ethics/Code-of-Ethics/Code-of-Ethics-English

National Public Radio. (2015). Will your job be done by a machine? Planet Money blog. Retrieved from: https://www.npr.org/sections/money/2015/05/21/408234543/will-your-job-be-done-by-a-machine

Russell, J. (2015, January 14). Google Translate now does real-time voice and sign translations on mobile. TechCrunch. Retrieved from: https://techcrunch.com/2015/01/14/amaaaaaazing/

Zuboff, S. (2013, June 25). Be the friction: Our response to the new Lords of the Ring. Frankfurter Allgemeine Zeitung. Retrieved from: http://www.faz.net/aktuell/feuilleton/the-surveillance-paradigm-be-the-friction-our-response-to-the-new-lords-of-the-ring-12241996-p5.html?printPagedArticle=true#pageIndex_4

Stephen Cummings, MSW, ACSW, LSW, is a clinical assistant professor at the University of Iowa School of Social Work, where he is the administrator for distance education.
