If you're in social work, chances are you've come across terms like artificial intelligence (AI), machine learning, and algorithms. Some of you might already be using AI in your practice or personal life. AI is a technology that enables computers and machines to mimic human abilities such as learning, decision-making, and acting independently. It has penetrated most of our daily lives and is already making its way into social work practice, raising fundamental questions and posing real challenges if it is integrated without critical examination.
This technology is not new – AI was founded as an academic discipline in 1956 – but it gained widespread attention after 2020, particularly with the release of a generative AI tool, ChatGPT. Many of us suddenly realised the capabilities of this type of AI, particularly its ability to quickly generate human-like content, including text, images, code, audio, and video, and its wider potential. AI has now become a central point of discussion in both social work practice and education. So why has AI become such a hot topic in social work? The answer is simple: the profession is under immense strain. Social workers are drowning in admin. Studies show many regularly exceed their contracted hours, managing excessive caseloads with insufficient support, leading to stress, burnout, and problems with recruitment and retention (Haider et al., 2025). In this context, AI is being sold as a solution. The promise is that by automating paperwork, AI can free up social workers to do what they were trained to do: spend time with people, build relationships, and make a difference (Garkisch & Goldkind, 2024).
In the UK, several local councils are piloting – and some have implemented – tools like "Magic Notes", a generative AI that records meetings with clients and automatically produces case notes and assessments (Haider et al., 2025). The company behind it claims it can cut the time spent on an assessment from 90 minutes to just 30. In the US and Australia, researchers are using machine learning to analyse child welfare records, identify domestic violence, and even predict which children might be at future risk (Li et al., 2025). Chatbots are also being developed to offer basic mental health support, especially for young people who might be hesitant to reach out to a human (Haider et al., 2025).
Despite the hype, how widespread is this, really? The honest answer is: not very. The technology is still in its infancy in social work. After an exhaustive search, one review found only eight empirical studies on AI-assisted case management globally (Li et al., 2025). Another noted that most of the academic literature is conceptual – it's a lot of people thinking and writing about what might happen, rather than evaluating what is happening (Garkisch & Goldkind, 2024). So, while the potential is huge, the actual day-to-day use of AI in social work remains limited to small-scale pilots and exploratory projects (Haider et al., 2025).
Looking ahead, the possibilities of AI in social work practice and education are undoubtedly significant. Imagine a tool that could analyse anonymised data across a local authority to spot emerging patterns of need, allowing for early intervention before a crisis hits (Garkisch & Goldkind, 2024). Imagine virtual reality simulations that help students practise difficult and realistic conversations in a safe environment (Haider et al., 2025). Or imagine a system that instantly matches the right service to the right person, based on a complex understanding of their needs (Li et al., 2025). That's the promised future, but there is a big 'but': the ethical concerns are profound.
An AI literature review by Haider et al. (2025), which evaluated and synthesised studies across health, social care, and social work, identified several moral and ethical challenges posed by AI in social work. First, bias: if you train an AI on historical data that reflects systemic racism or classism, the AI will simply automate that discrimination, making it faster and harder to challenge (Haider et al., 2025). Second, the "black box": with some AI, even the designers and developers don't know exactly how it reaches a decision or generates its outputs. How can a social worker be accountable for a decision they can't explain to the people they work with, or to a court? Third, dehumanisation: social work is fundamentally relational. There's a real fear that handing over too much to machines will erode the empathy, intuition, critical and creative thinking, and curiosity that are central to keeping people safe. And who is accountable when an AI-driven assessment is wrong? The developer? The agency? The social worker who trusted it? (Haider et al., 2025).
So, how do we balance the opportunities with the risks? The answer isn't to bury our heads in the sand. AI is coming. The task is to shape it to align with social work values. This means demanding transparency – we need to know how these tools work. It means insisting on human oversight – AI should support, not replace, professional judgement (Garkisch & Goldkind, 2024). And crucially, it means involving social workers and people with lived experience in the design and testing of these tools from the very beginning (Haider et al., 2025). Current developments suggest that AI will not replace social workers. We do not know what the future holds, but one point is clear: social workers need to understand AI and engage with it competently, fluently, and critically. The conversation is just beginning, and we all need to be part of it.
Read more in Sharif Haider's paper:
The emerging use of AI in social work
Garkisch, M. and Goldkind, L. (2024) 'Considering a Unified Model of Artificial Intelligence Enhanced Social Work: A Systematic Review', Journal of Human Rights and Social Work. doi: 10.1007/s41134-024-00326-y.
Haider, S., Ferguson, G., Flynn, A., Giraud, J. and Vseteckova, J. (2025) Emerging use of AI in social work education and practice: A rapid evidence assessment of the literature. Sheffield: Social Work England.
Li, L., Wang, M. and Jian, M. (2025) 'Artificial Intelligence-Assisted Case Management in Social Work Services: A Systematic Review', Research on Social Work Practice, 35(1), pp. 1-11. doi: 10.1177/10497315251329531.