Practice Data
by Jessica M. Grosholz, Ph.D., Jane Roberts, LCSW, Ph.D., and Melissa M. Sloan, Ph.D.
Yes, you can! Students have been asking about publishing research, and we imagine that interest is closely tied to their pending graduation. They are asking things such as, “Don’t I need a Ph.D. to publish?” and “Don’t journal articles only come from authors associated with a university?” The answers are a resounding “no” and “no.”
New social workers and others of you still early in your careers can definitely be thinking about publishing the hard-won findings from any research-related projects you’re conducting. Those findings can often be derived from your seemingly routine, day-to-day organizational work and data collection (even though you may not think of it as data).
Case in point: During her years as a medical social worker, one of the authors collected monthly data on the hospital clients she served. She eventually realized that several outlets, including the hospital network’s monthly magazine, wanted information that could help other social workers, hospital department personnel, and even statewide or national audiences. The magazine subsequently published several of her practice-related articles on topics such as elders adjusting to hip fractures and young mothers facing adoption dilemmas.
Research or Program Evaluation?
The word “research” can seem daunting or lofty and reminiscent of hard labor during college or grad school days. However, if you think of your routine collection of data (number of clients served, demographics, even progress notes or outcomes recorded over the course of a client’s treatment), you have something to share. Most social work positions demand some accounting, generally in the form of monthly data collection and reporting. We have found this to be true throughout our careers, whether in medical social work, child and adult protective services, psychiatric social work in a mental health setting, independent practice, or hospice social work. So, if you’re collecting client data anyway (and if you’re not, you probably should be), and if it can be aggregated to disguise the identity of any individual client, you have a dataset that can be analyzed for best practices or outcomes and that will tell you whether your practice methods or client interventions are moving in the direction you intend. That information is highly sought after by your colleagues, as well.
Yes, if you set out to collect data for a specific research purpose, such as determining the most effective method of treatment for a client population or your clients’ beliefs about some phenomenon like substance use or child-raising, you will need approval from an Institutional Review Board (IRB), the body that oversees research with human participants. The IRB’s purpose is to ensure that you’re not harming anyone, and rightly so. If you are using human participants in a deliberately planned research project, they have the right to decline to be involved and to know what involvement will look like. It’s usually feasible to align yourself with a university researcher with similar interests if there is one in your area. Additionally, many large organizations, such as hospitals or research centers, have their own IRB that will also work with the public. There may be a nominal fee, which your organization would likely be happy to cover so that you can share your work, although in our experience there is usually no fee. But even IRB approval for a full-fledged research project is not a daunting task. A general rule in regard to the IRB: If it’s “research,” you need IRB approval. If it’s a report, a program evaluation, or an overall review of your work to see whether you’re doing what you planned to do, it’s not generally considered research.
Suppose that you simply want to find out how clients believe their treatment is working, or the most frequently occurring complaints from your client base. You probably do what one of the authors did as a medical social worker—you just keep those demographics and client perceptions noted somewhere in a monthly report or similar mechanism. If you’ve examined and summarized 12 months of those reports, you have material for an article that may be of interest to colleagues. Too often, we stop at this point and do not go on to share our useful information. Most organizations are affiliated with a parent organization, a professional association, or some other network of like-minded professionals. Please don’t hesitate to submit your findings to these outlets.
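For readers who keep those monthly counts in a spreadsheet, here is one minimal sketch (in Python, using the pandas library) of how 12 months of de-identified report counts might be rolled up into an annual summary. The file name and column names are purely illustrative—substitute whatever your agency actually tracks—and this is only one of many ways to do it.

```python
# Illustrative sketch: summarizing 12 months of de-identified monthly reports.
# The file and column names below are hypothetical examples, not a prescription.
import pandas as pd

# Each row is one month's aggregate counts -- no client-identifying information.
# Example columns: month, clients_served, new_referrals, top_complaint
reports = pd.read_csv("monthly_reports_2023.csv")

annual_summary = {
    "total_clients_served": reports["clients_served"].sum(),
    "average_clients_per_month": round(reports["clients_served"].mean(), 1),
    "total_new_referrals": reports["new_referrals"].sum(),
    # The complaint category that appears most often across the year
    "most_common_complaint": reports["top_complaint"].mode().iloc[0],
}

print(annual_summary)
```

A one-page summary like this, updated yearly, is often all the raw material you need to begin drafting a practice-oriented article or newsletter piece.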
However, even a formally organized and planned study of a relevant community issue is not that hard to set up. Here’s an example of an easy, straightforward, inexpensive study that we conducted in one Florida community and how we went about it.
A Typical Example: A Study of a Foster Care System
A community oversight committee approached a practicing social worker and university instructor about studying how effectively foster care services were being provided. Three faculty members (the authors) collaborated, using qualitative research methods, to develop a conceptualization of a Florida tri-county foster care system—its successes, its challenges, and any gaps in service provision. The university faculty then aligned with the foster care service board to plan the study, and it became a practitioner-university collaborative effort.
The study was devised to determine the perspectives of those considered to be major stakeholders—foster care social workers, the foster care system administrators, addiction and substance abuse professionals, case managers, family preservation workers, and former foster children and parents themselves. Additionally, other community leaders—clergy, child protection investigators, and legal professionals—were invited. All participants were provided a description of the study, informed consent forms, and information about participating in and withdrawing from the study.
Four focus group interviews were conducted with the identified key stakeholders at a readily accessible site central to the tri-county area. A sample of 10 participants was selected from each of six stakeholder groups identified by a family safety coalition, the organization that had contracted for the study.
A note about collaborative studies such as this one: the connection among community service organizations, clergy, legal and investigative staff involved in foster care, and other community leaders adds immeasurable value. It brings together sometimes separate and differently valued professional disciplines and offers a relatively objective and safe environment in which to discuss similarities and differences freely. Framed as evaluative research, differences of opinion and belief about the foster care system, for example, can be examined in a way that sheds light on differing perspectives and potentially brings about mutually agreeable solutions. Further, universities typically want to provide research-derived direction and findings that are of use to real-world practitioners.
Study Outcomes
A total of 35 of the 60 invited individuals participated in the focus groups (a 58% response rate), and participants were combined into the following four groups: (1) education and prevention, (2) permanency and aging out, (3) investigative/legal, and (4) management and contracts. Each participant was given a $30 gift card to a local supermarket as a thank-you for participating; although many of the public officials necessarily declined the “reward,” the foster families appeared to accept it gratefully.
All of the groups were asked the same three questions and given a 90-minute discussion period during which student volunteers recorded the data:
- What do you think are the most effective services provided to children and families in the three counties?
- What gaps have you personally observed in the system of care?
- What do you think could be the most important improvements in services provided to children and families in the three counties?
Outcomes of the study revealed that participants were in accord on many issues but held differing beliefs on others. For example, focus group members saw value in sharing their information and intentions for foster children within the small committees scattered across the tri-county area, but they noted that greater coordination of those committees would be helpful. They also identified useful outcomes of community collaborations, which could result in shared grant-writing and fundraising efforts, or in problem-solving and raising awareness of foster care challenges within the communities.
The practice implications of such discussions are enormous. Perhaps one agency finds out that others are working toward the same goals, and perhaps newfound alliances spring up to share limited agency resources and better advance mutual goals and agency missions.
How Sharing Program Evaluations Can Advance Your Work
The practice implications of this type of study are evident, even within one’s own organization or independent practice. However, your outcomes are showcased when they are published or otherwise shared with the community and a larger network. For this reason, we highly encourage you to share any similar endeavor that you have conducted or currently have planned. We believe you can see how simple and straightforward this study was, yet it yielded valuable information for community agencies that did not have the time or resources to conduct such an inquiry themselves. Further, your social work degree prepared you to do such studies, in addition to clinical or community-action work.
We all engage in continual quality improvement efforts, and these should be shared with colleagues. Sharing also helps your organization see your productivity, enhances your own competence over time (in keeping with one of our social work ethical principles), and in some ways indirectly helps your clients help others. Publishing your work outcomes is a decided plus if you pursue higher education again at some point, and it can set you on the road to making evaluation and publication a regular, useful dimension of your work. Summaries of practice outcomes can also be condensed into clinician guides or information sheets to share with clients and colleagues.
So, please, let others know about your successes and the outcomes of your work!
Jessica M. Grosholz, Ph.D., is an assistant professor of criminology in the College of Liberal Arts and Social Sciences at the University of South Florida Sarasota-Manatee. Jane Roberts, LCSW, Ph.D., teaches MSW courses as well as baccalaureate-level social work courses in interdisciplinary social sciences at the University of South Florida Sarasota-Manatee. Melissa M. Sloan, Ph.D., is an associate professor of sociology and interdisciplinary social sciences at the University of South Florida Sarasota-Manatee.