An interesting study

December 30, 2012

I said in yesterday’s post that I would talk more about the Russo-Fitzgerald-Eveland study on the evaluation of training programs for resident physicians. It’s an interesting piece of research in many ways, both substantively and in terms of process. It’s basically a report on Ruthann Russo’s dissertation research; Steve Fitzgerald was her dissertation chair. The study was completed and passed before I ever got involved with it. Steve brought me in when the article was largely complete to provide a second opinion on some of the statistical analyses and how they might best be presented. In typical fashion, I proceeded to jump in with both feet, suggesting a rather different way of conducting and presenting the analyses: a structural model that could incorporate different kinds of effects and estimate all coefficients simultaneously. Initially somewhat skeptical, my colleagues generously allowed me to run with this long enough to convince them of its benefits. The article represents, I believe, some of the best kind of research collaboration we can undertake: different people providing different kinds of contributions and expertise, and playing out those differences in an atmosphere of mutual respect and joint purpose. None of us could have written this article individually; collectively, we’ve come up with something good.

Substantively, the article is interesting because it reports the results of a truly legitimate field experiment, without many of the compromises with true experimental design that so often plague field experiments. It featured random assignment of participants, clearly defined alternative treatments and a control condition, a manageable time frame, and clear, relevant outcome measures. Moreover, the results were interpretable in practical terms and even implementable without gigantic crises. This was made possible by a well-defined set of research questions at the outset of the study and extensive cooperation among a lot of participants to make it happen in the right way. I’m on record in a number of places as being somewhat suspicious of the experimental model derived from the natural sciences and applied uncritically to behavioral science problems; I believe it can lend an air of spurious precision to the resolution of problems that are not inherently that precise. In this case the model proved its worth, although it also demonstrated just how complex and costly it can be to implement these designs correctly.
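To make the design concrete, here is a minimal sketch of the kind of setup described above: participants randomly assigned to treatment or control arms, with a single outcome compared across groups afterward. The arm names, sample size, and scores here are entirely hypothetical and are not taken from the study.

```python
# A minimal, hypothetical sketch of the design described above -- not the
# study's actual code, variable names, or data.
import numpy as np

rng = np.random.default_rng(seed=42)

n_residents = 90
arms = np.array(["control", "training_a", "training_b"])

# Randomly assign each resident to one arm (simple randomization;
# a real study might block or stratify by site or specialty).
assignment = rng.choice(arms, size=n_residents)

# Placeholder outcome scores -- in the study these would be the clearly
# defined outcome measures collected after the intervention.
outcome = rng.normal(loc=70, scale=10, size=n_residents)

# Compare group means as a first look before any formal modeling.
for arm in arms:
    scores = outcome[assignment == arm]
    print(f"{arm}: n={scores.size}, mean={scores.mean():.1f}")
```

Even a toy version like this makes clear why the real thing is costly: every step, from randomization to outcome collection, has to be coordinated across many people and sites.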

As I said, my principal contribution was to frame the analysis as the estimation of a single structural model rather than a series of linked and overlapping multiple regressions. My original task was to review the regression findings that had already been developed. In the course of identifying a number of errors that had crept into those analyses, it occurred to me that the whole problem could be phrased more clearly and parsimoniously as a single model. I had taught structural modeling for the previous couple of years as an elective course in our PhD program, although I had never used it myself in an actual study with real data. It was a significant learning experience for me, both in improving my facility with the analytical approach and in trying to explain that approach to my colleagues, who, unlike students, were not content to let me get away with glittering generalities.
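For readers curious what that reframing looks like in practice, here is a small illustrative sketch in Python. It uses made-up data, hypothetical variable names, and the semopy package (one of several SEM libraries); the model is a generic two-equation path model, not the actual model from the article. The point is simply the contrast: where the original approach fit separate, overlapping regressions, a structural model specifies all the equations together and estimates every path coefficient simultaneously.

```python
# Hypothetical illustration only: synthetic data and a generic path model,
# fit with semopy (a lavaan-style SEM package for Python).
import numpy as np
import pandas as pd
import semopy

rng = np.random.default_rng(0)
n = 200

# Synthetic data with a simple indirect structure:
# training -> engagement -> performance, plus a direct training -> performance path.
training = rng.normal(size=n)
engagement = 0.5 * training + rng.normal(scale=0.8, size=n)
performance = 0.4 * engagement + 0.2 * training + rng.normal(scale=0.8, size=n)
data = pd.DataFrame({"training": training,
                     "engagement": engagement,
                     "performance": performance})

# The piecewise approach would fit two separate regressions:
#   engagement ~ training        and        performance ~ engagement + training
# A structural model specifies both equations at once and estimates
# all path coefficients simultaneously.
desc = """
engagement ~ training
performance ~ engagement + training
"""
model = semopy.Model(desc)
model.fit(data)
print(model.inspect())  # path estimates, standard errors, p-values
```

One practical benefit of the joint specification is that quantities spanning equations, such as indirect effects, come out of the same estimation rather than being pieced together by hand from separate regressions.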

The real credit in this case, as in almost all really good research, goes to the data. As I will develop in future posts, I tend toward a somewhat mystical view of data and data quality. I believe that data are living things, insofar as they are directly derived from living things, and therefore deserve the degree of respect that we naturally accord to living things. Give good data a chance to tell their story unimpeded by your own preconceptions, and you’d be amazed how often useful truths can emerge. In this case, the researchers respected the data and enabled a clear story to be told. In a future post, I’ll talk more about this element of how good data can be facilitated, as well as how they can be damaged. For now, let me simply express again my appreciation to my colleagues for the opportunity to participate in this excellent study and to contribute something new and different to it. Good research is fun, and the element of fun in it always needs to be respected and encouraged.