The Ontario Action Researcher
 

Editorial

IS THERE META IN THE MADNESS? ACTION RESEARCH AND THE USE OF META-ANALYSIS

Kurt W. Clausen, Editor

A slur has continued to dog the small-scale qualitative researcher: a hard-core "scientist" can bring an action research article to its knees with a well-placed "that's all well and good, but can it be generalized?" With these words, a good piece of useful investigation can collect dust on the shelf in perpetuity for want of any connection to the outside world. A teacher may have collected data on her students and found ways to benefit her class, but what relationship could this have with other situations? Other communities? Other provinces or countries? For years, such questions have been laughed off with the answer "none".

Beginning in the late 1970s and early 1980s, however, a number of researchers began promoting a new style of research called "meta-analysis". The term is largely attributable to the work of Gene V. Glass (see Glass, 1976; Glass & Smith, 1979; Glass, McGaw & Smith, 1981), who sought to combine the results of quantitative research studies to reach larger conclusions:

The term is a bit grand, but it is precise, and apt, and in the spirit of "meta-mathematics," "meta-psychology," and "meta-evaluation." Meta-analysis refers to the analysis of analyses. I use it to refer to the statistical analysis of a large collection of analysis results from individual studies for the purpose of integrating the findings. It connotes a rigorous alternative to the casual, narrative discussions of research studies which typify our attempts to make sense of the rapidly expanding research literature. (Glass, 1976, 3)

Since then, writers have advocated for the use of this tool: "there is now a better chance that the small experiment will be recognized as making an important contribution to the aggregation of knowledge in the social sciences" (Fitz-Gibbon, 1985). As such, it could enhance the usefulness of small-scale research studies; it could provide a way of relating conclusions drawn from numerous experiments without their having been coordinated from the outset (allowing researchers to use older data); finally, it could allow historical researchers to study an effect over time (p. 46).
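
To make Glass's phrase "the analysis of analyses" concrete, the short sketch below shows one common way findings can be integrated: each study's effect size is weighted by the inverse of its variance, and the weighted results are pooled into a single estimate with a confidence interval. It is a minimal illustration in Python; the effect sizes, variances and function name are invented for the example and are not drawn from any study discussed here.

    import math

    def pool_fixed_effect(effects, variances):
        """Inverse-variance (fixed-effect) pooling of study effect sizes."""
        weights = [1.0 / v for v in variances]        # more precise studies get larger weights
        pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
        pooled_se = math.sqrt(1.0 / sum(weights))     # standard error of the pooled estimate
        return pooled, (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)

    # Three hypothetical small studies: standardized effect sizes and their variances.
    effects = [0.42, 0.15, 0.30]
    variances = [0.04, 0.09, 0.02]

    estimate, ci = pool_fixed_effect(effects, variances)
    print(f"Pooled effect: {estimate:.2f}, 95% CI: {ci[0]:.2f} to {ci[1]:.2f}")

No single study here would be persuasive on its own, but the pooled estimate gives Fitz-Gibbon's "aggregation of knowledge" a concrete form.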

Aware of the uphill struggle this endeavour entails, Glass was the first to point out the limitations of this device: it must be rigorously undertaken, and that is a difficult prospect when dealing with a very uneven playing field of research. Gerard Dallal (2003) points out that meta-analysis must always struggle with two issues:

  1. Publication bias (also known as the file drawer problem): Dallal shows that journals are much more likely to accept papers that show some correlation than those that show none; studies that show no effect are simply filed away (hence the term). When a meta-analysis is performed, its authors tend to draw only on the studies that have been published, giving an unfair advantage to one side of a debate. A toy simulation of this distortion follows the list.
  2. The varying quality of the studies: Dallal also notes that when meta-analysts are assembling the available literature, it can be difficult to determine the amount of care that went into each study. "Thus, poorly designed studies end up being given the same weight as well designed studies. This, too, can lead to misleading results when the data are summarized."
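
The file drawer problem lends itself to a toy demonstration. In the Python sketch below, the true effect, standard error, number of studies and significance cut-off are all invented for illustration; the only point is that a summary of the "published" (statistically significant) subset overstates the effect that a summary of every study would show.

    import random
    import statistics

    random.seed(1)
    TRUE_EFFECT, SE, N_STUDIES = 0.10, 0.15, 200   # a small true effect, many noisy small studies

    # Simulate observed effects; "publish" only those reaching nominal significance.
    observed = [random.gauss(TRUE_EFFECT, SE) for _ in range(N_STUDIES)]
    published = [e for e in observed if abs(e) / SE > 1.96]

    print(f"Mean of all {len(observed)} simulated studies: {statistics.mean(observed):.2f}")
    print(f"Mean of the {len(published)} 'published' studies: {statistics.mean(published):.2f}")

A meta-analyst who can only see the second group will, through no fault of the individual studies, summarize an effect much larger than the one that actually exists.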

In the end, Dallal states that because of these issues, and because this sort of research is typically done when no large-scale, high-quality studies exist, it is all but impossible for any meta-analysis to draw clear conclusions. The difficulty only grows when dealing with more qualitative studies. Nevertheless, action research projects, like those that find a home in this journal, call out to be linked to a larger community. Terri Lynn Kirkey's work examines the uses of differentiated instruction techniques as a means of aiding the primary classroom, while Woehrle, Fox and Hoskin's contribution weighs the benefits of a new schedule within their school board. Of the three submissions for this issue, the work of Jones and Song comes closest to a small-scale meta-analysis: they draw conclusions from four action research projects within their university.

If you do decide to undertake this type of research, Egger, Davey Smith & Phillips (1997) outline the hallmarks of a good meta-analysis (a rough illustration of the pooling and sensitivity-analysis points follows the list):

  • "A meta-analysis should be as carefully planned as any other research project, with a detailed written protocol being prepared in advance.
  • The a priori definition of eligibility criteria for studies to be included and a comprehensive search for such studies are central to high quality meta-analyses.
  • The graphical display of results from individual studies on a common scale is an important intermediate step, which allows a visual examination of the degree of heterogeneity between studies.
  • Different statistical methods exist for combining the data, but there is no single "correct" method.
  • A thorough sensitivity analysis is essential to assess the robustness of combined estimates to different assumptions and inclusion criteria" (p. 1533)
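
As a rough illustration of the last two points, the Python sketch below pools a set of hypothetical effect sizes (fixed-effect, inverse-variance weighting, chosen here purely for simplicity since, as the authors note, no single method is "correct") and then performs a leave-one-out sensitivity analysis, re-pooling with each study removed in turn. If dropping one study shifts the combined estimate markedly, the conclusion is not robust.

    def pool(effects, variances):
        """Fixed-effect, inverse-variance pooled estimate (one of several defensible methods)."""
        weights = [1.0 / v for v in variances]
        return sum(w * e for w, e in zip(weights, effects)) / sum(weights)

    # Hypothetical effect sizes and variances for five included studies.
    effects = [0.42, 0.15, 0.30, 0.55, 0.05]
    variances = [0.04, 0.09, 0.02, 0.12, 0.06]

    overall = pool(effects, variances)
    print(f"All studies included: {overall:.2f}")

    # Leave-one-out sensitivity analysis: recompute the pooled estimate without each study.
    for i in range(len(effects)):
        reduced = pool(effects[:i] + effects[i + 1:], variances[:i] + variances[i + 1:])
        print(f"Without study {i + 1}: {reduced:.2f} (shift of {reduced - overall:+.2f})")

The same loop could be repeated under different inclusion criteria or weighting schemes, which is the spirit of the "thorough sensitivity analysis" these researchers call for.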

This is challenging enough for medical research (to which these researchers refer). When it comes to action research studies, the meta-analyst enters virgin territory. To many it may prove a daunting task, with a great deal of initial legwork required to find a level playing field. However, the resulting connections may prove well worth the effort.

References

Dallal, G.E. (2003). The Little Handbook of Statistical Practice. Available online at: http://www.tufts.edu/~gdallal/meta.htm.

Egger, M., Davey Smith, G. & Phillips, A.N. (1997). Meta-analysis: Principles and Procedures. British Medical Journal, 315(7121): 1533-1537.

Fitz-Gibbon, C.T. (1985). The Implications of Meta-analysis for Educational Research. British Educational Research Journal, 11(1): 45-49.

Glass, G.V. (1976). Primary, Secondary, and Meta-analysis of Research. Educational Researcher, 5(10): 3-8.

Glass, G.V., McGaw, B. & Smith, M.L. (1981). Meta-analysis in Social Research. London: Sage.

Glass, G.V. & Smith, M.L. (1979). Meta-analysis of Research in Class Size and Achievement. Educational Evaluation and Policy Analysis, 1(1): 2-16.