Method

My autoethnography prompts:

  • a consultative situation,
  • a collaborative inquiry,
  • affording opportunities for authentic assessments.

What I am studying was much rarer before COVID. I used to teach trades math within a “social ontogeny” that was suddenly severed when Zoom boxes chopped off our “hands-on” instruction. Teaching apprentices online is a new learning experience for us, requiring us to discover relevant methods rather than describe methods already well documented in the literature. Direct observation works best with professional development workshops. A grounded theory approach seems to me to be the best choice of research method for our workshops.1

Unit of Analysis

Collaborative groups of workforce instructors serving NYC union members who choose, of their own volition, to join a Saturday afternoon UFT Consortium of Worker Education Chapter Digital Literacy Professional Development workshop. Six were offered in 2023.

Situation

“Our workshops” means I am not a detached observer: I am a participant, embodied within our Chapter’s goal of mastering andragogic digital literacy skills. Unlike the East LA PREP project we read about in class, which suffered implementation issues, we have already joined together over six Saturday afternoons toward constructing sustainable, resilient UFT-CWE DL-PDs. Our chapter has already built up considerable trust and commitment, relevant to many of the research parameters discussed infra.

Triangulation

No samples will be gathered, as there is only so much you can do in a four-hour PD. For similar reasons, triangulation is not practicable in my situation.

Problematic

The “natural attitude” toward digital literacy is instrumental: clicking quicker, swiping screens sequentially, working down a checklist. I am more interested in deploying digital literacy toward furthering workforce development, analogous to the migration from mechanical library searches to information literacy, which focuses on teaching students to critically engage with information collaboratively. How to measure and assess these deeper understandings is what I am trying to get at when designing UFT-CWE DL-PD workshops.

For example, if and when we apply a treatment to a DL-PD next year, the simplest being randomly splitting DL-PD participants into experimental and control groups, Hawthorne and John Henry effects will be minimal, because we are all UFT-CWE members. Further, the results from any UFT-CWE DL-PD treatment would be reported first, and perhaps only, to UFT-CWE members, due to our command and control of our LMS. So everyone in the UFT-CWE will benefit from a treatment, regardless of whether they in fact receive it.
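If we ever run that simple random split, a minimal sketch of the assignment step might look like the following (Python; the roster, the function name random_split, and the seed are hypothetical assumptions for illustration, not part of our current design):

    import random

    def random_split(roster, seed=2023):
        """Shuffle a copy of the registrant roster and split it roughly in half."""
        rng = random.Random(seed)      # fixed seed so the assignment can be audited later
        shuffled = list(roster)        # work on a copy, leaving the original roster untouched
        rng.shuffle(shuffled)
        midpoint = len(shuffled) // 2
        return shuffled[:midpoint], shuffled[midpoint:]   # (experimental, control)

    # Hypothetical usage with a placeholder roster of DL-PD registrants.
    experimental, control = random_split(
        ["Instructor A", "Instructor B", "Instructor C", "Instructor D"]
    )

Either half could then receive the treatment; as noted above, reporting would still go to all UFT-CWE members through our LMS.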

I believe the effects from the following threats and biases will also be minimal: history, maturation, floor and ceiling effects, instrumentation, statistical regression, mortality, school, class, artificiality, and reactivity.

On the other hand, teacher effects definitely occur: the Post Training Satisfaction Survey gave the last DL-PD higher ratings than I did. Testing effects may occur, as Revision #1 infra includes something like a pretest, the word or free list. If groups are chosen via a quasi-experimental method, then selection bias could occur. A lack of internal validity is possible, as we are still trying to discover appropriate methods for our situation.

Nonrepresentativeness is more problematic. Professional development of worker instructors is a tiny niche in the world of education. It is precisely my choice of grounded theory that makes generalization, by definition, unlikely.

1 I received feedback from my professor that “action research” is a more accurate term for my travails. [Ed. 12/31/23]


