Surveying the field, coordinating research

Bradley Dilger, Purdue University — dtext.org

(Neil Baird, Western Illinois University)

My empirical research

  • No formal training in graduate school; Dartmouth Seminar (2011), self-taught with emphasis on qualitative methods
  • Writing transfer in the major, with Neil Baird
    • 16 student and 15 faculty participants for three years
    • Recent CCC, forthcoming WPA, more under review
  • Crow, the Corpus & Repository of Writing
    • Software, methods for integrating corpus + repository
    • Shelley Staples (Arizona), Bill Hart-Davidson (Mich St)

Call me a wannabe quantitative writing researcher!

Quantitative work has always been with us...

  • Data-driven assessment
  • Linguistics, especially applied linguistics (TESOL, etc.)
  • Interdisciplinary work which uses discourse analysis and similar methods (Cheryl Geisler, Ellen Barton)
  • Writing research which relies on educational psychology (Steve Graham & his collaborators)
  • European writing research (Journal of Writing Research)

...and more types are emerging.

  • Jamieson, Howard, & Serviss, “Citation Project”
  • Licastro & Miller collection Composition as Big Data
  • Omizo & Hart-Davidson’s use of machine learning in Hedge-O-Matic (see the sketch after this list)
  • Journal of Writing Analytics (ed. Moxley & Elliot)
  • Software like Eli Review & MyReviewers, which provide a service and generate data as well
  • Interdisciplinary work in digital humanities
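
To make the Hedge-O-Matic item above concrete, here is a minimal, rule-based sketch of hedge flagging. It is not Omizo & Hart-Davidson’s implementation (their tool uses machine learning, not a fixed word list); the word list and function name are illustrative assumptions only.

    # Illustrative only: a tiny rule-based hedge flagger, not the Hedge-O-Matic.
    import re

    # Small, assumed list of common hedging expressions (not exhaustive).
    HEDGES = {"may", "might", "could", "possibly", "perhaps", "suggests",
              "appears", "seems", "likely", "somewhat"}

    def flag_hedged_sentences(text):
        """Return each sentence paired with the hedging words it contains."""
        sentences = re.split(r"(?<=[.!?])\s+", text.strip())
        flagged = []
        for sentence in sentences:
            words = set(re.findall(r"[a-z']+", sentence.lower()))
            hits = sorted(words & HEDGES)
            if hits:
                flagged.append((sentence, hits))
        return flagged

    sample = "The data suggests a trend. Results may vary. We proved the theorem."
    for sentence, hits in flag_hedged_sentences(sample):
        print(hits, "->", sentence)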

Corpus linguistics + rhetoric & composition

CCCC 2017 & AAAL 2017 were back to back in Portland in March... yet both organizations all but ignored each other.

Even so, corpus methods are gaining attention: scholars like Zak Lancaster, Laura Aull, and Dylan Dryer.

Building on this work, the Crow project is exploring ways the two fields can work together.
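
As a rough illustration of what corpus methods offer here (not Crow’s actual pipeline; the word lists and example texts are assumptions), the sketch below compares normalized frequencies of a few stance markers across a small set of texts.

    # Illustrative corpus sketch: frequency per 1,000 words of a few stance
    # markers across texts. Word lists and texts are assumptions, not Crow code.
    import re
    from collections import Counter

    BOOSTERS = {"clearly", "obviously", "certainly", "definitely"}
    HEDGES = {"perhaps", "possibly", "may", "might", "somewhat"}

    def per_thousand(text, wordlist):
        """Count occurrences of wordlist items per 1,000 running words."""
        tokens = re.findall(r"[a-z']+", text.lower())
        if not tokens:
            return 0.0
        counts = Counter(tokens)
        return 1000 * sum(counts[w] for w in wordlist) / len(tokens)

    corpus = {
        "first_year_draft": "This clearly shows the policy may fail.",
        "upper_level_draft": "Perhaps the results suggest a somewhat weaker effect.",
    }
    for name, text in corpus.items():
        print(name,
              "boosters:", round(per_thousand(text, BOOSTERS), 1),
              "hedges:", round(per_thousand(text, HEDGES), 1))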

Yet another corollary to White’s Law

Writing research is more and more likely to be shaped by quantitative approaches — with or without the engagement of writing program administrators — so let us participate in that research, that shaping, that engagement.

We can imagine more engagement with quantitative methods in terms of research questions.

Research questions #1 and #2

How can WPAs become more aware of, find more value in, become more proficient using, and more effectively help others develop data-driven, quantitative methods for writing research?

How can WPAs effectively collaborate to achieve these goals?

Research questions #3 and #4

How can WPAs ensure data-driven, quantitative writing research is both ethically and procedurally sound?

How can WPAs constructively shape interchanges between rhetoric and composition and other fields conducting quantitative writing research?

Two methods

  • Survey of researchers in rhetoric & composition who are conducting writing research with quantitative, data-driven methods.
  • Formation of research coordination networks to advance this work.

Survey: INQWIRE

Inventory of nerdy, quanty writing researchers

Key purpose: build a list of WPAs, others in rhetoric & composition (broadly speaking), and those in other fields whose interests overlap with rhetoric & composition

Census model: not an ongoing update, but a snapshot at a given time

Main elements of INQWIRE

  • Inventory of interested researchers and research projects
  • Terms which shape conversation
  • Disciplinary influences (both native and external to typical WPA work)
  • List of publications and presentations
  • Brief research methods “literacy narratives”

INQWIRE methods & distribution

Typical distribution (email lists, social media) supplemented with direct invitations for targeted researchers — especially those outside rhetoric & composition

Respondents will be able to choose whether their identifying information is published separately from the results (opt-out)

INQWIRE publication

Summary and analysis of results in a journal previously mentioned or a similar venue (Enculturation, Composition Forum)

Two key parts:

  • Survey results in the aggregate
  • Inventory of researchers

If possible, publish open data as well (pending opt-out results)
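
If open data is released, the opt-out choices have to be honored first. Below is a minimal sketch of that step; the column names and file names are assumptions for illustration, not INQWIRE’s actual instrument or pipeline.

    # Illustrative sketch: redact identifying fields for respondents who opted
    # out before publishing open survey data. Field names are assumed.
    import csv

    IDENTIFYING_FIELDS = {"name", "email", "institution"}

    def prepare_open_data(rows):
        """Blank out identifying fields for every respondent who opted out."""
        cleaned = []
        for row in rows:
            row = dict(row)
            if row.get("opted_out", "").strip().lower() == "yes":
                for field in IDENTIFYING_FIELDS:
                    if field in row:
                        row[field] = ""  # redact, keeping columns aligned
            cleaned.append(row)
        return cleaned

    with open("inqwire_responses.csv", newline="") as f:   # assumed input file
        rows = prepare_open_data(csv.DictReader(f))

    with open("inqwire_open_data.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)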

Exploring research coordination networks

RCNs as defined by NSF (1/2)

“The goal of the RCN program is to advance a field or create new directions in research or education by supporting groups of investigators to communicate and coordinate their research, training and educational activities across disciplinary, organizational, geographic and international boundaries. RCN provides opportunities to foster new collaborations, including international partnerships, and address interdisciplinary topics. Innovative ideas for implementing novel networking strategies, collaborative technologies, and development of community standards for data and meta-data are especially encouraged….”

RCNs as defined by NSF (2/2)

“...RCN awards are not meant to support existing networks; nor are they meant to support the activities of established collaborations. RCN awards do not support primary research. RCN supports the means by which investigators can share information and ideas, coordinate ongoing or planned research activities, foster synthesis and new collaborations, develop community standards, and in other ways advance science and education through communication and sharing of ideas.”

Example: Interface at Purdue

RCN for nerdy, quanty research?

INQWIRE survey will help explore feasibility, sustainability of formal research coordination network

Fishman & Dilger, CCCC 2015: RCN for empirical research? Considerable interest, but that focus is too broad

Use NSF guidelines to consider feasibility

  • Topic & focus of research coordination
  • Principal investigator
  • Steering committee
  • Network participants
  • Coordination & management mechanism
  • Information & material sharing
  • International participation
  • Data management plan
  • Postdoctoral researcher mentoring plan

Challenges in developing RCN

  • Difficulty of interdisciplinary work
  • Focus on the United States in WPA research (and rhetoric & composition as a whole), when much quantitative writing research happens outside the USA
  • Limitations, structure of our professional organizations
    • Mission, scope of NCTE
    • Splintering in tech comm (ABC, ATTW, CPTSC, IEEE-PCS, SIGDOC, STC)
    • Limited funding to support research

Next steps

  • Apply the NSF guidelines previously mentioned
  • Consider disciplinary overlaps, limitations, structures:
    • What approaches to interdisciplinary collaboration are the best fit?
    • How can researchers with different goals and motivations work together?
  • Consider sustainability, time frame for engagement — likely six to ten year project

Thank you

Bradley Dilger — dilger@purdue.edu
dtext.org – 309-259-0328
