Thursday, February 21, 2013

2.19 Surveys

We started class today with a discussion of Michaela Cullington's "Texting and Writing." Our talk focused on her research questions and findings, on how her essay was "built," and on a critical examination of how Cullington's assumptions and methods set her up to find what she found.

Focus and organization: Cullington's research question was about how or whether texting was influencing the way high school students write, with a particular focus on the influence of "slang" and the use of acronyms. We noticed that she set up her question in her introduction and that, as part of her setup, she introduced the fact that there were two "sides" on the question => that texting is making student writing "worse" and that it is making it "better." This short overview of her study concluded with her findings, which were that texting is not really having much of an effect on how students write.


After her brief introduction, Cullington presented a literature review with sections describing "concerns" about texting and a "reply to concerns."

In the concerns section, she cited researchers and teachers as concerned about grammar, the ability to write "standard English," the use of short, "sloppy," or undeveloped sentences, and improper word choice.

In the reply-to-concerns section, researchers pointed out how texting motivates students to do more writing, invites more creativity (it is, in fact, a new language), teaches writers to write for different audiences, sharpens diplomatic skills, and demands that writers summarize and express themselves concisely.

The literature review was set off by headings, as were the Methods and Findings sections. In the Methods section, we noted that Cullington's overall approach was to:

  • survey students on four points: how often they text, how long they have been texting, what abbreviations they use, and whether they notice themselves using textspeak in their formal writing.
  • talk to teachers (see essay for questions)
  • look at student writing for influences of textspeak
In the Findings section, Cullington discussed the fact that students know the difference between writing for school and texting - and do not use textspeak in their formal writing - and that she found no instances of textspeak in student papers. Her closing statement is that "ultimately, experts and the students themselves see no influences..."

Critique
We noticed that when Cullington reviewed the literature (what other researchers have written) about texting and writing, there were two different definitions of "writing" implied by the discussions of concerns and replies to concerns. The "concerns" researchers were primarily defining writing in terms of formal features => grammar, "standard English," and word choice; they also saw language as "transparent" => as if meanings were clearly there and a reader could "see" meaning in writing as clearly as if looking through a window.

The reply-to-concerns researchers saw writing more in terms of the definition of language provided by Gee => writing as meaning making (where meanings are not "there" but are created through interpretations by both writers and readers, and those interpretations are influenced by past experiences). This definition of writing was more about writing as "saying, doing, and being" than about writing as "standard English." Cullington did not point out which definition of "writing" she would use in her essay => though she applied the "concerns" ("standard English") definition.

This failure to see writing as language (in the ways Gee defined it) set up the kinds of questions Cullington asked. Because she saw writing as about form, and meanings as "transparent," she asked mostly about writing forms - abbreviations, etc. - rather than about the kinds of writing practices identified by the "reply to concerns" researchers.

Differences between surveys and interviews:
We used our discussion of Cullington to talk about differences between surveys and interviews. Surveys ask for short, clear replies to clearly defined questions. They assume that meanings are (or can be) clear, and that the choices offered by the survey question will cover everything important that the research participant has to say.

Interviews are more open. They assume that language is complex and interactive, and that the research subject may have multiple, conflicting things to say in response to the researcher's questions. Interviews also allow space for research subjects to say things the interviewer has not anticipated. In surveys, this is not always possible.

Creating an effective research survey:
In our review of Cullington's survey, we noticed that her conclusions reflected the kinds of questions she asked. If she had asked questions that grew out of an assumption that writing was about "saying, doing, and being" rather than about "standard English" => she might have been presented with very different kinds of information. When I asked you how you used "texting" (and your phones) to write => you pointed out MANY ways that "texting" contributed to your writing that were simply not possible for Cullington to find out through the questions she asked.

With this in mind, I asked you to look at the Student Learning Outcomes for the Writing Option program and to consider the survey our program uses to evaluate how well we teach to those outcomes.

Audience and purpose: The survey I handed out in class is given to students when they enter the Writing Option Major, and to students who have completed the Writing Option Major. It was designed to assess (evaluate) what students know, what they do, and how they feel with respect to the 5 learning outcomes for the Writing Option Major (see the last blog post).

Groups:
Allyson, Deanna, Yoleiny
Sharyn, Chris, Alison
Paul, Devon, Brianna, Amy
Mike, Oriana, Rikki, (and Sarah if you are able to get in touch)

For Thursday:
Blog 9: With your group (one post per group) do the following:
1. Identify which questions assess which learning outcomes (some questions may apply to more than one learning outcome). For this task, you should list each of the 19 questions under one or more of the 5 learning outcomes.
2. Evaluate the effectiveness of each question in terms of what it allows students to communicate about the knowledge, feelings, and practices they might have with respect to each learning outcome. For this task, you should produce some comments that account for WHAT information each question provides with respect to the learning outcome it is listed under, and some observations about how well it provides that information.
3. Note (make a list of) any information about the 5 learning outcomes that the Department might need but that will NOT be gathered through these questions.

Additional questions:
What definition(s) of "writing" and "learning" are assumed by this survey?
Are those definitions a good match for the definitions assumed by the student learning outcomes? Are they a good match for the way the students taking the survey will define writing and learning?
How might you change this survey so that it could provide a more accurate reflection of what students learn in this program? List your suggestions.

As we discussed in class, Dr. Sutton and I cannot assume that our students (you) use language in the same ways we do or draw on the same assumptions. Your input on this exercise will help us develop better tools (surveys) to assess our program. Thank you!
