Education and exams

It’s a busy year for my two sprogs, one in Year 11 and the other in Year 12. It seems as if one just finishes one lot of exams and then the other one’s in the middle of another set, and then there’s all the HSC assessment tasks to keep on track. It’s all a bit stressful, but we’re generally managing.

Still, a question raised by one examination experience reported to me today:

When one particular question on an exam paper is literally incomprehensible for the whole class when they compare notes afterwards – not just one or two students, but every single one of them agreed that they had no idea what the jargon used in the question even meant – is it the students’ problem that nobody understands the question, or is it the teacher’s problem that nobody understood the question?

For what it’s worth, I also studied this particular subject for my HSC, and I found the particular jargon quoted entirely incomprehensible as well. No doubt the theories and related jargon surrounding this subject have changed somewhat in the intervening decades, but still.

Interestingly, the rest of the questions on the paper were “easier than expected” for my particular sprog, who is one of the students who regularly gets some of the highest marks for this subject, and most of sprog’s peers agreed that the exam was not too difficult overall. There was apparently not quite enough time for about half the students to completely finish the final essay, but otherwise not too bad. The incomprehensible question seems to have been very much an outlier.

Categories: crisis, education, parenting

20 replies

  1. The teacher’s problem for sure. If teachers want students to present clear and concise prose, they’d better be setting an example.

  2. Yep, definitely the teacher’s problem.
    This is the kind of thing that used to make me and my family boil 🙂
    Given the time of year, I’m assuming this is not the actual HSC – I wonder whether the paper was written entirely by the teacher (which makes one grimace even more) or whether the teacher was grabbing questions from past/sample HSC papers…

  3. Every single university lecturer I ever had firmly believed the latter. Even the… ones I didn’t like. A handful of students failing = students’ problem; almost every student failing = teacher’s problem.
    Of course, there were reasonable limits to this, such as if roughly half of the students in a second year calculus class failed a question on basic arithmetic, the problem might just lie somewhere else entirely. Although that doesn’t make it any less the teacher’s problem…

  4. I’m reminded of Dorothea Salo’s grad school experience:

    Then, that afternoon, I got to the Medieval Literature portion of the exam, which I was hoping to High Pass. What I found in the opening section, which consisted of identification of terms and works, shot my confidence to hell in an instant…Going back to my notes from the survey course later, I could not find even one of the terms I had failed to define.
    … I spent that day writing a letter protesting the selection of terms for that exam, and doing my best to justify my protest. … I did not feel comfortable waiting until the results of the exam were back; assuming I did not pass (and I believed I hadn’t), complaining after the fact would just look like ordinary sour grapes…
    … Most of the terms I couldn’t manage were found in one reference book that was indeed part of the exam reading list, although it still seems strange to me that we were apparently required to memorize the entire book (how were we to judge which of the many terms it mentioned were important, and which were not? would this not be the purview of a survey course, the one of which I took did not mention any of these terms?). One term I questioned, however, they admitted they could not find, so everyone taking the MA exam got one identification written off.

    Sprog’s exams aren’t as high stakes as one-shot PhD qualifiers, but it seems like a fair enough test, broadly: can the teacher find that jargon used in that way in the textbooks? That’s not definitive proof it’s a fair question, but it gets a dialog going.
    I’ve had a few ambiguous or contradictory or unanswerable university exam questions: generally speaking if the question can’t be corrected early in the exam it gets written off. (The person who set the exam is usually required to be on campus and near their telephone, and if the question needs a last minute revision they will actually come into the exam room, read the revised question, and write it up on the board. Less likely to work with cross-institutional exams like the HSC itself.)

  5. Teacher.
    This is more unambiguous with some subjects than others. In my very first uni maths exam, one question used a term I’d never seen before. When the first few students complained, the lecturer scoffed at them for not knowing such a simple definition, but as more students raised their hands he eventually had to admit he’d never taught it to us, and gave the definition.

  6. I have just re-entered the workforce after being a “home manager” for the past 2 years.
    I find the Corporate Speak unintelligible. It certainly is not English. Hardly a complete sentence, and every PowerPoint presentation reads like a vision statement.
    Is this the teachers’ fault?

    • @Bryan, your point seems incoherent. Can you clarify in some way that is on-topic for this post which is categorised under “education” and “parenting”?

  7. Don’t omit the possibility of the problem coming from one level further up: how were the teachers briefed? I’ve always wondered how teachers are confident about what level of theory and theoretical language to work with; you could settle at a number of different levels that make a huge difference to the pitch of an essay.

  8. Seconding Orlando. Let’s not be so quick to leap on the teachers.
    I used to teach English in Louisiana, and while I’m sure there are some significant differences between US and Aus. education policies (just as there are significant differences between Louisiana and, say, Pennsylvania where I live now), I’m offering up my experience as an example of educational clustermess.
    *On my second read-through, I realized I wasn’t clear whether the exam in question was a regional standardized test or a test developed specifically for a course; my response deals mostly with the former.
    One of the classes I was assigned to teach was LEAP Prep, a course created to train 10th graders for their big standardized test that year. I say “created” not “designed,” because I was given absolutely nothing to work with, not even recommendations for materials, let alone the materials themselves. So I created it.
    To do so, I had to buy test prep books. Wherever there is a standardized test, there pops up a symbiotic industry for materials that claim to help students “beat the test.” (I needed them just to even know what the test looked like, as I had never seen it.) With such materials available, standardized tests often change language or format to become less “beatable;” the prep materials then change to catch up; on and on ad infinitum.
    My students could not even purchase textbooks for their regular English class, so I purchased the books, paid for copies to be made for handouts, pored over them in order to design lesson plans, and created mock-up tests and debriefings to help demystify this experience for them – all while being exhorted by my superiors not to “teach to the test,” a major faux pas under our No Child Left Behind act.
    Now, obviously that story is extreme in terms of how underresourced my school was. But I hope it at least complicates your question. It is not as simple as “who’s at fault: teacher or student?” Because in the case of standardized tests, someone mediates the materials that teachers receive concerning the test; someone higher than the teacher is dictating how much time will be spent preparing for the test, and how much about the test teachers know in advance. (Or, no one mediates it, as in my case, and the teacher must pour his/her own resources into scouting it out.) Then, the testmakers themselves are writing the test in response to changing concepts of what students are supposed to know, and in response to the prep materials available. Even in the case of non-standardized tests – that is, tests developed specifically for a course rather than a region-wide assessment – the teacher is usually expected to develop these in conjunction with regional benchmarks and school-wide policies… which are not always clear and concise or readily available themselves.
    It’s frustrating to be a student and feel tricked by a question that could not be prepared for. But let’s bear in mind that teachers are expected to pour immense amounts of effort (and sometimes personal resources) into preparing students for the ever-changing fads of educational assessment without sacrificing any of the more comprehensive and critical thinking skills that can’t be captured by exams. And then let’s direct some ire appropriately toward the systems that produce these conditions.

  9. At high school I’d say it’s most likely the teacher’s fault (but hey, everyone’s human!). At university, when they really want to stretch the students, sometimes you do have questions which may not be strictly within exactly what was taught, and they don’t expect you to be able to do all of them. But in the latter case this can be good for separating the average from the excellent.
    Again sometimes even university lecturers get it wrong. I know one case where even with scaling they had to get most of the students to resit another exam because too many people got 0 – and you can’t scale 0. It was pretty common in the uni exams I had to come out of an exam having done less than half the questions and still end up with a distinction or better.

  10. I can’t tell from your post tigtog, whether the question was part of an HSC trial (or externally set exam), or appeared on an internal school based examination.
    If the former, then HSC trial/exam questions would be determined by a board of examiners for the subject, which can mean a question may have had a number of authors and should go through a number of reviews before being finalised. I’ve found this can lead to a degree of opaqueness (to put it mildly!) when attempting to analyse certain types of exam questions. As there are multiple authors, the ‘style’ of questions across the examination can vary widely, as different pedagogical styles are pitted against each other.
    If this was an internal school-set examination, then it is likely that the teacher is at fault. However, my query is: as none of the other questions on the exam were incomprehensible, is it possible that the teacher drew upon other resources (i.e. questions taken from external/standardised/practice tests, or input from other teachers within the subject area, or input from the school head of discipline) for this particular question? Alternatively, it could be that the teacher wanted to expose the students to jargon-loaded questions to find out how they coped (as these types of questions might feature in future externally set exams).
    Just my two cents!

  11. Elisabeth has a good point about HSC trial exams sometimes including atypical or maybe even technically faulty questions as a just-in-case. I remember my English trial (what was then called 2U Related and now appears to be English (Advanced)) had a question where we were asked to technically analyse a piece of very purple prose: the trick being that we all assumed that because it was on the trial it must be a good piece of writing, and the lesson being that no, in fact, it’s very possible to dislike the piece set for analysis and pretending you admire it is rather transparent to the markers.
    That particular lesson might have been better taught in-class but we all came into Year 11/12 class with such weak preparation in technical analysis of writing techniques that there was only so much that could be squeezed into the teaching hours available.

    • Hi, it was an internal school exam for Year 11, so not likely to have been set by external examiners. I like the idea that it could have been a deliberately opaque question just to see how they coped with that. I could get behind that form of pedagogy.

  12. I always argued back when questions were confused or confusing in their wording (which led to a snitty conservative Chemistry teacher asking me if I was ‘taking assertiveness lessons, or does it just come naturally?’). So long as some proportion of the class also misconstrued it, and answered it based on that misconstruction, the marks were usually reallocated. But in practice exams, I think it’s pretty standard to have a question designed to throw students, in whatever way, so that if/when they encounter a question that similarly throws them in the final, it… does so less? Meaning, has less effect on their capacity to answer the other questions? But maybe that sense arises from a youthful self with way more faith in the pedagogical awareness of my teachers than is justified (though that’d be pretty uncharacteristic of me as a student, I reckon! ;-))

  13. It could have been a favourite question of one of the teachers ‘give em this and see how they cope with it’, it could have been a test to see what type of skills they had and if they needed skilling up in one area, it could have been a late night question of the ‘oh god I just need one more’ kind.
    If possible can you ask the teacher and let us know? I’m all curious now. kthx

    • Sprog and classmates plan to have ‘WTF was that?’ as their first question of this teacher when classes resume after the rest of the exams are done. I will definitely let you all know about the response.

  14. Of course none of these theories are of much assistance at the moment a student is confronted with an incomprehensible question in a pressurised exam situation!

    And like Mindy, I would be interested to know what the teacher says after being given the student feedback.

  15. BTW I should note I am a long time (if sporadic) Hoyden lurker and I love the variety of posts, topics, ideas and conversations your blog provides. Cheers.

  16. I have a student arguing with me about a question at present. There is precisely 1 point at stake. On the one hand, it’s pretty annoying, but on the other, well, I admire the student’s gumption. (FTR, I think the student is actually not correct, and that zie is reading things into the question in order to get the result zie wants, but hir analysis is nevertheless insightful. I’d like to give hir the point on those grounds, but I am not the only teacher on this course.)

  17. In senior high school, exams are often purchased from external companies for the trials and half-yearlies, especially for subjects where complex stimulus material is used as part of the exam (e.g. maps in geography exams, diagrams and illustrations in science and maths papers). Even though it’s an internal exam, external papers are commonly used. It saves teachers the time of setting these complex HSC-style papers, and removes the bias of having one class’s teacher set the exam where there are multiple teachers teaching the same subject in larger schools. It is thought to better prepare the students for the externally prepared HSC exams. These papers don’t always have appropriate questions, and aren’t always given to the schools electronically so that questions can be changed, even if stimulus material can’t be. Sometimes teachers do not bother to change them and administer them as is, the thought being that it will better prepare students for the HSC experience.
