stray thought about preparing scholars
Apr. 18th, 2010 04:07 pm
I know a fair number of people who found graduate school, shall we say, not that satisfying. This may not surprise you if you've ever known any graduate students, but it probably should. Grad programs filter their entry pool pretty heavily on traits like academic achievement and interest; among the set that makes it in, you'd think hating school should be a fairly rare occurrence. What's going on here?
The canonical answer is that the unhappy ones are doing something wrong. The culture of higher education places the burden for success squarely on students, especially at the graduate level: no one can do the work of learning, or of career planning, for you. And there's some truth to that, for sure. However: graduate stipends are small, compared to the salaries of entry-level jobs that students would likely qualify for, and the justification is that tuition is part of compensation. When mentorship is weak or lacking, when professors' failure to read and comment on submitted work renders its completion meaningless, when standards for success are so ill-formed that decisions seem arbitrary -- those things, in a sense, constitute a reduction in pay.
So I started wondering the other day: why do we treat graduate school as school in the first place? Instead of pretending that learning to be a scholar is anything like learning to be a lawyer or a surgeon, why not move to a model more like other jobs -- where people are paid entry-level salaries for a few years while they learn enough to be hired later as independent workers (aka postdocs, instructors) and managers (professors)? I am not sure that it would have to cost more; compensation that currently goes back into the Graduate School could go instead toward salary for TAs and RAs, which, given professors' frank acknowledgement that graduate coursework is a waste of time, seems entirely appropriate to me.
My hunch is that this model would take some pressure off the mentor-mentee relationship, which is often fraught with expectations that go unmet. Rather than trying to turn everyone into Supermentor, it seems more sensible to adopt a structure that acknowledges reality -- your professor is just another boss -- and encourages scientists to take responsibility for their careers by paying them and treating them as young professionals instead of as students.
(no subject)
Date: 2010-04-18 10:16 pm (UTC)
It isn't quite the same, as they still get stuck with ass-low salaries, but they can push for better treatment.
---
I'm not convinced graduate school for scholarship is that different from, say, earning a JD. This may only be true for the first years -- at least in the sciences and engineering fields, there is a lot of relevant coursework left before one can reasonably begin research. Were I to go back, beyond any refreshers, I'd need a whole lot of exposure to new concepts (mathematical basis of CS practices, field theory, more advanced statistics, network theory, etc.) before I'd be ready to work on a thesis.
If I were to go into mathematics, rather than CS, I'd need a huge number of classes to cover what is considered basic for a PhD-level candidate.
For
---
The big break, as I see it, isn't that graduate students are professionals and should be treated as such (which is, btw, quite possibly true), but that the class structure and format at the graduate level is not only completely unlike that of undergraduate education, but also varies widely from one school to another.
At UI, my undergraduate level classes were all of an instructor-student format. While some things were addressed by the class (solved together on the board, or whatever), there was always a divide: the instructor was in charge, and the students were there to learn. This remained true of both the large/lecture classes and the small/discussion classes I was in.
In the graduate level classes, the classes were almost all collective. While the instructor was considered an expert (except in my compiler class, which no one grokked), even lectures were presented to equals; feedback, challenges, and critiques were encouraged, and there were no sacred cows. This was even more prominent in the upper level graduate classes I took[1], compared to the lower level ones.
As for
Being prepared for graduate work at the former did not necessarily leave her in a position to correctly anticipate the nature of the latter.
----
[1] Including the one I took my freshman year. Oops. The class had no prerequisites listed and I liked graph theory, so I took it.
(no subject)
Date: 2010-04-18 10:55 pm (UTC)
I think the bad-mentor-as-reduction-in-pay idea is interesting.
That said...I'm not sure "your professor is just a boss" helps things. I mean, maybe the mentor thing comes with personal entanglement that the boss thing doesn't, but people who aren't good mentors will not suddenly be good managers. And people who work for bad managers are generally miserable, regardless of sector. I don't think the need for management, hence managerial ability, necessarily decreases.
(no subject)
Date: 2010-04-19 01:29 am (UTC)
It sounds sort of similar to those horrible classes at our beloved alma mater where the professor clearly had no interest in teaching a bunch of undergraduates at all. This was obviously because the professor wasn't being paid to teach; they were being paid to do research (or, more accurately, to secure grant money), so for the most part couldn't give a flying fuck about teaching anyone other than their own grad students (if them). Since reviews were file 13ed (or at least not given any weight), they never taught. Of course, this overlooks that having unhappy undergrads undermines the department in a host of other ways, but it's the way it happened. It was the rare professor indeed who actually taught -- why do you think the undergrads continually nominated the same person in each department for various teaching awards? That person was the only one who tried, so won by default.
So let's look at the mentor/mentee relationship. My understanding is that the mentor more or less had to agree to having a mentee or they wouldn't get one. Is that correct, or are professors 'urged' to have at least one mentee each year or does this vary by school? I think pretty clearly that if they are forced to have one, the same scenario holds.
To return to the original question though, we use that as our model because we always have, for whatever reason, and it works well enough that nobody has sought to change it. After all, who cares if a certain percentage of the grad students in a given field fail to achieve a degree? It just reduces the competition for a finite number of positions and a finite pool of grant money. And if the professor is a terrible mentor, so what? The school doesn't care too much, and if the professor happens to be tenured then they can't do a lot about it anyway.
(no subject)
Date: 2010-04-19 09:00 pm (UTC)
Professors appear to have a steady stream of highly skilled workers willing to put in long hours at subsistence wages, and should any of them dislike it...well, there are plenty more applicants willing to fill the slot. It's not even like there's an expectation of having to be Supermentor, it's more like an expectation of having indentured servants. Why would a professor want to encourage the treatment of grad students as young professionals? It'd just make them uppity and less willing to put up with the crap.
incentives and names
(no subject)
Date: 2020-06-25 05:24 pm (UTC)
(Here via