Tuesday, February 19, 2019

Hot dogs are not sandwiches

I recently saw this debate brought up again and realized that I haven't ever truly weighed in on it. So let's set the record straight once and for all.

I'm too lazy to look up the kind of detailed description of the debate that I imagine I'd find if I did a quick internet search, so do that yourself. Or trust my flawed recollection of it.

The initial question "Is a hot dog a sandwich?" is used as a kind of prompt to encourage exercising critical thinking skills. There are a variety of interesting responses to the prompt and one ultimate, objectively correct answer, which I've never actually witnessed another human being provide. We'll come to that. But to set the stage, I need to recount the typical responses...
  1.  One party might argue that a hot dog is not a sandwich because a sandwich consists of contents (such as meat) between two pieces of bread. A hot dog bun is a single piece of bread.
  2. Another party might argue that a hot dog is a sandwich because a sandwich consists of contents (such as meat) and bread to hold the contents in some way. Other items that are, by widespread consensus, known as sandwiches, use configurations other than the traditional two slices of bread with contents between them. A hot dog meets the same criteria as those sandwiches.
From there, the whole thing continues in the manner of a debate, with both sides trying to come up with examples to use as supporting evidence for their own classification schemes. Typically, the affirmative side in the debate will cite open-faced sandwiches and submarine sandwiches, trying to invoke some nonexistent classification scheme reminiscent of established taxonomic classification systems. And the negative side in the debate will attempt to extend the logic of the affirmative side to absurdity with more extreme examples.

The average person knows intuitively that the affirmative side in the debate is wrong, so the affirmative side generally clings to a rigid framework of some hypothetical sandwich classification schematic. They stick to a point along the lines of "These other objects are all sandwiches and they have these properties in common, so another object with those properties (meat and other contents held inside bread) is, by definition, also a sandwich." Meanwhile, the negative side tends toward pithy ace-in-the-hole counterpoints. The most popular of these seem to be...
  1. Defining a hot dog as the sausage itself and insisting that no matter what the sausage + bun combination is classified as, the hot dog has no bread and is not a sandwich.
  2. Asserting that language exists to facilitate communication and that because most people do not think of hot dogs as sandwiches, it is improper to classify a hot dog as a sandwich.
It doesn't generally come up, but those objections are mutually incompatible. If we are to regard general usage of language as our standard, then the phrase "hot dog" is clearly used to apply to the sausage + bun combo in common parlance, and so the gotcha technicality of defining "hot dog" as sausage alone isn't tenable. If we do embrace a gotcha technicality, then we are accepting that there should be some rigorous classification scheme for this, and the inconsistencies of common parlance are irrelevant.

The problem is that the negative side in the debate, the "not a sandwich" camp, is right, but seems unable to articulate the real reason that a hot dog is not a sandwich. So I, in all my splendorous wisdom, am here to tell you the right reasons that the "hot dog is a sandwich" camp is wrong.

There's a hint. Earlier in this post I noted that the argument for the affirmative side hinges on invoking a taxonomic classification scheme. They want to rigorously define different categories of food. Although they do not normally need to go so far as to construct diagrams, they're clearly drawing inspiration from existing systems of classification used professionally by experts in other fields, such as engineering and biology. They create, or refer to the hypothetical creation of, a systematic approach to defining foods in different categories, as though the foods were machine parts or flowering plants. Then the word "sandwich" is matched to objects already established to have the name, and the list of properties associated with the label is deduced from the features those objects have in common. Those arguing for the affirmative are acting as detectives trying to map out the logic to determine what is and is not a sandwich. The huge, gaping flaw in all of this is that there's already an established professional classification scheme for food, and it is based on food preparation. The names for foods are assigned by their creators, by chefs. We know where the names come from. They are not assigned post hoc by investigators endeavoring to establish a classification scheme. They come from chefs.

The origins of the term "hot dog" are unclear and it is used inconsistently, sometimes applied to the sausage alone and other times applied to the sausage + bun combination. The former is older, which might matter if that were the topic of the debate. But the relevant part is that hot dogs have been around for well over a century and that they were not marketed as or thought of as sandwiches because the history of their creation is different. In a meat sandwich, the emphasis is on placing the meat and other ingredients on or into the bread. Keep in mind that the term is a culinary term, so we're concerned with food preparation here. In evaluating a culinary term, it makes no sense to step outside of the culinary realm and act like we're space aliens observing the object with no prior information. The term is a culinary term, so the preparation of the object as food is the impetus for the term used to describe the object. And while lots of sandwiches have their own steps involved, the unifying element is the placement of contents on or into bread. A hot dog isn't like that. It is created in a factory or sausage shop, then packed and shipped to a food vendor. It is then unpacked, cooked, and added to a bun. Then toppings and condiments, aka "fixings," are added and the combined product is sold and consumed as a "hot dog." It is definitely true that applying the same label to the initial sausage and to the finished product is muddled and is not the way we'd want to do things if we were establishing a taxonomic system from scratch. But that's how it goes! The term demonstrably gets applied, widely, to both. That's the culinary usage. The two culinary traditions, and their emphases, are distinct. To dismiss this is folly.

Q.E.D.

Tuesday, February 12, 2019

Schloss Grünfluss and Seattle Snowmageddon 2019

Obligatory mention that I don't post here nearly enough. I know, I know. I say it all the time and I mean it every time, but I continue to transgress. It's been months. I'll totally turn this ship around. I can do better. I will do better. What's so difficult about checking in every once in a while and writing a blog entry or something? Nothing. Nothing difficult about it at all. I'll do it.

Posting here has the effect of taking me back to the LJ days. Mentally, not literally, of course. But you knew that. Anyway, some minor details in the presentation, combined with prompting from friends and the whole vast community weirdness of it, impelled me toward journal posts of a personal nature. I mean, I didn't share stuff that I didn't want made public! I'm not that stupid. I knew I was, by definition, making that information public. I don't think I was foolish to share the things I did, but there's a marked difference anyway. Well, I note the difference, but I never really pinned down the reason. Still can't.

Maybe it's just that I'm older. More grown up. Time has passed, of course, but I don't buy it. Could it have been the interaction with friends? Seems like a more plausible explanation. I was not and am not consciously worried about some sort of unforeseen social consequences for my public sharing of personal details and meandering introspection. In fact, I remember, years ago, coming across this strip...

Dreams

It resonated with me then, and it still does today. I haven't been holding back because I worry what others might think. Instead, I've been holding back due to some inexplicable torpor. I didn't stop sharing out of fear or maturity or because I decided it was wise. I stopped sharing out of habit. I stopped sharing because it felt easier, because it felt like journaling was becoming a chore.

Even in the LJ days it wasn't all personal. I used that site as a sounding board for everything I wanted to say. I shared links, I posted essays of a kind, I created strange and wonderful communities, and I created a kind of narrative that bound myself and others up into something I'd not initially expected. Oh, it was probably trite or banal more than I'm remembering now, but I did produce some real content and managed to do so on a surprisingly regular basis. I did all that when the whole plan at the start was just to have a "journal." I wrote about what was on my mind, about what was going on in my life, and anyone could see it. My dad even found my LJ one time, and I remember that well. Now I have a blog no one reads. This started out as something more serious, something I was going to use to devote more time to longform content. More editorial, although not truly professional. But I think it's been forgotten by everyone except me. I could say anything and probably no one else would see it. On LJ I refused to censor myself even though people probably were going to see it (and did). Here on Blogger, I changed to a platform that was effectively private, but I couldn't be bothered to talk about things. The irony isn't lost on me.

I know I've brought it all up before. It's important to me, though. When I've reviewed my old content, it's restored memories that were seemingly lost, and it's left me wondering about the gaps. But this place, this half-assed effort, is too sparse. I earnestly want to change that. Go back through my posts for the past several years, though, and you'll see that I've expressed that sentiment before.

My last post on LJ, titled "#600," was my goodbye of sorts. Reading it again now, it's surreal. So much that's important is still the same. But strikingly, so much has changed. And I filled in some of the gaps using this blog, but I forget how and when. I worry about the details that weren't filled in. How much did I talk about my graduation from Green River? How much did I talk about my trip to Europe? They both happened later that same year! And then I moved to Seattle (temporarily) and went to the University of Washington and graduated there too and floundered in unemployment again for a time and then got a job again and, well, on and on it went. Life happened. Time progressed. So much to talk about. I think, in a way, I became paralyzed by how far behind I was. Such a stupid excuse for my lack of activity here, but it rings true. I had some ideal in my head for all the things I'd say when it came to my experiences at the University of Washington, but I didn't take the time to post about it here. It dragged on and I know I thought I'd make up for it some day with some giant retrospective post. But that became too daunting and inertia took care of the rest. So much time passed I even went and got a job there. Huh. I feel like I probably mentioned the job at some point, but did I?

I can't really undo the lost years. I can't repair this. Not fully. But let's try anyway. And instead of trying to play catch-up, let's start here. Let's start where we are.

I bought a house. There's more to it. A whole lot of things happened that led up to this. But we'll catch up on those matters or we won't. The important part here, the focus, is that I bought a house. I have moved to Auburn. That's where the house I bought is, so it seemed sensible to move there. I am calling my house "Schloss Grünfluss." It's my house, so I get to call it whatever I want. I don't mind telling you that it's actually kind of between the Green River and the White River, so I had in mind some variation on the name "Mesopotamia" but I gave up on that. No, Schloss Grünfluss it is.

The move is pretty recent. I had some plans for getting settled in, and they've hit a bit of a snag in the form of the heaviest, most prolonged snowfall that the Puget Sound area has experienced in a long time...

Monday, November 5, 2018

It is oh-so-very important to me that other people vote, for some reason

I've long wondered why, every two years (or really whenever there's a major election), I hear and see so much emphatic "you must go vote" rhetoric. If it were more specific, "vote for this" or "don't vote for that," then I'd have thought little of it. But more often, it's been a simple exhortation on the act of voting itself. I've always thought that this was absurd...
  • You do not gain anything by having more people voting. If anything, you lose something because your vote counts for less.
  • You'd think that people who are disinclined to vote, who lack interest and would need your chiding to motivate them, would be the same people who should not vote.
  • It's none of your business what, if anything, other people voted for anyway. Trying to meddle is boorish and downright un-American.
Honestly, even after I first registered to vote, this stuff annoyed me. I deliberately abstained from voting, or from voting on certain things, because I felt ignorant. I deemed that I didn't have enough background knowledge to be making these decisions and so I opted not to make an ass out of myself, even if it was a secret ballot. A few years of seeing how zealously involved people could be while simultaneously revealing the extent of their own ignorance somewhat tempered my stance on that. I suspect, for many people, that's a natural part of, um, being an adult. Actually, I suppose it's a cycle that's repeated in some other areas as well. At first, I'm overwhelmed and paralyzed by trepidation, wary of how poorly I understand what's going on. Then I see other people unwittingly demonstrating how they're massively clueless right before they brashly assert themselves, acting with supreme confidence where I had been so timid. And it doesn't break me out of my habits at first, but over time, it wears me down.

Encouraging other people to vote, though? No personal gain, no immediately obvious reason. It can't even be for the cause of advancing a political agenda because you have no reason to expect that your interlocutors would vote along your lines. Obviously that part doesn't apply if you're talking to specific people you know and you can gauge some of their beliefs and such. If I think to myself, "Timmy doesn't really want to vote, but he agrees with me on most stuff, so if I convince him to vote, it'll be more votes for the stuff I want," then at least there's a kind of potential gain. I wouldn't go for that sort of thing myself, but I could understand that others might. However, that only applies to specific, reasonably close, acquaintances. And what I've encountered more commonly is general, vocal, public exhortation. And for that, a political agenda doesn't really apply.

For probably close to a decade, my best guess was that this whole phenomenon was some combination of virtue-signalling and a meme. And maybe both of those do play a role. Both are rather more nuanced than my straightforward sentence here. Virtue-signalling could tie into a generalized notion of "civic duty" but also to notions of youth and mentoring, to community development. And saying something is a meme says nothing about why it's a meme. I do believe that some of this is because of behaviors that are replicated after observation, spreading across the population like an infection, but behaviors only do that under certain conditions, and I couldn't be sure what those conditions were. I still can't.

In the past few years, I've been revising my assessment of this topic. I don't doubt that virtue-signalling plays a role and that some of the behaviors involved are memetic. But I've come to place the bulk of the blame elsewhere. I suspect that most of the people exhorting others to vote are doing so because they earnestly believe that the people they're exhorting would vote along their own lines. They truly think, on some level, that they can encourage others to vote and that those people, who might not have bothered to vote at all, will happen to check the same boxes that they themselves intend to check.

This seems so patently silly that even though I must have first considered it many years ago, I didn't take it seriously. What changed? Well, it wasn't a single, monumental thing. It's just the accumulated weight of my observations and what they've taught me. The vote-encouragers are almost always politically entrenched in some pronounced, polarized manner. Oh, they've got their own quibbles with a given political party or movement. They've got nuance and depth. But they've also come to so strongly view the other side as so cartoonishly villainous, so egregiously evil, so obviously wrong, that they sincerely believe prospective fence-sitters see things in the same light. A kind of "Yes, perhaps Timmy doesn't see things in the way I do, and perhaps he has problems with the general atmosphere in 'A.' But 'B' is so bad that anyone could see it, and Timmy would never vote for them."

I should have noted this fallacy a long time ago and given it the proper appreciation I now think it deserves. Because I think it's very common. Wherever people get split into two camps, people in both camps will tend to believe that the uninvolved would probably mostly side with them. They extend their own distaste for the other side into reality itself, perceiving it to be a tangible, visible miasma.

Actually, I wonder if this might just be an extension of some deeper human instinct. In the absence of contrary information, I assume that the tastes of others are similar to my own. So I think that Reese's Peanut Butter Cups are delicious and Almond Joy are disgusting, and I know that there are people out there who have the exact opposite perspective, who think that Almond Joy are delicious and peanut butter cups are disgusting. But I dismiss them as deviants. And my baseline, default assumption for the general public is that they would enjoy peanut butter cups and detest coconut abominations.

Well, it's silly and wrong when I do it with candy. But it's just as silly and wrong when you do the same thing with politics.

Sunday, October 21, 2018

The use of pseudonyms and initialisms by women SF authors

A little over a year ago, I took a bit of time to blog about Isaac Asimov's posthumous collection, Gold. For all I know it wasn't the case for anyone else, but for me, this was an especially emotionally moving book. In 1995 and 1996, Harper Collins published a two-volume collection of Isaac Asimov's "final" works, starting with Gold: The Final Science Fiction Collection and concluding with Magic: The Final Fantasy Collection. As I noted in that earlier blog post, I happened upon the second volume while browsing the KCLS catalog in 1997, and was intrigued because I had recently been enthused by the card game "Magic: the Gathering" and the video game "Final Fantasy VII." I'd never heard of Isaac Asimov, and checking that book out on a whim completely changed my life. I guess I said all of that already...

Earlier this year, probably on the FOCL (Friends of the Covington Library) booksale shelf, I picked up an old used copy of a book called Gold: The Final Science Fiction Collection by Isaac Asimov. I immediately recognized it and knew that I had to buy it. That I recognized some book is not unusual. That I recognized a relatively obscure posthumous book of uncollected writings from decades ago is probably a bit odd (for me). But what is important is how and why I recognized it. You see, Gold is one volume of a posthumous collection of Asimov's work. It was published in 1995. He died in 1992. There was a companion volume, Magic: The Final Fantasy Collection published in 1996. I read it in 1997, about twenty years ago. And at that time, I had no idea who Isaac Asimov was. I can confidently say, with no exaggeration, that that book changed my life.

I wasn't especially interested in science fiction. Didn't have anything against it, but I just didn't know anything about it or especially care. I liked books, though. I was always looking for new books to read. Somehow, while performing different searches on the KCLS catalog (it was computerized, even back then), I came across the title. It had both "Magic" and "Final Fantasy" in it. I hadn't heard of Isaac Asimov before in my life (although I'd later discover that his work had hugely influenced a whole lot of things that I did know about). But I had recently become enamored both with a card game, Magic: the Gathering (which I still play) and with a video game, Final Fantasy VII (which is probably a bit hamfisted in retrospect, but it impressed me at the time). So with nothing other than a title that piqued my curiosity, I checked it out. My eleven-year-old mind was thoroughly blown. It's not so much that Magic was Asimov's best work, but that it exposed me to a world I hadn't seen before. The fiction was fun, fascinating, and really hooked me in. But the nonfiction in the collection was something I'd never imagined, something so totally novel to me that it was, well, I can only describe it as formative. I had to have more! Magic led me to looking into Isaac Asimov, which led me to I, Robot, which I recognized as the title of an Alan Parsons Project album, so I checked that one out too. But the copy I checked out came bundled with Foundation, so I also read that, which subsequently transformed me into a science fiction nerd for life.

If I'd been pressed to cite the most important book I've read in terms of its influence on me or my appreciation for it, I'd probably start thinking of things like Alastor, The Moon is a Harsh Mistress, The Death Gate Cycle, East of Eden, The Gods Themselves, Twenty Thousand Leagues Under the Sea, or maybe even The Lord of the Rings. And then I met Gold. It's a fine read, don't misunderstand. But I am perhaps uniquely affected by it. The stories are interesting and the nonfiction is, for those with a specific interest in the world of science fiction publication in the late 20th century, enlightening. But for me, this book was heartwrenching. It was so evocative, so similar to Magic, a book that had been buried in the back of my mind. I'd forgotten how damn impressed I was by that book, how it had driven me in the sorts of books I sought out thereafter, how as a kid I'd gotten a kind of crash course of insight into topics I'd never even considered. The memories came flooding back. I'd seen that Gold was the other volume back then. But I didn't find it at the library back then and eventually I moved on. I came full-circle twenty years later, by sheer coincidence.

Asimov had been dead for five years before I ever ran into him. And it would be another twenty before I realized just how much he inspired me.
Anyway, in Gold, Isaac Asimov mentioned other science fiction authors several times and had this particular line, which I'd come back to later...
Don't get me wrong. There were women writers even in the early days of magazine science fiction, and women editors, too. When I was young, some of my favorite stories were by A.R. Long and by Leslie F. Stone. I didn't know they were women, but they were.
Because of that one line, I bought a copy of Leslie F. Stone's Out of the Void. I later saw the claim on a website that Asimov had cited Leslie F. Stone as being one of the female science fiction writers who disguised her sex to appeal to a male readership, with the appended statement that Asimov was incorrect, that Leslie had been born with the name Leslie and had always been Leslie and wasn't hiding anything. However, as far as I can tell, their source for that is the exact line I just quoted, which doesn't actually say what they said it says. I could be wrong about that, though. Sorry, bit of a tangent there. That point occurred to me, but it's not very important and not the real reason I started this blog post.

I'm currently reading Joan Vinge's The Snow Queen. In the preface, I saw this...
Along with that new openness to creativity in the field came an influx of women writing science fiction. There had always been a few women SF writers who were very successful, but the majority of them wrote under male pseudonyms, like Andre Norton, or had female names that could easily be mistaken for male names, like Marion Zimmer Bradley. Others used only their initials and last name. The majority of readers very likely never realized that they were women, until they started coming out of the gender closet.
I've seen essentially the same sentiment in other places. But here it was concrete. And I don't want to just dismiss it: I wasn't around back then. By the time I heard about Andre Norton or Marion Zimmer Bradley, it was probably common knowledge that they were women. And I grew up with the rise of the world wide web, which probably changed the landscape for these sorts of things. And yet, I have some real qualms with this. It's not that I contend none of the women who wrote science fiction disguised their gender. I just see casual claims like Vinge's here. "The majority of them wrote under male pseudonyms" and such. Did they? Is that true? I'm skeptical. Asimov may have sincerely assumed, in his youth, that the magazine stories with "Leslie F. Stone" in the byline were written by a man, and that's understandable since the name "Leslie" would have been more common for boys than girls back then. But it really was her real name, the one she was born with. Even back in 1905, some parents named their little girls "Leslie." Same goes for Marion Zimmer Bradley: that was her actual name. And I, for one, immediately assumed that she was a woman the first time I saw a story under her name. I didn't learn until much later that any men were ever named "Marion."

Andre Norton is a bit of a strange case. I know that when I first saw the name, it was in the context of Mercedes Lackey writing about her, so I got the information up-front, and didn't think much of it. Honestly, I didn't think anything of it until I came across the concept in Joan Vinge's preface just recently. And then I kinda did a double-take. Because of course Andre is a masculine name. There was Andre the Giant and the tennis dude and probably some others I assume. But, and I forget how I knew this, it was also Andre Norton's real name. It wasn't a pseudonym: it was her legal name. But here's where it gets strange. I looked it up with teh Google, and it seems that Andre Norton was born "Alice Mary Norton" and legally changed her name to "Andre Alice Norton." According to her obituary, she initially used Andre as a pseudonym in the 1930's because publishers told her that her books would get more boys reading them if the author name was masculine. I don't know where the obituary got that information, but assuming it's true, that really would have been an early case of the concept Joan Vinge (and others) mentioned.

Still, I'm not convinced. Maybe I should be, but it all seems extremely circumstantial. I do know that there were other examples, like the well-known case of "James Tiptree Jr." being a pseudonym for a woman whose identity was initially a mystery to the public. But some people like pseudonyms! The same author also used "Raccoona Sheldon" as a pseudonym, and that was before her identity was public. Lots of authors, men and women, use pseudonyms or initialisms. I haven't even seen someone claim that it's more common for women, let alone use evidence of such a disparity to argue that this had to do with the perception by readers of the quality of writing by women. But what I have seen is allusions to some dark age of science fiction when women had to disguise themselves. It all sounds far more exciting than "there were women writing, but some of them used pseudonyms and others used initialisms and others did neither of those things."

Now, what I do see to be pretty demonstrable is that some women used overtly masculine pseudonyms (like when the aforementioned Amelia Reynolds Long wrote under the name "Peter Long"). And in contrast, I don't know of any instances of 20th century male science fiction authors hiding their sex with female pseudonyms. Presumably it happened at some point, but it seems like it must have been more rare. But why is that? My tentative answer is that men generally felt less comfortable with the notion of using a feminine pseudonym than women did with the notion of using a masculine one.

Friday, September 28, 2018

Crap from Facebook Special Edition: failing grades

So this one bugs the hell out of me. Might as well blog it. Or something...

Backstory: a middle school teacher in Florida was fired and went to the press with a story about how it was because she heroically refused to give students grades of 50% on missing assignments. As many others have already noted, the school district denies that such a policy exists, and administrators are not really allowed to comment on the circumstances, so the public only gets one side of the story. The teacher was new and was in a probationary period, so the district didn't need justification for terminating her. That's why it's common knowledge that rocking the boat while you're still on probation is a bad idea. If you're officially in a feeling-out period and the people whose job it is to decide whether you're the right fit perceive you to be stirring up shit, you're going to get what's coming to you. Anyway, I take all of this with a grain of salt.

I saw the story on Facebook and a bunch of people commented in defense of this teacher. Right away, when I saw that reports indicated she'd been on probation, I was suspicious. But even aside from that, I found myself thinking, "Regardless of whether that policy is real, it's probably a better system than most traditional grading policies anyway." Among those bothering to comment on Facebook, though, this would have been a minority position for sure. The crux of the issue seemed to be the notion that a 0% grade for 0% work turned in was just/fair/proper/righteous and that any deviation from this was giving something for nothing, and goshdarnit, kids will grow up all spoiled and such if they get something for nothing. Well, I found that stance to be naive, but I was inclined to leave it alone.

As it happens, the maintainer of one of the Facebook pages I follow is also a middle school teacher, and he wrote an article about this story. A few paragraphs into his article, he linked to some old quotes of the "kids these days" variety. He went on to explain, using basic statistics and some common sense, why he eschewed 0% grades. His reasoning seemed sound to me, and I had this, admittedly misplaced, hope that such a thought-out explanation would change the minds of some of those curmudgeons bemoaning the noxious puerility of today's youth. And perhaps, for some, the article really was food for thought. But most of the comments were critical, grating, arrogant, and really just stupid. And that's why I'm blogging about it!

The naysayers seemed to have two primary objections. Many comments hit on both, but some only picked one or the other...
  1. Giving a student 50% for no work is inherently unfair. Zero work should mean zero credit. Assignments that are turned in, but done poorly, could perhaps be 50%. But missed assignments must be 0% and anything else would cheapen the work put in by the students who did complete the assignment.
  2. Students who get something for nothing will grow up wrong. They won't be prepared for the "real world." Kids need to learn that their actions have consequences. Coddling them will hurt them in the long term, etc.
Some of these comments veered right into the territory of "kids these days." So I quipped about the irony of including the old quotes in the article to make a point, only to get more of the same comments, proving the point. But I can't leave well enough alone. I won't get sucked into that comment section. I try to avoid that sort of thing these days. But just in case anyone who stumbles across this blog might harbor similar sentiments, I'll use this space to explain exactly why you're wrong.

Any grading scale for schoolwork is, to some extent, arbitrary. Confusingly, most schools issue grades using different scaling in different situations. Growing up, I had scales from 0 to 5, from 0 to 4, from 0% to 100%, and letter grades A/B/C/D/E/F and then just A/B/C/D/F (don't ask me what happened to "E" or why "E" was cut instead of "F"). And different teachers had different standards for how those grades were assigned in the first place. Some classes were graded on curves, others were not. Some classes had grades weighted around exams, others were mostly determined by homework. Some classes would drop the lowest few scores before computing an average for the course (a class with 12 graded assignments might drop the two lowest scores, then average the other 10 to determine the course grade). Some classes were set up so that grades were determined by the results of collaborative projects, others were based on individual work. Some teachers had rigorous systems of penalties for late assignments, and others just outright refused to accept assignments past their deadlines. Some teachers offered extensive extra credit in various ways, and other teachers hated extra credit. And that was just me. Just my real-life experience in the one school district where I grew up.

In the abstract, we could devise all manner of wacky grading schemes. There isn't one that is inherently the best. Virtually every grading system at some point takes different grades and uses them to compute a mean, but that isn't absolutely necessary. We could devise a system wherein the median assignment grade is used to generate the course grade, or we could use some other metric entirely. A percentile scale is convenient for doing computational analysis, for getting numbers that are easy to work with for putting into charts and stuff. But don't delude yourself that the numbers are real. They're as arbitrary as any other system.
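To make that concrete, here's a minimal sketch in Python. The scores are ones I made up for illustration, not from any real class, but the same dozen assignment grades produce noticeably different course grades depending on whether you take a straight mean, drop the two lowest scores first, or take the median:

```python
from statistics import mean, median

# Twelve hypothetical assignment scores, including one missed assignment (the 0).
scores = [95, 88, 76, 0, 92, 84, 79, 90, 85, 81, 70, 93]

# Straight mean: every score, including the zero, counts equally.
straight = mean(scores)

# Drop-lowest-two: discard the two worst scores, then average the rest.
drop_two = mean(sorted(scores)[2:])

# Median: the middle score; a single zero barely moves it.
mid = median(scores)

print(straight)  # 77.75 -- a C student
print(drop_two)  # 86.3  -- a B student
print(mid)       # 84.5  -- a B student
```

Same student, same work, three different grades. None of the three methods is "the real one."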

As far as "the real world" goes, I found it especially amusing to see remarks like, "I can't just fail to show up for work and get paid 50%." In case you didn't know, children don't get paid to go to school at all. Now, some adults do make the conscious decision to show up to a place and do what other people tell them to do without expecting to be paid directly for it, but that's something we do of our own volition with the goal of earning some advanced certification or license, and even then only for a few years. Children are expected to attend school full-time for thirteen years, and not for some high-caliber certification, but for a high school diploma, essentially a worthless piece of paper. That process is wholly unlike anything in the adult world of anyone I know. Griping that a grading scale won't "prepare" them for the real world is bullshit.

Grading systems are inescapably arbitrary, but we can apply knowledge from different fields to discern their advantages and disadvantages. While there might not be a single, most objectively fair/just approach, it's obvious that some methods would be more or less generous than others. As the author of the article notes, averaging assignments graded on a 0 to 100 scale is actually quite a bit more punitive than most things we deal with for the rest of our lives. It means that one missed assignment effectively cancels out several other assignments with respectable scores. Bear in mind that overall course grades as high as 59%, and sometimes even higher, are considered inadequate, considered "failing." In your adult life, you can probably make some minor mistakes and have them either overlooked entirely or raised and brought to you for correction, and if you can correct them, you might get some minor penalty (or even none). Even serious blunders can be forgiven if you show some attention to preventing recurrences and your other performance has been satisfactory. In most professional settings and in other adult situations, you usually get some "benefit of the doubt." A 0 to 100 scale doesn't do that, especially not when a simple mean is used. Two perfect 100% scores and one missed assignment average out to 66%, usually reckoned as a D, barely passing. If you're generally capable of churning out work that scores 80%, which is typically considered a B-, a single missed assignment would average out with four instances of your usual work to a 64%. And even after another round of five assignments, you'd have a 72% average; you'd be a C- student. That's pretty harsh. Adults can smugly go, "But that's good. Kid's gotta learn accountability. Blah, blah, blah."
But ultimately, that's a more punitive standard than you're held to for most things in life, and when you are held to standards that high, it's typically because you are a competent professional who chose to enter into some contract for tangible benefit: you have your shit together and you want to do serious work and get the rewards that come with it. Kids have way less control over their lives than you do.
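For what it's worth, those averages check out. A few lines of Python reproduce the arithmetic from the scenarios above:

```python
from statistics import mean

# Two perfect 100% scores and one missed assignment.
two_perfect_one_zero = mean([100, 100, 0])   # about 66.7, reckoned as a D

# An 80% (B-) student who misses one assignment out of five.
one_miss_of_five = mean([80] * 4 + [0])      # 64.0, a D

# The same student after another round of five assignments at their usual 80%.
one_miss_of_ten = mean([80] * 9 + [0])       # 72.0, a C-

print(two_perfect_one_zero, one_miss_of_five, one_miss_of_ten)
```

One zero dragging ten assignments' worth of B- work down to a C- is exactly the kind of disproportion the article's author is talking about.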

Opponents see 0% as the natural baseline and are incensed that he seemingly moves the baseline up to 50%, effectively giving away points for free, which they believe to be too generous. But that's framing it like children in school are supposed to be measured as elite competitors in some sort of contest, with the numbers reflecting their output. The point of school is to educate, not to cull the weak. As he takes pains to emphasize, a lot of factors outside of school affect performance. Setting the baseline at 50% offsets some of this. Some students might still fail. Consistently missing assignments, or missing too many assignments and turning in poor-quality work on the rest, is still going to result in an E average. But at least it gives struggling kids a chance.
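To see what the 50% floor actually does, here's a minimal sketch. The function and the scores are mine, invented for illustration, not lifted from the article:

```python
from statistics import mean

def course_average(scores, floor=0):
    """Average the scores after raising anything below `floor` up to `floor`."""
    return mean(max(s, floor) for s in scores)

# Six assignments; the two zeros are missed assignments.
scores = [85, 0, 90, 78, 0, 88]

zero_baseline = course_average(scores)           # zeros count fully
fifty_floor = course_average(scores, floor=50)   # missed work floors at 50

print(zero_baseline)  # about 56.8 -- failing outright
print(fifty_floor)    # 73.5 -- struggling, but salvageable
```

Note what the floor doesn't do: it doesn't hand out passing grades. A student who keeps missing assignments still ends up in the C-minus-and-falling range; they just aren't buried so deep that the rest of the term is mathematically pointless.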

Monday, August 20, 2018

Why I'm wary of labels

In various conversations over the years, I've noted my wariness of labels. While that trait isn't unique or even especially novel, it usually comes up against steadfast resistance. Part of that is just down to context. But people love their labels and insist on them. Labels are convenient. Now, I'm also someone who is self-admittedly obsessed with classification. So this might seem a bit odd. After all, wouldn't we use labels to classify things? Yes, but also, that's not really the problem that worries me. So here's something I just brainstormed that might help...

Suppose that A makes a certain claim.
Suppose that B makes another claim.
Suppose that those two claims are mutually incompatible.
Suppose that you might take issue with other features of one or both claims.
Note that A and B are both letters of the alphabet and that this classification can be used to group them together.
Finally, suppose that you use all of that to dismiss C, as C is also a letter of the alphabet and therefore is assigned a connection to both the claim put forth by A and the claim put forth by B.

That makes it pretty clear, right? You shouldn't dismiss C, poor C, because of something that A said, nor because of something that B said. Doing both simultaneously doesn't make it better. And yet, all too often, it seems like the use of demographic labels in discourse is a smokescreen to hide "straw man" arguments. Careful observation has led me to the belief that this isn't the goal of using labels. People want labels because it helps them compartmentalize, and compartmentalization is (most of the time) a useful cognitive tool. I see the appeal, but I also see it as dangerous. So that's why I'm wary.

Monday, July 9, 2018

Tyrants

You think that you're a good person. Well, maybe not, but at least you're an OK person. Not deranged or whatever, you know? Live and let live, that sort of thing. And so you wonder, psychologically, what it is that makes someone become blatantly evil. Not in a "why do people do bad things" kind of way, but the real monsters. Mass murderers. Hitler, Stalin, Mao, etc. Those guys. What led them down that path? It's so far removed from your own experience that it's a bizarre subject to even consider. How does the mind of a tyrant work? What goes through their heads?

And then your cell phone breaks and you have to get a new one. Your new phone comes with autocorrect enabled for text messages, and it triggers before you notice it. You see that autocorrect slightly altered one of your sentences without your permission. Nothing embarrassing or important, but it made the words wrong. You get annoyed and want to turn it off, but it isn't intuitively obvious how to find that option in your phone's settings. So you think to yourself, "The people responsible for this should be marched up to a trench and shot. Yes, that is the appropriate consequence for this injustice and I would like it carried out immediately."

Yikes.