To boldly split what no one should split: The infinitive.

Lies your English teacher told you: “Never split an infinitive!”

To start off this series of lies in the English classroom, Rebekah told us last week about a common misconception regarding vowel length. With this week’s post, I want to show you that similar misconceptions also arise at the level of something as fundamental as word order.

The title paraphrases what is probably one of the most recognisable examples of prescriptive ungrammaticality. Taken from the title sequence of the original Star Trek series, the full sentence is: To boldly go where no man has gone before. In this sentence, to is the infinitive marker which “belongs to” the verb go. But lo! Alas! The intimacy of the infinitive marker and the verb is boldly interrupted by an intervening adverb: boldly! This, dear readers, is a clear example of a split infinitive.

Or rather, “To go boldly”1

Usually an infinitive is split with an adverb, as in to boldly go. This is one of the more recognisable prescriptive rules we learn in the classroom, but the fact is that in natural speech, and in writing, we split our infinitives all the time! There are even chapters in syntax textbooks dedicated to explaining how this works in English (it’s not straightforward though, so we’ll stay away from it for now).

In fact, sometimes not splitting the infinitive leads to serious changes in meaning. Consider the examples below, where the infinitive marker is to, the verb it belongs to is leave, and the adverb is calmly:

(a) Mary told John calmly to leave the room

(b) Mary told John to leave the room(,) calmly

(c) Mary told John to calmly leave the room

Say I want to construct a sentence in which Mary, in any manner, calm or aggressive, tells John to leave the room, but to do so in a calm manner. My two options for doing this without splitting the infinitive are (a) and (b). However, (a) expresses more strongly that Mary was doing the telling in a calm way. (b) is ambiguous in writing, even if we add a comma (although a little less ambiguous without the comma, or what do you think?). The only example which completely unambiguously gives us the meaning of Mary asking John to do the leaving in a calm manner is (c), i.e. the example with the split infinitive.

This confusion in meaning, caused by not splitting infinitives, becomes even more apparent depending on what adverbs we use; negation is notorious for altering meaning depending on where we place it. Consider this article title: How not to raise a rapist2. Does the article describe bad methods of raising a rapist? If we split the infinitive we get How to not raise a rapist and the meaning is much clearer – we do not want to raise rapists at all, not even using good rapist-raising methods. Based on the contents of the article, I think a split infinitive in the title would have been more appropriate.

So you see, splitting the infinitive is not only commonly done in the English language, but also sometimes actually necessary to truly get our meaning across. Even when it’s not necessary for the meaning, though, as in to boldly go, we do it anyway. Thus, the persistence of anti-infinitive-splitting smells like prescriptivism to me. In fact, this particular classroom lie seems to be slowly recognised for what it is (a lie), and current English language grammars don’t generally object to it. The biggest problem today seems to be that some people feel very strongly about it. The Economist’s style guide phrases the problem eloquently3:

“Happy the man who has never been told that it is wrong to split an infinitive: the ban is pointless. Unfortunately, to see it broken is so annoying to so many people that you should observe it.”

We will continue this little series of classroom lies in two weeks. Until then, start to slowly notice split infinitives around you until you start to actually go mad.

Footnotes

1 I’ve desperately searched the internet for an original source for this comic but, unfortunately, I was unsuccessful. If anyone knows it, do let me know and I will reference it appropriately.

2 This very appropriate example came to my attention through the lecture slides presented by Prof. Nik Gisborne for the course LEL1A at the University of Edinburgh.

3 This quote is frequently cited in relation to the split infinitive; you can read more about The Economist’s stance on the matter in this amusing post: https://www.economist.com/johnson/2012/03/30/gotta-split

Lies your English teacher told you: “Long” and “short” vowels

I remember, long ago in elementary school, learning how to spell. “There are five vowels,” our teachers told us, “A, E, I, O, U. And sometimes Y.” (“That’s six!” we saucily retorted. (We were seven.))

“When a vowel is by itself,” our teachers continued, “it’s short, like in pat. When there’s a silent e at the end, the vowel is long, like in pate1.” Then there were a dozen exceptions and addenda (including the fact that A could be five different sounds), but the long and the short of it was, there are long vowels and there are short vowels.

And you know something? There are long and short vowels in English. We actually briefly discussed this before, many moons ago during our introduction to vowels, but I wanted to add a little more detail today.

The first important thing to remember is that writing is not equivalent to the language itself.2 Our spellings are generally standardized now, but they are only representations of words, and they do not dictate how a word actually sounds. Furthermore, English orthography uses five or six symbols to represent more than a dozen different vowel sounds (not exactly an efficient system). Our example words above, pat and pate, actually contain two distinct vowels pronounced in two different places in the mouth. The same is true of the other “long” and “short” vowel pairings. It’s almost like these sounds ([æ] and [eɪ], in IPA) aren’t really related; they just timeshare a spelling.

In another sense, though, it’s not so incorrect to say that pat has a short A and pate has a long A. To illuminate this claim, we’ll need two ingredients: an understanding of vowel tenseness in English, and an important sound change from the language’s past.

For scholars of English, a more important distinction than vowel length is vowel tenseness. As with the long/short vowel spelling distinction, linguists have identified pairs of vowels that are separated by no more than a little difference in quality. The difference, though, is not a matter of length, but of whether the vowel is tense or lax, i.e. whether the muscles in the mouth are more tensed or relaxed in the production of the sound. These pairings are based on the sounds’ locations in the mouth and are therefore a little different from those traditionally associated with the letters. Pate and pet demonstrate a tense-lax pairing, as do peek and pick. The sounds in these pairs are very close together in the mouth, pulled apart by the tenseness, or lack thereof, of their pronunciation.

In some dialects of English, like RP or General American, tense vowels (and diphthongs) naturally acquire a longer duration of pronunciation than lax vowels. In short, the tense vowels are long. Therefore, it wouldn’t actually be false to say that pate has a long A and pat has a short A, but the length of the vowels is an incidental feature of English’s phonology and isn’t really the important distinction between the sounds (not for linguists, anyway).

It isn’t always that way in a language, and in fact, it wasn’t always that way in English. We’ve mentioned this before, but it’s pertinent, so I’ll cover it again: in some languages, you can take a single vowel (pronounced exactly the same way, in the same place in the mouth), and whether you hold the vowel for a shorter or a longer length of time will give you two completely different words. This is when it becomes important and appropriate to talk about long and short vowels. Indeed, further back in English, this was important. In Old English, the difference between god (God) and gōd (good) was that the second had a long vowel ([o:] as opposed to [o], for the IPA fluent). In all other respects, the vowel was the same, what many English speakers today would think of as the long O sound.

In a way, these Old English long/short vowel pairings are really what we’re referring to when we talk about long and short vowels in English today (even if we don’t realize it). The historic long vowels were the ones affected by the Great English Vowel Shift, and the results are today’s colloquially “long” vowels. The short vowels have largely remained the same over the years. Maybe in this sense, as well, it’s not so bad to keep on thinking of our modern vowels as long and short. So many other quirky aspects of English are historic relics; why not this, too?

In the end, maybe the modern elementary school myth of long and short vowels isn’t entirely untrue, but there’s certainly a lot more to the story.

Notes

1 This is a delightful, if somewhat archaic, word for the crown of the head. I love language.
2 I imagine some of our longtime readers are fondly shaking their heads at our stubborn insistence on getting this message across. Maybe it’s time we made tee shirts.

The myth of language decay: Do youths really not know how to speak?

Hi everyone!

My name is Sabina, I’m 28 years old, from rainy Gothenburg, Sweden (unlike Riccardo from sunny Bologna). Why am I here? Well, to talk about linguistics, obviously! Specifically, I’ll be talking about a persistent and prevalent language myth: the myth of language decay.

This is the idea that modern forms of language are somehow steadily getting “worse” in comparison to previous stages of the language. The thought that there was, somewhere, somehow, a “golden age” of the language, after which it became unstructured, uninformative or just plain “bad”. This idea is a form of prescriptivism, as described by Riccardo in last week’s post, and perhaps the most widespread one at that.

You might think that this is not as common a myth as I say, but consider: have you ever heard someone claim that “young people” don’t know how to write? How to talk “properly”? Maybe even how to read? These are, indeed, examples of this myth.

However, is it true? Do young people really not know how to write/speak/read their native tongue? Of course not, they just do it in a different way.

The myth of language decay is intimately connected to the phenomenon known as language change. Now, language change is often described by linguists as a necessary, vital and continuous part of the language’s development and survival. Just imagine if we spoke English the same way as in the Middle Ages, or even as in Shakespeare’s time! English today is certainly different from back then, but it is in no way worse. Think about it, would you really want everyone to speak like Shakespeare did? Or Chaucer? Or perhaps as in Beowulf?

It is interesting to note, however, that the idea of language decay rarely touches the history of the language. Chaucer and Shakespeare lived approximately 200 years apart, yet no one really claims that Chaucer’s English was “bad” in comparison to Shakespeare’s, do they? (As a matter of fact, Chaucer has earned himself the nickname “Father of English literature”, so it really can’t be, can it?)

Let’s take a more recent example: Charles Dickens (1812-1870) to J.R.R. Tolkien (1892-1973) to George R.R. Martin (1948-). Now, if you sit down and read through the works of these three authors, all of whom have been hailed for their writing skills, you will probably notice a rather distinct difference not only in style, but perhaps also in lexicon and grammar. Yet no one is arguing that Dickens and Tolkien didn’t know how to write, are they?

But guess what? Someone probably did when Tolkien started writing! Someone probably did when Martin started out. Someone probably even said it about Dickens, Austen, Woolf, Brontë, Shakespeare, Chaucer, etc, etc.

In fact, people have been complaining about language “decay” for a long, long time, specifically since the time of Sumerian, a language spoken in the region of Sumer in ancient Mesopotamia. Now, you might be thinking: “Sabina, surely you’re exaggerating things just a bit?”.

I am not.

Sumerian is the first language from which there is surviving written material,1 and in 1976 a researcher named Lloyd-Jones2 published a piece of work detailing inscriptions made on clay tablets. Among other things, these contained an agonized complaint made by a senior scribe regarding the junior scribes’ sudden drop in writing ability.

Basically: “Young people can’t write properly!”.

Consider that for a second. People have been complaining about supposed language decay for, literally, as long as we have evidence of written language.

Given this, you can imagine that people tend to have a strong reaction to language “decay”. Consider the case of Jean Aitchison, an Emeritus Professor of language and communication at the University of Oxford. In 1996, Professor Aitchison participated in the BBC Reith Lectures, a series of annual radio lectures given by leading figures of a particular field. Professor Aitchison lectured on the naturalness of language change, stating that there was nothing to worry about.

The result of this? Professor Aitchison received hostile letters to her home. Consider that for just a second: people took the trouble of sitting down, writing a threat, posting it, and waiting for the post to reach her, just to get their sense of accomplishment.3 That’s a pretty good indication of how strongly some people feel about this.

So, why are we reacting that way?

Well, we spend year upon year, in school, in newspapers, even in social media (with its “grammar Nazi” phenomenon), teaching people that there is a “correct” way of using language. We work hard to achieve this standard. Think of it as learning how to ride a bike. All your life, you’ve been told that you should sit on the bike in a certain way. It’s very uncomfortable, but you work and work and work to apply the right technique. When you’ve finally mastered the skill (and are feeling quite proud of yourself), someone comes along and tells you that you can sit on the bike any way you want. Risk of you lashing out? Probably at least somewhat high.

But see, the thing is that, when it comes to language, there really is no “correct way”. Take the word “irregardless” for example. Many immediately get this kind of stone-faced expression and thunderously proclaim that there is no such word. But actually, there is. It’s a non-standard dialectal variant, used with a specific meaning and in specific contexts (in this particular case, irregardless is a way to shut a conversation down after already having said “regardless” in those varieties4, isn’t that interesting?).

But people think that there is somehow something “wrong” with this word, and those who use it (or other non-standard forms) will often be judged as speaking “bad English”, throwing more fuel on the fire for the myth of language decay. This is especially true since older generations may retain their ideas about what is “correct” usage, while younger generations may have a different idea about what is “correct” and use the language in a different way.

So, what’s my point with all this? Well, my point is that the moment a word from a non-standard dialect makes its way into the standard language, it’s going to raise some discussion about the “decay” of the language. This is particularly true of the younger generations today, who have introduced a whole new form of language into their standard vocabulary: internet and/or texting slang!

This is fascinating! We’re introducing a new form of language! But… When young people start using, I don’t know, “brb”, “afk”, “lol”, etc. in their everyday speech, other people may condemn this as “lazy, uneducated, wrong”, etc., etc., and the myth of language decay is rejuvenated.

But the thing is that languages change to match the times in which they exist. A language may change due to political readjustments that have occurred, or to reflect the different attitudes of its speakers. And sometimes, we can’t point to anything that made the language change – it simply did. Regardless, the language reflects its time, not a glorified past. And that is a good thing.

Unless, of course, you would perhaps prefer to remove most -ed past tense endings, especially on strong verbs, and go back to the good old days of ablaut (that is, vowel gradation carrying grammatical information, e.g. sing, sang, sung)? Or perhaps lower all your vowels again and skip the diphthongs? Or perhaps… yeah, you see where I’m going with this.

No? Didn’t think so. In that case, let’s celebrate the changes, both historical and current, without accusing them of somehow making the language worse.

Because, truly, the only difference between the changes that made the language into the “glorious standard” of yesteryear and the changes that are happening now is time.

Tune in to Rebekah’s post next week where she will explain the different periods of English and make it clear why Shakespeare did not write in Old English!

Bibliography

1 Check out the 5 oldest written languages recorded here.

2 Lloyd-Jones, Richard. 1976. “Is writing worse nowadays?”. University of Iowa Spectator. April 1976.
Quoted by Daniels, Harvey. 1983. Famous last words: The American language crisis revisited. Carbondale, IL: Southern Illinois University Press. p. 33.

3 Aitchison, Jean. 1997. The Language Web. Cambridge: The Press Syndicate of the University of Cambridge.

4 Check out Kory Stamper, a lexicographer for Merriam-Webster, explaining “irregardless” here.

Introduction to the blog and some words on Descriptivism

Hello everyone! Welcome to our shiny new blog! My name is Riccardo, I’m 25 years old, from Bologna, Italy (homeland of good food and jumping moustached plumbers) and I’m here to talk about linguistics. Well, we all are, really. That’s why we’re the Historical Linguist Channel™!

So, “what is a linguist?” I hear you ask through my finely-honed sense for lingering doubts. Well, a linguist is someone who studies language, duh. What’s that? You want more detail? I can understand that. After all, few academic fields are as misunderstood by the general public as the field of linguistics. People might think that the Earth is flat, or that aspirin turns frogs into handsome, muscular princes (or was it kisses?), but at least they know what an astronomer or a doctor is and what they do. No such luck for linguists, I’m afraid. Misconceptions about what we do and absurdly wrong notions about what we study are rife even within the academic community itself. We’re here to dispel those misconceptions.

In the series of articles that follows, each of us will debunk one myth or misconception which he or she (mostly she) finds particularly pernicious and wants out of the way immediately before we even start regularly updating the blog’s content. In this introductory article, I will explain the most fundamental source of myths and misconceptions about linguistics there is: the difference between descriptive and prescriptive linguistics.

But first, let me begin with an unfortunately not-so-exaggerated portrayal of the popular perception of linguists: the Movie Linguist.

Scene: an unexplored Mayan ruin, deep in the jungles of Central America. Three explorers cautiously walk in a dark hallway, torches blazing over their heads. Philip, the dashing young adventurer, leads forward, cutting the vines that grow in the ancient corridors with his machete. He is followed by Beatrice, a beautiful young woman he naturally will end up kissing towards the end of the movie. Trailing behind them is a bespectacled, nervous man, awkwardly trying to hold onto a ream of papers and charts. He is Nigel, the linguist. Suddenly, they break into an enormous room. The group leader raises his torch with a sweeping motion. The music swells: the walls of the chamber are covered with inscriptions.

Philip: My God… look at this.

Beatrice: What is it?

Philip: Look at the inscriptions on the walls.

Beatrice: [gasps] Could it really be…?

Philip: Egyptian hieroglyphs… in a Mayan pyramid!!

Beatrice: But it’s impossible! How could they have arrived here?

Philip: I don’t know. Nigel! You’ve got to see this.

Nigel enters the chamber, and immediately drops his papers in astonishment.

Nigel: I- it’s incredible! The theories of professor McSweeney on cultural cross-pollination were true!

Beatrice: Can you read it?

Nigel: Well, given the nature of the expedition, I was presumably hired for my expertise in Meso-American languages. Fortunately, I am a Linguist™, and that means I can read every language ever spoken by every human being that ever lived.

Nigel kneels next to the closest inscription. He thoughtfully adjusts his glasses.

Nigel: Hmmm… I recognise this. It’s an obscure dialect of Middle Egyptian spoken in a village exactly 7.6 km due East of Thebes in the year 1575 BC. I can tell just by superficially looking at it.

Philip: What does it say?

Nigel: Unfortunately, this dialect is so obscure that it wasn’t covered in the 72 years of back-breaking grad school every linguist must undergo to learn every language ever spoken. I will need time to decipher it.

Beatrice: How much time? This place gives me the creeps.

Nigel: Just a few hours, and I will do it with no help from any dictionary, reference grammar or corpus of similar dialects to which I could compare it. After I decipher it, I will, of course, be able to read, write, and speak it natively with no doubt or hesitation whatsoever.

A skittering sound echoes in one of the hallways.

Philip: Be quick about it. I have a feeling we’re not alone…

In the end, it turns out the inscriptions on the wall warn intruders that an ancient Egyptian god slumbers in the tomb and that he will not be appeased by anything except fat-free, low-calorie double bacon cheeseburgers which taste as delicious as their horribly unhealthy counterparts, which is, of course, a dream far beyond the reach of our puny human science. A thrilling battle with the minions of this god ensues, until the explorers come face-to-face with the burger-hungry divinity himself. They manage to escape his clutches thanks to Nigel, who now speaks the Middle Egyptian dialect so well that he manages to embarrass the god by pointing out that he ended a sentence with a preposition.

Somewhere along the way, Philip and Beatrice kiss.

Our objective here at the Historical Linguist Channel is to bring your image of linguists and linguistics as far as possible from the one I just painted above. Said image is unfortunately very prevalent in the public’s consciousness, a state of affairs which makes linguistics possibly one of the most misunderstood academic disciplines out there.

So, without further ado, I will get into the meat of my own post: the distinction between descriptive and prescriptive linguistics.

What is descriptivism?

Most people know at least some basic notions about many sciences: most of us know that matter in the universe is made of atoms, that atoms bond together to form molecules, and so on. Most people know about gravity, planets and stars.

Yet, remarkably few people, even amongst so-called “language enthusiasts”, know the most basic fact about linguistics: that it is a descriptive, and not a prescriptive, discipline.

What does it mean to be a descriptive discipline? As the name suggests, a descriptive discipline concerns itself with observing and describing a phenomenon, making no judgements about it. For a descriptive science, there are no superior or inferior facts. Facts are just facts. A planet that goes around its star once every 365 days is not any better or worse than one which takes, say, 220. As an academic science, linguistics merely concerns itself with studying language in all its forms and variety, without ascribing correctness or value to some forms over others. To a linguist, “I ain’t done nuffin’, copper!” is as good an English sentence as “The crime of which you regretfully accuse me has not taken place by my hand, and I resent the implication, good sir!”

Now, you might be thinking: Riccardo, doesn’t every scientific discipline work that way? To which I answer: yes, yes they do. Linguistics, however, is slightly different from pretty much all other scientific disciplines (with the possible exception of sociology and perhaps a few others) in that, for most of its early history, it was a prescriptive discipline.

A prescriptive discipline is basically just the opposite of what I just described. Prescriptive disciplines judge some forms of what they study to be better or “correct”, and others to be “wrong” or inferior to others. Sound familiar? That’s probably because it’s how most people approach the study of language. Since the dawn of civilisation, language has been seen as something to be tightly controlled, of which one and only one form was the “right” and “correct” one, all others being corruptions that needed to be stamped out. Another very prevalent prescriptive idea is that language is decaying, that young people are befouling the language of their parents, transforming it into a lazy mockery of its former glory, but that’s a story for another post.

Prescriptive linguistics is concerned with formulating and imposing a series of rules that determine which form of a language is correct and which forms are not (in Humean terms, descriptivism is concerned with “is”, prescriptivism is concerned with “ought”. And you thought this wasn’t going to be an exquisitely intellectual blog).

In general, if you ask most people on the street to cite a “rule of grammar” to you, they will come up with a prescriptive rule. We’ve all heard many: “don’t end a sentence with a preposition”, “it’s you and I, not you and me”, “a double negative makes a positive”, the list goes on.

If you ask a linguist, on the other hand, you’ll get descriptive rules, such as “English generally places its modifiers before the head of the phrase” or “English inflects its verbs for both tense and aspect”.

A very useful way to think about the difference between a descriptive and a prescriptive rule is comparing it to the difference between physical laws and traffic laws. A physical law is a fact. It can’t be broken: it simply is. I can no more contravene the law of gravity than I can purposefully will my own heart to beat in rhythm to Beethoven. But I can contravene traffic laws: I am absolutely physically capable of driving against the flow of traffic, of running a red light or not switching on my headlights during poor visibility conditions.

In general, if a rule says that I shouldn’t do something, that means that I am capable of doing it. Even more damningly, if someone felt the need to specify that something should not be done, it means that someone has been doing it. So, completing the analogy, the paradoxical reason you hear your teacher say that you can’t end a sentence with a preposition in English is that you CAN end a sentence with a preposition in English. In fact, it is far more common than the so-called “correct” way.

What you will never hear is an English teacher specifically instructing you not to decline an English noun in the locative case. Why? Because English has no locative case. It lost it in its rebellious youth, when it went by the name of Proto-Germanic and it had just split from Indo-European because that’s what all the cool kids were doing. Finnish, which is not an Indo-European language, is a proper hoarder: it has no less than six locative cases.

Academic linguistics is exclusively concerned with the “physical laws” of language, the fundamental rules that determine how each language differs from all others. It takes no interest in offering value-judgements. Which is why a linguist is the last person you should ask about whether something you said is “good grammar” or not, incidentally.

So, are descriptivism and prescriptivism radically and fundamentally opposed?

Well, yes and no.

A limited form of prescriptivism has its uses: since languages are not uniform and vary wildly even over relatively short geographical distances, it is very important for a country to have a standardised form of language taught in school, with regulated forms so that it doesn’t veer too much in any particular direction. This makes communication easy between inhabitants of the country, and allows bureaucratic, governmental and scientific communication to happen with the greatest amount of efficiency.

The problem with prescriptivism is that it is very easily misused. It takes only a frighteningly short step to go from establishing a standard form of language that eases communication between people in the same nation, to dismissing all varieties of the language which do not correspond to this standard form as debased trash worthy only of stamping out, and any speakers of those varieties as uneducated churls, or worse, traitors and villains. For centuries, some languages (such as Latin) have been touted as “logical”, “superior”, the pinnacle of human thought, while other languages (mainly the languages of indigenous peoples in places conquered by Western colonialists, surprise surprise) were reviled as “primitive”, incapable of complex expression on the level of European languages.

Linguistic discrimination is a woefully widespread and tragically unreported phenomenon which is rife even in what would otherwise be socially progressive countries. In my native Italy, more than 20 local languages are spoken over the whole territory, some as different from Italian as French is. Yet, if you ask most people, even cultured ones, the only language spoken in Italy is Italian (the standardised form based on the language of Florence). All the other local languages are reduced to the status of “dialects”, and often reviled as markers of lack of education or provinciality, and described as less “rich” than Italian, or even as ugly and vulgar. The Italian state doesn’t even recognise them as separate languages.

Even comparatively minor variation is a target for surprisingly virulent hate: one need only think about the droves of people foaming at the mouth just thinking about people speaking English with the intonation pattern known as “uptalk”, characteristic of some urban areas in the USA and Australia.

Be descriptive!

So, what’s the takeaway from this disjointed ramble of mine?

Simple: linguistics is the scientific study of language, and sees all forms of language as equally fascinating and worthy of study and preservation.

In our posts and our podcasts you will never hear us ranting about “bad grammar”, or describe certain languages as superior or inferior to others. Our mission is transmitting to you the wonder and joy that is the immense variety inherent in human language.

Along the trip, you’ll discover languages in which double negatives are not only accepted, but encouraged; in which sentences MUST end with a preposition, when the need arises; languages with a baffling number of cases, baroque verb systems, and grammatical categories you haven’t even heard of.

We hope you’ll enjoy it as much as we do.

Tune in next Thursday for the next introductory post on the thorny question of language evolution, where Sabina will set the record straight: are youths these days ruining language?

Bibliography

Most introductory linguistics textbooks begin with a section on descriptivism, but if you want something free and online, the introductory section for The Syntax of Natural Language by Beatrice Santorini and Anthony Kroch is thorough and full of examples. You can find it here: http://www.ling.upenn.edu/~beatrice/syntax-textbook/