During our Spring Break equivalent earlier this year, I set aside a fairly substantial amount of time to hack away at making the perfect Emacs setup (for me, anyway). It was incredibly relaxing to just spend a few hours digging in and not feeling the stress that comes from knowing I have better things to spend my time on. Since there’s always more hacking to do, I figure this is going to be a pretty good tradition for me. Here’s how the tradition almost died and then came back to life.
My Emacs activity has died down a lot since I got my new laptop. Why? Well… I went a bit too wild setting up Emacs on it. I literally installed every interesting-looking package available from ELPA, MELPA, and Marmalade. That’s a whole lot of packages - not so much to use them all immediately as to keep up with their development. When I restarted Emacs to try a few, disaster struck: I was greeted with a stack trace full of gibberish to the tune of “debugger entered: symbol nil” or some such nonsense. At any rate, the part that normally says “here is where something went wrong” was nil, (), etc., followed by a lot of garbage.
Things mostly worked, though, or at least half worked. For instance, I keep files open across sessions using desktop.el - but only some of those files would actually be restored. I had previously opened an elisp file that contained comments written in Japanese - Emacs decided that must mean I want to use Japanese character encoding for everything, forever. It forgot how to write proper line-ending characters, and when I would re-open a text file I’d been working on, there’d be control characters everywhere. I’m sure other things were broken and I just didn’t notice, but this was all incredibly annoying.
So I hadn’t been using Emacs a whole lot, which is fine since I’m not in programming courses anymore. IDLE works well enough for my work at the lab. In the meantime, I’ve continued squirrelling things away in Springpad to look at in the future (up to 329 items right now). The biggest new source for these has been, strangely enough, a subreddit dedicated to Emacs. I’m not much for reddit, normally, because it’s the kind of place where you can waste a whole lot of time. This is exactly what happens every time I visit r/emacs, so it’s both a blessing and a curse. It’s sort of intimidating to have hundreds of things to look at, but it’s all interesting stuff and it’s all working towards having a configuration I can use for years to come. But, of course, that presumes I’m actually making use of this configuration.
But that debugger thing was a real pain in the butt. I had no real way to start investigating it, since it didn’t give any hint as to what was causing it. I was considering starting over and adding packages one by one, to see where the problem was coming from. That would have been miserable and time-consuming, but it would be appropriate penance for an incorrigible customizer.
On a whim, though, I thought I’d try something today: update all my installed packages. I was worried that maybe the problem wasn’t me - maybe the newest Windows build of Emacs 24.2 was messed up. Or maybe some old package I was using had finally crapped out. If it was really a package I’d installed, and not a complex interaction between multiple packages, it would be simple to fix. So after a few months of anxiety I set 45 packages to update and went to have dinner. Maybe the problem had been fixed already.
I came back, restarted Emacs… No debugger! All my files from the last time I used a healthy Emacs were back! File encodings were back to normal! Now I can finally start hacking again, maybe figure out how to use el-get on Windows to install golden-ratio.el…
If you’re curious (and I know you aren’t - feel free to skip the rest of this), here are the sorts of things I’ve got planned at a bare minimum.
You might be wondering: why do I care about elisp libraries, since I’m not a developer? If these sorts of things take off, it makes things better for everyone, because elisp packages can incorporate reliable components that implement useful functionality. Feels good to make predictions that come true.
PS: Today’s section titles come to you courtesy of Kirby, because I couldn’t come up with any way to organize this around the titles of Star Wars movies
[[This is a short paper I wrote near the end of my Introduction to Linguistics course. The assignment, for bonus marks actually, was to read a book and write a brief summary and respond to the reading. Hopefully it stands well on its own, without the book. Dixon’s book was a pretty good introduction to historical and comparative linguistics - topics we didn’t have a lot of time for during the course itself. At any rate, the material he presented was basic enough and clear enough that I was able to understand it easily. So hopefully this essay is equally digestible.
The other main goal was for us to read about a controversial alternative to the accepted (as far as textbooks are concerned) wisdom about language change. It sounded pretty plausible to me, so I figured I’d go along with it. In retrospect, the most useful things I learned from the book had nothing to do with Dixon’s model itself. At any rate, it was a good experience, and I’m glad Professor Anonby gave us the assignment. Looking back almost two years later, it’s striking how much I’ve taken to heart that if something sounds too good to be true in science… it probably is. Look, ma, I’m a critical thinker.
A couple of good readings on the topic I found when I started looking for other papers using Dixon’s model:
In his book The Rise and Fall of Languages, R.M.W. Dixon discusses the problems with the family tree model of genetic language relationships and proposes an alternative model to supplement it. While the family tree model works well for Indo-European languages, he shows how it has failed to apply to other linguistic areas. As an example, he discusses groups of Aboriginal languages in Australia. Many of them can be grouped into sub-groups based on location, but constructing a proto-language and building the upper levels of the family tree proved difficult. Dixon’s proposed model of punctuated equilibrium claims that in linguistic areas in equilibrium, such as Australia prior to its invasion in the 18th century, language features tend to diffuse amongst neighbouring languages. This leads them to converge towards a common prototype. On the other hand, when that equilibrium is punctuated – by invasion in Australia, though there can be other causes – languages tend to split and form the kinds of genetic relationships seen in the Indo-European family tree.
Dixon describes the kinds of linguistic features that tend to diffuse amongst languages in contact in a linguistic area, and provides an analysis of the strengths and weaknesses of the family tree model. As a transition into his theory of punctuated equilibrium, he describes the possible modes of change in languages – that language can change quickly and decisively, or it can change gradually over time. The first applies to the family tree model, while the second is more appropriate for the punctuated equilibrium model. Dixon then elaborates on recent human history, and how nearly every part of the world has undergone drastic punctuations to its equilibrium, making it easy to think the family tree model applies everywhere in the world. The nature of European invasion is such that an equilibrium situation tends to be punctuated by the very linguist attempting to study it. In any event, few areas of the world still exist in such isolation, so the task of the linguist becomes that of a historian of language, trying to capture snapshots of how languages were before they were influenced by outside sources.
While I am certainly not an expert, I am tempted to agree with Dixon’s punctuated equilibrium model. It seems to account for the shortcomings of the family tree model in language areas outside modern Europe, where clear genetic relationships are more difficult to define than within the Indo-European family. Indo-European languages have had their equilibrium punctuated repeatedly throughout recent history (the past several thousand years), creating the ideal family tree situation, but that’s likely not the case for more isolated areas such as Australia, where various groups would have co-existed relatively peacefully for thousands of years. Additionally, the punctuated equilibrium model does not claim to invalidate the family tree model; the family tree model is naturally included for linguistic areas whose equilibrium has been punctuated. Instead, Dixon’s model supplements the existing theories and expands upon them to account for other linguistic areas. I don’t know enough about world history to think of an area of the world that has neither existed in linguistic equilibrium nor had that equilibrium punctuated, so I would think Dixon’s model applies to just about every linguistic area of the world.
Dixon discussed two types of responsibilities that linguists have – first a social responsibility, then a scientific responsibility. The social responsibility, for the benefit of our understanding of human language, is to document undocumented (or minimally documented) languages to preserve them and see the massive range of possibilities that exists in human language. The scientific responsibility, which is not unique to linguistics, is not to take established theories for granted. To assume that the family tree model applies everywhere in the world, and to use comparative linguistics to “prove” tenuous links between languages, is to deny the possibility that other options exist in the world’s languages. The social responsibility feeds into the scientific, as well, because documenting new languages that may not fit the accepted theories will help to refine linguistic theory.
The question of where linguistic research should focus its attention, on data or on theory, is tied closely to the responsibility of a linguist. Speculative theories can be used to direct research – Dixon’s theory of punctuated equilibrium is a good example – but they cannot exist without data to support them; data, in turn, is what gives rise to new theories. If the available data is never expanded by linguistic research on undocumented languages, then new theories are unlikely to appear. Dixon is an example here as well. He did field work in areas he felt needed study, and found that the data he collected did not fit with existing theories of genetic relationship. The new data led to the creation of a new theory, which could not have existed otherwise.
Overall, I’m quite glad that I read the book. It was enlightening to see what a linguist really does, and I appreciated the theory and how it helps to describe language development outside modern Europe. Sometimes it’s easy to forget that other parts of the world do not have a history defined by bloody wars and political strife, and the fact that a culture could exist so peacefully that it would have no concept of competition (Dixon 113, footnote) is remarkable to me. I knew about the Germanic and Romance languages, in a general sense, but I had yet to be convinced by the family tree model that all languages in the world conform to a genetic relationship pattern. My budding linguistic knowledge now includes the family tree model and the punctuated equilibrium model, which should prove helpful in the future. It’s always good to have more ways to approach a problem, and that’s certainly something Dixon’s book provides.
Before I got my first laptop, I’d always used desktops for my own purposes (obviously). The only laptops I’d used were terrible Vista-era Acers that my family needed me to troubleshoot all the time (usually because of some Acer “value added” software that replaced perfectly functional Windows defaults). But then I got an HP Pavilion dv6-2210 in 2010, and it was nice, and I could bring it to class and have my natural computing environment with me outside the house. It was wonderful, and I loved that laptop, with its homegrown UI cobbled together with Rainmeter and Emerge Desktop… Plus, I could carry it in my backpack when I moved between houses, much unlike a desktop computer. It was pretty important to me.
And then my new laptop, a Lenovo X230t, arrived. I haven’t intentionally used my old laptop since the new one arrived. As soon as I turned it on, I was entranced by its electronic wiles. Oh, HP Pavilion dv6-2210, I did love you - until something that’s better in every way arrived.
I immediately unpacked it, stuck the battery in, and set it down on the kitchen table. When I started up the X230t, the first thing I noticed (or rather, didn’t notice) was how quiet it is when it runs. The fans make a very slight whirr, but it’s only noticeable in a quiet room. My old laptop was left alone in my bedroom down the hall, and it was making enough noise when idle that I could still hear it.
The next thing - though I didn’t fully appreciate it until later - was how amazing the screen is. Since the screen can tilt fully from 0 to 180 degrees, and rotate clockwise from 6:00 to 12:00, it needs to have amazing viewing angles. I didn’t know what an IPS display was until I read this blog post by Jeff Atwood, but that’s what the X230t has. Here’s the difference: when I was playing D&D the other week and our DM wanted to show us an image on his laptop, he had to tilt and rotate the entire machine so that everyone around the table could see. At the wrong angle, the screen was just a gray blob. With my laptop’s screen, everyone could see everything at once. It wasn’t until that event that I realized how awesome this screen is.
Moreover, somehow the quality of the screen has kept the low resolution (the same old 1366x768 as my old laptop) from feeling cramped. I used a wide-screen, 21" monitor at work over the summer and I hated going back to the tiny screen of my HP laptop. I had gone to Emerge Desktop in order to get a completely minimal UI - no chrome at all, just 16x16 icons for my quick launch, currently running programs, and notification area. Oddly enough, on the X230t I’m still using the “big” taskbar that I used to think was massive and ugly - at the same screen resolution! Actually, one big reason not to forgo Explorer as my shell - there’s a default Lenovo widget that displays battery power in terms of time remaining, and that’s awesome.
The battery life is fantastic - I got a 9 cell battery by default and a “slice battery” that doubles my total battery life. It’s a plug-and-play thing that attaches to the bottom of the laptop, rather than an alternative to the regular battery. In other words, I don’t need to shut down and swap batteries if the regular battery is running dry. It can be charged separately from the main battery. While it doesn’t add any new ports or anything, that’s okay, because it would probably provide less battery life if it did. Or add more bulk. The moral of the story is, I can get through an entire day without needing a power outlet, and this is amazing freedom for someone who never had more than five hours (at best) from a full charge.
As for the keyboard, I was worried about going back to typing on a laptop one, but it’s been fine so far. Granted, I’m doing more written (vs typed) assignments this year because of the classes I’m taking, but still. Typing on the keyboard for lectures hasn’t made my hands hurt, though programming on it for a few hours does make me sore. The keyboard does have backlighting, but I’m a touch typist, so to me it’s literally useless.
The trackpad has a nice texture to it, and - glorious day - there are three mouse buttons between it and the spacebar. The third is initially configured as a scroll wheel, but can be turned into a middle-click, and that’s my favourite thing ever. The trackpad does support a variety of gestures, but I can’t remember to use them. Doesn’t help that they’re less reliable than keyboard shortcuts (I couldn’t get the three and four finger gestures to work). But maybe I’ll get into it some day.
It has that signature red ThinkPad TrackPoint nub in the center of the keyboard, but I can’t get used to it. It also has a touchscreen, and I’m going to talk about that in a separate post.
The Bluetooth on the X230t actually works, so I can finally look at Bluetooth headsets as an option over wired headphones. Yay, future. It does lack an SD card reader (my HP laptop had one, and it was occasionally useful) and a CD drive, but the only time either of those gave me grief was when I wanted to put some files on the SD card of my new 3DS XL (I’ll write about that at some point too). It’s got a really weird Print Screen key sitting between the right-hand ctrl and alt keys, which is incredibly annoying when I try to hit ctrl+v and accidentally take a screenshot instead (overwriting the previous clipboard entry).
Apparently, there used to be a keyboard shortcut for changing the scroll wheel function into a middle mouse button, but according to someone at Lenovo some changes had to be made for Windows 8. That forum topic is actually pretty interesting - there are a variety of posts from people about how the X230t compares to the previous model, the X220t, which highlights a few interesting things about it. Plus, someone who actually works at Lenovo came in to comment, which impressed me.
Other than that, I’m not sure what else to say about the hardware. I opted for a better wi-fi module instead of a webcam, because I only ever used the webcam in my HP laptop twice. USB webcams are much better because they can be angled separately from the screen. It doesn’t have an HDMI port, so I need an adapter that changes DisplayPort to HDMI. It has an always-on USB port that’s still powered when the laptop is asleep, so I can charge my phone from it whenever I want. The processor is a Core i5, of the Ivy Bridge variety, which turns out to be better than the i7 in my old laptop - and the built-in GPU is better than the independent GPU in the old laptop, too (at least according to the Windows Experience Index, and I’m too lazy to run real benchmarks).
So all in all, I have fallen head over touchscreen for this laptop. No regrets on the purchase. I’ll probably write a third post about the touchscreen and various other pre-installed software-esque stuff, but no guarantees. Have lots of other stuff I should have written about long ago…
Alright, I know this is uncharacteristic for my tumblr these days, but this is pretty much how I feel about dependencies on Windows. All the time. I waste my entire life (or at least, my entire summer) messing with dependency bullcrap. All I wanted to do was use C++ to put a picture on the screen, and it became a several-hour time suck trying to get CMake to work to install a graphics library (and all the dependencies of its dependencies…). So then I moved to Python and PsychoPy, but the only reasonable way to install PsychoPy on Windows is to download their standalone package that includes Python. So now I have three versions of Python installed (2.7.3, 3.2, and PsychoPy’s 2.6.6) and I have to mess with my PATH to get everything in the right order.
The alternative option was to add their dependencies to my 2.7 installation, but that would be an endeavour worthy of the above .gif times three. Trying to handle the fact that PsychoPy has a dozen (literally, count them) dependencies, the fact that there are three Python package managers (pip, easy_install, setuptools) and none of them seem to work 100% of the time…
It sucks, and I much prefer writing code. Even if the code uses the subprocess module, and calling kill() doesn’t actually work on Windows, so I have to borrow a function that makes calls to the Win32 API… That is still better than managing dependencies outside Unix. Because it means the software for running my experiment is almost ready!
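For the curious, the borrowed workaround looks roughly like this - a minimal sketch using ctypes to call TerminateProcess directly. The exact function I grabbed may differ, and "experiment.py" is just a stand-in for my actual script:

```python
import ctypes
import subprocess

def win32_kill(process):
    """Forcefully terminate a subprocess.Popen on Windows.

    Sketch of the usual workaround: open a handle to the process
    by PID and call TerminateProcess on it via the Win32 API.
    """
    PROCESS_TERMINATE = 0x0001
    kernel32 = ctypes.windll.kernel32
    handle = kernel32.OpenProcess(PROCESS_TERMINATE, False, process.pid)
    if handle:
        kernel32.TerminateProcess(handle, -1)
        kernel32.CloseHandle(handle)

# Usage: launch the experiment script, then kill it when it misbehaves.
proc = subprocess.Popen(["python", "experiment.py"])  # hypothetical script name
win32_kill(proc)
```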
Last Friday, the Cognitive Science department at Carleton hosted a talk by Kayt Sukel, a science writer with a recently published book about the neuroscience of love, sex and relationships. While I enjoyed the talks I attended by Paul Thagard and Zenon Pylyshyn, their main job is to do research, and so their talks were fairly functional. Kayt, on the other hand, writes for a more general audience - unsurprisingly, her talk was really entertaining. There was a lot of laughter, and only a little bit of blushing. But it was super interesting, too, and I wound up buying her book afterwards. Got it signed, too, and her dedication made me smile - “to love and other indoor sports”.
At any rate, before the talk I was looking around her site and read a handful of articles. My favourites:
With all that being said, below are the notes I took from her talk. If you’re interested, find a link to buy Kayt’s book from her site!
If we’re going to study love scientifically, we’ll need an operational definition for what we’re actually looking for
Bartels & Zeki (2000) was the first published study on the neurobiology of love
Fisher, Aron & Brown (2005), in a similar study, found activation in three key areas that are related to attachment, lust, and sex drive
But, for starters, we can mostly agree that love starts with attraction in some form or another
When people claim to be madly in love with a new partner, there are changes in:
In particular, here’s how these chemicals were affected:
Love may actually be the blueprint for drug addiction, as many similar chemicals are involved
Since we see this weird response at the initial development of a romantic relationship, maybe it’s necessary for some evolutionary benefit
Actually, a lot of studies on love and attachment are done on prairie voles
In humans, things are a bit harder to study, but there are interesting differences between men and women:
Is monogamy “natural” in humans? This is probably the wrong question to ask
Motherhood changes the volume of a few areas of the brain
Dads actually have neural changes as well, with an increase in oxytocin
Some people have asked whether studying the neurobiology of love will ruin the mystery and excitement of love
Oxytocin was first discovered in relation to labour/child delivery
Do the chemical changes in parents stay over time, such as after children move out?
The chemicals involved in love are similar to those involved in long-term stress responses - perhaps they just signify important things in our lives
Psycho-social approaches have advanced understanding of a lot of things like heart problems in medical fields - perhaps they would help in the study of love, too
What about relationships that form solely online, where the influence of odour-prints would be removed?
Perhaps, in the t-shirt studies, women have inherited preferences from their mother - which is why they go looking for someone similar to their father
A little over a month ago, I bought a new copy of Pokemon HeartGold. Those of you who know the game will also know that it comes with a little pedometer that gives you small benefits within the game. I figured I had room for it in my pockets, so I’ve been keeping it on me ever since.
One thing that’s interesting is that it seems to break up steps into discrete “trips”, separating them after some unknown period of low activity. It’s a feature I wouldn’t have known I wanted if I were shopping around for a “real” pedometer. While it’s not perfect (there’s some required threshold for generating a “trip” report, like having 15+ minutes of walking), it winds up giving me a lot of really interesting information. Assuming I remember what I did on a given day. But, for example, the first day I had it, I walked to school in the morning and back after class. When I had to stop at a couple of traffic lights on the way to/from campus, it separated the trip into chunks - so I can figure out the relative distances of each part of the trip (from the house to the first major intersection, from there to campus). Well, that assumes I write down the trip numbers at the moment I transfer them to the game cartridge (more on that in a moment). Also, I say relative because I don’t know exactly how long my stride is, and I can’t claim 100% accuracy of its measurements.
When I walked both to and from campus, my totals were in the range of ~12,000-15,000 steps. If I walk in the morning and take the bus in the afternoon, it’s down to ~9,000-12,000. After construction finally finished on a bridge near campus, I was able to cut my travel time hugely by taking the bus halfway and walking from the bridge. This put me down around ~7,000-11,000 steps per day. However, that’s all from my mom’s house - from my dad’s, I’m around 6,000-8,000 most days.
However, the main issue thus far has been that I don’t have access to complete historical data. Data for the last seven days is stored on the device itself, and can be “sent” to the game cartridge for summarizing and getting bonuses. When you send the information to the cartridge, it gives you your trip reports and updates your total steps thus far. But it doesn’t store the individual daily values that are sent to it (since that could theoretically take infinite storage, which it doesn’t have). So this leaves me with the annoying problem of writing down my daily steps just before bed, and that feels like a lot of effort.
Interestingly enough, the Nintendo 3DS includes pedometer functionality, and seems to keep track of historical data (hourly summaries and daily summaries) indefinitely on a calendar. From my own use, it seems to count fewer steps in most cases - but perhaps the Pokemon pedometer is counting too many… I’m inclined towards the former because of the size of the 3DS; I imagine it’s harder for the whole thing to shake enough to count as a step. That, and it’s just not something I can fit in the pocket of my pants, so it’s not a real alternative. I keep it in my backpack instead, but there it counts far fewer steps.
If only the 3DS could connect and sync steps with the Pokemon pedometer…
A few specific things I learned in the first few days:
Anyway, it’s been somewhat interesting. The data would be more interesting if I put in more effort, though. I imagine there are super amazing pedometers that would automate most of the drudgery, but those would cost money. I don’t want to go for a phone-based option, either, because the phone’s built-in sensors just aren’t a good alternative to the simpler solution of a pedometer. Reading reviews for Android step-counting apps, I see people reporting terrible battery drain and a variety of limitations (you have to keep the app in the foreground), and I’m not terribly surprised. But, I guess, without spending some real money on something like this I probably wouldn’t get anything better than what I already have (fits in a pocket, counts steps for a given day).
Ah, well. Perhaps that’s a Christmas present idea.
A couple of weeks ago, I agreed to facilitate a workshop on Python for non-programmers in the cognitive science department at Carleton. It’s been alright so far for the first two sessions - about seven-ish people attending, but with wildly varying skill levels. Specifically, one guy is experienced with C/C++ and several others know almost nothing at all about programming. It’s been hard to engage everyone at once. There’s been a lot of “we can talk about this after” and “this is interesting but probably not important to most of you”… especially from me, because I get excited and hope everyone will follow along anyway.
Anyway! This evening I was trying to find good code examples to show in an online Python interpreter that doubles as a visual debugger. Looking through the available examples, under the advanced Python features section, I saw an example called “for-else” (the title of this post links directly to it). Wait… what is that?
As it turns out, this exists in Python. I found a blog post on the topic that shows a useful application of the technique. An “else” can be attached to either a for loop or a while loop, and the code in the “else” clause only runs if the loop exits normally - in other words, it runs if you don’t break out of the loop. In a way, it’s as though the “else” is attached to all the break statements inside your loop: if none of the break statements are reached, run the else clause. I have a really hard time squaring this behaviour with the word “else” - to me, it seems more like a “finally” clause that can be skipped by break. finally is used for exception handling in Java, but by definition you can’t skip over it - code in a “finally” clause runs no matter what. Well, I think. I’m sure there are loopholes I don’t remember, probably involving finalizers somehow, because they’re a source of much evil.
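To make that concrete, here’s a tiny sketch of my own (not taken from the examples I linked): the else clause fires only when the loop runs to completion without breaking.

```python
def report_first_negative(numbers):
    """Print the first negative number, or a fallback message if there is none."""
    for n in numbers:
        if n < 0:
            print("found {}".format(n))
            break
    else:
        # This branch runs only if the loop finished without hitting `break`,
        # i.e. no negative number was found.
        print("no negative numbers")

report_first_negative([3, 1, -4, 1, 5])  # found -4
report_first_negative([2, 7, 1, 8])      # no negative numbers
```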
I feel like it’s a solution in search of a problem, which is probably why this isn’t a very well known feature. Off the top of my head, the only thing I use break for is when I’m looking for a single thing in a collection and want to do something with it afterwards - and not do anything with the rest of the collection. continue is a different story - I use continue a lot more often, because looking at an item of a collection and doing nothing with it is a lot more common than finding one item and discarding the rest of the collection.
But hey, there you go, some unique semantics (I think?) in a programming language. If it were really such a great idea, it probably wouldn’t be so rare.
[[So, here’s the first of my essays I’m going to post - I wrote this in my first year at Carleton, for an Intro to Applied Linguistics and Discourse Studies class. I know it took a while for this post to appear - I was worried it would take a lot of effort to convert the essay to Markdown. I only remembered this morning that Janna Fox, our professor, told us to use as little formatting as possible - if we wanted to emphasize something, we needed to do it with words, not formatting tricks. It was good advice, I think, though on the internet a bit of italics and bolding has its uses.
As for the essay itself, it was for an assignment along the lines of “write an essay about something we’ve talked about in the last month.” So I wrote about second language learning, and my experience with it. I don’t think I made a particularly good argument for anything, but I think the story is valuable. In that respect, you’re probably going to be annoyed by the references I make to our class material. Still, it’s not terribly long, and I don’t think you need much background knowledge to understand it. I hope it’s an enjoyable read!]]
French immersion holds a strange position in the Canadian education system, especially in anglophone areas like my hometown of Summerside, Prince Edward Island. Most parents who enroll their children in the program work for the federal government or hold some other position where they see the value of being bilingual. The promise of a bilingual position becomes the main motivating force for many French immersion students. Yet many of us found ourselves ignored or derided by actual francophones when we tried to practice our French during trips to Quebec. Sometimes they would speak to me in incomprehensible English or act as though my French made absolutely no sense [[editor’s note: maybe it didn’t]]. After our years spent in the French immersion program, we were being told we did not qualify as “Really French.”
Drawing primarily on Chomsky’s theory of a mental grammar constructed through language use, and on the idea of “Discourse” and “identity kits” developed by Gee, I would like to examine the ‘success’ of the French immersion program based on my personal experience and the experiences of a few close friends. I have considered our experience with French immersion, including our abilities to speak and write in French, and our use of French outside the classroom. It is clear that the French immersion program taught us to comprehend French, but when the time comes to produce our own, we find that we lack knowledge of standard French grammar, and even that francophones stigmatize our ‘dialect’ of French. As Gee (1996) notes, though our grammar is poor and our forms are not ‘correct,’ we can communicate with other French immersion students quite well.
The isolated nature of our French, learned in the same classes, with the same teachers, and used only within those classes, means that by and large every French immersion student from Summerside, PEI, constructed a roughly identical mental grammar. As discussed in class on September 22nd, the basic idea of the active construction of mental grammar theory claims that experience with language allows us to discern its rules and attempt to apply them on our own. In a language-rich environment, with a variety of input, properly learning a language happens quickly and easily. Sadly, the French immersion program, in the areas of Canada devoid of French culture, is anything but linguistically rich. The only source of ‘correct’ French comes from our instructors, and the majority of the experience we get with French comes from other students struggling to learn the language alongside us. With such limited opportunities to truly learn and internalize the standard grammar of the language, no linguist would be surprised that francophones see our French as alarmingly poor.
Despite our severe lack of standard French grammar, anglophone students in the French immersion program understand spoken and written French quite well. Obviously, we do know French, but we have learned to speak a different dialect of French – that of an anglophone French immersion student. Much like the women in the job interviews cited by Gee (1996), our dialect works fine in certain contexts, but in the context of interacting with a francophone, we are stigmatized for not matching the accepted standard. The “Discourse,” or identity, that comes with our spoken French is that of an anglophone failing to learn the ‘correct’ way of speaking and writing French. In the act of “doing being-or-becoming-Really-French,” francophones pass judgment that we are incapable of joining them as Really French. The federal government would accept our French for a bilingual position, but we would struggle to live and work in Quebec as members of a fully francophone society.
Like the case of being a Real Indian discussed by Gee (1996), any francophone could tell from a mile away that my classmates and I are simply not Really French. The curriculum in the French immersion program tried to test us, once and for all, to determine our identity as capable French speakers. Gee (1996) recognized the fallacy of such “identity tests,” yet they pervade the French immersion program. Thanks to a lack of practice, the foundation of our language skills crumbled over the years. In high school, my French instructor marvelled at our poor knowledge of basic concepts, and spent considerable time re-teaching lessons we had received years earlier. When tested a few weeks later, as little as 50% qualified as a passing grade [[editor’s note: as in, students could succeed even if they only learned 50% of the material and received a grade above 50% on the tests]], and our instructor could only hope we might remember something. Much like Swain (1995) found when testing for comprehension of French, the lessons a teacher assumes they have taught are not always what the students learn. A lesson on grammar might boil down to students writing “peux” instead of “peut” all the time and completely forgetting the rest.
As discussed by Gee (2010), express teaching often fails to produce a perfect understanding, and, compared to the tacit experience of first learning a language, the strategy faces many difficulties. Francophones, who learned the complex rules of the language as children, understand its rules and conventions implicitly. “This is French,” they say, “this is how it has to be.” For an anglophone, these rules require memorization and active correction of our French any time we speak or write. When we forget to use any number of these rules, we do not realize that we are expressing something the ‘wrong’ way, because the rules are not yet a part of our basic understanding of the language. Only when they permanently become a part of our mental grammar will we take them as a given and apply them automatically, and supporters of the innateness hypothesis might argue that our critical period ended long ago. Following that theory, our French may never fully develop.
The ‘success’ of the French immersion program, at least in an area with small French populations like Prince Edward Island, depends on how you measure success in learning a language. If success means landing a bilingual position, then the program succeeds beautifully. For a number of reasons, perfect integration into francophone society may be unrealistic, but knowledge of the standard grammar should serve as a realistic measurement. Even in that respect, the program’s success is questionable. Dedicated students can easily continue their education in French and practice their grammar using what the French immersion program taught them, but when your high school diploma comes with a certificate identifying you as fully bilingual, no extra education should be needed.
Fox, J. (2010). Lecture given September 22nd, 2010.
Gee, J.P. (1996). Social linguistics and literacies: ideology in discourses (pp. 122-132). London: The Falmer Press.
Gee, J. P. (2010). Language, Literacy & Learning in a Digital Age. Given January 22nd, 2010. Online at: http://www2.carleton.ca/slals/events/language-literacy-learning-in-a-digital-age/
Swain, M. (1995). Three functions of output in second language learning. In G. Cook & B. Seidlhofer (Eds.), Principle & practice in applied linguistics: studies in honour of H. G. Widdowson (pp. 126-142). Oxford: Oxford University Press.