Posts Tagged: Keynotes

Clyde Rose

The title of my brief address to you is “The Aldergrabber and the Bubble-treader.” If you are puzzled by this title, hang on; in a few minutes I hope to make it clear to you.

It is an awesome feeling standing on this stage gazing into a sea of faces filled with joy, expectation and triumph. For you – and for me – this day, today, will be tomorrow, and tomorrow, and tomorrow’s memory – a proud day and an unforgettable moment in our lives.

I have had a long attachment to this university having come here as a student at the age of 17. At that time the campus was situated on Parade Street in old St. John’s. Memorial was a strange place in those days. All students and faculty wore long, black gowns which made us look like a community of batmen flitting around the campus in our flowing robes. I remember one notable professor who wore an Oxfordian gown which the years had tattered, torn and discoloured with its globs of green and splotches of yellow. It was quite a sight but he wore this aged garment ever so proudly. With his reading glasses suspended across the bridge of his nose and his pipe between his teeth he was an inspiring sight. He spoke with an English upper-class accent – which was totally incomprehensible to me (at the time) – and he was a formidable figure as witnessed by the hundred or more students who sat in the lecture theatre in complete silence as he mumbled through his lectures. I learned to love the man and many years later I became a close friend and colleague.

I want to go back for a moment and give you a little of my background. I was conceived in the winter of ’37 in a log house – our winter home – in the bottom of Bay de Vieux and born the following summer on Fox Island where my family worked in the salt fish industry. In 1942 – while the War was on – father put us all in his fishing dory and we steamed up the shore to make our home in Burgeo. New technology had replaced the salt fish trade with the fresh fish one – the days of fish flakes, splitting, curing and drying fish in the sun were over. Our large kitchen in Burgeo, where we ate and did our homework by the kerosene light, was like a theatre. It was a gathering place for fishermen, lighthouse keepers, sailors and hunters who told their tales, sang their songs, jumped to the floor to dance their jigs and royally entertained us. Many of these men and women had never seen the inside of a classroom and they would often joke about their lack of formal education. “I got as far in the book as where Jack hove out a stick for his dog to fetch but I didn’t wait to see if he brought it back to shore.”

As storytellers and songsters they were brilliant and creative and they used a language rich in metaphor and imagery. They were also very adept at explaining the creative process. I remember one night, following the singing of a beautiful new song, I asked the lighthouse keeper who had created and sung it if he would make me a copy. I passed him my scribbler and pencil to do so. A painful look filled his face. “I wish I could, my b’y, but I can neither read nor write.” Whereupon I said: “If you can’t read or write, how can you make up songs?” To which he replied: “I makes them up in me head – and sings them from me heart.” A more beautiful and apt definition of folklore cannot be found.

My love for the Sou’ West Coast and my upbringing there are best summed up by the poet Robert Hayman, who in 1628 wrote the following poem:

The Aire in Newfound-land is wholesome, good;
The Fire, as sweet as any made of wood;
The Waters, very rich, both salt and fresh;
The Earth more rich, you know it is no lesse.
Where all are good, Fire, Water, Earth, and Aire
What man made of these foure would not live there?

As a young man I came to Memorial speaking a dialect whose roots are found in the West Country of England, but I learned, to my astonishment, that my Sou’ West Coast English was not appropriate or acceptable if I wished to pursue my studies to be a teacher. My quaint speech, such as “I’ll get the dory narder and take ’er into the lund,” might have been acceptable in Shakespeare’s time but was not acceptable at Memorial in my time. I was somewhat disillusioned and dismayed by this confrontation of cultures, but I soon learned of an undertaking by some distinguished professors at Memorial that in time would make a lasting impact on me, personally, and on the culture of Newfoundland and Labrador generally.

That undertaking had to do with our legacy of language. My use of the word ‘lund’ in “get the dory narder and take ’er into the lund” did not go unnoticed by these keen scholars who had taken on the monumental task of creating a dictionary of Newfoundland English. “Lund,” as it turned out – although incomprehensible to some at the time – was, and is, a perfectly respectable word, and like thousands of other beautiful words that came from our outports it was in danger of extinction until the DNE (Dictionary of Newfoundland English) came along.

There is no scholarly work that has done more to enhance, preserve and dignify the culture of this place than the DNE. And full credit must go to this institution which supported these outstanding scholars (Kirwin, Story and Widdowson) and their research during the 25 years it took to create this significant book.

When the DNE was published in 1982 it made national news. Some – foolishly – thought it a joke. Others – with more insight – sought it out for guidance. A foreign doctor practicing in Clarenville called me and said he required copies as soon as possible. A woman, he said, had come to him with a bad back and when he asked her how it happened she said: “Well, Doctor, I sloused to the left as I was gettin’ in the boat and me back went squish.” Both words, “sloused” and “squish,” are found in the DNE – “sloused” meaning to heave from side to side, as water in a boat, while “squish” is defined as “out of alignment.” This woman had given the doctor a clear, precise and accurate description of her ailment in a language that was hers – and is ours.

Kirwin, Story and Widdowson knew the limitations of their work. Our legacy of language is a goldmine which they could barely tap. The great treasure troves of beautiful words and potent expressions still lie there, perhaps forever. My experience with those who were disgruntled with the DNE is best summarized by a conversation I had with a man from Central Newfoundland. He had worked in the logging camps and had been involved in dangerous work on the Badger River. With disappointment on his face he said to me: “I see that ‘aldergrabber’ and ‘bubble-treader’ didn’t make the book.” I said: “No, b’y, I don’t think they did, but I don’t know the words. Perhaps you could explain them to me.”

“Well,” he says, “work on the drive is always dangerous especially when there’s a jam and you have to break the logs loose. You never know when it’s going to happen. But whatever the case it brings out two kinds of men. One is the fellow who is careful, steady and conservative and stays close to the shore so that when the jam breaks he can reach for the alders and pull himself in. We calls him the ‘aldergrabber.’ Now the other kind of man is different altogether. He’s carefree, wild and nimble and out in the middle of the jam, far from the shore with nothing to hold onto. And when she breaks he gets to shore – somehow – by dancing on the bubbles I s’pose. So we calls him the ‘bubble-treader.’ Now shouldn’t they be in the DNE?” I confess, I had to agree.

The old Memorial on Parade Street was unique, intimate and confined. Its focus was clear – academic excellence was its goal. At one time – not so long ago – five of my former classmates were presidents of various universities in Canada. In other fields too MUN graduates have excelled as is evident on this stage today. The new campus that I came back to in the late ‘60s and where I taught in the ‘70s was a sprawling institution. But it was a very invigorating environment as the arts were moving into high gear.

In Memorial’s English Department we had the likes of Ted Russell, Art Scammell, Michael Cook, Al Pittman, Tom Dawe, just to name a few, all of whom have contributed significantly to the literary prowess of our province. We were, in the early ‘70s, on the cusp of the renaissance. In one of my classes was a student by the name of Wayne Johnston; in another, Ron Hynes. Wayne’s novels are translated into various languages while Ron’s songs are as popular in Ireland as they are at home.

“Bliss was it in that dawn
To be alive;
But to be young
Was very heaven.”

The music and literature that had their roots in the renaissance of the ‘70s have grown and blossomed. The arts have given us pride in our place and a keen awareness of our identity as a unique people in Labrador and Newfoundland. As the fellow from up the Southern Shore said to me when I told him I was from the Sou’ West Coast: “A scarce breed, Sir. A scarce breed.”

Blessed as we are with brilliant writers, musicians and artists of many talents, it seems to me that it is the duty of an institution – such as this one – to ensure that our students get the opportunity to read and study the literature from and about this place, and to be exposed to the arts generally, in whatever form they appear.

When I first came to Memorial I noticed its motto: Provehito in altum – Launch into the deep. Being from a seafaring background, I’ve always thought of the university as being like a ship taking its precious cargo of knowledge to the people who are its owners. But the captain and crew – as on any ship – must be free to steer a steady and a safe course. They cannot be directed by the landsmen on the shore, for that would be a foolish and dangerous thing to do. “On such a full sea are we now afloat, and we must take the current when it serves, or lose our ventures.”

And you young graduates, you too must launch into the deep, and whether you follow the philosophy of the “aldergrabber” or the “bubble-treader,” you must always keep in mind this noble institution, this great university, that set you on your voyage – “to strive, to seek, to find, and not to yield.”

Joel Spolsky

This is part one of the text of a talk delivered to the Yale Computer Science department on November 28. The rest of the talk will be published tomorrow and Wednesday.

I graduated with a B.S. in Computer Science in 1991. Sixteen years ago. What I’m going to try to do today is relate my undergraduate years in the CS department to my career, which consists of developing software, writing about software, and starting a software company. And of course that’s a little bit absurd; there’s a famous part at the beginning of MIT’s Introduction to Computer Science where Hal Abelson gets up and explains that Computer Science isn’t about computers and it isn’t a science, so it’s a little bit presumptuous of me to imply that CS is supposed to be training for a career in software development, any more than, say, Media Studies or Cultural Anthropology would be.

I’ll press ahead anyway. One of the most useful classes I took was a course that I dropped after the first lecture. Another one was a class given by Roger Schank that was so disdained by the CS faculty that it was not allowed to count towards a degree in computer science. But I’ll get to that in a minute.

The third was this little gut called CS 322, which you know of as CS 323. Back in my day, CS 322 took so much work that it was a 1½ credit class. And Yale’s rule was that the extra half credit could only be combined with other half credits from the same department. Apparently there were two other 1½ credit courses, but they could only be taken together. So through that clever trickery, the half credit was completely useless, but it did justify those weekly problem sets that took 40 hours to complete. After years of students’ complaining, the course was adjusted to be a 1 credit class, renumbered CS 323, and still had weekly 40-hour problem sets. Other than that, it’s pretty much the same thing. I loved it, because I love programming. The best thing about CS 323 is it teaches a lot of people that they just ain’t never gonna be programmers. This is a good thing. People who don’t have the benefit of Stan teaching them that they can’t be programmers have miserable careers cutting and pasting a lot of Java. By the way, if you took CS 323 and got an A, we have great summer internships at Fog Creek. See me afterwards.

As far as I can tell, the core curriculum hasn’t changed at all. 201, 223, 240, 323, 365, 421, 422, 424, 429 appear to be almost the same courses we took 16 years ago. The number of CS majors is actually up since I went to Yale, although a temporary peak during the dotcom days makes it look like it’s down. And there are a lot more interesting electives now than there were in my time. So: progress.

For a moment there, I actually thought I’d get a PhD. Both my parents are professors. So many of their friends were academics that I grew up assuming that all adults eventually got PhDs. In any case, I was thinking pretty seriously of going on to graduate school in Computer Science. Until I tried to take a class in Dynamic Logic right here in this very department. It was taught by Lenore Zuck, who is now at UIC.

I didn’t last very long, nor did I understand much of anything that was going on. From what I gather, Dynamic Logic is just like formal logic: Socrates is a man, all men are mortal, therefore Socrates is mortal. The difference is that in Dynamic Logic truth values can change over time. Socrates was a man, now he’s a cat, etc. In theory this should be an interesting way to prove things about computer programs, in which state, i.e., truth values, change over time.

In the first lecture Dr. Zuck presented a few axioms and some transformation rules and set about trying to prove a very simple thing. She had a one-line computer program, “f := not f” (f is a Boolean), that simply flipped a bit, and the goal was to prove that if you ran this program an even number of times, f would finish with the same value as it started out with.

The proof went on and on. It was in this very room, if I remember correctly, it looks like the carpet hasn’t been changed since then, and all of these blackboards were completely covered in the steps of the proof. Dr. Zuck used proof by induction, proof by reductio ad absurdum, proof by exhaustion—the class was late in the day and we were already running forty minutes over—and, in desperation, proof by graduate student, whereby, she says, “I can’t really remember how to prove this step,” and a graduate student in the front row says, “yes, yes, professor, that’s right.”

And when all was said and done, she got to the end of the proof, and somehow was getting exactly the opposite result of the one that made sense, until that same graduate student pointed out where, 63 steps earlier, some bit had been accidentally flipped due to a little bit of dirt on the board, and all was well.

For our homework, she told us to prove the converse: that if you run the program “f := not f” n times, and the bit is in the same state as it started, that n must be even.
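Both directions of the claim are, to a programmer, almost embarrassingly easy to check by machine. Here is a minimal Python sketch, offered only as a brute-force check, not as the kind of proof the course wanted:

    # "f := not f" run n times returns f to its starting value
    # exactly when n is even.
    def run(f, n):
        for _ in range(n):
            f = not f
        return f

    for f0 in (True, False):
        for n in range(100):
            assert (run(f0, n) == f0) == (n % 2 == 0)

The underlying induction is two lines: one flip inverts f, and two flips restore it. The formal system needed blackboards to say the same thing.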

I worked on that problem for hours and hours. I had her original proof in front of me, going in one direction, which, upon closer examination, turned out to have all kinds of missing steps that were “trivial,” but not to me. I read every word about Dynamic Logic that I could find in Becton, and I struggled with the problem late into the night. I was getting absolutely nowhere, and increasingly despairing of theoretical computer science. It occurred to me that when you have a proof that goes on for pages and pages, the errors are far more likely to be in the proof than in our own intuition about the trivial statements it’s trying to prove, and I decided that this Dynamic Logic stuff was really not a fruitful way of proving things about actual, interesting computer programs, because you’re more likely to make a mistake in the proof than you are to make a mistake in your own intuition about what the program “f := not f” is going to do. So I dropped the course, thank God for shopping period, but not only that, I decided on the spot that graduate school in Computer Science was just not for me, which made this the single most useful course I ever took.

Now this brings me to one of the important themes that I’ve learned in my career. Time and time again, you’ll see programmers redefining problems so that they can be solved algorithmically. By redefining the problem, it often happens that they’re left with something that can be solved, but which is actually a trivial problem. They don’t solve the real problem, because that’s intractable. I’ll give you an example.

You will frequently hear the claim that software engineering is facing a quality crisis of some sort. I don’t happen to agree with that claim—the computer software most people use most of the time is of ridiculously high quality compared to everything else in their lives—but that’s beside the point. This claim about the “quality crisis” leads to a lot of proposals and research about making higher quality software. And at this point, the world divides into the geeks and the suits.

The geeks want to solve the problem automatically, using software. They propose things like unit tests, test driven development, automated testing, dynamic logic and other ways to “prove” that a program is bug-free.

The suits aren’t really aware of the problem. They couldn’t care less if the software is buggy, as long as people are buying it.

Currently, in the battle between the geeks and the suits, the suits are winning, because they control the budget, and honestly, I don’t know if that’s such a bad thing. The suits recognize that there are diminishing returns to fixing bugs. Once the software hits a certain level of quality that allows it to solve someone’s problem, that person will pay for it and derive benefit out of it.

The suits also have a broader definition of “quality.” Their definition is about as mercenary as you can imagine: the quality of software is defined by how much it increases my bonus this year. Accidentally, this definition of quality incorporates a lot more than just making the software bug-free. For example, it places a lot of value on adding more features to solve more problems for more people, which the geeks tend to deride by calling it “bloatware.” It places value on aesthetics: a cool-looking program sells more copies than an ugly program. It places value on how happy a program makes its users feel. Fundamentally, it lets the users define their own concept of quality, and decide on their own if a given program meets their needs.

Now, the geeks are interested in the narrowly technical aspects of quality. They focus on things they can see in the code, rather than waiting for the users to judge. They’re programmers, so they try to automate everything in their life, and of course they try to automate the QA process. This is how you get unit testing, which is not a bad thing, don’t get me wrong, and it’s how you get all these attempts to mechanically “prove” that a program is “correct.” The trouble is that anything that can’t be automated has to be thrown out of the definition of quality. Even though we know that users prefer software that looks cooler, there’s no automated way to measure how cool looking a program is, so that gets left out of the automated QA process.

In fact what you’ll see is that the hard-core geeks tend to give up on all kinds of useful measures of quality, and basically they get left with the only one they can prove mechanically, which is, does the program behave according to specification. And so we get a very narrow, geeky definition of quality: how closely does the program correspond to the spec. Does it produce the defined outputs given the defined inputs.

The problem, here, is very fundamental. In order to mechanically prove that a program corresponds to some spec, the spec itself needs to be extremely detailed. In fact the spec has to define everything about the program, otherwise, nothing can be proven automatically and mechanically. Now, if the spec does define everything about how the program is going to behave, then, lo and behold, it contains all the information necessary to generate the program! And now certain geeks go off to a very dark place where they start thinking about automatically compiling specs into programs, and they start to think that they’ve just invented a way to program computers without programming.

Now, this is the software engineering equivalent of a perpetual motion machine. It’s one of those things that crackpots keep trying to do, no matter how much you tell them it could never work. If the spec defines precisely what a program will do, with enough detail that it can be used to generate the program itself, this just raises the question: how do you write the spec? Such a complete spec is just as hard to write as the underlying computer program, because just as many details have to be answered by the spec writer as by the programmer. To use terminology from information theory: the spec needs just as many bits of Shannon entropy as the computer program itself would have. Each bit of entropy is a decision taken by the spec-writer or the programmer.
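You can state that with one line of mathematics, under the assumption that the finished program P is generated mechanically from the spec S, that is, P = g(S) for some deterministic generator g. A deterministic function can never create entropy, hence

    H(P) = H(g(S)) ≤ H(S)

so any spec complete enough to generate the program must already contain at least as many bits of decisions as the program itself.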

So, the bottom line is that if there really were a mechanical way to prove things about the correctness of a program, all you’d be able to prove is whether that program is identical to some other program that must contain the same amount of entropy as the first program, otherwise some of the behaviors are going to be undefined, and thus unproven. So now the spec writing is just as hard as writing a program, and all you’ve done is moved one problem from over here to over there, and accomplished nothing whatsoever.

This seems like a kind of brutal example, but nonetheless, this search for the holy grail of program quality is leading a lot of people to a lot of dead ends. The Windows Vista team at Microsoft is a case in point. Apparently—and this is all based on blog rumors and innuendo—Microsoft has had a long term policy of eliminating all software testers who don’t know how to write code, replacing them with what they call SDETs, Software Development Engineers in Test, programmers who write automated testing scripts.

The old testers at Microsoft checked lots of things: they checked if fonts were consistent and legible, they checked that the location of controls on dialog boxes was reasonable and neatly aligned, they checked whether the screen flickered when you did things, they looked at how the UI flowed, they considered how easy the software was to use, how consistent the wording was, they worried about performance, they checked the spelling and grammar of all the error messages, and they spent a lot of time making sure that the user interface was consistent from one part of the product to another, because a consistent user interface is easier to use than an inconsistent one.

None of those things could be checked by automated scripts. And so one result of the new emphasis on automated testing was that the Vista release of Windows was extremely inconsistent and unpolished. Lots of obvious problems got through in the final product… none of which was a “bug” by the definition of the automated scripts, but every one of which contributed to the general feeling that Vista was a downgrade from XP. The geeky definition of quality won out over the suits’ definition; I’m sure the automated scripts for Windows Vista are running at 100% success right now at Microsoft, but it doesn’t help when just about every tech reviewer is advising people to stick with XP for as long as humanly possible. It turns out that nobody wrote the automated test to check if Vista provided users with a compelling reason to upgrade from XP.

I don’t hate Microsoft, really I don’t. In fact, my first job out of school was actually at Microsoft. In those days it was not really a respectable place to work. Sort of like taking a job in the circus. People looked at you funny. Really? Microsoft? On campus, in particular, it was perceived as corporate, boring, buttoned-down, making inferior software so that accountants can do, oh I don’t know, spreadsheets or whatever it is that accountants do. Perfectly miserable. And it all ran on a pathetic single-tasking operating system called MS-DOS full of arbitrary stupid limitations like 8-character file names and no email and no telnet and no Usenet. Well, MS-DOS is long gone, but the cultural gap between the Unixheads and the Windows users has never been wider. This is a culture war. The disagreements are very byzantine but very fundamental. To Yale, Microsoft was this place that made toy business operating systems using three-decades-old computer science. To Microsoft, “computer sciency” was a bad word used to make fun of new hires with their bizarre hypotheses about how Haskell is the next major programming language.

Just to give you one tiny example of the Unix-Windows cultural war. Unix has this cultural value of separating user interface from functionality. A righteous Unix program starts out with a command-line interface, and if you’re lucky, someone else will come along and write a pretty front end for it, with shading and transparency and 3D effects, and this pretty front end just launches the command-line interface in the background, which then fails in mysterious ways, which are then not reflected properly in the pretty front end which is now hung waiting for some input that it’s never going to get.

But the good news is that you can use the command line interface from a script.

Whereas the Windows culture would be to write a GUI app in the first place, and all the core functionality would be tangled up hopelessly with the user interface code, so you could have this gigantic application like Photoshop that’s absolutely brilliant for editing photos, but if you’re a programmer, and you want to use Photoshop to resize a folder of 1000 pictures so that each one fits in a 200 pixel box, you just can’t write that code, because it’s all very tightly bound to a particular user interface.
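And the batch-resize job that Photoshop can’t give you is a few lines in the scripting culture. Here is a sketch in Python with the Pillow imaging library, one tool among many, with invented folder names:

    from pathlib import Path
    from PIL import Image  # the Pillow imaging library

    src = Path("photos")        # hypothetical input folder
    dst = Path("photos_small")  # hypothetical output folder
    dst.mkdir(exist_ok=True)

    for p in src.glob("*.jpg"):
        img = Image.open(p)
        img.thumbnail((200, 200))  # fit in a 200 pixel box, keeping aspect ratio
        img.save(dst / p.name)

That, in miniature, is what separating functionality from the user interface buys you.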

Anyway, the two cultures roughly correspond to highbrow vs. lowbrow, and in fact, it’s reflected accurately in the curriculum of computer science departments throughout the country. At Ivy League institutions, everything is Unix, functional programming, and theoretical stuff about state machines. As you move down the chain to less and less selective schools, Java starts to appear. Move even lower and you literally start to see classes in topics like Microsoft Visual Studio 2005 101, three credits. By the time you get to the two-year institutions, you see the same kind of SQL-Server-in-21-days “certification” courses you see advertised on the weekends on cable TV. Isn’t it time to start your career in (different voice) Java Enterprise Beans!

Part 2

After a few years in Redmond, Washington, during which I completely failed to adapt to my environment, I beat a hasty retreat to New York City. I stayed on with Microsoft in New York for a few months, where I was a complete and utter failure as a consultant at Microsoft Consulting, and then I spent a few years in the mid-90s, when the Internet was first starting to happen, at Viacom. That’s this big corporate conglomerate which owned MTV, VH1, Nickelodeon, Blockbuster, Paramount Studios, Comedy Central, CBS, and a bunch of other entertainment companies. New York was the first place I got to see what most computer programmers do for a living. It’s this scary thing called “in-house software.” It’s terrifying. You never want to do in-house software. You’re a programmer for a big corporation that makes, oh, I don’t know, aluminum cans, and there’s nothing quite available off the shelf which does the exact kind of aluminum can processing that they need, so they have these in-house programmers, or they hire companies like Accenture and IBM to send them overpriced programmers, to write this software. And there are two reasons this is so frightening: one, because it’s not a very fulfilling career if you’re a programmer, for a list of reasons which I’ll enumerate in a moment, but two, it’s frightening because this is what probably 80% of programming jobs are like, and if you’re not very, very careful when you graduate, you might find yourself working on in-house software, by accident, and let me tell you, it can drain the life out of you.

OK, so, why does it suck to be an in-house programmer?

Number one. You never get to do things the right way. You always have to do things the expedient way. It costs so much money to hire these programmers—typically a company like Accenture or IBM would charge $300 an hour for the services of some recent Yale PoliSci grad who took a 6 week course in dot net programming, and who is earning $47,000 a year and hoping that it’ll provide enough experience to get into business school—anyway, it costs so much to hire these programmers that you’re not going to be allowed to build things with Ruby on Rails no matter how cool Ruby is and no matter how spiffy the Ajax is going to be. You’re going into Visual Studio, you’re going to click on the wizard, you’re going to drag the little Grid control onto the page, you’re going to hook it up to the database, and presto, you’re done. It’s good enough. Get out of there and onto the next thing. That’s the second reason these jobs suck: as soon as your program gets good enough, you have to stop working on it. Once the core functionality is there, the main problem is solved, there is absolutely no return on investment, no business reason to make the software any better. So all of these in-house programs look like a dog’s breakfast: because it’s just not worth a penny to make them look nice. Forget any pride in workmanship or craftsmanship you learned in CS 323. You’re going to churn out embarrassing junk, and then you’re going to rush off to patch up last year’s embarrassing junk, which is starting to break down because it wasn’t done right in the first place. Twenty-seven years of that and you get a gold watch. Oh, and they don’t give gold watches any more. Twenty-seven years and you get carpal tunnel syndrome. Now, at a product company, for example, if you’re a software developer working on a software product or even an online product like Google or Facebook, the better you make the product, the better it sells. The key point about in-house development is that once it’s “good enough,” you stop. When you’re working on products, you can keep refining and polishing and refactoring and improving, and if you work for Facebook, you can spend a whole month optimizing the Ajax name-choosing gizmo so that it’s really fast and really cool, and all that effort is worthwhile because it makes your product better than the competition. So, the number two reason product work is better than in-house work is that you get to make beautiful things.

Number three: when you’re a programmer at a software company, the work you’re doing is directly related to the way the company makes money. That means, for one thing, that management cares about you. It means you get the best benefits and the nicest offices and the best chances for promotion. A programmer is never going to rise to become CEO of Viacom, but you might well rise to become CEO of a tech company.

Anyway. After Microsoft I took a job at Viacom, because I wanted to learn something about the internet and Microsoft was willfully ignoring it in those days. But at Viacom, I was just an in-house programmer, several layers removed from anybody who did anything that made Viacom money in any way.

And I could tell that no matter how critical it was for Viacom to get this internet thing right, when it came time to assign people to desks, the in-house programmers were stuck with 3 people per cubicle in a dark part of the office with no line-of-sight to a window, and the “producers,” I don’t know what they did exactly but they were sort of the equivalent of Turtle on Entourage, the producers had their own big windowed offices overlooking the Hudson River. Once at a Viacom Christmas party I was introduced to the executive in charge of interactive strategy or something. A very lofty position. He said something vague and inept about how interactivity was very important. It was the future. It convinced me that he had no flipping idea whatsoever what it was that was happening and what the internet meant or what I did as a programmer, and he was a little bit scared of it all, but who cares, because he’s making 2 million dollars a year and I’m just a typist or “HTML operator” or whatever it is that I did, how hard can it be, his teenage daughter can do that.

So I moved across the street to Juno Online Services. This was an early internet provider that gave people free dial-up accounts that could only be used for email. It wasn’t like Hotmail or Gmail, which didn’t exist yet; with Juno you didn’t need internet access to begin with, so it was really free.

Juno was, allegedly, supported by advertising. It turned out that advertising to the kinds of people who won’t pay $20 a month for AOL is not exactly the most lucrative business in the world, so in reality, Juno was supported by rich investors. But at least Juno was a product company where programmers were held in high regard, and I felt good about their mission to provide email to everyone. And indeed I worked there happily for about three years as a C++ programmer. Eventually, though, I started to discover that the management philosophy at Juno was old-fashioned. The assumption there was that managers exist to tell people what to do. This is quite upside-down from the way management worked in typical west-coast high tech companies. What I was used to from the west coast was an attitude that management is just an annoying, mundane chore someone has to do so that the smart people can get their work done. Think of an academic department at a university, where being the chairperson of the department is actually something of a burden that nobody really wants to do; they’d much rather be doing research. That’s the Silicon Valley style of management. Managers exist to get furniture out of the way so the real talent can do brilliant work.

Juno was founded by very young, very inexperienced people—the president of the company was 24 years old and it was his first job, not just his first management job—and somewhere in a book or a movie or a TV show he had gotten the idea that managers exist to DECIDE.

If there’s one thing I know, it’s that managers have the least information about every technical issue, and they are the last people who should be deciding anything. When I was at Microsoft, Mike Maples, the head of the applications division, used to have people come to him to resolve some technical debate they were having. And he would juggle some bowling pins, tell a joke, and tell them to get the hell out of his office and solve their own damned problems instead of coming to him, the least qualified person to make a technical decision on its merits. That was, I thought, the only way to manage smart, highly qualified people. But the Juno managers, like George Bush, were the deciders, and there were too many decisions to be made, so they practiced something I started calling hit-and-run micromanagement: they dive in from nowhere, micromanage some tiny little issue, like how dates should be entered in a dialog box, overriding the opinions of all the highly qualified technical people on the team who have been working on that problem for weeks, and then they disappear, so that’s the hit-and-run part, because there’s some other little brush fire elsewhere that needs micromanagement.

So, I quit, without a real plan.

Part 3

I despaired of finding a company to work for where programmers were treated like talent and not like typists, and decided I would have to start my own. In those days, I was seeing lots of really dumb people with really dumb business plans making internet companies, and I thought, hey, if I can be, say, 10% less dumb than them, that should be easy, maybe I can make a company too, and in my company, we’d do things right for a change. We’d treat programmers with respect, we’d make high quality products, we wouldn’t take any shit from VCs or 24-year-olds playing President, we’d care about our customers and solve their problems when they called, instead of blaming everything on Microsoft, and we’d let our customers decide whether or not to pay us. At Fog Creek we’ll give anyone their money back with no questions asked under any circumstances whatsoever. Keeps us honest.

So, it was the summer of 2000, and I had taken some time off from work while I hatched the plans for Fog Creek Software and went to the beach a lot. During that period I started writing up some of the things I had learned over the course of my career on a website called Joel on Software. In those early days before blogs were invented, a programmer named Dave Winer had set up a system called EditThisPage.com where anyone could post things to the web in a sort-of blog like format. Joel on Software grew quickly and gave me a pulpit where I could write about software development and actually get some people to pay attention to what I was saying. The site consists of fairly unoriginal thoughts, combined with jokes. It was successful because I used a slightly larger font than the average website, making it easy to read. It’s always hard to figure out how many people read the site, especially when you don’t bother counting them, but typical articles on that site get read by somewhere between 100,000 and a million people, depending on how popular the topic is.

What I do on Joel on Software—writing articles about somewhat technical topics—is something I learned here in the CS department, too. Here’s the story behind that. In 1989 Yale was pretty good at AI, and one of the big name professors, Roger Schank, came and gave a little talk at Hillel about some of his AI theories about scripts and schemas and slots and all that kind of stuff. Now essentially, I suspect from reading his work that it was the same speech he’d been giving for twenty years, and he had spent twenty years of his career writing little programs using these theories, presumably to test them, and they didn’t work, but somehow the theories never got discarded. He did seem like a brilliant man, and I wanted to take a course with him, but he was well known for hating undergraduates, so the only option was to take this course called Algorithmic Thinking—CS115—basically, a watered-down gut group IV class designed for poets. It was technically in the CS department, but the faculty was so completely unimpressed that you were not allowed to count it towards a CS major. Although it was the largest class by enrollment in the CS department, I cringed every time I heard my history-major friends referring to the class as “computer science.” A typical assignment was to write an essay on whether machines can think or not. You can see why we weren’t allowed to count it towards a CS degree. In fact, I would not be entirely surprised if you revoke my degree today, retroactively, upon learning that I took this class.

The best thing about Algorithmic Thinking was that you had to write a lot. There were 13 papers—one every week. You didn’t get grades. Well, you did. Well, ok, there’s a story there. One of the reasons Schank hated undergrads so much was that they were obsessed with grades. He wanted to talk about whether computers could think and all undergrads wanted to talk about was why their paper got a B instead of an A. At the beginning of the term, he made a big speech about how grades are evil, and decided that the only grade you could get on a paper was a little check mark to signify that some grad student read it. Over time, he wanted to recognize the really good papers, so he added check-PLUS, and then there were some really lame papers, so he started giving out check-minuses, and I think I got a check-plus-plus once. But grades: never.

And despite the fact that CS115 didn’t count towards the major, all this experience writing about slightly technical topics turned out to be the most useful thing I got out of the CS department. Being able to write clearly on technical topics is the difference between being a grunt individual contributor programmer and being a leader. My first job at Microsoft was as a program manager on the Excel team, writing the technical specification for this huge programming system called Visual Basic for Applications. This document was something like 500 pages long, and every morning literally hundreds of people came into work and read my spec to figure out what to do next. That included programmers, testers, marketing people, documentation writers, and localizers around the world. I noticed that the really good program managers at Microsoft were the ones who could write really well. Microsoft flipped its corporate strategy 180 degrees based on a single compelling email that Steve Sinofsky wrote called Cornell is Wired. The people who get to decide the terms of the debate are the ones who can write. The C programming language took over because The C Programming Language was such a great book.

So anyway, those were the highlights of CS. CS 115, in which I learned to write, one lecture in Dynamic Logic, in which I learned not to go to graduate school, and CS 322, in which I learned the rites and rituals of the Unix church and had a good time writing a lot of code. The main thing you don’t learn with a CS degree is how to develop software, although you will probably build up certain muscles in your brain that may help you later if you decide that developing software is what you want to do. The other thing you can do, if you want to learn how to develop software, is send your resume to jobs@fogcreek.com, and apply for a summer internship, and we’ll teach you a thing or two about the subject.

Thank you very much for your time.

Steve Jobs

I am honored to be with you today at your commencement from one of the finest universities in the world. I never graduated from college. Truth be told, this is the closest I’ve ever gotten to a college graduation. Today I want to tell you three stories from my life. That’s it. No big deal. Just three stories.

The first story is about connecting the dots.

I dropped out of Reed College after the first 6 months, but then stayed around as a drop-in for another 18 months or so before I really quit. So why did I drop out?

It started before I was born. My biological mother was a young, unwed college graduate student, and she decided to put me up for adoption. She felt very strongly that I should be adopted by college graduates, so everything was all set for me to be adopted at birth by a lawyer and his wife. Except that when I popped out they decided at the last minute that they really wanted a girl. So my parents, who were on a waiting list, got a call in the middle of the night asking: “We have an unexpected baby boy; do you want him?” They said: “Of course.” My biological mother later found out that my mother had never graduated from college and that my father had never graduated from high school. She refused to sign the final adoption papers. She only relented a few months later when my parents promised that I would someday go to college.

And 17 years later I did go to college. But I naively chose a college that was almost as expensive as Stanford, and all of my working-class parents’ savings were being spent on my college tuition. After six months, I couldn’t see the value in it. I had no idea what I wanted to do with my life and no idea how college was going to help me figure it out. And here I was spending all of the money my parents had saved their entire life. So I decided to drop out and trust that it would all work out OK. It was pretty scary at the time, but looking back it was one of the best decisions I ever made. The minute I dropped out I could stop taking the required classes that didn’t interest me, and begin dropping in on the ones that looked interesting.

It wasn’t all romantic. I didn’t have a dorm room, so I slept on the floor in friends’ rooms, I returned coke bottles for the 5¢ deposits to buy food with, and I would walk the 7 miles across town every Sunday night to get one good meal a week at the Hare Krishna temple. I loved it. And much of what I stumbled into by following my curiosity and intuition turned out to be priceless later on. Let me give you one example:

Reed College at that time offered perhaps the best calligraphy instruction in the country. Throughout the campus every poster, every label on every drawer, was beautifully hand calligraphed. Because I had dropped out and didn’t have to take the normal classes, I decided to take a calligraphy class to learn how to do this. I learned about serif and sans serif typefaces, about varying the amount of space between different letter combinations, about what makes great typography great. It was beautiful, historical, artistically subtle in a way that science can’t capture, and I found it fascinating.

None of this had even a hope of any practical application in my life. But ten years later, when we were designing the first Macintosh computer, it all came back to me. And we designed it all into the Mac. It was the first computer with beautiful typography. If I had never dropped in on that single course in college, the Mac would have never had multiple typefaces or proportionally spaced fonts. And since Windows just copied the Mac, it’s likely that no personal computer would have them. If I had never dropped out, I would have never dropped in on this calligraphy class, and personal computers might not have the wonderful typography that they do. Of course it was impossible to connect the dots looking forward when I was in college. But it was very, very clear looking backwards ten years later.

Again, you can’t connect the dots looking forward; you can only connect them looking backwards. So you have to trust that the dots will somehow connect in your future. You have to trust in something — your gut, destiny, life, karma, whatever. This approach has never let me down, and it has made all the difference in my life.

My second story is about love and loss.

I was lucky — I found what I loved to do early in life. Woz and I started Apple in my parents’ garage when I was 20. We worked hard, and in 10 years Apple had grown from just the two of us in a garage into a $2 billion company with over 4000 employees. We had just released our finest creation — the Macintosh — a year earlier, and I had just turned 30. And then I got fired. How can you get fired from a company you started? Well, as Apple grew we hired someone who I thought was very talented to run the company with me, and for the first year or so things went well. But then our visions of the future began to diverge and eventually we had a falling out. When we did, our Board of Directors sided with him. So at 30 I was out. And very publicly out. What had been the focus of my entire adult life was gone, and it was devastating.

I really didn’t know what to do for a few months. I felt that I had let the previous generation of entrepreneurs down – that I had dropped the baton as it was being passed to me. I met with David Packard and Bob Noyce and tried to apologize for screwing up so badly. I was a very public failure, and I even thought about running away from the valley. But something slowly began to dawn on me — I still loved what I did. The turn of events at Apple had not changed that one bit. I had been rejected, but I was still in love. And so I decided to start over.

I didn’t see it then, but it turned out that getting fired from Apple was the best thing that could have ever happened to me. The heaviness of being successful was replaced by the lightness of being a beginner again, less sure about everything. It freed me to enter one of the most creative periods of my life.

During the next five years, I started a company named NeXT, another company named Pixar, and fell in love with an amazing woman who would become my wife. Pixar went on to create the world’s first computer-animated feature film, Toy Story, and is now the most successful animation studio in the world. In a remarkable turn of events, Apple bought NeXT, I returned to Apple, and the technology we developed at NeXT is at the heart of Apple’s current renaissance. And Laurene and I have a wonderful family together.

I’m pretty sure none of this would have happened if I hadn’t been fired from Apple. It was awful tasting medicine, but I guess the patient needed it. Sometimes life hits you in the head with a brick. Don’t lose faith. I’m convinced that the only thing that kept me going was that I loved what I did. You’ve got to find what you love. And that is as true for your work as it is for your lovers. Your work is going to fill a large part of your life, and the only way to be truly satisfied is to do what you believe is great work. And the only way to do great work is to love what you do. If you haven’t found it yet, keep looking. Don’t settle. As with all matters of the heart, you’ll know when you find it. And, like any great relationship, it just gets better and better as the years roll on. So keep looking until you find it. Don’t settle.

My third story is about death.

When I was 17, I read a quote that went something like: “If you live each day as if it was your last, someday you’ll most certainly be right.” It made an impression on me, and since then, for the past 33 years, I have looked in the mirror every morning and asked myself: “If today were the last day of my life, would I want to do what I am about to do today?” And whenever the answer has been “No” for too many days in a row, I know I need to change something.

Remembering that I’ll be dead soon is the most important tool I’ve ever encountered to help me make the big choices in life. Because almost everything — all external expectations, all pride, all fear of embarrassment or failure – these things just fall away in the face of death, leaving only what is truly important. Remembering that you are going to die is the best way I know to avoid the trap of thinking you have something to lose. You are already naked. There is no reason not to follow your heart.

About a year ago I was diagnosed with cancer. I had a scan at 7:30 in the morning, and it clearly showed a tumor on my pancreas. I didn’t even know what a pancreas was. The doctors told me this was almost certainly a type of cancer that is incurable, and that I should expect to live no longer than three to six months. My doctor advised me to go home and get my affairs in order, which is doctor’s code for prepare to die. It means to try to tell your kids everything you thought you’d have the next 10 years to tell them in just a few months. It means to make sure everything is buttoned up so that it will be as easy as possible for your family. It means to say your goodbyes.

I lived with that diagnosis all day. Later that evening I had a biopsy, where they stuck an endoscope down my throat, through my stomach and into my intestines, put a needle into my pancreas and got a few cells from the tumor. I was sedated, but my wife, who was there, told me that when they viewed the cells under a microscope the doctors started crying because it turned out to be a very rare form of pancreatic cancer that is curable with surgery. I had the surgery and I’m fine now.

This was the closest I’ve been to facing death, and I hope it’s the closest I get for a few more decades. Having lived through it, I can now say this to you with a bit more certainty than when death was a useful but purely intellectual concept:

No one wants to die. Even people who want to go to heaven don’t want to die to get there. And yet death is the destination we all share. No one has ever escaped it. And that is as it should be, because Death is very likely the single best invention of Life. It is Life’s change agent. It clears out the old to make way for the new. Right now the new is you, but someday not too long from now, you will gradually become the old and be cleared away. Sorry to be so dramatic, but it is quite true.

Your time is limited, so don’t waste it living someone else’s life. Don’t be trapped by dogma — which is living with the results of other people’s thinking. Don’t let the noise of others’ opinions drown out your own inner voice. And most important, have the courage to follow your heart and intuition. They somehow already know what you truly want to become. Everything else is secondary.

When I was young, there was an amazing publication called The Whole Earth Catalog, which was one of the bibles of my generation. It was created by a fellow named Stewart Brand not far from here in Menlo Park, and he brought it to life with his poetic touch. This was in the late 1960s, before personal computers and desktop publishing, so it was all made with typewriters, scissors, and Polaroid cameras. It was sort of like Google in paperback form, 35 years before Google came along: it was idealistic, and overflowing with neat tools and great notions.

Stewart and his team put out several issues of The Whole Earth Catalog, and then when it had run its course, they put out a final issue. It was the mid-1970s, and I was your age. On the back cover of their final issue was a photograph of an early morning country road, the kind you might find yourself hitchhiking on if you were so adventurous. Beneath it were the words: “Stay Hungry. Stay Foolish.” It was their farewell message as they signed off. Stay Hungry. Stay Foolish. And I have always wished that for myself. And now, as you graduate to begin anew, I wish that for you.

Stay Hungry. Stay Foolish.

Thank you all very much.

Conan O’Brien

I’d like to thank the Class Marshals for inviting me here today. The last time I was invited to Harvard it cost me $110,000, so you’ll forgive me if I’m a bit suspicious. I’d like to announce up front that I have one goal this afternoon: to be half as funny as tomorrow’s Commencement Speaker, Moral Philosopher and Economist, Amartya Sen. Must get more laughs than seminal wage/price theoretician.

Students of the Harvard Class of 2000, fifteen years ago I sat where you sit now and I thought exactly what you are now thinking: What’s going to happen to me? Will I find my place in the world? Am I really graduating a virgin? I still have 24 hours and my roommate’s Mom is hot. I swear she was checking me out. Being here today is very special for me. I miss this place. I especially miss Harvard Square – it’s so unique. Nowhere else in the world will you find a man with a turban wearing a Red Sox jacket and working in a lesbian bookstore. Hey, I’m just glad my dad’s working.

It’s particularly sweet for me to be here today because when I graduated, I wanted very badly to be a Class Day Speaker. Unfortunately, my speech was rejected. So, if you’ll indulge me, I’d like to read a portion of that speech from fifteen years ago: “Fellow students, as we sit here today listening to that classic a-ha tune which will definitely stand the test of time, I would like to make several predictions about what the future will hold. I believe that one day a simple Governor from a small Southern state will rise to the highest office in the land. He will lack political skill, but will lead on the sheer strength of his moral authority. I believe that Justice will prevail and, one day, the Berlin Wall will crumble, uniting East and West Berlin forever under Communist rule. I believe that one day, a high speed network of interconnected computers will spring up world-wide, so enriching people that they will lose their interest in idle chit chat and pornography. And finally, I believe that one day I will have a television show on a major network, seen by millions of people a night, which I will use to re-enact crimes and help catch at-large criminals.” And then there’s some stuff about the death of Wall Street which I don’t think we need to get into….

The point is that, although you see me as a celebrity, a member of the cultural elite, a kind of demigod, I was actually a student here once much like you. I came here in the fall of 1981 and lived in Holworthy. I was, without exaggeration, the ugliest picture in the Freshman Face Book. When Harvard asked me for a picture the previous summer, I thought it was just for their records, so I literally jogged in the August heat to a passport photo office and sat for a morgue photo. To make matters worse, when the Face Book came out they put my picture next to Catherine Oxenberg, a stunning blonde actress who was accepted to the class of ’85 but decided to defer admission so she could join the cast of “Dynasty.” My photo would have looked bad on any page, but next to Catherine Oxenberg, I looked like a mackerel that had been in a car accident. You see, in those days I was six feet four inches tall and I weighed 150 pounds. Recently, I had some structural engineers run those numbers into a computer model and, according to the computer, I collapsed in 1987, killing hundreds in Taiwan.

After freshman year I moved to Mather House. Mather House, incidentally, was designed by the same firm that built Hitler’s bunker. In fact, if Hitler had conducted the war from Mather House, he’d have shot himself a year earlier. 1985 seems like a long time ago now. When I had my Class Day, you students would have been seven years old. Seven years old. Do you know what that means? Back then I could have beaten any of you in a fight. And I mean bad. It would be no contest. If anyone here has a time machine, seriously, let’s get it on, I will whip your seven-year-old butt. When I was here, they sold diapers at the Coop that said “Harvard Class of 2000.” At the time, it was kind of a joke, but now I realize you wore those diapers. How embarrassing for you. A lot has happened in fifteen years. When you think about it, we come from completely different worlds. When I graduated, we watched movies starring Tom Cruise and listened to music by Madonna. I come from a time when we huddled around our TV sets and watched “The Cosby Show” on NBC, never imagining that there would one day be a show called “Cosby” on CBS. In 1985 we drove cars with driver’s side airbags, but if you told us that one day there’d be passenger side airbags, we’d have burned you for witchcraft.

But of course, I think there is some common ground between us. I remember well the great uncertainty of this day. Many of you are justifiably nervous about leaving the safe, comfortable world of Harvard Yard and hurling yourselves headlong into the cold, harsh world of Harvard Grad School, a plum job at your father’s firm, or a year abroad with a gold Amex card and then a plum job in your father’s firm. But let me assure you that the knowledge you’ve gained here at Harvard is a precious gift that will never leave you. Take it from me, your education is yours to keep forever. Why, many of you have read the Merchant of Florence, and that will inspire you when you travel to the island of Spain. Your knowledge of that problem they had with those people in Russia, or that guy in South America – you know, that guy – will enrich you for the rest of your life.

There is also sadness today, a feeling of loss that you’re leaving Harvard forever. Well, let me assure you that you never really leave Harvard. The Harvard Fundraising Committee will be on your ass until the day you die. Right now, a member of the Alumni Association is at the Mt. Auburn Cemetery shaking down the corpse of Henry Adams. They heard he had a brass toe ring and they aims to get it. Imagine: These people just raised 2.5 billion dollars and they only got through the B’s in the alumni directory. Here’s how it works. Your phone rings, usually after a big meal when you’re tired and most vulnerable. A voice asks you for money. Knowing they just raised 2.5 billion dollars you ask, “What do you need it for?” Then there’s a long pause and the voice on the other end of the line says, “We don’t need it, we just want it.” It’s chilling.

What else can you expect? Let me see, by your applause, who here wrote a thesis. (APPLAUSE) A lot of hard work, a lot of your blood went into that thesis… and no one is ever going to care. I wrote a thesis: “Literary Progeria in the Works of Flannery O’Connor and William Faulkner.” Let’s just say that, during my discussions with Pauly Shore, it doesn’t come up much. For three years after graduation I kept my thesis in the glove compartment of my car so I could show it to a policeman in case I was pulled over. (ACT OUT) License, registration, cultural exploration of the Man-Child in The Sound and the Fury…

So what can you expect out there in the real world? Let me tell you. As you leave these gates and re-enter society, one thing is certain: everyone out there is going to hate you. Never tell anyone in a roadside diner that you went to Harvard. In most situations the correct response to “Where did you go to school?” is, “School? Why, I never had much in the way of book larnin’ and such.” Then, get in your BMW and get the hell out of there.

You see, you’re in for a lifetime of “And you went to Harvard?” Accidentally give the wrong amount of change in a transaction and it’s, “And you went to Harvard?” Ask the guy at the hardware store how these jumper cables work and hear, “And you went to Harvard?” Forget just once that your underwear goes inside your pants and it’s, “And you went to Harvard?” Get your head stuck in your niece’s dollhouse because you wanted to see what it was like to be a giant and it’s, “Uncle Conan, you went to Harvard!?”

But to really know what’s in store for you after Harvard, I have to tell you what happened to me after graduation. I’m going to tell you my story because, first of all, my perspective may give many of you hope, and, secondly, it’s an amazing rush to stand in front of six thousand people and talk about yourself.

After graduating in May, I moved to Los Angeles and got a three-week contract at a small cable show. I got a $380-a-month apartment and bought a 1977 Isuzu Opel, a car Isuzu only manufactured for a year because they found out that, technically, it’s not a car. Here’s a quick tip, graduates: no four-cylinder vehicle should have a racing stripe. I worked at that show for over a year, feeling pretty good about myself, when one day they told me they were letting me go. I was fired, and I hadn’t saved a lot of money. I tried to get another job in television but I couldn’t find one.

So, with nowhere else to turn, I went to a temp agency and filled out a questionnaire. I made damn sure they knew I had been to Harvard and that I expected the very best treatment. And so, the next day, I was sent to the Santa Monica branch of Wilson’s House of Suede and Leather. When you have a Harvard degree and you’re working at Wilson’s House of Suede and Leather, you are haunted by the ghostly images of your classmates who chose Graduate School. You see their faces everywhere: in coffee cups, in fish tanks, and they’re always laughing at you as you stack suede shirts no man, in good conscience, would ever wear. I tried a lot of things during this period: acting in corporate infomercials, serving drinks in a non-equity theatre, and even entertaining at a seven-year-old’s birthday party. In desperate need of work, I put together some sketches and scored a job at the fledgling Fox Network as a writer and performer for a new show called “The Wilton North Report.” I was finally on a network and really excited. The producer told me the show was going to revolutionize television. And, in a way, it did. The show was so hated and did so badly that when, four weeks later, news of its cancellation was announced to the Fox affiliates, they burst into applause.

Eventually, though, I got a huge break. I had submitted, along with my writing partner, a batch of sketches to Saturday Night Live and, after a year and a half, they read it and gave us a two-week tryout. The two weeks turned into two seasons and I felt successful. Successful enough to write a TV pilot for an original sitcom and, when the network decided to make it, I left Saturday Night Live. This TV show was going to be groundbreaking. It was going to resurrect the career of TV’s Batman, Adam West. It was going to be a comedy without a laugh track or a studio audience. It was going to change all the rules. And here’s what happened: when the pilot aired, it was the second lowest-rated television show of all time. It’s tied with a test pattern they show in Nova Scotia.

So, I was 28 and, once again, I had no job. I had good writing credits in New York, but I was filled with disappointment and didn’t know what to do next. I started smelling suede on my fingertips. And that’s when The Simpsons saved me. I got a job there and started writing episodes about Springfield getting a Monorail and Homer going to College. I was finally putting my Harvard education to good use, writing dialogue for a man who’s so stupid that in one episode he forgot to make his own heart beat. Life was good.

And then, an insane, inexplicable opportunity came my way: a chance to audition for host of the new Late Night show. I took the opportunity seriously but, at the same time, I had the relaxed confidence of someone who knew he had no real shot. I couldn’t fear losing a great job I had never had. And, I think that attitude made the difference. I’ll never forget being in The Simpsons’ recording basement that morning when the phone rang. It was for me. My car was blocking a fire lane. But a week later I got another call: I got the job.

So, this was undeniably it: the truly life-altering break I had always dreamed of. And, I went to work. I gathered all my funny friends and poured all my years of comedy experience into building that show over the summer, gathering the talent and figuring out the sensibility. We debuted on September 13, 1993 and I was happy with our effort. I felt like I had seized the moment and put my very best foot forward. And this is what the most respected and widely read television critic, Tom Shales, wrote in the Washington Post: “O’Brien is a living collage of annoying nervous habits. He giggles and titters, jiggles about and fiddles with his cuffs. He has dark, beady little eyes like a rabbit. He’s one of the whitest white men ever. O’Brien is a switch on the guest who won’t leave: he’s the host who should never have come. Let the Late Night show with Conan O’Brien become the late, late show and may the host return to Conan O’Blivion whence he came.” There’s more but it gets kind of mean.

Needless to say, I took a lot of criticism, some of it deserved, some of it excessive. And it hurt like you wouldn’t believe. But I’m telling you all this for a reason. I’ve had a lot of success and I’ve had a lot of failure. I’ve looked good and I’ve looked bad. I’ve been praised and I’ve been criticized. But my mistakes have been necessary. Except for Wilson’s House of Suede and Leather. That was just stupid.

I’ve dwelled on my failures today because, as graduates of Harvard, your biggest liability is your need to succeed. Your need to always find yourself on the sweet side of the bell curve. Because success is a lot like a bright, white tuxedo. You feel terrific when you get it, but then you’re desperately afraid of getting it dirty, of spoiling it in any way.

I left the cocoon of Harvard, I left the cocoon of Saturday Night Live, I left the cocoon of The Simpsons. And each time it was bruising and tumultuous. And yet, every failure was freeing, and today I’m as nostalgic for the bad as I am for the good.

So, that’s what I wish for all of you: the bad as well as the good. Fall down, make a mess, break something occasionally. And remember that the story is never over. If it’s all right, I’d like to read a little something from just this year: “Somehow, Conan O’Brien has transformed himself into the brightest star in the Late Night firmament. His comedy is the gold standard and Conan himself is not only the quickest and most inventive wit of his generation, but quite possibly the greatest host ever.”

Ladies and Gentlemen, Class of 2000, I wrote that this morning, as proof that, when all else fails, there’s always delusion.

I’ll go now, to make bigger mistakes and to embarrass this fine institution even more. But let me leave you with one last thought: If you can laugh at yourself loud and hard every time you fall, people will think you’re drunk.

Thank you.