Wonderful. I guess the reasoning is "well, this is a women's issue, so put it in the bit of the paper that women read", right?
Couldn't tell you what the thinking was, but it does look odd.
Ada Lovelace was not exactly a fashion icon. Nor, to be honest, is Walter Isaacson, though he's dressed in unexceptionable smart-casual for the photoshoot.
Oh, related Lovelace news: there's a book out, with many footnotes.
http://sydneypadua.com/2dgoggles/tantaratara/
Putting it in the Tech section (does NYT have a Tech section?) would unleash the screaming hordes of self-righteous misogynists who lurk beneath the surface in any sufficiently large collection of geeks. "Sufficiently large" is a little indistinct but certainly includes the number of people who would read the NYT Tech section if such a thing existed, which it probably doesn't.
I don't know if that was the motivation, or if Ajay's hypothesis is correct, which seems fairly likely.
Also I want to note the provincialism of calling the NYT "The Times" as if no other city has a local paper with that name. You're a bumpkin, LB. Of course when I refer to "The Times" I also mean NYT rather than the DC Moonie puppy trainer. Maybe I am also a bumpkin.
Does the NY Times put book-related articles anywhere else in the daily paper?
I haven't read a dead-tree edition of the NYT this century, so I honestly don't know the answer.
Also I want to note the provincialism of calling the NYT "The Times" as if no other city has a local paper with that name. You're a bumpkin, LB. Of course when I refer to "The Times" I also mean NYT rather than the DC Moonie puppy trainer.
Or indeed the London fishwrap, which is actually called "The Times" and says so on its masthead and everything.
6: The NYT has an Arts section where book reviews, etc happen.
Putting it in the Tech section (does NYT have a Tech section?) would unleash the screaming hordes of self-righteous misogynists
Argh.
There's one now.
I echo heebie's argh, and raise her one "What fresh bullshit is this?"
Have I ever purported to be anything but provincial?
I just felt the thread might need cheering up.
Style sections used to be called the "Women's Page"
I should perform smugness at this point, because the story that we are publishing about underrepresentation of women is the damn cover story. So that's New York Times nil, Quarterly Proceedings of the London Society of Aquarists and Tropical Fish Keepers one.
Why yes, thank you, I would like a cookie.
Wait, you're not the economics editor for Marie Claire?
16: no, you're thinking of dsquared, who was until recently economics editor of Cosmopolitan ("Twelve Things He Wishes You Knew About Nominal GDP Targeting" was one of his).
"Drive Your Economy Wild With Export-Oriented Tariff Policies".
"Style Makeover: How to flatter your assets and conceal your deficits."
19: My bank has sent me spam (both email and physical) telling me I have "nice assets" and that I should put more into them.
Why is your bank negging you? I'm sure your assets are just fine the way they are.
That's what I said. They look well-balanced when I check out my portfolio. The bank said they'll kick in a bit if I put in an extra $100k but I think it's unrealistic to compare me to someone who has that much.
It's not the size of your assets, it's the way they appreciate.
I've just been told I'm "ineligible" for my voicemail app to show me ads! Fiddlesticks.
I don't think Ada Lovelace has been written out of the history of computing, any more than Charles Babbage has. She's actually fairly revered. She has a major programming language named after her, which is rare (for a male or a female); Haskell Curry is the only other one that comes to mind.
The field is fairly ahistorical. In my undergraduate education, probably the only names I learned as being fairly significant are: Dijkstra, Von Neumann, Turing. Maybe Kernighan and Ritchie? To this day (and I'll take a break and hit Wikipedia right after this) I couldn't tell you what Grace Hopper's contribution to the field was or, say, who invented the transistor. fake accent will be very disappointed in me.
Let me make the subtext of 25.1 text: Walter Isaacson is kind of a dumbass.
Look, books need creation myths too, and that's how I'm reading the story of the gestation of this book, the college essay story.
When I saw the OP, I thought to myself "Who doesn't know this?" Who hasn't heard of at least those two? I've just now found and looked over The How and Why Book of Robots and Electronic Brains (1963), which I know was my own introduction to the subject, and sure enough they're not mentioned, although many of the illustrations show mainframes with women operating them, with no man in the picture. Nonetheless, the stories of Lord Byron's daughter Ada Lovelace and of Grace Hopper and the "bug" seem to have been part of most computer history I've read since.
So either Isaacson is being disingenuous or Bob-Somerby-head-slapping obtuse, or both. The collaborative basis of technical progress is a commonplace in the story of invention and development, and is prominently featured as such in histories of other technologies I've read, many of them 50-80 years old.
So I am not impressed, but maybe I'm an outlier, both in Ada/Grace awareness and technical history. What about you?
25: she was written out of the history of computing at the time is how I read his argument. Ditto the female programmers on ENIAC.
If it wasn't for Ada Lovelace, there's a chance that none of this would even exist
There's a chance, yes, but a small one. Lovelace and Babbage's work is really interesting and way ahead of its time, but as far as I know they did not directly lead to early digital computers as we know them (some 50-100 years later). Like I said, though, I expect f.a. to show up and lay some smack down.
Agreed. In my extremely basic computer science class in... 1998? there were about five names we learned. Lovelace, Babbage, Turing, Von Neumann and Joseph-Marie "Loom" Jacquard. Then like all sciences, when we got into the realm of living memory, it kind of stopped being about heroic individuals who have things named after them.
Lovelace, at least, has been getting written back in for ages now. The programming language dates to 1980.
27.2: Did that book mention Charles Babbage?
Specifically on Lovelace, rather than the generation of women programmers during and immediately after the war, I agree with Idp -- the fact that Isaacson hadn't heard of her or didn't remember her says something more about Isaacson being forgetful/confused/ignorant than about Lovelace having been written out of history.
I half agree with 25. Ada Lovelace is super well known re: computing and has been since I was using TRS-80s in middle school. Grace Hopper, on the other hand, is less well known than she should be.
28: With respect to Lovelace, I'm not sure what "at the time" means. There was no history of computing at the time. The stuff that she and Babbage were doing only became of interest decades later when computing became practical.
Lovelace got a language; Babbage got a now defunct electronics retailer. She won.
My knowledge of Hopper is "something something early imperative compiled languages maybe FORTRAN?," but people are more quick to remember her for the bug story.
Hopper got a memorial park. I walked past it this morning, some dogs were honoring her memory.
The thing I remember about Hopper (apart from the fact that she was an admiral when very few women were going into the navy) is that she sometimes carried around a little stick to illustrate how far light travels in a nanosecond.
With respect to Lovelace, I'm not sure what "at the time" means.
It means "at the time when she was doing her work". The article mentions one specific example: Babbage being pressured to put his name on her work.
A bunch of Victorian gentleman scientists refusing to publish a paper by an icky girl doesn't tell us much about modern sexism.
This sounds like one of those cases where someone wants to write about something, in this case massively influential women, that has been written out of history or otherwise neglected. And then they want to link it to contemporary problems or argue that this neglect is bad. But they end up picking an area that they don't know much about, and since they don't know much about it the only cases they can think of to use as examples are the ones that weren't written out of history but are actually pretty well known.
(In other words, this is a much less embarrassing version of this sort of thing, right? I mean, less embarrassing in that the person involved is at least a little obscure, and that the actual issue is an important social one and isn't just "I don't know what's going on".)
31:
Babbage, yes but just in passing.
I'm no expert; my computer knowledge is Burns-level at best.
Isn't the Grace Hopper bug story sort-of apocryphal? I think the term was already in use and she was consciously punning.
I also, in the four minutes I spent on this after breakfast, misrepresented the book (The Innovators). It's not about the women who have been written out of the history of the tech industry. It's about the history of the tech industry generally, but makes an effort not to write women out of it despite the fact that they have been before.
I think Lovelace and Hopper may have been more the journalist's attempt to namecheck particular women who NYT readers might possibly have heard of, rather than major focus points in the book.
A bunch of Victorian gentleman scientists refusing to publish a paper by an icky girl doesn't tell us much about modern sexism.
Didn't Rosalind Franklin get treated about as shabbily at the time, if not identically?
Not so much Victorian. But shabbily, yes.
The structure of DNA was not discovered by crystallography in the Victorian era.
Is the story with Rosalind Franklin that her name was on the paper and they removed it, or that they wrote the paper and never considered putting her name on it?
45: Apparently Ada opens and closes the book though. I dunno, I am suspicious of the two cultures effect. I'm wondering whether, if you asked the author if he had heard of Robert Boyle, you would end up with a book decrying Boyle being written out of chemistry history due to his Irish ancestry.
There's a massive problem with writing women out of tech history but I agree with 39 that Ada Lovelace and Grace Hopper are almost the only two that aren't.
Hopper invented the compiler, insofar as you could name one inventor. She's called Amazing Grace for a reason.
The latter. As I understand it, she had generated the necessary data in her lab; rather than approaching her for permission to use it and bringing her in as a collaborator, Watson and Crick stole the relevant photographs and didn't credit her.
(Checking wikipedia, 'stole' is maybe an overstatement. Another scientist working with Franklin gave Watson and Crick access to her data. But still with the not crediting her.)
Clearly 49 is unaware of Giselle Thompson's Vivological Panopticon. Her pioneering Phlogistonic Refractor anticipated so much of Rontgen's later work, but was sadly not well recognised at the time and has been largely written out of history.
54: Pedantic correction, the photograph was shown to Watson by Franklin's collaborator Wilkins.
Did Watson and Crick credit Wilkins (or anybody) for the photo? If not, it was just straight-up intellectual fraud, rather than sexism.
Wikipedia:
"Crick and Watson then published their model in Nature on 25 April 1953 in an article describing the double-helical structure of DNA with only a footnote acknowledging "having been stimulated by a general knowledge of" Franklin and Wilkin's 'unpublished' contribution. Actually, although it was the bare minimum, they had just enough specific knowledge of Franklin and Gosling's data upon which to base their model. As a result of a deal struck by the two laboratory directors, articles by Wilkins and Franklin, which included their X-ray diffraction data, were modified and then published second and third in the same issue of Nature, seemingly only in support of the Crick and Watson theoretical paper which proposed a model for the B form of DNA."
They cite Wilkins and mention Wilkins and Franklin in the last paragraph. The whole original paper is amazingly short by modern standards. http://www.nature.com/nature/dna50/watsoncrick.pdf
The fact that Watson's reputation is so haunted by the ghost of Rosalind Franklin is kind of a great example of a self-inflicted wound. If he hadn't done such a hatchet job on Franklin in The Double Helix it's likely that the whole thing would have remained obscure.
59: I think I sprained a muscle rolling my eyes at you. Sure thing, you couldn't possibly speculate like that.
If they'd just rob a bank, or explain clearly that their motivation for cutting her out of credit for her work was sexism rather than just being self-aggrandizing, then we'd have them for sure!
I could be even more pedantic on the subject of Franklin, but as I understand the story, she had the images but the wrong interpretation. Wilkins shared her images without her permission, and it gave Watson and Crick the solution. Watson and Crick denied and downplayed her contribution (and Watson wrote about her in a pretty unflattering way in his book).
Of course it's in the Style section. Girls like their science sparkly.
The fact that Watson's reputation is so haunted by the ghost of Rosalind Franklin is kind of a great example of a self-inflicted wound.
It helps also that Watson's reputation is haunted by the fact that he is an all-around dick.
25, 29: I don't actually know much about the history of computers. Yet. Still, 25.last is right.
I do have the impression that Lovelace and Hopper are a lot better known now than they were a decade or two ago.
62 seems pretty likely to me also. The sort of thing he and Crick did is really common, and typically only gets remembered if the people doing it make it clear that they are raging douchebags too, because otherwise a lot of people would have to admit (to themselves mostly) that this kind of thing is structural not just personal. Once you have someone who is clearly awful though you can talk about how they did it without admitting that it didn't just happen because they were awful.
Don't worry though, ladies! The Republican party is here for you. (And here, and here...)
68: I agree about Hopper but not Lovelace. She was part of the curriculum in middle school computing classes in the 80s, unless my class was wildly unrepresentative.
63: Look, I'm talking out my ass here and there's apparently ample secondary evidence to implicate sexism. All I'm saying is that bare facts—"borrowing" data from another lab without attribution—sound like standard-issue academic megalomania. I've heard many such stories where the victims were men.
Odds are high that the women mentioned in this article (an obituary for one of the women who worked on the ENIAC) remain fairly obscure.
I don't remember a single person being named in either of the programming classes I took in junior high/high school. Not counting Pascal.
71: Look, I'm talking out my ass here and there's apparently ample secondary evidence to implicate sexism.
We're good, then.
73: Same. Then Turing and Godel appeared in first year uni.
All I'm saying is that bare facts--"borrowing" data from another lab without attribution--sound like standard-issue academic megalomania.
They didn't in fact borrow the data without attribution. It's right there in the Nature article.
who invented the transistor
John Bardeen, the only person to get two Nobel Prizes in physics (the other was for explaining the mechanism of superconductivity), together with Walter Brattain and a third person who was a real credit-grubbing asshole.
a third person who was a real credit-grubbing asshole
And who pissed off his employees so badly that they basically invented the modern venture-capital-funded startup in response.
Gödel, really? His work is pretty deep (we spent a week or two going through his incompleteness proof in a graduate seminar, as I recall) and only relevant to computing by analogy with Turing machines.
At the university level, I figure you can get by with, bare minimum, Turing and Von Neumann, for the theoretical underpinnings, and then various names that have gotten attached to theorems and algorithms (e.g., Church, Dijkstra).
36: she sometimes carried around a little stick to illustrate how far light travels in a nanosecond.
A ruler?
a third person who was a real credit-grubbing asshole.
I mainly knew about his late in life mania for eugenics, but reading more about him reveals that that was just icing on the cake.
They didn't in fact borrow the data without attribution. It's right there in the Nature article.
"We have also been stimulated by a knowledge of the general nature of the unpublished experimental results" does not seem to be to be meaningful attribution if the situation was, as it seems to have been from Watson's account in the Double Helix, that the photograph in question was a key factor in getting to the correct structure.
only relevant to computing by analogy with Turing machines
This isn't quite right, but I'll spare everyone the details so we can get back to the sexism.
Godel was so sexist, he literally couldn't live without a woman in the kitchen. Or so the story goes.
By modern standards, which maybe didn't apply at the time, I would have thought the appropriate thing was for Watson and Crick to cite Franklin's paper and vice versa and publish them simultaneously. Which is almost, but not quite, what happened, as far as I understand: they did the simultaneous publication and they put in an acknowledgment but not a reference.
I basically don't know the name of anyone involved in computer science during the mainframe era. I'm vaguely aware of various really old stuff mentioned above, and the early theoretical stuff (Turing, Church, von Neumann, etc.) and of course know the people who ran the big PC companies. But I couldn't tell you the name of anyone working between 1950-1980 on making the modern computer era happen. To some extent, this might be that big groups played such a big role: I know about IBM, Bell Labs, the Xerox lab that invented everything, that the military played a big role, etc. But I don't know the names of any of the key players at any of those big institutions.
I don't really understand why people still study Gödel's original proof instead of Turing's much much easier-to-understand version of the same basic ideas. This is probably why I'm not a mathematician.
86: von Neumann was not just the theoretical stuff, he also ran the team that built the IAS computer.
88 illustrates my point nicely!
87: If I'm remembering correctly what Godel did and what Turing did, Turing just tells you that there exist functions that are uncomputable. Godel tells you that there are truths that are unknowable (unprovable). The latter is presumably deeper but there's probably some bijection between them that I'm forgetting.
88: And also really sexist and creepy!
I think the individual case of glancing at the photograph would have not been a problem at all, if it were not part of a larger pattern of sexism and minimizing Franklin's contributions. It also would have helped if Franklin had lived long enough to be awarded the Nobel Prize.
92.1 is my sense. There was bad behavior but most of it was after the publication.
The Style section thing has to be a deliberate recurring bit of provocation at this point, because AFAICT they put everything related to sex discrimination there, and the fact that people complain probably drives traffic. Maybe that's too cynical.
I read part of a hostile biography of Ada Lovelace from the 80s -- I should track down more details. It very nearly claims that she contributed nothing of any value. Female author. So if the backlash is that old...
88: Yes, but his most influential work in the field is the stored program architecture (which was not theoretical to him but is the basis of the predominant computing abstraction).
87: Gödel's proof is actually about first-order logic, so it's much more relevant from a mathematical perspective. Turing's approach was unconventional and required an appendix to make the connection to the Entscheidungsproblem.
90: See this Scott Aaronson post for one way to relate them.
By modern standards, which maybe didn't apply at the time, I would have thought the appropriate thing was for Watson and Crick to cite Franklin's paper and vice versa and publish them simultaneously.
Wilkins and Franklin's papers.
Which is almost, but not quite, what happened, as far as I understand: they did the simultaneous publication and they put in an acknowledgment but not a reference.
In fact there were three DNA papers published in the same issue of Nature. Watson & Crick; Franklin & Gosling; Wilkins, Stokes & Wilson.
All three referred to each other, in roughly similar terms, but you don't cite papers formally if they're in press when your paper is. (Or you didn't back then.)
Watson & Crick: "We have also been stimulated by a knowledge of the general nature of the unpublished experimental results and ideas of Dr. M. H. F. Wilkins, Dr. R. E. Franklin and their co-workers at King's College, London. "
Wilkins et al noted that the results were "not inconsistent with the structure suggested by Watson and Crick" and finished: "We wish to thank... Dr JD Watson and Mr FHC Crick for stimulation, and our colleagues RE Franklin, RG Gosling, GL Brown and WE Seeds for discussion."
Franklin & Gosling also noted that their results agreed with Watson & Crick's idea, and finished: "We are grateful to Prof. J.T. Randall for his interest and to Drs. F.H.C. Crick, A.R. Stokes, and M.H.F. Wilkins for discussion."
So in fact the general belief is 100% the reverse of the truth. Watson (and Crick) didn't fail to thank Franklin, but Franklin failed to thank Watson...
Turing's paper is also not super-accessible. The Turing Machine abstraction has been much improved upon in later pedagogy.
I read the 2009 Grace Hopper biography from MIT Press last year. Learned lots. It might not be as interesting if you already know a lot of the relevant history, but for me, the author's explanations of the problems and debates in computing from the 40s to the 60s were great.
I have links!
Preface. '. . . I have been able to clear up a number of puzzles and misinterpretations about Ada's life and activities. To take one example among many: Ada's "curious letters" to Augustus De Morgan, "enquiring, speculating, arguing, filling pages with equations, problems, solutions, algebraic formulae, like a magician's cabalistic symbols," turn out to be a correspondence course in calculus, in which he was tutoring her.'
Isn't 97 evidence in support of 62? My impression is also that a big part of the controversy was created by the Nobel citation omitting her on a technicality.
98: right, but it was improvable. Gödel's way of mapping statements to numbers seems to remain klugey even after decades of pedagogy.
100: Reviewed by Andrew Hodges! Excellent.
the Nobel citation omitting her on a technicality.
I assume Nobel citation refers to something distinct from the Nobel prize? For the prize, the technicality was that she was dead.
101: Pretty much -- my sense of the story comes mostly from Watson's own account The Double Helix, which as I recall it devotes a lot of attention to how absolutely necessary Franklin's data was, and the difficulties of getting access to it without her knowledge. Possibly Watson was being unfair to himself, though.
104: Right. And there's also a limit of three co-recipients, so she might not have made the cut. But there's the unfortunate fact that "X, Y and Z won the Nobel Prize for α" tends to reinforce the notion that nobody else contributed significantly to α.
101.1: yes.
101.2 - is this that she didn't get a Nobel because she was dead? I am not sure what the "Nobel citation" means here. The citation is just one sentence that says "The Nobel Prize in Physiology or Medicine 1962 was awarded jointly to Francis Harry Compton Crick, James Dewey Watson and Maurice Hugh Frederick Wilkins for their discoveries concerning the molecular structure of nucleic acids and its significance for information transfer in living material".
102: It's kludgy but also kind of awesome. Something else I read (GEB, probably) had a simpler scheme based on concatenation that was nowhere near as fun.
A project I worked on for undergraduate research that didn't go anywhere (admittedly, mostly because I'm a horrible procrastinator) involved trying to get Godel's proof into a theorem proving system (specific ET/PS, which should tell you who I was working with). One reason it didn't work so well--and if I wasn't so lazy I would have found a way around--was that there are numbers in that proof of the form a^b where b is an integer that has more digits than can fit into memory.
specific+ally
Also, if you want to hew really closely to the original implementation, you should be using Church numerals (i.e. a unary representation) which makes memory use even more absurd. I think we decided that was very silly.
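Since we're nerding out: here's a minimal Python sketch of the textbook prime-power Gödel numbering (not whatever encoding ET/PS actually used; `first_primes`, `goedel_number`, and the symbol codes are all made up for illustration). It shows why the numbers blow up so fast:

```python
# Textbook prime-power Goedel numbering, purely for illustration:
# the k-th symbol with code c_k contributes prime(k) ** c_k.

def first_primes(n):
    """Return the first n primes by naive trial division (fine for tiny n)."""
    found = []
    candidate = 2
    while len(found) < n:
        if all(candidate % p for p in found):
            found.append(candidate)
        candidate += 1
    return found

def goedel_number(symbol_codes):
    """Encode a sequence of symbol codes as one big integer."""
    g = 1
    for p, c in zip(first_primes(len(symbol_codes)), symbol_codes):
        g *= p ** c
    return g

codes = [8, 4, 13, 9, 8, 13]   # made-up codes for a six-symbol formula
print(goedel_number(codes))    # already a number with dozens of digits
```

A six-symbol toy formula already encodes to a number with dozens of digits; iterate the encoding a couple of levels (formulas about sequences of formulas) and the sizes explode past anything storable.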
106: Watson said later that, if she'd lived, the ideal would have been for him and Crick to get the Physiology/Medicine Nobel and Wilkins and Franklin to get the Chemistry Nobel. But I doubt the committee would have gone for that. And a bit rough on poor old Max Perutz.
Here's something that *would* be appropriate in the style section: why do I, every three or so years, forget how wretchedly, almost psychotically bitchy Madame Figaro is and end up looking at it? It just makes me despair, so then I have to get a quick infusion of Marie Claire to counterbalance.
Also, if you make a cake consisting of a half pound of butter cantilevered over a 1/3 cup of flour it will be a good cake.
112: That does sound like the makings of a good cake!
essear, where do you get the sense that people still rely on Goedel's original paper? Goedel's paper uses the mathematical foundations proposed by Russell and Whitehead, something that people stopped using long ago. Most of the time, a textbook will present the improved result by Rosser, anyway.
I know people don't rely on the original paper. The improved versions I've seen are still clunky. I just feel like the whole thing is ugly and the Turing stuff about computability is much easier to understand. But I've never found formal logic very interesting and you should probably just ignore me because I don't know what I'm talking about.
I feel like this thread is incomplete if it doesn't include a link to Grace Hopper's appearance on Letterman, including the nanosecond thing.
86. "But I couldn't tell you the name of anyone working between 1950-1980 on making the modern computer era happen."
I could name dozens, but a few that (to me) seemed to be known beyond the narrow boundaries of the field include John McCarthy (inventor of Lisp), Marvin Minsky (AI pioneer), Donald Knuth (algorithms), JCR Licklider (the Arpanet), Vint Cerf and Bob Kahn (TCP/IP, aka "the Internet"), Rivest, Shamir and Adelman (public-key cryptography).
I could go on and on, but my guess is most of you have never heard of any of these folks. (There are also the business types, many of whom are somewhat well-known, but many were also scientists and engineers: Jobs, Wozniak, Gates, etc.)
115: I haven't completely read the Aaronson thing yet, but: I think the clunkiness of it is unavoidable once you really formalize things. They're both built around this intuition about the liar's paradox, but Gödel had to go the extra mile to get it to work within the formal system. And that means gross technical details. But I also find those gross technical details beautiful in their own way, since they show how you can treat fairly deep ideas as just another kind of data that can be manipulated like anything else. So to tie it back to another part of the thread, you can see it as a worked example of what Von Neumann machines (as in computer architecture, not self-replication) get you.
my guess is most of you have never heard of any of these folks.
Never a good guess at unfogged, but perhaps a particularly bad guess in this specific instance.
I haven't heard of Licklider, AFAICR.
Minsky is the only person in 118.2 that I've heard of.
Same as 121 for me. The rest are all big names.
I've heard of most of them, but as I say above my knowledge of this area is Burns-level.
Okay, read the Aaronson thing. That's really neat. Makes me wish I hadn't forgotten all of this.
I like both the "CS" perspective and the formal logic perspective. For a mathy CS course that isn't going into the formal deep-end--something like 15-251 at my alma mater--that Turing machine formulation is probably the right way to go. But I entirely agree with the early comments about the value of Gödel numberings, especially as showing how you can do *compilation* within logic.
Over my vacation, to see if I still could, I wrote an auto-cannibal maker (a generalized quine). In one of my logic classes there was a problem that I realized was basically showing that you can construct a function that does the exact same thing in a sufficiently strong formal logic.
Anyway blah blah blah "hurry-coward isomorphism"
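For anyone who's never seen the trick: the self-reproducing core is two lines of Python (just the plain quine; a generalized "auto-cannibal maker" in the sense above would wrap some transformation around the same template). Comments are left out of the block because a quine has to reproduce its own text exactly:

```python
s = 's = %r\nprint(s %% s)'
print(s % s)
```

Run it and it prints those two lines back verbatim; the code-as-data move is the same one Gödel numbering makes.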
How about Brian Kernighan, Dennis Ritchie and Ken Thompson (UNIX and C)?
OT ATM bleg:
Here's your big opportunity to break the analogy ban with a metaphor! Fill in the blank:
The overly sweet margarita had enough sugar to _____.
I'm coming up empty, and I'd like to have at least a placeholder metaphor before sending this to my cow-riter. You may see your entry in print (for which I will be lightly remunerated)!
"Support a beehive through the winter".
"Power the cotton candy machines at [local amusement park] for a month".
120. Never a good guess at unfogged, but perhaps a particularly bad guess in this specific instance.
Well, I was somewhat trolling.
121 & 123. Licklider came up with the idea of a universally accessible library. He went from MIT to DARPA, proselytized and funded his ideas: timesharing, personal computers, networking.
"mask the cheap tequila within."
Oh? That's not how it's done?
The overly sweet margarita had enough sugar to
support a twelve-state Bob Mould tour.
"make a meringue the size of Pirates fans' dashed hopes PNC Park"
128. Also good choices! Since Unix was heavily influenced by Multics and CTSS, one could add Corbato and Saltzer (and others). I think Richard Stallman probably fits in under the 1980 endpoint, too.
Not to mention Doug Engelbart (mouse, GUI system, etc.), whose human-machine interface paradigm (now being supplanted by the touch screen, alas) dominates computing.
PL nerds would want to add Robin Milner but he's definitely not a household name.
You know who gets no damn respect? Eckert and Mauchly.
AISIMHB at my first tech job the chair of the board had (has) a logic gate named for him. Nineteen-year-old me found that pretty darned mind-blowing.
"The margaritas were too sweet."
You're the reason print is dying.
Go noir: "The margaritas were sweeter than the final drag on the cigarette smoked in front of a firing squad."
36: she sometimes carried around a little stick to illustrate how far light travels in a nanosecond
Was it called a "ruler"? The speed of light is one foot per nanosecond.
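For the pedants, the arithmetic:

```python
# Back-of-envelope check on "one foot per nanosecond".
c = 299_792_458               # speed of light in vacuum, m/s
distance_m = c * 1e-9         # metres covered in one nanosecond
print(distance_m)             # 0.299792458 m
print(distance_m / 0.3048)    # ~0.984 ft, so a one-foot stick overshoots by ~1.7%
```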
115: In the big picture it doesn't matter, since once you learn it the connections are clear. So then it's a question of pedagogy, rather than fundamental content, and ultimately it's a question of aesthetics: what's the more fundamental concept, the natural numbers or Turing machines? Mathematicians think the first, and computer scientists think the second. Since the natural numbers go back to the Sumerians, and Turing machines go back to Turing, I'm going to go with the mathematicians on this one.
Plus, Goedel's approach of constructing a sentence that says "I am not provable" is cooler than something about the halting problem.
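You can put the two side by side, though. Here's a sketch of the diagonal argument in halting-problem dress; `halts` is a hypothetical oracle, not real code, which is the whole point:

```python
def halts(func):
    """Hypothetical oracle: return True iff calling func() would eventually halt."""
    raise NotImplementedError("no total, correct version of this can exist")

def diagonal():
    """Halts exactly when the oracle says it doesn't."""
    if halts(diagonal):
        while True:        # oracle said "halts", so loop forever
            pass
    # oracle said "loops forever", so return immediately

# Whichever answer halts(diagonal) gives, it's wrong -- the same self-reference
# Gödel packs into a sentence asserting its own unprovability.
```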
146: 20-year-old me found it mind-blowing that the guy I did the research with (mentioned above) shared an advisor with Turing (said advisor obviously a big name in his own right).
I like 136.
146: don't ever change, O.
By the way, in case it wasn't clear, the sentence in question doesn't really begin "The overly sweet margarita".
I think I'll use 131 and let AB fix it.
143. Since the range of dates originally specified was 1950-1980, I think they were too early, although to be fair the UNIVAC was introduced during that period.
So, yeah.
Feet per nanosecond should clearly replace the SI units.
144: In households that know any damn thing about the history of computing, yes.
I have the impression that Milner's profile has risen over the past decade or so, alongside Haskell's, but I'm not sure why. (What I mean is: Haskell and H-M type systems are still very niche but more and more people seem to have heard of them and feel obligated to have an opinion.)
The overly sweet margarita had enough sugar to
...Keep 100 oogles in PBR for a year
...Make a calavera for everyone in Mexico City
...Flood Boston with molasses 10 feet deep
...Make the Archies blush
156.last: Nice -- I was trying to do something with that song.
||
The White Women of Empire or HRC 2016! Homeland posters are just stunningly evil.
New Inquiry. I like that place.
Also there's a review of Laurie Penny's new book against Good Girlism
|>
155: I have a cynical belief that part of the appeal of a programming language is the things that are hard, but not impossible, to do in that programming language. If it's hard to do, then it's fun to figure out how to do it. (I came up with this theory when I found myself doing something in C++ rather than Lisp, simply because it was too easy to do in Lisp, and therefore boring.) The appeal of Haskell is that understanding the IO monad is more fun than "printf", and understanding the state monad is more fun than a plain old assignment statement.
155. Haskell's profile has risen because functional languages and Haskell itself are enjoying a vogue right now, even though they are niche compared to, say, C++ or Java.
160: That statement has the form "X implies X."
Haskell (and FP) fans make a lot of noise, but I don't see it getting used much for non-toy things.* Maybe its usage has increased a hundred-fold over the past decade without yielding measurable market share? There's also F#, Scala and Clojure carrying the banner for FP, but these are very watered-down versions of FP compared to Haskell.
* Let me establish my bona fides here and say I used OCaml exclusively for most of grad school and almost went to work for the one notable U.S. company that hires OCaml programmers.
It's used extensively at several big banks.
Maybe I should strike "big banks" and just say "financial services companies" since I don't know how big they are, actually.
161: ja/ne st/reet? When I was interested in OCaml searching for it caused them to be all over my ads for months.
156 is pretty good. Although, for 156.3, I think it's still Too Soon.
All the people I know who use OCaml are German.
162: That's what everybody always says, but it still seems like the ratio of people who get paid to write Haskell to the number of Hacker News threads about it is way out of whack.
164: That's the one.
The same thing is true of Go, BTW, but it's young yet and has the potential to become a truly mainstream language.
I thought the language named after the pioneering work Haskell did in the late 50s/early 60s was called Eddie.
P.S. I tried hard to convince my team to write our new thing in Go and was unanimously overruled in favor of Java. Philistines.
I was basing 162 on personal reports from people employed at such institutions on /r/haskell, rather than cheerleading on HN, but wev.
The same thing is true of Go, BTW, but it's young yet and has the potential to become a truly mainstream language.
Does it have the potential to become a language worth learning for any reason other than its current or potential popularity?
171.last: If you mean, "does it uniquely implement any interesting concepts that expand the expressive power of programs?," then probably not. Light-weight threads with channel communication are neat but not unique. I do think it's a nice alternative to C++ and Java if performance is a particular concern.
I finally got around to installing Julia and playing with it a little bit. It seems potentially fun, but the libraries don't seem quite as well-developed as scipy yet so it was a little hard to find some pieces I wanted to use if I was going to use it for my current project.
Go is definitely on the way up. Docker is written in Go - possibly the most significant piece of software to be written in the past five years.
161. Probably should have said that the "vogue" for Haskell is in commercial applications, where previously if you used Haskell it meant you were an academic.
Such things as correctness proving (just to pick one thing that is easier in a functional language) are becoming more important in some [redacted] applications.
My subjective take on this is that one reason "niche" languages are getting more popular is that web services and SoS architectures mean it doesn't matter as much what's doing the actual work on the backend as it did even five years ago.
175.last is the argument that I first heard from...Joel Sp/olsky or someone similar to him for why they used Lisp on the backend. Which is totally understandable. If you're free from whatever hell the end user is running on, why not take the opportunity to do some fun/useful/awesome?
(In our particular case, so much is written in Java that we can't move away from it on the backend. Or the frontend, via GWT. *shrug*)
79: Yes, really, but mostly as a namecheck while focusing on Turing machines and Turing's proof that the Halting Problem is undecidable, the diagonal proof. I found it somehow liberating to discover there were things a computer provably couldn't do.
The department was full of formal methods people and one of the other first year subjects was an intro to formal specification in Z.
I may have remembered Gödel because it was reinforced by GEB, though.
I found it somehow liberating to discover there were things a computer provably couldn't do.
And it's funny that that includes something so trivial as being able to check whether a student's assignment halts or not. (In practice, you can still write a checker for this that does pretty well.)
Of course, the real question then is whether *you* can do what the computer can't. Or anyone. Or everyone.
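A hedged sketch of the "does pretty well in practice" checker: run the submission in a separate process with a time limit and guess "doesn't halt" on timeout. (`probably_halts` and the two toy submissions are invented for illustration; undecidability just guarantees it's wrong sometimes, not that it's useless.)

```python
import itertools
import multiprocessing

def quick_task():
    """A toy submission that clearly terminates."""
    return sum(range(10**6))

def infinite_loop():
    """A toy submission that clearly doesn't."""
    for _ in itertools.count():
        pass

def probably_halts(func, timeout_seconds=2.0):
    """Guess whether func() halts by running it with a time limit."""
    proc = multiprocessing.Process(target=func)
    proc.start()
    proc.join(timeout_seconds)
    if proc.is_alive():            # still running: guess "doesn't halt"
        proc.terminate()
        proc.join()
        return False
    return True

if __name__ == "__main__":
    print(probably_halts(quick_task))      # True
    print(probably_halts(infinite_loop))   # False, after the two-second timeout
```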
166: Jane St have a good OCaml blog if you can get over catching finance cooties. https://blogs.janestreet.com
176: You're probably thinking of P/aul Gra/ham, which brings us back to sexism.
180: Maybe I'll give that a look...back when I last looked into it, OCaml seemed like the most practical ML. I forget why I stopped. Maybe because so much was translated from the French.
181: Yeah, I am, thanks. How easy it is to get back to sexism.
Are we Google-proofing public figures names to avoid getting kibozed and/or sock-puppeted?
Today we've mentioned an unusually large number of people, in both this thread and the other, who probably have Google Alerts on themselves. So yes.
I somehow doubt Mr Sp/olsky stays up at night worrying about what people on the Internet are saying about him. Speaking of: I kind of miss him, since he decommissioned his blog and the SO podcast. (Oh, I just checked and found out the SO podcast was a going concern up through this year, but I stopped listening years ago.)
kibozed
Blast from the past. Kibo joined metafilter somewhat recently when something pertaining to him was under discussion! That was neat.
Rivest, Shamir and Adelman (public-key cryptography).
I could go on and on, but my guess is most of you have never heard of any of these folks.
Not only have I heard of them - at least Knuth, Minsky, Cerf, Kahn and RSA - I think you mistakenly typed "Rivest, Shamir and Adelman" when you meant to type "Ellis and Cocks (public-key cryptography)" there. Being as how they invented it first.
188: You keep it secret, you don't get the credit.
148: Actually, foot-long pieces of copper wire. There are some of her lectures on youtube and you can see her handing them out.
Other mainframe era people: Louis Pouzin (invented UDP and shells), Maurice "Not Wilkins" Wilkes (EDSAC, but also microcode and operating systems, and this:)
the realisation came over me with full force that a good part of the remainder of my life was going to be spent finding errors in my own programs
189: well, it's not secret now and hasn't been since 1997, so they should get it.
Why should they get the credit? They invented it, kept it secret, and only revealed it after someone else had rediscovered it. They contributed nothing to the body of human knowledge. They don't even deserve a footnote.
Isn't it the norm that the credit for first inventing something goes to the people who first invented it?
Say someone in an archive somewhere finds the bit of paper with slightly wider margins on which Fermat actually wrote his marvellous proof. Fermat doesn't get credit for proving his theorem? Just Wiles, who did it, what, three and a half centuries later?
DW Davies! https://en.wikipedia.org/wiki/Donald_Davies
who sort-of invented packet switching and also told Alan Turing he was wrong.
AMERICA WAS DISCOVERED SEVERAL TIMES BEFORE COLUMBUS, BUT IT HAD ALWAYS BEEN SUCCESSFULLY HUSHED UP.
195: presumably he was the originator of the "Shorter Alan Turing" note.
The problem with 88, ajay, is that you said he "mistakenly" wrote Rivest et al.; if the wide margin page were discovered it would still be true, would it not, that Wiles had proved Fermat's last theorem, by any standard? Ellis and Cocks did it first but the RSA team did it completely independently. So hooray for everyone!
C'mon, big hug.
198. Since the original post was about people who were "making the modern computer era happen," stuff that was kept secret hardly could count.
Fermat definitely didn't prove FLT. Either he just meant that he had a proof for n=4, or he made a mistake.
I guess he might also have had a proof for n=3. Unlikely, but possible.
If you can't trust brief notes scribbled in margins, what can you trust?
Hey, one of Kevin Drum's new cats is now named Grace Hopper (her brother is David Hilbert).
I cherish my Hoppers lunchbag (interest group from MSFT) even though it's cheap vinyl and falling apart. Some of the Bletchley and ENIAC programmers came and talked to us in the 1990s; a few of the latter were still in the field.
About forgetting women: a couple of months ago Jameco's newsletter had a short memoir by a retiring hardware designer (iirc) who was a woman "when no women were in the field," except that one of *her* teachers and mentors was also female. The belief overrules observation over and over again.