Give me my multiple-dispatching generic functions back!!!
Some day we will catch up to the 80's again, hopefully in my lifetime.
Yes, helpy-chalk, we did go to college with the fellow in the first link.
Give me my multiple-dispatching generic functions back!!!
Some day we will catch up to the 80's again, hopefully in my lifetime.
You know that those languages still exist right? You can have your multiple dispatch right now.
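For anyone who hasn't seen them, this is roughly what they look like in CLOS (a toy sketch, class names invented, but it should run in any CL):

;; The method is chosen on the classes of BOTH arguments --
;; no single "owning" class, unlike single-dispatch OO.
(defgeneric collide (a b))

(defclass asteroid () ())
(defclass spaceship () ())

(defmethod collide ((a asteroid) (b asteroid)) :dust)
(defmethod collide ((a spaceship) (b asteroid)) :hull-damage)
(defmethod collide ((a asteroid) (b spaceship)) (collide b a))

;; (collide (make-instance 'spaceship) (make-instance 'asteroid))
;; => :HULL-DAMAGE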
Graham Harman delivered a brilliantly dense paper in Zagreb over the weekend.
Is it considered good in philosophy for a paper to be dense, or is "brilliantly dense" meant to be read as sardonic?
5: I wondered about that too. Were they making a funny? Or using it to mean something like "rich." IANAPhilosopher; I just sleep with one.
Ben, what the hell are you talking about?
Graham Harman
Reminds me how hot Heather Graham is/was.
I'm assuming that this is the thread for free associative comments.
Anyway, philosophy has too many side-effects as presently practiced. We should switch to functional philosophy.
['Trying']
Ben, what the hell are you talking about?
Do you really want him to explain it?
We should switch to functional philosophy.
Too many parentheses.
Anyway, philosophy has too many side-effects as presently practiced. We should switch to functional philosophy.
Don't be a beautiful soul, Otto. This obsession with "purity" only leads to removal from the world of action entirely—the practical equivalent of skepticism in the theoretical realm. Sooner or later, we have to interact with our environments, for better or worse.
Aristotle understood this.
11: And if you omit the parentheses, in Haskell for instance, you end up obsessed with $.
We should switch to functional philosophy.
With a... hammer!
7: Ben, what the hell are you talking about?
Oh, just some small talk 'n'at.
15: I believe the relevant background reading for this conversation is Kant's Prolog to Any Future Metaphysics.
The philosophers have only interpreted the RealWelt in various ways; the point, however, is to change it.
17: But those with not much time on their hands can always try some turbo Pascal.
The requisite "Subjective-C" joke seems to have been made before, repeatedly.
I suppose the other thing philosophy has going for it is its extensive and mature standard library.
So are they dealing with polymorphism via dual inheritance, or through interfaces?
I'm not even sure that referring to monads is even a joke at this point. They are well suited for encapsulation, lord knows.
22: affordances, maybe.
Was this post designed to drive the lawyers off the site, leaving it to the philosophers and computer geeks? Well played.
I think "inheritance" is a red herring, and I assert that there's a philosophical middle-ground between "definite" types (what I was taught to call a monomorphic type system) and the anything-goes of duck-typing.
Surely there is room in our philosophies for a controlled form of polymorphism, Ben? Maybe a reasonable system of kinds, and beyond that some sorts, and then maybe ... well, maybe then we've reached the point where things must be passed-over silently.
(Half-pwned, on preview, by 22 I think.)
After the kinds and the sorts come the things and the you-knows.
I think I come down on the side of duck typing here. It's not like there's a philosophical compiler that's going to catch your errors prior to runtime anyway, so why bother with the overhead of definite types?
If we want to talk about real features of our philosophy-as-it-is-practiced, I think that "inheritance" is fundamentally least-interesting.
Can we make jokes about the admissible values in a system of "realism without materialism?" Does it have, for instance, some kind of first-class control value? Does it have continuations, Ben??
(Maybe that's what a footnote is, really.)
Shit, all I know is VBA. On the philosophy side of the analogy that's, what, Andy Rooney?
Since I have no idea what the substance (ho ho ho) of the paper actually was or what this fellow meant either by realism or by materialism, uh, go for it.
When it comes to technical details of continuations, I'm on uncertain grounds, so I'm not sure whether I want to call a footnote delimited or escape, but I think it's pretty clearly bounded by the footnote number in the main text.
Actually, in the case of endnotes, maybe what you have is a coroutine, bouncing back and forth between the main text and the endnote text, entering the latter at footnote number one and returning to it at the end of each numbered note.
entering the latter at footnote number one in the former and returning to the former, I should have said.
I vote for including garbage-collection.
As long as it's not done by reference-counting.
Maybe the right way of approaching the "what is the analogue of a continuation" question is to ask, what is the equivalent to call-with-current-continuation? (That is, what is the equivalent structure that lets you name a continuation -- the lambda, if you will, of control flow?) And now that I ask that, maybe the very act of publishing is like naming a continuation... and then references to other works -- the bibliography -- is the act of evaluating those continuations.
Or something.
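For anyone who wants the flavor without firing up a Scheme: CL has no first-class call/cc, but BLOCK and RETURN-FROM give you the escape-only kind, which may be all a footnote needs anyway. A toy sketch (the names are mine, nobody's method):

;; BLOCK names a point in the reading; RETURN-FROM invokes that
;; named continuation, abandoning the rest of the passage.
(defun skim (sentences)
  (block rest-of-the-reading
    (dolist (s sentences)
      (when (string= s "[1]")
        (return-from rest-of-the-reading :off-to-the-endnotes))
      (format t "~a " s))
    :read-straight-through))

;; (skim '("Harman" "was" "in" "Zagreb" "[1]" "lately"))
;; prints "Harman was in Zagreb " and returns :OFF-TO-THE-ENDNOTES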
Anyway, following two links from that blog, I get to this page, with a description of a book about Bruno Latour, in which I find that,
Part One covers four key works that display Latour's underrated contributions to metaphysics: Irreductions, Science in Action, We Have Never Been Modern, and Pandora's Hope. Harman contends that Latour is one of the central figures of contemporary philosophy, with a highly original ontology centered in four key concepts: actants, irreduction, translation, and alliance. In Part Two, Harman summarizes Latour's most important philosophical insights, including his status as the first "secular occasionalist." The problem of translation between entities is no longer solved by the fiat of God (Malebranche) or habit (Hume), but by local mediators. Working from his own "object-oriented" perspective, Harman also criticizes the Latourian focus on the relational character of actors at the expense of their cryptic autonomous reality.
I must own it.
Maybe the right way of approaching the "what is the analogue of a continuation" question is to ask … what is the equivalent structure that lets you name a continuation
d00d.
I've often thought that much of the jargon in philosophy (and other disciplines) might be made penetrable by proper use of Hungarian notation.
...might be made penetrable by proper use of Hungarian notation...
d00d.
Hungarian notation is to types what "significant digits" are to confidence intervals -- a weak substitute, for people who are unwilling to admit that they needed the stronger thing in the first place.
Give me my multiple-dispatching generic functions back!!!
You can still have them you know. I use them most days....
But it's not as if, in writing, you can do much better.
Is it considered good in philosophy for a paper to be dense, or is "brilliantly dense" meant to be read as sardonic?
In some parts of philosophy it is absolutely a good thing to be dense. Wittgensteinian philosophers think this way, as do a lot of continental philosophers. Mostly it happens when you aestheticize the activity of philosophy itself.
Mainstream analytic philosophers don't view density as a virtue in itself, but a sometimes unfortunate byproduct of rigor.
The weird thing about that description is that "brilliantly" occurs as an adverb. (Ok, "brilliantly" is an adverb—that brilliance occurs adverbially. Whatever.) It would be praise to call it brilliant and dense (I mean, I would call that praise). But to call it brilliantly dense, as if there were something particularly accomplished about the manner of its being dense—huh?
Yeah, I agree with nos. Sure, we all know that any number of disciplines seem to privilege opacity, but they don't usually call it out: "Fantastic! Was that paper ever dense!"
"Fantastic! Was that paper ever dense!"
Wittgensteinians say this all the time when talking about Wittgenstein's work. Certainly Wittgenstein thought it was a virtue of his own work, and the work of people he admired.
I was taking "dense" to mean both "thick with layers of meaning" and "hard to read and opaque to outsiders." Like hipster poetry.
45: Huh! Do you remember him from SJC? He was a year ahead.
Oh yeah, I remember him. I actually applied for a job at a university he was with at the time. He was on the screening committee. I didn't get the job.
He looks really different now.
Messieurs, les objets sont contre nous? ("Gentlemen, the objects are against us?")
No. Doesn't have a ring to it.
20: The requisite "Subjective-C" joke seems to have been made before, repeatedly.
Ah, but it seems no one has made the requisite "Philosophical-C" joke yet.
max
['Surely someone is resigned to doing that.']
Speaking of philosophy (of which I am more or less entirely ignorant): someplace that's else on the internets, I encountered someone claiming to be a philosopher (grad student? faculty? not sure) insisting that it is widely believed by philosophers to be impossible for the mind to be something emerging from physical processes happening in the brain, because of mumble jargon qualia mumble intentionality, or something. This seemed, to my mind, completely nutty (now, that is; it was a perfectly sensible thing to think hundreds of years ago). Is it true?
(Is it true that philosophers often make such claims, I mean. I have my own obvious belief in the truth value of such claims.)
50: "Widely believed" is pretty vague, but as far as I can tell, most mainstream philosophers of mind are materialists of one stripe or other.
There's a sizable minority who have the view your interlocutor describes, including some pretty smart people, so I'd hesitate to call it "completely nutty". But I think it's a minority view.
(On the other hand, I don't think that mathematical Platonism or Lewis's modal realism are completely nutty, so I may have high standards for what qualifies as nutty in philosophy.)
it is widely believed by philosophers to be impossible for the mind to be something emerging from physical processes happening in the brain
As Zadfrack said, it is not widely believed, but it is believed.
The overwhelming majority of secular, English-speaking philosophers believe in some form of supervenience, which is a more technical term than "emerging." One system (A) supervenes on another (B) if it is impossible for there to be a change in (A) without some change in (B). Most of the debate in secular English-speaking philosophy is about trying to figure out exactly how supervenience works.
A few dissenters are dissatisfied with all talk of supervenience. They tend to be general-purpose obscurantists--the mind is just something we can't explain--rather than dedicated Cartesian dualists, unless they have special religious motivation.
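(For the notation-minded, the usual possible-worlds gloss of that definition -- my paraphrase, skating over the global/local varieties: A supervenes on B iff for all worlds w and w', if w and w' are exactly alike in all B-respects, then they are exactly alike in all A-respects. No A-difference without a B-difference.)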
Ooh! I'm dissatisfied with supervenience!
most mainstream philosophers of mind are materialists of one stripe or other.
Indeed, there's so much consensus on this in the field that people who think otherwise are unlikely to join it.
49:
@interface Harman : Latour {
    id realism;
    id materialism;
}
+ (id)define:(id)philosophy;
+ (id)redefine:(id)philosophy;
- (void)lecture:(id)location;
@end
Meh... someone smarter than me could do a better job.
What 53 said. There are more non-materialists than many think (and some of them would agree with your statement that minds emerge from physical brain processes, while saying that what emerges is non-material). Here's what David Chalmers had to say about this a few years ago:
"One still sometimes sees the claim that almost everyone these days is a materialist (e.g. in Peter Carruthers' new book, p. 5: "Just about everyone now working in this area is an ontological physicalist, with the exception of Chalmers (1996) and perhaps a few others"). I don't think one can get away with saying this any more. Apart from the four counterexamples just mentioned, here are a few other contemporary anti-materialists about consciousness who come quickly to mind: Joseph Almog, Torin Alter, George Bealer, Laurence BonJour, Paul Boghossian, Tyler Burge, Tim Crane, John Foster, Brie Gertler, George Graham, W.D. Hart, Ted Honderich, Steven Horst, Saul Kripke, Harold Langsam, E.J. Lowe, Kirk Ludwig, Trenton Merricks, Martine Nida-Rumelin, Adam Pautz, David Pitt, Alvin Plantinga, Howard Robinson, William Robinson, Gregg Rosenberg, A.D. Smith, and Richard Swinburne. There are plenty of others, and then at least as many again agnostics. If I had to guess, I'd guess that the numbers within philosophy of mind are 50% materialist, 25% agnostic, 25% dualist."
As a lawyer with a background in both computer science and philosophy of mind, I feel like I should have something to add here, but I can't think of anything new().
It's not clear that Philosophy-C should be syntactically like Objective-C.
The only datatypes in Philosophy-C are pointers to void and booleans.
and some of them would agree with your statement that minds emerge from physical brain processes, while saying that what emerges is non-material
That seems like a completely reasonable claim to make, I think. There are a lot of things (language, society, cities) that I would say emerge from material stuff while not necessarily being material in themselves, depending on how one defines "material".
It's the stronger claim, that minds do not emerge from physical processes, that seems nutty to me.
Anyway, thanks, all, for the information.
61: It's not clear that it shouldn't be, either.
The only datatypes in Philosophy-C are pointers to void and booleans.
I'd replace booleans with just plain integers ("all else is the work of man"), and I'd insist that Philosophy-C should at least have function pointers -- to be honest, a "real" philosophy programming language would probably just mostly look like combinatory logic of some kind, but then I suppose we'd lose the "C" aspect of the joke...
I'd replace booleans with just plain integers
bignums, surely.
also, wouldn't it have symbols (which may be fbound), not pointers? Like integers, you're exposing a lot of the muddy underpinnings there. Surely philosophical-c would keep its dainty hands cleaner than that.
I'd replace booleans with just plain integers ("all else is the work of man")
But think of how the integers would be represented, and then reflect that all is atoms in the void.
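If we're going full Kronecker we don't even need integer primitives; Church numerals will do. In CL dress (a toy, obviously):

(defparameter zero
  (lambda (f) (declare (ignore f)) #'identity))

(defun succ (n)
  ;; the successor of n applies f one more time than n does
  (lambda (f) (lambda (x) (funcall f (funcall (funcall n f) x)))))

(defun church->int (n)
  (funcall (funcall n #'1+) 0))

;; (church->int (succ (succ zero))) => 2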
It just isn't a C-like language if you can't have formulations like "&**foo".
I suppose we'd lose the "C" aspect of the joke...
This might be unavoidable.
67 it's a problem. there will probably be a schism. One taking the name -see, I presume.
67 -- I'm confused, wouldn't "&**foo" just be equivalent to "*foo"?
70: I once had to work with (awful, no good, very bad) code where sometimes pointers were wrapped in a class called Link>, or something like that. And you might call a function which returned a pointer to a Link, and need to pass the result to another function which expected simply a pointer to SomeClass. Hence: &**. I doubt such things arise in better-designed code. (And, offhand, I can't imagine a C-specific case where you would ever need &*, so probably this is strictly a C++-ism.)
Err, Link<> and Link<SomeClass>, and I should preview.
What was the point of wrapping pointers in a Link class if you could just take the address of what you got by dereferencing the wrapper and then sling that around?
nosflow, it's terribly naive to think that everything you'd run into in production code would make a lot of sense...
73: This was a body of code maintained by hundreds of physicists, most of them not very competent programmers (not that I am either, for that matter, but by those standards I was), most of whom had no experience with any language but Fortran before deciding that this code should be in C++. It varied between poles of thinly-disguised-old-Fortran-code ("why is this variable named 'jxsvef3' instead of 'energyInCell'?") and overly-proud-to-have-learned-C++ ("oh, someone read about dynamic_cast<> in a book and wanted to mash it into their code whether it made sense or not! how cute!").
"why is this variable named 'jxsvef3' instead of 'energyInCell'?"
I've seen code very much like this...
I figured it was something auto_ptr-esque, which also seems to have the problem that you could extract a reference to the auto_pointed value, stash it somewhere else, and then have a problem when the auto_ptr leaves scope, but I guess the answer to that concern is "don't do that".
77: Right, it was very auto_ptr-esque. This code also had lots of "let's replicate the standard library, badly!" classes. It's possible that some of it predated reliable versions of the STL.
I know that there is nothing stopping me from still using a real language, except for management, and my need for filthy lucre.
maybe the very act of publishing is like naming a continuation... and then references to other works -- the bibliography -- is the act of evaluating those continuations.
It strikes me that only the first half of this is right, because we didn't stop to ask, what is the analogue of the runtime (or whatever I want to use there)? Evaluating the continuation is reading the item named in the continuation.
Building a library is storing the continuations in a data structure of some sort (depending on how your library is organized).
But this argument in favor of multiple dispatch seems to be at least the beginning of a case against inheritance and even the tyranny of definite types,
Hey, wait just a minute now...
and in favor of the free play of duck typing
Ack! He hath wounded me in the very heart! Oh, poor, poor nosflow, besotted with badly designed static type systems, he sees no alternative and thus becomes resigned to the oh-so-evitable ill fate of dynamic typing. Either that or he's just trying to sardonically highlight the incongruity and even fallaciousness of so much philosophical reasoning by sketching a quick reductio ad absurdum.
Lisp is so unfortunate. CL is a really nice language to be afflicted with dynamic typing. But CL is stuck in the past.
I read Kant's COPR once. What is the word for when you really love the Critique? "Copr-philia"?
I don't think "besotted" means what you think it means.
I've never used CL. These languages have I used enough to have some sort of vague clue about their type systems:
Python; C; C++; Scheme (unaugmented by any macrofied object system or CLOSish thing); Haskell; ML; Unlambda.
I have tinkered with perl scripts occasionally but have tried not to learn anything about it.
Anyway, what's so bad about dynamic typing?
Haskell and ML have good (static) type systems, though there's a lack of object-orientation there. (That isn't necessarily a terrible thing, depending on what you're doing.) Scala has a really interesting type system, though there's a bit of a lack of type inference for my taste.
The thing that's bad about dynamic typing, and mind you this is religious war territory, is that you lose the opportunity to let the compiler catch bugs for you. The downside is that you have to tell the compiler more about your code, which can be annoying. But more advanced languages have type inference to make it less annoying, sometimes much less. Refactoring can be a lot easier, since when you change the type of a method parameter, once you've fixed all the compile errors all the method calls are now guaranteed to succeed. When you remove a parameter, every single caller gets marked with an error. It's quite convenient.
IMHO the direction compilers and languages are headed is towards catching more and more errors at compile-time. At some point I think program verification, where you write a specification for the program as well as the code, and then prove (formally, rigorously) that it meets the spec, will start to come into play in mainstream languages. You already see a little bit of that with things like pre- and post-method conditions in some languages. (Nemerle? There's some Java extensions that I think do this.) And to do all of that, static typing a la ML is the way to go. So, static typing is the future.
I found ML's type system extremely annoying. I only used the language once, though, for a compilers class.
88: For what it's worth, I usually find the opposite approach more useful in practice, since there is nothing stopping a dynamic language's compiler doing sophisticated type inference if it's given some promises about types, and in research/exploratory code it's actually a pain in the ass often to have to have all of your ducks in a row before you can run things.
But CL is stuck in the past.
I'm of mixed opinion about this. On the one hand, it's dragging a lot of history with it. On the other, there are some things it does so well that I can't find anything newer that comes close. There are some really good ideas in CL.
I can easily imagine a compiler that could switch between varying levels of dynamic or static typing depending on your preference. SBCL isn't too far from that, though it isn't really what I had in mind.
But here's one of the problems with dynamic typing. SBCL lets you declare very high levels of optimization that are potentially unsafe if your code does bad things. So if you declare that optimization and pass the optimized method the wrong argument your program crashes hard. That sort of thing isn't possible in a statically typed language.
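Concretely, for the spectators (SBCL-flavored, typed from memory, so treat it as a sketch):

;; Promise the compiler the types, then ask for unsafe speed:
(declaim (ftype (function (fixnum fixnum) fixnum) add2))
(defun add2 (x y)
  (declare (optimize (speed 3) (safety 0)))
  (+ x y))

;; Where the declaration is visible, (add2 1 "two") draws a
;; compile-time warning; a bad call the compiler can't see is
;; simply trusted at (safety 0), and that's the hard crash.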
I found ML's type system extremely annoying.
If I understand correctly, you like Haskell better, right? Well, the type systems are extremely similar.
Right, I agree with you.
In practice though, something like sbcl (which I actually use quite often) lets me do what I need to, whereas Haskell for example doesn't. I could maybe work with OCaml, but I don't know it well enough. So that's one thing.
The other thing is that I find exploratory coding a lot quicker in a lisp, which is useful.
But to be fair, this: So if you declare that optimization and pass the optimized method the wrong argument your program crashes hard. That sort of thing isn't possible in a statically typed language.
Is not really much of an issue. Declaring this sort of ) safety 0) or whatever only makes sense in a very limited set of circumstances, when a known speedup will happen. In a lot of cases it's irrelevant, because it's performed the same sort of static type checking, at least locally.
Far more typically the lisp will catch the same error. It just may catch it at runtime (and maybe let you do something about it then, too, without stopping the run)
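What "let you do something about it then, too, without stopping the run" looks like, for the curious -- a toy sketch of the condition system, all names mine:

(define-condition bad-sample (error) ())

(defun process (x)
  ;; RESTART-CASE advertises a named way to continue past the error
  (restart-case (if (minusp x) (error 'bad-sample) (sqrt x))
    (use-zero () 0)))

;; A handler can pick that restart and the run carries on:
(handler-bind ((bad-sample (lambda (c)
                             (declare (ignore c))
                             (invoke-restart 'use-zero))))
  (mapcar #'process '(4 -1 9)))   ; => (2.0 0 3.0)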
Maybe I'm misremembering, but doesn't ML not have anything like Haskell's classes? It seemed as if one always had to invoke the FooToString function before being able to print a Foo, whereas in Haskell you can declare Foos an instance of Show.
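Tangentially, and since we're lisping anyway: the CL analogue is a method on the PRINT-OBJECT generic function. A sketch with a made-up class:

(defclass foo ()
  ((n :initarg :n :reader foo-n)))

(defmethod print-object ((f foo) stream)
  (format stream "#<FOO ~a>" (foo-n f)))

;; (format t "~a" (make-instance 'foo :n 3)) now prints #<FOO 3>
;; -- no FooToString at every call site.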
) safety 0)
This is what happens when you optimize too aggressively.
94 to 92
to be clear, I agree with you about the theoretical gain of static type systems, so long as we add the caveat that there is a development cost to this, even in the presence of good inference (i.e., i'm not just talking about extra typing)
beyond that, in actual use a system like sbcl's works very well. While you can choose to do stupid things with it, it doesn't actually encourage it.
95.1 is a surprising proliferation of negatives on neb's part.
94: Well, with Haskell or any of the MLs (including O'Caml) there's the confounding factor of them being pure functional languages, which is kind of awkward for a lot of people. I wonder how you would like Scala.
development cost to this
Clearly you are uninterested in investing in optimized developers.
95.1 is a surprising proliferation of negatives on neb's part.
Whatever do you mean, teacher tweety? It's perfectly clear, and accurately phrased.
Clearly you are uninterested in investing in optimized developers.
Not at all; in my experience, for many things, optimal development occurs with the more dynamic setting.... and that's where you'll find optimal developers. Or something.
ML has mutable variables, no? In fact I remember this coming up earlier on unfogged because I thought it didn't and was corrected. Isn't not having mutable state a big part of FP purity?
FP purity is one of those things that can be taken far enough to be silly
so long as we add the caveat that there is a development cost to this
Well, no, we can't. But there you're right back to the initial disagreement.
Although maybe you could give me a different reason why I think Python, Ruby, Groovy, etc. fall so far short of language perfection.
Now Unlambda. THERE's a pure language.
Are you looking for a reason rooted in your psychology or in facts about the languages?
Whichever is more enlightening, of course.
I wrote an Unlambda interpreter in Python, but I couldn't get it included in the unl distribution.
108: he's looking for the convergence of the referenced terrains.
Well, no, we can't. But there you're right back to the initial disagreement.
What I'm saying is that I agree with you about what static typing with a good inference engine offers, and in my experience there is a development cost relative to say, CL with a good inference engine.
I agree with you that there are potential issues with, say, sbcl, but I'm saying in practice and with a little care it isn't actually a problem.
So you can disagree with my premise if you'd like, but if you tell me there is no cost to it I'll tell you it doesn't match my experience.
This is also separable from whether or not a language implementation can actually generate the sort of code that you may need, which again pushes me into a fairly limited set of languages. So there is somewhat a selection bias, I'll agree. But it at least generalized to others I've known.
I mean, maybe part of the reason is that he harbors an irrational loathing of the Dutch. That could be enlightening, I suppose.
I've got lots of reasons why I think Python, Ruby, et all fall short too, for that matter. I don't know how much overlap they'd have with yours.
Oh, on the contrary. I'm enchanted by the Dutch.
You two can have it out by yourselves for a while while I listen to the dulcet tones of Pharoah Sanders.
I agree with you that there are potential issues with, say, sbcl, but I'm saying in practice and with a little care it isn't actually a problem.
I agree. But the point is with real static typing it could be just that little tiny bit better.
So you can disagree with my premise if you'd like,
No, it's fine.
but if you tell me there is no cost to it I'll tell you it doesn't match my experience.
And I'd say your experience is misleading you for some miscellaneous reason. (Like, maybe static languages haven't gotten advanced enough yet.) But I don't really feel like drawing this out. I don't understand very well my own reasons for liking static typing. For all I know if I spent a couple years coding Lisp full time (ha!) I'd come back to the dynamic side.
All I really know is that I used to love, love, love Lisp. Thought it was the best language out there, just had a few shortcomings in the free implementations that needed to be fixed. But my opinion changed over time, and I no longer favor Lisp, and I feel like a big part of that is the dynamic typing, because I really miss a lot of other features. Especially the condition system. Yum.
For all I know if I spent a couple years coding Lisp full time (ha!) I'd come back to the dynamic side.
Or something even weirder would happen.
Do philosophers of long nouns like to s5n them?
||
I was thinking about a shortest night walk in the rain,
but neB's research already identified
a nascent not-for-profit website that enables unusual encounters with the city through collectively and spontaneously organized, unguided, undirected ... That's with four thoughtful weeks of advance notice, 'spontaneously'. A July 12 march over the Hudson and back.
|>
I agree. But the point is with real static typing it could be just that little tiny bit better.
No! Because then I'd give up something useful. Now it may be true that the static languages just aren't good enough yet.
As far as lisp goes for me, it's a bit love/hate. There are very nice things like the condition system, yes. And annoying things. But it fundamentally comes down to this
1) I don't know of any other language that can touch it for what I'd call exploratory programming, and I do this a lot in research, and
2) there really are very few language/implementation combinations that can actually generate the sort of high performance numerical code I sometimes need. I'm just lucky that a few of them are nicer than c++
120: the internal organization of each walk is pretty darn spontaneous. As far as I understand, only the beginning time and position and ending time are organized in advance.
And fwiw, for large values of "practical" some lisps' inferencing is good enough (TM).
Now it may be true that the static -languages- IDEs just aren't good enough yet.
Or something even weirder would happen.
one can live in hope.
124: christ you people are crazy. Color me convinced by dynamic typing.
the internal organization of each walk is pretty darn spontaneous.
So would you say they're... random?
Like the bubbles in a California Champagne, Josh.
126: Don't look at me... AFAIC Python is the light and the way. (Spending the last month learning to maintain and modify a framework written entirely in Perl isn't doing much to disabuse me of that notion, either.)
Sifu just likes contradicting me.
124: meh. in many ways IDE's are starting to catch up to where Lisp machines & Smalltalk were decades ago (yes, in some ways that's not all that's going on, but it's a big part of it. The first "IDE's" were a huge step back) but you can't really get everything you want without good runtime introspection and other features of the language itself.
130: I do! And I know you like it, too.
AFAIC Python is the light and the way.
It really isn't. But it's pretty nice for what it is.
you don't even want Matlab 5's object system?
131: Yes, exactly: IDEs need to catch up to where they were thirty years ago. Along with OSes. Someone start hacking on Movitz. At least they didn't have iPods back then, too.
actually on the python front, I think that python + scilab has now gotten to the point that it's a pretty compelling replacement for many things that have been matlab's turf, which makes me happy.
part of me can't believe we're having this conversation
part of me isn't at all surprised, though.
I wonder how much of the unfoggedetariat we can completely bore, though.
It really isn't. But it's pretty nice for what it is.
You'll burn for that, heretic.
136: IDEs need to catch up to where they were thirty years ago
Every application, environment, or IDE grows until it can send e-mail?
137: yeah, agreed. And that seems to be happening in the field, too.
139: no more masturbating to the rest of the unfoggetariat.
And that seems to be happening in the field, too.
I'm going to push that where I can now, as a matlab replacement.
And I may even add some things to it, within reason. But there are some limitations that python brings to the table (sorry Josh). If I have to do something complicated and need performance, I'll still go elsewhere.
You need performance? Some academic you are.
You need performance? Some academic you are.
Yeah, I know -- sad, isn't it. Must have rubbed off on me in industry. But largish clusters have been known to cry when they hear me log on.
Demand bigger clusters!
Oh wait, recession.
STIMULUS CLUSTERS FOR SOUP!
I'd make that into a t-shirt, but I fear it would be somewhat obscure.
If I have to do something complicated and need performance, I'll still go elsewhere.
That's when you write an extension in C.
I wonder how much of the unfoggedetariat we can completely bore, though.
I find it reasonably entertaining, but I'm weird like that. (I don't have any very strong opinions myself; I use some mix of Fortran, Python, or Mathematica depending on what I'm doing and which pre-existing code I'm familiar with gets me closest.)
That's when you write an extension in C.
The problem is, this turns out to not be such a great idea.
I understand the appeal of the soft + hard layers idea, I really do. But when your hard layer kind of sucks and your soft layer inherits some badness from it, all is not sweetness and light.
Besides, if I still wanted to be writing numerical code in C or C++ I'd have been doing that.
You could always switch to a field where you use AutoDesk.
There's this fairly strong push in my community to rewrite the old Fortran workhorses in C++ to somehow "modernize" them, which I'm not entirely comfortable with.
149: Yeah, I know. I'm fortunate to work in an area where Python's limitations don't really matter, and as far as dynamically-typed interpreted languages go I think it has major advantages over its competitors. Then again, I don't particularly have an interest in working in areas where Python's limitations would matter (and on the occasions I do I thank god for Jython), so it all works out for me.
151: again, MATLAB!
Yay!
All together now: yay!
Fortran, Python, or Mathematica
This is mostly me these days. With some matlab and C in the mix too, for reasons of interoperability.
But if it's something standalone that I don't have to hand off to someone else and mathematica isn't really built for? Python isn't going to be high up my list unless it's basically bashing libraries around. It's really good for that, but kind of limited for things beyond that.
Dynamically typed interpreted languages are the pillow-topped mattress of programming languages.
Discuss!
and as far as dynamically-typed interpreted languages go I think it has major advantages over its competitors.
I can see this. I'm just used to things like lisp, which is more dynamic, more powerful, and generates good machine code too.
A bit of a walled garden, though, which is a downside.
There's this fairly strong push in my community to rewrite the old Fortran workhorses in C++ to somehow "modernize" them, which I'm not entirely comfortable with.
Yeah, I'm familiar with this push although I don't do physics anymore. I can sort of see where it's coming from, but I'm not sure it's such a good idea.
There's a similar but less global push in mathematics. Lots of old workhorse libraries around, but less push to "modernise", partially because there aren't as many big joint projects around, I suspect.
All together now: yay!
It's just such a terrible general-purpose language is all.
But damn, did it get linear algebra on double float arrays right.
Huh, 156 makes a lot of sense, and makes things clear that were not, previously, clear to me. I wonder what thread it belongs in?
139: I keep checking in to see whether anything intelligible has been said aaaand... nope.
158.last: and damn, does that get you a surprisingly long way, in certain realms.
161: You bet. And if you don't like mathworks, there is always octave, which is pretty damn good actually.
Further to 151, it was actually this sort of effort in C++ which led me to playing around with template meta-programming in C++, which led me to figuring out that's not such a great idea but someone has to have done it right, right? which led me to lisp. So there is that.
A bit of a walled garden
Requires a background in CS, though, which is a downside.
At some point I'll probably poke at Lisp again, but I don't have that background in CS and the learning curve is just too steep (and the marginal payoff too minimal) to make it worthwhile for everyday use.
Besides, the last paragraph of myth #6 on this list really nails Lisp for me.
Didn't need no IDEs. Every char'cter pulled its weight.
Gee, but APL ran great. Those were the days.
164.last would probably matter to me in industry. As it is, single (me) or small group efficiency is more important.
But I don't really disagree with it in a general sense, as being one of the reasons things like lisp are somewhat marginal. There are other effects of the same order too, though.
I know a couple of mathematicians who still swear by APL (or its descendants) fwiw. Now that is pretty marginalized.
Topical to 164.last, I worked at a startup founded (mostly) by a bunch of guys who worked at Symbolics in the early days. One of the high-profile investors had (no shit) a logic gate named for him.
The product? A half-assed web version of Lotus Notes. Written in C++.
Of the founders, the most currently successful is my old boss, who cut his teeth (read: made a fortune) writing software to make pie charts in DOS.
167: Yeah, it came up not too long ago in a discussion of whether there were any programming languages that are truly and completely dead. I was surprised to see there was still some energy.
The list Josh linked to is pretty great, actually.
How sweet of you all to save the MOST INSUFFERABLE CONVERSATION IMAGINABLE for when I drop by.
Personally, I prefer statically typed languages, 'cause they have way bigger cocks.
168. interesting.
I don't think the lisp/c division is very current, but if you modernize it a bit, say c#/java vs ocaml (or more realistically, php vs ruby) ... there is still a tension between technical and business issues that renders a lot of these technical arguments a bit irrelevant.
You do see some exceptions (e.g. orbitz), but the logic of plug compatible programmers is pretty strong, and the problems around the corporate infrastructure around a piece of software often dominate the problems inside it ... so there you are.
171: CONSTPIPE PTRPLATE WE HEART U
We saved it up special for you, Standpipe Bridgeplate!
One of the high-profile investors had (no shit) a logic gate named for him.
Fredkin? Was his investment reversible, too?
Was his investment reversible,
Aren't they all?
175: I was sort of hoping nobody would make that connection. Luckily, he has more money than And, Nor, and Xor put together.
deep, ah, heartfelt apologies that my procrastination was insufferable, but i'm going to bed so you may hope it all dies out...
For clarity, the first and second sentences of 177 refer to the first and second sentences of 175, and not to each other. If only there was some language that could make this explicit.
I was sort of hoping nobody would make that connection explicit
Luckily, he has more money than And, Nor, and Xor put together.
I think Feynman has a gate, too, but he didn't strike me as the venture capitalist type.
180: thank you for making that reference to an explicit connection explicit, yes.
I'll tell ya, tho, Fredkin has basically the greatest stories of anybody ever by like a lot. Did you know he was a Navy test pilot?
Sorry, Air Force. This was before the Navy had airplanes, possibly.
How Navy test pilot was he?
Also, he had the fascinating quality (which I've never observed in anybody else, including some very rich-ass fuckers) of not even noticing he'd gotten a parking ticket, resulting (more than once) in surprised junior employees sent on missions to impound lots to free various hyper-expensive vaguely salt-damaged cars.
184: It's like you've never seen Baa Baa Black Sheep.
I wonder if I can tell the story he told? It's my best story, and it didn't remotely even happen to me.
"Symbolics, Inc: A failure of heterogeneous engineering". Dan Weinreb's comments on same and a couple of other of his posts on Symbolics.
The claim that "the proliferation of LISP machines may have exceeded the proliferation of LISP programmers" is hyperbole. It's not true that nobody thought about a broader market than the researchers; rather, we intended to sell to value-added resellers (VAR's) and original equipment manufacturers (OEM's). The phrase "VARs and OEMs" was practically a mantra. Unfortunately, we only managed to do it once (ICAD).
A surprising amount of Not Really Getting It is embedded in that paragraph.
And apparently MIT has now (this may have been posted here before) phased out SICP and gone from Scheme to Python for the freshman curriculum.
190: That's a good quote.
I've read 189's links before, pretty interesting stuff. I think there are a lot of reasons behind that failure, some of them probably generalize to large-scale adoption of lisps today. If you're in the right niche though, they can be handy.
191: Yeah, it was interesting to read Sussman's take on that. It really wasn't a python vs. scheme decision, it seems, much like SICP wasn't really a scheme book.
erm, sorry: what Weinreb said about it, following Sussman's comments.
191: huh, I'd heard they were moving to Haskell.
195: Sussman's comment on why Python:
"And why Python, then? Well, said Sussman, it probably just had a library already implemented for the robotics interface, that was all."
According to someone else at that conference
anyway, rebuild finished so really gone now.
Must sleep, but Symbolics Lisp Machines were where I basically did the last of any semi-serious programming. Spoiled me rotten, even just stuff like the distributed logical file system.
MacIvory for the win! ...well, not quite.
The intro CS class I just started is all in C++.
For a taste of the cryptic oddness that was/is APL, here is what Wikipedia tells me is the code for taking a boolean matrix and returning the next iteration of Conway's Game of Life: (Linking the image due to inevitable character lossage).
It's been so long since I've done programming, the only other languages I've used are pascal, logo (with and without using the turtle) and a bit of basic.
||
Well, I devoutly hope nobody was masturbating to him in the first place.
|>
C++ can provide many new and interesting ways for your programs not to work. But you'll be tough!
lisp/c division is very current, but if you modernize it a bit, say c#/java vs ocaml (or more realistically, php vs ruby)
Wait, which of php and ruby is lisp here, and which is c?
I can probably guess, but neither of the two is very like ocaml.
Requires a background in CS, though, which is a downside.
I am not a lisp programmer, but I wonder how true this is.
203: Turns out that over here he died yesterday afternoon. Time zones can be weird.
205.last: You are right to wonder; it certainly does not.
I'm glad I'm taking the course mostly for personal enrichment. I'm going to the lectures and discussions and doing the assignments and the exams, but I get to not care all that much about the grade since it's not part of any degree program for me and I'm not looking to transfer credits. My only hope is that I pick up stuff that will help later - whether it's specific to C++ or general to programming - if I have some professional reason to do a little programming.
HA!
I will now bitch about something I noticed here when I was poking around PLT's webbage a while ago, mostly reading about their webserver.
First! After introducing foldl they say "Despite its generality, foldl is not as popular as the other functions. One reason is that map, ormap, andmap, and filter cover the most common kinds of list loops." It is probably not really necessary right here to point out that maps and filters can be written as folds, but it might be nice to point this out when they go on to give examples of iterations in the form of the my-map and my-length functions.
Then! In the "recursion versus iteration" section they say "a Scheme programmer would more likely just write the following:" and then give a hand-recursing function that is only partially tail-recursive, even though it is not difficult to write a fully tail-recursive function that does the same thing … and that function would just be a specialized fold, so you may as well use foldl!!!!!!one!
It probably doesn't matter but I thought it was weird.
Sorry I didn't read through the comments but is anyone else cracking up over 'brilliantly dense'?
Is that for real? Are they really that excited about a dense paper? Damn, what kind of philosophy is this? I can't be dense! I'm really good at that?
Also, I can contradict myself. That's another skill I have. Is there a place for me at this table?
Does "dense" in American have the same colloquial meaning as in Brit: = "thick", as in extremely stupid, but a little politer than "cretinous"?
Old fashioned teacher: "Boy, your work is appalling! I have never encountered a pupil as dense as you!"
212 me. Dunno why it didn't pick up my name
212 -- Yes.
Although, Americans don't know what the word "cretinous" means, so it's not really rude. We just are cretinous.
Nosflow, you're missing the point there. The point is that we don't particularly care about tail recursion vs non-tail recursion because the stack isn't normally a limiting factor. So use whichever is better for the situation. Writing remove-dups using fold would be a minor pain in the arse as you have to pass around the previous item. In this case I think not using fold is warranted.
Writing remove-dups using fold would be a minor pain in the arse as you have to pass around the previous item
It may be true that the stack isn't a limiting factor (especially in a case like this) but writing it with a fold is pretty trivial; the function to pass is something like (lambda (e acc) (if (and (pair? acc) (equal? e (car acc))) acc (cons e acc))). (Doing it this way means you have to reverse it at the end, but that is also a fold.) It just seems odd to introduce tail-recursion this way.
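fwiw, the whole thing in CL dress (REDUCE with :initial-value is the foldl here; a sketch, untested):

(defun remove-dups (lst)
  ;; left fold, building the answer backwards; one reverse at the end
  (reverse
   (reduce (lambda (acc e)
             (if (and acc (equal e (car acc)))
                 acc
                 (cons e acc)))
           lst
           :initial-value '())))

;; (remove-dups '(1 1 2 2 3 1)) => (1 2 3 1)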
211, 212: Yes, "dense" means stupid. However, it also means "packed thick with stuff," or having a high density. If that stuff is ideas, as one would expect in a philosophical paper, then "dense" could be complimentary - there's a lot to think about in a relatively short paper.
I'd normally assume "dense" is an insult when used to describe a paper, but I can imagine someone meaning it as a double entendre or backhanded compliment.
Gold and platinum are brilliantly dense.
218: I don't think so. They are dense and they can be brilliant, but they aren't dense in a brilliant way. What would that even mean? It's grammatically correct, but the words don't add up to an idea. "Colorless green ideas sleep furiously," to quote a very controversial expert in this field.
219, 220: The Ogged is strong with this one.
219: You're right, your densely brilliant argument has convinced me. (I did forget that point, which neb had raised way, way upthread in 43.)
221: heh, thanks, but if it were true I would have just pointed him to 43.
Quick comment on Harman's writing and his philosophy from one who has read two of his books though not the article in question:
1. Harman's writing is dense in the sense that he packs a lot of ideas into a short amount of space. It is decidedly not dense in the jargon-heavy manner favored by so many philosophers today; rather, it is engaging and sometimes funny.
2. Nosflow, I know you have some interest in Heidegger and I think that you would find Harman's first book, Tool-Being (hello low hanging fruit!), an excellent read. It centers around an excellent analysis of the tool structure in Being and Time, which he uses to destabilize the Dasein-centric nature of Being and Time as well as the consciousness centric nature of phenomenology.