I just realized that isalpha() is unnecessary since isupper() will be false for anything for which isalpha() is false, but look, it's late.
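For whatever a late-night intuition is worth, it does check out over ASCII; a quick Python sanity pass (full Unicode has cased characters outside the Letter categories, so the claim is only made for ASCII here):

```python
import string

# Check that isupper() implies isalpha() for every printable ASCII
# character, i.e. an isalpha() guard in front of isupper() is
# redundant there. (Some Unicode characters, e.g. certain cased
# numerals, can break this outside ASCII.)
for ch in string.printable:
    if ch.isupper():
        assert ch.isalpha(), f"counterexample: {ch!r}"

print("no counterexamples in printable ASCII")
```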
Zope is worth looking at, as a none-more-objecty framework built on Python.
I found it really easy and quick to develop in [although a tad slow performance-wise compared to traditional PHP/MySQL]. In Zope pretty much everything is an object, too.
Zope is worth looking at
ZopeDB kind of gives me the willies, though...
re: 3
Yeah, there is a bit of that. It's been a while, but iirc, we ended up buying mxODBC for zope and using that.
re: 4
For external data, I mean. Not the internal zope database stuff.
You know, there haven't been enough swimming posts around here lately.
Tangentially relevant peeve: Anyone who thinks OOP models how actual objects behave in the real world is insane.
re: 7
I have to admit to employing cargo-cult* OOP techniques when I employ them at all.
* or whatever the word is for doing something while not completely understanding why/how
At last, a programming thread! Look Ben, if you're going to bitch that functions aren't first class, man up and use a real language. PLT Scheme is a good 'un. I don't much favour O'Caml, but some do. Haskell is elegant. Scala has JVM integration.
man up and use a real language.
Here we go. My vote goes to APL.
I don't much favour O'Caml
Anti-Irish bigotry is everywhere.
I do approve of Ben's swipes at the Ruby community. I hate their banging on about elegance when the language has this kind of fundamental crap in it.
I imagine it's like high-school kids who've never been to a decent restaurant chowing down at some local place where the waiters wear jackets and saying "You know, this place is so classy."
Real men program in the Turing-complete meta-language hidden in C++'s template system.
I think the main thing to bear in mind in these and similar debates is that real men do not in fact program anything at all, or ever use text editors, and so on.
Oh, I know - let's have our module just arbitrarily add methods to the Object superclass*! No need to worry about namespace collisions - after all, who else would ever think of doing that?
* at runtime! w00!
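For the record, a toy Python sketch of the collision (names hypothetical; Python won't let you patch the built-in object, so a stand-in class plays the victim):

```python
class Base:
    """Stand-in for a shared superclass everyone can see."""

# Library A helpfully bolts a convenience method on at runtime...
def patch_from_library_a():
    Base.describe = lambda self: "library A's describe"

# ...and library B, never having heard of A, picks the same name.
def patch_from_library_b():
    Base.describe = lambda self: "library B's describe"

patch_from_library_a()
patch_from_library_b()  # silently clobbers A's version

print(Base().describe())  # whoever patched last wins, no warning
```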
Also: Why The Lucky Stiff.
Ruby=Whimsy-oriented programming.
There's too many languages these days. And each language has a whole bunch of frameworks, which turn into meta-languages on top of languages. How the hell am I supposed to keep up?
This post is fine as far as it goes, but my experience has always been that when ratcheting up the whizzinwham, one must always transload the delaxifier so that it may pass the gewürztraminer subroutines directly to the central scrutinizer. After all, 'twas brillig, and the slithy toves did gyre and gimble in the wabe.
That's stupid, apo. Delaxifiers haven't needed transloading since the '80s.
There are lots of things to bash about ruby, true. Still, as someone who uses it every day, there's a lot to like. It has its foibles, but then, what doesn't?
19: Relax and watch the blinkenlights.
20: Oh. Well, I'm not really a techie.
Haskell is perfection itself.
Unless you want to do I/O.
Is this post like throwing sand on a fire that's burning too hot? A comment-retardant?
9: 7: who the heck believes that?
Idiots. And people who write intro to OOP books.
Idiots. And people who write intro to OOP books.
And Comp. Sci and Software Eng. Ph.D's.
And Comp. Sci and Software Eng. Ph.D's.
Idiot savant is a subclass of Idiot.
Idiot savant is a subclass of Idiot.
I actually had a sentence saying something like that in my comment before I posted it, but I thought maybe I should be a bit tactful.
18: There's too many languages these days.
Because the overwhelming majority of programming languages are so pointlessly user-hostile that it's easier to write your own than to learn someone else's. In doing so you embed your own irrational prejudices and preferences, ensuring that the process continues.
31: I see. I, on the other hand, am trying to provoke a programming language flamewar. I harbor great hostility towards the CS/SE establishment, mostly because just about every time I change jobs I need to learn a new programming language in order to do relatively simple things I already know how to do quite well in a language that is not supported at my new job. This means I have poorly informed opinions backed by deep and passionate emotions. Perfect for flamewars, IOW.
just about every time I change jobs I need to learn a new programming language
Pffft! Half a day's work, at most... ;)
Lisp sucks. SML kind of sucks, but type inference is cool. But really, the central problem is that more languages don't have built in gcc-xml based (the technique, not necessarily that implementation) support for FFIs, and better support for exporting to C-compatible modules. If that were the case, 50% of all this inter-language flamewar shit would just go away.
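Python's ctypes is roughly the built-in-FFI story being asked for here; a minimal sketch, assuming a Unix-ish system where the C library can be found:

```python
import ctypes
import ctypes.util

# Load the C library and call into it directly: no wrapper module
# to write, compile, or distribute.
libc = ctypes.CDLL(ctypes.util.find_library("c") or None)

libc.abs.restype = ctypes.c_int
assert libc.abs(-42) == 42

libc.strlen.restype = ctypes.c_size_t
assert libc.strlen(b"hello") == 5

print("called libc without writing any C")
```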
Anyway, whatever Ruby's flaws, at least it's not VB6.
For most of the time I was using VB6 in the day job, my default "postcode" value for testing address form fields was VB6 5UX...
And in 35, by Lisp I mean Common Lisp.
Heh. My company's just about finished porting all of our old VB6 code to C#.
"Learning" a new language is like learning the fundamental axioms or theorems of a new (to you) field of math. Actually learning it involves learning all the libraries, and is comparable to internalizing the theorems of the math field and applying them.
Actually, C#'s not all that bad. If you consider it in the light of a language intended to support a high level of IDE/compiler intelligence and the CLR type/assembly system as solving the big problems with COM, I think they did a great job with it. I mean, sometimes it's still not as nice as a "real" language, but how many "real" languages have IDEs as good as Visual Studio?
I read the homepage on Ruby at one point, curious about all the hype, and decided it was Just Another Perl/PHP/Python. I'm tired of languages pulled out of people's asses. Anybody who designs a language who doesn't know what the Lambda Cube is isn't getting my attention. Which is not to say the language can't be pragmatic.
41. Oh, I like C# - I see it as a better Java, in most respects. It's just that VB6 translated to C# is generally sucky C#, just on account of VB6's intrinsic suckiness.
I'd like to use Coq, but I hear extraction isn't very effective.
Java as a language is fine. It's all the crappy frameworks built on top of Java that are the problem.
44: Actually, I'm writing my own proof assistant, because I think the formal foundations of Coq and Isabelle are too opaque. I don't know when I'll get around to supporting extraction. Right now I'm reading an intro paper on unification. Fun stuff.
But when you're trying to just get something done, why bother with extraction?
Oh, and, bring on the Coq jokes.
45: I like the CLR's (C# runtime) handling of generics better, and the whole "integers aren't really objects" thing is kind of weird, and handled better in the CLR.
43: Yeah, I know exactly what you mean. VB6 is a low-level language in many respects, so the programs require a bunch of refactoring to bring them up to par. Which can be fun, as long as there's not a shitload of code to refactor.
I spend most of my working day writing Perl. You people bitching about Ruby's lack of elegance have nothing to complain about.
Perl is elegant. You show me another serious language that packs more inelegance into each square inch.
OOP is a fabulous argument for extending the analogy ban to programming.
Perl reminds me of the street layout in Boston, in that both are so nonsensical it is impossible to learn them systematically. The only way to know your way around the language or the city, respectively, is to have the whole unwieldy structure memorized.
I'll grant that the Calculus of (Inductive) Constructions isn't the easiest formalism to understand, but Isabelle? What's opaque about intuitionistic higher-order logic? You got your foralls, your implies, and your equals, and that's it.
My belief is that Perl is in fact the projection into our limited three dimensions of the unspeakably chaotic and blasphemous language of the Great Old Ones. Choosing the affable Baptist pastor Larry Wall as its avatar in our dimension is their idea of a joke.
I can't manage to get too worked up about a language's structural elegance. Partly this is because I've been developing web applications for so long (and in a PHP-based framework for the past couple of years) that I'm no longer much of a real programmer.
But I also just don't care. I mean, I can appreciate the surprisingly beautiful abstraction lying within Javascript, and the horror of Ruby monkeypatching. But these sorts of things won't stop a competent programmer from designing good systems, or a bad one from creating horrible rats' nests (can you tell I've made my peace with PHP?).
Ruby's not that lovely in the morning, but it looks pretty good when you're out at the bar. And WOW that hpricot gem does strange things to me...
With all that said, I'm currently trying to get more serious about Python (partly thanks to Ben's example). My only real beef with it is the need to compile regular expressions into objects before using 'em.
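For what it's worth, pre-compiling is optional: the module-level re functions take the pattern as a plain string and cache the compiled object behind the scenes, so compile() is only for when you want to hold onto the object explicitly. A quick sketch:

```python
import re

text = "cellar door"

# Module-level call: pattern passed as a plain string.
assert re.findall(r"[aeiou]+", text) == ["e", "a", "oo"]

# Equivalent compiled form; re caches compiled patterns internally,
# so the one-liner above isn't even slow inside a loop.
vowels = re.compile(r"[aeiou]+")
assert vowels.findall(text) == ["e", "a", "oo"]
```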
Oh, I know - let's have our module just arbitrarily add methods to the Object superclass*
Isn't this acceptable Smalltalk, too?
60: Yes, and somehow those guys manage to keep it together. But the Smalltalk community's pretty small, and fairly self-selecting, so the potential for mayhem is much reduced.
To be fair, Mocha (Ruby mock objects framework I was playing with the other day) exploits precisely this capability to beneficial effect. It just gives me the heebie-jeebies, is all.
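unittest.mock is roughly the Python analogue, and it shows the tamer form of the trick: the patch is scoped to a with-block and undone on exit (the class here is hypothetical):

```python
from unittest import mock

class PaymentGateway:
    """Hypothetical class under test."""
    def charge(self, amount):
        raise RuntimeError("would hit the network")

def checkout(gateway):
    return gateway.charge(42)

gw = PaymentGateway()

# Inside the block, charge() is swapped out at runtime...
with mock.patch.object(PaymentGateway, "charge", return_value="ok"):
    assert checkout(gw) == "ok"

# ...and outside it, the real method is quietly restored.
try:
    checkout(gw)
except RuntimeError:
    print("original charge() restored")
```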
Lisp sucks ... .And in 35, by Lisp I mean Common Lisp.
Perhaps the most wrong thing ever written on this blog, which is saying something.
More on topic: Both python and ruby are half-assed designs. It's only in comparing them to each other (or perl) that this is plausibly confusing.
56: Well, to understand what goes on when evaluating a theory file, you have to understand, in addition to HOL, at least unification, which is a fairly difficult concept. But really that's not so much a beef as the fact that neither Isabelle nor Coq will give you a step-by-step list of axiomatic transformations that make up a proof. They don't tell you what they're doing, so if you don't understand something, you're SOL. A proof in a program like that should be self-documenting. I imagine that once I thoroughly understand unification and a bunch of other abstract nonsense I could modify either system (probably Coq more easily, since it's constructive, whatever that means) to be more clear. But until that point I'm doing my own thing.
Oh, and the Coq tutorial sucks.
Hey wait, 52 is a runner-up to 35 for wrongness.
pdf23ds is on a roll today.
But python has a half-assed design to which I'm accustomed, soup.
I started learning Python a while back, and I find its object system pretty horrifying. I mean, there are pluses about its lax handling of interfaces and such, but what's with all the weird underscores and special names? Python is a pretty language built on top of an ugly base.
While it may be possible to write good systems in Perl, there seems to be something about the language that attracts people with a certain set of bad programming traits. I'm not sure exactly what the mechanism is, but it's certainly there.
Python, for all of its half-assedness, seems to have hit a pretty good sweet spot between performance and elegance/sketchiness.
66: Yeah, that's fair enough. You can get useful work done in any of those. I'm no purist who's going to insist you should spend your time learning other languages `nobody' uses, instead of getting things done. You're going to be more productive in something you know, for many things.
On the other hand, it's nice to know when weird but idiomatic code in language X is there to work around a design flaw, and understand what else is out there.
52 is saying the same thing as 57, but 57 says it much, much better.
I would respond to 63, but I need to get some work done today.
seems to have hit a pretty good sweet spot between performance and elegance/sketchiness.
It really hasn't though, because the only way to get decent performance out of it for a lot of things is to drop to C. Which means coding in C, not python, and a C-centric way of looking at things. There's nothing wrong with this of course, but it's nowhere near the sweet spot for these trade-offs. On the upside though, it's easy for procedural, C-family programmers to pick up, and has inherited a bunch of nice ideas (repl, some introspection, etc.) that make it much more productive than if you were writing the whole thing in C.
69: I actually tend to think I should spend my Copious Free Time learning the esoteric languages nobody (except Dominic, I guess) uses, because, purity! and they're interesting!. But I don't actually need to be productive.
Wow, a whole thread speaking in tongues. Sarah Palin, call your office.
71: It's ok to be wrong, pdf23ds.
Don't misunderstand me --- there's a lot of things I wouldn't do in lisp, but there's some things it's perfect for. Many of the problems it has aren't technical, to boot. For certain types of research code, it's the best language I know of, period.
72: Type safety? Garbage collection? Those are the two biggies in my book.
It really hasn't though, because the only way to get decent performance out of it for a lot of things is to drop to C.
Define "a lot of things" and "decent performance". We have quite acceptable performance in a more-or-less pure python app using a C web server to serve static content, and two or three C modules to handle particular small hotspots (one particular encryption class comes to mind, but there may be a few others).
We're getting to the point where some of the standard library modules have serious performance bugs, but once you're monkeypatching the standard library to get better performance, having to patch a bit of C code isn't really any worse.
76: yeah, I folded that into "etc". For productivity, having a repl (rather than edit-compile-run) and GC are probably the biggest. Decent typing (and there are different acceptable ways to do this) is a big win design- and stability-wise.
I will show this thread to my boyfriend the next time he starts in about what a nerd I am.
I was wrong again. A geek thread around here is like gasoline on the fire. America is doomed.
While it may be possible to write good systems in Perl, there seems to be something about the language that attracts people with a certain set of bad programming traits.
It's called "a shallow learning curve." Back in the late 90s, people (like me) without much formal programming training who needed to write quick-and-dirty web apps wrote in Perl, with predictably bad code as the result. (Not that my code is bad - not anymore - but I'm an exception.)
Nowadays PHP seems to be filling that niche, so you're seeing bad PHP code in all the places where you used to see bad Perl code.
75: Short version: I programmed in CL for two years before making that decision, and I still think there are lots of things CL does that no language I've seen does quite as well, but OTOH the same is true for other languages and there are some important ways in which CL is *in*elegant that in the end, (along with the lack of libraries and a good IDE,) pushed me away.
See, Unfogged is a leading indicator. Ten years from now everyone will be saying this shit. This is as bad as Harry Potter.
78: By "type safety" mostly I mean "no access violation core dumps". Exceptions are nice too.
77: yeah, but I do a lot of numerical codes etc. where native python is hopeless. If someone has already written the c or asm code for you, fine, but it's a kludgy way to mix expressiveness and speed. But it is nice that it gives you easy access to lots and lots of c libraries etc.
There are languages that are more expressive and better designed than python, whose native compiled code is pretty close to C. So these are much closer to the `sweet spot' you're speaking of. However, you do typically trade off available standard libraries, so it's not a clear win.
81: Perl has a shallow learning curve? Compared to Intercal, I suppose.
74, 79, 80: I'm with you guys. Good grief.
I know it's rude to complain here in this thread, where y'all are happily geeking out, but according to the sidebar, this is the only game in town.
I have an amusing anecdote about my Pascal class, if anyone wants to hear it.
Wow, a whole thread speaking in tongues.
I was just performing for the rest of the congregation so they wouldn't think I was unworthy to be filled with the spirit.
(Not that my code is bad - not anymore - but I'm an exception.)
Sure you are, honey.
zadfrack,
I'm seeing it in Java. Not the shallow learning curve per se, but I think the big influx of cheap new programmers in you-know-where.
Also there is a cultural thing where the English verb-subject convention is giving way to the subject-verb convention.
I'm even seeing if - then - else giving way to then-else-if. Not in java but ant.
It sucks when you no longer rule the world.
Exceptions are nice too.
conditions are even better ....
"no access violation core dumps" is a start, but I'd add dynamic/strong or static/strong with a decent inference system.
85: See, that's what the rest of my 35 was about.
English verb-subject convention?
Perl has a shallow learning curve?
You can get Perl to do stuff easily. It will just be spaghetti. To do stuff "right" in Perl is very hard.
neither Isabelle nor Coq will give you a step-by-step list of axiomatic transformations
Rather, in Coq, "Show Proof" will do what you want, and in Isabelle, it's "ProofSyntax.print_proof_of". Assuming that by "axiomatic transformations" you mean the more general "applications of inference rules in a natural-deduction style". You don't want a list of axiomatic transformations, because you wouldn't be able to follow it at such a low level of abstraction.
Languages that have a shallow learning curve and attract newbies are great - getting started in programming is a very noble cause. Perl is worse than that - experienced programmers who truly love Perl have always turned out to be unmitigated disasters at places I've worked. Small-ish sample size so it's clearly not a universal law, but it seems like a clear trend.
I assume that Perl must have some language features that make it really easy to do things that are in fact horrible ideas.
91: Yes, we all want ponies. How about SML.NET? No support for REPL-style programming. F#? I wonder how they support conditions in the CLR.
I once had a job where I had to program in SAS. I honestly think I would prefer to use Cobol.
Once you get over the irreducible kernel of suck (they should rename it LIFP, for Lots of Irritating Fucking Parentheses), Lisp is the best language.
You could find out whether that's true by learning Perl yourself—or would the process so disorder you that you wouldn't be able to tell?
Truly a thriller in the making.
This is awesome. All this time spent building an Unfogged persona that is urbane, cultured, and witty. Then along comes a programming language thread and *poof* it's all gone.
85: Yeah, numerical stuff in native python is pretty hopeless. People say that numpy or Numerical python are decent, but I have no direct experience. For your typical string manipulation and lots of conditional logic stuff Python does pretty well.
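numpy does hit that case reasonably well, for what it's worth; the win is that the loop runs in compiled code instead of the interpreter. A sketch (speedup varies by machine, so only correctness is checked here):

```python
import numpy as np

xs = list(range(100_000))
ys = list(range(100_000))

# Pure python: one interpreter round trip per element.
dot_py = sum(x * y for x, y in zip(xs, ys))

# numpy: a single call into compiled code; typically one to two
# orders of magnitude faster at this size.
dot_np = np.dot(np.array(xs, dtype=np.int64), np.array(ys, dtype=np.int64))

assert dot_py == int(dot_np)
print("same answer, very different constant factors")
```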
or would the process so disorder you that you wouldn't be able to tell?
That's what I'm afraid of. I don't want to risk it.
You played your part in this, Breeze.
I assume that Perl must have some language features that make it really easy to do things that are in fact horrible ideas.
I think that the problem is that Perl has horrible default behavior for writing anything bigger than the equivalent of a large awk script. You can change them and do things better, but you have to fight the language to do it.
95: I'm familiar with both of those commands, and neither does what I want, AFAIK. And yes, of course if it were just a series of axiomatic transformations it would be unintelligible. But with the proper presentation it could be made quite intelligible. See, for example, this.
All this time spent building an Unfogged persona that is urbane, cultured, and witty. Then along comes a programming language thread and *poof* it's all gone.
Quick! Make fun of NASCAR, and all will be forgiven.
Perl has a shallow learning curve?
Shallow, but long.
You can learn enough Perl to get something useful done - a quick web form or some Unix-administration task - pretty quickly. The same goes for Python, PHP, or Ruby, but those were not as mature as Perl at the beginning of the tech boom.
I'm not going to defend all the bad Perl code out there. I've had to maintain plenty of it. But it's not the fault of the language itself, except insofar as it made it easy for the untrained to program, and that's not an entirely bad thing.
The link in 105 was not meant to be an example of the penultimate sentence, but instead of the first sentence.
82: Well, short version of my opposite experience. I spent ages doing high performance asm/C/C++ and it's a pain in the ass, but doable. If you're working alone, it takes forever to get a lot of things done, as these languages really kind of suck for a lot of non performance-critical stuff, but you end up having to do that also. I played with template metaprogramming in C++ to get away from some of the problems before figuring out that it's just a terrible idea. Lisp, on the other hand, is actually well designed to do that sort of abstraction (much more so than any other language I know of. Which has its downsides, too).
There are design problems & tradeoffs with common lisp, but far fewer than comparably general purpose language specifications (C++/C#/Java come to mind). Which doesn't mean it's perfect, anything but. In some ways it's both too powerful/expressive and too undersupported (at least in free variants) to make it practical for a) large groups or b) quickly gluing things together, compared to languages like, say, python.
But all of the more elegant languages I know of are even more marginalized than common lisp, with fewer libraries, smaller community, and often unacceptable performance (for me, clearly not for everyone). On the other hand, all of the more practical languages are far less expressive, and often unacceptably slow. Hybrids like python get somewhat closer, but their only advantage over lisp is community size and 3rd-party libraries, really --- so you'd evaluate that project by project. Much of the stuff I code has never been done before, so libraries don't help me and I have limited UI requirements. This tradeoff would be quite different for different people.
(along with the lack of libraries and a good IDE,)
This is actually only partially true; consider ACL. But I understand what you mean.
So shorter response: `sucks' is laughable, but I can understand why you wouldn't want to use it for particular things.
Practically speaking, an awful lot of the market presence of languages is decided by the availability of large numbers of mediocre programmers. For whatever reason, the industry has decided more interchangeable people and high loc counts is the way to go. Which means large language communities skew to languages that can easily absorb large numbers of semi- and unskilled programmers, and have them up and doing something reasonably fast. Which in practice seems to mean lousy designs. This is an observation, not a complaint.
See, that's what the rest of my 35 was about.
Yes, I understand that. It's only a problem in some contexts, though. And honestly, a lot of the stuff out there may be available but it's crap. So more tradeoffs. Many language implementations have decent FFIs.
I presume this comment took long enough to type that it's all been pwned.
work now.
zadfrack, you are Perl's most tragic victim.
I periodically write Perl scripts to do AWK type tasks, and I always find that I have forgotten it almost entirely. I remember the angle brackets and the dollar signs, and that's it.
I assume that Perl must have some language features that make it really easy to do things that are in fact horrible ideas.
Oh, come on. It's not like Perl makes programmers not put comments in their code, or name their variables poorly. We do those things for job security!
Seriously, most of the bad Perl code out there could have been prevented if the programmer would "use strict" and "use warnings".
most of the bad Perl code out there could have been prevented if the programmer had realized it really isn't a good general purpose language design, and just used it for text mashing and system scripts, where it's fine.
zadfrack, think of what you're saying. Everything you say applies to every language ever created, including Cobol and assembly language. (Perhaps not brainfuck.) If you want to argue that programming languages don't matter at all, then do that.
Then along comes a programming language thread and *poof* it's all gone.
I wouldn't worry. Nobody reads the comments to Ben's posts.
I assume that Perl must have some language features that make it really easy to do things that are in fact horrible ideas.
Yes, but only if you're calling them "horrible ideas" from the perspective of an expert in another language. Perl makes it easy to program a thought process, instead of thinking about the systematically efficient way to solve the same problem.
I personally find the NP-completeness of the C++ preprocessing language to be an astoundingly frightening idea, but maybe that's just because I've never had a reason to use it. I do assume a great amount of overlap between the set of people who find such tools useful and the set of people who say nasty things about Perl because it doesn't limit programmers to certain "good" approaches. Having learned C# and Obj-C (in addition to Perl) I also tend to discount the opinions of snotty C++ programmers since C++ seems to encourage a different sort of sloppy bad code (e.g. undefined pointer references).
I've seen a lot of Perl (working in a succession of minor and major ISPs and data centers) and the worst code I've seen was VB written by cheap, imported coders who couldn't do anything without Visual Studio to guide them.
You could find out whether that's true by learning Perl yourself--or would the process so disorder you that you wouldn't be able to tell?
The latter, I fear. When I wrote the following line, I knew it was too late for me, but maybe you can still get out:
my @parsed = map { s/^\s+|\s+$//g; $_ = quotemeta($_); s/\\\*/(?:\\S*?|\\s+?)/g; s/\\ /\\s*/g; $_; } @uniq;
The thing that wrecks Perl developers, I suspect, is that it uses a bunch of (admittedly very useful!) shorthand conventions that are unreadable keyboard-mashing/shoggoth-summoning spells. That, plus the tortuous workarounds and horrid syntax for OO Perl, and a learned community standard that favors concision over maintainability. (For what it's worth, that line above didn't get checked in like that.)
If you want to argue that programming languages don't matter at all, then do that.
I won't say they don't matter at all, but I will claim they generally matter less than language-flamewar partisans think they do.
I certainly have things I don't like about some languages (don't get me started on PHP), but the common complaint about Perl, viz. that so much of the code out there is unreadable, is the fault of the programmer, not the language.
Programming languages don't matter at all, because I'm just going to build a lisp compiler on top of whatever language I'm using anyway.
Lambda cube lambda cube lambda cube cube cube
I use unlambda for my major projects because there's no question that the unreadability of the results is the fault of the language.
Surely this goes without saying: "I won't say they don't matter at all, but I will claim they generally matter less than language-flamewar partisans think they do."
who say nasty things about Perl because it doesn't limit programmers to certain "good" approaches.
yeah, that isn't really perl's primary design problem(s). `more than one way to do it' is a perfectly fine philosophy, if arguable; it's just that perl doesn't do it particularly well. That, and it doesn't scale well. Which is fine; at its heart it's a kludging together of a bunch of unix shell & text utilities in a useful way. Which makes it really quite good at doing those sorts of things. Along the way it avoided much of what had been learned about language design (as did many commercial languages, so lots of company there) and that hurts its scaling.
Yes, but only if you're calling them "horrible ideas" from the perspective of an expert in another language. Perl makes it easy to program a thought process, instead of thinking about the systematically efficient way to solve the same problem.
Exactly.
121: Maybe I just have had to solve different problems, but I've had no problem with Perl scaling up. What do you mean when you say "scale?" Size of the program in lines? Number of modules? Number of tasks per second? Size of datasets being handled?
I personally find the NP-completeness of the C++ preprocessing language to be an astoundingly frightening idea
And yet, it is well-known that CIRCUIT-SAT is reducible to C++-TEMPLATES via BRAINFART.
And I use the most readable interpreter not actually written in unlambda to run it.
123: You have discovered the majesty of Turing completeness. IBM wrote OS/360 entirely in assembly. NASA put a man on the moon using Fortran IV.
I think in 115 that should probably be "Turing-complete". Haskell's type system is Turing complete.
Haskell's type system is Turing complete.
You're just baiting me now, aren't you.
127: I call it "NP-complete" based on a conversation I had with somebody (friend-of-a-friend, IBM employee) who sits on the ANSI C++ committee. If that's inaccurate, it was his inaccuracy. Me, I ordered another beer and decided I didn't need to have any more of that particular conversation.
121: Size of program, number of modules, complexity of code. Perl doesn't do big systems well. It does glue reasonably, and text munging very, very well. You can do big systems, in the same sense that you can implement an object system in pure C. People have done both. Doesn't mean it's a great idea, usually.
Haskell's type system will bring you a martini when you get home from work.
127/129: `Turing complete' is correct. You can compute anything computable (on this class of machine) in the C++ template system. However, the language wasn't really designed with this in mind, so doing it (and moreso debugging it) is a real pain in the ass. There are, however, reasons you might want to. I used this approach once to write numerical code that would work in several dimensions (i.e., one piece of code that did the same computations on 1-d, 2-d, 3-d, etc. data depending what you handed it. No code duplication).
133: For that, Lisp would be good. Also, C# with code generation. I wrote a code generation system for C# for my company. We use it for a configuration framework, among other things. Code generation is good, as long as you're generating on each fresh compile. It's like Lisp macros.
100: This is awesome. All this time spent building an Unfogged persona that is urbane, cultured, and witty. Then along comes a programming language thread and *poof* it's all gone.
Depends on where you want to get your affectionate stroking.
In the U.S.A. today, for example, the typical standard of mental behavior in nearly all social classes, is the desire to be overheard saying what will be accepted among the members of the social grouping to which one's ego is appealing for recognition and affectionate stroking.
It's like Lisp macros.
Well, closer than C macros are, anyway. I like that about C#, but you're still coding in a pretty C-like environment, even if this cousin has a few nice tricks under the hood (as does Java for that matter). May be the most practical thing in a given circumstance, but it's a plodding way to code.
Following the link in 132:
My purpose today is to show that the GHC typechecker with multi-parameter typeclasses, functional dependencies, and undecidable instances is Turing-complete.
Needless to say, this is not a standard feature of the Haskell type system. Often one wishes the type-checker to terminate.
135: I should further note that C# didn't exist at the time, and I didn't know about Lisp, which is half the excuse for actually getting that code to work in C++ template metaprogramming.
130: I would argue that programmers don't do big systems well, and Perl merely fails to support them as well as some other systems might. The one thing from other languages I always find myself missing in Perl is method overloading, but this is also missing in Obj-C (unless as a late-comer to Obj-C and Cocoa I've just not come across how it's done, which is entirely possible).
In terms of overall lines of code (monolithic or modular) I have found Perl to be no less maintainable than C# or Java, except that Perl lacks a code-completing IDE like Visual Studio (which is the best of those I've used, reserving judgment on Xcode as I'm still not that familiar with it).
137: "but it's a plodding way to code."
No, not at all, except insofar as you consider *any* C# plodding compared to Lisp. (Which I really don't, not with C# 3.) It just doesn't come out of the box in C#. Basically you have to reimplement half of Lisp, and sometimes do a couple creative things with the language, but after that it's really nice. Now, it's not as powerful as Lisp macros because you can't call one from your C# file unless you preprocess it (in which case you lose lots of IDE support). I'm still trying to figure that one out.
Perl lacks a code-completing IDE
If you had dynamic scoping, you'd lack a code-completing IDE, too.
Perl has dynamic scoping?
I did not know that.
Hey Ben, I heard your mom has a type system.
Perl has dynamic scoping?
This is why you have to type "my" in front of all your variables, to make them statically scoped.
I have found Perl to be no less maintainable than C# or Java,
Which is a fair, if low, bar to set. That doesn't match my experience, but mileage varies (I've worked on a couple of ~50k-line Perl systems that I found much more fragile than ~500k-line C systems).
Definitely programmers don't do large systems well. Some languages seem to fall apart more quickly than others.
One of the reasons I stopped using Lisp was that the language made it really hard to make a very helpful IDE.
146: Anecdotal experience with maintaining others' systems is likely to be worth almost nothing in distinguishing languages, considering how much of the variation in code quality is due to the original coder vs the language.
No, not at all, except insofar as you consider *any* C#
I consider pretty much any edit-compile-run programming to be plodding compared to any read-eval-print programming, in my experience. Even with IDE help.
Granted. I do somewhat miss that part of Lisp. But it was annoying to find out after I renamed a function that I hadn't renamed twenty references to it, and my code was subtly broken because it was still using the old version in some places. I've been thinking about trying to make a C# interpreter. It's low on my list.
Anecdotal experience with maintaining others' systems is likely to be worth almost nothing in distinguishing languages.
Well, that's a problem, yes. At one point I was actually quite good at Perl; one had to be very disciplined with it to write 20k lines that didn't run into trouble, though. Perl has a lot of gotchas. So you can write larger systems in it with care, but you shouldn't have to be so careful. `There's more than one way to do it' doesn't mean they're all good ideas --- since we were talking about Common Lisp earlier, there is a language that does `more than one way to do it' much better.
Oh, and my symbol table getting all polluted with old symbols and interfering with symbol completion. That was annoying.
One of the reasons I stopped using Lisp was that the language made it really hard to make a very helpful IDE.
ACL has a good one, and emacs+slime is pretty good.
One of the frustrating things about lisp programming is that in most of the ways that matter, modern IDEs for commercial languages are only now catching up to where the lisp machines' interface was decades ago. But if you code lisp, you've lost that these days.
I hate Emacs with a passion. But SLIME was OK, considering it was inside Emacs. (I submitted a patch or two for slime.) Never used ACL.
152: Another example of the problems with being given too much rope. There are tools to help (particularly in CLOS) but being that dynamic means you can get into trouble, certainly.
Well, and a lot of the tools to help are things that are implementation-specific, since so little of that is specified in the standard. And tend not to be terribly well documented. Like all that path crap.
Well, I'm sure ACL documentation is good. But I only used the free implementations.
This is why you have to type "my" in front of all your variables, to make them statically scoped.
I think this is why I find Perl less problematic than most "serious" programmers do. I took "use strict" and scoping to heart years ago. Dynamic scoping is a feature I choose not to use.
ObAnecdote: I work with a friend with a much different style than mine, and the number one thing I find when asked to help her debug misbehaving code is some sort of scoping error.
My own number one bug is the dreaded missing semicolon. What's great about that is it's a problem in multiple languages! Yay!
If either one of you turns out to be Paul Graham, I'm going to be sorely disappointed.
soup's point about a lot of mediocre programmers is an interesting one, but as Brooks pointed out in TMM, you can only bring so many really good programmers together in one place, and there's only so big of a system you can do with such a group. If you want to do something bigger, you need to accept mediocre programmers.
In other words, Sam Colt may not make a better gun, but if you need an army, he's your man.
I think this is why I find Perl less problematic than most "serious" programmers do. I took "use strict" and scoping to heart years ago. Dynamic scoping is a feature I choose not to use.
perltidy and Perl::Critic are fantastic tools. My only professional programming experience is in gacky PHP and Perl, I'm not really familiar with similar tools in other languages, but man, I love the ability to bake these into Vim or TextMate.
159: Oh absolutely, there is a limited number of skilled people available.
However, languages certainly aren't equivalently expressive, and the scaling issues aren't exactly linear. It would have been plausible to have seen a commercial programming market that featured fewer, more highly skilled programmers primarily working in expressive languages that were perhaps harder to learn fully, but would get a lot more done in a line of code. I can imagine today's 10-million-ish LOC systems having been equivalently implemented in more like 1 million lines of more powerful languages. This code would almost certainly be safer and less bug-prone, but take more time to read and understand, and involve more abstraction. And there's no reason to believe it wouldn't balloon to the point that it was as bug-prone as current systems, just did more.
Net result, you'd need fewer people, but they'd have to know more. I suspect this is the real killer, as it gives you less flexibility from the corporation's point of view, and also gives your coders more market power.
The real strength of the c->c++->java->c# language family is that you can plod through relatively simple procedural patterns with little training. This suffices to meet a lot of `business software' needs, or at least partially. Of course when you try and couple 10 million lines of this stuff together, you can run into lots of trouble. But of course nobody has solved the big systems problem either.
I took "use strict" and scoping to heart years ago. Dynamic scoping is a feature I choose not to use.
Language X is ok for `serious work' as long as you don't use features Y and Z is a common enough claim --- but it always indicates flaws in the language design. This isn't just Perl, of course; C++ is full of this stuff, for example.
Hey, is it completely obvious yet that I have a deadline I'm avoiding thinking about?
I think that you do see some organizations that use more expressive languages and more skilled programmers out there; they just aren't working on 10-million line projects.
I work on a half-million-ish line project in Python that is currently undergoing a large influx of new programmers, and the expressiveness that we used is currently causing trouble.
This code would almost certainly be safer and less bug prone, but take more time to read and understand, and involve more abstraction
I really don't see how more complex code with more abstraction makes code less bug prone.
I believe there was a study done at some point showing that the number of bugs a programmer writes per line of code is constant regardless of the expressiveness of the language. So, more expressive languages get you fewer bugs per program, because they're shorter programs.
The abstraction is supposed to make it simpler, not more complex, as a result of greater expressiveness—this makes it harder to understand from the outside, but (hopefully) easier from the inside.
164: Sure, there are niches, but that's not the way the industry as a whole generally works. Using things like Python is hedging, a bit (which is fine). It's still an Algol-family approach built around a C core. I don't know of many shops who will try something really unusual in trade for expressive power. There are a few examples: Erlang shops, ITA, that sort of thing. Unusual, though.
I really don't see how more complex code with more abstraction makes code less bug prone.
It'd be equally or slightly less complex, but smaller. Denser. Less bug prone on the principle that bugs are less common in lines of code that don't exist.
I really don't see how more complex code with more abstraction makes code less bug prone.
Well, it probably doesn't inherently. But in context, many of the languages we are talking about as being used in industrial practice have bug-prone design (mis?)features, and the languages I'm talking about are better at that. So yes, in idiomatic use, they are less buggy. Also, properly used, abstractions make correctness easier to see.
Also, I was perhaps less clear but an equivalent implementation would have exactly the same complexity. It would be shorter, and if well abstracted it would be easier to debug. However, I did allow for the possibility that more expressive power would just get you more complexity. So your code would be about as buggy as before, and about as long, it would just do more.
I'll tell you one thing - debugging massive Java code written by others, so big that Eclipse or similar tools won't work, is nearly impossible. The underlying code is scattered all over hell and you will find that a single line of code covers dozens of underlying nested functions.
Even if you get to the root exception the null pointer may have happened long ago in a galaxy far away from where the null pointer is used.
But hey, we can pump that crap out bigger and faster and let the support guys deal with the headaches.
172: I think an underappreciated misfeature of heavily object-oriented code is how hard it can be to debug. I have seen programs that had so much glue code to make interfaces look a certain way that it was extremely difficult to figure out where anything actually happened when you were debugging it.
this makes it harder to understand from the outside, but (hopefully) easier from the inside
Unfortunately for me, I am the guy who has to fix broken things in production, usually without access to the original programmer, so more complicated from the outside seems like a negative to me.
173: What's more, some style guidelines actually encourage that kind of programming, saying that the average method should be around 5-10 lines. No thank you. 50-100 lines is better for me.
175: It can be, but it needn't be. Abstraction doesn't mean the same thing as complexity, and if it buys you something it may well be worthwhile. There is some trade-off with programmer skill, but at some level of overall complexity, being able to easily understand things locally doesn't buy you much, if nobody can understand the whole thing.
Yeah, more abstraction gets you code that's easier to understand after an upfront investment in learning the abstractions used. (Which really have to be documented to be easily learned. Undocumented abstractions are almost as bad as spaghetti code.) So depending on the tradeoffs, it's not always better to have more abstraction.
I didn't state 177.last strongly enough. In fact, making things globally a mess but locally understandable often makes things worse, by allowing/encouraging a maintenance style that is all local hack-and-patch (even for more global problems), so that over time your codebase gets worse, not better.
saying that the average method should be around 5-10 lines. No thank you. 50-100 lines is better for me.
That's ok; just include a lot of one-liners too, and the average should come out right.
174: What I really want is a way to execute parts of my program symbolically. So for example, I want to be able to say "What happens when this pointer is NULL, and this variable is always positive" (within the limits of what's computationally decidable, of course). Compilers already do some of this when figuring out dead code branches, and there's been additional academic research under the name "partial evaluation", but I haven't seen a tool that does it.
181: You mean to determine properties about its execution given certain constraints? Hmm. That would be along the lines of Coq, I would think, but then you have to express the algorithm in HOL. I think that part of computer science is growing and still pretty young.
In fact, making things globally a mess but locally understandable often makes things worse, by allowing/encouraging a maintenance style that is all local hack-and-patch (even for more global problems), so that over time your codebase gets worse, not better.
In my experience, globally understandable but locally opaque code still gets subject to hack/patch maintenance, but without the redeeming quality of fixing the problem at hand.
This game is not Turing-complete.
185: I'll halt your problem, buddy.
186: Yeah, you and what oracle? I'll Turing jump you to the moon, motherfucker.
187: oh that's it. It's polynomial time!
I will give this to the critics of Perl: the fact that it includes a "goto" function makes me suspect the work of diabolical forces. This is a feature the language could do without.
I'm currently trying to figure out some Perl code that uses "goto" in several places. *shudder*
It's polynomial time!
Never did understand what the big O is really about, did you, Sifu?
189: Every language needs a goto. (Except for pure functional ones, where the concept doesn't even make sense.) Anyone that uses it better have a damn good reason though. I find I need to use gotos to break out of doubly-nested loops where a "break" statement would only break out of the inner one. The "break" statement needs a number argument. This can be avoided by refactoring the loop to be in its own method and using a "return", but sometimes that's too annoying.
Every language needs a goto. (Except for pure functional ones . . . )
Don't those languages just name their goto call/cc?
172: I think an underappreciated misfeature of heavily object-oriented code is how hard it can be to debug. I have seen programs that had so much glue code to make interfaces look a certain way that it was extremely difficult to figure out where anything actually happened when you were debugging it.
Walt, I could kiss you.
My current job is to debug object-oriented code that has been developed by a current team of at least 100 programmers at about a dozen sites around the globe. The program is fairly new, less than ten years old, but the turnover rate is huge so from the original 100 there are maybe, I dunno, ten at best. The total project, after being compiled, takes 50 CDs!!
Meanwhile the young bucks are grabbing shareware tools left and right. They better not be grabbing shareware source or we'll really be up the creek, as in massive lawsuits. By the time we put the kibosh on shareware we had to retroactively try to track down every single one of the original contributors and get them to sign *our* exclusive agreement because the so-called agreement they originally signed failed in a court challenge. You ever try to track down "College Bonger" from Australia who did some hits, and some code, five years ago?
Hell for a little while some shareware group had my 14 year old son doing documentation until I made him stop it. He had no idea what rights he was signing away.
Now you have some idea why I detested the frigging readme that started "hear ye hear ye."
It used to be private individuals would put up with crap, but at least businesses cared enough about quality that they would pay for it and look years ahead.
Not anymore. Look at Wall Street today. It is all about the fast buck and tomorrow be damned. So who can I blame for all this? You tell me.
So who can I blame for all this? You tell me.
Saddam Hussein.
189: Point taken. But these uses of "goto" I'm trying to decipher do not have a damn good reason for their use.
You could also avoid gotos for jumping out of nested loops by abusing exception handling in languages that support it, but that's not necessarily an improvement.
190: It is O(1) complexity for me to kick your ass!
You know what's worse than a goto?
A goto that is disguised as a method that is defined in some other project which has no source and is in who knows what jar and which itself has methods that spawn out to two more similar method calls, ad infinitum.
which itself has methods that spawn out to two more similar method calls, ad infinitum.
Yes, fork bombs are worse than gotos.
What happened to the time-honored tradition of setting a flag in the inner loop and checking it in the outer?
199: depends what you're trying to do.
Grad school has led me to forget everything but MATLAB. But damn am I good at MATLAB. Too bad MATLAB isn't good for everything.
If I were to take a sabbatical to find the person who misspelled "referrer" in the HTTP RFC and to kill them, would anyone join me?
203: MATLAB is great for everything that doesn't require efficient execution.
MATLAB sucks. Woo-hoo! I can add two matrices by typing A + B! Too bad everything else is a fucking pain in the ass.
What I'm trying to say is: don't make me hash you, Sifu.
MATLAB is a horribly designed (more grown, really) programming language, but a really nice system for doing linear algebra on arrays of floating point numbers. You win some, you lose some. At least its tools have gotten better.
code that has been developed by a current team of at least 100 programmers at about a dozen sites around the globe
There's your first problem, Tripp.
That's ok; just include a lot of one-liners too, and the average should come out right.
That's what getters and setters are for.
MATLAB is great for what it's intended to do. If it wasn't proprietary I'd use it for all my data analysis. Unfortunately springing for a site license is out of the question at this stage, so I'm writing all my data analysis code in python, which is really making me miss MATLAB.
211: fwiw, octave is a pretty good matlab 4 clone. If what you actually need is toolboxes or the newer gui stuff, you're sol. Not sure if this is more or less helpful than numpy for what you are doing. Matlab is very, very good at a few things; it's just not much of a general purpose language, and the way it `grew up' makes it clunky.
208: yep. For efficiently converting your linear algebra to code, it can't be beat. For anything dealing with, oh, I/O or real-time execution or truly large data sets or like anything, it's not the way you want to go.
Is Maxima any less clunky? I'm not too familiar with either.
214: Maxima/Macsyma isn't really the same thing; it's a CAS, more like Maple or Mathematica (if a bit more dated). These excel at mashing symbolic math representations around, and are reasonably well put together for some types of related programming. Not a good general purpose language. They know quite a bit of algebra and tricks for manipulating/solving equations symbolically, including integral or differential equations to some degree. Maxima is free, but a generation behind now.
Scilab (an INRIA project) is more like matlab --- they excel particularly at floating point linear algebra. Also lousy for general purpose. These tend to have some very efficient linalg code, FFTs, good stable linalg algorithms, and some numerical ode/pde solvers, stuff like that.
it's a bit more complicated than that (i.e. matlab and maple cross-license some of their stuff, or at least did), but that's the gist of it.
erm CAS is `Computer Algebra System'
209: There's your first problem, Tripp.
Yeah, although my problem before that one is that I can't retire just yet because of the frigging world economy and the cost of college - specifically my 401K and current college tuition. When my Dad was my age he got one of the early buy-outs that were freaking close to winning the lottery and he was out the door and on the golf course.
I tell myself that my continuing to work keeps my mind active but I think it is also playing hell with my blood pressure. I'm actually thinking I may have to look into the yoga stuff that I used to ridicule.
So I'm not asking for sympathy, I'm just saying when I'm cranky I usually have a reason.
I think that the disseminated intelligence of the Unfoggetariat should dedicate itself to producing the world's first problem-free software package. No one's ever thought of that before.
the world's first problem-free software package
Already been done, but it's Mac-only.
So if I'm an old C/C++ hand who has recently fallen head over heels for Python and will use it for everything that doesn't require sheer performance (and, as a game programmer, I will be needing sheer performance a lot), what can any of you nerds tell me that will make me want to learn a functional programming language?
221: Paul Graham will think you're a weenie if you don't know how to program in Lisp. Wait, no.
Okay, E/ric S. Ray/mond will think you're a weenie if you don't know how to program in Lisp.
I'm not really helping, am I?
E/ric S. Ray/mond is clearly an unrecommendation, but when did consensus become the same for Paul Graham? His programming books are good (I haven't read his other book).
what can any of you nerds tell me that will make me want to learn a functional programming language?
They're often quite elegant and deep.
221: "Who is this Paul Graham," I ask myself, and within a few minutes of googling and browsing his site, I come across this gem: "The unusual thing about Lisp-- in fact, the defining quality of Lisp-- is that it can be written in itself." This statement is pretty close to epic fail but I'll keep reading him.
224: One of my last acts on my last project, implemented about a day before we finished, was to replace the orderly shutdown of a third-party library with the simple murder of the four threads it was running. Elegance is nice in theory but I'm just trying to get shit done.
You sort of have to wonder what Graham meant by that.
Oh! Maybe he was referring to lisp's macro system, in which you manipulate lisp data and the result is new lisp code in the program itself? That does stand a chance at being the defining feature of lisps (especially if you consider Dylan a lisp). I don't know very much about self-modifying code in assembly (or befunge) but I assume the facilities are much more primitive—though I suppose you are manipulating "assembly language data structures" when you do it in assembly.
Conversations like these are why I don't really miss not being a technical writer any more.
I don't really miss not being a technical writer
People who CAPSLOCK when they point out inadvertent double-negatives deserve to write thousand-page manuals using nothing but Microsoft Word.
223 - Paul Graham knows a lot about lisp, but he ventures beyond that knowledge, often to remarkable effect.