I think we need to look separately at software engineers and real engineers.
A young professional explained to me why he didn't want to join the union: "We aren't being asked to work for 10 cents an hour. We don't need a union. Unions aren't necessary any more." Plus, there were personal risks associated with joining the union that he didn't want to take.
That was pretty typical. There were others who were attached to the idea of themselves as "professionals" rather than "labor."
At that job -- in a Right To Work state -- I knew a lot of very nice people who acknowledged that they benefited from the union, but refused to pay dues because they didn't want to. They objected to being called "free riders," and I never called them that out loud.
They probably spent their weekends cruising those Little Free Library boxes, taking all the books, and selling them on Amazon.
Was there a turning point somewhere in mid-century, when the middle class forgot they were in the middle, and started thinking they were on top, and safe there?
It's documented in the movie I'm All Right Jack (1959).
So, from 'stop pretending you're not one of the rich ones' to 'why can't you see we're in it together.' Complicated stuff.
I'm pretty sure the shift in the OP was addressed in a Flintstones episode.
If we had a union, I wonder if they'd notice that I don't have the degree that should be the minimum requirement for my job? I suppose that's more of a guild-type thing than a union. But a union that worked with the carpenter's union or plumber's union would make much more sense for my type of work than what (I think) you'd see in a factory or shop.
I think there was also a shift from a time when you could really tell the difference between management and labor by looking at who was tired and dirty at the end of day and who was sitting in an office chair and staying clean, to a time when you could be doing what physically looked like 'white collar' work, but you were really working class.
And management works hard to keep workers confused about what class they're actually in.
Yes. The lower levels of "management" are very often getting the worst of it.
Which is why I thought that Obama's executive order increasing the wage at which you could be a "professional" was so important. I don't have any knowledge of whether or not the court was right to strike it down, but there's no reason it shouldn't be done by legislation. I think that might go well with the fight for $15.
I'm not quite sure where I fall. I run a lab, so "management". But I teach classes and am not an administrator, so "not management".
Unless you teach an abnormally large number of classes for somebody who runs a lab, running a lab overrules.
Unless you mean "Labrador Retriever" or something.
Speaking for software engineers, we have an inherent distaste for anything that stands in the way of our ability to Get Shit Done. That includes sitting in meetings, restrictive software licenses, mandatory HR training, and management.
We also imagine that software development is a meritocracy - which to a large extent it is: some people just write far better code than others. The rub is that many who are among the best don't get recognized for it beyond the small group of people they work with.
While there is certainly a perception that the industry treats software engineers unfairly at times (collusion among the major companies to keep down salaries, for example - as well, of course, as the ever-present concern about getting your job shipped off to India), unions are avoided because they are seen as one more likely impediment to actually getting shit done.
11: But I still get to bitch and complain about administrators, right?
Since I don't have a union to set my lunch times, I'm going to eat now.
The current system has been great for the upper middle class. Inequality has increased everywhere along the spectrum (though the very rich have gotten much more than anyone else). For example, software developers in the US are much better paid than most other places, as are doctors.
For whatever reason, the upper middle class in the US is more self-interested than in Europe.
17.last: It's the American myth of meritocracy.
I read a book - I think it was Selina Todd's The Rise and Fall of the Working Class - which dates the change in class rhetoric to the seventies, when it shifted from "rise with your class" to "rise out of your class". She's talking about the UK in particular, of course, but I think she ascribes this to a mixture of a shift in cultural values, decline of factory jobs and rise in white-collar-but-really-working-class jobs, and a change in government priorities in ways that weakened unions. For instance, she points to the shift to oil production in the North Sea (which IIRC was something foreseen at the end of the seventies), which weakened the miners' unions. I assume that oil rigs were also unionized (but can't prove it!) but presumably less established and weaker.
The fortunes of unions really began to decline when they stopped closing their HTML tags.
13: Must demur vehemently. [In the following, "you" isn't personal -- just directed at the typical software engineer.] 20yr working in enterprise software at Ye Olde Big-Ass Computer Company taught me a number of important lessons:
(1) The only thing that matters is making your manager look good.
(2) Go ahead and pretend that what matters is writing the best code, the most correct code, etc. It doesn't, see #1
(2.5) Go ahead and pretend that what matters is working for the interests of the stockholder (or CEO), and not your immediate management. It's a path to getting fired (or in my case, 30sec away from it, at which point I realized what I needed to do, and switched gears, resulting in the best 5yr of my entire career).
(3) Oh, and your manager would like his priorities addressed ... *yesterday*. So be fast.
(4) did I mention that as you get slower, your manager (and his manager, etc) all like you less and less?
(4.5) and the same is true at the breathlessly-hip modern Internet companies (again, can't name names); indeed, they're pioneering the true industrialization of software engineering.
(5) Of course, you like to think of yourself as getting smarter as you get older, b/c more experienced. Yeah, right. Unless you're one of the really special ones, you're getting slower, and weaker, and less able to focus for those 12-hour (gee, used to be 18-hour, but you're 50 now) stints of hard-core coding for which you used to be renowned.
Software engineers discover they should have been pro-union somewhere between age 45 and 55, when they're no longer young, fast, and possessed of tremendous stamina. [oh, but I repeated myself twice *grin*]
The good (?) thing is: with the spread of the Google style of "industrialized" software engineering, I think it'll become more and more the case that younger engineers will recognize (much sooner) the intrinsic factory-like nature of their work. B/c that's what's coming, as sure as artisanal hand-loom weaving got replaced by machine looms as part of the Industrial Revolution.
I assume that oil rigs were also unionized (but can't prove it!) but presumably less established and weaker.
It may well be different in the UK, but the US oil industry has historically been non-union, in sharp contrast to the highly unionized mining industry.
Something else: up-thread, Walt Someguy noted that SWEs are paid pretty well -- nearly as well as doctors. This isn't quite accurate (all over the heartland, SWE pay isn't so great) but it's close enough for govt work. Doctors in America are also notably averse to unions, and I posit, for the same reasons as SWEs: they think their work is special, self-actualizing, elite, and immune to the regimentation of the modern industrialized workforce. Really though, it's b/c they get paid a shit-ton more than the *proles*, and they think they're part of the (at least petit) aristocracy as a result. But that "industrialization" is coming for them (with corporatization of medicine), and is even necessary (as the wild variability associated with "every doctor decides for himself" is associated with worse outcomes for patients).
I recall that France has a robust union presence amongst doctors, and I posit that it might be because doctors there just aren't paid like doctors here.
So maybe it comes back to inequality? [I haven't read that book, so just guessing here ....]
they think their work is special, self-actualizing, elite, and immune to the regimentation of the modern industrialized workforce.
I don't think you need a complicated psychological explanation. They could just think that they don't have much to gain from collective bargaining vs. what they're getting based on bargaining for themselves.
It's maybe worth pointing out that, in a lot of times and places, this has also been a two-way street: working class identity politics or Left movements have targeted/othered professionals/engineers as effete lackeys of capital. I have Latin America in mind a lot more than the US.
Chet, when you say that doctors/engineers "think their work is special, self-actualizing, elite, and immune to the regimentation of the modern industrialized workforce", do you think they're wrong to think so?
After nineteen years (ouch) in the industry I've seen both hugely varying productivity levels and a prevalence of autism-spectrum disorders, both of which make unionizing pretty unattractive.
Curious what you mean by "Google-style of industrialized software engineering..."
I totally understand 21, which I think is a lot closer to reality than software developers' self-conception of it. That's why we only think we are a meritocracy. At least until the age of enlightenment, which I think comes sometime after a dev experiences their first recession.
I remember seeing a chart in 2004 that showed PAC campaign contributions by party. For the Republicans it was Chamber of Commerce, National Association of Manufacturers, stuff like that. For the Democrats it was unions. I know this is super old news (it's what Galbraith called the countervailing power), but seeing it in a table made it obvious that unions are irreplaceable when it comes to determining what kind of politics a country has. How different would America be if it had 70% union density (like the nordics have), instead of, what, 20%? That's why tech workers and professionals and everybody else needs unions. Not for themselves, but for the country. If unions were stronger politicians would pay more attention to working people and less attention to billionaires and corporations.
Few things are more gratifying than having something I "liked" on Twitter, "just in case," become directly relevant. Matt Stoller (great Twitter follow) had a thread about Democrats breaking with unions. In another thread, he says an explicit part of the strategy was to trade stability in the American middle class for stability in the foreign (mostly Chinese) middle class. I'll try to find the title of the book where that argument is made.
Oh, he mentions it in that thread. Pivotal Decade.
My boss has some stories about being a union groundskeeper at a cemetery in MKE (where he's from) -- the old guys yelling at him when he didn't immediately down tools at break time "don't you know people fought and died for your breaks?!" But it makes me wonder if there was some very conscious effort to keep Milwaukee hyper-segregated *because* it was such a strong union town?
Historically the middle class, or at least the bourgeois professional/business part of the middle class in the US has seen itself as above or outside the fray of union-management relations. I don't think middle class union membership in the US has ever been that high, except to the extent that unions helped people enter the middle class.
Also, I don't think other countries have the depth of American experience of unions, or attempts at forming unions, being prosecuted under conspiracy and anti-trust laws. I guess that didn't directly affect middle class membership but had an impact on both union membership in general and on the legitimacy of there even being unions.
I actually didn't mean to leave bourgeois in there, since I didn't want to get bogged down in the definition.
30: Lucky for you Stoller's tweets have so many likes it's hard to see all of them and work out identities that way. But I'm still going to guess you're that guy with MAGA in the username.
26/27 (Jake): [do you think they're wrong to think so?]
100% wrong. I'll deal with doctors first, and in answering your second question, will deal with SWEs. Look at almost *any* branch of medicine, and study after study finds that the application of "standardized protocols" improves patient outcomes. That the creativity of doctors is way, way overrated. I'm the son of a cardiologist, and I remember very, very well the way my parents would describe most other doctors. These jokers got great grades, and worked really hard, sure. But once they got their certification, aside from C(ontinuing)M(edical)E(ducation) they don't learn a damn thing ever again. 40 years of not learning diddly again. It's farcical. And consider the typical emergency room: it's the same damn things over and over and over. Then let's go to the high end -- my ortho surgeon (who's excellent, btw). He basically told me that at this point, he's seen so many shoulders with the same complaint that he can diagnose after a few minutes. He only goes on and on b/c patients get irritated otherwise -- they expect some sort of personalized experience. Think about it: almost all medical practices are just lather/rinse/repeat over-and-over. It's why corporatization of medicine is so unstoppable!
[Curious what you mean by "Google-style of industrialized software engineering..."]
Ever read about Henry Ford's River Rouge plant? Read this: http://www.lawyersgunsmoneyblog.com/2015/01/day-labor-history-january-5-1914
Ford paid great wages b/c he needed his workers to be soft/human cogs in a massive machine. He needed them to be trained to a high standard -- of fitting into the rest of the machine. And this was dehumanizing (b/c no autonomy, no learning). Workers hated doing it. So he paid top dollar. B/c without carefully-trained workers, you'd get variation all over the assembly line, and variation means errors, incompatibilities, and broken cars down the road. A century later, we have Toyota and "statistical quality control".
Now let's look at software engineering. It's a well-known fact that most SW projects fail. E.g. in "legacy enterprise systems migrations", 9 of 10 fail. Why? B/c it's "artisanal" in the sense of pre-assembly-line factories. We have no way of ensuring that workers (SWEs) produce software units that work as they're supposed-to, and no way of ensuring that they fit with the other units produced by other workers. A similar story can be told for bugs: it's well-known that (what?) 70% of all programmer time is spent fixing bugs. It isn't as well-known that in many orgs, a lot of that time is spent fixing bugs induced by prior bugfixes. These problems come down to quality control, full stop. And it turns out that if you want to reduce/eliminate these problems, the way to do it is to (1) break everything down into subproblems that are straightforward for SWEs to implement, (2) force/insist-on rigorous testing, to a level that might seem anal to many, (3) automate all the testing, (4) and run it all in a mechanized way, so that whenever somebody breaks something, they (and everybody else) know as soon as possible. In short, give every worker the ability to "stop the line" when they see a flaw. Heck, make it impossible for a worker to make progress, if there's a bug anywhere that affects his component, and put the decision in the hands of software, not humans.
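To make that concrete, here's a minimal sketch of the "small unit, exhaustive automated tests" shape, in Python with pytest. The module and function names (an order_total helper and its tests) are made up for illustration, not anybody's actual system; the point is the shape of the work, not the content.

# order_total.py -- one small, narrowly specified unit
def order_total(prices, discount_rate=0.0):
    """Sum item prices and apply a fractional discount between 0 and 1."""
    if not 0.0 <= discount_rate <= 1.0:
        raise ValueError("discount_rate must be between 0 and 1")
    return round(sum(prices) * (1.0 - discount_rate), 2)

# test_order_total.py -- the tests the build system runs on every change
# (shown in the same sketch; in a separate file you'd import order_total)
import pytest

def test_no_discount():
    assert order_total([10.00, 2.50]) == 12.50

def test_with_discount():
    assert order_total([100.00], discount_rate=0.25) == 75.00

def test_bad_discount_rejected():
    with pytest.raises(ValueError):
        order_total([10.00], discount_rate=1.5)

The "stop the line" part is the build system (Jenkins, or whatever the shop runs) executing every test like these on every change and refusing to merge anything while a single one is red.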
All of this forces almost all programmers to work like automata, and puts great constraints on their creativity. You'd think that this would be terrible for software quality. You'd be wrong. It turns out, that much of what constitutes "being on the cutting edge" is just "not screwing up as much as those enterprise software dinosaurs". I'm being COMPLETELY serious. Completely serious.
So how do you do the above? Well, you need to build systems to enforce all these rules. You need social processes, too. And it's not easy to set up. Once it's set up, bringing in a new programmer takes months and months, b/c they need to throw away ALL their training from previous jobs, and learn this new way of thinking and working. They need to become human cogs in this massive software factory.
And while Google pioneered this way of doing things, it's being adopted all over Silicon Valley, more-or-less. Sure, many companies will do it badly. But 20yr from now, all software will be built this way, and to a greater or lesser extent, software engineering will be much more like factory work than it is today.
But heck, the same is almost certainly true of medicine too. It's happening -- just read articles by Atul Gawande, and you can see the ripples on the surface of the water, from the motion underneath.
Last thing: I worked 19yr for Ye Olde Massive Enterprise Computer Company, and 14mos for Radical New Internet Company. While the colleagues at YeOlde were *mental midgets* compared to those at RadicalNew, the *work* at YeOlde was *insanely* more satisfying. Why? BECAUSE those not-so-great colleagues, using *terrible* software engineering practices, produced software RIVEN THRU with bugs. I made my career finding and fixing those bugs. And when you're confronted with "bugs all the way down", you need *insane* creativity to find and fix them. It's challenging work, and in a way, very satisfying. Whereas, at RadicalNew, HAHAHA, everybody was pretty good at their work, and the systems I alluded to above worked hard to force us to write bug-free code. In my entire time there, I never saw a single bug for which I went "Holy Cow, how could they do that? That's INSANE!" Never once. Why did I quit? B/c the work was insanely boring, and I mean "stab your eyes out with knitting needles" boring. Yes, the money was great. But there's more to work than money. In 14 months, I never had a single problem I needed to solve that required me to step away from the screen/keyboard and THINK for even 15mins. And I wasn't an entry-level guy -- I was basically at the level where you're respected by everybody else (but no, not a poobah, for sure).
This is why I think the right metaphor for what's happening in software engineering at the cutting edge (where these companies are completely transforming software development to take all the risk out of building a thing, once you know what you want to build (and sure, it's still a gamble to figure out what people *want*)) is the assembly line revolution of the turn of the 20th century.
Something else to add: in experimental science, they talk about "hypothesis risk" and "execution risk". H risk is when you come up with some new law of nature. Then you dream up an experiment, where if it works out the way you predict, it's evidence for your law, and if not, then it's evidence against your law. The risk, of course, is that your "hypothesis" (your new law) is wrong. Just wrong.
"E risk" is the risk that you can't actually build and run that experiment well enough to even *test* the hypothesis.
Until the advent of this new industrialized sw.eng., almost all of the risk in software was execution risk. I watched for 20yr as wave after wave of poorly-executed stuff came out of enterprise sw companies, crashed thru their customers, and died of its own bugs and incompetence. It's why so much of enterprise software these days is built on open-source -- b/c the stuff is just built to a higher quality than enterprise software. The new way of building software takes (almost) all the execution risk out of software development. Sure, you can still dream up the *wrong* software. But once you dream it up, getting it built is a much, much, much less risky proposition.
And now let's remember: for every guy who dreams up a new software product, you need N guys to actually build it. So guess what? *most* of your engineers are going to spend their time on that assembly line. Very few are ever going to get the chance to dream up something new. B/c it turns out, that's the most effective, cost-efficient, bug-efficient way to run a software factory.
36-7 are fascinating, thanks. But why would open source come out any better than enterprise?
I mean, it's voluntary, so even less amenable to production line methods. Presumably, not having a commercial need to ship product promptly means there's more time to find bugs, and open source has proverbially been better for a long time; but OTOH ogged just had a whole thread which shows the Linux ecosystem still to be filled with products most consumers wouldn't or couldn't use.
But the consumers of Linux aren't consumers: you're running ahead of the story if you think they are: it's next year that's the year of Linux on the desktop.
Fusion-powered desktops, I presume?
38-40: I'll answer in series
(1) from the POV of sw.eng., "usability" is "hypothesis risk". As in: "people will love and use this software I'm writing". From the POV of H(uman)C(omputer)I(nteraction) it's execution risk, but we're not talking about that. So yeah, Linux on the desktop still isn't there for lots of folks.
(2) Why is OSS less buggy than enterprise software? A little story. Until 1990-or-so, every operating system vendor (and there were a ton) typically developed their own C compiler. But around then, Mike Tiemann and Cygnus signed up a bunch of folks to sponsor (with $$) GCC (the GNU C Compiler) Version 2. The idea was, it would come out on a number of operating systems, and would be of decent quality. And open-source. Turned out, that was game-changing. Until then, those C compilers? They were all of varying quality. Like -- WIDELY VARYING. I remember an HP C compiler that basically couldn't generate optimized code with any correctness. If you wanted to write portable code, you were -so- screwed -- you had to work around a zillion compiler weirdnesses, not to speak of all the bugs. When GCC v2 came out, every vendor had a choice: either produce a compiler at least as good as GCCv2, or stop trying altogether. Lots of vendors stopped trying.
Also, lots of software isn't differentiating, even to the enterprise vendor. And yet, the vendor has to write, maintain, support it. With OSS, if a vendor decides that they don't want to write their own (say) web server, they can just ship the open-source Apache web server. Sure, they gotta staff up to support it. But that's a lot cheaper than writing one from scratch. And if a bunch of vendors do that, they all benefit from each others' work.
(2') another argument: Lots of what happens in enterprise software happens b/c some architect wants to make his name on it. As a Distinguished Engineer once told me "Chet, nobody ever made DE for removing function from a system". Stuff gets put into products all the time, that nobody uses (sure, they kick the tires, but that's it), and somebody made his bones on it. So he's gonna make damn sure it never gets removed. And hooboy, it costs a ton to maintain. Similarly, one big, big customer can force a vendor to keep function in just for them, and to the detriment of the rest of the product.
In open-source software, that pretty much cannot happen. B/c if one vendor, or one customer, tried to force the community to go a certain way, there'd be a "fork". Sure, Gynormous Bank can hire a team to maintain their special version of the database. But everybody else? They'll use a version that doesn't have the Gynormous Hack, thank you so much. So while software moves much slower in open-source, for the stuff that is widely-used, it moves more stably. And since (as I mentioned before) 70% (maybe more, don't remember) of programmer time is spent in maintenance, the fact that OSS is so much more stable, is a boon to overall productivity.
BTW, entire areas of sheer daylight LUNACY can be avoided, b/c in open-source software, if there aren't enough people actually using a feature, it ends up getting dropped. Great example is "web services". This was an entire suite of insanely complex stuff (some may recognize the name SOAP, but there were a ton of standards all starting with "WS-", e.g. WS-Addressing). Massive number of standards, all put out by companies, and all basically resume-enhancement work for senior architects. Enterprise vendors implemented/supported it (badly) and pushed it on their customers. Insane. Crazily bad. The open-source world, instead, went with something called "REST", which was much simpler. Lots of people implemented REST, even though enterprise vendors were pushing SOAP and "SOA". Eventually, the vendors capitulated, and these days they all support REST.
There are a bunch more examples, but it's the same pattern: enterprise vendors push some crazy baroque thing out. Meanwhile, actual "users/customers/exploiters" come up with something much simpler (which might actually not do all the things the enterprise version claims to do, but geez, at least the simple thing *works*) and eventually the vendors surrender.
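For anyone who didn't live through it, here's a rough sketch of why REST felt so much simpler, in Python using the requests library. The endpoint and fields are invented for illustration:

# A REST call is just an HTTP request that returns JSON.
import requests

resp = requests.get("https://api.example.com/customers/42", timeout=10)
resp.raise_for_status()
customer = resp.json()   # e.g. {"id": 42, "name": "..."}
print(customer["name"])

# The SOAP version of the same lookup meant generating client stubs from a
# WSDL contract, wrapping the request in an XML envelope, setting the right
# SOAPAction header, and parsing an XML response -- before any of the WS-*
# layers (WS-Addressing, WS-Security, ...) even entered the picture.

The simple thing didn't do everything the WS-* stack claimed to do, but it worked, and that's what won.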
(3) back to desktop software: sure, the actual UI stuff on Linux may not be right for many users. But the stuff underneath is mostly unrelated to why the UIs aren't great for most users. So for instance, sure MacOS has a nicey-nice UI. But underneath it's UNIX. And that UNIX is based on FreeBSD, which is another open-source OS (actually older than Linux, and the reasons that Linux won over BSD are contentious, and possibly connected to an AT&T lawsuit from the early 90s).
(4) Another reason enterprise software sucks the big one: When Intel (or some other hw vendor) ships a bad chip design, the blowback is in the billions of dollars. It can be racks and racks of machines coming back. When a consumer software company ships bad code, it can cause pain to many, many, many consumers, all of whom are unused to patches and maintenance. When an enterprise software company ships bad code, they ... (drum roll) ship a "fixpack". A patch. Their customers are SO USED to having to apply maintenance, that it's just no big deal. Furthermore (and I have -direct- experience with this) different customers ARE GIVEN DIFFERENT FIXPACKS. There's no effort made to provide a "cumulative" fixpack that improves over time, that every customer is given. So there can be a "Gynormous Bank" version of some particular product, that is maintained over time, even across versions.
All of this yields more and more versions, more and more code-paths, and less and less certitude that any particular product will work in any new configuration.
And all of this, it turns out, is a lot harder in the open-source world. Sure, RedHat can play the same games. But their margins are a LOT slimmer -- they can't AFFORD to maintain versions of their products for individual customers. So they're forced to stay "closer to the trunk".
41.3 is what I meant by the consumers of OSS not being (retail) consumers. They are for the most part other software engineers, for all the reasons Chet lays out. So, in a sense, my phone, my laptop and my raspberry pis all run on OSS, but the unixy bits of Android and OSX are invisible to me unless I make an effort.
Even if consumer Android were open source (and Google tries damn hard to see that some vital bits aren't) very few people would buy it for that reason. Very few people have any interest in the various attempts to make completely open phone software. Similarly, Apple's huge selling point is that you can't tinker with it and don't need to. So even if it's BSD in the sheets, it's OSX on the streets. Or something. I had better stop now.
But, to tie the two parts of this thread together: note that Maciej Ceglowski was running around the country all spring trying to get tech workers to unionise against the threat of fascism. Because if software engineers ever did, or if they did so now, they would have the kind of power the print unions once had over the newspaper industry.
OT, but has Kalanick been suspended/repositioned yet?
42: God, REST was such a breath of fresh air after the over-engineered madness of SOAP. I remember sneaking REST/JSON onto the front end of an enterprise website from the SOAP/XML planet... it was one of those "better to ask forgiveness than permission" deals. In the end, no forgiveness was required.
I remember one instance, with a couple developers standing around in my cubicle, one said "holy shit, you're using JSON?"
It's a well-known fact that most SW projects fail. E.g. in "legacy enterprise systems migrations", 9 of 10 fail.
You don't risk failure if you keep using SAS for everything.
Even if consumer Android were open source (and Google tries damn hard to see that some vital bits aren't) very few people would buy it for that reason. Very few people have any interest in the various attempts to make completely open phone software.
To be fair, very few people have any interest in the process management of their car's manufacturer either.
Skimming 36 again, I think I'm being advised to work with more stupidier people. Which seems hard as I'm not usually in a position to hire the people I work with.
It's a well-known fact that most SW projects fail. E.g. in "legacy enterprise systems migrations", 9 of 10 fail.
What exactly does 'fail' mean here? Don't work at all, or just cost far more time and money than planned?
I was wondering about that myself. Because I've noticed that everything seems to be getting migrated eventually no matter how much I say I still want to use a Banyan network.
A real, patriotic, Pittsburgher would use a Bunyan network.
Unsurprising. I bet you've offshored your punning to India too.
All true Americans embrace fake corporate community building.
44: Crikey, I hadn't heard Ceglowski was trying to do this. Geez, if only. If only. It's crazy troubling that software is being built to run the world, under the instructions of sociopaths (b/c all senior mgmt are sociopaths, ffs)
49-50: "9/10 of legacy enterprise systems migrations fail"
For "legacy enterprise system", think stuff like
(a) checking account system
(b) auto loan system
(c) system managing ID badges for large company
etc, etc. Basically, these are all systems that involve business logic and automate some part of the business process of a company, yes? These days, those things get written on some J2EE platform, though maybe more and more in Node.JS or something. They -used- to get written on mainframes and other old platforms. A "migration" is when you try to rewrite a system from some older platform onto some newer platform. The failure means just that you don't manage to do the job well enough to eliminate the old system.
A great example of why this is hard is in databases. All the major enterprise databases (oracle, sybase, ms-sql, informix) come with "stored procedure languages". Think of them as custom versions of BASIC, embedded in the database. Vendors encouraged customers to write their applications in those languages. So even though the database's "query language" was more-or-less standardized (SQL), the applications themselves were written in these wildly custom languages. So imagine now that you're a big-ass bank, with ... 300 different Sybase applications, all over the company, written over two decades. And you want to switch to Oracle for your database vendor. You have to rewrite 300 applications from Sybase's "Transact-SQL" to Oracle's "PL/SQL". Goooooood luck. So it's not surprising that there's enormous lock-in. Just imagine the *cost* of that rewrite. It's completely insane.
Then there's the old IBM mainframe applications. It's public knowledge that many banks have applications written decades ago, whose authors retired and possibly died, for which they've lost the source code, and which have been patched by taking the runnable program ("object code") and patching it directly ("a binary patch"). At that point, it's almost impossible to really understand what the program actually does. So a complete rewrite is .... uh .... fraught. It's also why IBM mainframes cost so much. B/c when you have a major business that can't migrate off, they will pay .... whatever they can afford. So the mainframe costs ... whatever the market will bear.
This pattern repeats, and is why you see things like the VAX/VMS operating system (from DEC, then Compaq, then HP) getting ported from the VAX, to DEC Alpha, to HP Itanium. VMS has been a dead platform for 20 years, and everybody knows it. So in 2013, HP announced they would not be updating it to the most modern Itanium hardware. Guess what? Customers complained, and as of this writing, in 2016 they did the update. I think it's fair to say, this is wholly due to customers who, even now 20yr after VMS is a dead letter, cannot migrate their applications off. Amazing, eh?
One little bit more: 20yr from now, the legacy will include also systems being written today. E.g. in Ruby. I cleaned up a mess at a well-known Internet entertainment company back in '08, whose app was written entirely in Ruby-on-Rails. Hoooboy. The craziness you can get done in RoR. I used to joke that "I for one welcome our new Ruby-on-Rails overlords, for in their world, I will never want for work". These applications, if they become business-critical, can't be rewritten -- it's just too hard. Even if *parts* can be rewritten, the cost of doing so means that much of it doesn't get rewritten. So if this is the case for some big company, with the money and resources, imagine all the thousands of small apps at mid-size companies, that will never get rewritten. Imagine the Visual Basic apps, the C# apps, etc, etc, etc. All of that, is legacy now and forever.
Yes, thanks. I now feel confident that I can work for another twenty years without learning R stat.
From the inside of one of these massive software factories, I'm not sure I'm fully onboard with 36 (Of course, the premise is that I wouldn't be, right?). We're not yet at the point where the software universe is sufficiently plug-and-chug for things to be quite so routine. Putting a lot more time and effort into tests and continuous integration is a good trend in the world, but I think it's orthogonal to creativity - it's just getting the level of crap down below 90%. On an optimistic day, I'd say it's down to 75% crap.
In my entire time there, I never saw a single bug for which I went "Holy Cow, how could they do that? That's INSANE!" Never once.
This rings completely false. Higher-level tools just enable more fascinating, higher-level bugs. Automate the entire datacenter? Sure! You also just enabled wiping out the whole datacenter with one wrong move.
I thought Bruce Sterling captured this well in Heavy Weather. Overall not a book that I liked that much, but I liked his treatment of the software scavenging/reuse part.
Shutting down the Troupe's systems was delicate work. Even the minor systems, for instance, the little telephone switches, had a million or more lines of antique corporate freeware. The software had been created by vast teams of twentieth-century software engineers, hired labor for extinct telephone empires like AT&T and SPRINT. It was freeware because it was old, and because everybody who'd made it was dead or in other lines of work.
Relatedly, OS/2 Lives!
Blue Lion was the code name for the first release of Arca Noae's new OS/2-based operating system, ArcaOS 5.0.
Blue Lion (ArcaOS 5.0) was released for general availability May 15, 2017.
Newer variants exist, but from way back in the day, when NT was coming out. (And previously posted by me here over 9 years ago.)
Two versions of OS/2 so the small machines can fly.
Three versions of DOS for the clueless in their homes.
Nine versions of UNIX for the hackers late at night.
One version of Windows for the Dark Lord on his throne.
In the land of Redmond, where the shadows lie.
One OS to rule them all, one OS to find them.
One OS to bring them all, and in the darkness bind them
In the land of Redmond, where the shadows lie.
I liked NT. I have yet to develop a tolerance for Windows 10.
What exactly does 'fail' mean here? Don't work at all, or just cost far more time and money than planned?
I remember reading a bunch of articles on the subject of "most large software projects fail" around the time the healthcare.gov rollout was happening. Let me see if I can find any of them.
The dirty little secret of course is that the people in EndUserCorp Finance Office much preferred the old mainframe systems to the meticulously engineered products they work with today, because if they were hit with some new internal regulation or had a bright idea for marginally improving their process they could just put their head around the door of the IT office and ask Fred, who had worked with them for twenty years and understood their systems as well as they did, to mod the program, and Fred would comment out a few lines of COBOL and write a few more, run it through the test Db and have it in place for them in 48 hours.
They felt cared for, which counts for something, even to bean counters.
It was a kinder, gentler era of corporate greed and malfeasance.
re: 45
JSON, not JSON-LD?
< /dev hipster >
I'm in the middle of building a cultural sector cloud platform, which is a fairly big project, by smallish agency standards. Dozens of services, etc.
Most of the projects I've worked on have been successes. In the sense that they basically do what they were supposed to do, and are more or less on time.* But none of them have been smooth-sailing, and developers always grumble about process, and commercial people about cost/estimation.
Clients in the sector I work in always expect the moon on a stick, for 'dog shit on a twig' prices.
* as in run over 10-20%, not run over by 100% or whatever.
I don't know if I agree that senior management are all sociopaths, either.
I've worked in big corporates, higher education, and now SME private sector. I'd say that I worked with a moderate number of sociopaths in the corporate sector, more (if I'm being really honest) in higher education, and I don't know if anyone in my current place of work, right up to director level, is a sociopath, or even displays tendencies. But I would say that, as I'm management.
Why? B/c it's "artisanal" in the sense of pre-assembly-line factories. We have no way of ensuring that workers (SWEs) produce software units that work as they're supposed-to, and no way of ensuring that they fit with the other units produced by other workers.
if they were hit with some new internal regulation or had a bright idea for marginally improving their process they could just put their head around the door of the IT office and ask Fred, who had worked with them for twenty years and understood their systems as well as they did, to mod the program, and Fred would comment out a few lines of COBOL and write a few more, run it through the test Db and have it in place for them in 48 hours.
They felt cared for, which counts for something, even to bean counters.
Incidentally, I work for a small company doing software for a specific niche (which we didn't set out to do, we just ended up realizing that we had an opening, that there wasn't much competition, and that there were structural reasons why it would be difficult for a competitor to target the niche), using older languages and older programming techniques, and both of the above apply to my job.
I'm proud of how well we do at not having bugs and having good coordination, which we achieve mostly by having a couple of us who have been working on it for a long time -- and none of us have left yet -- so there's a lot of institutional memory.
It's way less money than working for a big company (or a fancy startup), but Chet's comments make me appreciate the benefits. It's a nice way to work in terms of job satisfaction.
These days it takes months, rather than 48 hours, for a new idea to make it to the end user, because we need to make sure that it will work for everybody, not just one user, but I do think a lot of value that this model delivers, both for us as programmers and for the end user, is that we do get to care about people in a fairly direct way.
61: "In my entire time there, I never saw a single bug for which I went "Holy Cow, how could they do that? That's INSANE!" Never once.
This rings completely false. Higher-level tools just enable more fascinating, higher-level bugs. Automate the entire datacenter? Sure! You also just enabled wiping out the whole datacenter with one wrong move."
I think you're thinking about the wrong sort of software. Your example is "software that runs computers". "Enterprise software" is "software that runs real-world businesses". Yesterday, MUNI ticket-kiosk machines were failing intermittently all day. That's the sort of bug I'm talking about. Also, believe it or not, even back to Jim Gray's 1985 paper "Why Do Computers Stop and What Can Be Done About It?" operator error has been understood to be the prime cause of computer failures. And in well-run shops, that gets worked-on, too.
Also: I spent from 1998 to 2006 basically only doing one thing: cleaning up dumpster-fires at banks/insurance/etc (my employer's customers). Debugging other people's code. All in enterprise software systems. The -idiocy- of the bugs these people put in their software cannot be described with equanimity. To this day, I refuse to use internet banking, b/c having seen a "USA Today-sized security hole" and getting pushback from "the security architect" that in fact, everything was working as designed, I'm completely unconvinced that these jokers actually know what they're doing.
BTW: there's a big-picture message in this series of comments. Those I/T folk who pretend they know so much? Almost to a -man-, they're idiots. Pretending. Once upon a time they knew some incantations, and now they repeat 'em like magic spells, with precisely as much effect. Useless as tits on a boar. They're paid well b/c they sit on critical systems and farm 'em like any other rent-farmer.
I just happen to have in my desk drawer a copy of NCR Century COBOL Student Text from 1974 (I just inherited it from someone, I never programmed in COBOL).
The COmmon Business Oriented Language (COBOL) is a near-English programming language designed primarily for programming business applications on computers. It is described as a near-English language because its free form enables a programmer to write in such a way that the final result can be read easily and the general flow of logic can be understood by persons not necessarily as closely allied with the details of the problem as the programmer himself.
I guess, generally, to the level where I met with director-level people fairly regularly, but was not one myself. Level below that, basically.
BTW: there's a big-picture message in this series of comments. Those I/T folk who pretend they know so much? Almost to a -man-, they're idiots. Pretending. Once upon a time they knew some incantations, and now they repeat 'em like magic spells, with precisely as much effect. Useless as tits on a boar. They're paid well b/c they sit on critical systems and farm 'em like any other rent-farmer.
I feel like there has to be more than one category. There are people who blithely fit that description, a larger number of people who are terrified that they fit that description (and do to some degree, but mostly make an effort) and a small number of people who are very good at their jobs and not idiots.
I wonder what the percentages are.
71. I only ever endured one* sociopathic CEO, whereas middle managers who, to put it politely, were counter-productive, seemed to be two a penny.
*At one other place I worked, which was in the throes of changing from a family firm to a full on multinational nightmare, one of the surviving members of the family was a dangerous lunatic, but if you kept physically out of his way it was OK because there were a bunch of senior managers dedicated to rendering him harmless, at least to the firm.
68: "The dirty little secret of course is that the people in EndUserCorp Finance Office much preferred the old mainframe systems to the meticulously engineered products they work with today"
Yes, but:
(1) their mainframe vendor is charging them (effectively) a percentage of their company's profits. Carefully measured and re-calibrated every time the licenses re-up. And for sure, the vendor's doing very little for that money. It's basically an annuity, and EndUserCorp doesn't wanna keep giving up that $$
(2) when it comes time to do really new things, that mainframe app is very difficult to upgrade. Basically you have to layer new systems in front of it, to do anything interesting. Example: when a major mutual fund co. allowed users to start checking their IRAs from mobile phones, their mainframes melted down -- b/c the traffic was 99+% reads, and was far too much for the machines they'd bought. They couldn't buy more (b/c too pricey). So they built entire new systems in-front to take that read-load. Which involves new complexity and caching and what-not.
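A toy sketch of that "build a new system in front to take the read load" move, in Python. None of this is anybody's actual code; it's just the shape of a read-through cache with a short TTL, so the overwhelmingly read-heavy traffic rarely touches the mainframe:

import time

CACHE_TTL_SECONDS = 60
_cache = {}  # account_id -> (fetched_at, value)

def fetch_balance_from_mainframe(account_id):
    # Stand-in for the slow, expensive legacy call.
    time.sleep(0.5)
    return {"account_id": account_id, "balance": 1234.56}

def get_balance(account_id):
    """Serve reads from the cache; only fall through to the mainframe on a miss."""
    now = time.time()
    hit = _cache.get(account_id)
    if hit and now - hit[0] < CACHE_TTL_SECONDS:
        return hit[1]
    value = fetch_balance_from_mainframe(account_id)
    _cache[account_id] = (now, value)
    return value

The real versions use a shared cache tier (memcached or similar) rather than an in-process dict, which is exactly the "new complexity and caching and what-not" getting layered on.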
But really, it comes down to #1. OBTW, the same can be said for other parts of I/T software. E.g. data-warehousing. This "hadoop" stuff some of you might have read about -- that's really a cheap-and-cheerful way of doing data-warehousing. But it's much more labor-intensive. So why do it? B/c data-warehouse vendors charge for their systems by the amount of data you put in them. "They charge by the terabyte". The cost of the licenses vastly dwarfs the cost of the hardware. So even though you gotta write new programs, hire pricey programmers, etc, for a decent-sized company it can STILL be cheaper than buying Teradata. Which is why companies like Teradata have been forced to cut prices like crazy.
re: 77
I also think it doesn't work at the level of whole people, but at a level more granular than that.
There's things I do, where I either really am an imposter, or strongly have the feeling that I am. Other things, where I'm pretty competent, and pretty sure in my self-perception of my own competence. And things where I'm pretty bad at it, but I'm pretty sure that most other people are bad at it, too. And loads of other categories in the Venn diagram of competence/self-knowledge. All at the same time.
I certainly have worked with a lot of total incompetents in all the things, and a smaller number of people who are really genuinely good at almost everything they do. Most people sit somewhere in the middle.
Clients in the sector I work in always expect the moon on a stick, for 'dog shit on a twig' prices.
I'm going to have to steal that in some form someday.
Lesson 1 taught to every callow kid embarking on a career involving software is that they must always start out by telling customers, "Good. Cheap. Quick. Pick any two; you can't have all three."
re: 81
I've spoken to two potential clients in the past couple of months who basically wanted a vast stack of services* for insanely little money. One was expecting to get most of that for about 2 weeks of developer time at commercial rates. The other for more, but still, about 20% of the realistic amount of time it'd take to build what they wanted.
They seemed quite surprised when I said that that wasn't going to happen.
* preservation, repository, text analysis, image delivery, image analysis, OCR, sophisticated rights and access management, multiple front-end APIs, front end CMS, etc etc
I am relatively new to software development, but 36 et al. seem both very insightful and exaggerated enough in a lot of the claims as to not be all that clarifying (starting with the 9/10 failure stat).
In my corner of the world (implementing browser-based user interfaces and orchestrating web services for enterprise saas apps), you can't really separate hypothesis and execution risk. If you've built the wrong thing, you need to iterate quickly and deliver a new interface, with all the execution risks that delivering quickly entails. Regression tests for UI are likely to be brittle. Things like componentizing web interfaces (through all the JavaScript people love to hate) can help get you closer to an industrial model, but the need for speed to market is always going to put a limit on how much quality control you can exert through good processes. Having people that can hold a lot of the system in their heads and anticipate the effects of changes will always be a huge asset in those circumstances.
36: So, to merge medicine and software -- why must Epic suck so much?!! I am constantly finding problems at my organization and filing tickets.
Doctors are becoming cogs in a corporate machine -- which they hate -- but at least where I work, they get so much deference that it goes to their heads; they're treated like gods. Some are down to earth, but others can't believe that most people outside of medicine have brains.
21: Software engineers discover they should have been pro-union somewhere between age 45 and 55, when they're no longer young, fast, and possessed of tremendous stamina. [oh, but I repeated myself twice *grin*]
Because their SAG memberships have done such a wonderful job of making sure that Hollywood actresses will continue to get leading roles into their 50s and 60s on a regular basis, right? And rigid union seniority rules turned out to be not so great for pilots flying for a bunch of now-defunct airlines, when they had to either retire early or start over at the bottom of the ladder at a new airline.
Which is not to say that age discrimination isn't a problem in software, nor that unions couldn't do anything about it. But that's two examples off the top of my head of unionized highly-paid professionals in other industries that suggest that unionization isn't automatically going to solve that particular problem either.
Doctors... get so much deference that it goes to their heads - treated like gods... can't believe that most people outside of medicine have brains.
Omg, I could write at annoying length on this subject but lucky you, I don't have time. The unsatisfyingly short version is: 1) I find this *such* a weird and particularly American phenomenon; 2) I could see it when I lived in the US but now that I've been elsewhere for a few years interacting with doctors in/from a few other countries, HOLY SHIT is it a weird and particularly American phenomenon. 3) I've taught medical students in both the US and UK, and while they are certainly not dummies, as a class, they are [unfair generalisation warning] among the least intellectually curious students I've ever had. I'd rather teach first year bio.
With all that off my chest, I have to say that this --
becoming cogs in a corporate machine -- which they hate
-- inspires genuine sympathy. One of the great lessons of American culture seems to be that as long as you keep telling people how special they are, they never notice you've got their trousers down.
When I first started going to my current surgery here in the UK, I was convinced my GP was in over his head; he seemed to go through every appointment as if stunned and only just able to create the impression that he was actually present. Then I started reading up on what's going on with NHS funding and the plight of junior doctors (of which my GP is one) and I thought, my god, the poor bastard's just sleep deprived is all.
Yeah, so doctors aren't gods and I don't even care if they get to be rich people but they sure don't deserve to be treated like cogs any more than the rest of us and one wishes they could see that in the US as well as they can here.
Or maybe I managed annoying length after all.
This thread probably has the longest comments on average of any recent thread.
88. Out of curiosity, med students in the US are normally post-grad. In Britain they're undergrad. Do you find this makes a difference to their intellectual curiosity, and if so what?
Tangentially related to these really interesting thoughts about working:
Hamming's talk to Bell Labs on productivity and working well
Anecdote: I have a friend, and one of his friends did med school in France, but found it too boring. After finishing, he came to the US, and completed a Physics PhD (after backfilling missing undergrad work). I think the reverse (med school in US, Physics PhD after) would be .... inconceivable. Med School in the US requires so much more .... *focus* to get in ....
I also remember when I was just graduating undergrad in Houston, my pre-med friends referred to the guys who didn't actually have any intellectual curiosity, but were shooting for the grades to get into med school, as "gunners". I suspect the same thing is happening in programming these days.
I continue to believe it's about the pervasive effect of outsize compensation, sigh. Inequality, sigh.
Layton's _The Revolt of the Engineers_ is specifically about engineers as a class getting suborned by a chance at middle management. It's pretty depressing.
Speaking about labor history this is a fantastic project and I really hope Loomis gets this fellowship: http://www.lawyersgunsmoneyblog.com/2017/06/little-help-friends-whatever-people
91: Yes, but possibly not in the way you'd think. UK medical students, as undergrads and therefore younger people, seem still more likely to be interested in things generally. US medical students, who are older on average and see med school as a kind of professional program, have that "focus" (see 93) that works against intellectual curiosity. UK students are more like students generally; who knows what you're going to get b/c they're still figuring it out themselves. I supervised three medical students this year on research projects and one of them was absolutely great, in the way that a really bright undergraduate can be great. By contrast, teaching American medical students is the worst teaching experience I ever had b/c they were already mentally occupying their new station in life as *doctors*, and couldn't I just hurry and get on to what's going to be on the exam etc.
As a general statement, from my teaching experience, I'm inclined to trust US doctors more for their qualifications and experience, but UK doctors more not to be assholes.
Apologies if there are any doctors here, I realise I'm slinging generalisations fast and loose.