Slowly processing this whole post, but: I didn't think debt came up in the Culture parts of PoG. Could it, given that the Culture doesn't have money? There was an indebted drone in CP, but that was in the spun-off Vavatch:
"I am near to paying off my Incurred Generation Debt, when I'll be free to do exactly what I like..."
Yeah, agreed - and that drone (Unaha-Closp) definitely isn't a Culture drone. (No debt-servitude in the Culture!)
Similarly the damaged mind in CP, which if I remember right wound up taking a new name from the book's main character.
Nitpick - not a new name; it was unnamed throughout the book. That's mentioned in the Prologue.
The book makes clear that its identity is connected to its physical form, which makes sense to me and is, I think, perceptive of Banks to include.
Not in the three books we read, but in the others, there are a few examples of AIs changing their physical form. Masaq' Hub, in "Look to Windward", used to be a warship. In fact, it used to be two warships. The war drone at the start of "Excession" changes bodies in a rather complicated way, overwriting its subverted twin's personality with its own.
Why is it necessary for AIs to age and die? It isn't even necessary for Culture humans to die, and some choose not to.
There's also the question of why the AIs hang around with humans at all. I guess since they were invented by humans in the first place, an interest in them was somehow built in.
Maybe there are hordes of AIs in the Culture that have nothing to do with humans, but I don't recall any mention of that.
I loved the bitchiness of the Minds in Excession. It's sort of Banks' admission that you can't actually write from the perspective of vastly greater than human intelligences, so he went for comedy instead.
And I think that SC does grow its own drones; it doesn't wait for volunteers from the general drone population in the way that Contact does for humans. Drone names reflect their hardware, and Mawhrin-Skel's backstory (which is false, but presumably plausible) is that it was built as an SC drone but then found to be psychologically unfit, not that it tried to join SC but was rejected.
I haven't been reading the Culture novels, but this reminds me of a Charles Stross novel about an anthropomorphic robot in a world with no humans around. A gynomorphic robot, if we want to be specific. If anyone else has read it, it's Saturn's Children. Turns out that, spoiler alert, in this world no one ever actually figured out how to program AIs with particular personality traits or laws of robotics. AIs exist and basically have laws of robotics, but those laws aren't programmed in; they're the result of learning, psychology, and conditioning. Yes, this is horrifying.
I might read the Culture stuff; this series of posts has got me interested in it, and for Christmas I got a new e-reader to toy with. But I figure I'm too far behind to have anything intelligent to say that's specifically on topic.
Why do they just keep building drones to putter around and do a wide array of things as individuals? Why is their level of intelligence set precisely where it is - human-equivalent or moderately above, I think - but in any case so far below that of Minds? Why do the drones live for periods that are finite by choice?
The cool thing is, all the same questions apply to humans in the Culture, but with humans these questions don't come as easily because we take their presence as natural. What is the point? And I think it comes down to this: the goal of the Culture is living, which takes lots of forms, most of them without overt function, and the Culture is determined to persist as an entity for its own sake.
The intentionality is highlighted in Banks's notes where he describes the modes of existence and use of biology/technology as fashions that go in and out:
The era of the stories written so far - dating from about 1300 AD to 2100 AD - is one in which the people of the Culture have returned, probably temporarily, to something more 'classical' in terms of their relations with the machines and the potential of their own genes. The Culture... can look back to times when people lived much of their lives in what we would now call cyberspace, and to eras when people chose to alter themselves or their children through genetic manipulation, producing a variety of morphological sub-species.
(I don't see why a drone should naturally aspire to be a Mind, though, any more than a human should. The gap is profound.)
Why is it necessary for AIs to age and die? It isn't even necessary for Culture humans to die, and some choose not to.
Death is necessary for change, so for adaptation to novelty. Otherwise there's uncontrolled growth leading to resource insufficiency. They're great books; I'm not especially faulting them for this. Like I said, I started thinking about cooperation in the books in biological terms.
Death isn't strictly necessary, but its apparent lack for AIs raises questions. Human fertility is said to be essentially replacement level; is the biological proportion of the Culture slowly shrinking?* Without AI death the Culture doesn't have any natural limit to its growth. Both points would suggest Horza is actually right in CP: the Culture is a cancer that will never stop, and will ultimately dispense with its humans. Ajay's suggestion that Hegemonizing Swarms were being set up as an enemy is especially interesting there.
*Or is the balance maintained by immigration/assimilation?
I found it interesting that the drones are presented as essentially indistinguishable from humans in terms of their personalities. They have a full range of human-like emotions, and they have human-like senses of humor. (Skaffen-Amtiskaw's gift of a hat to Zakalwe after his rather unfortunate injury was one of my favorite moments in the books.)
On the one hand, it's a refreshing rejection of the common sci-fi tropes of robots/androids/machines that are either cold and logic-bound, and thus always alien and "other," or cold and logic-bound but always yearning to understand this mysterious phenomenon of human emotion, and thus made sympathetic.
On the other hand, why should drones think and act so similarly to humans, when their experience of reality is so different from that of any biological organism? Why don't drones prefer the company of other drones, per 3?
Certainly this sort of portrayal is not unique--C-3PO and Marvin the Paranoid Android spring to mind as other examples of this phenomenon. And while it makes for interesting characters and engaging storytelling, it was never really explained (at least not in these three books) why drones should think and act like humans.
Or rather that Horza had a point, I suppose.
10: Because they're the friendly interface the Minds present to their pets?
I found it interesting that the drones are presented as essentially indistinguishable from humans in terms of their personalities.
It's fiction. You can posit that in the future people are interesting.
Further, the Culture is actively opposed to Sublimation, which apparently is the normal endpoint for civilizations. And it's said in one of the later books that humans transferring their minds into machines is considered bad taste, for reasons I don't remember.
13. They are not interested in having their own civ Sublime, but I don't think they are opposed to it in general. It's seen more or less as the typical final stage of a civ, but "not interested, thanks." In the last Culture book (The Hydrogen Sonata) the setting is a human civ that is Culture-level but not part of the Culture, and that is going to Sublime in less than a month; the Culture is on hand but [really teeny-tiny spoilerette] not telling them they shouldn't.
There are a lot of places in the series where Minds look at humans the way humans look at pets. In at least one case a grumpy ship's Mind refers to another ship Mind's human "crew" as its pets.
11. ISTR that drones are built targeted at a specific level of intelligence, which may or may not be much above human basic. Drones are useful and usually friendly, but they aren't really the "interface" Minds use to present to their pets; rather, Avatars are that, though they are typically a bit more prickly and god-like than drones.
Without AI death the Culture doesn't have any natural limit to its growth.
Machine intelligence doesn't have a natural baseline growth, though. It's basically up to them whether they build more of themselves or attrit over time. And in their case, the squishy part doesn't have a natural growth either: fertility is at almost exactly replacement. (Banks wrote about that as if it were a natural concomitant of gender-swapping, that the typical human individual is at some point a woman and at some point decides to have a child, but the average works out to one; recalling it now I don't see it as all that natural, considering they like babies as much as anyone. Possibly more rejiggering of biological impulses? Plus their cultural baseline of course; they don't need children to ward off fear of death.)
15.1: True. It does always appear to be building more ships though, and outside wartime AIs presumably would get attrited no more often than humans. Based on Homomda policy the Culture was certainly growing at the time of the Idiran war; and IIRC the books cover something like the last 1,000 years of a 10,000 year history.
16: I think you're wrong to write off attrition as a significant cause of death; why shouldn't drones die by accident or malice at a fairly high rate? The galaxy is a dangerous place. Stuff goes wrong, even in the Culture. (If we cured every infectious and non-infectious disease the human lifespan would still only be about 600 years.)
If Culture human bodies wear out it's by design. They certainly can, if they want, engineer a body that does not age or deteriorate.
I always liked the "defense in depth" breadcrumbs - how their organs are more redundant, they can bypass foods, survive in vacuum, etc.; and at a larger scale, a GSV could if necessary recreate from itself the entire Culture.
17:Malice from whom? I excluded wartime. Accident: the Culture values machine life as highly as human and won't send drones into harm's way except when absolutely necessary; and when that happens, the drones could back themselves up beforehand. We know from Look to Windward the Minds did this before going to war, for instance. If the drones aren't backing themselves up before fixing the fusion reactors or whatever, that's a deliberate choice on their part, and the OP is right to identify the absence of information about such decisions as a hole in the world-building.
Death is necessary for change, so for adaptation to novelty. Otherwise there's uncontrolled growth leading to resource insufficiency.
Is this necessarily the case in a "parsec hopping" (Moorcock, IIRC) civilisation like the Culture? The Culture is clearly a long way short of filling its home galaxy, otherwise SC would be unnecessary. Additional Lebensraum is created regularly by other civilisations subliming, and there doesn't appear to be any particular physical limitation on the sort of system they can colonise. If they start running short of building materials in another million years or so, they can probably expand into the local group. But, as people have said, population growth, both biological and artificial, is pretty slow. It might even have stabilised at replacement, or they might have that as an objective.
It seems to me that resource shortage is a problem they've kicked down the road for perfectly good reasons.
Agreed. The Culture only includes a small fraction of the galaxy and there is a lot of room for growth; they are building Orbitals fast and of course you can have a large number of Orbitals per sun. Even if the Culture has only a thousandth of a percent of the suns available and puts just one Orbital at each sun, and populates each Orbital at a hundredth the density of modern Earth, and doesn't have any GSVs or planets or anything, that's still a population of six quadrillion - two hundred times the actual population of the Culture.
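A quick back-of-envelope check of that arithmetic (all the inputs below are my own rough assumptions -- an order-of-magnitude star count, an Orbital offering roughly a hundred times Earth's surface, circa-2000 Earth density, and a Culture population of around 30 trillion -- not figures from the books):

    # rough sanity check of the "six quadrillion" estimate above; every input is assumed
    stars_in_galaxy = 1e11                        # assumed order-of-magnitude star count
    orbitals = stars_in_galaxy * 1e-5             # "a thousandth of a percent" of suns, one Orbital each
    earth_surface_km2 = 5.1e8
    orbital_surface_km2 = 6e10                    # assumed: an Orbital gives ~100x Earth's surface
    earth_density = 6e9 / earth_surface_km2       # ~12 people per km^2 over Earth's whole surface
    pop_per_orbital = orbital_surface_km2 * earth_density / 100   # a hundredth of Earth's density
    total = orbitals * pop_per_orbital
    print(f"{total:.1e}")                         # ~7e15, i.e. several quadrillion
    print(total / 3e13)                           # ~200x an assumed Culture population of 30 trillion

which lands in the same ballpark as the six quadrillion / two hundred times figure.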
"Malice from whom? I excluded wartime."
Anyone. It's a dangerous galaxy.
And these random actors are killing (un-backed up) Culture AIs at sufficient rate to cancel out new production? I don't think that's directly contradicted by the books, but that doesn't seem at all plausible to me. See frex Look to Windward: the Hub has to get explicit consent for Displacement because there's a one-in-a-zillion chance it won't work. The Culture is an extraordinarily safe place outside Contact.
Why does the Culture population, drones or humans, need to be steady?
Culture human population grows as more societies join the Culture, though. And resources and space are really not an issue; the Culture is still a young society, remember.
26:
(1) I never rested anything on the resources issue.
(2) Long-term shrinkage of humans relative to machines, as I suggest in 9, has implications. Not necessarily bad ones, but interesting ones, which you pointed to in your own post.
(3) The Culture's ability to assimilate new civilizations is a massive question which IIRC Banks never grapples with at all. To bring the Culture back into reality as an allegory,* Banks is saying roughly, "We, the enlightened people, can wall off our garden and keep cultivating it, and the barbarians outside can come in whenever they grow up and get with the program."** That's fine in principle but very hard in practice, as we're seeing in real time.
*Which I take to be the ultimate point of the books and of these threads.
**"And we'll unilaterally intervene in their affairs to push them in that direction." But that's a separate discussion.
I think there's something wrong with talking about the Culture as though it is a state with borders and territory and an immigration policy. And that makes using it as an allegory a bit dubious.
28: Fine, ditch the allegory. The assimilation problem remains in-universe, and becomes even harder, because it's a question of norms and judgement calls, not laws.
In the absence of laws, of taxes, of residency permits and passports, the question of who gets to be a citizen or not becomes rather academic. What advantage is there to being a Culture citizen?
I mean, the Culture is a _culture_, hence the name.
The Culture contacts us tomorrow. Would you like to move there? The question isn't academic at all.
If that happens, I'd wish I'd read the books instead of watching English people kill each other with improbable ease.
32: what? I think obviously yes provided you can bring your family along.
What advantage is there to being a Culture citizen?
Um, top-notch health care, for starters. And not having to work. And not having to worry about money, ever. And lots of great sex. And being able to experience sex as a man and as a woman, and then later as a man again, etc.
I dunno, seems like a pretty sweet deal to me.
32. The question isn't academic at all.
Hmm, "Mossy Character" isn't a bad Ship name. Do you know something we don't know? Last I heard we were being used as a "control" and thus not Contacted. Perhaps now that Contact agent Banks has reported in, things will change.
35: but those are all advantages to being a Culture resident. As long as a GSV or an Orbital is willing to host you, you get all of those things and more.
As for the assimilation problem: it doesn't exist because it's a category error. Assimilation problems are when you have a group of people who are legally part of your state, or resident within its territory, but do not share its culture and values, right? Lots of non-English-speaking Chinese-origin people in Britain: assimilation problem. Lots of non-English-speaking Chinese-origin people in China - no problem.
But the Culture is not a state; it's a culture. You can't have an unassimilated member of the Culture for exactly that reason. What would he look like?
There may or may not be a category error,* but it's a distinction without a difference: The Culture does have territory, however non-contiguous, and it does exercise control over that territory and actors within it. You just said it yourself: Assimilation problems are when you have a group of people who are legally part of your state, or resident within its territory, but do not share its culture and values.
The Culture cannot recognize as citizens** individuals (or, more importantly, societies) who do not share its values. This requires that the Culture's values be defined. How that definition is reached and maintained, and by whom, are crucial questions that Banks doesn't answer at all; the AIs are somehow so wise and so awesome that almost all of them rub along just fine. It's fiction, so whatever; but Culture AI is firmly in the realm of magic. The question of who gets to be Culture is also crucial, and not just for who gets to join the party. Who gets access to all the Culture's technology? Knowledge? War plans?
*This "culture" has armed forces, a diplomatic service, a standardized language, and the demonstrated ability to wage total war. Sure as shit looks like a state.
**The resident/citizen distinction doesn't do any work here. For one, there's no reason to think there are significant numbers of non-Culture residents: we see a handful of SC mercenaries, one ambassador, one composer, against, canonically, trillions of Culture humans. For two, residents will need to abide by Culture values whether recognized as Culture or not: you can't have al-Baghdadi wandering around executing heathens. Far more importantly, you can't have equiv-tech AIs accessing Culture resources without sharing Culture values. The only time we possibly see that happening, the outside AI was associated with a rogue faction fomenting major war and getting Culture citizens killed in the process.
Who gets access to all the Culture's technology? Knowledge? War plans?
Right, this exactly. The Culture is post-scarcity, and it's post scarcity for technological reasons rather than because it's sitting on some special mountain of resources. Any other society that has the science and technology the Culture does should be post-scarcity too. How are there still poorer civilizations that are aware of the Culture and aren't in the midst of ramping up at high speed to Culture-equivalent tech and wealth levels? The Culture seems as if it would be really bad at keeping secrets.
The Culture does have territory, however non-contiguous, and it does exercise control over that territory and actors within it.
Well, who are you talking about that exercises this control?
Take a GSV, the Grand Tourist. The GSV Mind exercises control over the GSV; if it wants someone on, it can invite them. If it wants them off, it can ask them to leave, and if they won't go it can Displace them off. It controls the resources and equipment that the GSV contains, and the knowledge that it holds within its own memory banks - including the knowledge of how to build warships, and the memory of any strategic discussions it may have had with other Minds.
But who else has control? If I'm a Culture citizen (whatever that means) and I want to live on the GSV Grand Tourist, but the Mind doesn't like me for some reason... can "The Culture" intervene and enforce my right to live there? No.
Can it compel the Grand Tourist to expel people who it doesn't think should be there, or disburse resources that the Grand Tourist would rather hang on to? No.
Because there is no "The Culture" in the sense that there is a "French Republic" or a "United States of America". It isn't a state! It doesn't raise taxes! It has no central government, representative or otherwise! It has no unified legal system! It has no centralised register of citizenship!
How are there still poorer civilizations that are aware of the Culture and aren't in the midst of ramping up at high speed to Culture-equivalent tech and wealth levels?
Such as whom?
The Azadians, explicitly, are not aware of what the Culture really is; they don't know the full extent of Culture technology. That's a deliberate decision by Contact. Same goes for most of the societies we see in Use of Weapons.
The Culture seems as if it would be really bad at keeping secrets.
Yes it is - see most of the first half of "Use of Weapons", which is about SC intervening when one of its agents goes rogue and decides he wants to set up his own one-man Contact section and completely screws it up. There's a general consensus among Culture citizens, especially those in a position to do so (i.e. Contact and SC) that this sort of uncontrolled leaking is a bad idea, and Contact/SC will intervene to stop it if necessary.
Again, I'd have to go back to the books for cites on which societies know what. But generally, non-Culture people are often aware of the existence of sexy Culture people with their orgasms and their drug glands. And non-Culture societies often have space travel that seems to be capable of getting them physically to Culture locations.
Why don't (incredibly rich, government funded, whatever) tourists show up at Culture orbitals and sweet-talk rando Culture humans or Minds into handing over a thumb-drive with a copy of "So You Want To Bootstrap Yourself To Post-Scarcity?" The Culture's an anarchy, someone would have to think that kind of technological foreign aid was a good idea, right?
41: Doesn't answer any point raised in 39. In fact it highlights how statelike the individual Minds are, and points toward the history Banks describes in the notes: the Culture originated as a confederation of sovereign space habitats, and appears still to be so; except that its sovereigns have become so homogeneous as to present a highly unified - statelike - front to the rest of the galaxy.
And, from the Culture's point of view, why isn't it a good idea? The 'other galactic powers would start punitive wars over it' theory just doesn't seem plausibly well-developed: from the point of view of the other galactic powers, they'd be locking the barn door after the horse was stolen.
I've never understood that expression. Who has a whole barn with only one horse in it? Even if there are no other animals, there's probably equipment and tack.
48: It's an expression used by urbanites who have never even visited Nebraska.
49: Within the last ten minutes, yet!
I guess it usually is "closing the barn door."
I feel like a lot of the discussion is beside the point of what Banks was trying to do. He was trying to imagine a 95% utopia, and then write stories from the point of view of the other 5%. It's not an accident that the very first Culture novel closely follows one of the Culture's outright enemies. A novel about a 100% utopia is boring as shit, so it's a way to talk about utopia from the outside.
Why don't (incredibly rich, government funded, whatever) tourists show up at Culture orbitals and sweet-talk rando Culture humans or Minds into handing over a thumb-drive with a copy of "So You Want To Bootstrap Yourself To Post-Scarcity?" The Culture's an anarchy, someone would have to think that kind of technological foreign aid was a good idea, right?
Well, not all Culture citizens would be in a position to provide that thumb drive. The ones who would are mostly Minds. And I don't think it's unrealistic to think that there is a general consensus among Culture Minds that that sort of thing should be left to Contact Section to handle. Going rogue and giving away stuff like that willy-nilly seems like an Eccentric thing to do, and there seems to be a fair social stigma attached to going Eccentric.
We don't get much of a sight of other equiv-tech civilisations in CP, PoG or UoW, but the later books make clear that there are a lot, and that they have their own Contact-ish tech aid/advisory relationships with less advanced civilisations in their areas of influence, which the Culture generally respects and avoids interfering with. Your tourist/thumb drive scenario implies the existence of societies that aren't equiv-tech or even close to it, but are close enough to Culture territory to reach it easily, but haven't been detected or approached by Contact Section yet. Which is plausible but unlikely.
Who has a whole barn with only one horse in it?
Someone who keeps forgetting to close the door?
They'll mostly come back, if you've been feeding. But if you've only got one horse, you may as well just let it use the guest bedroom.
Also, aren't barns for harvested crops? I've always heard the expression as referring to stable doors. Stables are where you keep horses.
Maybe if you forget to close the barn door the horse will get in and start eating the grain.
Originally, but "horse barn" is the kind of phrase you'll see in America.
Under US tax law the horse is classified as a root vegetable.
The Midwestern topsoil is so rich you don't have to breed the horses, you just harvest them in fall.
Why "haven't been detected or approached by Contact Section yet"? The Culture is an anarchy -- all you need is one Mind, with one copy of "So You Want To Bootstrap..." to start passing it around, regardless of what Contact is doing. Social stigma or no, Eccentrics exist. And why would only Minds have access to that kind of thumb drive? The Minds were originally built by biological intelligences (or by AIs that were built by biological intelligences), so the technological knowledge to get there is fully comprehensible by biological intelligences.
I don't think it's possible to keep technological secrets through a general consensus supported by social pressure. A boat that doesn't have many holes in it still leaks.
It's more that "barn" has lost its meaning as a place to store harvested crops. Grain goes in a grain bin. Silage in a silo. Hay stays outside in stacks and giant Swiss-roll things.
Just as barnacle geese hatch from goose barnacles, chestnut horses hatch from horse chestnuts.
LB is just coining phrase after phrase.
The books are fun, but I can't do anything meaningful with what they say about the morality and so on of contact between cultures when I can't make myself buy the implicit premises.
"A boat with one hole still leaks" would be a good fortune cookie saying.
"A rising tide lifts all boats except the one made of solid lead" would be another.
And why would only Minds have access to that kind of thumb drive?
Well, because it's more than a human could hold in his brain. If he wanted one, he'd have to say "hey, Hub, can you give me a thumb drive with all this stuff on it?" and Hub might well say no or at least ask why.
I don't think it's possible to keep technological secrets through a general consensus supported by social pressure.
And backed up by Contact and SC choking off such leaks when they detect them, don't forget.
Well, I'm not saying it's a perfect system. But if it works 99% of the time then that's good enough to explain why most of these poor societies close to Culture space still exist (if they actually do).
You haven't seen my pointy hat and boots.
You should have pointy boots to ride horses because they go into the stirrups more easily.
Well, because it's more than a human could hold in his brain. If he wanted one, he'd have to say "hey, Hub, can you give me a thumb drive with all this stuff on it?" and Hub might well say no or at least ask why.
If a human Culture citizen started downloading a series of reference books dating back to when humans developed the Minds onto some portable playback device, would a Mind stop them? If no, then there's your leak -- one person out of trillions has to do that. If yes, then the Culture is less of an anarchy (except as between the Minds, who are the only really free participants) and more of a hedonist totalitarian state.
But if it works 99% of the time then that's good enough
Really, no it isn't. To keep a secret, you have to keep it 100% of the time.
Keeping a secret 99% of the time is probably better than the CIA's record.
It's probably harder when everybody involved is the same species.
I believe it's stated that the AIs started producing themselves autonomously, independent of humans, a long time ago, and that the Minds are the product of a long period of refinements and improvements. Would the full "Here's how to make yourself a Culture" instructions be comprehensible to anything less advanced than a Mind? And if you're already that advanced, then you presumably don't need the instructions.
This "culture" has armed forces, a diplomatic service, a standardized language, and the demonstrated ability to wage total war. Sure as shit looks like a state.
But a state with no apparent decision-making process, which sounds profoundly unstable. Suppose SC has identified external civilisation X and decided (how???) to adopt a particular nuanced approach to contact; and in the same timeframe somebody else, anybody, for the sake of argument the good ship Fuck All This For A Game Of Soldiers, has also become aware of civ. X but has embarked on a different nuanced approach, because it wants to. What happens? Does SC (who?) contact FATFAGOS to tell it it mustn't (on what authority?). Suppose FATFAGOS thinks SC is wrong in this case, is the situation arbitrated? (by whom?). Suppose it is arbitrated and the arbitrator sides with SC, who gets to make FATFAGOS accept that decision?
Armed forces and a diplomatic service, yes, but no police AFAICS.
Would the full "Here's how to make yourself a Culture" instructions be comprehensible to anything less advanced than a Mind?
No, but the "Here's how to make yourself a first-gen AI capable of improving its successors into a Mind" instructions would be. And at that point, isn't Bob essentially your uncle? (I swear I'll stop talking like this soon.)
If a human Culture citizen started downloading a series of reference books dating back to when humans developed the Minds onto some portable playback device, would a Mind stop them?
Once the Mind worked out why the human was doing it: yes. Because, again, this is very similar to what happens in the first half of UoW, and we see SC stopping it.
To keep a secret, you have to keep it 100% of the time.
Using the phrase "keep a secret" is not helpful here. It is possible for knowledge to leak slightly (the secret has not been "kept") and still not be universally available. There are plenty of people here who know heebie's actual name, for example. That doesn't mean that it is public knowledge.
To take a better example, the detailed plans of how to make an atomic bomb were not kept secret 100% of the time. They're still not available to the vast majority of the human population, though. Just because the "how to become post-scarcity" blueprint leaks once, to one planetary civilisation, it doesn't mean that it will immediately be universally available, for very obvious reasons (I use the atomic bomb analogy advisedly).
Maybe not. Because if it took a few years or generations to go from first generation AI to a Mind, you'd have to worry about the reaction from other people (people-like aliens?) during the transition. You can lead a horse to water but you can't make them drink if the water is local elites losing their status to AI.
80: Right. You can imagine a start down the evolutionary pathway that eventually leads to the Culture halting or going wrong for all sorts of reasons.
They're still not available to the vast majority of the human population, though.
We're getting into simple contradiction here, but yes they are. Nuclear weapons control is about controlling the necessary materials, which are scarce, and about the industrial plant necessary to manufacture them (which is big enough to be too expensive for anything smaller than a country, and big enough to be visible for other countries to interfere with). While there are lots of small secrets about details, "How to build a nuclear weapon" is a nonsecret set of facts that anyone with the non-secret education capable of understanding the science and engineering can look up.
Suppose SC has identified external civilisation X and decided (how???)
Decided after a discussion among a panel of interested SC Minds.
to adopt a particular nuanced approach to contact; and in the same timeframe somebody else, anybody, for the sake of argument the good ship Fuck All This For A Game Of Soldiers, has also become aware of civ. X but has embarked on a different nuanced approach, because it wants to. What happens? Does SC (who?) contact FATFAGOS to tell it it mustn't (on what authority?).
I would imagine that that's the initial step, yes. SC goes to the rogue ship and says, look, we're SC, this is our decision and these are the reasons why we made it; why are you doing something different? And hopefully they can argue out a solution. It's not clear in this example whether the FATFAGOS is a Contact vessel or not. If it is, then SC has the threat of having it expelled from Contact.
Suppose FATFAGOS thinks SC is wrong in this case, is the situation arbitrated? (by whom?)
Well, presumably SC and FATFAGOS both try to pull in more and more allies on their side, either SC Minds or Contact Minds or whatever, who will add weight and sophistication to their arguments and kudos to their side. We sort of see this happening in Excession. Eventually, hopefully, the SC Minds admit they were wrong, or the FATFAGOS does.
Suppose it is arbitrated and the arbitrator sides with SC, who gets to make FATFAGOS accept that decision?
If everyone that either side brings in agrees with SC, but FATFAGOS still won't budge - then I imagine SC does its best to minimise the impact of the rogue policies, and/or discourage FATFAGOS from intervening any more.
77: Exactly. Problems of governance are totally handwaved.
While there are lots of small secrets about details, "How to build a nuclear weapon" is a nonsecret set of facts that anyone with the non-secret education capable of understanding the science and engineering can look up.
The general principles are, yes. But the small secrets about details are what make it actually work. Yes, it's public that you need a polonium-beryllium initiator to produce your initial neutron flux; but how you make one, starting from a lump of each metal, is still very secret indeed, and not the sort of thing that you can look up at all. It's public that you need explosive lenses to produce an implosion; but the exact design of an explosive lens is still secret, as is the hydrocode (IIRC) software that you use to design them. That's why I said "detailed plans" and not "the basic science and engineering involved".
Because if it took a few years or generations to go from first generation AI to a Mind, you'd have to worry about the reaction from other people (people-like aliens?) during the transition.
Yeah. If I stand up and say "hey everyone, I got these plans from an alien for how to build an intelligent supercomputer that will take over our society and run it forever" the reaction is not going to be a universal cry of "you do that, son, sounds like a hoot".
80, 81: Sure, the bootstrapping process could go wrong, but then your plotlines would all be about what's happening during that process, rather than Banks' actual books, where it hardly seems to happen at all.
Given that "software" is something that post-dates nuclear weapons, is it the case that the know-how to build a good nuclear weapon is still secret, but not so much the information to build an adequate one?
88: the initial explosive lenses for Fat Man were designed by hand, with calculations done using mechanical calculators and pencil and paper, by George Kistiakowsky and his team. The results they came out with are still secret, by which I mean not available to the public (even though they probably leaked to e.g. the USSR). Similarly the initiator design that Fat Man used.
90: I'm handwaving here. But as engineering problems go, are the secrets you're talking about anything that would meaningfully slow down production of an adequate nuclear weapon, or would the possession of a normal publicly available education (and the money to staff a large engineering project and so on) be enough to recreate the engineering work that people successfully did with mechanical calculators in the forties? My strong impression is that no, there's nothing secret that would be particularly difficult to recreate at the level of making something that would work.
I thought they focused on the centrifuges with Iran because they figured if they got the material, the rest was easy.
Obviously, centrifuges are an engineering problem, but one that exists because the controls on fissionable materials are pretty strong.
Well, yes to both? I mean, yes, they could be rediscovered independently if you had enough time and money. Like any secret technology in that respect, whether it's leaked out to anyone else or not. But, yes, having to do that would meaningfully slow down weapon production.
See, e.g., this story:
Today [Dave Dobson's] experiences in 1964 - the year he was enlisted into a covert Pentagon operation known as the Nth Country Project - suddenly seem as terrifyingly relevant as ever. The question the project was designed to answer was a simple one: could a couple of non-experts, with brains but no access to classified research, crack the "nuclear secret"? In the aftermath of the Cuban missile crisis, panic had seeped into the arms debate. Only Britain, America, France and the Soviet Union had the bomb; the US military desperately hoped that if the instructions for building it could be kept secret, proliferation - to a fifth country, a sixth country, an "Nth country", hence the project's name - could be averted. Today, the fear is back: with al-Qaida resurgent, North Korea out of control, and nuclear rumours emanating from any number of "rogue states", we cling, at least, to the belief that not just anyone could figure out how to make an atom bomb. The trouble is that, 40 years ago, anyone did.
The quest to discover whether an amateur was up to the task presented the US Army with the profoundly bizarre challenge of trying to find people with exactly the right lack of qualifications, recalls Bob Selden, who eventually became the other half of the two-man project. (Another early participant, David Pipkorn, soon left.) Both men had physics PhDs - the hypothetical Nth country would have access to those, it was assumed - but they had no nuclear expertise, let alone access to secret research....
...Dobson's knowledge of nuclear bombs was rudimentary, to say the least. "I just had the idea that [to make a bomb] you had to quickly put a bunch of fissile material together somehow," he recalls. The two men were assigned to one of Livermore's less desirable office spaces, in a converted army barracks near the facility's perimeter. Bob Selden found a book on the Manhattan Project that culminated in America's development of the bomb. "It gave us a road map," Dobson says. "But we knew there would be important ideas they'd deliberately left out because they were secret. This was one of the things that produced a little bit of paranoia in us. Were we being led down the garden path?"
Eventually, towards the end of 1966, two and a half years after they began, they were finished. "We produced a short document that described precisely, in engineering terms, what we proposed to build and what materials were involved," says Selden. "The whole works, in great detail, so that this thing could have been made by Joe's Machine Shop downtown."...
...Finally, after a valedictory presentation at Livermore attended by a grumpy Edward Teller, they were pulled aside by a senior researcher, Jim Frank. "Jim said, 'I bet you guys want to know how it turned out,'" Dobson recalls. "We said yes. And he told us that if it had been constructed, it would have made a pretty impressive bang." How impressive, they wanted to know. "On the same order of magnitude as Hiroshima," Frank replied.
"I just had the idea that [to make a bomb] you had to quickly put a bunch of fissile material together somehow,"
I always thought if you didn't need to worry about safety or portability or efficiency, you could just drop a big enough hemisphere of plutonium onto another hemisphere of plutonium from a sufficient height.
That's why my high school science teacher never let us use the plutonium without supervision.
The story in 95 is kind of great. I mean, imagine being the physics post-doc hired to do that?
Right. But the really crucial thing about that was that it was not accomplished by two blokes in a shed. It was accomplished by two blokes in a shed who had access to unlimited and highly sophisticated engineering testing facilities at Livermore.
They were to explain at length, on paper, what part of their developing design they wanted to test, and they would pass it, through an assigned lab worker, into Livermore's restricted world. Days later, the results would come back - though whether as the result of real tests or hypothetical calculations, they would never know.
If you could do it with two blokes in a shed in two and a half years, that looks pretty bad for those countries that spent decades working on it. Did North Korea not have a shed to spare?
It was much easier to get jobs back in the day.
I always thought if you didn't need to worry about safety or portability or efficiency, you could just drop a big enough hemisphere of plutonium onto another hemisphere of plutonium from a sufficient height.
No - what you get then is a fizzle. As the hemispheres get closer together, the reaction rate picks up and starts to produce heat, and that causes an explosion that blows them apart before they make a critical mass. You still get a biggish bang and a lot of plutonium all over the place, but it's not a bomb. You need to slam them together in a really pretty specific way.
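Rough numbers on why dropping is far too slow (everything below is an illustrative assumption on my part, not anything from the books or the thread): a dropped hemisphere arrives at tens of metres per second, while stray neutrons from spontaneous fission in the plutonium turn up every few microseconds, so the chain reaction kicks off while the pieces are still well short of a proper critical assembly.

    import math
    # illustrative back-of-envelope only; every number here is an assumption
    g = 9.8
    v = math.sqrt(2 * g * 100.0)        # impact speed from a 100 m drop, ~44 m/s
    assembly_time = 0.05 / v            # time to close the last 5 cm, roughly a millisecond
    neutron_rate = 1e5                  # assumed stray-neutron rate for a weapons quantity of Pu, per second
    mean_wait = 1 / neutron_rate        # ~10 microseconds between stray neutrons
    print(assembly_time / mean_wait)    # ~100 stray neutrons arrive mid-assembly -> predetonation, i.e. a fizzle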
If you could do it with two blokes in a shed in two and a half years, that looks pretty bad for those countries that spent decades working on it. Did North Korea not have a shed to spare?
The trick is that you need a very specific kind of shed, the details of which are top secret.
That's just what they want you to think.
90,91: Pakistan's nuclear capacity owes a great deal to AQ Khan. I know less about South Africa's nuclear program, but I do not think that independent engineering played a big role there.
Examples of the smallest efforts to succeed are one way to check intuition on this -- India, I think? And even they used someone else's reactor for a breeder.
Two other mere engineering problems that look hard to solve in practice are making turbines that will produce high-performance turbojets (China can't do this; it uses older Soviet technology) and quiet submarine propellers (subcontracted supply chain). Seeing ubiquitous and cheap consumer engineering is not a great guide for intuition, I think. Lots of these systems are built from inherited high-performance infrastructure that would be difficult to duplicate.
access to unlimited and highly sophisticated engineering testing facilities at Livermore.
Okay, countries generally have enough money to pay for sophisticated engineering facilities. The point of the exercise was for them to be duplicating what a country with no access to classified information could do, not what some rando in a bikeshed could do.
If you could do it with two blokes in a shed in two and a half years, that looks pretty bad for those countries that spent decades working on it. Did North Korea not have a shed to spare?
Do we know when North Korea had a workable plan for a Hiroshima-class bomb, as opposed to something bigger and better?
105.2: two years ago, with tremendous fanfare, Xi Jinping announced that China had at last produced its first entirely indigenous ballpoint pen. They had been trying for years but it turns out that the little ball at the point is really quite tricky to manufacture.
106. NK exchanges with Pakistan-- NK can build pretty good missiles, more so than Pakistan, while Pakistan had AQ Khan, who has never been debriefed by a western security agency.
95 is pretty interesting though, I did not know that story.
This was an interesting story about a truck driver who acquired a massive amount of material about how to design an atomic weapon from publicly available sources. There's a lot more out there than you'd think, as Alex Wellerstein and Jeffrey Lewis note.
||
Barry, I'd appreciate your commentary on 6.1 in the Phlebas thread, if you have the time.
|>
The story in 95 is kind of great. I mean, imagine being the physics post-doc hired to do that?
Agreed! It's a fabulous story.
111 Seconded.
110 I'll give it a look but I feel like a big fat loser for not getting more than ten pages in. The book is right next to me on my bed.
110 I think I need more context (namely, I need to read the damned book). Maybe ajay can explain. The Umayyad Caliphate was a period of very rapid expansion and conquest, bringing a variety of religious and ethnic communities under its rule, though they were allowed a great deal of autonomy in deciding their own religious affairs, which included family law and many civil legal matters. There's much else besides, but that's the basic takeaway.