Connected Space is a spiritual prequel to Beyond the Twelve Worlds, taking some of its concepts (particularly backstory concepts) and using them to develop a hard sci-fi "space opera" with a setup that avoids adventure-destroying technologies, such as characters who can be backed up (either because they are natively digital intelligences or because they are digitized human minds).

Basic problems with a hard sci-fi space opera:

  1. The universe is too big to have multi-system civilizations.
    • No true FTL: there is no FTL drive concept that seems plausible under hard sci-fi constraints.
    • Wormholes mostly solve this, but artificial wormholes can probably only be made by extremely dramatically-boring civilizations for whom any problem short of the heat death of the universe is trivial.
    • Naturally occurring wormholes are not likely to be navigable or go anywhere interesting to people.
    • The preferred solution, then, is that the wormhole network was created by a now non-interacting "dramatically boring civilization" for its own inscrutable reasons.
  2. Individual biological life will probably become irrelevant before a large multi-system civilization could arise.
    • In the 2020s, the most likely mid-term technological scenarios favor a transition away from organic life as the species that would be sent to explore the universe. Humans would follow far behind a friendly AI colonization effort preparing the way for them, or be absent entirely because of unfriendly AI.
    • If the current AI Spring proves to be a false start on human-level AI (for example, because stochastic gradient descent is too training-data inefficient), this probably still only buys a century or so before brute force approaches like artificial evolution become viable. If artificial evolution isn't viable, then human consciousness is probably an extremely rare fluke in naturally-occurring life as well, which is unfavorable for a setting with non-human sophonts.
    • A reactionary movement rejecting AI, in the presence of a false-start 2020s AI Spring that leads to AI which threatens mass social upheaval but not superhuman intelligence, could suppress AI development for a long time only in the presence of a general technological contraction. Otherwise, it will probably be too easy for a rogue actor to develop AI anyway, given the potential rewards.
    • An interstellar civilization arising from the scenario above is likely to be crushed by the first rival it encounters that has, or is, a strong AI.
    • The civilization above would need independent reasons to reject pursuing human mind-uploading. This is plausible, but a civilization that categorically rejects mind uploading is probably highly regressive socially; and if it does not reject it, why wouldn't uploads be the ones doing the highly dangerous galaxy exploration?
  3. Technological disparity between species is highly likely.
    • Let us accept the premise that life is reasonably common in the universe, and is found on many planets where conditions compatible with large self-organizing molecules exist (which, of course, remains an open question), and that general intelligence is highly adaptive. (N.B. there is no implication that human-like psychology is adaptive.)
    • Stars are at many different stages of their development. Species encountering each other might have started developing intelligence, technology, etc. millions or billions of years apart. This means that randomly-selected alien life is likely to be either pond scum or Cthulhu in comparison to us, wielding no technology or being post-singularity.
    • Ameliorating the above, if there is one post-singularity civilization (such as may also have built the wormhole network), they may actively be suppressing the development of rivals.
    • Encountering pond scum or non-technological sophonts is not really a problem, and finding technological peers or moderate superiors is also fine (and in fact all these are desirable dramatically). So we really need to concern ourselves mainly with vastly superior aliens as a problem.
    • Vastly superior aliens (VSAs) diminish player agency; if, conversely, the players are the VSAs, drama is diminished because the challenges the VSAs face are unfamiliar to the game players and likely highly abstract. So interaction with VSAs needs to be avoided.
    • VSAs may have no interest in lower orders of life, but if they sometimes casually exterminate such because it is on planets made of atoms they need for something, it becomes basically an unpleasant cosmic horror scenario.
    • VSAs may be interested in lower orders of life scientifically or aesthetically, but again their complete superiority means that these interactions with the party's civilization will diminish player agency, which is dramatically undesirable.
    • Therefore, the best VSA is one that is deliberately aloof from lower orders of life, except perhaps to maintain the wormhole network it left behind and suppress civilizations that threaten to become a hostile technological peer. This leaves plenty of room for moderately superior aliens, of course.
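
The stellar-age argument in point 3 can be made roughly quantitative. A minimal sketch, assuming two civilizations each emerge at a uniformly random time within a shared window (the 5-billion-year window and the 10,000-year "peer" threshold are illustrative assumptions, not setting facts):

```python
def peer_probability(window_years: float, peer_gap_years: float) -> float:
    """Probability that |t1 - t2| <= gap when t1 and t2 are drawn
    uniformly and independently from [0, window]."""
    r = peer_gap_years / window_years
    # Complement of the event that the two emergence times differ by
    # more than the gap (the two corner triangles of the unit square).
    return 1.0 - (1.0 - r) ** 2

# Illustrative numbers only: emergence spread over 5 billion years,
# "peers" defined as emerging within 10,000 years of each other.
p = peer_probability(5e9, 1e4)
print(f"chance two random civilizations are peers: {p:.1e}")  # about 4.0e-06
```

Even with a generous peer threshold, the chance is a few in a million, which supports treating peer encounters as rare exceptions and vast disparity as the default.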

CS/Conceptual Notes (last edited 2023-08-22 17:17:58 by Bryce)