• 2 Posts
  • 20 Comments
Joined 1 year ago
Cake day: August 1st, 2023




  • Even referring to a computed outcome as having been the result of a ‘goal’ at all is more sci-fi than reality for the foreseeable future. There are no systems that can demonstrate or are even theoretically capable of any form of ‘intent’ whatsoever. Active deception of humans would require extraordinarily well developed intent and a functional ‘theory of mind’, and we’re about as close to that as we are to an inertial drive.

    The entire discussion of machine intelligence rivaling humans’ requires assumptions of technological progress that aren’t even on the map. It’s all sci-fi. Some look back over the past century and assume we’ll continue on some unlimited exponential technological trajectory, but nothing works that way; we just like to think we’re the exception, because if we’re not, we have to deal with the fact that there’s an expiration date on society.

    It’s fun and all, but this is equivalent to discussing how we might interact with alien intelligence. There are no foundations; it’s all just speculation, strongly influenced by our anthropocentric desires.


  • Fortunately we’re nowhere near the point where a machine intelligence could possess anything resembling a self-determined ‘goal’ at all.

    Also fortunately, the hardware required to run even LLMs is insanely power-hungry, has zero capacity to power or maintain itself, and has very little prospect of doing so in the future without human supply chains. There’s pretty much zero chance we’ll develop strong general AI on silicon, and even if we could, it would take megawatts to keep it running. So if it misbehaves we can basically just walk away and let it die.

    It’s fun to imagine ways it could deceive us long enough to gain enough physical capacity to be self-sufficient, or somehow enslave or manipulate humans to do its bidding, but in reality our greatest protection from machine intelligence is simple thermodynamics and the fact that the human brain, while limited, is insanely efficient and can run for days on stuff that literally grows on trees.
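
    To put rough numbers on that (a back-of-envelope sketch; the wattage figures below are assumed ballpark values, not measurements):

    ```python
    # Back-of-envelope comparison using assumed, illustrative figures.
    BRAIN_WATTS = 20         # assumed: the commonly cited ~20 W draw of a human brain
    CLUSTER_WATTS = 100_000  # assumed: ~100 kW for a modest LLM inference cluster

    print(f"Power ratio: ~{CLUSTER_WATTS / BRAIN_WATTS:,.0f}x")

    HOURS = 24
    print(f"Brain:   {BRAIN_WATTS * HOURS / 1000:.1f} kWh/day")     # ~0.5 kWh, a snack's worth
    print(f"Cluster: {CLUSTER_WATTS * HOURS / 1000:,.0f} kWh/day")  # grid-scale, around the clock
    ```

    Even if the cluster figure is off by an order of magnitude in either direction, the gap stays enormous.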




  • Irked, but not disapproving… interesting distinction.

    Sure, I’m insecure, because cash is money and new, indirect ways of consuming one’s labor are scary, even more so when it’s a relatively new and unestablished route.

    You must hate libraries ;)

    My dude… if you’re not writing for the love of writing and you’re this worried about getting paid, you won’t ever have to worry about people consuming your work without paying you.




  • “if all they care about is the concepts.”

    That’s what I care about; not everyone does. I’m saying the general snobbery about how one should enjoy sci-fi could turn kids off to the genre… but that would only matter if you said it to a kid who admired you, so it’s probably not going to be an issue. Carry on with the snobbery, I guess.





  • The fact that I don’t accept your explanation as a valid reason to judge other people’s media enjoyment doesn’t mean I wasn’t interested; that’s why I asked.

    But I also don’t think you really read my post, because I said explicitly that I “lost interest in character development arcs and relationships and just want to know about the cool high-concept sci-fi ideas and storylines.” Your ‘explanation’ basically just says that’s not okay, so it’s not even an explanation, just a judgement of my preference.


  • You know, my favorite part of the Dune books was the glossary. I’ve always most enjoyed what I’d describe as ‘non-narrative’ sci-fi. I’ve read libraries of sci-fi, and what I really like about it is the big technological and philosophical ideas: the conceptual meat-and-potatoes without all the relationships and personalities.

    I get to enjoy sci-fi literature however I want, and tbh it ‘irks’ me a bit to encounter literary snobbery over how I choose to enjoy it. If you do that to a younger person, it may turn them off to the genre just because they don’t enjoy the same aspects you do. So for their sake, I’d advise keeping what ‘irks’ you about how other people enjoy sci-fi to yourself in the future.



  • Yeah, this list kind of assumes humans/machines are inherently adversarial and that machines are always a threat.

    To be fair, we’d have to have an opposite ‘Bio Threat Level’ scale for machines to evaluate threats from biological life. That would actually be a lot of fun. Maybe the highest level would just be a ‘Luddite virus’ that makes the infected destroy machines.

    And of course I’m kind of ignoring the idea that the distinction between bio and machine life is a bit arbitrary to begin with, so there’s no real reason we can’t just get along.