• Ray Kurzweil thinks he’ll be able to upload his brain to a computer in 10 years and has thought so since the 1990s.

    Kurzweil fervently wishes he’ll be able to do this; existential angst drives many people, educated or not, to all sorts of religions. At least Kurzweil is making educated guesses based on technological progress - wrong guesses, but still within the realm of reasonable.

    There’s no mysticism to the singularity. There’s nothing preventing what he hopes for except engineering sophistication. We know most of the what, and maybe even a good chunk of the how, and we’re making progress. Nothing in the idea of brain uploading depends on an ineffable spirit, or anything we can’t already prove.

    If we don’t destroy ourselves or the planet, there’s no reason we won’t get there eventually. Just not soon enough for Ray or his loved ones, and probably not in time for anyone currently alive. It’s quite possible we’ll never achieve it at all, simply because we burn up the planet first and run out of resources to continue frivolous research like immortality.

    • YourNetworkIsHaunted@awful.systems · 16 minutes ago

      I wouldn’t say there’s no mysticism in the singularity, at least not in the sense you’re implying here. While it uses secular and scientific aesthetics rather than being overtly religious, the roadmap it presents relies on specific assumptions about the nature of consciousness, intelligence, and identity that may sound plausible but aren’t really any more rational than an immortal soul that gets judged at the end of days.

      And it doesn’t help that whenever the plausibility of any of this is questioned, there’s a tendency to assume a sufficiently powerful AI can solve the problem and leave it at that. It’s no less a deus ex machina if you call it an AI instead of a God; you’ve just shifted the emphasis from the Deus to the Machina.

    • Flying Squid@lemmy.world · 1 day ago

      I realize it’s not mysticism. But it is still believing something silly, considering he’s been saying it’s just around the corner for decades now.

      Sure, maybe one day it will happen. But it’s like space colonies or everyone using flying cars. It’s always going to happen in the near future.

      • Sure; I’m not saying you’re wrong. Ray is unrealistically optimistic, and his predictions depend heavily on several iffy assumptions: that we’ll create AGI; that it’ll be able to exponentially improve itself; that it’ll be benevolent; and that it’ll see value in helping us obtain immortality and decide that this is good for us.

        I just don’t think it’s fair to lump him in with SovCits and homeopaths (or whatever Linus Pauling is). He’s a different kind of “wrong”; not crazy or deluded, just optimistic.