  • Thank you for taking the time to respond. By “siphoning money” I mean taking money without giving actual value in return. The NFT market was a clear example of this: get some hype going, sell the promise of great gains on your investment, and once the ball gets rolling, make sure you’re out before people realise it’s actually worth nothing. In the end, some smart and cunning people sucked a lot of money out of often poor and misinformed small investors.

    I think I have an inherent idea of value: the value something has in a human life and the amount of effort needed to produce it. This has become very detached from economic value, where you can have speculation, pumped-up valuations and all that other crap. I think that’s what frustrates me about the current financial climate: I just want to be able to pay the people who helped produce the product I buy fairly, with respect to how much time and work they put in. Currently, however, so much money is being transferred to people “just for having money”. The idea that money in and of itself can make more money is such a horrible perversion of the original idea of trade…


  • Your last paragraph is not how money should work at all. Money should represent value that ideally doesn’t change, so that the money I receive for selling a can is worth a can, not a Lambo and not a grain of sand. What you’re describing is closer to speculation and pyramid schemes (NFTs, for example).

    Either try to explain to me how BTC could be an ideal currency that fixes the problems of existing currencies, or try to explain to me how it’s really cool as an investment vehicle for siphoning money from others, but don’t try to do both at the same time.


  • I think the issue is not whether it’s sentient or not; it’s how much agency you give it to control things.

    Even before the AI craze this was an issue. Imagine you were to build an automatic turret that kills living beings on sight: you would have to make sure you added a kill switch, or you yourself couldn’t turn it off anymore without getting shot (a minimal sketch of that pattern follows below).

    The scary part is that the more complex and adaptive these systems become, the more difficult it can be to stop them once they are in autonomous mode. I think large language models are just another step in that complexity.

    An atomic bomb doesn’t pass a Turing test, but it’s a fucking scary thing nonetheless.
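
    To make the kill-switch point concrete, here is a minimal, purely illustrative sketch of the pattern: the autonomous loop only acts while an out-of-band stop signal is unset, so the operator keeps a channel of control that doesn’t depend on the system’s own cooperation. The `detect_target` and `engage` names are hypothetical placeholders, not any real API.

    ```python
    # Illustrative only: a minimal kill-switch pattern for an autonomous loop.
    # detect_target() and engage() are hypothetical stand-ins for sensor and
    # actuator calls; no real hardware is involved here.
    import threading
    import time

    kill_switch = threading.Event()  # out-of-band stop signal the operator controls

    def detect_target():
        return None  # placeholder: no real sensor read

    def engage(target):
        pass  # placeholder: no real actuator call

    def autonomous_loop():
        # The loop checks the kill switch before every action, so halting it
        # never depends on the system "deciding" to stop.
        while not kill_switch.is_set():
            target = detect_target()
            if target is not None:
                engage(target)
            time.sleep(0.1)

    t = threading.Thread(target=autonomous_loop, daemon=True)
    t.start()
    kill_switch.set()  # the operator can always halt the system from outside
    t.join()
    ```

    The design point is that the stop signal lives outside the control loop’s own decision-making; the more complex and adaptive the system, the more important that separation becomes.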