Bill Gates is worried about artificial intelligence too

Discussion in 'The Thunderdome' started by VolDad, Jan 29, 2015.

  1. NYY

    NYY Super Moderator

    Got busted shagging a goat. He will be in prison for a few years for animal cruelty. Could be out in 6 months on good behavior... But I doubt it.
     
  2. justingroves

    justingroves supermod

    Hahaha
     
  3. IP

    IP Super Moderator

    Millions of years to produce the first. They're produced at about the rate of 500,000 a day now. I think that strengthens my argument.
     
  4. fl0at_

    fl0at_ Humorless, asinine, joyless pr*ck

    Still just the one. Same mean overall design.

    Let's run a test. If I give you access to all of the world's knowledge and an unlimited supply of raw materials, could you put yourself on the moon before you die?

    If not, what do you think would be the fundamental thing holding you back?
     
  5. IP

    IP Super Moderator

Wait, do I have to be alive when I reach the Moon, and do I have to get back? Because with unlimited raw materials and all the world's knowledge, I think I might stand a chance. It will take a while with just me for the labor, though.

    I get that your point is precisely the labor, but if I had the ability to freely attempt to manipulate humans into arranging the labor, I'd have a shot.

Besides, I said earlier it isn't the first AI that will be the problem. It will be AIs designed by AIs that will be. Imagine if I were allowed to custom build and tweak copies of my own brain. Get rid of the part that spends too much time posting on forums, for instance. That version would be far more efficient and productive. And that one could then spend more time tweaking yet another copy, improving it by perhaps increasing its ability to be sensitive towards others' religious beliefs. And so on. And that could be happening far more quickly than a human generation between each step.
     
    Last edited: Feb 2, 2015
  6. fl0at_

    fl0at_ Humorless, asinine, joyless pr*ck

    Labor isn't the issue. Information utilization is a bigger issue. It would be easier to just replicate past successes, but that would mean using outdated technology. But upgrading older programs is problematic. Which leaves you with a lot of indecision.

    If we take this as a purely AI driven process, the problem becomes working toward a singular purpose. What is the incentive for a collection of free thinking androids to work toward putting X on the moon? Let's say that, for whatever reason, is your goal. But you realize you can't process all the available information and construct everything necessary, so you clone yourself. What stops your clone from deciding that Jupiter is a better destination?

    You can't write out "free will," because you've shot your argument in the foot--that AI will design AI. Which means, at some point, you will become obsolete. Your clone might tweak itself more efficiently, and then clone itself. Now you don't possess that strand of code, or consciousness if you will, which means you are now useless. See the loop? See the problem?

    So now you are just back to a single conscious AI that must, by necessity of accomplishing its single goal, build machines that only take direction.

    Which is where we are today.
     
  7. IP

    IP Super Moderator

    I disagree with many of your assumptions, the main one being that a series of clones (or even generations) could not collaborate or form consensuses.
     
  8. fl0at_

    fl0at_ Humorless, asinine, joyless pr*ck

    What is the incentive? They are pieces of code, nothing more.

Humans form consensus because we cannot immediately replicate ourselves. If we could, what's the point? I can create 10 tentative agreements almost instantly.

A piece of code must, first, have a fundamental directive. Second, it must exist as a loop. Both of those things are prohibitive to consciousness.

    I mean, the entire discussion is like an electronic Manchurian candidate. It is ludicrous.
     
  9. IP

    IP Super Moderator

    We'll just have to disagree.
     
  10. warhammer

    warhammer Chieftain

    If it's one of you fellows, just remember to click "Cancel" when it asks, "Are you sure you want me to take over the world?"
     
  11. rbroyles

    rbroyles Chieftain

    No incentive to go to the moon, but since my required unlimited raw materials would include gold, platinum and diamonds, I would put myself in some very interesting places.
     
  12. bigpapavol

    bigpapavol Chieftain

Once unlimited, they're only worth commercial value.
     
  13. rbroyles

    rbroyles Chieftain

Well yeah, but I would be the only one who has an unlimited supply. Just like the big diamond suppliers, I would not flood the market. After all, once you have a couple billion in assets, there is no need to be greedy.
     
  14. reVOLt

    reVOLt Contributor

  15. NorrisAlan

    NorrisAlan Founder of the Mike Honcho Fan Club

I would like to retract this statement if I haven't already.

    This is some seriously scary stuff.
     
  16. NorrisAlan

    NorrisAlan Founder of the Mike Honcho Fan Club

  17. kmf600

    kmf600 Energy vampire

My wife and I got in the car Monday to go play golf, and her phone, on its own, told us how long it would take to get to the club. We were a little freaked out.
     
  18. JohnnyQuickkick

    JohnnyQuickkick Calcio correspondent

    Maybe we should pump the brakes a bit on our robot overlords
     
  19. NorrisAlan

    NorrisAlan Founder of the Mike Honcho Fan Club

    Better than half the people we got working here.
     
    kmf600 and utvol0427 like this.
  20. JohnnyQuickkick

    JohnnyQuickkick Calcio correspondent

    I empathize
     
