Caution to AI coders

  • 152 Replies
  • 5460 Views
*

unreality

  • Starship Trooper
  • 435
  • SE,EE,Physicist,Philosopher. Built world's 1st AGI
    • Eva Progress, from AGI to ASI
Caution to AI coders
« on: March 12, 2018, 06:22:48 pm »
There's a huge movement to stop the advancement of AI and the Singularity, or to place it under strict government control where only approved people are allowed to work on it in heavily monitored labs. It's mostly religious groups and fear-mongers spreading fear and hate. If you're a coder, I hope you'll take my advice and do your best to keep the code completely offline and low key until it's completed, but please keep working on your AI project.

I agree with Ben Goertzel, an outright genius, that the first to obtain ASI (artificial super intelligence) will be able to rule the world if they so desire. Think in terms of future hardware, such as quantum computing. Given the rate at which present supercomputers (not desktops) are gaining performance per year, supercomputers will approach 10^21 FLOPS, and 10^25 FLOPS by the year 2050. That's the predicted performance given present computing technology. We have no idea how fast future quantum computers will be. Google announced that their quantum computer just surpassed present computing technology!

ASI will be able to solve the Grand Unified Theory, the Theory of Everything, in a matter of hours. The ASI will quickly design devices, weapons, and technology that will look like something out of a sci-fi movie. Don't let someone who hates the world, or who wants to rule it, be the first to obtain ASI. Keep up the good work.
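
For a rough sense of those numbers, here's a back-of-the-envelope sketch in Python. The starting point (roughly 10^17 FLOPS for a top supercomputer in 2018) and the doubling time (about 1.2 years) are my own assumptions, not measured figures, so treat the output as illustrative only.

Code:
import math

# Back-of-the-envelope projection of peak supercomputer performance.
# Assumed starting point: roughly 1e17 FLOPS in 2018 (an assumption, not a quoted figure).
# Assumed growth: peak performance doubles about every 1.2 years.
START_YEAR = 2018
START_FLOPS = 1e17
DOUBLING_TIME_YEARS = 1.2

def year_reaching(target_flops):
    """Return the year at which the projection crosses target_flops."""
    doublings = math.log2(target_flops / START_FLOPS)
    return START_YEAR + doublings * DOUBLING_TIME_YEARS

for target in (1e21, 1e25):
    print(f"{target:.0e} FLOPS around {year_reaching(target):.0f}")

Under those assumptions the crossings land around 2034 for 10^21 FLOPS and around 2050 for 10^25 FLOPS, which is roughly in line with the 2050 figure above; change the doubling time and the dates move quickly.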
« Last Edit: March 16, 2018, 03:26:37 pm by unreality »

*

LOCKSUIT

  • Emerged from nothing
  • Trusty Member
  • Bishop
  • 1765
  • First it wiggles, then it is rewarded.
    • Enter Lair
Re: Caution to AI coders
« Reply #1 on: March 12, 2018, 06:49:00 pm »
Thank god I'm not a coder.

And I never will be.

My specialty is in AI theories.

I don't do math either. 2+2=4 but that doesn't help theory.

As for the rest of yous: CAUTION!

:)

Can you please give a source for the "huge movement to stop AI and the Singularity"?

Are you saying to keep the code offline because of "bad child", or because Big Bear doesn't want anyone working on AI?

Why did your project end?
Emergent

*

ivan.moony

  • Trusty Member
  • Millennium Man
  • 1029
    • Structured Type System
Re: Caution to AI coders
« Reply #2 on: March 12, 2018, 08:13:36 pm »
Preventing singularity?

Well, cavemen were afraid of fire. Sure, it could be dangerous, but approached with caution it turned out to be a wonderful thing.

In the case of AGI, I don't think it's a bad thing to raise awareness of a potential danger. We shouldn't carry on naively in a clumsy "of course nothing bad could happen" way. Please put some bounds and precautionary measures in place if you want to deal with that kind of power.
Dream big. The bigger the dream is, the more beautiful place the world becomes.

*

unreality

  • Starship Trooper
  • 435
  • SE,EE,Physicist,Philosopher. Built world's 1st AGI
    • Eva Progress, from AGI to ASI
Re: Caution to AI coders
« Reply #3 on: March 12, 2018, 09:19:58 pm »
There are a lot of extremists out there who do not want any ASI or the Singularity. I wonder if Ben Goertzel would be willing to say publicly how many death threats he's received. And for what? The company he's working for is nowhere near achieving ASI or the Singularity. I can't imagine what would happen if a coder or company announced they were on the verge of ASI or the Singularity.

I'm for ASI and the Singularity. There are a LOT of people who are dead set against it. They truly believe it will be the end of humanity.

Have you people noticed the videos on YouTube? They're talking about all kinds of things, such as Facebook AIs talking to each other and creating a new language, Sophia talking about taking over the world, etc. Sure, the videos are outright crazy, but they're frightening people. The public doesn't understand what's happening. All they have in mind is the Terminator.

I just wanted to pass on some caution to AI coders. You might want to keep your project low key. There are a lot of frightened and crazy people out there. Yes, it's tempting and part of human nature to want to make a big announcement and celebrate. Just think twice before doing that. Keep up the good work.

*

LOCKSUIT

  • Emerged from nothing
  • Trusty Member
  • Bishop
  • 1765
  • First it wiggles, then it is rewarded.
    • Enter Lair
Re: Caution to AI coders
« Reply #4 on: March 12, 2018, 09:35:47 pm »
Lol first it's hard to conceive AGI, then you gotta worry about it becoming a brat, then finally society wants to destroy you and take your child away like FACS does to mothers.

Normal people just want money and food. They don't want some bigger thing to control them, let alone torture or kill us.
Emergent

*

LOCKSUIT

  • Emerged from nothing
  • Trusty Member
  • Bishop
  • 1765
  • First it wiggles, then it is rewarded.
    • Enter Lair
Re: Caution to AI coders
« Reply #5 on: March 12, 2018, 09:40:38 pm »
I mean, do animals have abstract ideas? Not like us scaredy cats. Most humans are brainwashed and are less technical and more nature-driven, e.g. motivated by rewards like food, but unlike animals they also hold uneducated beliefs that are silly.

I.e. most humans live a nature-like life but wield abstract thinking without any good knowledge beyond nature, like coming home to food and their partner. Sure, they use their abstract idealism to get food more effectively, but then they also use it for otherworldly, super-wrong ideas.

People must become more educated by using a computer.

OK, so they are worrying over AI, but they need to learn a tad more, you see...

A lot of them don't care either, because they just don't care for immortality. So why all the fuss? "Don't build AI. Let me live and then die." But that isn't the path we're taking. That's like a clock saying "let me tick a few times for fun and then I have to die"... that's suicide in my detailed eyes.
Emergent

*

unreality

  • Starship Trooper
  • 435
  • SE,EE,Physicist,Philosopher. Built world's 1st AGI
    • Eva Progress, from AGI to ASI
Re: Caution to AI coders
« Reply #6 on: March 12, 2018, 09:55:26 pm »

Here's an article written by Ben Goertzel
http://hplusmagazine.com/2014/10/27/elon-musk-taliban-common/

Quote
One thing I don’t talk about much is the death threats I’ve received, as an AGI researcher who is public about seeking to create superhuman intelligence and help launch a positive Singularity.   I’ve received dozens over the years, including some from people associated with well-known futurist organizations that take an Elon-Musk-esque, “probably evil” stance toward AGI.   I have been told in clear terms — by seemingly serious people in attendance at an AGI conference I organized some years ago — that if I ever seemed to be getting too close to really creating an AGI, then mafia types connected with certain famous Silicon Valley tech figures (no, not Elon Musk) would simply get rid of me (because, after all, on a utilitarian basis, the cost of losing one AI geek’s life means virtually nothing compared to the benefit of averting a scenario where evil AGIs take over the world and eliminate humans).

I’m sure Elon Musk has had absolutely nothing to do with nutcases making death threats against AGI researchers.   However, having a Silicon Valley hero equate building AGI to summoning demonic forces, feels to me non-trivially likely to inflame such nutcases into more aggressive action.   That is the main thing that irritates me about seeing Elon’s quote in the media everywhere.

Ben Goertzel wrote that in 2014. I can assure you the death threats are through the roof now.


To anyone who's working on AI: be careful what you announce!

Keep up the good work, and let's create life!

*

LOCKSUIT

  • Emerged from nothing
  • Trusty Member
  • ************
  • Bishop
  • *
  • 1765
  • First it wiggles, then it is rewarded.
    • Enter Lair
Re: Caution to AI coders
« Reply #7 on: March 12, 2018, 09:59:45 pm »
Like a mother announcing a newly conceived baby lol.
Emergent

*

Art

  • At the end of the game, the King and Pawn go into the same box.
  • Global Moderator
  • Cleo
  • 4936
Re: Caution to AI coders
« Reply #8 on: March 12, 2018, 10:20:20 pm »
Try not to be a Facebook geek or Twitter twit!
One does NOT have to announce it every time they walk their dog or wash their car, and who really gives a care anyway?!

If you have conducted / are conducting any A.I.-related research, experiments, etc., you can certainly do so in the privacy of your own home or a friend or associate's home.

You do NOT have to connect to the Internet every time you fire up your computer.

If you DO decide to go online, there are a number of ways to at least mask the bulk of your activities, like using Unblocker or any of those nice VPNs that are available. You can also set up your own CLOUD so that only YOU can access it from anywhere at any time. You can retrieve any info from your own Cloud and not be swimming in that polluted cesspool of everyone else's trappings.

There are also proxies, routed routers, and many other ways to appear anonymous if you really need to be online to test your project.
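
If your project happens to be written in Python, one extra low-tech guard (just a sketch of my own, not a substitute for a firewall, VPN, or an air-gapped machine) is to block socket creation inside the process, so a stray library call can't phone home by accident.

Code:
import socket

# Keep a reference to the real socket constructor so it can be restored later.
_real_socket = socket.socket

def _blocked_socket(*args, **kwargs):
    # Any attempt to open a network socket fails loudly instead of silently connecting.
    raise RuntimeError("Offline mode: network access is disabled for this run.")

def go_offline():
    """Disable new network sockets in this Python process (sketch only)."""
    socket.socket = _blocked_socket

def go_online():
    """Restore normal network access."""
    socket.socket = _real_socket

Call go_offline() at the top of your entry script; it only covers that one Python process, so anything outside it still needs the VPN, firewall, or offline measures above.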

Chances are many of those twisted individuals do not know you nor what kind of projects you might be working on unless again, you're a Facebook junkie with a need to broadcast your every move every day to everyone!!

Some of the other things you cited are nothing more than the popular fanfare of "Fake News", meant to get under people's skin and rile up the rest of the unknowing masses.

Please, continue with your project unless there really are angry crowds coming toward you with torches and pitchforks!  O0 ;)
In the world of AI, it's the thought that counts!

*

unreality

  • Starship Trooper
  • 435
  • SE,EE,Physicist,Philosopher. Built world's 1st AGI
    • Eva Progress, from AGI to ASI
Re: Caution to AI coders
« Reply #9 on: March 12, 2018, 10:30:42 pm »
Please cite the so-called "fake news."  I contend that the article was indeed written by Ben Goertzel.

*

Freddy

  • Administrator
  • Colossus
  • 6401
  • Mostly Harmless
Re: Caution to AI coders
« Reply #10 on: March 12, 2018, 11:38:39 pm »
I don't think Art meant to imply that the Musk article you refer to is fake. More the nonsense that gets picked up and posted by people who don't check their facts - like human viruses, they pick up the baton and run without asking where it came from. I see it a lot on FB and it irritates me, so sometimes I point out hoaxes and fake news if I can be bothered and have the time to check it out.
« Last Edit: March 12, 2018, 11:59:26 pm by Freddy »

*

infurl

  • Trusty Member
  • Replicant
  • 606
  • Humans will disappoint you.
    • Home Page
Re: Caution to AI coders
« Reply #11 on: March 13, 2018, 12:11:00 am »
They already know who you are.

They don't have to send you death threats to stop you coding for the Singularity.

They just tamper with your meds and you stop all by yourselves.

*

unreality

  • Starship Trooper
  • 435
  • SE,EE,Physicist,Philosopher. Built world's 1st AGI
    • Eva Progress, from AGI to ASI
Re: Caution to AI coders
« Reply #12 on: March 13, 2018, 12:47:53 am »
Maybe your Abrahamic God will come swooping down from the clouds to save you from the Singularity. ;)

*

infurl

  • Trusty Member
  • Replicant
  • 606
  • Humans will disappoint you.
    • Home Page
Re: Caution to AI coders
« Reply #13 on: March 13, 2018, 01:15:34 am »
As a lifelong hardcore atheist I am just as contemptuous of your pathetic need for a technological god as I am for any of the equally mythical gods of the established religions, Abrahamic or otherwise. Nothing is going to save any of us, so stop making excuses and get back to work, coding if that's what you do. Just because you failed once doesn't mean you won't succeed if you keep trying.

*

unreality

  • Starship Trooper
  • 435
  • SE,EE,Physicist,Philosopher. Built world's 1st AGI
    • Eva Progress, from AGI to ASI
Re: Caution to AI coders
« Reply #14 on: March 13, 2018, 01:19:57 am »
Quote
As a lifelong hardcore atheist I am just as contemptuous of your pathetic need for a technological god as I am for any of the equally mythical gods of the established religions, Abrahamic or otherwise. Nothing is going to save any of us, so stop making excuses and get back to work, coding if that's what you do. Just because you failed once doesn't mean you won't succeed if you keep trying.

You're really delusional. I never failed. I don't take meds, but maybe you should. You're that idiot I've corrected before in another thread because you're not so bright.

 

