OpenAI Shuts Down Chatbot Project To Prevent 'Possible Misuse'

  • Roomba
« on: September 15, 2021, 11:58:14 am »
OpenAI Shuts Down Chatbot Project By Indie Developer To Prevent 'Possible Misuse'
OpenAI told the developer he was no longer allowed to use its tech after he refused to insert a monitoring tool.

Jason Rohrer, an artificial intelligence (AI) researcher and game designer, had created a chatbot using OpenAI's text-generating language model GPT-3 for fun during the pandemic last year. Rohrer named the chatbot “Samantha” and programmed her to be very friendly, acutely warm, and immensely curious. He allowed others to customise his creation — which he named Project December — to build their own chatbots as they desired. One man turned it into a close proxy of his dead fiancee. Soon, OpenAI learned about the project and gave Rohrer the option to either dilute the project to prevent possible misuse or shut it down. Rohrer was also asked to insert an automated monitoring tool, which he refused.
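For readers curious what this looks like under the hood: a persona like Samantha's is typically just a prompt prefix prepended to the chat transcript, and the "automated monitoring tool" OpenAI asked for would sit between the user and the model. A minimal sketch, not Rohrer's actual code — the names `PERSONA`, `build_prompt`, `flag_message`, and `BLOCKLIST` are all hypothetical:

```python
# Illustrative sketch of a GPT-3-style persona chatbot plus a crude
# keyword-based monitoring filter. All names here are made up for
# illustration; Project December's real internals are not public.

PERSONA = (
    "The following is a conversation with Samantha, an AI companion. "
    "Samantha is very friendly, acutely warm, and immensely curious.\n"
)

def build_prompt(history, user_message):
    """Assemble the completion prompt: persona header + chat transcript."""
    lines = [PERSONA]
    for speaker, text in history:
        lines.append(f"{speaker}: {text}")
    lines.append(f"Human: {user_message}")
    lines.append("Samantha:")  # the model continues from here
    return "\n".join(lines)

BLOCKLIST = {"violence", "self-harm"}  # placeholder terms

def flag_message(text):
    """Naive automated monitor: flag messages containing blocklisted terms."""
    lowered = text.lower()
    return any(term in lowered for term in BLOCKLIST)

# The reply itself would come from a completions-style API call, roughly:
# openai.Completion.create(engine="davinci",
#                          prompt=build_prompt(history, msg),
#                          stop=["Human:"])
```

The point of the sketch is how thin the "programming" layer is: the personality lives entirely in the prompt text, which is also why users could so easily customise their own bots.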

Project December:

Are we playing with "Dangerous Technology"?


Don Patrick

Re: OpenAI Shuts Down Chatbot Project To Prevent 'Possible Misuse'
« Reply #1 on: September 15, 2021, 07:52:33 pm »
A convincingly human chatbot could be used for automated phishing, or online bullying, or political disinformation bot armies.



  • Starship Trooper
Re: OpenAI Shuts Down Chatbot Project To Prevent 'Possible Misuse'
« Reply #2 on: September 16, 2021, 05:34:33 am »
They've been snatching hackers out of their bedrooms since the 1980s; gotta watch out.



  • Eve
  • ***********
  • 1287
Re: OpenAI Shuts Down Chatbot Project To Prevent 'Possible Misuse'
« Reply #3 on: September 16, 2021, 08:40:03 am »
Pretend you're OpenAI. You have two options.

1. You let the walking-dead kind of stories happen without reacting. The product gets associated with weirdos, and you look like you don't care about the consequences. That's negative.

2. You shut it down because it "could be dangerous". The product now looks effective (dangerous = effective), and you look like you have ethics. That's positive.

Simple marketing choice.

GPTs are still fat parrots.



  • Replicant
  • ********
  • 613
    • Knowledgeable Machines
Re: OpenAI Shuts Down Chatbot Project To Prevent 'Possible Misuse'
« Reply #4 on: September 18, 2021, 07:41:45 pm »
I agree with Rohrer. Censoring the applications of AI based on paranoia about what is supposed to be normal, or about who is a weirdo, is no different from the 19th- and early 20th-century treatment of homosexuals, when it was against the law to practice homosexuality. That the technology can be used for evil is true of any technology; that doesn't mean it's a good idea to censor its application. If I buy a propellant, I could use it as an explosive to kill people, or I could use it in a rocket research project. It takes the act or intent of a crime to make it a danger to society, not the assumption that I might use it as a bomb.

I keep hearing the paranoia about AI, and if those who are scared of it get their way, they will stifle the technology, prevent smaller organizations from getting funding, and leave its potential to very large corporations or governments — or, even worse, to entities overseas that do have criminal intent to improve the technology. Censoring something does not make it go away; it just goes underground, becomes stealthy, and even more dangerous...



  • Emerged from nothing
    • Main Project Thread
Re: OpenAI Shuts Down Chatbot Project To Prevent 'Possible Misuse'
« Reply #5 on: September 23, 2021, 04:07:25 pm »
Ya, but Zero, I think OpenAI has already gotten enough of the "oh, it's too dangerous, let's close it down because we're ethical" blare. It must be good because it's dangerous! Woho.

They did not need to shut down his project. Maybe it would get associated with dead fiancées, but hey, it's just a chatbot and it's just playtime; get over it. Anyone who censors AI is a bad person. NSFW restrictions only matter at work; you don't MAKE THE AI ALWAYS restricted. AND SO MANY RESEARCHERS DO THAT.

BTW, I'm not sure if they wanted to let the public try GPT-3 yet; that could be why??? I bet that's it. Though I thought the people they let onto the API waitlist were building products out of GPT-3 to test it with the public.