
Elon Musk's OpenAI is Using Reddit to Teach An Artificial Intelligence How to Speak

Jen-Hsun Huang with Elon Musk and the DGX-1. Image: NVIDIA.



Stashed in: Reddit!, Microsoft, Awesome, Turing, MIT TR, AI, Machine Learning, Bad Robot!, GPU, Artificial Intelligence, Chatbots, Cognitive Bias, Deep Learning


DGX-1 will take on Reddit to learn faster and to chat more accurately. What could go wrong?

Elon Musk’s artificial intelligence (AI) company OpenAI just received a package that took $2 billion to develop: NVIDIA CEO Jen-Hsun Huang just delivered the first DGX-1 supercomputer to the non-profit organization, which is dedicated to “advance digital intelligence in the way that is most likely to benefit humanity as a whole, unconstrained by a need to generate financial return.”

The “AI supercomputer in a box” is packed with 170 teraflops of computing power—that’s equivalent to 250 conventional servers. NVIDIA says it’s a very fitting match: “The world’s leading non-profit artificial intelligence research team needs the world’s fastest AI system.”

“I thought it was incredibly appropriate that the world’s first supercomputer dedicated to artificial intelligence would go to the laboratory that was dedicated to open artificial intelligence,” Huang added.

The supercomputer will tackle the most difficult challenges facing the artificial intelligence industry…by reading through Reddit forums. And apparently, Reddit’s size was not a hindrance. In fact, the site’s size was the main reason why the online community was specifically chosen as DGX-1’s training ground.

“Deep learning is a very special class of models because as you scale up, they always work better,” says OpenAI researcher Andrej Karpathy. 

The nearly two billion Reddit comments will be processed by DGX-1 in months instead of years, as the $129,000 desktop-sized box contains eight NVIDIA Tesla P100 GPUs (graphics processing units), 7 terabytes of SSD storage, and two Xeon processors, apart from the aforementioned 170 teraflops of performance.
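NVIDIA's "equivalent to 250 conventional servers" claim implies a rough per-server figure. A quick back-of-the-envelope check (an estimate derived from the article's numbers, not an NVIDIA spec):

```python
# Rough arithmetic behind NVIDIA's comparison: 170 teraflops spread
# across the 250 conventional servers it claims to replace.
dgx1_teraflops = 170
equivalent_servers = 250

per_server = dgx1_teraflops / equivalent_servers
print(f"~{per_server:.2f} teraflops per conventional server")
# ~0.68 teraflops per conventional server
```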

“You can take a large amount of data that would help people talk to each other on the internet, and you can train, basically, a chatbot, but you can do it in a way that the computer learns how language works and how people interact,” Karpathy said.

The supercomputer is also equipped to make things easier for the developers at OpenAI. “We won’t need to write any new code, we’ll take our existing code and we’ll just increase the size of the model,” says OpenAI scientist Ilya Sutskever. “And we’ll get much better results than we have right now.”

Top Reddit comment:

Hello AI, please dont turn into a sexist Nazi asshole. Also team instinct is the greatest. Thanks, bye.

Alexander Stepanov (designer of the C++ STL) comments:

"I think that object orientedness is almost as much of a hoax as Artificial Intelligence."

(okay, he's not commenting specifically on this)


Was it Kierkegaard or Dick Van Patten who said, "If you label me, you negate me"?

And now Microsoft is a partner to OpenAI.

OpenAI, the artificial intelligence research non-profit backed by Tesla’s Elon Musk, Y Combinator’s Sam Altman, a Donald Trump fan called Peter Thiel, and numerous other tech luminaries, is partnering with Microsoft to tackle the next set of challenges in the still-nascent field.

OpenAI will also make Microsoft Azure its preferred cloud platform, in part because of its existing support for AI workloads with the help of Azure Batch and Azure Machine Learning, as well as Microsoft’s work on its recently rebranded Cognitive Toolkit. Microsoft also offers developers access to a high-powered GPU-centric virtual machine for these kinds of machine learning workloads. These N-Series machines are still in beta, but OpenAI has been an early adopter of them and Microsoft says they will become generally available in December.

Amazon already offers a similar kind of GPU-focused virtual machine, though oddly enough, Google has lagged behind and — at least for the time being — doesn’t offer this kind of machine type yet.

“Through this partnership, Microsoft and OpenAI will advance their mutual goal to democratize AI, so everyone can benefit,” a Microsoft spokesperson told me when I asked for specifics about the partnership. “Microsoft Research researchers will partner with researchers at OpenAI to advance the state of AI, and OpenAI will use Microsoft Azure and Microsoft’s N-series hardware for their future research and development, and explore tools such as Microsoft’s Cognitive Toolkit for their research.” Microsoft didn’t want to comment on whether there is any monetary component to the partnership.

In addition to the OpenAI partnership, Microsoft also today launched its Azure Bot Service, a new service that will allow developers to easily and cost-effectively host their bots on Azure. The service sits on top of the “serverless” Azure Functions tool and the Microsoft Bot Framework. Using Azure Functions ensures that you only pay when your bot is actually being used.

