
Microsoft debuts supercomputer for creating human-like artificial intelligence

Microsoft (MSFT) on Tuesday announced that it has built one of the five most powerful supercomputers in the world, designed specifically for OpenAI. The company made the announcement during its Build developers conference, which is being held virtually this year rather than in person in Seattle, its usual venue.

The supercomputer, Microsoft says, will be used to train OpenAI’s own artificial intelligence models. OpenAI was founded in 2015 and received initial funding from the likes of Elon Musk and Sam Altman. Musk has since left the company, while Altman is now its CEO.

The company is working to create artificial general intelligence, or AI capable of outperforming humans at most economically valuable work, according to the group’s charter. The firm says its main concern is ensuring that such technology benefits all of humanity and that its power doesn’t become concentrated in the hands of a few.

The new supercomputer has an incredible 285,000 CPU cores, 10,000 graphics processing units (GPUs), and 400 gigabits per second of network connectivity for each GPU server. While the machine ranks among the five most powerful in the world, Microsoft won’t say exactly where it falls on that list.

Microsoft and OpenAI announced they were teaming up last year. The companies agreed that Microsoft would become a preferred partner for commercializing new AI technologies. The agreement also called for the two to build Azure AI supercomputing technologies, with Microsoft pouring $1 billion into the project.

The new supercomputer is the first major result of those efforts.

Microsoft says it is also providing separate AI capabilities to business customers who don’t need one of the most powerful computers on Earth.

The company says it is making its Microsoft Turing AI models available to business customers. This will give customers access to the same models that Microsoft itself uses to power language understanding in its own Microsoft 365 products.

The Turing models, for example, develop a more complete understanding of language through so-called “self-supervised” learning: the system learns by trying to fill in missing portions of sentences based on the surrounding words, training on billions of pages of text from online sources, including instruction manuals, human resources guidelines, and Wikipedia.
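For a rough sense of that fill-in-the-blank objective, the short Python sketch below uses a publicly available masked language model (BERT, via the Hugging Face transformers library) rather than Microsoft’s Turing models, which aren’t something you can download and run from this article. The example sentence is made up for illustration.

# Illustrative sketch only: this uses an open masked language model (BERT),
# not Microsoft's Turing models, to show the "fill in the blank"
# self-supervised objective described above.
from transformers import pipeline

# A fill-mask pipeline predicts the most likely tokens for a masked-out word.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# During self-supervised pretraining, words are hidden and the model learns to
# recover them from the surrounding context; here we query it the same way.
sentence = "Employees should submit the expense [MASK] by the end of the month."

for prediction in fill_mask(sentence, top_k=3):
    print(f"{prediction['token_str']:>10}  (score: {prediction['score']:.3f})")

Running this prints the model’s top guesses for the hidden word, which is essentially the task such models practice billions of times during training.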

Once trained, according to Microsoft, that kind of model can summarize long speeches, moderate live chats, or generate code by searching GitHub.
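To give a flavor of the summarization use case, here is a similar hedged sketch, again using an openly available model (facebook/bart-large-cnn) from the Hugging Face transformers library rather than the Turing models themselves; the sample “speech” is invented for the example.

# Illustrative sketch: a publicly available summarization model, not
# Microsoft's Turing models, condenses a longer passage into a few sentences.
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

speech = (
    "Thank you all for joining today's all-hands meeting. Over the past quarter "
    "we migrated our customer analytics workloads to the cloud, reduced average "
    "query latency by forty percent, and onboarded three new enterprise clients. "
    "Next quarter we will focus on improving our on-call rotation, expanding the "
    "data platform team, and shipping the redesigned reporting dashboard."
)

# max_length and min_length bound the length (in tokens) of the generated summary.
summary = summarizer(speech, max_length=60, min_length=20, do_sample=False)
print(summary[0]["summary_text"])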

Hopefully, it doesn’t learn well enough to take over tech reporting duties for Yahoo Finance.
