Satya Nadella (Microsoft): “Expect us to incorporate AI into every layer of the stack”


On Tuesday evening, during Microsoft’s fiscal second-quarter earnings conference call with Wall Street analysts, CEO Satya Nadella offered perhaps his broadest view yet of what the company’s investment in OpenAI, the San Francisco startup behind the wildly popular ChatGPT, means for Microsoft.

OpenAI, he said, is part of the next wave of computing. “The next big wave will be AI, and we strongly believe that much of the business value is created by the ability to catch those waves, have those waves impact every part of our technology stack, and also create new solutions and new opportunities,” said Nadella.

To that end, Microsoft “fully expects to incorporate AI into every layer of the stack, whether in productivity or in our consumer services, and so we’re excited about that.”

GitHub Copilot, a first example of the power of GPT-3

About the partnership with OpenAI, Nadella remarked, “There’s an investment part, and there’s a business partnership, but fundamentally it’s going to be something that’s going to drive, I think, innovation and competitive differentiation in each of Microsoft’s solutions by being a leader in AI.”

Among the applications already built with OpenAI’s technology is GitHub Copilot, in which neural networks help programmers complete coding tasks. “GitHub Copilot is the most at-scale LLM-based product on the market today,” Nadella said, using the industry’s shorthand for large language models.

OpenAI’s GPT-3, which underlies ChatGPT, is one of the largest large language models in the world, as measured by its number of parameters, or neural “weights.”

Support for ChatGPT is coming

Microsoft, Nadella said, “will soon add support for ChatGPT, allowing customers to use it in their own applications for the first time.”

Azure recently made available the Azure OpenAI Service, which gives developers access to OpenAI’s models, and “more than 200 customers, from KPMG to Al Jazeera, are using it,” he noted.
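For context, here is a minimal sketch of what calling the Azure OpenAI Service looked like at the time of this article, using the pre-1.0 “openai” Python package. The endpoint, API version, key variable, and deployment name below are placeholders, not values from the article.

```python
# Minimal sketch: a completion request against the Azure OpenAI Service
# with the pre-1.0 "openai" Python package. All names are placeholders.
import os
import openai

openai.api_type = "azure"
openai.api_base = "https://my-resource.openai.azure.com/"  # hypothetical Azure resource
openai.api_version = "2022-12-01"                          # an early Azure OpenAI API version
openai.api_key = os.environ["AZURE_OPENAI_KEY"]            # never hard-code the key

response = openai.Completion.create(
    engine="my-gpt3-deployment",  # the GPT-3 model deployment created in Azure
    prompt="Explain in one sentence what a data warehouse is.",
    max_tokens=60,
)
print(response.choices[0].text.strip())
```

The difference from calling OpenAI directly is that Azure routes requests to a named model deployment inside the customer’s own Azure resource, which is what lets enterprises such as those Nadella cites run the models under their existing Azure accounts.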

Nadella hinted that the company would integrate the technology further into Microsoft products, including Synapse. Azure Synapse is Microsoft’s catch-all analytics offering for building both a “data warehouse” and a “data lake”, two common ways of bringing data together for processing and then running queries. Enough, perhaps, to overshadow traditional structured databases.

The link between unstructured databases and AI

“You can see us with data services beyond the Azure OpenAI Service,” Nadella told analysts. “Think what Synapse plus OpenAI APIs can do.” However, he did not elaborate on the point.
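To make the idea concrete, here is a purely hypothetical sketch of the kind of pairing Nadella alludes to: enriching free-text rows in a Synapse Spark notebook with calls to an Azure OpenAI completion model. Nothing here is an announced Microsoft integration; the endpoint and deployment name are placeholders.

```python
# Hypothetical: summarizing unstructured text in an Azure Synapse Spark
# notebook via an Azure OpenAI deployment. Placeholders throughout.
import os
import openai
from pyspark.sql import SparkSession
from pyspark.sql.functions import udf
from pyspark.sql.types import StringType

def summarize(text: str) -> str:
    # Configure inside the UDF so the settings exist on Spark executors,
    # not only on the driver node.
    openai.api_type = "azure"
    openai.api_base = "https://my-resource.openai.azure.com/"
    openai.api_version = "2022-12-01"
    openai.api_key = os.environ["AZURE_OPENAI_KEY"]
    resp = openai.Completion.create(
        engine="my-gpt3-deployment",
        prompt=f"Summarize this customer comment in five words: {text}",
        max_tokens=20,
    )
    return resp.choices[0].text.strip()

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame(
    [("The dashboard is slow to load, but the numbers are accurate.",)],
    ["comment"],
)
df.withColumn("summary", udf(summarize, StringType())("comment")).show(truncate=False)
```

One API call per row is fine for a sketch; a production job would batch prompts. The point is simply that unstructured text sitting in a data lake becomes queryable once a language model can transform it.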

After telling analysts that customers are tightening their belts on cloud spending, he said Microsoft is continuing to invest in OpenAI and other AI capabilities.

In particular, Azure must invest not only in the part of its computing facilities that develops OpenAI’s models, so-called “training”, but also in the vast infrastructure that answers the millions of queries from the software’s users, known in the trade as “inference”.

“The infrastructure business is changing”

“We are working very, very hard to build both the training supercomputers and now, of course, the inference infrastructure,” he said. “Because once you use AI inside your apps, it goes from being just training-heavy to inference.”

This shift, Nadella said, will push customers further toward Azure. “I don’t think any app startups that happen in the near future will look like app startups from 2019 or 2020,” he said.

“They’ll all have considerations of what the AI inference, the performance, the cost, and the model are going to look like, and that’s where we’re well positioned again.”

Because of the ubiquity of AI, Nadella said, “I think the core of Azure itself is transforming, the infrastructure business is transforming.”

Source: ZDNet.com




