Tech trends 2022: AI will still be talked about


As 2021 has just ended, the time has come for ZDNet's editorial staff to review the technologies that will leave their mark on 2022. After discussing trends in open source, cloud and blockchain, let's now focus on what the world of databases, data management and artificial intelligence (AI) has in store for us this new year.

The many faces of AI

In principle, it is best to approach AI holistically, taking into account both the positive and negative aspects of the technology, from the brilliant to the mundane, and from hardware to software. In recent years, hardware has been a recurring theme in the larger story of AI.

Over the past two years, we have closely followed the growing list of "AI chip" vendors, that is, companies that have set out to develop new hardware architectures from scratch, aimed specifically at AI workloads. All are chasing a piece of a seemingly ever-growing pie: as AI adoption grows, workloads keep growing, and serving them as quickly and economically as possible is an obvious goal.

Nvidia continues to dominate this market, which it entered long before AI workloads started to explode. The company had the insight and the reflexes to capitalize on this situation by building a hardware and software ecosystem. Its 2020 decision to integrate Arm into this ecosystem is still under regulatory review.

A very popular industry

Among the many announcements made at Nvidia's GTC event in November 2021, those that bring something new at the hardware level relate to what we believe characterized AI's priorities in 2021: inference and the edge. Nvidia introduced a number of improvements to the Triton inference server. It also showcased the Nvidia A2 Tensor Core GPU, a low-power, small-footprint accelerator for AI inference at the edge that, according to Nvidia, delivers up to 20 times better inference performance than CPUs.

And what about the newcomers? SambaNova now claims to be "the world's best-funded AI startup", having secured a whopping $676 million in Series D funding and surpassed the $5 billion valuation mark. SambaNova's philosophy is to deliver "AI as a Service", which now includes GPT language models. Above all, 2021 seems to have been the company's launch year.

Xilinx, for its part, claims to achieve dramatic acceleration of neural networks compared to Nvidia's GPUs. Cerebras claims to "absolutely dominate" high-end computing, and has also secured significant funding. Graphcore competes with Nvidia (and Google) in the MLPerf results. Tenstorrent hired legendary chip designer Jim Keller.

Blaize, for its part, has raised $71 million to bring edge AI to industrial applications. Flex Logix raised $55 million in venture capital, bringing its total to $82 million. Last but not least, we have a new horse in the race with NeuReality, ways to mix and match deployments with ONNX and TVM, and the promise of using AI to design AI chips.

Future developments in all areas

According to the Linux Foundation's State of the Edge report, digital healthcare, manufacturing and retail companies are particularly likely to expand their use of edge computing by 2028. It is no wonder, then, that hardware, frameworks and AI applications for edge computing are proliferating as well.

TinyML, which practices the art and science of producing machine learning models frugal enough to run at the edge, is growing rapidly and building an ecosystem. Edge Impulse, a startup that wants to make machine learning at the edge accessible to everyone, just announced $34 million in Series B funding. Edge applications are coming, and AI and its hardware will be an important part of them.
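To make the TinyML idea concrete, here is a minimal, framework-free sketch of post-training integer quantization, the core technique these tools rely on to shrink models for constrained edge devices. The function names and example weights are our own illustration, not any vendor's API:

```python
# Illustrative sketch (not any vendor's API): post-training int8
# quantization, the basic trick TinyML frameworks use to shrink models.

def quantize(weights, num_bits=8):
    """Map float weights to signed integers plus one scale factor."""
    qmax = 2 ** (num_bits - 1) - 1          # 127 for int8
    scale = max(abs(w) for w in weights) / qmax
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the integers."""
    return [v * scale for v in q]

weights = [0.31, -1.27, 0.05, 0.9982]
q, scale = quantize(weights)
restored = dequantize(q, scale)

# Rounding introduces a small error, bounded by half the scale factor.
max_err = max(abs(a - b) for a, b in zip(weights, restored))
```

Storing weights as int8 instead of float32 cuts the model's memory footprint by a factor of four, which is often the difference between fitting on a microcontroller and not fitting at all.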

2021 saw the explosion of what is called MLOps, which aims to bring machine learning to production. In 2022, however, the focus will no longer be on shiny new models, but on more mundane yet practical aspects, such as data quality and data pipeline management, which will allow MLOps to continue to develop.
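As a flavor of what "mundane yet practical" looks like, here is a minimal sketch of the kind of data-quality gate an MLOps pipeline might run before training. The schema, field names and threshold are invented for illustration:

```python
# Minimal data-quality gate, as might run ahead of a training job.
# Schema and threshold are hypothetical, for illustration only.

EXPECTED_FIELDS = {"user_id", "age", "country"}

def check_record(record):
    """Return a list of quality problems found in one record."""
    problems = []
    missing = EXPECTED_FIELDS - record.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    age = record.get("age")
    if age is not None and not (0 <= age <= 120):
        problems.append(f"age out of range: {age}")
    return problems

def validate_batch(records, max_bad_ratio=0.05):
    """Pass or fail a whole batch based on the share of bad records."""
    bad = [r for r in records if check_record(r)]
    ratio = len(bad) / len(records)
    return ratio <= max_bad_ratio, ratio

batch = [
    {"user_id": 1, "age": 34, "country": "FR"},
    {"user_id": 2, "age": 150, "country": "DE"},  # bad: age out of range
    {"user_id": 3, "age": 28, "country": "IT"},
]
ok, ratio = validate_batch(batch)  # one bad record in three: gate fails
```

The point of such a gate is to fail fast and loudly: a model silently trained on malformed data is far more expensive to debug downstream.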

The year of language models?

The other element likely to grow, both in size and number, is large language models (LLMs). Some believe that LLMs can internalize elementary forms of language, be it biology, chemistry or human language, and that unusual applications of LLMs are about to emerge. Others disagree. Either way, LLMs are proliferating.

Besides OpenAI with its GPT-3, DeepMind with its latest LLM RETRO, and Google with its ever-widening range of large language models, Nvidia has partnered with Microsoft on the Megatron model. But that's not all. Recently, EleutherAI, a collective of independent AI researchers, released its GPT-J model, with six billion parameters. And if you are interested in languages other than English, there is now a large model of European languages (English, German, French, Spanish and Italian) created by Aleph Alpha.
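Parameter counts like these translate directly into memory requirements, which is one reason model size matters so much in practice. A quick back-of-envelope calculation (our own arithmetic, using the publicly stated parameter counts for GPT-J and GPT-3):

```python
# Back-of-envelope: memory needed just to hold a model's weights,
# before activations, optimizer state or batching are considered.

def weight_memory_gib(num_params, bytes_per_param=4):
    """Raw weight storage in GiB (4 bytes = float32, 2 = float16)."""
    return num_params * bytes_per_param / 2**30

gpt_j_fp32 = weight_memory_gib(6e9)          # GPT-J, 6 billion params
gpt_j_fp16 = weight_memory_gib(6e9, 2)       # same model in half precision
gpt3_fp16 = weight_memory_gib(175e9, 2)      # GPT-3, 175 billion params
```

Even in half precision, GPT-J's weights alone need more memory than most consumer GPUs offer, and GPT-3 class models require multi-GPU serving, which helps explain why inference efficiency is such a battleground.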

Beyond large language models, DeepMind and Google have hinted at groundbreaking architectures for AI models, with Perceiver and Pathways respectively. Pathways has been criticized for being rather vague, but it can be speculated that it builds on Perceiver. And to stay in the realm of future technology, it would be an omission not to mention DeepMind's neural algorithmic reasoning, a research direction that promises to marry classical computer-science algorithms with deep learning.

No tour of AI in 2021, however condensed, would be complete without an honorable mention of the work carried out on AI ethics. AI ethics remained a central concern in 2021: from regulators to practitioners, everyone has set about rewriting AI ethics in their own way. And let's not forget the current boom in AI applications in healthcare, an area where ethics should be a top priority, with or without AI.

Source: ZDNet.com




