GTC 2023 is going on right now, and Nvidia showed off some of their newest steps in AI, including this amazing intro.

They introduced cuLitho, a new software library that accelerates computational lithography, the step in chip manufacturing where the photomasks for a processor are calculated. This computation used to take weeks and can now be done in a few hours. Speeding up chip production will in turn speed up the entire industry and shows how positive feedback loops power exponential growth.

They also talked about their new H100 GPUs for their DGX supercomputers. These chips will not only power the servers of big AI players like AWS, Azure, and OpenAI, but also Nvidia's own cloud servers, which will be available to smaller companies.

Part of this cloud service is Nvidia AI Foundations, which will provide pre-trained models for text, images, and protein sequences and will handle both the training and the inference of these models. One of the first users is Adobe, which uses the service for its new AI product Firefly.
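To get a feel for what "inference as a hosted service" means in practice, here is a minimal sketch of calling such a hosted model from the customer side. The endpoint, token, and response format below are hypothetical placeholders, not Nvidia's actual API.

```python
# Minimal sketch of calling a hosted foundation model over HTTP.
# The URL, token, and payload shape are hypothetical placeholders.
import requests

API_URL = "https://example-ai-cloud.com/v1/models/text-foundation/infer"  # hypothetical endpoint
API_TOKEN = "YOUR_API_TOKEN"  # hypothetical credential

def generate_text(prompt: str) -> str:
    """Send a prompt to the hosted model and return its completion."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        json={"prompt": prompt, "max_tokens": 128},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["text"]

if __name__ == "__main__":
    print(generate_text("Summarize the main announcements of GTC 2023."))
```

The point is that the heavy lifting (model weights, GPUs, scaling) stays in the provider's data center; the customer only sends prompts and receives results.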

At the end, they also presented a new server CPU, “Grace”, and the BlueField-3 DPU, which will power future data centers.

I am most impressed by their hardware improvements and their AI cloud platform, both of which will greatly accelerate AI adoption.