Microsoft combines accelerated computing with cloud-scale engineering to bring advanced AI capabilities to our customers. For years, we've worked with NVIDIA to integrate hardware, software and ...
Microsoft AI has made its in-house models for transcription, speech recognition, and image generation available on Foundry.
The future of AI compute is heterogeneous, according to Andrew Wall, Microsoft's GM of Azure Maia. The implications of this are ...
On Thursday, Microsoft introduced three new foundational AI models (MAI-Transcribe-1, MAI-Voice-1, and MAI-Image-2) focused on ...
Calling it the highest-performance chip of any custom cloud accelerator, the company says Maia is optimized for AI inference across multiple models. Signaling that the future of AI may not just be how ...
Microsoft has described how it validates GPU clusters for Azure AI workloads using its internally developed SuperBench ...
As AI compute costs rise, Microsoft is seeking to reduce reliance on third-party chips, extending its push from custom ...
The big four cloud giants are turning to Nvidia's Dynamo to boost inference performance, with the chip designer's new Kubernetes-based API helping to further ease complex orchestration. According to a ...