
AI Hardware, Software, and Architectures Powering Modern Artificial Intelligence

From GPUs and ASICs to CUDA, Accelerators, Compilers, and Runtimes



By: Practicing Engineers Network
Narrated by: Virtual Voice

Buy for $5.99





Modern artificial intelligence is not powered by models alone—it is driven by a carefully engineered stack of hardware, software, and system architectures working together at scale.

This book provides a clear, practical, and system-level view of how modern AI actually runs, from silicon to software. It explains how GPUs, ASICs, and emerging accelerators execute neural networks, how CUDA and alternative runtimes expose hardware capabilities, and how compilers, communication libraries, and inference systems transform models into high-performance AI workloads.

Written for engineers, architects, and technically curious professionals, this book goes beyond surface-level explanations and focuses on how real AI systems are designed, optimized, and deployed in production environments.

You will learn:

  • How GPUs, TPUs, NPUs, and AI ASICs are architected for training and inference

  • Why CUDA became central to AI—and how alternatives like ROCm, oneAPI, and Vulkan compare

  • How compilers, graph lowering, and kernel optimization affect performance

  • How distributed training systems scale across multiple accelerators

  • How inference systems balance latency, throughput, and cost

  • Where memory, bandwidth, and communication become bottlenecks

  • How modern AI hardware trends are shaping future system designs

The book emphasizes system thinking—connecting hardware capabilities to software abstractions and real-world performance tradeoffs. Diagrams, structured explanations, and end-to-end workflows make complex concepts accessible without oversimplifying them.

Whether you are designing AI infrastructure, optimizing performance, evaluating hardware platforms, or simply seeking a deeper understanding of how AI works under the hood, this book serves as both a learning guide and long-term reference.

Categories: Computer Science, Technology, Software, Data Science, Machine Learning