Journey To The Center Of AI
How to AI
AI has been in the headlines almost constantly for the last 18 months. But despite all that fanfare, ChatGPT’ing, and Faces being hugged, I have to admit that I still don’t know how it all works. Not that anyone really does, but beyond the prompt text I submit to a Chats GPT or Dalls ‘E’, I don’t know what is happening in the background. What is the software stack powering GPT-4? What type of operating system, orchestrator, and applications allow for the massive training runs behind OpenAI? And what about the hardware? I know there’s a lot of GPUs, but there’s got to be more to it than that. I decided to dig into some of these layers and try to trace a path from the prompt down to the physical servers, and you get to come with me. So get in loser, we’re going to learn how to AI.
- I bet OpenAI has published something about their stack… sorta
- According to a post from 2016, OpenAI was using Python, TensorFlow, Numpy, Keras, and Anaconda (a toy sketch of how those pieces fit together follows after these notes)
- TensorFlow is a machine learning library with APIs available for Python and C++
- Numpy is a Python package developed to support scientific computing
- Flyte, a graduated project from the Linux Foundation that leverages Kubernetes (see the minimal flytekit example below)
- Nvidia H100 Spec Sheet
- DGX SuperPod
- Tensor Cores and CUDA cores, what is the difference? (the mixed-precision sketch below shows how that difference surfaces in framework code)
- Nvidia helpfully posted some example code using the cuDNN library
- The Infiniband standard is a different protocol than Ethernet
- Ultra Ethernet Consortium is focused on developing an open standard that meets or exceeds what Infiniband does today
- Remember when Crypto was using as much power as the country of Argentina?
- AI is already on pace to match that by 2027
- Microsoft’s AI for Beginners site to get down some of the terminology and lab time
- You may also want to take a beginner course on Python

Intro and outro music by James Bellavance, copyright 2022
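To put some code to the stack names above, here is a minimal sketch of how Python, NumPy, TensorFlow, and Keras typically fit together in a training script. This is a generic illustration under my own assumptions, not anything published by OpenAI: the data, model, and hyperparameters are invented for the example.

```python
# Minimal sketch: NumPy holds the arrays, Keras (shipped with
# TensorFlow) defines and trains a tiny model. The data and network
# here are invented purely for illustration.
import numpy as np
import tensorflow as tf

# Fake dataset: 1,000 samples with 32 features and binary labels.
x = np.random.rand(1000, 32).astype("float32")
y = np.random.randint(0, 2, size=(1000,)).astype("float32")

# A small fully connected network built with the Keras API.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(32,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])

# One call kicks off training; TensorFlow runs the math on whatever
# hardware (CPU or GPU) it finds.
model.fit(x, y, epochs=3, batch_size=64)
```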
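Flyte is in the notes because it schedules Python-defined workflows onto Kubernetes. Below is a minimal sketch using the flytekit SDK; the task and workflow names are made up for illustration and are not from the episode.

```python
# Minimal Flyte sketch: tasks are plain Python functions that Flyte
# packages into containers and runs as pods on Kubernetes.
from flytekit import task, workflow

@task
def preprocess(n: int) -> int:
    # Stand-in for a real data-prep step.
    return n * 2

@task
def train(n: int) -> str:
    # Stand-in for a real training step.
    return f"trained on {n} samples"

@workflow
def pipeline(n: int = 1000) -> str:
    # The workflow wires tasks together; Flyte turns this into a DAG
    # it can schedule, parallelize, and retry on the cluster.
    return train(n=preprocess(n=n))

if __name__ == "__main__":
    # Workflows can also be called locally for quick testing.
    print(pipeline(n=500))
```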
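On the Tensor Core versus CUDA core question, the short version is that CUDA cores handle general-purpose floating-point work while Tensor Cores are specialized units for low-precision matrix multiply-accumulate. From a framework's point of view, the practical switch is mixed precision; here is a hedged TensorFlow sketch of what that looks like (my own example, not Nvidia's cuDNN sample code).

```python
# Rough illustration of the Tensor Core vs. CUDA core distinction in
# practice: Tensor Cores accelerate low-precision matrix multiplies,
# and the way a framework reaches them is mixed precision.
import tensorflow as tf

# Keep variables in float32 but run layer math in float16, which is
# what lets Tensor Cores engage on GPUs that have them.
tf.keras.mixed_precision.set_global_policy("mixed_float16")

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(1024,)),
    tf.keras.layers.Dense(1024, activation="relu"),
    tf.keras.layers.Dense(10),
])

x = tf.random.normal((64, 1024))

# On a Tensor Core GPU the large matrix multiplies inside Dense run in
# float16; on other hardware they fall back to ordinary float math on
# CUDA cores or the CPU.
print(model(x).dtype)  # float16 under the mixed_float16 policy
```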