ARE GPUs THE FUTURE CPUs?
- Rohan Banerjee

- Dec 26, 2021
- 5 min read

The rapid growth of the GPU industry raises a question about the future of CPUs. Imagine that we might be the last generation to really know CPUs; our children may never know CPUs existed unless they read their textbooks carefully.
Coming back to the present, the question still keeps arising in everyone's mind: "ARE GPUs THE FUTURE?" Honestly, I am not sure either, but I can help you form your own opinion based on the facts I have gathered.
But first, what's a GPU anyway?
One of the most important types of computing technology in modern times, the GPU, or graphics processing unit, was designed for parallel processing and is used in a wide range of applications, including graphics and video rendering. Although they're best known for their compelling capabilities in gaming, GPUs are becoming popular for creative production and Artificial Intelligence too.
In the 1970s, the term "GPU" originally stood for graphics processing unit and described a programmable processing unit that worked independently from the CPU and was responsible for graphics manipulation and output.
They were originally designed to accelerate the rendering of 3D graphics. They became more adaptable and programmable over time, expanding their capabilities. This in turn allowed graphics programmers to create more interesting and realistic visual effects with advanced lighting and shadowing techniques. Other developers then began to tap GPU power to substantially accelerate other workloads in high-performance computing (HPC), deep learning, and other areas.
Modern GPUs are very efficient at image processing and computer graphics. For algorithms that handle large blocks of data in parallel, their highly parallel structure makes them more efficient than general-purpose CPUs. A GPU in a PC can be found on a video card or incorporated on the motherboard; in certain processors, it is integrated on the CPU die itself.
By now you should have a basic idea about GPUs, their history, and their basic purpose. I understand that computing acronyms can be confusing and cause problems in school, but acronyms are the IT world's favourite way of making interesting technologies sound extremely complicated.
So, what are CPUs, and why are we talking about them when GPUs already exist?
When looking for a new PC or laptop, the specifications will tell you the type of CPU to expect in the gleaming new device. Frustratingly, they almost always fail to tell you why that's so important.
The Central Processing Unit (CPU) is commonly referred to as the computer's brain. While the CPU is merely one of several processing units, it is the most crucial. The CPU accepts instructional inputs from the computer's RAM, decodes them, and processes the task before producing the output. It's usually a small square chip placed onto the device's motherboard, and it interacts with the other hardware to operate your computer.
Despite years of improvements in CPU capabilities, the basic function has remained the same, consisting of three steps: fetch, decode, and execute.
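To make the three steps concrete, here is a minimal toy sketch of the fetch-decode-execute cycle in Python. The three-instruction machine (LOAD/ADD/HALT), the accumulator, and the program are all invented for illustration; a real CPU does this in hardware, billions of times per second.

```python
# A toy illustration of the fetch-decode-execute cycle.
# The instruction set (LOAD/ADD/HALT) is made up for this sketch.

def run(program):
    """Run a tiny program: each instruction is an (opcode, operand) pair."""
    acc = 0  # accumulator register
    pc = 0   # program counter
    while True:
        instr = program[pc]       # FETCH the next instruction
        opcode, operand = instr   # DECODE it into opcode + operand
        pc += 1
        if opcode == "LOAD":      # EXECUTE: load a value into the accumulator
            acc = operand
        elif opcode == "ADD":     # EXECUTE: add a value to the accumulator
            acc += operand
        elif opcode == "HALT":    # EXECUTE: stop and return the result
            return acc

result = run([("LOAD", 2), ("ADD", 3), ("ADD", 5), ("HALT", None)])
print(result)  # prints 10
```

Everything a CPU does, from spreadsheets to games, ultimately reduces to this loop running extremely fast.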
So why GPUs? What makes them better competitors?
Although not as versatile as CPUs, GPUs can process certain data several orders of magnitude faster than CPUs thanks to massive parallelism. In a server environment, there might be 24 to 48 very fast CPU cores. Adding 4 to 8 GPUs to that same server can provide as many as 40,000 additional cores.
GPUs are best suited for repetitive and highly parallel computing tasks. Aside from video rendering, they excel at machine learning, financial simulations and risk modelling, and a variety of other scientific computations. GPUs provide tremendous parallelism, with each core dedicated to performing efficient operations.
Do they work together?
To increase data throughput and the number of concurrent calculations within an application, a CPU and a GPU can work together. A GPU augments a CPU architecture by allowing repetitive calculations within an application to be performed in parallel while the rest of the program continues to run on the CPU. While the CPU remains the taskmaster of the entire system, a GPU can complete more work in the same amount of time by utilising the power of parallelism.
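This division of labour can be roughly sketched in ordinary Python: the main thread plays the CPU, running the program logic, while a pool of workers stands in for the GPU's parallel cores. The `shade` function and pixel data are invented for illustration, and the thread pool is only an analogy; real GPU offload goes through APIs such as CUDA or OpenCL.

```python
from concurrent.futures import ThreadPoolExecutor

def shade(pixel):
    """A repetitive per-element calculation (stand-in for GPU work)."""
    return pixel * pixel % 255

pixels = list(range(1000))

# "CPU" part: sequential program logic decides what work to dispatch.
with ThreadPoolExecutor(max_workers=8) as pool:
    # "GPU" part: the same operation applied to every element in parallel.
    shaded = list(pool.map(shade, pixels))

# The "CPU" then carries on with the combined results.
print(sum(shaded))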
"If the CPU is a computer's brain, the GPU is its soul."
Over the past decades, however, GPUs have broken out of the boxy confines of the PC. They have ignited a worldwide AI boom. They’ve been woven into sprawling new hyperscale data centers.
While GPUs are now about a lot more than the PCs in which they first appeared, they remain anchored in the much older idea of parallel computing. And that’s what makes GPUs so powerful.
CPUs, to be sure, have remained essential, fast, and versatile, performing all manner of tasks. GPUs, by contrast, break complex problems into thousands or millions of pieces and work on them all at once.

Why are GPUs considered 'THE FUTURE'?
Architecturally, a CPU is made up of a few cores with a lot of cache memory that can manage a few software threads at once, but a GPU is made up of hundreds of cores that can handle thousands of threads at the same time.
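The architectural difference can be caricatured in a few lines of Python: a CPU-style core handles one element per step, while a GPU-style "wide" core applies one decoded instruction across a whole group of lanes at once. This is a toy model of SIMD/SIMT execution, not real hardware behaviour, and the functions and lane count are illustrative.

```python
def cpu_style(data):
    # Few fat cores: one element handled per step.
    out = []
    for x in data:
        out.append(x + 1)
    return out

def gpu_style(data, lanes=8):
    # Many thin cores: one instruction applied to a group of `lanes`
    # elements per step (a simplified SIMD/SIMT-style picture).
    out = []
    for i in range(0, len(data), lanes):
        group = data[i:i + lanes]
        out.extend([x + 1 for x in group])  # one "instruction", many lanes
    return out

data = list(range(16))
assert cpu_style(data) == gpu_style(data)  # same result, different execution shape
```

Both produce identical results; the difference is that the GPU-style version finishes in far fewer "steps" when the data is large and the operation is uniform.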
GPUs deliver the once-esoteric technology of parallel computing. It's a technology with a distinguished lineage that includes names like supercomputer genius Seymour Cray. But instead of taking the form of hulking supercomputers, GPUs put this concept to work in the desktops and gaming consoles of millions of gamers.
Over time GPUs have proven to be the superheroes of modern computing, and their importance to AI is no secret. In deep learning, where vast amounts of data are processed through neural networks, it is GPUs that do the heavy lifting of training them to perform complicated tasks.
In the automotive industry, they provide unmatched image recognition capabilities. They are key to creating self-driving cars.
In healthcare and life sciences, they can crunch medical data and, through deep learning, help turn that data into new capabilities.
CONCLUSION:
Yes, a GPU can nowadays do almost everything that a CPU can. It may execute each operation at a tenth to a quarter of a CPU's speed, but each of those operations can be applied to 128 to 256 identical data elements in the time the CPU handles one. So for workloads like image processing, it beats the pants off a CPU. But if you only want one operation at a time, which is what much ordinary code does, it runs slowly.
It's like using a train instead of a car. A train takes hundreds of passengers very fast, but only where it wants to go; a car can take a few passengers anywhere.
By the day GPUs can also run the linear, basic computations of a CPU at the same rate, the opposite of this question is likely to already be occurring.
In short, GPUs have become essential. They began by accelerating gaming and graphics. Now, they’re accelerating more and more areas where computing horsepower will make a difference.
Could it be possible to replace CPUs with GPUs? That’s up to the future computing industry, but what do you say?
--- Rohan Banerjee



