Let me ask you this: while reading this blog post, are you listening to music? Do you have more than one tab open? Other applications running? Maybe Discord? How do you think your operating system runs all of them at once? This is known as multitasking. Let's take a look at how our computer achieves it.

Why this post?

Multi-threading is a very common follow-up question in interviews. Sometimes it is also used for screening purposes. This post builds the foundation required for multi-threading. Get a cup of ☕ and read on.

Be sure to check out an experiment I ran at the bottom of the post 👇

Basics

Instruction Cycle

As with everything, let's start with the basics. Imagine a device with a single processor. The basic function of the processor is to execute instructions, something like the assembly instruction below that adds two numbers:

ADD     eax, ebx

Every time the processor executes an instruction, it is "busy", i.e. doing work. Until that instruction finishes, it cannot execute another one.




Single Task Single Core System

Let's assume I am a bad programmer and I write the following piece of code:

int i = 0;
while (true) {
    i++;
    if (i == Integer.MAX_VALUE) {
        i = 0;
    }
}

If you ran the above program on a hypothetical system that has only one processor and can only run one task at a time, your system would freeze or become unresponsive. The above task takes up all of the compute time. (Check out the experiment below.)

Multi-tasking

Fortunately, computer designers knew there would be programmers like me, so they came up with a different approach. Instead of holding up all the other programs until one program completes, the processor switches between the active processes. This ensures that multiple processes make progress toward completion without freezing the system.

How does multi-tasking work

Imagine you had to make progress on 3 tasks together:

  • Cooking
  • Laundry
  • Play a game

The easiest way for you to do it would be to work on each task for a certain amount of time (a time slice) and then move on to the next one. You would probably use an alarm clock (an interrupt) to signal when it's time to switch to the next task.

In exactly the same way, the OS sets up time slices, at the end of which an interrupt fires. On an interrupt, the processor switches tasks. These time slices are so small that, as a user, you don't notice them.
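To make the analogy concrete, here is a minimal sketch in Java of that rotation. The Chore class, the chore durations, and the 10-minute slice are all made up for illustration; a real OS slices in milliseconds, not minutes:

import java.util.ArrayDeque;
import java.util.Queue;

// A toy simulation of time slicing: one "processor" (the loop below) rotates
// through several tasks, giving each a fixed slice before moving on.
// The Chore class and its numbers are invented purely for illustration.
public class TimeSliceDemo {

    static class Chore {
        final String name;
        int remainingMinutes;

        Chore(String name, int remainingMinutes) {
            this.name = name;
            this.remainingMinutes = remainingMinutes;
        }
    }

    public static void main(String[] args) {
        Queue<Chore> readyQueue = new ArrayDeque<>();
        readyQueue.add(new Chore("Cooking", 30));
        readyQueue.add(new Chore("Laundry", 20));
        readyQueue.add(new Chore("Play a game", 25));

        int timeSlice = 10; // the "alarm clock" goes off every 10 minutes

        while (!readyQueue.isEmpty()) {
            Chore current = readyQueue.poll();
            int worked = Math.min(timeSlice, current.remainingMinutes);
            current.remainingMinutes -= worked;
            System.out.println("Worked on " + current.name + " for " + worked
                    + " min, " + current.remainingMinutes + " min left");

            if (current.remainingMinutes > 0) {
                readyQueue.add(current); // interrupted: go to the back of the line
            } else {
                System.out.println(current.name + " is done");
            }
        }
    }
}

Each task keeps getting a turn, so all three finish without any one of them hogging the "processor".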

💡
Remember, multitasking does not necessarily mean multiple processors. Even a single processor can run multiple tasks. However, they will not run in parallel.

Multi-processor Systems

The above was the scenario for a single processor. What about multi-processor systems? The scenario is exactly the same: each processor handles multiple different processes to keep your system usable.




Now you may wonder: this seems to be all advantages. How's that possible in computers (or in life 🤔)? There has to be a catch. Yes, there is.

Context Switching

This brings us to the next important topic: context switching. When a process is created by the operating system, it is allocated some amount of memory and a process control block (PCB). The PCB is used to save and restore the state of the process.

When the processor receives an interrupt to run a different process (P2), it saves the state of the current process (P1) in its PCB, loads the PCB of the other process (P2), and executes P2. The time this takes is called the context switching time. During this time, no useful work is done from the perspective of the user. However, it is usually so short that the user doesn't notice it.
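Here is a heavily simplified sketch of that save/restore step, assuming a made-up Pcb class that stands in for the real process control block (a real PCB tracks much more: all the registers, the program counter, memory maps, open files, scheduling information, and so on):

import java.util.HashMap;
import java.util.Map;

// A toy model of a context switch. Pcb is a stand-in for the real process
// control block, and the "CPU registers" below are just static fields.
public class ContextSwitchDemo {

    static class Pcb {
        final String processName;
        long programCounter;                      // where the process left off
        Map<String, Long> registers = new HashMap<>();

        Pcb(String processName) {
            this.processName = processName;
        }
    }

    // Pretend these are the CPU's actual registers.
    static long cpuProgramCounter;
    static Map<String, Long> cpuRegisters = new HashMap<>();

    // Save the running process's state into its PCB, then load the next one's.
    static void contextSwitch(Pcb current, Pcb next) {
        current.programCounter = cpuProgramCounter;      // save P1's state
        current.registers = new HashMap<>(cpuRegisters);

        cpuProgramCounter = next.programCounter;          // restore P2's state
        cpuRegisters = new HashMap<>(next.registers);
        System.out.println("Switched from " + current.processName
                + " to " + next.processName);
    }

    public static void main(String[] args) {
        Pcb p1 = new Pcb("P1");
        Pcb p2 = new Pcb("P2");

        cpuProgramCounter = 42;      // P1 has been running for a while
        contextSwitch(p1, p2);       // interrupt arrives: run P2 instead
        System.out.println("P1 saved at instruction " + p1.programCounter);
    }
}

Notice that the switch itself does nothing the user asked for; it is pure bookkeeping, which is exactly why it counts as overhead.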

Process Scheduling

You may think that if your processor is running at 100% then it's bad for your system. But you are actually getting the most bang for your buck when it runs at 100%. It means your processors are doing useful work instead of just sitting idle.

This is where the process scheduler comes in. Its job is to make sure that your processor is being used as much as possible. There are different scheduling algorithms that can be used here, such as first-come-first-served, shortest job first, round-robin, and priority scheduling.
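As a rough illustration of why the choice of algorithm matters, here is a small sketch comparing the average waiting time of first-come-first-served against shortest-job-first for the same workload. The burst times are made-up numbers, and this ignores arrival times and context switching overhead:

import java.util.Arrays;

// Compares average waiting time for FCFS vs SJF on the same (made-up) workload.
public class SchedulingDemo {

    // With a fixed order, each process waits for the bursts of everyone before it.
    static double averageWait(int[] burstsInOrder) {
        int waited = 0, totalWait = 0;
        for (int burst : burstsInOrder) {
            totalWait += waited;
            waited += burst;
        }
        return (double) totalWait / burstsInOrder.length;
    }

    public static void main(String[] args) {
        int[] arrivalOrder = {8, 2, 4};      // FCFS: run in the order they arrived

        int[] shortestFirst = arrivalOrder.clone();
        Arrays.sort(shortestFirst);          // SJF: run the shortest burst first

        System.out.println("FCFS average wait: " + averageWait(arrivalOrder));
        System.out.println("SJF  average wait: " + averageWait(shortestFirst));
    }
}

Running the shortest jobs first cuts the average wait from 6 to about 2.67 time units for these numbers, which is why schedulers care about ordering, not just keeping the CPU busy.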

Experiment

Here is a small experiment that I ran on my machine, which has 8 logical processors. You can run the same experiment on your machine in your favorite language. Be ready to restart your computer. You have been warned.

public class Infinite {
    public static void main(String[] args) {
        int i = 0;
        while (true) {
            i++;
            if (i == Integer.MAX_VALUE) {
                i = 0;
            }
        }
    }
}

As you may have noticed, the above code just runs infinitely. I opened 8 terminal windows and ran the program in all 8 of them. Below are the before and after CPU utilization graphs.

*I had to restart my computer for the first time in 48 days.
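If you want to try this yourself, you can first check how many logical processors your own JVM sees and launch that many copies (the CpuCount class name here is just for this example; availableProcessors() is a standard Runtime method):

public class CpuCount {
    public static void main(String[] args) {
        // Number of logical processors available to the JVM.
        System.out.println(Runtime.getRuntime().availableProcessors());
    }
}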

Next up, we will use these concepts and build upon them for multi-threading.


