
Chapter 11 – Deques (Double-Ended Queues)

Welcome to Chapter 11! In this chapter, we'll explore Deques, or Double-Ended Queues, an interesting and versatile data structure that extends the concept of a standard queue by allowing insertion and deletion at both ends. This concept will help you understand more advanced applications of queues and solve real-life problems such as managing text editors, undo operations, or simulations. Let's begin!

What is a Deque?

A Deque (pronounced "deck") stands for Double-Ended Queue, and it is a generalized form of the standard queue. Unlike a regular queue, where insertion is done at the back and deletion at the front, a deque allows insertion and deletion at both ends of the structure. In simple terms:

- You can add an element at either the front or the back.
- You can remove an element from either the front or the back.

This makes the deque a very flexible data structure. Imagine you're standing in a line (queue), but now you have the option to add someone to the front of the line or let them sneak in at the back. That's how a deque works!

Types of Deques

Deques can be classified into two types:

- Input-Restricted Deque: insertion is restricted to one end (say, the rear), but deletion can occur at both ends.
- Output-Restricted Deque: deletion is restricted to one end (say, the front), but insertion can happen at both ends.

Both types allow different levels of control depending on your use case.

Applications of Deques

Before we get into the operations, let's explore some real-world applications of deques:

- Undo/Redo Operations: in text editors or drawing tools, deques can be used to manage undo and redo functionality. You can easily move forward and backward in the history of actions.
- Palindrome Checking: deques are perfect for checking palindromes (words or phrases that read the same forward and backward) because they allow access from both ends.
- Sliding Window Problems: in algorithms, deques are often used to maintain a sliding window of data, useful in problems like finding the maximum or minimum of all subarrays of a given size.
- Task Scheduling: in simulations or real-time systems like the ones developed by Emancipation Edutech Private Limited, deques can manage tasks with both high and low priorities, offering flexibility in choosing which task to execute next.

You can read more about these applications and other use cases for free on digilearn.cloud!

Basic Operations of Deques

Now, let's break down the core operations that can be performed on a deque:

- Insertion at Front: insert an element at the front of the deque.
- Insertion at Rear: insert an element at the rear of the deque.
- Deletion from Front: remove an element from the front of the deque.
- Deletion from Rear: remove an element from the rear of the deque.
- Get Front: retrieve the front element without removing it.
- Get Rear: retrieve the rear element without removing it.
- Check if Empty: check whether the deque is empty.
- Check if Full: check whether the deque is full (in a fixed-size implementation).

These operations make deques highly versatile, with many possibilities for manipulating the data structure.

Deque Representation and Implementation

Let's implement a deque using an array in C to better understand how it works.
Define the Deque Structure

    #include <stdio.h>
    #include <stdlib.h>

    #define MAX 100

    struct Deque {
        int arr[MAX];  // Array to store elements
        int front;     // Points to the front end
        int rear;      // Points to the rear end
        int size;      // Current number of elements
    };

- arr: this array will store our elements.
- front: this index will indicate the front of the deque.
- rear: this index will indicate the rear of the deque.
- size: this will track the number of elements in the deque.

Initialize the Deque

    void initialize(struct Deque* dq) {
        dq->front = -1;  // Initially, front is -1 (empty state)
        dq->rear = -1;   // Initially, rear is -1 (empty state)
        dq->size = 0;    // No elements in the deque
    }

Insertion at Front

    void insertFront(struct Deque* dq, int value) {
        if (dq->size == MAX) {  // Check if deque is full
            printf("Deque Overflow\n");
            return;
        }
        if (dq->front == -1) {          // First element insertion
            dq->front = 0;
            dq->rear = 0;
        } else if (dq->front == 0) {    // Circular condition
            dq->front = MAX - 1;
        } else {
            dq->front--;
        }
        dq->arr[dq->front] = value;
        dq->size++;
    }

Insertion at Rear

    void insertRear(struct Deque* dq, int value) {
        if (dq->size == MAX) {  // Check if deque is full
            printf("Deque Overflow\n");
            return;
        }
        if (dq->rear == -1) {               // First element insertion
            dq->rear = 0;
            dq->front = 0;
        } else if (dq->rear == MAX - 1) {   // Circular condition
            dq->rear = 0;
        } else {
            dq->rear++;
        }
        dq->arr[dq->rear] = value;
        dq->size++;
    }

Deletion from Front

    void deleteFront(struct Deque* dq) {
        if (dq->size == 0) {  // Check if deque is empty
            printf("Deque Underflow\n");
            return;
        }
        if (dq->front == dq->rear) {         // Only one element in deque
            dq->front = -1;
            dq->rear = -1;
        } else if (dq->front == MAX - 1) {   // Circular condition
            dq->front = 0;
        } else {
            dq->front++;
        }
        dq->size--;
    }

Deletion from Rear

    void deleteRear(struct Deque* dq) {
        if (dq->size == 0) {  // Check if deque is empty
            printf("Deque Underflow\n");
            return;
        }
        if (dq->front == dq->rear) {   // Only one element in deque
            dq->front = -1;
            dq->rear = -1;
        } else if (dq->rear == 0) {    // Circular condition
            dq->rear = MAX - 1;
        } else {
            dq->rear--;
        }
        dq->size--;
    }

Checking if the Deque is Empty or Full

    int isEmpty(struct Deque* dq) {
        return dq->size == 0;
    }

    int isFull(struct Deque* dq) {
        return dq->size == MAX;
    }

Real-World Example

Imagine you're developing a browser. The deque can be used to implement forward and backward navigation: you insert URLs into the deque as the user browses different websites, and when the user clicks the "back" button, you remove the URL from the rear (current page) and go back to the previous one.
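To tie the pieces together, here is a small usage sketch. It is not from the original chapter: the main function and the getFront helper below are assumptions added to exercise the operations defined above.

    // Assumed helper: peek at the front element without removing it.
    int getFront(struct Deque* dq) {
        if (isEmpty(dq)) {
            printf("Deque is Empty\n");
            return -1;
        }
        return dq->arr[dq->front];
    }

    int main(void) {
        struct Deque dq;
        initialize(&dq);

        insertRear(&dq, 10);   // Deque: 10
        insertRear(&dq, 20);   // Deque: 10 20
        insertFront(&dq, 5);   // Deque: 5 10 20

        printf("Front element: %d\n", getFront(&dq));  // Prints 5

        deleteFront(&dq);      // Deque: 10 20
        deleteRear(&dq);       // Deque: 10

        printf("Front element: %d\n", getFront(&dq));  // Prints 10
        return 0;
    }

Under these assumptions, the sketch prints 5 and then 10, showing insertions and deletions happening at both ends.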


Chapter 10 – Priority Queues

Welcome to Chapter 10! In this chapter, we'll explore Priority Queues, a special type of queue where elements are processed based on their priority rather than the order in which they were added. Priority queues are incredibly useful in various real-world applications, such as task scheduling, Dijkstra's shortest path algorithm, and handling network packets. Let's dive into the concept and understand how they work!

What is a Priority Queue?

A Priority Queue is a data structure where each element is associated with a priority, and the element with the highest (or lowest) priority is dequeued first. Unlike a regular queue, which follows the First-In-First-Out (FIFO) principle, priority queues can rearrange the order of processing based on priorities.

For example, in an emergency room, patients with life-threatening conditions (higher priority) will be treated first, even if they arrived after other patients with minor injuries. Here's a basic visual of how a priority queue works:

    Priority Queue: [Critical Patient (Priority 3)] -> [Moderate Patient (Priority 2)] -> [Minor Patient (Priority 1)]

In this case, the Critical Patient will be dequeued and treated first, regardless of the order in which they entered the queue.

Types of Priority Queues

There are two types of priority queues:

- Max-Priority Queue: the element with the highest priority is dequeued first.
- Min-Priority Queue: the element with the lowest priority is dequeued first.

For instance, in task scheduling systems at Emancipation Edutech Private Limited, a max-priority queue might be used to prioritize the most urgent student requests first.

Priority Queue Operations

The following operations are performed on priority queues:

- Insert (Enqueue): add an element to the queue with a given priority.
- Extract Max (Dequeue): remove the element with the highest priority (in max-priority queues).
- Get Max: retrieve, but do not remove, the element with the highest priority.
- Increase Priority: change the priority of an element in the queue.

Let's understand these operations more deeply by implementing a priority queue using arrays.

Priority Queue Implementation Using Arrays

To keep things simple, we'll implement a max-priority queue using arrays, where higher numbers indicate higher priorities.

Define the Queue Structure

    #include <stdio.h>
    #include <stdlib.h>

    #define MAX 100  // Max size of the priority queue

    struct PriorityQueue {
        int data[MAX];      // Array to store elements
        int priority[MAX];  // Array to store priorities
        int size;           // Number of elements in the queue
    };

- data: stores the elements.
- priority: stores the priority associated with each element.
- size: tracks the current number of elements in the queue.

Initialize the Priority Queue

    void initialize(struct PriorityQueue* pq) {
        pq->size = 0;  // Queue is initially empty
    }

Insert Operation

    void insert(struct PriorityQueue* pq, int value, int prio) {
        if (pq->size == MAX) {  // Check if the queue is full
            printf("Queue Overflow\n");
            return;
        }
        int i = pq->size;
        pq->data[i] = value;
        pq->priority[i] = prio;
        pq->size++;
    }

Here, we simply insert an element along with its priority at the end of the queue.
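The operations list above also mentions Get Max, which is not implemented in this excerpt. Here is a minimal sketch consistent with the array layout used here; the function name getMax is an assumption, not from the original text.

    // Assumed helper: return the highest-priority value without removing it.
    int getMax(struct PriorityQueue* pq) {
        if (pq->size == 0) {  // Nothing to peek at
            printf("Queue is Empty\n");
            return -1;
        }
        int maxIndex = 0;
        for (int i = 1; i < pq->size; i++) {  // Scan all priorities
            if (pq->priority[i] > pq->priority[maxIndex]) {
                maxIndex = i;
            }
        }
        return pq->data[maxIndex];  // The value stays in the queue
    }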
Extract Max Operation

    int extractMax(struct PriorityQueue* pq) {
        if (pq->size == 0) {  // Check if the queue is empty
            printf("Queue Underflow\n");
            return -1;
        }
        int maxIndex = 0;
        // Find the element with the highest priority
        for (int i = 1; i < pq->size; i++) {
            if (pq->priority[i] > pq->priority[maxIndex]) {
                maxIndex = i;
            }
        }
        int maxValue = pq->data[maxIndex];  // Store the highest-priority value
        // Shift the remaining elements
        for (int i = maxIndex; i < pq->size - 1; i++) {
            pq->data[i] = pq->data[i + 1];
            pq->priority[i] = pq->priority[i + 1];
        }
        pq->size--;
        return maxValue;
    }

In this operation, we scan the queue to find the element with the highest priority and then remove it, shifting the rest of the elements.

Example Usage

    int main() {
        struct PriorityQueue pq;
        initialize(&pq);

        // Insert elements with priorities
        insert(&pq, 10, 1);  // Insert value 10 with priority 1
        insert(&pq, 20, 4);  // Insert value 20 with priority 4
        insert(&pq, 30, 3);  // Insert value 30 with priority 3

        // Extract the highest-priority element
        printf("Extracted element: %d\n", extractMax(&pq));
        return 0;
    }

Output:

    Extracted element: 20

Notice how 20 was extracted because it had the highest priority (4).

Using Priority Queues in Real Life

Priority queues are extremely useful in real-world applications. Let's explore some common scenarios:

- Task Scheduling: operating systems use priority queues to manage tasks. Processes with higher priority are executed before lower-priority processes, ensuring time-sensitive tasks are handled quickly.
- Network Traffic: in network systems, priority queues help manage data packets. Critical packets (such as emergency notifications) are transmitted first, while less important packets (like email) are handled later.
- Dijkstra's Algorithm: priority queues are used in algorithms like Dijkstra's Shortest Path Algorithm. Nodes with the smallest distance (priority) are processed first, helping to find the shortest path between two points efficiently.
- Emergency Systems: priority queues manage patients in hospitals, where the most critical patients (with higher priorities) are treated first.

At Emancipation Edutech Private Limited, we could use priority queues to manage student queries or schedule courses based on demand, ensuring that high-demand courses are given priority. Don't forget to explore digilearn.cloud to access more free educational resources!

Implementing Priority Queues Using Heaps

While arrays are a simple way to implement priority queues, they're not the most efficient. For faster access to the highest-priority element, heaps are commonly used. A binary heap allows both insertion and extraction of the maximum element in O(log n) time, compared to the O(n) extraction required with the unsorted array above. Let's see how a max-heap works with priority queues.

Max-Heap Representation

A max-heap is a binary tree where the parent node is always greater than or equal to its children. This ensures that the root always contains the highest-priority element, which can be removed efficiently.

          20
         /  \
       15    10
      /  \
     8    5

Here, 20 is the maximum value, and extracting it requires only O(log n) time to rearrange the heap. A heap-backed version is sketched at the end of this chapter.

Applications of Priority Queues

Priority queues are widely used in areas like:

- Operating Systems: for task scheduling and resource management.
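As promised in the max-heap section above, here is a minimal sketch of a heap-backed max-priority queue in which both insertion and extraction take O(log n) time. This code is not from the original chapter; the names HeapPQ, heapInsert, and heapExtractMax are assumptions used for illustration.

    #include <stdio.h>

    #define HEAP_MAX 100

    // Assumed structure: a binary max-heap keyed on priority.
    struct HeapPQ {
        int data[HEAP_MAX];
        int priority[HEAP_MAX];
        int size;
    };

    static void swapAt(struct HeapPQ* h, int i, int j) {
        int d = h->data[i];     h->data[i] = h->data[j];         h->data[j] = d;
        int p = h->priority[i]; h->priority[i] = h->priority[j]; h->priority[j] = p;
    }

    // Insert in O(log n): place the element at the end and sift it up.
    void heapInsert(struct HeapPQ* h, int value, int prio) {
        if (h->size == HEAP_MAX) { printf("Queue Overflow\n"); return; }
        int i = h->size++;
        h->data[i] = value;
        h->priority[i] = prio;
        while (i > 0 && h->priority[(i - 1) / 2] < h->priority[i]) {
            swapAt(h, i, (i - 1) / 2);  // Bubble the new element toward the root
            i = (i - 1) / 2;
        }
    }

    // Extract max in O(log n): take the root, move the last element up, sift down.
    int heapExtractMax(struct HeapPQ* h) {
        if (h->size == 0) { printf("Queue Underflow\n"); return -1; }
        int maxValue = h->data[0];
        h->size--;
        h->data[0] = h->data[h->size];
        h->priority[0] = h->priority[h->size];
        int i = 0;
        for (;;) {
            int left = 2 * i + 1, right = 2 * i + 2, largest = i;
            if (left < h->size && h->priority[left] > h->priority[largest])   largest = left;
            if (right < h->size && h->priority[right] > h->priority[largest]) largest = right;
            if (largest == i) break;
            swapAt(h, i, largest);
            i = largest;
        }
        return maxValue;
    }

The design choice is the classic one: keeping the array partially ordered as a heap avoids the full scan and element shifting that the unsorted-array version needs on every extraction.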


Chapter 9 – Circular Queues

Welcome to Chapter 9! In this chapter, we'll explore Circular Queues, which are an extension of the basic queue structure. Circular queues optimize memory usage by reusing spaces that are freed when elements are dequeued. This makes them highly useful for scenarios with fixed-size buffers, like operating systems and network packet handling.

What is a Circular Queue?

A Circular Queue is a type of queue in which the last position is connected back to the first position, making the queue circular. This means that when the rear of the queue reaches the end of the array, it wraps around to the front (if there's space) to reuse the vacated positions. Circular queues solve the problem of wasted space in a standard queue implementation, where elements are removed from the front but the space isn't reused.

Here's a basic diagram of a circular queue:

    Front
      ↓
    [5] -> [6] -> [7] -> [8] -> [0]
                                 ↑
                                Rear

In this circular queue, after adding an element to the rear, we wrap around to the front to fill any vacated positions.

Why Use a Circular Queue?

In a normal queue, once the rear pointer reaches the end of the array, no more elements can be added, even if there are free spaces at the front due to dequeued elements. Circular queues resolve this by allowing the rear pointer to wrap around to the front and reuse those free spaces.

At Emancipation Edutech Private Limited, a circular queue could be used in systems where students access learning modules. When students complete a module and move on, their slot in the queue can be reused by another student.

Operations on Circular Queues

The operations on circular queues are similar to those on regular queues but with a few modifications to handle the circular nature. The following are the main operations:

- Enqueue: add an element to the rear of the circular queue.
- Dequeue: remove an element from the front of the circular queue.
- Front: get the front element of the circular queue.
- IsFull: check if the queue is full.
- IsEmpty: check if the queue is empty.

Let's see how to implement these operations using an array-based circular queue.

Circular Queue Implementation Using an Array

Define the Queue Structure

    #include <stdio.h>
    #include <stdlib.h>

    #define MAX 5  // Size of the circular queue

    struct CircularQueue {
        int front, rear;
        int arr[MAX];
    };

- front: index of the front element.
- rear: index of the rear element.
- arr: array to store the elements.

Initialize the Circular Queue

    void initialize(struct CircularQueue* q) {
        q->front = -1;
        q->rear = -1;  // Queue is initially empty
    }

Enqueue Operation

    void enqueue(struct CircularQueue* q, int value) {
        if ((q->rear + 1) % MAX == q->front) {  // Check if the queue is full
            printf("Queue Overflow\n");
            return;
        }
        if (q->front == -1)  // If queue is initially empty
            q->front = 0;
        q->rear = (q->rear + 1) % MAX;  // Increment rear in a circular manner
        q->arr[q->rear] = value;        // Add the value
    }

In this case, we use the modulo (%) operator to wrap around the rear pointer when it reaches the end of the array.
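The operations list above includes IsEmpty and IsFull, which are not shown in this excerpt. A minimal sketch consistent with the convention used here (front == -1 meaning empty) might look like the following; these helper names are assumptions, not from the original text.

    // Assumed helpers matching the front/rear convention above.
    int isEmpty(struct CircularQueue* q) {
        return q->front == -1;  // Queue was reset or never used
    }

    int isFull(struct CircularQueue* q) {
        return (q->rear + 1) % MAX == q->front;  // Next rear slot would collide with front
    }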
Dequeue Operation

    int dequeue(struct CircularQueue* q) {
        if (q->front == -1) {  // Check if the queue is empty
            printf("Queue Underflow\n");
            return -1;  // Return -1 to indicate an error
        }
        int value = q->arr[q->front];  // Store the front value
        if (q->front == q->rear) {     // If there was only one element
            q->front = -1;
            q->rear = -1;              // Reset the queue
        } else {
            q->front = (q->front + 1) % MAX;  // Increment front in a circular manner
        }
        return value;
    }

Front Operation

    int front(struct CircularQueue* q) {
        if (q->front == -1) {  // Check if the queue is empty
            printf("Queue is Empty\n");
            return -1;
        }
        return q->arr[q->front];  // Return the front value
    }

Example Usage

    int main() {
        struct CircularQueue queue;
        initialize(&queue);

        // Enqueue elements
        enqueue(&queue, 10);
        enqueue(&queue, 20);
        enqueue(&queue, 30);
        enqueue(&queue, 40);

        // Dequeue an element
        printf("Dequeued element: %d\n", dequeue(&queue));

        // Enqueue another element
        enqueue(&queue, 50);

        // Print the front element
        printf("Front element: %d\n", front(&queue));
        return 0;
    }

Output:

    Dequeued element: 10
    Front element: 20

Notice how, after dequeueing 10, we enqueue another element, 50, reusing the space previously held by 10.

Circular Queue vs. Linear Queue

Let's compare circular queues and linear queues to understand the key differences:

- Memory Utilization: a circular queue utilizes memory efficiently by reusing vacated positions after dequeueing, while a linear queue wastes memory because vacated positions aren't reused.
- Overflow Condition: in a circular queue, overflow occurs only when all positions are filled, even if the front has moved from the first index; in a linear queue, it occurs when the rear reaches the end, even if there are empty spaces at the front.
- Example Usage: circular queues are used in scheduling, buffering, and resource management systems like CPU scheduling or network packet handling; linear queues are used in simpler cases, like managing a single queue of tasks.

Circular queues are perfect when you need to manage fixed-size buffers and prevent wasted space. At Emancipation Edutech Private Limited, circular queues might be used in our learning management systems to efficiently handle student queries or manage the flow of study resources, ensuring no slot goes to waste. To learn more, visit digilearn.cloud, where you can read this book and other educational resources for free.

Applications of Circular Queues

Circular queues are used in several real-world applications:

- CPU Scheduling: circular queues are often used to manage processes in the round-robin CPU scheduling algorithm. In this method, each process is given an equal share of the CPU, and once its time slice is over, it goes back to the end of the queue (a small sketch of this idea follows below).
- Network Buffers: circular queues are used in network systems to handle incoming data packets. When one packet is processed, the slot is reused for a new incoming packet, preventing overflow until all slots are filled.
- Job Scheduling: printers and other resource management systems use circular queues to manage multiple print jobs without wasting space.
- Real-Time Systems: in real-time computing environments, circular queues help in managing time-critical tasks.
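To illustrate the round-robin idea mentioned in the applications list, here is a small hypothetical sketch (not from the original text) that cycles a few process IDs through the circular queue defined above, giving each one a turn and re-enqueueing it.

    // Hypothetical round-robin sketch using the circular queue defined above.
    void roundRobinDemo(void) {
        struct CircularQueue q;
        initialize(&q);

        // Three processes waiting for CPU time
        enqueue(&q, 1);
        enqueue(&q, 2);
        enqueue(&q, 3);

        // Give each process two time slices, cycling through the queue
        for (int slice = 0; slice < 6; slice++) {
            int pid = dequeue(&q);      // Next process gets the CPU
            printf("Running process %d\n", pid);
            enqueue(&q, pid);           // Time slice over: back to the rear
        }
    }

Under these assumptions, the loop prints the process IDs 1, 2, 3, 1, 2, 3, which is exactly the fair rotation that round-robin scheduling provides.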


Chapter 8 – Queues

Welcome to Chapter 8! In this chapter, we'll explore Queues, another fundamental data structure that follows the First-In-First-Out (FIFO) principle. Queues are widely used in various computing scenarios like task scheduling, resource management, and buffering.

What is a Queue?

A Queue is a linear data structure where elements are inserted at the rear (also called the back) and removed from the front. This follows the First-In-First-Out (FIFO) principle, which means the first element added to the queue will be the first one to be removed, much like a line of people waiting at a ticket counter.

Here's a basic diagram of a queue:

    Front -> [1] -> [2] -> [3] -> [4] <- Rear

In this queue, 1 is the front element and will be removed first, while 4 is at the rear and will be removed last.

Why Use Queues?

Queues are especially useful in situations where you need to manage tasks or resources in the order they arrive. They are widely applied in various domains:

- CPU Scheduling: operating systems use queues to manage processes waiting for execution.
- Network Traffic Management: queues are used to buffer incoming network packets before processing.
- Printer Job Management: when multiple print jobs are sent to a printer, they are queued and printed in the order they were received.

At Emancipation Edutech Private Limited, queues may be used to manage student requests or service multiple learning queries simultaneously, ensuring that tasks are handled in the order they arrive.

Basic Operations on Queues

Queues support four essential operations:

- Enqueue: add an element to the rear of the queue.
- Dequeue: remove an element from the front of the queue.
- Front: get the front element of the queue.
- IsEmpty: check if the queue is empty.

Let's see how to implement these operations using both array-based and linked-list-based queues.

Queue Implementation Using an Array

An array-based queue is implemented using a fixed-size array to store the elements. However, managing the front and rear indices becomes essential to keep track of the queue.

Define the Queue Structure

    #include <stdio.h>
    #include <stdlib.h>

    #define MAX 100  // Maximum size of the queue

    // Define the queue structure
    struct Queue {
        int front, rear;
        int arr[MAX];
    };

- front: index of the front element.
- rear: index of the rear element.
- arr: array to hold the elements.
Initialize the Queue

    void initialize(struct Queue* q) {
        q->front = -1;
        q->rear = -1;  // Queue is empty initially
    }

Enqueue Operation

    void enqueue(struct Queue* q, int value) {
        if (q->rear == MAX - 1) {  // Check for overflow
            printf("Queue Overflow\n");
            return;
        }
        if (q->front == -1)  // If queue is initially empty
            q->front = 0;
        q->arr[++(q->rear)] = value;  // Increment rear and add value
    }

Dequeue Operation

    int dequeue(struct Queue* q) {
        if (q->front == -1 || q->front > q->rear) {  // Check for underflow
            printf("Queue Underflow\n");
            return -1;  // Return -1 to indicate an error
        }
        return q->arr[q->front++];  // Return front value and increment front
    }

Front Operation

    int front(struct Queue* q) {
        if (q->front == -1 || q->front > q->rear) {  // Check if queue is empty
            printf("Queue is Empty\n");
            return -1;
        }
        return q->arr[q->front];  // Return front value
    }

Example Usage

    int main() {
        struct Queue queue;
        initialize(&queue);

        // Enqueue elements
        enqueue(&queue, 10);
        enqueue(&queue, 20);
        enqueue(&queue, 30);

        // Get the front element
        printf("Front element is %d\n", front(&queue));

        // Dequeue elements
        printf("Dequeued element is %d\n", dequeue(&queue));
        printf("Dequeued element is %d\n", dequeue(&queue));
        return 0;
    }

Output:

    Front element is 10
    Dequeued element is 10
    Dequeued element is 20

Queue Implementation Using a Linked List

In a linked-list-based queue, we dynamically allocate memory for each element, allowing the queue to grow as needed without a fixed size limit.

Define the Node Structure

    #include <stdio.h>
    #include <stdlib.h>

    // Define the node structure
    struct Node {
        int data;
        struct Node* next;
    };

    // Define the queue structure
    struct Queue {
        struct Node* front;
        struct Node* rear;
    };

Initialize the Queue

    void initialize(struct Queue* q) {
        q->front = q->rear = NULL;  // Queue is empty initially
    }

Enqueue Operation

    void enqueue(struct Queue* q, int value) {
        struct Node* newNode = (struct Node*)malloc(sizeof(struct Node));
        newNode->data = value;
        newNode->next = NULL;
        if (q->rear == NULL) {  // If queue is empty
            q->front = q->rear = newNode;
            return;
        }
        q->rear->next = newNode;
        q->rear = newNode;
    }

Dequeue Operation

    int dequeue(struct Queue* q) {
        if (q->front == NULL) {  // Check if queue is empty
            printf("Queue Underflow\n");
            return -1;  // Return -1 to indicate an error
        }
        struct Node* temp = q->front;
        int value = temp->data;
        q->front = q->front->next;
        if (q->front == NULL) {  // If queue becomes empty
            q->rear = NULL;
        }
        free(temp);
        return value;
    }

Front Operation

    int front(struct Queue* q) {
        if (q->front == NULL) {  // Check if queue is empty
            printf("Queue is Empty\n");
            return -1;
        }
        return q->front->data;  // Return front value
    }

Example Usage

    int main() {
        struct Queue queue;
        initialize(&queue);

        // Enqueue elements
        enqueue(&queue, 10);
        enqueue(&queue, 20);
        enqueue(&queue, 30);

        // Get the front element
        printf("Front element is %d\n", front(&queue));

        // Dequeue elements
        printf("Dequeued element is %d\n", dequeue(&queue));
        printf("Dequeued element is %d\n", dequeue(&queue));
        return 0;
    }

Output:

    Front element is 10
    Dequeued element is 10
    Dequeued element is 20

Types of Queues

There are several variations of the basic queue that serve different purposes in computer science:

1. Simple Queue: the basic queue structure we have seen so far. Elements are inserted at the rear and removed from the front.

2. Circular Queue: in a circular queue, the rear of the queue wraps around to the front when the end of the array is reached.
This allows for efficient use of space by reusing positions that are freed when elements are dequeued.

    if (rear == MAX - 1) {
        rear = 0;
    }

Circular queues are often used in resource management systems like printer spoolers or traffic management software.

3. Priority Queue: in a priority queue, each element is assigned a priority, and the element with the highest priority is dequeued first.
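Returning briefly to the circular-queue snippet above: the same wrap-around is often written with the modulo operator instead of an explicit reset. The tiny helper below is an illustration added here, not from the original text; it assumes the MAX constant defined earlier in this chapter.

    // Advance an index with wrap-around, equivalent to the if-based reset above.
    int nextIndex(int i) {
        return (i + 1) % MAX;  // MAX is the queue capacity defined earlier
    }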


Chapter 7 – Stacks

Welcome to Chapter 7! In this chapter, we will delve deeply into the concept of Stacks, a fundamental data structure with a Last-In-First-Out (LIFO) ordering principle. Understanding stacks will equip you with a solid foundation for tackling a range of computational problems and algorithms.

What is a Stack?

A Stack is a collection of elements that follows the Last-In-First-Out (LIFO) principle. This means that the most recently added element is the first one to be removed. Think of it like a stack of plates: you add new plates on top and remove plates from the top.

Here's a basic diagram of a stack:

    Top -> [3]
           [2]
           [1]
    Bottom -> NULL

In this diagram, 3 is the top element, and it will be removed first during a pop operation.

Why Use Stacks?

Stacks are versatile and used in many computing scenarios:

- Function Call Management: the call stack manages function calls and local variables during program execution.
- Expression Evaluation: stacks are essential in evaluating arithmetic expressions, especially in postfix notation.
- Backtracking Algorithms: used in algorithms like depth-first search, where you need to backtrack through previous states.

In a similar way, Emancipation Edutech Private Limited might use stacks to manage learning modules or track student progress, highlighting the versatility of this data structure.

Basic Operations on Stacks

Stacks support a few primary operations:

- Push: add an element to the top of the stack.
- Pop: remove the element from the top of the stack.
- Peek: view the top element without removing it.
- IsEmpty: check if the stack is empty.

Let's look at how to implement these operations using both array-based and linked-list-based stacks.

Stack Implementation Using an Array

An array-based stack is straightforward and involves a fixed-size array to store elements. Here's a detailed implementation:

Define the Stack Structure

    #include <stdio.h>
    #include <stdlib.h>

    #define MAX 100  // Define the maximum size of the stack

    // Define the stack structure
    struct Stack {
        int top;
        int arr[MAX];
    };

- top: index of the top element in the stack.
- arr: array to hold stack elements.

Initialize the Stack

    void initialize(struct Stack* s) {
        s->top = -1;  // Stack is empty initially
    }

Push Operation

    void push(struct Stack* s, int value) {
        if (s->top == MAX - 1) {  // Check for stack overflow
            printf("Stack Overflow\n");
            return;
        }
        s->arr[++(s->top)] = value;  // Increment top and add value
    }

Pop Operation

    int pop(struct Stack* s) {
        if (s->top == -1) {  // Check for stack underflow
            printf("Stack Underflow\n");
            return -1;  // Return -1 to indicate an error
        }
        return s->arr[(s->top)--];  // Return top value and decrement top
    }

Peek Operation

    int peek(struct Stack* s) {
        if (s->top == -1) {  // Check if stack is empty
            printf("Stack is Empty\n");
            return -1;  // Return -1 to indicate an error
        }
        return s->arr[s->top];  // Return top value
    }

Example Usage

    int main() {
        struct Stack stack;
        initialize(&stack);

        // Push elements
        push(&stack, 10);
        push(&stack, 20);
        push(&stack, 30);

        // Peek the top element
        printf("Top element is %d\n", peek(&stack));

        // Pop elements
        printf("Popped element is %d\n", pop(&stack));
        printf("Popped element is %d\n", pop(&stack));

        // Check if the stack is empty
        if (stack.top == -1) {
            printf("Stack is empty\n");
        } else {
            printf("Stack is not empty\n");
        }
        return 0;
    }

Output:

    Top element is 30
    Popped element is 30
    Popped element is 20
    Stack is not empty

Stack Implementation Using a Linked List

A stack can also be implemented using a linked list, which allows for dynamic size adjustments.
Define the Node Structure

    #include <stdio.h>
    #include <stdlib.h>

    // Define the node structure
    struct Node {
        int data;
        struct Node* next;
    };

    // Define the stack structure
    struct Stack {
        struct Node* top;
    };

Initialize the Stack

    void initialize(struct Stack* s) {
        s->top = NULL;  // Stack is empty initially
    }

Push Operation

    void push(struct Stack* s, int value) {
        struct Node* newNode = (struct Node*)malloc(sizeof(struct Node));
        newNode->data = value;
        newNode->next = s->top;
        s->top = newNode;
    }

Pop Operation

    int pop(struct Stack* s) {
        if (s->top == NULL) {  // Check if stack is empty
            printf("Stack Underflow\n");
            return -1;  // Return -1 to indicate an error
        }
        struct Node* temp = s->top;
        int value = temp->data;
        s->top = s->top->next;
        free(temp);
        return value;
    }

Peek Operation

    int peek(struct Stack* s) {
        if (s->top == NULL) {  // Check if stack is empty
            printf("Stack is Empty\n");
            return -1;  // Return -1 to indicate an error
        }
        return s->top->data;  // Return top value
    }

Example Usage

    int main() {
        struct Stack stack;
        initialize(&stack);

        // Push elements
        push(&stack, 10);
        push(&stack, 20);
        push(&stack, 30);

        // Peek the top element
        printf("Top element is %d\n", peek(&stack));

        // Pop elements
        printf("Popped element is %d\n", pop(&stack));
        printf("Popped element is %d\n", pop(&stack));

        // Check if the stack is empty
        if (stack.top == NULL) {
            printf("Stack is empty\n");
        } else {
            printf("Stack is not empty\n");
        }
        return 0;
    }

Output:

    Top element is 30
    Popped element is 30
    Popped element is 20
    Stack is not empty

Applications of Stacks

Stacks are fundamental in various computing contexts:

- Expression Evaluation: evaluate postfix expressions and manage operators and operands (a small sketch follows below).
- Function Call Management: the call stack tracks function calls and local variables.
- Undo Mechanism: manage undo operations in text editors and similar applications.

For instance, Emancipation Edutech Private Limited might use stacks to manage undo operations in interactive learning tools or to handle complex navigation tasks efficiently.

Advantages and Disadvantages of Stacks

Advantages:

- Simple Operations: push and pop operations are straightforward and easy to implement.
- Dynamic Size: linked-list-based stacks can grow and shrink dynamically, avoiding fixed size limitations.

Disadvantages:

- Fixed Size: array-based stacks have a fixed size, which can lead to overflow if the limit is reached.
- Limited Access: only the top element is accessible directly, which might not be suitable for all use cases.

Summary of Chapter 7

In this chapter, we have explored Stacks in depth, including their basic operations, implementations using arrays and linked lists, and their various applications. Understanding stacks will help you handle problems requiring LIFO access efficiently.

Key takeaways:

- Stacks operate on a Last-In-First-Out principle.
- Implementations can be array-based or linked-list-based.
- Stacks are used in expression evaluation, function call management, and backtracking.
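To make the expression-evaluation application concrete, here is a minimal, self-contained sketch that evaluates a postfix expression of single-digit operands using a small integer stack. It is not from the original chapter; the function name evaluatePostfix and the sample expression are assumptions for illustration.

    #include <stdio.h>
    #include <ctype.h>

    // Assumed sketch: evaluate a postfix expression such as "23*4+"
    // (meaning 2 * 3 + 4) using a simple array-backed stack.
    int evaluatePostfix(const char* expr) {
        int stack[100];
        int top = -1;

        for (int i = 0; expr[i] != '\0'; i++) {
            char c = expr[i];
            if (isdigit((unsigned char)c)) {
                stack[++top] = c - '0';        // Push operand
            } else {
                int b = stack[top--];          // Right operand
                int a = stack[top--];          // Left operand
                switch (c) {
                    case '+': stack[++top] = a + b; break;
                    case '-': stack[++top] = a - b; break;
                    case '*': stack[++top] = a * b; break;
                    case '/': stack[++top] = a / b; break;
                }
            }
        }
        return stack[top];  // Final result is left on the stack
    }

    int main(void) {
        printf("23*4+ evaluates to %d\n", evaluatePostfix("23*4+"));  // Expected: 10
        return 0;
    }

The LIFO behavior is exactly what postfix evaluation needs: the two most recently pushed operands are always the ones the next operator should combine.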


Chapter 6 – Circular Linked Lists

Welcome to Chapter 6! In this chapter, we will explore Circular Linked Lists, a variation of linked lists where the last node in the list points back to the first node, forming a circular structure. This type of linked list is particularly useful in scenarios where you need a continuous loop through your data.

What is a Circular Linked List?

A Circular Linked List is a type of linked list in which the last node points back to the first node instead of having a NULL pointer. This circular structure allows you to traverse the list from any node and return to the starting node without having to check for the end.

Here's a graphical representation of a circular linked list:

    +-> [Data | Next] -> [Data | Next] -> [Data | Next] -+
    |                                                    |
    +----------------------------------------------------+

In this diagram, you can see that the next pointer of the last node points back to the first node, creating a loop.

Why Use Circular Linked Lists?

Circular linked lists are useful for scenarios that require a continuous loop or when you need to repeatedly cycle through the elements. Some applications include:

- Round-Robin Scheduling: in operating systems, circular linked lists can be used to manage tasks in a round-robin fashion.
- Circular Buffers: used in scenarios where you have a fixed-size buffer and need to overwrite old data when the buffer is full.
- Navigation Systems: for continuous navigation without end, like in a continuous data feed.

Just like how Emancipation Edutech Private Limited might use circular linked lists to cycle through learning modules or resources efficiently, these structures can simplify repetitive tasks.

Building a Circular Linked List

To implement a circular linked list, we start by defining the node structure:

    #include <stdio.h>
    #include <stdlib.h>

    // Define a node structure
    struct Node {
        int data;
        struct Node* next;
    };

Creating a Node

Creating a node is similar to singly linked lists, but with an additional step to handle the circular nature:

    struct Node* createNode(int value) {
        struct Node* newNode = (struct Node*)malloc(sizeof(struct Node));
        newNode->data = value;
        newNode->next = newNode;  // Point to itself initially
        return newNode;
    }

Inserting Elements in a Circular Linked List

Here's how you can insert nodes into a circular linked list.

At the Beginning:

    void insertAtBeginning(struct Node** head, int value) {
        struct Node* newNode = createNode(value);
        if (*head == NULL) {
            *head = newNode;
            newNode->next = *head;
            return;
        }
        struct Node* temp = *head;
        while (temp->next != *head) {
            temp = temp->next;
        }
        newNode->next = *head;
        temp->next = newNode;
        *head = newNode;
    }

This function inserts a new node at the start of the circular list and adjusts the next pointers to maintain the circular structure.

At the End:

    void insertAtEnd(struct Node** head, int value) {
        struct Node* newNode = createNode(value);
        if (*head == NULL) {
            *head = newNode;
            newNode->next = *head;
            return;
        }
        struct Node* temp = *head;
        while (temp->next != *head) {
            temp = temp->next;
        }
        temp->next = newNode;
        newNode->next = *head;
    }

This function adds a new node to the end of the circular list by traversing to the last node and updating pointers.

Inserting After a Given Node:

    void insertAfter(struct Node* prevNode, int value) {
        if (prevNode == NULL) {
            printf("The given previous node cannot be NULL");
            return;
        }
        struct Node* newNode = createNode(value);
        newNode->next = prevNode->next;
        prevNode->next = newNode;
    }

This function inserts a new node after a given node, adjusting the next pointers accordingly.
Traversing a Circular Linked List

Traversal in a circular linked list involves starting from the head and continuing until you return to the head:

    void traverse(struct Node* head) {
        if (head == NULL) return;
        struct Node* temp = head;
        do {
            printf("%d -> ", temp->data);
            temp = temp->next;
        } while (temp != head);
        printf("(back to head)\n");
    }

This function loops through the list and prints each node's data until it returns to the head.

Example: Creating and Traversing a Circular Linked List

Here's a sample program to create a circular linked list and traverse it:

    int main() {
        struct Node* head = NULL;

        // Inserting nodes
        insertAtEnd(&head, 10);
        insertAtEnd(&head, 20);
        insertAtBeginning(&head, 5);
        insertAtEnd(&head, 30);
        insertAfter(head->next, 15);  // Insert 15 after the node with data 10

        // Traverse the list
        traverse(head);
        return 0;
    }

Output:

    5 -> 10 -> 15 -> 20 -> 30 -> (back to head)

Deleting Nodes in a Circular Linked List

Deleting nodes in a circular linked list involves similar steps as in singly linked lists but requires updating the next pointers to maintain the circular structure.

Deleting the First Node:

    void deleteFirstNode(struct Node** head) {
        if (*head == NULL) return;
        struct Node* temp = *head;
        if ((*head)->next == *head) {  // Only one node in the list
            free(*head);
            *head = NULL;
            return;
        }
        struct Node* last = *head;
        while (last->next != *head) {
            last = last->next;
        }
        last->next = (*head)->next;
        *head = (*head)->next;
        free(temp);
    }

Deleting the Last Node:

    void deleteLastNode(struct Node** head) {
        if (*head == NULL) return;
        struct Node* temp = *head;
        if ((*head)->next == *head) {  // Only one node in the list
            free(*head);
            *head = NULL;
            return;
        }
        struct Node* prev = NULL;
        while (temp->next != *head) {
            prev = temp;
            temp = temp->next;
        }
        prev->next = *head;
        free(temp);
    }

Deleting a Node by Value:

    void deleteNodeByValue(struct Node** head, int value) {
        if (*head == NULL) return;
        struct Node* temp = *head;
        struct Node* prev = NULL;
        do {
            if (temp->data == value) {
                if (prev == NULL) {  // Node to be deleted is the head
                    deleteFirstNode(head);
                    return;
                } else {
                    prev->next = temp->next;
                    free(temp);
                    return;
                }
            }
            prev = temp;
            temp = temp->next;
        } while (temp != *head);
    }

Advantages and Disadvantages of Circular Linked Lists

Advantages:

- Circular Traversal: easily traverse through the list without needing to check for the end.
- Efficient for Round-Robin Scheduling: ideal for scenarios where cyclic behavior is required.

Disadvantages:

- Complexity: slightly more complex to manage compared to singly linked lists.
- Potential for Infinite Loops: if not handled carefully, circular traversal can lead to infinite loops.

Wrapping Up Chapter 6

In this chapter, we covered circular linked lists: creating nodes, inserting at the beginning, at the end, or after a given node, traversing the loop, and deleting nodes while keeping the structure circular.


Chapter 4 – Linked Lists

Welcome to Chapter 4! Now that you've become familiar with pointers and dynamic memory allocation, it's time to explore how pointers unlock one of the most fundamental and powerful data structures: Linked Lists. If you've ever worked with arrays and found their fixed size to be limiting, you're going to love linked lists. They provide flexibility, allowing you to dynamically manage data, making them perfect for scenarios where you don't know the size of the data in advance.

What is a Linked List?

A Linked List is a linear data structure where each element (commonly called a node) points to the next element in the sequence. Unlike arrays, linked lists do not store elements in contiguous memory locations. Instead, each node contains its data and a pointer to the next node.

Here's a simple graphical representation of a linked list:

    [Data | Next] -> [Data | Next] -> [Data | Next] -> NULL

Each node points to the next, and the last node points to NULL, which signifies the end of the list.

Why Use Linked Lists?

Linked lists are dynamic, meaning you can add or remove elements without worrying about a fixed size or shifting elements (as with arrays). Some reasons to use linked lists include dynamic sizing, efficient insertion and deletion, and memory that is allocated only as needed. Linked lists, like those used in the backend of platforms like Emancipation Edutech Private Limited, efficiently handle data structures that grow dynamically, such as managing student records or tracking progress across multiple courses.

Types of Linked Lists

There are several variations of linked lists, each suited to different tasks: singly linked lists, doubly linked lists (covered in Chapter 5), and circular linked lists (covered in Chapter 6). For now, we'll focus on Singly Linked Lists, as they form the foundation for understanding more complex linked lists.

Building a Singly Linked List

To create a linked list, we first need to define the node structure. In C/C++, this is usually done using struct. Each node will have two fields: the data and a pointer to the next node.

Creating a Node

We'll use dynamic memory allocation (malloc) to create nodes dynamically at runtime. The creation function allocates memory for a new node, initializes its data field, and sets the next pointer to NULL.

Inserting Elements in a Linked List

Now that we have a node, let's learn how to insert it into a linked list. There are three common ways to insert nodes: at the beginning, at the end, and after a given node.

Traversing a Linked List

After inserting elements, you'll want to traverse the list to access or display the data. The traversal function loops through the list, printing each node's data until it reaches the end (where next is NULL).

Example: Creating and Traversing a Linked List

As an example, we can create a list with four nodes and then print it (see the consolidated sketch at the end of this chapter).

Deleting Nodes in a Linked List

Removing nodes from a linked list is just as important as adding them. There are three common ways to delete nodes: deleting the first node, deleting the last node, and deleting a node by value.

Advantages of Linked Lists

Linked lists offer several advantages over arrays, such as dynamic sizing and efficient insertion and deletion, making them suitable for situations where dynamic memory management is essential.

Disadvantages of Linked Lists

Linked lists also come with a few drawbacks, such as the extra memory needed for pointers and the lack of direct (random) access to elements.

Wrapping Up Chapter 4

In this chapter, you've learned all about Singly Linked Lists, from creating and inserting nodes to deleting and traversing through them. Linked lists are the backbone of many advanced data structures, and mastering them opens up a world of possibilities in dynamic data management. Keep practicing by implementing different types of linked lists, and if you ever need more resources, feel free to check out digilearn.cloud for interactive coding exercises.

Next up: Stacks and Queues
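Here is the consolidated sketch referenced above. It is a minimal, hedged example of the singly linked list this chapter describes, not the chapter's original listings; function names such as createNode, insertAtEnd, and printList are assumptions chosen to match the description.

    #include <stdio.h>
    #include <stdlib.h>

    // A singly linked list node: data plus a pointer to the next node.
    struct Node {
        int data;
        struct Node* next;
    };

    // Allocate a node, set its data, and point next to NULL.
    struct Node* createNode(int value) {
        struct Node* newNode = (struct Node*)malloc(sizeof(struct Node));
        newNode->data = value;
        newNode->next = NULL;
        return newNode;
    }

    // Append a node at the end of the list.
    void insertAtEnd(struct Node** head, int value) {
        struct Node* newNode = createNode(value);
        if (*head == NULL) {
            *head = newNode;
            return;
        }
        struct Node* temp = *head;
        while (temp->next != NULL) {
            temp = temp->next;
        }
        temp->next = newNode;
    }

    // Traverse the list, printing each node until next is NULL.
    void printList(struct Node* head) {
        for (struct Node* temp = head; temp != NULL; temp = temp->next) {
            printf("%d -> ", temp->data);
        }
        printf("NULL\n");
    }

    int main(void) {
        struct Node* head = NULL;
        insertAtEnd(&head, 10);
        insertAtEnd(&head, 20);
        insertAtEnd(&head, 30);
        insertAtEnd(&head, 40);
        printList(head);  // Prints: 10 -> 20 -> 30 -> 40 -> NULL
        return 0;
    }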


Chapter 5 – Doubly Linked Lists

Welcome to Chapter 5! In this chapter, we'll explore the Doubly Linked List, a more versatile type of linked list compared to the singly linked list. If you think of singly linked lists as one-way streets, doubly linked lists are like multi-lane roads with traffic flowing in both directions. This added flexibility can be particularly useful in various applications where bidirectional traversal is required.

What is a Doubly Linked List?

A Doubly Linked List is a type of linked list where each node has two pointers: a next pointer to the following node and a prev pointer to the preceding node. Here's a graphical representation:

    NULL <- [Prev | Data | Next] <-> [Prev | Data | Next] <-> [Prev | Data | Next] -> NULL

Each node contains a data field, a next pointer, and a prev pointer. The prev pointer of the first node is NULL, and the next pointer of the last node is NULL.

Why Use Doubly Linked Lists?

Doubly linked lists provide several advantages over singly linked lists: most notably, you can traverse the list in both directions, and you can delete a node without first searching for its predecessor. Doubly linked lists are used in applications such as navigation systems, where moving in both directions is necessary, similar to handling user interactions in educational platforms like Emancipation Edutech Private Limited.

Building a Doubly Linked List

To implement a doubly linked list, we start by defining the node structure. Each node now includes an additional pointer to the previous node.

Creating a Node

A new node is initialized with the given value, and both the next and prev pointers are set to NULL.

Inserting Elements in a Doubly Linked List

When inserting a node, both the next and prev pointers of the affected nodes must be updated so that the chain stays consistent in both directions.

Traversing a Doubly Linked List

Traversal in a doubly linked list can be done in both directions: forward using the next pointers and backward using the prev pointers.

Example: Creating and Traversing a Doubly Linked List

A sample program creates a doubly linked list and traverses it in both directions (see the consolidated sketch at the end of this chapter).

Deleting Nodes in a Doubly Linked List

Deleting nodes in a doubly linked list is straightforward because you have pointers to both the next and previous nodes.

Advantages and Disadvantages of Doubly Linked Lists

The main advantages are bidirectional traversal and simpler deletion; the main costs are the extra memory for the prev pointer and the additional pointer updates on every insertion and deletion.

Wrapping Up Chapter 5

In this chapter, we delved into Doubly Linked Lists, an enhanced version of singly linked lists that allows bidirectional traversal and simplifies node deletion. Mastering doubly linked lists will give you greater flexibility in handling dynamic data. As always, if you want to see more examples or practice with interactive exercises, visit digilearn.cloud.
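Here is the consolidated sketch referenced above: a minimal, hedged example of the structure and traversals this chapter describes, not the chapter's original listings. Names like createNode, insertAtEnd, printForward, and printBackward are assumptions chosen for illustration.

    #include <stdio.h>
    #include <stdlib.h>

    // A doubly linked list node: data plus next and prev pointers.
    struct Node {
        int data;
        struct Node* next;
        struct Node* prev;
    };

    // Allocate a node with both links set to NULL.
    struct Node* createNode(int value) {
        struct Node* newNode = (struct Node*)malloc(sizeof(struct Node));
        newNode->data = value;
        newNode->next = NULL;
        newNode->prev = NULL;
        return newNode;
    }

    // Append a node at the end, wiring both next and prev pointers.
    void insertAtEnd(struct Node** head, int value) {
        struct Node* newNode = createNode(value);
        if (*head == NULL) {
            *head = newNode;
            return;
        }
        struct Node* temp = *head;
        while (temp->next != NULL) {
            temp = temp->next;
        }
        temp->next = newNode;
        newNode->prev = temp;
    }

    // Forward traversal using next pointers.
    void printForward(struct Node* head) {
        for (struct Node* t = head; t != NULL; t = t->next) {
            printf("%d -> ", t->data);
        }
        printf("NULL\n");
    }

    // Backward traversal: walk to the tail, then follow prev pointers.
    void printBackward(struct Node* head) {
        if (head == NULL) { printf("NULL\n"); return; }
        struct Node* t = head;
        while (t->next != NULL) t = t->next;
        for (; t != NULL; t = t->prev) {
            printf("%d -> ", t->data);
        }
        printf("NULL\n");
    }

    int main(void) {
        struct Node* head = NULL;
        insertAtEnd(&head, 10);
        insertAtEnd(&head, 20);
        insertAtEnd(&head, 30);
        printForward(head);   // 10 -> 20 -> 30 -> NULL
        printBackward(head);  // 30 -> 20 -> 10 -> NULL
        return 0;
    }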


Introduction to DSA (Data Structures and Algorithms)

What is DSA?

When we talk about DSA, we're referring to Data Structures and Algorithms. Let's break that down: data structures are ways of organizing and storing data, while algorithms are step-by-step procedures for processing that data. In simple terms, think of DSA as the combination of tools (data structures) and methods (algorithms) that help you solve complex problems in an optimized way. It's like having a toolkit where each tool (data structure) is suited for a specific job, and the method (algorithm) is how you use that tool. DSA is at the heart of programming and problem-solving, which makes it essential for anyone diving into computer science, coding, or software engineering.

Why Learn DSA?

Learning DSA equips you with the knowledge to choose the right data structure for a problem and to reason about how efficiently your code will run as the input grows.

Algorithmic Notation

Before jumping into algorithms, let's talk about notation. When discussing algorithms, we use notations to describe how fast or slow they are. This helps us understand whether an algorithm is efficient enough for a particular problem.

Notations to Measure Complexity

1. Big O Notation (O)

The most commonly used notation describes how the runtime of an algorithm increases as the input size increases. Big O focuses on the worst-case scenario. For example, a linear search through n elements is O(n), while a binary search on a sorted list is O(log n). Why it matters: knowing the worst-case performance helps you plan for the worst possible situation your code might face.

2. Small o Notation (o)

This notation is used to describe algorithms that are better than what Big O suggests but don't quite reach the next best level. It's a more precise way of saying how close the algorithm's runtime is to the ideal scenario. For example, if you have a sorting algorithm that's slightly faster than O(n log n), we might say it's o(n log n).

Capital and Small Notations: What's the Difference?

When we talk about notations like O, Ω, θ, and o, the size of the letter tells us something important: the capital letters (O, Ω, θ) describe bounds the running time may actually reach (upper, lower, and tight bounds, respectively), while the small letter o describes a strict upper bound that the running time grows strictly slower than.

Example: Linear Search vs. Binary Search

Let's take an example of searching for a number in a list: a linear search checks each element one by one, taking O(n) time in the worst case, while a binary search on a sorted list repeatedly halves the search range, taking only O(log n) time (see the sketch below).

Wrapping It Up

Understanding algorithmic notation helps you gauge how well your code will perform as your input grows larger. It's a critical skill, especially when working on big projects where efficiency can make or break the application. In the next section, we'll dive into more practical algorithms and how different data structures help us solve various problems. So stay tuned, and we'll explore sorting, searching, and more exciting concepts in the world of DSA!
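To make the comparison concrete, here is a small sketch of both searches in C. It is not from the original text; the array contents and the target value are arbitrary examples.

    #include <stdio.h>

    // Linear search: check every element; worst case O(n).
    int linearSearch(const int arr[], int n, int target) {
        for (int i = 0; i < n; i++) {
            if (arr[i] == target) return i;
        }
        return -1;  // Not found
    }

    // Binary search: requires a sorted array; halves the range each step, O(log n).
    int binarySearch(const int arr[], int n, int target) {
        int low = 0, high = n - 1;
        while (low <= high) {
            int mid = low + (high - low) / 2;
            if (arr[mid] == target)      return mid;
            else if (arr[mid] < target)  low = mid + 1;
            else                         high = mid - 1;
        }
        return -1;  // Not found
    }

    int main(void) {
        int sorted[] = {3, 8, 15, 23, 42, 57, 91};
        int n = sizeof(sorted) / sizeof(sorted[0]);

        printf("linearSearch: index %d\n", linearSearch(sorted, n, 42));  // 4
        printf("binarySearch: index %d\n", binarySearch(sorted, n, 42));  // 4
        return 0;
    }

Both functions find the same element, but on a list of a million sorted items the binary search would need only about 20 comparisons in the worst case, while the linear search could need a million.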

