Table of Contents
Chapter 1: Why Collections Matter in C# (Beyond Arrays)
Chapter 2: Lists: Your Go-To Dynamic Collection
Chapter 3: Dictionaries: Key-Value Powerhouses
Chapter 4: Queues: First In, First Out Processing
Chapter 5: Stacks: Last In, First Out Operations
Chapter 6: HashSet & SortedSet: Unique Collections
Chapter 7: LinkedList: When Order and Insertion Matter
Chapter 8: ObservableCollection & Modern .NET Collections
Chapter 9: LINQ with Collections: Querying Made Easy
Chapter 10: Best Practices, Performance, and Real-World Use Cases
Chapter 1: Why Collections Matter in C# (Beyond Arrays)
Picture this: you’re building a social media feed that needs to handle thousands of posts, a gaming leaderboard that constantly updates, or a fintech application processing real-time transactions. Arrays might seem like the obvious choice, but they’ll quickly become your bottleneck.
As developers, we often start with arrays because they’re simple and familiar. However, real-world applications demand more flexibility, better performance, and cleaner code. Collections in C# provide the tools to build scalable, maintainable applications that can grow with your user base.
The Array Limitation Problem
Arrays in C# are fixed-size data structures. Once you declare an array with a specific length, that’s it—no growing, no shrinking. Here’s what this looks like in practice:
// Fixed size - this becomes problematic quickly
int[] userScores = new int[100]; // What if we get 101 users?
// Adding elements requires manual array manipulation
int[] expandedScores = new int[userScores.Length + 1];
Array.Copy(userScores, expandedScores, userScores.Length);
expandedScores[userScores.Length] = newScore; // Tedious and error-prone
This manual memory management becomes a nightmare when building dynamic applications like social media platforms or gaming systems where data constantly changes.
Collections: Built for Real-World Development
Collections solve the fundamental limitations of arrays by providing dynamic sizing, optimized operations, and specialized behaviors. They’re designed for the messy, unpredictable nature of real-world software development.
// Dynamic and clean - this is what modern C# looks like
List<int> userScores = new List<int>();
userScores.Add(newScore); // Simple, safe, and efficient
Memory Management and Performance
Unlike arrays, collections handle memory management intelligently. When a List<T> needs more space, it automatically allocates a larger internal array and copies existing elements. This happens behind the scenes, following algorithms optimized by Microsoft’s engineers over decades.
The key insight: collections trade small memory overhead for massive developer productivity gains. In a typical business application, the slight memory cost is negligible compared to the time saved and bugs avoided.
Real-World Scenarios Where Collections Shine
Gaming Development
// Managing dynamic game objects
List<Enemy> activeEnemies = new List<Enemy>();
Dictionary<string, PlayerStats> leaderboard = new Dictionary<string, PlayerStats>();
Queue<GameEvent> eventQueue = new Queue<GameEvent>(); // Process events in order
Social Media Applications
// Dynamic content feeds
List<Post> userFeed = new List<Post>();
HashSet<string> uniqueHashtags = new HashSet<string>(); // No duplicate tags
Dictionary<int, User> userCache = new Dictionary<int, User>(); // Fast user lookup
Financial Systems
// Transaction processing
Queue<Transaction> pendingTransactions = new Queue<Transaction>(); // FIFO processing
Stack<Transaction> undoStack = new Stack<Transaction>(); // Undo last operations
Dictionary<string, decimal> accountBalances = new Dictionary<string, decimal>(); // O(1) balance lookup
Good Practices vs Bad Practices
✅ Good Practice: Choose the Right Collection
// Use Dictionary for fast lookups by key
Dictionary<string, User> userCache = new Dictionary<string, User>();
var user = userCache["john123"]; // O(1) average case
❌ Bad Practice: Using Lists for Everything
// Inefficient - O(n) search every time
List<User> users = new List<User>();
var user = users.FirstOrDefault(u => u.Username == "john123");
✅ Good Practice: Initialize with Capacity When Known
// Prevents multiple internal array allocations
List<string> knownSizeList = new List<string>(1000);
❌ Bad Practice: Ignoring Performance Characteristics
// Inefficient for frequent insertions at the beginning
List<int> numbers = new List<int>();
numbers.Insert(0, newNumber); // O(n) operation - all elements shift
Theory Foundation: Big O Notation in Collections
Understanding performance characteristics helps you make informed decisions:
- List: O(1) append, O(n) insert at beginning
- Dictionary: O(1) average lookup, O(n) worst case
- HashSet: O(1) average add/contains, O(n) worst case
- Queue and Stack: O(1) for their primary operations

This isn’t academic theory—it directly impacts user experience. A social media feed with O(n) user lookups will crawl as your user base grows, while O(1) dictionary lookups scale effortlessly to millions of users.
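One way to make these complexities tangible is a rough micro-benchmark sketch. The names and sizes below are illustrative assumptions, not from the text, and Stopwatch timings at this scale are noisy, so treat it as a demonstration of the shape of the curve rather than a rigorous measurement:

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;

class LookupBenchmark
{
    static void Main()
    {
        const int count = 100_000;
        var userList = Enumerable.Range(0, count).Select(i => $"user{i}").ToList();
        var userDict = userList.ToDictionary(name => name, name => name);

        var sw = Stopwatch.StartNew();
        bool foundInList = userList.Contains("user99999");    // O(n): scans the whole list
        sw.Stop();
        Console.WriteLine($"List scan:        {sw.Elapsed.TotalMilliseconds:F3} ms");

        sw.Restart();
        bool foundInDict = userDict.ContainsKey("user99999"); // O(1): single hash probe
        sw.Stop();
        Console.WriteLine($"Dictionary probe: {sw.Elapsed.TotalMilliseconds:F3} ms");
        // The gap widens as count grows: the list scan scales linearly,
        // the dictionary probe stays essentially flat.
    }
}
```

Doubling `count` roughly doubles the list scan time while leaving the dictionary probe unchanged, which is exactly the scaling difference the list above describes.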
Chapter Summary
Collections are the foundation of professional C# development because they solve real-world problems that arrays cannot. They provide dynamic sizing, optimized operations, and specialized behaviors that make complex applications maintainable and performant. Understanding when and why to use each collection type separates junior developers from senior engineers who build systems that scale.
The key insight: arrays are tools for simple, fixed-size data; collections are tools for building applications that handle the complexity and growth of real business requirements.
Practice Exercise
Scenario: You’re building a customer service ticket system. Design the data structures you’d use for:
- Storing all tickets (need to add/remove frequently)
- Looking up tickets by ticket ID (must be fast)
- Processing tickets in order of arrival (first come, first served)
- Storing unique customer email addresses (no duplicates)

Write a brief explanation of which collection you’d choose for each requirement and why. Consider both the theoretical performance characteristics and real-world usage patterns.
Chapter 2: Lists: Your Go-To Dynamic Collection
Every successful application starts with managing changing data. Whether you’re tracking user interactions in a mobile app, managing inventory in an e-commerce system, or building playlists in a music streaming service, List<T> is your reliable workhorse.
Lists bridge the gap between the simplicity of arrays and the dynamic requirements of modern software. They provide array-like syntax with the flexibility to grow and shrink as your data changes, making them the most commonly used collection in professional C# development.
Understanding List Fundamentals
A List<T> is essentially a dynamic array backed by an internal array that automatically resizes when needed. The generic <T> means it can store any type while maintaining type safety—no boxing, no casting, no runtime errors from type mismatches.
// Type-safe and efficient
List<string> userNames = new List<string>();
List<int> scores = new List<int>();
List<Customer> customers = new List<Customer>();
// Compile-time safety - this won't compile
// userNames.Add(42); // Error: Cannot convert int to string
Dynamic Resizing: The Magic Behind the Scenes
When you add elements to a list, it doesn’t allocate memory for each individual item. Instead, it uses a capacity doubling strategy:
List<int> numbers = new List<int>(); // Initial capacity: 0
numbers.Add(1); // Capacity becomes 4
numbers.Add(2);
numbers.Add(3);
numbers.Add(4);
numbers.Add(5); // Capacity doubles to 8
This exponential growth strategy ensures that adding elements is amortized O(1)—even though occasional resizing operations are O(n), they happen infrequently enough that average performance remains excellent.
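You can watch the doubling strategy happen through the Capacity property. The exact growth sequence is a runtime implementation detail, so treat the values in the comment as typical for current .NET rather than guaranteed:

```csharp
using System;
using System.Collections.Generic;

class CapacityDemo
{
    static void Main()
    {
        var numbers = new List<int>();
        int lastCapacity = -1;

        for (int i = 0; i < 20; i++)
        {
            numbers.Add(i);
            if (numbers.Capacity != lastCapacity)
            {
                // Capacity only changes when the internal array is reallocated
                Console.WriteLine($"Count={numbers.Count,2}  Capacity={numbers.Capacity}");
                lastCapacity = numbers.Capacity;
            }
        }
        // Typical sequence on current runtimes: 4, 8, 16, 32 -
        // four reallocations for twenty Adds, which is why Add is amortized O(1).
    }
}
```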
Essential List Operations
Adding Elements
List<string> tasks = new List<string>();
// Add single items
tasks.Add("Review code");
tasks.Add("Deploy to staging");
// Add multiple items
tasks.AddRange(new[] { "Run tests", "Update documentation" });
// Insert at specific position
tasks.Insert(0, "Morning standup"); // Insert at beginning
Accessing and Modifying Elements
// Index-based access (just like arrays)
string firstTask = tasks[0];
tasks[1] = "Deploy to production"; // Modify existing item
// Safe access with bounds checking
if (tasks.Count > 2)
{
string thirdTask = tasks[2];
}
Removing Elements
// Remove by value
tasks.Remove("Update documentation"); // Removes first occurrence
// Remove by index
tasks.RemoveAt(0); // Remove first item
// Remove multiple items
tasks.RemoveRange(1, 2); // Remove 2 items starting at index 1
// Clear everything
tasks.Clear();
Searching and Checking
// Check if item exists
bool hasDeployTask = tasks.Contains("Deploy to production");
// Find index of item
int index = tasks.IndexOf("Run tests"); // Returns -1 if not found
// Find items with conditions
string urgentTask = tasks.Find(task => task.Contains("urgent"));
List<string> deployTasks = tasks.FindAll(task => task.Contains("Deploy"));
Performance Characteristics in Practice
Understanding List performance helps you write efficient code:
- Add/AddRange: O(1) amortized, O(n) when resizing
- Index access: O(1) - just like arrays
- Insert at beginning: O(n) - must shift all elements
- Remove by index: O(n) - must shift elements after removal
- Remove by value: O(n) - must search then shift
- Contains/IndexOf: O(n) - linear search
// Efficient: Adding to end
List<LogEntry> logs = new List<LogEntry>();
logs.Add(newLogEntry); // O(1) - fast
// Inefficient: Frequent insertions at beginning
List<string> recentMessages = new List<string>();
recentMessages.Insert(0, newMessage); // O(n) - slow for large lists
Real-World Application Patterns
User Interface Lists
// Managing dynamic UI elements
List<MenuItem> navigationItems = new List<MenuItem>
{
new MenuItem("Dashboard", "/dashboard"),
new MenuItem("Reports", "/reports"),
new MenuItem("Settings", "/settings")
};
// Add items based on user permissions
if (user.HasPermission("Admin"))
{
navigationItems.Add(new MenuItem("Admin Panel", "/admin"));
}
Data Processing Pipelines
// Processing batches of data
List<CustomerOrder> todaysOrders = GetTodaysOrders();
List<CustomerOrder> priorityOrders = new List<CustomerOrder>();
foreach (var order in todaysOrders)
{
if (order.IsPriority)
{
priorityOrders.Add(order);
}
}
// Process priority orders first
ProcessOrders(priorityOrders);
Caching and Temporary Storage
// Temporary collection for API responses
List<WeatherData> weatherCache = new List<WeatherData>();
public async Task RefreshWeatherData()
{
weatherCache.Clear();
var newData = await weatherApiClient.GetForecast();
weatherCache.AddRange(newData);
}
Good Practices vs Bad Practices
✅ Good Practice: Initialize with Known Capacity
// Prevents multiple internal reallocations
List<Customer> customers = new List<Customer>(expectedCustomerCount);
❌ Bad Practice: Ignoring Capacity for Large Collections
// Multiple expensive resize operations
List<DataPoint> largeDataset = new List<DataPoint>(); // Starts at capacity 0
// Adding 10,000 items triggers many resize operations
✅ Good Practice: Use Collection Initializer for Static Data
List<string> allowedFileTypes = new List<string>
{
".pdf", ".docx", ".txt", ".jpg", ".png"
};
❌ Bad Practice: Multiple Individual Add Calls for Known Data
List<string> fileTypes = new List<string>();
fileTypes.Add(".pdf");
fileTypes.Add(".docx");
fileTypes.Add(".txt");
// More verbose and slightly less efficient
✅ Good Practice: Consider Performance for Frequent Operations
// Use Queue<T> instead of List<T> for frequent beginning insertions
Queue<ProcessingTask> taskQueue = new Queue<ProcessingTask>();
taskQueue.Enqueue(newTask); // O(1) instead of List.Insert(0, item) which is O(n)
Working with Custom Objects
Lists truly shine when working with custom objects and business entities:
public class Product
{
public int Id { get; set; }
public string Name { get; set; }
public decimal Price { get; set; }
public DateTime DateAdded { get; set; }
}
List<Product> inventory = new List<Product>();
// Adding products
inventory.Add(new Product
{
Id = 1,
Name = "Wireless Headphones",
Price = 99.99m,
DateAdded = DateTime.Now
});
// Finding products with LINQ (covered in detail in Chapter 9)
var expensiveProducts = inventory.Where(p => p.Price > 100).ToList();
var recentProducts = inventory.Where(p => p.DateAdded > DateTime.Now.AddDays(-7)).ToList();
Thread Safety Considerations
Lists are not thread-safe by default. In multi-threaded applications, you need additional synchronization:
// Thread-safe wrapper (simple but can impact performance)
List<string> threadSafeList = new List<string>();
lock (threadSafeList)
{
threadSafeList.Add("Thread-safe addition");
}
// Better: Use ConcurrentBag<T> for concurrent scenarios
// (covered in Chapter 10: Best Practices)
Chapter Summary
List<T> is the foundation of dynamic data management in C#. Its automatic resizing, type safety, and familiar array-like syntax make it the go-to choice for most scenarios involving changing collections of data. Understanding its performance characteristics—especially the O(1) append and O(n) insertion costs—helps you write efficient code that scales with your application’s growth.
The key insight: Lists excel at append-heavy scenarios and provide the best balance of usability and performance for general-purpose dynamic collections.
Practice Exercise
Scenario: Build a simple task management system for a development team.
Create a Task class with properties: Id, Title, AssignedTo, Priority (1-5), and IsCompleted.
Write methods to:
- Add new tasks to the list
- Mark tasks as completed
- Find all high-priority tasks (priority 4-5)
- Remove completed tasks older than 30 days
- Get tasks assigned to a specific team member
Consider the performance implications of each operation and explain why List<T> is appropriate for this use case.
Chapter 3: Dictionaries: Key-Value Powerhouses
Imagine you’re building a user authentication system, a product catalog, or a caching layer for a high-traffic web application. You need lightning-fast lookups by unique identifiers—user IDs, product SKUs, or cache keys. This is where Dictionary<TKey, TValue> becomes indispensable.
Dictionaries solve the fundamental problem of associative data storage. While lists are excellent for ordered collections, dictionaries excel when you need to quickly find information using a unique key rather than searching through every element.
Understanding Hash Tables and Dictionary Internals
A Dictionary<TKey, TValue> is C#’s implementation of a hash table. The “magic” happens through a process called hashing:
- Hash Function: Converts your key into a numeric hash code
- Bucket Assignment: Uses the hash code to determine where to store the value
- Collision Handling: Manages cases where different keys produce the same hash code
// Behind the scenes when you do this:
Dictionary<string, User> users = new Dictionary<string, User>();
users["john123"] = new User("John", "Doe");
// C# calculates: "john123".GetHashCode() → determines storage location
// Result: O(1) average lookup time
Essential Dictionary Operations
Creating and Adding Items
// Initialize empty dictionary
Dictionary<string, decimal> productPrices = new Dictionary<string, decimal>();
// Add items
productPrices["laptop"] = 999.99m;
productPrices["mouse"] = 29.99m;
productPrices["keyboard"] = 79.99m;
// Alternative: using Add method (throws exception if key exists)
productPrices.Add("monitor", 399.99m);
// Safe addition - check if key exists first
if (!productPrices.ContainsKey("webcam"))
{
productPrices["webcam"] = 149.99m;
}
// Dictionary initializer syntax
var defaultSettings = new Dictionary<string, bool>
{
["EnableLogging"] = true,
["UseHttps"] = true,
["AllowGuestAccess"] = false
};
Accessing and Modifying Values
// Direct access (throws KeyNotFoundException if key doesn't exist)
decimal laptopPrice = productPrices["laptop"];
// Safe access with TryGetValue
if (productPrices.TryGetValue("tablet", out decimal tabletPrice))
{
Console.WriteLine($"Tablet costs: {tabletPrice}");
}
else
{
Console.WriteLine("Tablet not found");
}
// Modify existing values
productPrices["laptop"] = 899.99m; // Update price
Checking and Removing Items
// Check if key exists
bool hasLaptop = productPrices.ContainsKey("laptop");
// Check if value exists (less efficient - O(n))
bool hasExpensiveItem = productPrices.ContainsValue(999.99m);
// Remove items
bool removed = productPrices.Remove("mouse"); // Returns true if key existed
// Remove and capture the removed value in one call (.NET Core 2.0+ overload)
productPrices.Remove("laptop", out decimal oldLaptopPrice);
Iterating Through Dictionaries
// Iterate over key-value pairs
foreach (var kvp in productPrices)
{
Console.WriteLine($"{kvp.Key}: ${kvp.Value}");
}
// Iterate over keys only
foreach (string productName in productPrices.Keys)
{
Console.WriteLine($"Product: {productName}");
}
// Iterate over values only
foreach (decimal price in productPrices.Values)
{
Console.WriteLine($"Price: ${price}");
}
Performance Characteristics in Real-World Context
Understanding dictionary performance is crucial for building scalable applications:
- Get/Set by key: O(1) average case, O(n) worst case
- Add: O(1) average case, O(n) when resizing
- Remove: O(1) average case
- ContainsKey: O(1) average case
- ContainsValue: O(n) - must check all values

The “average case” performance assumes a good hash function and reasonable load factor. .NET’s built-in hash functions for common types (string, int, etc.) are well-optimized for real-world use.
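The bucket mechanics behind these numbers can be sketched in a few lines. The modulo step below is a simplification of what the runtime actually does (and string hash codes are randomized per process in .NET, so the bucket numbers differ between runs); it is shown only to make the O(1) path concrete:

```csharp
using System;

class BucketSketch
{
    static void Main()
    {
        // A real Dictionary grows its bucket array as the load factor rises;
        // 16 is just an illustrative size.
        const int bucketCount = 16;
        int bucket = 0;

        foreach (string key in new[] { "laptop", "mouse", "keyboard" })
        {
            int hash = key.GetHashCode();
            // Simplified bucket selection: clear the sign bit, wrap into range.
            bucket = (hash & int.MaxValue) % bucketCount;
            Console.WriteLine($"{key,-8} -> bucket {bucket}");
        }
        // A lookup recomputes the same hash and jumps straight to that bucket
        // instead of scanning every entry - that is the O(1) average case.
    }
}
```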
Real-World Application Patterns
User Management Systems
// Fast user lookups in authentication systems
Dictionary<string, User> userCache = new Dictionary<string, User>();
public User GetUser(string username)
{
if (userCache.TryGetValue(username, out User cachedUser))
{
return cachedUser; // O(1) cache hit
}
// Cache miss - load from database
var user = database.LoadUser(username);
userCache[username] = user; // Cache for next time
return user;
}
Configuration Management
// Application settings with fast lookups
Dictionary<string, string> appSettings = new Dictionary<string, string>
{
["DatabaseConnection"] = "Server=localhost;Database=MyApp",
["ApiTimeout"] = "30",
["MaxRetries"] = "3",
["EnableFeatureX"] = "true"
};
public T GetSetting<T>(string key, T defaultValue)
{
if (appSettings.TryGetValue(key, out string value))
{
return (T)Convert.ChangeType(value, typeof(T));
}
return defaultValue;
}
Gaming and Leaderboards
// Player statistics with instant lookups
Dictionary<string, PlayerStats> playerStats = new Dictionary<string, PlayerStats>();
public void UpdatePlayerScore(string playerId, int score)
{
if (playerStats.TryGetValue(playerId, out PlayerStats stats))
{
stats.TotalScore += score; // O(1) lookup and update
stats.GamesPlayed++;
}
else
{
playerStats[playerId] = new PlayerStats { TotalScore = score, GamesPlayed = 1 };
}
}
E-commerce Product Catalogs
// Product information with SKU lookups
Dictionary<string, Product> productCatalog = new Dictionary<string, Product>();
public Product GetProduct(string sku)
{
return productCatalog.TryGetValue(sku, out Product product) ? product : null;
}
public void UpdateInventory(string sku, int quantity)
{
if (productCatalog.TryGetValue(sku, out Product product))
{
product.QuantityInStock = quantity; // Instant update
}
}
Good Practices vs Bad Practices
✅ Good Practice: Use TryGetValue for Safe Access
// Safe - no exceptions thrown
if (userCache.TryGetValue(userId, out User user))
{
ProcessUser(user);
}
❌ Bad Practice: Direct Access Without Checking
// Dangerous - throws KeyNotFoundException if key doesn't exist
User user = userCache[userId]; // Could crash your application
✅ Good Practice: Choose Appropriate Key Types
// Good - strings and primitives have efficient hash functions
Dictionary<int, Customer> customerById = new Dictionary<int, Customer>();
Dictionary<string, Order> orderByNumber = new Dictionary<string, Order>();
❌ Bad Practice: Using Complex Objects as Keys Without Proper Hashing
// Problematic - needs custom GetHashCode() and Equals() implementation
public class CompositeKey
{
public string Part1 { get; set; }
public int Part2 { get; set; }
// Missing GetHashCode() and Equals() overrides
}
Dictionary<CompositeKey, string> badExample = new Dictionary<CompositeKey, string>();
✅ Good Practice: Initialize with Capacity for Known Size
// Prevents multiple resize operations for large datasets
Dictionary<string, ProductInfo> products = new Dictionary<string, ProductInfo>(10000);
❌ Bad Practice: Using Lists for Key-Based Lookups
// Inefficient O(n) search instead of O(1) dictionary lookup
List<User> users = new List<User>();
var user = users.FirstOrDefault(u => u.Id == targetId); // Slow!
Advanced Dictionary Features
Custom Key Types
When using custom objects as keys, implement GetHashCode() and Equals():
public class ProductKey : IEquatable<ProductKey>
{
public string Category { get; set; }
public string SubCategory { get; set; }
public bool Equals(ProductKey other)
{
return other != null &&
Category == other.Category &&
SubCategory == other.SubCategory;
}
public override bool Equals(object obj) => Equals(obj as ProductKey);
public override int GetHashCode()
{
return HashCode.Combine(Category, SubCategory); // .NET Core 2.1+
}
}
Dictionary with Custom Comparer
// Case-insensitive string keys
Dictionary<string, User> users = new Dictionary<string, User>(StringComparer.OrdinalIgnoreCase);
users["JOHN"] = user1;
var sameUser = users["john"]; // Finds the same user
Thread Safety and Concurrent Access
Dictionaries are not thread-safe for modifications. For concurrent scenarios:
// Thread-safe alternative for concurrent access
ConcurrentDictionary<string, User> threadSafeUsers = new ConcurrentDictionary<string, User>();
// Atomic operations
threadSafeUsers.AddOrUpdate("john123",
newUser, // Add if doesn't exist
(key, existingUser) => newUser); // Update if exists
Common Pitfalls and How to Avoid Them
Hash Code Stability
// ❌ Bad: Mutable key objects can break dictionary
public class MutableKey
{
public string Value { get; set; } // Changing this breaks hash table
}
// ✅ Good: Immutable key objects
public class ImmutableKey
{
public string Value { get; }
public ImmutableKey(string value) => Value = value;
}
Null Key Handling
// Dictionary<TKey, TValue> throws ArgumentNullException for a null key
// Check for null before using a value as a key
if (keyValue != null)
{
dictionary[keyValue] = someValue;
}
Chapter Summary
Dictionary<TKey, TValue> provides O(1) average-case performance for key-based operations, making it essential for scenarios requiring fast lookups, caching, and associative data storage. Understanding hash table concepts and proper key design ensures optimal performance in real-world applications.
The key insight: Dictionaries transform linear O(n) searches into constant O(1) lookups, making them indispensable for scalable application architecture.
Practice Exercise
Scenario: Build a simple inventory management system for a warehouse.
Create an InventoryItem class with properties: SKU, Name, Quantity, Location, and LastUpdated.
Using dictionaries, implement:
- Fast SKU-based product lookup
- Location-based inventory grouping (multiple items per location)
- Low-stock alerts (items with quantity below threshold)
- Audit trail of the last 10 operations per SKU

Explain why dictionaries are more efficient than lists for this use case, and identify which operations benefit most from O(1) lookup performance.
Chapter 4: Queues: First In, First Out Processing
Think about the last time you waited in line at a coffee shop, submitted a print job, or uploaded files to a cloud service. These scenarios all follow the same principle: first come, first served. In software development, Queue<T> implements this FIFO (First In, First Out) pattern, making it essential for task processing, message handling, and any system where order matters.
Queues are the backbone of many critical systems—from web servers handling HTTP requests to operating systems managing process scheduling. Understanding when and how to use queues properly can mean the difference between a responsive application and one that frustrates users.
Understanding FIFO Principles
A queue works exactly like a line at the bank: the first person to arrive is the first person served. In programming terms:
- Enqueue: Add an item to the back of the queue
- Dequeue: Remove and return the item from the front of the queue
- Peek: Look at the front item without removing it
Queue<string> customerService = new Queue<string>();
// Customers arrive (enqueue)
customerService.Enqueue("Alice"); // Queue: [Alice]
customerService.Enqueue("Bob"); // Queue: [Alice, Bob]
customerService.Enqueue("Charlie"); // Queue: [Alice, Bob, Charlie]
// Serve customers (dequeue)
string nextCustomer = customerService.Dequeue(); // Returns "Alice"
// Queue is now: [Bob, Charlie]
Essential Queue Operations
Adding and Removing Items
Queue<ProcessingTask> taskQueue = new Queue<ProcessingTask>();
// Add items to the queue
taskQueue.Enqueue(new ProcessingTask("Send email"));
taskQueue.Enqueue(new ProcessingTask("Generate report"));
taskQueue.Enqueue(new ProcessingTask("Update database"));
// Process items in order
while (taskQueue.Count > 0)
{
ProcessingTask currentTask = taskQueue.Dequeue();
await currentTask.Execute();
Console.WriteLine($"Completed: {currentTask.Description}");
}
Examining Queue Contents
// Check what's next without removing it
if (taskQueue.Count > 0)
{
ProcessingTask nextTask = taskQueue.Peek();
Console.WriteLine($"Next task: {nextTask.Description}");
}
// Check if queue is empty
bool hasWork = taskQueue.Count > 0;
// Check if specific item is in queue (O(n) operation)
bool hasEmailTask = taskQueue.Contains(emailTask);
Converting and Enumerating
// Convert to array (preserves order)
ProcessingTask[] allTasks = taskQueue.ToArray();
// Enumerate without modifying queue
foreach (ProcessingTask task in taskQueue)
{
Console.WriteLine($"Queued: {task.Description}");
// Items remain in queue after enumeration
}
Performance Characteristics
Queues are optimized for their primary operations:
- Enqueue: O(1) - Adding to the back is constant time
- Dequeue: O(1) - Removing from the front is constant time
- Peek: O(1) - Looking at the front item
- Contains: O(n) - Must search through all items
- Count: O(1) - Maintained internally

This makes queues extremely efficient for their intended use cases but inappropriate when you need frequent random access or searching.
Real-World Application Patterns
Background Task Processing
public class BackgroundTaskProcessor
{
private readonly Queue<IBackgroundTask> _taskQueue = new Queue<IBackgroundTask>();
private readonly object _lock = new object();
public void QueueTask(IBackgroundTask task)
{
lock (_lock)
{
_taskQueue.Enqueue(task);
}
Console.WriteLine($"Queued: {task.Name}");
}
public async Task ProcessTasks()
{
while (true)
{
IBackgroundTask nextTask = null;
lock (_lock)
{
if (_taskQueue.Count > 0)
{
nextTask = _taskQueue.Dequeue();
}
}
if (nextTask != null)
{
await nextTask.Execute();
Console.WriteLine($"Completed: {nextTask.Name}");
}
else
{
await Task.Delay(1000); // Wait for new tasks
}
}
}
}
Web Request Processing
public class RequestProcessor
{
private readonly Queue<HttpRequest> _requestQueue = new Queue<HttpRequest>();
private const int MAX_QUEUE_SIZE = 1000;
public bool TryQueueRequest(HttpRequest request)
{
if (_requestQueue.Count >= MAX_QUEUE_SIZE)
{
// Queue full - reject request or implement overflow handling
return false;
}
_requestQueue.Enqueue(request);
return true;
}
public async Task<HttpResponse> ProcessNextRequest()
{
if (_requestQueue.Count == 0)
{
return null; // No requests to process
}
HttpRequest request = _requestQueue.Dequeue();
return await HandleRequest(request);
}
}
Game Development: Event Processing
public class GameEventSystem
{
private readonly Queue<GameEvent> _eventQueue = new Queue<GameEvent>();
public void QueueEvent(GameEvent gameEvent)
{
_eventQueue.Enqueue(gameEvent);
}
public void ProcessEvents()
{
// Process all events that occurred this frame
int eventsToProcess = _eventQueue.Count; // Snapshot count
for (int i = 0; i < eventsToProcess; i++)
{
GameEvent currentEvent = _eventQueue.Dequeue();
switch (currentEvent.Type)
{
case EventType.PlayerMoved:
HandlePlayerMovement(currentEvent);
break;
case EventType.ItemCollected:
HandleItemCollection(currentEvent);
break;
case EventType.EnemySpawned:
HandleEnemySpawn(currentEvent);
break;
}
}
}
}
File Processing Pipeline
public class FileProcessingPipeline
{
private readonly Queue<FileInfo> _processingQueue = new Queue<FileInfo>();
public void AddFilesToQueue(string directoryPath)
{
DirectoryInfo directory = new DirectoryInfo(directoryPath);
FileInfo[] files = directory.GetFiles("*.csv");
foreach (FileInfo file in files)
{
_processingQueue.Enqueue(file);
}
Console.WriteLine($"Queued {files.Length} files for processing");
}
public async Task ProcessFiles()
{
while (_processingQueue.Count > 0)
{
FileInfo currentFile = _processingQueue.Dequeue();
try
{
await ProcessFile(currentFile);
Console.WriteLine($"Processed: {currentFile.Name}");
}
catch (Exception ex)
{
Console.WriteLine($"Error processing {currentFile.Name}: {ex.Message}");
// Optionally re-queue failed files for retry
}
}
}
}
Good Practices vs Bad Practices
✅ Good Practice: Use Queues for Sequential Processing
// Perfect use case - processing items in arrival order
Queue<EmailMessage> outboxQueue = new Queue<EmailMessage>();
public async Task SendQueuedEmailsAsync() // must be async to await SendAsync
{
while (outboxQueue.Count > 0)
{
EmailMessage email = outboxQueue.Dequeue();
await emailService.SendAsync(email); // Process in order
}
}
❌ Bad Practice: Using Queues for Random Access
// Inefficient - queues aren't designed for searching
Queue<Customer> customers = new Queue<Customer>();
var targetCustomer = customers.FirstOrDefault(c => c.Id == targetId); // O(n) operation
// Better: Use Dictionary<int, Customer> for ID-based lookups
✅ Good Practice: Implement Queue Size Limits
public class BoundedQueue<T>
{
private readonly Queue<T> _queue = new Queue<T>();
private readonly int _maxSize;
public BoundedQueue(int maxSize) => _maxSize = maxSize;
public bool TryEnqueue(T item)
{
if (_queue.Count >= _maxSize)
{
return false; // Queue full
}
_queue.Enqueue(item);
return true;
}
}
❌ Bad Practice: Unbounded Queues in Production
// Dangerous - could consume unlimited memory
Queue<LogEntry> logQueue = new Queue<LogEntry>();
// In high-traffic scenarios, this could cause OutOfMemoryException
✅ Good Practice: Thread-Safe Queue Operations
// For concurrent scenarios, use thread-safe alternatives
ConcurrentQueue<WorkItem> threadSafeQueue = new ConcurrentQueue<WorkItem>();
// Or synchronize access to regular queues
private readonly object _queueLock = new object();
public void SafeEnqueue(WorkItem item)
{
lock (_queueLock)
{
_workQueue.Enqueue(item);
}
}
Advanced Queue Scenarios
Priority Queue Implementation (Custom)
While Queue<T> processes items in arrival order, sometimes you need priority-based processing:
public class PriorityQueue<T> where T : IComparable<T>
{
private readonly List<T> _items = new List<T>();
public void Enqueue(T item)
{
_items.Add(item);
_items.Sort(); // Simple implementation - could be optimized with heap
}
public T Dequeue()
{
if (_items.Count == 0)
throw new InvalidOperationException("Queue is empty");
T item = _items[0]; // List is sorted ascending, so the smallest value is treated as highest priority
_items.RemoveAt(0);
return item;
}
public int Count => _items.Count;
}
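If you are on .NET 6 or later, you do not need to hand-roll this: the BCL includes System.Collections.Generic.PriorityQueue<TElement, TPriority>, a heap-based implementation where the element with the lowest priority value dequeues first:

```csharp
using System.Collections.Generic;

PriorityQueue<string, int> tickets = new PriorityQueue<string, int>();

// Lower priority value = dequeued sooner (min-heap semantics)
tickets.Enqueue("Routine maintenance", 3);
tickets.Enqueue("Server down!", 1);
tickets.Enqueue("Feature request", 5);

string next = tickets.Dequeue(); // "Server down!"
```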
Queue with Retry Logic
public class RetryQueue<T>
{
private readonly Queue<QueueItem<T>> _mainQueue = new Queue<QueueItem<T>>();
private readonly Queue<QueueItem<T>> _retryQueue = new Queue<QueueItem<T>>();
public void Enqueue(T item)
{
_mainQueue.Enqueue(new QueueItem<T>(item, maxRetries: 3));
}
public bool TryDequeue(out T item)
{
// First try retry queue (failed items get priority)
if (_retryQueue.Count > 0)
{
var queueItem = _retryQueue.Dequeue();
item = queueItem.Item;
return true;
}
// Then try main queue
if (_mainQueue.Count > 0)
{
var queueItem = _mainQueue.Dequeue();
item = queueItem.Item;
return true;
}
item = default(T);
return false;
}
public void RequeueForRetry(T item, int remainingRetries)
{
if (remainingRetries > 0)
{
_retryQueue.Enqueue(new QueueItem<T>(item, remainingRetries - 1));
}
// Otherwise, item is discarded after max retries
}
}
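The QueueItem<T> wrapper used above isn't defined in the chapter; a minimal version (an assumption about its shape, inferred from how RetryQueue uses it) could be:

```csharp
public class QueueItem<T>
{
    public T Item { get; }
    public int RemainingRetries { get; }

    // maxRetries matches the named argument used in RetryQueue.Enqueue
    public QueueItem(T item, int maxRetries)
    {
        Item = item;
        RemainingRetries = maxRetries;
    }
}
```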
Thread Safety Considerations
Standard Queue<T> is not thread-safe. For concurrent scenarios:
// Option 1: Use ConcurrentQueue<T>
ConcurrentQueue<string> concurrentQueue = new ConcurrentQueue<string>();
concurrentQueue.Enqueue("thread-safe item");
if (concurrentQueue.TryDequeue(out string result))
{
Console.WriteLine($"Dequeued: {result}");
}
// Option 2: Synchronize access with locks
private readonly Queue<string> _queue = new Queue<string>();
private readonly object _lock = new object();
public void ThreadSafeEnqueue(string item)
{
lock (_lock)
{
_queue.Enqueue(item);
}
}
public bool ThreadSafeTryDequeue(out string item)
{
lock (_lock)
{
if (_queue.Count > 0)
{
item = _queue.Dequeue();
return true;
}
item = null;
return false;
}
}
Chapter Summary
Queue<T> implements FIFO processing with O(1) performance for enqueue and dequeue operations, making it ideal for sequential task processing, request handling, and any scenario where order preservation is critical. The key is recognizing when order matters more than random access or searching capabilities.
The key insight: Queues excel in producer-consumer scenarios where fairness and order preservation are more important than flexible data access patterns.
Practice Exercise
Scenario: Build a customer support ticket system that processes tickets in the order they’re received.
Create a SupportTicket class with properties: TicketId, CustomerName, Issue, Priority, CreatedAt, and Status.
Implement:
- A ticket queue that processes standard tickets in FIFO order
- An urgent ticket queue that gets priority over standard tickets
- A method to estimate wait time based on queue position
- Handling for tickets that fail processing (retry up to 3 times)
- Daily statistics: tickets processed, average processing time, tickets remaining

Explain why queues are more appropriate than lists for this system, and identify potential issues with unbounded queues in a high-traffic support system.
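A possible starting point for the exercise — the property names follow the exercise description; the enum values and defaults are assumptions, and everything else is up to you:

```csharp
using System;

public enum TicketStatus { Open, InProgress, Resolved, Failed }

public class SupportTicket
{
    public int TicketId { get; set; }
    public string CustomerName { get; set; } = "";
    public string Issue { get; set; } = "";
    public int Priority { get; set; }          // e.g. 1 = urgent, 2 = standard
    public DateTime CreatedAt { get; set; } = DateTime.UtcNow;
    public TicketStatus Status { get; set; } = TicketStatus.Open;
}
```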
Chapter 5: Stacks: Last In, First Out Operations
Picture the last time you used your browser’s back button, pressed Ctrl+Z to undo an edit, or debugged through nested method calls. Each of these scenarios follows a Last In, First Out (LIFO) pattern—the most recent action is the first one you reverse. Stack<T> in C# implements this fundamental pattern, making it essential for undo systems, expression evaluation, and managing hierarchical operations.
Stacks are everywhere in computing, often working behind the scenes. Understanding when to reach for a stack versus other collections can dramatically simplify complex problems, especially those involving nested operations or reversible actions.
Understanding LIFO Principles
A stack works like a stack of plates: you can only add or remove plates from the top. In programming terms:
- Push: Add an item to the top of the stack
- Pop: Remove and return the top item from the stack
- Peek: Look at the top item without removing it
Stack<string> browserHistory = new Stack<string>();
// User navigates through pages (push)
browserHistory.Push("google.com"); // Stack: [google.com]
browserHistory.Push("stackoverflow.com"); // Stack: [google.com, stackoverflow.com]
browserHistory.Push("github.com"); // Stack: [google.com, stackoverflow.com, github.com]
// User clicks back button (pop)
string currentPage = browserHistory.Pop(); // Returns "github.com"
// Stack is now: [google.com, stackoverflow.com]
Essential Stack Operations
Adding and Removing Items
Stack<string> undoStack = new Stack<string>();
// Record actions as they happen
undoStack.Push("Typed 'Hello'");
undoStack.Push("Typed ' World'");
undoStack.Push("Changed font to bold");
// Undo actions in reverse order
while (undoStack.Count > 0)
{
string lastAction = undoStack.Pop();
Console.WriteLine($"Undoing: {lastAction}");
}
// Output:
// Undoing: Changed font to bold
// Undoing: Typed ' World'
// Undoing: Typed 'Hello'
Examining Stack Contents
Stack<decimal> calculatorStack = new Stack<decimal>();
calculatorStack.Push(10);
calculatorStack.Push(20);
calculatorStack.Push(5);
// Look at top item without removing it
decimal topValue = calculatorStack.Peek(); // Returns 5
Console.WriteLine($"Top value: {topValue}"); // Stack unchanged
// Check if stack is empty
bool hasValues = calculatorStack.Count > 0;
// Search for specific value (O(n) operation)
bool containsTwenty = calculatorStack.Contains(20);
Converting and Enumerating
// Convert to array (top-to-bottom order)
decimal[] values = calculatorStack.ToArray(); // [5, 20, 10]
// Enumerate without modifying stack
foreach (decimal value in calculatorStack)
{
Console.WriteLine(value); // Prints 5, then 20, then 10
// Stack remains unchanged after enumeration
}
Performance Characteristics
Stacks excel at their core operations:
- Push: O(1) - Adding to the top is constant time
- Pop: O(1) - Removing from the top is constant time
- Peek: O(1) - Looking at the top item
- Contains: O(n) - Must search through all items
- Count: O(1) - Maintained internally

This makes stacks extremely efficient for LIFO scenarios but inefficient when you need to access items in the middle or search frequently.
Real-World Application Patterns
Undo/Redo Functionality
public class TextEditor
{
private readonly Stack<EditorAction> _undoStack = new Stack<EditorAction>();
private readonly Stack<EditorAction> _redoStack = new Stack<EditorAction>();
private string _content = "";
public void ExecuteAction(EditorAction action)
{
// Save current state for undo
_undoStack.Push(new EditorAction
{
Type = ActionType.RestoreContent,
PreviousContent = _content
});
// Apply the action
_content = action.Apply(_content);
// Clear redo stack - new action invalidates redo history
_redoStack.Clear();
Console.WriteLine($"Executed: {action.Description}");
}
public void Undo()
{
if (_undoStack.Count == 0)
{
Console.WriteLine("Nothing to undo");
return;
}
// Save current state for redo
_redoStack.Push(new EditorAction
{
Type = ActionType.RestoreContent,
PreviousContent = _content
});
// Restore previous state
EditorAction undoAction = _undoStack.Pop();
_content = undoAction.PreviousContent;
Console.WriteLine($"Undid last action");
}
public void Redo()
{
if (_redoStack.Count == 0)
{
Console.WriteLine("Nothing to redo");
return;
}
// Move action from redo to undo stack
EditorAction redoAction = _redoStack.Pop();
_undoStack.Push(new EditorAction
{
Type = ActionType.RestoreContent,
PreviousContent = _content
});
// Apply the redo action
_content = redoAction.PreviousContent;
Console.WriteLine("Redid last undone action");
}
}
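The TextEditor above relies on an EditorAction type the chapter doesn't show; one plausible shape (an assumption, sized to support both the snapshot-restore pattern and text insertion) is:

```csharp
public enum ActionType { InsertText, RestoreContent }

public class EditorAction
{
    public ActionType Type { get; set; }
    public string Description { get; set; } = "";
    public string PreviousContent { get; set; } = "";
    public string TextToInsert { get; set; } = "";

    // Applies this action to the current content and returns the new content
    public string Apply(string content) =>
        Type == ActionType.RestoreContent ? PreviousContent : content + TextToInsert;
}
```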
Expression Evaluation (Calculator)
public class PostfixCalculator
{
public decimal Evaluate(string[] tokens)
{
Stack<decimal> operandStack = new Stack<decimal>();
foreach (string token in tokens)
{
if (IsNumber(token))
{
operandStack.Push(decimal.Parse(token));
}
else if (IsOperator(token))
{
// Pop operands (note the order for non-commutative operations)
decimal right = operandStack.Pop();
decimal left = operandStack.Pop();
decimal result = token switch
{
"+" => left + right,
"-" => left - right,
"*" => left * right,
"/" => left / right,
_ => throw new ArgumentException($"Unknown operator: {token}")
};
operandStack.Push(result);
}
}
return operandStack.Pop(); // Final result
}
// Helper predicates used above
private static bool IsNumber(string token) => decimal.TryParse(token, out _);
private static bool IsOperator(string token) => token is "+" or "-" or "*" or "/";
// Example: "3 4 + 2 *" evaluates to 14
// 1. Push 3, Push 4
// 2. Pop 4, Pop 3, Push 3+4=7
// 3. Push 2
// 4. Pop 2, Pop 7, Push 7*2=14
}
Function Call Stack Simulation
public class CallStackTracer
{
private readonly Stack<MethodCall> _callStack = new Stack<MethodCall>();
public void EnterMethod(string methodName, Dictionary<string, object> parameters = null)
{
var methodCall = new MethodCall
{
Name = methodName,
Parameters = parameters ?? new Dictionary<string, object>(),
StartTime = DateTime.Now
};
_callStack.Push(methodCall);
Console.WriteLine($"→ Entering {methodName} (depth: {_callStack.Count})");
}
public void ExitMethod()
{
if (_callStack.Count == 0)
{
Console.WriteLine("Warning: ExitMethod called with empty call stack");
return;
}
MethodCall completedCall = _callStack.Pop();
TimeSpan duration = DateTime.Now - completedCall.StartTime;
Console.WriteLine($"← Exiting {completedCall.Name} " +
$"(duration: {duration.TotalMilliseconds:F2}ms, " +
$"depth: {_callStack.Count})");
}
public void PrintCallStack()
{
Console.WriteLine("Current Call Stack:");
foreach (MethodCall call in _callStack)
{
Console.WriteLine($" {call.Name}");
}
}
}
// Usage in methods:
public void ProcessOrder(Order order)
{
callTracer.EnterMethod("ProcessOrder", new Dictionary<string, object> { ["orderId"] = order.Id });
try
{
ValidateOrder(order);
CalculateTotal(order);
SaveOrder(order);
}
finally
{
callTracer.ExitMethod();
}
}
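MethodCall, used by the tracer, is also left undefined; a minimal record of a call might be:

```csharp
using System;
using System.Collections.Generic;

public class MethodCall
{
    public string Name { get; set; } = "";
    public Dictionary<string, object> Parameters { get; set; } = new Dictionary<string, object>();
    public DateTime StartTime { get; set; }
}
```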
Bracket/Parentheses Matching
public class BracketValidator
{
private static readonly Dictionary<char, char> BracketPairs = new Dictionary<char, char>
{
{ ')', '(' },
{ '}', '{' },
{ ']', '[' }
};
public bool IsValid(string expression)
{
Stack<char> bracketStack = new Stack<char>();
foreach (char character in expression)
{
if (IsOpeningBracket(character))
{
bracketStack.Push(character);
}
else if (IsClosingBracket(character))
{
if (bracketStack.Count == 0)
{
return false; // Closing bracket without matching opening
}
char lastOpening = bracketStack.Pop();
if (BracketPairs[character] != lastOpening)
{
return false; // Mismatched bracket types
}
}
// Ignore other characters
}
return bracketStack.Count == 0; // All brackets should be matched
}
// Helper predicates used above
private static bool IsOpeningBracket(char c) => c is '(' or '{' or '[';
private static bool IsClosingBracket(char c) => BracketPairs.ContainsKey(c);
// Examples:
// IsValid("()[]{}") → true
// IsValid("([{}])") → true
// IsValid("([)]") → false (crossed brackets)
// IsValid("((())") → false (unmatched opening)
}
Navigation History (Web Browser)
public class BrowserNavigationManager
{
private readonly Stack<string> _backStack = new Stack<string>();
private readonly Stack<string> _forwardStack = new Stack<string>();
private string _currentPage = "about:blank";
public void NavigateTo(string url)
{
// Save current page to back stack
if (!string.IsNullOrEmpty(_currentPage))
{
_backStack.Push(_currentPage);
}
_currentPage = url;
// Clear forward stack - new navigation invalidates forward history
_forwardStack.Clear();
Console.WriteLine($"Navigated to: {url}");
}
public bool CanGoBack => _backStack.Count > 0;
public bool CanGoForward => _forwardStack.Count > 0;
public void GoBack()
{
if (!CanGoBack)
{
Console.WriteLine("Cannot go back - no previous pages");
return;
}
// Move current page to forward stack
_forwardStack.Push(_currentPage);
// Get previous page from back stack
_currentPage = _backStack.Pop();
Console.WriteLine($"Went back to: {_currentPage}");
}
public void GoForward()
{
if (!CanGoForward)
{
Console.WriteLine("Cannot go forward - no forward history");
return;
}
// Move current page to back stack
_backStack.Push(_currentPage);
// Get next page from forward stack
_currentPage = _forwardStack.Pop();
Console.WriteLine($"Went forward to: {_currentPage}");
}
}
Good Practices vs Bad Practices
✅ Good Practice: Use Stacks for Naturally LIFO Operations
// Perfect use case - reversing operations
Stack<DatabaseTransaction> transactionStack = new Stack<DatabaseTransaction>();
public void BeginTransaction(DatabaseTransaction transaction)
{
transactionStack.Push(transaction);
transaction.Begin();
}
public void RollbackLastTransaction()
{
if (transactionStack.Count > 0)
{
DatabaseTransaction lastTransaction = transactionStack.Pop();
lastTransaction.Rollback(); // Most recent transaction rolled back first
}
}
❌ Bad Practice: Using Stacks for Sequential Processing
// Inefficient - items are processed in reverse order
Stack<EmailMessage> emailQueue = new Stack<EmailMessage>();
emailQueue.Push(firstEmail); // Will be processed LAST
emailQueue.Push(secondEmail); // Will be processed FIRST
// Use Queue<T> instead for FIFO processing
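The inversion is easy to see with plain strings:

```csharp
using System;
using System.Collections.Generic;

Stack<string> stack = new Stack<string>();
stack.Push("first");
stack.Push("second");

string processedFirst = stack.Pop(); // "second" - arrival order is reversed
Console.WriteLine(processedFirst);
```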
✅ Good Practice: Check for Empty Stack Before Operations
public T SafePop<T>(Stack<T> stack)
{
if (stack.Count == 0)
{
throw new InvalidOperationException("Cannot pop from empty stack");
}
return stack.Pop();
}
// Or use TryPop pattern
public bool TryPop<T>(Stack<T> stack, out T item)
{
if (stack.Count > 0)
{
item = stack.Pop();
return true;
}
item = default(T);
return false;
}
❌ Bad Practice: Assuming Stack Will Always Have Items
// Dangerous - will throw exception if stack is empty
Stack<string> history = new Stack<string>();
string lastAction = history.Pop(); // InvalidOperationException if empty
✅ Good Practice: Use Stacks for Hierarchical Processing
// Excellent for tree traversal or nested structures
public void TraverseDirectoryStructure(DirectoryInfo directory)
{
Stack<DirectoryInfo> directories = new Stack<DirectoryInfo>();
directories.Push(directory);
while (directories.Count > 0)
{
DirectoryInfo current = directories.Pop();
ProcessDirectory(current);
// Add subdirectories to stack (will be processed depth-first)
foreach (DirectoryInfo subdirectory in current.GetDirectories())
{
directories.Push(subdirectory);
}
}
}
Advanced Stack Scenarios
Stack with Size Limit
public class BoundedStack<T>
{
private readonly Stack<T> _stack = new Stack<T>();
private readonly int _maxSize;
public BoundedStack(int maxSize) => _maxSize = maxSize;
public bool TryPush(T item)
{
if (_stack.Count >= _maxSize)
{
return false; // Stack full
}
_stack.Push(item);
return true;
}
public bool TryPop(out T item)
{
if (_stack.Count > 0)
{
item = _stack.Pop();
return true;
}
item = default(T);
return false;
}
public int Count => _stack.Count;
public bool IsFull => _stack.Count >= _maxSize;
}
Thread-Safe Stack Operations
public class ThreadSafeStack<T>
{
private readonly Stack<T> _stack = new Stack<T>();
private readonly object _lock = new object();
public void Push(T item)
{
lock (_lock)
{
_stack.Push(item);
}
}
public bool TryPop(out T item)
{
lock (_lock)
{
if (_stack.Count > 0)
{
item = _stack.Pop();
return true;
}
item = default(T);
return false;
}
}
public bool TryPeek(out T item)
{
lock (_lock)
{
if (_stack.Count > 0)
{
item = _stack.Peek();
return true;
}
item = default(T);
return false;
}
}
public int Count
{
get
{
lock (_lock)
{
return _stack.Count;
}
}
}
}
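Just as ConcurrentQueue<T> replaces hand-rolled queue locking, System.Collections.Concurrent also provides ConcurrentStack<T>, which offers thread-safe Push, TryPop, and TryPeek out of the box:

```csharp
using System;
using System.Collections.Concurrent;

ConcurrentStack<string> stack = new ConcurrentStack<string>();
stack.Push("thread-safe item");

if (stack.TryPop(out string? item))
{
    Console.WriteLine($"Popped: {item}");
}
```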
Chapter Summary
Stack<T> implements LIFO processing with O(1) performance for push, pop, and peek operations, making it essential for undo systems, expression evaluation, and any scenario involving nested or reversible operations. The key is recognizing when the “last in, first out” pattern matches your problem domain.
The key insight: Stacks excel when you need to reverse the order of operations or process nested structures, making complex hierarchical problems much simpler to solve.
Practice Exercise
Scenario: Build a simple arithmetic expression evaluator that can handle parentheses and basic operators (+, -, *, /).
Create a system that:
- Uses a stack to convert infix notation (“3 + 4 * 2”) to postfix notation (“3 4 2 * +”)
- Uses another stack to evaluate the postfix expression
- Handles nested parentheses correctly
- Provides meaningful error messages for invalid expressions
- Tracks the operation history for debugging purposes

For example:
- Input: “2 * (3 + 4)” should evaluate to 14
- Input: “((2 + 3) * 4” should