
Base.cs Podcast

CodeNewbie (www.codenewbie.org)
Beginner-friendly computer science lessons based on Vaidehi Joshi's base.cs blog series, produced by CodeNewbie.

Episodes

S9:E8 - "In the end, the code you take is equal to the code you make"

For our final episode, we answer your burning questions, including the Base.cs origin story, Saron and Vaidehi's favorite niche data structure, and some good resources to check out next. We also take a look back at some of our favorite moments from the show's history, and find a couple of fun themes. Based on Vaidehi Joshi's blog post, "Base.cs".

Mar 25, 2020 · 26 min

S9:E7 - "This way to translate is le-JIT"

We've been talking a lot about the differences between compilers and interpreters, how both of them work, and how one of them, the compiler, led to the creation of the other, the interpreter. Now we get into the Just In Time compiler, or JIT, which is a fusion of the interpreter and the compiler, each a type of translator in its own right. A just-in-time compiler has many of the benefits of both of these translation techniques, all rolled u...

Mar 18, 2020 · 22 min

S9:E6 - "Two translators, both alike in dignity"

We have been talking a lot about compilers, and in this episode we discuss the differences between compilation versus interpretation. An interpreter is also a translator, just like a compiler, in that it takes a high level language (our source text) and converts it into machine code. However, it does something slightly different: it actually runs and executes the code that it translates immediately (inline) as it translates. Based on Vaidehi Joshi's blog post, "A Deeper Inspection Into Compilati...

Mar 11, 2020 · 20 min

S9:E5 - "Paring down our parse trees with AST"

In this episode, we take our parse tree, an illustrated, pictorial version of the grammatical structure of a sentence, and we take a metaphorical broom to sweep away repetitive bits, slimming it down and leveling it up by creating an abstract syntax tree (AST). Based on Vaidehi Joshi's blog post, "Leveling Up One’s Parsing Game With ASTs".
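For a quick, concrete look at what an AST is (a sketch of the general idea, not code from the episode), Python's built-in ast module will parse a small expression and print its abstract syntax tree; notice that only the meaningful structure survives, not every grammatical detail:

    import ast

    # Parse a tiny expression into an abstract syntax tree. The AST keeps the
    # meaningful structure (a BinOp whose right side is another BinOp) and
    # drops the purely grammatical scaffolding a full parse tree would carry.
    tree = ast.parse("1 + 2 * 3", mode="eval")
    print(ast.dump(tree, indent=2))  # the indent option needs Python 3.9+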

Mar 04, 2020 · 19 min

S9:E4 - "Confused about compilers?"

In this episode, we get into what a compiler is and does. In short, a compiler is a program that reads our code (or any code, in any programming language), and translates it into another language. You'll want to listen in to find out just how it does this! Based on Vaidehi Joshi's blog post, "Reading Code Right, With Some Help From The Lexer".
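As a loose illustration of the first step a compiler takes (a toy lexer of my own, not the blog post's code), here is how source text might be broken into tokens before anything else happens:

    import re

    # A toy lexer: turn source text into (kind, value) tokens.
    TOKEN_SPEC = [
        ("NUMBER", r"\d+"),
        ("OP",     r"[+\-*/]"),
        ("LPAREN", r"\("),
        ("RPAREN", r"\)"),
        ("SKIP",   r"\s+"),   # whitespace is recognized, then thrown away
    ]
    TOKEN_RE = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

    def lex(source):
        for match in TOKEN_RE.finditer(source):
            if match.lastgroup != "SKIP":
                yield (match.lastgroup, match.group())

    print(list(lex("12 + (3 * 45)")))
    # [('NUMBER', '12'), ('OP', '+'), ('LPAREN', '('), ('NUMBER', '3'),
    #  ('OP', '*'), ('NUMBER', '45'), ('RPAREN', ')')]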

Feb 26, 2020 · 23 min

S9:E3 - "Parsing out parse trees"

In this episode, we get into parse trees, an illustrated, pictorial version of the grammatical structure of a sentence, which is important to understanding how computers understand coding syntax. Based on Vaidehi Joshi's blog post, "Grammatically Rooting Oneself With Parse Trees".
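To make the "pictorial" part a little more concrete (an illustrative sketch, not taken from the episode), a parse tree for a small expression can be written out as nested nodes, with every grammatical piece, operators and parentheses included, getting its own node:

    # A hand-drawn parse tree for "2 * (3 + 4)", written as nested tuples.
    parse_tree = (
        "expression",
        ("number", "2"),
        ("operator", "*"),
        ("group",
            ("paren", "("),
            ("expression",
                ("number", "3"),
                ("operator", "+"),
                ("number", "4")),
            ("paren", ")")),
    )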

Feb 19, 2020 · 23 min

S9:E2 - "Speeding up our traveling salesperson"

We continue our journey with the Traveling Salesman Problem (TSP), where we imagine a salesperson who has to travel to every single city in an area, visiting each city only once. Additionally, they need to end up in the same city where they started their journey, and do this in the most efficient manner. However, in this episode, we are going to speed our salesperson up by using a bottom-up approach! Based on Vaidehi Joshi's blog post, "The Trials And Tribulations Of The Traveling Salesman"...
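For the curious, here is a rough sketch of the bottom-up idea (the classic Held-Karp dynamic-programming approach, with made-up distances; it is not the episode's own code): answers for small sets of cities are computed first and reused to build up answers for larger sets:

    from itertools import combinations

    def held_karp(dist):
        """Shortest round trip visiting every city once, built bottom-up.
        dist is a square distance matrix; city 0 is the fixed start/end."""
        n = len(dist)
        # best[(visited, last)] = cheapest way to leave city 0, visit exactly
        # the cities in `visited` (which excludes 0), and stop at `last`.
        best = {(frozenset([k]), k): dist[0][k] for k in range(1, n)}
        for size in range(2, n):
            for subset in combinations(range(1, n), size):
                visited = frozenset(subset)
                for last in subset:
                    rest = visited - {last}
                    best[(visited, last)] = min(
                        best[(rest, prev)] + dist[prev][last] for prev in rest
                    )
        everyone = frozenset(range(1, n))
        return min(best[(everyone, last)] + dist[last][0] for last in everyone)

    distances = [[0, 20, 42, 35],
                 [20, 0, 30, 34],
                 [42, 30, 0, 12],
                 [35, 34, 12, 0]]
    print(held_karp(distances))  # 97 for this small made-up map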

Feb 12, 2020 · 22 min

S9:E1 - "Take a journey with the Traveling Salesman"

We start our season off with something that often pops up in technical interviews: the Traveling Salesman Problem (TSP). In this problem, a salesperson has to travel to every single city in an area, visiting each city only once. Additionally, they need to end up in the same city where they started their journey. Find out how to make our salesperson do this in the most efficient way possible! Based on Vaidehi Joshi's blog post, "The Trials And Tribulations Of The Traveling Salesman".
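As a small sketch of the brute-force version of the problem (the example distances are made up; this is not code from the episode), you can literally try every possible ordering of the other cities and keep the cheapest round trip:

    from itertools import permutations

    def shortest_tour(dist):
        """Try every ordering of cities 1..n-1, starting and ending at city 0,
        and keep the cheapest round trip. Fine for tiny inputs only."""
        n = len(dist)
        best = None
        for order in permutations(range(1, n)):
            route = (0, *order, 0)
            cost = sum(dist[a][b] for a, b in zip(route, route[1:]))
            if best is None or cost < best:
                best = cost
        return best

    distances = [[0, 20, 42, 35],
                 [20, 0, 30, 34],
                 [42, 30, 0, 12],
                 [35, 34, 12, 0]]
    print(shortest_tour(distances))  # 97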

Feb 05, 2020 · 30 min

S8:E8 - "Memoizing all the things in dynamic programming"

In this last episode of the season we continue our discussion of dynamic programming, and show just how efficient it can be by using the Fibonacci sequence! Based on Vaidehi Joshi's blog post, "Less Repetition, More Dynamic Programming".
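A minimal sketch of the Fibonacci example (not the episode's code): memoizing each subproblem turns an exponential pile of repeated calls into a quick, linear computation:

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def fib(n):
        # Each subproblem is computed once and remembered (memoized),
        # so fib(50) is instant instead of taking billions of calls.
        if n < 2:
            return n
        return fib(n - 1) + fib(n - 2)

    print(fib(50))  # 12586269025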

Dec 11, 2019 · 19 min

S8:E7 - "Dynamic Programming is pretty dynamite "

In this episode we talk about different paradigms and approaches to algorithmic design: the Divide and Conquer Algorithm, the Greedy Algorithm, and the Dynamic Programming Algorithm, which remembers the subproblems that it has seen and solved before so as not to repeat doing the same thing over again. Based on Vaidehi Joshi's blog post, "Less Repetition, More Dynamic Programming".

Dec 04, 2019 · 22 min

S8:E6 - "Getting deeper into Dijkstra"

We continue our talk about Dijkstra's algorithm, which can be used to determine the shortest path from one node in a graph to every other node within the same graph data structure, provided that the nodes are reachable from the starting node. Based on Vaidehi Joshi's blog post, "Finding The Shortest Path, With A Little Help From Dijkstra".

Nov 20, 2019 · 29 min

S8:E5 - "Dijkstra's algorithm is a weighty topic"

In this episode, we talk about Dijkstra's algorithm, which can be used to determine the shortest path from one node in a graph to every other node within the same graph data structure, provided that the nodes are reachable from the starting node. It's super important, and you'll see why when you learn about the weighted graph! Based on Vaidehi Joshi's blog post, "Finding The Shortest Path, With A Little Help From Dijkstra".
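Here is a compact sketch of the algorithm (my own example graph and weights, not the blog post's): a priority queue always hands us the cheapest unexplored node next, and we relax each of its outgoing edges:

    import heapq

    def dijkstra(graph, start):
        """Shortest distance from `start` to every reachable node.
        `graph` maps a node to a list of (neighbor, edge_weight) pairs."""
        distances = {start: 0}
        queue = [(0, start)]
        while queue:
            dist_so_far, node = heapq.heappop(queue)
            if dist_so_far > distances.get(node, float("inf")):
                continue  # stale queue entry; a cheaper path was already found
            for neighbor, weight in graph[node]:
                candidate = dist_so_far + weight
                if candidate < distances.get(neighbor, float("inf")):
                    distances[neighbor] = candidate
                    heapq.heappush(queue, (candidate, neighbor))
        return distances

    graph = {
        "a": [("b", 7), ("c", 3)],
        "b": [("d", 1)],
        "c": [("b", 2), ("d", 8)],
        "d": [],
    }
    print(dijkstra(graph, "a"))  # {'a': 0, 'b': 5, 'c': 3, 'd': 6}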

Nov 13, 2019 · 26 min

S8:E4 - "DAG, Daniel! Back at it again..."

We end our section on the DFS algorithm with a discussion of DAGs (directed acyclic graphs), because most implementations of depth-first search will check to see if any cycles exist, and a large part of that is based on the DFS algorithm checking to see whether or not a graph is a directed acyclic graph. DAGs are also somewhat infamous in computer science because they’re pretty much everywhere in software. For example, a directed acyclic graph is the backbone of applications that handle schedulin...

Nov 06, 2019 · 13 min

S8:E3 - "Living on the edge!"

Throughout our exploration of graphs, we’ve focused mostly on representing graphs, and how to search through them. We also learned about edges, the elements that connect the nodes in a graph. In this episode, we look at the different classifications of edges and how, in the context of a graph, edges can be more than just “directed” or “undirected”. Based on Vaidehi Joshi's blog post, "Spinning Around In Cycles With Directed Acyclic Graphs".

Oct 29, 2019 · 21 min

S8:E2 - "Jump around the indexes with DFS!"

Last episode, we talked about traversing through a graph with the depth-first search (DFS) algorithm, which helps us determine one (of sometimes many) paths between two nodes in the graph by traversing down one single path until we can't go any further, checking one child node at a time. Now we talk about how you actually code DFS and what tools you might use. Based on Vaidehi Joshi's blog post, "Deep Dive Through A Graph: DFS Traversal".
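One common tool for coding it is an explicit stack (recursion works too, since the call stack plays the same role). A minimal sketch, using a made-up graph rather than anything from the episode:

    def dfs(graph, start):
        """Depth-first traversal with an explicit stack: keep following one
        path as far as it goes before backing up and trying another branch."""
        visited = []
        stack = [start]
        seen = {start}
        while stack:
            node = stack.pop()               # most recently discovered node
            visited.append(node)
            for neighbor in reversed(graph[node]):
                if neighbor not in seen:
                    seen.add(neighbor)
                    stack.append(neighbor)
        return visited

    graph = {"a": ["b", "c"], "b": ["d"], "c": ["e"], "d": [], "e": []}
    print(dfs(graph, "a"))  # ['a', 'b', 'd', 'c', 'e']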

Oct 23, 2019 · 25 min

S8:E1 - "Getting deep with depth-first search"

We ended last season by starting our discussion of searching, or traversing, through a graph with breadth-first search (BFS). The breadth-first search algorithm traverses broadly into a structure, by visiting neighboring sibling nodes before visiting children nodes. Now we begin our new season with depth-first search (DFS), which also helps us determine one (of sometimes many) paths between two nodes in the graph, but this time by traversing down one single path in a graph, until we can't go any...

Oct 16, 2019 · 26 min

S7:E8 - "Delivering muffins with BFS"

In this episode, we start our discussion of searching, or traversing, through a graph with breadth-first search (BFS). The breadth-first search algorithm traverses broadly into a structure, by visiting neighboring sibling nodes before visiting children nodes. The power of using breadth-first search to traverse through a graph is that it can easily tell us the shortest way to get from one node to another, which you'll experience first hand by bringing muffins to your neighbors! Based on Vaidehi Jo...
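A small sketch of that muffin delivery (the node names and graph are invented for illustration, not from the episode): because BFS explores the neighborhood one ring at a time, the first path that reaches the goal is a shortest one:

    from collections import deque

    def shortest_path(graph, start, goal):
        """Breadth-first search: visit all neighbors one layer at a time, so
        the first time `goal` appears we have found a shortest path to it."""
        queue = deque([[start]])
        seen = {start}
        while queue:
            path = queue.popleft()
            node = path[-1]
            if node == goal:
                return path
            for neighbor in graph[node]:
                if neighbor not in seen:
                    seen.add(neighbor)
                    queue.append(path + [neighbor])
        return None

    neighborhood = {
        "home":  ["alice", "bob"],
        "alice": ["home", "carol"],
        "bob":   ["home", "carol"],
        "carol": ["alice", "bob", "dana"],
        "dana":  ["carol"],
    }
    print(shortest_path(neighborhood, "home", "dana"))
    # ['home', 'alice', 'carol', 'dana']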

Sep 11, 2019 · 23 min

S7:E7 - "Plotting to represent a graph? We got you."

In this episode, we continue our discussion of representing graphs with adjacency lists -- a hybrid between an edge list and an adjacency matrix, which we learned about last episode! They are also the most popular and commonly-used representation of a graph. Based on Vaidehi Joshi's blog post, "From Theory To Practice: Representing Graphs".
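To see why the adjacency list sits between the other two representations, here is the same tiny undirected graph written all three ways (an illustrative sketch, not the blog post's example):

    # The same three-node graph, three ways.
    edge_list = [("a", "b"), ("a", "c"), ("b", "c")]

    adjacency_matrix = [   # rows and columns in the order a, b, c
        [0, 1, 1],
        [1, 0, 1],
        [1, 1, 0],
    ]

    adjacency_list = {     # each node maps directly to its neighbors
        "a": ["b", "c"],
        "b": ["a", "c"],
        "c": ["a", "b"],
    }

    # "Who are a's neighbors?" is a single lookup in the adjacency list:
    print(adjacency_list["a"])  # ['b', 'c']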

Sep 04, 2019 · 25 min

S7:E6 - "It's laughable how easy it is to get graphical"

Graphs come from mathematics, and are nothing more than a way to formally represent a network, which is a collection of objects that are all interconnected (this is all stuff you should already know if you have been religiously listening to this podcast, which you should be). Now we're going from theory to practice and talking about how to represent graphs. Based on Vaidehi Joshi's blog post, "From Theory To Practice: Representing Graphs".

Aug 28, 2019 · 21 min

S7:E5 - "To b-tree or not to b-tree"

Last episode, we talked about 2-3 trees, where the nodes of every tree contain data in the form of keys, as well as potential child nodes, and can contain more than one key. This takes us to b-trees, which are a generalized version of the 2-3 tree and are super efficient for storing data in an indexed database, like MySQL. Based on Vaidehi Joshi's blog post, "Busying Oneself With B-Trees".

Aug 21, 2019 · 17 min

S7:E4 - "A 2-3 tree for you and me"

We continue our discussion of tree data structures with 2-3 trees, where the nodes of every tree contain data in the form of keys, as well as potential child nodes. Not only that, but it can contain MORE THAN ONE KEY. They are also the -key- to what we'll be talking about next episode, B-trees, and you won't tree-lieve how cool those are. Based on Vaidehi Joshi's blog post, "Busying Oneself With B-Trees".
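As a bare-bones sketch of what that looks like in code (the class name and sample keys are my own, not the episode's), a 2-3 tree node carries one or two keys and, if it is an internal node, one more child than it has keys:

    class TwoThreeNode:
        def __init__(self, keys, children=None):
            self.keys = keys                  # one key [8] or two keys [8, 15]
            self.children = children or []    # 0, 2, or 3 child nodes

    # A tiny 2-3 tree: the root holds two keys, so it has three children.
    root = TwoThreeNode(
        keys=[8, 15],
        children=[
            TwoThreeNode([3, 5]),    # everything less than 8
            TwoThreeNode([10]),      # between 8 and 15
            TwoThreeNode([20, 30]),  # greater than 15
        ],
    )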

Aug 14, 2019 · 20 min

S7:E3 - "Color me logarithmic!"

In this episode, we are looking at a different type of self-balancing tree: red-black trees. By following four very important rules while we paint our tree red and black, we can make it not only self-balancing, but also make it run super efficiently in logarithmic time. Based on Vaidehi Joshi's blog post, "Painting Nodes Black With Red-Black Trees".

Aug 07, 2019 · 23 min

S7:E2 - "Stay gold, AVL tree, stay gold"

Last episode, we learned about AVL trees, a type of self-balancing binary search tree that follows a golden rule: no single leaf in the tree should have a significantly longer path from the root node than any other leaf on the tree. In this episode, we learn about a pattern that we can use to programmatically figure out the minimum number of nodes we’ll need to create any given height-balanced AVL tree, which leads us to the Fibonacci sequence, and relates to the "golden ratio" you might know ab...
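The pattern itself is short enough to sketch (a minimal illustration, not the episode's code): the sparsest AVL tree of a given height is a root plus the sparsest trees of the two next-smaller heights, which is exactly the Fibonacci shape:

    def min_avl_nodes(height):
        """Fewest nodes a height-balanced AVL tree of this height can have:
        one root, plus a minimal subtree of height h-1 and one of height h-2."""
        if height == 0:
            return 1          # a single node
        if height == 1:
            return 2          # a root and one child
        return 1 + min_avl_nodes(height - 1) + min_avl_nodes(height - 2)

    print([min_avl_nodes(h) for h in range(7)])  # [1, 2, 4, 7, 12, 20, 33]
    # Add one to each count and the Fibonacci numbers 2, 3, 5, 8, 13, 21, 34 appear.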

Jul 31, 2019 · 18 min

S7:E1 - "The AVL balancing act"

When you're dealing with data structures like trees, the balance of its "leaves" (data/nodes) matters. The moment a tree becomes unbalanced, it loses its efficiency, much like a real life tree bending to the weight of one side, unable to efficiently stand tall and grab the light of the sun. Don't let your garden grow full of lopsided saplings, and make sure to plant some AVL trees--your efficient runtime hangs in the balance. Based on Vaidehi Joshi's blog post, "The Little AVL Tree That Could".

Jul 24, 2019 · 23 min

S6:E8 - "Meet our good friend PATRICIA"

In this episode, we continue our talk on Radix Trees and introduce the Practical Algorithm To Retrieve Information Coded In Alphanumeric trees, also known as PATRICIA trees. Yeah, I think we'll just stick with calling them PATRICIA trees. Based on Vaidehi Joshi's blog post, "Compressing Radix Trees Without (Too Many) Tears".

Jun 19, 2019 · 27 min

S6:E7 - "The cannibalistic efficiency of radix trees"

In this episode, join us as we adventure into the safari that is radix trees, where parent nodes eat their offspring nodes as they chomp them down and compress. Don't worry, with all of this new added space in the trie(b), they'll more efficiently keep their children's memory alive. Based on Vaidehi Joshi's blog post, "Compressing Radix Trees Without (Too Many) Tears".
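A rough sketch of that chomping, using plain nested dicts with a "$" end-of-word marker (my own toy representation, not the blog post's): any chain of single-child nodes is merged into one edge labeled with the whole substring:

    def build_trie(words):
        """An ordinary character-by-character trie as nested dicts."""
        root = {}
        for word in words:
            node = root
            for letter in word:
                node = node.setdefault(letter, {})
            node["$"] = {}   # end-of-word marker
        return root

    def compress(trie):
        """Collapse single-child chains so each edge holds a whole substring,
        the key idea behind a radix (compressed) trie."""
        compressed = {}
        for label, child in trie.items():
            # Keep absorbing the child while there is exactly one way forward.
            while len(child) == 1 and "$" not in child:
                next_label, child = next(iter(child.items()))
                label += next_label
            compressed[label] = compress(child)
        return compressed

    print(compress(build_trie(["dog", "dogs", "doll"])))
    # {'do': {'g': {'$': {}, 's': {'$': {}}}, 'll': {'$': {}}}}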

Jun 12, 2019 · 23 min

S6:E6 - "Dear tries, you (auto)complete me"

In this episode we continue our talk on pies and tries, and how this data structure is used to power such things as auto-complete! Based on Vaidehi Joshi's blog post, "Trying to Understand Tries".
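Here is a small sketch of how a trie can back auto-complete (the class names and sample words are mine, not the episode's): walk down to the node for the typed prefix, then gather every word that hangs below it:

    class TrieNode:
        def __init__(self):
            self.children = {}     # letter -> TrieNode
            self.is_word = False

    class Trie:
        def __init__(self):
            self.root = TrieNode()

        def insert(self, word):
            node = self.root
            for letter in word:
                node = node.children.setdefault(letter, TrieNode())
            node.is_word = True

        def autocomplete(self, prefix):
            # Walk down to the prefix's node, then collect every word below it.
            node = self.root
            for letter in prefix:
                if letter not in node.children:
                    return []
                node = node.children[letter]
            results = []

            def collect(node, so_far):
                if node.is_word:
                    results.append(so_far)
                for letter, child in node.children.items():
                    collect(child, so_far + letter)

            collect(node, prefix)
            return results

    trie = Trie()
    for word in ["pie", "pier", "pies", "pin", "podcast"]:
        trie.insert(word)
    print(trie.autocomplete("pi"))  # ['pie', 'pier', 'pies', 'pin']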

Jun 05, 2019 · 23 min

S6:E4 - "Radix sort: the patient zero of sorting algorithms "

This episode we're diving into radix sort! The word has no relation to Raid, so it is definitely non-toxic and you don't have to bug out. It IS, however, a great integer sorting algorithm, and the first one at that! Based on Vaidehi Joshi's blog post, "Getting To The Root Of Sorting With Radix Sort".
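A compact sketch of the least-significant-digit version (not the blog post's code): bucket the numbers by one digit at a time, from the ones place upward, and they come out sorted:

    def radix_sort(numbers):
        """LSD radix sort for non-negative integers."""
        if not numbers:
            return numbers
        place = 1
        while place <= max(numbers):
            buckets = [[] for _ in range(10)]      # one bucket per digit 0-9
            for number in numbers:
                buckets[(number // place) % 10].append(number)
            numbers = [n for bucket in buckets for n in bucket]
            place *= 10
        return numbers

    print(radix_sort([170, 45, 75, 90, 802, 24, 2, 66]))
    # [2, 24, 45, 66, 75, 90, 170, 802]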

May 22, 2019 · 27 min

S6:E3 - "You can count on counting sort"

You may have noticed that it's really hard to sort things efficiently. Well, that's where counting sort comes in! Based on Vaidehi Joshi's blog post, "Counting Linearly With Counting Sort".
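A minimal sketch of the idea (not code from the episode): tally how many times each value appears, then read the tallies back out in order, with no comparisons at all:

    def counting_sort(numbers, max_value):
        """Counting sort for small non-negative integers: linear time,
        as long as the range of possible values stays small."""
        counts = [0] * (max_value + 1)
        for number in numbers:
            counts[number] += 1
        result = []
        for value, count in enumerate(counts):
            result.extend([value] * count)
        return result

    print(counting_sort([4, 2, 2, 8, 3, 3, 1], max_value=8))
    # [1, 2, 2, 3, 3, 4, 8]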

May 15, 2019 · 30 min