I would like to thank Professor Sariel Har-Peled, on whose lectures I've based much of my own content. My lectures are updated every semester to emphasize what I find important and to match my voice. Though they have diverged from his, Prof. Har-Peled's lectures are also great.
Also, quick hint: icons are links to slides and video content (and the "Read Me" link leads to some quick notes)! By popular demand, we are also including links to Sumedh's (the true OG) discussion recordings!
This is the first lecture of the course. We’ll discuss the course policies and why we model problems as languages.
We’ll discuss regular languages and representing them as regular expressions (the OG RegEx).
Introduction to deterministic finite automata. We’ll also discuss how to use DFAs to prove closure properties.
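For the curious, here's a minimal sketch of simulating a DFA in Python. The transition table and the "even number of 1s" language are just hypothetical examples for illustration, not the ones from lecture.

```python
def run_dfa(delta, start, accepting, word):
    """Simulate a DFA given as a transition dict delta[(state, symbol)] -> state.
    Returns True iff the DFA ends in an accepting state after reading word."""
    state = start
    for symbol in word:
        state = delta[(state, symbol)]
    return state in accepting

# Hypothetical example: strings over {0, 1} with an even number of 1s.
delta = {("even", "0"): "even", ("even", "1"): "odd",
         ("odd", "0"): "odd", ("odd", "1"): "even"}
print(run_dfa(delta, "even", {"even"}, "10110"))  # False: the word has three 1s
```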
Introduction to non-determinism. We’ll talk about how it applies to automata and formalize NFA concepts/definitions.
DFAs, NFAs, and RegExes can be converted from one form to another using the simple methods discussed in this lecture.
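As a taste of one direction, here's a rough sketch of the subset construction (NFA to DFA), ignoring epsilon-transitions for simplicity; the exact construction presented in lecture may differ in its details.

```python
from itertools import chain

def nfa_to_dfa(delta, start, accepting, alphabet):
    """Subset construction. delta[(state, symbol)] is a set of NFA states
    (no epsilon moves, for simplicity). Each DFA state is a frozenset of NFA states.
    Returns (dfa_transitions, dfa_start, dfa_accepting)."""
    dfa_start = frozenset([start])
    transitions, dfa_accepting = {}, set()
    todo, seen = [dfa_start], {dfa_start}
    while todo:
        current = todo.pop()
        if current & set(accepting):          # accept if any NFA state accepts
            dfa_accepting.add(current)
        for symbol in alphabet:
            nxt = frozenset(chain.from_iterable(
                delta.get((q, symbol), set()) for q in current))
            transitions[(current, symbol)] = nxt
            if nxt not in seen:
                seen.add(nxt)
                todo.append(nxt)
    return transitions, dfa_start, dfa_accepting
```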
Now that we’ve exhausted regular languages, it’s time to move on to larger complexity classes. But first, how do we tell if a language isn’t regular?
Now that we know of many non-regular languages, it is time to investigate the next computability class: context-free languages.
We’ll skip context-sensitive languages and go straight for Turing-recognizable languages and the most important machine in modern computing: the Turing machine.
The goal of this lecture is to have you internalize one truth: that there is a universal Turing machine that can simulate any other Turing machine.
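To make this concrete, here's a minimal sketch of a single-tape Turing machine simulator. Any fixed program like this, which takes an arbitrary transition table as input, is playing the role of a universal machine in spirit; the encoding here is my own, not the one from lecture.

```python
def simulate_tm(delta, start, accept, reject, word, blank="_"):
    """Simulate a one-tape TM. delta[(state, symbol)] = (new_state, write, move),
    with move in {-1, +1}. Returns True iff the machine accepts.
    (Loops forever if the machine never halts, as it must!)"""
    tape = dict(enumerate(word))      # sparse tape: position -> symbol
    state, head = start, 0
    while state not in (accept, reject):
        symbol = tape.get(head, blank)
        if (state, symbol) not in delta:
            return False              # no transition defined: halt and reject
        state, write, move = delta[(state, symbol)]
        tape[head] = write
        head += move
    return state == accept
```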
We’ll begin the algorithms portion of our course with a quick recap of the most fundamental algorithmic technique: recursion. We’ll also briefly go over solving recurrences.
The next step up from recursion is divide-and-conquer algorithms. The main objective is to understand linear-time selection!
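As a preview, here's a rough sketch of median-of-medians selection, the classic deterministic linear-time selection algorithm; details like group size and tie handling may differ from the lecture's presentation.

```python
def select(arr, k):
    """Return the k-th smallest element (0-indexed) of arr, deterministic linear time."""
    if len(arr) <= 5:
        return sorted(arr)[k]
    # Split into groups of 5, take the median of each group, then recurse
    # to find the median of those medians; use it as the pivot.
    medians = [sorted(arr[i:i + 5])[len(arr[i:i + 5]) // 2]
               for i in range(0, len(arr), 5)]
    pivot = select(medians, len(medians) // 2)
    less = [x for x in arr if x < pivot]
    equal = [x for x in arr if x == pivot]
    greater = [x for x in arr if x > pivot]
    if k < len(less):
        return select(less, k)
    elif k < len(less) + len(equal):
        return pivot
    else:
        return select(greater, k - len(less) - len(equal))
```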
It is time to improve your recursive algorithms by systematically exploring candidate solutions, a.k.a. backtracking. We’ll also introduce the longest increasing subsequence (LIS) problem.
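Here's a minimal backtracking sketch for LIS: at each index we try both skipping the element and (when legal) taking it. It's exponential on purpose; speeding it up is the next lecture's job. This is one of several equivalent formulations, not necessarily the one from lecture.

```python
def lis(seq):
    """Length of the longest increasing subsequence of seq, via backtracking."""
    def longest(i, prev):
        # i: current index; prev: value of the last element already taken
        if i == len(seq):
            return 0
        best = longest(i + 1, prev)                     # option 1: skip seq[i]
        if seq[i] > prev:                               # option 2: take seq[i]
            best = max(best, 1 + longest(i + 1, seq[i]))
        return best
    return longest(0, float("-inf"))
```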
This is an important one … even though you probably won’t see many instances of dynamic programming in the real world, it is a favorite of FAANG interviews.
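As one small illustration, here's the LIS backtracking from last lecture memoized on (current index, index of the last element taken), which cuts it down to O(n²) states. This is just a sketch, not necessarily the formulation used in lecture.

```python
from functools import lru_cache

def lis_dp(seq):
    """Length of the longest increasing subsequence, with memoization."""
    @lru_cache(maxsize=None)
    def longest(i, prev_idx):
        # prev_idx == -1 means no element has been taken yet.
        if i == len(seq):
            return 0
        best = longest(i + 1, prev_idx)                  # skip seq[i]
        if prev_idx == -1 or seq[i] > seq[prev_idx]:     # take seq[i]
            best = max(best, 1 + longest(i + 1, i))
        return best
    return longest(0, -1)
```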
We will continue our discussion of dynamic programming with the edit distance and common subsequence problems. We’ll also discuss the general formula for dynamic programming problems.
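For reference, a minimal bottom-up sketch of edit distance with unit costs; the table orientation and cost model in lecture may differ.

```python
def edit_distance(a, b):
    """Minimum number of insertions, deletions, and substitutions turning a into b."""
    m, n = len(a), len(b)
    # dist[i][j] = edit distance between a[:i] and b[:j]
    dist = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dist[i][0] = i                      # delete everything
    for j in range(n + 1):
        dist[0][j] = j                      # insert everything
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dist[i][j] = min(dist[i - 1][j] + 1,         # delete a[i-1]
                             dist[i][j - 1] + 1,         # insert b[j-1]
                             dist[i - 1][j - 1] + cost)  # match / substitute
    return dist[m][n]
```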
First lecture on graph algorithms. We’ll discuss basic search and introduce the concept of connected components.
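A minimal sketch of finding connected components with BFS, assuming the (undirected) graph is handed to us as an adjacency dict with an entry for every vertex; that representation is my choice here, not necessarily the lecture's.

```python
from collections import deque

def connected_components(adj):
    """adj: dict mapping every vertex to a list of its neighbours.
    Returns a list of components, each a set of vertices, found via BFS."""
    seen, components = set(), []
    for start in adj:
        if start in seen:
            continue
        comp, queue = set(), deque([start])
        seen.add(start)
        while queue:
            u = queue.popleft()
            comp.add(u)
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
        components.append(comp)
    return components
```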
We’ll continue our graph algorithms discussion with directed acyclic graphs (DAGs), which have quite a few properties we can exploit.
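One such property: every DAG has a topological order. Here's a quick DFS-based sketch, again assuming an adjacency-dict representation; it's illustrative only (recursion depth could be an issue on huge graphs).

```python
def topological_sort(adj):
    """adj: dict vertex -> list of out-neighbours of a DAG.
    Returns the vertices in topological order (reverse DFS post-order)."""
    order, visited = [], set()
    def visit(u):
        visited.add(u)
        for v in adj.get(u, []):
            if v not in visited:
                visit(v)
        order.append(u)          # post-order: u goes after everything it can reach
    for u in adj:
        if u not in visited:
            visit(u)
    return order[::-1]
```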
Pretty much the de facto graph problem: we’ll discuss simple shortest-path algorithms, including Dijkstra’s algorithm.
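A minimal heap-based sketch of Dijkstra's algorithm, assuming non-negative edge weights and an adjacency-dict representation; the "lazy deletion" of stale heap entries is just one of several possible implementations.

```python
import heapq

def dijkstra(adj, source):
    """adj: dict u -> list of (v, weight) pairs with non-negative weights.
    Returns a dict of shortest-path distances from source."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                              # stale heap entry, skip it
        for v, w in adj.get(u, []):
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return dist
```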
We’ll continue our discussion of shortest path algorithms with two algorithms which use DP principles as well.
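One classic example of a shortest-path algorithm built on DP-style edge relaxation is Bellman-Ford; whether it's one of the two covered here is for lecture to reveal, but a minimal sketch looks like this.

```python
def bellman_ford(edges, n, source):
    """edges: list of (u, v, w) for a directed graph on vertices 0..n-1; w may be negative.
    Returns shortest-path distances from source, or None if a negative cycle is reachable."""
    INF = float("inf")
    dist = [INF] * n
    dist[source] = 0
    for _ in range(n - 1):              # relax every edge n-1 times
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    for u, v, w in edges:               # one more pass detects negative cycles
        if dist[u] + w < dist[v]:
            return None
    return dist
```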
Lastly, we’ll end the algorithms portion of the course with a discussion on minimum spanning tree algorithms.
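As a teaser, here's a minimal sketch of Kruskal's algorithm with a tiny union-find; Prim's and Borůvka's are other standard options, and the lecture may present any of them.

```python
def kruskal(n, edges):
    """edges: list of (weight, u, v) on vertices 0..n-1.
    Returns the edges of a minimum spanning tree (assuming the graph is connected)."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    mst = []
    for w, u, v in sorted(edges):           # consider edges in increasing weight order
        ru, rv = find(u), find(v)
        if ru != rv:                        # only add edges that join two components
            parent[ru] = rv
            mst.append((u, v, w))
    return mst
```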
We’ll begin this section of the course with a discussion of reductions, what various reductions imply and why they are useful. The SAT problem will be introduced as well.
We now delve into the hard (NP-hard) problems and investigate how to prove whether a specific problem is in NP, is NP-hard, etc.
We’ll discuss more NP-complete problems/reductions and specifically focus on reductions requiring gadgets.
Stepping away from algorithms, it is time to ask what problems are not decidable and cannot be solved by a Turing machine. We’ll introduce the halting problem.
We’ll go over some more undecidable problems and how to prove their (un-)decidability. We’ll use our familiar reduction method in novel ways.
Let’s take a breather before the third midterm and simply go over some lingering questions and/or some practice problems. We’ll also begin with a brief overview of classic decidability problems.
Last lecture. We’ll go over all the concepts discussed in the course using some simple practice problems. Special emphasis on topics not covered on the other exams (MSTs and TMs).
Some quick problems involving recursive definitions, what a language is, and how to describe it.
Some quick problems involving recursive definitions and regular expressions.
Going over how to construct DFAs and how to create formal definitions of other automata.
Now that NFAs are understood, we can easily prove regular closure for a variety of operations.
This lab will help you understand that RegEx’s, DFAs and NFAs all represent the same languages.
Not all languages are regular. We’ll practice a simple technique for proving this.
The next computability class: context-free languages. We’ll discuss context-free grammars, push-down automata, and how they relate to each other.
The most complex computing machine we know of: we’ll go through examples of Turing machines and how they can be used to recognize languages.
We’ll start our investigation of recursion through binary search and the many variations of the binary search problem.
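The plain-vanilla version, for reference; the lab's variations all start from something like this minimal sketch.

```python
def binary_search(arr, target):
    """Return an index of target in the sorted list arr, or -1 if it is absent."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1          # target can only be in the right half
        else:
            hi = mid - 1          # target can only be in the left half
    return -1
```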
We’ll discuss one of the OG divide and conquer algorithms: Karatsuba’s algorithm.
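A minimal sketch of Karatsuba multiplication on Python integers, splitting on bit length; presentations often split on decimal digits instead, but the idea (three half-size multiplications instead of four) is the same.

```python
def karatsuba(x, y):
    """Multiply non-negative integers x and y using three recursive half-size products."""
    if x < 10 or y < 10:
        return x * y
    m = max(x.bit_length(), y.bit_length()) // 2
    xh, xl = x >> m, x & ((1 << m) - 1)      # split x = xh * 2^m + xl
    yh, yl = y >> m, y & ((1 << m) - 1)      # split y = yh * 2^m + yl
    a = karatsuba(xh, yh)
    b = karatsuba(xl, yl)
    c = karatsuba(xh + xl, yh + yl) - a - b  # middle term from one extra product
    return (a << (2 * m)) + (c << m) + b
```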
We’ll continue our work on recursion by practicing how to use backtracking to create optimal recursions. Special emphasis on writing recurrences.
Now that we’ve formulated our solutions as recurrences, let’s turn them into efficient algorithms.
Second dynamic programming lab. We take classic DP problems and explain them in a new way.
Our first graph algorithms lab, on formulating logic puzzles as graph problems that can be solved using simple search algos.
A second graph algorithms lab focusing on formulating problems as graphs and using BFS/DFS to solve them.
Having learned our first shortest path algorithm (Dijkstra’s), we’ll discuss the shortest path problem in a variety of contexts.
A few more shortest path problems that include negative edges and cycles, potentially requiring algorithms other than Dijkstra’s.
The OG graph problem: we’ll take a moment to discuss minimum spanning tree problems.
We begin with a discussion of reductions and show how one can solve novel problems using known solutions from standard problems.
We explore a few simple reductions to prove that a problem is NP-hard. Special emphasis on the SAT problem.
The second lab on using reductions to show NP-hardness. Emphasis will be placed on gadget-based reductions.
First lab on decidability. We’ll be testing the waters on some simple problems provable using the standard reduction template.
The second lab on decidability. We’ll prove some other languages are undecidable using slightly more complex proof structures.