2020
Coursework
I spent January to April finishing up my undergraduate degree in CS and Combinatorics and Optimization at UWaterloo. Since I had only two easy required courses left to take, I spent most of the term absorbing as much combinatorics knowledge as I could:
- Random Graph Theory (CO 749)
We worked through some major results and built up probability tools along the way as needed. The key highlight for me was probably a more thorough understanding of various concentration results (one representative bound is written out after this list). I gave up on LaTeXing my notes since the calculations were quite gruesome, but I have some hybrid typed/scanned notes on my notes site. The main reference was Bollobás’ book, but its pace was a bit too fast for me to follow, so I also referenced Frieze and Karoński’s book.
- Graph Colourings (CO 749)
We went through a lot of the cutting edge (as of the time this was written) in graph colourings. To be very honest, a lot of the content went over my head. At the very least, though, I walked away with a much-expanded view of graph theory and, as a bonus, picked up a couple of tools, methods, and concepts along the way. Overall, everything covered in this course was super interesting, and I definitely gained a better appreciation of some of the deeper results. Some of the references we went through can be found in my scans.
- Combinatorial Design (CO 434)
And now back to courses I actually understood a lot of. This course was the wildest mishmash of linear algebra, abstract algebra, number theory, and combinatorics I’ve ever seen. It was so nice to see how seemingly different combinatorial problems could be recast as one another. I don’t doubt that there were even deeper connections that I missed, so I’ll definitely be revisiting this material in the near future. I started typing up my notes; the main reference for this course was Stinson’s Combinatorial Designs: Constructions and Analysis.
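As an aside on the concentration results mentioned under the random graphs course: one standard bound of that flavour is the two-sided multiplicative Chernoff bound for a sum of independent Bernoulli random variables, recorded here for reference (this is a generic statement of the bound, not something lifted from the course notes).

```latex
% X = X_1 + ... + X_n, with the X_i independent Bernoulli and \mu = E[X].
% Then for any 0 < \delta < 1:
\[
  \Pr\bigl[\,\lvert X - \mu \rvert \ge \delta\mu\,\bigr]
  \;\le\; 2\exp\!\left(-\frac{\delta^{2}\mu}{3}\right).
\]
```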
And since I’m also a CS student, I did actually take some CS courses:
- Introduction to Symbolic Computation (CS 487)
This course was the perfect balance between mathematics and computer science. Some of the proofs were glossed over (which was actually reasonable since an algebra course was not a prereq), but the textbook, Modern Computer Algebra, 3rd Edition (von zur Gathen, Gerhard), went through all the gory details, so I was content. We pretty much went through all of Parts I and II plus Chapter 14, for which I typed up partial notes. I wrote up a project surveying primality testing methods, which I may put up at a later date (a sketch of one classic test appears after this list). Quite a fun project: it pretty much covered all of Chapter 18 in the textbook, but I also had to hunt around for references to other algorithms.
- Models of Computation (CS 365)
Took a deep dive into most of Part I (Automata theory) and Part II (Computability theory) of Sipser’s Introduction to the Theory of Computation, 3rd edition. We had meant to do more: the plan was to spend 3-4 weeks on Part III (Complexity theory), but unfortunately, due to COVID-19, we only got through Chapter 7 (Time complexity). The complexity portion was the more interesting part for me, so my plan is to go through Chapters 8 and 9 on my own and then pick up Arora and Barak’s Computational Complexity: A Modern Approach and work through that.
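As a small illustration of the primality-testing topic mentioned above: here’s a minimal sketch of the Miller-Rabin probabilistic test, written from memory rather than pulled from the actual project write-up, so the structure and names are my own.

```python
import random

def is_probable_prime(n, rounds=20):
    """Miller-Rabin probabilistic primality test (sketch).

    Returns False if n is definitely composite, and True if n is probably
    prime (error probability at most 4**(-rounds)).
    """
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # Write n - 1 = 2^s * d with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)              # a^d mod n
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False              # a witnesses that n is composite
    return True

print(is_probable_prime(2**61 - 1))   # True: 2^61 - 1 is a Mersenne prime
print(is_probable_prime(561))         # False: 561 (a Carmichael number) is composite
```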
Algorithms
- Introduction to Algorithms (MIT 6.006, Fall 2011) [Course Site] [Lectures]
The material taught here was review, but I wanted to make sure I wasn’t missing anything before moving on to the follow-up course, 6.046J. I basically just sped through the lecture videos and sketched answers to their problem sets (minus the coding portions).
- Design and Analysis of Algorithms (MIT 6.046J, Spring 2015) [Course Site] [Lectures]
Mostly review as well. I hadn’t seen any of the distributed content or the cache-oblivious content, so that was nice. I’m surprised they jam-pack such a large breadth of content into the latter half of the course. I guess from the perspective of a student this is an awesome teaser for algorithms in different areas. I loved the Network Flow Theory (CO 351) course at UWaterloo, so it was great to see them going over the max-flow min-cut theorem.
It was interesting to see a lot of familiar-looking questions on the problem sets. I guess a lot of these are your bread-and-butter extensions of the core material taught in lectures. The problem sets reminded me a lot (maybe too much) of when I would sit in my chair for hours on end trying to solve such problems for the first time.
- A Second Course in Algorithms (Stanford CS261, Winter 2016) [Course Site] [Lectures]
Parts III and IV were all content I hadn’t seen before. The problem sets and exercises are extremely interesting, and I have to admit that some of the problems are genuinely tough. As of the time of writing, there are still a couple of problems I don’t yet have solutions to.
The main takeaways for me were the brief introduction to online algorithms and the emphasis on primal-dual methods for algorithm design in the latter half of the course. I had seen primal-dual algorithms before, but these lectures really made them click (a small primal-dual example is sketched just after this list).
Looking ahead, I kind of want to learn more techniques and methods. I have my eye on Online Algorithms: The State of the Art by Fiat and Woeginger. I’ll, of course, update this site if I continue down that route. The graduate course CS266 was also mentioned in this course, but unfortunately they don’t seem to have course notes or videos. I may flip through Parameterized Algorithms by Cygan et al.
- I/O Efficient Algorithms (Coursera) [Course Site]
I was motivated to learn more about cache-oblivious algorithms by the teaser at the end of 6.046J. The course helped solidify some concepts, and I also picked up some cool new ones, such as time-forward processing and buffer trees. Check out my certificate for this course here!
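To make the primal-dual remark under CS261 a bit more concrete, here’s a tiny sketch of the classic primal-dual 2-approximation for weighted vertex cover. This is a textbook algorithm rather than anything specific to the course materials, and the input format and names below are my own.

```python
def primal_dual_vertex_cover(edges, weight):
    """Primal-dual 2-approximation for weighted vertex cover (sketch).

    edges  : iterable of (u, v) pairs
    weight : dict mapping each vertex to a nonnegative weight

    For each uncovered edge, raise its dual variable y_e until some endpoint's
    constraint sum(y_e over incident edges) <= weight[v] becomes tight; tight
    vertices enter the cover.  LP duality gives the factor-2 guarantee.
    """
    slack = dict(weight)                    # remaining slack of each vertex constraint
    cover = set()
    for (u, v) in edges:
        if u in cover or v in cover:
            continue                        # edge already covered
        raise_by = min(slack[u], slack[v])  # largest feasible increase of y_(u,v)
        slack[u] -= raise_by
        slack[v] -= raise_by
        for w in (u, v):
            if slack[w] == 0:               # constraint is tight: take the vertex
                cover.add(w)
    return cover

# Tiny example: a path a-b-c where the middle vertex is cheap.
print(primal_dual_vertex_cover([("a", "b"), ("b", "c")],
                               {"a": 3, "b": 1, "c": 3}))  # {'b'}
```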
Deep Learning
Projects
- Similar Images Purger - The project that kickstarted my journey of picking up deep learning. My goal was to remove similar-looking images from my massive photo album (we actually consider the dual problem of producing a representative set of images from the album instead). To that end, I dipped my toes into TuriCreate’s image similarity model and played around with the clustering algorithms available in scikit-learn (a rough sketch of the idea follows this list).
- IZ*Net - This project came after I took the CNN course on Coursera (see below), so I knew a tad more about deep learning. As such, this is a more hands-on project where I built CNN models from scratch to create a face recognition model and also coded up a simplified version of the YOLO algorithm for face detection.
- ARAMNet - A portion of a larger project I spent some time on, trying to solve a toy probability problem inspired by ARAM, a game mode in the popular game League of Legends. As part of our work, we created an RNN to help us perform simulations and obtain empirical evidence for our toy problem.
Deep Learning Specialization (Coursera)
I successfully completed the entire Deep Learning specialization! (Certificate can be found here!) It was quite the journey: I learned a lot of cool things and got a taste of various architectures and problems in this field. Andrew Ng did a wonderful job of simplifying the content and picking out the important bits for discussion during the lectures. That said, although we covered a huge breadth of material, I think the next goal for me is to go into a bit more depth on particular topics. I’ll probably continue expanding my knowledge in this field; I have a couple of other courses and textbooks I’ll probably give a shot in the near future.
Generative Adversarial Networks (Coursera)
Successfully completed another one of Deeplearning.ai’s specializations. This one was pretty digestible after tackling the Deep Learning specialization. We went over several state-of-the-art models/architectures, including StyleGANs, Pix2Pix, CycleGANs, etc. The lectures were pretty useful for gaining a high-level overview, which made going over the respective papers much easier. The coding assignments helped to an extent; ultimately it was just filling in portions of their unfinished code, but it was still nice to be able to quickly see a working product. I did end up picking up PyTorch as a result of this specialization and enjoy using it alongside TensorFlow. (I don’t really have a preference yet: TensorFlow makes it a bit easier to create models without having to explicitly care about input/output sizes, but PyTorch feels much more Pythonic.) I had a couple of project ideas that would require fiddling around with GANs, but I may or may not ever get to them. Nonetheless, a lot of the techniques used with these models (e.g. pairing up the networks in a cycle, W-Loss, etc.) seem pretty useful outside the GAN context, so I’ll definitely be keeping those in mind for the future.
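Since W-Loss comes up above: stripped of everything else, the Wasserstein loss is just a couple of means over the critic’s scores. This is my own minimal PyTorch sketch from memory, not the specialization’s assignment code, and it omits whatever is used to enforce the critic’s Lipschitz constraint (weight clipping or a gradient penalty in the usual WGAN variants).

```python
import torch

def critic_wloss(real_scores, fake_scores):
    # The critic wants real samples scored high and fakes scored low.
    return fake_scores.mean() - real_scores.mean()

def generator_wloss(fake_scores):
    # The generator wants the critic to score its fakes highly.
    return -fake_scores.mean()

# Toy usage with random stand-ins for the critic's outputs.
real_scores = torch.randn(8, 1)
fake_scores = torch.randn(8, 1)
print(critic_wloss(real_scores, fake_scores).item(),
      generator_wloss(fake_scores).item())
```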
Lean
Went through the Natural Number Game (find my solutions here). I had been meaning to learn something in theorem prover/type theory land for a while, so this is finally a start. Overall this was pretty fun, and some of the puzzles actually took me a while to solve. I want to continue this journey by going through the official docs and maybe dabbling in Lean 4. I don’t have a real goal or direction I’m going to take with this knowledge yet, so we’ll see what happens as I continue learning more of this stuff.
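For flavour, here’s the kind of puzzle the Natural Number Game asks you to solve; this one is an early Addition World level, written in the game’s Lean 3 style with its mynat type (my actual solutions live in the repo linked above).

```lean
-- Addition World: 0 + n = n, by induction on n, using only
-- add_zero (a + 0 = a) and add_succ (a + succ b = succ (a + b)).
lemma zero_add (n : mynat) : 0 + n = n :=
begin
  induction n with d hd,
  { rw add_zero },
  { rw add_succ,
    rw hd },
end
```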
MIT 18.S191 (Introduction to Computational Thinking)
Went through roughly two-thirds of the course with three goals in mind:
- Learn Julia
- Learn about epidemic modelling
- Learn about raytracing
You can check out the stuff I fiddled around with here. The course is pretty cool and I probably would’ve really enjoyed it as a student. I’ve taken a similar course at UWaterloo (CS370/371) and our assignments were super dry compared to this. (Probably comparing apples to oranges here since this course is new and our course is old as heck.)
I had heard about Julia before the course, so being able to fiddle around with it while working through the assignments was super fun. I liked it quite a bit. Its expressiveness is pretty much the same as Python’s, but the types make it a lot less of a pain to work with. Native support for broadcasting and vector operations is also really nice. All this said, I guess I should have expected this, since I’m comparing an actual scientific language to a general-purpose language that was forced into scientific applications. One complaint I might have is that Pluto was a bit annoying on Windows, especially since I couldn’t stop a cell (more than once I accidentally set off an infinite loop). But the UI was nice compared to Jupyter.
Overall, though, getting a nice introduction to Julia and building up the SIR model and a simple raytracer was super educational, and I quite enjoyed it all!
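For my own reference, the discrete SIR update at the heart of the epidemic-modelling portion boils down to a few lines. This is a quick Python transcription of the standard update rule with the usual infection/recovery rates beta and gamma, not the course’s Julia notebooks.

```python
def sir_step(s, i, r, beta=0.3, gamma=0.1):
    """One step of a simple discrete SIR model (s, i, r are population fractions)."""
    new_infections = beta * s * i
    new_recoveries = gamma * i
    return (s - new_infections,
            i + new_infections - new_recoveries,
            r + new_recoveries)

# Simulate a small outbreak starting from 1% infected.
s, i, r = 0.99, 0.01, 0.0
for _ in range(160):
    s, i, r = sir_step(s, i, r)
print(round(s, 3), round(i, 3), round(r, 3))  # most of the population ends up recovered
```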
Computer Networking
Finished up to Chapter 4 of Computer Networking: A Top-Down Approach. I reprioritized the things I wanted to learn, so this sat on the back burner for most of the year.
2021
Nothing yet! Happy New Year!
(Last updated Jan. 1st)