2020

Coursework

I spent January to April finishing up my undergraduate degree in CS and Combinatorics and Optimization at UWaterloo. Since I only had two easy required courses left to take, I spent most of the term absorbing as much combinatorics knowledge as I could:

And since I’m also a CS student, I did actually take some CS courses:

  1. Algorithms
  2. Deep Learning

Projects

Deep Learning Specialization (Coursera)

I successfully completed the entire Deep Learning specialization! (Certificate can be found here!) It was quite the journey: I learned a lot of cool things and got a taste of the various architectures and problems in this field. Andrew Ng did a wonderful job of simplifying the content and pulling out the important bits for discussion during the lectures. That said, although we covered a huge breadth of material, I think the next goal for me is to go into more depth on particular topics. I'll keep expanding my knowledge in this field; I have a couple of other courses and textbooks I'll probably give a shot in the near future.

Generative Adversarial Networks (Coursera)

Successfully completed another one of Deeplearning.ai's specializations. This one was pretty digestible after tackling the Deep Learning specialization. We went over several state-of-the-art models/architectures, including StyleGANs, Pix2Pix, CycleGANs, etc. The lectures were pretty useful for gaining a high-level overview, which made going over their respective papers much easier. The coding assignments helped to an extent; ultimately it was just filling in portions of their unfinished code, but it was still nice to be able to quickly see a working product.

I did end up picking up PyTorch as a result of this specialization and enjoy using it alongside TensorFlow. (I don't really have a preference yet; TensorFlow makes it a bit easier to create models without having to explicitly care about input/output sizes, but PyTorch feels much more Pythonic.) I had a couple of project ideas that would require fiddling around with GANs, but I may or may not ever get to them. Nonetheless, a lot of the techniques used with these models (e.g. pairing up the networks in a cycle, W-Loss, etc.) seem pretty useful outside of the GAN context, so I'll definitely be keeping those in mind for the future.
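As a note to future me, here's a minimal PyTorch sketch of the W-Loss (Wasserstein) idea from the course. The `critic_loss`/`generator_loss` names are placeholders of my own, and a real WGAN setup would also need a gradient penalty or weight clipping to enforce the Lipschitz constraint:

```python
import torch
from torch import nn

# Minimal sketch of the Wasserstein ("W-Loss") objective.
# A full WGAN also needs a gradient penalty or weight clipping.

def critic_loss(critic: nn.Module, real: torch.Tensor, fake: torch.Tensor) -> torch.Tensor:
    # The critic tries to maximize E[critic(real)] - E[critic(fake)],
    # so as a loss we minimize the negated difference.
    return -(critic(real).mean() - critic(fake.detach()).mean())

def generator_loss(critic: nn.Module, fake: torch.Tensor) -> torch.Tensor:
    # The generator tries to maximize E[critic(fake)], i.e. minimize its negation.
    return -critic(fake).mean()
```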

Lean

Went through the Natural Number Game (find my solutions here). I had been meaning to learn something in theorem prover/type theory-land for a while, so this is finally a start. Overall it was pretty fun, and some of the puzzles actually took me a while to solve. I want to continue this journey by going through the official docs and maybe dabbling in Lean 4. I don't have a real goal or direction for this knowledge yet, so we'll see what happens as I keep learning this stuff.
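For a taste of the kind of puzzle the game poses, here's a small proof written against Lean 3's built-in naturals rather than the game's `mynat` (so the lemma names differ slightly from the game's):

```lean
-- A Natural Number Game-style puzzle, stated over Lean 3's built-in ℕ.
-- Addition recurses on the second argument, so 0 + n = n genuinely
-- needs induction.
theorem zero_add' (n : ℕ) : 0 + n = n :=
begin
  induction n with d hd,
  { refl },                  -- base case: 0 + 0 reduces to 0
  { rw [nat.add_succ, hd] }, -- 0 + succ d = succ (0 + d) = succ d
end
```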

MIT 18.S191 (Introduction to Computational Thinking)

(Course Link)

Went through roughly two-thirds of the course with three goals in mind:

  1. Learn Julia
  2. Learn about epidemic modelling
  3. Learn about raytracing

You can check out the stuff I fiddled around with here. The course is pretty cool, and I probably would've really enjoyed it as a student. I've taken a similar course at UWaterloo (CS370/371), and our assignments were super dry compared to this. (Probably comparing apples to oranges here, since this course is new and our course is old as heck.)

I had heard about Julia before the course, so being able to fiddle around with it while working through the assignments was super fun. I liked it quite a bit. Its expressiveness is pretty much on par with Python's, but the types make it a lot less of a pain to work with. Native support for broadcasting and vector operations is also really nice. All that said, I guess I should expect this, since I'm comparing an actual scientific language to a general-purpose language that was forced into scientific applications. One complaint I do have is that Pluto was a bit annoying on Windows, especially since I couldn't stop a running cell (more than once I accidentally set off an infinite loop). But the UI was nice compared to Jupyter.

Overall though, getting a nice introduction to Julia and building up the SIR model and a simple raytracer was super educational, and I quite enjoyed it all!
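For my own reference, the epidemic-modelling portion centres on the classic SIR model. Here's a rough discrete-time sketch of the idea in Python (the course notebooks are in Julia/Pluto; the beta/gamma values below are arbitrary placeholders, not the course's):

```python
# Rough discrete-time SIR sketch: s, i, r are population fractions.
# beta = transmission rate, gamma = recovery rate (arbitrary example values).

def sir_step(s, i, r, beta, gamma):
    """Advance the susceptible/infected/recovered fractions by one time step."""
    new_infections = beta * s * i   # S -> I, proportional to S-I contacts
    new_recoveries = gamma * i      # I -> R, at a constant per-step rate
    return s - new_infections, i + new_infections - new_recoveries, r + new_recoveries

def simulate(steps=150, beta=0.3, gamma=0.1, i0=0.01):
    s, i, r = 1.0 - i0, i0, 0.0
    history = [(s, i, r)]
    for _ in range(steps):
        s, i, r = sir_step(s, i, r, beta, gamma)
        history.append((s, i, r))
    return history

if __name__ == "__main__":
    for t, (s, i, r) in enumerate(simulate()):
        if t % 30 == 0:
            print(f"t={t:3d}  S={s:.3f}  I={i:.3f}  R={r:.3f}")
```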

Computer Networking

Finished up to chapter 4 of Computer Networking: A Top-Down Approach. I reprioritized the things I wanted to learn, so this sat on the back burner for most of the year.

2021

Nothing yet! Happy New Year!

(Last updated Jan. 1st)