Becoming a better engineer
My understanding of this path has evolved quite a bit since my presentation a month and a half ago. Originally, I proposed combining more complex problems at work with either a part-time master's program at Stanford or a full-time bootcamp at Bradfield. Here are some updates on each of those ideas:
Regarding tackling complex problems at work:
- The result of the pubsub investigation is a) much more detailed and actionable data about pubsub performance and b) a 70% reduction in pubsub traffic volume with no change in functionality. We also generated a couple of follow-up ideas for the future; hopefully we’ll make time for them in the next episode. Besides being a worthwhile project in itself, I think taking a scientific approach was good for my engineering skills.
- Brian has suggested that I develop an expertise in networking and work more deeply on Internet connectivity in our dorms, which sounds great to me. I’m looking forward to working with Cheng and Brian on this, starting with stats collection. Depending on the bottlenecks we identify, we may decide to do any of a million projects: micromanage our WiFi channel utilization to minimize interference, experiment with tuning between WiFi access points to reduce disruptions, etc.
- If we’re able to make significant progress on the above, I’d also be eager to contribute to Licode or the ALF-Licode boundary to accelerate our progress towards an architecture that scales smoothly with the number of participants and a variety of connection qualities. I’ve already started getting involved (although much of the time it’s me learning by asking naive questions 😬).
- Finally, in a distant future in which video quality is good enough for the ALF to “become invisible”, I think one of Minerva’s strongest differentiators could be the way in which we apply the power of machine learning to our uniquely technological context — who else has students attend class through their app? Questions that are impossible for most institutions to decisively answer, such as “How do we choose who to admit?”, “How do we incentivize students?”, “How do we evaluate teachers?”, and “How do we evaluate student outcomes?” — all depend on the intelligent extraction of insight from data. I helped organize Machine Learning club with Jason to get people started thinking about this, and I’m eager to dive more deeply into the field (see below!).
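The stats-collection step mentioned above could begin very simply. As a rough sketch — assuming Linux-based access points where the `iw dev <iface> survey dump` command is available (the exact field names can vary by driver) — per-channel utilization can be derived from the reported busy and active times:

```python
import re

def parse_survey(text):
    """Parse `iw dev <iface> survey dump`-style output into per-frequency
    counters: {freq_mhz: {"active_ms": ..., "busy_ms": ...}}."""
    stats = {}
    freq = None
    for line in text.splitlines():
        line = line.strip()
        m = re.match(r"frequency:\s+(\d+) MHz", line)
        if m:
            freq = int(m.group(1))
            stats[freq] = {}
            continue
        m = re.match(r"channel (active|busy) time:\s+(\d+) ms", line)
        if m and freq is not None:
            stats[freq][m.group(1) + "_ms"] = int(m.group(2))
    return stats

def utilization(stats):
    """Busy/active ratio per channel; higher means more contention."""
    return {
        freq: s["busy_ms"] / s["active_ms"]
        for freq, s in stats.items()
        if s.get("active_ms")
    }

# Hypothetical sample output for illustration (real surveys include
# more fields, e.g. noise and receive/transmit time).
SAMPLE = """\
Survey data from wlan0
\tfrequency:\t2412 MHz [in use]
\tnoise:\t\t-95 dBm
\tchannel active time:\t1000 ms
\tchannel busy time:\t300 ms
Survey data from wlan0
\tfrequency:\t2437 MHz
\tchannel active time:\t800 ms
\tchannel busy time:\t600 ms
"""

print(utilization(parse_survey(SAMPLE)))  # {2412: 0.3, 2437: 0.75}
```

Polling this periodically and shipping the ratios to a dashboard would be enough to see which channels (and which times of day) are saturated before committing to any of the bigger projects.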
Regarding the two ongoing education opportunities I mentioned:
- JK and Jason were both excited by the prospect of a Bradfield bootcamp. Jason even helped me choose the four courses he thought I would benefit from the most: Computer Architecture, Networking, Databases, and Operating Systems. However, Brian vetoed the idea, saying that given the steep cost and time requirement, I may as well pursue a master's degree; he also believes that I’ll actually retain more of what I learn if I mix it with professional experience.
- Unfortunately, I’ve learned that a part-time master's at Stanford is impossible unless I were to work for an affiliated company (Google, Uber, Airbnb, etc.), and I’d still rather work at Minerva than any of those places. Studying full time is still an option, of course, but I have doubts that the total cost is worth it relative to two years working full time and studying part time, especially in light of various feedback I’ve received (see below), and especially given JK’s revelation that a low-cost, part-time master's degree can be obtained from Georgia Tech, a top-10 CS program.
I’ve also now spoken with several other people on this topic:
- Jan was somewhat disdainful of pursuing a master's degree solely for the improved career prospects. In his view, some companies might hire master's graduates for R&D, but the degree is mostly useful as a springboard to other graduate degrees or as a way to gain a specialization. When he was in school 20 years ago, it was rare for people to bother getting an MS.
- Alex Cusack, who has a CS background similar to mine and just finished the Bradfield bootcamp, recommended both the bootcamp and Oz and Myles generally. He benefited hugely from their lectures and pairing time (although he wishes there had been more of the latter during the bootcamp). He’s thankful for the material they covered, saying that he can now comfortably reason at any level of abstraction, whereas before he only considered himself competent at web development.
- Skylar Cook, who has completed 24 of 30 credits in the Georgia Tech master's program and who recently joined Google as an embedded systems engineer, recommends the program. He is self-taught and used to struggle with imposter syndrome, but now he’s much more confident in his skills and knowledge. He did warn me that class quality can vary, and cautioned me against making some of the mistakes he made: taking on too much in a single semester, and enrolling in courses that sound cool but don’t deliver on their promise. He also noted that the additional commitment did not interfere with his productivity at his job.
My educational goals are still to achieve the kind of well-rounded computer science knowledge that Alex and Skylar now benefit from, while also gaining a specialty in machine learning (which, luckily, is one of the four specialization options offered at Georgia Tech). If I were to enroll in Georgia Tech’s OMS in CS, here’s a possible course sequence:
- Intro to Operating Systems
- Data & Visual Analytics
- Intro to Information Security
Notably absent from this sequence are several foundational CS courses. This is for two reasons:
- Many of Georgia Tech’s foundational CS offerings are poorly rated compared to the courses above:
- Database Systems Concepts & Design: 3.2
- Computer Networks: 3.6
- Software Architecture & Design: 3.9
- Bradfield is now offering short courses covering this material.
The Bradfield short courses are not a difficult commitment to juggle — they’re three weeks long each and involve homework and 9 hours of classroom instruction per week (3 hours on Monday and Thursday nights and 3 hours on Saturdays). I could participate in a few of these over the next few months, depending on whether I decide to go to Buenos Aires in February (with Cheng and Brian) or April (with Erin and potentially other Schools team members, as originally planned). See the comments for Bradfield’s course descriptions:
- January: Computer architecture and the hardware/software interface
- February: Databases
- March: Distributed systems
- April: Introduction to neural networks
To prepare for the Computer Architecture course, to lay a solid foundation for the 5-10 C-based courses I would take, and to prepare for contributing to Licode, I could also work through the book in the next month — it’s fairly short — and tackle a few practice problems on Exercism. Update: I’ve read the book and completed a couple of problems — I’m ready for the course.
With this plan, I would broadly survey the field of computer science, starting at the lowest practical layer of abstraction:
- Computer architecture and the hardware/software interface