Reviews

The Udacity iOS Development Nanodegree

I started auditing the first course in the Udacity Swift/iOS Nanodegree to get a feel for the concept.

I like the Nanodegree pitch:

Leading technology companies design and teach Nanodegree programs with our help, and also endorse them. They know what technical skills they need to hire for and which new skills their employees need.1

The tone is very different from that of a college course that has simply been put online (like the Stanford CS193P course). The Udacity introduction course appears to target professionals in other fields who are interested in getting into programming but have limited exposure to it.

While the introduction course did not really seem like a good fit for me, I tried to imagine taking it as a non-programmer and found it to be quite accessible. The introduction course seemed to be more about learning to research programming concepts through the lens of iOS development.

Researching programming concepts

The course had a good balance between suggesting “copy and paste programming” to get started and looking up material in the documentation to actually make the snippets work.

An important lesson I learned when I began programming was how to ask questions at a high level and then follow the terms in the answers down to something concrete. This is something the course tries to impart to the student. For example, an internet search for a high-level concept like “How to play back audio on iOS” will surface a lot of new terms, such as AVFoundation, AVAudioPlayer and AVAudioEngine. Researching those terms in turn leads to a more specific question: “How to play back audio with AVAudioEngine.”
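This research loop tends to end at a snippet like the following. It is only a sketch of basic AVAudioPlayer playback using Swift 2-era API names (to match the course's timeframe); the file name “recording.m4a” is a placeholder, not something from the course.

```swift
import AVFoundation

// Sketch: the kind of snippet a search for "How to play back audio
// on iOS" turns up. "recording.m4a" is a placeholder file name.
var player: AVAudioPlayer?  // keep a strong reference, or playback stops

if let url = NSBundle.mainBundle().URLForResource("recording", withExtension: "m4a") {
    do {
        player = try AVAudioPlayer(contentsOfURL: url)
        player?.prepareToPlay()
        player?.play()
    } catch {
        print("Could not start playback: \(error)")
    }
}
```

Note that the player must be stored in a property (not a local variable), or ARC will deallocate it and silence the audio mid-playback.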

Refactoring - the missing section

I wish the course had one additional section focused on refactoring. Yes, the course does ask the student to reduce some duplicated code. However, I think it could’ve gone a bit further by pushing the student to reduce the number of audio engines used in the playback view controller to one.

AVAudioEngine, specifically its AVAudioUnitTimePitch node, which is already used in the project, is more than capable of changing the playback speed in addition to the pitch.
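A single engine graph along these lines could cover both effects. This is a sketch, not the course project's actual code, written with Swift 2-era API names; `audioURL` stands in for wherever the recording's URL comes from.

```swift
import AVFoundation

// Sketch: one AVAudioEngine graph handling both speed and pitch.
let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()
let timePitch = AVAudioUnitTimePitch()

timePitch.rate = 1.5     // playback speed multiplier (the "fast" effect)
timePitch.pitch = 1000   // pitch shift in cents (the "chipmunk" effect)

engine.attachNode(playerNode)
engine.attachNode(timePitch)

// Wire the graph: player -> timePitch -> output
engine.connect(playerNode, to: timePitch, format: nil)
engine.connect(timePitch, to: engine.outputNode, format: nil)

let file = try AVAudioFile(forReading: audioURL)  // audioURL: the recording's NSURL
playerNode.scheduleFile(file, atTime: nil, completionHandler: nil)
try engine.start()
playerNode.play()
```

Because `rate` and `pitch` are independent properties on the same node, one graph can replace both the AVAudioPlayer-based speed effects and the AVAudioEngine-based pitch effects.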

In a production app, there is always tension between shipping and technical debt. While your end users won’t notice that you are using two playback engines, it adds a lot of technical debt. AVAudioEngine and AVAudioPlayer use two very different models for playback, and you’d probably want to pick one and stick with it, both to reduce the possibility of bugs and to reduce the cognitive load on you as a developer.

As a web developer, I once took over a project that used MooTools, script.aculo.us, jQuery, and Dojo Toolkit for various things (all of which could have been done with just one library). I imagine the original developer did a lot of copying and pasting and didn’t take future development or page load times into account.

Talkboy - Pitch Perfect extended

After going through the course, I wanted to extend the “Pitch Perfect” project to include a few new features:

  • File Management - Now you can replay and delete old files
  • Recording visualization - A Siri-like waveform view during recording
  • Unified playback engine - Now uses AVAudioEngine for both speed and pitch effects
  • Slider controls - Pitch and speed settings are no longer static buttons but now variable, real-time sliders
  • All wrapped in a UISplitViewController

You can find Talkboy on GitHub. Please leave feedback in the issues section, and pull requests are welcome.

Other courses in the program

I’m probably going to skip the UIKit Fundamentals course. While I would like to review UIKit with Swift, given my iOS development experience I suspect the course would move more slowly than I’d like, and skimming a book would be a better fit for me.

I am interested in taking the iOS Networking with Swift course. I haven’t used NSURLSession (the successor to NSURLConnection) yet, and a project app like a themoviedb.org client seems fun.
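From skimming the documentation, a minimal NSURLSession request looks something like this (Swift 2-era naming; the themoviedb.org endpoint path and the API key are illustrative placeholders):

```swift
import Foundation

// Sketch: fetching JSON with NSURLSession.
// The URL path and apiKey below are placeholders.
let apiKey = "YOUR_API_KEY"
let url = NSURL(string: "https://api.themoviedb.org/3/movie/popular?api_key=\(apiKey)")!

let task = NSURLSession.sharedSession().dataTaskWithURL(url) { data, response, error in
    guard let data = data where error == nil else {
        print("Request failed: \(error)")
        return
    }
    if let json = try? NSJSONSerialization.JSONObjectWithData(data, options: []) {
        print(json)
    }
}
task.resume()  // tasks are created suspended and must be started explicitly
```

The task-based, completion-handler model is the main conceptual shift from NSURLConnection's delegate callbacks.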