There is plenty one can learn from a professor in a classroom; the foundations of computational thinking are relatively universal. Every student should be able to maintain trees and heaps, run a quicksort or mergesort, and understand fundamental concepts like modularity and data typing. Indeed, if all programmers skipped these foundations in favor of getting straight to hacking, there would be little elegance or progress in the field. Our systems would be inefficient, insecure, and unsustainable.
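To make the point concrete, here is the sort of classroom fundamental the paragraph has in mind: a minimal quicksort sketch (Python chosen purely for brevity; any teaching language would do).

```python
def quicksort(items):
    """A textbook quicksort: pick a pivot, partition, recurse."""
    if len(items) <= 1:
        return items
    pivot = items[len(items) // 2]
    smaller = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    larger = [x for x in items if x > pivot]
    return quicksort(smaller) + equal + quicksort(larger)
```

Exercises like this teach the idea (divide and conquer) in isolation, which is exactly what makes them universal.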
But this education is not enough. No matter how much theory and math an education provides, it cannot cover every facet of the technological world. Countless established systems are in place whose ins and outs cannot be studied without exposure in an internship or apprenticeship setting. The steady, simple foundations we learn in school give way to teetering masses of complex code written anywhere from 20 years to 2 weeks ago, code we must accommodate in order to add our own original ideas to the pile.
I came to this realization as I began to learn iOS development. There is a surprisingly large paradigm shift between classroom exercises and homework and practical programming on any modern platform or framework. In school, it’s all about ideas. Students come up with a plan and, knowing the basic syntax of their chosen language, do their best to obtain the desired behavior as efficiently as possible. Working in a real, practical environment is entirely different; it becomes as much a memorization game as a problem-solving exercise. Rather than, “What can I build to accomplish this goal?” we necessarily ask, “What has been built to accomplish this goal?” In the real world, we must leverage the work of both our predecessors and our contemporaries to deliver a real product, because we can’t all be simultaneous experts in graphics, I/O, business, web, hardware, sound, security, and countless other disciplines.
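The shift from “What can I build?” to “What has been built?” can be sketched in a few lines. This is a hypothetical illustration, not drawn from the essay itself: the classroom instinct is to write the routine by hand, while the practical instinct is to reach for the standard library’s tested, tuned implementation.

```python
records = [("carol", 31), ("alice", 25), ("bob", 28)]

# Classroom instinct: "What can I build?" -- write the loop yourself.
by_age_manual = []
for record in records:
    inserted = False
    for i, existing in enumerate(by_age_manual):
        if record[1] < existing[1]:
            by_age_manual.insert(i, record)
            inserted = True
            break
    if not inserted:
        by_age_manual.append(record)

# Practical instinct: "What has been built?" -- lean on the library.
by_age = sorted(records, key=lambda r: r[1])
```

Both produce the same result, but the second defers to work someone else has already done well, which is the habit the real world demands.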