2014-02-22

Theory, Practice, and Balance

When I landed my first software gig almost four years ago, I was a self-taught programmer. Not to say that I didn't have help along the way; I had loads of tutoring and mentoring from friends. Point being, it was my own motivation that kept me going, and my own intuition that determined the direction my learning process would take. I had no formal training, schooling, or guidance to keep me on track or ensure that I'd learn any particular skill set or theory in any kind of organized way.

Especially in the beginning, before I was ever employed as a developer, my goal was to get to a point where my learning could become a self-sufficient process. For example, an early task I assigned myself was to learn enough about a programming language to get "hello, world" on the screen so I'd have something to build on from there. I explored IDEs and debuggers so I could figure out what was breaking in my rudimentary programs. I dove into docs, grabbed third-party libraries to help, and set up runtime environments. I don't think many in the community would argue that these aren't useful skills; some might even call them essential. But essential or not with respect to any kind of career path, I was learning them haphazardly, based on immediate need.

By the time I started at my first job, I had made it to the point where I could jump onto a project and start delivering serviceable work within a few days, but my code wasn't what you might call optimal. It quickly became evident that there were a lot of gaps in my knowledge, skills, and experience that were holding me back. I understood, more or less, how to solve simple problems with code and how to compile and debug; I even stuck to some common best practices in program design from what I had picked up along the way. But I sure didn't know anything about math, theory, or architecture. Set theory, data structures, state machines, binary trees, Boolean algebra, recursion; even the supposedly basic stuff like stats, calc, and linear algebra; let alone dynamic programming, approximation, or any of the other heavy theory you'll stagger through in upper division computer science coursework. At the time, I just knew how to code, even if the code came out only slightly more organized than a trash can filled with spaghetti, and hey - for that first job, it was enough.

I know from experience that this is a common attitude among self-taught programmers. "Hey, it compiles, and it works for most cases, what do you want?" Well, depending on who you want to work for and what you want to do with your life, there's plenty there to want, especially as the low-hanging fruit of the software development world is phased out of the job landscape almost entirely in favor of more challenging architectural and mathematical problems.

Now that I'm in the thick of an academic Computer Science program to fill some of the early gaps in my knowledge, I'm immersed in the polar opposite of a practicality-focused paradigm. This semester, for example, I'm taking classes in discrete math, formal proofs, and logic; set and language theory and automata; and operating system theory. So far, over a month into the semester, not a single line of code has been requested or demonstrated by any of my professors. In all honesty, hyperbole aside, a computer has not even been mentioned in the classroom. Where is the code? The practice? The application? Over a year into the curriculum, no one has so much as mentioned a debugger, spoken a word on environments, or given a nod to APIs or anything of the sort. They are completely ignoring the fact that one day, presumably, we'll need to actually apply all this math and theory to something. They are training us all to be mathematicians and PhD candidates.

Both of the above situations - the plight of the inexperienced CompSci grad, and the crude hacking style of the common self-taught developer - probably sound familiar to anyone who has spent significant time on the job as a professional developer. Even with my limited experience and time in the field, I've met both. Employers have complaints about these two types of entry-level job candidates, and I think their points are valid. No one wants to hire a kid who can technically write a program, but can't for the life of them do a good job of it, because they have never considered testing, or any kind of process, or software engineering principles, or the fact that someone - maybe even they themselves - will have to maintain that code one day. On the other hand, it's rarely a good idea to hire a math whiz with a CompSci degree who has, ironically, no clue how to open Excel, let alone an IDE. I have had professors who prototype in notepad.exe and teach three-generation-old UI libraries because it's what they know best, they're too lazy to keep up with the tech, and it's too easy to fall back on the excuse that hey, sometimes you have to maintain legacy code. True as that may be, and though it may be a separate issue, it's part of the larger problem. At any rate, what does it tell you about the practical skills of the resulting graduates?

I hear complaints from employers that CompSci grads too often come out of school knowing such-and-such theorem and So-and-So's Law but with no idea how to use any of it in the workplace; and that self-taught programmers with the drive to succeed have taught themselves how to compile and debug but haven't the slightest clue how to improve their algorithms - or, heaven forbid, toss in a comment here and there. So, what do we do about it?

I understand that opinions are a dime a dozen and my commentary is a drop in the bucket of sentiment on this topic, but I have lately felt the need to share it anyway, because these issues have impacted me both as a member of the workforce and as a self- and university-taught developer. It seems, frankly, outrageous that more effort isn't being put into a combined emphasis on real-world application and theory. Students have to be given some kind of bigger picture along with hands-on experience, so that they can connect the theory to the application. A few schools do seem to be getting it; I have one friend who graduated from a technical college with boatloads of practical experience, in addition to a detailed understanding of the math and theory on which best practices and problem solving are founded. But this guy is a painfully rare exception. I, for example, would never have learned how to set up my machine and get myself jump-started on a project without my own extracurricular work, industry experience, and attention from concerned personal contacts. That's not to say students shouldn't be doing any extracurricular work, but leaving the critical element of hands-on experience out of the schooling process by policy is counterproductive to the ultimate goal of schooling: preparing students for the real world.

As for the self-taught developer and self-driven learner, the onus is on the community to give a sense of importance to the concepts underlying best practices and solutions to complex computing problems. When someone still wet behind the ears comes onto a forum to ask a question, instead of dismissing it as stupid, shunning them with a condescending LMGTFY GTFO, or telling them to just do their project in an easier language, it is up to those with more experience to guide them to better solutions and opportunities for self-education. Otherwise, who do we have to blame when our co-workers are writing hacked-up code that we have to fix for them?

Thanks for reading!
- Steven Kitzes

2014-01-05

Android SDK Setup Pitfalls in Windows 7 without Eclipse

I'm delving into Android development for the first time and ran into some trouble during the SDK setup on my Windows 7 machines. I wanted to use an IDE other than Eclipse, which complicated things somewhat. For example, if you're not using the Eclipse-ADT (Android Development Tools) bundle, you'll be getting a lot cozier with the command line. But I followed the provided platform-specific documentation as closely as possible and still ran into snags that cost me a lot of time. Hopefully, by piecing together a write-up of my experience, I can cement the process in my mind and consolidate some of this information to save others time during their setup.

Resources, Downloads, Installation


First, if you are new to Android development, I recommend visiting and bookmarking the Android developer home page. From there you can (sorta) easily navigate to all the resources and information you need to get started. If the installation of the SDK goes smoothly (which I hope to help facilitate with this post), you can follow their tutorial and probably get a "hello, world" Android app up and running inside an hour.

Start by installing the Android SDK (not the Eclipse-ADT bundle). Following the documentation for the SDK installer and resource download went smoothly for me. The only snag worth mentioning is that there's some finagling required to get all the licenses accepted in the SDK Manager if you want any packages beyond those specified in the documentation (for some reason, there are multiple methods of accepting certain licenses).
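
For reference, here's roughly what that flow can look like from the command line. This is just a sketch: the android tool is a batch script living in the SDK's tools directory, and the package names in the filter (android-19 here is only an example) and available flags may vary with your SDK Tools version.

    :: launch the graphical SDK Manager (android.bat lives in <sdk>\tools)
    android sdk

    :: or list every available package and install a selection without the
    :: GUI; the filter values below are examples, not a required set
    android list sdk --all
    android update sdk --no-ui --filter platform-tools,android-19

The --no-ui route is also where the license finagling shows up: you'll be prompted to accept each license in the console as the packages install.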

Using a Batch-Supportive CLI


One thing the docs won't tell you, but should, is that the command line input they give you includes some batch file execution. When I was going through this process, I was using Git Bash. This was causing me all kinds of problems, because the syntax you use to execute batch files in Git Bash or other third-party shells is different from what you'd use at the Windows 7 command line. I kept getting android: command not found and didn't understand why, because I had correctly added all my directories to the Windows PATH variable and everything. Even once I'd figured out that these were batch files and needed different syntax to be run, it took me a while to realize that the execution was failing not just because my syntax was wrong, but because Git Bash just doesn't fully support batch file execution.

In short, the Android documentation assumes you're using a CLI that provides full support of Windows batch file execution.
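
To make that concrete, here's a minimal illustration of the difference, assuming the SDK's tools directory is already on your PATH. The avd subcommand (which opens the AVD Manager) is just a stand-in for whatever you're trying to run:

    :: at the Windows command prompt (cmd.exe), the bare name works because
    :: cmd resolves android.bat through PATH and PATHEXT
    android avd

    # in Git Bash, the bare name fails; one workaround is to hand the batch
    # file to cmd yourself (the double slash keeps MSYS from mangling /c)
    cmd //c android.bat avd

Even with that workaround, interactive prompts inside batch scripts can behave oddly under Git Bash, which is why plain cmd.exe is the path of least resistance here.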

Setting Up an Android Virtual Device and Emulation


Skip this if you only plan to test on physical Android devices. I wanted to test virtually, so I had to set up the emulator. For the emulator to run on Windows 7, you can't give your virtual Android device too much memory, or the emulator will crash. I was getting errors from emulator-arm.exe when trying to run my virtual Nexus 7. Reducing the allocated memory on the virtual device from 1024MB to 512MB allowed me to run the emulator with no trouble (though I'm not confident this provides a realistic testing environment down the road).
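
For anyone who'd rather script this than click through the AVD Manager, here's a rough sketch of the command-line route as I understand it; the device name and target ID below are placeholders for your own values:

    :: see which platform targets are installed, then create a device
    android list targets
    android create avd -n nexus7 -t 1

    :: launch it with RAM capped at 512MB; setting hw.ramSize=512 in the
    :: AVD's config.ini should accomplish the same thing permanently
    emulator -avd nexus7 -memory 512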

Also note that when loading the emulator, the virtual device's boot time may be quite long. I have a reasonably quick laptop with an SSD, 8GB of RAM, yadda yadda yadda, and it still took a couple of minutes to boot the emulator. Be patient; don't panic.

Windows Environment Variables, Including JAVA_HOME


Throughout this process, you may well find yourself working with Windows environment variables, including the PATH variable, at length. Be careful: you can do some pretty annoying damage to your system if you are careless in there. Follow directions carefully to make sure everything works and you don't lose important values that other software may already have added to your Windows environment variables.
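
One cheap safeguard before you touch anything: dump the current values to the console, or to a file, so you have something to restore from if an edit goes sideways.

    :: print the current values before changing anything
    echo %PATH%
    echo %JAVA_HOME%

    :: or stash a copy in a file for safekeeping
    echo %PATH% > path-backup.txt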

This is a preemptive note for the next section, but important nonetheless. There is a command line application you'll need from Apache called Ant. It's a command line build tool for Java projects. When installing Ant, make sure you add its bin directory to your Windows PATH variable. Otherwise, when you try to run Ant, you'll just get the classic 'ant' is not recognized as an internal or external command error. Next, you need to make sure a Windows environment variable called JAVA_HOME is correctly pointing to the location of the JDK (Java Development Kit) on your system, because Ant uses the JDK. If the JAVA_HOME variable doesn't exist, just create it (I had to).
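
As a sketch of what that can look like (the JDK path below is an example; point it at wherever your actual JDK lives, not the JRE):

    :: set JAVA_HOME for your user account; note that setx only affects
    :: NEW command prompt windows, not ones already open
    setx JAVA_HOME "C:\Program Files\Java\jdk1.7.0_45"

    :: from a fresh prompt, verify the variable took and Ant can run
    echo %JAVA_HOME%
    ant -version

If ant -version prints a version string, both the PATH entry and JAVA_HOME are doing their jobs.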

Installing an App onto an Android Device


At long last, I had the SDK all set up and the sample "hello, world" app built. To install an app onto a device (virtual or physical) for testing, the documentation first tells you to navigate to your project root directory and enter ant debug at the command line. It doesn't tell you what this does, why it's important, or, most critically, that Apache Ant (which you can download and learn more about on the Apache Ant website) may not even be installed on your system, even though it's essential to the tutorial in the documentation. Now you know.
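
With Ant installed, the last mile looks something like this; the APK filename comes from your project's name, so the one below is just an example:

    :: from the project root: compile and package a debug-signed APK into bin\
    ant debug

    :: push it onto the running emulator or a connected device
    adb install bin\MyFirstApp-debug.apk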

Finally


Now I'm running Android apps in emulation. My next step is to try running the "hello, world" app on a physical device, then I'm off to the races. Hope you found something here to help you along the way on your Android journey! And as always...

Thanks for reading!
- Steven Kitzes