I am really the last person you’d want doling out development advice, trust me. I have been “doing development” for about 20 years now, but the thing about doing development is that you’ll never be good enough unless you have a natural aptitude for it, or live and breathe it as a religion. I fall into neither camp, to be honest.
I got started with development in a half-assed way back when personal computers first became a thing. Back then, you could subscribe to magazines for computer enthusiasts, and those magazines would feature a rather lengthy program printed out on the last few pages. Those who had the fortitude could type the damn thing out manually, swearing at the inevitable typos that generated errors in code you might not yet understand, and maybe end up disappointed that the resulting app didn’t really blow you away in terms of how cool it turned out. I remember coding a maze game in this manner, and while I managed to get it transcribed and eventually error-free so I could play it (and I did play it, because the video game landscape was still pretty thin back then), I didn’t really learn from it. I didn’t fully understand what “computer programming” would be good for, and my lack of math skills is the stuff of legend (in my own mind).
My real introduction to coding was after high school. For some reason I really wanted a computerized roleplaying game character manager that would help you design your alter ego for, say, Dungeons and Dragons (2E, of course). This drove me to recklessly connect the 2400 baud dial-up modem at 10 PM one night, log into a BBS, and pirate the entirety of Visual Basic 4.0. I am not proud of this. My aunt agreed to pay something like $200 for an online development course for me, so I was on my way towards learning how to code.
It wouldn’t be until about 6 years later that I’d actually do anything with it, however. I went to college to study biology, which didn’t involve any development courses, and after graduation, I got a job working in an IT department (because science, while cool, sucks ass as a career). Eventually, I had an opportunity to do web development part-time at a small start-up after my day job, and that eventually turned into a full-time job. We used Microsoft’s ASP as our company’s primary platform, although I eventually got saddled with occasionally maintaining an inherited PHP codebase. It was during this time that I wrote a lot of really cool apps, including a fully-functional blogging platform that was in operation for a few years before I got into a pissing match with some jackoff who laid claim to my domain name, forcing me to abandon the project. As that was a small company whose management was…less than stellar…I was forced to jump ship to where I am currently: back working for a large corporation as a rank-and-file developer tasked with writing and maintaining various web and desktop applications, mostly for internal use.
Over 20 years, I’ve written a lot of code, but when I look out onto the landscape of what “development” looks like today, it’s staggering — and kind of disheartening. The conventional wisdom seems to be that there are two kinds of developers. The first is the Old Guard. These are the folks who have been developing for decades and are assumed to cling to Their Way of Doing Things. Employers supposedly don’t like hiring older developers because they might be inflexible and not caught up on current trends. That means, of course, that the other type of developer is Young and Energetic, the type we usually see portrayed as synonymous with All Things Tech and assumed to be constantly on top of the latest hot trends in development. They know the buzzwords and how to navigate the ever-widening world of platforms, tools, and conventions. Because the tech sector has a cutting-edge fetish, these are the kinds of people that companies want to hire to ensure that they won’t be left behind The Next Big Thing.
I try to keep up with some of what’s hot in the development world for my area of expertise, but it’s difficult for a few reasons. First, my company is a .NET shop for our web and some application needs. We have development groups that do things that I’m not aware of, like mobile apps and such, so we must also have Objective-C and Java devs out there somewhere, but my concern is mandated by what the company line imposes. We are a “technology company”, but we are not a technology company. We do not value the freshest conventions, which is the second reason I don’t keep up with trends: why bother? I have enough to keep me busy here, and I don’t have the freedom to just jump development methods “because I want to play around with them”. If I do, it’s because I want to see what the hubbub is about, but I can’t actually code our next app in a totally different language for a totally different deployment vector. I have inherited other projects written at various times by various developers who did feel that it was OK to strike off into uncharted territory, and let me tell you: if I ever meet those people in a dark alley, only one of us is going home. Someone, somewhere, at some point is going to have to deal with these flights of fancy, and Murphy’s Law states that it’ll happen at a point where upper management is screaming bloody murder for a fix that was due yesterday.