Partly in response to Troutgirl’s post about webdev, I’ve been thinking about how it can be that really smart people can miss entire hot areas in tech, even when there’s plenty of evidence that the area is hot and/or interesting. And I’m driven to the conclusion that there are two entirely different kinds of thinking happening: one is architectural, about the real subject-matter itself (and can be a responsive kind of thinking if you’re paying attention), and the other is purely prestige-based (and is slow to change). Think of the former as the neocortex and the latter as the lizard-brain.

A case in point is webdev, as Joyce was arguing. Anyone who is paying attention to the web is waking up to the fact that there is lots of cool stuff happening with client-side programming, and in particular with tighter client-server interactions. But why is “waking up” necessary in the first place? If people were really paying attention, they should be able to match up the browser’s maturation with everything that made client-side programming a turn-off in the first place, and realize that they should be exactly that much turned-on now.
Instead, what I think happens is that people draw early conclusions about what areas are cool, interesting or even lucrative, and that these conclusions fossilize as prestige distinctions. From that point on, it is _very_ hard for anyone to pay attention to an area that they have pre-decided is uninteresting or low-status, _even_ when the game has changed in ways that invalidate the previous thinking. People are inoculated already; say the words “webdev”, “javascript”, “DHTML”, and their brains have already turned off, before you get to the pitch — they’re thinking “webmonkey” and it’s too late. And this is what makes even innovative large companies (that will remain nameless) miss the boat.
A comparable thing has happened with scripting languages in general — because they are easier than purely compiled languages like C++, people who live in the compiled world find it difficult to take them seriously (even though, other things equal, ease-of-use would be a _good_ thing in a tool). And so, with a certain kind of person, the dismissal happens before the possibility of thinking arrives. What if it turns out that the resource restrictions that motivated C++’s tight control just don’t matter anymore for most domains?
Finally, I’m reminded of what it was like to interact with Joyce’s grad school friends back in the day, when I was a CS grad student. As Humanities people, they’d absorbed an ideology that computers were inherently uninteresting and almost anti-intellectual — that they were all about engineering in the most precise and prosaic sense. (The ironic thing was that, as the dreamy kid I was, I was _entirely_ intellectually motivated, without any considerations of money or even building things in an engineering sense — I just wanted to learn what computers could tell me about The Mind.) But there was an anti-prestige block with these people about computation that couldn’t be surmounted, even though (as we can all now recognize) there was not just opportunity but intellectual ferment in spades, which they were simply missing.
So for the hard-core application developers and systems people out there, a thought experiment: imagine that, for whatever reasons, the game changes underneath your feet so that QA and/or tech support become not only the hot and opportunity-filled areas, but also the most intellectually interesting. How many months or years would pass before you noticed?
Nicely put. First impressions are exceedingly important, which is one reason I really don’t like the “release early” mantra of some OSS advocates. Release carefully, when you can make the right impression on your audience. In the case of webdev, you’ve got to admit that there was a lot of bad prior art that offends esthetic and engineering sensibilities. And much of what is now considered “cool” webdev is still built on a hacky platform. But the potential is huge.