Being a programmer in the corporate world is a somewhat arduous task. There are a lot of assumptions thrown around from within and without. Those outside the world of development tend to see development departments as something of a police-free zone, where anyone can do whatever they want without repercussions or having to adhere to deadlines. This couldn't be further from the truth.
Developers have a bit more leeway when it comes to technology, because they must. In order to do their job (you'll note this is unqualified: it's a requirement, not optional), developers must have software such as Firefox, Firebug, Eclipse, Fiddler, open internet connections, and the list goes on and on. Corporate IT can't restrict them (or shouldn't), because all that will do is slow down the development cycle and force deadlines to be pushed back even further.
Deadlines come and go for development, it's true. This is the single biggest driver of all the churn and turnover in development groups. Miss a deadline, and one day you have 250 contractors on a massive project; within a month, you could be down to a hundred. Or you could go from 30 down to 15 almost overnight, with two managers stepping down. Both of these have occurred in my time with corporations. It's a given, because software development is more of an art than a science. It doesn't adhere well to dates, because understanding the "whole picture" of a project is something very few people can do. Even when a person can, there are always unexpected hiccups and stopgap measures. Imagine a construction project suddenly having a supplier run out of building material. Timelines get pushed, and that doesn't make executives happy.
While the outside perception definitely hurts development on occasion, it's not the piece that strains developers the most. It's the perceptions from the inside.
During college or their early training, developers get taught certain ways of seeing problems. Many never question these perceptions; they come to feel there is only one right way to look at a problem, or a very limited number of ways to solve it. That is the exact opposite of what companies are looking for. Innovation is about bringing creative, new ideas to the table and trying things that aren't simply extensions of the old, stagnant ones.
To be clear about what I mean, here is an example: in a recent rewrite of an application, we went from a Java stack based on Struts 1 with some Hibernate thrown in to another Java stack with Spring, iBatis, and a DMS library, in the hope that this would speed up productivity and make our code more maintainable.
The core problem in our move from one version of the application to the other was that we never evaluated other languages and architectures that might be better suited to building a web application (Grails, Ruby on Rails, Python/Django, et al.), because developer preconceptions got in the way. This concept extends beyond just the language and patterns. It should extend into some of the broken areas of application development, where developers spend more time in XML files or property files than in actual code, and more time writing unit tests or performing builds than writing the application itself.
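As an illustration of the configuration-heavy style I mean, here is a hypothetical sketch of the kind of Spring XML wiring that era required. The bean names and `com.example` classes are invented for illustration; only `BasicDataSource` is a real Commons DBCP class.

```xml
<!-- Hypothetical Spring XML wiring: every dependency is declared by hand. -->
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans.xsd">

    <!-- Data source, wired explicitly with driver and URL properties. -->
    <bean id="dataSource" class="org.apache.commons.dbcp.BasicDataSource">
        <property name="driverClassName" value="com.mysql.jdbc.Driver"/>
        <property name="url" value="jdbc:mysql://localhost/app"/>
    </bean>

    <!-- Hypothetical DAO and service, each referencing the bean above. -->
    <bean id="orderDao" class="com.example.dao.OrderDaoImpl">
        <property name="dataSource" ref="dataSource"/>
    </bean>

    <bean id="orderService" class="com.example.service.OrderService">
        <property name="orderDao" ref="orderDao"/>
    </bean>
</beans>
```

Every new service or DAO means another stanza in a file like this; the equivalent in plain Java would be a one-line constructor call.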
I've said it before, and I'll say it again: a good developer should be able to learn new languages and new patterns quickly and easily. If they can't, I would call into question whether they fully understand application development.