Fail Forward! Musings on challenges and opportunities in the world of Software Development.

LinkedIn, Titles, Roles, and the nature of work. (2018-01-05)

Titles are like animals that we keep around because we are amused by them. You never quite know what they're going to do, or how they're going to affect your life, but a lot of people really seem to like them, and pay a whole lot more attention to them than they're actually worth.<div>
<br /></div>
<div>
Recently I've spent a lot of time at a company that barely recognizes, internally, that titles exist. Everyone can chime in about anything. People may look at you a little oddly, but it's true, and it's certainly interesting to have that level of exposure. Sure, there's some churn due to information flows, but people tend to self-organize by interest rather than by specific role or title.</div>
<div>
<br /></div>
<div>
This has made me reconsider how I think about titles, and in particular what I should put on LinkedIn. I try to keep my resume as up to date as I can, but when you work at a company that is a <a href="https://www.holacracy.org/">Holacracy</a>, you can't quite pigeonhole yourself with one particular role/title. I've gone through a bunch of iterations on what I think I am, as a software developer...</div>
<div>
<ul>
<li>Junior</li>
<li>Mid-level</li>
<li>Senior</li>
<li>Architect</li>
<li>Principal</li>
</ul>
<div>
But what do any of these really mean? They're supposed to indicate a level of capability within your given field. They might just do that if you're an accountant, or perhaps in certain lines of business, but they simply don't map very well onto the abilities of a programmer. For example, I've met people who call themselves Principal Software Architects. What does that mean? That they're really good at making flow charts? Do they even code? As another example, I've met Junior Developers who are far more capable than your average Senior Developer. These mismatches exist all over the place, of course, but they're quite stark in software. How your thought process works is far more important than how long you've been able to call yourself a developer.</div>
</div>
<div>
<br /></div>
<div>
So I've decided to take a new tack on LinkedIn. My primary title is now 'Thinker, Problem Solver, and Writer of Code'. I think that best describes what I do, in order of how important those activities are. Without considering problem spaces, you can't even come up with problems to solve, and when you're solving problems, sometimes it'll involve code. Occasionally. Too often, people rush to write code without considering whether or not they really need to, and that's one of the things I like to think hard about. </div>
<div>
<br /></div>
<div>
The only code you don't have to support is code not written.</div>
<div>
<br /></div>
<div>
That doesn't mean I'll remove the titles from my individual positions, but I do think this new primary title is far more important than all of those position names combined.
(2017-04-27)

It's been a little while since I've written, but recently I've been hearing a lot of interesting app ideas from people, and I've been asked why an app may or may not succeed. These are the two questions I ask when I hear, "Hey, I have an idea..."<br />
<h2>
1. How is your app going to shortcut some pre-existing habit or work for an individual?</h2>
You can't add more work to the day. People just won't do it, and it'll be a struggle all the way. The only way you get to add work to people's day is via legislation, or some kind of rule with enforcement. Otherwise, people are lax and will forget to do it, or simply choose not to. Make sure they want to do it. The only way you can make sure of that is to make some existing work easier.<br />
<br />
There are some applications that seem like they've gotten around this theory, particularly in the social sphere. Arguably, social applications are shortcutting communication and socialization: you're "reaching" 10x the people in the same amount of time, though you're giving up the human touch to do it.<br />
<br />
<i>The other question is:</i><br />
<h2>
2. Where's the money?</h2>
Money is how apps get built, and how they survive. It's impossible to keep an application running if money isn't flowing in. It could come from any number of sources: ads, subscriptions, one-time purchases (not a good idea!), partners, or even a corporate sponsor.<br />
<br />
Without a clear idea of where the money to build and maintain your application is going to come from in the long term, it probably won't get off the ground. Further, if your answer to this question involves some kind of connection to an industry, you need to follow it up with, "Who has the connections to make this happen?"<br />
<br />
If you can't answer these two questions clearly, concisely, and deeply, your idea needs more fleshing out before you can move forward. Unless you want to constantly be funding the app with your own time and money.<br />
<br />
Once you can get past these two questions, refer back to my <a href="http://fail-forward.blogspot.com/2015/11/inception-of-application.html">previous blog post</a> for ideas on how to more deeply flesh out your idea!<br />
<br />
Modernizing the "Joel Test" (2016-01-15)

Most developers have heard of the "<a href="http://www.joelonsoftware.com/articles/fog0000000043.html">Joel Test</a>". At least, those developers who have spent much time researching development methodologies and keeping a finger on the pulse of tech over the past ten years. Developers are always looking for ways to evaluate places that ask them to do work, and I am no different. The thing is, some of those items don't necessarily apply to the modern development environment, so I've updated the list for 2016. I've also weighted the items so that it's clear what is a necessity, and what is a nicety.<br />
<br />
<h4>
Must Haves</h4>
<br />
<ol>
<li>Are you an agile shop?</li>
<li>Do you have a well defined audience for your product?</li>
<li>Do you have a product owner?</li>
<li>Do you use version control for all code? Distributed version control?</li>
<li>Do you utilize Continuous Integration (CI), or even better, Continuous Delivery (CD)? </li>
<li>Do you keep an up-to-date issue database?</li>
<li>Do you do hallway usability testing?</li>
<li>Do you have the best hardware and software that money can buy?</li>
</ol>
<br />
<h4>
Nice to Haves</h4>
<br />
<ol>
<li>Are you profitable?</li>
<li>Are developers encouraged to actively participate in requirements gathering?</li>
<li>Do developers have quiet working conditions?</li>
<li>Do you allow developers to work from home?</li>
<li>Do you have a schedule?</li>
<li>Are there product testers?</li>
<li>Do candidates write code in their interview?</li>
<li>Do you provide training opportunities for your developers?</li>
<li>Do you encourage writing unit tests for all features?</li>
</ol>
<br />
As I see it, on the "must have" list, you really should be able to answer every single one with a yes. If not, you're going to need a really good explanation; if it's more than one, you probably have a serious problem on your hands. On the "nice to haves," slightly over half would probably suffice, but really, you should try to hit them all. If you want to be competitive in the marketplace, and a good place to work, you'll want to make as many of these true as possible, along with being in the top quartile for compensation. It is a very competitive market for engineers, and anyone who isn't providing proper care and feeding of their development staff will soon find that someone else will.<br />
<br />
Of course, these are all open to subjective interpretation, because each developer will have their own ideas, but let me provide my take on why each of these is so important.<br />
<br />
<h3>
Must Have #1 - Are you an Agile shop? ("<u>a</u>"gile? Scrum? Kanban?)</h3>
Waterfall is dead; long live waterfall. It's been fifteen years since the term Agile was coined, and still many shops haven't implemented it, and many developers haven't been trained in it. Even if you don't subscribe to a particular set of practices (a la Scrum), let's hope that you at least know and live by the <a href="http://agilemanifesto.org/">agile manifesto</a>.<br />
<br />
<h3>
Must Have #2 - Do you have a well defined audience for your product?</h3>
To build your product properly, you need to know exactly who you are marketing it to. If you don't fully understand your audience, you need to go back to the ideation stage of your product and figure that out. If you can't answer that question thoroughly, you can't move on to development.<br />
<br />
<h3>
Must Have #3<span style="font-weight: normal;"> -</span> Do you have a product owner? Bonus points for documenting requirements.</h3>
Many products have languished in software purgatory for want of a visionary leading the development. This is because every point of software development is about managing trade-offs. If you don't have someone with a firm grasp of where you are going, you'll never know which trade-offs you can make.<br />
<br />
<h3>
Must Have #4<span style="font-weight: normal;"> -</span> Do you use version control for all code? Distributed version control?</h3>
Software is part of your company's DNA if you're hiring software engineers. First and foremost, you should be keeping full control over the software being produced. That's only a side effect, however: primarily, the purpose here is to enable and facilitate efficient and consistent collaboration between software developers. That is version control's number one duty. Without a solid version control system and process in place, crashing and burning is all but inevitable.<br />
<br />
<h3>
Must Have #5<span style="font-weight: normal;"> -</span> Do you utilize Continuous Integration (CI), or even better, Continuous Delivery (CD)? </h3>
This is all about the build. You should be able to build immediately on check-in. If you can't do a build and check that all your tests pass, then, at the least, no level of code quality is being ensured. This is the first step towards a well-tested piece of software; without it, it can fairly be assumed that testing is discouraged passively, if not actively.<br />
<br />
<h3>
Must Have #6<span style="font-weight: normal;"> -</span> Do you keep an up to date issue database? </h3>
If you don't know what your bug trends look like, it's hard to tell whether you have a fairly solid piece of software. Further, if you don't have a strong understanding of how many bugs are in your system, you'll have no idea whether you should actually be doing deploys. This should guide every step of your development process.<br />
<br />
<h3>
Must Have #7<span style="font-weight: normal;"> -</span> Do you do hallway usability testing?</h3>
Developers should constantly be bugging the product manager, or at least doing usability tests among themselves, to make sure that what they're building fits the expectations of others. If not, it's very easy for them to go off the rails and build something that they think is neat, rather than what the product actually needs.<br />
<br />
<h3>
Must Have #8<span style="font-weight: normal;"> -</span> Do you have the best hardware and software that money can buy?</h3>
This should be a no-brainer. If you're going to pay a software engineer six digits per annum, don't skimp on the things they need to do their job. A yearly budget of 5% of salary seems like a start, though a stronger offering would be 10%. This is the primary way to show that you really want a developer to stick around: removing every possible roadblock to them doing their job. Computers and software are the tools of the trade. Don't offer, or accept, anything less than the best.<br />
<br />
<h3>
Nice to Have #1<span style="font-weight: normal;"> -</span> Are you profitable?</h3>
Many companies aren't. That's fine, but you need a really strong reason why you aren't profitable at this time, and a path to profitability in the near future, or at least a way to be certain that you can make payroll. If developers aren't confident that they'll receive their paycheck, you can assume their eyes will be wandering.<br />
<br />
<h3>
Nice to Have #2<span style="font-weight: normal;"> -</span> Are developers encouraged to actively participate in requirements gathering?</h3>
Product managers, business analysts, et al, can theorize all day long. Unless a developer is actively involved in their ideation efforts, and helping to keep ideas firmly grounded in reality, you could end up handing requirements to a developer that will take them months to implement. Even years. If they even are capable of doing so. Developer feedback should be highly encouraged.<br />
<br />
<h3>
Nice to Have #3<span style="font-weight: normal;"> -</span> Do developers have quiet working conditions?</h3>
Despite many trends going in the opposite direction, developers need quiet time. This is how the magic happens. If developers don't have quiet time where they can be heads-down and writing code, they won't be able to get their job done. Over-collaboration slows down production just as much as being completely out of sight and out of mind. Slack, and similar tools that assist in communication without interrupting flow, are perfectly suited for interacting with developers: they don't demand an instant reaction, but they aren't email, which can be completely overlooked.<br />
<br />
<h3>
Nice to Have #4<span style="font-weight: normal;"> -</span> Do you allow developers to work from home?</h3>
There is no intrinsic reason why development needs to be done at a shop. Or even from the same continent. That's why offshoring was so popular for a while. That said, there are reasons to get together and make sure that 'the magic is happening'. Office space can certainly be used for that purpose, but it shouldn't be thought of as 'where the magic happens'. The magic happens wherever development happens. For some people, that's harder in a sterile development atmosphere. Flexibility is key.<br />
<br />
<h3>
Nice to Have #5<span style="font-weight: normal;"> -</span> Do you have a schedule?</h3>
If your software lives on a schedule, is it posted? Why is it scheduled? Remember that agile development is all about working software, customer orientation, and responding to change. Deadlines and plans are all guesswork and, in many cases, wishful thinking. Sure, you can have some idea of what you want done, and by when, but that doesn't mean it'll happen. Good software takes time, and if you have your audience right, they're going to be happier waiting for solid software than complaining about software that isn't working correctly.<br />
<br />
<h3>
Nice to Have #6<span style="font-weight: normal;"> -</span> Are there product testers?</h3>
Sure, developers can test software, but they're very myopic. They only see the feature they just implemented, and don't tend to look at the bigger picture. Some developers manage that, but they're a rare breed. A unicorn, so to speak. Ideally, someone from your target audience would perform user acceptance testing, at a minimum.<br />
<br />
<h3>
Nice to Have #7<span style="font-weight: normal;"> -</span> Do candidates write code in their interview?</h3>
How do you know if a given developer can code? You make sure by actually having them write you some code. Keep in mind that if you do it during the interview, it's very likely that they're extremely stressed by the interview experience itself, and won't be operating anywhere near 100%. You could alternatively offer them a take-home code test, or just have them walk you through a code project that is all their own. Either way, you want to see some kind of code from the developer. Would you trust a designer who didn't show you a portfolio? Probably not! The same goes for developers.<br />
<br />
<h3>
Nice to Have #8<span style="font-weight: normal;"> -</span> Do you provide training opportunities for your developers?</h3>
Developers have serious upkeep requirements. One of the big ones is making sure that they're staying up to date, which means giving them opportunities to learn new things. This can take the form of a 20% rule, like Google's, or conference attendance, or any number of other avenues. Developers should always be investing in themselves, and this is how it is done.<br />
<br />
<h3>
Nice to Have #9<span style="font-weight: normal;"> -</span> Do you encourage writing unit tests for all features?</h3>
Tests aren't really optional anymore. Software moves too fast in the modern era; you need to be able to test it at every step of the way and be confident that it works as expected. Without tests, your confidence level diminishes rapidly. You should be writing tests for all new features, and if you aren't, you'd better have some other very serious testing plans at hand.

Inception of an Application (2015-11-03)<br />
As software developers, we're constantly asked to 'just build this thing'. I used to find it somewhat flattering that people believed I could help them realize their dreams. After years of talking to people about their ideas, and occasionally offering to help out, I came to a much different understanding: it was less flattering, and more frustrating. It's like asking a construction worker if they would build you a high-rise. Maybe that'd be nice, but it's a dream; without backing, it's like being asked to help them win the lottery. The person asking often just has no conception of the <b>time </b>and <b>will</b> that it takes to bring about an application. So I thought I would write this introduction to what it takes to get a developer interested in building your application, and why the <i>magnitude of the word</i> '<u>just</u>' is <b>staggering</b>.<br />
<br />
While this first post is aimed more squarely at the person who is new to the SDLC (software development life cycle), it may have some insightful points for the seasoned programmer who isn't accustomed to reflecting on why they should build their own applications. Either way, let's hop right in.<br />
<br />
<i><b>When</b> you feel that stirring that says you might have an idea for a piece of software</i>, you need to start answering questions about it, usually before you even bother anyone else with it. It needs to become an object in your own mind that you can spend time carefully questioning and scrutinizing, turning over each facet and understanding what you have, even before you bring it to other people's attention. These five questions should lay a solid groundwork for achieving the startling clarity that you will need:<br />
<ol>
<li>What is it that you want? <i>Hint: This answer should be <u>personal</u>, and probably long form.</i></li>
<li>Who is your audience? <i>Apps are not all that different from books.</i></li>
<ol>
<li>Why will the audience care? </li>
<li>How will it save them time, or improve their process?<br /><i>Remember: People won't use your software if it doesn't improve their life in some way.</i></li>
</ol>
<li>Why is it worth doing, what is going to make it be amazing? <i><u>Don't</u> be circumspect.</i></li>
<li>Where will the motivation come from? <i>For you and anyone else working the project.</i></li>
</ol>
These are reflective questions, not meant for easy answers. If you believe that you have found an easy answer, then you probably need to go back to the drawing board and keep brainstorming. These questions should generate a lot of thoughts as you really start to understand your idea. Paragraphs' worth, perhaps. <i><b>Once</b> you feel you have a solid understanding</i>, the next step is to distill these answers down to a <b>5-15 second elevator pitch</b>: something reasonably easy and quick for your target audience to understand, but that leaves them wanting more. This is <i><u>not</u></i> the pitch you should use on developers. Most developers have a finely tuned BS-o-meter, and won't be interested in a sales pitch.<br />
<br />
Once you're comfortable that you have a solid pitch, then you should move on to keep questioning your idea with deeper, more incisive questions. While these aren't prescriptive, they're a great starting point for things that you'll definitely need to understand:<br />
<ul>
<li>Who is your audience? Dig in to them. If you think you have multiple audiences, make sure you go over them all.</li>
<ul>
<li>Why do they care? What are their motivations?</li>
<li>Why will they use your software daily?</li>
<li>How is your idea better than your competition's? And don't fall into the pit of 'There's nobody else doing this'. There's always someone providing some kind of competition.</li>
</ul>
<li>Will this software have a revenue stream? <br />If yes (<i>and 99% of the time, if you're not the developer, it <u>should</u> be yes</i>): </li>
<ul>
<li>Where from? <i>Sales? Ads? Grants?</i></li>
<li>Why will they give you their hard earned money?</li>
<li>How much will it generate? <i>And don't be coy.</i></li>
<li>Is that worth up-keeping for at least 7 years? <i>Think maintenance and return on investment.</i></li>
</ul>
If no:
<ul>
<li>Where there is no money, there darn well better be a strong motivator. What is it?</li>
<li>How will you host and maintain the software if you have no money?</li>
<li>How will you get developers interested, and keep their attention?</li>
<li>Are you really sure it's worth keeping up for at least 7 years?</li>
</ul>
<li>When do you want it by? </li>
<ul>
<li>Are you going to need resources/money to develop it? </li>
<li>Where are those resources going to come from?</li>
</ul>
<li>Finally, and most importantly, <b>how will you know when you have your desired software?</b></li>
</ul>
These are just stepping stones to get you moving in the right direction at a high level. They're thinking points that will assist you in figuring out what your starting point looks like, and deciding if you are willing to be in it for the long term, and if you'll be able to actually get money out of your product.<br />
<br />
If you've made it through this entire process, only then should you really think about involving a developer. Understand that you're talking to someone who can help you realize your dream, but don't imagine you're asking for a doghouse when you're actually asking for a high-rise, and remember that they're not likely to build a doghouse for free, much less a high-rise.

Assembled Best Practices for Software Development (2015-10-28)

I've been curating a list of my favored practices for some time now. These ideas have come from a variety of sources (books, blogs, coworkers, myself, etc.), and they make up some of the best ideas and practices that I have seen in software services. There are many more I could talk about, but these are the most important ones that I think everyone who works in software development should be familiar with. Without further ado:<br />
<br />
<b>Understand the foundation.</b> This can be one of the most glaringly obvious, yet oft overlooked, points of an application. Every application has a goal: something that it aims to do better than anyone else. Generally, that goal leads towards two or three core competencies that the whole platform will revolve around. Make sure you know what those competencies are, and don't ever sell them out, or outsource them. Your evenings and weekends depend on it.<br />
<br />
<b>Features are like pets.</b> They're fun! But they have to be groomed, cared for, and carefully maintained. As such, like pets, you should be very careful about which ones you adopt. Whenever someone says, "We should add...", always start with "no". Make any non-core feature work to earn its place in your application.<br />
<br />
<b>Scratch your own itch.</b> The next big thing is a lie, and doesn't deserve your attention. Make something that saves you time and makes your life better. There will always be people trying to figure out how to force the next application to blow up into something big. Instead, just figure out what your itch is, and scratch it.<br />
<br />
<b>Always Be Shippin'. </b>Too many projects die before they even get off the ground. Figure out what the minimum viable set of features are, and then ship it as quickly as possible. After that, iterate on the design, and make sure you get it in front of your (preferably paying) customers, right away! They'll love you all the more for seeing your software evolve in front of them. Speaking of which:<br />
<br />
<b>Love your audience.</b> Your software is nothing without your customers. Focus on their needs and collaborating with them, more than the amount of money, or the contracts that they bring in. When they give you information, respond to it. The most important lesson of sales is to remember this one simple truth: People don't buy a product, they buy an emotion. Make them feel good over and over, and you'll have a customer for life.<br />
<br />
<b>The customer always believes they are right.</b> <i>But, they're probably not</i>. That's right. Conventional wisdom is wrong. That doesn't mean you treat them as if they're addled. On the contrary. You should always treat your customers with dignity, and respect, if not admiration. Yet, even when you do that, you should help them understand why your software works the way it does, and don't let them dictate terms that will lead you in to a feature haze. That way only leads to heartbreak. Listen to your customer, ask pointed questions, and act only when it is in the best interest of your entire audience. Never become myopic, as that always has an effect on the picture as a whole.<br />
<br />
<b>Build simple systems.</b> Some will find it frustrating or condescending, but building three ways to do the same thing in your system doesn't make you nicer; it makes your system more confusing. Find the one way that makes your system do the best possible work, and make it as intuitive as you possibly can.<br />
<br />
<b>Estimates are pure guesswork.</b> Don't fall into the trap if you can help it. Keep shipping features as you are able. If someone makes you guess how long it will take to implement a particular feature, imagine how long it will take, then multiply that by somewhere between 1.5 and 5, depending entirely on your own expertise and the complexity of the system at hand. If you think something you're not very familiar with will take you a day, and you're dealing with a complex system, don't feel bad about saying five days. If it's something you're very comfortable with, it's not complex at all, and you think it will take two hours, go with three. The key is this: <i>always under promise and over deliver.</i> People may think you're daft when you guess five days, but they'll think you're a miracle worker when you deliver in two, instead of grumbling about it taking twice as long as one.<br />
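The rule of thumb above can be sketched as a tiny helper. This is a toy illustration, not a real formula: the 1.5x-5x bounds come from the paragraph, but the linear blend of familiarity and complexity is my own assumption, and the function name is made up.

```javascript
// Toy sketch of the estimate-padding rule of thumb. The 1.5x-5x bounds
// are from the post; the linear blend below is an illustrative assumption.
function padEstimate(rawHours, { familiarity = 0.5, complexity = 0.5 } = {}) {
  // familiarity: 0 (brand new to you) .. 1 (know it cold)
  // complexity:  0 (trivial system)   .. 1 (hairy legacy system)
  const risk = ((1 - familiarity) + complexity) / 2; // 0 .. 1
  const multiplier = 1.5 + risk * 3.5;               // 1.5 .. 5
  return Math.ceil(rawHours * multiplier);
}

// "a day" (8h) on an unfamiliar, complex system: quote five days (40h)
padEstimate(8, { familiarity: 0, complexity: 1 }); // 40
// a familiar two-hour tweak on a simple system: quote three hours
padEstimate(2, { familiarity: 1, complexity: 0 }); // 3
```

The exact weighting matters far less than the habit: pad by your uncertainty, then deliver early.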
<br />
<b>Remember these three things</b>:<br />
<ol>
<li>There are things that <u>you know</u> that <u>you know</u>.</li>
<li>There are things that <u>you know</u> that <u>you <b>don't</b> know</u>.</li>
<li>There are things that <u>you <b>don't</b> know</u> that <u>you <b>don't</b> know</u>.</li>
</ol>
That last category is the largest of the three, guaranteed.<br />
<br />
Finally, the best way to learn is to <b>practice, scrutinize, adjust, and repeat</b>. Don't just keep doing the same thing that you've always done, else you'll always get what you've always got.<br />
<br />
<b><u>Code on.</u></b>

How Java and Ruby made me a better Node developer (2014-05-12)

Herein is a partial recounting of the bits and bobs that I have learned passing through different programming languages: principles and processes that I found useful, and not so. It is by no means a complete recounting, but is rather meant to help people understand why it's so important to try new things, even when you think you've 'got it down', and how each culture brings new and innovative ideas that can better your code. Onward...<br />
<br />
<h3>
Just shy of a decade with <u>Java</u></h3>
There are many things to be said in favor of and against Java. While I personally have no plans of returning to the language in the mid-term future, I learned many things from it and its community.<br />
<br />
The Java Virtual Machine is a rock. Meaning, it is a very suitable foundation upon which to build the next generation of languages, much the same as C is. It allows portable code to be crafted such that it is highly performant on just about any system the JVM runs on, in languages (Groovy, Scala, Clojure, JRuby, and more) that are much more terse and flexible.<br />
<br />
Everything in Java-land is configurable. This can be both a boon and a bane. Many Java libraries take their configurability to the point of absurdity, which in turn leads to codebase bloat. Spring is a fantastic example of this: it started off as a great DI platform, and ended up with nearly a J2EE level of complexity, in and of itself requiring hundreds of megabytes of space on disk, to say nothing of once it gets into memory. It is not desirable that every library should cover every possible option.<br />
<br />
The Java community has a love for design patterns. It was in Java that I was first exposed to many useful patterns that I still use in other languages; here are a few that I still use quite regularly: MVC (or rather, variants thereof), Observer, Messaging, Facade, Factory, and Flyweight. Many of the others still apply, but aren't as useful when you have greater syntactic flexibility. Realizing that these patterns are available to me has been a great help: when I am trying to reason through particularly difficult problems, I can fall back on some of these great patterns and find possible solutions. Unfortunately, the Java community takes this to an unhealthy level in a lot of cases, getting bogged down in talking about patterns instead of solutions. As martial arts masters say: first you have to learn how to do it, then you have to unlearn how to do it, and just do it.<br />
<br />
Document all the things. When you work with so many highly configurable pieces of code, you have to make sure you know exactly how to make those libraries perform as you wish. The Java community is fantastic at generating vast quantities of documentation. There is a lot of white noise when you go looking for a specific problem that you have, but more likely than not, someone has your problem documented somewhere; you just have to dig hard enough to find it. There are many communities that could learn from this approach.<br />
<br />
Boilerplate is bad, mm'kay? Strongly typed languages require lots of it, no matter how you slice the problem, particularly when trying to encapsulate absolutely everything. My takeaway from this is that boilerplate, while sometimes necessary, is undesirable. If you find yourself constantly writing the same pieces of syntax over and over, you're probably just wasting your time and making your code more complex. Sometimes the solution is to refactor. Sometimes you need to re-evaluate the tools that you're using.<br />
<br />
No conversation about Java would be complete without mention of its dogmatism. This was one of my greatest takeaways from the Java community: don't adhere to dogma. More than that, don't listen to appeals to authority. Just because it's writ in an RFC somewhere doesn't make it the right way to do something. The Java culture takes its 'thou shalts' to an unhealthy level, and its members end up writing massive amounts of code in order to prove their adherence.<br />
<br />
<h3>
Two years in <u>Ruby</u></h3>
Ruby was built with developer happiness and ease of use in mind. It is an object oriented language that inherits a lot of its thought processes from Smalltalk. It was my first completely object oriented language, and I must admit that I found its terseness a breath of fresh air when I first joined that world, after so long in Java. Because of my experiences coding for the web in Perl and PHP before moving to Java, I thought for a very long time that it would be almost impossible to go back to a dynamic language, given how poorly written all the dynamic code I had seen was. Oh how wrong I was, but it took a good amount of my own personal time, and a jump off the cliff into enterprise Ruby, to truly find out how "ready for prime time" these new dynamic languages have become!<br />
<br />
While I was not originally exposed to the <a href="http://en.wikipedia.org/wiki/Pareto_principle">Pareto principle</a> ("80/20 rule") in Ruby, it was the first time that I saw many libraries trying to follow its advice. While controversial to many people, I saw this work out very well in a lot of cases. Take ActiveRecord for example. That library makes it very easy to do basic database queries against any number of relational databases, but if you need to do something more specific, like call a stored procedure, you can just use the connection directly, rather than the library trying to encapsulate that functionality somehow.<br />
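ActiveRecord is Ruby, but the 80/20 escape-hatch idea is portable. Here is the shape of it sketched in JavaScript; the repository, the fake connection, and every name in it are invented for illustration, not any real library's API:

```javascript
// A fake low-level connection standing in for a real database driver.
const connection = {
  query(sql) { return { sql }; } // a real driver would return rows
};

// The 80% case: a tiny helper covering the common queries...
class UserRepo {
  constructor(conn) { this.conn = conn; }
  findById(id) {
    return this.conn.query(`SELECT * FROM users WHERE id = ${Number(id)}`);
  }
  // ...and the 20% case: hand back the raw connection instead of
  // trying to wrap every feature (stored procedures, hints, etc.).
  raw() { return this.conn; }
}

const repo = new UserRepo(connection);
const common = repo.findById(7);
const special = repo.raw().query('CALL nightly_rollup()');
```

The library covers the boring majority and gets out of the way for the rest.<br />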
<br />
Convention over configuration. A phrase that seems to have come directly from the Ruby community, as far as I can tell. Those involved in Ruby still build some fairly large libraries (though nothing like the monolithic ones in Java), but they provide sane defaults. A good portion of the time, even if you're using a fairly large library, you can start by just bringing it in and using the default settings that it provides. Ruby was the first language I had seen provide sane defaults that you didn't have to shepherd and tweak to get just right.<br />
<br />
Keeping it Simple and Stupid is a mantra for the Ruby community. That's where a lot of the innovation that people hear about from Ruby-land comes from. Don't overbuild. As <a href="http://david.heinemeierhansson.com/">DHH</a> said, <a href="http://david.heinemeierhansson.com/2012/rails-is-omakase.html">Rails is omakase</a>. Don't like ActiveRecord? Don't use it. Need to kick some processing out of your web worker thread? Great, put it on a queue somewhere for a worker to pick up, but don't overbuild that queue. You should have sane defaults so that you can get a worker on that queue almost immediately. Anything else is overbuilding and a waste of time.<br />
<br />
Two very important acronyms: DRY (Don't Repeat Yourself) and YAGNI (You Ain't Gonna Need It). These two are mantras that I find myself repeating constantly while I'm programming. "Do I need to be able to do that with this piece of code? ..Nope!" And, when I start to copy and paste a set of code, or start retyping code that I know is elsewhere, I immediately stop myself and start refactoring so that I only have one set of code performing any particular function. I can't even begin to recount how many times this has saved my bacon when hotfixing applications.<br />
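In code, DRY usually means: the moment you catch yourself pasting, extract. A contrived JavaScript sketch (the discount math and names are made up):

```javascript
// Before: the same discount math pasted into two places meant
// two spots to hotfix. After: one function, one place to fix.
function applyDiscount(price, rate) {
  return Math.round(price * (1 - rate) * 100) / 100;
}

const invoiceTotal = applyDiscount(100, 0.1);  // was inline: 100 * 0.9
const cartPreview = applyDiscount(59.99, 0.1); // was a pasted copy of the same math
```

When the rounding rule changes, it changes in exactly one place.<br />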
<br />
Rails Style: "Skinny controller, fat model". This means that you should put your business logic, the logic that actually performs changes, in the model. The controller shouldn't be overbuilt; it should only contain enough code to tell the model what to do with itself, and then pass the information back up to the client. I think this doesn't go far enough sometimes. There are usually opportunities to improve even further on this during fairly complex processes. Rather than expanding a model to quite large proportions, I occasionally take a cue from Java land and write up a service layer instead, so that you're not bloating your controller or your model to an unmanageable level.<br />
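The split is easy to sketch outside of Rails. Here it is in plain JavaScript, with invented names: the model owns the rule, the skinny controller just translates, and a small service coordinates when several models are involved:

```javascript
// Fat model: business rules live with the data.
class Account {
  constructor(balance) { this.balance = balance; }
  withdraw(amount) {
    if (amount > this.balance) throw new Error('insufficient funds');
    this.balance -= amount;
    return this.balance;
  }
}

// Optional service layer: coordinates several models so neither
// the controller nor a single model bloats.
function transfer(from, to, amount) {
  from.withdraw(amount);
  to.balance += amount;
}

// Skinny controller: parse input, delegate, shape the response.
function withdrawController(account, params) {
  const remaining = account.withdraw(Number(params.amount));
  return { status: 200, body: { remaining } };
}

const acct = new Account(100);
const res = withdrawController(acct, { amount: '30' });

const a = new Account(50);
const b = new Account(0);
transfer(a, b, 20);
```

Each layer stays small enough to read in one sitting, which is the whole point.<br />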
<br />
While the Ruby culture is not quite as dogmatic as the Java culture, its members still definitely believe there is a right way and a wrong way to build your applications, particularly if you live in Rails-land rather than one of the subcultures (Padrino/Sinatra, for example). Still, the Ruby culture is more palatable and less stodgy than the Java one, in my estimation.<br />
<br />
What's more: The Ruby culture taught me that development can be fun again. I blame that expressly on the fact that so few "paycheck developers" spend their evenings learning Ruby, so it is a group of enthusiasts instead of people that are looking for their next gig.<br />
<br />
<h3>
A year of <u>Node</u></h3>
My introduction to Node came when I started using it regularly as tooling, which I do believe is how it is getting its induction into a lot of development groups! Whether it is setting up an asset pipeline (grunt/gulp), or giving you a foundation for your next web application (yeoman), or managing your front end JavaScript dependencies (bower), there are a lot of uses for Node without even utilizing it as a primary language. That's a pretty big win. Once something starts to feel familiar, you can't help but get curious about how more of it works. Eventually, I decided to take the plunge. I'm glad I did, despite some of the pundits that speak of Node's insufficiencies.<br />
<br />
So far in my time spent in Node, there are two big things that I have taken away, one of which I've mildly hinted at in the Java and Ruby sections: modularity. Ruby still has some fairly large stacks that it requires one to use. While Rails is not one monolithic framework, but a collection of fairly large frameworks, it is still nearly 200mb in memory when first loaded. While memory is cheap, and disk is too, I still don't believe that relieves us developers from paying attention to those numbers. When I first built an Express application with Node and it used less than fifty megabytes from the get go, I knew I was hooked. Sure, there are some lighter weight frameworks in Ruby and Java, but that's not really how their communities think. The strongest and most active communities in Node revolve around the ExpressJS foundation. The primary application that I support today is less than 150mb with all code, in memory. The largest Node app that I maintain doesn't go over 300mb in memory until it starts taking requests. I remember one Java gig where they had a server with 32gb of memory waiting for me on my first day, because the application itself took up more than 8gb in memory... I still look back and shake my head.<br />
<br />
Asynchrony is pure win, but has its trade offs. It comes at the cost of comprehension: it can be harder for the developer to reason through when nearly everything is an event (everyone that has spent much time in JavaScript has heard of "Callback Hell"). Despite that, there are many advantages to having your code be non-blocking. Any program that has to spend time in wait cycles (and any program that accesses a database spends <b>lots and lots</b> of time waiting) is losing a lot of compute cycles just sitting and waiting for that response. Node takes full advantage of that time spent waiting to do other things, instead of making everything wait in line. This is primarily achieved with the heavy use of streams. Any non-trivial application will eventually end up doing some work with them. It'll probably bend your mind a bit at first, but eventually you'll see the light. At this point, I'd have a really hard time justifying going back to an "IO Blocking" language, having seen the ease with which Node can scale.<br />
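The non-blocking idea can be modeled without any real I/O. The sketch below is emphatically not Node's internals, just a toy that shows the shape: register a continuation instead of blocking, keep working, then drain the queue:

```javascript
// A toy event loop: callbacks queue up while "I/O" is pending,
// and the single thread keeps draining the queue instead of blocking.
const queue = [];
const log = [];

function fakeQuery(name, callback) {
  // Instead of blocking until the database answers, register the
  // continuation and return immediately.
  queue.push(() => callback(`${name}-result`));
}

fakeQuery('users', (r) => log.push(r));
log.push('still free to do other work');
fakeQuery('orders', (r) => log.push(r));

// Drain the queue, like the event loop running pending callbacks.
while (queue.length) queue.shift()();
```

Notice that "still free to do other work" happens before either result arrives; that interleaving is the whole trick.<br />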
<br />
<h3>
And.... That's a Wrap!</h3>
Phew! That was quite a bit! I hope you found this review interesting, or at least a bit insightful. Feel free to leave me a comment; I'd like to hear what you think. Either way, I would encourage you to not get too comfortable where you're at. Keep pushing the boundaries. Keep innovating. Because that is what being a coder is all about.Anonymoushttp://www.blogger.com/profile/08428238230632879997noreply@blogger.com2tag:blogger.com,1999:blog-7225248065541103252.post-89095549337181993892014-05-09T11:07:00.003-05:002015-10-28T13:57:20.569-05:00Getting Uncomfortable: Three Years in RetrospectIt's been years since I've written in this blog, yet I feel that many of the things herein have borne out what I was originally speaking of, particularly with regards to my predictions about CoffeeScript. There are more options than ever for writing JavaScript with syntactic sugar: CoffeeScript, Dart, TypeScript... Pretty impressive for a dynamic language!<br />
<div>
<br /></div>
<div>
It can be a little scary jumping in to a new pool of knowledge, even if some of your old knowledge still applies. That's how I felt when I left behind Java, and dove in to the wide world of Ruby, and even more so later when I dove in to Node.</div>
<div>
<br /></div>
<div>
Minneapolis is not a "technologically innovative" part of the country. The majority of developers that live in this area are mired in technologies that are "safe" or "proven" rather than pushing the boundaries. Though I'm not sure Ruby counts as pushing the boundaries, considering that it has been around since 1994, and Rails has been under active development since 2004. Somehow that still qualifies as "new" technology in a lot of people's eyes. I can only imagine what they would think of those writing applications in Node, which was first released in 2009! </div>
<div>
<br /></div>
<div>
Somewhere along the line there was a shift in technology culture. I can't imagine anyone that was in technology during the 70s and 80s having the same thought process. They were chomping at the bit to push the boundaries of what could and should be done. We wouldn't be where we are today without those innovations. Sure, not everyone can be on the 'wave' of technology; there have to be many holding down the old tech, keeping it all under control. But the change has become fairly clear. What's more, it's surprising how many greenfield projects end up choosing Java or .Net as their building blocks, simply because it's what their teams have used for more than a decade.</div>
<div>
<br /></div>
<div>
With that as background, here is what I can tell you about living in the dynamic language world:</div>
<div>
<ol>
<li>Don't hesitate to use Ruby or Node for your next production application. If Github, New Relic, 37Signals, and Ravelry can scale Ruby, so can you. If Paypal and Walmart can scale Node, so can you.</li>
<li>There is massive scalability with some of the new frameworks out there, without having to stand on your head while holding three blocks with your feet. Worry more about your problem domain, and choose tools that help you solve those problems swiftly. Speaking of which:</li>
<li>Time has a cost. It's called opportunity. When you spend time writing an application 'the only way you know,' and that way takes a factor of 5 times longer to build the application, you need to find a new way. Give your idea life, don't let it flounder in a quagmire of "we've always done it this way".</li>
<li>Embrace new ways of thinking. Every time I have adopted a new language, it has brought new thought patterns to light, which in turn makes my code better, and thus my end product stronger. </li>
</ol>
Don't be skeptical of new technology, embrace it. Try it. Extend it. How do you think the Java and .Net platforms grew to be the size they are? Because people with their own visions were constantly hacking on them. That's the way they managed to mold the subculture. Maybe you don't like what Ruby brings to the table. That's understandable, but standing still is not. Try Python instead! Too mainstream? Sure! Try Node! The asynchrony will make your head spin. JavaScript not your thing? Okay, but I think you're just making excuses now: Try one of these powerful languages: Elixir, Erlang, Go, or Rust! It's an exciting time for developers. There are so many new ways to build massively scalable applications. Take up the challenge! Don't eschew the new for the old. Get out of your comfort zone. See what's out there. </div>
<div>
<br /></div>
<div>
You <b>will</b> be better off for it. Even if you never deploy a production app with it.</div>
<div>
<br /></div>
<div>
Stay tuned for my next post: How Java and Ruby made me a better Node developer.</div>
Anonymoushttp://www.blogger.com/profile/08428238230632879997noreply@blogger.com0tag:blogger.com,1999:blog-7225248065541103252.post-15643587423461274462012-06-01T10:59:00.000-05:002014-05-09T12:32:51.061-05:00CoffeeScript Is the future of JavaScriptWhat is <a href="http://jashkenas.github.com/coffee-script/">CoffeeScript</a>? It's an abstraction on top of JavaScript that makes your code more maintainable, readable and beautiful. Your CoffeeScript compiles down to JavaScript that runs as fast (if not faster) than your otherwise hand coded JavaScript would. A bonus is that it runs through JSLint error-free.<br />
<br />
Why do I think that CoffeeScript will take over the landscape of JavaScript? Simple. Because it's <b>committee-free.</b> There is no major group that controls it. It's a simple abstraction on top of current standards that forces you to write <u>sane JavaScript</u> (almost as if it came out of the book <i>JavaScript, The Good Parts</i>).<br />
<br />
JavaScript has been a nearly <i>quality-control-less</i> language since the beginning. There have been some great strides towards forcing better quality on JavaScript, but most of them are trying to drag other paradigms into the JavaScript world, rather than forcing good JavaScript practices from the very beginning. Things such as appropriate scoping have long eluded even strong JavaScript developers. CoffeeScript makes JavaScript "<b>feel</b>" a little bit more like a fully featured object oriented language.<br />
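Scoping is a good concrete example. CoffeeScript declares every variable for you and wraps each file in a closure, so nothing leaks into the global scope by accident. A rough sketch of input and output (the closure wrapper is shown in a comment here so the result stays inspectable):

```javascript
// CoffeeScript source:
//   square = (x) -> x * x
//   total  = square 4
//
// compiles to roughly the JavaScript below, normally wrapped in
// `(function() { ... }).call(this);` so nothing leaks globally:
var square, total;

square = function(x) {
  return x * x;
};

total = square(4);
```

You simply cannot forget a `var` in CoffeeScript, which removes a whole class of accidental-global bugs.<br />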
<br />
"What about the next version of ECMAScript? Wont that fix a lot of the problems?"<br />
<br />
Where is it? I still don't see even a shell implementation in one of the latest and greatest modern browsers. Moreover, when can we use it? I'm still coding to IE6 specifications on occasion. How long is an acceptable grace period before I can just start disregarding the previous version of ECMAScript?<br />
<br />
The answer is here and now. Something on top of JavaScript to <b>enforce standards</b> and make things <b>more readable</b>. Yes, it's syntactic sugar on top of JavaScript. That's a <b><i>good</i></b> thing. JavaScript is half-baked and half-readable- CoffeeScript will make life easier while we're waiting for the next big script level to get to where we can use it regularly (which I'm guessing at somewhere around 2016).Anonymoushttp://www.blogger.com/profile/08428238230632879997noreply@blogger.com0tag:blogger.com,1999:blog-7225248065541103252.post-52546183693926802892012-03-21T20:51:00.000-05:002012-03-21T20:51:39.283-05:00The JavaScript RevolutionThe writing is on the wall. <i>If you're not learning more JavaScript, you're going to get left behind.</i><br />
<br />
Does that mean that you wont be able to get gigs? Certainly not. It just means that the greatest level of innovation seems to be happening in JavaScript land. Let me tell you a story...<br />
<br />
<i><b>Once upon a time</b>, in a computer generation far far away</i>, ECMA Script was a scripting language in its infancy. Something designed to make <i>life developing in a browser just a little easier</i>. Java came along and was getting popular about the same time. So ECMA Script jumped on the bandwagon and branded itself <u>JavaScript</u>! That helped it gain a little popularity. Enough to hold its own in the browser space.<br />
<br />
Many hesitated to use this JavaScripty thing because it was considered <b>unapproachable</b> or too <b>difficult</b> to bother with, because every browser implemented its core functions just a little bit differently. Enough to make it difficult to ensure that every browser would behave similarly.<br />
<br />
Then, in come the <u>standardized libraries</u>. This was the <i>beginning of a revolution</i>. It was when people began to realize that JavaScript wasn't about writing messy code in one huge file that barely managed to work. People started innovating. Creating their own ways of interacting with each individual browser, splitting up their scripting files into more manageable chunks, and so forth.<br />
<br />
<u>That was the tipping point</u>: when people realized that this <b>somewhat abnormal functional language</b> could be used to do a lot of very powerful things like AJAX, browser animation and more. What is more, it's possibly the most popular language out there that has many of its foundations coming right out of the LISP playbook. I digress. Anyway, that was about the same time that the big browser companies began to take notice and start optimizing their JavaScript engines (really, we have Chrome to thank for the JavaScript engine optimization war). What is more, people started thinking about the best ways to utilize these new capabilities, and a whole new era of libraries was born: <a href="http://www.sencha.com/">ExtJS</a>, <a href="http://sproutcore.com/">SproutCore</a>, <a href="http://documentcloud.github.com/backbone/">Backbone</a>, and so many more. These have all been used to push the browser to new levels of near-desktop application style and design. <i>Who would have guessed that JavaScript would become the glue of the browser?</i><br />
<br />
What's more,<i> it doesn't stop with the browser</i>. Nothing as solid as V8 can stay in one place. V8 is the JavaScript engine built by Google for their Chrome web browser. The thing is... it was fast. Really, really fast. Why couldn't the server side also benefit from such a fast engine? The answer is... it could! <b>NodeJS was born</b>. Not the first server-side JavaScript runtime to show up on the scene, but definitely the one to catch on, complete with package management in a similar way to what is done in the Ruby community.<br />
<br />
This is a new way of working: <u>"Event" based programming</u>. It's an upstart. A powerful one. People have started to realize just how much idle processor time they have, and how much could be utilized by not constantly having a blocking thread. It's like the idea of not using transactions for databases. <i>The first time you see it, it just doesn't make any sense</i>. You have to put the time and effort in to understanding it before you can utilize its power.<br />
<br />
I don't think it's unfair of me to say: <b>JavaScript is the glue that holds a fair portion of the internet together</b>. As we move forward, I can say with some level of confidence that JavaScript will continue to take a larger role in server and client web application development.<br />
<br />
<br />Anonymoushttp://www.blogger.com/profile/08428238230632879997noreply@blogger.com0tag:blogger.com,1999:blog-7225248065541103252.post-18046958754533787992012-03-16T00:55:00.000-05:002012-03-16T00:56:30.009-05:00Three Months with Extreme ProgrammingWith nearly a decade of enterprise development experience under my belt, I have finally had a brush with <b><u>Extreme Programming</u></b>. For those that don't know what that is, it is a form of <i>Agile development</i> that has a few very <a href="http://www.extremeprogramming.org/rules.html">specific qualities</a> that differentiate it from all the others. Most notably from a development perspective, <u>pair programming on all coding</u> (which turns out to be- all day, every day). There is a concept of heartbeat and rhythm, but those are shared with many other Agile shops.<br />
<br />
My focus for this review is going to be regarding the results of pair programming each and every day.<br />
<br />
<span style="font-size: large;"><b>The Short Term Effects</b></span><br />
<br />
Each day starts out by figuring out who is going to be pairing with whom. There are differing ways of dealing with this, but oftentimes it is fairly organic: those that show up first pair together. This creates something of an uneven pairing process (if you're trying to keep the pair partners fluid and changing). If that ends up happening, it's important to use some kind of pairing chart or reminder to force a change in the pairs mid-day and keep people working with fresh partners. It's hard on everyone to consistently work with the same people.<br />
<br />
Every developer on the project has <u>exposure to every system</u> that is under development. In theory this leads to no one person ending up the guru of a particular system. That effective knowledge transfer is a highly sought after commodity in the enterprise world.<br />
<br />
As <b>a byproduct of such close quarters</b>, <u>pairing generates a lot of communication and design chatter</u>. It's not about just throwing code down immediately. Each line of code costs more, but gets a higher level of design consideration. It's argued that this is where time is actually saved, because code that isn't needed never gets written.<br />
<br />
Meetings become more difficult because it becomes necessary to re-figure out the pairs after each meeting, and usually there is a need for a fair amount of breaks for each individual.<br />
<br />
<span style="font-size: large;"><b>The Long Term Effects</b></span><br />
<br />
<b>Burn out becomes a serious problem</b>. When everyone is operating at <i>100% capacity on an ongoing basis</i>, and going home completely wiped, morale suffers. In turn, that ends up requiring some kind of mitigation in one way or another. Whether that's through some form of a 20% rule (where one day of the week is spent separately doing other things), or something else, <b><u>it is necessary</u></b>. <i>It is as if you're constantly operating a power reactor at the red line</i>. Eventually, it's going to have blowback, and it's going to need a serious overhaul.<br />
<br />
Due to burnout and the high level of collaboration, pairs can drift into a person-that-writes-the-code and a person-that-reads-the-code (because they need a break). That can also happen when one person has a more domineering personality, or the other a more submissive one. It can have a very detrimental effect on the team as a whole.<br />
<br />
That isn't to say that it's all bad. Far from it. The shared ownership of the codebase, the greater all-around knowledge, and the potentially tight bonds of the team all make for a pretty winning scenario.<br />
<br />
<b><span style="font-size: large;">What is the net result?</span></b><br />
<br />
Balancing on the head of a razor is extremely difficult. The more people involved, the harder it becomes.<br />
<br />
<ul>
<li>Continuous collaboration, but burnout from being overworked.</li>
<li>Efficient knowledge transfer, but tight working conditions.</li>
<li>Higher code quality, but a slower coding rate.</li>
<li>Domineering personalities vs. submissive ones.</li>
</ul>
<br />
<b>It's about the trade offs</b> and being <b><u>honest</u></b> with yourself. I found that <i>this particular style of development </i>was not for <u>me.</u> That is not to say that I don't think it has many positives. It just requires the <i>right personalities</i>, a lot of <i>commitment</i>, everyone <i>buying in</i>, and willingness to constantly <i>evolve the team</i>. Many developers are not right for this style. Me? I found myself anxious constantly. I have my own rhythm when coding, and I don't like having to artificially hold that rhythm in check. It doesn't feel natural.<br />
<br />
<u>Those that can do it have my respect</u>. It's a strong way to get the job done, and get it done right.Anonymoushttp://www.blogger.com/profile/08428238230632879997noreply@blogger.com0tag:blogger.com,1999:blog-7225248065541103252.post-13354428706543142442011-12-13T11:45:00.002-06:002011-12-13T11:47:38.695-06:00Seven Deadly Sins of Java Web Application DevelopmentI have been involved in enterprise level Java web application development for the better part of a decade at this juncture. I have deployed applications in the law, education, healthcare and business industries. I have worked with technologies ranging from <i>JSPs and Servlets on to J2EE EJBs, Spring IoC and MVC, AspectJ, Struts, JSF, Seam, Wicket, JUnit, PowerMock, Maven, Ant, Hibernate, iBatis, Freemarker, Velocity, JNDI, JMS</i>... The acronym-soup goes on and on. These technologies are all things that make the "Java development culture" proud!<br />
<br />
I often find myself stunned at choices that are made in these cultures. Choices that make development more complex, reduce productivity, cause release delays and just flat out waste time. I have had a lot of reasons quoted at me why it is necessary (necessary!) to use Java for application development. These reasons range from "only Java is stable and scalable enough for us" to "that's where the developers are" and on to my personal favorite: "That's what I know, and it works great! Why should I change?" Ugh!<br />
<br />
Why should you look at other languages and frameworks? Because most web applications that are developed in Java are wasteful! Nine hundred and ninety nine web applications out of a thousand don't need Java-level throughput. What's worse, most web applications developed in Java aren't even taking advantage of Java's speed because they're mired in unnecessary frameworks doing excessive reflection. I couldn't even begin to guess how many hundreds of hours of my life I've wasted sitting and waiting for builds.<br />
<br />
Before you say that I'm being unreasonable by lumping all Java web applications together, I will say that there are some attempts to pare these problems down. They just aren't what the larger community accepts. Maybe you are the exception. Could be. If so, I admire your tenacity. Anyways, onwards. To what I perceive as the seven deadly sins of Java development...<br />
<br />
<b>Deadly Sin # 7: Cleverness</b><br />
<br />
It's not easy managing Java Developers. They're hard to pin down, and they spend a lot of time talking about these great architectures they are going to come up with (usually within other architectures that have already been implemented). You end up with a composition nightmare. Instead of the spaghetti code that used to result from bad coding, you get lasagna code: layer after layer of complex abstraction upon complex abstraction. One of the most important things to remember when developing is that code is twice as hard to test as it is to write. That means that if you write the most complex code you can, in the cleverest way you can, you are, by definition, incapable of testing it thoroughly.<br />
<br />
<b>Deadly Sin # 6: Experience</b><br />
<br />
Most Java developers show up straight out of college. Some have a background in coding, but most have done their time in internships. Usually that internship has solidified their knowledge in a particular piece of the pie that is Java, and not broadened their horizons further. Schools don't all have the same curricula, but they tend to preach Java at the present time. That is just plain dogmatic. It's as if they're saying that you need a bulldozer to paint a sidewalk. While you probably could do it, how much sense does that really make?<br />
<br />
Generally speaking, it takes about 3 years to turn yourself into a well rounded Java developer. To understand all the idioms, jargon and idiosyncrasies of the language. During those three years you will be inundated with all sorts of concepts, like inversion of control and aspect oriented programming, many of which are a direct result of the need for more dynamic language features in the code. That's just one big code smell. If you want a dynamic language, why not use one?<br />
<br />
<b>Deadly Sin # 5: Drive</b><br />
<br />
Learning all there is to know about Java requires a certain amount of drive. It's not something you can dabble in. You either are a Java developer or you aren't. You know it or you don't. While that may be a boon in some cases, it can become difficult to see who is pulling the acronym-soup wool over your eyes. Sure, you know the words for inversion of control, but what does it do? What is its purpose? Is it right for your project? Does it make your code more or less maintainable? The party line answer isn't good enough. This kind of drive is actually pretty rare. It also means that over the course of a Java developer's career, they are going to have to learn and relearn all sorts of technologies. The culture continues to get more clever, requiring more out of every developer that wants to be involved.<br />
<br />
<b>Deadly Sin # 4: Cost</b><br />
<br />
This is gluttony, pure and simple. The cost to run a Java development environment is extreme. Java developers require the best machines, with the most memory. Their environment usually requires a continuous integration server, a versioning server, and more. All of which have to run on different machines, because the Java software that integrates their Java software is so bloated that it requires devoted resources to run at acceptable speeds. Once you get past the development environment, you're probably looking at a fair sized server farm just to run the basic application that you want. Perhaps you can get around that by running in a cloud, perhaps not. Most companies want total control, and that means that your Java apps require a massive investment before you even consider having your first real client on the system. This doesn't even take into account what Java developers consider themselves to be worth, because of their lengthy college education and their ability to wade through the overly clever technologies they are working with.<br />
<br />
<b>Deadly Sin # 3: Time</b><br />
<br />
Waiting on your IDE to load, index, churn through all of your code, stop pausing when you're trying to write code. Run your unit tests within your IDE. Waiting on builds. Waiting on compilation of your code. Copying new code over. Setting up and forcing Java servers to run as you expect them to. Looking through hundreds of thousands of lines of log files to find out that you had one single little configuration error that caused hours of downtime. These are just a few of the wastes of time in Java development. Every Java developer has wasted hundreds of hours by going through these processes. <br />
<br />
When you are building a web application the mantra should be: deliver early, deliver often. Without clients, you're not getting feedback, and you're not making money. This is almost impossible when you're spending a fair amount of your time just forcing your development environment into submission time and time again.<br />
<b><br />Deadly Sin # 2: Unproductive</b><br />
<br />
Java is verbose. Extremely verbose. Not only does its syntax require five lines where one would do for particular tasks, the culture behind the language actively embraces the ceremony. If you looked at your average Java web app you would find it replete with XML files, properties files, and javadoc blocks. Instead of running with sane configurations from the get-go, you have to define that sane basis yourself, even for all the frameworks out there that are supposed to ease your development cycle or provide particular functionality. You can't write self-documenting code because you're working at too low a level. This leads to a lot of time spent debugging frameworks and writing config files, instead of writing feature-driven code.<br />
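To make the verbosity concrete, here is what a trivial value object costs in Java. The Book class is a hypothetical example of mine; in most dynamic languages the equivalent is a one-line declaration.

```java
// Hypothetical example: the ceremony Java demands for a trivial,
// immutable value object -- a constructor plus a getter per field,
// where a dynamic language needs a single line.
public class Book {
    private final String name;
    private final String contents;

    public Book(String name, String contents) {
        this.name = name;
        this.contents = contents;
    }

    public String getName() { return name; }

    public String getContents() { return contents; }
}
```

And that is before equals, hashCode, toString, and the javadoc blocks convention expects on each of them.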
<br />
<i>And finally...</i><br />
<b>Deadly Sin # 1: Distractions</b><br />
<br />
Java developers are clever. By nature they must be. The frameworks they work with all day, every day, enforce and reinforce that concept all the time. Instead of writing twenty lines of servlet code for a simple API, they'll pull in a framework: Struts, Jersey, Spring MVC. They'll waste hours setting up the config files, and it will never enter their consciousness that they've just completely over-engineered a twenty-line program into something that requires hundreds of lines of XML.<br />
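For contrast, here is roughly what those "twenty lines" can look like with no framework at all. This is a sketch of my own using the JDK's built-in com.sun.net.httpserver (bundled since Java 6) rather than the Servlet API; the /book endpoint and its JSON payload are invented for illustration.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

// A simple one-endpoint API in about twenty lines: no framework,
// no XML configuration, nothing but the JDK.
public class TinyApi {
    public static HttpServer start(int port) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/book", exchange -> {
            byte[] body = "{\"name\": \"the super duper\"}".getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().add("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(body);
            }
        });
        server.start();
        return server;
    }

    public static void main(String[] args) throws Exception {
        start(8080); // serves GET http://localhost:8080/book
    }
}
```

Run main, GET http://localhost:8080/book, and you get a 200 with a small JSON body. Everything past this is a choice, not a requirement.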
<br />
This produces unreadable, unproductive and unmaintainable code. A great many developers have been forced to maintain these codebases and have either become enamored of them (because they must be, to keep supporting them) or simply forced themselves to deal with them. Worse yet, most developers lose sight of the goal. They forget that their application still has constraints, because it is taken as a given that most companies will simply 'deal with' the extra cost of the hardware that's needed to run it.<br />
<br />
Don't get lost in the mire. Know what you're developing. Know how much engineering it needs. Don't add eight layers when you only need two. Remember: Keep It Simple, Stupid. The simpler your app is, the easier it will be for you to write the next big feature.<br />
<br />
<i>Stay tuned for my comparison of developing a web application in <b>Java</b> versus <b>Ruby</b>.</i>Anonymoushttp://www.blogger.com/profile/08428238230632879997noreply@blogger.com0tag:blogger.com,1999:blog-7225248065541103252.post-88686607414586786932011-11-01T11:21:00.000-05:002012-03-21T20:12:41.229-05:00Toxic Perceptions in Development<i>Being a programmer in the corporate world is a somewhat arduous task</i>. There are a lot of assumptions thrown around from within and without. Those outside the world of development tend to see development departments as something of a police-free zone, where anyone can do whatever they want without repercussions or having to adhere to deadlines. This couldn't be further from the truth.<br />
<br />
Developers have a bit more leeway when it comes to technology, because they have to have it. <b>In order to do their job</b> (<i>you'll note this is unqualified- it's a requirement, not optional</i>), developers must have such software as Firefox, Firebug, Eclipse, Fiddler, open internet connections, and the list goes on and on. Corporate IT can't restrict them (or shouldn't), because all it will do is slow down the development cycle and force deadlines to be pushed back even further.
<br />
Deadlines come and go for development, it's true. <i>This is the single biggest driver of all the churn and turnover in development groups</i>. Miss a deadline, and one day you have 250 contractors in a massive project, and within a month, you could be down to a hundred. Or you could go from 30 down to 15 almost overnight, with two managers stepping down. Both of these have occurred in my time with corporations. It's a given, because <u>software development is more of an art than a science</u>. It doesn't adhere well to dates, because understanding the "whole picture" in a project is something that very few people can do. Even when a person can, there are always unexpected hiccups and stopgap measures. Imagine a construction project suddenly having a supplier run out of building material. This causes timelines to be pushed, and that doesn't make executives happy.<br />
<br />
While the <b>outside perception definitely hurts</b> development on occasion, it's not the piece that <i>strains developers the most</i>. It's the <b>perceptions</b> from the <b>inside</b>.<br />
<br />
<i>During college</i> or their <i>early learning periods</i>, developers get taught certain ways of seeing problems. <i>Many never question these perceptions</i>- they just start to feel as if there is only one right way to look at a problem, or a very limited number of ways to solve it. That is the exact opposite of what companies are looking for. <b>Innovation</b> is about bringing <u>creative and new ideas</u> to the table, trying new things that aren't simply extensions of the old, stagnant ones.<br />
<br />
In order to be clear about what I mean, here is an example: In a recent rewrite of an application, we went from a Java stack based on Struts 1 with some Hibernate thrown in, to another Java stack with Spring, iBatis and a DMS library, with the hope that this would speed up productivity and make our code more maintainable.<br />
<br />
The core problem in our move from one version of the application to another was that we did not evaluate other languages and architectures that might be better suited to developing a web application- Groovy and Grails, Ruby on Rails, Python/Django, et al.- <i>because developer preconceptions got in the way</i>. This concept extends beyond just the language and patterns. It should extend into some of the broken areas of application development, where developers spend more time in XML files or property files than in actual code, and more time writing unit tests or performing a build than writing code.<br />
<br />
I've said it before, and I'll say it again: <b>A good developer should be able to learn new languages and new patterns quickly and easily.</b> If they can't, I would call into question whether they fully understand application development.Anonymoushttp://www.blogger.com/profile/08428238230632879997noreply@blogger.com0tag:blogger.com,1999:blog-7225248065541103252.post-57575500605639270822011-05-19T12:32:00.003-05:002011-06-01T17:49:36.277-05:00Quick and dirty intro to REST within HTTP<i><b>RESTful</b> web design</i> in terms of the HTTP protocol breaks down quite simply. It all boils down to how the URI is formed. Many systems handle this in different ways on the server, but it's simply about following the basic URI pattern:<br />
<br />
(<b>action</b>) http://(host)/(<b>resource(s)</b>)<br />
Request Body: (<b>description</b>)<br />
<br />
<b>Action</b>: Verb-based CRUD, where GET is retrieve, POST is create, PUT is update, and DELETE is delete.<br />
<b>Resource</b>: Noun based object. Usually found in value pairs (object/id) when not creating a new resource.<br />
<b>Description</b>: POST and PUT have request bodies. This should describe details about what you want the object to have, or what you want to change.<br />
<br />
<u>Several examples of each type of action:</u><br />
<br />
<ul><li>GET http://host.com/book/123abc789</li>
<li>POST http://host.com/book<br />
Request Body: {book: {name: "the super duper", contents: "once upon a midnight dreary..."}}</li>
<li>DELETE http://host.com/book/987bca123</li>
<li>PUT http://host.com/book/123abc789<br />
Request Body: {book: {name: "new super"}}</li>
</ul><br />
<u>Here are a couple examples of nested objects, where a shelf has rows, and a row has books:</u><br />
<br />
<ul><li>GET http://host.com/shelf/23a/row/5/book/21</li>
<li>DELETE http://host.com/shelf/23a/row/4</li>
</ul><br />
It becomes pretty darn obvious exactly what you're doing when you look at URIs in this manner. The retrieved representation should have some kind of default, but you also should be able to specify a particular representation (if it makes sense for your implementation). For example:<br />
<br />
<ul><li>GET http://host.com/shelf/23a/row/5/book/21.pdf</li>
<li>GET http://host.com/shelf/23a/row/5/book/21.html</li>
</ul><br />
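From the client side, these verb-plus-URI calls map directly onto the JDK's HttpURLConnection. A small sketch of my own (the prepare helper is illustrative, and host.com mirrors the hypothetical host in the examples above):

```java
import java.net.HttpURLConnection;
import java.net.URL;

// Sketch: configuring RESTful requests with the JDK's HttpURLConnection.
// Nothing is sent until the response is read; prepare() only sets the
// CRUD verb and, for POST/PUT, readies the request body.
public class RestCalls {
    static HttpURLConnection prepare(String method, String uri) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(uri).openConnection();
        conn.setRequestMethod(method);          // GET, POST, PUT or DELETE
        if (method.equals("POST") || method.equals("PUT")) {
            conn.setDoOutput(true);             // these two verbs carry a description
            conn.setRequestProperty("Content-Type", "application/json");
        }
        return conn;
    }
}
```

So prepare("PUT", "http://host.com/book/123abc789") yields a connection ready to receive the JSON description in its request body, exactly as in the update example above.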
<b>Common REST pitfalls:</b><br />
<br />
<ul><li>REST does not define any kind of API listing, so your API needs to be documented clearly somewhere for people to use.</li>
<li>Each call to the server addresses exactly one resource/object; there is no built-in way to update several at once.</li>
</ul><br />
While this is a very basic primer, if you follow these rules on both client and server, you should fall successfully into the pit of resource-oriented architecture. Don't be afraid to just try it out. It's a new way of organizing all of your URIs, and you're bound to have to try things a few different ways before finding the most optimized pattern. Everything else in REST terms is just expanding on the concepts introduced here, either in server code or client code.Anonymoushttp://www.blogger.com/profile/08428238230632879997noreply@blogger.com0tag:blogger.com,1999:blog-7225248065541103252.post-24698560673599699872011-05-17T17:52:00.005-05:002011-06-01T17:50:15.846-05:00Scaling Web Applications (Aka: "Does Rails Scale?")I have spent a lot of time talking about Ruby and Rails with people recently. When you're in that domain, you can't help but have people asking you, <i>"So, is it true? Rails can't scale?"</i><br />
<br />
<b>It's a fair question</b>, <i>if somewhat naive</i>. A lot of people have heard about the issues that Twitter has had as it has become the monolithic application that it is. It's been blamed squarely on Rails, even by some of Twitter's management. <u>It's just not that simple</u>. The kind of application that will service the volume of requests they are seeing is not the kind of application that happens by accident. However, I'm not going to attempt to answer whether or not Rails can scale, because <i>I believe the question is fundamentally flawed</i>. Instead, I want to talk about the concept of scalability, and why <b>your question shouldn't</b> be <i>"Can X scale,"</i> but rather, <i>"How does X scale?"</i><br />
<br />
<b>Building an application is a bit like constructing a building.</b> You have to choose the right materials, tools and techniques. You have to plan in advance. You have to make trade-offs about durability, flexibility, and all the other <i>-ilities</i>. Most web sites out there require very little scalability, because they'll never see more than a request every ten seconds. Some may get lucky and hit once a second. The very best may see more! Consider that a million hits per day is only about eleven or twelve hits per second (1,000,000 hits / 86,400 seconds). That's really not all that impressive.<br />
<br />
There are two types of scaling that are widely accepted: "scaling up," and "scaling out." Each has its pros and cons, but I feel it's important to define them before considering the greater picture.<br />
<br />
<i>"Scaling Up"</i> refers to the application's ability to be deployed on "big metal"- think old-time mainframes. These are applications meant to have one instance servicing every request. This is the easiest to conceptualize: you only need one program running, and as long as you buy powerful enough hardware, you can keep up with any number of requests. There is a hard limit to this kind of system, though. When you aren't parallelizing tasks, you can run into a lot of pitfalls, such as deadlocks, stalls, and more. What happens on a hardware failure? How do you plan for that without a second, massively expensive install? You don't, pure and simple. It's expensive. Very expensive. But it's simple to maintain.<br />
<br />
<i>"Scaling Out"</i> refers to the application's ability to run massively parallel across any number of systems, from commodity-level boxes up to high-powered blades (or even mainframes). It's not about the hardware; it's about the software. Run it wherever you want, and the instances will all cluster together. This kind of scalability requires a lot of advance planning and forethought to be able to run twenty instances side by side and have them buzz along happily. That tends to be why many applications need to be reworked when they get to the point where thousands of users are accessing them regularly. But if your application is set up correctly, you can grow it on demand, just by bringing up a few new servers to service more requests. Scaling out tends to be the preferred method for modern scaling needs: you don't anticipate your need, you buy hardware as you need it. Backups are only as costly as having a few extra systems standing by.<br />
<br />
<b>Now,</b> taking the earlier example: Instead of having to service<b> a million requests per day</b>, what happens when you have to service a <b>hundred million</b>? Or more? You're now looking at more than <i>one thousand requests per second</i>. The same system that can happily buzz along and handle one or two, or even ten, requests per second will no longer be capable of handling the load. It will realistically be crushed under the weight of the load. <b>Crushed.</b> If you didn't plan for it, it won't be capable of it. When you build a doghouse, you don't expect it to house hundreds of people, <b>right?</b><br />
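The back-of-the-envelope math in the last two paragraphs is worth keeping at hand. A minimal sketch (the class and method names are mine, purely illustrative):

```java
// Back-of-the-envelope capacity math: convert a daily hit count into
// the sustained requests-per-second figure your stack must survive.
public class Capacity {
    static final long SECONDS_PER_DAY = 24L * 60 * 60; // 86,400

    static double requestsPerSecond(long hitsPerDay) {
        return (double) hitsPerDay / SECONDS_PER_DAY;
    }
}
```

A million hits per day works out to roughly 11.6 requests per second; a hundred million works out to roughly 1,157, which is the "more than one thousand" figure above.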
<br />
That means that you need to think about how to handle that load. Build a foundation that can handle it- Pick tools and frameworks that you can vet.<br />
<br />
Some key questions you really should be asking:<br />
<ul><li>How many requests per second can your system service?</li>
<li>Will your servers talk to each other? How?</li>
<li>Are you persisting data? If so, how many requests can your persistence tier handle? Can it scale out, too? How?</li>
<li>Has someone else done what you're trying to do, with the tools that you're using? At what scale? What pitfalls did they run into? How can you avoid them?</li>
</ul>
<br />
<b>The bottom line is...</b> Don't fall into the <i><b>Sucks/Rocks</b> dichotomy</i>, especially if you haven't fully evaluated what you're talking about.<br />
<br />
<b>Remember</b>- <i>Facebook</i> is written in PHP, <i>YouTube</i> is written in Python, <i>Twitter</i> is written in Ruby, <i>Amazon's systems are written</i> in multiple languages, as are <i>Google's</i>. It's not about the language. It's about how you utilize it.Anonymoushttp://www.blogger.com/profile/08428238230632879997noreply@blogger.com0tag:blogger.com,1999:blog-7225248065541103252.post-20557045299515112582011-05-06T19:32:00.002-05:002015-07-02T12:13:23.249-05:00Parkinson's LawIt's pretty rare that I come across a work saying that I feel compelled to share with people, but Parkinson's Law is a very important law to keep in mind in software development. What is it?<br />
<br />
<div style="text-align: center;">
<i><span class="Apple-style-span" style="font-size: large;">"Work expands so as to fill the time available for its completion."</span></i></div>
<br />
This has been said in many different ways, in many different professions, or even in nature. Those include...<br />
<i><br />
</i><br />
<div style="text-align: center;">
"Data expands to fill the space available for storage."</div>
<div style="text-align: center;">
"Storage requirements will increase to meet storage capacity."</div>
<div style="text-align: center;">
"Nature abhors a vacuum."</div>
<div style="text-align: center;">
<br /></div>
<div style="text-align: left;">
Why does this matter? One simple reason: programmers, and all other IT people, tend to aim for the stars, and will happily spend from now until the end of eternity designing the absolutely perfect, beautiful system instead of shipping the product.<br />
<br />
In the end, real developers ship.</div>
<div style="text-align: center;">
<span class="Apple-style-span" style="font-size: xx-small;"><br />
</span></div>
Anonymoushttp://www.blogger.com/profile/08428238230632879997noreply@blogger.com0tag:blogger.com,1999:blog-7225248065541103252.post-19995007131013515952011-05-06T18:57:00.009-05:002011-06-01T17:51:31.734-05:00Thinking in Code: JavaScript and Java<b>JavaScript is not Java.</b><br />
<br />
Let me put it another way:<b> JavaScript is to Java what Hamster is to Ham.</b><br />
<br />
There is no direct correlation beyond the fact that both are programming languages: ways to instruct a computing environment to perform specific tasks. Their paradigms are completely different, and they do not share the same feature sets. Let me highlight some of the key features of each language:<br />
<br />
<b>JavaScript is...</b><br />
<ul><li>a functional, prototype-based language; its object orientation is built on prototypes rather than classes, and works nothing like Java's.</li>
<li>a dynamic, loosely typed language that is event driven</li>
<li>typically used in the web browser, though there are server side implementations that are catching on.</li>
</ul><b>Java is...</b><br />
<ul><li>a mostly object oriented language</li>
<li>a static, strongly typed language</li>
<li>typically used for implementing very powerful server based processes which are cross platform compatible</li>
</ul>Every language has its own strengths and weaknesses, and its own way to write idiomatic code. This has given birth to many books, such as "<i>Effective Java</i>," "<i>Thinking in Java</i>," "<i>The Well-Grounded Rubyist</i>" and "<i>JavaScript: The Good Parts</i>." Each of these books focuses on where its language excels. The reason there are so many is that each language has its own way of thinking; if they didn't, there wouldn't be any reason for so many books.<br />
<br />
<span class="Apple-style-span" style="font-size: large;">Why Does it Matter?</span><br />
<br />
I have run across an awful lot of JavaScript code that was clearly written by a Java developer: over-architected and excessively complex code that burns many times the CPU cycles of ten lines of well-written, clear, easy-to-use, self-documenting code. This highlights the differences between the two languages, and the two different thought processes behind them. The correct way to implement a particular feature in one programming language may not only be wrong in another language, but may go against the very intention and fabric of that language. This is why companies hire programmers for a particular application based on language, rather than just hiring the best generalist they can find. Not that generalists don't have their place; by all means they do. But being well grounded in many languages takes not only passion for coding, but motivation and drive. Despite what you may believe, you're probably not one. They are few and far between, and when found, they are worth their weight in gold.Anonymoushttp://www.blogger.com/profile/08428238230632879997noreply@blogger.com0tag:blogger.com,1999:blog-7225248065541103252.post-74002705159701058412010-10-17T14:41:00.004-05:002011-06-01T17:51:46.142-05:00Ruby on Rails and Sencha's ExtJS brings serious productivity to web application development.As someone who has been living in the Java world when it comes to developing web applications, development times have always been ridiculously long, with weeks spent on what should take hours. I recently started playing with two major technologies that really enhance the web sphere, both of which are squarely aimed at dropping the development-time curve as sharply as possible.<br />
<div><br />
</div><div><span class="Apple-style-span" style="font-size: x-large;">Ruby on Rails</span></div><div>Ruby is a modern programming language that is about economy of code and productivity. It is fully featured, and a pleasure to code in due to its pure object-oriented nature. This also contributes to how easy it is to write DSLs in. Ruby could be considered a hybrid between Smalltalk, Lisp and Perl. Due to how many areas it pulls from, the language itself can take a little while to wrap your head around. It's very different from the "big" languages out there (C#, Java, PHP). I plan on writing a much longer post about Ruby specifically, but here are a couple highlights:</div><div><br />
</div><div>Probably the biggest distinguishing factor of Ruby is its gems. Perl started going down this route with CPAN, but Ruby took it all the way to its logical conclusion with Ruby Gems. Gems can be libraries, DSLs or executables. A gem could be something you use in your code, or a command you run to push code to a server. The biggest difference from the 'big' languages in this realm is that gems can be installed with one simple command on the command line, unlike other dependency management systems that require you to either hunt down the dependencies, or download the internet every time you build. For example, pulling in the "rails" framework is as simple as: "gem install rails" and waiting a few minutes.</div><div><br />
</div><div>The other big highlight of Ruby that I'll make is the Rails framework. It's one of the most amazingly productive frameworks for designing your web application, and despite rumors, scales perfectly well, assuming you know how to write scalable code. There are a ton of great tutorials out there on how to get started with Rails, so I don't ever really plan to cover Rails in any more detail (unless there are specifics I decide are worth covering). Suffice it to say, you could have a running application in about five commands on the command line, which includes full database backing (with a database of your choice), basic unit tests for generated models and controllers, and proper separation of concerns between layers.</div><div><br />
</div><div><span class="Apple-style-span" style="font-size: x-large;">Sencha's ExtJS Framework</span></div><div>ExtJS is a JavaScript framework that makes building fully featured user interfaces in web browsers a fast and nearly painless process. When used creatively, it can make fully customized web applications in much less time than it would take to try and cobble together all the pieces with various javascript libraries, CSS coding and HTML, all in a standards compliant way for all the modern web browsers out there.</div><div><br />
</div><div>There are plenty of tutorials out there for both, so I won't belabor those subjects. But all in all? At this moment in time, I'm thinking that Ruby on Rails partnered with ExtJS is a match made in development heaven.</div>Anonymoushttp://www.blogger.com/profile/08428238230632879997noreply@blogger.com0tag:blogger.com,1999:blog-7225248065541103252.post-8544346162991308472010-05-11T17:21:00.003-05:002010-05-11T17:24:07.971-05:00On whitespace and other coding "monstrosities"...!Why exactly is it that programmers all seem to be anal retentive?<br />
<br />
Some of the best coders that I have been witness to have been extremely, reflexively irritated by certain things that do not fit their view of 'good coding practices'. Most recently, I was reading archives of <a href="http://www.codinghorror.com/blog">coding horror</a> when I came across <a href="http://www.codinghorror.com/blog/2009/11/whitespace-the-silent-killer.html">this post</a>, which can be pretty much summed up as 'OMG WTF dem white spaze not l33T nuff!' While I definitely respect Jeff's work (okay, most of it anyways!), and feel that he's spot on most of the time (really!), I wish that people would stop propagating the misconception that you have to be anal retentive to be a good programmer.<br />
<br />
Having worked in mid-to-large sized teams for most of my development career, I can safely say that everyone will run up against different coding practices, and different ways that people believe things should be done. Take this for example:<br />
<br />
<span style="font-family: "Courier New", Courier, monospace; font-size: x-small;">public String doStuff() {</span><br />
<span style="font-family: "Courier New", Courier, monospace; font-size: x-small;"> return "Because I said so!";</span><br />
<span style="font-family: "Courier New", Courier, monospace; font-size: x-small;">}</span><br />
<br />
Can become..<br />
<br />
<span style="font-family: "Courier New", Courier, monospace; font-size: x-small;">public String doStuff() </span><br />
<span style="font-family: "Courier New", Courier, monospace; font-size: x-small;">{</span><br />
<span style="font-family: "Courier New", Courier, monospace; font-size: x-small;"> return "Because I said so!";</span><br />
<span style="font-family: "Courier New", Courier, monospace; font-size: x-small;">}</span><br />
<span style="font-family: Courier New; font-size: x-small;"></span><br />
Or..<br />
<br />
<span style="font-family: "Courier New", Courier, monospace; font-size: x-small;">private String EVERYONE_LOVES_CONSTANTS = "Because I said so!";</span><br />
<br />
<span style="font-family: "Courier New", Courier, monospace; font-size: x-small;">public String doStuff() {</span><br />
<span style="font-family: "Courier New", Courier, monospace; font-size: x-small;"> return EVERYONE_LOVES_CONSTANTS;</span><br />
<span style="font-family: "Courier New", Courier, monospace; font-size: x-small;">}</span><br />
<br />
...and there are plenty of other permutations!...<br />
<br />
At most shops there is someone who sets standards, and usually gets pretty upset if they aren't followed. Really, it doesn't matter! Honest! They're all usable. All can be self-documenting. Conventions are helpful, but that's all they are: conventions. If you can't read code you haven't programmed (particularly with the help of a decent code auto-formatter), then you have bigger problems to solve! All I have to say is this: If you want my code to have extra CR/LFs, I darn well better have a huge monitor to deal with a heck of a lot of whitespace on my screen at a time.<br />
<br />
Yes, most good programmers are pretty OCD about a lot of things. Going 'over the edge' about little things like that only leads down the path of madness, especially if you wander through many different shops in the corporate world.Anonymoushttp://www.blogger.com/profile/08428238230632879997noreply@blogger.com0tag:blogger.com,1999:blog-7225248065541103252.post-76250817484056001622010-02-18T14:41:00.005-06:002010-03-10T10:22:03.449-06:00Why devices like the iPad are not only relevant, but game changers.This post is not only relevant to the iPad, but to all of the devices coming out 'in its class'. Over the course of speaking to people about the iPad, I have found myself hearing a trend: people asking why they should get an iPad when they already have a computer or a laptop. My answer is different from some people's: I still think it's a game changer.<br /><br /><span style="font-weight:bold;">The iPad is an appliance, not a computer.</span><br /><br />The face of computing has changed very little since the first desktops and laptops came out. Embedded devices have been used by a select few who were willing to deal with the pains of little screens and poor web browsers. This is all changing. Between the iPhone/iPad OS and Android, many of those woes are headed towards being a thing of the past. This is a great thing! The day that I can hand my Mom an iPad and tell her to go to town on the internet, without worrying about viruses... well, let's just say that will be a beautiful day. I see that day approaching.<br /><br /><span style="font-weight:bold;">Like all embedded devices, the iPad will be highly software regulated.</span><br /><br />Some people think this is tantamount to treason against their beloved computers. I'm here to say it now: regulation of technology only makes the user experience better! It may not be exactly what you, geek user, want, but it's going to make the majority of end users much happier in the long run.
This is one of Apple's secrets to success. It may not hold quite as true for the Android tablets that are coming out, but it's still useful to have an acceptance process. They have a core of followers, and they've been gaining more as time goes on, because people have been purchasing PCs as if they were well-designed systems, when in fact most are just slapped together by the lowest bidder. Apple doesn't put anything out on the cheap. Why? Because you get what you pay for. They spend the extra time designing and developing both the hardware and the software. They integrate and blend the two to provide a world-class experience. It's nothing short of the best customer service possible.<br /><br /><span style="font-weight:bold;">Media player</span><br /><br />This is the first device on which you can viably listen to music, watch videos, and read books. It's the only class of device that I might actually consider doing all three things on. In those terms, I think that makes it one of the most likely devices to succeed. Do I think it'll immediately be a runaway hit just because of that? No. I do believe it really raises the stakes, though. The device can do all those great things that a standard iPod Touch can do, and more.<br /><br />My workflow will change immensely, at least while I'm at home. Instant messenger and web pages on the iPad-style device, and I'll only break out the desktop when I have development work to do.<br /><br /><span style="font-weight:bold;">Here's the real game changer: Why have ten embedded devices, when one will do?</span><br /><br />With the advent of a decent-sized embedded device with this much power, why would an organization such as a hospital bother with a pile of tiny devices that each connect to one thing? Why not one device that can do it all? Imagine a device that can actually read patient charts, interface with all the tiny little machines, and the like. The iPod Touch was only really good as a point-of-sale device.
This thing could actually be used in places where current embedded technology is old and stale, completely revitalizing the market in those sectors. A prime example is medical devices, which I happen to think of only because I used to develop for a monochrome device that was horrible. It was used to scan stuff for inventory. Imagine walking around with one of these devices in one hand and a scanner in the other. Maybe not completely new, but it would integrate itself easily into all sorts of workflows. Directly because it is an appliance, not a computer.<br /><br />The poorest decision that Apple has made regarding this product is, simply put, its name. It sounds so close to a feminine product that it is unlikely to be taken seriously in many circles. My guess? You have a board room of male directors brainstorming names, and they decide on the iPad. They approve it, and the first woman to see it outside the board room nearly falls off her chair laughing. Unfortunately, the name is already 'set in stone'! Too late!<br /><br />Ah well. I'll forgive it its name, and use it as it was intended.Anonymoushttp://www.blogger.com/profile/08428238230632879997noreply@blogger.com0tag:blogger.com,1999:blog-7225248065541103252.post-50483694601200514372010-02-03T09:34:00.010-06:002010-02-03T10:33:27.550-06:00Mac UnmeasurablesSo, a friend of mine asked me to measure "Mac Unmeasurables" recently. For that reason, I decided to go ahead and start this blog post. I'll keep updating it with new entries as I go forward, but this should help some people who are interested in buying a Mac but don't understand why specs don't match exactly from laptop to laptop. Keep in mind that I'm forced to use Windows systems every day of my life, because business still hasn't figured it out either. As a side note, I will not be providing apologetics in this post. If you don't realize that Mac mice have more than one button, go out and look.
I'm thinking I'll provide an apologetics post in a few days.<br /><br />Foundation piece: Apple is a hardware <b>and</b> software manufacturer. They put the two together for optimum performance. There is little 'lag' and stuttering in the Mac OS, unlike any other operating system on the market (not counting Chrome OS- it's not really out of beta yet as of this writing, so I can't count it). Apple also understands presentation. Even in the box, they are careful in how they present their products. It's not 'throw it in the box and hope'.<br /><ul><br /><li>5-minute setup. The most recent Windows systems have started to match this, but you still have to get rid of all the spam-ware that they pre-install on your system.</li><br /><li>Immediate productivity. The Mac finds the network, you enter the password, done. No guesswork.</li><br /><li>Intuitive interface. Most people instinctively grasp the dock in OS X, and the toolbar is always there. The only knock I'll give this part is that when you close the window, you're not closing the program. That's not so intuitive. But it does make it faster to bring that window back, if you have the memory to keep the program running with everything else you're running.</li><br /><li>Fewer errors and less 'fiddling' time. Sure, it's not as customizable as a Windows box. At least, not in exactly the same ways. But I've never had a Mac crash on me. I know people that have, but I haven't. Further, I demand a lot from a computer. I'm a web designer, and I can have twenty windows open, five-to-ten programs, and a video encoder going at the same time. No problem on a Mac. I avoid that situation like the plague on Windows.<br /><ul><li>Speaking of video encoders: I haven't had a single "free" Windows video encoder work for me without fiddling. Props to Handbrake on the Mac.</li></ul></li><br /><li>Closing the lid does what I would expect. It puts the computer to sleep within ten seconds.
I've only ever seen one pre-installed Windows system do that: an IBM T42.</li><br /><li>Opening the lid does what I would expect. Windows can't even compete in this arena: my Mac is ready for me to go within probably 2 seconds of opening the lid. Only roughly three times since I started using a Mac (years ago) has it taken longer than 2-3 seconds.</li><br /><li>Turning on a Bluetooth keyboard that is paired with the computer, while it is sleeping, turns the computer on. Great for external displays.</li><br /><li>Simple, stupid backups. Time Machine is the first backup technology I would trust my mom with.</li><br /><li>PDF viewing. Adobe Acrobat Reader? Yeah. It sucks. Everyone knows it, yet they still use it. It amazes me. Preview is much faster at viewing PDFs than Reader. Speaking of which, the Mac has a built-in preview mechanism for almost every file type. If you're in Finder (the equivalent of Explorer), looking at your files, just hit the spacebar to preview one. Whether it's an image file, video, text, doc, rtf, or pdf, it'll show you the basics of what is in there.</li><br /><li>...there will be more...</li><br /></ul><br />Here are a few points specifically for geeks, because they tend to be its biggest critics:<br /><ul><br /><li>BSD. Use all your favorite *nix-based programs on it. No dual booting, no dealing with all the silly little things you have to in *nix. It's ready to rock and roll. Pull up the terminal, and do all the old familiar tasks you normally can on *nix.</li><br /><li>Perfect for server connections. Use standard secure shell style protocols. File system paths are familiar. Don't know how something works? Use the man pages.</li><br /><li>All the network tools built in to the GUI. Network Utility: info, netstat, ping, trace, whois, finger, portscan. All baked right in.</li><br /><li>A usable dashboard. Windows doesn't come with a good dash.
They're getting closer, but on the Mac the F12 key drops the dashboard right down over everything, and you have access to your time, calendar, notes, all sorts of stuff. Much easier.</li><br /><li>Need a tool for developing? Admining? It's on the Mac. Probably freeware. SQL? Sequel Pro. Developing? Eclipse, IntelliJ, NetBeans. CVS/SVN? Oh yeah.</li><br /><li>Your favorite browsers are there. Firefox, Chrome, Opera. It's all there.</li><br /></ul><br />Here's the trump card: Macs run Windows! If you really feel it necessary to run a Windows program, you can run it through CrossOver, or use a virtual environment with VMware Fusion (Parallels is horrible, I would never recommend it). The Mac gives you every environment, and ease of use. No other system can claim that.<br /><br />Here are a few parting words to think on: Who is the innovator? Where does most new tech first go mainstream? The mouse? Mac. Bluetooth? Mac. CD drives? Mac. FireWire? Mac. All those pretty programs that showed up in Vista? Yeah. Mac first. It's been called iLife for a long time. No, it's not as open (but no operating systems outside the *nix bases are). No, it's not as cheap. No, it's not perfect. But am I willing to pay for the experience? For ease of use? Simplicity?<br /><br />My answer? Yes. Will it always be my answer? Who knows. I use it because it's better. If someone comes along and makes something else better, I'll be there. For now, it's Mac.Anonymoushttp://www.blogger.com/profile/08428238230632879997noreply@blogger.com2tag:blogger.com,1999:blog-7225248065541103252.post-91762838308995529572009-12-07T10:38:00.006-06:002009-12-07T11:18:53.022-06:00Web Tier Development PatternsI have been doing web design and development for a few years now. It's been interesting. I have seen a lot of different designs and been part of putting together a good many web applications for different companies. Each is done in at least a slightly different way, even if they're using the same technologies.
Though I have primarily seen these things done in Java and a few open source technologies, I have an insight to share that seems pretty obvious to me, but not so much to others. This, from the perspective of a front end designer, after being part of several great teams.<br /><br />It seems that the most successful web user interfaces are completely divorced from their back end counterparts. This is somewhat counterintuitive coming from traditional web design. In Javaland you have JSPs that are directly included as part of the project. It seems to make sense that they would become integrated with the middle tier because the two are so closely linked. But it creates a false sense of "togetherness" that really doesn't exist. Static HTML (the way most JSPs are used) and Servlets couldn't be more different if they tried. It's something that has been apparent to me since the first day I found out JSPs were compiled into Servlets. I immediately asked, "Why? That sounds like a lot of wasted CPU cycles." My comment still holds true today. Sometimes the marriage between JSP and Servlet is useful, where the response is going to be completely dynamic. However, it's been my experience that using JSPs in a way that actually utilizes their dynamic nature (and not just for excess complexity) is very rare. Most of the time, even if a few tag libraries are used, ninety percent of the page in question is just static HTML that has no need to pass through all the extra cycles that JSPs do.<br /><br />So, am I proposing totally static HTML? Definitely not. There are just too many reasons for the server to do some view logic work. Browsers just aren't built powerfully enough to take care of everything that a modern webpage needs.<br /><br />Ideally, I would love to see a "front end project" consisting of a standard *AMP setup. Pick your poison for that P (PHP, Perl, Python... or Ruby, etc.), and a separate Java (or ASP) project for doing the heavy lifting.
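To make the client-side half of this split concrete, here is a minimal sketch of what the static page's JavaScript has to do: nothing but fetch JSON from the heavy back end and render it. The endpoint path, element id, and record field names here are hypothetical placeholders, not anything from a specific project.

```javascript
// Client-side half of the split: the page itself is plain static HTML;
// only the data comes from the back end, as JSON.
// '/data/products', 'products', and the record fields are hypothetical.

// Pure view logic: turn an array of JSON records into an HTML fragment.
function renderProducts(products) {
  var items = products.map(function (p) {
    return '<li>' + p.name + ' (' + p.sku + ')</li>';
  });
  return '<ul>' + items.join('') + '</ul>';
}

// Browser glue, guarded so the view logic above also works outside a browser.
if (typeof XMLHttpRequest !== 'undefined' && typeof document !== 'undefined') {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/data/products', true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      var data = JSON.parse(xhr.responseText);
      document.getElementById('products').innerHTML = renderProducts(data);
    }
  };
  xhr.send(null);
}
```

The point is the division of labor: the light front end stack serves the markup as-is, and the compiled back end only ever answers data requests.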
Compiled code uses the CPU much more efficiently than interpreted code, and makes a lot more sense to write powerful algorithms in. The interpreted code can easily connect to the compiled code to get the data it needs on the server side. The static code that the client ends up with can get access to the back end as well, using JavaScript with JSON (or whatever your preferred client side technology stack is) to fetch data from the client side.<br /><br />I have a feeling this is just one of those separation-of-concerns design patterns that has not caught on yet. My guess is that at some point in the future we'll start seeing this sort of separation become more common. It adds a bit of complexity in connecting things together, but it relieves front end designers of trying to do a back end designer's work. People like me who truly live on the web tier, and not the middle tier, would find great joy in seeing this sort of change in mentality.<br /><br />*AMP = Fast, immediate delivery of data to the client.<br />Java/ASP = Powerful, heavy lifter for back end processes.<br /><br />It only makes sense.Anonymoushttp://www.blogger.com/profile/08428238230632879997noreply@blogger.com0tag:blogger.com,1999:blog-7225248065541103252.post-30886380613223373462009-11-20T17:34:00.004-06:002009-12-07T11:19:53.933-06:00Google OS ReviewSo, I got a chance to take a sneak peek at the new Google OS. Or at least, what seems to be the new Google OS.<br /><br />Not that I'm special in any way: http://discuss.gdgt.com/google/chrome-os/general/download-chrome-os-vmware-image/ <br /><br />However, I just want to throw a few quick thoughts out on the intarwebs.<br /><br />1) I really like the concept. The computer is part of the network. Your sign-in to your computer is actually your Google ID. I like that idea. The problem? What happens if you're not actually connected? It'd be impossible to use your system.
Not that most computer users these days can tell the difference between not having internet and not being able to do anything...<br /><br />2) It's a good start. Which is exactly where I figure they are: starting. Inside of a couple of years I am sure they will have a lot of the bugs smoothed out. But as far as usage as a netbook system? It's just about perfect. Put it with a 3G card in a netbook, and you've got an "internet based" computer. I never really understood what people saw in netbooks short of being used for that...<br /><br />3) Not a geek in creation will be happy with it. I can hear the cries already: "I can't install my local server? Worthless!" "It's based on Linux and all you can do is browse the net? Are you kidding me?" Basically, the people who will miss the point of it entirely.<br /><br />4) It won't be a competitor to Windows. Unless we're going back to dumb thin clients. Which is basically what the OS feels like. Even though it is a fully featured OS, it doesn't have that feel to it, since you are only able to be on the web. I think there is going to be a lot of resistance from the geeks before this ever truly becomes a reality.<br /><br />That's all I have for now. I think it'll be some time before wide adoption, on anything other than netbooks anyway. I still think virtualization of software is more likely than everything in a browser.Anonymoushttp://www.blogger.com/profile/08428238230632879997noreply@blogger.com0tag:blogger.com,1999:blog-7225248065541103252.post-37753724900013144632009-11-18T08:00:00.005-06:002009-11-20T09:38:49.390-06:00Apple vs Microsoft - Different market-shares!I'm really tired of hearing anti-Apple talk from people who don't understand their business model. Microsoft is going after a different demographic than Apple is. Linux is going after yet another. For the sake of this post, I'm going to assume Apple vs Microsoft as the only competitors. Sorry, Linux.
You're still too geeky.<div><br /></div><div>So, I was hearing things last night like "The video card isn't good enough in the Mac for me to justify it," and "They're just too expensive compared to PCs," never mind that Macs are PCs, and "They're too locked down." Basically, all of these are statements filled with FUD. I won't refute any of these arguments, because they're half-truths. Go out and look for yourself if you want to find their answers. Don't believe what you hear.</div><div><br /></div><div>Let's take the car industry. Most people understand the differences between car manufacturers. Nobody would make the mistake of thinking BMW, Audi or Mercedes are in the same market segment as Ford, Chevy or Saturn. </div><div><br /></div><div>When you buy a BMW, you give up a lot. All your design decisions are made for you, and it's not extremely customizable unless you're an extremely talented auto mechanic. Even then, I highly doubt there are many mechanics out there who would say these cars need work. They're manufactured from the ground up to be a solid unit. Everything is taken into account, and the owner's experience is just that much better because every last little detail is scrutinized. It may not have the exact engine you want, or maybe it doesn't have a specific piece of piping in there that you like. It will just feel better. Maybe if you had that Ford, you could throw that single piece in there that you want, and you would feel happier about your purchase and yourself because you made a contribution to it. That's just not how it works with the higher end cars. The experience is what you pay for, not the specific little pieces. That isn't to say you can't pay attention to the specifics; you just can't weigh them on the same scale as their lower end competitors. Sure, a truck has a bigger engine and more power than a little BMW car. Can it go as fast? Does it look as good? Will you be as happy? Will it fall apart as much?
Just because you think it costs too much, or you can't afford it... that's not a good enough reason to look down on it.</div><div><br /></div><div>That's the difference between a Mac box and a Windows box. Experience. I own (and have owned) both. I use them for their respective strengths. </div><div><br /></div><div>I play games on my Windows box, and I do everything else on my Mac. I've repeatedly tried to go back to Windows for many of the things I do on the Mac, and simply put, it just doesn't feel as fun. What would take five keystrokes on a Mac system takes ten on a Windows system. Windows is good for certain activities, Macs are good for another set of activities. Right tool for the right job.</div><div><br /></div><div>Optimistically, I'll hope that some day people are going to stop spreading FUD. I'll look forward to that day.</div>Anonymoushttp://www.blogger.com/profile/08428238230632879997noreply@blogger.com0tag:blogger.com,1999:blog-7225248065541103252.post-75978244104233130442009-09-03T17:24:00.005-05:002009-12-07T11:19:15.091-06:00Android vs iPhone - Realistically.I've spent a lot of time with both the G1 and the iPhone at this point, and while I don't consider myself an expert on either phone's inner workings, the user experience between the devices really has started to draw my attention (or perhaps, ire).<div><br /></div><div>Google has squandered an opportunity. An opportunity to squash the iPhone squarely. Android is a better OS, hands down. Its SDK is open, its community is open, the design decisions that have been made for its UI are fairly consistent across the device (and the software that is released for it!), and it's got way more extensibility than the iPhone does. Not to mention, it's usable on any phone. Not only does Google let you replace certain pieces of the phone's functionality, they encourage it. Nowhere will you find anything like that in the cellphone industry today.
Not even on the beloved iPhone.</div><div><br /></div><div>So, if I say the OS is so great, why do I think the iPhone has beaten Android instead?</div><div><br /></div><div>Simple. The hardware for the G1 and every other Android device, well, sucks. I don't believe the right hardware isn't out there, or that it was any one person's fault, but the G1 simply has really shoddy hardware. This completely degrades the user experience of the device, even if the OS is just that much better from a technical standpoint. The iPhone delivers a truly full "experience," from end to end. There's no denying it. While you're sitting there waiting for your messaging system to come up, the iPhone is happily whirring away on its message already.</div><div><br /></div><div>So even if the Android OS truly is the better choice, there still aren't any viable choices for using it. It's like having a five-ton gorilla locked in a box. Waiting even a single second for the device to "unlock itself" so that you can use it is unacceptable. Waiting for the phone to catch up with itself to answer a call? Unforgivable. It's a phone. That's its purpose. There are still thousands of people voting with their money on this, and all that's happening is that it's being ingrained in people that Android provides a poor user experience.</div><div><br /></div><div>Don't get me wrong, I certainly will never switch to AT&T to get the iPhone. In truth, I wouldn't want one. I'm stuck with Android because I love all of the integrated Google services. I am simply disappointed. Google (and HTC, and T-Mobile) really botched their charge into the superphone market. It could have been great if they had ensured that the proper hardware was used.
Now, it's most likely to be relegated to tech geeks and people who want features over finish.</div><div><br /></div><div>Let's hope I'm wrong!</div>Anonymoushttp://www.blogger.com/profile/08428238230632879997noreply@blogger.com0tag:blogger.com,1999:blog-7225248065541103252.post-90897200219014023672009-08-31T09:50:00.005-05:002009-12-07T11:30:54.279-06:00A greeting.<div>Hail and welcome.</div><div><br /></div><div>This blog will be primarily devoted to my "in awe" moments about technology in society. Whether those fall into the religious diatribe, righteous philosophy or business categories, I'll be posting them here. I try not to take a side most of the time, but sometimes there is a clear side I fall on, and I will speak to that.</div><div><br /></div><div>Once again, welcome, and I hope you amuse yourself here!</div>Anonymoushttp://www.blogger.com/profile/08428238230632879997noreply@blogger.com0