Showing posts with label innovation. Show all posts

Saturday, January 16, 2010

Changing Tastes

When I was a kid, I hated onions, green peppers, and mushrooms. I used to tell people I was allergic to mushrooms so they wouldn't try to make me eat them. I hated any sort of chunky sauce or really textured meat. I think I wanted everything to have either the consistency of a chicken nugget or ketchup. My parents used to tell me that when I was older my tastes would change. That I liked crappy food, disliked good food, and eventually I would realize it. They were right.

So kids like chicken nuggets and ketchup. Wow, huge revelation. What does this have to do with technology? I'm on the steering committee for an internal conference on software engineering that my employer is holding. I'm the most junior person on the committee, and most of the members are managers who have more managers reporting to them. Our technical program committee (a separate group with more technical members, though still all very senior) just finished abstract selection, and we've been discussing topics and candidates for a panel discussion. During this process I have been absolutely shocked by how much my tastes differ from those of my colleagues.

I've noticed that among the selected presentations, topics on process, architecture, and management are over-represented. On the other side, many of the abstracts that I thought were very good and deeply technical fell below the line. I can't quite say there was a bias towards "high level" topics, because I think they were over-represented in the submissions themselves. Given the diversity of technology involved, that's almost inevitable: a guy doing embedded signal processing and a guy doing enterprise systems would most likely submit very different technical abstracts, but their abstracts on management or process could be identical. Topics that are common across specialties will naturally draw more submissions.

There's a similar story with the panel discussion. I wanted a narrower technical topic, preferably one that is a little controversial, so the panelists and maybe even the audience could engage in debate. My colleagues were more concerned with who would be on the panel than with what they would talk about, and with keeping the topic broad enough to give the panelists freedom.

What's clear is that my colleagues have different tastes in presentation content than I do. I think they are genuinely trying to compose the best conference they can, using their own experiences and preferences as a guide. I think their choices have been reasonable and well intentioned; I just disagree with many of them. If I had been the TPC chair, I would have explicitly biased the selection criteria towards deeper, more technical topics. Those are the presentations I would attend, even when they are outside my area of expertise. I would use my preferences as a guide. But that leaves me wondering: in another five or ten years, are my tastes going to change? My tastes have certainly changed over the past decade, so I have no reason to believe they won't change over the next. Will I prefer process over technology and architecture over implementation? Will I stop thinking "show me the code!" and "show me the scalability benchmarks!" when I see a bunch of boxes with lines between them? I don't think so, but only time will tell. When I was a kid, I never would have believed that I would willingly eat raw fish, much less enjoy it, but today I do.



Monday, September 17, 2007

The Tar Pit

The editor of TSS has decided to run a series discussing The Mythical Man Month, by Frederick Brooks. Hopefully it will produce some good discussion. There are a lot of Agile advocates who hang out on TSS who really could stand to (re)learn some of the lessons of software development. I make a point of rereading it every few years lest I forget the lessons learned by computing's pioneers. The first chapter - The Tar Pit - contains one of my favorite concepts, as illustrated by the graphic below (slightly changed from the original):

Programs

Let me explain. A program is what we all have developed. It's a simple piece of software that is useful to the programmer and/or to some set of users who are directly involved in defining its requirements. Most bespoke departmental applications and a substantial portion of enterprise applications fall into this category. They are sufficiently tested and documented to be useful within their originating context, but once that context is left their usefulness breaks down quickly. In addition, they are not solidly designed to be extensible, and certainly not to be used as components in a larger system. Obviously this is a range, and I've really described a fairly well developed program - one almost bordering on a programming product. That script you wrote yesterday to scan the system log for interesting events, which has to be run from your home directory using your user account in order to work, is also just a program.

Programming Products

Moving up the graph, we hit the programming product. In theory, all commercial applications and mature bespoke applications are programming products. In practice this isn't really the case - but we'll pretend, because they are supposed to be, and I've raised the standard over what Brooks originally described. The big challenge with programming products is that, according to Brooks, they cost three times as much to develop as simple fully-debugged programs, yet they contain the same amount of functionality.
This is why it's so hard to get sufficient budget and schedule to do a project right. The difference between a solid piece of software and something just cobbled together is very subtle (you can't tell in a demo), yet the cost difference is quite astounding. Consequently, I think most commercial applications are released well before they hit this stage, and bespoke ones require years to mature or a highly disciplined development process to reach this point.

Programming Systems

Programming systems are programs intended to be reused as parts of larger systems. In modern terms, they are the libraries, frameworks, middleware, and other such components that are all the rage in software development. Like programming products, programming systems are thoroughly tested, documented, and most importantly useful outside of the context in which they were created. And, like programming products, according to Brooks they take three times as long to develop as a regular program. Developing programming systems for bespoke applications or niche use can be a tar pit all its own. For one, many programmers like building libraries and frameworks. The problems are more technically challenging, and there is no strange-minded user to consider - the programmer and his colleagues are the users. Programming systems are relatively common in groups that execute a lot of similar projects and/or that contain programmers who really want to build components.

Programming System Products

Amazingly, programming system products are relatively common - even if there really aren't that many of them. As you've probably guessed, a programming system product has all the traits of both a programming product and a programming system. It is useful to a wide range of users and can be effectively extended and/or embedded in the creation of larger systems. It has complete documentation and is extensively tested. Where are these wondrous things? Well, you are using one right now (unless you printed this).
Your operating system is one. It both provides useful user-level functions and a huge amount of infrastructure for creating other programs. MS Office is one as well, because it has a pretty extensive API. Most commercially developed enterprise systems should be programming system products, because:

  1. They provide off-the-shelf functionality for regular users
  2. Customers always customize them
  3. They often must be integrated with other products
  4. Even integrating their own components would go more smoothly with a programming system
The problem is that they are not, because of:

The Tar Pit

Brooks didn't explicitly write this definition of The Tar Pit, but I think he would agree. Put yourself in the position of a development manager at a startup, or in a larger company about to launch a new product. On one hand, you want to make the product as good as possible. You know that what you develop today will serve as the base for the company/product line for years to come. It needs to be useful. It needs to be extensible. It needs to be thoroughly tested and documented... It needs to be cheap and delivered yesterday. The differences between a programming system product and a simple programming product are far more subtle than the differences between a program and a programming product. But the programming system product costs a full NINE TIMES as much to develop as the program with essentially the same "outward functionality" - at least if you are a sales guy or a potential customer sitting in a demo. I think this is the struggle of all engineering teams. If the product is long lived, doing it right will pay major dividends down the line. But it can't be long lived if it is never released, and it stands a worse chance if it comes out after the market is flooded with similar products (actually, that's debatable...). The ultimate result is a mish-mash of tightly coupled components that individually fall into one of the lesser categories but as a whole fall down. There is a documented API, but the documentation isn't really that good and the application code bypasses it all the time. The user documentation is out-of-date. Oh, and the application isn't really that general - hence why all the major customers need the API so they can actually make it work.

Escaping the Tar Pit

Ok, so if you develop a large system you are destined to fall into the tar pit, because cheap-and-now (well, over budget and past schedule) will override right-for-the-next-decade.
You need a programming system product, but budget and schedule will never support much more than a program. So how do you escape it?

Accept It

Products that give the outward impression of being far more than they are often sorely lacking in conceptual integrity. If you are building an application - build it right for the users. Remember, you can build it three times for the cost of building a programming system product. Maybe by the third time there will be budget and schedule for it. Or maybe, just maybe, you can evolve your current system. But pretending will just make a mess while wasting significant amounts of time and money.

Partition It

Some pieces of your system are probably more important than others. There are places where requirements will be volatile or highly diverse among customers - those are the places where you need a truly extensible system. You should also be able to reuse strong abstractions that run through your system. The code associated with those abstractions should be top-notch and well documented. Other pieces just need to be great for the users, while the few that remain need to be convenient for you or your administrators to extend.

Open Source It

Find yourself developing yet another web framework because what exists just isn't right? Open source it. This isn't really my idea. It's what David Pollak of CircleShare is doing with the lift web framework for Scala (actually, I'm guessing at David's reasoning; I could be wrong). The infrastructure for your application is essential, but it isn't your application. It is what Jeff Bezos refers to as muck. You have to get it right, but it's distracting you from your core mission. So why not recruit others with similar needs to help you for free? That way you don't have to completely give up control, but you also don't have to do it alone. Theoretically the same could be done for applications.
Many large customers of software companies have significant software development expertise - sometimes more than the software companies. I think it would be entirely feasible for a consortium of such companies to develop applications that would better serve them than commercial alternatives. But I have yet to convince someone of that...
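Brooks's two 3x multipliers compound, which is where the nine-times figure comes from. A trivial sketch of the arithmetic (the multiplier values are Brooks's rough estimates, not measured constants, and the function name is mine):

```python
# Brooks's rough cost multipliers from The Mythical Man-Month, chapter 1.
# These are estimates, not measured constants.
PRODUCTIZING = 3  # generalization, testing, and documentation for outside users
SYSTEMIZING = 3   # precise interfaces and integration testing for reuse

def relative_cost(productized: bool, systemized: bool) -> int:
    """Development cost relative to a bare program (baseline 1)."""
    cost = 1
    if productized:
        cost *= PRODUCTIZING
    if systemized:
        cost *= SYSTEMIZING
    return cost

print(relative_cost(False, False))  # program: 1
print(relative_cost(True, False))   # programming product: 3
print(relative_cost(False, True))   # programming system: 3
print(relative_cost(True, True))    # programming system product: 9
```

The point of writing it out is that the two axes are independent: you pay one 3x to make software useful outside its birthplace, and a separate 3x to make it safely embeddable, and a programming system product pays both.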


Tuesday, August 14, 2007

Business Engagement and IT Project Failure

CIO.com recently ran an article by the CIO of GE Fanuc regarding "functional engagement" (i.e. "the business," or more commonly "the users") and project failure. Michael Krigsman later posted a more succinct summary on his ZDNet blog. Here's an even shorter version:

  1. Make sure a single business-side leader has significant capability, responsibility, and authority for project success.
  2. Don't short-circuit important parts of the project.
  3. Make sure that person understands the "laws of IT" and defends them to his peers, subordinates, and superiors.
#1 is just plain common sense. #3 amounts to establishing a scapegoat for the inevitable failure caused by short-circuiting appropriate systems engineering or architectural activities. Notice that there is neither a definition of "failure" nor a definition of "success." I think it can be inferred that he's defining "failure" as "exceeding project or maintenance budget and/or schedule." Not once does he mention delivering real innovation to the business, or even real value - just avoiding causing massive amounts of pain. Consider the following statement:
Enterprise platforms like Siebel, Oracle and SAP are not intended to be heavily customized. When going from a niche, custom application to Siebel, you need a strong functional leader to push back on every “Yeah, but” statement. They must start at zero and make the business justify every customization. Saying “no” to customization is bitter medicine for your business partners. They will make contorted faces and whine ad nauseum. But it is for their own good.
Let's think for a moment. Your customer (internal or external) wants to spend millions of dollars implementing a CRM (or ERP, PLM, etc.) package. Earlier, it went to the trouble of building one from scratch. That probably means it considers CRM to be an extremely important part of its business, and it expects to derive a competitive advantage from having a shiny new system. The best way to be competitive is to copy what everyone else is doing, right? Also, repeatedly receiving "no" as an answer will really make your customer want to take an active role in the implementation, right? Hmmm... somehow I don't think so. Introducing a strong impedance mismatch between the organization and the software it uses is failure. Standing in the way of innovation is failure. Mr. Durbin wants you to fail.

There is actually a simple, albeit occasionally painful, solution to this problem: don't buy an off-the-shelf application that neither meets the business's requirements nor can be cost-effectively extended to meet them. First, you have to understand your business processes and requirements. I mean really understand them, not just understand the official process documentation that no one follows. All those winces and "yes, buts" are requirements and process details that absolutely must be understood and addressed. Second, do a trade study. This is a key part of systems engineering. Replacing enterprise systems is expensive, often prohibitively expensive, so do a thorough analysis. Take some of your key users to training sessions and write down every wince and vaguely answered question, because those are all either issues that will stand in the way of delivering value or will require customization. Finally, keep your pen in your pocket. Don't be tempted to write checks, buy licenses, and sign statements-of-work before you really understand the product. Just ignore those promises of discounts if you get a big order in before the end of the quarter.
The next quarter isn't that far away, and the end of the fiscal year may even be approaching. The discounts will reappear then. Instead, make sure the vendor really understands the requirements and business process, and structure any contracts so that those requirements must be met or the vendor faces significant penalties. It's amazing what comes out of the woodwork when you do this. All of a sudden that 3x cost overrun from the original projection is sitting right there in front of your eyes in the form of a quote, all before you've sunk any real money into the project. The purpose of engaging "the business" in an IT project is not to create a personal shield for all the deficiencies in the system you are deploying. It is to make sure you identify those deficiencies early enough in the project cycle to avoid cost and schedule overruns.


Tuesday, July 17, 2007

More Efficient User Interfaces

Over the weekend I ran into the following video about changing user interface paradigms. I found the language-based interface really interesting, but didn't much care for the ZUI. Unfortunately, the language-based interface reminds me of old games from about the time personal computers became powerful enough to run Eliza-like software (for hackers, go here for code, and here for an easy environment on Windows). Basically you typed in "natural language" commands, and the characters were supposed to carry them out. Personally, I thought it was about the most awful way to play a game. Maybe in the past couple of decades technology has progressed. It looks like it has potential.

Anyway, the first thing that struck me was that it reminded me of SciFi books where any advanced computer use is described as "programming," possibly because I largely stick to classic SciFi. Of course the vast majority of computer users, who indeed perform relatively complex tasks, are not programmers. Not by a long shot. But I think the language-based interface concept could bring "programming" one step closer to the masses (whether that is good or bad is an exercise for the reader).

The second thing I noticed was a striking similarity to Smalltalk, only Smalltalk is a lot better. You see, if all of your software was inside of a Smalltalk image, then you could easily interact with it programmatically. Furthermore, Smalltalk's syntax and dynamic nature make it fairly approachable for those who have not been corrupted by years of experience with less lofty languages (in other words, me). Given a "humanized," if you will, object model I could see some real potential. At a minimum it would be a realization of much more componentized software. Alas, we can dream...

The paradigm for such a system would be something like this: someone builds an API that translates human-language-like typed commands into Smalltalk code. It could quite possibly be an internal DSL.
Then you build the interface so that an easy keystroke lets you "talk" to the computer by typing. AI would be optional, but I think essential for really advanced use by lay users. As time progresses, the user and the computer would learn more and more complex tasks - the user would essentially be programming the computer, but it would feel more like training. Ultimately this could facilitate much more efficient and sophisticated computer use. Maybe.

There are counterpoints. Paul Murphy seems to think that everyone should learn keyboard shortcuts and Unix command-line utilities. While I agree on the utility of both, somehow I don't think the every-day computer user is ever going to do it. The more convincing counterpoint was made by my wife, an attorney, who wasn't even really trying to make a counterpoint. The exchange went something like this:

Me: I saw this video the other day about changing user interface paradigms to consist of unified functionality instead of distinct applications.
Her: Why would you want that?
Me: Well, different applications do different things, and so often you have to use more than one to get something done.
Her: Word is complicated enough already. Wouldn't that make it more complicated?
Me: Yeah, that's why this guy advocated switching to text-based commands instead of menus and dialogs and stuff.
Her: Huh?
Me: You know how Word has a ton of menus, and those have sub-menus, and those launch dialog boxes with a bunch of tabs in them?
Her: Yeah. It's confusing.
Me: Well, if there were easy text commands that resembled natural language, then you could just tell the computer what to do and not worry about the menus.
Her: Wouldn't that mean I have to learn all that? What's wrong with what we've got? It works. I know how to use it. Why change? People won't like the change. It will confuse them.
Me: Well, how about intelligent search....

And so goes another defeat of the engineer. But that's the point. We dream of novel new things.
We dream of users who care to really learn applications. We get people who want consistency. They've learned it, and they don't want to learn it again.

So why is this important? Well, for two reasons. One is that people are amazingly hampered by most applications. By people, I mean everyone. How many of you have spent hours fighting Word or PowerPoint to get some document to look right? I know I've done it plenty of times. That's wasted time, and time is money. But there's a more important reason. I think we've hit a plateau in terms of technology innovation. We have all this shiny new processing power and networking, and most "innovation" seems to consist of things like "social networking" (many were doing that at 1200bps or less years ago), sharing pictures (ditto), and playing media (replacing the TV/VCR isn't innovation) - pursuits that, while they may be very beneficial, hardly represent technology advancements. Why? There are a lot of reasons. I think one of them is that computers are so complicated already that the odds of getting a user to accept them doing something more complicated are incredibly low. You can do it in specialized fields that have a technical leaning (science, engineering, finance, economics), but doing it for others is another ballgame. In the long run, in order to keep innovating, we have to be able to bring innovative software to people who today do not want it, because what they have today is complex enough. We have to change the user interface paradigm, and we have to do it in a way that doesn't scare away people who already "know something."
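The command-translation idea described above can be sketched as a toy - in Python rather than Smalltalk, and with every name (Document, make_bold, the command patterns) purely hypothetical. The point is only the shape: loosely natural typed commands matched against patterns and dispatched to ordinary code.

```python
import re

class Document:
    """A hypothetical document object that commands operate on."""
    def __init__(self, text):
        self.text = text
        self.styles = []  # recorded formatting, e.g. ("bold", "fox")

    def make_bold(self, phrase):
        self.styles.append(("bold", phrase))

    def replace(self, old, new):
        self.text = self.text.replace(old, new)

# Each entry pairs a rough English form with the code it should invoke.
COMMANDS = [
    (re.compile(r"make ['\"](.+?)['\"] bold"),
     lambda doc, m: doc.make_bold(m.group(1))),
    (re.compile(r"replace ['\"](.+?)['\"] with ['\"](.+?)['\"]"),
     lambda doc, m: doc.replace(m.group(1), m.group(2))),
]

def run_command(doc, command):
    """Try each known pattern; return False for unrecognized commands."""
    for pattern, action in COMMANDS:
        m = pattern.search(command.lower())
        if m:
            action(doc, m)
            return True
    return False  # this is the gap where the "optional AI" would go

doc = Document("the quick brown fox")
run_command(doc, "replace 'quick' with 'lazy'")
run_command(doc, "make 'fox' bold")
print(doc.text)    # the lazy brown fox
print(doc.styles)  # [('bold', 'fox')]
```

A real system would need far more than regular expressions, which is exactly why the training loop and the optional AI matter; but even this much shows how "tell the computer what to do" can bypass the menu maze without the user writing code.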
