DRAFT OUTLINE started September 1998

DRAFT OUTLINE

There is an assumption that hides behind almost all of today's software development/engineering practices. That assumption is that programming is expensive.

That assumption is wrong, and it gets more wrong every day.

Where does it come from?

For most of the history of software development, one thing was always true: computer time cost far more than people time. Twenty years ago, a top computer programmer might have cost $20/hour. At the same time, computer time cost [Need reference] $XXX/hour.

In that era, you clearly wouldn't let your programmers "just try something to see if it works". It was far more cost effective to design carefully, review the design in a group, then code it, then review all the code by hand, and only then feed it to the machine for processing.

The most important point to realize here is that the goal was to have as few debug cycles as possible, because each one cost quite a lot of money.

Around 19XX [need reference], the cost of computer time and the cost of human time met. Now, a good programmer costs $40/hour, while a high estimate for computer time is in the ballpark of $1.50/hour; a low estimate could go as low as $0.60/hour.

That alone isn't sufficient to completely change the software development picture. Consider the development cycle of a medium-sized project in 1985, where a compile might take 15 minutes. Assume ten minutes per cycle to change the code (which is the actual work): ten debug cycles would then take more than four hours, of which less than half is productive work and the rest is waiting. While computer time wasn't too expensive then, this mode of development was still quite wasteful. (The arithmetic is worked out in the sketch after the modern comparison below.)

But as we all know, not only is computer time cheaper, it's also better. An hour of computer time now can accomplish what two months would have done in 19XX [need reference].

Now consider that same project today, where it would compile in quite a bit less than a minute, but we'll call it a minute anyway. Under the same assumptions, ten debug cycles take 110 minutes, of which only about ten percent is spent waiting on the computer, and that was with a generous compile estimate. It's now cost effective to let the programmers "hack and run".
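
To make the two scenarios concrete, here is a minimal sketch of that arithmetic in Python. It is not part of the original draft; the compile times, edit times, and cycle count are just the rough figures assumed in the two paragraphs above, not measurements.

    # Debug-cycle arithmetic: each cycle is one compile (waiting on the
    # machine) plus one code change (the actual work).

    def debug_session(compile_min, edit_min, cycles=10):
        """Return total minutes and the fraction of them spent waiting."""
        total = cycles * (compile_min + edit_min)
        waiting = cycles * compile_min
        return total, waiting / total

    # 1985: 15-minute compiles -> (250, 0.6), i.e. over four hours,
    # with 60% of it spent waiting for the compiler.
    print(debug_session(15, 10))

    # Today: call it a one-minute compile -> (110, ~0.09), i.e. under
    # two hours, with only about nine percent spent waiting.
    print(debug_session(1, 10))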

Prototype Early, Prototype Often

It's obvious that pure "hack and run" is a lousy design model. I'm not advocating abandonment of formal design practices. I'm only advocating a much greater involvement of programming in the ongoing design and development process. What "hack and run" is good for is prototyping.

Prototypes allow you to answer all the questions you find yourself asking during the design process. Instead of spending days fooling around with cardboard mockups of an interface, just make a working prototype of the interface and try it.

The only sticking point in the whole thing is that you have to have discipline. Specifically, the discipline to keep yourself from saying "This prototype works, and has all the necessary features; let's call it a product." A rock makes it possible to pound in nails, but you shouldn't be selling rocks in hardware stores (and Windows 98 makes it possible to use a computer, but it still shouldn't be on software shelves...). The good news is that your prototypes don't go into the garbage: most of the prototype should end up, post clean-up, as part of the product. The bad news is that this part of the process is all grunt work, and as such the temptation is to never truly finish.

computer time is vastly cheaper than people time (and has been for a while).

effects of this on software development.
  computer on every desktop
    computers should wait for people, used to be cheaper for people to
    wait to use the computer

  prototype early, prototype often
    compile time drops from hours to seconds, better languages, interpreted
    languages, deeper libraries
    used to be better to waste lots of time in planning to save computer time
    debug cycles used to be very long, very expensive

    better tools - full screen editors, programmable editors, combined
    editor/debuggers, better debuggers, code organizers (cscope)
    used to be too expensive to allocate computer resources to aid the
    development and programming tasks.

why do we cling to old models
  prevents human nature from taking over: "it seems to be working, let's
  call it a product"

  momentum
    in management
    in education (success of technical incidents)

