The Future of OOP: One Man's Crystal Ball
Jeff Alger
This year marks a turning point in the object-oriented community. A future that
seemed so clear and so certain until recently is now clouded by a haze of conflicting
products, markets, and philosophies. Although no one can claim to have perfect vision
of the future, I will give it a try, and the right place to start is not by looking forward,
but backward at the history of our industry. For, as the saying goes, "those who do not
learn the lessons of history are doomed to repeat them."
Lessons from the History of OOP
In the 1970's, a small, talented group of researchers at a place called the Xerox Palo
Alto Research Center, or PARC, invented the personal computer and with it, the first
true object-oriented programming language, Smalltalk. Computer hardware wasn't far
enough along for their invention to be practical. Xerox as a company wasn't ready for
it. And there really wasn't much of a perception in the software community that there
was a need for it, anyway. Remember that at that time database technology and
information engineering were just coming into their own and minicomputers were
sprouting like weeds. So the Star workstation and Smalltalk languished, in my opinion
more because the market didn't see the need than because of any failure of Xerox to
follow through.
Soon thereafter, a brash young entrepreneur named Steve Jobs visited the PARC labs
and turned what he saw into the Lisa computer. It flopped, again because it was ahead of
the curve on hardware and there was no groundswell of support for new ways of
developing software. Even when the most egregious problems of the Lisa were
corrected in the Macintosh, it was a hard sell for Apple Computer to convince the
public that new techniques were needed, especially after IBM weighed in with its own
personal computer. The IBM PC was certainly less imaginative than the Macintosh, but
it was closer to traditional architectures both in terms of hardware and software, so it
was easier for the systems community to get its hands around. A toy mainframe that
sold because it reminded people of the real thing.
About the time of the Macintosh and Lisa, a small group within Apple recognized the
potential, not just of the human interface and hardware aspects of what the PARC group
had created, but of the software techniques they had used to program their machines.
Object-oriented software seemed such a natural way to handle an iconic, event-driven
graphical user interface that they created in succession Object Pascal, the Lisa
Toolkit, the original MacApp, even an abortive Smalltalk for the Macintosh. There was
talk even of object-oriented operating systems, objects from soup to nuts. Their work
was only mildly influential in determining the course of Macintosh software
development tools, as we saw a steady progression of procedural operating systems,
languages, tools and techniques. How many people are aware that as recently as three
and a half years ago, at a time when the then-MacApp Developers Association had about
a thousand members worldwide, there were only two - count 'em - engineers in
all of Apple working on MacApp?
Apple had the option of going object-oriented all the way at any time up to the advent of
Multifinder. There are those who disagree with me, but I and many others there at the
time feel that had they done so, there would have been no need for a Taligent today and
we would perhaps already be programming to the tune of Rhapsody in Pink. But even
within Apple, champion of new ideas, counterculture of the computer industry,
conservatism won; there were just too many people who did not even see the problems,
let alone the value of object-oriented solutions.
But I'm getting ahead of myself. Across the continent, squirreled away in an obscure
corner of Bell Telephone Laboratories, a fellow by the name of Bjarne Stroustrup
spent his time writing software to simulate exactly what, I don't know, but the Labs
have always done a lot of simulations work. He used a language called Simula, an
offshoot of Algol designed specifically to simulate real-time stochastic processes and
arguably one of the first true object-oriented languages before that term had been
coined. But Bell Labs did other things beyond simulations. And like any large
organization, they developed software in a variety of languages: COBOL, Fortran, and
lots of obscure languages like Simula and Snobol. The breakup of the phone system was
forcing all of AT&T to think about new ways of making money, and Unix looked like a good
bet. But there was a problem: how could AT&T tell everyone else in the world to use
Unix and its companion language, C, if they themselves weren't? So, the order came
down from on high: henceforth, all software will be written in C. Now, I have heard
conflicting stories as to whether this order directly prompted Stroustrup to migrate to
C or whether it merely built a critical mass that made C more acceptable as a delivery
vehicle. It doesn't matter, because AT&T just wasn't interested in any language that
wasn't C. C++, like most other object-oriented innovations, was ignored by its own
company. Today there are many within AT&T who speak of the fish that got away.
Wherever you look, the history of object-oriented technology has not been pretty.
Microsoft was smart enough to recognize an object-oriented image problem when they
saw it. Windows is as object-oriented as one can get without an object-oriented
language, yet nowhere in the early literature of Windows was the term "object" even
used. They recognized the merits of the approach but realized that few others did.
Operating systems designers have been doing what could arguably be called
object-oriented programming - attaching function vectors to packets of data - since
the 50's but remain, perhaps for the very reason that they've done so well without it,
curiously skeptical of the need for object-oriented languages. Computer scientists
have been pushing the use of abstract data types - encapsulating data behind functional
interfaces - for decades but no one in the commercial arena has been there to listen,
with the singular exception of Ada. Even in the rarefied world of databases, where
challenges to the data-driven approach are quickly shown the door, the trend has been
strongly toward what arguably could be called object-oriented architectures. Call
them triggers, stored procedures, or what have you, but the fact remains that modern
data modeling requires associating functions with packets of data, the core concept of
OOP. Yet, few in the database community are willing to call a spade a spade: most
so-called "object-oriented databases" are, in fact, nothing more than glorified
network or relational models that support complex data types, and many so-called
"object-oriented methodologies" are nothing more than recycled information
engineering.
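
To make the idea concrete, here is a minimal sketch, entirely my own illustration rather than anything drawn from the systems literature, of what "attaching function vectors to packets of data" looks like in plain C: a struct carries its data alongside pointers to the functions that operate on it, and callers dispatch through those pointers instead of naming a function directly.

    /* Hypothetical example: a "packet of data" with its function vector. */
    #include <stdio.h>

    typedef struct Shape Shape;

    struct Shape {
        double width, height;                  /* the data                */
        double (*area)(const Shape *self);     /* the attached functions  */
        void   (*describe)(const Shape *self);
    };

    static double rect_area(const Shape *self)
    {
        return self->width * self->height;
    }

    static void rect_describe(const Shape *self)
    {
        printf("rectangle, area %.1f\n", self->area(self));
    }

    int main(void)
    {
        Shape r = { 3.0, 4.0, rect_area, rect_describe };
        r.describe(&r);   /* dispatch through the function pointer */
        return 0;
    }

Swap different functions into the same slots and the same calling code behaves differently, which is about all the polymorphism an operating system dispatcher has ever needed.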
Even today, with magazines, training classes, programming languages, conferences,
college curricula and associations devoted to object-oriented technology, it is easy to
get fooled into thinking that the war is won and that the ramparts themselves are now
object-oriented. The problem is that OOPers tend to talk mostly to other OOPers and
forget how much resistance there is to the idea outside our own mutual support group.
Well, as one who advises companies on these issues, I can tell you that outside our own
ranks OOP is still viewed as either snake oil or a silver bullet, but not as a practical
tool for solving everyday problems.
Throughout its history, OOP has been the Rodney Dangerfield of software: it gets no
respect.
There are, of course, reasons for this history, and they are important to understand if
we are to anticipate the future, for there is little reason to think they are about to
change after all this time. Throughout, little emphasis has been laid on solving real
problems that translate to real market share. That is, the OOP community has tended to
be internally focused, developing great ideas and products and only then trying to
convince the world that there is a need for them. Accompanying this has been more
than a little arrogance, especially in waving off the very real concerns of managers
everywhere: integration with existing systems and techniques; leveraging skill sets
already in place; having measurable, controllable and repeatable processes rather
than a few smart people locked in a room arguing with each other. In Marketing 101
they teach you what happens when you try to bully the market. I am reminded of the
story of the then-Chairman of the Great Atlantic and Pacific Tea Company being
approached earlier in this century about sponsoring a Sunday afternoon radio show. He
declined, saying that he doubted anyone would listen during that time slot. After all,
everyone he knew played polo on Sunday afternoon. Well, everyone I know in the
object-oriented community thinks polymorphism is really important.
Wild, unsubstantiated and often patently false claims have been made about the benefits
of object orientation. Where, for example, are the case studies of large-scale code
reuse to back up all the popular literature on the subject? Why, if this is so
"natural," did the organizers of OOPSLA a couple of years ago feel compelled to hold a
panel on the subject, "Why Is Object-Oriented So Difficult?" And why, if the payback
is so quick and dramatic, is it quietly understood in the OOP community that it takes
two years to develop a good object-oriented engineer?
We have done a very poor job of articulating why technology managers should believe
us when, like Charlie Brown running up to kick the football every fall only to have
Lucy once again yank it away at the last second, they have been consistently let down by
other, similar, claims in the past. A good friend of mine, John Brugge of IDS Financial
Services, circulated a paper in his company explaining the relative merits of
object-oriented technology. It spoke of dramatic increases in productivity, lower
maintenance costs, better results, higher quality. Reading the paper at his urging, I felt
it to be quite mainstream, the sorts of claims to be found throughout the literature on
the subject. After five pages, however, the paper broke off in mid-sentence: "I can't
go on with this. This really is not my paper." He explained that the paper was, in fact,
from a book by Edward Yourdon from the 1970s that dealt with the structured
programming revolution; John had literally done a bulk search-and-replace of
"object-oriented" for "structured" and otherwise left the wording unchanged. Little
wonder that we are viewed with suspicion.
Another problem has been a lack of focus. Seventy-five percent of development costs, and an even
higher percentage of software lifecycle costs, are tied up in analysis and design but, as
Yourdon points out in his new book, "The Decline and Fall of the American
Programmer," the OOP community seems stuck in the backwaters of code. Even within
that limited domain the focus has tended to be more on piling feature after feature into
the syntax of languages while giving short shrift to the problems that really consume
programmer time: memory management, debugging, object persistence ("you mean
you actually want to store your data?") and integration with non-OOP technologies.
Even worse, OOP has often been the excuse used for working without any methodology
whatever. I call this the "Brilliant Pebbles" approach to software development, named