WebObjects 4 Everyone
Volume Number: 16
Issue Number: 5
Column Tag: Programming
WebObjects and EOF For Everyone!
By Patrick Taylor and Sam Krishna
Everyone, this is WebObjects and EOF. WebObjects and EOF,
this is Everyone.
When done well, programming provides an answer for a problem. At its worst,
programming obscures the question. The first part of that statement isn't very
controversial: few of us practice programming just for its pure aesthetic joy. We
program for a reason, and when we do it well, we achieve our purpose. The difficulty
that the software industry finds itself in is that normal practice more often resembles
the worst case rather than the best. Applications are fat, slow, complicated and become
impossible to maintain.
The tool you use won't prevent problems; after all, you can write just as bad a
program in Java as in Visual Basic or Perl. But the development environment used
can lead you in certain directions. WebObjects and Enterprise Objects Framework
(EOF) have inherited from earlier NeXT-era tools a certain "cultural" approach to
development that sets them apart from practically all development software available
today. While they may look similar to the dozens of other tools available for web
application development and serving, WebObjects and EOF do things differently. And it
is these core differences that make them a good choice.
The biggest problem facing the developer unfamiliar with WebObjects or EOF (or any
of the new generation Apple development tools) is that "cultural" divide. Many classic
Mac OS programmers have been using object-oriented tools of one sort or another for
well over 10 years, starting perhaps with Object Pascal and ending with C++ or Java.
However, there is a strong sense that the object-oriented approaches we used were
"bolted on." While our tools have been getting more object-oriented, the target
platform (whether it is the MacOS, Windows or a web application server) is still as
procedural today as it was in the 1980s.
In contrast, years before many of us had even heard the term, NeXT made
object-orientation an essential aspect of their tools and platforms. Bear in mind that
this object-orientation is not of the timid C++ variety but the older and more radical
SmallTalk variety of OOPLs (object-oriented programming languages). EOF and
WebObjects bring this radicalism out of the academic closet and into the very practical
world of databases and the Web.
The Back Story (A New Hope)
One of the difficulties in explaining why WebObjects and EOF are such powerful tools
and technologies is that there isn't much of a common history. The back story of NeXT
is partly known but like the history of the Macintosh, it is so intertwined with the
story of Steve Jobs, that it is difficult to give the technology its due.
Apple hasn't been much help here. Many people in the industry refer to the way Apple
sells WebObjects and EOF as "stealth marketing." Whether WebObjects and EOF are the
best kept secrets of the web application market because Apple doesn't know how to sell
to the enterprise or they are waiting until MacOS X arrives in its full "Aqua"-fied
glory is the subject for an editorial not a technical introduction. There is a great deal
to admire about these products and they could make a big difference to you as a
developer, but the first step in demystifying them is to tell the back story.
The Model-View-Controller software pattern
Model-View-Controller (or MVC) has its roots in SmallTalk, according to the famous
Design Patterns book by Erich Gamma, Richard Helm, Ralph Johnson, and John
Vlissides. MVC consists of a triad of classes which are structured to decouple the
behavior of the application from its user interface. In MVC, often the most reusable
parts are the Model classes and the View classes, while the Controller classes are rarely
reused.
An example would probably be the most useful here to illustrate what happens. Since
Apple hasn't managed to significantly update the Calculator in the classic MacOS, you
might want to develop a half-decent calculator. In this program, there would be a
CalculatorModel class which has the code used to actually perform and return results
of the computation, and a CalculatorController class which is used to broker between
the Calculator UI and the CalculatorModel class - it passes input from various digit
and operation buttons and a text field to the CalculatorModel class to perform
computation. The View part of the Calculator is actually a set of
classes used to represent digits 0-9, the symbols for math operations (addition,
subtraction, multiplication, and division), and a text field which actually can receive
direct keyboard input as well as display computational results.
Before MVC, many programmers would simply cram all of this into a monolithic code
mess: interface, computational engine and all. The computation code for 'addition' would
be inside of the '+' (plus) code widget! Perhaps this doesn't seem like such a terrible
thing, but by separating the Model and the View, we could use WebObjects to build a
new view based on either HTML or a Java client applet which used the exact same
computation engine. Or we could build an interface designed just for Windows. Or
convert the interface to compute using Roman numerals. While this might not seem
like such a big deal for a calculator, it makes far more sense for hundreds of
higher-end applications.
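To make the separation concrete, here is a minimal sketch in Java of the calculator split described above. The class and method names are hypothetical, chosen to match the discussion; the point is only that the Model knows nothing about buttons, text fields, HTML, or any other View.

```java
// Hypothetical sketch of the MVC split: the Model performs the computation
// and has no knowledge of any user interface.
class CalculatorModel {
    private double accumulator = 0;

    // Perform the computation and return the running result.
    double add(double value)      { accumulator += value; return accumulator; }
    double subtract(double value) { accumulator -= value; return accumulator; }
    double result()               { return accumulator; }
}

// The Controller brokers between a View (any View: desktop widgets, an HTML
// page, or a Java applet) and the Model.
class CalculatorController {
    private final CalculatorModel model = new CalculatorModel();

    // Called by a View when the user enters digits and presses '+'.
    String plusPressed(String digits) {
        return String.valueOf(model.add(Double.parseDouble(digits)));
    }
}
```

Because only the Controller's methods are View-facing, swapping the desktop View for an HTML one means writing a new View and, at most, a new Controller; the computation engine stays untouched.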
Interface/Access/Control
The EOF 2 team realized they needed broad functional separation, as well as to
leverage the work of the engineering team behind AppKit, the desktop UI
framework for Openstep (aka Yellow Box, aka Cocoa). In general, database functionality
breaks down into three areas: the interface, or display of the data fetched; access, the
actual retrieval and manipulation of the data at the database level; and control, the
layer which brokers between interface and access. EOF2 provided a database-specific
variation on MVC, and it worked spectacularly well.
The Access layer is responsible for actually generating the SQL necessary to create a
"state reflection" of the actual state of the EOs (Enterprise Objects) in memory at the
time the EOF equivalent of a commit is called. There are two distinct layers to the
Access layer: the EOAdaptor layer and the EODatabase layer. At the lowest level, the
EOAdaptor layer is responsible for the SQL generation. The EODatabase layer is
responsible for taking the rows fetched through the EOAdaptor layer and packaging
them into EOs and registering them with the Control layer. What's incredibly cool about
the Access layer is that all your SQL is generated for you at runtime - you don't ever
have to write SQL again if you're just doing generic SELECTs, INSERTs, UPDATEs, and
DELETEs.
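The idea of runtime SQL generation can be illustrated with a sketch. To be clear, this is not the real EOAdaptor API - it is a hypothetical Java fragment showing how an adaptor-like layer could derive a generic UPDATE statement from a snapshot of an object's changed attributes, so the developer never writes the SQL by hand. (A production adaptor would also use bound parameters rather than inlining values.)

```java
import java.util.Map;
import java.util.stream.Collectors;

// Illustrative only: a sketch of generating SQL at runtime from the changed
// attributes of an object, rather than hand-writing each statement.
class SqlSketch {
    static String updateStatement(String table,
                                  Map<String, Object> changes,
                                  String pkColumn, Object pkValue) {
        // Build "COL = 'value'" pairs from the changed-attribute snapshot.
        String setClause = changes.entrySet().stream()
            .map(e -> e.getKey() + " = '" + e.getValue() + "'")
            .collect(Collectors.joining(", "));
        return "UPDATE " + table + " SET " + setClause
             + " WHERE " + pkColumn + " = " + pkValue;
    }
}
```

Given a change of the NAME attribute to "Smith" on an EMPLOYEE row with primary key 42, this sketch would emit `UPDATE EMPLOYEE SET NAME = 'Smith' WHERE EMP_ID = 42` - the same kind of generic statement the Access layer produces for you at commit time.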
The Control layer is responsible for managing the EOs in memory and notifying both
the Interface and Access layers to update their information whenever EOs change in
memory or when a commit is requested by the user. Without getting into too much
detail, the Control layer is responsible for managing the state information of the EOs
as well as controlling the other layers.
There are actually two different versions of the Interface layer: the WebObjects
framework and the EOInterface framework. EOInterface is used for desktop
applications and maps various attributes of EOs to desktop widgets; while the
WebObjects UI framework does the same, but from a web perspective.
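The brokering role of the Control layer can be sketched as a simple observer arrangement. Again, these are hypothetical class names, not the real EOF classes: the point is that the Control layer tracks changed objects and broadcasts the changes to whichever layers registered interest - a desktop Interface layer, a web one, or the Access layer that will later generate SQL for the commit.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the Control layer's job: manage changed objects in
// memory and notify the other layers when an object changes.
interface LayerObserver {
    void objectChanged(Object eo);
}

class ControlLayerSketch {
    private final List<LayerObserver> observers = new ArrayList<>();
    private final List<Object> pendingChanges = new ArrayList<>();

    // Interface and Access layers register themselves here.
    void addObserver(LayerObserver o) { observers.add(o); }

    // Record a change to an EO and broadcast it to every registered layer.
    void recordChange(Object eo) {
        pendingChanges.add(eo);
        for (LayerObserver o : observers) o.objectChanged(eo);
    }

    // Changes held in memory until the user requests a commit.
    int pendingCount() { return pendingChanges.size(); }
}
```

One change, many listeners: that is why the same Model objects can drive a desktop Interface layer and a web one without either knowing about the other.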
Unless you have some of that history, you'll probably always wonder why the original
developers did what they did. Or, alternately, you could resort to the ever-popular "it
just works" explanation. Since neither of these options puts any food on our table, this
series will attempt to explain WebObjects and EOF, not just in a technical way, but
also in a historical sense. There is much more to know than the fact that
WebObjects has the cool name and EOF the dorky one.
So if you want to understand NeXT technologies, you should at least know a little bit
about Objective C ...
A brief story of Objective C or "When you use C++, your
objects can't hear you scream"
Computer languages can inspire remarkable loyalty. Regardless of what flaws might
exist, there is rarely a language that doesn't possess at least a small cult of dedicated
followers. Flame wars between language advocates are so ever-present that it has
become commonplace for language agnostics to argue that all languages are equally
useful. This is a nice sentiment, but all languages have different strengths and
weaknesses. As a friend of mine used to say, "Pizza is nice, pizza is tasty, but you can't
eat pizza every night." Objective C is such a language, full of advantages and flaws,
some perceived, some real. And it was just those properties of Objective C that made
WebObjects and EOF possible.
Languages are created as an answer to an immediate problem. In the case of Objective
C, the problem was one described back in 1968 as the "software crisis." Fred Brooks
wrote about this problem in his seminal work "The Mythical Man-Month" and in his
famous article "No Silver Bullet: Essence and Accidents of Software Engineering." In a
nutshell, the software crisis is caused by the difficulty of scaling software development.
An insufficient number of people are available to develop the increasingly more
complex applications the market demands. The paradox is that adding more workers to
a project is not a solution; in fact, this would most likely aggravate the situation due to
difficulties in managing the increased communication complexity and the new workers'
varying levels of technical ability. According to Brooks, the problem is not one that
will ever improve dramatically because the very nature of software development will
impede this change. And despite the huge libraries of how-tos and programming
methodology books, Brooks' thesis seems to be holding true.
Brad Cox didn't believe this had to be so. In his Byte article "What if there is a Silver
Bullet: And the Competition Gets There First?", Cox argued that the problem isn't
structural to computer programming itself, but had more to do with the way the
industry carried out development. Taking an artisan-like approach to programming,
developers were creating unique implementations of software each and every time.
Brad Cox proposed the idea that software development needed to go through its own
"Industrial Revolution.
While code-reuse gets a lot of lip service today, it is still rare that significant
amounts of code get reused. And - this is the critical point - even when we do practice
reuse, we do it as an add-on. To overcome the software crisis, the whole process of
coding must change to make it simpler to practice reuse. "Re-user-friendly," if you
will.
Many people complain about a problem; Brad Cox did something about it. He created
Objective C, a computer language that added a small set of object-oriented extensions
to the ever-popular C language. Among its other features, Objective C featured a
dynamic runtime that allowed compiled objects to communicate with other objects
without knowing the exact implementation of the target object. The reason for the
dynamic runtime was that Cox foresaw a day when programming would involve groups of
interchangeable "software widgets."
Cox didn't create Objective C from scratch; he borrowed the syntax and philosophy
from SmallTalk, the grand-daddy of object-oriented programming languages.
SmallTalk was an interpreted language environment whose remarkable flexibility was
used to create the first graphical interfaces emerging from Xerox PARC. Some of the
finest minds in computing are associated with SmallTalk, including Alan Kay, a
Macintosh legend in his own right, and Kent Beck, author of several classic books on
programming.
While far more advanced than any other programming language of its day, SmallTalk's
problem (other than its sheer alienness when compared to the procedural languages of
the day) was that, being interpreted and message-passing, it had significant
performance problems. The tradeoff of performance for flexibility and development
speed was a minor one for the academics and researchers who used SmallTalk to create
the early prototypes of our present day interfaces and applications. Commercially
SmallTalk was much harder to sell, not that the hardcore fans didn't try.
Brad Cox, however, took a middle road and hybridized SmallTalk (easy to program and
reuse) with C (fast and compiled) to create Objective C. Almost two decades later, it is
easy to forget that Cox was working in a world where microcomputers had 8- or
16-bit processors that topped out at about 8 MHz. Many programmers still swore by
assembler due to the scarcity of resources, not the least of which was less memory
than you'll find in the L2 cache of a modern processor. So while far
from perfect, Objective C was probably the best compromise between his goals and the
reality of the computer industry at the time.