I'm not sure what your point is. I'm aware of what it is called. But
by "model" are you referring to "Prototype" or just the style of
programming? If the former, then here too it brings with it
preconceptions of class-based OOP. What I'm trying to get across is that
there is a deeper sense to this than class-oriented programmers generally
grasp. With prototype-based OOP, you should not start by thinking about
what a Dog is. Rather, you should just create a dog. Later, if you need a
different dog, you copy your first and change it as needs be. You do
not cookie-stamp a formal prototype. In a way the term "prototype" is
unfortunate. Perhaps "instance-only programming" would convey the idea
better. Of course, in practice one often does create formal entities
nonetheless -- as library tools -- but even so, the bottom line is that
there should be no dichotomy between prototype and non-prototype. There
just isn't any such thing in prototype-based OOP.
I've been working with, and assisting in the development of, the prototype programming language Io for the past 2 years, so I'm fully aware of the design principles one should use when working within the confines of a prototype-based language. As for how people use it: in my experience, those familiar with class-based OOP take about 3 weeks or so before they start to understand that you don't define a template first; you instead define your object, and from that make copies, modifying as needed.

That said, just because it looks like someone is writing in a class-based style doesn't mean they are. Quite often you will find yourself cloning an object which is seemingly acting like a class (not being used itself except as a template for other objects). I don't see the problem here; this comes up quite a bit, and it is often the simpler of the two choices -- reconstruct each new object from scratch, or construct a template holding basic values which a few objects will then clone from. If you have a system like Io or NewtonScript, which use a differential inheritance model, only the changes from the object you are cloning will be attached to your object. You can therefore change the template at a later time, and any clone that hasn't overridden a given slot will pick up those changes as well when the slot is used.

To be honest, I think this satisfies a lot of conditions. Nobody is saying that class-based programming is without its merits, and writing in a seemingly class-based style in a prototype language is certainly possible, and often the simpler case.
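To make the differential-inheritance lookup concrete, here is a minimal Ruby sketch. The names (Proto, clone_proto, the slot methods) are hypothetical, purely for illustration -- Ruby has no such mechanism built in, and real systems like Io do this at the language level:

```ruby
# A clone stores only its own changes and falls back to its prototype
# for everything else, so later edits to the prototype show through.
class Proto
  def initialize(prototype = nil)
    @prototype = prototype
    @slots = {}            # only the differences live here
  end

  def set(name, value)
    @slots[name] = value
    self
  end

  def get(name)
    return @slots[name] if @slots.key?(name)   # own slot wins
    @prototype ? @prototype.get(name) : nil    # else delegate up the chain
  end

  def clone_proto
    Proto.new(self)        # the new object holds no slots of its own yet
  end
end

dog  = Proto.new.set(:kind, "Newfoundland").set(:legs, 4)
fred = dog.clone_proto.set(:name, "Fred")   # only :name is stored on fred

dog.set(:kind, "Malamute")  # change the template afterwards
fred.get(:kind)             # => "Malamute" -- fred never overrode :kind
fred.get(:legs)             # => 4, still found via the prototype
```

The point of the sketch is the lookup order in `get`: an object consults its own slots first and delegates everything else, which is why changing the template later still reaches every clone that hasn't overridden that slot.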
To address your dog point more directly, in case I have to spell it out: it is often the case that you have, say, a Kennel -- think of it here as a data structure that houses only Dog-like objects. It is more efficient (in a system like Io or NewtonScript, at least) to define a single Dog object, and then define dogs from that. I.e.,
Kennel := Object clone
Kennel dogs := list()
Kennel addDog := method(aDog, dogs appendIfAbsent(aDog))

Dog := Object clone
/* newSlot creates a setter slot and a value slot */
Dog newSlot("kind", "Dog")
Now, if your Kennel is going to house a lot of Newfoundland dogs, it may be worthwhile to make a clone of Dog called NewfoundlandDog (or similar) -- unless, of course, only a very small minority are going to be anything but Newfoundlands, in which case I'd make the Dog above representative of a Newfoundland (setting "kind" to "Newfoundland"), and then for any non-Newfoundland types I'd just, as an example:
Fred := Dog clone
Fred setKind("Malamute")
And similarly for other dogs (cloning Fred for other Malamutes, etc.).
It makes sense to do this kind of thing in a differential inheritance system, since it eliminates a lot of repetition.
That said, I understand Ruby doesn't have a differential inheritance system, but these decisions can still be applied to Ruby when using a prototype design pattern. It just makes sense; the nice thing is that it's not required, which is what differentiates it from class-based programming.
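For instance, one way to sketch that prototype style in plain Ruby (all the names here are hypothetical) is with singleton methods and Object#clone, which copies an object's singleton class. Note the contrast with differential inheritance: the copy carries the full set of slots as they were at clone time, rather than just its own changes:

```ruby
# Define an object directly -- no class or template first.
dog = Object.new
dog.define_singleton_method(:kind) { "Newfoundland" }
dog.define_singleton_method(:bark) { "Woof!" }

# Object#clone copies the singleton class, so fred gets all of dog's slots.
fred = dog.clone
fred.define_singleton_method(:kind) { "Malamute" }  # override just one slot

dog.kind   # => "Newfoundland"
fred.kind  # => "Malamute"

# Unlike differential inheritance, later changes to dog do NOT reach fred:
dog.define_singleton_method(:bark) { "Arf!" }
fred.bark  # => "Woof!" -- the copy was taken at clone time
```

This is why the template-change behavior described earlier is something Io gives you for free but Ruby doesn't: in Ruby the clone is a snapshot, so you'd have to rebuild or explicitly delegate to get the same effect.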
On 13-Jul-06, at 5:12 AM, firstname.lastname@example.org wrote:
"One serious obstacle to the adoption of good programming languages is the notion that everything has to be sacrificed for speed. In computer languages as in life, speed kills." -- Mike Vanier