“Avi Bryant” avi@beta4.com wrote in message
I am not convinced that there is a deep tradeoff of environment vs. language expressiveness and dynamism. The Lisp world back in the '80s had a combination of superb environment (easily comparable to Smalltalk) and heavy use of arbitrarily sophisticated code generation (Lisp macros). All parts of the environment, including the debugger, were amazingly macro-aware.
This is drifting off topic, but I’d like to know more. Was the LispM
environment truly comparable to (modern) Smalltalk? Current Lisp
environments (I’ve used Allegro and LispWorks) really don’t come
close, especially when it comes to tools like version control - what’s
the Lisp equivalent of ENVY or StORE? Or, since that’s where this
thread started, of the Refactoring Browser?
I’m not trying to get into a pissing match here - Lisp is wonderful,
and there are lots of things in CL (and in Scheme, and in Ruby for
that matter) that I miss in Smalltalk. But I honestly do think that
Smalltalk hits a sweet spot in terms of tool support, and I have a
hard time imagining how certain tools would work in the Lisp world -
like versioning by semantic chunks instead of raw text/s-exprs. If
those problems were in fact solved back in the 80s, I’d love to hear
about it.
Lisp is in my distant past, but here is what I recall.
Imo the LispMachine environment was truly comparable to modern Smalltalk.
Not in the details, but in the consistency and integration of all tools
(compilers, editors, debuggers, code browsers, run-time object browsers,
documentation tools, interface-builders, build-tools, persistence managers,
cross-language integration (the C-interface was just coming on line)…).
When the big bet on Lisp hardware failed, the lisp machine’s successors lost
some of those things.
Lisp had much better semantic chunks than text/s-exprs: first-class Packages
and Modules.
A package acts as a namespace, and (use-package P) sets up namespace import
relations between packages.
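A sketch of the namespace mechanics, with hypothetical package names P and Q standing in for real ones:

```lisp
;; Package P exports one symbol; Q imports all of P's externals.
(defpackage :p
  (:use :common-lisp)
  (:export :greet))

(in-package :p)
(defun greet () "hello from P")

;; (:use :p) at creation time is equivalent to calling
;; (use-package :p) on an existing package.
(defpackage :q
  (:use :common-lisp :p))

(in-package :q)
(greet)   ; P:GREET is visible here without a package prefix
```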
A module is a subsystem-like code unit loaded up (as a single atomic unit)
from some number of files. (require M) loads M. (eval-when …) gave control
over whether code chunks got executed when compiled, loaded, etc.
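A sketch of the module idiom, with an invented module name M. Two caveats: PROVIDE/REQUIRE were later deprecated by ANSI CL, and in that era eval-when took the plain symbols (compile load eval) rather than today's keywords.

```lisp
;; In the file(s) making up module M:
(provide :m)   ; register M as loaded (pushes "M" onto *modules*)

;; Ensure this macro exists at compile time as well as load time,
;; so files compiled against M can expand it.
(eval-when (:compile-toplevel :load-toplevel :execute)
  (defmacro with-m (&body body)
    `(progn ,@body)))

;; In client code: load M once, as a unit, if not already present.
(require :m)
```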
The object system was based on generic functions (first Flavors, then CLOS)
with behaviors not necessarily coupled to a class definition. Hence if class
C is defined in package P, you can add generic functions on class C in any
other package or module. Multi-methods took this client-side polymorphism
even further. In addition, the classes themselves can be redefined
dynamically. Readers and writers were macros (comparable to Ruby’s
attr_accessor). And this is without really getting into the meta-object
stuff.
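A minimal sketch of that decoupling, using an invented class C and generic function. The :accessor slot option plays roughly the role of Ruby's attr_accessor, and the method dispatches on both arguments:

```lisp
;; Class C might live in some package P; nothing about the class
;; needs to change for clients to add behavior to it.
(defclass c ()
  ((x :initarg :x :accessor c-x)))   ; :accessor generates reader + writer

;; Defined anywhere else -- another package, another module:
(defgeneric describe-it (a b))

;; Multi-method: dispatch on the classes of BOTH arguments.
(defmethod describe-it ((a c) (b integer))
  (format nil "c with x=~a and int ~a" (c-x a) b))

(describe-it (make-instance 'c :x 1) 2)
```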
I don’t recall anything directly comparable to Envy. But with
modules/packages, an open object system with generic functions and class
redefinition, and decent persistence for these, it was somewhat less needed.
You could do some of the things that ENVY’s applications/sub-applications
provide. I’d say it did a pretty good job (1983, mind you) of providing a
shared code and documentation repository, concurrent access, atomic loads
and unloads, and a software component hierarchy; it was not as good at aiding
the software component lifecycle.
I don’t think refactoring browsers had been conceived of in 1983.
Hth …