Marc Heiler wrote:
Cobra sounds interesting.
And the syntax is - for my eyes - clean.
However, there is one thing I worry about - isn't adding many
features problematic?
Well, this is a very philosophical question that goes to the very
heart of programming language design. There are basically three
different schools of thought:
1) A programming language should have a very small kernel of very few,
very simple abstractions, plus powerful ways of combining these
abstractions. The prototypical example of this is Scheme ... I mean,
it doesn't even have *syntax*! Everything else can be built *on top*
of these abstractions. Take the famous book Structure and
Interpretation of Computer Programs as example: it teaches you how to
implement object-orientation, lazy evaluation, garbage collection, an
interpreter, a compiler, logic programming, heck, even a CPU(!) in
Scheme.
2) A programming language should provide many powerful abstractions
and paradigms. One of the most extreme examples of this is Oz/Mozart.
This programming language takes the word "multi-paradigm" to a whole
new level: it is declarative, imperative, functional, procedural,
object-oriented, lazy, strict, you name it. It even supports logic
programming, dataflow programming. For concurrent programming it
supports futures, actors, threads, ... Its approach is best
exemplified by the book Concepts, Techniques, and Models of Computer
Programming, which *also* teaches you OO, logic programming, lazy
evaluation and so on. But, unlike SICP, it doesn't teach you how to
*implement* them (after all, Oz already supports them), but how to
*use* them. (A more mainstream example of a multi-paradigm language is
C#. Originally basically a clone of Modula-2 with a Java object system
and C++ syntax, it has since grown functional features, monads (in the
form of LINQ) and most recently dynamic typing.)
3) A programming language should play nice with other languages, so
that you can mix and match paradigms by mixing and matching languages.
This is of course the idea that Parrot and .NET are built on and what
the JVM is moving towards. And, of course, this approach is also
exemplified in a book: Programming Languages: Application and
Interpretation. This book teaches you again the same things that SICP
and CTM teach you, but this time it uses *different* languages for
each concept. Well, mostly Scheme, but lazy evaluation is shown in
Haskell and logic programming in Prolog. In my education at the
University of Karlsruhe, Germany, we were taught different concepts
using, among others, Java, Pizza, Gofer, Python, MIPS Assembly, C,
Prolog, SQL, OQL, XSLT, plus a whole range of very specific teaching
"languages", like lambda calculus, universal Turing machine,
µ-recursive functions, Semi-Thue systems, Markov systems, the FOR
language, the WHILE language (whose only control structures are FOR
and WHILE, respectively), a hypothetical assembly language for which we
had to build a hypothetical CPU, just with pen and paper and so on.
Like, I get to decide, right down to the basic level, to use only a
subset of these features, to keep complexity down?
Well, if you don't want to use contracts or tests, then don't write
them.
Cobra even helps you out with that:
def m(p as int) as dynamic
    """
    doc
    """
    require
        ...
    ensure
        ...
    test
        ...
    body
        ...
is the normal form of a method definition. If you leave off the
contracts and tests, it would be:
def m(p as int) as dynamic
    """
    doc
    """
    body
        ...
But if you *only* provide the method body, you don't need to
explicitly say that:
def m(p as int) as dynamic
    """
    doc
    """
    ...
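For readers without a Cobra compiler at hand, here is a rough Python analogy (not Cobra, and not how Cobra implements it) of what the require/ensure/test sections mean: a precondition checked on entry, a postcondition checked on exit, and a unit test attached to the method. The method name m, the parameter p, and the doubling body are made up purely for illustration.

```python
def m(p):
    """doc"""
    # require: precondition, checked before the body runs
    assert p >= 0, "require failed: p must be non-negative"

    # body
    result = p * 2

    # ensure: postcondition, checked before the result is returned
    assert result >= p, "ensure failed: result must not shrink"
    return result

# test: Cobra keeps the test with the method definition;
# here it is just a plain assertion after it
assert m(3) == 6
```

The point of the Cobra syntax is that all four sections are optional and live next to the code they describe, rather than in a separate test file.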
jwm