MyLisp Editor

I’ve started to make a very simple editor for MyLisp, just so I can get some highlighting and test-run my snippets.

Here is the exclusive, world-first screenshot ;-)

[Screenshot: mylisppad.png]

Well, that’s about it; source and binaries will be out once it’s more complete.
(The SyntaxHighlight editor component is available for download at www.puzzleframework.com )

Optimizing MyLisp

Today I stumbled across the weirdest thing.

I was comparing the performance of MyLisp to L# and MyLisp was terribly slow in comparison.
(Running the exact same Fibonacci code)

So I checked the code of L# and they seem to do pretty much the same as I do, quite similar design and code.

The only thing I could find was that I evaluated the first arg twice when calling Add, Sub, Mul and Div.
So I changed the code for those functions so that all args are evaluated only once.
And suddenly performance increased by some 1000 times or so; things that took about a minute before now complete in less than 0.1 sec.
I would get it if performance had increased about 2 times, but not like this.
The only thing I can think of is that the original code did some bugged recursive evaluation of some sort.

Well, I’m happy about the performance gain, but a bit annoyed that I can’t find the reason why the old code was so much slower.

Updated source code (C# 3) can be downloaded at:
http://www.puzzleframework.com/roger/mylisp.zip

[Edit. oh dear god I’m so stupid..]

Two minutes after I published this post I found the cause.
I even wrote the reason in this post..
I evaluated the first arg in Add, Sub, Mul, Div twice.
And the Fib code looks like this:

(= fibonacci (fn (n) 
   (if (< n 2) n 
      (+ (fibonacci (- n 1)) (fibonacci (- n 2)))))) 
;;;         ^ problem is right there...

The first arg of that Add call is a recursive call to fibonacci itself, so the extra evaluation didn’t just double the work once, it doubled it at every level of the recursion, and that waste compounds exponentially.
I feel so dumb now..
Well, I guess the best way to find the cause of a problem is to try to describe it to someone else.

In this case it was pretty much like that IQ test where you are supposed to count the letter “F”s in an English text, and most people don’t count the “F” in “of”.

I was only looking at the (- n 1) and (- n 2) parts of the code,
not seeing that I was also adding the results of recursive function calls.
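
To convince myself that one extra evaluation really explains a slowdown of this magnitude, I put together a tiny C# experiment (an illustration only, not code from the interpreter; the names are mine):

using System;

class DoubleEvalDemo
{
    static long calls;

    // Normal recursion: each operand of + is evaluated once.
    static int FibOnce(int n)
    {
        calls++;
        if (n < 2) return n;
        return FibOnce(n - 1) + FibOnce(n - 2);
    }

    // Simulates an Add that evaluates its first argument twice.
    static int FibTwice(int n)
    {
        calls++;
        if (n < 2) return n;
        FibTwice(n - 1); // the wasted extra evaluation of the first argument
        return FibTwice(n - 1) + FibTwice(n - 2);
    }

    static void Main()
    {
        calls = 0; FibOnce(20);  Console.WriteLine("evaluated once:  " + calls + " calls");
        calls = 0; FibTwice(20); Console.WriteLine("evaluated twice: " + calls + " calls");
        // The second count is orders of magnitude larger: the extra evaluation is
        // itself recursive, so the waste compounds at every level of the recursion.
    }
}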

Genetic programming / Math

For those of you who are interested in evolution / GP, here is a small app that I made to reverse-engineer black-boxed formulas.

The application is fed with a “problem domain”.

The problem domain consists of multiple cases, like this:

Case1:
When X is 1 and Y is 2 I want the result to be 3
Case2:
When X is 3 and Y is 5 I want the result to be 8
Case3:
When X is 10 and Y is 10 I want the result to be 20

That was a very simple problem.
And the formula to solve the problem would be (X+Y).

Of course the problems aren’t described in English, they are code, but this is the idea behind it anyway.
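
To give a feel for how that looks in code, a problem domain could be described roughly like this (my own sketch; the type and names are made up, not the actual GenMath API):

struct ProblemCase
{
    public double X;
    public double Y;
    public double Expected;

    public ProblemCase(double x, double y, double expected)
    {
        X = x;
        Y = y;
        Expected = expected;
    }
}

class AdditionProblem
{
    // The three cases from the example above: the target formula is (X + Y).
    public static readonly ProblemCase[] Cases =
    {
        new ProblemCase(1, 2, 3),
        new ProblemCase(3, 5, 8),
        new ProblemCase(10, 10, 20),
    };
}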

Once the application is fed with a problem domain, it will create a population of “formulas”.
Each formula has its own DNA; the DNA in this case is the various functions, constants and variables.

Each formula in the population will be fed with the data from each problem case and then evaluated.
The closer the result is to the goal, the lower the error level.

Once each problem case has been evaluated by a formula, all the error levels per case are summed up.
The summed error level is then used to decide which formulas are allowed to survive and breed.
(There is also a bit of crossover going on between formulas, but I won’t get into that now.)
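
Roughly, the evaluate-and-select step could look like this (my own sketch, reusing the made-up ProblemCase type from above; the real application of course works on formula trees with mutation and crossover, while here a formula is just a delegate):

using System;
using System.Collections.Generic;

delegate double Formula(double x, double y);

class Evolver
{
    // Sum of the per-case errors: the lower the total, the fitter the formula.
    static double TotalError(Formula formula, ProblemCase[] cases)
    {
        double total = 0;
        foreach (ProblemCase c in cases)
            total += Math.Abs(formula(c.X, c.Y) - c.Expected);
        return total;
    }

    // Keep the better half of the population; the survivors get to breed
    // (mutation and crossover of the formula trees are left out here).
    static List<Formula> Select(List<Formula> population, ProblemCase[] cases)
    {
        population.Sort(delegate(Formula a, Formula b)
        {
            return TotalError(a, cases).CompareTo(TotalError(b, cases));
        });
        return population.GetRange(0, population.Count / 2);
    }
}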

So what kind of fancy stuff can this application solve?

Well, here are some examples:
Problem data and results are shown in the pictures.

And no, it does not care if the result is pretty, it just strives to find a working formula.
Of course this could be altered so that once a solution is found, it keeps evolving and promotes shorter formulas.

GenMath sample 1

GenMath sample 2

GenMath sample 3

A more complex solution: the application generated a formula to convert decimal 0-15 into binary representation:

[Image: genmath41.gif]

After some optimization of the final formula it came up with this:

((((int)(((((100 * ((int)(-8) & (int)(x))) - (-23 * ((int)(-4) & (int)(x)))) + x) - 2)) | (int)(7)) + 1) + x)

Not the prettiest formula in the world, but it does what it is supposed to..
Also consider that the extensive use of “(int)” is not really generated by the application;
it’s my ToString implementation for binary operators that adds those.

If we remove all the extra clutter, which is really only part of the visual representation, we end up with this:

(((100 * (-8 & x) - (-23 * (-4 & x)) + x) - 2) | 7) + 1 + x;
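
As a quick sanity check, this small snippet (mine, not part of GenMath) runs the cleaned-up formula for x = 0..15 and prints the result next to the expected binary digits:

using System;

class FormulaCheck
{
    static int Evolved(int x)
    {
        return (((100 * (-8 & x) - (-23 * (-4 & x)) + x) - 2) | 7) + 1 + x;
    }

    static void Main()
    {
        for (int x = 0; x <= 15; x++)
        {
            // Compare against the "real" binary digits, e.g. 5 -> 101.
            Console.WriteLine("{0,2} -> {1,4} (expected {2})", x, Evolved(x), Convert.ToString(x, 2));
        }
    }
}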

I find it quite interesting to see how evolution solves the problem; it’s far from how a human would have solved it.
Evolution also has a habit of cheating, and very much so.

E.g., if I allow the application to use the XOR binary operator, it sometimes finds XOR patterns that can be applied to the problem cases, thus solving the problem, but not in the way you might have wanted it to..

Anyway, for those interested:
The application source code (C# 2) can be downloaded here:
Genetic Math

MyLisp: Callstack and threading

I’ve added multi-threading support to MyLisp today.

The earlier implementation of the callstack was not up to the task, so I had to rewrite it completely.
The new callstack is based on stack frames which can hold local symbols.
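
In essence it is something like this (a simplified sketch of the idea, not the actual implementation; with frame-based locals, each thread can run with its own chain of frames without stepping on the others):

using System.Collections.Generic;

class StackFrame
{
    private readonly Dictionary<string, object> locals = new Dictionary<string, object>();

    // The frame of the caller; null for the root frame of a thread.
    public readonly StackFrame Parent;

    public StackFrame(StackFrame parent)
    {
        Parent = parent;
    }

    public void SetLocal(string symbol, object value)
    {
        locals[symbol] = value;
    }

    // Resolve a symbol by walking outwards through the enclosing frames.
    public bool TryLookup(string symbol, out object value)
    {
        for (StackFrame frame = this; frame != null; frame = frame.Parent)
        {
            if (frame.locals.TryGetValue(symbol, out value))
                return true;
        }
        value = null;
        return false;
    }
}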

So behold! :-) The first multi-threaded application in MyLisp:

(func FooBar () 
  ( 
    (let i 0 
      (for i 0 1000000 
        (print (format "thread {0}" i)))))) 

(= thread (new-thread FooBar)) 
(call thread Start) 
(for i 0 1000000 
  (print (format "main {0}" i)))

Sure, the “new-thread” function is a bit of cheating; I don’t have any generic code for .NET delegate <-> MyLisp delegate conversion yet.
So I have to use hard-coded methods to cast from and to delegates for now.
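
Concretely, the cheat looks something like this (my own sketch; LispFunction and NewThread are made-up names standing in for whatever the interpreter really uses):

using System.Threading;

static class ThreadBridge
{
    // Stand-in for however MyLisp represents a callable function.
    public delegate object LispFunction();

    // Hard-coded conversion: wrap the lisp function in a ThreadStart delegate.
    public static Thread NewThread(LispFunction func)
    {
        return new Thread(delegate() { func(); });
    }
}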

OK, this was not much of a real post, more of a “yay, it works!” shout :-)

Lazy evaluation

My Lisp now supports Lazy evaluation.

I’ve made it possible to define per function if it should use lazy or eager evaluation.
(The next step will be to decide that through code analysis.)

For those who don’t know what lazy evaluation is about, its purpose is to avoid unnecessary calculations.

Take the following example (in C#):

public static void FooBar () 
{ 
  int res = GetSomeValue();         
  //pass the value to a logger 
  Logger.Log ("value of GetSomeValue was:", res); 
}          

public static int GetSomeValue() 
{ 
 //some really heavy algorithm here 
 ... heavy code ... 
 ... more heavy code ...   
 return res; 
}           

... logger code ...           

public void Log (string message, params object[] args) 
{ 
 //let's assume we can attach multiple providers to our logger 
 foreach (LogProvider provider in LogProviders) 
 { 
  provider.WriteString(message,args); 
 } 
}

In this case, we would always need to execute “GetSomeValue” first in order to call our logger,
EVEN if there is no provider attached to the logger.
So, even if we don’t want to write anything to disk or DB or anywhere, we would still have to call “GetSomeValue”.

It is of course possible to add special code to your consumer, “if (logger.Providers.Count == 0) then ignore…”.
But that forces you to add responsibility to your code that isn’t supposed to be there.
My FooBar method should not have to care about the number of log providers etc.

Lazy evaluation can solve this for us.
By applying lazy evaluation to the “GetSomeValue” method, we only need to call the method once its result is actually used.
Sounds odd?
How can you use the result before the code has executed?

It’s quite simple: it’s done through delegates, and we can even do this in C# by passing delegates around.
However, you do have to know that you are passing delegates and not simple values, so it is not very implicit in C#.
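
For comparison, here is roughly what the hand-rolled delegate version looks like in C# (my own example, not from the MyLisp source):

using System;
using System.Collections.Generic;

class LazyLoggerDemo
{
    static List<Action<string, object>> providers = new List<Action<string, object>>();

    // The logger takes a delegate instead of a value, so the caller's heavy
    // computation is only triggered if a provider actually wants the result.
    static void Log(string message, Func<int> lazyValue)
    {
        foreach (Action<string, object> provider in providers)
            provider(message, lazyValue());
    }

    static int GetSomeValue()
    {
        Console.WriteLine("doing heavy work...");
        return 42;
    }

    static void Main()
    {
        // No providers attached: GetSomeValue never runs.
        Log("value of GetSomeValue was:", GetSomeValue);

        // With a provider attached the heavy work runs (once per provider here;
        // MyLisp additionally caches the value after the first call).
        providers.Add((msg, val) => Console.WriteLine(msg + " " + val));
        Log("value of GetSomeValue was:", GetSomeValue);
    }
}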

In My Lisp, the code would look something like this:

(func FooBar () 
    ( 
        (= res (GetSomeValue)) 
        (call logger log "value of GetSomeValue was:" res)))     

(lazy-func GetSomeValue () 
    ( 
         ... heavy code ... 
         ... more heavy code ... 
        (return res)))     

...logger...     

(func Log (message data) 
    ( 
        (foreach provider Providers     

            ;;; GetSomeValue will be called here 
            (call provider WriteString message data))))

As you see, there is no special code to handle the lazy evaluation.
The only thing I needed to do was define “GetSomeValue” as a lazy function.

So if we do not have any log providers attached to our logger, “GetSomeValue” will never be executed, since its result is never needed.

And if there are providers attached, “GetSomeValue” will be executed from within the foreach loop inside the logger.
(Once, then caching its value)
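
Under the hood I picture it as a thunk: the lazy value wraps the call in a delegate, runs it the first time the value is needed and then hands out the cached result (a sketch of the concept, not the actual MyLisp code):

using System;

class Thunk<T>
{
    private Func<T> compute;
    private T value;
    private bool evaluated;

    public Thunk(Func<T> compute)
    {
        this.compute = compute;
    }

    public T Force()
    {
        if (!evaluated)
        {
            value = compute();   // the heavy work happens here, at most once
            evaluated = true;
            compute = null;      // let the delegate (and its captures) be collected
        }
        return value;
    }
}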

Pretty slick IMO.

Lisp operators

I just got home from a conference with my new job, a little bit tipsy and couldn’t sleep.
So I decided to add some new features to my Lisp clone.

I’ve made it possible to set operator priority for functions.
E.g.:

(operator-priority + 1) 
(operator-priority - 1) 
(operator-priority * 2) 
(operator-priority / 2)

This will make the engine scan every list that does not start with a function for operators.
The engine enters a loop that searches for the operator with the highest priority and replaces the operator, together with its left and right arguments, with a new list where the operator comes first (a function call).
The process is repeated until no operators are found.

So a statement like:

(let a ((1 + 10) * 3 / 5 - 1))

Will be turned into:

(let a (- (/ (* (+ 1 10) 3) 5) 1))

Which is a normal Lisp expression that can be evaluated.
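
The rewrite loop itself is conceptually something like this (a sketch of how I would describe it, not the actual engine code; symbols are plain strings here, and nested lists would be rewritten recursively before reaching this point):

using System.Collections.Generic;

static class InfixRewriter
{
    // Operator priorities, e.g. "+" and "-" = 1, "*" and "/" = 2.
    public static readonly Dictionary<string, int> Priorities = new Dictionary<string, int>();

    public static List<object> Rewrite(List<object> items)
    {
        List<object> list = new List<object>(items);
        while (true)
        {
            // Find the operator with the highest priority (left-most wins on ties).
            int best = -1;
            int bestPriority = int.MinValue;
            for (int i = 1; i < list.Count - 1; i++)
            {
                string op = list[i] as string;
                int priority;
                if (op != null && Priorities.TryGetValue(op, out priority) && priority > bestPriority)
                {
                    best = i;
                    bestPriority = priority;
                }
            }
            if (best == -1)
                break; // no operators left, the list is now a normal nested call

            // Fold "left op right" into a new sub-list (op left right).
            List<object> call = new List<object>();
            call.Add(list[best]);
            call.Add(list[best - 1]);
            call.Add(list[best + 1]);
            list.RemoveRange(best - 1, 3);
            list.Insert(best - 1, call);
        }
        return list;
    }
}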

Well, that’s it for now, I’d better get some sleep.

 [Edit]

Update:

Ok, I’ve decided to change this a little bit.
The syntax will be:

(# 1 + 2 * 3 / 4)

And # will be a function that takes the rest of the formula as args.
Thus, not breaking any Lisp rules or syntax.

Remember kids,
Don’t drink and develop…