I've been experimenting with QR codes (2-d bar codes) with my mobile phone. While helping my daughter assemble a Lego® spaceship, I had the random thought that maybe I could build a QR code out of Lego. Would such a QR code be readable, or would the bumps and gaps mess it up?
To find out, I generated a QR code and started assembling it out of my daughter's spare Lego pieces. This would have been much easier if she had owned a lot of small white pieces; instead, I really had to dig to find the necessary 1x1 blocks. (I'm pretty sure that determining whether a code can be tiled with a specific set of blocks is an NP-complete problem, but my manual heuristics were sufficient to get it assembled.)
The moment of truth... would it scan? No, not at all. The problem was that a QR code needs a white border (the "quiet zone") around it, and I'd put the pieces too close to the edge. So I laboriously shifted all the pieces over and added a white border.
With the border, the QR code scanned amazingly well. Here's a camera-eye view. If you have a barcode-enabled phone, you can probably scan this off the screen:
I used the ZXing barcode scanning program on my phone. I expected the scan would be sensitive to the orientation, lighting, and distance, since Lego isn't very even. However, the phone scanned the blocks surprisingly rapidly and effectively. Note the orientation-independence:
My conclusion is that Lego is a viable medium for implementing QR codes.
Some tips if you try this at home... QR codes come in multiple sizes with various levels of error correction. The minimum size is 21x21 modules, but many QR websites will generate larger ones; avoid the larger sizes unless you want to do a lot more assembly. I used the Raco Industries online QR code generator since that site provides a lot of control over the generated QR code. If you print out the code with .3125" pixels, the output will conveniently match the size of the Lego blocks (the stud spacing is 8 mm, almost exactly 5/16 inch).
(Apologies if you were expecting some Arc code in this article.)
Professor Ronald Loui has an interesting article on the rise of scripting languages (In Praise of Scripting: Real Programming Pragmatism) in the July 2008 issue of IEEE Computer. It claims that scripting languages such as Perl, Python, and Javascript have dramatically fulfilled their early promise, provide many benefits, and are poised to take over the lead from Java. However, the academic programming language community is stuck in theory and hasn't recognized the ascendance of scripting languages.
I agree that scripting languages are on the rise. Most people would agree that they provide rapid development, higher levels of abstraction, and brevity that helps the programmer. The article also describes how scripting languages can be a performance win, since they can allow experimentation and implementation of efficient algorithms that would be too painful in Java or C++. So even if C++ is faster on the micro-benchmark level, a programmer using a scripting language may end up with faster algorithms overall. I've argued somewhat controversially that Arc is too slow for my programming problems, so I remain unconvinced that basic performance can be ignored entirely.
As for the claim that Java is in full retreat, it strikes me as wishful thinking. (I'd believe "slow decline" though.) It will be interesting to check back on this claim in 5 years.
I personally believe that CS1 [freshman computer science] Java is the greatest single mistake in the history of computing curricula.
-- R. Loui
The article suggests that good languages for teaching introductory computer science are gawk, Javascript, PHP, and ASP, but that Python is emerging as the consensus choice for the best freshman programming language. This is the hardest part of the article for me to swallow. The idea of writing real programs in Awk never occurred to me, and I remain skeptical even though the author claims it works well. For those who would suggest Scheme as an introductory programming language: it was displaced as the dominant freshman language by Java a decade ago and apparently is no longer considered an option.
I can't argue with the author's claim that student learning is enhanced by experimenting, writing code, and getting hands-on experience, and that scripting languages make this faster and easier.
Python and Ruby have the enviable properties that almost no one dislikes them, and almost everyone respects them.
-- R. Loui
In Why your favorite language is unpopular I discussed how the Change Function model can explain the success of programming languages based on maximizing the crisis solved and minimizing the perceived pain of adoption. This model applies to scripting languages as well:
Magnitude of crisis solved by Tcl/Tk: High - How to add a scripting language to a C program. How to add a GUI to a C program without painful X11 and Motif code.
Total Perceived Pain of Adoption: Low - Link Tcl in with your C program and add a few hooks. Create the GUI with trivial scripts.
Magnitude of crisis solved by Perl: High - How to quickly write CGI scripts. How to solve problems too complex for shell scripts. How to process files. How to develop quickly and iteratively.
Total Perceived Pain of Adoption: Low - Apart from looking like line noise, Perl is easy to get started with, is well integrated with Unix, has the definitive regex implementation, and has libraries for almost everything.
My point is that these languages solved specific painful problems and had low pain of adoption. As a result, they were much more successful than beautiful, powerful languages that were less able to directly solve painful problems or were more painful to adopt.
The real reason why academics were blindsided by scripting is their lack of practicality.
-- R. Loui
A major thrust of the article is that academics are too concerned with theoretical issues of syntax and semantics, rather than pragmatic issues of what a language can achieve quickly, inexpensively, and practically. Academics are said to be too tied to theoretical concepts such as object-oriented programming and strong typing, and are missing the real-world benefits of scripting languages.
(Interestingly, Rob Pike made a similar argument against academics in the context of operating systems software (Systems Software Research is Irrelevant), stating that academic research is irrelevant and the real innovation is in industry. Since I have friends doing academic OS research, I should add a disclaimer here that I don't necessarily agree.)
One measure of pragmatics raised by the paper is how well a language works with other Unix tools. I think the importance of this is underappreciated. In particular, I view this as a significant barrier to adoption of Arc. Running Arc as a shell script instead of a REPL is nontrivial (as is the case with many Lisp and Scheme implementations). Running an external program from Arc is clunky, even though it is often necessary to actually get things done (Kens' law), and real pipes are missing from Arc entirely.
Java's integration with Unix also has painful gaps - where's getpid() for instance? Why is JNI so difficult compared to calling native code from C#? I blame Sun's pure-Java platform independence ideology, and I'm surprised it hasn't hurt Java more.
On the other hand, Python and Perl provide a remarkable degree of integration, which I view as a key factor in their success. Likewise, Visual Basic is highly integrated with the Windows environment and highly successful there.
In conclusion, Loui's paper raises numerous interesting points about the success of scripting languages. I expect that the reasons for the rise of scripting languages will only get stronger, and languages that don't support the scripting model will have an increasingly hard time gaining adoption.
Note: quotes above are from the preprint and may not match the published article.
The total world's population of Haskell programmers fits in a 747.
And if that goes down, nobody would even notice.
-- Erik Meijer
I recently saw an interesting talk on functional programming by Erik Meijer (of Bananas, Lenses, Envelopes, and Barbed Wire fame). Among other things, he discussed why many superior technologies such as Haskell don't catch on.
Geek formula for success
He claims the "Geek formula" for success of a technology is that if a technology is 10 times better, it should catch on and become popular. Even if it is slower, Moore's law will soon make it 10 times faster.
So if Haskell is 10 times better than C and Haskell programs are 10 times shorter, everybody should be using Haskell.
Real-life formula for success
However, as Erik points out, "That's not how it is in real life." In real life, success is based on the perceived crisis divided by the perceived pain of adoption. Users want something that will get the job done and solve their crisis, without a lot of pain to switch.
This argument applies to many languages that remain unpopular despite their technical merits, such as Lisp, Arc, and Erlang, as well as technologies such as the Semantic Web and LaTeX.
The Change Function
The above argument is based on the book The Change Function: Why Some Technologies Take Off and Others Crash and Burn by Pip Coburn. To summarize the book, new technologies aren't adopted because they are great, new, and disruptive; they are adopted only if the user's crisis solved by the technology is greater than the perceived pain of adoption. As a result, most new technologies fail.
The first half of the book is a bit fluffy, but it gets more interesting when it discusses specific technologies that failed or succeeded. The book also goes out on a limb and predicts future winners (mobile enterprise email, satellite radio, business intelligence software) and losers (RFID, entertainment PCs, WiMax).
Languages and The Change Function
The Change Function argument has a lot of merit for explaining what languages become popular and what languages don't.
If Lisp is so great, why are there 8 million Visual Basic programmers worldwide and few Lisp programmers? The answer isn't pointy-haired bosses (since Lisp isn't popular on SourceForge either). The crisis vs. pain of adoption model provides a powerful explanation:
Magnitude of crisis solved by Visual Basic: High (e.g. how to easily write Windows applications)
Total Perceived Pain of Adoption for Visual Basic: Very Low (hit Alt-F11 in Excel and you're done)
Magnitude of crisis solved by Lisp: Low (metaprogramming, powerful macros, and higher-order functions are solutions in search of problems)
Total Perceived Pain of Adoption for Lisp: High (this shouldn't require explanation)
The same model explains the success of, for instance, Java:
Magnitude of crisis solved by Java: High (originally how to run code in a browser and write portable code, now how to avoid crashes due to memory allocation errors and bad pointers)
Total Perceived Pain of Adoption: Low (syntax similar to C++, easy to deploy)
Applying this model to other languages is left as an exercise for the reader.
Erik points out that Erlang and Haskell are now being marketed according to the second formula: there is a multicore crisis and functional languages are the solution. It will be interesting to see how much additional traction these languages get without addressing the "pain of adoption" part.
The Change Function and startups
The Change Function ends with ten sets of questions and a set of techniques for designing technologies that will be adopted; this part of the book has many ideas that would be beneficial for startups. Some are fairly obvious, such as "fail fast and iterate" and having a customer-centered culture instead of a sales-centered culture, while others are more thought-provoking: "What is the user crisis you intend to solve? What are the top five reasons a user with this crisis would not use your product?" The ultimate conclusion of the book is "Figure out what people really want!", which brings to mind the advice to make something people want.
Yes, that was the big revelation to me when I was in graduate school—when I finally understood that the half page of code on the bottom of page 13 of the Lisp 1.5 manual was Lisp in itself. These were “Maxwell’s Equations of Software!”
-- Alan Kay
This quote appears many places on the web, but the code itself is harder to find. What is this amazing half page of code?
The Lisp 1.5 Manual, which was written by John McCarthy et al. in 1961, is available at softwarepreservation.org. In it, the "Maxwell's equations" define a universal Lisp function evalquote that can evaluate any given function:
The above code is defined in a meta-language (M-expressions), which can be straightforwardly translated into S-expressions. Functions in M-expressions use square brackets and have arguments separated by semicolons. M-expression conditionals have the form [predicate -> value; predicate -> value; ...]. The M-expression label is analogous to defun or define.
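For example, the manual's first recursive function, ff (which finds the first atomic symbol in an expression), is written as an M-expression:

ff[x] = [atom[x] -> x; T -> ff[car[x]]]

Translated into an S-expression, it becomes ordinary Lisp data:

(LABEL FF (LAMBDA (X) (COND ((ATOM X) X) (T (FF (CAR X))))))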
The point of all this is that M-expressions are the code that operates on the S-expression data, but the M-expression meta-language and the S-expression data actually coincide. Thus, code and data are the same thing in Lisp, and a half-page of code is sufficient to define a basic Lisp interpreter in Lisp, given a few primitives (car, cdr, cons, eq, atom). The code is a meta-circular evaluator for Lisp; see SICP section 4.1 for more details on meta-circular evaluators. (Unfortunately, this won't give you a working Lisp interpreter for free; things such as the garbage collector, the list primitives, and parsing need to be implemented somewhere. Also note that this meta-circular evaluator doesn't give you niceties such as arithmetic.)
To understand the above code: apply takes a function and arguments, while eval acts on a form. The last argument to each is an association list representing the environment, which stores the values of bound variables and function names. In brief, apply implements CAR, CDR, CONS, ATOM, and EQ in terms of the primitives. It implements LAMBDA by pairing up the variables and arguments and passing them to eval. It implements LABEL (which defines a function) by adding the function name and definition to the association list.
The code for eval processes a form in a straightforward manner. It handles the QUOTE form by returning the quoted value. It handles COND by evaluating the predicates with the help of evcon. Otherwise, it interprets an atom as a variable and returns its value. If given a list, it interprets this as a function application; the arguments are evaluated with evlis and the function is evaluated by apply.
The above code is not quite complete; it relies on some other simple functions defined earlier in the manual, such as equal and cadr. Less obvious are: pairlis[x;y;a], which pairs up lists x and y and adds them to association list a; assoc[x;a], which looks up x in association list a; and sublis[a;y], which treats association list a as a mapping of variables to values and replaces the variables in S-expression y with their associated values. These functions can be straightforwardly built from the primitive functions.
(By the way, I'm pretty sure the comma in eq[car[e],QUOTE] is a typo, but that's how it is in the original.)
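To make this concrete, here is a minimal sketch of the same eval/apply core translated into modern Scheme. This is my paraphrase, not McCarthy's code: I've renamed eval and apply to avoid the Scheme built-ins, and it handles only the five primitives plus QUOTE, COND, LAMBDA, and LABEL.

; Sketch of the Lisp 1.5 eval/apply core in modern Scheme.
; The environment a is an association list of (name . value) pairs.
(define (l-eval e a)
  (cond ((not (pair? e)) (cdr (assoc e a)))        ; an atom is a variable
        ((eq? (car e) 'quote) (cadr e))            ; QUOTE returns its argument
        ((eq? (car e) 'cond) (evcon (cdr e) a))    ; COND via evcon
        (else (l-apply (car e) (evlis (cdr e) a) a))))

(define (l-apply f args a)
  (cond ((eq? f 'car)  (car (car args)))           ; the five primitives
        ((eq? f 'cdr)  (cdr (car args)))
        ((eq? f 'cons) (cons (car args) (cadr args)))
        ((eq? f 'atom) (not (pair? (car args))))
        ((eq? f 'eq)   (eq? (car args) (cadr args)))
        ((not (pair? f)) (l-apply (cdr (assoc f a)) args a)) ; named function
        ((eq? (car f) 'lambda)                     ; bind variables, eval the body
         (l-eval (caddr f) (append (map cons (cadr f) args) a)))
        ((eq? (car f) 'label)                      ; record the name for recursion
         (l-apply (caddr f) args (cons (cons (cadr f) f) a)))))

(define (evcon c a)     ; evaluate predicates until one is true
  (if (l-eval (caar c) a)
      (l-eval (cadar c) a)
      (evcon (cdr c) a)))

(define (evlis m a)     ; evaluate each argument in a list
  (if (null? m)
      '()
      (cons (l-eval (car m) a) (evlis (cdr m) a))))

Evaluating (l-eval '((label ff (lambda (x) (cond ((atom x) x) ((quote t) (ff (car x)))))) (quote ((a b) c))) '()) returns a, the first atom in the argument, just like the manual's ff example.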
Can Arc be used to create a Tetris-like game? In my previous posting on using OpenGL with Arc, I described how the effort to set up OpenGL had stalled my game-writing adventure. I finally got the game written, only to discover that Conan Dalton had beaten me to it by writing Arc Tetris running on Rainbow, his version of Arc running on Java. Nonetheless, here's my implementation in Arc using OpenGL.
Playing the game
See using OpenGL with Arc for details on setting up OpenGL. To summarize, the files are in the opengl directory of the Anarki git. Run gl-arc.scm in DrScheme. Then load "tet.arc" into the Arc REPL, and the game will start.
To control the game, use the arrow keys. Left and right arrows move the tile. Up-arrow rotates the tile, and down-arrow drops it faster. When the game ends, "s" will start a new game.
Implementing games in Arc
Note that Arc itself doesn't have any graphics support, or any way to get at Scheme libraries. I hacked on Arc's implementation in Scheme to get access to the OpenGL libraries in PLT Scheme.
My first concern with using Arc was speed. It turns out that Arc on a newish laptop is plenty fast to run the game, apart from occasional garbage collection glitches. The game itself doesn't require a fast refresh rate, and the amount of processing per frame is pretty small, so Arc manages just fine.
The biggest problem with implementing the game in Arc was the lack of useful error reporting; I should have emphasized this in Why Arc is bad for exploratory programming. Every time I wrote more than a few lines at a time, I'd encounter a mystery error such as "ac.scm::18267: car: expects argument of type <pair>; given 1" or "ac.scm::17324: Function call on inappropriate object 2 ()". I often had to stick print statements throughout the code just to figure out where the error was happening. Reporting what the error is and where it occurs is basic functionality that a programming language implementation should provide. An actual debugger would be a bonus.
I found that game programming doesn't really let you take advantage of the REPL for development. OpenGL is very stateful, so I can't just call a drawing routine from the REPL to see how it's working. I usually had to restart my code, and often reload the Arc interpreter itself to clear up the various Scheme objects. When I worked incrementally, I often ended up with startup issues when I ran from the start (i.e. variables getting initialized in the wrong order).
One annoyance is Arc's lack of arrays. I used tables in some places, and lists of lists in other places, but vectors and arrays seem like really basic datatypes.
Another annoyance is OpenGL's lack of any text support. If you're wondering about the lack of text in the screenshot, that's why. I implemented a simple 7-segment number-drawing function to display the score.
Some (pretty obvious in retrospect) advice to other game writers: Figure out in advance if the Y axis points up or down. I wrote some of the code assuming Y increases going down before realizing that other parts of the code expected the Y axis to point up. Also, figuring out the origin and scale in advance is a good thing.
Another tricky part of game implementation is state transitions. Both startup and end-of-game caused me more difficulty than I expected.
Implementing the game became much more rewarding once I got a minimal interactive display working. I should have done that sooner, rather than implementing the algorithms up front. On the other hand, I found it useful to disable animation much of the time, using keypresses to single-step movement.
The game has no audio, since PLT Scheme doesn't have audio support.
It took me a long time to realize that the pieces in Tetris don't actually rotate around a fixed point. Once I realized this and hard-coded the rotations instead of implementing a rotation function, things became easier.
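For instance, each piece can simply carry its precomputed states, so "rotating" is just moving to the next state. Here's a sketch in Scheme rather than the game's Arc; the S-piece coordinates are illustrative:

; Sketch: hard-coded rotation states for the S piece, as (x . y)
; cell offsets with y increasing downward. No rotation math needed.
(define s-states
  '#(((1 . 0) (2 . 0) (0 . 1) (1 . 1))     ; horizontal: .XX / XX.
     ((0 . 0) (0 . 1) (1 . 1) (1 . 2))))   ; vertical:   X. / XX / .X
(define (next-state states n)              ; advance to the next rotation
  (modulo (+ n 1) (vector-length states)))
(define (piece-cells states n)             ; cells occupied in state n
  (vector-ref states n))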
I display the piece in motion separately from the playing field array that holds the stationary pieces. When a line is completed, I decided it would be interesting to write remove-rows in a functional style, returning an entirely new playing field rather than updating the existing array in place (as I did in saveshape when a piece hits bottom). Writing the whole game in a functional style would be very difficult, as others have described.
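A minimal sketch of that functional style in Scheme (the real code is Arc, and the names here are mine): keep the rows that aren't full, then pad the top with fresh empty rows.

; Sketch of a functional remove-rows. The field is a list of rows
; (top row first); each row is a list of cells, with #f marking empty.
(define (row-full? row) (andmap (lambda (cell) cell) row))
(define (remove-rows field width)
  (let* ((kept (filter (lambda (row) (not (row-full? row))) field))
         (removed (- (length field) (length kept))))
    (append (build-list removed
                        (lambda (i) (build-list width (lambda (j) #f))))
            kept)))   ; a brand-new field; the old one is untouched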
I could make the code more "Arcish" with various macros (e.g. w/matrix to wrap code in gl-push-matrix and gl-pop-matrix), but I decided to "ship early". Likewise, I didn't bother implementing game levels, although the game does speed up as it progresses. And I've only made minimal efforts to clean up the code, so I'm not presenting it as a model of good style.
Conclusion
It's possible to write a playable game in Arc (with enough hacking on the language to get OpenGL support). I think it would have been considerably easier to write it in PLT Scheme directly, but the Arc implementation builds character. I was pleasantly surprised that Arc had enough speed to run the game. The Arc code works out to 389 lines for those keeping score; OpenGL is responsible for much of the length (50 lines just to display the score). I expect writing in Arc will be considerably easier if the language gets decent error reporting.
I saw a posting on news.ycombinator entitled "Take the Tetris test (an Arc version, anyone?)", which suggested writing a simple Tetris-like game. One obvious problem with doing this in Arc is the lack of graphics support. But how hard could it be to use OpenGL graphics from Arc?
It turns out that using OpenGL from Arc is harder than I expected, but possible, given enough hacking on Arc's underlying Scheme implementation. In this article, I discuss how I got OpenGL to work.
Challenge 1: How to access libraries from Arc
The first challenge is that the official Arc release does not let you access Scheme libraries or functions. Not at all. Even though Arc is implemented on top of MzScheme, there is no mechanism to access the underlying MzScheme implementation. If you want to access Scheme, you must actually modify the Arc language implementation by hacking on the Scheme code that implements Arc.
The unofficial Anarki implementation is a modified version of Arc that provides access to Scheme (as well as many other useful improvements). However, I decided to base my OpenGL project on the official Arc implementation rather than Anarki.
I replaced the Arc implementation file ac.scm with a modified version called arc-gl.scm that gives me access to the necessary Scheme functions. The relevant Scheme code is:
(require (lib "mred.ss" "mred")
         (lib "class.ss")
         (lib "math.ss")
         (prefix gl- (lib "sgl.ss" "sgl"))
         (lib "gl-vectors.ss" "sgl"))

; List of functions to export from Scheme to Arc
(map (lambda (s) (xdef s (eval s)))
     '(gl-shade-model gl-normal gl-begin gl-end gl-vertex gl-clear-color gl-clear
       gl-push-matrix gl-pop-matrix gl-rotate gl-translate gl-call-list gl-flush
       gl-light-v gl-enable gl-new-list gl-gen-lists gl-material-v gl-viewport
       gl-matrix-mode gl-load-identity gl-frustum gl-light-v gl-enable gl-end-list
       gl-scale sin cos))

; Arc doesn't provide access to vector, so make gl-float-vector
; take individual arguments instead of a vector
(xdef 'gl-float-vector
      (lambda (a b c d) (vector->gl-float-vector (vector a b c d))))
First, the code imports the necessary libraries. Next, it uses xdef to export the listed Scheme functions to Arc. I included the OpenGL functions I used; you will need to extend this list if you want additional OpenGL functionality. I also include sin and cos; they are missing from Arc, which is almost perversely inconvenient.
The code archive contains arc-gl.scm, the updated Scheme code; arch.arc, the arch demo in Arc; and gears.arc, the gears demo in Arc. It is also available from the Anarki git.
Challenge 2: Where is OpenGL?
OpenGL isn't part of the plain vanilla MzScheme that is recommended for Arc, but it is part of DrScheme. Both DrScheme and MzScheme are versions of Scheme under the PLT Scheme umbrella; MzScheme is the lightweight version, while DrScheme is the graphical version that includes the MrEd graphics toolbox and the OpenGL bindings for Scheme. Thus, to get OpenGL you need to run under DrScheme (or MrEd) rather than plain MzScheme.
An alternative to OpenGL would be using MrEd's 2-d graphics; I've described before how to add simple graphics to Arc. However, I wanted to use the opportunity to learn more about OpenGL.
Challenge 3: Running an Arc REPL in PLT Scheme
It's straightforward to run as.scm inside PLT Scheme and get an Arc REPL. However, a problem immediately turns up with OpenGL:
An OpenGL animation will lock up as soon as the Arc REPL is waiting for input. The problem is that MrEd is built around an event loop, which needs to keep running (similar to the Windows message loop). When the REPL blocks on a read call, the entire system blocks.
The solution is to implement a new GUI-based Arc REPL instead of the read-based REPL. MrEd provides text fields that can be used to provide non-blocking input. When the input is submitted, the event loop executes a callback, which can then run the Arc code. Of course, the Arc code needs to return reasonably promptly, or else things will be locked up again. However, the Arc code can start a new thread if it needs to do a long-running computation.
The following MzScheme code creates a frame, text-field for output, text-field for input, and a submit button, and executes the submitted code through arc-eval. It is analogous to the REPL code in ac.scm. I put this code in arc-gl.scm.
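Here is a minimal sketch of that idea (my reconstruction rather than the exact listing; the widget names and layout are assumptions, and arc-eval is the evaluation hook from ac.scm):

; Sketch of a GUI-based Arc REPL for MrEd: an output area, an input
; field, and a button whose callback evaluates the input via arc-eval.
(define repl-frame (new frame% (label "Arc REPL") (width 500) (height 400)))
(define output (new text-field% (label #f) (parent repl-frame)
                    (style '(multiple))))
(define input (new text-field% (label "arc>") (parent repl-frame)))
(new button% (label "Eval") (parent repl-frame)
     (callback (lambda (b e)           ; runs inside the MrEd event loop
                 (let* ((expr (read (open-input-string (send input get-value))))
                        (result (arc-eval expr)))
                   (send output set-value
                         (string-append (send output get-value)
                                        (format "~a~n" result)))))))
(send repl-frame show #t)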
Copy the new REPL code into the same directory as the original Arc files.
Run drscheme.
Go to Language -> Choose Language -> PLT -> Graphical. Click Ok.
Go to File -> Open -> arc-gl.scm
Click Run
As you can see from the screenshot, the REPL doesn't get any points for style, but it gets the job done.
Challenge 4: Using Scheme's object model
The mechanism above can't be used to access PLT Scheme's windowing operations, because they are heavily based on Scheme's object system, which is implemented with complex Scheme macros. Thus, I can't simply map the windowing operations into Arc as I did with sin. If the operations are called directly, they will try to apply Scheme macros to Arc code, which won't work. If they're called after Arc evaluation, the Arc implementation will have already tried to evaluate the Scheme macros as Arc code, which won't work either.
I tried several methods of welding the Scheme objects into Arc, none of which are entirely satisfactory. The first approach was to encapsulate everything in Scheme and provide simple non-object-based methods that can be called from Arc. For example, an Arc function make-window could be implemented by executing the necessary Scheme code. This works for simple operations, and is the approach I used for simple 2-d graphics, but the encapsulation breaks when the code gets more complex, for example with callbacks and method invocations. It is also unsatisfying because most of the interesting code is written in Scheme, not Arc.
Another approach would be to fully implement Scheme's object model in Arc, so everything could be written in Arc. That was far more difficulty and work than I wanted, especially since the object system is implemented with very complex Scheme macros.
My next approach was to implement an exec-scheme function that allows chunks of Scheme code to be called directly from inside Arc. This worked, but was pretty hacky.
Minor semantic differences between Arc and Scheme add more ugliness. Arc converts #f to nil, which doesn't work for Scheme code that expects #f. I hacked around this by adding a symbol false that gets converted to #f. Another problem is that Arc lists have nil added at the end, so lists must be converted when going to and from Scheme.
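For example, the conversion helpers might look like this sketch (the names here are mine; ac.scm contains its own internal equivalents):

; Sketch: convert between Arc's nil-terminated lists and Scheme's
; ()-terminated lists.
(define (arc->scheme xs)          ; drop the trailing 'nil
  (if (eq? xs 'nil)
      '()
      (cons (car xs) (arc->scheme (cdr xs)))))
(define (scheme->arc xs)          ; terminate with the symbol nil
  (if (null? xs)
      'nil
      (cons (car xs) (scheme->arc (cdr xs)))))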
Finally, I ended up with a somewhat hybrid approach. On top of exec-scheme, I implemented Arc macros to wrap the object operations of send and invoke. These macros try to do Arc compilation on the Arc parts and leave the Scheme parts untouched. Even so, it's still a kind of ugly mix of Scheme and Arc with lots of quoting. I found writing these macros surprisingly difficult, because of the mixing of evaluated and unevaluated code.
As an aside, Scheme's object framework uses send foo bar baz to invoke bar on object foo with argument baz. I.e. foo.bar(baz) in Java. I found it interesting that this semantic difference made me think about object-oriented programming differently: Scheme objects are getting sent a message, doing something with the message, and providing a reply. Of course, this is the same as invoking a method, but it feels more "distant" somehow.
At the end of this, I ended up with a moderately ugly way of creating Scheme objects from Arc, providing callbacks to Arc functions, and implementing instantiate and send in Arc. This isn't a full object implementation, but it was enough to get the job done. For example, to instantiate the MrEd frame% class and assign it to an Arc variable:
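Roughly like this (a sketch following the instantiate syntax used in the gears example below; the size arguments are my guesses):

(= frame (instantiate 'frame% (list "Arc OpenGL")
                      '(width 400) '(height 400)))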
Here's a simple example of an animated arch displayed in OpenGL. To run this example, download the code and start up DrScheme as described above. Then:
From the Arc REPL (not the Scheme REPL) execute:
(load "arch.arc")
(motion)
The code, in arch.arc, has several parts. The arch function does the OpenGL work to create the Arch out of triangles and quadrilaterals. It uses archfan to generate the triangle fan for half of the front of the arch. The full code is too long to include here, but the following code, which generates the inside of the arch, should give a taste:
(for i 0 (+ n n -1)                              ; 2n segments around the semicircle
  (withs (angle  (* (/ 3.1415926 n 2) i)         ; start angle of this segment
          angle2 (* (/ 3.1415926 n 2) (+ i 1)))  ; end angle of this segment
    ; Emit one quad spanning front (z) to back (negz), normal facing inward.
    (gl-normal (* r (cos angle)) (* r -1 (sin angle)) 0)
    (gl-vertex (* r -1 (cos angle)) (* r (sin angle)) z)
    (gl-vertex (* r -1 (cos angle)) (* r (sin angle)) negz)
    (gl-normal (* r (cos angle2)) (* r -1 (sin angle2)) 0)
    (gl-vertex (* r -1 (cos angle2)) (* r (sin angle2)) negz)
    (gl-vertex (* r -1 (cos angle2)) (* r (sin angle2)) z)))
The next section of the code contains the graphics callback functions:
ex-run: the animation entry point called by the timer. Updates the rotation and refreshes the image.
ex-on-paint: uses OpenGL commands to draw the arch
ex-on-size: handles window resize and initial size. The OpenGL model (arch, lighting, projection) is set up here.
The last part of the code calls into Scheme to create the canvas and tie it to the appropriate callbacks, create the frame holding the canvas, and display the frame.
I find the arch-generation code somewhat unsatisfying stylistically, as there is a lot of duplicated code to generate the vertices and normal vectors for the front, back, and sides. I couldn't come up with a nice way to fold everything together. I suppose the Arc-y solution would be to write a DSL to express graphical objects, but that's beyond the scope of this project.
Let me mention that low-level OpenGL is not particularly friendly for exploratory programming. It's tricky to generate complex shapes correctly: it's really easy to end up with the wrong normals, vertices that aren't clockwise, non-convex polygons, and many other random problems. I find it works much better to sketch out what I'm doing on paper first; if I just start coding, I end up with a mess of bad polygons. In addition, I've found that doing the wrong thing in OpenGL will lock up DrScheme and/or crash my machine if I'm unlucky.
The gears example
I also ported the classic OpenGL "gears" demo from Scheme to Arc. This demo includes GUI buttons to rotate the gears. (The animated GIF at the top of the page shows the program in operation.) This is a fairly straightforward port of gears.scm that comes with DrScheme. To run it:
Enter into the Arc REPL: (load "gears.arc")
The interesting thing to note in gears.arc is the horizontal-panel%, vertical-panel%, and button% objects that provide the UI controls. They are linked to Arc functions that update the viewing parameters. For example, in the following, note that instantiate is passing Arc code (using fn) to the Scheme constructor for button%. The tricky part is making sure the right things get evaluated in the right language:
(instantiate 'button% (list "Right" h (fn x (ex-move-right)))
             '(stretchable-width #t))
How did I generate the animated gifs? Just brute force: I took some screenshots and joined them into an animated gif using gimp. The real animation is smoother. I found the animated gifs are a bit annoying, so I added JavaScript to start and stop them. The animation is stopped by substituting a static gif.
Conclusion
So what about the Tetris challenge that inspired my work on OpenGL? After all the effort to get OpenGL to work in Arc, I lost momentum on the original project of implementing the game. (This is an example of why Arc is bad for exploratory programming. If I wanted to get something done, I would have been much better off using DrScheme directly.) Maybe I'll complete the game for a future posting.
Recently, I've been reading Programming Collective Intelligence, which is a practical guide to machine learning algorithms, showing how to build a recommendation system, implement a search engine, classify documents, mine websites, use genetic algorithms and simulated annealing, and implement other machine learning tasks. The book shows how to implement each of these in surprisingly few lines of Python.
The book is an excellent example of exploratory programming, showing how to incrementally build up these applications and experiment with different algorithms from the Python interactive prompt. For instance, topic clustering is illustrated by first implementing code to fetch blog pages from RSS feeds, breaking the pages into words, applying first a hierarchical clustering algorithm and then a K-means clustering algorithm to the contents, and then graphically displaying a dendrogram showing related blogs. At each step, the book shows how to try out the code and perform different experiments from the interactive prompt.
By using Python libraries, each step of implementation is pretty easy; the book can focus on the core algorithms, and leave the routine stuff to libraries: urllib2 to fetch web pages, Universal Feed Parser to access RSS feeds, Beautiful Soup to parse HTML, Python Imaging Library to generate images, pydelicious to access links on del.icio.us, and so forth.
If you want more details than the book provides (it is surprisingly lacking in references), I recommend Andrew Moore's online Statistical Data Mining Tutorials, which covers many of the same topics.
What does this have to do with Arc?
While reading this book, I was struck by a contradiction: the book is a perfect example of exploratory programming, Arc is "tuned for exploratory programming", and yet working through the Collective Intelligence algorithms in Arc is an exercise in frustration.
The problem, of course, is that Arc lacks libraries. Arc lacks basic functionality such as fetching a web page, parsing an XML document, or accessing a database. Arc lacks utility libraries to parse HTML pages or perform numerical analysis. Arc lacks specialized API libraries to access sites such as del.icio.us or Akismet. Arc lacks specialized numerical libraries such as a support-vector machine implementation. (In fact, Arc doesn't even have all the functionality of TRS-80 BASIC, which is a pretty low bar. Arc is inexplicably lacking trig, exp, and log, not to mention arrays and decent error reporting.)
To be sure, one could implement these libraries in Arc. The point is that implementing libraries detours you from the exploratory programming you're trying to do.
I think there are two different kinds of exploratory programming. The first I'll call the "Lisp model", where you are building a system from scratch, without external dependencies. The second, which I believe is much more common, is the "Perl/Python model", where you are interacting with existing systems and building on previous work. In the first case, libraries don't really matter, but in the second case, libraries are critical. The recently-popular article Programming in a Vacuum makes this point well, that picking the "best" language is fine in a vacuum, but in the real world what libraries are available is usually the key.
Besides the lack of libraries, Arc's slow performance rules it out for many of the algorithms from Programming Collective Intelligence. Many of the algorithms run uncomfortably slowly in Python, and running Arc is that much worse. It's just not true that speed is unimportant in exploratory programming.
On the positive side for Arc, chapter 11 of Programming Collective Intelligence implements genetic programming algorithms by representing programs as trees, which are then evolved and executed. To support this, the book provides Python classes to represent code as a parse tree, execute the code tree, and prettyprint the tree. As the book points out, Lisp and its variants let you represent programs as trees directly. Thus, using Arc gives you the ability to represent code as a tree and dynamically modify the code tree for free. (However, it only takes 50 lines of Python to implement the tree interpreter, so the cost of Greenspunning is not particularly severe.)
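As a quick sketch of what that buys you (in Scheme here; the names are mine): a program is already a tree of lists, so a genetic-programming "mutation" is ordinary list surgery, and eval runs the mutated candidate.

; Sketch: a candidate program is just a list tree.
(define prog '(+ (* x x) 1))
; A trivial "mutation": swap the top-level + for a -.
(define (mutate tree)
  (if (and (pair? tree) (eq? (car tree) '+))
      (cons '- (cdr tree))
      tree))
; Run the mutated candidate with x bound to 3, at the REPL:
; (eval `(let ((x 3)) ,(mutate prog)))  ; => (- (* 3 3) 1) = 8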
To summarize, a language for exploratory programming should be concise, interactive, reasonably fast, and have sufficient libraries. Arc currently fails on the last two factors. Time will tell if these issues get resolved or not.