- It is very dynamic: commands can be created or modified during the execution of a script.
- It treats both code and data as strings: new code can be generated on the fly.
- It supports introspection so that a program can examine itself and make decisions based upon its current configuration.
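All three properties fit in a few lines; here is a minimal sketch (the command name `greet` is purely illustrative):

```tcl
# Create a command at run time from data held in a variable.
set name greet
proc $name {who} { return "hello, $who" }

# The new command works like any other...
puts [greet world]          ;# hello, world

# ...and introspection lets the program examine it afterwards.
puts [info args greet]      ;# who
puts [info body greet]      ;# the procedure's source text
```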
Advantages of Meta Programming
- Performance: Meta-programs represent the data to be processed in terms of the low-level data structures used by the script interpreter to represent the scripting language itself. Therefore, instead of being processed by an interpreted script manipulating script-level data structures, the data is processed by the compiled code of the language interpreter, manipulating low-level data structures directly. This removes the levels of indirection that slow down the execution of scripts, and so yields significant performance improvements.
- Or at least, there is no performance drop for using metaprogramming; the performance hit comes from it being much more difficult to efficiently compile the language at all. - DKF
Metaprogramming can often involve more than one level of quoting and interpretation. Here is a bit of code to make this easier by using a template procedure. -- Mark
```tcl
proc makeTemplate {procname templatename parmlist} {
    set arg  [info args $templatename]
    set body [info body $templatename]
    foreach {a b} $parmlist {
        regsub -all $a $body $b body
    }
    eval "proc $procname {$arg} {$body}"
}

proc FooTemplate {a b} {
    puts "MSG1"
    puts "hello, $a, $b"
    puts "MSG2"
}

makeTemplate foo1 FooTemplate {MSG1 "hello world!" MSG2 "goodbye!"}
makeTemplate foo2 FooTemplate {MSG1 "hi!" MSG2 "bye!"}

foo1 mom dad
foo2 breathren cistern
```
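One pitfall in makeTemplate is the extra round of substitution caused by eval of a constructed string: dollar signs or brackets in the template body can be substituted a second time. A variant that calls proc directly avoids that quoting level entirely (a sketch; makeTemplate2 and GreetTemplate are illustrative names, not from the original):

```tcl
proc makeTemplate2 {procname templatename parmlist} {
    set args [info args $templatename]
    set body [info body $templatename]
    foreach {a b} $parmlist {
        regsub -all $a $body $b body
    }
    # Calling proc directly means the body is never re-parsed by eval.
    proc $procname $args $body
}

proc GreetTemplate {who} {
    return "MSG, $who"
}

makeTemplate2 hi GreetTemplate {MSG "hi there"}
puts [hi world]    ;# hi there, world
```

Like the original, this reads the formal parameters with [info args], so default argument values are not preserved; [info default] can recover them, as shown further down the page.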
Links about Meta Programming in Tcl
Duplicating Procedures (Emmanuel Frecon)

I have used dynamic duplication of procedures in several projects and found it very useful:
- In one project, I had a client/server solution where a process (the server) allowed external clients to connect and perform commands within the server on their behalf. This is part of the DIVE system and is documented here [4]. In this project, my application was running in an external wish for various reasons, and I was using DIVE for 3D visualisation only. Since these processes could be on separate machines, not necessarily sharing the same set of files, I had my application send some of its procedures through the socket, so that they could be used by the application code on the server side.
- In another project, I had a text conversion application. The main conversion code uses a generic backend in which all procedures start with the prefix "output_". None of these procedures exist when the program is started; they are all created depending on the output method that is chosen. So, for example, when HTML is chosen as the output method, a known set of procedures, all starting with the prefix "html_", is analysed and duplicated into the "output_" procedures described above. This is powerful, since it allows the conversion application to convert several files one after the other, each with a different output method. Within the program, this is just a matter of dynamically redeclaring the procedures. Pseudo-code for this is given below:
```tcl
proc html_hbar {fdes {size -1} {alignment "LEFT"}} {
    puts -nonewline $fdes "<HR "
    if {$size >= 0} {
        puts -nonewline $fdes "width=\"$size%\" "
    }
    puts $fdes "align=\"$alignment\">"
}
```

...

```tcl
proc conversion_init {} {
    foreach procname {hbar} {
        set args {}
        foreach a [info args html_$procname] {
            if {[info default html_$procname $a deft]} {
                lappend args [list $a $deft]
            } else {
                lappend args $a
            }
        }
        uplevel \#0 [list proc output_$procname \
                $args [info body html_$procname]]
    }
}
```

This code declares a procedure "output_hbar" identical to the procedure "html_hbar". Note the use of "info default" to make sure that procedures with default arguments can still be called with fewer than all their arguments. If you don't do this, you will end up with procedures that require all the arguments when called.

DKF: You can use interpreter aliases to do this sort of thing as well. Furthermore, you can use them to supply leading arguments to do some very cute things. Leading on from the above...
```tcl
proc HTML {fdes script} {
    global errorInfo errorCode
    namespace eval HTML_EVAL {}
    foreach procname [info commands html_*] {
        regsub html_ $procname HTML_EVAL:: aliasname
        interp alias {} $aliasname {} $procname $fdes
    }
    set code [catch {namespace eval HTML_EVAL $script} msg]
    namespace delete HTML_EVAL
    return -code $code -errorcode $errorCode -errorinfo $errorInfo $msg
}
```
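Here is a self-contained miniature of the same curried-alias pattern, using invented names (out_line, withBuffer) rather than the HTML ones, to show the mechanics on their own: the alias supplies the leading argument, and the throwaway namespace scopes the short command names to the script:

```tcl
# Backend command whose first argument names a global list variable.
proc out_line {varName text} {
    upvar #0 $varName lines
    lappend lines $text
}

# Evaluate $script in a temporary namespace where [line] is an alias
# for [out_line $varName]; the alias itself supplies the first argument.
proc withBuffer {varName script} {
    namespace eval BUF_EVAL {}
    interp alias {} BUF_EVAL::line {} out_line $varName
    set code [catch {namespace eval BUF_EVAL $script} msg]
    namespace delete BUF_EVAL     ;# also removes the alias
    return -code $code $msg
}

set buf {}
withBuffer buf {
    line first
    line second
}
puts $buf    ;# first second
```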
"Outside" references on the topic include:
- A page at the Mother Wiki on metaprogramming [5].
- Notes for a Scheme-based course which touches on "Metaprogramming" [6]
- Nat Pryce's "Patterns for Scripted Applications" [7]
- [Remarks on metaprogramming in Lua and Python.]
- ...