Discussion:
Function calls [was: Lua and Neko comparison]
Wim Couwenberg
2006-09-28 16:26:30 UTC
Permalink
Nope. The fibonacci micro-benchmark measures call frame setup and
teardown overhead and nothing else.
So is there room for considerable improvement? (This is relevant in
operator vs. library call, iterators and other places.)

--
Wim
Mike Pall
2006-09-28 17:37:00 UTC
Permalink
Hi,
Post by Wim Couwenberg
Nope. The fibonacci micro-benchmark measures call frame setup and
teardown overhead and nothing else.
So is there room for considerable improvement? (This is relevant in
operator vs. library call, iterators and other places.)
[Well, Lua already compares favourably with other interpreters in
this regard ...]

I don't think there is a way to speed up Lua->Lua function calls
in the interpreter any further without compromising Lua language
semantics (like adjusting # of args and results).

But there is a way to speed up Lua->C function calls:

1. Add a kind of "light" C functions. This avoids building up a
call frame for every function call. These functions are a bit
more restricted (like no callbacks into Lua), but have a faster
calling convention. The cl->isC byte may be reused to
differentiate them from regular functions. Three options:

1a. Just bump up L->base temporarily and call the C function
within the existing Lua call frame. The C function must store the
results in the proper place and generally has to be _very_
careful with the API functions for stack manipulation. I'm not
sure how feasible this is, because one needs to export some
knowledge about the internals of the Lua VM.

1b. Like 1a, but add a special set of API calls which exclusively
work with light functions. E.g. non-stack based fast type checks.

1c. Specialize the types early on. E.g. here are the two most
common candidates (more can be added):

lua_Number lightfunc_n_n(lua_State *L, lua_Number a);
lua_Number lightfunc_n_nn(lua_State *L, lua_Number a, lua_Number b);

Type checking can be inlined in luaD_precall and is much faster,
too. Very few (or no) Lua internals need to be opened up.


A Lua compiler has two more options:

2. Fully inline the called code. Would be useful for Lua->Lua
function calls, but is tricky to get right. Also helpful for a
selected subset of C functions (drawback: not easily user
extensible for new C functions in external libraries). In fact
LuaJIT is doing the latter already (with great gains).

3. Optimize, i.e. reduce the call frame overhead. LuaJIT is
already doing this to some extent. But compatibility with the
(mostly unchanged) Lua VM requires some compromises.

A better way would be to use the C stack for everything (like
most compiled languages do). This avoids setting up and tearing
down three structures in parallel (C stack, Lua stack and Lua
call frames). Alas, this would imply a significant departure from
the standard Lua VM structures.

Bye,
Mike
Rici Lake
2006-09-28 19:07:06 UTC
Permalink
Interesting ideas. But I think the best way of speeding up the Lua->C
interface is to call it less often; i.e. design higher-level
interfaces.

The optimizations you mention seem to apply to the math library only
(nowhere else would prototypes which only involve lua_Numbers be
common), but I can see that speeding up the math library would be a
concern for some contexts. The function I was always interested in
providing a fast interface for is the one which implements next() for
common tables, but my attempts to do so did not significantly speed up
non-artificial programs.
Glenn Maynard
2006-09-28 19:43:21 UTC
Permalink
Post by Rici Lake
Interesting ideas. But I think the best way of speeding up the Lua->C
interface is to call it less often; i.e. design higher-level
interfaces.
I want (hypothetically) to implement an inner-loop character count
function for UTF-8 (like strlen, but returns the number of whole
characters). It's inner-loop, so I can't have all that horrible
function call overhead. Can you suggest the higher-level interface
that you have in mind that would solve this? (Without hardcoding
anything specific to my narrow purpose into the Lua core, of course.)

Any of Mike's approaches seem to solve this.
Post by Rici Lake
The optimizations you mention seem to apply to the math library only
(nowhere else would prototypes which only involve lua_Numbers be
common), but I can see that speeding up the math library would be a
He only gave examples using lua_Numbers; it would also, presumably,
support strings.
--
Glenn Maynard
Rici Lake
2006-09-28 20:11:17 UTC
Permalink
Post by Glenn Maynard
Post by Rici Lake
Interesting ideas. But I think the best way of speeding up the Lua->C
interface is to call it less often; i.e. design higher-level
interfaces.
I want (hypothetically) to implement an inner-loop character count
function for UTF-8 (like strlen, but returns the number of whole
characters). It's inner-loop, so I can't have all that horrible
function call overhead. Can you suggest the higher-level interface
that you have in mind that would solve this? (Without hardcoding
anything specific to my narrow purpose into the Lua core, of course.)
I wrote one of those and I don't find that the horrible function call
overhead is much of a problem. If you want it, I'll post it somewhere.
I also wrote it in pure Lua, which is somewhat slower but still
acceptable for most practical applications.

But I hardly ever use it. What I usually want to know about UTF-8
strings is not how many UTF-8 codes there happen to be, but rather how
big the rendered string would be. There are also a number of useful
regex-like interfaces which can be applied to UTF-8 strings, but none
of them require counting codes.

Now, I do often prototype such programs in Lua, but the intent is
generally to write them in C, as higher-level interfaces (which don't
require modifying the Lua core at all.)
Post by Glenn Maynard
Post by Rici Lake
The optimizations you mention seem to apply to the math library only
(nowhere else would prototypes which only involve lua_Numbers be
common), but I can see that speeding up the math library would be a
He only gave examples using lua_Numbers; it would also, presumably,
support strings.
Sure. However, except for very small strings, the cost of a function
call is hardly noticeable.

Certainly in the case of the math library, the cost of a function call
is noticeable in relation to the cost of computing a sine or cosine.

By the way, am I missing something, or are you a different Glenn
Maynard from the one who said that efficiency wasn't a good reason to
introduce new operators like % and #?
Glenn Maynard
2006-09-28 21:16:44 UTC
Permalink
Post by Rici Lake
I wrote one of those and I don't find that the horrible function call
overhead is much of a problem. If you want it, I'll post it somewhere.
I also wrote it in pure Lua, which is somewhat slower but still
acceptable for most practical applications.
But I hardly ever use it. What I usually want to know about UTF-8
strings is not how many UTF-8 codes there happen to be, but rather how
big the rendered string would be. There are also a number of useful
regex-like interfaces which can be applied to UTF-8 strings, but none
of them require counting codes.
I picked an example of a function where the function-call overhead could
be annoying, purely for the sake of giving you an opportunity to explain
what kind of "higher-level interface" you were thinking of.

If you don't like that example, pick any other. toupper(str, index),
isspace, "convert offset in UTF-8 to a Unicode codepoint" ("iterate
characters in UTF-8"), the whole math library, many others: Bessel
functions, quadratic interpolation, random number generation. I
think function call overhead could be relevant to any of those; pick
one. :)
Post by Rici Lake
Sure. However, except for very small strings, the cost of a function
call is hardly noticeable.
Certainly in the case of the math library, the cost of a function call
is noticeable in relation to the cost of computing a sine or cosine.
strlen() is as fast as sin() in quick tests (on a string of length 90).
It varies by use and architecture, of course, but string operations can
be very fast.
Post by Rici Lake
By the way, am I missing something, or are you a different Glenn
Maynard from the one who said that efficiency wasn't a good reason to
introduce new operators like % and #?
Huh? I've pointed to this very idea (fast function call dispatch)
as a *reason* why adding new operators for performance reasons is
a bad idea. Adding an operator gives a few people a speed improvement,
at the cost of accumulating complexity in the whole language each time
it's done, and setting a pattern of bloating Lua a little more for
individual use cases. Faster function call dispatch gives all of
them the speed improvement, in a way that doesn't continue to expand
the language every time someone needs a new one.
--
Glenn Maynard
Rici Lake
2006-09-28 21:34:10 UTC
Permalink
Post by Glenn Maynard
I picked an example of a function where the function-call overhead could
be annoying, purely for the sake of giving you an opportunity to explain
what kind of "higher-level interface" you were thinking of.
Well, I hope I answered that, in that case. The sort of higher-level
interface I was thinking of was:

regex on utf-8 strings

rendersize of utf-8 strings

mousehit in rendered strings

etc.
Glenn Maynard
2006-09-28 22:06:23 UTC
Permalink
Post by Rici Lake
Post by Glenn Maynard
I picked an example of a function where the function-call overhead could
be annoying, purely for the sake of giving you an opportunity to explain
what kind of "higher-level interface" you were thinking of.
Well, I hope I answered that, in that case. The sort of higher-level
regex on utf-8 strings
rendersize of utf-8 strings
mousehit in rendered strings
I don't see how that applies to any of the many examples I offered.
What kind of interface do you have in mind, that would avoid function
call overhead in exposing random number generation to Lua?

If what you mean is "don't do that kind of performance-sensitive stuff
in Lua, group it up in C and expose that whole thing to Lua", that's
fine--and my own approach, most of the time. But, it's in contradiction
with the use of "performance" as a rationale for adding operators in
previous discussions. You don't get to say performance is important
for the stuff you want--integer division--but not important for the
stuff other people want. :) But maybe I'm not understanding what
you mean.
--
Glenn Maynard
Rici Lake
2006-09-28 22:11:32 UTC
Permalink
Post by Glenn Maynard
You don't get to say performance is important
for the stuff you want--integer division--but not important for the
stuff other people want. :) But maybe I'm not understanding what
you mean.
Right. I'm not saying that. I'm saying that I find a // b more readable.
Klaus Ripke
2006-09-28 22:28:16 UTC
Permalink
Post by Rici Lake
Well, I hope I answered that, in that case. The sort of higher-level
regex on utf-8 strings
rendersize of utf-8 strings
Agreed, and actually these are available as "higher-level" C interfaces.
Mike Pall
2006-09-28 20:34:00 UTC
Permalink
Hi,
Post by Mike Pall
1c. Specialize the types early on. E.g. here are the two most
lua_Number lightfunc_n_n(lua_State *L, lua_Number a);
lua_Number lightfunc_n_nn(lua_State *L, lua_Number a, lua_Number b);
Type checking can be inlined in luaD_precall and is much faster,
too. Very few (or no) Lua internals need to be opened up.
Quick followup, because this was easy to check: this variant is
around 2x faster than a regular call to a C closure. Which means
it's still 2.5x slower than an operator. Mainly because of stack
slot copying and some remaining overhead in the code path.

Speeding up the latter can be accomplished with a new VM opcode
for calling intrinsic functions. The CALL opcode can be replaced
on-the-fly when a call-site for a function with an intrinsic
equivalent is found (flagged in the closure). Pretty similar to a
polymorphic inline cache (works for interpreters, too).


Another idea -- a meta mechanism for user extensible operators:

Pass a table with user defined operators (*) to loadstring() etc.
Compile this to OP_[LIGHT]{CONST|UN|BIN}OP, indexing a table of
(light) functions held in the Lua function prototype. I guess
this would be very fast.

Slight complication: for (un)dumping to/from bytecode you need to
pass the same table of user defined operators.

(*) Name, arity, precedence, (light) function, optional type
checks, optional constant folding.

Simplified, hypothetical example:

local userops = {
  PI = { arity = 0, op = math.pi }, -- arity 0 are constants
  sin = { arity = 1, op = math.sin },
  cos = { arity = 1, op = math.cos },
  ["|"] = { arity = 2, op = bit.bor },
  ["<<"] = { arity = 2, op = bit.lshift },
  ["//"] = { arity = 2, op = function(a, b) return math.floor(a/b) end },
} -- These should of course better be light functions.

loadstring([[
  local a, b = ...

  print( PI * sin a + cos b )
  print( (a << 11) | (b << 3) | 5 )
  print( "Rici buys", 1.38 // 0.16, "apples." )
]], nil, userops)(...)

[Yes, of course this works with strings or any other type, too.]

The operator table could of course be global (but overridable)
for convenience. And it may be chained via __index metamethods,
too. It's even imaginable to define yourself a compile time
"const" keyword which modifies the operator table on the fly
(Forth anyone?).

BTW: This would also solve the annoying 'local sin = math.sin'.

Bye,
Mike
Rici Lake
2006-09-28 20:47:57 UTC
Permalink
Post by Mike Pall
Pass a table with user defined operators (*) to loadstring() etc.
Compile this to OP_[LIGHT]{CONST|UN|BIN}OP, indexing a table of
(light) functions held in the Lua function prototype. I guess
this would be very fast.
Yeah, I have something very similar to that kicking around, but it's a
slightly different protocol, to avoid the issue with dump.

The operands I defined were:

OP_BINOP ra, rkb, rkc
OP_UNOP ra, rkb, 0
OP_GENOP2 0, K

Both OP_BINOP and OP_UNOP must be followed by an OP_GENOP2; the K in
the OP_GENOP2 is the index of a string which is used to look the
definition up in the metatable of (b or c), as with arithmetic
operators. (I considered adding alternative ones for the < and ==
metarestrictions, but it seemed too complicated.)

Then the parser is extended with:

local operator [binary|unary] <token> as <identifier>

for <token>s which would otherwise be valid identifiers, this follows
normal scope visibility rules.

It seems to provide quite a reasonable extension mechanism. I'll dig
out the implementation and share it if anyone is interested.
Glenn Maynard
2006-09-28 23:14:04 UTC
Permalink
Post by Mike Pall
local userops = {
PI = { arity = 0, op = math.pi }, -- arity 0 are constants
sin = { arity = 1, op = math.sin },
cos = { arity = 1, op = math.cos },
["|"] = { arity = 2, op = bit.bor },
["<<"] = { arity = 2, op = bit.lshift },
["//"] = { arity = 2, op = function(a, b) return math.floor(a/b) end },
} -- These should of course better be light functions.
loadstring([[
local a, b = ...
print( PI * sin a + cos b )
print( (a << 11) | (b << 3) | 5 )
print( "Rici buys", 1.38 // 0.16, "apples." )
Intuitively, I don't like this. I don't want to use a language where people
are almost defining their own syntax inside the language. That takes
the evils that operator overloading can cause in C++ (people using operators
for things unrelated to their original cause, like overriding + for "set
union"), and makes it an order of magnitude worse, allowing making whole
new operators.

That's not to say it couldn't be used well, just like operator overloading
can be used well; it just seems to invite incredible abuse. Practical
languages can't make it impossible to write bad code, of course--people
will--but this seems at such a level that in order to read anyone else's
code, I'm going to have to learn their own personal sub-language first.

(I don't understand how a parser could support this, since the language
is being defined by the language being compiled.)
--
Glenn Maynard
John Belmonte
2006-09-29 11:19:03 UTC
Permalink
Post by Glenn Maynard
Post by Mike Pall
local userops = {
PI = { arity = 0, op = math.pi }, -- arity 0 are constants
sin = { arity = 1, op = math.sin },
cos = { arity = 1, op = math.cos },
["|"] = { arity = 2, op = bit.bor },
["<<"] = { arity = 2, op = bit.lshift },
["//"] = { arity = 2, op = function(a, b) return math.floor(a/b) end },
} -- These should of course better be light functions.
loadstring([[
local a, b = ...
print( PI * sin a + cos b )
print( (a << 11) | (b << 3) | 5 )
print( "Rici buys", 1.38 // 0.16, "apples." )
Intuitively, I don't like this. I don't want to use a language where people
are almost defining their own syntax inside the language. That takes
the evils that operator overloading can cause in C++ (people using operators
for things unrelated to their original cause, like overriding + for "set
union"), and makes it an order of magnitude worse, allowing making whole
new operators.
That's not to say it couldn't be used well, just like operator overloading
can be used well; it just seems to invite incredible abuse. Practical
languages can't make it impossible to write bad code, of course--people
will--but this seems at such a level that in order to read anyone else's
code, I'm going to have to learn their own personal sub-language first.
(I don't understand how a parser could support this, since the language
is being defined by the language being compiled.)
Please understand that there are significant uses of Lua different than
your own, such as implementing domain-specific languages.

--John
David Jones
2006-09-29 11:42:55 UTC
Permalink
Post by Glenn Maynard
Post by Mike Pall
local userops = {
PI = { arity = 0, op = math.pi }, -- arity 0 are constants
sin = { arity = 1, op = math.sin },
cos = { arity = 1, op = math.cos },
["|"] = { arity = 2, op = bit.bor },
["<<"] = { arity = 2, op = bit.lshift },
["//"] = { arity = 2, op = function(a, b) return math.floor(a/
b) end },
} -- These should of course better be light functions.
loadstring([[
local a, b = ...
print( PI * sin a + cos b )
print( (a << 11) | (b << 3) | 5 )
print( "Rici buys", 1.38 // 0.16, "apples." )
Intuitively, I don't like this. I don't want to use a language where people
are almost defining their own syntax inside the language. That takes
the evils that operator overloading can cause in C++ (people using operators
for things unrelated to their original cause, like overriding + for "set
union"), and makes it an order of magnitude worse, allowing making whole
new operators.
That's not to say it couldn't be used well, just like operator overloading
can be used well; it just seems to invite incredible abuse. Practical
languages can't make it impossible to write bad code, of course--people
will--but this seems at such a level that in order to read anyone else's
code, I'm going to have to learn their own personal sub-language first.
See below.
Post by Glenn Maynard
(I don't understand how a parser could support this, since the language
is being defined by the language being compiled.)
No it's not. The table defining the language, userops above, is not
part of the program being compiled, the string argument to
loadstring. I assume Rici Lake meant to pass userops to loadstring
or something like that. Effectively the compiler (as embodied in
loadstring for example) changes from a function of type "string ->
function" to one having type "string x language -> function" where
"language" could be specified as a table like Rici Lake has done.

On sub-languages:

One paradigm for programming solves problems by creating a new
language in which the expression and solution of the problem in hand
is, in some sense, "nice". This is one of the reasons people use
different languages for different problems. To some extent it's
possible to regard _all_ programming as the creation of a new
language in which the problem solution is expressed.

For example when looking at a random program in C, say the Lua
system, it's not possible to understand the code without
understanding the specialised language that has been created.
Concretely, here's a snippet of code from Lua:

int b = GETARG_B(i);
int nresults = GETARG_C(i) - 1;
if (b != 0) L->top = ra+b;  /* else previous instruction set top */
L->savedpc = pc;

Now, this is fairly plain C, but knowing C doesn't really help you
understand what's going on. You have to understand the language of
"GETARG_B" and "L->top" and such like. A language has been created
with the "C language programming system".

Where programming systems (languages) differ is in how much
flexibility they offer you in defining your own language. My
assertion is that you'll be defining your own domain-specific
language anyway, regardless of how much support you get from the
programming system you are using. Languages like Java offer almost
no flexibility. Your language pretty much must consist of objects
and method calls, so it's all in the names and typical usage idioms.
Languages like C offer quite a bit more (because of the horrible
macro system); you can to some extent define your own syntax for
some things. Languages like Prolog or Lisp offer a huge range of
flexibility due to the way you can get at the internals of the parser
("reader").

Lua seems to occupy a sort of middle ground somewhere between Java
and Lisp. You can make languages that look a bit declaration- and
configuration-like by using syntax like:

base { pos={3, -7}, health=50 }

you can make fairly ordinary "just a pile of functions" languages:

base[i] = newBase({3, -7}, 50)

And you can also make some quite funky languages by messing around
with metamethods and the built-in operators.

Observe that load takes a function that generates input, so one could
regard this as a sort of completely general front-end to the Lua
"parser" capable of transforming any language into Lua. That kind of
thinking would place Lua near the Lisp end of the spectrum, but it's
not really because of the general difficulty of creating typical
transformations.

Perhaps it boils down to: to what extent do you trust your developers
(people that write code that you have to look at, say) to design
sensible languages? I observe that what C++ makes easy, operator
overloading, seems to make it easy for people to create languages
that are not intuitively understood, and are therefore "bad".

I find all this a bit ironic since you Glenn have created your own
sub-language with stuff like:
http://stepmania.cvs.sourceforge.net/stepmania/stepmania/Themes/default/metrics.ini?revision=1.1344&view=markup
In order to "understand" that document I have to understand the
sub-language of "[ScreenUnlockStatus]" and "# blah blah blah" and so
on. I'm not saying that's difficult in this case, I'm just saying that
you create sub-languages of your own.

Personally I really like the fact that compilation in Lua is
currently a function of type "string -> function". The result of
compilation does not depend on the compilation environment. OTOH I
also think much of the recent "syntax modification" discussions and
developments, such as Silverstone's token-processing patch and Lake's
example above, are very interesting.

drj
Glenn Maynard
2006-09-29 20:01:39 UTC
Permalink
Post by David Jones
int b = GETARG_B(i);
int nresults = GETARG_C(i) - 1;
if (b != 0) L->top = ra+b;  /* else previous instruction set top */
L->savedpc = pc;
Now, this is fairly plain C, but knowing C doesn't really help you
understand what's going on. You have to understand the language of
"GETARG_B" and "L->top" and such like. A language has been created
with the "C language programming system".
I can read the code, and understand the order of operations. I can't
even do that when new basic tokens are being created, with new precedences.

I want a domain-specific language, not a domain-specific language
language.
Post by David Jones
Perhaps it boils down to: to what extent do you trust your developers
(people that write code that you have to look at, say) to design
sensible languages? I observe that what C++ makes easy, operator
overloading, seems to make it easy for people to create languages
that are not intuitively understood, and are therefore "bad".
Not all means of "extending" languages (in the sense you're using here)
are bad; many of them aren't, in fact. Operator overloading can be
abused (and frequently is), but it isn't defining new tokens and
precedences; I think this is at a new level.
Post by David Jones
I find all this a bit ironic since you Glenn have created your own
sub-language with stuff like:
http://stepmania.cvs.sourceforge.net/stepmania/stepmania/Themes/default/metrics.ini?revision=1.1344&view=markup
In order to "understand" that document I have to understand the
sub-language of "[ScreenUnlockStatus]" and "# blah blah blah" and so
on. I'm not saying that's difficult in this case, I'm just saying that
you create sub-languages of your own.
That's the INI file format, used by Windows since the beginning of time.
That software does have some of its own file formats, but we try hard
not to, since it reduces learning curves and increases reusability.
--
Glenn Maynard
Alex Queiroz
2006-09-29 20:22:58 UTC
Permalink
Hallo,
Post by Glenn Maynard
Not all means of "extending" languages (in the sense you're using here)
are bad; many of them aren't, in fact. Operator overloading can be
abused (and frequently is), but it isn't defining new tokens and
precedences; I think this is at a new level.
This is not really new. In Haskell you can define new operators,
which are infix functions, and specify their precedence and
associativity:

infixl 6 ($=) -- infix operator, associates left, precedence 6 (valid precedences are 0-9)
($=) :: Int -> Int -> Int
a $= b = a + b
--
-alex
http://www.ventonegro.org/
Mike Pall
2006-09-29 12:39:15 UTC
Permalink
Hi,
Post by Glenn Maynard
Post by Mike Pall
print( PI * sin a + cos b )
print( (a << 11) | (b << 3) | 5 )
print( "Rici buys", 1.38 // 0.16, "apples." )
Intuitively, I don't like this. I don't want to use a language where people
are almost defining their own syntax inside the language.
But that's sort of the definition of a DSL (Domain Specific
Language). Lua is already pretty good at that, but it can be
improved.
Post by Glenn Maynard
Practical
languages can't make it impossible to write bad code, of course--people
will--but this seems at such a level that in order to read anyone else's
code, I'm going to have to learn their own personal sub-language first.
Language is about expression. The ability to abstract concepts is
broadening your ability to express your intentions more clearly.

Computer languages are built for humans and only incidentally run
on computers. Designing computer languages is the art to allow a
maximum of abstraction with a minimum of implementation effort.
Post by Glenn Maynard
it just seems to invite incredible abuse.
This reminds me: If you design a computer language to be
foolproof, only fools would use it.

BTW: If you want Java, you know where to get it. :-)

Bye,
Mike
Glenn Maynard
2006-09-29 19:43:18 UTC
Permalink
Post by Mike Pall
This reminds me: If you design a computer language to be
foolproof, only fools would use it.
Of course; as I said, practical languages can't make it impossible to
write bad code. But there's a ratio of the benefit of a feature to
how likely it is to lead to things like terrible code and high per-
program learning curves. I think this feature has a worse ratio
than most by a large margin.
--
Glenn Maynard
David Jones
2006-09-29 11:48:39 UTC
Permalink
Post by Mike Pall
Pass a table with user defined operators (*) to loadstring() etc.
Compile this to OP_[LIGHT]{CONST|UN|BIN}OP, indexing a table of
(light) functions held in the Lua function prototype. I guess
this would be very fast.
Slight complication: for (un)dumping to/from bytecode you need to
pass the same table of user defined operators.
(*) Name, arity, precedence, (light) function, optional type
checks, optional constant folding.
local userops = {
PI = { arity = 0, op = math.pi }, -- arity 0 are constants
sin = { arity = 1, op = math.sin },
cos = { arity = 1, op = math.cos },
["|"] = { arity = 2, op = bit.bor },
["<<"] = { arity = 2, op = bit.lshift },
["//"] = { arity = 2, op = function(a, b) return math.floor(a/
b) end },
} -- These should of course better be light functions.
loadstring([[
local a, b = ...
print( PI * sin a + cos b )
print( (a << 11) | (b << 3) | 5 )
print( "Rici buys", 1.38 // 0.16, "apples." )
]], nil, userops)(...)
[Yes, of course this works with strings or any other type, too.]
Sorry Mike, in another message I credited all this stuff to Rici Lake
for some reason.

I think this is very interesting and I've been meaning to implement
something almost exactly like this and release it so that the "we
need more operators" crowd could play around with it.

It wouldn't be too hard to imagine having simple operators, like "|", be
bound to one of the unused opcodes, and having a user-patchable table
in the VM of "light functions" that implemented each "user-defined"
opcode.

drj
Glenn Maynard
2006-09-29 20:03:37 UTC
Permalink
Post by Mike Pall
(*) Name, arity, precedence, (light) function, optional type
checks, optional constant folding.
A metamethod name, too, right?
--
Glenn Maynard
Mark Hamburg
2006-09-29 22:02:10 UTC
Permalink
Post by Mike Pall
1. Add a kind of "light" C functions. This avoids building up a
call frame for every function call. These functions are a bit
more restricted (like no callbacks into Lua), but have a faster
calling convention. The cl->isC byte may be reused to
I'll note that I've wanted a different sort of light C function: One that is
to regular C functions/closures as light userdata is to userdata. The reason
to add this is that routines like lua_cpcall create closures when they
frequently aren't needed.

None of this undercuts the notion of having faster Lua->C interfaces. It's
just that when I had thought of light C functions, I was wanting something
else.

Mark
