Luke Horgan
2018-09-14 16:35:49 UTC
Hello Lua friends,
We're using Lua to facilitate Ethereum-style smart contracts in a
C/C++-based cryptocurrency toolkit. In case you aren't familiar with
smart contracts, the idea is pretty simple: a smart contract is just
a little program that all the peers in a P2P network run. Every peer
needs to agree on the program's output, so the scripts must be
completely deterministic. Also, any peer can submit a smart contract,
so the scripts run in a sandboxed environment with constraints on
instruction count and memory usage.

Before running a contract script, we disable the garbage collector
with 'collectgarbage("stop")'. In theory, this shouldn't be a
problem, since we only want scripts to be able to allocate a fixed
amount of memory, total, over the entire course of their execution.
In practice, however, it seems to cause the memory profile of even
trivial scripts to blow up for no apparent reason.
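For concreteness, the C-API equivalent of that call is
lua_gc(L, LUA_GCSTOP, 0). The setup we have in mind looks roughly
like this (a minimal sketch, not our actual code; the function name
is made up, and in practice we expose only a whitelisted subset of
the standard libraries):

    #include <lua.h>
    #include <lauxlib.h>
    #include <lualib.h>

    /* Sketch: prepare a fresh state for one contract run, with the
       collector stopped before any contract code executes. */
    static lua_State *new_contract_state(void)
    {
        lua_State *L = luaL_newstate();
        if (L == NULL)
            return NULL;
        luaL_openlibs(L);          /* simplified; a safe subset in practice */
        lua_gc(L, LUA_GCSTOP, 0);  /* same effect as collectgarbage("stop") */
        return L;
    }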
We are tracking allocations with a custom lua_Alloc function, as
documented in the 5.3 reference manual. What's particularly confusing
is that functions which are never called still appear to consume
large amounts of memory (hundreds of kilobytes). The problem is
severely exacerbated by the custom variables we load from the sandbox
environment, so I can only imagine that's part of the problem, but
the behavior is confusing regardless. Why should an uncalled function
consume so much memory, or any memory at all? Is there any reason
lua_Alloc might behave strangely with the garbage collector disabled?
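For reference, the allocator follows the lua_Alloc pattern from the
manual. A minimal sketch (the names and the 256 KB cap here are
placeholders for illustration, not our production values):

    #include <stdlib.h>
    #include <lua.h>

    #define MEM_LIMIT (256 * 1024)  /* placeholder cap */

    typedef struct {
        size_t used;                /* bytes currently live in the state */
    } AllocState;

    /* Counting allocator per the lua_Alloc contract in the 5.3
       manual.  When ptr is NULL, osize encodes the kind of object
       being allocated, not a byte count, so it must not be counted. */
    static void *counting_alloc(void *ud, void *ptr,
                                size_t osize, size_t nsize)
    {
        AllocState *s = (AllocState *)ud;
        size_t old = (ptr != NULL) ? osize : 0;

        if (nsize == 0) {           /* nsize == 0 means "behave like free" */
            s->used -= old;
            free(ptr);
            return NULL;
        }
        /* Shrinking requests (nsize <= old) always pass this check;
           the manual requires that they never fail. */
        if (s->used - old + nsize > MEM_LIMIT)
            return NULL;            /* deny allocations past the cap */

        void *p = realloc(ptr, nsize);
        if (p != NULL)
            s->used = s->used - old + nsize;
        return p;
    }

The state is then created with lua_newstate(counting_alloc, &st), so
every allocation and free inside the interpreter passes through the
counter.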
I can find no obvious fault with my own code, so I thought it
sensible to check that there isn't some well-known Lua behavior I'm
unaware of.

Thanks for your help.
Best,
~Luke