Discussion:
autotools alternatives, is anybody using autosetup?
Sam Roberts
2012-07-17 00:32:59 UTC
Permalink
The Fossil[1] community had this whole process not all that long ago.
Automake/autoconf were rejected as not being portable and being in general a
complete pain in the lower torso anatomy to use. The solution that was used
in the end was a little piece of software called autosetup[2].
Has anybody been using autosetup with lua projects?

I've been playing with it. I've had trouble finding a decent Tcl
syntax summary, but it's not like I know m4 very well, either.

I quite like it. I particularly like that it does the part that make
does poorly: feature discovery, build parameterization, outputting a
config.h, and injecting discovered values into the template Makefile.
Much, much easier to understand, and easy to apply to existing
projects.

I've just been playing around with it in a stub project, trying to see
if it can do the lua header file discovery that every lua binding so
annoyingly needs to do. Result is here:

https://github.com/sam-github/udns-lua/blob/master/auto.def

Is anybody else using autosetup with lua? If there are public repos,
I'd like to see more examples.

Btw, I also looked at premake, and it's pretty much exactly what I
don't want: a replacement for make that controls all aspects of the
build. Nice that it's in Lua, but no...

Cheers,
Sam
[2]: http://msteveb.github.com/autosetup/
William Ahern
2012-07-17 01:30:20 UTC
Permalink
Post by Sam Roberts
The Fossil[1] community had this whole process not all that long ago.
Automake/autoconf were rejected as not being portable and being in general a
complete pain in the lower torso anatomy to use. The solution that was used
in the end was a little piece of software called autosetup[2].
Has anybody been using autosetup with lua projects?
I've been playing with it. I've had trouble finding a decent Tcl
syntax summary, but it's not like I know m4 very well, either.
M4 is deviously simple. It's autoconf that makes M4 seem difficult and
arcane.

<snip>
Post by Sam Roberts
I've just been playing around with it in a stub project, trying to see
if it can do the lua header file discovery that every lua binding so
https://github.com/sam-github/udns-lua/blob/master/auto.def
The better alternative to autoconf is usually nothing, IMO.

For example, your include script is pretty trivial to do with the shell.
Unlike the 1980s and 1990s, Unix environments are significantly more
homogeneous. There are very few headaches supporting the *BSDs and Linux. And
if you rigorously stick to POSIX then scripts almost always work elsewhere.
Most of the headaches I've had turned out to be my non-POSIX compliant
scripts.
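For instance, the Lua-header probe that auto.def does needs nothing beyond POSIX sh. A rough sketch (the include paths tried at the bottom are only illustrative guesses, not an exhaustive list):

```shell
#!/bin/sh
# find_header NAME DIR...: print the first DIR that contains NAME.
# A sketch of shell-only feature discovery; the Lua include paths
# passed below are illustrative guesses, not an exhaustive list.
find_header() {
    name=$1; shift
    for dir in "$@"; do
        if [ -f "$dir/$name" ]; then
            printf '%s\n' "$dir"
            return 0
        fi
    done
    return 1
}

if luadir=$(find_header lua.h \
        /usr/include/lua5.1 /usr/local/include /usr/include); then
    echo "found lua.h in $luadir"
else
    echo "lua.h not found; pass -I by hand" >&2
fi
```

The whole "configure" step collapses into a handful of functions like this, and a porter can read every line of it.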

Predefined CPP macros can handle most feature discovery tasks, and it's far
easier for others to contribute, and for you to integrate, experiential
knowledge. See, e.g.

http://sourceforge.net/apps/mediawiki/predef/index.php?title=Main_Page

(I realize feature probing is preferable to version testing, but not at any
cost. And for those cases where it's clearly preferable it's fairly easy to
generate code from probes using Make.)
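A sketch of what I mean, driving the preprocessor from POSIX sh; the macro names are the usual predefined ones, and the epoll/kqueue mapping is just an illustration (note that reading C from stdin with `cc -E -` is common cc behavior, not guaranteed by POSIX):

```shell
#!/bin/sh
# have_macro NAME: succeed if the compiler predefines NAME.
# Feeds a tiny #ifndef/#error program to the preprocessor; if NAME is
# undefined, cpp hits the #error and exits nonzero.
have_macro() {
    printf '#ifndef %s\n#error no\n#endif\n' "$1" |
        ${CC:-cc} -E - >/dev/null 2>&1
}

# Illustrative platform-to-feature mapping for a generated config.h:
if have_macro __linux__; then
    echo '#define USE_EPOLL 1'
elif have_macro __FreeBSD__ || have_macro __NetBSD__ || have_macro __APPLE__; then
    echo '#define USE_KQUEUE 1'
fi
```

Run from a makefile, a few lines like this can emit a config.h without any autoconf machinery.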

More importantly, sometimes it's best to just require the user to provide a
parameter. The infrastructure to do this automagically seems to get in the
way more often than it helps, especially for oddball cases, which ironically
is really the selling point** of using these tools. For example, I keep my
Lua headers under include/lua/5.?, similar to module paths and as god
intended. So your autosetup script would appear to fail in that case, plus
in other more common cases, like /opt or using separate project dirs under
/usr/local, like in the old Slackware days.

Autoconf may have given M4 a bad name, but there's nothing wrong with
multistage source generation, IMHO.

Finally, if you're trying to be portable to Windows then you're already
doomed to a fate worse than death, so the fact that the above points fail is
hardly consequential ;)


** That's the main selling point to the users. Distributions like them
because they tend to prevent the user from screwing up cross-compiling.
OTOH, if the build is relatively simple then it should be relatively simple
for the porter to fix. And the less stuff your build does to introspect the
environment, the fewer issues there'll be to fix in the first place.
Miles Bader
2012-07-17 04:19:42 UTC
Permalink
Post by William Ahern
M4 is deviously simple. It's autoconf that makes M4 seem difficult
and arcane.
Naw, typical autoconf files are very straightforward.
Post by William Ahern
The better alternative to autoconf is usually nothing, IMO.
This is only true for the most trivial cases (which, to be fair, may
include many lua projects), and often not even then, because even
trivial autoconf files are usually much simpler than the equivalent
shell-script.

Remember, autoconf input files _are_ essentially shell scripts, except
that they give you easy mechanisms to accomplish common configuration
tasks; you choose the degree to which you use these mechanisms though.
Writing everything yourself in shell is obviously possible, but
typically means you end up duplicating what the autoconf
authors have done, usually in a less functional, buggier, and less
portable way.

-miles
--
Quack, n. A murderer without a license.
Coda Highland
2012-07-17 05:26:01 UTC
Permalink
Post by Miles Bader
Post by William Ahern
The better alternative to autoconf is usually nothing, IMO.
This is only true for the most trivial cases (which, to be fair, may
include many lua projects), and often not even then, because even
trivial autoconf files are usually much simpler than the equivalent
shell-script.
Would you consider LuaJIT to be a "trivial" case? It doesn't use
anything but straight-up "make".

I'm going to have to agree: autoconf is an unnecessary extra step in
software deployment on modern systems, and one that slows down build
times for little tangible benefit.

/s/ Adam
steve donovan
2012-07-17 06:03:35 UTC
Permalink
Post by Coda Highland
I'm going to have to agree: autoconf is an unnecessary extra step in
software deployment on modern systems, and one that slows down build
times for little tangible benefit.
Yes, I have to side with William on this one. From a user
perspective, the output of './configure --help' is very daunting, and
the key custom configuration parameters (like where your non-standard
Lua is) are hidden in the noise.

Another option is to use Lua for generating the makefile, or at least
the .inc files. Yes, there is a fair amount of support code (I find
myself coding which() far too often) but that can go into a relatively
small support module that ships with the build. Being inside Lua also
means that you know your module path, etc.

And then you have a fighting chance of getting a cross-platform build working ;)

steve d.
Axel Kittenberger
2012-07-19 21:15:39 UTC
Permalink
Post by steve donovan
Yes, I have to side with William on this one. From a user
perspective, the output of './configure --help' is very daunting, and
the key custom configuration parameters (like where your non-standard
Lua is) are hidden in the noise.
Did we have the same discussion a few months ago? I'll repeat my point
from back then ;-) Yes, from a /user/ perspective, for whoever
downloads and builds the application/library, this is true. But I'd
really like to hear the opinion of a distro package maintainer. For
them the noise is useful, and from what I gather they love
autoconf-built packages, since autoconf gives them the funky stuff
they need out of the box, like installing into a virtual root. In
that case the /user/, as in end user, just clicks the package in the
package manager of his/her choice and never gets into any of the
details of building it.
Luiz Henrique de Figueiredo
2012-07-19 21:43:35 UTC
Permalink
Post by Axel Kittenberger
Did we have the same discussion a few months ago?
Probably. Perhaps we can then close this discussion, which is getting OT.
Craig Barnes
2012-07-19 22:24:37 UTC
Permalink
Post by Axel Kittenberger
Post by steve donovan
Yes, I have to side with William on this one. From a user
perspective, the output of './configure --help' is very daunting, and
the key custom configuration parameters (like where your non-standard
Lua is) are hidden in the noise.
Did we have the same discussion a few months ago? I'll repeat my point
from back then ;-) Yes, from a /user/ perspective, for whoever
downloads and builds the application/library, this is true. But I'd
really like to hear the opinion of a distro package maintainer. For
them the noise is useful, and from what I gather they love
autoconf-built packages, since autoconf gives them the funky stuff
they need out of the box, like installing into a virtual root. In
that case the /user/, as in end user, just clicks the package in the
package manager of his/her choice and never gets into any of the
details of building it.
Autoconf does nothing for me as a packager, except make the build
process as opaque and indirect as possible. Passing flags to an
ifdef'd Makefile build is much nicer and easier to understand than any
autotools build.

Even the Nginx build system is more straightforward than autotools,
and they use a huge pile of shell scripts.

Take a look at this:

http://pkgs.fedoraproject.org/gitweb/?p=lua.git;a=blob;f=lua-5.1.4-autotoolize.patch;h=afcb3fbeea3d4542359402803c5f096270253dae;hb=HEAD

Yup, Fedora's "autotoolize" patch for Lua 5.1.4 is 40,000 lines! More
than double the size of Lua itself. That for me is just too much to
stomach, regardless of the supposed "virtues" of autotools.

I also recently saw someone submit an "autotoolize" patch for a
project I work on. The "project" is a 500 line text-processing
utility. The autotoolize patch was about 10,000 lines.

autotools is a cult!
Andres Perera
2012-07-19 23:28:14 UTC
Permalink
Post by Craig Barnes
Autoconf does nothing for me as a packager, except make the build
process as opaque and indirect as possible. Passing flags to an
ifdef'd Makefile build is much nicer and easier to understand than any
autotools build.
by ifdef'd makefile you mean gmake or pmake dependent, or actually
processing it with cpp? could you show me an example of such a
makefile that covers the amount of architectures autoconf does? i
would like to see how it manages to provide compatibility without
indirection and complexity
Post by Craig Barnes
Even the Nginx build system is more straight forward than autotools
and they use a huge pile of shell scripts.
well, what runtime do you use to probe the system for features other
than shell? this ties in to the previous question: show me a project
with as many targets so that i can objectively compare the two. i
would assume that's the expected course of action before deciding one
approach is inferior -- notwithstanding holding the opinion that some
autoconf targets aren't relevant
Post by Craig Barnes
http://pkgs.fedoraproject.org/gitweb/?p=lua.git;a=blob;f=lua-5.1.4-autotoolize.patch;h=afcb3fbeea3d4542359402803c5f096270253dae;hb=HEAD
Yup, Fedora's "autotoolize" patch for Lua 5.1.4 is 40,000 lines! More
than double the size of Lua itself. That for me is just too much to
stomach, regardless of the supposed "virtues" of autotools.
in those 40,000 lines you also counted autogen.sh, and other scripts
that aren't tailored for each autoconf deployment -- they are either
generated or duplicated amongst projects. have you ever maintained a
project that uses autoconf? i ask because you're showing your
unfamiliarity by not discerning between "object" and "source"
Post by Craig Barnes
I also recently saw someone submit an "autotoolize" patch for a
project I work on. The "project" is a 500 line text-processing
utility. The autotoolize patch was about 10,000 lines.
autotools is a cult!
i have no answer for this last line because you've completely dropped
any illusion of having technical merit to your critique
Craig Barnes
2012-07-20 00:29:58 UTC
Permalink
Post by Andres Perera
by ifdef'd makefile you mean gmake or pmake dependent, or actually
processing it with cpp? could you show me an example of such a
makefile that covers the amount of architectures autoconf does? i
would like to see how it manages to provide compatibility without
indirection and complexity
I was just giving a packager's perspective in relation to the previous
post. I don't claim to be an autotools expert by any means.

Tools are supposed to be helpful and usable, are they not? Project
maintainers are not the only "users" of autotools. Other people have
to endure them too. Despite the effort to make autotools "just work",
that frequently isn't the case.

I'm not attempting to discuss the technical merits or alternatives as
a maintainer - just giving a perspective as a downstream user - which
I think was made clear by my post and by the post I was replying to.

The last sentence was a bit trollish. I apologize for that.
Miles Bader
2012-07-17 06:10:44 UTC
Permalink
Post by Coda Highland
Post by Miles Bader
Post by William Ahern
The better alternative to autoconf is usually nothing, IMO.
This is only true for the most trivial cases (which, to be fair, may
include many lua projects), and often not even then, because even
trivial autoconf files are usually much simpler than the equivalent
shell-script.
Would you consider LuaJIT to be a "trivial" case? It doesn't use
anything but straight-up "make".
I've never looked at LuaJIT's build systems, so it's hard for me to say.

There are certainly "reasonably portable" packages out there that
depend only on make (e.g., git), but those that are complex enough
often end up essentially duplicating what autoconf does
themselves. That's the author's choice, of course, but it's not a step
for the faint of heart, as it tends to make makefiles complex and
fragile.
Post by Coda Highland
I'm going to have to agree: autoconf is an unnecessary extra step in
software deployment on modern systems, and one that slows down build
times for little tangible benefit.
In certain cases, that's true. In others, it is just wrong.

It all depends on the nature of the software, on the
libraries/services it needs, and the goals of the author. Software
that is "purely computational" (and so needs little in the way of
external libraries or services) can often be written portably enough
to avoid the need for configuration, although this can be hard if you
want to _actually_ be portable, even to a restricted range of systems
(like "typical linux systems", and especially if you include oddball
but popular cases like macosx).

My general development route is to start out with simple Makefiles,
and keep them as long as possible. In cases where just make (with
maybe some helper shell scripts) is enough, this is nice. But in
many cases, this simply doesn't prove to be adequate in the long run.

[and note that one of the best things about the autotools is not
actually autoconf, but automake, which allows far more concise and
readable/maintainable makefiles than doing the same thing in raw
make.]

So I don't think there's really any need for an argument: if make
proves enough, then use it. If it's not, then use something more.
It's OK to change somewhere down the line if the first choice doesn't
work out. Leave ideological purity to the fanbois... :)

-miles
--
Guilt, n. The condition of one who is known to have committed an indiscretion,
as distinguished from the state of him who has covered his tracks.
Coda Highland
2012-07-17 06:12:43 UTC
Permalink
Post by Miles Bader
My general development route is to start out with simple Makefiles,
and keep them as long as possible. In cases where just make (with
maybe some helper shell scripts) is enough, this is nice. But in
many cases, this simply doesn't prove to be adequate in the long run.
[and note that one of the best things about the autotools is not
actually autoconf, but automake, which allows far more concise and
readable/maintainable makefiles than doing the same thing in raw
make.]
So I don't think there's really any need for an argument: if make
proves enough, then use it. If it's not, then use something more.
It's OK to change somewhere down the line if the first choice doesn't
work out. Leave ideological purity to the fanbois... :)
This is the kind of pragmatic approach I like. :) Start as simple as
possible instead of overdoing it up front, and add more if you need
it.

/s/ Adam
Sam Roberts
2012-07-17 18:25:27 UTC
Permalink
Post by Coda Highland
Post by William Ahern
The better alternative to autoconf is usually nothing, IMO.
Clearly, I don't agree, or I wouldn't be looking for one :-)

This makefile approach is crap: https://github.com/sam-github/pcap-lua

It only supports two platforms, but not well, and I have to copy the
boilerplate approach to all my projects, where they slowly
desynchronize.
Post by Coda Highland
Would you consider LuaJIT to be a "trivial" case? It doesn't use
anything but straight-up "make".
Yes, I would. I just looked, and luajit appears to have only trivial
dependencies: a C library and gcc. Most of its 600+ lines of
src/Makefile is devoted to figuring out how to call gcc for the
platforms it supports.

I maintain libnet (but did not write it, or make its autoconf system),
and system networking APIs vary enormously; I would describe it
as non-trivial. auto* is mostly for testing the existence of dependencies,
when the mere fact that you are compiling on a platform is not
sufficient to know whether an optional dependency exists.

Also, if your platform support is wide, it can be better to declare
what you want from a system than to try to exhaustively list, for every
supported system, whether you believe it does or does not have a
particular facility.
William Ahern
2012-07-17 20:15:49 UTC
Permalink
Post by Sam Roberts
Post by Coda Highland
Post by William Ahern
The better alternative to autoconf is usually nothing, IMO.
Clearly, I don't agree, or I wouldn't be looking for one :-)
This makefile approach is crap: https://github.com/sam-github/pcap-lua
That's not too bad. Obvious issues are that one shouldn't put an explicit
"/" after $(DESTDIR), and install is not a POSIX utility. It requires GNU
Make, but that's fair.
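A POSIX-only install step might look something like this sketch; the lib/lua/5.1 layout and the 0644 mode are assumptions for illustration:

```shell
#!/bin/sh
# install_lua_module FILE: a sketch of an install rule using only POSIX
# utilities (mkdir, cp, chmod) rather than the non-POSIX install(1).
# The lib/lua/5.1 layout and 0644 mode are illustrative assumptions.
install_lua_module() {
    file=$1
    # $prefix begins with "/", so no explicit "/" is written after $DESTDIR
    dest="${DESTDIR}${prefix:-/usr/local}/lib/lua/5.1"
    mkdir -p "$dest" &&
    cp "$file" "$dest/" &&
    chmod 0644 "$dest/${file##*/}"
}
```

In a makefile the same three commands would form the install recipe, with DESTDIR and prefix supplied by the packager.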
Post by Sam Roberts
It only supports two platforms, but not well, and I have to copy the
boilerplate approach to all my projects, where they slowly
desynchronize.
You also have to write different code for each project. I don't see what the
issue is.

The search for the perfect automated build system is a fool's errand, IMHO.
Your best bets are either autoconf (if you know it and don't care about the
versioning nightmare) or standard make and unix utilities. That is, of
course, my own opinion.

Some day I'd like to write a makefile translator which can convert certain
non-portable macro and conditional constructs between the various make
flavors. But I'll never pretend that it would be preferable to autoconf or
plain make for general usage. There are two customers: myself and the users
of my code. I may be comfortable with one approach, but third parties
generally expect either autoconf or plain Make. Use anything else and their
ability to fix build bugs or contribute patches drops dramatically.
Post by Sam Roberts
Post by Coda Highland
Would you consider LuaJIT to be a "trivial" case? It doesn't use
anything but straight-up "make".
Yes, I would. I just looked, and luajit appears to have only trivial
dependencies: a C library and gcc. Most of its 600+ lines of src/Makefile
is devoted to figuring out how to call gcc for the platforms it supports.
And how many lines of autoconf'd M4 and shell code would be required to
introspect this, or to declare it as an option through the build?

You have to compare apples to apples. And even if using autoconf or
something else required fewer lines of code, you have to offset it against
the hassle.
Post by Sam Roberts
I maintain libnet (but did not write it, or make its autoconf system), and
system networking APIs vary enormously, I would describe it as
non-trivial. auto* is mostly for testing existence of dependencies, when
the mere fact that you are compiling on a platform is not sufficient to
know if an optional dependency exists.
Also, if your platform support is wide, it can be better to declare what
you want from a system than to try to exhaustively list, for every supported
system, whether you believe it does or does not have a particular
facility.
I've written and maintained an asynchronous sockets and stub/recursive DNS
library for several years. For cqueues I decided to make sure those source
files worked under Solaris, NetBSD, and FreeBSD (previously I only ever
wrote it for OS X, OpenBSD, and Linux). Lo and behold, the entire thing
compiled just fine on all of those with very little effort. The only real
issue was my CPP-based byte ordering detection was wrong for my DNS packet
structure.

Supporting all of those platforms (including kqueue/epoll/completion ports)
in portable source code requires no support from the build system
whatsoever. The makefile's only job is to figure out how to invoke the
compiler and linker properly.

I see that libnet supports OSF/1 and Windows, and like I said before once
you step outside the modern world of POSIX things get really hairy. But in
those cases I _especially_ doubt that there'll be a satisfactory automated
build solution. Elbow grease and aspirin is about the only way to approach
it.
Aleksey Cheusov
2012-07-18 10:00:55 UTC
Permalink
On Tue, Jul 17, 2012 at 11:15 PM, William Ahern
Post by William Ahern
Post by Sam Roberts
Post by William Ahern
The better alternative to autoconf is usually nothing, IMO.
Clearly, I don't agree, or I wouldn't be looking for one :-)
This makefile approach is crap: https://github.com/sam-github/pcap-lua
That's not too bad.
If it were written in mk-configure, it'd look like the following (not
tested; only the idea is demonstrated).

----------------------------------------
LUA_MODULES = pcapx.lua # .lua module
LUA_CMODULE = pcap # Lua module written in C (pcap.c)
SCRIPTS = pcap-recode pcap-dump pcap-split

WARNS = 4 # highest possible compiler's warning level

all: README.txt
README.txt: README.txt.in pcap.c
cp README.txt.in $@
luadoc pcap.c >> $@

test:
...
doc:
...

.include <mkc.mk>
----------------------------------------
That's it.
Aleksey Cheusov
2012-07-18 10:22:21 UTC
Permalink
Post by Aleksey Cheusov
If it were written in mk-configure, it'd look like the following (not
tested; only the idea is demonstrated).
I overlooked one important fragment from pcap makefile. Analog in mk-configure
may look like the following.
...
CFLAGS_pcap != pcap-config --cflags
LDADD_pcap != pcap-config --libs
CFLAGS += ${CFLAGS_pcap}
LDADD += ${LDADD_pcap}
...

I introduced two new _pcap variables not because it's impossible to write
everything in two lines, but because with _pcap variables it's easier for
the package maintainer to override the defaults.

P.S.
mk-configure supports pkg-config, so if the pcap library used it, the
Makefile would look even simpler.
Sam Roberts
2012-07-18 18:49:15 UTC
Permalink
On Tue, Jul 17, 2012 at 1:15 PM, William Ahern
Post by William Ahern
That's not too bad. Obvious issues are that one shouldn't put an explicit
"/" after $(DESTDIR), and install is not a POSIX utility. It requires GNU
Make, but that's fair.
Post by Sam Roberts
It only supports two platforms, but not well, and I have to copy the
boilerplate approach to all my projects, where they slowly
desynchronize.
You also have to write different code for each project. I don't see what the
issue is.
The problems you point out above are a shining example of what the
issue is: when I fix them, I have to hunt down all my projects and
fix them everywhere.

And actually, code reuse is a problem too: I have a set of utility
functions I cut-n-paste between projects, but that's outside the
scope of the build tool.
Post by William Ahern
The search for the perfect automated build system is a fool's errand, IMHO.
I'm not looking for perfect; I'm looking for better, and to spend more
time writing source code and less time writing make/shell/awk.
Post by William Ahern
Post by Sam Roberts
Yes, I would. I just looked, and luajit appears to have only trivial
dependencies: a C library and gcc. Most of its 600+ lines of src/Makefile
is devoted to figuring out how to call gcc for the platforms it supports.
And how many lines of autoconf'd M4 and shell code would be required to
introspect this, or to declare it as an option through the build?
If you think I'm criticizing Mike's approach or think he should be
using autotools, you misunderstand me: I think what Mike is doing is
fine, and I think it's trivial.
Post by William Ahern
I've written and maintained an asynchronous sockets and stub/recursive DNS
library for several years.
Yes, I've been looking at it (and udns and unbound) recently.

It consists of a single .c source file, and has no dependencies, or
even an install target!

It works fine, but as an example of why everybody should do it that
way, it's not so convincing.

And I see you copy your hairy EXE_CC macro into two makefiles in the
same project.
Post by William Ahern
Supporting all of those platforms (including kqueue/epoll/completion ports)
in portable source code requires no support from the build system
^^^^^^^^^^^^^^^^^^^^^^^^

Portable close-to-ANSI-or-POSIX C source, like Lua's or your dns.c,
requires only knowing how to call cc, but those aren't the cases I'm
interested in. I'm interested in code that is not so trivially
portable.
Post by William Ahern
whatsoever. The makefile's only job is to figure out how to invoke the
compiler and linker properly.
I see no sign of support for epoll in the socket code in your
libdns/contrib/ directory.

Also, avoiding support in your build system for finding dependencies
by just including the dependencies in your project is certainly one
way to go, but it's not what I want to do.
Post by William Ahern
I see that libnet supports OSF/1 and Windows, and like I said before once
But not with autotools, I don't think; it has separate Windows build
files. No idea if they even work, but I haven't gotten any bug
reports. I don't even pretend to test libnet on its myriad platforms.

Sam
William Ahern
2012-07-19 02:25:27 UTC
Permalink
Post by Sam Roberts
Post by William Ahern
Supporting all of those platforms (including kqueue/epoll/completion ports)
in portable source code requires no support from the build system
^^^^^^^^^^^^^^^^^^^^^^^^
Portable close-to-ANSI-or-POSIX C source, like Lua's or your dns.c,
requires only knowing how to call cc, but those aren't the cases I'm
interested in. I'm interested in code that is not so trivially
portable.
Therein lies much of the problem. Portability is often an afterthought. For
example, people say, "I want to do XYZ." So they implement XYZ on Linux.
Then they tack on support for FreeBSD. Windows. Etc. It becomes a mess, even
if at the outset they intended to make it portable.

If they stood back and thought carefully about portability ahead of time,
they might have structured their project differently, approached the
implementation from a different direction, or moved a particular element up
or down the abstraction stack.

Instead, they wind up with a mess of problems and lean on the build system
to help solve them.

Example: asynchronous I/O. Windows and Unix use completely different
approaches. The primitives they expose, and the assumptions they make, are
completely at odds. Something like libevent is too low on that stack, which
makes it needlessly messy and confusing to use. libevent ends up with the
worst of both worlds and none of the redeeming qualities of either. And its
Windows build is perennially broken, just like for most FOSS projects which
try to provide seamless POSIX and Windows support.

Out-of-the-box portability across disparate systems is rarely worth the time
and effort; it's a tarpit. Most projects which try it fail. Ultimately
support still requires user or developer intervention, so the effort in
automation was a complete wash. Time is better spent making it easier to
port--that is, as an active exercise.
Post by Sam Roberts
Post by William Ahern
whatsoever. The makefile's only job is to figure out how to invoke the
compiler and linker properly.
I see no sign of support for epoll in the socket code in your
libdns/contrib/ directory.
It's in the cqueues/ projects, about two hundred lines in the middle of the
Lua module cqueues.c.
Post by Sam Roberts
Also, avoiding support in your build system for finding dependencies
by just including the dependencies in your project is certainly one
way to go, but it's not what I want to do.
Most of my code gets put into appliances and cloud environments. It gets
used _because_ it's easy to integrate into a project and tweak and modify.
And most of all, because it lacks dependencies. Dependencies cause nothing
but headaches. That's why so many commercial products have so many
vulnerabilities; updating dependencies is a nightmare. Merely making
something into a library does not solve that problem, and in fact can make
it worse because it turns the library into more of a black box, making it
seem more risky to swap out for a newer version. And if multiple components
depend on the same library, it's an all-or-nothing proposition, making it
less likely it'll get updated regularly. Adding library versioning into the
mix creates even more uncertainty; or at best puts you back to square 1, but
with a plethora of confusing libraries and headers.

The key characteristics that make developers' lives easier are simplicity
and isolation. That's different from a black box. A black box says, "Do not
enter." Isolation means, "Play with me; you're not going to break anything."

I'm not saying installed libraries are bad. I'm just saying they're overused,
and used prematurely. Something that works well for libc or openssl is
not necessarily best for project foo.

A great counterexample is Lua. I really wish Lua had a more dependable
installation where Lua 5.1 and Lua 5.2 could sit side-by-side. But Lua
succeeded partly because it didn't try to solve this early on. It's only
after becoming successful that this was something that needed (needs) to be
addressed.
Tony Finch
2012-07-19 15:13:05 UTC
Permalink
Post by Sam Roberts
I maintain libnet (but did not write it, or make its autoconf system),
and system networking APIs vary enormously, I would describe it as
non-trivial. auto* is mostly for testing existence of dependencies,
when the mere fact that you are compiling on a platform is not
sufficient to know if an optional dependency exists.
One of the things that annoys me about autoconf-style configuration is the
idea that the same build commands can build a package with different
features because of some environmental change. I prefer builds that are
reproducible, and that fail rather than quietly disabling functionality.
The only real issue was my CPP-based byte ordering detection was wrong
for my DNS packet structure.
http://commandcenter.blogspot.co.uk/2012/04/byte-order-fallacy.html
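The gist of that post, sketched here in shell: pick the bytes up individually and combine them with shifts, and the host's byte order never comes into it (no casts, no #ifdef BIG_ENDIAN):

```shell
#!/bin/sh
# le32_decode: read 4 bytes from stdin and print them interpreted as a
# little-endian 32-bit unsigned value.  The host's own byte order is
# irrelevant because each byte is taken individually and shifted.
le32_decode() {
    # od prints the four bytes as decimal values; set -- splits them
    set -- $(od -An -N4 -tu1)
    echo $(( $1 | $2 << 8 | $3 << 16 | $4 << 24 ))
}

printf '\001\002\003\004' | le32_decode   # prints 67305985 (0x04030201)
```

The same shift-and-or pattern in C is all a DNS packet parser needs; the compile-time endianness detection can simply go away.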

Tony.
--
f.anthony.n.finch <***@dotat.at> http://dotat.at/
Forties: Northerly 5 to 7, decreasing 4 in west. Moderate or rough. Rain then
showers. Good, occasionally moderate.
Dimiter 'malkia' Stanev
2012-07-20 01:18:52 UTC
Permalink
+1 to this.

I still don't get how autoconf works with cross compiling... Does it
actually?

Ideally it should create small test executables and run them on the
target platform... I doubt it's doing that...
Post by Tony Finch
Post by Sam Roberts
I maintain libnet (but did not write it, or make its autoconf system),
and system networking APIs vary enormously, I would describe it as
non-trivial. auto* is mostly for testing existence of dependencies,
when the mere fact that you are compiling on a platform is not
sufficient to know if an optional dependency exists.
One of the things that annoys me about autoconf-style configuration is the
idea that the same build commands can build a package with different
features because of some environmental change. I prefer builds that are
reproducible, and that fail rather than quietly disabling functionality.
The only real issue was my CPP-based byte ordering detection was wrong
for my DNS packet structure.
http://commandcenter.blogspot.co.uk/2012/04/byte-order-fallacy.html
Tony.
Coda Highland
2012-07-20 02:58:36 UTC
Permalink
On Thu, Jul 19, 2012 at 6:18 PM, Dimiter 'malkia' Stanev
Post by Dimiter 'malkia' Stanev
+1 to this.
I still don't get how autoconf works with cross compiling... Does it
actually?
Ideally it should create small test executables and run them on the target
platform... I doubt it's doing that...
That's one reason I like using Scratchbox for cross-compiling. CPU
transparency via qemu is a marvelous thing and it can even
cross-compile stuff that doesn't normally cross-compile.

/s/ Adam
Dimiter 'malkia' Stanev
2012-07-20 03:12:18 UTC
Permalink
But then, there are platforms where that's not really possible - like many
of the video game consoles, where you can have only one process running
at a time.
Post by Coda Highland
On Thu, Jul 19, 2012 at 6:18 PM, Dimiter 'malkia' Stanev
Post by Dimiter 'malkia' Stanev
+1 to this.
I still don't get how autoconf works with cross compiling... Does it
actually?
Ideally it should create small test executables and run them on the target
platform... I doubt it's doing that...
That's one reason I like using Scratchbox for cross-compiling. CPU
transparency via qemu is a marvelous thing and it can even
cross-compile stuff that doesn't normally cross-compile.
/s/ Adam
Coda Highland
2012-07-20 03:25:41 UTC
Permalink
On Thu, Jul 19, 2012 at 8:12 PM, Dimiter 'malkia' Stanev
But then, there are platforms where that's not very possible - like much of
the video game consoles - there you can have only one process running at the
time.
Well, certainly so, but I'm talking about POSIX platforms -- and even
so, some of those console platforms may have a suitable emulator in
the SDK if you mess with it enough.

(Why are you top-posting? I thought you knew better.)

/s/ Adam
Jay Carlson
2012-07-20 03:14:22 UTC
Permalink
[apologies for carrying this thread on, but speaking as a frequent cross-compiler and former distro maker, I think the ways autoconf actually assists cross-compilation are not intuitive]
I still don't get how autoconf works with cross compiling... Does it actually?
Yes. It was easier to cross-compile autoconfiscated packages than anything else in the ~2001 timeframe, and autoconf has improved since then. The real disasters were things like perl's config script.
Ideally it should create small test executables and run them on the target platform... I doubt it's doing that...
It can't do that when cross-compiling. However, run-time behavior comes up less often than you'd think. The presence of an #include or #define is a feature of the cross-compiler, not the target. This can be tested at compile time: does "mipsel-linux-gcc -c foo.c" succeed and produce a foo.o? The presence of a symbol, or which library is needed for a symbol, can be tested at link time.

Back in ~2001, the most common problem was questions like sizeof(int), which I thought was only knowable at run-time.[1] For features like that, it was convenient to preseed autoconf with a central config.cache with hand-written platform answers once--more convenient than manually patching handwritten makefiles for each package.
In a traditional self-hosted build process you can write a program
that prints the sizeof all the types you care about and then generate
C code based on that. But in cross-compilation environments, you may
not have the ability to run code you compile. At some point autoconf
gained the ability to deduce compile-time constants by...declaring
an array of size 1-2*!(expr). If expr is false, the size of the array
is -1, and the compile fails. autoconf then does a *BINARY SEARCH* of
the integers for the value....
Jay

[1]: Yeah, "testing for broken select() implementation" is not going to work. Sometimes just knowing the target is Linux is good enough for configure to skip those. In more extreme cases you can break out of the config script and hand-run the executable on the target, but I don't remember doing that much.
Dimiter 'malkia' Stanev
2012-07-20 18:48:05 UTC
Permalink
Post by Jay Carlson
It can't do that when cross-compiling. However, run-time behavior comes up less often than you'd think. The presence of an #include or #define is a feature of the cross-compiler, not the target. This can be tested at compile time: does "mipsel-linux-gcc -c foo.c" succeed and produce a foo.o? The presence of a symbol, or which library is needed for a symbol, can be tested at link time.
Thanks Jay! That was a good explanation. After all, a lot of
platform-specific magic can be found at the preprocessor level, or by
testing for expected/unexpected compiler errors, so there's no need to
actually run anything on the device.

Aleksey Cheusov
2012-07-17 10:49:31 UTC
Permalink
Consider using mk-configure as a replacement for autotools.

http://sourceforge.net/projects/mk-configure
https://github.com/cheusov/mk-configure

It is a portable general purpose build automation system that has
support for Lua.
You can find simple examples under examples/hello_lua{,2,3}
directories on github.
Sam Roberts
2012-07-17 18:56:24 UTC
Permalink
Post by Aleksey Cheusov
Consider using mk-configure as a replacement for autotools.
Thanks, Aleksey, I'll check it out.

Sam
Xavier Wang
2012-07-18 00:17:21 UTC
Permalink
but it requires bmake, which is not available on Windows :(
 
I would very much like a Windows version of bmake, but no
luck, so on Windows, GNU make is the only choice
Post by Aleksey Cheusov
Consider using mk-configure as a replacement for autotools.
http://sourceforge.net/projects/mk-configure
https://github.com/cheusov/mk-configure
It is a portable general purpose build automation system that has
support for Lua.
You can find simple examples under examples/hello_lua{,2,3}
directories on github.
Aleksey Cheusov
2012-07-18 09:51:48 UTC
Permalink
Post by Xavier Wang
but it requires bmake, which is not available in Windows :(
Sorry, what is Windows? :-P
Post by Xavier Wang
I also hopes can have a Windows version of bmake very much, but no
lucky,
Seriously, bmake can be compiled on Cygwin and Interix.
So, it is definitely available on Windows.
Post by Xavier Wang
so on windows, GNU make is the only choice
No ;-)
steve donovan
2012-07-18 10:06:59 UTC
Permalink
Post by Aleksey Cheusov
Seriously, bmake can be compiled on Cygwin and Interix.
So, it is definitely available on Windows.
True, it's amazing how close you (as a developer) can get Windows to
look like Unix.

But it places a big burden on ordinary people just wanting to build
the software, with the standard mingw download.

So there are three kinds of pain to be minimized:
1 original developer
2 anybody wishing to patch and modify
3 users just wanting to build

Generally, we expect (1) to take some extra pain so that (3) is
happier. After all, we _do_ want software to be used ;)

steve d.
Aleksey Cheusov
2012-07-18 10:15:40 UTC
Permalink
On Wed, Jul 18, 2012 at 1:06 PM, steve donovan
Post by steve donovan
Post by Aleksey Cheusov
Seriously, bmake can be compiled on Cygwin and Interix.
So, it is definitely available on Windows.
True, it's amazing how close you (as a developer) can get Windows to
look like Unix.
In the mid-90s I was a Windows user, so I can feel the pain.
But I believe that UNIX tools ported to Windows (Cygwin and Interix)
give Windows developers A LOT of power. So, it makes sense for them
to learn UNIX tools once and use them for years. IMHO mingw is not relevant.
steve donovan
2012-07-18 10:23:20 UTC
Permalink
Post by Aleksey Cheusov
gives Windows developers A LOT of power. So, it makes sense for them
to learn UNIX tools once and use them for years. IMHO mingw is not relevant.
Ah, but if I want Unix, I know where to find it ;)

Mingw is relevant because it's a straightforward way for people to get
a good working GCC on their computers, especially with the Dragon
distribution. 15Meg download, they have a C compiler. But they won't
have anything more than GNU make, and that's often not enough. This is
why things like CMake are definitely relevant here, even though it
makes my eyes bleed.

steve d.
David Given
2012-07-18 10:48:04 UTC
Permalink
Aleksey Cheusov wrote:
[...]
Post by Aleksey Cheusov
In the middle 90th I was a Windows user, so, I can feel the pain.
But I believe that UNIX tools ported to Windows (cygwin and interix)
gives Windows developers A LOT of power. So, it makes sense for them
to learn UNIX tools once and use them for years. IMHO mingw is not relevant.
We recently switched our development system from Cygwin to Mingw because
Cygwin was becoming far too much of a PITA to use --- recent versions
get on *really* badly with having more than one version of Cygwin
installed at a time, or calling out to non-Cygwin command line tools
from Cygwin, or calling in to Cygwin command line tools from non-Cygwin...

Mingw tries much less hard than Cygwin to be Unix-like, which makes it
much simpler and easier to understand, and vastly easier to deploy in a
turnkey environment which may need to coexist with random other tools
the user may have deployed. And the licensing is vastly, vastly easier
and cheaper.

(GNU make on Windows is still a dead loss, though; it can't cope with
targets containing colons, such as any Windows-style path...)
--
┌───  ───── http://www.cowlark.com ─────
│ "Parents let children ride bicycles on the street. But parents do not
│ allow children to hear vulgar words. Therefore we can deduce that
│ cursing is more dangerous than being hit by a car." --- Scott Adams
Aleksey Cheusov
2012-07-18 10:55:59 UTC
Permalink
Post by David Given
(GNU make on Windows is still a dead loss, though; it can't cope with
targets containing colons, such as any Windows-style path...)
There are myriads of problems in porting UNIX apps to Windows.
This is why in our company we use normal UNIX for development
of our Windows-based product.
Doug
2012-07-19 00:22:52 UTC
Permalink
-_- as a developer forced to support Windows apps built on mingw,
please don't say that.
If Lua ever stopped being buildable for that target it'd be a nightmare.

~
Doug.
Post by Aleksey Cheusov
On Wed, Jul 18, 2012 at 1:06 PM, steve donovan
Post by steve donovan
Post by Aleksey Cheusov
Seriously, bmake can be compiled on Cygwin and Interix.
So, it is definitely available on Windows.
True, it's amazing how close you (as a developer) can get Windows to
look like Unix.
In the middle 90th I was a Windows user, so, I can feel the pain.
But I believe that UNIX tools ported to Windows (cygwin and interix)
gives Windows developers A LOT of power. So, it makes sense for them
to learn UNIX tools once and use them for years. IMHO mingw is not relevant.
liam mail
2012-07-17 12:09:55 UTC
Permalink
Post by Sam Roberts
Btw, I also looked at premake, and its pretty much exactly what I
don't want, a replacement for make, that controls all aspects of the
build. Nice that its in lua, but no...
FYI, and you are not alone in thinking this, but this is not what
Premake is. As the name suggests, it runs before make (or others)
and is a cross-platform project generator, ie it generates Visual
Studio solutions, Xcode projects, makefiles, CodeBlocks, CodeLite etc,
much like CMake does, yet using a sane language :)
Post by Sam Roberts
that controls all aspects of the build
With the latest development version you can add blocks of code which
will be placed as is into a Makefile etc and you have been able to run
your own scripts for build events for a long time, ie it does not
always control all aspects.

Liam
Jay Carlson
2012-07-18 15:08:30 UTC
Permalink
Post by Sam Roberts
I've been playing with it. I've had trouble finding a decent tcl
syntax summary,
"Get used to disappointment."

Tcl is like the platonic essence of a programming language designed around string interpolation, in the same way Lua is for tables and Scheme is for cons cells. A lot of us have written various hacky little macro expansion languages; core Tcl is those done right. Tcl's major charm is that function arguments are not evaluated, making it easy to build special forms and little languages. So other than the core {}[]"$f" syntax, Tcl syntax is whatever you want it to be.

But as the saying goes, "there's no right way to do a wrong thing". Tcl may not be a mess like PHP, but pure and clean doesn't mean particularly ergonomic. Sure, everything is a string, but that's about as cool as everything being a void*; you still have to keep track of how a string should be interpreted. Is it a list? A command? A name of an object? A name of an array? Who's in charge of deleting its referent? You end up with all these second-order memory leaks, since there can be no GC which can know how to interpret strings as pointers.

All-string data structures can accidentally lead to algorithms with performance characterized as "superquadratic". And since everything's a string, Tcl is naturally missing closures, and Tk kinda needed them. Callbacks from UI events were just strings eval'd in the global context, with %x %y printf'd in.

The language STk was a decent Scheme connected to a patched Tk, and several other languages were maintaining essentially the same Tk patch: give widgets and callbacks a void* and a hook for GC for languages with values richer than strings. IIRC this patch was offered upstream but Ousterhout was not interested in making life any easier for people who didn't want Tcl. In today's social environment, Tk would have forked, but that was considered far more rude in that era.

Tcl-the-distribution parted ways with Lua around the time Tcl reimplemented stdio; I think this was around 8.0. I do remember downgrading to Tcl 7.x because it was significantly smaller when I built http://web.archive.org/web/20000821215129/http://vhl-tools.sourceforge.net/demo1.html . Later versions of Tk depended so heavily on Tcl libraries there was little point in trying to tease them apart.

Tcl did a lot of good in advancing the ideology of using scripting to build applications. It was the Lua of its era. But Lua is the Lua of this era.

Jay
David Given
2012-07-18 15:46:59 UTC
Permalink
Jay Carlson wrote:
[...]
Post by Jay Carlson
Tcl is like the platonic essence of a programming language designed around string interpolation, in the same way Lua is for tables and Scheme is for cons cells. A lot of us have written various hacky little macro expansion languages; core Tcl is those done right. Tcl's major charm is that function arguments are not evaluated, making it easy to build special forms and little languages. So other than the core {}[]"$f" syntax, Tcl syntax is whatever you want it to be.
The Fossil DVCS/bug tracker/wiki/CMS (which I've been using, and is
really nice) uses a Tcl subset called TH1 as its default template
language. It's specifically a *template* language, designed to do simple
stuff in HTML files, rather than a fully-fledged scripting language, and
somewhat to my horror actually seems to do the job quite well...

The implementation is about 4000 lines of well-commented C, and the
programming manual is 26 pages:

http://www.sqliteconcepts.org/THManual.pdf

I can't say I *like* it, but I can appreciate the minimalism of the design.
--
┌───  ───── http://www.cowlark.com ─────
│ "Parents let children ride bicycles on the street. But parents do not
│ allow children to hear vulgar words. Therefore we can deduce that
│ cursing is more dangerous than being hit by a car." --- Scott Adams
Luiz Henrique de Figueiredo
2012-07-18 17:45:47 UTC
Permalink
Post by Jay Carlson
Tcl did a lot of good in advancing the ideology of using scripting to build applications. It was the Lua of its era. But Lua is the Lua of this era.
Very nicely put!
Eric Wing
2012-07-18 19:10:33 UTC
Permalink
Post by Sam Roberts
Is anybody else using autosetup with lua? If there are public repos,
I'd like to see more examples.
Btw, I also looked at premake, and its pretty much exactly what I
don't want, a replacement for make, that controls all aspects of the
build. Nice that its in lua, but no...
I dropped autotools many years ago out of frustration. I found that
autotools did very little to help me (the project writer) find and
detect which dependencies are available and act accordingly; the
burden was on me to write more m4 scripts and such. Also, I found
the use of libtool by the tool chain to be excruciatingly slow for
large/complex projects.

Also, as others point out, autotools isn't very friendly to
environments that are not built around it (i.e. Windows) so it didn't
really do anything useful for us. Also of note, Apple no longer ships
GNU/GPL based tools with Xcode's command line tools.


I ended up with CMake a long while back. While far from perfect, I
think their overall approach/philosophy is sound. Instead of trying to
micromanage and reinvent the entire build tool chain, CMake is a
meta-build generator that constructs native projects (e.g. Makefiles,
Visual Studio projects, Xcode projects, Eclipse workspaces). This
empowers developers to use the tools they are familiar/comfortable
with.

I did a video introduction on this awhile back to demonstrate how
Makefile users, Eclipse users, Xcode users, and Visual Studio users
would all use a common project under CMake:
 
http://youtu.be/CLvZTyji_Uw
My biggest gripe is the CMake language. I still have dreams of
resurrecting/finishing the Lua bindings to CMake some day, but it's not
something I have time for in the foreseeable future.

For stock Lua, I know there are multiple CMakeLists.txt descriptions
for Lua. I wrote one myself (emphasizing/embracing proper Apple/Mac
conventions which requires some additional code, plus proposing a way
to coordinate independent projects with a simple unifying CMake
script). The repo links can be found here:
http://playcontrol.net/ewing/jibberjabber/mercurial_subrepos_a_past_e.html

-Eric
--
Beginning iPhone Games Development
http://playcontrol.net/iphonegamebook/
Doug
2012-07-19 00:21:00 UTC
Permalink
Idly, I agree with all your points.

Gad, it'd be so amazing to be able to write my cmake scripts in Lua
instead of the awful CMake syntax.

Just imagine, a cmake:execute_process(...) and
cmake:add_library(sources) instead of ADD_LIBRARY("${SOURCES}").

*day dreams*

~
Doug.
Post by Eric Wing
Post by Sam Roberts
Is anybody else using autosetup with lua? If there are public repos,
I'd like to see more examples.
Btw, I also looked at premake, and its pretty much exactly what I
don't want, a replacement for make, that controls all aspects of the
build. Nice that its in lua, but no...
I dropped autotools many years ago out of frustration. I found that
autotools did very little to help me (the project writer) to find and
detect which dependencies are available and let me do things
accordingly and that burden was on me to write more m4 scripts and
such. Also, I found the use of libtool by the tool chain to be
excruciatingly slow for large/complex projects.
Also, as others point out, autotools isn't very friendly to
environments that are not built around it (i.e. Windows) so it didn't
really do anything useful for us. Also of note, Apple no longer ships
GNU/GPL based tools with Xcode's command line tools.
I ended up with CMake a long while back. While far from perfect, I
think their overall approach/philosophy is sound. Instead of trying to
micromanage and reinvent the entire build tool chain, CMake is a
meta-build generator that constructs native projects (e.g. Makefiles,
Visual Studio projects, Xcode projects, Eclipse workspaces). This
empowers developers to use the tools they are familiar/comfortable
with.
I did a video introduction on this awhile back to demonstrate how
Makefile users, Eclipse users, Xcode users, and Visual Studio users
http://youtu.be/CLvZTyji_Uw
My biggest gripe is the CMake language. I still have dreams of
resurrecting/finishing the Lua bindings to CMake some day, but its not
something I have time for in the foreseeable future.
For stock Lua, I know there are multiple CMakeLists.txt descriptions
for Lua. I wrote one myself (emphasizing/embracing proper Apple/Mac
conventions which requires some additional code, plus proposing a way
to coordinate independent projects with a simple unifying CMake
http://playcontrol.net/ewing/jibberjabber/mercurial_subrepos_a_past_e.html
-Eric
--
Beginning iPhone Games Development
http://playcontrol.net/iphonegamebook/