Ikke's Blog

Archives for: February 2005

Feb 25
Gnome Settings Manager

Just thought of this:
Imagine a program that lets you tar up your settings. You run it, and it takes a snapshot of your Gconf settings, your Gnome application settings in .gnome2, your current GTK+ theme, your Metacity theme, your icon theme,... i.e. everything related to your Gnome desktop, and maybe even more (as plugins): Mozilla Firefox preferences, OpenOffice.org settings,... It bundles all of this into a nice tarball which you can take to another machine, feed to some tool there, and all at once you've got exactly the same environment as on the first machine...

I've been thinking of this because I installed Ubuntu Linux at VTK the other day on a workstation, and I hated being forced to reconfigure every application the way I want it. Just grabbing the settings from my workstation here and applying them there would be so cool.

I know I could just tar -cjvf ~/.somedirectoriesandfiles, but that's no "nice" solution, and it could (among other things) corrupt the existing Gconf keys, Firefox profiles, etc. on the target machine.
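A cleaner building block already exists for the Gconf part, by the way: recent gconftool-2 versions can dump and load whole key trees as XML. A very rough sketch of the idea could look like this (file names are just examples, and it would of course still blindly copy hard-coded paths, which is exactly the problem mentioned below):

# On the old machine, from your home directory (needs a gconftool-2 with --dump):
gconftool-2 --dump / > gconf-settings.xml
tar -cjf desktop-settings.tar.bz2 gconf-settings.xml .gnome2 .themes .icons

# On the new machine:
tar -xjf desktop-settings.tar.bz2
gconftool-2 --load gconf-settings.xml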

I just launched the idea on #gnome@GIMPNet; one possible problem that was mentioned is hard-coded paths (e.g. in Gconf keys). The settings application should be smart enough to find these and check whether the specified file exists. If it doesn't, the key should not be created, and the application using the key should create it or fall back to the Gconf defaults (that's just how applications should work with Gconf; if they don't, they're not 100% Gconf-enabled/valid).

This won't be easy to write, but taking it step by step, one should be able to get it working.

RealNitro: wouldn't this be some nice Python project? File handling, string parsing, Gconf bindings, even using PyGTK one day... If you'd like to take a look at this, I'd be glad to join and learn some more Python at the same time :-)

FOSDEM tomorrow and Sunday. I hope I'll get there (on Sunday preferably).

Feb 25
Vim and SCP/SFTP

Maybe useful for some people reading this: if you want to edit remote files over SCP using Vim or gVim, there's no need for Gnome-VFS, shfs, lufs or anything else.

:e scp://username@host/a/file/in/your/homedir

will do the job.

When you have a decent .ssh/config, or the same username on both machines, you can omit the username part.
Use DSA/RSA keys so you don't have to enter your password on every open/save action :-) A minimal setup could look like the sketch below.
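Something along these lines should do it (key type, host name and alias are just examples; ssh-copy-id ships with most OpenSSH installations, otherwise append the .pub file to the remote ~/.ssh/authorized_keys by hand):

ssh-keygen -t rsa                  # generate a key pair, accept the defaults
ssh-copy-id username@host          # install the public key on the remote machine

# Optional ~/.ssh/config entry, so you can type scp://devbox/... without a username:
Host devbox
    HostName host.example.org
    User username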

AFAIK tab completion is not available, which is a pity.

Feb 23
Ugent WebAuth using PHP

As mentioned before, I wrote a PHP class that makes it easy for people writing PHP applications to (ab)use the authentication method UGent and DICT offer.
This is its documentation, rendered using phpDocumentor. The code of the class can be found inside the docs (can't give a link here, b2evo bug :-/); a working demo is here.

The code uses two mechanisms to decrypt the RSA-encrypted key we get from the authentication server: it first tries PHP's internal openssl_* functions, if support for them is compiled into the PHP interpreter. Otherwise it falls back to shell_exec to call OpenSSL in a shell, which unfortunately adds a lot of overhead.
Unlike the provided samples, I'm not using temporary files anywhere.
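For the curious, the shell fallback boils down to something like this (the key file name and the exact openssl invocation are my assumptions for illustration, not the actual code of the class):

# decode the base64 blob from the authentication server, then let the openssl
# CLI recover the payload using the server's public key
echo "$ENCRYPTED_KEY_B64" | openssl base64 -d | openssl rsautl -verify -pubin -inkey server-pubkey.pem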

Just installed Evince. It already has more features than Gpdf; let's hope it keeps growing :-)

Feb 23
X Development

There's been a lot of talk lately about the future of X and the things that should become possible,... Just take a look at Planet Gnome; especially Havoc and Seth have been active in this field.
A week or so ago there was XDevConf, a conference for, guess what, X developers. That summit has been blogged about quite a lot too, both on Planet Gnome and Planet Freedesktop.

I just read an article by Rasterman, an Enlightenment developer and X wizard. I had already looked at E17 and EFL some months ago, but the article pointed me to them again. It explains what is already possible today, and includes some videos. Make sure you watch them.

I'm a Gnome user; I love the Gnome desktop and its development. Still, some of the things you can see in the sample movies are great, and cannot be used inside Gnome at the moment.
I don't like E17 itself: I dislike the looks of the window manager. But some of the things these guys are able to achieve right now look incredible, and even useful ;-) If the X.org developers at freedesktop.org keep enhancing their X server (and especially the drivers), imagine how cool it would be if GTK+ could make use of some of the things these Enlightenment libraries provide. Not only for eye-candy, but also for "functionality" (think of *real* terminal transparency instead of just copying the lowest X buffer, or transparent Gaim buddy lists as I wrote before). The possibilities are endless. And some people would die to get Linux on their desktop :-)

Lennert had a good idea yesterday too: take the Gaim buddy list out of its window and put it on the desktop (if wanted, of course). Like a "Gaim GDesklet" (if you don't know what desklets are, check this screenshot, upper right corner). Would be very cool IMHO B-)

Feb 22
News from me

It's been a long time once more ;-)

I've had lots of things to do lately, so I rarely find time to update this blog. On top of that, I got sick about a week ago, so I had to stay in bed for a while too.

Some things I've been doing lately:

  1. Installed Xgl, dropped it because I got no hardware acceleration

  2. New desktop (screenshot, slightly changed now: the line under my top panel is smaller)

  3. Installed Hula some minutes ago. It is nice, but still needs some work IMHO.

  4. Wrote a mass-mailing-with-attachment Python script. It's not 100% done; if you want it, give me a call

  5. At Ghent University there's a new system that allows web application developers to authenticate users against the university's student database. While all sample code in the docs is written in Perl, using a bunch of temporary files to decode/decrypt keys etc. (I'm not getting into the implementation details here), I wrote a PHP class to achieve the same goals. It doesn't use any temporary files (which is better, think of multithreaded webservers...), but it still uses a shell_exec call to openssl to decrypt a PKI string using a public key. TODO: use PHP's openssl functions, with shell_exec as a fallback when OpenSSL support hasn't been compiled into PHP. I think I'll send the class to the UGent admins once it's done, so others can make use of it too.

  6. Just showed Nat how easy it is to generate PDF files from MediaWiki pages using this software. He was looking for a tool like this to create PDFs out of the Hula Wiki pages (hey, I even pointed him at MediaWiki in a lengthy mail I sent him upon his request, describing all the wiki implementations I have some experience with :-))

  7. Installed phpAdsNew 2 today at VTK. It's not deployed yet (I still need to give a demo to the admins); I hope it'll give us better, easier and more manageable ad management, including decent statistics. Hey, we need to know our market value, don't we?

TODO:

  1. Get a new hard drive (8.4 GB is too small for a modern desktop/development system)

  2. Get the NVidia binary drivers working, play around with Xgl once more, and try hardware-accelerated compositing

  3. Get semi-transparent Gaim chat windows and buddy list working

  4. Play around with Galago

  5. Get a Subversion repository somehow, somewhere

  6. Find out how to get to FOSDEM, or once again I won't go

  7. Finish reading the GStreamer development handbook (60 pages to go)

  8. Get some new girlfriend?

  9. Lots more, not willing to think about it at the moment.

Oh, I almost forgot: we sold our first Dell computers today. Let's hope the e-commerce site will be up soon.

Feb 13
New article: GOptions

Wrote a new article today, on command-line option parsing using Glib 2.6's GOption.

Once more, it's listed on my articles page, direct link here.

The next one will be about moving your project directory into a local or remote Subversion repository, or about GStreamer basics; I still have to make up my mind.

Feb 12
Use id's

Hint #cantremember: Use IDs wherever you're allowed to

This makes cross-referencing later on (using xref or whatever) easier, and references are very nice and useful (look at Wikipedia and others).

Feb 12
"Introduction to Makefiles and Autotools" published

I finally finished the Docbook-translation of my article on Makefiles and Autotools. The result is listed on my articles page, direct link here.

Feb 10
CV

Just wrote an initial version of my CV; it still needs some work, and PDF conversion :-)

I might ask some people for advice on it too.

Feb 10
Get a reference sheet

Hint: Use a Docbook element reference sheet

Simple :-)
I use this one; this one is somewhat bigger.

Feb 9
#3

Hint 3: Write directly in DocBook format

As you might know, I wrote my first two articles in this blog. I docbook'ized the first one, which went quite well; I'm now doing the second one, which is a real PITA.

So, a little hint: don't write plain-text files first, write DocBook code/tags directly.

Feb 8
Use Yelp to preview your documents

A simple one for now, as I haven't got much time, but I don't want to break the chain either:

Tip 2: Use Yelp to preview your documents

You can use Yelp to preview the Docbook documents you write, so you don't need to run xsltproc/xmlto on the file every time (which takes a while).

Just open the file like this:

yelp mydocument.dbk
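For comparison, the conversion step Yelp saves you while writing would be something along these lines (the exact tool and output format depend on your setup):

xmlto html mydocument.dbk
# or, for the final PDF:
xmlto pdf mydocument.dbk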

Feb 7
Article list generation script

I wrote the articles/index.html generation script, using Bash and some sed magic.

This is a sample input file:

Using Glib Signals with GOB_gob-signals
Introducing the Glib Mainloop_glib-mainloop

which outputs the current index.html.

You can find the script here; it's called genindex.sh. The index.html.in file I use as a boilerplate in the end is here; it's very easy to figure out what it does.
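The core of such a generator is really small. Here's a rough sketch of the idea, not the actual genindex.sh (input/output file names are made up):

#!/bin/bash
# every input line is "Article title_slug"; turn it into a list item
while read line; do
    title=$(echo "$line" | sed 's/_[^_]*$//')   # everything before the last "_"
    slug=$(echo "$line" | sed 's/.*_//')        # everything after the last "_"
    echo "<li><a href=\"${slug}/\">${title}</a></li>"
done < articles.in > items.html
# items.html can then be spliced into the index.html.in boilerplate with sed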

Bash is a really powerful thing; the more scripts I try to write with it, the more I learn.
Of course you could do something like this using PHP, Python or something else, but Bash is much cleaner IMHO (i.e. you don't need a fully fledged PHP installation to do stuff like this).

Feb 7
New articles page

Just finished (well, almost) the articles section on my website.
As you can see, the articles are available as PDF now too, although the PDFs aren't always formatted very well (links aren't rendered as real links :-(); I need to look into this.
I also need an info page with an explanation of the license, and some copyright information.
The article titles should get the same look as the section links on my homepage, but I can't get them to behave correctly :'(

I'm also going to write a PHP script that generates the HTML code you see there from an XML file listing all available articles. That will make things much easier for me :-)

I've read some of the GStreamer API docs today, it's a wonderful framework. Prepare for some tutorial ;-)
Next one should be about local (UNIX Domain) sockets, but I think I'll Docbookize the Makefiles tutorial first.

Feb 7
A Docbook tip a day keeps MS Word away

Last night I decided to create a new blog category, where I'll try to give one Docbook-related tip every day, so others can get used to this great format too and start writing documentation or articles using it :-)

I will concentrate on writing "article" files, not "books" or some of the other Docbook classes.

First tip: The standard Docbook Article boilerplate

A Docbook document is an SGML or XML file. Writing SGML can be a tedious task, so most users write their documents in XML.
This is the standard boilerplate for a Docbook XML article:

<?xml version="1.0"?>
<!DOCTYPE article PUBLIC "-//OASIS//DTD DocBook XML V4.1.2//EN"
"http://www.oasis-open.org/docbook/xml/4.1.2/docbookx.dtd">
<article id="sometitle" lang="en">

Document goes here

</article>
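To check that what you write actually validates against the DTD, you can for instance run xmllint over it (the file name is just an example):

xmllint --noout --valid sometitle.dbk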
Feb 6
New article: The Glib Mainloop

I decided to write new articles directly in Docbook, instead of posting them in this blog.

As promised, I wrote an article on the Glib Mainloop. It is available here. I still need to add hyperlinks to the quoted references (APIs,...), but I'll do that later.

Actually, I'm starting to like DocBook. The generated documents may look ugly, but some CSS work could fix that.
The good thing is that you get a consistent interface, and the XML you write really describes what you're writing about, if you use the correct tags.
The documents I write may not be 100% correct (missing tags in some places etc.), but I'm learning ;-)

I hope you like the article; please comment on it if things aren't correct, not obvious enough,...

I should try to write up a CV too in the next few days. Hell of a job.

Feb 4
Articles

Converted my first article, the one about GObject Signals, into the Docbook format.
The resulting HTML is here, the Docbook XML source here.
The CSS stylesheet needs some more polishing, I know.
Also try viewing the article in Yelp: download the source, and run "yelp gob-signals.dbk".
I updated my homepage so there's a new link to the articles section now.

I'll convert the article on Makefiles later, it's a tedious job.

Started working on the "Glib Mainloop" one, won't be finished too soon (not easy to know what to write about ;-))

Feb 3
Gnome 2.10.0 beta 1 released

Just got the notification on IRC. Yay, finally :-)

Musical hint: check out the "Cello Concerto in A minor" by Camille Saint-Saëns.

Feb 1
Writing Makefiles the manual way, and using autotools

As promised, here's the article on writing Makefiles. As an extra, I'll also show how to build a basic/simple autotools project.

Writing Makefiles, the manual way

Let's get started.
First of all, you need all files used in my previous article, i.e. test-signal.gob and test-signal-test.c

There are some rules of thumb you can use when writing a Makefile manually:

  1. Find out what programs you need to process your source files

  2. Find out what packages you need: which headers, and which libraries

  3. Make a list of all source files, and find out which ones are dependent on others

Let's get through these step by step:

  1. What programs do we need? We need gob2 of course, to process our gob file. Next to this, we need the stuff you need most of the time when creating a Makefile: a compiler and a linker. And guess what, GCC can do both things.

  2. What packages do we need? Remember the command line thing you had to use when compiling the test-signal executable?

    gcc -Wall `pkg-config --libs --cflags gobject-2.0 glib-2.0` -o testsignal test-signal-test.c test-signal.c

    We don't even need this line to figure out what we need: we're building a GObject, so we need all the libs and headers provided by the gobject package, version 2.x in our case; same thing for glib.
    How can we find out where these things are located? Well, the smart people at freedesktop.org made a nifty tool called pkg-config. When a library is installed, it can install a pkg-config resource file, which lists the directories where its stuff lives. For some samples of these files, check /usr/lib/pkgconfig. The pkg-config command-line utility can parse these files and give you the information you need (see the quick example right after this list).

  3. What source files have we got, and which one needs which? We have two files, test-signal.gob and test-signal-test.c. In the end we want to generate an executable called test-signal, which needs test-signal.c (the GObject implementation file) and test-signal-test.c.
    We will work in two stages here: first we'll compile all necessary .c files to object files (.o), then in the end link all object files together into a nice executable.
    Here's what should happen: test-signal.gob should be parsed by gob to create our test-signal.c and test-signal.h files, test-signal.c must be compiled, test-signal-test.c must be compiled, and finally test-signal.o and test-signal-test.o should be linked into test-signal.
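As a quick illustration of what pkg-config gives you (the exact paths will differ from system to system):

$ pkg-config --cflags glib-2.0
-I/usr/include/glib-2.0 -I/usr/lib/glib-2.0/include
$ pkg-config --libs glib-2.0
-lglib-2.0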

First, a little intermezzo. You might be asking: "If I need to make all these lists, what's the use? Can't I just write some Bash script that executes the gcc command all at once?" Well, no. Make does more than just executing some commands: it also checks whether it *should* do something. Imagine you have 100 source files. You've compiled all of them, linked them into some executable, run it, and found some bug. You fix it by editing a few lines in one file, and re-execute your huge gcc command. Gcc would recompile all 100 files, which takes quite some time.
If you use a Makefile, make will figure out that only one file has been altered since the last build; it'll let gcc recompile only that file, then relink all object files together, which takes far less time. Next to this: once you use autotools, everything becomes much simpler ;-)

Ok, now we've got all the prerequisites. Let's get started writing our Makefile.
I always tend to use the same layout when writing one (although I don't write many of them, I use autotools ;-)). I start by defining the executables:

GOB2=gob2
CC=gcc
LD=gcc

This is not strictly necessary, but it can be useful: we can now use these variables later on, and if we ever want to change the linker, we only have to edit the Makefile in one place.

Next comes the definition of the CFLAGS, the flags given to the compiler (CC) when compiling a source file into an object file:

CFLAGS=`pkg-config --cflags glib-2.0`
CFLAGS+=`pkg-config --cflags gobject-2.0`
CFLAGS+=-Wall -g

As you can see, we request the CFLAGS necessary for glib-2.0 and gobject-2.0 (2.x, actually) by querying pkg-config. At the end we add two compiler flags: -Wall, which tells gcc to show all possible warnings, and -g, which tells gcc to include debugging information. This results in a somewhat bigger executable, but it is very useful if we want to debug the program using GDB.

Now we define the LDFLAGS, the flags given to the linker:

LDFLAGS=`pkg-config --libs glib-2.0`
LDFLAGS+=`pkg-config --libs gobject-2.0`

This should be fairly self-explanatory.

This project consists of only one executable, so I define an "OBJS" variable listing all the object files needed to build it:

OBJS=test-signal.o test-signal-test.o

Now comes the name of the executable we want to create:

PROG=test-signal

Now the magic starts. Until now we only defined some variables for later use. We were not forced to do so; it's just more convenient later on. Actually, when building small things, we can simply re-use this Makefile and only change the variable definitions (OBJS, PROG, maybe some CFLAGS and LDFLAGS).

Now some rules follow, which tell make how to handle files:

%.c %.h %-private.h: %.gob
        $(GOB2) $<

This rule tells make: "When a foo.c, foo.h or foo-private.h is needed and doesn't exist, it should be made from foo.gob by executing "$(GOB2) $<", which expands to "gob2 foo.gob"". Indeed, % represents "some string", $(GOB2) is the variable we defined at the beginning and expands to "gob2", and $< expands to the first item on the right of the colon (":").
One big thing to watch out for: the command lines of a rule must be indented with a tab character; spaces will not work. So in the last Makefile fragment there's a [tab] before $(GOB2).
The format of a rule is very simple:

filenametobuild: dependenciestobuilditfrom
[tab]what to do first[enter]
[tab]what to do next if necessary[enter]

etc.

Next we define how to link our $(PROG):

$(PROG): $(OBJS)
        $(CC) $(LDFLAGS) $(OBJS) -o $(PROG)

$(PROG) is built out of $(OBJS) by issuing "$(CC) $(LDFLAGS) $(OBJS) -o $(PROG)", which expands to "gcc `pkg-config --libs glib-2.0` `pkg-config --libs gobject-2.0` test-signal.o test-signal-test.o -o test-signal".

We still have to tell make how to create object files out of source files:

%.o: %.c
        $(CC) $(CFLAGS) -c $<

You should be able to figure out what this does by yourself.

Now we add some convenience targets:

all: $(PROG)

default: $(PROG)

clean:
        rm -f $(OBJS) $(PROG)
        rm -f test-signal.[ch]
        rm -f test-signal-private.h

This enables us to type "make all", which builds the "all" target, or "make clean" to "build" the "clean" target (a plain "make" builds the first real target in the file, which is $(PROG) here).
Notice that a target

  1. is not forced to "do" anything ("all" and "default"): you can just say it depends on "foo" and/or "bar", which will then be built, and

  2. may have an empty list of dependencies ("clean")

Let's end with the complete Makefile:

GOB2=gob2
CC=gcc
LD=gcc

CFLAGS=`pkg-config --cflags glib-2.0`
CFLAGS+=`pkg-config --cflags gobject-2.0`
CFLAGS+=-Wall -g

LDFLAGS=`pkg-config --libs glib-2.0`
LDFLAGS+=`pkg-config --libs gobject-2.0`

OBJS=test-signal.o test-signal-test.o

PROG=test-signal

%.c %.h %-private.h: %.gob
        $(GOB2) $<

$(PROG): $(OBJS)
        $(CC) $(LDFLAGS) $(OBJS) -o $(PROG)

%.o: %.c
        $(CC) $(CFLAGS) -c $<

all: $(PROG)

default: $(PROG)

clean:
        rm -f $(OBJS) $(PROG)
        rm -f test-signal.[ch]
        rm -f test-signal-private.h

Save this as a file called "Makefile" in your source directory, and type "make". This should be the result:

gob2 test-signal.gob
gcc `pkg-config --cflags glib-2.0` `pkg-config --cflags gobject-2.0` -Wall -g -c test-signal.c
test-signal.c: In function `test_signal_testsignal':
test-signal.c:141: warning: implicit declaration of function `memset'
gcc `pkg-config --cflags glib-2.0` `pkg-config --cflags gobject-2.0` -Wall -g -c test-signal-test.c
gcc `pkg-config --libs glib-2.0` `pkg-config --libs gobject-2.0` test-signal.o test-signal-test.o -o test-signal
rm test-signal.c

Notice the "rm test-signal.c" at the end: make treats files it generated itself through a chain of rules (like test-signal.c here) as intermediate files and removes them after the build. Nothing is lost: when you update test-signal.gob, make will simply run gob again to regenerate the .c file.

Now you should be able to execute ./test-signal.
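To see the intermezzo from earlier in action, try something like this (the exact messages may differ):

touch test-signal-test.c
make            # only test-signal-test.c gets recompiled, then everything is relinked
make            # nothing changed: make reports the target is up to date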

Ok, we've got a nice Makefile now, but it took some time to write, didn't it? And we don't have a "normal" FOSS install method like ./configure, make, make install...

Well, that's what we are going to do now. The following stuff is much easier than writing Makefiles by hand (you know, FOSS devs are lazy people ;-)), but it is useful to know how Makefiles are formatted and how they work.

Ok, time for some really 1337 (couldn't stop it) stuff: introducing GNU Autotools.

Using the GNU Autotools to build your project

Didn't you ever want to be able to write such a neat ./configure script yourself? Here's how to do it with our sample project :-)

Autotools consists of a bunch of utilities, most of them starting with "auto" (duh). There's autoconf, which generates a "configure" script from a file you provide; there's automake, which creates Makefiles for you (actually, it does not create Makefiles directly, read on); and many more.

This is how it works: the configure script looks up a bunch of stuff for you (or you provide it using --with-foo=... etc.), runs some tests to figure out whether the project should compile and run cleanly on your system, and in the end generates some files.
For each generated file you provide a boilerplate called "thefile.in"; the generated file will then be "thefile". Inside "thefile.in" you can use variables like "@FOO@", which will be substituted by the configure script.
That's the system automake uses: you write a simple Makefile.am file (read on to see how), automake generates a long and difficult Makefile.in from it, and that gets processed by configure to create the final Makefile.
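As a toy illustration of that substitution mechanism (example.in is a made-up file; for configure to process it, "example" would have to be listed in AC_OUTPUT below, and the output assumes configure picks gcc):

$ cat example.in
The C compiler configure found: @CC@
$ ./configure >/dev/null && cat example
The C compiler configure found: gcc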

GNU Autotools have some strict rules and conventions (although it is possible, but not advisable, to get around them): source files conventionally live in the src/ subdirectory, and some files are required in the root directory of the project. We'll find out which ones those are later on.

Let's get started by creating our initial project directory layout:

#Go into an empty directory
mkdir src
cd src
cp /foo/bar/test-signal.gob ./
cp /foo/bar/test-signal-test.c ./
cd ..

Initial task: once more, figure out what the required dependencies are. As mentioned in the first part, we need gobject-2.0 and glib-2.0. Next to this, we need a working C compiler.

We can now write a configure.in file in the root dir of our project, which will be processed by autoconf to generate our ./configure script. Here's what it could look like, comments (starting with "dnl") inline:

dnl Register ourselves to autoconf, giving the main source file
AC_INIT(src/test-signal-test.c)

dnl Init Automake, giving the package name and version
AM_INIT_AUTOMAKE(TestSignal, 0.1)
dnl Enable maintainer mode (debugging flags etc)
AM_MAINTAINER_MODE

dnl Check whether we got a good C compiler. Variable "CC" will be defined and expanded in the .in files
AC_PROG_CC

dnl GOB2 macro, to check whether gob version >=x.y.z (here >=2.0.0) is found. Variable "GOB2" will be substituted/expanded
GOB2_CHECK([2.0.0])

dnl Use the PKG_CHECK_MODULES macro (shipped with pkg-config, picked up by aclocal) to query pkg-config. The first parameter is a variable prefix we'll use later on, the second is the package to check for (with an optional minimal version), the third is what to do if the package is found, the fourth what to do if not
PKG_CHECK_MODULES(GLIB, glib-2.0, have_glib=true, have_glib=false)
if test "x${have_glib}" = "xfalse" ; then
        AC_MSG_ERROR([No Glib package information found])
fi
dnl So glib-2.0 is found. Remember the first parameter in the previous command, GLIB? Well, GLIB_CFLAGS now contains the output of `pkg-config --cflags glib-2.0`, same thing for GLIB_LIBS with --libs instead of --cflags
dnl AC_SUBST tells configure to substitute the given value in the provided .in files
AC_SUBST(GLIB_CFLAGS)
AC_SUBST(GLIB_LIBS)

dnl Same thing for gobject-2.0
PKG_CHECK_MODULES(GOBJECT, gobject-2.0, have_gobject=true, have_gobject=false)
if test "x${have_gobject}" = "xfalse" ; then
        AC_MSG_ERROR([No GObject package information found])
fi
AC_SUBST(GOBJECT_CFLAGS)
AC_SUBST(GOBJECT_LIBS)

dnl Here we tell the configure script which files to *create*, so we leave out the .in part
AC_OUTPUT([
        Makefile
        src/Makefile
])

Makefile.in and src/Makefile.in don't exist yet; they will be created by automake later on.

This file is a very simple one; creating complex configure.in files can be a tedious job :-/

Next thing: Makefile.am files, which will be processed by automake to create the Makefile.in's.
In the project root dir, this file can be very simple:

SUBDIRS = src

We just define the subdir(s) of the current dir.
src/Makefile.am is a little more complex:

INCLUDES = $(GLIB_CFLAGS) $(GOBJECT_CFLAGS)

bin_PROGRAMS = test-signal

test_signal_SOURCES = test-signal.c test-signal-test.c

test_signal_LDADD = $(GLIB_LIBS) $(GOBJECT_LIBS)

%.c %.h %-private.h: %.gob
        @GOB2@ $<

Some explanation:

  • "INCLUDES" is a variable that will be added to every compile call to "$(CC)"; here we only need the glib and gobject includes. Remember these values will be set by the configure script.
  • "bin_PROGRAMS" is a variable defining the names of all targets we want to be installed as an executable in the bin directory (/usr/local/bin if no prefix is given to ./configure)
  • "test_signal_SOURCES" is a variable defining which files are needed to make the target "test-signal". Notice "-" being replaced with "_" here, which is a common thing in automake files.
  • "test_signal_LDADD" defines which extra parameters to pass to the linker when linking the object files into the "test-signal" executable. We could have used a global "LDADD" variable here, like "INCLUDES", or made the compile flags target-specific instead (e.g. "test_signal_CFLAGS"). In large projects this can make a big difference.
  • The last part is the rule we also used in our hand-written Makefile. Automake copies any non-automake-specific content in Makefile.am verbatim into the resulting Makefile.in, so make will know how to build the .c and .h files from a .gob file.

Now everything is done. At least, almost ;-) We still need to call our autotools scripts.
It's easy to do this from a shell script. Most projects call this script "autogen.sh", so we will too. Such a script can be fairly complex; ours will be braindead simple:

#!/bin/bash
aclocal
autoconf
automake -a

As you can see, we first call aclocal (part of the automake package), then autoconf, then automake with the -a (--add-missing) flag.
Why aclocal? Autoconf, which creates a configure script out of a configure.in file, uses M4, a macro processor, to do its work. aclocal gathers the definitions of the non-standard macros we use (PKG_CHECK_MODULES, GOB2_CHECK, the AM_* macros) into aclocal.m4, where autoconf can find them.

Run this script, or enter the commands by hand. aclocal can print a lot of warnings; don't worry about them.
If this is the first time you run the script, you will see some automake errors at the end, and the script execution will fail:

Makefile.am: required file `./NEWS' not found
Makefile.am: required file `./README' not found
Makefile.am: required file `./AUTHORS' not found
Makefile.am: required file `./ChangeLog' not found

These are the required files I mentioned earlier. We'll just touch them for now so they "exist", even though they contain no useful information:

touch NEWS README AUTHORS ChangeLog

Now rerun the autogen.sh script; this time everything should pass.

If you take a look now, you'll see that a couple of Makefile.in files have been created (which will be converted into Makefiles by configure, remember?), as well as the configure script.

Let's put our work to the test and run ./configure. You'll see the usual output, and everything should pass fine (we don't have a lot of prerequisites :-))

Now let's see whether our project builds fine:

make

Yay, lots of compiler commands, no errors. Fun :-)

To finish, test whether the executable works:

cd src
./test-signal

The same output as before appears; we're members of the Autotools User Group now ;-)
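And with that, the "normal" FOSS routine mentioned at the beginning works too; roughly, someone receiving the project would now do (the prefix is just an example):

./configure --prefix=/usr/local
make
make install        # installs test-signal into /usr/local/bin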

That's it for now, more stuff should follow: "The Glib mainloop", "Debugging using GDB", and a from-scratch "program" we'll create. I only lack the time to write everything ;-)

I'd like to convert these articles to some format so they can be used outside this blog (converted to PDF, HTML,...). Docbook seems to be a good format for this; I need to get used to it first, though. If someone knows a good Docbook editor (besides Conglomerate), please let me know.

Feb 1
I'm amazed

I just found this article on distrowatch.com, and I'm amazed. This is *so* unreal.
If you're too lazy to read it, or just want to know why you should: it looks like CCux Linux included Ivman in their latest development release, by default.
Quoting the Release Notes:

(But) This Version has some nice new Features too. Supermount doesn't handle the automatically Device mounting. In Fact, this is now handled from dbus 0.23, hal 0.4.7 and ivman 0.5. This Combination does this Job much better when mounting CD/DVD Drives or USB Sticks, Kameras or other Things. Therefore it doesn't need anymore User Interaction to get all Drives running.


This is some great news, isn't it? Mostly thanks to Rohan lately, I guess :-) YOU ROCK

Next to this: as promised in my GObject signal handling article, I wrote a Makefile for it, and even did the autotools stuff. I'll write an article on all that this evening, or tomorrow if possible.

PS. Good news: the exam went fairly well, I think. Let's hope for the best tomorrow.

Feb 1
New category

Just made a new blogging category, "Coding Corner". I've got some ideas for a series of articles on Gnome/Glib coding, building on GObjects like in yesterday's article, but also introducing some basic GTK+ coding, daemonizing, autotools (autoconf/automake/autoheader) and more. It will be a little "Hello world" program, which could *normally* be done in about 10 lines of code, but will be a bit more sophisticated here. I hope I'll be able to start writing the first article (about the Glib mainloop) tomorrow afternoon.
I'm looking for a format to write the articles in, so I'll be able to generate PDF documents from them later on, etc. Maybe I should take a look at Docbook? LaTeX isn't really suited here.
Of course the CUPS PDF writer could do its job too ;-)

Got an exam at 14:00 and tomorrow morning, wish me luck, I need it.
