Getting stfl working on Fedora 11 x64

July 30, 2009 at 11:14 AM | categories: Linux


Newsbeuter is my preferred way to browse RSS feeds. It's a great CLI app, the 'Mutt of feed readers', and as a mutt user I love it for being so similar. I wanted to use the newest version of newsbeuter because I have recently been on a kick to get all of my apps (screen, vim, mutt, and now newsbeuter) set up with 256 colours.

I read on the development blog that the newest version in the git repo could use 256 colours, so I immediately cloned the repo and started the install process. I was stopped right away at config.sh: it wanted the stfl libraries and couldn't find them, and neither could yum.

Looking into config.sh I saw that it was querying pkg-config for the compilation flags, as a way of checking whether the library was present. It wasn't, so I read further into the notes for the git repo version and saw the bit about needing the latest svn copy of stfl. I checked that out, built it, and installed it onto my machine. But this didn't get me any further in building newsbeuter, because pkg-config was still unable to find libstfl.
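
For reference, that pkg-config check boils down to something like this (my reconstruction, not the literal contents of config.sh):

pkg-config --exists stfl && echo present || echo missing
pkg-config --cflags --libs stfl

If pkg-config can't find a stfl.pc file on its search path, both of these fail, which is exactly what config.sh was tripping over.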

So I decided to figure out what the issue was between stfl and pkg-config. The first thing I saw was that the stfl.pc file, which pkg-config uses to build its responses, was in the wrong place for my system: it was in /usr/local/lib/pkgconfig, instead of with the other libraries' files in /usr/lib64/pkgconfig. Getting it into place resolved the issue of pkg-config not knowing where the pc file for stfl was, and pkg-config now returned actual data about libstfl. Recompiling newsbeuter showed there was still a problem, though; on running it I now received: error while loading shared libraries: libstfl.so.0: cannot open shared object file: No such file or directory.
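
Getting the file into place was a one-liner; I moved it, but pointing PKG_CONFIG_PATH at the extra directory should work just as well:

sudo mv /usr/local/lib/pkgconfig/stfl.pc /usr/lib64/pkgconfig/
# or, leaving the file where it is:
export PKG_CONFIG_PATH=/usr/local/lib/pkgconfig:$PKG_CONFIG_PATH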

I did another updatedb and ran a locate for all stfl files, and found that make install for stfl was putting the libs in a strange place as well. I then updated both Makefile.cfg and stfl.pc.in to reflect the locations of the other libs:

Makefile.cfg changed lines

export libdir ?= lib64
export prefix ?= usr
export DESTDIR ?= /

stfl.pc.in changed lines

prefix=/usr
exec_prefix=/usr
libdir=/usr/lib64
includedir=/usr/include

This got me pretty far, I think, because now the system looked like the other libraries; the stfl.pc lines were taken almost verbatim from the sqlite3.pc file, in fact. I was still getting the shared library error though, and after stepping away from the problem for a bit I realized that the stfl makefile had made libstfl.so.0.21 while the error message was looking for libstfl.so.0. So I took the logical leap and made a symlink named libstfl.so.0 that pointed to libstfl.so.0.21 in the /usr/lib64/ directory. After making sure that this worked, I made a change to the makefile for stfl, adding another line to make a second symlink, the one ending in 0:

Makefile

ln -fs libstfl.so.$(VERSION) $(DESTDIR)$(prefix)/$(libdir)/libstfl.so.0
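
For the record, the manual version of that symlink was just the following; the ldconfig run is my addition, to refresh the runtime linker's cache after the new library appears:

sudo ln -s /usr/lib64/libstfl.so.0.21 /usr/lib64/libstfl.so.0
sudo ldconfig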

I do admit that there may be a better way to do this, and some of it could probably be done with command line config or make flags. I don't know them, but I was able to build successfully with these changes.
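
One candidate, untested: since Makefile.cfg assigns everything with ?=, those variables should be overridable straight from the make command line, with no file edits at all:

make prefix=usr libdir=lib64 DESTDIR=/
sudo make prefix=usr libdir=lib64 DESTDIR=/ install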







Decouple with kwargs

July 22, 2009 at 05:26 PM | categories: Programming, Linux


So I've been attempting to write a suite of CLI scripts for work. I recently discovered the multiprocessing module for Python, really liked its simplicity, and started using it with great success. Everything was faster.

This then spurred me to take the scripts, which were for the most part "copy the common-ish bits and then modify to suit", and turn them into a library of sorts. The neat part arose when I wanted to import an argument parser as well as hand off to a proc-creation component.

In doing this I had in mind that the 'script' would only need to define a function that builds the list of commands to run on a given server, and a __main__ section that would pass in a list of servers, the command-building function, and some other info. That way the script itself would be only two definition sections, containing just the parts that were actually unique.

The problem that came up was when I wanted the command-building function to take more arguments than normal. How would I pass them in, and how would I define them so that I don't have to edit my libraries to accommodate the argument passing? It was kwargs that saved me there, that and some optparse tweaking.

Here is a basic example:

script.py

def get_commands(host, **kwargs):
    command_list = []

    # both values arrive through the kwargs dict the library passes
    # along: user is set in __main__ below, key_file comes from the
    # command line options
    user = kwargs['user']
    key_file = kwargs['key_file']

    command_list.append("echo %s" % user)
    command_list.append("echo %s" % key_file)

    return command_list

if __name__ == '__main__':
    import sys
    import automation

    # process_args returns the positional args as the host list and
    # the parsed optparse options as a plain dict
    (hosts, options) = automation.process_args(sys.argv)

    automation.thread_hosts(
            hosts,
            get_commands,
            options,
            user="test",
            )

automation.py (library w/ functions)

import multiprocessing

def thread_hosts(hosts, get_commands, options=None, **kwargs):
    # fold the command line options into whatever extra kwargs the
    # caller supplied; the library never needs to know the keys
    kwargs.update(options or {})

    jobs = []

    for host in hosts:
        # one process per host, each handed its own command list;
        # run_commands connects to the host and runs the commands
        p = multiprocessing.Process(
                target=run_commands,
                args=(
                    host,
                    get_commands(host, **kwargs),
                    ), )

        jobs.append(p)
        p.start()

    # wait for every host to finish before returning
    for p in jobs:
        p.join()

So this example is a script that defines the function to return a command list, and provides an options dict and a list of hosts. thread_hosts then loops over the hosts, each time passing the host and the freshly built command list to another library function, run_commands, which connects to that host and runs each command in the list.
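
run_commands itself never appears in the post; here is a minimal sketch of what it could look like, assuming each command is run on the host over plain ssh:

import subprocess

def run_commands(host, command_list):
    # sketch only: the real run_commands is not shown in the post;
    # assumes commands are executed on the host over plain ssh
    for command in command_list:
        subprocess.call(["ssh", host, command])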

A part that might be confusing is that process_args returns optparse's options object, but specifically its options.__dict__ representation, i.e. a plain dict. That is what allows me to update kwargs with any options that I allow to be set at the command line; the example in the script being the key_file variable.
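
process_args is also not shown; here is a minimal sketch of the optparse tweaking being described, with the --key-file option name as an assumption on my part:

from optparse import OptionParser

def process_args(argv):
    # sketch only: the real option set is whatever the library allows
    parser = OptionParser()
    parser.add_option("-k", "--key-file", dest="key_file", default=None)

    (options, args) = parser.parse_args(argv[1:])

    # whatever is left after option parsing is treated as the host list;
    # options.__dict__ flattens the Values object into a plain dict so
    # thread_hosts can merge it into kwargs with update()
    return (args, options.__dict__)

With that in place an invocation might look like: python script.py --key-file ~/.ssh/id_rsa host1 host2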

The neat part of all this is being able to take the kwargs for one function and pass them right along to the next. This is key, because it allows the library function to be entirely decoupled from the script itself.

With this implementation I can write a script that defines extra args to use, and only the script needs to know what they are. The library just dumbly passes them along in the kwargs dict; I never have to tell it that I want to pass a user variable through, and that makes each script a nice self-contained unit.






