
Re: [olpc-software] On installing software



OK, so here are some more thoughts :)

Firstly, a couple of cosmetic issues - why LaptopSoftwareManager rather than a more generic name? It might be useful elsewhere (though I appreciate that this raises interesting political issues). The ".app" extension is already in use by MacOS X and ROX Filer, and people who wish to distribute cross-platform software on the same media might find that a bit annoying. A more generic extension might also be better, since such bundles could hold themes, screensavers, wallpaper packs, etc - stuff that is "software" but not an application. Maybe .software, .bundle, .box, etc.?

I question the choice of /usr/local over /usr for software :) Historically, installing "non-distro stuff" into /usr has prompted massive flamewars, IMHO out of all proportion to its importance given that it's just a file path. Sticking with /usr/local, on the other hand, means adapting possibly large numbers of programs and even libraries that expect the /usr prefix. Why not use /usr? The daemon doing the symlinking can ensure nothing is overwritten or modified, and it solves the problem of needing to modify lots of software.
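To illustrate the "nothing gets overwritten" guarantee, here's a minimal sketch in Python (the function and path names are mine, not part of either proposal): the daemon only ever creates new symlinks and refuses to touch anything that already exists under /usr.

    import os

    def expose(appdir_file, usr_path):
        # Expose one file from an appdir under /usr without ever touching
        # existing, distro-owned files: refuse rather than overwrite.
        if os.path.lexists(usr_path):
            raise FileExistsError("refusing to clobber %s" % usr_path)
        os.symlink(appdir_file, usr_path)

    # e.g. expose("/apps/Foo.app/share/icons/foo.png",
    #             "/usr/share/icons/hicolor/48x48/apps/foo.png")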

PINs? Why PINs over more easily remembered passwords?  :)

The problem of malware, stopping programs from installing software themselves, etc., is a ridiculously deep one .... why would a program need to trigger the installation of another program if it can just include the malware itself, for instance? Why not just screw around with .bashrc, the session manager, or GConf? To prevent applications from installing other applications (which might be a desirable feature in some cases!) I don't think we need any user interaction - D-Bus security policy can express things like "Nautilus can whitelist apps, the 'Manage 3rd Party Software' applet can whitelist apps, nothing else can". X security can stop programs trying to "remote control" other apps (though there are other possible attacks).
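To make the whitelist idea concrete, here's a rough Python/dbus-python sketch (the executable paths, the whitelist, and the helper names are all hypothetical - in practice most of this would live in the bus policy rather than in code): a privileged install service asks the bus daemon who is calling it and only honours requests from the blessed frontends.

    import os
    import dbus

    WHITELISTED_FRONTENDS = {
        "/usr/bin/nautilus",                      # hypothetical paths
        "/usr/bin/manage-3rd-party-software",
    }

    def caller_executable(bus, sender):
        # Ask the bus daemon which process owns the caller's unique name,
        # then resolve that PID to an executable via /proc.
        proxy = bus.get_object("org.freedesktop.DBus", "/org/freedesktop/DBus")
        iface = dbus.Interface(proxy, "org.freedesktop.DBus")
        pid = iface.GetConnectionUnixProcessID(sender)
        return os.readlink("/proc/%d/exe" % pid)

    def may_install(bus, sender):
        # Only whitelisted frontends may trigger an installation.
        return caller_executable(bus, sender) in WHITELISTED_FRONTENDS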

File conflicts are a tricky problem, but one that's important to deal with IMHO, given that there would be no sysadmins around to fix things when they go splat :) My proposal involved a "stack" of software, but that's more an artifact of how union mounts work than anything else. If you look at the files that actually need to be linked into /usr/local, they are only the files important for desktop integration - icons, menu entries, online help and so on. The applications' private files don't matter. In many cases the actual file names on disk don't matter either; for instance, .desktop file names can be totally randomized and GNOME doesn't care. So the daemon could easily randomize some filenames as they are symlinked, to reduce the problem of conflicts. Library sonames are a much harder issue, but that's what the platform is for. And executable names probably don't matter either, as they are only meaningful to humans for end-user apps, so they could be tagged with -1, -2, -3 etc by the daemon (or not symlinked at all, leaving the user to tab-complete the path into the appdir).
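For those desktop-integration files, the conflict handling really can be that dumb. Here's a sketch (Python; the function name and example paths are invented) of what the daemon might do when two apps both ship a foo.desktop:

    import os
    import uuid

    def link_with_unique_name(src, target_dir):
        # Symlink src into target_dir; if the name is already taken, pick
        # a random one instead.  GNOME only cares about the .desktop
        # suffix, not the basename, so the menu entry still works.
        name = os.path.basename(src)
        dest = os.path.join(target_dir, name)
        if os.path.lexists(dest):
            stem, ext = os.path.splitext(name)
            dest = os.path.join(target_dir,
                                "%s-%s%s" % (stem, uuid.uuid4().hex[:8], ext))
        os.symlink(src, dest)
        return dest

    # e.g. link_with_unique_name("/apps/Foo.app/foo.desktop",
    #                            "/usr/share/applications")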

On dependencies - I'm not sure duplicating a package manager's dependency checking logic in such a daemon would be a good idea. There are several models of dependency checking, some radically different from others. For instance, autopackage has a very rich dependency checking API which scans the system looking for clues as to what interfaces are actually implemented, autoconf-style. That's very different from something like dpkg! One way to sidestep this is to say that OLPC apps cannot have [library] dependencies at all; this is what MacOS X does, and its end users love it because it's so simple. On the other hand it can be inefficient, and we have tighter efficiency constraints and more modular libraries.
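To show the flavour of an interface-based check, here's a Python/ctypes approximation (this illustrates the idea only, it is not autopackage's actual API): instead of comparing package names and version numbers, you probe what the installed system can actually do.

    import ctypes
    import ctypes.util

    def provides(libname, symbols):
        # Does some installed library actually export the entry points we
        # need?  This is the autoconf-ish "look for clues" style of check.
        path = ctypes.util.find_library(libname)
        if path is None:
            return False
        lib = ctypes.CDLL(path)
        return all(hasattr(lib, sym) for sym in symbols)

    # e.g. "is there a GTK+ that gives us the file chooser API?"
    print(provides("gtk-x11-2.0", ["gtk_file_chooser_dialog_new"]))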

What to do about dependencies is connected to how appdirs are distributed. You suggest a .app.tar.gz format, but:

a] IMHO .tar.gz is a poor archive format :) You can't seek straight to a given file the way you can with .zip (see the small example after this list), and gzip compresses noticeably worse than LZMA.

b] You'd end up reinventing RPM/dpkg but using META-INF entries instead of RPM headers or dpkg control files (a .deb is basically just an ar archive wrapping a couple of tarballs anyway), and not gain much.

c] Such a thing would be OLPC-specific, losing the efficiencies of the mass market and making it harder for Joe Linux Developer to contribute (he'd need to take time to build a package for a platform he probably doesn't have).
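On point a], the random-access difference is visible in a couple of lines of Python (the file names here are made up): a .zip's central directory lets you pull out a single member without decompressing anything else, whereas a .tar.gz has to be streamed from the start.

    import zipfile

    # Grab just the metadata entry out of a bundle without unpacking it
    # all; the central directory at the end of the .zip makes this cheap.
    with zipfile.ZipFile("Example.app.zip") as z:
        manifest = z.read("META-INF/manifest.xml")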

I think there's nothing wrong with representing apps as appdirs internally; that'd be quite convenient, and when you know you're swapping stuff with another OLPC laptop user you can of course just transfer the appdir directly. I'm less convinced it's the right format for general-purpose distribution. Apple took this approach and ended up with piles of hacks (internet-enabled DMGs, "safe" content, etc), and they don't even have to worry about a fragmented distro space or dependencies! Nowadays much of Apple's own software comes in its .pkg format, which is a simplistic equivalent of RPM.

So what I'd propose is some intermediate format that is recognised as an AppDir and converted into an _actual_ expanded appdir as it's installed by LaptopSoftwareManager. That appdir can then be moved around, deleted, shared over the network, sent to other users, etc. as wanted. The intermediate format could be anything - rpm, deb, autopackage, zip file, tarball .... this email is a bit long already, but here's what I think the selection criteria should be:

* Already existing. Seriously, a container for software is not a very complex thing; the hard parts are managing its interaction with other containers and what goes inside the container :) Any new format would have to reinvent most of that, so there'd need to be a clearly demonstrated gain.

* IMHO it should work for users of "normal" Linux distributions too. It's just a gut feeling, but I think there would be more software usable on the laptop if developers could simply be told "do what you normally do, but faster and with fewer resources" rather than being required to set up VMs, complex SDKs, etc. That maximizes their return on investment, i.e. the more users they can reach with their work, the better.

* It should be easy to evolve as, e.g., better compression algorithms are invented, the platform is modified, and mistakes are made that must be worked around ....

* It should be nice for end users and developers, with good tools, GUIs, etc.

I'd argue that autopackage meets all those criteria, no surprises there, but it could be argued that other formats would too.

thanks -mike

