[Date Prev][Date Next]   [Thread Prev][Thread Next]   [Thread Index] [Date Index] [Author Index]

Re: A more efficient up2date service using binary diffs

Indeed there are several approaches to this problem, all with their
upsides and downsides. I'm currently treating the cpio archive as one
big file, and you're probably right -- treating it as a file system may
yield better results. I'll add it to my todo list.
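To make the whole-archive vs. per-file trade-off concrete, here is a minimal Python sketch (not the actual tool under discussion) that uses zlib's preset-dictionary feature as a stand-in for a real binary-delta algorithm: compressing the new data with the old data as dictionary makes shared content nearly free. The toy "archive" is just a dict of member names to bytes; the helper names are my own.

```python
import zlib

def delta(old: bytes, new: bytes) -> bytes:
    # Compress `new` with `old` as a preset dictionary; content shared
    # between the two versions is then encoded as cheap back-references.
    c = zlib.compressobj(zdict=old)
    return c.compress(new) + c.flush()

def apply_delta(old: bytes, patch: bytes) -> bytes:
    # Reverse of delta(): decompress the patch with the same dictionary.
    d = zlib.decompressobj(zdict=old)
    return d.decompress(patch) + d.flush()

# Toy "cpio archive": member name -> contents, two package versions.
old_pkg = {"bin/foo": b"A" * 1000 + b"old code", "doc/README": b"docs v1"}
new_pkg = {"bin/foo": b"A" * 1000 + b"new code", "doc/README": b"docs v2"}

# Approach 1: treat the archive as one big file and diff the whole blob.
old_blob = b"".join(old_pkg.values())
new_blob = b"".join(new_pkg.values())
whole_patch = delta(old_blob, new_blob)

# Approach 2: treat it as a file system and diff each member separately.
per_file_patches = {name: delta(old_pkg[name], data)
                    for name, data in new_pkg.items()}

# Both approaches must reconstruct the new version exactly.
assert apply_delta(old_blob, whole_patch) == new_blob
for name, patch in per_file_patches.items():
    assert apply_delta(old_pkg[name], patch) == new_pkg[name]
```

Per-file deltas stay aligned when members are inserted, removed, or reordered, which is where the whole-blob approach tends to lose.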

Right now I'm working on a proof-of-concept prototype: a proxy
web server (or FTP server) running locally (on the client or on the
client's LAN) which will pretend to be a full repository. In reality
it will try to serve as many requests as possible by downloading
deltas and applying them to locally stored RPMs from the original
distribution. Naturally, it will also cache anything that is downloaded.
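The request flow described above can be sketched in a few lines of Python. This is a minimal sketch, not the prototype itself: `fetch_delta`, `fetch_full`, and `apply_delta` are hypothetical hooks standing in for the upstream HTTP/FTP fetches and the real patch tool.

```python
def serve(path, cache, local_rpms, fetch_delta, fetch_full, apply_delta):
    """Resolve one repository request, preferring deltas over full downloads.

    cache       -- dict of path -> bytes already served once
    local_rpms  -- dict of path -> bytes from the original distribution
    fetch_*     -- hypothetical upstream hooks; fetch_delta returns None
                   when no delta is published for this file
    """
    if path in cache:                  # already downloaded once: serve as-is
        return cache[path]
    data = None
    old = local_rpms.get(path)         # do we hold a base RPM to patch?
    if old is not None:
        patch = fetch_delta(path)
        if patch is not None:
            data = apply_delta(old, patch)
    if data is None:                   # no base or no delta: normal download
        data = fetch_full(path)
    cache[path] = data                 # cache whatever was produced
    return data
```

With this shape, yum or up2date just sees an ordinary repository URL; all the delta logic lives behind `serve()`.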

My design criteria are:
1. No modification to the existing software (yum, up2date, etc.)
2. The delta repository must be built and maintained automatically
3. The delta repository must be nothing more than an FTP/HTTP server

Because of #2 and #3 I'm going to keep things as simple as possible
for the moment.


On Mon, 14 Mar 2005 12:40:23 +0100, Thomas Hille
<thomas hille nightsabers org> wrote:

> Maybe doing the diff on the individual files could also help compress
> the rpms that were not compressible using the whole rpm (omni-foomatic
> etc.)
