Fwd: [cvsnt] Re: Performance problem with large files

Tony Hoyle tmh at nodomain.org
Thu Oct 21 15:40:23 BST 2004


michel.leclerc at b-rail.be wrote:
> Tony,
> 
> It seems that network speed is not the bottleneck. I've tried at
> 10 Mbit, 100 Mbit and gigabit speeds and it does not make a big difference.
> These large files are binary files and I don't understand why they require so
> much CPU. Why does CVS require so much memory to handle these files?
> How much memory do I need to be comfortable?

It depends on the amount of activity going on at the same time, whether 
you're checking out HEAD, etc.

A 270MB file will, at a minimum, use twice that much memory during 
checkout, plus the disk space for a temporary copy, before the file is 
sent to the client.

If you're not checking out HEAD, you can double that memory requirement 
at least.  If multiple users are going to be checking out 
simultaneously, multiply the whole thing by the number of users.
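
As a rough back-of-envelope sketch of that arithmetic (the doubling 
factors below are illustrative assumptions taken from the paragraphs 
above, not measured figures):

# Rough estimate of peak server-side memory for checking out one large
# binary file.  All factors here are illustrative assumptions.
def estimate_memory_mb(file_mb, head=True, users=1):
    per_user = file_mb * 2          # file plus working buffer during checkout
    if not head:
        per_user *= 2               # extra cost of rebuilding an older revision
    return per_user * users         # simultaneous checkouts multiply the total

print(estimate_memory_mb(270))                       # ~540 MB: one user, HEAD
print(estimate_memory_mb(270, head=False, users=4))  # ~4320 MB: four users, old revision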

A version control system doesn't simply store the files on disk; it has 
to calculate differences between versions and (during checkout) 
reconstruct the requested revision.  The RCS file format also requires 
some processing of the stored text (for example, literal @ characters 
are doubled), so you don't entirely escape that cost even for HEAD.  The 
faster the CPU the better for this kind of thing - no matter how 
efficient the algorithm, a file of that size is going to take a 
significant time to process.
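
To illustrate the reconstruction step, here is a minimal sketch in 
Python of reverse-delta storage in the style RCS uses: the newest 
revision is kept in full and older revisions are rebuilt by applying 
stored edit scripts.  This is only an illustration, not CVSNT's actual 
code, and the delta format shown is made up for the example.

# Minimal sketch of reverse-delta reconstruction (illustrative only).
def apply_delta(lines, delta):
    """Apply a reverse delta made of (position, lines_to_delete, lines_to_insert)."""
    out = list(lines)
    for pos, ndel, ins in reversed(delta):   # work from the end so positions stay valid
        out[pos:pos + ndel] = ins
    return out

head = ["line 1", "line 2 (new)", "line 3"]          # newest revision, stored in full
deltas = {"1.1": [(1, 1, ["line 2 (old)"])]}          # edit script back to revision 1.1

print(apply_delta(head, deltas["1.1"]))   # ['line 1', 'line 2 (old)', 'line 3']

Every revision rebuilt this way has to be held in memory in full, which 
is why the cost grows with the size of the file rather than the size of 
the change.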

You may be better off using an alternative method of storing those 
files, as you're quickly going to end up with RCS files several GB in 
size, which makes backing them up difficult.

Tony


