large files

MCCALL,DON (HP-USA,ex1) don_mccall at hp.com
Wed Oct 17 12:19:12 GMT 2001


OK,
I don't have time to look at this right now, but from the data you supply it
certainly sounds like an internal storage issue:
Since you can copy the file ONTO the fs that samba is sharing out, obviously
your fs can handle a file that large.
But the concern is that some of your OS utilities, like stat, are NOT
reporting the size correctly. This suggests to me that a system call Samba
uses to return file/filesystem information to the PC in response to an SMB
request is not up to the job, and so the program on NT concludes that it
cannot create a file any bigger than 4 or 2 gigabytes.
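One quick way to check that on the Samba box itself is a tiny test program
along these lines (just a sketch; the file name and path are only examples,
substitute your 22 GB .bkf file):

/* stat_check.c - does plain stat() cope with a huge file?
 * Build it twice and compare:
 *   cc -o stat32 stat_check.c
 *   cc -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 -o stat64 stat_check.c
 * With a 32-bit off_t, stat() on a file bigger than 2 GB fails with
 * EOVERFLOW ("Value too large for defined data type"), the same error
 * stat(1) and mc are reporting below.
 */
#include <sys/types.h>
#include <sys/stat.h>
#include <stdio.h>
#include <string.h>
#include <errno.h>

int main(int argc, char **argv)
{
    struct stat st;
    const char *path = (argc > 1) ? argv[1] : "/backup/huge.bkf"; /* example */

    if (stat(path, &st) != 0) {
        fprintf(stderr, "stat(%s) failed: %s\n", path, strerror(errno));
        return 1;
    }
    printf("%s: %lld bytes (sizeof(off_t) = %d)\n",
           path, (long long)st.st_size, (int)sizeof(off_t));
    return 0;
}

If the 32-bit build fails and the 64-bit build works, the OS and filesystem
are fine, and the question becomes whether the Samba binaries (and your
utilities) were built with large file support.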
If you turn on debugging during the ntbackup run and save off the debug file
as soon as the ntbackup program dies, we might be able to see exactly which
SMB was being processed when it failed. That would tell us which system call
Samba was making to supply that information, and we would have a better idea.
BUT the fact that stat and friends are not working properly on your system
is, IMHO, the most likely culprit. You might want to check the patches and
fixes in later revisions of the OS to see whether fstat, stat, or other
system calls have a reported problem that has since been fixed...
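As a sketch of the relevant smb.conf settings for capturing that trace (the
log path is only an example, and level 10 is very verbose, so watch disk
space):

[global]
    log level = 10
    log file = /var/log/samba/log.%m
    max log size = 0

Restart (or HUP) smbd, run the ntbackup job until it dies, and keep the log
for that client machine; smbcontrol can usually raise the debug level of a
running smbd as well, if you would rather not restart.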
Hope this helps,
Don

-----Original Message-----
From: Ivan Fernandez [mailto:ivan at vyb.com]
Sent: Wednesday, October 17, 2001 2:49 PM
To: samba at lists.samba.org
Subject: Re: large files



Hi Joseph, 
  
    please read the message in full. As I said, I can copy the 22 GB file
onto the Samba server with no problems. (I'm using ReiserFS 3.6).


Cheers! 
-----Original Message-----
From: samba-admin at lists.samba.org [mailto:samba-admin at lists.samba.org]
On Behalf Of Joseph Loo
Sent: Wednesday, October 17, 2001 15:30
To: samba list
Subject: Re: large files


I am not sure, but you might be running into the limitations of the ext2 file
system. I believe it has a 4 GB limit on a single file. You might have to
consider another file system for your large files.

Ivan Fernandez wrote: 

I'm reposting this problem (perhaps a bug) now that I've got more information
on it. This is another point of view of the situation, and I hope someone has
run into the same trouble before (and solved it :-))

This is it: 

        * With ntbackup 2000 I create a 22 GB .bkf file on the Windows
machine.

        * I can copy that file over a Samba share and get correct info from
the file in Windows Explorer.

        * ls -l also returns correct info, *WHILE* stat, mc, and other
programs fail with an error about a value too large for the defined data
type.

        * If I try to create the file with ntbackup directly on the share, it
starts at 0 bytes and grows slowly, and ntbackup dies when the file crosses
the 4 GB mark (exactly).

        * I have compiled version 2.2.2 of Samba myself and, surprisingly,
the 4 GB "limit" described above came down to exactly 2 GB (see the note
after this list).
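
        (A note on those exact figures:
           2^31 bytes = 2,147,483,648 (the 2 GB mark), the largest offset a
           signed 32-bit off_t can hold;
           2^32 bytes = 4,294,967,296 (the 4 GB mark), the largest value an
           unsigned 32-bit counter can hold;
         so any code path handling file offsets in 32 bits would be expected
         to break at exactly these sizes.)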

Any idea what's happening, please?

Thanks to all for reading (and even more thanks if someone responds!)



-- 
Joseph Loo 
jloo at acm.org 
