-- 
+-----------------------------------+---------------------------------+
| Tom Webster                       | "Funny, I've never seen it      |
| SysAdmin MDA-SSD ISS-IS-HB-S&O    |  do THAT before...."            |
| webster_at_ssdpdc.lgb.cal.boeing.com |  - Any user support person   |
+-----------------------------------+---------------------------------+
| Unless clearly stated otherwise, all opinions are my own.           |
+---------------------------------------------------------------------+

Also from Alan:

I looked at the ltf(5) man page (format of ANSI labeled tapes) on an
ULTRIX system, and it looks like the maximum file size for an ANSI tape
is around 19 GB. It seems to have a maximum record size of 20480 bytes
with 999,999 records.

The other choice would be to simply use dd(1) to write the files to
tape at some appropriate block size. Use the no-rewind device in
between files and file size shouldn't be an issue.

For cataloging, you might try writing a small tar archive for each
large file and put those archives on the tape before each of the large
ones:

% echo "A description of 'bigfile'" > bigfile.txt
% ls -las bigfile >> bigfile.txt
% tar cbf 128 bigfile.tar bigfile.txt
% dd if=bigfile.tar of=/dev/nrmt0h bs=64k
% dd if=bigfile of=/dev/nrmt0h bs=64k

----------------------------------------------------------------------

From: belonis_at_dirac.phys.washington.edu

I don't know about tar file sizes, but you can always use dd to write
files to tape. It may or may not have file size limits of its own, and
it will not include directory information with the file, so write the
ls -l output as a separate file.

ORIGINAL POST:

Managers,

I am installing DEC patch c970426-1401.tar.Z to fix a problem with the
tar utility. The README indicates that the patch will make tar handle
files greater than 4 GB correctly, and it also seems to imply that the
maximum size of a file in a tar archive is 8 GB.

Excerpt from README:

    Modify tar to correctly handle a file whose size exceeds 4GB.
    Previously, this would quietly cause corruption of the fileheader,
    thus producing a useless tarset (a tar extract would not be able
    to read the tarset). Additionally added code to detect attempts to
    exceed the maximum size of a file (8GB as described by the tar
    fileheader), and to warn when it is necessary to truncate a file
    that is too large.

Am I reading this correctly? Is 8 GB really as big as you can get with
tar? What about gnutar? Does it suffer the same limitation? I have to
have a tool which can write individual files of up to 16 GB to tape.
If tar is not the answer, I would appreciate any suggestions.

TIA,

susrod_at_hbsi.com - consistency is the defense of a small mind

Received on Thu Mar 05 1998 - 19:31:55 NZDT
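
A note on the 8 GB figure in the README excerpt above (this is general
tar format background, not something from the patch kit): the classic
tar file header stores each member's size as an 11-digit octal field,
so the largest size a single header can describe is octal 77777777777
bytes. A quick check of the arithmetic with bc(1):

% echo 'ibase=8; 77777777777' | bc
8589934591

That is one byte short of 2^33 (8 GB), which matches the README's
warning: a 16 GB file simply cannot be represented by one standard tar
file header, patched or not. Whether gnutar avoids the limit depends on
whether it writes a nonstandard size encoding; the standard header
field itself stops just under 8 GB.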
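
For completeness, a rough sketch (not from the original replies) of
reading data back under Alan's dd/catalog scheme above, reusing the
/dev/nrmt0h no-rewind device and 64k block size from his write example;
the device name and block size will vary by drive:

% mt -f /dev/nrmt0h rewind
% dd if=/dev/nrmt0h of=bigfile.tar bs=64k
% tar xvf bigfile.tar
% dd if=/dev/nrmt0h of=bigfile bs=64k

The first dd reads tape file 1 (the small catalog archive) and tar xvf
recovers bigfile.txt with the description and ls -las output; the
second dd reads the large file itself. To move past a catalog/data pair
without reading it, mt -f /dev/nrmt0h fsf 2 skips forward two tape
files so the next dd starts at the following catalog.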