I apologize if this is partially off-topic, but the osf-managers list is usually
so prompt in its replies ...
I am in the process of archiving on CD-ROM a quantity of satellite
astronomical data that currently sits on the disks of our three Alphas (the
Alphas are dedicated to a project, and I am using more than 8 GB for these
data). I already have the raw data on cassette, but since I corrected a lot
of errors introduced into the original telemetry by the ground segment people,
I want to make a new archive, and I want it on a direct-access medium like
CDs, not on cassettes again, for ease of later use.
The software I use to write CDs is Elektroson's GEAR 3.2, and it runs on a
Sun, to which the CD writer is attached. Our Suns and Alphas share (or may
share) disks via NFS.
The procedure I am using is the following :
1- I make sure the Sun mounts all Alpha disks with my data
2- I enter the msgen (GEAR) program and create a virtual image.
It is my intention that the CD has the following structure :
   / ---+--- SVP -------+--- Opnnn ----- data files
        |               +--- Opnnm ----- ...
        |               .....
        +--- targets ---+--- target1 ---+--- date1 --- Op...
                        |               +--- date2 --- ...
                        .....
   i.e. one directory (SVP) contains one subdirectory Opnnn per "run" with
   the data files, while the other (targets) contains a directory and
   subdirectory organization per target (celestial object) and date,
   with a soft link to ../../../SVP/Opnnn for each data subdirectory.
   I make sure that these links exist as they should on the magnetic
   disks of the Alphas (see the pre-flight check sketched after this list).
3- I continue this until the size of the data in the virtual image
   approximates the capacity of a CD (I use ISO 74-minute CDs, which
   should give me some 650 MB)
4- I then create a physical image with "physvol"; for reasons of space
   this is placed on another Alpha disk seen over NFS by the Sun
5- I then test-write the physical image to the CD unit (no physical
burning of the disk, just testing the throughput)
6- if the above succeeds, I write the physical image to the CD
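(What I mean by "making sure" in steps 1 and 2 is roughly the following
pre-flight check, run from the Sun before entering msgen. This is only a
minimal sketch in Python; the mount points and directory names are
hypothetical placeholders, not the real ones.)

    import os

    # Hypothetical NFS mount points of the Alpha data areas on the Sun,
    # and a hypothetical staging tree for one CD.
    MOUNTS  = ["/alpha1/svpdata", "/alpha2/svpdata", "/alpha3/svpdata"]
    CD_TREE = "/alpha1/svpdata/cd001"

    def check_mounts(mounts):
        """Fail if an expected NFS mount point is missing or empty."""
        for m in mounts:
            if not os.path.isdir(m) or not os.listdir(m):
                raise SystemExit("mount point missing or empty: %s" % m)

    def dangling_links(tree):
        """Return (link, destination) pairs for soft links under targets/
        that do not resolve to an existing ../../../SVP/Opnnn directory."""
        bad = []
        for dirpath, dirnames, filenames in os.walk(os.path.join(tree, "targets")):
            for name in dirnames + filenames:
                path = os.path.join(dirpath, name)
                if os.path.islink(path) and not os.path.exists(path):
                    bad.append((path, os.readlink(path)))
        return bad

    check_mounts(MOUNTS)
    for path, dest in dangling_links(CD_TREE):
        print("dangling link: %s -> %s" % (path, dest))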
I have encountered the following problems (I did not have problems in the past
in preparing a 575 MB CD, but at the time we had a different NFS arrangement;
a colleague of mine made several CDs, all of smaller capacity, without
problems) :
I started with a data size of more than 600 MB, and systematically obtained
a "50h" error (the manual says just "WRITE APPEND ERROR indicates that an
append by a write command during writing failed") during the test step
(number 5).
Since my colleague said that a CD cannot be completely full, I decreased
the quantity of data to 596 MB, and remade the physical image. This
passed step 5, so I wrote the disk.
This failed again with error 50h during step 6, close to the end.
The resulting disk was unfinished. I "fixated" it and was able to mount
the CD both on Sun and Alpha. However it was clear that the CD was
partially corrupt :
- the top directory showed all directories in an "ls", but an
  "ls -aFgls" showed that the "target" subdirectory did not exist.
  The same occurred for a few other subdirectories which were soft
  links (actually, cd-ing to one of them made my Alpha crash-dump !)
- however, the association with soft links seems to be only coincidental.
  I compared each accessible subdir with the original data (diff -r,
  see the comparison sketch after this list) and found that all data
  were copied, with the exception of two (bulky) files, which are
  totally corrupt.
- I note that the links and the directory with the corrupted files
  were created, or moved into place, only recently. It appears that
  CD writing occurs in chronological order of file creation.
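(The diff -r comparison above amounts to something like the following;
a minimal sketch, with hypothetical mount points for the original tree
and for the CD:)

    import filecmp
    import os

    ORIG = "/alpha1/svpdata/cd001"   # hypothetical original tree on the Alpha disk
    CD   = "/cdrom/cd001"            # hypothetical mount point of the burned CD

    # Walk the original tree and compare every regular file with its copy
    # on the CD, reporting missing or differing files.
    for dirpath, dirnames, filenames in os.walk(ORIG):
        rel = os.path.relpath(dirpath, ORIG)
        for name in filenames:
            a = os.path.join(dirpath, name)
            b = os.path.join(CD, rel, name)
            if not os.path.exists(b):
                print("missing on CD: %s" % os.path.join(rel, name))
            elif not filecmp.cmp(a, b, shallow=False):
                print("differs:       %s" % os.path.join(rel, name))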
We attributed the failure to concurrent activity (somebody using a DAT
on the same SCSI bus as the CD writer), and repeated the writing with
every other user excluded. I obtained the same error in the same place.
We now feel there might be a hard limit on the capacity of the CD (possibly
also depending on the size of the files and the way they fill disk sectors:
my colleague archived only big images of the same size, while my directories
mix big or medium data files, depending on the observation length, with
short ASCII files). In particular we note that :
- the size of the physical image for 596 MB of net data is
  reported as 624951296 bytes (the size reported by GEAR equals
  the size of the file on disk), inclusive of overheads.
- the write error occurred when 600977408 bytes had been written to
  the disk
- the largest CD we wrote in the past was 602931200 bytes.
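(As a back-of-envelope check I compared these numbers with the nominal
capacity of a 74-minute disc, assuming the usual 75 sectors per second and
2048 bytes of user data per sector; GEAR may well reserve additional space,
so the nominal figure is only an upper bound:)

    # Back-of-envelope capacity check, assuming 75 sectors/second and
    # 2048 bytes of user data per mode-1 sector.
    SECTOR = 2048
    NOMINAL_74MIN = 74 * 60 * 75 * SECTOR    # 681,984,000 bytes (~650 MB)

    image_size = 624951296                   # physical image for 596 MB of net data
    failed_at  = 600977408                   # bytes written when error 50h occurred
    largest_ok = 602931200                   # largest CD we wrote successfully before

    for label, n in [("nominal 74-min", NOMINAL_74MIN),
                     ("physical image", image_size),
                     ("failure point ", failed_at),
                     ("largest so far", largest_ok)]:
        print("%s: %11d bytes = %7d sectors = %6.1f MB"
              % (label, n, n // SECTOR, n / 1048576.0))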
Does this make sense ?
Does anybody have experience with this sort of limit using the GEAR software ?
I have to create some more disks for my 8 GB of data; I do not want to
waste more CD blanks, and at the same time I want to maximize CD filling.
Any suggestion ?
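(For what it is worth, by "maximize CD filling" I mean essentially a
first-fit-decreasing grouping of the Opnnn directories against whatever
per-disc limit turns out to be safe; a rough sketch, with a placeholder
path and limit:)

    import os

    DISC_LIMIT = 590 * 1024 * 1024      # placeholder safety margin, in bytes
    SVP_ROOT = "/alpha1/svpdata/SVP"    # hypothetical path to the Opnnn directories

    def dir_size(path):
        """Total size in bytes of all regular files below path."""
        total = 0
        for dirpath, dirnames, filenames in os.walk(path):
            for name in filenames:
                total += os.path.getsize(os.path.join(dirpath, name))
        return total

    def pack(dirs):
        """First-fit decreasing: returns a list of [used_bytes, [names]] discs."""
        discs = []
        for name, size in sorted(dirs, key=lambda d: d[1], reverse=True):
            for disc in discs:
                if disc[0] + size <= DISC_LIMIT:
                    disc[0] += size
                    disc[1].append(name)
                    break
            else:
                discs.append([size, [name]])
        return discs

    runs = [(d, dir_size(os.path.join(SVP_ROOT, d)))
            for d in sorted(os.listdir(SVP_ROOT))]
    for i, (used, names) in enumerate(pack(runs), 1):
        print("disc %d: %6.1f MB  %s" % (i, used / 1048576.0, " ".join(names)))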
----------------------------------------------------------------------------
Lucio Chiappetti - IFCTR/CNR - via Bassini 15 - I-20133 Milano (Italy)
----------------------------------------------------------------------------
Fuscim donca de Miragn E tornem a sta scio' in Bregn
Che i fachign e i cortesagn Magl' insema no stagn begn
Drizza la', compa' Tapogn (Rabisch, II 41, 96-99)
----------------------------------------------------------------------------
For more info :
http://www.ifctr.mi.cnr.it/~lucio/personal.html
----------------------------------------------------------------------------