[DAQ Page]
a)Number of Detectors
    1) Run 19 - 6 BLIPS
    2) Run 20 - Starting 9/1/99 - 6 ZIPS
    3) CDMS-II - Starting 01/01/01 - 18 ZIPS
                 Final               42 ZIPS
    4) Channel Count:
        i)BLIP:      2 Ionization - 5 microsecond rise
                                   50 microsecond fall
                     2 Phonon     - 5 millisecond rise
                                   50 millisecond fall
       ii)ZIP:       2 Ionization - 2 microsecond rise
                                   50 microsecond fall
                     4 Phonon     - 2 microsecond rise
                                   50 microsecond fall
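
    A minimal Python sketch of the channel totals these counts imply (the
    constants and function name are illustrative, not from any CDMS software):

        BLIP_CHANNELS = 2 + 2   # 2 ionization + 2 phonon per BLIP
        ZIP_CHANNELS  = 2 + 4   # 2 ionization + 4 phonon per ZIP

        def detector_channels(n_blips, n_zips):
            """Total digitized detector channels, veto not included."""
            return n_blips * BLIP_CHANNELS + n_zips * ZIP_CHANNELS

        print(detector_channels(6, 0))    # Run 19:           24
        print(detector_channels(0, 6))    # Run 20:           36
        print(detector_channels(0, 18))   # CDMS-II start:   108
        print(detector_channels(0, 42))   # CDMS-II final:   252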

b)Number of Veto Elements
    1) CDMS-I:   26 PMTs, grouped to 13 channels
                 1 ns rise, 10 ns fall, stretched to 1.2 microseconds

    2) CDMS-II:  42 PMTs, grouped to 42 channels
                 1 ns rise, 10 ns fall, stretched to 1.2 microseconds

c)Sampling Rate, Resolution
  (see also  Ray Bunker's Digitizer Page)
    1) CDMS-I:
      * 12 bit, over -X to +Y V range - 2 Bytes/Sample

      * BLIP
          i)Ionization: 1000 samples,  0.5 microsecond/sample, 
                        500 microseconds, 375 microseconds after, 125 microseconds before; 
                        2000 bytes
         ii)Phonon:     2000 samples, 64.1 microsecond/sample, 
                        128 milliseconds, 96 ms after, 32 ms before; 
                        4000 bytes
      * ZIP
         i)Ionization: 10000 samples, 0.125 microsecond/sample, 1250 microseconds,
                       20000 bytes
        ii)Phonon:     10000 samples, 0.125 microsecond/sample, 1250 microseconds,
                       20000 bytes
      * Time History   12500 samples, 0.83 microseconds/sample, 10.4 milliseconds,
                       150000 bytes  (96 Channels, 12 bytes/time slice)

    2) CDMS-II:
      * 12 bit, over -X to +Y V range - 2 Bytes/Sample
      * ZIP
         i)Ionization: 10000 samples, 0.125 microsecond/sample, 1250 microseconds,
                       20000 bytes
        ii)Phonon:     10000 samples, 0.125 microsecond/sample, 1250 microseconds,
                       20000 bytes
       iii)Veto:        450 kilobytes  (a guess, use 3 time history units)
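
    A quick Python check (illustrative only) that the sample counts, sampling
    intervals, and 2 bytes/sample reproduce the window lengths and trace sizes
    quoted above:

        def trace_size(n_samples, usec_per_sample, bytes_per_sample=2):
            """Return (window length in microseconds, bytes per trace)."""
            return n_samples * usec_per_sample, n_samples * bytes_per_sample

        print(trace_size(1000, 0.5))        # BLIP ionization:  500 us,   2000 bytes
        print(trace_size(2000, 64.1))       # BLIP phonon:     ~128 ms,   4000 bytes
        print(trace_size(10000, 0.125))     # ZIP traces:      1250 us,  20000 bytes
        print(trace_size(12500, 0.83, 12))  # time history:   ~10.4 ms, 150000 bytes
                                            # (12 bytes per slice for the 96 channels)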

d)`data' Event Size
    1) CDMS-I :  150 + 12*B + 120*Z kilobytes (B=6, Z=0 get 222 kilobytes)
                                              (B=0, Z=6 get 870 kilobytes)
                B= number of BLIPS
                Z= number of ZIPS

    2) CDMS-II: 450 + 120*Z  kilobytes (Z=42, get 5.5 Megabytes)
                Z= number of ZIPS
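
    The event-size formulas above as a small Python sketch (the function names
    are illustrative only):

        def cdms1_event_kb(n_blips, n_zips):
            # 150 kB time history + 12 kB per BLIP + 120 kB per ZIP
            return 150 + 12 * n_blips + 120 * n_zips

        def cdms2_event_kb(n_zips):
            # 450 kB veto + 120 kB per ZIP
            return 450 + 120 * n_zips

        print(cdms1_event_kb(6, 0))   # 222 kB
        print(cdms1_event_kb(0, 6))   # 870 kB
        print(cdms2_event_kb(42))     # 5490 kB, about 5.5 MB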

e)Number of `slow control/monitoring' bytes/time interval

f)Trigger Rate (including calibrations)

   1)Calibration: let's guess, for ZIPs, that each detector quadrant is
                  segmented 4 by 4 by 4, or 64 bins, for 256 bins per
                  detector.  In each bin you want 100 good events, and
                  you need to take about 500 triggers to get 100 good
                  events.  So that is 1.3 * 10^5 triggers per detector,
                  or 5.4 * 10^6 triggers for the whole 7 towers (42 ZIPs).

                  Now suppose you'd like to accomplish that in
                  4 days, or 3.5 * 10^5 seconds.

                  That is a trigger rate of 

                         5.4 * 10^6 triggers / 3.5 * 10^5 seconds = 15 Hz,

                  or about 2.1 Hz per tower, or about 0.36 Hz per hockey puck, 
                  or about 0.09 Hz per phonon channel.
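
                  The same estimate as a Python sketch (these are the guesses
                  above, not measured numbers):

                      bins_per_detector = 4 * 4**3               # 4 quadrants, each 4x4x4
                      triggers_per_det  = bins_per_detector * 500    # 500 triggers/bin, ~1.3e5
                      n_detectors       = 7 * 6                  # 7 towers of 6 ZIPs
                      total_triggers    = n_detectors * triggers_per_det   # ~5.4e6
                      calib_time_s      = 4 * 86400              # 4 days, ~3.5e5 s
                      rate_hz           = total_triggers / calib_time_s
                      print(rate_hz, rate_hz / 7, rate_hz / 42, rate_hz / (42 * 4))
                      # ~15.6 Hz total, ~2.2 Hz/tower, ~0.37 Hz/detector,
                      # ~0.09 Hz/phonon channel (the text rounds to 15, 2.1, 0.36)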

                  To deal with this rate, we need to read out only a
                  fraction of the detector; in particular, only those
                  detectors that exceed a minimum threshold.  We can
                  work backward from a 100BaseT Ethernet bandwidth of
                  3 Mbyte/second; the information from one ZIP will
                  comprise 0.12 Mbyte; to stay above a livetime of
                  90%, we need to stay under 0.3 Mbyte/second of average
                  readout (under 10% of the time spent transferring).
                  If we read out 0.12 Mbyte at 15 Hz, that would be
                  15 * 0.12 = 1.8 Mbyte/second; we'd need to gain another
                  factor of 6 on this, by, say, decimation of the trace.
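
                  The same budget as a sketch (3 Mbyte/second is the rough
                  100BaseT throughput assumed above):

                      budget_MBps = 3.0 * 0.10      # <10% of time transferring: 0.3 MB/s
                      needed_MBps = 0.12 * 15       # one ZIP (0.12 MB) at 15 Hz: 1.8 MB/s
                      print(needed_MBps / budget_MBps)   # ~6, the further reduction required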

                  The trigger must be re-enabled in less than
                  6 milliseconds, in order to keep the front-end livetime
                  from falling below 90%.
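
                  That figure is just the allowed 10% of the ~67 millisecond
                  mean trigger spacing at 15 Hz:

                      print(0.10 / 15)   # ~0.0067 s, i.e. 6-7 milliseconds of deadtime
                                         # allowed per trigger for >90% livetime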

   2)Normal Data Taking: let's plan for a maximum trigger rate of
                         1/3 Hz; that is about a factor of 3 higher than
                         the 0.1 Hz estimated in the proposal.

                         To read out the entire detector at 1/3 Hz,
                         we'd need an available bandwidth of
                         5.5 Mbyte/event * 1/3 event/s 
                                  ~ 1.8 Mbyte/second

                         The maximum 100BaseT transfer rate is about
                         3 Mbyte/second; to keep a livetime of >90%, 
                         we'd need to get the needed bandwidth under
                         0.3 Mbyte/second, a reduction of a factor of 6
                         or so.  That implies that we must decimate the traces.
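
                         A sketch of that arithmetic (using the same assumed
                         3 Mbyte/second 100BaseT throughput as above):

                             needed_MBps = 5.5 / 3.0        # full readout at 1/3 Hz
                             budget_MBps = 3.0 * 0.10       # <10% of time transferring
                             print(needed_MBps)                 # ~1.8 Mbyte/second
                             print(needed_MBps / budget_MBps)   # ~6, the reduction needed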

                         The algorithm to decimate the trace must run fast
                         enough not to add deadtime at the front end;
                         there are 3 seconds available on average between
                         events (from the 1/3 Hz), so the algorithm must
                         finish in 300 milliseconds or so (10% of the
                         spacing).  This is actually a looser requirement
                         than needed in the case of calibration.
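
                         A minimal sketch of what such decimation could look
                         like (simple block averaging; illustrative only, not
                         the actual CDMS reduction scheme):

                             def decimate(trace, factor):
                                 """Block-average a trace, shrinking it by `factor`."""
                                 n = len(trace) - len(trace) % factor
                                 return [sum(trace[i:i + factor]) // factor
                                         for i in range(0, n, factor)]

                             # A factor-of-8 decimation takes a 10000-sample ZIP
                             # trace down to 1250 samples, more than the factor
                             # of ~6 needed above.
                             print(len(decimate(list(range(10000)), 8)))   # 1250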