- In the DCM GUI, choose Configuration from the main menu.
  The System Settings page appears.
- In the Configuration tree, double-click the interface card whose default settings for incoming TSs you want to change.
  The interface configuration page of the selected interface card appears.
- Click Default Settings, and then click the Input TS tab.
- From the Input TS Default … drop-down list, choose … (Advanced Television Systems Committee), or ….
- From the Compact CA Descriptor drop-down list, choose ….
- From the MPEG Priority Bit drop-down list, choose Default (xxxxxx), Transparent, Force to 0, or Force to 1.
  The MPEG Priority Bit drop-down list is only available if the DCM is provided with PRIORITY_BIT_ADAPTATION license keys.
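The Force to 0 and Force to 1 options rewrite the transport_priority bit carried in the header of each TS packet. As an illustration only (a hypothetical helper, not DCM code), forcing that bit in a 188-byte MPEG-TS packet can be sketched as:

```python
TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47       # every MPEG-TS packet starts with 0x47
PRIORITY_MASK = 0x20   # transport_priority is bit 5 of header byte 1

def force_priority_bit(packet: bytes, value: int) -> bytes:
    """Return a copy of a 188-byte TS packet with transport_priority forced
    to `value` (0 or 1); hypothetical sketch, not the DCM implementation."""
    if len(packet) != TS_PACKET_SIZE or packet[0] != SYNC_BYTE:
        raise ValueError("not a valid MPEG-TS packet")
    out = bytearray(packet)
    if value:
        out[1] |= PRIORITY_MASK            # Force to 1
    else:
        out[1] &= ~PRIORITY_MASK & 0xFF    # Force to 0
    return bytes(out)
```

Transparent would correspond to passing the packet through unchanged instead of calling such a helper.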
- Check the Index Packet Interpreter check box to enable this function, or clear the check box to disable it (ASI and ASI SFN cards only).
- From the Time Base Selection drop-down list (GbE, GbE MK2, and 10GE cards only), choose Auto, Auto Ref. PCR, CBR - Auto Ref. PCR (RR), CBR - Auto, CBR - Auto Ref. PCR, or Bypass.
- In the CBR Latency (ms) field, enter the desired latency (GbE, GbE MK2, and 10GE cards only).
- In the VBR Latency (ms) field, enter the desired latency (GbE, GbE MK2, and 10GE cards only).
- In the Max. Incoming TS rate (Mbps) field, enter the maximum bit rate for an incoming TS.
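A TS bit rate such as the one this limit guards can be estimated from the number of fixed-size 188-byte TS packets received over a measurement interval. A minimal sketch with hypothetical names (not the DCM's internal check):

```python
TS_PACKET_BITS = 188 * 8  # one MPEG-TS packet is 188 bytes

def ts_rate_mbps(packets_received: int, interval_s: float) -> float:
    """Average TS bit rate in Mbps over a measurement interval."""
    return packets_received * TS_PACKET_BITS / (interval_s * 1e6)

def exceeds_limit(packets_received: int, interval_s: float,
                  max_rate_mbps: float) -> bool:
    """True when the incoming TS is faster than the configured maximum."""
    return ts_rate_mbps(packets_received, interval_s) > max_rate_mbps
```

For example, 1000 TS packets per second correspond to roughly 1.5 Mbps.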
- In the UDP Stream Loss Timeout field, enter the time, in milliseconds, during which no UDP packets may arrive in the UDP stream before the UDP Stream Loss alarm is triggered. To shorten the loss detection time and speed up the TS or service backup process, enter a low value. The minimum value is 200 ms; the default value is 2500 ms.
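The behavior behind this timeout can be pictured as a timer that restarts on every received packet and declares the stream lost once it expires. A sketch under that assumption (the class name and injected clock are hypothetical, not DCM code):

```python
import time

class UdpStreamLossDetector:
    """Report stream loss when no UDP packet arrives within the timeout."""

    MIN_TIMEOUT_MS = 200  # documented minimum; default is 2500 ms

    def __init__(self, timeout_ms: int = 2500, clock=time.monotonic):
        # Clamp to the minimum allowed value, as the GUI does.
        self.timeout_s = max(timeout_ms, self.MIN_TIMEOUT_MS) / 1000.0
        self.clock = clock
        self.last_packet = clock()

    def on_packet(self) -> None:
        """Restart the loss timer on every received UDP packet."""
        self.last_packet = self.clock()

    def stream_lost(self) -> bool:
        """True once the timeout elapses with no packet received."""
        return self.clock() - self.last_packet > self.timeout_s
```

A lower timeout makes `stream_lost()` trip sooner, which is why a low value speeds up TS or service backup.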
- In the Window field, enter the size of the RTP buffer.
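The Window value sizes the buffer in which out-of-order RTP packets are held and re-sequenced before output. A minimal sketch, assuming the window is counted in packets and ignoring RTP sequence-number wraparound (hypothetical names, not DCM code):

```python
import heapq

class RtpReorderWindow:
    """Hold up to `window` RTP packets and release them in
    sequence-number order."""

    def __init__(self, window: int):
        self.window = window
        self._heap = []  # (sequence number, payload) pairs

    def push(self, seq: int, payload: bytes):
        """Buffer one packet; return any packets pushed out of the
        window, lowest sequence number first."""
        heapq.heappush(self._heap, (seq, payload))
        released = []
        while len(self._heap) > self.window:
            released.append(heapq.heappop(self._heap))
        return released
```

A larger window tolerates more reordering at the cost of extra latency, which is the trade-off this field exposes.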
After you change the Time Base Selection, CBR Latency (ms), or VBR Latency (ms) parameter and click Apply, a dialog box informs you that the transmission will be briefly interrupted. Click OK.