Base class for physical entities like inputs and outputs. Unique identifier referencing the physical entity. User readable name. Length up to 64 characters. Rectangle defined by lower left corner position and size. Units are pixels. Range of a rectangle. The rectangle itself is defined by lower left corner position and size. Units are pixels. Range of X-axis. Range of Y-axis. Range of width. Range of height. Range of values greater than or equal to Min and less than or equal to Max. Range of durations greater than or equal to Min duration and less than or equal to Max duration. List of values. Representation of a physical video input. Frame rate in frames per second. Horizontal and vertical resolution. Optional configuration of the image sensor. Optional configuration of the image sensor. To be used if imaging service 2.00 is supported. Representation of a physical audio input. Number of available audio channels (1: mono, 2: stereo). A media profile consists of a set of media configurations. Media profiles are used by a client to configure properties of a media stream from an NVT.
An NVT shall provide at least one media profile at boot. An NVT should provide “ready to use” profiles for the most common media configurations that the device offers.
A profile consists of a set of interconnected configuration entities. Configurations are provided by the NVT and can be either static or created dynamically by the NVT. For example, dynamic configurations can be created by the NVT depending on the currently available encoding resources.
User readable name of the profile. Optional configuration of the Video input. Optional configuration of the Audio input. Optional configuration of the Video encoder. Optional configuration of the Audio encoder. Optional configuration of the video analytics module and rule engine. Optional configuration of the pan tilt zoom unit. Optional configuration of the metadata stream. Extensions defined in ONVIF 2.0. Unique identifier of the profile. A value of true signals that the profile cannot be deleted. Default is false.
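The profile structure described above can be sketched as a minimal data model. This is an illustrative sketch only; the class and field names are mine, not ONVIF identifiers, and the attached configurations are reduced to token references:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical, simplified model of an ONVIF media profile: a fixed token,
# a user-readable name, and a set of optional configuration references.
@dataclass
class MediaProfile:
    token: str                           # unique identifier of the profile
    name: str                            # user readable, up to 64 characters
    fixed: bool = False                  # True: profile cannot be deleted
    video_source: Optional[str] = None   # tokens of attached configurations
    video_encoder: Optional[str] = None
    audio_source: Optional[str] = None
    audio_encoder: Optional[str] = None
    ptz: Optional[str] = None
    metadata: Optional[str] = None

    def __post_init__(self):
        if len(self.name) > 64:
            raise ValueError("profile name is limited to 64 characters")

# A "ready to use" profile a device might expose at boot:
default_profile = MediaProfile(token="Profile_1", name="MainStream",
                               fixed=True,
                               video_source="VideoSource_1",
                               video_encoder="VideoEncoder_1")
```

A client would typically enumerate such profiles after boot and pick one whose attached encoder configuration matches its needs.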
Optional configuration of the Audio output. Optional configuration of the Audio decoder. Base type defining the common properties of a configuration. User readable name. Length up to 64 characters. Number of internal references currently using this configuration.

This informational parameter is read-only. Deprecated for Media2 Service.

Token that uniquely references this configuration. Length up to 64 characters.
Reference to the physical input. Rectangle specifying the Video capturing area. The capturing area shall not be larger than the whole Video source area. Read-only parameter signalling the Source configuration's view mode, for devices supporting different view modes as defined in tt:viewModes. Optional element to configure rotation of the captured image. The resolutions a device supports shall be unaffected by the Rotate parameters.
If a device is configured with Rotate=AUTO, the device shall take control over the Degree parameter and automatically update it so that a client can query current rotation.
The device shall automatically apply the same rotation to its pan/tilt control direction under the following condition: if Reverse=AUTO in PTControlDirection, or if the device does not support Reverse in PTControlDirection.
Optional element describing the geometric lens distortion. Multiple instances for future variable lens support. Optional element describing the scene orientation in the camera’s field of view. Parameter to enable/disable the Rotation feature. Optional parameter to configure the degree of clockwise image rotation for On mode. Omitting this parameter for On mode means 180 degree rotation. Enable the Rotate feature. The degree of rotation is specified by the Degree parameter. Disable the Rotate feature. The Rotate feature is automatically activated by the device. Angle of incidence. Mapping radius as a consequence of the emergent angle. Optional ray absorption at the given angle due to vignetting. A value of one means no absorption. Optional horizontal offset of the lens center in normalized coordinates. Optional vertical offset of the lens center in normalized coordinates. Offset of the lens center to the imager center in normalized coordinates. Radial description of the projection characteristics. The resulting curve is defined by the B-Spline interpolation over the given elements. The element for Radius zero shall not be provided. The projection points shall be ordered with ascending Radius. Items outside the last projection Radius shall be assumed to be invisible (black). Compensation of the x coordinate needed for the ONVIF normalized coordinate system. Optional focal length of the optical system. Supported range for the capturing area. A device that does not support cropped streaming shall express the BoundsRange option as follows: BoundsRange->XRange and BoundsRange->YRange with identical Min/Max values, and HeightRange and WidthRange Min/Max values equal to the VideoSource Height and Width limits. List of physical inputs. Maximum number of profiles. Options of parameters for the Rotation feature. Scene orientation modes supported by the device for this configuration. Supported options of the Rotate mode parameter. List of supported degree values for rotation.
Signals if a device requires a reboot after changing the rotation. If a device can handle rotation changes without rebooting this value shall be set to false. Defines the acceptable values for the Orientation element of the SceneOrientation type. Parameter to assign the way the camera determines the scene orientation. Assigned or determined scene orientation based on the Mode. When assigning the Mode to AUTO, this field is optional and will be ignored by the device. When assigning the Mode to MANUAL, this field is required and the device will return an InvalidArgs fault if missing. Source view modes supported by the device. Undewarped view mode from a device supporting a fisheye lens. 360 degree panoramic view. 180 degree panoramic view. View mode combining four streams in a single quad view, e.g., applicable for devices supporting four heads. Unaltered view from the sensor. View mode combining the left side sensors, applicable for devices supporting multiple sensors. View mode combining the right side sensors, applicable for devices supporting multiple sensors. Dewarped view mode for a device supporting a fisheye lens. Used video codec, either JPEG, H.264 or MPEG-4. Configured video resolution. Relative value for the video quantizers and the quality of the video. A high value within the supported quality range means higher quality. Optional element to configure rate control related parameters. Optional element to configure MPEG-4 related parameters. Optional element to configure H.264 related parameters. Defines the multicast settings that could be used for video streaming. The RTSP session timeout for the related video stream. A value of true indicates that the frame rate is a fixed value rather than an upper limit, and that the video encoder shall prioritize frame rate over all other adaptable configuration values such as bitrate. Default is false. Number of the columns of the Video image. Number of the lines of the Video image. Maximum output framerate in fps.
If an EncodingInterval is provided the resulting encoded framerate will be reduced by the given factor. Interval at which images are encoded and transmitted. (A value of 1 means that every frame is encoded, a value of 2 means that every 2nd frame is encoded ...) The maximum output bitrate in kbps. Determines the interval in which the I-Frames will be coded. An entry of 1 indicates I-Frames are continuously generated. An entry of 2 indicates that every 2nd image is an I-Frame, and 3 only every 3rd frame, etc. The frames in between are coded as P or B Frames. The MPEG-4 profile, either simple profile (SP) or advanced simple profile (ASP). Group of Video frames length. Typically determines the interval in which the I-Frames will be coded. An entry of 1 indicates I-Frames are continuously generated. An entry of 2 indicates that every 2nd image is an I-Frame, and 3 only every 3rd frame, etc. The frames in between are coded as P or B Frames. The H.264 profile, either baseline, main, extended or high. Range of the quality values. A high value means higher quality. Optional JPEG encoder settings ranges (See also Extension element). Optional MPEG-4 encoder settings ranges (See also Extension element). Optional H.264 encoder settings ranges (See also Extension element). Indicates the support for the GuaranteedFrameRate attribute on the VideoEncoderConfiguration element. Optional JPEG encoder settings ranges. Optional MPEG-4 encoder settings ranges. Optional H.264 encoder settings ranges. List of supported image sizes. Supported frame rate in fps (frames per second). Supported encoding interval range. The encoding interval corresponds to the number of frames divided by the encoded frames. An encoding interval value of "1" means that all frames are encoded. Supported range of encoded bitrate in kbps. List of supported image sizes. Supported group of Video frames length. This value typically corresponds to the I-Frame distance. Supported frame rate in fps (frames per second).
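The relationship between the frame-rate limit and the EncodingInterval described above reduces to a one-line calculation. A small sketch (the function name is mine):

```python
def encoded_framerate(framerate_limit: float, encoding_interval: int) -> float:
    """Resulting encoded frame rate when only every n-th frame is encoded.

    An EncodingInterval of 1 encodes every frame, 2 every second frame,
    3 every third frame, and so on.
    """
    if encoding_interval < 1:
        raise ValueError("EncodingInterval must be >= 1")
    return framerate_limit / encoding_interval

# 30 fps limit with every 2nd frame encoded -> 15 fps on the wire
print(encoded_framerate(30, 2))  # 15.0
```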
Supported encoding interval range. The encoding interval corresponds to the number of frames divided by the encoded frames. An encoding interval value of "1" means that all frames are encoded. List of supported MPEG-4 profiles. Supported range of encoded bitrate in kbps. List of supported image sizes. Supported group of Video frames length. This value typically corresponds to the I-Frame distance. Supported frame rate in fps (frames per second). Supported encoding interval range. The encoding interval corresponds to the number of frames divided by the encoded frames. An encoding interval value of "1" means that all frames are encoded. List of supported H.264 profiles. Supported range of encoded bitrate in kbps. Video Media Subtypes as referenced by IANA (without the leading "video/" Video Media Type). See also IANA Media Types. Video Media Subtype for the video format. For definitions see tt:VideoEncodingMimeNames and IANA Media Types. Configured video resolution. Optional element to configure rate control related parameters. Defines the multicast settings that could be used for video streaming. Relative value for the video quantizers and the quality of the video. A high value within the supported quality range means higher quality. Group of Video frames length. Typically determines the interval in which the I-Frames will be coded. An entry of 1 indicates I-Frames are continuously generated. An entry of 2 indicates that every 2nd image is an I-Frame, and 3 only every 3rd frame, etc. The frames in between are coded as P or B Frames. Distance between anchor frames of type I-Frame and P-Frame. '1' indicates no B-Frames, '2' indicates that every 2nd frame is encoded as a B-Frame, '3' indicates a structure like IBBPBBP..., etc. The encoder profile as defined in tt:VideoEncodingProfiles.
A value of true indicates that the frame rate is a fixed value rather than an upper limit, and that the video encoder shall prioritize frame rate over all other adaptable configuration values such as bitrate. Default is false. Number of the columns of the Video image. Number of the lines of the Video image. Desired frame rate in fps. The actual rate may be lower due to e.g. performance limitations. The maximum output bitrate in kbps. Enforce constant bitrate. Video Media Subtype for the video format. For definitions see tt:VideoEncodingMimeNames and IANA Media Types. Range of the quality values. A high value means higher quality. List of supported image sizes. Supported range of encoded bitrate in kbps. Exactly two values, which define the lower and upper bounds for the supported group of Video frames length. These values typically correspond to the I-Frame distance. Signals support for B-Frames. Upper bound for the supported anchor frame distance (must be larger than one). List of supported target frame rates in fps (frames per second). The list shall be sorted with highest values first. List of supported encoder profiles as defined in tt:VideoEncodingProfiles. Signals whether enforcing constant bitrate is supported. Indicates the support for the GuaranteedFrameRate attribute on the VideoEncoder2Configuration element. Token of the Audio Source the configuration applies to. Tokens of the audio source the configuration can be used for. Audio codec used for encoding the audio input (either G.711, G.726 or AAC). The output bitrate in kbps. The output sample rate in kHz. Defines the multicast settings that could be used for audio streaming.
The RTSP session timeout for the related audio stream. List of supported AudioEncoderConfigurations. The encoding used for audio data (either G.711, G.726 or AAC). List of supported bitrates in kbps for the specified Encoding. List of supported Sample Rates in kHz for the specified Encoding. Audio Media Subtypes as referenced by IANA (without the leading "audio/" Audio Media Type and except for the audio types defined in the restriction). See also IANA Media Types. The AudioEncodingMimeName G726 is used to represent G726-16, G726-24, G726-32 and G726-40 as defined in the IANA Media Types. Audio Media Subtype for the audio format. For definitions see tt:AudioEncodingMimeNames and IANA Media Types. Optional multicast configuration of the audio stream. The output bitrate in kbps. The output sample rate in kHz. Audio Media Subtype for the audio format. For definitions see tt:AudioEncodingMimeNames and IANA Media Types. List of supported bitrates in kbps for the specified Encoding. List of supported Sample Rates in kHz for the specified Encoding. Optional element to configure which PTZ related data is to be included in the metadata stream. Optional element to configure the streaming of events. A client might be interested in receiving all, none or some of the events produced by the device:
  • To get all events: Include the Events element but do not include a filter.
  • To get no events: Do not include the Events element.
  • To get only some events: Include the Events element and include a filter in the element.
Defines whether the streamed metadata will include metadata from the analytics engines (video, cell motion, audio etc.). Defines the multicast settings that could be used for metadata streaming. The RTSP session timeout for the related metadata stream (when using the Media2 Service, this value is deprecated and ignored). Indication which AnalyticsModules shall output metadata. Note that the streaming behavior is undefined if the list includes items that are not part of the associated AnalyticsConfiguration.
Optional parameter to configure compression type of Metadata payload. Use values from enumeration MetadataCompressionType. Optional parameter to configure if the metadata stream shall contain the Geo Location coordinates of each target. Optional parameter to configure if the generated metadata stream should contain shape information as polygon.
True if the metadata stream shall contain the PTZ status (IDLE, MOVING or UNKNOWN). True if the metadata stream shall contain the PTZ position. Subscription handling in the same way as base notification subscription. True if the device is able to stream the Geo Located positions of each target. A device signalling support for content filtering shall support expressions with the provided expression size. List of supported metadata compression types. Its options shall be chosen from tt:MetadataCompressionType. True if the device is able to stream pan or tilt status information. True if the device is able to stream zoom status information. True if the device is able to stream the pan or tilt position. True if the device is able to stream zoom position information. Representation of a physical video output. Resolution of the display in pixels. Refresh rate of the display in Hertz. Aspect ratio of the display as physical extent of width divided by height. Token of the Video Output the configuration applies to. If the device is able to decode JPEG streams this element describes the supported codecs and configurations. If the device is able to decode H.264 streams this element describes the supported codecs and configurations. If the device is able to decode MPEG-4 streams this element describes the supported codecs and configurations. List of supported H.264 Video Resolutions. List of supported H.264 Profiles (either baseline, main, extended or high). Supported H.264 bitrate range in kbps. Supported H.264 framerate range in fps. List of supported JPEG Video Resolutions. Supported JPEG bitrate range in kbps. Supported JPEG framerate range in fps. List of supported MPEG-4 Video Resolutions. List of supported MPEG-4 Profiles (either SP or ASP). Supported MPEG-4 bitrate range in kbps. Supported MPEG-4 framerate range in fps. Representation of a physical audio output. Token of the physical Audio output. An audio channel MAY support different types of audio transmission.
While for full duplex operation no special handling is required, in half duplex operation the transmission direction needs to be switched. The optional SendPrimacy parameter inside the AudioOutputConfiguration indicates which direction is currently active. An NVC can switch between different modes by setting the AudioOutputConfiguration.
The following modes for the Send-Primacy are defined:
  • www.onvif.org/ver20/HalfDuplex/Server The server is allowed to send audio data to the client. The client shall not send audio data via the backchannel to the NVT in this mode.
  • www.onvif.org/ver20/HalfDuplex/Client The client is allowed to send audio data via the backchannel to the server. The NVT shall not send audio data to the client in this mode.
  • www.onvif.org/ver20/HalfDuplex/Auto It is up to the device how to deal with sending and receiving audio data.
Acoustic echo cancellation is out of ONVIF scope.
Volume setting of the output. The applicable range is defined via the option AudioOutputOptions.OutputLevelRange.
Tokens of the physical Audio outputs (typically one). An audio channel MAY support different types of audio transmission; the half-duplex Send-Primacy modes and handling described above apply to this configuration as well. Acoustic echo cancellation is out of ONVIF scope.
Minimum and maximum level range supported for this Output.
The Audio Decoder Configuration does not contain any parameters to configure the decoding. A decoder shall decode all data it receives (according to its capabilities). If the device is able to decode AAC encoded audio this section describes the supported configurations. If the device is able to decode G711 encoded audio this section describes the supported configurations. If the device is able to decode G726 encoded audio this section describes the supported configurations. List of supported bitrates in kbps. List of supported sample rates in kHz. List of supported bitrates in kbps. List of supported sample rates in kHz. List of supported bitrates in kbps. List of supported sample rates in kHz. The multicast address (if this address is set to 0 no multicast streaming is enabled). The RTP multicast destination port. A device may support RTCP. In this case the port value shall be even to allow the corresponding RTCP stream to be mapped to the next higher (odd) destination port number as defined in the RTSP specification. In case of IPv6 the TTL value is assumed as the hop limit. Note that for IPv6 and administratively scoped IPv4 multicast the primary use for hop limit / TTL is to prevent packets from (endlessly) circulating and not limiting scope. In these cases the address contains the scope. Read only property signalling that streaming is persistent. Use the methods StartMulticastStreaming and StopMulticastStreaming to switch its state. Defines if a multicast or unicast stream is requested. Defines the network protocol for streaming, either UDP=RTP/UDP, RTSP=RTP/RTSP/TCP or HTTP=RTP/RTSP/HTTP/TCP. Optional element to describe further tunnel options. This element is normally not needed. This value is deprecated. Stable Uri to be used for requesting the media stream. Indicates if the Uri is only valid until the connection is established. The value shall be set to "false". Indicates if the Uri is invalid after a reboot of the device. The value shall be set to "false".
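The even-port rule for RTP multicast described above (RTCP on the next higher odd port, per the RTSP specification) can be expressed as a small helper. The function name is mine, for illustration only:

```python
def rtcp_port_for(rtp_port: int) -> int:
    """Map an RTP multicast destination port to its RTCP port.

    When a device supports RTCP, the RTP destination port shall be even so
    that the RTCP stream maps to the next higher (odd) port number.
    """
    if not (0 < rtp_port < 65535):
        raise ValueError("port out of range")
    if rtp_port % 2 != 0:
        raise ValueError("RTP destination port must be even when RTCP is used")
    return rtp_port + 1

print(rtcp_port_for(5000))  # 5001
```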
Duration for which the Uri is valid. This parameter shall be set to PT0S to indicate that this stream URI is indefinitely valid even if the profile changes. Indicates if the scope is fixed or configurable. Scope item URI. Indicates whether or not an interface is enabled. Network interface information. Link configuration. IPv4 network interface configuration. IPv6 network interface configuration. Extension point prepared for future 802.3 configuration. Configured link settings. Current active link settings. Integer indicating interface type, for example: 6 is ethernet. Auto negotiation on/off. Speed. Duplex type, Half or Full. For valid numbers, please refer to http://www.iana.org/assignments/ianaiftype-mib. Network interface name, for example eth0. Network interface MAC address. Maximum transmission unit. Indicates whether or not IPv6 is enabled. IPv6 configuration. Indicates whether or not IPv4 is enabled. IPv4 configuration. List of manually added IPv4 addresses. Link local address. IPv4 address configured by using DHCP. Indicates whether or not DHCP is used. Indicates whether router advertisement is used. DHCP configuration. List of manually entered IPv6 addresses. List of link local IPv6 addresses. List of IPv6 addresses configured by using DHCP. List of IPv6 addresses configured by using router advertisement. Network protocol type string. Indicates if the protocol is enabled or not. The port that is used by the protocol. Network host type: IPv4, IPv6 or DNS. IPv4 address. IPv6 address. DNS name. Indicates if the address is an IPv4 or IPv6 address. IPv4 address. IPv6 address. IPv4 address. Prefix/submask length. IPv6 address. Prefix/submask length. Indicates whether the hostname has been obtained from DHCP or not. Indicates the device hostname or an empty string if no hostname has been assigned. Indicates whether or not DNS information is retrieved from DHCP. Search domain. List of DNS addresses received from DHCP. List of manually entered DNS addresses.
Indicates if NTP information is to be retrieved by using DHCP. List of NTP addresses retrieved by using DHCP. List of manually entered NTP addresses. Dynamic DNS type. DNS name. Time to live. Indicates whether or not an interface is enabled. Link configuration. Maximum transmission unit. IPv4 network interface configuration. IPv6 network interface configuration. Indicates whether or not IPv6 is enabled. Indicates whether router advertisement is used. List of manually added IPv6 addresses. DHCP configuration. Indicates whether or not IPv4 is enabled. List of manually added IPv4 addresses. Indicates whether or not DHCP is used. IPv4 address string. IPv6 address string. Unique identifier of network interface. Indicates whether the zero-configuration is enabled or not. The zero-configuration IPv4 address(es). Optional array holding the configuration for the second and possibly further interfaces. According to IEEE802.11-2007 H.4.1 the RSNA PSK consists of 256 bits, i.e. 64 characters when represented in hex.
Either Key or Passphrase shall be given; if both are supplied, the Key shall be used by the device and the Passphrase ignored.
According to IEEE802.11-2007 H.4.1 a pass-phrase is a sequence of between 8 and 63 ASCII-encoded characters, and each character in the pass-phrase must have an encoding in the range of 32 to 126 (decimal), inclusive.
If only the Passphrase is supplied, the Key shall be derived using the algorithm described in IEEE802.11-2007 section H.4.
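The derivation referenced above (IEEE 802.11-2007 section H.4) is PBKDF2 with HMAC-SHA-1, using the SSID as salt, 4096 iterations, and a 256-bit output. A sketch using the standard library; the function name and the constraint checks are mine:

```python
import hashlib

def derive_psk(passphrase: str, ssid: str) -> bytes:
    """Derive the 256-bit RSNA PSK from a pass-phrase per IEEE 802.11-2007
    H.4: PBKDF2(HMAC-SHA-1, passphrase, ssid, 4096 iterations, 256 bits)."""
    if not (8 <= len(passphrase) <= 63):
        raise ValueError("pass-phrase must be 8 to 63 ASCII characters")
    if not all(32 <= ord(c) <= 126 for c in passphrase):
        raise ValueError("each character must encode to 32..126 decimal")
    return hashlib.pbkdf2_hmac("sha1", passphrase.encode("ascii"),
                               ssid.encode("ascii"), 4096, 32)

psk = derive_psk("correct horse battery", "MyCamera")
print(psk.hex())  # 256 bits = 64 hex characters
```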
See IEEE802.11 7.3.2.25.2 for details. Analytics capabilities Device capabilities Event capabilities Imaging capabilities Media capabilities PTZ capabilities Analytics service URI. Indicates whether or not rules are supported. Indicates whether or not modules are supported. Device service URI. Network capabilities. System capabilities. I/O capabilities. Security capabilities. Event service URI. Indicates whether or not WS Subscription policy is supported. Indicates whether or not WS Pull Point is supported. Indicates whether or not WS Pausable Subscription Manager Interface is supported. Number of input connectors. Number of relay outputs. Media service URI. Streaming capabilities. Indicates whether or not RTP multicast is supported. Indicates whether or not RTP over TCP is supported. Indicates whether or not RTP/RTSP/TCP is supported. Maximum number of profiles. Indicates whether or not IP filtering is supported. Indicates whether or not zeroconf is supported. Indicates whether or not IPv6 is supported. Indicates whether or not is supported. Indicates whether or not TLS 1.1 is supported. Indicates whether or not TLS 1.2 is supported. Indicates whether or not onboard key generation is supported. Indicates whether or not access policy configuration is supported. Indicates whether or not WS-Security X.509 token is supported. Indicates whether or not WS-Security SAML token is supported. Indicates whether or not WS-Security Kerberos token is supported. Indicates whether or not WS-Security REL token is supported. EAP Methods supported by the device. The int values refer to the IANA EAP Registry. Indicates whether or not WS Discovery resolve requests are supported. Indicates whether or not WS-Discovery Bye is supported. Indicates whether or not remote discovery is supported. Indicates whether or not system backup is supported. Indicates whether or not system logging is supported. Indicates whether or not firmware upgrade is supported. 
Indicates supported ONVIF version(s). Major version number. Two digit minor version number. If the major version number is less than "16", X.0.1 maps to "01" and X.2.1 maps to "21", where X stands for the major version number. Otherwise, the minor number is the month of release, such as "06" for June. Imaging service URI. PTZ service URI. Indication that the SetLayout command supports only predefined layouts. The address of the replay service. The address of the receiver service. Indicates whether the device can receive RTP multicast streams. Indicates whether the device can receive RTP/TCP streams. Indicates whether the device can receive RTP/RTSP/TCP streams. The maximum number of receivers supported by the device. The maximum allowed length for RTSP URIs. Obsolete property. Enumeration describing the available system log modes. Indicates that a system log is requested. Indicates that an access log is requested. The log information as attachment data. The log information as character data. The support information as attachment data. The support information as character data. base64 encoded binary data. Enumeration describing the available factory default modes. Indicates that a hard factory default is requested. Indicates that a soft factory default is requested. Indicates that the date and time are set manually. Indicates that the date and time are set through NTP. General date time information returned by the GetSystemDateTime method. Indicates if the time is set manually or through NTP. Informative indicator whether daylight savings is currently on/off. Timezone information in Posix format. Current system date and time in UTC format. This field is mandatory since version 2.0. Date and time in local format. Range is 1 to 12. Range is 1 to 31. Range is 0 to 23. Range is 0 to 59. Range is 0 to 61 (typically 59). The TZ format is specified by POSIX, please refer to POSIX 1003.1 section 8.3.
Example: Europe, Paris TZ=CET-1CEST,M3.5.0/2,M10.5.0/3
CET = designation for standard time when daylight saving is not in force
-1 = offset in hours = negative so 1 hour east of Greenwich meridian
CEST = designation when daylight saving is in force ("Central European Summer Time")
, = no offset number between code and comma, so default to one hour ahead for daylight saving
M3.5.0 = when daylight saving starts = the last Sunday in March (the "5th" week means the last in the month)
/2, = the local time when the switch occurs = 2 a.m. in this case
M10.5.0 = when daylight saving ends = the last Sunday in October.
/3, = the local time when the switch occurs = 3 a.m. in this case
Posix timezone string.
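On a POSIX system, the example TZ string above can be handed directly to the C library to verify how it is interpreted (this demonstration relies on `time.tzset`, which is only available on Unix-like platforms):

```python
import os
import time

# Apply the example TZ string from the text: CET (UTC+1), with CEST daylight
# saving from the last Sunday in March at 02:00 to the last Sunday in
# October at 03:00.
os.environ["TZ"] = "CET-1CEST,M3.5.0/2,M10.5.0/3"
time.tzset()

print(time.tzname)    # ('CET', 'CEST')
print(time.timezone)  # -3600: one hour east of the Greenwich meridian
```

Note that `time.timezone` is expressed in seconds west of UTC, so the "-1" hour offset in the TZ string appears as -3600 seconds.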
Username string. Password string. User level string. Certificate id. base64 encoded DER representation of certificate. Certificate id. Indicates whether or not a certificate is used in an HTTPS configuration. Validity Range is from "NotBefore" to "NotAfter"; the corresponding DateTimeRange is from "From" to "Until". EAP Method type as defined in the IANA EAP Registry. Configuration information for the TLS Method. Password for those EAP Methods that require a password. The password shall never be returned on a get method. 'Bistable' or 'Monostable'
  • Bistable – After setting the state, the relay remains in this state.
  • Monostable – After setting the state, the relay returns to its idle state after the specified time.
Time after which the relay returns to its idle state if it is in monostable mode. If the Mode field is set to bistable mode the value of the parameter can be ignored. 'open' or 'closed'
  • 'open' means that the relay is open when the relay state is set to 'inactive' through the trigger command and closed when the state is set to 'active' through the same command.
  • 'closed' means that the relay is closed when the relay state is set to 'inactive' through the trigger command and open when the state is set to 'active' through the same command.
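The mapping between the relay's logical state and the physical circuit, as defined by the bullets above, can be captured in a small truth function (the function name is mine, for illustration):

```python
def circuit_closed(idle_state: str, relay_active: bool) -> bool:
    """Physical circuit state for a relay given its configured IdleState.

    'open'  : relay is open when logically inactive, closed when active.
    'closed': relay is closed when logically inactive, open when active.
    """
    if idle_state == "open":
        return relay_active
    if idle_state == "closed":
        return not relay_active
    raise ValueError("IdleState must be 'open' or 'closed'")

print(circuit_closed("open", True))    # True:  active -> circuit closed
print(circuit_closed("closed", True))  # False: active -> circuit open
```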
Indicates the Digital IdleState status. A unique identifier that is used to reference PTZ Nodes. A list of Coordinate Systems available for the PTZ Node. For each Coordinate System, the PTZ Node MUST specify its allowed range. All preset operations MUST be available for this PTZ Node if one preset is supported. A boolean operator specifying the availability of a home position. If set to true, the Home Position Operations MUST be available for this PTZ Node. A list of supported Auxiliary commands. If the list is not empty, the Auxiliary Operations MUST be available for this PTZ Node. Indication whether the HomePosition of a Node is fixed or whether it can be changed via the SetHomePosition command. Indication whether the Node supports the geo-referenced move command. Detail of the supported Preset Tour feature. Indicates the number of preset tours that can be created. Required preset tour operations shall be available for this PTZ Node if one or more preset tours are supported. Indicates which preset tour operations are available for this PTZ Node. A mandatory reference to the PTZ Node that the PTZ Configuration belongs to. If the PTZ Node supports absolute Pan/Tilt movements, it shall specify one Absolute Pan/Tilt Position Space as default. If the PTZ Node supports absolute zoom movements, it shall specify one Absolute Zoom Position Space as default. If the PTZ Node supports relative Pan/Tilt movements, it shall specify one Relative Pan/Tilt Translation Space as default. If the PTZ Node supports relative zoom movements, it shall specify one Relative Zoom Translation Space as default. If the PTZ Node supports continuous Pan/Tilt movements, it shall specify one Continuous Pan/Tilt Velocity Space as default. If the PTZ Node supports continuous zoom movements, it shall specify one Continuous Zoom Velocity Space as default. If the PTZ Node supports absolute or relative PTZ movements, it shall specify corresponding default Pan/Tilt and Zoom speeds.
If the PTZ Node supports continuous movements, it shall specify a default timeout, after which the movement stops. The Pan/Tilt limits element should be present for a PTZ Node that supports absolute Pan/Tilt. If the element is present, it signals the support for configurable Pan/Tilt limits. If limits are enabled, the Pan/Tilt movements shall always stay within the specified range. The Pan/Tilt limits are disabled by setting the limits to -INF or +INF. The Zoom limits element should be present for a PTZ Node that supports absolute zoom. If the element is present, it signals support for configurable Zoom limits. If limits are enabled, the zoom movements shall always stay within the specified range. The Zoom limits are disabled by setting the limits to -INF and +INF. The optional acceleration ramp used by the device when moving. The optional acceleration ramp used by the device when recalling presets. The optional acceleration ramp used by the device when executing PresetTours. Optional element to configure PT Control Direction related features. Optional element to configure related parameters for E-Flip. Optional element to configure related parameters for reversing of PT Control Direction. Parameter to enable/disable E-Flip feature. Parameter to enable/disable Reverse feature. A list of supported coordinate systems including their range limitations. A timeout Range within which Timeouts are accepted by the PTZ Node. Supported options for PT Direction Control. The list of acceleration ramps supported by the device. The smallest acceleration value corresponds to the minimal index, the highest acceleration corresponds to the maximum index. Supported options for EFlip feature. Supported options for Reverse feature. Options of EFlip mode parameter. Options of Reverse mode parameter. A range of pan tilt limits.
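The Pan/Tilt limits behavior described above can be sketched as a clamp, with limits disabled via infinities. The dict layout mirroring the Range element is an assumption for this example, not the schema serialization.

```python
import math

def clamp_pan_tilt(x, y, limits=None):
    """Clamp a pan/tilt position to the configured PanTiltLimits.

    `limits` is a hypothetical dict {"XRange": (min, max), "YRange": (min, max)}.
    Setting a bound to -inf/+inf disables that limit, as the description
    states; positions then pass through unchanged."""
    if limits is None:
        return x, y
    xmin, xmax = limits["XRange"]
    ymin, ymax = limits["YRange"]
    return (min(max(x, xmin), xmax), min(max(y, ymin), ymax))

# Limits disabled via infinities: any position stays within "range".
no_limits = {"XRange": (-math.inf, math.inf), "YRange": (-math.inf, math.inf)}
```
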
A range of zoom limits. The Generic Pan/Tilt Position space is provided by every PTZ node that supports absolute Pan/Tilt, since it does not relate to a specific physical range. Instead, the range should be defined as the full range of the PTZ unit normalized to the range -1 to 1, resulting in the following space description. The Generic Zoom Position Space is provided by every PTZ node that supports absolute Zoom, since it does not relate to a specific physical range. Instead, the range should be defined as the full range of the Zoom normalized to the range 0 (wide) to 1 (tele). There is no assumption about how the generic zoom range is mapped to magnification, FOV or other physical zoom dimension. The Generic Pan/Tilt translation space is provided by every PTZ node that supports relative Pan/Tilt, since it does not relate to a specific physical range. Instead, the range should be defined as the full positive and negative translation range of the PTZ unit normalized to the range -1 to 1, where positive translation would mean clockwise rotation or movement in the right/up direction, resulting in the following space description. The Generic Zoom Translation Space is provided by every PTZ node that supports relative Zoom, since it does not relate to a specific physical range. Instead, the corresponding absolute range should be defined as the full positive and negative translation range of the Zoom normalized to the range -1 to 1, where a positive translation maps to a movement in TELE direction. The translation is signed to indicate direction (negative is to wide, positive is to tele). There is no assumption about how the generic zoom range is mapped to magnification, FOV or other physical zoom dimension. This results in the following space description. The generic Pan/Tilt velocity space shall be provided by every PTZ node, since it does not relate to a specific physical range.
Instead, the range should be defined as a range of the PTZ unit’s speed normalized to the range -1 to 1, where a positive velocity would map to clockwise rotation or movement in the right/up direction. A signed speed can be independently specified for the pan and tilt component, resulting in the following space description. The generic zoom velocity space specifies a zoom factor velocity without knowing the underlying physical model. The range should be normalized from -1 to 1, where a positive velocity would map to TELE direction. A generic zoom velocity space description resembles the following. The speed space specifies the speed for a Pan/Tilt movement when moving to an absolute position or to a relative translation. In contrast to the velocity spaces, speed spaces do not contain any directional information. The speed of a combined Pan/Tilt movement is represented by a single non-negative scalar value. The speed space specifies the speed for a Zoom movement when moving to an absolute position or to a relative translation. In contrast to the velocity spaces, speed spaces do not contain any directional information. A URI of coordinate systems. A range of x-axis. A range of y-axis. A URI of coordinate systems. A range of x-axis. Pan and tilt speed. The x component corresponds to pan and the y component to tilt. If omitted in a request, the current (if any) PanTilt movement should not be affected. A zoom speed. If omitted in a request, the current (if any) Zoom movement should not be affected. A list of preset position names. A list of preset positions. Readable name of the preset tour. Read-only parameters to indicate the status of the preset tour. Auto Start flag of the preset tour. True allows the preset tour to be activated always. Parameters to specify the detail behavior of the preset tour. A list of detail of touring spots including preset positions. Unique identifier of this preset tour. Detail definition of preset position of the tour spot.
Optional parameter to specify Pan/Tilt and Zoom speed on moving toward this tour spot. Optional parameter to specify the time duration of staying on this tour spot. Option to specify the preset position with a Preset Token defined in advance. Option to specify the preset position with the home position of this PTZ Node. A value of "False" for this parameter shall be treated as an invalid argument. Option to specify the preset position with a vector of the PTZ node directly. Indicates the state of this preset tour by Idle/Touring/Paused. Indicates the tour spot where the tour is currently staying. Optional parameter to specify how many times the preset tour recurs. Optional parameter to specify for how long the preset tour recurs. Optional parameter to choose which direction the preset tour goes. Forward shall be chosen in case it is omitted. Execute presets in random order. If set to true and Direction is also present, Direction will be ignored and presets of the Tour will be recalled randomly. Indicates whether or not AutoStart is supported. Supported options for Preset Tour Starting Condition. Supported options for Preset Tour Spot. Supported options for detail definition of preset position of the tour spot. Supported range of stay time for a tour spot. A list of available Preset Tokens for tour spots. An option to indicate the Home position for tour spots. Supported range of Pan and Tilt for tour spots. Supported range of Zoom for a tour spot. Supported range of Recurring Time. Supported range of Recurring Duration. Supported options for Direction of Preset Tour. Status of focus position. Status of focus MoveStatus. Error status of focus. Parameter to set autofocus near limit (unit: meter). Parameter to set autofocus far limit (unit: meter). If set to 0.0, infinity will be used. Focus of a moving camera is updated only once after stopping a pan, tilt or zoom movement. Enabled/disabled BLC mode (on/off). Image brightness (unit unspecified). Color saturation of the image (unit unspecified).
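The preset-tour ordering rules above (Direction with Forward as the default, RandomPresetOrder overriding Direction) can be sketched as follows. The function and parameter names are illustrative, not schema types.

```python
import random

def tour_order(spots, direction="Forward", random_order=False, seed=None):
    """Sketch of preset-tour spot ordering: when RandomPresetOrder is
    true, Direction is ignored and spots are recalled randomly;
    otherwise "Forward"/"Backward" selects the traversal direction,
    with Forward assumed when Direction is omitted."""
    if random_order:
        rng = random.Random(seed)  # seed only for reproducible demos
        order = list(spots)
        rng.shuffle(order)
        return order
    if direction == "Backward":
        return list(reversed(spots))
    return list(spots)
```
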
Contrast of the image (unit unspecified). Exposure mode of the device. Focus configuration. Infrared Cutoff Filter settings. Sharpness of the Video image. WDR settings. White balance settings. Exposure Mode
  • Auto – Enables the exposure algorithm on the NVT.
  • Manual – Disables the exposure algorithm on the NVT.
The exposure priority mode (low noise/framerate). Rectangular exposure mask. Minimum value of exposure time range allowed to be used by the algorithm. Maximum value of exposure time range allowed to be used by the algorithm. Minimum value of the sensor gain range that is allowed to be used by the algorithm. Maximum value of the sensor gain range that is allowed to be used by the algorithm. Minimum value of the iris range allowed to be used by the algorithm. Maximum value of the iris range allowed to be used by the algorithm. The fixed exposure time used by the image sensor (μs). The fixed gain used by the image sensor (dB). The fixed attenuation of input light affected by the iris (dB). 0dB maps to a fully opened iris.
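The split above between fixed values (Manual) and algorithm bounds (Auto) can be sketched as follows. The dict keys mirror the element names in the description but the layout itself is an assumption, not the actual schema serialization.

```python
def effective_exposure(settings):
    """Sketch of how the Exposure element is interpreted: in Manual
    mode the fixed ExposureTime/Gain/Iris values apply directly; in
    Auto mode the algorithm is bounded by the Min/Max ranges."""
    if settings["Mode"] == "MANUAL":
        return {
            "ExposureTime": settings["ExposureTime"],  # fixed, microseconds
            "Gain": settings["Gain"],                  # fixed, dB
            "Iris": settings["Iris"],                  # fixed attenuation, dB
        }
    return {
        "ExposureTimeRange": (settings["MinExposureTime"], settings["MaxExposureTime"]),
        "GainRange": (settings["MinGain"], settings["MaxGain"]),
        "IrisRange": (settings["MinIris"], settings["MaxIris"]),
    }
```
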
Wide dynamic range (on/off). Optional level parameter (unitless). Enumeration describing the available backlight compensation modes. Backlight compensation is disabled. Backlight compensation is enabled. Backlight compensation mode (on/off). Optional level parameter (unit unspecified). Parameters for the absolute focus control. Parameters for the relative focus control. Parameter for the continuous focus control. Position parameter for the absolute focus control. Speed parameter for the absolute focus control. Distance parameter for the relative focus control. Speed parameter for the relative focus control. Speed parameter for the Continuous focus control. Valid ranges of the position. Valid ranges of the speed. Valid ranges of the distance. Valid ranges of the speed. Valid ranges of the speed. Auto whitebalancing mode (auto/manual). Rgain (unitless). Bgain (unitless). Status of focus. Status of focus position. Status of focus MoveStatus. Error status of focus. Type describing the ImagingSettings of a VideoSource. The supported options and ranges can be obtained via the GetOptions command. Enabled/disabled BLC mode (on/off). Image brightness (unit unspecified). Color saturation of the image (unit unspecified). Contrast of the image (unit unspecified). Exposure mode of the device. Focus configuration. Infrared Cutoff Filter settings. Sharpness of the Video image. WDR settings. White balance settings. Optional element to configure Image Stabilization feature. An optional parameter applied only to auto mode to adjust the timing of toggling the Ir cut filter. Optional element to configure Image Contrast Compensation. Optional element to configure Image Defogging. Optional element to configure Image Noise Reduction. Parameter to enable/disable Image Stabilization feature. Optional level parameter (unit unspecified). Specifies which boundaries the following parameters for automatically toggling the Ir cut filter are applied to.
Its options shall be chosen from tt:IrCutFilterAutoBoundaryType. Adjusts the boundary exposure level for toggling the Ir cut filter to on/off, specified with a unitless normalized value from +1.0 to -1.0. Zero is the default and -1.0 is the darkest adjustment (unitless). Delay time of toggling the Ir cut filter to on/off after crossing of the boundary exposure levels. Type describing whether WDR mode is enabled or disabled (on/off). Wide dynamic range mode (on/off). Optional level parameter (unit unspecified). Type describing whether BLC mode is enabled or disabled (on/off). Backlight compensation mode (on/off). Optional level parameter (unit unspecified). Type describing the exposure settings. Exposure Mode
  • Auto – Enables the exposure algorithm on the device.
  • Manual – Disables the exposure algorithm on the device.
The exposure priority mode (low noise/framerate). Rectangular exposure mask. Minimum value of exposure time range allowed to be used by the algorithm. Maximum value of exposure time range allowed to be used by the algorithm. Minimum value of the sensor gain range that is allowed to be used by the algorithm. Maximum value of the sensor gain range that is allowed to be used by the algorithm. Minimum value of the iris range allowed to be used by the algorithm. 0dB maps to a fully opened iris and positive values map to higher attenuation. Maximum value of the iris range allowed to be used by the algorithm. 0dB maps to a fully opened iris and positive values map to higher attenuation. The fixed exposure time used by the image sensor (μs). The fixed gain used by the image sensor (dB). The fixed attenuation of input light affected by the iris (dB). 0dB maps to a fully opened iris and positive values map to higher attenuation.
Parameter to enable/disable or automate the ToneCompensation feature. Its options shall be chosen from tt:ToneCompensationMode Type. Optional level parameter specified with a unitless normalized value from 0.0 to +1.0. Parameter to enable/disable or automate the Defogging feature. Its options shall be chosen from tt:DefoggingMode Type. Optional level parameter specified with a unitless normalized value from 0.0 to +1.0. Level parameter specified with a unitless normalized value from 0.0 to +1.0. Level=0 means no noise reduction or minimal noise reduction. Valid range of Backlight Compensation. Valid range of Brightness. Valid range of Color Saturation. Valid range of Contrast. Valid range of Exposure. Valid range of Focus. Valid range of IrCutFilterModes. Valid range of Sharpness. Valid range of WideDynamicRange. Valid range of WhiteBalance. Options of parameters for Image Stabilization feature. Options of parameters for adjustment of Ir cut filter auto mode. Options of parameters for Tone Compensation feature. Options of parameters for Defogging feature. Options of parameter for Noise Reduction feature. Supported options of Image Stabilization mode parameter. Valid range of the Image Stabilization. Supported options of boundary types for adjustment of Ir cut filter auto mode. The options shall be chosen from tt:IrCutFilterAutoBoundaryType. Indicates whether or not a boundary offset for toggling the Ir cut filter is supported. Supported range of delay time for toggling the Ir cut filter. 'ON' or 'OFF'. Level range of BacklightCompensation. Exposure Mode
  • Auto – Enables the exposure algorithm on the device.
  • Manual – Disables the exposure algorithm on the device.
The exposure priority mode (low noise/framerate). Valid range of the Minimum ExposureTime. Valid range of the Maximum ExposureTime. Valid range of the Minimum Gain. Valid range of the Maximum Gain. Valid range of the Minimum Iris. Valid range of the Maximum Iris. Valid range of the ExposureTime. Valid range of the Gain. Valid range of the Iris.
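The valid ranges listed above are what a client checks a manual-exposure request against before calling SetImagingSettings. A minimal validation sketch, assuming an illustrative dict layout where each option maps a name to a (min, max) tuple:

```python
def validate_exposure_request(request, options):
    """Collect the names of requested exposure settings that fall
    outside the device's reported valid ranges. Both dict layouts are
    assumptions for this sketch, not the schema serialization."""
    errors = []
    for name in ("ExposureTime", "Gain", "Iris"):
        lo, hi = options[name]
        if not lo <= request[name] <= hi:
            errors.append(name)  # out of the device's valid range
    return errors
```
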
Valid ranges for the absolute control. Valid ranges for the relative control. Valid ranges for the continuous control. Valid ranges of the distance. Valid ranges of the speed. 'AUTO' or 'MANUAL' Rgain (unitless). Bgain (unitless). Mode of auto focus.
  • AUTO - The device automatically adjusts focus.
  • MANUAL - The device does not automatically adjust focus.
Note: for devices supporting both manual and auto operation at the same time, manual operation may be supported even if the Mode parameter is set to Auto.
Parameter to set autofocus near limit (unit: meter). Parameter to set autofocus far limit (unit: meter).
Zero or more modes as defined in enumeration tt:AFModes.
Mode of WhiteBalance.
  • AUTO
  • MANUAL
Supported modes for auto focus.
  • AUTO - The device supports automatic focus adjustment.
  • MANUAL - The device supports manual focus adjustment.
Valid range of DefaultSpeed. Valid range of NearLimit. Valid range of FarLimit.
Supported options for auto focus. Options shall be chosen from tt:AFModes. Supported options for Tone Compensation mode. Its options shall be chosen from tt:ToneCompensationMode Type. Indicates whether or not the Level parameter is supported for Tone Compensation. Supported options for Defogging mode. Its options shall be chosen from tt:DefoggingMode Type. Indicates whether or not the Level parameter is supported for Defogging. Indicates whether or not the Level parameter is supported for NoiseReduction. Token value pairs that triggered this message. Typically only one item is present. Value name pair as defined by the corresponding description. Item name. Item value. The type is defined in the corresponding description. Complex value structure. XML tree containing the element value as defined in the corresponding description. Item name. Set of tokens producing this message. The list may only contain SimpleItemDescription items. The set of tokens identifies the component within the WS-Endpoint which is responsible for producing the message.
For analytics events the token set shall include the VideoSourceConfigurationToken, the VideoAnalyticsConfigurationToken and the name of the analytics module or rule.
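The three-group Message layout (Source, Key, Data) with the analytics token set described above can be sketched with the standard library's ElementTree. Namespace prefixes are omitted for brevity; real ONVIF messages qualify these elements with the tt: namespace, and the token values here are invented examples.

```python
import xml.etree.ElementTree as ET

def build_message(source_items, key_items, data_items):
    """Build a Message element with Source/Key/Data groups of
    SimpleItems, as the description recommends. Empty groups are
    omitted. Item values must already be strings."""
    msg = ET.Element("Message")
    for group, items in (("Source", source_items),
                         ("Key", key_items),
                         ("Data", data_items)):
        if not items:
            continue
        g = ET.SubElement(msg, group)
        for name, value in items.items():
            ET.SubElement(g, "SimpleItem", Name=name, Value=value)
    return msg

# Analytics-style source token set, per the text above.
msg = build_message(
    {"VideoSourceConfigurationToken": "VSC1",
     "VideoAnalyticsConfigurationToken": "VAC1",
     "Rule": "MyMotionDetector"},
    {},
    {"IsMotion": "true"},
)
```
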
Describes optional message payload parameters that may be used as key. For example, object IDs of tracked objects are conveyed as keys. Describes the payload of the message.
Must be set to true when the described Message relates to a property. An alternative term of "property" is a "state" in contrast to a pure event, which contains relevant information for only a single point in time.
Default is false.
Describes a list of items. Each item in the list shall have a unique name. The list is designed as a linear structure without optional or unbounded elements. Use ElementItems only when complex structures are inevitable. Description of a simple item. The type must be of category simpleType (xs:string, xs:integer, xs:float, ...). Item name. Must be unique within a list. Description of a complex type. The Type must reference a defined type. Item name. Must be unique within a list. The type of the item. The Type must reference a defined type. List of configuration parameters as defined in the corresponding description. Name of the configuration. The Type attribute specifies the type of rule and shall be equal to the value of one of the Name attributes of the ConfigDescription elements returned by the GetSupportedRules and GetSupportedAnalyticsModules commands. List describing the configuration parameters. The names of the parameters must be unique. If possible, SimpleItems should be used to transport the information to ease parsing of dynamically defined messages by a client application. The analytics modules and rule engine produce Events, which must be listed within the Analytics Module Description. In order to do so, the structure of the Message is defined and consists of three groups: Source, Key, and Data. It is recommended to use SimpleItemDescriptions wherever applicable. The name of all Items must be unique within all Items contained in any group of this Message. Depending on the component, multiple parameters or none may be needed to identify the component uniquely. The topic of the message. For historical reasons the element is named ParentTopic, but the full topic is expected. The Name attribute (e.g. "tt::LineDetector") uniquely identifies the type of rule, not a type definition in a schema. The fixed attribute signals that it is not allowed to add or remove this type of configuration. The maxInstances attribute signals the maximum number of instances per configuration.
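The unique-name constraint on item lists described above is easy to enforce client-side. A minimal sketch, taking just the item names as input:

```python
def validate_item_list(names):
    """Check the ItemListDescription rule that every item name is
    unique within the list. Raises ValueError on the first duplicate,
    otherwise returns True. `names` is an iterable of item names."""
    seen = set()
    for name in names:
        if name in seen:
            raise ValueError(f"duplicate item name: {name}")
        seen.add(name)
    return True
```
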
Lists the location of all schemas that are referenced in the rules. List of rules supported by the Video Analytics configuration. Maximum number of concurrent instances. It optionally contains a list of URLs that provide the location of schema files. These schema files describe the types and elements used in the analytics module descriptions. Analytics module descriptions that reference types or elements imported from any ONVIF defined schema files need not explicitly list those schema files. Maximum number of concurrent instances. Contains an array of Polyline. Contains PolylineArray configuration data. The Motion Expression data structure contains a motion expression which is based on the Scene Descriptor schema with XPATH syntax. The Type argument could allow the introduction of different dialects. Contains the Rule MotionExpression configuration. Mapping of the cell grid to the Video frame. The cell grid starts from the upper left corner; the x dimension goes from left to right and the y dimension from top to bottom. Number of columns of the cell grid (x dimension). Number of rows of the cell grid (y dimension). Configuration of the streaming and coding settings of a Video window. Optional name of the pane configuration. If the device has audio outputs, this element contains a pointer to the audio output that is associated with the pane. A client can retrieve the available audio outputs of a device using the GetAudioOutputs command of the DeviceIO service. If the device has audio sources, this element contains a pointer to the audio source that is associated with this pane. The audio connection from a decoder device to the NVT is established using the backchannel mechanism. A client can retrieve the available audio sources of a device using the GetAudioSources command of the DeviceIO service. The configuration of the audio encoder including codec, bitrate and sample rate. A pointer to a Receiver that has the necessary information to receive data from a Transmitter.
This Receiver can be connected and the network video decoder displays the received data on the specified outputs. A client can retrieve the available Receivers using the GetReceivers command of the Receiver Service. A unique identifier in the display device. A pane layout describes one Video window of a display. It links a pane configuration to a region of the screen. Reference to the configuration of the streaming and coding parameters. Describes the location and size of the area on the monitor. The area coordinate values are expressed in normalized units [-1.0, 1.0]. A layout describes a set of Video windows that are displayed simultaneously on a display. List of panes assembling the display layout. This type contains the Audio and Video coding capabilities of a display service. If the device supports audio encoding this section describes the supported codecs and their configuration. If the device supports audio decoding this section describes the supported codecs and their settings. This section describes the supported video codecs and their configuration. The options supported for a display layout. Lists the possible Pane Layouts of the Video Output. Description of a pane layout describing a complete display layout. List of areas assembling a layout. Coordinate values are in the range [-1.0, 1.0]. Description of a receiver, including its token and configuration. Unique identifier of the receiver. Describes the configuration of the receiver. Describes the configuration of a receiver. The following connection modes are defined: Details of the URI to which the receiver should connect. Stream connection parameters. Specifies a receiver connection mode. The receiver connects on demand, as required by consumers of the media streams. The receiver attempts to maintain a persistent connection to the configured endpoint. The receiver does not attempt to connect. This case should never happen. Specifies the current connection state of the receiver.
The receiver is not connected. The receiver is attempting to connect. The receiver is connected. This case should never happen. Contains information about a receiver's current state. The connection state of the receiver may have one of the following states: Indicates whether or not the receiver was created automatically. The earliest point in time where there is recorded data on the device. The most recent point in time where there is recorded data on the device. The device contains this many recordings. A structure for defining a limited scope when searching in recorded data. A list of sources that are included in the scope. If this list is included, only data from one of these sources shall be searched. A list of recordings that are included in the scope. If this list is included, only data from one of these recordings shall be searched. An XPath expression used to specify what recordings to search. Only those recordings with a RecordingInformation structure that matches the filter shall be searched. Extension point. The lower boundary of the PTZ volume to look for. The upper boundary of the PTZ volume to look for. If true, search for when entering the specified PTZ volume. The state of the search when the result is returned. Indicates if there can be more results, or if the search is completed. A RecordingInformation structure for each found recording matching the search. The state of the search when the result is returned. Indicates if there can be more results, or if the search is completed. A FindEventResult structure for each found event matching the search. The recording where this event was found. Empty string if no recording is associated with this event. A reference to the track where this event was found. Empty string if no track is associated with this event. The time when the event occurred. The description of the event.
If true, indicates that the event is a virtual event generated for this particular search session to give the state of a property at the start time of the search. The state of the search when the result is returned. Indicates if there can be more results, or if the search is completed. A FindPTZPositionResult structure for each found PTZ position matching the search. A reference to the recording containing the PTZ position. A reference to the metadata track containing the PTZ position. The time when the PTZ position was valid. The PTZ position. The state of the search when the result is returned. Indicates if there can be more results, or if the search is completed. A FindMetadataResult structure for each found set of Metadata matching the search. A reference to the recording containing the metadata. A reference to the metadata track containing the matching metadata. The point in time when the matching metadata occurs in the metadata track. The search is queued and not yet started. The search is underway and not yet completed. The search has been completed and no new results will be found. The state of the search is unknown. (This is not a valid response from GetSearchState.) Information about the source of the recording. This gives a description of where the data in the recording comes from. Since a single recording is intended to record related material, there is just one source. It indicates the physical location or the major data source for the recording. Currently the recording configuration cannot describe each individual data source. Basic information about the track. Note that a track may represent a single contiguous time span or consist of multiple slices. A set of informative descriptions of a data source. The Search service allows a client to filter on recordings based on information in this structure. Identifier for the source chosen by the client that creates the structure. This identifier is opaque to the device.
Clients may use any type of URI for this field. A device shall support at least 128 characters. Informative user readable name of the source, e.g. "Camera23". A device shall support at least 20 characters. Informative description of the physical location of the source, e.g. the coordinates on a map. Informative description of the source. URI provided by the service supplying data to be recorded. A device shall support at least 128 characters. MP4 files with all tracks in a single file. CMAF compliant MP4 files with 1 track per file. AES-CTR mode full sample and video NAL Subsample encryption, defined in ISO/IEC 23001-7. AES-CBC mode partial video NAL pattern encryption, defined in ISO/IEC 23001-7. Key ID of the associated key for encryption. Key for encrypting content. The device shall not include this parameter when reading. Optional list of track tokens to be encrypted. If no track tokens are specified, all tracks are encrypted and no other encryption configurations shall exist for the recording. Each track shall only be contained in one encryption configuration. Mode of encryption. See tt:EncryptionMode for a list of definitions and capability trc:SupportedEncryptionModes for the supported encryption modes. Token of a storage configuration. Format of the recording. See tt:TargetFormat for a list of definitions and capability trc:SupportedTargetFormats for the supported formats. Path prefix to be inserted in the object key. Path postfix to be inserted in the object key. Maximum duration of a span. Maximum duration of a segment. Optional encryption configuration. See capability trc:EncryptionEntryLimit for the number of supported entries. By specifying multiple encryption entries per recording, different tracks can be encrypted with different configurations. Each track shall only be contained in one encryption configuration. This case should never happen. Type of the track: "Video", "Audio" or "Metadata". The track shall only be able to hold data of that type. 
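The search-state progression listed earlier (Queued, Searching, Completed, with Unknown never a valid response) suggests a simple polling loop around GetSearchState. A sketch, where `get_state` is a caller-supplied callable standing in for that operation:

```python
import time

def wait_for_search(get_state, poll_s=0.0, max_polls=10):
    """Poll a search session until it reaches the "Completed" state.
    Returns True on completion, False if max_polls is exhausted, and
    raises if the device reports the invalid "Unknown" state."""
    for _ in range(max_polls):
        state = get_state()
        if state == "Completed":
            return True
        if state == "Unknown":
            raise RuntimeError("device reported an invalid search state")
        time.sleep(poll_s)  # back off between GetSearchState calls
    return False

# Simulated device responses following the documented progression.
states = iter(["Queued", "Searching", "Searching", "Completed"])
done = wait_for_search(lambda: next(states))
```
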
Informative description of the contents of the track. The start date and time of the oldest recorded data in the track. The stop date and time of the newest recorded data in the track. Placeholder for future extension. A set of media attributes valid for a recording at a point in time or for a time interval. A reference to the recording that has these attributes. A set of attributes for each track. The attributes are valid from this point in time in the recording. The attributes are valid until this point in time in the recording. Can be equal to 'From' to indicate that the attributes are only known to be valid for this particular point in time. The basic information about the track. Note that a track may represent a single contiguous time span or consist of multiple slices. If the track is a video track, exactly one of this structure shall be present and contain the video attributes. If the track is an audio track, exactly one of this structure shall be present and contain the audio attributes. If the track is a metadata track, exactly one of this structure shall be present and contain the metadata attributes. Average bitrate in kbps. The width of the video in pixels. The height of the video in pixels. Video encoding of the track. Use value from tt:VideoEncoding for MPEG4. Otherwise use values from tt:VideoEncodingMimeNames and IANA Media Types. Average framerate in frames per second. The bitrate in kbps. Audio encoding of the track. Use values from tt:AudioEncoding for G711 and AAC. Otherwise use values from tt:AudioEncodingMimeNames and IANA Media Types. The sample rate in kHz. Indicates that there can be PTZ data in the metadata track in the specified time interval. Indicates that there can be analytics data in the metadata track in the specified time interval. Indicates that there can be notifications in the metadata track in the specified time interval. List of all PTZ spaces active for recording.
Note that events are only recorded on position changes and the actual point of recording may not necessarily contain an event of the specified type. Information about the source of the recording. Informative description of the source. Specifies the maximum time that data in any track within the recording shall be stored. The device shall delete any data older than the maximum retention time. Such data shall not be accessible anymore. If the MaximumRetentionTime is set to 0, the device shall not limit the retention time of stored data, except by resource constraints. Whatever the value of MaximumRetentionTime, the device may automatically delete recordings to free up storage space for new recordings. Optional external storage target configuration. Type of the track. It shall be equal to the strings “Video”, “Audio” or “Metadata”. The track shall only be able to hold data of that type. Informative description of the track. Token of the recording. Configuration of the recording. List of tracks. Configuration of a track. Token of the track. Configuration of the track. Identifies the recording to which this job shall store the received data. The mode of the job. If it is idle, nothing shall happen. If it is active, the device shall try to obtain data from the receivers. A client shall use GetRecordingJobState to determine if data transfer is really taking place.
The only valid values for Mode shall be “Idle” and “Active”.
This shall be a non-negative number. If there are multiple recording jobs that store data to the same track, the device will only store the data for the recording job with the highest priority. The priority is specified per recording job, but the device shall determine the priority of each track individually. If there are two recording jobs with the same priority, the device shall record the data corresponding to the recording job that was activated most recently. Source of the recording. Optional filter defining on which event condition a recording job gets active.
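The priority rules above (highest priority wins; a tie goes to the most recently activated job) can be sketched as follows. The tuple layout and function name are illustrative, not part of the ONVIF schema:

```python
def select_recording_job(jobs):
    """Pick the job whose data the device stores on a shared track.

    jobs: list of (priority, activation_time, job_token) tuples for all
    active recording jobs targeting the same track. The highest priority
    wins; ties are broken in favor of the most recently activated job.
    """
    priority, activated, token = max(jobs, key=lambda j: (j[0], j[1]))
    return token
```

For example, with jobs `[(1, 10, "job-a"), (2, 5, "job-b"), (2, 7, "job-c")]`, jobs "b" and "c" share the highest priority, so the later-activated "job-c" is selected.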
This attribute adds an additional requirement for activating the recording job. If this optional field is provided the job shall only record if the schedule exists and is active.
Topic filter as defined in section 9.6.3 of the ONVIF Core Specification. Optional message source content filter as defined in section 9.4.4 of the ONVIF Core Specification. Optional timespan to record before the actual event condition became active. Optional timespan to record after the actual event condition becomes inactive. This field shall be a reference to the source of the data. The type of the source is determined by the attribute Type in the SourceToken structure. If Type is http://www.onvif.org/ver10/schema/Receiver, the token is a ReceiverReference. In this case the device shall receive the data over the network. If Type is http://www.onvif.org/ver10/schema/Profile, the token identifies a media profile, instructing the device to obtain data from a profile that exists on the local device. If this field is TRUE, and if the SourceToken is omitted, the device shall create a receiver object (through the receiver service) and assign the ReceiverReference to the SourceToken field. When retrieving the RecordingJobConfiguration from the device, the AutoCreateReceiver field shall never be present. List of tracks associated with the recording. If the received RTSP stream contains multiple tracks of the same type, the SourceTag differentiates between those Tracks. This field can be ignored in case of recording a local source. The destination is the tracktoken of the track to which the device shall store the received data. Identification of the recording that the recording job records to. Holds the aggregated state over the whole RecordingJobInformation structure. Identifies the data source of the recording job. Identifies the data source of the recording job. Holds the aggregated state over all substructures of RecordingJobStateSource. List of track items. Identifies the track of the data source that provides the data. Indicates the destination track. Optionally holds an implementation defined string value that describes the error. 
The string should be in the English language. Provides the job state of the track. The valid values of state shall be “Idle”, “Active” and “Error”. If state equals “Error”, the Error field may be filled in with an implementation defined value. Configuration parameters for the replay service. The RTSP session timeout. Token of the analytics engine (AnalyticsEngine) being controlled. Token of the analytics engine configuration (VideoAnalyticsConfiguration) in effect. Tokens of the input (AnalyticsEngineInput) configuration applied. Tokens of the receiver providing media input data. The order of ReceiverToken shall exactly match the order of InputToken. This case should never happen. Token of the control object whose status is requested. The Action Engine Event Payload data structure contains the information about ONVIF command invocations. Since this event could be generated by other or proprietary actions, the command invocation specific fields are defined as optional and an additional extension mechanism is provided for future or additional action definitions. Request Message. Response Message. Fault Message. Acceptable AudioClassType values are: gun_shot, scream, glass_breaking, tire_screech. Indicates the audio class label. A likelihood/probability that the corresponding audio event belongs to this class. The sum of the likelihoods shall NOT exceed 1. Array of audio class labels and class probabilities. For OSD position type, the following are pre-defined:
  • UpperLeft
  • UpperRight
  • LowerLeft
  • LowerRight
  • Custom
The value range of "Transparent" can be defined by vendors but should follow this rule: the minimum value means non-transparent and the maximum value means fully transparent. The following OSD text types are defined:
  • Plain - The Plain type means the OSD is shown as a text string, which is defined in the "PlainText" item.
  • Date - The Date type means the OSD is shown as a date, format of which should be present in the "DateFormat" item.
  • Time - The Time type means the OSD is shown as a time, format of which should be present in the "TimeFormat" item.
  • DateAndTime - The DateAndTime type means the OSD is shown as date and time, format of which should be present in the "DateFormat" and the "TimeFormat" item.
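The rules above pair each text type with the fields it requires. A minimal sketch of enforcing them when building a configuration; the dict keys and function name are assumptions for illustration, not the exact schema element names:

```python
def make_osd_text_config(text_type, plain_text=None,
                         date_format=None, time_format=None):
    """Build a minimal OSD text configuration dict, enforcing which
    fields each OSD text type requires."""
    if text_type == "Plain" and plain_text is None:
        raise ValueError("Plain type requires PlainText")
    if text_type in ("Date", "DateAndTime") and date_format is None:
        raise ValueError("Date and DateAndTime types require DateFormat")
    if text_type in ("Time", "DateAndTime") and time_format is None:
        raise ValueError("Time and DateAndTime types require TimeFormat")
    cfg = {"Type": text_type}
    if plain_text is not None:
        cfg["PlainText"] = plain_text
    if date_format is not None:
        cfg["DateFormat"] = date_format
    if time_format is not None:
        cfg["TimeFormat"] = time_format
    return cfg
```

For instance, a DateAndTime OSD needs both a DateFormat and a TimeFormat, while a Plain OSD only needs its text.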
List of supported OSD date formats. This element shall be present when the value of the Type field is Date or DateAndTime. The following date formats are defined:
  • M/d/yyyy - e.g. 3/6/2013
  • MM/dd/yyyy - e.g. 03/06/2013
  • dd/MM/yyyy - e.g. 06/03/2013
  • yyyy/MM/dd - e.g. 2013/03/06
  • yyyy-MM-dd - e.g. 2013-03-06
  • dddd, MMMM dd, yyyy - e.g. Wednesday, March 06, 2013
  • MMMM dd, yyyy - e.g. March 06, 2013
  • dd MMMM, yyyy - e.g. 06 March, 2013
List of supported OSD time formats. This element shall be present when the value of the Type field is Time or DateAndTime. The following time formats are defined:
  • h:mm:ss tt - e.g. 2:14:21 PM
  • hh:mm:ss tt - e.g. 02:14:21 PM
  • H:mm:ss - e.g. 14:14:21
  • HH:mm:ss - e.g. 14:14:21
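The tokens in these patterns follow .NET-style date/time format specifiers. As a hedged sketch (devices render the OSD themselves; this function is only an illustration of what the listed tokens mean), the patterns can be expanded against a Python datetime:

```python
import re
from datetime import datetime

def format_osd_datetime(dt, pattern):
    """Render a .NET-style OSD date/time pattern for a given datetime."""
    tokens = {
        "dddd": dt.strftime("%A"),            # full weekday name
        "MMMM": dt.strftime("%B"),            # full month name
        "yyyy": f"{dt.year:04d}",
        "MM": f"{dt.month:02d}", "dd": f"{dt.day:02d}",
        "HH": f"{dt.hour:02d}", "hh": f"{(dt.hour % 12) or 12:02d}",
        "mm": f"{dt.minute:02d}", "ss": f"{dt.second:02d}",
        "tt": "AM" if dt.hour < 12 else "PM",
        "M": str(dt.month), "d": str(dt.day),
        "H": str(dt.hour), "h": str((dt.hour % 12) or 12),
    }
    # Try the longest tokens first so "MM" is not consumed as two "M"s.
    rx = re.compile("|".join(sorted(map(re.escape, tokens),
                                    key=len, reverse=True)))
    return rx.sub(lambda m: tokens[m.group(0)], pattern)
```

For the example date used in the lists above, `format_osd_datetime(datetime(2013, 3, 6, 14, 14, 21), "h:mm:ss tt")` yields `"2:14:21 PM"`.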
Font size of the text in pt. Font color of the text. Background color of the text. The content of text to be displayed.
This flag is applicable for Type Plain and defaults to true. When set to false the PlainText content will not be persistent across device reboots.
The URI of the image to be displayed. Acceptable values are the same as in tt:Color. Describes the supported colors. Either list each color or define the range of color values. Lists the supported colors. Defines the range of supported color values. Describes the color options and transparency. Optional list of supported colors. Range of the transparency level. Larger values mean more transparent. List of supported OSD text types. When a device indicates the supported number relating to Text type in MaximumNumberOfOSDs, the type shall be presented. Range of the font size value. List of supported date formats. List of supported time formats. List of supported font colors. List of supported background colors. List of available image URIs. List of supported image MIME types, such as "image/png". The maximum size (in bytes) of the image that can be uploaded. The maximum width (in pixels) of the image that can be uploaded. The maximum height (in pixels) of the image that can be uploaded. Reference to the video source configuration. Type of OSD. Position configuration of OSD. Text configuration of OSD. It shall be present when the value of the Type field is Text. Image configuration of OSD. It shall be present when the value of the Type field is Image. The maximum number of OSD configurations supported for the specified video source configuration. If the configuration does not support OSDs, this value shall be zero and the Type and PositionOption elements are ignored. If a device limits the number of instances by OSDType, it shall indicate the supported number for each type via the related attribute. Lists the supported types of OSD configuration. When a device indicates the supported number for each type in MaximumNumberOfOSDs, the related type shall be presented. A device shall return the Option element relating to each listed type. Lists the available OSD position types. The following are pre-defined:
  • UpperLeft
  • UpperRight
  • LowerLeft
  • LowerRight
  • Custom
Option of the OSD text configuration. This element shall be returned if the device signals support for Text. Option of the OSD image configuration. This element shall be returned if the device signals support for Image.
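A device advertising image OSD support constrains uploads by MIME type, byte size, and pixel dimensions, per the option fields above. A sketch of the corresponding client-side check; the option dict keys are assumed names mirroring those fields, not exact schema identifiers:

```python
def image_upload_allowed(mime_type, size_bytes, width, height, options):
    """Check a candidate OSD image against the advertised image options.

    options: dict with assumed keys "FormatsSupported" (list of MIME
    types), "MaxSize" (bytes), "MaxWidth" and "MaxHeight" (pixels).
    """
    return (mime_type in options["FormatsSupported"]
            and size_bytes <= options["MaxSize"]
            and width <= options["MaxWidth"]
            and height <= options["MaxHeight"])
```

A client would run this check before uploading, rather than relying on the device to reject an oversized or unsupported image.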
Exported file name. Normalized percentage completion for uploading the exported file. Exported file name and export progress information. Identifier of an existing storage configuration. Gives the relative directory path on the storage. True if the device supports defining a region only using a Rectangle. The rectangle points are still passed using a Polygon element if the device does not support polygon regions. In this case, the points provided in the Polygon element shall represent a rectangle. Provides the minimum and maximum number of points that can be defined in the Polygon. If RectangleOnly is not set to true, this parameter is required.
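When only rectangles are supported, the four Polygon points must describe an axis-aligned rectangle. One way a client might validate this before sending the request (a sketch, not mandated by the specification):

```python
def is_axis_aligned_rectangle(points):
    """True if the four (x, y) points are the corners of an axis-aligned
    rectangle: exactly two distinct x values and two distinct y values,
    with every corner combination present exactly once."""
    if len(points) != 4:
        return False
    xs = {x for x, _ in points}
    ys = {y for _, y in points}
    if len(xs) != 2 or len(ys) != 2:
        return False
    return set(points) == {(x, y) for x in xs for y in ys}
```

Note this check is order-independent; if a device additionally requires the points in a particular winding order, that would need a separate check.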