author:    Joe Ludwig <email@example.com>  Tue Nov 27 16:54:42 2018
committer: Joe Ludwig <firstname.lastname@example.org>  Tue Nov 27 16:54:42 2018
OpenVR SDK Update 1.1.3

General:
* Added the required SteamVR version number to the header file.

IVRCompositor:
* New VRCompositor_FrameTiming ReprojectionFlag: VRCompositor_ReprojectionMotion. This flag is set for application frames where motion smoothing was applied at least one of the times the frame was displayed.
* New interface function IsMotionSmoothingEnabled to determine whether the user has enabled motion smoothing.

IVRChaperone:
* Added VREvent_ChaperoneFlushCache, which is sent when the application should reload any cached data it loaded from the chaperone API. This event indicates that the user's current chaperone settings have changed.
* The VREvent_ChaperoneDataHasChanged event is no longer sent, and IVRChaperone::ReloadInfo no longer has any effect.

IVRChaperoneSetup:
* Removed some unimplemented functions:
  * SetWorkingCollisionBoundsTagsInfo
  * GetLiveCollisionBoundsTagsInfo
  * SetWorkingPhysicalBoundsInfo
  * GetLivePhysicalBoundsInfo
* Added ShowWorkingSetPreview/HideWorkingSetPreview, which cause the application's current working set to be shown as the chaperone bounds rendered by the compositor. Unless your application modifies the user's chaperone bounds, you won't need to call these functions. They are independent of the bounds turning on and off based on how close the user is to them.

IVROverlay:
* Added flag VROverlayFlags_MakeOverlaysInteractiveIfVisible. If this flag is set on an overlay and that overlay is visible, SteamVR will be placed into laser mouse mode. This prevents the scene application from receiving any input, so use this flag carefully.
* Changed SetOverlayDualAnalogTransform to take a pointer to better support the C API.

IVRTrackedCamera:
* For headsets (such as the Vive Pro) that include multiple camera images in a single video stream, GetCameraIntrinsics and GetCameraProjection now take a uint32_t nCameraIndex parameter to request data about a specific camera.
IVRInput:
* Added an argument to GetOriginLocalizedName that lets the caller specify which parts of the name they want in the returned string. The possible values are:
  * VRInputString_Hand - which hand the origin is in, e.g. "Left Hand"
  * VRInputString_ControllerType - what kind of controller the user has in that hand, e.g. "Vive Controller"
  * VRInputString_InputSource - what part of that controller is the origin, e.g. "Trackpad"
  * VRInputString_All - all of the above, e.g. "Left Hand Vive Controller Trackpad"
* Skeletal Input:
  * Added GetBoneCount, which returns the number of bones in the skeleton associated with an action.
  * Added GetBoneHierarchy, which returns the index of each bone's parent in a user-provided array.
  * Added GetBoneName, which retrieves the name of a bone in the skeleton.
  * Added GetSkeletalReferenceTransforms, which retrieves the transforms for several specific hand skeleton poses:
    * Bind Pose
    * Open Hand
    * Fist
    * Grip Limit, which is the shape of the hand when closed around the controller
  * Added GetSkeletalTrackingLevel, which retrieves an estimate of the level of detail with which the controller associated with an action can track the actual movement of the user's body. The levels are:
    * Estimated: body part location can't be directly determined by the device. Any skeletal pose provided by the device is estimated by assuming the position required to activate buttons, triggers, joysticks, or other input sensors, e.g. Vive Controller, gamepad.
    * Partial: body part location can be measured directly but with fewer degrees of freedom than the actual body part. Certain body part positions may be unmeasured by the device and estimated from other input data, e.g. Knuckles, gloves that only measure finger curl.
    * Full: body part location can be measured directly throughout the entire range of motion of the body part, e.g. a mocap suit for the full body, gloves that measure the rotation of each finger segment.
  * Added GetSkeletalSummaryData, which returns metadata about the current pose of the hand, such as finger curl and splay.
  * Removed ulRestrictToDevice as a parameter from all skeletal input functions.

Driver API:
* Added TrackedControllerRole_Treadmill, which lets a driver specify that a device is intended to function as a treadmill. This opts the device out of hand selection. The new input path /user/treadmill is automatically assigned to the first treadmill device to activate.

IVRCameraComponent:
* Added CVS_FORMAT_BAYER16BG for cameras that support delivering raw sensor data.
* Added a camera index to GetCameraDistortion, GetCameraProjection, and GetCameraIntrinsics to support multiple cameras on the same device (see also IVRTrackedCamera).
* Added the ability for cameras to return one of a small set of programmatic distortion function types and function parameters, in addition to or instead of UV-sampling distortion through GetCameraDistortion. See EVRDistortionFunctionType and IVRCameraComponent::GetCameraIntrinsics, and refer to the OpenCV camera calibration and undistortion documentation.

IVRDriverInput:
* Added a parameter to CreateSkeletonComponent that allows the driver to specify the skeletal tracking level the controller supports.

Samples:
* Fixed a texture corruption bug in hellovr_vulkan when a controller is turned on after starting the application.

[git-p4: depot-paths = "//vr/steamvr/sdk_release/": change = 4837234]
OpenVR is an API and runtime that allows access to VR hardware from multiple vendors without requiring that applications have specific knowledge of the hardware they are targeting. This repository is an SDK that contains the API and samples. The runtime is under SteamVR in Tools on Steam.
Documentation for the API is available on the GitHub Wiki.
More information on OpenVR and SteamVR can be found at http://steamvr.com