XML configuration files let you specify the devices you use as well as their parameters
(VRPN host, port, buttons, and analogs), the SUI model (user position, screen positions and
orientations), and the navigation modes.
The root XML element of the configuration is an IVConfiguration node. It contains several attributes such as the XSD path and the name of the configuration. The XSD file can be found in C:\Program Files\Dassault Systemes\B424\win_b64\resources\xsd.
The IVConfiguration node contains four child elements describing four
sections of the configuration file:
Input Devices
SUI Model
Navigation
Device Mapping.
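As a sketch of the overall file layout (the schema-reference attribute and the exact child element spellings are assumptions; check the XSD shipped in win_b64\resources\xsd for the authoritative names):

```xml
<!-- Hypothetical sketch: element and attribute spellings are assumptions. -->
<IVConfiguration name="MyConfiguration"
                 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                 xsi:noNamespaceSchemaLocation="IVConfiguration.xsd">
  <InputDevices>   <!-- devices to use and their parameters --> </InputDevices>
  <SUIModel>       <!-- users, screens, markers -->             </SUIModel>
  <Navigation>     <!-- navigation modes -->                    </Navigation>
  <DeviceMapping>  <!-- device-to-action mapping -->            </DeviceMapping>
</IVConfiguration>
```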
Input Devices
This section of the configuration file describes the
devices to use and their parameters.
The type must be one of the supported types (VRPN device, and so on).
The name of the device can be used as reference in the
Device Mapping section.
The physicalReferential of the device defines the orientation (euler_zxy) and position (float3) of the device referential in the interaction context referential.
Some devices require additional, device-specific attributes.
VRPN Device
The VRPN device is composed of one or several sensors. The syntax for Sensor is as follows:
Sensor can only be of type tracker, button, or analog.
devId must be different for each sensor in one category. It is the VRPN id.
devId corresponds to button in the VRPN glossary for buttons.
devId corresponds to sensor in the VRPN glossary for trackers.
devId corresponds to channel in the VRPN glossary for analogs.
The name of each sensor can be used as reference in the Device Mapping section.
Before using a VRPN device, a VRPN server must be created, because VRPN devices are accessed remotely.
Below is an example of a device section declaring a VRPN device:
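A hypothetical reconstruction of such a declaration, assuming element and attribute spellings (InputDevice, host, port) that are not confirmed by the text; the sensor names are the ones reused later in the Device Mapping discussion:

```xml
<!-- Sketch only: element/attribute spellings are assumptions. -->
<InputDevice type="VRPN" name="VRPNConfig" host="localhost" port="3883">
  <!-- Device referential in the interaction context referential -->
  <physicalReferential position="0.0 0.0 0.0" orientation="0.0 0.0 0.0"/>
  <!-- devId must be unique within each sensor category -->
  <Sensor type="tracker" devId="0" name="trackerHead"/>
  <Sensor type="tracker" devId="1" name="trackerHand"/>
  <Sensor type="button"  devId="0" name="b1Right"/>
  <Sensor type="button"  devId="1" name="bumperRight"/>
  <Sensor type="analog"  devId="0" name="joystickHorRight"/>
  <Sensor type="analog"  devId="1" name="joystickVerRight"/>
</InputDevice>
```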
The gamepad device is declared with a specific type; its buttons and axes are predefined with fixed names. Each button or axis can be referenced in the Device Mapping
section using the following predefined names:
A: button A
B: button B
X: button X
Y: button Y
LB: rear left button
RB: rear right button
LSB: left stick button
RSB: right stick button
BACK: "back" button
START: start button
LT: left trigger
RT: right trigger
LSY: left stick vertical axis
LSX: left stick horizontal axis
RSY: right stick vertical axis
RSX: right stick horizontal axis
DPAD: D-pad value (0, 45, 90, ..., 315).
Below is an example of a device section declaring a gamepad:
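A minimal sketch, assuming the type value "GamePad" and the element spellings (neither is confirmed by the text); no Sensor children are needed because the buttons and axes are predefined:

```xml
<!-- Hypothetical gamepad declaration: type value and element names
     are assumptions. Buttons/axes are predefined (A, B, X, Y, LB, RB,
     LT, RT, LSX, LSY, RSX, RSY, DPAD, ...) and need no declaration. -->
<InputDevice type="GamePad" name="GamePadConfig">
  <physicalReferential position="0.0 0.0 0.0" orientation="0.0 0.0 0.0"/>
</InputDevice>
```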
You need a configured ART DTrack™ server to send tracking information to the network port
of the computer on which the 3DEXPERIENCE platform is running.
The 3DEXPERIENCE platform must use a configuration file where a peripheral of type DTrackServer is
declared. This peripheral is parameterized with the DTrack server port (through the
port instruction).
Below is an example of a device section declaring an ART DTrack™ server:
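A hedged sketch, assuming the attribute layout and an illustrative port value; only the type DTrackServer and the port parameter come from the text:

```xml
<!-- Hypothetical declaration: attribute spellings and the port value
     are assumptions. The port must match the DTrack server output port. -->
<InputDevice type="DTrackServer" name="DTrackConfig" port="5000"/>
```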
SUI Model
This section is used to describe the SUI (Sensorial User in Interaction) model of the
scenario.
The SUI model is a scenegraph-like
structure describing the different users and their context of
interaction (that is to say, the other elements of the VR system such as
screens or markers). It provides a common abstract representation
of the users and their context of interaction both in the real
world and in the virtual world.
This prevents immersive application developers from having to access low-level services, such as tracking devices, which have no semantics.
The SUI Model mainly stores the physical position of its elements
and a mapping matrix defining the virtual position of the whole
interaction context. It also stores additional attributes such as
screen sizes, resolutions, and so on.
The root element is the SUIScenario which can have two types:
CAVE
Immersive Wall.
The child element of SUIScenario is the InteractionContext.
If the environment variable IV_MANUAL_INIT_V is not set, the current camera of the console is used. Otherwise, the attributes virtualPosition and virtualRotation define the initial value of the interaction context position and orientation in the 3DEXPERIENCE referential.
The child element of InteractionContext is the User or PhysicalEnvironment.
The User element contains elements corresponding to the user's members (hand, head, and so on). The user model can be very simple.
The PhysicalEnvironment contains elements of the VR system, such as screens.
The screen can have additional attributes:
displayPosition: defines the offset of the screen in the video memory.
displayResolution: defines the size of the screen in the video memory.
physicalSize: defines the physical size of the screen (in millimeters).
mainScreen: indicates if the screen has to be treated as part of the main screen by SUI commands/behaviours.
activeStereoscopy: when set to true, this flag indicates that both left and right stereo frames have to be rendered on this screen.
passiveStereoscopy: this attribute specifies which stereo frame (left or right) is to be rendered on this screen.
The 2D screen is located in the xOy plane and is centered on the
origin of its local coordinate system.
Below is an example of a SUI Model section declaring a user with a head and a hand in front of a standard 16:9 full HD screen split into four:
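A hypothetical reconstruction under loudly stated assumptions: the element names (Head, Hand, Screen), the placement attributes, and all millimeter values are illustrative, not confirmed; only SUIScenario, InteractionContext, User, PhysicalEnvironment, and the screen attributes come from the text. The full HD display is split into four 960x540 tiles via displayPosition/displayResolution:

```xml
<!-- Sketch only: element names, attributes, and all numeric values are
     assumptions (positions in mm, wall assumed 4000 x 2250 mm). -->
<SUIScenario type="Immersive Wall">
  <InteractionContext virtualPosition="0.0 0.0 0.0" virtualRotation="0.0 0.0 0.0">
    <User name="MainUser">
      <Head name="Head" position="0.0 -2000.0 1700.0"/>
      <Hand name="Hand" position="300.0 -1800.0 1200.0"/>
    </User>
    <PhysicalEnvironment name="Wall">
      <!-- Four quarters of 960x540 pixels / 2000x1125 mm each -->
      <Screen name="ScreenTopLeft"    displayPosition="0 0"
              displayResolution="960 540" physicalSize="2000 1125"
              position="-1000.0 0.0 2250.0" mainScreen="true"/>
      <Screen name="ScreenTopRight"   displayPosition="960 0"
              displayResolution="960 540" physicalSize="2000 1125"
              position="1000.0 0.0 2250.0" mainScreen="false"/>
      <Screen name="ScreenBottomLeft" displayPosition="0 540"
              displayResolution="960 540" physicalSize="2000 1125"
              position="-1000.0 0.0 1125.0" mainScreen="false"/>
      <Screen name="ScreenBottomRight" displayPosition="960 540"
              displayResolution="960 540" physicalSize="2000 1125"
              position="1000.0 0.0 1125.0" mainScreen="false"/>
    </PhysicalEnvironment>
  </InteractionContext>
</SUIScenario>
```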
Regarding the InteractionContext, note that the names of the head, hands and screens can be used as references in
the Device Mapping section.
Navigation
This section describes the navigation modes. Each navigation mode is composed of transformations (Transform) and constraints (Constraint).
Transformations can be translations (Translate), rotations (Rotate), scaling (Scale), or ResetHorizontal.
Translate is defined by an axis and a speed.
Rotate is defined by an axis, a center, and a speed.
Scale is defined by a center and a speed.
ResetHorizontal applies Z-axis gravity to the camera.
Axes and centers can be defined in the virtual world coordinate system or in the local coordinate system of any SUI element by prepending the name of the SUI element to the float3 specifier.
Constraints can be LimitPosition, LimitPositionOffset, LimitOrientation, LimitScale, or AlignWith.
LimitPositionOffset defines the coordinate min and max value. It is defined by a coordinate, a minValue and a maxValue.
LimitOrientation defines the angle min and max value. It is defined by an angle, a minValue and a maxValue.
LimitScale defines the scale min and max value. It is defined by a minValue and a maxValue.
AlignWith aligns one axis of the interaction context with a reference axis of the 3DEXPERIENCE referential. It is defined by a minAngle, a maxAngle, an axis and a referenceAxis.
Below is an example of a
navigation section declaring two navigation modes, Fly and Walk:
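A hedged sketch of such a section: the transform names (Move, Yaw, Turn) are the ones referenced later under Device Mapping, but the element layout, attribute spellings, and speed values are assumptions. The Move axis is expressed in the local coordinate system of the Hand SUI element by prepending its name to the float3 specifier:

```xml
<!-- Sketch only: element layout, attribute spellings, and values
     are assumptions. -->
<Navigation>
  <NavigationMode name="Fly">
    <Transform type="Translate" name="Move" axis="Hand 0.0 1.0 0.0" speed="2.0"/>
    <Transform type="Rotate" name="Yaw" axis="0.0 0.0 1.0"
               center="Head 0.0 0.0 0.0" speed="30.0"/>
  </NavigationMode>
  <NavigationMode name="Walk">
    <Transform type="Translate" name="Move" axis="Hand 0.0 1.0 0.0" speed="1.0"/>
    <Transform type="Rotate" name="Turn" axis="0.0 0.0 1.0"
               center="Head 0.0 0.0 0.0" speed="30.0"/>
    <Transform type="ResetHorizontal" name="Gravity"/>
    <Constraint type="LimitScale" minValue="1.0" maxValue="1.0"/>
  </NavigationMode>
</Navigation>
```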
This section lets you activate or deactivate prehighlighting and highlighting effects, as well as the mirror mode.
You can activate or deactivate the following options:
disableMirrorWindow: when set to false, displays the VR mirror on the computer screen without having to put the headset on (for CAVE scenarios with an HTC headset). By default, this option is set to true.
disablePrehighlight: when set to true, deactivates prehighlight in client windows (for CAVE scenarios without HTC) when hovering over an object.
disableHighlight: when set to true, deactivates highlight in client windows (for CAVE scenarios without HTC) when selecting an object.
Note:
If the disablePrehighlight/disableHighlight options are not activated, then objects prehighlighted/highlighted in the 3DEXPERIENCE window are also prehighlighted/highlighted in the client windows.
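These options might be set as in the following sketch; only the three option names come from the text, while the wrapper element name "Visualization" and the attribute form are assumptions:

```xml
<!-- Hypothetical snippet: the wrapper element name is an assumption. -->
<Visualization disableMirrorWindow="false"
               disablePrehighlight="true"
               disableHighlight="true"/>
```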
Device Mapping
This section is used to map the devices declared in the
Device section to the actions exposed by the scenario.
Note that:
Trackers can be mapped to tracking actions. TrackerMapping is defined by the actionTrack and maps a SUI element as the target to a device component tracker.
Axes can be mapped to navigate actions. AxisMapping is defined by the actionNavigate and maps the axis of the transform to a device component axis.
Buttons can be mapped to navigate actions (Navigate) and to additional actions such as SwitchNavigationMode, ResetViewpointRotation, and Select.
Below is an example of a mapping section mapping trackers, axes, and buttons:
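A hypothetical reconstruction consistent with the explanation below: the device and sensor names (VRPNConfig, trackerHead, joystickVerRight, b1Right, bumperRight) and the transform names (Move, Yaw, Turn) are taken from the surrounding text, but the element spellings (notably ButtonMapping) and the attribute syntax are assumptions:

```xml
<!-- Sketch only: element spellings and attribute syntax are assumptions. -->
<DeviceMapping>
  <!-- Trackers mapped to tracking actions -->
  <TrackerMapping action="actionTrack" target="Head"
                  tracker="VRPNConfig trackerHead"/>
  <TrackerMapping action="actionTrack" target="Hand"
                  tracker="VRPNConfig trackerHand"/>
  <!-- Axes mapped to navigate actions, once per navigation mode -->
  <AxisMapping action="actionNavigate" transform="Fly Move"
               axis="VRPNConfig joystickVerRight"/>
  <AxisMapping action="actionNavigate" transform="Fly Yaw"
               axis="VRPNConfig joystickHorRight"/>
  <AxisMapping action="actionNavigate" transform="Walk Move"
               axis="VRPNConfig joystickVerRight"/>
  <AxisMapping action="actionNavigate" transform="Walk Turn"
               axis="VRPNConfig joystickHorRight"/>
  <!-- Buttons mapped to additional actions -->
  <ButtonMapping action="SwitchNavigationMode" button="VRPNConfig b1Right"/>
  <ButtonMapping action="Select" button="VRPNConfig bumperRight"/>
</DeviceMapping>
```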
In the above example, the right vertical axis is mapped to the Move
transformation of the Fly navigation mode and the right horizontal
axis is mapped to the Yaw transformation of the Fly navigation
mode. Similar mappings are defined for the Walk navigation mode
(note that a single axis can be mapped to two transformations
declared in two distinct navigation modes as only one navigation
mode is active at a time). The b1Right button is mapped to the SwitchNavigationMode action, which loops through the declared navigation modes, and the bumperRight button is mapped to the Select action, which performs selection.
Some rules apply to device mapping:
tracker, axis, and button can reference the names of the InputDevice (VRPNConfig in this case) and its sensors (trackerHead, joystickVerRight, b1Right, and so on).
transform can reference the names of Navigation Mode transforms (Move, Yaw, Turn).
axis can reference the name of an element previously created in the SUI (Hand in this case).