Getting started
===============

Requirements
------------

* The `Beam application <https://beam.eyeware.tech>`__ is installed on your system;
* you have an active subscription, which entitles you to receive head and eye tracking data;
* you have run the calibration procedure within the Beam application at least once.

In addition, if you want to use the Python API, you need:

* Python 3.6;
* NumPy;
* ``<YOUR_BEAM_SDK_INSTALLATION_FOLDER>/API/python`` added to your ``PYTHONPATH``.
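If adjusting the ``PYTHONPATH`` environment variable is inconvenient, the SDK folder can also be added to Python's module search path at runtime. A minimal sketch (the installation path below is a hypothetical placeholder; substitute your actual Beam SDK folder):

```python
import sys

# Hypothetical example path; replace it with your actual
# <YOUR_BEAM_SDK_INSTALLATION_FOLDER>/API/python folder.
sdk_python_dir = r"C:\Program Files\Eyeware Beam SDK\API\python"

# Prepend once, so that `from eyeware.client import TrackerClient` resolves.
if sdk_python_dir not in sys.path:
    sys.path.insert(0, sdk_python_dir)
```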

Python example
--------------

.. code-block:: python

    from eyeware.client import TrackerClient
    import time
    import numpy as np

    # Build tracker client, to establish a communication with the tracker server (an Eyeware application).
    #
    # Constructing the tracker client object without arguments sets a default server hostname and port which
    # work fine in many configurations.
    # However, it is possible to set a specific hostname and port, depending on your setup and network.
    # See the TrackerClient API reference for further information.
    tracker = TrackerClient()

    # Run forever, until we press ctrl+c
    while True:
        # Make sure that the connection with the tracker server (Eyeware application) is up and running.
        if tracker.connected:
            print(" * Head Pose:")
            head_pose = tracker.get_head_pose_info()
            head_is_lost = head_pose.is_lost
            print(" - Lost track: ", head_is_lost)
            if not head_is_lost:
                print(" - Session ID: ", head_pose.track_session_uid)
                rot = head_pose.transform.rotation
                print(" - Rotation: |%5.3f %5.3f %5.3f|" % (rot[0, 0], rot[0, 1], rot[0, 2]))
                print("             |%5.3f %5.3f %5.3f|" % (rot[1, 0], rot[1, 1], rot[1, 2]))
                print("             |%5.3f %5.3f %5.3f|" % (rot[2, 0], rot[2, 1], rot[2, 2]))
                tr = head_pose.transform.translation
                print(" - Translation: <x=%5.3f m, y=%5.3f m, z=%5.3f m>" % (tr[0], tr[1], tr[2]))

            print(" * Gaze on Screen:")
            screen_gaze = tracker.get_screen_gaze_info()
            screen_gaze_is_lost = screen_gaze.is_lost
            print(" - Lost track: ", screen_gaze_is_lost)
            if not screen_gaze_is_lost:
                print(" - Screen ID: ", screen_gaze.screen_id)
                print(" - Coordinates: <x=%5.3f px, y=%5.3f px>" % (screen_gaze.x, screen_gaze.y))
                print(" - Confidence: ", screen_gaze.confidence)

            time.sleep(1 / 30)  # We expect tracking data at 30 Hz
        else:
            # Print a message every MESSAGE_PERIOD_IN_SECONDS seconds
            MESSAGE_PERIOD_IN_SECONDS = 2
            time.sleep(MESSAGE_PERIOD_IN_SECONDS - time.monotonic() % MESSAGE_PERIOD_IN_SECONDS)
            print("No connection with tracker server")

Output
~~~~~~

Running the example code in a terminal will start printing information in real time.
There will be a lot of prints, one for each frame.
Let us zoom in on the printed output associated with one frame only:

.. code-block::

     * Head Pose:
     - Lost track:  False
     - Session ID:  1
     - Rotation: |-0.999 -0.005 -0.045|
                 |-0.008  0.999  0.051|
                 | 0.045  0.051 -0.998|
     - Translation: <x=0.166 m, y=0.181 m, z=0.260 m>
     * Gaze on Screen:
     - Lost track:  False
     - Screen ID:  0
     - Coordinates: <x=698.000 px, y=149.000 px>
     - Confidence:  TrackingConfidence.HIGH

For the meaning of the returned fields, refer to the section :ref:`API overview`.

Explanation
-----------

When creating a Python script that consumes head and eye tracking information from Beam SDK, you must first ensure that the required classes are imported correctly:

.. code-block:: python

    from eyeware.client import TrackerClient

We can then build a ``TrackerClient`` object, which is the main entry point of Beam SDK (see :ref:`API overview` for further details):

.. code-block:: python

    tracker = TrackerClient()

Then, we verify that the connection between our client object and the tracker server (Eyeware application) is up and running:

.. code-block:: python

    if tracker.connected:

We are now ready to receive head and gaze tracking data and do something with that data.

Let us start from the head tracking part.
First, we will retrieve the head tracking information data structure.
Then, we will check whether tracking information is valid for the current frame.
In code:

.. code-block:: python

    print(" * Head Pose:")
    head_pose = tracker.get_head_pose_info()
    head_is_lost = head_pose.is_lost
    print(" - Lost track: ", head_is_lost)

If the head tracking information is indeed valid (i.e., head tracking was *not lost*), then we retrieve the 3D coordinates of the tracked person's head:

.. code-block:: python

    if not head_is_lost:
        print(" - Session ID: ", head_pose.track_session_uid)
        rot = head_pose.transform.rotation
        print(" - Rotation: |%5.3f %5.3f %5.3f|" % (rot[0, 0], rot[0, 1], rot[0, 2]))
        print("             |%5.3f %5.3f %5.3f|" % (rot[1, 0], rot[1, 1], rot[1, 2]))
        print("             |%5.3f %5.3f %5.3f|" % (rot[2, 0], rot[2, 1], rot[2, 2]))
        tr = head_pose.transform.translation
        print(" - Translation: <x=%5.3f m, y=%5.3f m, z=%5.3f m>" % (tr[0], tr[1], tr[2]))
For details about the rotation and translation notation, refer to the section :ref:`API overview`.
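The raw 3x3 rotation matrix can be hard to interpret at a glance. As a sketch, assuming a conventional ZYX (yaw-pitch-roll) decomposition (check the API overview for Beam's actual axis conventions), the matrix can be converted to Euler angles with NumPy:

```python
import numpy as np

def rotation_to_euler_deg(rot):
    """Decompose a 3x3 rotation matrix as R = Rz(yaw) @ Ry(pitch) @ Rx(roll).

    Returns (yaw, pitch, roll) in degrees. This assumes a ZYX convention;
    verify it against the axis conventions documented in the API overview.
    """
    yaw = np.degrees(np.arctan2(rot[1, 0], rot[0, 0]))
    pitch = np.degrees(np.arctan2(-rot[2, 0], np.hypot(rot[2, 1], rot[2, 2])))
    roll = np.degrees(np.arctan2(rot[2, 1], rot[2, 2]))
    return float(yaw), float(pitch), float(roll)

# Sanity check on the identity matrix (no rotation): all angles are zero.
print(rotation_to_euler_deg(np.eye(3)))
```

Angles expressed this way are often more convenient than matrix entries for driving application logic, e.g. detecting whether the user's head is turned away from the screen.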

Now, we want to get screen gaze tracking information.
This follows the same logic that we applied for head tracking information.
First, retrieve the screen gaze information data structure.
Then, check the data validity (whether tracking is *not lost*):

.. code-block:: python

    print(" * Gaze on Screen:")
    screen_gaze = tracker.get_screen_gaze_info()
    screen_gaze_is_lost = screen_gaze.is_lost
    print(" - Lost track: ", screen_gaze_is_lost)
    if not screen_gaze_is_lost:
        print(" - Screen ID: ", screen_gaze.screen_id)
        print(" - Coordinates: <x=%5.3f px, y=%5.3f px>" % (screen_gaze.x, screen_gaze.y))
        print(" - Confidence: ", screen_gaze.confidence)
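As a simple illustration of what the gaze coordinates enable, here is a sketch of a hit test that checks whether the gaze point falls inside a rectangular region of the screen. The region and coordinates below are made-up example values, not part of the Beam SDK API:

```python
def gaze_in_rect(gaze_x, gaze_y, rect):
    """Return True when the gaze point lies inside the given rectangle.

    `rect` is (left, top, width, height) in screen pixels.
    """
    left, top, width, height = rect
    return left <= gaze_x < left + width and top <= gaze_y < top + height

# Hypothetical "button" region on a 1920x1080 screen.
button_rect = (600, 100, 200, 100)

# Using the gaze sample from the output above (x=698 px, y=149 px):
print(gaze_in_rect(698.0, 149.0, button_rect))  # True: the gaze is on the button
```

Combined with the confidence field, a check like this could, for instance, trigger a UI highlight only when tracking confidence is high.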

The rest of the example code prints the head and gaze tracking numbers to the terminal.
Printing those numbers is, by itself, not very useful or interesting.
Instead, you can exploit the Beam SDK tracking data to build your own creative applications!
Let us know how it goes at contact@eyeware.tech.
We would love to hear about your projects.