Diffstat (limited to 'eyeware-beam-sdk/docs/_sources')
-rw-r--r--  eyeware-beam-sdk/docs/_sources/api_overview.rst.txt     104
-rw-r--r--  eyeware-beam-sdk/docs/_sources/api_reference.rst.txt     76
-rw-r--r--  eyeware-beam-sdk/docs/_sources/getting_started.rst.txt  165
-rw-r--r--  eyeware-beam-sdk/docs/_sources/index.rst.txt              33
-rw-r--r--  eyeware-beam-sdk/docs/_sources/introduction.rst.txt       31
-rw-r--r--  eyeware-beam-sdk/docs/_sources/redistribution.rst.txt     34
6 files changed, 443 insertions(+), 0 deletions(-)
diff --git a/eyeware-beam-sdk/docs/_sources/api_overview.rst.txt b/eyeware-beam-sdk/docs/_sources/api_overview.rst.txt
new file mode 100644
index 0000000..1ea878a
--- /dev/null
+++ b/eyeware-beam-sdk/docs/_sources/api_overview.rst.txt
@@ -0,0 +1,104 @@
+.. toctree::
+ :maxdepth: 2
+
+API overview
+============
+
+Architecture
+------------
+
+The API of Beam SDK exposes head and eye tracking data in real time.
+
+Beam SDK works by establishing a *connection* between the Beam eye tracking application by Eyeware, and your own code which uses the API described in these pages.
+This is a **server-client model**.
+Beam acts as the server, and you write the client which consumes the tracking data published by the server, allowing you to build your own eye tracking-enabled application.
+There can be multiple clients connected to the same Beam server at one time.
+
+The access to the tracking data is provided by the ``TrackerClient`` class.
+This is the main entry point of the API, available in all supported languages.
+For language-specific information about ``TrackerClient``, check out the :ref:`API reference`.
+
+In general, a ``TrackerClient`` object creates a connection between the Beam application and your code.
+In the current version of Beam SDK, this connection is made via a TCP port over a network.
+Constructing a ``TrackerClient`` object without arguments sets a default hostname and port which work fine in many configurations.
+However, it is possible to set a specific hostname and port, depending on your setup and network.
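As a sketch of both construction styles, the snippet below is illustrative only: the keyword argument names and the address are assumptions, not the confirmed signature (check the :ref:`API reference` for the exact one), and the import only succeeds when Beam SDK is installed and on your ``PYTHONPATH``.

```python
# Illustrative sketch only: the keyword names and the address below are
# assumptions for illustration, not the confirmed API signature.
try:
    from eyeware.client import TrackerClient

    # Default hostname and port: Beam running on the same PC.
    tracker = TrackerClient()

    # Hypothetical remote setup: Beam running on another PC on the same network.
    remote_tracker = TrackerClient(hostname="192.168.1.42", port=12010)
except ImportError:
    TrackerClient = None  # Beam SDK not available in this environment
```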
+
+.. note::
+ The Beam application and your client program communicate data over TCP sockets.
+ Most commonly, they are designed to be running on the same PC.
+ In scenarios in which they run on different PCs, the PCs need to be on the same network (e.g., share the same Wi-Fi network), and the IP address of the PC running Beam needs to be specified in the client program (``hostname`` argument).
+
+Reference frames and units of tracking data
+-------------------------------------------
+
+Beam SDK gives tracking information about:
+
+* the head of the person who is being tracked (**head tracking**), in 3D;
+* the pixel of the computer screen that the person is gazing at (**screen gaze tracking**), in 2D.
+
+Head tracking information is expressed with respect to a **World Coordinate System** (WCS), or root frame, positioned at the center of your computer screen.
+The head pose reference frame is positioned on the **nose tip**: see :ref:`Head pose reference frame` for illustrations.
+Head tracking positions are expressed as physical distances, in **meters**.
+
+Screen gaze coordinates are expressed in **pixels** (integer units) with respect to the top-left corner of the screen, which has coordinates ``<0, 0>``.
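As a quick illustration of this pixel frame, gazed coordinates can be normalized by the screen resolution to obtain resolution-independent values in ``[0, 1]``. The resolution and gazed pixel below are arbitrary example values, not data queried from Beam.

```python
# The top-left pixel has coordinates <0, 0>; x grows rightwards, y downwards.
# Example screen resolution (an assumption, not queried from Beam SDK).
SCREEN_WIDTH_PX, SCREEN_HEIGHT_PX = 1920, 1080

gaze_x_px, gaze_y_px = 698, 149  # example gazed pixel

# Normalized coordinates in [0, 1], independent of the screen resolution.
norm_x = gaze_x_px / SCREEN_WIDTH_PX
norm_y = gaze_y_px / SCREEN_HEIGHT_PX
print(norm_x, norm_y)
```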
+
+.. note::
+ The screen that is considered for screen gaze tracking is the display on which the Beam application can show the gaze bubble, that is, the main display of the operating system.
+
+The diagram below shows the WCS at the center of the screen (xyz axes represented in red-green-blue, respectively), as well as the 2D frame to represent screen pixels at the top-left corner (xy axes represented in red-green).
+
+.. image:: media/beam_sdk_wcs.png
+ :width: 600
+ :alt: Representation of the World Coordinate System
+ :align: center
+
+Head pose reference frame
+-------------------------
+
+The head tracking information exposed by Beam SDK includes a rigid transform (rotation ``R`` and translation ``t``) from the World Coordinate System frame to the **head frame**, positioned on the person's **nose tip**.
+
+Here is a lateral view that shows the ``R,t`` transform, as well as the xyz axes of the WCS and of the head frame:
+
+.. image:: media/head_transform_isometric_rt_axes_wcs.png
+ :width: 600
+ :alt: Lateral view of the head transform
+ :align: center
+
+Frontal view showing the head axes from the point of view of the person:
+
+.. image:: media/head_transform_frontal_axes.png
+ :width: 300
+ :alt: Frontal view of the head frame
+ :align: center
+
+Top view showing the transform, the WCS frame at the screen center and the head frame on the nose tip:
+
+.. image:: media/head_transform_top_rt_axes.png
+ :width: 500
+ :alt: Top view of the head transform
+ :align: center
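Putting the pieces together, a point expressed in the head frame can be mapped into the WCS with this transform. The sketch below uses made-up values for ``R`` and ``t`` and assumes the common pose convention ``p_wcs = R @ p_head + t``, with ``t`` being the nose-tip position in the WCS; it is an illustration, not real tracking output.

```python
import numpy as np

# Made-up example values: identity rotation (head aligned with the WCS axes)
# and a nose tip 16.6 cm right of, 18.1 cm above and 26 cm in front of the
# screen center. Real values come from the head pose exposed by Beam SDK.
R = np.eye(3)
t = np.array([0.166, 0.181, 0.260])  # meters

# The nose tip itself is the origin of the head frame.
p_head = np.array([0.0, 0.0, 0.0])

# Map the point from the head frame into the World Coordinate System.
p_wcs = R @ p_head + t
print(p_wcs)  # for the nose tip, this equals t
```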
+
+Tracking data definitions
+-------------------------
+
+Here is the meaning of the head and eye tracking fields exposed by Beam SDK, in all supported languages:
+
+* **Head Pose**: 3D head tracking measurements
+
+ * *Lost track*: whether head tracking is lost or not for the current frame;
+ * *Session ID*: numeric identifier of the current, uninterrupted, tracking session; [#fn_sessionid]_
+ * *Rotation*: rotation matrix of the person's nose tip, with respect to the World Coordinate System; [#fn_rotation]_
+ * *Translation*: translation vector of the person's nose tip in meters, with respect to the World Coordinate System.
+
+* **Gaze on Screen**: 2D screen gaze tracking measurements
+
+ * *Lost track*: whether gaze tracking is lost or not for the current frame;
+ * *Screen ID*: numeric identifier of the screen being looked at; [#fn_screenid]_
+ * *Coordinates*: horizontal and vertical pixel coordinates of the gazed point, respectively, counted from the top-left pixel of the screen;
+ * *Confidence*: reliability level of the gaze tracking result.
+
+.. [#fn_sessionid] If a person is lost or not tracked for at least three seconds, the *Session ID* will change. When a person starts being tracked again, he or she will automatically be assigned a new *Session ID* number.
+.. [#fn_rotation] Rotation matrices are computed by Beam SDK from a triplet of rotation angles in radians along the zxy axes of the World Coordinate System, respectively. For further information, look up `Euler angles <https://en.wikipedia.org/wiki/Euler_angles>`_ and `Rotation matrix <https://en.wikipedia.org/wiki/Rotation_matrix>`_.
+.. [#fn_screenid] In the current version, Beam SDK supports single-screen setups. The *Screen ID* number is always zero.
\ No newline at end of file
diff --git a/eyeware-beam-sdk/docs/_sources/api_reference.rst.txt b/eyeware-beam-sdk/docs/_sources/api_reference.rst.txt
new file mode 100644
index 0000000..1be8039
--- /dev/null
+++ b/eyeware-beam-sdk/docs/_sources/api_reference.rst.txt
@@ -0,0 +1,76 @@
+.. toctree::
+ :maxdepth: 2
+
+API reference
+=============
+
+C++
+---
+
+.. doxygenclass:: eyeware::TrackerClient
+ :project: Beam_SDK_docs
+ :members:
+
+.. doxygenstruct:: eyeware::HeadPoseInfo
+ :project: Beam_SDK_docs
+ :members:
+
+.. doxygenstruct:: eyeware::ScreenGazeInfo
+ :project: Beam_SDK_docs
+ :members:
+
+.. doxygenenum:: eyeware::TrackingConfidence
+ :project: Beam_SDK_docs
+
+.. doxygenstruct:: eyeware::AffineTransform3D
+ :project: Beam_SDK_docs
+ :members:
+
+.. doxygentypedef:: eyeware::Matrix3x3
+ :project: Beam_SDK_docs
+
+.. doxygenstruct:: eyeware::Vector3D
+ :project: Beam_SDK_docs
+ :members:
+
+Python
+------
+
+.. autoclass:: eyeware.client.TrackerClient
+ :members:
+ :exclude-members: connected
+
+ .. autoproperty:: connected
+
+ .. versionadded:: 1.1.0
+
+.. autoclass:: eyeware.client.HeadPoseInfo
+ :members:
+
+.. autoclass:: eyeware.client.ScreenGazeInfo
+ :members:
+
+.. autoclass:: eyeware.client.TrackingConfidence
+
+.. autoclass:: eyeware.client.AffineTransform3D
+ :members:
+
+.. autoclass:: eyeware.client.Vector3D
+ :members:
+
+.. note::
+ Matrix and vector types, such as the rotation and translation properties of ``AffineTransform3D``, can be transformed to NumPy arrays efficiently.
+ This is useful for using tracking data and coordinates in your application.
+ Example:
+
+ .. code-block:: python
+
+ import numpy as np
+
+ # Receive a HeadPoseInfo instance, whose transform property is an AffineTransform3D
+ head_pose = tracker.get_head_pose_info()
+ # Convert the tracking information to standard NumPy arrays
+ rotation_numpy = np.array(head_pose.transform.rotation, copy=False)
+ translation_numpy = np.array(head_pose.transform.translation, copy=False)
+ # Now we can manipulate the tracking information to do several things:
+ # draw tracking coordinates on the screen, save them for statistics/heatmaps,
+ # perform arithmetic operations on them, trigger interactive behaviors based on thresholds, etc.
diff --git a/eyeware-beam-sdk/docs/_sources/getting_started.rst.txt b/eyeware-beam-sdk/docs/_sources/getting_started.rst.txt
new file mode 100644
index 0000000..58f8858
--- /dev/null
+++ b/eyeware-beam-sdk/docs/_sources/getting_started.rst.txt
@@ -0,0 +1,165 @@
+.. toctree::
+ :maxdepth: 2
+
+Getting started
+===============
+
+Requirements
+------------
+
+* the `Beam application <https://beam.eyeware.tech>`__ is installed on your system;
+* you have an active subscription, which enables you to receive head and eye tracking data;
+* you have run the calibration procedure within the Beam application at least once.
+
+In addition, if you want to use the Python API, you need:
+
+* Python 3.6;
+* NumPy;
+* the folder ``<YOUR_BEAM_SDK_INSTALLATION_FOLDER>/API/python`` added to your ``PYTHONPATH``.
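If modifying ``PYTHONPATH`` globally is inconvenient, the search path can also be extended from within a script before importing the module. The folder below is a hypothetical placeholder; substitute your actual installation path.

```python
import sys

# Hypothetical placeholder: replace with your actual Beam SDK installation folder.
BEAM_SDK_PYTHON_DIR = r"C:\Program Files\BeamSDK\API\python"

# Make the Beam SDK Python API importable for this process only.
if BEAM_SDK_PYTHON_DIR not in sys.path:
    sys.path.append(BEAM_SDK_PYTHON_DIR)

# After this, the import below succeeds (when the SDK is actually installed):
# from eyeware.client import TrackerClient
```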
+
+Python example
+--------------
+
+.. code-block:: python
+
+ from eyeware.client import TrackerClient
+ import time
+ import numpy as np
+
+ # Build tracker client, to establish a communication with the tracker server (an Eyeware application).
+ #
+ # Constructing the tracker client object without arguments sets a default server hostname and port which
+ # work fine in many configurations.
+ # However, it is possible to set a specific hostname and port, depending on your setup and network.
+ # See the TrackerClient API reference for further information.
+ tracker = TrackerClient()
+
+ # Run forever, until we press ctrl+c
+ while True:
+ # Make sure that the connection with the tracker server (Eyeware application) is up and running.
+ if tracker.connected:
+
+ print(" * Head Pose:")
+ head_pose = tracker.get_head_pose_info()
+ head_is_lost = head_pose.is_lost
+ print(" - Lost track: ", head_is_lost)
+ if not head_is_lost:
+ print(" - Session ID: ", head_pose.track_session_uid)
+ rot = head_pose.transform.rotation
+ print(" - Rotation: |%5.3f %5.3f %5.3f|" % (rot[0, 0], rot[0, 1], rot[0, 2]))
+ print(" |%5.3f %5.3f %5.3f|" % (rot[1, 0], rot[1, 1], rot[1, 2]))
+ print(" |%5.3f %5.3f %5.3f|" % (rot[2, 0], rot[2, 1], rot[2, 2]))
+ tr = head_pose.transform.translation
+ print(" - Translation: <x=%5.3f m, y=%5.3f m, z=%5.3f m>" % (tr[0], tr[1], tr[2]))
+
+ print(" * Gaze on Screen:")
+ screen_gaze = tracker.get_screen_gaze_info()
+ screen_gaze_is_lost = screen_gaze.is_lost
+ print(" - Lost track: ", screen_gaze_is_lost)
+ if not screen_gaze_is_lost:
+ print(" - Screen ID: ", screen_gaze.screen_id)
+ print(" - Coordinates: <x=%5.3f px, y=%5.3f px>" % (screen_gaze.x, screen_gaze.y))
+ print(" - Confidence: ", screen_gaze.confidence)
+
+ time.sleep(1 / 30) # We expect tracking data at 30 Hz
+ else:
+ # Print a message every MESSAGE_PERIOD_IN_SECONDS seconds
+ MESSAGE_PERIOD_IN_SECONDS = 2
+ time.sleep(MESSAGE_PERIOD_IN_SECONDS - time.monotonic() % MESSAGE_PERIOD_IN_SECONDS)
+ print("No connection with tracker server")
+
+Output
+~~~~~~
+
+Running the example code in a terminal starts printing information in real time.
+There will be many prints, one for each frame.
+Let us zoom in on the output associated with a single frame:
+
+.. code-block::
+
+ * Head Pose:
+ - Lost track: False
+ - Session ID: 1
+ - Rotation: |-0.999 -0.005 -0.045|
+ |-0.008 0.999 0.051|
+ |0.045 0.051 -0.998|
+ - Translation: <x=0.166 m, y=0.181 m, z=0.260 m>
+ * Gaze on Screen:
+ - Lost track: False
+ - Screen ID: 0
+ - Coordinates: <x=698.000 px, y=149.000 px>
+ - Confidence: TrackingConfidence.HIGH
+
+For the meaning of the returned fields, refer to the section :ref:`API overview`.
+
+Explanation
+-----------
+
+When creating a Python script with the purpose of consuming head and eye tracking information from Beam SDK, you must ensure that the basic classes are correctly imported:
+
+.. code-block:: python
+
+ from eyeware.client import TrackerClient
+
+We can build a ``TrackerClient`` object, which is the main entry point of Beam SDK (see :ref:`API overview` for further details):
+
+.. code-block:: python
+
+ tracker = TrackerClient()
+
+Then, we verify that the connection between our client object and the tracker server (Eyeware application) is up and running as follows:
+
+.. code-block:: python
+
+ if tracker.connected:
+
+and we are now ready to receive head and gaze tracking data and do something with it.
+
+Let us start from the head tracking part.
+First, we will retrieve the head tracking information data structure.
+Then, we will check whether tracking information is valid for the current frame.
+In code:
+
+.. code-block:: python
+
+ print(" * Head Pose:")
+ head_pose = tracker.get_head_pose_info()
+ head_is_lost = head_pose.is_lost
+ print(" - Lost track: ", head_is_lost)
+
+If the head tracking information is indeed valid (i.e., head tracking was *not lost*), then we retrieve the 3D coordinates of the tracked person's head:
+
+.. code-block:: python
+
+ if not head_is_lost:
+ print(" - Session ID: ", head_pose.track_session_uid)
+ rot = head_pose.transform.rotation
+ print(" - Rotation: |%5.3f %5.3f %5.3f|" % (rot[0, 0], rot[0, 1], rot[0, 2]))
+ print(" |%5.3f %5.3f %5.3f|" % (rot[1, 0], rot[1, 1], rot[1, 2]))
+ print(" |%5.3f %5.3f %5.3f|" % (rot[2, 0], rot[2, 1], rot[2, 2]))
+ tr = head_pose.transform.translation
+ print(" - Translation: <x=%5.3f m, y=%5.3f m, z=%5.3f m>" % (tr[0], tr[1], tr[2]))
+
+For details about the rotation and translation notation, refer to the section :ref:`API overview`.
+
+Now, we want to get screen gaze tracking information.
+This follows the same logic that we applied for head tracking information.
+First, retrieve the screen gaze information data structure.
+Then, check the data validity (whether tracking is *not lost*):
+
+.. code-block:: python
+
+ print(" * Gaze on Screen:")
+ screen_gaze = tracker.get_screen_gaze_info()
+ screen_gaze_is_lost = screen_gaze.is_lost
+ print(" - Lost track: ", screen_gaze_is_lost)
+ if not screen_gaze_is_lost:
+ print(" - Screen ID: ", screen_gaze.screen_id)
+ print(" - Coordinates: <x=%5.3f px, y=%5.3f px>" % (screen_gaze.x, screen_gaze.y))
+ print(" - Confidence: ", screen_gaze.confidence)
+
+The rest of the example code prints head and gaze tracking numbers on the terminal.
+Printing those numbers, by itself, is not very useful or interesting.
+Instead, you can use the Beam SDK tracking data to build your own creative applications!
+Let us know how it goes at contact@eyeware.tech.
+We would love to hear about your projects.
\ No newline at end of file
diff --git a/eyeware-beam-sdk/docs/_sources/index.rst.txt b/eyeware-beam-sdk/docs/_sources/index.rst.txt
new file mode 100644
index 0000000..2187a37
--- /dev/null
+++ b/eyeware-beam-sdk/docs/_sources/index.rst.txt
@@ -0,0 +1,33 @@
+.. Beam SDK documentation master file, created by
+ sphinx-quickstart on Wed Jul 21 15:31:40 2021.
+ You can adapt this file completely to your liking, but it should at least
+ contain the root `toctree` directive.
+
+.. image:: media/beam_logo.png
+ :width: 500
+ :alt: Beam logo
+ :align: center
+
+-----------
+
+Beam SDK documentation
+======================
+
+This guide describes **Beam SDK**, a development kit that enables you to add eye tracking capabilities to your applications.
+
+.. toctree::
+ :maxdepth: 2
+ :caption: Contents:
+
+ introduction
+ getting_started
+ api_overview
+ api_reference
+ redistribution
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
diff --git a/eyeware-beam-sdk/docs/_sources/introduction.rst.txt b/eyeware-beam-sdk/docs/_sources/introduction.rst.txt
new file mode 100644
index 0000000..6b8ef73
--- /dev/null
+++ b/eyeware-beam-sdk/docs/_sources/introduction.rst.txt
@@ -0,0 +1,31 @@
+.. toctree::
+ :maxdepth: 2
+
+Introduction
+============
+
+Beam SDK is the free development kit for `Beam <https://beam.eyeware.tech>`__ that enables you to build and distribute your own eye tracking-enabled PC apps.
+Beam SDK allows you to use Beam as a general-purpose eye tracker.
+
+.. image:: media/beam_sdk_overview.png
+ :width: 600
+ :alt: Beam SDK overview
+ :align: center
+
+The new generation of applications aims at breaking the barriers of digital interaction.
+Eye and head tracking technology is one of the contenders pushing for the development of more intuitive and immersive devices and digital environments.
+Tech you can't see is the best tech.
+
+You can use Beam SDK to create your own immersive game experiences, interactions or accessibility solutions for PC on top of the Eyeware Beam head and eye tracker.
+Beam SDK helps you:
+
+* build accessibility apps to empower all people in the world;
+* create new games features that will bring players deeper into the digital worlds;
+* develop apps for academic and user experience (UX) researchers to make data collection easier;
+* create prototypes quicker for applications in driver monitoring, training simulators, robotics, human-machine interaction.
+
+Eyeware is providing early access to the Beam API for developers and independent software vendors.
+You can start using Beam as an eye tracking solution now.
+The API will remain compatible and accessible with future paid subscription tiers after the public beta.
+
+Website: https://beam.eyeware.tech/
\ No newline at end of file
diff --git a/eyeware-beam-sdk/docs/_sources/redistribution.rst.txt b/eyeware-beam-sdk/docs/_sources/redistribution.rst.txt
new file mode 100644
index 0000000..f9c3287
--- /dev/null
+++ b/eyeware-beam-sdk/docs/_sources/redistribution.rst.txt
@@ -0,0 +1,34 @@
+.. toctree::
+ :maxdepth: 2
+
+Redistributing a Beam client application
+========================================
+
+For developers
+--------------
+
+To distribute an application embedding the functionality provided by Beam SDK, you need to deploy the necessary libraries, modules or extra files from the Beam SDK package.
+This ensures that your application will function as intended for your end users.
+
+The file `<CREDITS.pdf>`_ displays the licenses of third party libraries that Beam SDK depends on.
+
+C++ API
+~~~~~~~
+
+The necessary files are in the folder ``API/cpp/lib``.
+Copy them next to the executable of your application.
+
+Python API
+~~~~~~~~~~
+
+The necessary files are in the folder ``API/python``.
+Keep the structure of that folder, including any subfolders contained therein.
+Copy the files next to the "executable" (Python interpreter) of your application.
+
+For end users
+-------------
+
+Your end users simply need to:
+
+* have the Eyeware Beam app installed;
+* have an active subscription for the Eyeware Beam app.
\ No newline at end of file