 -rw-r--r--  README.md                            3
 -rw-r--r--  filter-nm/lang/zh_CN.ts             16
 -rw-r--r--  gui/lang/zh_CN.ts                    6
 -rw-r--r--  tracker-neuralnet/lang/zh_CN.ts     42
 -rw-r--r--  tracker-neuralnet/model_adapters.h   2
 5 files changed, 37 insertions, 32 deletions
diff --git a/README.md b/README.md
index 0ed51166..30d65428 100644
--- a/README.md
+++ b/README.md
@@ -40,7 +40,9 @@ Don't be afraid to submit an **issue/feature request** if you have any problems!
- BBC micro:bit, LEGO, sensortag support via Smalltalk<sup>[(1)](https://en.wikipedia.org/wiki/Smalltalk)[(2)](https://en.wikipedia.org/wiki/Alan_Kay)</sup>
[S2Bot](http://www.picaxe.com/Teaching/Other-Software/Scratch-Helper-Apps/)
- Wiimote (Windows)
+- NeuralNet face tracker
- Eyeware Beam<sup>[[1](https://beam.eyeware.tech/)]</sup>
+- Tobii eye tracker
## Output protocols
@@ -71,6 +73,7 @@ Don't be afraid to submit an **issue/feature request** if you have any problems!
- Stéphane Lenclud (Kinect Face Tracker, Easy Tracker)
- GO63-samara (Hamilton Filter, Pose-widget improvement)
- Davide Mameli (Eyeware Beam tracker)
+- Khoa Nguyen (Tobii eye tracker)
## Thanks
diff --git a/filter-nm/lang/zh_CN.ts b/filter-nm/lang/zh_CN.ts
index 92521e0f..48cda998 100644
--- a/filter-nm/lang/zh_CN.ts
+++ b/filter-nm/lang/zh_CN.ts
@@ -9,15 +9,15 @@
</message>
<message>
<source>°/s</source>
- <translation type="unfinished"></translation>
+ <translation ></translation>
</message>
<message>
<source>mm/s</source>
- <translation type="unfinished"></translation>
+ <translation ></translation>
</message>
<message>
<source>Responsiveness</source>
- <translation type="unfinished"></translation>
+ <translation >反应</translation>
</message>
<message>
<source>10.0</source>
@@ -25,7 +25,7 @@
</message>
<message>
<source>Drift speeds</source>
- <translation type="unfinished"></translation>
+ <translation >漂移速度</translation>
</message>
<message>
<source>50</source>
@@ -37,19 +37,19 @@
</message>
<message>
<source>Natural movement filter by Tom Brazier: Cancels higher frequency noise and the natural tendency for our heads to drift even when we think we are sitting still.</source>
- <translation type="unfinished"></translation>
+ <translation >Natural movement filter by Tom Brazier: 过滤高频噪声与我们头部的自然飘移, 即便我们认为我们保持不动.</translation>
</message>
<message>
<source>Rotation</source>
- <translation type="unfinished"></translation>
+ <translation >旋转</translation>
</message>
<message>
<source>Position</source>
- <translation type="unfinished"></translation>
+ <translation >位置</translation>
</message>
<message>
<source>Instructions: Set all sliders to minimum. Then for each of rotation and position: First, increase responsiveness until the filter only just cancels jerkiness for faster head movements. Second, increase drift speed until the filter only just cancels drift movement when your head is still.</source>
- <translation type="unfinished"></translation>
+ <translation >先将所有滑块最小化. 然后, 对于每个旋转和位置: 首先提高响应能力,直到过滤器只是刚好消除抖动(以获得更快的头部运动); 其次,增加漂移速度,直到过滤器只是在头部静止时刚好消除漂移. </translation>
</message>
</context>
<context>
diff --git a/gui/lang/zh_CN.ts b/gui/lang/zh_CN.ts
index 64ef8eee..c6102928 100644
--- a/gui/lang/zh_CN.ts
+++ b/gui/lang/zh_CN.ts
@@ -351,7 +351,7 @@ Press &quot;clear calibration&quot; to remove any calibration data pertaining to
</message>
<message>
<source>Centering method</source>
- <translation type="unfinished"></translation>
+ <translation >回中方法</translation>
</message>
<message>
<source>Point</source>
@@ -359,11 +359,11 @@ Press &quot;clear calibration&quot; to remove any calibration data pertaining to
</message>
<message>
<source>Wireless VR 360</source>
- <translation type="unfinished"></translation>
+ <translation type="unfinished">无线 VR 360</translation>
</message>
<message>
<source>Roll compensated</source>
- <translation type="unfinished"></translation>
+ <translation >滚转补偿</translation>
</message>
<message>
<source>Freeze the position returned by the tracker while this mode is active.</source>
diff --git a/tracker-neuralnet/lang/zh_CN.ts b/tracker-neuralnet/lang/zh_CN.ts
index c3a91211..cf12f304 100644
--- a/tracker-neuralnet/lang/zh_CN.ts
+++ b/tracker-neuralnet/lang/zh_CN.ts
@@ -1,35 +1,36 @@
<?xml version="1.0" encoding="utf-8"?>
<!DOCTYPE TS>
+
<TS version="2.1" language="zh_CN">
-<context>
+ <context>
<name>Form</name>
<message>
<source>Tracker settings</source>
- <translation type="unfinished"></translation>
+ <translation >追踪器设置</translation>
</message>
<message>
<source>Diagonal FOV</source>
- <translation type="unfinished"></translation>
+ <translation >对角FOV</translation>
</message>
<message>
<source>Camera name</source>
- <translation type="unfinished"></translation>
+ <translation >相机名</translation>
</message>
<message>
<source>Frames per second</source>
- <translation type="unfinished"></translation>
+ <translation >FPS</translation>
</message>
<message>
<source>Camera settings</source>
- <translation type="unfinished"></translation>
+ <translation >相机设置</translation>
</message>
<message>
<source>Camera Configuration</source>
- <translation type="unfinished"></translation>
+ <translation >相机配置</translation>
</message>
<message>
<source>Head Center Offset</source>
- <translation type="unfinished"></translation>
+ <translation >头部归中补偿</translation>
</message>
<message>
<source> mm</source>
@@ -38,27 +39,28 @@
<message>
<source>Use only yaw and pitch while calibrating.
Don&apos;t roll or change position.</source>
- <translation type="unfinished"></translation>
+ <translation >在校准时只使用偏航和俯仰,
+不要滚转或是改变位置. </translation>
</message>
<message>
<source>Start calibration</source>
- <translation type="unfinished"></translation>
+ <translation >开始校准</translation>
</message>
<message>
<source>Right</source>
- <translation type="unfinished"></translation>
+ <translation >向右</translation>
</message>
<message>
<source>Forward</source>
- <translation type="unfinished"></translation>
+ <translation >向前</translation>
</message>
<message>
<source>Up</source>
- <translation type="unfinished"></translation>
+ <translation >向上</translation>
</message>
<message>
<source>Show Network Input</source>
- <translation type="unfinished"></translation>
+ <translation >展示神经网络输入</translation>
</message>
<message>
<source>MJPEG</source>
@@ -112,20 +114,20 @@ Don&apos;t roll or change position.</source>
<source>Zoom factor for the face region. Applied before the patch is fed into the pose estimation model. There is a sweet spot near 1.</source>
<translation type="unfinished"></translation>
</message>
-</context>
+ </context>
<context>
<name>neuralnet_tracker_ns::NeuralNetDialog</name>
<message>
<source>Default</source>
- <translation type="unfinished"></translation>
+ <translation >默认</translation>
</message>
<message>
<source>Tracker Offline</source>
- <translation type="unfinished"></translation>
+ <translation >追踪器离线</translation>
</message>
<message>
<source>%1x%2 @ %3 FPS / Inference: %4 ms</source>
- <translation type="unfinished"></translation>
+ <translation >%1x%2 @ %3 FPS / 推理: %4 ms</translation>
</message>
<message>
<source>%1 yaw samples. Yaw more to %2 samples for stable calibration.</source>
@@ -141,11 +143,11 @@ Don&apos;t roll or change position.</source>
</message>
<message>
<source>Stop calibration</source>
- <translation type="unfinished"></translation>
+ <translation >结束校准</translation>
</message>
<message>
<source>Start calibration</source>
- <translation type="unfinished"></translation>
+ <translation >开始校准</translation>
</message>
</context>
</TS>
diff --git a/tracker-neuralnet/model_adapters.h b/tracker-neuralnet/model_adapters.h
index 820330cf..48f2fa2c 100644
--- a/tracker-neuralnet/model_adapters.h
+++ b/tracker-neuralnet/model_adapters.h
@@ -73,7 +73,7 @@ class PoseEstimator
std::string get_network_output_name(size_t i) const;
int64_t model_version_ = 0; // Queried meta data from the ONNX file
Ort::Session session_{nullptr}; // ONNX's runtime context for running the model
- Ort::Allocator allocator_; // Memory allocator for tensors
+ mutable Ort::Allocator allocator_; // Memory allocator for tensors
// Inputs
cv::Mat scaled_frame_{}, input_mat_{}; // Input. One is the original crop, the other is rescaled (?)
std::vector<Ort::Value> input_val_; // Tensors to put into the model