Commit 130367b — Merge branch 'main' of github.com:Robopipe/api
2 parents: 154452b + aa8980d

4 files changed: +93 / -23 lines

docs/api/rest-api-reference/cameras.md — 28 additions, 0 deletions

@@ -35,3 +35,31 @@ If you're using PoE devices, make sure that they are connected to a network with
 {% swagger src="../../.gitbook/assets/oas.yml" path="/cameras/{mxid}/ir" method="post" %}
 [oas.yml](../../.gitbook/assets/oas.yml)
 {% endswagger %}
+
+## IR Perception
+
+The PRO line-up of Luxonis devices has notch IR filters at 940 nm on the stereo camera pair, which allows both visible light and IR light from the illumination LED/laser dot projector to be perceived by the cameras.
+
+{% hint style="warning" %}
+This product is classified internationally as a **Class 1 Laser Product** under **EN/IEC 60825-1, Edition 3 (2014)**. Take safety precautions when working with this product.
+{% endhint %}
+
+{% hint style="info" %}
+RGB cameras do not perceive IR light; only monochrome sensors have IR perception.
+{% endhint %}
+
+### Dot Projector
+
+A laser dot projector emits numerous tiny dots in front of the device, aiding disparity matching, particularly on surfaces with few visual features or little texture, such as walls or floors. This method, known as ASV (Active Stereo Vision), functions similarly to passive stereo vision but incorporates active depth enhancement.
+
+### Flood IR (LED)
+
+LED lighting enables visibility in environments with minimal or no light. It allows you to execute AI or computer vision (CV) tasks on frames illuminated by the infrared (IR) LED.
+
+{% swagger src="../../.gitbook/assets/oas.yml" path="/cameras/{mxid}/ir" method="get" %}
+[oas.yml](../../.gitbook/assets/oas.yml)
+{% endswagger %}
+
+{% swagger src="../../.gitbook/assets/oas.yml" path="/cameras/{mxid}/ir" method="post" %}
+[oas.yml](../../.gitbook/assets/oas.yml)
+{% endswagger %}
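The `/cameras/{mxid}/ir` endpoints added above can be driven from any HTTP client. A minimal sketch using Python `requests`, assuming controller id 1; the `MXID` value and the `ir_laser_dot_projector` field name are hypothetical placeholders, and the authoritative request schema lives in `oas.yml`:

```python
import requests

API_BASE = "http://robopipe-1.local"  # assumes controller id 1
MXID = "14442C10D13EABCE00"  # hypothetical MXID; use one returned by GET /cameras

def ir_url(mxid: str) -> str:
    # endpoint path documented by the swagger blocks above
    return f"{API_BASE}/cameras/{mxid}/ir"

def read_ir_settings(mxid: str) -> dict:
    # GET /cameras/{mxid}/ir reads the current IR illumination settings
    return requests.get(ir_url(mxid)).json()

def set_ir_settings(mxid: str, settings: dict) -> None:
    # POST /cameras/{mxid}/ir updates them; see oas.yml for the exact body schema
    requests.post(ir_url(mxid), json=settings).raise_for_status()

# e.g. set_ir_settings(MXID, {"ir_laser_dot_projector": 0.5})  # hypothetical field
```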

docs/controller/os.md — 41 additions, 0 deletions

@@ -1,2 +1,43 @@
 # OS
 
+## Description
+
+Robopipe OS is a custom pre-configured OS image based on [Debian](https://www.debian.org/). It comes with all the packages and tools you need to build an AI/CV application pre-installed. You can find all releases, along with the configuration scripts, in [this repository](https://github.com/Robopipe/OS).
+
+## Default configuration
+
+### Users
+
+The default user you should use when working on the controller directly is `admin`. The password is `robopipe.io`.
+
+### Hostname & mDNS
+
+A hostname is assigned to the controller on startup. The assigned hostname is in the format `robopipe-<id>`, where _id_ is determined dynamically on each startup. When the controller starts, it scans the local network and assigns itself the lowest _id_ not already taken. This means the first controller you start will have _id_ 1, the next _id_ 2, and so on. At the same time, it broadcasts an [mDNS](https://en.wikipedia.org/wiki/Multicast_DNS) record based on its hostname (`robopipe-<id>.local`). This means the controller can be addressed not only by its IP address but also by its hostname on the local network. This behaviour is enabled via the `robopipehostname` service; if you wish to disable it, you can do so using:
+
+```bash
+sudo systemctl disable robopipehostname # takes effect on the next startup
+```
+
+### SSH
+
+An SSH server runs on the controller by default on port 22.
+
+### Robopipe API
+
+The Robopipe API is configured to start automatically at startup. If you wish to disable this, you can do so using:
+
+```bash
+sudo systemctl disable robopipeapi
+```
+
+## Service Mode
+
+Service mode is used for backing up the operating system, uploading new OS images, or restoring access to the unit. This mode does not publish an mDNS record.
+
+### Entering the Service Mode
+
+* Turn off the controller
+* Press and hold the _SERVICE_ button on your controller (usually located next to the USB labels)
+* Turn on the controller while still holding the _SERVICE_ button
+* Release the _SERVICE_ button when all the LEDs start blinking periodically
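The lowest-free-id rule described in the Hostname & mDNS section can be sketched in a few lines. This is a simplified illustration of the selection logic only, not the actual `robopipehostname` implementation (which also performs the network scan to discover taken hostnames):

```python
def next_hostname(taken: set) -> str:
    """Return the first robopipe-<id> hostname not already present on the network."""
    i = 1
    while f"robopipe-{i}" in taken:
        i += 1
    return f"robopipe-{i}"

print(next_hostname({"robopipe-1", "robopipe-2"}))  # robopipe-3
```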

docs/getting-started/connection.md — 3 additions, 3 deletions

@@ -24,16 +24,16 @@ Connect the luxonis camera directly to the controller via USB and then connect t
 
 ### Accessing the controller
 
-The easiest way to check if everything is running is to head over to `http://robopipe-controller-<id>.local`, where _id_ is the a number assigned to each controller on startup, based on the order in which you start the controllers, e.g. the first controller you connect will have id `1`, the next will have id `2` and so on. You should see output:&#x20;
+The easiest way to check if everything is running is to head over to `http://robopipe-<id>.local`, where _id_ is a number assigned to each controller on startup, based on the order in which you start the controllers, e.g. the first controller you connect will have id `1`, the next will have id `2`, and so on. You should see the output:&#x20;
 
 ```
 Hello from Robopipe API!
 ```
 
-The controller is also accesible via [SSH](https://en.wikipedia.org/wiki/Secure_Shell). The default hostname is set to `robopipe-controller-<id>.local`. The default user is `admin` with password `robopipe.io`. Enter this command into your terminal to connect to the controller, enter `robopipe.io` when prompted for password:
+The controller is also accessible via [SSH](https://en.wikipedia.org/wiki/Secure_Shell). The default hostname is set to `robopipe-<id>.local`. The default user is `admin` with password `robopipe.io`. Enter this command into your terminal to connect to the controller, and enter `robopipe.io` when prompted for a password:
 
 ```bash
-ssh admin@robopipe-controller-1.local
+ssh admin@robopipe-1.local
 ```
 
 When connecting for the first time, you will be asked to verify the authenticity of the controller's key. Enter `yes` and press enter.
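The greeting check above can also be scripted. A sketch using only the Python standard library; the `fetch` parameter is injectable purely so the logic can be exercised without a controller on the network:

```python
from urllib.request import urlopen

EXPECTED = "Hello from Robopipe API!"

def check_controller(controller_id: int, fetch=None) -> bool:
    """Return True if the controller answers with the expected greeting."""
    fetch = fetch or (lambda url: urlopen(url).read().decode())
    body = fetch(f"http://robopipe-{controller_id}.local")
    return body.strip() == EXPECTED

# on a connected machine: check_controller(1)
print(check_controller(1, fetch=lambda url: "Hello from Robopipe API!\n"))  # True
```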

docs/getting-started/hello-world.md — 21 additions, 20 deletions

@@ -20,45 +20,45 @@ The complete example with all the python code is available below. The jupyter no
 
 {% file src="../.gitbook/assets/hello_world_train.ipynb" %}
 
-## Listing devices
+## Listing cameras
 
-In order to capture training data we need to specify device from which to capture from. To do this we will list all devices.
+In order to capture training data, we need to specify the camera and the stream from which to capture images. To do this, we will first list all cameras and streams.
 
 {% code fullWidth="false" %}
 ```python
 import requests
 
 ID = 1 # substitute with the id of your controller
-API_BASE = f"http://robopipe-controller-{ID}.local"
+API_BASE = f"http://robopipe-{ID}.local"
 
 # fetch connected devices from the robopipe controller
-def get_devices():
+def get_cameras():
     return requests.get(f"{API_BASE}/cameras").json()
 
-devices = get_devices()
+cameras = get_cameras()
 ```
 {% endcode %}
 
-Devices will contain an array of connected devices, containing their MXID along with other information. To find out more about devices API head over to the [API reference](../api/rest-api-reference/cameras.md#cameras).
+`cameras` will contain an array of connected cameras, each containing its MXID along with other information. To find out more about the cameras API, head over to the [API reference](../api/rest-api-reference/cameras.md#cameras).
 
-We will use the first device.
+We will use the first camera.
 
 ```python
-mxid = devices[0].get("mxid")
+mxid = cameras[0].get("mxid")
 ```
 
-We will also need to select a camera. To list all cameras you can use the function below.
+We will also need to select a stream. A stream is either a single sensor on the camera or a combination of multiple sensors, usually used for depth detection. To list all streams, you can use the function below.
 
 ```python
-# retrieve the list of cameras associated with the selected device (mxid)
-def get_cameras(mxid: str):
-    return requests(f"{API_BASE}/cameras/{mxid}/sensors").json()
+# retrieve the list of streams associated with the selected camera (mxid)
+def get_streams(mxid: str):
+    return requests.get(f"{API_BASE}/cameras/{mxid}/streams").json()
 ```
 
-In my case, I will choose the first returned camera, which is _CAM\_A_. _CAM\_A_ is usually the RGB camera on most devices.
+In my case, I will choose the first returned stream, which is _CAM\_A_. _CAM\_A_ is usually the RGB camera on most devices.
 
 ```python
-sensor_name = list(get_cameras().keys())[0]
+stream_name = list(get_streams(mxid).keys())[0]
 ```
 
 ## Capturing images

@@ -86,7 +86,7 @@ def capture_data():
 
     if key == "s":
         image_response = requests.get(
-            f"{API_BASE}/cameras/{mxid}/sensors/{sensor_name}/still"
+            f"{API_BASE}/cameras/{mxid}/streams/{stream_name}/still"
         )
         image = image_response.content
         save_image(f"data/{i}.jpeg", image)

@@ -95,10 +95,10 @@ def capture_data():
         break
 ```
 
-To capture data we simply call `capture_data()`. This will, however, capture the data in full camera resolution, which might not be desired. In our case, the model will be trained on images of size 200x200. We can create another function, which will properly configure the camera, so that the captured images are the right size. There are numerous options which you can configure via the [config](../api/rest-api-reference/streams.md#cameras-mxid-sensors-sensor_name-config) and [control](../api/rest-api-reference/streams.md#cameras-mxid-sensors-sensor_name-control) API.
+To capture data, we simply call `capture_data()`. This will, however, capture the data at the full camera resolution, which might not be desired. In our case, the model will be trained on images of size 200x200, so we can create another function which configures the camera so that captured images are the right size. There are numerous options which you can configure via the [config](../api/rest-api-reference/streams.md#cameras-mxid-sensors-sensor_name-config) and [control](../api/rest-api-reference/streams.md#cameras-mxid-sensors-sensor_name-control) APIs.
 
 ```python
-def configure_camera(width: int, height: int):
+def configure_stream(width: int, height: int):
     data = {"still_size": (width, height)}
     return requests.post(
         f"{API_BASE}/cameras/{mxid}/sensors/{sensor_name}/config", data

@@ -262,8 +262,9 @@ To deploy our model we will use our API.
 ```python
 def deploy_model(path="model.blob"):
     requests.post(
-        f"{API_BASE}/cameras/{mxid}/sensors/{sensor_name}/nn",
+        f"{API_BASE}/cameras/{mxid}/streams/{stream_name}/nn",
         files={"model": open(path, "rb")},
+        data={"nn_config": }
     )
 ```

@@ -273,11 +274,11 @@ Calling this function will take a few seconds, since the camera needs to load th
 import anyio
 import websockets
 
-WS_BASE=f"ws://robopipe-controller-{ID}.local"
+WS_BASE=f"ws://robopipe-{ID}.local"
 
 async def main():
     async with websockets.connect(
-        f"{WS_BASE}/cameras/{mxid}/sensors/{sensor_name}/nn"
+        f"{WS_BASE}/cameras/{mxid}/streams/{stream_name}/nn"
     ) as ws:
         while True:
             msg = await ws.recv()
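The websocket loop in the last hunk is truncated at `ws.recv()`. A hedged sketch of handling one received message, assuming the `/nn` endpoint sends JSON-encoded inference results (the actual wire format is defined by the Robopipe API):

```python
import json

def decode_nn_message(msg):
    """Decode one websocket message from the /nn endpoint, assumed to be JSON."""
    if isinstance(msg, (bytes, bytearray)):
        msg = msg.decode()
    return json.loads(msg)

print(decode_nn_message('{"label": 1, "score": 0.9}'))
```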
