Important Notes about RosettaDrone
Hardware Issues
- Your smartphone will start to lag when the RC is running low on battery.
- The drone will overheat after 10 minutes or less if it is not flying. You will notice the drone's camera view lagging and turning green on your smartphone. Then everything starts to lag and fail (your drone, your RC, your smartphone, your desktop and your brain). Placing a fan in front of the drone while using the simulator is a must! A fan is not necessary during flight, since the airflow naturally cools down the drone.

- If your drone is not taking off (even when running in the simulator), check the battery level, the temperature and the error messages using the DJI Fly app. Sometimes the drone refuses to take off, reporting that it requires recalibration. These warnings sometimes disappear once a GPS connection is established, when there is more light in the room, etc.
- Please note that there are many possible sources of problems, which makes things more complex at the beginning. Get familiar with the whole hardware and software stack.
Debugging with Android Studio
- IMPORTANT: The debugger crashes if you don’t set “Debug Type” = “Java Only” in [Run > Edit Configurations > Debugger].
- Don't use the "Apply Code Changes" button while debugging, since it triggers this horrible bug.
- When starting the app on your smartphone, AOA (Android Open Accessory) sometimes won't work, so there will be no connection with the RC. In this case:
- Restart the app manually on your smartphone
- Then attach the debugger to the app
- The same applies if the debugger disconnects. You will notice this when the Stop button is disabled.

- Setting breakpoints is slower than reading and understanding the logs in the LogCat panel. But to make LogCat useful, you first need to:
- Filter out (fold) all the lines that are not important:
- Select the pattern, then right-click and choose "Fold Lines Like This":

- Increase the LogCat buffer size (see instructions here) so you don’t lose old messages.
- Please note that if you pause execution at a breakpoint, QGC will throw errors because it won't receive responses or heartbeats in time (timeouts).
Testing
- Use the latest QGroundControl version for testing.
- Define a testing routine and repeat it constantly, until you have identified, reproduced, debugged and fixed each little problem. Then define a new routine.
- If you feel that the systems are responding erratically, restart Rosetta and start from scratch.
- Keep the code clean at all times. Avoid hacks, or they will come back to bite you (multithreading nightmares).
- Automate the testing environment setup process (a sketch of such a helper script follows this list):
- Enable ADB over WiFi on your smartphone (only required the first time or after rebooting your phone)
- Connect ADB over WiFi to your smartphone
- If you have configured IPs in several places and they tend to change:
- Configure your firewall's DHCP settings to assign static IPs
- or use hostnames and set up a DNS server
- or use hostnames and set the IPs in your hosts file
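Such a helper script can be a few lines of Python wrapping adb. Here is a minimal sketch; it assumes the phone is attached via USB the first time (required to switch ADB into TCP mode), and the IP address and port are examples you need to adapt:

```python
#!/usr/bin/env python3
"""Sketch: switch ADB to TCP/IP mode and connect to the phone over WiFi."""
import subprocess

PHONE_IP = "192.168.1.50"  # example; use your phone's IP or a hostname from your hosts file
PORT = "5555"              # the usual ADB-over-TCP port

def adb(*args):
    """Run an adb command and fail loudly if it returns an error."""
    print("+ adb", *args)
    subprocess.run(["adb", *args], check=True)

# Only required the first time or after rebooting the phone
# (the phone must be attached via USB for this step):
adb("tcpip", PORT)

# From now on, connect over WiFi:
adb("connect", f"{PHONE_IP}:{PORT}")
```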
Test Mode
Rosetta provides a test mode which offers a simple internal simulator, so the app can be tested without needing to connect a real drone.
To avoid the need to press the “Test” button every time you start debugging, you can configure Android Studio to start directly in Test Mode. To do so, set:
[Run > Edit Configurations > General tab > Launch Flags] = “--es mode test”
Debugging the MavLink protocol
- The MavLink messages and commands are explained in detail here.
- You can configure Rosetta to connect to multiple telemetry UDP ports to send and receive MAVLink messages (see the sketch at the end of this section for a simple way to watch this traffic).
- To study the communication protocol, I used to connect QGC to the “ArduCopter SITL” simulator running:
- ArduCopter.exe -M+ -s1 --uartA tcp:0 --defaults "copter.parm"
- "ArduCopter.exe" and the "copter.parm" file can be installed/generated by Mission Planner. Under Windows, the file is located here:
- "C:\Users\MyUser\Documents\Mission Planner\sitl\default_params\copter.parm"
- You can additionally connect ArduCopter.exe to QGC using MAVProxy:
- "mavproxy.exe" --master=tcp:127.0.0.1:5760 --out=udp:127.0.0.1:14550
- The MAVLink communication is logged in MAVProxy's console, and QGC displays real-time statistics and graphs.
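If you prefer to watch the traffic programmatically instead of in MAVProxy's console, a few lines of pymavlink (pip install pymavlink) are enough. A minimal sketch; 14550 matches the --out endpoint above, and note that only one program can bind that UDP port at a time:

```python
"""Sketch: print every MAVLink message arriving on UDP port 14550."""
from pymavlink import mavutil

conn = mavutil.mavlink_connection("udpin:0.0.0.0:14550")
while True:
    msg = conn.recv_match(blocking=True)  # wait for the next message
    if msg is not None:
        print(msg.get_type(), msg.to_dict())
```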
Reporting Issues on GitHub
- Post one issue per issue. If you open an issue reporting multiple problems, I will probably just close it.
- Don't create duplicate issues. I like people who invest their time in searching and reading existing issues with the intention of fixing them themselves and contributing back. As the only active maintainer, I'm aware of all posted issues and immediately notice when someone posts a duplicate without doing their homework first. And we are also a very small GitHub user base, so we remember who posted what.
Computer Vision with RosettaDrone
Here I will explain how to set up the environment I use to:
- stream video (raw images) from Rosetta to DroneKit with low latency and high quality
- use computer vision to do pose estimation of AprilTags and land the drone
- simulate and test the whole Software stack without a real drone
Warning
RosettaDrone is developed by open-source developers for open-source developers. If you are not going to contribute back, don't use this software, don't read this documentation and don't post issues. We are not competing, but collaborating. The world is big and life is short.
About Video Issues on GitHub
- RosettaDrone offers compressed H.264 video streaming, but it doesn't work for all drone models. See more details here.
- That's why I'm presenting an alternative approach to send raw uncompressed images for doing computer vision.
- I also send the timestamp and the drone's yaw at the moment the image was captured, which is important for dealing with video latencies (see the sketch after this list).
- Please note that this approach is not for sending video to QGC.
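To illustrate why the capture-time yaw matters, here is a minimal sketch; the function name and frame conventions are mine, not part of Rosetta's API. With video latency, converting a tag offset from the camera frame into world (NED) coordinates must use the yaw the drone had when the frame was captured, not the current yaw, otherwise a rotating drone gets steered in the wrong direction:

```python
import math

def camera_offset_to_ned(x_right, y_forward, yaw_rad):
    """Rotate a body-frame target offset (in metres) into NED coordinates.
    Pass the yaw the drone had WHEN THE FRAME WAS CAPTURED (sent along
    with the image), not the current yaw: with video latency the two
    can differ significantly while the drone is rotating."""
    north = y_forward * math.cos(yaw_rad) - x_right * math.sin(yaw_rad)
    east = y_forward * math.sin(yaw_rad) + x_right * math.cos(yaw_rad)
    return north, east

# Example: tag 1 m ahead, drone was heading east (yaw = 90 deg) at capture time
print(camera_offset_to_ned(0.0, 1.0, math.radians(90)))  # ~ (0.0, 1.0): move east
```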
Setup
Download and install all the software in the same directory. You should end up with the following directory structure:
- aruco
- apriltag
- SmartLanding
- vision_landing
- mavlink-camera-simulator
Environment
- I’m using Fedora Linux 37. Other Linux distros should also work fine.
- I'm using Python 2.7 (it would be good to migrate the code to Python 3).
Vision Landing, Aruco, AprilTag and SmartLanding
- Don’t use the Maverick automated installation.
- Although we use AprilTags, we will still need the Aruco library to load the camera parameters (we should replace this code in the future and completely deprecate the aruco library). Aruco version 3.1.0 can be downloaded here.
- I added a calibration file for the DJI Mini SE, but the calibration process wasn’t optimal, so if you have some spare time later I would recommend repeating the calibration process (follow the instructions for the aruco markers).
- If you need to test algorithms and visualize information using graphs, I would recommend extending this Python module that is included in Vision Landing.
- I added the algorithms to a separate project to make it easier to test them outside Vision Landing and Rosetta.
- Some code currently in Vision Landing should be moved to SmartLanding: Vision Landing should only be used for the marker pose estimation, while SmartLanding should be an environment for testing algorithms and analyzing drone dynamics, not limited to landing (I chose a bad name).
- Anyhow, for analyzing DJI drone dynamics, it makes sense to use AprilTags and take the pose estimation as a reference, since the positioning system provided by the DJI SDK is useless.
MAVLink Camera Simulator
If you need to simulate the camera image, you can use: https://github.com/RosettaDrone/mavlink-camera-simulator
I wrote this because I couldn’t personally use Gazebo (poor performance because of a missing GPU driver). If you can and feel comfortable using Gazebo, go for it.
For testing Vision Landing I just needed to render an AprilTag on the ground.
Receiving the Images
Enable the RawVideoStreamer plugin in RosettaDrone
- Edit PluginManager.java
- Replace the line:
- List<String> classNames = Arrays.asList();
- with:
- List<String> classNames = Arrays.asList("RawVideoStreamer");
- Please don’t commit this change, since enabling this plugin disables the video inside Rosetta (this should also be fixed).
- Also change the hostname where the images should be sent to. The hostname is hardcoded in the “connect()” method.
- You can send a dummy video by enabling this variable:
- private static final boolean TEST = true;
- This is useful for testing the video receiver code using the Test Mode.
Receive the Images in Vision Landing
- Use the input string “raw-tcp” (“-i raw-tcp” in the command line or “input=raw-tcp” in the vision_landing.conf file).
Receiving the Images directly in Python
- Oops… Now that I'm writing this, I remembered that "vision_landing" (the Python frontend) uses "track_targets.cpp" (a C++ backend) to receive the images from Rosetta and to do the pose estimation with the AprilTags library.
- "track_targets.cpp" then sends the pose estimation information to the Python frontend, but not the images.
- If you are not going to use C++ but only Python (slower but easier), you would need to translate to Python the C++ code I use to receive the images (together with the yaw and timestamp) from Rosetta.
- UPDATE: Never mind… I just added a class and an example to receive the image, timestamp and yaw directly in Python 2.7 and Python 3, without needing C++ or Vision Landing (a sketch of the general pattern follows this list):
- If you notice high latency, check the fps you are sending and the fps you are receiving. If you send too many frames per second, they will queue up on the receiving end and produce latency. Check the code. It's quite simple.
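Below is a sketch of the general receive pattern only. The header layout (timestamp, yaw, payload length) and the port are assumptions for illustration; check the RawVideoStreamer plugin and the bundled example for the actual wire format:

```python
"""Illustrative raw-frame receiver. The header layout here
(little-endian: int64 timestamp ms, float yaw deg, int32 payload size)
is a HYPOTHETICAL example, not Rosetta's actual format."""
import socket
import struct

HEADER = struct.Struct("<qfi")  # timestamp, yaw, payload length (assumed layout)

def recv_exact(sock, n):
    """Read exactly n bytes (TCP may deliver less per recv() call)."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("stream closed")
        buf += chunk
    return buf

# Rosetta's plugin connects out to this host (see connect()), so we listen:
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("0.0.0.0", 5599))  # example port, adjust to the plugin's setting
server.listen(1)
conn, _ = server.accept()
while True:
    ts_ms, yaw_deg, size = HEADER.unpack(recv_exact(conn, HEADER.size))
    frame = recv_exact(conn, size)  # raw pixel data
    # Process `frame` here; if processing is slower than the sender's fps,
    # drop frames instead of letting them queue up (that produces latency).
    print(ts_ms, yaw_deg, len(frame))
```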
Run Vision Landing
To perform a landing:
- Make the drone take off
- Run vision_landing
- Optional:
- Receive the output video stream (with the augmented reality drawings) using gstreamer (see the README file).
- In case the drone doesn’t execute the final landing on irregular terrain, disable the “Landing Protection” configuration inside RosettaDrone.
Have fun and share your progress.