Tutorial
Table Of Contents
- 1 Premise
- 2 Raspberry Pi System Installation and Development
- 3 Log In to The Raspberry Pi and Install The App
- 4 Assembly and Precautions
- 5 Controlling Robot via WEB App
- 6 Common Problems and Solutions (Q&A)
- 7 Set The Program to Start Automatically
- 8 Remote Operation of Raspberry Pi Via MobaXterm
- 9 How to Control WS2812 RGB LED
- 10 How to Control The Servo
- 11 How to Control DC Motor
- 12 Ultrasonic Module
- 13 Line Tracking
- 14 Make A Police Light or Breathing Light
- 15 Real-Time Video Transmission
- 16 Automatic Obstacle Avoidance
- 17 Why OpenCV Uses Multi-threading to Process Video
- 18 Learn to Use OpenCV
- 19 Using OpenCV to Realize Color Recognition and Tracking
- 20 Machine Line Tracking Based on OpenCV
- 21 Create A WiFi Hotspot on The Raspberry Pi
- 22 Install GUI Dependent Item under Windows
- 23 How to Use GUI
- 24 Control The WS2812 LED via GUI
- 25 Real-time Video Transmission Based on OpenCV
- 26 Use OpenCV to Process Video Frames on The PC
- 27 Enable UART
- 28 Control Your AWR with An Android Device
- Conclusion
23 How to Use GUI
● Because the web app and the GUI are separate, if you want to use the GUI to control the robot, you need to run server.py manually. (The method is the same as manually running webserver.py, except that the script you run is server.py.)
● When server.py is running successfully on the Raspberry Pi, enter the IP address of the Raspberry Pi in the GUI on the PC and then click Connect to control the robot.
● Python running window: this window opens alongside the GUI. Any runtime exceptions are displayed here. If this window is closed, the GUI exits as well.
● Camera video streaming window: displays the picture captured by the camera. Depending on the product, the window may be rendered differently, and on some products you can also interact with this window.
● IP address input box: enter the IP address of the Raspberry Pi here. Clicking Connect makes the GUI connect to the Raspberry Pi.
● Raspberry Pi status and connection status display bar: shows some hardware information of the Raspberry Pi and the current connection status.
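The following is a minimal sketch of the connect flow and the IP/status widgets described above, written with tkinter and a plain TCP socket. The port number and the idea of keeping the socket for later commands are assumptions for illustration only; the real connection logic lives in the product's GUI code and server.py.

import socket
import tkinter as tk

PORT = 10223   # hypothetical port; check server.py for the port it actually listens on

def connect():
    ip = ip_entry.get().strip()
    try:
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.settimeout(3)
        sock.connect((ip, PORT))          # what the Connect button does conceptually
        app.sock = sock                   # keep the socket so later commands can use it
        status_var.set('Connected to ' + ip)
    except OSError as e:
        status_var.set('Connection failed: ' + str(e))

app = tk.Tk()
app.title('Robot GUI (sketch)')

ip_entry = tk.Entry(app, width=20)                       # IP address input box
ip_entry.pack(side=tk.LEFT, padx=5, pady=5)
tk.Button(app, text='Connect', command=connect).pack(side=tk.LEFT)
status_var = tk.StringVar(value='Disconnected')          # status display bar
tk.Label(app, textvariable=status_var).pack(side=tk.LEFT, padx=5)

app.mainloop()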
● Movement control (a key-binding sketch follows the camera control list below):
·Forward: Control the robot to move forward; the shortcut key is W.
·Backward: Control the robot to move backwards; the shortcut key is S.
·Left: Control the robot to turn left; the shortcut key is A.
·Right: Control the robot to turn right; the shortcut key is D.
● Robot camera control (due to the different layout, the shortcut keys here differ from those in the web app):
·Up: Control the camera to move upwards; the shortcut key is I.
·Down: Control the camera to move downwards; the shortcut key is K.
·Home: Control all servos to return to the neutral position; the shortcut key is H.
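Below is a minimal sketch of how the W/S/A/D and I/K/H shortcut keys could be bound in tkinter. The command names passed to send() are placeholders; the commands the real server.py understands may differ.

import tkinter as tk

def send(command):
    # In the real GUI this would write the command to the open socket;
    # here we only print it so the sketch runs on its own.
    print('send:', command)

KEYMAP = {
    'w': 'forward', 's': 'backward', 'a': 'left', 'd': 'right',
    'i': 'up', 'k': 'down', 'h': 'home',
}

def on_key(event):
    command = KEYMAP.get(event.char.lower())
    if command:
        send(command)

app = tk.Tk()
app.title('Shortcut keys (sketch)')
app.bind('<KeyPress>', on_key)    # listen for key presses on the main window
app.mainloop()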
● FindColor: by default, the robot finds the biggest yellow object in its view and follows it. When it gets close enough it stops, and if it gets too close to the yellow object it backs up.
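A minimal OpenCV sketch of the color-finding part is shown below. The HSV range for yellow and the idea of using the bounding-box size as a distance cue are assumptions for illustration; the product's own follow/stop/back-up logic is in its FPV code.

import cv2
import numpy as np

cap = cv2.VideoCapture(0)                       # robot camera
lower = np.array([20, 100, 100])                # assumed lower HSV bound for yellow
upper = np.array([35, 255, 255])                # assumed upper HSV bound for yellow

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower, upper)       # keep only yellow pixels
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
    if contours:
        biggest = max(contours, key=cv2.contourArea)
        x, y, w, h = cv2.boundingRect(biggest)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 255), 2)
        # The box size is a simple distance cue: very large means too close
        # (back up), small means far away (move forward), otherwise stop.
    cv2.imshow('FindColor (sketch)', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):       # press q to quit
        break

cap.release()
cv2.destroyAllWindows()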
● WatchDog: if the camera on the robot detects an object moving or changing, the LEDs on the robot turn red. This feature is based on Adrian Rosebrock's OpenCV code on pyimagesearch.com. You can also explore OpenCV further for more fun, such as syncing the captured image to Dropbox after motion is detected. The example program we provide only makes the LEDs turn red; for other functions, you can install the corresponding packages as needed and change the code in FPV.py.
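The sketch below shows the basic motion-detection idea (frame differencing against a reference frame), in the spirit of the pyimagesearch tutorial mentioned above. turn_leds_red() and the sensitivity threshold are placeholders; hook them up to your robot's own LED code (for example in FPV.py).

import cv2

def turn_leds_red():
    print('motion detected -> LEDs red')        # placeholder for the real LED call

cap = cv2.VideoCapture(0)
reference = None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (21, 21), 0)  # smooth out sensor noise
    if reference is None:
        reference = gray                        # the first frame becomes the reference
        continue
    delta = cv2.absdiff(reference, gray)
    thresh = cv2.threshold(delta, 25, 255, cv2.THRESH_BINARY)[1]
    if cv2.countNonZero(thresh) > 5000:         # assumed sensitivity threshold
        turn_leds_red()
    cv2.imshow('WatchDog (sketch)', thresh)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()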
● FindLine: the robot can track lines and follow them, proceeding along a preset path that can be altered by moving the lines. This part of the Python program is easy to understand; you can open findline.py and learn to write it yourself.
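As a starting point for reading findline.py, here is a minimal sketch of the idea: threshold the frame and use the horizontal position of the line near the bottom of the image to decide which way to steer. The threshold value and the dark-line-on-light-floor assumption are illustrative only and may need to be inverted for your track.

import cv2
import numpy as np

cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Assume a dark line on a light floor; THRESH_BINARY_INV makes the line white.
    _, mask = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY_INV)
    row = mask[int(mask.shape[0] * 0.8)]        # sample a row near the bottom of the frame
    line_pixels = np.where(row == 255)[0]
    if line_pixels.size:
        center = int(line_pixels.mean())        # horizontal position of the line
        offset = center - mask.shape[1] // 2    # negative means the line is to the left
        print('steer', 'left' if offset < 0 else 'right', 'offset =', offset)
    cv2.imshow('FindLine (sketch)', mask)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()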
● Add More Functions: the Function 5 and Function 6 buttons are placeholders for other functions you want to add. This robot is based on the Raspberry Pi, so there are many more functions you can play with, but some of them require additional libraries.
We intend to simplify the installation steps as much as possible to lower the barrier for more people. Hence, for example, voice recognition, which requires a large number of libraries to be installed, will not be provided in