1 Resources Links
Robot Name: Adeept_AWR
Robot URL: https://github.com/adeept/Adeept_AWR
Robot Git: https://github.com/adeept/Adeept_AWR.git
[Official Raspberry Pi website] https://www.raspberrypi.org/downloads/
[Official website] https://www.adeept.com/
[GitHub] https://github.com/adeept/Adeept_AWR
[Image file and Documentation for structure assembly] https://www.adeept.com/learn/detail-35
2 Components List
Acrylic Plates
The acrylic plates are fragile, so please handle them carefully during assembly to avoid breaking them. Each acrylic plate is covered with a layer of protective film; you need to remove it first. Some holes in the acrylic may have residue in them, so you need to clean them before use.
3 Machinery Parts
4 Electronic Parts
Raspberry Pi Camera X1
Motor X4
Servo X1
Wheel X4
Robot HAT X1
18650 Battery Holder Set X1
Car Light X2
3-Pin Wire X2
Adeept Ultrasonic Module X1
4-Pin Wire X1
5-Pin Wire X1
3-Pin Wire X2
Raspberry Pi Camera Ribbon X1
3-Channel Tracking Module X1
6 Tools
Hex Wrench 2.0mm X1
Cross Screwdriver X1
Cross Socket Wrench X1
Large Cross-head Screwdriver X1
Winding Pipe X1
Ribbon X1
Self-prepared Parts
Requirements for the 18650 lithium batteries: 18650 lithium batteries are required for the normal operation of the robot; use cells with a continuous current output above 4A.
1. Premise
1.1 STEAM and Raspberry Pi
STEAM stands for Science, Technology, Engineering, Arts and Mathematics. It is a transdisciplinary, practice-focused approach to education. As a board designed for computer programming education, the Raspberry Pi has many advantages over other robot development boards, so the Raspberry Pi is used for the function control of this robot.
1.2 About The Documentation
This documentation is a software installation and operation guide for the Python robot product.
2. Raspberry Pi System Installation and Development Environment Establishment
2.1 Install An Operating System for The Raspberry Pi
2.1.1 Method A: Write 'Raspbian' to The SD Card by Raspberry Pi Imager
Raspberry Pi Imager is an SD card image writing tool developed by the Raspberry Pi Foundation.
●Insert the SD card into the card reader and connect the card reader to your computer.
●Run Raspberry Pi Imager, select CHOOSE OS -> Raspbian(other) -> Raspbian Full - A port of Debian with desktop and recommended applications.
●Click on CHOOSE SD CARD to select the SD card to write Raspbian Full to; please note that writing the image will automatically delete all files already on the SD card.
●Click on WRITE and wait for the writing to finish.
●Do not remove the SD card after writing is completed; we'll use it for configuring SSH and the WiFi connection later. Otherwise, if you remove the card, insert it into the Raspberry Pi and boot, the WiFi configuration without any peripherals may fail in the following process.
2.1.2 Method B: Download The Image File Raspbian and Write It to The SD Card Manually
●Since Raspberry Pi Imager downloads the image file itself in 2.1.1, the process can take a long time on a slow network in some places, so you can download the image manually instead.
4. Download the image file `Raspbian`:
- Torrent file: [Raspbian - Raspbian Buster with desktop and recommended software]
- Zip file: [Raspbian - Raspbian Buster with desktop and recommended software]
5. Unzip the file; note that the path of the extracted `.img` file should be in English, with no special characters allowed.
6. Write the downloaded image file `Raspbian` to the SD card with `Raspberry Pi Imager`.
7. Leave the SD card connected after writing is completed; we'll use it for configuring SSH and the WiFi connection later.
●On the Raspberry Pi website [Official Raspberry Pi website], go through Downloads -> Raspbian -> Raspbian Buster with desktop and recommended software, and click on the torrent or zip file to download it. Unzip the file after the download; note that the path of the extracted .img file should be in English, with no special characters allowed, otherwise Raspberry Pi Imager may not open the .img file. It's recommended to save the .img file to the root directory of the C:\ or D:\ drive, but do not save it to a path containing special characters.
●Do not remove the SD card after writing is completed; we'll use it for configuring SSH and the WiFi connection later. Otherwise, if you remove the card, insert it into the Raspberry Pi and boot it up, the WiFi configuration without any peripherals may fail in the following process.
2.1.3 Method C: Manually Download The Image File Provided by Us and Write It to The SD Card (Not Recommended)
●The Raspbian image file downloaded in 2.1.1 and 2.1.2 is from the official source, with some software preinstalled.
3. Install the `Raspberry Pi Imager`.
4. Download the image file `Adeept_AWR` - [Image file for the Adeept_AWR Robot].
5. Unzip the file; note that the path of the extracted `.img` file should be in English, with no special characters allowed.
6. Write the downloaded image file `Adeept_AWR` to the SD card with `Raspberry Pi Imager`.
7. Leave the SD card connected after writing is completed; we'll use it for configuring SSH and the WiFi connection later.
Note that writing the image will automatically delete all files on the SD card, if any.
●Click on WRITE and wait for the writing to finish.
●Do not remove the SD card after writing is completed; we'll use it for configuring the WiFi connection later. Otherwise, if you remove the card, insert it into the Raspberry Pi and boot it up, the WiFi configuration without any peripherals may fail in the following process.
2.2 Enable SSH
2.2.1 Method A: Enable SSH with Peripherals
●If you used 2.1.3 (manually download the image file we provide and write it to the SD card) to write the Raspberry Pi's operating system to the SD card, you do not need to follow this section to enable SSH, because the SSH service in that image is already enabled.
●If you've connected a mouse, keyboard and monitor to the Raspberry Pi, follow these steps to enable SSH.
●If you haven't connected any monitor to the Raspberry Pi, follow these steps to enable SSH.
1. Do not remove the SD card after `Raspberry Pi Imager` writes the image file.
2. Create a file named `ssh` under any directory, without any extension name. You may create an `ssh.txt` and delete the `.txt` (make sure that under Folder Options the box "Hide extensions for known file types" is unchecked); then you have an `ssh` file without an extension name.
3. Create a file named `wpa_supplicant.conf` and fill it with the WiFi configuration template shown below.
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1
country=Insert country code here
network={
    ssid="Name of your WiFi"
    psk="Password for your WiFi"
}
4. Type in your own information for `Insert country code here`, `Name of your WiFi`, and `Password for your WiFi`. Pay attention to capitalization. Refer to the example below:
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1
country=US
network={
    ssid="MyName"
    psk="12345678"
}
5. Save and exit. Copy the `wpa_supplicant.conf` to the root directory of the SD card.
6. If you've already copied the file `ssh` to the SD card as instructed in 2.
3 Log In to The Raspberry Pi and Install The App
●If you followed the steps in 2.2.1 and 2.3.1 for the SSH and WiFi configuration, you may remove the peripherals now and use SSH to remotely control the Raspberry Pi later on.
●If you followed the steps in 2.2.2 and 2.3.2, you may now insert the SD card into the Raspberry Pi and boot it up. The Raspberry Pi will boot and connect to WiFi automatically when powered on, with no need for peripherals.
●If you use the operation steps of 2.1.
of the Raspberry Pi, `raspberry` (pay attention to capitalization). Nothing changes on the screen while you type, but that doesn't mean nothing is being entered; press 'enter' after you finish typing.
●Now you've logged into the Raspberry Pi.
3.2 Log into Raspberry Pi (Linux or Mac OS)
●Before connecting to the Raspberry Pi via SSH, you need to know the IP address of the Raspberry Pi.
●For lower versions of Windows, SSH is not built in; you may log into the Raspberry Pi by referring to the official Raspberry Pi documentation [SSH using Windows].
●Before connecting to the Raspberry Pi via SSH, you need to know its IP address. Check the management interface of your router, or download the app `Network Scanner` and search for a device named `RASPBERRY` or `Raspberry Pi Foundation` to get the IP address.
wait until it's done.
3.5 Install Corresponding Dependent Libraries
●Follow the steps below to install the libraries if you wrote the image file to the SD card following 2.1.1 (Write 'Raspbian' to the SD card by `Raspberry Pi Imager`) or 2.1.2 (Download the image file `Raspbian` and write it to the SD card manually).
●A script is provided here that installs all the needed dependent libraries and configures the camera and other auto-start programs.
●If it fails to enter the page, log into the Raspberry Pi via SSH and type in the command below to end the program that auto-runs at boot, in order to release resources; otherwise issues like camera initialization failure or occupied ports may occur.
sudo killall python3
●Type in the command below to run `webServer.py`:
sudo python3 Adeept_AWR/server/webServer.py
●Check whether there are any errors and solve them based on the instructions in the Q&A section below.
4 Assembly and Precautions
●If your servo does not return to the original position automatically, you can manually run the server.py file and then try connecting the servo again.
●Preparations before Assembly
Connect the Adeept Ultrasonic Module with the 4-pin wire.
Connect the car light. Please note that the end marked with a white strip is the signal input and the end without the white strip is the signal output; connect the WS2812 input to the Adeept Robot HAT.
Connect the Raspberry Pi Camera and the ribbon. Note: in the following operations, the Pi Camera should always stay connected to the Raspberry Pi, and do not reverse the ribbon cable of the Raspberry Pi.
●Fix four M2.5x10+6 copper standoffs on the Raspberry Pi.
1. Connect the 18650 Battery Holder Set to the Adeept Motor HAT.
2. Put two 18650 batteries into the 18650 Battery Holder Set as shown below.
3. Connect the servos to the Adeept Motor HAT.
4. Before switching on, you need to insert the configured SD card into the Raspberry Pi; for details, please refer to the third chapter of this document. Otherwise, the servo will not rotate to the middle position after booting, and without the SD card the servo needs to be rotated to the middle position manually. After debugging, remove the servo and battery holder, and take the 18650 batteries out of the holder set. Do not rotate the servo shaft before the servo is fixed to the rocker arm.
●Body parts.
1.
2.
3. Assemble the camera.
4. Fix a debugged servo to the acrylic plate.
5. Fix the rocker arm on the acrylic plate to the servo.
Assemble the following components Effect diagram after assembling
6.
7.
Assemble the following components Effect diagram after assembling
Assemble the following components Effect diagram after assembling
8.
9.
10. Fix the Adeept Ultrasonic Module on the acrylic plate.
11. Connect the Adeept Ultrasonic Module, Car Light, 3-Channel Tracking Module Set and motors as shown below before assembling the body part.
12. Assemble the body part.
13. Connect the front part and the body part.
14. Strengthen the body part.
Assemble the following components Effect diagram after assembling
Assemble the following components Effect diagram after assembling
15. Install the wheels on the car.
4.2 Tips for Structural Assemblage
●Since many servos are used in the product, servo installation is critical for the robot. Before installing a rocker arm onto a servo, you need to power the servo and let the servo shaft rotate to the central position, so that a rocker arm installed at the designated angle will be in the central position.
●Generally the Raspberry Pi will auto-run `webServer.py` at boot; when `webServer.py` runs, all servo ports output the signal for rotating to the central position.
●You can also use a power lithium battery to power the Motor HAT; the Motor HAT supports power supplies below 15V.
●You can use a USB cable to power the Motor HAT when installing the servo rocker arms during structural assembly. After the robot software is installed, the Raspberry Pi will control the Motor HAT to set all servo ports to output the central-position signal once it has started up.
5 Controlling Robot via WEB App
●The WEB app is developed so that common users can control the robot more easily. It's convenient to use: you may control the robot wirelessly from any device with a web browser (Google Chrome was used for testing).
●Generally the Raspberry Pi will auto-run `webServer.py` at boot and establish a web server in the LAN. You may then use any other computer, mobile phone or tablet in the same LAN to visit the web page and control the robot.
·`MOTION GET`: Motion detection function based on OpenCV. When objects move in the camera's view, the program will circle the moving part in the `Video` window, and the LED lights on the robot will change accordingly.
·`AUTOMATIC`: Obstacle avoidance function based on ultrasonic ranging. When the ultrasonic module on the robot detects an obstacle, the robot will automatically turn left, and it will take a step backward before turning if it is too close to the obstacle.
6 Common Problems and Solutions (Q&A)
●Where to find the IP address of the Raspberry Pi?
Before connecting to the Raspberry Pi via SSH, you need to know its IP address. Check the management interface of your router, or download the app `Network Scanner` and search for a device named `RASPBERRY` or `Raspberry Pi Foundation` to get the IP address.
●The servo doesn't return to the central position when connected to the driver board.
In general, the Raspberry Pi will auto-run `webServer.py` at boot, and `webServer.py` will control the servo ports to send the signal for rotating to the central position. When assembling the servos, you can therefore connect a servo to any servo port at any time.
●Motor movement direction is incorrect.
Because motors come from different batches, the direction of rotation for the same signal may differ. An interface for adjusting the motors' direction of rotation is provided in the program: you need to open move.py.
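The direction-adjustment interface typically amounts to a per-motor flag that is combined with the commanded direction; the names below are illustrative, not the exact contents of move.py, which is a minimal sketch of the common pattern:

```python
# Hypothetical sketch of a motor-direction flag, not the exact move.py code.
Motor_A_Reversed = 1  # flip this between 0 and 1 if motor A runs the wrong way

def motor_a_pins(direction):
    """Map a commanded direction (1 = forward, -1 = backward) to the two
    L298N input-pin levels, honoring the reversal flag."""
    forward = (direction == 1)
    if Motor_A_Reversed:
        forward = not forward
    # (IN1, IN2): one pin high and the other low selects the rotation direction
    return (1, 0) if forward else (0, 1)

print(motor_a_pins(1))   # with Motor_A_Reversed = 1 -> (0, 1)
```

Toggling the flag swaps the two pin levels for every command, which reverses the motor without rewiring it.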
7 Set The Program to Start Automatically
7.1 Set The Specified Program to Run Automatically at Boot
●This section only introduces the auto-run method used by our products. If you need more information about auto-running programs on the Raspberry Pi, you can refer to the document Auto-Run from itechfy.
●If you have followed the operation steps of 3.5 or 3.6, the script has already configured the program to run automatically at startup.
●For example, if we want to replace webServer.py with server.py, we only need to edit the following: replace
sudo python3 [RobotName]/server/webServer.py
with
sudo python3 [RobotName]/server/server.py
●Save and exit, so that the robot will automatically run server.py instead of webServer.py the next time it is turned on.
●server.py is the socket server used by the python GUI.
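As a sketch of that edit (assuming the auto-run entry lives in `/etc/rc.local`, one common place for such a line; the exact file depends on how the image was configured, and `[RobotName]` stands for your robot's folder name):

```shell
# /etc/rc.local — hypothetical sketch of the auto-run entry
# Old entry, commented out:
# sudo python3 [RobotName]/server/webServer.py &
# New entry, so server.py runs at boot instead:
sudo python3 [RobotName]/server/server.py &
exit 0
```

The trailing `&` keeps the script from blocking the rest of the boot sequence while the server runs.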
8 Remote Operation of Raspberry Pi Via MobaXterm
●To make daily use of the Raspberry Pi more convenient, we usually do not connect peripherals such as a mouse, keyboard, or monitor to it. Since the Raspberry Pi is installed inside the robot, controlling it with connected peripherals would seriously affect the efficiency of programming and testing, so we introduce a method for programming on the Raspberry Pi remotely.
●There are many ways to program on the Raspberry Pi.
When connecting to the Raspberry Pi again, you will need to enter the user name and password if they were not saved; if the IP address of the Raspberry Pi has changed, you need to start a new session.
●After a successful login, the left column is replaced with a file transfer system, which allows you to exchange files with the Raspberry Pi. If you want to return to session selection, just click Sessions.
9 How to Control WS2812 RGB LED
●The WS2812 LED is a commonly used module on our robot products; there are three WS2812 LEDs on each module. Please pay attention when connecting them: the signal line is directional, and the signal led out from the Raspberry Pi must be connected to the WS2812 module's input end.
from rpi_ws281x import Adafruit_NeoPixel, Color

class LED:
    def __init__(self):
        self.LED_COUNT = 16        # Set to the total number of LED lights on the robot product
        self.LED_PIN = 12          # Set to the input pin number of the LED group
        self.LED_FREQ_HZ = 800000
        self.LED_DMA = 10
        self.LED_BRIGHTNESS = 255
        self.LED_INVERT = False
        self.LED_CHANNEL = 0
        # Use the configuration items above to create a strip
        self.strip = Adafruit_NeoPixel(
            self.LED_COUNT, self.LED_PIN, self.LED_FREQ_HZ, self.LED_DMA,
            self.LED_INVERT, self.LED_BRIGHTNESS, self.LED_CHANNEL)
        self.strip.begin()
●Instantiate the object and execute the method function. The function colorWipe() takes three parameters, R, G and B, corresponding to the brightness of the three primary colors red, green and blue. The value range is 0-255; the larger the value, the brighter the corresponding color channel. If the three channel values are equal, white light is emitted. A specific example is as follows:
if __name__ == '__main__':
    LED = LED()
    try:
        while 1:
            LED.colorWipe(255, 0, 0)   # all LEDs show red
    except KeyboardInterrupt:
        LED.colorWipe(0, 0, 0)         # turn the LEDs off when exiting
10 How to Control The Servo
10.1 Control The Servo to Rotate to A Certain Angle
●Since a servo uses a PWM signal to control the rotation angle of a mechanism, it is a commonly used module on robot products; walking robots, robotic arms and gimbals are all driven by servos. The driver board Motor HAT of our Raspberry Pi robot has a dedicated PCA9685 chip for controlling servos, and the Raspberry Pi uses I2C to communicate with the PCA9685.
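As a sketch of the underlying math (assuming the common 50 Hz servo frame and a 0.5-2.5 ms pulse range; the actual limits in the robot's code may differ), the PCA9685's 12-bit counter value for a given angle can be computed like this:

```python
def angle_to_ticks(angle, freq_hz=50, min_ms=0.5, max_ms=2.5):
    """Convert a servo angle (0-180 degrees) to a PCA9685 12-bit tick count.

    The PCA9685 divides each PWM frame into 4096 ticks; the servo angle is
    proportional to the width of the high pulse within the frame.
    """
    frame_ms = 1000.0 / freq_hz                        # 20 ms frame at 50 Hz
    pulse_ms = min_ms + (max_ms - min_ms) * angle / 180.0
    return round(pulse_ms / frame_ms * 4096)

print(angle_to_ticks(90))   # neutral position, ~1.5 ms pulse -> 307 ticks
```

This is why "rotate to the central position" corresponds to one specific tick value sent to the chip.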
10.3 Non-blocking Control
●You can find the RPIservo.py file in the server folder of the robot product. Copy it into the same folder as the program you want to run, and then you can use this method in your program:
import RPIservo   # Import the library that uses a separate thread to control the servos
import time

sc = RPIservo.ServoCtrl()   # Instantiate the object that controls the servos
sc.start()                  # Start the thread; while no servo is moving, the thread is suspended
while 1:
    sc.
11 How to Control DC Motor
●If the Raspbian image version you installed is the Raspbian Full provided by the official website, you do not need to install any other dependent libraries. We only need to control the GPIO ports of the Raspberry Pi with simple high and low levels and PWM to drive the L298N chip on the Motor HAT, thus controlling the direction and speed of the motors.
def setup():   # GPIO initialization; the motors cannot be controlled without it
    global pwm_A, pwm_B
    GPIO.setwarnings(False)
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(Motor_A_EN, GPIO.OUT)
    GPIO.setup(Motor_B_EN, GPIO.OUT)
    GPIO.setup(Motor_A_Pin1, GPIO.OUT)
    GPIO.setup(Motor_A_Pin2, GPIO.OUT)
    GPIO.setup(Motor_B_Pin1, GPIO.OUT)
    GPIO.setup(Motor_B_Pin2, GPIO.OUT)
'''
Control motors A and B to rotate at full speed for 3 seconds
'''
motor_A(1, 100)
motor_B(1, 100)
time.sleep(3)
'''
Control motors A and B to rotate in the opposite direction at full speed for 3 seconds
'''
motor_A(-1, 100)
motor_B(-1, 100)
time.sleep(3)
'''
Stop the motors on ports A and B
'''
motorStop()
●The above code can be used to control the motor movement. The two functions motor_A and motor_B have the same structure; only the motor port they control differs.
12 Ultrasonic Module
●The camera used by our Raspberry Pi robot is monocular and cannot collect depth information. Therefore, many of our robot products use an ultrasonic ranging module to obtain depth information: it detects whether there is an obstacle in a certain direction and measures the distance to the obstacle.
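The ranging principle is just round-trip timing: with the speed of sound at roughly 340 m/s, distance = elapsed time × 340 / 2, halved because the sound wave travels out and back. A minimal sketch of the calculation:

```python
SPEED_OF_SOUND = 340.0  # m/s, approximate value at room temperature

def echo_to_distance(t_emit, t_return):
    """Convert the emit/return timestamps of an ultrasonic ping (in seconds)
    into a one-way distance in meters."""
    round_trip = t_return - t_emit
    return round_trip * SPEED_OF_SOUND / 2

# A 2 ms round trip corresponds to an obstacle about 0.34 m away.
print(echo_to_distance(0.000, 0.002))
```

The code below applies exactly this formula to the timestamps captured on the module's echo pin.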
GPIO.setmode(GPIO.BCM)
GPIO.setup(Tr, GPIO.OUT, initial=GPIO.LOW)
GPIO.setup(Ec, GPIO.IN)

def checkdist():
    GPIO.output(Tr, GPIO.HIGH)    # Set the input end of the module to high level to emit an initial sound wave
    time.sleep(0.000015)
    GPIO.output(Tr, GPIO.LOW)
    while not GPIO.input(Ec):     # Wait until the module has emitted the initial sound wave
        pass
    t1 = time.time()              # Note the time when the initial sound wave is emitted
    while GPIO.input(Ec):         # Wait while the module listens for the returning sound wave
        pass
    t2 = time.time()              # Note the time when the returning sound wave is received
    return (t2 - t1) * 340 / 2    # The sound wave travels out and back, so halve the total distance
13 Line Tracking
●Some of our robot products are equipped with a three-channel infrared line tracking module, which is used to implement the robot's line tracking function.
GPIO.setup(line_pin_middle, GPIO.IN)
GPIO.setup(line_pin_left, GPIO.IN)

def run():
    '''
    Read the values of the three infrared sensor phototransistors
    (0 means no line detected, 1 means line detected).
    This routine takes a black line on a white background as an example.
    '''
    status_right = GPIO.input(line_pin_right)
    status_middle = GPIO.input(line_pin_middle)
    status_left = GPIO.input(line_pin_left)
●When your project needs the line tracking function, you don't need to rewrite the above code: just copy findline.py and move.py from the server folder of the robot program into the same folder as your own project, then use the following code:
import findline

findline.setup()
while 1:
    findline.run()
●The reason you need move.py as well is that findline.py uses the methods in move.py to control the robot's movement.
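The decision logic inside `run()` boils down to steering toward whichever sensor still sees the line. A pure-Python sketch of that mapping (the action names here are illustrative, not the exact move.py calls):

```python
def line_decision(status_left, status_middle, status_right):
    """Map the three sensor readings (1 = line detected, for a black line
    on a white background) to a steering action."""
    if status_middle:
        return 'forward'        # line is centered under the robot
    elif status_left:
        return 'turn-left'      # line has drifted toward the left sensor
    elif status_right:
        return 'turn-right'     # line has drifted toward the right sensor
    else:
        return 'search'         # line lost: back up or scan for it

print(line_decision(0, 1, 0))   # -> forward
```

Calling such a function once per loop iteration, as `findline.run()` does, keeps the robot continuously correcting toward the line.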
14 Make A Police Light or Breathing Light
14.1 Multi-threading Introduction
●This chapter introduces the use of multi-threading to achieve some effects with the WS2812 LED lights. Multi-threading is a commonly used technique in robot projects: because robots have high requirements for real-time response, a long-running task should not block communication on the main thread.
●Multi-threading is similar to executing multiple different programs or tasks simultaneously.
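The pause/resume pattern used throughout the light classes below rests on `threading.Event`: the worker thread calls `wait()` and sleeps until another thread calls `set()`. A minimal stdlib-only sketch of the same skeleton:

```python
import threading
import time

class Worker(threading.Thread):
    """Toy version of the pause/resume skeleton used by the light classes."""
    def __init__(self):
        super().__init__(daemon=True)
        self.__flag = threading.Event()   # cleared -> run() blocks at wait()
        self.__flag.clear()
        self.ticks = 0

    def resume(self):
        self.__flag.set()     # let run() proceed

    def pause(self):
        self.__flag.clear()   # run() will block at the next wait()

    def run(self):
        while True:
            self.__flag.wait()    # suspended here while the flag is cleared
            self.ticks += 1
            time.sleep(0.01)

w = Worker()
w.start()          # the thread starts suspended, so ticks stays at 0
time.sleep(0.1)
before = w.ticks
w.resume()         # the thread begins counting
time.sleep(0.1)
w.pause()
print(before, w.ticks > before)
```

The RobotLight class in this chapter uses exactly this `__flag` mechanism, only with LED animations instead of a counter in `run()`.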
'''
Use the Threading module to create a thread: inherit directly from threading.Thread, then override the __init__ method and the run method.
'''
class RobotLight(threading.Thread):
    def __init__(self, *args, **kwargs):
        '''
        Initialize some settings for the LED lights here
        '''
        self.LED_COUNT = 16         # Number of LED pixels.
        self.LED_PIN = 12           # GPIO pin connected to the pixels (18 uses PWM!).
        self.LED_DMA = 10           # DMA channel to use for generating signal (try 10)
        self.LED_FREQ_HZ = 800000
        self.LED_BRIGHTNESS = 255
        self.LED_INVERT = False
        self.LED_CHANNEL = 0
        super(RobotLight, self).__init__(*args, **kwargs)
        self.__flag = threading.Event()
        self.__flag.clear()

    # Define functions which animate LEDs in various ways.
    def setColor(self, R, G, B):
        '''
        Set the color of all lights
        '''
        color = Color(int(R), int(G), int(B))
        for i in range(self.strip.numPixels()):
            self.strip.setPixelColor(i, color)
        self.strip.show()
        '''
        Call this function to turn on the police light mode
        '''
        self.lightMode = 'police'
        self.resume()

    def policeProcessing(self):
        '''
        The specific realization of the police light mode
        '''
        while self.lightMode == 'police':
            '''
            Blue flashes 3 times
            '''
            for i in range(0, 3):
                self.setSomeColor(0, 0, 255, [0,1,2,3,4,5,6,7,8,9,10,11])
                time.sleep(0.05)
                self.setSomeColor(0, 0, 0, [0,1,2,3,4,5,6,7,8,9,10,11])
                time.sleep(0.05)
            if self.lightMode != 'police':
                break
            time.sleep(0.1)
        self.colorBreathB = B_input
        self.resume()

    def breathProcessing(self):
        '''
        The specific realization of the breathing light mode
        '''
        while self.lightMode == 'breath':
            '''
            All lights gradually brighten
            '''
            for i in range(0, self.breathSteps):
                if self.lightMode != 'breath':
                    break
                self.setColor(self.colorBreathR*i/self.breathSteps,
                              self.colorBreathG*i/self.breathSteps,
                              self.colorBreathB*i/self.breathSteps)
                time.sleep(0.03)
            '''
            All lights gradually darken
            '''
            for i in range(0, self.breathSteps):
                if self.lightMode != 'breath':
                    break
                self.setColor(self.colorBreathR*(self.breathSteps-i)/self.breathSteps,
                              self.colorBreathG*(self.breathSteps-i)/self.breathSteps,
                              self.colorBreathB*(self.breathSteps-i)/self.breathSteps)
                time.sleep(0.03)
    def run(self):
        '''
        The function executed by the thread
        '''
        while 1:
            self.__flag.wait()
            self.lightChange()
            pass

if __name__ == '__main__':
    RL = RobotLight()   # Instantiate the object that controls the LED lights
    RL.start()          # Start the thread
    '''
    Start the breathing light mode and stop after 15 seconds
    '''
    RL.breath(70, 70, 255)
    time.sleep(15)
    RL.pause()
    '''
    Pause for 2 seconds
    '''
    time.sleep(2)
    '''
    Start the police light mode and stop after 15 seconds
    '''
    RL.police()
    time.sleep(15)
    RL.pause()
14.3 Warning Lights or Breathing Lights in Other Projects
RL = robotLight.RobotLight()   # Instantiate the object that controls the LED lights
RL.start()                     # Start the thread
'''
Start the breathing light mode and stop after 15 seconds
'''
RL.breath(70, 70, 255)
time.sleep(15)
RL.pause()
'''
Pause for 2 seconds
'''
time.sleep(2)
'''
Start the police light mode and stop after 15 seconds
'''
RL.police()
time.sleep(15)
RL.pause()
15 Real-Time Video Transmission
●Real-time video and the OpenCV functions are among the advantages of the Raspberry Pi robot.
●This chapter does not introduce the OpenCV part yet; it only introduces how to see the real-time picture from the Raspberry Pi camera on other devices.
●First, download the flask-video-streaming project to the Raspberry Pi. You can clone it from GitHub, or download it on your computer and then transfer it to the Raspberry Pi. The download command for the Raspberry Pi console is as follows:
sudo git clone https://github.com/miguelgrinberg/flask-video-streaming
'''
●Finally, uncomment the line that imports Camera from camera_pi. In the line
# from camera_pi import Camera
delete the leading # (note that there is a space after the #; delete it as well), so the changed code reads:
from camera_pi import Camera
●The following is the complete code of the modified app.py:
#!/usr/bin/env python
from importlib import import_module
import os
from flask import Flask, render_template, Response

# import camera driver
'''
if os.environ.
                    mimetype='multipart/x-mixed-replace; boundary=frame')

if __name__ == '__main__':
    app.run(host='0.0.0.0', threaded=True)
●After editing, press CTRL+X to exit the editor; when prompted whether to save the changes, press Y and then Enter.
●Then you can run app.py:
sudo python3 app.py
16 Automatic Obstacle Avoidance
●The ultrasonic module of this product can only move up and down together with the camera; it cannot move left and right relative to the body, only rotate with the body itself. The obstacle avoidance function of this robot is therefore relatively simple: as long as there is an obstacle ahead, turn left; retreat if the obstacle is too close; and move forward if the obstacle is far away or there is no obstacle.
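That policy can be sketched as a small pure function (the threshold values here are illustrative; the real ones live in the robot's server code):

```python
def avoid_decision(distance_m, too_close=0.25, range_keep=0.7):
    """Pick a motion command from the ultrasonic distance reading.

    distance_m : measured distance to the nearest obstacle, in meters.
    too_close  : back up below this distance (illustrative threshold).
    range_keep : turn away below this distance (illustrative threshold).
    """
    if distance_m < too_close:
        return 'backward'      # obstacle dangerously close: retreat first
    elif distance_m < range_keep:
        return 'turn-left'     # obstacle ahead: steer away from it
    else:
        return 'forward'       # path clear

print(avoid_decision(0.1), avoid_decision(0.5), avoid_decision(2.0))
```

The scanning code below feeds this kind of decision with the minimum distance found across several servo positions.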
# Initialize the servo
pwm = Adafruit_PCA9685.PCA9685()
pwm.set_pwm_freq(50)
# Change the scanning direction
if scanDir == 1:
    scanDir = -1
elif scanDir == -1:
    scanDir = 1
# Restore the scanned position
scanPos += scanDir * 2
print(scanList)
# If the distance of the nearest obstacle in front is less than the threshold
if min(scanList) < rangeKeep:
    # If the closest obstacle is on the left
    if scanList.index(min(scanList)) == 0:
        # Then turn right
        scGear.moveAngle(2, -30)
    # If the closest obstacle is straight ahead
    elif scanList.index(min(scanList)) == 1:
17 Why OpenCV Uses Multi-threading to Process Video Frames
●The OpenCV function is based on the GitHub project flask-video-streaming; we changed camera_opencv.py to perform the OpenCV-related operations.
17.1 Single-Thread Processing of Video Frames
●First, we introduce the process of single-thread processing of video frames. Let's start with a simple one, so that you will understand why OpenCV uses multiple threads to process video frames.
make it abnormally stuck.
17.2 Multi-thread Processing of Video Frames
●Next, the process of multi-thread processing of video frames is introduced.
●Process explanation: to improve the frame rate, we separate the analysis of each video frame from its acquisition and display, and place the analysis in a background thread that executes and generates the drawing information.
●The complete changed code of camera_opencv.py is as follows:
import threading
import imutils

class CVThread(threading.Thread):
    '''
    This class is used to process OpenCV analysis of video frames in a background thread.
    For the basic usage principles of this multi-threaded class, please refer to 14.2.
    '''
    def __init__(self, *args, **kwargs):
        self.CVThreading = 0
        super(CVThread, self).__init__(*args, **kwargs)
        self.__flag = threading.Event()
        self.__flag.clear()

    def mode(self, imgInput):
        '''
        This method passes in the video frame that needs to be processed
        '''
        self.imgCV = imgInput
        self.CVThreading = 0

    def resume(self):
        '''
        Resume the thread
        '''
        self.__flag.set()

    def run(self):
        '''
        Process video frames in the background thread
        '''
        while 1:
            self.__flag.wait()
            self.CVThreading = 1
            self.doOpenCV(self.imgCV)

class Camera(BaseCamera):
    video_source = 0

    def __init__(self):
        if os.environ.get('OPENCV_CAMERA_SOURCE'):
            Camera.set_video_source(int(os.environ['OPENCV_CAMERA_SOURCE']))
        super(Camera, self).__init__()

    @staticmethod
    def set_video_source(source):
        Camera.video_source = source
            img = camera.read()
            if cvt.CVThreading:
                '''
                If OpenCV is processing a video frame, skip
                '''
                pass
            else:
                '''
                If OpenCV is not processing a video frame, give the processing
                thread a new video frame and resume the processing thread
                '''
                cvt.mode(img)
                cvt.resume()
            '''
            Draw elements on the screen
            '''
            img = cvt.elementDraw(img)
            # encode as a jpeg image and return it
            yield cv2.imencode('.jpg', img)[1].tobytes()
●The above is the code principle of using multi-threading to process OpenCV.
18 Learn to Use OpenCV
●The real-time video transmission function comes from the GitHub open source project flask-video-streaming, released under the MIT open source license.
●First, prepare two .py files in the same folder on the Raspberry Pi. The code is as follows:
·app.py
#!/usr/bin/env python3
from importlib import import_module
import os
from flask import Flask, render_template, Response
from camera_opencv import Camera

app = Flask(__name__)

def gen(camera):
    while True:
        frame = camera.get_frame()
    from thread import get_ident
except ImportError:
    from _thread import get_ident

class CameraEvent(object):
    """An Event-like class that signals all active clients when a new frame is
    available.
    """
    def __init__(self):
        self.events = {}

    def wait(self):
        """Invoked from each client's thread to wait for the next frame."""
        ident = get_ident()
        if ident not in self.events:
            # this is a new client
            # add an entry for it in the self.events dict
            # each entry has two elements, a threading.Event() and a timestamp
            self.events[ident] = [threading.Event(), time.time()]
95 """Invoked from each client's thread after a frame was processed.""" self.events[get_ident()][0].clear() class BaseCamera(object): thread = None # background thread that reads frames from camera frame = None # current frame is stored here by background thread last_access = 0 # time of last client access to the camera event = CameraEvent() def __init__(self): """Start the background camera thread if it isn't running yet.""" if BaseCamera.thread is None: BaseCamera.last_access = time.
            for frame in frames_iterator:
                BaseCamera.frame = frame
                BaseCamera.event.set()   # send signal to clients
                time.sleep(0)
                # if there hasn't been any clients asking for frames in
                # the last 10 seconds then stop the thread
                if time.time() - BaseCamera.last_access > 10:
                    frames_iterator.close()
                    print('Stopping camera thread due to inactivity.')
                    break
            BaseCamera.thread = None
●When you use a follow-up tutorial to develop an OpenCV-related function, you only need to put the corresponding camera_opencv.py into this same folder.
19 Using OpenCV to Realize Color Recognition and Tracking
19.1 Color Recognition and Color Space
●For the development preparation and operation of the OpenCV functions, please refer to 18.
●Create camera_opencv.py in the folder that contains app.py and base_camera.py from 18. The code related to the OpenCV color tracking function introduced in this chapter is written in camera_opencv.py.
●For safety reasons, this routine does not control the motor or servo motion; it only outputs the OpenCV calculation results.
general process is as follows.
19.3 Specific Code
●camera_opencv.py:
'''
Set target color, HSV color space
'''
colorUpper = np.array([44, 255, 255])
colorLower = np.array([24, 100, 100])

font = cv2.FONT_HERSHEY_SIMPLEX

class Camera(BaseCamera):
    video_source = 0

    def __init__(self):
        if os.environ.get('OPENCV_CAMERA_SOURCE'):
            Camera.set_video_source(int(os.environ['OPENCV_CAMERA_SOURCE']))
        super(Camera, self).__init__()

    @staticmethod
    def set_video_source(source):
        Camera.video_source = source

    @staticmethod
    def frames():
        camera = cv2.VideoCapture(Camera.video_source)
'''
c = max(cnts, key=cv2.contourArea)
((box_x, box_y), radius) = cv2.minEnclosingCircle(c)
M = cv2.moments(c)
center = (int(M["m10"] / M["m00"]), int(M["m01"] / M["m00"]))
X = int(box_x)
Y = int(box_y)
'''
Get the center point coordinates of the target color object and output them
'''
print('Target color object detected')
print('X:%d'%X)
print('Y:%d'%Y)
print('-------')
'''
Write text on the screen: Target Detected
'''
cv2.putText(img, 'Target Detected', (40, 60), font, 0.5, (255, 255, 255), 1, cv2.LINE_AA)
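●The centroid derived from the image moments above, (m10/m00, m01/m00), can be checked by hand. A minimal pure-Python sketch of the same arithmetic (no OpenCV; the helper name is ours):

```python
def centroid(mask):
    # image moments of a binary mask:
    # m00 = number of set pixels, m10 = sum of their x, m01 = sum of their y
    m00 = m10 = m01 = 0
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if v:
                m00 += 1
                m10 += x
                m01 += y
    # same ratios as (M["m10"] / M["m00"], M["m01"] / M["m00"])
    return m10 // m00, m01 // m00

# a 3x3 block of "target color" pixels whose center is at (x=2, y=1)
mask = [
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
]
cx, cy = centroid(mask)
```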
19.4 HSV Color Component Range in OpenCV

HSV\color  Black  Grey  White  Red     Orange  Yellow  Green  Cyan  Blue  Purple
H_min      0      0     0      0|156   11      26      35     78    100   125
H_max      180    180   180    10|180  25      34      77     99    124   155
S_min      0      0     0      43      43      43      43     43    43    43
S_max      255    43    30     255     255     255     255    255   255   255
V_min      0      46    221    46      46      46      46     46    46    46
V_max      46     220   255    255     255     255     255    255   255   255
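●OpenCV scales HSV to H∈[0,180) and S,V∈[0,255], which is why the table tops out at 180 and 255. A quick way to check where an RGB color falls in the table, using only the standard library (the helper name is ours):

```python
import colorsys

def to_opencv_hsv(r, g, b):
    # colorsys returns h, s, v in [0, 1];
    # OpenCV uses H in [0, 180) and S, V in [0, 255]
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return int(h * 180), int(s * 255), int(v * 255)

# pure yellow (255, 255, 0) should land inside the table's yellow band, H 26-34
h, s, v = to_opencv_hsv(255, 255, 0)
```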
20 Machine Line Tracking Based on OpenCV
20.1 Visual Line Inspection Process
●For the development preparation and operation of the OpenCV functions, please refer to Chapter 18.
●Create camera_opencv.py in the same folder as the app.py and base_camera.py from Chapter 18. The code related to the OpenCV visual line tracking function introduced in this chapter is written in camera_opencv.py.
●For safety reasons, this routine does not control the motor or servo motion; it only outputs the OpenCV calculation results.
20.2 Specific Code
import os
import cv2
from base_camera import BaseCamera
import numpy as np
import time
import threading
import imutils
'''
Set the color of the line: 255 for a white line, 0 for a black line
'''
lineColorSet = 255
'''
Set the position of the horizontal reference line; the larger the value, the lower the line. It must not exceed the vertical resolution of the video (default 480)
'''
linePos = 380

class Camera(BaseCamera):
    video_source = 0

    def __init__(self):
        if os.environ.get('OPENCV_CAMERA_SOURCE'):
'''
Convert the picture to grayscale, then binarize it (every pixel value in the picture becomes either 0 or 255)
'''
img = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
retval, img = cv2.threshold(img, 0, 255, cv2.THRESH_OTSU)
img = cv2.erode(img, None, iterations=6)  # use erosion to remove noise
colorPos = img[linePos]  # get the array of pixel values in row linePos
try:
    lineColorCount_Pos = np.sum(colorPos == lineColorSet)  # number of line-colored pixels (the line width)
    lineIndex_Pos = np.where(colorPos == lineColorSet)  # indices of the line-colored pixels
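●The row-scanning step above reduces to simple arithmetic: find the pixels in row linePos that match lineColorSet, then take the midpoint between the leftmost and rightmost match. A pure-Python sketch of the same logic (no numpy; the function name is ours):

```python
def line_center(row, line_color=255):
    # indices of the pixels in this row that match the line color
    idx = [i for i, p in enumerate(row) if p == line_color]
    if not idx:
        return None, 0                 # no line found in this row
    center = (idx[0] + idx[-1]) // 2   # midpoint of the line
    return center, len(idx)            # len(idx) is the line width

# a 40-pixel-wide white line starting at x=100 in a 240-pixel-wide row
row = [0] * 100 + [255] * 40 + [0] * 100
center, width = line_center(row)
```

The robot steers by comparing this center value with the horizontal center of the image.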
21 Create A WiFi Hotspot on The Raspberry Pi
●The method our robot products use to turn on a WiFi hotspot is based on the GitHub project create_ap. Our installation script automatically installs this program and its dependent libraries. If you have not run our installation script, you can use the following command to download create_ap:
sudo git clone https://github.com/oblique/create_ap
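●According to the create_ap project's own README, the remaining installation steps are typically as follows (check the upstream README for your system before running them):

```shell
git clone https://github.com/oblique/create_ap.git
cd create_ap
sudo make install
```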
22 Install GUI Dependencies under Windows
●Our older robot programs all provide a desktop GUI program to control the robot. The GUI program is written in Python; this method has a higher barrier to entry and is not recommended for novices.
●This GUI program is currently only suitable for the Windows system. It is included in the client directory of the robot software package and is generally called GUI.py. 22.
23 How to Use the GUI
●Because the web interface and the GUI are separate, if you want to use the GUI to control the robot, you need to manually run server.py. (The method is the same as manually running webserver.py, except that the object is changed to server.py.)
●When server.py runs successfully on the Raspberry Pi, enter the IP address of the Raspberry Pi in the GUI control terminal on the PC, then click Connect to control the robot.
the standard program. If you are interested in this, you can try to expand it further. We will offer the installation and application methods of other functions in follow-up tutorials. Please subscribe to our YouTube channel for more.
●Change LED Color: You can control the colors of the LEDs on the robot in real time by dragging these three sliders. The three sliders correspond to the brightness of the three channels of RGB.
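●Internally, the rpi_ws281x library packs the three slider values into a single 24-bit color value, with red in the high byte. A pure-Python sketch of that packing (the helper name is ours):

```python
def pack_color(red, green, blue):
    # pack three 8-bit channels into the 24-bit value the WS281X LEDs use:
    # red in bits 16-23, green in bits 8-15, blue in bits 0-7
    return (red << 16) | (green << 8) | blue

orange = pack_color(255, 128, 0)
```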
import socket
'''
Some settings related to the LED lights come from the WS281X example
Source code: https://github.com/rpi-ws281x/rpi-ws281x-python/
'''
LED_COUNT = 24
LED_PIN = 18
LED_FREQ_HZ = 800000
LED_DMA = 10
LED_BRIGHTNESS = 255
LED_INVERT = False
LED_CHANNEL = 0
'''
Process arguments
'''
parser = argparse.ArgumentParser()
parser.add_argument('-c', '--clear', action='store_true', help='clear the display on exit')
args = parser.parse_args()
'''
Create NeoPixel object with appropriate configuration.
'''
tcpSerSock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
tcpSerSock.bind(ADDR)
tcpSerSock.listen(5)
'''
Start listening for a client connection; after a client connects successfully, start receiving the information it sends
'''
tcpCliSock, addr = tcpSerSock.accept()
while True:
    data = ''
    '''
    Receive information from the client
    '''
    data = str(tcpCliSock.recv(BUFSIZ).decode())
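●The accept/recv pattern above can be exercised end to end on one machine. A self-contained sketch (loopback only, with hypothetical names, and an OS-chosen port rather than the robot's fixed one):

```python
import socket
import threading

# bind and listen in the main thread first, so the client below
# cannot try to connect before the server is ready
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
srv.bind(('127.0.0.1', 0))      # port 0: let the OS pick a free port
srv.listen(5)
port = srv.getsockname()[1]

received = []

def serve_once():
    # same pattern as the robot server: accept one client, read one command
    cli, addr = srv.accept()
    received.append(cli.recv(1024).decode())
    cli.close()

t = threading.Thread(target=serve_once)
t.start()

# play the role of the GUI client: connect and send one command
cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
cli.connect(('127.0.0.1', port))
cli.send('on'.encode())
cli.close()
t.join()
srv.close()
```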
'''
Python uses Tkinter to quickly create GUI applications; instantiate them while importing
'''
import tkinter as tk

def lights_on():
    '''
    Call this method to send the light-on command 'on'
    '''
    tcpClicSock.send(('on').encode())

def lights_off():
    '''
    Call this method to send the light-off command 'off'
    '''
    tcpClicSock.send(('off').encode())

'''
Enter the IP address of the Raspberry Pi here
'''
SERVER_IP = '192.168.3.11'
'''
Use Tkinter's Button method to define a button. The button is on the root window; its name is 'ON', its text color is #E1F5FE, and its background color is #0277BD. Clicking the button calls the lights_on() function
'''
btn_on = tk.Button(root, width=8, text='ON', fg='#E1F5FE', bg='#0277BD', command=lights_on)
'''
Choose a location to place this button
'''
btn_on.place(x=15, y=15)
'''
The same method defines another button; the difference is that the text on it is changed to 'OFF'.
25 Real-time Video Transmission Based on OpenCV
●This chapter introduces real-time video transmission: it can transmit the images collected by the camera to another machine in real time, either to display them or to hand them to a host computer for machine-vision processing.
●The software in this tutorial is based on the opencv, numpy, zmq (read as ZeroMQ) and base64 libraries. Before writing the code, you need to install the third-party libraries (base64 is part of the Python standard library and needs no installation).
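●Assuming pip is available on both machines, the third-party libraries can usually be installed like this (package names as published on PyPI):

```shell
pip3 install opencv-python numpy pyzmq
```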
'''
IP = '192.168.3.11'
'''
Then initialize the camera; you can change these parameters according to your needs
'''
camera = picamera.PiCamera()
camera.resolution = (640, 480)
camera.framerate = 20  # NOTE: this value was truncated in the original document; 20 fps is a typical setting
'''
Clear the stream in preparation for the next frame
'''
rawCapture.truncate(0)
●In the following, we explain the program on the receiving end. Since the libraries used here are cross-platform, PC.py can be run on a Windows computer or another Linux computer.
●PC.py:
'''
Display image
'''
cv2.imshow("Stream", source)
'''
Generally, waitKey() should be used after imshow() to leave time for image drawing; otherwise the window will appear unresponsive and the image cannot be displayed
'''
cv2.waitKey(1)
●To run the programs, first run RPiCam.py on the Raspberry Pi, then run PC.py on the PC; you can then see the real-time picture from the Raspberry Pi on the PC.
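●The transmission pipeline hinges on base64: the sender turns the JPEG bytes into an ASCII-safe string, and the receiver reverses this before handing the bytes to cv2.imdecode. The round trip can be verified with the standard library alone:

```python
import base64

frame_bytes = bytes(range(16))            # stand-in for one JPEG-encoded frame
encoded = base64.b64encode(frame_bytes)   # what RPiCam.py sends over the zmq socket
decoded = base64.b64decode(encoded)       # what PC.py recovers before decoding the image
```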
footage_socket = context.socket(zmq.PAIR)
footage_socket.bind('tcp://*:5555')
while True:
    '''
    Receive video frame data
    '''
    frame = footage_socket.recv_string()
    '''
    Decode it and save it to the cache
    '''
    img = base64.b64decode(frame)
    '''
    Interpret the buffer as a 1-dimensional array
    '''
    npimg = np.frombuffer(img, dtype=np.uint8)
    '''
    Decode the one-dimensional array into an image
    '''
    source = cv2.imdecode(npimg, 1)
    '''
    Display image
    '''
    cv2.imshow("Stream", source)
import numpy as np
'''
Here we instantiate the zmq object used to receive the frames.
Note that the port number must be consistent with the sender's
'''
context = zmq.Context()
footage_socket = context.socket(zmq.PAIR)
footage_socket.bind('tcp://*:5555')
while True:
    '''
    Receive video frame data
    '''
    frame = footage_socket.recv_string()
    '''
    Decode it and save it to the cache
    '''
    img = base64.b64decode(frame)
    '''
    Interpret the buffer as a 1-dimensional array
    '''
    npimg = np.frombuffer(img, dtype=np.uint8)
'''
source = cv2.erode(source, None, iterations=6)
'''
Display image
'''
cv2.imshow("Stream", source)
'''
Generally, waitKey() should be used after imshow() to leave time for image drawing; otherwise the window will appear unresponsive and the image cannot be displayed
'''
cv2.waitKey(1)
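●cv2.erode shrinks white regions so that isolated noise pixels disappear: with a 3x3 kernel, a pixel survives only if its whole neighbourhood is white. A pure-Python sketch of one erosion pass (the function name is ours; cv2.erode handles borders and kernel shapes more generally):

```python
def erode(mask):
    # one 3x3 erosion pass: a pixel stays set only if all 9 pixels
    # in its neighbourhood are set (border pixels are cleared)
    h, w = len(mask), len(mask[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if all(mask[y + dy][x + dx] for dy in (-1, 0, 1) for dx in (-1, 0, 1)):
                out[y][x] = 1
    return out

noisy = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [1, 1, 1, 1, 0],   # the pixel at (x=0, y=2) is stray noise
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
clean = erode(noisy)
```

After one pass only the pixel at the center of the solid block survives; the stray noise pixel is gone.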
27 Enable UART
●UART is a commonly used communication protocol between devices. Using UART, you can let MCUs such as Arduino, STM32 or ESP32 communicate with the Raspberry Pi, which can make your robot more powerful.
●However, on some Raspberry Pi models the UART that is enabled by default is not the full-featured UART, so you need to refer to the following steps to enable the full-featured UART.
for other purposes requires this default behaviour to be changed. On startup, systemd checks the Linux kernel command line for any console entries and will use the console defined there. To stop this behaviour, the serial console setting needs to be removed from the command line.
●This can be done with the raspi-config utility, or manually.
sudo raspi-config
●Select option 5, Interfacing options, then option P6, Serial, and select No. Exit raspi-config.
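●The full-featured PL011 UART can also be claimed for GPIO 14/15 by editing /boot/config.txt by hand. A typical fragment, assuming a model whose PL011 is normally assigned to Bluetooth (e.g. Pi 3, Pi 4, Zero W); reboot afterwards:

```
enable_uart=1          # enable the serial port hardware
dtoverlay=disable-bt   # hand the full PL011 UART back to GPIO 14/15
```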
27.5 Relevant Differences Between PL011 and Mini UART
●The mini UART has smaller FIFOs. Combined with its lack of flow control, this makes it more prone to losing characters at higher baud rates. It is also generally less capable than the PL011, mainly because its baud rate is linked to the VPU clock speed.
●It should be noted that the port number is 5000 when using the web application, 10223 when using the GUI program, and 10123 when using the mobile app.
●The controller on the left can move the robot forward, backward, left and right, and the controller on the right controls the robot's other movements. You can change the specific operations by editing appserver.py.
Conclusion
Through the above operations on the Adeept AWR, you should have learned how to use Python programming on the Raspberry Pi to control the robot, and also how to assemble it. If you have any questions about this product, please contact us via email or forum, and we will reply to your questions within one working day:
support@adeept.com
https://www.adeept.com/forum/
If you want to try our other products, you can visit our website: www.adeept.com