Description of the Task

The goal of this project is to develop a LEGO mobile robot that picks and places orange cups.

Software Development

This section discusses the development of the software and the choice of the associated platform. We first discuss the platform on which the software is implemented, and then the code.

Option 1: EV3 Controller standalone

Weight: 230 g

The EV3 brick is the default controller that comes with the LEGO Mindstorms kit. It has four input and four output ports through which it receives sensor data and drives actuators. It has a high-resolution 178x128 pixel display enabling detailed graph viewing and sensor data observation, and it can communicate with a computer in one of three ways: through a tethered USB connection, or wirelessly via an external Wi-Fi dongle or Bluetooth.

Programming Option 1: EV3 Lab

EV3 Lab is a full-fledged desktop application that uses a drag-and-drop programming interface. It supports data logging, analysis, and documentation.

Programming Option 2: EV3 MicroPython

Programming the EV3 with Python involves using the Visual Studio Code IDE on a Windows 10 or macOS system with the EV3 MicroPython extension installed.
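
For a flavour of this environment, a minimal EV3 MicroPython program is sketched below. It is an illustrative example only (this option was not used in the project) and assumes a motor connected to port A:

	#!/usr/bin/env pybricks-micropython
	from pybricks.hubs import EV3Brick
	from pybricks.ev3devices import Motor
	from pybricks.parameters import Port
	from pybricks.tools import wait

	ev3 = EV3Brick()
	motor = Motor(Port.A)       # assumes a motor on port A
	ev3.speaker.beep()          # beep to signal that the program has started
	motor.run_target(500, 360)  # rotate one full revolution at 500 deg/s
	wait(1000)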

An independent study showed that programming using the Python environment was 4.4 times faster than using EV3 Lab [1].

Option 2: Ch Mindstorms Controller (CMC)

Weight: (EV3 + RPi) = 230 + 50 = 280 g

The C-STEM Studio software, developed by the C-STEM Center at UC Davis, allows the LEGO Mindstorms EV3 (and NXT) to be controlled from a Raspberry Pi. The C-STEM Studio platform can be installed on top of an existing Raspbian installation by downloading the software modules, or by installing C-STEMbian, a free open-source Linux operating system for the Raspberry Pi that provides a user-friendly environment for computing, robotics, and cyber-physical systems. Connection to the robot takes place over Bluetooth, allowing real-time control of the LEGO motors and monitoring of sensor data.

Programming: C-STEM Studio contains the Ch Mindstorms package, which can be used to control the LEGO EV3 (or NXT) via ready-made sample applications with their source code. The EV3 (or NXT) can be controlled through the GUI or through programs written in the ChIDE.

Option 3: EVB with BeagleBone

The EVB is acknowledged here for completeness; however, after a successful Kickstarter campaign [2], the EVB appears to be no longer on sale. Although it looked promising, it is unclear whether the EVB ever made it to the mainstream market.

Option 4: BrickPi with Raspberry Pi

Weight: (RPi + BrickPi) = 50 + 50 = 100 g

The BrickPi is a board that slides over the GPIO headers of the Raspberry Pi; it connects, controls, and powers Mindstorms motors and sensors, and provides power to the Raspberry Pi [3]. The firmware for the board's own microcontroller is written in Arduino, so it is open and hackable; the hardware designs and software source code are all available online. The BrickPi3 communicates with the Raspberry Pi using SPI (Serial Peripheral Interface, a synchronous serial communication interface specification used for short-distance communication), with the Pi as the master and the BrickPi as the slave (Fig. 16). The bus can run at up to 500 kbps [4].
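
To illustrate what this master/slave exchange looks like from the Pi's side, the sketch below performs a raw transfer with the spidev library. The bus and chip-select numbers follow what the brickpi3 Python library uses, and the payload bytes are placeholders rather than an actual BrickPi3 protocol message:

	import spidev

	# open the SPI bus as master (bus 0, chip-select 1)
	spi = spidev.SpiDev()
	spi.open(0, 1)
	spi.max_speed_hz = 500000  # the bus can run at up to 500 kbps

	# a full-duplex transfer: the master clocks bytes out and reads the
	# slave's reply in the same transaction (the payload is a placeholder)
	reply = spi.xfer2([0x01, 0x00, 0x00, 0x00])
	print(reply)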

Programming: The BrickPi is programmed using libraries developed for one of the following languages and environments: Python, Scratch, Java, Node.js, and ev3dev [5].
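
In practice the SPI protocol is wrapped by these libraries; a minimal sketch with the official brickpi3 Python library (the port choice is illustrative) looks as follows:

	import time
	import brickpi3

	BP = brickpi3.BrickPi3()  # connect to the BrickPi3 over SPI

	try:
		# spin a motor on port A at half power for one second
		BP.set_motor_power(BP.PORT_A, 50)
		time.sleep(1)
	finally:
		BP.reset_all()  # stop all motors and reset the board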

Selection of platform

As the task at hand was to detect orange cups and balls, and also to differentiate between them so as to effect different end-effector grasping mechanisms, computer vision techniques were the natural approach. The BrickPi3 and Raspberry Pi option (option 4) was selected for the following reasons:

  • Support: The first versions of the BrickPi were released over five years ago, and the platform has garnered a lot of support online; plenty of resources are available besides the already sufficient official sample code provided by Dexter Industries. Furthermore, the Raspberry Pi is recognized as one of the most popular single-board computers, with plenty of online support. This was considered crucial, as the plan was to work with computer vision, a relatively unfamiliar domain for the software lead. This factor ruled out option 3, which lacked support and was not considered a mature enough solution.

  • Performance: The Raspberry Pi is more powerful than the BeagleBone, with a quad-core processor running at 1.4 GHz, which is useful for resource-intensive computer vision applications. A comparison of the relevant specifications is presented in Table 1.

  • Size and weight: The BrickPi slides over the Raspberry Pi and has a smaller footprint than any of the other options. Moreover, at a meagre 100 grams (excluding case and batteries), it weighed the least of all the options. This was expected to come in handy while designing the robot.

  • Camera interface: Although cameras exist for the EV3, no useful computer vision (CV) algorithms were expected to run on it, owing to the EV3's low computational capability and the lack of CV library support.

  • Libraries and programming language: OpenCV was selected as the computer vision library due to its popularity, and Python was the programming language of choice for the software lead. This ruled out option 2, as Python is not supported there. Although ad-hoc mechanisms could have been built to write parts of the program in different languages and make them interoperable (through service-oriented architectures, socket programming, or standard protocols like MQTT), this was of no interest to the software lead, as an easy route was available with option 4.

Table 1: Comparison of relevant specifications.

Aspect | EV3 | BeagleBone Black | Raspberry Pi 3B+
Processor | ARM9 @ 300 MHz | ARM Cortex-A8 @ 1 GHz | Broadcom BCM2837B0, quad-core Cortex-A53 (ARMv8) 64-bit @ 1.4 GHz
Memory | 16 MB Flash, 64 MB RAM | 512 MB RAM | 1 GB LPDDR2 SDRAM
Camera support | Vision Subsystem [8]; Pixy2 for LEGO [9] | via USB or a Camera Cape | Raspberry Pi Camera Module via the Camera Serial Interface (CSI)

Software, Algorithm and Code

Operating System: Raspbian Buster (kernel 4.19)
Computer Vision Library: OpenCV (Python) 4.1.1
Programming Language: Python 3.7
Platform: Raspberry Pi 3B+

A Raspberry Pi can fall short in performance when it comes to running computer vision algorithms. But, as we will soon see, with realistic expectations that acknowledge these constraints, a satisfactory solution to the task at hand can be implemented. The resource-constrained specifications of the RPi narrowed the search to less resource-intensive algorithms. Since the object to be detected is orange, and nothing else the robot expects to have in its field of view is orange, color thresholding was used.

Algorithm: Color Thresholding in Image Processing

Here we set ourselves two goals:

  1. Detect the presence of an orange object (ball or cup)
  2. Detect the position of the orange object in the video frame in near real time.

The gist of the image processing technique is as follows. We define lower and upper thresholds for the orange color in the HSV color space. For every frame of the video feed from the camera, we use OpenCV's cv2.inRange() function to perform color thresholding, yielding a mask in which all pixels that fall within the thresholds are white and the rest are black. We then draw a circular contour around the detected orange pixels to determine their center, which in turn determines the heading of the robot: the center acts as the variable driving a proportional controller on the wheels, which tries to keep the center of the detected orange object in the center of the frame.

Code: Image Processing

The code is discussed in detail below with the help of comments.
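
The function assumes the following module-level setup; the HSV bounds shown here are illustrative values for orange, not the exact thresholds used:

	import cv2
	import imutils
	from imutils.video import VideoStream

	# illustrative lower and upper HSV bounds for the orange color
	orangeLower = (5, 100, 100)
	orangeUpper = (25, 255, 255)

	# start the Raspberry Pi camera video stream
	vs = VideoStream(usePiCamera=True).start()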

	def image_process_centroid():
		'''
		Function that processes frames from the Raspberry Pi camera
		and returns the centre of the detected orange object (cup or ball).
		Args:
		None
		Returns:
		center (tuple): the (x, y) centre of the detected orange
		object, or None if nothing orange is found
		'''
		# Read frame from the PiCamera
		frame = vs.read()
		if frame is None:
			print("No frames detected")
			return None

		# resize the frame, blur it, and convert it to the HSV
		# color space
		frame = imutils.resize(frame, width=600)
		blurred = cv2.GaussianBlur(frame, (11, 11), 0)
		hsv = cv2.cvtColor(blurred, cv2.COLOR_BGR2HSV)

		# construct a mask for the orange color, then perform
		# a series of erosions and dilations to remove any small
		# blobs left in the mask
		mask = cv2.inRange(hsv, orangeLower, orangeUpper) # Fig. 17
		mask = cv2.erode(mask, None, iterations=2)
		mask = cv2.dilate(mask, None, iterations=2)

		# find contours in the mask and initialize the current
		# (x, y) center of the orange object
		cnts = cv2.findContours(mask.copy(), cv2.RETR_EXTERNAL,
			cv2.CHAIN_APPROX_SIMPLE)
		cnts = imutils.grab_contours(cnts) # Fig. 19
		center = None

		# only proceed if at least one contour was found
		if len(cnts) > 0:
			# find the largest contour in the mask (taken to be the
			# closest orange object), then use it to compute the
			# minimum enclosing circle and its centroid, which later
			# determines the heading of the robot
			c = max(cnts, key=cv2.contourArea) # Fig. 19
			((x, y), radius) = cv2.minEnclosingCircle(c)
			M = cv2.moments(c)
			center = (int(M["m10"] / M["m00"]), int(M["m01"] / M["m00"])) # Fig. 19
			# discard detections whose radius is below a minimum size
			if radius < 10:
				center = None
		return center

Code: The Robot Program
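
The loop below additionally assumes that a Robot instance has been created on top of the module-level setup shown with the Robot class further below. A hypothetical instantiation (the actual port wiring is not recorded here) might be:

	# hypothetical port assignment for illustration
	robot = Robot(left_wheel_port=BP.PORT_B,
		right_wheel_port=BP.PORT_C,
		end_effector_motor=BP.PORT_A)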

	# infinite loop
	while True:
		# get the center of the orange object in the current frame
		center = image_process_centroid()

		# if no orange object is in the robot's field of view, keep turning
		if center is None:
			print("No orange in sight, looking . .")
			robot.look_for_orange("left")

		# if an orange object is in the robot's field of view
		else:
			# unpack the centroid; the y-coordinate serves as a proxy for
			# distance, since a nearby object sits lower in the frame
			center, distance = center
			print("ORANGE DETECTED at", center, ".. aligning")

			# align the robot with the centroid to determine its heading
			# (the resized frame is 600 pixels wide)
			if center > 350:
				robot.look_for_orange("right")
			elif center < 250:
				robot.look_for_orange("left")
			elif center >= 250 and center <= 350:
				# once aligned, move towards the orange object
				robot.move_forward_P(0.5, center)

				# if the robot is close enough to the orange object,
				# initiate the grasping sequence
				if distance > 400:
					robot.stop()
					robot.pick("cup")
					time.sleep(3)
					robot.celebrate()
					robot.stop()
					break
			time.sleep(0.01)

Code: The Robot Class
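
The class relies on module-level setup that is not reproduced in the report. A minimal sketch, assuming an HC-SR04-style ultrasonic sensor on hypothetical GPIO pins 23 and 24 and the gyro on sensor port 1, is:

	import time
	import brickpi3
	import RPi.GPIO as GPIO

	BP = brickpi3.BrickPi3()  # shared BrickPi3 connection

	# hypothetical GPIO pins for the ultrasonic sensor's trigger and echo
	TRIG, ECHO = 23, 24
	GPIO.setmode(GPIO.BCM)
	GPIO.setup(TRIG, GPIO.OUT)
	GPIO.setup(ECHO, GPIO.IN)

	# register the gyro so that BP.get_sensor() can read it
	BP.set_sensor_type(BP.PORT_1, BP.SENSOR_TYPE.EV3_GYRO_ABS)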

	class Robot:
		"""
		A class used to represent a Robot
		...
		Attributes
		----------
		left_wheel_port : BP Class Motor Port (PORT_A, PORT_B, PORT_C, or PORT_D)
			the motor port the left wheel is connected to
		right_wheel_port : BP Class Motor Port (PORT_A, PORT_B, PORT_C, or PORT_D)
			the motor port the right wheel is connected to
		end_effector_motor : BP Class Motor Port (PORT_A, PORT_B, PORT_C, or PORT_D)
			the motor port the end effector motor is connected to
		default_left_wheel_motor_power : int
			the default motor power of the left wheel
		default_right_wheel_motor_power : int
			the default motor power of the right wheel

		Methods
		-------
		move_forward_P(Kp, centre)
			drives the robot forward using a P controller on the wheel motors
		move_backward()
			drives the robot backward
		turn_right()
			turns the robot towards the right
		turn_left()
			turns the robot towards the left
		stop()
			stops the motors
		look_for_orange(direction)
			turns the robot in the direction specified
		getSensorReading(sensor)
			gets the reading of the sensor specified
		pick(what)
			effects the pick action for the object passed
		place(what)
			effects the place action for the object passed
		celebrate()
			effects the celebration action sequence
		"""

		def __init__(self, left_wheel_port, right_wheel_port, end_effector_motor):
			self.left_wheel_port = left_wheel_port
			self.right_wheel_port = right_wheel_port
			self.end_effector_motor = end_effector_motor
			self.default_left_wheel_motor_power = 200
			self.default_right_wheel_motor_power = 200
		
		def move_forward_P(self, Kp, centre):
			# Kp is the proportionality constant of the P controller
			offset_X = 300 # the centre of the 600-pixel-wide field of view
			Tp = 200 # default power of the motors
			# how far the centre of the orange object is from the centre
			# of the field of view
			error = centre - offset_X
			# proportionally scaling the error
			Turn = Kp * error
			# converting the values to comparable motor powers
			self.left_wheel_motor_power = -(Tp + Turn)
			self.right_wheel_motor_power = -(Tp - Turn)
			print("Left wheel Motor Power: ", self.left_wheel_motor_power)
			print("Right wheel Motor Power: ", self.right_wheel_motor_power)
			# setting the motor powers
			BP.set_motor_power(self.left_wheel_port, self.left_wheel_motor_power)
			BP.set_motor_power(self.right_wheel_port, self.right_wheel_motor_power)

		def move_backward(self):
			BP.set_motor_power(self.left_wheel_port, 200)
			BP.set_motor_power(self.right_wheel_port, 200)

		def stop(self):
			print("Stopping")
			BP.set_motor_power(self.left_wheel_port, 0)
			BP.set_motor_power(self.right_wheel_port, 0)

		def look_for_orange(self, direction):
			# turn in place in the given direction to scan for an
			# orange object
			if direction == "left":
				BP.set_motor_power(self.left_wheel_port, 20)
				BP.set_motor_power(self.right_wheel_port, -20)
			elif direction == "right":
				BP.set_motor_power(self.left_wheel_port, -20)
				BP.set_motor_power(self.right_wheel_port, 20)
		
		def getSensorReading(self, sensor):
			if sensor == "ultrasonic":
				try:
					# send a short trigger pulse to the ultrasonic sensor
					GPIO.output(TRIG, True)
					time.sleep(0.0001)
					GPIO.output(TRIG, False)

					# time the echo pulse, whose width is proportional
					# to the distance of the obstacle
					start = end = time.time()
					while GPIO.input(ECHO) == False:
						start = time.time()
					while GPIO.input(ECHO) == True:
						end = time.time()
					sig_time = end - start

					# sound travels roughly 58 microseconds per cm
					# (round trip), so this yields the distance in cm
					distance = sig_time / 0.000058
					return distance

				except Exception:
					print("error reading from distance sensor")

			elif sensor == "gyro":
				try:
					gyroReading = BP.get_sensor(BP.PORT_1)
					return gyroReading

				except brickpi3.SensorError as error:
					print("error reading gyro sensor", error)


		def pick(self, what):
			if what == "cup":
				print("Picking Cup")
				# creep for 1.2 s to position the robot relative to the cup
				BP.set_motor_power(self.left_wheel_port, 20)
				BP.set_motor_power(self.right_wheel_port, 20)
				time.sleep(1.2)
				self.stop()
				# close the end effector around the cup
				BP.set_motor_position_relative(self.end_effector_motor, 450)

		def place(self, what):
			if what == "cup":
				print("Placing cup")
				# open the end effector to release the cup
				BP.set_motor_position_relative(self.end_effector_motor, -450)
		
		def celebrate(self):
			print("Celebrate")
			# spin in place one way for five seconds, then the other way
			BP.set_motor_power(self.left_wheel_port, 200)
			BP.set_motor_power(self.right_wheel_port, -200)
			time.sleep(5)
			BP.set_motor_power(self.left_wheel_port, -200)
			BP.set_motor_power(self.right_wheel_port, 200)
			# wave the end effector open and closed while spinning
			BP.set_motor_position_relative(self.end_effector_motor, -450)
			time.sleep(1.5)
			for _ in range(3):
				BP.set_motor_position_relative(self.end_effector_motor, 450)
				time.sleep(0.5)
				BP.set_motor_position_relative(self.end_effector_motor, -450)
				time.sleep(0.5)
			BP.set_motor_position_relative(self.end_effector_motor, 450)
			time.sleep(3)


References

[1] “EV3-G vs ev3dev Python - Comparison and Benchmark - YouTube.” [Online]. Available: https://www.youtube.com/watch?v=U08R13phHIw. [Accessed: 13-Mar-2020].

[2] “EVB: Replace the brain of your LEGO® EV3 with BeagleBone by FATCATLAB — Kickstarter.” [Online]. Available: https://www.kickstarter.com/projects/fatcatlab/evb-replace-the-brain-of-your-lego-ev3-with-beagle. [Accessed: 13-Mar-2020].

[3] “BrickPi - Dexter Industries.” [Online]. Available: https://www.dexterindustries.com/brickpi/. [Accessed: 13-Mar-2020].

[4] “BrickPi3 Communication Protocol - Dexter Industries.” [Online]. Available: https://www.dexterindustries.com/BrickPi/brickpi3-technical-design-details/brickpi3-communication-protocol/. [Accessed: 13-Mar-2020].

[5] “BrickPi3 Tutorials and Documentation: Get Started with the BrickPi3 and the Raspberry Pi.” [Online]. Available: https://www.dexterindustries.com/brickpi3-tutorials-documentation/. [Accessed: 13-Mar-2020].

[6] “BrickPi3 Technical Details - Behavior, Design, and Electrical Schematic of the BrickPi3.” [Online]. Available: https://www.dexterindustries.com/BrickPi/brickpi3-technical-design-details/. [Accessed: 13-Mar-2020].

[7] “BrickPi3 Tutorials and Documentation: Get Started with the BrickPi3 and the Raspberry Pi.” [Online]. Available: https://www.dexterindustries.com/brickpi3-tutorials-documentation/. [Accessed: 13-Mar-2020].

[8] “Vision Subsystem v5 for NXT or EV3 (with fixed lens) - mindsensors.com.” [Online]. Available: http://www.mindsensors.com/vision-for-robots/191-vision-subsystem-v5-for-nxt-or-ev3-with-fixed-lens. [Accessed: 14-Mar-2020].

[9] “Pixy2 for LEGO Mindstorms EV3.” [Online]. Available: https://docs.pixycam.com/wiki/doku.php?id=wiki:v2:lego_wiki. [Accessed: 14-Mar-2020].