2D to 3D Mechanical Design

By: Vanessa Enriquez (Design & Manufacturing Engineer for Goliath)

Approved by: Lucas Gutierrez (Project Manager for ModWheels)

Table of Contents

Introduction

After PDR, I was temporarily reassigned to the ModWheels project to begin recreating the existing 2D design as a 3D design in SolidWorks.

 

Preliminary design

To begin, the current design was provided to me by the project manager, Lucas. Figure 1 includes the instructions and features that will be present in the first design of the 3D model. The 2D .dxf file was imported into SolidWorks and separated into four parts.

Figure 1 – 2D design includes top and bottom panels, wheel, and latch.    

2D to 3D

The cut-out instructions noted in Figure 1 were implemented in SolidWorks. All parts were extruded to 3.429 millimeters.

Figure 2 – SolidWorks assembly for the current design, including top and bottom panels, wheel, and latch.

Conclusion

The files and designs were then sent over to the new Design & Manufacturing Engineer for ModWheels, Natalie, to look over and finalize before laser cutting. Natalie will be in charge of any future design changes and assembly of the model.

Rules Of The Maze (Robot Avoidance Rules and Strategy)

By: Matt Shellhammer (Electronics & Control Engineer)

Approved by: Lucas Gutierrez (Project Manager)

Table of Contents

Introduction

When traveling through the maze, robots may encounter other robots. To ensure the robots will be able to successfully navigate the maze and complete the mission objectives, we need general rules of the maze. These rules should be followed by all projects so that every robot navigates the maze by the same conventions.

Possible encounter cases:

2.1.  Face-to-Face in hallway (North/South & East/West)

2.2.  Face-to-Back in hallway (North/South & East/West)

2.3.  T – Intersection

 

Rules for robots that occupy more than one room

Some projects have robots that are larger than a single room, which, if not addressed, can cause issues with the rules of the maze. For projects whose robots take up more than one room, the excess robot space should extend into the previous room. This ensures that a robot will not be seen as within an intersection before it has actually stepped into the intersection. A robot that takes up more than one room will therefore occupy its current room and part of the previous room, which should not cause any problems for the rules of the maze. If the robot has just stepped through the intersection and is still partially within it, it will be considered clear once it steps into the next room.

Rules for detection

To detect robots within intersections and in face-to-face or face-to-back encounters, detection should be continuous, since robots will not move in synchronization and a robot can step into a new room at any time. Each robot should be able to detect a robot in the next room ahead, since all decisions will be made within a one-room range.
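As an illustration, a continuous polling loop with an HC-SR04-style ultrasonic sensor might look like the sketch below; the pin assignments and one-room distance are placeholder values for illustration, not project specifications.

```cpp
// Continuous detection sketch (illustrative only): poll an HC-SR04-style
// ultrasonic sensor every loop and flag when another robot is within one
// room ahead. Pin numbers and ROOM_LENGTH_CM are placeholder values.
const int TRIG_PIN = 7;
const int ECHO_PIN = 8;
const float ROOM_LENGTH_CM = 30.0;    // assumed room size, not a spec

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  Serial.begin(9600);
}

float readDistanceCm() {
  // Send a 10 us trigger pulse and convert the echo time to centimeters.
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  unsigned long duration = pulseIn(ECHO_PIN, HIGH, 30000UL);  // 30 ms timeout
  if (duration == 0) return -1.0;      // no echo within range
  return duration * 0.0343 / 2.0;      // speed of sound, round trip
}

void loop() {
  float d = readDistanceCm();
  bool robotAhead = (d > 0 && d <= ROOM_LENGTH_CM);
  if (robotAhead) {
    // Encounter handling (pause, yield, reroute) would be triggered here.
    Serial.println("Robot detected within one room");
  }
  delay(50);                           // keep polling continuously (~20 Hz)
}
```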

Rules for encounters 2.1 (Face-to-Face) & 2.2 (Face-to-Back)

For encounters 2.1 & 2.2 the rules will be defined as follows:

A robot in a hallway (vertical/horizontal) that encounters another robot, either face-to-face (case 2.1) or face-to-back (case 2.2), will immediately pause movement for three seconds. If this robot still detects the other robot after these three seconds, the robots must be in case 2.1. If it no longer detects the other robot after these three seconds, the robots must be in case 2.2.

If the robots are in case 2.1 and they are in a vertical hallway (north/south), the robot facing north has the highest priority. If the robots are in case 2.1 and they are in a horizontal hallway (east/west), the robot facing east has the highest priority. In case 2.1, robots facing south or west have the lowest priority and are therefore left with the burden of clearing the path. The lower-priority robot should turn around and retrace its path to its last decision point (last intersection) and wait there for the higher-priority robot to pass. If the higher-priority robot tries to go down the hallway where the lower-priority robot is waiting, the robots are now in case 2.3 or 2.4. In this case the robot within the intersection now has the lower priority and follows the rules for T-intersections explained below.

If the robots are in case 2.2, they will continue along their desired path.
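The decision logic for these two cases might look like the following sketch; the heading enum and helper functions are placeholders for project-specific sensing and motor code, not an agreed-upon implementation.

```cpp
// Sketch of the case 2.1 / 2.2 decision described above: pause three seconds,
// re-check the sensor, then apply the north/east priority rule. The helper
// functions are stubs standing in for project-specific sensing and driving code.
enum Heading { NORTH, SOUTH, EAST, WEST };

bool robotAhead()                { return false; }  // stub: robot detected one room ahead?
void continueOnPath()            { }                // stub: resume the planned route
void retraceToLastIntersection() { }                // stub: back up to the last decision point
void waitForRobotToPass()        { }                // stub: hold position until the path is clear

void handleHallwayEncounter(Heading myHeading) {
  delay(3000);                      // pause movement for three seconds
  if (!robotAhead()) {
    continueOnPath();               // case 2.2 (face-to-back): other robot moved on
    return;
  }
  // Case 2.1 (face-to-face): north beats south, east beats west.
  bool hasPriority = (myHeading == NORTH || myHeading == EAST);
  if (hasPriority) {
    continueOnPath();               // higher priority: the other robot must yield
  } else {
    retraceToLastIntersection();    // lower priority: clear the hallway
    waitForRobotToPass();           // wait at the last intersection
    continueOnPath();
  }
}
```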

Rules for encounters 2.3 (T – Intersection)

For encounter 2.3 the rules will be defined as follows:

When a robot comes to an intersection, it will attempt to step into the intersection and either turn or continue straight along its path. If a robot steps into an intersection and sees another robot as it tries to make its maneuver, the robots are in case 2.3. In this case the robot within the intersection (in the middle of the T-intersection) has the lowest priority and must move out of the way of the other robots (if it is in their path). The lowest-priority robot will step down the hallway that is not blocked and then wait for the other robot to pass. If another robot tries to step down the hallway where the waiting robot is, that robot now has the lower priority and must move for the previously waiting robot. A robot waiting at an intersection for another robot to pass should time out after waiting ten seconds. After timing out, the robot should make a second attempt to continue along its path. This timeout covers the situation where a robot in an intersection detects another robot in its path that is not actually traveling through the intersection (e.g., it is waiting to enter another intersection).
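A sketch of the ten-second timeout described above, with the maneuver and sensing functions stubbed out, might look like this:

```cpp
// Sketch of the case 2.3 ten-second timeout described above. The robot in the
// middle of the T yields, waits, and retries after ten seconds.
// robotBlocking(), stepDownClearHallway(), and attemptManeuver() are
// hypothetical helpers standing in for project-specific movement code.
const unsigned long INTERSECTION_TIMEOUT_MS = 10000UL;

bool robotBlocking()        { return false; }   // stub: sensor check
void stepDownClearHallway() { }                 // stub: move out of the way
bool attemptManeuver()      { return true; }    // stub: try the planned turn/straight

void handleTIntersection() {
  if (!attemptManeuver()) {
    // Another robot is in the way: we are the low-priority robot.
    stepDownClearHallway();
    unsigned long start = millis();
    while (robotBlocking()) {
      if (millis() - start >= INTERSECTION_TIMEOUT_MS) {
        break;                      // timed out: the other robot is not passing
      }
      delay(100);                   // keep polling while waiting
    }
    attemptManeuver();              // second attempt to continue along the path
  }
}
```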

 

Recovery strategy

If a lower-priority robot is required to clear the path of a higher-priority robot, there must be a recovery strategy for that lower-priority robot. The recovery strategy for case 2.1 should have the robot travel only forwards and backwards along its designated path, so that the steps taken to clear the path can easily be reversed. The robot can then step either forward or backward along the recorded path, and no modification to the robot's path is required. For cases 2.3 and 2.4, the robots should move off the designated path only when required to get out of the way of other robots, and then continue back on the designated path.
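One way to implement this forward/backward recovery, assuming the path is stored as a simple list of moves, is sketched below; the Move type and motor stubs are illustrative only.

```cpp
// Sketch of the case 2.1 recovery idea: the path is recorded once and the
// robot moves an index backward or forward along it, so the recorded path
// itself is never modified. The Move type and motor stubs are illustrative only.
enum Move { FORWARD, TURN_LEFT, TURN_RIGHT };

const int MAX_MOVES = 64;
Move path[MAX_MOVES];
int pathLength = 0;   // moves recorded so far
int pathIndex  = 0;   // how far along the path the robot currently is

void driveForwardOneRoom()  { }   // stubs: project motor routines
void driveBackwardOneRoom() { }
void turnLeft()  { }
void turnRight() { }

void recordMove(Move m) {
  if (pathLength < MAX_MOVES) path[pathLength++] = m;
}

// Step one move backward along the recorded path (used to clear the hallway).
void stepBack() {
  if (pathIndex == 0) return;
  Move last = path[--pathIndex];
  if (last == FORWARD)        driveBackwardOneRoom();
  else if (last == TURN_LEFT) turnRight();   // undo a left turn
  else                        turnLeft();    // undo a right turn
}

// Step one move forward again once the higher-priority robot has passed.
void stepForwardAgain() {
  if (pathIndex >= pathLength) return;
  Move next = path[pathIndex++];
  if (next == FORWARD)        driveForwardOneRoom();
  else if (next == TURN_LEFT) turnLeft();
  else                        turnRight();
}
```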

 

Special Cases to consider

Three robots at T – Intersection (special case of 2.3)

Rules for three robots within a T – Intersection

If a robot is thought to be in case 2.3, then attempts to move down the other hallway and again encounters another robot, there must be three robots within this intersection. I don't know yet how we should deal with this case. What if another robot comes up behind this robot? It is now boxed in. Additionally, the other robots do not know that there are three robots (four if you include the additional robot) at the intersection.

Other things to consider

Detection of robots multiple rooms away, so that when there is an intersection with three robots, there is an open space for the robot within the intersection to move into, allowing the other robots to move by.

Color Sensor Trade Study

By: Matt Shellhammer (Electronics & Control Engineer)

Approved By: Lucas Gutierrez (Project Manager)

 

Table of Contents

Introduction

For the Fall 2017 semester of EE 400D, all of the projects at The Robot Company will be using color sensors to navigate a colored 2D maze, most commonly the TCS34725 color sensor [1]. This color sensor will be mounted onto each robot in a manner decided by each project. The goal of this trade study is to give the projects an idea of where exactly on their robots the color sensor should be mounted relative to the colored 2D maze. The sensor should be placed far enough from the line that it is not constantly sensing it, but close enough that it quickly senses the line when the robot veers off the desired route.

Methodology

In this trade study I used two devices: the Arduino Uno and the TCS34725 color sensor. I connected the color sensor to the Arduino Uno in the configuration shown in the table below (Table 1: Interface Matrix). The values were read from the color sensor over I2C, implemented using the Adafruit Arduino I2C communication library, which is available on the Adafruit website [2].

 

Table 1: Interface Matrix

Arduino Uno pins | TCS34725 color sensor pins
Vcc (3.3 V)      | VIN
GND              | GND
SDA (pin 18)     | SDA
SCL (pin 19)     | SCL
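
As a reference, a minimal sketch that reads the raw RGBC values over I2C using the Adafruit library [2] and the wiring in Table 1 is shown below; the integration time and gain are example settings, not necessarily those used in this study.

```cpp
// Minimal sketch: read raw red/green/blue/clear values from the TCS34725
// over I2C using the Adafruit library cited in the source material. The
// integration time and gain are example settings, not those used in the study.
#include <Wire.h>
#include "Adafruit_TCS34725.h"

Adafruit_TCS34725 tcs = Adafruit_TCS34725(TCS34725_INTEGRATIONTIME_50MS,
                                          TCS34725_GAIN_4X);

void setup() {
  Serial.begin(9600);
  if (!tcs.begin()) {                 // begin() starts I2C on SDA/SCL (Table 1)
    Serial.println("No TCS34725 found, check wiring");
    while (1);                        // halt if the sensor is not detected
  }
}

void loop() {
  uint16_t r, g, b, c;
  tcs.getRawData(&r, &g, &b, &c);     // raw 16-bit channel readings
  Serial.print("R: ");  Serial.print(r);
  Serial.print("  G: "); Serial.print(g);
  Serial.print("  B: "); Serial.print(b);
  Serial.print("  C: "); Serial.println(c);
  delay(100);
}
```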

 

The color sensor was then set up next to a paper with measurement tick marks to measure the vertical distance from the color sensor (vertical test), and later reconfigured to measure the distance from either side of the color sensor (horizontal test), as shown in the figures below.

 

Figure 1: Color sensor trade study configuration for vertical distance test.

 

Figure 2: Color sensor trade study configuration for horizontal distance test.

 

Next, I tested four different colors of vinyl tape (electrical tape) strips. To determine the optimal vertical distance from the color sensor, I looked for the distance at which I measured a peak reading and called that the optimal distance for that color. The color strips used are shown in the figure below, with the results of the vertical test following (Table 2: Optimal vertical distance for color strips).

Figure 3: Color strips used in trade study. From left to right: Black, Red, Green, and Blue.

 

Table 2: Optimal vertical distance for color strips

Color-strip color | Optimal vertical distance
Black             | 2 mm
Red               | 3 mm
Green             | 2.5 mm
Blue              | 2 mm

 

After this test, another test was performed to determine the optimal horizontal distance for the color sensor. This optimal distance is more subjective, since the line should be far enough from the sensor that it is not always being detected, but close enough to keep the robot's reaction time to the line short. For this test, I measured two distances for each color: one at the point where the colored strip was first detected, and one at the point where the detected signal spiked significantly. The origin (zero) was set at the middle of the color sensor. I performed the test from both the left and the right side of the color sensor, as defined in the figure below. The horizontal test results follow (Table 3: Optimal horizontal distance for color strips).

 

Figure 4: Test setup to show which side would be the right side of the color sensor vs which is the left.

 

Table 3: Optimal horizontal distance for color strips

Color-strip color | Detected distance (Left) | Detected distance (Right) | Spike distance (Left) | Spike distance (Right)
Red               | 9 mm                     | 9 mm                      | 2 mm                  | 2.5 mm
Green             | 1.5 mm                   | 4 mm                      | 1 mm                  | 1 mm
Blue              | 4 mm                     | 10 mm                     | 1 mm                  | 2.5 mm
Black             | 3 mm                     | 3 mm                      | 0 mm                  | 0 mm

 

It can be noted that the range of detection varies from color to color, and from left to right, due to the location of the color detector on the color sensor. In most cases, the detection range of the color strip extended a small but significant distance farther on the right side than on the left.

 

Conclusion

The first result that can be inferred from this trade study is that the vertical distance from the color sensor to the detected color is optimal within a range of 2–3 mm. The second result is that a horizontal distance of 2.5–3 mm from the detected color (zero being the center of the color sensor) gives consistent readings with no significant unintentional readings. However, if the goal is to ensure no color detection while driving along a path, a horizontal range of 9–10 mm may be preferable. Testing with the project-specific robot and software will also be an important factor when deciding the layout and placement of the color sensors. This trade study is to be used as a supplement to that testing, to give the engineers a good starting point when designing the color sensor layout.

Source material

  1. https://www.adafruit.com/product/1334
  2. https://github.com/adafruit/Adafruit_TCS34725

ModWheels Preliminary Design Document

By: Lucas Gutierrez (Project Manager)

 

ModWheels Team Members:

Project Manager: Lucas Gutierrez

Mission, Systems, and Test Engineer: Andrew Yi

Electronics and Control Engineer: Matt Shellhammer

Design and Manufacturing Engineer: Adan Rodriguez

 

 

Table of Contents

Program Objectives / Mission Profile

By: Lucas Gutierrez (Project Manager)

Objective

ModWheels is a toy car that will navigate a multi-colored 2D maze using the Arxterra App for remote control. The initial phase consists of having the toy car navigate the maze with user input. ModWheels will then memorize the route taken during the initial phase and will be able to navigate the maze autonomously in the second phase. Another rule of the maze will be to detect other robots and avoid collisions. These are the objectives stated by the customer.

The ModWheels toy car is a new project within The Robot Company. The modular design comes from the changeable paper overlay, allowing the user to swap in their preferred design. Color sensors will be used to detect the walls of the maze, keeping the toy car within the confines of the maze hallways. To avoid collisions, ultrasonic sensors will be used to detect other robots within the maze. Infrared sensors will detect the black lines in the maze that indicate intersections.

Requirements

By: Andrew Yi (Mission, Systems, and Test Engineer)

Program Level 1 Requirements

L1-1: ModWheels shall be completed by Wednesday, December 13th, 2017.

L1-2: ModWheels will be a toy robot.

L1-3: ModWheels shall cost no more than $200.

Project Level 1 Requirements

L1-4: ModWheels will use a 3DoT board.

L1-5: ModWheels shall use a peripheral custom PCB connected to 3DoT board.

L1-6: ModWheels will be able to be controlled through the ArxRobot App or Arxterra Control Panel.

L1-7: ModWheels shall navigate a multi-colored 2D maze.

L1-8: ModWheels shall stop when another robot has been detected within a 1.5 foot radius ahead.

L1-9: ModWheels should be able to avoid collisions with other robots operating within the maze.  

L1-10: ModWheels shall provide video feedback through a smartphone placed on the toy car.

L1-11: ModWheels shall weigh no more than 500 grams.

L1-12: ModWheels shall be able to memorize a path through the maze taught by the user.

L1-13: ModWheels should be able to travel down the memorized path autonomously.

L1-14: ModWheels should be able to adopt an electronic differential with dual rear motors.

L1-15: ModWheels should be able to adopt a slip differential with dual rear motors.

System & Subsystem Level 2 Requirements

L2-1: ModWheels will have a 3DoT board mounted on the chassis of the ModWheels toy car, as defined by L1-4.

L2-2: ModWheels shall use 2 color sensors to detect the walls within the maze so that it can keep itself within the confines of the hallways, as defined by L1-7.

L2-3: ModWheels shall use the ultrasonic sensors to detect other objects 1.5 feet in front of the toy car, as defined by L1-8.

L2-4: ModWheels will be controllable through Arxterra App using the HM-11 Bluetooth module on the 3DoT board.  The Arxterra App has a graphical user interface (GUI) that allows control of the toy robot, as defined by L1-6.

L2-5: ModWheels should have an area for a smartphone to be placed onto it.  The phone should have a periscope attachment on its camera and will provide live feed video via the Arxterra App, as defined by L1-10.

L2-6: ModWheels shall navigate a maze autonomously after it has cleared the maze with user input.  The autonomous route shall follow the original route without user input, as defined by L1-12.

L2-7: ModWheels will be a remote controllable toy car with a paper shell overlay. The paper shell overlay gives the ModWheels its customizability, as defined by L1-2.

L2-8: ModWheels should use the ultrasonic sensor to reroute themselves if another robot is approaching it head on.  When oncoming traffic is detected, ModWheels should direct itself closer towards a wall to allow the oncoming robot to pass, as defined by L1-9.

L2-9: ModWheels shall use a custom PCB to control the ultrasonic, infrared, and color sensors.  This PCB shall be connected to the 3DoT board aboard the chassis, as defined by L1-5.

L2-10: ModWheels shall use 2 infrared (IR) sensors to detect the black lines in the maze that indicate intersections, as defined by L1-7.

L2-11: ModWheels shall stop when another robot is detected to be 1.5 feet in front of the toy car, as defined by L1-8.

L2-12: ModWheels shall cease all motor functions when another robot is detected 1.5 feet in front of it.  It shall resume normal operations after the robot has left the detection area, as defined by L1-8.
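As an illustration of requirements L2-11 and L2-12, a minimal stop/resume sketch might look like the following; the distance reading and all helper functions are placeholders, not the project's actual implementation.

```cpp
// Illustrative sketch of L2-11/L2-12: stop all motor functions when another
// robot is detected within 1.5 feet (about 46 cm) ahead, and resume once the
// detection area is clear. readDistanceCm() and the motor helpers are stubs.
const float STOP_DISTANCE_CM = 45.7;       // 1.5 feet

float readDistanceCm() { return 100.0; }   // stub: ultrasonic measurement
void stopMotors()      { }                 // stub: cease all motor functions
void resumeMotors()    { }                 // stub: resume normal operation

bool stopped = false;

void setup() { }

void loop() {
  float d = readDistanceCm();
  bool robotAhead = (d > 0 && d <= STOP_DISTANCE_CM);
  if (robotAhead && !stopped) {
    stopMotors();                          // robot entered the detection area
    stopped = true;
  } else if (!robotAhead && stopped) {
    resumeMotors();                        // robot has left the detection area
    stopped = false;
  }
  delay(50);
}
```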

Source Material:

  1. ModWheels Requirements

 

Design Innovation

By: Matt Shellhammer (Electronics and Control Engineer)

Creative Design

  • Rules of the maze.
    • Navigating the maze with other robots traveling through it at the same time along many different paths has proved to be a very elaborate ConOp. One possible solution to this problem, discussed during the forced relationship and dunker diagram activities, is creating a specific set of rules of the maze that will be followed by all robots in the maze to ensure that no robots collide. To establish the rules of the maze, our project manager is speaking with the other project managers to agree upon a standardized set of rules for the maze.
  • Able to download additional maze maps quickly.
    • This solution came about from the forced relationships technique, trying to force a bookcase onto the ModWheels car. It brings up the task of remotely driving through the maze using the Arxterra app and then repeating that path autonomously. This way, after the maze in ModWheels' memory is changed, the car can quickly and easily memorize and follow new mazes and routes.
  • Sense other robots within the maze to avoid collisions
    • Yet again, during the dunker diagram activity, one of our big concerns was navigating the maze while other robots are also within it. Another technique for avoiding other robots within the maze was to add a sensor, such as an ultrasonic sensor, to detect and avoid obstacles within the maze. We will be using an ultrasonic sensor on ModWheels to achieve this awareness.
  • Making turns within the maze
    • In a maze, making accurate turns is of great importance, since a poor turn can cause the car to veer off the desired path. Two possible approaches to this problem were discussed. The first was to design a car with wheels that can rotate 360 degrees, allowing the car to never have to turn; the wheels would rotate while the car stays in place. Since the ModWheels chassis is relatively fixed, this approach isn't really viable; however, we discussed another, more feasible approach: using encoders. Both of these approaches were a result of the attribute listing creative-solutions activity.
  • Navigating within the maze
    • To navigate throughout the maze, there must be certain sensory devices on the vehicle that allow it to be aware of its surroundings. The solution developed to help ModWheels be aware of its surroundings is to use IR and/or color sensors to help detect the rooms of the maze (see the sketch after this list). These sensors could also later be used as an additional tool to gather information when making a turn within the maze.
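As an example of the room/line detection idea above, a minimal sketch using two IR reflectance sensors to flag an intersection line is shown below; the pin numbers and threshold are assumed values for illustration only.

```cpp
// Minimal sketch of intersection (black line) detection with two IR
// reflectance sensors. Pin numbers and BLACK_THRESHOLD are placeholder
// values for illustration, not measured or specified values.
const int LEFT_IR_PIN  = A0;
const int RIGHT_IR_PIN = A1;
const int BLACK_THRESHOLD = 700;   // assumed analog reading over a black line

void setup() {
  Serial.begin(9600);
}

void loop() {
  int leftReading  = analogRead(LEFT_IR_PIN);
  int rightReading = analogRead(RIGHT_IR_PIN);

  // Both sensors over black suggests the robot has reached an intersection.
  bool atIntersection = (leftReading  > BLACK_THRESHOLD) &&
                        (rightReading > BLACK_THRESHOLD);
  if (atIntersection) {
    Serial.println("Intersection line detected");
    // Turn/straight decision logic would run here.
  }
  delay(50);
}
```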

Source Material:

  1. ModWheels Creativity Presentation

 

Systems/Subsystem Design

By: Andrew Yi (Mission, Systems, and Test Engineer), Matt Shellhammer (Electronics and Control Engineer), and Lucas Gutierrez (Project Manager)

Product Breakdown Structure

By: Andrew Yi (Mission, Systems, and Test Engineer)

The ModWheels car is broken into six sections for the Product Breakdown Structure (PBS).  The visuals consist of a smartphone with a periscope attachment feeding live video back to the user.  The motors, wheels, and servo are the main parts of the mobility of the toy car.  Bluetooth and the Arxterra app allow for mobile control of the toy car.  The color and ultrasonic sensors will aid in detecting walls (color) and robots (ultrasonic).  The paper overlay and 3D-printed plastic allow for customization of the ModWheels toy car.  A single 3.6 V RCR123A battery will power the ModWheels car and its peripherals.

 

Electronic System Design

System Block Diagram

By: Andrew Yi (Mission, Systems, and Test Engineer)

 

The System Block Diagram (SBD) gives a visual of how the different components of the system are connected.  The motor driver on the 3DoT board will power the dual rear-wheel-drive motors, which in turn drive the rear wheels.  The color sensors on the custom PCB assist in wall detection within the maze, while the ultrasonic sensor detects other robots within the maze.  A periscope will be attached to the camera of a smartphone placed on the ModWheels car, giving live video feedback to the user.  The servo turns the front wheels and controls the steering of the toy car.

 

Interface Definition

By: Matt Shellhammer (Electronics and Control Engineer)

To determine how all of the external components will interface with the 3DoT board, a table has been created listing the connections required for each individual component. This table can then be used to configure the interface matrix between the 3DoT board and the external components.

Mechanical Design

By: Lucas Gutierrez (Project Manager)

This Illustrator file shows the first and foundational model from which ModWheels will be developed.  The two-dimensional vector file is converted to a .dxf file, which is readable by Laserworks, the software interface for the Kaitian laser cutting and engraving machine.  Once the correct settings in Laserworks are enabled (the optimal settings for a given material), the laser cutting and engraving machine cuts and engraves the material (1/8″ 3-ply birch) from which the car can be assembled.

Design and Unique Task Description

By: Matt Shellhammer (Electronics and Control Engineer)

Electronics and Control

  • Trade study of various ultrasonic sensors to determine the best fit for our application.
  • Trade study to determine whether two motors tuned to drive straight at 40 percent power will continue on that same path when the power to the motors is increased (e.g., 50 percent, 60 percent, 70 percent, etc.).
  • Trade study to test maze configurations (1 to 4 cells) using the 3DoT board, and determine the optimal hedge distance and ideal sensor layout.
  • Perform a power system analysis to lay out a power budget for our system (determine current draw for all of the components).
  • Determine configuration to be used for I2C communications.
  • Create Fritzing diagram for custom PCB.
  • Customize Arxterra app / dashboard for specific ModWheels applications.
  • Breadboard testing of turning mechanism & sensors.
  • Determine how sensors will be used for navigating the maze (e.g. line follower, wall follower, room detection).
  • Develop software for: room detection, path memorization, autonomous navigation of the maze, etc.

 

Manufacturing

  • Create a SolidWorks model of the ModWheels chassis from the current 2D model of the ModWheels chassis.
  • Design a method for mounting the sandblaster front-wheel chassis and servo onto the current ModWheels chassis.
  • Design a method for mounting the PCB and sensors onto the current ModWheels chassis.
  • Design a cell-phone mount for the ModWheels chassis.

 

Preliminary Plan

By: Lucas Gutierrez (Project Manager)

ModWheels Team Members:

Project Manager: Lucas Gutierrez

Mission, Systems, and Test Engineer: Andrew Yi

Electronics and Control Engineer: Matt Shellhammer

  • Fritzing Diagram
  • Electronic Motor Study
  • Printed Circuit Board (PCB) Design & Test
  • 3DoT Firmware

Design and Manufacturing Engineer: Adan Rodriguez

  • Eagle CAD for PCB layout
  • SolidWorks for ModWheels 3D model
  • Manufacture Chassis, 3D printed parts, and laser cut parts
  • Assemble ModWheels

Work Breakdown Structure

By:

Project Schedule

By: Lucas Gutierrez (Project Manager)

Top Level Schedule

This layout provides the due dates for the ModWheels project.

System/Subsystem Level Tasks

This layout provides the due dates and engineer assignments for the ModWheels project.

Burndown Schedule

This layout provides the burndown for the ModWheels project.

Systems Resource Report

By: Andrew Yi (Mission, Systems, and Test Engineer)

Power Allocation Report

The power report is an overview of the power requirements of each physical component.  The MST team is currently working with Professor Hill to finish the power chart for the 3DoT board.  Once that task is finished, testing can begin on the components to gather test data.

Mass Allocation Report

This report shows the mass of all the physical components used in the toy robot.  There is quite a large amount of contingency because the rules of the maze have not been defined yet.  Once those are defined, the correct number of sensors and components can be chosen.

Project Cost Estimate

By: Lucas Gutierrez (Project Manager)

The cost report shows the current cost of components against the budget set by the customer.  A larger pool of money was allocated toward the PCB and its components because of the uncertainty about which sensors will be required (maze rules).  Money has also been set aside to cover the costs of laser cutting.