Commit 7ce946e7 — "Update README.md", authored Apr 11, 2018 by siyunhe2 (project siyunhe2/ECE470, parent 460f4407)
File: FinalProject/Check_Point_5/README.md
## Checkpoint 4 (due 4/2/18)
### Demonstrate Collision Detection for UR3
First, make sure the Remote API file and the related Python files are in the same folder as Checkpoint4.py.
You should also load the appropriate scene in V-REP. The code does not ask for any user input.
We predefined some of the theta values, which can be looked up in the code. Based on these values,
the collision checker is run. You can check out the code in the Collision Detection section.
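The collision checker builds each joint's pose from matrix exponentials of bracketed screw axes via a `screwBracForm` helper. The exact implementation lives in the project code; the sketch below is our assumption of what it looks like, following the standard se(3) bracket convention for a screw axis S = [w, v]:

```python
import numpy as np

def screwBracForm(S):
    # Bracket (matrix) form of a 6-vector screw axis S = [w, v]:
    # a 4x4 se(3) element with the skew-symmetric matrix of the
    # angular part w in the top-left block and the linear part v
    # in the last column; the bottom row is all zeros.
    w, v = S[:3], S[3:]
    brac = np.zeros((4, 4))
    brac[:3, :3] = np.array([
        [0.0,  -w[2],  w[1]],
        [w[2],  0.0,  -w[0]],
        [-w[1], w[0],  0.0]
    ])
    brac[:3, 3] = v
    return brac
```

Exponentiating `screwBracForm(S) * theta` (e.g. with `scipy.linalg.expm`) then yields the homogeneous transform for that joint.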
## Checkpoint 5 (due 4/9/18)
### Demonstrate Motion/Path Planning for UR3
In this checkpoint, the user is asked to input the desired position of the UR3.
Before the robot moves, the collision checker checks whether a collision will occur and prints the
appropriate message to the screen. Then it moves the joints. During the first "iteration",
V-REP is not running the simulation, so the joints are allowed to move through the obstacles.
During the second "iteration", V-REP is in simulation mode, so the joints are no longer
able to move through the obstacles. The same joint configuration is used for both iterations.
First, make sure the Remote API file and the related Python files are in the same folder as Checkpoint5.py.
You should also load the appropriate scene in V-REP.
In the scene, there are two blocks which we treat as obstacles. In order to detect the obstacles, we placed
bounding spheres on the two blocks.
The goal is to reach the desired pose given by the user input.
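The bounding-sphere test itself is simple: two spheres intersect exactly when the distance between their centers is at most the sum of their radii. A minimal standalone sketch (function name is ours, not from the project code):

```python
import numpy as np

def spheres_collide(c1, r1, c2, r2):
    # Two bounding spheres intersect exactly when the distance
    # between their centers is at most the sum of their radii.
    c1 = np.asarray(c1, dtype=float)
    c2 = np.asarray(c2, dtype=float)
    return np.linalg.norm(c1 - c2) <= r1 + r2
```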
### Collision Detection
The implementation of collision detection:
~~~~
import numpy as np
from numpy.linalg import norm
from scipy.linalg import expm

def collisionChecker(S, initial_points, points, r, theta):
    # Forward kinematics: propagate each joint point through the
    # product of exponentials of the bracketed screw axes.
    for i in range(len(theta)):
        p_initial = np.array([
            [initial_points[0][i]],
            [initial_points[1][i]],
            [initial_points[2][i]],
            [1]
        ])
        temp = np.identity(4)
        for j in range(i+1):
            temp = np.dot(temp, expm(screwBracForm(S[j]) * theta[j]))
        p_final = np.dot(temp, p_initial)
        points[0].append(p_final[0][0])
        points[1].append(p_final[1][0])
        points[2].append(p_final[2][0])
    p = np.block([
        [np.reshape(np.asarray(points[0]), (1, len(theta)+2))],
        [np.reshape(np.asarray(points[1]), (1, len(theta)+2))],
        [np.reshape(np.asarray(points[2]), (1, len(theta)+2))]
    ])
    # Pairwise bounding-sphere test: two spheres of radius r collide
    # when their centers are no more than 2*r apart.
    for i in range(len(theta) + 2):
        point1 = np.array([
            [p[0][i]],
            [p[1][i]],
            [p[2][i]]
        ])
        for j in range(len(theta) + 2 - i):
            if(i == i+j):
                continue
            point2 = np.array([
                [p[0][j+i]],
                [p[1][j+i]],
                [p[2][j+i]]
            ])
            if(norm(point1 - point2) <= r*2):
                return 1
    return 0
~~~~
### Sampling-Based Planner Algorithm
First, we check whether the UR3 robot would encounter a self-collision or a collision with the obstacles.
1) The path planning is implemented using a sampling-based planner. We set up a tree structure to store all the intermediate paths
connecting the starting position and the goal position. The structure has two trees, T(forward) and T(backward), which serve as a kind of
"bisection method".
2) Then we sample random thetas from a uniform distribution.
3) In order to generate one goal pose, we check whether the UR3 robot would encounter a
self-collision (between its own joints) or an obstacle collision (with the bounding spheres on the two blocks).
4) If there is no collision, we add the current position/path to T(forward) and iteratively find new thetas
in T(forward), setting each new theta as the child of the previous theta.
5) We do the same for T(backward).
6) If a theta belongs to both T(forward) and T(backward), then we have a complete connected path from the starting position to the desired goal pose.
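The two-tree procedure above can be sketched as follows. This is a simplified illustration, not the project's implementation: configurations, the sampler, the collision test, and the connection test are all passed in as parameters, and each tree is a dict mapping a configuration to its parent so the path can be recovered by walking back to each root.

```python
import random

def plan(start, goal, in_collision, sample_config, can_connect, max_iters=2000):
    # T(forward) is rooted at the start, T(backward) at the goal;
    # a node's dict value is its parent in that tree.
    t_fwd = {start: None}
    t_bwd = {goal: None}
    for _ in range(max_iters):
        q = sample_config()             # step 2: sample a random configuration
        if in_collision(q):
            continue                    # step 3: discard colliding samples
        # steps 4-5: attach q to each tree it can connect to
        for tree in (t_fwd, t_bwd):
            for node in list(tree):
                if can_connect(node, q):
                    tree[q] = node      # q becomes a child of an existing node
                    break
        # step 6: q in both trees -> a complete start-to-goal path exists
        if q in t_fwd and q in t_bwd:
            path = []
            node = q
            while node is not None:     # walk back to the start
                path.append(node)
                node = t_fwd[node]
            path.reverse()
            node = t_bwd[q]
            while node is not None:     # walk forward to the goal
                path.append(node)
                node = t_bwd[node]
            return path
    return None                         # no path found within the budget
```

For example, with 1-D configurations on [0, 10], no obstacles, and a connection radius of 2, `plan(0.0, 10.0, ...)` returns a list of waypoints from 0.0 to 10.0 in which consecutive entries are at most 2 apart.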