Finding Perpendiculars
Steps for finding perpendiculars (for parallel lines)
- Subtract the general point on one line from the general point on the other.
- Write the result as a new vector in terms of \(\lambda-\mu\), which we call \(t\). (Once \(t\) is chosen correctly, this vector is perpendicular to both lines.)
- Expand this into a single vector, with each component of the form \(?+?t\).
- For a perpendicular, the dot product of this vector with the direction vector of our parallel lines is 0.
- With this condition, we can solve for \(t\).
- Once we solve for \(t\), we can substitute it back into our original perpendicular equation and find the vector of the shortest distance between the two lines.
- You can then use the magnitude formula \(|\mathbf{v}|=\sqrt{a^2+b^2+c^2}\).
- We now have the distance between the two lines (see the worked example below).
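A quick worked example (with made-up lines, purely to illustrate the steps): take the parallel lines \(\mathbf{r}_1=(1,0,0)+\lambda(1,1,0)\) and \(\mathbf{r}_2=(0,2,1)+\mu(1,1,0)\). Subtracting and writing \(t=\lambda-\mu\):
\[
AB=\mathbf{r}_1-\mathbf{r}_2=(1,-2,-1)+t(1,1,0)=(1+t,\;-2+t,\;-1)
\]
\[
AB\cdot(1,1,0)=(1+t)+(-2+t)=2t-1=0\;\Rightarrow\;t=\tfrac{1}{2}
\]
\[
AB=\left(\tfrac{3}{2},\,-\tfrac{3}{2},\,-1\right),\qquad |AB|=\sqrt{\tfrac{9}{4}+\tfrac{9}{4}+1}=\tfrac{\sqrt{22}}{2}
\]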
Steps for finding perpendiculars (for any pair of lines)
- Subtract the general point on one line from the general point on the other.
- Write the result as a new vector, with each component of the form \(?+?\lambda+?\mu\).
- Call this vector \(AB\).
- \(AB\cdot \text{direction vector 1}=0\)
- \(AB\cdot \text{direction vector 2}=0\)
- Expanding these dot products gives two simultaneous equations to solve in terms of \(\mu\) and \(\lambda\).
- By eliminating one of these variables we can find the other, then substitute back to find the first.
- Finally we substitute \(\mu\) and \(\lambda\) back into our original perpendicular equation.
- This gives us a final vector, to which we apply the magnitude formula \(|\mathbf{v}|=\sqrt{a^2+b^2+c^2}\).
- We now have the distance between the two lines (see the worked example below).
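A worked example (again with made-up lines): take \(\mathbf{r}_1=(1,0,0)+\lambda(1,1,0)\) and \(\mathbf{r}_2=(0,0,2)+\mu(0,1,1)\).
\[
AB=\mathbf{r}_2-\mathbf{r}_1=(-1-\lambda,\;\mu-\lambda,\;2+\mu)
\]
\[
AB\cdot(1,1,0)=-1-2\lambda+\mu=0,\qquad AB\cdot(0,1,1)=2-\lambda+2\mu=0
\]
Eliminating \(\mu\) gives \(\lambda=-\tfrac{4}{3}\), and substituting back gives \(\mu=-\tfrac{5}{3}\), so
\[
AB=\left(\tfrac{1}{3},\,-\tfrac{1}{3},\,\tfrac{1}{3}\right),\qquad |AB|=\sqrt{3\cdot\tfrac{1}{9}}=\tfrac{1}{\sqrt{3}}
\]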
Steps for finding perpendiculars (for a line and a point)
- Expand the line equation into a single vector, with each component of the form \(?+?\lambda\).
- Find the perpendicular by subtracting the point's position vector from this single vector (i.e. invert the signs on the point vector and add).
- Call it \(AB\).
- \(AB\cdot \text{direction vector}=0\)
- Find \(\lambda\).
- Substitute \(\lambda\) back in to find the full form of \(AB\).
- Find the magnitude; this is the distance from the point to the line (see the worked example below).
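A worked example (made-up line and point): take the line \(\mathbf{r}=(1,1,0)+\lambda(0,1,1)\) and the point \(P=(2,3,4)\).
\[
AB=\mathbf{r}-P=(-1,\;\lambda-2,\;\lambda-4)
\]
\[
AB\cdot(0,1,1)=(\lambda-2)+(\lambda-4)=2\lambda-6=0\;\Rightarrow\;\lambda=3
\]
\[
AB=(-1,\,1,\,-1),\qquad |AB|=\sqrt{1+1+1}=\sqrt{3}
\]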
Finding the distance from the origin to a plane
\[
\mathbf{r}\cdot\hat{\mathbf{n}}=d
\]
- Where \(\mathbf{r}\) is the position vector of any point on our plane.
- Where \(\hat{\mathbf{n}}\) is the unit normal vector of the plane.
- Where \(d\) is the shortest distance between the plane and the origin.
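For example (a made-up plane): if the plane is \(\mathbf{r}\cdot(2,1,2)=9\), then \(|(2,1,2)|=3\), so dividing through by 3 gives
\[
\mathbf{r}\cdot\left(\tfrac{2}{3},\,\tfrac{1}{3},\,\tfrac{2}{3}\right)=3
\]
and the plane lies a distance \(3\) from the origin.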
Finding the distance from a point to a plane
\[
\frac{|\mathbf{x}\cdot\mathbf{n}-d|}{|\mathbf{n}|}
\]
- Where \(\mathbf{x}\) is the position vector of the point.
- Where \(\mathbf{n}\) is the normal vector of the plane.
- Where \(d\) is the constant of the plane (from \(\mathbf{r}\cdot\mathbf{n}=d\)).
- Where \(|\mathbf{n}|\) is the magnitude of the normal vector.
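For example (made-up numbers): for the plane \(\mathbf{r}\cdot(1,2,2)=5\) and the point \(\mathbf{x}=(3,1,2)\),
\[
\frac{|\mathbf{x}\cdot\mathbf{n}-d|}{|\mathbf{n}|}=\frac{|3+2+4-5|}{\sqrt{1+4+4}}=\frac{4}{3}
\]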