So, I have been playing with 3D graphics and implementing various functions for fun. My current problem is this:
You have a model, and you want to shade a face based on the angle between that face's normal vector and a second vector pointing from the center of the face to an arbitrary point in space. If the angle is less than 90 degrees, you shade the face; otherwise you don't.
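Since the angle is below 90 degrees exactly when the dot product of the two vectors is positive, the test itself is cheap. Here is a minimal sketch of that test in Python (the function name and parameters are my own, hypothetical choices, not anything from a particular library):

```python
import numpy as np

def should_shade(face_center, face_normal, point):
    """Shade the face if the angle between its normal and the vector
    from the face center to `point` is under 90 degrees.

    angle < 90 deg  <=>  cos(angle) > 0  <=>  dot product > 0
    """
    to_point = np.asarray(point) - np.asarray(face_center)
    return np.dot(np.asarray(face_normal), to_point) > 0.0
```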
Everything is a straightforward formula calculation except the normal. A plane has two normals, pointing in opposite directions, so the two angles they make with the second vector sum to 180 degrees and one of them is always >= 90 degrees. If we compute the normal by traditional methods, we might end up with the vector that points into the model, which flips the result of the shading test unpredictably.
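For reference, by the traditional method I mean the cross product of two edge vectors. A sketch, assuming triangular faces given as three vertices; note that which of the two possible normals you get depends entirely on the vertex ordering:

```python
import numpy as np

def face_normal(v0, v1, v2):
    """Unit normal of the triangle (v0, v1, v2) via the cross product.

    The sign follows the right-hand rule for the winding order
    v0 -> v1 -> v2: swapping v1 and v2 flips the normal, which is
    exactly the ambiguity described above.
    """
    v0, v1, v2 = np.asarray(v0), np.asarray(v1), np.asarray(v2)
    n = np.cross(v1 - v0, v2 - v0)
    return n / np.linalg.norm(n)
```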
So, math people: how would you determine whether the normal of a face in a model points outward rather than into the model itself?
I understand the right-hand rule, but I need a more algorithmic approach.