So I have had very little formal instruction on Unity, but I have spent a lot of time using the application and teaching myself.
Currently I am trying to get an AI to lob an object at an enemy, so that the lobbed object lands right where the enemy was standing when it was thrown. I have added force in the positive Y direction so that the object always takes a set amount of time to hit the ground (1.15 seconds). Using basic physics, I should be able to set the object's velocity and accurately predict where it will land: it should always travel a distance equal to its Velocity.X * 1.15 seconds. That is not what happens, though. At small distances it hardly moves, and at large distances it way overshoots.
The distance it travels in 1.15 seconds should scale linearly with the horizontal velocity, but it looks quadratic instead. Does anyone know what is going on?
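For reference, here is a minimal sketch of the approach described above, assuming a Rigidbody projectile launched from flat ground. The class name `Launcher` and fields like `flightTime` are just illustrative placeholders, not anything from the original project:

    using UnityEngine;

    // Sketch: give the projectile a fixed vertical launch speed so it always lands
    // after flightTime seconds, then pick the horizontal velocity so that
    // horizontalSpeed * flightTime equals the distance to the target.
    public class Launcher : MonoBehaviour
    {
        public Rigidbody projectilePrefab;
        public float flightTime = 1.15f;   // desired time until the projectile lands

        public void LobAt(Vector3 targetPosition)
        {
            Rigidbody projectile =
                Instantiate(projectilePrefab, transform.position, Quaternion.identity);

            // Vertical speed that brings the projectile back to its launch height
            // after flightTime seconds (flat ground assumed): vY = g * t / 2.
            float vY = Physics.gravity.magnitude * flightTime * 0.5f;

            // Horizontal velocity chosen so distance covered = speed * flightTime.
            Vector3 toTarget = targetPosition - transform.position;
            toTarget.y = 0f;
            Vector3 vXZ = toTarget / flightTime;

            // Set the launch velocity once at spawn time, rather than applying a
            // force over multiple frames.
            projectile.velocity = vXZ + Vector3.up * vY;
        }
    }

With the velocity set once like this, the range should come out linear in the horizontal speed, which is the behaviour I was expecting.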