Monday, September 3, 2018

c++ - How do I fix my planet-facing camera?



I'm having trouble implementing a camera controller suitable for first-person use around a planet.


The camera needs to be oriented correctly according to gravity (the vector from the camera position to the planet center), and I want the look direction to stay the same relative to the planet's surface. This means the look direction should move in world space as the orientation changes, even if the mouse is not touched. Not doing this would be confusing for the player.




I've tried many different solutions. The biggest problem is that the transformation I use to go between the world up vector (0,1,0) and the camera's orientation results in spinning at the south pole. The camera yaws of its own accord as the orientation changes near the south pole. If you stay still and look around, it's fine, but each frame the orientation changes, the camera rotates itself. I have isolated it to the pitch part of the camera direction (if you remove the pitch completely, the yaw behaves fine).


As far as I understand it so far, it has something to do with this: imagine you're on the equator, facing the north pole. You walk to the north pole without changing direction. Now you move right, all the way back down to the equator. At this point, you are facing along the equator, despite having never purposefully changed direction.


I don't need theoretical help on how cameras work, or how matrices or quaternions work. I need help from math wizards or experienced people.



Each section represents a different attempt at a solution. There's a comment above each describing what the issues are. (I'm happy to scrap all this code; I just want something that works.)


void Camera::set_angles_advanced(float horizontal, float vertical) {
    glm::mat4 trans;
    float factor = 1.0f;
    float real_vertical = vertical;

    m_horizontal += horizontal;
    m_vertical += vertical;

    // Wrap yaw.
    while (m_horizontal > TWO_PI) {
        m_horizontal -= TWO_PI;
    }

    while (m_horizontal < -TWO_PI) {
        m_horizontal += TWO_PI;
    }

    // Clamp pitch, trimming the per-frame delta by the overshoot.
    if (m_vertical > MAX_VERTICAL) {
        vertical -= m_vertical - MAX_VERTICAL;

        if (vertical < 0) {
            vertical = 0;
        }

        m_vertical = MAX_VERTICAL;
    }
    else if (m_vertical < -MAX_VERTICAL) {
        vertical -= m_vertical + MAX_VERTICAL; // note "+": the overshoot past -MAX_VERTICAL

        if (vertical > 0) {
            vertical = 0;
        }

        m_vertical = -MAX_VERTICAL;
    }

    // -------------------- south pole rotation

    /*glm::quat rotation;

    if (m_orientation != glm::vec3(0.0f, 1.0f, 0.0f)) {
        glm::vec3 axis = glm::normalize(glm::cross(glm::vec3(0.0f, 1.0f, 0.0f), m_orientation));
        rotation = glm::rotate(rotation, acosf(m_orientation.y) * ONEEIGHTY_PI, axis);
    }

    rotation = glm::rotate(rotation, m_horizontal * ONEEIGHTY_PI, glm::vec3(0.0f, 1.0f, 0.0f));
    rotation = glm::rotate(rotation, m_vertical * ONEEIGHTY_PI, glm::vec3(1.0f, 0.0f, 0.0f));

    m_direction = glm::vec3(rotation * glm::vec4(0.0f, 0.0f, -1.0f, 0.0f));*/

    // --------------------- south pole rotation

    /*glm::vec3 tmp = m_orientation;
    float look_factor = 1.0f;
    float addition = 0.0f;

    if (tmp.y < 0.0f) {
        tmp.y *= -1.0f;
        look_factor = -1.0f;
        addition = 180.0f;
    }

    glm::mat4 yaw = glm::rotate(glm::mat4(), m_horizontal * ONEEIGHTY_PI, m_orientation);
    glm::mat4 pitch = glm::rotate(glm::mat4(), m_vertical * -ONEEIGHTY_PI, glm::vec3(1.0f, 0.0f, 0.0f));

    if (tmp != glm::vec3(0.0f, 1.0f, 0.0f)) {
        glm::vec3 axis = glm::normalize(glm::cross(glm::vec3(0.0f, 1.0f, 0.0f), tmp));
        pitch = glm::rotate(glm::mat4(), acosf(tmp.y) * ONEEIGHTY_PI * look_factor + addition, axis) * pitch;
    }

    glm::mat4 cam = yaw * pitch;

    m_direction = glm::vec3(cam[2]);*/

    // -------------------- oscillation when looking close to vertical, vertical range capped

    /*glm::mat4 yaw_matrix = glm::rotate(glm::mat4(), m_horizontal * ONEEIGHTY_PI, m_orientation);

    m_right = glm::cross(m_direction, m_orientation);

    glm::mat4 pitch_matrix = glm::rotate(glm::mat4(), m_vertical * -ONEEIGHTY_PI, glm::normalize(m_right));

    glm::mat4 camera_matrix = pitch_matrix * yaw_matrix;
    m_direction = glm::vec3(camera_matrix[2]);*/

    // --------------------- oscillation when looking close to vertical, vertical range always capped to -90,90

    /*glm::mat4 yaw = glm::rotate(glm::mat4(), m_horizontal * ONEEIGHTY_PI, m_orientation);

    glm::mat4 pitch = glm::rotate(glm::mat4(), m_vertical * -ONEEIGHTY_PI, m_right);

    glm::mat4 cam = pitch * yaw;

    m_right = glm::vec3(cam[0]);
    m_up = glm::vec3(cam[1]);
    m_direction = glm::vec3(cam[2]);*/

    // ----------------------- south pole rotation

    /*glm::dvec3 dir = glm::dvec3(cos(m_vertical) * sin(m_horizontal),
                                sin(m_vertical),
                                cos(m_vertical) * cos(m_horizontal));

    glm::vec3 tmp = m_orientation;
    tmp.y = fabs(tmp.y);

    glm::dmat4 dtrans;
    float angle;

    if (glm_sq_distance(tmp, glm::vec3(0.0f, 1.0f, 0.0f)) > 0.001f) {
        glm::vec3 axis = glm::normalize(glm::cross(glm::vec3(0.0f, 1.0f, 0.0f), m_orientation));
        angle = acos(m_orientation.y) * ONEEIGHTY_PI;
        dtrans = glm::rotate(glm::mat4(), angle, axis);
    }
    else if (m_orientation.y < 0.0f) {
        factor = -1.0f;
    }

    dir = glm::dvec3(dtrans * glm::dvec4(dir.x, dir.y, dir.z, 0.0f));
    m_direction = glm::vec3(dir);*/

    m_dir_horizontal_norm = glm::normalize(m_direction - glm_project(m_direction, m_orientation));

    m_view = glm::lookAt(m_position, m_position + m_direction, m_orientation);
    m_vp = m_perspective * m_view;
}


Edit: Solved. Complete code for future reference. I wouldn't wish my trials with this problem upon anyone.


m_horizontal += horizontal;
m_vertical += vertical;

while (m_horizontal > TWO_PI) {
    m_horizontal -= TWO_PI;
}

while (m_horizontal < -TWO_PI) {
    m_horizontal += TWO_PI;
}

if (m_vertical > MAX_VERTICAL) {
    m_vertical = MAX_VERTICAL;
}
else if (m_vertical < -MAX_VERTICAL) {
    m_vertical = -MAX_VERTICAL;
}

// Mouse look within a fixed reference space: yaw about world y, then pitch about world x.
glm::quat world_axes_rotation = glm::angleAxis(m_horizontal * ONEEIGHTY_PI, glm::vec3(0.0f, 1.0f, 0.0f));
world_axes_rotation = glm::normalize(world_axes_rotation);
world_axes_rotation = glm::rotate(world_axes_rotation, m_vertical * ONEEIGHTY_PI, glm::vec3(1.0f, 0.0f, 0.0f));

// Keep the pole reference perpendicular to the local up vector (m_orientation).
m_pole = glm::normalize(m_pole - glm::dot(m_orientation, m_pole) * m_orientation);

// Local frame on the sphere: x toward the pole, y along local up, z completing the basis.
glm::mat4 local_transform;
local_transform[0] = glm::vec4(m_pole.x, m_pole.y, m_pole.z, 0.0f);
local_transform[1] = glm::vec4(m_orientation.x, m_orientation.y, m_orientation.z, 0.0f);
glm::vec3 tmp = glm::cross(m_pole, m_orientation);
local_transform[2] = glm::vec4(tmp.x, tmp.y, tmp.z, 0.0f);
local_transform[3] = glm::vec4(m_position.x, m_position.y, m_position.z, 1.0f);

world_axes_rotation = glm::normalize(world_axes_rotation);
m_view = local_transform * glm::mat4_cast(world_axes_rotation);
m_direction = -1.0f * glm::vec3(m_view[2]);
m_up = glm::vec3(m_view[1]);
m_right = glm::vec3(m_view[0]);

m_view = glm::inverse(m_view);

Answer



The simplest way to do this is to compute a correcting rotation every time the camera moves:


axis = cross(newPosition, oldPosition);
angle = acos(dot(normalize(oldPosition), normalize(newPosition)));

...and then rotate the camera's orientation matrix/quaternion/basis vectors by this correction. But since the movements are likely to be small and frequent, this approach is likely to suffer from poor numerical accuracy and drift.
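For instance, a minimal GLM sketch of that correction might look like the following (variable names are illustrative, and it assumes the camera's orientation is stored as a quaternion):

#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>
#include <cmath>

// Apply the rotation that carries the old position direction onto the new one.
// Per the sign warning below, flip the axis if the correction turns the wrong way.
void correct_orientation(glm::quat& cameraOrientation,
                         const glm::vec3& oldPosition,
                         const glm::vec3& newPosition) {
    glm::vec3 oldDir = glm::normalize(oldPosition);
    glm::vec3 newDir = glm::normalize(newPosition);
    glm::vec3 axis = glm::cross(oldDir, newDir);

    if (glm::length(axis) > 1e-6f) { // skip degenerate (parallel) moves
        float angle = acosf(glm::clamp(glm::dot(oldDir, newDir), -1.0f, 1.0f));
        cameraOrientation = glm::normalize(glm::angleAxis(angle, glm::normalize(axis)) * cameraOrientation);
    }
}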


To minimize drift when we do a lot of translating around the sphere without camera orientation input, we can store the camera's orientation within a fixed reference frame, and then transform it on demand to the part of the sphere we need.



Because of the Hairy Ball Theorem, we can't construct such a transformation continuously for every point on the sphere, or at least not without an extra input. So we'll also keep track of an extra vector to help construct this transformation.


Here's one potential setup:


[Image: example coordinate spaces]


(I've arbitrarily picked a left-handed coordinate system with y+ up and z+ forward; you can adjust as needed. Also, blanket warning: I have a habit of getting the wrong sign on rotations, so take my signs with a grain of salt.)


Let's define...


cameraReferenceOrientation = camera's orientation within the reference space. In this space y-minus (say) corresponds to "down" in the planet's gravity well. We'll maintain an invariant that the camera's forward vector lies in the plane x=0 of this space.


cameraPosition = camera's offset relative to the center of the sphere.


poleDirection = our extra reference unit vector, here chosen to be one of the poles you orbit around when moving forward along a great circle. This choice means we only need to update this vector when we strafe or yaw, and errors don't accumulate when moving straight forward/backward.


Given a position for the camera, we can construct a resulting camera orientation using something like the following pseudocode:


localUp = normalize(cameraPosition);


// Construct a transformation matrix to go from our reference space
// to this point on the sphere.
// (could equivalently be done with a quaternion and translation vector)
localTransformation[0] = poleDirection;
localTransformation[1] = localUp;
localTransformation[2] = cross(poleDirection, localUp);
localTransformation[3] = cameraPosition;
// May need to transpose, depending on your matrix library's handling of rows/columns.


outputCameraTransformation = localTransformation * cameraReferenceOrientation;
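In GLM, for example, that could be sketched as follows (assuming cameraReferenceOrientation is stored as a glm::quat; GLM's operator[] accesses columns, which matches the column-vector convention used here):

glm::vec3 localUp = glm::normalize(cameraPosition);

glm::mat4 localTransformation(1.0f);
localTransformation[0] = glm::vec4(poleDirection, 0.0f);                      // local x
localTransformation[1] = glm::vec4(localUp, 0.0f);                            // local y (up)
localTransformation[2] = glm::vec4(glm::cross(poleDirection, localUp), 0.0f); // local z
localTransformation[3] = glm::vec4(cameraPosition, 1.0f);                     // translation

glm::mat4 outputCameraTransformation = localTransformation * glm::mat4_cast(cameraReferenceOrientation);
// For a view matrix, invert it: glm::mat4 view = glm::inverse(outputCameraTransformation);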

There are just two cases where we need to modify the stored cameraReferenceOrientation and poleDirection values before computing the outputCameraTransformation:


1) Camera rotation


You can handle rotation of the camera in place by transforming cameraReferenceOrientation however you want (just be wary of gimbal lock). After arriving at a new cameraReferenceOrientation, we need to adjust it to maintain our invariant that the camera's forward vector lies in the plane x=0:


referenceForward = cameraReferenceOrientation * (0, 0, 1);
correctionAngle = atan2(referenceForward.x, referenceForward.z);

cameraReferenceOrientation = AngleAxisRotation(correctionAngle, (0, -1, 0)) * cameraReferenceOrientation;


poleDirection = normalize(AngleAxisRotation(correctionAngle, normalize(cameraPosition)) * poleDirection);

(Here, to be concise, I'm assuming a convenience function that generates a transformation for rotating by a given angle about a given axis. If your cameraReferenceOrientation is a matrix rather than a quaternion, you may need to orthonormalize after rotating it to prevent accumulation of errors.)
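In GLM, for instance, the same correction might be sketched as:

glm::vec3 referenceForward = cameraReferenceOrientation * glm::vec3(0.0f, 0.0f, 1.0f);
float correctionAngle = atan2f(referenceForward.x, referenceForward.z);

// Remove the yaw from the reference-space orientation...
cameraReferenceOrientation = glm::normalize(
    glm::angleAxis(correctionAngle, glm::vec3(0.0f, -1.0f, 0.0f)) * cameraReferenceOrientation);

// ...and transfer it to the pole reference by rotating about the local up vector.
poleDirection = glm::normalize(
    glm::angleAxis(correctionAngle, glm::normalize(cameraPosition)) * poleDirection);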


This effectively transfers yaw information out of our reference space and into the poleDirection, so movement "forward" in the camera's view stays perpendicular to the pole.


The pseudocode above does not maintain a particular look direction with regard to z; you can change that if it's more convenient.


2) Strafe movement


We also need to update the poleDirection when the viewpoint moves side to side:


localUp = normalize(cameraPosition);
poleDirection = normalize(poleDirection - dot(localUp, poleDirection) * localUp);


This keeps the poleDirection 90 degrees away from our position at all times, so we're never close to a "tuft" on the hairy ball.
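Putting the strafe update together with the position change, a hypothetical helper might look like this (the function name and step parameter are illustrative):

// Move sideways along the surface and keep poleDirection tangent to the sphere.
void strafe(glm::vec3& cameraPosition, glm::vec3& poleDirection, float step) {
    glm::vec3 localUp = glm::normalize(cameraPosition);
    glm::vec3 sideways = glm::cross(poleDirection, localUp); // the local z axis from earlier

    // Slide along the sphere while preserving altitude.
    float radius = glm::length(cameraPosition);
    cameraPosition = glm::normalize(cameraPosition + sideways * step) * radius;

    // Re-project the pole reference against the new up vector.
    localUp = glm::normalize(cameraPosition);
    poleDirection = glm::normalize(poleDirection - glm::dot(localUp, poleDirection) * localUp);
}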


Since this approach doesn't privilege any fixed region or direction on the sphere, you shouldn't encounter any singular spots where it suddenly behaves differently (like the "spinning South pole" problem). It should be possible to move forward and backward along any great circle all the way around the planet endlessly without experiencing drift in apparent yaw, pitch, or roll relative to the surface.

