I'm using an Arduino and a 9DOF sensor (gyroscope, accelerometer and magnetometer), and I'm trying to use the pitch, roll and yaw the sensor gives me to rotate an object in Unity.
I managed to compute pitch and roll correctly from the accelerometer (the z and x axes in Unity), but I can't seem to get the yaw right. By that I mean that when I rotate the sensor in pitch or roll, the yaw rotates too, in a weird way.
Code on the Arduino to get the heading:
void getHeading(void)
{
  // Raw compass heading from the magnetometer's x/y components, in degrees
  heading = 180 * atan2(Mxyz[0], Mxyz[1]) / PI;
  if (heading < 0) heading += 360;
}
void getTiltHeading(void)
{
  // Tilt angles from the accelerometer's gravity vector, in radians
  //float pitch = asin(-Axyz[0]);
  //float roll = asin(Axyz[1] / cos(pitch));
  float pitch = atan(Axyz[0] / sqrt(Axyz[1] * Axyz[1] + Axyz[2] * Axyz[2]));
  float roll = atan(Axyz[1] / sqrt(Axyz[0] * Axyz[0] + Axyz[2] * Axyz[2]));

  // Project the magnetometer reading onto the horizontal plane
  float xh = Mxyz[0] * cos(pitch) + Mxyz[2] * sin(pitch);
  float yh = Mxyz[0] * sin(roll) * sin(pitch) + Mxyz[1] * cos(roll) - Mxyz[2] * sin(roll) * cos(pitch);
  float zh = -Mxyz[0] * cos(roll) * sin(pitch) + Mxyz[1] * sin(roll) + Mxyz[2] * cos(roll) * cos(pitch);

  // Tilt-compensated heading in degrees, wrapped to [0, 360)
  tiltheading = 180 * atan2(yh, xh) / PI;
  if (yh < 0) tiltheading += 360;  // atan2 is negative exactly when yh < 0
}
Pitch, roll and yaw on the Unity side (C#):
// Tilt from the accelerometer, converted to degrees
float _Pitch = (float)(180 / Math.PI * Math.Atan(m_ResultX / Math.Sqrt(m_ResultY * m_ResultY + m_ResultZ * m_ResultZ)));
float _Roll = (float)(180 / Math.PI * Math.Atan(m_ResultY / Math.Sqrt(m_ResultX * m_ResultX + m_ResultZ * m_ResultZ)));
// Yaw taken directly from the tilt-compensated heading computed on the Arduino
float _Yaw = (float)m_TiltHeadingResult;
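The angles are then applied to the object along these lines (a sketch only; the Quaternion.Euler axis order is my guess from the pitch/roll mapping described above, and m_Target stands in for the rotated Transform):
// Hypothetical application of the angles: pitch drives Unity's z axis and
// roll its x axis, as described above; adjust to match your sensor mounting.
m_Target.transform.rotation = Quaternion.Euler(_Roll, _Yaw, _Pitch);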
Don't hesitate to ask for details.
Answer
The name for this problem is Sensor Fusion, because we're taking readings from multiple sensors, each in some way incomplete or inaccurate, and combining them into a more reliable measure.
In the past, I've had good results using a Complementary Filter for this task, which is effectively a weighted average. It's not state of the art, but it's straightforward to implement and tune.
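As a one-dimensional illustration of the idea (variable names and the 0.98 weight are just examples, not part of any particular API):
// 1-D complementary filter step: blend the integrated gyro angle
// (responsive, but drifts) with the accelerometer's absolute angle
// (noisy, but drift-free).
float ComplementaryStep(float angle, float gyroRate, float accelAngle, float dT)
{
    const float alpha = 0.98f;  // gyro weight
    return alpha * (angle + gyroRate * dT) + (1f - alpha) * accelAngle;
}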
The filter retains some internal state from frame to frame: a representation of the device's current estimated orientation in space. You can use a world-to-local quaternion for this, but I'll describe it using two vectors:
- localUp is a vector in the device's local coordinate frame pointing in the estimated direction of world up (0, 1, 0). (So if the device is upside-down, localUp points toward (0, -1, 0).)
- localNorth is a vector in the device's local coordinate frame pointing in the direction of world "forward" (0, 0, 1).
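As a minimal sketch of that state in Unity C# (the initial values assume the device starts level, facing world forward):
// Filter state, carried across updates. Both vectors stay unit length
// and perpendicular to each other.
Vector3 localUp = new Vector3(0, 1, 0);
Vector3 localNorth = new Vector3(0, 0, 1);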
Each time we get new sensor data, we integrate it into our estimated orientation. The first step is to build an estimate of how much the device has rotated, according to each sensor, as a vector of radian rotation angles about each axis:
Vector3 gyroAngles = gyroscope.rotationalVelocity * dT;
// Small-angle rotation (axis * angle) that would carry our current estimate onto the sensed direction
Vector3 accelAngles = Vector3.Cross(localUp, accelerometer.sensedUp - localUp);
Vector3 magnetoAngles = Vector3.Cross(localNorth, magnetometer.sensedNorth - localNorth);
A few notes:
- this assumes gyroscope.rotationalVelocity is in radians per second, and that dT is the number of seconds elapsed since the last integration step (which may differ from your frame's deltaTime if you're running this filter on a separate thread to get more frequent sensor updates)
- sensedUp is a calculated value, applying some type of low-pass filter to the accelerometer readings to minimize noise (see the sketch after these notes). Ideally, this calculation should discount readings whose magnitude differs substantially from 1G, as this indicates the sensor is picking up device motion rather than gravity direction
- sensedNorth is a calculated value, low-pass filtered and compensated for magnetometer biases. If you tumble your device around, your sensedNorth readings should ideally trace out a unit sphere centered at the origin. Magnetometer mis-calibration can have a huge impact on the accuracy of the tracking. You may also need to compensate for magnetic inclination (where I am, for instance, the magnetic field points closer to vertical than horizontal)
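A minimal sketch of that kind of smoothing, using an exponential moving average (the function name, the smoothing factor, and the 0.2 G gating threshold are all illustrative, not prescribed):
// Exponential moving average of the accelerometer's gravity direction.
// 'smoothing' near 1 reacts slowly but rejects noise; near 0 reacts fast.
Vector3 sensedUp = Vector3.up;  // persisted between updates

void UpdateSensedUp(Vector3 accelReading, float smoothing = 0.9f)
{
    // Discard readings that look like device motion rather than gravity
    // (magnitude far from 1G). Threshold is an example value.
    if (Mathf.Abs(accelReading.magnitude - 1f) > 0.2f)
        return;

    sensedUp = Vector3.Slerp(accelReading.normalized, sensedUp, smoothing).normalized;
}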
Now that we have each sensor's estimated rotation from the current orientation, we use a weighted average to combine them into an angularDelta vector.
You generally want to give the gyroscope a very high weight (in the ballpark of 90-99%) to make the filter respond immediately to quick movements, while the remaining weight keeps drift from accumulating.
If you detect that the device is accelerating (accelerometer magnitude differing from 1G), you can drop the weighting of the accelAngles, since they're not trustworthy in motion.
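Putting that together, a sketch of the weighted average (the 98/1/1 split is just one example in the range suggested above):
// Weights should sum to 1. The gyro dominates so quick motion is tracked
// immediately; the accel/magneto terms slowly correct accumulated drift.
float gyroWeight = 0.98f, accelWeight = 0.01f, magnetoWeight = 0.01f;

Vector3 angularDelta = gyroWeight * gyroAngles
                     + accelWeight * accelAngles
                     + magnetoWeight * magnetoAngles;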
Once we have our averaged angularDelta, we use it to derive new estimates for our local Up and North:
// First-order integration: rotating a vector v by a small angle vector w displaces it by roughly Cross(w, v)
Vector3 estimatedUp = localUp + Vector3.Cross(angularDelta, localUp);
Vector3 estimatedNorth = localNorth + Vector3.Cross(angularDelta, localNorth);
These will in general not be perpendicular or unit length, so we apply a quick error correction before using them to update our orientation state:
float error = Vector3.Dot(estimatedUp, estimatedNorth) * 0.5f;
localUp = (estimatedUp - error * estimatedNorth).normalized;
localNorth = (estimatedNorth - error * estimatedUp).normalized;
Now we're done. For ease of use, we can convert our orientation estimate into a quaternion:
Quaternion worldToLocal = Quaternion.LookRotation(localNorth, localUp);
Quaternion localToWorld = Quaternion.Inverse(worldToLocal);
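To drive a Unity object with the result (assuming this code runs as a component on the object being rotated):
// Apply the fused orientation to this GameObject's transform
transform.rotation = localToWorld;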
One neat trick is that this filter even works for devices with no magnetometer. We just fake a constant magnetometer reading of (0, 0, 1) and assign the magnetometer a very low weight (<0.1%), keeping our estimated yaw from drifting uncontrollably.
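In code, that trick is just the following (the weight is again illustrative):
// No magnetometer: pretend north is always straight ahead, with a tiny
// weight so yaw drift is corrected very gently instead of never.
Vector3 magnetoAngles = Vector3.Cross(localNorth, Vector3.forward - localNorth);
float magnetoWeight = 0.001f;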