Dasher Developments


Automatic Pointer Calibration with Dasher

Imagine that a user's pointing device requires continuous calibration. For example, most eyetrackers have parameters that need adjusting as the user wriggles around.

If the user is using Dasher, then the history of alleged pointing directions contains rich information that can be used to calibrate the information-conveying dimension of the screen (usually the y-dimension, in Dasher).

The idea is simple. Imagine that there is a slight vertical miscalibration. As the user uses Dasher for a short duration, he will be able, in spite of the miscalibration, to arrive at his destination (because the closer the user gets to a target on the "Dasher library shelf", the smaller the calibration error is relative to the size of that target). Assume that at time 1 the user is indeed looking at a target, which is small and on the right-hand side; and that at time 2 the user has reached the target. At time 2 we can instantly deduce where the user was actually looking at time 1. So we can deduce the calibration error at time 1, and send this error signal to our on-the-fly calibration system.
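The deduction at time 2 can be sketched as follows. This is an illustrative fragment, not Dasher's actual code; the names `gaze_y_at_t1` (the alleged vertical gaze position recorded at time 1) and `target_y_at_t1` (the screen position the target's centre occupied at time 1) are hypothetical.

```python
def calibration_error(gaze_y_at_t1: float, target_y_at_t1: float) -> float:
    """Deduce the vertical calibration error at time 1.

    Once the user has reached a target at time 2, we know he was really
    looking at that target at time 1.  The difference between the alleged
    gaze position and the target's position at time 1 is therefore the
    calibration error at time 1, in screen units (e.g. pixels).
    """
    return gaze_y_at_t1 - target_y_at_t1
```

For example, if the eyetracker reported a gaze at y = 120 while the target's centre was actually at y = 100, the error is +20: the tracker is reading 20 pixels too low on the screen (large y), and the calibration system should subtract that offset.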

The user might not always look exactly at the intended targets, but we assume that the contributions from the user's other stray pointing gestures will average out.
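One simple way to make the stray gestures average out is to blend each error signal into a running offset estimate with a small learning rate (an exponentially weighted average). This is a minimal sketch of such an on-the-fly calibrator, assuming a single vertical offset parameter; the class and its names are illustrative, not part of Dasher.

```python
class OnTheFlyCalibrator:
    """Running estimate of the vertical calibration offset.

    Each error signal nudges the estimate by a small fraction, so a
    persistent miscalibration is tracked while occasional stray
    glances largely cancel out.
    """

    def __init__(self, learning_rate: float = 0.1) -> None:
        self.learning_rate = learning_rate
        self.offset = 0.0  # current estimate of the vertical error

    def update(self, error: float) -> None:
        """Blend one deduced error signal into the running estimate."""
        self.offset += self.learning_rate * (error - self.offset)

    def correct(self, raw_y: float) -> float:
        """Apply the current calibration to a raw gaze coordinate."""
        return raw_y - self.offset
```

With a learning rate of 0.1, a steady 20-pixel error is learned almost completely within a few dozen updates, while a single outlying glance shifts the estimate by only a tenth of its error.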

We plan to implement this method, and wish to ensure that the idea is in the public domain, hence this web document, dated 28.6.2003.

(Since 2004, this feature has been in Dasher version 3.)

[Phil Cowans has a similar but much more elegant method for simultaneously calibrating both dimensions of an eye-tracker, x and y, in a Dasher-like dynamic environment.]


[Image: Dasher on an iPAQ]

A related idea is deliberate miscalibration of Dasher. While this idea is unlikely to be helpful for eyetrackers, it could be very useful for pen-based systems such as touch screens, tablet PCs, and stylus-driven palmtops. We plan to include an option in Dasher such that the Dasher mouse is rendered (either visibly or invisibly) at a location offset from the pointer position delivered by the user; for example, at a vertical offset of 10 pixels. This offset allows the user to steer without her stylus getting in the way of the crucial part of the screen. If a large offset is chosen, then it would probably be a good idea for the Dasher-mouse location to be visualized on the screen. The pointer could even be moved around on a completely separate part of the screen (just as a mouse user moves the physical mouse around on a mat, not on the screen!).
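The deliberate-offset idea amounts to a one-line coordinate transform at render time. A minimal sketch, assuming screen coordinates with y increasing downwards and a hypothetical helper name (this is not Dasher's API):

```python
def dasher_mouse_position(stylus_x: float, stylus_y: float,
                          offset_y: float = 10.0) -> tuple[float, float]:
    """Place the Dasher mouse at a point offset from the stylus tip.

    Rendering the Dasher mouse above the stylus (smaller y) keeps the
    stylus from occluding the part of the screen being steered toward.
    The default offset_y of 10 corresponds to the 10-pixel vertical
    offset suggested above; a larger offset would call for drawing the
    Dasher-mouse location visibly.
    """
    return (stylus_x, stylus_y - offset_y)
```

A stylus at (50, 200) would then steer as if the pointer were at (50, 190); choosing a very large offset is what lets the stylus operate on an entirely separate region of the screen, mouse-mat style.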


Dasher home page
David MacKay
Last modified: Thu Jul 31 17:21:46 2003