Saturday, April 24, 2010

The Pursuit Rotor Task

Recently I had a brief foray into research on manual dexterity (award-winning BRiMS paper can be downloaded here), and I ran into the Pursuit Rotor task.  Originally, the task had a person hold a stylus in one hand and try to follow a disc moving quickly on a turntable.  When the stylus was 'on' the disc, a circuit was completed and a timer was activated.  The measure of dexterity was the proportion of trial time spent 'on' target.  Better explanations are available on the web, such as here and here (which is where the image to the left originates).

Along with being a pure measure of visual-motor tracking, it seems to have been used in the past as a motor skill that can be proceduralized (and thus show improvement in patients like E.M., who got better at procedural tasks he never remembered doing).  It was probably also used as a diagnostic tool to identify certain types of neurological problems.  The most commonly used modern version seems to be Lafayette Instruments' 30014, but the task seems to get used fairly infrequently today, probably because many researchers don't care to invest in special-purpose hardware like this.  But in some ways it seems very relevant to modern mouse-driven interfaces.  Next, I will show how I have implemented it in PEBL.

Here is an image of an old model from the U Akron Archives of the history of psychology.

To start with, I want to implement something fairly faithful to the original, but controlled by tracking with a mouse. So, I create a circular path, a target (whose size can be adjusted), and logic to control how fast it moves.  How fast should it move?  Sources I have found indicate that typical speeds were 1 revolution per second, but could be adjusted.  This rate may or may not be appropriate for our task, because moving a mouse doesn't have the same movement and feedback as moving your whole arm, so we may need to slow it down.

To start with, I defined a bunch of global variables (in the Init() function) that will make things easier.  I wanted to be faithful to a write-up of the original task I had, so I went through the bother of assuming a 17-inch 3:4 screen to get a good guess at physical dimensions.  Below, gScale computes pixels per inch on the screen.  These calculations could be used in a number of other contexts.

  gPi <- 3.14159265
  gVideoWidth <- 800
  gVideoHeight <- 600
  gHomeX <-  400
  gHomeY <- 300

  ##if (3a)^2+(4a)^2 = 17^2, then
  ##    9a^2 + 16a^2 = 17^2
  ##    25a^2 = 17^2
  ##    a^2 = (17/5)^2, so a = 17/5
  ## this assumes 17" screen with 3x4 aspect ratio
  screenWidth <- (17/5) * 4
  screenHeight <- (17/5) *3
  gScale <- gVideoWidth/screenWidth
  ##These dimensions were taken from description of physical PR task:
  radiusInches <- (10.04-1.44)/2  #radius in inches
  gRadius <- radiusInches * gScale
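The same scale computation can be sketched in Python to check the numbers (the 17-inch diagonal and 3:4 aspect ratio are the write-up's assumptions, and the variable names mirror the PEBL globals):

```python
# Pixels-per-inch for an assumed 17-inch, 3:4 screen at 800x600.
VIDEO_WIDTH = 800
VIDEO_HEIGHT = 600

# For a 3:4 screen with diagonal d: (3a)^2 + (4a)^2 = d^2, so a = d/5.
diagonal = 17.0
screen_width = (diagonal / 5) * 4   # inches
screen_height = (diagonal / 5) * 3  # inches

scale = VIDEO_WIDTH / screen_width  # pixels per inch

# Path dimensions taken from the physical task description.
radius_inches = (10.04 - 1.44) / 2  # radius in inches
radius_px = radius_inches * scale
```

Running this gives a path radius of about 252.94 pixels, which is consistent with the "Path radius: 252.941" line in the summary report shown later.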

Now, I just need to draw a path for the target to move on.  I want to make a donut shape, but I don't have a shape exactly like that in PEBL: just circles.  Since the radius of the path is set, what I'll do is  make two circles, one 10 pixels larger than the path, which I will color dark grey, and one ten pixels smaller than the path, which I will color the background color. This will give the illusion of a ring, when it is just a stack of circles, and the target will then appear as a path to follow.

  back1 <- Circle(gHomeX,gHomeY,gRadius+10,MakeColor("grey40"),1)
  back2 <- Circle(gHomeX,gHomeY,gRadius-10,MakeColor("grey"),1)

I then added some text boxes for the instructions, along with logic to present them.  These will be on the screen for the entire experiment, so we just add them once and reuse or hide them as necessary.

The next thing is to make the target that you follow with the mouse.  I'm not sure if the original machine provided feedback for when you were 'on' target, but it seems reasonable to do so.  So I want a target that will 'light up' when you are keeping on top of it.  What I'll do is make a single circle target and define two colors (dark red and bright red).  During the tracking task, if the cursor is found to be on top of the target, we change the color from dark red to red; otherwise we set it back to dark red, so it looks as if the target lights up when you are on top of it.  I didn't define these as global variables; they are added and removed within each Trial() function:

  col1 <- MakeColor("red")
  col2 <- MakeColor("darkred")
  gTarg <- Circle(gHomeX+gRadius,gHomeY,gTargSize,col2,1)
  AddObject(gTarg,gWin)
  SetMouseCursorPosition(gHomeX+gRadius,gHomeY)

The last command puts the mouse cursor on the starting position of the red target.
Now, the logic is pretty straightforward.  Determine how long the trial will be, then cycle until this time is up.  On each cycle, check the current mouse position and compare it to the target position.  If the mouse is inside the target, change the color to col1 (bright red); otherwise change it to col2 (dark red).  Record whether you have a hit, as well as the absolute positions of the cursor and target, for later analysis.  Here is the core logic within that inner loop:

Keep track of time:
    tLast <- tNow
    tNow <- GetTime()
    tDiff <- tNow -tLast
    tElapsed <- tNow - t0

Computing the location of the target is pretty straightforward trigonometry.  It's important to recognize that we won't always be able to count on a fixed time step, so rather than tracing the path around a circle in steps, I compute the elapsed time (time since the start of the trial) and determine exactly where the target should be at that moment, given the rate of travel.  This is a simple sine and cosine computation:
    gTarg.x <- gHomeX+gRadius * Cos(tElapsed/1000*2*gPi*gSpeed)
    gTarg.y <- gHomeY+gRadius * Sin(tElapsed/1000*2*gPi*gSpeed)
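A minimal Python sketch of the same frame-rate-independent position update (the constants here are illustrative; SPEED is in rotations per second):

```python
import math

HOME_X, HOME_Y = 400, 300   # center of the path
RADIUS = 252.941            # path radius in pixels
SPEED = 0.2                 # rotations/sec (one rotation per 5 seconds)

def target_position(elapsed_ms):
    """Compute target position directly from elapsed time, so a dropped
    or slow frame never desynchronizes the target from its schedule."""
    angle = (elapsed_ms / 1000.0) * 2 * math.pi * SPEED
    return (HOME_X + RADIUS * math.cos(angle),
            HOME_Y + RADIUS * math.sin(angle))
```

At elapsed time 0 the target sits at the 3 o'clock starting position (HOME_X + RADIUS, HOME_Y), and after half a rotation it is directly opposite, regardless of how many frames were drawn in between.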

Then, on each cycle, I check to see if the trial is over yet:

    cont <- (tElapsed < timeinsecs*1000)

To see if we are 'on' the target, we get the mouse position and check for a hit, then change the color appropriately:

    mouse <- GetMouseCursorPosition()
    inside <- Inside(mouse,gTarg)
    if(inside)
    {
        gTarg.color <- col1
    } else {
        gTarg.color <- col2
    }
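For a circular target, this kind of hit test reduces to a distance check against the target radius.  Here is a Python sketch (the function and variable names are my own, not PEBL's):

```python
import math

TARGET_RADIUS = 25  # pixels, matching the report's target radius

def inside(mouse, target_center, radius=TARGET_RADIUS):
    """True if the cursor falls within the circular target."""
    dx = mouse[0] - target_center[0]
    dy = mouse[1] - target_center[1]
    return math.hypot(dx, dy) <= radius

# Pick the feedback color based on the hit test.
color = "red" if inside((405, 300), (400, 300)) else "darkred"
```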

Don't forget to draw to the screen on each cycle:

    Draw()

Scoring is traditionally given as the proportion of time spent on the target.  But modern computers also allow us to keep track of the average distance from the target.  Here is some logic that computes total deviation (in pixels) and total time on target:

    ##compute deviation on trial, and other statistics
    diff <- Dist(mouse,[gTarg.x,gTarg.y])
    totaldev <- totaldev + diff
    totaltime <- totaltime + tDiff*inside  #increment time on target
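The per-cycle accumulation of the two scores can be sketched in Python; `samples` here is hypothetical logged data, one (time step in ms, deviation in pixels, on-target flag) tuple per cycle:

```python
# Hypothetical per-cycle log: (tDiff_ms, deviation_px, on_target)
samples = [(10, 5.0, True), (10, 30.0, False), (10, 12.0, True)]

total_dev = 0.0
time_on_target = 0
for t_diff, dev, hit in samples:
    total_dev += dev                 # deviation accumulates every cycle
    time_on_target += t_diff * hit   # time accumulates only on-target cycles

mean_dev = total_dev / len(samples)  # mean deviation over the trial
```

Dividing `time_on_target` by the total trial duration gives the traditional proportion-on-target score reported below.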

While the task is running, I decided to keep a real-time log of cursor position.  This creates a potential bottleneck, because disk writing is relatively slow compared to other computer operations and could cause problems on some setups.  If that is the case, the logging could be removed to improve performance.

    FilePrint(gFileOut, head + " " + steps + " " + tNow + " " + tElapsed + " " +
        gTarg.x + " " + gTarg.y + " " + First(mouse) + " " +
        Second(mouse) + " " + inside + " " + tDiff + " " + totaltime +
        " " + diff + " " + totaldev)
    Wait(10)  ##Needed to allow mouse to sync.
    steps <- steps + 1

At the end of Trial() we need to remove the target circles, because they will get added again next trial, but otherwise this is really straightforward.  After some playing around, I determined that on my setup, 1 RPS was way too fast for this type of task.  A value of 5 to 10 seconds per rotation seemed a better fit.  It all may depend on your mouse, your screen size, whether you use a touch screen, and parameters you can control in the test like circle diameter and target size.  So, any norms and possibly most conclusions from the physical version of the test are pretty much irrelevant here, but the test could still serve as a procedural memory task.

Here is a video of one trial:

Data Analysis
So what do the data look like?   Obviously, this is not really an experiment, because there are no independent variables being manipulated, but it still produces results.

In the current release, the data are reported in two ways.  First, there is a summary report giving you details:

PEBL Pursuit Rotor task, Version 0.1
Sat Jan 30 15:02:50 2010
PEBL Version 0.10
Participant code: test
System type: LINUX
Video Width:              800
Video Height:             600
Path radius:              252.941
Target radius:            25
Speed (rotations/sec):    0.133333
Total time per trial:     15
Number of trials:         4
Time on target:
   Round       Time on target      Mean deviation (pixels)
   Trial [1]   11823             18.3722
   Trial [2]   11884             17.5525
   Trial [3]   11919             17.2508
   Trial [4]   13005             15.083

There were four trials, and on each trial we computed time on target (out of 15000 ms) and mean deviation in pixels.  It looks like I was improving over trials, both in terms of time-on-target and mean deviation.  These scores are saved in the file pursuit-rotor-report-X.txt.  But I also save a log line at every cycle of the inner loop, in pursuit-rotor-X.txt.  With this, we can look at the actual mouse locations:

This shows overall where my mouse was, and each log line is tagged with the actual cursor position and the trial time, so that more interesting analyses could be carried out.  I think it makes a nice, brief (1-minute) psychomotor task with some real potential.  Fifty years ago, the apparatus probably cost a few hundred dollars.  Today, Lafayette's version costs $1900, and somebody is selling a Windows version for about $100.  These are certainly going to be the right approach for many users, but some folks may be able to use the one I provide, which offers a lot of flexibility and is completely free.  The PEBL pursuit rotor task is available in the latest version of the PEBL Test Battery, and more information can be found here.