Hi there, I’ve noticed a few threads about timing in SL, and I have my own question regarding this.
We have an experiment in which a one-paragraph story is displayed (as text) and left on screen for 10s. It is synced to an MRI scanner (via an RB-834), which pulses once every 2s. In the data file we see a consistent RT recording error: SL records the first pulse after about 1948ms, but the subsequent 4 pulses correctly after about 2000ms (± 2ms or so).
This pattern occurs regardless of the stimulus type or how long it is displayed. In another case, we present a crosshair (+) for 6s: the first pulse is recorded at around 1965ms, and the following 2 at 2000ms (± 2ms).
My feeling is that the time taken to draw the stimulus to the screen does not count towards the total RT, which makes precise timing difficult (especially since we'd like to sync stimulus onset with the scanner's TRs). Is there any way to either eliminate the drawing time, or to have the RT timer start at the very beginning of the trial (rather than once the text appears)?
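In the meantime, here's a rough post-hoc workaround I've sketched in Python, run outside SL on the exported data. It assumes the shortfall on the first pulse equals the draw latency and that the scanner's TR is exactly 2000ms; the function name and data layout are my own invention, not anything from SL:

```python
from itertools import accumulate

TR_MS = 2000  # assumed scanner pulse interval (one pulse every 2s)

def realign_pulses(intervals_ms):
    """Convert per-pulse RT intervals (measured from stimulus onset, as SL
    logs them) into trial-relative timestamps, shifted by the inferred draw
    latency so the first pulse lands back on the TR grid."""
    if not intervals_ms:
        return []
    onset_times = list(accumulate(intervals_ms))  # cumulative times from stimulus onset
    draw_latency = TR_MS - intervals_ms[0]        # e.g. 2000 - 1948 = 52 ms
    return [t + draw_latency for t in onset_times]

print(realign_pulses([1948, 2000, 2000]))  # -> [2000, 4000, 6000]
```

Of course this only patches the data file after the fact; it doesn't fix the onset of the stimulus itself relative to the TRs, which is what we really need.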
Many thanks in advance for any suggestions,