I’m trying to run a simple ‘flicker’ study in SuperLab 4. The idea is to present a small grey rectangle at the centre of the screen, followed by a blank white screen, then the grey rectangle again, then white, and so on for a set duration.
To ensure robust control over the rate of the flicker, I checked how long SuperLab takes to put up the image of the grey rectangle (the file is a 16 KB JPEG), and this comes out at 2.5 ms - nice. So, to produce a 10 Hz flicker (i.e. one grey and one white screen every 100 ms), I entered presentation times of 47 ms and created a list of 21 grey stimuli and 21 white. Hence 47 ms for the image plus 2.5 ms for ‘drawing’ gives 49.5 ms per screen, which repeated 42 times is 2079 ms.
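Just to show my working, here is the arithmetic I’m relying on (a quick sketch; the 2.5 ms draw time is my own measurement, not a figure from the SuperLab documentation):

```python
# Intended flicker timing. The draw time is my own measurement,
# not a documented SuperLab figure.
draw_time_ms = 2.5        # measured time for SuperLab to draw the JPEG
presentation_ms = 47.0    # duration entered in SuperLab per screen

per_screen_ms = presentation_ms + draw_time_ms  # 49.5 ms per screen
n_screens = 21 + 21                             # 21 grey + 21 white stimuli
total_ms = per_screen_ms * n_screens            # 2079 ms overall

# One full flicker cycle = one grey screen + one white screen.
flicker_hz = 1000.0 / (2 * per_screen_ms)

print(per_screen_ms, n_screens, total_ms, round(flicker_hz, 2))
```

So by my reckoning the whole sequence should run for just over 2 seconds at roughly 10 Hz.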
However, when I piloted the study, the flicker seemed to go on for longer than the 2 seconds. I tested this with a hand-held timer and it came out at 3.6 seconds - almost 1500 ms slower! Of course, I know it will take me a few hundred ms to press the button and so on, so I had our tech run a couple of timing trials, and she came out with an average of 3500 ms - again, a little over 1400 ms slower than it should have been.
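To quantify the slowdown, here is a rough back-of-envelope check using our tech’s 3500 ms average (hand-timed, so only approximate):

```python
# Rough check of the observed slowdown. The 3500 ms figure is an
# averaged hand-timed measurement, so treat it as approximate.
programmed_ms = 49.5 * 42          # 2079 ms, as intended
observed_ms = 3500.0               # tech's average hand-timed duration

overshoot_ms = observed_ms - programmed_ms  # ~1421 ms too long
per_screen_actual_ms = observed_ms / 42     # implied time per screen

print(round(overshoot_ms), round(per_screen_actual_ms, 1))
```

The implied time per screen comes out at roughly 83 ms rather than the 49.5 ms I programmed, which is what I can’t account for.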
With regard to SuperLab, I am using the buffer and not removing one image before the next is presented, as I thought this would help speed things up.
I’ve checked the screen and PC. The screen refresh rate is 60 Hz; the PC is a standard desktop running Windows 7 (32-bit) with an Intel® Core™ 2 Duo E7500 CPU @ 2.90 GHz, 4 GB of RAM (3.25 GB usable), and an NVIDIA GeForce 9500 GT graphics card.
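One thing I did wonder about (this is my own guess, not anything from the SuperLab manual): at 60 Hz the screen can only update every 1000/60 ≈ 16.7 ms, so presumably a requested duration can’t end until the next refresh. A sketch of what that rounding would do to my 47 ms request, assuming durations get rounded up to whole refresh cycles:

```python
import math

REFRESH_HZ = 60
frame_ms = 1000.0 / REFRESH_HZ  # ~16.67 ms per refresh at 60 Hz

def quantise_to_refresh(requested_ms):
    """Round a requested duration up to the next whole refresh cycle.
    This is an assumption about how durations interact with the
    monitor; SuperLab's actual behaviour may differ."""
    frames = math.ceil(requested_ms / frame_ms)
    return frames * frame_ms

actual_ms = quantise_to_refresh(47.0)  # 3 frames, i.e. 50 ms
total_ms = actual_ms * 42

print(round(actual_ms, 1), round(total_ms))
```

Even then, 42 screens at 50 ms is only about 2100 ms, so refresh rounding alone wouldn’t explain the 3.5 seconds we’re measuring - which is partly why I’m puzzled.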
Is there any way to check the timing of stimulus presentation? Can you think of why SuperLab is taking longer to present the stimuli than it is programmed to?
Thanks - David