picture format?

just wondering what the best format to use for stimulus picture files (jpeg, bmp, tif). i am using a mac. thanks

Much depends on the type of visual stimuli. If using scanned images or bitmaps, I’d recommend JPG, GIF, or PNG. If presenting geometric shapes (“objects”), I’d recommend the PICT file format, but only if you have no plans ever to move the experiment to a PC: of these formats, PICT is the only one that is not cross-platform.

I agree that it depends on the stimuli.

GIF is also good for geometric shapes. I wouldn’t recommend GIF for photos.

Drawing speed

I’d be interested in hearing about how image file type (and size) interacts with drawing speed. I imagine that jpg’s take longer to unpack and copy to the screen buffer than an uncompressed format, although the file system access would take longer.

Also, is much time saved by reducing the number of bits per pixel, either in the file or at runtime on the display?

Greg Shenaut

You can easily see this difference in the event editor of all versions of SuperLab 4. If you draw your image to an offscreen buffer, the time you see when previewing is the amount of time it takes to copy the image’s offscreen buffer to the OS’s offscreen buffer (not including waiting for VSYNC or flushing to the screen). If you don’t draw to an offscreen buffer, the time also includes decompression.

For Apple’s “Stones” desktop image at 512x320, my Mac Pro takes less than a millisecond to draw the buffered copy of the image, while it takes about 10ms to draw the unbuffered version.

If you buffer, decompression happens either between trials or prior to the start of the experiment (depending on the “Memory Management” experiment option). If you don’t buffer, decompression happens at event presentation. The only reason I can think you might not want to buffer is if you are really skimping on RAM.

As long as you are buffering your image, the size will make more of a difference in drawing time than anything else. The offscreen buffer is always 32 bits per pixel (so your display should always be set to 32 bits, or millions of colors), which means that any bit depth conversion is done when drawing to the image’s offscreen buffer.
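A quick back-of-envelope follows from this: since the buffer is always 4 bytes per pixel regardless of the file's bit depth, you can estimate a buffered image's RAM footprint from its dimensions alone. Using the 512x320 "Stones" image mentioned above:

```shell
# Each buffered image occupies width x height x 4 bytes in RAM,
# regardless of how many bits per pixel the file itself uses.
# For the 512x320 example above:
echo $((512 * 320 * 4))   # 655360 bytes, i.e. 640 KB
```

So reducing bits per pixel in the file may shrink disk size and decompression time, but it won't shrink the buffered copy in memory.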

One other Mac-specific detail… when you draw your images to an offscreen buffer, antialiasing is enabled, but it’s disabled when you don’t. If your images are resized by SuperLab, you’ll notice a major difference in quality between buffering and drawing straight to the screen.

It took me months to get the drawing code where it is today, and it’s still receiving minor improvements. Windows, OS X 10.3, and 10.4 each work differently.


What happens if you select “load them all before starting the experiment” and you run out of memory? Does the system start swapping like crazy, or do you get an error message?


Unfortunately, you don’t actually run out of memory until you hit the 4GB memory limit. So, yes, it will swap. It only swaps “like crazy” if you go way over your memory limit. You can watch your swap activity using “vm_stat” in the terminal.

Enter the following command in the Terminal:

vm_stat 15

This will output a new line every fifteen seconds. The last column is the important column (pageout). If you have regular pageouts while running anything, you don’t have enough RAM for what you are doing, whatever it may be.

I think that I am having memory issues with a program that I am trying to bring up in SuperLab 4.0, and I am hoping that your suggestion about entering the vm_stat 15 command in the “Terminal” will help me determine whether this is true.

My only problem is I have no idea what you mean by “the Terminal”. Could you be a bit more explicit for a mere semi-geek?


  1. Double Click “Macintosh HD”
  2. Double Click “Applications”
  3. Double Click “Utilities”
  4. Double Click “Terminal.app”
  5. You’ll probably see a window with something like this:

beast:~ whschultz$

with a cursor blinking to the right of the $. Type “vm_stat 15” without the quotes, so it looks like this:

beast:~ whschultz$ vm_stat 15

And then press return. You’ll see something like the following:

beast:~ whschultz$ vm_stat 15
Mach Virtual Memory Statistics: (page size of 4096 bytes, cache hits 24%)
free active inac wire faults copy zerofill reactive pageins pageout
266166 189914 436118 157269 550877453 35394959 232029769 1130214 1180555 64980
265817 190133 436468 157049 9308 2 7165 0 0 0
263679 190122 438617 157049 11027 1 9026 0 0 0

A new line will be spit out every fifteen seconds until you stop it (close the window, quit Terminal, something like that). The last column is pageout. This is what you want to look at. One pageout is only 4KB, which is nothing to worry about. 1000 is 4MB, which is something to notice. 100000 is 400MB, which is a LOT.

Note that the very first line after the column names shows the totals since the computer booted up. Ignore this line.

Optimally, the last column will be full of zeroes, but you have to have a lot of RAM for this to happen (I have 4GB, and it’s usually all zeroes, but not guaranteed). Basically, if you have a bunch of big (greater than 1000) numbers in the last column when you run a SuperLab experiment, you might want to think about getting more RAM. There’s no hard rule here. It’s a bit subjective.
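If you want to watch just the pageout column without the rest of the noise, an awk one-liner like this can filter it. This is a sketch that assumes the column layout shown above (banner line, column-name line, then one cumulative since-boot line before the periodic samples); vm_stat's exact columns can vary between OS X versions, so check yours first.

```shell
# Print only the last (pageout) column of each periodic sample,
# skipping the banner, the column names, and the since-boot line.
# $NF is awk's last field; NR is the current line number.
vm_stat 15 | awk 'NR > 3 { print $NF }'
```

Multiply each number by the page size (4096 bytes here) to get the amount paged out per interval.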

I hope this helps!

oops! I forgot to say I am running on a PC (XP OS).

On XP, if you press ctrl-alt-delete, it brings up the task manager. Under the Performance tab, it shows you how much of your Physical Memory is available. I’m not a Windows expert in this area, so I could be way off here, but my impression is that a low amount of available memory and a high amount of Page File Usage probably means you don’t have enough RAM. Perhaps someone more experienced on Windows can speak up here.