This is a real oddball suggestion. In our lab, we have three computers for testing subjects on site, plus two that we use to test AD patients in their homes. In the past, we have had problems coordinating our counterbalancing schemes across the five computers, and in a few cases we have even lost data when a given subject “slot” was tested on more than one of the computers. As a result, I’ve been working on a way to use a central server (in our case, a me.com group iDisk, formerly mac.com) from which the individual computers “check out” subjects, scenarios, and stimuli, and “check in” full or partial data sets. The scheme would work with any remotely accessible file system, but that’s not the point of this post.
Perhaps the general problem could and should be dealt with at a higher level by enabling “cloud” data management within Superlab itself. It would take some thought to work out the details, but in effect, there could be URLs in the Superlab environment that could be used to select an experiment, group, and subject to run, and to write, view, and manage the data. It could all be done via SSL with good authentication to keep the data secure. It could be used with a single computer, but it would become more useful as more computers ran experiments in different locations. It would be especially useful, obviously, for multi-lab collaborative studies.
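Just to make the URL idea concrete: each lab computer could ask a central server, over SSL with basic authentication, for the next subject to run in a given experiment and group, then post the data back when done. The host name, paths, and parameters below are entirely made up for illustration; this is a sketch of the shape of the thing, not a proposed API.

```python
import base64
import urllib.request

def make_request(url, user, password, data=None):
    """Build an HTTPS request carrying basic-auth credentials."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(url, data=data)
    req.add_header("Authorization", f"Basic {token}")
    return req

# Hypothetical endpoints a cloud-enabled Superlab might expose:
checkout = make_request(
    "https://sync.example.com/experiments/stroop/groups/AD/next-subject",
    "lab1", "secret")
checkin = make_request(
    "https://sync.example.com/experiments/stroop/subjects/s017/data",
    "lab1", "secret", data=b"trial,rt\n1,523\n")
# urllib.request.urlopen(checkout) would then fetch the assignment.
```

The same URL scheme would naturally cover viewing and managing data too, since each experiment, group, and subject already has an address.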
One specific use would be a me.com-style “sync” function, where Superlab updates, scenarios, and preferences could be synced to all participating labs’ computers automagically.