Open Source, Portable Usability Testing Lab: Part 2 – The Parts

The last time I posted on Fedora’s portable usability testing lab, I talked about the video files it outputs and how Ray Strode helped me write a GStreamer pipeline to combine them into a single quad-screen video. (Ray has since worked out a much more efficient pipeline, and created a git repository on GNOME.org to make it available.)
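(For the curious, here is a rough, minimal sketch of what that kind of quad-screen composition can look like, written as a Python script that hands GStreamer a compositor pipeline. To be clear, this is not Ray’s pipeline from the git repository; the input file names, the 640×480 tile size, the 2×2 layout, and the quad.ogv output name are all placeholder assumptions, and it drops the audio track to keep things short.)

    #!/usr/bin/env python
    # Placeholder sketch only, not the pipeline from Ray's git repository.
    # File names, tile size, layout, and output name are assumptions; audio is dropped.
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst

    Gst.init(None)

    sources = ["cam1.avi", "cam2.avi", "cam3.avi", "desktop.avi"]  # assumed input files
    positions = [(0, 0), (640, 0), (0, 480), (640, 480)]           # 2x2 grid of 640x480 tiles

    # Pad properties that place each tile in the quad layout.
    props = " ".join("sink_%d::xpos=%d sink_%d::ypos=%d" % (i, x, i, y)
                     for i, (x, y) in enumerate(positions))

    # One branch per recording: decode it, scale it to tile size, feed the compositor.
    branches = " ".join(
        "filesrc location=%s ! decodebin ! videoconvert ! videoscale "
        "! video/x-raw,width=640,height=480 ! comp.sink_%d " % (src, i)
        for i, src in enumerate(sources))

    # Compose the four tiles and encode the result as Ogg/Theora.
    pipeline = Gst.parse_launch(
        "compositor name=comp %s ! videoconvert ! theoraenc ! oggmux "
        "! filesink location=quad.ogv %s" % (props, branches))

    pipeline.set_state(Gst.State.PLAYING)
    # Block until the composed file has been written (EOS) or an error occurs.
    bus = pipeline.get_bus()
    bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                           Gst.MessageType.EOS | Gst.MessageType.ERROR)
    pipeline.set_state(Gst.State.NULL)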
Well, in this post I’d like to review the equipment that’s in this lab. I’ve started to put together some documentation on how to assemble the lab, use it, and process the data it produces. This is really just an excerpt of the Assembly Instructions document I started writing in the Fedora wiki:

  1. Carrying case – it’s an Opteka equipment case, with foam padding on the inside, cut out to create fitted compartments for the parts. Here’s a photo showing all the parts in the case.
  2. Microphone and microphone plate – it’s a Crown omnidirectional mic. The pickup really isn’t that great, so I’m going to be looking to either get some phantom power for it or maybe replace it with a better one at some point.
  3. D-SUB connector video cable (for scan converter) – this connects the input of the scan converter to the output of the computer you’re running the tests on, so that the DVR can record the desktop. This means all desktop recording happens outside the test computer, so the person you’re testing won’t notice any lag or sluggishness due to recording.
  4. Scan converter USB power cable – this whole lab requires 6 power outlets, so thankfully the scan converter doesn’t require one more. It simply draws its power from a USB port on your testing computer.
  5. Scan converter box – the scan converter takes the video out of your testing computer (up to 1024×768) and turns it into a signal the DVR can record.
  6. Camera AC power supplies (3) – the cameras are externally-powered by these.
  7. Camera stands (3) – these screw into the bottom of the camera cylinders and are directionally adjustable.
  8. Cameras (3) – these are Sony color security cameras and have very good pickup in dark conditions.
  9. Camera BNC & power cables (3) – these are 10 feet long – a bit too long to be honest – but they get the job done, connecting the cameras to the back of the DVR.
  10. Video cable for scan converter (RCA / BNC) – this allows the scan converter to send its signal to the DVR. One end has a BNC adapter so it can plug into the DVR.
  11. DVR / video mixer – this is an AverMedia AverDigi SATA+ embedded Linux security DVR unit. It takes the signals from the three cameras and the scan converter, as well as the microphone, and records them all into video files. As discussed previously, the file formats output by this little beastie are a bit of a pain, but GStreamer comes to the rescue and transforms this box into a workable solution. (By the way, here’s what the ports on the back look like.)
  12. DVR power supply – power for the DVR.

For a fuller run-down of the equipment, as well as a list of additional equipment you’ll need, see: https://fedoraproject.org/wiki/Design/UsabilityLab/Assembly_Instructions. By the way, the specs and pricing for the parts listed above are also available on the Fedora wiki now, but I do intend to clean up that page to be a little more useful / readable in the future.

13 Comments

  1. Astonishingly cool, Mo. My big questions:
    1. Who will get to use this kit?
    2. What will you do with the data?

  2. Usability testing is a laboratory method that is only possible in the field because of the skill of the researcher. Fancy field equipment doesn't guarantee useful data. I don't understand the purpose of putting together an expensive device to record sessions when the video has limited value. Even when you're a professional, it provides only a little value: you can make highlight reels (pure marketing) or do performance-based testing (which is rarely done in industry because of cost), and having a good observer is worth much more. It's also not like you can produce these en masse and hand them to untrained "field researchers", because your data is only as good as the person who is collecting it. Going through community-submitted video would be tedious for a minimal return. For companies or projects conducting usability testing on a minimal budget, training user researchers is worth more than building expensive recording devices. What's the point of going through the trouble of organizing and conducting a usability test for low-quality data?

    1. seele, this seems like an extremely negative position to take.
      From your bio, it looks like you are an HCI professional. Well done, and it's clear that you know plenty on this topic. But surely, somewhere between The Professional Observer (100% guaranteed insight!) and The Hyperactive Chimp (0% guaranteed insight!), there are ways for Ordinary But Interested Folks to have an effect?
      The goal of this kit is clearly to capture, as much as possible and in as cheap a way as possible, the observable behaviors of real, honest-to-God novice users. You know as well as anyone else does, given your experiences, how infrequently developers get to experience this kind of feedback. I would think that *any* truly honest feedback would be superior to the "none at all" that is the norm. One person's "low-quality data" is another person's gold mine.
      So why be hatin'?

      1. @Greg simple return on investment. analyzing usability tapes takes approximately twice the time of the test, and when the experiment isn't run well you are reducing the amount of useful information you can acquire. now there is a huge burden on usability engineers to take that data and spend time analyzing it, when they could be using that same amount of time doing something with a higher roi.
        the last thing i want are hundreds of hours of bad open source usability testing floating around on the internet. one, it puts the burden on usability engineers to spend time reviewing the data instead of doing other things. two, it isn't hard to notice that someone is leading or the test protocol is flawed, and amateur testing could taint the perceived value of usability testing. three, inexperienced/untrained people will attempt to analyze data to use as evidence rather than correctly interpreting it.
        it's not that ordinary but interested folks shouldn't be involved, they just shouldn't be trying to conduct research methods which have high logistic and experience requirements. there are other methods which could be specially crafted to minimize the impact of the person collecting the data.
        "usability testing" is not a silver bullet. it is one of a number of methods which can collect certain types of data. it requires trained personnel, requires a lot of planning, and takes longer than most methods. usability testing doesnt tell developers much about their users, just what their users are doing. without a good moderator, you don't ever find out "why", which is the most important piece of data you could hope for in usability testing. if developers what to learn more about their users to make informed design decisions, then start with user interviews and documenting user needs and goals.

      2. @Seele I take it you're not a big fan of Steve Krug.
        Don't worry, I'm not going to ruin everything for you by building this lab. Seriously. Calm down.

    2. I disagree that putting together highlight reels or having video to show developers is pure marketing. I've done rounds of usability testing in the past with only pen and paper, occasionally with an audio recorder, and gone back to developers. There have always been scenarios in which some error or problem with the software came up during a test, and the developers have been confused or unsure how to fix the problem because I didn't jot down the full error text and we didn't have a screenshot to see what happened, what exactly the user clicked on before the error occurred, and what page the user started from. If I had video on hand I could refer them to, they could see exactly what happened, with a screenshot. You can even make out the URL in the browser location bar using these videos. They are also handy for my own reference, when I'm looking to see how long, for example, a page in a web application took to load. (Very difficult to do on the fly; you don't know a page load is going to take a long time until it takes a long time, at which point it's too late to look at your watch to start timing.)
      I used to be in the camp that video equipment is a waste of money. I've been doing usability tests with only pen and paper and occasionally an audio recorder for years. This kit cost $850, and has honestly already provided me with enough value that I think it's worth the money and trouble.
      I agree that you can't just buy equipment and send it around and expect quality data back. I hope I didn't make that impression with anything I wrote. I am rather hoping to be able to bring this kit to events and try to spark an interest in potential contributors and teach them how to run tests. Then at some point I would hope that I could train interested contributors enough that they could run their own testing sessions.
      I'm under no misconception that sending an $850 box of goodies to people is going to equal high-quality, usable data without a good deal of leg work that is going to include finding and training interested folks, and I'm surprised you seem to think that is the case.

      1. @mairin:
        "I agree that you can’t just buy equipment and send it around and expect quality data back. I hope I didn’t make that impression with anything I wrote. I am rather hoping to be able to bring this kit to events and try to spark an interest in potential contributors and teach them how to run tests. Then at some point I would hope that I could train interested contributors enough that they could run their own testing sessions."
        sorry, it wasn't clear to me what your plans were with the kits. portable usability testing kits have been a topic that has come up again and again over the past few years, for the purpose of simply collecting data without any regard to having an actual trained moderator or research plan. so i'm overly sensitive to the topic. i'm glad to see you would want to use this with trained personnel.
        certainly if you are doing lab testing and you have the equipment with you, it is worth recording for the reasons you state. i see it as a perk if you happen to have the equipment, but i never rely or plan on it for field testing.
        in terms of training, i've done a few experiments where i trained LoCo members on how to run a very specific test and i did the observation and data collection. the usability task was very specific, and we spent a LoCo meeting reviewing usability testing, exploring the task, and practicing moderation for various scenarios surrounding the task. then on the day of testing, each LoCo member brought a friend/family member and guided that person through the task while i observed, took notes, and helped the moderator or participant if they got stuck. afterwards, we all got together and discussed the sessions as a whole and came up with a list of findings. overall, it was a very interesting experience. there was a lot of overhead necessary to "train" the LoCo members for the test, but it got the community involved and was less of a logistical burden than if i had to do it all myself. i would like to see more of this type of testing done, but unfortunately it still requires a trained usability engineer to be present for planning and data analysis.

      2. @Seele That is exactly the sort of thing I'm planning to do with this at events. Which is why I'm a bit miffed you'd assume otherwise without knowing. It's usually not a good idea to assume the worst in others.

  3. Hey Greg, well…
    1) I'm honestly not sure who would be interested in using it, nor am I sure about the logistics of shipping it. I don't think the case it's in now is sturdy enough to trust for shipping it around, so we'd probably need to get a Pelican case or something of similar quality. I do intend to bring it to every FUDcon event I attend (Toronto this Dec will be its debut 🙂 ), and I am hoping to find volunteers there with an interest in learning to run it, to help me keep it running for the length of the event. I'm not sure if ambassadors would have any interest in helping run tests at events or in their locations.
    I'm trying to document it as much as I can not only to help enable others to use it, however we end up getting it into their hands, but also so that others could build their own. I brought it over to the GNOME Boston Summit last weekend and the GNOME Foundation has some interest in assembling a similar one to ship around to events.
    Oh and just to state the obvious – I keep this in my cube at work and have been using it there as much as I can find willing volunteers. 🙂
    2) So here's an example of what I've been doing with the data thus far, and as I finish up the first round of Fedora Community testing you'll see more in that space, plus I'll do a full blog write-up like this one about that process: https://fedoraproject.org/wiki/FedoraCommunity/Us
    The first bit of data: I fill out one of these task sheets, which basically serves as the script for the test, along with my own live notes and observations during the test: https://fedoraproject.org/wiki/FedoraCommunity/Us
    At the end of the test I transcribe it like so: https://fedoraproject.org/wiki/FedoraCommunity/Us
    After I transcribe a set of notes I try to break them down both by topic and state (good, bad, meh), referring to the video as needed to determine how long specific tasks *really* took, possibly taking screenshots of particularly troubling scenarios (haven't used screenshots for this yet): https://fedoraproject.org/wiki/FedoraCommunity/Us
    Of course I also process the videos; I've been trying to upload them to blip.tv but somehow the FLV conversion always fails, so I'm gonna figure out some way to do that on my own so I can just use blip.tv to host them. (I use blip because they're the only video hosting site I've found that's supportive of Ogg and OGV.)
    For Fedora Community I'm planning to run 5 tests. I've got 3 so far. They tend to be 30 mins apiece. Once I've got five, I'll have 2.5 hours of video.
    I think what I should do, since 2.5 hours of usability test video is probably boring for the devels to watch straight through, is put timestamps on every single bullet point on the analysis and takeaway summary page, and use that document to go over everything with the developers. When they are confused about a point I can simply refer them to the video at that timestamp so they can watch it firsthand.
    This would be a whole lot easier if Totem (or any other free media player, honestly) supported CMML. CMML lets you mark timestamps in videos with notes, so I could easily set up bookmarks within the video using it. Then the developers could open the videos in, say, Totem and see the list of bookmarks in the sidebar, and I'd just have to tell them, "go to the bookmark 'task set 2 question 3' to see what I mean."
    Ok enough geeking out on CMML. Hopefully that makes sense?

  4. "…The pickup really isn’t that great, so I’m going to be looking to either get some phantom power for it…"
    Don't apply phantom power to a microphone unless you *know* it supports it!
    It won't improve its pickup, and could possibly melt it…
    Condenser microphones need quite a lot of power; however, USB condenser microphones are appearing. Try the Samson C01U mic: http://www.samsontech.com/products/productpage.cf
    (Phantom Power) http://en.wikipedia.org/wiki/Phantom_power

    1. Thanks for the tip, TGM! I'm a bit out of my element when it comes to mics so I'll keep that in mind. Are you familiar with the mic? These are the specs:
      Crown Sound-Grabber-II Conference Microphone http://www.amazon.com/Crown-Sound-Grabber-II-Conf
      It is actually a condenser mic, but it just takes batteries.
      Thanks for the mic suggestions. I can't use USB, though, because of the connector on the DVR device – it's actually bare wires…!

  5. […] today I had the Fedora portable usability lab set up in room D and I ran through 4 usability tests of Fedora Community with Fedora package […]
