Human test suite

We try to automate as much of our testing as we can at Poll Everywhere, but it’s inevitable that a human needs to test the software to make sure it’s working (in addition to the automated tests). We’ve also found it’s a good idea to start building automated integration tests by first running humans through our test suite.

Want to run a human test suite (HTS) at Poll Everywhere?

Put together a testing spreadsheet

Start by creating a Google Sheet with these elements on it:

  1. A column with people’s names in it. If you skip this, your HTS enters the realm of diffusion of responsibility, and nobody will know what to test. Reach out to the people you want to HTS your software, ask what platform they have, and try your best to align your test suite with their configuration. If that’s not possible, you might set up a VM or give them remote access to a machine with the configuration you’d like to test.

  2. A column with the specific configuration each person should test. You’ll want to personally check in with people and tailor the test suite to their environment. That’s not always possible, but make a serious effort to keep the HTS as easy as possible and to respect people’s time: they’re doing you a huge favor by testing your warez. Make life easy for them; have VMs ready, and so on.

  3. Columns for each test step and its expected output.

  4. Formatting. This might seem nitpicky, but if you’re asking a non-technical person for their time to find all of the bugs in your software, you should want to make their experience as painless and pleasant as possible. Those little formatting tweaks help.

    • Conditional formatting. If it’s a pass, make it green. If it’s a failure, make it red.
    • Freeze the panes! The header row and the left column should “stick”. This makes it easier to see the test case and expected results from the nosebleed seats in column Z.

If you do it right you’ll end up with something like this:

Example HTS document for our Mac App

Share with the team, but start small

Don’t shotgun-blast the whole company with an HTS. Usually somebody finds a bug in the first few steps, which grinds everybody else’s testing to a halt, and you end up with 10 copies of the same ticket.

Start small: one person who hasn’t worked on your team makes a good first round. Once you fix that first list of issues, invite 3 more people for a total of 4 testers.

Keep iterating into a bigger group with edgier configurations as you fix issues. 4 people might be enough for a trivial change. For something more complicated that tackles more platforms, it’s appropriate to bring the entire company in for testing and dogfooding.


When we trust the HTS coverage, we start automating it. That could mean integration specs or a Rainforest script.

If we automate first, we might be automating bad test cases. Running humans through the suite first results in a much higher-quality test suite that catches regressions more effectively.
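To make that automation step concrete, here’s a minimal sketch of how rows from an HTS spreadsheet can become data that an automated test iterates over. The column names mirror the sheet described above, but the steps, the `HTSRow` type, and the `perform` stand-in are illustrative assumptions, not Poll Everywhere’s actual suite or API:

```python
# A hedged sketch: HTS spreadsheet rows as data driving an automated check.
# Steps and results here are made up for illustration.
from dataclasses import dataclass

@dataclass
class HTSRow:
    step: str       # what the human tester does
    expected: str   # what they should see

SUITE = [
    HTSRow("Open the app and sign in", "Dashboard loads"),
    HTSRow("Create a multiple-choice poll", "Poll appears in the list"),
]

def perform(step: str) -> str:
    """Stand-in for driving the real app; returns the observed result."""
    canned = {
        "Open the app and sign in": "Dashboard loads",
        "Create a multiple-choice poll": "Poll appears in the list",
    }
    return canned[step]

def run_suite(suite):
    """Mirror the sheet's pass/fail cells: green is 'pass', red is 'fail'."""
    return ["pass" if perform(row.step) == row.expected else "fail"
            for row in suite]

print(run_suite(SUITE))  # ['pass', 'pass']
```

Once the rows live in data like this, each one can graduate into a real integration spec that drives the app instead of returning canned results.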