A Survey of Survey Tools

This page was last updated: 19 November 2008.
Dates of evaluation of survey tools are in the “Roll-Call of Tools” section.
A “Changelog” at the end of the document lists changes for the updates to this page.
The content and comments on this page are the professional opinion solely of the Web Accessibility Center and do not represent the opinion of The Ohio State University or any organizations or units affiliated with the WAC.

Please note that some of the information on this page is out of date.

This document presents an overview of a number of popular tools for conducting surveys online. We have no hard statistics on which of these tools are currently being used at OSU. We have either taken surveys delivered by OSU entities via one of these tools or have been contacted by web administrators who have used or currently deploy the tool within their department or unit. At the end of this document we mention a few other tools we came across in our research.

In the comparisons and analysis below, we attempt to assess the degree to which the survey tools—for both the administrator and survey-taker—are accessible to keyboard alone and to a screen reader. Also important, particularly for people with cognitive disabilities, are a logical page organization (visually and at the level of the HTML), easy recognizability both of the survey-taker's progress through the survey and of the feature sets available to the administrator, and visual usability factors such as use of open/white space, proximity of related elements, and visual contrast between elements, all of which can help survey-takers make selections on multiple-choice and matrix question types.

For accessibility testing, we used the JAWS screen reader and attempted all input with the keyboard alone.

It is relatively easy to make static form elements accessible. And, by and large, survey-taker interfaces in all of the tools lack dynamic interface elements. That is, there is little AJAX or use of JavaScript to manipulate the display, though some tools use JavaScript for tooltips and to trigger selection of elements in certain question types. We found more extensive use of dynamic techniques in the administrator interfaces, which tended to make them less accessible. The markup we looked for includes proper label elements, fieldset and legend groupings for related controls, true table headers on matrix questions, document headings, and alternative text on images.
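To make that concrete, here is a minimal sketch (our own hypothetical question, not taken from any of the tools) of the kind of markup we checked for: a fieldset and legend grouping a radio question, an explicit label on each control, and a heading for navigation:

    <h3>Question 1</h3>
    <fieldset>
      <legend>How often do you ride the bus?</legend>
      <input type="radio" name="q1" id="q1-daily" value="daily">
      <label for="q1-daily">Daily</label>
      <input type="radio" name="q1" id="q1-weekly" value="weekly">
      <label for="q1-weekly">Weekly</label>
      <input type="radio" name="q1" id="q1-never" value="never">
      <label for="q1-never">Never</label>
    </fieldset>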

There are some basic HTML techniques survey authors ought to be aware of, regardless of the accessibility of the tool they are using. So after discussing the survey tools, we offer a few tips for creating accessible surveys.

Roll-Call of Tools

Without further ado, the patients currently on the operating table: SurveyGizmo, SurveyMonkey, Zoomerang, LimeSurvey, Checkbox, and Snap Survey.

For our tests, we composed a survey in each tool. The first seven questions in each survey are either identical or as close to identical as we could muster, given the idiosyncrasies of the tools and question types. We then imagined 10 survey-takers and had “them” fill in the overlapping questions of each survey identically. We did this to get a sense of how the statistics collected by each tool stack up head to head.

Question Types

We list only those features available at all package levels and try to constrain our choices to question types that are common to more than one survey tool. All of the remotely managed (non-locally installed) survey tools provide more question types and, in some cases, greater flexibility in collection of data (uploads of data files, for instance) at the higher-cost package levels.

Basic Question Types Available in the Survey Tools
(columns: SurveyGizmo / SurveyMonkey / Zoomerang / LimeSurvey / Checkbox / Snap Survey)

Text Field/Short Answer: yes / yes / yes / yes / yes / yes
Text Area/Long Answer: yes / yes / yes / yes / yes / yes
Table of Text Fields: yes / yes / no / no / yes / yes
Radio Button Multiple Choice (high-contrast comparison of radio and checkbox controls): yes / yes / yes / yes / yes / yes
Checkbox Multiple Choice: yes / yes / no / yes / yes / yes
Drop-down Multiple Choice: yes / yes / yes / yes / yes / yes
Table of Radio Buttons: yes / yes / yes / yes / yes / yes
Table of Checkboxes: yes / yes / yes / no / yes / yes
Table of Drop-downs: yes / yes / no / no / yes / no
Rating Scale: yes (customizable) / yes / yes / yes / yes / yes
Rank In Order of Preference: yes / no / yes / yes / no / yes
Star-based Ranking (high-contrast view of star ranking): yes / no / no / no / no / yes (Snap star-ranking is a horizontal group of radio buttons with stars above each increment; because of this implementation it is accessible, though it doesn't behave like the JavaScript-enabled star ranking most users are familiar with)
Pre-Set Contact/Demographic Information Fields: yes (customizable, with validation) / yes / yes / no / no / no
Choose Between Images: yes (alt text and captioned, screen reader but not keyboard-only accessible) / no / no / no / no / yes (alt text and captioned, screen reader and keyboard-only accessible)
Continuous Sum: yes (percentages, numbers toward a total) / no / no / yes (see “multiple numerical” question type) / no / yes (percentages, numbers toward a total)
Admin Can Add Instructions or Other Copy: yes / yes / yes / yes / yes / yes
Admin Can Add Video & Images: yes (with alternative text descriptions and styling hooks) / yes (images) / no / yes (images, by editing template) / yes (images) / yes (images and video)
Progress and Other Feedback: yes (feedback in charts or graphs (in pay-for packages) and progress indicator) / yes / no / yes / yes / yes
Make Answers Mandatory: yes / yes / yes (not all question types) / yes / yes / yes
Field-Level Validation: yes (some limited in free package, extensive in pay-for packages) / yes (limited to specific question types) / no / yes (through limits and regular expressions) / yes (pre-defined formats, such as social security number, postal codes, etc.) / yes

Special Question Types/Features Available in Pay-For Packages

Three of the survey tools (SurveyGizmo, Checkbox, and Snap Survey) are able to “pipe” responses from one question to other questions in the survey, thus controlling the flow of the survey. SurveyMonkey, Snap, SurveyGizmo (at pay-for package levels), and Checkbox can randomize the question order, and SurveyGizmo will even randomize rows within a matrix. SurveyGizmo and Checkbox have APIs for customized integration.

Reporting Features

Reporting Features Available in the Survey Tools
SurveyGizmo (screen shot of SurveyGizmo online report)
  Characteristics of online report: view individual survey responses; demographic and geotracking info (pay-for packages); survey traffic graph; charts; tables showing counts, percentages, and calculated values
  Display and export formats: HTML, MS Excel, CSV, and SPSS (with Enterprise package)

SurveyMonkey (screen shot of SurveyMonkey online report)
  Characteristics of online report: view individual survey responses; tables with statistical data, including percentages and counts
  Display and export formats: HTML, CSV, XML, PDF

Zoomerang (screen shot of Zoomerang online report)
  Characteristics of online report: view individual survey responses; tables with statistical data, including percentages and counts
  Display and export formats: HTML, CSV

LimeSurvey (screen shot of LimeSurvey online report)
  Characteristics of online report: view individual survey responses; tables with statistical data, including percentages, counts, and (where appropriate) calculated values, minimum, maximum, standard deviation, and quartile breakdown
  Display and export formats: HTML, MS Word, MS Excel, CSV, SPSS

Checkbox (screen shot of Checkbox online report)
  Characteristics of online report: view individual survey responses; charts and tables showing percentages and counts
  Display and export formats: HTML (can publish reports to unique, open URL), CSV, SPSS

Snap Survey (screen shot of Snap Survey online report)
  Characteristics of online report: view individual survey responses; charts and tables showing percentages, counts, mean, and standard deviation
  Display and export formats: HTML, Triple S, SPSS, SAV (Snap format), ASCII, MS Access, MS Excel, MS Word, PDF, CSV and tab-separated values

Packages and Costs

Packages, Costs, and Features for the Survey Tools
(columns: package / number of questions / number of surveys / number of responses / cost)

SurveyGizmo
  Basic: unlimited / unlimited / 250/mo / FREE
  Personal: unlimited / unlimited / 1000/mo / $19/mo
  Pro: unlimited / unlimited / 5000/mo / $49/mo
  Enterprise: unlimited / unlimited / 50000/mo / $159/mo
SurveyMonkey
  Basic: 10 / unlimited / 100/mo / FREE
  Monthly Pro: unlimited / unlimited / 1000/mo / $19.95/mo
  Annual Pro: unlimited / unlimited / unlimited / $200/yr
Zoomerang
  Basic (data kept for 10 days): 30 / unlimited / 100/survey / FREE
  zPro for Education: unlimited / unlimited / unlimited / $350/yr
  zPro for Non-Profits: unlimited / unlimited / unlimited / $350/yr
  zPro for Business: unlimited / unlimited / unlimited / $599/yr
CheckBox (installed locally)
  Professional: unlimited / unlimited / unlimited / $3,500, education pricing, 1 - 2 users
  Workgroup: unlimited / unlimited / unlimited / $4,600, education pricing, up to 10 users
  Enterprise: unlimited / unlimited / unlimited / $34,000, education pricing, unlimited users
CheckBox (remote hosted)
  Basic: unlimited / unlimited / unlimited / $1,450, 1 user
  Professional: unlimited / unlimited / unlimited / $2,250, 1 - 2 users
  Workgroup: unlimited / unlimited / unlimited / contact Prezza
  Enterprise: unlimited / unlimited / unlimited / contact Prezza
LimeSurvey
  FREE (open source)
Snap Survey
  Snap Professional: unlimited / unlimited / unlimited / $1,995 (cost for survey creation tool and remote hosting; local hosting via WebHost product adds $2,995)

Privacy Concerns and Survey Security

For the university research community, both security and privacy of data collection are great concerns. For some uses—informal satisfaction surveys, anonymous polls, etc.—privacy and security are probably not crucial. But for any survey that collects potentially user-identifying data or that falls under research grants with stipulations, security and, especially, privacy may be among the primary criteria influencing which tool you select. The information we provide in this section is not meant to be comprehensive, but it may be enough to narrow your choice. We recommend you go beyond the information provided on the web and contact customer support if you have concerns.

Three of the survey tools we discuss on this page can be installed on your own web server, and all of those can be set up to serve surveys only over an encrypted (SSL/HTTPS) connection. Both Checkbox and Snap Survey have versions that can be hosted remotely or locally. The open source LimeSurvey is downloaded and deployed by your web administrator. In the case of these locally installed survey tools, data privacy and security are mostly within your own hands. However, we would strongly recommend putting these tools up against a battery of SQL injection and cross-site scripting attacks to verify that the surveys themselves cannot be used to discover stored data. And, of course, you should make sure your database and web server are protected from hackers.

You can find more information about web security standards for servers at The Ohio State University Web Security Standard web site. The OSU Office of Research Institutional Review Board maintains policies regarding collection and protection of data from human research.

As we mentioned, with Snap Survey, full security could be achieved by running the WebHost product on your own server. The Snap Survey licensing agreement discusses privacy of customers, not survey-takers. We could find no Safe Harbor or HIPAA language on their site, likely because Snap Survey is developed in the U.K.

SurveyGizmo offers SSL connections at the Enterprise level, along with unique, non-SurveyGizmo URLs. Their privacy statement states that staff will not access data or grant access to third parties; support requests may require staff to log into accounts, but for troubleshooting purposes only. Also, survey data can be scrubbed on written request. SurveyGizmo has self-certified with the U.S. Department of Commerce's Safe Harbor Framework and adheres to Safe Harbor Privacy Principles. SurveyGizmo has also self-certified as HIPAA compliant for handling of Protected Health Information and follows the Privacy Rule and the Security Rule provisions of HIPAA.

SurveyMonkey maintains a page to help you decide whether the SurveyMonkey data collection process is HIPAA compliant. SSL connections are available at the Professional account level for an extra charge. SurveyMonkey has met Safe Harbor requirements and is on the Department of Commerce's list of companies.

Checkbox has many features designed for security, including SSL. The Checkbox privacy policy concentrates mostly on customer data and not on survey-taker data, however. For hosted solutions, SSL is available at all levels.

Zoomerang does not appear to offer secure connections. The Zoomerang privacy policy does not discuss HIPAA or Safe Harbor principles but does claim that data is secured using “industry-standard security measures.”

Administrator and End-User Accessibility

The survey-taker experience was our primary focus in evaluating the survey tools, but we also tested the accessibility of the administrator interfaces with a screen reader and attempted all input using only the keyboard. We assign letter grades to the online surveys. We rate relative levels of accessibility for the administrator interfaces.

Survey Accessibility Grade Scale

Grade: A

For the survey-taker interface to get a grade of “A”, all of the question types must be fully accessible to screen reader and keyboard-only users. Means of determining progress through the survey must also be accessible. All question types must be usable when screen fonts are enlarged—and the fonts must be enlargeable via the browser—and when high-contrast settings are enabled in Windows. Matrix tables should use table headings on both columns and rows. All form elements must have proper labels. Headings should be used appropriately to aid navigation through the survey. Error reporting, including indication of required fields, must be unambiguous and must put focus in the offending field or in an error-reporting box with links to the problem fields.
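One way to meet the error-reporting criterion (a sketch under our own assumptions, not any particular tool's implementation) is an error box at the top of the reloaded page that receives focus and links to each problem field:

    <div id="errors" tabindex="-1">
      <p>There were 2 errors in your responses:</p>
      <ul>
        <li><a href="#q3-email">Question 3: please enter a valid email address</a></li>
        <li><a href="#q5-name">Question 5 is required</a></li>
      </ul>
    </div>
    <script type="text/javascript">
      // tabindex="-1" makes the div focusable by script, so keyboard
      // and screen reader users land on the error list when the page loads.
      document.getElementById('errors').focus();
    </script>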

Grade: B

For the survey-taker interface to get a grade of “B”, the majority of key question types must be fully accessible to screen reader and keyboard-only users. Means of determining progress through the survey must also be accessible. All question types must be usable when screen fonts are enlarged—and the fonts must be enlargeable via the browser—and when high-contrast settings are enabled on the Windows OS. Matrix tables should use table headings on at least the top row of the matrix table. All form elements must have proper labels or the screen reader must not make any errors in determining label text for elements. Headings should be used appropriately to aid navigation through the survey. Error reporting, including indication of required fields, must be clear enough for the user to easily locate the offending question and understand the error.

Grade: C

For the survey-taker interface to get a grade of “C”, some key question types must be fully accessible to screen reader and keyboard-only users. All question types must be usable when screen fonts are enlarged—and the fonts must be enlargeable via the browser—and when high-contrast settings are enabled on the Windows OS. Some form elements need to have proper labels or the screen reader must not encounter fatal errors in determining label text for elements.

Grade: F

For the survey-taker interface to get a grade of “F”, most question types present severe access problems via keyboard or screen reader, screen fonts do not enlarge (or break the visual design when they do), high-contrast settings make some inputs impossible to use, and proper labels are not used to identify form elements.

Administrator Accessibility Levels

Practically Accessible
For the administrator interface to be “practically accessible”, a keyboard or screen reader reliant user must be able to get to all controls involved in setting up and publishing a survey with little effort, and there must be no ambiguity introduced in the process by dynamic interfaces, unexpected cursor focus changes, etc.

Functionally Accessible
For the administrator interface to be “functionally accessible”, a keyboard or screen reader reliant user must be able to get to all key controls involved in setting up and publishing a survey with little effort. There may be some ambiguity in using dynamic interfaces, but these must be surmountable via documented workaround or through a reasonable amount of experimentation by the user.

Technically Accessible
For the administrator interface to be “technically accessible”, a keyboard or screen reader reliant user must be able to get to all key controls involved in setting up and publishing a survey. That is, it must be technically possible to use the key interfaces—access must be feasible, though not necessarily practical.

Not Accessible
For the administrator interface to be “not accessible”, a keyboard or screen reader reliant user would be unable to set up and publish a survey.

Summaries of Survey Tool Accessibility

SurveyGizmo
Grade: B+

Most of the question types in SurveyGizmo are screen reader and keyboard accessible, with the exception of the “image choice” and “star-ranking” types, both of which can be worked around using other question types. The “rank in order of preference” type is accessible to the screen reader but not the keyboard. Text fields highlight to show cursor focus, but radio, checkbox, and other question types do not. Images are all given proper alternative text, labels are properly associated with form elements, and matrix tables have column headers that are properly scoped, though row headers are not proper HTML table headers. Validation or required-question errors reload the page and display a high-contrast error box at the top of the screen. However, the error box is not focused on page reload (it should be), and the only way to determine which fields are required is by browsing the page. Adding to this difficulty, the “required” star indicator is not within the label tag (so it does not get read by the screen reader when tabbing to the control).
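The fix we have in mind is simple (a sketch with a hypothetical field name, not SurveyGizmo's actual markup): put the required indicator inside the label so it is announced when the control receives focus:

    <!-- The indicator sits inside the label, so a screen reader reads
         "Your name required" on tabbing to the field. -->
    <label for="name">Your name <img src="star.gif" alt="required"></label>
    <input type="text" id="name" name="name">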

Though conscious regard was given to accessibility in SurveyGizmo's design, in our opinion most of the accessibility comes from the fact that SurveyGizmo tends to closely follow web standards. They have a page dedicated to the accessibility of SurveyGizmo surveys, which links to a short review comparing SurveyGizmo and SurveyMonkey, by Karen Mardahl and Lisa Pappas in the Usability Professionals Association Voice. (Note the information on SurveyMonkey appears to be out of date, or at least does not reflect the current status of screen reader accessibility in SurveyMonkey.)

Admin. Interface: Between Functionally and Practically Accessible — The administrator interfaces in SurveyGizmo are a model of usability. Interfaces are clean, navigation elements are easy to find, and “flow” through the creation process seemed to us more intuitive than in any of the other survey tools. There are some limitations to accessibility, but they can be worked around. Specifically, when choosing question types, a right-hand panel updates to allow the administrator to enter data. There is no indication of this screen update, and screen reader users would need prior instruction or experimentation to figure it out.

SurveyMonkey
Grade: B

SurveyMonkey has put significant effort toward making their standard survey interfaces screen reader accessible. They have made very clever use of labels within most of their question types. A representative example: in the radio-grouping type of question, the first response repeats the question before announcing the label. So, if the question were “Whom do you like better?” and the responses were Batman and Robin, the label on Batman would read “Whom do you like better Batman”. The first part of the label is hidden from the visual user but provides excellent orientation for screen reader users. SurveyMonkey uses a similar trick in matrix tables and goes beyond SurveyGizmo in providing row headers in tables. Though these are not scoped, screen readers are pretty adept at handling two-dimensional tables so long as th elements are available. Another smart feature not implemented in SurveyGizmo is the focusing of question errors. If you “continue” from a page and have an error in your response, SurveyMonkey not only reloads the page, it focuses the question with the error. Better still would be focusing a box that lists the errors as hyperlinks to the problematic questions, but we feel that SurveyMonkey's approach is closer to the mark than SurveyGizmo's.
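In sketch form (our own markup and class name, not SurveyMonkey's actual code), the label technique looks like this: the question text is positioned off screen inside the first response's label, invisible to sighted users but read by screen readers:

    <style type="text/css">
      /* Positioned off screen: hidden visually, still read by screen readers. */
      .offscreen { position: absolute; left: -9999px; }
    </style>
    <input type="radio" name="q1" id="q1-batman" value="batman">
    <label for="q1-batman"><span class="offscreen">Whom do you like better? </span>Batman</label>
    <input type="radio" name="q1" id="q1-robin" value="robin">
    <label for="q1-robin">Robin</label>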

Where SurveyMonkey falls down, however, is in keyboard accessibility for users who can see the screen or who need high-contrast settings. Though at the level of the HTML SurveyMonkey uses actual radio buttons and checkboxes (unlike Zoomerang), it covers them with images that give those common form elements a unique look. Unfortunately, this disrupts keyboard accessibility: one cannot see when the group of elements has the focus solely by tabbing. A more dramatic problem arises when the user turns on high-contrast settings in Windows. Because of the way these settings are implemented in Windows, background images are disabled though the physical space they occupy is still blocked out. Thus, with high-contrast on, we no longer see any indication that checkboxes or radio buttons either have focus or are selected.

Admin. Interface: Not Accessible — It is not possible to use the SurveyMonkey administrator interfaces with keyboard alone. There is no highlighting of elements and the user has no idea where she is or what she can do to make actions occur. Though the screen reader can read the contents of many interfaces, the modal dialogs that are used very widely throughout the administrator interfaces do not seem to get the screen reader focus when spawned. Thus it is virtually impossible for the screen reader user to figure out where she is in the dialogs (or if she is even in them at all).

Zoomerang
Grade: D-

Zoomerang is accessible only to mouse users. It was not possible to get through all of the questions with the keyboard. In tabbing, the tendency was to get stuck in a question, moving over and over its options and throwing JavaScript errors with each press of the tab key. The tool uses no labels or headings, leading to ambiguity in selecting answers and a general inability to easily navigate the survey. While it was technically possible to get to all of the questions using a screen reader, the tool does not implement checkboxes and radio buttons using standard HTML form elements but instead uses custom widgets, so it is very difficult or impossible to determine which checkboxes or radio buttons have been selected. Additionally, since the interface is not tab navigable, the survey-taker has to move in and out of form input mode. We could not imagine either a keyboard or screen reader user being able to complete even the simplest of Zoomerang surveys.
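The underlying problem, in sketch form (our own markup, not Zoomerang's actual code): a native checkbox exposes its role and checked state to the screen reader automatically and can take keyboard focus, while a scripted image widget exposes neither:

    <!-- Native control: announced as a checkbox, checked or not. -->
    <input type="checkbox" id="opt1" name="opt1">
    <label for="opt1">Option 1</label>

    <!-- Custom widget: an image swapped by a (hypothetical) script;
         no role, no state, no keyboard focus. -->
    <img src="box-unchecked.gif" alt="" onclick="toggleBox(this);">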

Admin. Interface: Not Accessible — Some of the initial administrator interfaces can be navigated via keyboard and screen reader. However, once the user moves deeper into the interfaces to begin editing or creating surveys, all screen reader and keyboard access breaks down entirely. The tool implements three or four frames per page. Typically, the user gets stuck in the topmost frame and cannot get to the content of the main body/survey-editing frame. This was the case both via screen reader and keyboard. For instance, JAWS can typically pull up a list of frames or move through them by repeatedly pressing the “m” key; in the Zoomerang administrator interfaces, however, it was not possible to use these navigation techniques to get to the survey creation and editing tools.

LimeSurvey
Grade: B

LimeSurvey uses proper HTML labels for text fields and checkbox and radio groupings. A table header row is used in one- and two-dimensional matrix questions, but the left heading column is not implemented with table headers.

As with Checkbox, errors in filling in a survey are reported through a JavaScript alert that does not indicate which questions have problems. Unlike Checkbox, however, the page is updated to show which questions have errors, by adding red-highlighted text above the question inputs. The user still needs to browse the page to locate the errors.

Mandatory questions are identified with a red asterisk. Better would be an image with alternative text of “Mandatory Question” or simply “Required”, so that the requirement is announced to screen reader users as well.

The “ranking” question type is neither screen reader nor keyboard accessible. It should be avoided by those concerned with accessibility: not only is it inaccessible, it also causes the cursor to get stuck in the control. We could not tab out of the control once in it, using either the keyboard alone or a screen reader.

Since we first reviewed LimeSurvey, in April 2008, a new JavaScript calendar widget has been implemented for the date question types. This new calendar is elegant and easy to use, though not accessible to the keyboard or with a screen reader. However, the user can simply type the date in the required format, bypassing the JavaScript pop-up calendar. Also, the administrator has the option to have the date question type rendered as three HTML select menus—year, month, and day. This is a fully accessible alternative and probably the best choice since, though the user can manually input the date in the pop-up-calendar-enabled version, the pop-up calendar is triggered by an HTML button with the text “...” in it, which might confuse a screen reader user.
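The three-select rendering amounts to ordinary, fully labeled menus, roughly like this (a sketch; LimeSurvey's actual generated markup will differ):

    <label for="date-year">Year</label>
    <select id="date-year" name="date-year">
      <option>2008</option>
      <option>2007</option>
      <!-- remaining years... -->
    </select>
    <label for="date-month">Month</label>
    <select id="date-month" name="date-month">
      <option>January</option>
      <option>February</option>
      <!-- remaining months... -->
    </select>
    <label for="date-day">Day</label>
    <select id="date-day" name="date-day">
      <option>1</option>
      <option>2</option>
      <!-- remaining days... -->
    </select>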

Overall, LimeSurvey seems to be moving toward high accessibility, and there are only a few problems, as mentioned above. We would like to see more standards-aware HTML—there is still frequent use of font tags and inline styles, for instance. And simplified HTML with more hooks for fuller control of styling would really make this free, easy-to-deploy tool shine.

Admin. Interface: Technically Accessible — The administrative interface uses very few dynamic effects and is easily tab navigable. All image icons that are links have appropriate alternative text descriptions. Though there are headings on the pages, they are few (many interfaces have only a single heading, despite the fact that there are many logical sections on the interfaces).

The admin interfaces use the FCKEditor for editing long text by default. The text-editing controls of this editor are inaccessible to the keyboard and screen reader. However, the editor is collapsed by default, so essentially you have an iframe in which to edit question text. This works fine for keyboard and screen reader users, behaving much as though you were in a text area. If the administrator wants to completely disable the HTML editor, she can do so by setting the defaulthtmleditormode option in the PHP configuration file to “none.”
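Assuming the configuration format of the 1.71-era releases, that change is a single line in LimeSurvey's config.php:

    // Disable the FCKEditor entirely, leaving a plain text area
    // for editing question text.
    $defaulthtmleditormode = 'none';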

Potentially problematic are the drop-down selects for choosing the survey to edit, the question group, and the question to edit. An on-change event triggers the page refresh that directs you to the appropriate interface. This is not considered a best practice: an inexperienced user will simply arrow down through the select options, triggering the on-change event and being redirected to a new page—an effect that can be very disorienting and that, for some screen reader and keyboard users, may make it impossible to navigate through the surveys, groups, and questions. Experienced keyboard and screen reader users, however, will use the Alt key to expand the select, which bypasses the on-change event and allows them to select any option. The sketch below shows the problematic pattern and a safer alternative.
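Both patterns, sketched with our own markup (not LimeSurvey's actual code): the first navigates as soon as the selection changes, while the second waits for an explicit activation:

    <!-- Problematic: arrowing through the options fires onchange
         and immediately navigates to a new page. -->
    <select name="survey" onchange="location.href = this.value;">
      <option value="admin.php?sid=1">Customer survey</option>
      <option value="admin.php?sid=2">Staff survey</option>
    </select>

    <!-- Safer: navigation happens only when the user submits. -->
    <form action="admin.php" method="get">
      <label for="sid">Survey</label>
      <select id="sid" name="sid">
        <option value="1">Customer survey</option>
        <option value="2">Staff survey</option>
      </select>
      <input type="submit" value="Go">
    </form>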

The controls you use to set up question types and groups are somewhat confusing. They also often lack proper HTML labels, though the screen reader in most cases was able to discern the function of controls. Most confusing are the “Question Attributes” controls and the set-up of matrix-style questions. It is feasible that a screen reader user might become familiar with the idiosyncrasies of these interfaces with some guidance from a sighted user who had experience with the tool.

LimeSurvey has a configuration option called addTitlesToLinks that is geared toward improving accessibility. It has no useful effect, however: all of the icons in the admin interfaces are already given alternative text descriptions in the image tag, and the purpose of other links is clear from context. Effective or not, the option does demonstrate that the developers of the product—who are unpaid volunteers—are concerned with accessibility. We hope they will implement onfocus/onblur functionality in their pop-up tooltips to make them more accessible for keyboard-only users.

Checkbox
Grade: C+

Checkbox uses proper HTML labels for checkboxes and radio button groupings. The interface uses deeply nested tables for arranging questions, which can cause problems with screen readers. Table-based matrices use no table headers and are confusing to navigate with a screen reader. Error reporting is through a generic JavaScript pop-up message, indicating only that there are errors on the page; it’s up to the user to determine which questions have problems. Its rating scale question is potentially accessible since it is based on radio buttons. However, it lacks labels for the rating options and lists the ratings in one row of a table with the radio buttons in the row below, a configuration which makes this question type inaccessible to screen reader users.

Admin. Interface: Between Technically Accessible and Not Accessible — The administrative interface uses very few dynamic effects and is easily tab navigable. However, many images lack alternative text (though the interface icons have adequate alt text), and no headings are used. On any given survey set-up page there are multiple instances of link text for moving, copying, and deleting items, and there is no way for a screen reader user to figure out which of these links acts on which item.

Snap Survey Professional Edition
Grade: B+

The Snap home page has a prominent link to information on Snap Survey accessibility. The page notes that “Snap Professional is able to produce questionnaires which meet or exceed the highest level of web accessibility. It is able to produce questionnaires which meet Level 3 compliance (sometimes known as AAA compliance). Producing questionnaires to such a demanding standard can largely be done automatically by choosing the appropriate web publication options. Some though require care from questionnaire authors in avoiding certain layouts and applying appropriate settings.” The page links to pages that benchmark the survey tool features against both WCAG 1.0 and U.S. Section 508.

Basic user configuration recommendations having to do with accessibility on the site's pages on Section 508 and WCAG 1.0 include adding alternative text to images, ensuring link text is understood out of context, and allowing for input of “custom HTML,” such as language changes and abbreviations. More rigorously accessible “strict rules” can be enabled when producing “plain text” versions of surveys, which the tool can create automatically. The plain text versions are not text-only; rather, they are simplified HTML versions. The strict rules affect positioning of field labels, expand grids into a single column, and wrap related fields—from a row of an expanded table, for example—in a fieldset. The strict-rules plain text export does not appear, however, to use fieldsets to wrap groups of related checkboxes and radio buttons—the ideal usage of fieldsets, in our opinion. The strict rule on label positioning attempts to ensure that labels are placed in the code in such a way that a screen reader would automatically make the association, even in cases where “there is no support for explicit labelling.” We could not determine when this was the case; proper HTML labels are used in every question type we tested. As mentioned, the strict rule on expanding grids busts a grid-style question into separate question groups for each row in the grid and then wraps the question groups in fieldsets to disambiguate each form control (see the sketch below). Strict rules also remove all use of font tags, allow external style sheets to be used, a tab order to be set, and place-holder text to be added to fields, and add accesskeys to reset and submit the survey questions.
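As we read it, the strict-rules expansion turns each grid row into its own fieldset-wrapped radio group, roughly like this (our sketch, with a hypothetical question; Snap's generated markup will differ):

    <fieldset>
      <legend>The registration process was easy to complete</legend>
      <input type="radio" name="row1" id="row1-1" value="1">
      <label for="row1-1">Agree</label>
      <input type="radio" name="row1" id="row1-2" value="2">
      <label for="row1-2">Neutral</label>
      <input type="radio" name="row1" id="row1-3" value="3">
      <label for="row1-3">Disagree</label>
    </fieldset>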

We were impressed that the tool makes accommodations for screen reader input even for complex question types. The horizontal slider-based rating question type has text input fields with proper labels. So, while a screen reader or keyboard user cannot access the sliders themselves, it is possible to fill in the text inputs—though the fields only allow whole numbers to be entered, where using the slider with a mouse generates floating point numbers. JavaScript validation on the ranking (one to N) question type guarantees unique numbers in each field. However, while the error responses (in JavaScript alert boxes) are technically precise, they are not always user-friendly. Putting a bogus response in the second field of, say, the fourth question in the survey produces this error response: “Number invalid or out of range for question Q4.b.” Errors which referenced specific fields within questions were all similar to this and difficult to comprehend; other sorts of error messages were clear. There is no visual or screen reader cue of required fields by default: when moving through the survey, you would only find out whether a field was required after you submitted the page.

When a page loads with a new question on it, Snap puts the focus in the first input field on the page. This is a convenience for most users, but screen reader users will need to navigate up out of the field to read the question.

The image choice question type uses radio buttons and an image. Because of this implementation, the question type is accessible both through keyboard and screen reader. SurveyGizmo's implementation is more elegant looking, using mouse highlighting and a green check overlay for the selected image, but it is not screen reader or keyboard accessible.

In matrix-/grid-style questions, column and row headers are true HTML table headers and have the “scope” attribute set properly. This is potentially very beneficial for screen reader users. However, matrix-style questions insert extra, empty columns between answer columns to pad out the design. This may cause screen reader users some confusion, as table-based navigation is made more difficult.
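The header markup we found looks roughly like this (a sketch with a hypothetical question; Snap's actual HTML also includes the padding columns mentioned above). With scope set, a screen reader can announce the row and column headers as the user moves from cell to cell:

    <table>
      <tr>
        <td></td>
        <th scope="col">Agree</th>
        <th scope="col">Neutral</th>
        <th scope="col">Disagree</th>
      </tr>
      <tr>
        <th scope="row">The survey was easy to complete</th>
        <td><input type="radio" name="r1" value="agree"></td>
        <td><input type="radio" name="r1" value="neutral"></td>
        <td><input type="radio" name="r1" value="disagree"></td>
      </tr>
    </table>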

Overall, throughout the surveys, Snap has made overt decisions to achieve high accessibility. And its plain text export is unique. We have to wonder how often it would be used, though. We believe the goal should be a single accessible product, not two versions, though we could see the plain text version being published, along with the regular version, as an alternative for screen reader users.

Snap was developed in England, where the company is located. England's laws on web accessibility cover private and public entities, requiring both to comply, unlike in the U.S., where Section 508 covers only entities fully funded by the Federal government. We can speculate that Snap's extensive work toward accessibility is motivated by its primary market.

Admin. Interface: Not Accessible — The administrative interface is a Windows-only, installed GUI application, which facilitates creating survey questions, outputting the surveys into HTML forms (for the web and hand-held devices) and to PDF, importing survey results from an online service, and creating sophisticated reports. The interface is complex but appears to be efficient for creating and interpreting surveys once its wealth of features has been learned. In the JAWS screen reader and using the keyboard alone, however, it was not possible to use the interface beyond accessing the main menu system.

Summary of Findings

SurveyMonkey has made strides toward screen reader accessibility, and it has the best error reporting of any of the tools tested. However, the difficulty survey-takers using the keyboard alone would have in completing surveys, and the problems with high-contrast display, prevent us from scoring it above the low “B” range.

Overall, administrator-side usability and the high accessibility of the survey-taker interface (with some exceptions for certain question types) compel us to score SurveyGizmo best of the group. Snap Survey Professional Edition also merits a high “B”. All of its question types are accessible, and it can produce alternative versions geared specifically toward high accessibility. It has a few problems with error handling and focusing of pages, but beyond that the surveys it produces are highly usable, and even some complex, JavaScript-reliant question types (the slider, for example) are accessible. Snap's installed GUI is currently not accessible.

Zoomerang scores lowest. It is inaccessible to the keyboard alone and not practically accessible for a screen reader user.

Out of the box, LimeSurvey provides decent accessibility for most question types, including matrix-style questions, which include at least a single header row. LimeSurvey's error reporting is adequate: it uses JavaScript alerts that route you back to the page, and you must then review each question on the page to determine which contain errors, a process that might be tedious for screen reader users depending on the number of questions per page. Checkbox has a slightly more polished appearance than LimeSurvey out of the box and appears to be equally customizable in terms of its templates. However, the Checkbox matrix questions lack proper markup, its error reporting gives no on-page indication of which questions have errors, and its ranking question has accessibility problems.

Tips for Creating Accessible Surveys

Whatever tool you choose, a few practices recur throughout our reviews above and will improve any survey you build:
  • Give every form element a proper HTML label, and keep required-field indicators inside the label so screen readers announce them.
  • Group related radio buttons and checkboxes with fieldset and legend elements.
  • Use true table headers (th elements with scope attributes) on both the columns and rows of matrix-style questions.
  • Provide alternative text for all images, including icons and image-choice answers.
  • Report errors unambiguously, and move focus to the offending field or to an error box that links to the problem questions.
  • Avoid on-change events that navigate to a new page, and verify that every control can be reached and operated with the keyboard alone.

Changelog

A summary list of changes to this document.

Changes for 1 May 2008 update
  • Original composition of this web page was finalized. The first round of testing was completed April 4, 2008.
Changes for 25 July 2008 update
  • SurveyGizmo claims in a June 3, 2008 changelog post that they now group questions with fieldsets. We could not find this behavior for any of the question types where we figured it would make the most sense: groups of checkboxes or radio buttons.
  • SurveyMonkey updated their FAQ dealing with accessibility in June 2008. The software was “certified” 508 compliant by RampWeb, and a representative from SurveyMonkey contacted us via email to indicate their 508 certification. We have retested the product and do not see any changes from when we tested in April, either to the survey creation interfaces or to the use of background images in radio and checkbox questions. As we note in our review, the latter make the surveys impossible to fill in when Windows is in high-contrast mode. They state on their accessibility FAQ that two aspects of 508 compliance are “Keyboard access for mobility impaired users” and “Color contrast for users with low vision”. However, with high-contrast mode enabled, it is not possible to determine which checkbox or radio button has been selected.
  • There have been no changes to Zoomerang since we tested in April.
  • Checkbox has come out with a new version (4.5) that promises more customization and styling of questions using a graphical editor. We have not re-evaluated Checkbox; our review is based on version 4.4.
  • LimeSurvey is at version 1.71, the version we tested in April, though they say they are working on version 2.0. We will try to review that when it comes out. Note that the LimeSurvey Project Leader wrote us and said that we had some of our information “just plain wrong” regarding LimeSurvey accessibility. We responded but have not heard back.
  • Added material on Snap Survey Professional Edition
Changes for 24 September 2008 update
  • LimeSurvey is at version 1.71+. In late July we received extensive, detailed feedback from the LimeSurvey Project Leader. We have reviewed the newest version and rewritten our comments, raising the end-user grade from C+ to B, due to our previous misunderstanding of certain features. Please note our caveats on question types, however. The administrator interface score changed from “Between Technically Accessible and Not Accessible” to “Technically Accessible” in our re-review. However, users should be aware of the potential access problems related to on-change events being used in the drop-downs that select between surveys, groups, and questions. See our commentary for more on this.
  • Added section on the security and privacy of the survey tools.
  • Added section offering tips on creating accessible surveys.
  • Updated “Summary of Findings” section.
  • Updated prices, grade, and review for Checkbox.
  • Removed some out of date text in the SurveyMonkey grade section.
  • Made edits to the introductory section of this page.
Changes for 19 November 2008 update
  • Slight edits and corrections and an update to the “Tips” section.

Other Tools We Looked At

We examined a number of other survey tools. As was discussed above, we feel that our final picks have the most relevance to OSU because they are either currently used or have a degree of usability that merits mention. But the sea is full of fish. We also considered: