A Survey of Survey Tools
This page was last updated: 19 November 2008.
Dates of evaluation of survey tools are in the “Roll-Call of Tools” section.
A “Changelog” at the end of the document lists changes for the updates to this page.
The content and comments on this page are the professional opinion solely of the Web Accessibility Center and do not represent the opinion of The Ohio State University or any organizations or units affiliated with the WAC.
Please note that some of the information on this page is out of date.
This document presents an overview of a number of popular tools for conducting surveys online. We have no hard statistics on which of these tools are currently being used at OSU. We have either taken surveys delivered by OSU entities via one of these tools or have been contacted by web administrators who have used or currently deploy the tool within their department or unit. At the end of this document we mention a few other tools we came across in our research.
In the comparisons and analysis below, we attempt to assess the degree to which the survey tools—for both the administrator and the survey-taker—are accessible by keyboard alone and with a screen reader. Also important here, particularly for people with cognitive disabilities, are a logical page organization (both visually and at the level of the HTML), easy recognizability of the survey-taker's progress through the survey and of the feature sets available to the administrator, and visual usability factors such as use of open/white space, proximity of related elements, and visual contrast between elements—all of which can help survey-takers make selections in multiple-choice and matrix question types.
For accessibility testing:
- We ran a screen reader (JAWS) against all of the interfaces, both survey-taker and administrator. How difficult was it to navigate the interfaces and accurately fill in information?
- We attempted to perform all input with keyboard alone, specifically using the tab key to navigate form elements. Can a user tab and arrow through the interfaces? Is it clear where the cursor is while doing so, that is, do form or other elements clearly visually indicate focus? Is it necessary to use the mouse for essential functionality?
- We set the computer into a high-contrast rendering mode. Are all essential page elements still visible and is contrast uniformly enhanced?
- We enlarged the screen fonts. Does enlarging fonts distort or “break” the interfaces?
- We made subjective appraisals of the clarity, organization, and contrast of the visual presentation. Does styling and layout aid visual recognition of the functionality or purpose and significance of the various elements of the interfaces?
We also inspected the markup of the surveys for:

- Use of headings to introduce questions or sections of content
- Use of labels to associate descriptive text with form elements
- Use of fieldsets and legends to group and identify related form elements
- Proper markup of tables with headers in matrix question types
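Taken together, these techniques look something like the following generic sketch of a multiple-choice question (our own illustration, not markup taken from any of the tools reviewed; ids and names are invented):

```html
<!-- Generic sketch of an accessible question; ids and names are illustrative -->
<h3>Question 1: How often do you ride the bus?</h3>
<fieldset>
  <legend>How often do you ride the bus?</legend>
  <input type="radio" name="q1" id="q1-daily" value="daily">
  <label for="q1-daily">Daily</label><br>
  <input type="radio" name="q1" id="q1-weekly" value="weekly">
  <label for="q1-weekly">Weekly</label><br>
  <input type="radio" name="q1" id="q1-never" value="never">
  <label for="q1-never">Never</label>
</fieldset>
```

The fieldset's legend gives screen reader users the context of the question when they tab directly into any of the radio buttons.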
There are some basic HTML techniques survey authors ought to be aware of, regardless of the accessibility of the tool they are using. So after discussing the survey tools, we offer a few tips for creating accessible surveys.
Roll-Call of Tools
Without further ado, the patients currently on the operating table:
- SurveyGizmo (commercial, hosted, re-reviewed 25 July 2008)
- SurveyMonkey (commercial, hosted, re-reviewed 25 July 2008)
- Zoomerang (commercial, hosted, re-reviewed 25 July 2008)
- Checkbox (commercial, server installed or hosted, re-reviewed 24 September 2008, version 4.4)
- LimeSurvey (open-source, server installed, re-reviewed 24 September 2008, version 1.71+)
- Snap Survey Professional Edition (commercial, survey configuration via desktop application (Windows only), manual publishing to web for data collection, reviewed 25 July 2008)
For our tests, we composed a survey in each tool. The first seven questions in each survey are either identical or as close to identical as we could muster, given the idiosyncrasies of the tools and question types. We then imagined 10 survey-takers and had “them” fill in the overlapping questions of each survey identically. We did this to get a sense of how the statistics collected by each tool stack up head to head.
We list only those features available at all package levels and try to constrain our choices to question types that are common across more than one survey tool. All of the remotely managed (non-locally installed) survey tools provide more question types and, in some cases, greater flexibility in collecting data (uploads of data files, for instance) at the higher-cost package levels.
The question types and features compared were (the per-tool cells of the original comparison table are summarized in the parenthetical notes):

- Text Field/Short Answer
- Text Area/Long Answer
- Table of Text Fields
- Radio Button Multiple Choice (high-contrast comparison of radio and checkbox controls)
- Checkbox Multiple Choice
- Drop-down Multiple Choice
- Table of Radio Buttons
- Table of Checkboxes
- Table of Drop-downs
- Rank In Order of Preference (high-contrast view of star ranking)
- Pre-Set Contact/Demographic Information Fields (customizable, with validation)
- Choose Between Images (alt text and captioned; screen reader but not keyboard-only accessible in SurveyGizmo, screen reader and keyboard-only accessible in Snap Survey)
- Percentages, numbers toward a total (see the “multiple numerical” question type in some tools)
- Admin Can Add Instructions or Other Copy
- Admin Can Add Video & Images (with alternative text descriptions and styling hooks; in some tools images only, by editing the template; in others images and video)
- Progress and Other Feedback (feedback in charts or graphs (in pay-for packages) and progress indicator)
- Make Answers Mandatory (not all question types in some tools)
- Validate Answers (some limited in free package, extensive in pay-for packages; limited to specific question types; through limits and regular expressions; pre-defined formats, such as social security numbers, postal codes, etc.)
Special Question Types/Features Available in Pay-For Packages
Three of the survey tools (SurveyGizmo, Checkbox, and Snap Survey) are able to “pipe” responses from one question to other questions in the survey, thus controlling the flow of the survey. SurveyMonkey, Snap, SurveyGizmo (at pay-for package levels), and Checkbox can randomize the question order, and SurveyGizmo will even randomize rows within a matrix. SurveyGizmo and Checkbox have APIs for customized integration.
(screen shot of SurveyGizmo online report)
(screen shot of SurveyMonkey online report)
(screen shot of Zoomerang online report)
(screen shot of LimeSurvey online report)
(screen shot of Checkbox online report)
(screen shot of Snap Survey online report)
Characteristics of Online Report:

- SurveyGizmo: view individual survey responses, demographic and geotracking info (pay-for packages), survey traffic graph, charts, tables showing counts, percentages, and calculated values
- SurveyMonkey: view individual survey responses, tables with statistical data, including percentages and counts
- Zoomerang: view individual survey responses, tables with statistical data, including percentages and counts
- LimeSurvey: view individual survey responses, tables with statistical data, including percentages, counts, and (where appropriate) calculated values, minimum, maximum, standard deviation, and quartile breakdown
- Checkbox: view individual survey responses, charts and tables showing percentages and counts
- Snap Survey: view individual survey responses, charts and tables showing percentages, counts, mean, and standard deviation

Display and Export Formats:

- SurveyGizmo: HTML, MS Excel, CSV, and SPSS (with Enterprise package)
- SurveyMonkey: HTML, CSV, XML, PDF
- Zoomerang: HTML, CSV
- LimeSurvey: HTML, MS Word, MS Excel, CSV, SPSS
- Checkbox: HTML (can publish reports to a unique, open URL), CSV, SPSS
- Snap Survey: HTML, Triple S, SPSS, SAV (Snap format), ASCII, MS Access, MS Excel, MS Word, PDF, CSV, and tab-separated values
Packages and Costs
For each package we list number of questions, number of surveys, number of responses, and cost:

- Zoomerang zPro for Education: unlimited questions, surveys, and responses; $350/yr
- Zoomerang zPro for Non-Profits: unlimited questions, surveys, and responses; $350/yr
- Zoomerang zPro for Business: unlimited questions, surveys, and responses; $599/yr
- Checkbox (installed locally) Professional: unlimited questions, surveys, and responses; $3,500, education pricing, 1 - 2 users
- Checkbox (installed locally) Workgroup: unlimited questions, surveys, and responses; $4,600, education pricing, up to 10 users
- Checkbox (installed locally) Enterprise: unlimited questions, surveys, and responses; $34,000, education pricing, unlimited users
- Checkbox (remote hosted) Basic: unlimited questions, surveys, and responses; $1,450, 1 user
- Checkbox (remote hosted) Professional: unlimited questions, surveys, and responses; $2,250, 1 - 2 users
- Snap Survey Snap Professional: unlimited questions, surveys, and responses; $1,995 (cost for survey creation tool and remote hosting; local hosting via the WebHost product adds $2,995)
Privacy Concerns and Survey Security
For the university research community, both security and privacy of data collection are great concerns. For some uses—informal satisfaction surveys, anonymous polls, etc.—privacy and security are probably not crucial. But for any survey that collects potentially user-identifying data, or that falls under research grants with stipulations about data handling, security and, especially, privacy may be one of the primary criteria in selecting a tool. The information we provide in this section is not meant to be comprehensive, but it may be enough to narrow your choice. We recommend you go beyond the information provided on the web and contact customer support if you have concerns.
Three of the survey tools we discuss on this page can be installed on your own web server, and all of those can be set up to serve surveys only over an encrypted (SSL/HTTPS) connection. Both Checkbox and Snap Survey have versions that can be hosted remotely or locally. The open-source LimeSurvey is downloaded and deployed by your web administrator. In the case of these locally installed survey tools, data privacy and security are mostly in your own hands. However, we strongly recommend putting these tools up against a battery of SQL injection and cross-site scripting attacks to verify that the surveys themselves cannot be used to discover stored data. And, of course, you should make sure your database and web server are protected from hackers.
You can find more information about web security standards for servers at The Ohio State University Web Security Standard web site. The OSU Office of Research Institutional Review Board maintains policies regarding collection and protection of data from human research.
As we mentioned, with Snap Survey, full security could be achieved by running the WebHost product on your own server. The Snap Survey licensing agreement discusses privacy of customers, not survey-takers. We could find no Safe Harbor or HIPAA language on their site, likely because Snap Survey is developed in the U.K.
SurveyGizmo offers SSL connections at the Enterprise level, along with unique, non-SurveyGizmo URLs. Their privacy statement says that staff will not access data or grant access to third parties; support requests may require staff to log into accounts, but for troubleshooting purposes only. Also, survey data can be scrubbed on written request. SurveyGizmo has self-certified with the U.S. Department of Commerce's Safe Harbor Framework and adheres to Safe Harbor Privacy Principles. SurveyGizmo has also self-certified as HIPAA compliant for handling of Protected Health Information and follows the Privacy Rule and Security Rule provisions of HIPAA.
SurveyMonkey maintains a page to help you decide whether the SurveyMonkey data collection process is HIPAA compliant. SSL connections are available at the Professional account level for an extra charge. SurveyMonkey has met Safe Harbor requirements and is on the Department of Commerce's list of companies.
Administrator and End-User Accessibility
The survey-taker experience was our primary focus in evaluating the survey tools, but we also tested the administrator interfaces' accessibility with a screen reader and attempted all input using only the keyboard. We assign letter grades to the online surveys and rate relative levels of accessibility for the administrator interfaces.
Survey Accessibility Grade Scale
For the survey-taker interface to get a grade of “A”, all of the question types must be fully accessible to screen reader and keyboard-only users. Means of determining progress through the survey must also be accessible. All question types must be usable when screen fonts are enlarged—and the fonts must be enlargeable via the browser—and when high-contrast settings are enabled on the Windows OS. Matrix tables should use table headings on both column and row. All form elements must have proper labels. Headings should be used appropriately to aid navigation through the survey. Error reporting, including indication of required fields, must be unambiguous and put focus in the offending field or to an error reporting box with links to bad fields.
For the survey-taker interface to get a grade of “B”, the majority of key question types must be fully accessible to screen reader and keyboard-only users. Means of determining progress through the survey must also be accessible. All question types must be usable when screen fonts are enlarged—and the fonts must be enlargeable via the browser—and when high-contrast settings are enabled on the Windows OS. Matrix tables should use table headings on at least the top row of the matrix table. All form elements must have proper labels or the screen reader must not make any errors in determining label text for elements. Headings should be used appropriately to aid navigation through the survey. Error reporting, including indication of required fields, must be clear enough for the user to easily locate the offending question and understand the error.
For the survey-taker interface to get a grade of “C”, some key question types must be fully accessible to screen reader and keyboard-only users. All question types must be usable when screen fonts are enlarged—and the fonts must be enlargeable via the browser—and when high-contrast settings are enabled on the Windows OS. Some form elements need to have proper labels or the screen reader must not encounter fatal errors in determining label text for elements.
For the survey-taker interface to get a grade of “F”, most question types present severe access problems via keyboard or screen reader, screen fonts do not enlarge (or break the visual design when they do), high-contrast settings make some inputs impossible to use, and proper labels are not used to identify form elements.
Administrator Accessibility Levels
Practically Accessible —
For the administrator interface to be “practically accessible”, a keyboard or screen reader reliant user must be able to get to all controls involved in setting up and publishing a survey with little effort, and there must be no ambiguity introduced in the process by dynamic interfaces, unexpected cursor focus changes, etc.
Functionally Accessible —
For the administrator interface to be “functionally accessible”, a keyboard or screen reader reliant user must be able to get to all key controls involved in setting up and publishing a survey with little effort. There may be some ambiguity in using dynamic interfaces, but these must be surmountable via documented workaround or through a reasonable amount of experimentation by the user.
Technically Accessible —
For the administrator interface to be “technically accessible”, a keyboard or screen reader reliant user must be able to get to all key controls involved in setting up and publishing a survey. That is, it must be technically possible to use the key interfaces—access must be feasible, though not necessarily practical.
Not Accessible —
For the administrator interface to be “not accessible”, a keyboard or screen reader reliant user would be unable to set up and publish a survey.
Summaries of Survey Tool Accessibility
- SurveyGizmo

Most of the question types in SurveyGizmo are screen reader and keyboard accessible, with the exception of the “image choice” and “star-ranking” types, both of which can be worked around using other question types. The “rank in order of preference” type is accessible to the screen reader but not the keyboard. Text fields highlight to show cursor focus, but radio, checkbox, and other question types do not. Images are all given proper alternative text, labels are properly associated with form elements, and matrix tables have column headers that are properly scoped, though row headers are not proper HTML table headers. Validation and required-question errors reload the page and display a high-contrast error box at the top of the screen. However, the error box is not focused on page reload (it should be), and the only way to determine the required fields is by browsing the page. Adding to this difficulty, the “required” star indicator is not within the label tag (so it does not get read by the screen reader when tabbing to the control).
Though conscious regard was given to accessibility in the design of SurveyGizmo, in our opinion most of the accessibility comes from the fact that SurveyGizmo tends to closely follow web standards. They have a page dedicated to the accessibility of SurveyGizmo surveys, which links to a short review comparing SurveyGizmo and SurveyMonkey, by Karen Mardahl and Lisa Pappas in the Usability Professionals Association Voice. (Note: the information on SurveyMonkey appears to be out of date, or at least does not reflect the current status of screen reader accessibility in SurveyMonkey.)
Admin. Interface: Between Functionally and Practically Accessible — The administrator interfaces in SurveyGizmo are a model of usability. Interfaces are clean, navigation elements are easy to find, and “flow” through the creation process seemed to us more intuitive than any of the other survey tools. There are some limitations to accessibility but they can be worked around. Specifically, when choosing question types, a right-hand panel updates to allow the administrator to enter data. There is no indication of this screen update, and screen reader users would need prior instruction or experimentation to figure it out.
- SurveyMonkey

SurveyMonkey has put significant effort toward making their standard survey interfaces screen reader accessible. They have made very clever use of labels within most of their question types. A representative example: in the radio-grouping question type, the first response repeats the question before announcing the label. So, if the question were “Whom do you like better?” and the responses were Batman or Robin, the label on Batman would read “Whom do you like better Batman”. The first part of the label is hidden from the visual user but provides excellent orientation for screen reader users. SurveyMonkey uses a similar trick in matrix tables and goes beyond SurveyGizmo in providing row headers in tables. Though they are not scoped, screen readers are pretty adept at handling two-dimensional tables, so long as TH's are available. Another smart feature that is not implemented in SurveyGizmo is the focusing of question errors. If you “continue” from a page and have an error in your response, SurveyMonkey not only reloads the page, it focuses the question with the error. Better still would be focusing a box that lists the errors with hyperlinks to the problematic questions, but we feel that SurveyMonkey's approach is closer to the mark than SurveyGizmo's.
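The general technique can be sketched with an explicit label whose leading text is moved off-screen with CSS (our own illustration; SurveyMonkey's actual markup and class names may differ):

```html
<style>
  /* Off-screen positioning hides the text visually while leaving it
     in the document flow, so screen readers still announce it */
  .reader-only { position: absolute; left: -9999px; }
</style>
<input type="radio" name="q2" id="q2-batman" value="batman">
<label for="q2-batman">
  <span class="reader-only">Whom do you like better?</span> Batman
</label>
```

A screen reader tabbing to the radio button announces the full label, “Whom do you like better? Batman,” while a sighted user sees only “Batman.”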
Where SurveyMonkey falls down, however, is in keyboard accessibility for users who can see the screen and for users who need high-contrast settings. Though at the level of the HTML SurveyMonkey uses actual radio buttons and checkboxes (as opposed to Zoomerang), it covers the checkboxes and radios with images that give those common form elements a unique look. Unfortunately, this disrupts keyboard accessibility, as one cannot see by tabbing alone when the group of elements has focus. A more dramatic problem arises when the user turns on high-contrast settings in Windows. Because of the way these settings are implemented in Windows, background images are disabled, though the physical space they occupy is still blocked out. Thus, with high-contrast on, we no longer see any indication that checkboxes or radio buttons either have focus or are selected.
Admin. Interface: Not Accessible — It is not possible to use the SurveyMonkey administrator interfaces with keyboard alone. There is no highlighting of elements and the user has no idea where she is or what she can do to make actions occur. Though the screen reader can read the contents of many interfaces, the modal dialogs that are used very widely throughout the administrator interfaces do not seem to get the screen reader focus when spawned. Thus it is virtually impossible for the screen reader user to figure out where she is in the dialogs (or if she is even in them at all).
- Zoomerang

Admin. Interface: Not Accessible — Some of the initial administrator interfaces can be navigated via keyboard and screen reader. However, once the user moves deeper into the interfaces to begin either editing or creating new surveys, all screen reader and keyboard access breaks down entirely. The tool implements three or four frames per page. Typically, the user gets stuck in the topmost frame and cannot get to the content of the main body/survey-editing frame. This was the case both via screen reader and keyboard. JAWS, for instance, can typically pull up a list of frames or move through them by repeatedly pressing the “m” key, but in the Zoomerang administrator interfaces these navigation techniques could not reach the survey creation and editing tools.
- LimeSurvey

LimeSurvey uses proper HTML labels for text fields and for checkbox and radio groupings. A table header row is used in one- and two-dimensional matrix questions, but the left heading column is not implemented with table headers.
Mandatory questions are identified with a red asterisk. Better would be an image with alternative text of “Mandatory Question” or simply “Required”.
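For example, an image-based marker inside the label would be announced when the screen reader reaches the field (an illustrative fragment, not LimeSurvey's actual markup; the image file name is hypothetical):

```html
<label for="q3">Your age <img src="star.gif" alt="Required"></label>
<input type="text" name="q3" id="q3">
```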
The “ranking” question type is neither screen reader nor keyboard accessible. It should be avoided by those concerned with accessibility: not only is it inaccessible, it also traps the cursor in the control. We could not tab out of the control once in it, using either the keyboard alone or a screen reader.
Overall, LimeSurvey seems to be moving toward high accessibility, and there are only a few problems, as mentioned above. We would like to see more standards-aware HTML—there is still frequent use of font tags and inline styles, for instance. Simplified HTML with more hooks for fuller control of styling would really make this free and easy-to-deploy tool shine.
Admin. Interface: Technically Accessible — The administrative interface uses very few dynamic effects and is easily tab navigable. All image icons that are links have appropriate alternative text descriptions. Though there are headings on the pages, they are few (many interfaces have only a single heading, despite the fact that there are many logical sections on the interfaces).
The admin interfaces use the FCKeditor for editing long text by default. The text editing controls of this editor are inaccessible to the keyboard and screen reader. However, the editor is collapsed by default, so essentially you have an iframe in which to edit question text. This works fine for keyboard and screen reader users, behaving much as though you were in a text area. If the administrator wants to completely disable the HTML editor, she can do so by setting the defaulthtmleditormode option in the PHP configuration file to “none.”
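In LimeSurvey 1.x this option is a variable in config.php; the exact syntax below is a sketch based on the option name and should be verified against the config.php shipped with your version:

```php
<?php
// config.php: disable the rich-text (FCKeditor) question editor entirely
$defaulthtmleditormode = 'none';
```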
Potentially problematic are the drop-down selects for choosing the survey to edit, the question group, and the question to edit. An on-change event triggers a page refresh that directs you to the appropriate interface. This is not considered a best practice: an inexperienced user will simply down-arrow through the select options, trigger the on-change, and be redirected to a new page—an effect that can be very disorienting and that, for some screen reader and keyboard users, may make it impossible to navigate through the surveys, groups, and questions. Experienced keyboard and screen reader users, however, will use the Alt key to expand the select; this bypasses the on-change event and allows the user to select any option.
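For survey authors who can edit templates, a more robust pattern is to drop the on-change redirect and pair the select with an explicit submit button (a generic sketch; the form action and field names are hypothetical, not LimeSurvey's actual markup):

```html
<form action="admin.php" method="get">
  <label for="survey-picker">Choose a survey to edit:</label>
  <!-- No onchange handler: arrowing through the options changes nothing
       until the user explicitly activates the Go button -->
  <select name="sid" id="survey-picker">
    <option value="101">Campus Transit Survey</option>
    <option value="102">Library Satisfaction Survey</option>
  </select>
  <input type="submit" value="Go">
</form>
```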
The controls you use to set up question types and groups are somewhat confusing. They also often lack proper HTML labels, though the screen reader in most cases was able to discern the function of controls. Most confusing are the “Question Attributes” controls and the set-up of matrix-style questions. It is feasible that a screen reader user might become familiar with the idiosyncrasies of these interfaces with some guidance from a sighted user who had experience with the tool.
LimeSurvey has a configuration option called addTitlesToLinks that is geared toward improving accessibility. It has no useful effect, however: all of the icons in the admin interfaces are already given alternative text descriptions in the image tag, and the purpose of other links is clear from context. Effective or not, it does demonstrate that the developers of the product—who are unpaid volunteers—are concerned with accessibility. We hope they will implement onfocus/onblur functionality in their pop-up tooltips to make them more accessible for keyboard-only users.
- Checkbox

Admin. Interface: Between Technically Accessible and Not Accessible — The administrative interface uses very few dynamic effects and is easily tab navigable. However, many images lack alternative text (though the interface icons have adequate alt text), and no headings are used. On any given survey set-up page there are multiple instances of identical link text for moving, copying, and deleting items; there is no way for a screen reader user to figure out which of these links acts on which item.
- Snap Survey Professional Edition
The Snap home page has a prominent link to information on Snap Survey accessibility. The page notes that “Snap Professional is able to produce questionnaires which meet or exceed the highest level of web accessibility. It is able to produce questionnaires which meet Level 3 compliance (sometimes known as AAA compliance). Producing questionnaires to such a demanding standard can largely be done automatically by choosing the appropriate web publication options. Some though require care from questionnaire authors in avoiding certain layouts and applying appropriate settings.” The page links to pages that benchmark the survey tool features against both WCAG 1.0 and U.S. Section 508.
Basic user configuration recommendations having to do with accessibility on the site's pages on Section 508 and WCAG 1.0 include adding alternative text to images, ensuring link text is understood out of context, and allowing for input of “custom HTML,” such as language changes and abbreviations. More rigorously accessible “strict rules” can be enabled when producing “plain text” versions of surveys, which the tool can create automatically. The plain text versions are not text-only; rather, they are simplified HTML versions.

The strict rules affect positioning of field labels, expand grids into a single column, and wrap related fields—from a row of an expanded table, for example—in a fieldset. The strict-rules plain text export does not appear, however, to use fieldsets to wrap groups of related checkboxes and radio buttons—the ideal usage of fieldsets, in our opinion. The strict rule on label positioning attempts to ensure that labels get placed in the code in such a way that a screen reader would automatically make the association, even in cases where “there is no support for explicit labelling.” We could not determine when this was the case; proper HTML labels are used in every question type we tested. As mentioned, the strict rule on expanding grids busts a grid-style question into separate question groups for each row in the grid and then wraps the question groups in fieldsets to disambiguate each form control. Strict rules also remove all use of font tags; allow external style sheets to be used, a tab order to be set, and place-holder text to be added to fields; and add accesskeys for resetting and submitting the survey questions.
When a page loads with a new question on it, Snap puts the focus in the first input field on the page. This is a convenience for most users, but screen reader users will need to navigate up out of the field to read the question.
The image choice question type uses radio buttons and an image. Because of this implementation, the question type is accessible both through keyboard and screen reader. SurveyGizmo's implementation is more elegant looking, using mouse highlighting and a green check overlay for the selected image, but it is not screen reader or keyboard accessible.
In matrix-/grid-style questions, column and row headers are true HTML table headers and have the “scope” attribute set properly. This potentially is very beneficial for screen reader users. However, matrix-style questions insert extra, empty columns in between answer columns to pad out the design. This may cause screen reader users some confusion, as table based navigation is made more difficult.
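Stripped of the padding columns, the pattern looks something like this one-row grid (our own illustration of scoped headers, not Snap's exact output):

```html
<table>
  <tr>
    <td></td>
    <th scope="col">Agree</th>
    <th scope="col">Neutral</th>
    <th scope="col">Disagree</th>
  </tr>
  <tr>
    <th scope="row">The bus is usually on time</th>
    <td><input type="radio" name="q4" value="agree"></td>
    <td><input type="radio" name="q4" value="neutral"></td>
    <td><input type="radio" name="q4" value="disagree"></td>
  </tr>
</table>
```

With both scopes set, a screen reader can announce the row prompt and the column label as the user moves between cells.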
Overall, throughout the surveys, Snap has made overt decisions to achieve high accessibility. And its plain text export is unique. We have to wonder how often it would be used, though. We believe the goal should be a single accessible product, not two versions, though we could see the plain text version being published, along with the regular version, as an alternative for screen reader users.
Snap was developed and the company is located in England. England's laws on web accessibility cover private and public entities, requiring both to comply, unlike in the U.S. where 508 covers only entities fully funded by the Federal government. We can speculate that Snap's extensive work toward accessibility is motivated by its primary market.
Admin. Interface: Not Accessible — The administrative interface is a Windows-only, installed, GUI application, which facilitates creating survey questions, outputting the surveys into HTML forms (for the web and hand-held devices) and to PDF, importing survey results from an online service, and creating sophisticated reports. The interface is complex but appears to be efficient for creating and interpreting surveys once its wealth of features have been learned. In the JAWS screen reader and using the keyboard alone, however, it was not possible to use the interface beyond accessing the main menu system.
Summary of Findings
Tips for Creating Accessible Surveys
- If the tool you are using supports formatting the question in HTML, make the question a heading and assign it a tabindex of zero: <h3 tabindex="0">Question 1: The question?</h3>. This makes the question tab-focusable and easily discoverable by screen reader users. (Note: placing a tabindex on a normally non-tabbable element creates invalid HTML, but the accessibility benefits outweigh keeping the code valid.) SurveyGizmo, LimeSurvey, Checkbox, and Snap Survey support this ability. Zoomerang allows you to make a question a heading, but it wraps the heading in a span element, always autonumbers the question, and strips out the tabindex.
- Make sure you use accessible question types. For instance, we have not seen any of the vendors implement an accessible star-rating question type, and many vendors' matrix-style questions have poor accessibility.
- Make sure all links are understandable out of context. A screen reader can call up a list of all links in an HTML document, and there is no distinguishing between two instances of “click here”, for instance.
- Though it is simply a part of putting together a quality survey, make sure to check the spelling of your questions. In addition to spelling, edit for clear syntax, use active rather than passive phrasings, and double-check subject-verb agreement.
- Many of the survey tools allow a great deal of customization of look and feel through templates and stylesheets. Checkbox, LimeSurvey, SurveyGizmo, and Snap Survey all allow styling and custom templates. In your styles and templates set larger than default font sizes and high foreground-background contrast.
- Avoid the use of images in surveys. Decorative images can be unnecessarily distracting. If you must use images in choice questions or to illustrate a point, make sure to test the image-choice question type with a screen reader and, obviously, provide accurate alternative text for the images. To date, the only survey tools discussed in this document that allow choosing between images as a question type are SurveyGizmo and Snap Survey. SurveyGizmo's implementation is not accessible with the keyboard alone, though Snap Survey's is. Also, not all of the tools allow you to add alternative text to images.
- If you change language within a question or answer, indicate language changes using HTML:
<span lang="es">¿Qué onda, güero?</span>.
- Expand acronyms on first use and use the abbreviation tag subsequently:
<abbr title="Personal Health Assessment">PHA</abbr>.
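To illustrate the point above about matrix-style questions, one widely recommended alternative is to mark up each matrix row as a fieldset of radio buttons, so that the legend ties the row label to every option for screen reader users. A minimal sketch (the question text, name, and ids are invented for illustration):

```html
<!-- One matrix row as a fieldset: the legend associates the row
     label with each radio button when read by a screen reader. -->
<fieldset>
  <legend>The staff was courteous.</legend>
  <input type="radio" name="q1" id="q1-agree" value="agree">
  <label for="q1-agree">Agree</label>
  <input type="radio" name="q1" id="q1-neutral" value="neutral">
  <label for="q1-neutral">Neutral</label>
  <input type="radio" name="q1" id="q1-disagree" value="disagree">
  <label for="q1-disagree">Disagree</label>
</fieldset>
```

Repeating this structure per row is more verbose than a table of bare inputs, but it keeps each answer unambiguous out of visual context.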
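As a sketch of the styling tip above, a custom stylesheet might set a larger-than-default base font and high foreground-background contrast as follows (the selectors here are generic; real templates in these tools will use their own tool-specific class names):

```css
/* Larger-than-default base font and high-contrast color pairing */
body {
  font-size: 120%;          /* scale up from the browser default */
  color: #000;              /* near-maximal foreground/background contrast */
  background-color: #fff;
}
/* Keep links distinguishable by more than color alone */
a {
  color: #00e;
  text-decoration: underline;
}
```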
Changelog
A summary list of changes to this document.
- Changes for 1 May 2008 update
- Original composition of this web page was finalized. The first round of testing was completed 4 April 2008.
- Changes for 25 July 2008 update
- SurveyGizmo claims in a 3 June 2008 changelog post that they now group questions with fieldsets. We could not find this behavior for any of the question types where we thought it would make the most sense: groups of checkboxes or radio buttons.
- SurveyMonkey updated their FAQ dealing with accessibility in June 2008. The software was "certified" 508 compliant by RampWeb, and a SurveyMonkey representative contacted us via email to note the certification. We have retested the product and see no changes since April to either the survey-creation interfaces or the use of background images in radio and checkbox questions. As we note in our review, the latter makes surveys impossible to fill out when Windows is in high-contrast mode. Their accessibility FAQ states that two aspects of 508 compliance are “Keyboard access for mobility impaired users” and “Color contrast for users with low vision”; however, with high-contrast mode enabled, it is not possible to determine which checkbox or radio button has been selected.
- There have been no changes to Zoomerang since we tested in April.
- Checkbox has released a new version (4.5) that promises more customization and styling of questions using a graphical editor. We have not re-evaluated Checkbox; our review is based on version 4.4.
- LimeSurvey is at version 1.71, the version we tested in April, though they say they are working on version 2.0. We will try to review that when it comes out. Note that the LimeSurvey Project Leader wrote us and said that we had some of our information “just plain wrong” regarding LimeSurvey accessibility. We responded but have not heard back.
- Added material on Snap Survey Professional Edition.
- Changes for 24 September 2008 update
- LimeSurvey is at version 1.71+. In late July we received extensive, detailed feedback from the LimeSurvey Project Leader. We have reviewed the newest version and rewritten our comments, raising the end-user grade from C+ to B, due to our previous misunderstanding of certain features. Please note our caveats on question types, however. The administrator interface score changed from “Between Technically Accessible and Not Accessible” to “Technically Accessible” in our re-review. However, users should be aware of potential access problems related to the on-change events used in the drop-downs that select among surveys, groups, and questions. See our commentary for more on this.
- Added section on the security and privacy of the survey tools.
- Added section offering tips on creating accessible surveys.
- Updated “Summary of Findings” section.
- Updated prices, grade, and review for Checkbox.
- Removed some out of date text in the SurveyMonkey grade section.
- Made edits to the introductory section of this page.
- Changes for 19 November 2008 update
- Slight edits and corrections and an update to the “Tips” section.
Other Tools We Looked At
We examined a number of other survey tools. As discussed above, we feel that our final picks have the most relevance to OSU because they are either currently in use or have a degree of usability that merits mention. But the sea is full of fish. We also considered:
- EZQuestionnaire (markets itself as the survey tool for non-technical people; for a fee, provides help setting up and running surveys plus a library of pre-built questionnaire types; also offers a free package)
- FeedbackFarm (hosted, completely free (no pay-for packages), limited question types and analysis tools)
- QuestionPro (hosted, geared toward enterprise and advertisers, sophisticated analysis packages, limited free accounts for non-profit and students, all typical packages are pay-for, 30-day demo available)
- KeySurvey (hosted, geared toward enterprise, API allows for integration into call center, HR, and CRM systems, 30-day trial but no free packages)