
core-aam: Tests for the Core Accessibility API Mappings Recommendation

The Core Accessibility API Mappings Recommendation describes how user agents should expose semantics of web content languages to accessibility APIs. This helps users with disabilities to obtain and interact with information using assistive technologies. Documenting these mappings promotes interoperable exposure of roles, states, properties, and events implemented by accessibility APIs and helps to ensure that this information appears in a manner consistent with author intent.

The purpose of these tests is to help ensure that user agents support the requirements of the Recommendation.

The general approach for this testing is to enable both manual and automated testing, with a preference for automation.

Running Tests

To run these tests in an automated fashion, you need a special Assistive Technology Test Adapter (ATTA) for the platform under test. We will provide a list of these for popular platforms here as they become available.

The ATTA monitors the window under test via the platform's accessibility layer, forwarding information about the accessibility tree to the running test so that it can evaluate support for the various features under test.
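The kind of check a test performs on the forwarded tree can be illustrated with a minimal sketch. The snapshot shape and field names below ("role", "states") are illustrative assumptions, not the actual ATTA protocol:

```python
# Hypothetical sketch: comparing an accessibility-tree snapshot, as an
# ATTA might forward it, against the mappings a test expects. The dict
# shape and field names here are assumptions made for illustration.

def check_mapping(snapshot, expected):
    """Return a list of mismatches between exposed and expected values."""
    failures = []
    for key, want in expected.items():
        got = snapshot.get(key)
        if got != want:
            failures.append(f"{key}: expected {want!r}, got {got!r}")
    return failures

# Example: a node that a platform might expose for
# <div role="checkbox" aria-checked="true">
snapshot = {"role": "check box", "states": ["checkable", "checked"]}
expected = {"role": "check box", "states": ["checkable", "checked"]}
assert check_mapping(snapshot, expected) == []
```

A real test compares against the per-platform mapping tables in the Recommendation rather than a single expected dict, since each accessibility API exposes roles and states differently.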

The workflow for running these tests is as follows:

  1. Start up the ATTA for the platform under test.
  2. Start up the test driver window, select the core-aam tests to be run, and click “Start”.
  3. A window pops up that shows a test, the description of which tells the tester what is being tested. In an automated test, the test will proceed without user intervention. In a manual test, some user input or verification may be required.
  4. The test runs. Success or failure is determined and reported to the test driver window, which then cycles to the next test in the sequence.
  5. Steps 3 and 4 repeat until all selected tests have run.
  6. Download the JSON-format report of test results, which can then be inspected visually, processed with various tools, or submitted to the W3C for evaluation and collection in the Implementation Report via GitHub.
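The driver loop in the steps above can be sketched as follows. The function names and the report structure are illustrative assumptions, not the actual test-driver code:

```python
# Hypothetical sketch of the test-driver loop described above: run each
# selected test, collect a result, and emit a JSON report at the end.
import json

def run_test(test_path):
    # A real driver would open the test window, wait for the ATTA to
    # report on the accessibility tree, and record pass/fail. Here we
    # simulate a passing result for illustration.
    return {"test": test_path, "status": "PASS"}

def run_suite(test_paths):
    results = [run_test(p) for p in test_paths]
    return json.dumps({"results": results}, indent=2)

report = run_suite(["alert-manual.html", "checkbox-manual.html"])
print(report)
```

In the real harness the driver window cycles through the tests automatically; manual tests pause for tester input before a result is recorded.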

Remember that while these tests are written to exercise implementations, their other (important) purpose is to increase confidence that interoperable implementations exist. So implementers are the audience, but these tests are not meant to be a comprehensive test suite for a client implementing the Recommendation.

Capturing and Reporting Results

As tests are run against implementations, if the results are submitted to test-results, they will automatically be included in the documents generated by wptreport. The same tool can be used locally to view reports of recorded results.
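A downloaded report can also be summarized locally with a few lines of code. The schema below is an assumption sketched from the workflow above, not the exact wptreport format:

```python
# Hypothetical sketch: tallying pass/fail counts from a JSON results
# report of the kind described above. The schema is an assumption.
import json
from collections import Counter

report_json = """
{"results": [
  {"test": "checkbox-manual.html", "status": "PASS"},
  {"test": "slider-manual.html", "status": "FAIL"}
]}
"""

def summarize(raw):
    """Count results per status string."""
    results = json.loads(raw)["results"]
    return Counter(r["status"] for r in results)

counts = summarize(report_json)
print(counts)  # one count per status value
```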