W3C News

First Public Working Draft: Synchronization Accessibility User Requirements

The Accessible Platform Architectures Working Group has published a First Public Working Draft of Synchronization Accessibility User Requirements. This document outlines accessibility-related user needs and associated requirements for the synchronization of multimedia. The successful synchronization of multimedia content, especially audio and video, is essential to accessible web-based communication and cooperation. Multiple research studies indicate that media is understandable only when it is synchronized within very specific tolerances. By clarifying the parameters of adequate synchronization, we can influence the development of future technologies, specifications, and accessibility guidelines. Comments are welcome through 5 November 2021.

First Public Working Draft: Incremental Font Transfer

The Web Fonts Working Group has published a First Public Working Draft of Incremental Font Transfer. This specification defines two methods to incrementally transfer fonts from server to client. Incremental transfer allows clients to load only the portions of a font they actually need, which speeds up font loading and reduces the data transferred. A font can be loaded over multiple requests, where each request incrementally adds additional data.
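The core idea can be sketched with a small bookkeeping example. Note this is purely illustrative: the draft specifies binary subset and patch formats on the wire, not a JavaScript API, and the class and method names below are made up for this sketch.

```javascript
// Illustrative only: tracks which code points of a font have already been
// transferred, so each new request asks the server only for what is missing.
class IncrementalFontState {
  constructor() {
    this.loaded = new Set(); // code points already transferred
  }

  // Given the code points a page currently needs, return the deduplicated
  // subset that has not yet been transferred (the next request's contents).
  missingCodePoints(needed) {
    return [...new Set(needed)].filter((cp) => !this.loaded.has(cp));
  }

  // Record a completed augmentation response from the server.
  merge(received) {
    for (const cp of received) this.loaded.add(cp);
  }
}

const state = new IncrementalFontState();

// First request: the page renders "Hi" (code points 72, 105).
const first = state.missingCodePoints([72, 105]);
state.merge(first);

// Later, the page renders "Hello" — only the new code points are requested.
const second = state.missingCodePoints([72, 101, 108, 108, 111]);
// second is [101, 108, 111]; 72 was already loaded, duplicates collapse.
```

Each augmentation grows the client's font file, so repeated glyph usage incurs no further transfer cost.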

Working Group Note: XR Accessibility User Requirements

The Accessible Platform Architectures (APA) Working Group has published XR Accessibility User Requirements (XAUR) as a Working Group Note. XR refers to hardware, applications, and techniques used for virtual reality or immersive environments (VR), augmented or mixed reality (AR), and other related technologies. XAUR introduces technical accessibility challenges, such as the need for multi-modal support, synchronization of input and output devices, and customization. It describes accessibility user needs and suggests requirements. XAUR is for designers and developers involved in creating immersive and augmented experiences. It’s also useful for anyone who wants to better understand accessibility in a range of immersive or augmented environments.

First Public Working Drafts: WebXR Depth Sensing, Hit Test, DOM Overlays Modules

The Immersive Web Working Group has published the following three First Public Working Drafts:

  • WebXR Depth Sensing Module is a module extending the capabilities of WebXR Device API. It enables apps to obtain depth information computed by supported XR devices in order to provide more immersive experiences. Example use cases of the depth sensing API include (but are not limited to) simulating physical interactions of virtual objects with the real world, occlusion, and non-visual applications that can make use of increased awareness of the user's environment.
  • WebXR Hit Test Module describes a method for performing hit tests against real world geometry to be used with the WebXR Device API.
  • WebXR DOM Overlays Module expands the WebXR Device API with a mechanism for showing interactive 2D web content during an immersive WebXR session. When the feature is enabled, the user agent will display the content of a single DOM element as a transparent-background 2D rectangle.
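The three modules are all expressed as feature requests on a WebXR session. The following is a rough sketch of how they might be combined in one `immersive-ar` session; the feature strings and dictionary members follow the drafts, but the surrounding function and its flow are illustrative assumptions, and drafts may change before implementations stabilize.

```javascript
// Illustrative sketch: request an AR session using all three draft modules.
// Feature names ("hit-test", "depth-sensing", "dom-overlay") come from the
// drafts; startImmersiveSession is a made-up name for this example.
async function startImmersiveSession(overlayRoot) {
  const session = await navigator.xr.requestSession("immersive-ar", {
    requiredFeatures: ["hit-test"],
    optionalFeatures: ["depth-sensing", "dom-overlay"],
    depthSensing: {
      usagePreference: ["cpu-optimized"],
      dataFormatPreference: ["luminance-alpha"],
    },
    // The DOM element shown as a 2D overlay during the immersive session.
    domOverlay: { root: overlayRoot },
  });

  // Hit tests are performed against real-world geometry from the viewer pose.
  const viewerSpace = await session.requestReferenceSpace("viewer");
  const hitTestSource = await session.requestHitTestSource({
    space: viewerSpace,
  });

  session.requestAnimationFrame((time, frame) => {
    // Results describe where a ray from the viewer meets detected surfaces.
    const hits = frame.getHitTestResults(hitTestSource);
    // Depth data per view would come from frame.getDepthInformation(view).
  });

  return session;
}
```

Because `depth-sensing` and `dom-overlay` are listed as optional features above, the session can still start on devices that support only hit testing.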

First Public Working Draft: Digital Publishing WAI-ARIA Module 1.1

The Accessible Rich Internet Applications Working Group has published a First Public Working Draft of Digital Publishing WAI-ARIA Module 1.1. Enabling users of assistive technologies to find their way through web content requires embedding semantic metadata about the structural divisions of web documents. This is particularly important for long-form documents, and complements the semantic metadata that WAI-ARIA provides about web-application widgets and behaviors. This specification defines a set of WAI-ARIA roles specifically aimed at helping users of assistive technologies navigate such long-form documents.

This document is part of the WAI-ARIA suite described in the WAI-ARIA Overview.
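As a brief illustration, DPub-ARIA roles are applied to ordinary HTML structural elements. The role names below (`doc-toc`, `doc-chapter`, `doc-footnote`) are drawn from the module; the surrounding markup is a made-up example, not taken from the specification.

```html
<!-- Illustrative long-form document structure using DPub-ARIA roles. -->
<nav role="doc-toc" aria-label="Table of contents">
  <ol>
    <li><a href="#ch1">Chapter 1</a></li>
  </ol>
</nav>

<section id="ch1" role="doc-chapter" aria-labelledby="ch1-title">
  <h2 id="ch1-title">Chapter 1</h2>
  <p>…</p>
  <aside role="doc-footnote">A note scoped to this chapter.</aside>
</section>
```

With such roles in place, assistive technologies can announce and jump between document-level landmarks such as chapters and the table of contents, rather than relying on headings alone.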