Thanks for your interest in The Question! Sign up to participate in future questions here. This file is the raw data from Episode 30, a deep dive hosted on August 29, 2024 with Ben Callahan and Geri Reid. There are 60 answers. The question this week was:

-----

Hello Design System nerds, welcome to Episode 30 of The Question! I can't believe we've been doing this long enough to have 30 episodes, but here we are. Thanks for staying in learning mode. Buckle up, this is a big week…

One of the main drivers of design system growth these days is a need for more equitable and accessible digital experiences. From a legal standpoint, we're well past due to fix a lot of the a11y issues that many organizations have ignored for decades. Geri Reid has been pushing for this in her work for a while now. One challenge that she repeatedly faces is how to measure the impact of a design system on an accessibility program. This week, we're going to tackle this difficult issue. But to do that, we have to break this complex measurement down into a few parts.

First, we have to understand the kinds of metrics you're capturing about the accessibility of your products. Second, we have to understand how you measure the accessibility of your design system assets. Third, we need a solid understanding of how you track the adoption of your design system. With these three things, we can begin to look for a correlation between a product's design system adoption and improvements across that product's accessibility metrics. This is a lot to process, but that's what we're here for! Warning: we have a few questions we need answered in order to dig into this. Thanks, as always, for being willing to share!

With all of this as context, here is The Question for this week:

1. How do you measure the accessibility of your products?
2. How would you rate the accessibility of your products? (This is a gut check, not a precise measurement.)
3. How do you measure the accessibility of your design system assets?
4. How would you rate the accessibility of your design system assets? (This is a gut check, not a precise measurement.)
5. How do you define and measure the adoption of your design system?
6. How would you rate the adoption of your design system? (This is a gut check, not a precise measurement.)
Row | Q1: How do you measure the accessibility of your products? | Q2: How would you rate the accessibility of your products? (gut check) | Q3: How do you measure the accessibility of your design system assets? | Q4: How would you rate the accessibility of your design system assets? (gut check) | Q5: How do you define and measure the adoption of your design system? | Q6: How would you rate the adoption of your design system? (gut check)
---|---|---|---|---|---|---
3 | Testing: assistive devices / focus-tested in environments. Testing: accessibility QA list (woven in w/ design QA). Automated: contrast. | 3 | Testing: accessibility QA list (woven in w/ design QA). Automated: contrast. | 4 | Automated: consumption at component & versioning level. Feedback: check-ins with product teams. Versioning. | 4
4 | We ensure 100% of our DS components go through a11y review during the design and code phases. Products are also required to go through an a11y review, since they could introduce an issue outside the components. For now, this is true for customer-facing products; internal tools have gotten a pass for a while, but in 2025 they will have to meet the same requirements. I'm not sure about measurement, though; I would love to hear more about this. | 3 | We have a team in our Risk department that reviews all of our components during the design phase and again at the code phase. | 4 | Usage (Figma) and consumption reports (code). But I feel uncertain about how accurate our measurements are and whether they are the right ones. | 2
5 | No idea there was a yardstick to measure accessibility! I'll ask. | 3 | I figured if the assets were accessible, so were the products. | 3 | Currently, DS adoption is measured qualitatively. Some interest in Omelet to back this up with numbers. | 4
6 | There are so many ways to measure accessibility, none of which indicate anything on their own, but together they create a picture of how different products perform against others. Measuring accessibility defects created each quarter is useful. So is knowing the code coverage of your automated tests. Finally, and most importantly, there is working directly with people with disabilities to test your product, to understand how they are trying to use it and to identify the barriers that still exist. | 2 | I have created test cases from both a design and an engineering perspective, based on WCAG and experience of system component needs, to check for issues before the components are published. This is used to measure how accessible a component is. However, one of the biggest holes is that accessibility is extremely context-dependent. Without content, or placement on the page, it is hard to know if the end result is accessible; therefore, how can you measure it? You can measure it against other design systems. What does our guidance say that theirs does better or worse? Guidance is key in accessibility, and so far comparison is the best way to measure whether it is up there with the best-known components. Figma metrics are also really helpful. What is the ratio of button vs. link components used, and does that seem right for your product? If not, maybe the purpose of the components is not clear. If people are using components not as intended, semantically that likely means your components are being used in a less accessible way. | 3 | Measuring the number of best practices adopted by teams is helpful for measuring both cultural and practical adoption, such as custom HEX values vs. system tokens. This is often directly related to accessibility; teams that use custom values will most likely struggle to support different themes/modes. The number of instances in a product is easy to calculate. However, that doesn't always correlate to fewer accessibility defects. If a team is using more components, it could mean they are just building more of the product, and are statistically more likely to have more bugs too. | 4
7 | Testing the accessibility of components is at the core of our "definition of done" for all tickets. Accessibility requirements are included in the acceptance criteria submitted to the QA team. We also collaborate with Perkins Access to perform an extensive audit and screen reader testing. | 3 | Unfortunately, our design system is only used by engineering. It is not published, nor is it available on a staging environment for access by others. I am the only designer on the team who is able to spin up a local environment to access the design system, but I am not included in contributing to or maintaining the system. This isn't from a lack of trying. All of that to say, I don't know. 🤷🏻♀️ | 3 | Unknown. | 2
8 | We've used a VPAT® to create an accessibility conformance report, perform smaller component evaluations, include Cypress tests for components, and are actively engaging with third-party evaluators. | 4 | Manual evaluations and automated tests. | 4 | We have an adoption scanner (https://gitlab-org.gitlab.io/frontend/pajamas-adoption-scanner/), and the definition is based on use, but it doesn't currently track whether a component is used correctly or for the right use case. | 3
9 | We use a checklist to ensure we hit AA on WCAG. Our company has had some unique moments regarding making our product accessible to those who are blind, because if someone is legally blind, they shouldn't have a driver's license to rent a vehicle. But we've had to work with leadership on the idea that just because someone is legally blind doesn't mean they shouldn't have the opportunity to browse our site/apps and have the same experience as those who are not blind. | 4 | Again, we use checklists and plug-ins to ensure our components are at least WCAG level AA compliant. | 4 | We use Figma analytics to see how frequently our component library is being used, plus some data the engineers can gather on their end for adoption. We also send quarterly sentiment surveys to see how folks feel about our design system, how they adopt it, and whether it meets their needs. | 3
10 | We do not have any specific metrics for accessibility within our products; however, we do target WCAG AA alignment as a minimum for accessibility guidelines. We monitor this through our accessibility teams, who check designs and development for accessibility defects to fix and improve. | 4 | We currently do not, beyond ensuring color contrast. I would be extremely interested in specific metrics that can track accessibility in assets, though! | 3 | We are currently discussing this in our team, as we are in the process of releasing a new iteration of the design system. A few key metrics we will use are the usage of the new design system vs. the usage of the previous design system in both design and code, the usage of the new design system in general (number of components and layers using design system artifacts), and the amount of time saved when creating a design with the new design system compared to the old one. | 2
11 | Run ad hoc testing with tools such as Lighthouse or Stark to gather anecdotal evidence. Then use buildchain tools in the QA process, such as Pa11y, or run periodic website tests such as Lighthouse + Datadog. All of these give you data to point to where a problem might be and to collect objective information about what is causing the accessibility problem: is it the design system? Is it the combination of components? Is it 3rd-party libraries? Is it internal end users who do incorrect things? Is it customer-entered data? | 3 | Definitely Lighthouse and Stark in the design process. It's better to cut down on problems at the source. And then check usage stats to see which teams are not consuming the DS; this includes using the documentation. | 5 | It Depends™. For marketing purposes, I've been recommending a separate design system so visual designers and marketers can go wild, yet still stay connected with the main themes. This has usually led to more usage than not, and it's pretty easy to tell. It also depends on whether Marketing uses website hosting systems such as Webflow, or a blog, or social media. This includes help websites, which might be on a support ticket system that just happens to have public web pages. For product development purposes, this is easier. You can draw a graph of which teams are using the system (usually it's easier to map out teams than to count all the products). Unless you work for a design-led company that is 100% brand new and demands the use of a design system, it's very unlikely you'll hit 100% adoption in the products. It's better, in my opinion, to take up space in people's brains: get all teams on board, get leadership to understand that the DS is a benefit and that not using it results in problems, and get people to train other people. You know things are working when global changes start occurring in a shorter time period (2 sprints instead of 5), when people join your DS Slack/Teams channel, and when usage of the Figma library becomes more frequent. | 3
12 | Before I arrived, no care was taken to make sure our product was ADA compliant via WCAG guidelines, so there is no formal process. With new initiatives I am involved with, pages are tested through Polypane and other browser extensions to ensure AA requirements are met for things like color contrast and navigational landmarks. I'll also run things through a screen reader and JAWS to make sure announcements are as intended, with critical items being accessible via keyboard navigation. | 2 | The same set of processes is in place for components in the design system as I have established for new application initiatives. Color contrast needs to meet minimums across the entire component (including dark mode), and when I use the keyboard in combination with a screen reader or JAWS, I should be able to easily navigate the component with audio cues that make sense for whatever I'm focused on. | 4 | We are very early in our design system journey, so no broad teams have adopted it in their day-to-day workflow. I'd like to measure adoption in the future by how many application front-ends include it as a package, but also by how many people proactively reach out to have us sit in on early conversations about how the DS can be incorporated at conception rather than as an afterthought. | 1
13 | When designing our interface, we prioritize accessibility from the start. We believe that adhering to WCAG guidelines not only ensures compliance but also leads to more intuitive user experiences. | 4 | A wide range of tools in Figma, browser plugins, VS Code plugins, and knowledge/experience of the specs/guidelines. | 4 | Instead of viewing accessibility as a constraint, we see it as a framework for best practices. It's much easier to integrate accessibility into the design and code from the beginning than to retrofit it later. Our goal is to achieve at minimum AA compliance wherever possible. Lastly, it helps in reviews: the classic "it has to be this way," haha. | 5
14 | We don't have a formal process. Individuals find issues and file tickets for them, but it's not very organized. Different teams have tried multiple times to hire an accessibility consultant, but it's never been a high enough company priority to get budget approval. | 3 | We've tried to build accessible components, particularly form elements, but we don't have a formal review process. | 3 | Our design system is our code base, and our whole website is in our React code base. But it's hard for us to monitor and track when people go rogue with components. We mostly find things when we try to convert a component and it breaks in a lot of places. | 4
15 | We don't yet. | 3 | We don't. | 3 | We don't yet. | 3
16 | Honestly, I'm not sure. In a past role, we utilized Material UI and relied heavily on the accessibility it already had built in. My experience is limited, and I'm here to learn from others' successes and failures. | 3 | Same as the previous answer: we utilized Material UI and relied heavily on the accessibility it already had built in. I'm not sure how often (if ever) we measured how accessible components were. | 4 | We were a very small company (50 people total, with a design team of 2, about 8 frontend devs, and 5 backend devs), and it was extremely rare that we ever built anything that did not fully utilize the design system; therefore, I don't think we focused much energy on measuring its adoption. We knew no different; the DS was there from the start and made our lives easier as devs. | 5
17 | We have a team of accessibility specialists who are part of every design pod and who review and test the products for visual accessibility, keyboard input, and screen reader support. | 3 | We have dedicated accessibility specialists who review and test all components in the system with the same checklist used for products. We have both manual and automated accessibility QA. | 5 | The main measurement is the percentage of DS components out of all UI components, in code. We currently focus only on new projects, not legacy code. | 3
18 | It is product by product, but there is not a lot of measuring in the sense of metrics. We make an effort to build accessible patterns, but it is not tracked as it should be. | 3 | We use automated checks as well as manual testing to ensure that each component is accessible. How it is used, though, is on the consumer, and additional testing is needed within their product. | 4 | Low currently. | 2
19 | We're a small team, so we rely on straightforward metrics: automated jest-axe accessibility tests, and yearly product audits with a professional auditor. | 5 | Primarily automated tests with jest-axe. We use MUI, which we know is not perfect, but their accessibility is pretty good, and we also reference what is and isn't working for them. | 5 | % of code that is DS vs. non-DS. | 5
20 | For now we're focusing mostly on converting the product to use better colors and contrast. There are no built-in metrics; we would love to learn about them, though. | 3 | We ensure all new assets we create have color contrast of at least the AA standard. | 4 | Not there yet; still working on a rollout. We plan to use trackers and are eager to learn best practices. | 1
21 | We have a dedicated team of Accessibility Specialists. I do what they tell me needs to be done. | 5 | We have a dedicated team of Accessibility Specialists. I do what they tell me needs to be done. | 3 | We're working on this. Currently it's easier to spot divergence through communication than it is through automated systems. | 5
22 | We have internal and external audits made by experts. | 3 | We have internal audits made by experts. | 4 | We don't yet have a precise way to measure adoption other than qualitatively inspecting our applications. We do have a general idea of adoption across different teams, but nothing technical to measure it so far; we're looking into it right now. | 3
23 | For content: their adherence to the WCAG 2.2 standard. | 3 | For content: their adherence to the WCAG 2.2 standard. | 4 | For content: we define adoption as the regular use of content guidance for general UX writing and components. We're in the process of selecting the metrics that measure adoption; currently, it's anecdotal. | 3
24 | WCAG compliance. | 2 | WCAG compliance, plus a component-specific checklist and guidelines, because even with fully compliant components you can mess up a11y in implementation. -Arko | 5 | Number of products adopting, out of all the products in the company. Component usage in design and dev in the adopting products. Bug reports, feature requests, and contributions from products. Regular check-ins with each adopting team. -Arko | 3
25 | We don't, but we'd like to start a program to support new program requirements. | 2 | Unfortunately, it's been an afterthought. We actually need metrics to make the case for additional engineering resources to support the implementation of the accessibility work we're proposing. | 3 | Our existing metrics are very political and non-standard in the industry in terms of what they measure. We're proposing a new, updated model based on framework, components, etc., but that has some challenges for buy-in. | 2
26 | We're using the Siteimprove platform to automatically crawl our sites and detect issues, so I'd say there's a lot of reliance on the given scores as a metric. We don't have a consistent/efficient way to measure results of _manual_ testing for each site, though, so I must offer the constant caveat that the score is not a perfect metric. | 3 | I'm currently manually testing UI components and filing tickets against them, so I might be tempted to say "tickets per component," since I have numbers for that. However, I file a few related issues in one ticket, so it's not a perfect count. Also, component complexity varies a lot (some are "atoms" and some are "organisms"). | 3 | The tech team didn't seem to have any reliable visibility into that, so the (new) Director of Digital Platforms has been pushing for something, especially because we need to prioritize a11y efforts right now. We were able to get them to run some scripts to get an idea of component counts per site, but we will need to cross-reference with site analytics (e.g., a huge list on a page no one visits vs. a single component on a homepage). | 3
27 | By having applications accessibility-audited by external experts. Normally, a product team can order/pay for an audit, but that likely seldom happens. Currently, our company has been auditing the vast majority of our applications to get a better picture of where we are. | 3 | The design system components were just audited. Other than that, no good metrics. The axe tools don't find much; maybe that is a metric :D The auditors did find issues, however. | 4 | We do have data and metrics on the adoption of our component library's major versions. We know the proportions of the different major versions in use. We don't have more fine-grained data, such as component-level usage. We do know the teams use the DS components by default, which is awesome. | 5
28 | Lighthouse for automated testing; we also have a manual testing team with screen readers. Tracking happens on Jira or ADO tickets in terms of critical or high-priority accessibility bugs. | 4 | N/A, but I'd want to measure the % of user tasks accessible by representatives of different diversabilities. | 3 | % usage, where applicable. | 3
29 | We have no quantitative metrics around the accessibility of our products, only ad hoc accessibility reviews when a new product or feature is launched. Any issues discovered get turned into Jira tickets, but there is no tracking around how many were created or the most common types of accessibility issues. | 4 | No quantitative metrics, but we run manual and automated tests (using axe plugins for Jest and Playwright) on all components before releasing them. These catch the vast majority of issues, but every now and then something gets through that cannot be found by these automated tests; an accessibility expert at the company catches it and reports back to the design system team, and a Jira ticket is created to resolve the issue. | 5 | Currently, this is done completely manually, as the design system is relatively new and the design system team is intimately involved in most adoption projects. | 2
30 | We don't 🙃 Right now we're in a very bad place because there's no standard accessibility practice for designing and shipping experiences. | 1 | For the DS, we're starting with a basic checklist, though: things like whether all the components have keyboard navigation, support VoiceOver, meet color contrast, etc. | 2 | We're looking at tracking this in code by seeing where we can identify the components that are prefixed with ds. | 2
31 | Using evaluation tools like WAVE, and usability tests internally. | 4 | Using tools and checkers like WAVE, and usability tests. | 5 | We can only capture Figma access and GitHub commits. | 3
32 | We have an accessibility specialist embedded in the team, who constantly audits the system and raises tickets in the backlog. The system is scored according to WCAG AA compliance and the severity of the unresolved issues. | 1 | We are now pretty good, with just a few major issues outstanding. The aim is to get above 85% compliance in the next quarter. | 4 | Adoption is a many-layered cake, and fully using the code and design assets is only the top layer with the cherry on top. So honestly, I try not to define or measure this unless absolutely forced to, as it's a waste of my time. I'd rather spend the time building a better product and helping people use it well. The Accessibility Manager is much more interested in a brute "adoption" metric (though this applies much more strictly to the code), so there he relies on understanding which teams there are and individually checking on their adoption status. | 3
33 | Use of a checklist; both automated and manual testing. | 2 | Not sure if this applies to me, as I currently don't measure the DS. | 3 | Probably looking through all the pages of a website at how many components from the DS are used vs. one-offs. | 3
34 | Accessibility scores. A11y champions. | 3 | Each component has accessibility criteria and a score attached. | 5 | Working on defining an adoption metric for the new design system launching in Q1 2025. | 4
35 | I use accessibility tools such as Deque's and monitor users during interviews. | 1 | N/A | 1 | N/A | 1
36 | Each individual product goes through an accessibility test by a team focused solely on accessibility. Products get a score and a list of things to improve or adjust. | 3 | We have an accessibility score we measure against. | 3 | We're working on defining these metrics now. | 2
37 | We don't track metrics, but we do have dedicated accessibility consultants who are integrated into our team. We follow WCAG standards as a baseline. | 4 | We do regular a11y audits and keep track of outstanding findings that need remediation. We would like to measure this, and it's something we are in talks about with our a11y partners. | 4 | We keep an inventory of UI tools in production (using an in-house insights team API), tracking which ones use our DS and which are using something else like Ant Design, Semantic UI, or Material. As PM, I meet with product teams to see how converting them to our DS can fit into their roadmap. | 4
38 | We measure accessibility through defects and user testing feedback. | 4 | We measure accessibility through embedded inclusive design and accessibility support on the DS team. We also have a team doing reviews when components have been added within product features. | 4 | We measure adoption through Figma analytics; we have a lot more work to do. | 3
39 | We are fortunate enough to have an accessibility team that we consult with constantly. But we designers on the DS team are also very proactive about learning accessible design: we take Deque courses and try to learn as much as we can. We also have colleagues who are blind, so we're learning to adapt to new ways of working to ensure that everyone can participate in design conversations. | 4 | We go through rigorous testing and QA that requires teams' sign-off before things go "live." | 4 | There is another team that does this, so I am unfamiliar with the metrics or studies they use to gather this data. | 3
40 | We have an internal plugin, used per Figma frame, that helps measure color contrast for text and shapes. For the current spike to come in line with the EAA, we have been sourcing external vendors to help us with a wider-ranging audit. Oftentimes designer audits cover color contrast only; these external audits also go through screen reader and ARIA label issues. | 3 | Using our internal plugin, a contrast app plugin, and Sim Daltonism (for color vision). | 4 | Aiming to hit WCAG AA for color contrast. We have an internal scoring system tied to contrast and usability. The scale is from 1 to 5, 1 being non-functional and 5 being completely functional. | 4
41 | Each web page that is launched is required to go through a review by an accessibility tester before going live. All issues found are logged, and if they are critical, production is blocked. A log is kept of how many pages contain non-critical issues and how many are fully compliant. This allows us to have metrics on how many issues are currently present and how many have been remediated. | 5 | Before any design system asset is added to the library or updated, it must be reviewed by an accessibility tester. This allows us to say that anything present in the system has been reviewed and is accessibility compliant. | 5 | We keep track of how many teams and designers/developers are currently using the system. We also measure the number of products launched with our system compared to those that are not. | 3
42 | Great question: I have no idea. We're just now hiring an Accessibility Lead to scope that kind of construct. | 2 | We build our components (and vet our styles) with accessibility in mind, scaffolding as many "hooks and levers" as possible to make system resources accessible out of the box, and then add instructions to our documentation on places where implementation teams need to take it over the finish line (like adding aria-describedby content). | 4 | Any individual or team that uses ANY facet of the system is an adopter; hard stop. | 3
43 | Not sure that we do! | 4 | Our designers research and design according to these findings, then we run our work through our ADA team. | 4 | It's kind of a manual process currently. No good data. Hoping we can build an internal tool to track this. | 4
44 | I haven't been with a company that has measured accessibility as a "score" solely using automated tools. We've strived for certain conformance levels set by WCAG and have involved experts on the topic during builds. | 4 | Involving experts during build phases, and testing components in real usability situations rather than as isolated assets. | 4 | Currently, I'm on a smaller team where we aren't measuring it, because the entire team is bought in. | 5
45 | Through semi-regular audits, automated axe a11y tests, and the axe dev tools. | 3 | We try to build it in from the start, but otherwise use the same techniques as our products: audits, automated testing, and the axe dev tools. | 4 | We haven't decided yet! Hoping to learn from others. | 2
46 | We have an accessibility specialist who reviews things, checks with a screen reader, etc. | 3 | We aren't yet; still advocating to get a content designer involved more deeply. | 1 | Not prioritized / not doing this yet. | 1
47 | We currently don't! I was tasked with conducting an a11y audit of the existing product, as well as of an ongoing re-build of the product. Not a full audit (no "score" provided), but I did assess key pages/templates using axe-core, NVDA, keyboard, voice control, etc., and documented my findings. Before I joined, there was no a11y analysis whatsoever! | 1 | We currently don't! The design system is brand new. We're retrofitting it to an existing product, but it only contains Figma assets; there is no UI component library. I'm trying to introduce Storybook for the re-build, but the dev team is already 80-90% finished! There will be lots of reverse-engineering, with no budget! | 2 | N/A, as it's a brand new design system. | 1
48 | We have an accessibility team who measures it. | 4 | We work closely with the accessibility team to test and review our components. | 4 | We haven't started measuring yet. | 2
49 | We have completed manual audits of existing products to identify issues, labeled as low, medium, high, & critical. Also using automated tools. We have implemented ADA annotations in design handoffs and worked with dev and QA to implement ADA checklists for new features. | 4 | Through ADA reviews at different stages of creation (design, QA). | 5 | We have a small enough product organization that the system has been adopted across the board by all designers and devs, so tracking adoption has not been a priority. | 5
50 | Axe, manual QA, Fable. | 3 | Keyboard navigation automated tests, manual screen reader testing, axe. | 5 | Built a code-scanning tool that looks for instances of component usage in code. | 3
51 | As a designer: color contrast and legibility. | 3 | Color contrast, legibility, focus states/order. | 4 | Design teams trying to use the design system assets when possible and working with us to make enhancements to components, vs. just breaking them or building something similar to a component but different. | 4
52 | We are currently assessing how we will implement this in the organization. Our design system recently introduced the concept of measuring inaccessible components. | 3 | We are implementing a script that looks at ARIA labels. Since we are migrating from an old library to a design system, all new components are developed with accessibility features; adoption of these over time will also play a role in accessibility. | 4 | We define adoption as one team using more than one component more than one time, with that action repeated over time across applications. To add context, we are at the very beginning of our design system, with only a handful of components in pilots, but that is the definition we have as a baseline. | 1
53 | We use tools to review code for accessibility compliance. We manually review designs for accessibility. | 3 | Manual reviews with accessibility staff. | 4 | Currently manually observing teams' work for adoption. Working on adding analytics to scan the code repos for adoption. | 3
54 | We don't. We have some pipeline checks to make sure we don't ship things that are in obvious violation, but we don't track any metrics. | 2 | We do not measure it. | 4 | We do not measure our adoption. | 5
55 | Currently, we do not measure it. | 3 | Manual checklist. | 4 | Currently, we do not; it's a work in progress to develop a system. | 3
56 | Active usability testing with people with disabilities and those who use accessible technology. We also actively monitor two types of automated scans and do quarterly reporting on them. | 4 | Active usability testing with people with disabilities and those who use accessible technology. We also do regular accessibility audits on our components and provide guidance about how to test them in products that use our design system. | 4 | Big debate point right now! | 2
57 | While we design & QA with accessibility in mind, we don't have anything in place to measure product accessibility. | 3 | We use plugins for accessibility checks on color contrast, click & touch target size, color blindness, etc., and adhere to a minimum of WCAG AA standards. | 4 | Design system adoption is not yet measured. (The design system is in its baby stages; while it will replace the old design systems currently used across our products, we are initially just starting with slowly adopting components as they are built across a few of our products. The rest will start adopting once the DS is more mature.) As a first step, we will be starting with zeroheight's new adoption feature, which allows us to see which teams are using which components (and versions) and where. | 3
58 | Through performing self-audits utilizing tools like ARC, WAVE, and the axe browser plug-ins. Also, automated accessibility front-end tests. | 4 | Audits of color contrast ratio. Not sure how to perform other audits, though perhaps a check for the presence of alt text with certain designs. I hear Figma has an a11y plugin, but I don't believe my team utilizes it. | 2 | We are part of a small org that is just beginning to use a design system, so we can't yet measure adoption. One goal we have, for example, is to standardize the number of input fields we use. I believe defining adoption would mean that we only implement our established/recommended components and practices. | 1
59 | We don't have any formal metric for accessibility, but my DS team is well versed in accessibility guidelines, and we also leverage a few different tools while designing/developing the DS to ensure we meet minimum accessibility requirements. The DS, as a product, is very accessible, but at a broader org level other teams struggle with accessibility requirements. | 2 | We don't have any formal metric for accessibility, but our internal accessibility team will run an audit on assets and provide results in a spreadsheet for us to review and fix. | 4 | This is something we are actively trying to work on as a team. Our org's DS has been live for a few months now, but we are currently struggling with how best to measure adoption. | 2
60 | We don't, and that's a problem. Design knows it's a problem, but leadership isn't concerned. Our guidance has been to try to make it as accessible as you can, but it's not a priority. So although design is incorporating accessibility to the best of our ability, is development? | 2 | We're not. We need to be better. | 2 | We're still working through this process. Our design system has a ROCKY past that made it virtually impossible to have any adoption. We're making headway and are finally at a point where we can start capturing this data. | 3
61 | Pass automated and manual checks, and usability testing. | 3 | Adherence to accessibility standards; user testing with diverse users. | 3 | Track how often components from the design system are used in various projects. This could include specific UI elements, patterns, or styles. | 3
62 | QC specialists have a11y acceptance criteria. It's part of the designer QA process as well. We try our best to talk through a11y considerations at every step. | 3 | We track a11y "status." This is a gut check based on semantics, color contrast, and things like ARIA. | 3 | Number of products in compliance with the DS; number of teams/verticals embedded with the DS. | 4
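
A few of the techniques respondents mention are concrete enough to sketch in code. Rows 19 and 29 both describe automated accessibility tests with jest-axe. Here is a minimal sketch of that kind of test; the `Button` import is a hypothetical design system component and the render setup is an assumption, not any respondent's actual suite:

```tsx
// Minimal jest-axe test for a single component.
// `Button` is a hypothetical DS component; swap in your own.
import React from 'react';
import { render } from '@testing-library/react';
import { axe, toHaveNoViolations } from 'jest-axe';
import { Button } from './Button';

// Register the custom matcher (often done once in a Jest setup file).
expect.extend(toHaveNoViolations);

test('Button renders with no detectable a11y violations', async () => {
  const { container } = render(<Button>Save</Button>);
  // Runs the axe rule engine against the rendered DOM. This only catches
  // machine-detectable issues; it does not replace manual screen reader
  // and keyboard testing, which several respondents also do.
  const results = await axe(container);
  expect(results).toHaveNoViolations();
});
```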
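WCAG AA color contrast comes up in many answers (rows 3, 20, 40, 58, and others). The AA thresholds are directly computable: WCAG 2.x defines relative luminance over linearized sRGB channels and requires a contrast ratio of at least 4.5:1 for normal text (3:1 for large text). A self-contained sketch of the standard formulas:

```ts
// WCAG 2.x contrast ratio between two sRGB colors,
// using the spec's definition of relative luminance.
type RGB = [number, number, number];

function channelToLinear(c: number): number {
  const s = c / 255;
  return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}

function relativeLuminance([r, g, b]: RGB): number {
  return (
    0.2126 * channelToLinear(r) +
    0.7152 * channelToLinear(g) +
    0.0722 * channelToLinear(b)
  );
}

function contrastRatio(a: RGB, b: RGB): number {
  const [hi, lo] = [relativeLuminance(a), relativeLuminance(b)].sort(
    (x, y) => y - x
  );
  return (hi + 0.05) / (lo + 0.05);
}

// #767676 on white is the classic just-passing AA gray (~4.54:1).
console.log(contrastRatio([118, 118, 118], [255, 255, 255]).toFixed(2));
```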
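Several teams (rows 8, 30, and 50) track adoption by scanning code for design system component usage; row 30 specifically looks for a `ds` prefix. As a rough illustration of the idea, and not any respondent's actual tool (only GitLab's adoption scanner, linked in row 8, is public), here is a sketch that counts JSX component tags by prefix. The `Ds` prefix, the `src` root, and the file extensions are all assumptions to adapt:

```ts
// Rough adoption scan: count JSX component tags by prefix across a repo.
import { readFileSync, readdirSync, statSync } from 'node:fs';
import { join } from 'node:path';

function walk(dir: string, files: string[] = []): string[] {
  for (const entry of readdirSync(dir)) {
    const full = join(dir, entry);
    if (statSync(full).isDirectory()) walk(full, files);
    else if (/\.(tsx|jsx)$/.test(full)) files.push(full);
  }
  return files;
}

let dsTags = 0;
let otherTags = 0;
for (const file of walk('src')) {
  const source = readFileSync(file, 'utf8');
  // Capitalized JSX tags are components; lowercase tags are plain HTML.
  for (const [, tag] of source.matchAll(/<([A-Z][A-Za-z0-9]*)/g)) {
    if (tag.startsWith('Ds')) dsTags += 1;
    else otherTags += 1;
  }
}

const total = dsTags + otherTags;
console.log(`DS tags: ${dsTags} of ${total} component tags`);
console.log(`Adoption: ${total ? ((dsTags / total) * 100).toFixed(1) : 0}%`);
```

As the respondent in row 8 notes, a raw count like this says nothing about whether a component is used correctly or for the right use case; it is a starting signal, not a quality measure.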