This is a survey to crowdsource information about algorithms or AI tools used in the context of healthcare. Please enter one algorithm per survey response.
The purpose of this survey is to identify prevalently used clinical algorithms that may pose risks of bias or discrimination toward certain groups of patients. The information may be used by academic researchers, health policy advocates, and experts who seek to mitigate the risk of bias and promote health equity.
Who should complete this survey?
If you are a healthcare professional, medical student, or trainee who knows about, interacts with, or works with algorithms or AI tools in a healthcare context, please complete this survey.
What is an Algorithm or AI tool used in the healthcare context?
We intentionally do not include a definition of "algorithm" or "AI" here, because the examples of tools you provide will inform a more comprehensive and accurate definition. However, clinical algorithms can be described as "tools used to guide health care decision-making and can range in form from flowcharts and clinical guidelines to complex computer algorithms, decision support interventions, and models. End-users, such as hospitals, providers, and payers (e.g., health insurance issuers) use these systems to assist with decision-making for various purposes." (See the Department of Health and Human Services proposed rule on Section 1557 of the Affordable Care Act.)
This survey will be used by the Institute for Healing and Justice and its members. The data will be used to better understand the scope of algorithms currently used in clinical settings and their potential impact on marginalized groups of patients. Some of the results may be published or otherwise used for the purposes stated above; we will do our best to prevent disclosure of the identities of survey participants and contributors.
Contact Us