
Public perspectives on facial recognition technology: Attitudes, preferences, hopes, and concerns

Survey Summary Brief

One application of artificial intelligence (AI) that has entered public consciousness in recent years is “facial recognition technology” (FRT), which is used to identify individuals by comparing an image of their face against a database of known faces in search of a match. Related uses of FRT purport to use facial imagery to measure a person’s attentiveness, to gauge characteristics like honesty, and to confirm that a person in an image is who they say they are.

FRT has developed significantly in recent years due to important technology gains and the ubiquity of cameras in society. And the use of FRT throughout society is growing. While public sector use is gaining momentum (in applications such as law enforcement and citizen service delivery), private sector actors are using FRT and related biometric technologies — such as scans of fingerprints, iris and retina, or voice patterns — for operations such as identity verification, secure-space access control, identification of individuals on watchlists, controlling regulated public spaces such as casinos, monitoring remote workers, invigilating online tests, conducting and assessing automated employment interviews, and proactively identifying customers who enter an establishment. On the receiving end of this technology, FRT provides clients and customers with the convenience and security of personal identification such as when unlocking a smartphone or gaining access to an electronic service.

How might people feel about such uses of FRT? Where do their concerns over the technology outweigh its possible benefits, and when do they judge those benefits as offsetting any potential risks? As public and private sector actors explore ways in which FRT can support their aims, and as governments, civil society, and technology developers navigate the social and technological implications of its use, this research sought to empirically test what Canadians deem acceptable in FRT applications used by private sector actors. The focus is on attitudes towards, preferences with respect to, hopes for, and concerns about private sector use of FRT, as well as the policy-relevant aspects of public acceptance of that use.

This document reports on the results of a survey of approximately 3000 Canadian residents, from all provinces and in both official languages, focused on public perceptions of emerging uses of FRT in private sector settings. After preliminary demographic questions including gender, age, race, province and community type, citizenship, education, and income, respondents were asked for their perspective on six FRT implementation scenarios (covering a range of applications including workplace security, airline boarding, unlocking a smartphone, banking, hotel access, and work-from-home monitoring) and for their response to 29 statements related to a hypothetical FRT banking application.

The key measure explored is a concept called “behavioural intention”, which predicts, as closely as a survey can, what a person might do if actually faced with the technology rather than merely contemplating a hypothetical application. While we cannot know for certain whether actual use would follow an intention to use something, psychological research has consistently shown that behavioural intention is a good proxy for actual use: the stronger the intention to perform a behaviour, the more likely it is to be performed (Ajzen 1991). The survey results indicate that several concepts have significant positive effects on behavioural intention (that is, the higher a respondent scores on those measures, the more likely they would be to use the technology), while one measure has a significant negative effect:

“Perceived risk” — a measure of the user’s belief that the hypothetical banking technology will put their privacy and money at risk — has a significant negative effect on behavioural intention to use the technology. As perceived risk increases, respondents would be less likely to use the technology. This relationship is stronger for younger participants than older ones.

The central results from this survey are that consumers will adopt systems that are easy to understand and use, and that integrate well with their current technology. They will also be more likely to adopt such systems if their friends and family adopt them, if they trust their bank, and if they feel the systems make for a better banking experience. However, if they feel FRT puts them or their money at risk, they will reject its use.

FRT today is the science fiction of yesterday. This is true of most technologies: incremental advances come together seemingly rapidly and appear in a futuristic state that is both appealing in its convenience and striking in its seeming magic.[1] FRT systems built on ubiquitous digital cameras, AI-supported interpretation of the data harvested from those devices, and the creeping normalization of the functions that flow from those embedded, connected systems point to a future of increased FRT-based surveillance, one that offers appealing social functions like security and safety alongside personal functions like convenience, speed, personalization, and system performance.

AI technologies like FRT offer new opportunities for the management and administration of the public service, the delivery of citizen service, and the development of public policy. Simultaneously, the use of FRT by private sector actors raises important governance questions for citizens, civil society, and public sector leaders. As much as governments must prepare for the effective application of FRT to meeting public sector goals, governments must also facilitate the positive use of FRT in the private sector while responding to the governance challenges it presents.

These issues (the effective use of FRT by the public sector, and the regulation of its use in the private sector) confront governments around the world. As jurisdictions globally attempt to manage these challenges, the cultural, economic, and governance similarities between Canada and the United States provide a valuable testing ground for revealing best practices, hopeful experiments, emerging concerns, and potential dangers. This project offers governments and public policy researchers in Canada, the United States, and beyond actionable insights into the beneficial uses of FRT in the public sector and the appropriate regulation of private sector uses of FRT.

However, as FRT and related technologies are increasingly adopted in the public sector, and their private sector uses push beyond the boundaries of current regulation, we will face twin challenges. First, in deciding whether and how to adopt new technologies, public sector actors will need to consider not only the cost/benefit assessment and the appeal of enhanced citizen services, but also whether the full spectrum of societal interests is captured in those adoption decisions. Second, in responding to the use of such technologies by private and public sector actors, governments will need to balance competing interests so that our societies benefit fully from new technologies: promoting domestic technology development, fostering productivity gains through private sector adoption of world-leading technologies, and supporting competitiveness and improvements in service quality to ensure consumer utility, while mitigating potential negative effects on workers’ rights, environmental sustainability, fairness, and strategic economic development.

Further information

Dr. Justin Longo, Associate Professor

Johnson Shoyama Graduate School of Public Policy

University of Regina - Regina, Saskatchewan

Contact: https://www.digitalgovernancelab.org/contact/ 




[1] Reiterating the third of Arthur C. Clarke’s “three laws”, often stated as “any sufficiently advanced technology is indistinguishable from magic” (1977).