Essays on user adoption of connected and autonomous vehicles
Köster, Nils; Salge, Torsten-Oliver (Thesis advisor); Piller, Frank Thomas (Thesis advisor)
Dissertation / PhD Thesis
Dissertation, Rheinisch-Westfälische Technische Hochschule Aachen, 2021
Cars have increasingly become “rolling computers”, as connected and autonomous vehicles demonstrate. Connected cars are part of the Internet of Things (IoT), collecting and processing data of unprecedented volume and specificity through their on-board sensors and communication modules. Connected car services, such as real-time parking space reservations or voice-assisted concierge services, are becoming increasingly important for car buyers. Autonomous vehicles (AVs) are a particularly far-reaching application of intelligent automation: enabled by artificial intelligence (AI), cars will be able to transport their passengers without human intervention. This ability is expected to increase passengers’ available time and comfort. However, these benefits come with a loss of control for individuals, both over their personal data and over responsibility for their safety. Tragic accidents involving autonomous test vehicles and hacker attacks against connected cars have exemplified the potential risks. This raises the question of how users assess and account for potential risks when deciding whether to use autonomous vehicles or connected cars. Against this backdrop, my thesis addresses how research into individuals’ decisions to use a technology despite its risks can be expanded not only in breadth but also in depth through these boundary-shifting contexts of intelligent automation and the Internet of Things. I conducted two extensive empirical research projects. In my first project, entitled “Connected Cars, Privacy Risk, and Individual Contingencies”, I examined how individuals perceive and consider privacy risks regarding connected cars as a particularly pervasive IoT application. I explicate how individuals arrive at an assessment of privacy risks based on the specific negative consequences they anticipate from connected cars along dimensions such as physical safety, psychological well-being, social status, or freedom of action.
I also found that in the highly regulated context of connected cars, individuals who are more inclined to comply with rules perceive lower exposure to privacy risks. The extent to which these privacy risk perceptions ultimately determine individuals’ decisions whether to share data is shaped by personal contingency factors. I address the critique that extant research has often conceptualized perceived privacy risk as a rather vague “gut feeling” that sharing data could result in “losses.” In my research, I demonstrate that highlighting the exact negative consequences of privacy invasion in the given IS context enhances compatibility with other areas investigating risk in consumer decision-making, better reflects the merging of virtual and physical space in the IoT, and helps practitioners improve service design. My second project is entitled “Autonomous Vehicles, Initial Trust, and Structural Assurance” (Articles II and III). To assess whether they can trust AVs, individuals need to receive signals that AVs have adequate functionality, reliability, and transparency; in other words, that these vehicles make plausible and predictable decisions. I investigated how five structural assurance mechanisms, namely technical, provider, legal, certifier, and social protection, can be designed, and I quantified their effectiveness as signals of initial trust through conjoint experiments with representative samples in Germany, China, and the USA. For instance, I found that legal and certifier protection can be more effective in building trust than signals from the technology provider itself. On the one hand, I identified differences between the studied countries, indicating that signaling effectiveness is sensitive to the cultural and social context. On the other hand, I also distinguished different user groups within the studied countries.
These groups differ in their preferred structural assurance mechanisms, indicating that signaling, as a partially subjective process, is sensitive to individual characteristics. These results also yield implications for practice and policy by offering recommendations for the design of structural assurance mechanisms and the prioritization of user groups.