Abstract:
There are three main philosophies of technology: instrumentalism, soft technological determinism, and hard technological determinism.
Instrumentalism is the theory that technology is neither good nor bad but a tool in the hands of human beings, who may use it to produce good or bad outcomes. On this view, technology is a neutral means serving ends set by others and therefore carries no inherent moral values. If there is no room for ethical choice in the design of technology, then regulating the technician is irrelevant. Conversely, the user of technology is the one who gives it an end and, therefore, the one to be regulated.
Hard technological determinism, on the contrary, holds that technology causes social change but is itself uncaused by social factors. This line of thought denies human beings the capacity to exert control over technology. Its regulatory counterpart is therefore a laissez-faire strategy: whether lawmakers believe technology will solve all the world's problems (solutionism) or have resigned themselves to its consequences (fatalism), technology is deemed an independent variable and hence immune to regulation.
Between these two extremes lies soft technological determinism, that is, the theory that technology both determines society and is determined by it. Soft technological determinism leaves room for human agency: engineers can inscribe their intentions within technology to shape human behaviour. A lawmaker guided by soft technological determinism will therefore regulate designers. However, soft technological determinism also warns against unintended social or environmental consequences that go far beyond the initial purpose assigned to a technology by its architect.
Against that background, this paper assesses the Artificial Intelligence Act (AIA). As its mandatory requirements target both users and developers, the AIA appears to be driven by both instrumentalism and soft technological determinism. This paper argues, however, that the AIA partially misses the lessons of soft technological determinism when it comes to unintended consequences.