There is a growing concern about the social impact and dangers of intelligent and autonomous systems in our hyper-connected society. It has been argued that one sensible approach to address this concern is to find ways to align the technology with the ethical and social values held by the communities and individuals that use those systems. In spite of being recognised as a central research goal for the future of AI, this Value Alignment Problem (VAP) is still little understood and poorly explored.
We propose to address the VAP in the context of hybrid social networks —those that serve to coordinate a collective activity that involves humans and artificial entities— because autonomy plays a significant role in them, and because of their substantial economic and social impact.
Our approach is based on the Value-Sensitive Design (VSD) initiative, which strives to include an analysis of human values (privacy, fairness, solidarity...) as part of the design process of a system. In addition to raising awareness of the problem, VSD has formulated some heuristics for eliciting those values and has motivated directives for specific values such as privacy and security. However, there is no formal backing and little technological development beyond the challenge VSD identifies.
We intend to go a step further and make those intentions operative by imbuing a system with values. We propose to link values with the norms that regulate interactions in the systems that support social networks.