This collective work presents a study conducted to assess how important the ethical component is for Russian organizations and individual business representatives working in the development and implementation of systems and applications based on artificial intelligence technologies.
The study followed the methodology described below, developed by experts of the MGIMO Center for Artificial Intelligence with regard to current international practices and standards, which allowed the authors to draw on the best practices in this field and ensure high-quality research.
The research methodology comprises several basic groups of indicators:
- project activities of organizations related to the ethical aspects of AI;
- the level of cooperation between organizations in their work with AI;
- participation of organizations in global and national processes concerning AI ethics;
- transparency of policies and processes;
- investment in the ethical aspects of AI.
In turn, several relevant subcategories were identified within each group of indicators, which made it possible to obtain reliable and detailed results.
After processing the results, relevant quantitative and qualitative indicators were identified. These indicators differ to some extent from those used in earlier surveys on various aspects of artificial intelligence. At the same time, some of them had already been applied in other studies conducted in Russia by the time of this work, and combining them here was necessary to arrive at unique and reliable conclusions.
The final result of the study is a report that offers well-founded conclusions about the actual level of attention to, and engagement with, the ethical component of artificial intelligence in Russia among organizations and entrepreneurs that develop and implement AI projects.