Vast amounts of data mean a vast amount of responsibility

With the advance of social networks, artificial intelligence and robotics, many new ethical questions arise. Companies in particular face growing demands to use digital technologies and data responsibly. The term “Corporate Digital Responsibility” has been coined to describe this responsibility.

Endless questions

There is no doubt that digital transformation is making our lives better in many ways and that it plays an important role in meeting the major challenges of the 21st century – whether at work or in our personal lives. The Covid-19 crisis has demonstrated this once again, from working from home to warning apps and digital vaccination records. However, even with all the advantages of digitalization, we must not lose sight of the open questions, which are at least as important: How do we reconcile the advantages of networking with data protection and privacy rights? How can we protect intellectual property when information can be shared with the world in seconds? How do we deal with cryptocurrencies that pry the financial system away from state control? How do we ensure that social relationships and social cohesion are not lost despite the increasing use of digital communication?

Digital change also raises highly complex ethical questions: How can we protect the right to informational self-determination despite global networking? How do we remain responsible citizens when algorithms eventually know us better than we know ourselves? And which value compass should artificial intelligence apply when, for example, a self-driving car in a dangerous traffic situation has to decide between the life of the driver and that of other road users?

Companies need to act

Many more questions could be added to the list. But who has the answers? As usual, the call for political action grows loud here, since these are obviously questions that concern us all. However, technological progress is now so rapid that legislation often cannot keep pace. That’s why close dialog between politics, the private sector and civil society is necessary.

In this context, the term Corporate Digital Responsibility (CDR) has become established over the past five years to describe the responsibility of companies in the digital society. In concrete terms, CDR refers to the voluntary commitment of companies to responsible business practices that take into account the social implications of digitalization.

Another important step in this direction is the CDR initiative of the German Federal Ministry of Justice and Consumer Protection, launched some time ago together with several companies. The initiative has led to the development of common CDR guidelines and is intended to motivate companies to pursue value-oriented digitalization beyond what is required by law.

I believe that companies, as major beneficiaries and the driving force behind digitalization, bear a particularly great responsibility. After all, digital technologies enable more efficient production and innovative new business models. In addition, digital products and services give companies an ever deeper insight into our lives. In return, they should define clear guidelines for digital transformation and for their own actions.

Digital responsibility in practice

Of course, what exactly digital responsibility requires in practice differs from company to company. Nevertheless, there are a few basic guidelines that everyone should follow, such as the correct handling of data. It should be a matter of course for companies to manage the data of employees, customers and business partners securely and to rule out passing on data without consent. At Merck, this has long been a firm component of our sustainability strategy.

At the same time, companies should create the greatest possible transparency about the purpose of their data use. Particularly when it comes to ethical issues, external expertise is required. That’s why Merck launched a Digital Ethics Advisory Panel. The committee, which consists of experts from science and industry, advises us on ethical issues concerning the use of data, algorithms and new technologies. We also recently published a Code of Digital Ethics to give all employees a common basis and provide orientation. The code is designed for business purposes and is not an attempt to create regulations. It is divided into five core areas: fairness, autonomy, goodwill, harm avoidance, and transparency. With the code, Merck stipulates, for example, that human intervention in algorithmic systems must be possible at any time. Beyond that, the digital competencies of those responsible for AI or algorithmic systems should be continuously developed.

Investing in education

Digital responsibility also means investing in digital education for everyone. This is not just about bringing digital technologies closer to people in order to prepare them for the changing labor market. It is just as important that people learn to handle digital information confidently, so that they can critically examine content and ultimately make informed decisions. I already emphasized this some time ago in my blog post on digital education.

Incidentally, companies should also assume digital responsibility out of enlightened self-interest, for two reasons: Firstly, to prevent over-regulation by the state, because all progress needs room for maneuver. Secondly, new digital business models will only find long-term acceptance if they correspond to generally accepted social values. Only if companies approach digitalization responsibly will they retain the trust of employees, partners and customers, and ultimately their “social license to operate”.
