The acceleration of digitalisation since the beginning of the 21st century has brought radical transformations to production models and placed the Industry 4.0 process at the centre of the global agenda. Technologies such as the Internet of Things (IoT), cyber-physical systems, big data analytics, artificial intelligence (AI), cloud computing and augmented reality affect not only industry but also the functioning of states, international relations and legal frameworks. This transformation is more than a technical evolution: it has significant effects on the fundamental norms of international law and the concept of sovereignty, prompting a search for a new global order.
While Industry 4.0 builds on the classical understanding of industrial production, it establishes a new production paradigm extending from digital twins to robotic factories. This paradigm, however, does not merely reshape production systems; it brings with it a series of new problems, from cybersecurity to intellectual property rights, from data sovereignty to military applications of artificial intelligence.
The development of artificial intelligence places on the agenda not only the digitalisation of production processes but also the transformation of human decision-making mechanisms. In this context, the spread of artificial intelligence in the form of autonomous decision-making systems has begun to test the principles of human agency and responsibility that underpin the rule of law. International law, particularly human rights law, the law of armed conflict and liability regimes, faces uncertainty about how to address the acts of autonomous systems. For example, it is not yet clear whether the state or the company that developed the system bears responsibility when an AI-supported weapon system kills a civilian. Such scenarios raise the question of whether existing international law is adequate to cope with the unpredictability of artificial intelligence. In this sense, it can be said that Industry 4.0, and the artificial intelligence that has become an inseparable part of it, have widened what the legal literature calls "normative gaps".
Data has become the strategic resource of Industry 4.0 and is increasingly described as a new form of power. With digitalisation, the ability to control data has begun to determine the economic and political power of states. In this context, the European Union's regulations shaped around the concept of "digital sovereignty", notably the General Data Protection Regulation (GDPR) and the Artificial Intelligence Act (AI Act), and its efforts to set the direction of global data law are noteworthy. However, the more market-oriented models created and pursued by countries such as the United States and China generate legal challenges in data governance. This indicates that international law still lacks an effective regime capable of governing global data flows. In the international arena, sovereignty is no longer defined solely by the protection of borders; it is also redefined by control over digital infrastructure, algorithmic transparency and data-processing capacity.
States, the traditional actors of international law, have entered into a competition over sovereignty with technology firms in the context of Industry 4.0. In particular, the transnational activities of companies that develop artificial intelligence and manage global data flows (such as Google, Amazon and Baidu) are weakening the Westphalian conception of sovereignty and strengthening the role of non-state actors in international law. In this context, a new kind of "digital proxy war" is emerging, and states' security policies are increasingly entangled with the interests of technology firms. In such an environment, the question of which ethical norms will govern decisions on the use of artificial intelligence on the battlefield or in public administration has become critical, since these systems are known to suffer from problems such as algorithmic bias and lack of transparency. This obliges the international community to build a common normative basis against the violation of human rights by algorithms.
The digitalisation wave of Industry 4.0 also creates cracks in the law of war. For example, when unmanned aerial vehicles (UAVs) that identify targets with AI-based systems cause harm to civilians, how the obligations arising from the Geneva Conventions, such as the principles of proportionality and distinction, will be enforced remains a grey area. Discussions under the UN umbrella on lethal autonomous weapon systems (LAWS) reveal an AI-related normative gap in the law of armed conflict. At this point, the question of how much human control should be retained within such systems has become not only a technical but also an ethical and legal issue. How the principles of international humanitarian law will be applied to these new weapon designs affects both the nature of war and peacetime security rules. In addition, the question of who bears responsibility when an algorithm outside the chain of command causes harm by misidentifying targets on the battlefield also remains unresolved in international law.
The intellectual property regime is also changing in the context of Industry 4.0. When artificial intelligence generates inventions eligible for patents or creates original works on its own, the question of who owns the rights arises. Copyright law rests on the assumption that the author is human, so there is as yet no legal basis on which artificial intelligence systems could claim rights over their outputs. The discussions initiated by the World Intellectual Property Organization (WIPO) reveal the need for global consensus. Different jurisdictions approach these questions in different ways, which complicates the global protection of digital outputs and places strain on multilateral legal institutions. In this context, redefining intellectual property in the age of artificial intelligence is a strategic issue affecting economic competition and technology transfer.
In the context of Industry 4.0, cybersecurity is becoming central to international law. Cyberattacks are now used as a new form of warfare in interstate relations, and when such attacks should be regarded as "armed attacks" is the subject of serious debate in international law. Although NATO has taken the first initiatives on this issue through the Tallinn Manual, no binding rule has yet been established at the UN level. The international community therefore needs to develop common norms on matters such as sovereignty in cyberspace, self-defence against cyberattacks, attribution of attacks and compensation for damages. Otherwise, the culture of impunity in cyberspace will deepen and international stability will be threatened.
With Industry 4.0, states' instruments of foreign policy and diplomacy are also shifting. Digital diplomacy, public diplomacy, information warfare and manipulation carried out via social media are now inseparable parts of international relations. These developments bring the concept of "information sovereignty" to the fore and make the struggle against disinformation a foreign policy priority. Threats such as interference in elections, digital espionage and the production of fake news strain the principle of non-intervention and the principle of friendly relations under international law. Drawing the legal boundaries of digital diplomacy is therefore strategically important to prevent technology from becoming a geopolitical weapon.
In conclusion, Industry 4.0 is not only an economic and technological process but also a multi-layered transformation that tests the limits of international law. Artificial intelligence, cyber systems and data-based production models are forcing a reinterpretation of essential principles of international law such as sovereignty, responsibility, transparency and human rights. In this context, the global community faces three main tasks: filling normative gaps, establishing a new balance of responsibility between states and technology companies, and reshaping the international legal regime around ethical norms that centre on human dignity. In this process, cooperation among international organisations, academic circles and civil society is not only a legal but also a moral responsibility. Otherwise, the technological opportunities offered by Industry 4.0 may overpower the legal achievements of society.