EU Copyright and Data Access in the Age of AI: Insights from the INSPIRING ERA Exchange
As artificial intelligence accelerates scientific discovery, European researchers are navigating an increasingly complex legal landscape. How can copyright, data access, and open science frameworks evolve to support innovation while protecting rights and security?
This question brought together research managers, legal experts, open science practitioners, and EU policymakers at the INSPIRING ERA Exchange on ERA Actions 1 and 2, held in Paris on 9 July 2025 in partnership with the IP4OS project.
The event explored how Europe’s copyright and data legislation can better serve researchers in an AI-driven world—highlighting the need for harmonised rules, practical guidance, and institutional support to ensure that open science remains both legally sound and globally competitive.
Setting the Scene
Opening the event, Bertil Egger Beck from the European Commission’s DG RTD placed the discussion within the broader transformation of the European Research Area (ERA). “We’re entering a new phase of coordination and accountability,” he noted, describing how the ERA Policy Agenda 2025–2027 aims to integrate open science, research security, AI ethics, and knowledge valorisation into a single strategic vision.
Open science, he reminded participants, is not merely a principle but a practical tool to enhance quality, efficiency, and trust. “Transparency drives excellence. But for openness to thrive, our legal and technical systems must evolve with it.”
Victor Alamercery (DG CNECT) elaborated on the Digital Services Act’s Article 40, which for the first time grants vetted researchers access to data from major online platforms and search engines. The framework, he explained, enables scholars to study societal risks such as disinformation and algorithmic bias while upholding privacy and data security.
However, he warned that operationalising this right requires secure infrastructure, harmonised accreditation, and trusted technical standards—areas where EU coordination remains uneven.
Legal and Academic Perspectives
Kacper Szkalej (Institute for Information Law, University of Amsterdam) provided an in-depth analysis of how EU copyright exceptions affect research. While the DSM Directive introduced mandatory text and data mining (TDM) exceptions, implementation remains fragmented.
“Researchers face a patchwork of interpretations,” Szkalej observed. “We need harmonised exceptions that treat access to knowledge as a right, not a regulatory obstacle.” He advocated recognising research exceptions as enablers of academic freedom in the digital age.
Thomas Margoni (KU Leuven Centre for IT & IP Law) broadened the view, mapping a dense web of EU instruments—the Open Data Directive, Data Governance Act, Data Act, Digital Markets Act, and AI Act—that collectively shape research data governance. His central message: definitions and procedures must align. “Researchers shouldn’t need a lawyer to collaborate across borders,” he said, proposing a coordinated ‘Researcher’s Act’ to unify legal standards for digital research in Europe.
From the civil society side, Teresa Nobre (COMMUNIA Association) examined how general-purpose AI models challenge open science principles. While the Copyright in the Digital Single Market Directive (CDSMD) created new pathways for lawful data use, she highlighted lingering uncertainty around opt-out regimes and the legality of AI-generated outputs. “AI is testing the boundaries of our legal frameworks,” she said. “We must ensure that the rules foster innovation, not fear.”
Finally, Julia Priess-Buchheit (IP4OS, Kiel University) presented the IP4OS Synergy Framework, a model for integrating intellectual property management, data ethics, and research security. By aligning legal experts, data stewards, and researchers, it helps institutions manage risks, from export controls to data provenance, and maximise the value of FAIR outputs. “Security and openness can coexist,” she concluded, “but only if they are designed to reinforce each other.”
Challenges on the Ground
Participants from across Europe identified a shared set of systemic and operational obstacles:
- Legal fragmentation across Member States
 National transpositions of EU directives differ widely, leading to uncertainty and duplicated effort in cross-border projects. Researchers often struggle to determine whether an activity—such as text and data mining—is legal in all participating countries.
- Lack of institutional expertise
 Few universities or research organisations employ dedicated copyright or data governance officers. This leaves researchers to interpret complex legal texts without professional guidance.
- Insufficient awareness of data rights and obligations
 Many researchers remain unaware of their rights under the DSM Directive or the new data access frameworks, leading to underutilisation of available legal tools.
- Tension between openness and security
 Expanding open science infrastructures without clear governance can expose institutions to cybersecurity threats, data leaks, or misuse of sensitive research.
- Administrative burden and compliance fatigue
 Multiple overlapping laws (e.g., GDPR, DGA, DSA) require extensive documentation, discouraging smaller organisations and institutions in Widening countries from engaging in data-driven research.
- Ethical and technical uncertainty in AI research
 Questions persist around lawful data use for AI training, intellectual property in AI-generated works, and the accountability of AI-assisted publications.
- Fragmented support structures
 The absence of a coordinated European competence network for copyright and data access leaves institutions isolated in managing compliance and capacity building.
Pathways Forward
In their discussions, experts and participants identified several priorities for action at both policy and institutional levels:
- Harmonise copyright and data access rules across Europe
 Establish clear, consistent interpretations of research exceptions, TDM rights, and lawful data reuse through EU-wide coordination.
- Build institutional capacity
 Introduce dedicated copyright and IP officers in research institutions—modelled on the GDPR Data Protection Officer system—to guide compliance and training.
- Develop accessible legal tools
 Create plain-language templates, online guides, and automated checklists to support researchers in navigating copyright and AI governance requirements.
- Integrate legal literacy into research training
 Embed copyright, data governance, and ethics modules in doctoral and postdoctoral programmes to empower the next generation of researchers.
- Foster secure, FAIR-by-design infrastructures
 Expand trusted repositories, cybersecurity standards, and risk protocols to ensure open science remains both transparent and safe.
- Promote a European ‘Researcher’s Act’
 Coordinate digital and legal standards for research across EU legislation—ensuring that academic freedom and data access are enshrined as fundamental principles.
- Encourage cross-sector cooperation
 Link open science actors, IP professionals, and research security experts through competence centres and European networks of practice.
Why It Matters
The Paris Exchange underscored that open science, legal certainty, and research security are not opposing forces—they are interdependent pillars of Europe’s knowledge ecosystem.
As one participant summarised:
“Researchers need freedom to innovate, clarity to comply, and trust to collaborate. That’s the balance Europe must strike.”
By bridging legal frameworks with practical research realities, the INSPIRING ERA Exchange on ERA Actions 1 and 2 contributed vital insights toward a more coherent, secure, and innovation-friendly European Research Area—one that empowers scientists to harness AI and data for the collective good.