«They say things are happening at the border, but nobody knows which border» (Mark Strand)
by Ettore Maria Lombardi and Alexey Ivanov
Abstract: This article identifies a dual shift in Italian and EU private law under digitalization. Personality rights are “constitutionalized”: from scant civil-code rules to a dignity-centred framework protecting name, image, reputation, and digital identity (including avatars and post-mortem profiles). Data protection remains a fundamental, non-proprietary right with safeguards against solely automated decisions. Concurrently, digital intangibles (cryptocurrencies, NFTs, virtual items) are cautiously integrated into patrimonial law through analogies on property, transfer, and succession, aided by MiCA and comparative reforms. EU instruments (GDPR, DSA, DMA, Data/Data Governance Acts, AI Act) effect “digital constitutionalism,” embedding rights and accountability while preserving an inalienable core of personhood.
Summary: 1. Introduction. – 2. Private law, personality rights and digitalization’s phenomena: from codal fragments to constitutional rights. – 3. Digital Identity and Avatars: Reinterpreting the Self in the Online Sphere. – 4. Personal Data as a Personality Right: The Non-Property Paradigm. – 5. Algorithms, Discrimination, and Dignity. – 6. Virtual Assets: Patrimonial Assimilation of the Digital. – 7. The EU Regulatory Framework: Toward Digital Constitutionalism. – 8. Conclusion.
1. Digitalization in the twenty-first century has profoundly challenged traditional private law categories. Rights once firmly rooted in the corporeal sphere—such as the rights to name, image, reputation, and identity—now extend into an immaterial realm of online avatars, algorithmically-generated profiles, and data-based reputations. Simultaneously, patrimonial law faces unprecedented forms of intangible assets: cryptocurrencies, non-fungible tokens (NFTs), in-game virtual items, and algorithmic creations.
Italian civil law, traditionally cautious in extending codified categories, has had to rely on constitutional principles and judicial creativity to integrate these phenomena into its dogmatic framework.
At the European Union (EU) level, the legislative response has been intense, producing a complex regulatory mosaic that includes the General Data Protection Regulation (GDPR), the Digital Services Act (DSA), the Digital Markets Act (DMA), the Markets in Crypto-Assets Regulation (MiCA), and the forthcoming Artificial Intelligence Act (AI Act). These instruments collectively seek to update legal governance in line with fundamental values.
This chapter addresses the transformation of personal non-property rights and the governance of virtual assets from the perspective of Italian private law and European Union regulation. The hypothesis is that we are witnessing a dual movement: on the one hand, the constitutionalization of personality rights in the digital environment, ensuring that dignity, identity, and autonomy remain protected; on the other, the progressive assimilation of digital intangibles into patrimonial law, through interpretative and legislative processes. The two movements are not parallel but deeply intertwined, as the commodification of digital personae raises questions about the boundary between the personal and the patrimonial.
In what follows, Section 2 discusses the evolution of personality rights in Italian law from the sparse provisions of the 1942 Civil Code to the broader constitutional rights framework, also drawing comparative insights from other jurisdictions. Section 3 explores the notion of digital identity and avatars as new manifestations of the self, including post-mortem digital identity, highlighting how Italian law and other systems address these phenomena. Section 4 analyzes personal data protection as a modern personality right in Europe’s dignity-based paradigm versus property-oriented conceptions, while Section 5 addresses the regulation of algorithms, AI, and the imperative of human oversight to prevent discrimination. Section 6 turns to virtual assets, examining how digital goods are being assimilated into property and inheritance law in Italy and comparatively. Section 7 considers the broader European regulatory framework as a form of “digital constitutionalism” embedding fundamental rights into digital market governance. The Conclusion reflects on how private law’s resilience and adaptability can ensure that technological progress remains aligned with human values.
2. The 1942 Italian Civil Code provides only fragmentary regulation of what we now call personality rights. Articles 6 to 9 concern the name, article 10 protects the image, while honor and reputation are indirectly safeguarded through criminal law provisions and the general clauses of tort liability. For a long time, Italian jurists debated whether these isolated norms could constitute a coherent system of diritti della personalità. Authors noted the insufficiency of the codal provisions, insisting that they merely offered scattered protection[1].
In any event, the turning point came with the 1948 Constitution, whose article 2 recognizes and guarantees the inviolable rights of the person, both as an individual and in social formations. Italian courts gradually interpreted this provision as a generative clause, capable of giving birth to new rights not explicitly enumerated in the Italian Civil Code. Thus, the right to personal identity emerged as an autonomous figure in Cass. civ., sez. I, no. 3769 of June 22, 1985, where the Court recognized that individuals have the right to be represented according to their true personality, without distortions or manipulations[2]. Subsequent jurisprudence confirmed this development, such as Cass. civ. no. 13161 of 2006 and Cass. civ. no. 18853 of 2011, which stressed that even truthful information may infringe personal identity if presented in a misleading context.
Regarding the right to image, article 10 I.C.C. and articles 96 and 97 of the Legge sul diritto d’autore protect against the unauthorized publication of a person’s image. Jurisprudence has expanded this right beyond literal reproduction, covering evocative elements capable of recalling a famous person. The Totò case on the abusive exploitation of another’s reputation (Cass. civ. no. 2223 of 1997) is emblematic: a caricature used by Sperlari S.p.A. on chocolate products was held to violate the actor’s rights[3]. Similarly, in Tribunale Milano, decision no. 766 of January 21, 2015, an advertisement using a Hepburn look-alike was prohibited[4]. Italian courts thus interpret the right to image broadly, aligning it with the protection of personal dignity.
While case law progressively expanded protection, legal doctrine provided the theoretical framing. Personality rights in Italian theory are conceived as absolute rights enforceable erga omnes (against all), extra-patrimonial (non-economic), inalienable, and imprescriptible (not extinguished by lapse of time)[5]. Leading voices, writing in a constitutional key, argue that the inalienability of personality rights is a direct corollary of human dignity: one cannot surrender or trade away essential attributes of personhood without undermining the respect owed to the human person[6]. Other scholars have nuanced this position by observing that certain commercial exploitations of personality attributes (for instance, the licensing of one’s image or name for merchandising) can be compatible with the inalienable core of the right[7].
In other words, there is a distinction between the non-disposable essence of personality, which must remain outside commerce, and a peripheral zone where economic interests in one’s persona can be recognized by law (such as royalty contracts for the use of one’s likeness). This distinction between an inviolable core and a disposable periphery remains central in today’s digital context, where the commodification of personal data, social media profiles, and even virtual avatars is pervasive. Italian law insists that the dignity-based core of personality rights cannot be alienated or waived, even if individuals may contractually allow certain uses of their personal attributes. This approach reflects a broader civil-law and European tradition of tying personality rights to the intrinsic value of the person, rather than a market-centered view.
3. As regards identity on social networks, the spread of social media has given the right to personal identity new relevance. In this regard, Cass. pen., Sec. V, no. 22049 of July 6, 2020, affirmed that the unauthorized use of a person’s image, without that person’s knowledge, amounts to conduct capable of presenting a digital identity that does not correspond to the individual using it. The crime of identity fraud (so-called “sostituzione di persona” under article 494 of the Italian Penal Code) is therefore committed by anyone who creates and uses a social network profile by unlawfully employing the image of an entirely unaware person. From a civil perspective, such acts violate the right to identity, demonstrating how traditional provisions adapt to digital misrepresentations.
Not coincidentally, from the standpoint of the administrative recognition of digital identity, the Codice dell’amministrazione digitale (d.lgs. no. 82 of 2005) introduced a statutory notion of “identità digitale”, particularly in article 64, concerning access to public services. For its part, the Consiglio di Stato, in opinion no. 785 of March 23, 2016, recognized that digital identity is instrumental to effective participation in the digital society. While primarily functional, this recognition signals the growing juridical salience of digital self-representation.
Against this background, it appears appropriate to explore in greater depth the subject of avatars as extensions of the self, because in immersive environments and metaverses they serve as digital doubles. Do avatars fall within the scope of image rights? Italian doctrine suggests an affirmative answer when avatars unmistakably evoke real individuals. The analogy with the sosia jurisprudence, where look-alikes were deemed infringing, supports this extension. Unlike the U.S. decision in Lohan v. Take-Two Interactive Software, Inc. (2018)[8], where celebrity avatars were protected as transformative expression, Italian courts and scholars are likely to prioritize dignity over freedom of expression[9]. It has been argued, indeed, that avatars embody self-projection and thus require recognition as a manifestation of personal identity[10].
In this regard, and especially from the perspective of post mortem digital identity, Italian law recognizes the protection of image and data after death. Article 2-terdecies of the Codice Privacy allows relatives to exercise the rights of the deceased, and courts have applied this rule to order the removal of online content damaging to the memory of the dead. With the rise of AI chatbots simulating deceased persons, however, new questions arise: may relatives prevent or authorize such simulations? The prevailing view tends to affirm the continuity of dignity beyond death, in line with the tradition of robust post mortem protection[11].
4. The GDPR frames data protection as a fundamental right, grounded in article 8 of the EU Charter. The Court of Justice of the European Union reinforced this framing in two landmark judgments. In Google Spain SL, Google Inc. v. Agencia Española de Protección de Datos and Mario Costeja González (C-131/12, May 13, 2014), it recognized a right to de-listing from search engines, establishing that data subjects’ rights prevail over economic interests, subject only to public interest exceptions. In Digital Rights Ireland Ltd v. Minister for Communications, Marine and Natural Resources and Others (C-293/12, April 8, 2014), it annulled Directive 2006/24/EC (the so-called “Frattini” or “data retention” directive) on the retention of personal data by providers of electronic communications services, finding a disproportionate interference with fundamental rights and confirming the dignity-based paradigm.
It is interesting to note that Italian jurisprudence anticipated this development. Cass. civ., Sec. III, no. 3679 of April 9, 1998, recognized the right to prevent republication of outdated news, thereby formulating a right to be forgotten (so-called “diritto all’oblio”)[12], and later decisions refined this doctrine, balancing the public interest in memory against the right to be forgotten[13].
From the perspective of property versus personality, Italian doctrine has insisted that data protection must remain within the sphere of dignity, not property[14], and it has been underlined that GDPR rights are structurally inalienable, resembling personality rights[15]. Conversely, some economically oriented scholars have suggested recognizing data as a tradable asset, but this position has been criticized for undermining autonomy and conflicting with article 2 of the Italian Constitution. It has thus been persuasively argued that property models cannot capture the relational nature of data[16].
The Data Governance Act (2022/868) and the Data Act (2023/2854) confirm this approach, promoting data sharing while explicitly preserving the fundamental rights framework of the GDPR, thus rejecting commodification.
5. The expansion of algorithmic decision-making and artificial intelligence in private and public spheres has introduced new urgency to principles of equality, accountability, and human oversight. European law has taken the stance that algorithmic processes should not be black boxes immune from fundamental rights; instead, human dignity and non-discrimination must guide the design and deployment of AI and algorithms.
A key provision in the GDPR, Article 22, establishes that individuals have the right not to be subject to a decision based solely on automated processing that produces legal or similarly significant effects on them, unless certain safeguards are in place (such as the individual’s explicit consent, or the decision being necessary for a contract, with measures to protect the person’s rights). This “right to human review” reflects the principle that human dignity requires a human presence or oversight in decisions that profoundly affect someone’s life — be it hiring, credit, insurance, or similarly consequential matters. European regulators thus operationalized a concept of algorithmic accountability well before the current proliferation of AI. The underlying idea is that individuals should not be reduced to scores or profiles generated by algorithms without an opportunity for context, contestation, or appeal to human judgment.
In the Italian context, the principle that human dignity requires human oversight in decisions of this kind found exemplary application in the “Deliveroo” case.
The Bologna Labor Court, by order of December 31, 2020, held that Deliveroo’s algorithm “Frank” produced indirect discrimination, since its ranking penalized riders absent due to strikes or illness. More precisely, the Court ordered Deliveroo to pay damages to the trade union organizations that had brought the action, on the basis that the algorithm developed by the company to determine access to the booking of work sessions through the digital platform was discriminatory in nature. According to the Bologna judge, by sanctioning riders with a loss of score when they failed to attend scheduled work sessions, the algorithm penalized indiscriminately all forms of work abstention, causing a decrease in the riders’ score, their downgrading in the booking band, and fewer job opportunities[17]. This judgment is a landmark because it extends equality principles into algorithmic contexts, affirming that private digital governance cannot circumvent constitutional guarantees.
At the EU level, the Schrems I (C-362/14) and Schrems II (C-311/18) judgments invalidated international data transfer schemes, stressing inadequate protection against algorithmic surveillance. In these two cases the CJEU declared invalid the EU–US data transfer frameworks known as Safe Harbor (Schrems I) and Privacy Shield (Schrems II), on the ground of insufficient protection of European citizens’ personal data in the United States, with significant consequences for transatlantic data transfers. The Court linked these deficiencies to a disproportionate interference with dignity and autonomy.
Moving from existing law to new legislative proposals, the EU’s draft Artificial Intelligence Act represents a bold attempt to comprehensively regulate AI systems with a risk-based approach. The AI Act (expected to be finalized in 2024) categorizes AI uses into prohibited, high-risk, and lower-risk. Certain uses of AI that are deemed unacceptable in a democratic society — such as social scoring of individuals by governments (à la China’s social credit system), or real-time remote biometric identification in public for law enforcement — are banned outright[18]. This reflects a judgment that these practices inherently violate human dignity or other fundamental values. For high-risk AI systems (like those used in employment screening, credit scoring, law enforcement analysis, or other sensitive arenas affecting people’s rights), the Act will impose strict obligations: developers must implement risk management, ensure high-quality training data to avoid bias, provide transparency and the possibility of human oversight, and so forth. The aim is to prevent algorithmic outcomes that are discriminatory or opaque. Notably, the Act also requires that AI systems interacting with humans disclose that they are AI (so people know, for instance, when they are conversing with a chatbot and not a human) and mandates labeling of deepfakes unless they are clearly parody or otherwise exempt. Italian legal scholars interpret these provisions as effectively embedding constitutional and human rights values into private-sector AI deployments[18]. In other words, the European legislature is proactively instilling the principles of dignity, equality, and transparency — which echo the rights to non-discrimination (article 21 of the EU Charter), privacy (article 7 of the Charter), and general principles of good administration and fair process — into the requirements for AI systems before harm occurs.
The focus on algorithmic discrimination dovetails with traditional non-discrimination law. The Deliveroo case mentioned above is one example of adapting existing norms. One can also anticipate that Europe’s extensive body of anti-discrimination law (on gender, race, etc.) will be applied to algorithmic outcomes: for instance, if a credit scoring algorithm consistently disadvantages a protected group without justification, it could be challenged under equality laws. In some jurisdictions, authorities are already issuing guidelines: the UK’s Equality and Human Rights Commission has released guidance on AI and discrimination, and France’s CNIL (data protection authority) has studied algorithmic bias in decisions such as job recruiting. The United States, albeit without a comprehensive AI law, has seen enforcement actions using existing laws (e.g., the Equal Employment Opportunity Commission warning that biased AI hiring tools can violate Title VII of the Civil Rights Act). New York City notably passed a law requiring bias audits for automated hiring tools used by employers, illustrating a growing awareness even in more market-driven legal systems that unchecked algorithms can perpetuate inequality.
Underpinning the European stance is a commitment to human oversight and contestability. This can be viewed through the lens of a “due process” requirement for algorithms. While not explicitly framed as such in law, the spirit of GDPR’s Article 22 and related provisions (like the right to an explanation of algorithmic decisions, derived from Recital 71 GDPR) is akin to a due process safeguard. It insists that significant decisions about individuals should not be made in a black box fashion. Individuals should have the right to know the reasons behind an automated decision and to object or seek human intervention. This is an evolving area: debates continue on how much transparency is feasible (given trade secrets or complexity of AI) and what constitutes an adequate explanation. Nonetheless, Europe’s early incorporation of these ideas sets it apart from other jurisdictions.
In the U.S., for example, there is no general right to explanation for algorithmic decisions, though sector-specific laws like the Fair Credit Reporting Act provide some rights if credit is denied. Absent statutory mandates, issues of algorithmic bias or unfairness in the U.S. often fall to consumer protection agencies or litigation, and redress requires finding a legal hook (such as disparate impact in employment, or unfair practice under FTC authority).
The concept of digital constitutionalism ties into this discussion, and it refers to the effort to apply constitutional principles (like fundamental rights, checks and balances, democratic oversight) to the digital domain, often in situations where private companies wield significant power over individual rights. By enforcing transparency, accountability, and fairness in algorithmic systems, EU regulations are effectively constitutionalizing the private governance of technology platforms and AI developers[18]. They are ensuring that technological progress does not override the primacy of the human being — a view consistent with the long-standing maxim of European legal humanism that technology must remain the servant of humanity, not the other way around.
6. Looking at the categories of the Italian Civil Code, article 810 c.c. defines goods (“beni”) as «cose che possono formare oggetto di diritti» («things that may be the object of rights»). The question arises whether incorporeal digital assets qualify. Some authors argue that they do, the concept of “cose” extending beyond materiality[19]; others, however, stress that the numerus clausus of real rights prevents uncontrolled expansion, urging analogies with “titoli di credito” (negotiable instruments)[20].
In a comparative perspective, divergent approaches emerge: German courts exclude cryptocurrencies from the notion of Sachen for lack of corporeality[21], English courts have recognized property status for crypto-assets and NFTs[22], and Dutch jurisprudence has even treated virtual items as objects of theft[23].
Considering NFTs and the limits of ownership more closely, there is no doubt that unique blockchain tokens raise complex issues. Because ownership of the token does not transfer the underlying IP rights absent express agreement, Italian jurisprudence on photographs offers a useful analogy: the buyer of a photograph does not thereby acquire the copyright[24]. At the same time, some scholars interpret NFTs as dematerialized “titoli di credito”[25].
Very recently the MiCA Regulation (2023/1114), which provides a harmonized framework for crypto-assets, imposing whitepaper obligations, licensing, and supervision, has excluded NFTs insofar as they are unique. Meanwhile, Italian jurisprudence (Cass. pen. SU 44378/2022) has classified certain crypto-assets as financial products, showing judicial openness to reclassification[26], while CONSOB and Banca d’Italia have issued guidelines for intermediaries.
Turning to a different aspect, digital patrimony raises issues of inheritance. In light of article 459 I.C.C., according to which heirs succeed to all patrimonial relationships, and following the BGH decision on Facebook accounts (III ZR 183/17 of July 12, 2018)[27], Italian heirs may claim access to digital assets. Practical barriers remain, however, because access often depends on contractual terms and possession of private keys. Doctrinal debates explore whether service providers must facilitate access, invoking an analogy with article 649 I.C.C. on succession of credits[28].
In summary, virtual assets are gradually being woven into the fabric of patrimonial law through a mix of analogical reasoning and new regulation. The process can be described as one of cautious assimilation: acknowledging the economic reality and social importance of digital assets (which therefore need legal recognition and protection) while remaining mindful of doctrinal fit and potential risks. The Italian experience exemplifies this balance, hesitant to label crypto or game items as “things” too hastily, yet pragmatic in applying existing principles (inheritance, consumer protection, fraud prevention) to them. The comparative outlook suggests eventual convergence: many jurisdictions are moving toward recognizing a property or quasi-property status for at least some digital assets, as evidenced by law reform efforts such as UCC Article 12 in the U.S., which now explicitly defines controllable electronic records as a new category of personal property[29], or the UK Law Commission’s proposal of a third category of personal property called “data objects” to accommodate crypto-tokens and similar digital assets[30]. Such reforms echo the notion that the legal system must evolve beyond the strictly physical concept of property to accommodate the intangible wealth of the digital era.
7. Over the past decade, the European Union has developed a broad regulatory architecture for the digital sphere, one that increasingly embeds fundamental rights into the obligations of private actors. The EU Charter enshrines dignity (article 1), respect for private life (article 7), data protection (article 8), and property (article 17) as principles that orient the legislative interventions of recent years.
It is not accidental that the GDPR gives effect to article 8 by ensuring control over personal data, while the Digital Services Act (Regulation 2022/2065) creates a modern liability regime for platforms, based on transparency in terms of service, on obligations to respect fundamental rights (article 12), and on risk assessments for very large platforms. Some scholars see this approach as a form of “digital constitutionalism,” where fundamental rights are integrated into private regulation[31].
Furthermore, the Digital Markets Act (Regulation 2022/1925) seeks to contain gatekeepers, prohibiting cross-service data combination without consent (article 5(2)) and thereby protecting autonomy against abusive concentrations of informational power, while the Data Governance Act promotes trustworthy data intermediaries and the Data Act ensures fair access and sharing, particularly in IoT contexts. Both emphasize dignity and fairness over commodification.
In the general perspective of a Digital Private Law, it is evident that Italian doctrine traditionally distinguished between personality rights (personhood) and patrimonial rights (patrimony), but digitalization has destabilized this neat separation: avatars embody both selfhood and economic value, personal data are non-proprietary yet exploited commercially, and NFTs are patrimonial but tied to identity and reputation.
Therefore, Italy, within the EU, has also championed the constitutionalization of private law in the digital context through its own courts and doctrine. We have seen how constitutional principles such as dignity and equality are used by Italian judges to interpret private law in cases ranging from identity theft to algorithmic discrimination. Italian private law and EU law are thus converging in philosophy. One can say there is an emerging Digital Private Law that bridges traditional distinctions: personality rights versus patrimonial rights, public regulation versus private ordering. An avatar may be simultaneously an expression of one’s personality (invoking fundamental rights of identity) and an economic asset that can be bought or sold (invoking property and contract principles). Personal data likewise have a dual nature: intimately personal yet commercially exploitable. NFTs can be status symbols tied to identity (people display them as part of their online persona) and also investments. This blurring of lines was foreseen by visionary scholars: as early as 2001, Stefano Rodotà spoke of the need for a new legal status for the “electronic person” (persona elettronica)[32], arguing that the law must recognize the continuity of personhood in the digital realm and treat one’s digital presence as legally significant and worthy of protection. Today, concepts like Germany’s allgemeines Persönlichkeitsrecht or France’s droit à l’image are being extended to digital contexts, and even the U.K., through privacy torts and data protection law, indirectly acknowledges similar interests despite its lack of a written constitution. The EU regulations act as a harmonizing force, bringing these national developments under an umbrella of common standards.
Notably, the constitutionalization trend also means that private companies are asked (or compelled) to take on responsibilities that reflect public law values. The DSA’s requirement for platforms to assess and mitigate risks to “fundamental rights, civic discourse, electoral processes, public security”, and so on, essentially mandates them to self-police in line with societal interests, subject to oversight. This is quite unprecedented by 20th-century standards, when only states bore duties to uphold fundamental rights. Now, very large digital intermediaries have quasi-public obligations. If they fail, they face penalties (and under the DSA, users and representative bodies have avenues to seek redress).
To illustrate the contrast: in the United States, the dominant legal approach is still largely that platforms are private parties with broad discretion over content (thanks to Section 230, which immunizes them from liability, and the First Amendment, which limits regulation of speech, including laws that would require neutrality from platforms). Europe has chosen a different path, one that treats the digital space as too integral to democracy and individual lives to be left to private ordering alone. Even before the DSA, Germany had its Netzwerkdurchsetzungsgesetz (NetzDG), imposing fines on social networks for failing to remove manifestly illegal content quickly, a precursor to the DSA approach of platform accountability[33]. The DSA builds a more comprehensive and due-process-oriented system on a European scale.
In summation, the EU’s regulatory initiatives of the 2010s and 2020s mark a decisive shift toward embedding human-centric principles in digital governance. The phrase “digital constitutionalism” captures the ambition: to ensure that fundamental rights — human dignity, privacy, personality development, equality, freedom of expression, property rights in one’s creations or data — are respected in the complex web of interactions that define the digital world, whether those interactions are with the state, with corporations, or peer-to-peer. Italy’s experience, informed by its constitutional tradition of protecting personality rights and by its adaptation of private law via constitutional principles, meshes with this broader European trajectory. Italian doctrine traditionally drew a bright line between personality (pertaining to personhood) and patrimonial rights, but digitalization has destabilized that dichotomy[34]. Avatars, data, and digital assets simultaneously implicate both spheres. The legal response, as we have outlined, has been to extend the shield of personhood where appropriate (to avatars, personal data, etc.) and to cautiously bring digital assets into the fold of property and succession law while maintaining necessary safeguards.
In the light of the above and to sum up, from avatars to algorithms, and from personal data to digital tokens, private law is undergoing a profound evolution in the face of technological change. The Italian and European experiences demonstrate a commitment to ensuring that this evolution is guided by enduring legal values, foremost among them human dignity and fundamental rights. Personality rights, once sparsely recognized in statute, have been greatly expanded through constitutional interpretation to cover new forms of identity and expression in the digital realm. Data protection has been elevated to a core personality right in Europe, rejecting visions of data as mere commodity. At the same time, the law is innovating to address the patrimonial side of the digital revolution: developing frameworks for virtual assets, adapting property and inheritance rules, and crafting regulations for fair digital markets.
A unifying theme is the human-centric approach: digital law is not left to technological or market determinism but is being deliberately shaped so that technology serves the individual and society. Private law doctrines that originated in the 19th and 20th centuries are being reinterpreted for the 21st. In Italy, the resilience of concepts like dignity, inviolable rights of the person, and solidarity has allowed jurists to integrate new phenomena without abandoning fundamental principles. The European Union’s legislative surge in the digital domain similarly seeks to tame the Wild West of cyberspace by reasserting the primacy of rights and values even vis-à-vis powerful private platforms and novel assets.
There remains much work ahead. As AI becomes more sophisticated, questions of algorithmic accountability and personhood for autonomous systems may intensify. The metaverse and extended reality could blur lines between the real and virtual even further, challenging notions of identity, liability, and jurisdiction. Issues of cross-border data governance will persist, as will debates about the ethical limits of commodifying personal information or biometric data. Law will need to continue drawing on interdisciplinary insights — from philosophy, to sociology of technology, to economics — to keep pace with these developments. The comparative views discussed suggest that while legal systems vary, they face common challenges and can learn from each other’s experiments.
In crafting the future of digital private law, a delicate balance must be maintained: fostering innovation and the free flow of information and commerce online, while also ensuring individuals are not reduced to mere data points or consumers devoid of agency. The Italian and European paradigm, anchored in dignity, offers one promising model of that balance. It insists that no matter how virtual or algorithmic our world becomes, law’s ultimate measure remains the human person. As technology marches on, avatars will become more lifelike and algorithms more influential, but the jurisprudence evolving now aims to ensure that human values, rights, and identity are not left behind. In conclusion, the trajectory from codal fragments to digital constitutionalism exemplifies law’s adaptive capacity: it shows that even in the face of transformative technological change, legal principles can be renewed and expanded to govern new realities, ensuring that the core commitments to personhood and justice endure.
Authors: Ettore Maria Lombardi is associate professor of private law at the University of Florence.
Alexey Ivanov is Director of the BRICS Competition Law and Policy Centre.
[1] P. Rescigno, Manuale del diritto privato, Milano, 2020, p. 355 ff.
[2] Cass. civ., sez. I, no. 3769 of June 22, 1985, which decided the case of an interview given by the director of the National Institute for the Study and Treatment of Tumours that had been used, through the misleading extraction of certain sentences from their overall context, to support a promotional campaign for the sale of “light” cigarettes, as though the director, rather than being opposed to any use of tobacco, had expressed himself in favour of the consumption of such cigarettes. The Supreme Court, finding that the lower court had established that such use of the interview distorted the social image of the institute and its director in relation to their consistent commitment to cancer prevention and anti-smoking campaigns, held that the lower court had correctly granted the protection provided for under Article 7 of the Civil Code, thereby establishing the principle set out above. Accordingly, the Corte di Cassazione affirmed that «The interest of a person—whether natural or legal—in preserving his or her personal identity, understood as social image, that is, the ensemble of values (intellectual, political, religious, professional, etc.) relevant to the representation conveyed in social life, as well as, correlatively, in opposing the conduct of others that undermines such image, even without infringing honour or reputation or impairing name or physical likeness, must be regarded as a subjective right. This follows from the principles enshrined in Article 2 of the Italian Constitution concerning the protection of personality in the complexity and unity of all its components, and it is further protectable by way of analogical application of the provisions set out in Article 7 of the Civil Code on the right to one’s name. Consequently, injunctive relief and damages may be sought against such conduct, and it is also possible to obtain, pursuant to Article 7(2) of the Civil Code, the publication of the judgment upholding the claim, or, where the infringement has occurred through the press, the publication of a correction under Article 42 of Law no. 416 of 5 August 1981».
[3] Cass. civ., no. 2223 of March 12, 1997, in which the right to Totò’s image was considered infringed by the marketing of certain chocolates marked with a trademark consisting of a heart-shaped design and the inclusion of elements evocative of the actor. In legal scholarship, it had already been argued that the manner in which the right to one’s image is infringed is entirely irrelevant, since the violation may concern not only the portrait in the strict sense but also any form of representation. See, for all, A. Arienzo, Ritratto, in Nov. dig.it., 1970, p. 202.
[4] Trib. Milano, 21 gennaio 2015, according to which «The protection of the right to one’s image under Article 10 of the Italian Civil Code extends beyond the mere photographic reproduction of a person’s physical features. It also encompasses the use of elements not directly referable to the individual, such as clothing, ornaments, hairstyle, makeup, and environmental context which, due to their distinctiveness and characterisation, immediately evoke in the viewer’s perception the very person to whom such elements have become inextricably linked. An infringement of the right to one’s image occurs where a photographic reconstruction is commercially exploited through the use of a model faithfully reproducing the distinctive features of a well-known character, when such reconstruction is clearly intended to evoke and take advantage of the subject’s image through an evident reference to it. Such conduct constitutes an attempt to circumvent the requirement to obtain the authorisation of the right holder and the related remuneration associated with the commercial exploitation of the image. The right to one’s image belongs to the category of personality rights and is specifically protected under civil law provisions as well as the constitutional foundations enshrined in Article 2 of the Italian Constitution. By contrast, copyright law merely provides for exceptional circumstances justifying the unauthorised dissemination of a portrait, without the functional jurisdiction of the specialised business law section being relevant where the dispute does not concern the use of an actual photographic portrait but rather the reconstruction of a character portrayed by a model. The legitimate children and heirs of the deceased person hold the right to express consent for the use of the image of the de cuius by third parties, both pursuant to Article 93(2) of the Italian Copyright Law for the protection of moral rights, and as heirs in respect of patrimonial rights. They may bring an action for compensation for pecuniary damage resulting from the failure to receive payment for such consent, as well as for non-pecuniary damage deriving from the unlawful association with commercial promotional initiatives, with equitable damages to be determined in proportion to the specific use of the image and its dissemination».
[5] P. Rescigno, Manuale del diritto privato, cit., p. 360.
[6] S. Rodotà, Il diritto di avere diritti, Roma-Bari, 2012, p. 211.
[7] G. Oppo, Scritti giuridici, Milano, 1999, II, p. 514.
[8] Lohan v. Take-Two Interactive Software, Inc., decided on March 29, 2018, by the Court of Appeals of New York.
[9] See, ex multis, M. Rosaldo, Toward an Anthropology of Self and Feeling, in Culture Theory. Essays on Mind, Self, and Emotion, edited by R.A. LeVine, R.A. Shweder, Cambridge, 1984, p. 135 ff.; G. Sartor, L’intenzionalità dei sistemi informatici e il diritto, in Riv. Trim. di Dir. e Proc. Civ., 1, 2003, p. 23 ff.; L. Arnaudo, R. Pardolesi, Ecce robot. Sulla responsabilità dei sistemi adulti di intelligenza artificiale, in Responsabilità, 2023, 4, p. 409 ff.; C. Romeo, L’avatar, il metaverso e le nuove frontiere del lavoro tecnologico: dal lavoro agile al metaverso, in Diritto e universi paralleli. I diritti costituzionali nel metaverso, a cura di A. Fucillo, V. Nuzzo, M. Rubino De Ritis, Napoli, 2023, p. 71 ff.
[10] G. Ferrigno, Agenti AI nei mondi virtuali: evoluzione e rischi degli avatar realistici, May 29, 2025, available at https://www.agendadigitale.eu/cultura-digitale/agenti-ai-nei-mondi-virtuali-evoluzione-e-rischi-degli-avatar-realistici/, where the Author argues that artificial intelligence in virtual worlds introduces new forms of interaction and control, but also raises concerns related to privacy, transparency, and ethical risks, and that European regulation seeks to establish a clear framework for the responsible use of these technologies.
[11] See, especially on AI-based chatbots known as deadbots or griefbots, which simulate deceased persons by mimicking language and personality based on the so-called digital footprint of the deceased, T. Hollanek, K. Nowaczyk-Basińska, Griefbots, Deadbots, Postmortem Avatars: on Responsible Applications of Generative AI in the Digital Afterlife Industry, in Philos. Technol., 37, 63, 2024, available at https://doi.org/10.1007/s13347-024-00744-w.
[12] With judgment no. 5525 of 2012, the Corte di Cassazione clarified what the right to be forgotten is, defining it as the right «to prevent further dissemination of information which, due to the passage of time, has now been forgotten or is unknown to the general public».
[13] The right to be forgotten must also be considered in relation to other personality rights, in particular honor and reputation. In fact, the Corte di Cassazione, in judgment no. 3679 of 1998, characterized the right to be forgotten as «the legitimate interest of every person not to remain indefinitely exposed to the further harm that repeated publication of information—lawfully disclosed in the past—may cause to his or her honor and reputation». The right to be forgotten concerns the processing of personal data: under certain conditions, the individual may obtain their erasure by the data controller. In cases of publication of personal data, the right to be forgotten also entails the right that personal information not be further disseminated. The aim is to prevent the data subject from being subjected to a “media pillory”.
[14] S. Rodotà, Tecnologie e diritti, Bologna, 2021, p. 87.
[15] F. Pizzetti, Privacy e il diritto europeo alla protezione dei dati personali, Torino, 2016, p. 119.
[16] See A. Galiano, A. Leogrande, S.F. Massari, A. Massaro, I dati non personali: la natura e il valore, in Riv. Italiana di informatica e dir., 1, 2020, p. 61 ff., where the Authors, at p. 65, describing “excludability”, i.e. the ability to exclude others, as a constitutive element of property rights, since the purchaser wishes to exclude others from using the acquired data, affirm that private goods are characterized by the exclusion of third parties from their use. Whether the traditional concept of property is applicable to data, or to all categories of data, or whether it represents the best framework for exploiting the potential of data, therefore remains under discussion.
[17] See, for more details, G. Fava, L’ordinanza del Tribunale di Bologna in merito alla possibile discriminatorietà dell’algoritmo utilizzato da Deliveroo, in Lavoro Diritti Europa, 1, 2021, p. 1 ff.
[18] See, ex multis, F. Donati, Diritti fondamentali e algoritmi nella proposta di regolamento sull’intelligenza artificiale, in Il Diritto dell’Unione Europea, 2021, p. 453 ff.; O. Pollicino, AI Act e diritti fondamentali, in Civiltà delle macchine, 2, 2024, p. 53 ff.
[19] See, for all, G. De Nova, I nuovi beni come categoria giuridica, in Dalle res alle New Properties, edited by G. De Nova, B. Inzitari, G. Tremonti and G. Visentini, Milano, 1991, p. 13 ff.; M. Giuliano, I nuovi beni digitali nel quadro normativo europeo, in Il nuovo diritto delle società, 11, 2022, p. 2077 ff.
[20] D. Messinetti, Oggettività giuridica delle cose incorporali, Milano, 1970, passim; M. Trimarchi, Nuovi beni e situazioni di godimento, in Il diritto civile oggi. Compiti scientifici e didattici del civilista, Napoli, 2006, 575 ff.
[21] In 2021 the Higher Regional Court of Frankfurt am Main (OLG) recognized crypto tokens as digital assets lacking transferability under property law. More precisely, the court held that a friend who, with the consent of another, invested funds in various cryptocurrencies is not liable for lost profits arising from exchange losses during conversions between currencies (i.e., Ethereum/Bitcoin). The court dismissed the claim for the transfer of Ethereum units, thereby overturning in part the predominantly favorable decision of the Regional Court. See, further, M. Härtel, OLG Frankfurt: No liability for lost profits when buying cryptocurrencies as a courtesy, in Blockchain and web law, May 15, 2023, available at https://itmedialaw.com/en/olg-frankfurt-no-liability-for-lost-profits-when-buying-cryptocurrencies-as-a-courtesy/.
[22] See Osbourne v. Persons Unknown & Ors [2023] EWHC 340 (KB). English courts had already recognized cryptoassets such as Bitcoin as property under English law, thereby enabling the grant of proprietary injunctions to prevent their dissipation. Building on this foundation, Osbourne extended the principle to non-fungible tokens (NFTs) and, in a notable procedural innovation, permitted service of proceedings through an NFT, offering a novel mechanism for engaging with unidentified parties in the digital environment.
[23] A relevant example in this sense comes from the Runescape case of 2012, where the Dutch Supreme Court decided that virtual objects in a video game could be considered “goods” and thus subject to theft under the law. This seminal ruling arose from a real incident in which teenagers employed violence and intimidation to compel a younger player to relinquish virtual items from his Runescape account, which they subsequently appropriated. The judgment marked a turning point in recognizing that virtual items may constitute objects of property offenses, thereby shaping the legal understanding of virtual theft. See, for more details, A.R. Lodder, Dutch Supreme Court 2012: Virtual Theft Ruling a One-Off or First in a Series?, in 6 Journal of Virtual Worlds Research, 3, September 2013, available at https://research.vu.nl/ws/portalfiles/portal/796857/vwr2013.pdf.
[24] See, for example, Cass. civ., Sec. I, judgment no. 4094 of June 28, 1980, where it is held that in the case of a photographic portrait made on commission, governed by Article 98 of Law No. 633 of 22 April 1941 on copyright, the commissioning party—unlike what is established by Article 88, third paragraph, of said law for photographs of objects in his or her possession—does not acquire the exclusive right of use of the photograph. Such right remains with the photographer, although it coexists with the right of the person portrayed, or of his or her successors in title, to publish and reproduce the photograph freely, subject to payment of equitable remuneration to the photographer in the event of commercial use. In this situation, therefore, absent any different agreement, it must be held that the photographer retains ownership of the negative and is not obliged to deliver it to the commissioning party.
[25] E. Damiani, Cripto-arte e ‘non-fungible tokens’: i problemi del civilista, in Rass. dir. della moda e delle arti, 2022, p. 1 ff.; M. D’Onofrio, La versatilità degli NFT e le loro funzionalità nel campo della circolazione e catalogazione delle opere d’arte, in Aedon – Rivista di arti e diritto on line, 1, 2024, available at https://aedon.mulino.it/archivio/2024/1/donofrio.htm.
[26] See Cass. pen., judgment no. 44378 of November 22, 2022, where the Italian Supreme Court of Cassation classified cryptocurrencies as financial products, acknowledging their nature as investment instruments characterized by the use of capital, the expectation of profit, and the presence of risk. This decision is particularly significant in the criminal law context, especially with regard to the preventive seizure of cryptocurrency wallets and the potential application of offenses such as unauthorized financial activity under the Consolidated Law on Finance (T.U.F.).
[27] In 2018, Germany’s Federal Court of Justice (BGH) held that, under German inheritance law, heirs are entitled to access a deceased person’s Facebook account. The court clarified that the GDPR, which came into force in May 2018, does not obstruct such access, and that Facebook is obliged to provide heirs with full access to the account.
[28] L. Collura, L’eredità digitale: il problema della successione nell’account, in Ianus – Quaderni, 2023, p. 63 ff. and, more specifically, p. 107 ff.
[29] Uniform Commercial Code (UCC) 2022 Amendments, Article 12 (“Controllable Electronic Records”), §§12-102 et seq., create a legal framework for digital assets (controllable electronic records, including cryptocurrencies and NFTs) in commercial transactions. Article 12 defines the concept of “control” of a digital asset (analogous to possession), provides for transfer of interests, and protects good-faith purchasers. This innovation by the Uniform Law Commission, if adopted by U.S. states, facilitates recognizing digital assets as a new type of personal property for purposes of secured transactions and beyond.
[30] UK Law Commission, Digital Assets: Final Report (Law Com No. 401, June 2023). The Law Commission recommends statutory recognition of a distinct third category of personal property, termed “data objects,” to encompass crypto-tokens and similar digital assets that do not fit readily within traditional categories of personal property. The report affirms that under common law, such digital objects are already treated as property, and it proposes clarifications and reforms (e.g., on possession and control, and on collateralization rules) to accommodate them.
[31] O. Pollicino, Di cosa parliamo di costituzionalismo digitale?, in 3 Quaderni costituzionali, September 2023, p. 569 ff.; P. Dunn, O. Pollicino, Diritti individuali e poteri privati: la sfida del costituzionalismo digitale, in Agenda Digitale, 2024, available at https://www.agendadigitale.eu/cultura-digitale/diritti-individuali-e-poteri-privati-la-sfida-del-costituzionalismo-digitale/; C. Spinello, È (ancora) tempo di costituzionalismo digitale, in 1 Riv. Italiana di informatica e diritto, 2025, p. 579 ff.
[32] S. Rodotà, Uno statuto giuridico globale della persona elettronica, in InterLex – Diritto Tecnologia Informazione, 2001, available at http://www.interlex.it/675/rodota5.htm, where the A. proposes the recognition of a legal status for the electronic persona that integrates rights to identity, privacy, and existence in digital networks.
[33] Netzwerkdurchsetzungsgesetz (NetzDG) – Germany’s Network Enforcement Act of 2017 (amended 2021) – requires large social media platforms to remove “manifestly unlawful” content (hate speech, etc. as defined by German law) within 24 hours of notice, or within 7 days for less obvious cases, or face heavy fines. NetzDG has been criticized but was an early attempt to enforce national law on global platforms, foreshadowing some aspects of the DSA’s approach to illegal content.
[34] S. Rodotà, Uno statuto giuridico globale della persona elettronica, cit.