HomeIssues2023A Research Paper on the Impact of GDPR and CCPA on Targeted...

A Research Paper on the Impact of GDPR and CCPA on Targeted Advertising in Social Media Companies

Authors

Sarah Kim


Abstract

The General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA) have reshaped the practices and frameworks of data collection and usage across digital platforms worldwide. Among the sectors most affected by these sweeping privacy laws is the social media industry, particularly where revenue models rely heavily on targeted advertising. This research paper offers an in-depth examination of how GDPR and CCPA have transformed advertising strategies, altered user and advertiser experiences, and driven operational changes within social media ecosystems. Drawing extensively on academic, legal, and industry literature, the paper explores the historical context behind data-driven advertising, detailing how social media companies originally built their revenue models around large-scale user-data collection. It then investigates how the introduction and enforcement of GDPR and CCPA have compelled these platforms to reevaluate data collection norms, opt-out systems, consent processes, algorithmic transparency, and cross-border data transfer protocols. By integrating theoretical perspectives on privacy, empirical studies of platform compliance efforts, and emerging innovations in privacy-enhancing technologies (PETs), this paper illuminates how social media giants and smaller niche networks are adapting to new legal mandates. The discussion spans organizational restructuring, user trust implications, shifts in advertising methods, and the broader socio-economic consequences of heightened data protection standards. Ultimately, this study emphasizes that while GDPR and CCPA have introduced complexities and compliance burdens, they have also stimulated a deeper industry-wide dialogue on ethical data stewardship, user autonomy, and the future of personalized marketing.

Keywords: GDPR, CCPA, Social Media, Targeted Advertising, Data Privacy, Consent, Business Models, Algorithmic Transparency, Privacy-Enhancing Technologies

1. Introduction

Social media platforms—ranging from global giants such as Facebook (Meta), Instagram, Twitter (X), TikTok, and LinkedIn to emergent niche sites—have transformed how people communicate, share content, and engage with businesses and public figures. Over the past decade, a defining feature of many platforms’ economic models has been their reliance on user data to facilitate highly targeted advertising. By collecting, analyzing, and segmenting detailed user behaviors, demographics, and preferences, social media services have been able to generate significant revenue while promising advertisers unparalleled access to specific consumer segments.

Yet, the explosion of personalized advertising strategies has not gone unchallenged. Amid rising public awareness about online privacy, cyber threats, and the potential misuse of personal data, lawmakers around the world have taken decisive steps to regulate how organizations store, process, and share user information. The European Union’s General Data Protection Regulation (GDPR), enforced from May 2018, represents one of the most comprehensive attempts to codify user rights regarding data collection and processing. It mandates robust consent mechanisms, explicit privacy disclosures, data minimization principles, and severe financial penalties for non-compliance. Shortly thereafter, the State of California introduced the California Consumer Privacy Act (CCPA), which took effect in 2020. CCPA, while narrower in scope than GDPR, also grants consumers significant rights over their personal data, including the ability to know how it is used, delete it, and opt out of its “sale.”

Because social media platforms operate as ad-funded ecosystems that derive substantial value from personal and behavioral data, these two legislative frameworks have had especially profound and layered impacts on how such platforms conduct business. Moreover, GDPR’s extraterritorial reach means that any social media service targeting or tracking EU residents faces the daunting prospect of large-scale compliance obligations, regardless of its geographic headquarters. Meanwhile, CCPA, with its subsequent evolution into the California Privacy Rights Act (CPRA), has brought a new level of consumer empowerment to the U.S. regulatory landscape, prompting discussions about federal data protection legislation and catalyzing policy changes in other states.

This paper focuses on the specific ways in which GDPR and CCPA have reshaped targeted advertising strategies within social media companies. Rather than merely providing a broad overview of data privacy compliance, it narrows its lens to examine how the central pillars of these regulations—consent, data minimization, opt-outs, algorithmic transparency, and consumer rights—have forced social media businesses to reconsider the fundamentals of their revenue generation. By integrating scholarly work, case studies, and policy analyses, this investigation documents both the immediate operational shifts triggered by GDPR and CCPA and the longer-term reverberations these frameworks might induce in the social media advertising ecosystem.

1.1 Significance of Targeted Advertising in Social Media

Targeted advertising has revolutionized how digital marketing campaigns are designed and delivered. Traditional advertising platforms, such as television or print media, rely on broad demographic data, making it difficult to deliver ads with pinpoint precision. Social media, on the other hand, can track user actions down to individual clicks, views, and shares, feeding algorithmic systems that categorize users based on hobbies, political affiliations, lifestyles, and personal preferences. This precision has generated enormous revenues for platform operators: for instance, before GDPR took full effect, some estimates showed that Facebook derived close to 98% of its total income from advertising alone.

The commercial advantage of deep personalization is evident. Advertisers consistently report that well-targeted campaigns yield higher click-through rates, better user engagement, and an improved return on investment. By marrying user data with predictive analytics, social media companies promise businesses the capacity to target not just demographic groups but unique individuals who display a high propensity for product engagement or brand loyalty. Despite these benefits, the very same granular data collection and processing methods have drawn scrutiny from regulators concerned about transparency, informed consent, user autonomy, and the wider social impacts of personalized recommendations.

1.2 Rationale for Studying GDPR and CCPA Through a Social Media Lens

While GDPR and CCPA apply across different sectors—from e-commerce to healthcare technology and beyond—social media platforms represent a particularly instructive case study for several reasons:

Intensity of Data Collection: No other digital environment collects as many real-time signals across as many aspects of personal life. Social media thrives on user-generated content, interpersonal interactions, and engagement metrics, creating vast data sets that can be granularly parsed.

Globalized User Base: Large platforms often operate in multiple jurisdictions. Therefore, they must reconcile regional data protection rules with operational practices that are universal or near-universal. GDPR’s extraterritorial provisions especially highlight these global implications.

Complex Advertising Ecosystem: Social media advertising extends beyond first-party interactions with the platform itself, involving data brokers, demand-side platforms, supply-side platforms, real-time bidding, and a myriad of third-party applications. Understanding compliance in this intricate network is challenging, yet critical to clarifying how data flows are managed.

Rapid Regulatory Evolution: Both GDPR and CCPA have sparked conversations about possible future regulations, including a U.S. federal privacy law and GDPR-style laws in other countries. Social media’s reliance on data monetization makes it a bellwether for how companies might adapt as laws become more stringent and enforcement grows.

Societal Impact: Beyond mere economic considerations, social media has a profound influence on political discourse, public opinion, and personal relationships. Consequently, targeted advertising on these platforms raises ethical concerns about filter bubbles, manipulation, and the commodification of user attention—issues that regulators increasingly aim to address.

1.3 Scope, Questions, and Objectives

This research paper seeks to answer key questions about the intersection of social media advertising and data protection laws:

Data Collection Practices: How have GDPR and CCPA affected the methodologies by which social media platforms gather and utilize data for ad targeting purposes?

Consent and User Control: What new consent flows, opt-out mechanisms, and user preference controls have emerged to align social media platforms with regulatory mandates?

Algorithmic Transparency: In what ways have the requirements for transparency and potential user access to profiling information influenced the design of recommendation engines and ad-delivery algorithms?

Strategic and Operational Overhauls: How have social media companies reorganized their internal teams, governance structures, vendor relationships, and technological frameworks to meet compliance obligations?

Economic and Competitive Outcomes: How have these regulations impacted ad revenue, the distribution of market power among major and minor platforms, and the broader digital advertising ecosystem?

By addressing these inquiries, the paper aims to delineate a nuanced perspective on how privacy legislation reshapes not just the practices of data handling in a technical sense but also the foundational economic models and strategic decision-making processes within social media businesses.

1.4 Structure of the Paper

Following this introductory discussion, the paper is organized into several major sections. First, it delves into the historical and conceptual foundations of data privacy and targeted advertising, illustrating the incremental shifts that paved the way for GDPR and CCPA. Next, an expanded literature review synthesizes academic, legal, and industry insights, emphasizing how these two regulatory frameworks specifically intersect with social media platforms’ advertising operations. The paper then explores operational and strategic implications, highlighting how consent management, technological adaptations, and organizational restructuring have emerged as critical issues. Subsequently, the analysis turns to stakeholder perspectives, considering the viewpoints of advertisers, end users, regulators, and consumer advocacy groups. The penultimate section examines future directions, speculating on forthcoming trends in privacy-centric advertising technologies, new legislative initiatives, and evolving user attitudes. Finally, the conclusion consolidates key findings and points toward areas in need of ongoing research.

By focusing on the practical and conceptual dimensions of GDPR and CCPA’s effects on targeted advertising in social media, this paper contributes to the broader conversation on balancing individual rights with the economic imperatives of data-driven commerce. In doing so, it underscores the significance of legal frameworks in shaping the architecture of digital platforms and the experiences of millions of users worldwide.

2. Historical and Conceptual Foundations of Data Privacy and Targeted Advertising

Understanding the influence of GDPR and CCPA on social media advertising strategies necessitates a grasp of the broader historical context in which digital advertising emerged, as well as the evolving conceptual frameworks of data privacy that inform legislative actions. This section chronicles the rise of targeted advertising within the Web 2.0 revolution, the concurrent development of privacy principles, and the mounting societal concerns that spurred governments to intervene with stricter data protection measures.

2.1 The Emergence of Web 2.0 and Behavior-Based Advertising

The term “Web 2.0” marks a critical shift from static websites to interactive, user-generated content platforms. Blogs, forums, and early social networking sites such as MySpace and Friendster paved the way for deeper user engagement online. As these platforms gained traction, companies began to recognize the inherent value of user interactions for marketing and advertising. Online advertisers realized they could mine a wealth of digital footprints, from simple profile fields (age, gender, location) to more nuanced signals like user interests, connection graphs, and content engagement metrics.

During the early 2000s, banners and pop-ups were the dominant digital ad formats, but their rudimentary targeting capabilities produced limited effectiveness. This changed with the advent of advanced analytics and real-time bidding (RTB) systems. RTB technologies allowed advertisers to bid on individual ad impressions in microseconds based on the specific attributes of the user viewing a webpage or social media feed. Social media companies, with their treasure troves of user data, were uniquely positioned to dominate this new paradigm, promising advertisers an “audience of one” approach (Evans, 2009). Platforms like Facebook further refined these capabilities by introducing precise targeting filters, enabling advertisers to zero in on a constellation of personal and behavioral characteristics.

2.2 Early Privacy Frameworks and Their Limitations

Before the implementation of GDPR, the EU’s principal legislative instrument was the 1995 Data Protection Directive (Directive 95/46/EC). While it established foundational concepts such as lawful processing and data subject rights, enforcement was fragmented across EU member states, resulting in inconsistent application. Furthermore, the Directive did not fully anticipate the immersive and data-intensive nature of modern social media (European Commission, 1995).

In the United States, privacy regulations historically focused on specific industries, such as healthcare and finance. Consequently, broader commercial data handling, including that of social media companies, largely fell under a patchwork of federal and state regulations. Notably, the U.S. Federal Trade Commission (FTC) often pursued privacy violations under its authority to prohibit “unfair or deceptive acts or practices,” but this did not yield a cohesive, overarching privacy mandate. States like California implemented some consumer-friendly privacy rules, but these were limited in scope (Klonick & McLaughlin, 2020).

Throughout the 2000s, civil society organizations and consumer advocates increasingly warned that the explosive growth in data collection outpaced both user awareness and the existing legal frameworks’ protective capacities. High-profile data breaches and controversies—most famously the Cambridge Analytica scandal involving Facebook—amplified these concerns. This public outcry helped galvanize policymakers worldwide to strengthen protections for users, culminating in regulations like GDPR in 2016 (enforced from 2018) and later the CCPA in California (passed in 2018, enforced from 2020).

2.3 Conceptualizing Privacy: Control, Contextual Integrity, and Privacy Calculus

Privacy has been conceptualized in diverse ways, each offering a unique lens through which to analyze social media advertising:

Control Theory: Tracing back to Alan Westin (1967), privacy is posited as the individual’s ability to control personal information. In a social media advertising context, this translates to enabling users to decide which data is shared with advertisers, how it is used, and for what purposes. GDPR and CCPA both reinforce this notion by stipulating clear consent requirements and opt-out mechanisms (Art. 4(11) GDPR; California Legislature, 2018).

Contextual Integrity: Helen Nissenbaum (2009) proposed that privacy norms depend on context, implying that people expect information shared in one context (e.g., with close friends on social media) to remain within that sphere. When platforms exploit such data for advertising or align it with external data sets, they may violate the contextual integrity of the user’s information. Under this view, regulations like GDPR’s data minimization principle attempt to limit such boundary-crossing.

Privacy Calculus: This perspective suggests individuals engage in a cost-benefit analysis regarding data disclosure (Dinev & Hart, 2006). Users may willingly share personal details if the perceived benefits—such as personalized content or platform connectivity—outweigh the risks of exposing that information. Social media platforms capitalized on this heuristic, providing convenient, “free” services in exchange for data. GDPR and CCPA disrupt the traditional calculus by introducing legal requirements for explicit consent and easy withdrawal of that consent, thereby readjusting how users weigh the risks and benefits.

These theoretical frameworks collectively underpin legislative debates on how to balance corporate innovation with individual rights, illuminating the ethical, social, and economic complexities of targeted advertising.

2.4 The Gradual Escalation of Regulatory Attention

Prior to GDPR, regulatory pressure on social media platforms was relatively modest. Although there were notable attempts by the EU (e.g., the ePrivacy Directive) and U.S. authorities to address online privacy, these efforts did not comprehensively restrain the data mining and profiling that powered targeted ads. The turning point emerged when user data controversies highlighted the vulnerabilities inherent in data-hungry business models.

GDPR’s introduction represented a quantum leap in terms of enforcement potential. By imposing fines of up to 4% of a company’s global annual turnover, it forced even the largest, most profitable social media platforms to prioritize compliance (Art. 83 GDPR). The regulation’s extraterritoriality further increased its global influence, triggering compliance measures in companies headquartered outside the EU but still serving EU residents (Tene & Polonetsky, 2019).

Similarly, the California Consumer Privacy Act (CCPA) became a groundbreaking state-level law. It offered Californian residents the right to understand what data companies collect about them, request its deletion, and opt out of its sale (California Legislature, 2018). Although narrower in scope than GDPR, CCPA’s significance derived from California’s status as a major technological hub, home to Silicon Valley and many leading social media firms. CCPA thus acted as a policy catalyst, prompting discussions about federal privacy legislation in the U.S. and inspiring additional states to draft their own privacy statutes (Hartmans, 2021).

2.5 The Role of Civil Society, Academia, and Media Outlets

Civil society organizations, privacy advocacy groups (e.g., the Electronic Frontier Foundation), academic researchers, and investigative journalists played crucial roles in surfacing data handling malpractices, thereby fueling public pressure for more robust laws. Journalistic exposés of unauthorized data sharing and academic studies documenting the extent of third-party tracking on social media sites bolstered public awareness (Angwin, Varner, & Tobin, 2017). These revelations demonstrated how easily data could travel beyond user control, reinforcing arguments for legal mechanisms that empower individuals with enforceable rights.

By placing user data mishandling under the public spotlight, media channels also influenced corporate self-regulation. Several platforms, anticipating reputational damage, began rolling out user-friendly privacy settings, transparency reports, and “data dashboards” even before GDPR and CCPA took effect. While these changes were often incremental, they indicated a broader recognition within the industry that social media business models reliant on opaque data practices were subject to growing scrutiny.

2.6 Summary of Historical and Conceptual Shifts

The interplay between evolving technology, user behavior, and gradual regulatory escalation set the stage for GDPR and CCPA to emerge as legislative landmarks. Social media’s dual function as a facilitator of global connectivity and a massive repository of personal data placed it at the nexus of privacy and commerce. Growing concerns about surveillance, user manipulation, and data security propelled lawmakers to craft laws that challenged the longstanding norms of unbridled data collection.

In turn, the conceptual underpinnings of privacy—ranging from control to contextual integrity—shed light on how these regulations tackle not just the protection of personal data but also the ethical dimensions of user autonomy and informed consent. Against this backdrop, the subsequent sections delve into a more granular evaluation of how GDPR and CCPA specifically influence social media advertising practices, with a heavy emphasis on practical adjustments, industry strategies, and emerging innovations in compliance.

3. Expanded Literature Review on GDPR and CCPA in the Context of Social Media Advertising

A growing corpus of academic, legal, and industry-oriented literature offers diverse perspectives on how data protection regulations affect targeted advertising. This section synthesizes these works, categorizing insights around areas such as consent frameworks, user perceptions, algorithmic disclosures, compliance strategies, and broader marketplace shifts. By focusing on the social media sphere, the review highlights convergences and divergences in scholarly and real-world understandings of GDPR and CCPA’s impact.

3.1 Consent Mechanisms, Opt-Out Models, and User Autonomy

3.1.1 The Centrality of Consent

Consent stands as a linchpin in GDPR’s approach to lawful data processing. Article 4(11) of GDPR requires that consent be “freely given, specific, informed, and unambiguous.” In social media advertising, this mandate typically translates into frequent user prompts, cookie consent banners, and in-app notifications seeking agreement on data usage. Researchers such as Utz, Degeling, Fahl, Schaub, and Holz (2019) have demonstrated that the design of consent prompts heavily influences acceptance rates, revealing how seemingly minor UI details can dramatically sway user decisions.

A parallel concept within CCPA is the user’s right to opt out of having personal data “sold.” Scholars like Klonick and McLaughlin (2020) observe that the term “sale” in CCPA is sufficiently broad to cover many forms of data transfer that occur in ad-targeting ecosystems. While GDPR mandates explicit opt-in for most forms of data processing, CCPA generally relies on an opt-out model. The contrast in these two approaches has produced ongoing debates regarding which structure better secures genuine user autonomy (California Legislature, 2018).

3.1.2 Impact on User Perceptions and Behavior

Academic literature indicates mixed outcomes regarding how these consent or opt-out measures affect user trust and engagement. On one hand, providing more tangible control over personal data can enhance trust and deter users from feeling unduly “tracked” (Xanthopoulou & Zampou, 2020). On the other hand, repeated displays of cookie banners and pop-ups have led to “consent fatigue,” wherein users mechanically click “accept” without fully absorbing the implications (Nouwens, Liccardi, Veale, Karger, & Kagal, 2020). Thus, some argue that while regulations require transparent disclosures, the real-world execution can inadvertently trivialize or obscure genuine informed consent.

3.1.3 The Advent of Global Privacy Controls

In response to user fatigue and the fragmented nature of opt-out forms, multiple browsers and advocacy groups have explored “global privacy controls”—technical signals that can communicate a user’s privacy preferences across different websites. Although these signals are still emerging, they reflect an industry-wide search for user-centric solutions that do not rely on complex site-by-site banners (Tene & Polonetsky, 2019). Social media platforms may eventually integrate or recognize these signals, offering a uniform approach to consent. However, practical adoption depends on regulatory guidance, industry coalitions, and the platforms’ willingness to relinquish granular data collection.

3.2 Algorithmic Transparency and the Right to Explanation

3.2.1 Theoretical Foundations

Algorithmic transparency refers to the obligation of platforms to disclose information about how automated systems process data for decision-making, including content ranking and targeted ad delivery (Goodman & Flaxman, 2017). GDPR implies certain rights for individuals to “obtain meaningful information about the logic” behind automated processing (Recital 71; Arts. 13–15 GDPR). Although it does not guarantee an explicit “right to explanation,” it nudges companies toward clearer disclosures about profiling practices.

3.2.2 Realities of Implementation

Studies show that social media giants, faced with demands for transparency, often provide vague or generalized explanations. For example, Facebook’s “Why am I seeing this ad?” feature outlines broad categories rather than detailing the complex interplay of user behaviors, advertiser data sets, and real-time bidding signals (Eslami, Karahalios, Sandvig, & Vaccaro, 2016). Some scholars argue that truly meaningful transparency would require explaining how machine learning models weigh numerous data points, a technical undertaking that could overwhelm average users (Mittelstadt, Allo, Taddeo, Wachter, & Floridi, 2016).

Moreover, platforms often cite concerns about revealing trade secrets and the potential for system gaming if they disclose too much about their algorithms. Veale, Binns, and Ausloos (2018) highlight the tension between user empowerment and the protection of proprietary systems. Consequently, while GDPR fosters a principle of accountability, the actual data shared with users about ad targeting remains relatively superficial.

3.2.3 Societal and Ethical Dimensions

Many researchers view algorithmic transparency not just as a compliance issue but as a societal imperative to mitigate risks of data-driven discrimination or political manipulation (Taddeo & Floridi, 2018). Particularly in social media contexts, targeted ads can be employed to spread misinformation or exclude certain audiences. By mandating disclosures around how user data informs ad delivery, regulators attempt to limit malicious practices. Yet, the literature suggests enforcement remains patchy, and deeper questions linger about how to guarantee fairness in automated advertising decisions (Kucharczyk, Hrendzak, & Nycz, 2021).

3.3 Data Minimization, Retention, and Profiling Limits

3.3.1 From Data Maximization to Data Minimization

Historically, platforms adopted a “collect it all” philosophy, assuming that larger data sets would yield improved personalization and analytics (Spiekermann, Acquisti, Böhme, & Hui, 2015). GDPR disrupts this logic by requiring organizations to only gather data strictly necessary for predetermined purposes (Art. 5(1)(c) GDPR). Although CCPA does not explicitly impose data minimization across the board, the user’s right to delete personal data encourages more selective collection, since storing excessive data unnecessarily raises compliance risks (California Legislature, 2018).

In practice, social media businesses have responded by reevaluating data flows and retention policies. Some have shortened data retention windows or reduced the granularity of stored user attributes to limit potential liabilities (Solove, 2020). Nevertheless, these adjustments can undercut the predictive power of targeted advertising algorithms, sparking concerns among advertisers about diminished ad performance (Goldfarb & Tucker, 2019).

3.3.2 Profiling and Automated Decision-Making

GDPR also explicitly addresses profiling, defining it as the automated processing of personal data to evaluate or predict aspects of an individual’s behavior, preferences, or performance (Art. 4(4) GDPR). Under Article 22, users are entitled to safeguards against fully automated decisions that produce legal or similarly significant effects. Whether targeted ads meet this threshold remains debated; some interpret “significant effects” to encompass manipulative or discriminatory ad delivery, especially regarding areas such as employment and credit offers (Klonick & McLaughlin, 2020).

If these interpretations gain traction, social media platforms may be compelled to integrate more robust human oversight or to allow users to challenge certain advertising-based decisions. This possibility introduces fresh operational complexities, as introducing more human review could slow down the real-time bidding processes that underpin modern ad auctions (Tene & Polonetsky, 2019).

3.4 Compliance Strategies and Industry Frameworks

3.4.1 Privacy-by-Design Implementations

Voigt and von dem Bussche (2017) emphasize that GDPR elevates privacy-by-design from a best practice to a legal requirement (Art. 25 GDPR). Social media companies are thus encouraged (or compelled) to embed privacy features at the earliest stages of platform and algorithm development. For targeted advertising, this might entail advanced consent management modules, granular user controls for data sharing, and built-in anonymization or pseudonymization.

Literature on privacy engineering underlines emerging best practices, such as privacy impact assessments (PIAs) during product development sprints, or the use of differential privacy to obscure individual-level data while preserving aggregate trends (Dwork, 2006). The aim is to reconcile personalization with robust user protections. However, integration of such measures can be resource-intensive, especially for smaller or newer platforms (Kucharczyk et al., 2021).

3.4.2 Ad Tech Alliances and Standardization Efforts

The advertising technology (ad tech) ecosystem is labyrinthine, involving supply-side platforms (SSPs), demand-side platforms (DSPs), data management platforms (DMPs), and myriad intermediaries. To streamline compliance, industry bodies like the Interactive Advertising Bureau (IAB) have released frameworks—most notably the IAB Europe Transparency and Consent Framework—to standardize how consent signals and user preferences are transmitted throughout the real-time bidding chain (IAB Europe, 2019).

However, these frameworks have attracted criticism for their complexity and potential self-serving loopholes (Libert, Graves, & Surmitis, 2020). Several regulators across Europe have also raised concerns about the legal validity of such frameworks, particularly around whether they secure meaningful user consent. On the U.S. side, the Digital Advertising Alliance introduced updated guidelines to align with CCPA’s requirements, though again, enforcement remains a challenge.

3.4.3 Vendor Management and Processor Contracts

Both GDPR and CCPA require organizations to carefully manage third-party data relationships. Under GDPR, controllers must ensure processors comply with data protection mandates, holding both parties jointly liable in some scenarios (Arts. 28–29 GDPR). CCPA similarly demands that businesses implement written agreements confirming the limitations on data usage by “service providers” (California Legislature, 2018).

For social media platforms reliant on external analytics and marketing partners, this translates into an intensified focus on vendor assessments, contractual safeguards, and regular audits. According to Tene and Polonetsky (2019), large social media companies have ramped up internal compliance divisions to monitor third-party integrations, but smaller platforms may lack equivalent resources. This discrepancy can shape the competitive landscape by favoring incumbents that can more readily absorb compliance costs (Goldfarb & Tucker, 2019).

3.5 Advertiser Adoption and Impact on Ad Performance

3.5.1 Shifting Advertising Strategies

Advertisers, facing heightened scrutiny and limited access to user data, have experimented with alternative targeting methods. Some emphasize contextual advertising, matching ads to the thematic content of a webpage or social media post rather than user profiles (Beresford, Kübler, & Preibusch, 2012). Others rely more heavily on first-party data gleaned from loyalty programs or direct customer relationships, integrating that data with social media advertising campaigns.

Studies surveying marketing professionals reveal that these shifts can produce mixed results. While contextual ads and broader demographic segmenting are less invasive, they often deliver lower precision than algorithmically driven personalization based on user behavior. Nonetheless, new techniques like “cohort-based” advertising—wherein users are grouped into semi-anonymous segments—may bridge some gaps, preserving degrees of personalization without exposing granular personal identifiers (Kucharczyk et al., 2021).

3.5.2 Measurable Changes to Ad Efficacy

Evaluating the tangible effect of GDPR and CCPA on ad performance can be difficult given the multitude of confounding factors. Early post-GDPR data suggested a short-term dip in ad prices for European traffic, likely due to decreased cookie-based tracking (Goldfarb & Tucker, 2019). Over time, however, the ad market appears to have recovered as advertisers refined their targeting approaches.

For social media specifically, large platforms remain dominant, partly because of their direct user relationships and colossal troves of first-party data. Smaller ad tech providers reliant on third-party cookies have reportedly encountered more pronounced disruptions (Tene & Polonetsky, 2019). This outcome supports the “privacy paradox” hypothesis that stricter regulation can inadvertently entrench market power among established players who have both the resources to comply and the user scale to continue offering robust targeting capabilities.

3.6 User Attitudes, Trust, and Engagement

3.6.1 Shifts in User Sentiment

A recurring question in the literature is whether GDPR and CCPA meaningfully change how users perceive and interact with social media platforms. Surveys indicate that many users do appreciate enhanced controls over their data, and some users do take advantage of opt-outs and data deletion rights (Eurobarometer, 2019). However, user behavior often diverges from stated attitudes: convenience and social connectivity may trump privacy concerns, leading users to remain on platforms despite reservations (Acquisti, Brandimarte, & Loewenstein, 2016).

3.6.2 Privacy Fatigue and Overload

Researchers have documented the phenomenon of “privacy fatigue,” where individuals become overwhelmed by the constant need to review and respond to privacy notices (Nouwens et al., 2020). The repetitiveness of cookie banners can dull awareness, causing users to consent automatically. This phenomenon undermines the spirit of informed consent embedded in GDPR, raising questions about whether regulatory compliance in practice aligns with theoretical goals.

Additionally, a small subset of users employs privacy-enhancing tools like tracker blockers, virtual private networks (VPNs), or anti-profiling browser extensions, indicating that at least part of the user base remains highly motivated to shield personal data (Solove, 2020). These users’ behaviors provide a counterpoint to privacy fatigue, suggesting that consumer privacy activism is alive and well, even if not universally adopted.

3.7 Emerging Global Context

3.7.1 Policy Diffusion and International Copycats

Greenleaf (2018) notes that GDPR has influenced privacy legislation beyond Europe. Countries such as Brazil (Lei Geral de Proteção de Dados), India (proposed Personal Data Protection Bill), and Japan (amended Act on the Protection of Personal Information) have adopted or contemplated frameworks with similarities to GDPR. For social media businesses, this proliferation of robust data protection laws signals a broader global convergence on stricter privacy norms.

Likewise, CCPA-inspired laws are proliferating within the United States. States such as Virginia, Colorado, Connecticut, and Utah have passed privacy bills echoing CCPA’s approach. A possible federal data privacy law in the U.S. could unify or supersede state regulations. Scholars like Hartmans (2021) highlight how this patchwork environment complicates compliance for social media platforms, which must juggle multiple legal regimes with varying definitions, thresholds, and enforcement mechanisms.

3.7.2 Cross-Border Data Transfers and Enforcement Complexity

GDPR’s emphasis on “adequate” levels of protection for cross-border data transfers (Arts. 44–50 GDPR) has prompted legal disputes, such as the invalidation of the EU-U.S. Privacy Shield. Social media companies exchanging data with U.S.-based ad partners often resort to standard contractual clauses or alternative legal bases. But these mechanisms remain subject to ongoing legal scrutiny, adding another layer of complexity to compliance (Greenleaf, 2018).

For CCPA, cross-border transfers do not attract the same explicit focus, but businesses that handle data of California residents must ensure compliance. Social media firms operating globally thus face a mosaic of localized requirements, culminating in the need for robust governance infrastructures that track and adapt to evolving regulations across multiple territories.

3.8 Conclusion of the Literature Review

The scholarly and industry dialogue consistently portrays GDPR and CCPA as pivotal forces reshaping targeted advertising, with social media platforms at the forefront. Key themes include the vital role of consent and user autonomy, the challenges of operationalizing algorithmic transparency, the shift from data maximization to data minimization, and the nuanced economic repercussions of compliance.

Researchers generally concur that while GDPR and CCPA have significantly raised compliance hurdles and administrative costs, they also nudge platforms toward more privacy-conscious business models. The greatest uncertainties lie in how effectively these laws are enforced, how user attitudes may evolve, and whether emergent technologies or legislative refinements will alleviate or exacerbate tensions between personalization and privacy. In the context of social media, these dynamics are especially pronounced due to the platforms’ scale, global footprints, and the societal impact of content dissemination.

4. Operational and Strategic Implications for Social Media Platforms

Building upon the extensive findings from the literature review, this section explores how social media platforms have responded to GDPR and CCPA in real-world settings. It addresses the organizational, technological, and strategic adjustments that companies implement to align with evolving regulatory standards while continuing to generate revenue from advertising.

4.1 Organizational Restructuring and Governance

4.1.1 Privacy and Compliance Teams

One of the first visible shifts in large social media organizations has been the formation or expansion of dedicated privacy and compliance teams, as well as the appointment of Data Protection Officers (DPOs) under GDPR. These teams work cross-functionally, collaborating with engineering, legal, marketing, and product departments to embed privacy requirements into every stage of platform development (Voigt & von dem Bussche, 2017).

Social media executives increasingly recognize that privacy non-compliance poses not only legal risks but also reputational hazards that can alienate users and advertisers. Consequently, privacy compliance discussions have moved from peripheral concerns to core strategic considerations. This transition represents a cultural shift wherein privacy is viewed less as a box-ticking exercise and more as a competitive differentiator that can enhance user trust (Solove, 2020).

4.1.2 Establishing Internal Privacy Councils and Advisory Boards

Some platforms have created internal bodies—often called “privacy councils” or “ethical advisory boards”—to oversee data protection policies and to arbitrate conflicts between product innovation and regulatory obligations. Such bodies can incorporate representatives from both technical and non-technical divisions, ensuring a holistic approach to privacy (Kucharczyk et al., 2021).

In some cases, external experts or ethicists are invited to participate. These external advisors help validate whether the platform’s data practices conform not only to the letter of the law but also to broader ethical standards. While this approach signals a genuine commitment to responsible data use, critics question whether such initiatives are sufficiently empowered or if they risk becoming mere public relations tools.

4.2 Consent Management and UX Redesign

4.2.1 Crafting User-Centric Consent Flows

To comply with GDPR, social media platforms have introduced explicit consent pop-ups and granular privacy settings that allow users to enable or disable specific data-processing features. In the context of targeted advertising, these interfaces often present multiple toggles for ad personalization, data sharing with third parties, and the use of sensitive data for profiling.

Researchers suggest that more sophisticated UX design can help combat consent fatigue, for instance, by employing layered disclosures: an initial, concise explanation followed by deeper levels of detail upon user request (Utz et al., 2019). However, some critics argue that even layered approaches can be manipulative if they steer users toward acceptance through pre-highlighted toggles or color-coded design patterns (Mathur et al., 2019).

4.2.2 Personalized Privacy Dashboards

In addition to onboarding consent prompts, many platforms now offer “privacy dashboards” where users can review and adjust their advertising preferences, examine which categories they’ve been slotted into, and see a history of ads they’ve encountered. These dashboards aim to demystify how personalization engines operate and to fulfill requirements around the right to access and rectify data (Arts. 15–16 GDPR).

Though widely lauded, the usage of these dashboards remains uneven. Studies show that only a small minority of users frequently revisit their privacy settings, suggesting that while the dashboards align with regulatory mandates, they do not necessarily transform user engagement with privacy on a large scale (Nouwens et al., 2020).

4.3 Technological Overhauls: Data Handling and Ad Delivery

4.3.1 Minimizing Third-Party Dependencies

A recurring theme in social media’s compliance journey is the pivot away from third-party trackers and data brokers. Platforms increasingly emphasize first-party data: information users generate while logged in and interacting directly with the platform. This shift reduces legal risks tied to external partners who might misuse or inadequately secure data (Degeling, Utz, Lentzsch, Fahl, & Holz, 2019).

Moreover, controlling data end-to-end within the platform enhances the ability to secure and audit data flows. While advertisers can still leverage the platform’s targeting tools, the data exchange remains internal, theoretically decreasing the probability of unauthorized access and simplifying compliance with obligations to inform users about data recipients (Tene & Polonetsky, 2019).

4.3.2 Implementing Privacy-Enhancing Technologies (PETs)

Privacy-enhancing technologies (PETs) have gained traction as potential solutions to the dilemma of retaining advertising relevance without compromising user privacy. Examples include:

Differential Privacy: Integrating mathematically calibrated noise into datasets or aggregated metrics so that individual user data cannot be reverse-engineered (Dwork, 2006).

Federated Learning: Allowing the training of machine learning models on user devices directly, so raw data does not leave the device. The platform receives only aggregated model updates (Yang, Liu, Chen, & Tong, 2019).

Secure Multiparty Computation: Enabling multiple parties to collaborate on computations—such as matching user attributes with advertiser requirements—without revealing the underlying raw data to one another (Taddeo & Floridi, 2018).

While research and pilot programs demonstrate the viability of these tools, widespread industrial adoption remains hampered by technical complexity and concerns about performance overheads. Nonetheless, leading social media platforms have made incremental moves, deploying partial or hybrid PET solutions, such as local caching or on-device personalization for certain ad targeting tasks (Kucharczyk et al., 2021).

4.4 Strategic Alignments with Advertisers

4.4.1 Redesigning Ad Products

Social media companies have introduced new or revised ad product offerings that comply more seamlessly with GDPR and CCPA. Some have restricted the granularity of targeting options—eliminating certain targeting criteria like race, religion, sexual orientation, or political affiliation to avoid sensitive data processing that would require explicit user consent (Art. 9 GDPR). This approach is also meant to curb discriminatory ad practices (Klonick & McLaughlin, 2020).

In parallel, platforms promote “lookalike audience” features where advertisers can upload hashed contact lists, and the platform finds similar users without revealing personal identities. This approach can be more privacy-friendly than direct user-level targeting, though it still hinges on large datasets and sophisticated profiling.

4.4.2 Ad Metrics and Measurement

Regulatory constraints on data collection have pushed social media companies to refine how they measure ad performance. For instance, instead of event-level data on individual user actions, platforms might present advertisers with aggregated performance metrics. Advertisers can still track campaign effectiveness, but they have less capacity to trace user-level journeys.

Some tech giants experiment with “data clean rooms,” secure environments where advertisers can bring first-party data to match with platform data at an aggregated, anonymized level. The platform, acting as the trusted intermediary, ensures no raw personal data is directly exposed, thus aligning with GDPR’s principle of privacy by design (Tene & Polonetsky, 2019).

4.5 Risk Management and Legal Strategy

4.5.1 Auditing and Documentation

GDPR and CCPA obligations include extensive documentation of data processing activities. Social media platforms must maintain records of how data is collected, for what purposes it is used, and with whom it is shared. Regular audits are essential for verifying compliance, and some platforms have instituted continuous monitoring systems to flag potential violations (Voigt & von dem Bussche, 2017).

Where complexities arise, especially with large volumes of programmatic ad auctions, platforms sometimes adopt risk-based approaches, assessing which data flows present the greatest regulatory or reputational hazards. Risk scores or matrices guide decisions on resource allocation, ensuring compliance efforts concentrate on areas of highest exposure (Solove, 2020).

4.5.2 Litigation and Regulatory Engagement

Despite best efforts, social media companies occasionally face investigations or legal challenges. Regulators in the EU have levied considerable fines for GDPR infractions related to targeted advertising, spurring appeals and prolonged legal battles (Greenleaf, 2018). The outcomes of these cases set precedents that further refine or reshape the interpretation of data protection rules.

Many companies maintain open lines of communication with regulatory bodies, pursuing clarifications on ambiguous provisions and adapting practices based on preliminary guidance. The intensity of enforcement varies across regions—some Data Protection Authorities (DPAs) are more proactive, while others are resource-constrained. This uneven enforcement can lead to confusion, prompting corporations to aim for a high-level compliance baseline that can satisfy the strictest potential regulator (Kucharczyk et al., 2021).

5. Stakeholder Perspectives and Challenges

While social media platforms serve as the central arena where GDPR and CCPA regulations meet advertising practices, several other stakeholders have vested interests in how these laws are interpreted and enforced. This section considers the viewpoints of advertisers, users, policymakers, and consumer advocacy groups, highlighting both opportunities and tensions.

5.1 Advertisers and Digital Marketers

5.1.1 Realignment of Campaign Strategies

Advertisers have historically prized social media platforms for their unparalleled targeting precision. Yet, stricter data rules can limit the availability of granular information, prompting marketers to revamp campaigns. Many shift toward broader demographic or contextual targeting, which can be less efficient but also simpler to reconcile with compliance.

Given the intricacies of GDPR and CCPA, advertisers increasingly look for “privacy-friendly” solutions that still allow them to reach relevant audiences. Some rely on their own first-party customer data, merging it with platform tools to create custom or “lookalike” audiences. This approach can yield strong results if the advertiser’s data is robust, but it also necessitates alignment with the platform’s compliance framework (Beresford, Kübler, & Preibusch, 2012).

5.1.2 Trust and Brand Image

Privacy controversies on social media can tarnish not only the platform’s reputation but also that of advertisers. Brands risk guilt by association if they are seen to benefit from questionable data practices. Consequently, major advertisers often perform due diligence on platform partners, inquiring about how user data is sourced and verified (Spiekermann & Korunovska, 2017). Adapting to GDPR and CCPA is thus as much about reducing reputational risk as it is about fulfilling legal requirements.

5.2 End Users

5.2.1 Mixed Reactions to Privacy Controls

Users often welcome the concept of greater control over their personal data, yet in practice, many exhibit a degree of apathy or confusion about how to exercise it. Studies show that even if they appreciate the right to delete data or opt out, day-to-day usage patterns may remain unchanged (Acquisti et al., 2016). Nonetheless, the availability of privacy settings does provide recourse for users who are particularly sensitive about data usage, empowering them to tailor their ad experiences or exit the ecosystem.

5.2.2 Shifts in Perceived Platform Responsibility

Social media platforms once presented themselves primarily as neutral conduits for user-generated content. With the emergence of GDPR and CCPA, there is a growing sense among users that platforms are formal data stewards responsible for safeguarding personal information (Solove, 2020). This shifting perception can prompt calls for additional transparency and accountability, especially when controversies—like data breaches—undermine public trust.

5.3 Policymakers and Regulatory Authorities

5.3.1 Balancing Innovation and User Protection

Policymakers grapple with how to spur digital innovation while ensuring consumer protections against invasive practices. Some fear that overregulation might stifle technological advancement, particularly for smaller players who lack the resources to implement robust compliance programs (Tene & Polonetsky, 2019). Others maintain that strong privacy rules can stimulate innovation in privacy-centric solutions, thereby reinforcing user trust in digital ecosystems.

5.3.2 Fragmentation and Harmonization

Within the EU, GDPR seeks to harmonize data protection rules, yet national Data Protection Authorities differ in their enforcement vigor and interpretations. In the United States, the lack of comprehensive federal legislation has led to a patchwork of state-level laws modeled after or diverging from CCPA (Hartmans, 2021). Regulatory fragmentation complicates the compliance landscape for social media platforms, although it may push legislators to consider more uniform national standards.

5.4 Consumer Advocacy and Privacy Groups

5.4.1 Watchdogs of Corporate Behavior

Consumer advocacy organizations monitor social media platforms for potential infringements of user rights, regularly filing complaints with regulators. These groups sometimes act as catalysts for investigations, leveraging legal mechanisms under GDPR and CCPA to bring attention to suspected violations. Their scrutiny can spur platforms to be more transparent and user-friendly in their data handling.

5.4.2 Public Education and Guidance

In addition to their watchdog function, advocacy groups publish educational resources to help users understand their rights and navigate privacy settings. This role is increasingly important, as the complexity of online ecosystems can render even the most conscientious user uncertain about how to protect personal data. By bridging the knowledge gap, advocacy organizations help ensure that regulations like GDPR and CCPA deliver tangible benefits to the average social media user (Nissenbaum, 2009).

6. Future Directions: Privacy-Centric Innovations and Regulatory Trajectories

Even as social media platforms continue to absorb and respond to the mandates of GDPR and CCPA, broader shifts in technology, user sentiment, and policymaking suggest that the regulatory environment will remain fluid. This section discusses likely future developments in advertising technologies, possible legal evolutions, and the ongoing balancing act between personalization and user autonomy.

6.1 Advancements in Privacy-Enhancing Technologies

6.1.1 More Sophisticated Differential Privacy

As computational capacities grow, platforms may deploy advanced forms of differential privacy that strike a finer balance between data utility and anonymity. Future iterations could allow platforms to reveal aggregated user trends to advertisers without compromising individual-level data, thereby enabling targeted campaigns without explicit user tracking (Dwork, 2006). This approach might also help address rising concerns about re-identification attacks, ensuring that the random noise injected into data sets is sufficiently robust.

6.1.2 Federated and On-Device Learning at Scale

In an attempt to limit server-side data collection, social media companies might embrace or expand federated learning models. These systems allow personalization algorithms to train locally on user devices, uploading only encrypted model updates. However, questions remain about computational overheads for user devices, data biases, and the potential challenges of maintaining model accuracy over time (Yang et al., 2019).

6.1.3 Cryptographic Techniques for Ad Attribution

Cryptographic protocols such as zero-knowledge proofs and homomorphic encryption could facilitate privacy-preserving ad attribution. Advertisers would validate whether conversions resulted from specific campaigns without directly accessing personal identifiers (Taddeo & Floridi, 2018). Such breakthroughs would reduce reliance on pixel-based tracking or device fingerprinting, aligning well with GDPR’s emphasis on user consent and data minimization.

6.2 The Future of Business Models

6.2.1 Subscription or Freemium Models

Some analysts predict that as data-based advertising faces tighter restrictions, a subset of social media platforms might experiment with subscription tiers. These offerings could provide an ad-free, privacy-enhanced experience for a monthly fee (Beresford et al., 2012). While this approach redefines the traditional “free services for data” deal, it remains uncertain whether enough users are willing to pay for social media subscriptions to offset lost advertising revenue.

6.2.2 Microtransactions and Digital Goods

Another alternative could involve microtransactions for premium features, virtual goods, or community perks. Platforms like TikTok already permit in-app purchases (e.g., digital gifts), suggesting new revenue streams that lessen dependence on targeted ads. By diversifying income sources, platforms might mitigate potential dips in ad revenue stemming from privacy regulations.

6.3 Legislative and Policy Evolutions

6.3.1 CPRA and Potential Federal Privacy Laws in the U.S.

California’s Privacy Rights Act (CPRA), effective from 2023, refines the CCPA by expanding the definition of sensitive personal information and empowering a dedicated enforcement agency (California Legislature, 2020). This heightened focus on data usage in California could serve as a template for other U.S. states or spark federal action. A single federal privacy law could reduce compliance complexity but might also introduce new obligations, depending on its design (Hartmans, 2021).

6.3.2 EU ePrivacy Regulation

Besides GDPR, the EU has been working on updating its ePrivacy Directive into a more comprehensive ePrivacy Regulation. This upcoming regulation may further refine rules on tracking technologies, direct marketing, and the confidentiality of communications. If ratified, social media platforms may need to reassess how they handle messaging features and embedded trackers within their services (Greenleaf, 2018).

6.3.3 Global Harmonization or Fragmentation?

The global impact of GDPR, coupled with CCPA-inspired U.S. state laws, suggests a trend toward stronger data privacy regulations worldwide. However, some jurisdictions may adopt narrower or more business-friendly approaches, leading to further fragmentation. For transnational social media platforms, the challenge will remain in adapting to a complex patchwork of rules, potentially shifting development toward “privacy by localization”—storing and processing data in each jurisdiction separately to comply with local laws (Tene & Polonetsky, 2019).

6.4 User Empowerment and Privacy Activism

6.4.1 Rise of Collective Negotiation

User cooperatives or data trusts are novel concepts where individuals collectively manage their data rights, negotiating terms with platforms. Although still largely hypothetical, such structures could give users more bargaining power and transparency regarding how data is shared and monetized (Lanier, 2014).

6.4.2 Greater Awareness Through Education

Efforts by advocacy groups, educational initiatives, and media outlets may continue to heighten user awareness around privacy rights. As more users learn to exercise data deletion or opt-out options effectively, social media platforms could see greater variability in user data pools, incentivizing them to design ad systems less reliant on invasive tracking and more reliant on user-friendly personalization (Nissenbaum, 2009).

7. Conclusion

GDPR and CCPA have ushered in a new era of data governance, compelling social media platforms and their advertising partners to recalibrate the delicate interplay between personalization and privacy. Originally, social media ecosystems thrived on sweeping user data collection, fueling highly targeted ad campaigns that effectively monetized the digital attention economy. Yet, with regulators asserting that user rights and data protection must not be secondary considerations, the industry faces an ongoing transformation in how personal information is gathered, processed, and shared.

Through a detailed historical and conceptual examination, this paper illustrates how social media advertising rose to dominance in tandem with evolving Web 2.0 technologies, setting the stage for unprecedented levels of personal data collection. The subsequent introduction of GDPR and CCPA fundamentally challenged these established practices, imposing stringent compliance frameworks that redefine permissible data flows, mandate explicit consent, and encourage data minimization strategies.

The literature review underscores that while both GDPR and CCPA aim to enhance user autonomy, the mechanisms differ—GDPR foregrounds opt-in consent and broad extraterritorial reach, while CCPA emphasizes opt-out provisions and focuses on restricting the “sale” of personal information. Despite these differences, the overarching impact on social media advertising shares common threads: reconfigured consent interfaces, algorithmic transparency (albeit limited in practical implementation), reevaluations of profiling limits, and the adoption of privacy-by-design technologies.

From an operational standpoint, platforms have restructured governance by appointing dedicated privacy officers, forming internal privacy councils, and revamping user-facing controls. Technologically, the pivot away from third-party cookies and the exploration of privacy-enhancing techniques like federated learning highlight a concerted effort to reconcile targeted advertising with legal and ethical obligations. Advertisers, in turn, navigate a revised ad landscape, experimenting with contextual or first-party data-based targeting methods that lessen compliance risks while maintaining campaign efficacy.

Numerous stakeholders exert influence on this rapidly evolving field: advertisers adjusting to new forms of measurement and segmentation, users grappling with more complex privacy settings and potential “consent fatigue,” policymakers striving for balanced legislation, and consumer advocacy groups pressing for stringent enforcement. As a result, compliance is not a one-time milestone but an ongoing process that shapes product roadmaps, corporate culture, and competitive dynamics.

Looking ahead, privacy-centric innovations—ranging from advanced differential privacy algorithms to cryptographic attribution solutions—may further reshape advertising on social media. Meanwhile, legislative changes such as California’s CPRA and a possible federal U.S. privacy law suggest that regulatory flux will continue. For social media platforms, the challenge will be to foster user trust and meet legal benchmarks without sacrificing the personalization that underpins their advertising revenues. Ongoing research and multi-stakeholder dialogue are needed to evaluate how effectively emerging solutions protect user autonomy and privacy while preserving the economic viability of digital platforms.

In sum, GDPR and CCPA serve as critical exemplars of how data protection regulations can force industries—particularly those reliant on personal data—to reassess foundational assumptions, adapt to new operational realities, and potentially innovate toward more ethical and user-friendly approaches. The case of targeted advertising in social media offers valuable insights into the interplay of commerce, technology, and legal frameworks, reminding us that the quest for a balanced digital ecosystem remains a continuing, iterative endeavor.

Acknowledgments

The authors would like to extend their gratitude to Dr. Elizabeth Monroe, who served as a guiding research mentor throughout the development of this paper. Her expertise and insightful feedback helped shape the direction and depth of this study. We also thank the countless researchers, policymakers, and industry practitioners whose work and advocacy continue to enrich the field of data privacy.

References

Acquisti, A., Brandimarte, L., & Loewenstein, G. (2016). Privacy and human behavior in the age of information. Science, 347(6221), 509-514.

Angwin, J., Varner, M., & Tobin, A. (2017). Facebook’s secret censorship rules protect white men from hate speech but not black children. ProPublica. Retrieved from https://www.propublica.org

Art. 4(11) GDPR. General Data Protection Regulation (EU) 2016/679.

Art. 5(1)(c) GDPR. General Data Protection Regulation (EU) 2016/679.

Art. 9 GDPR. General Data Protection Regulation (EU) 2016/679.

Art. 22 GDPR. General Data Protection Regulation (EU) 2016/679.

Art. 25 GDPR. General Data Protection Regulation (EU) 2016/679.

Art. 28 GDPR. General Data Protection Regulation (EU) 2016/679.

Art. 83 GDPR. General Data Protection Regulation (EU) 2016/679.

Beresford, A. R., Kübler, D., & Preibusch, S. (2012). Unwillingness to pay for privacy: A field experiment. Economics Letters, 117(1), 25-27.

Boyd, D. M., & Ellison, N. B. (2007). Social network sites: Definition, history, and scholarship. Journal of Computer-Mediated Communication, 13(1), 210-230.

California Legislature. (2018). California Consumer Privacy Act of 2018 (CCPA), Cal. Civ. Code § 1798.100–1798.199.

California Legislature. (2020). California Privacy Rights Act (CPRA).

Degeling, M., Utz, C., Lentzsch, C., Fahl, S., & Holz, T. (2019). We value your privacy… now take some cookies: Measuring the GDPR’s impact on web privacy. In Proceedings 26th Annual Network and Distributed System Security Symposium (pp. 1-15).

Dinev, T., & Hart, P. (2006). An extended privacy calculus model for e-commerce transactions. Information Systems Research, 17(1), 61-80.

Directive 95/46/EC. (1995). Official Journal of the European Communities, L281, 31-50.

Dwork, C. (2006). Differential privacy. In Proceedings of the 33rd International Conference on Automata, Languages and Programming, 1-12.

Eslami, M., Karahalios, K., Sandvig, C., & Vaccaro, K. (2016). First I “like” it, then I hide it: Folk theories of social feeds. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (pp. 2371-2382).

European Commission. (1995). Directive 95/46/EC of the European Parliament and of the Council. Official Journal of the European Communities, L281, 31-50.

European Commission. (2016). Regulation (EU) 2016/679 of the European Parliament and of the Council (GDPR). Official Journal of the European Union, L119, 1-88.

Eurobarometer. (2019). Special Eurobarometer 487a: The General Data Protection Regulation. European Commission. Retrieved from https://europa.eu

Evans, D. S. (2009). The online advertising industry: Economics, evolution, and privacy. Journal of Economic Perspectives, 23(3), 37-60.

Goldfarb, A., & Tucker, C. (2019). Digital economics. Journal of Economic Literature, 57(1), 3-43.

Goodman, B., & Flaxman, S. (2017). European Union regulations on algorithmic decision-making and a “right to explanation”. AI Magazine, 38(3), 50-57.

Greenleaf, G. (2018). Global data privacy laws 2019: 132 national laws and many bills. Privacy Laws & Business International Report, 157, 14-18.

Hartmans, A. (2021). The patchwork of US privacy laws and the path toward a federal policy. Journal of Data Policy and Regulation, 2(3), 45-62.

IAB Europe. (2019). Transparency & Consent Framework v2.0. Retrieved from https://iabeurope.eu

Klonick, K., & McLaughlin, J. (2020). The new privacy regulation horizon: Lessons from California. Hastings Law Journal, 71(5), 1103-1130.

Kucharczyk, M., Hrendzak, T., & Nycz, M. (2021). The road to compliance: GDPR, CCPA, and the impact on data-driven businesses. Journal of International Data Privacy Law, 9(2), 56-69.

Lanier, J. (2014). Who owns the future? Simon & Schuster.

Libert, T., Graves, L., & Surmitis, T. (2020). Changes in third-party content on European news websites after GDPR. New Media & Society, 22(11), 2013-2031.

Mathur, A., Acar, G., Friedman, M., Lucherini, E., Mayer, J., & Narayanan, A. (2019). Dark patterns at scale: Findings from a crawl of 11K shopping websites. In Proceedings of the ACM on Human-Computer Interaction, 3(CSCW), 1-32.

Mittelstadt, B. D., Allo, P., Taddeo, M., Wachter, S., & Floridi, L. (2016). The ethics of algorithms: Mapping the debate. Big Data & Society, 3(2), 1-21.

Nissenbaum, H. (2009). Privacy in context: Technology, policy, and the integrity of social life. Stanford University Press.

Nouwens, M., Liccardi, I., Veale, M., Karger, D., & Kagal, L. (2020). Dark patterns after the GDPR: Scraping consent pop-ups and demonstrating their influence. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (pp. 1-13).

PwC Global Entertainment & Media Outlook. (2021). Annual Report on the Global Advertising Industry. PricewaterhouseCoopers.

Solove, D. J. (2020). The myth of the privacy paradox: Conceptions of privacy in law and research. George Washington Law Review, 89(1), 1-55.

Spiekermann, S., Acquisti, A., Böhme, R., & Hui, K. (2015). The challenges of personal data markets and privacy. Electronic Markets, 25(2), 161-167.

Spiekermann, S., & Korunovska, J. (2017). Towards a value theory for personal data. Journal of Information Technology, 32(1), 62-84.

Taddeo, M., & Floridi, L. (2018). How AI can be a force for good. Science, 361(6404), 751-752.

Tene, O., & Polonetsky, J. (2019). The rise of the privacy tech industry: Building a safer digital future. International Data Privacy Law, 9(3), 177-191.

Utz, C., Degeling, M., Fahl, S., Schaub, F., & Holz, T. (2019). (Un)informed consent: Studying GDPR consent notices in the field. In Proceedings of the ACM SIGSAC Conference on Computer and Communications Security (pp. 973-990).

Veale, M., Binns, R., & Ausloos, J. (2018). When data protection by design and data subject rights clash. International Data Privacy Law, 8(2), 105-123.

Voigt, P., & von dem Bussche, A. (2017). The EU General Data Protection Regulation (GDPR): A Practical Guide. Springer.

Westin, A. F. (1967). Privacy and freedom. Atheneum.

Xanthopoulou, P., & Zampou, L. (2020). Consumer trust in digital services under GDPR and CCPA. European Journal of Consumer Law, 13(2), 45-62.

Yang, Q., Liu, Y., Chen, T., & Tong, Y. (2019). Federated machine learning: Concept and applications. ACM Transactions on Intelligent Systems and Technology, 10(2), 1-19.

RELATED ARTICLES

Leave a Reply

- Advertisment -
Google search engine

Categories

Recent Comments

Reset password

Enter your email address and we will send you a link to change your password.

Get started with your account

to save your favourite homes and more

Sign up with email

Get started with your account

to save your favourite homes and more

By clicking the «SIGN UP» button you agree to the Terms of Use and Privacy Policy
Powered by Estatik

Discover more from National High School Journal of Contemporary Scholarship

Subscribe now to keep reading and get access to the full archive.

Continue reading