Noga Rosenthal is General Counsel and Chief Privacy Officer at Ampersand, a $2B data-driven TV company. With over 17 years of experience advising leading technology companies, Noga brings extensive expertise in international privacy laws, complex commercial transactions, and public policy work. Her career spans roles at Epsilon, the Network Advertising Initiative, Xaxis, and Sotheby's, where she has built global privacy programs, led high-performing teams, and driven industry collaboration in the digital space.
As an industry thought leader, Noga has published extensively on topics including data subject rights, data retention guidance, and AI governance. She holds certifications as a Certified Information Privacy Professional (CIPP/US, CIPP/EU, and AIGP) and serves as an advisor on privacy, security, and AI governance strategies.
Q: Tell us about your journey into privacy and what drew you to this field.
I've been in the privacy field for over 17 years now, with my journey beginning in the advertising technology (ad tech) industry. Ad tech has always been at the forefront of the privacy conversation—it's where the tension between data-driven innovation and consumer expectations plays out most visibly. A lot of the privacy greats started in the ad tech field.
Back in 2008, we were often navigating gray data-ethics questions amid rapidly evolving technologies and a constantly shifting ad tech landscape. I remember a specific instance about 15 years ago when a company wanted to run a banner ad that would include the consumer's first name in the creative, calling them out to open a new financial account. I immediately said no to running this ad. People expected a certain anonymity online and did not expect an advertiser to know their name when they were surfing the web. At that time, there wasn't clear guidance around ethical data use, so we were working together to create self-regulatory principles and making gut decisions about whether ad campaigns like this were too intrusive or "creepy."
Now, with comprehensive privacy laws passing across various states and countries, we have frameworks that can help companies define appropriate boundaries between personalization and intrusion. What I enjoy most now is that many of the new privacy laws open up questions that must be answered like puzzles: how does this new data use fit within this privacy law? Having said that, privacy officers in the ad tech space still grapple with evolving issues that fall outside the law.
During my career, I have built out global privacy programs across different ecosystems, each with its own unique challenges, including programmatic advertising, email advertising, and TV advertising. Each role required a different strategic viewpoint on privacy and global compliance.
Q: What does Ampersand do, and what is your role there?
Ampersand is a data-driven TV company that helps advertisers reach audiences across TV platforms through advanced data viewership insights. As General Counsel and Chief Privacy Officer, I serve as the chief legal advisor to our senior leadership and Board of Directors, guiding comprehensive legal, compliance, and risk management strategies.
My role encompasses everything from corporate governance and antitrust issues to leading negotiation of complex contracts. On the privacy side, I help strategize to manage our privacy risks and develop proactive policies for privacy, advertising, and AI ethics, continuously aligning the legal function with Ampersand's revenue growth objectives. Being in the TV advertising space means dealing with complex privacy issues, so my privacy expertise is particularly valuable as we navigate the evolving landscape of personalized advertising while maintaining consumer trust.
Q: As both General Counsel and Chief Privacy Officer, you wear two hats. How do you balance the sometimes competing priorities of privacy and other business considerations?
Balancing the roles of General Counsel and Chief Privacy Officer means I'm constantly weighing legal risk, business goals, and privacy principles together, not in isolation. I don't see privacy as something that competes with the business; I see it as a strategic asset that, when handled well, builds trust, strengthens brand value, and mitigates our long-term risk. When tensions do arise, like needing to move quickly in a data-driven initiative, I focus on collaboration, asking: how can we achieve our goals in a way that respects individuals' privacy, complies with the law, and still delivers value to the business? It's less about tradeoffs and more about finding smart, compliant solutions that support growth and innovation. I truly believe that our employees also expect us to follow ethical data principles.
The other key is to be a practical problem-solver. When the business wants to pursue a new initiative that raises privacy concerns, instead of simply saying "no," I work with teams to find alternative approaches that achieve the same business objective while respecting privacy principles. This collaborative approach helps maintain my credibility as someone who enables the business rather than blocks it. These conversations can also lead to innovative new business practices.
I also ensure I create space for deep thinking on complex issues. As one former supervisor taught me, it's incredibly important to carve out "quiet time" for strategic reflection rather than constantly reacting to the immediate demands of both roles. I rely on a strong team of attorneys to help push through other initiatives so I have time to do a deep dive on an issue.
Q: With your extensive experience across different organizations, how does your approach to AI governance differ from traditional privacy management?
I actually don't think that our approach is so different. People often think that a privacy officer works in a silo, but we build out a network of colleagues in HR, marketing, IT, and security to help us build a privacy compliance program and keep us in compliance. AI adds a new layer of complexity, but on familiar ground: we built out a similar program for AI. I also sit in product roadmap meetings, whether it's to review a new use of our ad tech platform or to create a new AI tool. Either way, we approach our privacy and AI governance programs the same way: as a multidisciplinary effort.
At Ampersand, I helped initiate a multi-stakeholder approach to AI governance, bringing together our CTO, IT team, and security team to build a framework for responsible AI development and use. Rather than trying to own AI governance exclusively within legal or privacy, I positioned myself as the coordinator of a cross-functional effort.
What surprised me the most was that, at times, I was one of the most vocal proponents for using AI across our organization. I'm excited to see what we can do with AI, but I know we should do so with guardrails.
Q: With the rise of state privacy laws and increasing regulatory complexity, what strategies have you found effective for managing privacy compliance without constantly reinventing your program?
I've found that building a privacy program for flexibility rather than reacting to each new law individually is absolutely essential. Being proactive rather than reactive has saved us countless hours and resources. Trying to predict what new privacy concern will pop up next is also key.
For years, the ad tech industry has had to adjust to shifting regulatory and platform-driven privacy changes, from GDPR to Apple's App Tracking Transparency. Now, with state laws restricting sensitive data use, companies must move from a law-by-law approach to a risk-based, scalable compliance model. We also need to make sure that our privacy guardrails are understandable and that our employees can follow them. This includes operationalizing our data governance policies, not just setting policy. We also create our own internal ethical standards that go beyond the law and instill them within our Ampersand teams.
To do this right, I recommend establishing a strong privacy foundation based on global principles that represent the highest common denominator across regulations. This means implementing a robust data governance program that can adapt to various jurisdictional requirements without complete overhauls, and then training your teams on your privacy program.
Another effective strategy is to leverage privacy-enhancing technologies such as data clean rooms, develop a first-party data strategy, and use contextual advertising approaches that reduce regulatory exposure. These approaches often deliver effective business outcomes while minimizing privacy risks. They may also lead to new product innovations.
Finally, I also have a wonderful network of privacy officers at other companies who help me benchmark my program and talk through those "Am I crazy?" moments.
The key insight I've gained from all this is that regulatory compliance is no longer just about minimizing risk—it's about building consumer trust and differentiating from competitors. Our clients ask us questions about our privacy program that we already have answers for.
Q: The privacy profession continues to grow and evolve rapidly. What are the top skills or attributes you look for when hiring privacy professionals, and how do you determine if a candidate is truly "doing privacy" versus just having privacy in their job title?
When hiring privacy professionals, I look for several key attributes that help me distinguish exceptional candidates from those who simply have "privacy" on their resume.
First and foremost, I value practical problem-solving ability. Privacy work requires balancing legal requirements with business objectives, so I seek candidates who can navigate complexity and find creative solutions rather than simply identifying problems. During interviews, I present candidates with real scenarios we've faced and evaluate their approach to resolving them. The key here is also making sure that the candidate can work in the grey. Sometimes the answer is not clear cut.
Second, I look for contextual understanding across disciplines. The most effective privacy professionals understand technology, business operations, and legal requirements. I appreciate candidates who have questions about our business. I also ask candidates detailed questions about how they've implemented privacy controls in actual systems or influenced product development to assess their practical understanding.
To determine if someone is truly "doing privacy" versus just claiming it, I have them walk through specific privacy initiatives they've led. True privacy practitioners can describe the full lifecycle—from risk assessment to implementation to monitoring effectiveness—with concrete details about challenges they faced and how they overcame them. Those with superficial experience typically speak in generalities or focus only on policy development without addressing operational realities.
I also assess adaptability and learning agility, as privacy is an ever-changing field. Candidates who demonstrate continuous self-education, engagement with industry developments, and the ability to apply previous experience to new contexts are particularly valuable.
One of the key skills I look for is issue-spotting—being able to identify potential legal or ethical risks early, even when they’re embedded in complex or fast-moving situations. The second is the ability to communicate those issues simply and succinctly. Business leaders don’t have the time or appetite for dense legal memos. What they need is clarity: What’s the issue, why does it matter, and what are the options?
I often say that if you truly understand an issue, you should be able to explain it to someone outside of legal in one or two sentences. That’s the test. If you can’t explain it clearly, you probably don’t understand it fully yet. These two skills, spotting risk and translating complexity into clarity, are what make someone truly effective in a legal role that supports the business.
These attributes collectively reveal whether someone has genuine privacy expertise or merely checked boxes without developing the judgment and practical skills that effective privacy work requires.
Q: With the rise of AI and other emerging technologies, how do you see the role of the privacy professional evolving over the next 3-5 years? What should early and mid-career privacy professionals focus on to future-proof their careers?
I've been watching my own role expand over the past ten years. At first, I would be answering legal or privacy compliance questions. Then I lobbied in DC for privacy laws. All of a sudden, I am answering trust and safety questions. Now we are at another inflection point as AI and other emerging technologies reshape business operations and regulatory landscapes. The IAPP has reflected this change with its updated mission statement encompassing AI governance and digital responsibility, recognizing the interconnectedness of privacy, AI, and brand safety.
Privacy professionals will need deeper technical understanding, particularly around AI systems, machine learning models, and data engineering. The ability to speak knowledgeably about technical concepts like algorithmic bias, model explainability, and synthetic data will become essential, not optional. Again, we see this inflection in the IAPP's AIGP certification, which requires a technical understanding as well.
Second, privacy work will continue to merge ethics and responsible innovation. Privacy professionals will need to broaden their focus beyond legal compliance to address ethical considerations of technology use. This means developing frameworks for evaluating the potential societal impacts of new technologies and data uses.
Third, the role will become more strategic as privacy considerations increasingly influence business models, product design, and competitive positioning. Privacy professionals who can connect privacy principles to business value will thrive, while those focused solely on compliance may find their roles narrowed.
For early and mid-career professionals looking to future-proof their careers, I recommend focusing on several key areas:
Develop technical literacy beyond basic IT concepts. Take courses in data science, AI ethics, or machine learning fundamentals to understand the technologies you'll be advising on.
Build business acumen by learning about your company's business model, market dynamics, and financial metrics. The ability to frame privacy in business terms significantly enhances your impact. This includes sitting in on product meetings for any new technology services or products.
Cultivate cross-functional collaboration skills, as privacy work increasingly requires partnership with product, engineering, marketing, and other teams.
Gain experience with privacy-enhancing technologies (PETs) like differential privacy, homomorphic encryption, and federated learning, which will become increasingly important for balancing data utility with privacy protection.
Develop expertise in emerging regulatory areas like algorithmic impact assessments, automated decision-making regulations, and cross-border data governance frameworks.
The most successful privacy professionals of the future will be those who position themselves at the intersection of technology, business, and ethics—using privacy principles to guide responsible innovation rather than simply enforcing compliance requirements.
Q: What final thoughts would you like to share with our readers about the future of privacy, AI governance, and responsible data use?
As we look ahead, the convergence of privacy, AI governance, and responsible data use will define the next chapter of innovation. We are entering an era where trust is the currency of digital business. Organizations that lead with transparency, accountability, and ethics will not only comply with regulation but also earn long-term competitive advantage.
Privacy professionals are uniquely positioned to help shape this future—not just by enforcing rules, but by championing principles that respect individual autonomy while enabling innovation. That requires moving beyond traditional privacy thinking and embracing a more holistic view of data governance and digital ethics.
AI governance, in particular, is still in its early stages. The most promising approaches are multidisciplinary and collaborative, bringing legal, technical, and business perspectives together to create frameworks that are both principled and practical.
From my experience, the most successful privacy and AI programs are not driven by policy alone. They are embedded in company culture, supported by leadership, and aligned with incentive structures that reward responsible decision-making—even when it comes at the expense of short-term gains.
The coming years will test our ability to harness powerful technologies while preserving human agency and dignity. Those of us in privacy and governance roles have an opportunity—and a responsibility—to help build a digital future that earns and sustains trust.
Q: Thank you for sharing your insights with us today, Noga.
Thank you for the opportunity. It's been a pleasure discussing these important topics.