Engelberg Center Live!

Conspicuous Consumers: Keynote of Lisa Bonner

Episode Summary

This episode is audio from Lisa Bonner's keynote address to the Engelberg Center's Conspicuous Consumers Symposium. It was recorded on October 17, 2025.

Episode Notes

Lisa Bonner, Esq. Bonner Law, A Professional Corporation

Episode Transcription

Announcer  0:00  

Welcome to Engelberg Center Live, a collection of audio from events held by the Engelberg Center on Innovation Law and Policy at NYU Law. This episode is audio from Lisa Bonner's keynote address to the Engelberg Center's Conspicuous Consumers Symposium. It was recorded on October 17, 2025.

 

Michael Weinberg  0:25  

So when we started putting this event together and telling people about it, I quickly realized that there were kind of two reactions. I'd say, yeah, we're going to do this symposium. It's about conspicuous consumers and the ways that consumers kind of operate across various areas of law. And sometimes, when I told people that, they would be like, great, this totally makes sense. I'm very excited. I'm totally in. And sometimes when I would explain it to people, they would look at me like, I know all of the words that you just said, but I don't understand them when you put them together in that way. And there was no obvious split, right? There were smart, engaged, thoughtful people who thought this was a great idea, and there were smart, engaged, thoughtful people who were like, I don't even know what you're beginning to talk about. And so when we were looking for keynote speakers, I was a little bit concerned, and one of the members of the law school's board of trustees suggested that I reach out to Lisa Bonner. She didn't know me and I didn't know her, and I sent her this email about this idea, and I didn't know: was she going to be an "I'm totally on board with this" person or a "this thing you're proposing makes no sense" person? But I was very happy that she emailed back very quickly and was like, yeah, I'm totally in. This makes sense. I'm good to go. So I was thrilled to get that response, and I'm thrilled that she is joining us today for this keynote. Now, Lisa graduated from NYU Law, so we take all credit for all of the successes that she's had in her career. She began her career as a litigator, and then transitioned to practice entertainment and media law, where she provides counsel to large media companies, film distribution companies, music, television and film producers, independent artists, Grammy Award-winning recording artists and music publishers. Lisa doesn't just represent these people.
She's also an independent film producer, a journalist, a podcaster, a columnist and chair of the Board of Directors of the National Black Arts Festival. So she is engaged with so many different things, in so many different ways, that she is the perfect person to come and give us the lunch keynote. So please join me in welcoming Lisa Bonner.

 

Lisa Bonner  3:19  

Well, hello everyone, and good afternoon. That was a very true statement that Michael made, but I don't know how much I understood the assignment. But you know, here we are. And it was very interesting, because I was sitting there listening to the last panel that was up here, and I really felt like my life was passing before my eyes, because a lot of what they talked about is kind of what I'm going to talk about, obviously IP, because I'm an entertainment lawyer. So here we go. If you've heard it before, you're going to hear it again in a different voice. But Michael, thank you so much for that introduction. I want to say good afternoon to all of you, and just say what a privilege it is to join this audience of lawyers and scholars. Thank you to the Engelberg Center on Innovation Law and Policy and, as Michael said, NYU School of Law, my alma mater, for inviting me to speak on this very important topic. Our symposium today asks us to confront a deceptively simple question, one that seems to morph over time: who is the consumer in law, how does the law affect consumer behavior, and how does consumer behavior affect the law? Traditionally, the law has dictated the boundaries of consumption. Intellectual property told us what we could copy. Antitrust law prevented unfair competition and dictated what we can choose from, and technology law set parameters around data collection and use. Historically, the law provided the legal framework and the consumer conformed, a passive participant, if you will. But over the last two decades, the consumer has become anything but passive. In my work as a transactional entertainment lawyer, media strategist and on-air legal correspondent, I have witnessed firsthand how the consumer has forged legal change, moving from a passive participant to driving precedent, from someone that the law has constrained to someone that the law is now being forced to follow.
I've consistently had to strategize with clients and corporations to keep pace with the ever-evolving law and media, and create and conform contracts to keep pace with the times. The Consumer Legal Evolution, as I call it, started in 1999 when Sean Parker and Shawn Fanning founded Napster, a peer-to-peer music file sharing service that revolutionized the way that consumers, that the world, consumes music. Now I must say, yesterday I was in Dean Sexton's office, and he will always be Dean Sexton to me, and I was talking to him about my speech, and talking to two kids in there about Napster, and they literally looked at me and had no idea what Napster was. It's just amazing how the law has continued to change. But in any event, Napster's peer-to-peer delivery system illegally enabled consumers to access vast libraries of others' music catalogs instantly and freely, circumventing the record labels entirely. Although Napster was eventually forced into bankruptcy by ongoing litigation from copyright holders and record labels, the genie was out of the bottle. Consumers demanded a system that legally allowed the delivery of music straight to our phones, our MP3 players at the time, in any configuration, without prepackaged album constraints. Over a decade elapsed before the music industry learned to adapt. This prolonged adjustment period nearly devastated the music industry, until music streaming services emerged as a legal sharing model. Today's music ecosystem, Spotify and Apple Music, exists because consumers, not companies, dictated the mandate. They changed not only laws but our music consumption, revolutionizing how audiences would consume all forms of entertainment. Napster wasn't just a disruption of music; it was the opening chapter of the Consumer Legal Evolution. If Napster cracked the gate open, streaming services kicked the gates down.
Platforms like Netflix, Hulu, Disney+ and Prime Video emerged because consumers insisted on abundance, immediacy and personalization in consuming large-format programming. But while consumer power moved toward digital, our legal framework remained analog. Antitrust law was still looking through the lens of 1970s constructs around price and output, even as 21st-century harms emerged around consumer autonomy, access and manipulation. This paradox has forced the law to evolve. For decades, the US measured harm largely through the Reiter v. Sonotone Corporation antitrust lens of price and output: if prices fell and content proliferated, competition was presumed healthy. Yet in the streaming era, where platforms like YouTube, Netflix and Amazon Prime are free or low cost, price no longer tells the full story. The harm is subtler, but no less real. It emerges in the form of diminished consumer agency. Think of how easy it is to sign up for a subscription, one click, sometimes with pre-checked boxes, but how difficult it can be to cancel. "Are you sure you want to miss out?" the platform asks, as if you're breaking up with a friend. These dark patterns, as the FTC calls them, are not design accidents. They're strategic frictions carefully engineered to keep users subscribed, engaged and generating data. This is manipulation through design, and it reveals that the real price of free services is often our autonomy. The FTC's 2023 lawsuit against Amazon, settled just last month, underscores how serious this has become. The FTC found that Prime's sign-up and cancellation processes were opaque and intentionally designed to confuse and frustrate users, a practice known as engineered friction. Consumers fought back and won. The $2.5 billion settlement awarded Prime users $1.5 billion in refunds and $1 billion in structural obligations that require Amazon to redesign its interface to make subscription and cancellation clear, simple and honest.
The settlement is a remarkable legal moment. It shows that the law is no longer just protecting consumers from unfair prices, but from unfair designs, meaning user interface design can itself constitute a deceptive trade practice. The architecture of the digital marketplace, how consumers are guided and sometimes tracked, is now subject to legal scrutiny, and design has become a new domain of the law. Regulators, and by extension the courts, are now treating user experience design, not just contracts or pricing, as part of the competitive and legal landscape. That's what makes this different. The law is catching up to the reality that in digital markets, code, choice architecture and interface design are the new contracts. But what's most remarkable is how this case came to be. It did not start with regulators. It started with consumers. Their complaints, their class actions and public pressure forced the FTC to act. If Amazon demonstrated the consumer's ability to compel regulators, Disney illustrates the consumer's ability to compel markets by exercising economic power in real time. When Disney made the decision last month to pull Jimmy Kimmel off the air after his controversial monologue about Charlie Kirk's murder, the FCC threatened to pull ABC's broadcast license, an action that far exceeded the agency's constitutional authority and threatened our First Amendment rights. And the reaction was swift. Within days, consumers canceled Disney+ and Hulu subscriptions, abandoned trips to Disney parks and flooded social media announcing boycotts. Disney saw $4 billion in market capitalization vanish in less than a week, and the brand damage was immediate and measurable. In six days, ABC reinstated Jimmy Kimmel. His post-suspension opening monologue had its largest viewership ever: over 26 million views across social media, and 6 million broadcast viewers tuned in, over four times its normal audience.
This moment illustrates that consumers are no longer the passive beneficiaries of law. They are active enforcers of market discipline, leveraging collective economic power to reshape corporate behavior in real time.

 

Lisa Bonner  12:14  

But if the past two decades have been about consumer empowerment through access and market regulation, the coming decade will be about consumer entanglement through algorithms and AI. We are entering an era where consumption itself is shaped by machine learning systems trained on us, by us: on our clicks, our voices, our searches and our preferences. Every interaction becomes a data point that trains the system's understanding of what we want, or what it thinks that we want. In this feedback loop, the consumer is no longer simply a buyer or a viewer. The consumer has become both the product and the producer, the raw material that trains the machine and the audience that consumes what the machine produces in return. Now that's a profound shift, because if competition law in the streaming era asked whether consumers truly had choice, AI must now ask whether consumers even retain agency. When an algorithm predicts what you like before you even know it, curates your news feed, recommends what to watch, what to buy, what to believe or think, are you choosing, or are you being chosen for? And if your data helps train the system that later markets this to you, where does participation end and where does exploitation begin? This evolution forces us to reconsider traditional legal categories. In intellectual property: who owns the output of models trained on the public's creative work? In privacy and competition: how do we safeguard autonomy in markets where personalization is indistinguishable from manipulation? And in consumer protection: what does consent even mean when algorithms anticipate our desires before we articulate them?
Just as Napster forced copyright law to modernize and Amazon's dark patterns forced the FTC to redefine consumer harm, AI will force the law again to reckon with a simple but urgent question: when the consumer's data becomes the engine that drives the market, can the law still protect the consumer as an independent actor? Since the dawn of the 21st century, as we've just discussed, we've seen how consumers can reshape markets. The age of artificial intelligence shows us that consumers now generate these markets. AI has blurred the lines between producer and consumer entirely. Every text we write, every song we stream, every image we scroll past produces data, the raw material that trains the systems that now generate new creative works. This is not passive consumption. This is involuntary collaboration, and unlike traditional collaboration, consumers receive no attribution, little to no compensation, and, most critically, were never asked for consent. We saw this most recently with the decision in Bartz v. Anthropic. This case, settled in September, illustrates how current IP laws are failing to safeguard authors. Despite allegations that Anthropic trained its AI models on vast amounts of consumer-generated data without consent, the settlement provides the IP holders with a mere pittance, a $20 settlement check for authors whose books were ingested into its AI models. But that does not reflect the value of being a co-author of a generative system whose output generates billions of dollars and shapes our future. It reflects a legal framework built for a world where consumption and creation were distinct acts, a world that no longer exists. So the legal question is not merely academic. If consumer data is the essential input that makes generative AI commercially viable, do consumers hold an equitable interest in the output?
What we need now is not necessarily copyright reform; I posit that we need a new legal category altogether, one that recognizes participatory contribution. Consumers are effectively crowdsourcing the creative baseline for AI, yet the law has not revised itself to recognize this participatory role: not individual ownership of AI outputs per se, but claims to transparency, consent and perhaps a share of the value when our collective behavioral data becomes the foundation of billion-dollar creative systems. I know this is easier said than done. This would mean, in a perfect world, that platforms couldn't simply bury consent in their terms of service agreements. It would mean that consumers can opt out of having their data used for model training without losing access to those same services. And it would mean that when AI companies monetize systems built on our participation, the law will require them to account for that relationship, not as a transaction, but as an ongoing collaboration that demands ongoing consent. But herein lies the profound irony. Consumers have demanded personalization: tailored content, smart recommendations, predictive searches. But personalization requires surveillance. Every convenience is powered by the commodification of attention and preference. Every smart experience is an act of surrender. The consumer has once again forced the law's evolution. Data protection regimes like the EU's General Data Protection Regulation and the California Consumer Privacy Act arose not because regulators foresaw the problem, but because consumers demanded transparency and control. And the next iteration will not just be about privacy, but provenance: the right to know whether the media that we consume and/or create is authentic, synthetic or something in between.
And at its core, this is not just an economic or technological story; it's a constitutional one. Because what the AI era truly challenges is not copyright or competition law per se, but consent. Just as Napster forced the music industry to legalize digital access and Amazon forced regulators to recognize engineered friction, AI will force the law to confront the consumer's right to participation, transparency and identity. As technology evolves, consumers are not asking for permission. They're setting the terms. They're demanding explainable AI, traceable consent and authentic human connection. So when we ask what is special about consumers and competition law today, the answer is this: consumers are no longer the end point of legal protection. They are the starting point of market correction. We saw it with Napster's music evolution. We saw it with the proliferation of the streaming platforms. We saw it with the $2.5 billion Amazon settlement. And we saw it with Disney, when $4 billion vanished in just days. At every turn, the consumer has moved faster than the law, and at every turn the law has eventually followed. But AI presents a different challenge, one where the stakes are not just economic but existential to consumer agency itself. In the Napster era, we lost control of distribution. In the streaming era, we lost control of access. In the AI era, we are losing something far more fundamental: authorship of our own preferences, our own culture and our own choices. If the law does not recognize participatory contribution now, and if we allow consumer agency to be reduced to a $20 settlement check, we will wake up in a world where the consumer is no longer the author of market disruption but the raw material of it. And at that point, we won't be asking how consumers change the law. We'll be asking whether the law can protect us consumers at all. The Consumer Legal Evolution is not over. It is accelerating.
And the question for this room, for the lawyers, the scholars, the policymakers, all of us here, is this: will we build frameworks that truly recognize and protect consumer agency in this era, or will we once again find ourselves scrambling to catch up after consumers have already rewritten the rules? History has shown us one undeniable truth: the consumer will not wait for permission, because the consumer never does. If we fail to anticipate, to codify, to design law around their agency, the law will always be chasing and never leading. And as we have seen time and time again, consumers will continue to shape markets, rewrite norms and redefine the boundaries of possibility. The Consumer Legal Evolution is accelerating, and the question is whether we, as lawyers and policymakers, will rise to meet it, or whether we will once again be forced to follow after the rules have already been written. Thank you.

 

Speaker 1  21:36  

Thank you so much, and for the perspective from someone who's been in the trenches. I just wanted to get some more thoughts from you on how the consumer plays a role in the current AI space. So my initial thought is the consumer plays a role because it avoids purchasing or wanting tacky things that are produced by AI. Is that one of the roles in which the consumer kind of leads the charge as to devaluing, let's say, potentially infringing AI work or AI work that goes too far? Or, I think the other way could be the consumer embracing AI and using it so much, particularly in these personal LLMs, that it just becomes almost impossible to enforce on the other side. So I'd just be interested in your thoughts on how that kind of consumer-led civil disobedience or avoidance affects the way the industry works, and whether you think it'll be, if we could say, potentially as successful as Napster was in creating iTunes and our current model of consuming music?

 

Lisa Bonner  22:51  

Well, that's a great question, and thank you for that. We talked a little bit about this in the last panel: AI is completely different, and our world now is completely different. Everything is coming out infinitely more quickly than it has before. Just two years ago, ChatGPT and AI weren't a thing, and now everybody is using it. Children are using it. And as one of the gentlemen on the panel just said, you can have the engineers in the room, and you can have the policymakers in the room, and you can talk about what we want to do, and then the engineers are going to work around it. So this is a different era that we're dealing with in AI. I mean, we talked about how with Napster, the music industry was forced to catch up, and they did, and they offered a viable solution. What we're all struggling with is that there are so many legal issues tied to AI, right? We talked about competition; now we're talking about agency, autonomy and co-authorship. I don't really know if we have an answer, because it is moving at such speed. The fact that we can try to legislate around it is one thing, but we might be playing catch-up, and that's why, in my speech, I said it's easier said than done, right? In a perfect world, we would love laws that would tailor to this, and we could go in and have a new category of what generative use and co-authorship look like. But what happens next? So I don't have the answer to the question, and I don't know if any of us really do, because of the rapid speed at which AI is morphing, and the laws are morphing, and society is changing as a part of it. I just do know that the traditional constructs that we have been looking at through the legal lens are not going to be enough to protect us from the harms. They're not physical harms, but we are feeding our data into these large language models, and they're taking it. At one point we're the consumer, the content, the producer; the lines are blurred. So I wish I had a better answer for you. But this AI is something completely different, and maybe I'm saying that now because I was in the trenches when Napster was going on, and I could see how things were evolving. But this just seems a little bit different, in my opinion.

 

Michael Weinberg  25:41  

All right, we have time for one more question. Make it great, amazing.

 

Speaker 2  25:47  

This is the best question ever asked. No, I'm asking: if it's so complicated, as I take from your remarks, and there are so many moving parts, and it's partly related to the framework that Malo offered us in the previous panel, can we really put so much weight on consent? We are already consenting every day. We are signing away our constitutional rights every day, clicking through agreements that we don't read. That's how we waive our right to a jury, for anything under the sun. So if this system is even more complicated, why do we think that the consumer, through their agency, through their consent, will be able to rein it in, versus another, top-down approach?

 

Lisa Bonner  26:28  

Well, I'm not saying that that is the broad answer to everything, but that is one lane and one road that we are looking at. Because, as we've talked about through this speech and through the last panel, there are various iterations of harm that come from AI. Consent is just one of them. Co-authorship is another. So although we may not be able to legislate from the top down and across the board, we can take perhaps the simplest approach, which would be consent, even if people don't read it, right? You know, terms of service. What does that even look like on ChatGPT? Have you seen its Terms of Service? I mean, right, but the average person doesn't read it. So give us the ability, like I said, to consent, and then to opt out of our data being used to train the model. I mean, we saw with Anthropic that these authors had no idea that their books were being used to train the AI model, and then this system is generating billions of dollars in revenue and actually shaping our culture, and they are getting a $20 check. So I'm not saying that consent is the only avenue by which to fix this or to offer a solution, but I am saying it is probably the easiest, and it would prevent at least some harm. It's basically, like I said, just giving you opt-in or opt-out provisions and making the services' terms of use a little bit more transparent. Because if I'm asking a question, whether it's any kind of question, or a personal dating question, or what have you, maybe I don't want that to be used in their large language model, in their learning. I think that consent is probably the easiest one, but it's absolutely not the only avenue.

 

Michael Weinberg  28:27  

All right, please. All right. Short one, short question.

 

Speaker 3  28:35  

To add a little bit of color: I was a commercial general litigator at the inception of my career. Now I work at JPMorgan Chase, and I'm in privacy compliance for the consumer bank, so we deal with these issues every day from a compliance perspective. And I'm speaking on behalf of the bank in terms of what you have to do: we don't sell data, we don't share data, we give the customer information, we craft disclosures and let them know exactly what we're doing with the data and how we're going to use it. We track usage to make sure that it's being used for a reasonable, permissible purpose, only to service accounts, and that it's de-identified if we're using it in LLMs to develop new products, if we're hiring a vendor to come up with some fraud detection service for a product that we have, or if we're looking to potentially market new products to a certain customer segment. So in terms of financial services at the corporate level, it's something that we're grappling with every day, because the technology is really off the rails, and obviously there's legal exposure as well.

 

Speaker 3  29:51  

So thank you. Thank you.

 

Michael Weinberg  29:56  

Please join me once again in thanking Lisa. [Applause]

 

Announcer  30:07  

The Engelberg Center Live podcast is a production of the Engelberg Center on Innovation Law and Policy at NYU Law, and is released under a Creative Commons Attribution 4.0 International license. Our theme music is by Jessica Batke and is licensed under a Creative Commons Attribution 4.0 International license.