The New York Post quoted me this week in a piece on AI and privacy. The article covers Ring's Super Bowl backlash, OpenAI's failure to report a future school shooter, and the growing gap between what privacy laws promise and what they actually enforce.
My take hasn't changed. Privacy erosion isn't a bug. It's the business model.
The Pitch vs. The Reality
We tell ourselves these products are about convenience, safety, productivity. That's the pitch. The reality is simpler. Data is the asset. Everything else is distribution.
Ring sells peace of mind. What it's really building is a camera network pointed at everyday life. Not because it's malicious. Because it's valuable.
OpenAI's ChatGPT feels less like software and more like a confidant. People don't "use" it. They talk to it. And when something feels human, people share more than they realize.
That's the shift. The interface got more human. The data got more intimate.
We've Seen This Before
Social. Search. "Free" products that turned into data machines at global scale. Gmail launched in 2004 with a then-unheard-of gigabyte of free storage. We knew immediately Google was scanning emails and serving ads. My wife emails me to pick up milk; I get an ad for Gristedes. We shrugged and kept using it.
Cookies were the original handshake. Accept everything, get convenience. Reject them, and online shopping becomes unusable. So everyone accepted. The pattern was set.
Users have been happily trading free stuff for their data ever since. We've become accustomed to the convenience. But you have to remember: if the product is free, then you are the product.
Fines Are Line Items
When a company crosses the line, we fine it.
Meta paid $725 million to settle the Cambridge Analytica privacy litigation. Against their revenue, that's roughly a couple of days of business. Not accountability. A line item.
Disney just paid $2.75 million for violating California's consumer privacy laws. A record under the state's privacy act. For Disney, that's nothing.
Companies with a built-in subscription model have a real shot at maximizing shareholder value without squeezing user data. The ones running on ad revenue and data extraction? For them, fines are just a cost of doing business.
So nothing changes. Because the incentives don't change.
The Real Question
Washington isn't solving this anytime soon. The tech is moving too fast and the business model is too entrenched. At the federal level, stricter AI legislation is unlikely under the current administration's libertarian approach to tech regulation. Most action will happen at the state level, and it'll be slow.
If this gets fixed, it won't be because of regulation. It'll be because a company decides trust is more valuable than extraction.
That would be a real product decision. And a real competitive advantage.
Great piece by Mike Avila in the New York Post.