Hacker News

> It may be relevant that Mozilla recently acquired a Meta-created ad tracking company and is now awash with Meta ad execs. [0]

That greatly misrepresents what the article says; in reality, Mozilla acquired a company whose mission is to get user data out of the advertising industry, and which happened to be founded by former Meta employees:

Two years after leaving Meta to launch their own privacy-focused ad measurement startup in 2022, Graham Mudd and Brad Smallwood have sold their company to Mozilla. ...

Mozilla had initially been talking to Anonym, which uses privacy-enhancing technologies to build measurement and targeting solutions, about a potential partnership.

“But that quickly turned into, ‘Wow, our missions are basically the same,’” Chambers said. “We realized that together we could move a lot faster.”

That shared mission is predicated on the notion that advertising and privacy are not – or at least don’t have to be – mutually exclusive.

“We both believe that privacy-preserving technologies are a critical part of the solution to the privacy problem in digital advertising,” Chambers said. ...

Anonym also has technology that allows ad platforms and advertisers to securely share encrypted impression and conversion data within a trusted execution environment for attribution, causal lift measurement and lookalike modeling. (A trusted execution environment is the secure area of a main processor where code can be run safely and in isolation.)

To be fair, the major ad platforms have long offered attribution and measurement solutions, Mudd said. “But they required the data to come into their system,” he added. “In this world, that doesn’t have to happen.”
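To make the trusted-execution-environment description above concrete, here's a toy sketch (my own illustration with hypothetical data, not Anonym's actual system) of the kind of attribution join such an enclave might perform: each party's records are decrypted only inside the enclave, matched on hashed identifiers, and only an aggregate count is released.

```python
import hashlib

def h(user_id: str) -> str:
    # Hashed join key; a real system would use salted/keyed hashing
    # negotiated inside the enclave, never raw identifiers.
    return hashlib.sha256(user_id.encode()).hexdigest()

# Inputs each party would upload encrypted; decrypted only inside the TEE.
impressions = {h(u) for u in ["alice", "bob", "carol", "dave"]}   # ad platform
conversions = {h(u) for u in ["bob", "dave", "erin"]}             # advertiser

# The enclave computes the intersection but releases only the aggregate.
attributed = len(impressions & conversions)
MIN_COHORT = 2  # suppress small counts that could identify individuals

report = attributed if attributed >= MIN_COHORT else None
print(report)
```

The point of the design is that neither party's row-level data ever reaches the other in usable form; only the joined aggregate leaves the enclave.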



> Anonym also has technology that allows ad platforms and advertisers to securely share encrypted impression and conversion data within a trusted execution environment for attribution, causal lift measurement and lookalike modeling.

Wow, "secure", "encrypted", and "trusted" all in one sentence. They're trying to make it sound as reassuring as possible, but they're still doing tracking.


They're not, in fact. That's the whole point of their business. Where does it say they are tracking anyone - which means recording personal information?


"impression and conversion data", "attribution, causal lift measurement and lookalike modeling". These are all terms of art in the field of tracking user behavior: collecting information, and using it to infer what you can't collect directly.


That data only exists in encrypted form and in a trusted execution environment, based on the evidence. I mean, everything you do on the computer is also in RAM - is that tracking too?

There is no evidence of risk. A general freakout is not evidence of anything besides maybe some bad acid.


> That data only exists in encrypted form and in a trusted execution environment, based on the evidence.

First of all, please don't pretend that claims that the data remains encrypted, secure, and only in trusted environments should carry any weight. The data cannot only exist in encrypted form: the entire goal of these systems is to mangle and aggregate the data "enough", then share that result as plaintext with the highest bidder.
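For illustration (my own toy example, not any vendor's actual pipeline), the "mangle and aggregate" step typically looks something like this: per-user records go in, and a noised plaintext aggregate comes out, and that plaintext number is precisely the artifact that gets shared.

```python
import math
import random

# Per-user conversion records: the sensitive input (hypothetical data).
converted = [1, 0, 1, 1, 0, 1, 0, 1]

true_count = sum(converted)

def laplace_noise(scale: float) -> float:
    # Standard inverse-CDF sampling of the Laplace distribution,
    # the usual differential-privacy mechanism for count queries.
    u = random.random() - 0.5
    sign = 1 if u >= 0 else -1
    return -scale * sign * math.log(1 - 2 * abs(u))

epsilon = 1.0  # privacy budget; the sensitivity of a count query is 1
noisy_count = true_count + laplace_noise(1.0 / epsilon)

# This plaintext value is what leaves the system and gets shared.
print(round(noisy_count, 2))
```

Whether that output is "mangled enough" depends entirely on parameters like epsilon, which is exactly why the encryption claims alone don't settle the question.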

> I mean, everything you do on the computer is also in RAM - is that tracking too?

No, not everything I do on my computer is tracking. Most software doesn't keep a long-term record of detailed interaction data; I don't expect my window manager to log how much time each app spends in the foreground. But even for the stuff that is logged, the real concern arises when that information is exfiltrated from my computer, processed by a third party, and sold.


How exactly would you suggest one look for "evidence of risk" in this scenario? We already don't have any clear visibility into what companies do with our data (which is kind of the whole reason everyone is upset about the privacy policy changes being discussed here in the first place). So if a new company comes in and also starts doing _something_ with data that we have no visibility into, should we just assume they aren't doing anything sketchy because they didn't happen to say anything incriminating? What would you expect a company that _is_ doing something sketchy to say in this scenario? You don't think a company might just lie about something it knows no one can disprove?

Your "no evidence of risk" is my "no evidence of a lack of risk", and at the end of the day, I don't see any reason to blindly trust any company's claims of being benevolent, let alone one operating in a historically sketchy field like adtech.


> I mean, everything you do on the computer is also in RAM - is that tracking too?

Uh... what do you mean? It's not like every program has free access to RAM and can just read whatever's in it; there are boundaries. Just because something exists in RAM doesn't mean it can be read, collected, and analysed by someone else.

Also, data existing in "encrypted form" and being processed in a "trusted execution environment" don't mean much either. The people whose goal is to collect the data can still decrypt and read it, and a "trusted execution environment" means basically nothing if whatever they derive by analysing the data inside it is then disseminated to third parties who may or may not be able to use it to identify you.

It's not nice to accuse people of freaking out over "maybe some bad acid". Even if this is "freaking out", it's actually safer to "freak out" and avoid the risk than to take your ill-reasoned advice.



