You're thinking like a software engineer: that an illegal practice should be technically impossible. But socially it has always sufficed to put a reasonably high penalty on something and fine offenders regularly.
And, unlike some other computer crimes, it is sufficient to prevent local companies from using the technique. While someone who's hacking or laundering money might simply use a proxy in the Bahamas without problems, a local company won't risk fines for using it. Sure, companies from far away can still advertise to you via tracking, but the value of their advertisements will drop sharply if no business near you can buy them.
OK, so the follow-up question is: what about the same techniques used for fingerprinting but deployed legitimately, and the gray areas in between? Because relying on legislation cuts both ways, no?
I'm not an expert on front-end technologies, but all those capabilities exist, I think, for legitimate technical purposes. The only question is how the data is used...
Obviously using media queries to display a page correctly is fine.
Detecting someone's preferred language and user agent is also fine.
And then eventually you do all this legitimate stuff and maybe cache it to improve page speed. (Bear with me haha, I'm out of my depth)
Until, eventually, the same legitimate capabilities get used to do something kinda bad... I could see security heuristics being used as an excuse for actual fingerprinting, storing and sharing that data... all with promises of free stuff and totally safe, trustworthy partners.
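A minimal sketch of that slippery slope, to make it concrete: none of these signals is sinister on its own, but concatenate the values a page legitimately reads for layout and localization, hash them, and you already have a decent identifier. (The `Signals` shape and `fingerprint` function here are invented for illustration; FNV-1a stands in for whatever hash a real tracker would use.)

```typescript
// Each field has a legitimate use; together they identify the visitor.
interface Signals {
  language: string;     // i18n — navigator.language
  userAgent: string;    // bug workarounds / content negotiation
  prefersDark: boolean; // matchMedia("(prefers-color-scheme: dark)")
  screenWidth: number;  // responsive layout
}

// FNV-1a, a tiny non-cryptographic hash — enough to show the idea.
function fingerprint(s: Signals): string {
  const input = [s.language, s.userAgent, s.prefersDark, s.screenWidth].join("|");
  let h = 0x811c9dc5;
  for (let i = 0; i < input.length; i++) {
    h ^= input.charCodeAt(i);
    h = Math.imul(h, 0x01000193) >>> 0;
  }
  return h.toString(16);
}
```

The hash is stable across visits (no cookie needed) yet differs between most users, which is exactly what makes the "legitimate reads" gray area so murky.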
Maybe a good way to help alleviate the situation is for browser vendors to provide an actually good way to measure populations without identifying individuals... what's it called? Differential privacy? I think that comes with mathematically provable guarantees.
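The classic building block behind that idea is randomized response, which is roughly what Chrome's RAPPOR telemetry was built on: each client flips coins before answering, so any single report is deniable, but the aggregate is still estimable. A toy sketch (the `rand` parameter is injected here only so the example is deterministic to test; a browser would use a real RNG):

```typescript
// Randomized response: with probability 1/2 answer honestly,
// otherwise answer with an independent fair coin flip.
function report(truth: boolean, rand: () => number): boolean {
  if (rand() < 0.5) return truth;
  return rand() < 0.5;
}

// If the true "yes" rate is t, the observed rate is p = t/2 + 1/4,
// so the collector can unbias the aggregate: t = 2p - 1/2.
function estimateTrueRate(observedYesRate: number): number {
  return 2 * observedYesRate - 0.5;
}
```

No individual report reveals the truth with certainty, yet the estimator recovers the population rate as the sample grows.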
Even better would be if you could somehow also poison the utility of fingerprints once they're persisted, although I don't know how you could do that... fully homomorphic encryption, I guess?
Maybe browser vendors could provide some kind of "clearing house" where operations that use these fingerprinted traits take place. Like you, the developer, supply a function that accepts some fingerprinted input to a sort of black box that then performs the work but hides the inputs. One problem with this is that the same developer could sample before and after and infer what the original input was.
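To illustrate the leak worry in the clearing-house idea (all names here are invented, this is not a real browser API): if the black box lets the developer supply an arbitrary function and returns its result verbatim, the function can simply export the secret. A common mitigation is to constrain the output channel, e.g. to one boolean per rate-limited call, so each query reveals at most one bit.

```typescript
// Hypothetical fingerprint the browser holds but the page never sees directly.
type Fingerprint = { canvasHash: string; fonts: string[] };

const hidden: Fingerprint = { canvasHash: "deadbeef", fonts: ["Arial", "Fira Code"] };

// Naive clearing house: runs fn on the secret, returns whatever fn returns.
function blackBox<T>(secret: Fingerprint, fn: (fp: Fingerprint) => T): T {
  return fn(secret); // nothing stops fn from just serializing its input
}

// The attack: the "work" function is identity-ish and leaks everything.
const leaked = blackBox(hidden, fp => JSON.stringify(fp));

// Mitigation sketch: only a boolean escapes, so one call = at most 1 bit;
// rate-limiting calls then bounds total leakage.
function constrainedBox(secret: Fingerprint, fn: (fp: Fingerprint) => boolean): boolean {
  return fn(secret);
}
```

Even the constrained version only slows an adversary down (repeated bit queries still add up), which is the before/after sampling problem in another guise.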
At any rate, I do sort of prefer well-architected technical solutions, and would rather see the legislation demand those than demand good behavior, when "good behavior" can be so wishy-washy.