It includes four important elements. One, making sure there’s strong enforcement. Realistically, in the US, strong enforcement has to include the ability for individuals to sue companies that violate their rights — to have a private right of action.
A private right of action has historically been very much a part of civil rights laws in the US because often, government doesn’t have the resources or the will to enforce laws. You need private litigants to be able to take care of themselves and bring companies to court who don’t comply with the law. So one, a private right of action.
The second thing would be leaving room for states to put in place higher privacy standards in the event that either our federal standards become outdated or prove themselves to be inadequate — having a federal law that is a floor but never acts as a ceiling.
That’s been the case in other federal laws in the US; for example, healthcare. Obamacare put in place a floor for what had to be included in an insurance plan, but it left states free to put in place higher standards if they felt it was necessary to protect consumers.
The third thing that’s critical is what I call “data minimisation”; restrictions on how companies collect and retain data. This current system where there’s a long privacy policy, someone clicks “yes,” and doesn’t understand what they’re consenting to is obviously not working. We need strong limits that say a company cannot use or collect your data beyond what’s necessary for the service you’ve asked for unless they receive separate, clear consent; that consumers understand what they’re consenting to; and that we’re not in an arrangement where a company says, “You don’t get to use my service unless you consent to all of this data collection that’s not even necessary for this service.”
For example, if I have a flashlight app, it doesn’t need to know where I am all the time. Collecting my location information shouldn’t be a condition of me using it. A key component of strong data minimisation is making sure there aren’t what I consider coercive terms that deny people access to services if they won’t provide the consent a company seeks.
Finally, there’s a broader conversation happening in the US on how to ensure our data processing and collection activities are fully covered by civil rights laws, which prevent discrimination. Beefing up things like algorithmic transparency; questions around how you get information that might help you sustain a legal claim about being discriminated against, even if it’s through an algorithm.
IP 1 Ltd, trading as In Practise (herein referred to as "IP") is a company registered in England and Wales and is not a registered investment advisor or broker-dealer, and is not licensed nor qualified to provide investment advice.
© 2024 IP 1 Ltd. All rights reserved.