In May 2017, U.S. Representative Marsha Blackburn introduced H.R.2520, the BROWSER Act, a bill designed to give consumers some control over the use of their personal information. Specifically, consumers who use broadband access services, or websites and applications offering subscription, purchase, account, or search-engine services, are given the choice, depending on the sensitivity of the personal information involved, to opt in to or opt out of the policies these services use to manage consumer information.
For sensitive information, opt-in approval must be expressly granted by the consumer. Sensitive information includes:
- financial information;
- health information;
- information about children under the age of thirteen;
- Social Security numbers;
- information regarding a consumer’s geo-location;
- web browsing information; or
- information on the history of usage of software programs or mobile applications.
Consumers must be provided the opportunity to give opt-out approval for non-sensitive information.
Mrs. Blackburn’s intent with the legislation is to equalize broadband access providers and edge content providers in the eyes of the Federal Trade Commission, the federal agency responsible for consumer protection and antitrust enforcement. In my opinion, this is not far off from former Federal Communications Commission chairman Tom Wheeler’s goal of openness and transparency throughout the entire internet ecosystem, from consumer to broadband access provider to the websites offered by edge providers.
What Mrs. Blackburn’s bill also does is address information asymmetries, where edge content providers are viewed as having more knowledge of the value of the consumer information they extract from websites than the consumer has herself. The consumer cannot answer the question, “Is the value of the information I receive online, x, greater than the value of the information I give up, p, where that information is private?”
It is not readily apparent whether H.R.2520 was also designed to save the consumer from asking a different question: “Why should I pay for an economic good (i.e., privacy) that isn’t Google’s to sell in the first place?”
Professor Caleb S. Fuller of Grove City College describes privacy as an economic good, something that the consumer wants more of. Most consumers are not willing to pay to protect this good, even though they know that firms like Google are collecting this information for free. For example, according to Professor Fuller’s research, 90% of Google’s users know that their “mouse droppings” are being tracked. While 29% of Google’s users don’t mind being tracked, 71% do. Their reasoning, according to Professor Fuller, includes the fear of price discrimination based on their information; the receipt of spam advertising; the risk of identity theft; and the “dis-utility in just not knowing who knows what.”
One equitable solution, in my opinion, would be for Google and other edge providers to pay their subscribers for private data. Google could publish an offer schedule with prices based on the sensitivity of the information it wishes to purchase. Consumers would then weigh the value of the privacy they give up against the value of the payment and of the web content they access. I wouldn’t expect every consumer to sell their data: Google would wisely cap its offers, and a significant portion of consumers unable to get the price they want would settle for the old private-data-for-access exchange they have been conducting for two decades.
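The hypothetical exchange described above can be sketched as a simple decision rule. Everything in this sketch is invented for illustration — the offer schedule, the prices, and the reservation values are assumptions, not anything proposed in the bill: the provider posts a price per sensitivity tier, each consumer holds a private reservation value for her data, and a sale occurs only when the posted offer meets or exceeds that value.

```python
# Hypothetical sketch of the data-market idea above. All categories and
# dollar figures are invented for illustration only.

# Assumed offer schedule: price the provider might pay per data category,
# with more sensitive categories commanding higher offers.
OFFER_SCHEDULE = {
    "web_browsing": 2.00,
    "geolocation": 5.00,
    "health": 12.00,
}

def decides_to_sell(category: str, reservation_value: float) -> bool:
    """A consumer sells a data category only if the posted offer meets or
    exceeds her private reservation value; otherwise she keeps the old
    data-for-access exchange."""
    offer = OFFER_SCHEDULE.get(category, 0.0)
    return offer >= reservation_value

# Three hypothetical consumers who value their geolocation data differently.
valuations = [3.50, 5.00, 9.00]
sales = [decides_to_sell("geolocation", v) for v in valuations]
print(sales)  # the $5.00 offer clears the first two valuations, not the third
```

The point of the sketch is the market-clearing logic in the last section: consumers whose reservation values sit above the capped offer stay out of the market, which is exactly the "settle for the old exchange" outcome described above.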
The opt-in, opt-out policy reduces the work the consumer must put in to determine the value of her data, while still giving her the final say over how her private data can be used. Unfortunately, this is also the downside: without a market, the true price at which that data would sell is never discovered.