Nevada Broadens Definition of Personal Information for Purpose of Encryption and Breach Notices

May 20th, 2015 by Scott Lyon and Nora Wetzel

On May 13, Nevada passed a new law (A.B. 179) expanding the definition of “personal information” to include a natural person’s first name or initial and last name in combination with: 1) medical and health insurance identification numbers; 2) user names, unique identifiers or email addresses in combination with passwords, access codes or security questions and answers that would permit access to an online account; and 3) driver’s authorization card numbers. The broader “personal information” definition applies to Nevada’s breach notice and security measure laws, which regulate both the collection of Nevada residents’ personal information and data collectors doing business in the State of Nevada. (N.R.S. 603A.210, 603A.220.) However, the definition of “personal information” applies to the specified data elements only “when the name and data elements are not encrypted.”

Previously, the definition of “personal information” only included a natural person’s name when combined with a Social Security number, driver’s license or other identification card number, or an account or credit card number together with the security code or password necessary to permit access to a financial account.  Importantly, Nevada’s expanded definition now covers both information often defined as “personal health information” (i.e., medical and health insurance identifiers) and computer access credentials (i.e., user names and passwords).  Given how many businesses assign their users unique identifiers and/or maintain email addresses with passwords for their users, this new law may impose significant obligations on companies maintaining Nevada residents’ personal information or doing business in the state.

Consequently, businesses maintaining Nevada residents’ personal information or doing business in Nevada should confirm they are compliant with the new Nevada law, which goes into effect July 1, 2015.  Nevada requires a data collector to implement reasonable security measures (a term the statute leaves undefined) to protect any Nevada resident’s personal information.  Under Nevada law, a data collector is broadly defined to include any entity or association (including universities, banks, and government agencies) that “handles, collects, disseminates or otherwise deals with nonpublic personal information.” Companies doing business in Nevada that accept payment cards in connection with the sale of goods or services must comply with the Payment Card Industry Data Security Standards.

A key element of Nevada’s data security requirements is its treatment of encrypted data.  Under both the original and newly expanded definition, encrypted data is not included within the definition of “personal information.”  In addition, any data collector doing business in the State of Nevada is required to encrypt personal information when transferring the data electronically (excluding fax transmissions) or when moving data storage devices containing personal information beyond the “logical or physical controls of the data collector.”  N.R.S. 603A.215(5)(b) defines the types of encryption deemed sufficient to satisfy Nevada law.
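For data collectors wondering what the encryption obligation might look like in practice, the following is a minimal sketch (not a compliance opinion) of symmetrically encrypting a personal-information record before it is transmitted or moved off a device. It assumes Python’s widely used cryptography package, the field values are hypothetical, and whether any particular cipher satisfies the standards referenced in N.R.S. 603A.215 is a separate legal question.

# Illustrative sketch only: encrypt a personal-information record before
# electronic transfer, using symmetric (Fernet) encryption from Python's
# "cryptography" package. Field values are hypothetical.
from cryptography.fernet import Fernet

# In practice the key would be generated once and held in a key-management
# system, never stored alongside the data it protects.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"Jane Q. Public | health insurance ID: ABC123456"
ciphertext = cipher.encrypt(record)    # this is what travels over the wire
original = cipher.decrypt(ciphertext)  # recoverable only with the key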

Any business maintaining records containing Nevada residents’ “personal information,” as newly expanded, or otherwise doing business in Nevada should ensure it has reasonable security measures in place.  Businesses that have not yet implemented reasonable security measures should do so by July 1, 2015.  In addition, the ever-present threat of data breaches should encourage businesses maintaining Nevada residents’ personal information to prepare by mapping their current data and keeping up-to-date records of the types of data they maintain.  If a data breach occurs, businesses must assess whether any personal information of a Nevada resident, as newly defined by Nevada state law, was subject to the breach and notify the affected residents in accordance with Nevada state law.

In-Store Monitoring: How to Enjoy the Benefits of Tracking While Minimizing Potential Privacy Issues

May 18th, 2015 by Meegan Brooks

In the latest example of the conflict between technological innovation and privacy concerns, the Federal Trade Commission (FTC) reached a settlement agreement last month with Nomi Technologies, Inc.

Nomi is a startup whose technology allows retail merchants to analyze aggregate data about consumer traffic in the merchants’ stores. Although different companies track this data in different ways, it is generally done by monitoring signals emitted from a mobile phone to see where a device moves over time. Nomi’s technology can tell a retailer where a customer walks in a store, or whether she is a repeat customer; it is not able to identify her personally.

Notwithstanding heavy criticism from the public and privacy advocates that Nomi invaded customers’ privacy by tracking their movement without their consent, the FTC’s action was not brought pursuant to any privacy law or privacy-based right. Instead, the FTC’s action amounted to a run-of-the-mill consumer deception claim. The FTC alleged that Nomi misled consumers by falsely promising to provide mechanisms for consumers to opt out of tracking and to be notified when their information is being tracked. The proposed settlement prohibits the startup from misrepresenting people’s options for controlling whether information about them or their devices is collected, used, disclosed or shared. Notably, it did not impose notice and consent requirements for retail trackers or offer more specific guidance for retailers who track their customers.

The FTC’s decision, which was split 3-2, highlights the tension between allowing emerging retail technologies to grow and innovate, and the potential privacy risks that come with allowing companies to track consumers. The dissenters argued that the FTC should have refrained from bringing this action, given the immateriality of the representation, the lack of evidence of consumer harm and the potential chilling effect to other innovative startups.

Lack of Formal Guidance for Retailers

Even though thousands of retailers currently use some type of in-store tracking technology, the FTC has not yet issued formal standards for how retailers should use this technology without violating customers’ right to privacy.

Still, the FTC has made its interest in this area clear. Over the last several years, the FTC has published several guidance documents related to mobile phone tracking more generally, which touched on retailers’ tracking of their customers. Last spring, the FTC hosted a seminar dedicated to the in-store tracking technology, including the different kinds of technology available and the privacy concerns with each. The Nomi action was just the latest reflection of the FTC’s increasing concern with this issue.

Days after the Nomi settlement, Ashkan Soltani, chief technologist at the FTC, blogged about the policy trade-offs in retail tracking. Soltani emphasized a point that was also clear in the FTC’s majority opinion in Nomi: “Retail tracking has many benefits for retailers and consumers alike. Stores are able to better understand the behaviors and preferences of their shoppers, and individuals in turn receive better service.” For example, by knowing where customers walk in a store, retailers are able to improve store layouts and reduce customer wait times.

Retailers looking to protect customer privacy should look to both Soltani’s blog and the FTC’s cell phone tracking reports for advice. Each reiterates that to best strike the balance between information and privacy, companies should disclose what information they are collecting and how they plan to use it, and should ask for customers’ consent. Below are several considerations that apply specifically to the retail context:

1. Individual Identification

Currently, the predominant use of in-store tracking is to analyze customers in the aggregate. Although this is done by using unique identifiers to track each individual phone over time and across locations, each phone’s owner remains anonymous in this process.
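To make that distinction concrete, the sketch below shows one way an analytics system could count visits in the aggregate: it derives a salted hash of each phone’s Wi-Fi MAC address and tallies those pseudonymous identifiers rather than anything that names the owner. This is a hypothetical illustration in Python, not a description of Nomi’s (or any vendor’s) actual implementation.

# Illustrative sketch only: count store visits by salted-hashed device
# identifiers so that no raw MAC address or customer name is retained.
import hashlib
from collections import Counter

SALT = b"operator-held-secret"  # assumption: a secret the operator rotates periodically

def pseudonymous_id(mac_address: str) -> str:
    """Return a salted SHA-256 hash of the MAC address instead of the address itself."""
    return hashlib.sha256(SALT + mac_address.encode()).hexdigest()

observed = ["aa:bb:cc:dd:ee:01", "aa:bb:cc:dd:ee:01", "aa:bb:cc:dd:ee:02"]
visits = Counter(pseudonymous_id(mac) for mac in observed)
repeat_visitors = sum(1 for n in visits.values() if n > 1)
print(f"{len(visits)} unique devices, {repeat_visitors} repeat visitor(s)")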

However, the technology is available to track customers on a more individual basis. When a customer signs into a commercial hotspot, her MAC address can give a retailer access to her name and other WiFi networks she has used, and can “link” the customer’s online and in-store shopping behavior. Although it is unclear whether any companies collect or use this information, accessing this more personal information would clearly elevate privacy concerns related to in-store tracking. Notably, both dissenters in the Nomi case emphasized that Nomi’s technology did not provide the company with information about individual consumers, which suggests that they may have applied different analyses had Nomi been tracking individual customers.

Several efforts are currently being made to randomize phones’ wireless identifiers, so that retailers are not able to track individuals across multiple trips to multiple stores. For example, some smartphone manufacturers have attempted to build in features that limit retail tracking by randomizing the phone’s wireless identifier; according to Soltani, however, the effectiveness of these technologies is somewhat limited. The Internet Engineering Task Force (an Internet standards body) is currently working to achieve the same goal.

2. Consent

Although the FTC has not yet required that retailers obtain customers’ consent before tracking their locations, its recent publications in this area suggest that receiving consent is an effective way to minimize privacy risks.

Notably, it is much easier to receive customer consent for some kinds of tracking technology than for others. Soltani distinguished between active monitoring, which “is typically performed by the service the device is communicating with, such as by the cellular provider or by the WiFi hotspot the device is connected to,” and passive monitoring, which intercepts signals from the device as it communicates or searches for other devices and networks. Typically, customers are required to agree to terms and conditions before the retailer can use active monitoring; for example, by signing a cellular service contract or by connecting to a WiFi hotspot.

By creating a loyalty program application or offering free in-store WiFi, stores can offer benefits to their customers while also receiving their consent to data tracking. Another option, which is currently used by Apple, Macy’s, Coca-Cola, and Procter & Gamble, is known as proximity marketing. This is an opt-in system that allows retailers to send promotions to customers who are in the proximity of their stores.

Several smartphone location technology companies also allow customers to opt out of data tracking through an opt-out website, http://www.smart-places.org/. This website is one aspect of the Mobile Location Analytics Code of Conduct, which was created by analytics companies in October 2013 to assuage customers’ privacy concerns. The Code also calls for companies to obtain consent before collecting customers’ personal information. Although the FTC praised the Code for “[recognizing] consumer concerns about invisible tracking in retail spaces and [taking] a positive step forward in developing a self-regulatory code of conduct,” the Code is not legally enforceable. Following the Nomi decision, however, analytics companies could be liable for deceiving consumers by claiming to comply with the Code but then failing to actually do so.

3. Notice

Notice is closely intertwined with consent. By not imposing a notice requirement on Nomi, the FTC — at least for the time being — seems to have signaled that retailers are not required to notify their customers that they are being tracked through their cell phones. However, both Soltani’s blog post and the FTC’s recent cell phone guidance publications treat notice as a best practice.

As with consent, customers normally receive notice before signing up for a cell phone contract, opening a retailer’s phone app or joining a wireless hotspot. Unlike with these forms of active monitoring, however, customers are generally not notified before being tracked through passive monitoring.

Notice may prove difficult for retailers who use passive monitoring. Although retailers can notify many of their customers by posting signs within their stores, this would not notify every person being tracked, because the tracking technology also pulls cell phone signals from people passing by the storefront. To solve this problem, Soltani suggests that passive retail analytics devices begin to automatically alert users to the existence of mobile retail tracking and allow them to join temporarily in order to opt out.

4. Other Ideas from Nomi

Until the FTC issues more concrete guidance in this area, retailers should at least make sure to follow the FTC’s guidance in Nomi by fulfilling any promises they make regarding privacy. Although Nomi provides rather than uses tracking services, the same legal principles apply to retailers. Retailers should act in accordance with every part of their privacy policies by respecting customers’ opt-out options and heeding any statements about what kind of information they collect or how they use that information.

Given that the law in this area is rapidly evolving, retailers should consult with legal counsel before implementing data tracking technology in their stores.

California’s Song-Beverly “Consumer Perception Test” in Jeopardy — Will Retailers in California Be Barred from Requesting Any Personal Information from Consumers at the Point-of-Sale?

May 8th, 2015 by Meegan Brooks and Stephanie Sheridan

On May 5, 2015, the Ninth Circuit certified to the California Supreme Court the question of whether the Song-Beverly Credit Card Act (“the Act”) prohibits retailers from requesting a customer’s personal information at the point-of-sale (POS) after the customer has already paid, even if a reasonable consumer would not interpret the request as a condition of paying by credit card.
 
The case, Davis v. Devanlay Retail Group, concerns retailer Lacoste’s practice of requesting customers’ ZIP codes after the customer’s card has already been swiped. The lower court, like a number of other district courts, interpreted the statute to impose a “reasonableness” standard. Because a reasonable customer would not believe that she is required to share her information once her card has been swiped, the lower court determined that Lacoste did not violate the Act.
 
Plaintiff argues that “the consumer perception standard” has been improperly read into the Act by district courts, and that the law prohibits retailers from requesting any information while the customer is at the POS, regardless of whether the customer believes that she is required to share her information. According to Plaintiff’s counsel Gene Stonebarger, who has brought many suits pursuant to Song-Beverly, the Act even prohibits retailers from collecting information from customers who offer it, or from enrolling customers in a store loyalty program.
 
The Ninth Circuit’s Order
 
The three-judge panel — which consisted of Judges Consuelo Callahan, Milan Smith and Paul Watford — found the statute’s language, legislative history and case law to be ambiguous, and noted that each could be interpreted to support Plaintiff’s broad interpretation of the Act. The relevant portion of the statute, which appears in Civil Code § 1747.08, states that businesses shall not “[r]equest, or require as a condition to accepting the credit card as payment in full or in part for goods or services, the cardholder to provide personal identification information [PII]. …” The court noted that although this text suggests that the Act broadly prohibits any information requests, the grammatical rule used in reaching that interpretation had been rejected by California Courts of Appeal in other Song-Beverly cases. In Absher v. AutoZone, for example, a California Court of Appeal explicitly interpreted the disputed portion of the Act to “prohibit[] merchants from requesting or requiring credit card customers to write personal identification information on a credit card form as a condition precedent to accepting payment by credit card.”
 
The court also found that while many district courts have cited Florez v. Linens ’n Things as endorsing an objective consumer perception test, the Florez opinion is ambiguous and could also be read to hold that Song-Beverly prohibits all requests for information “in conjunction with” credit card transactions:
 
[W]e also find it plausible that the passage means Song-Beverly prohibits requests for PII that are “in conjunction with the use of a credit card” … We note that the Florez court does not appear to have actually applied an objective test in deciding the case … [A portion of Florez concerning the timing of a request] cuts against interpreting Florez to endorse an objective consumer perception test [and] suggests instead that Song-Beverly prohibits requests for PII that a consumer might interpret as a condition to payment by credit card, even if it would not be objectively reasonable to do so.
 
The court also noted that the Florez court never explained how to determine whether a request for information was made “in conjunction with the use of a credit card,” and that a request made after the retailer returns the customer’s credit card may not fall into that category.
 
At oral argument in March 2015 (attended by these authors), the panel appeared to find Plaintiff’s strict reading of the statute compelling, but it also emphasized the problems that would arise if retailers were never allowed to request customer information at the POS. Judge Callahan, who described herself as an “expert shopper,” noted that it would be “absurd” for the law to require customers who want to sign up for a store’s mailing list to first put their cards away and then walk away from the POS before they could legally sign up to receive information they desired. Judge Smith joked that Plaintiff wanted shoppers to “go to the bathroom … or do three somersaults” before being able to share their information.
 
The court’s order notes that a broad construction of Song-Beverly, as proposed by Plaintiff, “could have a significant impact on the practices of thousands of California retailers.” Although this statement was in reference to retailers who request information after the transaction, a broad construction of the Act would impact any retailer that requests information “in conjunction with” the use of a credit card. In effect, the statute could become a strict liability prohibition against any information requests at the POS during credit card transactions.
 
Certification to the Supreme Court
 
It is rare for the California Supreme Court to receive certification requests, and even rarer for it to accept them — the Court decided two civil cases resulting from Ninth Circuit certifications in 2014, three in 2013, and none in 2012. Although the Court is not required to accept a certified question, it will likely accept the Ninth Circuit’s request, given the significance of the issue to many retailers and the current lack of guidance in this area. If the Supreme Court does accept the case, it will have the authority to reformulate the question presented to it by the Ninth Circuit, or to explore additional issues. Assuming the Court does accept the Devanlay question, the case will be added to the Court’s regular civil docket, which means that it is likely to be a year or more before the Court hands down its decision.
 
If the Supreme Court adopts Plaintiff’s broad interpretation of the Act, retailers who currently request customer information at the POS may also be subject to retroactive lawsuits. In the month after the Supreme Court decided that ZIP codes are “personal information” under Song-Beverly in the 2011 Pineda v. Williams-Sonoma Stores, Inc. decision, for example, 106 class-action lawsuits were filed based on transactions that occurred before the decision was issued. If the Supreme Court again decides to apply its ruling retrospectively, retailers could be liable for any requests for information made in the year leading up to the decision. 
 
Retailers are advised to consult counsel with expertise in this area for guidance as to “best practices” in light of this new development.

New FCC Rules on CPNI Will Impact ISPs and Businesses That Rely on Internet Tracking Data

April 2nd, 2015 by Jia-Ming Shang

By now, most people know that in its recent Open Internet Order adopted on February 26, 2015, the FCC reclassified internet access services as common carrier “telecommunications services” subject to FCC jurisdiction under the Telecommunications Act of 1996.  The Order imposes a new regulatory framework on internet providers and, among many other things, augurs a sea change in how internet providers and their business partners may use certain data, including a class of information called Customer Proprietary Network Information (“CPNI”).

CPNI is defined as “(A) information that relates to the quantity, technical configuration, type, destination, location, and amount of use of a telecommunications service subscribed to by any customer of a telecommunications carrier, and that is made available to the carrier by the customer solely by virtue of the carrier-customer relationship; and (B) information contained in the bills pertaining to telephone exchange service or telephone toll service received by a customer of a carrier.”  See 2007 FCC CPNI Order.

Outside of telecom insiders, most people have probably never heard of CPNI or the FCC’s specific regulations on its use.  But later this month, new rules on the collection, disclosure, consent and use of CPNI in the internet context will take center stage as the FCC decides whether and to what extent previously exempt internet service providers and their business partners are bound by CPNI rules that phone and cable companies have observed for years.

Of course, the devil’s in the details.  Current CPNI rules, for example, prevent phone companies from sharing the phone numbers a customer calls or receives without express consent.  How that rule translates to the internet context, where the entire notion of internet marketing relies on some measure of tracking, is less clear.  But some restriction of the current system is likely, with the FCC indicating that many of the same consumer privacy concerns applicable to phone companies are present with internet providers:

[c]onsumers’ privacy needs are no less important when consumers communicate over and use broadband Internet access than when they rely on [telephone] services.  As broadband Internet access service users access and distribute information online, the information is sent through their broadband provider.  Broadband providers serve as a necessary conduit for information passing between an Internet user and Internet sites or other Internet users, and are in a position to obtain vast amounts of personal and proprietary information about their customers. Absent appropriate privacy protections, use or disclosure of that information could be at odds with those customers’ interests.

Feb. 26, 2015 Open Internet Order, para. 463.

In short, if your business relies on or uses tracking data on consumer internet traffic or behavior in any way (e.g., customized ad buys, cookies, big data algorithms, mobile payments processing), there’s a good chance that the forthcoming new CPNI rules will affect you in some way.

For now, ISPs have a reprieve, and the FCC has stated that it will forbear from applying its existing rules because they are “not well suited to broadband Internet access service.”  In particular, the FCC found that the existing rules are more focused on concerns that have been associated with voice telephone service and do not address many of the types of sensitive information to which broadband providers (more so than phone companies) are likely to have access.

These comments suggest the possibility that the new CPNI rules may be more strict than the current ones for phone companies.  FCC Chairman Tom Wheeler has announced that the agency will hold a workshop on April 28 for stakeholders to discuss details, with final rules probably coming out in Q3 or Q4 of 2015.

Second Circuit Joins Chorus In Favor Of CDA Immunity

April 1st, 2015 by Afigo Fadahunsi

In Ricci v. GoDaddy.com, the United States Court of Appeals for the Second Circuit affirmed a dismissal of defamation claims against GoDaddy.com, a website host, invoking the immunity and preemption provisions of the Communications Decency Act (“CDA”), 47 U.S.C. § 230. The lawsuit against GoDaddy stemmed from an “offline” dispute between the Ricci plaintiffs and the Teamsters’ Union, to which Mr. Ricci belonged. Following Mr. Ricci’s refusal to endorse the union president at the time, Ricci endured various forms of retaliation from union leadership, including the union’s publication of newsletters containing offensive and defamatory statements about the Riccis. The newsletters were posted to a website hosted on GoDaddy’s servers.

In the Complaint, the Riccis acknowledged that GoDaddy had no role in creating the allegedly defamatory newsletters. Rather, the Riccis sought to impose liability upon GoDaddy because it hosted the website on which the newsletters were republished, refused to remove the newsletters, and refused to investigate the plaintiffs’ complaints about the statements in the newsletters. The trial court dismissed the Riccis’ suit based on CDA immunity, and the Second Circuit affirmed.

The CDA shields hosts like GoDaddy from publisher liability (with respect to third-party or user-generated web content) when they act in the capacity of providers of an interactive computer service. Section 230 offers broad protection to website operators, and courts have typically rejected any interpretation that renders meaningless the core immunity provided by Section 230(c) or clouds the vision of an uninhibited and open Internet. Section 230(c)(1) provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by an information content provider.” Section 230(e)(3) further states that “[n]o cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.”

In its first opinion construing the immunity provisions of the CDA, the Second Circuit made three critical points. First, “a plaintiff defamed on the internet can sue the original speaker, but typically cannot sue the messenger.” The Riccis should have pursued defamation claims only against the union, not against GoDaddy. Second, GoDaddy played no role in creating the allegedly defamatory newsletters. Because GoDaddy was sued in its capacity as a provider of an “interactive computer service,” it is immune from defamation liability under the CDA. Third, a provider of an interactive computer service like GoDaddy can win a Section 230 case on a motion to dismiss. According to the court, although preemption under the CDA is an affirmative defense, “it can still support a motion to dismiss if the statute’s barrier to suit is evident from the face of the complaint.” The court found that this defect was patently evident in the Riccis’ case.

FTC Advises That Mergers Don’t Eliminate Privacy Promises of Acquired Companies

March 28th, 2015 by Paul Pittman

The FTC recently posted comments on its business blog about companies’ responsibility to comply with privacy representations made to existing customers about how the companies will collect, use or disclose personal information following a merger or change in ownership. Noting that companies must keep their promises to customers regarding the privacy of their personal information, the FTC identified three options for a company to consider when merging or changing ownership:

• Companies can continue to honor the privacy promises made to consumers before the merger or acquisition;

• To change the privacy promises already made to consumers, such as sharing personal information with third parties, companies will need to inform consumers and get their express affirmative consent to opt in to any new practices;

• To change how information is collected in the future, companies need to provide consumers with notice of the change and a choice of whether to agree to the collection. According to the FTC, simply revising the language in a privacy policy or user agreement isn’t sufficient because existing customers may have viewed the original policy and may reasonably assume it’s still in effect. Further, the notice and choice must be sufficiently prominent and robust to ensure that existing customers can see the notice and easily exercise their choices.
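The second and third options turn on recorded, affirmative consent. As a purely hypothetical illustration of the mechanics (the policy version numbers, field names and function below are invented, not drawn from the FTC’s guidance), a minimal sketch in Python might gate any new third-party sharing on an opt-in recorded after the policy change:

# Hypothetical sketch: only share a customer's data under a new privacy
# policy if that customer affirmatively opted in after the change.
from dataclasses import dataclass
from typing import Optional

NEW_POLICY_VERSION = 2  # assumed version introduced after the merger

@dataclass
class Customer:
    name: str
    opted_in_policy_version: Optional[int]  # None = never affirmatively consented

def may_share_with_third_parties(customer: Customer) -> bool:
    """Pre-merger promises stand unless the customer opted in to the new policy."""
    return (customer.opted_in_policy_version is not None
            and customer.opted_in_policy_version >= NEW_POLICY_VERSION)

legacy = Customer("pre-merger customer", None)
consented = Customer("opted-in customer", 2)
print(may_share_with_third_parties(legacy))     # False: honor the old promises
print(may_share_with_third_parties(consented))  # True: express opt-in recorded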

The FTC’s commentary, with citations to specific case examples, can be found at https://www.ftc.gov/news-events/blogs/business-blog/2015/03/mergers-privacy-promises?utm_source=govdelivery

Net Neutrality: More Winners Than Losers

March 17th, 2015 by John Stephens

On March 12, 2015, the Federal Communications Commission (FCC) released the full text of the Net Neutrality rules it approved last month. Net Neutrality essentially means an open Internet where all traffic is treated equally, anyone can publish content, and everyone has access to media. The new rules are not a guarantee that the Internet will remain neutral, as there will very likely be legal challenges to the proposal, but for now, things should remain pretty much the same in the cyber-world. This post explains Net Neutrality and the FCC’s new rules, and describes why Net Neutrality produces more winners than losers.

The agency’s move to reclassify broadband Internet as a “telecommunications service,” which gives it more legal muscle to force broadband providers to treat all Web traffic equally, was given the green light in a high-profile party-line vote on Feb. 26, 2015, but the agency took two weeks to incorporate dissenting opinions from Republican commissioners and meet other procedural requirements.

History of Net Neutrality

Fundamentally, Net Neutrality is the idea that broadband providers deliver every Internet site’s traffic without discrimination. At its core, Net Neutrality demands equality in the treatment of consumers who pay for the same or a greater quality of service, permitting peer-to-peer communication in any platform of the consumers’ choosing, regardless of the amount of content transmitted or bandwidth utilized.

Prior to the new rules, the FCC classified broadband providers as information services under Title I of the Communications Act of 1934.  In the past 10 years, the FCC twice issued Net Neutrality principles under Title I, each time losing to challenges by broadband providers like Verizon. In the most recent challenge, Verizon v. FCC, 740 F.3d 623 (D.C. Cir. 2014), the Court of Appeals for the District of Columbia Circuit rejected the FCC’s second set of proposed Net Neutrality regulations because the rules effectively treated broadband providers as common carriers even though the FCC had not classified them under Title II of the Act.

FCC’s New Rules

The court in Verizon held that the FCC had regulatory power to impose Net Neutrality standards, but not under Title I. The court effectively invited the FCC to adopt a Title II regulatory program. The newly-proposed rules accept the court’s invitation by declaring the Internet to be a public utility under Title II.  Under the proposed neutrality rules, any retail broadband service Americans buy from a cable operator, telecommunications company or a wireless operator would be reclassified as a telecommunications service, instead of a lightly-regulated information service.

The new rules provide:

•No blocking: Broadband providers will not be able to block access to legal content, applications, services or non-harmful devices.

•No throttling: Broadband providers will not be able to “impair or degrade” lawful Internet traffic on the basis of content, applications, services or non-harmful devices.

•No paid prioritization: Broadband providers may not favor some lawful Internet traffic over other lawful traffic in exchange for payment, i.e., there will be no “fast lanes.” Broadband providers will also be barred from prioritizing content and services of their affiliates.

•The commission’s new rules would also include a “standard for future conduct,” with the rationale being that because the Internet is always evolving, “there must be a known standard by which to determine whether new practices are appropriate or not.”

Legal Challenges Ahead

The FCC’s proposal is leveraging two main elements of legal authority: Title II of the Communications Act and Section 706 of the Telecommunications Act of 1996. By using these two provisions, the FCC said the “proposal provides the broad legal certainty required for rules guaranteeing an open Internet.”

Already, reports have emerged that the broadband providers AT&T and Verizon are ready to launch legal challenges to the FCC’s proposal. The reason why broadband providers have been so adamantly opposed to an open Internet is obvious and summed up well by SBC CEO Ed Whitacre in a 2005 interview with Business Week:

We own the pipes and we should be able to control the traffic that flows through them!… How do you think they’re going to get to customers? Through a broadband pipe. Cable companies have them. We have them. Now what they would like to do is use my pipes free, but I ain’t going to let them do that because we have spent this capital and we have to have a return on it.

Besides these broadband providers, the FCC’s proposal is being challenged by a group of Republican lawmakers who have proposed another method to ensure the openness of the Internet while not permitting the agency to reclassify broadband as a utility under Title II of the Communications Act.

The Future

Critics of the rules, like AT&T, quickly jumped on the release of the rules as another chance to criticize the agency’s approach and to lightly threaten litigation.

“Unfortunately, the order released today begins a period of uncertainty that will damage broadband investment in the United States,” AT&T Senior Executive Vice President Jim Cicconi said. “Ultimately, though, we are confident the issue will be resolved by bipartisan action by Congress or a future FCC, or by the courts.”

It’s unclear when that legal action might come, but Thursday’s release does move the FCC’s rules forward through the process of becoming law. Barring any unforeseen complications, they could be finalized, and published in the Federal Register, by the end of the month. Certain transparency requirements in the new rules will face additional procedures at the Office of Management and Budget, which could delay things further.

After they’re published, the rules will take effect in 60 days. Internet service providers or other interested parties will also have 30 days from the date of publication to file a lawsuit. Before then, they can also petition the FCC to stay the rules pending judicial review.

The FCC’s decision to support Net Neutrality brings to a close an era of uncertainty as to exactly what position the FCC would take on the issue. The FCC’s action sets sustainable rules of the road for the Internet that should protect free expression, continue to encourage and reward innovation, and grow the economy.

Does the E.U. “Right to be Forgotten” Pose a Threat to Companies in U.S.?

March 8th, 2015 by Paul Pittman

Even observed from “across the pond,” the right of European Union (“E.U.”) consumers to compel an Internet search engine to de-link specific personal information of the consumer from certain search results – the “Right to be Forgotten” – has garnered considerable attention in the United States (“U.S.”). Until recently, the “Right to Be Forgotten” seemed to be a concept that arose solely in the E.U. However, late last year a French court, relying on the “Right to Be Forgotten,” issued an injunction requiring Google to remove allegedly defamatory material linked to a Danish lawyer employed in France from its search engine worldwide. The French court’s order raises a significant question: whether a U.S. court would enforce an E.U. “Right to Be Forgotten” order.

The Right to be Forgotten

Last May, the Court of Justice of the European Union (“CJEU”) ruled in Google Spain v. AEPD and Mario Costeja Gonzalez that E.U. data subjects have a privacy right to request that Internet search engines, such as Google, remove certain search results linking to third party websites containing personal information deemed “inadequate, irrelevant or no longer relevant,” absent an overriding public interest in the information. The decision has become synonymous with a “Right to Be Forgotten” and arose from an action by a Spanish citizen (Mario Gonzalez) to force Google Inc. to remove links in its search engine to an old article reporting that Gonzalez’s home was repossessed to pay off social security debts.

Since the decision in Google Spain, Google has received more than 201,194 requests to de-link information and has removed the search results in 42 percent of cases. When Google grants a removal request, it typically removes the personal data only from the version of its search engine facing the specific E.U. country. The information may still be visible in other E.U. countries and in the U.S. It is this practice that likely led to the dispute in the French case last year.

French Court Places “Right to Be Forgotten” Demand on Google’s U.S. Operations

In August 2013, Dan Shefet filed a lawsuit seeking to de-link materials that were used in a “defamation campaign” by an unknown individual against his law firm on blogs and websites. A French court granted Shefet’s request under the “Right to be Forgotten” and ordered both Google France and Google Inc. – the U.S.-based operator of Google’s search engine – to de-link the material from search results involving Shefet’s name, worldwide. Google complied with the court’s order by removing the link on its Google France search engine but refused to de-link the materials on its Google Inc. search engine. In September 2014, at Shefet’s request, the Paris Tribunal de Grande Instance issued an injunction requiring Google Inc. to remove links to the materials worldwide and imposed a fine of 1,000 euros per day on Google France until Google Inc. complies. Google recently confirmed that it would only remove search results from European websites, but reserved the right to re-review its policy in the future.

Notably, while very little guidance on applying the “Right to be Forgotten” existed at the time the French court issued its ruling, the ruling appears to be consistent with guidelines published by the Article 29 Working Party shortly after the decision. Those guidelines allow E.U. courts to broadly extend their jurisdiction by requiring companies to remove contested links from all domains, not just those in the E.U. Under these guidelines, the search results and content on a U.S.-facing search engine whose operator also runs an E.U.-facing search engine would be subject to a de-linking request pursuant to the “Right to be Forgotten.” The Article 29 Working Party recently issued letters to several search engines reminding them of this policy.

Data Privacy in E.U. and U.S.

Predicting whether a U.S. court would enforce a “Right to be Forgotten” order should begin with an understanding of the differences between the privacy regimes of the two sovereigns. Both regimes are based on principles of freedom of expression, access to information, fairness, notice and consent.

As the Google Spain case illustrates, the E.U. data privacy regime favors consumer privacy. Data privacy in the E.U. is generally governed by Directive 95/46/EC, a comprehensive statute that regulates the processing and transfer of personal data in E.U. member states (“Data Protection Directive”). The Data Protection Directive is enforced by data protection authorities in each E.U. member state, which also implement and enforce their own national data protection laws. While the Data Protection Directive has been the law of the land for nearly 20 years, the General Data Protection Regulation, approved by the European Parliament on March 12, 2014, is set to supersede it.

On the other hand, the U.S. data privacy regime encourages access to information and free expression. U.S. privacy laws are a medley of state and federal laws, and administrative decisions, targeting specific data for protection including personal, financial, health and children’s data. Although U.S. privacy laws also consider consumer privacy, there is equal if not overriding concern with ensuring these laws do not inhibit the right to free speech and freedom of expression established by the First Amendment of the U.S. Constitution. In fact, with regard to search engine search results, U.S. courts have held that search engine results are constitutionally protected activity under the First Amendment.

Nonetheless, some U.S. laws extend protections similar to those under the E.U.’s Right to be Forgotten, at least with regard to children and minors. Federal law, such as the Children’s Online Privacy Protection Act (“COPPA”) and proposed amendments, and state law, such as the Privacy Rights for California Minors in the Digital World law that went into effect earlier this year, generally allow the removal of certain online personal information about children or minors. In addition, although the regulatory focus in the U.S. is currently on minors, there does appear to be a general interest among the U.S. public in a Right to be Forgotten law.

Is the Right to be Forgotten Enforceable in the U.S.?

Putting aside issues of international comity, ultimately, a U.S. court’s willingness to enforce the “Right to be Forgotten” directive could depend on whether there are similarities between the privacy protection sought by the E.U. court and the protections provided by analogous privacy laws in the U.S.

Given the First Amendment implications of censoring online content and the search results of an Internet search engine, a U.S. court may be hesitant to enforce the order issued by the French court requiring Google Inc. to de-link the defamatory material from the search results in its U.S. search engine. The French court’s decision is consistent with E.U. privacy principles that focus on the privacy rights of the consumer, but gives little regard to the principle of freedom of expression – a principle a U.S. court is likely to find overriding. In addition, adopting a “Right to be Forgotten” principle is inconsistent with U.S. public policy of transparency and accuracy in information about citizens. Practically, however, Google may have no choice but to comply with the “Right to be Forgotten” order in an effort to preserve its business operations in E.U. countries.

This does not mean that a U.S. court would decline to enforce a “Right to be Forgotten” directive in all situations. A U.S. court might be willing to enforce a “Right to be Forgotten” directive where an E.U. member country seeks to enforce it against a U.S. company’s U.S. operations with regard to materials concerning children or minors. The current and proposed legislation in the U.S. allowing the removal of online information relating to children and minors suggests that U.S. courts may be willing to provide such relief, especially where it is consistent with these laws.

Needless to say, the question will only be answered if an E.U. member state entity petitions a U.S. court to enforce a “Right to be Forgotten” order. Any such case should be closely followed, as it could have a significant impact on the jurisdictional reach of the E.U. over U.S. companies operating in its member countries, as well as provide some insight into how U.S. courts perceive the “Right to be Forgotten.”

This article was published in The Privacy Advisor for the International Association of Privacy Professionals on February 24, 2015.

Illinois Federal Court Leaves AMEX to Defend TCPA Claims Based on Third Party Actions

March 7th, 2015 by Paul Pittman

Recently, an Illinois federal court denied American Express’ (“AMEX”) motion for partial summary judgment, finding that AMEX can be directly liable under the Telephone Consumer Protection Act (“TCPA”) for debt collection and telemarketing calls made on its behalf. The plaintiffs allege that West Asset Management made debt collection calls on AMEX’s behalf to plaintiffs Jennifer Ossola and Scott Dolemba, and that Alorica placed telemarketing phone calls for AMEX to plaintiff Joetta Callentine.

Ossola filed suit in July 2013, claiming that AMEX used an autodialer to call her cellphone many times over a four-year period – even though she was not the debtor that AMEX was seeking to reach. Callentine alleges that AMEX violated the TCPA by having West Asset Management and Alorica make debt collection calls to her cellphone that were intended for her deceased mother. Dolemba claims that he also received a call from West Asset Management in June 2013.

The Illinois court held that it is irrelevant whether AMEX or the third-party vendors, West Asset Management Inc. and Alorica Inc., made the calls: AMEX, as the primary creditor, can still be liable for debt collection calls made on its behalf. Plaintiffs are still conducting discovery regarding the role American Express played in making the telemarketing calls.

The plaintiffs propose a national class of non-AMEX customers who received autodialed debt collection calls or telemarketing calls from AMEX, Alorica or West Asset Management after July 2009. This decision means that AMEX may have to defend against these class claims for actions taken by third parties, which should serve as a warning to companies enlisting third parties for their debt collection and telemarketing services.

New California Privacy & Protection Act Proposes Standards for Personal Information Encryption, Bans Sales of Voice-Recording TVs, Criminalizes Vehicle Hacking, and Includes a Slew of Other Privacy-Related Measures

February 28th, 2015 by Nora Wetzel

New legislation proposed in California includes a package of privacy-related bills referred to as the California Privacy & Protection Act.

The bills proposed include:

  • Encryption: A.B. 83 would set encryption standards for personal information stored in the cloud. The bill’s author rejected a specific standard in favor of a “reasonably prudent encryption standard” that can flex with technological developments. The bill would also require entities that suffer a data breach to disclose the code vulnerability that caused the breach. The disclosure requirement’s purpose is to prevent additional breaches by allowing other entities to search for the same vulnerability in their own systems. The current draft of the bill does not yet provide specifics regarding the encryption standards or the code disclosure.
  • Voice Collecting TV Sales: a yet-to-be-proposed bill would ban the sale of televisions that record customers’ voices when a TV’s voice recognition feature is not in use. There has been much recent publicity surrounding Samsung’s Smart TVs, whose voice recognition tools may record and transmit consumers’ conversations even when the tools are not in use. A consumer privacy group, the Electronic Privacy Information Center (EPIC), has urged the FTC to investigate Samsung TVs’ alleged recording of consumers’ private in-home conversations, despite the fact that Samsung’s privacy policy informs customers that their personal information may be captured and transmitted by the voice recognition features. The recent attention to the voice recognition component of Samsung’s TVs may have spurred California lawmakers to target this issue.
  • Collection of Vehicle Data: S.B. 206 would prohibit public agencies from collecting information from a vehicle’s diagnostic system that is unnecessary to the state’s emissions prevention program, such as data related to a vehicle’s location or driving speed.
  • Hacking Vehicles: a yet-to-be-proposed bill would criminalize hacking into a vehicle’s computer system. If the breach causes the driver to lose control of the car, the offense would be a felony, while mere access without taking control or causing injury would be a misdemeanor. Since the bill has not been drafted yet, what constitutes losing control or mere access is unclear.
  • Drones over Schools: S.B. 271 would bar the use of drones over public schools covering grades kindergarten through high school, except that the ban would not apply to drones operated by law enforcement during a public safety emergency.
  • Retention of Blood Samples: A.B. 170 would require California’s Department of Public Health to inform parents about collection and retention of blood samples taken from newborns for a genetic screening program. The bill would also require the Department to provide information describing the program and informing parents of their right and the child’s right (upon reaching adulthood) to request the blood sample be destroyed and/or not be used for research.  
  • Police Body Cameras: a yet-to-be-introduced bill would require law enforcement agencies to develop and make publicly available policies regarding the use of body cameras. The proposed bill would provide that body camera footage recorded inside homes when there is no arrest does not constitute public data and cannot be subject to the California Public Records Act.

Other privacy-related measures are included in the package of legislation proposed by Assemblyman Mike Gatto and Senator Ted Gaines. The final version of the California Privacy & Protection Act is not yet available, as several bills must still be drafted or will undergo further revision as they proceed through committees. Nevertheless, it is worth paying attention to this Act, particularly with respect to the proposed encryption standards, to assess any compliance gaps once the Act is finalized and passed into law.
