Headlines for state attorneys general (AGs) have been dominated by tangles with the Trump administration — from the travel ban case going to the U.S. Supreme Court to challenges to legacy regulations at federal agencies. Less visible are actions by state AGs to push forward their interests and influence in technology-oriented consumer products, as highlighted in panel topics at various attorney general meetings this summer.
An attorney general is often known as the “top cop” of his or her state. Yet rather than wielding widespread criminal prosecutorial power, state AGs rely on broad consumer protection authority. Particularly in assessing unfair and deceptive acts and practices by consumer-facing businesses, AGs act as market regulators and enforcers.
Furthermore, state AGs enforce certain federal regulations, including even the elusive “abusive” standard of the Consumer Financial Protection Bureau. When combined into multistate actions, AG investigations can be pervasive. On top of that, as predominantly elected officials, state AGs are highly motivated by policy interests affecting consumers and businesses.
For years, data breaches have been big news for state AGs because there is still no federal compliance standard. Each state maintains its own breach notification requirements, which state AGs enforce. Some states go further and establish heightened privacy standards for the types of data companies can collect; the Illinois legislature, for instance, recently passed legislation restricting geolocation data, with enforcement by the attorney general. Moving from reactive roles to proactive interests, state AGs are now mapping out technology sectors where they see significant security and privacy interests at stake.
Three “huge” technologies that will shape the future for consumers currently hold the attention of state AGs: driverless cars, the “internet of things,” and artificial intelligence. The interconnectedness of computing devices, along with the capture of personal data at times when a consumer may be unaware, has some state AGs on high alert. The concern is not any particular innovation itself, but rather how AGs should react to a seismic shift in consumer preferences, in which the desire for efficiency, personalization and freedom is trumping traditional notions of consumer protection.
First, with driverless or “autonomous” vehicles and connected cars, the Jetsons are becoming reality. A fleet of cars without drivers already roams the streets of Pittsburgh, and production vehicles will soon show up at your door. The National Highway Traffic Safety Administration categorizes five levels of automated driving, from level one, which includes features such as cruise control, to levels four and five, in which the vehicle monitors all roadway conditions and responds appropriately. Between the ends of this spectrum is an incremental revolution, as more and more driver assistance features are introduced into vehicles.
Alongside the productivity and safety gains for those no longer seated behind a steering wheel, state AGs recognize potential privacy concerns: location data, driving habits and occupant identification could all be at risk of unauthorized use or disclosure. State AGs will also seek to defend their state laws against the preemptive effects of federal regulations that may otherwise be necessary to usher in driverless technology. Because state AGs clearly have a role in shaping the driverless industry and its future, proactive engagement with AGs, notwithstanding their enforcement role, is critical.
Second, the “internet of things” (IoT) describes smart devices connected to one another. Smart devices may be activated remotely, may sense information independently, or may be able to learn and repeat functions. IoT devices collect information from a person’s home or surroundings, some of which may be personal. Earlier this year, for instance, the FTC and the New Jersey attorney general secured a $2.2 million settlement with a TV manufacturer that collected viewing histories.
For state AGs, IoT enforcement turns on unfair and deceptive acts and practices. Examples include failing to notify consumers about the personally identifiable information a device may collect and possible HIPAA violations from sharing confidential health information. The proliferation of non-secure connected devices creates growing risks.
Last year, the Mirai malware scanned the internet for vulnerable IoT devices, broke into them using common manufacturer default settings, and infected them so they could be controlled for additional attacks. State AGs are aware of how IoT devices, from connected tea kettles to medical devices, could be compromised when poor security opens the door to a home wireless network.
Third, artificial intelligence, or AI, certainly conjures images of science fiction. AI involves computers performing tasks that would otherwise require human intelligence, such as recognizing speech, perceiving visual scenes or making decisions. Last year, an AI “robot journalist” wrote 450 stories on the Olympics, and such superhuman feats will continue as AI learns to understand pictures and videos of events.
State AGs understand how AI may be useful for law enforcement, such as managing unregistered drones by taking them safely out of the sky. This approach of using technological advances to manage technological risks is certainly appealing and needs to be better understood by AGs across a variety of industries.
State AGs have already received a similar education through their regulatory and enforcement authority over the sharing economy, where traditional methods of consumer protection do not fit. Moreover, AI will transform our economy as a whole, which has state attorneys general considering how their consumer protection roles must change.
This article was written by partner Joe Jacquot and was originally published in The Hill.