States are racing to rein in the growing use of artificial intelligence and computer algorithms to set prices, with New York leading the charge. On Dec. 15, a major amendment to New York’s antitrust law took effect, banning the use of certain algorithmic rent-setting tools for residential units under Section 340-b of the state’s General Business Law.
This follows New York’s November implementation of a first-in-the-nation disclosure law, Section 349-a of the General Business Law, which requires merchants to tell consumers when individualized prices are set using algorithms trained on personal data. State lawmakers are also considering S.B. 7033, which would outlaw algorithmic price discrimination based on protected characteristics like race and age.
These measures arrive amid a torrent of litigation challenging the use of algorithmic pricing software in contexts ranging from apartment rents to healthcare reimbursements, and as more companies adopt pricing systems that tailor offers to individual consumers.[1]
By pushing these laws, New York and other states seek to fill perceived gaps in the Sherman Antitrust Act and the Robinson-Patman Act, the primary federal antitrust statutes prohibiting price-fixing and price discrimination, respectively.
Industry pushback has been swift: Multiple lawsuits are challenging the New York laws on constitutional and other grounds. The outcomes could determine not only the future of these statutes but also the viability of similar regulations nationwide.
For businesses, this means both immediate new compliance burdens and continued uncertainty over whether and how these rules will ultimately be enforced.
The Algorithmic Rent-Setting Ban: Section 340-b
Section 340-b, enacted in October and effective Dec. 15, is the first state law barring the use of algorithms to set rents.[2]
The statute deems it an unlawful agreement in violation of New York’s Donnelly Act “for a residential rental property owner or manager to knowingly or with reckless disregard set or adjust rental prices, lease renewal terms, occupancy levels, or other lease terms and conditions” based on recommendations from algorithmic pricing software that performs a “coordinating function.”[3]
Notably, the law represents a potential expansion of liability beyond the federal Sherman Act, which requires proof of a horizontal agreement between competitors or potential competitors to establish a per se violation. In contrast, the New York law deems the unilateral use of algorithmic pricing software to set or adjust rents and lease terms to constitute an unlawful agreement, even without a meeting of the minds among landlords, provided the software performs a coordinating function.[4]
Further, the statute bars setting or adjusting rents using an algorithm, meaning a landlord could potentially violate the statute even if it does not reflexively adopt the recommended rent.[5] For example, landlords might arguably violate the statute by using algorithmic output as a default benchmark and then adjusting that benchmark up or down to determine the actual rent.
This possibility represents a departure from some prior federal algorithmic pricing decisions, which dismissed complaints in part because the alleged conduct demonstrated that the defendants sometimes deviated from the algorithm-recommended price.[6]
Section 340-b is limited to algorithmic pricing software that performs a coordinating function. But the term “coordinating function” is defined broadly to include collecting past and present data on price and other factors from two or more competing property owners or managers, processing that data, and recommending rental prices or lease terms.[7]
The definition does not distinguish between algorithms that use confidential information and those that use publicly available information. This too represents a meaningful departure from federal cases, which have generally treated algorithmic pricing based on public information as less problematic.[8]
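To illustrate how little it may take for software to satisfy that definition, consider the following minimal Python sketch. It is purely hypothetical: the function name, inputs and median-based logic are invented for illustration and are not drawn from the statute, any actual product, or any guidance interpreting the law. It simply shows that a tool collecting rents from two or more competing properties, processing them, and recommending a price facially tracks all three elements of the “coordinating function” definition, whether its inputs are scraped from public listings or shared confidentially.

```python
# Hypothetical illustration only -- invented names and logic, not an actual product.
# A tool like this would facially track the three elements of the statutory
# "coordinating function" definition: (1) collect price data from two or more
# competing property owners or managers, (2) process it, (3) recommend a price.
from statistics import median

def recommend_rent(unit_sqft: float,
                   competitor_rents_per_sqft: dict[str, list[float]]) -> float:
    """Pool per-square-foot rents attributed to competing properties and
    recommend a rent for the subject unit."""
    # Element 1: data collected from two or more competing owners/managers.
    pooled = [rate for rents in competitor_rents_per_sqft.values() for rate in rents]
    # Element 2: the pooled data is processed (here, a simple median).
    market_rate = median(pooled)
    # Element 3: a rental price is recommended.
    return round(market_rate * unit_sqft, 2)

# Example inputs from three hypothetical competing buildings; the definition does
# not distinguish between publicly scraped and confidentially shared figures.
recommended = recommend_rent(
    unit_sqft=750,
    competitor_rents_per_sqft={
        "Building A": [4.10, 4.25],
        "Building B": [4.40],
        "Building C": [4.05, 4.30],
    },
)
print(f"Recommended monthly rent: ${recommended}")
```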
Additionally, the statute does not require landlords to know they are using offending pricing software. Instead, it is enough that they act with “reckless disregard.”[9] It is thus prudent for landlords to understand the mechanics of their software, as a lack of knowledge will not be a viable defense.
Landlords should also consider that software may fall within the statutory definition even if coordinating rent prices is not its primary function. For example, one of the law’s sponsors indicated that the law could apply to situations where “mom-and-pop style landlords” use ChatGPT or other AI tools to analyze data available on the internet to help them set rents.[10] While it is unclear whether courts would enforce the law in such a circumstance, landlords of any size should be aware of the potential risk.
In addition to prohibiting the use of certain software, the law makes it illegal for any person or entity to facilitate an agreement not to compete between residential rental properties, “including by operating or licensing” algorithmic pricing software that performs a coordinating function.[11] This provision plainly targets companies that make pricing software, but it may sweep in other actors who could be deemed to have facilitated an agreement.
It is unclear whether the statute requires proof of a predicate agreement not to compete for facilitator liability to attach, or if merely operating or licensing the offending software is enough. Given that the statute treats the use of such software to set rents as an unlawful agreement, a cautious approach is to assume that operating or licensing software used in that manner is sufficient to trigger the statute.
As to the new law’s geographic and temporal scope, Section 340-b adds a new provision to New York’s existing antitrust statute, the Donnelly Act. By its terms, the Donnelly Act applies to restraints on “business, trade or commerce or in the furnishing of any service in this state,” meaning there must be a nexus between the alleged conspiracy and injury to competition in New York.[12] The new law further specifies that the property owners and managers subject to the ban are individuals and entities that own or manage residential rental units in New York state.[13]
Although the law does not indicate whether it applies retroactively or prospectively, New York courts generally hold that “amendments are presumed to have prospective application unless the Legislature’s preference for retroactivity is explicitly stated or clearly indicated” or the amendment is remedial in nature.[14]
Violations of the new law may result in civil penalties of up to $1 million, as well as potential criminal prosecution.[15] The Donnelly Act authorizes suits by the government and provides a private right of action, meaning this new section could create a basis for private plaintiffs to sue.[16] Parties suing under Section 340-b may be entitled to treble damages and attorney fees.
A Developing Challenge to Section 340-b
RealPage Inc., a company that provides pricing software to rental properties and recently entered a consent decree resolving a federal antitrust investigation, sued New York’s attorney general on Nov. 26 to block enforcement of Section 340-b.[17]
In its complaint, RealPage argues the statute violates the First Amendment by restricting lawful advice and analysis, imposing severe penalties, and discriminating based on viewpoint. The company seeks declaratory and injunctive relief, claiming the law is overbroad, unsupported by evidence and a threat to routine data-driven tools.
RealPage’s preliminary injunction motion remains pending, and the outcome of the action will bear heavily on the viability of Section 340-b and, potentially, of similar algorithmic rent-setting bans elsewhere.
Regulating Personalized Algorithmic Pricing: Section 349-a and S.B. 7033
In addition to legislative efforts aimed at restricting the use of algorithmic tools in rent-setting, New York has enacted and is considering further measures to regulate algorithms that leverage consumers’ personal data for individualized pricing. These measures reflect heightened scrutiny of personalized pricing practices, which have proliferated alongside data-driven systems, particularly in industries like e-commerce, where algorithms adjust prices in real time based on consumer profiles and behavioral data.
Notably, New York’s Algorithmic Pricing Disclosure Act, codified at Section 349-a of the General Business Law, went into effect in November. The statute requires merchants to disclose when a published price has been set by an algorithm using a consumer’s personal data.[18]
The law recently survived a challenge brought by the National Retail Federation alleging that the disclosure requirement violates the First Amendment’s prohibition on compelled speech, though that ruling is under appeal.[19]
A second bill, New York S.B. 7033, the Preventing Algorithmic Pricing Discrimination Act, is currently under committee review. If enacted, this bill would prohibit the use of protected class data to set prices for goods or services that differ from those offered to other consumers. Protected class data encompasses information identifying legally protected characteristics, such as ethnicity, nationality, age, disability, sex or sexual orientation.
Unlike the Robinson-Patman Act, S.B. 7033 would regulate price discrimination in direct-to-consumer transactions and apply to both goods and services, thereby expanding protections to areas previously outside the reach of federal law.
Takeaways for Businesses
New York Property Owners and Managers
Property owners and managers within the state should:
- Reevaluate the use of algorithmic pricing software or systems for setting rents, occupancy levels and other lease terms;
- Understand the software’s mechanics. Identify its data sources, processing methods and outputs to determine whether it performs a coordinating function under Section 340-b; and
- Train leasing and management teams on permissible and impermissible uses of third-party pricing tools.
Companies Providing Algorithmic Software to New York Property Owners
Providers of algorithmic software to those renting out property should:
- Audit products to confirm they cannot reasonably be classified as performing a coordinating function under Section 340-b;
- Review audit results and confirm compliance, especially in borderline cases;
- Document findings in detail, including data sources, model versions and compliance audit results to defend against potential facilitator liability; and
- Implement compliance features that prevent or flag client uses likely to implicate Section 340-b and provide clear guidance and training materials to help clients avoid inadvertent violations.
New York Companies Using Algorithmic Consumer Pricing
Companies within the state that use algorithmic pricing should:
- Review pricing algorithms to determine coverage under Section 349-a;
- Disclose clearly to consumers when published prices are set by algorithms using personal data; and
- Monitor the progress of New York S.B. 7033, which could prohibit algorithmic price discrimination based on protected class data.
Companies Outside New York
Companies outside the state should:
- Track legislative developments and enforcement actions involving algorithmic pricing in relevant jurisdictions, as other states — such as California — have enacted or are considering similar regulations.
Foley & Lardner associates Richard Lee and Savannah Miracle and law graduate Hannah Allbery contributed to this article, which was originally published in Law360 on December 17, 2025, and is republished here with permission.
[1] See, e.g., In re: RealPage Inc., Rental Software Antitrust Litig. (No. II), 709 F. Supp. 3d 478 (M.D. Tenn. 2023); In re: MultiPlan Health Ins. Provider Litig., 789 F. Supp. 3d 614 (N.D. Ill. 2025).
[2] Numerous cities and localities have enacted their own bans on algorithmic rental pricing, including San Francisco, Berkeley, Santa Monica and San Diego (California); Jersey City and Hoboken (New Jersey); Seattle and King County (Washington); Philadelphia; Minneapolis; and Providence.
[3] N.Y. Gen. Bus. Law § 340-b(3).
[4] Id.
[5] Id.
[6] See, e.g., Gibson v. MGM Resorts Int’l, No. 2:23-CV-00140-MMD-DJA, 2023 WL 7025996, at *3 (D. Nev. Oct. 24, 2023) (finding that failure to allege hotel operators were required to adopt recommended prices was a “fatal deficiency in the Complaint”).
[7] N.Y. Gen. Bus. Law § 340-b(1)(c).
[8] See, e.g., Dai v. SAS Inst. Inc., No. 24-CV-02537-JSW, 2025 WL 2078835, at *5 (N.D. Cal. July 18, 2025) (dismissing where “the facts [we]re insufficient to permit the Court to reasonably infer an exchange of confidential information”); Gibson v. Cendyn Grp. LLC, 148 F.4th 1069, 1083 n.8 (9th Cir. 2025) (affirming dismissal where plaintiffs did not allege that software provider “shared confidential information of each competing hotel among the licensees”).
[9] N.Y. Gen. Bus. Law § 340-b(3).
[10] 6-10-25 Session: Hearing on AO1417, N.Y. State Assemb. 161-63 (2025).
[11] N.Y. Gen. Bus. Law § 340-b(2).
[12] See id. § 340(1); Glob. Reinsurance Corp. U.S. Branch v. Equitas Ltd., 969 N.E.2d 187, 196 (N.Y. 2012).
[13] N.Y. Gen. Bus. Law § 340-b(1)(d).
[14] In re: Gleason (Michael Vee Ltd.), 749 N.E.2d 724, 726 (N.Y. 2001).
[15] N.Y. Gen. Bus. Law §§ 341, 342-a.
[16] Id. § 340(5).
[17] Complaint at 1, RealPage Inc. v. James, No. 25-cv-9847 (S.D.N.Y. 2025).
[18] See N.Y. Gen. Bus. Law § 349-a(2).
[19] See Nat’l Retail Fed’n v. James, No. 25-CV-5500 (JSR), 2025 WL 2848212 (S.D.N.Y. Oct. 8, 2025).