There’s a new thing to get sued for, and it’s really expensive: the use of AI property management software. Interestingly, the end users of this software have had to pay millions in fines and settlements from antitrust litigation, while the makers of the actual software have paid almost nothing. How can using this software be more illegal than making and selling it?
The major software companies that help property managers set prices algorithmically do so using AI. This is all well and good in theory; the problem is that this type of software often lets users enter internal company data about pricing, vacancy rates, and the like, and then uses AI to suggest what the user should charge for rent. Again, theoretically fine… except that these companies have been accused of pooling the internal data that many different firms input into their systems and using it to recommend prices to all of them. Allegedly, this allows coordinated setting of rents across competitors using competitively sensitive, nonpublic data, which may amount to illegal price fixing.
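To make that mechanism concrete, here is a deliberately simplified, hypothetical sketch in Python. No vendor's actual algorithm is public, and the function, numbers, and markup below are invented for illustration; the point is only to show why a recommendation built on pooled competitor data differs from one built on a landlord's own data alone.

```python
# Hypothetical sketch -- NOT any vendor's real algorithm.
# Shows how pooling competitors' nonpublic rents shifts the
# "market" baseline that every participant's recommendation uses.
from statistics import mean

def recommend_rent(own_rents, competitor_rents=None, markup=1.02):
    """Suggest a rent as a small markup over an average.

    With only own_rents, the baseline is the landlord's own average.
    With competitor_rents pooled in, the baseline becomes a shared
    average across all participating firms.
    """
    pool = list(own_rents) + list(competitor_rents or [])
    return round(mean(pool) * markup, 2)

# A landlord acting alone anchors to its own average...
solo = recommend_rent([1800, 1850, 1900])

# ...but with competitors' nonpublic rents pooled in, every
# participant's recommendation drifts toward the same higher baseline.
pooled = recommend_rent([1800, 1850, 1900],
                        competitor_rents=[2100, 2200, 2150])

print(solo, pooled)  # the pooled recommendation is higher
```

Even in this toy version, the shared baseline is the legally interesting part: each firm's "suggestion" is a function of every other firm's sensitive data, which is exactly the coordination the lawsuits allege.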
Use of this software has been implicated in apparently coordinated schemes to raise rents dramatically in many cities. For example, the city of Hoboken, New Jersey, banned algorithmic rent-setting software after almost all of the city's corporate landlords instituted a 25% rent increase across most of their properties in the same year.
Some AI rental programs have forums where competitors can privately discuss their business strategies. Who needs old-fashioned in-person secret meetings to fix prices when you and your competitors can just upload your internal company data to your rental software system and train AI on it? Even better, you can discuss this pricing with competitors in private forums. It seems like a price-fixing dream come true!
The California Attorney General’s Office alleged in a suit against major corporate landlords that “Landlords also understood that their nonpublic data would be used to recommend prices not just for their own units, but also for competitors who use the programs.” Similar cases have used this argument to show that corporate landlords were not confused about why the programs asked for their sensitive internal data. That said, price fixing is one of the few offenses that does not require proof of intent to harm, or even proof of any damages, in order to bring charges. Price fixing is a per se violation of the Sherman Antitrust Act: even without intent to harm, and even without any harm actually occurring, it’s illegal. It’s like drunk driving; no one needs to prove you intended to get drunk, or that you caused any harm. If your blood alcohol level is above the legal limit, you face charges. Similarly, if competitors have exchanged sensitive pricing information, or if all rents rise by the same amount across all competitors with no external factors to blame, the government can bring charges without proving intent or damages.
The DOJ’s suit over price-fixing AI software was based on a per se violation of antitrust law. That meant the DOJ did not need to prove intent to violate the law, and did not need to prove damages; it simply had to show that price fixing was likely happening. Because the goal was stopping illegal behavior, not proving damages or intent, the case proceeded quickly, and the maker of the software was ordered to stop allowing price fixing-related activity on its platform. Proving damages or intent can take forever, so when the goal is simply to stop the behavior, it’s best to skip that part. That said, this outcome was much cheaper for the AI software company than if damages had been proven. It’s ironic that the company so central to the alleged price fixing paid very little as a financial penalty, while the companies using its software were sued for damages and ended up paying millions.
Corporate landlords using this software were sued by various class actions and state attorneys general for actual damages, not just to stop anticompetitive behavior. The plaintiffs were groups of tenants who suffered financial and other harm when their rents were raised in an allegedly illegal price-fixing scheme facilitated by AI software. These suits targeted the landlords, not the makers of the software they used.
These AI rent-pricing software companies generally insist that, although they suggest what rent to charge, landlords do not have to follow the suggestions. Lawyers for the platforms have argued that landlords follow the suggestions only a small percentage of the time, and landlords who use the platforms have likewise argued that they merely receive suggestions they are free to ignore. The major problem with these arguments is that they amount to saying, “sure, we may be indirectly sharing competitively sensitive information with competitors, but we’re just choosing not to use it to fix prices most of the time.” That’s not compelling, especially when no intent needs to be proven for the DOJ to seek an injunction against even the slightest appearance of price fixing.
These cases also raise some common questions about AI: if no single human or entity is doing something illegal, then who is responsible for what AI is doing? If AI is just mysteriously choosing copyrighted material as a reference when creating content, has any copyright been violated? If an AI-rendered tree falls in a forest and no one hears it, did it really fall?
https://oag.ca.gov/news/press-releases/attorney-general-bonta-announces-7-million-settlement-greystar-participating
https://www.housingisahumanright.org/cities-across-united-states-are-banning-price-fixing-software-for-rental-housing/
https://patch.com/new-jersey/hoboken/hoboken-nj-bans-landlords-using-rent-setting-algorithms-what-it-means
https://www.paulweiss.com/insights/client-memos/algorithmic-pricing-and-antitrust-risk
https://www.hoganlovells.com/en/publications/federal-judge-in-washington-applies-per-se-treatment-for-algorithmic-pricefixing-claims
https://www.reuters.com/legal/government/greystar-agrees-50-million-settlement-realpage-rental-pricing-lawsuit-2025-10-02/
https://www.insurancejournal.com/news/southeast/2025/10/09/843098.htm
https://lawgaze.com/greystar-lawsuit/


