Fear Of Leaks Restricts Commercial Real Estate's AI Use
In real estate, data might as well be currency.
Neighborhood demographics, vacancy rates and sales figures are tracked by nearly every brokerage and developer. The numbers inform daily decision-making, and, to defend against competition, many are kept under lock and key.
But a sacrifice may be needed if executives want to reap the benefits of artificial intelligence, proptech experts said at Bisnow’s New York AI & Technology event on Wednesday.
“Real estate companies are in this difficult place where they feel like they have all this special data. They don't want Claude or OpenAI to have access to the data,” Fundrise CEO Ben Miller said onstage at Silverstein Properties’ 7 World Trade Center. “So they're afraid to adopt the cutting-edge.”
In reality, large language models already have access to thousands of brokerage reports and online listings. Beyond that, census reports, economic indexes and news stories — essentially anything that lives online — can also be easily pulled and used.
“No one cares about your data,” Miller said. “You have a little bit of data compared to any AI company, but you do have to get smart.”
Nearly every company has been forced to consider what AI could mean for its future. In commercial real estate, 66% of professionals use AI at least weekly, according to a survey by First American Data & Analytics and DealGround. Of those, only 5% trust it enough to inform real decisions.
Common reasons respondents don’t use AI include confusion over which tools to use, a lack of trust, and data privacy or security concerns. Such fears are common for an industry that is notoriously slow to embrace new technologies, panelists said at the event.
“Real estate still operates as very broker-driven,” AvalonBay Communities Vice President of Product Management Pratik Dhebri said. “Keep it in spreadsheets. Keep it close to my chest. I'm not going to share this data. This data is truly mine.”
What executives may not know is that, behind the scenes, their employees may be using AI without authorization. Onyxia Cyber CEO Sivan Tehila calls the practice “shadow AI.”
Maintaining control of proprietary data has grown increasingly difficult since employees shifted away from secure in-office computers to laptops used at home. Add in generative AI, and things become a lot messier.
“When an employee copies your data [and pastes it into ChatGPT], it's out there, and the big companies can use this data to train their models,” Tehila said. “They can do whatever they want with it, especially if employees are using their own console.”
Over one-third of employees admit to sharing sensitive work information with AI tools without their employers' permission, according to IBM.
To prevent leaks and other consequences, Tehila suggests investing in a companywide system. Some real estate firms, such as JLL, Compass and Janover, have already built their own tools.
AvalonBay invests in its own platforms, but only if it can differentiate itself from what already exists, Dhebri said. There may be little reason to design customer relationship management software when it can use an established provider like Salesforce. Creating a specific app for residents in a community, however, may pay off.
But when working with third-party product developers, Dhebri remains guarded.
“We've explicitly put that in the contract, saying you cannot use this data to train anything else,” Dhebri said. “This is AvalonBay’s data. This is not your data.”
Still, creating a company-specific AI tool and implementing that tool are two separate problems, panelists said.
Oftentimes, a select few employees “somehow have the magic” to implement AI properly but operate independently, Miller said. They may be the ones lurking in the shadows.
“Find those people in your organization who have the right stuff, the right taste, the right agency and the desire to learn,” Miller said. “Then everyone in the organization learns from them.”
Nearly half of real estate companies are running AI pilots, according to a survey by proptech firm Keyway. More than half plan to increase AI spending by over 20% in the next 24 months.
Yet, many are stuck. Only 9% have successfully deployed AI at the enterprise level. Just 8% are data-ready for adoption, meaning the data is accurate and properly structured so it can be used in models.
“They're realizing that it's not just adopting Claude or ChatGPT,” George Smith Partners Director of Investments and Technology Eric Migicovsky said. “What's the system around it? Where does the data sit? Is it secure? Is it maintainable? How do you actually implement real workflows across the teams?”
Answering those questions, and then introducing staff to new technology, can be a hurdle in itself.
Tehila recommended pairing veteran employees, who have years of analog experience, with younger employees, who may lack industry know-how but have a better grasp of the technology. The collaboration lets the two groups learn from each other and can help pinpoint processes that could be automated.
Every other Friday, Migicovsky’s firm hosts team meetings where employees demonstrate tools they have experimented with, he said. Executives may then permit certain technologies to be incorporated into workflows and create guardrails.
Silverstein Chief Technology Officer Yael Urman said her team “created a lot of FOMO” to get people on board. Trainings include games, and employees are encouraged to find opportunities for implementation, especially when it comes to repetitive tasks.
At the World Trade Center, Silverstein is running pilot programs for window-cleaning drones and delivery robots. In residential buildings, it's testing ways to automate package rooms.
Even so, Urman faces some of the same anxieties as non-AI users.
“We ask ourselves every day: [Do] we feel that we run too fast or we don't run fast enough? Then, from the other side, how can we make sure that we are not increasing our risk and exposure?” Urman said. “This bothers us on a daily basis.”