Facial Recognition Tech In Real Estate Pits Safety Against Liberty
Its supporters argue it can prevent terrorist attacks and catch criminals. Its critics say it is racist, ineffective and destroys civil liberties. And now commercial real estate is caught up in the ferocious debate about the use of facial recognition technology.
Last month, UK developer Argent faced a storm of criticism when it emerged it had deployed facial recognition technology at King’s Cross Central, a 67-acre London scheme where Google and Facebook are building their UK headquarters.
The UK Information Commissioner’s Office, a privacy watchdog, has launched an investigation into the technology’s use at King’s Cross Central, but Argent is not alone in using facial recognition.
Canary Wharf Group, the London office investor and developer owned by Brookfield and the Qatar Investment Authority, is also considering installing the technology, the Financial Times reported. And campaign group Big Brother Watch said British Land has used the technology at its Meadowhall shopping centre in Sheffield.
Police forces and intelligence services across the world use facial recognition technology, which can scan faces in a crowd and match them against computer databases. Its use is so controversial that San Francisco banned its police department from deploying it, although it can still be used by private companies.
Critics say it is ineffective, giving false positive results, especially in the case of nonwhite and female faces, and that it is inappropriately intrusive because it takes an image of a person’s face, scans it and sometimes stores it without their consent.
Until recently its use by private commercial real estate firms had been rare and unremarked upon, but as technology advances and the public debate about privacy and technology gets louder, that is likely to change.
Property owners across the world are more safety conscious than ever, face ever-evolving types of threat, and are utilising new methods to try to keep the people who use their schemes safe.
At the same time, concerns over privacy and data use are more high-profile than ever before, particularly in the wake of revelations about Facebook’s use of data. As property owners gather more and more data about the people who use their buildings, and the lines between public and private space blur, technology like facial recognition will be a nexus in the debate about the balance between safety and privacy.
“If you take the long view, we are at a juncture where the existing social contract on individual freedom and privacy needs updating in the face of new technologies,” WiredScore President and EMEA Managing Director William Newton said. “It is a conversation about balancing national security and individual liberty. Are we comfortable with anyone holding an omniscient overview of what individuals are doing? This kind of debate takes time and we are only now starting to feel our way towards the answers. But until we come to some sort of consensus you are going to see a lot of companies, including property companies, facing negative headlines because of their use of data.”
Bisnow contacted more than 20 UK and U.S. developers to discuss the implications of the use of facial recognition technology in commercial real estate, but it is such a controversial topic that none were willing to go on the record.
Speaking on background, many pointed out that facial recognition technology has potential benefits in terms of making buildings and schemes safer and protecting users from threats including personal attacks and terrorism. Police forces and security services use the technology to protect places like airports and sports stadiums. No one wants their scheme to be the scene of the next mass terrorism attack.
“There are some great potential uses,” one landlord said. “Imagine you could put on a watch and be notified when a terrorist suspect walks on to your estate.”
“If it is a major terrorism target, or an area of high criminality, there is a case for it,” another said.
One developer said it could have the impact of reducing attacks on individuals using a space late at night.
Both Argent at King’s Cross and British Land at Meadowhall stressed this safety element as an argument for their use of facial recognition at UK properties, and pointed out that the technology had been deployed on a trial basis, in conjunction with regional police forces, on one street in the wider development. Argent declined to be interviewed for this article.
In a letter to London Mayor Sadiq Khan seen by the BBC, Argent explained how the system it used from 2016 to 2018 would have worked if it had been permanently installed.
“[It] is designed to run in the background, effectively dormant unless it matches against a small number of 'flagged' individuals (for example, individuals who have committed an offence on the estate or high risk-missing persons),” partner Robert Evans wrote. “At this point, all other faces are automatically blurred out when the footage is played back or captured. The system does not store the facial images of others.”
One developer said the issue is made more complicated by the fact that large developments like King’s Cross Central or the Canary Wharf Estate are privately owned public spaces. Because of this, the line around what is an acceptable level of privacy is not entirely clear. There is also the question of how far developers are allowed to go in policing behaviour in space that they own but which is open to the public.
“I have a degree of sympathy,” the developer said. “These are private spaces that are publicly available, paid for and maintained by the developer. The developer surely has some right to control the environment, and if there is persistent anti-social behaviour like drunkenness or intimidation, to control that, and make it a pleasant place for the public to be. But at the same time, the public don't want to feel like they have been blacklisted without ever having been convicted of a crime.”
“The average person doesn’t have a clue if they are in public or private space,” Newton said. “With individual buildings it is more obvious that when you walk through the door you are giving your implicit consent to abide by that building’s rules. That doesn’t exist in privately owned public realm.”
Increasingly, property owners are going to have to wrestle with how they collect and store data, as buildings become smart and start measuring how people use them. It is a problem that tech companies have faced for years now. The controversy over facial recognition technology is at the sharp end of this, but only a small part of a wider conversation: see for instance the controversy in Toronto over Google’s smart city, and who owns the data generated by the people who use the new district the tech company is building.
“We’re in the early stages of this, and there are a lot of potential uses of this information that we don’t understand,” one landlord said. “We have discussed using this technology internally, and we think it’s better to avoid the moral hazard, especially in the wake of the criticism Facebook received for its use of data.”
“When you sign up for it, you sign a waiver saying what the data can and can’t be used for, but who knows what uses could be found for it further down the line that aren’t covered by that waiver? It only takes one employee of yours to say, oh, we could use it for this, you don’t have oversight of that, and then you face big reputational risk.”
The landlord added that while some tenants might appreciate the benefits that facial recognition technology might bring, others might be uneasy.
“You don’t really want to do anything to worry the tenants.”
So while society at large grapples with this issue, how should real estate owners balance the different forces at play?
One developer said it is putting facial recognition technology into a new office building, but users will have to explicitly opt in to its use rather than their faces being scanned automatically. It will also mainly be for external guests who don’t want to queue at reception. Facial images would be stored on a server to which the building owner does not have access.
For the owners of large, multi-building schemes, the issue is trickier, but not impossible, according to WiredScore’s Newton.
“I would use the word transparency rather than consent,” he said. “Property owners using facial recognition technology should have visible signage alerting people to its use.”
This could include mentioning it on the menus of bars and restaurants.
“As much as possible you need to try and let people know that’s happening,” he added. “You have to measure how many people using the area regularly know about it, and if awareness is below a certain threshold, you need to improve.”
Property owners are going to have to balance security and liberty, data use and privacy, more and more, for decades to come. It is best to start thinking about these questions now.