Leading real estate firms are building AI committees. Is yours?
Most real estate companies approach AI the same way. Someone in the C-suite reads an article, gets excited, and asks the team to "look into it." Six months later, you've got three people using ChatGPT for emails, and nothing much has changed except now IT is blocking half the tools people want to use. An AI committee changes that. Not because committees are exciting per se, but because they force you to answer the right questions first.
Start with the why and not the what
The first mistake companies make is jumping straight to vendor demos. They want to see the shiny interface and the case studies from competitors who've already implemented it. But none of that matters if you haven't identified what you're trying to fix in the first place.
An AI committee's job starts with mapping real problems rather than theoretical ones. Think bottlenecks that cost your team time and money. The investment manager spends four hours screening transactions every other day, while the asset management team loses three weeks every quarter writing portfolio reports. Then there’s the analyst, who manually extracts lease data from hundreds of PDFs.
These are tedious, repetitive tasks that real estate professionals shouldn't be doing but can't avoid. When you bring together people from across the business, whether that's asset management, investment, portfolio management or finance, you start to see patterns. Five different people handle the same document before anything even happens with it. The same data is entered three times, and the same analysis is repeated for every new deal.
This is where AI does what it’s meant to do. Not by replacing jobs or "transforming" your business overnight, but by handling the work that shouldn't require a human brain in the first place. Your committee exists to identify those tasks and build consensus around which ones matter most.
Implementation takes longer than you think
The asset manager who's been writing quarterly reports the same way for eight years won't immediately trust software to do it for them. Neither will the investment team that's spent years refining their screening process and IC memo. That makes sense: people are protective of work they've mastered, and for good reason.
Change management moves slowly in real estate because the industry values expertise and relationships over rapid experimentation. Your AI committee needs to account for that reality instead of fighting it. They become the bridge between early adopters and skeptics, sharing results across teams and being honest about where the platform works and where it doesn't.
Boiling the ocean doesn’t work. Some committees make the mistake of trying to automate everything simultaneously. Pick one workflow that matters and get it working properly first. When the investment manager watches their underwriting go from four hours to ninety minutes, they'll tell their colleagues. Those conversations carry more weight than any executive memo about AI adoption ever could.
At Fifth Dimension, our approach is simple: crawl, walk, run. Get your team skilled at prompting, start scoping high-ROI automations, and finally introduce integrations that allow 5D to connect dispersed data sources, so real estate companies can make smarter, quicker decisions on their portfolios.
Sort out the procurement side early
At companies with banking or private equity roots, security reviews can take as long as six months. All the vendor checks and compliance work need to be done before you run a proof of concept, otherwise the project stalls as soon as it starts.
Nothing's worse than delivering a successful pilot that solves a real problem, only to discover three months later that the vendor can't pass your internal requirements. Your AI committee should make procurement part of the initial evaluation because by the time everyone's excited about the results, finding out you can't use the platform will stop momentum in its tracks.
A flexible approach to training
The same AI platform ends up serving different needs depending on who is using it. The investment team might screen deals and pull comparable data, while asset management wants portfolio reports and tenant analysis. At the same time, perhaps finance needs data extraction and formatting for their models.
Generic training doesn't work because the questions each team asks are fundamentally different. Implementation needs to reflect that. Over three to five weeks, people learn how to use the platform for their specific workflows, which means they're using it instead of abandoning ship after the first frustrating attempt.
Companies that already use tools like Copilot or ChatGPT tend to move faster here. Their teams understand how to interact with AI platforms, even if they're switching to something more specialised. They've already cleared the basic hurdle of knowing how to frame requests and refine outputs.
Control what the AI can see
One concern that comes up repeatedly is handing over access to an entire document management system. The worry makes sense because nobody wants AI trawling through terabytes of sensitive files with no restrictions.
The solution is simpler than most people expect, however. You control exactly which folders and files the platform can access. If you're working on ESG reporting, the AI sees your BREEAM certificates and sustainability audits. It doesn't see acquisition documents or tenant correspondence unless you specifically grant that access.
Taking this approach solves two problems: your data stays contained to what's relevant for the task at hand, and the AI performs better because it's not searching through irrelevant information to find what matters. A surveyor extracting lease terms doesn't need the system scanning every email thread from the past five years. Narrowing the scope improves the output while keeping you in control. Your AI committee sets those boundaries based on how different teams work.
Security should never be an afterthought
What tends to happen is someone on a team starts using a general-purpose AI tool on their own laptop because it’s quicker than waiting for IT to approve anything. They paste in sensitive documents and start feeding deal data into whatever free tool looks helpful. No one is trying to cause a problem, but that’s how sensitive information ends up in places it shouldn’t.
Any technology tool your real estate organisation implements should meet ISO 27001 and SOC 2 standards. These certifications give you a clear baseline for how your data is handled and who can access it. This matters because of the information you hold in real estate, from client portfolios to deal history and internal analysis. None of it should end up training someone else’s system or leaking into other accounts. Keeping control of that data is part of doing the job.
An AI committee helps to stop that from happening in the first place and gets everyone on the same page before any tools are rolled out. Your technology team needs to be involved from the start instead of being dragged in later to explain why something has to be switched off. Their job is to define what’s allowed and what isn’t, then make sure buying decisions follow those rules before anyone starts experimenting.
Without a thorough understanding of AI security, you’re putting systems in place with no real foundation. You can learn more about this here.
Time to get specific or go home
Strip away the corporate language and an AI committee does three things:
- It identifies where time is wasted on work that doesn't need human judgment.
- It makes sure AI adoption holds across the business, instead of stalling on risk, approvals or habit.
- And it helps people across the business use the platform instead of reverting to old habits after two weeks.
None of this is glamorous work, and you're not revolutionising real estate or building the future. But the work is important: you’re finding bottlenecks that cost your best people four hours a week and figuring out how to give them that time back.
The companies that make AI stick treat it as an operational project. Teams from different functions meet regularly and try out specific use cases in real work. They keep what proves its worth and lose what doesn’t. That steady approach is what turns AI into something people use instead of another short-lived experiment.
Committee before strategy
Companies that succeed with AI start with the groundwork, like identifying which problems matter and getting the right people aligned before testing anything. An AI committee gives that process a shape and keeps it moving.
Fifth Dimension works with real estate companies to identify high-value use cases and implement platforms that teams use. Book a chat to discuss how we can help.


