What Is a CRE Technology Assessment?
A CRE technology assessment is a structured evaluation of the systems, data, integrations, and processes that support a corporate real estate organization. It maps what exists today, identifies what is working and what is not, and produces recommendations that leadership can act on. Done well, it is the foundation for every technology decision that follows, whether that is migrating to a new IWMS, adopting corporate real estate analytics software, or simply rationalizing what you already have.
Most assessments fail not because they lack rigor, but because they never lead to action. They produce a 200-page report that sits on a shelf. The methodology described here is designed to avoid that outcome. It is based on practitioner experience conducting assessments for large corporate real estate organizations, including IWMS environments, lease management systems, facilities platforms, and the integrations between them.
Step 1: Start With the Business, Not the Technology
The most common mistake in a CRE technology assessment is starting with the systems. Teams begin by inventorying every module, every feature, every configuration. That work is necessary, but it is not where you start.
Start with the decisions leadership needs to make. What questions does the CRE organization struggle to answer today? How long does it take to produce a board-ready portfolio report? Where is the organization overspending? What lease obligations are at risk? Which facilities are underperforming relative to their cost?
These are business questions, not technology questions. But the technology exists to serve them. If you do not understand what the business needs, you cannot evaluate whether the technology is delivering it. Every subsequent step in the assessment is measured against this baseline: does the current technology enable the decisions that matter?
Step 2: Map the Technology Landscape
Now map what exists. Every system, every data source, every manual process. This means the IWMS (TRIRIGA, Archibus, Planon, or whatever the organization runs), lease management and administration platforms, ERP and financial systems, facilities management and CMMS tools, project management platforms, document repositories, and the spreadsheets and manual workarounds that have grown around the gaps.
For each system, document who owns it, who uses it, what data it holds, what condition the data is in, and what integrations connect it to other systems. Pay special attention to integrations. In most CRE technology environments, integrations are the most fragile and least documented components. They are often the first thing to break during a transition and the last thing anyone thought to map.
Do not rely on architecture diagrams from the original implementation. They are almost certainly out of date. Map what actually exists today, not what was designed five years ago.
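The landscape inventory described above can be kept as structured data rather than a slide deck, which makes gaps easy to detect. The sketch below is a minimal, hypothetical example; the field names, system names, and the orphan-integration check are illustrative assumptions, not a prescribed schema.

```python
from dataclasses import dataclass

@dataclass
class SystemRecord:
    """One entry in the technology landscape inventory (illustrative fields)."""
    name: str
    owner: str               # accountable business or IT owner
    users: list[str]         # groups that actively use the system
    data_domains: list[str]  # e.g. "lease", "space", "facilities"
    integrations: list[str]  # systems this one exchanges data with
    last_verified: str       # date the entry was confirmed, not assumed

# Example entries; names and attributes are hypothetical
landscape = [
    SystemRecord("IWMS", "CRE Technology", ["Lease Admin", "FM"],
                 ["lease", "space"], ["ERP"], "2024-05-01"),
    SystemRecord("ERP", "Finance IT", ["Finance"],
                 ["financial"], ["IWMS"], "2024-05-01"),
]

# Flag integrations pointing at systems missing from the inventory.
# These are often the fragile, undocumented connections an assessment
# needs to surface before any transition planning begins.
known = {s.name for s in landscape}
orphans = [(s.name, target) for s in landscape
           for target in s.integrations if target not in known]
print(orphans)  # empty here; any hits are unmapped dependencies
```

A record like this is cheap to maintain and forces the team to write down an owner and a verification date for every system, which is exactly where stale architecture diagrams go wrong.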
Step 3: Talk to the People Who Actually Use the Systems
This is where most assessments conducted by large consulting firms fall short. They talk to IT. They talk to leadership. They review system documentation. But they do not talk to the facility managers, the lease administrators, the project coordinators, and the regional directors who interact with these systems every day.
The people closest to the work know things no documentation captures. They know which modules are actually used and which were abandoned years ago. They know the workarounds. They know where data entry is duplicated, where reporting is manual, and where the system's design does not match how the business actually operates.
Stakeholder conversations should cut across organizational boundaries: real estate leadership, facilities management, lease administration, finance, IT, and external service providers. Each group has a different relationship with the technology and a different set of requirements for what comes next.
In one recent assessment, conversations with frontline users revealed that an entire IWMS module, one that was actively maintained and supported, had not been used by anyone on the business side in over two years. That insight alone changed the scope and cost of the transition plan.
Step 4: Assess Data Quality
Data quality is the single most underestimated variable in any CRE technology decision. If the data in your current systems is inconsistent, incomplete, or stale, migrating it to a new platform does not solve the problem. It transfers it.
For each system and data domain, assess completeness (are the required fields populated?), consistency (do naming conventions, categories, and hierarchies match across systems?), currency (when was the data last updated, and is it still accurate?), and ownership (who is responsible for maintaining this data going forward?).
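The first three of those checks lend themselves to programmatic analysis. Here is a minimal sketch against toy lease records, assuming a fixed assessment date; the field names and the one-year staleness window are illustrative choices, not a standard.

```python
from datetime import date, timedelta

# Toy lease records; field names are illustrative, not a real schema
leases = [
    {"id": "L-001", "tenant": "Acme", "expiry": "2026-03-31",
     "updated": date(2024, 4, 2)},
    {"id": "L-002", "tenant": None, "expiry": "2025-01-31",
     "updated": date(2022, 7, 15)},
]

required = ["id", "tenant", "expiry"]

# Completeness: share of required fields populated across all records
filled = sum(1 for r in leases for f in required if r.get(f) not in (None, ""))
completeness = filled / (len(leases) * len(required))

# Currency: records not touched within a chosen staleness window,
# measured from a fixed assessment date rather than "today"
as_of = date(2024, 6, 1)
stale_cutoff = as_of - timedelta(days=365)
stale = [r["id"] for r in leases if r["updated"] < stale_cutoff]

print(f"completeness={completeness:.0%}, stale={stale}")
```

Consistency checks work the same way: compare category values and hierarchy codes across systems and count mismatches. Running these checks per domain produces the quality profile that the transition plan in the next paragraph depends on.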
Lease data, space data, facilities data, and financial data each have different quality profiles. Lease data is often the most complete because it feeds financial obligations. Space data is often the most stale because occupancy changes faster than systems are updated. Facilities data quality varies dramatically depending on the service delivery model.
Data quality findings directly shape the transition plan. If data quality is high, migration is straightforward. If it is not, data remediation becomes a prerequisite that adds time and cost to any path forward.
Step 5: Document Current Process Maps
Map how information actually flows today. Not how it was designed to flow. Not how the org chart says it should flow. How it actually flows.
This means documenting the real process for key workflows: how a lease gets administered from execution to expiration, how a facilities work order moves from request to completion, how capital projects are tracked and reported, how portfolio reporting is assembled for leadership.
In every assessment, the documented process and the actual process diverge. People have built workarounds, manual handoffs, and shadow processes that compensate for gaps in the technology. These workarounds are not failures. They are information. They tell you exactly where the current systems are not meeting the business need.
Process maps become the baseline for any future state design. You cannot design a better process without understanding the current one. And you cannot evaluate new technology without knowing what processes it needs to support.
Step 6: Define Future Technology Requirements
This is where the assessment shifts from understanding the present to shaping the future. Based on everything you have learned, what does the organization actually need from its CRE technology?
Requirements should come from the business, not from a vendor demo. They should be organized by domain: lease management, space and occupancy, facilities and operations, capital projects, reporting and analytics, document management, and integrations. For each domain, define what capability is needed, what the current gap is, and how critical it is to the business.
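A requirements register organized this way can be a simple structured list, which keeps it sortable by domain and criticality when the vendor evaluation starts. The entries below are hypothetical examples, not recommended requirements.

```python
# Illustrative requirements register; every entry here is a made-up example
requirements = [
    {"domain": "lease management", "capability": "critical-date alerts",
     "current_gap": "tracked manually in spreadsheets",
     "criticality": "must-have"},
    {"domain": "reporting and analytics", "capability": "portfolio dashboard",
     "current_gap": "assembled by hand each quarter",
     "criticality": "must-have"},
    {"domain": "space and occupancy", "capability": "scenario planning",
     "current_gap": "not supported today",
     "criticality": "nice-to-have"},
]

# Separate true requirements from nice-to-haves before any vendor demo
must_haves = [r for r in requirements if r["criticality"] == "must-have"]
print(len(must_haves), "must-have capabilities across",
      len({r["domain"] for r in must_haves}), "domains")
```

Forcing each entry to name a current gap keeps the register honest: a requirement with no gap behind it is usually a nice-to-have in disguise.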
Be honest about what is a true requirement and what is a nice-to-have. Most organizations use roughly 30% of their IWMS platform's capabilities. The other 70% was purchased but never adopted. Future requirements should reflect what the business will actually use, not what looks impressive in a vendor evaluation matrix.
Requirements documents become the foundation for vendor evaluation, RFP development, and build-vs-buy decisions. They also become the measuring stick for evaluating AI tools and analytics platforms that may complement or replace parts of the existing stack.
Step 7: Present Options, Not Mandates
The assessment deliverable should present multiple paths forward, each with clearly defined tradeoffs. Not one recommendation with a predetermined conclusion.
Common paths include staying on the current platform and investing in optimization, migrating to a new IWMS or suite, adopting an analytics layer to connect existing systems, or a hybrid approach that replaces some components while connecting others. Each path should include estimated cost, timeline, internal resource requirements, risk factors, and what the organization gains and gives up.
Leadership makes the decision. The assessor provides the facts, the options, and an honest recommendation. But the organization needs to see the tradeoffs clearly enough to make an informed choice. A recommendation without alternatives is a sales pitch, not an assessment.
Step 8: Define Immediate Next Steps
Every assessment should answer the question: what can we do right now? Not after a 12-month vendor selection process. Not after a 24-month platform migration. Right now.
There are almost always quick wins that improve visibility, data quality, or process efficiency without waiting for a full transformation. These might include connecting existing systems through an analytics platform like Osprey to get portfolio visibility immediately, remediating data quality in the most critical domains, decommissioning unused modules or integrations to reduce complexity, or standardizing naming conventions and hierarchies across systems.
Quick wins build momentum and credibility. They show stakeholders that the assessment is leading to tangible improvement, not just another planning exercise.
Step 9: Keep It Tight
A CRE technology assessment does not need to take six months. With the right practitioner, focused stakeholder engagement, and modern tools to accelerate data analysis, a comprehensive assessment can be completed in four to seven weeks.
The key is efficiency without sacrificing rigor. Targeted interviews rather than endless workshops. Programmatic data quality analysis rather than manual spreadsheet reviews. A deliverable that is actionable and concise rather than comprehensive and unreadable.
If your assessment is taking longer than two months, the scope is too broad or the approach is too heavy. The goal is clarity, not completeness. A decision-maker needs enough information to act with confidence. Not a catalog of everything that exists.
What a Good Assessment Delivers
At the end of a well-conducted CRE technology assessment, leadership should have a clear picture of the current technology landscape, including systems, integrations, data quality, and usage patterns. They should have documented current-state process maps showing how information actually flows. They should have a future-state requirements document grounded in business needs. They should have multiple paths forward with cost, timeline, and tradeoff analysis. And they should have a set of immediate next steps that create value before the long-term path is fully defined.
That is what separates an assessment that leads to action from one that leads to another assessment.
Closing Thought
The best CRE technology assessments are conducted by people who have operated these systems in production, not by people who study them from the outside. The difference is in the questions they ask, the patterns they recognize, and the recommendations they make.
If your organization is navigating an IWMS transition, rationalizing a fragmented technology environment, or trying to figure out where AI fits into your CRE operations, the assessment is the right starting point. Not a vendor demo. Not an RFP. A clear-eyed look at where you are today and what makes sense next.
If you are considering a CRE technology assessment and want to understand what the process looks like, learn about our advisory approach or book a 20-minute call to discuss your situation.