The Disconnect Between PropTech Teams and End Users in Commercial Real Estate

Summary

- In our CRE technology benchmarking survey, the lens through which CTOs evaluate technology did not necessarily match that of end users

- The standard process of first collecting and integrating data is flawed; instead, the focus should start with the end user

- When that is done, the other pieces of centralization, integration and actionability fall into place

It's a nice thought

Last week, we launched a CRE technology benchmarking survey focused on "back-of-house" operations. There were a range of insights, but one theme stood out.

That is, the lens through which CTOs and "PropTech" teams are evaluating technology does not necessarily match conditions on the ground for end users.

In the survey results, CTOs were more likely to focus on the challenges of deploying AI-driven HVAC automation and digital twins. Yet the biggest challenges for asset managers and operators were around the limits of legacy technologies and an inability to access basic information when they need it.

It's nobody's fault, really. The job of a CTO in a commercial real estate company is very difficult. Even conceptually wrapping your head around the tech stack for an entire organization is a daunting task.

The good news, supposedly, is that much of the data needed is already available within the existing infrastructure (BMS, HVAC systems, sensors, CMMS, BIM). Making buildings smarter is mostly a matter of connecting these systems and unlocking their full potential.

It often looks something like this (a rough code sketch follows the list):

  1. Figure out what data we have
  2. Figure out how to connect all systems to a data warehouse
  3. Figure out what the hell we can do with all this data
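To make that concrete, step 2 alone typically means writing and maintaining a separate connector for every system in every building. Here is a minimal Python sketch of that pattern; the connector names are hypothetical and an in-memory list stands in for the real data warehouse:

```python
# Illustrative only: the "integrate everything first" pattern.
# Connector and warehouse names are hypothetical; a real portfolio would need
# a different connector per vendor, protocol, and building vintage.

from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class Reading:
    building_id: str
    system: str      # e.g. "BMS", "CMMS", "submeter"
    point: str       # e.g. "AHU-1 supply air temp"
    value: float
    recorded_at: datetime


class BMSConnector:
    """Stand-in for one of many per-vendor integrations (BACnet, Modbus, proprietary APIs)."""
    def pull(self, building_id: str) -> list[Reading]:
        return [Reading(building_id, "BMS", "AHU-1 supply air temp", 55.2,
                        datetime.now(timezone.utc))]


class CMMSConnector:
    """Stand-in for a work-order system integration."""
    def pull(self, building_id: str) -> list[Reading]:
        return [Reading(building_id, "CMMS", "open work orders", 12.0,
                        datetime.now(timezone.utc))]


def sync_portfolio(buildings: list[str], connectors: list, warehouse: list) -> None:
    """Step 2 of the standard playbook: fan out over every building and system."""
    for building_id in buildings:
        for connector in connectors:
            warehouse.extend(connector.pull(building_id))


warehouse: list[Reading] = []
sync_portfolio(["building-a", "building-b"], [BMSConnector(), CMMSConnector()], warehouse)
print(len(warehouse), "rows landed; step 3 is still an open question")
```

Each of those connectors is its own integration project, and the set churns every time an asset trades or a system is upgraded.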

As logical as it sounds on paper, many portfolios have spent years and millions of dollars on these efforts with little to show for it.

That's because each step grows in complexity, and the inputs are constantly changing as assets are bought and sold and systems are upgraded. It raises the question of whether the same result can be accomplished faster, better, and cheaper.

An alternative route

At Enertiv, we went down that road as well. We started by developing sensors that could affordably collect granular real-time data. We built a sophisticated software backend that now houses over 10 billion hours of this data. We spent countless engineering hours translating this data into actionable insights for operators.

Over time, however, we have come to realize there's a better way. It goes more like this:

  1. Start by engaging the end users - centralize legacy systems and bring any workflows still done on paper or in spreadsheets onto software
  2. Data is a natural byproduct of engagement with this software (a rough sketch follows this list)
  3. This data informs where investments in more advanced capabilities should be made
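As a rough illustration of step 2: the record an operator creates while doing their normal rounds is already structured data. Nothing below is an actual schema from any product, just a hypothetical sketch of the idea:

```python
# Hypothetical sketch: data as a byproduct of the operator's existing workflow.
# Field names and classes are illustrative, not any vendor's actual schema.

from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class MeterReading:
    """What used to live on a clipboard; now a structured, timestamped record."""
    equipment_tag: str           # e.g. a tag scanned during rounds
    value_kwh: float
    read_by: str
    read_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


@dataclass
class InspectionLog:
    equipment_tag: str           # e.g. "AHU-3"
    condition: str               # e.g. "fair", "needs repair"
    notes: str
    logged_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


# The operator just does their job...
rounds_today = [
    MeterReading("SUBMETER-04-12", 18342.7, read_by="j.smith"),
    InspectionLog("AHU-3", condition="needs repair", notes="belt wear, bearing noise"),
]

# ...and the portfolio gets a growing dataset for free: an equipment inventory,
# a condition history, and meter time series, without a separate data project.
```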

There's already a template for this approach with systems that focus on "front-of-house" operations like market research, tenant payments, amenities, etc.

But for some reason, when it comes to building operations, there's a tendency to skip ahead a couple generations and start piloting complex and expensive smart building solutions that reliably fail to scale.

The real world

Recently, Microsoft's Azure Digital Twins released exciting features such as an "open modeling language (Digital Twins Definition Language), graph-based relationships, and seamless integrations for advanced execution environments, input sources, and output destinations."

Huh?

Meanwhile, the building operators on site use one system for their rounds and inspections, another system for work orders, and clipboards for tenant submeter readings. 

Their boss, a director of operations in charge of 14 buildings, is trying to verify the condition assessments for every piece of equipment in time for a budgeting meeting. 

The asset manager, who has no transparency into anything happening on site, is working to figure out which service contracts to renew and which spreadsheet has the latest inventory of HVAC equipment for capital planning.

Finally, the property manager has spent the last several hours going through a filing cabinet of old utility bills to try to resolve a tenant's dispute with their submeter bill.

None of these problems require a million-dollar digital twin to solve.

We're getting there

In fact, addressing these problems is relatively straightforward. 

Give operators one tool with which they can tag equipment and manage their rounds, inspections, preventative maintenance, and meter readings. Have that information flow into software that makes it easy to find information, whether that's condition assessments or utility bills.

Doing this also transforms routine workflows into data-collection activities that can inform investments in more advanced capabilities by mapping out the infrastructure and pain points.
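Once those records exist, "where should we invest next" becomes a query rather than a guess. A crude, purely illustrative sketch in Python (the data and the scoring rule are invented, not a real methodology):

```python
# Hypothetical sketch: using workflow byproduct data to prioritize where
# advanced capabilities (sensors, analytics, retrofits) are worth the spend.

from collections import Counter

# Records accumulated from rounds, inspections, and work orders (illustrative).
work_orders = [
    {"equipment_tag": "AHU-3", "building": "building-a", "issue": "bearing noise"},
    {"equipment_tag": "AHU-3", "building": "building-a", "issue": "belt replaced"},
    {"equipment_tag": "CHILLER-1", "building": "building-b", "issue": "refrigerant leak"},
    {"equipment_tag": "AHU-3", "building": "building-a", "issue": "vibration"},
]
condition_flags = {"AHU-3": "needs repair", "CHILLER-1": "fair", "RTU-7": "good"}

# A crude prioritization: equipment with repeated work orders or a poor
# condition assessment is where real-time monitoring would pay off first.
repeat_issues = Counter(wo["equipment_tag"] for wo in work_orders)
candidates = sorted(
    (tag for tag, n in repeat_issues.items()
     if n >= 2 or condition_flags.get(tag) == "needs repair"),
    key=lambda tag: -repeat_issues[tag],
)
print("Monitor first:", candidates)   # -> ['AHU-3']
```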

That mapping is important because even the largest and most sophisticated portfolios have no intention of upgrading every building to the modern BMS and up-to-date BIM necessary for complex digital twins.

If the goal is to introduce advanced capabilities unlocked by real-time data across an entire portfolio, there needs to be a strategy that can be applied to any asset. This will necessarily require installing standalone sensors.

The question is what that looks like.

Some portfolios will need to prioritize retrofit investments in value-add acquisitions, others will need to calculate overtime HVAC billing or streamline tenant submetering, others will have to reduce the risk of water damage, and still others will want to verify the work done by elevator maintenance vendors.
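To pick one of those examples, overtime HVAC billing is just arithmetic once runtime data exists for a tenant space. A hypothetical sketch, where the lease hours, the rate, and the runtime log are all made up for illustration:

```python
# Hypothetical sketch: billing a tenant for after-hours HVAC runtime.
# Lease hours, the hourly rate, and the runtime log are invented for
# illustration; real leases define their own terms.

from datetime import datetime

LEASE_HOURS = (8, 18)          # HVAC included 8am-6pm on weekdays
OVERTIME_RATE_PER_HOUR = 45.0  # illustrative dollar rate from the lease

# Hourly HVAC runtime log for one tenant's floor: (timestamp, ran_this_hour)
runtime_log = [
    (datetime(2024, 3, 12, 17), True),
    (datetime(2024, 3, 12, 19), True),   # after hours
    (datetime(2024, 3, 12, 21), True),   # after hours
    (datetime(2024, 3, 16, 10), True),   # Saturday -> after hours
]

def is_overtime(ts: datetime) -> bool:
    is_weekday = ts.weekday() < 5
    in_hours = LEASE_HOURS[0] <= ts.hour < LEASE_HOURS[1]
    return not (is_weekday and in_hours)

overtime_hours = sum(1 for ts, ran in runtime_log if ran and is_overtime(ts))
print(f"Overtime HVAC charge: ${overtime_hours * OVERTIME_RATE_PER_HOUR:,.2f}")
# -> Overtime HVAC charge: $135.00
```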

There's no generalized blueprint for which solutions will provide the most value and that calculation is likely to change over time (for example, the sudden interest in air quality monitoring due to COVID-19).

CTOs should focus intensely on the needs of their "customers," the end users in the organization who are starved for elegant and intuitive solutions they can start using today.

When you do that, the other pieces fall into place. Technology companies have already solved many of the challenges CTOs are working through related to APIs, integration, networking, cloud services, and cybersecurity.

What hasn't been solved is an understanding of the particular needs of the end users in each portfolio. If that can be mapped today, the goals of tomorrow become that much closer.

Terry Herr

Buildings Systems Integration and BAS Analytics

3y

Connell McGill, You are absolutely right in that there is way too much focus on AI/ML, fancy cloud apps, when the underlying brownfield (BAS controllers, sensors, actuators) and the equipment it's controlling suffers from deferred maintenance. Second good point is the building operators are key, and they need training and engagement if the tech is going to reach its potential.

Scott Kaplanis

Partner at Groundbreak Ventures

3y

simple but effective wins. understanding workflows wins.

Chris Smith

Managing Partner @ Playfair

3y

Great article Connell McGill
