What’s the relationship between the number of sensors deployed and data center capacity to support the resulting data?

It’s quite significant, certainly non-trivial from a computation perspective because of the I/O requirements. Think about 150 MB every 5 minutes for 500,000 square feet, stored for 5 or 10 years. That demands a lot of I/O capability and a lot of storage capacity, and running analytics across data sets of that size takes very significant computational horsepower. [Enlighted] will be pushing the envelope on both the computation and the storage side, because we’re heavily write-centric, whereas a more classical system is often very read-centric. We’re pouring huge amounts of sensor data into a server farm and then running analytics across it on a regular basis. When people want a monthly report of their savings, we’re sweeping across these huge data sets to deliver the before-and-after comparison. It’s very compute- and storage-intensive.

— Rich Green, Enlighted senior vice president of products and technology.
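The figures quoted above imply a rough storage and ingest budget. A quick back-of-envelope sketch, assuming the stated 150 MB per 5-minute interval is the raw, uncompressed volume for one 500,000 sq ft deployment (decimal units, no replication or compression):

```python
MB_PER_INTERVAL = 150                # stated: 150 MB every 5 minutes
INTERVAL_MIN = 5

# Intervals in a (non-leap) year: 365 days * 24 h * 60 min / 5 min
intervals_per_year = 365 * 24 * 60 // INTERVAL_MIN   # 105,120 intervals

annual_mb = MB_PER_INTERVAL * intervals_per_year      # ~15.8 million MB/year
annual_tb = annual_mb / 1_000_000                     # ~15.8 TB/year

for years in (5, 10):
    print(f"{years}-year retention: ~{annual_tb * years:.0f} TB raw")

# Sustained write rate needed just to keep up with ingest
sustained_mb_per_s = MB_PER_INTERVAL / (INTERVAL_MIN * 60)
print(f"sustained ingest: {sustained_mb_per_s:.2f} MB/s")
```

At roughly 79 TB over 5 years (or ~158 TB over 10) per building of this size, the write-centric, analytics-heavy workload Green describes becomes clear: the ingest rate itself is modest, but monthly before-and-after reports must scan ever-growing multi-terabyte histories.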