by Jon Rabinowitz | Mar 23, 2018 | Energy, ICT
A client of ours recently installed energy sensors across two areas of their facility. One area is significantly older, fitted with equipment that was good practice at the time; the other is brand new and utilises the latest advances in equipment technology. Both areas are similarly sized and perform the same operation, so measuring energy performance between the old and the new will provide our client with real insights when making future strategic decisions.

While our client operates a large portfolio of facilities around the North Island, they are using this specific site as a sandbox environment, a testbed to trial new initiatives as they look to upgrade and replace existing equipment at their other facilities.
Utilising real-time energy data to measure performance against a range of benchmarks will allow them to verify performance gains and deliver insights into which areas should be prioritised in their long-term business plan.
Non-intrusive wireless energy sensors that can be easily moved to measure other areas, combined with powerful cloud-based software reporting tools, provide a cost-effective and flexible way to build business cases.
The following article, written by Jon Rabinowitz at Panoramic Power, highlights the fact that, with Internet of Things (IoT), businesses can now test ideas in a quick and cost-effective manner while collecting valuable data for future decision making.
The Internet of Things has exploded onto the scene and with it a slew of potential business applications. In navigating this terra nova, most decision makers take their cues from the competition, afraid of wading too far into the unknown. This is reasonable, of course, but it’s also a big mistake.
Smart business owners and managers should know that they don’t need to resign themselves to the role of a follower in order to hedge their bets and mitigate their exposure to risk. You can lead and be cautious at the same time!
A False Dichotomy in Applied Internet of Things Investment
Consider, for example, the business value of smart, self-reporting assets. These assets hold the promise of constantly refined operational processes, reduced maintenance costs (as issues are caught and corrected in the earliest stages before degradation occurs), extended lifecycles and the elimination of unplanned downtime.
Still, few things ever go exactly according to plan and deliver quite as advertised. So it’s understandable that prudent decision makers might set expectations below the promised value. Add to that the fact that overhauling and replacing the entirety of your asset infrastructure is incredibly expensive and a terrible disruption to operations.
It’s easy to see why some business owners and managers might prefer to sit back and let “the other guys” take the lead in implementing Internet of Things into their business operations. But easy to see and right are two very different things.
The right approach is significantly more nuanced, as the rationale presented above is built on a false dichotomy. Your choice isn’t between sitting back and doing only what the other guy already succeeded at or totally replacing all your critical assets. There’s a world of options spanning the divide between those two.
The Golden IoT Mean: New Operational Intelligence, Old Equipment
Science and technology are both predicated on the principle of testing, and your business should be too. It always makes sense to “pilot” new technologies or techniques before deploying them at scale. Beyond that, though, using advanced Internet of Things technologies and tools, you can infuse new operational intelligence into old equipment without replacing anything.
Until your industry has reached a “mature” state in its development and integration of IoT technologies, this is the best way to mitigate risk without forfeiting access to value while it’s still a comparative advantage.
Using smart, non-intrusive energy sensors – each about the size of a 9-volt battery – you could retrofit past-gen assets to enable next-gen operational intelligence. Simply snap a sensor onto the circuit feeding the intended asset. No need to suspend operations; no need for complicated installation.
After your sensors are in place, enter the corresponding ID numbers into the mapping console. Immediately, these sensors will begin reporting granular energy data, which is pumped through an advanced machine-learning analytics platform and turned into new operational intelligence to be acted upon.
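To make the mapping-console step concrete, here is a minimal sketch in Python. The class name `SensorRegistry`, the ID `PAN-001` and the record format are hypothetical illustrations, not the real Panoramic Power console or its API; the point is simply that registering sensor IDs lets raw readings be attributed to the assets they monitor.

```python
from dataclasses import dataclass, field

@dataclass
class SensorRegistry:
    """Hypothetical mapping console: sensor ID -> the circuit/asset it feeds."""
    mapping: dict = field(default_factory=dict)

    def register(self, sensor_id: str, asset: str) -> None:
        self.mapping[sensor_id] = asset

    def ingest(self, sensor_id: str, watts: float, readings: list) -> None:
        # Attribute each raw reading to the asset the sensor was mapped to.
        asset = self.mapping.get(sensor_id, "unmapped")
        readings.append({"asset": asset, "watts": watts})

registry = SensorRegistry()
registry.register("PAN-001", "compressor-A")

readings = []
registry.ingest("PAN-001", 4200.0, readings)
```

Once every sensor ID is mapped this way, each wireless reading arrives already tagged with the asset it belongs to, which is what makes downstream analytics meaningful.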
In this manner, facility managers can give a voice to their critical assets, allowing for advanced operational automation, predictive maintenance and generally increased production.
by Jon Rabinowitz | Feb 16, 2018 | Energy, ICT
In the past couple of years, we have written a lot of commentary about how the world of IT and the world of Energy are converging. As more and more companies choose not to own and operate their IT, the role of IT departments is fast moving away from just technical support towards strategic thinking informed by data analysis. The same can be said for Energy: as companies gain access to wireless energy monitoring via cloud-based analytics, CIOs and IT managers have as much of a stake as COOs and plant managers in how the Internet of Things is rolled out.
Data is converging quickly across all areas of a business, as is the way this data is used to inform decision making and planning at both executive and operational levels. In the last 12 months, Total Utilities has installed several hundred wireless energy sensors around New Zealand. Data from these energy sensors is being used across numerous areas, from cost allocation, performance and product benchmarking, energy efficiency and sustainability reporting through to preventative maintenance and tenant rebilling.
The following article was written by Jon Rabinowitz and underlines the need for cross-functional planning and acceptance to ensure IoT success. Total Utilities can help your business through this planning process to ensure that key stakeholders are part of your IoT journey.
The most effective way to derive value from sensor technology and big data is to ensure you can analyze and act on the information gathered.
When sensors are deployed as data capture and communication instruments and paired with best-in-class Big Data analytics, the result is the lifeblood of value-driven Internet of Things technology. Unfortunately, more often than not, IoT systems are rolled out without proper planning, simply so some manager can check a box and say that he or she undertook an IoT initiative.
When organizations are driven by hype, that value-generating substance generally falls by the wayside. Too often this is the case with IoT projects pursued for their buzz-worthiness, without due attention paid to the smart sensor and Big Data nuts and bolts of the thing.
Indeed, IoT for IoT’s sake can only get you so far. Ganesh Ramamoorthy, a principal research analyst at Gartner, told The Economic Times that “8 out of 10 IoT projects fail even before they’re launched.” If we’re to believe that figure, it raises the question: what are companies doing wrong?
Why Companies Fail In Putting Sensors and Big Data to Work
Compelled by topical hype, there’s often a sense of urgency associated with IoT projects. According to Mark Lochmann, a senior consultant at Qittitut Consulting, that self-imposed urgency gets the best of organizations. In an interview with IndustryWeek, Lochmann explains that many enterprises “dive headfirst” into Internet of Things projects without really understanding how the technology affects their operations.
Too often, decision-makers get themselves drunk on buzz and embark on technology rollouts without really understanding what needs to happen and how that tech needs to tie back into business operations in order to derive value. They simply neglect to consider how installing hundreds, possibly thousands, of sensors across their business is supposed to translate into an improved bottom line.
To avoid missteps, be sure to consider the following practical factors:
- Where the sensors' data will be processed (point of creation, edge devices, in the Cloud, etc.).
- Which tools analysts need to visualize and interpret data.
- How the company will validate and cleanse sensor information.
- The data governance policies that will protect data.
- The infrastructure necessary to support analysis (data warehousing, data marts, extract-transform-load (ETL) servers, etc.).
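The validation and cleansing point, in particular, is easy to underestimate. As an illustration only (the record format and the plausibility threshold below are assumptions, not vendor defaults), a first-pass cleanse might simply drop readings that are missing or physically implausible:

```python
def cleanse(readings, max_watts=100_000.0):
    """Drop sensor readings that are missing or physically implausible.
    The max_watts threshold is illustrative; a real deployment would set
    it per circuit from the breaker rating."""
    clean = []
    for r in readings:
        w = r.get("watts")
        if w is None:               # missing payload, e.g. a dropped packet
            continue
        if w < 0 or w > max_watts:  # outside the plausible range for the circuit
            continue
        clean.append(r)
    return clean

raw = [{"watts": 4200.0}, {"watts": None}, {"watts": -5.0}, {"watts": 3100.0}]
clean = cleanse(raw)  # keeps only the two plausible readings
```

Even a filter this simple prevents a single faulty sensor from skewing benchmarks across an entire site.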
Research from PricewaterhouseCoopers and Iron Mountain reveals that, lacking the wherewithal on these basic project components, only 4% of enterprises “extract full value” from the data they own. However, it’s not just planning that’s holding companies back. The same study found that an additional 36% of businesses were hamstrung, regardless of how thoroughly they’d thought through the project, by system and resource limitations.
That paints a rather bleak picture, but there’s no reason to despair. The above statistics notwithstanding, there remains plenty that conscientious managers can do to ensure the success of their integrated sensor and Big Data initiatives.
Start With Specific Use Cases, Then Dig Deeper
Many organizations don’t know how sensors and big data will impact their data centres.
Managers leading sensor and Big Data projects need to outline specific use cases before even thinking about implementation.
A smart manager will have a firm handle on how such technology is likely to impact operations on a day-to-day level – not based on intuition or imagination but on data. Better yet, that manager will set very specific expectations, delivery processes and timelines for what he or she wants to achieve on the back of sensor and big data technology.
For example, let’s say you want to install advanced wireless sensors in your facility. The person leading the project should explicate his or her intention to leverage collected data in order to:
- Diagnose problems with equipment to predict failures.
- Analyze the production efficiency of each asset.
- Determine how many tons of material you process on an hourly basis.
- Monitor worker health and locations across the facility.
Of course, not all data is equal, and much of that difference comes down to data type. Data pertaining to equipment functionality, it should be understood, is among the most important and time-sensitive. Such data should be processed immediately, given that it can tell the story, in real time, of costly malfunctions.
Some of your data will demand review in real time, some every half hour and some every other month. Most will fall somewhere between those poles. Regardless of the data type, though, you’ll need to be sure that when you consult it, it conveys something that is actually meaningful and actionable. For this, you’ll need to plan out and gain relative mastery over your processing system.
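One way to picture this is as a routing table from data type to review cadence. The tier names and data types below are illustrative assumptions, not a prescribed scheme; the point is that the cadence decision is made explicitly, per data type, before rollout:

```python
# Illustrative review cadences per data type; a real plan would list
# every stream the sensors produce.
CADENCE = {
    "equipment_fault": "real-time",
    "production_rate": "half-hourly",
    "sustainability_report": "monthly",
}

def route(record):
    """Return the processing tier for a record based on its data type.
    Unknown types fall back to the middle tier rather than being dropped."""
    return CADENCE.get(record["type"], "half-hourly")

alarm = {"type": "equipment_fault", "watts": 9800.0}
tier = route(alarm)  # equipment faults go to the real-time tier
```

Writing the table down forces the planning conversation the article describes: someone has to decide, stream by stream, what "immediately" actually means.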
This means carefully choosing an analytics solution or a combination of complementary solutions. It may also mean hiring an in-house data scientist or simply bringing in outside help for training purposes. In some cases, it will even mean building out an on-site information network. In every case, it will mean instilling a data- and value-driven culture, where employees not only have access to tools but know when and how to properly use them.
Remember, there’s no such thing as too much research and that hype can be as dangerous as it is enticing. Bearing that in mind, the world of IoT is big and wide and waiting for you!
by DavidSpratt | Jul 24, 2017 | Energy
The boundaries between the physical, digital and biological worlds are breaking down, giving way to a new world of computer-based business known as cyber-physical systems. These systems are characterized by the merging of physical, digital and biological realms in profound ways, with artificial intelligence (AI) serving as the primary catalyst of this transformation.
Klaus Schwab, Chairman and Founder of the World Economic Forum wrote:
We are at the beginning of a revolution that is fundamentally changing the way we live, work, and relate to one another. In its scale, scope and complexity … the fourth industrial revolution is unlike anything humankind has experienced before.

We have all heard this kind of hyperbole before. So why should this matter and what are local companies doing to address the issues?
Beyond hyperbole
It matters because we have already seen our lives changed by these tools in the most dramatic fashion. The last presidential election in the USA was directly affected by the use of AI.
These tools were used to identify and directly address those electors who were undecided or felt strongly about key issues. These powerful compute engines, combined with good old-fashioned phone calls and door knocks, meant voters were either encouraged to vote by “people like them” who knocked on the door (e.g. a young mum talking to a young mum) or discouraged from voting by messages directed at them about the futility of “rigged” elections.
Leveraging AI in the New Zealand Business Context
Politics and business are uneasy bedfellows, so I will get back to the brief. How do Kiwi companies respond to international and local competitors who already understand artificial intelligence and are bringing it to bear on our competitive landscape?
Let’s start with energy. It is one of the most fundamental parts of any business. Most of us just focus on getting a cheap price for electricity or gas and then move on to running the enterprise day to day.
This approach just won’t work when machines are making the micro-decisions that can mean success and failure.
International Competition
Consider some of our major New Zealand computer companies. They are in a life or death struggle with public cloud providers like Microsoft, Amazon and Google. These gigantic multinationals have access to all the tools mentioned above and even deliver them “as a service” to companies everywhere. Competing with organisations like this is not just a question of having good people or getting the best price for inputs. It is about innovation and very, very careful monitoring of all the inputs and outputs, including energy.
One of the most brutally competitive battlegrounds is the provision of data centre services to the business market. In the past ten years companies like Datacom and Spark have invested hundreds of millions of dollars in state-of-the-art data centres. These data centres require huge amounts of energy to keep them running.
New Zealand’s advantage
New Zealand has a natural advantage because over 80% of our energy is generated from renewable sources. In the years ahead this will become both a cost and a strategic advantage.
As the fourth industrial revolution unfolds, New Zealand’s energy advantage will drive our strategic advantage in data centres. Don’t believe me? Microsoft recently announced that a key new measure for its Azure data centres is energy inputs to data outputs. Microsoft has thus directly linked energy usage to compute efficiency, defined in terms of services delivered.
So what are our Kiwi companies doing to compete on this stage? Both Datacom and Spark use tools like artificial intelligence to monitor, control and measure their energy inputs. Historically it was simply a case of installing a few sub-meters and a cost calculator (macro energy measurement). Today these companies aim to measure and monitor right down to the lightbulb (micro monitoring). The rise of the Internet of Things has made this ability to micro monitor even greater. As new data centres and factories are built across the country, architects are being required to include in their plans tools and products that embed the Internet of Things, artificial intelligence and micro monitoring in the very fabric of the design.
Companies building factories in this country that do not think of micro monitoring of energy use as a strategic tool should be reminded that in the last few decades we exported more manufacturing jobs overseas than we created in all of IT. Ignorance of the strategic possibilities of micro energy monitoring is wilful blindness in a world where the Fourth Industrial Revolution is not only upon us but rapidly transforming the competitive landscape we work in.
by pushkar | Jul 22, 2017 | Energy
Investment in energy monitoring has traditionally been dominated by lengthy CAPEX discussions and the technical specifications of proposed monitoring infrastructure, which means spending more cash to find out where cost savings might be made. Little thought was ever given to the data output and associated software – most competing products delivered similar back-end services and data displays, which required users to export data to CSV format before being able to really interrogate it.
That’s changing with Panoramic Power smart sensors available now in New Zealand, through Total Utilities.

My colleague David has previously written a series of articles regarding the rise of artificial intelligence algorithms and how major corporations are using these to exploit customer data and drive behaviour. Data obtained from raw internet traffic, page clicks, key search words and online transactions is now being structured by algorithms in order to deliver insights and show trends. Further to this, the data is normalised by user-defined groups and then compared.
Smart recommendations for energy flows
If Amazon or Apple can recommend a book or record that I might like, why can’t my energy monitoring software make recommendations? And if the data is all I really need, why should I have to purchase a very expensive metering asset that may only be required for 12 months? Or if my usage pattern changes, why can’t I quickly adjust my monitoring setup?
Total Utilities encounters clients every day who operate energy-intensive equipment. While the type of equipment varies greatly, from production and manufacturing applications to cold storage and commercial buildings, the issues remain the same. Clients need real-time visibility of where energy is being used so that they can make strategic decisions and act quickly to save money.
Further to this, they need to know when energy intensive systems are under stress and may require attention outside of their normal maintenance cycle. They want the ability to see their energy flow within their site in various graphical formats and to be able to benchmark their HVAC or compressors across multiple facilities.
Total Utilities use Panoramic Power’s IoT (Internet of Things) sensor technology and cloud based analytics to help customers understand energy consumption.
Fast to install, fast to get benefits
Worldwide there are eight billion data points per month across 800 sites in 30 countries, with more being added in New Zealand every week.
Forty smart sensors were installed in 1.5 hours at an Auckland CBD site the week before last, and fifty were installed in 2 hours at a site in West Auckland on Friday. Each sensor is clamped onto the outgoing electrical wires of a customer’s distribution board. This eliminates expensive wiring, investment in new panels, lengthy shutdowns and IT connections, and reduces health and safety risks. Once installed, each sensor monitors the flow of electricity, sending information wirelessly to the cloud-based analytics platform every ten seconds.
With such ease of installation, combined with effective data presentation, potential energy savings can be identified quickly by pinpointing specific areas for further investigation. It took less than two days for Total Utilities to identify that the lighting of a commercial building was switching on at 2am and running for two hours every morning, despite the BMS (Building Management System) showing all lights were off. Simple measures were implemented quickly, which means the energy monitoring system has already paid for itself.
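A check like that 2am lighting discovery can be sketched in a few lines. The off-hours window, the threshold and the readings below are invented for illustration, not the actual site data or the platform's real algorithm:

```python
def off_hours_anomalies(readings, off_start=22, off_end=6, threshold_kw=1.0):
    """Flag hourly lighting readings where the load exceeds a threshold
    during hours the schedule says lights should be off.
    Window and threshold are illustrative, not BMS defaults."""
    flagged = []
    for hour, kw in readings:  # (hour of day, average kW in that hour)
        in_off_window = hour >= off_start or hour < off_end
        if in_off_window and kw > threshold_kw:
            flagged.append((hour, kw))
    return flagged

# Simulated daily profile: lights drawing ~8 kW at 02:00-03:00
# despite the schedule saying they are off overnight.
profile = [(0, 0.2), (2, 8.1), (3, 7.9), (9, 8.5), (14, 8.3), (23, 0.3)]
anomalies = off_hours_anomalies(profile)
```

Run daily against interval data, a rule like this surfaces exactly the kind of schedule drift the BMS itself failed to report.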
While the above is a relatively rudimentary and common example, Total Utilities can just as easily correlate key variables such as chiller temperature against outdoor temperature on a monthly, daily and hourly basis across multiple sites located throughout NZ, without the need for painstaking manual calculations. We then deliver clients meaningful information and advice quickly so they can act and make significant energy savings.
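At its core, that chiller-versus-outdoor-temperature comparison is a correlation over paired series. A minimal sketch, using made-up figures rather than real client data:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

outdoor_c = [14.0, 18.0, 22.0, 26.0, 30.0]          # illustrative daily highs
chiller_kwh = [210.0, 250.0, 300.0, 340.0, 390.0]   # chiller energy, same days
r = pearson(outdoor_c, chiller_kwh)  # near 1.0: load tracks temperature
```

A correlation near 1.0 says the chiller is simply responding to the weather; a weak or shifting correlation is the cue to look for a fault or a control issue.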
Total Utilities believes that traditional energy meters are merely becoming a means to an end as clients engage us for the value we create with intelligent data and analytics.
by DavidSpratt | May 5, 2017 | Energy, ICT
In the first two parts of this series I looked at what the internet of things (IoT) actually is and then at the Energy Management possibilities for businesses competing on the world stage.
In this, part three, of the series we get down to the nitty gritty of Manufacturing Production Management and how measuring energy flows and consumption can inform critical decisions.
I could make this complicated but if we really get down to the basics there are three main categories that require constant attention in a production environment: People, Processes and Technology.
People – Creating Feedback Loops
In the context of production management, one of the most important variables is the performance of individual staff members. How someone uses equipment, works within a team and learns to adapt to new systems can make the difference between a highly profitable unit and one that is not.
What drives people’s decisions and actions will often come down to feedback loops. By using the Internet of Things to deliver energy monitoring information we can give people useful data about what is happening on their production line.
For example, if we can demonstrate that the team’s correct use of energy efficiency tools delivers a better product, this not only reinforces their behaviour, it also opens up the opportunity for them to take this information and find even better ways to improve efficiency.
How we use technology can be directly connected to the energy a unit or group of units consumes. If a lathe is running at full tilt throughout an eight-hour shift, does that necessarily mean that unit is being properly used? Or is the operator just running it on full because that is what they were told to do when they first started, years ago?
People make decisions at work every day. Well-designed feedback loops will inform those decisions more effectively and, in doing so, improve performance, job satisfaction and company results.
Processes
Every experienced Production Manager can tell you that each team on a production line is quite different and that their results vary considerably. The bigger question is: what causes this? By monitoring the flow of individual products and components through a production line we can identify bottlenecks, part shortages and defects quickly and effectively. With the Internet of Things we can break down a process into its smallest components, create various quality checkpoints along the way and eventually ensure near-100% accuracy, complete adherence to standards and instant identification of faults and tolerance variances.
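As a simple illustration of checkpoint-based bottleneck spotting (the stage names and counts here are invented, not a real line), the bottleneck is just the checkpoint with the lowest throughput:

```python
def find_bottleneck(throughput):
    """Given units/hour counted at each checkpoint along a line,
    return the checkpoint with the lowest throughput."""
    return min(throughput, key=throughput.get)

# Illustrative hourly counts from checkpoint scanners on one line.
counts = {"cutting": 120, "welding": 95, "painting": 60, "packing": 110}
bottleneck = find_bottleneck(counts)  # the stage throttling the whole line
```

Once every stage reports counts automatically, this comparison can run continuously instead of waiting for an end-of-shift reconciliation.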
Many would suggest that this is already the case in many well-run sites. But what happens off site with the parts we order and the products we ship?
The key to the Internet of Things is that our production process begins at the point where a component is ordered, runs right through the creation of unique SKUs or products, and continues all the way to the end user’s home, office or factory. This is because we can now potentially track billions of components throughout the supply chain at a cost much lower than we ever thought possible.
Technology
Remember, the Internet of Things is not just adding an RFID tag to a unit and tracking it. It is about potentially billions of components that communicate: back to a central point, with multiple other components, with the warehouse, with the truck and, of course, with the end user.
By integrating Internet of Things components to the technologies we use in making things we can establish how one production line consumes energy at a fairly constant rate while another’s consumption appears to ebb and flow throughout the day.
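That constant-versus-ebb-and-flow comparison can be quantified with the coefficient of variation of each line's load. A minimal sketch with invented readings:

```python
from statistics import mean, pstdev

def variability(kw_readings):
    """Coefficient of variation: standard deviation relative to mean load.
    A high value signals ebb-and-flow consumption worth investigating."""
    return pstdev(kw_readings) / mean(kw_readings)

line_a = [50.0, 51.0, 49.5, 50.5, 50.0]   # steady line
line_b = [20.0, 80.0, 35.0, 70.0, 25.0]   # fluctuating line

cv_a = variability(line_a)  # small: consumption is essentially constant
cv_b = variability(line_b)  # large: something is driving swings in load
```

A single number per line makes the fluctuating one easy to rank and flag, which is the trigger for looking into operator error, a malfunctioning device or inconsistent parts.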
In this instance we might have identified human errors, a malfunctioning device or quality issues with the parts or components being used on this line.
For the first time in our history we can easily measure our technology’s performance down to the tiniest detail. We are no longer limited by the number, location or stage of development of any component.
The Internet of Things opens a world of opportunities for us to deliver better quality at a lower cost and more reliably than ever before.