Smart Factory and the future of energy

The manufacturing industry accounts for much of the world’s energy consumption. In 2021, manufacturing accounted for a whopping 33% of all energy consumption in the U.S., according to the U.S. Energy Information Administration. In Norway, industry uses almost twice as much energy as private individuals annually*. By taking steps to reduce energy consumption, manufacturing companies can make a major impact on total consumption worldwide.

* Based on calculation with average figures (2022) from Statistics Norway.

Many manufacturing companies have already adopted sustainability strategies to reduce consumption and emissions, and more and more are trying to get started. With the ongoing energy crisis and rising energy costs, many manufacturing companies are dependent on reducing their energy consumption to remain competitive – or even survive. In addition, the industry faces stricter sustainability regulations and requirements, as well as more environmentally conscious consumers. The time to start producing more sustainably is now.

Use digital tools to implement sustainability strategies on the plant floor

Although many manufacturing companies already have sustainability strategies in place, the practical challenge is implementing the strategy on the plant floor. In order to produce more sustainably, it is crucial that production personnel have access to information on a daily basis. Only with insight into energy and raw material consumption can measures be taken to optimise production.

To solve this, you should use digital tools that give operators and other personnel the information they need as they work on the production process. Access to both real-time and historical data makes it possible to make both immediate and long-term improvements in production related to energy and raw material consumption, defects, traceability and more.

3 steps to reduce energy consumption

How can digital tools be used in practice? Below, we share 3 steps for mapping and optimizing energy consumption in the production process.

Step 1: Map – “Are we using too much energy?”

See your consumption and spend in real time, compared with normal levels, goals or budget; a simple sketch of this kind of check follows the list below.

  • Monitor consumption related to process areas and production lines
  • Record events in production
  • Record shifts, time of day and weather conditions
  • Compare performance across plants, products, and manufacturing teams
  • See consumption compared to sustainability KPIs (e.g. production carbon emissions)
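
To make Step 1 concrete, here is a minimal sketch of the “are we using too much energy?” check expressed in code. The area names, target values and alert threshold are illustrative assumptions rather than part of any specific Proficy configuration.

```python
from dataclasses import dataclass

@dataclass
class EnergyReading:
    area: str          # process area or production line
    kwh: float         # measured consumption for the interval
    target_kwh: float  # normal or budgeted consumption for the same interval

def flag_overconsumption(readings, tolerance=0.10):
    """Return areas whose consumption exceeds target by more than `tolerance`."""
    alerts = []
    for r in readings:
        deviation = (r.kwh - r.target_kwh) / r.target_kwh
        if deviation > tolerance:
            alerts.append((r.area, round(deviation * 100, 1)))
    return alerts

# Illustrative values only
readings = [
    EnergyReading("Line 1", kwh=1250.0, target_kwh=1100.0),
    EnergyReading("Line 2", kwh=980.0, target_kwh=1000.0),
]
print(flag_overconsumption(readings))  # [('Line 1', 13.6)]
```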

Step 2: Explain – “Why are we using extra energy?”

Leverage context from the plant floor to understand how to improve resource efficiency.

  • Map the resource consumption of all products
  • Find inefficient equipment
  • Discover unknown patterns, waste or opportunities for improvement
  • Contextualize data to manage sustainability KPIs
  • Use best practice to standardize operations

Step 3: Optimize – “How can we reduce energy consumption and costs?”

Take actions that improve operational performance and sustainability, both at the process level and throughout the plant.

  • Optimize production planning for better utilization of resources
  • Reduce resource consumption and associated costs
  • Reduce variations in production processes
  • Make your supply chain more agile and resilient
  • Ensure holistic optimization of the entire production environment

Sustainable production with Proficy Smart Factory

GE Digital’s Proficy Smart Factory software comes with all the features you need to gain insight into the manufacturing process and take action for more sustainable production. Already using an MES solution from GE Digital? Then you have all the tools you need at your fingertips!

Via the web-based dashboard platform Proficy Operations Hub, you can access visualized data anywhere, anytime. Below are six widgets that can be used to gain insight into the energy and raw material consumption of the production process.

Proficy Operations Hub widgets

Sparkline

Displays time series data. Can be used in several areas:

  • See energy or water consumption over a period of time
  • Correlate energy or water consumption to temperatures/precipitation/weather conditions over a period of time

Bullet Graph

Displays target value and real value.

  • See your energy consumption compared to normal consumption, budgeted consumption or goals

Bar Gauge, Circular Gauge and Solid Gauge

Three widgets with different visualizations of a value compared to a target.

  • See your energy consumption compared to normal consumption, budgeted consumption or goals

Pie Chart

Displays data values in a pie or doughnut chart. Can be used in several areas:

  • Illustrate how consumption of e.g. energy and water affects total costs and greenhouse gas emissions
  • Show most energy-intensive processes
  • View material consumption

Join us in reversing the trend

According to Statista, energy consumption in industry is expected to continue to increase in the coming years. This is despite an increased focus on sustainability and several challenges for the manufacturing industry, including increased energy costs and stricter regulatory requirements.

With the right tools in place, you can make a difference – both for the environment and your own business. Do you want to help reverse the trend and work for a more sustainable industry? We’ll help you get started!

Ask us about Smart Factory and sustainability

Data capture and regulatory reporting

Data capture is critical when you’re looking to drive continuous improvements in manufacturing, and it is equally crucial for regulatory compliance. In this article, we’ll look at how intelligent systems can not only streamline the capture of data required for quality management and regulatory compliance in regulated industries, but ensure faster, easier use (and re-use!) of data once captured.

Does your business track critical control points? If so, are you able to retrieve that information quickly and easily? With the right sensors, platforms and software solutions, the necessary quality parameters can be continuously captured, with alerts generated in a timely manner for any deviation from specification. This can mean the difference between a batch of good quality and finished goods that require time and energy to rework to the appropriate standard.
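
As a rough illustration of this kind of continuous in-process check, the sketch below compares captured quality parameters against specification limits and raises an alert on any deviation. The parameter names and limits are assumptions for illustration; a real deployment would take them from the control system and the product specification.

```python
SPEC_LIMITS = {  # illustrative critical control points: (low, high)
    "pasteuriser_temp_c": (72.0, 78.0),
    "fill_weight_g": (498.0, 510.0),
}

def check_control_points(sample):
    """Return a deviation message for every value outside its specification limits."""
    deviations = []
    for parameter, value in sample.items():
        low, high = SPEC_LIMITS[parameter]
        if not (low <= value <= high):
            deviations.append(f"{parameter}={value} outside [{low}, {high}]")
    return deviations

sample = {"pasteuriser_temp_c": 71.2, "fill_weight_g": 503.5}
for message in check_control_points(sample):
    print("ALERT:", message)  # in practice this would be routed to an operator alarm
```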

Furthermore, automatically performing regular in-process checks can improve the efficiency of operators and reduce the chances of incorrect data being captured or recorded, which could lead to unnecessary work. With solutions from Novotek, we can help you start the journey to a fully automated quality management system.

Automated quality management reduces waste, increases yield and provides data for root cause analysis.

As all production processes consume raw materials, the exact nature and variability of these materials and the quantities used can have a significant impact on the quality of the finished product. Automatically adjusting the production setpoints to cater for the individual characteristics of raw materials can lead to a more consistent output.
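
A minimal sketch of what “automatically adjusting production setpoints” could look like, assuming a single material property (moisture content) and a single setpoint (dryer temperature). The nominal values and sensitivity are invented; a real adjustment would come from a validated process model rather than this simple linear rule.

```python
NOMINAL_MOISTURE_PCT = 12.0   # assumed nominal raw material moisture
NOMINAL_DRYER_TEMP_C = 85.0   # assumed nominal dryer setpoint
TEMP_PER_MOISTURE_PT = 1.5    # assumed sensitivity: degrees C per % moisture deviation

def dryer_setpoint(material_moisture_pct):
    """Shift the dryer temperature setpoint to compensate for wetter or drier material."""
    deviation = material_moisture_pct - NOMINAL_MOISTURE_PCT
    return NOMINAL_DRYER_TEMP_C + TEMP_PER_MOISTURE_PT * deviation

print(dryer_setpoint(14.0))  # 88.0 – a wetter lot, so run the dryer hotter
```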

By continuously capturing quality data through intelligent systems, you have the tools to perform a historical review of production performance based on different batches of raw materials. You may have implemented a historian, a lab system, even a performance metrics system already – but what if the information is in isolated silos that are not easily accessed? In these kinds of situations, we can take advantage of innovations in technology that may have been born outside the factory, but can offer value within the production world.

The Industrial Internet of Things (IIoT) is often understood to mean the devices and sensors that are interconnected via computers with industrial applications. In fact, it also includes the kind of data management platforms, “data ops” tools and methodologies that make managing and using industrial data easier. Although IIoT may sometimes appear vast and daunting, through an iterative and scalable process you will rapidly see tangible results in reducing workloads, with an innovative platform for better quality and improved compliance with your industry’s regulations and standards. Linking together the disparate assets and data stores in your operation provides vital visibility, both in real time and over the history of your production process.

Your data collection and collation processes are streamlined and automated through this connectivity, facilitating the generation of electronic batch records (EBRs) that can be used to satisfy regulatory compliance. Modern data ops tools, combined with low-code app development tools, make it straightforward to combine data from siloed systems into intuitive user interfaces that make reviewing data against specific dates, batch codes, raw material lot numbers or other production parameters more accessible and understandable.
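
As a hedged sketch of the “collect once, review by batch” idea, the snippet below joins records from two notional data stores (process history and lab results) on a batch code to assemble the view a reviewer or auditor would ask for. The table and column names are invented for illustration and do not refer to any particular product’s schema.

```python
import sqlite3

# In practice these rows would come from a historian and a LIMS; here they are in-memory examples.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE process_history (batch_code TEXT, avg_temp_c REAL, mix_time_min REAL);
    CREATE TABLE lab_results (batch_code TEXT, viscosity_cp REAL, result TEXT);
    INSERT INTO process_history VALUES ('B-1042', 64.8, 32.0);
    INSERT INTO lab_results VALUES ('B-1042', 1180.0, 'PASS');
""")

# One query assembles the batch record view for a given batch code.
rows = conn.execute("""
    SELECT p.batch_code, p.avg_temp_c, p.mix_time_min, l.viscosity_cp, l.result
    FROM process_history p
    JOIN lab_results l ON l.batch_code = p.batch_code
    WHERE p.batch_code = ?
""", ("B-1042",)).fetchall()
print(rows)  # [('B-1042', 64.8, 32.0, 1180.0, 'PASS')]
```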

And this approach suits additional needs: compliance with standards and regulations is vital for the image of your operation. A tailored solution meets your requirements, from recording that required temperatures were reached to capturing exact measurements when combining the right amount of ingredients at the right time. With our solutions, you can rest assured that you have access in perpetuity to every detail of what you’ve produced. That in turn means being able to support investigations and deliver reporting to fulfil obligations, for both internal and external stakeholders.

Smart systems offer robust methods for ensuring regulatory compliance

Many manufacturers are both blessed and cursed with an ever-growing flow of potentially useful data. We see our role as being to provide guidance on the best way to tap that flow so that many different stakeholders can be served for the least incremental cost, and the least disruption to existing OT and IT. Thanks to the modernisation of plant systems, and increasing adoption of IIoT-style tools and platforms, our customers can put their data into the right hands at the right time more easily than ever!

MES: Build vs Buy

Every manufacturing operation requires communication and the sharing of data. In the past, data was manually recorded with pen and paper and shared at the walking speed of an operator.

The industry has come a long way since then, with forward-thinking operations undertaking digital transformation journeys to unlock greater efficiency, visibility and the capabilities required for continual improvement and profitability in the contemporary manufacturing landscape. 

However, not all approaches yield the same results. While point solutions that provide new capabilities for individual functions in your manufacturing operation may seem a sensible way to begin a digital transformation journey, there are a number of issues to consider.

Building MES functionality with point solutions requires careful consideration. The pitfalls are all too common, resulting in delayed progress and increased costs versus a single platform.

If you were to consider the data flow in your operation like plumbing in a house, concerns about differing approaches would soon become apparent. As numerous plumbers from different companies arrive to distribute water and heating around your home, difficulties reconciling differences between pipe diameters, connectors and joining mechanisms would result in burst pipes and water everywhere. 

Amongst the issues that come with composite systems is security. While plumbing together these systems, how do you consider cybersecurity with due diligence? Should you experience a cybersecurity threat, a growing and tangible danger, which vendors would you call for support? 

Vulnerabilities can reveal themselves when disparate solutions take diverging paths, at an incongruent pace, through their product roadmaps. The result is a constantly changing landscape in which your platform can continuously fall out of sync with its various component solutions, requiring constant attention and maintenance. That is not to mention the security risks of each system requiring different access routes in and out of information silos, which requires careful consideration as increased connections mean more potential attack vectors.

Bad actors take advantage of vulnerabilities in poorly secured systems

How do you ensure consistency and implementation of standards across vendor organisations? Best practice becomes challenging to implement, with no single approach for your entire MES system. Learning each solution will require separate training courses, resulting in increased time to competency for your operators.

Implementing a single platform that provides seamless connectivity, efficiency management, quality management and production management solutions in addition to a raft of other capabilities rather than numerous point solutions avoids the headache. 

Where do you begin if you choose to implement a complete and integrated MES solution from a single vendor? The good news is that independent analysts have done a lot of homework for you. Gartner has asked vendors the tough questions to independently test the product and ensure confidence in connectivity, security, training, and the product’s roadmap.  

Novotek is the only Premier Solutions Partner for GE in the United Kingdom. With extensive experience and expertise in delivering and exceeding customers’ ambitions, Novotek has helped many manufacturers achieve greater profitability and efficiency with GE Digital products. 

So, what does Gartner have to say about GE Digital? 

“GE Digital is a Leader in this Magic Quadrant.” Gartner’s Magic Quadrant considers the completeness of a vendor’s vision alongside their ability to execute that vision to sort vendors between Leaders, Challengers, Niche Players and Visionaries.  

Gartner has highlighted strengths such as innovation, product improvements and customer experience as factors in GE Digital serving as a leading platform in the MES space. 

As systems trend towards more and more connectivity, owing to the significant value offered by data analysis for operational improvement, implementing unconnected or imperfectly deployed point solutions can put your operation on the back foot competitively. Additionally, a consistent naming structure and technical ontology are required to ensure systems can communicate flawlessly. This is inherent in a complete MES solution, but your team must consider and continuously monitor a collection of point solutions to achieve compatibility. 

Another downside to such an approach is paying multiple times for the same service. When deploying a point solution, each integration will require design, testing and implementation phases – each made more challenging by the need for each team to consider the other’s work, compatibility, language and methodology. 

GE Digital’s Plant Apps cover the following functionality as a rounded MES platform: 

  • Dispatching – Distributing work orders based on transactional data and demand 
  • Execution – Managing the production process
  • Data Management – Enabling the collection and management of data at regular intervals from all connected assets. 
  • Operational Data Store – Readily tailorable for purpose, MES can serve as a relational database for operational data or integrate with a data historian or IIoT platform. 
  • Quality Management – Regulated industries and products can benefit from standardisation and data capture to ensure compliance. 
  • Process – MES ensures all manufacturing steps are undertaken correctly, with the correct raw materials, temperatures, times, etc. 
  • Traceability – The ability to track the entire process from raw materials to intermediate and finished goods by lot, batch number, or other signifiers. 
  • Analytics and Reporting – Dashboard displays, advanced analytical tools and real-time KPIs provide data for accurate decision support. 
  • Integration – MES can bring together many disparate systems to create something greater than the sum of its parts. Tying together all production levels with enterprise systems, site planning, bill of materials, and recipe planning. 

With a single platform, the Novotek team will tailor the solution to your individual needs within a coherent integration process. With a project undertaken in an orderly way – and to return to the analogy of tradespeople in the home – you can be sure your plasterers, painters and plumbers aren’t tripping over each other.

DataOps: The Fuel Injectors For Your Transformation Engine?

Data – everyone agrees it’s the fuel for the fires of innovation and optimisation. The industrial world is blessed with an abundance of rich, objective (being machine-generated) data, so it should be well equipped to seek new advantages from it. Too often, the first efforts an industrial firm takes to harness its machine and process data for new reporting or advanced analysis initiatives involve simple use cases and outputs that can mask what it takes to support a mix of different needs in a scalable and supportable way. Data Ops practices provide a way of systematically addressing the steps needed to ensure that your data can be made available in the right places, at the right times, in the right formats for all the initiatives you’re pursuing.


Industrial data (or OT data) poses particular challenges that your Data Ops strategy will address:

  • It can be generated at a pace that challenges traditional enterprise (or even cloud-layer) data collection and data management systems (to say nothing of the costs of ingestion and processing during reporting and analysis that are typical of cloud platforms).
  • The data for functionally identical assets or processes is often not generated in a consistent structure and schema.
  • OT data generally does not have context established around each data point – making it difficult to understand what it represents, let alone the meaning inherent in the actual values!
  • Connecting to a mix of asset types with different automation types and communications protocols is often necessary to get a complete data set relevant to the reporting or analytics you’re pursuing.
  • A wide array of uses demands different levels of granularity of some data points and a breadth of collection points that is significantly wider than many individual stakeholders may appreciate.

These are the reasons why in many firms, the engineering team often ends up becoming the “data extract/Excel team” – their familiarity with the underlying technology means they can take snapshots and do the data cleansing necessary to make the data useful. But that’s not scalable, and data massaging is a far less impactful use of their time – they should be engaged with the broader team interpreting and acting on the data!


Data Ops – Quick Definition

There’s no one way to “do” Data Ops. In the industrial world, it’s best thought of as a process involving:

  • Determining the preferred structures and descriptions (models) for OT data, so it may serve the uses the organisation has determined will be valuable.
  • Assessing what approaches to adding such models can be adopted by your organisation.
  • Choosing the mix of tools needed to add model structures to a combination of existing and new data sources.
  • Establishing the procedure to ensure that model definitions don’t become “stale” if business needs change.
  • Establishing the procedures to ensure that new or changing data sources are brought into the model-based framework promptly.


A Rough Map is Better Than No Map.

Take a first pass at capturing all the intended uses of your OT data. What KPIs, what reports, what integration points, and what analytics are people looking for? Flesh out those user interests with an understanding of what can feed into them:

  1. Map the different stakeholders’ data needs in terms of how much they come from common sources, and how many needs represent aggregations, calculations or other manipulations of the same raw data.
  2. Flesh out the map by defining the regularity with which data needs to flow to suit the different use cases. Are some uses based on by-shift or daily views of some data? Are other uses based on feeding data in real time between systems to trigger events or actions?
  3. Now consider what data could usefully be “wrapped around” raw OT data to make it easier for the meaning and context of that data to be available for all. Assess what value can come from:
    1. Common descriptive models for assets and processes – a “Form Fill & Seal Machine” with variables like “Speed” and “Web Width” (etc.) is a far easier construct for many people to work with than a database presenting a collection of rows reflecting machines’ logical addresses, with a small library of cryptically structured variables associated with each one (see the sketch after this list).
    2. An enterprise model to help understand the locations and uses of assets and processes. The ISA-95 standard offers some useful guidance in establishing such a model.
    3. Additional reference data to flesh out the descriptive and enterprise models (e.g. the make and model of common asset types available from many vendors, or information about a location such as latitude or elevation). Be guided by what kind of additional data would be helpful in comparing, contrasting or investigating differences in outcomes that need to be addressed.
  4. Now assess what data pools are accumulating already – and how much context is accumulating in those pools. Can you re-use existing investments to support these new efforts, rather than creating a parallel set of solutions?
  5. Finally, inventory the OT in use where potentially useful data is generated, but not captured or stored; particularly note connectivity options.
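
To make the modelling idea tangible, the sketch below expresses a descriptive asset model, a slice of an ISA-95-style enterprise hierarchy and the binding of raw tag addresses to model variables as plain data structures. The machine type and variable names are the illustrative ones used above, not a prescribed schema.

```python
# Descriptive model: what a "Form Fill & Seal Machine" looks like, independent of any one machine.
asset_models = {
    "FormFillSealMachine": {
        "variables": ["Speed", "WebWidth", "SealTemperature"],
        "reference_data": ["Make", "Model"],  # extra context useful for comparisons
    }
}

# Enterprise model: an ISA-95-style Site / Area / Line / Asset hierarchy.
enterprise_model = {
    "Site A": {
        "Packing Hall": {
            "Line 1": {"FFS-01": "FormFillSealMachine"},
            "Line 2": {"FFS-02": "FormFillSealMachine"},
        }
    }
}

# Bindings: where each modelled variable actually lives in the automation layer.
tag_bindings = {
    ("FFS-01", "Speed"): "PLC7.DB12.W4",     # cryptic source address
    ("FFS-01", "WebWidth"): "PLC7.DB12.W6",
}
```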

Avoiding a Common Trap

“Data for analytics” means different things at different stages. A data scientist looking to extract new insights from OT data may need very large data sets in the data centre or cloud, where they can apply machine learning or other “big data” tools to a problem. A process optimisation team deploying a real-time analytic engine to make minute-by-minute use of the outputs of the data scientists’ work may only need small samples across a subset of data points for their part of the work. Data Ops thinking will help you ensure that both of these needs are met appropriately.


Map’s Done – Now How About Going Somewhere?

The work that comes next is really the “Ops” part of Data Ops – with the rough map of different uses of OT data at hand, and the view of whether each use needs granular data, aggregated data, calculated derivations (like KPIs), or some kind of combination, you’ll be able to quickly determine where generating desired outputs requires new data pools or streams, or where existing ones can be used. And for both, your data modelling work will guide what structures and descriptive data need to be incorporated.

At this point, you may find that some existing data pools lend themselves to having asset and descriptive models wrapped around the raw data at the data store level – i.e. centrally. It’s a capability offered in data platform solutions like GE’s Proficy Historian. This approach can make more sense than extracting data sets simply to add model data and then re-writing the results to a fresh data store. Typically, streaming/real-time sources offer more choice in how best to handle adding the model around the raw data – and there are solutions, like HighByte’s Intelligence Hub, that allow the model information to be added at the “edge” – the point where the data is captured in the first place. With the model definitions included at this point, you can set up multiple output streams – some feeding more in-the-moment views or integration points, some feeding data stores. In both cases, the model data having been imposed at the edge makes it easier for the ultimate user of the data to understand the context and the meaning of what’s in the stream.
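
A hedged sketch of what “adding the model at the edge” might produce: a raw tag value is wrapped with its asset and enterprise context before being published to downstream streams and stores. This is illustrative logic only, not HighByte’s or GE’s actual API.

```python
import json
from datetime import datetime, timezone

def wrap_with_model(tag_address, value, bindings, hierarchy):
    """Attach asset model and location context to a raw tag reading."""
    asset, variable = bindings[tag_address]  # e.g. ("FFS-01", "Speed")
    context = hierarchy[asset]
    payload = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "site": context["site"],
        "line": context["line"],
        "asset": asset,
        "assetModel": context["model"],
        "variable": variable,
        "value": value,
    }
    return json.dumps(payload)

bindings = {"PLC7.DB12.W4": ("FFS-01", "Speed")}
hierarchy = {"FFS-01": {"site": "Site A", "line": "Line 1", "model": "FormFillSealMachine"}}
print(wrap_with_model("PLC7.DB12.W4", 412.0, bindings, hierarchy))
```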



Edge Tools vs Central

Realistically, you’re likely to need both, and the driving factor will not necessarily be technical.

Edge works better when:

  1. You have a team that deals well with spreading standardised templates.
  2. Data sources are subject to less frequent change (utility assets are a good example of this).
  3. The use cases require relatively straightforward “wrapping” of raw data with model information.

Central works better when:

  1. The skills and disciplines to manage templates across many edge data collection footprints are scarce.
  2. The mix of ultimate uses of the data is more complex – requiring more calculations, derivations or modelling of relationships between different types of data sources.
  3. Change in underlying data sources is frequent enough that some level of dedicated and/or systematised change detection and remediation is needed.


Regardless of which tools are applied, the model definitions established earlier, applied consistently, ensure that different reports, calculations and integration tools can be developed more easily, and adapted more easily as individual data sources under the models are inevitably tweaked, upgraded or replaced – as new automation or sensors come in, their unique data structures simply need to be bound to the models representing them, and the “consumer” of their outputs will continue to work. So, while tools will be needed, ultimately the most valuable part of “doing” Data Ops is the thinking that goes into deciding what needs to be wrapped around raw data for it to become the fuel for your digital journey.

1,000 miles or around the block: Start one step at a time…

The rise of connected industrial technologies and Industry 4.0 has prompted the development and launch of countless systems with extensive capabilities and functions. This is often beneficial for businesses with a defined long-term strategy, but it can force early adoption and spending when deployments and licensing outstrip a company’s capacity to change work processes and adopt new technology.


Here, Sean Robinson, software solutions manager at Novotek UK and Ireland, explains how less can be more with new plant tech deployments – and why immediate problem-solving needs to be a distinct effort within longer-term strategies.


Countless proverbs, maxims and quotes have been formed around the idea of moderation, dating back as far as Ancient Greek society, or even further. The notion remains important to this day for everything from diet to technology. However, engineers and plant managers frequently over-indulge in the latter and over-specify systems that offer functionality well beyond what is necessary or even practically useful.

It can initially appear that there is no harm in opting for an automation or plant IT system that has extensive functionality, because this may help to solve future problems as they arise. That being said, an investment positioned to be all-encompassing, like a full material-receiving-through-WIP-execution-with-performance-analysis-and-enterprise-integration manufacturing execution system (MES), can sometimes present its own barriers to adoption for certain businesses, especially those in sectors that favour flexibility, such as fast-moving consumer goods (FMCG) or food manufacturing (and, interestingly, increasingly the contract production side of consumer health and life sciences). Where core production processes and related enabling technology are well established, it can be risky, expensive and overkill to treat the need to implement specific new capabilities as the trigger for wholesale replacement or re-working. The key is to identify where critical new functional needs can be implemented around the installed technology base in focused ways that deliver results, while leaving open the option of incrementally adding further functionally focused solutions in a staged way, over time.

At Novotek, our role is to help our customers choose technology that delivers on an immediate need, while opening up the potential to build incrementally in a controlled, low-risk way.

Fortunately, both the licensing models and the technical architectures of plant IT solutions are changing in ways that support this kind of approach. So, the software cost and deployment services cost of bringing on board very specific capabilities can be scaled to match the user base and the technical and functional boundaries of a specific need. We can think of these focused deployments as “micro-apps”. A key part of this approach is that the apps aren’t built as bespoke, or as an extension of a legacy (and possibly obsolete) system. It’s a productised solution – with only the “right” parts enabled and delivered to the right stakeholders.

Consider quality in toiletry production, and specifically the challenge of product loss due to variability in the quality of raw materials. It’s safe to assume that a plant will already have local control systems in place and ways to track the overall quality outcomes elsewhere, but monitoring the raw material quality is often left to supplier-side data that may be under-used – serving as a record of supplier compliance with a standard, rather than being used to proactively trigger adjustments in key process settings to avoid losses. In this scenario, an ideal micro-app could be focused on capturing raw material data, using machine learning to provide deep analysis of how existing machines can best process the material lot and alerting supervisors and process owners to take action. Such a function might have a small number of users; it might even have integration with inventory or quality systems replacing some manual data entry. So, the software licensing, services and timelines to deliver impact can all be kept small.

When we consider some of the demands manufacturers now face – from qualifying new suppliers and materials, to furthering energy and water reduction, to adopting more predictive maintenance and asset management strategies – we see a lot of potential to tackle these with focused solutions that happen to borrow from the underlying depth and breadth of MES solutions.
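
As a rough sketch of such a micro-app’s core logic (not a claim about any particular machine-learning product), the snippet below fits a simple model of product loss against a raw material property from historical lots, then alerts when an incoming lot’s predicted loss exceeds a threshold. The figures, feature and threshold are all assumptions.

```python
import numpy as np

# Illustrative history: raw material viscosity (cP) vs. observed product loss (%).
history_viscosity = np.array([1000.0, 1100.0, 1200.0, 1300.0, 1400.0])
history_loss_pct = np.array([1.2, 1.5, 2.1, 2.8, 3.6])

# Fit a simple linear model; a real micro-app might use a richer model and more features.
slope, intercept = np.polyfit(history_viscosity, history_loss_pct, deg=1)

def predicted_loss(viscosity):
    return slope * viscosity + intercept

LOSS_ALERT_THRESHOLD_PCT = 2.5  # assumed acceptable loss
incoming_lot_viscosity = 1350.0
loss = predicted_loss(incoming_lot_viscosity)
if loss > LOSS_ALERT_THRESHOLD_PCT:
    print(f"Alert supervisor: predicted loss {loss:.1f}% – review process settings for this lot")
```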

There are unquestionably many cases where a plant-wide solution like an MES is necessary or even preferable. We and our key technology and services partners have delivered many such “complete” systems across the country. However, it should certainly not be considered the only option for agile industrial businesses. If each factory can be thought of as a collection of work processes/functions that need to be delivered, then implementing the supporting/enabling technology as a collection of micro-apps can make sense. And when balancing risk, cost and speed to value, sometimes, moderation in plant technology deployments can provide the most bountiful benefits.

Getting started with food digitalisation

The food and beverage industry is one where innovation in product development or design can confer a significant competitive advantage. As such, it’s no surprise that food manufacturers are increasingly considering digitalisation of operations to augment adaptability, improve throughput and strengthen flexibility. Here, Sean Robinson, service leader at food automation software specialist Novotek UK and Ireland, explains how food manufacturers can plan digitalisation in the most effective way.

In the past 12 months, the food industry has been forced to re-evaluate and re-assess its operational priorities. For years, many manufacturers focussed on flexible production to enable diverse product lines and mass customisation, in line with shifting consumer demands. In 2020, this was forced to change, and production efficiency and operational adaptability became the focus. Once more, automation and digital technologies came to the forefront of food manufacturing priorities.

Digitalisation is a word that has been bandied around a lot in industrial markets for the past few years, serving as a catch-all phrase encompassing everything that generates, records and communicates data. Unfortunately, as with most amorphous phrases, this leads to confusion among managers about how to introduce these technologies, which causes costly errors in implementation, such as overlapping data collection systems or the introduction of technologies that do not serve a strategic purpose.

For food manufacturers at the beginning of their digitalisation journey, the first step is to define an agreed and important goal, from which the company can reverse-engineer a solution. Whether looking to deliver on a continuous improvement objective that has been identified as part of a formal process, or just illustrating the value of an engineering team unleashed with time to think, it’s key to let the desired improvement dictate what kind of digitalisation will be needed.

For example, if material costs are too high and the agreed goal is to reduce them, a digitalisation project should establish systems that identify the factors influencing this. Understanding the root causes for yield problems could require a combination of machine data, ambient condition data, quality or lab data and information about material quality provided by suppliers. Thinking through where data is readily available, versus where it’s trapped in paper, spreadsheets or isolated automation, will ensure the plan can deliver on the purpose.

Planning is best done at the outset of investing in digitalisation, but some food manufacturers will undoubtedly have already rushed into digitalisation in years past. For businesses with some digital or automation technologies in place, one of the most valuable things to do is review the lay of the existing digital landscape. The easiest approach to doing this is to apply the ‘three Rs’ to your existing data: reduce systems overlap, reuse data and recycle data.

Reducing data collection system overlap not only makes it easier for managers to identify the source of a specific data set, it also streamlines costs. Why have a downtime system collecting machine event data, a yield analysis system collecting overlapping data and a work in progress tracking system that is separate to both of those? Having three systems collecting fundamentally the same data means duplicated configuration and deployment costs, as well as possible conflict over which one holds the ‘truth’.

An effective data and digitalisation strategy should also aim to use collected data in various calculations to produce several insights. For example, the downtime event data collected for OEE calculations may be part of what’s needed to solve a quality problem. The energy and water data collected for sustainability reporting may hold the key to real savings opportunities. Wherever there is a connection to a data source, managers should think of ways to make sure that a data point only needs to be collected once in order to be used many times.
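
A minimal sketch of the “collect once, use many times” idea, assuming downtime events are already captured in one system: the same event list feeds an OEE availability figure and a quality-oriented view of stoppages on a particular product. The event categories, times and product codes are invented for illustration.

```python
# Each event: (line, minutes, reason, product) – captured once, in a single system.
downtime_events = [
    ("Line 1", 18, "changeover", "SKU-A"),
    ("Line 1", 7, "seal fault", "SKU-B"),
    ("Line 1", 12, "seal fault", "SKU-B"),
]
PLANNED_MINUTES = 480  # one shift, assumed

# Use 1: the availability component of OEE.
downtime = sum(minutes for _, minutes, _, _ in downtime_events)
availability = (PLANNED_MINUTES - downtime) / PLANNED_MINUTES
print(f"Availability: {availability:.1%}")  # 92.3%

# Use 2: the same events re-used for a quality investigation on SKU-B.
seal_fault_minutes = sum(
    minutes for _, minutes, reason, product in downtime_events
    if reason == "seal fault" and product == "SKU-B"
)
print(f"Seal-fault downtime on SKU-B: {seal_fault_minutes} min")  # 19 min
```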

Finally, offline analysis tools and some of the new analytics packages on the market could mean that old data offers recurring value as a firm chases finer and finer points of improvement. So, it’s important to set up a data management approach and data management platforms that can give you the option of making repeated use of data.

Digitalisation projects can lead to more innovative and effective ways of working for food manufacturers, but they rely on careful planning and strategic implementation. By giving full consideration to the goals to be reached or how data is used within a site, food businesses can ensure their systems are always effectively aligned with their goals.

The 3 Rs of data as applied to water

Innovation has been a buzzword uttered by many a water provider in recent years. Since the UK’s water services regulation authority, Ofwat, published its price review 2019 (PR19) paper in 2017 and made innovation a focal point, most water providers have considered the use of new technologies to bolster productivity, reduce costs and strengthen continuity of service. This has made data acquisition, aggregation and analysis technologies more appealing to water operators — however, introducing new systems is understandably complicated in such complex networks.

The water industry’s need to innovate is not going to pass any time soon. In December 2020, Ofwat opened a consultation on what the industry should look like by 2040 and how the regulator, the sector and stakeholders can meet the challenges of the intervening years.

At the launch, Ofwat commented:

“There will always be lessons to learn and going forward, the industry will need to become better at anticipating and adapting to uncertainty and change. They will also need to innovate at a greater pace than before and make full use of opportunities from smart networks, nature based solutions and markets to thrive in the future.”

It’s the first of these suggestions, smart networks, that initially appears to be relatively easy for water operators to develop. A smart water network consists of various analytical, measurement and sensing devices and systems that offer insights into everything from water quality and pipe pressure to pump speeds and ambient temperatures in reserves. This field data can provide various insights to stakeholders.

However, a physical network as expansive and complex as water infrastructure presents ample opportunity for devices and systems to be deployed in an isolated and inefficient way. This means it’s very likely that certain raw data will be collected numerous times, stored in multiple disparate systems and siloed from certain groups of stakeholders. In our view, this isn’t a smart network; it’s a pseudo-intelligent network. If the data is being collected and used to inform strategic decisions, then it stands to reason that the systems collecting and housing that data should be deployed strategically as well.

Water operators at the start of their deployment journey can avoid a wide range of headaches by considering the three Rs of data:

  • Reduce overlap of data collection
  • Reuse collected data for multiple purposes
  • Repackage data for multiple stakeholders

Reducing overlap in data collection means ensuring that raw data points are collected only once, by a single sensor or system, and stored in a clearly defined system, such as historian software. This avoids the expense of investing in multiple systems to collect the same data several times for different purposes, as well as the cost of setting up those systems.

Reusing collected data for multiple purposes builds on this. The insights a technician will need to draw from machine data will differ significantly from those of an operations manager or area manager. However, the same raw data can be fed into various reports and calculations to offer different kinds of insights. For example, the energy requirement data of water treatment equipment is relevant not only for energy usage reports, but also for sustainability reporting. Making the data readily available for multiple reporting purposes enhances business flexibility.

These values are especially important in the water sector, where context is key and the interplay between different parts of the network can have a tangible impact on overall operations.

One of the costliest oversights that many businesses encounter when focussing on data collection is that technicians often embrace a silo mentality, where they understand the value that data offers only to their immediate machine or area of operation. For example, pump pressure monitoring might be perceived as valid only insofar as it informs maintenance schedules for that system. This mentality might make it appear a good idea to invest, separately, in sensor devices that monitor water flow rates and pipe pressure in the local area.

Instead, repackaging the pump pressure data to support analysis of water flow would reduce overlap between systems, and in turn reduce the time and cost expenditure of configuring another system to also collect pressure data.

This same principle of the three Rs can be applied to data collection and analysis of all kinds throughout the entire network. In effect, data should be treated similarly to the water network itself; collected from a single source and transferred into a defined system, from which it can be supplied to where it is needed, for what it is needed for. If water operators are truly to improve productivity and reduce operational costs with innovation and new technologies, the key is to find ways of strategically collecting data once and using it in multiple ways. Doing so enables the company to be more flexible, adaptable and prepared for changing market conditions.

Click here to find out more about the 3 Rs of automation data with our infographic guide.

Consolidating tech in the utilities sector

Despite the utilities sector being one of the first areas of industry to digitalise its operations in the 1970s, business leaders have been slow to make systematic changes in recent years. Here, Sean Robinson, service leader at Novotek UK and Ireland, explains how the latest technologies can better equip utilities companies to adapt to future energy demands.

In 2015, at the UN climate conference of parties (COP), world leaders agreed to take united action in limiting the rise of global temperatures to less than two degrees Celsius. The pressures to reduce carbon emissions, as well as the shift to post-recession, less energy-intensive industries, have led to a surge in demand for new power and utilities offerings across the globe.

Fossil fuels account for up to 82 per cent of the world’s primary energy usage, but as governments begin to tightly regulate this usage and renewable energy generation is on the rise, utilities companies need to evolve. This presents several growth opportunities for the utilities industry to integrate additional services into their portfolio. However, one of the greatest challenges facing utilities companies is the integration of new and emerging technologies into their business models.

Utilities companies need to begin by evaluating their current systems and infrastructure against their business goals. At Novotek, we’ve found that many utilities companies are using legacy equipment or disparate systems from a wide range of suppliers that, often, are out of sync with other operations in the facility.

While global investment in digital electricity infrastructure and software may have grown by over 20 per cent annually since 2014, at Novotek we are urging utilities companies in particular to move faster. By modernising and consolidating a facility’s existing systems into one, businesses can make a significant return on investment (ROI).

Because data comes from a broad range of sources, consolidation allows organisations to present data more easily, while also facilitating effective data analysis.

Data consolidation techniques reduce inefficiencies like data duplication, costs related to reliance on multiple databases and multiple data management points.

Currently, the utilities sector is greatly fragmented as a result of decades of outsourcing in incremental functional and geographic silos. With technologies that exist today, like GE Digital’s Predix Plant Applications software, which is part of the Predix manufacturing execution system (MES) suite, utilities companies can now manage the hundreds of devices and pieces of equipment operating simultaneously across not just one plant, but an entire portfolio from one system in real-time.

Combining predictive machine learning and advanced analytics, the technology can help utilities managers to transition from a reactive to a proactive and prescriptive operating model.

This is because Plant Applications allows plant managers to analyse and configure insights from the data collected, to make informed business decisions and establish new, unfragmented processes to improve other areas of the business, like reducing waste. By 2025, data analytics will be a core component in assisting companies, like those operating in the utilities sector, in making key business decisions. By consolidating various processes and integrating automation technologies like GE Digital’s, utilities companies can optimise their operations to significantly improve performance and retain a competitive edge.
