Sustainability – Many Birds, One Stone

‘Two birds with one stone’ – so goes the well-known phrase. But what if you could get more than two birds for a single throw of a stone? How about many birds? And what if these birds were not just flights of fantasy but offered foundational improvements and real upsides, such as increased profitability, recovered capacity and the ability to meet sustainability targets?

Starkly stated sustainability targets such as ‘Net-Zero by 2030’ imply an inherent struggle, and while this may be true in certain arenas, there is a genuine opportunity to achieve environmental goals, automate accountability and improve profitability within manufacturing – all at once.

In this article, we’ll outline exactly how the right capabilities, infused with expertise, can offer a profitable and intelligent pathway to a brighter business and environmental future. Carbon is cash, and reducing your output means retaining capital and growing profitability for the future.

So how is this achieved? Firstly, by fostering a different mindset when conceptualising sustainability measures. Data on utility usage can tell you when you’ve used more or less, but this aggregated data doesn’t have the granularity to explain why. While this is fine for quantifying and reporting on consumption to participate in a carbon exchange, this approach offers no mechanisms to improve these figures. But it doesn’t have to be this way.

Success and Sustainability

Novotek Solutions delivers operational technology with a methodology shaped by more than three decades of experience in IT domains.

We’ve led the way in delivering all our projects to a high, IT-compliant standard. Our solutions are supportable, maintainable, and extensible to keep your operation fit for the future.

In decades past, manufacturers in various sectors have embraced initiatives focused on continuous improvement, aiming to enhance production yields, improve equipment reliability, and minimise waste in materials, labour, and capital. Advanced measurement systems that track metrics like machine downtime and material usage have led to the establishment of comprehensive factory data infrastructures, all of which support manufacturers’ end goals.

These systems contextualise raw data by associating it with specific details such as order numbers and product codes. Advanced platforms like Proficy Plant Applications from GE Vernova can integrate data from primary sources like water flow meters into this contextual framework. This practice of collecting detailed data related to core equipment and products results in a robust dataset, which serves multiple purposes:

  1. Automating Environmental and Compliance Reporting: Using directly measured consumption data to create regulatory reports and calculate incentives.
  2. Enhancing Carbon Accounting: With varying standards for translating energy consumption into emissions, having granular data allows for flexibility in reporting and adapting to evolving auditing requirements.
  3. Incorporating Footprint Analysis in Continuous Improvement: Analysing measured environmental factors alongside traditional performance metrics reveals the interplay between operational changes and environmental impact. Comparing a product’s footprint data across different times or locations helps identify significant variations.
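As an illustration of this contextual framework, the sketch below wraps raw meter readings with the production order and product active at the time. The field names and data shapes here are invented for illustration; platforms such as Proficy Plant Applications handle this association through configuration rather than code.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ContextualisedReading:
    """A raw utility reading wrapped with production context."""
    timestamp: datetime
    value: float            # e.g. litres from a water flow meter
    unit: str
    order_number: str       # production order running at the time
    product_code: str

def contextualise(raw, orders):
    """Attach the active production order to each raw meter reading.

    `raw` is a list of (timestamp, value) pairs; `orders` is a list of
    (start, end, order_number, product_code) tuples.
    """
    out = []
    for ts, value in raw:
        for start, end, order, product in orders:
            if start <= ts < end:
                out.append(ContextualisedReading(ts, value, "L", order, product))
                break
    return out
```

Once readings carry this context, the same dataset can serve the reporting, accounting and analysis purposes listed above without further collection effort.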

This approach allowed a major North American brewer to spot cases where energy consumption varied when all other factors were equal. Measuring energy consumption next to production orders meant it could hunt for root causes through its efficiency management system.

Root causes for relative spikes in usage ranged from inefficient process control algorithms for heating or chilling equipment, inconsistent adherence to recipe setpoints, and poor power management relative to down or idle times. The brewer utilised this insight to make recipes and procedures consistent across all sites.
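A minimal sketch of the kind of comparison the brewer relied on: flag production orders whose energy-per-unit deviates from the norm for the same product, all else being equal. The record keys and the z-score threshold are assumptions for illustration, not the brewer's actual method.

```python
from statistics import mean, stdev
from collections import defaultdict

def flag_energy_outliers(orders, threshold=2.0):
    """Flag orders whose energy-per-unit is unusually high compared with
    other orders for the same product.

    `orders` is a list of dicts with keys: order, product, units, kwh.
    """
    by_product = defaultdict(list)
    for o in orders:
        by_product[o["product"]].append(o)

    outliers = []
    for product, batch in by_product.items():
        rates = [o["kwh"] / o["units"] for o in batch]
        if len(rates) < 3:
            continue  # not enough comparable orders to judge
        mu, sigma = mean(rates), stdev(rates)
        for o, rate in zip(batch, rates):
            if sigma > 0 and (rate - mu) / sigma > threshold:
                outliers.append(o["order"])
    return outliers
```

Each flagged order becomes a candidate for root-cause investigation in the efficiency management system.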

The result? The brewer met a 5-year energy-savings target in just three years!

Operator Behaviour, Transparency and Compliance

As more firms conclude that a functional information strategy is a critical first step in their sustainability journey, gaining the correct capabilities to gather and process data is essential. In times gone by, multiple data collection regimens assembled reports for different purposes, such as customers or regulators, which led to inconsistencies and undue workload on operators and analysts.

The alternative is a single data platform that serves multiple stakeholders, such as GE Vernova’s Plant Applications. Through a single platform, data is gathered once at an appropriate resolution, and the same data can then be repackaged for multiple purposes.

Through this method, operations can automate the management and delivery of regulatory data. Adherence to future carbon passport schemes also becomes a process you already have the tools to deal with.

Turning to transparency, increasingly, customers are willing to pay a premium for ‘green’ products, where you can demonstrate a complete genealogy and the positive credentials of your products in total confidence. With a comprehensive data platform in place, you have the power to track and demonstrate the exact journey a product has gone through, from raw materials to finished goods. And that is not to overlook the power of transparent data on your operation.
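The genealogy idea can be pictured as a simple backward trace over consumption records. The data structure below is hypothetical, standing in for the lot-consumption records an MES captures automatically as production runs.

```python
def trace_genealogy(consumptions, finished_lot):
    """Walk consumption records back from a finished-goods lot to the raw
    material lots it came from.

    `consumptions` maps each output lot to the list of input lots
    consumed to make it; lots with no recorded inputs are treated as
    raw materials.
    """
    origins, stack = set(), [finished_lot]
    while stack:
        lot = stack.pop()
        inputs = consumptions.get(lot, [])
        if not inputs:
            origins.add(lot)  # no further inputs: a raw material lot
        else:
            stack.extend(inputs)
    return origins
```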

With greater process visibility, automated with real-time data collection, operations gain the insight required for intelligent decision-making from the shop floor to the top floor. Ingesting and utilising this data with a powerful analytics platform drives an understanding of the cause-and-effect relationships between asset performance and input consumption. This granular data is then fed into corporate EHS and carbon accounting systems, allowing true utility cost profiles to be a part of production costing and planning exercises. Manufacturers then use cross-plant metrics to accelerate best-practice identification and dissemination.
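The hand-off into carbon accounting can be pictured as a simple conversion of metered consumption under a chosen emission-factor set. The scheme names and factor values below are illustrative only; real factors come from the reporting scheme in force and vary by grid region and year.

```python
# Illustrative emission-factor sets (kg CO2e per kWh) -- invented values.
FACTOR_SETS = {
    "scheme_a_2023": {"electricity": 0.207, "natural_gas": 0.184},
    "scheme_b_2024": {"electricity": 0.193, "natural_gas": 0.182},
}

def emissions_report(consumption_kwh, factor_set):
    """Translate metered energy consumption into CO2e under a chosen
    factor set, so the same granular data can serve different audits
    and evolving standards without re-collection."""
    factors = FACTOR_SETS[factor_set]
    return {utility: kwh * factors[utility]
            for utility, kwh in consumption_kwh.items()}
```

Because the raw consumption data is retained at full granularity, switching factor sets is a recalculation, not a new measurement campaign.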

But that’s not where it ends; by embedding analytics into control and visualisation programs, operators can be presented with rich information to drive decision-making at the shopfloor level. By using intelligent systems in this way, operations can also ensure they are not held hostage to the availability of specialists.

Innovative Strategies in Sustainability

To demonstrate how adopting a manufacturing execution system can offer a ‘many birds for one stone’ solution, we can look at the capabilities and conditions of an operation both before and after implementation.

Before

Without a detailed understanding of how changing utility inputs will affect processes, efforts to be ‘green’ can cause efficiency and material losses while also potentially introducing quality or product safety risks.

The differences between equipment and processes also present difficulties in formulating an effective strategy. With better data collection, all elements of variability can be profiled – including materials used in processes.

After

Data-driven decision-making brings cost, quality and carbon footprint into balance. With the confidence to act backed by information, tuning processes and utility infrastructure ensures sustainability efforts do not compromise operational performance.

The root causes of overconsumption are more easily understood, and strategies to mitigate them can be formulated and actioned at pace.

The ‘Many Birds’ at a Glance

If we’ve demonstrated anything in this article, we hope it’s the broad scope of what’s possible when looking to drive sustainability – and reap the real rewards on offer for manufacturing! Here are the key takeaways of what’s on the table as we progress towards environmental goals:

  1. Expose hidden relationships between production and sustainability factors.
    • A single MES solution provides insight into materials, recipes, assets and processes to find the root causes of the overconsumption of utilities.
  2. Gain a single source of truth and improve the visibility of your consumption.
    • Granular data gathered by the single platform can be packaged, analysed and presented to serve many needs.
  3. Integrate metrics and analysis to provide additional insight.
    • Automating analytics within a single, scalable platform provides value from the shop floor to the top floor and drives fast, accurate decision-making powered by information.
  4. Automate regulatory compliance and power transparency and traceability.
    • Gain competitive capabilities to demonstrate green credentials to customers and other stakeholders.

Last but not least, and in a nutshell, why select Plant Applications from GE Vernova?

  • Flexibility in Data Management: The platform can easily link basic time-series data from meters to a wider range of elements like materials, products, and events, all through straightforward configuration.
  • Support for Multiple Stakeholders: Plant Applications offers a variety of reporting and analytics capabilities, catering to both internal stakeholders focused on improvement and external stakeholders, ensuring their diverse needs are met.
  • Open and Layered Approach: Unlike many sustainability metrics systems that are manual or limited to specific sensors, Plant Applications enhances existing sensor, automation, and software investments, offering a more integrated solution.

Continue the conversation

Do you have any questions about sustainability and manufacturing? Chat to one of our friendly experts to find out more.

MES: Build vs Buy

Every manufacturing operation requires communication and the sharing of data. In the past, data was manually recorded with pen and paper and shared at the walking speed of an operator.

The industry has come a long way since then, with forward-thinking operations undertaking digital transformation journeys to unlock greater efficiency, visibility and the capabilities required for continual improvement and profitability in the contemporary manufacturing landscape. 

However, not all approaches yield the same results. While point solutions for individual functions to provide new capabilities in your manufacturing operation may seem a sensible way to begin a digital transformation journey, there are a number of issues to consider. 

Building MES functionality with point solutions requires careful consideration. The pitfalls are all too common, resulting in delayed progress and increased costs versus a single platform.

If you were to consider the data flow in your operation like plumbing in a house, concerns about differing approaches would soon become apparent. As numerous plumbers from different companies arrive to distribute water and heating around your home, difficulties reconciling differences between pipe diameters, connectors and joining mechanisms would result in burst pipes and water everywhere. 

Amongst the issues that come with composite systems is security. While plumbing together these systems, how do you consider cybersecurity with due diligence? Should you experience a cybersecurity threat, a growing and tangible danger, which vendors would you call for support? 

Vulnerabilities can reveal themselves when disparate solutions take diverging paths, at an incongruent pace, through their product roadmaps. The result is a constantly changing landscape in which your platform can continuously fall out of sync with its various component solutions, requiring constant attention and maintenance. That is not to mention the security risks of each system requiring different access routes in and out of information silos, which requires careful consideration as increased connections mean more potential attack vectors.

Bad actors take advantage of vulnerabilities in poorly secured systems

How do you ensure consistency and implementation of standards across vendor organisations? Best practice becomes challenging to implement, with no single approach for your entire MES system. Learning each solution will require training courses for each, resulting in increased time to competency for your operators. 

Implementing a single platform that provides seamless connectivity, efficiency management, quality management and production management solutions in addition to a raft of other capabilities rather than numerous point solutions avoids the headache. 

Where do you begin if you choose to implement a complete and integrated MES solution from a single vendor? The good news is that independent analysts have done a lot of homework for you. Gartner has asked vendors the tough questions to independently test the product and ensure confidence in connectivity, security, training, and the product’s roadmap.  

Novotek is the only Premier Solutions Partner for GE in the United Kingdom. With extensive experience and expertise in delivering and exceeding customers’ ambitions, Novotek has helped many manufacturers achieve greater profitability and efficiency with GE Digital products. 

So, what does Gartner have to say about GE Digital? 

“GE Digital is a Leader in this Magic Quadrant.” Gartner’s Magic Quadrant considers the completeness of a vendor’s vision alongside their ability to execute that vision to sort vendors between Leaders, Challengers, Niche Players and Visionaries.  

Gartner has highlighted strengths such as innovation, product improvements and customer experience as factors in GE Digital serving as a leading platform in the MES space. 

As systems trend towards more and more connectivity, owing to the significant value offered by data analysis for operational improvement, implementing unconnected or imperfectly deployed point solutions can put your operation on the back foot competitively. Additionally, a consistent naming structure and technical ontology are required to ensure systems can communicate flawlessly. This is inherent in a complete MES solution, but your team must consider and continuously monitor a collection of point solutions to achieve compatibility. 
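A sketch of what such a naming layer involves: vendor-specific tag names mapped onto one canonical hierarchy so that point solutions can exchange data consistently. The vendor tag names and the canonical names below are invented for illustration.

```python
# Hypothetical vendor tag names mapped onto a canonical ontology.
TAG_MAP = {
    "PLC1.FLW_MTR_01": "SiteA.Line1.Filler.WaterFlow",
    "hist:line1/filler/spd": "SiteA.Line1.Filler.Speed",
    "QMS-L1-FILLER-TEMP": "SiteA.Line1.Filler.Temperature",
}

def normalise(readings):
    """Re-key vendor-specific readings to canonical names, flagging any
    tag that has not yet been mapped so it can be reviewed rather than
    silently dropped."""
    mapped, unmapped = {}, []
    for tag, value in readings.items():
        if tag in TAG_MAP:
            mapped[TAG_MAP[tag]] = value
        else:
            unmapped.append(tag)
    return mapped, unmapped
```

In a complete MES this mapping is part of the platform's configuration; with point solutions, a team must build and maintain it themselves for every integration.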

Another downside to such an approach is paying multiple times for the same service. When deploying a point solution, each integration will require design, testing and implementation phases – each made more challenging by the need for each team to consider the other’s work, compatibility, language and methodology. 

GE Digital’s Plant Apps cover the following functionality as a rounded MES platform: 

  • Dispatching – Distributing work orders based on transactional data and demand 
  • Execution – Managing the production process
  • Data Management – Enabling the collection and management of data at regular intervals from all connected assets. 
  • Operational Data Store – Readily tailorable for purpose, MES can serve as a relational database for operational data or integrate with a data historian or IIoT platform. 
  • Quality Management – Regulated industries and products can benefit from standardisation and data capture to ensure compliance. 
  • Process – MES ensures all manufacturing steps are undertaken correctly, with the correct raw materials, temperatures, times, etc. 
  • Traceability – The ability to track the entire process from raw materials to intermediate and finished goods by lot, batch number, or other signifiers. 
  • Analytics and Reporting – Dashboard displays, advanced analytical tools and real-time KPIs provide data for accurate decision support. 
  • Integration – MES can bring together many disparate systems to create something greater than the sum of its parts, tying together all production levels with enterprise systems, site planning, bills of materials and recipe planning. 

With a single platform, the Novotek team will tailor the solution to your individual needs within a coherent integration process. With a project undertaken in an orderly way and to return to the analogy of tradespeople in the home, you can be sure your plasterers, painters, and plumbers aren’t tripping over each other. 

DataOps: The Fuel Injectors For Your Transformation Engine?

Data – everyone agrees it’s the fuel for the fires of innovation and optimisation. The industrial world is blessed with an abundance of rich, objective (being machine-generated) data, so it should be well-equipped to seek new advantages from it. Too often, though, the first efforts an industrial firm takes to harness its machine and process data for new reporting or advanced analysis initiatives involve simple use cases and outputs that can mask what it takes to support a mix of different needs in a scalable and supportable way. Data Ops practices provide a way of systematically addressing the steps needed to ensure that your data can be made available in the right places, at the right times, in the right formats for all the initiatives you’re pursuing.


Industrial data (or OT data) poses particular challenges that your Data Ops strategy will address:

  • It can be generated at a pace that challenges traditional enterprise (or even cloud layer) data collection and data management systems (to say nothing of the ingestion and processing costs during reporting and analysis that are typical of cloud platforms).
  • The data for functionally identical assets or processes is often not generated in a consistent structure and schema.
  • OT data generally does not have context established around each data point – making it difficult to understand what it represents, let alone the meaning inherent in the actual values!
  • Connecting to a mix of asset types with different automation types and communications protocols is often necessary to get a complete data set relevant to the reporting or analytics you’re pursuing.
  • A wide array of uses demands different levels of granularity of some data points and a breadth of collection points that is significantly wider than many individual stakeholders may appreciate.

These are the reasons why in many firms, the engineering team often ends up becoming the “data extract/Excel team” – their familiarity with the underlying technology means they can take snapshots and do the data cleansing necessary to make the data useful. But that’s not scalable, and data massaging is a far less impactful use of their time – they should be engaged with the broader team interpreting and acting on the data!


Data Ops – Quick Definition

There’s no one way to “do” Data Ops. In the industrial world, it’s best thought of as a process involving:

  • Determining the preferred structures and descriptions (models) for OT data, so it may serve the uses the organisation has determined will be valuable.
  • Assessing which approaches to adding such models can be adopted by your organisation.
  • Choosing the mix of tools needed to add model structures to a combination of existing and new data sources.
  • Establishing the procedure to ensure that model definitions don’t become “stale” if business needs change.
  • Establishing the procedures to ensure that new or changing data sources are brought into the model-based framework promptly.


A Rough Map is Better Than No Map

Take a first pass at capturing all the intended uses of your OT data. What KPIs, what reports, what integration points, and what analytics are people looking for? Flesh out those user interests with an understanding of what can feed into them:

  1. Map the different stakeholder’s data needs in terms of how much they come from common sources, and how many needs represent aggregations, calculations or other manipulations of the same raw data.
  2. Flesh out the map by defining the regularity with which data needs to flow to suit the different use cases. Are some uses based on by-shift, or daily views of some data? Are other uses based on feeding data in real-time between systems to trigger events or actions?
  3. Now consider what data could usefully be “wrapped around” raw OT data to make it easier for the meaning and context of that data to be available for all. Assess what value can come from:
    1. Common descriptive models for assets and processes – a “Form Fill & Seal Machine” with variables like “Speed” and “Web Width” (etc.) is a far easier construct for many people to work with than a database presenting a collection of rows reflecting machines’ logical addresses with a small library of cryptically structured variables associated to each one.
    2. An enterprise model to help understand the locations and uses of assets and processes. The ISA-95 standard offers some useful guidance in establishing such a model.
    3. Additional reference data to flesh out the descriptive and enterprise models (e.g. the make and model of common asset types with many vendors, or information about a location such as latitude or elevation). Be guided by what kind of additional data would be helpful in comparing/contrasting/investigating differences in outcomes that need to be addressed.
  4. Now assess what data pools are accumulating already – and how much context is accumulating in those pools. Can you re-use existing investments to support these new efforts, rather than creating a parallel set of solutions?
  5. Finally, inventory the OT in use where potentially useful data is generated, but not captured or stored; particularly note connectivity options.
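Pulling steps 3.1 and 3.2 together, a descriptive model for the “Form Fill & Seal Machine” example might look like the following sketch. The tag addresses, hierarchy names and units are invented; the point is only to show raw addresses bound behind human-readable names.

```python
# A minimal descriptive model; tag addresses and names are hypothetical.
MODEL = {
    "asset_type": "Form Fill & Seal Machine",
    "variables": {
        "Speed":     {"tag": "plc7.db12.w4", "unit": "packs/min"},
        "Web Width": {"tag": "plc7.db12.w6", "unit": "mm"},
    },
    # ISA-95-style location context for the enterprise model
    "enterprise": "AcmeFoods", "site": "SiteA",
    "area": "Packing", "line": "Line1",
}

def read_variable(model, name, raw_source):
    """Resolve a human-readable variable name to its raw tag address and
    return the value together with its engineering unit."""
    var = model["variables"][name]
    return raw_source[var["tag"]], var["unit"]
```

Consumers of the data work with “Speed” on “Line1”, never with `plc7.db12.w4` – and when the underlying automation changes, only the binding changes.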

Avoiding A Common Trap

“Data for Analytics” means different things at different stages. A data scientist looking to extract new insights from OT data may need very large data sets in the data centre or cloud, where they can apply machine learning or other “big data” tools to a problem. A process optimisation team deploying a real-time analytic engine to make minute-by-minute use of the outputs of the data scientists’ work may only need small samples across a subset of data points for their part of the work. Data Ops thinking will help you ensure that both of these needs are met appropriately.


Map’s Done – Now How About Going Somewhere?

The work that comes next is really the “Ops” part of Data Ops – with the rough map of different uses of OT data at hand, and the view of whether each use needs granular data, aggregated data, calculated derivations (like KPIs), or some kind of combination, you’ll be able to quickly determine where generating desired outputs requires new data pools or streams, or where existing ones can be used. And for both, your data modelling work will guide what structures and descriptive data need to be incorporated.

At this point, you may find that some existing data pools lend themselves to having asset and descriptive models wrapped around the raw data at the data store level – i.e. centrally. It’s a capability offered in data platform solutions like GE’s Proficy Historian. This approach can make more sense than extracting data sets simply to add model data and then re-writing the results to a fresh data store. Typically, streaming/real-time sources offer more choice in how best to handle adding the model around the raw data – and there are solutions, like HighByte’s Intelligence Hub, that allow the model information to be added at the “edge” – the point where the data is captured in the first place. With the model definitions included at this point, you can set up multiple output streams – some feeding more in-the-moment views or integration points, some feeding data stores. In both cases, the model data having been imposed at the edge makes it easier for the ultimate user of the data to understand the context and meaning of what’s in the stream.
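A toy illustration of edge enrichment – not any vendor’s actual API; the payload fields and model shape are assumptions. The raw data point is wrapped with its model context at the point of capture, so every downstream consumer receives a self-describing message.

```python
import json

def enrich_at_edge(raw_payload, model):
    """Wrap a raw data point with model context at the point of capture.

    `raw_payload` carries the tag address, value and timestamp as read;
    `model` supplies the asset identity, site and the tag-to-variable
    binding established during the modelling work."""
    return json.dumps({
        "asset": model["asset"],
        "site": model["site"],
        # fall back to the raw tag name if no binding exists yet
        "variable": model["variables"].get(raw_payload["tag"], raw_payload["tag"]),
        "value": raw_payload["value"],
        "ts": raw_payload["ts"],
    })
```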



Edge Tools vs Central

Realistically, you’re likely to need both, and the driving factor will not necessarily be technical.

Edge works better when:

  1. You have a team that deals well with spreading standardised templates.
  2. Data sources are subject to less frequent change (utility assets are a good example of this).
  3. The use cases require relatively straightforward “wrapping” of raw data with model information.

Central works well when:

  1. The skills and disciplines to manage templates across many edge data collection footprints are scarce.
  2. The mix of ultimate uses of the data is more complex – requiring more calculations, derivations or modelling of relationships between different types of data sources.
  3. Change in underlying data sources is frequent enough that some level of dedicated and/or systematised change detection and remediation is needed.


Regardless of which tools are applied, the model definitions established earlier, applied consistently, ensure that different reports, calculations and integration tools can be developed more easily, and adapted more easily as individual data sources under the models are inevitably tweaked, upgraded or replaced. As new automation or sensors come in, their unique data structures simply need to be bound to the models representing them, and the “consumers” of their outputs will continue to work. So, while tools will be needed, ultimately the most valuable part of “doing” Data Ops is the thinking that goes into deciding what needs to be wrapped around raw data for it to become the fuel for your digital journey.

1,000 miles or around the block: Start one step at a time…

The rise of connected industrial technologies and Industry 4.0 has prompted the development and launch of countless systems with extensive capabilities and functions. This is often beneficial for businesses with a defined and set long-term strategy, but it can force early adoption and spending when deployments and licensing outstrip a company’s capacity to change work processes and adopt new tech.


Here, Sean Robinson, software solutions manager at Novotek UK and Ireland, explains how less can be more with new plant tech deployments – and why immediate problem-solving needs to be a distinct effort within longer-term strategies.


Countless proverbs, maxims and quotes have been formed around the idea of moderation, dating back as far as Ancient Greek society – or even further. The notion remains important to this day for everything from diet to technology. However, engineers and plant managers frequently over-indulge in the latter and over-specify systems that offer functionality well beyond what is necessary or even practically useful.

It can initially appear that there is no harm in opting for an automation or plant IT system that has extensive functionality, because this may help to solve future problems as they arise. That being said, an investment positioned to be all-encompassing, like a full material-receiving-through-WIP-execution-with-performance-analysis-and-enterprise-integration manufacturing execution system (MES), can sometimes present its own barriers to adoption for certain businesses, especially those in sectors that favour flexibility, such as fast-moving consumer goods (FMCG) or food manufacturing (and – interestingly – increasingly the contract production side of consumer health and life sciences). Where core production processes and related enabling technology are well-established, it can be risky, expensive and overkill to treat the need to implement specific new capabilities as the trigger for wholesale replacement or re-working. The key is to identify where critical new functional needs can be implemented around the installed technology base in focused ways that deliver results, while leaving open the option of incrementally adding additional functionally-focused solutions in a staged way, over time.

At Novotek, our role is to help our customers choose technology that delivers on an immediate need, while opening up the potential to build incrementally in a controlled, low-risk way.

Fortunately, both the licensing models and the technical architectures of plant IT solutions are changing in ways that support this kind of approach. So, the software and deployment services costs of bringing on board very specific capabilities can be scaled to match the user base and the technical and functional boundaries of a specific need. We can think of these focused deployments as “micro-apps”. A key part of this approach is that the apps aren’t built as bespoke, or as an extension of a legacy (and possibly obsolete) system. It’s a productised solution – with only the “right” parts enabled and delivered to the right stakeholders. Consider quality in toiletry production, and specifically the challenge of product loss due to variability in the quality of raw materials. It’s safe to assume that a plant will already have local control systems in place elsewhere to track the overall quality outcomes, but monitoring raw material quality is often left to supplier-side data that may be under-used – serving as a record of supplier compliance with a standard, rather than being used to proactively trigger adjustments in key process settings to avoid losses. In this scenario, an ideal micro-app could be focused on capturing raw material data, using machine learning to provide deep analysis of how existing machines can best process the material lot and alerting supervisors and process owners to take action. Such a function might have a small number of users; it might even have integration with inventory or quality systems replacing some manual data entry. So, the software licensing, services and timelines to deliver impact can be kept small. Consider some of the demands manufacturers now face, on fronts ranging from qualifying new suppliers and materials, to furthering energy and water reduction, to adopting more predictive maintenance and asset management strategies: we see a lot of potential to tackle these with focused solutions that happen to borrow from the underlying depth and breadth of MES solutions.
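To make the micro-app idea concrete, here is a deliberately small sketch. An ordinary least-squares fit stands in for the machine-learning model described above, and the moisture and loss figures are invented: an incoming raw-material lot’s predicted product loss is checked against a limit, and supervisors would be alerted when it is exceeded.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x -- a simple stand-in
    for the machine-learning model a real micro-app might use."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def check_lot(moisture, history, loss_limit=2.0):
    """Predict product loss for an incoming raw-material lot from
    historical (moisture, loss) data and flag whether it exceeds the
    acceptable limit."""
    a, b = fit_line(*history)
    predicted = a + b * moisture
    return predicted, predicted > loss_limit
```

The whole function fits in a few lines, has a small user base, and could bolt onto existing quality or inventory systems – which is exactly the scale at which micro-app licensing and delivery timelines stay small.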

There are unquestionably many cases where a plant-wide solution like an MES is necessary or even preferable. We and our key technology and services partners have delivered many such “complete” systems across the country. However, it should certainly not be considered the only option for agile industrial businesses. If each factory can be thought of as a collection of work processes/functions that need to be delivered, then implementing the supporting/enabling technology as a collection of micro-apps can make sense. And when balancing risk, cost and speed to value, sometimes, moderation in plant technology deployments can provide the most bountiful benefits.
