Sustainability – Many Birds, One Stone
Mon, 18 Mar 2024

‘Two birds with one stone’ – so goes the well-known phrase. But what if you could get more than two birds for a single throw of a stone? How about many birds? And what if these birds were not just flights of fantasy but offered foundational improvements and real upsides, such as increased profitability, recovered capacity and the ability to meet sustainability targets?

Starkly stated sustainability targets such as ‘Net-Zero by 2030’ imply an inherent struggle, and while this may be true in certain arenas, there is a genuine opportunity to achieve environmental goals, automate accountability and improve profitability within manufacturing – all at once.

In this article, we’ll outline exactly how the right capabilities, infused with expertise, can offer a profitable and intelligent pathway to a brighter business and environmental future. Carbon is cash, and reducing your output means retaining capital and growing profitability for the future.

So how is this achieved? Firstly, by fostering a different mindset when conceptualising sustainability measures. Data on utility usage can tell you when you’ve used more or less, but this aggregated data doesn’t have the granularity to explain why. While this is fine for quantifying and reporting on consumption to participate in a carbon exchange, this approach offers no mechanisms to improve these figures. But it doesn’t have to be this way.

Success and Sustainability

In decades past, manufacturers across sectors have embraced continuous-improvement initiatives aiming to enhance production yields, improve equipment reliability, and minimise waste in materials, labour, and capital. Advanced measurement systems that track metrics like machine downtime and material usage – and the comprehensive factory data infrastructures built around them – all support these end goals.

These systems contextualise raw data by associating it with specific details such as order numbers and product codes. Advanced platforms like Proficy Plant Applications from GE Vernova can integrate data from primary sources like water flow meters into this contextual framework. This practice of collecting detailed data related to core equipment and products results in a robust dataset, which serves multiple purposes:

  1. Automating Environmental and Compliance Reporting: Using directly measured consumption data to create regulatory reports and calculate incentives.
  2. Enhancing Carbon Accounting: With varying standards for translating energy consumption into emissions, having granular data allows for flexibility in reporting and adapting to evolving auditing requirements.
  3. Incorporating Footprint Analysis in Continuous Improvement: Analysing measured environmental factors alongside traditional performance metrics reveals the interplay between operational changes and environmental impact. Comparing a product’s footprint data across different times or locations helps identify significant variations.
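To make the first of these concrete, here is a minimal Python sketch of contextualisation: attributing raw meter readings to whichever production order was running at the time. All order numbers, product codes and readings below are hypothetical.

```python
from datetime import datetime

# Hypothetical raw meter readings: (timestamp, kWh consumed since last reading)
readings = [
    (datetime(2024, 3, 1, 8, 15), 120.0),
    (datetime(2024, 3, 1, 9, 30), 95.0),
    (datetime(2024, 3, 1, 11, 5), 110.0),
]

# Hypothetical production orders with their time windows and product codes
orders = [
    {"order": "WO-1001", "product": "IPA-500",
     "start": datetime(2024, 3, 1, 8, 0), "end": datetime(2024, 3, 1, 10, 0)},
    {"order": "WO-1002", "product": "LAGER-330",
     "start": datetime(2024, 3, 1, 10, 0), "end": datetime(2024, 3, 1, 12, 0)},
]

def contextualise(readings, orders):
    """Attribute each meter reading to the order running when it was taken."""
    usage = {}
    for ts, kwh in readings:
        for o in orders:
            if o["start"] <= ts < o["end"]:
                usage[o["order"]] = usage.get(o["order"], 0.0) + kwh
    return usage

print(contextualise(readings, orders))  # kWh attributed per order
```

Once consumption is keyed by order, the same records can feed reporting, carbon accounting and footprint analysis without re-collecting anything.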

This approach allowed a major North American brewer to spot cases where energy consumption varied when all other factors were equal. Measuring energy consumption next to production orders meant it could hunt for root causes through its efficiency management system.

Root causes for relative spikes in usage ranged from inefficient process control algorithms for heating or chilling equipment, inconsistent adherence to recipe setpoints, and poor power management relative to down or idle times. The brewer utilised this insight to make recipes and procedures consistent across all sites.
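The brewer-style comparison can be sketched in a few lines: given hypothetical kWh-per-batch figures for the same recipe at several sites, flag the sites whose average consumption stands out from the fleet.

```python
# Hypothetical kWh-per-batch figures for the same recipe at different sites
batches = [
    {"site": "Leeds",  "kwh": 410}, {"site": "Leeds",  "kwh": 395},
    {"site": "Malmo",  "kwh": 402}, {"site": "Malmo",  "kwh": 408},
    {"site": "Gdansk", "kwh": 540}, {"site": "Gdansk", "kwh": 525},
]

def flag_outlier_sites(batches, tolerance=0.10):
    """Flag sites whose average consumption exceeds the fleet average by more than `tolerance`."""
    by_site = {}
    for b in batches:
        by_site.setdefault(b["site"], []).append(b["kwh"])
    site_avg = {s: sum(v) / len(v) for s, v in by_site.items()}
    fleet_avg = sum(site_avg.values()) / len(site_avg)
    return sorted(s for s, avg in site_avg.items() if avg > fleet_avg * (1 + tolerance))

print(flag_outlier_sites(batches))  # sites warranting a root-cause hunt
```

A flagged site is only the starting point; the root-cause hunt through the efficiency management system follows from there.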

The result? The brewer met a 5-year energy-savings target in just three years!

Operator Behaviour, Transparency and Compliance

As more firms conclude that a functional information strategy is a critical first step in their sustainability journey, gaining the correct capabilities to gather and process data is essential. In times gone by, multiple data collection regimens assembled reports for different purposes, such as customers or regulators, which led to inconsistencies and undue workload on operators and analysts.

The alternative is a single data platform that serves multiple stakeholders, such as GE Vernova’s Plant Applications. Through a single platform, data is gathered once at an appropriate resolution, and the same data can then be repackaged for multiple purposes.

Through this method, operations can automate the management and delivery of regulatory data. Adherence to future carbon passport schemes also becomes a process you already have the tools to handle.

Turning to transparency: customers are increasingly willing to pay a premium for ‘green’ products where you can demonstrate a complete genealogy and the positive credentials of your products in total confidence. With a comprehensive data platform in place, you have the power to track and demonstrate the exact journey a product has taken, from raw materials to finished goods. And that is not to overlook the power of transparent data on your own operation.

With greater process visibility, automated with real-time data collection, operations gain the insight required for intelligent decision-making from the shop floor to the top floor. Ingesting and utilising this data with a powerful analytics platform drives an understanding of the cause-and-effect relationships between asset performance and input consumption. This granular data is then fed into corporate EHS and carbon accounting systems, allowing true utility cost profiles to be part of production costing and planning exercises. Manufacturers then use cross-plant metrics to accelerate best-practice identification and dissemination.

But that’s not where it ends; by embedding analytics into control and visualisation programs, operators can be presented with rich information to drive decision-making at the shopfloor level. By using intelligent systems in this way, operations can also ensure they are not held hostage to the availability of specialists.

Innovative Strategies in Sustainability

To demonstrate how adopting a manufacturing execution system can offer a ‘many birds for one stone’ solution, we can look at the capabilities and conditions of an operation both before and after implementation.

Before

Without a detailed understanding of how changing utility inputs will affect processes, efforts to be ‘green’ can cause efficiency and material losses while also potentially introducing quality or product safety risks.

The differences between equipment and processes also present difficulties in formulating an effective strategy. With better data collection, all elements of variability can be profiled – including materials used in processes.

After

Data-driven decision-making brings cost, quality and carbon footprint into balance. With the confidence to act backed by information, tuning processes and utility infrastructure ensures sustainability efforts do not compromise operational performance.

The root causes of overconsumption are more easily understood, and strategies to mitigate them can be formulated and actioned at pace.

The ‘Many Birds’ at a Glance

If we’ve demonstrated anything in this article, we hope it’s the broad scope of what’s possible when looking to drive sustainability – and reap the real rewards on offer for manufacturing! Here are the key takeaways of what’s on the table as we progress towards environmental goals:

  1. Expose hidden relationships between production and sustainability factors.
    • A single MES solution provides insight into materials, recipes, assets and processes to find the root causes of the overconsumption of utilities.
  2. Gain a single source of truth and improve the visibility of your consumption.
    • Granular data gathered by the single platform can be packaged, analysed and presented to serve many needs.
  3. Integrate metrics and analysis to provide additional insight.
    • Automating analytics within a single, scalable platform provides value from the shop floor to the top floor and drives fast, accurate decision-making powered by information.
  4. Automate regulatory compliance and power transparency and traceability.
    • Gain competitive capabilities to demonstrate green credentials to customers and other stakeholders.

Finally, in a nutshell: why select Plant Applications from GE Vernova?

  • Flexibility in Data Management: The platform can easily link basic time-series data from meters to a wider range of elements like materials, products, and events, all through straightforward configuration.
  • Support for Multiple Stakeholders: Plant Applications offers a variety of reporting and analytics capabilities, catering to both internal stakeholders focused on improvement and external stakeholders, ensuring their diverse needs are met.
  • Open and Layered Approach: Unlike many sustainability metrics systems that are manual or limited to specific sensors, Plant Applications enhances existing sensor, automation, and software investments, offering a more integrated solution.

Continue the conversation

Do you have any questions about sustainability and manufacturing? Chat to one of our friendly experts to find out more.

How to Implement IT Compliant OT
Thu, 26 Oct 2023

As manufacturing operations adopt more intelligent systems, we’ve seen control systems, equipment, and networks rebranded as Operational Technology (OT). With this has come a change in approach from IT departments, who for decades wanted nothing to do with the weird and wonderful equipment that populated the OT space. While keeping the operational world at arm’s length was possible for IT in the past, the two are now converging at such a pace that it is impossible, even perilous, to ignore.

A vital convergence

Cybersecurity is a crucial concern. OT equipment has become more IT-aligned by necessity, through standard protocols and Ethernet/IP connectivity. Like a bucket of cold water, this woke the IT world to the significant vulnerabilities presented by connected operational systems. Furthermore, the press has continued to fill with stories of backdoors exploited by nefarious actors, and the dire consequences for reputations, service, and profitability.

It was time for OT to be taken seriously and become part of the IT estate with the same high standards and best practice approaches to security.

So, what does this mean for you as a manufacturer?

Firstly, you must ensure that your control systems, such as PLC, SCADA etc., are secure from threats by keeping systems up to date and only providing connectivity between systems that require it. Leaving your entire operation wide open, with everything connected to everything else, is particularly hazardous. The optimal solution is to establish communication channels secured via switches and routers, allowing protocols to be enabled and disabled as required. Through this method, you can install firewalls between departments to further mitigate the threat of a cybersecurity breach.
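The deny-by-default principle behind such segmentation can be sketched in Python. The zones and protocols below are hypothetical, and a real deployment would enforce this in switches, routers and firewalls rather than application code.

```python
# Hypothetical allow-list of permitted traffic between network zones:
# anything not listed is denied by default, mirroring a firewall rule set.
ALLOWED = {
    ("scada", "historian"): {"opc-ua"},        # SCADA may write to the historian
    ("historian", "it-reporting"): {"https"},  # IT pulls reports over HTTPS only
}

def is_allowed(src_zone, dst_zone, protocol):
    """Deny by default; permit only explicitly enabled zone-to-zone protocols."""
    return protocol in ALLOWED.get((src_zone, dst_zone), set())

print(is_allowed("scada", "historian", "opc-ua"))  # explicitly permitted
print(is_allowed("office", "scada", "rdp"))        # denied by default
```

The key design choice is the default: connectivity must be opted in per channel, never assumed.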

The second point to consider is access control. Users should only be granted permissions to systems they require within an IT-supported domain. Paired with appropriate password complexity, a policy of regularly changing those passwords can minimise a potential vector of attack.

Next is virtualisation. By abstracting OT systems from the IT hardware, you can install physical hosts in an environmentally controlled data centre; rather than the old method of putting server racks under desks in control rooms, where they were subject to dust, heat, and the occasional accidental kicking from a steel-toe-capped boot.

Rounding out this brief overview are patching and backups. Patching regularly, at the same frequency as IT systems, keeps systems up to date and reduces exposure to newly disclosed vulnerabilities such as Log4j. We still visit sites where Windows XP, NT and Server 2000 are in use, running long after official support ended – meaning security patches are no longer available, while the vulnerabilities are well known and widely published.

Because OT should now be firmly on your IT department’s radar, creating a thorough backup regime will mean your systems are recoverable in the event of data loss due to a ransomware attack, operator error or any other disruption.
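One small piece of such a regime – retention pruning, keeping only the newest backups – can be sketched as follows. The file names are hypothetical.

```python
def prune_backups(backups, keep=7):
    """Given backup file names carrying ISO dates, return those to delete,
    keeping only the newest `keep` backups."""
    ordered = sorted(backups, reverse=True)  # ISO dates sort chronologically
    return sorted(ordered[keep:])

# Hypothetical nightly SCADA backups for ten consecutive days
names = [f"scada-backup-2024-03-{d:02d}.bak" for d in range(1, 11)]
print(prune_backups(names, keep=7))  # the three oldest are candidates for deletion
```

Restore testing matters just as much as retention: a backup regime is only proven when a recovery has actually been rehearsed.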

Experience and Expertise

Novotek Solutions delivers operational technology with a methodology shaped by more than three decades of experience in IT domains.

We’ve led the way in delivering all our projects to a high, IT-compliant standard. Our solutions are supportable, maintainable, and extensible to keep your operation fit for the future.


Smart Factory and the future of energy
Mon, 25 Sep 2023

The manufacturing industry accounts for much of the world’s energy consumption. According to the U.S. Energy Information Administration, manufacturing accounted for a whopping 33% of all energy consumption in the U.S. in 2021. In Norway, industry uses almost twice as much energy annually as private individuals*. By taking steps to reduce energy consumption, manufacturing companies can make a major impact on total consumption worldwide.

* Based on calculation with average figures (2022) from Statistics Norway.

Many manufacturing companies have already adopted sustainability strategies to reduce consumption and emissions, and more and more are trying to get started. With the ongoing energy crisis and rising energy costs, many manufacturing companies are dependent on reducing their energy consumption to remain competitive – or even survive. In addition, the industry faces stricter regulations and regulatory requirements related to sustainability, as well as more environmentally conscious consumers. The time to start producing more sustainably is now.

Use digital tools to implement sustainability strategies on the plant floor

Although many manufacturing companies already have sustainability strategies in place, the practical challenge is implementing the strategy on the plant floor. In order to produce more sustainably, it is crucial that production personnel have access to information on a daily basis. Only with insight into energy and raw material consumption can measures be taken to optimise production.

To solve this, you should use digital tools that give operators and other personnel the information they need while they work on the production process. Access to both real-time and historical data makes it possible to make both immediate and long-term improvements in production related to energy and raw material consumption, faulty manufacturing, traceability and more.

3 steps to reduce energy consumption

How can digital tools be used in practice? Below we share 3 steps on how you can map and optimize energy consumption in the production process.

Step 1: Map – “Are we using too much energy?”

See your consumption compared to normal consumption, targets or budgeted spend in real time.

  • Monitor consumption related to process areas and production lines
  • Record events in production
  • Record shifts, time of day and weather conditions
  • Compare performance across plants, products, and manufacturing teams
  • See consumption compared to sustainability KPIs (e.g. production carbon emissions)
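A minimal sketch of the first of these – classifying live consumption against a budgeted figure – might look like this. The thresholds and values are hypothetical.

```python
def consumption_status(actual_kwh, budget_kwh, warn_ratio=0.9):
    """Classify live consumption against budget: 'ok', 'warning' or 'over'."""
    if actual_kwh > budget_kwh:
        return "over"
    if actual_kwh > budget_kwh * warn_ratio:
        return "warning"  # within budget, but close enough to act on
    return "ok"

print(consumption_status(460, 500))  # close to budget
print(consumption_status(510, 500))  # over budget
```

Surfacing this status on an operator dashboard turns the mapping step from a monthly report into a live signal.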

Step 2: Explain – “Why are we using extra energy?”

Leverage context from the plant floor to understand how to improve resource efficiency.

  • Map the resource consumption of all products
  • Find inefficient equipment
  • Discover unknown patterns, scrap or opportunities for improvement
  • Contextualize data to manage sustainability KPIs
  • Use best practice to standardize operations

Step 3: Optimize – “How can we reduce energy consumption and costs?”

Take actions that improve operational performance and sustainability, both at the process level and throughout the plant.

  • Optimize production planning for better utilization of resources
  • Reduce resource consumption and associated costs
  • Reduce variations in production processes
  • Make your supply chain more agile and resilient
  • Ensure holistic optimization of the entire production environment

Sustainable production with Proficy Smart Factory

GE Digital’s Proficy Smart Factory software comes with all the features you need to gain insight into the manufacturing process and take action for more sustainable production. Already using an MES solution from GE Digital? Then you have all the tools you need at your fingertips!

Via the web-based dashboard platform Proficy Operations Hub, you can access visualized data anywhere, anytime. Below is an overview of six widgets that can be used to gain insight into the energy and raw material consumption of the production process.

Proficy Operations Hub widgets

Sparkline

Displays time series data. Can be used in several areas:

  • See energy or water consumption over a period of time
  • Correlate energy or water consumption to temperatures/precipitation/weather conditions over a period of time

Bullet Graph

Displays target value and real value.

  • See your energy consumption compared to normal consumption, budgeted consumption or goals

Bar Gauge, Circular Gauge and Solid Gauge

Three widgets with different visualizations of a value compared to a target.

  • See your energy consumption compared to normal consumption, budgeted consumption or goals

Pie Chart

Displays data values in pie or doughnut chart. Can be used in several areas:

  • Illustrate how consumption of e.g. energy and water affects total costs and greenhouse gas emissions
  • Show most energy-intensive processes
  • View material consumption

Join us in reversing the trend

According to Statista, it is expected that energy consumption in industry will continue to increase in the coming years. This is despite an increased focus on sustainability and several challenges for the manufacturing industry, including increased energy costs and stricter regulations and regulatory requirements.

With the right tools in place, you can make a difference – both for the environment and your own business. Do you want to help reverse the trend and work for a more sustainable industry? We’ll help you get started!

Ask us about Smart Factory and sustainability

Data capture and regulatory reporting
Thu, 29 Jun 2023

Data capture is critical when you’re looking to drive continuous improvements in manufacturing, and it is equally crucial for regulatory compliance. In this article, we’ll look at how intelligent systems can not only streamline the capture of data required for quality management and regulatory compliance in regulated industries, but ensure faster, easier use (and re-use!) of data once captured.

Does your business track critical control points? If so, are you able to retrieve that information quickly and easily? With the right sensors, platforms and software solutions, the necessary quality parameters can be continuously captured, with alerts generated in a timely manner for any deviation from specification. This can mean the difference between a good-quality batch and finished goods that require time and energy to rework to the appropriate standard.
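As an illustration, the spec-limit check behind such alerting can be sketched in a few lines of Python. The control point and limits are hypothetical.

```python
# Hypothetical spec limits for a critical control point (e.g. pasteuriser temperature)
SPEC = {"low": 71.5, "high": 74.0}  # degrees C

def check_sample(value, spec=SPEC):
    """Return None if the sample is in spec, otherwise a deviation alert
    suitable for the quality record."""
    if spec["low"] <= value <= spec["high"]:
        return None
    return {"value": value, "deviation": "low" if value < spec["low"] else "high"}

samples = [72.1, 73.8, 70.9, 72.5]  # continuously captured readings
alerts = [a for a in (check_sample(v) for v in samples) if a]
print(alerts)  # one low-temperature deviation flagged for follow-up
```

The in-spec readings still get stored; only the deviations generate alerts, so operators are interrupted exactly when action is needed.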

Furthermore, automatically performing regular in-process checks can improve the efficiency of operators and reduce the chances of incorrect data being captured or recorded which could lead to unnecessary work. With solutions from Novotek, we can help you start the journey to a fully automated quality management system.

Automated quality management reduces waste, increases yield and provides data for root cause analysis.

As all production processes consume raw materials, the exact nature and variability of these materials and the quantities used can have a significant impact on the quality of the finished product. Automatically adjusting the production setpoints to cater for the individual characteristics of raw materials can lead to a more consistent output.

By continuously capturing quality data through intelligent systems, you have the tools to perform a historical review of production performance based on different batches of raw materials. You may have implemented a historian, a lab system, even a performance metrics system already – but what if the information is in isolated silos that are not easily accessed? In these situations, we can take advantage of innovations in technology that may have been born outside the factory but offer value within the production world.

The Industrial Internet of Things (IIoT) is often understood to mean the devices and sensors that are interconnected via computers with industrial applications. In fact, it also includes the kind of data management platforms, “data ops” tools and methodologies that make managing and using industrial data easier.

Although IIoT may sometimes appear vast and daunting, through an iterative and scalable process you will rapidly see tangible results in reducing workloads, with an innovative platform for better quality and improved compliance with your industry’s regulations and standards. Linking together the disparate assets and data stores in your operation provides vital visibility, both in real time and over the history of your production process.

Your data collection and collation processes are streamlined and automated through this connectivity, facilitating the generation of electronic batch records (EBRs) that can be used to satisfy regulatory compliance. Modern data ops tools, combined with low-code app development tools, make it straightforward to combine data from siloed systems into intuitive user interfaces, making reviews against specific dates, batch codes, raw material lot numbers, or other production parameters more accessible and understandable.
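At its core, assembling an EBR amounts to merging per-batch fields from each siloed system into one record. A hypothetical sketch, with invented system names and fields:

```python
# Hypothetical siloed records, each keyed by batch code
historian = {"B-2301": {"peak_temp_c": 73.6}}
lims      = {"B-2301": {"ph": 4.2, "passed_qc": True}}
erp       = {"B-2301": {"raw_lot": "RM-88231", "order": "WO-5512"}}

def build_ebr(batch_id, *sources):
    """Merge the per-batch fields from each system into one electronic batch record."""
    record = {"batch": batch_id}
    for source in sources:
        record.update(source.get(batch_id, {}))
    return record

print(build_ebr("B-2301", historian, lims, erp))  # one record, many sources
```

In practice the platform does this join continuously and keeps the provenance of each field, but the shape of the result is the same: one reviewable record per batch.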

And this approach suits additional needs: compliance with standards and regulations is vital for the image of your operation. A tailored solution meets your requirements, from recording that required temperatures were hit to capturing exact measurements when combining the right ingredients at the right time. With our solutions, you can rest assured that you have access in perpetuity to every detail of what you’ve produced. That, in turn, means being able to support investigations and deliver reporting to fulfil obligations, for both internal and external stakeholders.

Smart systems offer robust methods for ensuring regulatory compliance

Many manufacturers are both blessed and cursed with an ever-growing flow of potentially useful data. We see our role as providing guidance on the best way to tap that flow so that many different stakeholders can be served at the least incremental cost, and with the least disruption to existing OT and IT. Thanks to the modernisation of plant systems and the increasing adoption of IIoT-style tools and platforms, our customers can put their data into the right hands at the right time more easily than ever!

MES: Build vs Buy
Mon, 05 Jun 2023

Every manufacturing operation requires communication and the sharing of data. In the past, data was manually recorded with pen and paper and shared at the walking speed of an operator.

The industry has come a long way since then, with forward-thinking operations undertaking digital transformation journeys to unlock greater efficiency, visibility and the capabilities required for continual improvement and profitability in the contemporary manufacturing landscape. 

However, not all approaches yield the same results. While adopting point solutions for individual functions may seem a sensible way to begin a digital transformation journey, there are a number of issues to consider.

Building MES functionality with point solutions requires careful consideration. The pitfalls are all too common, resulting in delayed progress and increased costs versus a single platform.

If you were to consider the data flow in your operation like plumbing in a house, concerns about differing approaches would soon become apparent. As numerous plumbers from different companies arrive to distribute water and heating around your home, difficulties reconciling differences between pipe diameters, connectors and joining mechanisms would result in burst pipes and water everywhere. 

Amongst the issues that come with composite systems is security. While plumbing together these systems, how do you consider cybersecurity with due diligence? Should you experience a cybersecurity threat, a growing and tangible danger, which vendors would you call for support? 

Vulnerabilities can reveal themselves when disparate solutions take diverging paths, at an incongruent pace, through their product roadmaps. The result is a constantly changing landscape in which your platform can continuously fall out of sync with its various component solutions, requiring constant attention and maintenance. That is not to mention the security risks of each system requiring different access routes in and out of information silos, which requires careful consideration as increased connections mean more potential attack vectors.

Bad actors take advantage of vulnerabilities in poorly secured systems

How do you ensure consistency and implementation of standards across vendor organisations? Best practice becomes challenging to implement, with no single approach for your entire MES system. Learning each solution will require training courses for each, resulting in increased time to competency for your operators. 

Implementing a single platform that provides seamless connectivity, efficiency management, quality management and production management solutions in addition to a raft of other capabilities rather than numerous point solutions avoids the headache. 

Where do you begin if you choose to implement a complete and integrated MES solution from a single vendor? The good news is that independent analysts have done a lot of homework for you. Gartner has asked vendors the tough questions to independently test the product and ensure confidence in connectivity, security, training, and the product’s roadmap.  

Novotek is the only Premier Solutions Partner for GE in the United Kingdom. With extensive experience and expertise in delivering and exceeding customers’ ambitions, Novotek has helped many manufacturers achieve greater profitability and efficiency with GE Digital products. 

So, what does Gartner have to say about GE Digital? 

“GE Digital is a Leader in this Magic Quadrant.” Gartner’s Magic Quadrant considers the completeness of a vendor’s vision alongside their ability to execute that vision to sort vendors between Leaders, Challengers, Niche Players and Visionaries.  

Gartner has highlighted strengths such as innovation, product improvements and customer experience as factors in GE Digital serving as a leading platform in the MES space. 

As systems trend towards more and more connectivity, owing to the significant value offered by data analysis for operational improvement, implementing unconnected or imperfectly deployed point solutions can put your operation on the back foot competitively. Additionally, a consistent naming structure and technical ontology are required to ensure systems can communicate flawlessly. This is inherent in a complete MES solution, but your team must consider and continuously monitor a collection of point solutions to achieve compatibility. 

Another downside to such an approach is paying multiple times for the same service. When deploying a point solution, each integration will require design, testing and implementation phases – each made more challenging by the need for each team to consider the other’s work, compatibility, language and methodology. 

GE Digital’s Plant Apps cover the following functionality as a rounded MES platform: 

  • Dispatching – Distributing work orders based on transactional data and demand 
  • Execution – Managing the production process
  • Data Management – Enabling the collection and management of data at regular intervals from all connected assets. 
  • Operational Data Store – Readily tailorable for purpose, MES can serve as a relational database for operational data or integrate with a data historian or IIoT platform. 
  • Quality Management – Regulated industries and products can benefit from standardisation and data capture to ensure compliance. 
  • Process – MES ensures all manufacturing steps are undertaken correctly, with the correct raw materials, temperatures, times, etc. 
  • Traceability – The ability to track the entire process from raw materials to intermediate and finished goods by lot, batch number, or other signifiers. 
  • Analytics and Reporting – Dashboard displays, advanced analytical tools and real-time KPIs provide data for accurate decision support. 
  • Integration – MES can bring together many disparate systems to create something greater than the sum of its parts. Tying together all production levels with enterprise systems, site planning, bill of materials, and recipe planning. 
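The traceability capability in that list, for instance, amounts to walking a genealogy of lot links from finished goods back to raw materials. A minimal, hypothetical sketch:

```python
# Hypothetical genealogy links: each lot records the lots consumed to make it
genealogy = {
    "FG-900": ["INT-450", "INT-451"],   # finished goods from two intermediates
    "INT-450": ["RM-100"],
    "INT-451": ["RM-100", "RM-101"],
}

def trace_raw_materials(lot, links):
    """Walk the genealogy back to the raw-material lots behind a given lot."""
    parents = links.get(lot, [])
    if not parents:          # no parents recorded: this is a raw-material lot
        return {lot}
    raws = set()
    for p in parents:
        raws |= trace_raw_materials(p, links)
    return raws

print(sorted(trace_raw_materials("FG-900", genealogy)))
```

The same walk run forwards answers the recall question: which finished-goods lots consumed a suspect raw-material lot.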

With a single platform, the Novotek team will tailor the solution to your individual needs within a coherent integration process. With a project undertaken in an orderly way and to return to the analogy of tradespeople in the home, you can be sure your plasterers, painters, and plumbers aren’t tripping over each other. 

What SCADA Evolution Means for Developers
Fri, 28 Oct 2022

If you’ve walked through factories and seen operator or supervisor screens like the one below, you’re actually seeing both the best and worst aspects of technology evolution! Clearly, no data is left hidden within the machine or process, but the screen design looks to have been driven by the ability to visualise what’s available from the underlying controls, rather than a more nuanced view of how to support different people in their work. You could say that the adoption of modern design approaches to building a “good” HMI or SCADA application has lagged behind what the underlying tools can support.

One place to configure & manage for SCADA, Historian, Visualisation

In Proficy iFIX, GE Digital has incorporated a mix of development acceleration and design philosophies that can lead to more effective user experiences with a deployed system, while also lowering the overall cost of building, maintaining, and adapting a SCADA.

Three critical elements stand out:

1. Model-centric design

This brings object-oriented development principles to SCADA and related applications. With a “home” for standard definitions of common assets, and their related descriptive and attribute data, OT teams can create reusable application components that are quick to deploy for each physical instance of a type. The model also provides useful application foundations, so things like animations, alarm filters and so on can be defined as appropriate for a class or type – and therefore easily rolled out into the screens where instances of each type are present. And with developments in the GE suite making the model infrastructure available to Historian, analytics and MES solutions, work done once can defray the cost and effort needed in related programs.
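As a rough conceptual sketch (not GE’s actual object model; all names and tag addresses are invented), the idea is that a type carries the shared definition once, and each physical instance only binds the type’s attributes to its own controller tags:

```python
# Hypothetical type-level definition, shared by every pump instance.
class PumpType:
    attributes = ("Speed", "DischargePressure", "RunStatus")
    # Alarm logic defined once for the class; the limit is an example value.
    alarm_filters = {"HighPressure": lambda bar: bar > 8.5}

class PumpInstance:
    """A physical pump: just a name plus tag bindings into the model."""
    def __init__(self, name, tag_bindings):
        self.name = name
        self.tag_bindings = tag_bindings  # attribute -> driver tag address

pumps = [
    PumpInstance("Pump-01", {"Speed": "PLC1:PMP01.SPD"}),
    PumpInstance("Pump-02", {"Speed": "PLC1:PMP02.SPD"}),
]

# Screens, animations and alarm filters reference the type once;
# each instance inherits them for free.
print(PumpType.alarm_filters["HighPressure"](9.2))  # True
```

The payoff is that a change to the type (a new attribute, a revised alarm filter) rolls out to every instance without touching each screen individually.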

2. Centralised, web-based administration and development

In combination with the modelling capability, this offers a big gain in productivity for teams managing multiple instances of SCADA. With common object definitions and standard screen templates, the speed at which new capabilities or changes to existing footprints can be built, tested and rolled out means a huge recovery of time for skilled personnel.

3. The subtle side of web-based clients

Many older applications have large bases of custom scripting – in many cases to enable interaction with data sources outside the SCADA, to drive non-standard animations, or to enable conditional logic. With the shift to web-based client technology, the mechanics for such functions are shifting to more configurable object behaviours, and to server-side functions for data integrations. The result is simpler, more maintainable and less error-prone deployments.

Taking advantage of what current-generation iFIX offers will mean a different development approach – considering a useful asset and object model structure, then templating the way objects should be deployed, is a new starting point for many. But with that groundwork laid, the speed to a final solution is in many (most!) cases faster than with older methodologies – and that’s before considering the advantage of reusability across asset types, or across multiple servers for different lines or sites.

Recovered time buys room for other changes

With rich automation data mapped to the model, and faster methods to build and roll out screens, different users can have their views tailored to suit their regular work. Our earlier screen example reflected a common belief that screen design is time-consuming, so it’s best to put as much data as possible in one place so that operators, technicians, maintenance and even improvement teams can all get what they need without excessive development effort. But that can mean a confused mashup of items that get in the way of managing the core process, and in turn actually hamper investigations when things are going wrong.

But where development time is less of a constraint, more streamlined views can be deployed to support core work processes, with increasing levels of detail exposed on other screens for more technical investigation or troubleshooting. Even without fully adopting GE Digital’s Efficient HMI design guidelines, firms can expect faster and more effective responses from operators and supervisors who don’t have to sift through complex, overloaded views simply to maintain steady-state operations.

With significant gains to be had in terms of operator responsiveness and effective management of expectations, the user experience itself can merit as much consideration as the under-the-bonnet changes that benefit developers.

Greenfield vs. Brownfield

It may seem that adopting a model-based approach, and taking first steps with the new development environments, would be easier on a fresh new project, whereas an upgrade scenario should be addressed by “simply” porting forward old screens, the database and so on. But when you consider all that can be involved in that forward migration, the mix of things that need “just a few tweaks” can mean as much – or more – work than a fresh build of the system, where the old serves as a point of reference for design and user requirements.

The process database is usually the easiest part of the configuration to migrate forward. Even if changing from legacy drivers to IGS or Kepware, these tend to be pretty quick. Most of the trade-offs of time/budget for an overall better solution are related to screen (and related scripting) upgrades. From the many (many!) upgrades we’ve observed our customers make, we see common areas where a “modernisation” rather than a migration can actually be more cost-effective, as well as leaving users with a more satisfying solution.

While there is often concern about whether modernisation can be “too much” change, it’s equally true that operators genuinely want to support their companies in getting better. So if what they see at the end of an investment looks and feels the same way it always has, the chance to enable improvements may have been lost – and with it a chance to engage and energise employees who want to be part of making things better.

Old vs. New

iFIX 2023 and the broader Proficy suite incorporate more modern tools, which in turn offer choices about methods and approaches. Beyond the technical enablement, engineering and IT teams may find that exploring these ideas offers benefits in areas ranging from modernising systems to avoid obsolescence risk, to making tangible progress on IoT and broader digital initiatives.

]]>
https://ideashub.novotek.com/3290-2/ Mon, 24 Oct 2022 09:46:04 +0000 https://ideashub.novotek.com/?p=3290 One of the advantages of managing technology assets is that you can do things with them beyond “just running them” – such as keeping track of them and repairing them! Optimising a production process for efficiency or utility usage is often a matter of enhancing the code in a control program, SCADA or related system, so the tech assets themselves can be the foundation for ongoing gains. And similarly, as customer or regulatory requirements for proof of security or insight into production processes increase, the tech assets again become the vehicle to satisfy new demands, rather than re-engineering the underlying mechanical or process equipment.

It’s this very adaptability that makes version control around the configurations and programs valuable. As configurations and programs change, being sure that the correct versions are running is key to sustaining the improvements that have been built into those latest releases. With that in mind, a good technology asset management program, such as octoplant, will have version control as a central concern.

Whether deploying solutions in this area for the first time, or refreshing an established set of practices, it’s worthwhile to step back and evaluate what you want version control to do for you – operationally, compliance-wise and so on. From that, the capabilities needed from any tools deployed will become clearer. With that in mind, we’ve noted some of the key areas to consider, and the decisions that can come from them. We hope this helps you set the stage for a successful project!

Decide How Deeply to Embed Version Control

We take VPNs, remote access and web applications for granted in a lot of ways – but this combination of technology means that it’s easier than ever to incorporate external development and engineering teams into your asset management and version control schemes. Evaluate whether it makes sense to set up external parties as users of your systems, or if it makes more sense to have your personnel manage the release and return of program/configuration files. The former approach can be most efficient in terms of project work, but it may mean some coordination with IT to ensure access is granted securely. Either way, setting your version control system to reflect when a program is under development by others can ensure you have a smooth process for reincorporating their work back into your operation.

Be Flexible About the Scope of What Should be Version-Controlled.

Program source code and configurations are the default focus of solutions like octoplant. Yet we see many firms deploying version control around supporting technical documentation, diagrams, and even SOP (Standard Operating Procedure) documents relating to how things like code troubleshooting and changes should be done.

Define Your Storage and Navigation Philosophy. 

In many cases, this can be a very easy decision – set up a model (and associated file storage structure) that reflects your enterprise’s physical reality, as illustrated below. This works especially well when deploying automated backup and compare-to-master regimens, as each individual asset is reflected in the model.

However, some types of business may find alternatives useful. If you have many instances of an asset where the code base is genuinely identical between assets, changes are rolled out en masse, and automated backup and compare is not to be deployed, it can make sense to think of a category-based or asset-type-specific model and storage scheme.

It may be that a blended approach makes sense. Where non-critical assets and programs may have variance both in their automation, and therefore in their program structure, an enterprise model can work well. But in some industries (food, pharma, CPG), it can be common to maintain identical core asset types and associated automation and process control, so having some category/type-based managed versions can be useful, too.
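The two schemes can be illustrated with a toy sketch (paths and names are invented; a tool like octoplant manages this structure for you):

```python
def enterprise_path(site, area, line, asset):
    """ISA-95-style physical hierarchy: suits per-asset backup/compare."""
    return f"/{site}/{area}/{line}/{asset}"

def category_path(asset_type, variant, asset):
    """Type-based scheme: suits identical code rolled out en masse."""
    return f"/types/{asset_type}/{variant}/{asset}"

print(enterprise_path("Leeds", "Packaging", "Line3", "Filler01"))
# /Leeds/Packaging/Line3/Filler01
print(category_path("Filler", "GenB", "Filler01"))
# /types/Filler/GenB/Filler01
```

A blended approach simply means some assets live under the physical tree while families of identical assets live under the type tree.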

Reporting and Dashboards – Version Control Data is Not Just for Developers and Engineers.

A robust solution will track actions taken by different users in relation to different assets’ code bases, and in relation to any automated comparisons. This means you have a rich audit trail that can certainly be used to ensure disciplines are being followed, but it also means that you can easily support any regulatory or customer requirements for data. And with a model of your operation reflecting the different makes, models, variants and generations of tech assets, you’ll have a tech inventory at your fingertips that can make reinvestment and replacement planning much more efficient. So, make sure your plan to share dashboards and reports reflects the different people in your organisation who could use their own view of the tech assets and the programs running in them.

If you’d like to learn more about the work we do with our customers on technology asset management, you can get in touch here; or ring us on +44 113 531 2400

]]>
Managing multiple energy sources https://ideashub.novotek.com/managing-multiple-energy-sources/ Tue, 18 Oct 2022 12:51:20 +0000 https://ideashub.novotek.com/?p=3270

In 2013, the UK Government Office for Science produced a report entitled the Future role for energy in manufacturing. In this report, they identified two threats to UK-based manufacturers. The first was that the price of energy in the UK would rise compared to the cost faced by competitor firms abroad, placing UK manufacturers at a significant disadvantage. Well, the price has risen – but globally, because of the Russia-Ukraine war. Nevertheless, the threat to UK manufacturing is still valid. The second threat was that a low-carbon electricity supply would be unreliable, and that the cost of power cuts would rise. That is certainly true if you rely solely on low-carbon electricity, but using multiple sources of power can be greatly beneficial.

In 2021, US rankings put technology companies at the top of their list of renewables users. Google derived 93% of its total electricity consumption from solar and wind power. Microsoft sourced 100% of its electricity from wind, small hydro and solar power, while Intel also derived 100% of its electricity from various renewables.

In the manufacturing world, more and more producers are turning to multiple sources to power their manufacturing, particularly those in energy-intensive production industries.

Tesla is well known for committing to renewable energy in manufacturing, with its solar-panelled roofs and use of waste heat and cold desert air to govern production processes in its Gigafactories.

Some of the bigger names in the manufacturing world that are utilising a solar system include GM, L’Oreal and Johnson & Johnson.

Manufacturing companies make ideal spots for solar system installations for several reasons. First, these businesses typically operate out of large plants with sizeable roofs. These expansive, flat spaces are perfect for setting up many solar panels. Also, manufacturing plants tend to be located in industrial parks and other areas far away from tall buildings, so they avoid the problems caused by massive structures looming over solar panels and creating shade. And smaller manufacturers can also benefit from multiple energy sources to both reduce their costs and reliance on the grid.

Making it work

The process of combining various types of energy is called a multi-carrier energy system, and it increases energy efficiency. The technology that allows two or more independent three-phase or single-phase power systems to synchronise can be delivered by a Power Sync and Measurement (PSM) system, such as the module found in the PACSystems RX3i Power Sync & Measurement System (IC694PSM001 & IC694ACC200). This will monitor two independent three-phase power grids. It incorporates advanced digital signal processor (DSP) technology to continuously process three voltage inputs and four current inputs for each grid.

Measurements include RMS voltages, RMS currents, RMS power, frequency, and phase relationship between the phase voltages of both grids.

The PSM module performs calculations on each captured waveform, with the DSP processing the data in less than two-thirds of a power line cycle. The PSM module can be used with wye or delta type three-phase power systems or with single-phase power systems.
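Conceptually, the RMS figures come straight from the captured waveform samples. Here is a minimal sketch on one simulated cycle of a 230 V mains waveform (the module’s DSP implementation is, of course, far more involved):

```python
import math

def rms(samples):
    """Root-mean-square value of one captured waveform cycle."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# One full cycle of a 230 V RMS sine wave, sampled 64 times
peak = 230.0 * math.sqrt(2)
cycle = [peak * math.sin(2 * math.pi * n / 64) for n in range(64)]
print(round(rms(cycle), 1))  # 230.0
```

The same squared-sum machinery, applied to voltage and current samples together, yields power and power factor.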

The PSM system can be used for applications such as:

  • Electrical power consumption monitoring and reporting
  • Fault monitoring
  • Generator control features for generator to power grid synchronization
  • Demand penalty cost reduction/load shedding

The PSM system consists of:

  • PSM module – A standard IC694 module that mounts in an RX3i main rack. The PSM module provides the DSP capability.
  • Terminal Assembly – A panel-mounted unit that provides the interface between the PSM module and the input transformers.
  • Interface cables – Provide the GRID 1 and GRID 2 connections between the PSM module and the Terminal Assembly

The image below shows how a basic PSM system can be connected.

PSM System Features
  • Uses standard, user-supplied current transformers (CTs) and potential transformers (PTs) as its input devices.
  • Accurately measures RMS voltage and current, power, power factor, frequency, energy, and total three-phase 15-minute power demand.
  • Provides two isolated relays that close when the voltage phase relationships between the two monitored grids are within the specified ANSI 25 limits provided by the RX3i host controller. These contacts can be used for general-purpose, lamp duty or pilot duty loads. Voltage and current ratings for these load types are provided in GFK-2749, PACSystems RX3i Power Sync and Measurement System User’s Manual.
  • Provides a cable monitoring function that indicates when the cables linking the PSM module and Terminal Assembly are correctly installed.
  • PSM module and Terminal Assembly are easily calibrated by hardware configuration using the PAC Machine Edition (PME) software.
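The ANSI 25 synchronism check mentioned above reduces to confirming that the voltage difference, frequency slip and phase angle between the two grids are all within limits before the relay closes. The sketch below uses illustrative limits only; the real limits are supplied by the RX3i host controller:

```python
def sync_ok(v1, v2, f1, f2, phase_deg,
            max_dv_frac=0.05, max_slip_hz=0.1, max_phase_deg=10.0):
    """Close-permissive check between two grids; limits are examples only."""
    return (abs(v1 - v2) / v1 <= max_dv_frac       # voltage difference
            and abs(f1 - f2) <= max_slip_hz        # frequency slip
            and abs(phase_deg) <= max_phase_deg)   # phase angle

print(sync_ok(230.0, 229.0, 50.0, 50.02, 3.0))  # True: within all limits
print(sync_ok(230.0, 210.0, 50.0, 50.02, 3.0))  # False: voltages too far apart
```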

To find out how Novotek can help you reduce your energy consumption and manage multiple energy sources email us at info_uk@novotek.com

]]>
DataOps: The Fuel Injectors For Your Transformation Engine? https://ideashub.novotek.com/dataops-the-fuel-injectors-your-transformation-engine/ Thu, 19 May 2022 11:43:48 +0000 https://ideashub.novotek.com/?p=3060

Data – everyone agrees it’s the fuel for the fires of innovation and optimisation. The industrial world is blessed with an abundance of rich, objective (being machine-generated) data, so should be well-equipped to seek new advantages from it. Too often, the first efforts an industrial firm takes to harness its machine and process data for new reporting or advanced analysis initiatives involve simple use cases and outputs that can mask what it takes to support a mix of different needs in a scalable and supportable way. Data Ops practices provide a way of systemically addressing the steps needed to ensure that your data can be made available in the right places, at the right times, in the right formats for all the initiatives you’re pursuing.


Industrial data (or OT data) poses particular challenges that your Data Ops strategy will address:

  • It can be generated at a pace that challenges traditional enterprise (or even cloud-layer) data collection and data management systems (to say nothing of the costs of ingestion and processing during reporting/analysis that are typical of cloud platforms).
  • The data for functionally identical assets or processes is often not generated in a consistent structure and schema.
  • OT data generally does not have context established around each data point – making it difficult to understand what it represents, let alone the meaning inherent in the actual values!
  • Connecting to a mix of asset types with different automation types and communications protocols is often necessary to get a complete data set relevant to the reporting or analytics you’re pursuing.
  • A wide array of uses demands different levels of granularity of some data points and a breadth of collection points that is significantly wider than many individual stakeholders may appreciate.

These are the reasons why in many firms, the engineering team often ends up becoming the “data extract/Excel team” – their familiarity with the underlying technology means they can take snapshots and do the data cleansing necessary to make the data useful. But that’s not scalable, and data massaging is a far less impactful use of their time – they should be engaged with the broader team interpreting and acting on the data!


Data Ops – Quick Definition

There’s no one way to “do” Data Ops. In the industrial world, it’s best thought of as a process involving:

  • Determining the preferred structures and descriptions (models) for OT data, so it may serve the uses the organisation has determined will be valuable.
  • Assessing what approaches to adding such models can be adopted by your organisation.
  • Choosing the mix of tools needed to add model structures to a combination of existing and new data sources.
  • Establishing the procedure to ensure that model definitions don’t become “stale” if business needs change.
  • Establishing the procedures to ensure that new or changing data sources are brought into the model-based framework promptly.


A Rough Map is Better Than No Map.

Take a first pass at capturing all the intended uses of your OT data. What KPIs, what reports, what integration points and what analytics are people looking for? Flesh out those user interests with an understanding of what can feed into them:

  1. Map the different stakeholders’ data needs in terms of how much they come from common sources, and how many needs represent aggregations, calculations or other manipulations of the same raw data.
  2. Flesh out the map by defining the regularity with which data needs to flow to suit the different use cases. Are some uses based on by-shift or daily views of some data? Are other uses based on feeding data in real time between systems to trigger events or actions?
  3. Now consider what data could usefully be “wrapped around” raw OT data to make it easier for the meaning and context of that data to be available for all. Assess what value can come from:
    1. Common descriptive models for assets and processes – a “Form Fill & Seal Machine” with variables like “Speed” and “Web Width” (etc.) is a far easier construct for many people to work with than a database presenting a collection of rows reflecting machines’ logical addresses, with a small library of cryptically structured variables associated with each one.
    2. An enterprise model to help understand the locations and uses of assets and processes. The ISA-95 standard offers some useful guidance in establishing such a model.
    3. Additional reference data to flesh out the descriptive and enterprise models (e.g. the make and model of common asset types with many vendors, or information about a location such as latitude or elevation). Be guided by what kind of additional data would be helpful in comparing/contrasting/investigating differences in outcomes that need to be addressed.
  4. Now assess what data pools are accumulating already – and how much context is accumulating in those pools. Can you re-use existing investments to support these new efforts, rather than creating a parallel set of solutions?
  5. Finally, inventory the OT in use where potentially useful data is generated, but not captured or stored; particularly note connectivity options.
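As an entirely hypothetical illustration of “wrapping” raw OT data, the sketch below attaches asset-model and enterprise-location context to raw tag readings – the kind of transformation an edge tool performs before publishing data onward. All tag names and context values are invented:

```python
# Raw tag values as a data source might emit them: cryptic addresses, no context.
raw_sample = {"PLC7.FF12.SPD": 41.7, "PLC7.FF12.WEB_W": 320.0}

# Model: maps each raw tag to an asset, a named attribute and units.
model = {
    "PLC7.FF12.SPD":   {"asset": "FFS-Machine-12", "attribute": "Speed",
                        "units": "packs/min"},
    "PLC7.FF12.WEB_W": {"asset": "FFS-Machine-12", "attribute": "WebWidth",
                        "units": "mm"},
}
site_context = {"enterprise": "AcmeFoods", "site": "Leeds", "area": "Packing"}

def wrap(raw, model, context):
    """Attach model and location context to each raw tag reading."""
    return [
        {**context, **model[tag], "value": value}
        for tag, value in raw.items()
    ]

for record in wrap(raw_sample, model, site_context):
    print(record["asset"], record["attribute"], record["value"], record["units"])
```

Downstream consumers can then query by asset and attribute (“Speed on FFS-Machine-12 at Leeds”) instead of deciphering controller addresses.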

Avoiding A Common Trap

“Data for analytics” means different things at different stages. A data scientist looking to extract new insights from OT data may need very large data sets in the data centre or cloud, where they can apply machine learning or other “big data” tools to a problem. A process optimisation team deploying a real-time analytic engine to make minute-by-minute use of the outputs of the data scientists’ work may only need small samples across a subset of data points for their part of the work. Data Ops thinking will help you ensure that both of these needs are met appropriately.


Map’s Done – Now How About Going Somewhere?

The work that comes next is really the “Ops” part of Data Ops – with the rough map of different uses of OT data at hand, and the view of whether each use needs granular data, aggregated data, calculated derivations (like KPIs), or some kind of combination, you’ll be able to quickly determine where generating desired outputs requires new data pools or streams, or where existing ones can be used. And for both, your data modelling work will guide what structures and descriptive data need to be incorporated.

At this point, you may find that some existing data pools lend themselves to having asset and descriptive models wrapped around the raw data at the data store level – i.e. centrally. It’s a capability offered in data platform solutions like GE’s Proficy Historian. This approach can make more sense than extracting data sets simply to add model data and then re-writing the results to a fresh data store. Typically, streaming/real-time sources offer more choice in how best to handle adding the model around the raw data – and there are solutions, like HighByte’s Intelligence Hub, that allow the model information to be added at the “edge” – the point where the data is captured in the first place. With the model definitions included at this point, you can set up multiple output streams – some feeding more in-the-moment views or integration points, some feeding data stores. In both cases, the model data having been imposed at the edge makes it easier for the ultimate user of the data to understand the context and the meaning of what’s in the stream.



Edge Tools vs Central

Realistically, you’re likely to need both, and the driving factor will not necessarily be technical.

Edge works better when:
  1. You have a team that deals well with spreading standardised templates.
  2. Data sources are subject to less frequent change (utility assets are a good example of this).
  3. The use cases require relatively straightforward “wrapping” of raw data with model information.

Central works well when:
  1. The skills and disciplines to manage templates across many edge data collection footprints are scarce.
  2. The mix of ultimate uses of the data is more complex – requiring more calculations, derivations or modelling of relationships between different types of data sources.
  3. Change in underlying data sources is frequent enough that some level of dedicated and/or systematised change detection and remediation is needed.


Regardless of which tools are applied, the model definitions defined earlier, applied consistently, ensure that different reports, calculations and integration tools can be developed more easily, and adapted more easily as individual data sources under the models are inevitably tweaked, upgraded or replaced. As new automation or sensors come in, their unique data structures simply need to be bound to the models representing them, and the “consumers” of their outputs will continue to work. So, while tools will be needed, ultimately the most valuable part of “doing” Data Ops is the thinking that goes into deciding what needs to be wrapped around raw data for it to become the fuel for your digital journey.

]]>
1,000 miles or around the block: Start one step at a time… https://ideashub.novotek.com/1000-miles-or-around-the-block-start-one-step-at-a-time/ Wed, 16 Mar 2022 11:44:08 +0000 https://ideashub.novotek.com/?p=2996

The rise of connected industrial technologies and Industry 4.0 has prompted the development and launch of countless systems with extensive capabilities and functions. This is often beneficial for businesses with a defined, set long-term strategy, but it can force early adoption and spending when deployments and licensing outstrip a company’s capacity to change work processes and adopt new tech.


Here, Sean Robinson, software solutions manager at Novotek UK and Ireland, explains how less can be more with new plant tech deployments – and why immediate problem-solving needs to be a distinct effort within longer-term strategies.


Countless proverbs, maxims and quotes have been formed around the idea of moderation, dating back as far as – or even further than – Ancient Greek society. The notion remains important to this day for everything from diet to technology. However, engineers and plant managers frequently over-indulge in the latter and over-specify systems that offer functionality well beyond what is necessary or even practically useful.

It can initially appear that there is no harm in opting for an automation or plant IT system that has extensive functionality, because this may help to solve future problems as they arise. That being said, an investment positioned to be all-encompassing, like a full, material-receiving-through-WIP-execution-with-performance-analysis-and-enterprise-integration manufacturing execution system (MES), can sometimes present its own barriers to adoption for certain businesses – especially those in sectors that favour flexibility, such as fast-moving consumer goods (FMCG) or food manufacturing (and, interestingly, increasingly the contract production side of consumer health and life sciences). Where core production processes and related enabling technology are well-established, it can be risky, expensive and overkill to treat the need to implement specific new capabilities as the trigger for wholesale replacement or re-working. The key is to identify where critical new functional needs can be implemented around the installed technology base in focused ways that deliver results, while leaving open the option of incrementally adding additional functionally-focused solutions in a staged way, over time.

At Novotek, our role is to help our customers choose technology that delivers on an immediate need, while opening up the potential to build incrementally in a controlled, low-risk way.

Fortunately, both the licensing models and the technical architectures of plant IT solutions are changing in ways that support this kind of approach. So, the software cost and deployment services cost of bringing on board very specific capabilities can be scaled to match the user base and the technical and functional boundaries of a specific need. We can think of these focused deployments as “micro-apps”. A key part of this approach is that the apps aren’t built as bespoke, or as an extension of a legacy (and possibly obsolete) system. It’s a productised solution – with only the “right” parts enabled and delivered to the right stakeholders. Consider quality in toiletry production, and specifically the challenge of product loss due to variability in the quality of raw materials. It’s safe to assume that a plant will already have local control systems in place elsewhere to track the overall quality outcomes, but monitoring raw material quality is often left to supplier-side data that may be under-used – serving as a record of supplier compliance with a standard, rather than being used to proactively trigger adjustments in key process settings to avoid losses. In this scenario, an ideal micro-app could be focused on capturing raw material data, using machine learning to provide deep analysis of how existing machines can best process the material lot, and alerting supervisors and process owners to take action. Such a function might have a small number of users; it might even have integration with inventory or quality systems replacing some manual data entry. So, the software licensing, services and timelines to deliver impact can all be kept small. When we consider some of the demands manufacturers now face – on fronts ranging from qualifying new suppliers/materials, to furthering energy and water reduction, to adopting more predictive maintenance and asset management strategies – we see a lot of potential to tackle these with focused solutions that borrow from the underlying depth and breadth of MES solutions.

There are unquestionably many cases where a plant-wide solution like an MES is necessary or even preferable. We and our key technology and services partners have delivered many such “complete” systems across the country. However, it should certainly not be considered the only option for agile industrial businesses. If each factory can be thought of as a collection of work processes/functions that need to be delivered, then implementing the supporting/enabling technology as a collection of micro-apps can make sense. And when balancing risk, cost and speed to value, sometimes, moderation in plant technology deployments can provide the most bountiful benefits.

]]>