Data analytics insights – Novotek Ideas Hub https://ideashub.novotek.com

Data capture and regulatory reporting https://ideashub.novotek.com/data-capture-and-regulatory-reporting/ Thu, 29 Jun 2023

Data capture is critical when you’re looking to drive continuous improvements in manufacturing, and it is equally crucial for regulatory compliance. In this article, we’ll look at how intelligent systems can not only streamline the capture of data required for quality management and regulatory compliance in regulated industries, but also ensure faster, easier use (and re-use!) of that data once captured.

Does your business track critical control points? If so, are you able to retrieve that information quickly and easily? With the right sensors, platforms and software solutions, the necessary quality parameters can be continuously captured, with alerts generated in a timely manner for any deviation from specification. This can mean the difference between a batch of good quality and finished goods that require time and energy to rework to the appropriate standard.
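
By way of a rough illustration, here is a minimal Python sketch of that monitoring loop: each critical control point is polled and an alert raised on any deviation from specification. The tag name, limits and the read/alert functions are hypothetical placeholders, not any particular platform’s API.

```python
import time

SPEC_LIMITS = {"pasteuriser_temp_c": (71.5, 75.0)}   # hypothetical CCP limits

def read_sensor(tag):
    """Placeholder for a real read via OPC UA, a historian API, etc."""
    return None   # no live data source in this sketch

def send_alert(tag, value, low, high):
    """Placeholder: could raise a SCADA alarm, send an email or page a phone."""
    print(f"DEVIATION {tag}: {value:.2f} outside [{low}, {high}]")

def monitor(poll_seconds=10):
    """Continuously check each critical control point against its limits."""
    while True:
        for tag, (low, high) in SPEC_LIMITS.items():
            value = read_sensor(tag)
            if value is not None and not (low <= value <= high):
                send_alert(tag, value, low, high)
        time.sleep(poll_seconds)
```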

Furthermore, automatically performing regular in-process checks can improve the efficiency of operators and reduce the chances of incorrect data being captured or recorded which could lead to unnecessary work. With solutions from Novotek, we can help you start the journey to a fully automated quality management system.

Automated quality management reduces waste, increases yield and provides data for root cause analysis.

As all production processes consume raw materials, the exact nature and variability of these materials and the quantities used can have a significant impact on the quality of the finished product. Automatically adjusting the production setpoints to cater for the individual characteristics of raw materials can lead to a more consistent output.

By continuously capturing quality data through intelligent systems, you have the tools to perform a historical review of production performance based on different batches of raw materials. You may have implemented a historian, a lab system, even a performance metrics system already – but what if the information sits in isolated silos that are not easily accessed? In these kinds of situations, we can take advantage of innovations in technology that may have been born outside the factory, but can offer value within the production world.

The Industrial Internet of Things (IIoT) is often understood to mean the devices and sensors that are interconnected via computers with industrial applications. In fact, it also includes the kind of data management platforms, “data ops” tools and methodologies that make managing and using industrial data easier. Although IIoT may sometimes appear vast and daunting, through an iterative and scalable process you will rapidly see tangible results in reducing workloads, with an innovative platform for better quality and improved compliance with your industry’s regulations and standards. Linking together the disparate assets and data stores in your operation provides vital visibility, both in real-time and over the history of your production process.

Your data collection and collation processes are streamlined and automated through this connectivity, facilitating the generation of electronic batch records (EBRs) that can be used to satisfy regulatory compliance. Modern data ops tools, combined with low-code app development tools, make it straightforward to combine data from siloed systems into intuitive user interfaces that make reviewing data against specific dates, batch codes, raw material lot numbers or other production parameters more accessible and understandable.
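
To make the idea concrete, here is a small Python sketch of collating an electronic batch record from siloed sources, keyed on a batch ID. The record fields and the three fetch stubs are assumptions for illustration; in a real deployment they would be queries against your historian, LIMS and ERP/inventory systems.

```python
from dataclasses import dataclass, field

@dataclass
class BatchRecord:
    batch_id: str
    process_data: dict = field(default_factory=dict)    # historian summaries
    lab_results: dict = field(default_factory=dict)     # LIMS test outcomes
    material_lots: list = field(default_factory=list)   # consumed raw material lots

# Stubs standing in for queries against the real, siloed systems.
def fetch_historian_summary(batch_id): return {"peak_temp_c": 74.2}
def fetch_lab_results(batch_id): return {"ph": 6.8, "micro": "pass"}
def fetch_consumed_lots(batch_id): return ["RM-1001", "RM-1044"]

def build_ebr(batch_id):
    """Collate one batch's electronic record from the siloed sources."""
    return BatchRecord(
        batch_id=batch_id,
        process_data=fetch_historian_summary(batch_id),
        lab_results=fetch_lab_results(batch_id),
        material_lots=fetch_consumed_lots(batch_id),
    )

print(build_ebr("BATCH-42"))
```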

And this approach suits additional needs: compliance with standards and regulations is vital for the image of your operation. A tailored solution meets your requirements, from recording that required temperatures were hit to capturing exact measurements when combining the right amount of ingredients at the right time. With our solutions, you can rest assured that you have access in perpetuity to every detail of what you’ve produced. And that in turn means being able to support investigations and deliver reporting to fulfil obligations, for both internal and external stakeholders.

Smart systems offer robust methods for ensuring regulatory compliance

Many manufacturers are both blessed and cursed with an ever-growing flow of potentially useful data. We see our role as providing guidance on the best way to tap that flow, so that many different stakeholders can be served for the least incremental cost and the least disruption to existing OT and IT. Thanks to the modernisation of plant systems, and increasing adoption of IIoT-style tools and platforms, our customers can put their data into the right hands at the right time more easily than ever!

MES: Build vs Buy https://ideashub.novotek.com/building-or-buying-mes/ Mon, 05 Jun 2023

Every manufacturing operation requires communication and the sharing of data. In the past, data was manually recorded with pen and paper and shared at the walking speed of an operator.

The industry has come a long way since then, with forward-thinking operations undertaking digital transformation journeys to unlock greater efficiency, visibility and the capabilities required for continual improvement and profitability in the contemporary manufacturing landscape. 

However, not all approaches yield the same results. While point solutions that provide new capabilities for individual functions in your manufacturing operation may seem a sensible way to begin a digital transformation journey, there are a number of issues to consider. 

Building MES functionality with point solutions requires careful consideration. The pitfalls are all too common, resulting in delayed progress and increased costs versus a single platform.

If you were to consider the data flow in your operation like plumbing in a house, concerns about differing approaches would soon become apparent. As numerous plumbers from different companies arrive to distribute water and heating around your home, difficulties reconciling differences between pipe diameters, connectors and joining mechanisms would result in burst pipes and water everywhere. 

Amongst the issues that come with composite systems is security. While plumbing together these systems, how do you consider cybersecurity with due diligence? Should you experience a cybersecurity threat, a growing and tangible danger, which vendors would you call for support? 

Vulnerabilities can reveal themselves when disparate solutions take diverging paths, at an incongruent pace, through their product roadmaps. The result is a constantly changing landscape in which your platform can continuously fall out of sync with its various component solutions, requiring constant attention and maintenance. That is not to mention the security risks of each system requiring different access routes in and out of information silos, which requires careful consideration as increased connections mean more potential attack vectors.

Bad actors take advantage of vulnerabilities in poorly secured systems

How do you ensure consistency and implementation of standards across vendor organisations? Best practice becomes challenging to implement, with no single approach for your entire MES system. Each solution will also require its own training courses, resulting in increased time to competency for your operators. 

Implementing a single platform that provides seamless connectivity, efficiency management, quality management and production management – in addition to a raft of other capabilities – rather than numerous point solutions avoids the headache. 

Where do you begin if you choose to implement a complete and integrated MES solution from a single vendor? The good news is that independent analysts have done a lot of homework for you. Gartner has asked vendors the tough questions to independently test the product and ensure confidence in connectivity, security, training, and the product’s roadmap.  

Novotek is the only Premier Solutions Partner for GE in the United Kingdom. With extensive experience and expertise in delivering and exceeding customers’ ambitions, Novotek has helped many manufacturers achieve greater profitability and efficiency with GE Digital products. 

So, what does Gartner have to say about GE Digital? 

“GE Digital is a Leader in this Magic Quadrant.” Gartner’s Magic Quadrant considers the completeness of a vendor’s vision alongside their ability to execute that vision to sort vendors between Leaders, Challengers, Niche Players and Visionaries.  

Gartner has highlighted strengths such as innovation, product improvements and customer experience as factors in GE Digital serving as a leading platform in the MES space. 

As systems trend towards more and more connectivity, owing to the significant value offered by data analysis for operational improvement, implementing unconnected or imperfectly deployed point solutions can put your operation on the back foot competitively. Additionally, a consistent naming structure and technical ontology are required to ensure systems can communicate flawlessly. This is inherent in a complete MES solution, but your team must consider and continuously monitor a collection of point solutions to achieve compatibility. 

Another downside to such an approach is paying multiple times for the same service. When deploying a point solution, each integration will require design, testing and implementation phases – each made more challenging by the need for each team to consider the other’s work, compatibility, language and methodology. 

GE Digital’s Plant Apps cover the following functionality as a rounded MES platform: 

  • Dispatching – Distributing work orders based on transactional data and demand
  • Execution – Managing the production process
  • Data Management – Enabling the collection and management of data at regular intervals from all connected assets
  • Operational Data Store – Readily tailorable for purpose; MES can serve as a relational database for operational data or integrate with a data historian or IIoT platform
  • Quality Management – Regulated industries and products can benefit from standardisation and data capture to ensure compliance
  • Process – MES ensures all manufacturing steps are undertaken correctly, with the correct raw materials, temperatures, times, etc.
  • Traceability – The ability to track the entire process from raw materials to intermediate and finished goods by lot, batch number or other signifiers (see the sketch after this list)
  • Analytics and Reporting – Dashboard displays, advanced analytical tools and real-time KPIs provide data for accurate decision support
  • Integration – MES can bring together many disparate systems to create something greater than the sum of its parts, tying together all production levels with enterprise systems, site planning, bills of materials and recipe planning
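
For a feel of what the traceability function records underneath, here is a minimal Python sketch of lot genealogy: each consumption event links an input lot to the lot it was consumed into, so a query can walk the graph from a suspect raw material lot to every affected finished good. The structures and lot names are illustrative, not Plant Apps internals.

```python
from collections import defaultdict

consumed_into = defaultdict(set)   # input lot  -> output lots it went into
made_from = defaultdict(set)       # output lot -> input lots it was made from

def record_consumption(input_lot, output_lot):
    """Log one consumption event, keeping both directions walkable."""
    consumed_into[input_lot].add(output_lot)
    made_from[output_lot].add(input_lot)

def affected_lots(raw_lot):
    """Forward trace: every downstream lot touched by a suspect raw lot."""
    seen, stack = set(), [raw_lot]
    while stack:
        for nxt in consumed_into.get(stack.pop(), ()):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

record_consumption("RM-1001", "WIP-55")   # raw material into work-in-progress
record_consumption("WIP-55", "FG-9")      # work-in-progress into finished goods
print(affected_lots("RM-1001"))           # -> {'WIP-55', 'FG-9'}
```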

With a single platform, the Novotek team will tailor the solution to your individual needs within a coherent integration process. With the project undertaken in an orderly way – and to return to the analogy of tradespeople in the home – you can be sure your plasterers, painters and plumbers aren’t tripping over each other. 

What SCADA Evolution Means for Developers https://ideashub.novotek.com/what-scada-evolution-means-for-developers/ Fri, 28 Oct 2022

If you’ve walked through factories and seen operator or supervisor screens like the one below, you’re actually seeing both the best and worst aspects of technology evolution! Clearly, no data is left hidden within the machine or process, but screen design looks to have been driven by the ability to visualise whatever is available from the underlying controls, rather than a more nuanced view of how to support different people in their work. You could say that the adoption of modern design approaches to building a “good” HMI or SCADA application has lagged behind what the underlying tools can support.

One place to configure & manage for SCADA, Historian, Visualisation

In Proficy iFIX, GE Digital has incorporated a mix of development acceleration and design philosophies that can lead to more effective user experiences with a deployed system, while also making the overall cost of building, maintaining and adapting a SCADA lower.

Three critical elements stand out:

1. Model-centric design

This brings object-oriented development principles to SCADA and related applications. With a “home” for standard definitions of common assets, and their related descriptive and attribute data, OT teams can create reusable application components that are quick to deploy for each physical instance of a type. The model also provides useful application foundations, so things like animations, alarm filters and so on can be defined as appropriate for a class or type – and therefore easily rolled out into the screens where instances of each type are present. And with developments in the GE suite making the model infrastructure available to Historian, analytics and MES solutions, work done once can defray the cost and effort needed in related programs.
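
The sketch below captures the spirit of that model-centric approach in plain Python – not iFIX’s actual configuration API: an asset class is defined once, with its standard tags and alarm defaults, and each physical instance is then just a binding of that template to real I/O addresses. All names and addresses are invented for illustration.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AssetClass:
    name: str
    tags: tuple          # standard variables every instance exposes
    alarm_limits: dict   # per-tag (low, high) defaults for the class

PUMP = AssetClass(
    name="CentrifugalPump",
    tags=("Speed", "DischargePressure", "MotorTemp"),
    alarm_limits={"MotorTemp": (0.0, 85.0)},
)

def instantiate(cls, asset_id, address_map):
    """Bind the class template to one physical asset's I/O addresses."""
    missing = set(cls.tags) - set(address_map)
    if missing:
        raise ValueError(f"{asset_id}: unmapped tags {missing}")
    return {"id": asset_id, "class": cls.name, "bindings": address_map,
            "alarm_limits": cls.alarm_limits}

pump_101 = instantiate(PUMP, "PUMP-101", {
    "Speed": "PLC1:DB5.REAL0",
    "DischargePressure": "PLC1:DB5.REAL4",
    "MotorTemp": "PLC1:DB5.REAL8",
})
print(pump_101["class"], "->", pump_101["bindings"]["MotorTemp"])
```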

2. Centralised, web-based administration and development

In combination with the modelling capability, this offers a big gain in productivity for teams managing multiple instances of SCADA. With common object definitions and standard screen templates, the speed at which new capabilities or changes to existing footprints can be built, tested and rolled out means a huge recovery of time for skilled personnel.

3. The subtle side of web-based clients

Many older applications have large bases of custom scripting – in many cases to enable interaction with data sources outside the SCADA, drive non-standard animations, or enable conditional logic. With the shift to web-based client technology, the mechanics for such functions are shifting to more configurable object behaviours, and to server-side functions for data integrations. These mean simpler, more maintainable and less error-prone deployments.

Taking advantage of what current-generation iFIX offers will mean a different development approach – considering a useful asset and object model structure, then templating the way objects should be deployed, is a new starting point for many. But with that groundwork laid, the speed to a final solution is in many (most!) cases faster than older methodologies – and that’s before considering the advantage of reusability across asset types, or across multiple servers for different lines or sites.

Recovered time buys room for other changes

With rich automation data mapped to the model, and faster methods to build and roll out screens, different users can have their views tailored to suit their regular work. Our earlier screen example reflected a common belief that screen design is time-consuming, so it is best to put as much data as possible in one place so that operators, technicians, maintenance and even improvement teams can all get what they need without excessive development effort. But that can mean a confused mashup of items that get in the way of managing the core process, and in turn actually hamper investigations when things are going wrong.

But where development time is less of a constraint, more streamlined views can be deployed to support core work processes, with increasing levels of detail exposed on other screens for more technical investigation or troubleshooting. Even without fully adopting GE Digital’s Efficient HMI design guidelines, firms can expect faster and more effective responses from operators and supervisors who don’t have to sift through complex, overloaded views simply to maintain steady-state operations.

With significant gains to be had in terms of operator responsiveness, and effective management of expectations, the user experience itself can merit as much consideration as the under-the-bonnet changes that benefit developers.

Greenfield vs. Brownfield

It may seem like adopting a model-based approach and taking first steps with the new development environments would be easier on a fresh new project, whereas an upgrade scenario should be addressed by “simply” porting forward old screens, the database, etc. But when you consider all that can be involved in that forward migration, the mix of things that need “just a few tweaks” can mean as much – or more – work than a fresh build of the system, where the old serves as a point of reference for design and user requirements.

The process database is usually the easiest part of the configuration to migrate forward. Even if changing from legacy drivers to IGS or Kepware, these tend to be pretty quick. Most of the trade-offs of time/budget for an overall better solution are related to screen (and related scripting) upgrades. From many (many!) upgrades we’ve observed our customers make, we see common areas where a “modernisation” rather than a migration can actually be more cost-effective, as well as leaving users with a more satisfying solution.

One question to consider: how much change is the right amount? While there is often concern about whether modernisation can be “too much” change, it’s equally true that operators genuinely want to support their companies in getting better. So if what they see at the end of an investment looks and feels the same way it always has, the chance to enable improvements may have been lost – and with it a chance to engage and energise employees who want to be a part of making things better.

Old vs. New

iFIX 2023 and the broader Proficy suite incorporate more modern tools, which in turn offer choices about methods and approaches. Beyond the technical enablement, engineering and IT teams may find that exploring these ideas offers benefits in areas ranging from modernising systems to avoid obsolescence risk, to making tangible progress on IoT and broader digital initiatives.

DataOps: The Fuel Injectors For Your Transformation Engine? https://ideashub.novotek.com/dataops-the-fuel-injectors-your-transformation-engine/ Thu, 19 May 2022

Data – everyone agrees it’s the fuel for the fires of innovation and optimisation. The industrial world is blessed with an abundance of rich, objective (being machine-generated) data, so it should be well-equipped to seek new advantages from it. Too often, though, the first efforts an industrial firm takes to harness its machine and process data for new reporting or advanced analysis initiatives involve simple use cases and outputs that can mask what it takes to support a mix of different needs in a scalable and supportable way. Data Ops practices provide a way of systematically addressing the steps needed to ensure that your data can be made available in the right places, at the right times, in the right formats, for all the initiatives you’re pursuing.


Industrial data (or OT data) poses particular challenges that your Data Ops strategy will address:

  • It can be generated at a pace that challenges traditional enterprise (or even cloud-layer) data collection and data management systems – to say nothing of the costs of ingestion, and of processing during reporting/analysis, that are typical of cloud platforms.
  • The data for functionally identical assets or processes is often not generated in a consistent structure and schema.
  • OT data generally does not have context established around each data point – making it difficult to understand what it represents, let alone the meaning inherent in the actual values!
  • Connecting to a mix of asset types with different automation types and communications protocols is often necessary to get a complete data set relevant to the reporting or analytics you’re pursuing.
  • A wide array of uses demands different levels of granularity of some data points and a breadth of collection points that is significantly wider than many individual stakeholders may appreciate.

These are the reasons why in many firms, the engineering team often ends up becoming the “data extract/Excel team” – their familiarity with the underlying technology means they can take snapshots and do the data cleansing necessary to make the data useful. But that’s not scalable, and data massaging is a far less impactful use of their time – they should be engaged with the broader team interpreting and acting on the data!


Data Ops – Quick Definition

There’s no one way to “do” Data Ops. In the industrial world, it’s best thought of as a process involving:

  • Determining the preferred structures and descriptions (models) for OT data, so it may serve the uses the organisation has determined will be valuable.
  • Assessing what approaches to adding such models can be adopted by your organisation.
  • Choosing the mix of tools needed to add model structures to a combination of existing and new data sources.
  • Establishing the procedure to ensure that model definitions don’t become “stale” if business needs change.
  • Establishing the procedures to ensure that new or changing data sources are brought into the model-based framework promptly.


A Rough Map is Better Than No Map.

Take a first pass at capturing all the intended uses of your OT data. What KPIs, what reports, what integration points, and what analytics are people looking for? Flesh out those user interests with an understanding of what can feed into them:

  1. Map the different stakeholders’ data needs in terms of how much they come from common sources, and how many needs represent aggregations, calculations or other manipulations of the same raw data.
  2. Flesh out the map by defining the regularity with which data needs to flow to suit the different use cases. Are some uses based on by-shift, or daily, views of some data? Are other uses based on feeding data in real-time between systems to trigger events or actions?
  3. Now consider what data could usefully be “wrapped around” raw OT data to make it easier for the meaning and context of that data to be available for all. Assess what value can come from:
    1. Common descriptive models for assets and processes – a “Form Fill & Seal Machine” with variables like “Speed” and “Web Width” (etc.) is a far easier construct for many people to work with than a database presenting a collection of rows reflecting machines’ logical addresses, with a small library of cryptically structured variables associated to each one (see the sketch after this list).
    2. An enterprise model to help understand the locations and uses of assets and processes. The ISA-95 standard offers some useful guidance in establishing such a model.
    3. Additional reference data to flesh out the descriptive and enterprise models (e.g. the make and model of common asset types with many vendors, or information about a location such as latitude or elevation). Be guided by what kind of additional data would be helpful in comparing/contrasting/investigating differences in outcomes that need to be addressed.
  4. Now assess what data pools are accumulating already – and how much context is accumulating in those pools. Can you re-use existing investments to support these new efforts, rather than creating a parallel set of solutions?
  5. Finally, inventory the OT in use where potentially useful data is generated, but not captured or stored; particularly note connectivity options.
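
Here is a minimal Python sketch of what such “wrapping” looks like in practice: a raw tag sample is joined to a descriptive model (asset class, variable, units) and an ISA-95-style equipment path, so the consumer sees what and where rather than a cryptic address. Every name and value below is invented for illustration.

```python
RAW_POINT = {"tag": "PLC7_DB2_R12", "value": 412.0, "ts": "2022-05-19T11:40:00Z"}

TAG_MODEL = {
    "PLC7_DB2_R12": {
        "asset_class": "Form Fill & Seal Machine",
        "asset": "FFS-03",
        "variable": "Speed",
        "units": "packs/min",
        "path": "Site:Leeds/Area:Packing/Line:4",  # ISA-95-style equipment path
    }
}

def contextualise(point):
    """Join a raw sample to its descriptive and enterprise model entries."""
    meta = TAG_MODEL.get(point["tag"], {})
    return {**meta, "value": point["value"], "timestamp": point["ts"]}

print(contextualise(RAW_POINT))
```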

Avoiding A Common Trap

“Data for analytics” means different things at different stages. A data scientist looking to extract new insights from OT data may need very large data sets in the data centre or cloud, where they can apply machine learning or other “big data” tools to a problem. A process optimisation team deploying a real-time analytic engine to make minute-by-minute use of the outputs of the data scientists’ work may only need small samples across a subset of data points for their part of the work. Data Ops thinking will help you ensure that both of these needs are met appropriately.


Map’s Done – Now How About Going Somewhere?

The work that comes next is really the “Ops” part of Data Ops – with the rough map of different uses of OT data at hand, and the view of whether each use needs granular data, aggregated data, calculated derivations (like KPIs), or some kind of combination, you’ll be able to quickly determine where generating desired outputs requires new data pools or streams, or where existing ones can be used. And for both, your data modelling work will guide what structures and descriptive data need to be incorporated.

At this point, you may find that some existing data pools lend themselves to having asset and descriptive models wrapped around the raw data at the data store level – i.e. centrally. It’s a capability offered in data platform solutions like GE’s Proficy Historian. This approach can make more sense than extracting data sets simply to add model data and then re-writing the results to a fresh data store. Typically, streaming/real-time sources offer more choice in how best to handle adding the model around the raw data – and there are solutions, like HighByte’s Intelligence Hub, that allow the model information to be added at the “edge” – the point where the data is captured in the first place. With the model definitions included at this point, you can set up multiple output streams – some feeding more in-the-moment views or integration points, some feeding data stores. In both cases, the model data having been imposed at the edge makes it easier for the ultimate user of the data to understand the context and the meaning of what’s in the stream.
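
The sketch below shows the edge pattern in miniature, in plain Python rather than any vendor’s API: each raw sample is enriched with its model context at the point of capture, then fanned out to both a real-time stream and a historical store. The tag, model entries and publish targets are all placeholder assumptions.

```python
TAG_MODEL = {"PLC7_DB2_R12": {"asset": "FFS-03", "variable": "Speed", "units": "packs/min"}}

def enrich(point):
    """Wrap a raw sample with its model context at the point of capture."""
    return {**TAG_MODEL.get(point["tag"], {}), "value": point["value"], "ts": point["ts"]}

def publish_realtime(record):
    print("stream:", record)   # placeholder: live views, integration points

def publish_to_store(record):
    print("store: ", record)   # placeholder: historian / data pool for analysis

def on_new_sample(point):
    record = enrich(point)     # model applied once, at the edge...
    publish_realtime(record)   # ...then fanned out to every consumer
    publish_to_store(record)

on_new_sample({"tag": "PLC7_DB2_R12", "value": 412.0, "ts": "2022-05-19T11:40:00Z"})
```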



Edge Tools vs Central

Realistically, you’re likely to need both, and the driving factor will not necessarily be technical.

Edge works better when:

  1. You have a team that deals well with spreading standardised templates.
  2. Data sources are subject to less frequent change (utility assets are a good example of this).
  3. The use cases require relatively straightforward “wrapping” of raw data with model information.

Central works well when:

  1. The skills and disciplines to manage templates across many edge data collection footprints are scarce.
  2. The mix of ultimate uses of the data is more complex – requiring more calculations, derivations or modelling of relationships between different types of data sources.
  3. Change in underlying data sources is frequent enough that some level of dedicated and/or systematised change detection and remediation is needed.


Regardless of which tools are applied, the model definitions established earlier, applied consistently, ensure that different reports, calculations and integration tools can be developed more easily, and adapted more easily as individual data sources under the models are inevitably tweaked, upgraded or replaced – as new automation or sensors come in, their unique data structures simply need to be bound to the models representing them, and the “consumers” of their outputs will continue to work. So, while tools will be needed, ultimately the most valuable part of “doing” Data Ops is the thinking that goes into deciding what needs to be wrapped around raw data for it to become the fuel for your digital journey.

1,000 miles or around the block: Start one step at a time… https://ideashub.novotek.com/1000-miles-or-around-the-block-start-one-step-at-a-time/ Wed, 16 Mar 2022

The rise of connected industrial technologies and Industry 4.0 has prompted the development and launch of countless systems with extensive capabilities and functions. This is often beneficial for businesses with a defined and set long-term strategy, but it can lead to forced early adoption and spending, where deployments and licensing outstrip a company’s capacity to change work processes and adopt new tech.


Here, Sean Robinson, software solutions manager at Novotek UK and Ireland, explains how less can be more with new plant tech deployments – and why immediate problem-solving needs to be a distinct effort within longer-term strategies.


Countless proverbs, maxims and quotes have been formed around the idea of moderation, dating back as far as – or even further than – Ancient Greek society. The notion remains important to this day for everything from diet to technology. However, engineers and plant managers frequently over-indulge in the latter and over-specify systems that offer functionality well beyond what is necessary or even practically useful.

It can initially appear that there is no harm in opting for an automation or plant IT system that has extensive functionality, because this may help to solve future problems as they arise. That being said, an investment positioned to be all-encompassing, like a full, material-receiving-through-WIP-execution-with-performance-analysis-and-enterprise-integration manufacturing execution system (MES), can sometimes present its own barriers to adoption for certain businesses, especially those in sectors that favour flexibility, such as fast-moving consumer goods (FMCG) or food manufacturing (also – interestingly – increasingly in the contract production side of consumer health and life sciences). Where core production processes and related enabling technology are well-established, it can be risky, expensive and overkill to treat the need to implement specific new capabilities as the trigger for wholesale replacement or re-working. The key is to identify where critical new functional needs can be implemented around the installed technology base in focused ways that deliver results, while leaving open the option of incrementally adding additional functionally-focused solutions in a staged way, over time.

At Novotek, our role is to help our customers choose technology that delivers on an immediate need, while opening up the potential to build incrementally in a controlled, low-risk way.

Fortunately, both the licensing models and the technical architectures of plant IT solutions are changing in ways that support this kind of approach. So, the software cost and deployment services costs of bringing on board very specific capabilities can be scaled to match the user base, and the technical and functional boundaries of a specific need. We can think of these focused deployments as “micro-apps”. A key part of this approach is that the apps aren’t built as bespoke, or as an extension of a legacy (and possibly obsolete) system. It’s a productised solution – with only the “right” parts enabled and delivered to the right stakeholders.

Consider quality in toiletry production, and specifically challenges with product loss due to variability in the quality of raw materials. It’s safe to assume that a plant will already have local control systems in place elsewhere to track the overall quality outcomes, but monitoring the raw material quality is often left to supplier-side data that may be underused – serving as a record of supplier compliance with a standard, rather than being used to proactively trigger adjustments in key process settings to avoid losses. In this scenario, an ideal micro-app could be focused on capturing raw material data, using machine learning to provide deep analysis of how existing machines can best process the material lot, and alerting supervisors and process owners to take action. Such a function might have a small number of users; it might even have integration with inventory or quality systems replacing some manual data entry. So, the software licensing, services and timelines to deliver impact can be kept small.

When we consider some of the demands manufacturers now face, on fronts ranging from qualifying new suppliers and materials, to furthering energy and water reduction, to adopting more predictive maintenance and asset management strategies, we see a lot of potential to tackle these with focused solutions that happen to borrow from the underlying depth and breadth of MES solutions.
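
To give a feel for the analytical core of such a micro-app, the Python sketch below fits a simple regression relating raw material characteristics to the process setting that historically gave the best yield, then suggests a setting for a new lot. It is only a sketch: the variables and numbers are invented, and a real deployment would use validated data and, most likely, a richer model.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Historical lots: [viscosity, moisture %] -> press temperature that gave best yield
X = np.array([[1.2, 4.1], [1.5, 3.8], [0.9, 4.6], [1.3, 4.0]])
y = np.array([118.0, 121.5, 114.0, 119.0])

model = LinearRegression().fit(X, y)

new_lot = np.array([[1.4, 3.9]])        # characteristics of the incoming lot
recommended = model.predict(new_lot)[0]
print(f"Suggested press temperature for this lot: {recommended:.1f} degC")
```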

There are unquestionably many cases where a plant-wide solution like an MES is necessary or even preferable. We and our key technology and services partners have delivered many such “complete” systems across the country. However, it should certainly not be considered the only option for agile industrial businesses. If each factory can be thought of as a collection of work processes/functions that need to be delivered, then implementing the supporting/enabling technology as a collection of micro-apps can make sense. And when balancing risk, cost and speed to value, sometimes, moderation in plant technology deployments can provide the most bountiful benefits.

Out-of-the-Box Solution Templates Offer More Than Meets The Eye https://ideashub.novotek.com/out-of-the-box-solution-templates-offer-more-than-meets-the-eye/ Fri, 21 Jan 2022

Industries such as food and beverage manufacturing and consumer packaged goods (CPG) production are fast-moving environments, with high traceability and proof-of-quality requirements alongside throughput demands. As such, automation offers a lot of benefits to operations — so changes to existing systems, or implementing new ones, can be seen as a source of risk, rather than opportunity. Here, Sam Kirby, a Solutions Engineer for Novotek UK & Ireland, looks at how manufacturers in the food, beverage and CPG sectors can reliably and rapidly extend automation deployments. 

Sam Kirby (Industrial IT & OT Automation Specialist)

The food and beverage industry has a long history with automated systems. In fact, one of the first fully automated production processes was that of an automatic flour mill, in 1785. The industry has generally kept abreast of automation developments since then, allowing it to keep ahead of ever-growing demand. Similar is true of CPG production, particularly in recent years, as product innovation has become a key business strategy.

CPG and food and beverage production tend towards automation because, in both sectors, much of the workforce is at the field level. As such, connecting systems to gain greater visibility into equipment health, output performance and product quality is invaluable. This is nothing new; engineers have been undertaking such projects for years. In particular, firms in these sectors have firmly established the benefits of connectivity and supervisory control and data acquisition (SCADA) systems. 

However, the fast-moving nature of product development, with knock-on effects on operations, means that systems are evolved in place – the goal is to undertake minimal technical work to allow for product and process changes without compromising the overall setup. There is an additional complication in that, due to the complexity of many production lines, the human-machine interfaces (HMIs) are often densely packed with information — much of which is seldom necessary for an operator’s day-to-day operations, but may be useful to maintenance and engineering staff. As small changes and additions build around the initial core, the firm can feel that the know-how captured in the system can’t be risked or lost, so even as core technology upgrades are rolled out, the applications that have been developed end up reflecting that gradual evolution in place. And that evolution may mean that the capabilities of the core product are underused, and that legacy development tools and security methods have been preserved long past their use-by date – this is explored more deeply in our article on technology strategy here.

In recent years, we’ve seen automation software providers work to address some of these challenges. Modern SCADA software can come with preset templates that are configured to reflect the common assets, processes and related key data models for specific industry applications, such as pasteurising in a dairy plant or packaging in CPG environments. Such presets can reduce the setup time for most engineers, but beyond that, the templates provided by vendors can also offer a quick start on adopting more modern development tools and best practices for structuring an application. With that in mind, such templates can provide time savings on basic building blocks for a new or refreshed system that in turn “give back” the time needed to migrate any unique code or intellectual property into the more modern platform.

Even with this leg-up on the application “plumbing”, many SCADA systems still suffer from cluttered HMIs, and the vendor-provided templates are intended to help address that as well.

“Efficient HMI” Concept – being delivered by GE Digital.

Experience serving the industrial sector has shown that in the most productive environments, SCADA systems present screens to operators that are easy to interpret. By having the operator’s work foremost in screen design, operators can be up to 40% more effective in spotting issues that require technical teams to resolve. Engineers can then respond faster to events on the production line. GE Digital has been delivering templates intended to support this “Efficient HMI” concept as part of its iFIX HMI/SCADA system. 

The templates refine HMI displays to focus on the most critical information related to executing the work. This decluttered interface improves operator effectiveness in regular operation and makes it easier to spot issues and exceptions, which means improved situational awareness and more focused reactions to such issues. The overall effect is a higher level of productivity on measures such as machine utilisation, throughput and quality.

Following this approach, iFIX also features preconfigured sample systems that are aimed at elements of the food, beverage and CPG industries. For example, Novotek can supply the iFIX software with a preset tailored for juice plants, with a display that provides an overview of processes from raw material intake to filling and packaging. Beverage production engineers can run this system immediately following installation to gain an immediate assessment of how production is performing. Even where adaptation is needed, the templates provide working examples of both the efficient look-and-feel and of the most modern approaches to the technical configuration underneath it all. So engineers and IT teams get a practical hand in furthering their technical knowledge, smoothing the adoption of new versions and related modern development tools. 

It’s not unusual for engineers to believe that preset templates might not adequately fit their unique operations, yet we’ve found that the preconfigured settings often provide a substantial benefit. Of course, there is no substitute for testing this directly, which is why Novotek and GE Digital offer CPG companies and food and beverage manufacturers a free demo of iFIX to see how effectively the templates suit them. 

Automation may not necessarily be something new for the food, beverage and CPG sectors, but its continued evolution brings with it new implementation challenges and operational complexities. Novotek values the work by GE Digital on the Efficient HMI concept and application templates as they offer useful tools to customers to help them modernise quickly and safely. And by sharing the methods and example applications, the customer’s IT and engineering groups are given a helping hand in building a bridge from the technical realities of long-established SCADA systems to a more modern solution. 

Don’t “Stay Current” – Upgrade! https://ideashub.novotek.com/dont-just-keep-it-current-upgrade/ Wed, 12 Jan 2022

As the UK’s industrial base continues to adapt to changes in business models, supply networks and trade environments, there’s an opportunity to tap into a massive hidden resource – the install base of HMI/SCADA systems deployed everywhere from power plants to food processors. Many of these systems were implemented in the 90s and 00s as part of the first wave of productivity investments – by supplying a way to visualise the critical elements of complex machines and processes, industrial firms improved the effectiveness of their front-line workers and supervisors, as well as the reliability of their operations. However, rather than being driven by a desire to gain more operational and competitive advantage, the pattern of investment over the past 20 years has been driven by a feeling of “forced need”. As we’ve aided several clients with improvements to their SCADA systems, we’ve seen two things that influence their choice to upgrade:  

1. Windows compatibility (really a matter of improving the maintainability and security of the system). 

2. The PC/server equipment hosting the SCADA system has failed, forcing the firm to take steps to restore it. 

And an unfortunate sub-theme is common – we’re often told that any upgrade “must keep the system the same as the operators are used to”. No changes. No adoption of new functions. No assessment of whether current engineering practices could lead to a better, more maintainable footprint. “Convert the application and get it running again” is the instruction of the day. Even in cases where a firm has run an assessment of different providers and switched to a new SCADA vendor, they’ve then asked to have their old application replicated, rather than taking the upgrade work as a chance to consider what new capabilities might be delivered by the more modern SCADA platform that’s replacing the one from 20 or 30 years(!) ago.  

From the operations leaders’ perspectives, the core mission – make the product; keep the assets running – is the same, and it can be hard to step back, and consider whether the automation and systems around the operation should work the way they always have. And when vendors supply a laundry list of new features or technical updates, it doesn’t necessarily give an operational, maintenance or other non-technical leader compelling information in the right terms for them to see the value in taking that pause to step back and consider a broader approach to what first looked like a “forced necessity”. 

If we had the opportunity to be face to face with every customer as they took that step back, here’s what we recommend they consider: 

Where vendors spend their SCADA product development dollars IS driven by YOUR needs (yes, some technical, but many more are user/functional focused). Just a few examples would include: 

What customers ask for → What vendors invested in

  • Better access to historical data, at the fingertips of operators and supervisors → Integration of data Historians into the product offer and into the screens deployed to users
  • Freedom from the desk! → A choice of ways to make SCADA available remotely, via the web or via device-based apps
  • A way to separate how information is delivered to production/operations people vs. technical or maintenance people → Application build-and-delivery guides (and starter kits) that supply guidance on how to serve different users from the same underlying platform
  • Ways to filter and focus the flood of data coming from machines and processes → Improved alarm and event management functions, and even incorporation of solutions that can route/escalate different events to different people for more effective response
  • Better support for continuous improvement practices such as Lean or Autonomous Maintenance → Improved interoperability with other systems such as ERP, quality or asset maintenance systems, so data (rather than paper) can flow, reducing non-productive work and making sure issues and exceptions are managed effectively in the tool best suited

Using the more modern SCADA could save you time, effort and budget that would otherwise be spent on competing/overlapping new systems.  

OK – this one goes a little deeper into some technological details but stay with us! 

As firms pursue the value that may come from adoption of big data platforms and machine learning or analytics, they often launch data acquisition projects without understanding how existing plant systems can be part of the landscape – driving additional tech into a plant where it may not be needed. 

Often, the IT and HQ operations teams don’t realise that their existing SCADA could accelerate these initiatives.  

Security – the elephant in the room 

It’s true that this is a topic that can cause people’s heads to throb before a discussion even gets started. But we’re going to skip all of the technical details and simply accept that improving cyber security postures is a priority for practical reasons, including: 

  • Yup – there are bad actors. Anyone remember the last round of WannaCry incidents? 
  • There are increasingly heavy regulatory burdens around certain industries or sectors 
  • To the above point, I hear many of you exclaiming “But my firm isn’t part of Critical National Infrastructure!”. That may be true, but we’re even seeing scenarios where brand owners or retailers are insisting that their suppliers be able to prove that they have a solid cyber security position – as that’s seen as an indicator that they’ll be less subject to disruption, one way or another… 

Again, vendors haven’t been idle. It may take some basic work on the applications you have, but if you’ve at least invested in a current version of your SCADA, you’ll be able to take advantage of what’s noted below. 

What customers need for their customers and regulators → What vendors invested in

  • Modern, patchable plant systems, kept compatible with major operating systems like Windows Server → This is pretty much table stakes – it’s covered!
  • Auditability → Better deployment of user definitions; the option to deploy e-signatures on critical functions
  • Flexible, role-based restrictions on access to functions → Making their internal user/security model the basis for screen or even function-based restrictions, so only authorised users can do certain things
  • Security that coordinates with corporate resources and policies → Integration of SCADA security/users with Active Directory

Another area where IT teams struggle is understanding what can be done at the SCADA layer to meet security goals. As a result, they may consider some unique (and often extreme!) techniques and approaches that may be more difficult and expensive than needed. 

Click here to see what our vendors now have available! iFIX Secure Deployment Guide.

Don’t just “keep it current” – Upgrade!

Given the breadth of functions, capabilities and technological upgrading that SCADA providers have implemented, it’s probably safe to argue that there’s more “in the box” that might be employed in areas that matter. And we’ve even reserved the details of the important technical things, such as development speed, part and code reusability, and so on, for another time! We believe that looking at their SCADA with fresh eyes – thinking about what the current platform CAN do, rather than what the 20-year-old application created on the legacy platform from 1992 IS doing – is the key to unlocking new operational gains. And the route to those benefits can be faster to follow, making other digital projects speedier as well. 

Combating flooding with automation https://ideashub.novotek.com/combating-flooding-with-automation/ Tue, 14 Dec 2021

Each winter, the UK is battered with a barrage of storms that lead to all manner of problems for utilities operators, from power outages to water asset damage. Due to the predictability of seasonal flooding, effective automation systems are proving an increasingly vital investment to help water operators keep assets functional. Here, George Walker, managing director of water utilities automation specialist Novotek UK and Ireland, explains how software deployments can keep water networks afloat during flooding.

The UK Met Office announces a new A–Z of storm names every September, officially beginning the new storm season for the UK. In winter 2021, the Met Office named the first two storms in as many weeks, with projections from mid-December expecting that a further six storms would hit the country leading into the new year.

Despite the predictability of storm season, the impact on utilities companies routinely causes significant problems. In the wake of Storm Arwen on November 26 2021, approximately 3,000 homes in northern England remained without power for more than a week. This reflects the challenge of harsh seasonal weather for utilities companies — a challenge that is only set to escalate as global climate change makes extreme weather events a more common occurrence.

Unsurprisingly, an excessive surplus of water can cause problems in the water network. If assets such as pumping stations become flooded due to a high volume of rainfall or overflowing surface water sources, it can cause further flooding in domiciles and office spaces. It’s for this reason that water and sewage companies are obligated under the Water Industry Act 1991 to ensure their systems are resilient and that the area they serve has effective drainage.

Yet ensuring resilience in the water network is no simple task due to the size of the network and the number of distributed assets. It’s for this reason that water operators depend upon supervisory control and data acquisition (SCADA) systems at remote sites and, increasingly, an effective data management and control platform. The local control systems are necessary to accurately monitor and control equipment, but an effective overarching system makes it possible to remotely address issues as they arise.

For example, Novotek routinely works with water companies across the country to help them establish more effective automation setups that facilitate remote decision-making in a streamlined, efficient manner. One of the challenges that arises frequently is that of data silos, where field engineers may have access to pertinent equipment health or performance data that is valuable but inaccessible to other teams. Fortunately, this is best — and most easily — addressed with an overarching system that collects data once and presents different views to different stakeholders.

Not every system will be well positioned to provide flexible data views to users and be capable of ensuring effective response to floods. Ideally, an industrial automation platform should also feature effective data visualisation, as well as predictive analytics that can use locally collected data to anticipate the likelihood of asset damage or outage. These attributes allow operators to easily coordinate an effective and rapid response to seasonal flooding as it occurs, at the most vulnerable or at-risk parts of a network before further problems ensue.
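
As a simple illustration of the predictive idea, the Python sketch below combines a rainfall forecast with a pump station’s recent sump-level trend to produce a coarse risk flag. The thresholds, field names and logic are all invented for illustration; a real system would draw on live telemetry and far richer models.

```python
def flood_risk(rain_forecast_mm, sump_levels_m, rain_limit=40.0, rise_limit=0.05):
    """Crude risk flag from forecast rainfall plus the recent sump-level trend."""
    # average rise per sample over the recent window
    rise = (sump_levels_m[-1] - sump_levels_m[0]) / max(len(sump_levels_m) - 1, 1)
    if rain_forecast_mm > rain_limit and rise > rise_limit:
        return "HIGH"       # rising sump and heavy rain forecast: act now
    if rain_forecast_mm > rain_limit or rise > rise_limit:
        return "ELEVATED"
    return "NORMAL"

print(flood_risk(55.0, [1.10, 1.18, 1.27, 1.41]))   # -> HIGH
```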

As winter storms continue to become more frequent and impactful, water operators must be increasingly prepared to combat the effects and maintain uptime of network assets. Automation has long been a necessity due to the scale of operations, but the effectiveness of automation deployments has never been so important.

A recipe for lasting success https://ideashub.novotek.com/a-recipe-for-lasting-success/ Wed, 01 Sep 2021

Few businesses routinely challenge every part of their organisation like food manufacturers. New technologies and digital transformation can help food manufacturers manage the constant change, but the traditional approach of comprehensive digitalisation planning is often not flexible enough to ensure success. Here, Sean Robinson, software solutions manager at food automation expert Novotek UK and Ireland, explains why the key ingredient for success in flexible food manufacturing is micro-applications.

Food production is truly a sector that operates under the mantra of “reinvent the everyday, every day”. The sector is constantly evolving, whether manufacturers are innovating new product ranges that meet changing consumer tastes or switching packaging materials to extend shelf-life or reduce waste. And these are just examples of substantial shifts; food manufacturers are also regularly making smaller changes by refining recipes, adapting processes or adjusting ingredient and material supply lines.

Despite — or perhaps because of — the environment of constant change, food processors can benefit more than many other manufacturers from carefully targeted use of data collection, visualisation and analysis solutions. After all, yesterday’s optimisation isn’t particularly optimal if today means a new stock-keeping unit (SKU), a new critical ingredient supplier or a new recipe.

The approach that many businesses take to becoming data-driven is to extensively map out their digitalisation journey, with each aspect comprehensively planned. This doesn’t generally support the flexibility needed in food manufacturing.

Rather than taking this approach, modern solutions make it possible to build or buy micro-applications that share common data infrastructure and even app-building or visualisation tools. This means that impactful new capabilities can be adopted through fast initial works that create re-usable building blocks. Later works then become incremental, rather than potentially having different systems creating overlapping capabilities.

Micro-apps in practice

We can see how this micro-app approach can be put into action by considering one of the most common challenges in food processing: managing the effect of variability in key ingredients, so that yields are maximised with minimal re-work or ingredient waste. It’s likely that a manufacturer would already have some of the information needed to address the challenge. The question is, how can you quickly supplement what’s in place?

It’s a safe bet that the factory has automation and maybe supervisory control and data acquisition (SCADA) systems, so there is an abundance of machine-generated data to tell us about the details of how processes are performing. Focussing more closely on yield performance, we can assume our manufacturer has a lab system where in-process and finished good tests give very clear indicators of how well a product is being made.

From Novotek’s experience, the most common gaps in tackling yield issues come from two areas. The first is supplier quality data, which is often provided either written down or in an electronic format that doesn’t mesh with existing systems. This makes analysis more difficult, because there’s no actual database to work from.

The second area is that the variations in raw materials that affect yields may actually be within the specifications defined for those materials. As such, there may not be an obvious fix. It’s likelier that material data needs to be analysed alongside several process performance and quality performance data points. Understanding the relationships between more than two or three variables will probably mean adding a new kind of analysis tool.

Micro-apps can be highly focussed on the core capabilities required. In this case, the micro-app would provide three core functions. First, it would provide a simple means to capture ingredient quality data as it’s received, into a system that also holds the specific material characteristic specifications and limits – all on a “by-lot” basis. It would also offer a machine learning tool that can help clarify how the range of material quality variation can be managed in relation to what machine settings or recipe adjustments might allow for good final yield and quality results.

Finally, the micro-app would be able to alert production staff to make recommended changes to a recipe or process as different raw material lots are staged for use – an automated monitor of yield/quality risk from material variation. This could be as simple as a new smart alarm sent back to existing SCADA, or a notification on a smartphone.
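
Below is a minimal Python sketch of that capture-and-alert loop: a received lot’s measurements are checked against held specifications, flagging anything out of spec, or close enough to a limit that a recipe adjustment may be worthwhile. The material names, specs and guard-band rule are assumptions for illustration.

```python
SPEC = {"flour": {"moisture_pct": (11.0, 14.0), "protein_pct": (9.5, 12.0)}}

def check_lot(material, lot_id, measured, guard_band=0.1):
    """Return alerts for out-of-spec values and in-spec values near a limit."""
    notes = []
    for prop, (low, high) in SPEC[material].items():
        value, margin = measured[prop], guard_band * (high - low)
        if not (low <= value <= high):
            notes.append(f"{lot_id}: {prop}={value} OUT OF SPEC [{low}-{high}]")
        elif value < low + margin or value > high - margin:
            notes.append(f"{lot_id}: {prop}={value} near limit - consider recipe adjustment")
    return notes

for msg in check_lot("flour", "LOT-2309", {"moisture_pct": 13.8, "protein_pct": 10.4}):
    print(msg)   # could instead raise a SCADA alarm or push a phone notification
```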

Industrial software vendors are adapting their offers, in recognition of the trend towards micro-apps aimed at specific business processes. So, the software licensing needed to enable material data collection and quality specification monitoring on a key process would be built around a low user count and narrow set of underlying configuration and integration points, rather than a comprehensive plant-wide project. That can mean starting investments in the low thousands for software and some deployment work.

Some of Novotek’s customers are now progressing through projects defined by such very specific functional needs. Our job at Novotek is to ensure that any new solutions serve the purpose of being able to act as supplements to other such micro-apps in the future.

Next stages

A strategic advantage of micro-apps is that the planning and execution stages are less time-intensive than a far-reaching, plant-wide digitalisation project. Food engineers can do several things to begin reinventing their everyday processes. For example, food manufacturers can deploy predictive downtime applications on key processes. These are apps that can even take into consideration whether the products made have their own impact on failure modes.

Each micro-app reflects an opportunity to make the overall food manufacturing operation more adaptable. This means that innovation in products, processes and business models can be done, all the while knowing that refining and optimising the “new” won’t be held up by tools and practices that are too difficult to adapt from the “old”.

Free whitepaper: Enhancing data management in utilities https://ideashub.novotek.com/free-whitepaper-enhancing-data-management-in-utilities/ Fri, 20 Aug 2021

Innovation has been one of the biggest focuses for utilities operators in recent years, particularly in the water market due to pressures from regulatory bodies. However, innovation is a broad term that offers no indication of the best and most impactful changes to implement.

The best approach may be to let the data dictate where to focus your innovation efforts. Or, if there’s a lack of useful data, then that itself may be the answer.

In this whitepaper, Novotek UK and Ireland explains how utilities operators can get to grips with data management to create an effective data-driven approach to innovation. Covering how to consolidate and modernise assets for data collection, how to make sense of utilities data and which method to use to get the most long-term value from data, the whitepaper is an invaluable resource for utilities operations managers and engineers.

Complete the form below to receive a copy of the whitepaper.
