Amie Shannon – Novotek Ideas Hub https://ideashub.novotek.com Ideas Hub Tue, 20 Dec 2022 09:28:49 +0000 en-US hourly 1 https://wordpress.org/?v=5.7.11 https://ideashub.novotek.com/wp-content/uploads/2021/03/Novotek-logo-thumb-150x150.png Amie Shannon – Novotek Ideas Hub https://ideashub.novotek.com 32 32 How Novotek is helping the energy industry meet its sustainability development goals https://ideashub.novotek.com/how-novotek-is-helping-the-energy-industry-meet-its-sustainability-development-goals/ Tue, 20 Dec 2022 09:28:46 +0000 https://ideashub.novotek.com/?p=3327 This is the second of two articles where we examine the key industries of water and energy in the context of the Sustainable Development Goals developed by the United Nations. In the first article, we looked at the water industry, with a particular focus on how it could improve water quality, minimise leakages, and reduce untreated wastewater entering the environment. In this article, we look at how energy companies can ensure the lights stay on and the consumer benefits from clean and affordable energy.

Affordable and clean energy

Energy is the dominant contributor to climate change, accounting for around 60 per cent of total global greenhouse gas emissions. More recently, with the war in Ukraine, it has also become a national security issue. So, as well as substantially increasing the share of renewable energy in the global energy mix and doubling the global rate of improvement in energy efficiency by 2030, energy providers also need to ensure universal access to affordable, reliable, and modern energy services.

Achieving this is a huge task, but as with the water industry, it starts with data. With data you can fuel the advanced technology needed to monitor, predict and reduce emissions. So, for instance, using advanced analytics and artificial intelligence (AI), energy companies can set and achieve emission targets. And with greater visibility and understanding, it is easier to identify cost savings in the network. This is set against a background of an increasing number of distributed energy sources, companies, digital technologies, and solutions in use, coupled with the fact that this technology is also beginning to mainstream the bi-directional flow of power back into the grid from consumer level, via domestic battery storage, generation such as solar, and vehicle-to-grid technology.

There is no doubt that energy systems are becoming more complex and multi-factorial. The need to integrate data, so the system can be managed as a whole, therefore becomes ever more vital. This is essential to provide greater insight and thereby achieve net zero.

Gathering data allows companies and policy makers to visualise and monitor the totality of the transition to a decarbonised energy system in real time. With a clearer view, they can determine the direction in which the transition is moving, and the effectiveness of implemented policy. The more visibility of data, the more businesses and policy makers can move away from reacting to events and toward a proactive approach built around data-driven predictions.

In summary, managing the change from fossil fuel to decarbonised power generation requires the necessary data so that a balance between the environment and business economics can be achieved. To see how this can work in practice, download this useful whitepaper.

To learn more about how Novotek can help the energy industry achieve its sustainability development goals, get in touch with us here

]]>
How Novotek is helping to transform the water industry by promoting sustainable development https://ideashub.novotek.com/how-novotek-is-helping-to-transform-the-water-industry-be-promoting-sustainable-development/ Thu, 15 Dec 2022 09:05:26 +0000 https://ideashub.novotek.com/?p=3329 The Sustainable Development Goals are a call for action by all countries – poor, rich, and middle-income – to promote prosperity while protecting the planet. These goals, developed by the United Nations Sustainable Development Group, are vital for a recovery that leads to greener, more inclusive economies, and stronger, more resilient societies. In the first of a series of two articles, we focus on the clean water and sanitation goal and show how our solutions are making a difference within the water industry.

The clean water and sanitation goal

There are six sub-goals to this, but the one that concerns us most in the developed world is:

6.3 By 2030, improve water quality by reducing pollution, eliminating dumping and minimising release of hazardous chemicals and materials, halving the proportion of untreated wastewater, and substantially increasing recycling and safe reuse globally.

In the UK, it is illegal for water companies to dump untreated sewage from discharge pipes without a permit. Yet data released by the Environment Agency revealed that water companies discharged raw sewage into English rivers 372,533 times last year. This is often due to very heavy rainfall, blockages, and unexpected equipment failures. Key to these issues is having the data available to alert you to potential problems before they occur. These issues must be set in the context of the pressures coming from PR19, such as supply shortages and pricing, which is why there is now an additional focus on using data to address the sector’s environmental impact.

At Novotek, we are using software from GE Digital to help Water/Wastewater companies improve their operations and practices using the latest HMI/SCADA software applications. You can explore real customer examples here, but in summary these are:

  • Using data analytics to detect and predict leaks within the network before they occur, allowing preventative maintenance to take place, which in turn reduces wastage.
  • Using smart monitoring of sewerage controls to prevent flooding and bursting, and combining this with meteorological data to predict periods of heavy rainfall.
  • Preventing wastage of fresh water by monitoring reservoir fill and usage to avoid overflow, as well as monitoring clean water pipes to avoid losing freshly treated water.
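As a rough illustration of the first bullet (all figures, window sizes and thresholds here are hypothetical, not Novotek's or GE Digital's actual analytics), a developing leak often shows up as a flow reading that breaks sharply away from the recent baseline, for example in overnight district flow data:

```python
from statistics import mean, stdev

def flag_anomalies(flow_readings, window=8, threshold=3.0):
    """Flag flow readings that deviate sharply from the recent baseline.

    A reading more than `threshold` standard deviations above the
    rolling mean of the preceding `window` readings is treated as a
    possible leak or burst event worth investigating.
    """
    alerts = []
    for i in range(window, len(flow_readings)):
        baseline = flow_readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and (flow_readings[i] - mu) / sigma > threshold:
            alerts.append(i)
    return alerts

# Overnight flow in a district metered area: a sudden sustained rise
# with no matching demand is a classic leak signature.
night_flow = [5.1, 5.0, 5.2, 4.9, 5.1, 5.0, 5.2, 5.1, 9.8]
print(flag_anomalies(night_flow))  # indices of suspect readings
```

Production systems would of course use richer models (pressure, demand patterns, seasonality), but the principle of comparing live data against an expected baseline is the same.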

In practice, many water companies lack the right data systems and processes to extract meaningful predictions. There is also a lack of experience in the complex area of data analytics. Having a partner such as Novotek can help water companies overcome these hurdles and make effective use of their data.

Usually there are a number of key steps we go through with customers. These are:

  • Conducting an audit to determine what data is currently available, where it resides, and where the gaps are against the objectives the customer is trying to achieve.
  • Assessing who needs what data, and what they should do with it.
  • Ensuring these data systems are connected and able to communicate with each other. This might involve rearchitecting, upgrading SCADA systems, new hardware, and the use of middleware.
  • Finally, presenting the data in a way that transforms it into tangible predictions or recommendations.

To find out how Novotek are helping water companies achieve their sustainability and resilience goals, get in touch with us here

]]>
The Circular Economy vs. Industrial Automation. https://ideashub.novotek.com/the-circular-economy-vs-industrial-automation/ Fri, 09 Dec 2022 11:23:01 +0000 https://ideashub.novotek.com/?p=3318 The Circular Economy is quickly permeating the way consumers and end users think about manufacturing. Beyond end-of-life solutions, the circular economy at its heart is demonstrating the power of tapping into waste resources to create value, and new ‘product-as-a-service’ business models are highlighting the power of new consumer demands.

Front-running companies are already pursuing circular strategies and successfully developing new, circular markets. This includes start-ups such as ACTronics, which remanufactures automotive electronic equipment, and CRS Holland, which recovers and recycles marine cable. It is becoming increasingly apparent that to remain competitive in the global market and create a future-proof business, circular economy business strategies must be adopted.

Why is the circular economy important to industrial businesses?

Although it is a simple concept at heart, truly adopting a circular economy would be difficult to achieve overnight. It requires a change of mindset towards how we can be more sustainable, and this needs to be present at every level within a business. Broadly speaking, those who adopt the circular economy will design products and services in such a way that:

  • The value added in manufactured products is maintained through maintenance, reuse, and remanufacturing.
  • Where value can no longer be retained in the above way, products and packaging are recycled.
  • Energy inputs are sourced from renewable sources.
  • Resource use is consistent and responsible, making the most of natural resources.  

What key environmental issues is the circular economy tackling today? 

Through adopting the principles of the circular economy, businesses can reduce their reliance on using and disposing of the world’s natural resources. Materials “looping around” may carry lower cost structures than those extracted and processed from source – and that can mean that cost incentives line up alongside environmental goals. That, in turn, can move industrial firms that have traditionally been focused on throughput and uptime to a more balanced view, where the plan reflects the mix of targets. One of the primary environmental issues that the circular economy is tackling today is recyclability.

Recyclability is the process of retaining the highest value of products at the end of their life, enabling their recycling into high-end applications. It is also what you might call an ‘end-of-pipe’ solution, while a circular economy’s ‘upstream’ solutions address potential problems right at the source.

“In a properly built circular economy, one should rather focus on avoiding the recycling stage at all costs. It may sound straightforward, but preventing waste from being created in the first place is the only realistic strategy” – World Economic Forum.

It is common for manufacturers to partner with end-of-life resource management companies during the design phase to integrate the appropriate features to facilitate end-of-life handling.  

What Novotek can bring to the table.

Novotek are an advanced industrial IT and automation solutions company who provide world-class hardware and software to a range of manufacturing, process, and production sectors. Within their wide portfolio they offer products that fulfil manufacturers’ needs for greater visibility through a broader circular loop. One product that stands out is the Proficy Plant Applications software, which interacts with inventory and supply chains as well as ERP, so that the conditions under which something is processed in a factory are easier to share, as is the provenance of the materials used.

Furthermore, Proficy Plant Applications can track and trace detailed data about the processes, conditions, and quality of materials as they move through the full loop.

Real-life Example: 

For instance, if a steel manufacturer is running a manufacturing execution system (MES) like Proficy Plant Applications, it provides rock-solid traceability in terms of both the materials that go into the process and the quality-related process data recorded against a specific run. Combined with final product quality data, the steel manufacturer can easily understand (and prove) which process conditions allow them to make the best use of recycled or repurposed materials. This is data they can share with their customers, too.
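To illustrate the shape of such traceability data (a toy structure with invented run and lot identifiers, not Proficy Plant Applications' actual schema), each production run can link its input materials, logged process conditions, and final quality results in one genealogy record:

```python
# A toy traceability record keyed by production run: which material
# lots went in, and the process conditions logged against the run.
runs = {
    "HEAT-2041": {
        "materials": [
            {"lot": "SCRAP-88", "recycled": True},
            {"lot": "ORE-12", "recycled": False},
        ],
        "conditions": {"furnace_temp_c": 1650, "tap_time_min": 42},
        "quality": {"tensile_mpa": 510, "passed": True},
    },
}

def recycled_share(run_id):
    """Fraction of material lots in a run that were recycled inputs."""
    mats = runs[run_id]["materials"]
    return sum(m["recycled"] for m in mats) / len(mats)

def genealogy(run_id):
    """Everything a customer audit would ask for, in one lookup."""
    run = runs[run_id]
    return {"materials": [m["lot"] for m in run["materials"]],
            "conditions": run["conditions"],
            "quality": run["quality"]}

print(recycled_share("HEAT-2041"))  # 0.5
```

With records like this, proving which conditions worked best with recycled inputs becomes a query rather than a paper chase.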

Core sustainability elements are supported by system capabilities like MES for track and trace and quality management, alongside Proficy Historian for gathering and sharing very detailed process data. And when it comes to the variability of a product or material that goes through multiple lifetimes, an analytics tool like Proficy CSense can help determine the best way to adjust processes and recipes.

Conclusion.

Although in its relative infancy, regulation in support of the circular economy is coming. As global political momentum gathers around climate change and related challenges such as marine plastic pollution and long-term waste disposal, politicians, stakeholders, and businesses will need to adapt. Those who have already taken steps in this direction will benefit the most. There are already companies that are shaping themselves around capitalising on the circular economy. There is still time for less sustainability-minded companies to take the necessary steps to adapt their business models and position themselves within the circular economy.

Novotek primarily sees its role as enabling industrial businesses to become more efficient. It is an inevitable consequence of this enablement that waste is reduced and fewer raw materials are consumed. But the circular economy is more than that; it is also about recycling and reusing to move towards a closed-loop system.

]]>
What SCADA Evolution Means for Developers https://ideashub.novotek.com/what-scada-evolution-means-for-developers/ Fri, 28 Oct 2022 13:58:37 +0000 https://ideashub.novotek.com/?p=3296

If you’ve walked through factories and seen operator or supervisor screens like the one below, you’re actually seeing both the best and worst aspects of technology evolution! Clearly, no data is left hidden within the machine or process, but the screen design looks to have been driven by the ability to visualise what’s available from the underlying controls, rather than a more nuanced view of how to support different people in their work. You could say that the adoption of modern design approaches to building a “good” HMI or SCADA application has lagged behind what the underlying tools can support.

One place to configure & manage for SCADA, Historian, Visualisation

In Proficy iFIX, GE Digital has incorporated a mix of development acceleration and design philosophies that can lead to more effective user experiences with a deployed system, while also making the overall cost of building, maintaining, and adapting a SCADA system lower.

Three critical elements stand out:

1. Model-centric design

This brings object-oriented development principles to SCADA and related applications. With a “home” for standard definitions of common assets, and their related descriptive and attribute data, OT teams can create reusable application components that are quick to deploy for each physical instance of a type. The model also provides useful application foundations, so things like animations, alarm filters and so on can be defined as appropriate for a class or type – and therefore easily rolled out into the screens where instances of each type are present. And with developments in the GE suite making the model infrastructure available to Historian, analytics and MES solutions, work done once can defray the cost and effort needed in related programs.
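A minimal sketch of the idea (in Python rather than iFIX's own tooling, with invented asset names and limits) shows how one class-level definition serves every physical instance, so a new pump inherits its alarm limits and animation behaviour for free:

```python
from dataclasses import dataclass, field

@dataclass
class PumpType:
    """Standard definition for the 'Pump' asset class: written once,
    reused by every instance (alarm limits, animation hints, etc.)."""
    alarm_high_pressure: float = 8.0   # bar, applies to every pump
    animation: str = "spin_when_running"

@dataclass
class PumpInstance:
    """One physical pump, bound to its class definition."""
    name: str
    type_def: PumpType = field(default_factory=PumpType)
    pressure: float = 0.0

    def in_alarm(self) -> bool:
        # Alarm logic lives on the type, not copied per screen.
        return self.pressure > self.type_def.alarm_high_pressure

# Deploying a new physical pump is one line of configuration,
# not a per-screen rebuild of alarms and animations.
pumps = [PumpInstance("P-101", pressure=4.2),
         PumpInstance("P-102", pressure=9.1)]
print([p.name for p in pumps if p.in_alarm()])
```

Changing the class definition (say, tightening the alarm limit) then flows to every instance, which is the maintenance saving the model-centric approach is after.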

2. Centralised, web-based administration and development

In combination with the modelling capability, this offers a big gain in productivity for teams managing multiple instances of SCADA. With common object definitions and standard screen templates, the speed at which new capabilities or changes to existing footprints can be built, tested, and rolled out means a huge recovery of time for skilled personnel.

3. The subtle side of web-based clients

Many older applications have large bases of custom scripting – in many cases to enable interaction with data sources outside the SCADA, drive non-standard animations, or enable conditional logic. With the shift to web-based client technology, the mechanics for such functions are moving to more configurable object behaviours, and to server-side functions for data integrations. This means simpler, more maintainable, and less error-prone deployments.

Taking advantage of what current-generation iFIX offers will mean a different development approach – considering a useful asset and object model structure, then templating the way objects should be deployed, is a new starting point for many. But with that groundwork laid, the speed to a final solution is in many (most!) cases faster than older methodologies – and that’s before considering the advantage of reusability across asset types, or across multiple servers for different lines or sites.

Recovered time buys room for other changes

With rich automation data mapped to the model, and faster methods to build and roll out screens, different users can have their views tailored to suit their regular work. Our earlier screen example reflected a common belief that screen design is time-consuming, so it is best to put as much data as possible in one place so that operators, technicians, maintenance and even improvement teams can all get what they need without excessive development effort. But that can mean a confused mashup of items that get in the way of managing the core process, and in turn actually hamper investigations when things go wrong.

But where development time is less of a constraint, more streamlined views can be deployed to support core work processes, with increasing levels of detail exposed on other screens for more technical investigation or troubleshooting. Even without fully adopting GE Digital’s Efficient HMI design guidelines, firms can expect faster and more effective responses from operators and supervisors who don’t have to sift through complex, overloaded views simply to maintain steady-state operations.

With significant gains to be had in terms of operator responsiveness, and effective management of expectations, the user experience itself can merit as much consideration as the under-the-bonnet changes that benefit developers.

Greenfield vs. Brownfield

It may seem like adopting a model-based approach, and taking first steps with the new development environments, would be easier on a fresh new project, whereas an upgrade scenario should be addressed by “simply” porting forward old screens, the database, and so on. But when you consider all that can be involved in that forward migration, the mix of things that need “just a few tweaks” can mean as much – or more – work than a fresh build of the system, where the old system serves as a point of reference for design and user requirements.

The process database is usually the easiest part of the configuration to migrate forward. Even if changing from legacy drivers to IGS or Kepware, these migrations tend to be pretty quick. Most of the tradeoffs of time and budget for an overall better solution relate to screen (and related scripting) upgrades. From the many (many!) upgrades we’ve observed our customers make, we see common areas where a “modernisation” rather than a migration can actually be more cost-effective, as well as leaving users with a more satisfying solution.

Questions to consider include:

While there is often concern about whether modernisation can be “too much” change, it’s equally true that operators genuinely want to support their companies in getting better. So if what they see at the end of an investment looks and feels the same as it always has, the chance to enable improvements may have been lost – and with it a chance to engage and energise employees who want to be part of making things better.

Old vs. New

iFIX 2023 and the broader Proficy suite incorporate more modern tools, which in turn offer choices about methods and approaches. Beyond the technical enablement, engineering and IT teams may find that exploring these ideas offers benefits in areas ranging from modernising systems to avoid obsolescence risk, to making tangible progress on IoT and broader digital initiatives.

]]>
https://ideashub.novotek.com/3290-2/ Mon, 24 Oct 2022 09:46:04 +0000 https://ideashub.novotek.com/?p=3290 One of the advantages of managing technology assets is that you can do things with them beyond “just running them”, such as keeping track of them and repairing them! Optimising a production process for efficiency or utility usage is often a matter of enhancing the code in a control program, SCADA, or related system, so the tech assets themselves can be the foundation for ongoing gains. And similarly, as customer or regulatory requirements for proof of security or insight into production processes increase, the tech assets again become the vehicle to satisfy new demands, rather than re-engineering the underlying mechanical or process equipment.

It’s this very adaptability that makes version control around the configurations and programs valuable. As configurations and programs change, being sure that the correct versions are running is key to sustaining the improvements built into those latest releases. With that in mind, a good technology asset management program, such as octoplant, will have version control as a central concern.
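The core mechanic can be sketched in a few lines. This is a hypothetical fingerprint comparison between vault copies and running programs, not octoplant's actual implementation, and the device names and program text are invented:

```python
import hashlib

def fingerprint(program_text: str) -> str:
    """Stable fingerprint of a program or configuration file."""
    return hashlib.sha256(program_text.encode()).hexdigest()[:12]

# Master copies held in the version-control vault vs. what is
# actually running on each device (both hypothetical).
vault = {"PLC-07": fingerprint("MAIN: MOV D0 K100")}
running = {"PLC-07": fingerprint("MAIN: MOV D0 K150")}

# Any device whose running code no longer matches its master copy
# is flagged for review: an uncontrolled change, or a stale rollout.
drifted = [dev for dev in vault if vault[dev] != running.get(dev)]
print(drifted)
```

Automated backup-and-compare regimens in commercial tools do essentially this on a schedule, across every asset in the model, with an audit trail around each difference found.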

Whether deploying solutions in this area for the first time, or refreshing an established set of practices, it’s worthwhile to step back and evaluate what you want version control to do for you – operationally, compliance-wise and so on. From that, the capabilities needed from any tools deployed will become clearer. With that in mind, we’ve noted some of the key areas to consider, and the decisions that can come from them. We hope this helps you set the stage for a successful project!

Decide How to Deeply Embed Version Control

We take VPNs, remote access and web applications for granted in a lot of ways – but this combination of technology means that it’s easier than ever to incorporate external development and engineering teams into your asset management and version control schemes. Evaluate whether it makes sense to set up external parties as users of your systems, or if it makes more sense to have your personnel manage the release and return of program and configuration files. The former approach can be most efficient in terms of project work, but it may mean some coordination with IT, to ensure access is granted securely. Either way, setting your version control system to reflect when a program is under development by others can ensure you have a smooth process for reincorporating their work back into your operation.

Be Flexible About the Scope of What Should be Version-Controlled.

Program source code and configurations are the default focus of solutions like octoplant. Yet we see many firms deploying version control around supporting technical documentation, diagrams, and even SOP (Standard Operating Procedure) documents relating to how things like code troubleshooting and changes should be done.

Define Your Storage and Navigation Philosophy. 

In many cases, this can be a very easy decision – set up a model (and associated file storage structure) that reflects your enterprise’s physical reality, as illustrated below. This works especially well when deploying automated backup and compare-to-master regimens, as each individual asset is reflected in the model.

However, some types of business may find alternatives useful. If you have many instances of an asset where the code base is genuinely identical between assets, changes are rolled out en masse, and automated backup and compare is not to be deployed, it can make sense to use a category-based or asset-type-specific model and storage scheme.

It may be that a blended approach makes sense – where non-critical assets and programs vary both in their automation and therefore in their program structure, an enterprise model can work well. But in some industries (food, pharma, CPG), it can be common to maintain identical core asset types and associated automation and process control, so having some category- or type-based managed versions can be useful, too.

Reporting and Dashboards – Version Control Data is Not Just for Developers and Engineers.

A robust solution will track actions taken by different users in relation to different assets’ code bases, and in relation to any automated comparisons. This means you can have a rich audit trail that can certainly be used to ensure disciplines are being followed, but it also means that you can easily support any regulatory or customer requirements for data. And with a model of your operation reflecting the different makes, models, variants and generations of tech assets, you’ll have a tech inventory at your fingertips that can make reinvestment and replacement planning much more efficient. So, make sure your plan to share dashboards and reports reflects the different people in your organisation who could use their own view of the tech assets, and the programs running in them.

If you’d like to learn more about the work we do with our customers on technology asset management, you can get in touch here; or ring us on +44 113 531 2400

]]>
DataOps: The Fuel Injectors For Your Transformation Engine? https://ideashub.novotek.com/dataops-the-fuel-injectors-your-transformation-engine/ Thu, 19 May 2022 11:43:48 +0000 https://ideashub.novotek.com/?p=3060

Data – everyone agrees it’s the fuel for the fires of innovation and optimisation. The industrial world is blessed with an abundance of rich, objective (being machine-generated) data, so should be well-equipped to seek new advantages from it. Too often, the first efforts an industrial firm takes to harness its machine and process data for new reporting or advanced analysis initiatives involve simple use cases and outputs that can mask what it takes to support a mix of different needs in a scalable and supportable way. Data Ops practices provide a way of systemically addressing the steps needed to ensure that your data can be made available in the right places, at the right times, in the right formats for all the initiatives you’re pursuing.


Industrial data (or OT data) poses particular challenges that your Data Ops strategy will address:

  • It can be generated at a pace that challenges traditional enterprise (or even cloud-layer) data collection and data management systems (to say nothing of the ingestion and processing costs typical of cloud platforms during reporting and analysis).
  • The data for functionally identical assets or processes is often not generated in a consistent structure and schema.
  • OT data generally does not have context established around each data point – making it difficult to understand what it represents, let alone the meaning inherent in the actual values!
  • Connecting to a mix of asset types with different automation types and communications protocols is often necessary to get a complete data set relevant to the reporting or analytics you’re pursuing.
  • A wide array of uses demands different levels of granularity of some data points and a breadth of collection points that is significantly wider than many individual stakeholders may appreciate.

These are the reasons why in many firms, the engineering team often ends up becoming the “data extract/Excel team” – their familiarity with the underlying technology means they can take snapshots and do the data cleansing necessary to make the data useful. But that’s not scalable, and data massaging is a far less impactful use of their time – they should be engaged with the broader team interpreting and acting on the data!


Data Ops – Quick Definition

There’s no one way to “do” Data Ops. In the industrial world, it’s best thought of as a process involving:

  • Determining the preferred structures and descriptions (models) for OT data, so it may serve the uses the organisation has determined will be valuable.
  • Assessing what approaches to adding such models can be adopted by your organisation.
  • Choosing the mix of tools needed to add model structures to a combination of existing and new data sources.
  • Establishing the procedures to ensure that model definitions don’t become “stale” if business needs change.
  • Establishing the procedures to ensure that new or changing data sources are brought into the model-based framework promptly.


A Rough Map is Better Than No Map.

Take a first pass at capturing all the intended uses of your OT data. What KPIs, what reports, what integration points, and what analytics are people looking for? Flesh out those user interests with an understanding of what can feed into them:

  1. Map the different stakeholder’s data needs in terms of how much they come from common sources, and how many needs represent aggregations, calculations or other manipulations of the same raw data.
  2. Flesh out the map by defining the regularity with which data needs to flow to suit the different use cases. Are some uses based on by-shift, or daily, views of some data? Are other uses based on feeding data in real-time between systems to trigger events or actions?
  3. Now consider what data could usefully be “wrapped around” raw OT data to make it easier for the meaning and context of that data to be available for all. Assess what value can come from:
    1. Common descriptive models for assets and processes – a “Form Fill & Seal Machine” with variables like “Speed” and “Web Width” (etc.) is a far easier construct for many people to work with than a database presenting a collection of rows reflecting machines’ logical addresses, with a small library of cryptically structured variables associated with each one.
    2. An enterprise model to help understand the locations and uses of assets and processes. The ISA-95 standard offers some useful guidance in establishing such a model.
    3. Additional reference data to flesh out the descriptive and enterprise models (e.g. things like the make and model of common asset types with many vendors, or information about a location, such as latitude or elevation). Be guided by what kind of additional data would be helpful in comparing, contrasting and investigating differences in outcomes that need to be addressed.
  4. Now assess what data pools are accumulating already – and how much context is accumulating in those pools. Can you re-use existing investments to support these new efforts, rather than creating a parallel set of solutions?
  5. Finally, inventory the OT in use where potentially useful data is generated, but not captured or stored; particularly note connectivity options.
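To make the "wrapping" in step 3 concrete, here is a toy sketch (addresses, asset names and units are all invented) of a descriptive model turning cryptic raw OT points into named, contextualised values:

```python
# Raw OT data often arrives as cryptic controller addresses and bare
# values -- hard to interpret without tribal knowledge.
raw = {"PLC3.DB20.W4": 412, "PLC3.DB20.W6": 310}

# A simple descriptive model maps each address to a named variable on
# a named asset, with units: the 'wrapper' that gives values meaning.
model = {
    "PLC3.DB20.W4": {"asset": "FormFillSeal-1", "variable": "Speed",
                     "unit": "packs/min"},
    "PLC3.DB20.W6": {"asset": "FormFillSeal-1", "variable": "WebWidth",
                     "unit": "mm"},
}

def contextualise(raw_points, model):
    """Merge each raw value with its model context, skipping unmapped points."""
    return [dict(model[addr], value=val)
            for addr, val in raw_points.items() if addr in model]

for point in contextualise(raw, model):
    print(point["asset"], point["variable"], point["value"], point["unit"])
```

Once every consumer reads "FormFillSeal-1 / Speed" instead of "PLC3.DB20.W4", reports and analytics survive controller changes: only the mapping table needs updating.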

Avoiding A Common Trap

“Data for Analytics” means different things at different stages. A data scientist looking to extract new insights from OT data may need very large data sets in the data centre or cloud, where they can apply machine learning or other “big data” tools to a problem. A process optimisation team deploying a real-time analytic engine to make minute-by-minute use of the outputs of the data scientists’ work may only need small samples across a subset of data points for their part of the work. Data Ops thinking will help you ensure that both of these needs are met appropriately.


Map’s Done – Now How About Going Somewhere?

The work that comes next is really the “Ops” part of Data Ops – with the rough map of different uses of OT data at hand, and the view of whether each use needs granular data, aggregated data, calculated derivations (like KPIs), or some kind of combination, you’ll be able to quickly determine where generating desired outputs requires new data pools or streams, or where existing ones can be used. And for both, your data modelling work will guide what structures and descriptive data need to be incorporated.

At this point, you may find that some existing data pools lend themselves to having asset and descriptive models wrapped around the raw data at the data store level – ie: centrally. It’s a capability offered in data platform solutions like GE’s Proficy Historian. This approach can make more sense than extracting data sets simply to add model data and then re-writing the results to a fresh data store. Typically, streaming/real-time sources offer more choice in how best to handle adding the model around the raw data – and there are solutions like HighByte’s Intelligence Hub, that allow the model information to be added at the “edge” – the point where the data is captured in the first place. With the model definitions included at this point, you can set up multiple output streams – some feeding more in-the-moment views or integration points, some feeding data stores. In both cases, the model data having been imposed at the edge makes it easier for the ultimate user of the data to understand the context and the meaning of what’s in the stream.
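As a simplified sketch of edge enrichment (not HighByte's actual API; topic names, addresses and model fields are illustrative), model context is attached at the point of capture, and the same enriched message can then feed multiple output streams unchanged:

```python
import json

def enrich_at_edge(address, value, model):
    """Attach model context at the point of collection (the 'edge'),
    producing a self-describing message any consumer can interpret."""
    msg = dict(model[address], value=value)
    return json.dumps(msg)

# Hypothetical model entry for one collected point.
model = {"pump7/flow": {"site": "WTW-North", "asset": "Pump-7",
                        "variable": "Flow", "unit": "l/s"}}

# One enriched message fans out to several consumers: a live
# dashboard feed and a historian ingest queue (names illustrative).
message = enrich_at_edge("pump7/flow", 18.4, model)
live_dashboard_feed = [message]
historian_queue = [message]
print(message)
```

Because the context travels with the data, downstream consumers need no separate lookup against the source system to understand what they are receiving.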



Edge Tools vs Central

Realistically, you’re likely to need both, and the driving factor will not necessarily be technical.

Edge works better when:

1. You have a team that deals well with spreading standardised templates.
2. Data sources are subject to less frequent change (utility assets are a good example of this).
3. The use cases require relatively straightforward “wrapping” of raw data with model information.

Central works well when:

1. The skills and disciplines to manage templates across many edge data collection footprints are scarce.
2. The mix of ultimate uses of the data is more complex – requiring more calculations, derivations or modelling of relationships between different types of data sources.
3. Change in underlying data sources is frequent enough that some level of dedicated and/or systematised change detection and remediation is needed.


Regardless of which tools are applied, the model definitions established earlier, applied consistently, ensure that different reports, calculations and integration tools can be developed more easily – and adapted more easily as the individual data sources under the models are inevitably tweaked, upgraded or replaced. As new automation or sensors come in, their unique data structures simply need to be bound to the models representing them, and the “consumers” of their outputs will continue to work. So, while tools will be needed, ultimately the most valuable part of “doing” Data Ops is the thinking that goes into deciding what needs to be wrapped around raw data for it to become the fuel for your digital journey.
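The binding idea can be sketched in a few lines of Python. The point names and raw tag addresses below are hypothetical; what matters is that consumers read by model-level name, so swapping the underlying source touches only the binding:

```python
# Hypothetical binding table: model-level point names mapped to whatever
# raw tag address a given generation of equipment happens to expose.
bindings = {"Mixer01.Speed": "PLC_A/DB10.REAL4"}  # legacy controller

def read_point(model_point, raw_values, table=bindings):
    """Consumers read by model-level name; only the binding table knows
    the raw address underneath."""
    return raw_values[table[model_point]]

def rebind(model_point, new_raw_tag, table=bindings):
    """When a sensor or PLC is replaced, only the binding changes - the
    reports and calculations built on the model keep working."""
    table[model_point] = new_raw_tag
```

This is the mechanism, in miniature, that lets a report survive an equipment upgrade untouched.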

]]>
1,000 miles or around the block: Start one step at a time… https://ideashub.novotek.com/1000-miles-or-around-the-block-start-one-step-at-a-time/ Wed, 16 Mar 2022 11:44:08 +0000 https://ideashub.novotek.com/?p=2996

The rise of connected industrial technologies and Industry 4.0 has prompted the development and launch of countless systems with extensive capabilities and functions. This is often beneficial for businesses with a defined long-term strategy, but it can force early adoption and spending when deployments and licensing outstrip a company’s capacity to change work processes and adopt new tech.


Here, Sean Robinson, software solutions manager at Novotek UK and Ireland, explains how less can be more with new plant tech deployments – and why immediate problem-solving needs to be a distinct effort within longer-term strategies.


Countless proverbs, maxims and quotes have been formed around the idea of moderation, dating back as far as Ancient Greek society, or even further. The notion remains important to this day for everything from diet to technology. However, engineers and plant managers frequently over-indulge in the latter, over-specifying systems that offer functionality well beyond what is necessary or even practically useful.

It can initially appear that there is no harm in opting for an automation or plant IT system that has extensive functionality, because this may help to solve future problems as they arise. That said, an investment positioned to be all-encompassing, like a full material-receiving-through-WIP-execution-with-performance-analysis-and-enterprise-integration manufacturing execution system (MES), can sometimes present its own barriers to adoption for certain businesses, especially those in sectors that favour flexibility, such as fast-moving consumer goods (FMCG) or food manufacturing (and, interestingly, increasingly in the contract production side of consumer health and life sciences). Where core production processes and related enabling technology are well-established, it can be risky, expensive and overkill to treat the need to implement specific new capabilities as the trigger for wholesale replacement or re-working. The key is to identify where critical new functional needs can be implemented around the installed technology base in focused ways that deliver results, while leaving open the option of incrementally adding further functionally-focused solutions in a staged way, over time.

At Novotek, our role is to help our customers choose technology that delivers on an immediate need, while opening up the potential to build incrementally in a controlled, low-risk way.

Fortunately, both the licensing models and the technical architectures of plant IT solutions are changing in ways that support this kind of approach. So the software cost, and the deployment services cost, of bringing on board very specific capabilities can be scaled to match the user base and the technical and functional boundaries of a specific need. We can think of these focused deployments as “micro-apps”. A key part of this approach is that the apps aren’t built as bespoke, or as an extension of a legacy (and possibly obsolete) system. Each is a productised solution, with only the “right” parts enabled and delivered to the right stakeholders.

Consider quality in toiletry production, and specifically the challenge of product loss due to variability in the quality of raw materials. It’s safe to assume that a plant will already have local control systems in place to track the overall quality outcomes, but monitoring raw material quality is often left to supplier-side data that may be under-used – serving as a record of supplier compliance with a standard, rather than being used to proactively trigger adjustments in key process settings to avoid losses. In this scenario, an ideal micro-app could focus on capturing raw material data, using machine learning to provide deep analysis of how existing machines can best process the material lot, and alerting supervisors and process owners to take action. Such a function might have a small number of users; it might even integrate with inventory or quality systems to replace some manual data entry. So the software licensing, services and timelines to deliver impact can all be kept small.

When we consider some of the demands manufacturers now face – from qualifying new suppliers and materials, to furthering energy and water reduction, to adopting more predictive maintenance and asset management strategies – we see a lot of potential to tackle these with focused solutions that happen to borrow from the underlying depth and breadth of MES solutions.
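As a rough illustration of what the decision logic inside such a micro-app might look like, here is a deliberately simplified Python sketch. The attribute name and threshold are invented, and in practice a trained machine-learning model would replace the one-line rule:

```python
def check_material_lot(lot):
    """Toy stand-in for the analysis step of a raw-material micro-app:
    flag lots whose moisture sits outside the band the line runs best
    at, and suggest a process-setting adjustment. The attribute and the
    12% threshold are invented for illustration only."""
    if lot["moisture_pct"] > 12.0:
        return {"lot": lot["id"], "alert": True,
                "action": "reduce line speed and notify process owner"}
    return {"lot": lot["id"], "alert": False, "action": None}
```

The value of the micro-app framing is that something this narrow can be licensed, deployed and integrated on its own, without dragging a full MES rollout behind it.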

There are unquestionably many cases where a plant-wide solution like an MES is necessary or even preferable. We and our key technology and services partners have delivered many such “complete” systems across the country. However, it should certainly not be considered the only option for agile industrial businesses. If each factory can be thought of as a collection of work processes/functions that need to be delivered, then implementing the supporting/enabling technology as a collection of micro-apps can make sense. And when balancing risk, cost and speed to value, sometimes, moderation in plant technology deployments can provide the most bountiful benefits.

]]>
Out-of-the-Box Solution Templates Offer More Than Meets The Eye. https://ideashub.novotek.com/out-of-the-box-solution-templates-offer-more-than-meets-the-eye/ Fri, 21 Jan 2022 13:23:08 +0000 https://ideashub.novotek.com/?p=2976

Industries such as food and beverage manufacturing and consumer packaged goods (CPG) production are fast-moving environments, with high traceability and proof-of-quality requirements alongside throughput demands. As such, automation offers a lot of benefits to operations — so changes to existing systems, or implementing new ones, can be seen as a source of risk rather than opportunity. Here, Sam Kirby, a Solutions Engineer for Novotek UK & Ireland, looks at how manufacturers in the food, beverage and CPG sectors can reliably and rapidly extend automation deployments.

Sam Kirby (Industrial IT & OT Automation Specialist)

The food and beverage industry has a long history with automated systems. In fact, one of the first fully automated production processes was that of an automatic flour mill, in 1785. The industry has generally kept abreast of automation developments since then, allowing it to keep ahead of ever-growing demand. Similar is true of CPG production, particularly in recent years, as product innovation has become a key business strategy.

CPG and food and beverage production tend towards automation because, in both sectors, much of the workforce is at the field level. As such, connecting systems to gain greater visibility into equipment health, output performance and product quality is invaluable. This is nothing new; engineers have been undertaking such projects for years. In particular, firms in these sectors have firmly established the benefits of connectivity and supervisory control and data acquisition (SCADA) systems.

However, the fast-moving nature of product development, with its knock-on effects on operations, means that systems evolve in place – the goal is to undertake minimal technical work to allow for product and process changes without compromising the overall setup. There is an additional complication: due to the complexity of many production lines, the human-machine interfaces (HMIs) are often densely packed with information, much of which is seldom necessary for an operator’s day-to-day work but may be useful to maintenance and engineering staff. As small changes and additions build around the initial core, the firm can feel that the know-how captured in the system can’t be risked or lost, so even as core technology upgrades are rolled out, the applications that have been developed end up reflecting that gradual evolution in place. And that evolution may mean that the capabilities of the core product are underused, and that legacy development tools and security methods have been preserved long past their use-by date – this is explored more deeply in our article on technology strategy here.

In recent years, we’ve seen automation software providers work to address some of these challenges. Modern SCADA software can come with preset templates configured to reflect the common assets, processes and related key data models for specific industry applications, such as pasteurising in a dairy plant or packaging in CPG environments. Such presets can reduce setup time for most engineers, but beyond that, the templates provided by vendors also offer a quick start on adopting more modern development tools and best practices for structuring an application. With that in mind, such templates can provide time savings on the basic building blocks of a new or refreshed system, which in turn “give back” the time needed to migrate any unique code or intellectual property into the more modern platform.

Even with this leg-up on the application “plumbing”, many SCADA systems still suffer from cluttered HMIs, and the vendor-provided templates are intended to help address that as well.

“Efficient HMI” Concept – being delivered by GE Digital.

Experience serving the industrial sector has shown that in the most productive environments, SCADA systems present screens that operators find easy to interpret. With the operator’s work foremost in screen design, operators can be up to 40% more effective in spotting issues that require technical teams to resolve, and engineers can then respond faster to events on the production line. GE Digital has been delivering templates intended to support this “Efficient HMI” concept as part of its iFIX HMI/SCADA system.

The templates refine HMI displays to focus on the most critical information related to executing the work. This decluttered interface improves operator effectiveness in regular operation and makes it easier to spot issues and exceptions, which means improved situational awareness and more focused reactions. The overall effect is a higher level of productivity on measures such as machine utilisation, throughput and quality.

Following this approach, iFIX also features preconfigured sample systems aimed at elements of the food, beverage and CPG industries. For example, Novotek can supply the iFIX software with a preset tailored for juice plants, with a display that provides an overview of processes from raw material intake to filling and packaging. Beverage production engineers can run this system immediately following installation for a quick assessment of how production is performing. Even where adaptation is needed, the templates provide working examples both of the efficient look-and-feel and of the most modern approaches to the technical configuration underneath it all. So engineers and IT teams get a practical hand in furthering their technical knowledge, smoothing the adoption of new versions and related modern development tools.

It’s not unusual for engineers to believe that preset templates might not adequately fit their unique operations, yet we’ve found that the preconfigured settings often provide a substantial benefit. Of course, there is no substitute for testing this directly, which is why Novotek and GE Digital offer CPG companies and food and beverage manufacturers a free demo of iFIX to see how effectively the templates suit them.

Automation may not necessarily be something new for the food, beverage and CPG sectors, but its continued evolution brings with it new implementation challenges and operational complexities. Novotek values the work by GE Digital on the Efficient HMI concept and application templates as they offer useful tools to customers to help them modernise quickly and safely. And by sharing the methods and example applications, the customer’s IT and engineering groups are given a helping hand in building a bridge from the technical realities of long-established SCADA systems to a more modern solution. 

]]>
Don’t “Stay Current” – Upgrade! https://ideashub.novotek.com/dont-just-keep-it-current-upgrade/ Wed, 12 Jan 2022 15:18:03 +0000 https://ideashub.novotek.com/?p=2932

As the UK’s industrial base continues to adapt to changes in business models, supply networks and trade environments, there’s an opportunity to tap into a massive hidden resource: the installed base of HMI/SCADA systems deployed everywhere from power plants to food processors. Many of these systems were implemented in the 90s and 00s as part of the first wave of productivity investments – by supplying a way to visualise the critical elements of complex machines and processes, industrial firms improved the effectiveness of their front-line workers and supervisors, as well as the reliability of their operations. However, rather than a desire to gain an operational and competitive advantage, the pattern of investment over the previous 20 years has been driven by a feeling of “forced need.” As we’ve aided several clients with improvements to their SCADA systems, we’ve seen two things that influence the choice to upgrade:

1. Windows compatibility (really a matter of improving the maintainability and security of the system). 

2. The PC/server equipment hosting the SCADA system has failed, forcing the firm to take steps to restore it. 

And an unfortunate sub-theme is common – we’re often told that any upgrade “must keep the system the same as the operators are used to”. No changes. No adoption of new functions. No assessment of whether current engineering practices could lead to a better, more maintainable footprint. “Convert the application and get it running again” is the instruction of the day. Even in cases where a firm has run an assessment of different providers and switched to a new SCADA vendor, they’ve then asked to have their old application replicated, rather than taking the upgrade work as a chance to consider what new capabilities might be delivered by the more modern SCADA platform that’s replacing the one from 20 or 30 years(!) ago.  

From the operations leaders’ perspectives, the core mission – make the product; keep the assets running – is the same, and it can be hard to step back, and consider whether the automation and systems around the operation should work the way they always have. And when vendors supply a laundry list of new features or technical updates, it doesn’t necessarily give an operational, maintenance or other non-technical leader compelling information in the right terms for them to see the value in taking that pause to step back and consider a broader approach to what first looked like a “forced necessity”. 

If we had the opportunity to be face to face with every customer as they took that step back, here’s what we’d recommend they consider:

Where vendors spend their SCADA product development dollars IS driven by YOUR needs (yes, some technical, but many more are user/functional focused). Just a few examples would include: 

What customers ask for, and what vendors invested in:

• Better access to historical data, at the fingertips of operators and supervisors → integration of data historians into the product offer and into the screens deployed to users.
• Freedom from the desk! → a choice of ways to make SCADA available remotely, via the web or via device-based apps.
• A way to separate how information is delivered to production/operations people vs. technical or maintenance people → application build-and-delivery guides (and starter kits) that supply guidance on how to serve different users from the same underlying platform.
• Ways to filter and focus the flood of data coming from machines and processes → improved alarm and event management functions, and even incorporation of solutions that can route and escalate different events to different people for more effective response.
• Better support for continuous improvement practices such as Lean or autonomous maintenance → improved interoperability with other systems such as ERP, quality or asset maintenance systems, so data (rather than paper) can flow, reducing non-productive work and making sure issues and exceptions are managed in the tool best suited to them.

Using the more modern SCADA could save you time, effort and budget that would otherwise be spent on competing/overlapping new systems.  

OK – this one goes a little deeper into some technological details but stay with us! 

As firms pursue the value that may come from adoption of big data platforms and machine learning or analytics, they often launch data acquisition projects without understanding how existing plant systems can be part of the landscape – driving additional tech into a plant where it may not be needed.

Often, the IT and HQ operations teams don’t realise that their existing SCADA could accelerate these initiatives.  

Security – the elephant in the room 

It’s true that this is a topic that can cause people’s heads to throb before a discussion even gets started. But we’re going to skip all of the technical details and simply accept that improving cyber security postures is a priority for practical reasons, including: 

  • Yup – there are bad actors. Anyone remember the last round of WannaCry incidents? 
  • There are increasingly heavy regulatory burdens around certain industries or sectors 
  • To the above point, I hear many of you exclaiming “But my firm isn’t part of Critical National Infrastructure!”. That may be true, but we’re even seeing scenarios where brand owners or retailers are insisting that their suppliers be able to prove that they have a solid cyber security position – as that’s seen as an indicator that they’ll be less subject to disruption, one way or another… 

Again, vendors haven’t been idle. It may take some basic work on the applications you have, but if you’ve at least invested in a current version of your SCADA, you’ll be able to take advantage of what’s noted below. 

What customers need, for their customers and regulators, and what vendors invested in:

• Modern, patchable plant systems kept compatible with major operating systems like Windows Server → this is pretty much table stakes; it’s covered!
• Auditability → better deployment of user definitions, and the option to deploy e-signatures on critical functions.
• Flexible, role-based restrictions on access to functions → making the internal user/security model the basis for screen-based or even function-based restrictions, so only authorised users can do certain things.
• Security that coordinates with corporate resources and policies → integration of SCADA security/users with Active Directory.
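To illustrate what function-level, role-based restriction amounts to, here is a minimal Python sketch. The role and action names are invented; in a real SCADA the check would defer to the product’s own user model or to Active Directory group membership:

```python
# Hypothetical role-to-permission mapping for a plant HMI.
ROLE_PERMISSIONS = {
    "operator": {"acknowledge_alarm"},
    "engineer": {"acknowledge_alarm", "edit_setpoint"},
}

def authorised(role, action, permissions=ROLE_PERMISSIONS):
    """Only roles explicitly granted an action may perform it; unknown
    roles get nothing by default."""
    return action in permissions.get(role, set())
```

The deny-by-default shape is the point: anything not explicitly granted is refused, which is what auditors and regulators generally expect to see.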
  

Another area where IT teams struggle is understanding what can be done at the SCADA layer to meet security goals. As a result, they may consider unique (and often extreme!) techniques and approaches that are more difficult and expensive than needed. 

Click here to see what our vendors now have available! iFIX Secure Deployment Guide.

Don’t just “keep it current” – Upgrade!

Given the breadth of functions, capabilities and technological upgrading that SCADA providers have implemented, it’s probably safe to argue that there’s more “in the box” that might be employed in areas that matter. And we’ve reserved the details of the important technical things, such as development speed and part and code reusability, for another time! We believe that looking at their SCADA with fresh eyes, and thinking about what the current platform CAN do – rather than what the 20-year-old application created on the legacy platform from 1992 IS doing – is the key to unlocking new operational gains. And the route to those benefits can be faster to follow, making other digital projects speedier as well. 

]]>
Are you ready for digital transformation? https://ideashub.novotek.com/are-you-ready-for-digital-transformation/ Mon, 06 Sep 2021 11:57:30 +0000 https://ideashub.novotek.com/?p=2891

For years, digital transformation has been considered a new frontier for manufacturing and industrial operations. In the wake of the COVID-19 pandemic and the series of subsequent supply chain struggles, it’s becoming apparent that digital transformation is no longer the next frontier; it’s the next normal. However, it is also a journey rather than a destination. As with any journey, you need to know where you are departing from, not simply where you intend to go.

The 2021 State of Manufacturing report by Fictiv identified that the pandemic has accelerated interest and investment in digital transformation. 91 per cent of manufacturing business leaders increased their investments into digital transformation in the past year, with 77 per cent reporting to have done so substantially. Beyond the 91 per cent that have already ramped up investment, an additional four per cent agreed that digital transformation is essential to future success.

There are many possible reasons for why the extra four per cent have not increased their investments. One of the most common reasons that we encounter is uncertainty:  where to invest, what the benefit could be or whether the implementation will align with longer-term objectives. In many cases, it’s a reflection of business leaders not knowing their level of digital maturity or where they are in the digital transformation cycle.

Assessing your level of digital maturity and readiness is an essential step prior to undertaking any digitalisation project. Although they sound similar, digital maturity and digital readiness are distinct; the former focusses on technological deployments, whereas the latter encompasses a review of enterprise-wide processes, skills and staff.

Most manufacturers tend to focus on digital maturity as the sole measurement. This impedes the effectiveness of digital transformation initiatives. However, it is still an important metric to track.

Many manufacturers that Novotek encounters are around level two in the digital maturity index but set their project objectives on leaping to levels four and beyond. This leads to expensive, ineffectual system deployments that could be avoided with an accurate view of digital maturity and proper consultation. It is important that business leaders think of the index as a staircase; they must ascend one level at a time.

Digital readiness

Transformation does not happen in isolation. Assessing digital readiness involves looking at each aspect of a business to determine whether the foundations are in place for the next level of digital investment to be successful.

There are five assessment areas for digital readiness:

Whether assessing digital readiness or maturity, the most important factor is the honesty of the organisation. Assessments should be conducted on as much tangible, observable evidence and data as possible to produce an accurate reflection of its current state. In many cases, involving a specialist as a neutral third party is advantageous at this stage because it limits biases or accidental omissions and oversights.

Digital transformation appears to be an essential aspect of the next normal for manufacturers. Business leaders must be able to embrace it effectively to recover from and adapt to a post-COVID world.

]]>