Managing multiple energy sources https://ideashub.novotek.com/managing-multiple-energy-sources/ Tue, 18 Oct 2022 12:51:20 +0000 https://ideashub.novotek.com/?p=3270

In 2013, the UK Government Office for Science produced a report entitled 'The future role for energy in manufacturing'. In it, the authors identified two threats to UK-based manufacturers. The first was that the price of energy in the UK would rise relative to the cost faced by competitor firms abroad, placing UK manufacturers at a significant disadvantage. The price has indeed risen, though globally, because of the Russia-Ukraine war; nevertheless, the threat to UK manufacturing is still valid. The second threat was that a low-carbon electricity supply would be unreliable, and that the cost of power cuts would rise. That is certainly true if you rely solely on low-carbon electricity, but using multiple sources of power can be greatly beneficial.

In 2021, US rankings put technology companies at the top of the list of renewables users. Google derived 93% of its total electricity consumption from solar and wind power, Microsoft sourced 100% of its electricity from wind, small hydro and solar power, and Intel likewise derived 100% of its electricity from various renewables.

In the manufacturing world, more and more producers are turning to multiple sources to power their operations, particularly those in energy-intensive production industries.

Tesla is well known for committing to renewable energy in manufacturing, with its solar-panelled roofs and use of waste heat and cold desert air to govern production processes in its Gigafactories.

Some of the bigger names in the manufacturing world that are utilising solar power include GM, L’Oreal and Johnson & Johnson.

Manufacturing companies make ideal candidates for solar installations for several reasons. First, these businesses typically operate out of large plants with sizeable roofs, and these expansive, flat spaces are perfect for setting up large numbers of solar panels. Manufacturing plants also tend to be located in industrial parks and other areas far away from tall buildings, so they avoid the shading problems caused by massive structures looming over solar panels. Smaller manufacturers can also benefit from multiple energy sources, both to reduce their costs and to lessen their reliance on the grid.

Making it work

A setup that combines various types of energy is called a multi-carrier energy system, and it increases energy efficiency. The technology that allows two or more independent three-phase or single-phase power systems to synchronise can be achieved using a Power Sync and Measurement (PSM) system, such as the module found in the PACSystems RX3i Power Sync & Measurement System (IC694PSM001 & IC694ACC200). This monitors two independent three-phase power grids and incorporates advanced digital signal processor (DSP) technology to continuously process three voltage inputs and four current inputs for each grid.

Measurements include RMS voltages, RMS currents, RMS power, frequency, and phase relationship between the phase voltages of both grids.

The PSM module performs calculations on each captured waveform, with the DSP processing the data in less than two-thirds of a power line cycle. The PSM module can be used with wye or delta type three-phase power systems or with single-phase power systems.
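To make the measurement side more concrete, the short Python sketch below illustrates the kind of waveform maths involved: computing RMS values and the phase relationship between two sampled grid voltages, the quantities a synchronisation check compares. It is purely illustrative and is not the PSM module's firmware or API; the sample rate, frequency and amplitudes are assumed values.

import numpy as np

def rms(samples: np.ndarray) -> float:
    """Root-mean-square value of one captured waveform."""
    return float(np.sqrt(np.mean(samples ** 2)))

def fundamental_phase_deg(samples: np.ndarray, fs: float, f_nominal: float = 50.0) -> float:
    """Phase angle (in degrees) of the fundamental frequency component."""
    spectrum = np.fft.rfft(samples)
    k = int(round(f_nominal * len(samples) / fs))   # index of the fundamental bin
    return float(np.degrees(np.angle(spectrum[k])))

# Simulated single-cycle captures of two 50 Hz grid voltages, sampled at 10 kHz.
fs = 10_000.0
t = np.arange(0, 0.02, 1 / fs)
grid1 = 325 * np.sin(2 * np.pi * 50 * t)                    # roughly 230 V RMS reference grid
grid2 = 325 * np.sin(2 * np.pi * 50 * t - np.radians(20))   # second grid lagging by 20 degrees

phase_gap = (fundamental_phase_deg(grid1, fs) - fundamental_phase_deg(grid2, fs)) % 360
print(f"Grid 1: {rms(grid1):.1f} V RMS, Grid 2: {rms(grid2):.1f} V RMS")
print(f"Phase relationship: {phase_gap:.1f} degrees")       # a sync relay would require this near zero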

The PSM system can be used for applications such as:

  • Electrical power consumption monitoring and reporting
  • Fault monitoring
  • Generator control features for generator to power grid synchronization
  • Demand penalty cost reduction/load shedding

The PSM system consists of:

  • PSM module – A standard IC694 module that mounts in an RX3i main rack. The PSM module provides the DSP capability.
  • Terminal Assembly – A panel-mounted unit that provides the interface between the PSM module and the input transformers.
  • Interface cables – Provide the GRID 1 and GRID 2 connections between the PSM module and the Terminal Assembly

The image below shows how a basic PSM system can be connected.

PSM System Features
  • Uses standard, user-supplied current transformers (CTs) and potential transformers (PTs) as its input devices.
  • Accurately measures RMS voltage and current, power, power factor, frequency, energy, and total three-phase 15-minute power demand.
  • Provides two isolated relays that close when the voltage phase relationships between the two monitored grids are within the specified ANSI 25 limits provided by the RX3i host controller. These contacts can be used for general-purpose, lamp duty or pilot duty loads. Voltage and current ratings for these load types are provided in GFK-2749, PACSystems RX3i Power Sync and Measurement System User’s Manual.
  • Provides a cable monitoring function that indicates when the cables linking the PSM module and Terminal Assembly are correctly installed.
  • PSM module and Terminal Assembly are easily calibrated by hardware configuration using the PAC Machine Edition (PME) software.

To find out how Novotek can help you reduce your energy consumption and manage multiple energy sources, email us at info_uk@novotek.com.

DataOps: The Fuel Injectors For Your Transformation Engine? https://ideashub.novotek.com/dataops-the-fuel-injectors-your-transformation-engine/ Thu, 19 May 2022 11:43:48 +0000 https://ideashub.novotek.com/?p=3060

Data – everyone agrees it’s the fuel for the fires of innovation and optimisation. The industrial world is blessed with an abundance of rich, objective (being machine-generated) data, so it should be well equipped to seek new advantages from it. Too often, though, the first efforts an industrial firm takes to harness its machine and process data for new reporting or advanced analysis initiatives involve simple use cases and outputs that can mask what it takes to support a mix of different needs in a scalable and supportable way. Data Ops practices provide a way of systematically addressing the steps needed to ensure that your data can be made available in the right places, at the right times, in the right formats for all the initiatives you’re pursuing.


Industrial data (or OT data) poses particular challenges that your Data Ops strategy will address:

  • It can be generated at a pace that challenges traditional enterprise (or even cloud-layer) data collection and data management systems – to say nothing of the costs of ingestion and processing during reporting/analysis that are typical of cloud platforms.
  • The data for functionally identical assets or processes is often not generated in a consistent structure and schema.
  • OT data generally does not have context established around each data point – making it difficult to understand what it represents, let alone the meaning inherent in the actual values!
  • Connecting to a mix of asset types with different automation types and communications protocols is often necessary to get a complete data set relevant to the reporting or analytics you’re pursuing.
  • A wide array of uses demands different levels of granularity of some data points and a breadth of collection points that is significantly wider than many individual stakeholders may appreciate.

These are the reasons why in many firms, the engineering team often ends up becoming the “data extract/Excel team” – their familiarity with the underlying technology means they can take snapshots and do the data cleansing necessary to make the data useful. But that’s not scalable, and data massaging is a far less impactful use of their time – they should be engaged with the broader team interpreting and acting on the data!


Data Ops – Quick Definition

There’s no one way to “do” Data Ops. In the industrial world, it’s best thought of as a process involving:

  • Determining the preferred structures and descriptions (models) for OT data, so it may serve the uses the organisation has determined will be valuable.
  • Assessing which approaches to adding such models can be adopted by your organisation.
  • Choosing the mix of tools needed to add model structures to a combination of existing and new data sources.
  • Establishing the procedures to ensure that model definitions don’t become “stale” if business needs change.
  • Establishing the procedures to ensure that new or changing data sources are brought into the model-based framework promptly.


A Rough Map is Better Than No Map.

Take a first pass at capturing all the intended uses of your OT data. What KPIs, what reports, what integration points, and what analytics are people looking for? Flesh out those user interests with an understanding of what can feed into them:

  1. Map the different stakeholders’ data needs in terms of how much they come from common sources, and how many needs represent aggregations, calculations or other manipulations of the same raw data.
  2. Flesh out the map by defining the regularity with which data needs to flow to suit the different use cases. Are some uses based on by-shift or daily views of some data? Are other uses based on feeding data in real time between systems to trigger events or actions?
  3. Now consider what data could usefully be “wrapped around” raw OT data to make it easier for the meaning and context of that data to be available for all. Assess what value can come from:
    1. Common descriptive models for assets and processes – a “Form Fill & Seal Machine” with variables like “Speed” and “Web Width” (etc.) is a far easier construct for many people to work with than a database presenting a collection of rows reflecting machines’ logical addresses, each with a small library of cryptically structured variables associated with it (see the sketch after this list).
    2. An enterprise model to help understand the locations and uses of assets and processes. The ISA-95 standard offers some useful guidance in establishing such a model.
    3. Additional reference data to flesh out the descriptive and enterprise models (e.g. the make and model of common asset types available from many vendors, or information about a location such as latitude or elevation). Be guided by what kind of additional data would be helpful in comparing, contrasting and investigating differences in outcomes that need to be addressed.
  4. Now assess what data pools are accumulating already – and how much context is accumulating in those pools. Can you re-use existing investments to support these new efforts, rather than creating a parallel set of solutions?
  5. Finally, inventory the OT in use where potentially useful data is generated, but not captured or stored; particularly note connectivity options.
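To make the modelling idea tangible, here is the minimal sketch referenced in point 3 above. It is written in Python with entirely hypothetical asset names, tag addresses and site structure; it simply shows what binding raw tag addresses to a descriptive asset model and an ISA-95-style enterprise path might look like.

from dataclasses import dataclass, field

@dataclass
class AssetModel:
    """Descriptive model for one class of asset, e.g. a form/fill/seal machine."""
    asset_type: str
    enterprise_path: str                                   # ISA-95 style: Enterprise/Site/Area/Line/Unit
    reference_data: dict = field(default_factory=dict)     # make, model, location details, etc.
    tag_map: dict = field(default_factory=dict)            # friendly variable name -> raw tag address

# Hypothetical packaging machine, with raw controller addresses bound to named variables.
ffs_machine_3 = AssetModel(
    asset_type="Form Fill & Seal Machine",
    enterprise_path="ExampleFoods/PlantA/Packaging/Line2/FFS-03",
    reference_data={"make": "ExampleVendor", "model": "FS-2000", "commissioned": "2018"},
    tag_map={
        "Speed":     "PLC12:N7:44",     # addresses are illustrative only
        "Web Width": "PLC12:F8:10",
        "Seal Temp": "PLC12:F8:11",
    },
)

def contextualise(asset: AssetModel, raw_values: dict) -> dict:
    """Wrap a set of raw tag readings in the asset's model, so consumers see named, located data."""
    return {
        "asset_type": asset.asset_type,
        "path": asset.enterprise_path,
        **asset.reference_data,
        "values": {name: raw_values[addr] for name, addr in asset.tag_map.items() if addr in raw_values},
    }

# A consumer now works with "Speed" on "Line2/FFS-03" rather than "PLC12:N7:44".
print(contextualise(ffs_machine_3, {"PLC12:N7:44": 62.0, "PLC12:F8:10": 310.0, "PLC12:F8:11": 141.5}))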

Avoiding A Common Trap

“Data for Analytics” means different things at different stages. A data scientist looking to extract new insights from OT data may need very large data sets in the data centre or cloud, where they can apply machine learning or other “big data” tools to a problem. A process optimisation team deploying a real-time analytic engine to make minute-by-minute use of the outputs of the data scientists’ work may only need small samples across a subset of data points for their part of the work. Data Ops thinking will help you ensure that both of these needs are met appropriately.


Map’s Done – Now How About Going Somewhere?

The work that comes next is really the “Ops” part of Data Ops – with the rough map of different uses of OT data at hand, and the view of whether each use needs granular data, aggregated data, calculated derivations (like KPIs), or some kind of combination, you’ll be able to quickly determine where generating desired outputs requires new data pools or streams, or where existing ones can be used. And for both, your data modelling work will guide what structures and descriptive data need to be incorporated.

At this point, you may find that some existing data pools lend themselves to having asset and descriptive models wrapped around the raw data at the data store level – i.e. centrally. It’s a capability offered in data platform solutions like GE’s Proficy Historian. This approach can make more sense than extracting data sets simply to add model data and then re-writing the results to a fresh data store. Typically, streaming/real-time sources offer more choice in how best to handle adding the model around the raw data – and there are solutions, like HighByte’s Intelligence Hub, that allow the model information to be added at the “edge” – the point where the data is captured in the first place. With the model definitions included at this point, you can set up multiple output streams – some feeding more in-the-moment views or integration points, some feeding data stores. In both cases, the model data having been imposed at the edge makes it easier for the ultimate user of the data to understand the context and the meaning of what’s in the stream.
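Neither Proficy Historian nor HighByte's Intelligence Hub is scripted this way; the generic Python sketch below just illustrates the edge pattern described above: enrich each reading with model context once, at the point of capture, then fan the same contextualised payload out to more than one destination. The model content and the two sinks are assumptions for illustration.

import json
from datetime import datetime, timezone

# Hypothetical model attached at the edge; in a real deployment this would be a managed template.
MODEL = {"asset_type": "Filler", "path": "PlantA/Packaging/Line2/Filler-01"}

def enrich(tag: str, value: float) -> dict:
    """Attach model context and a timestamp to one raw reading at the point of capture."""
    return {**MODEL, "variable": tag, "value": value,
            "timestamp": datetime.now(timezone.utc).isoformat()}

history_store = []                                 # stand-in for a historian or database

def to_realtime_stream(payload: dict) -> None:
    print("stream ->", json.dumps(payload))        # stand-in for an MQTT or event-hub publish

def to_historian(payload: dict) -> None:
    history_store.append(payload)                  # stand-in for a data store write

def publish(tag: str, value: float) -> None:
    payload = enrich(tag, value)                   # the model travels with the data from the edge
    for sink in (to_realtime_stream, to_historian):
        sink(payload)

publish("FillWeight", 498.7)
publish("FillWeight", 501.2)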



Edge Tools vs Central

Realistically, you’re likely to need both, and the driving factor will not necessarily be technical.

Edge works better when:
  1. You have a team that deals well with spreading standardised templates.
  2. Data sources are subject to less frequent change (utility assets are a good example of this).
  3. The use cases require relatively straightforward “wrapping” of raw data with model information.

Central works better when:
  1. The skills and disciplines to manage templates across many edge data collection footprints are scarce.
  2. The mix of ultimate uses of the data is more complex – requiring more calculations, derivations or modelling of relationships between different types of data sources.
  3. Change in underlying data sources is frequent enough that some level of dedicated and/or systematised change detection and remediation is needed.


Regardless of which tools are applied, the model definitions established earlier, applied consistently, ensure that different reports, calculations and integration tools can be developed more easily, and adapted more easily as individual data sources under the models are inevitably tweaked, upgraded or replaced. As new automation or sensors come in, their unique data structures simply need to be bound to the models representing them, and the “consumers” of their outputs will continue to work. So, while tools will be needed, ultimately the most valuable part of “doing” Data Ops is the thinking that goes into deciding what needs to be wrapped around raw data for it to become the fuel for your digital journey.

1,000 miles or around the block: Start one step at a time… https://ideashub.novotek.com/1000-miles-or-around-the-block-start-one-step-at-a-time/ Wed, 16 Mar 2022 11:44:08 +0000 https://ideashub.novotek.com/?p=2996

The rise of connected industrial technologies and Industry 4.0 has prompted the development and launch of countless systems with extensive capabilities and functions. This is often beneficial for businesses with a defined and set long-term strategy, but it can force early adoption and spending when deployments and licensing outstrip a company’s capacity to change work processes and adopt new technology.


Here, Sean Robinson, software solutions manager at Novotek UK and Ireland, explains how less can be more with new plant tech deployments – and why immediate problem-solving needs to be a distinct effort within longer-term strategies.


Countless proverbs, maxims and quotes have been formed around the idea of moderation, dating back to Ancient Greek society or even earlier. The notion remains important to this day for everything from diet to technology. However, engineers and plant managers frequently over-indulge in the latter and over-specify systems that offer functionality well beyond what is necessary or even practically useful.

It can initially appear that there is no harm in opting for an automation or plant IT system that has extensive functionality, because this may help to solve future problems as they arise. That said, an investment positioned to be all-encompassing – like a full, material-receiving-through-WIP-execution-with-performance-analysis-and-enterprise-integration manufacturing execution system (MES) – can sometimes present its own barriers to adoption for certain businesses, especially those in sectors that favour flexibility, such as fast-moving consumer goods (FMCG) or food manufacturing (and, interestingly, increasingly the contract production side of consumer health and life sciences). Where core production processes and related enabling technology are well established, it can be risky, expensive and excessive to treat the need to implement specific new capabilities as the trigger for wholesale replacement or re-working. The key is to identify where critical new functional needs can be implemented around the installed technology base in focused ways that deliver results, while leaving open the option of incrementally adding further functionally focused solutions in a staged way, over time.

At Novotek, our role is to help our customers choose technology that delivers on an immediate need, while opening up the potential to build incrementally in a controlled, low-risk way.

Fortunately, both the licensing models and the technical architectures of plant IT solutions are changing in ways that support this kind of approach. The software and deployment services costs of bringing on board very specific capabilities can be scaled to match the user base and the technical and functional boundaries of a specific need. We can think of these focused deployments as “micro-apps”. A key part of this approach is that the apps aren’t built as bespoke software, or as an extension of a legacy (and possibly obsolete) system. Each is a productised solution – with only the “right” parts enabled and delivered to the right stakeholders. Consider quality in toiletry production, and specifically the challenge of product loss due to variability in the quality of raw materials. It’s safe to assume that a plant will already have local control systems in place, and systems elsewhere to track the overall quality outcomes, but monitoring raw material quality is often left to supplier-side data that may be under-used – serving as a record of supplier compliance with a standard, rather than being used to proactively trigger adjustments in key process settings to avoid losses. In this scenario, an ideal micro-app could be focused on capturing raw material data, using machine learning to provide deep analysis of how existing machines can best process the material lot, and alerting supervisors and process owners to take action. Such a function might have a small number of users; it might even have integration with inventory or quality systems replacing some manual data entry. So, the software licensing, services and timelines to deliver impact can all be kept small. When we consider some of the demands manufacturers now face – on fronts ranging from qualifying new suppliers and materials, to furthering energy and water reduction, to adopting more predictive maintenance and asset management strategies – we see a lot of potential to tackle these with focused solutions that happen to borrow from the underlying depth and breadth of MES solutions.

There are unquestionably many cases where a plant-wide solution like an MES is necessary or even preferable. We and our key technology and services partners have delivered many such “complete” systems across the country. However, it should certainly not be considered the only option for agile industrial businesses. If each factory can be thought of as a collection of work processes/functions that need to be delivered, then implementing the supporting/enabling technology as a collection of micro-apps can make sense. And when balancing risk, cost and speed to value, sometimes, moderation in plant technology deployments can provide the most bountiful benefits.

Out-of-the-Box Solution Templates Offer More Than Meets The Eye. https://ideashub.novotek.com/out-of-the-box-solution-templates-offer-more-than-meets-the-eye/ Fri, 21 Jan 2022 13:23:08 +0000 https://ideashub.novotek.com/?p=2976

Industries such as food and beverage manufacturing and consumer packaged goods (CPG) production are fast-moving environments, with high traceability and proof-of-quality requirements alongside throughput demands. As such, automation offers a lot of benefits to operations — so changes to existing systems, or implementing new ones, can be seen as a source of risk rather than opportunity. Here, Sam Kirby, a Solutions Engineer for Novotek UK & Ireland, looks at how manufacturers in the food, beverage and CPG sectors can reliably and rapidly extend automation deployments.

Sam Kirby (Industrial IT & OT Automation Specialist)

The food and beverage industry has a long history with automated systems. In fact, one of the first fully automated production processes was that of an automatic flour mill, in 1785. The industry has generally kept abreast of automation developments since then, allowing it to keep ahead of ever-growing demand. Similar is true of CPG production, particularly in recent years, as product innovation has become a key business strategy.

CPG and food and beverage production tend towards automation because, in both sectors, much of the workforce is at the field level. As such, connecting systems to gain greater visibility into equipment health, output performance and product quality is invaluable. This is nothing new; engineers have been undertaking such projects for years. In particular, firms in these sectors have firmly established the benefits of connectivity and supervisory control and data acquisition (SCADA) systems.

However, the fast-moving nature of product development, with its knock-on effects on operations, means that systems are evolved in place – the goal is to undertake minimal technical work to allow for product and process changes without compromising the overall setup. There is an additional complication in that, due to the complexity of many production lines, the human-machine interfaces (HMIs) are often densely packed with information — much of which is seldom necessary for an operator’s day-to-day work, but may be useful to maintenance and engineering staff. As small changes and additions build around the initial core, the firm can feel that the know-how captured in the system can’t be risked or lost, so even as core technology upgrades are rolled out, the applications that have been developed end up reflecting that gradual evolution in place. And that evolution may mean that the capabilities of the core product are underused, and that legacy development tools and security methods have been preserved long past their use-by date – this is explored more deeply in our article on technology strategy here.

In recent years, we’ve seen automation software providers work to address some of these challenges. Modern SCADA software can come with preset templates that are configured to reflect the common assets, processes and related key data models for specific industry applications, such as pasteurising in a dairy plant or packaging in CPG environments. Such presets can reduce the setup time for most engineers, but beyond that, the templates provided by vendors also offer a quick start on adopting more modern development tools and best practices for structuring an application. With that in mind, such templates can provide time savings on the basic building blocks of a new or refreshed system, which in turn “give back” the time needed to migrate any unique code or intellectual property into the more modern platform.

Even with this leg-up on the application “plumbing”, many SCADA systems still suffer from cluttered HMIs, and the vendor-provided templates are intended to help address that as well.

“Efficient HMI” Concept – being delivered by GE Digital.

Experience serving the industrial sector has shown that in the most productive environments, SCADA systems present screens to operators that are easy to interpret. By having the operator’s work foremost in screen design, operators can be up to 40% more effective in spotting issues that require technical teams to resolve. Engineers can then respond faster to events on the production line. GE Digital has been delivering templates intended to support this “Efficient HMI” concept as part of its IFIX HMI/SCADA system.

The templates refine HMI displays to focus on the most critical information related to executing the work. This decluttered interface improves operator effectiveness in regular operation and makes it easier to spot issues and exceptions, which means improved situational awareness and more focused reactions to such issues. The overall effect is a higher level of productivity on measures such as machine utilisation, throughput and quality.

Following this approach, IFIX also features preconfigured sample systems that are aimed at elements of the food, beverage and CPG industries. For example, Novotek can supply the IFIX software with a preset tailored for juice plants, with a display that provides an overview of processes from raw material intake to filling and packaging. Beverage production engineers can run this system immediately following installation to gain an immediate assessment of how production is performing. Even where adaptation is needed, the templates provide working examples of both the efficient look-and-feel, and of the most modern approaches to the technical configuration underneath it all. So engineers and IT teams get a practical hand in furthering their technical knowledge, smoothing the adoption of new versions and related modern development tools. 

It’s not unusual for engineers to believe that preset templates might not adequately fit their unique operations, yet we’ve found that the preconfigured settings often provide a substantial benefit. Of course there is no substitute for testing this directly, which is why Novotek and GE Digital offer CPG companies and food and beverage manufacturers a free demo of IFIX to see how effectively the templates suit them. 

Automation may not necessarily be something new for the food, beverage and CPG sectors, but its continued evolution brings with it new implementation challenges and operational complexities. Novotek values the work by GE Digital on the Efficient HMI concept and application templates as they offer useful tools to customers to help them modernise quickly and safely. And by sharing the methods and example applications, the customer’s IT and engineering groups are given a helping hand in building a bridge from the technical realities of long-established SCADA systems to a more modern solution. 

Are your PLCs an easy target? A mindset shift can significantly reduce PLC firmware vulnerabilities https://ideashub.novotek.com/are-your-plcs-an-easy-target-reduce-plc-firmware-vulnerabilities/ Thu, 25 Nov 2021 14:06:48 +0000 https://ideashub.novotek.com/?p=2917

Since the beginning of the COVID-19 pandemic, businesses across the UK have faced a surge in cybercrime. In fact, research indicates that UK businesses experienced one attempted cyberattack every 46 seconds on average in 2020. Industrial businesses are a prime target for hackers and the ramifications of a data breach or denial-of-service attack are far-reaching, making system security imperative. Here, David Evanson, corporate vendor relationship manager at Novotek UK and Ireland, explains how industrial businesses can keep their vital systems secure.

For many business leaders and engineers, it is still tempting to consider large multinational companies or data-rich digital service providers to be the prime targets for hackers. However, the growing volume of cyberattacks on businesses globally shows that any company can be a target of malicious attacks on systems and services.

According to research by internet service provider Beaming, there were 686,961 attempted system breaches among UK businesses in 2020, marking a 20 per cent increase on 2019. Of these attacks, Beaming noted that one in ten was intended to gain control of an Internet of Things (IoT) device — something that indicates a tendency to target system continuity rather than conventional data.

Both factors together are cause for alarm among industrial businesses of all sizes. Hackers are targeting all manner of companies, from start-ups to global organisations, and focussing more on the growing number of internet-connected devices and systems that were previously isolated.

The consequences of a device being compromised range from data extraction to service shutdown, and in any case the financial and production impacts to an industrial business are significant. There is no single quick fix to bolster cybersecurity due to the varying types of hacks that can take place. Some cyberattacks are complex and sophisticated; others less so. Many attacks on devices tend to fall into the latter category, which means there are some steps industrial businesses can take to minimise risk.

Novotek has been working closely with industrial businesses in the UK and Ireland for decades. One common thing that we have observed with automation hardware and software is that many engineers do not regularly upgrade software or firmware. Instead, there is a tendency to view automation as a one-off, fit-and-forget purchase. The hardware may be physically maintained on a regular schedule, but the invisible software aspect is often neglected.

GE Fanuc Series 90-30

Older firmware is more susceptible to hacks because it often contains unpatched known security vulnerabilities, such as weak authentication algorithms, obsolete encryption technologies or backdoors for unauthorised access. For a programmable logic controller (PLC), older firmware versions make it possible for cyber attackers to change the module state to halt-mode, resulting in a denial-of-service that stops production or prevents critical processes from running.

PLC manufacturers routinely update firmware to ensure it is robust and secure in the face of the changing cyber landscape, but there is not always a set interval between these updates.

In some cases, updates are released in the days or weeks following the discovery of a vulnerability — whether by the manufacturer, white-hat hackers or genuine attackers — to minimise end-user risk. The firmware version’s upgrade information should outline any exploits that have been fixed.

However, it’s important to note that legacy PLCs may no longer receive firmware updates from the manufacturer if the system has reached obsolescence. Many engineers opt to air-gap older PLCs to minimise the cybersecurity risk, but the lack of firmware support can also create interoperability issues with connected devices. Another part of the network, such as a switch, receiving an update can cause communications and compatibility issues with PLCs running on older versions — yet another reason why systems should run on the most recent software patches.
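There is no single standard tool for this housekeeping, but the simple Python sketch below illustrates the idea: compare a PLC asset inventory against the minimum patched firmware version for each family and flag anything that is behind or out of support. The inventory entries, firmware numbers and baselines shown here are all hypothetical.

# Hypothetical inventory; in practice this would come from an asset register or network scan.
plc_inventory = [
    {"name": "Line1-PLC",  "family": "PACSystems RX3i", "firmware": "9.40",  "supported": True},
    {"name": "Line2-PLC",  "family": "PACSystems RX3i", "firmware": "10.05", "supported": True},
    {"name": "Boiler-PLC", "family": "Series 90-30",    "firmware": "12.03", "supported": False},
]

# Illustrative minimum patched versions per family, maintained from vendor security advisories.
minimum_firmware = {"PACSystems RX3i": "10.05"}

def version_tuple(version: str) -> tuple:
    """Turn a dotted version string into a tuple so versions compare numerically."""
    return tuple(int(part) for part in version.split("."))

for plc in plc_inventory:
    baseline = minimum_firmware.get(plc["family"])
    if not plc["supported"]:
        print(f"{plc['name']}: family no longer receives firmware updates - review replacement or isolation")
    elif baseline and version_tuple(plc["firmware"]) < version_tuple(baseline):
        print(f"{plc['name']}: firmware {plc['firmware']} is below the patched baseline {baseline} - schedule an update")
    else:
        print(f"{plc['name']}: up to date")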

At this stage, engineers should invest in a more modern PLC to minimise risk — and, due to the rate of advancement of PLCs in recent years, likely benefit from greater functionality at the same time.

Firmware vulnerabilities are unavoidable, regardless of the quality of the PLC. At Novotek, we give extensive support for the Emerson PACSystems products that we provide to businesses in the UK and Ireland. This involves not only support with firmware updates as they become available, but also guidance on wider system resilience to ensure that businesses are as safe as possible from hardware vulnerabilities. The growth in cyberattacks will continue long beyond the end of the COVID-19 pandemic, and infrastructure and automation are increasingly becoming targets. It may seem a simple step, but taking the same upgrade approach to firmware that we do with conventional computers can help engineers to secure their operations and keep running systems safely.

Bridging the connectivity gap https://ideashub.novotek.com/bridging-the-connectivity-gap/ Mon, 06 Sep 2021 10:18:03 +0000 https://ideashub.novotek.com/?p=2860

In the age of connectivity, there is no shortage of useful information that engineers can leverage to optimise and improve operations. Everything from the speed of motors to the weather forecast can influence production. However, bringing these data sources together in a secure way is a challenge faced by many engineers. Here, George Walker, managing director of Novotek UK and Ireland, explains how engineers can bridge the gap between local process data and external data sources.

The Internet of Things (IoT) may still be a relatively new concept for many consumers and professional service businesses, but the idea of machine-to-machine communication and connectivity is nothing new for industry. In fact, it’s been more than 50 years since the programmable logic controller (PLC) first became popular among industrial businesses as a means of controlling connected systems.

The principle behind the PLC is quite simple: see, think and do. The controller will ‘see’ what is happening in a process based on the input data from the connected devices and machines. The PLC then processes this input and computes whether any adjustments are required; if so, it signals these commands to the field devices. Traditionally, the range of field devices that could be controlled was limited, but recent developments in sensor technology have made specific components and resources much more measurable.

For example, if a water tank is almost at full capacity in a food processing plant, data from connected sensors can feed that information to a PLC. The PLC then sends the signal for the valve to close once the water volume exceeds a certain threshold, which prevents overflow. This is a simple control loop that sufficiently meets the need of the process.
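As a toy illustration of that see-think-do loop, here is the tank example reduced to a few lines of Python. It is not ladder logic or any vendor's API; the setpoints and simulated sensor are assumptions, and the two stand-in functions mark where real inputs and outputs would sit.

import random

HIGH_LEVEL = 0.95   # close the fill valve at 95% of tank capacity
LOW_LEVEL  = 0.80   # allow it to reopen below 80%

def read_level_sensor() -> float:
    """'See': stand-in for an analogue level input; here it is just a simulated reading."""
    return random.uniform(0.70, 1.00)

def command_valve(open_valve: bool) -> None:
    """'Do': stand-in for the digital output that drives the fill valve."""
    print("valve", "OPEN" if open_valve else "CLOSED")

def scan_cycle(valve_open: bool) -> bool:
    """One PLC-style scan: read inputs, evaluate logic, write outputs."""
    level = read_level_sensor()          # see
    if level >= HIGH_LEVEL:              # think: simple hysteresis between two thresholds
        valve_open = False
    elif level <= LOW_LEVEL:
        valve_open = True
    command_valve(valve_open)            # do
    return valve_open

valve_open = True
for _ in range(5):                       # a few simulated scans
    valve_open = scan_cycle(valve_open)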

Unfortunately, even as edge computing and PLC technology have advanced and offered more sophisticated data processing and control at the field level, many plant engineers continue to set up their devices in this way. In reality, modern edge devices and industrial PCs (IPCs) are capable of providing much greater control, as well as responding to external commands or variables that were previously beyond the scope of control systems.

The outer loop

While the idea of the Industrial IoT (IIoT) is predominantly a means of branding modern connectivity, the wider Industry 4.0 movement has brought with it some valuable advancements in edge and PLC technology. Among these advancements is the potential for on-premises automation and control systems to not only connect with local devices in an inner loop, but to draw from external sources: an outer loop.

The outer loop can take several forms, depending on what is most applicable or relevant to a process or operation.

For example, some more digitally mature businesses might have outer loops that feature an enterprise resource planning (ERP) system, supply chain management software or a wider manufacturing execution system (MES). These systems will share and receive relevant information or send required adjustments — such as due to raw material intake or low stock — to an edge device, which feeds into the inner loop. This allows industrial businesses to make use of more comprehensive data analysis than can be achieved in local data systems.

Alternatively, an outer loop could draw from data sources that are completely external to a plant’s operations. For example, a wind farm operator could use an outer loop that drew from sources of meteorological data for wind forecasts. This could inform the optimum pitch and yaw of a turbine, controlled by a field device.

Another example, and one that will resonate with many industrial businesses, is energy price. The cost of power from the electrical grid fluctuates throughout the day, which might mean that on-site generation — such as solar panels or heat recovery processes — become more economical during times of peak grid demand. An outer loop can communicate this data efficiently to the relevant systems in a business, and changes can then be enacted that allow the business to reduce energy costs.
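A minimal Python sketch of that energy-price example is shown below. The price feed and generation cost are hypothetical placeholders; the point is simply that an outer-loop input can be turned into a decision that the inner loop, or the relevant plant system, then acts on.

ON_SITE_GENERATION_COST = 0.11   # illustrative cost of on-site power, in GBP per kWh

def fetch_grid_price() -> float:
    """Outer-loop input: stand-in for a market or supplier price feed, in GBP per kWh."""
    return 0.28                  # hypothetical peak-period price

def choose_energy_source() -> str:
    """Simple decision passed down to the inner loop or local control system."""
    grid_price = fetch_grid_price()
    return "on-site generation" if grid_price > ON_SITE_GENERATION_COST else "grid"

print("Preferred source this period:", choose_energy_source())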

Establishing secure connection

Clearly, there is a benefit for industrial businesses to establish both inner and outer loops. However, there is one barrier to deployment that most engineers encounter: hardware limitations.

Traditional PLCs were designed in a rather utilitarian manner, to complete control functions effectively and in a straightforward way. This no-frills approach persists even with modern PLCs — even with today’s technical specifications, most PLCs are not designed to handle much more than a real-time operating system and some control applications.

Attempting to set up such a PLC to interact with an outer loop would either not work at all or severely hinder performance and risk failure.

Engineers can tackle this problem by introducing a separate gateway device that serves as an intermediary between the outer loop and the inner loop. However, this is a somewhat inelegant solution that requires investment in additional devices, which will require ongoing maintenance and introduce yet another device into already large system networks. Across an entire site, this quickly becomes costly and complicates network topologies.

A better solution is an unconventional one. It is possible to set up a modern automation controller in such a way that it breaks the conventions of PLCs, as long as the device is capable of multi-core processing at pace. From Novotek’s perspective, one of the best modern units that meets this need is Emerson Automation’s CPL410 automation controller.

The CPL410 can split inner and outer loop processing between its multiple processor cores. The inner loop and PLC processes can run from a single core, while another core — or even a group of cores, depending on complexity — can run more sophisticated containerised applications or operating systems. Additional cores can broker between the inner and outer loops, ensuring reliability and security.

A multi-core setup is useful because it allows the PLC processes and gateway to be consolidated into a single unit, without compromising performance capacity or speed. It also means that ageing or obsolete PLCs can be upgraded to a controller such as the CPL410 during any modernisation initiatives, minimising additional capital costs.
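The CPL410's runtime is not, of course, written or configured in Python; the short sketch below only illustrates the architectural idea just described. One worker stands in for the deterministic inner control loop, another for the outer-loop work, and a queue stands in for the brokering between them. Setpoints, timings and values are all assumptions.

import multiprocessing as mp
import time

def inner_loop(commands: mp.Queue) -> None:
    """Deterministic control scan, analogous to the core(s) dedicated to PLC logic."""
    setpoint = 50.0
    for _ in range(5):                     # a few simulated scans
        if not commands.empty():
            setpoint = commands.get()      # adjustment brokered in from the outer loop
        print(f"inner loop: controlling to setpoint {setpoint}")
        time.sleep(0.1)                    # fixed scan interval

def outer_loop(commands: mp.Queue) -> None:
    """Non-deterministic work, analogous to the core(s) running containerised applications."""
    time.sleep(0.25)                       # pretend an external data source has been consulted
    commands.put(47.5)                     # push a recommended setpoint towards the inner loop

if __name__ == "__main__":
    queue = mp.Queue()
    workers = [mp.Process(target=inner_loop, args=(queue,)),
               mp.Process(target=outer_loop, args=(queue,))]
    for worker in workers:
        worker.start()
    for worker in workers:
        worker.join()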

Although the idea behind the IoT is not a new one for industrial businesses, the fact that other sectors are embracing the idea means more external data points than ever before are available. With systems in place that can support effective inner and outer loops, industrial businesses can leverage the increased connectivity of external markets and enhance their own operations.

A recipe for lasting success https://ideashub.novotek.com/a-recipe-for-lasting-success/ Wed, 01 Sep 2021 11:03:50 +0000 https://ideashub.novotek.com/?p=2802 Few businesses routinely challenge every part of their organisation like food manufacturers. New technologies and digital transformation can help food manufacturers manage the constant change, but the traditional approach of comprehensive digitalisation planning is often not flexible enough to ensure success. Here, Sean Robinson, software solutions manager at food automation expert Novotek UK and Ireland, explains why the key ingredient for success in flexible food manufacturing is micro-applications.

Food production is truly a sector that operates under the mantra of “reinvent the everyday, every day”. The sector is constantly evolving, whether manufacturers are innovating new product ranges that meet changing consumer tastes or switching packaging materials to extend shelf-life or reduce waste. And these are just examples of substantial shifts; food manufacturers are also regularly making smaller changes by refining recipes, adapting processes or adjusting ingredient and material supply lines.

Despite — or perhaps because of — the environment of constant change, food processors can benefit more than many other manufacturers from carefully targeted use of data collection, visualisation and analysis solutions. After all, yesterday’s optimisation isn’t particularly optimal if today means a new stock-keeping unit (SKU), a new critical ingredient supplier or a new recipe.

The approach that many businesses take to becoming data-driven is to extensively map out their digitalisation journey, with each aspect comprehensively planned. This doesn’t generally support the flexibility needed in food manufacturing.

Rather than taking this approach, modern solutions make it possible to build or buy micro-applications that share common data infrastructure and even app-building or visualisation tools. This means that impactful new capabilities can be adopted through fast initial works that create re-usable building blocks. Later works then become incremental, rather than potentially having different systems creating overlapping capabilities.

Micro-apps in practice

We can see how this micro-app approach can be put into action by considering one of the most common challenges in food processing: managing the effect of variability in key ingredients, so that yields are maximised with minimal re-work or ingredient waste. It’s likely that a manufacturer would already have some of the information needed to address the challenge. The question is, how can you quickly supplement what’s in place?

It’s a safe bet that the factory has automation and maybe supervisory control and data acquisition (SCADA) systems, so there is an abundance of machine-generated data to tell us about the details of how processes are performing. Focussing more closely on yield performance, we can assume our manufacturer has a lab system where in-process and finished good tests give very clear indicators of how well a product is being made.

From Novotek’s experience, the most common gaps in tackling yield issues come from two areas. The first is supplier quality data, which is often provided either written down or in an electronic format that doesn’t mesh with existing systems. This makes analysis more difficult, because there’s no actual database to work from.

The second area is that the variations in raw materials that affect yields may actually be within the specifications defined for those materials. As such, there may not be an obvious fix. It’s likelier that material data needs to be analysed alongside several process performance and quality performance data points. Understanding the relationships between more than two or three variables will probably mean adding a new kind of analysis tool.

Micro-apps can be highly focussed on the core capabilities required. In this case, the micro-app would provide three core functions. First, it would provide a simple means to capture ingredient quality data as it’s received, into a system that also holds the specific material characteristic specifications and limits – all on a “by-lot” basis. It would also offer a machine learning tool that can help clarify how the range of material quality variation can be managed in relation to what machine settings or recipe adjustments might allow for good final yield and quality results.

Finally, the micro-app would be able to alert production staff to make recommended changes to a recipe or process as different raw material lots are staged for use – an automated monitor of yield/quality risk from material variation. This could be as simple as a new smart alarm sent back to existing SCADA, or a notification on a smartphone.
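As a rough illustration of the first and third of those functions, the Python sketch below captures a material lot against specification limits and raises a recommendation-style alert. The characteristic, the limits and the "learned" relationship are all hypothetical, and the one-line recommendation rule is only a stand-in for the machine-learning step described above.

from dataclasses import dataclass

@dataclass
class MaterialLot:
    lot_id: str
    supplier: str
    moisture_pct: float    # illustrative characteristic captured at goods-in

# Specification limits held alongside the by-lot data (hypothetical values).
MOISTURE_SPEC = {"min": 11.0, "max": 14.0, "nominal": 12.5}

def recommend_settings(lot: MaterialLot) -> dict:
    """Stand-in for the machine-learning step: map material variation to a process adjustment."""
    deviation = lot.moisture_pct - MOISTURE_SPEC["nominal"]
    return {"mixer_time_adjust_s": round(-8.0 * deviation, 1)}   # illustrative learned relationship

def receive_lot(lot: MaterialLot) -> None:
    """Capture a lot, check it against specification and alert staff with a recommendation."""
    if not MOISTURE_SPEC["min"] <= lot.moisture_pct <= MOISTURE_SPEC["max"]:
        print(f"ALERT: lot {lot.lot_id} from {lot.supplier} is outside specification - quarantine for review")
        return
    adjustment = recommend_settings(lot)
    # In practice this alert could surface as a smart alarm in SCADA or a smartphone notification.
    print(f"Lot {lot.lot_id} staged: recommend {adjustment} before use")

receive_lot(MaterialLot("L-2309", "SupplierA", 13.4))
receive_lot(MaterialLot("L-2310", "SupplierB", 14.6))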

Industrial software vendors are adapting their offers, in recognition of the trend towards micro-apps aimed at specific business processes. So, the software licensing needed to enable material data collection and quality specification monitoring on a key process would be built around a low user count and narrow set of underlying configuration and integration points, rather than a comprehensive plant-wide project. That can mean starting investments in the low thousands for software and some deployment work.

Some of Novotek’s customers are now progressing through projects defined by such very specific functional needs. Our job at Novotek is to ensure that any new solutions serve the purpose of being able to act as supplements to other such micro-apps in the future.

Next stages

A strategic advantage of micro-apps is that the planning and execution stages are less time-intensive than a far-reaching, plant-wide digitalisation project. Food engineers can do several things to begin reinventing their everyday processes. For example, food manufacturers can deploy predictive downtime applications on key processes. These are apps that can even take into consideration whether the products made have their own impact on failure modes.

Each micro-app reflects an opportunity to make the overall food manufacturing operation more adaptable. This means that innovation in products, processes and business models can be done, all the while knowing that refining and optimising the “new” won’t be held up by tools and practices that are too difficult to adapt from the “old”.

Free whitepaper: Enhancing data management in utilities https://ideashub.novotek.com/free-whitepaper-enhancing-data-management-in-utilities/ Fri, 20 Aug 2021 10:30:00 +0000 https://ideashub.novotek.com/?p=2748 Innovation has been one of the biggest focuses for utilities operators in recent years, particularly in the water market due to pressures from regulatory bodies. However, innovation is a broad term that offers no indication of the best and most impactful changes to implement.

The best approach may be to let the data dictate where to focus your innovation efforts. Or, if there’s a lack of useful data, then that itself may be the answer.

In this whitepaper, Novotek UK and Ireland explains how utilities operators can get to grips with data management to create an effective data-driven approach to innovation. Covering how to consolidate and modernise assets for data collection, how to make sense of utilities data and which method to use to get the most long-term value from data, the whitepaper is an invaluable resource for utilities operations managers and engineers.

Complete the form below to receive a copy of the whitepaper.

Free whitepaper: Introduction to industrial data https://ideashub.novotek.com/free-whitepaper-introduction-to-industrial-data/ Wed, 18 Aug 2021 17:59:00 +0000 https://ideashub.novotek.com/?p=2745 Data is the backbone of the modern industrial revolution happening around us. However, many business leaders do not know how to effectively manage their data or establish an industrial data strategy that will set them up for success.
In this whitepaper, Novotek UK and Ireland offers a guide to improving your data practices. The whitepaper covers how to develop field-level plans that align with business goals, why the context of data is imperative, how to manage large data quantities and what an effective data strategy looks like.

Complete the form below to receive a copy of the whitepaper.

Getting started with food digitalisation https://ideashub.novotek.com/getting-started-with-food-digitalisation/ Mon, 16 Aug 2021 11:09:00 +0000 https://ideashub.novotek.com/?p=2809 The food and beverage industry is one where innovation in product development or design can bring a significant competitive advantage. As such, it’s no surprise that food manufacturers are increasingly considering digitalisation of operations to augment adaptability, improve throughput and strengthen flexibility. Here, Sean Robinson, service leader at food automation software specialist Novotek UK and Ireland, explains how food manufacturers can plan digitalisation in the most effective way.

In the past 12 months, the food industry has been forced to re-evaluate and re-assess its operational priorities. For years, many manufacturers focussed on flexible production to enable diverse product lines and mass customisation, in line with shifting consumer demands. In 2020, this was forced to change, and production efficiency and operational adaptability became the focus. Once more, automation and digital technologies came to the forefront of food manufacturing priorities.

Digitalisation is a word that has been bandied around a lot in industrial markets for the past few years, serving as a catch-all phrase encompassing everything that generates, records and communicates data. Unfortunately, as with most amorphous phrases, this leads to confusion among managers about how to introduce these technologies, which causes costly errors in implementation, such as overlapping data collection systems or the introduction of technologies that do not serve a strategic purpose.

For food manufacturers at the beginning of their digitalisation journey, the first step is to define an agreed and important goal, from which the company can reverse-engineer a solution. Whether looking to deliver on a continuous improvement objective that has been identified as part of a formal process, or just illustrating the value of an engineering team unleashed with time to think, it’s key to let the desired improvement dictate what kind of digitalisation will be needed.

For example, if material costs are too high and the agreed goal is to reduce them, a digitalisation project should establish systems that identify the factors influencing this. Understanding the root causes for yield problems could require a combination of machine data, ambient condition data, quality or lab data and information about material quality provided by suppliers. Thinking through where data is readily available, versus where it’s trapped in paper, spreadsheets or isolated automation, will ensure the plan can deliver on the purpose.

Planning matters at the outset of investing in digitalisation, but some food manufacturers will undoubtedly have already rushed into digitalisation in years past. For businesses with some digital or automation technologies in place, one of the most valuable things to do is review the lay of the existing digital landscape. The easiest approach to doing this is to apply the ‘three Rs’ to your existing data: reduce systems overlap, reuse data and recycle data.

Reducing data collection system overlap not only makes it easier for managers to identify the source of a specific data set, it also streamlines costs. Why have a downtime system collecting machine event data, a yield analysis system collecting overlapping data and a work in progress tracking system that is separate to both of those? Having three systems collecting fundamentally the same data means duplicated configuration and deployment costs, as well as possible conflict over which one holds the ‘truth’.

An effective data and digitalisation strategy should also aim to use collected data in various calculations to produce several insights. For example, the downtime event data collected for OEE calculations may be part of what’s needed to solve a quality problem. The energy and water data collected for sustainability reporting may hold the key to real savings opportunities. Wherever there is a connection to a data source, managers should think of ways to make sure that a data point only needs to be collected once in order to be used many times.
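The Python sketch below makes the "collect once, use many times" point concrete with a small, hypothetical set of downtime event records: the same events feed the availability component of OEE and, without being re-collected, a quality investigation on a single SKU.

# One shared set of machine event records (hypothetical), collected once...
downtime_events = [
    {"line": "Line2", "reason": "jam",          "minutes": 12, "during_sku": "SKU-A"},
    {"line": "Line2", "reason": "seal failure", "minutes": 25, "during_sku": "SKU-B"},
    {"line": "Line2", "reason": "changeover",   "minutes": 18, "during_sku": "SKU-B"},
]
planned_minutes = 480   # one shift of planned production time

# ...used first for the availability component of OEE...
total_downtime = sum(event["minutes"] for event in downtime_events)
availability = (planned_minutes - total_downtime) / planned_minutes
print(f"Availability: {availability:.1%}")

# ...and reused, with no extra collection, to support a quality investigation on one SKU.
sku_b_stops = [event["reason"] for event in downtime_events if event["during_sku"] == "SKU-B"]
print("Events to review alongside SKU-B quality results:", sku_b_stops)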

Finally, offline analysis tools and some of the new analytics packages on the market could mean that old data offers recurring value as a firm chases finer and finer points of improvement. So, it’s important to set up a data management approach and data management platforms that can give you the option of making repeated use of data.

Digitalisation projects can lead to more innovative and effective ways of working for food manufacturers, but they rely on careful planning and strategic implementation. By giving full consideration to the goals to be reached or how data is used within a site, food businesses can ensure their systems are always effectively aligned with their goals.
