Are your PLCs an easy target? A mindset shift can significantly reduce PLC firmware vulnerabilities https://ideashub.novotek.com/are-your-plcs-an-easy-target-reduce-plc-firmware-vulnerabilities/ Thu, 25 Nov 2021 14:06:48 +0000 https://ideashub.novotek.com/?p=2917

Since the beginning of the COVID-19 pandemic, businesses across the UK have faced a surge in cybercrime. In fact, research indicates that UK businesses experienced one attempted cyberattack every 46 seconds on average in 2020. Industrial businesses are a prime target for hackers and the ramifications of a data breach or denial-of-service attack are far-reaching, making system security imperative. Here, David Evanson, corporate vendor relationship manager at Novotek UK and Ireland, explains how industrial businesses can keep their vital systems secure.

For many business leaders and engineers, it is still tempting to assume that large multinational companies or data-rich digital service providers are the prime targets for hackers. However, the growing volume of cyberattacks on businesses globally shows that any company can be the target of malicious attacks on its systems and services.

According to research by internet service provider Beaming, UK businesses each faced an average of 686,961 attempted system breaches in 2020, a 20 per cent increase on 2019. Of these attacks, Beaming noted that one in ten aimed to gain control of an Internet of Things (IoT) device — something that indicates a tendency to target system continuity rather than conventional data.

Both factors together are cause for alarm among industrial businesses of all sizes. Hackers are targeting all manner of companies, from start-ups to global organisations, and focussing more on the growing number of internet-connected devices and systems that were previously isolated.

The consequences of a compromised device range from data extraction to service shutdown, and in either case the financial and production impact on an industrial business is significant. There is no single quick fix to bolster cybersecurity because of the varying types of attack that can take place. Some cyberattacks are complex and sophisticated; others less so. Many attacks on devices fall into the latter category, which means there are practical steps industrial businesses can take to minimise risk.

Novotek has been working closely with industrial businesses in the UK and Ireland for decades. One common thing that we have observed with automation hardware and software is that many engineers do not regularly upgrade software or firmware. Instead, there is a tendency to view automation as a one-off, fit-and-forget purchase. The hardware may be physically maintained on a regular schedule, but the invisible software aspect is often neglected.

[Image: GE Fanuc Series 90-30 PLC]

Older firmware is more susceptible to hacks because it often contains unpatched known security vulnerabilities, such as weak authentication algorithms, obsolete encryption technologies or backdoors for unauthorised access. For a programmable logic controller (PLC), older firmware versions make it possible for cyber attackers to change the module state to halt-mode, resulting in a denial-of-service that stops production or prevents critical processes from running.

PLC manufacturers routinely update firmware to ensure it is robust and secure in the face of the changing cyber landscape, but there is not always a set interval between these updates.

In some cases, updates are released in the days or weeks following the discovery of a vulnerability — whether by the manufacturer, white-hat hackers or genuine attackers — to minimise end-user risk. The release notes for each firmware version should outline any exploits that have been fixed.

However, it’s important to note that legacy PLCs may no longer receive firmware updates from the manufacturer if the system has reached obsolescence. Many engineers opt to air-gap older PLCs to minimise the cybersecurity risk, but the lack of firmware support can also create interoperability issues with connected devices. Another part of the network, such as a switch, receiving an update can cause communications and compatibility issues with PLCs running on older versions — yet another reason why systems should run on the most recent software patches.

At this stage, engineers should invest in a more modern PLC to minimise risk — and, due to the rate of advancement of PLCs in recent years, likely benefit from greater functionality at the same time.
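
Knowing where to start requires an up-to-date picture of the installed base. The sketch below is a minimal, hypothetical firmware audit in Python: the device list, model names, version numbers and support status are invented for illustration and would in practice come from a plant's own asset register and each vendor's release notes.

```python
# Minimal firmware audit sketch with hypothetical data, for illustration only.
# A real inventory would come from an asset register and vendor release notes.

def parse_version(v: str) -> tuple:
    """Turn a dotted version string like '8.95' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

# Hypothetical plant inventory: installed firmware per PLC.
inventory = [
    {"plc": "Line 1 filler",    "model": "PAC-A",  "installed": "8.60", "supported": True},
    {"plc": "Line 2 palletiser", "model": "PAC-A",  "installed": "8.95", "supported": True},
    {"plc": "Boiler house",     "model": "LEG-90", "installed": "4.10", "supported": False},
]

# Hypothetical latest patched firmware per model, taken from vendor release notes.
latest_patched = {"PAC-A": "8.95", "LEG-90": "4.10"}

for device in inventory:
    latest = latest_patched[device["model"]]
    if not device["supported"]:
        print(f"{device['plc']}: obsolete model, no further firmware updates; "
              "consider air-gapping or replacement")
    elif parse_version(device["installed"]) < parse_version(latest):
        print(f"{device['plc']}: firmware {device['installed']} is behind patched "
              f"release {latest}; schedule an upgrade window")
    else:
        print(f"{device['plc']}: up to date ({device['installed']})")
```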

Firmware vulnerabilities are unavoidable, regardless of the quality of the PLC. At Novotek, we provide extensive support for the Emerson PACSystems products that we supply to businesses in the UK and Ireland. This involves not only support with firmware updates as they become available, but also guidance on wider system resilience to ensure that businesses are as safe as possible from hardware vulnerabilities.

The growth in cyberattacks will continue long beyond the end of the COVID-19 pandemic, and infrastructure and automation are increasingly becoming targets. It may seem a simple step, but taking the same upgrade approach to firmware that we take with conventional computers can help engineers to secure their operations and keep systems running safely.

Bridging the connectivity gap https://ideashub.novotek.com/bridging-the-connectivity-gap/ Mon, 06 Sep 2021 10:18:03 +0000 https://ideashub.novotek.com/?p=2860

In the age of connectivity, there is no shortage of useful information that engineers can leverage to optimise and improve operations. Everything from the speed of motors to the weather forecast can influence production. However, bringing these data sources together in a secure way is a challenge faced by many engineers. Here, George Walker, managing director of Novotek UK and Ireland, explains how engineers can bridge the gap between local process data and external data sources.

The Internet of Things (IoT) may still be a relatively new concept for many consumers and professional service businesses, but the idea of machine-to-machine communication and connectivity is nothing new for industry. In fact, it’s been more than 50 years since the programmable logic controller (PLC) first became popular among industrial businesses as a means of controlling connected systems.

The principle behind the PLC is quite simple: see, think and do. The controller 'sees' what is happening in a process based on the input data from connected devices and machines. It then processes this input, computes whether any adjustments are required and, if so, signals the corresponding commands to the field devices. Traditionally, the range of field devices that could be controlled was limited, but recent developments in sensor technology have made specific components and resources much more measurable.

For example, if a water tank is almost at full capacity in a food processing plant, data from connected sensors can feed that information to a PLC. The PLC then sends the signal for the valve to close once the water volume exceeds a certain threshold, which prevents overflow. This is a simple control loop that sufficiently meets the need of the process.
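
Expressed as code, an inner loop of this kind is only a few lines. The sketch below is a simplified Python illustration of the see-think-do cycle for the tank example; the threshold, sensor values and I/O functions are placeholders, since a real implementation would run in the PLC's own ladder logic or function-block language.

```python
import itertools
import time

HIGH_LEVEL_THRESHOLD = 0.95          # fraction of tank capacity (placeholder value)
_simulated_levels = itertools.cycle([0.80, 0.90, 0.96, 0.97, 0.93])

def read_level_sensor() -> float:
    """'See' step: read the tank level transmitter (simulated here)."""
    return next(_simulated_levels)

def write_valve_command(open_valve: bool) -> None:
    """'Do' step: drive the inlet valve output (printed here)."""
    print("inlet valve", "OPEN" if open_valve else "CLOSED")

for _ in range(5):                    # a real control loop would run continuously
    level = read_level_sensor()                   # see
    valve_open = level < HIGH_LEVEL_THRESHOLD     # think: close above the threshold
    write_valve_command(valve_open)               # do
    time.sleep(0.1)                               # scan interval
```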

Unfortunately, even as edge computing and PLC technology have advanced to offer more sophisticated data processing and control at the field level, many plant engineers continue to set up their devices in this way. In reality, modern edge devices and industrial PCs (IPCs) are capable of providing much greater control, as well as responding to external commands or variables that were previously beyond the scope of control systems.

The outer loop

While the idea of the Industrial IoT (IIoT) is predominantly a means of branding modern connectivity, the wider Industry 4.0 movement has brought with it some valuable advancements in edge and PLC technology. Among these is the potential for on-premises automation and control systems not only to connect with local devices in an inner loop, but also to draw from external sources: an outer loop.

The outer loop can take several forms, depending on what is most applicable or relevant to a process or operation.

For example, some more digitally mature businesses might have outer loops that feature an enterprise resource planning (ERP) system, supply chain management software or a wider manufacturing execution system (MES). These systems share and receive relevant information, or send required adjustments — such as those prompted by raw material intake or low stock — to an edge device, which feeds into the inner loop. This allows industrial businesses to make use of more comprehensive data analysis than can be achieved in local data systems.

Alternatively, an outer loop could draw from data sources that are completely external to a plant’s operations. For example, a wind farm operator could use an outer loop that drew from sources of meteorological data for wind forecasts. This could inform the optimum pitch and yaw of a turbine, controlled by a field device.

Another example, and one that will resonate with many industrial businesses, is energy price. The cost of power from the electrical grid fluctuates throughout the day, which might mean that on-site generation — such as solar panels or heat recovery processes — become more economical during times of peak grid demand. An outer loop can communicate this data efficiently to the relevant systems in a business, and changes can then be enacted that allow the business to reduce energy costs.
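
As an illustration, the sketch below shows what such an energy-price outer loop might look like in Python. The price feed URL, payload field, tariff threshold and setpoint function are all hypothetical; a real deployment would use the grid operator's or supplier's actual API and write to the control system through an industrial protocol such as OPC UA or MQTT.

```python
import requests  # third-party HTTP library

PRICE_FEED_URL = "https://example.com/api/grid-price"  # hypothetical endpoint
GRID_PRICE_LIMIT = 120.0  # EUR/MWh above which on-site generation is preferred

def fetch_grid_price() -> float:
    """Outer loop: poll an external electricity price feed."""
    response = requests.get(PRICE_FEED_URL, timeout=10)
    response.raise_for_status()
    return float(response.json()["price_eur_mwh"])  # hypothetical payload field

def set_generation_source(use_onsite: bool) -> None:
    """Inner-loop handoff: in practice a setpoint write via OPC UA or MQTT."""
    print("Switching to", "on-site generation" if use_onsite else "grid supply")

if __name__ == "__main__":
    price = fetch_grid_price()
    set_generation_source(use_onsite=price > GRID_PRICE_LIMIT)
```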

Establishing secure connection

Clearly, there is a benefit for industrial businesses to establish both inner and outer loops. However, there is one barrier to deployment that most engineers encounter: hardware limitations.

Traditional PLCs were designed in a rather utilitarian manner: to complete control functions effectively and in a straightforward way. This no-frills approach persists even with modern PLCs — despite today's technical specifications, most PLCs struggle to handle much more than a real-time operating system and some control applications.

Attempting to set up such a PLC to interact with an outer loop would either not work at all or severely hinder performance and risk failure.

Engineers can tackle this problem by introducing a separate gateway device that serves as an intermediary between the outer loop and the inner loop. However, this is a somewhat inelegant solution: it requires investment in additional hardware, which needs ongoing maintenance and adds yet another device to already large system networks. Across an entire site, this quickly becomes costly and complicates network topologies.

A better solution is an unconventional one. It is possible to set up a modern automation controller in such a way that it breaks the conventions of PLCs, as long as the device is capable of multi-core processing at pace. From Novotek’s perspective, one of the best modern units that meet this need is Emerson Automation’s CPL410 automation controller.

The CPL410 can split inner and outer loop processing between its multiple processor cores. The inner loop and PLC processes can run from a single core, while another core — or even a group of cores, depending on complexity — can run more sophisticated containerised applications or operating systems. Additional cores can broker between the inner and outer loops, ensuring reliability and security.
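
To give a feel for the brokering role, the sketch below shows in Python how an outer-loop application running in its own container might validate and clamp external data before anything is passed to the inner loop. The payload format, limits and transport are hypothetical, with simulated messages standing in for a real MQTT or OPC UA connection, and this is an illustrative pattern rather than Emerson's implementation.

```python
import json

SETPOINT_MIN, SETPOINT_MAX = 0.0, 100.0   # clamp range protecting the inner loop

def broker_outer_to_inner(raw_payload: bytes) -> None:
    """Validate and clamp outer-loop data before it can reach the inner loop."""
    try:
        setpoint = float(json.loads(raw_payload)["setpoint"])
    except (KeyError, TypeError, ValueError):
        print("Rejected malformed outer-loop message")
        return
    clamped = max(SETPOINT_MIN, min(SETPOINT_MAX, setpoint))
    # In practice this write would go to the PLC runtime on another core,
    # for example over OPC UA or a shared-memory interface.
    print(f"Forwarding clamped setpoint {clamped} to inner loop")

# Simulated messages arriving from the outer loop (e.g. via MQTT or HTTPS).
for payload in (b'{"setpoint": 42.5}', b'{"setpoint": 250}', b'not json'):
    broker_outer_to_inner(payload)
```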

A multi-core setup is useful because it allows the PLC processes and gateway to be consolidated into a single unit, without compromising performance capacity or speed. It also means that ageing or obsolete PLCs can be upgraded to a controller such as the CPL410 during any modernisation initiatives, minimising additional capital costs.

Although the idea behind the IoT is not a new one for industrial businesses, the fact that other sectors are embracing the idea means more external data points than ever before are available. With systems in place that can support effective inner and outer loops, industrial businesses can leverage the increased connectivity of external markets and enhance their own operations.

A recipe for lasting success https://ideashub.novotek.com/a-recipe-for-lasting-success/ Wed, 01 Sep 2021 11:03:50 +0000 https://ideashub.novotek.com/?p=2802 Few businesses routinely challenge every part of their organisation like food manufacturers. New technologies and digital transformation can help food manufacturers manage the constant change, but the traditional approach of comprehensive digitalisation planning is often not flexible enough to ensure success. Here, Sean Robinson, software solutions manager at food automation expert Novotek UK and Ireland, explains why micro-applications are the key ingredient for success in flexible food manufacturing.

Food production is truly a sector that operates under the mantra of "reinvent the everyday, every day". The sector is constantly evolving, whether manufacturers are innovating new product ranges that meet changing consumer tastes or switching packaging materials to extend shelf-life or reduce waste. And these are just examples of the substantial shifts; food manufacturers are also regularly making smaller changes by refining recipes, adapting processes or adjusting ingredient and material supply lines.

Despite — or perhaps because of — the environment of constant change, food processors can benefit more than many other manufacturers from carefully targeted use of data collection, visualisation and analysis solutions. After all, yesterday’s optimisation isn’t particularly optimal if today means a new stock-keeping unit (SKU), a new critical ingredient supplier or a new recipe.

The approach that many businesses take to becoming data-driven is to extensively map out their digitalisation journey, with each aspect comprehensively planned. This doesn’t generally support the flexibility needed in food manufacturing.

Rather than taking this approach, modern solutions make it possible to build or buy micro-applications that share common data infrastructure and even app-building or visualisation tools. This means that impactful new capabilities can be adopted through fast initial works that create re-usable building blocks. Later works then become incremental, rather than potentially having different systems creating overlapping capabilities.

Micro-apps in practice

We can see how this micro-app approach can be put into action by considering one of the most common challenges in food processing: managing the effect of variability in key ingredients, so that yields are maximised with minimal re-work or ingredient waste. It’s likely that a manufacturer would already have some of the information needed to address the challenge. The question is, how can you quickly supplement what’s in place?

It’s a safe bet that the factory has automation and maybe supervisory control and data acquisition (SCADA) systems, so there is an abundance of machine-generated data to tell us about the details of how processes are performing. Focussing more closely on yield performance, we can assume our manufacturer has a lab system where in-process and finished good tests give very clear indicators of how well a product is being made.

From Novotek's experience, the most common gaps in tackling yield issues come from two areas. The first is supplier quality data, which is often provided either on paper or in an electronic format that doesn't mesh with existing systems. This makes analysis more difficult, because there's no actual database to work from.

The second area is that the variations in raw materials that affect yields may actually be within the specifications defined for those materials. As such, there may not be an obvious fix. It’s likelier that material data needs to be analysed alongside several process performance and quality performance data points. Understanding the relationships between more than two or three variables will probably mean adding a new kind of analysis tool.

Micro-apps can be highly focussed on the core capabilities required. In this case, the micro-app would provide three core functions. First, it would provide a simple means to capture ingredient quality data as it’s received, into a system that also holds the specific material characteristic specifications and limits – all on a “by-lot” basis. It would also offer a machine learning tool that can help clarify how the range of material quality variation can be managed in relation to what machine settings or recipe adjustments might allow for good final yield and quality results.

Finally, the micro-app would be able to alert production staff to make recommended changes to a recipe or process as different raw material lots are staged for use – an automated monitor of yield/quality risk from material variation. This could be as simple as a new smart alarm sent back to existing SCADA, or a notification on a smartphone.
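
To make this concrete, the sketch below outlines such a micro-app in Python using pandas and scikit-learn. Everything in it is hypothetical: the lot attributes, the specification limits, the machine setting and the yield figures are invented for illustration. It simply shows the three functions described above in miniature: capturing by-lot quality data against specifications, learning the relationship between material variation, settings and yield, and raising a recommendation when a new lot is staged.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# 1. Capture ingredient quality data by lot, alongside spec limits (all hypothetical).
SPEC = {"moisture_pct": (10.0, 14.0), "protein_pct": (11.0, 13.5)}
history = pd.DataFrame({
    "lot":          ["A101", "A102", "A103", "A104", "A105", "A106"],
    "moisture_pct": [10.5,   12.0,   13.8,   11.2,   13.1,   10.9],
    "protein_pct":  [12.1,   11.4,   13.0,   12.8,   11.9,   12.4],
    "mixer_speed":  [55,     60,     50,     58,     52,     57],   # setting used
    "yield_pct":    [96.2,   94.8,   91.5,   95.9,   93.0,   96.0], # result achieved
})

# 2. Learn how material variation and machine settings relate to final yield.
features = ["moisture_pct", "protein_pct", "mixer_speed"]
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(history[features], history["yield_pct"])

# 3. When a new lot is staged, recommend the setting predicted to give the best yield.
new_lot = {"lot": "A107", "moisture_pct": 13.5, "protein_pct": 11.6}
for name, (low, high) in SPEC.items():
    assert low <= new_lot[name] <= high, f"{name} outside specification"

candidates = pd.DataFrame([{**new_lot, "mixer_speed": s} for s in range(50, 61)])
candidates["predicted_yield"] = model.predict(candidates[features])
best = candidates.loc[candidates["predicted_yield"].idxmax()]
print(f"Lot {new_lot['lot']}: recommend mixer speed {best['mixer_speed']:.0f} "
      f"(predicted yield {best['predicted_yield']:.1f}%), alert operators via SCADA")
```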

Industrial software vendors are adapting their offers, in recognition of the trend towards micro-apps aimed at specific business processes. So, the software licensing needed to enable material data collection and quality specification monitoring on a key process would be built around a low user count and narrow set of underlying configuration and integration points, rather than a comprehensive plant-wide project. That can mean starting investments in the low thousands for software and some deployment work.

Some of Novotek's customers are now progressing through projects defined by such very specific functional needs. Our job at Novotek is to ensure that any new solution can act as a supplement to other such micro-apps in the future.

Next stages

A strategic advantage of micro-apps is that the planning and execution stages are less time-intensive than a far-reaching, plant-wide digitalisation project. Food engineers can do several things to begin reinventing their everyday processes. For example, food manufacturers can deploy predictive downtime applications on key processes. These apps can even take into account whether the products being made have their own impact on failure modes.

Each micro-app reflects an opportunity to make the overall food manufacturing operation more adaptable. This means that innovation in products, processes and business models can be done, all the while knowing that refining and optimising the “new” won’t be held up by tools and practices that are too difficult to adapt from the “old”.

Free whitepaper: Enhancing data management in utilities https://ideashub.novotek.com/free-whitepaper-enhancing-data-management-in-utilities/ Fri, 20 Aug 2021 10:30:00 +0000 https://ideashub.novotek.com/?p=2748 Innovation has been one of the biggest focuses for utilities operators in recent years, particularly in the water market due to pressures from regulatory bodies. However, innovation is a broad term that offers no indication of the best and most impactful changes to implement.

The best approach may be to let the data dictate where to focus your innovation efforts. Or, if there’s a lack of useful data, then that itself may be the answer.

In this whitepaper, Novotek UK and Ireland explains how utilities operators can get to grips with data management to create an effective data-driven approach to innovation. Covering how to consolidate and modernise assets for data collection, how to make sense of utilities data and which method to use to get the most long-term value from data, the whitepaper is an invaluable resource for utilities operations managers and engineers.

Complete the form below to receive a copy of the whitepaper.

Can your IPC handle the heat? https://ideashub.novotek.com/can-your-ipc-handle-the-heat/ Mon, 05 Jul 2021 10:55:00 +0000 https://ideashub.novotek.com/?p=2667 High operating temperatures are commonplace in the industrial sector, whether it’s the elevated ambient temperature of oil and gas refining or the continuous operations with reduced airflow of heavy machinery. These high temperatures pose a common problem for the performance of industrial electronics, particularly industrial PCs (IPCs). Here, David Evanson, corporate vendor relationship manager at Novotek UK and Ireland, explains how engineers and managers can ensure their IPCs can handle the heat.

It’s no secret that IPCs play an essential role in modern industrial operations. These vital units undertake a range of tasks, from managing equipment performance data to motion and automated system control. It’s therefore no surprise that the IPC market continues to go from strength to strength. In fact, ResearchAndMarkets forecasts that the global IPC market will grow at a compound annual growth rate (CAGR) of 6.45 per cent to be valued at $7.756 billion by 2026.

IPCs feature prominently on the factory floor, generally either in control cabinets or mounted onto machinery. Being on the frontline means that engineers and plant managers know that, as a minimum, they need to specify ruggedised IPCs for their operations. What sometimes gets overlooked, however, is the operating temperature range of an IPC unit.

Electronic circuits and computing components are highly susceptible to extreme temperatures, be they high or low. At high temperatures, components can deteriorate faster. In the case of IPCs, modern CPUs are designed to prevent accelerated component deterioration by throttling their processing performance. This succeeds in reducing the heat produced in processing circuits, but it means that processes running on the IPC run slowly or become unresponsive — not ideal for real-time control applications.
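
Throttling of this kind can be observed directly on many systems. The short Python sketch below uses the psutil library on a Linux machine, where temperature sensors are exposed, to log the hottest sensor reading alongside the current CPU clock frequency; a falling frequency at high temperature points to thermal throttling. Sensor availability varies by platform, and the 85 degree warning level shown here is just an illustrative figure, not a vendor specification.

```python
import time
import psutil  # third-party library; temperature sensors are platform-dependent

WARN_TEMP_C = 85.0  # illustrative threshold, not a vendor specification

for _ in range(10):                        # sample for a short period
    freq = psutil.cpu_freq()               # current/min/max CPU frequency in MHz
    temps = psutil.sensors_temperatures()  # dict of sensor readings (may be empty)
    readings = [t.current for entries in temps.values() for t in entries]
    hottest = max(readings) if readings else float("nan")

    current_mhz = freq.current if freq else float("nan")
    max_mhz = freq.max if freq else float("nan")
    throttling = bool(readings) and hottest >= WARN_TEMP_C and current_mhz < max_mhz
    flag = "  <-- possible thermal throttling" if throttling else ""
    print(f"{hottest:6.1f} C  {current_mhz:7.0f} MHz{flag}")
    time.sleep(1)
```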

In certain markets, considering operating temperature range is second nature for engineers. For example, an IPC tasked with controlling or collecting data from a welding robot will be specified to withstand high temperatures.

However, temperature should be a consideration even in unassuming industrial environments. If an IPC is situated outside, the exposure to sunlight — alongside reduced airflow in an installation — can cause an increase in temperature that can reach up to 70 degrees Celsius. Both Novotek and its partner Emerson Automation have encountered industrial businesses that have experienced this problem.

Of course, the solution to the challenge of overheating in IPCs is to specify a unit that offers good thermal performance across an extended operating temperature range. Unfortunately, not all IPCs that claim this capability are tested in conditions that accurately reflect real-world operation, which can lead to some IPCs failing when deployed in the field.

When extended-temperature IPCs fail in the field, it is often because of the way the testing was undertaken. In many cases, the IPC is tested in a thermal chamber with significant forced airflow, which reduces the efficacy and accuracy of the test itself. A more effective approach is for the IPC manufacturer to block the airflow, which simulates a realistic use condition in a cabinet environment.

Emerson Automation conducts its tests under these restricted airflow conditions, which allows it to accurately demonstrate that its IPCs can perform at high temperatures without throttling processing performance. The company has even shared a video of its IPC thermal testing process, highlighting the capabilities of its RXi2-BP.

For this reason, Emerson’s IPCs are the go-to option from Novotek’s perspective: they ensure reliable and consistent operation in demanding environmental conditions.

With IPCs playing such a vital role in modern industry, it’s important that they are up to the task not only in terms of computing capacity, but also environmental performance. When plant managers and engineers can specify an IPC with assurances of the accuracy of thermal testing, it provides peace of mind that the unit can handle the heat.

What can Industrial PCs and displays handle? https://ideashub.novotek.com/what-can-industrial-pcs-and-displays-handle/ Fri, 14 May 2021 08:30:00 +0000 https://ideashub.novotek.com/?p=2656

Industrial PCs (IPCs) and displays are routinely used in harsh environments. Even in less extreme industrial environments, IPCs and displays are often subject to heavy handling or accidental damage. So, it’s important to know what your IPCs and displays can (and can’t) handle.

There are many cases where engineers might overlook the need for robustness and durability in IPCs and displays, particularly in environments where there is unlikely to be any extreme shock, vibrations or impacts.

However, things happen in the day-to-day activities on the factory floor that can impact IPCs and displays, and it’s important to ensure systems remain operational and intact in these cases. Even if a system doesn’t need to have an extended operating temperature, you also don’t want it to temporarily stop functioning if somebody accidentally knocks or scratches it!

Summary:
  • Even in traditional industrial environments, IPCs and displays need to be robust
  • Testing systems for physical damage or performance disruptions is invaluable

Transcript:

Hi, my name is Gene Juknevicius. I am a solution architect at Emerson. At Emerson, we know that customers need reliable industrial PCs and displays to run their applications. So let’s see how robust Emerson’s industrial PC and display really are.

So here we have our RXi2-BP industrial PC and two RXi monitors. By the way, let me show you an interesting feature of the DisplayPort interface that our industrial PCs and monitors support: Multi-Stream Transport. It allows us to daisy-chain multiple independent monitors to a single DisplayPort output on an industrial PC.

In this case, I’ll be showing video on one display and performance statistics on the other display. 

So, let’s see how robust the industrial PC is. For the test I’m going to drop it on the floor while operating. We’re going to watch the video to make sure that there are no interruptions. So for the test, let’s see how high the table is – it’s about 75 centimetres. We obviously need to make sure that the cables are long enough. 

And, please, don’t do this test at home — it requires a certified IPC dropper. Ready? 

Look at the video. It keeps running. As you can see, the video did not even flicker. This is exactly what we want.  

Okay, so industrial PCs are robust enough. What about our displays? When visiting some of our oil and gas customers, I was told that people in the field tend to use large screwdrivers as a stylus for the touch screen.  In our case I have this large wrench. Let’s try to use that.  

Okay, what about trying to use a display as a cutting board?  

As you can see, neither the large wrench nor a sharp cutter left any scratches on a display.  

Our goal is to deliver solutions that you can deploy anywhere and not worry if they will operate, so that you can focus on your application. 

Obsolescence in pharma automation https://ideashub.novotek.com/obsolescence-in-pharma-automation/ Thu, 21 Jan 2021 10:14:38 +0000 http://ideashub.novotek.com/?p=1966 Despite the life sciences being an industry where precision and quality are vital, pharmaceutical companies have been slow to upgrade their process control automation systems. Here, Sean Robinson, service leader at Novotek UK and Ireland, explains why, as many paper-under-glass data handling systems begin reaching obsolescence, it is time for pharmaceutical companies to upgrade their process control automation without affecting FDA/EU compliance.

Broadly speaking, pharma companies have made little progress in modernising their data collection practices. This is largely due to fears that changes in data handling will affect compliance with regulations laid out by the Food and Drug Administration (FDA) or EU governing bodies. Yet modernisation, through the increased flexibility and efficiency it can bring, can better equip companies to face increased competitive pressure from patent expiry on flagship products, allow more rapid scaling of product and packaging variants, and help address the increasing presence of counterfeit products.

The evolution of Good Manufacturing Practices (GMP – later with an added A for Automated) has been critical in developing and maintaining consumer trust in pharmaceutical products. In practice, though, it has led to conservatism in the way processes are developed and launched, and rigidity in the way data is collected in relation to production activities. The last generational rollout of automation in the pharma sector occurred during the early 2000s, and saw the entrenchment of minimalist approaches to use of tracking systems alongside core automation. While other industries adopted plant IT systems that made extensive use of automated data flows to support performance analysis and continuous improvement efforts, life sciences companies effectively reproduced their old paper tracking systems in electronic format.

Such “paper-under-glass” systems have tended to be very limited in functional scope and often address only digitising the logging of data directly needed to provide batch reporting supporting the release of product into the marketplace. To this day, many pharmaceutical companies are reluctant to update their paper-under-glass approach due to a misconception that any data collected automatically effectively becomes part of their official batch record regimen, and must therefore be managed in strict accordance with regulations such as the US FDA’s 21 CFR Part 11. While regulators (the FDA in particular) have gone to great lengths to clarify the way such regulations should be interpreted, this initial confusion has had a lasting effect.

Electronic Signature Regulations (ESRs) such as 21 CFR Part 11 outline the requirements for a combination of secure system configuration capabilities, data logging/transactional auditing capabilities and physical data security management. These rules are in place to allow companies to record data with fidelity and prevent data being lost.

What is misunderstood here is that data not needed for the batch record does not need to comply with ESRs. Therefore, managing records used to understand performance improvement opportunities in manufacturing assets and processes does not need to be complex and time-consuming.

Paper-under-glass also often leads to ineffective evaluation of batch quality because the detail in electronic batch records is inadequate to run root cause analysis of quality issues that arise. In the event of a machine breaking down, paper-under-glass systems will not give useful insight into what has caused the malfunction, and companies lose productivity. By limiting the footprint of what data is collected automatically, pharma companies deprive themselves of the rich detailed asset data that would correlate to asset events such as downtime, or to quality events where the underlying issue may be driven by a combination of asset health information, process execution information and material characteristic information. While a paper-under-glass system may help log what is happening, easy access to these additional data sets is crucial to quickly understanding the “why”.
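
As a simple illustration of how that "why" analysis works once the extra data is collected, the Python sketch below joins quality and downtime events against the nearest preceding asset and process readings using pandas. The timestamps, tags and values are entirely invented; the point is only that non-batch-record data, once logged automatically, can be correlated with events in a few lines rather than by leafing through paper records.

```python
import pandas as pd

# Hypothetical high-frequency asset/process data (would come from a historian).
process = pd.DataFrame({
    "timestamp": pd.to_datetime(
        ["2021-01-15 08:00", "2021-01-15 08:05", "2021-01-15 08:10", "2021-01-15 08:15"]),
    "granulator_torque_nm": [41.0, 44.5, 52.3, 58.9],
    "inlet_air_temp_c":     [60.1, 60.4, 63.8, 66.2],
})

# Hypothetical quality/downtime events (would come from the batch or event log).
events = pd.DataFrame({
    "timestamp": pd.to_datetime(["2021-01-15 08:12", "2021-01-15 08:17"]),
    "event": ["out-of-spec granule size", "granulator trip"],
})

# Attach the most recent process reading before each event (both frames sorted by time).
correlated = pd.merge_asof(
    events.sort_values("timestamp"),
    process.sort_values("timestamp"),
    on="timestamp",
    direction="backward",
    tolerance=pd.Timedelta("10min"),
)
print(correlated)
```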

Fortunately, pharmaceutical manufacturers today have clearer insight than they did two decades ago into how automated systems can comply with ESRs and GAMPs. Now, with many of these systems due to be approaching obsolescence in the coming years, it’s time for pharmaceutical businesses to consider how they can adopt automation and plant IT that supports improvements in all aspects of production without impacting regulatory compliance.

Novotek’s experience in the sector has shown that there are a number of ways to deploy solutions, ranging from SCADA systems with batch execution footprints, to historians, to manufacturing execution systems. For each of these there are well-understood approaches where the configuration ensures that compliance-critical data is logged in appropriately controlled ways, while easing the collection of separate data for root cause and improvement analysis. Best-in-class pharmaceutical companies typically leverage a suite of solutions providing a mix of execution support, track and trace, and detailed end-to-end visibility into the production process and assets. From that, the mix of capabilities extends to include:

  • Eased proof of quality and compliance
  • Faster and deeper understanding of root causes of quality events and productivity losses
  • Simpler deployment of anti-counterfeiting measures (as the mix of modern IT and automation solutions opens up better methods than some of the bolt-ons built around now-obsolete automation and ERP platforms)
  • Greater adaptability of production lines to run a broader range of qualified products

As conditions in the life sciences marketplace become tougher for producers, we at Novotek believe that there is an opportunity for firms to rethink the role of production technology – to see it as a source of competitive advantage. And the generational change in systems now underway due to obsolescence means now is the time for pharmaceutical businesses to seize that advantage.

Modern traceability for the food industry https://ideashub.novotek.com/modern-traceability-for-the-food-industry/ Mon, 11 Jan 2021 10:34:00 +0000 http://ideashub.novotek.com/?p=2237 Food scandals are like garden weeds. Every year, no matter how much you try to stop them, they keep coming back. While some food scandals may be out of the hands of manufacturers, having the right traceability systems in place can help to minimise the effect of scandals. Here, George Walker, managing director of Novotek UK and Ireland, looks at why effective traceability systems are a worthy investment.

Currently, many food and beverage companies rely solely on manual paper checks for their traceability procedures. Data is collected manually by workers and then collated. The files are then stored in offices and must be sorted through in the event of a regulatory inspection or a recall.

Manually collecting vital information like this can affect both the accuracy of the information and the speed at which it can be gathered. However, European companies in the food and beverage industry are bound by laws that mean their traceability procedures must be extremely tight to avoid prosecution or fines.

All countries in the EU are bound by the General Food Law of 2002 to ensure that food safety is maintained for all European consumers. The law defines traceability as the ability to trace and follow food, feed and ingredients through all stages of production, processing and distribution.

When food or feed is found to be unsafe, companies are under an obligation to withdraw or recall the product. They must also notify the national authorities, who can then monitor where the product may have spread to and whether further action should be taken. Otherwise, a contaminated feed could have wide-reaching consequences across the European food supply chain.

To comply with this regulation, companies must have traceability systems in place that allow them to identify where their products have come from and where they are going to. They must do this quickly to reduce the spread of contaminated product and the amount of product recalled, affecting the supply chain.

There are strict traceability regulations for all EU food manufacturers to comply with. While these regulations are nothing new, businesses can make it easier for themselves to comply with the regulations by moving away from paper-based traceability systems.

It is almost inevitable that at some point in the food chain, mistakes will be made, and contaminated products may enter production. However, to minimise the disruption to their business, food manufacturers can invest in automated traceability systems. Then, if a contaminated product is found, the batches that it has been in contact with are traceable.

To create an effective web of traceability records, data should be collected from the three different stages of food production. Manufacturers can use PLCs and HMIs to collect the data. Emerson Automation’s PACSystems range of control systems are easily integrated into existing systems, meaning that the benefits of improved traceability are not outweighed by a huge cost to integrate the new products.

In the first stage, all raw materials should be easily identified by the batch and date code, meaning they are tracked as soon as they come into the factory. Then, as the raw material is processed, manufacturers should note the number of units produced and the amount of any waste product. Finally, the third stage, where the product is sent to customers, is one of the easiest to keep traceability records of, as most businesses keep detailed records of orders and their recipients.

By feeding the data collected along the production line into an MES, the plant manager can view all of this traceability information at a glance. In the event of a recall, the plant manager can then share this information with the relevant authorities quickly, without having to track down missing data, minimising the disruption to the plant.
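
The genealogy data behind such a recall query can be surprisingly simple. The sketch below is a hypothetical Python illustration: given a suspect raw-material lot, it walks invented batch and shipment records to list which production batches consumed the lot and which customers received them, which is exactly the question a plant manager needs answered in minutes rather than days.

```python
# Hypothetical traceability records, for illustration only.
# Stage 1: raw material lots consumed by each production batch.
batch_inputs = {
    "BATCH-2101": ["RM-FLOUR-007", "RM-SUGAR-031"],
    "BATCH-2102": ["RM-FLOUR-008", "RM-SUGAR-031"],
    "BATCH-2103": ["RM-FLOUR-008", "RM-EGG-112"],
}
# Stage 3: where each production batch was shipped.
shipments = {
    "BATCH-2101": ["Retailer A"],
    "BATCH-2102": ["Retailer A", "Wholesaler B"],
    "BATCH-2103": ["Wholesaler B"],
}

def trace_lot(suspect_lot: str) -> dict:
    """Find every batch that consumed the suspect lot and everywhere it went."""
    affected = [batch for batch, lots in batch_inputs.items() if suspect_lot in lots]
    return {batch: shipments.get(batch, []) for batch in affected}

if __name__ == "__main__":
    for batch, recipients in trace_lot("RM-FLOUR-008").items():
        print(f"{batch} contains RM-FLOUR-008 -> notify: {', '.join(recipients)}")
```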

Not only does improving traceability systems help the manufacturer in the unfortunate event of a recall, it also helps the business's transparency. Novotek has previously worked with a Dutch milk producer where GE Digital's Historian application allowed the company to improve its transparency to its customers. As the brand promoted itself in organic retail stores, proving the origin of the milk was important to the success of the brand, meaning the producer had to invest in an improved traceability system.

Food scandals don't look likely to disappear anytime soon, but by investing in better control systems to improve traceability, food manufacturers can reduce the effect of scandals on their business. Just as professional gardeners wouldn't weed their gardens with inadequate tools, food manufacturers must not rely on outdated manual collection methods. By carrying out traceability procedures using connected, automated systems, food manufacturers can protect themselves as much as possible against scandals.

Modernising hardware in the wastewater industry https://ideashub.novotek.com/modernising-hardware-in-the-wastewater-industry/ Mon, 11 Jan 2021 10:27:00 +0000 http://ideashub.novotek.com/?p=2230 Water is important in the Netherlands. Without correct water management, half of the country would be flooded. While the Netherlands’ water industry may be well developed, process improvements can still be made. Here, George Walker, managing director of Novotek UK and Ireland, looks at how water companies can use digitalisation to help them meet strict regulations.

There are several regulations that Dutch water companies must adhere to. Both national and EU regulations strictly control the cleanliness of the water that is put back into the water system. For example, the European Drinking Water Directive specifies a total of 48 microbiological, chemical and indicator parameters that must be monitored and tested regularly to meet the standard.

While some of the Dutch standards, for substances such as boron, bromate and fluoride, are set lower than those in the European Drinking Water Directive, the national legislation also adds a number of other substances to monitor, such as Cryptosporidium and polychlorinated biphenyls. This means that overall, the water quality in the Netherlands is internationally recognised as being particularly high.

To meet these standards, water companies must ensure that they have strict procedures in place to meet the regulatory standards. While the water infrastructure in the Netherlands is particularly developed, there are still improvements that can be made to help the water companies effectively feed back this information to the government. 

The need for monitoring

One of the key challenges in the Dutch water industry is outdated equipment that makes it difficult for water companies to collect the information required across the treatment process. A 2017 report by the Rijksinstituut voor Volksgezondheid en Milieu stated that “in order to take preventative actions [against harmful contaminants in the sources of drinking water], it is necessary to monitor possible hazardous contaminants through the water supply chain.”

The same report stated that while most Dutch drinking water companies are improving their operational monitoring and management, stricter controls may be introduced in the future. Rather than monitoring for a concentration of 1 microgram per litre, the report suggests that the value may be lowered to 0.1 micrograms per litre.

While this increased control may be some time away, it’s vital that the Dutch water companies, from all aspects of the water supply chain, consider their reporting procedures.

However, they are often hampered by outdated equipment. Across Europe, the water industry is plagued with ageing infrastructure. Numerous pieces of equipment are used in the sanitisation of water, alongside the pumps and vessels used to move water around the plant. If this equipment cannot be connected to an overarching control system, the plant loses the opportunity to collect important data.

Updating equipment

Even in a country with a water infrastructure as well funded as The Netherlands’, it is still difficult for water companies to update all their equipment in one go. Instead, water companies must draw up a plan of the processes that they want to be able to monitor better.

Once they have completed this plan, they can upgrade the equipment needed to achieve it. For example, infrastructure managers may decide to first invest in connected equipment that helps them feed back the level of contaminants at each stage of the purification process.

Once the equipment has been updated and there is a plethora of connected devices reporting information, an overarching control system is needed to manage and gather that information. This will allow the infrastructure manager to make actionable decisions and share the information with authorities when necessary.

Reporting

While water companies have kept manual records of contaminant levels for decades to comply with reporting legislation, this is not the most efficient way of doing so.

All water companies must follow strict quality control procedures internally and report to the Regional Public Health Inspector (RHI), at least on an annual basis. If concerns are raised at any point about the safety of drinking water, the water company must be able to provide supporting data within a short period, so any outbreak can be traced effectively.

When using manual data collection, it is time-consuming to log the data, import it into spreadsheets and print the records. Due to the amount of human involvement in the process, mistakes can easily be made, or historic data may be lost if not correctly filed.

To meet the regulations of the water industry, it is essential that water companies use a control system that allows them to manage this data effectively.

Managing data

If water companies invest in equipment to help them collect data throughout the water purification process, that investment is worth little without a SCADA system that can collect and manage the data.

Not only will the SCADA system give the plant manager more awareness and control over the processes in the water treatment plant, it will allow the regulatory information to be safely collected and stored.

By using a SCADA system, such as the iFIX automation software offered by Novotek, and digitising the collection of important data, water companies can reduce the time spent collecting data manually and reduce the likelihood of human error. The information can then be displayed in a clear report, alerting the plant manager to any discrepancies. When required, the report can then be sent to the regulatory authorities.
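
To illustrate what such automated reporting might look like downstream of the SCADA layer, the Python sketch below takes a small set of invented sample results and flags any that exceed a parameter limit. The parameters, limits and values are placeholders only; real limits would come from the Drinking Water Directive and national legislation, and real data from the SCADA or historian system.

```python
import pandas as pd

# Hypothetical regulatory limits (micrograms per litre), for illustration only.
limits_ug_l = {"boron_ug_l": 500.0, "bromate_ug_l": 10.0, "fluoride_ug_l": 1100.0}

# Hypothetical sample results pulled from a SCADA/historian export.
samples = pd.DataFrame({
    "sample_point":  ["Intake", "Post-treatment", "Distribution"],
    "boron_ug_l":    [310.0, 280.0, 270.0],
    "bromate_ug_l":  [4.0,   12.5,  9.8],
    "fluoride_ug_l": [900.0, 850.0, 840.0],
})

# Flag every reading that exceeds its limit and build a simple exceedance report.
exceedances = []
for parameter, limit in limits_ug_l.items():
    over = samples[samples[parameter] > limit]
    for _, row in over.iterrows():
        exceedances.append({
            "sample_point": row["sample_point"],
            "parameter": parameter,
            "value": row[parameter],
            "limit": limit,
        })

report = pd.DataFrame(exceedances)
print(report if not report.empty else "All monitored parameters within limits")
```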

As historic data is so valuable in the water industry, it is ideal for plant managers in this industry to also use an information system that can gather, archive and compress large amounts of data. If any problem is detected much later in the water treatment process, having this large volume of data available will make it much easier to identify the origin of the contaminant.

Having a large amount of data available, through investing in better sensors and automation equipment across a water treatment plant, will help plant managers gain better awareness of their operations. However, without the right SCADA system and historian, such as GE Historian, the investment in the connected devices is worthless.

With water management being so important in The Netherlands and the water industry leading the way in Europe, it is likely that regulatory control will increase to ensure the country retains its reputation for some of Europe’s cleanest drinking water.

With this in mind, water treatment plant and infrastructure managers need to make sure that their plants use up-to-date control systems fit for the digital age to manage their data, ensuring that they can stay one step ahead of any future regulatory changes.
