Foundational work – Novotek Ideas Hub
https://ideashub.novotek.com

Sustainability – Many Birds, One Stone
https://ideashub.novotek.com/sustainability-many-birds-one-stone/ – Mon, 18 Mar 2024

‘Two birds with one stone’ – so goes the well-known phrase. But what if you could get more than two birds for a single throw of a stone? How about many birds? And what if these birds were not just flights of fantasy but offered foundational improvements and real upsides, such as increased profitability, recovered capacity and the ability to meet sustainability targets?

Starkly stated sustainability targets such as ‘Net-Zero by 2030’ imply an inherent struggle, and while this may be true in certain arenas, there is a genuine opportunity to achieve environmental goals, automate accountability and improve profitability within manufacturing – all at once.

In this article, we’ll outline exactly how the right capabilities, infused with expertise, can offer a profitable and intelligent pathway to a brighter business and environmental future. Carbon is cash, and reducing your output means retaining capital and growing profitability for the future.

So how is this achieved? Firstly, by fostering a different mindset when conceptualising sustainability measures. Data on utility usage can tell you when you’ve used more or less, but this aggregated data doesn’t have the granularity to explain why. While this is fine for quantifying and reporting on consumption to participate in a carbon exchange, this approach offers no mechanisms to improve these figures. But it doesn’t have to be this way.

Success and Sustainability

Novotek Solutions delivers operational technology with a methodology shaped by a deep knowledge gained in over three decades of experience in IT domains.

We’ve led the way in delivering all our projects to a high, IT-compliant standard. Our solutions are supportable, maintainable, and extensible to keep your operation fit for the future.

In decades past, manufacturers in various sectors have embraced initiatives focused on continuous improvement, aiming to enhance production yields, improve equipment reliability, and minimise waste in materials, labour, and capital. Advanced measurement systems that track metrics like machine downtime and material usage have led to the establishment of comprehensive factory data infrastructures, all of which support manufacturers’ end goals.

These systems contextualise raw data by associating it with specific details such as order numbers and product codes. Advanced platforms like Proficy Plant Applications from GE Vernova can integrate data from primary sources like water flow meters into this contextual framework. This practice of collecting detailed data related to core equipment and products results in a robust dataset, which serves multiple purposes:

  1. Automating Environmental and Compliance Reporting: Using directly measured consumption data to create regulatory reports and calculate incentives.
  2. Enhancing Carbon Accounting: With varying standards for translating energy consumption into emissions, having granular data allows for flexibility in reporting and adapting to evolving auditing requirements.
  3. Incorporating Footprint Analysis in Continuous Improvement: Analysing measured environmental factors alongside traditional performance metrics reveals the interplay between operational changes and environmental impact. Comparing a product’s footprint data across different times or locations helps identify significant variations.
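
As a rough illustration of what this contextualisation looks like in practice, the sketch below joins raw meter readings to whichever production order was running at the time. The file and column names are assumptions for illustration, not the Plant Applications data model.

```python
# Minimal sketch: attach utility meter readings to production orders so that
# consumption can be reported per order and per product.
import pandas as pd

# 1-minute meter readings: timestamp, value (kWh or m3)
meters = pd.read_csv("meter_readings.csv", parse_dates=["timestamp"])
# Production orders: order_id, product_code, start, end
orders = pd.read_csv("production_orders.csv", parse_dates=["start", "end"])

def running_order(ts, orders_df):
    """Return the order_id active at timestamp ts, or None."""
    match = orders_df[(orders_df["start"] <= ts) & (ts < orders_df["end"])]
    return match["order_id"].iloc[0] if not match.empty else None

meters["order_id"] = meters["timestamp"].apply(lambda ts: running_order(ts, orders))

# Consumption per order, then variability per product code
per_order = meters.dropna(subset=["order_id"]).groupby("order_id")["value"].sum()
per_product = (
    per_order.rename("kwh").reset_index()
    .merge(orders[["order_id", "product_code"]], on="order_id")
    .groupby("product_code")["kwh"].agg(["mean", "std"])
)
print(per_product)  # products whose consumption varies more than expected stand out
```

Once each reading carries an order and product context, comparing a product’s footprint across times or sites becomes a simple aggregation rather than a manual exercise.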

This approach allowed a major North American brewer to spot cases where energy consumption varied even when all other factors were equal. Measuring energy consumption alongside production orders meant it could hunt for root causes through its efficiency management system.

Root causes for relative spikes in usage ranged from inefficient process control algorithms for heating or chilling equipment and inconsistent adherence to recipe setpoints to poor power management during down or idle times. The brewer used this insight to make recipes and procedures consistent across all sites.

The result? The brewer met a 5-year energy-savings target in just three years!

Operator Behaviour, Transparency and Compliance

As more firms conclude that a functional information strategy is a critical first step in their sustainability journey, gaining the correct capabilities to gather and process data is essential. In times gone by, multiple data collection regimens assembled reports for different audiences, such as customers or regulators, which led to inconsistencies and an undue workload for operators and analysts.

The alternative is a single data platform that serves multiple stakeholders, such as GE Vernova’s Plant Applications. Through a single platform, data is gathered once at an appropriate resolution, and the same data can then be repackaged for multiple purposes.

Through this method, operations can automate the management and delivery of regulatory data. Adherence to future carbon passport schemes also becomes a process you already have the tools to deal with.
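
To show what gathering once and repackaging many times can look like, here is a small sketch that turns one contextualised consumption table into two different outputs, a regulatory submission and an internal KPI. The column names follow the earlier sketch and are assumptions rather than a Plant Applications schema.

```python
# Reuse one contextualised dataset for two audiences.
import pandas as pd

# One row per completed order, produced by the kind of join sketched earlier:
# order_id, completed, line, product_code, units_made, kwh
orders = pd.read_csv("order_consumption.csv", parse_dates=["completed"])

# 1) Regulatory view: site-wide consumption per calendar month
monthly = orders.set_index("completed").resample("MS")["kwh"].sum()
monthly.to_csv("monthly_consumption_report.csv")

# 2) Improvement view: energy intensity per production line
by_line = orders.groupby("line").agg(kwh=("kwh", "sum"), units=("units_made", "sum"))
by_line["kwh_per_unit"] = by_line["kwh"] / by_line["units"]
print(by_line.sort_values("kwh_per_unit", ascending=False))
```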

Turning to transparency: customers are increasingly willing to pay a premium for ‘green’ products where you can demonstrate a complete genealogy and the positive credentials of your products in total confidence. With a comprehensive data platform in place, you have the power to track and demonstrate the exact journey a product has gone through, from raw materials to finished goods. And that is not to overlook the power of transparent data on your operation.

With greater process visibility, automated with real-time data collection, operations gain the insight required for intelligent decision-making from the shop floor to the top floor. Ingesting and utilising this data with a powerful analytics platform drives an understanding of the cause-and-effect relationships between asset performance and input consumption. This granular data is then fed into corporate EHS and carbon accounting systems, allowing true utility cost profiles to be a part of production costing and planning exercises. Manufacturers then use cross-plant metrics to accelerate best-practice identification and dissemination.
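
As a very small example of that cause-and-effect analysis, the sketch below fits an ordinary least-squares model relating energy per batch to a few candidate drivers. The column names are assumptions, and a real analytics platform would go considerably further, but the shape of the question is the same.

```python
# Relate energy use per batch to candidate drivers with a plain least-squares fit.
import numpy as np
import pandas as pd

batches = pd.read_csv("batches.csv")  # columns assumed: kwh, downtime_min, rate_pct, setpoint_dev

features = ["downtime_min", "rate_pct", "setpoint_dev"]
X = np.column_stack([np.ones(len(batches))] + [batches[f].to_numpy(float) for f in features])
y = batches["kwh"].to_numpy(float)

coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)  # ordinary least squares
for name, c in zip(["intercept"] + features, coeffs):
    print(f"{name:>14}: {c:+.3f} kWh per unit change")
```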

But that’s not where it ends; by embedding analytics into control and visualisation programs, operators can be presented with rich information to drive decision-making at the shopfloor level. By using intelligent systems in this way, operations can also ensure they are not held hostage to the availability of specialists.

Innovative Strategies in Sustainability

To demonstrate how adopting a manufacturing execution system can offer a ‘many birds for one stone’ solution, we can look at the capabilities and conditions of an operation both before and after implementation.

Before

Without a detailed understanding of how changing utility inputs will affect processes, efforts to be ‘green’ can cause efficiency and material losses while also potentially introducing quality or product safety risks.

The differences between equipment and processes also present difficulties in formulating an effective strategy. With better data collection, all elements of variability can be profiled – including materials used in processes.

After

Data-driven decision-making brings cost, quality and carbon footprint into balance. With the confidence to act backed by information, tuning processes and utility infrastructure ensures sustainability efforts do not compromise operational performance.

The root causes of overconsumption are more easily understood, and strategies to mitigate them can be formulated and actioned at pace.

The ‘Many Birds’ at a Glance

If we’ve demonstrated anything in this article, we hope it’s the broad scope of what’s possible when looking to drive sustainability – and reap the real rewards on offer for manufacturing! Here are the key takeaways of what’s on the table as we progress towards environmental goals:

  1. Expose hidden relationships between production and sustainability factors.
    • A single MES solution provides insight into materials, recipes, assets and processes to find the root causes of the overconsumption of utilities.
  2. Gain a single source of truth and improve the visibility of your consumption.
    • Granular data gathered by the single platform can be packaged, analysed and presented to serve many needs.
  3. Integrate metrics and analysis to provide additional insight.
    • Automating analytics within a single, scalable platform provides value from the shop floor to the top floor and drives fast, accurate decision-making powered by information.
  4. Automate regulatory compliance and power transparency and traceability.
    • Gain competitive capabilities to demonstrate green credentials to customers and other stakeholders.

Finally, in a nutshell, why select Plant Applications from GE Vernova?

  • Flexibility in Data Management: The platform can easily link basic time-series data from meters to a wider range of elements like materials, products, and events, all through straightforward configuration.
  • Support for Multiple Stakeholders: Plant Applications offers a variety of reporting and analytics capabilities, catering to both internal stakeholders focused on improvement and external stakeholders, ensuring their diverse needs are met.
  • Open and Layered Approach: Unlike many sustainability metrics systems that are manual or limited to specific sensors, Plant Applications enhances existing sensor, automation, and software investments, offering a more integrated solution.

Continue the conversation

Do you have any questions about sustainability and manufacturing? Chat to one of our friendly experts to find out more.

How to Implement IT Compliant OT
https://ideashub.novotek.com/how-to-implement-it-compliant-ot/ – Thu, 26 Oct 2023

As manufacturing operations adopt more intelligent systems, we’ve seen control systems, equipment, and networks rebranded as Operational Technology (OT). With this has come a change in approach from IT departments, who for decades wanted nothing to do with the weird and wonderful equipment that populated the OT space. While keeping the operational world at arm’s length was possible for IT in the past, the two worlds are now converging at such a pace, and in such a way, that it is impossible, or even perilous, to ignore.

A vital convergence

Cybersecurity is a crucial concern. OT equipment has become more IT-aligned by necessity through standard protocols and Ethernet/IP connectivity. Like a bucket of cold water, this fact woke the IT world to the significant vulnerabilities presented by connected operational systems. Furthermore, the press has continued to fill with stories of backdoors exploited by nefarious actors, and of the dire consequences for reputations, service, and profitability.

It was time for OT to be taken seriously and become part of the IT estate with the same high standards and best practice approaches to security.

So, what does this mean for you as a manufacturer?

Firstly, you must ensure that your control systems, such as PLCs, SCADA and so on, are secure from threats by keeping systems up to date and only providing connectivity between systems that require it. Leaving your entire operation wide open, with everything connected to everything else, is particularly hazardous. The optimal solution is to establish communication channels secured via switches and routers, allowing protocols to be enabled and disabled as required. Through this method, you can install firewalls between departments to further mitigate the threat of a cybersecurity breach.
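
As a sketch of the principle of only connecting what needs connecting, the allow-list below expresses zone-to-zone protocol permissions as data that can be reviewed against the actual firewall and switch configuration. The zone names are assumptions; the ports are the standard ones for Modbus/TCP and OPC UA.

```python
# Zone-to-zone protocol allow-list with a default-deny check.
ALLOWED = {
    # (source zone, destination zone): {port: protocol}
    ("scada", "plc_cell_1"): {502: "Modbus/TCP", 4840: "OPC UA"},
    ("historian", "scada"): {4840: "OPC UA"},
    # Nothing from the office network straight to the PLC cells.
}

def is_allowed(src_zone: str, dst_zone: str, port: int) -> bool:
    """Return True only if the flow is explicitly permitted (default deny)."""
    return port in ALLOWED.get((src_zone, dst_zone), {})

assert is_allowed("scada", "plc_cell_1", 502)       # permitted control traffic
assert not is_allowed("office", "plc_cell_1", 502)  # blocked by default deny
```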

The second point to consider is access control. Users should only be granted permissions to systems they require within an IT-supported domain. Paired with appropriate password complexity, a policy of regularly changing those passwords can minimise a potential vector of attack.

Next is virtualisation. By abstracting OT systems from the IT hardware, you can install physical hosts in an environmentally controlled data centre, rather than the old method of putting server racks under desks in control rooms, where they were subject to dust, heat, and the occasional accidental kicking from a steel-toe-capped boot.

Rounding out this brief overview is patching and backups. Patching regularly, at the same frequency as IT systems, keeps systems constantly up to date and reduces the exposure window for newly disclosed vulnerabilities such as Log4j. We still visit sites where Windows XP, NT and Server 2000 are in use. These operating systems are running long after official support has ended, meaning security patches are no longer available and the vulnerabilities are well known and widely published.

Because OT should now be firmly on your IT department’s radar, creating a thorough backup regime will mean your systems are recoverable in the event of data loss due to a ransomware attack, operator error or any other disruption.

Experience and Expertise

Novotek Solutions delivers operational technology with a methodology shaped by a deep knowledge gained in over three decades of experience in IT domains.

We’ve led the way in delivering all our projects to a high, IT-compliant standard. Our solutions are supportable, maintainable, and extensible to keep your operation fit for the future.


Are your PLCs an easy target? A mindset shift can significantly reduce PLC firmware vulnerabilities
https://ideashub.novotek.com/are-your-plcs-an-easy-target-reduce-plc-firmware-vulnerabilities/ – Thu, 25 Nov 2021

Since the beginning of the COVID-19 pandemic, businesses across the UK have faced a surge in cybercrime. In fact, research indicates that UK businesses experienced one attempted cyberattack every 46 seconds on average in 2020. Industrial businesses are a prime target for hackers and the ramifications of a data breach or denial-of-service attack are far-reaching, making system security imperative. Here, David Evanson, corporate vendor relationship manager at Novotek UK and Ireland, explains how industrial businesses can keep their vital systems secure.

For many business leaders and engineers, it is still tempting to consider large multinational companies or data-rich digital service providers to be the prime targets for hackers. However, the growing volume of cyberattacks on businesses globally shows that any company can be a target of malicious attacks on systems and services.

According to research by internet service provider Beaming, there were 686,961 attempted system breaches among UK businesses in 2020, marking a 20 per cent increase on 2019. Of these attacks, Beaming noted that one in ten aimed to gain control of an Internet of Things (IoT) device — something that indicates a tendency to target system continuity rather than conventional data.

Both factors together are cause for alarm among industrial businesses of all sizes. Hackers are targeting all manner of companies, from start-ups to global organisations, and focussing more on the growing number of internet-connected devices and systems that were previously isolated.

The consequences of a device being compromised range from data extraction to service shutdown, and in any case the financial and production impacts to an industrial business are significant. There is no single quick fix to bolster cybersecurity due to the varying types of hacks that can take place. Some cyberattacks are complex and sophisticated; others less so. Many attacks on devices tend to fall into the latter category, which means there are some steps industrial businesses can take to minimise risk.

Novotek has been working closely with industrial businesses in the UK and Ireland for decades. One common thing that we have observed with automation hardware and software is that many engineers do not regularly upgrade software or firmware. Instead, there is a tendency to view automation as a one-off, fit-and-forget purchase. The hardware may be physically maintained on a regular schedule, but the invisible software aspect is often neglected.

GE Fanuc Series 90-30

Older firmware is more susceptible to hacks because it often contains unpatched known security vulnerabilities, such as weak authentication algorithms, obsolete encryption technologies or backdoors for unauthorised access. For a programmable logic controller (PLC), older firmware versions make it possible for cyber attackers to change the module state to halt-mode, resulting in a denial-of-service that stops production or prevents critical processes from running.

PLC manufacturers routinely update firmware to ensure it is robust and secure in the face of the changing cyber landscape, but there is not always a set interval between these updates.

In some cases, updates are released in the days or weeks following the discovery of a vulnerability — either by the manufacturer, white-hat hackers or genuine attackers — to minimise end-user risk. The firmware version’s upgrade information should outline any exploits that have been fixed.

However, it’s important to note that legacy PLCs may no longer receive firmware updates from the manufacturer if the system has reached obsolescence. Many engineers opt to air-gap older PLCs to minimise the cybersecurity risk, but the lack of firmware support can also create interoperability issues with connected devices. Another part of the network, such as a switch, receiving an update can cause communications and compatibility issues with PLCs running on older versions — yet another reason why systems should run on the most recent software patches.

At this stage, engineers should invest in a more modern PLC to minimise risk — and, due to the rate of advancement of PLCs in recent years, likely benefit from greater functionality at the same time.

Firmware vulnerabilities are unavoidable, regardless of the quality of the PLC. At Novotek, we give extensive support for the Emerson PACSystems products that we provide to businesses in the UK and Ireland. This involves not only support with firmware updates as they become available, but also guidance on wider system resilience to ensure that businesses are as safe as possible from hardware vulnerabilities. The growth in cyberattacks will continue long beyond the end of the COVID-19 pandemic, and infrastructure and automation are increasingly becoming targets. It may seem a simple step, but taking the same upgrade approach to firmware that we do with conventional computers can help engineers to secure their operations and keep running systems safely.
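
One way to make that upgrade mindset routine is to treat firmware versions as auditable data. The sketch below compares a PLC inventory against minimum patched versions; the model names, versions and hard-coded lists are purely illustrative, and in practice both would come from an asset register and the vendor’s security advisories.

```python
# Flag PLCs whose firmware sits below the minimum version with known fixes.
MIN_SAFE_VERSION = {"model_a": (10, 15), "model_b": (9, 40)}  # illustrative baselines

installed = [
    {"asset": "PLC-Line1", "model": "model_a", "fw": (10, 12)},
    {"asset": "PLC-Line2", "model": "model_b", "fw": (9, 85)},
    {"asset": "PLC-Line3", "model": "model_c", "fw": (2, 1)},
]

for plc in installed:
    minimum = MIN_SAFE_VERSION.get(plc["model"])
    if minimum is None:
        print(f"{plc['asset']}: model not in baseline - check for obsolescence")
    elif plc["fw"] < minimum:
        print(f"{plc['asset']}: firmware {plc['fw']} below {minimum} - schedule an update")
    else:
        print(f"{plc['asset']}: up to date")
```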

Bridging the connectivity gap
https://ideashub.novotek.com/bridging-the-connectivity-gap/ – Mon, 06 Sep 2021

In the age of connectivity, there is no shortage of useful information that engineers can leverage to optimise and improve operations. Everything from the speed of motors to the weather forecast can influence production. However, bringing these data sources together in a secure way is a challenge faced by many engineers. Here, George Walker, managing director of Novotek UK and Ireland, explains how engineers can bridge the gap between local process data and external data sources.

The Internet of Things (IoT) may still be a relatively new concept for many consumers and professional service businesses, but the idea of machine-to-machine communication and connectivity is nothing new for industry. In fact, it’s been more than 50 years since the programmable logic controller (PLC) first became popular among industrial businesses as a means of controlling connected systems.

The principle behind the PLC is quite simple: see, think and do. The controller will ‘see’ what is happening in a process based on the input data from the connected devices and machines. The PLC then processes this input and computes whether any adjustments are required and, if so, signals these commands to the field devices. Traditionally, the range of field devices that could be controlled was limited, but recent developments in sensor technology have made specific components and resources much more measurable.

For example, if a water tank is almost at full capacity in a food processing plant, data from connected sensors can feed that information to a PLC. The PLC then sends the signal for the valve to close once the water volume exceeds a certain threshold, which prevents overflow. This is a simple control loop that sufficiently meets the need of the process.
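
A minimal sketch of that see-think-do loop is shown below, assuming a read_level() input and a set_valve() output provided by whatever I/O layer is in use; the thresholds are illustrative.

```python
# See-think-do: close the inlet valve near capacity, reopen once the level falls.
import time

HIGH_LIMIT = 0.95  # fraction of tank capacity at which the inlet valve closes
LOW_LIMIT = 0.80   # reopen below this level so the valve does not chatter

def control_loop(read_level, set_valve, period_s=1.0):
    valve_open = True
    while True:
        level = read_level()              # see: sensor input
        if valve_open and level >= HIGH_LIMIT:
            valve_open = False            # think: threshold exceeded
        elif not valve_open and level <= LOW_LIMIT:
            valve_open = True
        set_valve(valve_open)             # do: drive the field device
        time.sleep(period_s)
```

The two thresholds add a little hysteresis, so the valve is not driven open and shut around a single limit.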

Unfortunately, even as edge computing and PLC technology have advanced and offered more sophisticated data processing and control at the field level, many plant engineers continue to set up their devices in this way. In reality, modern edge devices and industrial PCs (IPCs) are capable of providing much greater control, as well as responding to external commands or variables that were previously beyond the scope of control systems.

The outer loop

While the idea of the Industrial IoT (IIoT) is predominantly a means of branding modern connectivity, the wider Industry 4.0 movement has brought with it some valuable advancements in edge and PLC technology. Among these advancements is the potential for on-premises automation and control systems to not only connect with local devices in an inner loop, but to draw from external sources: an outer loop.

The outer loop can take several forms, depending on what is most applicable or relevant to a process or operation.

For example, some more digitally mature businesses might have outer loops that feature an enterprise resource planning (ERP) system, supply chain management software or a wider manufacturing execution system (MES). These systems will share and receive relevant information or send required adjustments — prompted, for example, by raw material intake or low stock — to an edge device, which feeds into the inner loop. This allows industrial businesses to make use of more comprehensive data analysis than can be achieved in local data systems.

Alternatively, an outer loop could draw from data sources that are completely external to a plant’s operations. For example, a wind farm operator could use an outer loop that drew from sources of meteorological data for wind forecasts. This could inform the optimum pitch and yaw of a turbine, controlled by a field device.

Another example, and one that will resonate with many industrial businesses, is energy price. The cost of power from the electrical grid fluctuates throughout the day, which might mean that on-site generation — such as solar panels or heat recovery processes — become more economical during times of peak grid demand. An outer loop can communicate this data efficiently to the relevant systems in a business, and changes can then be enacted that allow the business to reduce energy costs.
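
To make the energy-price example concrete, here is a sketch of the outer-loop decision. fetch_grid_price() and send_to_inner_loop() stand in for whatever market feed and gateway interface is actually available, and the on-site cost figure is an assumption.

```python
# Outer-loop decision: prefer on-site generation whenever grid power is dearer.
ON_SITE_COST_PER_KWH = 0.11  # assumed marginal cost of local generation, per kWh

def choose_supply_mode(fetch_grid_price, send_to_inner_loop):
    grid_price = fetch_grid_price()  # e.g. price for the coming settlement period
    mode = "on_site" if grid_price > ON_SITE_COST_PER_KWH else "grid"
    send_to_inner_loop({"supply_mode": mode, "grid_price": grid_price})
    return mode
```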

Establishing secure connection

Clearly, there is a benefit for industrial businesses to establish both inner and outer loops. However, there is one barrier to deployment that most engineers encounter: hardware limitations.

Traditional PLCs were designed in a rather utilitarian way to complete control functions effectively and straightforwardly. This no-frills approach persists even with modern PLCs — even with today’s technical specifications, most PLCs are not designed to handle much more than a real-time operating system and some control applications.

Attempting to set up such a PLC to interact with an outer loop would either not work at all or severely hinder performance and risk failure.

Engineers can tackle this problem by introducing a separate gateway device that serves as an intermediary between the outer loop and the inner loop. However, this is a somewhat inelegant solution that requires investment in additional devices, which will require ongoing maintenance and introduce yet another device into already large system networks. Across an entire site, this quickly becomes costly and complicates network topologies.

A better solution is an unconventional one. It is possible to set up a modern automation controller in such a way that it breaks the conventions of PLCs, as long as the device is capable of multi-core processing at pace. From Novotek’s perspective, one of the best modern units that meet this need is Emerson Automation’s CPL410 automation controller.

The CPL410 can split inner and outer loop processing between its multiple processor cores. The inner loop and PLC processes can run from a single core, while another core — or even a group of cores, depending on complexity — can run more sophisticated containerised applications or operating systems. Additional cores can broker between the inner and outer loops, ensuring reliability and security.
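
The brokering role can be pictured with a small sketch: outer-loop requests are validated against hard limits before anything reaches the control loop. The tag names, limits and queue-based interfaces are assumptions for illustration, not a CPL410-specific API.

```python
# Broker between loops: only known tags within hard limits reach the inner loop.
from queue import Queue

SETPOINT_LIMITS = {"line_speed": (0.0, 120.0), "oven_temp_c": (20.0, 250.0)}

def broker(outer_requests: Queue, inner_commands: Queue):
    while True:
        request = outer_requests.get()     # e.g. from ERP, MES or a price feed
        tag, value = request["tag"], float(request["value"])
        limits = SETPOINT_LIMITS.get(tag)
        if limits is None or not (limits[0] <= value <= limits[1]):
            continue                       # reject unknown tags and out-of-range values
        inner_commands.put({"tag": tag, "value": value})
```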

A multi-core setup is useful because it allows the PLC processes and gateway to be consolidated into a single unit, without compromising performance capacity or speed. It also means that ageing or obsolete PLCs can be upgraded to a controller such as the CPL410 during any modernisation initiatives, minimising additional capital costs.

Although the idea behind the IoT is not a new one for industrial businesses, the fact that other sectors are embracing the idea means more external data points than ever before are available. With systems in place that can support effective inner and outer loops, industrial businesses can leverage the increased connectivity of external markets and enhance their own operations.

A recipe for lasting success
https://ideashub.novotek.com/a-recipe-for-lasting-success/ – Wed, 01 Sep 2021

Few businesses routinely challenge every part of their organisation like food manufacturers. New technologies and digital transformation can help food manufacturers manage the constant change, but the traditional approach of comprehensive digitalisation planning is often not flexible enough to ensure success. Here, Sean Robinson, software solutions manager at food automation expert Novotek UK and Ireland, explains why the key ingredient for success in flexible food manufacturing is micro-applications.

Food production is truly a sector that operates under the mantra of “reinvent the everyday, every day”. The sector is constantly evolving, whether manufacturers are innovating new product ranges that meet changing consumer tastes or switching packaging materials to extend shelf-life or reduce waste. And these are just examples of substantial shifts; food manufacturers are also regularly making smaller changes by refining recipes, adapting processes or adjusting ingredient and material supply lines.

Despite — or perhaps because of — the environment of constant change, food processors can benefit more than many other manufacturers from carefully targeted use of data collection, visualisation and analysis solutions. After all, yesterday’s optimisation isn’t particularly optimal if today means a new stock-keeping unit (SKU), a new critical ingredient supplier or a new recipe.

The approach that many businesses take to becoming data-driven is to extensively map out their digitalisation journey, with each aspect comprehensively planned. This doesn’t generally support the flexibility needed in food manufacturing.

Rather than taking this approach, modern solutions make it possible to build or buy micro-applications that share common data infrastructure and even app-building or visualisation tools. This means that impactful new capabilities can be adopted through fast initial works that create re-usable building blocks. Later works then become incremental, rather than potentially having different systems creating overlapping capabilities.

Micro-apps in practice

We can see how this micro-app approach can be put into action by considering one of the most common challenges in food processing: managing the effect of variability in key ingredients, so that yields are maximised with minimal re-work or ingredient waste. It’s likely that a manufacturer would already have some of the information needed to address the challenge. The question is, how can you quickly supplement what’s in place?

It’s a safe bet that the factory has automation and maybe supervisory control and data acquisition (SCADA) systems, so there is an abundance of machine-generated data to tell us about the details of how processes are performing. Focussing more closely on yield performance, we can assume our manufacturer has a lab system where in-process and finished good tests give very clear indicators of how well a product is being made.

From Novotek’s experience, the most common gaps in tackling yield issues come from two areas. The first is supplier quality data, which is often provided either written down or in an electronic format that doesn’t mesh with existing systems. This makes analysis more difficult, because there’s no actual database to work from.

The second area is that the variations in raw materials that affect yields may actually be within the specifications defined for those materials. As such, there may not be an obvious fix. It’s likelier that material data needs to be analysed alongside several process performance and quality performance data points. Understanding the relationships between more than two or three variables will probably mean adding a new kind of analysis tool.

Micro-apps can be highly focussed on the core capabilities required. In this case, the micro-app would provide three core functions. First, it would provide a simple means to capture ingredient quality data as it’s received, into a system that also holds the specific material characteristic specifications and limits – all on a “by-lot” basis. It would also offer a machine learning tool that helps clarify which machine settings or recipe adjustments can accommodate the range of material quality variation while still delivering good final yield and quality results.

Finally, the micro-app would be able to alert production staff to make recommended changes to a recipe or process as different raw material lots are staged for use – an automated monitor of yield/quality risk from material variation. This could be as simple as a new smart alarm sent back to existing SCADA, or a notification on a smartphone.
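
A compressed sketch of the first and third of those functions (capturing lot measurements against a specification and alerting operators) is shown below; the machine-learning step is omitted, and the specification values, field names and notify() hook are illustrative.

```python
# Record a lot's measurements against its specification and alert on exceptions.
from dataclasses import dataclass

@dataclass
class Spec:
    low: float
    high: float

# Illustrative specification for one ingredient, held on a by-lot basis
FLOUR_SPEC = {"moisture_pct": Spec(11.0, 14.0), "protein_pct": Spec(10.5, 13.0)}

def check_lot(lot_id, measurements, spec, notify):
    """Capture-and-check on receipt; returns True if the lot is within spec."""
    issues = []
    for name, value in measurements.items():
        s = spec.get(name)
        if s and not (s.low <= value <= s.high):
            issues.append(f"{name}={value} outside {s.low}-{s.high}")
    if issues:
        # Alert production staff: could surface as a SCADA alarm or phone notification
        notify(f"Lot {lot_id}: review recipe setpoints - " + "; ".join(issues))
    return not issues

check_lot("LOT-042", {"moisture_pct": 14.6, "protein_pct": 12.1}, FLOUR_SPEC, print)
```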

Industrial software vendors are adapting their offers, in recognition of the trend towards micro-apps aimed at specific business processes. So, the software licensing needed to enable material data collection and quality specification monitoring on a key process would be built around a low user count and narrow set of underlying configuration and integration points, rather than a comprehensive plant-wide project. That can mean starting investments in the low thousands for software and some deployment work.

Some of Novotek’s customers are now progressing through projects defined by such very specific functional needs. Our job at Novotek is to ensure that any new solutions serve the purpose of being able to act as supplements to other such micro-apps in the future.

Next stages

A strategic advantage of micro-apps is that the planning and execution stages are less time-intensive than a far-reaching, plant-wide digitalisation project. Food engineers can do several things to begin reinventing their everyday processes. For example, food manufacturers can deploy predictive downtime applications on key processes. These apps can even take into account whether the products being made have their own impact on failure modes.

Each micro-app reflects an opportunity to make the overall food manufacturing operation more adaptable. This means that innovation in products, processes and business models can be done, all the while knowing that refining and optimising the “new” won’t be held up by tools and practices that are too difficult to adapt from the “old”.

Getting started with food digitalisation
https://ideashub.novotek.com/getting-started-with-food-digitalisation/ – Mon, 16 Aug 2021

The food and beverage industry is one where innovation in product development or design can bring a significant competitive advantage. As such, it’s no surprise that food manufacturers are increasingly considering digitalisation of operations to augment adaptability, improve throughput and strengthen flexibility. Here, Sean Robinson, service leader at food automation software specialist Novotek UK and Ireland, explains how food manufacturers can plan digitalisation in the most effective way.

In the past 12 months, the food industry has been forced to re-evaluate and re-assess its operational priorities. For years, many manufacturers focussed on flexible production to enable diverse product lines and mass customisation, in line with shifting consumer demands. In 2020, this was forced to change, and production efficiency and operational adaptability became the focus. Once more, automation and digital technologies came to the forefront of food manufacturing priorities.

Digitalisation is a word that has been bandied around a lot in industrial markets for the past few years, serving as a catch-all phrase encompassing everything that generates, records and communicates data. Unfortunately, as with most amorphous phrases, this leads to confusion among managers about how to introduce these technologies, which causes costly errors in implementation, such as overlapping data collection systems or introduction of technologies that do not serve a strategic purpose.

For food manufacturers at the beginning of their digitalisation journey, the first step is to define an agreed and important goal, from which the company can reverse engineer a solution. Whether looking to deliver on a continuous improvement objective that has been identified as part of a formal process, or just illustrating the value of an engineering team unleashed with time to think, it’s key to let the desired improvement dictate what kind of digitalisation will be needed.

For example, if material costs are too high and the agreed goal is to reduce them, a digitalisation project should establish systems that identify the factors influencing this. Understanding the root causes for yield problems could require a combination of machine data, ambient condition data, quality or lab data and information about material quality provided by suppliers. Thinking through where data is readily available, versus where it’s trapped in paper, spreadsheets or isolated automation, will ensure the plan can deliver on the purpose.

Planning is best done at the outset of investing in digitalisation, but some food manufacturers will undoubtedly have already rushed into digitalisation in years past. For businesses with some digital or automation technologies in place, one of the most valuable things to do is review the lay of the existing digital landscape. The easiest approach to doing this is to apply the ‘three Rs’ to your existing data: reduce systems overlap, reuse data and recycle data.

Reducing data collection system overlap not only makes it easier for managers to identify the source of a specific data set, it also streamlines costs. Why have a downtime system collecting machine event data, a yield analysis system collecting overlapping data and a work in progress tracking system that is separate to both of those? Having three systems collecting fundamentally the same data means duplicated configuration and deployment costs, as well as possible conflict over which one holds the ‘truth’.

An effective data and digitalisation strategy should also aim to use collected data in various calculations to produce several insights. For example, the downtime event data collected for OEE calculations may be part of what’s needed to solve a quality problem. The energy and water data collected for sustainability reporting may hold the key to real savings opportunities. Wherever there is a connection to a data source, managers should think of ways to make sure that a data point only needs to be collected once in order to be used many times.
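
As a small illustration of collect once, use many times, the sketch below derives the availability component of OEE from downtime events and then reuses the same events to rank downtime reasons by the rejects that follow them. The event structure and figures are assumptions.

```python
# One set of downtime events, two uses: an OEE input and a quality clue.
from collections import defaultdict

# Assumed structure: (reason, minutes_lost, rejects_during_recovery)
events = [("changeover", 45, 12), ("jam", 10, 30), ("jam", 8, 22), ("starved", 15, 0)]
planned_minutes = 480  # one shift

# Use 1: availability component of OEE
downtime = sum(minutes for _, minutes, _ in events)
availability = (planned_minutes - downtime) / planned_minutes
print(f"Availability: {availability:.1%}")

# Use 2: the same events, grouped by reason and ranked by associated rejects
rejects_by_reason = defaultdict(int)
for reason, _, rejects in events:
    rejects_by_reason[reason] += rejects
print(sorted(rejects_by_reason.items(), key=lambda kv: kv[1], reverse=True))
```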

Finally, offline analysis tools and some of the new analytics packages on the market could mean that old data offers recurring value as a firm chases finer and finer points of improvement. So, it’s important to set up a data management approach and data management platforms that can give you the option of making repeated use of data.

Digitalisation projects can lead to more innovative and effective ways of working for food manufacturers, but they rely on careful planning and strategic implementation. By giving full consideration to the goals to be reached or how data is used within a site, food businesses can ensure their systems are always effectively aligned with their goals.

Can your IPC handle the heat?
https://ideashub.novotek.com/can-your-ipc-handle-the-heat/ – Mon, 05 Jul 2021

High operating temperatures are abundant in the industrial sector, whether it’s the elevated ambient temperature of oil and gas refining or the continuous operations with reduced airflow of heavy machinery. These high temperatures pose a common problem to the performance of industrial electronics, particularly industrial PCs (IPCs). Here, David Evanson, corporate vendor relationship manager at Novotek UK and Ireland, explains how engineers and managers can ensure their IPCs can handle the heat.

It’s no secret that IPCs play an essential role in modern industrial operations. These vital units undertake a range of tasks, from managing equipment performance data to motion and automated system control. It’s therefore no surprise that the IPC market continues to go from strength to strength. In fact, ResearchAndMarkets forecasts that the global IPC market will grow at a compound annual growth rate (CAGR) of 6.45 per cent, to be valued at $7.756 billion by 2026.

IPCs feature prominently on the factory floor, generally either in control cabinets or mounted onto machinery. Being on the frontline means that engineers and plant managers know that, as a minimum, they need to specify ruggedised IPCs for their operations. What sometimes gets overlooked, however, is the operating temperature range of an IPC unit.

Electronic circuits and computing components are highly susceptible to extreme temperatures, be they high or low. At high temperatures, components can deteriorate faster. In the case of IPCs, modern CPUs are designed to prevent accelerated component deterioration by throttling their processing performance. This succeeds in reducing the heat produced in processing circuits, but it means that processes running on the IPC run slowly or become unresponsive — not ideal for real-time control applications.

In certain markets, considering operating temperature range is second nature for engineers. For example, an IPC tasked with controlling or collecting data from a welding robot will be specified to withstand high temperatures.

However, temperature should be a consideration even in unassuming industrial environments. If an IPC is situated outside, the exposure to sunlight — alongside reduced airflow in an installation — can cause an increase in temperature that can reach up to 70 degrees Celsius. Both Novotek and its partner Emerson Automation have encountered industrial businesses that have experienced this problem.

Of course, the solution to the challenge of overheating in IPCs is to specify a unit that boasts good thermal performance in an extended operating temperature. Unfortunately, not all IPCs that claim to offer this feature are actually effectively tested in conditions that accurately reflect real-world operating conditions, which can lead to some IPCs failing when deployed in the field.

The reason why extended temperature IPCs fail is due to the way that the testing is undertaken. In many cases, the IPC is tested in a thermal chamber that has significant forced air flow conditions, which reduces the efficacy and the accuracy of the test itself. A more effective way of testing is for the IPC manufacturer to block the airflow, which simulates a realistic use condition in a cabinet environment.

Emerson Automation conducts its tests under these restricted airflow conditions, which allows it to accurately demonstrate that its IPCs can perform at high temperatures without throttling processing performance. The company has even shared a video of its IPC thermal testing process, highlighting the capabilities of its RXi2-BP.

It’s for this reason that Emerson’s IPCs are the go-to option from Novotek’s perspective, because they ensure reliable and consistent operation in demanding environmental conditions.

With IPCs playing such a vital role in modern industry, it’s important that they are up to the task not only in terms of computing capacity, but also environmental performance. When plant managers and engineers can specify an IPC with assurances of the accuracy of thermal testing, it provides peace of mind that the unit can handle the heat.

What can Industrial PCs and displays handle?
https://ideashub.novotek.com/what-can-industrial-pcs-and-displays-handle/ – Fri, 14 May 2021

Industrial PCs (IPCs) and displays are routinely used in harsh environments. Even in less extreme industrial environments, IPCs and displays are often subject to heavy handling or accidental damage. So, it’s important to know what your IPCs and displays can (and can’t) handle.

There are many cases where engineers might overlook the need for robustness and durability in IPCs and displays, particularly in environments where there is unlikely to be any extreme shock, vibrations or impacts.

However, things happen in the day-to-day activities on the factory floor that can impact IPCs and displays, and it’s important to ensure systems remain operational and intact in these cases. Even if a system doesn’t need to have an extended operating temperature, you also don’t want it to temporarily stop functioning if somebody accidentally knocks or scratches it!

Summary:
  • Even in traditional industrial environments, IPCs and displays need to be robust
  • Testing systems for physical damage or performance disruptions is invaluable

Transcript:

Hi, my name is Gene Juknevicius. I am a solution architect at Emerson. At Emerson, we know that customers need reliable industrial PCs and displays to run their applications. So let’s see how robust Emerson’s industrial PC and display really is. 

So here we have our RXI2-BP Industrial PC and two RXI monitors. By the way, let me show you an interesting feature of a display port interface that our industrial PCs and monitors do support. It’s called a multi stream transport. It allows us to daisy chain multiple independent monitors to a single display port output on an industrial PC. 

In this case, I’ll be showing video on one display and performance statistics on the other display. 

So, let’s see how robust the industrial PC is. For the test I’m going to drop it on the floor while operating. We’re going to watch the video to make sure that there are no interruptions. So for the test, let’s see how high the table is – it’s about 75 centimetres. We obviously need to make sure that the cables are long enough. 

And, please, don’t do this test at home — it requires a certified IPC dropper. Ready? 

Look at the video. It keeps running. As you can see, the video did not even flicker. This is exactly what we want.  

Okay, so industrial PCs are robust enough. What about our displays? When visiting some of our oil and gas customers, I was told that people in the field tend to use large screwdrivers as a stylus for the touch screen.  In our case I have this large wrench. Let’s try to use that.  

Okay, what about trying to use a display as a cutting board?  

As you can see, neither the large wrench nor a sharp cutter left any scratches on a display.  

Our goal is to deliver solutions that you can deploy anywhere and not worry if they will operate, so that you can focus on your application. 

A secure knowledge base
https://ideashub.novotek.com/a-secure-knowledge-base/ – Tue, 06 Apr 2021

The notion that ideas become reality especially applies to cybersecurity in critical national infrastructure. Security breaches can result in very real losses of water or energy, but ideas around cyber threats are obscured by misconceptions about the nature of such attacks and how to deal with them. Sean Robinson, service leader of automation specialist Novotek UK and Ireland, explains how a compact controller could negate these threats and improve companies’ internal understanding of cyberattacks.

An annual report by Kaspersky Lab, The State of Industrial Cybersecurity 2018, revealed several interesting facts about how industrial cybersecurity is perceived by businesses and applied to Industrial Control Systems (ICS). The survey of 230 worldwide professionals reveals disconnections between what is feared by businesses, and what’s happening in reality.

For instance, 66 per cent of the surveyed businesses were most concerned about advanced persistent threats (APTs), like data leaks and spying (59 per cent), because of their perceived potential impact. In reality, APTs make up just 16 per cent of cybersecurity incidents; conventional malware and virus outbreaks are becoming the greater problem. These attacks are not overly sophisticated, yet they made up 64 per cent of cybersecurity incidents last year.

Aside from misconceptions about the external threat landscape, disparities also exist within organisations. In relation to Kaspersky Lab’s survey, technology website tripwire.com cited a report by the SANS Institute. SANS found that, even among the nearly three-quarters of firms that were confident in their ability to secure their industrial internet of things (IIoT), internal perceptions about the effectiveness of their security measures often differed. While leaders and department managers were more likely to have a “rosy outlook” on their security, operational technology departments had a more pessimistic view.

Such misconceptions would be even more of a concern within critical national infrastructures. Cyberattacks against water, energy or chemical supplies can have very real consequences for countries and their populations.

Upgrading control systems

From a hardware and systems perspective, more than half — 54 per cent — of the surveyed businesses identified integrating ICS with IT systems and Internet of Things (IoT) ecosystems as among the most pronounced challenges. This last statistic places a wider challenge faced by plant managers into a whole new context: specifically, how best to achieve space and cost savings by reducing the size and complexity of plant equipment.

Plant managers are turning to new systems to achieve greater levels of flexibility and profitability in their production. This coincides with older programmable automation controller (PAC) systems, like trusted Series 90-30 controllers, reaching the end of their operational lifespans. In many cases, these 90-30 systems have been relied upon as integral to plant operations for upwards of 25 years.

How can plant managers effectively upgrade their systems, while ensuring that cybersecurity measures keep up with the rate of technology adoption — and the external threat landscape? Fortunately, answers lie in smart hardware and its role in helping manufacturers enhance process flexibility and performance.

Centralised security

One solution lies in better control. The RSTi-EP CPE100 is a compact controller for PAC systems — specifically, it complements the RX3i CPU from Emerson, which has emerged as a popular and effective upgrade for 90-30 systems. In a nutshell, the RSTi-EP CPE100 leverages the power and flexibility of PAC systems in smaller applications.

With the RSTi-EP CPE100, entire PAC systems can be programmed in stand-alone applications, or the system can be used as an auxiliary controller in larger process applications that use the RX3i. Not only does the system leverage the power and flexibility of PAC systems in smaller applications, there are also benefits in terms of cybersecurity — indeed, the RSTi CPE100 is secure by design.

With the system, companies can apply optimised security right from the very start. The RSTi CPE100 incorporates technologies like Trusted Platform Modules and secure, trusted, and measured boot. It allows centralised configurations, so that encrypted firmware updates can be executed from a secure central location. Specifically, a suite of cybersecurity technologies can help prevent unauthorised updates. Meanwhile, built-in security protocols can protect against man-in-the-middle (MITM) attacks — where the attacker secretly intercepts and interferes with communications between two parties — and denial-of-service (DoS) attacks.

Speaking of the “man-in-the-middle”, another key takeaway from Kaspersky Lab’s report is that, going forward, industrial companies must also pay more attention to employees’ understanding and awareness of cyber threats. Because the RSTi-EP CPE100 can streamline application development and integration, a further benefit of the system is that it simplifies training for operators and maintenance workers.

While cyberattacks on ICS computers are misunderstood by many within industry, it’s necessary to overcome these misconceptions while keeping up with the best cybersecurity measures. Novotek recommends that managers should pay attention to system security from the very beginning of their integration. The more critical the application, the more important it is that ideas surrounding cyberattacks accurately pre-empt the realities.

Consolidating tech in the utilities sector
https://ideashub.novotek.com/consolidating-tech-in-the-utilities-sector/ – Mon, 11 Jan 2021

Despite the utilities sector being one of the first areas of industry to digitalise its operations in the 1970s, business leaders have been slow to make systematic changes in recent years. Here, Sean Robinson, service leader at Novotek UK and Ireland, explains how the latest technologies can better equip utilities companies to adapt to future energy demands.

In 2015, at the UN climate conference of parties (COP), world leaders agreed to take united action in limiting the rise of global temperatures to less than two degrees Celsius. The pressures to reduce carbon emissions, as well as the shift to post-recession, less energy-intensive industries, have led to a surge in demand for new power and utilities offerings across the globe.

Fossil fuels account for up to 82 per cent of the world’s primary energy usage, but as governments begin to tightly regulate this usage and renewable energy generation is on the rise, utilities companies need to evolve. This presents several growth opportunities for the utilities industry to integrate additional services into their portfolio. However, one of the greatest challenges facing utilities companies is the integration of new and emerging technologies into their business models.

Utilities companies need to begin by evaluating their current systems and infrastructure against their business goals. At Novotek, we’ve found that many utilities companies are using legacy equipment or disparate systems from a wide range of suppliers that, often, are out of sync with other operations in the facility.

While global investment in digital electricity infrastructure and software may have grown by over 20 per cent annually since 2014, at Novotek we are urging utilities companies in particular to move faster. By modernising and consolidating a facility’s existing systems into one, businesses can make a significant return on investment (ROI).

Because data comes from a broad range of sources, consolidation allows organisations to present data more easily, while also facilitating effective data analysis.

Data consolidation techniques reduce inefficiencies like data duplication, costs related to reliance on multiple databases and multiple data management points.

Currently, the utilities sector is greatly fragmented as a result of decades of outsourcing in incremental functional and geographic silos. With technologies that exist today, like GE Digital’s Predix Plant Applications software, which is part of the Predix manufacturing execution system (MES) suite, utilities companies can now manage the hundreds of devices and pieces of equipment operating simultaneously across not just one plant, but an entire portfolio from one system in real-time.

Combining predictive machine learning and advanced analytics, the technology can help utilities managers to transition from a reactive to a proactive and prescriptive operating model.

This is because Plant Applications allows plant managers to analyse and configure insights from the data collected, make informed business decisions and establish new, unfragmented processes to improve other areas of the business, such as reducing waste. By 2025, data analytics will be a core component in assisting companies, like those operating in the utilities sector, in making key business decisions. By consolidating various processes and integrating automation technologies like GE Digital’s, utilities companies can optimise their operations to significantly improve performance and retain a competitive edge.
