DataOps: The Fuel Injectors For Your Transformation Engine?
https://ideashub.novotek.com/dataops-the-fuel-injectors-your-transformation-engine/ | Thu, 19 May 2022

Data – everyone agrees it’s the fuel for the fires of innovation and optimisation. The industrial world is blessed with an abundance of rich, objective (being machine-generated) data, so it should be well-equipped to extract new advantages from it. Too often, though, an industrial firm’s first efforts to harness its machine and process data for new reporting or advanced analysis involve simple use cases and outputs that mask what it takes to support a mix of different needs in a scalable and supportable way. DataOps practices provide a way of systematically addressing the steps needed to ensure that your data is available in the right places, at the right times and in the right formats for all the initiatives you’re pursuing.


Industrial data (or OT data) poses particular challenges that your DataOps strategy will need to address:

  • It can be generated at a pace that challenges traditional enterprise (or even cloud-layer) data collection and data management systems – to say nothing of the ingestion and processing costs typical of cloud platforms once reporting and analysis are considered.
  • The data for functionally identical assets or processes is often not generated in a consistent structure or schema.
  • OT data generally does not have context established around each data point – making it difficult to understand what it represents, let alone the meaning inherent in the actual values!
  • Connecting to a mix of asset types with different automation types and communications protocols is often necessary to get a complete data set relevant to the reporting or analytics you’re pursuing.
  • A wide array of uses demands different levels of granularity for some data points, and a breadth of collection points significantly wider than many individual stakeholders may appreciate.

These are the reasons why, in many firms, the engineering team ends up becoming the “data extract/Excel team” – their familiarity with the underlying technology means they can take snapshots and do the data cleansing necessary to make the data useful. But that’s not scalable, and data massaging is a far less impactful use of their time – they should be engaged with the broader team interpreting and acting on the data!


DataOps – Quick Definition

There’s no one way to “do” DataOps. In the industrial world, it’s best thought of as a process involving:

  • Determining the preferred structures and descriptions (models) for OT data, so it can serve the uses the organisation has determined will be valuable.
  • Assessing which approaches to adding such models your organisation can adopt.
  • Choosing the mix of tools needed to add model structures to a combination of existing and new data sources.
  • Establishing procedures to ensure that model definitions don’t become “stale” when business needs change.
  • Establishing procedures to ensure that new or changing data sources are brought into the model-based framework promptly.
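To make the modelling step concrete, here is a minimal sketch (in Python, with invented asset and tag names) of what a reusable model definition and a binding of raw OT tags to it might look like; it illustrates the idea rather than any particular product’s schema.

```python
from dataclasses import dataclass, field

# A minimal sketch of a reusable asset model and a binding of raw OT tags to it.
# The asset type, variable names and tag addresses are hypothetical examples,
# not a specific vendor's schema.

@dataclass
class AssetModel:
    asset_type: str          # e.g. "FormFillSeal"
    variables: list[str]     # the standard variables every instance should expose

@dataclass
class AssetInstance:
    name: str
    model: AssetModel
    tag_bindings: dict[str, str] = field(default_factory=dict)  # variable -> raw tag address

    def missing_bindings(self) -> list[str]:
        """Variables in the model not yet mapped to a raw data source -
        useful for the 'keep definitions from going stale' step."""
        return [v for v in self.model.variables if v not in self.tag_bindings]


ffs_model = AssetModel("FormFillSeal", ["Speed", "WebWidth", "State"])

packer_3 = AssetInstance(
    name="Line2_Packer3",
    model=ffs_model,
    tag_bindings={"Speed": "PLC07.DB12.REAL4", "WebWidth": "PLC07.DB12.REAL8"},
)

print(packer_3.missing_bindings())   # ['State'] - flags a gap to close before go-live
```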


A Rough Map is Better Than No Map.

Take a first pass at capturing all the intended uses of your OT data. What KPIs, what reports, what integration points and what analytics are people looking for? Flesh out those user interests with an understanding of what can feed into them:

  1. Map the different stakeholders’ data needs in terms of how much they come from common sources, and how many needs represent aggregations, calculations or other manipulations of the same raw data.
  2. Flesh out the map by defining the regularity with which data needs to flow to suit the different use cases. Are some uses based on by-shift or daily views of some data? Are other uses based on feeding data in real time between systems to trigger events or actions?
  3. Now consider what data could usefully be “wrapped around” raw OT data to make it easier for the meaning and context of that data to be available for all. Assess what value can come from:
    1. Common descriptive models for assets and processes – a “Form Fill & Seal Machine” with variables like “Speed” and “Web Width” (etc.) is a far easier construct for most people to work with than a database presenting a collection of rows reflecting machines’ logical addresses, with a small library of cryptically structured variables associated with each one.
    2. An enterprise model to help understand the locations and uses of assets and processes. The ISA-95 standard offers useful guidance in establishing such a model (a rough sketch of such a hierarchy follows this list).
    3. Additional reference data to flesh out the descriptive and enterprise models (e.g. the make and model of common asset types that have many vendors, or information about a location such as latitude or elevation). Be guided by what kind of additional data would help in comparing, contrasting and investigating differences in outcomes that need to be addressed.
  4. Now assess what data pools are accumulating already – and how much context is accumulating in those pools. Can you re-use existing investments to support these new efforts, rather than creating a parallel set of solutions?
  5. Finally, inventory the OT in use where potentially useful data is generated, but not captured or stored; particularly note connectivity options.
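As referenced in point 3 above, the sketch below shows one rough, ISA-95-flavoured way to combine an enterprise/site/area/line hierarchy with reference data such as asset make, model and location. The company, site and asset details are invented for illustration.

```python
# A rough ISA-95-flavoured location hierarchy with reference data attached to
# assets. Site names, asset makes and coordinates are invented for illustration.

enterprise = {
    "enterprise": "ExampleFoods",
    "sites": [
        {
            "site": "Leeds",
            "latitude": 53.80, "elevation_m": 63,   # reference data for cross-site comparisons
            "areas": [
                {
                    "area": "Packing",
                    "lines": [
                        {
                            "line": "Line 2",
                            "assets": [
                                {"asset": "Packer 3",
                                 "asset_type": "FormFillSeal",
                                 "make": "AcmePack", "model": "FFS-200"},
                            ],
                        }
                    ],
                }
            ],
        }
    ],
}

def assets_of_type(node, asset_type):
    """Walk the hierarchy and yield (site, area, line, asset) for a given type -
    the kind of comparison the enterprise model and reference data enable."""
    for site in node["sites"]:
        for area in site["areas"]:
            for line in area["lines"]:
                for asset in line["assets"]:
                    if asset["asset_type"] == asset_type:
                        yield site["site"], area["area"], line["line"], asset["asset"]

print(list(assets_of_type(enterprise, "FormFillSeal")))
```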

Avoiding A Common Trap

“Data for analytics” means different things at different stages. A data scientist looking to extract new insights from OT data may need very large data sets in the data centre or cloud, where they can apply machine learning or other “big data” tools to a problem. A process optimisation team deploying a real-time analytic engine to make minute-by-minute use of the outputs of the data scientists’ work may only need small samples across a subset of data points for their part of the work. DataOps thinking will help you ensure that both of these needs are met appropriately.
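A small sketch of the two consumption patterns contrasted in the trap above, using synthetic one-second readings: hourly aggregates for bulk, offline analysis, and a short rolling window of raw values for a deployed real-time analytic. The tag values and the commented scoring call are placeholders.

```python
from collections import deque
from datetime import datetime, timedelta
from statistics import mean

# Synthetic one-second readings standing in for a historian tag.
start = datetime(2022, 5, 19, 6, 0, 0)
readings = [(start + timedelta(seconds=i), 100 + (i % 60) * 0.1) for i in range(7200)]

# 1) Bulk/offline use: aggregate to hourly means before shipping a large set
#    to the data centre or cloud for model building.
hourly = {}
for ts, value in readings:
    hourly.setdefault(ts.replace(minute=0, second=0), []).append(value)
hourly_means = {hour: mean(vals) for hour, vals in hourly.items()}

# 2) Real-time use: keep only a short rolling window of raw values for the
#    deployed analytic to score minute by minute.
window = deque(maxlen=60)
for ts, value in readings[-300:]:
    window.append(value)
    # score_model(window)  # hypothetical call to the deployed analytic

print(len(hourly_means), "hourly aggregates;", len(window), "raw points in the live window")
```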


Map’s Done – Now How About Going Somewhere?

The work that comes next is really the “Ops” part of DataOps. With the rough map of different uses of OT data at hand, and a view of whether each use needs granular data, aggregated data, calculated derivations (like KPIs) or some combination, you’ll be able to determine quickly where generating the desired outputs requires new data pools or streams, and where existing ones can be used. In both cases, your data modelling work will guide what structures and descriptive data need to be incorporated.

At this point, you may find that some existing data pools lend themselves to having asset and descriptive models wrapped around the raw data at the data store level – i.e. centrally. It’s a capability offered in data platform solutions like GE’s Proficy Historian, and it can make more sense than extracting data sets simply to add model data and then re-writing the results to a fresh data store. Streaming/real-time sources typically offer more choice in how best to add the model around the raw data – there are solutions, like HighByte’s Intelligence Hub, that allow the model information to be added at the “edge”: the point where the data is captured in the first place. With the model definitions included at this point, you can set up multiple output streams – some feeding in-the-moment views or integration points, some feeding data stores. In both cases, imposing the model data at the edge makes it easier for the ultimate user of the data to understand the context and meaning of what’s in the stream.
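The sketch below illustrates the edge pattern described above in generic terms: a raw tag sample is wrapped with model context and then published to more than one destination. The tag address, model fields and publish functions are placeholders rather than the API of Proficy Historian, Intelligence Hub or any other product.

```python
import json
from datetime import datetime, timezone

# Generic sketch of adding model context at the edge and publishing the same
# payload to more than one destination. The tag address, model fields and the
# two publish functions are placeholders, not a specific product's API.

MODEL_CONTEXT = {
    "asset": "Line2_Packer3",
    "asset_type": "FormFillSeal",
    "site": "Leeds",
    "variable": "Speed",
    "unit": "packs/min",
}

def contextualise(raw_tag: str, raw_value: float) -> dict:
    """Wrap a raw tag sample with the descriptive model before it leaves the edge."""
    return {
        **MODEL_CONTEXT,
        "source_tag": raw_tag,
        "value": raw_value,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

def publish_realtime(payload: dict) -> None:
    print("-> real-time stream:", json.dumps(payload))   # e.g. an MQTT topic in practice

def publish_to_store(payload: dict) -> None:
    print("-> historian/store:", json.dumps(payload))    # e.g. a historian or data lake

sample = contextualise("PLC07.DB12.REAL4", 182.0)
for sink in (publish_realtime, publish_to_store):
    sink(sample)
```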



Edge Tools vs Central

Realistically, you’re likely to need both, and the driving factor will not necessarily be technical.

Edge works better when:
  1. You have a team that deals well with spreading standardised templates.
  2. Data sources are subject to less frequent change (utility assets are a good example of this).
  3. The use cases require relatively straightforward “wrapping” of raw data with model information.

Central works better when:
  1. The skills and disciplines to manage templates across many edge data collection footprints are scarce.
  2. The mix of ultimate uses of the data is more complex – requiring more calculations, derivations or modelling of relationships between different types of data sources.
  3. Change in the underlying data sources is frequent enough that some level of dedicated and/or systematised change detection and remediation is needed.


Regardless of which tools are applied, the model definitions established earlier, applied consistently, ensure that different reports, calculations and integration tools can be developed more easily – and adapted more easily as individual data sources under the models are inevitably tweaked, upgraded or replaced. As new automation or sensors come in, their unique data structures simply need to be bound to the models representing them, and the “consumers” of their outputs will continue to work. So, while tools will be needed, ultimately the most valuable part of “doing” DataOps is the thinking that goes into deciding what needs to be wrapped around raw data for it to become the fuel for your digital journey.

1,000 miles or around the block: Start one step at a time…
https://ideashub.novotek.com/1000-miles-or-around-the-block-start-one-step-at-a-time/ | Wed, 16 Mar 2022

The rise of connected industrial technologies and Industry 4.0 has prompted the development and launch of countless systems with extensive capabilities and functions. This is often beneficial for businesses with a defined long-term strategy, but it can force early adoption and spending when deployments and licensing outstrip a company’s capacity to change work processes and adopt new technology.


Here, Sean Robinson, software solutions manager at Novotek UK and Ireland, explains how less can be more with new plant tech deployments – and why immediate problem-solving needs to be a distinct effort within longer-term strategies.


Countless proverbs, maxims and quotes have been formed around the idea of moderation, dating back to Ancient Greek society or even earlier. The notion remains important to this day for everything from diet to technology. However, engineers and plant managers frequently over-indulge in the latter, over-specifying systems that offer functionality well beyond what is necessary or even practically useful.

It can initially appear that there is no harm in opting for an automation or plant IT system that has extensive functionality, because this may help to solve future problems as they arise. That said, an investment positioned to be all-encompassing – like a full, material-receiving-through-WIP-execution-with-performance-analysis-and-enterprise-integration manufacturing execution system (MES) – can present its own barriers to adoption for certain businesses, especially those in sectors that favour flexibility, such as fast-moving consumer goods (FMCG) or food manufacturing (and, interestingly, increasingly the contract production side of consumer health and life sciences). Where core production processes and related enabling technology are well established, it can be risky, expensive and overkill to treat the need to implement specific new capabilities as the trigger for wholesale replacement or re-working. The key is to identify where critical new functional needs can be implemented around the installed technology base in focused ways that deliver results, while leaving open the option of incrementally adding further functionally focused solutions in a staged way, over time.

At Novotek, our role is to help our customers choose technology that delivers on an immediate need, while opening up the potential to build incrementally in a controlled, low-risk way.

Fortunately, both the licensing models and the technical architectures of plant IT solutions are changing in ways that support this kind of approach. The software and deployment services costs of bringing on board very specific capabilities can be scaled to match the user base and the technical and functional boundaries of a specific need. We can think of these focused deployments as “micro-apps”. A key part of this approach is that the apps aren’t built as bespoke, or as an extension of a legacy (and possibly obsolete) system. Each is a productised solution – with only the “right” parts enabled and delivered to the right stakeholders.

Consider quality in toiletry production, and specifically the challenge of product loss due to variability in the quality of raw materials. It’s safe to assume that a plant will already have local control systems in place to track the overall quality outcomes, but monitoring raw material quality is often left to supplier-side data that may be under-used – serving as a record of supplier compliance with a standard, rather than being used to proactively trigger adjustments in key process settings to avoid losses. In this scenario, an ideal micro-app could focus on capturing raw material data, using machine learning to analyse how existing machines can best process the material lot, and alerting supervisors and process owners to take action. Such a function might have a small number of users; it might even integrate with inventory or quality systems to replace some manual data entry. So the software licensing, services and timelines to deliver impact can all be kept small.

When we consider some of the demands manufacturers now face – qualifying new suppliers and materials, furthering energy and water reduction, adopting more predictive maintenance and asset management strategies – we see a lot of potential to tackle them with focused solutions that happen to borrow from the underlying depth and breadth of MES solutions.
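As a toy illustration of the raw-material micro-app described above, the sketch below scores an incoming (invented) material lot and alerts a process owner with a suggested adjustment. The simple threshold rule stands in for a trained machine-learning model, and the lot attributes and alert channel are hypothetical.

```python
# Toy sketch of the raw-material micro-app described above: score an incoming
# material lot against an acceptable band and alert process owners with a
# suggested setting. The lot data, the scoring rule and the alert channel are
# placeholders - a real deployment would use a trained model and plant systems.

from dataclasses import dataclass
from typing import Optional

@dataclass
class MaterialLot:
    lot_id: str
    supplier: str
    viscosity: float     # hypothetical quality characteristic from goods-in testing

def recommend_settings(lot: MaterialLot) -> Optional[dict]:
    """Stand-in for the machine-learning step: flag lots whose viscosity sits
    outside the band the line normally processes without loss."""
    if not 1.10 <= lot.viscosity <= 1.35:
        # Simple illustrative rule: nudge fill pressure in proportion to deviation.
        adjustment = round((1.225 - lot.viscosity) * 10, 2)
        return {"fill_pressure_offset": adjustment}
    return None

def alert_supervisor(lot: MaterialLot, advice: dict) -> None:
    print(f"Lot {lot.lot_id} ({lot.supplier}): adjust before use -> {advice}")

incoming = MaterialLot("RM-2291", "SupplierA", viscosity=1.42)
advice = recommend_settings(incoming)
if advice:
    alert_supervisor(incoming, advice)
```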

There are unquestionably many cases where a plant-wide solution like an MES is necessary or even preferable. We and our key technology and services partners have delivered many such “complete” systems across the country. However, it should certainly not be considered the only option for agile industrial businesses. If each factory can be thought of as a collection of work processes/functions that need to be delivered, then implementing the supporting/enabling technology as a collection of micro-apps can make sense. And when balancing risk, cost and speed to value, sometimes, moderation in plant technology deployments can provide the most bountiful benefits.

Are your PLCs an easy target? A mindset shift can significantly reduce PLC firmware vulnerabilities
https://ideashub.novotek.com/are-your-plcs-an-easy-target-reduce-plc-firmware-vulnerabilities/ | Thu, 25 Nov 2021

Since the beginning of the COVID-19 pandemic, businesses across the UK have faced a surge in cybercrime. In fact, research indicates that UK businesses experienced one attempted cyberattack every 46 seconds on average in 2020. Industrial businesses are a prime target for hackers and the ramifications of a data breach or denial-of-service attack are far-reaching, making system security imperative. Here, David Evanson, corporate vendor relationship manager at Novotek UK and Ireland, explains how industrial businesses can keep their vital systems secure.

For many business leaders and engineers, it is still tempting to assume that large multinational companies or data-rich digital service providers are the prime targets for hackers. However, the growing volume of cyberattacks on businesses globally shows that any company can be the target of malicious attacks on its systems and services.

According to research by internet service provider Beaming, there were 686,961 attempted system breaches among UK businesses in 2020, marking a 20 per cent increase on 2019. Of these attacks, Beaming noted that one in ten intended to gain control of an Internet of Things (IoT) device — something that indicates a tendency to target system continuity rather than conventional data.

Both factors together are cause for alarm among industrial businesses of all sizes. Hackers are targeting all manner of companies, from start-ups to global organisations, and focussing more on the growing number of internet-connected devices and systems that were previously isolated.

The consequences of a device being compromised range from data extraction to service shutdown, and in either case the financial and production impacts on an industrial business are significant. There is no single quick fix to bolster cybersecurity, because of the varying types of hacks that can take place. Some cyberattacks are complex and sophisticated; others less so. Many attacks on devices tend to fall into the latter category, which means there are some steps industrial businesses can take to minimise risk.

Novotek has been working closely with industrial businesses in the UK and Ireland for decades. One common thing that we have observed with automation hardware and software is that many engineers do not regularly upgrade software or firmware. Instead, there is a tendency to view automation as a one-off, fit-and-forget purchase. The hardware may be physically maintained on a regular schedule, but the invisible software aspect is often neglected.

GE Fanuc Series 90-30

Older firmware is more susceptible to hacks because it often contains unpatched known security vulnerabilities, such as weak authentication algorithms, obsolete encryption technologies or backdoors for unauthorised access. For a programmable logic controller (PLC), older firmware versions make it possible for cyber attackers to change the module state to halt-mode, resulting in a denial-of-service that stops production or prevents critical processes from running.

PLC manufacturers routinely update firmware to ensure it is robust and secure in the face of the changing cyber landscape, but there is not always a set interval between these updates.

In some cases, updates are released in the days or weeks following the discovery of a vulnerability — whether by the manufacturer, white-hat hackers or genuine attackers — to minimise end-user risk. The firmware version’s upgrade notes should outline any exploits that have been fixed.

However, it’s important to note that legacy PLCs may no longer receive firmware updates from the manufacturer if the system has reached obsolescence. Many engineers opt to air-gap older PLCs to minimise the cybersecurity risk, but the lack of firmware support can also create interoperability issues with connected devices. Another part of the network, such as a switch, receiving an update can cause communications and compatibility issues with PLCs running on older versions — yet another reason why systems should run on the most recent software patches.
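One practical, low-effort way to act on this is a periodic audit of installed firmware against a maintained list of minimum recommended versions. The sketch below shows the idea with an invented inventory and advisory table; real device families, versions and advisories would come from the vendors’ own publications.

```python
# Minimal sketch of a periodic firmware audit: compare the installed firmware
# recorded in an asset inventory against a manually maintained table of minimum
# recommended versions. Device names, versions and the advisory table are
# illustrative, not real vendor data.

MIN_SECURE_VERSION = {
    "PACSystems RX3i": (10, 10),
    "Legacy-PLC-A": (7, 80),
}

INVENTORY = [
    {"name": "Filler PLC", "family": "PACSystems RX3i", "firmware": "10.05"},
    {"name": "Boiler PLC", "family": "Legacy-PLC-A", "firmware": "7.85"},
    {"name": "Old line PLC", "family": "Unsupported-PLC-B", "firmware": "3.20"},
]

def parse_version(text: str) -> tuple:
    return tuple(int(part) for part in text.split("."))

for plc in INVENTORY:
    minimum = MIN_SECURE_VERSION.get(plc["family"])
    if minimum is None:
        print(f"{plc['name']}: no supported baseline - candidate for replacement or segmentation")
    elif parse_version(plc["firmware"]) < minimum:
        print(f"{plc['name']}: firmware {plc['firmware']} below recommended "
              f"{'.'.join(map(str, minimum))} - schedule an update")
    else:
        print(f"{plc['name']}: up to date")
```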

At this stage, engineers should invest in a more modern PLC to minimise risk — and, due to the rate of advancement of PLCs in recent years, likely benefit from greater functionality at the same time.

Firmware vulnerabilities are unavoidable, regardless of the quality of the PLC. At Novotek, we give extensive support for the Emerson PACSystems products that we provide to businesses in the UK and Ireland. This involves not only support with firmware updates as they become available, but also guidance on wider system resilience to ensure that businesses are as safe as possible from hardware vulnerabilities. The growth in cyberattacks will continue long beyond the end of the COVID-19 pandemic, and infrastructure and automation are increasingly becoming targets. It may seem a simple step, but taking the same upgrade approach to firmware that we do with conventional computers can help engineers to secure their operations and keep running systems safely.

Obsolescence in pharma automation
https://ideashub.novotek.com/obsolescence-in-pharma-automation/ | Thu, 21 Jan 2021

Despite the life sciences being an industry where precision and quality are vital, pharmaceutical companies have been slow to upgrade their process control automation systems. Here, Sean Robinson, service leader at Novotek UK and Ireland, explains why, as many paper-under-glass data handling systems begin reaching obsolescence, it is time for pharmaceutical companies to upgrade their process control automation without affecting FDA/EU compliance.

Broadly speaking, pharma companies have made little progress in modernising their data collection practices. This is largely due to fears that changes in data handling will affect compliance with regulations laid out by the Food and Drug Administration (FDA) or EU governing bodies. Yet modernisation, through the increased flexibility and efficiency it can bring, can better equip companies to face increased competitive pressure from patent expiry on flagship products, allow more rapid scaling of product and packaging variants, and help address the increasing presence of counterfeit products.

The evolution of Good Manufacturing Practices (GMP – later with an added A for Automated) has been critical in developing and maintaining consumer trust in pharmaceutical products. In practice, though, it has led to conservatism in the way processes are developed and launched, and rigidity in the way data is collected in relation to production activities. The last generational rollout of automation in the pharma sector occurred during the early 2000s, and saw the entrenchment of minimalist approaches to use of tracking systems alongside core automation. While other industries adopted plant IT systems that made extensive use of automated data flows to support performance analysis and continuous improvement efforts, life sciences companies effectively reproduced their old paper tracking systems in electronic format.

Such “paper-under-glass” systems have tended to be very limited in functional scope, often addressing only the digitisation of logging the data directly needed to provide batch reporting that supports the release of product into the marketplace. To this day, many pharmaceutical companies are reluctant to update their paper-under-glass approach due to a misconception that any data collected automatically effectively becomes part of their official batch record regimen, and must therefore be managed in strict accordance with regulations such as the US FDA’s 21 CFR Part 11. While regulators (the FDA in particular) have gone to great lengths to clarify the way such regulations should be interpreted, this initial confusion has had a lasting effect.

Electronic Signature Regulations (ESRs) such as 21 CFR Part 11 outline the requirements for a combination of secure system configuration capabilities, data logging/transactional auditing capabilities and physical data security management. These rules are in place to allow companies to record data with fidelity and to prevent data from being lost.

What is misunderstood here is that data not needed for the batch record does not need to comply with ESRs. Therefore, managing the records used to understand performance improvement opportunities in manufacturing assets and processes does not need to be complex and time-consuming.
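The sketch below illustrates that separation in the simplest possible terms: only records flagged as batch-record-relevant are routed through a controlled, audit-detailed path, while everything else goes to an ordinary store for improvement analysis. The field names and the two in-memory “stores” are placeholders, and this is an illustration of the principle, not a compliance recipe.

```python
# Sketch of the separation described above: only records flagged as part of the
# batch record are routed through the controlled, audit-trailed path; other
# process data goes to an ordinary store for improvement analysis. Field names
# and the two "stores" are placeholders, not a compliance recipe.

from datetime import datetime, timezone

audit_trailed_store = []   # stands in for the controlled, ESR-managed system
improvement_store = []     # stands in for a historian / analysis data pool

def record(point: str, value, batch_record: bool, user: str = "system") -> None:
    entry = {"point": point, "value": value,
             "timestamp": datetime.now(timezone.utc).isoformat()}
    if batch_record:
        entry["recorded_by"] = user          # controlled path keeps the audit detail
        audit_trailed_store.append(entry)
    else:
        improvement_store.append(entry)

record("Sterilisation hold temperature", 121.3, batch_record=True, user="op_142")
record("Chiller motor vibration", 4.7, batch_record=False)

print(len(audit_trailed_store), "regulated records;", len(improvement_store), "improvement records")
```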

Paper-under-glass also often leads to ineffective evaluation of batch quality because the detail in electronic batch records is inadequate to run root cause analysis of quality issues that arise. In the event of a machine breaking down, paper-under-glass systems will not give useful insight into what has caused the malfunction, and companies lose productivity. By limiting the footprint of what data is collected automatically, pharma companies deprive themselves of the rich detailed asset data that would correlate to asset events such as downtime, or to quality events where the underlying issue may be driven by a combination of asset health information, process execution information and material characteristic information. While a paper-under-glass system may help log what is happening, easy access to these additional data sets is crucial to quickly understanding the “why”.
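To show what easy access to these additional data sets can mean in practice, here is a toy correlation of a downtime event with asset health, process and material records from the same time window. All assets, timestamps and values are invented.

```python
# Toy illustration of the "why" question: line up a downtime event with asset
# health, process settings and material data from the same time window. All
# values and field names are invented for illustration.

from datetime import datetime

downtime = {"asset": "Blister Packer 1",
            "start": datetime(2021, 1, 12, 9, 40), "end": datetime(2021, 1, 12, 10, 5)}

asset_health = [
    {"asset": "Blister Packer 1", "time": datetime(2021, 1, 12, 9, 35), "motor_temp_C": 78},
]
process_data = [
    {"asset": "Blister Packer 1", "time": datetime(2021, 1, 12, 9, 38), "sealing_temp_C": 143},
]
material_data = [
    {"asset": "Blister Packer 1", "time": datetime(2021, 1, 12, 9, 30), "foil_lot": "F-8812"},
]

def around_event(rows, event, margin_minutes=15):
    """Return rows for the same asset within a margin of the event window."""
    lo = event["start"].timestamp() - margin_minutes * 60
    hi = event["end"].timestamp() + margin_minutes * 60
    return [r for r in rows
            if r["asset"] == event["asset"] and lo <= r["time"].timestamp() <= hi]

context = {
    "asset_health": around_event(asset_health, downtime),
    "process": around_event(process_data, downtime),
    "material": around_event(material_data, downtime),
}
print(context)
```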

Fortunately, pharmaceutical manufacturers today have clearer insight than they did two decades ago into how automated systems can comply with ESRs and GAMPs. Now, with many of these systems due to be approaching obsolescence in the coming years, it’s time for pharmaceutical businesses to consider how they can adopt automation and plant IT that supports improvements in all aspects of production without impacting regulatory compliance.

Novotek’s experience in the sector has shown that there are a number of ways to deploy solutions, ranging from SCADA systems with batch execution footprints, to historians, to manufacturing execution systems. For each of these, there are well-understood approaches where the configuration ensures that compliance-critical data is logged in appropriately controlled ways, while easing the collection of separate data for root cause and improvement analysis. Best-in-class pharmaceutical companies typically leverage a suite of solutions providing a mix of execution support, track and trace, and detailed end-to-end visibility into the production process and assets. From that, the mix of capabilities extends to include:

  • Eased proof of quality and compliance
  • Faster and deeper understanding of root causes of quality events and productivity losses
  • Simpler deployment of anti-counterfeiting measures (as the mix of modern IT and automation solutions opens up better methods than some of the bolt-ons built around now-obsolete automation and ERP platforms)
  • Greater adaptability of production lines to run a broader range of qualified products

As conditions in the life sciences marketplace become tougher for producers, we at Novotek believe that there is an opportunity for firms to rethink the role of production technology – to see it as a source of competitive advantage. And the generational change in systems now underway due to obsolescence means now is the time for pharmaceutical businesses to seize that advantage.
