The lab of the 21st century

By Daniela Jansen, Director of Product Marketing, BIOVIA, Dassault Systèmes
Monday, 28 March, 2022



For many years the laboratory informatics community has been talking about the ‘lab of the future’, ‘Lab 4.0’, the ‘smart lab’ or the ‘lab of the 21st century’. We have been in the 21st century for quite a while now — but what has happened so far?

Labs have moved from paper to electronic records ('paper on glass'), and some have gone further, implementing digital workflows to become digital labs. But if the lab of the 21st century is supposed to be new and transformative, this is not enough: labs need to become truly digitalised, using digital technologies to change the business model itself.

The key to digital lab operations, and even more so to digitalised ones, is connectivity: it builds the foundation for a new way of working.

Many companies have considered, and even tested, the new technologies that have emerged: cloud computing, data lakes, the Internet of Laboratory Things, AI and machine learning, virtual and augmented reality, and voice control. Are these technologies relevant for the lab of the 21st century? Do they provide any value?

Only connectivity allows organisations to leverage these advanced technologies and to improve the experience of lab scientists, the productivity of the lab and the reuse of scientific data.

Connectivity

The most basic connectivity in the lab is between data generators (instruments, or any application used to capture or enter data) and data consumers: the tools that create reports and documents, generate analytics and dashboards, and provide secure long-term storage for the large volumes of valuable data generated in the lab.
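As a minimal illustration of this generator-to-consumer flow, the Python sketch below parses a hypothetical balance export once and serves the same normalised records to two consumers, a report and a dashboard summary. The file name, column names and units are illustrative assumptions, not any particular vendor's format.

```python
# A minimal sketch of generator-to-consumer connectivity: one capture step,
# two consumers reading the same normalised records.
import csv
from statistics import mean

def capture_instrument_data(path: str) -> list[dict]:
    """Parse a raw instrument export into normalised measurement records."""
    with open(path, newline="") as f:
        return [
            {"sample_id": row["sample_id"], "mass_mg": float(row["mass_mg"])}
            for row in csv.DictReader(f)
        ]

def build_report(records: list[dict]) -> str:
    """Consumer 1: a human-readable report line per sample."""
    return "\n".join(f"{r['sample_id']}: {r['mass_mg']:.2f} mg" for r in records)

def dashboard_summary(records: list[dict]) -> dict:
    """Consumer 2: aggregate figures for an analytics dashboard."""
    masses = [r["mass_mg"] for r in records]
    return {"n_samples": len(masses), "mean_mass_mg": mean(masses)}

records = capture_instrument_data("balance_export.csv")  # hypothetical file
print(build_report(records))
print(dashboard_summary(records))
```

Because both consumers read the same normalised records, a new consumer (say, a long-term archive) can be added without touching the capture step.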

However, laboratory operations encompass more than this, and connectivity must go much further. Systems to manage samples and chemicals, as well as equipment and personnel, are part of the lab environment. So are processes such as method development and execution, the preparation of samples and experiments, and analysis, reporting and decision-making. Moreover, the lab must be able to communicate bi-directionally with an organisation's business systems, such as the enterprise resource planning (ERP) system. Integrating all these elements sounds complex, but the benefits are significant.
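To make the ERP link concrete, here is a minimal sketch of bi-directional lab-to-ERP communication, assuming a hypothetical REST interface: the lab pulls open test requests from the ERP system and posts results back once testing is complete. The base URL, endpoints and payload fields are invented for illustration and do not belong to any real ERP product.

```python
# A minimal sketch of bi-directional lab/ERP communication over a
# hypothetical REST API (all endpoints and fields are illustrative).
import requests

ERP_BASE = "https://erp.example.com/api"  # hypothetical endpoint

def fetch_open_test_requests() -> list[dict]:
    """ERP -> lab: retrieve batches waiting for QC testing."""
    resp = requests.get(f"{ERP_BASE}/test-requests", params={"status": "open"})
    resp.raise_for_status()
    return resp.json()

def report_result(request_id: str, result: dict) -> None:
    """Lab -> ERP: post the finished result so the batch can be released."""
    resp = requests.post(f"{ERP_BASE}/test-requests/{request_id}/result",
                         json=result)
    resp.raise_for_status()

for req in fetch_open_test_requests():
    # ... run the test in the lab, then close the loop in the ERP system
    report_result(req["id"], {"assay_pct": 99.2, "status": "passed"})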

Technology and standardisation

What is required to enable this connectivity? The basis is an infrastructure that is platform-based, to provide the required backbone; cloud-enabled, to support an agile, collaborative way of working with a low total cost of ownership (TCO); and able to leverage a data lake for long-term storage of many different data types. However, technology is not enough. Standardisation is key, and different aspects of standardisation have to be considered.

One important aspect is the data format itself, within a standardised framework. A standard data format must specify a vendor- and technique-agnostic way to store the data and its contextual metadata, allowing both long-term and real-time access. Standardised taxonomies and ontologies provide a controlled vocabulary and defined relationships for that contextual metadata, covering material, equipment, process, results and properties. The output of pre-competitive consortia such as Allotrope and the Pistoia Alliance can help here.
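As a rough sketch of what such a record might look like, the example below stores a measurement with contextual metadata drawn from a controlled vocabulary and validates the terms before the record is accepted. The field names and vocabulary are invented for illustration; a real deployment would follow a published standard such as Allotrope's data models.

```python
# A minimal sketch of a vendor- and technique-agnostic record: raw values
# plus contextual metadata expressed with controlled-vocabulary terms.
import json

# Illustrative shared vocabulary, not a real published ontology.
CONTROLLED_VOCAB = {
    "equipment": {"HPLC", "BALANCE"},
    "process": {"ASSAY", "STABILITY"},
}

record = {
    "sample_id": "S-2022-0317",
    "equipment": "HPLC",          # controlled term, not a vendor model name
    "process": "ASSAY",
    "result": {"value": 99.2, "unit": "percent"},
    "acquired_at": "2022-03-28T09:15:00Z",
}

# Validate the contextual metadata against the shared vocabulary before
# the record is accepted into long-term storage.
for field, allowed in CONTROLLED_VOCAB.items():
    assert record[field] in allowed, f"unknown {field} term: {record[field]}"

print(json.dumps(record, indent=2))  # vendor-neutral, self-describing payload
```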

In addition, labs should ensure their scientific data are FAIR (findable, accessible, interoperable, reusable). The FAIR data principles act as a guideline to support the implementation of technology. They make data more valuable because it becomes easier to find, combine and integrate thanks to a formal, shared knowledge representation.

The value of connectivity

  • Enables liquid data and information flow between people and systems, vertically and horizontally.
  • Helps to overcome data silos and departmental disconnect.
  • Supports processes like materials characterisation, formulations, process development, stability studies or batch releases in one seamless experience.
  • Increases lab productivity by up to 40%.
  • Ensures data integrity and improves data quality.
  • Drives collaboration and data sharing on a global scale.
  • Allows integration of new advanced technology and devices.
  • Enables data- and knowledge-based decision-making — in real time.
  • Creates a new transformative user experience.

One step further

What if we think a step further? If we connect to experts in modelling and simulation, we can replace physical testing with virtual testing without having to build in-house expertise. If we connect to other labs, in-house or external, we can optimise the scheduling of lab work in unprecedented ways, removing testing bottlenecks throughout the organisation. This additional level of connectivity will raise laboratory productivity across departments, provide deeper insight into work throughout the value chain and drive successful innovation in a complex business environment.

How far along are you on your journey into the lab of the 21st century? To explore this further, watch this webinar about labs in the 21st century: shorturl.at/lqACG.

Image credit: ©stock.adobe.com/au/metamorworks



