Blogs & Articles

Digital Transformation to speed the search for new materials

Traditional approaches to making, testing, and deciding on new materials need to be replaced by a holistic digital solution

Materials scientists today can engineer complex nanostructures and model topologically intricate architectures with relative ease. But unlike in other fields, they can’t seem to shed antiquated workflows and outmoded data storage and analysis tools.

In a new white paper, researchers in the cheminformatics group at Revvity Signals argue that this paradox of progress is quite pronounced on the applications side. When developing new materials for specific applications — be they biodegradable plastics, hydrogels, battery electrolytes or perfumes — materials scientists routinely find themselves sorting through tab after tab of data on old-school spreadsheets.

To remedy this all-too-familiar situation, the paper puts forward a new vision for how materials scientists might work. The authors detail an end-to-end approach to materials development that combines the capability and speed of an online search engine with the flexibility and ease of use of apps on a mobile phone. They also explain how this vision underpins the company’s new integrated informatics platform.

Digital transformation has become a bit of a buzzword. But putting in place good modern systems really does matter — you can do better science and make better decisions, which will translate into a commercial advantage.

Make, Test, Decide

The materials development process breaks into three broad stages: Make, Test, and Decide. In each one, new digital technologies can increase efficiency, remove bias, and support reproducibility.

In the Make stage, researchers synthesize new materials for testing, and a good recording system is vital. Time-tested paper laboratory notebooks served that purpose for centuries. Electronic lab notebooks (ELNs) are an improvement, offering simplified data recording, but it can be difficult to access data from them later. That is because first-generation e-notebooks save data in databases that were designed to minimize storage space. In contrast, tagging technology, which lets both the software and the user attach an unlimited number of descriptive tags to data, prioritizes data accessibility over storage space.
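To make the tagging idea concrete, here is a minimal sketch of tag-based retrieval, assuming a simple in-memory store; the TaggedExperimentStore name, record IDs, and tags are invented for illustration and do not reflect any particular vendor's implementation.

```python
from collections import defaultdict

class TaggedExperimentStore:
    """Minimal illustration of tag-based retrieval for experiment records."""

    def __init__(self):
        self._records = {}                  # record_id -> payload
        self._tag_index = defaultdict(set)  # tag -> set of record_ids

    def add(self, record_id, payload, tags):
        """Store a record and index it under any number of descriptive tags."""
        self._records[record_id] = payload
        for tag in tags:
            self._tag_index[tag.lower()].add(record_id)

    def find(self, *tags):
        """Return the records carrying all of the requested tags."""
        if not tags:
            return []
        ids = set.intersection(*(self._tag_index[t.lower()] for t in tags))
        return [self._records[i] for i in ids]

# Usage: retrieval depends only on the tags a record carries,
# not on how or where it was stored.
store = TaggedExperimentStore()
store.add("exp-001", {"yield_pct": 87}, tags=["hydrogel", "batch-3", "UV-cured"])
store.add("exp-002", {"yield_pct": 62}, tags=["hydrogel", "batch-4"])
print(store.find("hydrogel", "batch-3"))   # [{'yield_pct': 87}]
```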

The Test stage involves measuring various properties of a material, which means researchers need an objective way to select the optimal set of testing parameters. Traditionally, this selection has been the domain of human experts and their intuition, but data-driven approaches can eliminate human bias and minimize the risk of settling on sub-optimal parameters.
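One common data-driven tactic for choosing test conditions without expert bias is a space-filling design such as Latin hypercube sampling. The sketch below assumes NumPy is available; the parameter names and ranges are hypothetical and stand in for whatever properties a team actually screens.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

def latin_hypercube(n_samples, bounds):
    """Space-filling sample of the parameter space: one point per stratum per axis."""
    dims = len(bounds)
    # One stratified uniform draw per dimension, then shuffle each dimension's strata.
    cut = (np.arange(n_samples) + rng.uniform(size=(dims, n_samples))) / n_samples
    for row in cut:
        rng.shuffle(row)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + cut.T * (hi - lo)

# Hypothetical test parameters for a battery-electrolyte screen.
bounds = [(20.0, 80.0),   # temperature, degrees C
          (0.5, 2.0),     # salt concentration, mol/L
          (1.0, 10.0)]    # scan rate, mV/s
plan = latin_hypercube(n_samples=8, bounds=bounds)
print(plan.round(2))      # 8 test conditions spread evenly across the space
```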

When it comes to processing the experimental data generated during the Test stage, materials scientists gravitate toward two approaches: homegrown spreadsheets, which offer flexibility but are difficult for non-experts to create, and bespoke commercial tools, which are easy to use but hard to modify because they are designed for a single task. Ideally, the authors say, materials scientists shouldn’t have to choose between flexibility and ease of use. Rather, in an environment that resembles apps on a smartphone, they can select the appropriate module for the data-processing task at hand.

You have this tension between a generic tool like Excel and bespoke tools. One solution to that is an environment containing a suite of applications.
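As a rough illustration of that app-suite idea (not the platform's actual architecture), a registry of small, single-purpose processing modules might look like the sketch below; the module names and the toy spectrum data are invented.

```python
from typing import Callable, Dict

# Hypothetical registry: each "app" is a small, single-purpose data-processing module.
APPS: Dict[str, Callable] = {}

def register(name: str):
    def wrap(func: Callable) -> Callable:
        APPS[name] = func
        return func
    return wrap

@register("baseline_correct")
def baseline_correct(spectrum):
    """Subtract the minimum intensity as a crude baseline correction."""
    floor = min(spectrum)
    return [y - floor for y in spectrum]

@register("normalize")
def normalize(spectrum):
    """Scale intensities to a 0-1 range."""
    peak = max(spectrum) or 1.0
    return [y / peak for y in spectrum]

# Users pick the module for the task at hand, as they would pick an app on a phone.
raw = [3.0, 5.5, 12.0, 4.1]
processed = APPS["normalize"](APPS["baseline_correct"](raw))
print(processed)
```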

Finally, in the all-important Decide stage, researchers select the most promising candidate materials for development. To improve workflows, the authors argue, researchers should harness the same indexing technology that Google and Amazon use to collect and quickly access data. Indexing offers both flexibility and speed: flexibility because an index can incorporate data from multiple sources and formats, and speed because the algorithms that generate selections from indexed data are extremely efficient. This contrasts with conventional approaches, which often involve picking a material by laboriously scanning an Excel sheet of data.
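The underlying idea is an inverted index: every field/value pair points at the records that contain it, so a query reduces to fast set intersections regardless of where the data originated. The sketch below is a toy version with invented material records, not the indexing stack any particular vendor ships.

```python
from collections import defaultdict

# Records pulled from different sources and formats; field names are illustrative.
records = [
    {"id": "M-101", "source": "eln",         "polymer": "PLA", "tensile_mpa": 60},
    {"id": "M-102", "source": "lims_csv",    "polymer": "PHB", "tensile_mpa": 40},
    {"id": "M-103", "source": "supplier_db", "polymer": "PLA", "tensile_mpa": 55},
]

# Build the inverted index: each field/value pair maps to the records containing it.
index = defaultdict(set)
for rec in records:
    for field, value in rec.items():
        if field != "id":
            index[(field, value)].add(rec["id"])

def lookup(**criteria):
    """Intersect the postings lists: fast, and indifferent to the data's origin."""
    hits = [index[(f, v)] for f, v in criteria.items()]
    return set.intersection(*hits) if hits else set()

print(lookup(polymer="PLA"))                  # {'M-101', 'M-103'}
print(lookup(polymer="PLA", tensile_mpa=60))  # {'M-101'}
```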

Applying a comprehensive informatics platform to the Make–Test–Decide process promises to both greatly simplify the development of new materials and cut the costs involved. Investing in technology for conducting materials research can have a huge impact.

For our materials research customers, our integrated informatics solutions can foster more efficient and successful materials development. What does this mean? Companies can innovate quickly, accelerate product development, and get to market faster by leveraging a powerful enterprise product suite, including ChemDraw, Signals Lead Discovery, Signals Notebook, and Spotfire®.

We enable our materials science customers to revolutionize the research and development of the materials they need to make, as well as energy, chemicals, and food. We help them innovate and accelerate their R&D cycle so they can bring products to market faster, deepen scientific insight, and achieve breakthrough innovations.

To learn more about how an integrated informatics platform can foster more efficient and successful materials development, read our new white paper, Digital Transformation Journeys for Material Science. Materials science is one of the areas we support: our solutions for Industrial Segments serve an array of industries, including specialty chemicals, agrochemicals, energy & petrochemicals, flavors & fragrances, food & beverage, and electronics.

Data Collection Geared Toward Translational Medicine

Blog: Clinical and Translational

Translational medicine is lab- and data-driven, taking a bench-to-bedside approach to developing therapeutic strategies efficiently. It identifies biomarkers that can then be used to inform patients’ molecular profiles and disease etiologies. Biomarkers can include blood sugar levels that identify patients with diabetes, or gene mutations that signal a patient’s risk of developing cancer.

Translational medicine’s use of molecular profiles has helped create drugs that are specialized to target specific pathways based on a patient’s diagnosis. Compared with one-size-fits-all drug production, this approach produces better results with fewer side effects. Done well, translational medicine is an effective method, with financial benefits on top of health achievements. That said, it’s a data-heavy activity. Here’s an overview of working with data in translational medicine:

Data Collection

From the lab to treatment, ample information is collected throughout the process. For successful drug development, all this data must be sorted and analyzed; to make that easier, data should be responsibly collected from the start using a standardized, efficient practice.

Guidelines for data collection include:
• Collecting enough samples that you can establish statistical significance
• Using the same clinical samples across the entire population
• Using data models with datasets from various sources
• Making sure data is clean and well-curated enough for cross-study analysis (a brief harmonization sketch follows this list)
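As a small illustration of mapping datasets from various sources onto one shared, cross-study data model, the sketch below uses pandas with two invented studies; the column names, unit conversion, and curation rule are assumptions for demonstration only.

```python
import pandas as pd

# Two hypothetical studies with inconsistent column names and units.
study_a = pd.DataFrame({"patient_id": ["A1", "A2"],
                        "glucose_mg_dl": [110, 145],
                        "mutation": ["EGFR", None]})
study_b = pd.DataFrame({"subject": ["B1", "B2"],
                        "glucose_mmol_l": [6.2, 8.9],
                        "mutation": [None, "KRAS"]})

# Map each study onto one shared data model before pooling.
study_a = study_a.rename(columns={"patient_id": "subject_id"})
study_b = study_b.rename(columns={"subject": "subject_id"})
study_b["glucose_mg_dl"] = study_b.pop("glucose_mmol_l") * 18.0  # mmol/L -> mg/dL

pooled = pd.concat([study_a.assign(study="A"), study_b.assign(study="B")],
                   ignore_index=True)
# Basic curation: drop records missing the key measurement.
pooled = pooled.dropna(subset=["glucose_mg_dl"])
print(pooled)
```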

Not only is quality data useful for developing effective drugs, it can also be used retroactively to determine why some drugs weren’t working. For example, the compound behind Keytruda, now used to treat non-small cell lung cancer, was originally developed to modulate immune responses in autoimmune diseases, where it proved ineffective. The drug now serves an entirely different purpose than intended.

Data Simplification

The data analysis that follows large-scale genome projects can end up fragmented along the way. The proliferation of analysis pipelines and informational silos can make it difficult for scientists and clinicians to collaborate.

To be effective, all of this needs to be sorted and broken down. A good first step is making sure the data is accessible across the board, so that non-experts and cross-functional teams can look at the data, analyze it, and apply their biological understanding. Implementing strategies to streamline data access saves time.

Scalable data management and accessible data tools are vital for translational medicine to succeed; with those in place, patients can expect to receive valuable, effective drugs.
