Google Cloud Integrates with FDA MyStudies

Google Cloud recently announced that it is working to expand FDA MyStudies, an open-source technology platform that supports research organizations as they collect and report real-world data for regulatory submissions. LabKey provides the backend data management for FDA MyStudies via a distributed data model that securely partitions personally identifiable information to help ensure patient privacy.

With Google Cloud, organizations now have an additional option for securely hosting their deployment of MyStudies. This secure hosting gives organizations safeguards in the ownership and management of study data, including the ability to select which of their researchers and clinicians can access which data, and to help optimize the use of that data as directed by participants.

In developing the FDA MyStudies platform, LabKey Server was selected as the data management partner due to its flexible open architecture and ability to securely handle personal health information in a compliant manner. LabKey Server also provides role-based governance of the data stored on the Registration and Response servers to ensure that data is only accessible to authorized users. 

At LabKey, we are excited about the integration with Google Cloud and look forward to having an additional option for secure hosting of the FDA MyStudies platform.

Additional Resource:
Case Study – FDA MyStudies Mobile App

Centralizing Biologics Assay Data with LabKey Biologics

Video: https://youtu.be/2WpQ_ncpsMQ

Centralizing biologics assay data so it can be easily viewed and analyzed in a single place is one of the core challenges biotherapeutic research teams encounter. Without centralized assay data, teams must turn to time-consuming and error-prone manual integration methods to collaborate and maximize the value of their analytical data.

LabKey Server’s robust mechanisms for defining assays and uploading assay data help bring analytical data into a single system, but our biologics software, LabKey Biologics, takes this a step further, providing valuable structure and connections that help biologics research teams understand how that analytical data fits into the larger context.

LabKey Assay Designs: Standardizing Analytical Data Structures

LabKey Biologics allows teams to define or customize biologics assay designs (many editable assay design templates come pre-defined within the system) to capture assay data in a specific structure. Some teams using LabKey Biologics may need to define only a few assay designs to support their research, while others working in a more complex environment may need to define 100+.

When building an assay design, laboratory teams can choose to add fields specific to batches, runs and results to provide additional context for their data. LabKey supports a wide variety of field types (including integer, float, boolean, and text) and users can configure aliases, validation rules, and a variety of other characteristics for each.
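As a sketch of the concept, an assay design can be thought of as a mapping of field names to types, against which incoming result rows are checked. The snippet below is a toy model of that idea; the field names, type labels, and validation logic are illustrative, not LabKey's actual schema format.

```python
# Minimal model of an assay design: field name -> type label.
# Mirrors the field types mentioned above (integer, float, boolean, text).
FIELD_TYPES = {"integer": int, "float": float, "boolean": bool, "text": str}

def validate_row(design, row):
    """Check one result row against a minimal assay-design spec.

    Returns a list of human-readable problems (empty means the row
    conforms). This is a toy validator, not LabKey's implementation.
    """
    errors = []
    for field, type_name in design.items():
        value = row.get(field)
        if value is None:
            errors.append(f"missing field: {field}")
        elif not isinstance(value, FIELD_TYPES[type_name]):
            errors.append(f"{field}: expected {type_name}")
    return errors

# Hypothetical design with batch/run/result-style fields.
design = {"WellCount": "integer", "Concentration": "float", "Passed": "boolean"}
```

A well-formed row such as `{"WellCount": 96, "Concentration": 1.5, "Passed": True}` validates cleanly, while a row with a quoted number or a missing field is flagged before it pollutes the centralized store.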

Centralizing Assay Data In the System

Once an assay design has been built in LabKey Biologics, data can be added to the system using that design to provide its structure. There are a number of different ways to load your assay data into LabKey Biologics, including:

  • Automatically uploading via the LabKey API
  • Manually uploading spreadsheets
  • Pasting spreadsheet-type data
  • Entering data into a form

The first strategy, automatic upload, leverages the APIs of LabKey Biologics to allow instruments or file systems to talk directly to the system. The other three strategies involve a manual process suited to varying use cases. If you have a large spreadsheet to integrate, you might use the manual upload method or paste the tabular data directly into the application. If you have just a couple of values to enter, typing the data into a form in the LabKey Biologics UI might be the simplest strategy.
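As an illustration of the paste-spreadsheet path, the sketch below parses tab-separated text into the list-of-dictionaries shape that client APIs typically expect. The LabKey client call shown in the trailing comment is indicative only (it requires a live server and is not executed here), and the schema and query names are hypothetical.

```python
import csv
import io

def tsv_to_rows(tsv_text):
    """Parse pasted spreadsheet-style (tab-separated) data into a list
    of row dictionaries keyed by the header line."""
    reader = csv.DictReader(io.StringIO(tsv_text), delimiter="\t")
    return [dict(row) for row in reader]

pasted = "SampleId\tConcentration\nS-001\t1.25\nS-002\t0.80\n"
rows = tsv_to_rows(pasted)

# With the official `labkey` Python client (not run here; requires a
# live server, and the project/schema names below are made up):
#   from labkey.api_wrapper import APIWrapper
#   api = APIWrapper("example.labkey.com", "MyProject")
#   api.query.insert_rows("samples", "Samples", rows)
```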

Leveraging Sample Lineage for Context

Analytical data generated during experiments is tied to a specific sample. Because sample lineage is tracked in LabKey Biologics, the application automatically queries lineage when data is uploaded and presents relevant biological entities side-by-side with analytical data in the bioregistry. Showing sequence or molecule information adjacent to the analytical data allows scientists to easily ask questions of their data and compare against sequences seen in other experiments.
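To make the lineage idea concrete, the sketch below walks a simple parent map (child sample to parent samples) and collects every upstream entity, which is the kind of traversal a lineage query performs. This is illustrative only; LabKey stores and queries lineage server-side, and the sample names are invented.

```python
def ancestors(sample_id, parents):
    """Return every sample upstream of `sample_id`, given a map of
    child ID -> list of parent IDs. Iterative walk avoids recursion
    limits on deep lineages."""
    seen = []
    stack = list(parents.get(sample_id, []))
    while stack:
        parent = stack.pop()
        if parent not in seen:
            seen.append(parent)
            stack.extend(parents.get(parent, []))
    return seen

# Hypothetical lineage: an aliquot derived from a sample, which came
# from an expression batch of a registered molecule.
lineage = {
    "Aliquot-7": ["Sample-3"],
    "Sample-3": ["Batch-1"],
    "Batch-1": ["Molecule-42"],
}
```

Calling `ancestors("Aliquot-7", lineage)` surfaces the sample, batch, and molecule that give the aliquot's assay results their biological context.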

Ready to see this functionality in action? Watch the quick look video above. For more information or to request a demo, click here.

4 Tips for Improving Lab Sample Management Workflows

Research and clinical laboratories alike are continuously striving to improve their lab sample management workflows. As more organizations invest in translational and collaborative research, the need to collect samples with higher integrity and reproducibility increases. By using sample management software to improve their laboratory workflows, labs can generate more reproducible and reliable endpoints while also boosting processing efficiency. Standardizing processes and workflows can help improve identification methods and reduce processing errors within laboratories. 

Common ways to improve your lab sample management workflows include:

  1. Standardize the processes within your lab.
    Developing Standard Operating Procedures (SOPs) for your laboratory sample management increases the reliability of endpoint samples and therefore the assay results. Well-developed SOPs will account for common troubleshooting methods and provide mechanisms for users to document abnormalities. This in turn will provide more insight to groups downstream about the sample fidelity. SOPs also act as excellent training resources for new staff and ensure the entire team is well trained and prepared for managing samples in the lab.

  2. Determine your laboratory capacity.
Establishing capacity for any workflow is important. Determining the capacity of both your laboratory staff and your equipment establishes limits that can guide the most effective and reliable lab sample management workflow. By evaluating the available capacity of staff and equipment, you can gain new insights that improve lab and sample processing efficiency.
     
  3. Automate where possible.
    By introducing automation methods, even ones as straightforward as barcode generation and scanning, you can improve laboratory capacity and reduce the potential for errors. Barcodes in particular enhance sample identification and minimize additional effort from laboratory staff. Other advanced automation methods, such as robotic processing, may also provide value but are typically more costly and can increase processing time in some cases.

  4. Deploy integrated systems that help laboratory work.
    Using software systems in the lab that enhance sample management processes rather than hinder them is important for gaining adoption and boosting laboratory morale. With integrated, easy-to-use systems, researchers can perform work faster with less ramp-up time and recognize discrepancies more easily. Downstream, integrated systems give organization leadership mechanisms to track workflow data over time and easily adjust workflows to improve processing.
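As a small example of the barcode idea in tip 3, many labeling schemes append a check digit so a single mis-scanned or mistyped digit is caught immediately. The sketch below uses the standard Luhn algorithm; real LIMS barcodes vary (Code 128 symbology, different check schemes), so treat this as one illustrative approach rather than a prescribed format.

```python
def add_check_digit(numeric_id):
    """Append a Luhn check digit to a numeric sample ID string so that
    single-digit scan or typing errors are detectable."""
    total = 0
    # Luhn: double every second digit from the right; subtract 9 if > 9.
    for i, ch in enumerate(reversed(numeric_id)):
        d = int(ch)
        if i % 2 == 0:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return numeric_id + str((10 - total % 10) % 10)

def is_valid(barcode):
    """True if the barcode's final digit matches its Luhn check digit."""
    return add_check_digit(barcode[:-1]) == barcode
```

A scanner (or data-entry form) can call `is_valid` before accepting an ID, rejecting corrupted reads without any manual cross-checking.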

LabKey Sample Manager

Developed in coordination with our lab partners, LabKey Sample Manager has been intuitively designed to boost the efficiency and productivity of laboratories. The application allows for the registration and tracking of samples and their derivatives, as well as the assignment of samples to custom sample processing workflow jobs. This provides a flexible yet standard mechanism of capturing tasks in each workflow, and can act as a virtual record of sample processing. Additionally, Sample Manager allows laboratory staff to easily capture data generated from these workflows and associate them with samples. 

Contact Us for more information regarding Sample Manager.

Related Post: Benefits of Unifying Lab Samples with Assay Data

Our Commitment to Data Security

Since our inception in 2003, security has been of the utmost importance to LabKey and our research partners. We take a proactive approach to improving the platform to overcome the ever-evolving cyber security challenges faced by companies in every industry. 

At LabKey, security begins with educating our developers on avoiding security vulnerabilities through proper coding and best practices. Every feature is designed with security in mind, ensuring that only authorized users are able to view and modify data. Each release of LabKey Server undergoes extensive automated and manual testing to identify potential bugs and vulnerabilities before release. Additionally, we employ automated security scanners and regularly use independent cyber security firms to perform manual penetration testing of the platform. Detailed reports from these efforts are reviewed and any issues are quickly fixed. 

Recently Added Security Features

LabKey releases three major versions of the LabKey Server platform each year followed by subsequent maintenance releases. We encourage our users to always update to the most current release of LabKey Server to ensure they have the maximum protection against security risks. Below are just a few security features included in recent releases. 

Cross-Site Request Forgery (CSRF) Protection – Beginning with version 19.1, LabKey Server enforces CSRF protection (requiring verification of a CSRF token) on all POST requests and, as of 19.3, detects and blocks any mutating operation attempted outside of a CSRF-protected POST request. For more information on CSRF attacks, click here.
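The token check described here follows the common synchronizer-token pattern: the server mints a random token tied to the session, embeds it in pages, and rejects any mutating request that does not echo it back. The sketch below is a simplified illustration of that pattern, not LabKey's implementation.

```python
import hmac
import secrets

# Session ID -> issued CSRF token (in-memory stand-in for session state).
SESSION_TOKENS = {}

def issue_csrf_token(session_id):
    """Mint a random token for the session; the server embeds it in
    rendered pages and expects it back with every POST."""
    token = secrets.token_hex(16)
    SESSION_TOKENS[session_id] = token
    return token

def check_post(session_id, submitted_token):
    """Reject any mutating request whose token is absent or wrong.
    compare_digest avoids timing side channels."""
    expected = SESSION_TOKENS.get(session_id)
    return expected is not None and hmac.compare_digest(expected, submitted_token)
```

A forged cross-site request cannot read the victim's page to recover the token, so its POST fails the check even though the browser attaches the session cookie.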

External Redirects Whitelist – LabKey Server restricts the host names that can be used in parameters that provide redirect URLs. By default, only redirects to the same LabKey instance are allowed; other host names must be whitelisted by an administrator before redirects to them are followed. For more information on unvalidated redirects, click here.
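The policy above can be sketched with stdlib URL parsing: accept relative URLs (they stay on the same instance) and absolute URLs only when the host is the server itself or on the admin-maintained allowlist. The host names below are hypothetical, and this is an illustration of the idea rather than LabKey's code.

```python
from urllib.parse import urlparse

# Hosts an administrator has whitelisted (hypothetical example).
ALLOWED_HOSTS = {"www.labkey.org"}

def safe_redirect(url, own_host="labkey.example.com"):
    """Return True if a returnUrl-style parameter may be followed:
    relative URLs always pass; absolute URLs pass only for this
    instance or a whitelisted host."""
    parsed = urlparse(url)
    if not parsed.netloc:          # relative URL: stays on this instance
        return True
    host = parsed.hostname or ""
    return host == own_host or host in ALLOWED_HOSTS
```

Anything else, such as `https://evil.example.com/phish`, is refused, which closes the open-redirect phishing vector.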

Antivirus Scanning (premium feature) – File uploads, attachments, archives and other content imported through the pipeline or webdav can now be scanned for viruses using ClamAV. 

Vulnerability Management & Resolution

On rare occasions, we may identify or be notified of an issue that poses a security risk. When this occurs, the LabKey team promptly assesses the issue and determines a timeline for resolution based on the severity of the risk. After our initial assessment, we fix the identified vulnerabilities and deliver maintenance releases of the latest production version to our clients as soon as possible. Maintenance releases are also delivered for critical security issues and high priority bug fixes in older production versions of LabKey Server that are still in active use.

To download the most recent version of LabKey Server, click here.

Benefits of Unifying Lab Samples with Assay Data

Many laboratories experience challenges with their sample data management functions. Samples, aliquots of samples, and the data derived from them are often stored as a mixture of paper and electronic records, or in hard-to-search electronic laboratory notebooks. Difficulty in managing complex interlinkages and the inability to tie results data back to sample records can cause duplicate assays to be performed and puts data integrity at risk. Having a system that unifies assay data directly with sample information provides bench scientists, lab managers and investigators with a comprehensive view of the work that has been, or needs to be performed in the lab. It also provides downstream researchers with valuable information on how and when results were derived for each sample.

By unifying your lab samples with assay data, you can:

  1. Analyze, interpret and report the data more accurately. With an understanding of how samples were generated and pre-processed, scientists can ensure their downstream analysis takes this sample metadata into account. This improves reproducibility in the lab and generates publication-ready data.
  2. Easily gain accurate insight into what data has been collected for a given sample, avoiding the time-consuming process of searching through lab notebooks or loose data files. Knowing what data has already been generated reduces duplicative experiments, increases productivity and ultimately lowers overhead expenses.
  3. Generate data provenance between data that is created from samples, allowing users and auditors to easily understand how data was generated.
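The unification described above amounts to a join between sample records and assay results, with orphaned results (results that reference no known sample) surfaced as a data-integrity flag. The sketch below uses an invented, minimal data model, not LabKey's schema.

```python
def unify(samples, assay_rows):
    """Attach each assay result to its sample record.

    `samples` maps sample ID -> metadata dict; `assay_rows` are result
    dicts carrying a SampleId key. Returns (unified view, orphan rows).
    """
    unified = {sid: dict(meta, results=[]) for sid, meta in samples.items()}
    orphans = []
    for row in assay_rows:
        sid = row.get("SampleId")
        if sid in unified:
            unified[sid]["results"].append(row)
        else:
            # A result with no sample record: worth auditing.
            orphans.append(row)
    return unified, orphans
```

With this shape, "what has been measured for sample S-1?" is a dictionary lookup instead of a search through notebooks and loose files.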

LabKey Sample Manager 

Developed in coordination with our lab partners, LabKey Sample Manager has been intuitively designed to boost the efficiency and productivity of laboratories. The application allows for the integration of various types of tabular assay data with samples and their corresponding metadata. This provides a complete picture of ongoing experiments and data collected. Additionally, Sample Manager allows laboratory staff to easily capture sample lineage and assign samples to customizable laboratory workflows.

Contact Us for more information regarding Sample Manager.

Learning and Networking at the LabKey User Conference 2019

The annual LabKey User Conference was held in Seattle on October 3rd and 4th. As with every LabKey User Conference, the event provided LabKey users the opportunity to connect with other researchers and developers as well as learn directly from the LabKey team. Through user presentations, technical talks, discussion groups and networking sessions, attendees expanded their knowledge of the LabKey platform and learned how LabKey is solving data integration, collaboration and workflow challenges faced by other organizations.

For the LabKey team, the User Conference is a wonderful opportunity to learn from our users. The LabKey Server platform, Biologics and our upcoming Sample Manager application have all been developed with significant input and guidance from the scientific research community. As we develop LabKey solutions, this input continues to be the guide by which we prioritize features, integrations, UX decisions and more.

This year’s LabKey User Conference was filled with insightful contributions from speakers and valuable workshops and guidance from the LabKey team. Below are just a few highlights of presentations at the conference. In the coming weeks, we will be sharing some of the exciting presentations from our users and developers. To view this year’s conference program with full abstracts, click here.

LabKey User Presentations

“A Wayfarer’s Guide to the Galaxy of LabKey inside the NIHR Oxford Biomedical Research Center”
Oliver Freeman – Oxford Biomedical Research Centre Clinical Informatics Group, University of Oxford 

“Mapping Cell Line Development Workflows with LabKey Biologics”
Bo Zhai, Cell & Developability Science, Janssen Research & Development

“LabKey for Multicenter R&D on Biofuels and Bio-based Products”
James R Collett, Chemical and Biological Process Development Group, Pacific Northwest National Laboratory (PNNL)

“LabKey and ORIEN Informatics at City of Hope”
Vincent La, City of Hope

“Molding and Maximizing the LabKey Platform for Clinical Translational Research”
Anthony Corbett, Research Data Integration and Analytics Group, University of Rochester Medical Center

LabKey Tech Talks

ReactJS Development: Getting Started
Best practices for building ReactJS based applications with LabKey. Development tips and tricks, practical examples, and advice for developers.

Sample Lineage
Learn how Sample Derivation works in LabKey Server. Explore underlying data structures and use of APIs. See how to access lineage in LabKey.

Visualizing LabKey Data
An overview of how external integrations expand your options for creating reports and visualizations. See how to present LabKey data using Tableau, Spotfire, Matlab, Shiny and more!

LabKey Development Process
Learn about the internal LabKey development process. Covers feature branch workflow, pull requests, test automation, formal and patch releases, and other things external developers need to know.

Quality Control with LabKey Server
Explore strategies for quality control and reporting within LabKey Server and Biologics. Learn to incorporate QC trend reporting, automatic QC options during import, and setting of QC states into your workflows.

Thank you to all who attended and contributed to a valuable and insightful LabKey User Conference!

For the full conference program – click here

Tableau Integration with LabKey for Research Data Visualizations

LabKey is more than a tool to facilitate data integration and analysis: LabKey has partnered with Tableau to help research teams visually communicate their findings and build consensus with stakeholders. After all, even the most important scientific discovery can’t change the world unless it gets attention. Key insights gleaned from complex biomedical research data can be overwhelming to explain and even more difficult for some audiences to understand. Insightful and beautiful visualizations created in Tableau from data within LabKey Server can help bring complex data to life and clearly communicate key results.

Seamless Integration of LabKey Data with Tableau

LabKey integrates scientific data with a wide variety of external analysis and presentation tools, including Tableau Desktop. Gone is the need to hand over research data to visual designers who may not understand the science. The LabKey integration with Tableau allows you to make compelling charts and plots with tools designed for analysis. With Tableau and LabKey together, you can easily create compelling graphs, tables and other visualizations from your own research data. Presentations can be “live” so they automatically update when additional data is incorporated, or if you prefer, your research data visualizations can reflect a static snapshot.

Tableau Technology Partner 

As a Tableau Technology Partner, LabKey adds the ability to connect biomedical research data to the analytics and visualizations available with Tableau. Drag and drop the data you want, customize colors, styles, and layouts, and never pause or lose the integrity of your ongoing research. Tableau partners with leading technology companies in the data and analytics industry to seamlessly integrate with Tableau so people can collect, store, transform and connect to the data that is important to them.

Video: Using Tableau to Visualize Data in LabKey Server

Read more here:

https://www.labkey.org/wiki/Documentation/page.view?name=tableau

Custom LIMS Software for Engineered Mini-Proteins

Optide-Hunter

Scorpion venom can kill you, but there is a lot to learn from it. By keeping the part of the molecule that crosses the blood-brain barrier and attaching a specifically targeted therapy, the Olson Lab at Fred Hutch is working toward treating brain tumors, with the help of custom LIMS software developed on LabKey Server.

The engineering of protein-based therapeutics is a complicated but promising strategy for improving treatments for cancer and infectious disease. And it’s not just the chemistry that is complex. The Olson Lab experiments with nature-inspired bioengineered mini-proteins modified with synthetic chemistry to produce “Optides” (optimized peptides), which hold promise for optimizing therapeutic properties. Managing all the experimental data and metadata presents a myriad of challenges which LabKey Server is well suited to handle.

Customized LIMS Software for Protein Engineering

The Olson Lab has developed Optide-Hunter, a LIMS software built with LabKey Server. The platform supports a generalized protein compounds workflow for tracking entities and assays from creation to preclinical experiments.

You’ll find a compound registry, in-silico and in-vivo assays, support for high-throughput and large-scale production, and automated data loading. Optide-Hunter also supports automated chromatogram classification and external pre-processing of high performance liquid chromatography (HPLC) data. Other users can customize the software for their unique workflows.

You can learn more about the project and partnership with LabKey in the case study. Continue reading to learn how to explore the Optide-Hunter yourself right now.


Getting Started

You can explore a read-only version of the Optide-Hunter yourself right now with no account or registration required.

  1. Click here to open the Optide-Hunter in a new tab. Keep these instructions alongside.
  2. Click the Optides project icon at the bottom of the screen. The home page shows the project files, including custom R code and custom module examples you can download.
  3. Each topic along the top menu bar covers a different aspect of the project. Hover over CompoundsRegistry and click Samples to see the registry of compounds for protein expression and conjugation. A set of wiki pages listed on the right guide you with details about the elements shown.
  4. For example, lineage relationships are represented by ordering compounds in a specific hierarchy. Before variant sequences are registered, corresponding homologues must be registered and assigned IDs.
  5. Next, explore the assays along the menu bar. For example, HTProduction > Assays. Click HPLC Assays on the Assay List, then view and filter the data to find compounds of interest.
  6. On the Programs menu, select the QueryAssays option then enter one or more Compound IDs, for example “CNT0001356” and click Submit. Two grids of Matching Constructs and InsilicoAssays Matches will be populated with the search results to give you a common view.

Create Your Own Trial of Optide-Hunter

After exploring our read-only example, you can create your own trial instance and try uploading your own data, customizing the user interface, and developing your own queries and reports. To launch your 30-day trial, create or log in to your account via this link, then select the “Optide-Hunter – Case Study” option.


This project was published in the journal BMC Bioinformatics with the title “Laboratory Information Management Software for Engineered Mini-protein Therapeutic Workflow”. Learn more about the collaboration with LabKey in our case study.

LabKey European User Conference Events

In early June, LabKey held two European User Conference events bringing together users in Basel, Switzerland and London, UK. Attendees had the opportunity to share their success stories and learn about the multitude of ways that LabKey could assist in their research efforts. Experts from LabKey led training sessions and gathered user feedback on upcoming development projects, including our soon-to-be-launched Sample Management solution. If you missed it, here’s a brief recap, and we hope you will join us next time!

Using LabKey to Enable Reproducible Research

Dr. Thomas Schlitt, Staff Scientist, presented how the Department of Cognitive and Molecular Neurosciences at the University of Basel uses LabKey to gather and analyse data collected from volunteers taking part in psychological research. The data collected includes results of memory tests, picture recognition tasks, and reports on adverse events and medication taken during research studies. These diverse datasets are brought together using templates of lists and scripts to enable efficient adaptations in new studies.

A Wayfarer’s Guide to the Galaxy of LabKey inside the NIHR Oxford Biomedical Research Centre

Oliver Freeman, Technical Architect, discussed a few ways LabKey’s deployment inside the National Institute for Health Research (NIHR) Oxford Biomedical Research Centre is aiding research. One of their LabKey projects is a data collection tool for the Hepatology Research Project, a study with a custom-built UI utilizing Extract-Transform-Load processes (ETLs). LabKey is also used in the NIHR HIC Hepatitis Project to allow submission of files into the collation system, followed by the viewing and querying of the collated data. In the future, they will deploy a portal for clinical research data products across the organization employing native LabKey features including the metadata catalogue, project and folder level security, and LDAP authentication to control who has access to the data.

Genomic Research of Diagnostics of Rare Diseases using LabKey

Dr. Jean-Baptiste Rivière, Assistant Professor, described how the McGill University Health Centre (MUHC) and its Research Institute (RI-MUHC) is using LabKey Server to manage the explosion in data production triggered by the movement of genomic technologies from research to broad use for etiologic diagnosis of patients. They rely on the flexibility and robustness of the LabKey platform as they collect, standardize, integrate, and share diverse health and laboratory data on rare diseases and cancer. Specific uses include web-based test requisition forms and questionnaires, tracking of biological specimens and laboratory results, and generation of bilingual clinical reports of molecular results.

Join Us in Seattle this Fall

The next LabKey User Conference will be held in Seattle October 3-4, 2019, bringing you more great presentations and opportunities to connect with the LabKey community. Hope to see you there! Learn more and register: http://www.labkey.com/product/2019-labkey-user-conference-workshop/

LabKey Server helps research teams efficiently manage, analyze, and publish biomedical research data at scale, maximizing its value. To learn about other applications of LabKey Server, check out more user presentations, or contact us.

American Society for Mass Spectrometry Conference 2019

From June 2-6, 2019, LabKey Vice President Josh Eckels will be in Atlanta for the Annual Conference of the American Society for Mass Spectrometry. He will attend the Skyline User Group Meeting and present a poster highlighting new quality control features for targeted mass spectrometry.

Meeting: Skyline User Group

Sunday, June 2
Georgia Aquarium, Atlanta

Panorama is a web-based complement to Skyline, used by more than a hundred organizations to manage, analyze, and share targeted mass spec data generated by Skyline. Since being unveiled at ASMS 2015, Panorama’s support for QC workflows has expanded significantly.

Read more in the Skyline User Group Meeting Summary >

Poster: Customizable quality control metrics and notifications with Panorama, AutoQC, and Skyline

Monday, June 3
Poster Number: 430

Introduction: Panorama’s newest quality control (QC) capabilities further extend its automated system suitability monitoring for targeted mass spectrometry assays. Panorama first added its QC folder type in 2015 in conjunction with AutoQC, a utility that monitors for newly acquired system suitability runs targeting operator-defined sets of peptide and small molecule standards. New runs are automatically analyzed using Skyline and imported into Panorama. Recent work greatly expands the metrics that can be monitored. Coupled with new types of statistical analyses and email alerts of outliers in newly acquired data, Panorama now offers an even more robust workflow.

Methods: Panorama now supports metrics that track values associated with the entire run such as statistics related to iRT regression, single replicate calibration and pressure traces, adding to its existing support for metrics associated with individual peptides and small molecules. Users can visualize their data using statistical process control plots, customize the metrics applied, subscribe to email notifications, and export in a variety of formats.
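The statistical process control plots mentioned here follow the familiar Levey-Jennings pattern: derive control limits from a baseline set of runs (analogous to Panorama's guide-set concept) and flag new values falling outside the mean plus or minus a few standard deviations. The sketch below is a simplified illustration of that rule, not Panorama's implementation, and the numbers are invented.

```python
from statistics import mean, stdev

def control_limits(baseline, n_sd=3):
    """Compute (lower, upper) control limits from baseline runs as
    mean +/- n_sd standard deviations."""
    m, s = mean(baseline), stdev(baseline)
    return m - n_sd * s, m + n_sd * s

def flag_outliers(baseline, new_values, n_sd=3):
    """Return the newly acquired values that fall outside the control
    limits established by the baseline."""
    lo, hi = control_limits(baseline, n_sd)
    return [v for v in new_values if v < lo or v > hi]
```

Deriving limits from a dedicated baseline, rather than from all points at once, keeps a drifting instrument from inflating its own limits and hiding the drift.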

Conclusions: Panorama offers a growing collection of customizable QC metrics and has expanded beyond values associated with individual peptides and small molecules. Users can assess quality for data specific to their experimental design, and can opt in and out of the full library of metrics to eliminate false positives and focus their system suitability checks on the most diagnostic data. Panorama helps users consolidate their workflows to reduce the data processing bottlenecks that occur in many laboratories.

As of June 2019, more than 350 labs are using Panorama projects to manage targeted mass spectrometry assays, and major pharmaceutical companies and other organizations have deployed their own in-house installations of Panorama.

Read more and see the full poster >

Panorama Partners Program

The Panorama Partners Program accelerates the adoption and integration of Panorama into member organizations’ targeted mass spectrometry workflows. Partners work directly with the Panorama and Skyline development teams, help shape the direction of ongoing development, and receive exclusive premium features.

Learn more about joining the Panorama Partners >