NMC DSP definitions

From BioAssist


This page contains the definitions used within the NMC DSP (taken from the Glossary of the Functional Specification document). (@version 0.3)


Challenge

A Challenge is a factor (light, food, drug, etc.) which is administered to (a group of) test subjects to unbalance the biological system. This is done to see how a biological system re-establishes itself. For example, in nutritional studies a glucose syrup is sometimes given to test subjects to bring the blood sugar out of balance. A Challenge differs from a Treatment in that a Challenge is administered to study the effect of a treatment.

Clean Data

Clean data consists of a list of identified metabolites, their values and units. The exact definition of clean data differs between studies; therefore, semi-cooked data should be stored besides clean data. Preferably, clean data is obtained after normalization to reference blood samples, to make inter-study comparisons possible.

Data Support Platform

The Data Warehouse and the Tool Chain combined.

Data Warehouse

The database, input tool and output (query) tool combined.


GC-MS

Gas Chromatography Mass Spectrometry.


Grails

Grails is an open source web application framework which leverages the Groovy programming language (which is in turn based on the Java platform). It is intended to be a high-productivity framework, following the “coding by convention” paradigm, providing a stand-alone development environment and hiding much of the configuration detail from the developer.

Internal Standards (IS)

Internal Standards are known (quantities of) substances, either not present in the samples or containing a different isotope (14C or 15N), which are added to the samples. These internal standards allow normalization within a chromatogram.
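The normalization step can be illustrated with a minimal sketch: each metabolite peak area is divided by the peak area of the internal standard in the same chromatogram, cancelling run-to-run variation in injection volume and detector response. The metabolite names and peak areas below are hypothetical, and real pipelines operate on vendor-specific chromatogram formats.

```python
def normalize_to_is(peak_areas, is_area):
    """Express each metabolite peak area as a response ratio to the internal standard."""
    return {metabolite: area / is_area for metabolite, area in peak_areas.items()}

# Hypothetical chromatogram: raw metabolite peak areas plus the IS peak area.
raw = {"glucose": 120000.0, "lactate": 45000.0}
ratios = normalize_to_is(raw, is_area=60000.0)
# glucose -> 2.0, lactate -> 0.75
```

Because both the metabolite and the internal standard are affected equally by variations in a single run, the ratio is comparable across chromatograms.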


LC-MS

Liquid Chromatography Mass Spectrometry.

Meta Data

Meta data is all data that bears on test individuals but is not part of the study events (treatments and challenges). Examples are BMI, age, lighting conditions, nutrition, etc. Note that if lighting is the treatment in your experiment, it is study data rather than meta data.


Metabolite

Metabolites are the intermediates and products of metabolism. The term metabolite is usually restricted to small molecules and often includes small molecules obtained from medication or nutrition.


Metabolomics

Metabolomics is the "systematic study of small-molecule profiles in living organisms". The metabolome represents the collection of all small molecules in a biological organism, including molecules resulting from metabolism inside the organism, but also those originating from medication and nutrition. Metabolomics tries to measure a large part of the metabolome.

Nuclear Magnetic Resonance (NMR)

Nuclear Magnetic Resonance is a property of magnetic nuclei in a magnetic field: an applied electromagnetic (EM) pulse causes the nuclei to absorb energy from the pulse and radiate it back out at a specific resonance frequency, which depends on the strength of the magnetic field and other factors. This allows the observation of specific quantum mechanical magnetic properties of the atomic nucleus. NMR can be used for metabolomics and is highly quantitative, but less sensitive than mass spectrometry based techniques.

Open Source

Open Source is an approach to the design, development and distribution of software, offering practical accessibility to a software’s source code. Some consider open source as one of various possible design approaches, while others consider it a critical strategic element of their operations. Before open source became widely adopted, developers and producers used a variety of phrases to describe the concept; the term open source gained popularity with the rise of the Internet, which provided access to diverse production models, communication paths, and interactive communities.

Raw Data

Raw Data is all data output by an NMR or MS machine (e.g. a chromatogram).

Semi-cooked data

Data normalized to internal standards (IS) and quality control (QC) samples.


Tool Chain

In software, a tool chain is the set of computer programs (tools) that are used to create a product (typically another computer program or system of programs). The tools may be used in a chain, so that the output of each tool becomes the input for the next, but the term is used widely to refer to any set of linked tools.
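The chained data flow can be sketched in a few lines; the two "tools" here (peak detection and normalization) are hypothetical stand-ins, not actual NMC DSP components.

```python
def peak_detection(raw_signal):
    # Stand-in for a real peak-detection tool: keep intensities above a threshold.
    return [x for x in raw_signal if x > 10]

def normalization(peaks):
    # Stand-in for a normalization tool: scale peaks to sum to 1.
    total = sum(peaks)
    return [p / total for p in peaks]

def run_chain(data, tools):
    for tool in tools:
        data = tool(data)  # the output of each tool becomes the input for the next
    return data

result = run_chain([3, 15, 25], [peak_detection, normalization])
# [3, 15, 25] -> [15, 25] -> [0.375, 0.625]
```

The essential property is that every tool consumes exactly what the previous tool produces, so tools can be swapped or reordered as long as their interfaces match.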

Tool Controller

A black-box web service which acts as an abstraction layer over the separate tools in the tool chain. The tool controller handles the flow of data through the tools in a configurable tool chain.


Treatment

A treatment is a factor (light, food, drug, etc.) which is administered to (a group of) test subjects in order to study the effects of that particular treatment. A treatment can also be a placebo administered to a control group, in order to compare results to a parallel treatment in another group.

Web Application Framework

A web application framework is a software framework that is designed to support the development of dynamic websites, Web applications and Web services. The framework aims to alleviate the overhead associated with common activities performed in Web Development such as database access and abstraction, templating and session management.

Web Service

A Web Service is a software system designed to support interoperable machine-to-machine interaction over a network. It has an interface described in a machine-processable format (specifically WSDL). Other systems interact with the Web service in a manner prescribed by its description using SOAP messages, typically conveyed using HTTP with an XML serialization in conjunction with other Web-related standards.

Quality Control (QC) Samples

Quality control (QC) samples are (literally) a mixture of all samples in an experiment and are introduced during measurement. From every sample in an experiment a small subsample is taken, and all these subsamples combined are used as the source for the quality control samples. In this way, the quality control samples represent all metabolites in a study and, if included often enough between analyses of samples, can be used to correct for drift caused by machines changing sensitivity over time.
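One simple way such a drift correction can work is sketched below, assuming a linear drift model fitted to the QC responses over injection order; the positions and intensities are illustrative, and real corrections often use more flexible (e.g. spline-based) models per metabolite.

```python
def fit_linear(xs, ys):
    """Least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def drift_correct(sample_positions, sample_values, qc_positions, qc_values):
    """Correct sample intensities for sensitivity drift observed in the QC samples."""
    a, b = fit_linear(qc_positions, qc_values)
    qc_mean = sum(qc_values) / len(qc_values)
    # Scale each sample by the ratio of the mean QC response to the
    # drift-model response at that sample's injection position.
    return [v * qc_mean / (a * p + b) for p, v in zip(sample_positions, sample_values)]

# QC injected at positions 0, 5 and 10 shows sensitivity falling from 100 to 80,
# so samples measured later in the run are scaled up accordingly.
corrected = drift_correct([2, 8], [95.0, 85.0], [0, 5, 10], [100.0, 90.0, 80.0])
```

Because the QC sample is identical at every injection, any change in its measured response must come from the instrument, which is what justifies using it as the drift reference.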