
Is outdated technology preventing a cure for cancer?

Jul 25, 2018

Cancer is one of the leading causes of death globally, responsible for almost 1 in 6 deaths worldwide. It is not a new disease either: evidence of cancer has been found in Egyptian mummies dating back to around 1600 BC. It is extraordinary that cancer is still plaguing families and societies the world over, and while diagnosis and treatment have improved dramatically over the last ten years, there is still an immense volume of research to get through, and data to crunch, if we are ever going to find a cure. The world is coming together to fight cancer, but how do we keep the momentum going and reach the stage where breakthroughs come much faster? When can we get to the point where not a single human life is lost to cancer?

Right now, numerous institutes and organisations are tackling the problem head-on, including (but not limited to) the World Cancer Research Fund International, the Union for International Cancer Control (UICC), touchONCOLOGY, and the World Health Organization. Similarly, The Francis Crick Institute is a consortium of six of the UK's most successful scientific and academic organisations: the Medical Research Council, Cancer Research UK, the Wellcome Trust, UCL (University College London), Imperial College London, and King's College London. It is a progressive biomedical research body, but the challenges it currently faces unfortunately exemplify the difficulties of biomedical research in general.


The Francis Crick Institute's computational molecular biologists are currently working with massive, complex datasets that, thanks to ongoing advances in sensor technology and data collection, are expanding continuously and rapidly. A typical dataset contains a diverse range of data: everything from DNA and protein sequences to information on protein-protein complex formation. The problem is that working with these datasets in any meaningful way is a significant computational challenge, one that demands costly servers and a large, specialised team (including heavy operations support), and that leads to an extremely slow time to market. In short, we cannot effectively utilise the data we are currently gathering.
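
To make that heterogeneity concrete, here is a minimal sketch in Python of the kinds of records such a dataset mixes, streamed lazily so that nothing has to fit in memory at once. The record types and field names are our own illustrative assumptions, not The Francis Crick Institute's actual schema.

```python
from dataclasses import dataclass
from typing import Iterator

@dataclass
class DnaRecord:
    accession: str
    sequence: str        # nucleotides, e.g. "ATGGCC..."

@dataclass
class ProteinRecord:
    accession: str
    sequence: str        # amino acids, e.g. "MALWMR..."

@dataclass
class InteractionRecord:
    protein_a: str       # accession of the first binding partner
    protein_b: str       # accession of the second binding partner
    affinity_kd: float   # dissociation constant, in molar units

def stream_dna_records(path: str) -> Iterator[DnaRecord]:
    """Yield one record at a time rather than loading the whole file,
    since datasets at this scale rarely fit on a single machine."""
    with open(path) as fh:
        for line in fh:
            accession, sequence = line.rstrip("\n").split("\t")
            yield DnaRecord(accession, sequence)
```

Even a simple streaming pattern like this hints at the engineering burden: multiplied across dozens of data types and terabytes of files, it becomes exactly the kind of low-level plumbing that pulls scientists away from science.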

There is a clear need for an improved production pipeline in scientific and medical research: something that, in The Francis Crick Institute's words, would be “easy to use, universally accessible via the Internet, and dynamic in both the range of questions that can be asked of the data and the information content displayed”. Accessibility is a key consideration: to get the breakthroughs we are seeking, we must first and foremost let scientists focus on their research rather than on low-level engineering concerns. Unfortunately, that is not the case today. Existing technologies are proving extremely difficult to work with at scale, from the patchwork of middleware needed to get things done to a high propensity for crashes.
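
As one illustration of the “universally accessible via the Internet” requirement, the sketch below exposes a dataset query over plain HTTP, so a researcher can ask questions of the data without ever touching the underlying storage or compute. Flask, the /sequences route, and the toy in-memory store are our own illustrative assumptions, not the Institute's actual stack.

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Toy in-memory stand-in for what would really be a distributed store.
SEQUENCES = {
    "P01308": "MALWMRLLPLLALLALWGPDPAAA",  # illustrative entry
}

@app.route("/sequences/<accession>")
def get_sequence(accession: str):
    sequence = SEQUENCES.get(accession)
    if sequence is None:
        return jsonify(error="unknown accession"), 404
    return jsonify(accession=accession, sequence=sequence)

if __name__ == "__main__":
    # In production this would sit behind authentication and autoscaling.
    app.run(port=8000)
```

A researcher could then query the data with a single request, for example curl http://localhost:8000/sequences/P01308, with no servers to provision and no middleware to wire together.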

Our hope is to provide a platform that bridges this divide. We are currently working with partners in the Life Sciences sector to scale massive datasets and run large-scale spatial simulations, in an effort to create a step change in biomedical research. If you are active in this sector and are interested in working together to solve these challenges, please feel free to drop us an email.
