IT in the life sciences industry is at a tipping point. Today, there are so many powerful technologies—DevOps, OpenStack, containers, software-defined storage, big data, Hadoop, the list goes on and on—that are leading enterprises to a smarter way of developing enterprise applications and to a more modern, efficient, scalable, cloud-based architecture. This is great news for the massively data-driven life sciences industry.
That said, figuring out the best architectural foundation to support, leverage and, of course, monetize this data is complex. Much of what exists in the data centers of life sciences organizations is antiquated: proprietary systems, manual processes, monolithic applications and tightly coupled integration.
As IT teams in life sciences map their strategies, we recommend they consider transitioning from their legacy environments to ones that embrace next-gen technologies: open source platforms like OpenStack, virtualization and container technologies, DevOps, open source Platform as a Service (PaaS) for application development, Hadoop and more.
Let’s break it down in more detail. Traditionally, the infrastructure, application development, applications and data integration used at life sciences companies have been proprietary and inflexible. The data center infrastructure, for example, has consisted largely of proprietary operating systems and hardware. With a new software-defined data center that’s built for the cloud, companies may better meet their needs by turning to open and elastic operating systems like Linux, cloud computing stacks like OpenStack and hybrid cloud management for effectively operating a mix of public and private clouds.
Bare metal workloads and virtual machines (VMs) should give way to open source virtualization. Life sciences companies should investigate the use of open source Linux containers. Containers are similar to compute virtualization in that they isolate workloads, but they aren’t bound to a hypervisor. Instead of virtualizing a server to create multiple operating systems, containers virtualize the operating system itself so that multiple workloads can run on a single host. They are easily ported across different servers without the need for reconfiguration, and they require less maintenance because there are fewer operating systems to manage. OpenStack supports Docker, a Linux container platform designed to automate the deployment of applications as highly portable, self-sufficient containers.
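As a minimal sketch of what that portability looks like in practice (the application name and file layout here are hypothetical), a Dockerfile describes a self-sufficient image that runs unchanged on any host with a container runtime:

```dockerfile
# Hypothetical image for a Python-based analysis tool.
FROM python:3-slim

# Copy the application and its declared dependencies into the image.
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .

# The same image runs identically on a laptop, a data-center server or a cloud VM.
CMD ["python", "analyze.py"]
```

Building the image once (`docker build -t genomics-etl .`) and running it anywhere (`docker run genomics-etl`) requires no per-server reconfiguration, which is the portability benefit described above.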
One of the key IT architectural foundation strategies life sciences companies need to invest in is modern application development. In a report released by Ernst & Young (EY) in 2014, the services firm noted that an agile environment can help life sciences organizations create opportunities to turn data into innovative insights. Typical software development life cycles that require lengthy validations and quality control testing prior to deployment can stifle innovation. Agile software development—which is adaptive and is rooted in evolutionary development and continuous improvement—can be combined with DevOps, which focuses on the integration between the developers and the teams who deploy and run IT operations. Together, these can help life sciences organizations amp up their application development and delivery cycles. EY notes in its report that life sciences organizations can significantly accelerate project delivery, for example, “from three projects in 12 months to 12 projects in three months.”
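To make the DevOps side of that pairing concrete, here is a sketch of a continuous-integration pipeline; the job names and commands are hypothetical, and the YAML follows the GitHub Actions workflow format, though any CI system expresses the same idea of automatically building and testing every change:

```yaml
# Hypothetical pipeline: every code change is built, tested and packaged
# automatically, shortening the lengthy validation cycle described above.
name: build-test-package
on: [push]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run automated quality-control tests
        run: make test
  package:
    needs: test
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build a deployable container image
        run: docker build -t myorg/clinical-app:${{ github.sha }} .
```

Because validation runs on every commit rather than as a manual gate before deployment, teams can deliver in the short, frequent cycles the EY report describes.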
What do you think should be the priorities for life sciences organizations as they transition from legacy to next-gen? Let us know in the comments section below.