Friday, 31 October 2008

GROMACS 3.3.3 scaling for short coarse-grained simulation

I have run short (20 ns) coarse-grained MD simulations in which the system contained:

- 1 phospholamban pentamer protein (PDB: 1ZLL), 255 amino acid residues
- 229 DPPC phospholipid molecules
- 2454 water molecules
- 3 Cl- counter-ions

with 1, 2, 4 and 8 cores in parallel on my server.


The red line represents ideal linear scaling in a perfect world, based on the performance of the single-core simulation. The blue line is a polynomial regression through the results of the 2-, 4-, and 8-core simulations.

My server has two quad-core processors, and it looks like in this case it does not make sense to use more than 4 cores in parallel for one simulation.
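This kind of flattening is what Amdahl's law predicts: any serial (non-parallelizable) fraction of the work caps the achievable speedup no matter how many cores you add. A minimal sketch below; the serial fraction used is purely illustrative, not measured from my runs:

```python
def amdahl_speedup(n_cores, serial_fraction):
    """Ideal speedup on n_cores given a serial fraction of the work."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cores)

for n in (1, 2, 4, 8):
    linear = n  # perfect-world scaling (the red line)
    amdahl = amdahl_speedup(n, serial_fraction=0.1)  # 0.1 is an assumption
    print(f"{n} cores: linear {linear:.1f}x, Amdahl {amdahl:.2f}x")
```

Even with only 10% serial work, the model already gives well under 5x speedup on 8 cores, which is qualitatively the same bend as in my plot.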

If you are interested in the result of this short simulation, this is how the system looks at the starting point (a random state at 0 ns) and after 20 ns of simulation.
Water molecules and lipid hydrocarbon chains are not shown for clarity. The colored chains represent the five amino acid chains of the phospholamban protein, and lipid headgroups are shown as colored balls. As you can see, a lipid bilayer has nicely formed by the 20 ns timepoint.

Before doing more scalability tests on how to best utilize the server hardware, I decided to upgrade to GROMACS 4.0, which is supposed to scale dramatically better than the 3.3 series ...

Thursday, 30 October 2008

Software installation

I had to install various software packages and libraries to make the server usable for my needs.

To mention just a few of them: Emacs, RCS, GCC, LAM/MPI, libaio, FFTW, GROMACS, Ruby, Ruby on Rails ...

I was able to run my first parallel coarse-grained MD simulation with GROMACS ... yupi!

Wednesday, 29 October 2008

Ubuntu 8.04 LTS Server Edition

Installation of the operating system for my server, the 64-bit version of Ubuntu 8.04 Server Edition, took exactly 10 minutes (!). I am impressed; it looks like it recognized all the hardware correctly, and it is running just fine now.

Let's just do a quick disk speed test.

jan@server:~$ sudo hdparm -tT /dev/sda

/dev/sda:
Timing cached reads: 2790 MB in 2.00 seconds = 1395.58 MB/sec
Timing buffered disk reads: 1068 MB in 3.00 seconds = 355.58 MB/sec

It looks quite good; I will compare it with the workstation later ...
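As a quick sanity check, the rates hdparm prints are just megabytes read divided by elapsed seconds; the reported rates come out slightly higher than the rounded figures suggest because hdparm divides by the exact elapsed time rather than the rounded seconds it displays:

```python
def throughput_mb_s(mb_read, seconds):
    """Throughput in MB/s: megabytes read over elapsed time."""
    return mb_read / seconds

# Using the rounded figures from the hdparm output above.
print(f"cached reads:   {throughput_mb_s(2790, 2.00):.1f} MB/s")
print(f"buffered reads: {throughput_mb_s(1068, 3.00):.1f} MB/s")
```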

Now let's install GROMACS, a molecular dynamics software package ...

Hardware arrived

Hardware bought from DELL Outlet:

Server - Dell PowerEdge T605
-------------------------------

Processors: 2x AMD Opteron Quad-Core 2.1 GHz
Memory : 4x 4GB Dual Rank DIMMs - 667 MHz
Storage : 4x 146 GB SAS 15K 3.5" HDs with data striping (RAID0)
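RAID0 stripes sequential reads across all drives, so the aggregate throughput should be roughly the per-disk rate times the number of disks. A back-of-the-envelope sketch; the per-disk figure is an assumed ballpark for a 15K SAS drive, not a measurement:

```python
def raid0_sequential_mb_s(per_disk_mb_s, n_disks):
    """Rough aggregate sequential throughput of an n-disk RAID0 stripe,
    ignoring controller and filesystem overhead."""
    return per_disk_mb_s * n_disks

# ~90 MB/s per disk is an assumption, not a measured value.
print(f"expected: ~{raid0_sequential_mb_s(90, 4)} MB/s aggregate")
```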

Workstation - Dell Precision WorkStation T3400
-------------------------------------------------

Processor: Intel Pentium Dual-Core 2.2 GHz
Memory : 4x 1GB Dual Rank DIMMs - 667 MHz
Storage : 2x 750GB SATA 7.2K 3.5" HDs with data striping (RAID0)
Video : nVidia Quadro FX570 256 MB

+ 2x 22'' Iiyama ProLite E2202WS-B1 TFTs

PROBLEMS:

1. The server is too noisy for a home environment (noisier than my hoover!) - the two powerful system fans run at full speed all the time, and I did not find any BIOS option to regulate their speed. I went to the nearest computer shop, bought two low-noise fans, and replaced the original ones. The server is now reasonably quiet, but I will have to check the processor temperatures regularly once I start running simulations ...

2. One of the TFTs has 2 stuck pixels (one red and one blue); the other one is perfect. I am too lazy to send it back, so I will try to fix it later with JScreenFix.

Tuesday, 28 October 2008

Welcome

Hi, I am creating this blog to keep track of my work on my PhD in Computational Biochemistry.

In my PhD I am trying to better understand how two proteins, SERCA (the Ca2+-ATPase) and phospholamban, regulate the pumping of calcium ions into the endo(sarco)plasmic reticulum. For this I am going to use a computer simulation technique called coarse-grained molecular dynamics. I will go into more detail about all of this later.

Unfortunately, for now I have a full-time job, so I will work on my PhD only in my spare time. Let's see how it works out. I hope I can return to my full-time PhD soon. Today I am expecting the computer hardware to arrive ...