Some time ago, I followed a scientific computing course and learned the basics of configuring a Free tier machine on Amazon Web Services to host my own installation of Sage(math). But I made a mistake in the configuration (because I didn't know it could be changed) and that machine got only 8GB of storage. As a consequence, the disk has been almost full and updates were nearly impossible (updating required uninstalling Sage and reinstalling it later). In practice this means I've been stuck on Sage 5.11 when today's release is 6.11.
But the Free tier allows up to 30GB of hard disk space! So I have configured a new machine with enough space, and the services are being moved to it.
As this machine has been set up from scratch, I'd like to write down what I typed on the command line for my own records. And if it turns out to be useful for others, great.
After launching a fresh Ubuntu 13.10 64-bit AMI (Amazon Machine Image) on a t1.micro instance, I ran:
$ sudo apt-get update
$ sudo apt-get dist-upgrade
$ sudo apt-get install ipython python-numpy python-scipy python-matplotlib python-dev \
    git gfortran openmpi-bin liblapack-dev apache2 gnutls-bin
This brings the machine up to date and installs some basic packages that will be needed. Then it is necessary to add the PPA (personal package archive) where Sage's packages are stored:
$ sudo apt-add-repository -y ppa:aims/sagemath
$ sudo apt-get update
$ sudo apt-get install sagemath-upstream-binary
$ sudo apt-get install python-openssl openssl libssl-dev libssl-doc python-openssl-doc python3-openssl
$ sudo sage -i pyopenssl-0.13.p0
Once the main package and a few optional packages are installed, the notebook can be started for the first time:
$ sage -notebook interface='' secure=True
Here, an interactive command line process will ask for the Sage admin user's password. After that, the init.d script can be prepared and started, as described in a previous post.
Just a short thought that came to my mind recently. A few years ago, Belgium went through a huge political crisis that left the country without an official government for 541 days (close to a year and a half). During this crisis the north (Flanders) and the south (Wallonia) were on the verge of rupture.
The European Union treaties were not an extra problem then, and it was even explained that a three-way split had been considered, leaving the Brussels capital region as a special area, being the EU capital. At that time nobody said that those two possible new countries would have to request (re)admission as member states.
Why, nowadays, are the Scottish people threatened that, depending on the result of their self-determination process, they will have to request admission from scratch? Why are the people of Catalonia, whose situation is even worse because the Spanish government denies them any right to vote, also threatened with being expelled from the Union?
A very long time ago, when I was a cluster scheduler developer, I set my personal record of concurrent threads. In the stress test of the system, the scheduler ran up to 480 threads while consuming 1% of the CPU. That was good, and it was the objective, because a scheduler should not be doing anything other than planning (at different levels of abstraction, managing restrictions and constraints). The code was full of semaphores, events and signals, and the tests passed without deadlocks. Since then that had been my record number of threads.
In my own code I use threading very often. In an object-oriented design many threads can be doing different things with the same objects, and it is fun when many of them interact and the application still works as expected with good performance. I've played with OpenMP and MPI, but I had never beaten my record of concurrent threads, until now.
I'm following, on Coursera, a course about heterogeneous parallel programming with CUDA, and it has smashed that record into very, very small pieces. In one of the exercises, a matrix multiplication, the data set is A_{200,100} * B_{100,256}, where the result is C_{200,256}. That is 200*256 cells, with one thread each doing the underlying basic operations.
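To make the "one thread per cell" idea concrete, here is a minimal kernel sketch (my own illustration, not the course's exact code; the name matMul and the row-major layout are assumptions):

// Illustrative sketch, not the assignment's code: one thread per cell of C,
// each computing the full dot product of a row of A with a column of B.
__global__ void matMul(const float *A, const float *B, float *C,
                       int numARows, int numAColumns, int numBColumns) {
    int row = blockIdx.y * blockDim.y + threadIdx.y;   // row of C (0..199 here)
    int col = blockIdx.x * blockDim.x + threadIdx.x;   // column of C (0..255 here)
    if (row < numARows && col < numBColumns) {
        float acc = 0.0f;
        for (int k = 0; k < numAColumns; ++k)          // 100 terms per cell here
            acc += A[row * numAColumns + k] * B[k * numBColumns + col];
        C[row * numBColumns + col] = acc;
    }
}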
Using CUDA blocks and grids, where the data often doesn't fit exactly and you spill over a bit, in my case I used blocks of 16*16 threads on a 17*12 grid of blocks for this matrix. That raises my record of parallel threads to 52224. More than a hundred times the old one...
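For completeness, the thread count comes from the host-side launch configuration rather than from the kernel itself. A typical sketch (assumed variable names, and the usual ceiling-division rounding) looks like this; however you round, you launch a few more threads than the 200*256 = 51200 cells of C, and the bounds check in the kernel masks out the extras:

// Illustrative launch configuration; dA, dB, dC and the dimension names are assumptions.
dim3 dimBlock(16, 16);                                      // 16*16 = 256 threads per block
dim3 dimGrid((numBColumns + dimBlock.x - 1) / dimBlock.x,   // enough blocks to cover the columns of C
             (numARows    + dimBlock.y - 1) / dimBlock.y);  // and the rows of C
matMul<<<dimGrid, dimBlock>>>(dA, dB, dC, numARows, numAColumns, numBColumns);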
Even with these numbers, imho there are categories within this thread count. In this new record, all the threads were doing almost the same thing, just on different data. In my previous record, only a few of them were doing the same thing. That also splits the thread count into two categories...
Update 20140202: Did I set my threading record at 52,224? Doing another exercise in this CUDA course, I've broken it again: a convolution of a 2k*2k colour image (3 channels, with a 5*5 mask), that is 2048*2048*3 = 12,582,912 parallel threads, did the job in 19ms...
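One way to arrive at exactly that figure (an illustrative sketch with assumed names, not necessarily the layout the assignment used) is to give each pixel of each channel its own thread:

// Illustrative launch only; convolution2D and the pointer names are placeholders.
dim3 convBlock(16, 16, 3);               // 16*16 pixels * 3 channels = 768 threads per block
dim3 convGrid(2048 / 16, 2048 / 16, 1);  // 128*128 = 16384 blocks
// 768 threads/block * 16384 blocks = 2048*2048*3 = 12,582,912 threads in total.
convolution2D<<<convGrid, convBlock>>>(dImageIn, dMask, dImageOut, 2048, 2048, 3);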