Tharaka's Blog

Last 9 months of my life

Posted in Uncategorized by Tharaka de Alwis on October 3, 2010

It’s been some time since I’ve posted anything on the blog. The last 9 months were some of the most challenging I’ve faced in my entire life. The MSc course in Computer Science that I follow at the University of Moratuwa, Sri Lanka is a course that truly tests your determination, courage and intellectual knowledge. Anyone who is interested in doing this course, beware and be prepared, because the amount of research you may have to do on different subjects is immense and the exam papers are tough. Things are not over yet; I’ve got 15 more months of hardship to carry on. With my MSc research project work starting now, I will post interesting findings on the world of SaaS – Software as a Service.

Nerds and Geeks – 2nd edition out now

Posted in Uncategorized by Tharaka de Alwis on October 3, 2010
Nerds and Geeks 2nd Edition cover page

Nerds N’ Geeks is an IT magazine that brings you the latest technological trends and knowledge in the IT field. Its FREE online second edition is out now! You can download it from

You can read one of my articles, “How to build your web apps on the cloud”, on page 11.

Use cases of Parallel and Concurrent Programming

Posted in Programming paradigms by Tharaka de Alwis on February 13, 2010

Parallel and concurrent computing are forms of computation in which many calculations are carried out simultaneously by many threads of execution [definition by Dr. Srinath Perera, lecturer at the UoM CS department]. Concurrent programs can be executed sequentially on a single processor, while parallel programs are executed in a multi-processor environment, possibly spanning multiple computers.
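To make the distinction concrete, here is a minimal Python sketch (the `square_sum` task is just an illustration): threads take turns on one interpreter, while separate processes can truly run at the same time.

```python
from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor

def square_sum(n):
    """CPU-bound work: sum of squares below n."""
    return sum(i * i for i in range(n))

def main():
    tasks = [50_000] * 4

    # Concurrent: threads interleave on a single processor (CPython's GIL
    # serializes CPU-bound bytecode, so these tasks take turns rather
    # than running simultaneously).
    with ThreadPoolExecutor(max_workers=4) as pool:
        concurrent_results = list(pool.map(square_sum, tasks))

    # Parallel: separate processes really do run at the same time,
    # potentially on different cores.
    with ProcessPoolExecutor(max_workers=4) as pool:
        parallel_results = list(pool.map(square_sum, tasks))

    assert concurrent_results == parallel_results  # same answers either way

if __name__ == "__main__":
    main()
```

Both pools produce identical results; the difference is purely in how the work is scheduled, which is exactly the concurrent-versus-parallel distinction in the definition above.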

Here are three example use cases of concurrent and parallel programming:

1) Use Case: Search for Extra-Terrestrial Intelligence (SETI)

Actors:
  • Arecibo radio telescope
  • SETI@home facility
  • Home user with a SETI@home client
Description: SETI@home (“SETI at home”) is a distributed (grid) computing project using Internet-connected computers, hosted by the Space Sciences Laboratory at the University of California, Berkeley, in the United States. This use case describes how the SETI@home project uses a grid computing (parallel computing) model to perform scientific data search and analysis.
Preconditions:
  1. Home users have the SETI@home client software installed on their Internet-connected PCs.
Normal Flow:
  1. Capture radio transmissions from extraterrestrial intelligence using observational data from the Arecibo radio telescope.
  2. The data are digitized, stored, and sent to the SETI@home facility.
  3. The data are then parsed into small chunks in frequency and time.
  4. The SETI@home facility sends each chunk to a SETI@home client home PC.
  5. Each home PC analyzes its chunk for interesting data, distinguishing it from noise.
  6. Results are sent back to the SETI@home facility.
Why use Parallel/Concurrent computing? One of the main goals of SETI@home was to make use of the vast, idle computation power of ordinary PCs connected to the Internet, and to prove the viability and practicality of the ‘distributed grid computing’ concept.
Benefits: With over 5.2 million participants worldwide, the project is the distributed computing project with the most participants to date. With this growing pool of volunteer computation power, the funders of the SETI project did not need to buy expensive supercomputers for the computation. As a result of the project, the BOINC environment, a development of the original SETI@home, is providing support for several computationally intensive projects in a wide range of disciplines.
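The normal flow above (split the data into chunks, analyze each independently, send results back) is an embarrassingly parallel job. Here is a toy sketch of that pattern; `analyze_chunk` and the threshold are made up for illustration and have nothing to do with the real SETI@home signal analysis.

```python
import random
from multiprocessing import Pool

def analyze_chunk(chunk):
    """Stand-in for client-side analysis: flag samples well above the
    noise floor (the 0.99 threshold here is arbitrary)."""
    return [x for x in chunk if x > 0.99]

def main():
    random.seed(42)
    # Stand-in for digitized telescope data, parsed into small chunks.
    signal = [random.random() for _ in range(10_000)]
    chunks = [signal[i:i + 1000] for i in range(0, len(signal), 1000)]

    # Each chunk goes to a worker process (playing the role of a
    # volunteer PC); results are sent back and merged at the "facility".
    with Pool(processes=4) as pool:
        results = pool.map(analyze_chunk, chunks)
    interesting = [x for part in results for x in part]
    print(f"{len(interesting)} interesting samples out of {len(signal)}")

if __name__ == "__main__":
    main()
```

Because no chunk depends on any other, the work scales almost linearly with the number of workers, which is what made distributing it over millions of volunteer PCs practical.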


2) Use Case: Download a file from BitTorrent

Actors:
  • User (person who wants to download a file)
  • BitTorrent tracker (knows all seeds and peers)
  • Seeds (computers running BitTorrent that have the complete file)
  • Peers (other clients who are in the process of downloading the file)
  • Swarm (all seeds and peers who are active during the user’s download)
Description: BitTorrent is a protocol that enables fast downloading of large files using minimal Internet bandwidth. Unlike other download methods, BitTorrent maximizes transfer speed by gathering pieces of the file users want and downloading those pieces simultaneously from people who already have them. This P2P process makes popular and very large files, such as videos and television programs, download much faster than is possible with other protocols. This use case describes how a torrent user can search for a torrent and download it from seeds/peers.
Preconditions:
  1. Torrent client software is already installed on the user’s PC.
  2. The BitTorrent tracker is active.
  3. There is a live, active swarm.
Normal Flow:
  1. Users browse the web to find a torrent of interest, download it, and open it with a BitTorrent client.
  2. The BitTorrent client software communicates with a BitTorrent tracker to find the swarm.
  3. The BitTorrent tracker helps the client software trade pieces of the desired file with other computers in the swarm.
  4. User receives multiple pieces of the file simultaneously.
  5. If the user continues to run the BitTorrent client software after the download is complete, allowing peers to receive the file, the user’s future download rates improve under the “tit-for-tat” system.
Why use Parallel/Concurrent computing? When a user sends out a request to download a file, the BitTorrent tracker locates the file by querying other computers that are connected to the Internet and running the file-sharing software, so the parallel computation principle applies in this use case. The torrent software also concurrently manages downloads and uploads with different peers, as well as the user’s tit-for-tat rating.
Benefits: The BitTorrent protocol allows users to distribute large amounts of data without putting the level of strain on their computers that standard Internet hosting would require. A standard host’s servers can easily be brought to a halt if extreme levels of simultaneous data flow are reached. The protocol works as an alternative data distribution method that lets even small, low-bandwidth devices such as mobile phones participate in large data transfers.
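The key idea in step 4 (receiving multiple pieces simultaneously from different swarm members and reassembling them in order) can be sketched with threads. The in-memory `SWARM` dictionary below is a made-up stand-in for real peers; actual BitTorrent clients speak a wire protocol and verify each piece against hashes in the torrent file.

```python
from concurrent.futures import ThreadPoolExecutor

FILE_PIECES = [b"piece-%d" % i for i in range(8)]

# Hypothetical swarm: member name -> the pieces that member holds.
SWARM = {
    "seed-1": dict(enumerate(FILE_PIECES)),            # a seed has every piece
    "peer-a": {i: FILE_PIECES[i] for i in (0, 2, 5)},  # peers hold partial sets
    "peer-b": {i: FILE_PIECES[i] for i in (1, 3, 4, 6, 7)},
}

def fetch_piece(index):
    """Ask any swarm member that has this piece for it (simulated transfer)."""
    for pieces in SWARM.values():
        if index in pieces:
            return index, pieces[index]
    raise LookupError(f"no swarm member has piece {index}")

# Download all pieces concurrently from different swarm members,
# then reassemble them in order -- the heart of the BitTorrent idea.
with ThreadPoolExecutor(max_workers=4) as pool:
    downloaded = dict(pool.map(fetch_piece, range(len(FILE_PIECES))))
assembled = b"".join(downloaded[i] for i in range(len(FILE_PIECES)))
assert assembled == b"".join(FILE_PIECES)
```

Threads fit here because fetching a piece is I/O-bound in a real client: while one transfer waits on the network, others make progress.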

3) Use Case: Facial recognition using 3D images/videos at an Airport

Actors:
  • Security personnel at an airport
Description: With the number of terrorist attacks increasing, airports have started using facial recognition software based on 3D images or videos to identify intruders/terrorists. A newly emerging trend in facial recognition software uses a 3D model, which claims to provide more accuracy. Capturing a real-time 3D image of a person’s facial surface, 3D facial recognition uses distinctive features of the face. This use case describes how places like airports can identify intruders using facial recognition software with the help of concurrent computation.
Trigger: Security personnel see a suspect on the airport premises.
Normal Flow:
  1. Security personnel at the airport point at a video image to acquire a live 3D picture of a subject and pass it to the facial recognition software.
  2. Software determines the head’s position, size and pose.
  3. The system then measures the curves of the face on a sub-millimeter (or microwave) scale and creates a template.
  4. The system translates the template into a unique code. This coding gives each template a set of numbers to represent the features on a subject’s face.
  5. When a 3D image is taken, different points (usually three) are identified. For example, the outside of the eye, the inside of the eye and the tip of the nose will be pulled out and measured. Once those measurements are in place, an algorithm (a step-by-step procedure) will be applied to the image to convert it to a 2D image.
  6. After the conversion, the software runs concurrent threads to compare the image with the 2D images in the database and find a potential match.
  7. If a possible intruder match is found, security personnel will need to call air security for surveillance.
Why use Parallel/Concurrent computing? Possible face-recognition matches need to be processed quickly. To improve the performance of the identification process, the matching algorithms are written using concurrent computation.
Benefits: Possible matches are retrieved in real time from the recognition software, giving the administrator time to identify intruders before they can cause harm.
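Step 6 above (comparing one live template against every database record in concurrent threads) can be sketched like this. The three-number templates, the distance function, and the 0.5 threshold are toy assumptions, not how real facial recognition software scores faces.

```python
from concurrent.futures import ThreadPoolExecutor

def distance(a, b):
    """Toy similarity score: sum of squared differences between templates."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

# Hypothetical database of enrolled templates (each a tuple of measured
# facial features, e.g. eye corners and nose tip).
DATABASE = {
    "passenger-001": (4.1, 2.0, 7.3),
    "watchlist-042": (5.0, 1.2, 6.8),
    "passenger-007": (3.3, 2.9, 8.1),
}

def best_match(live_template, threshold=0.5):
    """Score the live template against every record concurrently and
    return the closest identity if it falls within the threshold."""
    with ThreadPoolExecutor(max_workers=4) as pool:
        scores = dict(zip(DATABASE, pool.map(
            lambda t: distance(live_template, t), DATABASE.values())))
    name = min(scores, key=scores.get)
    return name if scores[name] <= threshold else None

print(best_match((5.0, 1.1, 6.9)))  # prints "watchlist-042"
```

Each comparison is independent, so spreading them across threads (or processes, for a large database) cuts the wall-clock time of the identification step, which is the performance gain the use case relies on.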