Q: Minimum Spanning tree algorithm (Answered, 1 out of 5 stars, 0 Comments)
Question  
Subject: Minimum Spanning tree algorithm
Category: Computers > Algorithms
Asked by: anuj_kansal12-ga
List Price: $15.00
Posted: 25 Feb 2005 10:17 PST
Expires: 27 Mar 2005 10:17 PST
Question ID: 480783
1) What is the C code for a minimum spanning tree algorithm?
2) What is the C MPI code for a minimum spanning tree algorithm?
Answer  
Subject: Re: Minimum Spanning tree algorithm
Answered By: maniac-ga on 25 Feb 2005 17:52 PST
Rated: 1 out of 5 stars
 
Hello Anuj_kansal12,

There are several good implementations of minimum spanning trees
written in both C alone and C using MPI to implement a parallel
algorithm.

For example:
  http://www.arl.wustl.edu/~jst/cse/541/src/intro.html
[be sure to scroll down about 1/2 down the page]
implements minimum spanning tree algorithms using Kruskal's method,
Prim's method, and a Round Robin method. There is also a program there
to check the output of the previous programs.
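
For illustration, here is a minimal sketch of Kruskal's method in plain
C (my own sketch, not the code from the page above; the input format,
the fixed-size union-find array, and the helper names are assumptions
made for this example). It reads "n m" followed by m lines of "u v w"
edges on standard input, sorts the edges by weight, and keeps each edge
whose endpoints are still in different components:

  /* Minimal Kruskal's MST sketch (illustrative only, not the wustl.edu code).
   * Input: "n m" then m lines "u v w" (0-based vertices, integer weights). */
  #include <stdio.h>
  #include <stdlib.h>

  typedef struct { int u, v, w; } Edge;

  static int parent[10000];            /* union-find array; fixed size for the sketch */

  static int find(int x) {             /* find component root with path halving */
      while (parent[x] != x) {
          parent[x] = parent[parent[x]];
          x = parent[x];
      }
      return x;
  }

  static int cmp_edge(const void *a, const void *b) {
      return ((const Edge *)a)->w - ((const Edge *)b)->w;
  }

  int main(void) {
      int n, m;
      if (scanf("%d %d", &n, &m) != 2) return 1;
      Edge *e = malloc(m * sizeof(Edge));
      for (int i = 0; i < m; i++)
          scanf("%d %d %d", &e[i].u, &e[i].v, &e[i].w);
      for (int i = 0; i < n; i++) parent[i] = i;

      qsort(e, m, sizeof(Edge), cmp_edge);   /* sort edges by increasing weight */

      long total = 0;
      for (int i = 0; i < m; i++) {          /* keep edge if it joins two components */
          int ru = find(e[i].u), rv = find(e[i].v);
          if (ru != rv) {
              parent[ru] = rv;               /* union the components */
              total += e[i].w;
              printf("edge %d-%d (weight %d)\n", e[i].u, e[i].v, e[i].w);
          }
      }
      printf("total weight: %ld\n", total);
      free(e);
      return 0;
  }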

Search phrase
  c source code minimum spanning tree

For an MPI implementation, the choices are a little more limited but
this first reference at
  http://www.eece.unm.edu/~dbader/code.html
[again scroll down about 1/2 into the page]
has several methods implemented. The first file (mst-1.0.tar.gz) has
four different implementations. The second file claims to use an
algorithm that achieves near-linear speedup with added processors.
There are also references to technical papers that describe the
techniques in more detail if needed.

Search phrase
  c mpi source code minimum spanning tree

Good luck with your work.
  --Maniac

Request for Answer Clarification by anuj_kansal12-ga on 25 Feb 2005 20:44 PST
Actually, for the C MPI code, when I try to open the file it does not
work because Windows needs to know which program created it. As a
result, I am unable to see the code.

Request for Answer Clarification by anuj_kansal12-ga on 26 Feb 2005 07:46 PST
This is only the source code for C, but I want the complete program.
Since I was unable to do it myself, I decided to spend the money on
Google Answers. Please provide the full program that I can compile and
execute. Also, I need it before Monday.
Thanks,
Anuj Kansal

Clarification of Answer by maniac-ga on 26 Feb 2005 18:18 PST
Hello Anuj_kansal12,

You ask in a clarification for "complete programs". Both examples that
I identified are complete programs (ready to compile / link /
execute), written in C as you originally asked. For example, at
  http://www.arl.wustl.edu/~jst/cse/541/src/intro.html
there is source to produce the program "mst_kruskal" which implements
Kruskal's algorithm for minimum spanning trees. If you are using
Windows, run it from a DOS window after you have built it (it is a
"command line" application). If you have problems, I can answer
general questions [Note - I do not have access to a Windows PC right
now...].

The source code for the MPI application is stored as a compressed
(gzip) "tar" file. This is a common format for exchanging files for
Unix systems. I believe WinZIP will expand it for you. There are other
applications that can do so as well.

After this is done, you should get a "README" file and separate
directories for each application (source code, etc.). You will also
need an MPI library, but I assume you have access to such a library for
linking. If you need a free MPI library, I suggest using MPICH, which
is available for a variety of systems as source code or prebuilt.
For example, on Windows see
  http://www-unix.mcs.anl.gov/mpi/mpich/mpich-nt/

  --Maniac

Request for Answer Clarification by anuj_kansal12-ga on 01 Mar 2005 10:51 PST
I tried the C code you sent, but it is not working. So if you can help
me with correct code that produces results on execution, please send
it; otherwise, tell me how I can get a refund.
Thanks,
Anuj Kansal

Clarification of Answer by maniac-ga on 01 Mar 2005 17:17 PST
Hello Anuj_kansal12,

I have insufficient information from you to solve the problems you are
having with the C code referenced in the answer. For example, I cannot
determine if you are having problems compiling or running the
software. I can certainly provide other references such as:
  http://perkinsg.tripod.com/c_source_code/kruskal.c
which is a simpler program that implements Kruskal's algorithm for a
specific tree but I cannot be sure you will not have the same problems
as before.

Please explain the problems you are having - in as much detail as you
can - so I can help provide a complete solution.

  --Maniac

Request for Answer Clarification by anuj_kansal12-ga on 01 Mar 2005 21:02 PST
The problem I am facing is while compiling the program. There are many
include files that need to be given to the compiler, like partition.h,
graph.h, etc. Since this code is giving so many problems, I have spent
many hours trying to find the include files, and it still does not
work. Hence, please try to compile and run the code before you send it
to me, if possible.
Thanks,
Anuj Kansal

Clarification of Answer by maniac-ga on 02 Mar 2005 18:27 PST
Hello Anuj_kansal12,

You did not indicate which program you were trying to compile. The
instructions refer to the header files used by mst_rrobin. Adjust the
filenames as needed to fetch the header files you need (e.g., for
mst_kruskal, fetch partition.h).

For stdinc, try:
  http://www.arl.wustl.edu/~jst/cse/541/src/stdinc.h
which I derived from a search like
  site:arl.wustl.edu stdinc
From my review of this header file, it refers to several standard
header files (e.g., stdio, stdlib) as well as defining a few helpful
inline procedures such as "min" and "max". Let me know if you get
compilation errors from this file and I should be able to suggest
workarounds. [I do not have the same compiler you do so I cannot
confirm it works on your system - but I know it compiles with gcc/g++]

For the other header files, see farther down on the same page
  http://www.arl.wustl.edu/~jst/cse/541/src/intro.html
for the data structures section of the original page...

For list.h and dlist.h look for 
  Linked List and
  Doubly Linked List

For partition.h and lopartition.h look for
  Partition Data Structure and
  Lazy Leftist Heaps Data Structure

For wgraph.h look for
  Weighted Undirected Graph

For all of these, both the header file and implementation (.c) are
available as well as a test program. There is also a link to a
"makefile" just above the data structures section which lists the main
programs and files needed for each one.

In case you need another file (.h or .c), it appears that all the
files are in the same directory
  http://www.arl.wustl.edu/~jst/cse/541/src/
Simply add the filename to the end of the URL above to fetch the file you need.

  --Maniac

Request for Answer Clarification by anuj_kansal12-ga on 08 Mar 2005 08:30 PST
First of all, I want to let you know that I am unable to compile the
MPI code even though I have downloaded all the libraries. Secondly, I
want the C code and the MPI code for the same algorithm, whereas the
codes I currently have are for different algorithms, and those are not
working either. So, if you could check the code yourself before
sending it to me, that would be appreciated, so that I do not have to
bother you again and again with problems related to compilation and
execution.

Clarification of Answer by maniac-ga on 13 Mar 2005 19:18 PST
Hello Anuj_kansal12,

If you are having problems with the MPI program and want to use the
exact same algorithm for both programs, I suggest the following:

Modify the working C program (not the MPI one) and make the following changes:

[1] Add a call to MPI_Init near the start of the main program as described at
  http://www-unix.mcs.anl.gov/mpi/www/www3/MPI_Init.html

[2] Add a call to MPI_Finalize near the end of the main program as described at
  http://www-unix.mcs.anl.gov/mpi/www/www3/MPI_Finalize.html

[3] Determine the size of the MPI application and find out which is
the "main program" with the following steps:

[3a] Call MPI_Comm_size as described at
  http://www-unix.mcs.anl.gov/mpi/www/www3/MPI_Comm_size.html

[3b] Call MPI_Comm_rank as described at
  http://www-unix.mcs.anl.gov/mpi/www/www3/MPI_Comm_rank.html

The idea with this step is to determine the number of applications
running (to divide the work) and which one is "rank 0" which can be
used like a "main program" to control the other applications.

[4] Add an if statement based on the rank returned by MPI_Comm_rank.
Consider "rank 0" as the main application and the others the worker
applications. The primary for loop will be within this if statement
but be modified as described in steps 5 and 6.

[5] For the "if" part (e.g., main application), do the following:

[5a] Optional - read the data for the tree to be processed. I don't
know your execution environment; if the application is indeed
distributed across several computers that do not share a filesystem, I
strongly suggest this step.

[5b] Optional - send a message using MPI_Send to describe the data
that will be following in subsequent messages (e.g., number of values
to receive, size of data values to receive) to each worker application
(those not of rank 0). MPI_Send is described at
  http://www-unix.mcs.anl.gov/mpi/www/www3/MPI_Send.html

[5c] Create a new for loop that calls MPI_Send for each worker
application to provide the data it needs to process.

[5d] Create a new for loop that calls MPI_Recv for each worker
application to receive the results of the processing. MPI_Recv is
described at
  http://www-unix.mcs.anl.gov/mpi/www/www3/MPI_Recv.html

[5e] Process the results (if needed) and print out the Minimum Spanning Tree.

[6] For the "else" part of the if statement (e.g., worker application).

[6a] If 5a IS NOT done, read the data describing the tree.

[6b] If 5b IS done, call MPI_Recv to get the information sent by the
main application.

[6c] Call MPI_Recv as needed to get the data sent in step 5c.

[6d] Add a new for loop that does the processing of the part of the
data received in step 6c.

[6e] Call MPI_Send to return the results of 6d to the main application.

The results of the modifications described above should produce an MPI
application that matches your newly described needs.
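
To make the steps above concrete, here is a hedged skeleton in C
showing where those calls go. The chunk size, message tags, and the
do_partial_mst() placeholder are illustrative assumptions, not part of
any of the referenced programs; the real data layout depends on how
your C program stores the graph.

  /* Skeleton of the MPI structure described in steps [1]-[6] above.
   * CHUNK, the tags, and do_partial_mst() are hypothetical placeholders. */
  #include <mpi.h>
  #include <stdio.h>
  #include <stdlib.h>

  #define CHUNK 1024                   /* assumed per-worker data size (illustrative) */

  /* hypothetical placeholder for the per-worker part of the MST computation */
  static double do_partial_mst(const int *data, int count) {
      double partial = 0.0;
      for (int i = 0; i < count; i++) partial += data[i];
      return partial;
  }

  int main(int argc, char **argv) {
      int size, rank;
      MPI_Init(&argc, &argv);                      /* step [1] */
      MPI_Comm_size(MPI_COMM_WORLD, &size);        /* step [3a]: number of processes */
      MPI_Comm_rank(MPI_COMM_WORLD, &rank);        /* step [3b]: which process am I? */

      if (rank == 0) {                             /* steps [4]/[5]: main application */
          int *data = malloc((size_t)(size - 1) * CHUNK * sizeof(int));
          for (int i = 0; i < (size - 1) * CHUNK; i++)
              data[i] = i;                         /* step [5a]: read/build the data */

          for (int r = 1; r < size; r++)           /* step [5c]: send each worker its slice */
              MPI_Send(data + (r - 1) * CHUNK, CHUNK, MPI_INT, r, 0, MPI_COMM_WORLD);

          for (int r = 1; r < size; r++) {         /* step [5d]: collect the results */
              double result;
              MPI_Recv(&result, 1, MPI_DOUBLE, r, 1, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
              printf("result from worker %d: %g\n", r, result);   /* step [5e] */
          }
          free(data);
      } else {                                     /* step [6]: worker applications */
          int chunk[CHUNK];
          MPI_Recv(chunk, CHUNK, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE); /* [6c] */
          double result = do_partial_mst(chunk, CHUNK);            /* step [6d] */
          MPI_Send(&result, 1, MPI_DOUBLE, 0, 1, MPI_COMM_WORLD);  /* step [6e] */
      }

      MPI_Finalize();                              /* step [2] */
      return 0;
  }

With MPICH this would typically be compiled with mpicc and launched
with something like "mpirun -np 4 ./mst_mpi", though the exact
commands depend on your installation.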

For reference, a simple MPI "Hello World" application that makes these
calls and can be used to help check out your MPI installation is at
http://www.tc.cornell.edu/CTC-Main/Services/Education/Topics/MPI/Basics/_CExample.htm
which is a simple application that sends a message from the main
application to the worker applications (which then prints out the
message received).
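
In case that page is unavailable, a minimal sketch along the same
lines (my own, not a copy of the Cornell code; the message text and
buffer size are arbitrary choices) is:

  /* Minimal MPI check: rank 0 sends a short message, workers print it. */
  #include <mpi.h>
  #include <stdio.h>
  #include <string.h>

  int main(int argc, char **argv) {
      int size, rank;
      char msg[64];

      MPI_Init(&argc, &argv);
      MPI_Comm_size(MPI_COMM_WORLD, &size);
      MPI_Comm_rank(MPI_COMM_WORLD, &rank);

      if (rank == 0) {
          strcpy(msg, "Hello from rank 0");
          for (int r = 1; r < size; r++)           /* send to every worker */
              MPI_Send(msg, (int)strlen(msg) + 1, MPI_CHAR, r, 0, MPI_COMM_WORLD);
      } else {
          MPI_Recv(msg, sizeof msg, MPI_CHAR, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
          printf("rank %d received: %s\n", rank, msg);
      }

      MPI_Finalize();
      return 0;
  }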

  --Maniac
anuj_kansal12-ga rated this answer: 1 out of 5 stars
As I asked for a solution for the program, the researchers should
compile and execute the program first before sending it, so that there
could be no doubts.

Comments  
There are no comments at this time.
