Hello friends,

I need a program that counts the total number of stars in a galaxy image. My idea is: if a pixel's RGB value is less than 50, it isn't a star; otherwise it is detected as a star. Any cluster of star pixels joined together should be counted as a single star.

It's my assignment and I need to submit it very soon.

Can anyone help me?

Sure, we can help! Just remember to follow the forum rules, and pay attention to the posted announcements (here and here), and there won't be any problems.

This post by vegaseat should get you started...


The basic algorithm I would use to start with goes something like this; as the code matures I would look to refine and optimize it.

Loop through each pixel in the image:
    if the pixel's calculated luminance is more than a set constant:
         record pixel's location
         if none of the eight neighboring pixels are recorded:
             increment the star count

Note that I haven't looked at this algorithm for design flaws or potential problems, but its simplicity makes it good as a starting point.
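One concrete way to handle the "joined clusters count once" requirement is to treat each group of bright pixels as a connected component and flood-fill it the first time it is encountered. Here is a minimal sketch in Python using only the standard library; the `count_stars` and `luminance` helpers, the 8-connectivity choice, and the luminance formula are my own assumptions, not anything from the linked posts, and the image is represented as a plain 2D list of RGB tuples so you can adapt it to however you load your image.

```python
from collections import deque

def luminance(rgb):
    # Standard Rec. 601 weighting; an assumption -- the original post
    # just compares raw RGB values against 50.
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def count_stars(pixels, threshold=50):
    """Count connected bright regions (8-connectivity) in a 2D grid of RGB tuples."""
    h = len(pixels)
    w = len(pixels[0]) if h else 0
    seen = [[False] * w for _ in range(h)]
    stars = 0
    for y in range(h):
        for x in range(w):
            if seen[y][x] or luminance(pixels[y][x]) <= threshold:
                continue
            # Found a new bright pixel not belonging to a counted star.
            stars += 1
            # Flood-fill the whole cluster so it is only counted once.
            queue = deque([(y, x)])
            seen[y][x] = True
            while queue:
                cy, cx = queue.popleft()
                for dy in (-1, 0, 1):
                    for dx in (-1, 0, 1):
                        ny, nx = cy + dy, cx + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and not seen[ny][nx]
                                and luminance(pixels[ny][nx]) > threshold):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return stars

if __name__ == "__main__":
    dark, bright = (0, 0, 0), (255, 255, 255)
    img = [
        [dark, bright, dark, dark],
        [dark, bright, dark, dark],
        [dark, dark, dark, bright],
    ]
    # Two bright pixels touch vertically (one star); the third is isolated.
    print(count_stars(img))  # 2
```

The flood fill avoids the neighbor-checking pitfall in the pseudocode above: depending on scan order, a pixel can be visited before its cluster-mates, so checking whether neighbors were already recorded can over- or under-count. Marking the entire component as seen in one pass sidesteps that.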

I don't get it. Do you want the program in Python, or in C++ as posted here -> click?
