Does anyone have an idea about this program?
Assume that you are paid one cent the first day, two cents the second day, four cents the third day, with the daily amount continuing to double in this way. Design an algorithm and use it to write a Python program that uses loops to calculate the amount of money a person would earn over a period of time (maximum period: one year) if paid in this way.

Requirements:
- The program should be implemented with a 'pre-tested loop' terminated by a sentinel such as 0 or [Enter] in response to a "How many days?" type prompt (i.e. do NOT control the loop by asking something like "Do you want to do another calculation?" or "Do you want to quit?").
- The user's response should be validated to ensure that the value entered falls between 1 and 366 days inclusive; if it doesn't, the user should be asked to enter the value again.
- Output (for each pass through the main loop) should consist of a two-column table with 'day' in the first column and 'today's salary' in the second column.
- At the bottom of the table, the total salary should be displayed.
- All monetary values should be shown as dollar amounts, not cents (for example $17.50, not 1750 cents).
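Here is a rough sketch of one way this could be put together; it is not the only valid design, and the prompt/error wording and function names are my own choices. It uses a pre-tested (priming-read) loop ended by a sentinel of 0 or an empty [Enter], validates the day count, and prints a day/salary table plus a total, all formatted as dollars. Invalid input falls back to the main prompt, which is one reading of "requested to enter the value again".

```python
# Sketch of the doubling-salary program: sentinel-controlled pre-tested loop,
# 1-366 validation, two-column table, and a running total shown in dollars.

def dollars(cents):
    """Format an integer number of cents as a dollar string, e.g. 1750 -> '$17.50'."""
    whole, part = divmod(cents, 100)
    return f"${whole:,}.{part:02d}"

def pay_table(days):
    """Print the day/salary table and the total for the given number of days."""
    pay_cents = 1                  # one cent on day 1
    total_cents = 0
    print(f"{'Day':>5}  {'Salary':>25}")
    for day in range(1, days + 1):
        print(f"{day:>5}  {dollars(pay_cents):>25}")
        total_cents += pay_cents
        pay_cents *= 2             # salary doubles each day
    print(f"Total salary: {dollars(total_cents)}")

def main():
    prompt = "How many days? (0 or Enter to quit): "
    response = input(prompt).strip()
    while response not in ("", "0"):           # pre-tested loop with sentinel
        if response.isdigit() and 1 <= int(response) <= 366:
            pay_table(int(response))
        else:
            print("Please enter a whole number of days between 1 and 366.")
        response = input(prompt).strip()       # read next value for the loop test

if __name__ == "__main__":
    main()
```

One design note: the salary is kept as an integer number of cents and only converted to a dollar string for display, because Python integers are exact at any size, while for long periods (the last day of a 366-day run pays 2^365 cents) floating-point dollars would lose precision.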