I'm trying to understand the following algorithm. It takes as input an integer giving a number of cents, followed by any number of integers representing coin denominations, listed from greatest to least. A 1-cent coin is required. The output is the minimum number of coins needed to make that amount from the denominations provided. For example,
15 5 1
would output 3, since 5+5+5 is the cheapest way to make 15.
A greedy divide-and-modulus approach won't work because of cases like 15 6 5 1: grabbing 6-cent coins first gives 6+6+1+1+1, which is 5 coins, while the optimum is 3.
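For reference, here is a quick sketch of that greedy idea (the greedyCount helper and the hard-coded input are just mine, for illustration); on 15 with denominations 6 5 1 it spends 6+6+1+1+1:

#include <iostream>
#include <vector>
using namespace std;

// Sketch of the greedy divide-and-modulus approach, for comparison only.
// Denominations are assumed to be given from greatest to least, ending in 1.
int greedyCount(int amt, const vector<int>& dens){
    int coins = 0;
    for(int d : dens){
        coins += amt / d;   // take as many of this coin as possible
        amt %= d;           // remainder still to be paid
    }
    return coins;
}

int main(){
    vector<int> dens = {6, 5, 1};
    // Prints 5 (6+6+1+1+1), but the true minimum for 15 is 3 (5+5+5).
    cout << greedyCount(15, dens) << '\n';
    return 0;
}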
#include <iostream>
using namespace std;

int min(int a, int b){
    if(a < b)
        return a;
    return b;
}

int main(){
    int amt;
    int den;
    int * store;
    while(cin >> amt >> den){
        // store[n] holds the fewest coins found so far to make n cents
        store = new int[amt+1];
        // Base case: using only the required 1-cent coin, n cents costs n coins
        for(int n = 0; n < amt+1; n++)
            store[n] = n;
        // Fold in each remaining denomination, stopping at the 1-cent coin
        while(den > 1){
            for(int j = den; j <= amt; j++){
                // Either keep the current best for j, or pay one 'den' coin
                // on top of the best way to make j-den cents
                store[j] = min(store[j], store[j-den] + 1);
            }
            cin >> den;
        }
        cout << store[amt] << '\n';
        delete[] store;   // array form matches new[]
    }
    return 0;
}
Everything except
for(int j = den; j <= amt; j++){
    store[j] = min(store[j], store[j-den] + 1);
}
is pretty trivial. Any ideas how this thing works? The store[j-den]+1
has me especially confused.
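For what it's worth, here is a cut-down sketch (input hard-coded to 15 6 5 1, names are my own) that prints the store array after each denomination is processed, in case watching it evolve helps pin down what I'm missing:

#include <algorithm>
#include <iostream>
using namespace std;

int main(){
    const int amt = 15;
    const int dens[] = {6, 5};           // the 1-cent coin is the base case below
    int store[amt+1];
    for(int n = 0; n <= amt; n++)
        store[n] = n;                    // with only 1-cent coins, n cents costs n coins
    for(int d : dens){
        for(int j = d; j <= amt; j++)
            store[j] = min(store[j], store[j-d] + 1);
        cout << "after " << d << ":";
        for(int n = 0; n <= amt; n++)
            cout << ' ' << store[n];
        cout << '\n';
    }
    return 0;
}

The last printed row ends with 3 at index 15, which matches the expected answer, but I still don't see intuitively why store[j-den]+1 is the right candidate to compare against.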