I'm working on a homework project to tokenize a stream of operands (doubles) and operators. It has been going well: I implemented my own tokenizer and it finishes correctly for most inputs. However, when I test the resulting executable with numbers of more than six digits, I get output that looks random, which made me suspect I'm reading a "fresh" (uninitialized) memory location. For example, 123456 is handled fine, but 1234567 and longer numbers are not.
Here's a code snippet:
token::token(istream& f)
{
    string g;
    f >> g;
    int gpos = 0;
    // checks for the presence of a decimal point
    bool thereisadecimal = false;
    do
    {
        if (id_operand.find(g[gpos]) != string::npos)
        {
            try
            {
                // checks for decimal points to detect invalid input
                // checks whether the value has already been set
                // (it's inside a loop for the scan)
                if (_type != operand_)
                {
                    _type = operand_;
                    _value = strtod(g.c_str(), NULL);
                }
            }
        }...
        ++gpos;
    } while (g.length() != (gpos - 1));
....
When I feed it numbers longer than six digits, I get erroneous output. Why is this happening, and is there a way around it?