
I am trying to tokenize a file and insert certain strings into an array. When I tokenize the file and print each token, it works fine, but when I store each token in the array and then print the array's contents, they are not the same at all.

char *filenames[1000];
token = strtok(line, " ");
while (token != NULL) {
    printf("%s\n", token);
    /*
    filenames[i] = token;
    i++;
    */
    token = strtok(NULL, " ");
}
ck = fgets(line, 1000, fp);
for (j = 0; j <= i; j++) {
    printf("%s \n", filenames[j]);
}

Am I supposed to malloc the array of filenames, or malloc each token?

1 Answer


You keep storing the same pointer over and over: strtok returns pointers into the `line` buffer, and the next fgets overwrites that buffer, so every entry in filenames ends up pointing at the same (now stale) data. Make a copy of each token instead. Try something like:

filenames[i] = strdup(token);

You should also remember to free(filenames[i]) when you're done with them.


If you don't have strdup you can roll your own or just use:

filenames[i] = malloc(strlen(token) + 1);    /* XXX check malloc return. */
strcpy(filenames[i], token);

1 Comment

Thank you! I used the malloc instead of strdup and it worked. For some reason, though, when I tokenize my file, newlines are being inserted into tokens even though there are no blank lines between my words. My file looks like this: <list> filename frequency filename frequency <\list>, and all I want inserted in the filenames array are the unique filenames. The strtok should be tokenizing them by " ", but I don't know why the extra newlines are being added into the tokens. Any idea why?
