Alright, I get your point. I stripped the program down to what you suggested and ran it with my large (60,000-line) rawdata.txt file. The program does not segfault... This is strange, because when reading from the COM port it does segfault in far less time. Why could this be?
Maybe what's coming in at the COM port differs from the text file? You could try logging what is read from the COM port into a text file and then looking through it for differences from what you're expecting. If you write to the log immediately after the fgets, you can be reasonably sure of getting an accurate representation of what's coming in.

Also, unflushed data in buffers can be lost when the program crashes, so when logging, open the log file, write the data, and close the log file each time. It'll slow the program down a bit, but you'll be certain to see what came in at the moment the crash occurs. Or, if you don't have that much storage, write the data to the screen instead, but again make sure you flush the buffers so that what is printed is actually displayed before the crash. I don't know offhand how you do that with cout, but if you use printf instead then fflush(stdout) will do the trick, i.e.

Code:
printf("Hello\n");
fflush(stdout);
strcpy(NULL, "Crash!"); /* deliberate segfault */

This way "Hello" is guaranteed to have been printed when the crash occurs.
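To make the open/write/close logging idea concrete, here is a minimal sketch in C. The function name log_line and the file name rawdata.log are made up for illustration; the point is that because the file is opened and closed on every call, each line reaches disk before the program can crash on the next statement.

```c
#include <stdio.h>

/* Append one received line to a log file.  Opening and closing the
 * file on every call is slow, but it means the data is flushed to
 * disk even if the program segfaults immediately afterwards.
 * Returns 0 on success, -1 if the log file could not be opened. */
int log_line(const char *path, const char *line)
{
    FILE *log = fopen(path, "a");
    if (log == NULL)
        return -1;
    fputs(line, log);
    fclose(log);   /* fclose flushes the stdio buffer to disk */
    return 0;
}
```

In the reader loop this would sit right after the fgets on the port, something like: if (fgets(buf, sizeof buf, port)) log_line("rawdata.log", buf); so the log is a faithful record of exactly what came in up to the moment of the crash.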
Well... the problem seems to be narrowed down quite a bit now. When I got rid of my cout and wrote to a file with fprintf instead, it did not segfault. I guess that points to it overflowing the stdout buffer. Thanks for all the help!