I am trying to read a CSV file with over 170000 rows of 10 columns each. I wrote this code in C++ (Visual Studio 2017) to read it, but it only reads about 3600 entries before failing.
// Trial1.cpp : Defines the entry point for the console application.
//
#include "stdafx.h"
#include <iostream>
#include <cstdlib>
#include <fstream>
#include <sstream>
using namespace std;
int main()
{
    ifstream file("Brightest Day.csv");
    if (!file.is_open())
    {
        cout << "ERROR: File Open" << "\n";
    }
    string data[3000][10];
    for (long i = 0; i < 3000; i++)
    {
        for (int j = 0; j < 10; j++)
        {
            getline(file, data[i][j], ',');
        }
    }
    for (long i = 0; i < 3000; i++)
    {
        for (int j = 0; j < 10; j++)
        {
            cout << data[i][j] << " | ";
            if (j == 10)
            {
                cout << "\n";
            }
        }
    }
    return 0;
}
Even if it could only read around 10000 entries, I'd call it a success.
`if (!file.is_open()) { cout << "ERROR: File Open" << "\n"; }` looks like botched error handling to me: if the file is not open, you print an error but continue reading anyway. And `for (long i = 0; i < 3000; i++)` — you said over 170000 rows, yet you only loop while `i < 3000`. Besides, `string data[3000][10];` is never going to hold 170000 strings, no matter how hard you really want it to. Use a `std::vector`.