Judy @esproc_spl

Splitting Huge CSV Files Into Smaller Chunks For Memory Efficiency

Split a huge CSV file into smaller chunks with an esProc script: create a cursor on the file, loop through the rows 100,000 at a time to respect the memory constraint, and export each batch to sample[n].csv.

Problem description & analysis
Below is the CSV file sample.csv:
v2aowqhugt,q640lwdtat,8cqw2gtm0g,ybdncfeue8,3tzwyiouft,…
f0ewv2v00z,x2ck96ngmd,9htr2874n5,fx430s8wqy,tw40yn3t0j,…
p2h6fphwco,kldbn6rbzt,8okyllngxz,a8k9slqfms,bqz5fb7cm9,…
st63tcbfv8,2n862vqzww,2equ0ydeet,0x5tidunc6,npis28avpj,…
bn1u58s39a,mg7064jlrb,edyj3t4s95,zvuf9n29ai,1m0yn8uh0n,…
…
The file contains a huge volume of data that cannot be wholly loaded into memory; at most 100,000 rows can be loaded at a time into the available memory space. So we need to split the file into multiple smaller CSV files containing 100,000 rows each,...
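The article's solution is an esProc SPL script; the same chunked-split idea can be sketched in Python for readers without esProc. This is a minimal illustration, not the author's script: the function name `split_csv` and the `prefix` parameter are assumptions, and only one chunk of rows (at most 100,000) is held in memory at a time.

```python
import csv
import itertools
from pathlib import Path

def split_csv(src, rows_per_chunk=100_000, prefix="sample"):
    """Split src into prefix1.csv, prefix2.csv, ... holding at most
    rows_per_chunk rows each. The source is read through a streaming
    csv.reader, so only one chunk ever sits in memory."""
    out_paths = []
    with open(src, newline="") as f:
        reader = csv.reader(f)
        for n in itertools.count(1):
            # Pull the next batch of up to rows_per_chunk rows.
            chunk = list(itertools.islice(reader, rows_per_chunk))
            if not chunk:
                break  # source exhausted
            path = Path(f"{prefix}{n}.csv")
            with open(path, "w", newline="") as out:
                csv.writer(out).writerows(chunk)
            out_paths.append(path)
    return out_paths
```

Calling `split_csv("sample.csv")` would then produce sample1.csv, sample2.csv, and so on, each with up to 100,000 rows, mirroring the sample[n].csv naming the esProc script uses.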