CSV files are everywhere. Exports from databases, spreadsheets, analytics platforms, CRMs — and every one of them needs to be converted into structured JSON before your application can use it. Writing a CSV parser sounds simple until you encounter quoted fields containing commas, mixed line endings, or inconsistent column counts.
The DocForge API handles all of these edge cases for you. Send CSV text, get clean JSON back. This guide covers every feature of the POST /api/csv-to-json endpoint with working examples.
Basic CSV Parsing
The simplest case: standard comma-delimited CSV with a header row. The API uses the first row as object keys and returns an array of objects.
```shell
curl -X POST https://docforge-api.vercel.app/api/csv-to-json \
  -H "Content-Type: application/json" \
  -d '{"csv": "name,email,role\nAlice,alice@example.com,admin\nBob,bob@example.com,user"}'
```
```json
{
  "data": [
    { "name": "Alice", "email": "alice@example.com", "role": "admin" },
    { "name": "Bob", "email": "bob@example.com", "role": "user" }
  ],
  "meta": { "rows": 2, "columns": ["name", "email", "role"] }
}
```
Handling Quoted Fields
Real-world CSV files often contain commas inside fields. According to RFC 4180, these fields should be enclosed in double quotes. The DocForge API correctly handles quoted fields, escaped quotes (doubled double-quotes), and newlines within quoted strings.
```shell
curl -X POST https://docforge-api.vercel.app/api/csv-to-json \
  -H "Content-Type: application/json" \
  -d '{"csv": "product,description,price\n\"Widget A\",\"A small, lightweight widget\",9.99\n\"Widget B\",\"A large \"\"premium\"\" widget\",24.99"}'
```
The parser correctly splits on commas that are outside of quotes and unescapes doubled double-quotes within field values.
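To see why naive `split(",")` fails on this input, you can reproduce the RFC 4180 behavior locally with Python's standard-library `csv` module, which applies the same quoting rules:

```python
import csv
import io

# The same CSV payload from the curl example above.
raw = ('product,description,price\n'
       '"Widget A","A small, lightweight widget",9.99\n'
       '"Widget B","A large ""premium"" widget",24.99\n')

# Naive splitting breaks on the embedded comma and leaves quotes in place...
naive = raw.splitlines()[1].split(",")
print(naive)  # ['"Widget A"', '"A small', ' lightweight widget"', '9.99']

# ...while a quote-aware parser recovers the intended three fields.
rows = list(csv.DictReader(io.StringIO(raw)))
print(rows[1]["description"])  # A large "premium" widget
```

The doubled double-quotes in `"A large ""premium"" widget"` are unescaped to a single quote character, exactly as the API does.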
Custom Delimiters and TSV Files
Not all tabular data uses commas. Tab-separated values (TSV), pipe-delimited files, and semicolon-delimited exports from European locales are common. Pass the delimiter parameter to handle any separator character.
```shell
curl -X POST https://docforge-api.vercel.app/api/csv-to-json \
  -H "Content-Type: application/json" \
  -d '{"csv": "name\temail\trole\nAlice\talice@example.com\tadmin", "delimiter": "\t"}'
```
JavaScript Integration
Here is a practical example of converting a CSV file upload to JSON in a web application:
```javascript
async function csvFileToJson(file) {
  const text = await file.text();
  const response = await fetch('https://docforge-api.vercel.app/api/csv-to-json', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ csv: text })
  });
  const { data, meta } = await response.json();
  console.log(`Parsed ${meta.rows} rows with columns: ${meta.columns.join(', ')}`);
  return data;
}

// Usage with a file input
document.getElementById('csv-upload').addEventListener('change', async (e) => {
  const records = await csvFileToJson(e.target.files[0]);
  console.log(records);
});
```
Python Integration
Converting a CSV file to JSON in Python is equally straightforward:
```python
import requests

def csv_to_json(csv_text, delimiter=","):
    response = requests.post(
        "https://docforge-api.vercel.app/api/csv-to-json",
        json={"csv": csv_text, "delimiter": delimiter}
    )
    result = response.json()
    print(f"Parsed {result['meta']['rows']} rows")
    return result["data"]

# Parse a CSV file
with open("users.csv") as f:
    records = csv_to_json(f.read())

# Parse a TSV file
with open("data.tsv") as f:
    records = csv_to_json(f.read(), delimiter="\t")
```
Common CSV Edge Cases
The DocForge parser handles the edge cases that trip up naive CSV splitting approaches:
- Empty fields — Consecutive delimiters produce empty string values, not missing keys.
- Trailing newlines — Extra newlines at the end of the file are ignored rather than producing empty rows.
- Mixed line endings — Windows (CRLF), Unix (LF), and old Mac (CR) line endings all work correctly.
- Unicode content — UTF-8 content, including accented characters, CJK characters, and emoji, is preserved.
- Inconsistent columns — Rows with fewer fields than the header get empty string values for missing columns.
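The short-row padding described above can be sketched in a few lines of Python. This is a local illustration of the behavior, not the API's actual implementation:

```python
import csv
import io

def normalize_rows(csv_text, delimiter=","):
    """Parse CSV and pad rows that have fewer fields than the header."""
    reader = csv.reader(io.StringIO(csv_text), delimiter=delimiter)
    rows = [r for r in reader if r]  # skip blank rows from trailing newlines
    header, body = rows[0], rows[1:]
    return [
        dict(zip(header, row + [""] * (len(header) - len(row))))
        for row in body
    ]

# A short row and a trailing newline, both handled gracefully.
records = normalize_rows("name,email,role\nAlice,alice@example.com,admin\nBob\n")
print(records[1])  # {'name': 'Bob', 'email': '', 'role': ''}
```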
Reverse Direction: JSON to CSV
Need to go the other way? The DocForge API also provides POST /api/json-to-csv for converting an array of JSON objects back to CSV format. It auto-detects all unique keys across all objects to build the column headers. See the API documentation for details.
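The key-union step can be illustrated locally. This sketch assumes columns appear in first-seen order; the actual ordering used by the API may differ:

```python
def collect_headers(records):
    """Union of all keys across a list of dicts, in first-seen order."""
    headers = []
    for record in records:
        for key in record:
            if key not in headers:
                headers.append(key)
    return headers

records = [
    {"name": "Alice", "email": "alice@example.com"},
    {"name": "Bob", "role": "user"},  # has a key the first object lacks
]
print(collect_headers(records))  # ['name', 'email', 'role']
```

Objects missing a given key would simply get an empty cell in that column of the generated CSV.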
When to Use an API vs. a Library
Use the API when you want zero parsing dependencies, consistent behavior across languages, or when you are processing user-uploaded CSV files that might contain unexpected formatting. Use a library when you need offline processing, have strict latency requirements below 10ms, or process CSV files larger than the 100KB free tier limit.
For most web applications, the API approach is simpler and more maintainable. You trade a few milliseconds of network latency for zero dependencies and guaranteed RFC-compliant parsing.
Try DocForge API Free
500 requests/day, no credit card required. Parse CSV to JSON in milliseconds.
Try It Live