How to convert CSV to JSON

  1. Paste your CSV data into the input field — the first row must contain the column headers (field names).
  2. Choose the delimiter that matches your CSV file: comma (,), semicolon (;), or tab (⇥).
  3. Click "Convert" — the JSON array output appears instantly with each row mapped to an object.
  4. Switch to "JSON → CSV" using the toggle above if you need the reverse direction.
  5. Click "Copy" or "Download .json" to use the output in your application, API, or data pipeline.

How CSV to JSON conversion works

The converter splits the CSV input into lines, uses the first line as the header row (column names), and maps each subsequent line to a JSON object using the headers as keys and the row values as values. The output is a JSON array where each element corresponds to one CSV row. The conversion runs entirely in your browser — no data is sent to any server.

RFC 4180-compliant quoted fields are handled correctly: values wrapped in double quotes are unquoted, escaped double quotes ("" inside a quoted field) are unescaped to a single quote, and newlines inside quoted fields are preserved. All output values are strings — CSV does not store data type information, so numeric coercion is not applied automatically.
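The parsing steps described above can be sketched as a small character-by-character parser. This is an illustrative sketch with hypothetical helper names (`parseCSV`, `csvToJson`), not this tool's actual source; for brevity it omits the trimming of unquoted values.

```javascript
// Minimal RFC 4180-style CSV parser sketch (hypothetical helpers,
// not this tool's implementation).
function parseCSV(text, delimiter = ",") {
  const rows = [];
  let row = [];
  let field = "";
  let inQuotes = false;
  for (let i = 0; i < text.length; i++) {
    const ch = text[i];
    if (inQuotes) {
      if (ch === '"') {
        if (text[i + 1] === '"') { field += '"'; i++; } // "" -> "
        else inQuotes = false;                          // closing quote
      } else {
        field += ch; // delimiters and newlines inside quotes are literal
      }
    } else if (ch === '"') {
      inQuotes = true;
    } else if (ch === delimiter) {
      row.push(field); field = "";
    } else if (ch === "\n" || ch === "\r") {
      if (ch === "\r" && text[i + 1] === "\n") i++;     // handle CRLF
      row.push(field); field = "";
      rows.push(row); row = [];
    } else {
      field += ch;
    }
  }
  if (field !== "" || row.length > 0) { row.push(field); rows.push(row); }
  return rows;
}

// Map rows to objects using the first row as keys; all values stay strings.
function csvToJson(text, delimiter = ",") {
  const [headers, ...rows] = parseCSV(text, delimiter);
  return rows.map(r =>
    Object.fromEntries(r.map((v, i) => [headers[i], v]))
  );
}
```

Note that short rows naturally produce objects with the trailing keys omitted, which matches the behavior described in the FAQ below.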

CSV → JSON array conversion
// Input CSV (comma delimiter):
name,age,city
Alice,30,Kyiv
Bob,25,Lviv
Carol,35,Odesa

// Output JSON array:
[
  { "name": "Alice", "age": "30", "city": "Kyiv"  },
  { "name": "Bob",   "age": "25", "city": "Lviv"  },
  { "name": "Carol", "age": "35", "city": "Odesa" }
]

// Note: all values are strings. To convert numeric fields:
// data.forEach(row => { row.age = Number(row.age) })

Who uses CSV to JSON conversion

CSV is the universal export format for spreadsheets, databases, and reporting tools. JSON is the universal import format for web applications and APIs. Converting CSV to JSON is the bridge between these two worlds.

  • Frontend developers — loading data from a spreadsheet export (Excel, Google Sheets) into a JavaScript application or chart library.
  • Backend developers — importing database exports or legacy CSV files into JSON-based APIs or document stores.
  • Data engineers — transforming CSV exports from CRMs, ERPs, and analytics tools into JSON for ingestion into NoSQL databases or event streams.
  • Content managers — converting spreadsheet content into JSON data files for static site generators (Hugo, Astro, Eleventy).
  • QA engineers — converting test data from spreadsheets into JSON fixtures for automated test suites.
  • Business analysts — preparing Excel or Google Sheets data for use in JavaScript-based reporting and visualization tools.

CSV input requirements and delimiter guide

The converter follows the RFC 4180 CSV standard. To get reliable output, ensure your CSV meets these requirements before converting:

  • First row must be headers — the column names in the first row become the JSON object keys. If your CSV has no header row, add one before converting — otherwise the first data row is treated as headers and lost.
  • Consistent delimiter — all rows must use the same delimiter. Select the matching option in the converter before clicking Convert.
  • Quoted fields for special characters — values containing the delimiter, double quotes, or newlines must be wrapped in double quotes. The converter handles these correctly when they are properly quoted.
  • UTF-8 encoding — the converter processes the text as UTF-8. Files exported from Excel may be in Windows-1252 encoding — resave as UTF-8 first if you see garbled characters.
Delimiter — when to select it:

  • Comma (,) — default for most tools and APIs; standard in English-locale Excel, Google Sheets, PostgreSQL COPY.
  • Semicolon (;) — European-locale Excel exports; Google Sheets downloads in French, German, Spanish locales.
  • Tab (⇥) — TSV files (.tsv); data containing many commas (addresses, text fields); some database exports.
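If you are unsure which delimiter a file uses, a rough heuristic is to count each candidate in the first line and pick the most frequent. This is an illustration only, not a feature of this tool, and it can be fooled by quoted fields containing delimiters:

```javascript
// Naive delimiter guess: the candidate that splits the first line
// into the most parts (illustrative heuristic, not this tool's code).
function guessDelimiter(text) {
  const firstLine = text.split(/\r?\n/, 1)[0];
  const candidates = [",", ";", "\t"];
  return candidates.reduce((best, d) =>
    firstLine.split(d).length > firstLine.split(best).length ? d : best
  );
}
```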

When to convert to JSON — and when to keep CSV

Convert to JSON when:

  • Target is a web application or API — JavaScript natively parses JSON; CSV requires a separate parsing library or manual import step.
  • Loading into a NoSQL database — MongoDB, Firestore, DynamoDB, and similar stores use JSON-like documents as their native format.
  • Static site generator data files — Hugo, Astro, and Eleventy (11ty) all support JSON data files for template rendering.
  • JavaScript chart libraries — Chart.js, D3, Recharts, and similar libraries accept JSON arrays as their data source directly.
  • Feeding an API or webhook — REST and GraphQL endpoints expect JSON-encoded request bodies.

Keep CSV when:

  • Recipients use Excel or Google Sheets — spreadsheet users are more comfortable filtering and editing CSV than JSON files.
  • Bulk database imports — SQL databases use COPY and LOAD INFILE commands that accept CSV natively and efficiently.
  • Logs and large flat datasets — CSV is more compact and can stream line-by-line without loading the full file into memory.
  • Data interchange with non-developer teams — CSV is universally editable in any spreadsheet tool without special tooling.

Edge cases: quoted fields, numbers, and special characters

CSV appears simple but has several edge cases that trip up basic parsers. This converter handles all of them correctly by following the RFC 4180 specification:

  • Commas inside values — "New York, USA" is a single quoted field containing a comma. The converter treats it as one value, not two columns.
  • Double quotes inside quoted fields — RFC 4180 escapes an embedded quote by doubling it: "say ""hello""" contains say "hello". The converter unescapes these correctly.
  • Newlines inside quoted fields — a multiline value inside double quotes is treated as a single field. The embedded newline is preserved in the JSON string.
  • Leading/trailing whitespace — whitespace inside quoted fields is preserved. Unquoted values are trimmed of leading and trailing spaces.
Quoted fields and special characters in CSV
// CSV input with quoted fields:
name,bio,score
Alice,"Software engineer, Kyiv",95
Bob,"Says ""hello"" often",87
Carol,"Line one
Line two",91

// JSON output:
[
  { "name": "Alice", "bio": "Software engineer, Kyiv", "score": "95" },
  { "name": "Bob",   "bio": "Says \"hello\" often",   "score": "87" },
  { "name": "Carol", "bio": "Line one\nLine two",        "score": "91" }
]

Frequently Asked Questions

Does the first row of the CSV need to be the headers?
Yes. The converter uses the first row as the source of JSON object keys (column names). If your CSV does not have a header row, add one manually before converting — otherwise the first data row will be treated as headers and lost from the output.
How do I choose the right delimiter?
Open the file in a plain text editor to inspect it. If columns are separated by commas, choose comma. If separated by semicolons (common in European Excel exports), choose semicolon. If separated by tabs (common in TSV files or certain database exports), choose tab. Choosing the wrong delimiter produces objects with a single key whose value is the entire unsplit row.
What happens to empty cells in the CSV?
Empty cells produce empty strings ("") in the JSON output. A CSV row like Alice,,Kyiv produces { "name": "Alice", "age": "", "city": "Kyiv" }. If you need null instead of empty string for downstream processing, replace empty strings after conversion.
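The replacement described above can be done in one pass over the output array. A hedged sketch, assuming `data` holds the JSON array this tool produced:

```javascript
// Sample converter output (assumed shape): empty CSV cell -> "".
const data = [{ name: "Alice", age: "", city: "Kyiv" }];

// Replace empty strings with null in every row.
const normalized = data.map(row =>
  Object.fromEntries(
    Object.entries(row).map(([k, v]) => [k, v === "" ? null : v])
  )
);
```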
Are numbers in CSV converted to numbers in JSON?
No — all values are output as strings regardless of content. CSV does not store type information, so 30 in the CSV becomes "30" (a string) in the JSON. To convert numeric fields after parsing: data.forEach(row => { row.age = Number(row.age) }). This avoids silent type coercion errors on non-numeric fields.
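A more defensive variant of the one-liner above coerces a field only when its value actually parses as a number, leaving non-numeric strings untouched. This is a sketch with a hypothetical helper name, and note that `Number("")` is `0`, so empty strings need an explicit guard:

```javascript
// Coerce the named fields to numbers, but only when they parse cleanly
// (hypothetical helper, not part of the tool).
function coerceNumbers(rows, fields) {
  return rows.map(row => {
    const out = { ...row };
    for (const f of fields) {
      const n = Number(out[f]);
      // Skip empty strings (Number("") === 0) and non-numeric values.
      if (out[f] !== "" && !Number.isNaN(n)) out[f] = n;
    }
    return out;
  });
}
```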
How are quoted fields handled?
Fields wrapped in double quotes are unquoted in the output. A field like "New York, USA" becomes the string New York, USA. Embedded double quotes escaped with doubling (say ""hello"") become single double quotes in the output (say "hello"). This follows the RFC 4180 standard.
What about newlines inside quoted CSV fields?
The converter handles multiline values inside quoted fields correctly. A field containing an actual newline character (not a backslash-n) is parsed as a single value and output as a JSON string with an escaped newline (\n).
Does the converter support Unicode and non-ASCII characters?
Yes. The converter is fully Unicode-aware — accented letters, CJK characters, Cyrillic, Arabic, and emoji are all handled correctly. If you paste from a file saved in a non-UTF-8 encoding (such as Windows-1252 from Excel), re-save the file as UTF-8 first to avoid garbled characters.
What if some rows have fewer columns than the header row?
Short rows produce JSON objects with missing keys omitted entirely — the missing fields do not appear as empty strings. If you need consistent keys across all objects, pre-pad short rows with empty cells (e.g., trailing commas) before converting.
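The pre-padding suggested above can be sketched as follows, assuming the rows have already been split into arrays of cells (the function name is hypothetical):

```javascript
// Pad every row with empty strings up to the header width so each
// resulting object carries every key (illustrative sketch).
function padRows(rows, width) {
  return rows.map(r =>
    r.length < width ? [...r, ...Array(width - r.length).fill("")] : r
  );
}
```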
Can I convert a CSV file without headers?
Not directly — the first row is always treated as column names. To work around this, add a synthetic header row (col1,col2,col3) before the data, then rename the keys in the JSON output after conversion.
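The workaround above can be automated by prepending a synthetic header row before conversion. A sketch with a hypothetical helper name; the column count is taken from a naive split of the first line, so it misjudges headerless files whose first row contains quoted delimiters:

```javascript
// Prepend a synthetic header row (col1, col2, ...) to headerless CSV.
function addSyntheticHeaders(csvText, delimiter = ",") {
  const firstLine = csvText.split(/\r?\n/, 1)[0];
  const count = firstLine.split(delimiter).length; // naive: ignores quoting
  const headers = Array.from({ length: count }, (_, i) => `col${i + 1}`);
  return headers.join(delimiter) + "\n" + csvText;
}
```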
What is the output JSON structure?
The output is a JSON array of objects. Each row becomes one object; each column header becomes a key; each cell value becomes the corresponding string value. The array is formatted with 2-space indentation for readability.
Is any data sent to a server during conversion?
No. The entire conversion runs in your browser using JavaScript. No data is transmitted over the network — there are no privacy concerns, no server-side file size limits, and no rate limits.
Can I convert the JSON back to CSV?
Yes — switch to "JSON → CSV" using the toggle above the tool. Note that the round-trip is not lossless: all values are now strings in the CSV. Converting back to JSON produces strings, not the original numbers or booleans.