
How to convert JSON to CSV

  1. Paste your JSON array into the input field — each object in the array becomes one row in the CSV output.
  2. Choose your delimiter: comma (,) for standard CSV, semicolon (;) for European locales, or tab (⇥) for Excel TSV files.
  3. Click "Convert" — the CSV output appears instantly with column headers derived from the JSON object keys.
  4. Switch to "CSV → JSON" using the toggle above if you need the reverse direction.
  5. Click "Copy" or "Download .csv" to use the output in Excel, Google Sheets, a database import, or a data pipeline.

How JSON to CSV conversion works

The converter expects a JSON array of objects — each object becomes one row, and the keys of the first object become the column headers. It runs entirely in your browser: the JSON string is parsed into an array, headers are extracted from the first element's keys, and each object is mapped to a delimited row. No data leaves your machine. Values that contain the delimiter, double quotes, or newlines are automatically wrapped in double quotes and escaped per the RFC 4180 standard.

The delimiter choice determines which applications open the file correctly. Comma is the universal default, but European locales often use semicolon because comma is the decimal separator there. Tab-separated files (TSV) are used when values regularly contain commas — common for address data, product descriptions, and financial records.
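The parse → extract headers → map rows sequence described above can be sketched in a few lines of plain JavaScript. This is a minimal illustration under our own naming (`jsonToCsv` is a hypothetical helper, not the tool's actual source):

```javascript
// Minimal JSON-array-to-CSV sketch (hypothetical helper, not the tool's source).
function jsonToCsv(jsonText, delimiter = ",") {
  const rows = JSON.parse(jsonText);        // 1. parse the input string into an array
  const headers = Object.keys(rows[0]);     // 2. headers come from the first object's keys
  const escape = (value) => {
    const s =
      value === null || value === undefined
        ? ""                                // null/undefined become empty cells
        : typeof value === "object"
        ? JSON.stringify(value)             // nested objects/arrays become JSON strings
        : String(value);
    // RFC 4180: quote fields containing the delimiter, quotes, or newlines,
    // doubling any embedded quotes.
    return s.includes(delimiter) || /["\n\r]/.test(s)
      ? '"' + s.replace(/"/g, '""') + '"'
      : s;
  };
  const lines = [headers.map(escape).join(delimiter)];
  for (const row of rows) {
    lines.push(headers.map((h) => escape(row[h])).join(delimiter)); // 3. one row per object
  }
  return lines.join("\n");
}
```

Running it on the sample array below produces the same comma-delimited output shown there.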

JSON array → CSV conversion
// Input JSON array:
[
  { "name": "Alice", "age": 30, "city": "Kyiv"  },
  { "name": "Bob",   "age": 25, "city": "Lviv"  },
  { "name": "Carol", "age": 35, "city": "Odesa" }
]

// Output CSV (comma delimiter):
name,age,city
Alice,30,Kyiv
Bob,25,Lviv
Carol,35,Odesa

// Output CSV (semicolon — European format):
name;age;city
Alice;30;Kyiv
Bob;25;Lviv
Carol;35;Odesa

Who uses JSON to CSV conversion

JSON is the standard format for APIs and web applications. CSV is the standard format for spreadsheets, databases, and data analysis tools. Converting between them is a daily task for anyone bridging these two worlds.

  • Backend developers — exporting API response data to CSV for QA review, client reporting, or onboarding imports.
  • Data analysts — loading JSON exports from web apps (analytics tools, CRMs, e-commerce platforms) into Excel or Google Sheets for analysis.
  • Database administrators — preparing data for bulk import into SQL databases, which typically accept CSV as the import format.
  • Product managers — converting structured data exports into spreadsheets for stakeholder reports and planning.
  • ETL pipelines — transforming JSON payloads from REST APIs into CSV for loading into data warehouses (BigQuery, Redshift, Snowflake).
  • QA engineers — exporting test data sets from JSON fixtures to CSV for comparison, diff analysis, and test case documentation.

Delimiter options: comma, semicolon, and tab

The choice of delimiter determines which applications can open the resulting file without manual configuration. Comma is the most universally supported, but the correct choice depends on your target application and the content of your data.

Delimiter   Character   Best for
Comma       ,           Standard CSV; default for most tools, APIs, and libraries worldwide
Semicolon   ;           European locales where comma is the decimal separator (France, Germany, Spain, Italy)
Tab         ⇥           TSV files; data that regularly contains commas (addresses, descriptions, financial figures)

Values that contain the chosen delimiter are automatically wrapped in double quotes in the output — this is the RFC 4180 quoting rule. A value like New York, USA becomes "New York, USA" when the delimiter is comma. You do not need to pre-process your data to avoid conflicts.
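The quoting rule can be expressed as a small helper. This is illustrative only (the name `quoteField` is ours), but it follows the RFC 4180 behavior described above:

```javascript
// RFC 4180 quoting: wrap the field in double quotes when it contains the
// delimiter, a double quote, or a newline; escape embedded quotes by doubling them.
function quoteField(value, delimiter = ",") {
  if (value.includes(delimiter) || value.includes('"') || /[\n\r]/.test(value)) {
    return '"' + value.replace(/"/g, '""') + '"';
  }
  return value; // plain values pass through unchanged
}
```

For example, `quoteField("New York, USA")` yields `"New York, USA"`, and a plain value like `Kyiv` passes through untouched.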

When to use CSV — and when to keep JSON

Use CSV when:

  • Opening in Excel or Google Sheets — spreadsheet tools natively open and edit CSV; JSON requires a plugin or an import script.
  • Database bulk import — SQL databases (PostgreSQL, MySQL, SQLite) have efficient COPY and LOAD commands that accept CSV directly.
  • Sharing flat tabular data — when the recipient is a non-developer who needs to filter, sort, and inspect rows without coding.
  • Smaller file size — CSV writes each column name once in the header row; JSON repeats every key in every object, making it 2–5× larger for wide datasets.
  • ETL and data warehouse ingestion — BigQuery, Redshift, and Snowflake all accept CSV as a first-class import format.

Keep JSON when:

  • Data has nested objects or arrays — CSV cannot represent hierarchy; nested data is flattened or lost during conversion.
  • Data types matter — CSV stores everything as strings. JSON preserves numbers, booleans, and null values with their correct types.
  • The target is an API or web application — REST APIs and JavaScript applications natively consume JSON without an extra parsing step.
  • Schema flexibility is needed — JSON arrays can contain objects with different keys; CSV requires all rows to share the same columns.

Handling nested objects and arrays in JSON

CSV is a flat, two-dimensional format — it has rows and columns, but no nesting. If your JSON objects contain nested objects or arrays as values, the converter serializes them to their JSON string representation inside the CSV cell. This preserves the data but means the nested content appears as a string rather than being expanded into separate columns.

Nested JSON in CSV output
// Input JSON with nested object and array:
[
  {
    "name": "Alice",
    "address": { "city": "Kyiv", "zip": "01001" },
    "tags": ["admin", "user"]
  }
]

// Output CSV (nested values serialized as JSON strings):
name,address,tags
Alice,"{""city"":""Kyiv"",""zip"":""01001""}","[""admin"",""user""]"

// To get flat columns, pre-process the JSON first:
// { "name": "Alice", "address_city": "Kyiv", "address_zip": "01001", "tags_0": "admin", "tags_1": "user" }
// This converter handles flat arrays of objects best.
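The pre-processing step above can be done with a short recursive helper before conversion. This is a sketch (`flatten` is a hypothetical name, not part of the tool):

```javascript
// Recursively flatten nested objects and arrays into underscore-joined keys,
// e.g. address.city → address_city and tags[0] → tags_0. Hypothetical helper.
function flatten(obj, prefix = "") {
  const out = {};
  for (const [key, value] of Object.entries(obj)) {
    const name = prefix ? `${prefix}_${key}` : key;
    if (value !== null && typeof value === "object") {
      Object.assign(out, flatten(value, name)); // recurse into objects and arrays
    } else {
      out[name] = value;                        // leaf value: keep as-is
    }
  }
  return out;
}
```

Mapping `flatten` over the array before conversion turns the nested example above into the flat object shown, ready for one-column-per-property CSV output.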

Frequently Asked Questions

What JSON structure does the converter expect?
The converter expects a JSON array of flat objects: [{ "key": "value", ... }, ...]. Each object becomes one CSV row. The keys of the first object are used as column headers. If subsequent objects have different keys, missing values are output as empty strings and extra keys are omitted.
What happens to nested objects or arrays inside the JSON?
Nested objects and arrays are serialized as JSON strings inside the CSV cell. For example, { "address": { "city": "Kyiv" } } becomes a cell containing {"city":"Kyiv"}. CSV cannot represent hierarchy, so pre-flatten your JSON if you need separate columns for nested properties.
Which delimiter should I choose?
Use comma for most tools and APIs worldwide. Use semicolon if your target is Excel on a European locale (where comma is the decimal separator — Excel automatically interprets semicolon-delimited files as CSV in those settings). Use tab when your data values commonly contain commas, such as addresses or product descriptions.
What happens if objects in the array have different keys?
Headers are derived from the first object only. If later objects have extra keys, those columns are silently omitted. If later objects are missing keys that the first object had, those cells output as empty strings. Pre-normalize your JSON to ensure all objects share the same keys for reliable output.
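If you need every key to appear in the output, one option is to pre-normalize the array so all objects share the union of keys. A sketch (the name `normalizeKeys` is ours):

```javascript
// Give every object the union of all keys seen anywhere in the array,
// filling gaps with empty strings. Hypothetical pre-processing step.
function normalizeKeys(rows) {
  const allKeys = [...new Set(rows.flatMap((row) => Object.keys(row)))];
  return rows.map((row) =>
    Object.fromEntries(allKeys.map((key) => [key, key in row ? row[key] : ""]))
  );
}
```

After normalization, headers derived from the first object cover every column, so no data is silently dropped.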
How are null and undefined values handled?
null values are output as empty strings in the CSV. undefined values are also output as empty strings. If you need to distinguish null from empty string in downstream systems, pre-process the JSON before conversion.
How are values that contain the delimiter or quotes handled?
Values containing the delimiter character, double quotes, or newlines are automatically wrapped in double quotes following the RFC 4180 standard. Double quotes inside the value are escaped by doubling them. A value of say "hello" becomes "say ""hello""" in the CSV. You do not need to pre-process your data.
Does the converter support Unicode and non-ASCII characters?
Yes. The conversion is fully Unicode-aware — accented letters, CJK characters, Arabic, Cyrillic, and emoji are all preserved correctly. The downloaded .csv file is encoded as UTF-8. When opening in Excel, use "Data → From Text/CSV" and select UTF-8 encoding to display non-ASCII characters correctly.
Can I convert a single JSON object instead of an array?
The converter expects a JSON array. Wrap a single object in an array: [{ "key": "value" }] to get a one-row CSV. To transpose a single object into a two-column key–value CSV, pre-process the JSON first: Object.entries(obj).map(([k,v]) => ({ key: k, value: v })).
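The key–value transpose mentioned in the answer above, spelled out as a runnable snippet (`obj` and `rows` are example names):

```javascript
// Turn a single object into an array of { key, value } rows,
// suitable for a two-column key–value CSV.
const obj = { name: "Alice", age: 30 };
const rows = Object.entries(obj).map(([k, v]) => ({ key: k, value: v }));
// rows: [{ key: "name", value: "Alice" }, { key: "age", value: 30 }]
```

Feeding `rows` to the converter then produces a CSV with `key` and `value` as the two column headers.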
How does Excel open the CSV output from this converter?
In English locales, double-clicking a .csv file opens it with comma as the delimiter automatically. In European locales, Excel may default to semicolon. If the columns appear merged, use "Data → From Text/CSV" in Excel and manually specify the delimiter and UTF-8 encoding.
What is the maximum file size the converter handles?
There is no server-side limit — the conversion runs entirely in your browser. Practical limits are set by your browser's memory. JSON files up to several megabytes (thousands of rows) convert without issue on modern devices.
Is any data sent to a server during conversion?
No. The entire conversion runs in your browser using JavaScript. No data is transmitted over the network — there are no privacy concerns, no server-side file size limits, and no rate limits.
Can I convert the CSV back to JSON?
Yes — switch to "CSV → JSON" using the toggle above the tool. Note that the round-trip is not lossless: all values become strings in the JSON output because CSV does not store type information. Numbers that were integers in the original JSON will be strings after the round-trip.