JSON to CSV Converter
Paste a JSON array — get CSV output instantly. Choose your delimiter below.
How to convert JSON to CSV
- Paste your JSON array into the input field — each object in the array becomes one row in the CSV output.
- Choose your delimiter: comma (,) for standard CSV, semicolon (;) for European locales, or tab (⇥) for TSV files that open cleanly in Excel.
- Click "Convert" — the CSV output appears instantly with column headers derived from the JSON object keys.
- Switch to "CSV → JSON" using the toggle above if you need the reverse direction.
- Click "Copy" or "Download .csv" to use the output in Excel, Google Sheets, a database import, or a data pipeline.
How JSON to CSV conversion works
The converter expects a JSON array of objects — each object becomes one row, and the keys of the first object become the column headers. It runs entirely in your browser: the JSON string is parsed into an array, headers are extracted from the first element's keys, and each object is mapped to a delimited row. No data leaves your machine. Values that contain the delimiter, double quotes, or newlines are automatically wrapped in double quotes and escaped per the RFC 4180 standard.
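The in-browser pipeline described above can be sketched roughly like this. This is a minimal illustration, not the converter's actual source; the function and variable names are made up for the example:

```javascript
// Sketch: JSON array of objects → delimited text with RFC 4180 quoting.
function jsonToCsv(jsonText, delimiter = ",") {
  const rows = JSON.parse(jsonText);      // parse the JSON string into an array
  const headers = Object.keys(rows[0]);   // column headers from the first object's keys

  // RFC 4180: quote a field if it contains the delimiter, a double quote,
  // or a newline; embedded double quotes are doubled.
  const quote = (value) => {
    const s = String(value);
    return /["\n]/.test(s) || s.includes(delimiter)
      ? '"' + s.replace(/"/g, '""') + '"'
      : s;
  };

  const lines = [headers.join(delimiter)];
  for (const row of rows) {
    lines.push(headers.map((h) => quote(row[h])).join(delimiter));
  }
  return lines.join("\n");
}
```

Because everything happens in a single pass over an already-parsed array, even large inputs convert without any network round trip.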
The delimiter choice determines which applications open the file correctly. Comma is the universal default, but European locales often use semicolon because comma is the decimal separator there. Tab-separated files (TSV) are used when values regularly contain commas — common for address data, product descriptions, and financial records.
// Input JSON array:
[
{ "name": "Alice", "age": 30, "city": "Kyiv" },
{ "name": "Bob", "age": 25, "city": "Lviv" },
{ "name": "Carol", "age": 35, "city": "Odesa" }
]
// Output CSV (comma delimiter):
name,age,city
Alice,30,Kyiv
Bob,25,Lviv
Carol,35,Odesa
// Output CSV (semicolon — European format):
name;age;city
Alice;30;Kyiv
Bob;25;Lviv
Carol;35;Odesa
Who uses JSON to CSV conversion
JSON is the standard format for APIs and web applications. CSV is the standard format for spreadsheets, databases, and data analysis tools. Converting between them is a daily task for anyone bridging these two worlds.
- Backend developers — exporting API response data to CSV for QA review, client reporting, or onboarding imports.
- Data analysts — loading JSON exports from web apps (analytics tools, CRMs, e-commerce platforms) into Excel or Google Sheets for analysis.
- Database administrators — preparing data for bulk import into SQL databases, which typically accept CSV as the import format.
- Product managers — converting structured data exports into spreadsheets for stakeholder reports and planning.
- ETL pipelines — transforming JSON payloads from REST APIs into CSV for loading into data warehouses (BigQuery, Redshift, Snowflake).
- QA engineers — exporting test data sets from JSON fixtures to CSV for comparison, diff analysis, and test case documentation.
Delimiter options: comma, semicolon, and tab
The choice of delimiter determines which applications can open the resulting file without manual configuration. Comma is the most universally supported, but the correct choice depends on your target application and the content of your data.
| Delimiter | Character | Best for |
|---|---|---|
| Comma | , | Standard CSV; default for most tools, APIs, and libraries worldwide |
| Semicolon | ; | European locale users where comma is the decimal separator (France, Germany, Spain, Italy) |
| Tab | ⇥ | TSV files; data that regularly contains commas (addresses, descriptions, financial figures) |
Values that contain the chosen delimiter are automatically wrapped in double quotes in the output — this is the RFC 4180 quoting rule. A value like New York, USA becomes "New York, USA" when the delimiter is comma. You do not need to pre-process your data to avoid conflicts.
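The quoting rule itself is small enough to show directly. This is an illustrative sketch of RFC 4180 field quoting, with a hypothetical function name:

```javascript
// Quote a single CSV field per RFC 4180: wrap in double quotes if the
// value contains the delimiter, a double quote, or a newline, and
// double any embedded quotes.
function quoteField(value, delimiter = ",") {
  const needsQuoting =
    value.includes(delimiter) || value.includes('"') || value.includes("\n");
  return needsQuoting ? '"' + value.replace(/"/g, '""') + '"' : value;
}

quoteField("New York, USA");  // wrapped in quotes because it contains a comma
quoteField('She said "hi"');  // embedded quotes are doubled, then the field is wrapped
```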
When to use CSV — and when to keep JSON
Use CSV when:
- Opening in Excel or Google Sheets — spreadsheet tools natively open and edit CSV; JSON requires a plugin or an import script.
- Database bulk import — SQL databases (PostgreSQL, MySQL, SQLite) have efficient COPY and LOAD commands that accept CSV directly.
- Sharing flat tabular data — when the recipient is a non-developer who needs to filter, sort, and inspect rows without coding.
- Smaller file size — CSV lists each column name once in the header row; JSON repeats every key name in every object, making it 2–5× larger for wide datasets.
- ETL and data warehouse ingestion — BigQuery, Redshift, and Snowflake all accept CSV as a first-class import format.
Keep JSON when:
- Data has nested objects or arrays — CSV cannot represent hierarchy; nested data is flattened or lost during conversion.
- Data types matter — CSV stores everything as strings. JSON preserves numbers, booleans, and null values with their correct types.
- The target is an API or web application — REST APIs and JavaScript applications natively consume JSON without an extra parsing step.
- Schema flexibility is needed — JSON arrays can contain objects with different keys; CSV requires all rows to share the same columns.
Handling nested objects and arrays in JSON
CSV is a flat, two-dimensional format — it has rows and columns, but no nesting. If your JSON objects contain nested objects or arrays as values, the converter serializes them to their JSON string representation inside the CSV cell. This preserves the data but means the nested content appears as a string rather than being expanded into separate columns.
// Input JSON with nested object and array:
[
{
"name": "Alice",
"address": { "city": "Kyiv", "zip": "01001" },
"tags": ["admin", "user"]
}
]
// Output CSV (nested values serialized as JSON strings):
name,address,tags
Alice,"{""city"":""Kyiv"",""zip"":""01001""}","[""admin"",""user""]"
// To get flat columns, pre-process the JSON first:
// { "name": "Alice", "address_city": "Kyiv", "address_zip": "01001", "tags_0": "admin" }
// This converter handles flat arrays of objects best.
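A pre-processing step like the one above can be written as a small recursive helper. This is a hypothetical sketch of such a flattener, not a feature of the converter itself:

```javascript
// Flatten nested objects and arrays into underscore-prefixed keys,
// e.g. { address: { city: "Kyiv" } } → { address_city: "Kyiv" }
// and   { tags: ["admin"] }          → { tags_0: "admin" }.
function flatten(obj, prefix = "") {
  const out = {};
  for (const [key, value] of Object.entries(obj)) {
    const name = prefix ? `${prefix}_${key}` : key;
    if (value !== null && typeof value === "object") {
      Object.assign(out, flatten(value, name)); // recurse into objects and arrays
    } else {
      out[name] = value;
    }
  }
  return out;
}
```

Mapping `flatten` over the array before conversion turns every nested value into its own column, at the cost of one column per array element.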