This article explains in detail how to parse JSON with Power Query in Excel and Power BI, from simple files to complex web APIs, so that you can reliably convert nested JSON into clean analytical tables while maintaining performance and refreshability.
1. Why parsing JSON with Power Query matters
JSON has become a standard format for APIs, log files, cloud exports, and modern applications.
Excel and Power BI users increasingly receive raw JSON instead of CSV or databases, and they need a repeatable way to turn that JSON into usable tables.
Power Query is designed exactly for this purpose, offering built-in connectors, a dedicated JSON engine, and a graphical interface to navigate, expand, and shape complex structures.
By mastering how to parse JSON with Power Query, you can automate recurring data imports, eliminate manual copy-paste steps, and ensure that your reports refresh reliably as the upstream JSON changes.
2. JSON and Power Query data structures
To parse JSON with Power Query effectively, it is important to understand how JSON types map to Power Query types.
| JSON construct | Example JSON | Power Query type | Power Query representation |
|---|---|---|---|
| Object | {"id": 1, "name": "Alice"} | Record | [id = 1, name = "Alice"] |
| Array | [1, 2, 3] | List | {1, 2, 3} |
| Array of objects | [{"id": 1}, {"id": 2}] | List of records | {[id = 1], [id = 2]} |
| Primitive value | "status" | Text, number, logical, etc. | "status" |
| Nested structure | {"order": {"id": 1, "lines": [...]}} | Record with nested list or record | [order = [id = 1, lines = {...}]] |
Internally, Power Query uses the Json.Document function to convert raw JSON text or binary into these record, list, and primitive structures before you start expanding them in the UI.
3. Loading JSON into Power Query
3.1 Loading a JSON file in Excel
The most common scenario is a JSON file exported from another system that you want to analyze in Excel.
In modern Excel, you can load a JSON file with the dedicated JSON connector.
- On the Data tab, select Get Data.
- Choose From File, then From JSON.
- Browse to the JSON file and confirm.
- Excel opens the Power Query Editor, showing the root JSON node as a list or record.
Behind the scenes, the connector uses M code similar to the following.
```
let
    Source = Json.Document(File.Contents("C:\Data\orders.json"))
in
    Source
```

Depending on the structure, the initial view might show a single List, a Record, or an already tabular structure that you can expand right away.
3.2 Loading JSON from a web API in Excel or Power BI
Power Query can also call web APIs that return JSON, which is common in Power BI and in more advanced Excel solutions.
- In Excel, select Data > Get Data > From Other Sources > From Web.
- In Power BI Desktop, select Get Data > Web.
- Enter the API URL that returns JSON and proceed through any credential prompts.
- Power Query detects the JSON and creates an initial query that uses Json.Document over Web.Contents.
A typical M pattern looks like this.
```
let
    Source = Json.Document(
        Web.Contents("https://api.example.com/orders?from=2025-01-01&to=2025-01-31")
    )
in
    Source
```

In Power BI, the JSON connector and the Web connector both end up using this combination of Web.Contents and Json.Document to retrieve and parse the JSON response.
4. Parsing JSON text columns with the Parse > JSON command
Sometimes JSON arrives not as a file or API source, but as text stored inside a column, for example a log table with a Payload field that contains JSON strings.
Power Query provides a dedicated Parse feature that converts a text column directly into structured JSON.
- Select the column that contains JSON text.
- On the Transform tab, select Parse, then JSON.
- Power Query converts each cell into a structured Record or List value.
- Click the expand icon in the column header to expand the fields you need into separate columns.
You can also perform the same operation from Add Column > Parse > JSON if you want to keep the original text column and add a new structured column instead of overwriting it.
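As a hedged sketch, the M steps these commands typically generate look like the following; the column name Payload is an assumption for illustration.

```
// Transform > Parse > JSON typically replaces the text column in place:
TransformedPayload = Table.TransformColumns(
    Source,
    {{"Payload", Json.Document}}
),

// Add Column > Parse > JSON keeps the original text and adds a parsed column:
AddedParsed = Table.AddColumn(
    Source,
    "ParsedPayload",
    each Json.Document([Payload])
)
```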
Note: When parsing JSON text, ensure that every row contains valid JSON and uses consistent encoding; otherwise the step may throw an error or create mixed data types.
5. Navigating lists and records in the Power Query UI
After loading or parsing JSON, the Power Query preview often displays links such as List, Record, or Table in each row.
The key skill is to convert lists into tables and expand records in the right order until you have a flat tabular structure.
5.1 Converting lists to tables
When you see a column where each cell is a List, the usual workflow is as follows.
- Click the List value in the preview to drill into a single list.
- On the List Tools > Transform tab, select To Table.
- Accept the default conversion settings unless you need a different delimiter or error behavior.
- You now have a one-column table, typically with a column named Column1.
- Use the expand icon in Column1 to expand nested records into separate columns.
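The workflow above can be sketched in M as follows; the sample list of records stands in for a drilled-down JSON list, and the step names are illustrative.

```
let
    // A small list of records standing in for a drilled-down JSON list
    Source = {[id = 1], [id = 2]},
    // M that the To Table dialog typically generates
    AsTable = Table.FromList(Source, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    // Expand the records in Column1 into real columns
    Expanded = Table.ExpandRecordColumn(AsTable, "Column1", {"id"}, {"id"})
in
    Expanded
```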
5.2 Expanding records
When a column shows Record in each row, it means each cell contains multiple named fields.
Click the expand icon on the column header and select which fields you want to bring into the table.
You can optionally clear the prefix option so that the new column names are cleaner and shorter.
| Scenario | UI action | Effect |
|---|---|---|
| List of primitive values | List Tools > To Table | Create one column with one row per value. |
| List of records | To Table, then expand Column1 | Create multiple columns for record fields. |
| Record values in a column | Expand icon on column header | Split each record into multiple columns. |
| Nested table in a column | Expand icon on table column | Join the nested table to the main table by rows. |
Note: Expand only the fields you need instead of expanding everything, especially when working with large or deeply nested JSON, to improve refresh performance and reduce memory usage.
6. End-to-end example: parsing an orders JSON array
Consider a simple API response that returns an array of orders, each with nested line items.
```
[
  {
    "orderId": 1001,
    "customer": { "id": 1, "name": "Acme Corp" },
    "orderDate": "2025-01-10",
    "lines": [
      { "sku": "A001", "qty": 2, "price": 10.0 },
      { "sku": "B005", "qty": 1, "price": 25.0 }
    ]
  },
  {
    "orderId": 1002,
    "customer": { "id": 2, "name": "Beta Ltd" },
    "orderDate": "2025-01-11",
    "lines": [
      { "sku": "A001", "qty": 1, "price": 10.0 }
    ]
  }
]
```

The M code for a clean solution can be built step by step.
```
let
    // 1. Load JSON from a file or API
    Source = Json.Document(Web.Contents("https://api.example.com/orders")),

    // 2. Convert the top-level list of orders into a table
    OrdersList = Source,
    OrdersTable = Table.FromList(OrdersList, Splitter.SplitByNothing(), {"OrderRecord"}),

    // 3. Expand each order record into separate columns
    ExpandedOrders = Table.ExpandRecordColumn(
        OrdersTable, "OrderRecord",
        {"orderId", "customer", "orderDate", "lines"},
        {"orderId", "customer", "orderDate", "lines"}
    ),

    // 4. Expand the customer record
    ExpandedCustomer = Table.ExpandRecordColumn(
        ExpandedOrders, "customer",
        {"id", "name"},
        {"CustomerId", "CustomerName"}
    ),

    // 5. Expand the lines list, creating one row per line item
    LinesTable = Table.ExpandListColumn(ExpandedCustomer, "lines"),

    // 6. Expand the line item record
    ExpandedLines = Table.ExpandRecordColumn(
        LinesTable, "lines",
        {"sku", "qty", "price"},
        {"Sku", "Quantity", "UnitPrice"}
    ),

    // 7. Add a computed column for line amount
    AddAmount = Table.AddColumn(
        ExpandedLines, "LineAmount",
        each [Quantity] * [UnitPrice],
        type number
    )
in
    AddAmount
```

After these steps, you obtain a fact table with one row per order line, including order header data, customer attributes, and line-level measures such as quantity and amount.
7. Advanced patterns for complex JSON
7.1 Dynamic parameters in JSON API URLs
Many APIs require dates, IDs, or paging parameters in the URL.
Instead of hard-coding those values, you can build them dynamically from a parameters table or from query parameters.
```
let
    // Parameter table with FromDate and ToDate
    Params = Excel.CurrentWorkbook(){[Name = "ApiParameters"]}[Content],
    FromDate = Text.From(Params{0}[FromDate]),
    ToDate = Text.From(Params{0}[ToDate]),

    // Build a query record for Web.Contents
    Source = Json.Document(
        Web.Contents(
            "https://api.example.com/orders",
            [Query = [from = FromDate, to = ToDate]]
        )
    )
in
    Source
```

This pattern makes your JSON queries much easier to maintain and allows business users to control refresh periods without editing M code.
7.2 Automatically expanding unknown JSON schemas
In some scenarios, the shape of the JSON can change over time, for example when a SaaS provider adds new fields to an API.
Instead of manually updating expand steps, you can create a generic Power Query function that recursively walks through lists and records, converting everything into a fully expanded table.
A common approach is to split the logic into two functions, such as a parser that loads JSON and a column expander that iteratively expands all nested structures until only primitive columns remain.
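A minimal sketch of such an expander is shown below; every name here is illustrative, and it assumes that inspecting the first record in a column is enough to discover its fields.

```
// Hedged sketch of a recursive "expand everything" helper.
// This is not a built-in function; all names are illustrative.
let
    ExpandAll = (tbl as table) as table =>
        let
            // Find columns that still contain lists or records
            ListCols = List.Select(
                Table.ColumnNames(tbl),
                (c) => List.AnyTrue(List.Transform(Table.Column(tbl, c), each _ is list))
            ),
            RecordCols = List.Select(
                Table.ColumnNames(tbl),
                (c) => List.AnyTrue(List.Transform(Table.Column(tbl, c), each _ is record))
            ),
            Result =
                if List.Count(ListCols) > 0 then
                    // One row per list element, then recurse
                    @ExpandAll(Table.ExpandListColumn(tbl, ListCols{0}))
                else if List.Count(RecordCols) > 0 then
                    let
                        Col = RecordCols{0},
                        // Discover field names from the first record in the column
                        Fields = Record.FieldNames(
                            List.First(List.Select(Table.Column(tbl, Col), each _ is record))
                        ),
                        // Prefix new columns with the parent name to avoid collisions
                        NewNames = List.Transform(Fields, each Col & "." & _)
                    in
                        @ExpandAll(Table.ExpandRecordColumn(tbl, Col, Fields, NewNames))
                else
                    tbl
        in
            Result
in
    ExpandAll
```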
Note: Generic JSON expansion functions are powerful but can be expensive on wide or deeply nested payloads, so they should be used carefully and often combined with filters that limit the amount of data before full expansion.
7.3 Combining multiple JSON files
It is common to receive one JSON file per day or per entity and then combine them into a single table.
Power Query can combine multiple JSON files from a folder using a pattern similar to combining CSVs, but with a custom function that parses each JSON file.
```
let
    // 1. List all files in the folder
    Source = Folder.Files("C:\Data\DailyJson\"),

    // 2. Keep only .json files
    Filtered = Table.SelectRows(
        Source,
        each Text.Lower(Text.End([Name], 5)) = ".json"
    ),

    // 3. Define a function that parses a single JSON file
    ParseJsonFile = (FileContent as binary) as table =>
        let
            JsonValue = Json.Document(FileContent),
            AsTable = Table.FromList(JsonValue, Splitter.SplitByNothing(), {"Record"}),
            Expanded = Table.ExpandRecordColumn(
                AsTable, "Record",
                {"field1", "field2"},
                {"Field1", "Field2"}
            )
        in
            Expanded,

    // 4. Invoke the function for each file
    AddParsed = Table.AddColumn(Filtered, "Data", each ParseJsonFile([Content])),

    // 5. Combine all parsed tables
    Combined = Table.Combine(AddParsed[Data])
in
    Combined
```

This pattern allows you to maintain one parsing function and reuse it across hundreds of JSON files from the same system.
8. Performance and best practices when parsing JSON
Parsing JSON can become a performance bottleneck if the files are large, the structures are deeply nested, or you expand more data than you actually need.
8.1 Filter early, expand late
Whenever possible, apply filters before expanding large nested lists or records.
For web APIs, push filters to the server by using query parameters rather than retrieving everything and filtering in Power Query.
8.2 Avoid unnecessary auto-detect steps
Power Query sometimes inserts automatic type detection and changed-type steps immediately after loading JSON.
For large datasets, these steps can be surprisingly expensive, so it is often better to remove them and add explicit type conversions only at the end of your query.
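A minimal sketch of a single explicit typing step at the end of a query follows; the step and column names assume the orders example and are not part of any generated code.

```
// One explicit type-conversion step at the end of the query,
// replacing auto-detected Changed Type steps.
// Column names follow the orders example and are assumptions.
Typed = Table.TransformColumnTypes(
    ExpandedLines,
    {
        {"orderId", Int64.Type},
        {"orderDate", type date},
        {"Quantity", Int64.Type},
        {"UnitPrice", type number}
    }
)
```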
8.3 Use Table.Buffer selectively
For complex transformations, buffering a table can improve performance by avoiding multiple evaluations of the same step.
However, buffering very large tables can also increase memory usage, so it should be applied only where you have confirmed a benefit through measurement.
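As a hedged sketch, buffering looks like the following; FilteredOrders and the grouping columns are assumed names from the orders example.

```
// Buffer a table that several later steps reference, so it is
// evaluated once instead of once per downstream step.
Buffered = Table.Buffer(FilteredOrders),

// Downstream steps now read from the buffered copy.
Summary = Table.Group(
    Buffered,
    {"CustomerId"},
    {{"Total", each List.Sum([LineAmount]), type number}}
)
```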
8.4 Leverage the JSON connector and automatic table detection
Recent improvements to the JSON connector and automatic table detection help Power Query flatten many common structures with fewer manual steps, especially for JSON files and APIs that follow predictable patterns.
Note: Regularly review query diagnostics and refresh times when working with large JSON sources, and consider partitioning data by date or category to keep each query focused and efficient.
FAQ
How do I parse deeply nested JSON in Power Query?
Break the problem into layers and flatten one level at a time.
Start from the root list or record, convert lists to tables, and expand records iteratively.
For very complex structures, consider writing a reusable function that recursively processes lists and records, or start from the automatic table detection result and then customize the final levels manually.
What is the difference between Parse > JSON and Json.Document?
The Parse > JSON command is a user interface feature that converts a text column into structured values, which is convenient when JSON is stored inside rows of a table.
The Json.Document function is the core M function that parses JSON from any text or binary source, such as files or web responses.
In practice, Parse > JSON generates a step that internally calls Json.Document on each cell, while connectors like From JSON or From Web use Json.Document at the query source level.
How do I handle optional or missing JSON fields?
JSON APIs often omit fields that are null or optional, which can result in missing columns or null values after expansion.
To handle this, expand all expected fields, then replace nulls with default values where appropriate, and avoid relying on column positions.
In more advanced models, you can define a schema table that lists all expected fields and use it to enforce consistent column sets across different responses or files.
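A minimal sketch of enforcing a fixed column set is shown below; Parsed, the expected field names, and the "NONE" default are illustrative assumptions.

```
// Force a consistent column set even when some fields are missing
// from a given response. All names here are illustrative.
Expected = {"orderId", "orderDate", "discountCode"},
Normalized = Table.SelectColumns(Parsed, Expected, MissingField.UseNull),

// Replace nulls with a default where a missing value has a meaning
NoNulls = Table.ReplaceValue(
    Normalized, null, "NONE", Replacer.ReplaceValue, {"discountCode"}
)
```

The MissingField.UseNull option makes Table.SelectColumns add a null column instead of raising an error when a field is absent, which keeps refreshes stable across inconsistent responses.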
What is the best way to refresh JSON API data securely in Power BI?
Use the Web connector with properly configured credentials and avoid embedding secrets directly in the URL.
Where possible, store API keys in parameters or use Azure Key Vault and organizational authentication methods.
Always test scheduled refresh in the Power BI service to ensure that gateway configuration, authentication, and privacy levels are aligned with your JSON data sources.
Can Power Query handle streaming or log-style JSON files?
Yes, provided that the log format is valid JSON per row or can be transformed into valid JSON.
A typical pattern is to split the log file into individual lines, treat each line as a JSON text value, and then apply Parse > JSON to that column.
After parsing, you can expand records and lists as usual and apply filters on timestamps, severities, or other log fields to keep the resulting model manageable.
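The line-by-line pattern can be sketched as follows; the file path and the field names timestamp, severity, and message are assumptions for an NDJSON-style log.

```
// Hedged sketch: parse a log where each line is one JSON object.
// The path and field names are assumptions.
let
    Raw = Lines.FromBinary(File.Contents("C:\Logs\app.log")),
    AsTable = Table.FromList(Raw, Splitter.SplitByNothing(), {"Line"}),
    // Drop blank lines before parsing
    NonEmpty = Table.SelectRows(AsTable, each Text.Trim([Line]) <> ""),
    // Parse each line of JSON text into a record
    Parsed = Table.TransformColumns(NonEmpty, {{"Line", Json.Document}}),
    Expanded = Table.ExpandRecordColumn(Parsed, "Line", {"timestamp", "severity", "message"})
in
    Expanded
```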