
How to Import JSON to Google Sheets: Apps Script or No-Code Connector?

There are two common ways to import JSON data from RESTful APIs into Google Sheets: code and no-code. The code way suits tech-savvy users who are proficient in Apps Script or Python. The no-code way relies on a connector that does the job for you. In this tutorial, we’ll introduce both approaches, explain how to import JSON data to Google Sheets, and show how to schedule recurring imports.

Methods to convert JSON to Google Sheets

There are three main ways to import JSON data into Google Sheets, depending on your technical skills and needs:

Coupler.io is a no-code data automation solution available as a web app and a Google Sheets add-on. It lets you connect to any REST API, fetch JSON data, and import it into your spreadsheet without writing code. You can also schedule automatic updates (hourly, daily, weekly, etc.), which makes it perfect for automated reports. 

Google Apps Script is a built-in scripting language based on JavaScript. It allows you to write custom functions that fetch JSON data from an API and import it into your spreadsheet. You can configure headers, handle pagination, and even make authenticated requests using tokens or basic auth. This method gives you full control, but you’ll need coding experience to set it up and troubleshoot errors.

Third-party add-ons from Google Workspace. If you prefer ready-to-use tools, the Google Workspace Marketplace offers several third-party add-ons to import JSON. These often come with user-friendly interfaces, built-in authentication handling, and support for multiple APIs, but the free plan may have limitations.

Import JSON to Google Sheets with Coupler for FREE

If you need to connect a spreadsheet to an API and retrieve JSON data, use the no-code JSON connector by Coupler.io. It’s a user-friendly solution that does not require any coding skills to configure. 

However, using an API requires some technical knowledge, so be prepared to work with the API documentation of your JSON data source. Each API may have specific requirements, such as authentication, pagination, rate limits, and so on. 

If you don’t have a technical background, then don’t worry. We will explain how to import JSON files to Google Sheets step by step.

Step 1: Collect data 

Click Proceed in the form below where we’ve already preselected JSON as a source and Google Sheets as a destination app. You’ll be given the option to get started with Coupler.io for free, with no credit card required. 

Coupler.io also lets you import JSON to Excel, JSON to Tableau, and other data warehouses and BI tools.

Then configure the following parameters:

  • JSON URL: Enter the URL to the JSON file like this:
https://filesampleshub.com/download/code/json/sample2.json

Alternatively, enter the API URL with the endpoint to which HTTP requests are sent. You can find this URL in the RESTful API documentation of your data source. 

https://min-api.cryptocompare.com/data/pricemultifull
  • HTTP method: Pick an HTTP method for making a request based on the documentation of your data source platform. GET is the default method.
  • Request headers: Apply HTTP headers for your request. For example, the Authorization header lets you specify the credentials required to make an HTTP request.
  • URL query parameters: Add query parameters to filter the requested JSON.

Optionally, you can specify the columns (fields) to be imported into your spreadsheet and select nested objects from the JSON response.

Step 2: Transform data

Once you’ve configured the connection to your JSON data, you can preview and organize it before loading it to Google Sheets. Coupler.io allows you to:

  • Manage columns: Hide, rename, or edit columns to clean up your dataset before importing.
  • Add new columns: Create custom columns to enrich your data with formulas, tags, or labels.
  • Filter and sort data: Include only the data that matters and organize it before import.
  • Aggregate data: Summarize your data by grouping and applying functions like SUM, COUNT, or AVERAGE. For example, you can group API responses by date and calculate the total sales per day without formulas in your spreadsheet.
  • Blend data from multiple sources:
    • Append: Stack data from multiple JSON sources (with similar structure) into a single table. This is useful when combining results from different endpoints or paginated requests.
    • Join: Merge JSON data from different sources based on a common key, like user ID or product code. Ideal for creating consolidated reports when your data lives in separate systems.
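The Append and Join options above behave like standard data operations. Here is a plain-JavaScript sketch of the idea (the sample records are illustrative, not Coupler.io internals):

```javascript
// Append: stack rows from sources with a similar structure.
function appendSources(a, b) {
  return a.concat(b);
}

// Join: merge records from two sources on a common key (e.g. a user ID).
function joinSources(a, b, key) {
  return a.map(function (left) {
    var right = b.find(function (r) { return r[key] === left[key]; }) || {};
    return Object.assign({}, left, right);
  });
}

var sales = [{ userId: 1, total: 40 }];
var users = [{ userId: 1, name: "Ada" }];
var joined = joinSources(sales, users, "userId");
// joined[0] is { userId: 1, total: 40, name: "Ada" }
```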

Step 3: Load data and schedule refresh

Connect your Google account, then select a spreadsheet on your Google Drive and a sheet to import the JSON data into. You can create a new sheet tab by entering a new name. 


Check out our guide on how to import JSON to Power BI.

Optionally, you can specify the cell range for the data import, change the import mode, and add a column with information about the last data refresh. 

To automate JSON data import, toggle on the Automatic data refresh and configure the desired schedule.


Once you’ve set up your JSON connector for Google Sheets, click Save and Run. With Coupler.io, it takes only a couple of minutes to import JSON to Google Sheets. Moreover, you can use Coupler.io as both a web app and a Google Sheets add-on.

Free Google Sheets add-on to load JSON data

If you prefer working within Google Sheets, Coupler.io also offers a free add-on that brings all its importing features directly into your spreadsheet without switching tabs or opening external tools.

Installing the add-on is quick and easy. Once added, it allows you to create importers, configure your JSON connections, and automate data refreshes from the sidebar inside Google Sheets. Install the Coupler.io add-on from the Google Workspace Marketplace.


Open your Google Sheet, and the add-on will appear under the Extensions menu. From there, you can launch Coupler.io.


To import JSON data, click on Add importer from the Coupler.io sidebar. 


A new window will open, and it will look just like the web version of Coupler.io.

Now, select JSON as the data source. Then, configure the connection just like we described in the section above.


Similar to the web app, the Coupler.io add-on offers data transformation features for organizing data. 

After that, proceed to the destination settings. As you’re using the Coupler.io add-on, you do not need to select a destination app. It is already set to the Google Sheet you’re in. 

All you have to do is pick an existing sheet tab or create a new one where you want your JSON data to load. If you want to send the data to a different spreadsheet, you can do it by editing the destination settings.


Enable the automatic data refresh and set up a schedule so that the data updates automatically every week, day, or hour, or even every 30 or 15 minutes, as required. When everything is set, click Save and Run.


The data will now be imported directly into your sheet and will be ready to use. With the Coupler.io add-on, you can set up automatic updates, manage multiple imports in one place, and tweak your settings whenever you need to without switching tabs.

Import JSON file to Google Sheets using Google Apps Script

If you’re comfortable with writing code and want a more customizable approach to export data from JSON API to Google Sheets, Google Apps Script is a solid option. It is a built-in scripting platform for Google Workspace that lets you write JavaScript code to automate tasks directly within your spreadsheet.

In this section, we’ll walk you through how to use Apps Script to connect your sheet to a JSON API, including support for advanced features like POST requests and authentication. Google Sheets doesn’t provide functions to import JSON by default. However, you can use the ImportJSON script created by Brad Jasper and Trevor Lohrbeer. It provides custom functions to load JSON data into Google Sheets, four of which can be used directly in a spreadsheet cell:

  • ImportJSON(url, query, parseOptions)
    Imports JSON data from a URL via GET request.
  • ImportJSONFromSheet(sheetName, query, options)
    Reads JSON text from a named sheet, parses it, and returns structured data.
  • ImportJSONViaPost(url, payload, fetchOptions, query, parseOptions)
    Imports JSON data using a POST request. Useful for APIs requiring POST method.
  • ImportJSONBasicAuth(url, username, password, query, parseOptions)
    Fetches JSON data protected by HTTP Basic Authentication.

1. ImportJSON – Load JSON data from an API URL

The ImportJSON function retrieves JSON data from a publicly available URL via a simple GET request. It then flattens this JSON into a table format, suitable for insertion into Google Sheets, automatically extracting headers and data rows.
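Conceptually, the flattening works like this minimal plain-JavaScript sketch (a simplification; the real ImportJSON also handles nested objects, arrays, and header prettifying):

```javascript
// Flatten an array of JSON records into a 2D table: a header row
// derived from the first record's keys, followed by data rows.
function flattenJson(records) {
  var headers = Object.keys(records[0]);
  var rows = records.map(function (rec) {
    return headers.map(function (h) { return rec[h]; });
  });
  return [headers].concat(rows);
}

var table = flattenJson([
  { name: "Ada", role: "Engineer" },
  { name: "Grace", role: "Admiral" }
]);
// table[0] is ["name", "role"]; table[1] is ["Ada", "Engineer"]
```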

Formula syntax

=ImportJSON("url", query, parseOptions)

Parameters

  • url: The URL to the JSON data source (required)
  • query: Comma-separated list of JSON paths to import (optional)
  • parseOptions: Comma-separated list of options to control data processing (optional)
    • noInherit: Don’t inherit values from parent elements
    • noTruncate: Don’t truncate values longer than 256 characters
    • rawHeaders: Don’t prettify headers
    • noHeaders: Don’t include headers, only data
    • allHeaders: Include all headers from the query parameter in listed order
    • debugLocation: Prepend row/column location to each value

Example

To import JSON data from a public endpoint, enter this formula in a cell:

=ImportJSON("https://www.jsonkeeper.com/b/3FTM")

Your sheet will parse and display the API data. The first row will show the column headers, and the rows below will display the values from the JSON response.


2. ImportJSONFromSheet – Import JSON from a Google Sheets tab

If you already have raw JSON stored in a spreadsheet or exported manually, you can parse and structure it using the ImportJSONFromSheet function from the Apps Script library.

Formula syntax

=ImportJSONFromSheet(sheetName, query, options)

Parameters

  • sheetName: Name of the sheet containing JSON data (required)
  • query: Comma-separated list of JSON paths to import (optional)
  • options: Comma-separated list of options to control data processing (optional)
    • Same options as ImportJSON

Example

Suppose your raw JSON is stored in a tab named RawJSON. Use the formula below to get this data in another tab:

=ImportJSONFromSheet("RawJSON")

This will parse the JSON and present the data in a table format.


You can also customize it with optional parameters, pull only selected fields, and adjust formatting behavior.

=ImportJSONFromSheet("RawJSON", "/user/name,/user/email", "noTruncate,rawHeaders")
  • noTruncate prevents cutting off long strings.
  • rawHeaders keeps header formatting unchanged.
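For this formula to return user names and emails, the RawJSON tab would need to hold JSON shaped roughly like this (an assumed example, not taken from the article):

```json
{
  "user": {
    "name": "Ada Lovelace",
    "email": "ada@example.com"
  }
}
```

The query paths /user/name and /user/email select exactly those two nested fields.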

3. ImportJSONViaPost – Import JSON from an API using POST parameters

Some APIs expect you to send data in the request body using the POST method to receive a response. For instance, submitting a form or querying a database often requires you to pass input values.

Google Apps Script allows you to send such POST requests using a custom function called ImportJSONViaPost.

Formula syntax

=ImportJSONViaPost(url, payload, fetchOptions, query, parseOptions)

Parameters

  • url: The URL to the JSON data source (required)
  • payload: Data to send with the POST request (required)
  • fetchOptions: Comma-separated list of options for the HTTP request (optional)
    • validateHttpsCertificates: Whether to validate HTTPS certificates
    • followRedirects: Whether to follow redirects
    • muteHttpExceptions: Whether to suppress exceptions
  • query: Comma-separated list of JSON paths to import (optional)
  • parseOptions: Comma-separated list of options to control data processing (optional)
    • Same options as ImportJSON

Example

Use this formula in a cell in your spreadsheet:

=ImportJSONViaPost(
  "https://reqres.in/api/users",
  "{""name"":""Aishu"",""job"":""Writer""}",
  "",
  "",
  ""
)

This sends a POST request to an API (https://reqres.in) with a payload containing a name and job. As you can see below, the API returns a confirmation response, including an ID and timestamp.


This method is useful when you need to send custom input (payload) to get a response, particularly while working with POST-based APIs like CRM tools or database query endpoints.
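Under the hood, ImportJSONViaPost converts the fetchOptions string into a UrlFetchApp options object and applies POST defaults. Here is a simplified plain-JavaScript sketch of that logic (based on the library code quoted later in this article):

```javascript
// Parse a "key=value,key=value" fetchOptions string and apply the
// same POST defaults the ImportJSON library uses.
function buildPostOptions(payload, fetchOptions) {
  var options = {};
  if (fetchOptions) {
    fetchOptions.split(",").forEach(function (pair) {
      var parts = pair.split("=");
      options[parts[0].trim()] = parts[1].trim();
    });
  }
  // Defaults: POST method, the given payload, form-encoded content type.
  if (options.method == null) options.method = "POST";
  if (options.payload == null) options.payload = payload;
  if (options.contentType == null) {
    options.contentType = "application/x-www-form-urlencoded";
  }
  // Convert "true"/"false" strings to booleans, as convertToBool_ does.
  ["validateHttpsCertificates", "followRedirects", "muteHttpExceptions"]
    .forEach(function (key) {
      if (options[key] === "true") options[key] = true;
      if (options[key] === "false") options[key] = false;
    });
  return options;
}

var opts = buildPostOptions("name=Aishu", "validateHttpsCertificates=false");
// opts.method is "POST"; opts.validateHttpsCertificates is false
```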

4. ImportJSONBasicAuth – Import JSON from an API that requires authentication

Some APIs protect their data with HTTP Basic Authentication, which requires a username and password. These credentials need to be passed in the Authorization header of the request to access the API response. With Google Apps Script, you can do this using the ImportJSONBasicAuth custom function, which builds the Authorization header for you.

Formula syntax

=ImportJSONBasicAuth(url, username, password, query, parseOptions)

Parameters

  • url: The URL to the protected JSON data source (required)
  • username: Username for authentication (required)
  • password: Password for authentication (required)
  • query: Comma-separated list of JSON paths to import (optional)
  • parseOptions: Comma-separated list of options to control data processing (optional)
    • Same options as ImportJSON

Example

To use it in your spreadsheet, try this formula:

=ImportJSONBasicAuth("https://your-api-url.com/data", "your-username", "your-password", "", "")

Just replace:

  • "https://your-api-url.com/data" with your API endpoint
  • "your-username" and "your-password" with the credentials for your API

The function will fetch the JSON data, parse it, and display it in a tabular format in your spreadsheet. It is useful when working with APIs that require secure access or user-based permissions.
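For reference, ImportJSONBasicAuth builds its Authorization header by base64-encoding username:password. A plain-Node sketch of the same logic (Utilities.base64Encode is the Apps Script equivalent of Buffer here):

```javascript
// Build the HTTP Basic Auth header the way ImportJSONBasicAuth does:
// base64-encode "username:password" and prefix it with "Basic ".
function basicAuthHeader(username, password) {
  var encoded = Buffer.from(username + ":" + password).toString("base64");
  return { headers: { Authorization: "Basic " + encoded } };
}

var header = basicAuthHeader("user", "pass");
// header.headers.Authorization is "Basic dXNlcjpwYXNz"
```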


5. ImportJSONAdvanced() – Import JSON using advanced parameters

If you’re working with a complex API or need full control over how data is retrieved and processed, use ImportJSONAdvanced() inside the Apps Script editor. This powerful and flexible function lets you define request methods, headers, payload, query filters, and even transformation behavior.

Unlike the other functions, ImportJSONAdvanced() cannot be used directly in a spreadsheet cell. Instead, you’ll need to create a wrapper function inside the Apps Script editor and run it from there.
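A minimal wrapper might look like the sketch below. The endpoint, query path, and helper names are placeholders for illustration, not values from the library:

```javascript
// Include only paths that start with the query string.
function includeByPrefix(query, path) {
  return path.indexOf(query) === 0;
}

// Truncate every imported value to 100 characters.
function truncateTo100(data, row, column) {
  data[row][column] = data[row][column].toString().substr(0, 100);
}

// Wrapper to run from the Apps Script editor (not from a cell).
function runAdvancedImport() {
  var rows = ImportJSONAdvanced(
    "https://example.com/api/items", // replace with your endpoint
    null,                            // default GET request
    "/items",                        // import everything under /items
    "noInherit",
    includeByPrefix,
    truncateTo100
  );
  // Write the result into the active sheet starting at cell A1.
  var sheet = SpreadsheetApp.getActiveSheet();
  sheet.getRange(1, 1, rows.length, rows[0].length).setValues(rows);
}
```

Select runAdvancedImport in the editor toolbar and click Run; the imported rows appear in the active sheet.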


How to use the script

To do so, open your spreadsheet and go to Extensions > Apps Script.


Copy the following script from the article or GitHub. Paste the code in the script editor and click Save.

/*====================================================================================================================================*
  ImportJSON by Brad Jasper and Trevor Lohrbeer
  ====================================================================================================================================
  Version:      1.5.0
  Project Page: https://github.com/bradjasper/ImportJSON
  Copyright:    (c) 2017-2019 by Brad Jasper
                (c) 2012-2017 by Trevor Lohrbeer
  License:      GNU General Public License, version 3 (GPL-3.0) 
                http://www.opensource.org/licenses/gpl-3.0.html
  ------------------------------------------------------------------------------------------------------------------------------------
  A library for importing JSON feeds into Google spreadsheets. Functions include:

     ImportJSON            For use by end users to import a JSON feed from a URL 
     ImportJSONFromSheet   For use by end users to import JSON from one of the Sheets
     ImportJSONViaPost     For use by end users to import a JSON feed from a URL using POST parameters
     ImportJSONAdvanced    For use by script developers to easily extend the functionality of this library
     ImportJSONBasicAuth   For use by end users to import a JSON feed from a URL with HTTP Basic Auth (added by Karsten Lettow)

  For future enhancements see https://github.com/bradjasper/ImportJSON/issues?q=is%3Aissue+is%3Aopen+label%3Aenhancement
  
  For bug reports see https://github.com/bradjasper/ImportJSON/issues

  ------------------------------------------------------------------------------------------------------------------------------------
  Changelog:
  
  1.6.0 (June 2, 2019) Fixed null values (thanks @gdesmedt1)
  1.5.0  (January 11, 2019) Adds ability to include all headers in a fixed order even when no data is present for a given header in some or all rows.
  1.4.0  (July 23, 2017) Transfer project to Brad Jasper. Fixed off-by-one array bug. Fixed previous value bug. Added custom annotations. Added ImportJSONFromSheet and ImportJSONBasicAuth.
  1.3.0  Adds ability to import the text from a set of rows containing the text to parse. All cells are concatenated
  1.2.1  Fixed a bug with how nested arrays are handled. The rowIndex counter wasn't incrementing properly when parsing.
  1.2.0  Added ImportJSONViaPost and support for fetchOptions to ImportJSONAdvanced
  1.1.1  Added a version number using Google Scripts Versioning so other developers can use the library
  1.1.0  Added support for the noHeaders option
  1.0.0  Initial release
 *====================================================================================================================================*/

/**
 * Imports a JSON feed and returns the results to be inserted into a Google Spreadsheet. The JSON feed is flattened to create 
 * a two-dimensional array. The first row contains the headers, with each column header indicating the path to that data in 
 * the JSON feed. The remaining rows contain the data. 
 * 
 * By default, data gets transformed so it looks more like a normal data import. Specifically:
 *
 *   - Data from parent JSON elements gets inherited to their child elements, so rows representing child elements contain the values 
 *      of the rows representing their parent elements.
 *   - Values longer than 256 characters get truncated.
 *   - Headers have slashes converted to spaces, common prefixes removed and the resulting text converted to title case. 
 *
 * To change this behavior, pass in one of these values in the options parameter:
 *
 *    noInherit:     Don't inherit values from parent elements
 *    noTruncate:    Don't truncate values
 *    rawHeaders:    Don't prettify headers
 *    noHeaders:     Don't include headers, only the data
 *    allHeaders:    Include all headers from the query parameter in the order they are listed
 *    debugLocation: Prepend each value with the row & column it belongs in
 *
 * For example:
 *
 *   =ImportJSON("http://gdata.youtube.com/feeds/api/standardfeeds/most_popular?v=2&alt=json", "/feed/entry/title,/feed/entry/content",
 *               "noInherit,noTruncate,rawHeaders")
 * 
 * @param {url}          the URL to a public JSON feed
 * @param {query}        a comma-separated list of paths to import. Any path starting with one of these paths gets imported.
 * @param {parseOptions} a comma-separated list of options that alter processing of the data
 * @customfunction
 *
 * @return a two-dimensional array containing the data, with the first row containing headers
 **/
function ImportJSON(url, query, parseOptions) {
  return ImportJSONAdvanced(url, null, query, parseOptions, includeXPath_, defaultTransform_);
}

/**
 * Imports a JSON feed via a POST request and returns the results to be inserted into a Google Spreadsheet. The JSON feed is 
 * flattened to create a two-dimensional array. The first row contains the headers, with each column header indicating the path to 
 * that data in the JSON feed. The remaining rows contain the data.
 *
 * To retrieve the JSON, a POST request is sent to the URL and the payload is passed as the content of the request using the content 
 * type "application/x-www-form-urlencoded". If the fetchOptions define a value for "method", "payload" or "contentType", these 
 * values will take precedent. For example, advanced users can use this to make this function pass XML as the payload using a GET 
 * request and a content type of "application/xml; charset=utf-8". For more information on the available fetch options, see
 * https://developers.google.com/apps-script/reference/url-fetch/url-fetch-app . At this time the "headers" option is not supported.
 * 
 * By default, the returned data gets transformed so it looks more like a normal data import. Specifically:
 *
 *   - Data from parent JSON elements gets inherited to their child elements, so rows representing child elements contain the values 
 *     of the rows representing their parent elements.
 *   - Values longer than 256 characters get truncated.
 *   - Headers have slashes converted to spaces, common prefixes removed and the resulting text converted to title case. 
 *
 * To change this behavior, pass in one of these values in the options parameter:
 *
 *    noInherit:     Don't inherit values from parent elements
 *    noTruncate:    Don't truncate values
 *    rawHeaders:    Don't prettify headers
 *    noHeaders:     Don't include headers, only the data
 *    allHeaders:    Include all headers from the query parameter in the order they are listed
 *    debugLocation: Prepend each value with the row & column it belongs in
 *
 * For example:
 *
 *   =ImportJSON("http://gdata.youtube.com/feeds/api/standardfeeds/most_popular?v=2&alt=json", "user=bob&apikey=xxxx", 
 *               "validateHttpsCertificates=false", "/feed/entry/title,/feed/entry/content", "noInherit,noTruncate,rawHeaders")
 * 
 * @param {url}          the URL to a public JSON feed
 * @param {payload}      the content to pass with the POST request; usually a URL encoded list of parameters separated by ampersands
 * @param {fetchOptions} a comma-separated list of options used to retrieve the JSON feed from the URL
 * @param {query}        a comma-separated list of paths to import. Any path starting with one of these paths gets imported.
 * @param {parseOptions} a comma-separated list of options that alter processing of the data
 * @customfunction
 *
 * @return a two-dimensional array containing the data, with the first row containing headers
 **/
function ImportJSONViaPost(url, payload, fetchOptions, query, parseOptions) {
  var postOptions = parseToObject_(fetchOptions);
  
  if (postOptions["method"] == null) {
    postOptions["method"] = "POST";
  }

  if (postOptions["payload"] == null) {
    postOptions["payload"] = payload;
  }

  if (postOptions["contentType"] == null) {
    postOptions["contentType"] = "application/x-www-form-urlencoded";
  }

  convertToBool_(postOptions, "validateHttpsCertificates");
  convertToBool_(postOptions, "useIntranet");
  convertToBool_(postOptions, "followRedirects");
  convertToBool_(postOptions, "muteHttpExceptions");
  
  return ImportJSONAdvanced(url, postOptions, query, parseOptions, includeXPath_, defaultTransform_);
}

/**
 * Imports a JSON text from a named Sheet and returns the results to be inserted into a Google Spreadsheet. The JSON feed is flattened to create 
 * a two-dimensional array. The first row contains the headers, with each column header indicating the path to that data in 
 * the JSON feed. The remaining rows contain the data. 
 * 
 * By default, data gets transformed so it looks more like a normal data import. Specifically:
 *
 *   - Data from parent JSON elements gets inherited to their child elements, so rows representing child elements contain the values 
 *      of the rows representing their parent elements.
 *   - Values longer than 256 characters get truncated.
 *   - Headers have slashes converted to spaces, common prefixes removed and the resulting text converted to title case. 
 *
 * To change this behavior, pass in one of these values in the options parameter:
 *
 *    noInherit:     Don't inherit values from parent elements
 *    noTruncate:    Don't truncate values
 *    rawHeaders:    Don't prettify headers
 *    noHeaders:     Don't include headers, only the data
 *    allHeaders:    Include all headers from the query parameter in the order they are listed
 *    debugLocation: Prepend each value with the row & column it belongs in
 *
 * For example:
 *
 *   =ImportJSONFromSheet("Source", "/feed/entry/title,/feed/entry/content",
 *               "noInherit,noTruncate,rawHeaders")
 * 
 * @param {sheetName} the name of the sheet containing the text for the JSON
 * @param {query} a comma-separated list of paths to import. Any path starting with one of these paths gets imported.
 * @param {options} a comma-separated list of options that alter processing of the data
 *
 * @return a two-dimensional array containing the data, with the first row containing headers
 * @customfunction
 **/
function ImportJSONFromSheet(sheetName, query, options) {

  var object = getDataFromNamedSheet_(sheetName);
  
  return parseJSONObject_(object, query, options, includeXPath_, defaultTransform_);
}


/**
 * An advanced version of ImportJSON designed to be easily extended by a script. This version cannot be called from within a 
 * spreadsheet.
 * 
 * Imports a JSON feed and returns the results to be inserted into a Google Spreadsheet. The JSON feed is flattened to create 
 * a two-dimensional array. The first row contains the headers, with each column header indicating the path to that data in 
 * the JSON feed. The remaining rows contain the data. 
 *
 * The fetchOptions can be used to change how the JSON feed is retrieved. For instance, the "method" and "payload" options can be 
 * set to pass a POST request with post parameters. For more information on the available parameters, see 
 * https://developers.google.com/apps-script/reference/url-fetch/url-fetch-app .
 *
 * Use the include and transformation functions to determine what to include in the import and how to transform the data after it is
 * imported. 
 *
 * For example:
 *
 *   ImportJSON("http://gdata.youtube.com/feeds/api/standardfeeds/most_popular?v=2&alt=json", 
 *              new Object() { "method" : "post", "payload" : "user=bob&apikey=xxxx" },
 *              "/feed/entry",
 *              "",
 *              function (query, path) { return path.indexOf(query) == 0; },
 *              function (data, row, column) { data[row][column] = data[row][column].toString().substr(0, 100); } )
 *
 * In this example, the import function checks to see if the path to the data being imported starts with the query. The transform 
 * function takes the data and truncates it. For more robust versions of these functions, see the internal code of this library.
 *
 * @param {url}           the URL to a public JSON feed
 * @param {fetchOptions}  an object whose properties are options used to retrieve the JSON feed from the URL
 * @param {query}         the query passed to the include function
 * @param {parseOptions}  a comma-separated list of options that may alter processing of the data
 * @param {includeFunc}   a function with the signature func(query, path, options) that returns true if the data element at the given path
 *                        should be included or false otherwise. 
 * @param {transformFunc} a function with the signature func(data, row, column, options) where data is a 2-dimensional array of the data 
 *                        and row & column are the current row and column being processed. Any return value is ignored. Note that row 0 
 *                        contains the headers for the data, so test for row==0 to process headers only.
 *
 * @return a two-dimensional array containing the data, with the first row containing headers
 * @customfunction
 **/
function ImportJSONAdvanced(url, fetchOptions, query, parseOptions, includeFunc, transformFunc) {
  var jsondata = UrlFetchApp.fetch(url, fetchOptions);
  var object   = JSON.parse(jsondata.getContentText());
  
  return parseJSONObject_(object, query, parseOptions, includeFunc, transformFunc);
}

/**
 * Helper function to authenticate with basic auth informations using ImportJSONAdvanced
 *
 * Imports a JSON feed and returns the results to be inserted into a Google Spreadsheet. The JSON feed is flattened to create
 * a two-dimensional array. The first row contains the headers, with each column header indicating the path to that data in
 * the JSON feed. The remaining rows contain the data.
 *
 * The fetchOptions can be used to change how the JSON feed is retrieved. For instance, the "method" and "payload" options can be
 * set to pass a POST request with post parameters. For more information on the available parameters, see
 * https://developers.google.com/apps-script/reference/url-fetch/url-fetch-app .
 *
 * Use the include and transformation functions to determine what to include in the import and how to transform the data after it is
 * imported.
 *
 * @param {url}           the URL to a http basic auth protected JSON feed
 * @param {username}      the Username for authentication
 * @param {password}      the Password for authentication
 * @param {query}         the query passed to the include function (optional)
 * @param {parseOptions}  a comma-separated list of options that may alter processing of the data (optional)
 *
 * @return a two-dimensional array containing the data, with the first row containing headers
 * @customfunction
 **/
function ImportJSONBasicAuth(url, username, password, query, parseOptions) {
  var encodedAuthInformation = Utilities.base64Encode(username + ":" + password);
  var header = {headers: {Authorization: "Basic " + encodedAuthInformation}};
  return ImportJSONAdvanced(url, header, query, parseOptions, includeXPath_, defaultTransform_);
}

/** 
 * Encodes the given value to use within a URL.
 *
 * @param {value} the value to be encoded
 * 
 * @return the value encoded using URL percent-encoding
 */
function URLEncode(value) {
  return encodeURIComponent(value.toString());  
}

/**
 * Adds an oAuth service using the given name and the list of properties.
 *
 * @note This method is an experiment in trying to figure out how to add an oAuth service without having to specify it on each 
 *       ImportJSON call. The idea was to call this method in the first cell of a spreadsheet, and then use ImportJSON in other
 *       cells. This didn't work, but leaving this in here for further experimentation later. 
 *
 *       The test I did was to add the following into the A1:
 *  
 *           =AddOAuthService("twitter", "https://api.twitter.com/oauth/access_token", 
 *                            "https://api.twitter.com/oauth/request_token", "https://api.twitter.com/oauth/authorize", 
 *                            "<my consumer key>", "<my consumer secret>", "", "")
 *
 *       Information on obtaining a consumer key & secret for Twitter can be found at https://dev.twitter.com/docs/auth/using-oauth
 *
 *       Then I added the following into A2:
 *
 *           =ImportJSONViaPost("https://api.twitter.com/1.1/statuses/user_timeline.json?screen_name=fastfedora&count=2", "",
 *                              "oAuthServiceName=twitter,oAuthUseToken=always", "/", "")
 *
 *       I received an error that the "oAuthServiceName" was not a valid value. [twl 18.Apr.13]
 */
function AddOAuthService__(name, accessTokenUrl, requestTokenUrl, authorizationUrl, consumerKey, consumerSecret, method, paramLocation) {
  var oAuthConfig = UrlFetchApp.addOAuthService(name);

  if (accessTokenUrl != null && accessTokenUrl.length > 0) {
    oAuthConfig.setAccessTokenUrl(accessTokenUrl);
  }
  
  if (requestTokenUrl != null && requestTokenUrl.length > 0) {
    oAuthConfig.setRequestTokenUrl(requestTokenUrl);
  }
  
  if (authorizationUrl != null && authorizationUrl.length > 0) {
    oAuthConfig.setAuthorizationUrl(authorizationUrl);
  }
  
  if (consumerKey != null && consumerKey.length > 0) {
    oAuthConfig.setConsumerKey(consumerKey);
  }
  
  if (consumerSecret != null && consumerSecret.length > 0) {
    oAuthConfig.setConsumerSecret(consumerSecret);
  }
  
  if (method != null && method.length > 0) {
    oAuthConfig.setMethod(method);
  }
  
  if (paramLocation != null && paramLocation.length > 0) {
    oAuthConfig.setParamLocation(paramLocation);
  }
}

/** 
 * Parses a JSON object and returns a two-dimensional array containing the data of that object.
 */
function parseJSONObject_(object, query, options, includeFunc, transformFunc) {
  var headers = new Array();
  var data    = new Array();
  
  if (query && !Array.isArray(query) && query.toString().indexOf(",") != -1) {
    query = query.toString().split(",");
  }

  // Prepopulate the headers to lock in their order
  if (hasOption_(options, "allHeaders") && Array.isArray(query))
  {
    for (var i = 0; i < query.length; i++)
    {
      headers[query[i]] = Object.keys(headers).length;
    }
  }
  
  if (options) {
    options = options.toString().split(",");
  }
    
  parseData_(headers, data, "", {rowIndex: 1}, object, query, options, includeFunc);
  parseHeaders_(headers, data);
  transformData_(data, options, transformFunc);
  
  return hasOption_(options, "noHeaders") ? (data.length > 1 ? data.slice(1) : new Array()) : data;
}

/** 
 * Parses the data contained within the given value and inserts it into the data two-dimensional array starting at the rowIndex. 
 * If the data is to be inserted into a new column, a new header is added to the headers array. The value can be an object, 
 * array or scalar value.
 *
 * If the value is an object, its properties are iterated through and passed back into this function with the name of each
 * property extending the path. For instance, if the object contains the property "entry" and the path passed in was "/feed",
 * this function is called with the value of the entry property and the path "/feed/entry".
 *
 * If the value is an array containing other arrays or objects, each element in the array is passed into this function with 
 * the rowIndex incremented for each element.
 *
 * If the value is an array containing only scalar values, those values are joined together and inserted into the data array as 
 * a single value.
 *
 * If the value is a scalar, the value is inserted directly into the data array.
 */
function parseData_(headers, data, path, state, value, query, options, includeFunc) {
  var dataInserted = false;

  if (Array.isArray(value) && isObjectArray_(value)) {
    for (var i = 0; i < value.length; i++) {
      if (parseData_(headers, data, path, state, value[i], query, options, includeFunc)) {
        dataInserted = true;

        if (data[state.rowIndex]) {
          state.rowIndex++;
        }
      }
    }
  } else if (isObject_(value)) {
    for (var key in value) {
      if (parseData_(headers, data, path + "/" + key, state, value[key], query, options, includeFunc)) {
        dataInserted = true; 
      }
    }
  } else if (!includeFunc || includeFunc(query, path, options)) {
    // Handle arrays containing only scalar values
    if (Array.isArray(value)) {
      value = value.join(); 
    }
    
    // Insert new row if one doesn't already exist
    if (!data[state.rowIndex]) {
      data[state.rowIndex] = new Array();
    }
    
    // Add a new header if one doesn't exist
    if (!headers[path] && headers[path] != 0) {
      headers[path] = Object.keys(headers).length;
    }
    
    // Insert the data
    data[state.rowIndex][headers[path]] = value;
    dataInserted = true;
  }
  
  return dataInserted;
}

/** 
 * Parses the headers array and inserts it into the first row of the data array.
 */
function parseHeaders_(headers, data) {
  data[0] = new Array();

  for (var key in headers) {
    data[0][headers[key]] = key;
  }
}

/** 
 * Applies the transform function for each element in the data array, going through each column of each row.
 */
function transformData_(data, options, transformFunc) {
  for (var i = 0; i < data.length; i++) {
    for (var j = 0; j < data[0].length; j++) {
      transformFunc(data, i, j, options);
    }
  }
}

/** 
 * Returns true if the given test value is an object; false otherwise.
 */
function isObject_(test) {
  return Object.prototype.toString.call(test) === '[object Object]';
}

/** 
 * Returns true if the given test value is an array containing at least one object; false otherwise.
 */
function isObjectArray_(test) {
  for (var i = 0; i < test.length; i++) {
    if (isObject_(test[i])) {
      return true; 
    }
  }  

  return false;
}

/** 
 * Returns true if the given query applies to the given path. 
 */
function includeXPath_(query, path, options) {
  if (!query) {
    return true; 
  } else if (Array.isArray(query)) {
    for (var i = 0; i < query.length; i++) {
      if (applyXPathRule_(query[i], path, options)) {
        return true; 
      }
    }  
  } else {
    return applyXPathRule_(query, path, options);
  }
  
  return false; 
};

/** 
 * Returns true if the rule applies to the given path. 
 */
function applyXPathRule_(rule, path, options) {
  return path.indexOf(rule) == 0; 
}

/** 
 * By default, this function transforms the value at the given row & column so it looks more like a normal data import. Specifically:
 *
 *   - Data from parent JSON elements gets inherited to their child elements, so rows representing child elements contain the values 
 *     of the rows representing their parent elements.
 *   - Values longer than 256 characters get truncated.
 *   - Values in row 0 (headers) have slashes converted to spaces, common prefixes removed and the resulting text converted to title
 *     case.
 *
 * To change this behavior, pass in one of these values in the options parameter:
 *
 *    noInherit:     Don't inherit values from parent elements
 *    noTruncate:    Don't truncate values
 *    rawHeaders:    Don't prettify headers
 *    debugLocation: Prepend each value with the row & column it belongs in
 */
function defaultTransform_(data, row, column, options) {
  if (data[row][column] == null) {
    if (row < 2 || hasOption_(options, "noInherit")) {
      data[row][column] = "";
    } else {
      data[row][column] = data[row-1][column];
    }
  } 

  if (!hasOption_(options, "rawHeaders") && row == 0) {
    if (column == 0 && data[row].length > 1) {
      removeCommonPrefixes_(data, row);  
    }
    
    data[row][column] = toTitleCase_(data[row][column].toString().replace(/[\/\_]/g, " "));
  }
  
  if (!hasOption_(options, "noTruncate") && data[row][column]) {
    data[row][column] = data[row][column].toString().substr(0, 256);
  }

  if (hasOption_(options, "debugLocation")) {
    data[row][column] = "[" + row + "," + column + "]" + data[row][column];
  }
}

/** 
 * If all the values in the given row share the same prefix, remove that prefix.
 */
function removeCommonPrefixes_(data, row) {
  var matchIndex = data[row][0].length;

  for (var i = 1; i < data[row].length; i++) {
    matchIndex = findEqualityEndpoint_(data[row][i-1], data[row][i], matchIndex);

    if (matchIndex == 0) {
      return;
    }
  }
  
  for (var i = 0; i < data[row].length; i++) {
    data[row][i] = data[row][i].substring(matchIndex, data[row][i].length);
  }
}

/** 
 * Locates the index where the two strings values stop being equal, stopping automatically at the stopAt index.
 */
function findEqualityEndpoint_(string1, string2, stopAt) {
  if (!string1 || !string2) {
    return -1; 
  }
  
  var maxEndpoint = Math.min(stopAt, string1.length, string2.length);
  
  for (var i = 0; i < maxEndpoint; i++) {
    if (string1.charAt(i) != string2.charAt(i)) {
      return i;
    }
  }
  
  return maxEndpoint;
}
  

/** 
 * Converts the text to title case.
 */
function toTitleCase_(text) {
  if (text == null) {
    return null;
  }
  
  return text.replace(/\w\S*/g, function(word) { return word.charAt(0).toUpperCase() + word.substr(1).toLowerCase(); });
}

/** 
 * Returns true if the given set of options contains the given option.
 */
function hasOption_(options, option) {
  return options && options.indexOf(option) >= 0;
}

/** 
 * Parses the given string into an object, trimming any leading or trailing spaces from the keys.
 */
function parseToObject_(text) {
  var map     = new Object();
  var entries = (text != null && text.trim().length > 0) ? text.toString().split(",") : new Array();
  
  for (var i = 0; i < entries.length; i++) {
    addToMap_(map, entries[i]);  
  }
  
  return map;
}

/** 
 * Parses the given entry and adds it to the given map, trimming any leading or trailing spaces from the key.
 */
function addToMap_(map, entry) {
  var equalsIndex = entry.indexOf("=");  
  var key         = (equalsIndex != -1) ? entry.substring(0, equalsIndex) : entry;
  var value       = (key.length + 1 < entry.length) ? entry.substring(key.length + 1) : "";
  
  map[key.trim()] = value;
}

/** 
 * Returns the given value as a boolean.
 */
function toBool_(value) {
  return value == null ? false : (value.toString().toLowerCase() == "true" ? true : false);
}

/**
 * Converts the value for the given key in the given map to a bool.
 */
function convertToBool_(map, key) {
  if (map[key] != null) {
    map[key] = toBool_(map[key]);
  }  
}

/** 
 * Reads the JSON text stored in the first column of the named sheet, concatenates it into a single string and parses it.
 */
function getDataFromNamedSheet_(sheetName) {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var source = ss.getSheetByName(sheetName);
  
  var jsonRange = source.getRange(1,1,source.getLastRow());
  var jsonValues = jsonRange.getValues();
  
  var jsonText = "";
  for (var row in jsonValues) {
    for (var col in jsonValues[row]) {
      jsonText += jsonValues[row][col];
    }
  }
  Logger.log(jsonText);
  return JSON.parse(jsonText);
}


While this method requires writing code in the Apps Script editor, it's well suited to edge cases such as authenticated requests, non-GET methods, or APIs with custom payload and header structures.
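As a sketch of one such edge case, here's what an authenticated POST request might look like with UrlFetchApp. This is not part of the ImportJSON script; the endpoint, token, and payload below are hypothetical placeholders you'd replace according to your API's documentation:

```javascript
/**
 * Builds the options object for an authenticated JSON POST request.
 * The token and payload are supplied by the caller.
 */
function buildPostOptions_(token, payload) {
  return {
    method: 'post',
    contentType: 'application/json',
    headers: { Authorization: 'Bearer ' + token },   // bearer-token auth; adjust per your API
    payload: JSON.stringify(payload),
    muteHttpExceptions: true                         // return error responses instead of throwing
  };
}

/**
 * Example caller: sends the POST request and parses the JSON response.
 * The URL and token here are placeholders, not a real endpoint.
 */
function fetchJsonViaPost_() {
  var options = buildPostOptions_('<your API token>', { query: 'reports', limit: 100 });
  var response = UrlFetchApp.fetch('https://api.example.com/v1/data', options);
  return JSON.parse(response.getContentText());
}
```

Keeping the options builder separate from the fetch call makes it easy to reuse the same auth and header setup across several endpoints.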

Google Apps Script is a powerful tool if you’re comfortable with code. Whether you need to pull public data, make authenticated API calls, send POST requests, or work with dynamic JSON structures, these functions give you the building blocks to set up custom imports.
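To make the parsing convention concrete, here's a small standalone sketch (plain JavaScript, separate from the script above) of how parseData_ maps nested JSON onto slash-delimited column paths. Scalar arrays are joined into a single cell, just as in the original:

```javascript
// Standalone illustration (not part of ImportJSON) of how nested JSON
// maps to the slash-delimited column paths used by parseData_ above.
function flattenToPaths(value, path, row) {
  row = row || {};
  if (Object.prototype.toString.call(value) === '[object Object]') {
    // Objects extend the path with each property name
    for (var key in value) {
      flattenToPaths(value[key], path + '/' + key, row);
    }
  } else if (Array.isArray(value)) {
    // Arrays of scalars are joined into one cell, as in parseData_
    row[path] = value.join();
  } else {
    // Scalars are inserted directly
    row[path] = value;
  }
  return row;
}

var row = flattenToPaths({ feed: { title: 'News', tags: ['a', 'b'] } }, '');
// row is { '/feed/title': 'News', '/feed/tags': 'a,b' }
```

Those paths become the column headers, which is also why the query parameter of ImportJSON uses the same /feed/entry style to select columns.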

Third-party add-ons to convert JSON to Google Sheets

If you’re not a fan of coding or just want to get things done faster, Google Sheets add-ons offer plug-and-play solutions to import JSON data. 

Below is a comparison of some popular add-ons that help you fetch JSON data into Google Sheets.

| Google Sheets add-on | Description | Rating | Installs |
| --- | --- | --- | --- |
| Coupler.io | A beginner-friendly add-on to import JSON from APIs into Google Sheets without coding. Supports automated data scheduling, transformations, and 60+ app integrations. Great for automating live dashboards and reporting. | ⭐ 4.8 | 850K+ |
| Data Connector – JSON API OAUTH | Built with developers in mind. Offers OAuth2 support even in the free plan, making it useful for working with more secure APIs. You can send GET/POST/PUT requests, apply JMESPath filters for precise data extraction, and use the built-in query editor to fine-tune what you pull in. | ⭐ 4.8 | 61K+ |
| Apipheny | A no-code API connector tailored for developers, analysts, and business users. It supports GET and POST requests, lets you pass headers and body data, and can run bulk imports. Its scheduling and automation capabilities make it a good choice for recurring tasks. | ⭐ 4.6 | 100K+ |
| Supermetrics | A powerhouse for marketers and analytics teams. Supermetrics connects to 100+ platforms, including Google Analytics, Facebook Ads, HubSpot, and more. While it supports JSON API imports, its strength lies in automating marketing and sales reports across tools. Comes with a polished UI and powerful data refresh options. | ⭐ 4.4 | 700K+ |
| ImportJSON | The most lightweight and straightforward JSON importer. Designed for users who want to pull data from public APIs or JSON URLs with a simple custom function (=ImportJSON()), similar to a formula. Perfect for basic use cases where you don't need complex authentication or filters. | ⭐ 3.9 | 199K+ |
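For the ImportJSON script covered earlier, usage comes down to a single cell formula. A hedged example (the URL is a placeholder; the query paths and options follow the script's own documentation):

```
=ImportJSON("https://api.example.com/feed", "/feed/entry/title,/feed/entry/url", "noInherit,noTruncate")
```

The second parameter lists the slash-delimited paths to import, and the third accepts comma-separated options such as noInherit, noTruncate, and rawHeaders.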

What is the best option to import JSON data to Google Sheets?

Importing JSON to Google Sheets is not a walk in the park; that said, it's not rocket science either. Apps Script works pretty well, but you need to be tech-savvy enough to write custom functions, and the scripts freely available on the web are not reliable enough to rest your reporting on.

The JSON connector by Coupler.io gives you flexibility in configuring how JSON is uploaded to Google Sheets from your source. Moreover, this reporting automation solution offers 60+ ready-to-go data sources and over 10 destinations, not to mention white-label report templates to automate your reporting workflows for free. Check them out for yourself!

Automate data import to Google Sheets with Coupler.io

Get started for free
