NetSuite RESTlets walkthrough
NetSuite RESTlets let you develop custom RESTful web services for your NetSuite account using SuiteScript.
Prerequisite
A valid NetSuite RESTlets connection in Data Integration. If you do not have a connection, create one by following the NetSuite connection topic.
Pulling data
- Navigate to the Data Integration account.
- In the Source tab of a Source to Target river, select NetSuite RESTlets.
- Select the connection from the drop-down menu.
- Make sure to match the Timestamp with the metadata date's format. Keep in mind that the only available report is 'Pull Saved Search'.
- Select the extract method and filters.
- When using the Incremental extract method, make sure to match the Incremental Type with the incremental field's type.
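As a sanity check for the steps above, you can verify that your incremental field values actually parse with the date format you configured before running the river. This is a minimal sketch: the format string shown is an assumption (adjust it to your account's metadata date format), and the helper name is illustrative, not part of the connector.

```python
from datetime import datetime

# Assumption: the saved search's metadata dates look like "07/15/2024 10:30:00".
# Change this format string to match your NetSuite account's settings.
METADATA_FORMAT = "%m/%d/%Y %H:%M:%S"

def matches_format(value: str, fmt: str = METADATA_FORMAT) -> bool:
    """Return True if the date string parses with the configured format."""
    try:
        datetime.strptime(value, fmt)
        return True
    except ValueError:
        return False
```

For example, `matches_format("07/15/2024 10:30:00")` returns True, while an ISO-style value such as `"2024-07-15"` returns False and signals a mismatch between the Timestamp setting and the metadata date's format.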
Limitation
Each custom search report has a role and other permissions, which are available in its settings. Make sure it has the proper role and permissions; otherwise, the report returns partial data or no data at all.
Below is an example of the permissions needed to fetch all records. If the restriction is not set to all accessible subsidiaries, less data is returned.

Missing roles
If you cannot view the role (Data Integration) you created to grant web services read permission:
- Go to Setup > Users/Roles > Manage Roles.
- Open the Data Integration role (the role you created to grant read access to data via third-party apps).
- Add the permission you need (for example, Employee Records in the relevant tab) and click Save.
- Refresh the custom search's roles and make sure the results are checked for the Data Integration role.
Rate limitation
If you encounter the error SSS_TIME_LIMIT_EXCEEDED, indicating that script execution time has surpassed the permitted duration, the script or operation is taking longer than allowed to finish.
Example of the error code:
Error in netsuite_restlets response: 400-{'code': 'SSS_TIME_LIMIT_EXCEEDED', 'message': 'Script Execution Time Exceeded.'}
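If you call a RESTlet directly, you can detect this error programmatically and react, for example by retrying with a smaller batch. A minimal sketch, assuming the response body is JSON shaped like the error above; the helper name is hypothetical:

```python
import json

# Error code from NetSuite's time-limit error response shown above.
TIME_LIMIT_CODE = "SSS_TIME_LIMIT_EXCEEDED"

def is_time_limit_error(status: int, body: str) -> bool:
    """Return True when a RESTlet response carries SSS_TIME_LIMIT_EXCEEDED."""
    if status != 400:
        return False
    try:
        payload = json.loads(body)
    except ValueError:
        # Non-JSON body: some other kind of failure.
        return False
    return payload.get("code") == TIME_LIMIT_CODE
```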
To address this issue, consider implementing batch processing. If your script handles large data volumes, dividing the work into smaller batches can help: it distributes the workload evenly and prevents timeouts from occurring.
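The batching idea can be sketched as follows. This assumes your RESTlet or saved search accepts offset/limit-style paging parameters; `make_batches` is an illustrative helper, not a connector or NetSuite API.

```python
def make_batches(total_rows: int, batch_size: int):
    """Yield (offset, limit) windows covering total_rows in batch_size chunks."""
    for offset in range(0, total_rows, batch_size):
        yield offset, min(batch_size, total_rows - offset)

# Each window can then be fetched as a separate RESTlet call, keeping every
# call well under the script execution time limit.
```

For example, `list(make_batches(10, 4))` yields `[(0, 4), (4, 4), (8, 2)]`: three short calls instead of one long one.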