
Latest Power BI update broke Google Cloud BigQuery connections, both desktop and server/embedded

We have dashboards *in production* for several clients that stopped updating two days ago. We have tried everything from reconnecting with the same credentials, to new credentials, to connecting from PBI Desktop, and nothing works unless we keep the row count for each entity very low (1,000 rows works, for instance). This is an URGENT issue, as our clients use the embedded dashboards and they are not updating. Please let us know if this will not be fixed so we can look into alternatives to PBI.

Status: New
Helper I

Same for me! 


I had to downgrade to the September version and everything works fine (again).


Something very weird: I have the same issue with my fact table (it stops at 1,000 rows), but I tried loading another fact table (1 million rows) and it worked fine. The differences for the second fact table are:


- No relationships with other tables

- No transformations (in my model I do a join on the first fact table, which can go up to 1,000 rows)


Maybe this will help to solve that BIGGGGGG issue!


Thx a lot, 



Frequent Visitor

Same here:


Power BI Desktop works if you use an older version, but the Service does not.

But to my knowledge they are working on the issue.

Not applicable

A few additional details:

- dBeaver (which uses JDBC) still works with any size result set;

- Excel through the Simba ODBC driver v2.30 (dated July 29th, 2020) gives the same error as PBI Service and Desktop;

- One of my coworkers had not updated PBI desktop to the latest version, and his refreshes were still working. As soon as he updated to the latest version, he started getting the error.


This is an excerpt of the query submission's debug trace and the error reported by the ODBC driver:


Oct 21 18:14:20.135 DEBUG 1428 Simba::BigQuery::GJobsQueryRequest::BuildUri: URL Endpoint: /projects/bb-br-2/queries
Oct 21 18:14:20.135 DEBUG 1428 Simba::BigQuery::GJobsQueryRequest::BuildRequestBody: paramString: {"query":"select `dataset_latest_period_start`,\r\n `dataset_latest_date`,\r\n `dataset_latest_month`,\r\n `dataset_latest_year`,\r\n `row_days_from_latest_date`,\r\n `row_days_from_latest_date_desc`,\r\n `row_period`,\r\n `ad_active_latest_date`,\r\n `ad_active_latest_week`,\r\n `ad_flight_total_days`,\r\n `ad_flight_days_latest_week`,\r\n `ad_flight_min_date`,\r\n `ad_flight_max_date`,\r\n `ad_sk`,\r\n `media_platform`,\r\n `media_vehicle`,\r\n `media_network`,\r\n `date_id`,\r\n `impressions`,\r\n `clicks`,\r\n `objective_results`,\r\n `billing_events`,\r\n `interactions`,\r\n `engagements`,\r\n `video_views`,\r\n `video_views_25pct`,\r\n `video_views_50pct`,\r\n `video_views_75pct`,\r\n `video_views_100pct`,\r\n `reach`,\r\n `spend`,\r\n `conversions_add_to_cart`,\r\n `conversions_purchase`,\r\n `created`\r\nfrom `bb-br-2`.`cdm`.`vw_ad_fct`","dryRun":false,"maxResults":100000,"useLegacySql":false,"timeoutMs":300000000,"useQueryCache":true,"defaultDataset":{"projectId":"bb-br-2","datasetId":"cdm"}}
Oct 21 18:14:20.136 DEBUG 1428 Simba::BigQuery::GExponentialBackoff::GExponentialBackoff: StartTime: 13302796
Oct 21 18:14:20.136 INFO 1428 Simba::BigQuery::RESTAction::DoAction: Sending POST request to
Oct 21 18:14:25.843 DEBUG 1428 Simba::BigQuery::BigQueryAPIClient::FinalizeExecute: Fetch with HTAPI: true
Oct 21 18:14:25.843 DEBUG 1428 Simba::BigQuery::GJobsGetRequest::BuildUri: URL Endpoint: /projects/bb-br-2/jobs/job_K3_mHihlJ182DCb7hpC1P1Tega4q
Oct 21 18:14:25.843 DEBUG 1428 Simba::BigQuery::GJobsGetRequest::BuildParams: paramString: prettyPrint=false
Oct 21 18:14:25.843 DEBUG 1428 Simba::BigQuery::GExponentialBackoff::GExponentialBackoff: StartTime: 13308515
Oct 21 18:14:25.843 INFO 1428 Simba::BigQuery::RESTAction::DoAction: Sending GET request to
Oct 21 18:14:26.126 ERROR 1428 Simba::BigQuery::BigQueryAPIClient::GetResponseCheckErrors: HTTP error: Not found: Job bb-br-2:job_K3_mHihlJ182DCb7hpC1P1Tega4q [code: 404][reason: notFound]
Oct 21 18:14:26.155 ERROR 1428 Simba::ODBC::ClassicQueryExecutor::Execute: [Simba][BigQuery] (100) Error interacting with REST API: Not found: Job bb-br-2:job_K3_mHihlJ182DCb7hpC1P1Tega4q
Oct 21 18:14:26.155 ERROR 1428 Simba::ODBC::StatementState::DoExecDirect: [Simba][BigQuery] (100) Error interacting with REST API: Not found: Job bb-br-2:job_K3_mHihlJ182DCb7hpC1P1Tega4q
Oct 21 18:14:26.155 ERROR 1428 Simba::ODBC::StatementStateAllocated::SQLExecDirectW: [Simba][BigQuery] (100) Error interacting with REST API: Not found: Job bb-br-2:job_K3_mHihlJ182DCb7hpC1P1Tega4q
Oct 21 18:14:26.155 ERROR 1428 Simba::ODBC::Statement::SQLExecDirectW: [Simba][BigQuery] (100) Error interacting with REST API: Not found: Job bb-br-2:job_K3_mHihlJ182DCb7hpC1P1Tega4q
Oct 21 18:14:26.156 INFO 1428 Simba::ODBC::Connection::SQLGetInfoW: InfoType: SQL_ACTIVE_STATEMENTS (1)

Not applicable

Here's another quick update:


Uninstalling version 2.86.727.0 (October 2020) and reinstalling the previous one, 2.85.985.0 (September 2020), got refresh from BigQuery working again. The irritating downside is that we have several dashboards for several clients, updating several times a day, that we will now need to refresh manually and re-upload to the PBI Embedded tenant.



Frequent Visitor

I'm just doing the same thing now.

Would be great to get an estimate on when this can be fixed.

Regular Visitor

I also hit this bug. The Power BI service and the latest Power BI Desktop fail to refresh large datasets located in Singapore.

I think the Power BI developers are not setting the `location` parameter correctly in the BigQuery REST API when paginating, so the `jobs.getQueryResults` call cannot find the query job.

If the `location` parameter does not match your dataset's location, the API returns the error `Error interacting with REST API: Not found: Job <JOB_ID>`.


Workaround: migrate your dataset to the US multi-region, which is BigQuery's default location. Requests there work without a `location` parameter in the BigQuery REST API, so Power BI refreshes succeed.
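To illustrate the suspected failure mode, here is a minimal sketch (the function name is mine; project, job ID, and region are placeholders taken from the trace above) of how the `jobs.getQueryResults` REST URL differs with and without the `location` query parameter. Outside the US/EU multi-regions, BigQuery only finds the job when `location` is supplied:

```python
from urllib.parse import urlencode

API_ROOT = "https://bigquery.googleapis.com/bigquery/v2"

def get_query_results_url(project_id, job_id, location=None):
    """Build the REST URL for BigQuery jobs.getQueryResults.

    If the job ran outside the US/EU multi-regions, the `location`
    query parameter must match the dataset's region; otherwise the
    API responds 404 "Not found: Job <JOB_ID>".
    """
    url = f"{API_ROOT}/projects/{project_id}/queries/{job_id}"
    if location:
        url += "?" + urlencode({"location": location})
    return url

# Without a location: this is what the failing driver appears to send,
# and it 404s for e.g. a Singapore (asia-southeast1) dataset.
print(get_query_results_url("bb-br-2", "job_K3_mHihlJ182DCb7hpC1P1Tega4q"))

# With the dataset's region, the same job is found.
print(get_query_results_url("bb-br-2", "job_K3_mHihlJ182DCb7hpC1P1Tega4q",
                            location="asia-southeast1"))
```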


BigQuery jobs.getQueryResults API:

My tweets about this bug:


Not applicable

Study, excellent catch; we indeed have all of our datasets outside of the US. However, for us, migrating them to the US is a no-go, as we'd run into compliance and legal issues by doing so.

If anyone from the PBI MS dev team reads these posts at all, I'd say they have everything they need to fix the issue in half an hour, I know I would. But somehow I suspect it's going to take a lot longer than that...

Not applicable

My business-critical Power BI dashboards published to Premium capacity in the Power BI service are also not working anymore, and I'm unable to open the PBIX file in the latest version of Power BI Desktop.


Yes, this issue has to be fixed immediately. Not sure how a new version could be released with a bug that impacts business-critical reports.


I am also awaiting resolution. Please let me know if anyone finds a workaround or fix.

Frequent Visitor

Just to add to the workarounds and locations: keep in mind that EU is also a default location, so that is a possible solution as well.


Our workaround is this:

- Keep the ETL running in europe-north1.

- Set up a scheduled dataset copy (easily done via transfers in BQ) of the PBI-facing dataset(s) to the EU location, daily.

- Redirect the PBI reports to the new BQ dataset. (Tip: keep the dataset name as a parameter in PBI.)
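The scheduled copy in the second step uses the BigQuery Data Transfer Service's `cross_region_copy` data source. A minimal sketch, with placeholder project and dataset names and a helper function of my own; it only builds the transfer configuration as a plain dict so its shape is visible, and does not call the API:

```python
def build_copy_transfer_config(source_project, source_dataset,
                               dest_dataset, schedule="every 24 hours"):
    """Build a Data Transfer Service config (as a plain dict) that
    copies a dataset cross-region on a schedule.

    The field names mirror the TransferConfig resource of the
    BigQuery Data Transfer API; pass an equivalent object to
    `DataTransferServiceClient.create_transfer_config` (or create the
    transfer via `bq mk --transfer_config`) to actually schedule it.
    """
    return {
        # Destination dataset should live in the EU or US multi-region
        # for this workaround to help Power BI.
        "destination_dataset_id": dest_dataset,
        "display_name": f"Copy {source_dataset} for Power BI",
        "data_source_id": "cross_region_copy",
        "params": {
            "source_project_id": source_project,
            "source_dataset_id": source_dataset,
        },
        "schedule": schedule,
    }

config = build_copy_transfer_config("my-project", "cdm_europe_north1", "cdm_eu")
print(config["data_source_id"])  # cross_region_copy
```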



Frequent Visitor

Up on status page now:


Power BI customers using Google BigQuery as a data source may see refreshes not complete successfully with the message "ODBC: ERROR [HY000] [Microsoft][BigQuery] (100) Error interacting with REST API". Engineers are working on a fix and expect it to be deployed by end-of-day 10/31/2020.