ciken
Resolver I

BigQuery Account Permissions

Good morning. I have been having trouble refreshing my reports in the Power BI service lately. I am receiving a message that states:

 

  • Processing error: ERROR [HY000] [Microsoft][BigQuery] (131) Unable to authenticate with Google BigQuery Storage API. Check your account permissions.

 

When I refresh from Power BI Desktop it works just fine, but then I have to save and publish manually. I'd like to get the service back to refreshing automatically in the middle of the night so the numbers are valid at the start of our day. Unfortunately, no matter how many times I reconnect to BigQuery via Data Source Credentials, it still gives me the error.

 

I have reached out to Microsoft support, but they claim it's a Google BigQuery issue. I've reached out to Google, but they claim that if it works in Desktop and not in the service, it's a Microsoft issue.

 

Help??

1 ACCEPTED SOLUTION
Anonymous
Not applicable

Hi!

 

UPDATED 2020-10-03

All we needed to do was assign the predefined GCP IAM role "BigQuery Read Session User" - https://cloud.google.com/bigquery/docs/access-control#bigquery - to every Power BI user that needs to access BigQuery and refresh reports. You can do that in several ways:

1. Directly assign this role to every relevant user. (Most manual work)

2. Add the matching permissions to an existing custom IAM role already assigned to the user. (Less manual work)

3. Create a GCP service account and grant it access matching the predefined GCP IAM role "BigQuery Read Session User". Then, using the gcloud CLI, you can add "domain-wide" policies (or anything else suitable covering your relevant user scopes) for impersonation of the service account. I'm not going to describe the details of impersonation; you need to check the GCP docs. I'll only mention this part of the docs that was harder to find:

https://cloud.google.com/iam/docs/reference/rest/v1/Policy
(This method should require the least manual work in the long run)

UPDATE: I haven't successfully implemented this third option myself, I just thought it would work this way. We use the second option ourselves.
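For the first option, a single gcloud command can bind the role per user. This is only a sketch: the project ID and user email below are placeholders, not values from this thread.

```shell
# Grant the predefined "BigQuery Read Session User" role to one user.
# "my-gcp-project" and "bob@example.com" are placeholders -- substitute your own.
gcloud projects add-iam-policy-binding my-gcp-project \
    --member="user:bob@example.com" \
    --role="roles/bigquery.readSessionUser"
```

You can confirm the binding afterwards with `gcloud projects get-iam-policy my-gcp-project`.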

 

Good luck!

 

Thank you Jesus!

ADD-ON 2021-01-12 (Swedish time zone)
On 2020-09-28 (or the day after) I prayed to God in the morning for technical solutions that would point to Him. On 2020-09-29 I got an email from my boss about our Power BI environment not being able to communicate with BigQuery. I could solve it the same day within a few hours, after quickly praying to God for a solution, and could get that solution into production the same evening for our top Power BI users. The rest of the users got it rolled out the next day. I shared a clumsy solution on 2020-09-30 here in this forum, then optimized the solution and updated the post. Everything worked for our end users from the start. Our connectivity issues started on 2020-09-19, so I guess Microsoft and Google had been working on solving the problem since then, as far as I know. We had a support case open for several days and got no help sorting out our issue there until after I posted this. No one else was presenting any way to get it working anywhere either. There was also a Power BI consultant working on the issue for several days at our company, without any luck. After I posted this solution, Power BI support sent us an email about how to solve our case, where they referred to this answer... 🙂
Again, thank you Jesus!
(I'm aware that the solution was of a temporary nature - the problem was probably fixed by Google with a long-term solution not long after I posted here.)

 

ADD-ON 2021-04-06

If the Power BI user Bob has BigQuery access to multiple GCP projects, then you need to assign the predefined GCP IAM role "BigQuery Read Session User" to Bob in EVERY such GCP project. Otherwise Power BI will fail to load larger result sets from BigQuery at all. We just experienced and solved this ourselves recently. Hope it helps someone!
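Applying the role in every project the user touches can be scripted with a loop over the same binding command. The project IDs and user email here are hypothetical placeholders:

```shell
# Assign "BigQuery Read Session User" in every GCP project the user queries.
# Project IDs and the member address are placeholders -- substitute your own.
for project in project-a project-b project-c; do
    gcloud projects add-iam-policy-binding "$project" \
        --member="user:bob@example.com" \
        --role="roles/bigquery.readSessionUser"
done
```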


10 REPLIES
songogeta32
Frequent Visitor

I am using a service account which has the BigQuery Admin role (https://cloud.google.com/bigquery/docs/access-control#bigquery.admin), but I am still getting the error.

 

Is the solution to enable the High-Throughput API in the ODBC driver?

 


Thanks draim!  Yes, assigning the predefined GCP IAM role "BigQuery Read Session User" worked for our case.  Much appreciated!

Anonymous
Not applicable

Hi AgTooOldForThis!

 

I'm glad it helped you!

 

David

AgTooOldForThis
Frequent Visitor

I have a similar issue with one particular published dataset that will not refresh on demand or on schedule; it started around Sept 22. I have the July Desktop release, which refreshes locally with no problems.

 

One thing I've noticed: if I create a new report with the exact same connections as the problematic dataset (with the July Desktop as before), the newly published dataset refreshes fine. However, if I save-as the problematic report and publish it, the error persists with the cloned published dataset.

 

I have other published reports that connect to BigQuery with no issues at all, supposedly using the same credentials.

Anonymous
Not applicable

I had this problem as well. I was using the Simba ODBC driver for the BigQuery connection.

 

The solution was that in the Simba connector's advanced options I increased the Rows Per Block and Default String Column Length fields to accommodate my larger dataset, then refreshed, and the issue went away.

 

Not sure this will help you, but it helped me, so I'm leaving it here for when I forget about it a year from now and need it again. 🙂

markmacardle
Advocate I

Having the same issue. Two users out of dozens stopped being able to connect to BigQuery. One of those users also has online reports failing to refresh.

 

Tried upgrading to the latest version of Power BI, and clearing permissions and reauthorising; neither worked.

 

Reverting to the July version of Power BI has fixed the problem in Desktop, but we are still facing the issue of online reports not being able to refresh.

Anonymous
Not applicable

Identical issue started for us today.

 

Unable to authenticate with Google BigQuery Storage API. Check your account permissions. The exception was raised by the IDbCommand interface.
 
Lexey
Helper I

Please try specifying the billing project explicitly in the connector:

GoogleBigQuery.Database([BillingProject="project name bq"])

Lexey
Helper I

I have this problem too!

And I get the error: The key didn't match any rows in the table.
