Hello All,
I am having an issue with BigQuery: in short, I have 573,347 rows in my table in BigQuery, yet PBI only imports 3,072 of them. Does anyone know why this is happening and how to fix it? I have tried both Import and DirectQuery modes and it still doesn't work. I am using the latest version of PBI Desktop.
Thank you!
Hi
My company faces two problems when Power BI is connected to GCP BigQuery:
1. Power BI (on Desktop and in the Service) is THE integration control tower, so it should surface the error message for each issue.
2. Work on data quality, or stop shipping the native BQ connector on Desktop, because its reliability is not good for customers.
Hello, I ran into a similar issue, and the problem comes from the fact that Power BI does not support long strings. After creating a view that truncates the long fields, all of the rows are ingested.
This has nothing to do with Google BigQuery limitations; it's a Power BI limitation.
I think it's perfectly legitimate to point the finger at PBI, because even if the data is in an unsupported format (or, in my case, length), failing silently is not acceptable.
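For anyone hitting the same thing, the view workaround described above can be sketched in BigQuery Standard SQL. The project, dataset, table, and column names here are all placeholders, and the 10,000-character cap is an arbitrary choice, not a documented Power BI limit; adjust it to whatever your data tolerates:

```sql
-- Hypothetical view that caps the long string column before Power BI reads it.
-- `my_project.my_dataset.raw_events` and the column names are placeholders.
CREATE OR REPLACE VIEW `my_project.my_dataset.events_for_pbi` AS
SELECT
  event_id,
  event_timestamp,
  -- Truncate the suspect free-text column to the first 10,000 characters.
  SUBSTR(payload_text, 1, 10000) AS payload_text_truncated
FROM `my_project.my_dataset.raw_events`;
```

Then point the Power BI connector at the view instead of the base table.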
Thank you
Hi @Anonymous
Can you please share the doc where you got that info?
Regards
Amine Jerbi
If I answered your question, please mark this thread as accepted
and you can follow me on
My Website, LinkedIn and Facebook
https://docs.microsoft.com/en-us/power-bi/connect-data/desktop-data-types#text-type
But looking again, that was not my exact issue: the text type limit in PBI is ~500 MB and my biggest string was 1.3 MB, so this is more likely a memory overflow issue. Anyway, the real issue is that if users don't pay close attention, they'll end up working with incomplete data without any notification.
Cheers
You might wanna check the limitations on Google BigQuery:
https://cloud.google.com/bigquery/quotas
I don't think this has to do with Power BI.
Regards
Amine Jerbi
Hi Amine,
Thanks for taking the time to reply. I found that it was actually a PBI issue: I was trying to import some text data, and for some reason PBI didn't like it. When I removed the column containing that text from the import, everything worked and it imported all of the rows. Have you ever had this experience before? It is strange, because up until last week PBI was importing all of the rows with the text data, and then it just stopped.
Hello PBI_ponderer
I'm having the same issue. By any chance, was that column's text data JSON, or a massive amount of text?
Hi Allegro
It's a buffer issue in the BQ connector, known to Microsoft and Simba (the subcontractor).
Don't hesitate to complain about it on the PBI issues forum.
No, I haven't run into that issue. But before pointing the finger at PBI, you need to figure out what that column contains that causes the issue, then act on it. So, by elimination, look inside that column for what is not correct: empty some rows from that column and verify... step by step you can find the issue.
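One way to do that elimination step without manually emptying rows is to ask BigQuery directly which values in the suspect column are unusually large. This is a sketch only; the table and column names are placeholders standing in for your own schema:

```sql
-- List the 20 largest values in the suspect column, measured in bytes,
-- to see whether a handful of huge strings are to blame for the failures.
SELECT
  event_id,
  BYTE_LENGTH(payload_text) AS payload_bytes
FROM `my_project.my_dataset.raw_events`
ORDER BY payload_bytes DESC
LIMIT 20;
```

If a few rows dwarf the rest, try excluding or truncating just those and re-running the import to confirm the cause.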
Let me know
Regards
Amine Jerbi