Hello all,
I have a case where data is processed and stored in Google BigQuery and has to be transferred to Azure Database for MySQL, then connected to Power BI so that several users can develop dashboards and read them.
The approach we chose for transferring the data is to export from BigQuery to Google Cloud Storage, splitting the data into CSV chunks, then move them from GCS to Azure Database for MySQL and UNION those CSVs. The final result will be two tables of approximately 70M rows (about 25 GB of data as CSV). I believe a daily refresh will be more than enough.
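For context, the UNION step described above can be sketched in Python (a minimal sketch, assuming all exported CSV chunks share an identical header row; the file-name pattern is hypothetical):

```python
import csv
import glob

def union_csv_chunks(pattern, out_path):
    """Concatenate CSV chunks matching `pattern` into one file,
    keeping the header row from the first chunk only."""
    paths = sorted(glob.glob(pattern))
    with open(out_path, "w", newline="") as out:
        writer = csv.writer(out)
        header_written = False
        for path in paths:
            with open(path, newline="") as f:
                reader = csv.reader(f)
                header = next(reader)  # every chunk repeats the header
                if not header_written:
                    writer.writerow(header)
                    header_written = True
                for row in reader:
                    writer.writerow(row)

# Example: union_csv_chunks("export_part_*.csv", "full_table.csv")
```

The combined file can then be bulk-loaded into MySQL (e.g. with LOAD DATA), or the chunks can be loaded one by one without pre-merging.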
Related to this, I have several questions:
- Is Azure Database for MySQL a good choice of database for this problem, or should I use another service (Azure SQL, Azure Spark, ...)?
- In terms of data load, it will be impossible to work with the full dataset locally, and DirectQuery is not available for MySQL. Is it possible to define the database structure and types in the web version, schedule incremental imports, and work locally with only a portion of the dataset? I remember having issues when trying to keep the whole dataset loaded on the web while loading only a sample locally...
- Do you have an idea of the pricing plan we should go for to support this data volume? I believe the free Power BI version only allows datasets of up to 1 GB.
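On the incremental-import point, the server-side logic is essentially a watermark filter: only rows newer than the latest date already loaded get appended, so a daily job moves a small delta rather than the full 70M rows. A minimal sketch of that idea, using Python's built-in sqlite3 as a stand-in for Azure MySQL (the table and column names are hypothetical):

```python
import sqlite3

def incremental_load(conn, rows):
    """Append only rows whose load_date is strictly after the current
    watermark (the max load_date already in the table)."""
    cur = conn.execute("SELECT MAX(load_date) FROM facts")
    watermark = cur.fetchone()[0]  # None when the table is still empty
    new_rows = [r for r in rows if watermark is None or r[1] > watermark]
    conn.executemany("INSERT INTO facts (id, load_date) VALUES (?, ?)", new_rows)
    conn.commit()
    return len(new_rows)  # number of rows actually appended
```

Power BI's own incremental refresh works along the same lines (it partitions by a date range and only refreshes recent partitions), but it is a paid-tier feature, so the pricing and loading questions are linked.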
Thanks for your answers =D
L
Please let me know if I should provide more details or if the questions are unclear...
best,