Enabling Historic Data Analysis creates an Azure SQL database containing all the records pushed to it from an application.
I want to use this option; I'll push 50,000 rows per day. Is there a limit on the maximum size of the stored database table?
I read here about the REST API limitations, which say that at most 5 million rows can be stored per table. However, I'm not going to use the REST API via code; instead, I want to use the UI in the Power BI service. Do both cases have the same limitations?
If this is the limit of the internal database that is generated, my question is: what happens after row 5,000,000?
Does it act like a FIFO queue, deleting the oldest rows as the new ones are inserted, or what?
Hi @rbobadilla,
>>I'm not going to use the REST API via code; instead, I want to use the UI in the Power BI service. Do both cases have the same limitations?
In fact, the UI is also based on the REST API (it converts the page records into requests against specific REST API endpoints), so the REST API limitations apply to it as well.
>>If this is the limit of the internal database that is generated, my question is: what happens after row 5,000,000?
No, I don't think it will automatically replace old data; the request will return an error message indicating that the amount of data exceeds the limit.
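If you want to see that error instead of the script simply failing, a minimal sketch (assuming `$endpoint` holds your push URL and `$json` the JSON payload; both are placeholders here):

```powershell
try {
    Invoke-RestMethod -Method Post -Uri $endpoint -Body $json -ContentType "application/json"
}
catch {
    # When a limit is exceeded, the service responds with an HTTP error
    # whose body describes which limit was violated
    Write-Warning $_.ErrorDetails.Message
}
```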
Regards,
Xiaoxin Sheng
Hi @v-shex-msft, thank you for your answer.
I found some information here:
There is a "FIFO Dataset" option.
Through code I can set this option, but using the UI I don't know how to change the retention policy to make it a FIFO dataset.
Any idea about this?
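For reference, this is roughly what the coded route looks like, a minimal sketch with a placeholder `$accessToken`, dataset name, and table schema (the `defaultRetentionPolicy=basicFIFO` query parameter is the documented way to get the FIFO behavior when creating a push dataset):

```powershell
# Sketch only: create a push dataset with the basicFIFO retention policy.
# $accessToken, the dataset name, and the columns below are placeholders.
$body = @{
    name   = "MyPushDataset"
    tables = @(
        @{
            name    = "MyTable"
            columns = @(
                @{ name = "Timestamp"; dataType = "DateTime" },
                @{ name = "Value";     dataType = "Double" }
            )
        }
    )
} | ConvertTo-Json -Depth 5

Invoke-RestMethod -Method Post `
    -Uri "https://api.powerbi.com/v1.0/myorg/datasets?defaultRetentionPolicy=basicFIFO" `
    -Headers @{ Authorization = "Bearer $accessToken" } `
    -Body $body -ContentType "application/json"
```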
Regards
Rodrigo
Hi @rbobadilla,
>>Through code I can set this option, but using the UI I don't know how to change the retention policy to make it a FIFO dataset.
I don't think this is an editable option (in the UI), and I haven't found any documentation describing how to change it.
By the way, the current Power BI REST API does not contain any update methods for rows; you need to delete the old records and add new records to achieve an update.
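That delete-then-add pattern looks roughly like this (a sketch only; `$accessToken`, `$datasetId`, the table name, and the row fields are all placeholders):

```powershell
$headers = @{ Authorization = "Bearer $accessToken" }
$rowsUri = "https://api.powerbi.com/v1.0/myorg/datasets/$datasetId/tables/MyTable/rows"

# 1. Delete the stored rows. Note that DeleteRows clears the entire table;
#    the API has no per-row delete.
Invoke-RestMethod -Method Delete -Uri $rowsUri -Headers $headers

# 2. Add the new records.
$newRows = @{
    rows = @(
        @{ Timestamp = (Get-Date).ToString("s"); Value = 42 }
    )
} | ConvertTo-Json -Depth 5
Invoke-RestMethod -Method Post -Uri $rowsUri -Headers $headers `
    -Body $newRows -ContentType "application/json"
```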
Regards,
Xiaoxin Sheng
I'm using the following code to push my data to the API:
(New-Object System.Net.WebClient).Proxy.Credentials = [System.Net.CredentialCache]::DefaultNetworkCredentials

# Load the Oracle managed data access assembly
Add-Type -Path "Oracle Path"

$datasource = 'DataSourceName'
$username = "*******"
$password = "*****"
$connectionString = 'User Id=' + $username + ';Password=' + $password + ';Data Source=' + $datasource
$query = "sql"

## You would find your own endpoint in the Power BI service
$endpoint = "API ENDPOINT"

# Fetch data and build one object per row
$connection = New-Object Oracle.ManagedDataAccess.Client.OracleConnection($connectionString)
$connection.Open()
$command = $connection.CreateCommand()
$command.CommandText = $query
$reader = $command.ExecuteReader()

$objects = @(
    while ($reader.Read()) {
        [pscustomobject]@{
            "API COLUMNS" = $reader['Database_Column']
        }
    }
)

# Push the rows as a JSON array (the push endpoint expects a JSON body)
Invoke-RestMethod -Method Post -Uri "$endpoint" -Body (ConvertTo-Json @($objects)) -ContentType "application/json"

$reader.Close()
$connection.Close()
$connection.Dispose()
What should I do to delete my dataset's rows before posting the new ones? (I've tried this link, but I could not manage to get it working: https://docs.microsoft.com/en-us/rest/api/power-bi/pushdatasets/datasets_deleterows)
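For anyone reaching this thread: a hedged sketch of what the DeleteRows call needs before the push script runs. `$accessToken` is a placeholder AAD token, and the dataset and table names are examples; note that the push endpoint URL above is not enough here, because DeleteRows is addressed by dataset ID:

```powershell
# Look up the dataset ID by name, then clear its table.
$headers  = @{ Authorization = "Bearer $accessToken" }
$datasets = Invoke-RestMethod -Method Get `
    -Uri "https://api.powerbi.com/v1.0/myorg/datasets" -Headers $headers
$datasetId = ($datasets.value | Where-Object { $_.name -eq "MyPushDataset" }).id

# DeleteRows removes every row in the table (there is no per-row delete)
Invoke-RestMethod -Method Delete `
    -Uri "https://api.powerbi.com/v1.0/myorg/datasets/$datasetId/tables/MyTable/rows" `
    -Headers $headers

# ...then run the push script above to repost the data
```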