Anonymous
Not applicable

Power BI API Dataflow Import Operation

Hi all, I am testing Dataflow creation via the API, following the "Power BI dataflow REST API reference" instructions.

 

I can export a Dataflow via the API, but when I try to create a Dataflow with the Import operation I get this error:

 

Method not allowed. The requested resource does not support http method 'POST'

 

This is the code generated by the Postman app for the C# RestSharp library:

 

var client = new RestClient("https://api.PowerBI.com/v1.0/myorg/groups/<mygroupid>/import?datasetDisplayName=DataflowTest.json&ty...");
var request = new RestRequest(Method.POST);
request.AddHeader("cache-control", "no-cache");
request.AddHeader("Content-Type", "application/json");
request.AddHeader("Authorization", "Bearer <mytoken>");
request.AddHeader("content-type", "multipart/form-data; boundary=----WebKitFormBoundary7MA4YWxkTrZu0gW");
request.AddParameter("multipart/form-data; boundary=----WebKitFormBoundary7MA4YWxkTrZu0gW", "------WebKitFormBoundary7MA4YWxkTrZu0gW\r\nContent-Disposition: form-data; name=\"\"; filename=\"Dataflowtest.json\"\r\nContent-Type: application/json\r\n\r\n\r\n------WebKitFormBoundary7MA4YWxkTrZu0gW--", ParameterType.RequestBody);
IRestResponse response = client.Execute(request);

 

Did you get the same error code?

 

Thanks.

1 ACCEPTED SOLUTION

Import is a bit tricky to get right.

Here is a sample:

 

POST https://api.powerbi.com/v1.0/myorg/groups/{groupId}/imports?datasetDisplayName=model.json HTTP/1.1
accept: application/json, text/plain, */*
cache-control: no-cache
Authorization: Bearer eyJ0eXAiOi...nQ
User-Agent: PostmanRuntime/7.3.0
accept-encoding: gzip, deflate
content-type: multipart/form-data; boundary=--------------------------613655670645709167014348
content-length: ...
Connection: close
<BODY>
----------------------------613655670645709167014348
Content-Disposition: form-data; name=""; filename="model.json"
Content-Type: application/json
 
{
    "name": "my model",
    "version": "1.0",
    ...
}
----------------------------613655670645709167014348--
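
The same request can also be assembled programmatically. Here is a minimal Python sketch using only the standard library, mirroring the raw sample above; the group id, token, and model bytes are placeholders, and note the endpoint is /imports (plural):

```python
import urllib.request

def build_multipart(model_bytes, filename="model.json",
                    boundary="----PBIDataflowBoundary"):
    """Assemble the single-part multipart/form-data body shown in the raw
    HTTP sample: one unnamed file part carrying the model.json content."""
    body = (
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name=""; filename="{filename}"\r\n'
        "Content-Type: application/json\r\n"
        "\r\n"
    ).encode() + model_bytes + f"\r\n--{boundary}--\r\n".encode()
    headers = {"Content-Type": f"multipart/form-data; boundary={boundary}"}
    return body, headers

def import_dataflow(group_id, token, model_bytes, display_name="model.json"):
    """POST the multipart body to /groups/{groupId}/imports."""
    body, headers = build_multipart(model_bytes, filename=display_name)
    headers["Authorization"] = f"Bearer {token}"
    req = urllib.request.Request(
        f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
        f"/imports?datasetDisplayName={display_name}",
        data=body, headers=headers, method="POST")
    return urllib.request.urlopen(req)
```

The function names here are illustrative, not part of any SDK; the important detail is that the boundary declared in the Content-Type header must match the one used in the body.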

View solution in original post

10 REPLIES
V55
Helper I

Hi @Anonymous, @amos_ortal

 

I am able to create a dataflow using the import API in the Postman tool. I manually fetched the authorization token from the browser dev tools.

 

Now I need to do it through a PowerShell script.
Could you please tell me how you got the Authorization token in your code?

 

I am basically trying to create a PowerShell script which asks for certain input parameters and then executes the POST API to create the dataflow.

Thanks.
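
On the token question: in PowerShell, the usual route is the MicrosoftPowerBIMgmt module (Login-PowerBIServiceAccount, then Get-PowerBIAccessToken returns headers ready for Invoke-RestMethod). As an illustration of what a non-interactive token request looks like, here is a hedged Python sketch of the Azure AD v1 resource-owner-password grant; this grant only works for accounts without MFA, and the tenant, client id, and credentials are placeholders:

```python
import json
import urllib.parse
import urllib.request

# Well-known resource identifier for the Power BI REST API (Azure AD v1 endpoint).
POWER_BI_RESOURCE = "https://analysis.windows.net/powerbi/api"

def build_token_request(tenant_id, client_id, username, password):
    """Build the token endpoint URL and form body for the
    resource-owner-password-credentials grant (no-MFA accounts only)."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"
    body = urllib.parse.urlencode({
        "grant_type": "password",
        "client_id": client_id,
        "username": username,
        "password": password,
        "resource": POWER_BI_RESOURCE,
    }).encode()
    return url, body

def get_access_token(tenant_id, client_id, username, password):
    """POST the grant and return the bearer token from the JSON response."""
    url, body = build_token_request(tenant_id, client_id, username, password)
    with urllib.request.urlopen(urllib.request.Request(url, data=body)) as resp:
        return json.load(resp)["access_token"]
```

For automation with MFA-enabled tenants, a service principal or device-code flow would be needed instead; this sketch only shows the shape of the request.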

Hi, 

 

Did you manage to find a solution for this? I want to do exactly what you are doing but have been unsuccessful.

 

Thanks

Hi, it worked for me. I tried it in Postman.

 

URL - Select POST. 

https://api.powerbi.com/v1.0/myorg/groups/<groupid>/imports?datasetDisplayName=model.json

 

Header:

Authorization - Bearer ... (I fetched it from browser while accessing powerBI)

Accept - application/json, text/plain, */*

Content-Type - multipart/form-data

 

Body:

Choose form-data, click the Choose Files button, and attach your JSON file. Before importing, make sure the JSON has a different dataflow name; if an existing name is given, the API will return an error. Thanks.
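
That name-conflict caveat can be checked programmatically before importing. A hedged Python sketch, assuming the GET dataflows endpoint returns the usual {"value": [...]} wrapper with a "name" field per dataflow (group id and token are placeholders):

```python
import json
import urllib.request

def names_from_response(payload):
    """Extract dataflow names from a GET .../dataflows response
    (assumed shape: {"value": [{"name": ...}, ...]})."""
    return {d.get("name") for d in payload.get("value", [])}

def is_import_safe(model_name, payload):
    # The import fails when a dataflow with the same name already exists,
    # so only proceed when the candidate name is not taken.
    return model_name not in names_from_response(payload)

def existing_dataflow_names(group_id, token):
    """Fetch the dataflows in a workspace and return their names."""
    req = urllib.request.Request(
        f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}/dataflows",
        headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return names_from_response(json.load(resp))
```

The helper names are illustrative; the split into a pure `is_import_safe` check makes the conflict logic easy to test without touching the service.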

Anonymous
Not applicable

To fully support a dataflow DTAP environment, moving dataflow model.json definitions from development/test to acceptance and production workspaces "automatically" is a must-have. Right now we have 3 dataflows loaded with 120 entities and lots of Power Query transforms; using Power Query parameters we can eventually switch the input data source dynamically to the D/T/A/P servers and databases (in our case, SQL Servers/DBs).

 

I believe in the near future an "Import dataflow" UI option should also be possible, besides the APIs.


@V55 I still couldn't get the Postman setup to work; perhaps you could elaborate on this if you have time?

 

It would be nice to develop a PowerShell script and incorporate it into our Azure DevOps cycle, so we only develop a dataflow once and deploy it to the entire (D)TAP environment.

 

Conceptually, something like this. I am no programmer (still need to complete the headers, the body, and parsing the online-stored model.json file) 😞

 

 

# Requires the MicrosoftPowerBIMgmt module (run Login-PowerBIServiceAccount first)
$headers = Get-PowerBIAccessToken

$ws_dev = "put your DT workspace GUID string"
$ws_uat = "put your UAT workspace GUID string"
$ws_prod = "put your PROD workspace GUID string"

$df_dev = "put your dataflow id you want to move"

# Export the model.json definition from the dev workspace...
$model = Invoke-RestMethod -Headers $headers -Uri "https://api.powerbi.com/v1.0/myorg/groups/$ws_dev/dataflows/$df_dev"

# ...then import it into the UAT workspace (the multipart form body still needs to be completed)
Invoke-RestMethod -Headers $headers -Method Post -Uri "https://api.powerbi.com/v1.0/myorg/groups/$ws_uat/imports?datasetDisplayName=yourdataflowmodel.json"
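
One missing piece in a pipeline like this is avoiding a name clash on import, since the API rejects a dataflow whose name already exists in the target workspace. Assuming the exported model.json carries its dataflow name in the top-level "name" field, as in the sample earlier in the thread, a minimal hypothetical Python helper could rewrite it before re-import:

```python
import json

def rename_model(model_bytes, new_name):
    """Rewrite the top-level "name" field of an exported model.json so the
    import does not clash with an existing dataflow name."""
    model = json.loads(model_bytes)
    model["name"] = new_name
    return json.dumps(model).encode()
```

This keeps every other entity definition untouched; only the display name changes between the D/T/A/P stages.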

 

 

 

V55
Helper I

I too faced the same problem.

 

GET operations were successful.

I even tried the DELETE dataflow API and it executed fine, so I guess it is not related to permissions.

 

As per the doc - Power BI dataflow REST API Reference:

The body will contain a model.json file encoded as form data.

 

I suspect the issue is because of this. I am using Postman and have a raw JSON file.

Can someone tell me how to provide the JSON as encoded form data? Do I have to manually give key-value pairs?

Is there an easy way to try it with the Postman tool?

Import is a bit tricky to get right.

Here is a sample:

 

POST https://api.powerbi.com/v1.0/myorg/groups/{groupId}/imports?datasetDisplayName=model.json HTTP/1.1
accept: application/json, text/plain, */*
cache-control: no-cache
Authorization: Bearer eyJ0eXAiOi...nQ
User-Agent: PostmanRuntime/7.3.0
accept-encoding: gzip, deflate
content-type: multipart/form-data; boundary=--------------------------613655670645709167014348
content-length: ...
Connection: close
<BODY>
----------------------------613655670645709167014348
Content-Disposition: form-data; name=""; filename="model.json"
Content-Type: application/json
 
{
    "name": "my model",
    "version": "1.0",
    ...
}
----------------------------613655670645709167014348--

View solution in original post

Hi, can you please show me how to do it in Postman? It would be great if you could provide images.

Thanks in advance

Anonymous
Not applicable

Thanks @amos_ortal

Using the same request as in the previous tests, it now works with the Postman app; I am starting C# request tests.

 

The request URL is incorrect in the Power BI dataflow reference doc. The original:

https://api.powerbi.com/v1.0/myorg/groups/{groupId}/import?datasetDisplayName=<name>.json

 

The fixed URL seems to be:

https://api.powerbi.com/v1.0/myorg/groups/{groupId}/imports?datasetDisplayName=

 

 

v-shex-msft
Community Support

Hi @Anonymous,

 

 

This request requires registering an Azure AD app before use. Have you finished the registration before testing?

 

In addition, I noticed you added another query string to the request URL. I checked the whitepaper but did not find that it allows additional query parameters:

/v1.0/myorg/groups/{groupId}/imports?datasetdisplayName=<name>.json&type=dataflow

 

Notice: <name>.json should store the dataflow entity structure defined in JSON format.

 

Reference links:

Developer resources for Power BI dataflows (Preview)

Register an Azure AD app to embed Power BI content

 

You can also take a look at the following video about dataflows:

 

Regards,

Xiaoxin Sheng

Community Support Team _ Xiaoxin
If this post helps, please consider accepting it as the solution to help other members find it more quickly.
Anonymous
Not applicable

Hi @v-shex-msft,

 

Thanks for your reply.

 

Yes, I did the registration process and permission granting. The fact is that I can execute GET commands and the API works as expected.

 

I added some documented parameters to the query string I attached in the previous message, but in my tests with the documented URL I got this error too.

 

About the JSON structure: I get it from the export function in the Power BI service, and the datasetDisplayName parameter points to that file. The content of the file is sent as form data in the request.

 

About the video: I watched it carefully but I can't find an explanation of how to access the CDM in the default Azure Storage Gen2 (if possible...) or how to replace the default Azure Storage Gen2 with custom storage.

 

Thanks again, I will keep you updated.
