NEMESIS
Occasional Visitor

How do I make my workflow easier (big data source files)?

Hi everyone,

First post on the forums, so hopefully I'm in the right place.

I'm looking for some advice on how to make my workflow and PBI life a bit easier.

 

Some basic info...

 

Microsoft Account Type:

Office 365 Enterprise

Power BI Pro

 

DATA SOURCE: (to give you an idea of scale)

Excel data file: 298 MB (and growing every week)

 

SHEET1 – Data from SAP (client details and transactions):

Number of rows of data: 398,687 (and growing every week)

Number of columns of data: 104 (A-CZ)

SHEET2 – Data from SAP (client details and transactions):

Number of rows of data: 94,494 (and growing every week)

Number of columns of data: 104 (A-CZ)

SHEET3 – Data from SAP (client details and transactions):

Number of rows of data: 92,733 (and growing every week)

Number of columns of data: 104 (A-CZ)

SHEET4 – My admin tab, which is mainly made up of lookup formulas:

Number of rows of data: 1,84 (and growing every week)

Number of columns of data: 17 (A-Q)

 

Power BI report (one of many)

83 MB

15 Pages

Graphs & Pie charts (booking trends, values, annual histories, company analysis, client analysis)

Data maps

Lots of slicers

 

The data files and the PBIX files are stored on an internal server.

 

Hardware

Dell OptiPlex 3420

16 GB RAM

6th Gen Intel® Core™ i5-6500 (Quad Core, 3.2 GHz, 3.6 GHz Turbo, 6 MB, w/ HD Graphics 530)

Windows 10

Office 2016

or an Amazon WorkSpace.

 

SERVER:

No idea, but we are a FTSE 100 company, so I'd like to think it's suitable.

 

Data connection:

Speedtest.net says 300 Mbps/300 Mbps

 

Situation:

Adding the data to Excel is becoming a painful process, as the file is getting very large and it continually grey-screens and shows the blue spinning wheel (or crashes).

We have to manually refresh the dashboards once we have updated the Excel data files, which is sometimes very slow (over 10 minutes).

 

We (three of us) update the data files once per week and refresh the PBIX files.

The Excel files are password protected for data security.

 

We have not had any formal training on PBI; everything has been learned from YouTube and trial and error. We would start again if it meant a better long-term solution.

 

 

Questions:

In a nutshell – how can I make the workflow faster and save time?

 

I have multiple cloud storage options available (OneDrive, Box, AWS, Sites) – would it be faster to store the XLS and PBIX files in the cloud?

 

Should I use something other than Excel to hold the data sets? (I can't do a live link to SAP until 2018 at the earliest, if at all, and I'd still need to link to some Excel data.)

 

I've never used Azure – could this be a faster option?

 

How can I make the PBIX files refresh without having to update them manually?

 

Some sanitised example screenshots:

[Screenshots 1.png through 6.png attached]

 

Thanks in advance for any suggestions or advice!

 

Mark

 

1 ACCEPTED SOLUTION

Super User

Re: How do I make my workflow easier (big data source files)?

OK, a number of things you could do. One, you could switch your Excel data source to a Folder query with Combine Binaries, so that instead of updating one huge Excel file every time you would just drop a new Excel file containing the new data into the folder and it all gets appended together.
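
To make that concrete, here is a minimal sketch of what such a Folder query can look like in Power Query (M). The folder path, the ".xlsx" filter, and the "Sheet1" sheet name are assumptions for illustration only; point them at wherever the weekly SAP extracts actually land and whatever the data sheet is really called.

```
// Sketch of a Folder query that appends every weekly Excel extract.
// The path, extension filter, and sheet name below are placeholders.
let
    // All files in the drop folder on the internal server
    Source = Folder.Files("\\internal-server\SAP Extracts"),

    // Keep only Excel workbooks
    ExcelOnly = Table.SelectRows(Source, each Text.EndsWith([Extension], ".xlsx")),

    // Open each workbook and pull out the data sheet
    AddWorkbook = Table.AddColumn(ExcelOnly, "Workbook",
        each Excel.Workbook([Content], true)),
    AddData = Table.AddColumn(AddWorkbook, "Data",
        each Table.SelectRows([Workbook], each [Name] = "Sheet1"){0}[Data]),

    // Append all of the weekly tables into one table for the report
    Combined = Table.Combine(AddData[Data])
in
    Combined
```

Each week you would then just drop the new extract into that folder and refresh, rather than editing the 298 MB workbook by hand.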

 

You could also use Azure SQL DB or something similar. It is very low cost, and if you put your Azure SQL DB in the same data center as your Power BI tenant, that should greatly improve speed.
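
If you go the Azure SQL route, the connection on the Power BI side is just a different Source step. A sketch, with placeholder server, database, schema, and table names (swap in your own):

```
// Sketch of pointing the report at an Azure SQL Database instead of Excel.
// Server, database, schema, and table names are placeholders.
let
    Source = Sql.Database("yourserver.database.windows.net", "SapStaging"),

    // Navigate to the table that holds the SAP transaction rows
    Transactions = Source{[Schema = "dbo", Item = "ClientTransactions"]}[Data]
in
    Transactions
```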

 

You need to publish your Power BI file to the Service and set the automatic refresh schedule there.

 


I have a book! Learn Power BI from Packt


Did I answer your question? Mark my post as a solution!

Proud to be a Datanaut!

