Power BI – Content too large for table?!
The Dreaded Message
When pasting your data (that offer in the ribbon to “create a new table by typing or pasting in new content” is tempting, isn’t it?), you may get the dreaded message:
The clipboard contents are too large to be pasted into the table. Please split your data into multiple tables with less than 3,000 cells. You can combine them into a single table within the Power Query editor.
Oh dear, it was all going so well! You could set the column datatypes, apply any Power Query / M transformations you liked, and the paste surprisingly produced nice clean data. But now you’re up against it – probably because pasting seemed a handy way to avoid using a database. So why not just use “Get data” from Excel? Ah, that’s probably why you’re here: you’ve found out a nasty little secret about the Power BI portal… once published, the data sources are fixed and can’t be switched away from their original source, i.e. your server or PC.
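If you do decide to follow the error message’s own advice – split the data into sub-3,000-cell chunks and paste each one as its own table – the chunking arithmetic is easy to get wrong by hand. Here’s a minimal Python sketch (the `split_for_paste` helper is our own invention, not anything Power BI provides) that works out how many rows fit in each pasted table:

```python
MAX_CELLS = 3000  # Power BI's per-table paste limit, per the error message

def split_for_paste(rows, max_cells=MAX_CELLS):
    """Split a list of equal-length rows into chunks, each strictly
    under max_cells total cells, ready to paste as separate tables."""
    if not rows:
        return []
    cols = len(rows[0])
    # Largest whole number of rows whose cell count stays below the limit
    rows_per_chunk = max(1, (max_cells - 1) // cols)
    return [rows[i:i + rows_per_chunk] for i in range(0, len(rows), rows_per_chunk)]

# Example: 1,000 rows x 10 columns = 10,000 cells -> 4 pasteable chunks
data = [[r, *range(9)] for r in range(1000)]
chunks = split_for_paste(data)
print(len(chunks), max(len(c) * 10 for c in chunks))  # 4 chunks, each under 3,000 cells
```

Paste each chunk as its own table, then append them back into one table in the Power Query editor (Home → Append Queries, or `Table.Combine` in M), just as the message suggests.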
Close – But No Cigar
In fact, maybe you tried the logical workaround – flip into Power Query via the Transform Data button, then right-click the table and deselect the promising-looking option “Include in report refresh”. Nice try, but no dice. See the horror story on the Microsoft Community Issue, where you can add your vote to the 370 other folk who were frustrated enough to kick up a fuss. There was a lot of confusion about it, with Microsoft thinking the issue was fixed when it wasn’t – the trouble being that “Include in report refresh” only applies to Power BI Desktop, not the Power BI service portal. We have also raised a support ticket with Microsoft, so we’ll see how that goes – hopefully a fix soon.
Back to your requirement – simply getting static or semi-static data into Power BI. If you’re not publishing or collaborating, and are sticking with Power BI Desktop, i.e. just the .pbix file, then your best bet is to simply “Get data” from a CSV or Excel file – even if you have to copy the data in there first. That solves it, locally.
But if you are publishing or collaborating, you’re still stuck – because guess what? Even that local one-off “Get data” from CSV or Excel suffers from the problem mentioned above – you’ll want to deselect “Include in report refresh” to stop the failure on the next manual or scheduled refresh. D’oh! Now please vote on that issue link provided!
The Only Way Out
The only way out of this little conundrum (you can’t just paste the data into the model if it’s more than 3,000 cells) is to use either:
- Data Gateway;
- Azure SQL DB.
The Data Gateway resolves this by wiring up your permanently available (it is, right…?) data source to the Power BI model. This is not as simple as it could be, as there are two types – the Power BI Service Gateway and the On-Premises Gateway, the latter needing installation on your PC or server. Unfortunately, they suffer from a few rough edges and are not always intuitive to manage. Crucially, the data source also needs to be permanently available for every refresh – so on a daily schedule, let alone hourly or more frequent, the machine must stay switched on. That racks up virtual server hosting costs, or wastes electricity keeping your PC on, and worse – availability is bound to drop at some point, and then it will cause a failed refresh!
An Azure SQL DB is a better option because of something not immediately obvious: this cloud source does not require a dedicated Power BI Data Gateway. The Power BI service simply connects to it without fuss. Of course, you still need to get your data into the Azure SQL DB (which is not as simple as it seems, as you can’t do a direct import via SSMS yet [UPDATE: with at least SSMS 18.5 you can, once you set up logins and users appropriately]), but it will be worth it once it’s there, because Microsoft takes on the burden of keeping it permanently available without you managing the machine it’s held on. That’s nice.
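One low-tech way to get a static table into the Azure SQL DB is to generate `INSERT` statements you can run from an SSMS query window. Here’s a hedged Python sketch (the `dbo.Sales` table and its columns are purely illustrative; the batching is there because SQL Server allows at most 1,000 row value expressions in a single `VALUES` list):

```python
def rows_to_insert_sql(table, columns, rows, batch=1000):
    """Emit one INSERT ... VALUES statement per batch of rows.
    SQL Server caps a single VALUES list at 1,000 row expressions."""
    def lit(v):
        if v is None:
            return "NULL"
        if isinstance(v, (int, float)) and not isinstance(v, bool):
            return str(v)
        return "'" + str(v).replace("'", "''") + "'"  # escape single quotes

    statements = []
    for i in range(0, len(rows), batch):
        values = ",\n".join(
            "(" + ", ".join(lit(v) for v in row) + ")" for row in rows[i:i + batch]
        )
        statements.append(
            f"INSERT INTO {table} ({', '.join(columns)})\nVALUES\n{values};"
        )
    return statements

# Example usage with a hypothetical Sales table
sql = rows_to_insert_sql("dbo.Sales", ["Region", "Amount"],
                         [("North", 100), ("O'Brien's", 250)])
print(sql[0])
```

For anything beyond a few thousand rows you’d be better served by the SSMS import wizard mentioned above or a bulk-load tool, but for the “semi-static reference table” case this post is about, a generated script is quick and repeatable.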
If you have better things to do than this, like run a business, we can set it all up for you – just drop us a line.