I have another question. Now that the project is underway, it appears that I have over 86 million invoices to extract into a flat file. When I run the SELECT statement (now using DBeaver), the job bombs out. I'm not a system admin, so I can't add tables to the Caché database, only pull data out, and I was thinking about importing the flat files into SQL Server for further manipulation. Is there a best practice for getting this enormous table out of Caché?
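For context, here is the kind of batched pull I was considering instead of one giant SELECT: page through the table on its primary key so no single query returns more than a fixed number of rows, appending each batch to the flat file. This is only a sketch; the DSN, table name (`Billing.Invoice`), and column names are hypothetical placeholders, and it assumes the InterSystems Caché ODBC driver plus `pyodbc` are available.

```python
import csv

BATCH = 50_000  # rows per pull; tune to what the server tolerates


def batch_query(last_id: int, batch: int = BATCH) -> str:
    """Keyset-pagination query: the next `batch` rows after `last_id`.

    Table and column names here are hypothetical placeholders;
    substitute the real invoice table and its primary key.
    """
    return (
        f"SELECT TOP {batch} ID, InvoiceNumber, InvoiceDate, Amount "
        f"FROM Billing.Invoice WHERE ID > {last_id} ORDER BY ID"
    )


def export_invoices(dsn: str, path: str) -> None:
    """Stream the whole table to a CSV file, one batch at a time."""
    import pyodbc  # assumes the Caché ODBC driver is installed

    conn = pyodbc.connect(dsn)
    cur = conn.cursor()
    last_id = 0
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        while True:
            cur.execute(batch_query(last_id))
            rows = cur.fetchall()
            if not rows:
                break  # past the last invoice
            writer.writerows(rows)
            last_id = rows[-1][0]  # resume point for the next batch
```

Because each batch restarts from the last ID written, a failed run can resume partway through instead of starting the whole 86 million rows over.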

I have tried a small sample extract using SQL code in the System Explorer section of the Management Portal.

With just a few joins and a small sampling of invoices, I get this message:

I have to extract large volumes of various data for an integration project. Are the other tools mentioned (DBeaver and DataGrip) better at handling extracts this large?