I, too, used to have problems exporting large result sets (7 - 8 GB) to delimited files. Neither SQLCMD, BCP, nor SSIS could handle dynamic result sets, dynamic text-qualification, adding a column header row, etc. So, I built my own tool to handle this. It currently exists as the DB_BulkExport Stored Procedure in the SQL# SQLCLR library (which I wrote), though the plan is to break it out into a stand-alone export utility and expand on its features.
Please note that the DB_BulkExport Stored Procedure is only available in the Full (paid) version of SQL# (i.e. it is not in the Free version), but I am not aware of any free utility that does this type of thing (which is why I wrote my own).
Regarding the export of large result sets: so far I have not run into memory problems, since the procedure streams each row out to the file as it is read from the result set.
If you only need this export for a single table / query whose structure / schema doesn't really change, then it might be best to write a small, specialized app yourself, either in .NET or PowerShell. The tricky part is handling dynamic requirements. But if you know the column names, the datatypes, etc., then it should be a simple enough matter of opening the output file, executing the query, and then, for every row in the SqlDataReader, using String.Concat to stitch everything together, applying text-qualification where needed and appropriate format specifiers when needed (e.g. for Date(Time) values).
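For example, here is a minimal C# sketch of that approach. The connection string, query, and output path are placeholders you would substitute, and the quote-doubling text-qualification and round-trippable date format shown are just one reasonable set of choices, not the only ones:

```csharp
using System;
using System.Data.SqlClient;
using System.IO;
using System.Text;

class CsvExport
{
    static void Main()
    {
        // Placeholder values; substitute your own connection string, query, and path.
        const string connString = "Server=.;Database=MyDb;Integrated Security=true;";
        const string query = "SELECT ID, Name, CreatedOn FROM dbo.MyTable;";
        const string outputPath = @"C:\temp\export.csv";

        using (var conn = new SqlConnection(connString))
        using (var cmd = new SqlCommand(query, conn))
        using (var writer = new StreamWriter(outputPath, false, Encoding.UTF8))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                // Column header row, taken from the result set's metadata.
                var headers = new string[reader.FieldCount];
                for (int i = 0; i < reader.FieldCount; i++)
                    headers[i] = reader.GetName(i);
                writer.WriteLine(string.Join(",", headers));

                // Stream each row to the file as it is read so memory use
                // stays flat regardless of result set size.
                var fields = new string[reader.FieldCount];
                while (reader.Read())
                {
                    for (int i = 0; i < reader.FieldCount; i++)
                        fields[i] = FormatField(reader.GetValue(i));
                    writer.WriteLine(string.Join(",", fields));
                }
            }
        }
    }

    static string FormatField(object value)
    {
        if (value == DBNull.Value) return string.Empty;

        // One possible format specifier for date/time values.
        if (value is DateTime dt) return dt.ToString("yyyy-MM-dd HH:mm:ss.fff");

        // Text-qualify strings: wrap in quotes and double any embedded quotes.
        if (value is string s) return "\"" + s.Replace("\"", "\"\"") + "\"";

        return Convert.ToString(value);
    }
}
```

Since the header row and field count come from the reader's metadata rather than being hard-coded, this stays simple as long as the query is fixed; supporting arbitrary queries, configurable delimiters / qualifiers, etc. is where the real work starts.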