Blog

15Aug
2007
Uploading Large Files To ColdFusion 8

ColdFusion makes processing uploaded files easy: a simple <CFFILE ACTION="upload"> does the trick. However, ColdFusion has not handled large files very well, large as in many hundreds of megabytes or even gigabytes. Part of the issue was that ColdFusion kept all uploaded content in memory in order to support the GetHTTPRequestData() function.

ColdFusion 8 fixes this problem and now supports receiving uploaded files of any size. But if the file is large, the "content" member of the GetHTTPRequestData() structure will be empty. If you do need access to the uploaded data, simply save the file and then read it.
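A minimal sketch of that save-then-read pattern; the form field name ("fileData") and the destination path here are assumptions:

```cfml
<!--- Hypothetical sketch: save the upload to disk first; the form
      field name and destination path are assumptions. --->
<cffile action="upload"
        filefield="fileData"
        destination="#expandPath('./uploads/')#"
        nameconflict="makeunique">

<!--- On CF8 the "content" member of GetHTTPRequestData() may be
      empty for large uploads, so read the saved file instead. --->
<cffile action="readbinary"
        file="#cffile.serverDirectory#/#cffile.serverFile#"
        variable="uploadedBytes">
```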

If you need ColdFusion 8 to always populate "content" (for compatibility with CFMX 7 and earlier), set the system property "coldfusion.markResetForMulitPart" to "true".
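One way to set such a JVM system property on a standalone install is via the java.args line in jvm.config; the path and the existing arguments shown below are illustrative:

```
# {cf_root}/runtime/bin/jvm.config (exact path varies by install type)
# Append the property to the existing java.args line:
java.args=-server -Xmx512m -Dcoldfusion.markResetForMulitPart=true
```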

Comments (18)



  • Derek Versteegen

    Ben,

    Honestly, I'm not quite sure I know what you're talking about here or how I can implement this. If you have a moment to elaborate a little, that would be a huge help to me and I'd be greatly appreciative.

    The reason I am curious about this is because I am working on an image hosting and print fulfillment solution for a photo print lab that caters only to professional photographers. My clients' customers will be uploading megabytes of zip files. My hosting provider just upgraded me to CF8, so I am eager to take advantage of some of these capabilities. I was going to start with <cfthread> to make the user experience better, but having read this post, I'd like to understand where and how I might implement your suggestion.

    Thanks in advance for any reply you have time to offer.

  • TimM

    Hey Ben...

    I run with a pool of App Developers that use CF7 Flash Forms extensively. We are all finding that they don’t work well in Vista (for our customers that have to use I.E.).

    As you type in this environment, keystrokes are dropped. This is a pretty big item, and none of us can find a solution. There do seem to be many comments about this issue on the Net, but no resolutions…

    Can you point to a source to help us resolve this issue?

    #2Posted by TimM | Aug 16, 2007, 09:18 AM
  • Steven Erat

    @Derek, Ben is saying that CF8 automatically handles very large file uploads better than previous ColdFusion versions. You needn't do anything differently, just use CFFILE action=upload.

    If it happens that you have an existing CF application that somehow expects the 'content' key of the GetHTTPRequestData() structure to be populated during a file upload, then, and only then, do you have to worry about changing your code. If you're not using GetHTTPRequestData() in the request for a file upload, then ignore the caveat.

  • Gareth

    What about timeout issues? I think this was the main problem that we faced when uploading large files. The file would begin to upload, but the page would timeout before the file finished uploading (even if it wasn't a massive file).

    #4Posted by Gareth | Aug 16, 2007, 11:19 AM
  • Derek Versteegen

    @Steven,
    Thanks for the clarification. I wasn't sure if there was anything different I needed to consider doing.

    @Gareth,
    I think that is where <cfthread> becomes very valuable. If you start processing the upload in another thread, you can let the user move on; this releases the page the person would otherwise have been waiting on until the processing was complete. I think that's how it would work. I imagine you'd be able to dictate what else should happen once the upload completes or fails. But that is where I need to learn more about <cfthread> to see how it addresses the concern we share (timing out).
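    A rough sketch of that idea, assuming CF8's <cfzip> and hypothetical field and path names. One caveat: the file transfer itself still happens within the original request, so <cfthread> only offloads the post-upload processing (unzipping, validation, etc.), not the upload wait itself:

```cfml
<!--- Hypothetical sketch: the upload itself still occurs in the
      request, but unzipping moves to a background thread. --->
<cffile action="upload"
        filefield="fileData"
        destination="#expandPath('./uploads/')#"
        nameconflict="makeunique">

<cfthread name="processUpload"
          action="run"
          uploadedFile="#cffile.serverDirectory#/#cffile.serverFile#">
    <!--- Runs after the response is returned to the user. --->
    <cfzip action="unzip"
           file="#attributes.uploadedFile#"
           destination="#expandPath('./extracted/')#">
</cfthread>

<p>Your file is being processed; you may continue.</p>
```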

  • Joe Danziger

    Does this mean we might now be able to create a progress bar for file uploads? That would be an amazing help and feature to have in CF..

  • Derek Versteegen

    This post helps to clarify some things:
    http://www.forta.com/blog/index.cfm/2007/5/21/Mult...
    Does that mean that with <cfthread ... file="#myFile#" ...> the file attribute shifts the duty to the other file, where #myFile# is a CFM file that contains all the logic associated with the task, like what to do if the uploaded file is not a zip file, is not able to be unzipped, times out, etc.?

    As for a thermometer/progress bar, I've done a JavaScript/CSS one that I flush to the client during an upload, but it's not a true progress bar; it just goes back and forth until the process is done. To do a true progress bar you need the file size first, and then it either needs to be constantly compared against the bytes written, or you'd have to sniff the bandwidth and estimate the time from that same file size. Traditionally, CF is only able to output the file size once the file is on the server, where it can read it from the OS itself.

    Therefore, I think we are restricted by the HTML browse control since all it captures is the physical path of the file that is in turn passed to the server via the browser.

  • Courtney

    Is there a "if the file is large" threshold we should know about (where CONTENT won't be populated)?

    #8Posted by Courtney | Aug 20, 2007, 09:52 AM
  • Hemant

    Rupesh has also written about it on his blog: http://coldfused.blogspot.com/2007/08/coldfusion-8...

    #9Posted by Hemant | Aug 22, 2007, 04:51 AM
  • Dan

    Is there any way to improve the upload performance in CF7? Is there any way we can allocate more server memory to the processes used in uploading?

    #10Posted by Dan | Aug 22, 2007, 05:35 PM
  • Ben Forta

    Dan, I don't believe so; there is no way to tell CF7 not to load that file into memory. Another good reason to upgrade.

    --- Ben

    #11Posted by Ben Forta | Aug 22, 2007, 06:28 PM
  • Budd Wright

    Ben, we just upgraded CFMX 6.1 to 8 Standard, and when testing this new feature of CF 8 (a very important feature!), we're still getting JVM error log entries like this:

    javax.servlet.ServletException: ROOT CAUSE:
    coldfusion.util.MemorySemaphore$MemoryUnavailableException: Memory required (695952487 bytes) exceeds the maximum allowed memory.

    The file we're uploading is approximately 695MB, and we've set the JVM Max Heap Size to 1024 MB, restarted the CF server, etc., so we've allowed the JVM to use more than the memory required. But the file upload still fails, and still acts as though CF requires the JVM to set aside the memory up front. I don't get it; with the new feature as you describe it, memory should no longer be a factor?

    Any ideas?!

  • Steven Erat

    Budd,

    Check out the CF Admin's Settings page. There is a group of settings near the bottom for Request Size Limits. One field is Maximum Size of Post Data, which defaults to 100MB. Try bumping that up to a value higher than your largest expected file upload and try again. How's that?

  • Budd Wright

    That one didn't work... but the Request Throttle Memory one did! That was set by default at 200 MB, and when I bumped it up to 1024 MB, so that my 695MB file would fit under it, I was thereafter able to upload without a problem -- SWEET!

    Ben / anyone else: watching the server's performance monitor, CF8 file upload certainly removes the memory-requirement issue... but the CPU still runs upwards of 80% during large file uploads. I tested, and other applications on the same server aren't affected by this (still pretty fast), but why is the CPU being hit so hard during file uploads?

  • Steven Erat

    For anyone wanting to see an example of how memory utilization is improved during large file uploads in CF 8, see this blog:

    http://www.talkingtree.com/blog/index.cfm/2007/9/1...

  • Budd Wright

    Is anyone able to set CF Admin's "Request Throttle Memory" to 2048 MB (2 GB) or higher on Windows? We have clients who want to upload very large files -- yes, 2GB or more; please keep the snickering to a minimum and let's avoid FTP commentary :). Despite the fact that the server has 3GB+ of RAM, every time I set the Request Throttle Memory setting to anything above 1792 MB, uploads fail with an "exceeded memory limit" error in the JRE log, and this even with files of various sizes such as 600 MB!

    It seems like a bug somewhere in CF8 or the JVM... I'm not sure. I can't bump the setting above 1792 MB reliably, though, and it's killing me trying to figure it out. Is anyone able to set it higher and successfully upload files larger than 1 GB?

  • Steven Erat

    This sounds like a good use case for why you might want to try ColdFusion 8 on Solaris with a 64-bit JVM. You may find the following useful:

    http://www.talkingtree.com/blog/index.cfm/2007/9/1...

    http://en.wikipedia.org/wiki/64-bit
    "Some operating systems reserve portions of process address space for OS use, effectively reducing the total address space available for mapping memory for user programs. For instance, Windows XP DLLs and userland OS components are mapped into each process's address space, leaving only 2 to 3.8 GB (depending on the settings) of address space available, even if the computer has 4 GiB of RAM. This restriction is not present in 64-bit Windows."

    http://java.sun.com/docs/hotspot/HotSpotFAQ.html
    "The maximum theoretical heap limit for the 32-bit JVM is 4G. Due to various additional constraints such as available swap, kernel address space usage, memory fragmentation, and VM overhead, in practice the limit can be much lower. On most modern 32-bit Windows systems the maximum heap size will range from 1.4G to 1.6G. On 32-bit Solaris kernels the address space is limited to 2G. On 64-bit operating systems running the 32-bit VM, the max heap size can be higher, approaching 4G on many Solaris systems. ... If your application requires a very large heap you should use a 64-bit VM on a version of the operating system that supports 64-bit applications."

  • Steven Erat

    Whoops, this is the link on my blog which I meant to post:

    http://www.talkingtree.com/blog/index.cfm/2007/8/2...