We are querying ContentVersion data in Apex and would like to take the file data and send it chunk by chunk to another webserver.
Is there a way to take only 10,000 bytes of the VersionData blob at a time? Does anyone have example Apex that does this?
Currently we are able to query, but as soon as we refer to the data field we blow our heap size (it's a big file).
We would like to adjust the code below so that the second line of Apex takes only a chunk...
ContentVersion[] cv = [SELECT Id, Title, VersionData FROM ContentVersion WHERE Title LIKE 'Deadmau5%'];
String att64 = EncodingUtil.base64Encode(cv[0].VersionData);
Attribution to: James
Possible Suggestion/Solution #1
I don't think it is currently possible to chunk blob data.
I've had a similar issue with the Attachment body - Salesforce LimitException: Apex heap size too large.
Right now I've done two things to help:
- Move to a Batchable implementation with the batchSize/scope set to 1 (see the sketch after the snippet below). That way you will only ever be dealing with a single Attachment/ContentVersion, reducing the overall heap usage. Plus the batch context has a larger heap size (12 MB rather than 6 MB).
- Check if the base64 encoding of the blob is going to push you past the heap size limit.
It isn't foolproof by any means, but it can help avoid hitting the limit.
// Rough guard: skip the record if loading its body would exceed the remaining heap.
// Keep in mind base64 encoding inflates the data by roughly a third on top of this.
if(Limits.getHeapSize() + att.BodyLength > Limits.getLimitHeapSize()) {
    // Do something else, e.g. log and skip this record
}
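Here is a minimal sketch of that Batchable pattern. The class name ContentVersionExportBatch is hypothetical and the actual callout to the webserver is left as a stub; it queries only lightweight fields up front and fetches the blob one record at a time in execute() to keep the heap small.

global class ContentVersionExportBatch implements Database.Batchable<sObject>, Database.AllowsCallouts {
    global Database.QueryLocator start(Database.BatchableContext bc) {
        // Only lightweight fields here; the blob is fetched per record in execute().
        return Database.getQueryLocator(
            [SELECT Id, ContentSize FROM ContentVersion WHERE Title LIKE 'Deadmau5%']);
    }

    global void execute(Database.BatchableContext bc, List<ContentVersion> scope) {
        // With a batch size of 1, scope holds a single record.
        ContentVersion meta = scope[0];
        if (Limits.getHeapSize() + meta.ContentSize > Limits.getLimitHeapSize()) {
            return; // Still too big even for the 12 MB asynchronous heap; skip it.
        }
        ContentVersion cv = [SELECT VersionData FROM ContentVersion WHERE Id = :meta.Id];
        String att64 = EncodingUtil.base64Encode(cv.VersionData);
        // ... send att64 to the external webserver via a callout ...
    }

    global void finish(Database.BatchableContext bc) {}
}

Kick it off with the scope size set to 1:

Database.executeBatch(new ContentVersionExportBatch(), 1);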
Failing that, you might be better off pulling the binary data into the external system rather than pushing it. You could call out to a web service from Apex with the ContentVersion.Id and Salesforce session details. The web service can then establish a session back, with say the partner API, and pull the blob data down with a retrieve call. A rough sketch of the Apex side follows.
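A sketch of the "push a pointer, pull the data" approach. The endpoint URL is hypothetical, and you would need a Remote Site Setting for it; the external service uses the posted session details to retrieve the blob itself.

HttpRequest req = new HttpRequest();
req.setEndpoint('https://example.com/pull'); // hypothetical external service
req.setMethod('POST');
req.setHeader('Content-Type', 'application/json');
// Send only the record Id and session details; the heavy blob never touches the Apex heap.
req.setBody(JSON.serialize(new Map<String, String>{
    'contentVersionId' => cv[0].Id,
    'sessionId' => UserInfo.getSessionId(),
    'serverUrl' => URL.getSalesforceBaseUrl().toExternalForm()
}));
HttpResponse res = new Http().send(req);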
Attribution to: Daniel Ballinger
This content is remixed from stackoverflow or stackexchange. Please visit https://salesforce.stackexchange.com/questions/4575