I'm new to Azure Blob storage. I basically want to upload a video file from my Android app to Azure, but as soon as the file size reaches 32 MB the upload throws an OutOfMemory exception. After some research I came up with a solution: break the file into chunks, upload the chunks as blocks, and combine them into one blob at the end. But I don't know how to do that. I tried using commitBlockList, but I can't figure out how to get the IDs of the individual blocks.

try {
    // Set up the cloud storage account.
    CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConnectionString);
    int maxSize = 64 * Constants.MB;

    // Create a blob service client.
    CloudBlobClient blobClient = storageAccount.createCloudBlobClient();
    blobClient.getDefaultRequestOptions().setSingleBlobPutThresholdInBytes(maxSize);
    CloudBlobContainer container = blobClient.getContainerReference("testing");
    container.createIfNotExists();
    BlobContainerPermissions containerPermissions = new BlobContainerPermissions();
    containerPermissions.setPublicAccess(BlobContainerPublicAccessType.CONTAINER);
    container.uploadPermissions(containerPermissions);
    CloudBlockBlob finalFile = container.getBlockBlobReference("1.jpg");
    CloudBlob b = container.getBlockBlobReference("temp");
    String lease = b.getSnapshotID();
    b.uploadFromFile(URL);
    // This is where I'm stuck: I don't know what IDs to pass into the block list.
    List<BlockEntry> blockEntries = new ArrayList<>();
    blockEntries.add(new BlockEntry(lease));
    finalFile.commitBlockList(blockEntries);
} catch (Throwable t) {
    // errors are silently swallowed here
}
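
(For anyone hitting the same wall: the part I was missing is that block IDs are simply made up by the client. Below is a rough sketch of the Put Block / Put Block List flow I was aiming for, not my actual code; `container` and `localPath` are placeholders, and the enclosing method is assumed to declare the SDK's checked exceptions and have the usual java.io and Azure Storage Android SDK imports.)

    // Sketch only: block IDs are client-generated strings; they must be
    // Base64-encoded and every ID in the blob must have the same length.
    CloudBlockBlob blockBlob = container.getBlockBlobReference("video.mp4");
    List<BlockEntry> blockEntries = new ArrayList<>();

    byte[] buffer = new byte[1024 * 1024];   // 1 MB per block
    FileInputStream in = new FileInputStream(localPath);
    int read;
    int blockNumber = 0;
    while ((read = in.read(buffer)) != -1) {
        // zero-padded counter keeps every encoded ID the same length
        String rawId = String.format(Locale.US, "block-%08d", blockNumber++);
        String blockId = Base64.encodeToString(rawId.getBytes(), Base64.NO_WRAP);
        // Put Block: upload this chunk under its ID
        blockBlob.uploadBlock(blockId, new ByteArrayInputStream(buffer, 0, read), read);
        blockEntries.add(new BlockEntry(blockId));
    }
    in.close();
    // Put Block List: stitch the uploaded blocks together, in this order
    blockBlob.commitBlockList(blockEntries);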

~~~UPDATE~~~

I tried to break the file into parts, but now I get the error "The specified blob or block content is invalid".

public void splitTest(String URL) throws IOException, URISyntaxException, InvalidKeyException, StorageException {

    new ConversionNotificationSetup().sendNotification(a.getApplicationContext(),"STARTED");
    CloudBlockBlob blob = null;
    List<BlockEntry> blockList = null;
    try{
        // get file reference
        FileInputStream fs = new FileInputStream( URL );
        File sourceFile = new File( URL);

        // set counters
        long fileSize = sourceFile.length();
        int blockSize = 3 * (1024 * 1024); // 3 MB
        int blockCount = (int)((float)fileSize / (float)blockSize) + 1;
        long bytesLeft = fileSize;
        int blockNumber = 0;
        long bytesRead = 0;

        CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConnectionString);
        CloudBlobClient blobClient = storageAccount.createCloudBlobClient();
        CloudBlobContainer container = blobClient.getContainerReference("testing");
        String title = "Android_" + getFileNameFromUrl(URL);
        // get ref to the blob we are creating while uploading
        blob = container.getBlockBlobReference(title);
        blob.deleteIfExists();

        // list of all block ids we will be uploading - need it for the commit at the end
        blockList = new ArrayList<BlockEntry>();

        // loop through the file and upload chunks of the file to the blob
        while( bytesLeft > 0 ) {

            blockNumber++;
            // how much to read (only last chunk may be smaller)
            int bytesToRead = 0;
            if ( bytesLeft >= (long)blockSize ) {
                bytesToRead = blockSize;
            } else {
                bytesToRead = (int)bytesLeft;
            }

            // trace out progress
            float pctDone = ((float)blockNumber / (float)blockCount) * (float)100;


            // save block id in array (must be base64)
            String x = "";
            if(blockNumber<=9) {
                traceLine( "blockid: 000" + blockNumber + ". " + String.format("%.0f%%",pctDone) + " done.");
                x = "blockid000" + blockNumber;
            }
            else if(blockNumber>=10 && blockNumber<=99){
                traceLine( "blockid: 00" + blockNumber + ". " + String.format("%.0f%%",pctDone) + " done.");
                x = "blockid00" + blockNumber;
            }
            else if(blockNumber>=100 && blockNumber<=999){
                traceLine( "blockid: 0" + blockNumber + ". " + String.format("%.0f%%",pctDone) + " done.");
                x = "blockid0" + blockNumber;
            }
            else if(blockNumber>=1000 && blockNumber<=9999){
                traceLine( "blockid: " + blockNumber + ". " + String.format("%.0f%%",pctDone) + " done.");
                x = "blockid" + blockNumber;
            }
            String blockId = Base64.encodeToString(x.getBytes(),Base64.DEFAULT).replace("\n","").toLowerCase();
            traceLine( "Base 64["+x+"] -> " + blockId);
            BlockEntry block = new BlockEntry(blockId);
            blockList.add(block);

            // upload block chunk to Azure Storage
            blob.uploadBlock( blockId, fs, (long)bytesToRead);

            // increment/decrement counters
            bytesRead += bytesToRead;
            bytesLeft -= bytesToRead;

        }
        fs.close();
        traceLine( "CommitBlockList. BytesUploaded: " + bytesRead);
        blob.commitBlockList(blockList);
        new ConversionNotificationSetup().sendNotification(a.getApplicationContext(),"UPLOAD COMPLETE");
        return;
    }
    catch (StorageException storageException) {
        traceLine("StorageException encountered: ");
        traceLine(storageException.getMessage());
        new ConversionNotificationSetup().sendNotification(a.getApplicationContext(), "FAILED");
        if (blob != null && blockList != null) {
            blob.commitBlockList(blockList);
        }
        return;
    } catch (IOException ex) {
        traceLine("IOException: " + ex);
        new ConversionNotificationSetup().sendNotification(a.getApplicationContext(), "FAILED");
        if (blob != null && blockList != null) {
            blob.commitBlockList(blockList);
        }
        return;
    } catch (Exception e) {
        traceLine("Exception encountered: ");
        traceLine(e.getMessage());
        new ConversionNotificationSetup().sendNotification(a.getApplicationContext(), "FAILED");
        if (blob != null && blockList != null) {
            blob.commitBlockList(blockList);
        }
        return;
    }
}

~~~WORKING UPDATE~~~

For anyone who wants to reuse this method to open a file and read/upload it in chunks, here it is. I am able to upload files up to ~1 GB, but after that it gives me an Error 500 and crashes. If anyone has a solution, please let me know.

public boolean splitTest(String URL) throws IOException, URISyntaxException, InvalidKeyException, StorageException {
    new ConversionNotificationSetup().sendNotification(a.getApplicationContext(),"STARTED");
    CloudBlockBlob blob = null;
    List<BlockEntry> blockList = null;
    try{
        // get file reference
        FileInputStream fs = new FileInputStream(URL);
        File sourceFile = new File(URL);

        // set counters
        long fileSize = sourceFile.length();
        int blockSize = 512 * 1024; // 512 KB
        //int blockSize = 1 * (1024 * 1024); // 1 MB
        int blockCount = (int)((float)fileSize / (float)blockSize) + 1;
        long bytesLeft = fileSize;
        int blockNumber = 0;
        long bytesRead = 0;

        CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConnectionString);
        CloudBlobClient blobClient = storageAccount.createCloudBlobClient();
        CloudBlobContainer container = blobClient.getContainerReference("testing");
        String title = "Android/Android_" + getFileNameFromUrl(URL).replace("\n","").replace(" ","_").replace("-","").toLowerCase();
        // get ref to the blob we are creating while uploading
        blob = container.getBlockBlobReference(title);
        traceLine("Title of blob -> " + title);

        blob.deleteIfExists(); // already a no-op if the blob does not exist

        blob.setStreamWriteSizeInBytes(blockSize);
        // list of all block ids we will be uploading - need it for the commit at the end
        blockList = new ArrayList<>();

        // loop through the file and upload chunks of the file to the blob
        while( bytesLeft > 0 ) {
            // how much to read (only last chunk may be smaller)
            int bytesToRead = 0;
            if ( bytesLeft >= (long)blockSize ) {
                bytesToRead = blockSize;
            } else {
                bytesToRead = (int)bytesLeft;
            }

            // trace out progress
            float pctDone = ((float)blockNumber / (float)blockCount) * (float)100;


            // save block id in array (must be base64)
            String x = "";
            if(blockNumber<=9) {
                traceLine( "tempblobid0000" + blockNumber + ". " + String.format("%.0f%%",pctDone) + " done.");
                x = "tempblobid0000" + blockNumber;
            }
            else if(blockNumber>=10 && blockNumber<=99){
                traceLine( "tempblobid000" + blockNumber + ". " + String.format("%.0f%%",pctDone) + " done.");
                x = "tempblobid000" + blockNumber;
            }
            else if(blockNumber>=100 && blockNumber<=999){
                traceLine( "tempblobid00" + blockNumber + ". " + String.format("%.0f%%",pctDone) + " done.");
                x = "tempblobid00" + blockNumber;
            }
            else if(blockNumber>=1000 && blockNumber<=9999){
                traceLine( "tempblobid0" + blockNumber + ". " + String.format("%.0f%%",pctDone) + " done.");
                x = "tempblobid0" + blockNumber;
            }
            else if(blockNumber>=10000 && blockNumber<=99999){
                traceLine( "tempblobid" + blockNumber + ". " + String.format("%.0f%%",pctDone) + " done.");
                x = "tempblobid" + blockNumber;
            }
            String blockId = Base64.encodeToString(x.getBytes(),Base64.NO_WRAP).replace("\n","").toLowerCase();
            traceLine( "Base 64["+ x +"] -> " + blockId);
            BlockEntry block = new BlockEntry(blockId);
            blockList.add(block);
            // upload block chunk to Azure Storage
            blob.uploadBlock( blockId, fs, (long)bytesToRead);
            notification2(a,pctDone);
            //a.update(pctDone);
            // increment/decrement counters
            bytesRead += bytesToRead;
            bytesLeft -= bytesToRead;
            blockNumber++;
        }
        fs.close();
        traceLine( "CommitBlockList. BytesUploaded: " + bytesRead + "\t total bytes -> " + fileSize + "\tBytes Left -> " + bytesLeft);
        blob.commitBlockList(blockList);
        new ConversionNotificationSetup().sendNotification(a.getApplicationContext(),"UPLOAD COMPLETE");
        return true;
    }
    catch (StorageException storageException) {
        traceLine("StorageException encountered: ");
        traceLine(storageException.getMessage());
        traceLine("HTTP Status code -> " + storageException.getHttpStatusCode());
        new ConversionNotificationSetup().sendNotification(a.getApplicationContext(),"FAILED");
        if (blob != null && blockList != null) {
            blob.commitBlockList(blockList);
        }
        return false;
    } catch( IOException ex ) {
        traceLine( "IOException: " + ex );
        new ConversionNotificationSetup().sendNotification(a.getApplicationContext(),"FAILED");
        if (blob != null && blockList != null) {
            blob.commitBlockList(blockList);
        }
        return false;
    } catch (Exception e) {
        traceLine("Exception encountered: ");
        traceLine(e.getMessage());
        new ConversionNotificationSetup().sendNotification(a.getApplicationContext(),"FAILED");
        if (blob != null && blockList != null) {
            blob.commitBlockList(blockList);
        }
        return false;
    }

}
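
(In case it helps anyone reusing this: I call the method from a worker thread, because the Azure Storage calls are blocking network I/O and will typically fail on the Android main thread with a NetworkOnMainThreadException. A rough sketch of a caller; the file path is just a placeholder.)

    // Hypothetical caller sketch: keep the upload off the UI thread.
    final String localFilePath = "/path/to/video.mp4"; // placeholder
    new Thread(new Runnable() {
        @Override
        public void run() {
            try {
                boolean uploaded = splitTest(localFilePath);
                traceLine("Upload finished, success = " + uploaded);
            } catch (Exception e) {
                traceLine("Upload failed: " + e.getMessage());
            }
        }
    }).start();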

7 Comments

  • Please share the code you've written. (Commented Jan 5, 2016 at 16:54)
  • Not really sure what you're doing here ... Please see Emily's response here: stackoverflow.com/questions/24424543/…. I think your best bet would be to reduce singleBlobPutThresholdInBytes to a somewhat lower value, as suggested by Emily (the default is 32 or 64 MB), and let the SDK do the chunking for you. (Commented Jan 5, 2016 at 17:07)
  • I did try that, but it runs out of memory just after 32 MB. Can you explain this approach? "If you'd still like to use a more manual approach, the PutBlock and Put Block List API references are here and provide generic, cross-language documentation. These have nice wrappers in the CloudBlockBlob class of the Azure Storage Android library called uploadBlock and commitBlockList which may save you a lot of time in manual request construction and can provide some of the aforementioned conveniences." (Commented Jan 5, 2016 at 17:10)
  • Can you show the code where you tried it? (Commented Jan 5, 2016 at 17:12)
  • Try changing the line int maxSize = 64 * Constants.MB; to something like int maxSize = 1 * Constants.MB; Then the SDK will split the file into 1 MB chunks and upload them. You won't need the last 3 lines of your code (just above your catch statement). If it still fails, try reducing maxSize to 512 KB or even smaller. (Commented Jan 5, 2016 at 17:25)

1 Answer

This comes down to what kind of phone you're using. If you're running out of memory at 32 MB, you're on a phone with very little memory or you have a lot of other processes running. Look at how much memory your phone has available and, as Gaurav's comments and my other answer mention, reduce the threshold to that level. Chunking the file yourself won't really help with what you're trying to do.

The two settings you want to look at are singleBlobPutThresholdInBytes and setStreamWriteSizeInBytes on the blob itself. singleBlobPutThresholdInBytes controls when we start chunking instead of putting the whole blob in one request, and setStreamWriteSizeInBytes controls the size of the blocks when chunking does occur. The put threshold defaults to 64 MB and the write size to 4 MB. Note that if you reduce the write size you may not be able to reach the maximum block blob size, since block blobs are limited to 50,000 blocks. Try reducing singleBlobPutThresholdInBytes to the amount of memory you can handle, down to 4 MB; if even that is too much, you're really constrained and need to reduce the stream write size as well.
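
As a rough sketch of what that looks like with the Android SDK (the container name, blob name, file path, and the 1 MB values are just illustrative placeholders to tune for your devices):

    // Let the SDK do the chunking by lowering the two settings described above.
    CloudStorageAccount account = CloudStorageAccount.parse(storageConnectionString);
    CloudBlobClient client = account.createCloudBlobClient();

    // Anything larger than 1 MB is uploaded as blocks instead of a single Put Blob.
    client.getDefaultRequestOptions().setSingleBlobPutThresholdInBytes(1 * Constants.MB);

    CloudBlobContainer container = client.getContainerReference("testing");
    CloudBlockBlob blob = container.getBlockBlobReference("video.mp4");

    // Each block is 1 MB, so only about one block is buffered at a time.
    // 1 MB blocks * 50,000 block limit still allows blobs of roughly 48 GB.
    blob.setStreamWriteSizeInBytes(1 * Constants.MB);

    // The SDK splits the file, uploads the blocks, and commits the block list.
    blob.uploadFromFile(localPath);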

8 Comments

  • I tried it, but can you help me with how to upload multiple blobs and merge them into one?
  • What happened when you tried it? If it didn't work, please update your question and we can take things from there.
  • I've updated my code, Emily, and now I'm getting the error "The specified blob or block content is invalid". Can you please explain why I'm getting it?
  • Why are you breaking it into blocks rather than just setting singleBlobPutThresholdInBytes and uploading?
  • Because on some phones it was crashing due to heap memory.