I'm new to Azure Blob Storage. I want to upload a video file from my Android app to Azure, but as soon as the file size reaches about 32 MB the upload fails with an OutOfMemory exception. After some research, my plan is to split the file into chunks, upload each chunk as a block, and combine them into a single blob at the end, but I don't know how to do that. I tried using commitBlockList, but I can't work out how to get the ID of each block. Here is my first attempt:
try {
    // Setup the cloud storage account.
    CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConnectionString);
    int maxSize = 64 * Constants.MB;

    // Create a blob service client
    CloudBlobClient blobClient = storageAccount.createCloudBlobClient();
    blobClient.getDefaultRequestOptions().setSingleBlobPutThresholdInBytes(maxSize);

    CloudBlobContainer container = blobClient.getContainerReference("testing");
    container.createIfNotExists();

    BlobContainerPermissions containerPermissions = new BlobContainerPermissions();
    containerPermissions.setPublicAccess(BlobContainerPublicAccessType.CONTAINER);
    container.uploadPermissions(containerPermissions);

    CloudBlockBlob finalFile = container.getBlockBlobReference("1.jpg");
    CloudBlob b = container.getBlockBlobReference("temp");
    String Lease = b.getSnapshotID();
    b.uploadFromFile(URL);

    List<BlockEntry> blockEntryIterator = new ArrayList<>();
    blockEntryIterator.add(new BlockEntry(Lease));
    finalFile.commitBlockList(blockEntryIterator);
} catch (Throwable t) {
}
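For reference, here is a minimal sketch of the flow commitBlockList expects: each chunk is uploaded with uploadBlock under a client-generated, Base64-encoded ID (all IDs within one blob the same length), and nothing becomes part of the blob until the full ID list is committed. The sketch assumes the same legacy com.microsoft.azure.storage classes and android.util.Base64 as above; uploadInBlocks is just an illustrative helper name, and note that a snapshot ID (as used above) is unrelated to block IDs.

// Sketch only: upload a file as blocks and commit them at the end.
static void uploadInBlocks(CloudBlockBlob blob, File source, int blockSize)
        throws IOException, StorageException {
    List<BlockEntry> blockList = new ArrayList<>();
    FileInputStream in = new FileInputStream(source);
    try {
        long bytesLeft = source.length();
        int blockNumber = 0;
        while (bytesLeft > 0) {
            int bytesToRead = (int) Math.min(blockSize, bytesLeft);
            // Block IDs are chosen by the client; they must be valid Base64
            // and the same length for every block in the blob.
            String raw = String.format("block-%06d", blockNumber++);
            String blockId = Base64.encodeToString(raw.getBytes(), Base64.NO_WRAP);
            blockList.add(new BlockEntry(blockId));
            // Reads exactly bytesToRead bytes from the stream for this block.
            blob.uploadBlock(blockId, in, bytesToRead);
            bytesLeft -= bytesToRead;
        }
    } finally {
        in.close();
    }
    // The blob only becomes visible once the block list is committed.
    blob.commitBlockList(blockList);
}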
~~~UPDATE~~~
I tried to break the file into chunks, but now I get the error "The specified blob or block content is invalid".

public void splitTest(String URL) throws IOException, URISyntaxException, InvalidKeyException, StorageException {
    new ConversionNotificationSetup().sendNotification(a.getApplicationContext(), "STARTED");
    CloudBlockBlob blob = null;
    List<BlockEntry> blockList = null;
    try {
        // get file reference
        FileInputStream fs = new FileInputStream(URL);
        File sourceFile = new File(URL);

        // set counters
        long fileSize = sourceFile.length();
        int blockSize = 3 * (1024 * 1024); // 3 MB per block
        int blockCount = (int) ((float) fileSize / (float) blockSize) + 1;
        long bytesLeft = fileSize;
        int blockNumber = 0;
        long bytesRead = 0;

        CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConnectionString);
        CloudBlobClient blobClient = storageAccount.createCloudBlobClient();
        CloudBlobContainer container = blobClient.getContainerReference("testing");
        String title = "Android_" + getFileNameFromUrl(URL);

        // get a reference to the blob we are creating while uploading
        blob = container.getBlockBlobReference(title);
        blob.deleteIfExists();

        // list of all block ids we will be uploading - needed for the commit at the end
        blockList = new ArrayList<BlockEntry>();
        // loop through the file and upload chunks of the file to the blob
        while (bytesLeft > 0) {
            blockNumber++;

            // how much to read (only the last chunk may be smaller)
            int bytesToRead = 0;
            if (bytesLeft >= (long) blockSize) {
                bytesToRead = blockSize;
            } else {
                bytesToRead = (int) bytesLeft;
            }

            // trace out progress
            float pctDone = ((float) blockNumber / (float) blockCount) * (float) 100;

            // save block id in array (must be base64)
            String x = "";
            if (blockNumber <= 9) {
                traceLine("blockid: 000" + blockNumber + ". " + String.format("%.0f%%", pctDone) + " done.");
                x = "blockid000" + blockNumber;
            } else if (blockNumber >= 10 && blockNumber <= 99) {
                traceLine("blockid: 00" + blockNumber + ". " + String.format("%.0f%%", pctDone) + " done.");
                x = "blockid00" + blockNumber;
            } else if (blockNumber >= 100 && blockNumber <= 999) {
                traceLine("blockid0: " + blockNumber + ". " + String.format("%.0f%%", pctDone) + " done.");
                x = "blockid0" + blockNumber;
            } else if (blockNumber >= 1000 && blockNumber <= 9999) {
                traceLine("blockid: " + blockNumber + ". " + String.format("%.0f%%", pctDone) + " done.");
                x = "blockid" + blockNumber;
            }
            String blockId = Base64.encodeToString(x.getBytes(), Base64.DEFAULT).replace("\n", "").toLowerCase();
            traceLine("Base 64[" + x + "] -> " + blockId);
            BlockEntry block = new BlockEntry(blockId);
            blockList.add(block);

            // upload block chunk to Azure Storage
            blob.uploadBlock(blockId, fs, (long) bytesToRead);

            // increment/decrement counters
            bytesRead += bytesToRead;
            bytesLeft -= bytesToRead;
        }
        fs.close();

        traceLine("CommitBlockList. BytesUploaded: " + bytesRead);
        blob.commitBlockList(blockList);
        new ConversionNotificationSetup().sendNotification(a.getApplicationContext(), "UPLOAD COMPLETE");
        return;
    } catch (StorageException storageException) {
        traceLine("StorageException encountered: ");
        traceLine(storageException.getMessage());
        new ConversionNotificationSetup().sendNotification(a.getApplicationContext(), "FAILED");
        assert blockList != null;
        blob.commitBlockList(blockList);
        return;
    } catch (IOException ex) {
        traceLine("IOException: " + ex);
        new ConversionNotificationSetup().sendNotification(a.getApplicationContext(), "FAILED");
        assert blockList != null;
        blob.commitBlockList(blockList);
        return;
    } catch (Exception e) {
        traceLine("Exception encountered: ");
        traceLine(e.getMessage());
        new ConversionNotificationSetup().sendNotification(a.getApplicationContext(), "FAILED");
        assert blockList != null;
        blob.commitBlockList(blockList);
        return;
    }
}
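As an aside (not necessarily the cause of the "block content is invalid" error), the if/else padding ladder above can be collapsed into a single fixed-width format, and the Base64 string is best left untouched: Base64 is case-sensitive, so lower-casing it changes the bytes the ID decodes to, and block IDs within one blob are expected to be the same length. A hedged sketch:

// Zero-padding keeps every raw ID, and therefore every encoded ID, the same length.
String raw = String.format("blockid%04d", blockNumber);
// NO_WRAP avoids the trailing newline that Base64.DEFAULT appends, so no replace()/toLowerCase() is needed.
String blockId = Base64.encodeToString(raw.getBytes(), Base64.NO_WRAP);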
~~~WORKING UPDATE~~~
For anyone who wants to reuse this method to open a file and read it chunk by chunk, the version below works. I can upload files of roughly 1 GB, but beyond that I get an Error 500 and the app crashes. If anyone has a solution for that, please let me know.
public boolean splitTest(String URL) throws IOException, URISyntaxException, InvalidKeyException, StorageException {
    new ConversionNotificationSetup().sendNotification(a.getApplicationContext(), "STARTED");
    CloudBlockBlob blob = null;
    List<BlockEntry> blockList = null;
    try {
        // get file reference
        FileInputStream fs = new FileInputStream(URL);
        File sourceFile = new File(URL);

        // set counters
        long fileSize = sourceFile.length();
        int blockSize = 512 * 1024; // 512 KB per block
        //int blockSize = 1 * (1024 * 1024); // 1 MB per block
        int blockCount = (int) ((float) fileSize / (float) blockSize) + 1;
        long bytesLeft = fileSize;
        int blockNumber = 0;
        long bytesRead = 0;

        CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConnectionString);
        CloudBlobClient blobClient = storageAccount.createCloudBlobClient();
        CloudBlobContainer container = blobClient.getContainerReference("testing");
        String title = "Android/Android_" + getFileNameFromUrl(URL).replace("\n", "").replace(" ", "_").replace("-", "").toLowerCase();

        // get a reference to the blob we are creating while uploading
        blob = container.getBlockBlobReference(title);
        traceLine("Title of blob -> " + title);
        if (blob.exists())
            blob.deleteIfExists();
        blob.setStreamWriteSizeInBytes(blockSize);

        // list of all block ids we will be uploading - needed for the commit at the end
        blockList = new ArrayList<>();
        // loop through the file and upload chunks of the file to the blob
        while (bytesLeft > 0) {
            // how much to read (only the last chunk may be smaller)
            int bytesToRead = 0;
            if (bytesLeft >= (long) blockSize) {
                bytesToRead = blockSize;
            } else {
                bytesToRead = (int) bytesLeft;
            }

            // trace out progress
            float pctDone = ((float) blockNumber / (float) blockCount) * (float) 100;

            // save block id in array (must be base64)
            String x = "";
            if (blockNumber <= 9) {
                traceLine("tempblobid0000" + blockNumber + ". " + String.format("%.0f%%", pctDone) + " done.");
                x = "tempblobid0000" + blockNumber;
            } else if (blockNumber >= 10 && blockNumber <= 99) {
                traceLine("tempblobid000" + blockNumber + ". " + String.format("%.0f%%", pctDone) + " done.");
                x = "tempblobid000" + blockNumber;
            } else if (blockNumber >= 100 && blockNumber <= 999) {
                traceLine("tempblobid00" + blockNumber + ". " + String.format("%.0f%%", pctDone) + " done.");
                x = "tempblobid00" + blockNumber;
            } else if (blockNumber >= 1000 && blockNumber <= 9999) {
                traceLine("tempblobid0" + blockNumber + ". " + String.format("%.0f%%", pctDone) + " done.");
                x = "tempblobid0" + blockNumber;
            } else if (blockNumber >= 10000 && blockNumber <= 99999) {
                traceLine("tempblobid" + blockNumber + ". " + String.format("%.0f%%", pctDone) + " done.");
                x = "tempblobid" + blockNumber;
            }
            String blockId = Base64.encodeToString(x.getBytes(), Base64.NO_WRAP).replace("\n", "").toLowerCase();
            traceLine("Base 64[" + x + "] -> " + blockId);
            BlockEntry block = new BlockEntry(blockId);
            blockList.add(block);

            // upload block chunk to Azure Storage
            blob.uploadBlock(blockId, fs, (long) bytesToRead);
            notification2(a, pctDone);
            //a.update(pctDone);

            // increment/decrement counters
            bytesRead += bytesToRead;
            bytesLeft -= bytesToRead;
            blockNumber++;
        }
        fs.close();

        traceLine("CommitBlockList. BytesUploaded: " + bytesRead + "\t total bytes -> " + fileSize + "\tBytes Left -> " + bytesLeft);
        blob.commitBlockList(blockList);
        new ConversionNotificationSetup().sendNotification(a.getApplicationContext(), "UPLOAD COMPLETE");
        return true;
    } catch (StorageException storageException) {
        traceLine("StorageException encountered: ");
        traceLine(storageException.getMessage());
        traceLine("HTTP Status code -> " + storageException.getHttpStatusCode());
        new ConversionNotificationSetup().sendNotification(a.getApplicationContext(), "FAILED");
        if (blob != null) {
            blob.commitBlockList(blockList);
        }
        return false;
    } catch (IOException ex) {
        traceLine("IOException: " + ex);
        new ConversionNotificationSetup().sendNotification(a.getApplicationContext(), "FAILED");
        if (blob != null) {
            blob.commitBlockList(blockList);
        }
        return false;
    } catch (Exception e) {
        traceLine("Exception encountered: ");
        traceLine(e.getMessage());
        new ConversionNotificationSetup().sendNotification(a.getApplicationContext(), "FAILED");
        if (blob != null) {
            blob.commitBlockList(blockList);
        }
        return false;
    }
}
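Not a confirmed fix for the Error 500 on very large files, but the same legacy SDK lets a retry policy and a longer per-call timeout be attached to each block upload through BlobRequestOptions; a hedged sketch with arbitrary values:

// Sketch only: per-request options for the uploadBlock calls in the loop above.
BlobRequestOptions options = new BlobRequestOptions();
options.setRetryPolicyFactory(new RetryExponentialRetries(4 * 1000, 5)); // ~4 s back-off, up to 5 attempts
options.setTimeoutIntervalInMs(60 * 1000);                               // per-attempt timeout
// uploadBlock has an overload that also takes an AccessCondition, options and an OperationContext.
blob.uploadBlock(blockId, fs, (long) bytesToRead, null, options, null);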
~~~ANSWER~~~
Set singleBlobPutThresholdInBytes to a somewhat lower value as suggested by Emily (the default is 32 or 64 MB) and let the SDK do the chunking for you. Change int maxSize = 64 * Constants.MB; to something like int maxSize = 1 * Constants.MB; and the SDK will split the file into 1 MB chunks and upload it. You then won't need the last three lines of your code (just above the catch statement). If it still fails, try reducing maxSize to 512 KB or even smaller.
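A hedged sketch of that suggestion, reusing the names from the first snippet in the question (the blob name is a placeholder and URL is the local file path, as in the question):

// Let the SDK split the upload into blocks instead of doing it by hand.
CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConnectionString);
CloudBlobClient blobClient = storageAccount.createCloudBlobClient();
// Anything above this threshold is uploaded as a block list by the SDK itself.
blobClient.getDefaultRequestOptions().setSingleBlobPutThresholdInBytes(1 * Constants.MB);
CloudBlobContainer container = blobClient.getContainerReference("testing");
CloudBlockBlob blob = container.getBlockBlobReference("Android/example_video.mp4"); // placeholder name
blob.setStreamWriteSizeInBytes(512 * 1024); // size of each block the SDK writes
blob.uploadFromFile(URL); // streams the file and commits the block list internally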