I have some strange behavior which I can't figure out.
I'm using a WCF service to save files to a certain DB table. The WCF service has a single method which takes a JSON string as parameter. The JSON in this case is a serialized command which contains a List<FileData> amongst other properties. The WCF service deserializes the JSON and runs the CommandHandler for this specific command.
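For context, the command that gets serialized is shaped roughly like this. These are simplified sketches reconstructed from the test further down; the real classes carry more properties:

```csharp
using System;
using System.Collections.Generic;

// Simplified sketch of the DTOs; names match the test below,
// but the real classes have additional members.
public class FileData
{
    public FileData(string fileFullName, DateTime modifyDate, byte[] data)
    {
        FileFullName = fileFullName;
        ModifyDate = modifyDate;
        Data = data;
    }

    public string FileFullName { get; }
    public DateTime ModifyDate { get; }
    public byte[] Data { get; }
}

public class SomeCommand
{
    public SomeCommand(List<FileData> files) { Files = files; }
    public List<FileData> Files { get; }
}

public static class Demo
{
    public static void Main()
    {
        var command = new SomeCommand(new List<FileData>
        {
            new FileData(@"D:\SomePdfFile.pdf", DateTime.MaxValue, new byte[3])
        });
        Console.WriteLine(command.Files.Count);
    }
}
```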
An end-user ran into a bug when trying to upload a file of 52 MB: the WCF service returned a 404 error.
I was able to reproduce this in Visual Studio. After changing the config file according to this article, the 404 disappeared.
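The change in question raises the WCF and IIS size limits. A sketch of the relevant web.config sections, assuming basicHttpBinding (the binding name and the exact limits here are placeholders, not my real values):

```xml
<system.serviceModel>
  <bindings>
    <basicHttpBinding>
      <binding name="largeFiles"
               maxReceivedMessageSize="104857600"
               maxBufferSize="104857600">
        <readerQuotas maxStringContentLength="104857600"
                      maxArrayLength="104857600" />
      </binding>
    </basicHttpBinding>
  </bindings>
</system.serviceModel>
<system.webServer>
  <security>
    <requestFiltering>
      <!-- IIS-level cap in bytes; when a request exceeds it, IIS can answer 404.13 -->
      <requestLimits maxAllowedContentLength="104857600" />
    </requestFiltering>
  </security>
</system.webServer>
```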
But now a new exception appeared: the command serializes successfully on the client side and is received by WCF, but deserialization throws an OutOfMemoryException. This is the top of the stack trace:
at Newtonsoft.Json.JsonTextReader.ReadData(Boolean append, Int32 charsRequired)
at Newtonsoft.Json.JsonTextReader.ReadData(Boolean append)
at Newtonsoft.Json.JsonTextReader.ReadStringIntoBuffer(Char quote)
at Newtonsoft.Json.JsonTextReader.ParseString(Char quote)
at Newtonsoft.Json.JsonTextReader.ParseValue()
at Newtonsoft.Json.JsonTextReader.ReadInternal()
at Newtonsoft.Json.JsonReader.ReadAsBytesInternal()
at Newtonsoft.Json.JsonTextReader.ReadAsBytes()
at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.ReadForType(JsonReader reader, JsonContract contract, Boolean hasConverter)
I wrote a unit test to reproduce the bug, but against all odds the test passes; in other words, no OutOfMemoryException is thrown.
The test, for the sake of completeness:
[TestMethod]
public void LoadBigFile_SerializeDeserialize_DoesntThrowOutOfMemoryException()
{
    // Arrange
    byte[] bytes = new byte[80000000];
    Random r = new Random(23);
    r.NextBytes(bytes);

    var command = new SomeCommand(new List<FileData>
    {
        new FileData(
            fileFullName: @"D:\SomePdfFile.pdf",
            modifyDate: DateTime.MaxValue,
            data: bytes)
    });

    var data = JsonConvert.SerializeObject(command);

    // Act
    var deserializedCommand = JsonConvert.DeserializeObject<SomeCommand>(data);

    // Assert
    Assert.AreEqual(bytes.Length, deserializedCommand.Files.First().Data.Length);
}
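For a sense of scale, here is the back-of-the-envelope arithmetic for what that 80 MB byte[] costs once it is base64-encoded into a .NET string. This is just the standard base64 expansion (4 output chars per 3 input bytes) plus UTF-16's 2 bytes per char, nothing Json.NET-specific:

```csharp
using System;

public static class Base64Footprint
{
    public static void Main()
    {
        // Raw payload size from the test above.
        long rawBytes = 80_000_000;

        // base64 emits 4 chars for every 3 input bytes (rounded up).
        long base64Chars = 4 * ((rawBytes + 2) / 3);

        // .NET strings are UTF-16, so each char occupies 2 bytes in memory.
        long stringBytes = base64Chars * 2;

        Console.WriteLine($"base64 chars: {base64Chars:N0}");   // ~106.7 million
        Console.WriteLine($"string bytes: {stringBytes:N0}");   // ~213 MB
    }
}
```

So the JSON string alone is over 200 MB in memory, before the deserializer allocates its own buffers plus the final 80 MB byte[]. All of these allocations are far above the 85,000-byte Large Object Heap threshold.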
So I took my chances, changed the config file in production, and tried to upload the same file. And that just works! No OutOfMemoryException!
Now my question is: why does the OutOfMemoryException only happen in Visual Studio, while a unit test in the same instance of VS doesn't throw it? It feels strange that I can't test uploading big files in Visual Studio, while it works in production. Note that I tried running in both Debug and Release mode.
Some details:
- Json.NET 7.0.1
- Visual Studio 2015, Update 2
- WCF hosted in IIS Express locally, in IIS in production
- Development machine: Windows 10, latest build, 64-bit
- Production server: Windows Server 2008 R2, 64-bit
- .NET Framework 4.5.2